## Convergence Rates for Single Hidden Layer Feedforward Networks

Daniel F. McCaffrey, A. Ronald Gallant

*Neural Networks*, Vol. 7, No. 1, pp. 147--158.

### Abstract

As the training set becomes arbitrarily large, appropriately
trained and configured single hidden layer feedforward
networks converge in probability to the smooth function they
were trained to estimate. A bound on the
probabilistic rate of convergence of these network estimates
is given. The convergence rate is calculated as a function
of the sample size *n*. If the function being estimated has
square-integrable *m*th-order partial derivatives, then the
*L*_{2}-norm estimation error approaches
*O*_{p}(*n*^{-1/2})
for large *m*. Two steps are required to determine these bounds. A
bound on the rate of convergence of approximations to an
unknown smooth function by members of a special class of
single hidden layer feedforward networks is determined. The
class of networks considered can embed Fourier series.
Using this fact and results on approximation properties of
Fourier series yields a bound on the *L*_{2}-norm approximation
error. This bound is less than
*O*(*q*^{-1/2})
for approximating a smooth function by networks with *q* hidden
units. A modification of existing results for bounding
estimation error provides a general theorem for calculating
estimation error convergence rates. Combining this result
with the bound on approximation rates yields the final
convergence rates.

**Keywords:** Nonparametric regression, Fourier series, Embedding.
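As a rough numerical illustration of the embedding idea (not the authors' construction), a single hidden layer network with cosine activations and hidden units cos(*kx* + *b*) can reproduce a truncated Fourier series exactly, so fitting its output weights by least squares should drive the *L*_{2} approximation error down as the number of hidden units *q* grows. The target function and all names below are illustrative assumptions.

```python
import numpy as np

def cosine_network_fit(x, y, q):
    """Fit a single hidden layer network whose hidden units are
    cos(k*x + b), so the network embeds a truncated Fourier series."""
    def hidden(t):
        cols = [np.ones_like(t)]                    # bias unit
        for k in range(1, q + 1):
            cols.append(np.cos(k * t))              # phase b = 0
            cols.append(np.cos(k * t - np.pi / 2))  # phase b = -pi/2, i.e. sin(k*t)
        return np.column_stack(cols)

    w, *_ = np.linalg.lstsq(hidden(x), y, rcond=None)  # output weights
    return lambda t: hidden(t) @ w

# Smooth periodic target on [-pi, pi]; its Fourier coefficients decay rapidly,
# so few hidden units suffice.
x = np.linspace(-np.pi, np.pi, 801)
f = np.exp(np.sin(x))

def l2_error(q):
    """Discrete (RMS) L2 error of the q-hidden-unit fit on the grid."""
    resid = cosine_network_fit(x, f, q)(x) - f
    return np.sqrt(np.mean(resid ** 2))

# The approximation error shrinks as q grows.
errors = {q: l2_error(q) for q in (1, 2, 4, 8)}
```

Because the hidden units with *q* frequencies nest inside those with more, the least-squares residual is non-increasing in *q*; for this smooth periodic target it in fact falls off much faster than the worst-case *O*(*q*^{-1/2}) bound quoted above.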