Neural Networks, Vol. 5, pp. 129-138, 1992


ORIGINAL CONTRIBUTION



On Learning the Derivatives of an Unknown Mapping With Multilayer Feedforward Networks



A. RONALD GALLANT¹ AND HALBERT WHITE²

¹North Carolina State University and ²University of California, San Diego

(Received 29 November 1989; revised and accepted 20 June 1991)





Abstract-Recently, multiple-input, single-output, single hidden-layer feedforward neural networks have been shown to be capable of approximating a nonlinear map and its partial derivatives. Specifically, such networks have been shown to be dense in various Sobolev spaces. Building on this result, we show that a network can be trained so that the map and its derivatives are learned. In particular, we use a result of Gallant's to show that least squares and similar estimates are strongly consistent in Sobolev norm, provided the number of hidden units and the size of the training set increase together. We illustrate these results with an application to the inverse problem of chaotic dynamics: recovery of a nonlinear map from a time series of its iterates. These results extend automatically to networks that embed the single hidden-layer feedforward network as a special case.





Keywords-Estimating derivatives, Chaotic dynamics, Sobolev spaces, Denseness.
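
To make the training scheme of the abstract concrete, the following minimal sketch (ours, not the authors' estimator; the logistic-map example, network size, learning rate, and iteration count are all illustrative assumptions) fits a single hidden-layer tanh network to iterates of a chaotic map by least squares and then compares the fitted network's derivative with the true derivative f'(x) = r(1 - 2x):

# Minimal sketch (illustrative assumptions throughout): least-squares
# training of a single hidden-layer tanh network on logistic-map
# iterates, then a check that the fitted network's derivative tracks
# the true derivative f'(x) = r(1 - 2x). The theory requires the number
# of hidden units to grow with the training-set size; here both are
# simply fixed at illustrative values, and no convergence guarantee
# is claimed for this particular optimizer or setting.
import numpy as np

rng = np.random.default_rng(0)

# Training pairs (x_t, x_{t+1}) from the logistic map f(x) = r x (1 - x).
r, n = 3.9, 500
x = np.empty(n + 1)
x[0] = 0.3
for t in range(n):
    x[t + 1] = r * x[t] * (1.0 - x[t])
X, Y = x[:-1], x[1:]

# Network: g(x) = sum_j beta_j tanh(a_j x + b_j) + c, with H hidden units.
H = 20
a, b = rng.normal(size=H), rng.normal(size=H)
beta, c = 0.1 * rng.normal(size=H), 0.0

# Full-batch gradient descent on the least-squares objective
# L = (1/2n) sum_i (g(x_i) - y_i)^2.
lr = 0.05
for _ in range(20000):
    S = np.tanh(np.outer(X, a) + b)      # hidden activations, shape (n, H)
    e = S @ beta + c - Y                 # residuals, shape (n,)
    D = (1.0 - S ** 2) * e[:, None]      # chain-rule factor, shape (n, H)
    g_beta, g_c = S.T @ e / n, e.mean()
    g_a = beta * (D * X[:, None]).sum(axis=0) / n
    g_b = beta * D.sum(axis=0) / n
    beta, c = beta - lr * g_beta, c - lr * g_c
    a, b = a - lr * g_a, b - lr * g_b

# Derivative of the fitted net: g'(x) = sum_j beta_j a_j (1 - tanh^2(a_j x + b_j)).
def net_deriv(x):
    s = np.tanh(np.outer(x, a) + b)
    return ((1.0 - s ** 2) * a) @ beta

for xv in (0.2, 0.5, 0.8):
    print(f"x={xv:.1f}  net'={net_deriv(np.array([xv]))[0]:+.3f}  "
          f"true f'={r * (1 - 2 * xv):+.3f}")

Gradient descent here merely stands in for the "least squares and similar estimates" of the abstract; the paper's consistency result concerns the least-squares minimizer itself, with the number of hidden units growing with the sample size.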