Original contribution
On the approximate realization of continuous mappings by neural networks

https://doi.org/10.1016/0893-6080(89)90003-8

Abstract

In this paper, we prove that any continuous mapping can be approximately realized by Rumelhart-Hinton-Williams' multilayer neural networks with at least one hidden layer whose output functions are sigmoid functions. The starting point of the proof for the one-hidden-layer case is an integral formula recently proposed by Irie-Miyake; from this, the general case (any number of hidden layers) follows by induction. The two-hidden-layer case is also proved using the Kolmogorov-Arnold-Sprecher theorem, and this proof yields non-trivial realizations as well.
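The theorem is an existence result and prescribes no construction or training procedure. As a purely illustrative sketch of the kind of approximator the abstract describes, the following NumPy snippet fits a one-hidden-layer sigmoid network y(x) = Σ_j c_j σ(w_j x + b_j) to a continuous target by plain gradient descent; the target sin(πx), the unit count, and the learning rate are arbitrary choices for the example, not anything from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Continuous mapping to approximate (illustrative choice): f(x) = sin(pi * x).
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)   # inputs, shape (200, 1)
y = np.sin(np.pi * x)                            # targets, shape (200, 1)

H = 20                                           # hidden sigmoid units
W = rng.normal(scale=1.0, size=(1, H))           # input-to-hidden weights
b = rng.normal(scale=1.0, size=(H,))             # hidden biases
c = rng.normal(scale=0.1, size=(H, 1))           # hidden-to-output weights

lr = 0.5
for _ in range(20000):
    h = sigmoid(x @ W + b)                       # hidden activations, (200, H)
    pred = h @ c                                 # network output, (200, 1)
    err = pred - y
    # Gradients of the (squared-error) loss, averaged over the sample.
    grad_c = h.T @ err / len(x)
    grad_h = (err @ c.T) * h * (1.0 - h)         # backprop through sigmoid
    grad_W = x.T @ grad_h / len(x)
    grad_b = grad_h.mean(axis=0)
    c -= lr * grad_c
    W -= lr * grad_W
    b -= lr * grad_b

mse = float(np.mean((sigmoid(x @ W + b) @ c - y) ** 2))
print(f"final MSE: {mse:.4f}")
```

With enough hidden units, the theorem guarantees the approximation error can be made arbitrarily small on a compact domain; the gradient-descent fit here merely makes that plausible empirically for one target function.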

References (18)

  • Y. Uesaka, "Analog perceptrons: On additive representation of functions," Information and Control (1971)
  • S. Amari, "A theory of adaptive pattern classifiers," IEEE Transactions on Electronic Computers (1967)
  • R.O. Duda et al., "Pattern classification by iteratively determined linear and piecewise linear discriminant functions," IEEE Transactions on Electronic Computers (1966)
  • I.M. Gel'fand et al.
  • R. Hecht-Nielsen, "Kolmogorov mapping neural network existence theorem"
  • W.Y. Huang et al., "Neural net and traditional classifiers"
  • B. Irie et al., "Capabilities of three-layered Perceptrons"
  • A.N. Kolmogorov, "On the representation of continuous functions of many variables by superposition of continuous functions of one variable and addition," Doklady Akademii Nauk SSSR (1957); American Mathematical Society Translation (1963)
  • R.P. Lippmann, "An introduction to computing with neural nets," IEEE ASSP Magazine (1987, April)
