Costarelli, D., Spigler, R. (2015). Approximation by series of sigmoidal functions with applications to neural networks. ANNALI DI MATEMATICA PURA ED APPLICATA, 194(1), 289-306 [10.1007/s10231-013-0378-y].
Approximation by series of sigmoidal functions with applications to neural networks
SPIGLER, Renato
2015-01-01
Abstract
In this paper, we develop a constructive theory for approximating absolutely continuous functions by series of certain sigmoidal functions. Estimates for the approximation error are also derived. The relation with neural network (NN) approximation is discussed. The connection between sigmoidal functions and the scaling functions of $r$-regular multiresolution approximations is investigated. In this setting, we show that the approximation error for $C^1$-functions decreases as $2^{-j}$, as $j \to + \infty$. Examples with sigmoidal functions of several kinds, such as logistic, hyperbolic tangent, and Gompertz functions, are given.
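For orientation, the sigmoidal functions named in the abstract admit the following standard forms (an illustrative listing with labels $\ell$, $h$, $G$ introduced here; the normalizations adopted in the paper itself may differ):

$$\sigma_{\ell}(x) = \frac{1}{1 + e^{-x}}, \qquad \sigma_{h}(x) = \frac{1}{2}\bigl(1 + \tanh x\bigr), \qquad \sigma_{G}(x) = e^{-a\, e^{-b x}}, \quad a, b > 0.$$

Each of these satisfies $\sigma(x) \to 0$ as $x \to -\infty$ and $\sigma(x) \to 1$ as $x \to +\infty$, the defining property of a sigmoidal function.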