World Scientific
Construction and approximation for a class of feedforward neural networks with sigmoidal function

    https://doi.org/10.1142/S0219691323500285

    As is well known, feedforward neural networks (FNNs) with sigmoidal activation function are universal approximators. Theoretically, for any continuous function defined on a compact set, there exists an FNN that approximates the function with arbitrary accuracy; this constitutes a theoretical guarantee that FNNs can be used as efficient learning machines. This paper addresses the construction and approximation properties of FNNs. We construct an FNN with sigmoidal activation function and estimate its approximation error. In particular, an inverse theorem of approximation is established, which implies an equivalence characterization theorem of the approximation and reveals the relationship between the topological structure of the FNN and its approximation ability. As key tools in this study, the modulus of continuity of a function, the K-functional, and the relationship between them are utilized, and two Bernstein-type inequalities are established.
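    The universal-approximation property described above can be illustrated numerically. The following is a minimal sketch, not the paper's construction: a one-hidden-layer FNN with sigmoidal activation approximating a continuous function on the compact set [0, 1]. The hidden weights are fixed at random and only the output layer is fitted by least squares; the network width, weight scales, and target function are all assumptions chosen for illustration.

    ```python
    import numpy as np

    def sigmoid(t):
        """Standard logistic sigmoid activation."""
        return 1.0 / (1.0 + np.exp(-t))

    rng = np.random.default_rng(0)
    n_hidden = 50  # network width (illustrative choice)

    # Target: a continuous function on the compact set [0, 1].
    def f(x):
        return np.sin(2 * np.pi * x)

    x = np.linspace(0.0, 1.0, 200)

    # Random inner weights and biases for the hidden layer.
    w = rng.normal(scale=10.0, size=n_hidden)
    b = rng.normal(scale=10.0, size=n_hidden)
    H = sigmoid(np.outer(x, w) + b)  # hidden-layer outputs, shape (200, n_hidden)

    # Fit the output weights by least squares.
    c, *_ = np.linalg.lstsq(H, f(x), rcond=None)

    # Uniform error of the FNN approximant on the sample grid.
    max_err = np.max(np.abs(H @ c - f(x)))
    print(max_err)
    ```

    Increasing the width `n_hidden` typically drives the uniform error down, which is the qualitative behavior the approximation and inverse theorems quantify in terms of the modulus of continuity of the target function.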

    AMSC: 41A25, 42A30, 31A46