World Scientific
DETERMINATION OF NEURAL NETWORK PARAMETERS BY INFORMATION THEORY

    https://doi.org/10.1142/S0218213092000193

    It is well known that artificial neural nets can approximate any continuous function to any desired degree of accuracy and can therefore be used, e.g., in high-speed, real-time process control. Nevertheless, for a given application and a given network architecture, the non-trivial task remains of determining the necessary number of neurons and the necessary accuracy (number of bits) per weight for satisfactory operation.

    In this paper the accuracy of the weights and the number of neurons are treated as general system parameters which determine the maximal output information (i.e. the approximation error) through both the absolute amount of information contained in the network (the network description complexity) and its relative distribution. A new principle of optimal information distribution is proposed and the conditions for the optimal system parameters are derived.

    For two examples, a simple linear approximation of a non-linear, quadratic function and a non-linear approximation of the inverse kinematic transformation used in robot manipulator control, the principle of optimal information distribution yields the optimal system parameters, i.e. the number of neurons and the different resolutions of the variables.
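The trade-off the abstract describes can be illustrated with a minimal sketch (not the paper's method): fit the first example, a linear approximation of the quadratic f(x) = x², then uniformly quantize the fitted weights to a given number of bits and observe how the approximation error depends on the weight resolution. The quantizer range `w_max` and the bit widths are illustrative assumptions.

```python
# Sketch: effect of weight resolution (bits per weight) on the
# approximation error of a linear fit to the quadratic f(x) = x^2.
import numpy as np

def quantize(w, bits, w_max=2.0):
    # Uniform quantizer with 2**bits levels on [-w_max, w_max].
    step = 2.0 * w_max / (2 ** bits - 1)
    return np.round(w / step) * step

x = np.linspace(0.0, 1.0, 101)
y = x ** 2

# Least-squares linear approximation y ~ a*x + b.
A = np.vstack([x, np.ones_like(x)]).T
w, *_ = np.linalg.lstsq(A, y, rcond=None)

for bits in (2, 4, 8):
    wq = quantize(w, bits)
    err = np.max(np.abs(A @ wq - y))
    print(f"{bits} bits/weight -> max approximation error {err:.4f}")
```

With few bits the quantized weights dominate the error; with enough bits the error saturates at the intrinsic error of the linear model, which is the kind of balance the principle of optimal information distribution is meant to locate.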