THE IMPACT OF ENTROPY OPTIMIZATION PRINCIPLES ON THE PROBABILITY ASSIGNMENT TO THE MEASUREMENT UNCERTAINTY

    https://doi.org/10.1142/9789812702647_0019
    Abstract:

    A measurement process has imperfections that give rise to uncertainty in each measurement result. Statistical tools can assess the uncertainties associated with the results only if all the relevant quantities involved in the process are regarded as random variables. In other words, every source of uncertainty is characterized by a probability distribution function, whose form is either known from measurements or unknown and therefore conjectured. Entropy is an information measure associated with the probability distribution of any random variable, so it plays an important role in metrological activity.
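
    In the standard formulation (the notation here is assumed, not taken from the paper), the entropy of a measurand $X$ with probability density $p(x)$ is the Shannon (differential) entropy

    $$H(X) = -\int p(x)\,\ln p(x)\,\mathrm{d}x,$$

    with the discrete analogue $H = -\sum_i p_i \ln p_i$ for a quantity taking a finite set of values.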

    In this paper the authors introduce two basic entropy optimization principles, Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence), and discuss methods for approaching the optimal solution of these entropic forms in some specific measurement models.
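
    In their standard form (again with notation assumed here), both principles select a density $p$ compatible with moment constraints $\int f_k(x)\,p(x)\,\mathrm{d}x = F_k$, $k = 1, \dots, m$:

    Maximum entropy (Jaynes): maximize $H(p) = -\int p(x)\,\ln p(x)\,\mathrm{d}x$, which yields $p(x) = \exp\!\big(-\lambda_0 - \sum_k \lambda_k f_k(x)\big)$;

    Minimum cross-entropy (Kullback): given a prior density $q$, minimize $D(p\,\|\,q) = \int p(x)\,\ln\frac{p(x)}{q(x)}\,\mathrm{d}x$, which yields $p(x) = q(x)\,\exp\!\big(-\lambda_0 - \sum_k \lambda_k f_k(x)\big)$,

    where the Lagrange multipliers $\lambda_k$ are fixed by the constraints and $\lambda_0$ enforces normalization.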