Information Theory and the Central Limit Theorem

This book provides a comprehensive description of a new method of proving the central limit theorem, through the use of apparently unrelated results from information theory. It gives a basic introduction to the concepts of entropy and Fisher information, and collects together standard results concerning their behaviour. It brings together results from a number of research papers as well as unpublished material, showing how the techniques can give a unified view of limit theorems.
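The information-theoretic route to the central limit theorem can be illustrated numerically: for a zero-mean, unit-variance random variable, the relative entropy to the standard Gaussian equals the gap between the Gaussian entropy (1/2)log(2πe) and the variable's own differential entropy, and this gap shrinks as more independent summands are standardized and added. The sketch below is illustrative only and not taken from the book; it uses i.i.d. Uniform(0,1) summands and a crude histogram entropy estimate, with the sample size and bin count chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(0)

def differential_entropy_hist(samples, bins=200):
    """Crude histogram estimate of differential entropy in nats."""
    density, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = density > 0
    return -np.sum(density[mask] * np.log(density[mask]) * widths[mask])

# Entropy of the standard Gaussian N(0,1): (1/2) log(2*pi*e)
gaussian_entropy = 0.5 * np.log(2 * np.pi * np.e)

n_samples = 200_000
for n in [1, 2, 4, 8, 16, 32]:
    # Standardized sum of n i.i.d. Uniform(0,1) variables (mean 1/2, variance 1/12)
    x = rng.uniform(size=(n_samples, n))
    s = (x.sum(axis=1) - n * 0.5) / np.sqrt(n / 12.0)
    h = differential_entropy_hist(s)
    # For zero-mean, unit-variance laws, D(S_n || N(0,1)) = h(N(0,1)) - h(S_n)
    print(f"n={n:3d}  entropy={h:.4f}  gap to Gaussian={gaussian_entropy - h:.4f}")
```

Running this, the entropy gap (and hence the estimated relative entropy) decreases toward zero as n grows, which is the convergence-in-relative-entropy statement of the central limit theorem that the book develops.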


Contents:
  • Introduction to Information Theory
  • Convergence in Relative Entropy
  • Non-Identical Variables and Random Vectors
  • Dependent Random Variables
  • Convergence to Stable Laws
  • Convergence on Compact Groups
  • Convergence to Poisson Distribution
  • Free Random Variables

Readership: Graduate students, academics and researchers in probability and statistics.