Estimation of Mutual Information
Hardback

$366.99

This book presents the mutual information (MI) estimation methods recently proposed by the author and published in several major journals. It covers two types of applications: learning a forest structure from multivariate data and identifying independent variables (independent component analysis). MI between a pair of random variables is defined mathematically in information theory. It measures how dependent the two variables are: it takes nonnegative values and is zero if, and only if, the variables are independent. Knowing the value of MI between two variables is often necessary in machine learning, statistical data analysis, and various sciences, including physics, psychology, and economics. However, the true value of MI is not available; it can only be estimated from data. The essential difference between the estimators in this book and others is that consistency and independence testing are proved for those proposed by the author. Here, an estimator satisfies consistency when its estimate converges to the true value as the sample size grows, and satisfies independence testing when, for independent variables, its estimate is zero with probability one as the sample size grows. Until now, no MI estimator has satisfied both properties at once.
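To make the quantity concrete, here is a minimal sketch of the naive plug-in MI estimate for two discrete samples, I(X;Y) = Σ p(x,y) log[p(x,y)/(p(x)p(y))] with probabilities replaced by empirical frequencies. This is an illustration only, not one of the author's estimators; the function name and the assumption that `x` and `y` are equal-length sequences of hashable values are ours.

```python
# Plug-in (empirical-frequency) mutual information estimate, in nats.
# Illustrative sketch only -- NOT the estimators proposed in the book.
from collections import Counter
from math import log

def mutual_information(x, y):
    """Estimate MI (nats) from paired discrete samples x, y."""
    n = len(x)
    px = Counter(x)            # marginal counts of x
    py = Counter(y)            # marginal counts of y
    pxy = Counter(zip(x, y))   # joint counts of (x, y) pairs
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        # p_ab / (p_a * p_b) = c * n / (px[a] * py[b])
        mi += p_ab * log(c * n / (px[a] * py[b]))
    return mi

# Perfectly dependent binary variables give MI = log 2 (about 0.693 nats).
x = [0, 0, 1, 1] * 50
print(mutual_information(x, x))
```

Note that the plug-in estimate is biased upward for finite samples, so it is rarely exactly zero even when the variables are independent; this is precisely why estimators with a proved independence-testing property, as treated in this book, require more care.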

In Shop
Out of stock
Shipping & Delivery

$9.00 standard shipping within Australia
FREE standard shipping within Australia for orders over $100.00
Express & International shipping calculated at checkout

MORE INFO
Format
Hardback
Publisher
Springer Verlag, Singapore
Country
Singapore
Date
21 August 2022
Pages
120
ISBN
9789811307331
