The Interplay Between Information and Estimation Measures (Foundations and Trends® in Signal Processing)

Format: Paperback, 214 pages
Published: United States, 28 November 2013

If information theory and estimation theory are thought of as two scientific languages, then their key vocabularies are information measures and estimation measures, respectively. The basic information measures are entropy, mutual information and relative entropy. Among the most important estimation measures are mean square error (MSE) and Fisher information. Playing a paramount role in information theory and estimation theory, those measures are akin to mass, force and velocity in classical mechanics, or energy, entropy and temperature in thermodynamics.



The Interplay Between Information and Estimation Measures is intended as a handbook of known formulas that directly relate information measures and estimation measures. It provides intuition, draws connections between these formulas, highlights some important applications, and motivates further exploration. The main focus is on such formulas in the context of the additive Gaussian noise model, with lesser treatment of others such as the Poisson point process channel. Also included are a number of new results published here for the first time. Proofs of some basic results are provided, whereas many more technical proofs already available in the literature are omitted. In 2004, the authors of this monograph found a general differential relationship now commonly referred to as the I-MMSE formula. This book develops a new, complete proof of the I-MMSE formula, including technical details omitted in the original papers.
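For orientation, the I-MMSE formula mentioned above can be stated in its basic scalar form: for an additive standard Gaussian noise channel at signal-to-noise ratio snr, the derivative of the input-output mutual information (in nats) with respect to snr equals half the minimum mean square error:

```latex
\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\bigl(X;\, \sqrt{\mathrm{snr}}\,X + N\bigr)
  \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr})
  \;=\; \mathbb{E}\Bigl[\bigl(X - \mathbb{E}\bigl[X \,\big|\, \sqrt{\mathrm{snr}}\,X + N\bigr]\bigr)^{2}\Bigr],
```

where N is standard Gaussian noise independent of the input X. The book's discrete- and continuous-time chapters treat generalizations of this scalar relationship.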



The Interplay Between Information and Estimation Measures concludes by highlighting the impact of the information-estimation relationships on a variety of information-theoretic problems of current interest, and provides some further perspective on their applications.


Our Price
$160
Ships from Australia. Estimated delivery date: 28th Apr - 6th May.
Free Shipping Worldwide


Product Details
EAN: 9781601987488
ISBN: 160198748X
Publisher:
Other Information: Illustrated
Dimensions: 23.4 x 15.6 x 1.1 centimeters (0.31 kg)

Table of Contents

1. Introduction
2. Basic Information and Estimation Measures
3. Properties of the MMSE in Gaussian Noise
4. Mutual Information and MMSE: Basic Relationship
5. Mutual Information and MMSE in Discrete- and Continuous-time Gaussian Channels
6. Entropy, Relative Entropy, Fisher Information, and Mismatched Estimation
7. Applications of I-MMSE
8. Information and Estimation Measures in Poisson Models and Channels
9. Beyond Gaussian and Poisson Models
10. Outlook
Acknowledgements. Appendices. References

 
Item ships from and is sold by Fishpond Retail Limited.
