Mathematical Foundations of Information Theory


by A. Ya. Khinchin

Paperback

$10.95 


Overview



The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.
In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite “scheme,” and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts “to give a complete, detailed proof of both … Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory.”
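For orientation only (this summary is standard background, not excerpted from the book): a finite scheme is a complete system of mutually exclusive events $A_1, \ldots, A_n$ with probabilities $p_1, \ldots, p_n$ summing to 1, and the entropy developed in the first paper is

$$H(p_1, \ldots, p_n) = -\sum_{i=1}^{n} p_i \log p_i,$$

which is zero when one outcome is certain and attains its maximum, $\log n$, when all outcomes are equally likely.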
Partial Contents: I. The Entropy Concept in Probability Theory — Entropy of Finite Schemes. The Uniqueness Theorem. Entropy of Markov chains. Application to Coding Theory. II. On the Fundamental Theorems of Information Theory — Two generalizations of Shannon’s inequality. Three inequalities of Feinstein. Concept of a source. Stationarity. Entropy. Ergodic sources. The E property. The martingale concept. Noise. Anticipation and memory. Connection of the channel to the source. Feinstein’s Fundamental Lemma. Coding. The first Shannon theorem. The second Shannon theorem.

Product Details

ISBN-13: 9780486604343
Publisher: Dover Publications
Publication date: 06/01/1957
Series: Dover Books on Mathematics Series
Pages: 128
Product dimensions: 5.50(w) x 8.50(h)

Table of Contents

The Entropy Concept in Probability Theory
1. Entropy of Finite Schemes
2. The Uniqueness Theorem
3. Entropy of Markov chains
4. Fundamental Theorems
5. Application to Coding Theory
On the Fundamental Theorems of Information Theory
INTRODUCTION
CHAPTER I. Elementary Inequalities
1. Two generalizations of Shannon's inequality
2. Three inequalities of Feinstein
CHAPTER II. Ergodic Sources
3. Concept of a source. Stationarity. Entropy
4. Ergodic Sources
5. The E property. McMillan's theorem
6. The martingale concept. Doob's theorem
7. Auxiliary propositions
8. Proof of McMillan's theorem
CHAPTER III. Channels and the sources driving them
9. Concept of channel. Noise. Stationarity. Anticipation and memory
10. Connection of the channel to the source
11. The ergodic case
CHAPTER IV. Feinstein's Fundamental Lemma
12. Formulation of the problem
13. Proof of the lemma
CHAPTER V. Shannon's Theorems
14. Coding
15. The first Shannon theorem
16. The second Shannon theorem
CONCLUSION
REFERENCES