Information Theory - Part I: An Introduction To The Fundamental Concepts

by Arieh Ben-Naim
ISBN-10: 9813208821
ISBN-13: 9789813208827
Pub. Date: 08/02/2017
Publisher: World Scientific Publishing Company, Incorporated

$132.00

Overview

This book is about the definition of the Shannon Measure of Information (SMI) and some derived quantities, such as conditional information and mutual information. Unlike many books, which refer to the SMI as "entropy," this book makes a clear distinction between the SMI and entropy. In the last chapter, entropy is derived as a special case of SMI. Ample examples are provided to help the reader understand the different concepts discussed in this book.

As with previous books by the author, this book aims at a clear and mystery-free presentation of the central concept in information theory: Shannon's Measure of Information. It presents the fundamental concepts of information theory in friendly, simple language, devoid of the fancy and pompous statements made by authors of popular science books who write on this subject. It is unique in its presentation of Shannon's measure of information and in the clear distinction between this concept and the thermodynamic entropy. Although some mathematical knowledge is required of the reader, the emphasis is on the concepts and their meaning rather than on the mathematical details of the theory.
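
As a rough illustration of the quantities the book defines (this sketch is not taken from the book, and the function names smi and mutual_information are hypothetical), the SMI of a discrete distribution and the mutual information of a joint distribution can be computed in a few lines of Python, in bits:

    # Illustrative sketch (not from the book): SMI of a discrete distribution,
    # and mutual information of a joint distribution, both in bits (log base 2).
    import math

    def smi(p):
        # SMI of a distribution p: H(p) = -sum_i p_i * log2(p_i).
        return -sum(x * math.log2(x) for x in p if x > 0)

    def mutual_information(joint):
        # I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution
        # given as a 2D list of probabilities.
        px = [sum(row) for row in joint]                 # marginal of X
        py = [sum(col) for col in zip(*joint)]           # marginal of Y
        pxy = [x for row in joint for x in row]          # flattened joint
        return smi(px) + smi(py) - smi(pxy)

    print(smi([0.5, 0.5]))                               # fair coin: 1.0 bit
    print(mutual_information([[0.25, 0.25],
                              [0.25, 0.25]]))            # independent: 0.0

A fair coin carries exactly one bit of information, and two independent variables share no mutual information, in line with the interpretations of SMI and mutual information developed in Chapters 1 and 3.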

Product Details

ISBN-13: 9789813208827
Publisher: World Scientific Publishing Company, Incorporated
Publication date: 08/02/2017
Pages: 368
Product dimensions: 6.10(w) x 9.10(h) x 0.90(d)

Table of Contents

Preface xiii

Acknowledgments xvii

List of Abbreviations xix

Chapter 0 Elements of Probability Theory 1

0.1 Introduction 1

0.2 The Axiomatic Approach 2

0.2.1 The Sample Space, Ω 2

0.2.2 The Field of Events, F 3

0.2.3 The Probability Function P 5

0.3 The Classical "Definition" of Probability 9

0.4 The Relative Frequency "Definition" of Probability 9

0.5 Independent Events and Conditional Probability 10

0.6 Bayes' Theorem 17

0.7 Random Variables, Average, Variance, and Correlation 19

0.8 Some Specific Distributions 24

0.8.1 The Uniform Distribution 24

0.8.2 The Binomial Distribution 25

0.8.3 The Normal Distribution 30

0.8.4 The Poisson Distribution 32

0.8.5 The Exponential Distribution 34

0.9 Bernoulli Trials and the Law of Large Numbers 34

0.9.1 Generalization for Two Unequal Probabilities 39

0.9.2 Generalization for Any Number of Outcomes 41

0.9.3 The Law of Large Numbers 44

0.10 The Type of a Sequence and the Typical Sequence 45

0.11 Markov Chains 47

0.12 The Sum of Independent Random Variables as a Markov Chain 51

Chapter 1 Introduction, Definition, and Interpretations of Shannon's Measure of Information 55

1.1 Shannon's Motivation for Constructing a Theory of Information 55

1.2 Interpretations of SMI 60

1.2.1 The Uncertainty Meaning of SMI 61

1.2.2 The Unlikelihood Interpretation 62

1.2.3 The Meaning of SMI as a Measure of Information 63

1.3 What Shannon Achieved 67

1.4 Summary of Chapter 1 68

Chapter 2 Properties of Shannon's Measure of Information 69

2.1 Definition of SMI for a Finite Set of Events 69

2.2 The Case of an Experiment Having Two Outcomes; the Definition of a Unit of Information 71

2.3 SMI for a Finite Number of Outcomes 75

2.4 The Special Case of the Grouping Property for the 20Q Game 77

2.5 What Is the 20Q Game? 79

2.6 A Simple 20Q Game 80

2.7 The General Uniform 20Q Game 87

2.8 The Nonuniform Game 90

2.9 The Maximum of the SMI Over All Possible Discrete Distributions 94

2.10 The Case of Infinite Number of Outcomes 97

2.11 Three Extremum Theorems on SMI 98

2.11.1 The Uniform Distribution of Locations 98

2.11.2 The Normal Distribution 102

2.11.3 The Exponential or Boltzmann Distribution 105

2.12 Examples 106

2.12.1 Spins Having Two Possible Orientations: "Up" and "Down" 107

2.12.2 Spins Having Multiple Orientations 111

2.12.3 The SMI of the Letters of an Alphabet 114

2.13 Redundancy 117

2.13.1 Qualitative Meaning 117

2.13.2 Redundancy Defined in Terms of SMI 119

2.13.3 Redundancy in the Letters of an Alphabet 120

2.13.4 Wright's Novel Without the Letter "e" 123

2.13.5 An Amusing Example Involving Redundancy 125

2.14 Summary of Chapter 2 127

Chapter 3 Conditional and Mutual Information 129

3.1 Conditional Information 130

3.2 Example: The Urn Problem 134

3.3 Mutual Information 138

3.4 A System of Two Spins 142

3.4.1 Two Interacting Spins without an External Field 145

3.4.2 Two Interacting Spins with an External Field F 152

3.4.3 Two Interacting Spins in a Nonuniform External Field 163

3.5 An Evolving SMI in a Markov Chain 164

3.6 The Ehrenfest Model 166

3.7 An Evolving SMI in a Two-State Markov Chain 174

3.8 A One-Dimensional Lattice of Spins 180

3.8.1 Singlet Probabilities 181

3.8.2 Pair Probabilities 183

3.8.3 The SMI for Single Spins and Pairs of Spins 185

3.8.4 Mutual Information 188

3.9 Summary of Chapter 3 191

Chapter 4 Multivariate Mutual Information 193

4.1 Multivariate MI Based on Total Correlations 193

4.2 Multivariate MI Based on Conditional Information 195

4.3 The Relationship between the Conditional MI and the SMI 197

4.4 The Connection between the TI and the CI 198

4.5 The Connection between the Conditional MI and the Kirkwood Superposition Approximation 199

4.6 Interpretation of the CI in Terms of the MI between Two rv's and the MI between One rv and the Joint rv 203

4.7 Generalization to n rv's 204

4.8 Properties of the Multivariate MI 205

4.9 A Three-Spin System 207

4.9.1 Probabilities 210

4.9.2 Conditional Probabilities 215

4.9.3 The SMI and the Conditional SMI 219

4.9.4 Correlations and the MI 227

4.10 A Four-Spin System 235

4.11 Summary of Chapter 4 240

Chapter 5 Entropy and the Second Law of Thermodynamics 241

5.1 A Few Historical Milestones 242

5.2 Derivation of the Entropy Function of an Ideal Gas 243

5.2.1 The Locational SMI of a Particle in a 1D Box of Length L 244

5.2.2 The Velocity SMI of a Particle in a 1D Box of Length L 245

5.2.3 Combining the SMI's of the Location and Momentum of One Particle in a 1D System 247

5.2.4 The SMI of a Particle in a Box of Volume V 248

5.2.5 The SMI's of Locations and Momenta of N Independent Particles in a Box of Volume V 248

5.2.6 Conclusion 255

5.3 The Entropy Formulation of the Second Law 257

5.3.1 The Simplest Expansion Process of an Ideal Gas From V to 2V 257

5.3.2 The Expansion of an Ideal Gas From V to 3V 264

5.3.3 The General Case of a c-Compartment System 265

5.3.4 A Process of Expansion in a Gravitational Field at Constant Temperature 270

5.3.5 Particles Having Internal Rotational Degrees of Freedom 275

5.3.6 Dipoles in an External Field 277

5.4 The Helmholtz Energy Formulation of the Second Law 278

5.5 The Gibbs Energy Formulation of the Second Law 278

5.6 Summary of Chapter 5 279

Appendix A Proof of an Equivalent Markovian Property 281

Appendix B Proof of the Uniqueness of the Function H 285

Appendix C The SMI for the Continuous Random Variable 291

Appendix D Functional Derivatives and Functional Taylor Expansion 295

Appendix E Some Inequalities for Convex Functions 301

Appendix F Distribution Functions in 1D Models 307

Appendix G Entropy Change in an Expansion Process in a Gravitational Field 315

Notes 319

References and Suggested Reading 337

Index 341
