Probability and Information: An Integrated Approach / Edition 2

by David Applebaum
ISBN-10: 0521899044
ISBN-13: 9780521899048
Pub. Date: 08/14/2008
Publisher: Cambridge University Press

$116.00
Overview

This new and updated textbook is an excellent way to introduce probability and information theory to new students in mathematics, computer science, engineering, statistics, economics, or business studies. Requiring only a knowledge of basic calculus, it begins by building a clear and systematic foundation in probability and information. Classic topics covered include discrete and continuous random variables, entropy and mutual information, maximum entropy methods, the central limit theorem, and the coding and transmission of information. New to this edition is material on Markov chains and their entropy. Examples and exercises are included to illustrate how to use the theory in a wide range of applications, with detailed solutions to most exercises available online for instructors.

Product Details

ISBN-13: 9780521899048
Publisher: Cambridge University Press
Publication date: 08/14/2008
Edition description: 2nd Updated, New ed.
Pages: 250
Product dimensions: 6.80(w) x 9.80(h) x 0.70(d)

About the Author

David Applebaum is a Professor in the Department of Probability and Statistics at the University of Sheffield.

Table of Contents

Preface to the first edition; Preface to the second edition; 1. Introduction; 2. Combinatorics; 3. Sets and measures; 4. Probability; 5. Discrete random variables; 6. Information and entropy; 7. Communication; 8. Random variables with probability density functions; 9. Random vectors; 10. Markov chains and their entropy; Exploring further; Appendix 1. Proof by mathematical induction; Appendix 2. Lagrange multipliers; Appendix 3. Integration of exp(-½x²); Appendix 4. Table of probabilities associated with the standard normal distribution; Appendix 5. A rapid review of matrix algebra; Selected solutions; Index.