Introduction to Coding and Information Theory / Edition 1

by Steven Roman
ISBN-10:
0387947043
ISBN-13:
9780387947044
Pub. Date:
11/26/1996
Publisher:
Springer New York
Hardcover

$84.99

Overview

This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: the efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and specific codes such as Hamming codes, simplex codes, and many others.
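As a small taste of the material covered in the information theory chapters, here is a minimal Python sketch (not from the book; the four-symbol source probabilities are made up for illustration) that computes the entropy of a finite information source, derives the codeword lengths of a binary Huffman code for it, and checks the bound H(S) <= L < H(S) + 1 asserted by the Noiseless Coding Theorem in the binary case.

import heapq
import math

def entropy(probs):
    # H(S) = -sum p*log2(p), the entropy of a finite source in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    # Codeword lengths of a binary Huffman code for the given probabilities.
    # Heap entries: (probability, tie-breaker, symbol indices in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # merging two subtrees adds one bit
            lengths[i] += 1        # to every codeword inside them
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.15, 0.10]    # an illustrative four-symbol source
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"entropy H(S) = {H:.3f} bits, Huffman average length L = {L:.3f} bits")
assert H <= L < H + 1              # Noiseless Coding Theorem, binary case

For this source the sketch reports an entropy of roughly 1.743 bits against a Huffman average codeword length of 1.75 bits, consistent with the bound.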

Product Details

ISBN-13: 9780387947044
Publisher: Springer New York
Publication date: 11/26/1996
Series: Undergraduate Texts in Mathematics
Edition description: 1997
Pages: 326
Product dimensions: 7.01(w) x 10.00(h) x 0.03(d)

Table of Contents

Introduction:

Preliminaries; Miscellany; Some Probability; Matrices

1. An Introduction to Codes

Strings and Things; What Are Codes?; Uniquely Decipherable Codes; Instantaneous Codes and Kraft's Theorem

2. Efficient Encoding

Information Sources; Average Codeword Length; Huffman Encoding; The Proof that Huffman Encoding is the Most Efficient

3. Noiseless Coding

Entropy; Properties of Entropy; Extensions of an Information Source; The Noiseless Coding Theorem

II Coding Theory

4. The Main Coding Theory Problem

Communications Channels; Decision Rules; Nearest Neighbor Decoding;
