## Information Theory

We live in an information society. This book presents the basic ideas of information theory: how information can be transmitted and stored as compactly as possible, how much information a particular channel or network can carry, and how security can be assured. It covers the fundamental concepts and sets them in the context of current applications, including Shannon's information measure, discrete and continuous information sources, information channels with and without memory, source and channel coding, rate distortion theory, error-correcting codes, and the information-theoretic approach to cryptology. Throughout the book, the author pays special attention to multiterminal, or network, information theory. The text will be of interest to advanced undergraduates and graduate students in electrical engineering, computer science, informatics, mathematics, physics, and the management sciences.
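As a minimal illustration of Shannon's information measure mentioned above, the sketch below computes the entropy of a discrete source, H(X) = -Σ p(x) log2 p(x), in bits per symbol (the function name and example probabilities are illustrative, not taken from the book):

```python
import math

def entropy(probs):
    """Shannon entropy in bits/symbol: H(X) = -sum p(x) * log2 p(x)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source is maximally uncertain: 1 bit per symbol.
print(entropy([0.5, 0.5]))  # 1.0
# A biased source is more predictable, so each symbol carries less information.
print(entropy([0.9, 0.1]))  # about 0.469 bits
```

Entropy is maximized when all outcomes are equally probable, which is why the uniform case yields exactly 1 bit for a binary source.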


### Contents

| Section | Page |
| --- | --- |
| II | 1 |
| III | 4 |
| IV | 8 |
| V | 16 |
| VI | 22 |
| VII | 24 |
| VIII | 27 |
| IX | 29 |
| X | 39 |
| XI | 43 |
| XII | 49 |
| XIII | 56 |
| XIV | 60 |
| XV | 63 |
| XVI | 79 |
| XVII | 85 |
| XVIII | 91 |
| XIX | 95 |
| XX | 98 |
| XXI | 109 |
| XXII | 116 |
| XXIII | 126 |
| XXIV | 130 |
| XXV | 133 |
| XXVI | 136 |
| XXVII | 138 |
| XXVIII | 142 |
| XXIX | 155 |
| XXX | 164 |
| XXXI | 171 |
| XXXII | 176 |
| XXXIII | 186 |
| XXXIV | 190 |
| XXXV | 209 |
| XXXVI | 214 |
| XXXVII | 215 |
| XXXVIII | 218 |
| XXXIX | 222 |
| XL | 227 |
| XLI | 229 |
| XLII | 238 |
| XLIII | 243 |
| XLIV | 250 |

