Author Goldie, Charles M., author

Title Communication theory / Charles M. Goldie and Richard G.E. Pinch
Published Cambridge ; New York : Cambridge University Press, 1991

Description 1 online resource : illustrations
Series London Mathematical Society student texts ; 20
Contents Cover -- Series -- Title -- Copyright -- Dedication -- CONTENTS -- PREFACE -- 0 INTRODUCTION -- 1 ECONOMICAL REPRESENTATIONS: NOISELESS CODING -- 1.1 Sources, messages, codes -- 1.2 Decipherability: the prefix condition -- Decision-tree representation -- Exercises -- 1.3 The Kraft inequality -- Exercises -- 1.4 Noiseless-coding Theorem: source entropy -- Exercises -- 1.5 Segmented (block) codes -- Exercises -- 1.6 Shannon-Fano encoding -- Exercise -- 1.7 Huffman encoding -- Exercises -- 1.8 Further topics -- Run-length constraints -- Universal coding -- Tree codes -- Other reading
1.9 Problems -- 2 PROPERTIES OF A MESSAGE SOURCE -- 2.1 Introduction: probability -- Probability spaces -- The random source -- Exercise -- 2.2 Reliable encoding rates: the source information-rate -- Telegraph English -- Exercises -- 2.3 The First Coding Theorem (FCT) -- Convergence of random variables -- The First Coding Theorem -- Exercises -- 2.4 Asymptotic Equipartition Property (AEP) -- Exercise -- 2.5 The information rate for a Bernoulli source -- Exercises -- 2.6 Finite Markov chains -- Geometric ergodicity -- Exercises -- 2.7 Markov sources -- Simple Markov sources
Higher-order Markov sources -- Exercises -- 2.8 The genetic code -- Location of redundancy -- 2.9 Entropy -- Joint entropy -- Exercises -- 2.10 Conditional entropy -- Conditional independence -- Exercises -- 2.11 The uncertainty of a stationary source -- The Ergodic Theorem -- Exercises -- 2.12 Ergodic sources -- The Shannon-McMillan-Breiman Theorem -- Failure of ergodicity -- Exercises -- 2.13 Further topics -- Source coding with a fidelity criterion -- Epsilon-entropy, metric entropy and algorithmic information theory -- Statistical inference -- Ergodic theory -- 2.14 Problems
3 RELIABLE TRANSMISSION -- 3.1 Reliable transmission rates -- channel capacity -- Exercises -- 3.2 Decoding: receiver optimization -- A little decision theory -- The discrete case -- The continuous case -- ML decoding -- 3.3 Random coding -- Exercise -- 3.4 Introduction to channels -- Exercises -- 3.5 Reliable transmission through the BSC -- Exercises -- 3.6 Reliable transmission through the memoryless Gaussian channel -- Recollections of normality -- The memoryless Gaussian channel (MGC) -- ML decoding rule -- Reliable transmission -- Exercises -- 4 CHANNEL CODING THEOREMS
4.1 Mutual information -- The Hu correspondence -- Exercises -- 4.2 The Second Coding Theorem (SCT) -- 4.3 The discrete memoryless channel (DMC) -- Exercises -- 4.4 Symmetric channels -- Exercises -- 4.5 Continuous entropy and mutual information -- Continuous entropy -- Mutual information -- Exercises -- 4.6 The memoryless channel with additive white noise -- The memoryless Gaussian channel (MGC) -- Capacity under signal-power constraint -- Exercises -- 4.7 Further topics -- Evaluation of channel capacity -- Magnitude of the probability of error -- Channels with input costs -- Inequalities
Summary This book is an introduction, for mathematics students, to the theories of information and codes. They are usually treated separately but, as both address the problem of communication through noisy channels (albeit from different directions), the authors
Bibliography Includes bibliographical references (pages 191-195) and index
Notes Print version record
Subject Coding theory.
Information theory.
SCIENCE -- System Theory.
TECHNOLOGY & ENGINEERING -- Operations Research.
Form Electronic book
Author Pinch, Richard G. E., author
LC no. 92163857
ISBN 9781107088252
1107088259
9781107094475
110709447X