A reissue of the textbook Information Theory by Jan C. A. van der Lubbe. The 1997 edition was published by Cambridge University Press; the reissue is published by Delft Academic Press. A Solutions section at the end of each chapter gives the answers to all of that chapter's exercises. Original English edition.

PUBLISHED BY THE PRESS SYNDICATE OF THE UNIVERSITY OF CAMBRIDGE
The Pitt Building, Trumpington Street, Cambridge CB2 1RP, United Kingdom

CAMBRIDGE UNIVERSITY PRESS
The Edinburgh Building, Cambridge CB2 2RU, United Kingdom
40 West 20th Street, New York, NY 10011-4211, USA
10 Stamford Road, Oakleigh, Melbourne 3166, Australia

Originally published in Dutch as Informatietheorie by VSSD
© VSSD 1988
© English translation Cambridge University Press 1997

This book is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.

First published in English by Cambridge University Press 1997 as Information Theory

Printed in the United Kingdom at the University Press, Cambridge

Typeset in Times

A catalogue record for this book is available from the British Library

ISBN 0 521 46198 7 hardback
ISBN 0 521 46760 8 paperback
 
Preface

On all levels of society, systems have been introduced that deal with the transmission, storage and processing of information. We live in what is usually called the information society. Information has become a key word in our society. It is not surprising, therefore, that from all sorts of quarters interest has been shown in what information really is, and consequently in acquiring a better knowledge as to how information can be dealt with as efficiently as possible.

Information theory is characterized by a quantitative approach to the notion of information. By means of the introduction of measures for information, answers will be sought to such questions as: How can information be transmitted and stored as compactly as possible? What is the maximum quantity of information that can be transmitted through a channel? How can security best be arranged? Etcetera. These are crucial questions that enable us to enhance the performance and to grasp the limits of our information systems.

This book has the purpose of introducing a number of basic notions of information theory and clarifying them by showing their significance in present applications. Matters that will be described are, among others: Shannon's information measure, discrete and continuous information sources and information channels with or without memory, source and channel coding, rate distortion theory, error-correcting codes and the information-theoretical approach to cryptology. Special attention has been paid to multiterminal or network information theory, an area with still lots of unanswered questions, but which is of great significance because most of our information is transmitted by networks.

All chapters are concluded with questions and worked solutions. That makes the book suitable for self-study.
 
The content of the book has been largely based on the present lectures by the author for students in Electrical Engineering, Technical Mathematics and Informatics, Applied Physics and Mechanical Engineering at the Delft University of Technology, as well as on former lecture notes by Profs. Ysbrand Boxma, Dick Boekee and Jan Biemond. The questions have been derived from recent exams.
The author wishes to express his gratitude to the colleagues mentioned above, as well as the other colleagues who in one way or another contributed to this textbook. Especially I wish to thank em. Prof. Ysbrand Boxma, who lectured on information theory at the Delft University of Technology when I was a student and who introduced me to information theory. Under his inspiring guidance I received my M.Sc. in Electrical Engineering and my Ph.D. in the technical sciences. In writing this book his old lecture notes were still very helpful to me. His influence has been a determining factor in my later career.

Delft, December 1996
Jan C.A. van der Lubbe
 
Contents

Preface
1 Discrete information
1.1 The origin of information theory
1.2 The concept of probability
1.3 Shannon's information measure
1.4 Conditional, joint and mutual information measures
1.5 Axiomatic foundations
1.6 The communication model
1.7 Exercises
1.8 Solutions
2 The discrete memoryless information source
2.1 The discrete information source
2.2 Source coding
2.3 Coding strategies
2.4 Most probable messages
2.5 Exercises
2.6 Solutions
3 The discrete information source with memory
3.1 Markov processes
3.2 The information of a discrete source with memory
3.3 Coding aspects
3.4 Exercises
3.5 Solutions
4 The discrete communication channel
4.1 Capacity of noiseless channels
4.2 Capacity of noisy channels
4.3 Error probability and equivocation
4.4 Coding theorem for discrete memoryless channels
4.5 Cascading of channels
4.6 Channels with memory
4.7 Exercises
4.8 Solutions
5 The continuous information source
5.1 Probability density functions
5.2 Stochastic signals
5.3 The continuous information measure
5.4 Information measures and sources with memory
5.5 Information power
5.6 Exercises
5.7 Solutions
6 The continuous communication channel
6.1 The capacity of continuous communication channels
6.2 The capacity in the case of additive Gaussian white noise
6.3 Capacity bounds in the case of non-Gaussian white noise
6.4 Channel coding theorem
6.5 The capacity of a Gaussian channel with memory
6.6 Exercises
6.7 Solutions
7 Rate distortion theory
7.1 The discrete rate distortion function
7.2 Properties of the R(D) function
7.3 The binary case
7.4 Source coding and information transmission theorems
7.5 The continuous rate distortion function
7.6 Exercises
7.7 Solutions
8 Network information theory
8.1 Introduction
8.2 Multi-access communication channels
8.3 Broadcast channels
8.4 Two-way channels
8.5 Exercises
8.6 Solutions
9 Error-correcting codes
9.1 Introduction
9.2 Linear block codes
9.3 Syndrome coding
9.4 Hamming codes
9.5 Exercises
9.6 Solutions
10 Cryptology
10.1 Cryptography and cryptanalysis
10.2 The general scheme of cipher systems
10.3 Cipher systems
10.4 Amount of information and security
10.5 The unicity distance
10.6 Exercises
10.7 Solutions
Bibliography
Index
 
1
Discrete information

1.1 The origin of information theory

Information theory is the science which deals with the concept of information, its measurement and its applications. In its broadest sense, a distinction can be made between the American and British traditions in information theory.

In general there are three types of information:
- syntactic information, related to the symbols from which messages are built up and to their interrelations;
- semantic information, related to the meaning of messages, their referential aspect;
- pragmatic information, related to the usage and effect of messages.

This being so, syntactic information mainly considers the form of information, whereas semantic and pragmatic information are related to the information content.

Consider the following sentences:
(i) John was brought to the railway station by taxi.
(ii) The taxi brought John to the railway station.
(iii) There is a traffic jam on highway A3, between Nuremberg and Munich in Germany.
(iv) There is a traffic jam on highway A3 in Germany.

The sentences (i) and (ii) are syntactically different. However, semantically and pragmatically they are identical. They have the same meaning and are both equally informative.

The sentences (iii) and (iv) do not differ only with respect to their syntax, but also with respect to their semantics. Sentence (iii) gives more precise information than sentence (iv).
 
The pragmatic aspect of information mainly depends on the context. The information contained in the sentences (iii) and (iv), for example, is relevant for someone in Germany, but not for someone in the USA.

The semantic and pragmatic aspects of information are studied in the British tradition of information theory. This being so, the British tradition is closely related to philosophy, psychology and biology. The British tradition is influenced mainly by scientists like MacKay, Carnap, Bar-Hillel, Ackoff and Hintikka.

The American tradition deals with the syntactic aspects of information. In this approach there is full abstraction from the meaning aspects of information. There, basic questions are the measurement of syntactic information, the fundamental limits on the amount of information which can be transmitted, the fundamental limits on the compression of information which can be achieved, and how to build information processing systems approaching these limits. A rather technical approach to information remains.

The American tradition in information theory is sometimes referred to as communication theory, mathematical information theory or, in short, as information theory. Well-known scientists of the American tradition are Shannon, Renyi, Gallager and Csiszar, among others.

However, Claude E. Shannon, who published his article "A mathematical theory of communication" in 1948, is generally considered to be the founder of the American tradition in information theory. There are, nevertheless, a number of forerunners to Shannon who attempted to formalise the efficient use of communication systems.
In 1924 H. Nyquist published an article wherein he raised the matter of how messages (or characters, to use his own words) could be sent over a telegraph channel with maximum possible speed, but without distortion. The term information, however, was not yet used by him as such.

It was R.V.L. Hartley (1928) who first tried to define a measure of information. He went about it in the following manner. Assume that for every symbol of a message one has a choice of s possibilities. By now considering messages of l symbols, one can distinguish s^l messages. Hartley now defined the amount of information as the logarithm of the number of distinguishable messages. In the case of messages of length l one therefore finds
H(s^l) = log(s^l) = l · log(s).
For messages of length 1 one would find

H(s) = log(s).
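As a quick numerical illustration of Hartley's measure, here is a minimal Python sketch of the formula above (this is not from the book; the function name hartley_information is our own):

import math

def hartley_information(s: int, l: int, base: float = 2.0) -> float:
    # Hartley's measure: the logarithm of the number of distinguishable
    # messages of length l over an alphabet of s symbols,
    # H(s^l) = log(s^l) = l * log(s), taken here in the given base.
    return l * math.log(s) / math.log(base)

# Example: messages of 8 symbols over a 26-letter alphabet allow
# 26**8 distinguishable messages, i.e. 8 * log2(26), about 37.6 bits.
print(hartley_information(s=26, l=8))  # -> 37.60...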
 
 
 
 