PREFACE
Statistical theory of communication is a broad new field comprised of methods for the study of the statistical problems encountered in all types of communications. The field embodies many topics such as radar detection, sources of physical noise in linear and nonlinear systems, filtering and prediction, information theory, coding, and decision theory. The theory of probability provides the principal tools for the study of problems in this field.
Information theory as outlined in the present work is a part of this
broader body of knowledge. This theory, originated by C. E. Shannon,
introduced several important new concepts and, although a part of
applied communications sciences, has acquired the unique distinction of
opening a new path of research in pure mathematics.
The communication of information is generally of a statistical nature,
and a current theme of information theory is the study of simple ideal
statistical communication models. The first objective of information theory is to define different types of sources and channels and to devise statistical parameters describing their individual and ensemble operations. The concept of Shannon's communication entropy of a source and the transformation of a channel provide most useful means for studying simple communication models. In this respect it appears that the concept of communication entropy is a type of describing function that is
most appropriate for the statistical models of communications. This is
similar in principle to the way that an impedance function describes a
linear network, or a moment indicates certain properties of a random
variable. The introduction of the concepts of communication entropy,
transformation, and channel capacity is a basic contribution of information theory, and these concepts are of such fundamental significance that they may parallel in importance the concepts of power, impedance, and
moment.
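As a concrete illustration of the entropy concept described above, the sketch below computes Shannon's entropy of a discrete source from its symbol probabilities. The function name and the example distributions are illustrative choices, not taken from the text; the formula itself, H = -Σ p log₂ p bits per symbol, is the standard definition due to Shannon.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete source, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source carries one bit per symbol:
print(entropy([0.5, 0.5]))   # 1.0

# A biased source is more predictable, so it carries less information per symbol:
print(entropy([0.9, 0.1]))
```

Just as a single impedance function summarizes a linear network, this single number summarizes the average uncertainty of the source, which is what makes it a useful "describing function" for statistical communication models.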
Perhaps the most important theoretical result of information theory is
Shannon's fundamental theorem, which implies that it is possible to communicate information at an ideal rate with utmost reliability in the presence of "noise." This succinct but deep statement and its consequences unfold the limitation and complexity of present and future