
PREFACE

Statistical theory of communication is a broad new field comprised of methods for the study of the statistical problems encountered in all types of communications. The field embodies many topics such as radar detection, sources of physical noise in linear and nonlinear systems, filtering and prediction, information theory, coding, and decision theory. The theory of probability provides the principal tools for the study of problems in this field.

Information theory as outlined in the present work is a part of this broader body of knowledge. This theory, originated by C. E. Shannon, introduced several important new concepts and, although a part of the applied communications sciences, has acquired the unique distinction of opening a new path of research in pure mathematics.

The communication of information is generally of a statistical nature, and a current theme of information theory is the study of simple ideal statistical communication models. The first objective of information theory is to define different types of sources and channels and to devise statistical parameters describing their individual and ensemble operations. The concept of Shannon's communication entropy of a source and the transformation of a channel provide a most useful means for studying simple communication models. In this respect it appears that the concept of communication entropy is a type of describing function that is most appropriate for the statistical models of communications. This is similar in principle to the way that an impedance function describes a linear network, or a moment indicates certain properties of a random variable. The introduction of the concepts of communication entropy, transformation, and channel capacity is a basic contribution of information theory, and these concepts are of such fundamental significance that they may parallel in importance the concepts of power, impedance, and moment.
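As an informal illustration, not part of the original text, the entropy of a discrete memoryless source is a single number computed from its symbol probabilities, summarizing the source much as a moment summarizes a random variable. A minimal Python sketch, with hypothetical names and assuming the source is described only by a probability distribution:

```python
import math

def source_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete source, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A skewed binary source carries much less than 1 bit per emitted symbol:
print(source_entropy([0.9, 0.1]))                 # ~0.469 bits/symbol
# A uniform 4-symbol source attains the maximum log2(4) bits per symbol:
print(source_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits/symbol
```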

Perhaps the most important theoretical result of information theory is Shannon's fundamental theorem, which implies that it is possible to communicate information at an ideal rate with utmost reliability in the presence of "noise." This succinct but deep statement and its consequences unfold the limitation and complexity of present and future
