
WINSEM2015 16 CP2494 07 Jan 2016 RM02 Introduction to Self Information and Property


Information
A source is supposed to produce symbols. The symbols form the message, and the message contains information (which may be relative to the user). Information is categorised only on the basis of its probability of occurrence. Information in general has four distinct properties:


Property 1 : Information (I) should always be non-negative, that is, I ≥ 0.

This property is obvious; otherwise the source producing the symbol could not be called a source. Hence a source should be such that there is no loss of information.

Information

Property 2 : For a symbol with probability approaching its highest value, 1, the amount of information in it should approach its lowest value.

If we are absolutely certain about the outcome even before the event occurs, no information is conveyed.

Property 3 : For two different symbols xi and xj with respective probabilities Pi and Pj, the one with the lower probability should contain more information, i.e. for Pi < Pj we must have Ii > Ij.

Property 4 : The total information conveyed by two independent symbols should be the sum of their respective information contents: Iij = Ii + Ij.

Information
From the above properties it is obvious that I is a function of P, and that the function should be an inverse relation between I and P.


The only possible way to represent this relationship mathematically is the self-information: I = log(1/P).

base 2 (number of alphabets in the source is two): bits or binits
base e (natural logarithm): nats
base 10 (number of alphabets in the source is ten): decits or hartleys
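
As a quick sketch of this definition (the helper name self_information and the example probabilities are illustrative, not from the slides), the Python snippet below evaluates I = log(1/P) in each of the three bases listed above:

```python
import math

def self_information(p, base=2):
    """Self-information I = log(1/p) of a symbol with probability p.

    base 2 -> bits (binits), base e -> nats, base 10 -> decits (hartleys).
    """
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return math.log(1.0 / p, base)

# A symbol with probability 1/2 carries exactly 1 bit of information.
print(self_information(0.5))             # 1.0 bit
print(self_information(0.5, math.e))     # ~0.693 nats
print(self_information(0.5, 10))         # ~0.301 decits
# A certain symbol (P = 1) carries no information, matching Property 2.
print(self_information(1.0))             # 0.0
```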

Information
Let there be one event E composed of two statistically independent sub-events e1 and e2, with probabilities of occurrence P(e1) and P(e2).


I(E) = log(1/P(E)) = log(1/P(e1, e2))
     = log(1/(P(e1) P(e2)))   [e1 and e2 are independent, so P(e1, e2) = P(e1) P(e2)]
     = log(1/P(e1)) + log(1/P(e2))
     = I(e1) + I(e2)
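
As a numerical check of this additivity (the probabilities 0.25 and 0.5 below are illustrative, not from the slides), the information of the joint event equals the sum of the individual informations:

```python
import math

def info_bits(p):
    # Self-information in bits: I = log2(1/p)
    return math.log2(1.0 / p)

p_e1, p_e2 = 0.25, 0.5        # illustrative probabilities of the sub-events
p_joint = p_e1 * p_e2         # independence: P(e1, e2) = P(e1) * P(e2)

i_joint = info_bits(p_joint)                # log2(1/0.125) = 3 bits
i_sum = info_bits(p_e1) + info_bits(p_e2)   # 2 + 1 = 3 bits
print(i_joint, i_sum)                       # 3.0 3.0
```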

Information (contd.)
All discrete sources emit outputs which are sequences drawn from a finite set of symbols called the "alphabet". Just as the English language has 26 letters, a binary source has the alphabet 0 and 1.

Discrete Memoryless Source (DMS): a source whose outputs are statistically independent, i.e. each output letter is statistically independent of all past and future outputs.

The physical quantity that expresses the information content of a DMS, while also describing the probabilistic behaviour of the source, is termed the "ENTROPY".

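As a sketch of how entropy measures the average information content per symbol of a DMS (using the standard definition H = sum of P · log2(1/P) over the alphabet; the function name and example distributions below are illustrative, not from the slides):

```python
import math

def entropy_bits(probs):
    """Entropy H = sum of P * log2(1/P) over the source alphabet, in bits/symbol."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Binary source with alphabet {0, 1}: entropy peaks at 1 bit/symbol when the
# two symbols are equally likely, and drops as the source becomes predictable.
print(entropy_bits([0.5, 0.5]))   # 1.0
print(entropy_bits([0.9, 0.1]))   # ~0.469
```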