
Page 1

Information Theory and Coding Techniques

Lecture 1.2: Course Outlines and Grading

Prof. Ja-Ling Wu
Department of Computer Science and Information Engineering
National Taiwan University

Page 2

Digital Communication System

[Block diagram] Source → Source Encoder → Channel Encoder → Modulator → Channel (Noise enters here) → Demodulator → Channel Decoder → Source Decoder → Destination. The Source Encoder/Decoder pair performs Source Coding (source codeword in, estimated source codeword out), the Channel Encoder/Decoder pair performs Channel Coding (channel codeword sent, received codeword recovered), and the Modulator/Demodulator pair performs Modulation.
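To make the block diagram concrete, here is a toy end-to-end sketch (my own illustration, not from the slides; the 3-fold repetition code and bit-flip channel are simple stand-ins for real channel coding and modulation):

```python
import random

def source_encode(text):
    """Toy source coding: pack each character into 8 bits (no compression)."""
    return [int(b) for ch in text for b in format(ord(ch), "08b")]

def source_decode(bits):
    """Inverse of source_encode: regroup bits into bytes, then characters."""
    return "".join(chr(int("".join(map(str, bits[i:i + 8])), 2))
                   for i in range(0, len(bits), 8))

def channel_encode(bits, n=3):
    """Toy channel coding: n-fold repetition code introduces redundancy."""
    return [b for bit in bits for b in [bit] * n]

def channel(bits, p=0.05):
    """Binary symmetric channel: each bit flips with probability p (the noise)."""
    return [b ^ 1 if random.random() < p else b for b in bits]

def channel_decode(bits, n=3):
    """Majority vote over each block of n repeated bits."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

msg = "IT"
received = channel(channel_encode(source_encode(msg)))
print(source_decode(channel_decode(received)))  # usually recovers "IT"
```

With p = 0.05, a source bit is lost only if at least two of its three copies flip, so the decoder recovers the message most of the time; this is exactly the redundancy-at-the-encoder, correction-at-the-decoder idea described on the Channel Coding slide below.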

Page 3

Source Coding

Based on the characteristics/features of a source, a Source Encoder-Decoder pair is designed to reduce the source output to a Minimal Representation. [Shannon 1948]

How to model a signal source? A random process.
How to measure the content of a source? Entropy.
How to represent a source? Code design.
How to model the behavior of a channel? A stochastic mapping; its ultimate transmission limit is the channel capacity.

Page 4

Source Coding (cont.)

Redundancy Reduction → Data Compression / Data Compaction

Modalities of Sources:
– Text
– Image
– Speech/Audio
– Video
– Graphics
– Hybrid

Page 5

Channel Coding

Introducing redundancy at the channel encoder and using this redundancy at the decoder to reconstitute the input sequences as accurately as possible; i.e., Channel Coding is designed to minimize the effect of channel noise.

[Block diagram] Channel Encoder → Channel (with Noise) → Channel Decoder; taken together, the encoder-channel-decoder chain behaves like a (nearly) Noiseless Channel.

Page 6

Modulation

Physical channels may require electrical signals, radio signals, or optical signals. The modulator takes the channel encoder/source encoder outputs and transforms them into waveforms that suit the physical nature of the channel; the waveforms are also chosen to yield either system simplicity or optimal detection performance.

Page 7

What is Information?

What is meant by the “information” contained in an event? If we are to formally define a quantitative measure of the information contained in an event, this measure should have some intuitive properties, such as:

1. Information contained in events ought to be defined in terms of some measure of the uncertainty of the events.

2. Less certain events ought to contain more information than more certain events.

3. The information of unrelated/independent events taken as a single event should equal the sum of the information of the individual events.

Page 8

A natural measure of the uncertainty of an event $\omega$ is its probability, denoted $P(\omega)$. Once we agree to define the information of an event in terms of $P(\omega)$, properties (2) and (3) will be satisfied if the information in $\omega$ is defined as the self-information

$$I(\omega) = -\log P(\omega)$$

*The base of the logarithm depends on the unit of information to be used.
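As a quick illustration (a minimal sketch of my own, not from the slides), self-information in each of the three units listed on the next page:

```python
import math

def self_information(p, base=2):
    """I = -log_base P: base 2 gives bits, e gives nats, 10 gives Hartleys."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must lie in (0, 1]")
    return -math.log(p, base)

print(self_information(0.5))           # 1.0 bit: a fair coin flip
print(self_information(0.5, math.e))   # ~0.69315 nats
print(self_information(0.5, 10))       # ~0.30103 Hartleys
```

Note how the rarer the event (smaller p), the larger I becomes, matching property (2) above.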

Page 9

Information Units:
log2 : bit
loge : nat
log10 : Hartley

Base conversions:
log10 2 = 0.30103, log2 10 = 3.3219
log10 e = 0.43429, loge 10 = 2.30259
loge 2 = 0.69315, log2 e = 1.44270

$$\log_b X = (\log_b a)\,\log_a X = \frac{\log_a X}{\log_a b}$$
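A short sketch (plain Python, my own check, not from the slides) verifying the change-of-base identity and the tabulated constants:

```python
import math

# Change of base: log_b X = (log_b a) * log_a X.
# E.g., information in nats = (ln 2) * information in bits.
x = 10.0
assert abs(math.log(x) - math.log(2) * math.log2(x)) < 1e-12

# Reproduce the tabulated constants:
print(f"{math.log10(2):.5f} {math.log2(10):.4f}")      # 0.30103 3.3219
print(f"{math.log10(math.e):.5f} {math.log(10):.5f}")  # 0.43429 2.30259
print(f"{math.log(2):.5f} {math.log2(math.e):.5f}")    # 0.69315 1.44270
```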

Page 10

Information (Source)

S1 S2 ‧ ‧ ‧ Sq : source alphabet
P1 P2 ‧ ‧ ‧ Pq : probabilities

Facts:

1) The information content (surprise) is somewhat inversely related to the probability of occurrence.

2) The information content from two different independent symbols is the sum of the information content from each separately. Since the probabilities of two independent choices are multiplied together to get the probability of the compound event, it is natural to define the amount of information as

$$I(S_i) = \log \frac{1}{P_i} \quad \text{or} \quad -\log P_i$$

As a result, for independent symbols with $P_{12} = P_1 P_2$, we have

$$I(S_1 S_2) = \log \frac{1}{P_1 P_2} = I(S_1) + I(S_2)$$
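A minimal numeric check of the additivity property (my own sketch, not from the slides):

```python
import math

p1, p2 = 0.5, 0.25                  # probabilities of two independent symbols
i1 = -math.log2(p1)                 # I(S1) = 1 bit
i2 = -math.log2(p2)                 # I(S2) = 2 bits
i_joint = -math.log2(p1 * p2)       # I(S1 S2) = 3 bits, since P12 = P1 * P2
assert abs(i_joint - (i1 + i2)) < 1e-12
print(i1, i2, i_joint)              # 1.0 2.0 3.0
```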

Page 11

Entropy: average information content over the whole alphabet of symbols,

$$H_r(S) = \sum_{i=1}^{q} P_i \log_r \frac{1}{P_i}, \qquad H_r(S) = (\log_r 2)\, H_2(S)$$

*The entropy of the Source can have no meaning unless a model of the Source is included. For a sequence of numbers, if we cannot recognize that they are pseudo-random numbers, then we would probably compute the entropy based on the frequency of occurrence of the individual numbers.

Page 12

*The entropy function involves only the distribution of the probabilities: it is a function of a probability distribution {Pi} and does not involve the Si.

Ex: Weather of Taipei
X = {Rain, Fine, Cloudy, Snow} = {R, F, C, S}
P(R) = ¼, P(F) = ½, P(C) = ¼, P(S) = 0
H2(X) = 1.5 bits/symbol
If P(i) = ¼ for each i (equiprobable), H2(X) = 2 bits/symbol (> 1.5).
H(X) = 0 for a certain event, i.e., when each P(ai) is either 0 or 1.
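A small sketch (mine, not from the slides) reproducing the Taipei example with a general entropy function:

```python
import math

def entropy(probs, base=2):
    """H(X) = -sum p*log(p); terms with p = 0 contribute nothing."""
    return sum(-p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.25, 0.5, 0.25, 0.0]))   # 1.5 bits/symbol (Taipei weather)
print(entropy([0.25] * 4))               # 2.0 bits/symbol (equiprobable case)
print(entropy([1.0, 0.0, 0.0, 0.0]))     # 0.0 for a certain event
```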

Page 13

The logarithmic measure is more convenient for various reasons:

1. It is practically more useful. Parameters of engineering importance, such as time, bandwidth, number of relays, etc., tend to vary linearly with the logarithm of the number of possibilities. For example, adding one relay to a group doubles the number of possible states of the relays; it adds 1 to the base-2 logarithm of this number.
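In symbols (a one-line restatement, added here for clarity): $N$ two-state relays have $2^N$ possible states, and

$$\log_2 2^{N+1} = N + 1 = \log_2 2^N + 1$$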

Page 14

2. It is nearer to human perception, whose response is roughly logarithmic:
– light intensity → eye
– sound volume → ear

3. It is mathematically more suitable:
– log2 → bits
– log10 → decimal digits
– loge → natural units

Changing from base a to base b merely requires multiplication by log_b a.

Page 15

Course Contents

Basic Information Theory:
– Entropy, Relative Entropy and Mutual Information

Data Compression / Compaction:
– Kraft Inequality, the prefix condition and instantaneously decodable codes

Variable Length Codes:
– Huffman code, Arithmetic code and L-Z code

Coding Techniques:
– DPCM (predictive coding)
– Transform coding (Discrete Cosine Transform)
– JPEG (JPEG2000)
– Motion Estimation and Compensation
– MPEG-1, 2, 4
– H.26P, H.264, HEVC (H.265)
– Distributed Video Coding

Steganography and Information Hiding:
– Digital Watermarking

Page 16

References:

1. Elements of Information Theory, Thomas M. Cover & Joy A. Thomas, Wiley, 2nd ed., 2006.
2. Introduction to Data Compression, Khalid Sayood, 1996.
3. JPEG / MPEG-1 coding standards.
4. Introduction to Information Theory and Data Compression, Greg A. Harris & Peter D. Johnson, CRC Press, 1998.

Page 17

5. The Minimum Description Length Principle, Peter D. Grünwald, The MIT Press, 2007.
6. IEEE Trans. on Information Theory; Circuits and Systems for Video Technology; Signal Processing; Image Processing; Multimedia; Information Forensics and Security; Communications; and Computers.
7. Open-access journal: Entropy.

Page 18

Grading: Basic Requirements:

1. Homework (20%)
2. In-Class Written Exam (20%)
3. Programming Assignments (25%):
– Variable Length Codes
– JPEG Compressor

Grade: from C+ to B+ (midterm dependent)

Page 19

Final Project: (35%)

1. Realization of Video Codecs:
a. Real-time MPEG-1 decoder --- (A- to A)
b. Real-time MPEG-2 / H.264 decoder --- (A to A+)
   Near-real-time MPEG-1 encoder --- (A to A+)
c. H.264 encoder / H.265 codecs --- (A+)
d. Other advanced Multimedia Codecs

Page 20

Examples of Other Advanced Multimedia Codecs: (A- to A+)

1. Compression Algorithms for 3D Videos
2. Parallelized Video Codecs – GPU-based
3. Scalable Video Codecs – Multi-resolution
4. Distributed Video Codecs – Cloud-based
5. Advanced Audio Codecs
6. Compression Algorithms for AR/VR
7. Compression Algorithms in the Encrypted Domain

Page 21

2. Applications of Information Theory in a Specific Research Field (1):

a. A survey of non-Shannon entropies and their applications (e.g., Permutation Entropy)
b. Information Geometry and Topology
c. Information in Complexity Theory
d. Information-theory-based Machine Learning
e. Information Theory in Security and Privacy Protection

Page 22

2. Applications of Information Theory in a Specific Research Field (2):

f. Information Theory in Data Visualization
g. Information Theory in Computer Vision and Image Processing
h. Information-Theoretic Analysis of DL (or CNNs)
i. Information Theory in Financial Analysis
j. Others --- to be proposed and confirmed

Page 23

Another choice for those who are not interested in the Multimedia Coding/Programming Assignments:

Mid-term (in addition to the in-class written exam):

1. A brief survey report on any one of the above-mentioned research subjects (15%); 4 pages maximum.
2. A brief oral presentation on any one of the above-mentioned research subjects (10%); 10 mins. maximum.

Page 24

Final: (35%)

A group (maximum 3 members) tries to realize/investigate an Information Theory related final project proposed by the group members.

The grading depends on the quality of the group's 20-min. oral presentation in the Final and of the hand-in report.