

ECE307 Information Theory and Coding L T P C 3 0 0 3

Version No.: 1.10 Prerequisite: ECE305 Digital Communication

Objectives:

• Describe and analyze the information source and channel capacity

• Differentiate between uniform and non-uniform quantization

• Analyze source coding techniques such as Shannon-Fano encoding, Huffman coding, and arithmetic coding

• Apply statistical techniques for signal detection

• Construct various channel coding schemes such as block codes, cyclic codes, and convolutional codes

Expected Outcome:

1. Apply mathematical models that describe the behavior of information sources and channel capacity, and the performance of source coding and channel coding techniques

2. Solve mathematical problems in source coding and channel coding techniques and implement them in MATLAB

Unit I: Information Theory
Introduction, Uncertainty, Information and its properties, Entropy and its properties, Joint and Conditional Entropy, Mutual Information and its properties, Information measures for continuous random variables.

Unit II: Channel Classification and Capacity
Channel capacity theorem, Continuous and discrete communication channels – Discrete memoryless channels – channel representations – noiseless channels, lossless channels, deterministic channels, Binary Symmetric Channel (BSC), Binary Erasure Channel (BEC) and their capacities.

Unit III: Source Coding Techniques
Coding for discrete memoryless sources: fixed-length code words, variable-length code words, Kraft inequality, prefix coding, Shannon's first, second, and third theorems, Shannon binary encoding, Shannon-Fano encoding, Huffman coding (minimum- and maximum-variance methods), arithmetic coding, dictionary coding – LZ and LZW coding.

Unit IV: Error Control Coding
Types of errors, types of codes, Linear block codes: error detection and error correction capabilities of linear block codes, Binary cyclic codes, encoding using shift registers, syndrome calculation, error detection and error correction, Convolutional codes – encoders and decoders for convolutional codes, LDPC codes, trellis codes, turbo codes, Viterbi decoding.

Unit V: Detection of Signals and Channels with Noise
Hypothesis testing – Bayes criterion – minimum error probability criterion, Neyman-Pearson criterion, minimax criterion – maximum likelihood detector – Wiener filter – continuous and discrete channels with noise.
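As an illustrative aside (not part of the official syllabus), the entropy and mutual-information measures of Unit I reduce to short computations. The sketch below assumes probability mass functions given as plain lists; the function names are my own:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint pmf matrix."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))
# Independent X and Y share no information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))
```

Note that the `p > 0` guard implements the convention 0·log 0 = 0 used when defining entropy.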
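The BSC and BEC capacities listed in Unit II have closed forms, C = 1 - H(p) and C = 1 - e respectively, which can be checked numerically (an illustrative sketch, not syllabus material; function names are my own):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Binary symmetric channel with crossover probability p: C = 1 - H(p)."""
    return 1 - h2(p)

def bec_capacity(e):
    """Binary erasure channel with erasure probability e: C = 1 - e."""
    return 1 - e

print(bsc_capacity(0.0))   # noiseless channel: full 1 bit per use
print(bsc_capacity(0.5))   # output independent of input: zero capacity
print(bec_capacity(0.25))
```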
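Unit III's Huffman coding and Kraft inequality can be tied together in one small sketch (again illustrative only, not an official part of the course; the implementation choices are my own). Repeatedly merging the two least probable subtrees yields a prefix code whose lengths satisfy the Kraft inequality with equality:

```python
import heapq

def huffman_codes(freqs):
    """Build a Huffman code from {symbol: probability}; returns {symbol: bitstring}."""
    # Each heap entry is (probability, tiebreaker, partial code table).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
# For dyadic probabilities the code lengths equal -log2(p), and the
# Kraft sum over a complete prefix code is exactly 1.
print(codes)
print(sum(2 ** -len(w) for w in codes.values()))
```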
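Unit IV's syndrome calculation can be illustrated with the classic Hamming(7,4) code (an illustrative sketch, not syllabus material): for a single-bit error, the syndrome equals the column of the parity-check matrix at the error position, so the error can be located and flipped.

```python
# Hamming(7,4) over GF(2), systematic form: G = [I4 | P], H = [P^T | I3].
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def encode(data):
    """Multiply the 4-bit message by G modulo 2."""
    return [sum(d * g for d, g in zip(data, col)) % 2 for col in zip(*G)]

def decode(received):
    """Syndrome decoding: the syndrome matches the H column at the error bit."""
    r = list(received)
    s = [sum(h * x for h, x in zip(row, r)) % 2 for row in H]
    if any(s):
        for i in range(7):
            if [row[i] for row in H] == s:
                r[i] ^= 1   # flip the located error bit
                break
    return r[:4]            # systematic code: first four bits are the data

cw = encode([1, 0, 1, 1])
cw[2] ^= 1                  # inject a single-bit error
print(decode(cw))           # the original message is recovered
```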
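For Unit V, the maximum likelihood detector for a binary antipodal signal in Gaussian noise compares the two conditional likelihoods; with equal noise variance this reduces to picking the nearer signal point. A minimal sketch (illustrative only; the signal levels and function name are my own choices):

```python
import math

def ml_detect(y, s0=-1.0, s1=1.0, sigma=1.0):
    """Maximum likelihood detection for binary antipodal signaling in AWGN:
    choose the hypothesis whose Gaussian likelihood of the observation is larger."""
    l0 = math.exp(-((y - s0) ** 2) / (2 * sigma ** 2))
    l1 = math.exp(-((y - s1) ** 2) / (2 * sigma ** 2))
    return 1 if l1 > l0 else 0

print(ml_detect(0.8))    # observation closer to +1
print(ml_detect(-0.3))   # observation closer to -1
```

Bayes and Neyman-Pearson detectors follow the same likelihood-ratio structure, differing only in the threshold applied to l1/l0.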

Proceedings of the 29th Academic Council [26.4.2013] 330


Textbooks:
1. K. Sam Shanmugam, "Digital and Analog Communication Systems", John Wiley and Sons, 2006.
2. Simon Haykin, "Communication Systems", John Wiley and Sons, 2009.

Reference Books:
1. Thomas M. Cover, Joy A. Thomas, "Elements of Information Theory", John Wiley and Sons, 2004.
2. Ranjan Bose, "Information Theory, Coding and Cryptography", Tata McGraw Hill, 2012.

Mode of Evaluation: CAT I & II, Quizzes, Assignments/other tests, Term End Examination.
