Prof. S.N. Merchant - IIT Bombay
Introduction to Information Theory and Coding
Definition of Information Measure and Entropy
Extension of An Information Source and Markov Source
Adjoint of An Information Source, Joint and Conditional Information Measure
Properties of Joint and Conditional Information Measures and A Markov Source
Asymptotic Properties of Entropy and Problem Solving in Entropy
Block Code and its Properties
Instantaneous Code and Its Properties
Kraft-McMillan Inequality and Compact Codes
Shannon's First Theorem
Coding Strategies and Introduction to Huffman Coding
Huffman Coding and Proof of Its Optimality
Competitive Optimality of The Shannon Code
Non-Binary Huffman Code and Other Codes
Adaptive Huffman Coding Part-I
Adaptive Huffman Coding Part-II
Shannon-Fano-Elias Coding and Introduction to Arithmetic Coding
Arithmetic Coding Part-I
Arithmetic Coding Part-II
Introduction to Information Channels
Equivocation and Mutual Information
Properties of Different Information Channels
Reduction of Information Channels
Properties of Mutual Information and Introduction to Channel Capacity
Calculation of Channel Capacity for Different Information Channels
Shannon's Second Theorem
Discussion On Error-Free Communication Over A Noisy Channel
Error-Free Communication Over A Binary Symmetric Channel and Introduction to Continuous Sources and Channels
Differential Entropy and Evaluation of Mutual Information for Continuous Sources and Channels
Channel Capacity of A Band-Limited Continuous Channel
Introduction to Rate-Distortion Theory
Definition and Properties of Rate-Distortion Functions
Calculation of Rate-Distortion Functions
Computational Approach for Calculation of Rate-Distortion Functions
Introduction to Quantization
Lloyd-Max Quantizer
Companded Quantization
Variable Length Coding and Problem Solving in Quantizer Design
Vector Quantization
Transform Coding Part-I
Transform Coding Part-II