THEORY EXAMINATION (SEM–IV) 2016-17 INFORMATION THEORY AND CODING
Course: B.Tech (ECE – 4th Semester)
Subject Code: NEC408
Subject Title: Information Theory and Coding
Exam Type: Theory
Duration: 3 Hours
Maximum Marks: 100
SECTION – A (10 × 2 = 20 Marks)
Short, concept-based questions covering basic principles of communication and coding.
Key Topics:
Communication System Block Diagram – Source → Encoder → Channel → Decoder → Receiver.
Entropy Maximum Condition – Entropy is maximum when all symbols are equiprobable.
- $H_{max} = \log_2 N$
Source efficiency: $\eta = \frac{H}{H_{max}} \times 100\%$
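As a quick check of these two formulas, here is a minimal Python sketch (the four-symbol equiprobable source is illustrative):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p_i log2 p_i, in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equiprobable symbols: entropy reaches its maximum log2 N.
probs = [0.25, 0.25, 0.25, 0.25]
H = entropy(probs)
H_max = math.log2(len(probs))
efficiency = H / H_max * 100
print(H, H_max, efficiency)  # 2.0 2.0 100.0
```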
Non-Singular Code – Each codeword must be distinct (one-to-one mapping).
Properties of Mutual Information:
- $I(X;Y) \geq 0$
- $I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$
Shannon-Hartley Theorem:
- $C = B \log_2\left(1 + \frac{S}{N}\right)$
where $C$ = capacity, $B$ = bandwidth, $S/N$ = signal-to-noise ratio.
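A short numerical sketch of the theorem (the 3 kHz bandwidth and 30 dB SNR figures are illustrative, not from the paper):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley: C = B log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: 3 kHz channel with 30 dB SNR (linear ratio 1000).
C = channel_capacity(3000, 1000)
print(round(C))  # ≈ 29.9 kbit/s
```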
Properties of Block Codes: Linearity, fixed block length, error detection and correction.
Hamming Weight: Number of 1’s in a code vector.
Example: C₁ = 0001010 → weight = 2; C₂ = 1010101 → weight = 4.
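The two examples can be verified in one line of Python each:

```python
def hamming_weight(codeword):
    """Number of 1s in a binary code vector given as a bit string."""
    return codeword.count("1")

print(hamming_weight("0001010"))  # 2
print(hamming_weight("1010101"))  # 4
```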
Convolutional Codes: Continuous stream-based coding vs block-based; defined by generator polynomials.
Zero Memory Information Source: Emits independent symbols →
- $H(X^n) = nH(X)$
(23,12) Golay Code: Called a perfect code because it meets the Hamming bound with equality.
SECTION – B (5 × 10 = 50 Marks)
Numerical and derivation-based questions on entropy, coding algorithms, and information measures.
Key Problems:
(a) Entropy Calculation & Source Extension
Source: $S = \{S_1, S_2, S_3, S_4\}$
Probabilities: 4/11, 3/11, 2/11, 2/11
Compute:
- $H(S) = -\sum p_i \log_2 p_i$
and show $H(S^2) = 2H(S)$.
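A Python sketch that computes $H(S)$ for this source and verifies the extension property numerically (pairs in the second extension get product probabilities because the source is memoryless):

```python
import math
from itertools import product

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [4/11, 3/11, 2/11, 2/11]
H = entropy(probs)

# Second extension: all ordered symbol pairs with product probabilities.
ext = [p * q for p, q in product(probs, repeat=2)]
H2 = entropy(ext)
print(round(H, 4), round(H2, 4))  # H ≈ 1.9363, and H2 == 2*H
```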
(b) Source Coding
Need: Reduce redundancy → improve channel efficiency.
Compact Codes: Codes minimizing average length (e.g., Huffman codes).
Extremal Property of Entropy: Over all distributions on $N$ symbols, entropy is maximized by the uniform distribution, i.e. $H \leq \log_2 N$.
(c) Shannon–Fano Coding Example
Source: $S = \{X_1, X_2, X_3, X_4, X_5, X_6\}$
Probabilities: 0.4, 0.2, 0.2, 0.1, 0.08, 0.02
Tasks:
Generate Shannon–Fano code table.
Calculate efficiency:
- $\eta = \frac{H}{L_{avg}} \times 100\%$
Explain differential entropy for continuous signals.
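A sketch of one common Shannon-Fano construction (ties in the split can be broken either way, so the code table may differ from a hand-worked answer while achieving the same efficiency):

```python
import math

def shannon_fano(symbols):
    """Recursive Shannon-Fano: split the probability-sorted list where
    the two halves' totals are as equal as possible; prefix 0/1."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    split, best = 1, float("inf")
    for i in range(1, len(symbols)):
        head = sum(p for _, p in symbols[:i])
        if abs(total - 2 * head) < best:
            best, split = abs(total - 2 * head), i
    codes = {s: "0" + c for s, c in shannon_fano(symbols[:split]).items()}
    codes.update({s: "1" + c for s, c in shannon_fano(symbols[split:]).items()})
    return codes

src = [("X1", 0.4), ("X2", 0.2), ("X3", 0.2),
       ("X4", 0.1), ("X5", 0.08), ("X6", 0.02)]
codes = shannon_fano(src)
H = -sum(p * math.log2(p) for _, p in src)
L = sum(p * len(codes[s]) for s, p in src)
print(codes)
print(round(H / L * 100, 1), "% efficient")
```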
(d) Linear Block Codes
Given parity matrix:
- $P = \begin{bmatrix} 1 & 1 & 1\\ 1 & 1 & 0\\ 1 & 0 & 1\\ 0 & 1 & 1 \end{bmatrix}$
Find code vectors.
Detect & correct error in R = [1011100].
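A minimal sketch of the encoding and syndrome-decoding steps, assuming the standard systematic form $G = [I_4 \mid P]$ and $H = [P^T \mid I_3]$:

```python
# Systematic (7,4) linear block code built from the parity matrix P above.
P = [[1,1,1], [1,1,0], [1,0,1], [0,1,1]]
# G = [I_4 | P]; H = [P^T | I_3]
G = [[int(i == j) for j in range(4)] + P[i] for i in range(4)]
H = [[P[j][i] for j in range(4)] + [int(i == j) for j in range(3)] for i in range(3)]

def encode(u):
    return [sum(u[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

codewords = [encode([(m >> b) & 1 for b in (3, 2, 1, 0)]) for m in range(16)]

# Syndrome decoding for R = 1011100: s = R H^T.
R = [1, 0, 1, 1, 1, 0, 0]
s = [sum(R[j] * H[i][j] for j in range(7)) % 2 for i in range(3)]
# A single-bit error sits where the syndrome matches a column of H.
err = next(j for j in range(7) if [H[i][j] for i in range(3)] == s)
R[err] ^= 1
print(err + 1, R)  # error in bit 3; corrected vector is a valid codeword
```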
(e) Convolutional Code (4,3,2)
Given input polynomials:
$u^{(1)}(D) = 1 + D^2,\quad u^{(2)}(D) = 1 + D,\quad u^{(3)}(D) = 1 + D$
Construct codeword using transform-domain approach.
(f) Mutual Information Example
Given the joint probability matrix (a 5×4 table on page 2) for sources A and B.
Compute:
- $H(A)$, $H(B)$, $H(A,B)$, and $I(A;B) = H(A) + H(B) - H(A,B)$
Discuss a priori entropy, a posteriori entropy, and equivocation.
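Since the paper's 5×4 joint table is not reproduced here, this sketch uses a small hypothetical 2×2 joint distribution just to show the computation:

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(a, b) — stand-in for the paper's table.
PAB = [[0.4, 0.1],
       [0.1, 0.4]]
PA = [sum(row) for row in PAB]            # marginal of A (row sums)
PB = [sum(col) for col in zip(*PAB)]      # marginal of B (column sums)
HA, HB = H(PA), H(PB)
HAB = H([p for row in PAB for p in row])  # joint entropy
I = HA + HB - HAB
print(round(I, 4))  # ≈ 0.278 bits
```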
(g) Huffman Coding Example
Source probabilities: $A = 1/3,\ B = 1/27,\ C = 1/3,\ D = 1/9,\ E = 1/9,\ F = 1/27,\ G = 1/27$
Construct ternary Huffman code, compute average length & efficiency.
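A sketch of the ternary Huffman construction via repeated 3-way merges (valid here since $(7-1)$ is divisible by $r-1 = 2$, so no dummy symbols are needed); only code lengths are tracked, which is all the average length requires:

```python
import heapq
import math

# Ternary Huffman: repeatedly merge the three least-probable nodes.
probs = {"A": 1/3, "B": 1/27, "C": 1/3, "D": 1/9,
         "E": 1/9, "F": 1/27, "G": 1/27}
heap = [(p, [s]) for s, p in probs.items()]
heapq.heapify(heap)
lengths = {s: 0 for s in probs}
while len(heap) > 1:
    merged_p, merged_syms = 0.0, []
    for _ in range(3):
        p, syms = heapq.heappop(heap)
        merged_p += p
        merged_syms += syms
        for s in syms:
            lengths[s] += 1  # every symbol in the merge gains one trit
    heapq.heappush(heap, (merged_p, merged_syms))

L_avg = sum(probs[s] * lengths[s] for s in probs)
H3 = -sum(p * math.log(p, 3) for p in probs.values())  # entropy in trits
print(round(L_avg, 4), round(H3, 4))  # equal here: efficiency is 100%
```

All probabilities are powers of 1/3, so the average length equals the ternary entropy exactly.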
(h) Shannon–Fano–Elias Coding
Source: $A = 0.25,\ B = 0.25,\ C = 0.2,\ D = 0.15,\ E = 0.15$
Find code length and efficiency.
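A sketch of the Shannon–Fano–Elias construction: each symbol is coded by the binary expansion of $\bar{F}(x) = F(x^-) + p(x)/2$, truncated to $l(x) = \lceil \log_2(1/p(x)) \rceil + 1$ bits:

```python
import math

probs = {"A": 0.25, "B": 0.25, "C": 0.2, "D": 0.15, "E": 0.15}
codes, F = {}, 0.0
for s, p in probs.items():
    fbar = F + p / 2                     # midpoint of the symbol's interval
    l = math.ceil(math.log2(1 / p)) + 1  # SFE code length
    bits, frac = "", fbar
    for _ in range(l):                   # binary expansion, l bits
        frac *= 2
        bit = int(frac)
        bits += str(bit)
        frac -= bit
    codes[s] = bits
    F += p

L_avg = sum(probs[s] * len(codes[s]) for s in probs)
H = -sum(p * math.log2(p) for p in probs.values())
print(codes, round(L_avg, 2), round(H / L_avg * 100, 1))
```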
SECTION – C (2 × 15 = 30 Marks)
Advanced questions focusing on channel capacity, error control codes, and system design.
Q3. Binary Symmetric Channel (BSC)
Given transition matrix and probabilities:
$P(Y|X) = \begin{bmatrix} 3/4 & 1/4 \\ 1/4 & 3/4 \end{bmatrix},\quad P(X_1) = 2/3,\quad P(X_2) = 1/3$
Find:
$H(X)$, $H(Y)$, $H(Y|X)$, and the channel capacity
- $C = H(Y) - H(Y|X)$
Also check Kraft–McMillan Inequality to identify instantaneous codes among A, B, C, D (table on page 2).
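A sketch of the BSC computation (note that $H(Y) - H(Y|X)$ with the given inputs is the mutual information at that input distribution; capacity proper maximizes it over all inputs, which for a BSC occurs at equiprobable inputs):

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# BSC with crossover probability 1/4 and P(x1) = 2/3, P(x2) = 1/3.
p_x = [2/3, 1/3]
P = [[3/4, 1/4],
     [1/4, 3/4]]                 # rows are P(y|x)
p_y = [sum(p_x[i] * P[i][j] for i in range(2)) for j in range(2)]

HX = H(p_x)
HY = H(p_y)
HY_given_X = sum(p_x[i] * H(P[i]) for i in range(2))  # = H(1/4) for a BSC
I = HY - HY_given_X
print(round(HX, 4), round(HY, 4), round(HY_given_X, 4), round(I, 4))
```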
Q4. Short Notes
(i) BCH & RS Codes – Cyclic error-correcting codes for multi-bit errors.
(ii) Golay Codes – Perfect codes used in deep-space communication.
(iii) Burst vs Random Error Correction – Burst errors occur in clusters; random occur independently.
Also, given a (6,3) linear block code, where:
$C_4 = d_1 + d_2,\quad C_5 = d_1 + d_3,\quad C_6 = d_2 + d_3$
Tasks:
Write Generator (G) and Parity-check (H) matrices.
Construct Standard Array Table for decoding.
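A sketch that builds $G$ and $H$ from the three parity equations and checks the minimum distance (the standard array itself is an 8-column table best laid out by hand, so only the codewords are generated here):

```python
# (6,3) systematic code with c4 = d1+d2, c5 = d1+d3, c6 = d2+d3.
P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1]]
G = [[int(i == j) for j in range(3)] + P[i] for i in range(3)]            # [I_3 | P]
H = [[P[j][i] for j in range(3)] + [int(i == j) for j in range(3)]
     for i in range(3)]                                                   # [P^T | I_3]

codewords = []
for m in range(8):
    u = [(m >> b) & 1 for b in (2, 1, 0)]
    codewords.append([sum(u[i] * G[i][j] for i in range(3)) % 2 for j in range(6)])

# For a linear code, d_min = minimum nonzero Hamming weight.
d_min = min(sum(c) for c in codewords if any(c))
print(d_min)  # 3 → corrects all single-bit errors
```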
Q5. Convolutional Codes & Hamming Distance
(a) Define and explain Hamming Distance and Minimum Distance with examples.
(b) For (3,1,2) convolutional code with
$g_1 = 110,\ g_2 = 101,\ g_3 = 111$:
Draw Encoder Diagram.
Find Generator Matrix.
Encode information sequence (11101) using time-domain approach.
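A time-domain encoding sketch: each of the three output streams is the mod-2 convolution of the input with one generator sequence, with $m = 2$ extra steps to flush the registers:

```python
def conv_encode(u, gens):
    """Time-domain convolutional encoding: output block at time t is
    (u * g) mod 2 for each generator g, including the tail."""
    m = len(gens[0]) - 1                  # encoder memory
    out = []
    for t in range(len(u) + m):
        block = []
        for g in gens:
            bit = 0
            for j, tap in enumerate(g):   # mod-2 convolution sum
                if tap and 0 <= t - j < len(u):
                    bit ^= u[t - j]
            block.append(bit)
        out.append(block)
    return out

u = [1, 1, 1, 0, 1]
gens = [[1, 1, 0], [1, 0, 1], [1, 1, 1]]  # g1, g2, g3
code = conv_encode(u, gens)
print(code)
```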
Key Concepts Summary
| Concept | Formula / Definition |
|---|---|
| Entropy | $H = -\sum p_i \log_2 p_i$ |
| Mutual Information | $I(X;Y) = H(X) - H(X\mid Y)$ |
| Channel Capacity | $C = B \log_2(1 + S/N)$ |
| Source Efficiency | $\eta = H / L_{avg}$ |
| Hamming Distance | Number of differing bits between two codewords |
| Linear Block Code | $C = uG$, where $G = [I_k \mid P]$ |
| Convolutional Code | Output = input convolved with generator polynomials |
Summary
The Information Theory and Coding (NEC408) paper thoroughly tests:
Fundamentals of information measures (entropy, mutual info)
Source coding algorithms (Shannon–Fano, Huffman, Elias)
Error control coding (block, cyclic, convolutional)
Channel capacity and performance