(SEM VIII) THEORY EXAMINATION 2018-19 DATA COMPRESSION


SECTION A (2 × 10 = 20 Marks)

 

Section A contains short-answer questions, but you should not write one-line answers. Even for 2 marks, you must write definition + explanation + application (where required).

Let us explain each concept clearly.

 

(a) Huffman Coding – Lossless or Lossy?

 

Huffman coding is a lossless compression technique. In lossless compression, the original data can be perfectly reconstructed from the compressed data without any loss of information.

 

Huffman coding assigns variable-length codes to symbols based on their probabilities. Symbols with higher probability get shorter codes, while less frequent symbols get longer codes. This minimizes the average code length and improves compression efficiency.
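The greedy construction described above can be sketched in Python (an illustrative sketch, not part of the question paper; the sample text and names are assumptions):

```python
import heapq
from collections import Counter

def huffman_codes(freq):
    """Build a Huffman code table from a symbol -> frequency map."""
    # Heap entries are (weight, tie_breaker, [(symbol, code), ...]).
    heap = [(w, i, [(sym, "")]) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, lo = heapq.heappop(heap)   # the two least probable subtrees...
        w2, _, hi = heapq.heappop(heap)
        # ...are merged; one side gets prefix "0", the other prefix "1"
        merged = ([(s, "0" + c) for s, c in lo] +
                  [(s, "1" + c) for s, c in hi])
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return dict(heap[0][2])

freq = Counter("this is an example of a huffman tree")  # sample text (assumed)
codes = huffman_codes(freq)
# Frequent symbols (space: 7 occurrences) get codes no longer than rare ones ('x': 1):
assert len(codes[" "]) <= len(codes["x"])
```

Because the two lightest subtrees are always merged first, no codeword ends up as a prefix of another, so the result is a prefix code.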

 

Applications of Huffman coding include:

 

ZIP file compression

JPEG image compression (lossless stage)

MP3 compression (entropy coding stage)

Text compression utilities

Since it preserves exact data, it is suitable for text, executable files, and medical data.

 

(b) Composite Source Model

 

A composite source model represents a source that consists of multiple sub-sources. Instead of assuming a single probability distribution, the source is modeled as a mixture of several distributions.

 

For example, in image compression, different regions of an image may have different statistical properties. A composite model handles such variations more efficiently by adapting to local characteristics.

 

It improves compression efficiency when source statistics change over time.

 

(c) Prefix Codes

 

A prefix code is a type of code in which no codeword is a prefix of another codeword. This ensures that the code is uniquely decodable.

 

For example, the set {0, 10, 110, 111} is a prefix code because none of the codewords begin with another complete codeword.

 

Huffman coding produces prefix codes. The prefix property ensures instantaneous decoding without ambiguity: the decoder can output a symbol the moment its last bit arrives, with no look-ahead.
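The prefix property is easy to test mechanically; a minimal Python sketch (illustrative, function name assumed):

```python
def is_prefix_code(codewords):
    """True if no codeword is a prefix of another (hence instantaneously decodable)."""
    words = sorted(codewords)
    # After sorting, all extensions of a codeword follow it contiguously,
    # so checking each adjacent pair is sufficient.
    return all(not words[i + 1].startswith(words[i])
               for i in range(len(words) - 1))

print(is_prefix_code(["0", "10", "110", "111"]))  # True
print(is_prefix_code(["0", "01", "11"]))          # False: "0" is a prefix of "01"
```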

 

(d) JBIG Standard

 

JBIG stands for Joint Bi-level Image Experts Group. It is a standard for compressing bi-level images (images with only black and white pixels).

 

It uses arithmetic coding and context modeling to achieve high compression efficiency. JBIG is widely used in fax transmission and document imaging systems.

 

(e) Entropy

 

Entropy is a measure of average information content or uncertainty of a source. It is defined by Shannon as:

H = − ∑ p(x) log₂ p(x)

 

Entropy represents the theoretical lower bound on the average number of bits required to encode symbols.

Higher entropy means more randomness and less compressibility.
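The formula can be checked numerically with a short Python sketch (illustrative):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum p log2 p, in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 -- a fair coin: maximum entropy for two symbols
print(entropy([0.9, 0.1]))  # ~0.469 -- skewed, so more predictable and compressible
```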

 

(f) Compression Ratio

 

Compression ratio is defined as:

Compression Ratio = Original Size / Compressed Size

It indicates how effectively data has been compressed; a higher ratio means better compression efficiency. For example, a 100 KB file compressed to 25 KB has a compression ratio of 4:1.

 

(g) Uniquely Decodable Code

 

Given code: {0, 10, 110, 111}

 

This is uniquely decodable because no codeword is a prefix of another. It satisfies the prefix condition, and every prefix code is uniquely decodable (the converse does not hold: some uniquely decodable codes are not prefix codes).

 

(h) Compression Technique in Unix “compress”

 

The Unix "compress" command uses the LZW (Lempel-Ziv-Welch) algorithm, a dictionary-based lossless compression technique.
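The core of LZW can be sketched in a few lines of Python (an illustrative encoder, not the actual "compress" source; the input string and names are assumptions):

```python
def lzw_compress(data):
    """Dictionary-based LZW encoder: outputs a list of dictionary indices."""
    dictionary = {chr(i): i for i in range(256)}   # start with single-byte entries
    w, out = "", []
    for ch in data:
        wc = w + ch
        if wc in dictionary:
            w = wc                            # keep extending the current match
        else:
            out.append(dictionary[w])         # emit code for the longest match
            dictionary[wc] = len(dictionary)  # add the new phrase
            w = ch
    if w:
        out.append(dictionary[w])
    return out

codes = lzw_compress("TOBEORNOTTOBEORTOBEORNOT")
print(len(codes))  # 16 codes for 24 input characters
```

Repeated phrases such as "TOBEOR" are replaced by single dictionary indices, which is where the compression comes from.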

 

(i) Uniform Quantizer

 

A uniform quantizer divides the range of signal values into equal-sized intervals. Each interval is represented by a fixed quantization level.

 

It is simple but not optimal when signal distribution is non-uniform.
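A midrise uniform quantizer can be sketched as follows (illustrative; parameter names are assumptions):

```python
def uniform_quantize(x, x_min, x_max, levels):
    """Map x to the midpoint of its equal-sized interval (midrise quantizer)."""
    step = (x_max - x_min) / levels
    index = min(int((x - x_min) / step), levels - 1)  # clamp the top edge
    return x_min + (index + 0.5) * step

# Range 0-10 with 5 levels: step size 2, reconstruction points 1, 3, 5, 7, 9
print(uniform_quantize(4.7, 0, 10, 5))   # 5.0
print(uniform_quantize(10.0, 0, 10, 5))  # 9.0 (top interval)
```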

 

(j) Entropy Coded Quantization

 

Entropy coded quantization combines quantization with entropy coding. After quantization, output symbols are encoded using Huffman or arithmetic coding.

 

It improves overall compression efficiency.

 

SECTION B (10 × 3 = 30 Marks)

 

This section requires detailed conceptual explanation.

Vector Quantization vs Scalar Quantization

 

Scalar quantization quantizes one sample at a time. Vector quantization quantizes blocks of samples together.

 

Vector quantization exploits correlation between samples. It provides better compression performance but requires larger codebooks and higher computational complexity.

 

Example:

 

Instead of quantizing pixels individually, a 2×2 block can be quantized as a vector.
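Nearest-neighbour lookup for such 2×2 blocks (flattened to 4-vectors) can be sketched as (illustrative; the toy codebook is an assumption):

```python
def nearest(vec, codebook):
    """Index of the codebook vector closest to vec (squared Euclidean distance)."""
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(vec, codebook[i])))

# Toy codebook of 2x2 pixel blocks: flat dark, flat bright, horizontal edge
codebook = [(0, 0, 0, 0), (255, 255, 255, 255), (0, 0, 255, 255)]
block = (10, 5, 250, 240)        # a block containing a horizontal edge
print(nearest(block, codebook))  # 2 -- the edge codevector
```

The whole block is then represented by a single index, which is how vector quantization exploits correlation between neighbouring samples.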

 

What is Data Compression?
 

Data compression reduces the number of bits required to represent data. It removes redundancy (statistical repetition) and, in lossy schemes, irrelevance (detail the receiver will not notice).

 

Why needed?

 

Reduce storage cost

Reduce transmission bandwidth

Improve transmission speed

A compression system has two blocks:

Encoder → removes redundancy
Decoder → reconstructs original or approximate data

 

Golomb Codes & Tunstall Codes

Golomb codes are used for encoding geometrically distributed data. They are efficient when probabilities decrease exponentially.

 

Tunstall coding is a variable-to-fixed length coding technique: it maps variable-length input sequences to fixed-length output codes, the reverse of the Huffman arrangement.
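A Rice code (the power-of-two special case of a Golomb code) is easy to sketch (illustrative):

```python
def rice_encode(n, k):
    """Rice code (Golomb code with m = 2**k): unary quotient, then k-bit remainder."""
    q = n >> k                   # quotient, coded in unary as q ones plus a zero
    r = n & ((1 << k) - 1)       # remainder, coded in k binary bits
    remainder = format(r, "b").zfill(k) if k > 0 else ""
    return "1" * q + "0" + remainder

# m = 4 (k = 2): small values get short codes, as a geometric source requires
for n in range(5):
    print(n, rice_encode(n, 2))
# 0 000, 1 001, 2 010, 3 011, 4 1000
```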

 

Quantization Problem

Quantization converts continuous amplitude values into discrete levels.

 

Problem:

It introduces distortion.

Need to minimize Mean Square Error (MSE).

 

Example:

If input range is 0–10 and divided into 5 levels, each level represents a range. Any value inside that range is approximated.
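The distortion introduced can be measured empirically; a Python sketch (illustrative; for step size D and a uniform input, theory predicts MSE of about D^2/12):

```python
import random

def quantize(x, step=2.0):
    """Round x to the midpoint of its step-sized interval (uniform quantizer)."""
    return (int(x // step) + 0.5) * step

random.seed(0)
samples = [random.uniform(0, 10) for _ in range(10000)]
mse = sum((x - quantize(x)) ** 2 for x in samples) / len(samples)
print(round(mse, 3))  # close to D^2 / 12 = 4 / 12 ~ 0.333
```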

 

Dictionary Based Coding Techniques

Dictionary-based coding stores repeated sequences in a dictionary.

 

Main types:

LZ77 (sliding window)

LZ78 (explicit dictionary)

LZW (improved LZ78)

They are widely used in ZIP, GIF, and PNG formats.

 

SECTION C (10 Marks Each – Analytical)

Lossless vs Lossy Compression

Lossless compression reconstructs the data exactly. It is used for text, documents, and executables, where any error is unacceptable.

Lossy compression discards information the receiver is unlikely to notice. It is used for images, audio, and video.

Lossless guarantees zero distortion; lossy accepts controlled distortion in exchange for much higher compression ratios.

Entropy Calculation

 

Given:

P(a1) = 1/2
P(a2) = 1/4
P(a3) = 1/8
P(a4) = 1/8

Entropy:

H = − [ (1/2)log₂(1/2) + (1/4)log₂(1/4) + 2(1/8)log₂(1/8) ]

H = − [ (1/2)(−1) + (1/4)(−2) + 2(1/8)(−3) ]

H = 1/2 + 1/2 + 3/4 = 1.75 bits/symbol

Huffman Coding Problem

 

Steps:

Arrange probabilities in ascending order.

Merge lowest probabilities.

Construct tree.

Assign 0 and 1 to branches.

Then calculate average bits per symbol:

L = ∑ p(x) × code length
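For the source from the entropy question (probabilities 1/2, 1/4, 1/8, 1/8) with the prefix code {0, 10, 110, 111}, the average length works out as follows (a check in Python, illustrative):

```python
probs = {"a1": 0.5, "a2": 0.25, "a3": 0.125, "a4": 0.125}
codes = {"a1": "0", "a2": "10", "a3": "110", "a4": "111"}

avg_len = sum(probs[s] * len(codes[s]) for s in probs)
print(avg_len)  # 1.75 bits/symbol
```

Because every probability here is a negative power of two, the average length equals the entropy (1.75 bits/symbol) exactly; in general Huffman coding only guarantees H <= L < H + 1.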

LZ77 vs LZ78

LZ77 uses a sliding window and emits (offset, length) pointer pairs into recently seen data.

LZ78 builds an explicit dictionary of phrases.

LZ77 can only match against its finite window of past data; LZ78 grows its dictionary dynamically over the whole input.
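A toy LZ77 encoder emitting (offset, length, next-symbol) triples (illustrative sketch; the window size and input string are assumptions):

```python
def lz77_compress(data, window=16):
    """Toy LZ77: emit (offset, length, next_char) triples over a sliding window."""
    i, out = 0, []
    while i < len(data):
        best_off, best_len = 0, 0
        for j in range(max(0, i - window), i):    # search the window for a match
            length = 0
            while (i + length < len(data) - 1
                   and data[j + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_off, best_len = i - j, length
        out.append((best_off, best_len, data[i + best_len]))
        i += best_len + 1
    return out

print(lz77_compress("aacaacabcab"))
# [(0, 0, 'a'), (1, 1, 'c'), (3, 4, 'b'), (6, 2, 'b')]
```

Note the third triple: the match of length 4 at offset 3 overlaps the current position, which real LZ77 decoders handle by copying byte by byte.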

PPM (Prediction by Partial Matching)

PPM predicts the next symbol from its preceding context (the last few symbols), using adaptive probability estimation; the resulting probabilities drive an arithmetic coder. When a symbol is new to the current context, an escape mechanism falls back to a shorter context.

Distortion Criteria

 

Common distortion measures:

Mean Square Error (MSE)

Peak Signal to Noise Ratio (PSNR)

Absolute error

Used to evaluate lossy compression performance.

Scalar vs Vector Quantization

Scalar quantizes single sample.

Vector quantizes group of samples.

Vector gives better performance but more complexity.

LBG Algorithm (Linde-Buzo-Gray)

LBG is used to design optimal codebook in vector quantization.

 

Steps:

Initialize codebook.

Partition training vectors.

Update centroids.

Repeat until convergence.
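The four steps above can be sketched as a k-means-style loop in Python (illustrative; the toy training set is an assumption):

```python
def lbg(vectors, codebook, iterations=20):
    """LBG refinement: partition by nearest codevector, then recompute centroids."""
    for _ in range(iterations):
        # Partition: assign each training vector to its nearest codevector
        cells = [[] for _ in codebook]
        for v in vectors:
            i = min(range(len(codebook)),
                    key=lambda k: sum((a - b) ** 2 for a, b in zip(v, codebook[k])))
            cells[i].append(v)
        # Update: move each codevector to the centroid of its cell
        codebook = [tuple(sum(c) / len(cell) for c in zip(*cell)) if cell else cb
                    for cell, cb in zip(cells, codebook)]
    return codebook

# Two clusters of 2-D training vectors; the codebook settles on their centroids
data = [(0.1, 0.2), (0.0, 0.1), (0.9, 1.0), (1.0, 0.9)]
print(lbg(data, [(0.0, 0.0), (1.0, 1.0)]))
```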
