B.TECH (SEM VIII) THEORY EXAMINATION 2018-19: DIGITAL IMAGE PROCESSING


SECTION A – Detailed Conceptual Explanation (2 × 10 = 20 Marks)

 

Section A requires short answers, but you must write clear definitions with brief explanations, not just one-line statements.

 

(a) Components of Image Processing System

 

A digital image processing system consists of several interconnected components that work together to acquire, process, and analyze images. The major components include:

 

Image acquisition system: sensors and digitizers that capture the image.
Processing unit: usually a computer with specialized hardware or software; it performs operations such as filtering, enhancement, and analysis.
Storage devices: hold input and processed images.
Display devices: used to visualize results.
Image communication system: may transmit processed images.

 

Each component plays a crucial role in transforming raw image data into meaningful information.

 

(b) Applications of Digital Image Processing

 

Digital image processing has applications in many fields. In medical imaging, it is used for MRI, CT scan, and X-ray enhancement. In remote sensing, satellite images are processed for environmental monitoring.

 

 In industrial inspection, image processing detects defects. In security systems, it is used for face recognition and fingerprint identification. In multimedia, it is used for compression and enhancement.

 

(c) Mask of Sobel Filter

 

The Sobel filter is a first-order derivative operator used for edge detection. It computes the intensity gradient in the horizontal and vertical directions.

 

Horizontal gradient mask (Gx):

-1 0 +1
-2 0 +2
-1 0 +1

 

Vertical gradient mask (Gy):

-1 -2 -1
0 0 0
+1 +2 +1

These masks emphasize edges by approximating the intensity gradient; the overall gradient magnitude is obtained as sqrt(Gx² + Gy²) or, approximately, |Gx| + |Gy|.
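As an illustration (not part of the original paper), the two masks can be applied with NumPy and SciPy; the helper name sobel_magnitude and the use of scipy.ndimage.convolve are my own choices for this sketch.

import numpy as np
from scipy.ndimage import convolve

# Sobel masks exactly as given above
GX = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=float)    # horizontal gradient
GY = np.array([[-1, -2, -1],
               [ 0,  0,  0],
               [ 1,  2,  1]], dtype=float)   # vertical gradient

def sobel_magnitude(img):
    # Gradient magnitude sqrt(Gx^2 + Gy^2) of a grayscale image
    img = np.asarray(img, dtype=float)
    gx = convolve(img, GX)
    gy = convolve(img, GY)
    return np.hypot(gx, gy)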

 

(d) Difference Between Image Enhancement and Restoration

Image enhancement improves the visual appearance of an image without using a mathematical model of the degradation. It is subjective and application dependent.

 

Image restoration, on the other hand, attempts to reconstruct the original image from the degraded image using mathematical models of the degradation and noise.

 

Enhancement improves perception; restoration improves fidelity.

 

(e) Laplacian Filter

 

The Laplacian filter is a second-order derivative operator used for edge detection. It highlights regions of rapid intensity change.

Common mask:

0 -1 0
-1 4 -1
0 -1 0

It enhances fine details and edges.
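A minimal sharpening sketch with the mask shown above, assuming NumPy/SciPy are available; since this mask's centre coefficient is +4, the filter response is added back to the image.

import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[ 0, -1,  0],
                      [-1,  4, -1],
                      [ 0, -1,  0]], dtype=float)   # mask shown above

def laplacian_sharpen(img):
    # Sharpened image = original + Laplacian response (for this sign convention)
    img = np.asarray(img, dtype=float)
    return img + convolve(img, LAPLACIAN)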

 

(f) Dilation Process

Dilation is a morphological operation that expands the boundaries of objects in a binary image. It adds pixels to object boundaries using a structuring element. It is useful for filling gaps and connecting broken components.
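A hedged sketch using SciPy's binary morphology; the toy array and the 3×3 structuring element are illustrative choices.

import numpy as np
from scipy.ndimage import binary_dilation

A = np.zeros((7, 7), dtype=bool)
A[3, 2], A[3, 4] = True, True          # two object pixels separated by a gap
B = np.ones((3, 3), dtype=bool)        # 3x3 structuring element

dilated = binary_dilation(A, structure=B)
# Each pixel grows into a 3x3 block, so the one-pixel gap is bridged.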

 

(g) Boundary Extraction

Boundary extraction identifies the outer edges of objects in binary images. It is commonly computed as the set difference between a set A and its erosion by a structuring element B, β(A) = A − (A ⊖ B), and is used in shape analysis and object recognition.
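A minimal sketch of the formula above, assuming NumPy/SciPy; the function name boundary is mine.

import numpy as np
from scipy.ndimage import binary_erosion

def boundary(A, B=np.ones((3, 3), dtype=bool)):
    # beta(A) = A minus the erosion of A by B
    A = np.asarray(A, dtype=bool)
    return A & ~binary_erosion(A, structure=B)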

 

(h) Morphological Image Processing

Morphological image processing involves processing based on shapes. It uses set theory concepts and structuring elements to analyze geometrical structure in images.

 

(i) Geometric Transformation

Geometric transformation changes the spatial relationship among pixels. It includes translation, rotation, scaling, and shearing.
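For reference, the standard coordinate mappings (a brief reminder, not taken from the paper):

\[
\begin{aligned}
\text{translation: } & (x', y') = (x + t_x,\; y + t_y) \\
\text{scaling: }     & (x', y') = (s_x x,\; s_y y) \\
\text{rotation: }    & \begin{pmatrix} x' \\ y' \end{pmatrix}
  = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
    \begin{pmatrix} x \\ y \end{pmatrix} \\
\text{horizontal shear: } & (x', y') = (x + s_v\, y,\; y)
\end{aligned}
\]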

 

(j) First Order Derivative Filters

First-order derivative filters detect edges by computing gradient of image intensity. Examples include Roberts, Prewitt, and Sobel operators.

 

SECTION B – Detailed Analytical Explanation (10 × 3 = 30 Marks)

 

2(a) Shortest Path in Digital Image

This question tests understanding of connectivity: 4-connectivity, 8-connectivity, and m-connectivity.

 

4-path considers horizontal and vertical neighbors only.

8-path considers diagonal neighbors also.

m-path (mixed connectivity) avoids ambiguous connections.

 

To solve:

Identify allowed pixel values (V = {0,1} or V = {1,2}).
Trace shortest path using allowed pixels.
Count number of steps.
If no continuous path exists using the given V, state the reason clearly (a small search sketch follows this list).
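A search sketch, assuming the image is a Python list of lists; the function name shortest_path_length and the breadth-first strategy are my own choices, and m-connectivity (which needs an extra ambiguity check) is not handled here.

from collections import deque

def shortest_path_length(img, start, goal, V, connectivity=4):
    # Number of moves on the shortest 4- or 8-path between two pixels whose
    # values lie in V (start and goal are (row, col) tuples assumed to be in V).
    steps = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if connectivity == 8:
        steps += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    rows, cols = len(img), len(img[0])
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        (r, c), d = queue.popleft()
        if (r, c) == goal:
            return d
        for dr, dc in steps:
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and img[nr][nc] in V and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), d + 1))
    return None   # no path exists for the given V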

 

2(b) Linear and Non-linear Smoothing Filters

Linear smoothing filters include the mean filter and the Gaussian filter. They are implemented as linear convolutions.

Non-linear smoothing filters include the median filter and max/min filters. They do not use a linear combination of pixel values.

 

For a 3×3 box filter:

Take the average of the 9 pixels in the neighborhood and replace the center pixel with this average.

Apply this at each position of the given 5×5 matrix.

Show the calculations for at least 3–4 positions (a sketch of the computation follows).
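A sketch of the computation, assuming NumPy; the border handling chosen here (leave border pixels unchanged) should be stated explicitly in the answer.

import numpy as np

def box_filter_3x3(img):
    # Replace every interior pixel by the mean of its 3x3 neighborhood.
    img = np.asarray(img, dtype=float)
    out = img.copy()
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            out[r, c] = img[r - 1:r + 2, c - 1:c + 2].mean()
    return out

For a 5×5 input this yields the nine interior averages the question asks you to show.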

 

2(c) Edge Detection and Edge Linking

Edge detection identifies intensity discontinuities. It uses gradient operators.

Edge linking connects detected edge pixels to form continuous boundaries.

 

Difference:

Edge detection finds candidate edge points.
Edge linking forms meaningful object boundaries.

 

2(d) Image Restoration in Presence of Noise Only

Image degradation model:

g(x,y) = f(x,y) + n(x,y)

 

If only noise is present, restoration can be performed using:

Mean filter (for Gaussian noise)

Median filter (for salt and pepper noise)

Adaptive filters

The goal is to reduce noise while preserving edges.
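A hedged sketch using SciPy's ready-made filters (the wrapper name restore_noise_only is mine): the median filter is the usual choice for salt-and-pepper noise, and a mean (uniform) filter for Gaussian noise.

import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def restore_noise_only(g, noise='salt_pepper', size=3):
    g = np.asarray(g, dtype=float)
    if noise == 'salt_pepper':
        return median_filter(g, size=size)   # order-statistics filtering
    return uniform_filter(g, size=size)      # arithmetic mean filtering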

 

2(e) Morphological Operations

Opening = Erosion followed by dilation. Removes small objects.

Closing = Dilation followed by erosion. Fills small holes.

Region filling = Filling interior pixels of a boundary using connectivity rules.
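A minimal sketch with SciPy's binary morphology, assuming a boolean mask A and a 3×3 structuring element; the function name clean_mask is illustrative.

import numpy as np
from scipy.ndimage import binary_opening, binary_closing, binary_fill_holes

B = np.ones((3, 3), dtype=bool)

def clean_mask(A):
    A = np.asarray(A, dtype=bool)
    opened = binary_opening(A, structure=B)       # erosion then dilation: drops small objects
    closed = binary_closing(opened, structure=B)  # dilation then erosion: fills small holes
    return binary_fill_holes(closed)              # region filling of enclosed interiors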

 

SECTION C – Long Answer Type (10 Marks Each)

 

3(a) Low, Mid and High Level Processing

Low level processing includes filtering and enhancement. Input and output are images.

Mid level processing includes segmentation and feature extraction.

High level processing includes object recognition and interpretation.

Sampling converts continuous image into discrete pixels.

Quantization converts continuous amplitude into discrete levels.
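A short sketch of uniform quantization (and, in the final comment, sampling) with NumPy; the helper name quantize is mine.

import numpy as np

def quantize(img, levels):
    # Map an 8-bit image onto the given number of equally spaced grey levels.
    img = np.asarray(img, dtype=float)
    step = 256.0 / levels
    return (np.floor(img / step) * step + step / 2).astype(np.uint8)

# Sampling can be illustrated by keeping every k-th pixel: sampled = img[::k, ::k]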

 

3(b) Frequency Domain Filtering

In frequency domain, filtering is done by:

Taking the Fourier transform of the image.

Multiplying it by the filter transfer function H(u,v).

Taking the inverse Fourier transform.

Low-pass filters remove high frequency noise.

High-pass filters enhance edges.
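A sketch of the three steps with an ideal low-pass transfer function H(u,v), assuming NumPy's FFT routines; the cutoff radius and the function name ideal_lowpass are illustrative.

import numpy as np

def ideal_lowpass(img, cutoff):
    img = np.asarray(img, dtype=float)
    F = np.fft.fftshift(np.fft.fft2(img))            # 1. Fourier transform (centred)
    rows, cols = img.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)   # distance from the centre
    H = (D <= cutoff).astype(float)                  # 2. ideal low-pass H(u,v)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))   # 3. inverse transform

Using 1 − H instead of H would give the corresponding high-pass filter.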

 

4(a) Piecewise Linear Transformations

These are intensity transformations in which the mapping function consists of multiple linear segments.

Examples:

Contrast stretching improves dynamic range.

Thresholding converts grayscale image to binary.
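A minimal sketch of both examples, assuming NumPy; mapping the full input range [min, max] onto [0, 255] is the simplest piecewise-linear case, and the function names are mine.

import numpy as np

def contrast_stretch(img, low=0, high=255):
    # Linearly map the image's [min, max] range onto [low, high].
    img = np.asarray(img, dtype=float)
    lo, hi = img.min(), img.max()
    if hi == lo:
        return np.full_like(img, low)   # flat image: nothing to stretch
    return (img - lo) * (high - low) / (hi - lo) + low

def threshold(img, t):
    # Thresholding: a limiting case of a piecewise-linear transformation.
    return np.where(np.asarray(img) >= t, 255, 0).astype(np.uint8)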

 

4(b) Notes

Bit plane slicing extracts the individual bit planes of an image.

Homomorphic filtering separates the illumination and reflectance components of an image.

A histogram shows the distribution of intensity values.
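Two of these notes lend themselves to one-line sketches, assuming NumPy and an 8-bit image; the helper names are mine.

import numpy as np

def bit_plane(img, k):
    # Bit plane k (0 = least significant) of an 8-bit image.
    return (np.asarray(img, dtype=np.uint8) >> k) & 1

def histogram(img):
    # Count of pixels at each of the 256 grey levels.
    return np.bincount(np.asarray(img, dtype=np.uint8).ravel(), minlength=256)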

 

5(a) Order Statistics Filters

Median filter replaces pixel with median value.

Max filter selects maximum value.

Min filter selects minimum value.

These are non-linear filters.
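A brief sketch using SciPy's ready-made order-statistics filters on a toy image (the variable names are illustrative only).

import numpy as np
from scipy.ndimage import median_filter, maximum_filter, minimum_filter

g = np.random.randint(0, 256, (5, 5)).astype(np.uint8)   # toy noisy image

med = median_filter(g, size=3)   # suppresses salt-and-pepper noise
mx = maximum_filter(g, size=3)   # removes pepper (dark) noise, brightens
mn = minimum_filter(g, size=3)   # removes salt (bright) noise, darkens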

 

5(b) Minimum Mean Square Error Restoration

The MMSE filter minimizes the average squared error between the restored image and the original image.

It is also known as the Wiener filter.
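For reference, the standard frequency-domain form of the Wiener filter, where H is the degradation function, G the degraded image spectrum, and S_η, S_f the noise and undegraded-image power spectra:

\[
\hat{F}(u,v) = \left[ \frac{1}{H(u,v)} \cdot
  \frac{|H(u,v)|^{2}}{|H(u,v)|^{2} + S_{\eta}(u,v)/S_{f}(u,v)} \right] G(u,v)
\]

In practice the unknown ratio S_η/S_f is often replaced by a constant K.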

 

6(a) Thinning and Thickening

Thinning reduces object thickness to single-pixel width.

Thickening increases object thickness.

Both are morphological operations.

 

6(b) Duality Proof

Morphological duality:

(A ⊕ B)ᶜ = Aᶜ ⊖ B̂
(A ⊖ B)ᶜ = Aᶜ ⊕ B̂

Where:

⊕ = Dilation
⊖ = Erosion
B̂ = Reflection of the structuring element B

The proof follows from De Morgan's laws and the set definitions of dilation and erosion.
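A minimal proof sketch for the first identity (the second follows by symmetry), assuming the standard definition of dilation as a union of translates, where A_b denotes A translated by b:

\[
(A \oplus B)^{c}
  = \Big(\bigcup_{b \in B} A_{b}\Big)^{c}
  = \bigcap_{b \in B} (A_{b})^{c}
  = \bigcap_{b \in B} (A^{c})_{b}
  = A^{c} \ominus \hat{B},
\]
\[
\text{using } A \ominus B = \bigcap_{b \in B} A_{-b}
\quad\text{and}\quad \hat{B} = \{-b : b \in B\}.
\]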

 

7(a) Advanced Topics

Stereo imaging uses two images to estimate depth.

Multi-level thresholding separates an image into multiple regions.

Image registration aligns two images geometrically.

 

7(b) Compass Gradient Operators

Eight 3×3 masks detect edges in 8 directions (E, NE, N, NW, W, SW, S, SE).

Each mask contains -1, 0, 1 values arranged to detect direction-specific gradients.
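A hedged sketch that builds the eight masks by rotating the outer ring of a Prewitt-style east mask in 45° steps and keeps the maximum response; the helper names are mine, and other compass operators (e.g. Kirsch) use different coefficients.

import numpy as np
from scipy.ndimage import convolve

RING = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]  # outer ring, clockwise

def compass_masks():
    base = np.array([[-1, 0, 1],
                     [-1, 0, 1],
                     [-1, 0, 1]], dtype=float)   # east mask (-1, 0, 1 values only)
    vals = np.array([base[r, c] for r, c in RING])
    masks = []
    for k in range(8):                           # one 45-degree rotation per step
        m = np.zeros((3, 3))
        for (r, c), v in zip(RING, np.roll(vals, k)):
            m[r, c] = v
        masks.append(m)
    return masks

def compass_edges(img):
    img = np.asarray(img, dtype=float)
    responses = [convolve(img, m) for m in compass_masks()]
    return np.max(np.stack(responses), axis=0)   # strongest directional response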
