(SEM VII) THEORY EXAMINATION 2024-25
NATURAL LANGUAGE PROCESSING (KCS072)
B.Tech – Semester VII
Time: 3 Hours | Max Marks: 100
SECTION – A (10 × 2 = 20 Marks)
(Attempt all questions in brief)
(a) How does context influence error detection?
Context helps error detection by considering surrounding words and sentence meaning. A word that is correct in spelling may still be incorrect in context, for example “their” instead of “there,” which can only be detected using contextual information.
(b) Explain the backward algorithm used in HMM for PoS tagging.
The backward algorithm computes the probability of observing the remaining sequence of words from a given state. It starts from the end of the sentence and moves backward, helping calculate overall tag probabilities efficiently.
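This recursion can be sketched on a toy two-state tagger; the states and all probabilities in A (transitions) and B (emissions) below are made-up illustrative values:

```python
def backward(obs, states, A, B):
    T = len(obs)
    # beta[t][s] = P(obs[t+1:] | state s at time t)
    beta = [{s: 0.0 for s in states} for _ in range(T)]
    for s in states:                     # initialisation: beta_T(s) = 1
        beta[T - 1][s] = 1.0
    for t in range(T - 2, -1, -1):       # recurse from the end backwards
        for s in states:
            beta[t][s] = sum(A[s][s2] * B[s2][obs[t + 1]] * beta[t + 1][s2]
                             for s2 in states)
    return beta

states = ["N", "V"]
A = {"N": {"N": 0.6, "V": 0.4}, "V": {"N": 0.7, "V": 0.3}}
B = {"N": {"fish": 0.5, "sleep": 0.5}, "V": {"fish": 0.4, "sleep": 0.6}}
beta = backward(["fish", "sleep"], states, A, B)
```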
(c) Example of unification of feature structures for number and gender agreement.
For the sentence “She runs”, unification ensures:
Subject: number = singular, gender = feminine
Verb: number = singular
Both agree, so unification succeeds.
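The agreement check can be sketched as unification over flat feature dictionaries (a simplification; real feature structures can be nested):

```python
def unify(fs1, fs2):
    """Return the merged feature structure, or None on a clash."""
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result and result[feat] != val:
            return None                  # conflicting values: unification fails
        result[feat] = val
    return result

subject = {"number": "singular", "gender": "feminine"}
verb = {"number": "singular"}
merged = unify(subject, verb)                                # succeeds
clash = unify({"number": "singular"}, {"number": "plural"})  # fails
```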
(d) Limitations of CFGs in modeling natural language syntax.
CFGs cannot handle long-distance dependencies, agreement constraints, and semantic relationships effectively. They also struggle with ambiguity and context-sensitive constructions.
(e) Dictionary-based vs distributional methods for word similarity.
Dictionary-based methods use predefined meanings and synonyms, while distributional methods learn similarity from word usage patterns in large corpora. Distributional methods adapt better to real-world language use.
(f) Define selectional restrictions.
Selectional restrictions are semantic constraints on which words can logically combine, such as the verb “eat” requiring an animate subject and an edible object.
(g) How are speech sounds classified?
Speech sounds are classified as vowels and consonants based on vocal tract configuration, voicing, place of articulation, and manner of articulation.
(h) Effect of vocal tract shape and size on speech spectrum.
The shape and size of the vocal tract determine formant frequencies, which define vowel quality and affect the spectral envelope of speech sounds.
(i) Demonstrate the Viterbi algorithm.
The Viterbi algorithm finds the most probable sequence of hidden states by dynamic programming, storing maximum probabilities and backtracking to find the best path.
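A minimal demonstration on a toy two-tag HMM (all probabilities below are illustrative, not from any real corpus):

```python
def viterbi(obs, states, start_p, A, B):
    # V[t][s]: probability of the best path ending in state s at time t
    V = [{s: start_p[s] * B[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max((V[t - 1][p] * A[p][s] * B[s][obs[t]], p)
                             for p in states)
            V[t][s] = prob               # store the maximum probability
            back[t][s] = prev            # and the state it came from
    state = max(V[-1], key=V[-1].get)    # backtrack from the best final state
    path = [state]
    for t in range(len(obs) - 1, 0, -1):
        state = back[t][state]
        path.insert(0, state)
    return path

tags = ["N", "V"]
start_p = {"N": 0.8, "V": 0.2}
trans = {"N": {"N": 0.6, "V": 0.4}, "V": {"N": 0.7, "V": 0.3}}
emit = {"N": {"fish": 0.6, "sleep": 0.4}, "V": {"fish": 0.3, "sleep": 0.7}}
best_path = viterbi(["fish", "sleep"], tags, start_p, trans, emit)
```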
(j) Compare LPC and PLP coefficients.
LPC models speech using linear prediction, while PLP incorporates perceptual aspects of human hearing, making PLP more robust to noise.
SECTION – B (Attempt any THREE) (3 × 10 = 30 Marks)
2(a) Regular expression (ab)*c: Finite State Automaton
The regular expression allows zero or more repetitions of “ab” followed by “c”.
The FSA starts in its initial state, cycles through an intermediate state on “a” followed by “b”, and moves to the accepting state on “c”. It accepts strings such as “c”, “abc”, and “abababc”.
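The automaton can be sketched as a transition table; the state names q0 (start), q1 (after “a”), and q2 (accepting) are illustrative:

```python
# DFA for (ab)*c
DELTA = {
    ("q0", "a"): "q1",   # read 'a'
    ("q1", "b"): "q0",   # read 'b', back to the start of the (ab) loop
    ("q0", "c"): "q2",   # read 'c', accept
}

def accepts(s):
    state = "q0"
    for ch in s:
        state = DELTA.get((state, ch))   # undefined edge -> reject
        if state is None:
            return False
    return state == "q2"
```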
2(b) Ambiguity in sentence using dependency grammar
Sentence: “The dog saw the man with the telescope.”
Ambiguity arises because “with the telescope” can modify either “saw” or “the man”.
Resolution can be achieved using semantic roles or probabilistic parsing.
2(c) Syntax-driven semantic analysis
Syntax-driven semantic analysis attaches meaning during parsing.
For “John gave Mary a book”:
Giver: John
Receiver: Mary
Object: book
This semantic structure is built alongside syntactic parsing.
2(d) Filter-bank vs LPC methods
Filter-bank methods analyze speech energy across frequency bands. LPC models the vocal tract using linear prediction. Filter-bank methods are simpler, while LPC provides compact representation.
2(e) Likelihood distortions in speech recognition
Likelihood distortions occur due to noise, channel mismatch, or speaker variation.
These distortions reduce recognition accuracy and affect perceived speech quality.
SECTION – C (Attempt any ONE) (1 × 10 = 10 Marks)
3(a) Minimum Edit Distance between “intention” and “execution”
Operations used: insertion, deletion, substitution (unit cost each).
One minimal alignment: delete “i”, substitute n→e, substitute t→x, insert “u”, substitute n→c.
The minimum edit distance = 5.
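The computation can be sketched with the standard dynamic-programming table, assuming unit costs for all three operations:

```python
def min_edit_distance(src, tgt):
    m, n = len(src), len(tgt)
    # D[i][j]: edit distance between src[:i] and tgt[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        D[i][0] = i                          # delete all of src[:i]
    for j in range(n + 1):
        D[0][j] = j                          # insert all of tgt[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if src[i - 1] == tgt[j - 1] else 1
            D[i][j] = min(D[i - 1][j] + 1,        # deletion
                          D[i][j - 1] + 1,        # insertion
                          D[i - 1][j - 1] + sub)  # substitution / match
    return D[m][n]

distance = min_edit_distance("intention", "execution")
```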
3(b) Interpolation vs Backoff smoothing
Interpolation always combines probabilities from all n-gram orders using weights that sum to 1.
Backoff uses the highest-order estimate available and falls back to lower-order models only when higher-order counts are zero.
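Linear interpolation can be sketched over toy unigram and bigram estimates; the probabilities and lambda weights below are made up for illustration:

```python
# Toy maximum-likelihood estimates (illustrative values only)
unigram = {"the": 0.07, "cat": 0.01}
bigram = {("the", "cat"): 0.12}
lam1, lam2 = 0.3, 0.7      # interpolation weights, summing to 1

def interp_prob(word, prev):
    """P(word | prev) as a weighted mix of bigram and unigram estimates."""
    return lam2 * bigram.get((prev, word), 0.0) + lam1 * unigram.get(word, 0.0)
```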
SECTION – D (Attempt any ONE) (1 × 10 = 10 Marks)
4(a) Treebanks in NLP
Treebanks are annotated corpora with syntactic trees. They help train and evaluate parsers by providing supervised learning data.
Example: Penn Treebank.
4(b) CYK parsing algorithm
CYK is a bottom-up parsing algorithm for CFGs in Chomsky Normal Form.
It fills a dynamic programming table over every span of the sentence to determine whether the sentence can be derived from the grammar’s start symbol, in O(n³) time for a fixed grammar.
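A minimal CYK membership test over a toy CNF grammar (the rules below are illustrative):

```python
from itertools import product

GRAMMAR = {              # CNF rules: RHS tuple -> set of LHS symbols
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
    ("the",): {"Det"},
    ("dog",): {"N"},
    ("cat",): {"N"},
    ("saw",): {"V"},
}

def cyk(words):
    n = len(words)
    # table[length-1][start]: nonterminals deriving words[start:start+length]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):               # length-1 spans from lexical rules
        table[0][i] = set(GRAMMAR.get((w,), set()))
    for length in range(2, n + 1):              # longer spans, bottom-up
        for start in range(n - length + 1):
            for split in range(1, length):      # try every binary split
                left = table[split - 1][start]
                right = table[length - split - 1][start + split]
                for b, c in product(left, right):
                    table[length - 1][start] |= GRAMMAR.get((b, c), set())
    return "S" in table[n - 1][0]               # start symbol spans the sentence?
```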
SECTION – E (Attempt any ONE) (1 × 10 = 10 Marks)
5(a) Supervised WSD using example “bank”
Supervised WSD trains a classifier on examples in which each occurrence of “bank” is labeled with its sense (e.g., financial institution vs. river bank). The classifier learns contextual features, such as neighboring words, to predict the correct sense of unseen occurrences.
5(b) Bootstrapping method for WSD
Bootstrapping starts with seed words and iteratively expands sense-labeled data by learning from high-confidence predictions.
SECTION – F (Attempt any ONE) (1 × 10 = 10 Marks)
6(a) Log-spectral distance measure
It measures the difference between two power spectra on a logarithmic (dB) scale.
Because loudness perception is roughly logarithmic, this distance reflects perceptual differences in speech signals better than a linear-scale comparison.
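A minimal sketch, assuming two power spectra sampled at matching frequency bins:

```python
import math

def log_spectral_distance(p1, p2):
    """RMS difference of two power spectra on a dB scale."""
    diffs = [(10.0 * math.log10(a / b)) ** 2 for a, b in zip(p1, p2)]
    return math.sqrt(sum(diffs) / len(diffs))
```

Identical spectra give a distance of 0; a uniform factor-of-10 power difference gives 10 dB.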
6(b) LPC coefficient derivation
LPC coefficients are derived by minimizing the prediction error, using the autocorrelation method together with the Levinson–Durbin recursion.
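The recursion can be sketched directly from autocorrelation values; the toy values below correspond to an AR(1) process with coefficient 0.5 (r[k] = 0.5^k), not to real speech:

```python
def levinson_durbin(r, order):
    """Return the LPC polynomial [1, a1, ..., a_order] and the final
    prediction-error energy, given autocorrelation values r[0..order]."""
    a = [1.0] + [0.0] * order
    err = r[0]                               # initial prediction-error energy
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / err                       # reflection coefficient
        a_new = a[:]
        for j in range(1, i):                # update previous coefficients
            a_new[j] = a[j] + k * a[i - j]
        a_new[i] = k
        a = a_new
        err *= (1.0 - k * k)                 # shrink the error energy
    return a, err

coeffs, err = levinson_durbin([1.0, 0.5, 0.25], 2)
```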
SECTION – G (Attempt any ONE) (1 × 10 = 10 Marks)
7(a) Spectral distortion measures
Measures like cepstral distance quantify differences between speech spectra and are used to evaluate speech coding quality.
7(b) Role of HMMs in speech recognition
HMMs model the temporal structure of speech, with hidden states for sub-word units and observations for acoustic feature vectors.
The forward algorithm computes the probability of the observations up to each time step, while the backward algorithm computes the probability of the remaining observations from each state.