B.TECH (SEM VIII) THEORY EXAMINATION 2021-22: NATURAL LANGUAGE PROCESSING


SECTION A

(Attempt all questions in brief – 10 × 2 = 20 marks)

 

(a) Semantics in Natural Language Processing

Semantics in NLP deals with the meaning of words, phrases, and sentences. It focuses on understanding what a sentence actually means rather than just its structure.

 

(b) Natural Language

Natural language is the language used by humans for communication, such as English, Hindi, or French. It is informal, ambiguous, and context-dependent.

 

(c) Concept of Knowledge

In NLP, knowledge refers to stored information about words, grammar, meaning, and real-world facts that help a system understand and interpret language.

 

(d) Concept of Parsing

Parsing is the process of analyzing the grammatical structure of a sentence according to a given grammar to determine its syntactic structure.

 

(e) Applications of NLP

Applications of NLP include machine translation, chatbots, voice assistants, sentiment analysis, information retrieval, and text summarization.

 

(f) Probabilistic Language Processing (PLP)

PLP uses probability and statistics to handle uncertainty and ambiguity in language by selecting the most likely interpretation.
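
As an illustration (beyond what the 2-mark answer requires), here is a minimal Python sketch of the idea; the toy corpus and the two candidate readings are invented:

    # Toy bigram model: score two candidate interpretations and keep the
    # more probable one. Corpus and candidates are invented for illustration.
    from collections import Counter

    corpus = "the cat sat on the mat the cat ate the fish".split()
    unigrams = Counter(corpus)
    bigrams = Counter(zip(corpus, corpus[1:]))

    def bigram_prob(sentence):
        p = 1.0
        words = sentence.split()
        for w1, w2 in zip(words, words[1:]):
            # Add-one smoothed estimate of P(w2 | w1)
            p *= (bigrams[(w1, w2)] + 1) / (unigrams[w1] + len(unigrams))
        return p

    candidates = ["the cat sat on the mat", "the mat sat on the cat"]
    print(max(candidates, key=bigram_prob))   # the more likely reading wins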

 

(g) Grammars for Natural Language

Grammars define rules that describe how words combine to form valid sentences. Examples include context-free grammar and feature-based grammar.

 

(h) Top-Down and Bottom-Up Parsers

Top-down parsers start from the sentence symbol and work towards words.
Bottom-up parsers start from words and build up to the sentence structure.
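
As a small illustration (assuming the NLTK library is installed; the grammar and sentence below are toy examples, not from the question paper):

    import nltk

    grammar = nltk.CFG.fromstring("""
        S -> NP VP
        NP -> Det N
        VP -> V NP
        Det -> 'the'
        N -> 'dog' | 'cat'
        V -> 'chased'
    """)
    tokens = "the dog chased the cat".split()

    # Top-down: start from S and expand rules until the words are matched.
    for tree in nltk.RecursiveDescentParser(grammar).parse(tokens):
        print(tree)

    # Bottom-up: shift words onto a stack and reduce them up to S.
    for tree in nltk.ShiftReduceParser(grammar).parse(tokens):
        print(tree)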

 

(i) Need of Verb Phrases in NLP

Verb phrases are essential because they describe actions, events, and states, which are crucial for understanding sentence meaning.

 

(j) Ambiguity

Ambiguity occurs when a word, phrase, or sentence has more than one possible meaning or interpretation. It may be lexical (e.g. 'bank' as a river bank or a financial institution) or structural (e.g. 'I saw the man with the telescope').

 

SECTION B

(Attempt any THREE – 10 × 3 = 30 marks)

 

2(a) Statistical Methods for Ambiguity Resolution

Statistical methods resolve ambiguity using probabilities derived from large corpora. Techniques such as n-grams, Hidden Markov Models, and Bayesian models assign likelihoods to different interpretations. The interpretation with the highest probability is selected. These methods are effective for word sense disambiguation and syntactic ambiguity resolution.
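
As a hedged illustration of the HMM idea (the tags, probabilities, and helper below are invented, not estimated from any corpus), one disambiguation step picks the tag that maximises the product of transition and emission probabilities:

    # Is "book" a noun or a verb when it follows "to"?
    transition = {("TO", "VB"): 0.83, ("TO", "NN"): 0.00047}        # P(tag | previous tag)
    emission   = {("VB", "book"): 0.0093, ("NN", "book"): 0.00057}  # P(word | tag)

    def score(tag, prev_tag="TO", word="book"):
        return transition[(prev_tag, tag)] * emission[(tag, word)]

    best = max(["VB", "NN"], key=score)
    print(best, score(best))   # "book" after "to" comes out as a verb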

 

2(b) Machine Translation

Machine Translation converts text from one language to another automatically. It includes rule-based, statistical, and neural approaches. Modern systems use neural networks to capture context and meaning more accurately. Applications include Google Translate and multilingual communication systems.
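
A toy sketch of the oldest, rule-based (direct) approach, with an invented three-word dictionary; real systems also reorder words and use statistical or neural models to capture context:

    en_to_fr = {"the": "le", "cat": "chat", "sleeps": "dort"}

    def direct_translate(sentence):
        # Replace each word by its dictionary entry; unknown words pass through.
        return " ".join(en_to_fr.get(w, w) for w in sentence.lower().split())

    print(direct_translate("The cat sleeps"))   # le chat dort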

 

2(c) Feature Systems and Augmented Grammar

Feature systems add attributes like number, gender, and tense to grammar rules.
Augmented grammar extends basic grammar by including semantic and syntactic constraints, improving parsing accuracy and reducing ambiguity.
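
A minimal sketch of feature checking (the feature structures below are invented): an NP rule "Det N" succeeds only if the determiner and noun agree in number.

    det  = {"cat": "Det", "word": "these", "number": "plural"}
    noun = {"cat": "N",   "word": "dog",   "number": "singular"}

    def agree(a, b, feature="number"):
        # Values unify when either is unspecified or both are equal.
        return a.get(feature) is None or b.get(feature) is None or a[feature] == b[feature]

    if agree(det, noun):
        print("NP ->", det["word"], noun["word"])
    else:
        print("Rejected: number disagreement ('these dog')")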

 

2(d) Movement Phenomenon in Language

The movement phenomenon describes how constituents appear in surface positions different from their underlying (deep-structure) positions, as in wh-questions ('What did you see?') or passive sentences. Modelling movement helps NLP systems relate such transformed sentences to their deep structures.

 

2(e) Evaluating Language Understanding Systems

Language understanding systems are evaluated using accuracy, precision, recall, and F-score. Human evaluation and benchmark datasets are also used to assess system performance.
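
For example, precision, recall, and F1 can be computed from system output against gold labels (the labels below are invented toy data):

    gold      = ["POS", "POS", "NEG", "NEG", "POS", "NEG"]
    predicted = ["POS", "NEG", "NEG", "NEG", "POS", "POS"]

    tp = sum(g == p == "POS" for g, p in zip(gold, predicted))
    fp = sum(g == "NEG" and p == "POS" for g, p in zip(gold, predicted))
    fn = sum(g == "POS" and p == "NEG" for g, p in zip(gold, predicted))

    precision = tp / (tp + fp)
    recall    = tp / (tp + fn)
    f1        = 2 * precision * recall / (precision + recall)
    print(precision, recall, f1)   # 0.67, 0.67, 0.67 on this toy data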

 

SECTION C

 

3(a) Probabilistic Context-Free Grammars (PCFGs)

PCFGs extend context-free grammars by assigning probabilities to grammar rules. These probabilities help choose the most likely parse tree among many possibilities, making PCFGs effective for ambiguity resolution in parsing.
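
A small sketch assuming the NLTK library is available (the grammar and its rule probabilities are invented): the Viterbi parser returns the single most probable tree, resolving the PP-attachment ambiguity in favour of the likelier reading.

    import nltk

    pcfg = nltk.PCFG.fromstring("""
        S   -> NP VP                          [1.0]
        NP  -> 'I' [0.3] | Det N [0.5] | Det N PP [0.2]
        VP  -> V NP [0.6] | V NP PP [0.4]
        PP  -> P NP                           [1.0]
        Det -> 'the'                          [1.0]
        N   -> 'man' [0.5] | 'telescope' [0.5]
        V   -> 'saw'                          [1.0]
        P   -> 'with'                         [1.0]
    """)

    for tree in nltk.ViterbiParser(pcfg).parse("I saw the man with the telescope".split()):
        print(tree, tree.prob())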

 

3(b)

(i) Basic Feature System for English

The feature system includes grammatical features like tense, number, person, gender, and case. These features help ensure grammatical agreement in sentences.

 

(ii) Morphological Analysis

Morphological analysis studies the internal structure of words. It identifies roots, prefixes, and suffixes to determine word forms and meanings.
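
A tiny illustrative analyser with simplified, invented suffix rules (real systems use finite-state morphology and much larger rule sets):

    SUFFIX_RULES = [
        ("ies", "y", {"number": "plural"}),
        ("ing", "",  {"form": "present participle"}),
        ("ed",  "",  {"tense": "past"}),
        ("s",   "",  {"number": "plural"}),
    ]

    def analyse(word):
        for suffix, replacement, features in SUFFIX_RULES:
            if word.endswith(suffix) and len(word) > len(suffix) + 1:
                return word[: -len(suffix)] + replacement, features
        return word, {}

    for w in ["books", "walked", "studies", "playing"]:
        print(w, "->", analyse(w))   # root plus grammatical features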

 

4(a) Different Levels of Language Analysis

Language analysis includes phonological, morphological, syntactic, semantic, discourse, and pragmatic levels. Each level contributes to understanding language from sound to meaning and context.

 

4(b)

(i) Transition Network Grammars

Transition network grammars represent grammatical structure as networks of states connected by labelled arcs; a sentence is accepted if some path through the network consumes all of its words. Recursive transition networks allow arcs to call other networks, giving them the expressive power of context-free grammars and enabling efficient sentence parsing.
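
A self-contained sketch of a simple (non-recursive) transition network for the pattern Det N V; the states, arcs, and lexicon are invented:

    NETWORK = {
        "S0": [("Det", "S1")],
        "S1": [("N", "S2")],
        "S2": [("V", "S3")],        # S3 is the final (accepting) state
    }
    LEXICON = {"the": "Det", "dog": "N", "barks": "V"}

    def accepts(words, state="S0", final="S3"):
        if not words:
            return state == final
        for category, next_state in NETWORK.get(state, []):
            if LEXICON.get(words[0]) == category and accepts(words[1:], next_state, final):
                return True
        return False

    print(accepts("the dog barks".split()))   # True
    print(accepts("dog the barks".split()))   # False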

(ii) Top-Down Chart Parsing

Top-down chart parsing combines top-down prediction with a chart, a table that records every partial constituent (edge) found so far, so sub-results are reused rather than recomputed and redundant computation is avoided.
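
A brief sketch assuming the NLTK library is available (the grammar and sentence are toy examples): the chart stores every partial constituent once, so the two PP-attachment readings share their common sub-parses.

    import nltk
    from nltk.parse.chart import TopDownChartParser

    grammar = nltk.CFG.fromstring("""
        S -> NP VP
        NP -> Det N | Det N PP
        VP -> V NP | V NP PP
        PP -> P NP
        Det -> 'the'
        N -> 'boy' | 'man' | 'telescope'
        V -> 'saw'
        P -> 'with'
    """)

    for tree in TopDownChartParser(grammar).parse(
            "the boy saw the man with the telescope".split()):
        print(tree)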

 

5(a) Handling Questions in Context-Free Grammars

Questions are handled by adding grammar rules for interrogative structures, such as subject-auxiliary inversion and fronted wh-words. These special rules account for the 'moved' question word and for auxiliary verbs, so a question can be related to its corresponding declarative form.
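
For example, a toy grammar (invented here) can add productions for subject-auxiliary inversion and a fronted wh-word, with an object-less VP standing in for the gap left by the moved question word (assumes the NLTK library is installed):

    import nltk

    grammar = nltk.CFG.fromstring("""
        S   -> NP VP | Aux NP VP | Wh Aux NP VP
        NP  -> Det N
        VP  -> V NP | V
        Aux -> 'does'
        Wh  -> 'what'
        Det -> 'the'
        N   -> 'dog' | 'cat'
        V   -> 'chase'
    """)

    parser = nltk.ChartParser(grammar)
    for question in ["does the dog chase the cat", "what does the dog chase"]:
        for tree in parser.parse(question.split()):
            print(tree)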

 

5(b) Word Senses and Ambiguity

Words may have multiple meanings depending on context. Word sense disambiguation resolves this ambiguity using context, rules, or statistical methods.
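
As an illustration, NLTK ships a simplified Lesk implementation that chooses the WordNet sense whose dictionary gloss overlaps most with the context (assumes NLTK is installed and the WordNet data has been downloaded with nltk.download('wordnet')):

    from nltk.wsd import lesk

    context = "I went to the bank to deposit my money".split()
    sense = lesk(context, "bank")           # returns one WordNet synset of 'bank'
    print(sense, "-", sense.definition())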

 

6(a) Database Interface in Natural Language Understanding

A database interface allows users to query databases using natural language. The system converts queries into structured database commands like SQL.
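
A toy sketch of the idea (the table, the pattern, and the question are invented; real interfaces use full parsing, semantic grammars, or machine-learned models):

    import re
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE employees (name TEXT, dept TEXT)")
    db.executemany("INSERT INTO employees VALUES (?, ?)",
                   [("Asha", "Sales"), ("Ravi", "HR"), ("Meena", "Sales")])

    def nl_to_sql(question):
        # Map one natural-language pattern onto a parameterised SQL query.
        m = re.match(r"who works in (\w+)", question.lower())
        if m:
            return "SELECT name FROM employees WHERE dept = ?", (m.group(1).capitalize(),)
        raise ValueError("question not understood")

    sql, params = nl_to_sql("Who works in sales?")
    print(db.execute(sql, params).fetchall())   # [('Asha',), ('Meena',)]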

 

6(b) Organization of Natural Language Understanding Systems

These systems consist of modules for lexical analysis, parsing, semantic interpretation, discourse processing, and knowledge representation.

 

7(a) Augmented Transition Networks (ATNs)

ATNs extend transition networks by adding conditions, actions, and memory. They provide powerful mechanisms for parsing complex language structures.
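
A self-contained sketch of the ATN idea (the lexicon, arcs, and features are invented): arcs fill registers as a side effect, and a condition on the verb arc enforces subject-verb agreement before the parse is accepted.

    LEXICON = {
        "the":   ("Det", {}),
        "dog":   ("N", {"number": "singular"}),
        "dogs":  ("N", {"number": "plural"}),
        "barks": ("V", {"number": "singular"}),
        "bark":  ("V", {"number": "plural"}),
    }

    def atn_parse(words):
        registers = {}
        arcs = ["Det", "N", "V"]              # S0 -Det-> S1 -N-> S2 -V-> S3
        for word, expected in zip(words, arcs):
            category, features = LEXICON[word]
            if category != expected:
                return None                   # arc test fails
            if category == "N":
                registers["SUBJ"] = features  # action: fill the subject register
            if category == "V" and features["number"] != registers["SUBJ"]["number"]:
                return None                   # condition: agreement must hold
        return registers

    print(atn_parse("the dog barks".split()))   # accepted
    print(atn_parse("the dog bark".split()))    # rejected (agreement failure)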

 

7(b) Human Preferences in Parsing

Human preferences such as minimal attachment (prefer the parse tree with the fewest new nodes) and right association (attach a new constituent to the most recent phrase) influence how people resolve syntactic ambiguity. NLP systems model these preferences to rank candidate parses and improve accuracy.

 
