MACHINE LEARNING TECHNIQUES (KCS-055)
B.Tech (Sem V) Theory Examination 2021-22 – Exam Notes & Solved Guide
SECTION A – Short Answer Type (2 Marks Each)
a. Well-posed learning problem
A learning problem is well-posed if it is defined by a task (T), a performance measure (P), and training experience (E).
Example: Learning to play checkers (T), win rate (P), past games (E).
b. Occam’s Razor in ML
Prefer the simplest hypothesis that explains the data, to avoid overfitting.
c. Inductive Bias in ANN
Assumptions that guide learning (e.g., smoothness, linearity) to generalize beyond seen data.
d. Gradient descent delta rule
Updates weights to minimize error:
Δw = −η(∂E/∂w)
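A minimal sketch of the delta rule for a single linear unit; the feature vector, target, and learning rate below are illustrative values chosen for the example:

```python
def delta_rule_step(w, x, t, eta=0.1):
    """One gradient-descent step for a linear unit.

    With E = 0.5 * (t - o)^2 and o = sum_i w_i x_i,
    dE/dw_i = -(t - o) * x_i, so delta_w_i = eta * (t - o) * x_i.
    """
    o = sum(wi * xi for wi, xi in zip(w, x))          # linear output
    return [wi + eta * (t - o) * xi for wi, xi in zip(w, x)]

w, x, t = [0.0, 0.0], [1.0, 2.0], 3.0
for _ in range(100):
    w = delta_rule_step(w, x, t)
out = sum(wi * xi for wi, xi in zip(w, x))            # approaches the target 3.0
```

Repeated updates shrink the error by a constant factor here, since the step size times the squared input norm is below the stability limit.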
e. Paired t-test
Statistical test to compare two learning algorithms on the same datasets.
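The paired t statistic can be computed directly from per-dataset differences; the accuracy numbers below are hypothetical, not from any real experiment:

```python
import math

def paired_t(scores_a, scores_b):
    """Paired t statistic for per-dataset scores of two algorithms:
    t = mean(d) / sqrt(var(d) / n), where d_i = a_i - b_i."""
    d = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

# hypothetical accuracies of two algorithms on the same five datasets
t = paired_t([0.90, 0.85, 0.88, 0.92, 0.87],
             [0.86, 0.83, 0.85, 0.90, 0.84])
```

Pairing by dataset removes between-dataset variance, which is why the test is more sensitive than an unpaired comparison.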
f. Confidence interval
Range around an estimate within which the true value lies with given confidence (e.g., 95%).
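A sketch of the usual normal-approximation interval for a classifier's true error; the error count and sample size are made-up numbers for illustration:

```python
import math

def error_ci(errors, n, z=1.96):
    """Approximate 95% CI for the true error from the sample error:
    e +/- z * sqrt(e * (1 - e) / n), valid for reasonably large n."""
    e = errors / n
    half = z * math.sqrt(e * (1 - e) / n)
    return e - half, e + half

lo, hi = error_ci(errors=30, n=200)   # sample error 0.15
```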
g. Sample complexity
Number of training examples required to learn a hypothesis with desired accuracy and confidence.
h. Lazy vs Eager learning
Lazy (e.g., k-NN): defer computation until query.
Eager (e.g., decision tree): build model before queries.
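A tiny 1-NN classifier illustrating lazy learning: there is no training phase, and all distance computation is deferred to query time (the points and labels are invented for the example):

```python
import math

def one_nn(train, query):
    """Lazy learning: classify by the nearest stored example at query time."""
    return min(train, key=lambda pair: math.dist(pair[0], query))[1]

train = [((0.0, 0.0), "neg"), ((1.0, 1.0), "pos"), ((0.9, 0.8), "pos")]
label = one_nn(train, (0.95, 0.9))
```

An eager learner would instead compile `train` into a model (e.g., a tree) before any query arrives.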
i. Crowding problem in GA
Loss of diversity due to similar individuals dominating the population.
j. Analytical vs Inductive learning
Analytical uses prior domain knowledge; inductive relies mainly on data.
SECTION B – Descriptive Answers (10 Marks Each)
2(a) Checkers Learning Program – Final Design
Represent the board state as numeric features, define the evaluation function as a linear combination of those features, and update the weights from game outcomes using temporal-difference-style training. Performance is measured by the win rate over games played.
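A simplified sketch of the weight update for the linear evaluation function; the three board features and the training target are assumptions for illustration:

```python
def td_update(w, x, target, eta=0.01):
    """Move the linear evaluation V(s) = w . x toward the training value
    (the estimate for the successor state, or the final game outcome)."""
    v = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + eta * (target - v) * xi for wi, xi in zip(w, x)]

w = [0.0, 0.0, 0.0]
x = [2.0, 1.0, 4.0]     # e.g. piece advantage, king advantage, mobility
for _ in range(200):
    w = td_update(w, x, target=1.0)   # train toward a won game (+1)
v = sum(wi * xi for wi, xi in zip(w, x))
```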
2(b) Maximum Likelihood & Least Squares Hypotheses
Maximum Likelihood chooses parameters maximizing P(data|model).
Least Squares minimizes the sum of squared errors between predictions and targets; it is equivalent to the maximum-likelihood hypothesis when the targets are corrupted by zero-mean Gaussian noise.
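This equivalence can be checked numerically: fitting synthetic data with Gaussian noise by least squares recovers the generating weights (the weights, noise level, and sample size below are arbitrary choices for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(0, 1, 50)])  # bias + one feature
true_w = np.array([1.0, 2.0])
y = X @ true_w + rng.normal(0, 0.05, 50)    # targets with Gaussian noise

# The least-squares solution is the ML estimate under Gaussian noise:
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```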
2(c) EM Algorithm – What Problem It Solves
EM estimates parameters when data has hidden variables. It alternates between E-step (expectation of hidden vars) and M-step (maximize likelihood).
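A stripped-down EM sketch for a two-component 1D Gaussian mixture, assuming unit variances and equal mixing weights so only the means are estimated (all data and initial values are synthetic):

```python
import numpy as np

def em_gmm_1d(x, mu, n_iter=50):
    """EM for a two-component 1D Gaussian mixture (fixed unit variances,
    equal mixing weights); the component membership is the hidden variable."""
    mu = np.array(mu, dtype=float)
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each point
        d0 = np.exp(-0.5 * (x - mu[0]) ** 2)
        d1 = np.exp(-0.5 * (x - mu[1]) ** 2)
        r1 = d1 / (d0 + d1)
        # M-step: responsibility-weighted means maximize expected log-likelihood
        mu[0] = np.sum((1 - r1) * x) / np.sum(1 - r1)
        mu[1] = np.sum(r1 * x) / np.sum(r1)
    return mu

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(5, 1, 200)])
mu = em_gmm_1d(x, mu=[0.5, 4.0])   # should approach the true means 0 and 5
```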
2(d) Importance of Case-Based Learning
Solves new problems by retrieving and adapting similar past cases; flexible, interpretable, and effective when training data is sparse.
2(e) Learning First-Order Rules
Learns rules with predicates and variables (FO logic), enabling richer representations than propositional rules.
SECTION C – Long Answer Type
3(a) Concept Learning (with Example)
Task of inferring a boolean concept from labeled examples.
Example: “PlayTennis” from attributes like Sky, Humidity, Wind using hypothesis space and generalization.
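A companion sketch in the Find-S style, which returns the maximally specific conjunctive hypothesis consistent with the positive examples (the three attributes and rows below are invented, PlayTennis-like data):

```python
def find_s(examples):
    """Find-S: maximally specific conjunctive hypothesis from positives.
    '?' matches anything; attributes that differ across positives become '?'."""
    h = None
    for x, label in examples:
        if label != "yes":
            continue                        # Find-S ignores negatives
        if h is None:
            h = list(x)                     # first positive: copy exactly
        else:
            h = [a if a == b else "?" for a, b in zip(h, x)]
    return h

# hypothetical (Sky, Humidity, Wind) examples
data = [(("Sunny", "Normal", "Strong"), "yes"),
        (("Sunny", "High",   "Strong"), "yes"),
        (("Rainy", "High",   "Strong"), "no")]
h = find_s(data)   # ['Sunny', '?', 'Strong']
```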
3(b) Candidate Elimination (Given Table)
Initialize S (most specific) and G (most general).
Process each example to minimally generalize S for positives and specialize G for negatives until convergence to version space.
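The steps above can be sketched for conjunctive hypotheses; this is a simplified version (attribute domains are assumed EnjoySport-style, and non-maximal members of G are not pruned):

```python
def matches(h, x):
    return all(hv == "?" or hv == xv for hv, xv in zip(h, x))

def candidate_elimination(examples, domains):
    n = len(domains)
    S = ["0"] * n                      # most specific: matches nothing
    G = [["?"] * n]                    # most general: matches everything
    for x, label in examples:
        if label == "yes":
            G = [g for g in G if matches(g, x)]     # drop inconsistent g
            S = list(x) if S == ["0"] * n else \
                [s if s == xv else "?" for s, xv in zip(S, x)]
        else:
            G2 = []
            for g in G:
                if not matches(g, x):
                    G2.append(g)
                    continue
                # minimally specialize g so it rejects x but still covers S
                for i in range(n):
                    if g[i] == "?":
                        for v in domains[i]:
                            if v != x[i] and (S[i] == "?" or S[i] == v):
                                g2 = list(g); g2[i] = v
                                G2.append(g2)
            G = G2
    return S, G

domains = [{"Sunny", "Rainy"}, {"Normal", "High"}, {"Strong", "Weak"}]
examples = [(("Sunny", "Normal", "Strong"), "yes"),
            (("Rainy", "High",   "Strong"), "no"),
            (("Sunny", "High",   "Strong"), "yes")]
S, G = candidate_elimination(examples, domains)
```

Here the version space converges to S = [Sunny, ?, Strong] and G = [[Sunny, ?, ?]]; every hypothesis between them is consistent with the data.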
4(a) ANN Convergence & Generalization
Convergence depends on learning rate and error surface; generalization improves with regularization, sufficient data, and early stopping.
4(b) Decision Tree Learning Issues
Overfitting controlled by pruning; bad attribute choices handled by gain ratios; continuous attributes via thresholds; missing values via probabilities; differing costs via cost-sensitive learning.
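Attribute selection underlying these issues rests on information gain; a minimal computation (the toy rows and labels are invented for the example):

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, attr_index, labels):
    """Information gain of splitting `rows` on the attribute at attr_index:
    gain = entropy(labels) - weighted entropy of the partitions."""
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(y)
    remainder = sum(len(sub) / len(labels) * entropy(sub)
                    for sub in by_value.values())
    return entropy(labels) - remainder

rows = [("Sunny",), ("Sunny",), ("Rain",), ("Rain",)]
labels = ["no", "no", "yes", "yes"]
g = info_gain(rows, 0, labels)   # perfect split: gain = 1.0 bit
```

Gain ratio divides this by the split information of the attribute, penalizing many-valued attributes.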
5(a) Naïve Bayes vs Bayesian Classifier
Naïve Bayes assumes conditional independence of attributes given the class; a full Bayesian (belief-network) classifier models dependencies among attributes, making it more expressive but computationally heavier.
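The independence assumption makes prediction a product of per-attribute likelihoods; a bare-bones sketch with invented weather data (no smoothing, for clarity):

```python
from collections import Counter

def nb_predict(examples, query):
    """Naive Bayes: argmax_c P(c) * prod_i P(x_i | c),
    using the conditional-independence assumption."""
    classes = Counter(y for _, y in examples)
    n = len(examples)
    best, best_p = None, -1.0
    for c, count in classes.items():
        p = count / n                                   # prior P(c)
        rows = [x for x, y in examples if y == c]
        for i, v in enumerate(query):                   # likelihoods P(x_i | c)
            p *= sum(1 for x in rows if x[i] == v) / len(rows)
        if p > best_p:
            best, best_p = c, p
    return best

data = [(("Sunny", "High"), "no"), (("Sunny", "Normal"), "yes"),
        (("Rain", "High"), "yes"), (("Rain", "Normal"), "yes")]
pred = nb_predict(data, ("Sunny", "High"))
```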
5(b) CLT for Confidence Intervals
By the CLT, the sample mean is approximately normally distributed for large samples, enabling confidence-interval computation even for non-normal populations.
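A sketch of this in practice: a CLT-based interval for the mean of a skewed (exponential) population, where the sample size and seed are arbitrary demo choices:

```python
import math
import random

rng = random.Random(42)
sample = [rng.expovariate(1.0) for _ in range(400)]   # skewed, non-normal

n = len(sample)
mean = sum(sample) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
half = 1.96 * sd / math.sqrt(n)   # CLT: mean ~ Normal(mu, sigma^2 / n)
ci = (mean - half, mean + half)   # should bracket the true mean, 1.0
```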
6(a) PAC Learning Model
Defines learnability with bounds on accuracy (ε) and confidence (δ); for a finite hypothesis space, sample complexity grows with the logarithm of the hypothesis-space size.
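The standard bound for a consistent learner over a finite hypothesis space can be evaluated directly; as an example we use |H| = 973, the number of semantically distinct hypotheses in Mitchell's EnjoySport space:

```python
import math

def pac_sample_bound(h_size, eps, delta):
    """m >= (1/eps) * (ln|H| + ln(1/delta)) for a consistent learner."""
    return math.ceil((math.log(h_size) + math.log(1 / delta)) / eps)

m = pac_sample_bound(h_size=973, eps=0.1, delta=0.05)
```

The bound grows only logarithmically in |H| and 1/δ, but linearly in 1/ε.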
6(b) Mistake Bound Model
Bounds number of mistakes an online learner makes before convergence (e.g., Perceptron).
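An online perceptron that counts its mistakes, run on a small linearly separable (AND-like) dataset invented for the demo; the mistake-bound theorem guarantees the count is finite for separable data:

```python
def perceptron_mistakes(data, epochs=10):
    """Online perceptron; returns learned weights and total mistakes made."""
    w = [0.0] * (len(data[0][0]) + 1)          # bias + feature weights
    mistakes = 0
    for _ in range(epochs):
        for x, y in data:                       # y in {-1, +1}
            s = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            if y * s <= 0:                      # mistake: update weights
                mistakes += 1
                w[0] += y
                w[1:] = [wi + y * xi for wi, xi in zip(w[1:], x)]
    return w, mistakes

data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, mistakes = perceptron_mistakes(data)
```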
7(a) Learn-One Rule Algorithm
Learns one rule at a time covering positives while excluding negatives; simple and interpretable.
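A greedy sketch in the LEARN-ONE-RULE spirit: repeatedly add the attribute test with the best precision on the currently covered examples until no negatives remain (the two-attribute dataset is invented):

```python
def learn_one_rule(examples, attrs):
    """Greedy rule growth: extend the conjunction with the attribute test
    of highest precision until the rule covers no negatives."""
    rule = {}                                  # conjunction: attr index -> value
    covered = examples
    while any(y == "no" for _, y in covered):
        best = None
        for i in attrs:
            if i in rule:
                continue
            for v in {x[i] for x, _ in covered}:
                sub = [(x, y) for x, y in covered if x[i] == v]
                pos = sum(1 for _, y in sub if y == "yes")
                if best is None or pos / len(sub) > best[0]:
                    best = (pos / len(sub), i, v, sub)
        if best is None:
            break                              # no further specialization
        _, i, v, covered = best
        rule[i] = v
    return rule

data = [(("Sunny", "Weak"), "yes"), (("Sunny", "Strong"), "yes"),
        (("Rain", "Weak"), "no"), (("Rain", "Strong"), "no")]
rule = learn_one_rule(data, attrs=[0, 1])   # {0: "Sunny"}
```

Sequential-covering algorithms call this repeatedly, removing covered positives each time.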
7(b) Prototypical Genetic Algorithm
Initialize population → selection → crossover → mutation → evaluation → replacement. Balances exploration and exploitation.
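The loop above, sketched on the toy OneMax problem (maximize the number of 1-bits); population size, mutation rate, and generation count are arbitrary illustrative choices:

```python
import random

def genetic_algorithm(n_bits=20, pop_size=30, generations=60):
    """Minimal GA for OneMax: selection, crossover, mutation, replacement."""
    rng = random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    fitness = lambda ind: sum(ind)             # number of 1-bits
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)   # tournament selection
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)              # single-point crossover
            child = p1[:cut] + p2[cut:]
            i = rng.randrange(n_bits)                   # point mutation
            child[i] ^= rng.random() < 0.1
            nxt.append(child)
        pop = nxt                                        # generational replacement
    return max(pop, key=fitness)

best = genetic_algorithm()   # should be close to the all-ones string
```

Selection exploits good individuals; crossover and mutation keep exploring, which is the balance the answer refers to.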