KOE073 – MACHINE LEARNING
B.Tech (SEM VII) – Theory Examination 2023-24
Time: 3 Hours | Max Marks: 100
SECTION A
(Attempt all questions in brief – 2 × 10 = 20 marks)
a. What is Machine Learning?
Machine Learning is a branch of Artificial Intelligence that enables computers to learn patterns from data and improve performance on tasks without being explicitly programmed.
b. Steps involved in designing a learning system in Machine Learning
The main steps are: problem definition, data collection, data preprocessing, feature selection, model selection, training, and testing and evaluation.
c. Explain Artificial Neural Network (ANN).
An Artificial Neural Network is a computational model inspired by the human brain. It consists of interconnected neurons organized in input, hidden, and output layers, used for pattern recognition and prediction.
d. What do you understand by Gradient Descent?
Gradient Descent is an optimization algorithm used to minimize the loss function by iteratively updating model parameters in the direction of the negative gradient.
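The update rule can be sketched on a toy one-parameter loss, f(w) = (w − 3)², whose gradient is 2(w − 3); the loss and starting point are illustrative assumptions, not part of the question:

```python
def gradient_descent(lr=0.1, steps=100):
    w = 0.0  # initial parameter guess
    for _ in range(steps):
        grad = 2 * (w - 3)   # gradient of the loss at the current w
        w = w - lr * grad    # step in the direction of the negative gradient
    return w

w = gradient_descent()
# w converges toward the minimizer w = 3
```

Each step moves w a fraction of the way toward the minimum; the learning rate lr controls the step size.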
e. Explain Bayes Classifier.
Bayes Classifier is a probabilistic classifier based on Bayes’ Theorem. It predicts class labels using prior probability and likelihood of features assuming conditional independence.
f. What are the basics of Sampling Theory?
Sampling theory deals with selecting a representative subset of data from a population to make predictions while preserving the statistical properties of the entire dataset.
g. What is the mistake bound model of learning?
The mistake bound model measures learning performance by counting the maximum number of mistakes an algorithm can make before learning the correct concept.
h. Explain Case-Based Learning.
Case-Based Learning solves new problems by comparing them with previously solved cases and adapting old solutions to new situations.
i. How do you evaluate the performance of a model based on first-order rules?
Performance is evaluated using accuracy, precision, recall, and rule coverage by testing how well the rules classify unseen examples.
j. What is Reinforcement Learning?
Reinforcement Learning is a learning technique where an agent learns optimal actions by interacting with the environment using rewards and penalties.
SECTION B
(Attempt any three – answers provided for ALL)
2(a). Differentiate between Supervised, Unsupervised, and Reinforcement Learning
| Feature | Supervised | Unsupervised | Reinforcement |
|---|---|---|---|
| Data | Labeled | Unlabeled | Feedback-based |
| Goal | Prediction | Pattern discovery | Optimal action |
| Examples | Classification, Regression | Clustering | Game playing |
2(b). Decision Tree Terms
(i) Entropy:
Measures impurity in data.
Entropy(S) = −∑ pᵢ log₂ pᵢ
(ii) Information Gain: Reduction in entropy after split.
(iii) Gini Index: Measures node impurity.
(iv) Gain Ratio: Information gain normalized by split information.
(v) Chi-Square: Statistical test to measure dependency between variables.
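The entropy and Gini formulas above can be checked in a few lines; the class probabilities are passed as a list (a minimal illustration, not an exam requirement):

```python
import math

def entropy(probs):
    # Entropy(S) = -sum p_i log2 p_i  (terms with p = 0 contribute 0)
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gini(probs):
    # Gini(S) = 1 - sum p_i^2
    return 1 - sum(p * p for p in probs)

# A 50/50 split is maximally impure for two classes:
entropy([0.5, 0.5])  # 1.0
gini([0.5, 0.5])     # 0.5
entropy([1.0])       # 0.0 (a pure node)
```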
2(c). Expectation Maximization (EM) Algorithm
EM is an iterative algorithm used for parameter estimation in probabilistic models with hidden variables.
Steps:
E-Step: estimate the expected values of the hidden variables given current parameters.
M-Step: re-estimate the parameters by maximizing the expected likelihood.
Used in clustering and Gaussian Mixture Models.
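The two steps can be sketched for a hypothetical 1-D mixture of two Gaussians with fixed unit variances, estimating only the means (a minimal illustration, not a full GMM implementation):

```python
import math

def em_gmm_1d(data, iters=50):
    mu = [min(data), max(data)]  # crude initialization of the two means
    for _ in range(iters):
        # E-step: responsibility of component 0 for each point
        resp = []
        for x in data:
            p0 = math.exp(-0.5 * (x - mu[0]) ** 2)
            p1 = math.exp(-0.5 * (x - mu[1]) ** 2)
            resp.append(p0 / (p0 + p1))
        # M-step: each mean becomes a responsibility-weighted average
        mu[0] = sum(r * x for r, x in zip(resp, data)) / sum(resp)
        mu[1] = sum((1 - r) * x for r, x in zip(resp, data)) / sum(1 - r for r in resp)
    return mu

data = [0.1, -0.2, 0.0, 4.9, 5.1, 5.0]
mu = em_gmm_1d(data)  # means settle near the two clusters, ~0 and ~5
```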
2(d). Backpropagation Algorithm in ANN with Example
Backpropagation trains neural networks by minimizing error.
Steps: forward propagation, error calculation, backward propagation of the error, weight update.
Example: Used in digit recognition systems.
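The four steps can be sketched on a hypothetical single sigmoid neuron (one input x, one weight w) rather than a full digit-recognition network; the target value and learning rate are illustrative assumptions:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def backprop_step(x, target, w, lr=0.5):
    # 1. Forward propagation
    y = sigmoid(w * x)
    # 2. Error calculation (squared error E = 1/2 (t - y)^2)
    error = target - y
    # 3. Backward propagation: dE/dw = -(t - y) * y * (1 - y) * x
    grad = -error * y * (1 - y) * x
    # 4. Weight update: w_new = w_old - lr * dE/dw
    return w - lr * grad

w = 0.2
for _ in range(1000):
    w = backprop_step(1.0, 0.9, w)
# after training, sigmoid(w) is close to the target 0.9
```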
2(e). Hypothesis Space Search
Hypothesis space is the set of all possible models.
Impact: Too small → underfitting
Too large → overfitting
Example: Linear vs polynomial regression models.
SECTION C
3(a). Inductive Bias in Machine Learning
Inductive bias refers to assumptions a learning algorithm uses to predict unseen data.
Example:
Decision trees prefer shorter trees, influencing learned models.
3(b). Candidate Elimination Algorithm
Maintains two boundary sets:
S (specific boundary) and G (general boundary)
Steps: initialize S to the most specific hypothesis and G to the most general; generalize S on positive examples; specialize G on negative examples.
Used in concept learning.
4(a). Forward vs Backward Propagation
| Forward Propagation | Backward Propagation |
|---|---|
| Computes output | Updates weights |
| Input → Output | Output → Input |
| Prediction phase | Training phase |
Weight calculation:
w_new = w_old − η ∂E/∂w
4(b). Single-Layer Neural Network (XOR Problem – One Iteration)
Given:
w_ih = 0.5, w_ho = −0.5
Learning rate = 0.1
Steps: forward pass using the sigmoid activation, error computation, backpropagation of the error, update of weights and biases (numerical steps shown clearly in exam).
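Using the given values (w_ih = 0.5, w_ho = −0.5, η = 0.1), one iteration can be sketched for a minimal one-input, one-hidden-unit chain; the input/target pair (x = 1, t = 1) and zero biases are assumptions for illustration, since the exam does not state them:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

w_ih, w_ho, lr = 0.5, -0.5, 0.1   # given weights and learning rate
x, target = 1.0, 1.0              # assumed training pair

# Forward pass
h = sigmoid(w_ih * x)             # hidden activation
y = sigmoid(w_ho * h)             # output activation

# Error and output-layer delta
error = target - y
delta_o = error * y * (1 - y)

# Backpropagated hidden-layer delta
delta_h = delta_o * w_ho * h * (1 - h)

# Weight updates
w_ho += lr * delta_o * h
w_ih += lr * delta_h * x
```

Under these assumptions the output is y ≈ 0.4228 and, after one iteration, w_ih ≈ 0.4983 and w_ho ≈ −0.4912.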
5(a). Weather Dataset – Rule-Based Classification
(i) If weather is Sunny: Majority output → No
(ii) Humidity = Normal & Windy = True: Majority output → Yes
5(b). Bayes Theorem – Disease Problem
Given: Disease probability = 1/10,000
Test accuracy = 99%
P(D|+) = (0.99 × 0.0001) / ((0.99 × 0.0001) + (0.01 × 0.9999)) ≈ 0.0098
Probability ≈ 0.98%
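The same posterior, computed directly from the numbers above:

```python
p_disease = 1 / 10_000          # prior P(D)
p_pos_given_d = 0.99            # P(+|D), the stated test accuracy
p_pos_given_not_d = 0.01        # P(+|not D), the false-positive rate

numerator = p_pos_given_d * p_disease
denominator = numerator + p_pos_given_not_d * (1 - p_disease)
posterior = numerator / denominator
# posterior ≈ 0.0098, i.e. about 0.98%
```

Despite the 99% test accuracy, the low prior makes a positive result only ~1% indicative of disease.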
6(a). Mistake Bound Model of Learning
Concept: bounds the total number of mistakes an algorithm makes during learning.
Benefits: gives a worst-case theoretical guarantee; simple to evaluate.
Limitations: bounds are worst-case and can be loose; classic bounds (e.g., the perceptron mistake bound) assume linearly separable data.
6(b). K-Nearest Neighbors (KNN)
Given points: (2,3), (5,4), (9,6), (8,1), (7,2)
Labels: A, A, B, B, B
New point: (6,5), k = 3
The three nearest neighbors are (5,4) → A, (9,6) → B, and (7,2) → B; majority = B, so (6,5) is classified as B.
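The classification can be verified with a short script (Euclidean distance via `math.dist`):

```python
import math
from collections import Counter

points = [(2, 3), (5, 4), (9, 6), (8, 1), (7, 2)]
labels = ["A", "A", "B", "B", "B"]

def knn_classify(query, points, labels, k):
    # Sort training points by Euclidean distance to the query
    dists = sorted((math.dist(query, p), lbl) for p, lbl in zip(points, labels))
    # Majority vote among the k nearest labels
    votes = Counter(lbl for _, lbl in dists[:k])
    return votes.most_common(1)[0][0]

result = knn_classify((6, 5), points, labels, k=3)  # "B"
```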
7(a). Genetic Algorithm – Binary String Optimization
(i) Representation: Binary string of length 8
(ii) Initialization: Random population of binary strings
(iii) Operations: selection, crossover, and mutation.
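The three operations can be sketched on length-8 binary strings; the fitness function (count of 1-bits) is a hypothetical choice, since the question leaves it unspecified:

```python
import random

random.seed(0)
LENGTH, POP, GENS = 8, 20, 40

def fitness(s):
    # Hypothetical fitness: number of 1-bits in the string
    return s.count(1)

def select(pop):
    # Tournament selection of size 2: fitter of two random individuals
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    cut = random.randrange(1, LENGTH)   # single-point crossover
    return p1[:cut] + p2[cut:]

def mutate(s, rate=0.05):
    # Flip each bit independently with small probability
    return [bit ^ 1 if random.random() < rate else bit for bit in s]

# Initialization: random population of binary strings
pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP)]

best = max(pop, key=fitness)
```

Over the generations, selection pressure drives the population toward strings with many 1-bits.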
7(b). Types of Reinforcement
Positive reinforcement: give a reward.
Negative reinforcement: remove a penalty.
Punishment: apply a penalty or reduce reward.
No reinforcement: no feedback.