(SEM V) THEORY EXAMINATION 2024-25 MACHINE LEARNING TECHNIQUES
Subject Code: BCS055
Maximum Marks: 70
Time: 3 Hours
Paper ID: 310309
Question Paper Overview
SECTION A (2 × 7 = 14 Marks)
(Conceptual short-answer questions — basic ML theory and applications)
a. What constitutes a well-defined learning problem in Machine Learning? Give an example.
b. How does Machine Learning differ from Data Science, and how do they complement each other?
c. Explain the role of the Sigmoid Function in Logistic Regression.
d. What is the significance of the decision surface in Support Vector Machines (SVM)?
e. What is overfitting in decision tree learning, and how can it be avoided?
f. Explain the role of the activation function in a Perceptron.
g. Describe the role of the reward function in Reinforcement Learning.
SECTION B (Attempt any three × 7 = 21 Marks)
a. Describe the steps involved in designing a Machine Learning system. Illustrate each step with a real-world example.
b. Differentiate between Simple Linear Regression and Multiple Linear Regression. How does including multiple features affect model complexity?
c. Illustrate the steps of the ID3 algorithm, explaining attribute selection and tree construction.
d. Discuss the Self-Organizing Map (SOM) algorithm and how it performs clustering and dimensionality reduction.
e. Explain the role of the Q-learning function in Reinforcement Learning. How does it help agents learn optimal actions and policies?
SECTION C (Attempt one part from each question × 7 = 35 Marks)
Q3
(a) Discuss the history of Machine Learning with major milestones and advancements.
OR
(b) Compare Supervised, Unsupervised, and Reinforcement Learning with appropriate examples.
Q4
(a) Apply the Naïve Bayes Classifier to the given dataset (Play Tennis example).
Dataset:
| Day | Outlook | Temperature | Humidity | Wind | Play Tennis? |
|---|---|---|---|---|---|
| 1 | Sunny | Hot | High | Weak | No |
| 2 | Overcast | Hot | High | Weak | Yes |
| 3 | Rainy | Cool | Normal | Weak | No |
| 4 | Overcast | Cool | Normal | Strong | Yes |
| 5 | Rainy | Mild | Normal | Weak | Yes |
| 6 | Sunny | Mild | Normal | Strong | No |
| 7 | Overcast | Mild | High | Strong | Yes |
| 8 | Overcast | Hot | Normal | Weak | No |
Predict: {Outlook = Sunny, Temperature = Cool, Humidity = High, Wind = Strong}
OR
(b) Fit a Linear Regression model for the dataset
(x, y): (1, 1.5), (2, 3.0), (3, 4.5), (4, 6.0)
and predict y when x = 5.
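For practice, the least-squares fit can be checked with a short script. This is a worked sketch using the closed-form slope/intercept formulas, not part of the paper itself:

```python
# Least-squares fit of y = a + b*x on the four given points.
xs = [1, 2, 3, 4]
ys = [1.5, 3.0, 4.5, 6.0]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# b = sum (x - x_bar)(y - y_bar) / sum (x - x_bar)^2,  a = y_bar - b * x_bar
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

print(a, b)        # this data lies exactly on y = 1.5x, so a = 0, b = 1.5
print(a + b * 5)   # prediction at x = 5 -> 7.5
```

Since every point sits exactly on the line y = 1.5x, the predicted value at x = 5 is 7.5.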
Q5
(a) Describe Locally Weighted Regression (LWR) and how it differs from traditional regression models. Discuss its applications.
OR
(b) Given the dataset:
| A | 1 | 2 | 1 | 2 | 1 | 2 |
|---|---|---|---|---|---|---|
| B | 2 | 3 | 3 | 2 | 2 | 3 |
| T | Yes | No | Yes | No | Yes | No |
Compute the entropy before the split and after splitting on A. Determine which attribute (A or B) gives the highest information gain.
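The entropy and information-gain arithmetic for this table can be verified with a short script; this is a practice sketch, not part of the paper:

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                for c in set(labels))

# The dataset from the question.
A = [1, 2, 1, 2, 1, 2]
B = [2, 3, 3, 2, 2, 3]
T = ["Yes", "No", "Yes", "No", "Yes", "No"]

def info_gain(attr, target):
    """Entropy(target) minus the weighted entropy after splitting on attr."""
    base, n, rem = entropy(target), len(target), 0.0
    for v in set(attr):
        subset = [t for a, t in zip(attr, target) if a == v]
        rem += len(subset) / n * entropy(subset)
    return base - rem

print(entropy(T))       # 1.0 (3 Yes, 3 No)
print(info_gain(A, T))  # 1.0: A = 1 is always Yes, A = 2 is always No
print(info_gain(B, T))  # about 0.082: B's partitions are still mixed
```

A separates the classes perfectly, so splitting on A gives the highest information gain.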
Q6
(a) Derive the mathematical steps of Backpropagation for training a neural network.
OR
(b) Explain the architecture and working of Convolutional Neural Networks (CNNs).
Q7
(a) Explain the concept of Reinforcement Learning (RL) and how it differs from Supervised and Unsupervised Learning.
OR
(b) Describe the components of a Genetic Algorithm (GA) — chromosomes, genes, fitness function, and population — and explain the GA cycle.
Key Topics for Revision
1. Machine Learning Overview
Definition: Field of AI that enables systems to learn patterns from data.
Core Components: Data, Model, Learning Algorithm, Evaluation Metric.
Example: Email spam detection, recommender systems.
2. Types of Machine Learning
| Type | Description | Example |
|---|---|---|
| Supervised | Labeled data → predict output | Linear Regression, SVM |
| Unsupervised | Unlabeled data → find structure | K-Means, PCA, SOM |
| Reinforcement | Learn via rewards/punishments | Q-Learning, Deep Q-Network |
3. Logistic Regression & Sigmoid Function
Sigmoid Function:
- f(x) = \frac{1}{1 + e^{-x}}
Converts linear output into probabilities (0–1 range).
Use: Classification problems such as spam vs. non-spam.
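A minimal sketch of the sigmoid, with the example inputs chosen for illustration:

```python
from math import exp

def sigmoid(x):
    """Squash a real-valued linear score into the (0, 1) probability range."""
    return 1.0 / (1.0 + exp(-x))

# A score of 0 sits exactly on the decision boundary.
print(sigmoid(0))    # 0.5
print(sigmoid(4))    # ~0.982 -> confident positive class (e.g., "spam")
print(sigmoid(-4))   # ~0.018 -> confident negative class
```

Note the symmetry sigmoid(-x) = 1 - sigmoid(x), which is why the two class probabilities always sum to 1.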
4. SVM Decision Surface
Separates classes with a maximum-margin hyperplane.
Non-linear data handled via kernel trick (e.g., RBF kernel).
5. Overfitting in Decision Trees
Definition: The tree fits noise in the training data rather than the underlying trend.
Prevention: pre- or post-pruning, limiting tree depth, and requiring a minimum number of samples per leaf.
6. Self-Organizing Map (SOM)
Unsupervised neural network for clustering and visualization.
Maps high-dimensional data to 2D grid preserving topological properties.
Useful in pattern recognition and dimensionality reduction.
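A minimal 1-D SOM sketch, assuming a toy dataset of four 2-D points in two clusters and a line of four units (the data, grid size, and learning schedule are illustrative, not from the syllabus):

```python
import random

random.seed(0)

# Toy data: two clusters of 2-D points, mapped onto a 1-D grid of 4 units.
data = [(0.1, 0.2), (0.15, 0.1), (0.9, 0.8), (0.85, 0.95)]
weights = [[random.random(), random.random()] for _ in range(4)]

def bmu(x):
    """Index of the best-matching unit (closest weight vector to x)."""
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))

lr = 0.5
for epoch in range(50):
    for x in data:
        b = bmu(x)
        for i, w in enumerate(weights):
            # Neighbourhood function: winner updated fully, grid
            # neighbours at half strength, everything else untouched.
            h = 1.0 if i == b else (0.5 if abs(i - b) == 1 else 0.0)
            for d in range(2):
                w[d] += lr * h * (x[d] - w[d])
    lr *= 0.95  # decay the learning rate over time
```

After training, the two clusters activate different units, which is the sense in which the SOM clusters the data while preserving the topology of the input space on the grid.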
7. Q-Learning
Model-free Reinforcement Learning algorithm.
Q(s, a) = expected cumulative discounted reward for taking action a in state s and acting optimally thereafter.
Update rule:
- Q(s,a) = Q(s,a) + \alpha \left[ r + \gamma \max_{a'} Q(s',a') - Q(s,a) \right]
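The update rule can be seen in action on a toy environment. This sketch assumes a made-up 1-D corridor of four states with a reward at the right end; the environment and hyperparameters are illustrative only:

```python
import random

random.seed(1)

# Tabular Q-learning on a toy corridor: states 0..3, reward on reaching 3.
n_states, actions = 4, [-1, +1]       # actions: move left / move right
alpha, gamma, eps = 0.5, 0.9, 0.2     # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}

for episode in range(200):
    s = 0
    while s != n_states - 1:
        # Epsilon-greedy action selection.
        if random.random() < eps:
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), n_states - 1)
        r = 1.0 if s2 == n_states - 1 else 0.0
        # Q(s,a) <- Q(s,a) + alpha [ r + gamma * max_a' Q(s',a') - Q(s,a) ]
        best_next = max(Q[(s2, a2)] for a2 in actions)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# The greedy policy extracted from Q should move right in every state.
policy = {s: max(actions, key=lambda act: Q[(s, act)])
          for s in range(n_states - 1)}
print(policy)
```

Notice that the agent learns the optimal policy without ever being told which action is correct, only from the scalar reward signal; this is the key contrast with supervised learning.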
8. Naïve Bayes Classifier
Based on Bayes’ theorem and feature independence assumption.
- P(C|X) = \frac{P(X|C)\,P(C)}{P(X)}
Commonly used in text classification and sentiment analysis.
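The classifier can be applied to the Play Tennis table from Q4(a) with a few lines of code. This is a practice sketch without Laplace smoothing, so a feature value never seen with a class zeroes out that class's score:

```python
# Naive Bayes on the Play Tennis table from Q4(a).
data = [
    ("Sunny",    "Hot",  "High",   "Weak",   "No"),
    ("Overcast", "Hot",  "High",   "Weak",   "Yes"),
    ("Rainy",    "Cool", "Normal", "Weak",   "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"),
    ("Rainy",    "Mild", "Normal", "Weak",   "Yes"),
    ("Sunny",    "Mild", "Normal", "Strong", "No"),
    ("Overcast", "Mild", "High",   "Strong", "Yes"),
    ("Overcast", "Hot",  "Normal", "Weak",   "No"),
]

def score(query, label):
    """Unnormalised posterior: P(label) * product of P(feature_i | label)."""
    rows = [r for r in data if r[-1] == label]
    p = len(rows) / len(data)
    for i, v in enumerate(query):
        p *= sum(1 for r in rows if r[i] == v) / len(rows)
    return p

query = ("Sunny", "Cool", "High", "Strong")
scores = {c: score(query, c) for c in ("Yes", "No")}
prediction = max(scores, key=scores.get)
print(scores, prediction)
```

Here "Sunny" never occurs with "Yes" in this table, so the "Yes" score is exactly 0 and the prediction is "No"; Laplace (add-one) smoothing is the standard fix for such zero probabilities.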
9. Locally Weighted Regression (LWR)
A non-parametric regression technique.
Assigns weights to nearby data points (closer points have higher influence).
Used in robotics and real-time predictive modeling.
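A minimal LWR sketch, assuming a small made-up dataset and a Gaussian kernel with bandwidth tau (both illustrative): unlike ordinary regression, a fresh weighted line is fit around every query point.

```python
from math import exp

# Toy, roughly linear data (illustrative only).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 1.9, 3.2, 3.9, 5.1]

def lwr_predict(x0, tau=1.0):
    """Fit a weighted least-squares line around x0 and evaluate it there."""
    # Gaussian kernel: points near x0 get weight near 1, far points near 0.
    w = [exp(-(x - x0) ** 2 / (2 * tau ** 2)) for x in xs]
    sw = sum(w)
    mx = sum(wi * x for wi, x in zip(w, xs)) / sw
    my = sum(wi * y for wi, y in zip(w, ys)) / sw
    b = sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys)) / \
        sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs))
    a = my - b * mx
    return a + b * x0

print(lwr_predict(2.5))
```

Because the line is refit per query, LWR is non-parametric: there is no single global weight vector, and all the training data must be kept at prediction time.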
10. Backpropagation
Gradient descent-based algorithm for training neural networks.
Steps:
1. Forward propagation (compute the output).
2. Compute the loss (e.g., MSE).
3. Backpropagate the error to update the weights.
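The three steps can be traced by hand on the smallest possible network. This sketch assumes a toy setup invented for illustration: one input, one sigmoid hidden unit, a linear output, MSE loss, and a constant target.

```python
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

# Tiny network: input -> sigmoid hidden unit -> linear output.
x, target = 1.0, 0.5
w1, w2 = 0.8, -0.4   # input->hidden and hidden->output weights (arbitrary)
lr = 0.5

for step in range(200):
    # 1. Forward propagation.
    h = sigmoid(w1 * x)
    y = w2 * h
    # 2. Loss (MSE on a single example).
    loss = 0.5 * (y - target) ** 2
    # 3. Backpropagation: chain rule from the loss back to each weight.
    dL_dy = y - target
    dL_dw2 = dL_dy * h                  # dL/dw2 = dL/dy * dy/dw2
    dL_dh = dL_dy * w2                  # dL/dh  = dL/dy * dy/dh
    dL_dw1 = dL_dh * h * (1 - h) * x    # sigmoid'(z) = h * (1 - h)
    w2 -= lr * dL_dw2
    w1 -= lr * dL_dw1

print(loss)   # close to zero after training
```

The exam derivation is the same chain rule written for general layers: each layer's error term is the next layer's error multiplied by the connecting weights and the local activation derivative.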
11. Convolutional Neural Networks (CNNs)
Composed of convolutional, pooling, and fully connected layers.
Extracts spatial hierarchies of features.
Used in image classification, object detection, and medical imaging.
12. Genetic Algorithms (GAs)
Evolution-inspired optimization approach:
Initialize random population.
Evaluate fitness.
Apply selection, crossover, mutation.
Repeat until convergence.
Used in feature selection, scheduling, and parameter tuning.
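The full GA cycle (population, fitness, selection, crossover, mutation) can be shown on a classic toy problem. This sketch assumes an illustrative task, maximising f(x) = x² for x encoded as a 5-bit chromosome, with made-up population size and rates:

```python
import random

random.seed(42)

# Maximise f(x) = x^2 where x is encoded as a 5-bit chromosome (0..31).
# Genes are bits, fitness is f(x), the population is a list of chromosomes.
POP, BITS, GENS = 8, 5, 40

def fitness(chrom):
    return int("".join(map(str, chrom)), 2) ** 2

def select(pop):
    """Tournament selection: return the fitter of two random individuals."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    cut = random.randint(1, BITS - 1)   # single-point crossover
    return p1[:cut] + p2[cut:]

def mutate(chrom, rate=0.1):
    """Flip each gene independently with probability `rate`."""
    return [1 - g if random.random() < rate else g for g in chrom]

# Initialise a random population, then run the GA cycle.
pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
best = max(pop, key=fitness)
for gen in range(GENS):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP)]
    best = max(pop + [best], key=fitness)   # keep the best-ever individual

print(best, fitness(best))   # expect a value near the maximum f(31) = 961
```

Each generation repeats the cycle from the notes: evaluate fitness, select parents, recombine, mutate, and the best-ever individual is tracked until convergence.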
Exam Tips
Draw diagrams for Neural Network, SVM margin, Decision Tree, CNN layers.
Remember entropy/information gain formulas for ID3.
Practice Naïve Bayes and Linear Regression numericals.
Revise short definitions: overfitting, reward function, activation function, Q-value, feature scaling.
Learn backpropagation steps and Reinforcement Learning cycle diagram.