THEORY EXAMINATION (SEM–VI) 2016-17
ARTIFICIAL NEURAL NETWORK (NEC013)
Time: 3 Hours | Max Marks: 100
SECTION – A (Short Answer Questions)
(10 × 2 = 20 Marks)
(a) Neural Computing
Neural computing is a computing paradigm inspired by the human brain, where information is processed using interconnected artificial neurons that learn from data.
(b) BNN and ANN
BNN (Biological Neural Network): Natural network of neurons in the human brain.
ANN (Artificial Neural Network): Mathematical model inspired by BNN, implemented using algorithms and software.
(c) ADALINE model
ADALINE (Adaptive Linear Neuron) is a single-layer neural network that uses a linear activation function and updates weights using the least mean square (LMS) rule.
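As an illustration, here is a minimal pure-Python sketch of the LMS (Widrow–Hoff) update on a bipolar AND task; the function name `train_adaline`, the learning rate, and the epoch count are illustrative choices, not part of the syllabus definition:

```python
# Minimal ADALINE sketch: linear output, LMS (Widrow-Hoff) weight update.
def train_adaline(samples, eta=0.1, epochs=200):
    """samples: list of (inputs, target); returns learned weights and bias."""
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = sum(wi * xi for wi, xi in zip(w, x)) + b   # linear activation
            err = t - y
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]  # LMS rule
            b += eta * err
    return w, b

# Learn the AND function with bipolar targets (-1 / +1)
data = [([0, 0], -1), ([0, 1], -1), ([1, 0], -1), ([1, 1], 1)]
w, b = train_adaline(data)
predict = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
```

Note that the weights are updated from the *linear* output before thresholding, which is what distinguishes ADALINE from the perceptron rule.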
(d) ART models
ART (Adaptive Resonance Theory) models are neural networks used for pattern clustering that maintain stability while learning new patterns without forgetting old ones.
(e) Boltzmann learning
Boltzmann learning is a stochastic learning rule where neurons update states probabilistically to minimize network energy.
(f) Linear Associative Network (LAN)
LAN is a network that associates input patterns with output patterns using linear mapping, commonly used in memory models.
(g) Number of hidden nodes
The number of hidden nodes depends on problem complexity. Too few cause underfitting; too many cause overfitting. There is no fixed rule, usually decided experimentally.
(h) Pattern association
Pattern association is the task of mapping an input pattern to a corresponding output pattern, such as input–output recall.
(i) Network inversion
Network inversion is the process of determining the input that produces a given output in a neural network.
(j) Components of CL network
CL (Competitive Learning) network components include:
Input layer
Competitive layer
Winner-take-all mechanism
Used for pattern clustering and feature extraction.
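The winner-take-all mechanism can be sketched as follows; this is a toy loop assuming Euclidean distance, two competitive units, and illustrative names (`competitive_learn` is not a standard API):

```python
import random

# Competitive (winner-take-all) learning sketch: the unit whose weight
# vector is closest to the input wins, and only the winner is updated.
def competitive_learn(inputs, n_units=2, eta=0.5, epochs=20, seed=0):
    rng = random.Random(seed)
    w = [[rng.random() for _ in inputs[0]] for _ in range(n_units)]
    for _ in range(epochs):
        for x in inputs:
            # winner = unit with smallest squared Euclidean distance to x
            winner = min(range(n_units),
                         key=lambda j: sum((wi - xi) ** 2
                                           for wi, xi in zip(w[j], x)))
            # only the winner's weights move toward the input
            w[winner] = [wi + eta * (xi - wi) for wi, xi in zip(w[winner], x)]
    return w

# Two well-separated clusters: the units should settle near each cluster
data = [[0.0, 0.1], [0.1, 0.0], [1.0, 0.9], [0.9, 1.0]]
centers = competitive_learn(data)
```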
SECTION – B (Long Answer Questions)
(Attempt any FIVE – 5 × 10 = 50 Marks)
2(a) Full Counter Propagation Network (Full CPN)
Architecture: Full CPN consists of:
Input layer
Kohonen (competitive) layer
Grossberg (output) layer
Training Phases:
Unsupervised learning in Kohonen layer (clustering)
Supervised learning in Grossberg layer (association)
Used in pattern classification and mapping.
2(b) Biological neuron and neuron models
Biological neuron parts:
Dendrites
Cell body
Axon
Synapse
Neuron models:
McCulloch–Pitts neuron
ADALINE
Sigmoid neuron
Each model simplifies biological behavior mathematically.
2(c) Types of learning – Hebbian and Boltzmann
Types of learning:
Supervised
Unsupervised
Reinforcement
Hebbian learning:
“Neurons that fire together wire together.”
Weight increases if both input and output are active.
Boltzmann learning:
Uses probabilistic neuron activation and simulated annealing to reach minimum energy state.
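The Hebbian update above can be sketched in a few lines; the helper name `hebbian_step` and the rate η = 0.1 are illustrative:

```python
# Hebbian rule sketch: Δw_i = η · x_i · y.  A weight grows only when its
# input and the output are active together ("fire together, wire together").
def hebbian_step(w, x, y, eta=0.1):
    return [wi + eta * xi * y for wi, xi in zip(w, x)]

w = [0.0, 0.0]
# Present a pattern where input 0 and the output are both active;
# input 1 stays silent, so its weight never changes.
for _ in range(5):
    w = hebbian_step(w, x=[1, 0], y=1)
```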
2(d) RBF network for pattern classification
In pattern classification, data clusters around centers. RBF networks use radial basis functions (usually Gaussian) centered at these clusters.
Basis functions are decided by:
Data distribution
Distance measure
Cluster centers
2(e) MLP architecture and backpropagation
MLP Architecture:
Input layer
One or more hidden layers
Output layer
Backpropagation:
Error is propagated backward to update weights using gradient descent:
Δw = −η ∂E/∂w
Used for nonlinear classification problems.
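As a numeric check of the update rule, here is a minimal sketch for a single linear neuron with squared error E = ½(t − w·x)², whose analytic gradient is ∂E/∂w = −(t − w·x)·x; names and constants are illustrative:

```python
# One gradient-descent step Δw = −η ∂E/∂w for a single linear weight.
def grad_step(w, x, t, eta=0.1):
    y = w * x
    dE_dw = -(t - y) * x          # analytic gradient of E = 0.5 * (t - y)^2
    return w - eta * dE_dw        # apply Δw = −η ∂E/∂w

w = 0.0
for _ in range(100):
    w = grad_step(w, x=2.0, t=4.0)   # target y = 4 at x = 2, so w → 2
```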
2(f) Recognition of consonant–vowel (CV) segments
ANNs recognize CV segments by extracting spectral and temporal features.
Texture classification:
Classifies patterns based on texture properties.
Segmentation:
Divides image or signal into meaningful regions.
2(g) Pattern association, classification, and mapping
Pattern association: Input → output recall
Pattern classification: Assign input to a class
Pattern mapping: Transform input pattern to output pattern
Example: Speech recognition systems.
2(h) Feed-forward vs Feed-back networks
| Feed-forward | Feed-back |
|---|---|
| No cycles | Has cycles |
| Faster | Can store memory |
| Example: MLP | Example: Hopfield |
Stochastic networks: Use randomness
Simulated annealing: Gradual reduction of randomness
Boltzmann machine: Energy-based stochastic network
SECTION – C (Very Long Answer Questions)
(Attempt any TWO – 2 × 15 = 30 Marks)
3(a) Hopfield network – storage and recall algorithm
Storage algorithm: For stored bipolar patterns x^p, weights are computed with the Hebbian rule:
w_ij = Σ_p x_i^p x_j^p, with w_ii = 0
Recall algorithm:
Initialize with input pattern
Update neurons iteratively
Network converges to stored pattern
Hopfield network acts as content-addressable memory.
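The storage and recall steps can be sketched as below; this toy version uses synchronous updates for brevity (classic Hopfield recall is asynchronous), and the function names are illustrative:

```python
# Hopfield sketch: Hebbian storage plus iterative recall, bipolar units.
def store(patterns):
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:                      # w_ij = Σ_p x_i x_j, w_ii = 0
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=5):
    n = len(state)
    s = list(state)
    for _ in range(steps):                  # synchronous update for simplicity
        s = [1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

pattern = [1, -1, 1, -1, 1, -1]
w = store([pattern])
noisy = [1, -1, -1, -1, 1, -1]              # one bit flipped
```

Recalling from `noisy` corrects the flipped bit, which is exactly the content-addressable-memory behaviour described above.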
3(b) AND, OR, XOR using MP neurons
AND and OR are linearly separable, so each can be implemented with a single McCulloch–Pitts neuron (single-layer perceptron).
XOR problem:
XOR is not linearly separable, so it cannot be solved by a single-layer perceptron.
Solution:
Use Multilayer Perceptron (MLP) with hidden layer.
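One concrete solution is a hand-wired two-layer network of threshold units: the hidden units compute OR and NAND, and the output ANDs them. The weights below are chosen by hand purely to illustrate that one hidden layer suffices where a single-layer perceptron fails:

```python
# XOR with a two-layer network of McCulloch-Pitts style threshold units.
step = lambda z: 1 if z >= 0 else 0

def xor(x1, x2):
    h1 = step(x1 + x2 - 0.5)      # hidden unit 1: OR
    h2 = step(-x1 - x2 + 1.5)     # hidden unit 2: NAND
    return step(h1 + h2 - 1.5)    # output unit: AND of the hidden outputs
```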
4(a) Self-Organizing Maps (SOM)
SOM maps high-dimensional data to low-dimensional grids.
Training steps:
Initialize weights
Find Best Matching Unit (BMU)
Update neighborhood weights
Applications:
Data compression
Visualization
Clustering
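The training steps above can be sketched for the simplest case, a 1-D map over 1-D data; a fixed neighbourhood of immediate index neighbours and the name `train_som` are illustrative simplifications (real SOMs shrink η and the neighbourhood over time):

```python
import random

# 1-D SOM sketch: a line of units learns to cover the data range.
# The BMU and its immediate index neighbours move toward each sample.
def train_som(data, n_units=5, eta=0.3, epochs=30, seed=1):
    rng = random.Random(seed)
    w = [rng.random() for _ in range(n_units)]      # 1-D weight per unit
    for _ in range(epochs):
        for x in data:
            bmu = min(range(n_units), key=lambda j: abs(w[j] - x))
            for j in range(max(0, bmu - 1), min(n_units, bmu + 2)):
                w[j] += eta * (x - w[j])            # neighbourhood update
    return w

data = [0.05, 0.25, 0.5, 0.75, 0.95]
weights = train_som(data)
```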
4(b) ART networks
ART networks perform stable pattern clustering.
Features:
Plasticity
Stability
Vigilance parameter
Advantages:
No catastrophic forgetting
Online learning
5(a–c) Short Notes
(a) Principal Component Analysis (PCA)
PCA reduces dimensionality by transforming data into uncorrelated components.
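A minimal 2-D sketch: centre the data, form the covariance matrix, and extract the first principal component by power iteration (one of several ways to compute it; the name `first_pc` is illustrative):

```python
# PCA sketch in 2-D: the first principal component is the top eigenvector
# of the covariance matrix, found here by power iteration.
def first_pc(points, iters=100):
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    centred = [(x - mx, y - my) for x, y in points]
    # entries of the 2x2 covariance matrix
    cxx = sum(x * x for x, _ in centred) / n
    cyy = sum(y * y for _, y in centred) / n
    cxy = sum(x * y for x, y in centred) / n
    v = (1.0, 0.0)
    for _ in range(iters):                   # power iteration
        u = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
        norm = (u[0] ** 2 + u[1] ** 2) ** 0.5
        v = (u[0] / norm, u[1] / norm)
    return v

# Points spread mostly along the line y = x, so the first component
# should point roughly along (1, 1) / sqrt(2)
pc = first_pc([(0, 0), (1, 1.1), (2, 1.9), (3, 3.0)])
```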
(b) Vector Quantization (VQ)
VQ represents large datasets using a small set of representative vectors.
(c) Mexican Hat Networks
These networks use excitation at center and inhibition around, useful in feature detection.