Artificial Neural Networks
Course Nos. ECE.09.560 (Grad), ECE.09.454 (Senior Elective)
Fall 2010
Lab Project 2: Multilayer Perceptrons
Objective
In this project, you will develop techniques for designing multilayer perceptron (MLP) models for solving complex function approximation and classification problems. This project has five parts. In Part 1, you will exercise your MLP model to solve canonical logic-gate problems. In Part 2, you will use your MLP model to perform function approximation. In Parts 3, 4, and 5, you will solve pattern recognition problems.
Pre-Lab Exercise
- Multilayer perceptrons are modeled in Matlab in a 4-step process:
  - The function newff creates a feedforward MLP neural network.
  - The structure net.trainParam is used to specify the training parameters of the MLP.
  - The function train is used to train the MLP.
  - The function sim is used to simulate/test the MLP.
  Do >> help newff for more details.
- See the classroom demo Multilayer Perceptron - Backprop Training.
- Experiment with different numbers of training passes, input-output configurations, etc.
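The four steps above can be sketched as follows. This is a minimal illustration, not a prescribed solution: the AND-gate data, layer sizes, transfer functions, and parameter values are all arbitrary starting choices, and the sketch assumes the pre-R2010b newff calling convention.

```matlab
% Step 1: create a feedforward MLP (2 inputs, 3 hidden tansig nodes, 1 output)
P = [0 0 1 1; 0 1 0 1];          % example inputs: one column per pattern
T = [0 0 0 1];                   % example targets (AND gate)
net = newff(minmax(P), [3 1], {'tansig','purelin'});

% Step 2: specify the training parameters
net.trainParam.epochs = 1000;    % maximum number of training passes
net.trainParam.goal   = 1e-5;    % mean-squared-error goal

% Step 3: train the MLP
net = train(net, P, T);

% Step 4: simulate/test the MLP
Y = sim(net, P);
```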
Part 1
In this part, you are required to demonstrate the capability of an MLP to model the XOR logic gate. Generate performance curves/surfaces for these MLP models as the inputs vary continuously from 0.0 to 1.0.
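One possible way to produce such a performance surface, sketched under assumed (not prescribed) layer sizes and training settings:

```matlab
% XOR training data: inputs are columns, targets are a row
P = [0 0 1 1; 0 1 0 1];
T = [0 1 1 0];

% Two-layer MLP: 4 hidden tansig nodes, 1 logsig output (a starting guess)
net = newff(minmax(P), [4 1], {'tansig','logsig'});
net.trainParam.epochs = 5000;   % maximum training passes
net.trainParam.goal   = 1e-4;   % stop when MSE falls below this
net = train(net, P, T);

% Performance surface: sweep both inputs continuously over [0,1]
[x1, x2] = meshgrid(0:0.05:1, 0:0.05:1);
y = sim(net, [x1(:)'; x2(:)']);
surf(x1, x2, reshape(y, size(x1)));
xlabel('x_1'); ylabel('x_2'); zlabel('MLP output');
```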
Part 2
In this part, you are required to demonstrate the capability of an MLP to approximate the function

f(t) = sin(t)*exp(-t/20); 0 < t < 50

Experiment with varying the number and location of training data points, the number of hidden layers and nodes, etc. What are your conclusions?
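A minimal sketch of one such experiment; the 40 uniformly spaced training points and 10 hidden nodes are arbitrary starting choices that you should vary:

```matlab
% Training samples of f(t) = sin(t).*exp(-t/20) on 0 < t < 50
t_train = linspace(0.5, 50, 40);            % number/location of points: experiment
f_train = sin(t_train).*exp(-t_train/20);

net = newff(minmax(t_train), [10 1], {'tansig','purelin'});
net.trainParam.epochs = 500;
net = train(net, t_train, f_train);

% Test on a dense grid and compare with the true function
t_test = linspace(0.5, 50, 500);
plot(t_test, sim(net, t_test), 'b-', ...
     t_test, sin(t_test).*exp(-t_test/20), 'r--');
legend('MLP output', 'f(t)');
```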
Part 3
Problem 4.16 in the textbook: Repeat the double-moon classification computer experiment of Section 4.7 of the textbook for the MLP classifier, where the distance between the two moons is set at d = 0. Comment on the findings of your experiment in light of the corresponding experiment performed on the perceptron in Part 2 of Lab Project 1.
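One common way to generate the double-moon data in Matlab, assuming the textbook's geometry (radius r = 10, width w = 6; the sample count N is your choice):

```matlab
% Double-moon data: upper moon centered at the origin, lower moon shifted
% right by r and down by d (d = 0 as required in this part)
r = 10; w = 6; d = 0; N = 1000;
theta = pi*rand(1, N/2);                    % upper moon
rad   = r - w/2 + w*rand(1, N/2);
x1 = [rad.*cos(theta); rad.*sin(theta)];
theta = pi*rand(1, N/2);                    % lower moon
rad   = r - w/2 + w*rand(1, N/2);
x2 = [rad.*cos(theta) + r; -rad.*sin(theta) - d];
P = [x1 x2];
T = [ones(1, N/2), -ones(1, N/2)];          % class labels +1 / -1
```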
Part 4
Repeat Part 3 of Lab Project 1 by using a multilayer perceptron model to separate classes in the Iris database of the UCI Machine Learning Repository:
http://www.ics.uci.edu/~mlearn/MLRepository.html
Part 5
In this part, you will repeat Part 4 of Lab Project 1 and construct an MLP to identify a subset (of your choice) of the students in the class. JPEG image files of student pictures can be downloaded from classpics. You will need the following Matlab functions to convert the image files into a vector suitable for neural network input:
- imread to read the image,
- rgb2gray to convert the image to grayscale,
- mat2gray to convert the image into a matrix in the range 0-1,
- imresize to shrink the image,
- imshow to display the image,
- reshape to convert the matrix into a vector.

(Do >> help with any of these functions for more details.)
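Chained together, the conversion might look like the sketch below; the filename and the 32x32 target size are illustrative assumptions, not requirements.

```matlab
% Convert one student picture into a neural-network input vector
img = imread('student01.jpg');     % read the JPEG file
img = rgb2gray(img);               % color -> grayscale
img = imresize(img, [32 32]);      % shrink to a manageable size
img = mat2gray(img);               % scale intensities into [0,1]
imshow(img);                       % sanity-check the result visually
x = reshape(img, [], 1);           % 32*32 = 1024-element column vector
```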
As part of your neural network design process, you will experiment with
- Choosing appropriate training and test data
- Data preprocessing: feeding raw image vectors vs. truncated FFTs or DCTs of the images
- The number and type of output vectors
- Network architectures (number of hidden nodes and layers)
- Training algorithms and strategies
- Images corrupted with noise (use the imnoise function). Identify the SNR at which the network fails. Not familiar with SNR concepts? See the ECOMMS Class Lab Project 1 Pre-Lab Lecture and ECOMMS Class Lab Project 1.
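For the noise experiment, one way to corrupt an image and measure the resulting SNR in dB is sketched below, assuming img holds a grayscale image already scaled to [0,1] (e.g., the output of mat2gray); the variance value is illustrative.

```matlab
% Add zero-mean Gaussian noise and compute the signal-to-noise ratio
v = 0.01;                                   % noise variance (sweep this)
noisy = imnoise(img, 'gaussian', 0, v);
noise = noisy - img;
snr_db = 10*log10(sum(img(:).^2) / sum(noise(:).^2));
```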
Tabulate your percentage correct classification results for each of these runs. Be aware that, as in any real neural network, you will have misclassifications. You are required to draw conclusions from this study as you develop the most "optimal" MLP model for this problem.

Your laboratory report (an individual report is required of each student) should be in the usual format.