Artificial Neural Networks

Course Nos. 0909-560-01, 0909-454-01

Fall 2004

Lab Project 2: Multilayer Perceptrons


Objective

In this project, you will develop techniques for designing multilayer perceptron (MLP) models to solve complex function approximation and classification problems. The project has four parts. In Part 1, you will exercise your MLP model on canonical logic-gate problems. In Part 2, you will use your MLP model to perform function approximation. In Parts 3 and 4, you will solve pattern recognition problems.

Pre-Lab Exercise

  1. Multilayer Perceptrons are modeled in Matlab in a 4-step process:
    1. The function newff creates a feedforward MLP neural network
    2. The net.trainParam structure is used to specify the training parameters of the MLP
    3. The function train is used to train the MLP
    4. The function sim is used to simulate/test the MLP
    Do >> help newff  for more details
     
  2.  The following exercise (similar to the classroom demo) is used to model an MLP
      1. %Multilayer Perceptron implementation example
        %Neural Nets, Fall 04, S. Mandayam

        clear;close all;

        %Define a 2-3-1 MLP with Sigmoidal Neurons at the Hidden
        %and output layers; train using gradient descent backprop
        net=newff([0 5; 0 5],[3,1],{'logsig','logsig'},'traingd');

        %Show training progress every 50 epochs
        net.trainParam.show=50;

        %Set learning rate to 0.5
        net.trainParam.lr=0.5;

        %Set maximum epochs to 1000
        net.trainParam.epochs=1000;

        %Set error goal to 0.01
        net.trainParam.goal=1e-2;

        %Define training input and target data vectors
        p=[[1;1],[1;2],[2;1],[1.5;3.0],[2.0;2.0],[3.0;1.7]];
        t=[0 0 0 1 1 1];
        figure; plotpv(p,t);
        figure;

        %train the network
        net=train(net,p,t);

        %Test network
        a1=sim(net,[1.5;1.5])
        a2=sim(net,[2.5;2.5])

               
  3. Experiment with different numbers of training passes, input-output configurations, etc.

Part 1

In this part, you are required to demonstrate the capability of an MLP to model the XOR logic gate. Generate performance curves/surfaces for these MLP models as the inputs vary continuously from 0.0 to 1.0.
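A possible starting point, following the same newff workflow as the pre-lab demo, is sketched below. The 2-2-1 architecture, learning rate, and epoch count are assumptions to experiment with, not prescriptions; plain gradient descent may need many epochs (or a momentum variant such as traingdm) to converge on XOR.

```matlab
%Sketch: MLP for the XOR gate with a continuous performance surface
clear; close all;

%XOR training data
p = [0 0 1 1; 0 1 0 1];
t = [0 1 1 0];

%2-2-1 MLP with sigmoidal hidden and output layers
net = newff([0 1; 0 1],[2,1],{'logsig','logsig'},'traingd');
net.trainParam.show   = 50;
net.trainParam.lr     = 0.5;
net.trainParam.epochs = 5000;
net.trainParam.goal   = 1e-3;
net = train(net,p,t);

%Performance surface: sweep both inputs continuously over [0,1]
[x1,x2] = meshgrid(0:0.05:1, 0:0.05:1);
y = sim(net,[x1(:)'; x2(:)']);
figure; surf(x1,x2,reshape(y,size(x1)));
xlabel('input 1'); ylabel('input 2'); zlabel('network output');
```

The surf plot makes the decision behavior visible between the four training corners, which is the point of sweeping the inputs continuously.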

Part 2

In this part, you are required to demonstrate the capability of an MLP to approximate the function
f(t) = sin(t)*exp(-t/20); 0 < t < 50
Experiment with varying the number and location of training data points, the number of hidden layers and nodes, etc. What are your conclusions?
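One way to begin, again using the pre-lab's newff workflow: a tansig hidden layer with a linear (purelin) output neuron is the usual choice for function approximation, since a logsig output cannot produce negative values. The 10-node hidden layer, 100 training points, and training parameters below are assumptions to vary.

```matlab
%Sketch: MLP approximation of f(t) = sin(t)*exp(-t/20), 0 < t < 50
clear; close all;

t = linspace(0,50,100);              %training points (vary number and location)
f = sin(t).*exp(-t/20);

%1-10-1 MLP: tansig hidden layer, linear output layer
net = newff([0 50],[10,1],{'tansig','purelin'},'traingd');
net.trainParam.show   = 100;
net.trainParam.lr     = 0.05;
net.trainParam.epochs = 5000;
net.trainParam.goal   = 1e-3;
net = train(net,t,f);

%Compare the network output with the true function on a finer grid
tt = linspace(0,50,500);
figure; plot(tt,sin(tt).*exp(-tt/20),'-',tt,sim(net,tt),'--');
xlabel('t'); legend('target','MLP output');
```

Evaluating on a finer grid than the training set exposes how well the network interpolates between training points.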

Part 3

Repeat Part 2 of Lab Project 1 by using a multi-layer perceptron model to separate classes in the Iris database of the UCI Machine Learning Repository: http://www.ics.uci.edu/~mlearn/MLRepository.html 
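A sketch of one possible setup is below. It assumes you have first converted the UCI text file into a 150x5 numeric matrix saved as iris.dat (columns 1-4 the measurements, column 5 a class code 1/2/3) — the file name, the 1-of-3 target coding, and the 4-5-3 architecture are all assumptions.

```matlab
%Sketch: MLP classifier for the Iris data
%Assumes iris.dat is a 150x5 numeric matrix prepared from the UCI file:
%columns 1-4 are the four measurements, column 5 a class code 1/2/3
clear; close all;
data = load('iris.dat');
p = data(:,1:4)';

%1-of-3 target coding: one output neuron per class
t = zeros(3,size(p,2));
for k = 1:size(p,2)
    t(data(k,5),k) = 1;
end

net = newff(minmax(p),[5,3],{'logsig','logsig'},'traingd');
net.trainParam.epochs = 2000;
net.trainParam.goal   = 1e-2;
net = train(net,p,t);

%Classify by picking the output neuron with the largest response
a = sim(net,p);
[dummy,class] = max(a);
pctCorrect = 100*sum(class==data(:,5)')/length(class)
```

Remember to hold out part of the data for testing rather than scoring on the training set alone.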

Part 4

In this part, you will repeat Part 3 of Lab Project 1 and construct an MLP to identify a subset (of your choice) of the students in the class. JPEG image files of student pictures can be downloaded from classpics. You will need several Matlab functions to convert the image files into vectors suitable for neural network input (do help on each of these functions for more details).
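The original handout's list of functions is not reproduced here; the fragment below shows one common preprocessing chain using imread, rgb2gray, imresize, and reshape via the colon operator. The file name and the 32x32 target size are assumptions.

```matlab
%Sketch: converting a JPEG face image into a neural network input vector
%(the file name and the 32x32 size are assumptions)
img = imread('student1.jpg');    %read the JPEG into an RGB array
img = rgb2gray(img);             %collapse to grayscale (skip if already grayscale)
img = imresize(img,[32 32]);     %fixed size so every input vector has equal length
p1  = double(img(:))/255;        %1024x1 column vector, scaled to [0,1]
```

Each image becomes one column of the input matrix p; stack the per-student target codes in the corresponding columns of t.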

As part of your neural network design process, you will experiment with

Tabulate your percentage of correct classifications for each of these runs. Be aware that, as in any real neural network application, you will have misclassifications. You are required to draw conclusions from this study as you develop the most "optimal" MLP model for this problem.

Your laboratory report (an individual report is required of each student) should be in the usual format.
 