Artificial Neural Networks

Course Nos. ECE.09.560 (Grad), ECE.09.454 (Senior Elective)

Fall 2010

Lab Project 1: Single Layer Perceptrons


Objective

In this project, you will develop techniques for designing single-layer perceptron models for solving simple classification problems. This project has four parts. In Part 1, you will exercise your perceptron model on canonical logic-gate problems. In Part 2, you will repeat the computer experiment described in Section 1.5 of the textbook, and solve Problem 1.6. In Part 3, you will use your perceptron model to solve a pattern recognition problem with data drawn from a standard database. In Part 4, you will use your perceptron model to solve a pattern recognition problem with data generated in class.

Pre-Lab Exercise

  1. Run the neural networks demo programs in Matlab. Type in
     >> demo
     at the prompt and choose neural networks. The first four demos provide an introduction to neural network terminology, decision boundaries, and the perceptron model and algorithm.
  2. Perceptrons are modeled in Matlab in a 3-step process:
     1. The function newp creates a perceptron architecture
     2. The function adapt is used to train the perceptron
     3. The function sim is used to simulate/test the perceptron
     Do >> help newp, >> help adapt and >> help sim for more details.
  3. The following exercise simulates an AND gate using the perceptron model.
     >> net = newp([0 1; 0 1], 1);
               this defines a perceptron with 2 input elements (input data for each in the range 0 to 1) and 1 output element
     >> p = [[0;0], [0;1], [1;0], [1;1]];
               this defines 4 input training vectors
     >> t = [0 0 0 1];
               this defines the corresponding target output for each vector
     >> net.adaptParam.passes = 10;
               specifies 10 passes of the perceptron algorithm
     >> [net, a, e] = adapt(net, p, t);
               trains the network - the updated network weights are in net; the predicted outputs are in a and the training errors in e
     >> a1 = sim(net, [0; 0])
               tests the network with the input [0;0]
     >> plotpv(p, t)
               plots the network training data
     >> plotpc(net.iw{1,1}, net.b{1})
               plots the network classification boundary.
  4. Experiment with different numbers of training passes, input-output configurations, etc.
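For convenience, the AND-gate walkthrough above can be collected into a single script file. This is just the commands from step 3 repeated in one place, with comments:

```matlab
% AND-gate perceptron (same commands as in the Pre-Lab walkthrough)
net = newp([0 1; 0 1], 1);         % 2 inputs, each in [0,1]; 1 output
p = [[0;0], [0;1], [1;0], [1;1]];  % 4 training vectors, one per column
t = [0 0 0 1];                     % AND-gate target for each column of p
net.adaptParam.passes = 10;        % 10 passes through the training set
[net, a, e] = adapt(net, p, t);    % train: updated weights in net,
                                   % outputs in a, training errors in e
a1 = sim(net, [0; 0])              % test the trained network on [0;0]
plotpv(p, t);                      % plot the training data
plotpc(net.iw{1,1}, net.b{1});     % overlay the decision boundary
```

Saving this as a script makes it easy to re-run with different numbers of passes or different target vectors t.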

Part 1

In this part, you are required to demonstrate the capability of a single-layer perceptron to model the following logic gates. Generate performance curves/surfaces for these perceptron models as the inputs vary continuously from 0.0 to 1.0.
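One way to generate such a surface is to evaluate the trained perceptron over a grid of continuous input values and plot the output. The sketch below assumes a two-input perceptron net trained as in the Pre-Lab exercise; the grid spacing is an arbitrary choice:

```matlab
% Sketch: performance surface for a trained two-input perceptron,
% assuming net was created/trained as in the Pre-Lab exercise.
[x1, x2] = meshgrid(0:0.05:1, 0:0.05:1);  % grid of continuous inputs
inputs = [x1(:)'; x2(:)'];                % one input vector per column
y = sim(net, inputs);                     % perceptron output at each grid point
surf(x1, x2, reshape(y, size(x1)));       % the performance surface
xlabel('input 1'); ylabel('input 2'); zlabel('output');
```

For a single-input configuration, the same idea with a vector of inputs and plot produces a performance curve instead of a surface.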

Part 2

Repeat the double-moon classification problem described in Section 1.5 of the textbook. Follow this with the computer experiment described in Problem 1.6.
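A sketch of generating the double-moon training data follows, assuming the textbook's parameters: moon radius r = 10, width w = 6, and vertical separation d (d = 1 in Section 1.5; Problem 1.6 varies d). The variable names and sample count here are illustrative, not prescribed by the lab:

```matlab
% Sketch: double-moon data (assumed parameters r = 10, w = 6, d = 1)
r = 10; w = 6; d = 1; n = 1000;        % n points per moon
theta = pi * rand(1, n);               % angles over the upper half-circle
rad = r - w/2 + w * rand(1, n);        % radii spread across the moon's width
xA = rad .* cos(theta);                % upper moon, class 1
yA = rad .* sin(theta);
xB = xA + r;                           % lower moon, class 0:
yB = -yA - d;                          % reflected, shifted right by r, down by d
p = [xA xB; yA yB];                    % training inputs, one point per column
t = [ones(1, n) zeros(1, n)];          % perceptron targets
plotpv(p, t);                          % visualize the two moons
```

The resulting p and t can be fed to newp/adapt exactly as in the Pre-Lab AND-gate exercise.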

Part 3

Use a single-layer perceptron model to separate classes in the Iris database of the UCI Machine Learning Repository: http://www.ics.uci.edu/~mlearn/MLRepository.html
Describe your neural network design (see instructions in Part 4). Tabulate your percentage correct classification results - what can you infer from the network performance?
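A sketch of one way to load the Iris data and train a perceptron is given below. It assumes the data has been downloaded as the comma-separated file iris.data with four numeric features followed by a class label per line (the file name and layout are assumptions about the repository's format, not specified by the lab). Note that the setosa class is linearly separable from the other two, while versicolor and virginica are not - worth keeping in mind when interpreting your results:

```matlab
% Sketch: Iris data -> single-layer perceptron
% Assumes iris.data with lines like: 5.1,3.5,1.4,0.2,Iris-setosa
fid = fopen('iris.data');
c = textscan(fid, '%f%f%f%f%s', 'Delimiter', ',');
fclose(fid);
p = [c{1} c{2} c{3} c{4}]';                % 4 features per column vector
t = double(strcmp(c{5}, 'Iris-setosa'))';  % 1 for setosa, 0 otherwise
net = newp(minmax(p), 1);                  % input ranges taken from the data
net.adaptParam.passes = 20;
[net, a, e] = adapt(net, p, t);            % train the perceptron
pct = 100 * mean(sim(net, p) == t)         % percent correct on training data
```

For the tabulated results, repeat this for different class pairings and hold out part of the data for testing rather than scoring on the training set alone.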

Part 4

In this part, you will construct a single-layer perceptron to identify a subset (of your choice) of the students in the class. Jpeg image files of student pictures can be downloaded from classpics. You would need the following Matlab functions to convert the image files into a vector suitable for neural network input: (do >> help on each of these functions for more details).
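As one possible sketch of the conversion, the functions typically used are imread, rgb2gray, imresize (Image Processing Toolbox), and reshape; the file name and target size below are illustrative assumptions, not values given by the lab:

```matlab
% Sketch: jpeg image -> input column vector (file name/size are assumptions)
img = imread('student01.jpg');          % read the jpeg into an array
g = rgb2gray(img);                      % convert to a grayscale matrix
g = imresize(g, [32 32]);               % fixed size so all vectors match
v = double(reshape(g, [], 1)) / 255;    % column vector, scaled to [0,1]
```

Stacking one such vector per column, with a target of 1 for students in your chosen subset and 0 otherwise, gives training data in the same p/t form used in the earlier parts.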

As part of your neural network design process, you will experiment with

Tabulate your percentage correct classification results for each of these runs. Be aware that as in any real neural network, you will have misclassifications. You are required to draw conclusions from this study as you develop the most "optimal" single-layer perceptron model for this problem.

Your laboratory report (individual report is required of each student) should be in the usual format.
 