Homework #5: Back Propagation

Visual Perception Modeling and Its Applications

CIS 4930/5930, Spring 2001

Department of Computer Science, Florida State University

------------------------------------------------------------------------

Due: Week 13, Monday, April 2, 2001. Points: 100

 

The following questions are based on the back propagation algorithm implementation located at

~liux/public_html/courses/research/programs/neural-networks

 

  1. Each neuron (computational unit) in the implementation does not have a bias associated with it. In other words, its output is a function of the weighted linear summation of its inputs. Explain how the bias is handled in the given implementation.
  2. Use the given program to solve the XOR problem with four variables by training the network with the following examples:

X1   X2   X3   X4   XOR
 0    0    0    0    0
 1    0    0    0    1
 0    1    0    0    1
 1    1    0    0    0
 0    0    1    0    1
 1    0    1    0    0
 0    1    1    0    0
 1    1    1    0    1
 0    0    0    1    1
 1    0    0    1    0
 0    1    0    1    0
 1    1    0    1    1
 0    0    1    1    0
 1    0    1    1    1
 0    1    1    1    1
 1    1    1    1    0
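The training set above is the odd-parity function of four binary inputs: the target is 1 exactly when an odd number of inputs are 1. As an illustration (this helper is not part of the course program), the full set of examples can be generated programmatically:

```python
# Hypothetical helper: generate the 4-variable XOR (odd-parity)
# training set listed in the table above.
from itertools import product

def xor4_examples():
    """Return (inputs, target) pairs for the 4-input XOR problem.

    Row order may differ from the table above; the set of 16
    input/target pairs is the same.
    """
    examples = []
    for x in product([0, 1], repeat=4):
        # Target is 1 when an odd number of inputs are 1 (parity).
        target = x[0] ^ x[1] ^ x[2] ^ x[3]
        examples.append((x, target))
    return examples
```

Because parity of four bits is not linearly separable, a hidden layer is required, which is what makes this a useful test case for back propagation.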

 

a)      There are three free parameters for the back propagation algorithm: β (beta) for the steepness of the sigmoid activation function, η (eta) for the learning rate, and α (alpha) for the momentum in updating the weights. Train the neural network with each of the following parameter settings and explain how the training behavior changes (e.g., number of training iterations, final training error, and so on).

β      η      α
0.5    0.1    0.9
0.05   0.1    0.9
0.5    0.01   0.9
0.05   0.01   0.9
0.5    0.1    0.1
0.05   0.1    0.1
0.5    0.01   0.1
0.05   0.01   0.1

b)     (Optional) Can you improve the training efficiency, i.e., reduce the number of training iterations, using some technique?

  3. Describe how you can utilize the given back propagation program to solve other computer vision problems such as face recognition and face detection.
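One common way to cast such vision problems for a back propagation network (a sketch under assumed encodings, not the course program's interface) is to flatten each image into a normalized input vector, with one input unit per pixel, and to use a one-hot target vector, with one output unit per class (e.g., per known person):

```python
# Hypothetical encoding helpers; names and conventions are assumptions.
def image_to_input(pixels):
    """Flatten 8-bit grayscale pixel values into inputs in [0, 1]."""
    return [p / 255.0 for p in pixels]

def person_to_target(person_index, num_people):
    """One-hot target vector: one output unit per known person."""
    return [1 if i == person_index else 0 for i in range(num_people)]
```

At recognition time the network's most active output unit names the predicted person; for face detection a single output unit (face vs. non-face) can be used instead.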