The Multi-Layer-Perceptron emerged in response to the limitations of the single-layer perceptron, which M. Minsky and S. Papert demonstrated in 1969.

It extends the perceptron with one or more hidden neuron layers between its input and output layers.

Due to this extended structure, a Multi-Layer-Perceptron is able to compute any logical function, including the XOR function, which a single-layer perceptron cannot solve.
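To illustrate this, the following sketch hand-wires a minimal 2-2-1 network with hard-limiter activations that computes XOR. The particular weights and thresholds (one hidden neuron acting as OR, one as AND) are one illustrative solution, not the only one.

```python
def step(x):
    # hard limiter: fires (1) when the weighted sum exceeds the threshold
    return 1 if x > 0 else 0

def xor_net(x1, x2):
    # hidden neuron 1 computes OR (threshold 0.5)
    h1 = step(1.0 * x1 + 1.0 * x2 - 0.5)
    # hidden neuron 2 computes AND (threshold 1.5)
    h2 = step(1.0 * x1 + 1.0 * x2 - 1.5)
    # output neuron computes "OR and not AND", i.e. XOR
    return step(1.0 * h1 - 1.0 * h2 - 0.5)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, xor_net(a, b))  # prints the XOR truth table: 0, 1, 1, 0
```

A single-layer perceptron cannot represent this function because the XOR outputs are not linearly separable; the hidden layer first maps the inputs into a space where they are.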

Sample Structure


Type: feedforward
Neuron layers: 1 input layer, 1 or more hidden layer(s), 1 output layer
Input value types: binary
Activation function: hard limiter / sigmoid
Learning method: supervised
Learning algorithm: delta learning rule, backpropagation (mostly used)
Mainly used in: complex logical operations, pattern classification
Neural Net Components in an Object-Oriented Class Structure