Simple logic operations like AND, OR, and NOT can be solved using a single perceptron (which uses the Heaviside step activation function) or a single neuron (which uses a sigmoid or another activation function):
Feedforward:
    R = X1*W1 + X2*W2 + 1*W3   // regression (pre-activation) value
    O = activate(R)            // output value
    E = Expected_O - O         // error value

Update weights:
    W1 = W1 + X1*E*Learning_Rate
    W2 = W2 + X2*E*Learning_Rate
    W3 = W3 + 1 *E*Learning_Rate

Note: W3 is the weight for the constant bias input (value 1). The updates are added to the current weights; overwriting them would discard everything learned so far.
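The update rule above can be sketched as a runnable example. This is a minimal illustration, assuming a Heaviside step activation and the AND truth table as training data; the function and variable names are placeholders, not names from the original text.

```python
def heaviside(r):
    # Heaviside step activation: 1 if r >= 0, else 0
    return 1 if r >= 0 else 0

def train_perceptron(samples, learning_rate=0.1, epochs=50):
    # W[0], W[1] are the input weights; W[2] is the bias weight
    # (its input is fixed at 1).
    W = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for x1, x2, expected in samples:
            r = x1 * W[0] + x2 * W[1] + 1 * W[2]   # feedforward
            o = heaviside(r)                        # output
            e = expected - o                        # error
            W[0] += x1 * e * learning_rate          # weight updates
            W[1] += x2 * e * learning_rate
            W[2] += 1 * e * learning_rate
    return W

# AND truth table: (X1, X2, Expected_O)
AND = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
W = train_perceptron(AND)
for x1, x2, expected in AND:
    print(x1, x2, "->", heaviside(x1 * W[0] + x2 * W[1] + W[2]))
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop settles on weights that classify all four cases correctly.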
However, the logic operation XOR cannot be solved using a single perceptron or neuron, because the results of the XOR operation are not linearly separable:
The result values of XOR (in red colour) cannot be separated into 2 classes (the class of value 0 and the class of value 1) by a single straight line, thus the XOR results are not linearly separable.
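This can also be checked empirically. The sketch below (same update rule as the pseudocode earlier, with hypothetical helper names) trains a single Heaviside perceptron on the XOR truth table; because no straight line separates the two classes, it never reaches 4 of 4 correct, no matter how many epochs it runs.

```python
def heaviside(r):
    return 1 if r >= 0 else 0

XOR = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

W = [0.0, 0.0, 0.0]            # W[2] is the bias weight
for _ in range(1000):          # far more epochs than AND/OR ever need
    for x1, x2, expected in XOR:
        o = heaviside(x1 * W[0] + x2 * W[1] + W[2])
        e = expected - o
        W[0] += x1 * e * 0.1
        W[1] += x2 * e * 0.1
        W[2] += 1 * e * 0.1

correct = sum(heaviside(x1 * W[0] + x2 * W[1] + W[2]) == expected
              for x1, x2, expected in XOR)
print(correct, "of 4 XOR cases correct")  # always at most 3 of 4
```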
Solution for XOR using a single neuron:
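One common trick for making XOR solvable by a single unit (the document's own solution may differ in detail) is to feed the neuron a third, multiplicative input X1*X2. In this augmented input space XOR becomes linearly separable, since XOR(x1, x2) = x1 + x2 - 2*x1*x2. A sketch with hand-picked weights (illustrative values, not learned ones):

```python
def heaviside(r):
    return 1 if r >= 0 else 0

def xor_single_neuron(x1, x2):
    # Inputs: x1, x2, and the extra product input x1*x2.
    # Hand-picked weights: W1=1, W2=1, W_prod=-2, bias weight=-0.5.
    return heaviside(1*x1 + 1*x2 - 2*(x1*x2) - 0.5)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", xor_single_neuron(x1, x2))  # prints 0, 1, 1, 0
```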
Solution for XOR using 2 neurons:
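A two-neuron arrangement (one possible version; the weights below are hand-picked for illustration) uses the identity XOR(x1, x2) = x1 + x2 - 2*AND(x1, x2): the first neuron computes AND, and the second sees both raw inputs plus the first neuron's output through a skip connection.

```python
def heaviside(r):
    return 1 if r >= 0 else 0

def xor_two_neurons(x1, x2):
    # Neuron 1: AND(x1, x2), weights (1, 1), bias weight -1.5
    h = heaviside(1*x1 + 1*x2 - 1.5)
    # Neuron 2: x1 + x2 - 2*AND(x1, x2), bias weight -0.5 -> XOR
    return heaviside(1*x1 + 1*x2 - 2*h - 0.5)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", xor_two_neurons(x1, x2))  # prints 0, 1, 1, 0
```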
Solution for XOR using a standard basic 3-neuron network:
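The standard 3-neuron network (2 hidden neurons plus 1 output neuron) follows the decomposition XOR = AND(OR(x1, x2), NAND(x1, x2)). The weights below are hand-picked to show the structure; in practice such a network would learn them via backpropagation with a sigmoid activation.

```python
def heaviside(r):
    return 1 if r >= 0 else 0

def xor_three_neurons(x1, x2):
    # Hidden neuron 1: OR(x1, x2)
    h1 = heaviside(1*x1 + 1*x2 - 0.5)
    # Hidden neuron 2: NAND(x1, x2)
    h2 = heaviside(-1*x1 - 1*x2 + 1.5)
    # Output neuron: AND(h1, h2) -> XOR
    return heaviside(1*h1 + 1*h2 - 1.5)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", xor_three_neurons(x1, x2))  # prints 0, 1, 1, 0
```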