Perceptron algorithm (pseudocode), without the machine learning feature:

//constants
//max number of loops over training set
MAX_SET_ITERATION = 100; //example value
LEARNING_RATIO    = 0.1; //example value

//variables
//training data (learnt data with known results)
//Data and Input end with a constant '1' (the bias input); the
//matching last element of Weights is the bias weight, which lets
//the decision threshold shift during training
Data = [
  {Input:[...,1],Output},
  ...
];
Input   = [...,1];
Weights = [...,1]; //same length as Input; last element is the bias weight

//initialisation
get_input();
randomise_weights();

//perceptron algorithm
Set_Iteration_Count = 0;
while (true) {
  Set_Iteration_Count++;
  Diff_Count = 0;
  for (Jndex=0; Jndex<Data.length; Jndex++) {          

    //get known input & result          
    Data_Entry   = Data[Jndex];          
    Known_Input  = Data_Entry.Input;          
    Known_Output = Data_Entry.Output;          

    //apply weights to the known input
    //this step is usually just Weighteds[i] = Known_Input[i]*Weights[i]
    Weighteds = apply_weights(Known_Input,Weights);

    //sum up the input values after the weights are applied
    //this step is usually just adding up the array
    Weighted_Sum = get_sum(Weighteds);

    //get result from the weighted sum
    //this is aka the 'step function'; Output is one of a small set
    //of class values (e.g. 0 or 1), not just any value
    Output = get_output(Weighted_Sum);

    //get difference (delta) compared to known result          
    //THE ORDER OF OPERANDS IN THIS MINUS OPERATION IS IMPORTANT!          
    Diff = Known_Output-Output;          
    if (Diff!=0)              
      Diff_Count++;          

    //adjust weights based on Known_Input, Diff, & LEARNING_RATIO
    //make sure weights may move in both directions
    //usually: Weights[i] += LEARNING_RATIO*Known_Input[i]*Diff
    Weights = adjust_weights(Weights,Known_Input,Diff);
  }      

  if (Diff_Count==0 || Set_Iteration_Count>=MAX_SET_ITERATION)
    break;
}

//get final result from final weights
Weighteds    = apply_weights(Input,Weights);
Weighted_Sum = get_sum(Weighteds);
Output       = get_output(Weighted_Sum);
print Output;
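
Below is a minimal runnable sketch of the listing above, in TypeScript. The AND-gate training set, the step threshold at 0, and the random weight initialisation are illustrative assumptions, not part of the original pseudocode:

//single-layer perceptron following the pseudocode above
const MAX_SET_ITERATION = 100;
const LEARNING_RATIO = 0.1;

type Entry = { input: number[]; output: number };

//AND-gate training data; the trailing 1 is the constant bias input
const data: Entry[] = [
  { input: [0, 0, 1], output: 0 },
  { input: [0, 1, 1], output: 0 },
  { input: [1, 0, 1], output: 0 },
  { input: [1, 1, 1], output: 1 },
];

//random initial weights, same length as each input
let weights: number[] = data[0].input.map(() => Math.random() - 0.5);

const weightedSum = (input: number[], w: number[]): number =>
  input.reduce((sum, x, i) => sum + x * w[i], 0);

//step function: classify the weighted sum into 0 or 1
const getOutput = (sum: number): number => (sum > 0 ? 1 : 0);

for (let iter = 0; iter < MAX_SET_ITERATION; iter++) {
  let diffCount = 0;
  for (const { input, output: known } of data) {
    const output = getOutput(weightedSum(input, weights));
    const diff = known - output; //order matters: known minus predicted
    if (diff !== 0) diffCount++;
    //nudge each weight in the direction that reduces the error
    weights = weights.map((w, i) => w + LEARNING_RATIO * input[i] * diff);
  }
  if (diffCount === 0) break; //no errors left on the training set
}

//final result for a query input (x=1, y=1, plus the bias 1)
console.log(getOutput(weightedSum([1, 1, 1], weights))); //expected: 1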

To add the machine learning feature:

  • Every time the final printed result is incorrect, a human supervisor should add the correct input/output pair to the training data (a minimal sketch follows this list)
  • The AI suggests training-data entries whose outputs are likely wrong
  • The AI self-trains (adjusts training-data outputs) by trusting the popular, consistent results and flagging the odd wrong result
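
A minimal sketch of the first bullet, reusing the Entry type from the example above; addCorrection is a hypothetical name, not from the source:

//hypothetical human-in-the-loop correction
type Entry = { input: number[]; output: number };
const trainingData: Entry[] = [];

//called by the human supervisor whenever the printed result was wrong
function addCorrection(input: number[], correctOutput: number): void {
  trainingData.push({ input, output: correctOutput });
  //then rerun the training loop above so the weights absorb the new entry
}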

Reference:
https://blog.dbrgn.ch/2013/3/26/perceptrons-in-python

Notes from the reference above:
An XOR output can’t be obtained from a single-layer perceptron because XOR is not linearly separable (the Weighted_Sum values can’t be split into the output classes by a single threshold). NOT/AND/OR, however, are all linearly separable over the weighted sum (a scalar product).
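
For illustration, hand-picked weights (my own choice, not from the reference) that separate NOT/AND/OR with a step threshold at 0; the constant term is the bias weight times the bias input 1:

const step = (s: number): number => (s > 0 ? 1 : 0);
const and = (x: number, y: number) => step(1 * x + 1 * y - 1.5); //AND
const or  = (x: number, y: number) => step(1 * x + 1 * y - 0.5); //OR
const not = (x: number)            => step(-1 * x + 0.5);        //NOT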

XOR can be implemented using 2 perceptron layers:
http://toritris.weebly.com/perceptron-5-xor-how-why-neurons-work-together.html

Input: (x,y)

(x,y) --> Perceptron (AND) --> \
                                | --> Perceptron (XOR)
(x,y) --> Perceptron (OR)  --> /
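
A sketch of that two-layer arrangement with hand-picked weights (my own choice, not taken from the linked page). The output perceptron fires when OR is on and AND is off, which is exactly XOR and is linearly separable over the two layer-1 outputs:

const step = (s: number): number => (s > 0 ? 1 : 0);
//layer 1: AND and OR perceptrons over (x,y)
const andP = (x: number, y: number) => step(x + y - 1.5);
const orP  = (x: number, y: number) => step(x + y - 0.5);
//layer 2: on when OR fires and AND doesn't, i.e. XOR
const xorP = (x: number, y: number) =>
  step(-2 * andP(x, y) + 1 * orP(x, y) - 0.5);
//xorP(0,0)=0, xorP(0,1)=1, xorP(1,0)=1, xorP(1,1)=0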

XOR can also be solved by a single perceptron if the product x*y is added as an extra input. See Razvan Popovici’s answer to this question on Quora:
https://www.quora.com/Why-cant-the-XOR-problem-be-solved-by-a-one-layer-perceptron
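
A sketch of that trick (weights again chosen by hand for illustration): with the extra input x*y, XOR becomes x + y - 2xy, which a single threshold separates:

const step = (s: number): number => (s > 0 ? 1 : 0);
//single perceptron over the augmented input (x, y, x*y)
const xor = (x: number, y: number) =>
  step(1 * x + 1 * y - 2 * (x * y) - 0.5);
//xor(0,0)=0, xor(0,1)=1, xor(1,0)=1, xor(1,1)=0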
