Artificial neural network processing flow with forward propagation and backward propagation:

#neunet class
class neunet:

  #forward propagation
  def propagate_forwards(self, Input):
    Loop through all layers from left to right:
    for each neuron, compute the weighted values of its inputs,
    sum them, and pass the sum through the activation
    function (e.g. tanh(x)).
    The activations of the last layer are the network's Output.

  #backward propagation and weight correction
  def propagate_backwards(self, Output):
    Loop through all layers from right to left,
    calculating each neuron's delta from the error and the
    derivative of the activation function.
    (For y = tanh(x) the derivative is 1 - tanh(x)**2 = 1 - y**2,
    so it can be computed directly from the neuron's output y.)
    Update the weights in all layers using the deltas.
#end of neunet class
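The class outline above can be sketched as runnable code. This is a minimal illustration only, assuming a fully connected network with tanh activation and no bias terms; the class name `NeuNet`, the layer-size list, and the learning rate are assumptions, not part of the original outline.

```python
import math
import random

class NeuNet:
    def __init__(self, sizes, rate=0.1):
        # sizes, e.g. [2, 3, 1]: input, hidden, and output layer widths
        self.rate = rate
        # weights[l][j][i]: weight from neuron i in layer l to neuron j in layer l+1
        self.weights = [
            [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
            for n_in, n_out in zip(sizes, sizes[1:])
        ]

    def propagate_forwards(self, inputs):
        # left to right: weighted sum per neuron, then the tanh activation
        self.outputs = [list(inputs)]
        for layer in self.weights:
            prev = self.outputs[-1]
            self.outputs.append([
                math.tanh(sum(w * x for w, x in zip(neuron, prev)))
                for neuron in layer
            ])
        return self.outputs[-1]

    def propagate_backwards(self, target):
        # right to left: delta = error * tanh'(sum); since y = tanh(sum),
        # the derivative 1 - tanh(sum)**2 is just 1 - y**2
        last = self.outputs[-1]
        deltas = [(t - y) * (1 - y ** 2) for t, y in zip(target, last)]
        for l in range(len(self.weights) - 1, -1, -1):
            prev = self.outputs[l]
            # deltas for the layer to the left, using the pre-update weights
            # (the value computed when l == 0 is simply discarded)
            next_deltas = [
                (1 - prev[i] ** 2) *
                sum(self.weights[l][j][i] * deltas[j] for j in range(len(deltas)))
                for i in range(len(prev))
            ]
            for j, neuron in enumerate(self.weights[l]):
                for i in range(len(neuron)):
                    neuron[i] += self.rate * deltas[j] * prev[i]
            deltas = next_deltas

net = NeuNet([2, 3, 1])
print(net.propagate_forwards([0.5, -0.5]))  # a single tanh output in (-1, 1)
```

Storing each layer's activations during the forward pass is what lets the backward pass reuse them for the `1 - y**2` derivative and the weight updates.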

#train the network
while True:
  Error_Count = 0

  for Index in range(len(Samples)):
    Sample = Samples[Index]
    Output = Neunet.propagate_forwards(Sample["Input"])
    Error = Sample["Output"] - Output
    if not in_acceptable_range(Error):
      Error_Count += 1
  #end of for
  if Error_Count == 0:
    break #every sample within tolerance; stop training
#end of while
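The train-until-converged loop can be shown on its own with a deliberately tiny model: a single weight `w` feeding one tanh unit, fitted to samples generated from a hypothetical "true" weight of 1.2. The sample format, tolerance, and learning rate here are illustrative assumptions; the point is the loop shape — count the out-of-range errors per pass, and break when none remain.

```python
import math

# training data from a hypothetical "true" weight of 1.2,
# so the loop can drive every error into range and terminate
samples = [{"input": x, "output": math.tanh(1.2 * x)}
           for x in (-1.0, -0.5, 0.5, 1.0)]

w = 0.0      # initial guess for the single weight
RATE = 0.5   # illustrative learning rate

def in_acceptable_range(error, tol=0.01):
    return abs(error) < tol

while True:
    error_count = 0
    for sample in samples:
        output = math.tanh(w * sample["input"])   # forward pass
        error = sample["output"] - output
        if not in_acceptable_range(error):
            error_count += 1
        # backward pass: delta uses tanh'(x) = 1 - output**2
        delta = error * (1 - output ** 2)
        w += RATE * delta * sample["input"]
    if error_count == 0:
        break  # all samples within tolerance; stop training

print("learned w =", round(w, 2))  # approaches the true weight 1.2
```

Note that the error count is reset at the top of every pass: one full pass with no out-of-range sample is the stopping condition, matching the `Error_Count == 0` check above.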

#process main input
Output = Neunet.propagate_forwards(Input)
print(Output)