Recommended Posts

the IDE complains over the rem's

it is translated from c++ throu basic

i can not test this

let me know if you got it working

	'''
   bluatigro 14 nov 2021
   ann try
   based on :
   http://code.activestate.com/recipes/578148-simple-back-propagation-neural-network-in-python-s/
'''
import random , math
	# consts
NI = 2 # number of inputs
NH = 2 # number of hidden cels
NO = 1 # number of outputs
	# cels
ai = list()
for i in range( NI ) :
    ai.append( 0.0 )
ah = list()
for h in range( NH ) :
    ah.append( 0.0 )
ao = list()
for o in range( NO ) :
    ao.append( 0.0 )
# wished output
wish = list()
for w in range( NO ) :
    wish.append( 0.0 ) 
# weights
wih = list()
ci = list()
for i in range( NI ) :
    wih[ i ] = list()
    ci[ i ] = list()
    for h in range( NH ) :
        wih[ i ].append( random.random() )
        ci[ i ].append( random.random() )
who = list()
co = list()
for h in range( NH ) :
    who[ h ] = list()
    co( h ) = list()
    for o in range( NO ) :
        who[ h ].append( random.random() )
        co[ h ].append( random.random() )
od = list()
hd = list()
for h in range( NH ) :
    od.append( random.random() )
    hd.append( random.random() )
	# input output training data : XOR function    
paterns = 4
pin = {{0.0,0.0},{1.0,0.0},{0.0,1.0},{1.0,1.0}}
pout = { 0.0    , 1.0     , 1.0     , 0.0 }
	def tanh( x ) :
     return ( 1 - math.exp( -2 * x ) ) \
     / ( 1 + math.exp( -2 *x ) )
	def dsignoid( x ) :
    return 1 - x * x
    
def calc( p ) :
    for i in range( NI ) :
        ai[ i ] = pin[ i ][ p ]
    for h in range( NH ) :
        som = 0.0
        for i in range( NI ) :
            som += ai[ i ] * wih[ i ][ h ]
        ah[ h ] = tanh( som / NI )
    for o in range( NO ) :
        som = 0.0
        for h in range( NH ) :
            som += ah[ h ] * who[ h ][ o ]
        ao[ o ] = tanh( som / NH )
	def backprop( n , m ) :
''' http://www.youtube.com/watch?v=aVId8KMsdUU&feature=BFa&list=LLldMCkmXl4j9_v0HeKdNcRA
    calc output deltas
    we want to find the instantaneous rate of change of ( error with respect to weight from node j to node k)
    output_delta is defined as an attribute of each ouput node. It is not the final rate we need.
    To get the final rate we must multiply the delta by the activation of the hidden layer node in question.
    This multiplication is done according to the chain rule as we are taking the derivative of the activation function
    of the ouput node.
    dE/dw[j][k] = (t[k] - ao[k]) * s'( SUM( w[j][k]*ah[j] ) ) * ah[j]
'''
    totfout = 0
    for k in range( NO ) :
        totfout += math.abs( ao( k ) - wish( k ) )
    # update output weights
    for j in range( NH ) :
        for k in range( NO )
# output_deltas[k] * self.ah[j]
# is the full derivative of
# dError/dweight[j][k]
            c = od[ k ] * ah[ j ]
            wo[ j ][ k ] += n * c + m * co[ j ][ k ]
            co[ j ][ k ] = c
# calc hidden deltas
    for j in range( NH ) :
        fout = 0
        for k in range( NO ) :
            fout += od[ k ] * wo[ j ][ k )
        hd[ j ] = fout * dsignoid( ah[ j ] )
 
# update input weights
    for i in range( NI ) :
        for j in range( NH ) :
            c = hd[ j ] * ai[ i ]
            wi[ i ][ j ] += n * c + m * ci[ i ][ j ]
            ci[ i ][ j ] = c
 
     return totfout / 2
     
for e in range( 10000 ) :
    fout = 0
# fill input cel's
    for p in range( paterns ) :
        for i in range( NH ) :
            ai[ i ] = pin[ i ][ p ]
# fill output wish
        for o in range( NO ) :
            wish[ o ] = pout[ p ]
        fout += backprop( .5 , .5 )
	     print( 'generatie ' , e , '    error ' , fout )
 
	

Share this post


Link to post
Share on other sites

I've started learning Python at the end of last month so I might not be able to fix this script for you at the moment, but from what I learned already I can tell you that this script isn't going to work.

Share this post


Link to post
Share on other sites

The syntax is wrong.

15 hours ago, bluatigro said:

the IDE complains over the rem's

Yes, sure

15 hours ago, bluatigro said:

I can not test this
let me know if you got it working 

What is the Idea, if you cant' run it? What is the question?

Share this post


Link to post
Share on other sites

Create an account or sign in to comment

You need to be a member in order to leave a comment

Create an account

Sign up for a new account in our community. It's easy!

Register a new account

Sign in

Already have an account? Sign in here.

Sign In Now

  • Recently Browsing   0 members

    No registered users viewing this page.