The Future of Neural Networks

I found this video some time ago when I was searching YouTube for interesting demonstrations of neural networks. It is, by far, the best explanation of modern Neural-Net theory that I’ve ever encountered, and I thought that some of you out in cyberspace might enjoy it as well.

The video is presented by Geoffrey Hinton, a machine-learning pioneer. His treatment of what he calls “Restricted Boltzmann Machine” neural nets is incredibly nuanced and mathematically rigorous, and (forgive the cliché here) a must-see for machine-learning enthusiasts. An excellent presentation based on an excellent idea, and certainly the most brain-like (maybe even mind-like) system I’ve ever seen. Some of the particular machines he explores can do some fairly amazing things, like handwritten-character recognition and sorting documents by semantic content. And he manages to throw in a joke or two, as well.
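For anyone who hasn’t met them: a Restricted Boltzmann Machine is a two-layer network (visible units, hidden units, symmetric weights, and no connections within a layer) that assigns every joint configuration an energy, with low-energy configurations being the probable ones. Here is a minimal sketch of that energy function with made-up toy numbers; this is background on the general idea, not code from the talk:

```python
# Sketch of the RBM energy function (toy numbers, purely illustrative):
#   E(v, h) = -sum_i a_i*v_i  -  sum_j b_j*h_j  -  sum_ij v_i*W_ij*h_j

def rbm_energy(v, h, W, a, b):
    """Energy of a joint visible/hidden configuration.

    v, h: lists of 0/1 unit states; W: weight matrix (len(v) x len(h));
    a, b: visible and hidden biases."""
    vis = sum(a[i] * v[i] for i in range(len(v)))
    hid = sum(b[j] * h[j] for j in range(len(h)))
    inter = sum(v[i] * W[i][j] * h[j]
                for i in range(len(v)) for j in range(len(h)))
    return -(vis + hid + inter)

# Learning nudges W, a, b so that training data ends up in low-energy
# (high-probability) configurations.
```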

You can find the video here. Enjoy.


Hebbian Neural Networks

Against all odds, I’ve started an A.I. project, and have actually made some progress. I never thought I’d see the day. It’s not a whole lot of progress, but when you’ve been tinkering as long as I have, you learn to take what you can get.

What I’ve got is a fairly simplistic neural network model, utilizing Hebbian learning. That is to say, whenever two neurons in the network happen to be switched on at the same time, their connection gets stronger. For the last week or two, I’ve been tinkering with the parameters and different methods of inputting the data, and I finally have something that performs something roughly like learning.
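That one-sentence rule is worth seeing in code. Here’s a stripped-down sketch of a Hebbian update, simplified for illustration (the full program below uses a sign-scaled variant of this, plus weight decay and clamping):

```python
# Minimal Hebbian update: neurons that fire together wire together.

def hebb_step(act, conn, rate=0.1):
    """Strengthen conn[i][j] whenever neurons i and j are both active.

    act: list of 0/1 activations; conn: square weight matrix."""
    n = len(act)
    for i in range(n):
        for j in range(n):
            if act[i] == 1 and act[j] == 1:
                # Clamp at 1 so repeated co-firing can't blow weights up.
                conn[i][j] = min(1.0, conn[i][j] + rate)
    return conn
```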

I feel the need to repeat that last part: roughly like learning. I have no idea if it’s actually learned anything. Sometimes when I test it, it seems to be able to predict simple patterns, and learn how to tell a small prime from a small non-prime. Other times, it becomes so profoundly stupid that it actually anti-learns, refusing to respond to any stimulus even remotely like its training data. And at yet other times, it doesn’t do anything at all.

This last bit is worsened by my habit of taking a perfectly good program, tinkering with it until it becomes unusable, then accidentally saving over the original. The fact that I wrote the program in Python makes that all the worse, since Python’s IDLE editor saves the file every time you run it, and I’ve gotten into the bad habit of just pressing F5 without making sure I’ve saved a backup first. The end result is that the current version is pretty much nonfunctional.

Still, the very fact that I was able to write and implement a neural network model makes me pretty happy. I’ve always had trouble handling networks, and now it seems that I’ve got something vaguely workable. So, without further ado (or further clichés), I present to you (okay, one more cliché) Hebbian v5.0 (be warned: there is quite a lot of garbage code and artifacts in there, and frankly, I’m too damn lazy to take it out. Hey, if the human genetic code can be full of junk DNA, then why can’t my code?):

(Written in Python 2.4.3, I think)

################################################################################
# Hebbian, version 5.0                                                         #
# Written by Asymptote.                                                        #
# Feel free to modify and distribute this code (I don't know why you'd want    #
# to, but hey, whatever makes you happy), as long as you keep this header      #
# intact.                                                                      #
################################################################################

import random
import math

ns = 100  # number of neurons

# Start every neuron with a low activation.
activation = [0.1] * ns

# Connection weights, ns x ns. Initialised to small random values rather
# than zero: the Hebbian update below scales by sign(weight), so a weight
# that started at exactly zero could never change.
connectivity = []
for i in range(ns):
    connectivity.append([random.uniform(-0.05, 0.05) for j in range(ns)])

def sign(n):
    """Return -1, 0, or 1 according to the sign of n."""
    if n == 0:
        return 0
    return abs(n) / n

def transmission(act, conn, nsize, thresh):
    """Update each neuron: fire (1) if its mean weighted input exceeds thresh."""
    for a in range(nsize):
        summ = 0
        for b in range(nsize):
            summ += act[b] * conn[a][b]
        if float(summ) / float(nsize) > thresh:
            act[a] = 1
        else:
            act[a] = 0

def hebbian(act, conn, nsize):
    """Hebbian step: strengthen links between co-active neurons,
    decay every link slightly, then clamp weights to [-1, 1]."""
    for a in range(nsize):
        for b in range(nsize):
            if act[a] == act[b] == 1:
                conn[a][b] += sign(conn[a][b]) * 0.1
    for a in range(nsize):
        for b in range(nsize):
            conn[a][b] -= sign(conn[a][b]) * 0.01
    for a in range(nsize):
        for b in range(nsize):
            if conn[a][b] > 1:
                conn[a][b] = 1
            if conn[a][b] < -1:
                conn[a][b] = -1

def run(act, conn, nsize, thresh, runlength):
    """Alternate transmission and Hebbian learning for runlength steps."""
    for i in range(runlength):
        transmission(act, conn, nsize, thresh)
        hebbian(act, conn, nsize)
        print act

def actprint(act, nsize):
    """Print the activation pattern as a row of #s (on) and _s (off)."""
    strg = ""
    for a in range(nsize):
        if act[a] == 1:
            strg += "#"
        else:
            strg += "_"
    print strg

def connprint(conn, nsize):
    """Print the weight matrix: # where |weight| > 0.5, _ elsewhere."""
    printarr = []
    for a in range(nsize):
        tempstr = ""
        for b in range(nsize):
            if abs(conn[b][a]) > 0.5:
                tempstr += "#"
            else:
                tempstr += "_"
        printarr.append(tempstr)
    for a in printarr:
        print a

def striphex(i):
    """Clamp i to one byte and return it as a two-digit hex string."""
    h = hex(i % 256)[2:]
    if len(h) < 2:
        h = "0" + h
    return h

from Tkinter import *
root = Tk()
w = Canvas(root,width=1000,height=1000)
w.pack()

def Binary(n):
    """Return n as a binary string (empty string for 0)."""
    out = ""
    x = n
    while x > 0:
        out = str(x % 2) + out
        x = int(x / 2)
    return out

def make_input(n, ml):
    """Binary-encode n and left-pad with zeros to length ml."""
    bits = Binary(n)
    inarr = [int(d) for d in bits]
    while len(inarr) < ml:
        inarr = [0] + inarr
    return inarr

def drawnetwork(numnodes, connectivity):
    """Scatter the nodes randomly and draw the stronger links between them."""
    points = []
    for i in range(numnodes):
        points.append([random.randint(0, 1000), random.randint(0, 1000)])
    for i in range(numnodes):
        for j in range(numnodes):
            if abs(connectivity[i][j]) > 0.1:
                # Darker grey = stronger connection.
                shade = striphex(255 - abs(int(connectivity[i][j] * 255)))
                colour = "#" + shade * 3
                if i == j:
                    # Self-connection: draw a small smoothed loop at the node.
                    w.create_line(points[i][0], points[i][1],
                                  points[i][0] + 25, points[i][1],
                                  points[j][0], points[j][1] + 25,
                                  points[j][0], points[j][1],
                                  smooth=TRUE, fill=colour)
                else:
                    w.create_line(points[i][0], points[i][1],
                                  points[j][0], points[j][1],
                                  arrow=LAST, fill=colour)

def drawconn(numnodes, connectivity):
    """Draw the weight matrix itself as a grid of grey-scale ticks."""
    for a in range(numnodes):
        for b in range(numnodes):
            shade = striphex(255 - abs(int(connectivity[a][b] * 255)))
            w.create_line(a, b, a + 1, b + 1, fill="#" + shade * 3)
#drawconn(ns,connectivity)
#drawnetwork(ns,connectivity)

xor = {"00": 0, "01": 1, "10": 1, "11": 0}  # XOR truth table (leftover, unused)

#connprint(connectivity,ns)

def isprime(n):
    """Trial division (slow, but fine for small n)."""
    for i in range(2, n):
        if n % i == 0:
            return False
    return True

primelist = []
for i in range(2, 1000):
    if isprime(i):
        primelist.append(i)

for i in range(1, 1000):
    #activation[primelist[i]%ns] = 1
    #activation[(ns - primelist[i])%ns - 1] = 1
    # Stimulus: switch on every tenth neuron, shifted by the parity of i.
    for a in range(ns):
        if (a + i % 2) % 10 == 0:
            activation[a] = 1
    actprint(activation, ns)
    hebbian(activation, connectivity, ns)
    transmission(activation, connectivity, ns, 0.1)
    #actprint(activation,ns)
    #print "***"

print "*" * 100
connprint(connectivity, ns)

#Good threshold = 0.25

drawnetwork(ns,connectivity)
drawconn(ns,connectivity)

mainloop()