Posts

Showing posts from August, 2011

Poor Man's Colour Detector (Part 2) - The project

In this project we will be detecting the colour of 3 different Mega Bloks (Red, Yellow and Green). We will use an Arduino UNO connected to 2 LEDs (one Yellow and one Red LED) as light detectors, and an RGB LED to illuminate the subject. We will use a Photocell to account for varying ambient light levels. The signals from the LED light sensors will be sent to a Processing.org program via a Serial command. The computer program will make use of my Neural Network to classify the pattern of results and hopefully provide the correct colour "answer". The program should change the colour of the computer screen background to match the colour of the Mega Blok.

The Video

Parts Required:
Arduino UNO ........... x1
Red LED ............... x1
Yellow LED ............ x1
330 Ohm resistors ..... x5 (for the LEDs)
Photocell ............. x1
10K Ohm resistor ...... x1 (for the Photocell)
Around 11 wires and a Breadboard (o...
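The excerpt above does not include the sketch itself, but the Arduino-to-Processing link it describes can be sketched roughly. Below is a minimal Processing sketch, assuming the Arduino sends one comma-separated line of sensor readings per sample at 9600 baud; the variable names and the message format are hypothetical, not the post's actual code.

import processing.serial.*;

Serial arduino;                  // serial link to the Arduino UNO
int redReading, yellowReading;   // raw LED-sensor values (hypothetical names)

void setup() {
  size(400, 400);
  // assumes the Arduino is the first listed serial device at 9600 baud
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');     // call serialEvent() once per full line
}

void draw() {
  // placeholder: the real program would set this colour from the
  // neural network's classification (Red, Yellow or Green)
  background(200);
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  String[] vals = split(trim(line), ',');  // assumed format: "red,yellow"
  if (vals.length == 2) {
    redReading = int(vals[0]);
    yellowReading = int(vals[1]);
  }
}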

Neural Network (Part 7) : Cut and Paste Code

Ok - so you don't like tutorials, and would rather just cut and paste some code. This is for you.

http://www.openprocessing.org/visuals/?visualID=33991

Make sure to select "Source code" when you get there, otherwise it will be quite boring. Here is an animated gif which shows the program in action. See BELOW for the WHOLE screenshot so that you don't have to speed read. If you want to know how it works, then you will have to go back and read Parts 1 to 6.

Neural Network
Part 1: The Connection class
Part 2: The Neuron class
Part 3: The Layer class
Part 4: The Neural Network class
Part 5: Back Propagation - the process
Part 6: Back Propagation (a complete walk-through)

But as you can see from the example above: before the neural network is trained, the outputs are not even close to the expected outputs. After training, the neural network produces the desired results (or comes very close). Please note, this neural network also allows more than one ...
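As a quick illustration of the before/after training behaviour described above, here is a minimal sketch using the API shown in Part 6 (addLayer() and trainNetwork()). The repeated training loop and the cycle count are assumptions; the excerpt does not say how many passes the demo actually runs.

// assumes the NeuralNetwork class from Parts 1-6 is present in the sketch
NeuralNetwork NN = new NeuralNetwork();
NN.addLayer(2, 2);                 // first layer: 2 inputs, 2 neurons
NN.addLayer(2, 2);                 // second layer: 2 inputs, 2 neurons

float[] readings = {0, 1};
float[] expectedOutputs = {1, 0};

// before training: outputs start from random weights, nowhere near (1, 0)
// train repeatedly; a single back-propagation pass rarely converges
for (int i = 0; i < 1000; i++) {   // 1000 cycles is an assumed figure
  NN.trainNetwork(readings, expectedOutputs);
}
// after training: the outputs should be very close to (1, 0)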

Neural Network (Part 6) : Back Propagation, a worked example

A worked example of a Back-propagation training cycle. In this example we will create a 2 layer network (as seen above), to accept 2 readings and produce 2 outputs. The readings are (0,1) and the expectedOutputs in this example are (1,0).

Step 1: Create the network

NeuralNetwork NN = new NeuralNetwork();
NN.addLayer(2,2);
NN.addLayer(2,2);
float[] readings = {0,1};
float[] expectedOutputs = {1,0};
NN.trainNetwork(readings,expectedOutputs);

This neural network will have randomised weights and biases when created. Let us assume that the network generates the following random variables:

LAYER1.Neuron1
Layer1.Neuron1.Connection1.weight = cW111 = 0.3
Layer1.Neuron1.Connection2.weight = cW112 = 0.8
Layer1.Neuron1.Bias = bW11 = 0.5

LAYER1.Neuron2
Layer1.Neuron2.Connection1.weight = cW121 = 0.1
Layer1.Neuron2.Connection2.weight = cW122 = 0.1
Layer1.Neuron2.Bias = bW12 = 0.2

LAYER2.Neuron1
Layer2.Neuron1.Connection1.weight = cW211 = ...
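To make the feed-forward step concrete, the net input and output of Layer1.Neuron1 can be computed from the weights above. This sketch assumes a standard sigmoid activation and a constant bias input of 1, which is conventional for this kind of network but not stated in the excerpt.

// feed-forward for Layer1.Neuron1 with the random weights above,
// assuming output = sigmoid(netInput) and a bias input of 1
float cW111 = 0.3;
float cW112 = 0.8;
float bW11  = 0.5;
float[] readings = {0, 1};

float netInput = cW111*readings[0] + cW112*readings[1] + bW11*1;  // = 1.3
float output = 1 / (1 + exp(-netInput));                          // ≈ 0.786
println(output);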