
So today, after a session of the Goldsmiths Deep Learning with TensorFlow group, I decided to go back to the basics and start ML101. I found a good resource for that:

http://neuralnetworksanddeeplearning.com

The first chapter of this book gives a primer on how basic artificial neurons, such as the perceptron and the sigmoid neuron, work.

The perceptron is basically a mathematical approach to defining a decision-making model. A perceptron is defined by a set of binary variables (the inputs xs and the output y) and a set of parameters (a weight for each input, ws, and a threshold value). Each configuration of weights and threshold gives us a different decision-making model.
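A minimal sketch of that rule in Python (the particular inputs, weights and threshold here are made-up values, just for illustration):

def perceptron(inputs, weights, threshold):
    # Fire (output 1) only if the weighted sum of the inputs exceeds the threshold.
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum > threshold else 0

# Made-up example: three binary inputs, one dominant weight.
print(perceptron([1, 0, 1], weights=[6, 2, 2], threshold=5))  # -> 1
print(perceptron([0, 1, 0], weights=[6, 2, 2], threshold=5))  # -> 0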

Now, you can scale up the power of a perceptron to a network of perceptrons, composed of several interconnected layers. Each input is connected to every perceptron in the first layer, and each perceptron’s output is connected to all the perceptrons of the subsequent layer. Each subsequent layer can thus make more complex and abstract decisions, providing a very sophisticated mechanism for decision making.
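To make the layered idea concrete, here’s a rough sketch where each layer’s outputs feed into every perceptron of the next layer (the layer sizes and random weights are arbitrary assumptions for illustration):

import random

def perceptron(inputs, weights, threshold):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > threshold else 0

def layer(inputs, layer_weights, threshold):
    # Every perceptron in a layer sees all of the previous layer's outputs.
    return [perceptron(inputs, w, threshold) for w in layer_weights]

random.seed(0)
# A tiny 3-input -> 4-perceptron -> 2-perceptron network with random weights.
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
w2 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]

hidden = layer([1, 0, 1], w1, threshold=0.0)
output = layer(hidden, w2, threshold=0.0)
print(hidden, output)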

Here I’m using the dot product and the negative of the threshold to express the bias (b = −threshold):

Output = 0 if w · x + b ≤ 0

Output = 1 if w · x + b > 0
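The same toy perceptron from above, rewritten in the bias form with b = −threshold = −5:

def perceptron(x, w, b):
    # b is the negative of the threshold, so the rule is a sign test on w.x + b.
    dot = sum(w_j * x_j for w_j, x_j in zip(w, x))
    return 0 if dot + b <= 0 else 1

print(perceptron([1, 0, 1], [6, 2, 2], b=-5))  # -> 1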

To be updated…

I also began the basic TensorFlow starter tutorial with the MNIST dataset, for recognition of handwritten digits.

https://www.tensorflow.org/versions/r0.11/tutorials/mnist/beginners/index.html
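The core of that tutorial is a single softmax layer trained with gradient descent. With the r0.11-era graph API it looks roughly like the sketch below (treat the exact calls as approximate, not a verbatim copy of the tutorial):

import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

# Load MNIST as flattened 28x28 images with one-hot labels.
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

x = tf.placeholder(tf.float32, [None, 784])   # input images
y_ = tf.placeholder(tf.float32, [None, 10])   # true labels

# A single softmax layer: y = softmax(xW + b).
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, W) + b)

cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())  # the old r0.11-style initializer
    for _ in range(1000):
        batch_xs, batch_ys = mnist.train.next_batch(100)
        sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

    correct = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
    accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))
    print(sess.run(accuracy, feed_dict={x: mnist.test.images, y_: mnist.test.labels}))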

 
