In this post, you will learn how to build a simple neural network with just nine lines of Python code. If you are new to this subject, I highly recommend getting a basic understanding of deep learning first.

Let’s get started!

## What is a Neural Network?

A neural network is a series of algorithms designed to recognize patterns. It interprets sensory data through a kind of machine perception, clustering or labeling raw inputs. Real-world data such as images, sound, text, and time series must first be translated into numbers, typically vectors, before the network can work with it.

The connection between two nodes is called a *synaptic link*. During training, the network adjusts these links so the input data produces the right output; this process is loosely called *"thinking"*.

The mathematics behind this is covered in the post "Introduction To Deep Learning (Neural Network) And Its Mathematical Computations".

We're going to build this neural network with a very small dataset.

```python
from numpy import exp, array, random, dot
train_i = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
train_o = array([[0, 1, 0, 1]]).T
random.seed(1)
link_w = 2 * random.random((3, 1)) - 1
for iteration in range(1000):
    output = 1 / (1 + exp(-(dot(train_i, link_w))))
    link_w += dot(train_i.T, (train_o - output) * output * (1 - output))
print(1 / (1 + exp(-(dot(array([1, 1, 0]), link_w)))))
```

## Data for Training

Each example has three inputs and one output. We have four sets of I/O data:

| Input | Output |
|---|---|
| 0 0 1 | 0 |
| 1 1 1 | 1 |
| 1 0 1 | 0 |
| 0 1 1 | 1 |

### Test Input

- 1 1 0 ?

Now the network must "think" to find the *output* for this test input.

How?

Look closely at the training data: the output is always the same as the middle value of the input, even though several different input combinations appear. Now let's walk through the code line by line.
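To see that pattern concretely, here is a quick check using the same arrays as in the code (a small sketch; `middle_column` is just an illustrative name):

```python
from numpy import array

# The four training examples: three inputs each, and their outputs.
train_i = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
train_o = array([0, 1, 0, 1])

# The hidden rule: the output always equals the middle input value.
middle_column = train_i[:, 1]
print(middle_column)                     # [0 1 0 1]
print((middle_column == train_o).all())  # True
```

So for the test input 1 1 0, the middle value is 1, and we should expect the trained network to answer something close to 1.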

**Line #01**

```python
from numpy import exp, array, random, dot
```

This line imports the four *numpy* functions we need: `exp`, `array`, `random`, and `dot`.


But how do we teach our model to arrive at the correct answer?

- Take the training data as input (*train_i*) and compute the dot product with the randomly initialized link weights (*link_w*, the **synapses**), then pass the result through the sigmoid activation function to get the **output**.
- Calculate the error with a simple mathematical operation: the difference between the desired output and the network's output.
- Adjust the weights based on the error and repeat, until the output is close enough to be delivered.
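The steps above can be sketched as a single training iteration (a minimal sketch, not the post's exact code; `sigmoid` and `train_step` are names introduced here for illustration):

```python
from numpy import exp, array, random, dot

def sigmoid(x):
    # Squashes any number into the range (0, 1).
    return 1 / (1 + exp(-x))

def train_step(train_i, train_o, link_w):
    # 1. Forward pass: weighted sum of inputs through the sigmoid.
    output = sigmoid(dot(train_i, link_w))
    # 2. Error: desired output minus actual output.
    error = train_o - output
    # 3. Weight adjustment: error scaled by the sigmoid gradient.
    link_w += dot(train_i.T, error * output * (1 - output))
    return link_w

train_i = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
train_o = array([[0, 1, 0, 1]]).T
random.seed(1)
link_w = 2 * random.random((3, 1)) - 1
for _ in range(1000):
    link_w = train_step(train_i, train_o, link_w)
```

After enough iterations, the prediction for the test input lands close to 1.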

### Training I/O data

**Line #02**

```python
train_i = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
```

**Line #03**

```python
train_o = array([[0, 1, 0, 1]]).T
```
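The `.T` transposes the array: it turns one row of four outputs into a column of four rows, so each output lines up with its input example. A quick shape check:

```python
from numpy import array

train_o = array([[0, 1, 0, 1]])
print(train_o.shape)    # (1, 4): one row of four outputs
print(train_o.T.shape)  # (4, 1): four rows, one output each
```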

Each input node is connected to the output node by a *link weight* (*link_w*, also called **synapses**), which can be either a positive or a negative number. Each link is initialized with a random number, generated using *numpy*'s random function.

**Line #04**

```python
random.seed(1)
```

**Line #05**

```python
link_w = 2 * random.random((3, 1)) - 1
```

This initializes *link_w* (the synapses) as a 3×1 matrix of random numbers, both negative and positive, in the range −1 to 1.
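Since `random.random` draws values from [0, 1), multiplying by 2 and subtracting 1 maps them into [−1, 1), centered on zero. A quick check:

```python
from numpy import random

random.seed(1)                          # same seed as the post, so runs repeat
link_w = 2 * random.random((3, 1)) - 1
print(link_w.shape)                     # (3, 1): one weight per input node
print(((link_w >= -1) & (link_w < 1)).all())  # True: all weights in [-1, 1)
```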

**Line #06**

```python
for iteration in range(1000):
```

**Line #07**

```python
output = 1 / (1 + exp(-(dot(train_i, link_w))))
```

**Line #08**

```python
link_w += dot(train_i.T, (train_o - output) * output * (1 - output))
```

The computation on Line #07 breaks down into three parts:

1. `dot(train_i, link_w)` — the weighted sum of the inputs.
2. `1 / (1 + exp(-x))` — the sigmoid function, which squashes any value into the range 0 to 1.
3. `output = 1 / (1 + exp(-(dot(train_i, link_w))))` — the two combined: the network's output for every training example.
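To make these sub-steps concrete, here is the forward pass on a single example with fixed, hand-picked weights (the real weights are random, so the values below are purely illustrative):

```python
from numpy import exp, array, dot

x = array([0, 0, 1])               # one training example
w = array([[0.5], [0.5], [0.0]])   # illustrative weights, not the trained ones

weighted_sum = dot(x, w)           # step 1: 0*0.5 + 0*0.5 + 1*0.0 = 0.0
output = 1 / (1 + exp(-weighted_sum))  # steps 2-3: sigmoid(0) = 0.5
print(output)                      # [0.5]
```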

The loop repeats this computation; on every iteration the network calculates the error against the correct answer and refines its "sense" of the output.

The weights are adjusted according to that error:

```python
link_w += dot(train_i.T, (train_o - output) * output * (1 - output))
```

In *numpy*, `.T` transposes a matrix. We transpose the input so its shape lines up for the weight-update calculation. Our training data looks like this:

| INPUT | OUTPUT |
|---|---|
| 0 0 1 | 0 |
| 1 1 1 | 1 |
| 1 0 1 | 0 |
| 0 1 1 | 1 |
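A shape check shows why the transpose is needed (a small sketch using the same training array):

```python
from numpy import array

train_i = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
print(train_i.shape)    # (4, 3): four examples, three inputs each
print(train_i.T.shape)  # (3, 4): each input column becomes a row

# With the transpose, dot(train_i.T, error_term) yields a (3, 1)
# result: one adjustment per link weight, matching link_w's shape.
```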

**Line #09**

```python
print(1 / (1 + exp(-(dot(array([1, 1, 0]), link_w)))))
```

*Output* –

**0.98807249**

For the test input **1 1 0**, the network predicts a value very close to **1** — the correct answer.
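As a sanity check, you can run the trained network back over its own training inputs; it should reproduce the training outputs once rounded. This sketch reuses the post's code with 1,000 iterations:

```python
from numpy import exp, array, random, dot

train_i = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
train_o = array([[0, 1, 0, 1]]).T
random.seed(1)
link_w = 2 * random.random((3, 1)) - 1
for _ in range(1000):
    output = 1 / (1 + exp(-(dot(train_i, link_w))))
    link_w += dot(train_i.T, (train_o - output) * output * (1 - output))

# Predictions for the training inputs themselves.
predictions = 1 / (1 + exp(-(dot(train_i, link_w))))
print(predictions.round())  # rounds to 0, 1, 0, 1 -- matching train_o
```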

Congratulations on getting this far! Happy learning.


#### Rajasekar
