# 【Part 0: Perceptron】Understanding Deep Learning Principles with Python Code from Scratch, Without Frameworks

Hi there:)

I’m gonna write a series of blog posts explaining deep learning principles. To explain them clearly, the articles in this series will use Python code written from scratch, without frameworks such as TensorFlow, Caffe, or Chainer, because I believe that building a deep learning program in Python from scratch is the best way to dig deep into “deep learning” principles.

To begin with, this article describes what a perceptron is and how it works.

The perceptron is the origin of the neural network (and therefore of deep learning), so understanding it will help you understand neural network principles.

**■What’s Perceptron**

The perceptron algorithm was invented in 1957 at the Cornell Aeronautical Laboratory by Frank Rosenblatt.

A perceptron is a process that maps inputs (x) to an output (y).

The most important point is that the output (y) takes only two values, 0 or 1.

When the perceptron’s signal is at or below the threshold value (θ), the output (y) is 0.

When the signal exceeds the threshold value (θ), the output (y) is 1.

※1. A simple perceptron model.

x1 and x2 are input signals. y is output signal. w1 and w2 are weights.

◯ is called node or neuron. θ means threshold value.

Each input signal is multiplied by its weight (w1x1, w2x2), and the weighted signals are summed. The following is the mathematical formula for the model described above.

※2. Mathematical formula of the perceptron model in ※1.
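Putting the threshold behavior described above into a formula, the perceptron computes:

```latex
y =
\begin{cases}
0 & (w_1 x_1 + w_2 x_2 \le \theta) \\
1 & (w_1 x_1 + w_2 x_2 > \theta)
\end{cases}
```

That is, the weighted sum of the inputs is compared against the threshold θ, and the output flips from 0 to 1 only when the sum exceeds it.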

**■AND gate**

I’m gonna describe the AND gate as a perceptron.

The AND gate is a basic digital logic gate that implements logical conjunction; it behaves according to the truth table below (※3). A HIGH output (1) results only if all inputs to the AND gate are HIGH (1). If none or not all inputs are HIGH, a LOW output results. The function can be extended to any number of inputs.

※3. AND gate truth table.

| x1 | x2 | y |
|----|----|---|
| 0  | 0  | 0 |
| 1  | 0  | 0 |
| 0  | 1  | 0 |
| 1  | 1  | 1 |

There are infinitely many ways to choose parameters that satisfy ※3.

The values (w1, w2, θ) just need to be set so that they satisfy the AND gate truth table.

For example, (w1, w2, θ) = (0.5, 0.5, 0.7) works the same way as (w1, w2, θ) = (1.0, 1.0, 1.0).

With either of these settings, the sum w1x1 + w2x2 exceeds the threshold value (θ) only when x1 and x2 are both 1.
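We can verify this claim with a few lines of Python (a quick sketch using the two parameter settings given above):

```python
# Check that both parameter settings reproduce the AND truth table (※3).
for w1, w2, theta in [(0.5, 0.5, 0.7), (1.0, 1.0, 1.0)]:
    for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
        y = 1 if x1 * w1 + x2 * w2 > theta else 0
        # Output is 1 only when both inputs are 1, exactly as the AND gate requires.
        assert y == (1 if x1 == 1 and x2 == 1 else 0)
```

If any parameter choice failed the truth table, the assertion would raise an error; both pass.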

**■Let’s build perceptron with Python from scratch**

Let’s define an AND gate that receives x1 and x2 as arguments.

The following Python code defines the AND function.

```
def AND(x1, x2):
    # Parameters chosen above to satisfy the AND truth table.
    w1, w2, theta = 0.5, 0.5, 0.7
    # Weighted sum of the inputs.
    tmp = x1*w1 + x2*w2
    if tmp <= theta:
        return 0
    else:
        return 1
```
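Calling the function on all four input combinations reproduces the AND truth table (※3):

```python
# The AND gate perceptron defined above, repeated here so this snippet runs on its own.
def AND(x1, x2):
    w1, w2, theta = 0.5, 0.5, 0.7
    tmp = x1*w1 + x2*w2
    if tmp <= theta:
        return 0
    else:
        return 1

# Try every input combination from the truth table.
for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(f"AND({x1}, {x2}) = {AND(x1, x2)}")
# AND(0, 0) = 0
# AND(1, 0) = 0
# AND(0, 1) = 0
# AND(1, 1) = 1
```

Only the (1, 1) input pushes the weighted sum (1.0) over the threshold (0.7), so only that case outputs 1.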

In the next article, I’ll write a perceptron with a bias term.

Thank you for reading everyone 😉