Perceptron: The Basic Building Block of a Neural Network

Spoorthi U K
Oct 23, 2020

Start your journey into machine learning by learning about the perceptron (the first artificial neuron model).

Every time someone decides to start learning ML, they might get overwhelmed by search results full of complicated networks and code. But where should one start the journey? I would say the answer is the perceptron, mostly because it's the simplest and most primitive form of a neural network.

A neuron in an artificial neural network is analogous to a biological neuron in the brain. All computation on the incoming signals takes place in these neurons, and an appropriate response is produced.

Artificial Neuron-Biological Neuron Analogy

Parts of a perceptron:

  1. Inputs: Inputs are the features given to a neural network to train or test the model. Here they are denoted by x₀, x₁ and x₂, and are usually written in vector form as X=[x₀,x₁,…,xₘ]ᵗ.
  2. Weights: These are the learnable parameters of a neural network and are tuned to give the desired response. They represent the strength of each connection. Here they are denoted by w₀, w₁ and w₂, and are usually written in vector form as W=[w₀,w₁,…,wₘ]ᵗ.
  3. Summation block: Here all the incoming signals, each multiplied by its weight, are summed to give a single value v.
  4. Activation function: This is the mathematical function that determines the output y of the neuron. Depending on the application, various activation functions can be used, but that's a different topic altogether.

Usually w₀ is called the bias.

So, what’s a bias?

The bias is a constant that shifts the decision boundary so the model can fit the given data better, much like the intercept added to a linear equation.

For simplicity, we write it as w₀.
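Concretely, if the input x₀ is fixed to 1 (the usual convention when the bias is folded into the weight vector), the weighted sum expands to v = w₀ + w₁x₁ + … + wₘxₘ, where w₀ plays the role of the intercept.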

Formula used in a perceptron:

Summation: v = XᵗW

Output: y = Φ(v)
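
To make this concrete, here is a minimal NumPy sketch (not from the original post) of the forward pass, assuming a step function for Φ and the bias folded in as w₀ with x₀ = 1. The weight and input values are arbitrary, chosen only for illustration:

```python
import numpy as np

def step(v):
    """Step activation Φ: outputs 1 if v >= 0, else 0."""
    return 1 if v >= 0 else 0

def perceptron_output(x, w):
    """Forward pass: v = XᵗW followed by y = Φ(v).

    x and w include the bias term, i.e. x[0] is fixed to 1
    and w[0] is the bias w₀.
    """
    v = np.dot(x, w)   # summation block
    return step(v)     # activation function

# Example with two features (x₁, x₂) plus the bias input x₀ = 1.
x = np.array([1.0, 0.5, -0.2])
w = np.array([0.1, 0.8, 0.3])
print(perceptron_output(x, w))  # prints 1, since v = 0.44 >= 0
```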

Depending on the application, the weights of the perceptron are adjusted according to certain learning rules.
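
One common choice is the classic perceptron learning rule, sketched below as an illustration (the function name and toy data are my own, not from the post): each weight is nudged in proportion to the error between the target label and the perceptron's output.

```python
import numpy as np

def train_perceptron(X, t, lr=0.1, epochs=10):
    """Classic perceptron learning rule (illustrative sketch).

    X: array of shape (n_samples, n_features), with x₀ = 1 in the
       first column so that w[0] acts as the bias w₀.
    t: array of target labels (0 or 1).
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, t_i in zip(X, t):
            y_i = 1 if np.dot(x_i, w) >= 0 else 0   # forward pass
            w += lr * (t_i - y_i) * x_i             # update only on mistakes
    return w

# Toy example: learn the logical AND of two inputs.
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
t = np.array([0, 0, 0, 1])
print(train_perceptron(X, t))  # learned weights separate AND correctly
```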

These perceptrons are interconnected to form various networks such as feed-forward networks, CNNs, etc.

They're essentially just perceptrons connected in a certain manner.

Now that you have taken your first step towards machine learning, you're all set to dive into the deep sea of algorithms and statistics.

Happy Learning!
