## Learn How Neural Networks Learn: It’s not so different from humans…

by Anna Shi | Nov 2020

What was the problem? Well, I trusted my sister more than my parents to give me the right answer. I assigned more weight to my sister’s explanation than my parents’ answer. Next time, I’ll know to listen more to my parents and less to my sister. This process of adjusting weights is called backpropagation.

When a neural network is first set up, the weights and biases in each layer are randomized (we can think of a bias as a weight). The prediction is wildly inaccurate as a result. Backpropagation updates the weights and biases of each of the nodes until the model can make consistently accurate predictions.
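To see why a bias can be treated as just another weight, here’s a tiny sketch: appending a constant input of 1 folds the bias into the weight vector. The numbers are arbitrary, chosen only for illustration.

```python
import numpy as np

# "Bias is just another weight" trick: a bias b acts like a weight
# attached to a constant input of 1. The values below are made up.

x = np.array([2.0, 3.0])
w = np.array([0.5, -0.25])
b = 0.1

with_bias = w @ x + b                          # ordinary weighted sum plus bias
folded = np.append(w, b) @ np.append(x, 1.0)   # bias folded in as a weight on input 1

print(np.isclose(with_bias, folded))  # True — the two forms agree
```

This is why randomizing "the weights and biases" can be described as randomizing one combined weight vector.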

Here were the steps in the context of my story:

1. My parents and my sister both provided me with answers. Keeping them both in mind, I chose to go with an answer of window.
2. Window was completely wrong. The answer was 2.
3. I looked back to my sources to figure out where I went wrong. My parents are clearly smarter than my sister.
4. I decided to start trusting my parents more than my sister from now on.
5. I used my new methodology to figure out the answer to 2 + 2.

Here’s the rundown in machine learning terms:

1. Perform a feedforward operation.
2. Compare the model’s output with the desired output.
3. Calculate the error with the error function.
4. Run the feedforward operation backwards.
5. Update the weights.
6. Rinse and repeat.
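The six steps above can be sketched in a few lines of code. This is a minimal, illustrative version for a single sigmoid neuron, not a full network — the toy numbers, learning rate, and scaling are all assumptions made just for the demo.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

rng = np.random.default_rng(0)
x = np.array([0.2, 0.2])           # toy inputs (think "2 + 2", scaled into [0, 1])
target = 0.4                       # desired output ("4", scaled the same way)
w = rng.normal(size=2)             # weights start out randomized...
b = rng.normal()                   # ...and so does the bias (just another weight)
lr = 1.0                           # learning rate (an arbitrary choice here)

for epoch in range(2000):
    y = sigmoid(w @ x + b)               # 1. feedforward
    error = 0.5 * (y - target) ** 2      # 2–3. compare with the desired output and compute the error
    grad = (y - target) * y * (1 - y)    # 4. push the error backwards through the neuron
    w -= lr * grad * x                   # 5. update the weights...
    b -= lr * grad                       #    ...and the bias
                                         # 6. rinse and repeat

print(sigmoid(w @ x + b))  # prediction is now close to 0.4
```

The first prediction is wildly off because the weights are random; after enough repetitions of the loop, the output settles near the target.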

The feedforward operation is the process of generating an output from the given inputs in a neural network.
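Here’s a small sketch of that operation for a hypothetical 2–3–1 network — the layer sizes and random weights are made up purely to show the flow from inputs to output.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Hypothetical 2-3-1 network with randomized weights and biases.
rng = np.random.default_rng(42)
W1 = rng.normal(size=(3, 2))   # weights: inputs -> hidden layer
b1 = rng.normal(size=3)        # hidden-layer biases
W2 = rng.normal(size=(1, 3))   # weights: hidden layer -> output
b2 = rng.normal(size=1)        # output bias

def feedforward(x):
    # Each layer takes the previous layer's outputs as its inputs,
    # applies its weights and bias, then a nonlinearity.
    hidden = sigmoid(W1 @ x + b1)
    output = sigmoid(W2 @ hidden + b2)
    return output

print(feedforward(np.array([0.5, 0.1])))  # a single value in (0, 1)
```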

I won’t go in-depth on this, but I’ve made a quick sequence to follow.

Receive inputs: If this is the first hidden layer, the inputs will come directly from the data. Otherwise, the inputs will be the outputs generated from the previous layer.

Calculate prediction: The prediction depends on the weight for each input and the bias. The bias can be considered its own weight for an input of 1. The formula for the prediction: