Deep Neural Network from Scratch in Python | Fully Connected Feedforward Neural Network


In this video we build on last week's multilayer perceptron to allow for more flexibility in the architecture!


Backpropagation from scratch video:

Understanding Backpropagation video:

Music found on YouTube Music

Making a flexible neural network architecture API isn't too difficult. However, we need to be careful about the layers of abstraction we put in place in order to facilitate the work of a user who simply wants to fit and predict. Here we make use of the following three concepts: Network, Layer, and Neuron. These three components are composed together to make a fully connected feedforward neural network.
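To make the composition concrete, here is a minimal sketch of how the three concepts can nest. The class and method names (`Neuron`, `Layer`, `Network`, `add_layer`, `predict`) are assumptions for illustration, not necessarily the exact code from the video:

```python
import math
import random

class Neuron:
    """A single unit: one weight per input plus a bias."""
    def __init__(self, num_inputs):
        self.weights = [random.uniform(-1, 1) for _ in range(num_inputs)]
        self.bias = random.uniform(-1, 1)

    def activate(self, inputs):
        # Weighted sum of inputs followed by a sigmoid non-linearity.
        total = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return 1.0 / (1.0 + math.exp(-total))

class Layer:
    """A group of neurons that all receive the same input vector."""
    def __init__(self, num_neurons, num_inputs):
        self.neurons = [Neuron(num_inputs) for _ in range(num_neurons)]

    def forward(self, inputs):
        return [neuron.activate(inputs) for neuron in self.neurons]

class Network:
    """Composes layers; each layer's output feeds the next layer."""
    def __init__(self, num_inputs):
        self.num_inputs = num_inputs
        self.layers = []

    def add_layer(self, num_neurons):
        # A new layer takes its input size from the previous layer
        # (or from the raw network input if it is the first layer).
        num_inputs = len(self.layers[-1].neurons) if self.layers else self.num_inputs
        self.layers.append(Layer(num_neurons, num_inputs))

    def predict(self, inputs):
        for layer in self.layers:
            inputs = layer.forward(inputs)
        return inputs

net = Network(num_inputs=2)
net.add_layer(3)   # hidden layer with 3 neurons
net.add_layer(1)   # output layer with 1 neuron
prediction = net.predict([0.5, -0.2])  # a single sigmoid output in (0, 1)
```

Because `add_layer` infers its input size from the previous layer, the user can stack an arbitrary number of layers of arbitrary width without wiring the dimensions by hand.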

For those who don't know, a fully connected feedforward neural network is defined as follows (from Wikipedia):
“A feedforward neural network is an artificial neural network wherein connections between the nodes do not form a cycle. As such, it is different from its descendant: recurrent neural networks.
The feedforward neural network was the first and simplest type of artificial neural network devised. In this network, the information moves in only one direction, forward, from the input nodes, through the hidden nodes (if any) and to the output nodes. There are no cycles or loops in the network.”
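The "one direction, no cycles" structure from the definition can be sketched as a pair of matrix multiplications. The layer sizes here (2 inputs, 3 hidden neurons, 1 output) are an arbitrary illustration, not taken from the video:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes: 2 inputs -> 3 hidden neurons -> 1 output.
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -0.2])
hidden = sigmoid(W1 @ x + b1)       # input -> hidden: forward only
output = sigmoid(W2 @ hidden + b2)  # hidden -> output: nothing loops back
```

Information only ever flows from `x` to `hidden` to `output`; no value is fed back to an earlier layer, which is exactly what separates this network from its recurrent descendants.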

If you have any questions feel free to drop a question down below!



Comment List

  • Yacine Mahdid
    January 10, 2021

Thanks for this very informative video! Is this a very general approach which works for an arbitrary number of layers and an arbitrary number of neurons in each layer? Is the code that flexible? So can I simply add a layer by calling another clf.add_hidden_layer(num_neurons = x)?

  • Yacine Mahdid
    January 10, 2021

Great video, I was just looking for this. Just one tip: you may want to boost the volume of your mic or talk a little louder.
