## Building a Neural Network from Scratch in Python

We’re going to use Python to build a simple 3-layer feedforward neural network to predict the next number in a sequence. We’ll go over the concepts involved, the theory, and the applications. Led by: Siraj Raval of Sirajology


### Comment List

• HackingEDU
November 27, 2020

Great video.
I also have a recommendation for another channel for learning neural networks, in case it suits you hehe.

https://youtu.be/vyAsO_fzNF8

• HackingEDU
November 27, 2020

Far better to learn neural networks from Sentdex. He has a channel, and also a website. He gives a very good and simple explanation for everything that he does.

• HackingEDU
November 27, 2020

This is so bad. I don't know how these people are allowed to teach.

• HackingEDU
November 27, 2020

Excellent!

• HackingEDU
November 27, 2020

Siraj Raval plagiarized, got away with it, and even made money, I guess.

• HackingEDU
November 27, 2020

I feel like he doesn't give a good explanation for using dot and transpose. I took matrix algebra, but it didn't focus on implementation, and I would love an idea of why and how to apply those concepts.
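A minimal sketch of why the transpose shows up in the weight update, assuming the shapes used in the video (4 samples, 3 inputs, 4 hidden units); the zero arrays are placeholders just to show how the dimensions line up:

```python
import numpy as np

# Shapes from the tutorial: 4 samples, 3 input features, 4 hidden units.
l0 = np.zeros((4, 3))        # input layer: (samples, features)
syn0 = np.zeros((3, 4))      # weights: (features, hidden units)

l1 = np.dot(l0, syn0)        # (4,3) @ (3,4) -> (4,4): one hidden row per sample
l1_delta = np.zeros((4, 4))  # error signal, same shape as l1

# To update syn0 we need a (3,4) gradient. Transposing l0 puts the sample
# axis in the middle so the dot product sums over samples:
# (3,4samples) @ (4samples,4hidden) -> (3,4), matching syn0.
grad = l0.T.dot(l1_delta)
print(grad.shape)  # (3, 4)
```

So the transpose isn't arbitrary: it's what makes the gradient come out in the same shape as the weight matrix it updates.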

• HackingEDU
November 27, 2020

Maybe I missed something but I'm wondering why there is no backpropagation function for an L1_error like there is for L2_error (line 40). I'm new to both Python and Neural Networks but that seems odd to me. Would you (or anyone) mind helping me understand why backpropagation is not required for L1? Thanks
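For what it's worth, the hidden layer *is* backpropagated in the code: the `l1_error = l2_delta.dot(syn1.T)` line is exactly that step, it just isn't labeled with its own comment. A minimal sketch, using random stand-in activations in the tutorial's shapes:

```python
import numpy as np

def nonlin(x, deriv=False):
    # sigmoid; with deriv=True, x is assumed to already be sigmoid output
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

rng = np.random.default_rng(1)
y    = rng.random((4, 1))   # targets (stand-in values)
l1   = rng.random((4, 4))   # hidden-layer activations
l2   = rng.random((4, 1))   # output-layer activations
syn1 = rng.random((4, 1))   # hidden -> output weights

# Output layer: error comes directly from the targets
l2_error = y - l2
l2_delta = l2_error * nonlin(l2, deriv=True)

# Hidden layer: its error is not skipped -- it is backpropagated
# from l2_delta through syn1 (the `l1_error` line in the code)
l1_error = l2_delta.dot(syn1.T)
l1_delta = l1_error * nonlin(l1, deriv=True)
```

Only `l2_error` is computed against the labels; every earlier layer's error is derived from the layer above it, which is what "backpropagation" means.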

• HackingEDU
November 27, 2020

Hey I followed along recreating the python program as he went. I double checked to make sure we typed the same things. However, my error goes down a lot slower than his. Why might this be?

• HackingEDU
November 27, 2020

```python
'''
The code in case someone needs it.

This is a three-layer neural network.
Python 3.8.0
NumPy 1.17.3
'''
import numpy as np

# sigmoid function (returns its derivative when deriv=True,
# assuming x is already a sigmoid output)
def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

x = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])  # input data

y = np.array([[0],
              [1],
              [1],
              [0]])  # classes

np.random.seed(1)  # seeding to get reproducible values

# synapses (weights)
syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1

# training
for j in range(60000):

    # making the layers
    l0 = x  # the created x will be the input of l0
    l1 = nonlin(np.dot(l0, syn0))
    l2 = nonlin(np.dot(l1, syn1))

    # backpropagation
    l2_error = y - l2
    if (j % 10000) == 0:
        print('Error: ' + str(np.mean(np.abs(l2_error))))

    # calculate deltas
    l2_delta = l2_error * nonlin(l2, deriv=True)
    l1_error = l2_delta.dot(syn1.T)
    l1_delta = l1_error * nonlin(l1, deriv=True)

    # update weights (synapses)
    syn1 = syn1 + l1.T.dot(l2_delta)
    syn0 = syn0 + l0.T.dot(l1_delta)

print('Output after training is done')
print(l2)
```

• HackingEDU
November 27, 2020

line 43: there are 3 opening brackets but only 2 closing brackets, yet his code runs… this is amazing.

• HackingEDU
November 27, 2020

Good training

• HackingEDU
November 27, 2020

Why would someone mix up l with 1? They’re so far apart!! Doesn’t make any sense.

• HackingEDU
November 27, 2020

Is that potato 🥔?

• HackingEDU
November 27, 2020

why is siraj so dumb in this video

• HackingEDU
November 27, 2020

All this is cool, but can someone explain what does this actually mean? Like what does the x and y array represent?
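In case it helps: each row of `x` is one training example (two inputs plus a constant third column acting like a bias), and `y` is the target output for that row. The targets happen to be XOR of the first two inputs, which we can check directly:

```python
import numpy as np

# The dataset from the video: rows of x are examples, y holds the labels.
x = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])
y = np.array([[0], [1], [1], [0]])

# y equals XOR of the first two columns of x; the third column is always 1.
xor = (x[:, 0] ^ x[:, 1]).reshape(-1, 1)
print((xor == y).all())  # True
```

So the network is being trained to learn XOR, a classic test case because it isn't linearly separable and therefore needs a hidden layer.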

• HackingEDU
November 27, 2020

I'm learning more on this. I'm just getting started; I've learned a lot of Python.

• HackingEDU
November 27, 2020

numpy isn't from scratch

• HackingEDU
November 27, 2020

Very good Video

• HackingEDU
November 27, 2020

Can i get a copy of the code please?

• HackingEDU
November 27, 2020

at least we know the first words post-singularity already

in a somber tone: “holy shit it worked”

• HackingEDU
November 27, 2020

Why is deriv x*(1-x)? Filipe from Brazil
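The sigmoid σ(z) has the derivative σ'(z) = σ(z)·(1 − σ(z)). In the video's code, the value passed to `nonlin(..., deriv=True)` is already a sigmoid *output* (an activation), so the derivative simplifies to `x * (1 - x)`. A quick numerical check of that identity:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

z = np.linspace(-3, 3, 7)
s = sigmoid(z)  # activations, like l1 and l2 in the network

# Analytic derivative expressed in terms of the activation: s * (1 - s)
analytic = s * (1 - s)

# Numerical derivative of sigmoid via central differences, for comparison
h = 1e-5
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2 * h)

print(np.allclose(analytic, numeric, atol=1e-8))  # True
```

That's also why storing the activations `l1` and `l2` is enough to compute the deltas: the derivative never needs the pre-activation values.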


• HackingEDU
November 27, 2020

3:08 "Tokyo sex without a condom"

Hellllllll yeah Siraj… get that pussy bro

• HackingEDU
November 27, 2020

After racking my brain for an entire day over the logic of the dot product of synapses and inputs, and the order in which they should be multiplied, I really didn't know whether I was right or wrong. What I understood was that when initializing the dimensions for the synapses of layer 1 (i.e. syn0), the dimensions should be (4,3): each row denoting a neuron in layer 2, with 3 columns for the 3 input features. But that doesn't produce a single value at the end of all the layers. Now I realize (thank you, I got it now) that the reverse is right, because it does finalize to a single value. Thank you SIRAJ, indirectly affecting millions of amateurs like me… ;>)
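The shape question above can be settled by just tracing the dimensions forward; a small sketch, using zero placeholders in the tutorial's shapes:

```python
import numpy as np

x = np.zeros((4, 3))     # 4 samples, 3 input features
syn0 = np.zeros((3, 4))  # (inputs, hidden units): rows match x's columns
syn1 = np.zeros((4, 1))  # (hidden units, outputs)

l1 = x.dot(syn0)   # (4,3) @ (3,4) -> (4,4): 4 hidden values per sample
l2 = l1.dot(syn1)  # (4,4) @ (4,1) -> (4,1): a single output per sample

# With syn0 shaped (4,3) instead, the forward pass would not even run:
try:
    x.dot(np.zeros((4, 3)))
except ValueError as e:
    print('shape mismatch:', e)
```

With the `samples-first` convention used in the video, each weight matrix must be (units in, units out), so syn0 is (3,4), not (4,3).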

• HackingEDU
November 27, 2020

Can someone have a look on my code and tell me why memory overflow happens after 3 loops?
https://paste.ofcode.org/HCPyVfbTkXNZne7biuxamV

• HackingEDU
November 27, 2020

code is ugly, explanations are wrong, and using numpy is not "from scratch"

• HackingEDU
November 27, 2020

its siraj

• HackingEDU
November 27, 2020

from basic neural network to tensorflow to rnn to cnn. this helps a lot! thanks!

• HackingEDU
November 27, 2020

RE: the first example. This is great. The code compiles, and I get the same numbers. What I'm a bit confused about is why there is no learning rate, and is this applying backpropagation?
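On the learning-rate question: the video's update `syn1 += l1.T.dot(l2_delta)` is effectively gradient descent with a learning rate of 1 (and yes, the delta/error lines are backpropagation). A sketch of the same loop with an explicit rate `alpha`; `alpha = 0.5` here is an arbitrary illustrative choice, not a value from the video:

```python
import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

x = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

np.random.seed(1)
syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1

alpha = 0.5  # explicit learning rate; the original update is the alpha = 1 case

for _ in range(60000):
    l1 = nonlin(x.dot(syn0))
    l2 = nonlin(l1.dot(syn1))
    l2_delta = (y - l2) * nonlin(l2, deriv=True)
    l1_delta = l2_delta.dot(syn1.T) * nonlin(l1, deriv=True)
    syn1 += alpha * l1.T.dot(l2_delta)   # scale each step by alpha
    syn0 += alpha * x.T.dot(l1_delta)

print('final error:', np.mean(np.abs(y - l2)))
```

Smaller alphas converge more slowly but more stably, which may also explain why two "identical" transcriptions can show different error curves if any detail differs.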

• HackingEDU
November 27, 2020

The artificial intelligence, machine learning, and deep learning group "Deepmind and Machine learning" on Facebook.

• HackingEDU
November 27, 2020

People talking shit about Siraj, listen: nobody is perfect. Have you ever programmed? Errors come and go, and the process takes time. It's hard to stand there in front of an audience and do live programming; it's just hard, imagine it. He tried to explain, and his audience was not noobs. He at least tried, which you didn't, so stay cool, study the basics again, come back here, and see the difference. He is an awesome teacher and an inspiration to a lot of people, so instead of pointing out negatives, point out positives. That's what you're supposed to do when learning.

• HackingEDU
November 27, 2020

3:37 neural network tutorial

• HackingEDU
November 27, 2020

I was wondering why everybody trying to teach neural nets starts with XOR and ends with the same XOR! Is there anything else you can use it for? After all, there is no point repeating the same problem over and over again. Please change the subject.

• HackingEDU
November 27, 2020

I believe this guy memorized the code and doesn't understand how it works. Look at his typos and shallow explanation for every step.

• HackingEDU
November 27, 2020

Awesome Video! Fun and exciting and yes not perfect.

• HackingEDU
November 27, 2020

Well now, numpy isn't like os or random or time, etc. It's a package that doesn't come by default with a Python installation. So importing it without any explanation of how to use it, where to get it, or how to install it, you really make a lot of leaps here and take a lot for granted. And you truly are building on top of other people's work much more profoundly than by using default packages. In fact, not everyone uses Python, so this commentary about "abstracting everything" is just bullshit – this is a specific implementation.

• HackingEDU
November 27, 2020

WHERE MAH PEP8 @ BRO

• HackingEDU
November 27, 2020

The typing is terrible and his explanations are total crap.

• HackingEDU
November 27, 2020

YeS SiR !!!!!!!!!!!!!!!!

• HackingEDU
November 27, 2020

Yes, he is Siraj, because the code in this video is also in the video uploaded by Siraj.

• HackingEDU
November 27, 2020

How do you draw the node/layers picture based off the input and output matrices?

• HackingEDU
November 27, 2020