Beginner Intro to Neural Networks 12: Neural Network in Python from Scratch
Handwriting generation with recurrent neural networks: https://www.cs.toronto.edu/~graves/handwriting.html
Notebook with neural net:
https://github.com/JonComo/flowers/blob/master/flowers.ipynb
Music:
Pookatori and Friends Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/
Hey everyone!
In this video we solve the flower problem “by hand” using python, and do it interactively in jupyter notebook/IPython.
You’ll see everything from weight initialization, to the feed-forward and backward passes used to train the net, to inference on some crazy types of flowers.
Also, as a bonus, I left in my hyperparameter “dancing” (learning rate, training iterations, weight variance, etc.) for fun!
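The steps mentioned above (weight initialization, feed-forward, backward pass, inference) can be sketched as a single sigmoid neuron trained with stochastic gradient descent. The flower measurements and hyperparameter values below are made up for illustration; see the linked notebook for the real ones:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Toy flower measurements [length, width] and types (1 = red, 0 = blue);
# these values are illustrative placeholders
data = np.array([[3.0, 1.5], [2.0, 1.0], [4.0, 1.5], [3.0, 1.0],
                 [3.5, 0.5], [2.0, 0.5], [5.5, 1.0], [1.0, 1.0]])
targets = np.array([1, 0, 1, 0, 1, 0, 1, 0])

# Weight initialization: small random values
w1, w2, b = rng.normal(scale=0.5, size=3)
learning_rate = 0.1

for _ in range(10000):
    i = rng.integers(len(data))      # pick one random example (SGD)
    x1, x2 = data[i]
    target = targets[i]

    # Feed-forward pass
    pred = sigmoid(w1 * x1 + w2 * x2 + b)

    # Backward pass: d(cost)/d(z) for cost = (pred - target)^2
    dcost_dz = 2 * (pred - target) * pred * (1 - pred)

    # Gradient descent: subtract the gradient to reduce the cost
    w1 -= learning_rate * dcost_dz * x1
    w2 -= learning_rate * dcost_dz * x2
    b  -= learning_rate * dcost_dz

# Inference on a new "mystery" flower
mystery = sigmoid(w1 * 4.5 + w2 * 1.0 + b)
print(mystery)
```

The output is the network's confidence that the mystery flower is red; rounding it gives the predicted class.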
Thanks for watching. I appreciate seeing you all still here. I’m starting to work on these videos full time. Let me know what you’d like to see next!
See you in the next video,
– gnn
Is this channel dead?
Can I add more features and more types (outputs) of flowers and do the exact same thing? Without the sigmoid function, of course.
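On more features and more output classes: one common route (not shown in the video; everything below, including the synthetic data, is a made-up illustration) is to replace the single sigmoid output with a softmax layer trained on a cross-entropy cost:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 4 features, 3 classes, labels generated by a linear model
n_features, n_classes, n_samples = 4, 3, 150
X = rng.normal(size=(n_samples, n_features))
true_W = rng.normal(size=(n_features, n_classes))
y = np.argmax(X @ true_W, axis=1)

def softmax(z):
    # subtract the row max for numerical stability
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
lr = 0.5
for _ in range(500):
    probs = softmax(X @ W + b)              # forward pass
    probs[np.arange(n_samples), y] -= 1     # gradient of cross-entropy w.r.t. logits
    grad = probs / n_samples
    W -= lr * X.T @ grad                    # backward pass / update
    b -= lr * grad.sum(axis=0)

acc = np.mean(np.argmax(X @ W + b, axis=1) == y)
print(acc)
```

Adding features just means more columns in `X` (and rows in `W`); adding flower types means more columns in `W`.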
Hello from the other side! Really cool visualizations, and I may or may not have taken notes to include you in a machine learning report/project I'm doing that tries to summarize concepts that are lacking from traditional ml courses
Hey, great video! Just wanted to ask about the TensorFlow video that you mentioned at the end?
Super easy to follow, and the testing at the end was brilliant; it was really cool to try all the weird combinations. Did you ever manage to do this in TensorFlow?
missing u this series bro
Whenever I go to print anything in my for loop it gives me an error (numpy.float64 object is not callable)… can somebody please help?
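That error usually means a number has ended up where Python expected a function, most often because a variable reused a function's name. A minimal reproduction with hypothetical names (not the notebook's actual code):

```python
import numpy as np

def cost(pred, target):
    return (pred - target) ** 2

pred, target = np.float64(0.7), 1.0

# Bug: the result is stored under the same name as the function,
# so "cost" is now a numpy.float64, not a function
cost = cost(pred, target)
try:
    cost(pred, target)
except TypeError as err:
    print(err)   # 'numpy.float64' object is not callable

# Fix: keep the function and its result under different names
def cost_fn(pred, target):
    return (pred - target) ** 2

c = cost_fn(pred, target)
print(c)
```

In a Jupyter notebook the shadowed name survives between cells, so the error can appear in a cell that looks correct; restarting the kernel after renaming the variable clears it.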
this entire series was a giant waste of time, i made sure to dislike the video.
Do you cover backpropagation in this series?
Hi, I followed you exactly, but my cost_sum is going in the opposite direction of minimisation. Its graph is now forming a Nike symbol, i.e., it's maximising.
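A cost that climbs instead of falls is often just a flipped sign in the weight update (adding the gradient instead of subtracting it). A tiny self-contained demonstration of the two signs side by side, not taken from the notebook:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# One input, one weight, cost = (pred - target)^2
x, target, lr = 2.0, 1.0, 0.5
w_good, w_bad = 0.0, 0.0

for _ in range(50):
    # forward passes
    p_good = sigmoid(w_good * x)
    p_bad = sigmoid(w_bad * x)
    # d(cost)/dw via the chain rule through the sigmoid
    g_good = 2 * (p_good - target) * p_good * (1 - p_good) * x
    g_bad = 2 * (p_bad - target) * p_bad * (1 - p_bad) * x
    w_good -= lr * g_good   # correct: step down the cost curve
    w_bad += lr * g_bad     # flipped sign: climbs the cost curve instead

cost_good = (sigmoid(w_good * x) - target) ** 2
cost_bad = (sigmoid(w_bad * x) - target) ** 2
print(cost_good < cost_bad)   # True
```

The other usual suspect is computing the error as `target - pred` but keeping the update formula derived for `pred - target`, which has the same net effect.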
Hi, I am trying to develop a Python neural network module for learning purposes, thanks to your really amazing videos. After trying to figure everything out, I came to a point where everything seemed correct, but… something strange is happening in my train method: the cost value never goes lower than 1, no matter how many epochs run. I would really like to correct whatever mistake I made, but I don't know what it is :/ Would you help me figure it out? I would really appreciate it!
Shouldn't you be re-initializing the target variable in your little costs_sum loop?
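On re-initializing inside the loop: if an accumulator like cost_sum is set up once outside the training loop instead of at the start of each pass over the data, every epoch's total stacks on top of the previous one and the plotted curve never drops. A hypothetical sketch of the correct placement:

```python
import numpy as np

# Illustrative predictions and targets, not the notebook's data
preds = np.array([0.2, 0.9, 0.4, 0.8])
targets = np.array([0.0, 1.0, 0.0, 1.0])

def epoch_cost(preds, targets):
    cost_sum = 0.0                    # re-initialized every epoch
    for p, t in zip(preds, targets):
        cost_sum += (p - t) ** 2
    return cost_sum

# Two passes over the same data give the same total only because
# the sum starts fresh each time
c1 = epoch_cost(preds, targets)
c2 = epoch_cost(preds, targets)
print(c1, c2)
```

The same reasoning applies to any per-example variable (like target) that should be reassigned inside the loop body rather than carried over from the previous iteration.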
Hey… thanks for the tutorial, man. It was very easy to follow and helpful. But here's the thing: when iterating 10,000 times it works fine, yet at, say, 50,000 iterations its accuracy suffers significantly. Is this common for neural nets, or is it specific to this dataset? I saw you had a similar problem in the video. Please reply, man.
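Training that gets worse with more iterations is a classic symptom of a learning rate that is too large: each update overshoots the minimum, and past a certain point the overshoot grows instead of shrinking. The effect is easiest to see on a plain quadratic cost (nothing here is from the notebook):

```python
# Minimize cost(w) = (w - 3)^2 with plain gradient descent
def train(lr, steps):
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)   # derivative of the cost
        w -= lr * grad
    return (w - 3) ** 2      # final cost

print(train(lr=0.1, steps=100))    # small step: cost shrinks toward 0
print(train(lr=1.05, steps=100))   # too-large step: cost blows up
```

Lowering the learning rate, or decaying it over time, usually fixes this kind of late-training degradation.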
Hands down the best possible series. I was jumping between tutorials as I couldn't understand the other ones, this is the best and precise yet, amazingly simple series ever. This just gave me a push in the right direction that I needed.
Thanks a lot for the effort you put into the videos and all the explanations. You got me even more interested in neural networks, and even in maths, which I absolutely hate. Man, I love you and I owe you a big one. Once again, thanks a lot.
I hope you keep making series like this and gain many more subscribers, so that people know what a great educator you are.
Are you going to continue your multi-neural-network series? Please do!!!
Setup some Patreon and we would gladly join!
This is BY FAR the best intro to NN! Thank you very much! Please, keep going!
I am always picky about liking videos and even commenting on them. I loved this series from the beginning but was still in doubt until I came to the tenth video. You just made me log in to my account, like each one of your videos in this series, and comment on them. That’s how good this series was. Thank you 😀 (Warning: since I wrote the comments some time after watching the videos, some of them might be cringey.)
Thank you so much!
I don't normally comment, but man, for real? I completely agree with Tejas Arlimatti: this is how I learned the breakdown of a neural network. Now that it's been 2 years since this video, I would really like to see more broken down. How do I add a few hidden layers to this? How do I take it to the next step and build a CNN or RNN, use ARS, or the ReLU activation function?