Dropout Regularization | Deep Learning Tutorial 20 (Tensorflow2.0, Keras & Python)
Overfitting and underfitting are common phenomena in machine learning, and the techniques used to tackle overfitting are called regularization. In deep learning, dropout regularization randomly drops neurons from hidden layers during training, which helps the model generalize. In this video, we will first cover the theory behind dropout regularization. We will then implement an artificial neural network for a binary classification problem and see how adding a dropout layer can improve the model's performance.
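As a quick illustration of the idea (not the video's exact code, which builds a full Keras model on the linked CSV), here is a minimal NumPy sketch of "inverted" dropout, the variant Keras's `Dropout(rate)` layer applies: at training time each activation is zeroed with probability `rate`, and the survivors are scaled by `1/(1-rate)` so the expected activation is unchanged; at inference time the layer is a no-op.

```python
import numpy as np

def dropout(activations, rate, rng, training=True):
    """Inverted dropout on an array of activations.

    At train time, zero each unit with probability `rate` and scale
    the surviving units by 1/(1 - rate) so the expected value of the
    output matches the input. At inference time, return the input
    unchanged.
    """
    if not training or rate == 0.0:
        return activations
    keep_mask = rng.random(activations.shape) >= rate
    return activations * keep_mask / (1.0 - rate)

rng = np.random.default_rng(42)
a = np.ones(8)
out = dropout(a, rate=0.5, rng=rng)
# Each entry is either 0.0 (dropped) or 2.0 (kept and rescaled).
print(out)
# At inference time the activations pass through untouched.
print(dropout(a, rate=0.5, rng=rng, training=False))
```

In Keras the same behavior comes from inserting `keras.layers.Dropout(0.5)` between `Dense` layers; the framework handles the train/inference switch automatically via the `training` flag.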
CSV file: https://github.com/codebasics/py/tree/master/DeepLearningML/13_dropout_layer
Deep learning playlist: https://www.youtube.com/playlist?list=PLeo1K3hjS3uu7CxAacxVndI4bE_o3BDtO
Machine learning playlist: https://www.youtube.com/playlist?list=PLeo1K3hjS3uvCeTYTeyfe0-rN5r8zn9rw
Prerequisites for this series:
1: Python tutorials (first 16 videos): https://www.youtube.com/playlist?list=PLeo1K3hjS3uv5U-Lmlnucd7gqF-3ehIh0
2: Pandas tutorials (first 8 videos): https://www.youtube.com/playlist?list=PLeo1K3hjS3uuASpe-1LjfG5f14Bnozjwy
3: Machine learning playlist (first 16 videos): https://www.youtube.com/playlist?list=PLeo1K3hjS3uvCeTYTeyfe0-rN5r8zn9rw
DISCLAIMER: All opinions expressed in this video are my own and not those of my employer.