Tutorial 28 - Ridge and Lasso Regression using Python and Sklearn





Please join my channel as a member to get additional benefits like Data Science materials, members-only live streams, and more:
https://www.youtube.com/channel/UCNU_lfiiWBdtULKOw6X0Dig/join

GitHub URL: https://github.com/krishnaik06/RegressionandLasso
#Regularization
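The notebook in the repo above tunes Ridge and Lasso with `GridSearchCV`; here is a minimal sketch of that workflow (the synthetic dataset and alpha grid are illustrative assumptions, not the exact values used in the video):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import GridSearchCV

# Illustrative synthetic data; the video uses a real dataset from the repo.
X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)

params = {"alpha": [1e-3, 1e-2, 1e-1, 1, 5, 10, 20, 50, 100]}

# Negative MSE is used because sklearn scorers follow "higher is better".
ridge_cv = GridSearchCV(Ridge(), params, scoring="neg_mean_squared_error", cv=5)
ridge_cv.fit(X, y)

lasso_cv = GridSearchCV(Lasso(max_iter=10000), params,
                        scoring="neg_mean_squared_error", cv=5)
lasso_cv.fit(X, y)

print("Ridge best alpha:", ridge_cv.best_params_, "MSE:", -ridge_cv.best_score_)
print("Lasso best alpha:", lasso_cv.best_params_, "MSE:", -lasso_cv.best_score_)
```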

Please subscribe to my other channel too:

https://www.youtube.com/channel/UCjWY5hREA6FFYrthD0rZNIw

Connect with me here:
Twitter: https://twitter.com/Krishnaik06
Facebook: https://www.facebook.com/krishnaik06
Instagram: https://www.instagram.com/krishnaik06


Comment List

  • Krish Naik
    November 20, 2020

    Watched the 2nd part just now. You're like a savior to me, as I have some deadlines due tomorrow and this helped me a lot, sir. Thank you very much. 💯💯

  • Krish Naik
    November 20, 2020

    Very useful video on implementing Ridge Regression with sklearn. Thank you very much! I actually wanted to use Kernel Ridge Regression, but the steps you took were essentially the same as the ones needed for KRR (apart from changing some parameters). Thanks again!

  • Krish Naik
    November 20, 2020

    Your theory videos are good, but I don't like the coding part; it looks very different from what I do. You should have tried doing it from scratch with a new dataset. I'm good with theory, but now I'm confusing myself with the coding part (everyone has their own way of coding).

  • Krish Naik
    November 20, 2020

    Hi Krish, can you please explain why Lasso is better than Ridge in the prediction histograms? I wasn't able to follow the last minutes of the video. It would be great if you could clarify.

  • Krish Naik
    November 20, 2020

    Superb explanation. Need to get my hands dirty in a Jupyter notebook. Thanks.

  • Krish Naik
    November 20, 2020

    Can we use a scoring method other than neg_mean_squared_error to solve the problem? If so, please suggest one.
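On the scoring question: `GridSearchCV` accepts any of sklearn's built-in regression scorers, so swapping `neg_mean_squared_error` for, say, `r2` or `neg_mean_absolute_error` is a one-line change. A sketch with illustrative data:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=150, n_features=5, noise=5.0, random_state=0)
params = {"alpha": [0.1, 1, 10]}

# sklearn scorers are all "higher is better", which is why error
# metrics appear in negated ("neg_...") form.
for scoring in ["neg_mean_squared_error", "neg_mean_absolute_error", "r2"]:
    search = GridSearchCV(Ridge(), params, scoring=scoring, cv=5)
    search.fit(X, y)
    print(scoring, search.best_params_, search.best_score_)
```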

  • Krish Naik
    November 20, 2020

    But sir, you have not scaled the values?
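The scaling point is a fair one: Ridge and Lasso penalize coefficient magnitudes, so features on very different scales get penalized unevenly. A common remedy (not shown in the video) is to standardize inside a `Pipeline`, so the scaler is fit only on the training folds during cross-validation. A sketch with made-up data:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=200, n_features=8, noise=10.0, random_state=0)
X[:, 0] *= 1000.0  # put one feature on a wildly different scale

pipe = Pipeline([("scale", StandardScaler()), ("ridge", Ridge())])

# Parameters of pipeline steps are addressed as <step>__<param>.
params = {"ridge__alpha": [0.01, 0.1, 1, 10]}
search = GridSearchCV(pipe, params, scoring="neg_mean_squared_error", cv=5)
search.fit(X, y)
print(search.best_params_)
```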

  • Krish Naik
    November 20, 2020

    Hello Krish,
    I am confused: at the end, which one performed better in this case, Lasso or Ridge?
    Please give some feedback.

  • Krish Naik
    November 20, 2020

    Cannot clone object. You should provide an instance of scikit-learn estimator instead of a class.

    What does this error mean?
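That "Cannot clone object" error usually means the estimator *class* was passed to `GridSearchCV` instead of an *instance*, i.e. `Ridge` rather than `Ridge()`. A minimal reproduction and fix (illustrative data):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=100, n_features=4, noise=1.0, random_state=0)
params = {"alpha": [0.1, 1, 10]}

# Wrong: passing the class itself raises the "Cannot clone object ...
# instance of scikit-learn estimator instead of a class" error on fit().
try:
    GridSearchCV(Ridge, params, cv=3).fit(X, y)
except TypeError as exc:
    print("TypeError:", exc)

# Right: pass an instance (note the parentheses).
fixed = GridSearchCV(Ridge(), params, cv=3).fit(X, y)
print(fixed.best_params_)
```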

  • Krish Naik
    November 20, 2020

    Thank you so much! 🙂

  • Krish Naik
    November 20, 2020

    Excellent, straightforward video 🙂

  • Krish Naik
    November 20, 2020

    I'm following 'the complete machine learning playlist', but you're jumping steps and skipping many details, saying 'hope you know this'. I love your teaching, though. Can you make a good, complete playlist?

  • Krish Naik
    November 20, 2020

    Excellent !!

  • Krish Naik
    November 20, 2020

    Thanks for your previous video, Krish. I can't tell whether Lasso or Ridge is better at the end of the code. Also, I read a blog on this topic and found that Elastic Net is more efficient. Can you kindly explain this?
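On the Elastic Net question: `ElasticNet` combines the L1 and L2 penalties, with `l1_ratio` controlling the mix (1.0 is pure Lasso, 0.0 pure Ridge); whether it beats either alone depends on the dataset. A sketch with an illustrative dataset and grid:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)

# Tune both the overall penalty strength and the L1/L2 mix.
params = {"alpha": [0.01, 0.1, 1, 10], "l1_ratio": [0.2, 0.5, 0.8]}
search = GridSearchCV(ElasticNet(max_iter=10000), params,
                      scoring="neg_mean_squared_error", cv=5)
search.fit(X, y)
print(search.best_params_, -search.best_score_)
```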

  • Krish Naik
    November 20, 2020

    Hi Krish, amazing videos. On what basis is the alpha parameter list decided? Also, can you please explain the two plots at the end in more detail? Thanks.

  • Krish Naik
    November 20, 2020

    I feel the steps for the regression process can be like this:
    1) Split the data into train and test.
    2) Use the train data for cross-validation and find the parameter with the minimum MSE.
    3) Use the same parameter on the test data and compare the accuracy of the different models.
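The three steps in the comment above can be sketched as follows (hypothetical data; the repo's dataset would slot in the same way):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_regression(n_samples=300, n_features=10, noise=10.0, random_state=0)

# 1) Split into train and test.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

params = {"alpha": [0.01, 0.1, 1, 10, 100]}
results = {}
for name, est in [("ridge", Ridge()), ("lasso", Lasso(max_iter=10000))]:
    # 2) Cross-validate on the training data only to pick alpha.
    search = GridSearchCV(est, params, scoring="neg_mean_squared_error", cv=5)
    search.fit(X_train, y_train)
    # 3) Score the tuned model once on the held-out test set.
    results[name] = mean_squared_error(y_test, search.predict(X_test))

print(results)
```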

  • Krish Naik
    November 20, 2020

    Krish, a correction: you said that the Ridge graph (histogram) is more stable, and then at the end said Lasso is what we want. I am confused about that step.

  • Krish Naik
    November 20, 2020

    Good videos, so far so good. In most of the videos, I feel the inference part is missing. What can we infer from the plots?

  • Krish Naik
    November 20, 2020

    Hi.
    I just want to know, if I am not wrong, that we need to use the train_test_split method before training the data, right?
    But you trained the data and then split it into train and test, which surely does not give an accurate estimate of performance on future predictions.

    Please correct me if I am wrong.

    Thank you.

  • Krish Naik
    November 20, 2020

    Sir, if I use cv=10 the MSE comes out even lower, so how do I choose the cv value appropriately? Does it depend on the dataset?

  • Krish Naik
    November 20, 2020

    Hello Krish,
    can you tell me how you are selecting the alpha (lambda) values?
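On choosing the alpha candidates: a common convention (not stated in the video) is a log-spaced grid, since the useful range of the regularization strength spans several orders of magnitude. A sketch with illustrative data:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=200, n_features=8, noise=10.0, random_state=0)

# Log-spaced candidates from 1e-4 to 1e2; refine around the winner if needed.
alphas = np.logspace(-4, 2, 13)
search = GridSearchCV(Ridge(), {"alpha": alphas},
                      scoring="neg_mean_squared_error", cv=5)
search.fit(X, y)
print(search.best_params_["alpha"])
```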

  • Krish Naik
    November 20, 2020

    Hello Krish, thanks for making this wonderful video. Could you also please make a video on SVM and its underlying aspects like kernelization, etc.?

  • Krish Naik
    November 20, 2020

    Sir, after everything you covered, I still did not understand the parts related to cross-validation. Can you please explain it briefly?

  • Krish Naik
    November 20, 2020

    Hi Krish,
    At 5:58 in the video, you said that this best score helps us find which lambda value is suitable, but my question is how? You referred to those values as alpha values, and alpha as a learning rate should not be a very large number; it should be very small in order to reach the global minimum.
    regards,
    Krunal

  • Krish Naik
    November 20, 2020

    I have a question: what do you mean by "stable" in the last distplot graphs? Both Ridge and Lasso look the same to me. How is one more stable than the other?

  • Krish Naik
    November 20, 2020

    Krish, as always an amazing video. But why did you decide to use cv=5, and how did you come up with the alpha values? Thanks.
