lmplot – Seaborn





We go over the entirety of seaborn's lmplot. We talk about facet grids and doing conditional linear regression, and we cover logistic, log-transformed, and lowess regression. This one was a big one and a lot of fun; hope you enjoyed it!
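A minimal sketch of the kinds of lmplot calls discussed here, using the Anscombe quartet that ships with seaborn (the column names "dataset", "x", and "y" come from seaborn's bundled dataset; the logistic example uses the tips dataset as an illustration and is an assumption, not necessarily the video's exact example; lowess and logistic fits require statsmodels):

```python
import seaborn as sns
import matplotlib.pyplot as plt

anscombe = sns.load_dataset("anscombe")

# Conditional linear regression: one facet per member of the quartet.
sns.lmplot(data=anscombe, x="x", y="y", col="dataset", col_wrap=2, ci=None)

# Lowess (locally weighted) regression instead of an ordinary least-squares fit.
sns.lmplot(data=anscombe, x="x", y="y", col="dataset", col_wrap=2, lowess=True)

# Logistic regression needs a binary response, so we derive one from tips.
tips = sns.load_dataset("tips")
tips["big_tip"] = (tips["tip"] / tips["total_bill"]) > 0.15
sns.lmplot(data=tips, x="total_bill", y="big_tip", logistic=True, y_jitter=0.03)

plt.show()
```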

Associated Github Commit:
https://github.com/knathanieltucker/seaborn-weird-parts/commit/558ea779b08936f069985e194285e7c57d001ebf

Associated Seaborn Documentation:
http://seaborn.pydata.org/generated/seaborn.lmplot.html#seaborn.lmplot
http://seaborn.pydata.org/tutorial/regression.html

Dataset Link:
https://en.wikipedia.org/wiki/Anscombe's_quartet

Lowess Link:
https://en.wikipedia.org/wiki/Local_regression

Source



Comment List

  • Data Talks
    November 29, 2020

    Things I learned from this lecture:
    Regression (linear model) plots – 1
    * sns.lmplot()

  • Data Talks
    November 29, 2020

    Thank you this was very helpful

  • Data Talks
    November 29, 2020

    Thanks for the video! It is really amazing!

  • Data Talks
    November 29, 2020

    Hey Nathaniel, great videos! I'm pretty new to Python so I'm still figuring a bunch of stuff out… but is there a way to display the R-squared value and the equation of the regression line?
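    lmplot draws the fit but does not report its parameters, so one way (a sketch, not the video's method) is to compute the slope, intercept, and R-squared separately with scipy.stats.linregress on the same columns being plotted:

    ```python
    import seaborn as sns
    from scipy import stats

    anscombe = sns.load_dataset("anscombe")
    subset = anscombe[anscombe["dataset"] == "I"]  # one facet's data

    # Fit the same simple linear regression that lmplot would draw.
    result = stats.linregress(subset["x"], subset["y"])
    print(f"y = {result.slope:.2f} * x + {result.intercept:.2f}")
    print(f"R-squared = {result.rvalue ** 2:.3f}")
    ```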
