mlcourse.ai is an open Machine Learning course by OpenDataScience (ods.ai), led by Yury Kashnitsky (yorko). Holding both a Ph.D. in applied math and the Kaggle Competitions Master tier, Yury aimed to design an ML course with a good balance between theory and practice. Thus, you go through math formulae in lectures and practice with Kaggle Inclass competitions. Currently, the course is in a self-paced mode. Check out the thorough Roadmap guiding you through the self-paced mlcourse.ai. The Russian version of the course has been resurrected and is led by Petr Ermakov. If you speak Russian, please refer to this Open ML course launching on Feb 1st, 2022.
Bonus: you can additionally purchase a Bonus Assignments pack with the best non-demo versions of mlcourse.ai assignments. Select the "Bonus Assignments" tier. Refer to the details of the deal on the main mlcourse.ai page.
Mirrors (🇬🇧 only): mlcourse.ai (main site), Kaggle Dataset (the same notebooks as the Kaggle Notebooks)
You are guided through 10 weeks of mlcourse.ai. For each week, from Pandas to Gradient Boosting, instructions are given on which articles to read, which lectures to watch, and which assignments to complete.
Below is the list of published articles: on medium.com 🇬🇧 and habr.com 🇷🇺. Notebooks in Chinese 🇨🇳 and links to Kaggle Notebooks (in English) are also given. Icons are clickable.
- Exploratory Data Analysis with Pandas 🇬🇧 🇷🇺 🇨🇳, Kaggle Notebook (a minimal Pandas sketch follows this list)
- Visual Data Analysis with Python 🇬🇧 🇷🇺 🇨🇳, Kaggle Notebooks: part1, part2
- Classification, Decision Trees and k Nearest Neighbors 🇬🇧 🇷🇺 🇨🇳, Kaggle Notebook
- Linear Classification and Regression 🇬🇧 🇷🇺 🇨🇳, Kaggle Notebooks: part1, part2, part3, part4, part5
- Bagging and Random Forest 🇬🇧 🇷🇺 🇨🇳, Kaggle Notebooks: part1, part2, part3
- Feature Engineering and Feature Selection 🇬🇧 🇷🇺 🇨🇳, Kaggle Notebook
- Unsupervised Learning: Principal Component Analysis and Clustering 🇬🇧 🇷🇺 🇨🇳, Kaggle Notebook
- Vowpal Wabbit: Learning with Gigabytes of Data 🇬🇧 🇷🇺 🇨🇳, Kaggle Notebook
- Time Series Analysis with Python, part 1 🇬🇧 🇷🇺 🇨🇳. Predicting the future with Facebook Prophet, part 2 🇬🇧 🇨🇳, Kaggle Notebooks: part1, part2
- Gradient Boosting 🇬🇧 🇷🇺 🇨🇳, Kaggle Notebook
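To give a taste of the very first topic above, here is a minimal Pandas sketch of the kind of exploratory analysis the course starts with. The file name and column names are illustrative assumptions, not a prescribed course dataset:

```python
import pandas as pd

# Hypothetical CSV; substitute any tabular dataset at hand
df = pd.read_csv("telecom_churn.csv")

print(df.shape)       # (n_rows, n_columns)
print(df.head())      # first 5 rows
print(df.describe())  # summary statistics for numeric columns

# Group-by aggregation: mean of a numeric feature per target class
# ("churn" and "total_day_minutes" are illustrative column names)
print(df.groupby("churn")["total_day_minutes"].mean())
```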
Video lectures are uploaded to this YouTube playlist.
- Introduction, video, slides
- Exploratory data analysis with Pandas, video
- Visualization, main plots for EDA, video
- Decision trees: theory and practical part
- Logistic regression: theoretical foundations, practical part (baselines in the "Alice" competition)
- Ensembles and Random Forest – part 1. Classification metrics – part 2. An example of a business task, predicting a customer payment – part 3
- Linear regression and regularization – theory; LASSO & Ridge, LTV prediction – practice
- Unsupervised learning – Principal Component Analysis and Clustering
- Stochastic Gradient Descent for classification and regression – part 1, part 2 TBA
- Time series analysis with Python (ARIMA, Prophet) – video
- Gradient boosting: basic ideas – part 1; key ideas behind XGBoost, LightGBM, and CatBoost + practice – part 2 (a minimal scikit-learn sketch follows this list)
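As a taste of the final topic, here is a minimal sketch of fitting a gradient boosting model with scikit-learn. The synthetic data and hyperparameters are illustrative, not the course's competition setup:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic binary classification data, for illustration only
X, y = make_classification(n_samples=1000, n_features=20, random_state=17)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=17
)

# Typical starting hyperparameters: 100 shallow trees, modest learning rate
clf = GradientBoostingClassifier(
    n_estimators=100, learning_rate=0.1, max_depth=3, random_state=17
)
clf.fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))
```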
The following are demo assignments. Additionally, within the "Bonus Assignments" tier, you can get access to non-demo assignments.
- Exploratory data analysis with Pandas, nbviewer, Kaggle Notebook, solution
- Analyzing cardiovascular disease data, nbviewer, Kaggle Notebook, solution
- Decision trees with a toy task and the UCI Adult dataset, nbviewer, Kaggle Notebook, solution
- Sarcasm detection, Kaggle Notebook, solution. Linear Regression as an optimization problem, nbviewer, Kaggle Notebook
- Logistic Regression and Random Forest in the credit scoring problem, nbviewer, Kaggle Notebook, solution
- Exploring OLS, Lasso and Random Forest in a regression task, nbviewer, Kaggle Notebook, solution
- Unsupervised learning, nbviewer, Kaggle Notebook, solution
- Implementing online regressor, nbviewer, Kaggle Notebook, solution
- Time series analysis, nbviewer, Kaggle Notebook, solution
- Beating baseline in a competition, Kaggle Notebook
Two Kaggle Inclass competitions are held during the course:
- Catch Me If You Can: Intruder Detection through Webpage Session Tracking. Kaggle Inclass
- DotA 2 winner prediction. Kaggle Inclass
The course originated in OpenDataScience, a community that back in 2017 was (and still is) mostly Russian-speaking. Hence, the first couple of sessions were held in Russian. Later, the course was launched in English and resulted in what you see here and on mlcourse.ai.
If you happen to cite mlcourse.ai in your work, you can use this BibTeX record:
```
@misc{mlcourse_ai,
  author = {Kashnitsky, Yury},
  title = {mlcourse.ai – Open Machine Learning Course},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/Yorko/mlcourse.ai}},
}
```
Discussions are held in the #mlcourse_ai channel of the OpenDataScience (ods.ai) Slack team.
The course is free, but you can support the organizers by making a pledge on Patreon (monthly support) or a one-time payment on Ko-fi.