Contents:
- Machine Learning
- Practical Introduction
- Training Models 1
- Training Models 2
- Ensemble Methods
- Dimensionality Reduction
-
Practical Introduction Contains basic information about approaches to building machine learning models.
-
Training Models 1 Contains practical approaches to the following training models (a short illustrative sketch follows the list):
- Contents
  - Linear Regression
    - Normal Equation
    - Gradient Descent
  - Regularized Models
  - Logistic Regression
  - Decision Trees
    - Classification
    - Regression
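A minimal NumPy sketch of two of the approaches listed above, the Normal Equation and batch Gradient Descent for Linear Regression (illustrative only; the toy data and variable names are assumptions, not code from the notebook):

```python
import numpy as np

# Toy data: y = 4 + 3x + Gaussian noise
rng = np.random.default_rng(42)
X = 2 * rng.random((100, 1))
y = 4 + 3 * X + rng.normal(size=(100, 1))

X_b = np.c_[np.ones((100, 1)), X]  # add a bias (intercept) column

# Normal Equation: theta = (X^T X)^(-1) X^T y
theta_normal = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y

# Batch Gradient Descent on the MSE cost
eta, n_epochs = 0.1, 1000
theta_gd = rng.normal(size=(2, 1))
for _ in range(n_epochs):
    gradients = 2 / len(X_b) * X_b.T @ (X_b @ theta_gd - y)
    theta_gd -= eta * gradients

print(theta_normal.ravel(), theta_gd.ravel())  # both should be close to [4, 3]
```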
-
Training Models 2 Contains practical approaches to additional training models.
-
Ensemble Methods Contains notes and explanations on the following ensemble methods (a short illustrative sketch follows the list):
- Contents
  - Voting Classifier
  - Bagging vs Pasting
  - Random Patches and Random Subspaces
  - Random Forests
    - Feature Importance
  - Boosting
    - AdaBoost
    - Gradient Boosting
  - Stacking
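A minimal scikit-learn sketch of two of the methods listed above, a hard Voting Classifier and Random Forest feature importances (illustrative only; the Iris dataset and parameter choices are assumptions, not code from the notebook):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Hard voting: majority vote over three different classifiers
voting_clf = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(random_state=42)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
    ],
    voting="hard",
)
voting_clf.fit(X_train, y_train)
print("voting accuracy:", voting_clf.score(X_test, y_test))

# Feature importances estimated by a Random Forest
rf = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_train, y_train)
print("feature importances:", rf.feature_importances_)
```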
-
Dimensionality Reduction Contains approaches to reduce the dimensionality of data before training a model on it.
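For example, PCA is one common dimensionality-reduction approach (sketched here with scikit-learn purely as an illustration; the notebook may cover other techniques as well):

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X, _ = load_digits(return_X_y=True)   # 64-dimensional digit images
pca = PCA(n_components=0.95)          # keep 95% of the variance
X_reduced = pca.fit_transform(X)
print(X.shape, "->", X_reduced.shape) # 64 features reduced to far fewer
```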
-
Unsupervised Learning Contains unsupervised learning algorithms.
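As one illustrative example of an unsupervised algorithm, a k-means clustering sketch with scikit-learn (used only as an illustration; not necessarily the algorithms covered in the notebook):

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic data with three well-separated clusters
X, _ = make_blobs(n_samples=300, centers=3, random_state=42)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=42).fit(X)
print(kmeans.cluster_centers_)  # learned cluster centers
print(kmeans.labels_[:10])      # cluster assignments for the first samples
```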
-
The theory folder contains theory about Machine Learning.
-
Introduction contains in-depth statistical theory about machine learning.
-
Regression contains theory about simple and multiple linear regression.
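For reference, the least-squares estimates for simple linear regression $y = \beta_0 + \beta_1 x + \varepsilon$ (a standard result; the notation here may differ from the notes):

$$\hat\beta_1 = \frac{\sum_{i=1}^{n}(x_i - \bar x)(y_i - \bar y)}{\sum_{i=1}^{n}(x_i - \bar x)^2}, \qquad \hat\beta_0 = \bar y - \hat\beta_1 \bar x$$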
-
Data Visualisation Contains introductory practical insights on plotting with Seaborn. (Reference: Kaggle Micro-Course on Data Visualisation.)
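A minimal Seaborn example of the kind of plot covered there (illustrative only, using Seaborn's built-in `tips` dataset; not taken from the notebook):

```python
import matplotlib.pyplot as plt
import seaborn as sns

tips = sns.load_dataset("tips")  # built-in example dataset
sns.scatterplot(data=tips, x="total_bill", y="tip", hue="time")
plt.title("Tip vs. total bill")
plt.show()
```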
-
S1Regresssion is an example of how to apply linear regression to a dataset. The analysis is dangerously incomplete as of now (10/10/19).
-
tf_introduction is a guide to basic operations in TensorFlow.
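A minimal sketch of the kind of basic TensorFlow operations covered there (this assumes TensorFlow 2.x with eager execution; the notebook itself may target a different version):

```python
import tensorflow as tf

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 0.0], [0.0, 1.0]])

print(tf.add(a, b))      # element-wise addition
print(tf.matmul(a, b))   # matrix multiplication
print(tf.reduce_sum(a))  # sum of all elements
```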
-
Essential Statistics and Probability is a guide to the essentials of statistics and probability required for data science and engineering.
Please use the rendered HTML files directly from the bin/ folder to avoid any malfunctions.
Please use the commit.sh script to commit changes and then push to the remote, in order to maintain a common commit-message format.
The conda.sh script sets up the environment required to run the code.