- Jupyter Notebooks on Habrok
- Jupyter Notebooks on a local machine
- Jupyter Notebooks on Google Colab
- ML vs. Traditional Programming
- Key Concepts in Machine Learning
- Types of Machine Learning
- Data Exploration and Preprocessing
- Model Selection, Training, and Validation
- Model Evaluation
- Goals of EDA
- Typical Techniques
- Train-Test Split
- Cross-Validation
- Handling missing values
- Feature scaling
- Encoding categorical variables
- Polynomial Features
- Derived Features
- Discretization
- Introduction to Regression
- Simple Linear Regression
- Assumptions of Simple Linear Regression
- Evaluation Metrics
- Types of Robust Regression
- Huber Robust Regression
- RANSAC Algorithm
- Extension of Simple Linear Regression
- Additional Assumption: Low Multicollinearity
- Cost Functions
- Gradient Descent
- Regularization
- Ridge and Lasso Regularized Regression
- Polynomial Regression
- Exponential Regression
- Logarithmic Regression
- Overfitting and Underfitting
- Bias-Variance Tradeoff
- Hyperparameter Tuning
- Binary vs. Multiclass Classification
- Classification Algorithms
- Decision Boundaries
- Probability vs. Hard Classification
- Output of Classification Models
- Logistic Regression
- Decision Boundaries
- Confusion Matrix
- Accuracy, Precision, Recall, F1 Score
- Precision-Recall Tradeoff and Curve
- ROC Curve and AUC
- The k-NN Algorithm
- Decision Trees
- Information Gain and Gini Index
- Overfitting and Pruning
- Introduction to SVMs
- Linear SVMs
- Nonlinear SVMs
- Kernel Trick
- What are Ensemble Methods?
- Why Use Ensemble Methods?
- Types of Ensemble Methods
- Bagging (Bootstrap Aggregating) Overview
- Random Forests
- Feature Importance in Random Forests
- Boosting Overview
- AdaBoost
- Gradient Boosting
- XGBoost
- Stacking
- Multi-level Ensembles of Ensembles (EoE)
- Dimensionality Reduction
- Clustering
- Anomaly Detection
- Principal Component Analysis (PCA)
- Autoencoders
- Linear Discriminant Analysis (LDA)
- k-Means Clustering
- Hierarchical Clustering
- What are Neural Networks?
- Biological Inspiration
- Structure of a Neural Network
- Activation Functions
- Forward Propagation
- Backpropagation
- Loss Functions
- Gradient Descent
- What is Deep Learning?
- Deep Neural Networks
- Convolutional Neural Networks (CNNs)
- Recurrent Neural Networks (RNNs)
- Autoencoders
- Generative Adversarial Networks (GANs)