Hyperparameter Tuning Visualizer
An interactive Streamlit app that visualizes how Decision Tree hyperparameters affect model accuracy: adjust the hyperparameters and watch training and test accuracy respond in real time.
Hyperparameters are settings that control the learning process of a model, such as tree depth or splitting rules. They are set before training and affect model performance and generalization.
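A minimal sketch of this idea, assuming the scikit-learn API: the hyperparameters (here `max_depth`, `min_samples_split`, `criterion`) are fixed before training, then the tree is fit to data. The iris dataset and the specific values are illustrative, not the app's actual defaults.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Hyperparameters are chosen up front, not learned from the data.
clf = DecisionTreeClassifier(max_depth=3, min_samples_split=4, criterion="gini")
clf.fit(X_train, y_train)

test_acc = clf.score(X_test, y_test)  # generalization depends on these settings
```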
Proper tuning helps the model generalize to unseen data. Poorly chosen hyperparameters lead to underfitting (a model too simple to capture the pattern) or overfitting (a model so complex it memorizes the training data).
- GridSearch: systematically tries every combination of hyperparameters to find the best set.
- RandomizedSearch: tries random combinations of hyperparameters; faster for large search spaces, but may miss the single best combination.
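The two search strategies above can be compared in a short sketch, assuming scikit-learn's `GridSearchCV` and `RandomizedSearchCV` (the parameter grid is illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
param_grid = {"max_depth": [2, 3, 4, 5], "min_samples_leaf": [1, 2, 4]}

# GridSearch: exhaustively evaluates all 4 * 3 = 12 combinations.
grid = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
grid.fit(X, y)

# RandomizedSearch: samples only n_iter combinations from the same space.
rand = RandomizedSearchCV(
    DecisionTreeClassifier(random_state=0), param_grid,
    n_iter=5, cv=5, random_state=0
)
rand.fit(X, y)
```

Both expose `best_params_` and `best_score_`; the trade-off is the 12 fits of the full grid versus the 5 sampled fits of the randomized search.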
1. Adjust sliders and dropdowns for the hyperparameters.
2. Observe changes in baseline accuracy and the depth vs accuracy plots.
3. Compare GridSearch and RandomizedSearch results.
4. Understand overfitting and model complexity visually.
Features:
- Interactive sidebar for max_depth, min_samples_split, min_samples_leaf, and criterion
- Live depth vs accuracy plots (train vs test)
- Highlighted selected depth and overfitting zone
- Export of tables and plots (CSV, PNG)
- Educational notes on hyperparameters and overfitting
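The data behind the depth vs accuracy plot can be sketched as follows (an assumed, simplified stand-in for the app's code, using scikit-learn and the iris dataset): train one tree per depth and record both accuracies.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

depths = range(1, 11)
train_acc, test_acc = [], []
for d in depths:
    clf = DecisionTreeClassifier(max_depth=d, random_state=0)
    clf.fit(X_train, y_train)
    train_acc.append(clf.score(X_train, y_train))
    test_acc.append(clf.score(X_test, y_test))

# A widening gap between train_acc and test_acc as depth grows
# marks the overfitting zone highlighted in the plot.
```

Plotting `train_acc` and `test_acc` against `depths` reproduces the train-vs-test curves the app renders.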
Run the app with: streamlit run app.py