aleesha-10/hyperparameter-tuning-visualizer

Hyperparameter Tuning Visualizer

Overview

An interactive Streamlit app that visualizes how Decision Tree hyperparameters affect model accuracy. Users can experiment with hyperparameter values and observe their effect on training and test accuracy in real time.

What Hyperparameters Are

Hyperparameters are settings that control the learning process of a model, such as tree depth or splitting rules. They are set before training and affect model performance and generalization.
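The app's own code isn't shown here, but the idea can be sketched with scikit-learn's `DecisionTreeClassifier` (the dataset and values below are illustrative, not taken from the app):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

# Hyperparameters are fixed *before* fitting; the model then learns its
# internal parameters (the actual split thresholds) from the data.
X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(
    max_depth=3,           # maximum tree depth
    min_samples_split=4,   # samples required to split an internal node
    min_samples_leaf=2,    # samples required at each leaf
    criterion="gini",      # splitting rule
    random_state=0,
)
clf.fit(X, y)
print(clf.get_depth())  # the fitted depth never exceeds max_depth
```

Note that `max_depth` is a cap, not a target: the fitted tree may stop earlier if the other constraints prevent further splits.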

Why Tuning Matters

Proper tuning ensures your model performs well on unseen data. Incorrect hyperparameters can lead to underfitting (too simple) or overfitting (too complex).
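A minimal sketch of both failure modes, assuming a standard train/test split (dataset chosen for illustration only):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A depth-1 "stump" is too simple and underfits both splits.
stump = DecisionTreeClassifier(max_depth=1, random_state=0).fit(X_tr, y_tr)

# An unrestricted tree memorizes the training set and overfits:
# perfect training accuracy, weaker test accuracy.
deep = DecisionTreeClassifier(max_depth=None, random_state=0).fit(X_tr, y_tr)

print(f"stump  train={stump.score(X_tr, y_tr):.2f} test={stump.score(X_te, y_te):.2f}")
print(f"deep   train={deep.score(X_tr, y_tr):.2f} test={deep.score(X_te, y_te):.2f}")
```

The gap between training and test accuracy for the unrestricted tree is the overfitting signal the app visualizes.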

GridSearch vs RandomSearch

- GridSearch: systematically tries every combination of hyperparameters in the grid to find the best set.
- RandomizedSearch: tries random combinations of hyperparameters; faster for large search spaces, but it may miss the absolute best combination.
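The trade-off can be sketched with scikit-learn's `GridSearchCV` and `RandomizedSearchCV` (the grid below is illustrative, not the app's exact search space):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
params = {
    "max_depth": [2, 3, 4, 5],
    "min_samples_leaf": [1, 2, 4],
    "criterion": ["gini", "entropy"],
}

# GridSearch: exhaustively evaluates all 4 * 3 * 2 = 24 combinations.
grid = GridSearchCV(DecisionTreeClassifier(random_state=0), params, cv=5)
grid.fit(X, y)

# RandomizedSearch: samples only n_iter combinations from the same space.
rand = RandomizedSearchCV(DecisionTreeClassifier(random_state=0), params,
                          n_iter=8, cv=5, random_state=0)
rand.fit(X, y)

print("grid:", grid.best_params_, round(grid.best_score_, 3))
print("rand:", rand.best_params_, round(rand.best_score_, 3))
```

Because the randomized search evaluates a subset of the same grid, its best score can never exceed the exhaustive search's, which is exactly the speed-vs-optimality trade-off described above.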

How Students Can Use It to Learn

1. Adjust sliders and dropdowns for the hyperparameters.
2. Observe changes in baseline accuracy and the depth vs accuracy plots.
3. Compare GridSearch and RandomizedSearch results.
4. Build visual intuition for overfitting and model complexity.
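The depth vs accuracy curve students explore can be reproduced outside the app with a short loop (a sketch; the app's actual dataset and depth range may differ):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train/test accuracy at each candidate depth: the data behind
# a depth-vs-accuracy plot.
rows = []
for depth in range(1, 11):
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0).fit(X_tr, y_tr)
    rows.append((depth, tree.score(X_tr, y_tr), tree.score(X_te, y_te)))

for depth, train_acc, test_acc in rows:
    print(f"depth={depth:2d}  train={train_acc:.3f}  test={test_acc:.3f}")
```

Training accuracy climbs toward 1.0 as depth grows, while test accuracy plateaus or drops; the region where the two curves diverge is the overfitting zone the app highlights.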

Features

- Interactive sidebar for max_depth, min_samples_split, min_samples_leaf, and criterion
- Live depth vs accuracy plots (train vs test)
- Highlighting of the selected depth and the overfitting zone
- Export of tables and plots (CSV, PNG)
- Educational notes on hyperparameters and overfitting

Command to Run the App

```
streamlit run app.py
```

