Bensemble is a comprehensive comparative study and a production-ready library for Bayesian Deep Learning. It integrates established methods for neural network ensembling and uncertainty quantification under a unified PyTorch interface.
| Resource | Description |
|---|---|
| 📘 Documentation | Full API reference and user guides. |
| 📝 Tech Report | In-depth technical details and theoretical background. |
| ✍️ Blog Post | Summary of the project and motivation. |
| 📊 Benchmarks | Comparison of methods on standard datasets. |
- Unified API: All methods share a consistent `fit`/`predict` interface (Scikit-learn style).
- Core Bayesian Methods: Implements canonical algorithms, from Variational Inference to scalable Laplace approximations.
- Modern Stack: Built with `uv`, fully typed, and tested with 98% code coverage.
You can install bensemble using pip:

```bash
pip install bensemble
```

Or, if you prefer using uv for lightning-fast installation:

```bash
uv pip install bensemble
```

If you want to contribute to bensemble or run tests, we recommend using uv to manage the environment.
```bash
# 1. Clone the repository
git clone https://github.com/intsystems/bensemble.git
cd bensemble

# 2. Create and activate a virtual environment via uv
uv venv
source .venv/bin/activate  # on Windows: .venv\Scripts\activate

# 3. Install in editable mode with dev dependencies
uv pip install -e ".[dev]"
```

Here is how to turn a standard PyTorch model into a Bayesian one using Variational Inference:
```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from bensemble import VariationalEnsemble
# 0. Prepare dummy data
X_train = torch.randn(100, 10)
y_train = torch.randn(100, 1)
dataset = TensorDataset(X_train, y_train)
train_loader = DataLoader(dataset, batch_size=32, shuffle=True)
# 1. Define your standard PyTorch model
model = nn.Sequential(nn.Linear(10, 50), nn.ReLU(), nn.Linear(50, 1))
# 2. Wrap it with Bensemble
# auto_convert=True automatically replaces Linear layers with BayesianLinear
ensemble = VariationalEnsemble(model, auto_convert=True)
# 3. Train (Fit)
history = ensemble.fit(train_loader, epochs=50)
# 4. Predict with Uncertainty
# Returns mean prediction and standard deviation (uncertainty)
X_test = torch.randn(5, 10)
mean, std = ensemble.predict(X_test, n_samples=100)
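# (Illustrative sketch, not part of the bensemble API): assuming a roughly
# Gaussian predictive distribution, a 95% predictive interval can be formed
# from the Monte Carlo mean and standard deviation returned above.
lower, upper = mean - 1.96 * std, mean + 1.96 * std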
print(f"Prediction: {mean[0].item():.4f} ± {std[0].item():.4f}")We have implemented four distinct approaches. Check out the interactive demos for each:
| Method | Description | Demo |
|---|---|---|
| Variational Inference | Uses "Bayes By Backprop" with Local Reparameterization Trick. | Open Notebook |
| Laplace Approximation | Fits a Gaussian around the MAP estimate using Kronecker-Factored Curvature (K-FAC). | Open Notebook |
| Variational Rényi | Generalization of VI minimizing the Rényi α-divergence. | Open Notebook |
| Probabilistic Backprop | Propagates moments through the network using Assumed Density Filtering (ADF). | Open Notebook |
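To make the Variational Inference entry more concrete, here is a minimal, self-contained sketch of a mean-field Gaussian linear layer sampled with the Local Reparameterization Trick. It is written in plain PyTorch for illustration only; the class and parameter names are hypothetical and do not reflect bensemble's internal `BayesianLinear` implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LRTLinear(nn.Module):
    """Toy linear layer with a factorized Gaussian weight posterior, sampled
    via the Local Reparameterization Trick: instead of sampling weights,
    sample the (also Gaussian) pre-activations directly."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Variational parameters: mean and (softplus-parameterized) std of the weights
        self.weight_mu = nn.Parameter(0.1 * torch.randn(out_features, in_features))
        self.weight_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weight_sigma = F.softplus(self.weight_rho)
        # Mean and variance of the pre-activations under the weight posterior
        act_mu = F.linear(x, self.weight_mu, self.bias)
        act_var = F.linear(x.pow(2), weight_sigma.pow(2))
        # One noise draw per data point gives lower-variance gradients than
        # sampling a single weight matrix for the whole batch
        return act_mu + act_var.clamp_min(1e-12).sqrt() * torch.randn_like(act_mu)

# Example: drop-in replacement for nn.Linear in a small regression network
layer = LRTLinear(10, 50)
out = layer(torch.randn(32, 10))  # each forward pass is a fresh posterior sample
```

In Bayes By Backprop, such a layer would additionally contribute a KL term between the weight posterior and the prior to the training objective (the ELBO), alongside the data-fit term.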
The library is covered by a comprehensive test suite to ensure reliability.

```bash
pytest tests/
```

We use ruff to keep code clean:
```bash
ruff check .
ruff format .
```

Developed by:
This project is licensed under the MIT License - see the LICENSE file for details.