😎 Bensemble: Bayesian Multimodeling Project


Bensemble is a comprehensive comparative study and a production-ready library for Bayesian Deep Learning. It integrates established methods for neural network ensembling and uncertainty quantification under a unified PyTorch interface.


Key Resources

  • 📘 Documentation: full API reference and user guides.
  • 📝 Tech Report: in-depth technical details and theoretical background.
  • ✍️ Blog Post: summary of the project and its motivation.
  • 📊 Benchmarks: comparison of methods on standard datasets.

Features

  • Unified API: All methods share a consistent fit / predict interface (Scikit-learn style); see the sketch after this list.
  • Core Bayesian Methods: Implements canonical algorithms from Variational Inference to Scalable Laplace approximations.
  • Modern Stack: Built with uv, fully typed, and tested with 98% code coverage.
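
For illustration only, the shared interface can be sketched as a typing.Protocol. This is not part of bensemble's public API: the method names and arguments (fit, predict, epochs, n_samples) are taken from the Quick Start example below, and the return types are assumptions.

from typing import Protocol, Tuple

import torch
from torch.utils.data import DataLoader


class BayesianEnsembleLike(Protocol):
    """Illustrative sketch of the shared fit / predict interface, not the actual API."""

    def fit(self, train_loader: DataLoader, epochs: int = 10) -> dict:
        """Train the wrapped model and return a training history."""
        ...

    def predict(self, X: torch.Tensor, n_samples: int = 100) -> Tuple[torch.Tensor, torch.Tensor]:
        """Return the predictive mean and standard deviation."""
        ...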

Installation

You can install bensemble using pip:

pip install bensemble

Or, if you prefer using uv for lightning-fast installation:

uv pip install bensemble

Development Setup

If you want to contribute to bensemble or run tests, we recommend using uv to manage the environment.

# 1. Clone the repository
git clone https://github.com/intsystems/bensemble.git
cd bensemble

# 2. Create and activate virtual environment via uv
uv venv
source .venv/bin/activate  # on Windows: .venv\Scripts\activate

# 3. Install in editable mode with dev dependencies
uv pip install -e ".[dev]"

Quick Start

Here is how to turn a standard PyTorch model into a Bayesian one using Variational Inference:

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

from bensemble import VariationalEnsemble

# 0. Prepare dummy data
X_train = torch.randn(100, 10)
y_train = torch.randn(100, 1)
dataset = TensorDataset(X_train, y_train)
train_loader = DataLoader(dataset, batch_size=32, shuffle=True)

# 1. Define your standard PyTorch model
model = nn.Sequential(nn.Linear(10, 50), nn.ReLU(), nn.Linear(50, 1))

# 2. Wrap it with Bensemble
# auto_convert=True automatically replaces Linear layers with BayesianLinear
ensemble = VariationalEnsemble(model, auto_convert=True)

# 3. Train (Fit)
history = ensemble.fit(train_loader, epochs=50)

# 4. Predict with Uncertainty
# Returns mean prediction and standard deviation (uncertainty)
X_test = torch.randn(5, 10)
mean, std = ensemble.predict(X_test, n_samples=100)

print(f"Prediction: {mean[0].item():.4f} ± {std[0].item():.4f}")

Algorithms & Demos

We have implemented four distinct approaches. Check out the interactive demos for each:

  • Variational Inference: uses "Bayes by Backprop" with the Local Reparameterization Trick (see the sketch after this list). Demo: Open Notebook
  • Laplace Approximation: fits a Gaussian around the MAP estimate using Kronecker-Factored Curvature (K-FAC). Demo: Open Notebook
  • Variational Rényi: a generalization of VI that minimizes the $\alpha$-divergence (Rényi). Demo: Open Notebook
  • Probabilistic Backprop: propagates moments through the network using Assumed Density Filtering (ADF). Demo: Open Notebook
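
As a rough illustration of the first method above, here is a minimal plain-PyTorch sketch of a mean-field Bayesian linear layer that uses the local reparameterization trick (sampling pre-activations instead of weights). It is not bensemble's implementation, and every name in it is invented for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalReparamLinear(nn.Module):
    """Illustrative mean-field Bayesian linear layer (not the library's code)."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Variational posterior over weights: per-weight mean and log-variance
        self.weight_mu = nn.Parameter(0.01 * torch.randn(out_features, in_features))
        self.weight_logvar = nn.Parameter(torch.full((out_features, in_features), -6.0))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Local reparameterization: sample the pre-activations directly,
        # which gives lower-variance gradients than sampling the weights.
        act_mean = F.linear(x, self.weight_mu, self.bias)
        act_var = F.linear(x.pow(2), self.weight_logvar.exp())
        eps = torch.randn_like(act_mean)
        return act_mean + act_var.clamp_min(1e-8).sqrt() * eps

Averaging several stochastic forward passes through such layers gives a Monte Carlo estimate of the predictive mean and spread, which is what predict(..., n_samples=...) exposes at the ensemble level.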

Development & Testing

The library is covered by a comprehensive test suite to ensure reliability.

Run Tests

pytest tests/

Linting

We use ruff to keep code clean:

ruff check .
ruff format .

Authors

An educational project on Bayesian multimodeling by first-year master's students of the 2025/26 academic year. Developed by:

  • Соболевский Федор
  • Набиев Мухаммадшариф
  • Василенко Дмитрий
  • Касюк Вадим

License

This project is licensed under the MIT License - see the LICENSE file for details.
