"There is a ceiling above standard Deep Learning that no one saw. Versor opens the door above it."
Versor replaces standard matrix multiplications with Geometric Algebra (Rotor) operations to preserve the topological structure of data.
| Benchmark | Metric | Performance | Note |
|---|---|---|---|
| QM9 (Molecular) | MAE | 7.64 meV | Trained < 1 hour on single 4090 |
| QM9 (Inference) | Latency | 5.8 ms / molecule | Real-time on CPU (M4) |
| Motion (UCI-HAR) | Accuracy | ~100% | Grade Purity 0.9957 |
| Semantic (BERT) | Purity | 100% | |
| Architecture | Params | Lightweight | |
Versor is a PyTorch framework for Geometric Algebra Deep Learning. It provides the building blocks for the Geometric Blade Network (GBN) and Multi-Rotor GBN — model architectures that go beyond unconstrained linear transformations, using pure, manifold-aligned geometric rotations via Clifford Algebra and Rotors.
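A quick way to see the distinction (a self-contained PyTorch sketch, not Versor code): a rotation is an isometry, so it preserves the pairwise distances that encode a dataset's shape, whereas an unconstrained weight matrix generally does not.

```python
import torch

# Toy contrast between an unconstrained linear map and a pure rotation on 3D points.
# Rotations preserve pairwise distances (the "shape" of the data); a generic
# weight matrix warps them.
torch.manual_seed(0)
x = torch.randn(128, 3)                      # a cloud of 3D points

W = torch.randn(3, 3)                        # unconstrained "weight matrix"
Q, _ = torch.linalg.qr(torch.randn(3, 3))    # a random orthogonal map (rotation/reflection)

def pairwise(points):
    return torch.cdist(points, points)

print(torch.allclose(pairwise(x), pairwise(x @ Q.T), atol=1e-5))   # True: isometry
print(torch.allclose(pairwise(x), pairwise(x @ W.T), atol=1e-5))   # False: distorted
```

Versor's rotor layers turn this isometry property into a structural guarantee rather than something to regularize for.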
Key features:

- Rotors: pure, structure-preserving geometric rotations as the core learnable transformation, in place of unconstrained matrix multiplications.
- Metric-Agnostic Kernel: Supports Euclidean $Cl(p, 0)$, Minkowski/Hyperbolic $Cl(p, q)$, and Projective algebras out of the box.
- Geometric Layers: `RotorLayer`, `MultiRotorLayer`, `CliffordLinear`, `CliffordGraphConv`, `CliffordLayerNorm`.
- GBN Architectures: `GeometricBladeNetwork` (single-rotor GBN), `MultiRotorModel` (multi-rotor GBN), `MoleculeGNN`/`MultiRotorQuantumNet` (graph GBN), `MotionManifoldNetwork` (alignment GBN).
- Novel Activations: `GeometricGELU` (magnitude-based), `GradeSwish` (per-grade gating).
- Automatic Metric Search: Finds the optimal $(p, q)$ signature based on data topology.
- Geometric Sparsity: `prune_bivectors` for compression of geometric layers (a conceptual sketch follows this list).
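As a rough illustration of the sparsity idea above (a conceptual sketch; the actual `prune_bivectors` signature may differ), magnitude pruning can zero out the weakest bivector coefficients so only the dominant rotation planes remain:

```python
import torch

# Conceptual bivector pruning: keep the largest-magnitude coefficients of each
# learned bivector and zero the rest, so a trained geometric layer retains only
# its dominant rotation planes.
def prune_smallest(bivectors: torch.Tensor, keep_ratio: float = 0.5) -> torch.Tensor:
    flat = bivectors.abs().flatten()
    k = max(1, int(keep_ratio * flat.numel()))
    threshold = flat.topk(k).values.min()            # k-th largest magnitude
    return torch.where(bivectors.abs() >= threshold, bivectors, torch.zeros_like(bivectors))

B = torch.randn(4, 3)                 # e.g. 4 channels, 3 bivector planes each (Cl(3,0))
print(prune_smallest(B, keep_ratio=0.5))
```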
Standard deep learning models are black boxes of millions of uninterpretable scalars. In Versor, every learnable parameter is a Bivector, which has a direct geometric meaning (a specific plane of rotation). This transparency offers a path to Interpretability by Design, where the model's internal reasoning can be visualized as clear geometric transformations rather than abstract weights.
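To make that concrete (an illustration in plain PyTorch, not Versor's parameterization): in $Cl(3, 0)$ a bivector has three coefficients, one per coordinate plane, and its exponential is a rotation in exactly that plane, so a learned parameter can be read directly as "which plane, how far".

```python
import torch

# Illustration only: a 3D bivector (b23, b31, b12), written here as its dual
# axis-angle vector, maps through the exponential to a rotation in that plane.
def rotor_matrix_from_bivector(b: torch.Tensor) -> torch.Tensor:
    theta = torch.linalg.norm(b)                   # rotation angle = bivector magnitude
    if theta < 1e-9:
        return torch.eye(3)
    k = b / theta                                  # unit rotation axis (dual of the plane)
    K = torch.tensor([[0., -k[2], k[1]],
                      [k[2], 0., -k[0]],
                      [-k[1], k[0], 0.]])
    # Rodrigues' formula: the matrix exponential of theta * K.
    return torch.eye(3) + torch.sin(theta) * K + (1 - torch.cos(theta)) * (K @ K)

b = torch.tensor([0.0, 0.0, torch.pi / 2])         # pure e12 component: a 90 degree turn in the x-y plane
R = rotor_matrix_from_bivector(b)
print(R @ torch.tensor([1., 0., 0.]))              # ~ (0, 1, 0): x rotated onto y
```

Reading the parameter back is the interpretability claim in miniature: the plane is the bivector's support, the angle is its magnitude.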
Computer Vision, NLP, and Physics-ML currently rely on fragmented architectures. Versor's Metric-Agnostic Kernel unifies these domains under a single mathematical framework. By simply changing the signature $(p, q)$, the same layers operate in:
- 3D Euclidean Geometry (Robotics, Molecules)
- Minkowski Spacetime (Relativistic Physics)
- High-Dimensional Manifolds (LLM Latent Spaces)

This represents a step toward a General Geometric Intelligence that transcends specific domains.
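A minimal sketch of what "changing the signature" means in practice (illustrative only, not the library's kernel): the generating vectors carry a diagonal metric with $p$ entries of $+1$ and $q$ entries of $-1$, and everything built on top reuses the same code.

```python
import torch

# The only thing that differs between Euclidean and Minkowski flavours of Cl(p, q)
# is the diagonal metric on the generating vectors.
def diagonal_metric(p: int, q: int) -> torch.Tensor:
    return torch.diag(torch.cat([torch.ones(p), -torch.ones(q)]))

def inner(u: torch.Tensor, v: torch.Tensor, p: int, q: int) -> torch.Tensor:
    return u @ diagonal_metric(p, q) @ v

e0 = torch.tensor([1.0, 0.0, 0.0, 0.0])
print(inner(e0, e0, p=4, q=0).item())   #  1.0 -> Euclidean Cl(4,0)
print(inner(e0, e0, p=1, q=3).item())   #  1.0 -> time-like direction in Minkowski Cl(1,3)

x1 = torch.tensor([0.0, 1.0, 0.0, 0.0])
print(inner(x1, x1, p=1, q=3).item())   # -1.0 -> space-like direction in Minkowski Cl(1,3)
```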
Versor requires Python 3.9+ and PyTorch.
```bash
# Clone the repository
git clone https://github.com/Concode0/Versor.git
cd Versor
# Install core dependencies
uv sync
# Install with optional dependency groups
uv sync --extra viz # matplotlib, seaborn, scikit-learn, plotly, imageio
uv sync --extra examples # transformers, pillow, scikit-learn, matplotlib
uv sync --extra graph # torch-geometric (for molecular GNN tasks)
uv sync --extra demo # streamlit, plotly
uv sync --extra all       # everything
```

```python
import torch
from core.algebra import CliffordAlgebra
from layers.rotor import RotorLayer
from layers.linear import CliffordLinear
from functional.activation import GeometricGELU
# Create a 3D Euclidean Clifford Algebra
algebra = CliffordAlgebra(p=3, q=0)
# Build a model with geometric layers
rotor = RotorLayer(algebra, channels=4)
linear = CliffordLinear(algebra, in_channels=4, out_channels=8)
activation = GeometricGELU(algebra, channels=8)
# Input: [Batch, Channels, 2^n] multivectors
x = torch.randn(32, 4, algebra.dim)
out = activation(linear(rotor(x)))
```

Versor uses Hydra for configuration management:
```bash
# Run a task
uv run main.py task=qm9 training.epochs=100
uv run main.py task=motion training.epochs=100
uv run main.py task=semantic training.epochs=200
# Override parameters
uv run main.py task=qm9 algebra.device=cuda training.lr=0.001
```
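For orientation, a Hydra entry point might be wired like the hypothetical sketch below. The config group names (task, training, algebra) mirror the CLI overrides above, but the actual main.py and its config_name are assumptions here.

```python
# Hypothetical sketch of a Hydra-based entry point; Versor's real main.py may differ.
import hydra
from omegaconf import DictConfig, OmegaConf

@hydra.main(config_path="conf", config_name="config", version_base=None)
def run(cfg: DictConfig) -> None:
    # Hydra merges the defaults in conf/ with CLI overrides
    # (e.g. task=qm9 training.lr=0.001) into one nested config object.
    print(OmegaConf.to_yaml(cfg))

if __name__ == "__main__":
    run()
```

Overrides such as training.epochs=100 then simply replace fields of the resolved config.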
Launch the interactive Streamlit demo:

```bash
streamlit run examples/demo.py
```

Task: Predict the internal energy of molecules in the QM9 dataset.

| Metric | Value |
|---|---|
| Algebra | |
| Network | MultiRotorQuantumNet |
| Num Rotors | 12 |
| Validation MAE | 7.6468 meV |
| Avg Inference Time (CPU) | 5.8439 ms / molecule |
| Training Time | < 1 hour on a single RTX 4090 |
```bash
# Train from scratch
uv run main.py task=multi_rotor_qm9 training.epochs=100
# Evaluate pretrained model
uv run main.py task=multi_rotor_qm9 training.epochs=0 checkpoint=multi_rotor_qm9_best.pt
```

Note on Convergence & Efficiency: The current 7.6468 meV MAE was achieved in just 100 epochs, and training was intentionally halted before reaching a plateau. We identified that gradient descent through standard matrix-based mixing introduces infinitesimal manifold deformations that counteract the pure isometric unbending of the GBN, a limitation we aim to resolve by replacing CliffordLinear with pure rotor compositions.
Task: Align high-dimensional motion data into a linearly separable latent space using geometric rotation.
| Metric | Value |
|---|---|
| Algebra | |
| Network | MotionManifoldNetwork (Rotor Alignment) |
| Latent Accuracy | ~100% |
| Latent Grade Purity | 0.9957 |
```bash
uv run main.py task=motion training.epochs=100
```

Task: Test whether a rotor can geometrically "unbend" the semantic manifold, pushing meaning into the grade-1 (vector) subspace while reconstructing faithfully.
| Metric | Value |
|---|---|
| Algebra | |
| Dataset | 20 Newsgroups (full corpus) |
| Network | SemanticAutoEncoder (Encoder → BladeSelector → Decoder, each with RotorLayer) |
| Input | BERT [CLS] → PCA(48) → 8-channel multivectors |
| Grade Purity | 100% (all energy in grade-1 vectors) |
| Reconstruction Loss | ~0.0 |
| Noise Robustness | 0.003 @ 5%, 0.024 @ 10%, 0.148 @ 20% |
```bash
uv run main.py task=semantic training.epochs=200
```

The first run downloads the BERT model and the 20 Newsgroups corpus, then caches embeddings to data/newsgroups/.
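Grade purity here means the fraction of a multivector's squared norm that lives in a single grade (grade 1 for the semantic task). A small self-contained sketch, assuming the common convention that the coefficient at index i belongs to the blade of grade popcount(i) (Versor's internal ordering may differ):

```python
import torch

def grade_masks(n: int) -> dict[int, torch.Tensor]:
    # Assume the 2^n coefficients are ordered so that index i has grade popcount(i).
    grades = torch.tensor([bin(i).count("1") for i in range(2 ** n)])
    return {g: grades == g for g in range(n + 1)}

def grade_purity(x: torch.Tensor, n: int, grade: int = 1) -> torch.Tensor:
    # x: [..., 2^n] multivector coefficients.
    masks = grade_masks(n)
    total = x.pow(2).sum(dim=-1)
    in_grade = x[..., masks[grade]].pow(2).sum(dim=-1)
    return (in_grade / total.clamp_min(1e-12)).mean()

x = torch.randn(32, 8, 2 ** 3)          # e.g. a batch of Cl(3,0) multivectors
print(float(grade_purity(x, n=3, grade=1)))
```

A purity of 1.0 means every coefficient outside the chosen grade is zero, which is what the semantic task reports.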
Synthetic experiments demonstrating GA concepts are in the examples/ directory:
```bash
# Run synthetic tasks
uv run python -m examples.main task=manifold training.epochs=500
uv run python -m examples.main task=hyperbolic training.epochs=500
uv run python -m examples.main task=sanity
```

| Example | Algebra | Description |
|---|---|---|
| Manifold | | Flatten a figure-8 manifold (100% topology restoration) |
| Hyperbolic | | Reverse a Lorentz boost in Minkowski spacetime |
| Sanity | | Verify algebra correctness (identity learning) |
Configuration files are in conf/ (main tasks) and examples/conf/ (synthetic tasks).
```bash
# Override any parameter from CLI
uv run main.py task=qm9 algebra.p=4 training.lr=0.001
```

```
Versor/
├── core/            # Math kernel (CliffordAlgebra, metric, visualizer)
├── layers/          # Neural layers (Rotor, MultiRotor, Linear, GNN, Norm)
├── functional/      # Activations (GeometricGELU, GradeSwish) & losses
├── models/          # GBN architectures (single-rotor, multi-rotor, graph, motion)
├── tasks/           # Task runners (QM9, Motion, Semantic)
├── datasets/        # Data loaders (QM9, HAR, Newsgroups)
├── conf/            # Hydra configs for main tasks
├── docs/            # Documentation (philosophy, tutorial, math, FAQ)
├── examples/        # Synthetic demos and interactive Streamlit app
│   ├── tasks/       # Manifold, Hyperbolic, Sanity
│   ├── datasets/    # Synthetic data generators
│   └── conf/        # Hydra configs for example tasks
├── tests/           # Unit & property tests
└── main.py          # CLI entry point
```
- Philosophy: Why Geometric Algebra? The "unbending" paradigm.
- Tutorial: Step-by-step guide to building with Versor.
- Mathematics: Clifford Algebra, Rotors, Metric Signatures.
- FAQ: Common questions and troubleshooting.
- Milestone: Roadmap — completed and upcoming work.
This project is licensed under the Apache License 2.0.
Notice on Patents: The core GBN architecture is covered by KR Patent Application 10-2026-0023023. By releasing this under Apache 2.0, we provide a perpetual, royalty-free patent license to any individual or entity using this software.
If you use Versor in your research, please cite:

```bibtex
@software{kim2026versor,
author = {Kim, Eunkyum},
title = {Versor: Universal Geometric Algebra Neural Network},
url = {https://github.com/Concode0/versor},
version = {0.1.0},
year = {2026},
month = {2},
license = {Apache-2.0},
note = {ROK Patent Application 10-2026-0023023 (Geometric Blade Networks)}
}
```

Unlike standard neural networks, which must use spectral normalization, weight clipping, or gradient penalties to enforce Lipschitz constraints (and even then only approximately), Versor's RotorLayers satisfy the constraint by construction, while GeometricGELU and CliffordLayerNorm explicitly decouple and control only the Radial Scale, preserving angular integrity.
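As a rough picture of "control only the Radial Scale" (an illustrative stand-in, not the library's GeometricGELU), an activation can gate each multivector by a function of its magnitude while leaving its direction untouched:

```python
import torch
import torch.nn.functional as F

# Magnitude-gated activation: rescale each multivector by a GELU of its norm,
# leaving its direction (the angular part) unchanged.
def magnitude_gelu(x: torch.Tensor, eps: float = 1e-9) -> torch.Tensor:
    # x: [batch, channels, blades]
    norm = x.norm(dim=-1, keepdim=True)
    return x * (F.gelu(norm) / (norm + eps))

x = torch.randn(32, 8, 8)
y = magnitude_gelu(x)

# Directions are preserved: per-channel cosine similarity between x and y is 1.
cos = F.cosine_similarity(x.flatten(0, 1), y.flatten(0, 1), dim=-1)
print(bool((cos > 0.999).all()))
```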
Active Development: We are currently transitioning to a Pure Geometric Update paradigm. This involves:

- Replacing matrix-based mixing with a Composition of Irreducible Rotors.
- Moving all weight updates from Euclidean space to the Bivector Manifold (Lie Algebra), as sketched below.
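A toy version of that second point, under stated assumptions (plain so(3) instead of a general bivector manifold, and not the project's optimizer): keep the learnable parameter in the Lie algebra and map it through the exponential, so every gradient step still yields an exact isometry.

```python
import torch

# The learnable parameter lives in the Lie algebra (a 3D bivector here); the
# exponential map turns it into a rotation, so the transform stays orthogonal
# no matter how the optimizer moves the parameter.
theta = torch.zeros(3, requires_grad=True)

def exp_so3(w: torch.Tensor) -> torch.Tensor:
    zero = torch.zeros((), dtype=w.dtype)
    W = torch.stack([
        torch.stack([zero, -w[2],  w[1]]),
        torch.stack([w[2],  zero, -w[0]]),
        torch.stack([-w[1], w[0],  zero]),
    ])
    return torch.matrix_exp(W)          # skew-symmetric -> rotation matrix

x = torch.randn(64, 3)
target = x @ exp_so3(torch.tensor([0.3, -0.2, 0.5])).T   # data rotated by a hidden rotor

opt = torch.optim.Adam([theta], lr=0.05)
for _ in range(200):
    loss = ((x @ exp_so3(theta).T - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

R = exp_so3(theta).detach()
print(torch.allclose(R.T @ R, torch.eye(3), atol=1e-5))  # orthogonal by construction
```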


