Inference is a programming language designed for building verifiable software. It features static typing, explicit semantics, and formal verification capabilities out of the box.
Inference lets you mathematically verify code correctness without learning theorem provers. Keep the implementation correct, even when it is vibe-coded.
> [!IMPORTANT]
> The project is in early development. Internal design and implementation are subject to change, so please be patient with us as we build out the language and tools.
- Inference homepage
- Read the Inference book for a getting-started guide
- Inference Programming Language specification
infc drives the compilation pipeline for a single `.inf` source file. Phases are:

- Parse (`--parse`): build the typed AST.
- Analyze (`--analyze`): perform semantic/type inference (WIP).
- Codegen (`--codegen`): emit WebAssembly and optionally translate to `.v` when `-o` is supplied.
You must specify at least one phase flag; requested phases run in canonical order.
```shell
cargo run -p inference-cli -- infc path/to/file.inf --parse
```

After building you can call the binary directly:

```shell
./infc path/to/file.inf --codegen -o
infc --version
```

Artifacts are written to an `out/` directory relative to the working directory. Rocq translation output is `out/out.v`.
| Code | Meaning |
|---|---|
| 0 | Success |
| 1 | Usage / IO / Parse failure |
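In scripts, the exit code can be checked directly. A minimal sketch, assuming the table above; `run_compiler` is a hypothetical stand-in whose `false` body simulates a failing `infc` invocation:

```shell
#!/bin/sh
# Sketch: branch on infc's exit code
# (0 = success, 1 = usage/IO/parse failure, per the table above).
# run_compiler is a hypothetical stand-in; replace its body with e.g.:
#   ./infc path/to/file.inf --codegen -o
run_compiler() {
  false  # placeholder that exits with code 1, simulating a parse failure
}

if run_compiler; then
  echo "compiled OK"
else
  echo "infc failed with exit code $?"
fi
```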
Prebuilt infc binary distributions are arranged in the following directory structure:
```
<distribution-folder>/
├── infc              # The main CLI binary
├── bin/
│   ├── inf-llc       # LLVM compiler with Inference intrinsics
│   └── rust-lld      # WebAssembly linker
└── lib/              # (Linux only)
    └── libLLVM.so.*  # LLVM shared library
```
Notes:

- On Linux, the LLVM shared library must be in the `lib/` directory.
- On Windows, all required DLL files should be placed in the `bin/` directory next to the executables.
- The `infc` binary automatically locates these dependencies relative to its own location.
- No system LLVM installation is required for end users.
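The layout above can be sanity-checked with a small script. A sketch; the `check_layout` helper is hypothetical, not part of the distribution:

```shell
#!/bin/sh
# Sketch: verify that a prebuilt distribution folder matches the
# layout described above. Usage: check_layout <distribution-folder>
check_layout() {
  dist="$1"
  for f in "$dist/infc" "$dist/bin/inf-llc" "$dist/bin/rust-lld"; do
    if [ ! -e "$f" ]; then
      echo "missing: $f"
      return 1
    fi
  done
  echo "layout OK"
}
```

On Linux you would additionally check for `lib/libLLVM.so.*`.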
To build Inference from source, you'll need the required binary dependencies for your platform.
Download the following files for your platform and place them in the specified directories:
**Linux**

- `inf-llc`: Download → extract to `external/bin/linux/`
- `rust-lld`: Download → extract to `external/bin/linux/`
- `libLLVM`: Download → extract to `external/lib/linux/`
**macOS**

- `inf-llc`: Download → extract to `external/bin/macos/`
- `rust-lld`: Download → extract to `external/bin/macos/`
**Windows**

- `inf-llc.exe`: Download → extract to `external/bin/windows/`
- `rust-lld.exe`: Download → extract to `external/bin/windows/`
- Clone the repository:

  ```shell
  git clone https://github.com/Inferara/inference.git
  cd inference
  ```
Download and extract the required binaries for your platform (see links above)
- Make the binaries executable (Linux/macOS only):

  ```shell
  chmod +x external/bin/linux/inf-llc external/bin/linux/rust-lld   # Linux
  chmod +x external/bin/macos/inf-llc external/bin/macos/rust-lld   # macOS
  ```
- Build the project:

  ```shell
  cargo build --release
  ```
The compiled `infc` binary will be in `target/release/infc`.
The workspace is configured for efficient development:
- `cargo build`: builds only the `core/` crates (faster for core development)
- `cargo build-full`: builds the entire workspace, including tools and tests
- `cargo test`: runs tests for `core/` crates and the `tests/` integration suite
- `cargo test-full`: runs tests for all workspace members, including tools
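Aliases like `build-full` and `test-full` are not built into Cargo; they are typically defined in a `.cargo/config.toml` `[alias]` table. A hypothetical sketch of such definitions; the exact flags are assumptions, not the repository's actual configuration:

```toml
# Hypothetical .cargo/config.toml aliases; the real definitions may differ.
[alias]
build-full = "build --workspace"
test-full = "test --workspace"
```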
Check out open issues.
Contributions are welcome! Please see CONTRIBUTING.md for details.