NEXON

The ONNX Deployment Platform. NEXON is a web-based application that lets users upload, deploy, and run inference on ONNX models with ease. It provides a user-friendly interface for managing AI models and executing inference tasks.

🚀 Features

  • Upload and deploy ONNX models.
  • Perform inference with custom input data.
  • View inference results in a results panel.
  • View all models uploaded on the platform and their metadata.
  • Modern, responsive UI with an intuitive layout.
  • 🆕 Deploy ONNX models directly from an MLflow Tracking Server.

Setup with MLflow

1. Setup environment variables

Copy .env.example and rename it to .env. For testing, no values need to be changed, but it is advised to change the passwords.
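As an illustration only — the actual variable names are the ones defined in .env.example — a .env for a stack like this typically holds the service credentials, e.g.:

```ini
# Hypothetical entries; consult .env.example for the real variable names.
DB_PASSWORD=change-me
TRACKING_SERVER_PASSWORD=change-me
```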

2. Start MLflow services

To build and start the MLflow Tracking Server, S3 storage, MySQL, and the initial MLflow experiments, run:

docker compose -f mlflow-compose.yml up --build -d

3. Start NEXON

To build and start the NEXON frontend, backend, and MongoDB, run:

docker compose -f nexon-compose.yml up --build -d

4. Use integration

  • Check MLflow to make sure the initial models are registered.
  • Run the example requests from the examples/ directory:

curl -X POST http://localhost:8000/api/mlflow/sync -H "Content-Type: application/json" -d @examples/test_step_1.json

Optionally, pipe the response through json.tool for more readable output:

curl -X POST http://localhost:8000/api/mlflow/sync -H "Content-Type: application/json" -d @examples/test_step_1.json | python -m json.tool

  • Check NEXON to see your deployed models.
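The json.tool trick above can also be done in code. A minimal sketch — the payload here is a stand-in, not the real response of the /api/mlflow/sync endpoint:

```python
import json

# Pretty-print a JSON string the way `python -m json.tool` does.
# The raw string is a placeholder; the actual response fields are
# whatever the NEXON backend returns.
raw = '{"status":"ok","models":["model_a","model_b"]}'
pretty = json.dumps(json.loads(raw), indent=4)
print(pretty)
```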

📦 Installation

1. Clone the Repository

git clone https://github.com/Uni-Stuttgart-ESE/nexon.git
cd nexon

2. Set Up MongoDB

NEXON uses MongoDB to store uploaded models and their metadata.

2.1 Install MongoDB

Mac:

brew install mongodb-community@7.0

Ubuntu:

sudo apt update
sudo apt install -y mongodb

Windows (Chocolatey):

choco install mongodb

Or use your preferred package manager.

2.2 Start MongoDB Locally

mongod --dbpath=/data/db

3. Set Up the Backend

Navigate to the server directory and create a virtual environment:

cd server
python -m venv nexon_env 
source nexon_env/bin/activate  # (Windows: nexon_env\Scripts\activate)

If this doesn't work, try using python3 instead of python.
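Before creating the virtual environment, it can help to confirm which interpreter python resolves to. A quick sanity check — the 3.9 floor below is an assumption, not a documented NEXON requirement (the real floor is whatever server/requirements.txt targets):

```python
import sys

# Print the interpreter version so you know which python was picked up.
print(sys.version.split()[0])

# Assumed minimum — adjust to match server/requirements.txt.
assert sys.version_info >= (3, 9), "consider switching to a newer python3"
```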

Install dependencies:

pip install -r requirements.txt

Run the FastAPI backend:

uvicorn main:app --reload

4. Set Up the Frontend

Open a new terminal and navigate to the frontend directory:

cd frontend
npm install
npm start
