# Jan

Jan is a self-hosted AI inference platform that scales from personal use to production deployments. Run an entire AI stack locally, from the inference engine to a shareable web application.

Jan is free, source-available, and fair-code licensed.

👋 Try a live demo at https://cloud.jan.ai.
## Intended use
- Run ChatGPT and Midjourney alternatives on-prem and on your private data
- Self-host AI models for your friends or for a team
- GPU support with Nvidia hardware acceleration
- CPU support with optimizations via llama.cpp
## Features
- Web, Mobile and APIs (OpenAI compatible REST & GRPC)
- LLMs and Generative Art models
- Support for Apple Silicon and CPU architectures
- C++ inference backend with CUDA/TensorRT/Triton, dynamic batching
- Load balancing via Traefik
- Login and authz via Keycloak
- Data persistence via Postgres and/or MinIO
## Planned
- Support opting out of optional, 3rd party integrations
- Universal model installer & compiler, targeting Nvidia GPU acceleration
- Mobile UI with a swappable backend URL
- Support for controlnet, upscaler, and code llama
- Admin dashboards for user management and audit
## Quickstart

So far, this setup is tested and supported with Docker on Linux, macOS, and Windows Subsystem for Linux (WSL).

### Dependencies
- **Install Docker**: follow the official install guide at https://docs.docker.com/engine/install/.

- **Install Docker Compose**: follow the official install guide at https://docs.docker.com/compose/install/.

- **Clone the repository** and pull in the latest git submodules:

  ```shell
  git clone https://github.com/janhq/jan.git
  cd jan

  # Pull latest submodules
  git submodule update --init --recursive
  ```

- **Export environment variables**:

  ```shell
  # For Apple Silicon, please set the Docker platform
  export DOCKER_DEFAULT_PLATFORM=linux/$(uname -m)
  ```
- **Set up `.env` files**: several services, such as Keycloak and Postgres, require environment variables. Place them in `.env` files in the respective folders, as shown in `docker-compose.yml`. For the global env, just run:

  ```shell
  cp sample.env .env
  ```

  | Service (Docker) | Env file |
  |---|---|
  | Global env | `.env` — just run `cp sample.env .env` |
  | Keycloak | Covered by the global `.env`; initiate the realm from `conf/keycloak_conf/example-realm.json` |
  | Keycloak PostgresDB | Covered by the global `.env` |
  | jan-inference | Covered by the global `.env` |
  | app-backend (hasura) | `conf/sample.env_app-backend` (refer from here) |
  | app-backend PostgresDB | `conf/sample.env_app-backend-postgres` |
  | web-client | `conf/sample.env_web-client` |
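The `.env` files above use the common `KEY=VALUE` dotenv format. A minimal sketch of a loader can be handy for checking what a given service will actually receive (the variable names in the example comment are illustrative, not a complete list):

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and '#' comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        # Drop surrounding quotes, which dotenv files commonly use
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env

# e.g. parse_env(open(".env").read()).get("KEYCLOAK_ADMIN")
```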
### Install Models

- Download Runway SD 1.5 from HuggingFace:

  ```shell
  wget https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/v1-5-pruned-emaonly.safetensors -P jan-inference/sd/models
  ```

- Download Llama-2 7B GGML from HuggingFace:

  ```shell
  wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_1.bin -P jan-inference/llm/models
  ```
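Downloads of this size can be truncated or corrupted mid-transfer. A small sketch for verifying a file against a published checksum — the expected hash is something you look up on the model's HuggingFace page yourself; none is hard-coded here:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large model files never load fully into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(path: str, expected_sha256: str) -> bool:
    """Compare a local file's hash to the checksum published on the model page."""
    return sha256_of(path) == expected_sha256.lower()

# e.g. verify("jan-inference/llm/models/llama-2-7b-chat.ggmlv3.q4_1.bin", EXPECTED)
# where EXPECTED is the SHA256 shown on the HuggingFace file page.
```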
### Compose Up

Jan uses an opinionated but modular open-source stack that ships with many services out of the box, e.g. multiple clients, autoscaling, and auth. You can opt out of these services or swap in your own integrations via Configurations.

- Run the following command to start all the services defined in `docker-compose.yml`:

  ```shell
  # Start all services
  docker compose up

  # Or start in detached mode
  docker compose up -d
  ```
- This step takes 5-15 minutes, after which the following services will be provisioned:

  | Service | URL | Credentials |
  |---|---|---|
  | Keycloak | http://localhost:8088 | Admin credentials are set via the environment variables `KEYCLOAK_ADMIN` and `KEYCLOAK_ADMIN_PASSWORD` |
  | app-backend (hasura) | http://localhost:8080 | The admin secret is set via the environment variable `HASURA_GRAPHQL_ADMIN_SECRET` in `conf/sample.env_app-backend` |
  | web-client | http://localhost:3000 | Users are signed up to Keycloak; the default user (username: `username`, password: `password`) is created via `conf/keycloak_conf/example-realm.json` |
  | llm service | http://localhost:8000 | |
  | sd service | http://localhost:8001 | |
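Because provisioning can take a while, it helps to poll a service until it starts answering before opening the UI. A minimal sketch (the URLs come from the table above; any HTTP response, even an error status, counts as "up"):

```python
import time
import urllib.error
import urllib.request

def wait_for(url: str, timeout: float = 900.0, interval: float = 5.0) -> bool:
    """Poll `url` until it returns any HTTP response, or until `timeout` seconds pass."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5):
                return True
        except urllib.error.HTTPError:
            # The server answered (even with a 4xx/5xx), so it is up.
            return True
        except (urllib.error.URLError, OSError):
            time.sleep(interval)
    return False

# e.g. wait_for("http://localhost:3000")  # web-client, per the table above
```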
## Usage

- Launch the web application at http://localhost:3000.
- Log in with the default user (username: `username`, password: `password`).
- Talk to the models.
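Besides the web UI, the Features list mentions an OpenAI-compatible REST API. A sketch of a chat call against the llm service, assuming a `/v1/chat/completions` route on port 8000 and a `llama-2-7b-chat` model name — the exact path and model identifier may differ in your deployment:

```python
import json
import urllib.request

def build_chat_request(base_url: str, prompt: str, model: str = "llama-2-7b-chat"):
    """Build an OpenAI-style chat completion request (endpoint path is an assumption)."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload

def chat(base_url: str, prompt: str) -> str:
    """POST the request and return the assistant's reply text."""
    url, payload = build_chat_request(base_url, prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-style responses put the reply under choices[0].message.content
    return body["choices"][0]["message"]["content"]

# e.g. chat("http://localhost:8000", "Hello!")
```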
## Configurations
TODO
## Developers

### Architecture

- Architecture Diagram

### Dependencies

- Keycloak Community (Apache-2.0)
- Hasura Community Edition (Apache-2.0)

### Repo Structure

Jan is a monorepo that pulls in the following submodules:
```text
├── docker-compose.yml
├── mobile-client
├── web-client
├── app-backend
├── inference-backend
├── docs              # Developer Docs
├── adrs              # Architecture Decision Records
```
## Common Issues and Troubleshooting

## Contributing

Contributions are welcome! Please read the CONTRIBUTING.md file for guidelines on how to contribute to this project.

## License

This project is licensed under the Fair Code License. See LICENSE.md for more details.

## Authors and Acknowledgments

Created by jan. Thanks to all contributors who have helped to improve this project.

## Support and Contact

For support or to report issues, please email support@jan.ai.