Jan - Self-Hosted AI Platform
Getting Started - Docs - Changelog - Bug reports - Discord
Jan is a self-hosted AI Platform. We help you run AI on your own hardware, giving you full control and protecting your enterprise's data and IP.
Jan is free, source-available, and fair-code licensed.
Demo
Features
Multiple AI Engines
- Self-hosted Llama 2 and other LLMs
- Self-hosted Stable Diffusion and ControlNet
- Connect to ChatGPT, Claude via API Key (coming soon)
- 1-click installs for Models (coming soon)
Cross-Platform
- Web App
- Jan Mobile support for custom Jan server (in progress)
- Cloud deployments (coming soon)
Organization Tools
- Multi-user support
- Audit and Usage logs (coming soon)
- Compliance and Audit (coming soon)
- PII and Sensitive Data policy engine for 3rd-party AIs (coming soon)
Hardware Support
- Nvidia GPUs
- Apple Silicon (in progress)
- CPU support via llama.cpp (in progress)
Usage
So far, this setup is tested and supported for Docker on Linux, Mac, and Windows Subsystem for Linux (WSL).
Dependencies
- Install Docker: Install Docker here.
- Install Docker Compose: Install Docker Compose here.
- Clone the Repository: Clone this repository and pull in the latest git submodules.
git clone https://github.com/janhq/jan.git
cd jan
# Pull latest submodules
git submodule update --init --recursive
- Export Environment Variables
export DOCKER_DEFAULT_PLATFORM=linux/$(uname -m)
- Set a .env: You will need to set up several environment variables for services such as Keycloak and Postgres. You can place them in .env files in the respective folders as shown in the docker-compose.yml.
cp sample.env .env
| Service (Docker) | env file |
|---|---|
| Global env | .env, just run cp sample.env .env |
| Keycloak | .env presented in global env; initiate the realm in conf/keycloak_conf/example-realm.json |
| Keycloak PostgresDB | .env presented in global env |
| jan-inference | .env presented in global env |
| app-backend (hasura) | conf/sample.env_app-backend, refer from here |
| app-backend PostgresDB | conf/sample.env_app-backend-postgres |
| web-client | conf/sample.env_web-client |
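To make the table above concrete, here is one way to lay the env files out before starting the stack. It is a minimal sketch: the global copy comes from the step above, while the per-service destination file names are assumptions and should be checked against docker-compose.yml.
# Global env used by Keycloak, its Postgres, and jan-inference
cp sample.env .env
# Per-service env files (destination names are assumptions; verify against docker-compose.yml)
cp conf/sample.env_app-backend conf/.env_app-backend
cp conf/sample.env_app-backend-postgres conf/.env_app-backend-postgres
cp conf/sample.env_web-client conf/.env_web-client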
Install Models
wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_1.bin -P jan-inference/llm/models
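The model file is several gigabytes, so it is worth confirming the download landed where the inference service expects it; the path below simply combines the wget destination directory and the file name from the URL.
ls -lh jan-inference/llm/models/llama-2-7b-chat.ggmlv3.q4_1.bin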
Compose Up
Jan uses an opinionated, but modular, open-source stack that comes with many services out of the box, e.g. multiple clients, autoscaling, auth and more.
You can opt out of such services or swap in your own integrations via Configurations.
- Run the following command to start all the services defined in the docker-compose.yml:
# Docker Compose up
docker compose up
# Docker Compose up detached mode
docker compose up -d
- This step takes 5-15 minutes and the following services will be provisioned:
| Service | URL | Credentials |
|---|---|---|
| Web App | http://localhost:3000 | Users are registered in Keycloak; the default user is defined in conf/keycloak_conf/example-realm.json (username: username, password: password) |
| Keycloak Admin | http://localhost:8088 | Admin credentials are set via the environment variables KEYCLOAK_ADMIN and KEYCLOAK_ADMIN_PASSWORD |
| Hasura App Backend | http://localhost:8080 | Admin credentials are set via the environment variable HASURA_GRAPHQL_ADMIN_SECRET in conf/sample.env_app-backend |
| LLM Service | http://localhost:8000 | |
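Once Compose finishes, a quick sanity check (standard Docker Compose commands, not an official step) is to confirm the containers are up and to tail the inference logs while the model loads; the service name jan-inference is taken from the env table above.
# List containers and their state
docker compose ps
# Follow logs for the inference service
docker compose logs -f jan-inference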
Usage
- Launch the web application via http://localhost:3000.
- Login with the default user (username: username, password: password).
- For configuring the login theme, check out here.
Configurations
TODO
Developers
Architecture
TODO
Dependencies
- Keycloak Community (Apache-2.0)
- Hasura Community Edition (Apache-2.0)
Repo Structure
Jan is a monorepo that pulls in the following submodules:
├── docker-compose.yml
├── mobile-client # Mobile app
├── web-client # Web app
├── app-backend # Web & mobile app backend
├── inference-backend # Inference server
├── docs # Developer Docs
├── adrs # Architecture Decision Records
Common Issues and Troubleshooting
Contributing
Contributions are welcome! Please read the CONTRIBUTING.md file for guidelines on how to contribute to this project.
License
This project is licensed under the Fair Code License. See LICENSE.md for more details.
Authors and Acknowledgments
Created by Jan. Thanks to all contributors who have helped to improve this project.
Contact
For support: please file a GitHub ticket. For questions: join our Discord here. For long-form inquiries: please email hello@jan.ai.
Current Features
- Llama 2 7B
- Web app and APIs (OpenAI compatible REST & GRPC); see the request sketch below
- Supports Apple Silicon/CPU & GPU architectures
- Load balancing via Traefik
- Login and authz via Keycloak
- Data storage via Postgres, MinIO
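Because the inference server advertises an OpenAI-compatible REST API, a request shaped like OpenAI's chat completions call should work against the LLM Service from the table above. Treat this as a hedged sketch: the route (/v1/chat/completions) and the model name are assumptions and may differ in your deployment.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama-2-7b-chat",
        "messages": [{"role": "user", "content": "Hello, Jan!"}]
      }'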