diff --git a/README.md b/README.md
index 2c8ca648b..5bb9723a5 100644
--- a/README.md
+++ b/README.md
@@ -18,6 +18,8 @@
 - Changelog
 - Bug reports
 - Discord
+> ⚠️ **Jan is currently in Development**: Expect breaking changes and bugs!
+
 Jan is a self-hosted AI Platform. We help you run AI on your own hardware, giving you full control and protecting your enterprises' data and IP.
 
 Jan is free, source-available, and [fair-code](https://faircode.io/) licensed.
@@ -51,138 +53,130 @@ Jan is free, source-available, and [fair-code](https://faircode.io/) licensed.
 - [ ] Apple Silicon (in progress)
 - [ ] CPU support via llama.cpp (in progress)
 
-## Usage
+## Documentation
 
-So far, this setup is tested and supported for Docker on Linux, Mac, and Windows Subsystem for Linux (WSL).
+👋 https://docs.jan.ai (Work in Progress)
 
-### Dependencies
+## Installation
 
-- **Install Docker**: Install Docker [here](https://docs.docker.com/get-docker/).
+> ⚠️ **Jan is currently in Development**: Expect breaking changes and bugs!
 
-- **Install Docker Compose**: Install Docker Compose [here](https://docs.docker.com/compose/install/).
+### Step 1: Install Docker
 
-- **Clone the Repository**: Clone this repository and pull in the latest git submodules.
+Jan is currently packaged as a Docker Compose application.
 
-  ```bash
-  git clone https://github.com/janhq/jan.git
+- Docker ([Installation Instructions](https://docs.docker.com/get-docker/))
+- Docker Compose ([Installation Instructions](https://docs.docker.com/compose/install/))
 
-  cd jan
+### Step 2: Clone Repo
 
-  # Pull latest submodules
-  git submodule update --init --recursive
-  ```
+```bash
+git clone https://github.com/janhq/jan.git
+cd jan
 
-- **Export Environment Variables**
-```sh
-export DOCKER_DEFAULT_PLATFORM=linux/$(uname -m)
+# Pull latest submodules
+git submodule update --init --recursive
 ```
 
-- **Set a .env**: You will need to set up several environment variables for services such as Keycloak and Postgres. You can place them in `.env` files in the respective folders as shown in the `docker-compose.yml`.
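A quick sanity check before moving on can save a confusing build failure later: `git submodule status` marks any submodule that was never initialized with a leading `-`. The snippet below is a minimal sketch (not part of the repo's tooling) that assumes it is run from the repository root after the clone:

```shell
# Count submodules that `git submodule status` still marks as
# uninitialized (such entries start with "-").
uninitialized=$(git submodule status | grep -c '^-') || true

if [ "${uninitialized:-0}" -gt 0 ]; then
  echo "$uninitialized submodule(s) not initialized; re-run: git submodule update --init --recursive"
else
  echo "All submodules initialized."
fi
```

If the count is nonzero, simply re-running `git submodule update --init --recursive` should fix it.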
+### Step 3: Configure `.env`
+
-  ```bash
-  cp sample.env .env
-  ```
-
-  | Service (Docker)       | env file                                                                                                                        |
-  | ---------------------- | ------------------------------------------------------------------------------------------------------------------------------- |
-  | Global env             | `.env`, just run `cp sample.env .env`                                                                                           |
-  | Keycloak               | `.env` presented in global env and initiate realm in `conf/keycloak_conf/example-realm.json`                                    |
-  | Keycloak PostgresDB    | `.env` presented in global env                                                                                                  |
-  | jan-inference          | `.env` presented in global env                                                                                                  |
-  | app-backend (hasura)   | `conf/sample.env_app-backend` refer from [here](https://hasura.io/docs/latest/deployment/graphql-engine-flags/config-examples/) |
-  | app-backend PostgresDB | `conf/sample.env_app-backend-postgres`                                                                                          |
-  | web-client             | `conf/sample.env_web-client`                                                                                                    |
-
-### Install Models
-```sh
-wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_1.bin -P jan-inference/llm/models
-```
-
-### Compose Up
-
-Jan uses an opinionated, but modular, open-source stack that comes with many services out of the box, e.g. multiple clients, autoscaling, auth and more.
-
-You can opt out of such services or swap in your own integrations via [Configurations](#configurations).
-
-- Run the following command to start all the services defined in the `docker-compose.yml`
+We provide a sample `.env` file that you can use to get started.
 ```shell
-# Docker Compose up
-docker compose up
-
-# Docker Compose up detached mode
-docker compose up -d
+cp sample.env .env
 ```
 
-- This step takes 5-15 minutes and the following services will be provisioned:
+You will need to set the following `.env` variables:
 
-| Service            | URL                   | Credentials                                                                                                                                                         |
-| ------------------ | --------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| Web App            | http://localhost:3000 | Users are signed up to keycloak, default created user is set via `conf/keycloak_conf/example-realm.json` on keycloak with username: `username`, password: `password` |
-| Keycloak Admin     | http://localhost:8088 | Admin credentials are set via the environment variables `KEYCLOAK_ADMIN` and `KEYCLOAK_ADMIN_PASSWORD`                                                              |
-| Hasura App Backend | http://localhost:8080 | Admin credentials are set via the environment variables `HASURA_GRAPHQL_ADMIN_SECRET` in file `conf/sample.env_app-backend`                                         |
-| LLM Service        | http://localhost:8000 |                                                                                                                                                                     |
+```shell
+# TODO: Document .env variables
+```
 
-## Usage
+### Step 4: Install Models
+
+> Note: This step will change soon, with [Nitro](https://github.com/janhq/nitro) becoming its own library.
+
+We recommend Llama2-7B (4-bit quantized) as a basic model to get started.
+
+You will need to download the models to the `jan-inference/llms/models` folder.
+
+```shell
+cd jan-inference/llms/models
+
+# Downloads the model (~4 GB)
+# Download time depends on your internet connection and HuggingFace's bandwidth
+wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_1.bin
+```
+
+### Step 5: `docker compose up`
+
+Jan uses Docker Compose to run all services:
+
+```shell
+docker compose up
+docker compose up -d # Detached mode
+```
+
+- [Hasura](https://hasura.io/) (Backend)
+- [Keycloak](https://www.keycloak.org/) (Identity)
+
+The table below summarizes the services and their respective URLs and credentials.
+
+| Service | Container Name | URL and Port          | Credentials                                    |
+| ------- | -------------- | --------------------- | ---------------------------------------------- |
+| Jan Web | jan-web-*      | http://localhost:3000 | Set in `conf/keycloak_conf/example-realm.json`