From 770aebe7ba386f961f15e49619c422b42c18f658 Mon Sep 17 00:00:00 2001
From: Ho Duc Hieu <150573299+hieu-jan@users.noreply.github.com>
Date: Wed, 3 Jan 2024 21:09:02 +0700
Subject: [PATCH] docs: separated model docs

---
 .../04-using-models/02-import-manually.mdx | 137 ----------------
 .../03-integrating-remote-server.mdx | 147 ++++++++++++++++++
 ...omize-models.md => 04-customize-models.md} | 0
 ...package-models.md => 05-package-models.md} | 0
 ...ng => 03-oai-compatible-configuration.png} | Bin
 ...g => 03-openai-platform-configuration.png} | Bin
 6 files changed, 147 insertions(+), 137 deletions(-)
 create mode 100644 docs/docs/guides/04-using-models/03-integrating-remote-server.mdx
 rename docs/docs/guides/04-using-models/{03-customize-models.md => 04-customize-models.md} (100%)
 rename docs/docs/guides/04-using-models/{04-package-models.md => 05-package-models.md} (100%)
 rename docs/docs/guides/04-using-models/assets/{02-oai-compatible-configuration.png => 03-oai-compatible-configuration.png} (100%)
 rename docs/docs/guides/04-using-models/assets/{02-openai-platform-configuration.png => 03-openai-platform-configuration.png} (100%)

diff --git a/docs/docs/guides/04-using-models/02-import-manually.mdx b/docs/docs/guides/04-using-models/02-import-manually.mdx
index a4161d9d1..97cc53d77 100644
--- a/docs/docs/guides/04-using-models/02-import-manually.mdx
+++ b/docs/docs/guides/04-using-models/02-import-manually.mdx
@@ -14,7 +14,6 @@ keywords:
     large language model,
     import-models-manually,
     local model,
-    remote model,
   ]
 ---

@@ -26,25 +25,6 @@ This is currently under development.
 import Tabs from "@theme/Tabs";
 import TabItem from "@theme/TabItem";

-## Overview
-
-In this guide, we will walk you through how to import models manually. In Jan, you can use a local model directly on your computer or connect to a remote server.
-
-- Local Model: Jan is compatible with all GGUF models. If you can not find the model you want in the Hub or have a custom model you want to use, you can import it manually by following the [Steps to Manually Import a Local Model](#steps-to-manually-import-a-local-model) section.
-
-- Remote Model: Jan also supports integration with remote models. To establish a connection with these remote models, you can configure the client connection to a remote/ local server by following the [OpenAI Platform Configuration](#openai-platform-configuration) or [Engines with OAI Compatible Configuration](#engines-with-oai-compatible-configuration) section. Please note that at the moment, you can only connect to one OpenAI compatible server at a time (e.g. OpenAI Platform, Azure OpenAI, Jan API Server, etc).
-
-```mermaid
-graph TB
-  Model --> LocalModel[Local model]
-  Model --> RemoteModel[Remote model]
-  LocalModel[Local Model] --> NitroEngine[Nitro Engine]
-  RemoteModel[Remote Model] --> OpenAICompatible[OpenAI Compatible]
-
-  OpenAICompatible --> OpenAIPlatform[OpenAI Platform]
-  OpenAICompatible --> OAIEngines[Engines with OAI Compatible: Jan API server, Azure OpenAI, LM Studio, vLLM, etc]
-```
-
 ## Steps to Manually Import a Local Model

 In this section, we will show you how to import a GGUF model from [HuggingFace](https://huggingface.co/), using our latest model, [Trinity](https://huggingface.co/janhq/trinity-v1-GGUF), as an example.
@@ -185,123 +165,6 @@ Restart Jan and navigate to the Hub. Locate your model and click the `Download`

 Your model is now ready to use in Jan.

-## OpenAI Platform Configuration
-
-In this section, we will show you how to configure with OpenAI Platform, using the OpenAI GPT 3.5 Turbo 16k model as an example.
-
-### 1. Create a Model JSON
-
-Navigate to the `~/jan/models` folder. Create a folder named `gpt-3.5-turbo-16k` and create a `model.json` file inside the folder including the following configurations:
-
-- Ensure the filename must be `model.json`.
-- Ensure the `id` property matches the folder name you created.
-- Ensure the `format` property is set to `api`.
-- Ensure the `engine` property is set to `openai`.
-- Ensure the `state` property is set to `ready`.
-
-```js
-{
-  "source_url": "https://openai.com",
-  // highlight-next-line
-  "id": "gpt-3.5-turbo-16k",
-  "object": "model",
-  "name": "OpenAI GPT 3.5 Turbo 16k",
-  "version": "1.0",
-  "description": "OpenAI GPT 3.5 Turbo 16k model is extremely good",
-  // highlight-start
-  "format": "api",
-  "settings": {},
-  "parameters": {},
-  "metadata": {
-    "author": "OpenAI",
-    "tags": ["General", "Big Context Length"]
-  },
-  "engine": "openai",
-  "state": "ready"
-  // highlight-end
-}
-```
-
-### 2. Configure OpenAI API Keys
-
-You can find your API keys in the [OpenAI Platform](https://platform.openai.com/api-keys) and set the OpenAI API keys in `~/jan/engines/openai.json` file.
-
-```js
-{
-  "full_url": "https://api.openai.com/v1/chat/completions",
-  // highlight-next-line
-  "api_key": "sk-"
-}
-```
-
-### 3. Start the Model
-
-Restart Jan and navigate to the Hub. Then, select your configured model and start the model.
-
-![image-02](assets/02-openai-platform-configuration.png)
-
-## Engines with OAI Compatible Configuration
-
-In this section, we will show you how to configure a client connection to a remote/local server, using Jan's API server that is running model `mistral-ins-7b-q4` as an example.
-
-### 1. Configure a Client Connection
-
-Navigate to the `~/jan/engines` folder and modify the `openai.json` file. Please note that at the moment the code supports any openai compatible endpoint only read `engine/openai.json` file, thus, it will not search any other files in this directory.
-
-Configure `full_url` properties with the endpoint server that you want to connect. For example, if you want to connect to Jan's API server, you can configure as follows:
-
-```js
-{
-  // highlight-next-line
-  "full_url": "http://:1337/v1/chat/completions",
-  // Skip api_key if your local server does not require authentication
-  // "api_key": "sk-"
-}
-```
-
-### 2. Create a Model JSON
-
-Navigate to the `~/jan/models` folder. Create a folder named `mistral-ins-7b-q4` and create a `model.json` file inside the folder including the following configurations:
-
-- Ensure the filename must be `model.json`.
-- Ensure the `id` property matches the folder name you created.
-- Ensure the `format` property is set to `api`.
-- Ensure the `engine` property is set to `openai`.
-- Ensure the `state` property is set to `ready`.
-
-```js
-{
-  "source_url": "https://jan.ai",
-  // highlight-next-line
-  "id": "mistral-ins-7b-q4",
-  "object": "model",
-  "name": "Mistral Instruct 7B Q4 on Jan API Server",
-  "version": "1.0",
-  "description": "Jan integration with remote Jan API server",
-  // highlight-next-line
-  "format": "api",
-  "settings": {},
-  "parameters": {},
-  "metadata": {
-    "author": "MistralAI, The Bloke",
-    "tags": [
-      "remote",
-      "awesome"
-    ]
-  },
-  // highlight-start
-  "engine": "openai",
-  "state": "ready"
-  // highlight-end
-}
-```
-
-### 3. Start the Model
-
-Restart Jan and navigate to the Hub. Locate your model and click the Use button.
-
-![image-03](assets/02-oai-compatible-configuration.png)
-
 ## Assistance and Support

 If you have questions or are looking for more preconfigured GGUF models, please feel free to join our [Discord community](https://discord.gg/Dt7MxDyNNZ) for support, updates, and discussions.
diff --git a/docs/docs/guides/04-using-models/03-integrating-remote-server.mdx b/docs/docs/guides/04-using-models/03-integrating-remote-server.mdx
new file mode 100644
index 000000000..65d587aaf
--- /dev/null
+++ b/docs/docs/guides/04-using-models/03-integrating-remote-server.mdx
@@ -0,0 +1,147 @@
+---
+title: Integrating With a Remote Server
+slug: /docs/guides/integrating-remote-server
+description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
+keywords:
+  [
+    Jan AI,
+    Jan,
+    ChatGPT alternative,
+    local AI,
+    private AI,
+    conversational AI,
+    no-subscription fee,
+    large language model,
+    import-models-manually,
+    remote server,
+  ]
+---
+
+:::caution
+This is currently under development.
+:::
+
+In this guide, we will show you how to configure Jan as a client and point it to any remote or local (self-hosted) API server.
+
+## OpenAI Platform Configuration
+
+In this section, we will show you how to configure Jan with the OpenAI Platform, using the OpenAI GPT 3.5 Turbo 16k model as an example.
+
+### 1. Create a Model JSON
+
+Navigate to the `~/jan/models` folder. Create a folder named `gpt-3.5-turbo-16k` and create a `model.json` file inside the folder with the following configuration:
+
+- Ensure the filename is `model.json`.
+- Ensure the `id` property matches the folder name you created.
+- Ensure the `format` property is set to `api`.
+- Ensure the `engine` property is set to `openai`.
+- Ensure the `state` property is set to `ready`.
+
+```js
+{
+  "source_url": "https://openai.com",
+  // highlight-next-line
+  "id": "gpt-3.5-turbo-16k",
+  "object": "model",
+  "name": "OpenAI GPT 3.5 Turbo 16k",
+  "version": "1.0",
+  "description": "OpenAI GPT 3.5 Turbo 16k model is extremely good",
+  // highlight-start
+  "format": "api",
+  "settings": {},
+  "parameters": {},
+  "metadata": {
+    "author": "OpenAI",
+    "tags": ["General", "Big Context Length"]
+  },
+  "engine": "openai",
+  "state": "ready"
+  // highlight-end
+}
+```
+
+### 2. Configure OpenAI API Keys
+
+You can find your API key in the [OpenAI Platform](https://platform.openai.com/api-keys) and set it in the `~/jan/engines/openai.json` file.
+
+```js
+{
+  "full_url": "https://api.openai.com/v1/chat/completions",
+  // highlight-next-line
+  "api_key": "sk-"
+}
+```
+
+### 3. Start the Model
+
+Restart Jan and navigate to the Hub. Then, select your configured model and start it.
+
+![image-01](assets/03-openai-platform-configuration.png)
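+
+Optionally, you can verify the endpoint and API key from the command line before starting the model in Jan. The snippet below is a minimal sketch, assuming Node.js 18 or newer (for the built-in `fetch`); the model id matches the `model.json` above, and the API key is read from an `OPENAI_API_KEY` environment variable here purely for illustration.
+
+```js
+// verify-openai.mjs: an illustrative check that the endpoint and API key respond
+// Run with: OPENAI_API_KEY=sk-... node verify-openai.mjs
+const response = await fetch("https://api.openai.com/v1/chat/completions", {
+  method: "POST",
+  headers: {
+    "Content-Type": "application/json",
+    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
+  },
+  body: JSON.stringify({
+    model: "gpt-3.5-turbo-16k",
+    messages: [{ role: "user", content: "Hello!" }],
+  }),
+});
+
+const data = await response.json();
+// A successful response contains a `choices` array with the assistant's reply
+console.log(data.choices?.[0]?.message?.content ?? data);
+```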
+
+## Engines with OAI Compatible Configuration
+
+In this section, we will show you how to configure a client connection to a remote or local server, using Jan's API server running the model `mistral-ins-7b-q4` as an example.
+
+### 1. Configure a Client Connection
+
+Navigate to the `~/jan/engines` folder and modify the `openai.json` file. Please note that, at the moment, the code supporting OpenAI compatible endpoints only reads the `engines/openai.json` file and does not search for any other files in this directory.
+
+Configure the `full_url` property with the endpoint of the server you want to connect to. For example, if you want to connect to Jan's API server, you can configure it as follows:
+
+```js
+{
+  // highlight-start
+  // "full_url": "http://:/v1/chat/completions"
+  "full_url": "http://:1337/v1/chat/completions",
+  // highlight-end
+  // Skip api_key if your local server does not require authentication
+  // "api_key": "sk-"
+}
+```
+
+### 2. Create a Model JSON
+
+Navigate to the `~/jan/models` folder. Create a folder named `mistral-ins-7b-q4` and create a `model.json` file inside the folder with the following configuration:
+
+- Ensure the filename is `model.json`.
+- Ensure the `id` property matches the folder name you created.
+- Ensure the `format` property is set to `api`.
+- Ensure the `engine` property is set to `openai`.
+- Ensure the `state` property is set to `ready`.
+
+```js
+{
+  "source_url": "https://jan.ai",
+  // highlight-next-line
+  "id": "mistral-ins-7b-q4",
+  "object": "model",
+  "name": "Mistral Instruct 7B Q4 on Jan API Server",
+  "version": "1.0",
+  "description": "Jan integration with remote Jan API server",
+  // highlight-next-line
+  "format": "api",
+  "settings": {},
+  "parameters": {},
+  "metadata": {
+    "author": "MistralAI, The Bloke",
+    "tags": [
+      "remote",
+      "awesome"
+    ]
+  },
+  // highlight-start
+  "engine": "openai",
+  "state": "ready"
+  // highlight-end
+}
+```
+
+### 3. Start the Model
+
+Restart Jan and navigate to the Hub. Locate your model and click the `Use` button.
+
+![image-02](assets/03-oai-compatible-configuration.png)
+
+## Assistance and Support
+
+If you have questions or are looking for more preconfigured GGUF models, please feel free to join our [Discord community](https://discord.gg/Dt7MxDyNNZ) for support, updates, and discussions.
\ No newline at end of file
diff --git a/docs/docs/guides/04-using-models/03-customize-models.md b/docs/docs/guides/04-using-models/04-customize-models.md
similarity index 100%
rename from docs/docs/guides/04-using-models/03-customize-models.md
rename to docs/docs/guides/04-using-models/04-customize-models.md
diff --git a/docs/docs/guides/04-using-models/04-package-models.md b/docs/docs/guides/04-using-models/05-package-models.md
similarity index 100%
rename from docs/docs/guides/04-using-models/04-package-models.md
rename to docs/docs/guides/04-using-models/05-package-models.md
diff --git a/docs/docs/guides/04-using-models/assets/02-oai-compatible-configuration.png b/docs/docs/guides/04-using-models/assets/03-oai-compatible-configuration.png
similarity index 100%
rename from docs/docs/guides/04-using-models/assets/02-oai-compatible-configuration.png
rename to docs/docs/guides/04-using-models/assets/03-oai-compatible-configuration.png
diff --git a/docs/docs/guides/04-using-models/assets/02-openai-platform-configuration.png b/docs/docs/guides/04-using-models/assets/03-openai-platform-configuration.png
similarity index 100%
rename from docs/docs/guides/04-using-models/assets/02-openai-platform-configuration.png
rename to docs/docs/guides/04-using-models/assets/03-openai-platform-configuration.png