From d96d5d30c9388164c99cf453bfa6d29907ba586f Mon Sep 17 00:00:00 2001
From: Ho Duc Hieu <150573299+hieu-jan@users.noreply.github.com>
Date: Tue, 2 Jan 2024 19:17:52 +0700
Subject: [PATCH] docs: fix configuration

---
 .../04-using-models/02-import-manually.mdx | 19 +++++++------------
 1 file changed, 7 insertions(+), 12 deletions(-)

diff --git a/docs/docs/guides/04-using-models/02-import-manually.mdx b/docs/docs/guides/04-using-models/02-import-manually.mdx
index 9e6f7f0f8..18f11dc17 100644
--- a/docs/docs/guides/04-using-models/02-import-manually.mdx
+++ b/docs/docs/guides/04-using-models/02-import-manually.mdx
@@ -172,11 +172,13 @@ Your model is now ready to use in Jan.
 
 In this guide, we will show you how to configure a client connection to a remote/local server, using LM Studio as an example.
 
-At the moment, you can only connect to one compatible server at a time (e.g OpenAI Platform, Azure OpenAI, LM Studio, etc)
+At the moment, you can only connect to one compatible server at a time (e.g., OpenAI Platform, Azure OpenAI, or LM Studio).
 
 ### 1. Configure Local Server in Engine
 
-Create `lmstudio.json` in the `~/jan/engines` folder. Configure `full_url` properties with the endpoint server that you want to connect. For example, if you want to connect to LM Studio, you can configure as follows:
+Navigate to the `~/jan/engines` folder and open the `openai.json` file. Currently, Jan only reads `openai.json` for any OpenAI-compatible endpoint and ignores other files in this directory.
+
+Set the `full_url` property to the endpoint of the server you want to connect to. For example, to connect to LM Studio, configure it as follows:
 
 ```js
 {
@@ -194,17 +196,12 @@ Navigate to the `~/jan/models` folder. Create a folder named `remote-lmstudio` a
 - Ensure the filename must be `model.json`.
 - Ensure the `id` property matches the folder name you created.
 - Ensure the `format` property is set to `api`.
-- Ensure the `engine` property is set to as the filename that you recently created in `~/jan/models`. In this example, it is `lmstudio`.
+- Ensure the `engine` property is set to `openai`.
 - Ensure the `state` property is set to `ready`.
 
 ```js
 {
-  "source": [
-    {
-      "filename": "lmstudio",
-      "url": "https://lmstudio.ai"
-    }
-  ],
+  "source_url": "https://lmstudio.ai",
   // highlight-next-line
   "id": "remote-lmstudio",
   "object": "model",
@@ -220,7 +217,7 @@ Navigate to the `~/jan/models` folder. Create a folder named `remote-lmstudio` a
     "tags": ["remote", "awesome"]
   },
   // highlight-start
-  "engine": "lmstudio",
+  "engine": "openai",
   "state": "ready"
   // highlight-end
 }
@@ -230,8 +227,6 @@ Navigate to the `~/jan/models` folder. Create a folder named `remote-lmstudio` a
 
 Restart Jan and navigate to the Hub. Locate your model and click the Use button.
 
-![start-model](assets/configure-local-server.png)
-
 ## Assistance and Support
 
 If you have questions or are looking for more preconfigured GGUF models, please feel free to join our [Discord community](https://discord.gg/Dt7MxDyNNZ) for support, updates, and discussions.
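Before restarting Jan, it can help to confirm that the server behind `full_url` is actually reachable. The snippet below is a minimal sketch of such a check; it assumes LM Studio's default local server address `http://localhost:1234/v1` (adjust it to match the value in your `full_url`) and runs with Node.js 18+, which provides the global `fetch`:

```js
// Minimal reachability check for an OpenAI-compatible server.
// The base URL is an assumption (LM Studio's default local server address);
// replace it with whatever address you configured in `full_url`.
const baseUrl = "http://localhost:1234/v1";

fetch(`${baseUrl}/models`)
  .then((res) => res.json())
  .then((data) => console.log("Models served by the endpoint:", data))
  .catch((err) => console.error("Endpoint not reachable:", err.message));
```

If the request fails, start the local server first and then relaunch Jan.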