diff --git a/docs/docs/guides/04-using-models/02-import-manually.mdx b/docs/docs/guides/04-using-models/02-import-manually.mdx
index 18f11dc17..182e6312e 100644
--- a/docs/docs/guides/04-using-models/02-import-manually.mdx
+++ b/docs/docs/guides/04-using-models/02-import-manually.mdx
@@ -176,7 +176,7 @@ At the moment, you can only connect to one compatible server at a time (e.g Open
 
 ### 1. Configure Local Server in Engine
 
-Navigate to the `~/jan/engines` folder and find the `openai.json` file. Currently, the code support any openai compatible endpoint only read `engine/openai.json` file, thus, it will not search any other files in this directory.
+Navigate to the `~/jan/engines` folder and modify the `openai.json` file. Please note that, at the moment, the code only reads the `engine/openai.json` file for any OpenAI-compatible endpoint; it will not search for any other files in this directory.
 
 Configure `full_url` properties with the endpoint server that you want to connect. For example, if you want to connect to LM Studio, you can configure as follows:
 
@@ -227,6 +227,8 @@ Navigate to the `~/jan/models` folder. Create a folder named `remote-lmstudio` a
 
 Restart Jan and navigate to the Hub. Locate your model and click the Use button.
 
+![start-the-model](assets/configure-local-server.png)
+
 ## Assistance and Support
 
 If you have questions or are looking for more preconfigured GGUF models, please feel free to join our [Discord community](https://discord.gg/Dt7MxDyNNZ) for support, updates, and discussions.
diff --git a/docs/docs/guides/04-using-models/assets/configure-local-server.png b/docs/docs/guides/04-using-models/assets/configure-local-server.png
index 14a18d152..13bfead4b 100644
Binary files a/docs/docs/guides/04-using-models/assets/configure-local-server.png and b/docs/docs/guides/04-using-models/assets/configure-local-server.png differ