From df0f33413c9996f0b1408e4e27925748081ecbe9 Mon Sep 17 00:00:00 2001
From: Ho Duc Hieu <150573299+hieu-jan@users.noreply.github.com>
Date: Wed, 3 Jan 2024 15:29:42 +0700
Subject: [PATCH] docs: update overview

---
 docs/docs/guides/04-using-models/02-import-manually.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs/guides/04-using-models/02-import-manually.mdx b/docs/docs/guides/04-using-models/02-import-manually.mdx
index 5be76a06b..5cbbfece6 100644
--- a/docs/docs/guides/04-using-models/02-import-manually.mdx
+++ b/docs/docs/guides/04-using-models/02-import-manually.mdx
@@ -32,7 +32,7 @@ In this guide, we will walk you through how to import models manually. In Jan, y
 
 - Local Model: Jan is compatible with all GGUF models. If you can not find the model you want in the Hub or have a custom model you want to use, you can import it manually by following the [Steps to Manually Import a Local Model](#steps-to-manually-import-a-local-model) section.
 
-- Remote Model: Jan also supports integration with remote models. To establish a connection with these remote models, you can configure the client connection to a remote/ local server by following the [OpenAI Platform Configuration](#openai-platform-configuration) or [Engines with OAI Compatible Configuration](#engines-with-oai-compatible-configuration) section. Please note that at the moment, you can only connect to one OpenAI compatible at a time (e.g. OpenAI Platform, Azure OpenAI, LM Studio, etc).
+- Remote Model: Jan also supports integration with remote models. To establish a connection with these remote models, you can configure the client connection to a remote/ local server by following the [OpenAI Platform Configuration](#openai-platform-configuration) or [Engines with OAI Compatible Configuration](#engines-with-oai-compatible-configuration) section. Please note that at the moment, you can only connect to one OpenAI compatible at a time (e.g. OpenAI Platform, Azure OpenAI, Jan API Server, etc).
 
 ```mermaid
 graph TB