docs: fix minor typo

hieu-jan 2024-02-19 19:03:45 +07:00
parent 43ea3681f7
commit 18deeebc86
2 changed files with 6 additions and 6 deletions


@@ -20,9 +20,9 @@ keywords:
 With [LM Studio](https://lmstudio.ai/), you can discover, download, and run local Large Language Models (LLMs). In this guide, we will show you how to integrate and use your current LM Studio models with Jan using two methods. The first method is integrating the LM Studio server with the Jan UI. The second method is migrating your downloaded model from LM Studio to Jan. We will use the [Phi 2 - GGUF](https://huggingface.co/TheBloke/phi-2-GGUF) model on Hugging Face as an example.
-## Steps to Integrate LM Studio server with Jan UI
+## Steps to Integrate LM Studio Server with Jan UI
-### 1. Start the LM Studio server
+### 1. Start the LM Studio Server
 Navigate to the `Local Inference Server` on the LM Studio application and select the model you want to use. Then, configure the server port and options and start the server.
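Once the server is running, you can verify it responds before wiring it into Jan. As a quick sanity check (assuming LM Studio's default port of `1234`, which is configurable in the server options), query its OpenAI-compatible endpoint directly:

```shell
# Assumes the LM Studio Local Inference Server is running on its
# default port 1234; adjust the port to match your configuration.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{ "role": "user", "content": "Hello!" }],
    "temperature": 0.7
  }'
```

If the server is up, it returns a JSON chat completion generated by the selected model.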
@@ -108,11 +108,11 @@ Restart Jan and navigate to the Hub. Jan will automatically detect the model and
 ![Demo](assets/05-demo-migrating-model.gif)
-## Steps to Pointing the Downloaded Model of LM Studio from Jan (version 0.4.7+)
+## Steps to Point to the Downloaded Model of LM Studio from Jan (version 0.4.7+)
 Starting from version 0.4.7, Jan supports importing models using an absolute file path, so you can use a model directly from the LM Studio folder.
-### 1. Revel the Model Absolute Path
+### 1. Reveal the Model's Absolute Path
 Navigate to `My Models` in the LM Studio application and reveal the model folder. There you can get the absolute path of your model.
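With the absolute path in hand, Jan can reference the file in place instead of copying it. As an illustrative sketch only (the folder layout and file name below are placeholders, and the exact `model.json` fields may differ between Jan versions), a model entry pointing at an LM Studio file could look like:

```json
{
  "sources": [
    {
      "filename": "phi-2.Q4_K_M.gguf",
      "url": "/Users/<name>/.cache/lm-studio/models/TheBloke/phi-2-GGUF/phi-2.Q4_K_M.gguf"
    }
  ],
  "id": "phi-2",
  "name": "Phi 2 (LM Studio)"
}
```

The key idea is that the `url` field holds the absolute path revealed in the previous step rather than a download URL.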


@@ -20,9 +20,9 @@ keywords:
 With [Ollama](https://ollama.com/), you can run large language models locally. In this guide, we will show you how to integrate and use your current Ollama models with Jan using two methods. The first method is integrating the Ollama server with the Jan UI. The second method is migrating your downloaded model from Ollama to Jan. We will use the [llama2](https://ollama.com/library/llama2) model as an example.
-## Steps to Integrate Ollama server with Jan UI
+## Steps to Integrate Ollama Server with Jan UI
-### 1. Start the Ollama server
+### 1. Start the Ollama Server
 First, select the model you want to use from the [Ollama library](https://ollama.com/library). Then run your model with the following command:
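For the llama2 example used in this guide, the Ollama CLI command would be (this assumes `ollama` is already installed, and the model is downloaded automatically on first use):

```shell
# Pulls llama2 on first run, then serves it for interactive use
ollama run llama2
```

Running a model this way also starts Ollama's local API server, which Jan's UI can then be pointed at.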