docs: fix minor typo
commit 18deeebc86 (parent 43ea3681f7)
@@ -20,9 +20,9 @@ keywords:
With [LM Studio](https://lmstudio.ai/), you can discover, download, and run local Large Language Models (LLMs). In this guide, we will show you how to integrate and use your current LM Studio models with Jan using two methods. The first method integrates the LM Studio server with the Jan UI; the second migrates a model you have already downloaded in LM Studio to Jan. We will use the [Phi 2 - GGUF](https://huggingface.co/TheBloke/phi-2-GGUF) model on Hugging Face as an example.
-## Steps to Integrate LM Studio server with Jan UI
+## Steps to Integrate LM Studio Server with Jan UI
-### 1. Start the LM Studio server
+### 1. Start the LM Studio Server
Navigate to the `Local Inference Server` section of the LM Studio application and select the model you want to use. Configure the server port and options, then start the server.
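Once started, the LM Studio server exposes an OpenAI-compatible HTTP API (by default on port 1234), which is what Jan connects to. As a rough sketch of what a request looks like — assuming the default port, and with `"phi-2"` as a placeholder model identifier — you could query it like this:

```python
import json
import urllib.request

# OpenAI-compatible chat completion payload. "phi-2" is a placeholder:
# use whatever identifier your LM Studio server reports for the loaded model.
payload = {
    "model": "phi-2",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,
}

def send(base_url="http://localhost:1234/v1"):
    """Send the payload to the local LM Studio server (assumes default port 1234)."""
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the endpoint follows the OpenAI schema, pointing Jan (or any OpenAI-compatible client) at the same base URL works without further changes.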
@@ -108,11 +108,11 @@ Restart Jan and navigate to the Hub. Jan will automatically detect the model and

-## Steps to Pointing the Downloaded Model of LM Studio from Jan (version 0.4.7+)
+## Steps to Pointing to the Downloaded Model of LM Studio from Jan (version 0.4.7+)
Starting from version 0.4.7, Jan supports importing models using an absolute filepath, so you can directly use the model from the LM Studio folder.
-### 1. Revel the Model Absolute Path
+### 1. Reveal the Model Absolute Path
Navigate to `My Models` in the LM Studio application and reveal the model folder to get the absolute path of your model.
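On Linux and macOS, LM Studio typically stores downloads under `~/.cache/lm-studio/models` — this is a common default, not a guarantee, so verify the exact location via the reveal option in the app. For the Phi 2 example, listing the folder might look like:

```shell
# Hypothetical default location — confirm with the "reveal" option in My Models
ls ~/.cache/lm-studio/models/TheBloke/phi-2-GGUF/
```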
@@ -20,9 +20,9 @@ keywords:
With [Ollama](https://ollama.com/), you can run large language models locally. In this guide, we will show you how to integrate and use your current Ollama models with Jan using two methods. The first method integrates the Ollama server with the Jan UI; the second migrates a model you have already downloaded in Ollama to Jan. We will use the [llama2](https://ollama.com/library/llama2) model as an example.
-## Steps to Integrate Ollama server with Jan UI
+## Steps to Integrate Ollama Server with Jan UI
-### 1. Start the Ollama server
+### 1. Start the Ollama Server
First, select the model you want to use from the [Ollama library](https://ollama.com/library). Then run it with the following command:
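The diff hunk is truncated here, but for the llama2 example the standard Ollama invocation is:

```shell
# Pulls the model on first use, then starts an interactive session;
# the Ollama server also listens on port 11434 by default, which is
# the endpoint Jan connects to.
ollama run llama2
```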