docs: improve quickstart docs

parent 2b0048ff51
commit 5e460c2a84
@@ -1,5 +0,0 @@
---
title: Model Management
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords: [Jan AI, Jan, ChatGPT alternative, local AI, private AI, conversational AI, no-subscription fee, large language model]
---
159 docs/docs/guides/models.mdx Normal file
@@ -0,0 +1,159 @@
---
title: Model Management
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
  [
    Jan AI,
    Jan,
    ChatGPT alternative,
    local AI,
    private AI,
    conversational AI,
    no-subscription fee,
    large language model,
  ]
---

{/* Imports */}

import Tabs from "@theme/Tabs";
import TabItem from "@theme/TabItem";

Jan is compatible with all GGUF models.

If you don't see the model you want in the Hub, or if you have a custom model, you can add it to Jan.

In this guide, we will use our latest model, [Trinity](https://huggingface.co/janhq/trinity-v1-GGUF), as an example.

> We are working on a UI to make this easier; for now, the process is a bit manual.

## 1. Create a model folder

Navigate to the `~/jan/models` folder on your computer.

In `App Settings`, go to `Advanced`, then `Open App Directory`.

<Tabs groupId="operating-systems">
<TabItem value="mac" label="macOS">

```sh
cd ~/jan/models
```

</TabItem>
<TabItem value="win" label="Windows">

```sh
cd C:/Users/<your_user_name>/jan/models
```

</TabItem>
<TabItem value="linux" label="Linux">

```sh
cd ~/jan/models
```

</TabItem>
</Tabs>

In the `models` folder, create a folder with the name of the model.

<Tabs groupId="operating-systems">
<TabItem value="mac" label="macOS">

```sh
mkdir trinity-v1-7b
```

</TabItem>
<TabItem value="win" label="Windows">

```sh
mkdir trinity-v1-7b
```

</TabItem>
<TabItem value="linux" label="Linux">

```sh
mkdir trinity-v1-7b
```

</TabItem>
</Tabs>

## 2. Create a model JSON

Jan follows a folder-based, [standard model template](/specs/models) called a `model.json` to persist the model configurations on your local filesystem.

This means you can easily and transparently reconfigure your models, and export and share your preferences.

<Tabs groupId="operating-systems">
<TabItem value="mac" label="macOS">

```sh
cd trinity-v1-7b
touch model.json
```

</TabItem>
<TabItem value="win" label="Windows">

```sh
cd trinity-v1-7b
New-Item model.json
```

</TabItem>
<TabItem value="linux" label="Linux">

```sh
cd trinity-v1-7b
touch model.json
```

</TabItem>
</Tabs>

Copy the following configurations into the `model.json`.

1. Make sure the `id` property matches the folder name you created.
2. Make sure the `source_url` property is the direct binary download link ending in `.gguf`. On Hugging Face, you can find the direct links in the `Files and versions` tab.
3. Make sure you are using the correct `prompt_template`. This is usually provided on the Hugging Face model's description page.

```js
{
  "source_url": "https://huggingface.co/janhq/trinity-v1-GGUF/resolve/main/trinity-v1.Q4_K_M.gguf",
  "id": "trinity-v1-7b",
  "object": "model",
  "name": "Trinity 7B Q4",
  "version": "1.0",
  "description": "Trinity is an experimental model merge of GreenNodeLM & LeoScorpius using the Slerp method. Recommended for daily assistance purposes.",
  "format": "gguf",
  "settings": {
    "ctx_len": 2048,
    "prompt_template": "<|im_start|>system\n{system_message}<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant"
  },
  "parameters": {
    "max_tokens": 2048
  },
  "metadata": {
    "author": "Jan",
    "tags": ["7B", "Merged", "Featured"],
    "size": 4370000000
  },
  "engine": "nitro"
}
```

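If you want to sanity-check the points above before restarting Jan, a short script can do it. This is a minimal sketch, not part of Jan itself; the function name and the checks are illustrative assumptions based on the checklist:

```python
import json
from pathlib import Path

def check_model_config(model_dir: str) -> list[str]:
    """Return a list of problems found in <model_dir>/model.json."""
    problems = []
    folder = Path(model_dir)
    config = json.loads((folder / "model.json").read_text())

    # 1. The "id" property should match the folder name.
    if config.get("id") != folder.name:
        problems.append(f'id {config.get("id")!r} != folder {folder.name!r}')

    # 2. The "source_url" should be a direct .gguf download link.
    if not config.get("source_url", "").endswith(".gguf"):
        problems.append("source_url does not end in .gguf")

    return problems
```

An empty list means the two machine-checkable rules pass; the `prompt_template` still has to be verified by hand against the model's description page.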
## 3. Download your model

Restart the Jan application and look for your model in the Hub.

Click the green `download` button to download the model binary. This pulls from the `source_url` you provided above.

![]()

There you go! You are ready to use your model.

If you have any questions or want to request more preconfigured GGUF models, please message us on [Discord](https://discord.gg/Dt7MxDyNNZ).
@@ -1,97 +0,0 @@
---
title: Quickstart
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
  [
    Jan AI,
    Jan,
    ChatGPT alternative,
    local AI,
    private AI,
    conversational AI,
    no-subscription fee,
    large language model,
  ]
---

Jan is compatible with all GGUF models.

In this guide we will use our latest model, [Trinity](https://huggingface.co/janhq/trinity-v1-GGUF), as an example.

## 1. Create a model folder

Navigate to `~/jan/models` folder on your computer.

In `App Settings`, go to `Advanced`, then `Open App Directory`.

Or, you can directly cd into:

```sh
# Windows
C:/Users/<your_user_name>/jan/models

# MacOS/Linux
jan/models
```

In the `models` folder, create a folder with the name of the model.

```sh
mkdir trinity-v1-7b
```

## 2. Create a model JSON

Jan follows a folder-based, [standard model template](/specs/models) called a `model.json`. This allows for easy model configurations, exporting, and sharing.

```sh
cd trinity-v1-7b
touch model.json
```

Copy the following into the `model.json`

```js
{
  "source_url": "https://huggingface.co/janhq/trinity-v1-GGUF/resolve/main/trinity-v1.Q4_K_M.gguf",
  "id": "trinity-v1-7b",
  "object": "model",
  "name": "Trinity 7B Q4",
  "version": "1.0",
  "description": "Trinity is an experimental model merge of GreenNodeLM & LeoScorpius using the Slerp method. Recommended for daily assistance purposes.",
  "format": "gguf",
  "settings": {
    "ctx_len": 2048,
    "prompt_template": "<|im_start|>system\n{system_message}<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant"
  },
  "parameters": {
    "max_tokens": 2048
  },
  "metadata": {
    "author": "Jan",
    "tags": ["7B", "Merged", "Featured"],
    "size": 4370000000
  },
  "engine": "nitro"
}
```

:::caution
Ensure the `source_url` property is the direct binary download link ending in `.gguf`. Find the links in Huggingface > `Files and versions`.

Ensure the `id` property is the model foldername

Ensure you are using the correct `prompt_template`
:::

## 3. Download your model binary

Restart the Jan application and look for your model in the Hub.

Click the green `download` button to download your actual model binary from the URL you provided.

![]()

There you go! You are ready to use your model.

If you have any questions or want to request for more preconfigured GGUF models, please message us in [Discord](https://discord.gg/Dt7MxDyNNZ).
81 docs/docs/guides/quickstart.mdx Normal file
@@ -0,0 +1,81 @@
---
title: Quickstart
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
  [
    Jan AI,
    Jan,
    ChatGPT alternative,
    local AI,
    private AI,
    conversational AI,
    no-subscription fee,
    large language model,
  ]
---

import Tabs from "@theme/Tabs";
import TabItem from "@theme/TabItem";

In this quickstart, we'll show you how to:

- Download the Jan Desktop client - Mac, Windows, and Linux (and toaster) compatible
- Download and customize models
- Import custom models
- Use the local server at port `1337`

## Setup

### Installation

- To download the latest stable release: https://jan.ai/
- To download a nightly release (highly unstable, but with lots of new features): https://github.com/janhq/jan/releases
- For a detailed installation guide for your operating system, see the following:

<Tabs groupId="operating-systems">
<TabItem value="mac" label="macOS">
[Mac installation guide](/install/mac)
</TabItem>
<TabItem value="win" label="Windows">
[Windows installation guide](/install/windows)
</TabItem>
<TabItem value="linux" label="Linux">
[Linux installation guide](/install/linux)
</TabItem>
</Tabs>

- To build Jan Desktop from scratch (and have the right to tinker!), see the [Build from Source](/install/from-source) guide.

### Working with Models

Jan provides a list of recommended models to get you started.
You can find them in the in-app Hub.

1. `cmd + k` and type "hub" to open the Hub.
2. Download your preferred models.
3. `cmd + k` and type "chat" to open the conversation UI and start chatting.
4. Your model may take a few seconds to start up.
5. You can customize the model settings for each conversation thread in the right panel.
6. To change model defaults globally, edit the `model.json` file. See the [Models](/guides/models) guide.

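Step 6 can also be scripted. The sketch below edits one default under `"settings"` in a model's `model.json`; the file path in the usage comment and the helper name are assumptions for illustration, not part of Jan:

```python
import json
from pathlib import Path

def set_model_default(model_json: str, key: str, value) -> dict:
    """Update one key under "settings" in a model.json and write it back."""
    path = Path(model_json)
    config = json.loads(path.read_text())
    config.setdefault("settings", {})[key] = value
    path.write_text(json.dumps(config, indent=2))
    return config

# Example: raise the default context length for a model (path assumed):
# set_model_default("/Users/me/jan/models/trinity-v1-7b/model.json", "ctx_len", 4096)
```

Restart Jan (or reload the model) after editing so the new defaults take effect.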

### Importing Models

Jan is compatible with all GGUF models.

For more information on how to import custom models not found in the Hub, see the [Models](/guides/models) guide.


## Working with the Local Server

> This feature is currently under development, so expect bugs!

Jan runs a local server on port `1337` by default.

The endpoints are OpenAI-compatible.

See the [API server guide](/guides/server) for more information.

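Because the endpoints are OpenAI-compatible, any OpenAI-style client can talk to the local server. Here is a minimal standard-library sketch; the `/v1/chat/completions` route and the model id are assumptions based on the OpenAI API convention, so check the API server guide for the exact routes your version exposes:

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "trinity-v1-7b") -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str, base_url: str = "http://localhost:1337") -> str:
    """POST the payload to Jan's local server and return the reply text."""
    req = urllib.request.Request(
        base_url + "/v1/chat/completions",  # assumed OpenAI-style route
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Requires Jan running locally with a model started:
# print(chat("Hello!"))
```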

## Next Steps