docs: update path from category -> guides

Arista Indrajaya 2024-03-04 16:40:31 +07:00
parent 07ef5b8144
commit fc8eb28195
19 changed files with 148 additions and 158 deletions

View File

@@ -1,6 +1,6 @@
---
title: Engineering Specs
slug: /docs/engineering
slug: /developer/engineering
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
[

View File

@@ -1,6 +1,6 @@
---
title: Product Specs
slug: /docs/product
slug: /developer/product
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
[

View File

@@ -0,0 +1,22 @@
---
title: Changelogs
slug: /guides/changelogs/
sidebar_position: 12
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
[
Jan AI,
Jan,
ChatGPT alternative,
local AI,
private AI,
conversational AI,
no-subscription fee,
large language model,
build extension,
]
---
import DocCardList from "@theme/DocCardList";
<DocCardList />

View File

@@ -1,8 +0,0 @@
{
  "label": "Changelogs",
  "position": 5,
  "link": {
    "type": "generated-index",
    "description": "Changelog for Jan"
  }
}

View File

@@ -0,0 +1,22 @@
---
title: Common Error
slug: /guides/common-error/
sidebar_position: 8
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
[
Jan AI,
Jan,
ChatGPT alternative,
local AI,
private AI,
conversational AI,
no-subscription fee,
large language model,
build extension,
]
---
import DocCardList from "@theme/DocCardList";
<DocCardList />

View File

@@ -1,8 +0,0 @@
{
  "label": "Common Error",
  "position": 8,
  "link": {
    "type": "generated-index",
    "description": "List of common errors for Jan users"
  }
}

View File

@@ -0,0 +1,22 @@
---
title: Error Codes
slug: /guides/error-codes/
sidebar_position: 7
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
[
Jan AI,
Jan,
ChatGPT alternative,
local AI,
private AI,
conversational AI,
no-subscription fee,
large language model,
build extension,
]
---
import DocCardList from "@theme/DocCardList";
<DocCardList />

View File

@@ -1,8 +0,0 @@
{
  "label": "Error Codes",
  "position": 7,
  "link": {
    "type": "generated-index",
    "description": "List of Error Codes for Jan users"
  }
}

View File

@@ -0,0 +1,22 @@
---
title: Extensions
slug: /guides/extensions/
sidebar_position: 5
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
[
Jan AI,
Jan,
ChatGPT alternative,
local AI,
private AI,
conversational AI,
no-subscription fee,
large language model,
build extension,
]
---
import DocCardList from "@theme/DocCardList";
<DocCardList />

View File

@@ -1,8 +0,0 @@
{
  "label": "Extensions",
  "position": 5,
  "link": {
    "type": "generated-index",
    "description": "More info regarding Extensions for Jan"
  }
}

View File

@@ -0,0 +1,22 @@
---
title: Integrations
slug: /guides/integrations/
sidebar_position: 6
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
[
Jan AI,
Jan,
ChatGPT alternative,
local AI,
private AI,
conversational AI,
no-subscription fee,
large language model,
build extension,
]
---
import DocCardList from "@theme/DocCardList";
<DocCardList />

View File

@@ -1,8 +0,0 @@
{
  "label": "Integrations",
  "position": 6,
  "link": {
    "type": "generated-index",
    "description": "More info regarding Jan.ai integrations"
  }
}

View File

@@ -70,27 +70,9 @@ The [Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/o
}
```
### Regarding `model.json`
- In `settings`, two crucial values are:
  - `ctx_len`: Set this to match the model's context size.
  - `prompt_template`: Set this to match the prompt template the model was trained with (e.g., ChatML, Alpaca). A sketch of a filled-in `settings` block follows the parameters example below.
- To set up the `prompt_template`:
  1. Visit [Hugging Face](https://huggingface.co/), an open-source machine learning platform.
  2. Find the model you're using (e.g., [Gemma 7b it](https://huggingface.co/google/gemma-7b-it)).
  3. Review the model card and identify its prompt template.
- In `parameters`, the fields are typically general and can be the same across models. An example is provided below:
```json
"parameters":{
"temperature": 0.7,
"top_p": 0.95,
"stream": true,
"max_tokens": 4096,
"frequency_penalty": 0,
"presence_penalty": 0
}
```
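For comparison, a filled-in `settings` block might look like the sketch below. This is only an illustration and not part of this change: the `ctx_len` value and the Gemma-style `prompt_template` are assumed for a gemma-7b-it-class model, so replace them with whatever the model card for your model specifies.
```json
"settings": {
  "ctx_len": 8192,
  "prompt_template": "<start_of_turn>user\n{prompt}<end_of_turn>\n<start_of_turn>model"
}
```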
:::note
For more details regarding the `model.json` settings and parameters fields, please see [here](../models/integrate-remote.mdx#modeljson).
:::
### Step 3: Start the Model

View File

@@ -80,27 +80,9 @@ Replace `(port)` with your chosen port number. The default is 1234.
"engine": "openai"
}
```
### Regarding `model.json`
- In `settings`, two crucial values are:
  - `ctx_len`: Set this to match the model's context size.
  - `prompt_template`: Set this to match the prompt template the model was trained with (e.g., ChatML, Alpaca).
- To set up the `prompt_template`:
  1. Visit [Hugging Face](https://huggingface.co/), an open-source machine learning platform.
  2. Find the model you're using (e.g., [Gemma 7b it](https://huggingface.co/google/gemma-7b-it)).
  3. Review the model card and identify its prompt template.
- In `parameters`, the fields are typically general and can be the same across models. An example is provided below:
```json
"parameters":{
"temperature": 0.7,
"top_p": 0.95,
"stream": true,
"max_tokens": 4096,
"frequency_penalty": 0,
"presence_penalty": 0
}
```
:::note
For more details regarding the `model.json` settings and parameters fields, please see [here](../models/integrate-remote.mdx#modeljson).
:::
### Step 3: Start the Model

View File

@@ -73,33 +73,11 @@ This tutorial demonstrates integrating Mistral AI with Jan using the API.
"engine": "openai"
}
```
### Regarding `model.json`
- In `settings`, two crucial values are:
  - `ctx_len`: Set this to match the model's context size.
  - `prompt_template`: Set this to match the prompt template the model was trained with (e.g., ChatML, Alpaca).
- To set up the `prompt_template`:
  1. Visit [Hugging Face](https://huggingface.co/), an open-source machine learning platform.
  2. Find the model you're using (e.g., [Gemma 7b it](https://huggingface.co/google/gemma-7b-it)).
  3. Review the model card and identify its prompt template.
- In `parameters`, the fields are typically general and can be the same across models. An example is provided below:
```json
"parameters":{
"temperature": 0.7,
"top_p": 0.95,
"stream": true,
"max_tokens": 4096,
"frequency_penalty": 0,
"presence_penalty": 0
}
```
:::note
Mistral AI offers various endpoints. Refer to their [endpoint documentation](https://docs.mistral.ai/platform/endpoints/) to select the one that fits your requirements. Here, we use the `mistral-tiny` model as an example.
- For more details regarding the `model.json` settings and parameters fields, please see [here](../models/integrate-remote.mdx#modeljson).
- Mistral AI offers various endpoints. Refer to their [endpoint documentation](https://docs.mistral.ai/platform/endpoints/) to select the one that fits your requirements. Here, we use the `mistral-tiny` model as an example.
:::
### Step 3: Start the Model

View File

@@ -79,27 +79,9 @@ ollama run <model-name>
"engine": "openai"
}
```
### Regarding `model.json`
- In `settings`, two crucial values are:
  - `ctx_len`: Set this to match the model's context size.
  - `prompt_template`: Set this to match the prompt template the model was trained with (e.g., ChatML, Alpaca).
- To set up the `prompt_template`:
  1. Visit [Hugging Face](https://huggingface.co/), an open-source machine learning platform.
  2. Find the model you're using (e.g., [Gemma 7b it](https://huggingface.co/google/gemma-7b-it)).
  3. Review the model card and identify its prompt template.
- In `parameters`, the fields are typically general and can be the same across models. An example is provided below:
```json
"parameters":{
"temperature": 0.7,
"top_p": 0.95,
"stream": true,
"max_tokens": 4096,
"frequency_penalty": 0,
"presence_penalty": 0
}
```
:::note
For more details regarding the `model.json` settings and parameters fields, please see [here](../models/integrate-remote.mdx#modeljson).
:::
### Step 3: Start the Model
1. Restart Jan and navigate to the **Hub**.

View File

@@ -49,27 +49,9 @@ To connect Jan with OpenRouter for accessing remote Large Language Models (LLMs)
}
```
### Regarding `model.json`
- In `settings`, two crucial values are:
  - `ctx_len`: Set this to match the model's context size.
  - `prompt_template`: Set this to match the prompt template the model was trained with (e.g., ChatML, Alpaca).
- To set up the `prompt_template`:
  1. Visit [Hugging Face](https://huggingface.co/), an open-source machine learning platform.
  2. Find the model you're using (e.g., [Gemma 7b it](https://huggingface.co/google/gemma-7b-it)).
  3. Review the model card and identify its prompt template.
- In `parameters`, the fields are typically general and can be the same across models. An example is provided below:
```json
"parameters":{
"temperature": 0.7,
"top_p": 0.95,
"stream": true,
"max_tokens": 4096,
"frequency_penalty": 0,
"presence_penalty": 0
}
```
:::note
For more details regarding the `model.json` settings and parameters fields, please see [here](../models/integrate-remote.mdx#modeljson).
:::
### Step 3: Start the Model
1. Restart Jan and go to the **Hub**.

View File

@@ -0,0 +1,22 @@
---
title: Models Setup
slug: /guides/models-setup/
sidebar_position: 5
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
[
Jan AI,
Jan,
ChatGPT alternative,
local AI,
private AI,
conversational AI,
no-subscription fee,
large language model,
build extension,
]
---
import DocCardList from "@theme/DocCardList";
<DocCardList />

View File

@@ -1,8 +0,0 @@
{
  "label": "Advanced Models Setup",
  "position": 5,
  "link": {
    "type": "generated-index",
    "description": "More info regarding AI models for Jan"
  }
}