Merge pull request #706 from janhq/docs/add-guides-section

docs: Refactor Jan Site Structure
0xSage 2023-11-24 16:56:04 +08:00 committed by GitHub
commit 68e5f9adbc
12 changed files with 167 additions and 459 deletions

View File

@ -0,0 +1,5 @@
---
title: Community
---
- [ ] Social media links

View File

@ -2,15 +2,29 @@
title: Chats
---
:::warning
:::caution
This page is still under construction, and should be read as a scratchpad
This is currently under development.
:::
Chats are essentially inference requests to a model
## Overview
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/chat
In Jan, `chats` are LLM responses in the form of OpenAI-compatible `chat completion objects`.
- This should reference the Nitro ChatCompletion API page to reduce duplication.
- We are fine with adding a Jan API for this, but it makes sense to use Nitro as the reference, since Nitro is the default inference engine for Jan in this release.
- Models take a list of messages and return a model-generated response as output.
- An [OpenAI Chat API](https://platform.openai.com/docs/api-reference/chat) compatible endpoint at `localhost:3000/v1/chats`.
## Folder Structure
Chats are stateless and thus are not saved in `janroot`. Any content and relevant metadata from calling this endpoint are extracted and persisted through [Messages](/specs/messages).
## API Reference
Jan's Chat API is compatible with [OpenAI's Chat API](https://platform.openai.com/docs/api-reference/chat).
See [Jan Chat API](https://jan.ai/api-reference/#tag/Chat-Completion)
## Implementation
Under the hood, the `/chat` endpoint simply reroutes requests to an existing endpoint on the [Nitro server](https://nitro.jan.ai). Nitro is a lightweight, local inference server, written in C++ and embedded into the Jan app. See the [Nitro documentation](https://nitro.jan.ai/docs).
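For illustration, a request to the endpoint described above might look like the following sketch. The port, path, and model id are assumptions based on this spec (`localhost:3000/v1/chats`) and may differ in your local setup; the response is an OpenAI-compatible chat completion object.

```sh
# Hypothetical request to Jan's OpenAI-compatible chat endpoint.
# The model id "zephyr-7b" is illustrative; use a model that exists in your janroot/models.
curl http://localhost:3000/v1/chats \
  -H "Content-Type: application/json" \
  -d '{
    "model": "zephyr-7b",
    "messages": [
      { "role": "user", "content": "Explain how Jan routes chat requests to Nitro." }
    ]
  }'
```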

View File

@ -8,66 +8,36 @@ This page is still under construction, and should be read as a scratchpad
:::
```sh
janroot/
assistants/
assistant-a/
assistant.json
src/
index.ts
threads/
thread-a/
thread-b
models/
model-a/
model.json
```
Jan uses the local filesystem for data persistence, similar to VSCode. This allows for composability and tinkerability.
```sh=
/janroot # Jan's root folder (e.g. ~/jan)
/models # For raw AI models
/threads # For conversation history
/assistants # For AI assistants' configs, knowledge, etc.
```yaml
janroot/ # Jan's root folder (e.g. ~/jan)
models/ # For raw AI models
threads/ # For conversation history
assistants/ # For AI assistants' configs, knowledge, etc.
```
```sh=
```yaml
/models
/modelA
model.json # Default model settings
llama-7b-q4.gguf # Model binaries
llama-7b-q5.gguf # Include different quantizations
/threads
/jan-unixstamp-salt
model.json # Overrides assistant/model-level model settings
/jan-unixstamp
thread.json # thread metadata (e.g. subject)
messages.json # messages
content.json # What is this?
files/ # Future for RAG
messages.jsonl # messages
files/ # RAG
/assistants
/jan
/jan # A default assistant that can use all models
assistant.json # Assistant configs (see below)
# For any custom code
package.json # Import npm modules
# e.g. Langchain, Llamaindex
/src # Supporting files (needs better name)
package.json # Import npm modules, e.g. Langchain, Llamaindex
/src # For custom code
index.js # Entrypoint
process.js # For electron IPC processes (needs better name)
# `/threads` at root level
# `/models` at root level
/shakespeare
# `/threads` at root level
# `/models` at root level
/shakespeare # Example of a custom assistant
assistant.json
model.json # Creator chooses model and settings
package.json
/src
index.js
process.js
/threads # Assistants remember conversations in the future
/models # Users can upload custom models
/finetuned-model
```
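Because everything above is just files on disk, the layout can be inspected or scripted with ordinary shell tools. A minimal sketch, assuming the default `~/jan` root; the folder and file names are illustrative:

```sh
# Illustrative only: browse janroot with standard shell tools.
ls ~/jan/models                               # each subfolder name doubles as a model_id
cat ~/jan/models/zephyr-7b/model.json         # default settings for that model
ls ~/jan/threads                              # one folder per conversation
cat ~/jan/threads/jan_1698983503/thread.json  # thread metadata
```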

View File

@ -18,7 +18,7 @@ Files can be used by `threads`, `assistants` and `fine-tuning`
- Note: OAI's struct doesn't seem very well designed
- `files.json`
```json
```js
{
// Public properties (OpenAI Compatible: https://platform.openai.com/docs/api-reference/files/object)
"id": "file-BK7bzQj3FfZFXr7DbL6xJwfo",
@ -31,19 +31,25 @@ Files can be used by `threads`, `assistants` and `fine-tuning`
```
## File API
### List Files
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/files/list
### Upload file
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/files/create
### Delete file
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/files/delete
### Retrieve file
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/files/retrieve
### Retrieve file content
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/files/retrieve-contents
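Assuming these endpoints mirror OpenAI's Files API one-to-one, requests would look roughly like the sketch below. The `/v1/files` paths and `{JAN_URL}` placeholder follow the conventions used elsewhere in these specs and are assumptions, not a finalized API.

```sh
# Hypothetical requests, assuming OpenAI-compatible file endpoints.
curl {JAN_URL}/v1/files                            # List files
curl -X POST {JAN_URL}/v1/files \
  -F purpose="assistants" \
  -F file="@mydata.jsonl"                          # Upload file
curl -X DELETE {JAN_URL}/v1/files/{file_id}        # Delete file
curl {JAN_URL}/v1/files/{file_id}                  # Retrieve file
curl {JAN_URL}/v1/files/{file_id}/content          # Retrieve file content
```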
## Files Filesystem

3
docs/docs/specs/home.md Normal file
View File

@ -0,0 +1,3 @@
---
title: Home
---

3
docs/docs/specs/hub.md Normal file
View File

@ -0,0 +1,3 @@
---
title: Hub
---

View File

@ -19,22 +19,19 @@ This is currently under development.
Messages are saved in the `/threads/{thread_id}` folder in `messages.jsonl` files
```sh
```yaml
jan/
threads/
assistant_name_unix_timestamp/
...
messages.jsonl
jan_2341243134/
...
messages.jsonl
thread.json # Thread metadata
messages.jsonl # Messages are stored in jsonl format
```
## `messages.jsonl`
Individual messages are saved in `jsonl` format for indexing purposes.
```json
```js
{...message_2}
{...message_1}
{...message_0}
@ -44,13 +41,13 @@ Individual messages are saved in `jsonl` format for indexing purposes.
Here's a standard example `message` sent from a user.
```json
```js
"id": "0", // Sequential or UUID
"object": "thread.message", // Defaults to "thread.message"
"created_at": 1698983503,
"thread_id": "thread_asdf", // Defaults to parent thread
"assistant_id": "jan", // Defaults to parent thread
"role": "user", // From either "user" or "assistant"
"role": "user", // From either "user" or "assistant"
"content": [
{
"type": "text",
@ -61,13 +58,11 @@ Here's a standard example `message` sent from a user.
}
],
"metadata": {}, // Defaults to {}
// "run_id": "...", // Rather than `run` id abstraction
// "file_ids": [],
```
Here's an example `message` response from an assistant.
```json
```js
"id": "0", // Sequential or UUID
"object": "thread.message", // Defaults to "thread.message"
"created_at": 1698983503,
@ -84,175 +79,11 @@ Here's an example `message` response from an assistant.
}
],
"metadata": {}, // Defaults to {}
// "run_id": "...", // KIV
// "file_ids": [], // KIV
// "usage": {} // KIV: saving chat completion properties https://platform.openai.com/docs/api-reference/chat/object
"usage": {} // Save chat completion properties https://platform.openai.com/docs/api-reference/chat/object
```
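Since each line of `messages.jsonl` is a standalone JSON object with the shape shown above, a thread's history can be read back with ordinary tools. A minimal sketch (the thread folder name is illustrative, and `jq` is assumed to be installed):

```sh
# Illustrative only: print "role: text" for each message in a thread.
cat ~/jan/threads/jan_1698983503/messages.jsonl \
  | jq -r '"\(.role): \(.content[0].text.value)"'
```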
## API Reference
Jan's `messages` API is compatible with [OpenAI's Messages API](https://platform.openai.com/docs/api-reference/messages), with additional methods for managing messages locally.
See [Jan Messages API](https://jan.ai/api-reference#tag/Messages)
<!-- TODO clean this part up into API -->
<!--
### Get list message
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/messages/getMessage
- Example request
```shell
curl {JAN_URL}/v1/threads/{thread_id}/messages/{message_id} \
-H "Content-Type: application/json"
```
- Example response
```json
{
"id": "msg_abc123",
"object": "thread.message",
"created_at": 1699017614,
"thread_id": "thread_abc123",
"role": "user",
"content": [
{
"type": "text",
"text": {
"value": "How does AI work? Explain it in simple terms.",
"annotations": []
}
}
],
"file_ids": [],
"assistant_id": null,
"run_id": null,
"metadata": {}
}
```
### Create message
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/messages/createMessage
- Example request
```shell
curl -X POST {JAN_URL}/v1/threads/{thread_id}/messages \
-H "Content-Type: application/json" \
-d '{
"role": "user",
"content": "How does AI work? Explain it in simple terms."
}'
```
- Example response
```json
{
"id": "msg_abc123",
"object": "thread.message",
"created_at": 1699017614,
"thread_id": "thread_abc123",
"role": "user",
"content": [
{
"type": "text",
"text": {
"value": "How does AI work? Explain it in simple terms.",
"annotations": []
}
}
],
"file_ids": [],
"assistant_id": null,
"run_id": null,
"metadata": {}
}
```
### Get message
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/messages/getMessage
- Example request
```shell
curl {JAN_URL}/v1/threads/{thread_id}/messages/{message_id} \
-H "Content-Type: application/json"
```
- Example response
```json
{
"id": "msg_abc123",
"object": "thread.message",
"created_at": 1699017614,
"thread_id": "thread_abc123",
"role": "user",
"content": [
{
"type": "text",
"text": {
"value": "How does AI work? Explain it in simple terms.",
"annotations": []
}
}
],
"file_ids": [],
"assistant_id": null,
"run_id": null,
"metadata": {}
}
```
### Modify message
> Jan: TODO: Do we need to modify message? Or let user create new message?
# Get message file
> OpenAI Equivalent: https://api.openai.com/v1/threads/{thread_id}/messages/{message_id}/files/{file_id}
- Example request
```shell
curl {JAN_URL}/v1/threads/{thread_id}/messages/{message_id}/files/{file_id} \
-H "Content-Type: application/json"
```
- Example response
```json
{
"id": "file-abc123",
"object": "thread.message.file",
"created_at": 1699061776,
"message_id": "msg_abc123"
}
```
# List message files
> OpenAI Equivalent: https://api.openai.com/v1/threads/{thread_id}/messages/{message_id}/files
- Example request
```shell
curl {JAN_URL}/v1/threads/{thread_id}/messages/{message_id}/files/{file_id} \
-H "Content-Type: application/json"
```
- Example response
```json
{
"id": "file-abc123",
"object": "thread.message.file",
"created_at": 1699061776,
"message_id": "msg_abc123"
}
``` -->
See [Jan Messages API](https://jan.ai/api-reference#tag/Messages).

View File

@ -22,7 +22,7 @@ In Jan, models are primary entities with the following capabilities:
- Models are organized by individual folders, each containing the binaries and configurations needed to run the model. This makes for easy packaging and sharing.
- Model folder names are unique and used as `model_id` default values.
```bash
```yaml
jan/ # Jan root folder
models/
llama2-70b-q4_k_m/ # Example: standard GGUF model
@ -52,9 +52,9 @@ Here's a standard example `model.json` for a GGUF model.
- `source_url`: https://huggingface.co/TheBloke/zephyr-7B-beta-GGUF/.
```json
```js
"id": "zephyr-7b" // Defaults to foldername
"object": "model", // Defaults to "model"
"object": "model", // Defaults to "model"
"source_url": "https://huggingface.co/TheBloke/zephyr-7B-beta-GGUF/blob/main/zephyr-7b-beta.Q4_K_M.gguf",
"name": "Zephyr 7B" // Defaults to foldername
"owned_by": "you" // Defaults to you
@ -63,24 +63,21 @@ Here's a standard example `model.json` for a GGUF model.
"description": ""
"state": enum[null, "downloading", "ready", "starting", "stopping", ...]
"format": "ggufv3", // Defaults to "ggufv3"
"settings": { // Models are initialized with these settings
"settings": { // Models are initialized with settings
"ctx_len": "2048",
"ngl": "100",
"embedding": "true",
"n_parallel": "4",
// KIV: "pre_prompt": "A chat between a curious user and an artificial intelligence",
// KIV:"user_prompt": "USER: ",
// KIV: "ai_prompt": "ASSISTANT: "
}
"parameters": { // Models are called with these parameters
"parameters": { // Models are called parameters
"temperature": "0.7",
"token_limit": "2048",
"top_k": "0",
"top_p": "1",
"stream": "true"
},
"metadata": {} // Defaults to {}
"assets": [ // Filepaths to model binaries; Defaults to current dir
"metadata": {} // Defaults to {}
"assets": [ // Defaults to current dir
"file://.../zephyr-7b-q4_k_m.bin",
]
```
@ -89,7 +86,7 @@ Here's a standard example `model.json` for a GGUF model.
Jan's Model API is compatible with [OpenAI's Models API](https://platform.openai.com/docs/api-reference/models), with additional methods for managing and running models locally.
See [Jan Models API](https://jan.ai/api-reference#tag/Models)
See [Jan Models API](https://jan.ai/api-reference#tag/Models).
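For example, listing local models or fetching a single model's metadata might look like the sketch below. The paths follow the OpenAI Models API convention and the model id is illustrative.

```sh
# Hypothetical requests, assuming OpenAI-compatible model endpoints.
curl {JAN_URL}/v1/models               # List models found under janroot/models
curl {JAN_URL}/v1/models/zephyr-7b     # Retrieve one model (id defaults to the folder name)
```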
## Importing Models

View File

@ -0,0 +1,3 @@
---
title: System Monitor
---

View File

@ -23,13 +23,10 @@ This is currently under development.
- Thread folders follow the naming: `assistant_id` + `thread_created_at`.
- Thread folders also contain `messages.jsonl` files. See [messages](/specs/messages).
```sh
jan/
```yaml
janroot/
threads/
assistant_name_unix_timestamp/
thread.json
messages.jsonl
jan_2341243134/
assistant_name_unix_timestamp/ # Thread `ID`
thread.json
```
@ -43,14 +40,15 @@ jan/
Here's a standard example `thread.json` for a conversation between the user and the default Jan assistant.
```json
```js
"id": "thread_....", // Defaults to foldername
"object": "thread", // Defaults to "thread"
"title": "funny physics joke", // Defaults to ""
"assistants": [
{
"assistant_id": "jan", // Defaults to "jan"
"model": { // Defaults to 1 currently active model (can be changed before thread is begun)
"model": { // Defaults to the currently active model (can be changed before thread is begun)
"id": "...",
"settings": {}, // Defaults to and overrides assistant.json's "settings" (and if none, then model.json "settings")
"parameters": {}, // Defaults to and overrides assistant.json's "parameters" (and if none, then model.json "parameters")
}
@ -64,164 +62,4 @@ Here's a standard example `thread.json` for a conversation between the user and
Jan's Threads API is compatible with [OpenAI's Threads API](https://platform.openai.com/docs/api-reference/threads), with additional methods for managing threads locally.
See [Jan Threads API](https://jan.ai/api-reference#tag/Threads)
<!-- TODO clean this part up into API -->
<!--
### Get thread
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/threads/getThread
- Example request
```shell
curl {JAN_URL}/v1/threads/{thread_id}
```
- Example response
```json
{
"id": "thread_abc123",
"object": "thread",
"created_at": 1699014083,
"assistants": ["assistant-001"],
"metadata": {},
"messages": []
}
```
### Create Thread
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/threads/createThread
- Example request
```shell
curl -X POST {JAN_URL}/v1/threads \
-H "Content-Type: application/json" \
-d '{
"messages": [{
"role": "user",
"content": "Hello, what is AI?",
"file_ids": ["file-abc123"]
}, {
"role": "user",
"content": "How does AI work? Explain it in simple terms."
}]
}'
```
- Example response
```json
{
"id": "thread_abc123",
"object": "thread",
"created_at": 1699014083,
"metadata": {}
}
```
### Modify Thread
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/threads/modifyThread
- Example request
```shell
curl -X POST {JAN_URL}/v1/threads/{thread_id} \
-H "Content-Type: application/json" \
-d '{
"messages": [{
"role": "user",
"content": "Hello, what is AI?",
"file_ids": ["file-abc123"]
}, {
"role": "user",
"content": "How does AI work? Explain it in simple terms."
}]
}'
```
- Example response
```json
{
"id": "thread_abc123",
"object": "thread",
"created_at": 1699014083,
"metadata": {}
}
```
- https://platform.openai.com/docs/api-reference/threads/modifyThread
### Delete Thread
> OpenAI Equivalent: https://platform.openai.com/docs/api-reference/threads/deleteThread
- Example request
```shell
curl -X DELETE {JAN_URL}/v1/threads/{thread_id}
```
- Example response
```json
{
"id": "thread_abc123",
"object": "thread.deleted",
"deleted": true
}
```
### List Threads
> This is a Jan-only endpoint, not supported by OAI yet.
- Example request
```shell
curl {JAN_URL}/v1/threads \
-H "Content-Type: application/json" \
```
- Example response
```json
[
{
"id": "thread_abc123",
"object": "thread",
"created_at": 1699014083,
"assistants": ["assistant-001"],
"metadata": {},
"messages": []
},
{
"id": "thread_abc456",
"object": "thread",
"created_at": 1699014083,
"assistants": ["assistant-002", "assistant-002"],
"metadata": {}
}
]
```
### Get & Modify `Thread.Assistants`
-> Can achieve this goal by calling `Modify Thread` API
#### `GET v1/threads/{thread_id}/assistants`
-> Can achieve this goal by calling `Get Thread` API
#### `POST v1/threads/{thread_id}/assistants/{assistant_id}`
-> Can achieve this goal by calling `Modify Assistant` API with `thread.assistant[]`
### List `Thread.Messages`
-> Can achieve this goal by calling `Get Thread` API -->
See [Jan Threads API](https://jan.ai/api-reference#tag/Threads).
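For example, listing or creating threads might look like the sketch below. The paths and payload follow the conventions used in these specs and are assumptions, not a finalized API.

```sh
# Hypothetical requests against the Threads endpoints.
curl {JAN_URL}/v1/threads                      # List threads (Jan-only endpoint)
curl -X POST {JAN_URL}/v1/threads \
  -H "Content-Type: application/json" \
  -d '{ "messages": [{ "role": "user", "content": "Hello, what is AI?" }] }'
```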

View File

@ -146,16 +146,34 @@ const config = {
// Navbar Left
{
type: "docSidebar",
sidebarId: "docsSidebar",
sidebarId: "guidesSidebar",
position: "left",
label: "Documentation",
label: "Guides",
},
{
type: "docSidebar",
sidebarId: "developerSidebar",
position: "left",
label: "Developer",
},
{
position: "left",
to: "/api-reference",
label: "API Reference",
},
{
type: "docSidebar",
position: "left",
sidebarId: "specsSidebar",
label: "Specs",
},
// Navbar right
{
type: "docSidebar",
position: "right",
sidebarId: "communitySidebar",
label: "Community",
},
{
to: "blog",
label: "Blog",

View File

@ -13,7 +13,7 @@
/** @type {import('@docusaurus/plugin-content-docs').SidebarsConfig} */
const sidebars = {
docsSidebar: [
guidesSidebar: [
{
type: "category",
label: "Introduction",
@ -42,43 +42,14 @@ const sidebars = {
collapsed: true,
items: ["docs/models", "docs/server"],
},
{
type: "category",
label: "Extending Jan",
link: { type: "doc", id: "docs/extensions" },
collapsible: true,
collapsed: true,
items: ["docs/assistants", "docs/themes", "docs/tools", "docs/modules"],
},
{
type: "category",
label: "Building Jan",
collapsible: true,
collapsed: true,
items: [
"specs/architecture",
"specs/data-structures",
"specs/user-interface",
{
type: "category",
label: "Specifications",
collapsible: true,
collapsed: false,
items: [
"specs/chats",
"specs/models",
"specs/threads",
"specs/messages",
// "specs/assistants",
// "specs/files",
// "specs/jan",
// "specs/fine-tuning",
// "specs/settings",
// "specs/prompts",
],
},
],
},
],
developerSidebar: [
"docs/extensions",
"docs/assistants",
"docs/themes",
"docs/tools",
"docs/modules",
],
apiSidebar: [
@ -97,17 +68,53 @@ const sidebars = {
},
],
aboutSidebar: [
specsSidebar: [
{
type: "doc",
label: "About Jan",
id: "about/about",
type: "category",
label: "Overview",
collapsible: true,
collapsed: false,
items: [
"specs/architecture",
"specs/data-structures",
"specs/user-interface",
],
},
{
type: "link",
label: "Careers",
href: "https://janai.bamboohr.com/careers",
type: "category",
label: "Product",
collapsible: true,
collapsed: false,
items: [
"specs/home",
"specs/hub",
"specs/system-monitor",
"specs/settings",
],
},
{
type: "category",
label: "Engineering",
collapsible: true,
collapsed: false,
items: [
"specs/chats",
"specs/models",
"specs/threads",
"specs/messages",
// "specs/assistants",
// "specs/files",
// "specs/jan",
// "specs/fine-tuning",
// "specs/settings",
// "specs/prompts",
],
},
],
communitySidebar: [
"community/community",
{
type: "category",
label: "Events",
@ -122,6 +129,19 @@ const sidebars = {
},
],
},
],
aboutSidebar: [
{
type: "doc",
label: "About Jan",
id: "about/about",
},
{
type: "link",
label: "Careers",
href: "https://janai.bamboohr.com/careers",
},
{
type: "category",
label: "Company Handbook",