# Introduction
Jan can be used to build a variety of AI use cases, at every level of the stack:

- An OpenAI-compatible API, with feature parity for `models`, `assistants`, `files`, and more
- A standard data format on top of the user's local filesystem, allowing for transparency and composability
- Automatic packaging and distribution to Mac, Windows, and Linux; cloud coming soon
- A UI kit to customize user interactions with `assistants` and more
- A standalone inference engine for low-level use cases
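Since the API above aims for OpenAI compatibility, a request to it would have the same shape as a request to OpenAI. A minimal sketch from Node — the base URL, port, and model name here are assumptions, not fixed by this document:

```javascript
// Hypothetical base URL for a locally running Jan server (an assumption).
const BASE_URL = "http://localhost:1337/v1";

// Build a request body in the same shape as OpenAI's POST /chat/completions.
function buildChatRequest(model, userMessage) {
  return {
    model,
    messages: [{ role: "user", content: userMessage }],
  };
}

// Sending it would look like this (commented out so the sketch runs offline):
// fetch(`${BASE_URL}/chat/completions`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildChatRequest("llama-7b-q4", "Hello!")),
// }).then((r) => r.json()).then(console.log);
```

Because the request shape matches OpenAI's, existing OpenAI client libraries can be pointed at the local endpoint by changing only the base URL.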
## Resources
- Create an AI assistant
- Run an OpenAI compatible API endpoint
- Build a VSCode plugin with a local model
- Build a Jan platform module
## Key Concepts

### Modules
Jan is comprised of system-level modules that mirror OpenAI's, exposing similar APIs and objects.

- Modules are atomic implementations of a single OpenAI-compatible endpoint
- Modules can be swapped out for alternate implementations
- The default `messages` module persists messages in thread-specific `.json` files
- The `messages-postgresql` module uses Postgres for production-grade, cloud-native environments
| Jan Module | Description | API Docs |
|---|---|---|
| Chat | Inference | /chat |
| Models | Models | /model |
| Assistants | Apps | /assistant |
| Threads | Conversations | /thread |
| Messages | Messages | /message |
### Local Filesystem

Jan uses the local filesystem for data persistence, similar to VSCode. This allows for composability and tinkerability.
```
/janroot                # Jan's root folder (e.g. ~/jan)
  /models               # For raw AI models
  /threads              # For conversation history
  /assistants           # For AI assistants' configs, knowledge, etc.

/models
  /modelA
    model.json          # Default model settings
    llama-7b-q4.gguf    # Model binaries
    llama-7b-q5.gguf    # Include different quantizations

/threads
  /jan-unixstamp-salt
    model.json          # Overrides assistant/model-level model settings
    thread.json         # Thread metadata (e.g. subject)
    messages.json       # Messages
    content.json        # What is this?
    files/              # Future: for RAG

/assistants
  /jan
    assistant.json      # Assistant configs (see below)
                        # For any custom code:
    package.json        # Import npm modules,
                        #   e.g. Langchain, Llamaindex
    /src                # Supporting files (needs better name)
      index.js          # Entrypoint
      process.js        # For electron IPC processes (needs better name)
    # `/threads` at root level
    # `/models` at root level
  /shakespeare
    assistant.json
    model.json          # Creator chooses model and settings
    package.json
    /src
      index.js
      process.js
    /threads            # Assistants remember conversations in the future
    /models             # Users can upload custom models
      /finetuned-model
```
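The tree above references `assistant.json` configs without showing one. As a purely hypothetical sketch — none of these field names are specified in this document, and the actual schema may differ:

```json
{
  "name": "Jan",
  "description": "A global assistant for chatting with any open source model",
  "model": "llama-7b-q4"
}
```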
### Jan: a "global" assistant

Jan ships with a default assistant, "Jan", that lets users chat with any open source model out-of-the-box.

This assistant is defined in `/jan`. It is a generic assistant that illustrates the power of Jan. In the future, it will support additional features, e.g. multi-assistant conversations.
- Your assistant "Jan" lets you pick any model in the root `/models` folder
- Right panel: pick the LLM model and set model parameters
- Jan's threads live at the root level
- `model.json` reflects the model chosen for that session
- You will be able to "add" other assistants in the future
- Jan's files live at the thread level
- Jan is not a persistent-memory assistant
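The settings cascade implied above — a thread-level `model.json` overriding assistant-level settings, which override the model's defaults — could be sketched as follows; the keys shown are illustrative assumptions:

```javascript
// Later sources win: model defaults < assistant overrides < thread overrides.
function resolveModelSettings(modelDefaults, assistantOverrides, threadOverrides) {
  return { ...modelDefaults, ...assistantOverrides, ...threadOverrides };
}

const settings = resolveModelSettings(
  { id: "llama-7b-q4", temperature: 0.7 }, // /models/modelA/model.json
  { temperature: 0.2 },                    // assistant-level choice
  { id: "llama-7b-q5" }                    // /threads/<thread>/model.json
);
// settings: { id: "llama-7b-q5", temperature: 0.2 }
```

This is why the thread-level `model.json` can change the model mid-project without touching the assistant's or model's own config files.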