diff --git a/docs/docs/guides/00-overview.md b/docs/docs/guides/00-overview.md
index e6cf5d99a..8c8ca8d7a 100644
--- a/docs/docs/guides/00-overview.md
+++ b/docs/docs/guides/00-overview.md
@@ -15,36 +15,32 @@ keywords:
   ]
 ---
 
-Jan is a ChatGPT-alternative that runs on your own computer, with a [local API server](/api-reference/).
+Jan is a ChatGPT alternative that runs on your own computer, with a [local API server](/guides/using-server).
 
-Jan uses [open-source AI models](/docs/models), stores data in [open file formats](/specs/data-structures), is highly customizable via [extensions](/docs/extensions).
+We believe in the need for an open source AI ecosystem. We're focused on building infra, tooling, and [custom models](https://huggingface.co/janhq) to allow open source AIs to compete on a level playing field with proprietary offerings.
 
-Jan believes in the need for an open source AI ecosystem. We aim to build infra and tooling to allow open source AIs to compete on a level playing field with proprietary offerings.
+## Features
+
+- Compatible with [open-source models](/guides/using-models) (GGUF, TensorRT, and remote APIs)
+- Compatible with most OSes: [Windows](/install/windows/), [Mac](/install/mac), [Linux](/install/linux), with/without GPU acceleration
+- Stores data in [open file formats](/developer/file-based)
+- Customizable via [extensions](/developer/build-extension)
+- And more in the [roadmap](https://github.com/orgs/janhq/projects/5/views/16). Join us on [Discord](https://discord.gg/5rQ2zTv3be) and tell us what you want to see!
 
 ## Why Jan?
 
 #### 💻 Own your AI
 
-Jan runs 100% on your own machine, [predictably](https://www.reddit.com/r/LocalLLaMA/comments/17mghqr/comment/k7ksti6/?utm_source=share&utm_medium=web2x&context=3), privately and even offline. No one else can see your conversations, not even us.
+Jan runs 100% on your own machine, predictably, privately, and offline. No one else can see your conversations, not even us.
 
 #### 🏗️ Extensions
 
-Jan ships with a powerful [extension framework](/docs/extensions), which allows developers to extend and customize Jan's functionality. In fact, most core modules of Jan are [built as extensions](/specs/architecture) and use the same extensions API.
+Jan ships with a local-first, AI-native, and cross-platform [extensions framework](/developer/build-extension). Developers can extend and customize everything from functionality to UI to branding. In fact, Jan's current main features are built as extensions on top of this framework.
 
 #### 🗂️ Open File Formats
 
-Jan stores data in a [local folder of non-proprietary files](/specs/data-structures). You're never locked-in and can do what you want with your data with extensions, or even a different app.
+Jan stores data in your [local filesystem](/developer/file-based). Your data never leaves your computer. You are free to delete, export, or migrate your data, even to a different platform.
 
 #### 🌍 Open Source
 
 Both Jan and [Nitro](https://nitro.jan.ai), our lightweight inference engine, are licensed via the open source [AGPLv3 license](https://github.com/janhq/jan/blob/main/LICENSE).
-
-
-
-
diff --git a/docs/docs/guides/02-installation/05-hardware.md b/docs/docs/guides/02-installation/05-hardware.md
new file mode 100644
index 000000000..c84e16bde
--- /dev/null
+++ b/docs/docs/guides/02-installation/05-hardware.md
@@ -0,0 +1,55 @@
+---
+title: Hardware Requirements
+description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
+keywords:
+  [
+    Jan AI,
+    Jan,
+    ChatGPT alternative,
+    local AI,
+    private AI,
+    conversational AI,
+    no-subscription fee,
+    large language model,
+  ]
+---
+
+Jan is designed to be lightweight and able to run Large Language Models (LLMs) out-of-the-box.
+
+The current download size is less than 150 MB, and the installed app takes up roughly 300 MB of disk space.
+
+To ensure optimal performance, please see the following system requirements:
+
+## Disk Space
+
+- Minimum requirement
+  - At least 5 GB of free disk space is required to accommodate the download, storage, and management of open-source LLM models.
+- Recommended
+  - For an optimal experience and to run most available open-source LLM models on Jan, it is recommended to have 10 GB of free disk space.
+
+## RAM and GPU VRAM
+
+The amount of RAM on your system plays a crucial role in determining the size and complexity of LLM models you can effectively run. Jan can be utilized on traditional computers where RAM is a key resource. For enhanced performance, Jan also supports GPU acceleration, utilizing the VRAM of your graphics card.
+
+## Best Models for Your RAM/VRAM
+
+The RAM and GPU VRAM requirements are dependent on the size and complexity of the LLM models you intend to run. The following are some general guidelines to help you determine the amount of RAM or VRAM you need to run LLM models on Jan:
+
+- `8 GB of RAM`: Suitable for running smaller models, like 3B models or quantized 7B models
+- `16 GB of RAM (recommended)`: This is considered the "minimum usable models" threshold, particularly for 7B models (e.g. Mistral 7B)
+- `Beyond 16 GB of RAM`: Required for handling larger and more sophisticated models, such as 70B models.
+
+## Architecture
+
+Jan is designed to run on multiple architectures to ensure versatility and widespread usability. The supported architectures include:
+
+### CPU Support
+
+- `x86`: Jan is well-suited for systems with x86 architecture, which is commonly found in traditional desktops and laptops. It ensures smooth performance on a variety of devices using x86 processors.
+- `ARM`: Jan is optimized to run efficiently on ARM-based systems, extending compatibility to a broad range of devices using ARM processors.
+
+### GPU Support
+
+- `NVIDIA`
+- `AMD`
+- `ARM64 Mac`
diff --git a/docs/docs/guides/02-installation/README.mdx b/docs/docs/guides/02-installation/README.mdx
index 1aa2c5ede..00dc4ef57 100644
--- a/docs/docs/guides/02-installation/README.mdx
+++ b/docs/docs/guides/02-installation/README.mdx
@@ -21,9 +21,8 @@ import TabItem from "@theme/TabItem";
 
 In this quickstart we'll show you how to:
 
 - Download the Jan Desktop client - Mac, Windows, Linux, (and toaster) compatible
-- Download and customize models
-- Import custom models
-- Use the local server at port `1337`
+- Download the Nightly (unstable) version
+- Build the application from source
 
 ## Setup
 
@@ -50,89 +49,3 @@ In this quickstart we'll show you how to:
 
 - To build Jan Desktop from scratch (and have the right to tinker!) See the [Build from Source](/install/from-source) guide.
-
-### Working with Models
-
-Jan provides a list of recommended models to get you started.
-You can find them in the in-app Hub.
-
-1. `cmd + k` and type "hub" to open the Hub.
-2. Download your preferred models.
-3. `cmd + k` and type "chat" to open the conversation UI and start chatting.
-4. Your model may take a few seconds to start up.
-5. You can customize the model settings, at each conversation thread level, on the right panel.
-6. To change model defaults globally, edit the `model.json` file. See the [Models](/guides/models) guide.
-
-### Importing Models
-
-Jan is compatible with all GGUF models.
-
-For more information on how to import custom models, not found in the Hub, see the [Models](/guides/models) guide.
-
-## Working with the Local Server
-
-> This feature is currently under development. So expect bugs!
-
-Jan runs a local server on port `1337` by default.
-
-The endpoints are OpenAI compatible.
-
-See the [API server guide](/guides/server) for more information.
-
-## Next Steps
-
----
-
-TODO: Merge this in:
-
-Getting up and running open-source AI models on your own computer with Jan is quick and easy. Jan is lightweight and can run on a variety of hardware and platform versions. Specific requirements tailored to your platform are outlined below.
-
-## Cross platform
-
-A free, open-source alternative to OpenAI that runs on the Linux, macOS, and Windows operating systems. Please refer to the specific guides below for your platform
-
-- [Linux](/install/linux)
-- [MacOS (Mac Intel Chip and Mac Apple Silicon Chip)](/install/mac)
-- [Windows](/install/windows)
-
-## Requirements for Jan
-
-### Hardware
-
-Jan is a lightweight platform designed for seamless download, storage, and execution of open-source Large Language Models (LLMs). With a small download size of less than 200 MB and a disk footprint of under 300 MB, Jan is optimized for efficiency and should run smoothly on modern hardware.
-
-To ensure optimal performance while using Jan and handling LLM models, it is recommended to meet the following system requirements:
-
-#### Disk space
-
-- Minimum requirement
-  - At least 5 GB of free disk space is required to accommodate the download, storage, and management of open-source LLM models.
-- Recommended
-  - For an optimal experience and to run most available open-source LLM models on Jan, it is recommended to have 10 GB of free disk space.
-
-#### Random Access Memory (RAM) and Graphics Processing Unit Video Random Access Memory (GPU VRAM)
-
-The amount of RAM on your system plays a crucial role in determining the size and complexity of LLM models you can effectively run. Jan can be utilized on traditional computers where RAM is a key resource. For enhanced performance, Jan also supports GPU acceleration, utilizing the VRAM of your graphics card.
-
-#### Relationship between RAM and VRAM Sizes in Relation to LLM Models
-
-The RAM and GPU VRAM requirements are dependent on the size and complexity of the LLM models you intend to run. The following are some general guidelines to help you determine the amount of RAM or VRAM you need to run LLM models on Jan
-
-- 8 GB of RAM: Suitable for running smaller models like 3B models or quantized 7B models
-- 16 GB of RAM(recommended): This is considered the "minimum usable models" threshold, particularly for 7B models (e.g Mistral 7B, etc)
-- Beyond 16GB of RAM: Required for handling larger and more sophisticated model, such as 70B models.
-
-### Architecture
-
-Jan is designed to run on muptiple architectures, versatility and widespread usability. The supported architectures include:
-
-#### CPU
-
-- x86: Jan is well-suited for systems with x86 architecture, which is commonly found in traditional desktops and laptops. It ensures smooth performance on a variety of devices using x86 processors.
-- ARM: Jan is optimized to run efficiently on ARM-based systems, extending compatibility to a broad range of devices using ARM processors.
-
-#### GPU
-
-- NVIDIA: Jan optimizes the computational capabilities of NVIDIA GPUs, achieving efficiency through the utilization of llama.cpp. This strategic integration enhances the performance of Jan, particularly in resource-intensive Language Model (LLM) tasks. Users can expect accelerated processing and improved responsiveness when leveraging the processing capabilities inherent in NVIDIA GPUs.
-- AMD: Users with AMD GPUs can seamlessly integrate Jan's GPU acceleration, offering a comprehensive solution for diverse hardware configurations and preferences.
-- ARM64 Mac: Jan seamlessly supports ARM64 architecture on Mac systems, leveraging Metal for efficient GPU operations. This ensures a smooth and efficient experience for users with Apple Silicon Chips, utilizing the power of Metal for optimal performance on ARM64 Mac devices.
diff --git a/docs/docs/guides/03-chatting/01-start-thread.md b/docs/docs/guides/03-chatting/01-start-thread.md
deleted file mode 100644
index c2176a430..000000000
--- a/docs/docs/guides/03-chatting/01-start-thread.md
+++ /dev/null
@@ -1,12 +0,0 @@
----
-title: Starting a Thread
----
-
-Rough outline:
-Choosing an assistant
-Setting assistant instructions
- At thread level
- Globally, as default
-Choosing a model
-Customizing model params (thread level)
-Customizing engine params
diff --git a/docs/docs/guides/03-chatting/02-upload-docs.md b/docs/docs/guides/03-chatting/02-upload-docs.md
deleted file mode 100644
index 1f635e31a..000000000
--- a/docs/docs/guides/03-chatting/02-upload-docs.md
+++ /dev/null
@@ -1,3 +0,0 @@
----
-title: Uploading docs
----
diff --git a/docs/docs/guides/03-chatting/03-upload-images.md b/docs/docs/guides/03-chatting/03-upload-images.md
deleted file mode 100644
index 69a7ea95a..000000000
--- a/docs/docs/guides/03-chatting/03-upload-images.md
+++ /dev/null
@@ -1,3 +0,0 @@
----
-title: Uploading Images
----
diff --git a/docs/docs/guides/03-chatting/04-manage-history.md b/docs/docs/guides/03-chatting/04-manage-history.md
index 9062a15a7..1b21cc0eb 100644
--- a/docs/docs/guides/03-chatting/04-manage-history.md
+++ b/docs/docs/guides/03-chatting/04-manage-history.md
@@ -1,3 +1,56 @@
 ---
 title: Manage Chat History
+slug: /guides/chatting/manage-history/
+description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
+keywords:
+  [
+    Jan AI,
+    Jan,
+    ChatGPT alternative,
+    local AI,
+    private AI,
+    conversational AI,
+    no-subscription fee,
+    large language model,
+    manage-chat-history,
+  ]
 ---
+
+Jan offers a convenient and private way to interact with a conversational AI locally on your computer. This guide will walk you through how to manage your chat history with Jan, ensuring your interactions remain private and organized.
+
+## Viewing Chat History
+
+1. Navigate to the main dashboard.
+2. Locate the list of threads on the left side of the screen. This list shows all your conversations.
+3. Select a thread to view the conversation in the main chat window.
+4. Scroll up and down to view the entire chat history in the selected thread.
+
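+Because Jan stores chat history in plain files on your own machine, you can also inspect a thread's history outside the app. The snippet below is a minimal sketch, assuming each thread keeps its messages as one JSON object per line in a `messages.jsonl` file inside its thread folder; the folder location and field names are assumptions and may differ between Jan versions, so treat the path and keys as illustrative.
+
+```python
+import json
+from pathlib import Path
+
+# Illustrative path only: replace it with the thread folder that
+# "Reveal in Finder" (described in the next section) opens on your machine.
+thread_dir = Path.home() / "jan" / "threads" / "my-thread-id"
+
+messages_file = thread_dir / "messages.jsonl"  # assumed file name
+for line in messages_file.read_text(encoding="utf-8").splitlines():
+    message = json.loads(line)
+    # "role" and "content" are assumed keys; fall back to the raw object.
+    role = message.get("role", "unknown")
+    content = message.get("content", message)
+    print(f"{role}: {content}")
+```
+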
+![viewing-chat-history](./assets/viewing-chat-history.gif)
+
+## Managing Threads via Folders
+
+This feature allows you to directly manage your thread history and configurations.
+
+1. Navigate to the thread that you want to manage via the list of threads on the left side of the dashboard.
+2. Click on the three dots (⋮) on the `Thread` section on the right side of the dashboard. There are two options:
+
+- `Reveal in Finder` will open the folder containing the thread's history and configurations.
+- `View as JSON` will open the `thread.json` file in your default browser.
+
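+If you prefer to script this instead of clicking through your file manager, the sketch below lists every thread's `thread.json`. It assumes threads are stored as sibling folders that each contain a `thread.json` file; the data-folder path and the `title` field are assumptions, so adjust them to match whatever `Reveal in Finder` shows on your system.
+
+```python
+import json
+from pathlib import Path
+
+# Illustrative path only: use the parent of the folder that "Reveal in Finder" opens.
+threads_dir = Path.home() / "jan" / "threads"
+
+for thread_json in sorted(threads_dir.glob("*/thread.json")):
+    data = json.loads(thread_json.read_text(encoding="utf-8"))
+    # "title" is an assumed key; fall back to the folder name if it is missing.
+    title = data.get("title", thread_json.parent.name)
+    print(f"{thread_json.parent.name}: {title}")
+```
+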
+![managing-threads-via-folders](./assets/managing-threads-via-folders.gif)
+
+## Clean Thread
+
+To streamline your conversation view, click on the three dots (⋮) on the thread you want to clean, then select `Clean Thread`. It will remove all messages from the thread. It is useful if you want to keep the thread settings but remove the messages from the chat window.
+
+![clean-thread](./assets/clean-thread.gif)
+
+## Delete Thread
+
+To delete a thread, click on the three dots (⋮) on the thread you want to delete, then select `Delete Thread`. It will remove the thread from the list of threads.
+
+![delete-thread](./assets/delete-thread.gif)
diff --git a/docs/docs/guides/03-chatting/assets/choose-model.png b/docs/docs/guides/03-chatting/assets/choose-model.png
new file mode 100644
index 000000000..d109f03e0
Binary files /dev/null and b/docs/docs/guides/03-chatting/assets/choose-model.png differ
diff --git a/docs/docs/guides/03-chatting/assets/clean-thread.gif b/docs/docs/guides/03-chatting/assets/clean-thread.gif
new file mode 100644
index 000000000..e4f7c678e
Binary files /dev/null and b/docs/docs/guides/03-chatting/assets/clean-thread.gif differ
diff --git a/docs/docs/guides/03-chatting/assets/customize-model-params.png b/docs/docs/guides/03-chatting/assets/customize-model-params.png
new file mode 100644
index 000000000..8e27fdbe3
Binary files /dev/null and b/docs/docs/guides/03-chatting/assets/customize-model-params.png differ
diff --git a/docs/docs/guides/03-chatting/assets/delete-thread.gif b/docs/docs/guides/03-chatting/assets/delete-thread.gif
new file mode 100644
index 000000000..92ae66cb7
Binary files /dev/null and b/docs/docs/guides/03-chatting/assets/delete-thread.gif differ
diff --git a/docs/docs/guides/03-chatting/assets/managing-threads-via-folders.gif b/docs/docs/guides/03-chatting/assets/managing-threads-via-folders.gif
new file mode 100644
index 000000000..abbb2313d
Binary files /dev/null and b/docs/docs/guides/03-chatting/assets/managing-threads-via-folders.gif differ
diff --git a/docs/docs/guides/03-chatting/assets/setting-assistant-instructions.png b/docs/docs/guides/03-chatting/assets/setting-assistant-instructions.png
new file mode 100644
index 000000000..8c9b39d19
Binary files /dev/null and b/docs/docs/guides/03-chatting/assets/setting-assistant-instructions.png differ
diff --git a/docs/docs/guides/03-chatting/assets/setting-thread-title.png b/docs/docs/guides/03-chatting/assets/setting-thread-title.png
new file mode 100644
index 000000000..b9ab4c71c
Binary files /dev/null and b/docs/docs/guides/03-chatting/assets/setting-thread-title.png differ
diff --git a/docs/docs/guides/03-chatting/assets/start-thread.gif b/docs/docs/guides/03-chatting/assets/start-thread.gif
new file mode 100644
index 000000000..2c944dbb9
Binary files /dev/null and b/docs/docs/guides/03-chatting/assets/start-thread.gif differ
diff --git a/docs/docs/guides/03-chatting/assets/viewing-chat-history.gif b/docs/docs/guides/03-chatting/assets/viewing-chat-history.gif
new file mode 100644
index 000000000..3fed0f25c
Binary files /dev/null and b/docs/docs/guides/03-chatting/assets/viewing-chat-history.gif differ
diff --git a/docs/docs/guides/04-using-models/04-customize-models.md b/docs/docs/guides/04-using-models/04-customize-models.md
deleted file mode 100644
index 21dfeb33f..000000000
--- a/docs/docs/guides/04-using-models/04-customize-models.md
+++ /dev/null
@@ -1,3 +0,0 @@
----
-title: Customize Model Defaults
----
diff --git a/docs/docs/guides/04-using-models/05-package-models.md b/docs/docs/guides/04-using-models/05-package-models.md
deleted file mode 100644
index 93ca62d6c..000000000
--- a/docs/docs/guides/04-using-models/05-package-models.md
+++ /dev/null
@@ -1,3 +0,0 @@
----
-title: Package & Publish Models
----