docs: update installation guide
parent bea0678e26
commit 12f1897e27
@@ -2,76 +2,21 @@
title: Linux
---

# Jan on Linux

## Installation

1. To download the latest version of Jan on Linux, please visit [Jan's homepage](https://jan.ai/).
2. For Debian/Ubuntu-based distributions, the recommended installation method is through the `.deb` package (64-bit). This can be done either through the graphical software center, if available, or via the command line using the following:

```bash
sudo apt install ./jan-linux-amd64-<version>.deb
# sudo apt install ./jan-linux-arm64-0.3.1.deb
```
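
If you want to double-check that the package was registered correctly, a quick query against `dpkg` is one option. This is a sanity-check sketch and assumes the package name is `jan`, the same name used in the uninstall command below:

```bash
# Sanity check: confirm the Jan package is installed (assumes the package name is "jan")
dpkg -s jan | grep -E '^(Package|Status|Version)'
```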

## Uninstall Jan

To uninstall Jan on Linux, you should use your package manager's uninstall or remove option. For Debian/Ubuntu-based distributions, if you installed Jan via the `.deb` package, you can uninstall it using the following command:

```bash
sudo apt-get remove jan
# where jan is the name of the Jan package
```

In case you wish to completely remove all user data associated with Jan after uninstallation, you can delete the user data folders located at `$HOME/.config/Jan` and `~/.jan`. This will return your system to its state prior to the installation of Jan. This method can also be used to reset all settings if you are experiencing any issues with Jan.
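
For reference, a full data cleanup after uninstalling could look like the sketch below. The paths are the ones mentioned above; double-check them before running, since `rm -rf` deletes without confirmation:

```bash
# Remove Jan user data and settings (destructive; paths taken from the note above)
rm -rf "$HOME/.config/Jan"
rm -rf "$HOME/.jan"
```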
@@ -2,83 +2,43 @@
title: Mac
---

# Jan on MacOS

## Installation

1. To download the latest version of Jan on MacOS, please visit [Jan's homepage](https://jan.ai/).
2. On the homepage, please choose the appropriate release version for your system architecture as follows (if you are unsure which one applies, see the quick check below):
   - Intel Mac: `jan-mac-x64-<version>.dmg`
   - Apple Silicon Mac: `jan-mac-arm64-<version>.dmg`
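
If you are unsure which chip your Mac has, checking the machine architecture from the terminal is a quick way to pick the right file:

```sh
# Prints "x86_64" on Intel Macs and "arm64" on Apple Silicon Macs
uname -m
```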

## Uninstall Jan

As Jan is in development mode, you might get stuck on a broken build.

To reset your installation:

1. Delete Jan from your `/Applications` folder

2. Delete Application data:

```sh
# Newer versions
rm -rf /Users/$(whoami)/Library/Application\ Support/jan

# Versions 0.2.0 and older
rm -rf /Users/$(whoami)/Library/Application\ Support/jan-electron
```

3. Clear Application cache:

```sh
rm -rf /Users/$(whoami)/Library/Caches/jan*
```

4. Use the following commands to remove any dangling backend processes:

```sh
ps aux | grep nitro
```

Look for processes like "nitro" and "nitro_arm_64," and kill them one by one with:

```sh
kill -9 <PID>
```
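
As an alternative to killing PIDs one by one, a single `pkill` call can match on the process name. This is a convenience sketch; it assumes nothing else on your system matches the pattern `nitro`:

```sh
# Kill every process whose command line contains "nitro"
pkill -9 -f nitro
```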

## Common Questions

### Does Jan run on Apple Silicon machines?

Yes, Jan supports MacOS Arm64 builds that can run on Macs with Apple Silicon chips. You can install Jan on your Apple Silicon Mac by downloading the `jan-mac-arm64-<version>.dmg` file from [Jan's homepage](https://jan.ai/).

### Which package should I download for my Mac?

Jan supports both Intel and Apple Silicon Macs. To find the appropriate package for your Mac, please follow this official guide from Apple: [Get system information about your Mac - Apple Support](https://support.apple.com/guide/mac-help/syspr35536/mac).
docs/docs/install/overview.md (new file)
@@ -0,0 +1,43 @@
---
title: Overview
---

Getting open-source AI models up and running on your own computer with Jan is quick and easy. Jan is lightweight and can run on a variety of hardware and platform versions. Specific requirements tailored to your platform are outlined below.

## Cross platform

Jan is a free, open-source alternative to OpenAI that runs on the Linux, macOS, and Windows operating systems. Please refer to the specific guide for your platform:

- [Linux](/install/linux)
- [MacOS (Mac Intel Chip and Mac Apple Silicon Chip)](/install/mac)
- [Windows](/install/windows)

## Requirements for Jan

### Hardware

Jan is a lightweight platform designed for seamless download, storage, and execution of open-source Large Language Models (LLMs). With a download size of less than 200 MB and a disk footprint of under 300 MB, Jan is optimized for efficiency and should run smoothly on modern hardware.

To ensure optimal performance while using Jan and handling LLM models, it is recommended to meet the following system requirements:

#### Disk space

- Minimum requirement
  - At least 5 GB of free disk space is required to accommodate the download, storage, and management of open-source LLM models.
- Recommended
  - For an optimal experience and to run most available open-source LLM models on Jan, it is recommended to have 10 GB of free disk space.

#### Random Access Memory (RAM) and GPU Video RAM (VRAM)

The amount of RAM on your system plays a crucial role in determining the size and complexity of the LLM models you can effectively run. Jan can be used on traditional computers where RAM is a key resource. For enhanced performance, Jan also supports GPU acceleration, utilizing the VRAM of your graphics card.

#### Relationship between RAM and VRAM Sizes and LLM Models

The RAM and GPU VRAM requirements depend on the size and complexity of the LLM models you intend to run. The following general guidelines can help you determine how much RAM or VRAM you need to run LLM models on Jan (a rough sizing example follows the list):

- 8 GB of RAM: Suitable for running smaller models, such as 3B models or quantized 7B models.
- 16 GB of RAM (recommended): The minimum usable threshold for 7B models (e.g., Mistral 7B).
- Beyond 16 GB of RAM: Required for handling larger and more sophisticated models, such as 70B models.
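
As a rough, back-of-envelope illustration of why RAM scales with model size (an approximation, not an official Jan figure): the weights alone need roughly `parameter count × bytes per parameter`, plus overhead for context and runtime buffers. For example:

```bash
# 7B parameters at 4-bit quantization ≈ 0.5 bytes per parameter
echo "7 * 0.5" | bc   # ≈ 3.5 GB of weights, before runtime overhead
```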

### Architecture

Jan is designed to run on multiple architectures, ensuring versatility and widespread usability. The supported architectures include:

#### CPU

- x86: Jan is well-suited for systems with x86 architecture, which is commonly found in traditional desktops and laptops. It ensures smooth performance on a variety of devices using x86 processors.
- ARM: Jan is optimized to run efficiently on ARM-based systems, extending compatibility to a broad range of devices using ARM processors.

#### GPU

- NVIDIA: Jan leverages the computational capabilities of NVIDIA GPUs through llama.cpp, improving performance in resource-intensive LLM tasks. Users can expect accelerated processing and improved responsiveness when running on NVIDIA GPUs.
- AMD: Users with AMD GPUs can also take advantage of Jan's GPU acceleration, providing a solution for diverse hardware configurations and preferences.
- ARM64 Mac: Jan supports ARM64 architecture on Mac systems, leveraging Metal for efficient GPU operations. This ensures a smooth and efficient experience for users with Apple Silicon chips.
@@ -2,76 +2,19 @@
title: Windows
---

# Jan on Windows

## Installation

1. To download the latest version of Jan on Windows, please visit [Jan's homepage](https://jan.ai/).
2. Once the Jan installer (jan-win-x64-{version}.exe) is downloaded, run the installer. The installation process is expected to take around a minute.
3. By default, Jan is installed under this directory:

```bash
C:\Users\{username}\AppData\Local\Programs\Jan
```

## Uninstall Jan

If you have installed Jan on your Windows computer, either as a User or System installation, you can easily uninstall it from the Windows Control Panel.

In case you wish to completely remove all user data associated with Jan after uninstallation, you can delete the user data folder located at `%APPDATA%\Jan`. This will return your system to its state prior to the installation of Jan. This method can also be used to reset all settings if you are experiencing any issues with Jan.

## Common Questions

### Windows Defender Warning

When initiating the Jan Installer, be aware that Windows Defender may display a warning because the application does not originate from the Microsoft Store. This warning is a standard security measure, as the installer will make changes to the system. To proceed, open the "More info" section of the warning and select the option to continue with the installation.
@@ -27,7 +27,7 @@ const sidebars = {
label: "Installation",
collapsible: true,
collapsed: true,
items: ["install/overview", "install/windows", "install/mac", "install/linux"],
},
{
type: "category",