docs(tabby): add tabby integration
parent 20c467e615 · commit 193cfba9ec
docs/src/pages/integrations/coding/tabby.mdx (new file, 104 lines)

---
title: Tabby
description: A step-by-step guide on integrating Jan with Tabby and VSCode, JetBrains, or other IDEs.
keywords:
  [
    Jan,
    Customizable Intelligence,
    LLM,
    local AI,
    privacy focus,
    free and open source,
    private and offline,
    conversational AI,
    no-subscription fee,
    large language models,
    Tabby integration,
    VSCode integration,
    JetBrains integration,
  ]
---

import { Tabs, Steps } from 'nextra/components'

# Tabby

## Integrate Jan with Tabby and Your Favorite IDEs

[Tabby](https://www.tabbyml.com/) is an open-source, self-hosted AI coding assistant.
With Tabby, teams can easily set up their own LLM-powered code completion server.

Tabby provides integrations with VSCode, JetBrains, and other IDEs to help developers code more efficiently, and it can be used with various LLM services, including Jan.

To integrate Jan with Tabby, follow these steps:

<Steps>

### Step 1: Enable the Jan API Server

To set up Tabby with Jan's local server, you must start the Jan API Server with your chosen model.

1. Click the **Local API Server** (`<>`) button above **Settings**. Jan will take you to the **Local API Server** section.
2. Configure the server, including the **IP Port**, **Cross-Origin Resource Sharing (CORS)**, and **Verbose Server Logs**.
3. Press the **Start Server** button.
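
Once the server is started, a quick way to confirm that it is listening is a plain TCP check. This is a minimal sketch, not part of Jan itself; the host and the default port `1337` are assumptions matching the default configuration above.

```python
import socket

def server_listening(host: str = "localhost", port: int = 1337, timeout: float = 2.0) -> bool:
    """Return True if something accepts TCP connections on (host, port)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused or timed out: the server is not reachable.
        return False

if __name__ == "__main__":
    print("Jan API server reachable:", server_listening())
```

If this prints `False`, double-check the IP port you configured and that the server was actually started.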

### Step 2: Find the Model ID and Ensure the Model is Activated

1. Go to `Settings` > `My Models`.
2. Models are listed with their **Model ID** beneath their names.
3. Click the **three dots (⋮)** button next to the model.
4. Select **Start Model** to activate the model.
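
Instead of reading the Model ID from the UI, you can also list the models Jan is serving through its OpenAI-compatible API. This sketch is illustrative rather than official Jan tooling; the base URL assumes the default port `1337` from Step 1.

```python
import json
import urllib.request

BASE_URL = "http://localhost:1337/v1"  # adjust if you changed the IP port

def models_url(base_url: str) -> str:
    """Build the URL for the OpenAI-compatible model-listing route."""
    return base_url.rstrip("/") + "/models"

def list_model_ids(base_url: str = BASE_URL) -> list[str]:
    """Fetch the models the server exposes and return their IDs."""
    with urllib.request.urlopen(models_url(base_url)) as resp:
        payload = json.load(resp)
    # OpenAI-compatible servers return {"data": [{"id": ...}, ...]}.
    return [model["id"] for model in payload.get("data", [])]

if __name__ == "__main__":
    print(list_model_ids())
```

Any ID printed here is the `model_name` value you will use in the Tabby configuration below.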

### Step 3: Install the Tabby Server

Use the following documentation to install the Tabby server:

- [Docker](https://tabby.tabbyml.com/docs/quick-start/installation/docker/)
- [Apple Silicon](https://tabby.tabbyml.com/docs/quick-start/installation/apple/)
- [Linux](https://tabby.tabbyml.com/docs/quick-start/installation/linux/)
- [Windows](https://tabby.tabbyml.com/docs/quick-start/installation/windows/)

Then follow the steps in [Connect Jan with Tabby](https://tabby.tabbyml.com/docs/references/models-http-api/jan.ai/) to connect Jan with the Tabby server.

For example, to connect Jan with Tabby, save the following configuration to `~/.tabby/config.toml`:

```toml title="~/.tabby/config.toml"
# Chat model
[model.chat.http]
kind = "openai/chat"
model_name = "model_id"  # replace with the Model ID from Step 2
api_endpoint = "http://localhost:1337/v1"
api_key = ""
```

Currently, Jan's completion and embedding APIs are under construction.
Once they are completed, you will also be able to connect Jan with Tabby for completion and embedding tasks.

### Step 4: Install Tabby in Your Favorite IDEs

Refer to the following documentation to install the Tabby extension in your favorite IDE:

- [Visual Studio Code](https://tabby.tabbyml.com/docs/extensions/installation/vscode/)
- [JetBrains IntelliJ Platform](https://tabby.tabbyml.com/docs/extensions/installation/intellij/)
- [Vim / NeoVim](https://tabby.tabbyml.com/docs/extensions/installation/vim/)

</Steps>

## How to Use Tabby with Jan Integration

### Answer Engine: Chat with Your Code and Documentation

Tabby offers an [Answer Engine](https://tabby.tabbyml.com/docs/administration/answer-engine/) on its homepage, which can leverage the Jan LLM and related context such as code, documentation, and web pages to answer user questions.

Simply open the Tabby homepage at [localhost:8080](http://localhost:8080) and ask your questions.

![Answer Engine](./_assets/tabby-answer-engine.png)

### IDE Chat Sidebar

After installing the Tabby extension in your preferred IDE, you can chat with Jan to:

1. Discuss your code, receive suggestions, and seek assistance.
2. Ask Jan to edit your code inline, then review and accept the proposed changes.

![Chat Sidebar](./_assets/tabby-chat-sidebar.png)