Merge pull request #1924 from SamPatt/local-server-documentation
docs: Updates Guide Using the Local Server

commit 9fcc341105

@@ -1,33 +0,0 @@
---
title: Connect to Server
description: Connect to Jan's built-in API server.
keywords:
  [
    Jan AI,
    Jan,
    ChatGPT alternative,
    local AI,
    private AI,
    conversational AI,
    no-subscription fee,
    large language model,
  ]
---

:::warning

This page is under construction.

:::

Jan ships with a built-in API server, that can be used as a drop-in, local replacement for OpenAI's API.

Jan runs on port `1337` by default, but this can (soon) be changed in Settings.

1. Go to Settings > Advanced > Enable API Server

2. Go to http://localhost:1337 for the API docs.

3. In terminal, simply CURL...

Note: Some UI states may be broken when in Server Mode.

docs/docs/guides/05-using-server/01-start-server.md (new file, 72 lines)
@@ -0,0 +1,72 @@
---
title: Start Local Server
slug: /guides/using-server/server
description: How to run Jan's built-in API server.
keywords:
  [
    Jan AI,
    Jan,
    ChatGPT alternative,
    local AI,
    private AI,
    conversational AI,
    no-subscription fee,
    large language model,
    local server,
    api server,
  ]
---

Jan ships with a built-in API server that can be used as a drop-in, local replacement for OpenAI's API. You can run your server by following these simple steps.

## Open Local API Server View

Navigate to the Local API Server view by clicking the corresponding icon on the left side of the screen.

<br></br>



## Choosing a Model

On the top right of your screen under `Model Settings`, set the LLM that your local server will be running. You can choose from any of the models already installed, or pick a new model by clicking `Explore the Hub`.

<br></br>



## Server Options

On the left side of your screen, you can set custom server options.

<br></br>



### Local Server Address

By default, Jan is accessible only on localhost (`127.0.0.1`). This means the local server can only be accessed from the same machine it is running on.

You can make the local server more accessible by clicking on the address and choosing `0.0.0.0` instead, which allows the server to be accessed from other devices on the local network. This is less secure than choosing localhost and should be done with caution.
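
With `0.0.0.0` selected, a quick way to confirm the server is reachable from a second device is a small probe like the sketch below. The LAN address `192.168.1.42` is a hypothetical placeholder for the address of the machine running Jan, and the default port `1337` is assumed.

```python
# Run this from another machine on the same local network.
# 192.168.1.42 is a hypothetical LAN address for the machine running Jan.
from urllib import request, error

SERVER = "http://192.168.1.42:1337/"

try:
    with request.urlopen(SERVER, timeout=5) as resp:
        print("Server reachable, HTTP status:", resp.status)
except error.HTTPError as e:
    # Any HTTP response (even a 404) still proves the server answered.
    print("Server reachable, HTTP status:", e.code)
except (error.URLError, OSError) as e:
    # Expected when the server is bound to localhost (127.0.0.1) only.
    print("Server not reachable:", e)
```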

### Port

Jan runs on port `1337` by default. You can change the port to any other available port number if needed.

### Cross-Origin Resource Sharing (CORS)

Cross-Origin Resource Sharing (CORS) controls which external domains (for example, a web app running in your browser) may access resources on the local server. It is enabled by default for security, but can be disabled if needed.

### Verbose Server Logs

The center of the screen displays the server logs as the local server runs. Enabling this option makes the logs include more detailed information about server activity.

## Start Server

Click the `Start Server` button on the top left of your screen. You will see the server log display a message such as `Server listening at http://127.0.0.1:1337`, and the `Start Server` button will change to a red `Stop Server` button.

<br></br>



Your server is now running, and you can use the server address and port to make requests to the local server.
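
Here is a minimal sketch of such a request using only the Python standard library. It assumes the default address and port, an installed `tinyllama-1.1b` model, and the OpenAI-style `/v1/chat/completions` route covered in the next guide.

```python
# Minimal chat request to the local server (assumes defaults: 127.0.0.1:1337).
import json
from urllib import request

url = "http://127.0.0.1:1337/v1/chat/completions"
body = {
    "model": "tinyllama-1.1b",  # use any model you have installed in Jan
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,
}
req = request.Request(
    url,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with request.urlopen(req) as resp:
    reply = json.loads(resp.read())

print(reply["choices"][0]["message"]["content"])
```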

docs/docs/guides/05-using-server/02-using-server.md (new file, 102 lines)
@@ -0,0 +1,102 @@
---
title: Using Jan's Built-in API Server
description: How to use Jan's built-in API server.
keywords:
  [
    Jan AI,
    Jan,
    ChatGPT alternative,
    local AI,
    private AI,
    conversational AI,
    no-subscription fee,
    large language model,
    local server,
    api server,
  ]
---

Jan's built-in API server is compatible with [OpenAI's API](https://platform.openai.com/docs/api-reference) and can be used as a drop-in, local replacement. Follow these steps to use the API server.
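
Because the API is OpenAI-compatible, existing OpenAI client libraries can typically be pointed at the local server instead. Below is a sketch using the official `openai` Python package (version 1.x, installed separately); the `/v1` base path and the placeholder API key are assumptions, since Jan does not require a real OpenAI key.

```python
# Drop-in use of the OpenAI Python client against the local Jan server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1337/v1",  # local Jan server (default port)
    api_key="not-needed",                 # placeholder; no real key required
)

completion = client.chat.completions.create(
    model="tinyllama-1.1b",  # any model installed in Jan
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

print(completion.choices[0].message.content)
```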

## Open the API Reference

Jan contains a comprehensive API reference. This reference displays all the available API endpoints, gives you example requests and responses, and allows you to execute them in the browser.

On the top left of your screen, below the red `Stop Server` button, is the blue `API Reference` button. Clicking it opens the reference in your browser.

<br></br>



Scroll through the available endpoints to learn what options are available, and try them out by executing the example requests. You can also use the [Jan API Reference](https://jan.ai/api-reference/) on the Jan website.

### Chat

In the Chat section of the API reference, you will see an example JSON request body.

<br></br>



With your local server running, you can click the `Try it out` button on the top left, then the blue `Execute` button below the JSON. The browser will send the example request to your server and display the response body below.

Use the API endpoints and the request and response body examples as models for your own application.

### cURL Request Example

Here is an example curl request to the OpenAI-style `/v1/chat/completions` endpoint (see the API reference above for the full list of routes) with a local server running `tinyllama-1.1b`:

<br></br>
```bash
curl http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "content": "You are a helpful assistant.",
        "role": "system"
      },
      {
        "content": "Hello!",
        "role": "user"
      }
    ],
    "model": "tinyllama-1.1b",
    "stream": true,
    "max_tokens": 2048,
    "stop": [
      "hello"
    ],
    "frequency_penalty": 0,
    "presence_penalty": 0,
    "temperature": 0.7,
    "top_p": 0.95
  }'
```

### Response Body Example

```json
{
  "choices": [
    {
      "finish_reason": null,
      "index": 0,
      "message": {
        "content": "Hello user. What can I help you with?",
        "role": "assistant"
      }
    }
  ],
  "created": 1700193928,
  "id": "ebwd2niJvJB1Q2Whyvkz",
  "model": "_",
  "object": "chat.completion",
  "system_fingerprint": "_",
  "usage": {
    "completion_tokens": 500,
    "prompt_tokens": 33,
    "total_tokens": 533
  }
}
```
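
In your own code, the fields of a non-streamed response like this can be read directly once the body is parsed. A small sketch, assuming the response body has already been loaded into a Python dict (for example with `json.loads`):

```python
def summarize(response: dict) -> str:
    """Pull the assistant reply and token usage out of a chat.completion response."""
    message = response["choices"][0]["message"]["content"]
    total = response["usage"]["total_tokens"]
    return f"{message} (total tokens: {total})"

# Usage (hypothetical): summarize(json.loads(raw_body)) after `import json`.
```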

docs/docs/guides/05-using-server/assets/01-choose-model.png   (new binary file, 328 KiB)
docs/docs/guides/05-using-server/assets/01-local-api-view.gif (new binary file, 1.2 MiB)
docs/docs/guides/05-using-server/assets/01-running-server.gif (new binary file, 3.7 MiB)
docs/docs/guides/05-using-server/assets/01-server-options.png (new binary file, 109 KiB)
docs/docs/guides/05-using-server/assets/02-api-reference.png  (new binary file, 90 KiB)
docs/docs/guides/05-using-server/assets/02-chat-example.png   (new binary file, 252 KiB)