---
title: Integrate OpenRouter with Jan
slug: /guides/integrations/openrouter
description: Guide to integrate OpenRouter with Jan
keywords:
  [
    Jan AI,
    Jan,
    ChatGPT alternative,
    local AI,
    private AI,
    conversational AI,
    no-subscription fee,
    large language model,
    OpenRouter integration,
  ]
---
## Quick Introduction
[OpenRouter](https://openrouter.ai/docs#quick-start) is an AI model aggregator. Developers can use its API to interact with a variety of large language models, generative image models, and generative 3D object models.
In this guide, we will show you how to integrate OpenRouter with Jan, enabling you to use the remote Large Language Models (LLMs) available on OpenRouter.
## Steps to Integrate OpenRouter with Jan
### 1. Configure OpenRouter API key
You can find your API key on the [OpenRouter Keys](https://openrouter.ai/keys) page. Set the OpenRouter API key in the `~/jan/engines/openai.json` file.
```json title="~/jan/engines/openai.json"
{
  // highlight-start
  "full_url": "https://openrouter.ai/api/v1/chat/completions",
  "api_key": "sk-or-v1<your-openrouter-api-key-here>"
  // highlight-end
}
```
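If you want to confirm the key works before restarting Jan, you can call the same endpoint directly. The sketch below is illustrative, not part of Jan: it assumes Node.js 18+ (for the built-in `fetch`), uses a placeholder file name, and uses the Dolphin model from the next step as an example model ID.
```typescript title="check-openrouter-key.ts"
// Minimal sanity check against the OpenRouter chat completions endpoint.
// Replace API_KEY with your own key before running.
const API_KEY = "sk-or-v1<your-openrouter-api-key-here>";

async function main() {
  const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "cognitivecomputations/dolphin-mixtral-8x7b",
      messages: [{ role: "user", content: "Say hello in one sentence." }],
    }),
  });

  if (!response.ok) {
    throw new Error(`OpenRouter returned ${response.status}: ${await response.text()}`);
  }

  const data = await response.json();
  console.log(data.choices[0].message.content);
}

main().catch(console.error);
```
If the request prints a reply, the key is valid and Jan will be able to use it through the `openai.json` configuration above.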
### 2. Modify a Model JSON
Navigate to the `~/jan/models` folder. Create a folder named `<openrouter-modelname>`, for example `openrouter-dolphin-mixtral-8x7b`, and create a `model.json` file inside it with the following configuration:
- Ensure the file is named `model.json`.
- Ensure the `id` property is set to the model ID from OpenRouter.
- Ensure the `format` property is set to `api`.
- Ensure the `engine` property is set to `openai`.
- Ensure the `state` property is set to `ready`.
```json title="~/jan/models/openrouter-dolphin-mixtral-8x7b/model.json"
{
  "sources": [
    {
      "filename": "openrouter",
      "url": "https://openrouter.ai/"
    }
  ],
  "id": "cognitivecomputations/dolphin-mixtral-8x7b",
  "object": "model",
  "name": "Dolphin 2.6 Mixtral 8x7B",
  "version": "1.0",
  "description": "This is a 16k context fine-tune of Mixtral-8x7b. It excels in coding tasks due to extensive training with coding data and is known for its obedience, although it lacks DPO tuning. The model is uncensored and is stripped of alignment and bias. It requires an external alignment layer for ethical use. Users are cautioned to use this highly compliant model responsibly, as detailed in a blog post about uncensored models at erichartford.com/uncensored-models.",
  // highlight-next-line
  "format": "api",
  "settings": {},
  "parameters": {},
  "metadata": {
    "tags": ["General", "Big Context Length"]
  },
  // highlight-start
  "engine": "openai"
  // highlight-end
}
```
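A mistyped property name or a JSON syntax error in `model.json` will keep the model from appearing in the Hub. If you want to catch that before restarting Jan, a small Node.js/TypeScript check along the lines below can help; it is only a sketch, the file name is a placeholder, the path matches the example folder above, and the checks mirror the checklist in this step.
```typescript title="validate-model-json.ts"
// Quick local check for the model.json created above. Not part of Jan.
import { readFileSync } from "node:fs";
import { join } from "node:path";
import { homedir } from "node:os";

const modelPath = join(
  homedir(),
  "jan",
  "models",
  "openrouter-dolphin-mixtral-8x7b",
  "model.json"
);

// JSON.parse throws on syntax errors such as trailing commas.
const model = JSON.parse(readFileSync(modelPath, "utf8"));

const problems: string[] = [];
if (!model.id) problems.push("missing `id` (must match the OpenRouter model ID)");
if (model.format !== "api") problems.push('`format` must be "api"');
if (model.engine !== "openai") problems.push('`engine` must be "openai"');

if (problems.length > 0) {
  console.error("model.json issues:\n- " + problems.join("\n- "));
  process.exit(1);
}
console.log("model.json looks good:", model.id);
```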
### 3. Start the Model
Restart Jan and navigate to the Hub. Locate your model and click the Use button.
![Dolphin OpenRouter Model](assets/02-openrouter-dolphin.png)
### 4. Try Out the Integration of Jan and OpenRouter
![Dolphin OpenRouter Model Trial](assets/02-openrouter-dolphin-trial.gif)