Merge pull request #6273 from menloresearch/rp/v2-docs-improvements
removed orphan pages and polished wording of main page
(Five binary image assets removed: 142 KiB, 198 KiB, 164 KiB, 542 KiB, and 351 KiB.)
@ -1,29 +0,0 @@
|
||||
{
|
||||
"about-separator": {
|
||||
"title": "About Us",
|
||||
"type": "separator"
|
||||
},
|
||||
"index": "About",
|
||||
"vision": {
|
||||
"title": "Vision",
|
||||
"display": "hidden"
|
||||
},
|
||||
"team": "Team",
|
||||
"investors": "Investors",
|
||||
"wall-of-love": {
|
||||
"theme": {
|
||||
"toc": false,
|
||||
"layout": "full"
|
||||
}
|
||||
},
|
||||
"acknowledgements": {
|
||||
"display": "hidden"
|
||||
},
|
||||
"handbook-separator": {
|
||||
"title": "Handbook",
|
||||
"display": "hidden"
|
||||
},
|
||||
"handbook": {
|
||||
"display": "hidden"
|
||||
}
|
||||
}
|
||||
@ -1,44 +0,0 @@
|
||||
---
|
||||
title: Handbook
|
||||
description: How we work at Jan
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Customizable Intelligence, LLM,
|
||||
local AI,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
build in public,
|
||||
remote team,
|
||||
how we work,
|
||||
]
|
||||
---
|
||||
|
||||
# How We Work
|
||||
|
||||
Jan operates on open-source principles, giving everyone the freedom to adjust, personalize, and contribute to its development. Our focus is on creating a community-powered ecosystem that prioritizes transparency, customization, and user privacy. For more on our principles, visit our [About page](https://jan.ai/about).
|
||||
|
||||
## Open-Source
|
||||
|
||||
We embrace open development, showcasing our progress and upcoming features on GitHub, and we encourage your input and contributions:
|
||||
|
||||
- [Jan Framework](https://github.com/menloresearch/jan) (AGPLv3)
|
||||
- [Jan Desktop Client & Local server](https://jan.ai) (AGPLv3, built on Jan Framework)
|
||||
- [Nitro: run Local AI](https://github.com/menloresearch/nitro) (AGPLv3)
|
||||
|
||||
## Build in Public
|
||||
|
||||
We use GitHub to build in public and welcome anyone to join in.
|
||||
|
||||
- [Jan's Kanban](https://github.com/orgs/menloresearch/projects/5)
|
||||
- [Jan's Roadmap](https://github.com/orgs/menloresearch/projects/5/views/29)
|
||||
|
||||
## Collaboration
|
||||
|
||||
Our team spans the globe, working remotely to bring Jan to life. We coordinate through Discord and GitHub, valuing asynchronous communication and minimal, purposeful meetings. For collaboration and brainstorming, we utilize tools like [Excalidraw](https://excalidraw.com/) and [Miro](https://miro.com/), ensuring alignment and shared vision through visual storytelling and detailed documentation on [HackMD](https://hackmd.io/).
|
||||
|
||||
Check out the [Jan Framework](https://github.com/menloresearch/jan) and our desktop client & local server at [jan.ai](https://jan.ai), both licensed under AGPLv3 for maximum openness and user freedom.
|
||||
@ -1,21 +0,0 @@
|
||||
{
|
||||
"strategy": {
|
||||
"display": "hidden"
|
||||
},
|
||||
"project-management": {
|
||||
"display": "hidden"
|
||||
},
|
||||
"engineering": {
|
||||
"display": "hidden"
|
||||
},
|
||||
"product-design": {
|
||||
"display": "hidden"
|
||||
},
|
||||
"analytics": {
|
||||
"display": "hidden"
|
||||
},
|
||||
"website-docs": {
|
||||
"title": "Website & Docs",
|
||||
"display": "hidden"
|
||||
}
|
||||
}
|
||||
@ -1,26 +0,0 @@
|
||||
---
|
||||
title: Analytics
|
||||
description: Jan's Analytics philosophy and implementation
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Customizable Intelligence, LLM,
|
||||
local AI,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
analytics,
|
||||
]
|
||||
---
|
||||
|
||||
# Analytics
|
||||
|
||||
In keeping with Jan's privacy-preserving philosophy, our approach to analytics is to collect "barely enough to function".
|
||||
|
||||
## What is tracked
|
||||
|
||||
1. By default, GitHub tracks downloads and device metadata for all public GitHub repositories. This helps us troubleshoot & ensure cross-platform support.
|
||||
2. Additionally, we plan to enable a `Settings` feature for users to turn off all tracking.
|
||||
@ -1,23 +0,0 @@
|
||||
---
|
||||
title: Engineering
|
||||
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Customizable Intelligence, LLM,
|
||||
local AI,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
]
|
||||
---
|
||||
|
||||
# Engineering
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- [Requirements](https://github.com/menloresearch/jan?tab=readme-ov-file#requirements-for-running-jan)
|
||||
- [Setting up local env](https://github.com/menloresearch/jan?tab=readme-ov-file#contributing)
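
As a minimal starting point, local setup generally begins with cloning the monorepo; the authoritative, platform-specific steps live in the README's Requirements and Contributing sections linked above, so treat this as a sketch rather than the canonical procedure.

```bash
# Clone the Jan monorepo and follow the contributing guide for build steps.
git clone https://github.com/menloresearch/jan
cd jan
# See the Requirements and Contributing sections of the README (linked above)
# for platform-specific prerequisites and the commands to run the app locally.
```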
|
||||
@ -1,4 +0,0 @@
|
||||
{
|
||||
"ci-cd": "CI & CD",
|
||||
"qa": "QA"
|
||||
}
|
||||
@ -1,11 +0,0 @@
|
||||
---
|
||||
title: CI & CD
|
||||
---
|
||||
|
||||
import { Callout } from 'nextra/components'
|
||||
|
||||
# CI & CD
|
||||
|
||||
Previously we were trunk-based. Now we use the following Gitflow:
|
||||
|
||||
<Callout type="warning">TODO: @van to include her Mermaid diagram</Callout>
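
Until the diagram lands, here is a rough sketch of a typical Gitflow-style cycle. The branch names (`dev` as the integration branch, `release/*` for release stabilization) are assumptions for illustration, not the final convention:

```bash
# Hypothetical Gitflow sketch - branch names are placeholders.
git checkout dev
git checkout -b feat/my-feature      # branch off the integration branch
# ...commit work, open a PR back into dev, merge after review...
git checkout -b release/v0.x dev     # cut a release branch for stabilization
# ...fix release blockers on release/v0.x, then tag and ship...
```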
|
||||
@ -1,82 +0,0 @@
|
||||
---
|
||||
title: QA
|
||||
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Customizable Intelligence, LLM,
|
||||
local AI,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
]
|
||||
---
|
||||
|
||||
# QA
|
||||
|
||||
## Phase 1: Planning
|
||||
|
||||
### Definition of Ready (DoR):
|
||||
|
||||
- **Scope Defined:** The features to be implemented are clearly defined and scoped out.
|
||||
- **Requirements Gathered:** Gather and document all the necessary requirements for the feature.
|
||||
- **Stakeholder Input:** Ensure relevant stakeholders have provided input on the document scope and content.
|
||||
|
||||
### Definition of Done (DoD):
|
||||
|
||||
- **Document Complete:** All sections of the document are filled out with relevant information.
|
||||
- **Reviewed by Stakeholders:** The document has been reviewed and approved by stakeholders.
|
||||
- **Ready for Development:** The document is in a state where developers can use it to begin implementation.
|
||||
|
||||
## Phase 2: Development
|
||||
|
||||
### Definition of Ready (DoR):
|
||||
|
||||
- **Task Breakdown:** The development team has broken down tasks based on the document.
|
||||
- **Communication Plan:** A plan is in place for communication between developers and writers if clarification is needed during implementation.
|
||||
- **Developer Understanding:** Developers have a clear understanding of the document content.
|
||||
|
||||
### Definition of Done (DoD):
|
||||
|
||||
- **Code Implementation:** The feature is implemented according to the document specifications.
|
||||
- **Developer Testing:**
|
||||
- Unit tests and basic integration tests are completed
|
||||
- The developer has also completed self-testing for the feature (please add this as a comment in the ticket, with the tested OS and as much info as possible, to reduce overlapping effort).
|
||||
- (AC -> Code Changes -> Impacted scenarios)
|
||||
- **Communication with Writers:** Developers have communicated any changes or challenges to the writers, and necessary adjustments are made in the document. (Can be through a note in the PR of the feature for writers to take care, or create a separate PR with the change you made for the docs, for writers to review)
|
||||
|
||||
## Phase 3: QA for feature
|
||||
|
||||
### Definition of Ready (DoR):
|
||||
|
||||
- **Test Note Defined:** The test note is prepared outlining the testing items.
|
||||
- **Environment Ready:** PR merged into the nightly build; nightly build notes updated (automatically by the pipeline after the merge).
|
||||
- **Status:** Ticket moved to the Testing column and assigned to QA/writers for review.
|
||||
- **Test Data Prepared:** Relevant test data is prepared for testing the scenarios.
|
||||
|
||||
### Definition of Done (DoD):
|
||||
|
||||
- **Test Executed:** All identified test items are executed on each OS, along with exploratory testing.
|
||||
- **Defects Logged:** Any defects found during testing are resolved / appropriately logged (and approved for future fix).
|
||||
- **Test Sign-Off:** QA team provides sign-off indicating the completion of testing.
|
||||
|
||||
## Phase 4: Release (DoR)
|
||||
|
||||
- **Pre-release wait time:** Code changes to the pre-release version should be frozen for at least X (hrs/days) for regression-testing purposes.
|
||||
- Pre-release cut-off is on Thursday morning, giving the team time to regression test.
|
||||
- Release to production (Stable) during working hours on Monday morning (if there are no blockers) or Tuesday morning.
|
||||
- During the release cut-off, the nightly build is paused to leave room for the pre-release build. The build version used for regression testing will be announced.
|
||||
- **Pre-release testing:** A review of the implemented feature has been conducted, along with a regression test (checklist) by the team.
|
||||
- Release checklist cloned from the template for each OS (with HackMD link).
|
||||
- New key test items from the new feature added to the checklist.
|
||||
- Split the 3 OSes across different team members for testing.
|
||||
- **Document Updated:** The document is updated based on review feedback, covering any discrepancies or modifications needed for this release.
|
||||
- **Reviewed by Stakeholders:** The new feature and the updated document are reviewed and approved by stakeholders. The document is in its final version, reflecting the implemented feature accurately.
|
||||
|
||||
## Notes (WIP)
|
||||
|
||||
- **API collection run:** to run alongside the nightly build daily, for critical API validation (see the example below)
|
||||
- **Automation run:** for regression-testing purposes, to reduce manual testing effort for the same items in each release across multiple OSes.
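
If the API collection is a Postman collection, a daily run could be scripted with `newman`, Postman's CLI runner; the collection and environment file names below are hypothetical placeholders:

```bash
# Hypothetical nightly API-validation run; file names are placeholders.
newman run jan-api-collection.json \
  --environment nightly.postman_environment.json \
  --reporters cli,junit
```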
|
||||
@ -1,27 +0,0 @@
|
||||
---
|
||||
title: Product & Design
|
||||
description: How we work on product design
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Customizable Intelligence, LLM,
|
||||
local AI,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
product design,
|
||||
]
|
||||
---
|
||||
|
||||
# Product & Design
|
||||
|
||||
## Roadmap
|
||||
|
||||
- Conversations over Tickets
|
||||
- Discord's #roadmap channel
|
||||
- Work with the community to turn conversations into Product Specs
|
||||
- Future System?
|
||||
- Use Canny?
|
||||
@ -1,83 +0,0 @@
|
||||
---
|
||||
title: Project Management
|
||||
description: Project management at Jan
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Customizable Intelligence, LLM,
|
||||
local AI,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
project management,
|
||||
]
|
||||
---
|
||||
|
||||
import { Callout } from 'nextra/components'
|
||||
|
||||
# Project Management
|
||||
|
||||
We use the [Jan Monorepo Project](https://github.com/orgs/menloresearch/projects/5) on GitHub to manage our roadmap and sprint Kanbans.
|
||||
|
||||
As much as possible, everyone owns their respective `epics` and `tasks`.
|
||||
|
||||
<Callout>
|
||||
We aim for a `loosely coupled, but tightly aligned` autonomous culture.
|
||||
</Callout>
|
||||
|
||||
## Quicklinks
|
||||
|
||||
- [High-level roadmap](https://github.com/orgs/menloresearch/projects/5/views/16): view used at a strategic level, for team-wide alignment. Start and end dates reflect engineering implementation cycles; product and design work typically precedes these timelines.
|
||||
- [Standup Kanban](https://github.com/orgs/menloresearch/projects/5/views/25): view used during daily standup. Sprints should be up to date.
|
||||
|
||||
## Organization
|
||||
|
||||
[`Roadmap Labels`](https://github.com/menloresearch/jan/labels?q=roadmap)
|
||||
|
||||
- `Roadmap Labels` tag large, long-term, & strategic projects that can span multiple teams and multiple sprints
|
||||
- Example label: `roadmap: Jan has Mobile`
|
||||
- `Roadmaps` contain `epics`
|
||||
|
||||
[`Epics`](https://github.com/menloresearch/jan/issues?q=is%3Aissue+is%3Aopen+label%3A%22type%3A+epic%22)
|
||||
|
||||
- `Epics` track large stories that span 1-2 weeks and outline specs, architecture decisions, and designs
|
||||
- `Epics` contain `tasks`
|
||||
- `Epics` should always have 1 owner
|
||||
|
||||
[`Milestones`](https://github.com/menloresearch/jan/milestones)
|
||||
|
||||
- `Milestones` track release versions. We use [semantic versioning](https://semver.org/)
|
||||
- `Milestones` span ~2 weeks and have deadlines
|
||||
- `Milestones` usually fit within 2-week sprint cycles
|
||||
|
||||
[`Tasks`](https://github.com/menloresearch/jan/issues)
|
||||
|
||||
- Tasks are individual issues (feats, bugs, chores) that can be completed within a few days
|
||||
- Tasks, except for critical bugs, should always belong to an `epic` (and thus fit into our roadmap)
|
||||
- Tasks are usually named per [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/#summary) (see the example titles below)
|
||||
- Tasks should always have 1 owner
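
For instance, task titles follow the `type: description` pattern from the Conventional Commits spec; the tasks below are hypothetical examples, not real issues:

```
feat: add GPU acceleration toggle to settings
fix: model download progress bar stuck at 0%
chore: bump llama.cpp to the latest release
```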
|
||||
|
||||
We aim to always sprint on `tasks` that are a part of the [current roadmap](https://github.com/orgs/menloresearch/projects/5/views/16).
|
||||
|
||||
## Kanban
|
||||
|
||||
- `no status`: issues that need to be triaged (needs an owner, ETA)
|
||||
- `icebox`: issues you don't plan to tackle yet
|
||||
- `planned`: issues you plan to tackle this week
|
||||
- `in-progress`: in progress
|
||||
- `in-review`: pending PR or blocked by something
|
||||
- `done`: done
|
||||
|
||||
## Triage SOP
|
||||
|
||||
- `Urgent bugs`: assign to an owner (or @engineers if you are not sure) && tag the current `sprint` & `milestone`
|
||||
- `All else`: assign the correct roadmap `label(s)` and owner (if any)
|
||||
|
||||
### Request for help
|
||||
|
||||
Our feature prioritization can feel a bit like a black box at times.
|
||||
|
||||
We'd appreciate high-quality insights and volunteers for user interviews through [Discord](https://discord.gg/af6SaTdzpx) and [GitHub](https://github.com/menloresearch).
|
||||
@ -1,51 +0,0 @@
|
||||
# Strategy
|
||||
|
||||
We have only three planning parameters:
|
||||
|
||||
- 10 year vision
|
||||
- 2 week sprint
|
||||
- Quarterly OKRs
|
||||
|
||||
## Ideal Customer
|
||||
|
||||
Our ideal customer is an AI enthusiast or business that has run into limitations with current AI solutions and is keen to find open-source alternatives.
|
||||
|
||||
## Problems
|
||||
|
||||
Our ideal customer would use Jan to solve one of these problems.
|
||||
|
||||
_Control_
|
||||
|
||||
- Control (e.g. preventing vendor lock-in)
|
||||
- Stability (e.g. runs predictably every time)
|
||||
- Local-use (e.g. for speed, or for airgapped environments)
|
||||
|
||||
_Privacy_
|
||||
|
||||
- Data protection (e.g. personal data or company data)
|
||||
- Privacy (e.g. nsfw)
|
||||
|
||||
_Customisability_
|
||||
|
||||
- Tinkerability (e.g. ability to change model, experiment)
|
||||
- Niche Models (e.g. fine-tuned, domain-specific models that outperform OpenAI)
|
||||
|
||||
Sources: [^1] [^2] [^3] [^4]
|
||||
|
||||
[^1]: [What are you guys doing that can't be done with ChatGPT?](https://www.reddit.com/r/LocalLLaMA/comments/17mghqr/comment/k7ksti6/?utm_source=share&utm_medium=web2x&context=3)
|
||||
[^2]: [What's your main interest in running a local LLM instead of an existing API?](https://www.reddit.com/r/LocalLLaMA/comments/1718a9o/whats_your_main_interest_in_running_a_local_llm/)
|
||||
[^3]: [Ask HN: What's the best self-hosted/local alternative to GPT-4?](https://news.ycombinator.com/item?id=36138224)
|
||||
[^4]: [LoRAs](https://www.reddit.com/r/LocalLLaMA/comments/17mghqr/comment/k7mdz1i/?utm_source=share&utm_medium=web2x&context=3)
|
||||
|
||||
## Solution
|
||||
|
||||
Jan is a seamless user experience that runs on your personal computer, gluing together the different pieces of the open-source AI ecosystem to provide an alternative to OpenAI's closed platform.
|
||||
|
||||
- We build a comprehensive, seamless platform that takes care of the technical chores across the stack required to run open source AI
|
||||
- We run on top of a local folder of non-proprietary files that anyone can tinker with (yes, even other apps!)
|
||||
- We provide open formats for packaging and distributing AI to run reproducibly across devices
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- [Figma](https://figma.com)
|
||||
- [ScreenStudio](https://www.screen.studio/)
|
||||
@ -1,89 +0,0 @@
|
||||
---
|
||||
title: Website & Docs
|
||||
description: Information about the Jan website and documentation.
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Customizable Intelligence, LLM,
|
||||
local AI,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
website,
|
||||
documentation,
|
||||
]
|
||||
---
|
||||
|
||||
# Website & Docs
|
||||
|
||||
This website is built using [Docusaurus 3.0](https://docusaurus.io/), a modern static website generator.
|
||||
|
||||
## Information Architecture
|
||||
|
||||
We try to **keep routes consistent** to maintain SEO.
|
||||
|
||||
- **`/guides/`**: Guides on how to use the Jan application. For end users who are directly using Jan.
|
||||
|
||||
- **`/developer/`**: Developer docs on how to extend Jan. These pages are about what people can build with our software.
|
||||
|
||||
- **`/api-reference/`**: Reference documentation for the Jan API server, written in Swagger/OpenAPI format.
|
||||
|
||||
- **`/changelog/`**: A list of changes made to the Jan application with each release.
|
||||
|
||||
- **`/blog/`**: A blog for the Jan application.
|
||||
|
||||
## How to Contribute
|
||||
|
||||
Refer to the [Contributing Guide](https://github.com/menloresearch/jan/blob/dev/CONTRIBUTING.md) for more comprehensive information on how to contribute to the Jan project.
|
||||
|
||||
## Pre-requisites and Installation
|
||||
|
||||
- [Node.js](https://nodejs.org/en/) (version 20.0.0 or higher)
|
||||
- [yarn](https://yarnpkg.com/) (version 1.22.0 or higher)
|
||||
|
||||
### Installation
|
||||
|
||||
```bash
|
||||
cd jan/docs
|
||||
```
|
||||
|
||||
```bash
|
||||
yarn install && yarn start
|
||||
```
|
||||
|
||||
This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.
|
||||
|
||||
### Build
|
||||
|
||||
```bash
|
||||
yarn build
|
||||
```
|
||||
|
||||
This command generates static content into the `build` directory and can be served using any static contents hosting service.
|
||||
|
||||
### Deployment
|
||||
|
||||
Using SSH:
|
||||
|
||||
```bash
|
||||
USE_SSH=true yarn deploy
|
||||
```
|
||||
|
||||
Not using SSH:
|
||||
|
||||
```bash
|
||||
GIT_USER=<Your GitHub username> yarn deploy
|
||||
```
|
||||
|
||||
If you are using GitHub pages for hosting, this command is a convenient way to build the website and push to the `gh-pages` branch.
|
||||
|
||||
### Preview URL, Pre-release and Publishing Documentation
|
||||
|
||||
- When a pull request is created, the preview URL will be automatically commented on the pull request.
|
||||
|
||||
- The documentation will then be published to [https://dev.jan.ai/](https://dev.jan.ai/) when the pull request is merged to `main`.
|
||||
|
||||
- Our open-source maintainers will sync the updated content from `main` to `release` branch, which will then be published to [https://jan.ai/](https://jan.ai/).
|
||||
@ -1,104 +0,0 @@
|
||||
---
|
||||
title: Menlo Research
|
||||
description: We are Menlo Research, the creators and maintainers of Jan, Cortex and other tools.
|
||||
keywords:
|
||||
[
|
||||
Menlo Research,
|
||||
Jan,
|
||||
local AI,
|
||||
open-source alternative to chatgpt,
|
||||
alternative to openai platform,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
about Jan,
|
||||
desktop application,
|
||||
thinking machines,
|
||||
]
|
||||
---
|
||||
|
||||
import { Callout } from 'nextra/components'
|
||||
|
||||
# Menlo Research
|
||||
|
||||

|
||||
_[Eniac](https://www.computerhistory.org/revolution/birth-of-the-computer/4/78), the World's First Computer (Photo courtesy of US Army)_
|
||||
|
||||
## About
|
||||
|
||||
We're a team of AI researchers and engineers. We are the creators and lead maintainers of a few open-source AI tools:
|
||||
|
||||
- 👋 [Jan](https://jan.ai): ChatGPT-alternative that runs 100% offline
|
||||
- 🤖 [Cortex](https://cortex.so/docs/): A simple, embeddable library to run LLMs locally
|
||||
- More to come!
|
||||
|
||||
<Callout>
|
||||
The [Homebrew Computer Club](https://en.wikipedia.org/wiki/Homebrew_Computer_Club) was an early computer hobbyist group, active from 1975 to 1986, that led to Apple and the personal computer revolution.
|
||||
</Callout>
|
||||
|
||||
### Mission
|
||||
|
||||
We're a robotics company that focuses on the cognitive framework for future robots. Our long-term mission is to advance human-machine collaboration to enable human civilization to thrive.
|
||||
|
||||
### Business Model
|
||||
|
||||
We're currently a bootstrapped startup [^2]. We balance technical invention with the search for a sustainable business model (e.g., consulting, paid support, and custom development).
|
||||
|
||||
<Callout>
|
||||
We welcome business inquiries: 👋 hello@jan.ai
|
||||
</Callout>
|
||||
|
||||
### Community
|
||||
|
||||
We have a thriving community built around [Jan](../docs), where we also discuss our other projects.
|
||||
|
||||
- [Discord](https://discord.gg/AAGQNpJQtH)
|
||||
- [Twitter](https://twitter.com/jandotai)
|
||||
- [LinkedIn](https://www.linkedin.com/company/menloresearch)
|
||||
- Email: hello@jan.ai
|
||||
|
||||
## Philosophy
|
||||
|
||||
[Menlo](https://menlo.ai/handbook/about) is an open R&D lab in pursuit of General Intelligence, that achieves real-world impact through agents and robots.
|
||||
|
||||
### 🔑 User Owned
|
||||
|
||||
We build tools that are user-owned. Our products are [open-source](https://en.wikipedia.org/wiki/Open_source), designed to run offline or be [self-hosted.](https://www.reddit.com/r/selfhosted/) We make no attempt to lock you in, and our tools are free of [user-hostile dark patterns](https://twitter.com/karpathy/status/1761467904737067456?t=yGoUuKC9LsNGJxSAKv3Ubg) [^1].
|
||||
|
||||
We adopt [Local-first](https://www.inkandswitch.com/local-first/) principles and store data locally in [universal file formats](https://stephango.com/file-over-app). We build for privacy by default, and we do not [collect or sell your data](/privacy).
|
||||
|
||||
### 🔧 Right to Tinker
|
||||
|
||||
We believe in the [Right to Repair](https://en.wikipedia.org/wiki/Right_to_repair). We encourage our users to take it further by [tinkering, extending, and customizing](https://www.popularmechanics.com/technology/gadgets/a4395/pm-remembers-steve-jobs-how-his-philosophy-changed-technology-6507117/) our products to fit their needs.
|
||||
|
||||
Our products are designed with [Extension APIs](/docs/extensions), and we do our best to write good [documentation](/docs) so users understand how things work under the hood.
|
||||
|
||||
### 👫 Build with the Community
|
||||
|
||||
We are part of a larger open-source community and are committed to being a good jigsaw puzzle piece. We credit and actively contribute to upstream projects.
|
||||
|
||||
We adopt a public-by-default approach to [Project Management](https://github.com/orgs/menloresearch/projects/30/views/1), [Roadmaps](https://github.com/orgs/menloresearch/projects/30/views/4), and Helpdesk for our products.
|
||||
|
||||
## Inspirations
|
||||
|
||||
> Good artists borrow, great artists steal - Picasso
|
||||
|
||||
We are inspired by and actively try to emulate the paths of companies we admire ❤️:
|
||||
|
||||
- [Posthog](https://posthog.com/handbook)
|
||||
- [Obsidian](https://obsidian.md/)
|
||||
- [Discourse](https://www.discourse.org/about)
|
||||
- [Gitlab](https://handbook.gitlab.com/handbook/company/history/#2017-gitlab-storytime)
|
||||
- [Red Hat](https://www.redhat.com/en/about/development-model)
|
||||
- [Ghost](https://ghost.org/docs/contributing/)
|
||||
- [Lago](https://www.getlago.com/blog/open-source-licensing-and-why-lago-chose-agplv3)
|
||||
- [Twenty](https://twenty.com/story)
|
||||
|
||||
## Footnotes
|
||||
|
||||
[^1]: [Karpathy's Love Letter to Obsidian](https://twitter.com/karpathy/status/1761467904737067456?t=yGoUuKC9LsNGJxSAKv3Ubg)
|
||||
|
||||
[^2]: [The Market for AI Companies](https://www.artfintel.com/p/the-market-for-ai-companies) by Finbarr Timbers
|
||||
@ -1,18 +0,0 @@
|
||||
---
|
||||
title: Investors
|
||||
description: Our unique, unconventional approach to distributing ownership
|
||||
keywords: [
|
||||
ESOP,
|
||||
Thinking Machines,
|
||||
Jan,
|
||||
Jan.ai,
|
||||
Jan AI,
|
||||
cortex,
|
||||
]
|
||||
---
|
||||
|
||||
# Investors
|
||||
|
||||
We are a [bootstrapped company](https://en.wikipedia.org/wiki/Bootstrapping), and don't have any external investors (yet).
|
||||
|
||||
We're open to exploring opportunities with strategic partners who want to tackle [our mission](/about#mission) together.
|
||||
@ -1,29 +0,0 @@
|
||||
---
|
||||
title: Team
|
||||
description: Meet the Thinking Machines team.
|
||||
keywords:
|
||||
[
|
||||
Thinking Machines,
|
||||
Jan,
|
||||
Cortex,
|
||||
jan AI,
|
||||
Jan AI,
|
||||
jan.ai,
|
||||
cortex,
|
||||
]
|
||||
---
|
||||
|
||||
import { Callout } from 'nextra/components'
|
||||
import { Cards, Card } from 'nextra/components'
|
||||
|
||||
# Team
|
||||
|
||||
We're a small, fully-remote team, mostly based in Southeast Asia.
|
||||
|
||||
We are committed to becoming a global company. Check our [Careers page](https://menlo.bamboohr.com/careers) if you'd like to join us on our adventure.
|
||||
|
||||
You can find the full list of team members in the [Menlo handbook](https://menlo.ai/handbook/team#jan).
|
||||
|
||||
<Callout emoji="🌏">
|
||||
Ping us in [Discord](https://discord.gg/AAGQNpJQtH) if you're keen to talk to us!
|
||||
</Callout>
|
||||
@ -1,56 +0,0 @@
|
||||
---
|
||||
title: Vision - Thinking Machines
|
||||
description: We want to continue a legacy of craftsmen making tools that propel humanity forward.
|
||||
keywords:
|
||||
[
|
||||
Jan AI,
|
||||
Thinking Machines,
|
||||
Jan,
|
||||
ChatGPT alternative,
|
||||
local AI,
|
||||
private AI,
|
||||
conversational AI,
|
||||
OpenAI platform alternative,
|
||||
no-subscription fee,
|
||||
large language model,
|
||||
about Jan,
|
||||
desktop application,
|
||||
thinking machine,
|
||||
jan vision,
|
||||
]
|
||||
---
|
||||
|
||||
# Vision
|
||||
|
||||
> "I do not fear computers. I fear the lack of them" - Isaac Asimov
|
||||
|
||||

|
||||
|
||||
- Harmonious symbiosis of humans, nature, and machines
|
||||
- Over millennia, humanity has adopted new tools: fire, electricity, computers, and now AI.
|
||||
- AI is no different. It is a tool that can propel humanity forward.
|
||||
- We reject the apocalypse narratives.
|
||||
- Go beyond the apocalypse narratives of Dune and Terminator, and you will find a kernel of progress
|
||||
|
||||
We want to continue a legacy of craftsmen making tools that propel humanity forward.
|
||||
|
||||
## Collaborating with Thinking Machines
|
||||
|
||||
Our vision is to develop thinking machines that work alongside humans.
|
||||
|
||||
We envision a future where AI is safely used as a tool in our daily lives, like fire and electricity. These machines enhance human potential and do not replace our key decision-making. You own your own AI.
|
||||
|
||||

|
||||
|
||||

|
||||
> We like that Luke can just open up R2-D2 and tinker around. He was not submitting support tickets to a centralized server somewhere in the galaxy.
|
||||
|
||||
## Solarpunk, not Dune
|
||||
|
||||
Our vision is rooted in an optimistic view of AI's role in humanity's future.
|
||||
|
||||
Like the [Solarpunk movement](https://en.wikipedia.org/wiki/Solarpunk), we envision a world where technology and nature coexist harmoniously, supporting a sustainable and flourishing ecosystem.
|
||||
|
||||
We focus on AI's positive impacts on our world. From environmental conservation to the democratization of energy, AI has the potential to address some of the most pressing challenges facing our planet.
|
||||
|
||||
Further reading: [Solarpunk and sustainable climate futures (Yes! Magazine)](https://www.yesmagazine.org/environment/2021/01/28/climate-change-sustainable-solarpunk)
|
||||
@ -1,23 +0,0 @@
|
||||
---
|
||||
title: Wall of Love ❤️
|
||||
|
||||
description: Check out what our amazing users are saying about Jan!
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Rethink the Computer,
|
||||
local AI,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
wall of love,
|
||||
]
|
||||
---
|
||||
|
||||
import WallOfLove from "@/components/Home/WallOfLove"
|
||||
|
||||
<WallOfLove transparent />
|
||||
|
||||
New binary asset: docs/src/pages/docs/_assets/jan_loaded.png (310 KiB)
@ -1,6 +1,6 @@
|
||||
---
|
||||
title: Jan
|
||||
description: Build, run, and own your AI. From laptop to superintelligence.
|
||||
description: Working towards open superintelligence through community-driven AI
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
@ -28,56 +28,116 @@ import FAQBox from '@/components/FaqBox'
|
||||
|
||||
## Jan's Goal
|
||||
|
||||
> Jan's goal is to build superintelligence that you can self-host and use locally.
|
||||
> We're working towards open superintelligence to make a viable open-source alternative to platforms like ChatGPT
|
||||
and Claude that anyone can own and run.
|
||||
|
||||
## What is Jan?
|
||||
## What is Jan Today
|
||||
|
||||
Jan is an open-source AI ecosystem that runs on your hardware. We're building towards open superintelligence - a complete AI platform you actually own.
|
||||
Jan is an open-source AI platform that runs on your hardware. We believe AI should be in the hands of many, not
|
||||
controlled by a few tech giants.
|
||||
|
||||
### The Ecosystem
|
||||
Today, Jan is:
|
||||
- **A desktop app** that runs AI models locally or connects to cloud providers
|
||||
- **A model hub** making the latest open-source models accessible
|
||||
- **A connector system** that lets AI interact with real-world tools via MCP
|
||||
|
||||
**Models**: We build specialized models for real tasks, not general-purpose assistants:
|
||||
- **Jan-Nano (32k/128k)**: 4B parameters designed for deep research with MCP. The 128k version processes entire papers, codebases, or legal documents in one go
|
||||
- **Lucy**: 1.7B model that runs agentic web search on your phone. Small enough for CPU, smart enough for complex searches
|
||||
- **Jan-v1**: 4B model for agentic reasoning and tool use, achieving 91.1% on SimpleQA
|
||||
Tomorrow, Jan aims to be a complete ecosystem where open models rival or exceed closed alternatives.
|
||||
|
||||
We also integrate the best open-source models - from OpenAI's gpt-oss to community GGUF models on Hugging Face. The goal: make powerful AI accessible to everyone, not just those with server farms.
|
||||
|
||||
**Applications**: Jan Desktop runs on your computer today. Web, mobile, and server versions coming in late 2025. Everything syncs, everything works together.
|
||||
|
||||
**Tools**: Connect to the real world through [Model Context Protocol (MCP)](./mcp). Design with Canva, analyze data in Jupyter notebooks, control browsers, execute code in E2B sandboxes. Your AI can actually do things, not just talk about them.
|
||||
|
||||
<Callout>
|
||||
API keys are optional. No account needed. Just download and run. Bring your own API keys to connect your favorite cloud models.
|
||||
<Callout type="info">
|
||||
We're building this with the open-source AI community, using the best available tools, and sharing everything
|
||||
we learn along the way.
|
||||
</Callout>
|
||||
|
||||
### Core Features
|
||||
## The Jan Ecosystem
|
||||
|
||||
- **Run Models Locally**: Download any GGUF model from Hugging Face, use OpenAI's gpt-oss models, or connect to cloud providers
|
||||
- **OpenAI-Compatible API**: Local server at `localhost:1337` works with tools like [Continue](./server-examples/continue-dev) and [Cline](https://cline.bot/)
|
||||
- **Extend with MCP Tools**: Browser automation, web search, data analysis, design tools - all through natural language
|
||||
- **Your Choice of Infrastructure**: Run on your laptop, self-host on your servers (soon), or use cloud when you need it
|
||||
### Jan Apps
|
||||
**Available Now:**
|
||||
- **Desktop**: Full-featured AI workstation for Windows, Mac, and Linux
|
||||
|
||||
### Growing MCP Integrations
|
||||
**Coming Late 2025:**
|
||||
- **Mobile**: Jan on your phone
|
||||
- **Web**: Browser-based access at jan.ai
|
||||
- **Server**: Self-hosted for teams
|
||||
- **Extensions**: Browser extension for Chrome-based browsers
|
||||
|
||||
Jan connects to real tools through MCP:
|
||||
- **Creative Work**: Generate designs with Canva
|
||||
- **Data Analysis**: Execute Python in Jupyter notebooks
|
||||
- **Web Automation**: Control browsers with Browserbase and Browser Use
|
||||
- **Code Execution**: Run code safely in E2B sandboxes
|
||||
- **Search & Research**: Access current information via Exa, Perplexity, and Octagon
|
||||
- **More coming**: The MCP ecosystem is expanding rapidly
|
||||
### Jan Model Hub
|
||||
Making open-source AI accessible to everyone:
|
||||
- **Easy Downloads**: One-click model installation
|
||||
- **Jan Models**: Our own models optimized for local use
|
||||
- **Jan-v1**: 4B reasoning model specialized in web search
|
||||
- **Research Models**
|
||||
- **Jan-Nano (32k/128k)**: 4B model for web search with MCP tools
|
||||
- **Lucy**: 1.7B mobile-optimized for web search
|
||||
- **Community Models**: Any GGUF from Hugging Face works in Jan
|
||||
- **Cloud Models**: Connect your API keys for OpenAI, Anthropic, Gemini, and more
|
||||
|
||||
|
||||
### Jan Connectors Hub
|
||||
Connect AI to the tools you use daily via [Model Context Protocol](./mcp):
|
||||
|
||||
**Creative & Design:**
|
||||
- **Canva**: Generate and edit designs
|
||||
|
||||
**Data & Analysis:**
|
||||
- **Jupyter**: Run Python notebooks
|
||||
- **E2B**: Execute code in sandboxes
|
||||
|
||||
**Web & Search:**
|
||||
- **Browserbase & Browser Use**: Browser automation
|
||||
- **Exa, Serper, Perplexity**: Advanced web search
|
||||
- **Octagon**: Deep research capabilities
|
||||
|
||||
**Productivity:**
|
||||
- **Linear**: Project management
|
||||
- **Todoist**: Task management
|
||||
|
||||
## Core Features
|
||||
|
||||
- **Run Models Locally**: Download any GGUF model from Hugging Face, use OpenAI's gpt-oss models,
|
||||
or connect to cloud providers
|
||||
- **OpenAI-Compatible API**: Local server at `localhost:1337` works with tools like
|
||||
[Continue](./server-examples/continue-dev) and [Cline](https://cline.bot/) (see the example request below)
|
||||
- **Extend with MCP Tools**: Browser automation, web search, data analysis, and design tools, all
|
||||
through natural language
|
||||
- **Your Choice of Infrastructure**: Run on your laptop, self-host on your servers (soon), or use
|
||||
cloud when you need it
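
As a quick illustration of the OpenAI-compatible API above: once the local server is running, a standard chat-completions request should work. The model id below is a placeholder - substitute the id of a model you have actually downloaded, and add an `Authorization` header if you have set an API key in Jan's server settings.

```bash
# Minimal sketch of a chat request against Jan's local OpenAI-compatible server.
# "jan-v1-4b" is a placeholder model id - use one you have downloaded.
curl http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "jan-v1-4b",
    "messages": [{"role": "user", "content": "Hello from the local API!"}]
  }'
```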
|
||||
|
||||
## Philosophy
|
||||
|
||||
Jan is built to be user-owned:
|
||||
- **Open Source**: Apache 2.0 license - truly free
|
||||
- **Open Source**: Apache 2.0 license
|
||||
- **Local First**: Your data stays on your device. Internet is optional
|
||||
- **Privacy Focused**: We don't collect or sell user data. See our [Privacy Policy](./privacy)
|
||||
- **No Lock-in**: Export your data anytime. Use any model. Switch between local and cloud
|
||||
|
||||
<Callout type="info">
|
||||
We're building AI that respects your choices. Not another wrapper around someone else's API.
|
||||
<Callout>
|
||||
The best AI is the one you control. Not the one that others control for you.
|
||||
</Callout>
|
||||
|
||||
## The Path Forward
|
||||
|
||||
### What Works Today
|
||||
- Run powerful models locally on consumer hardware
|
||||
- Connect to any cloud provider with your API keys
|
||||
- Use MCP tools for real-world tasks
|
||||
- Access transparent model evaluations
|
||||
|
||||
### What We're Building
|
||||
- More specialized models that excel at specific tasks
|
||||
- Expanded app ecosystem (mobile, web, extensions)
|
||||
- Richer connector ecosystem
|
||||
- An evaluation framework to build better models
|
||||
|
||||
### The Long-Term Vision
|
||||
We're working towards open superintelligence where:
|
||||
- Open models match or exceed closed alternatives
|
||||
- Anyone can run powerful AI on their own hardware
|
||||
- The community drives innovation, not corporations
|
||||
- AI capabilities are owned by users, not rented
|
||||
|
||||
<Callout type="warning">
|
||||
This is an ambitious goal without a guaranteed path. We're betting on the open-source community, improved
|
||||
hardware, and better techniques, but we're honest that this is a journey, not a destination we've reached.
|
||||
</Callout>
|
||||
|
||||
## Quick Start
|
||||
@ -85,7 +145,7 @@ We're building AI that respects your choices. Not another wrapper around someone
|
||||
1. [Download Jan](./quickstart) for your operating system
|
||||
2. Choose a model - download locally or add cloud API keys
|
||||
3. Start chatting or connect tools via MCP
|
||||
4. Build with our [API](https://jan.ai/api-reference)
|
||||
4. Build with our [local API](./api-server)
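
For step 4, a quick sanity check that the local API server is reachable (assuming it is enabled and listening on the default `localhost:1337`) is to list the models it exposes:

```bash
# List the models served by Jan's local, OpenAI-compatible API.
curl http://localhost:1337/v1/models
```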
|
||||
|
||||
## Acknowledgements
|
||||
|
||||
@ -97,7 +157,7 @@ Jan is built on the shoulders of giants:
|
||||
## FAQs
|
||||
|
||||
<FAQBox title="What is Jan?">
|
||||
Jan is an open-source AI ecosystem building towards superintelligence you can self-host. Today it's a desktop app that runs AI models locally. Tomorrow it's a complete platform across all your devices.
|
||||
Jan is an open-source AI platform working towards a viable alternative to Big Tech AI. Today it's a desktop app that runs models locally or connects to cloud providers. Tomorrow it aims to be a complete ecosystem rivaling platforms like ChatGPT and Claude.
|
||||
</FAQBox>
|
||||
|
||||
<FAQBox title="How is this different from other AI platforms?">
|
||||
@ -106,14 +166,14 @@ Jan is built on the shoulders of giants:
|
||||
|
||||
<FAQBox title="What models can I use?">
|
||||
**Jan Models:**
|
||||
- Jan-Nano (32k/128k) - Deep research with MCP integration
|
||||
- Lucy - Mobile-optimized agentic search (1.7B)
|
||||
- Jan-v1 - Agentic reasoning and tool use (4B)
|
||||
|
||||
- Jan-Nano (32k/128k) - Research and analysis with MCP integration
|
||||
- Lucy - Mobile-optimized search (1.7B)
|
||||
- Jan-v1 - Reasoning and tool use (4B)
|
||||
|
||||
**Open Source:**
|
||||
- OpenAI's gpt-oss models (120b and 20b)
|
||||
- Any GGUF model from Hugging Face
|
||||
|
||||
|
||||
**Cloud (with your API keys):**
|
||||
- OpenAI, Anthropic, Mistral, Groq, and more
|
||||
</FAQBox>
|
||||
@ -130,15 +190,27 @@ Jan is built on the shoulders of giants:
|
||||
|
||||
**Hardware**:
|
||||
- Minimum: 8GB RAM, 10GB storage
|
||||
- Recommended: 16GB RAM, GPU (NVIDIA/AMD/Intel), 50GB storage
|
||||
- Works with: NVIDIA (CUDA), AMD (Vulkan), Intel Arc, Apple Silicon
|
||||
- Recommended: 16GB RAM, GPU (NVIDIA/AMD/Intel/Apple), 50GB storage
|
||||
</FAQBox>
|
||||
|
||||
<FAQBox title="How realistic is 'open superintelligence'?">
|
||||
Honestly? It's ambitious and uncertain. We believe the combination of rapidly improving open models, better consumer hardware, community innovation, and specialized models working together can eventually rival closed platforms. But this is a multi-year journey with no guarantees. What we can guarantee is that we'll keep building in the open, with the community, towards this goal.
|
||||
</FAQBox>
|
||||
|
||||
<FAQBox title="What can Jan actually do today?">
|
||||
Right now, Jan can:
|
||||
- Run models like Llama, Mistral, and our own Jan models locally
|
||||
- Connect to cloud providers if you want more power
|
||||
- Use MCP tools to create designs, analyze data, browse the web, and more
|
||||
- Work completely offline once models are downloaded
|
||||
- Provide an OpenAI-compatible API for developers
|
||||
</FAQBox>
|
||||
|
||||
<FAQBox title="Is Jan really free?">
|
||||
**Local use**: Always free, no catches
|
||||
**Cloud models**: You pay providers directly (we add no markup)
|
||||
**Jan cloud**: Optional paid services coming 2025
|
||||
|
||||
|
||||
The core platform will always be free and open source.
|
||||
</FAQBox>
|
||||
|
||||
@ -161,7 +233,7 @@ Jan is built on the shoulders of giants:
|
||||
- **Jan Web**: Beta late 2025
|
||||
- **Jan Mobile**: Late 2025
|
||||
- **Jan Server**: Late 2025
|
||||
|
||||
|
||||
All versions will sync seamlessly.
|
||||
</FAQBox>
|
||||
|
||||
@ -174,4 +246,4 @@ Jan is built on the shoulders of giants:
|
||||
|
||||
<FAQBox title="Are you hiring?">
|
||||
Yes! We love hiring from our community. Check [Careers](https://menlo.bamboohr.com/careers).
|
||||
</FAQBox>
|
||||
</FAQBox>
|
||||
|
||||
@ -47,30 +47,13 @@ We recommend starting with **Jan v1**, our 4B parameter model optimized for reas
|
||||
Jan v1 achieves 91.1% accuracy on SimpleQA and excels at tool calling, making it perfect for web search and reasoning tasks.
|
||||
</Callout>
|
||||
|
||||
**Hugging Face models:** Some require an access token. Add yours in **Settings > Model Providers > Llama.cpp > Hugging Face Access Token**.
|
||||
|
||||

|
||||
|
||||
### Step 3: Enable GPU Acceleration (Optional)
|
||||
|
||||
For Windows/Linux with compatible graphics cards:
|
||||
|
||||
1. Go to **(<Settings width={16} height={16} style={{display:"inline"}}/>) Settings** > **Hardware**
|
||||
2. Toggle **GPUs** to ON
|
||||
|
||||

|
||||
|
||||
<Callout type="info">
|
||||
Install required drivers before enabling GPU acceleration. See setup guides for [Windows](/docs/desktop/windows#gpu-acceleration) & [Linux](/docs/desktop/linux#gpu-acceleration).
|
||||
</Callout>
|
||||
|
||||
### Step 4: Start Chatting
|
||||
### Step 3: Start Chatting
|
||||
|
||||
1. Click **New Chat** (<SquarePen width={16} height={16} style={{display:"inline"}}/>) icon
|
||||
2. Select your model in the input field dropdown
|
||||
3. Type your message and start chatting
|
||||
|
||||

|
||||

|
||||
|
||||
Try asking Jan v1 questions like:
|
||||
- "Explain quantum computing in simple terms"
|
||||
@ -118,7 +101,7 @@ Thread deletion is permanent. No undo available.
|
||||
|
||||
**All threads:**
|
||||
1. Hover over `Recents` category
|
||||
2. Click **three dots** (<Ellipsis width={16} height={16} style={{display:"inline"}}/>) icon
|
||||
2. Click **three dots** (<Ellipsis width={16} height={16} style={{display:"inline"}}/>) icon
|
||||
3. Select <Trash2 width={16} height={16} style={{display:"inline"}}/> **Delete All**
|
||||
|
||||
## Advanced Features
|
||||
|
||||
@ -1,145 +0,0 @@
|
||||
---
|
||||
title: Start Chatting
|
||||
description: Download models and manage your conversations with AI models locally.
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
local AI,
|
||||
LLM,
|
||||
chat,
|
||||
threads,
|
||||
models,
|
||||
download,
|
||||
installation,
|
||||
conversations,
|
||||
]
|
||||
---
|
||||
|
||||
import { Callout, Steps } from 'nextra/components'
|
||||
import { SquarePen, Pencil, Ellipsis, Paintbrush, Trash2, Settings } from 'lucide-react'
|
||||
|
||||
# Start Chatting
|
||||
|
||||
<Steps>
|
||||
|
||||
### Step 1: Install Jan
|
||||
|
||||
1. [Download Jan](/download)
|
||||
2. Install the app ([Mac](/docs/desktop/mac), [Windows](/docs/desktop/windows), [Linux](/docs/desktop/linux))
|
||||
3. Launch Jan
|
||||
|
||||
### Step 2: Download a Model
|
||||
|
||||
Jan requires a model to chat. Download one from the Hub:
|
||||
|
||||
1. Go to the **Hub Tab**
|
||||
2. Browse available models (must be GGUF format)
|
||||
3. Select one matching your hardware specs
|
||||
4. Click **Download**
|
||||
|
||||

|
||||
|
||||
<Callout type="warning">
|
||||
Models consume memory and processing power. Choose based on your hardware specs.
|
||||
</Callout>
|
||||
|
||||
**Hugging Face models:** Some require an access token. Add yours in **Settings > Model Providers > Llama.cpp > Hugging Face Access Token**.
|
||||
|
||||

|
||||
|
||||
### Step 3: Enable GPU Acceleration (Optional)
|
||||
|
||||
For Windows/Linux with compatible graphics cards:
|
||||
|
||||
1. Go to **(<Settings width={16} height={16} style={{display:"inline"}}/>) Settings** > **Hardware**
|
||||
2. Toggle **GPUs** to ON
|
||||
|
||||

|
||||
|
||||
<Callout type="info">
|
||||
Install required drivers before enabling GPU acceleration. See setup guides for [Windows](/docs/desktop/windows#gpu-acceleration) & [Linux](/docs/desktop/linux#gpu-acceleration).
|
||||
</Callout>
|
||||
|
||||
### Step 4: Start Chatting
|
||||
|
||||
1. Click **New Chat** (<SquarePen width={16} height={16} style={{display:"inline"}}/>) icon
|
||||
2. Select your model in the input field dropdown
|
||||
3. Type your message and start chatting
|
||||
|
||||

|
||||
|
||||
</Steps>
|
||||
|
||||
## Managing Conversations
|
||||
|
||||
Jan organizes conversations into threads for easy tracking and revisiting.
|
||||
|
||||
### View Chat History
|
||||
|
||||
- **Left sidebar** shows all conversations
|
||||
- Click any chat to open the full conversation
|
||||
- **Favorites**: Pin important threads for quick access
|
||||
- **Recents**: Access recently used threads
|
||||
|
||||

|
||||
|
||||
### Edit Chat Titles
|
||||
|
||||
1. Hover over a conversation in the sidebar
|
||||
2. Click **three dots** (<Ellipsis width={16} height={16} style={{display:"inline"}}/>) icon
|
||||
3. Click <Pencil width={16} height={16} style={{display:"inline"}}/> **Rename**
|
||||
4. Enter new title and save
|
||||
|
||||

|
||||
|
||||
### Delete Threads
|
||||
|
||||
<Callout type="warning">
|
||||
Thread deletion is permanent. No undo available.
|
||||
</Callout>
|
||||
|
||||
**Single thread:**
|
||||
1. Hover over thread in sidebar
|
||||
2. Click **three dots** (<Ellipsis width={16} height={16} style={{display:"inline"}}/>) icon
|
||||
3. Click <Trash2 width={16} height={16} style={{display:"inline"}}/> **Delete**
|
||||
|
||||
**All threads:**
|
||||
1. Hover over `Recents` category
|
||||
2. Click **three dots** (<Ellipsis width={16} height={16} style={{display:"inline"}}/>) icon
|
||||
3. Select <Trash2 width={16} height={16} style={{display:"inline"}}/> **Delete All**
|
||||
|
||||
## Advanced Features
|
||||
|
||||
### Custom Assistant Instructions
|
||||
|
||||
Customize how models respond:
|
||||
|
||||
1. Use the assistant dropdown in the input field
|
||||
2. Or go to the **Assistant tab** to create custom instructions
|
||||
3. Instructions work across all models
|
||||
|
||||

|
||||
|
||||

|
||||
|
||||
### Model Parameters
|
||||
|
||||
Fine-tune model behavior:
|
||||
- Click the **Gear icon** next to your model
|
||||
- Adjust parameters in **Assistant Settings**
|
||||
- Switch models via the **model selector**
|
||||
|
||||

|
||||
|
||||
### Connect Cloud Models (Optional)
|
||||
|
||||
Connect to OpenAI, Anthropic, Groq, Mistral, and others:
|
||||
|
||||
1. Open any thread
|
||||
2. Select a cloud model from the dropdown
|
||||
3. Click the **Gear icon** beside the provider
|
||||
4. Add your API key (ensure sufficient credits)
|
||||
|
||||

|
||||
|
||||
For detailed setup, see [Remote APIs](/docs/remote-models/openai).
|
||||
docs/src/pages/platforms/_meta.json (new file, 9 lines)
@ -0,0 +1,9 @@
|
||||
{
|
||||
"-- Switcher": {
|
||||
"type": "separator",
|
||||
"title": "Switcher"
|
||||
},
|
||||
"index": {
|
||||
"display": "hidden"
|
||||
}
|
||||
}
|
||||
docs/src/pages/platforms/index.mdx (new file, 87 lines)
@ -0,0 +1,87 @@
|
||||
---
|
||||
title: Coming Soon
|
||||
description: Exciting new features and platforms are on the way. Stay tuned for Jan Web, Jan Mobile, and our API Platform.
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Customizable Intelligence, LLM,
|
||||
local AI,
|
||||
privacy focus,
|
||||
free and open source,
|
||||
private and offline,
|
||||
conversational AI,
|
||||
no-subscription fee,
|
||||
large language models,
|
||||
coming soon,
|
||||
Jan Web,
|
||||
Jan Mobile,
|
||||
API Platform,
|
||||
]
|
||||
---
|
||||
|
||||
import { Callout } from 'nextra/components'
|
||||
|
||||
<div className="text-center py-12">
|
||||
<div className="mb-8">
|
||||
<h1 className="text-4xl font-bold bg-gradient-to-r from-blue-600 to-purple-600 bg-clip-text text-transparent mb-4 py-2">
|
||||
🚀 Coming Soon
|
||||
</h1>
|
||||
<p className="text-xl text-gray-600 dark:text-gray-300 max-w-2xl mx-auto">
|
||||
We're working on the next stage of Jan - making our local assistant more powerful and available on more platforms.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div className="grid grid-cols-1 md:grid-cols-3 gap-6 max-w-4xl mx-auto mb-12">
|
||||
<div className="p-6 border border-gray-200 dark:border-gray-700 rounded-lg bg-gradient-to-br from-blue-50 to-indigo-50 dark:from-blue-900/20 dark:to-indigo-900/20">
|
||||
<div className="text-3xl mb-3">🌐</div>
|
||||
<h3 className="text-lg font-semibold mb-2">Jan Web</h3>
|
||||
<p className="text-sm text-gray-600 dark:text-gray-400">
|
||||
Access Jan directly from your browser with our powerful web interface
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div className="p-6 border border-gray-200 dark:border-gray-700 rounded-lg bg-gradient-to-br from-green-50 to-emerald-50 dark:from-green-900/20 dark:to-emerald-900/20">
|
||||
<div className="text-3xl mb-3">📱</div>
|
||||
<h3 className="text-lg font-semibold mb-2">Jan Mobile</h3>
|
||||
<p className="text-sm text-gray-600 dark:text-gray-400">
|
||||
Take Jan on the go with our native mobile applications
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div className="p-6 border border-gray-200 dark:border-gray-700 rounded-lg bg-gradient-to-br from-purple-50 to-pink-50 dark:from-purple-900/20 dark:to-pink-900/20">
|
||||
<div className="text-3xl mb-3">⚡</div>
|
||||
<h3 className="text-lg font-semibold mb-2">API Platform</h3>
|
||||
<p className="text-sm text-gray-600 dark:text-gray-400">
|
||||
Integrate Jan's capabilities into your applications with our API
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<Callout type="info">
|
||||
**Stay Updated**: Follow our [GitHub repository](https://github.com/menloresearch/jan) and join our [Discord community](https://discord.com/invite/FTk2MvZwJH) for the latest updates on these exciting releases!
|
||||
</Callout>
|
||||
|
||||
<div className="mt-12">
|
||||
<h2 className="text-2xl font-semibold mb-6">What to Expect</h2>
|
||||
<div className="text-left max-w-2xl mx-auto space-y-4">
|
||||
<div className="flex items-start gap-3">
|
||||
<span className="text-green-500 text-xl">✓</span>
|
||||
<div>
|
||||
<strong>Seamless Experience:</strong> Unified interface across all platforms
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex items-start gap-3">
|
||||
<span className="text-green-500 text-xl">✓</span>
|
||||
<div>
|
||||
<strong>Privacy First:</strong> Same privacy-focused approach you trust
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex items-start gap-3">
|
||||
<span className="text-green-500 text-xl">✓</span>
|
||||
<div>
|
||||
<strong>Developer Friendly:</strong> Robust APIs and comprehensive documentation
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
@ -29,19 +29,25 @@ export default defineConfig({
|
||||
starlightThemeNext(),
|
||||
starlightSidebarTopics([
|
||||
{
|
||||
label: 'Jan Desktop',
|
||||
label: 'Jan',
|
||||
link: '/',
|
||||
icon: 'rocket',
|
||||
items: [{ label: 'Ecosystem', slug: 'index' }],
|
||||
},
|
||||
{
|
||||
label: 'Jan Desktop',
|
||||
link: '/jan/quickstart',
|
||||
icon: 'rocket',
|
||||
items: [
|
||||
{
|
||||
label: 'GETTING STARTED',
|
||||
items: [
|
||||
{ label: 'QuickStart', slug: 'jan/quickstart' },
|
||||
{
|
||||
label: 'Install 👋 Jan',
|
||||
collapsed: false,
|
||||
autogenerate: { directory: 'jan/installation' },
|
||||
},
|
||||
{ label: 'QuickStart', slug: 'jan/quickstart' },
|
||||
{
|
||||
label: 'Models',
|
||||
collapsed: true,
|
||||
@ -70,17 +76,12 @@ export default defineConfig({
|
||||
{ label: 'Groq', slug: 'jan/remote-models/groq' },
|
||||
],
|
||||
},
|
||||
],
|
||||
},
|
||||
{
|
||||
label: 'TUTORIALS',
|
||||
items: [
|
||||
{
|
||||
label: 'MCP Examples',
|
||||
label: 'Tutorials',
|
||||
collapsed: true,
|
||||
items: [
|
||||
{
|
||||
label: 'Browser Control (Browserbase)',
|
||||
label: 'Browser Control',
|
||||
slug: 'jan/mcp-examples/browser/browserbase',
|
||||
},
|
||||
{
|
||||
@ -88,11 +89,15 @@ export default defineConfig({
|
||||
slug: 'jan/mcp-examples/data-analysis/e2b',
|
||||
},
|
||||
{
|
||||
label: 'Design Creation (Canva)',
|
||||
label: 'Jupyter Notebooks',
|
||||
slug: 'jan/mcp-examples/data-analysis/jupyter',
|
||||
},
|
||||
{
|
||||
label: 'Design with Canva',
|
||||
slug: 'jan/mcp-examples/design/canva',
|
||||
},
|
||||
{
|
||||
label: 'Deep Research (Octagon)',
|
||||
label: 'Deep Financial Research',
|
||||
slug: 'jan/mcp-examples/deepresearch/octagon',
|
||||
},
|
||||
{
|
||||
@ -100,9 +105,17 @@ export default defineConfig({
|
||||
slug: 'jan/mcp-examples/search/serper',
|
||||
},
|
||||
{
|
||||
label: 'Web Search (Exa)',
|
||||
label: 'Exa Search',
|
||||
slug: 'jan/mcp-examples/search/exa',
|
||||
},
|
||||
{
|
||||
label: 'Linear',
|
||||
slug: 'jan/mcp-examples/productivity/linear',
|
||||
},
|
||||
{
|
||||
label: 'Todoist',
|
||||
slug: 'jan/mcp-examples/productivity/todoist',
|
||||
},
|
||||
],
|
||||
},
|
||||
],
|
||||
|
||||
New binary assets:
- website/src/assets/gpt5-add.png (37 KiB)
- website/src/assets/gpt5-chat.png (81 KiB)
- website/src/assets/gpt5-msg.png (136 KiB)
- website/src/assets/gpt5-msg2.png (264 KiB)
- website/src/assets/gpt5-msg3.png (391 KiB)
- website/src/assets/gpt5-tools.png (41 KiB)
- website/src/assets/jan_loaded.png (310 KiB)
- website/src/assets/jupyter.png (56 KiB)
- website/src/assets/jupyter1.png (307 KiB)
- website/src/assets/jupyter2.png (35 KiB)
- website/src/assets/jupyter3.png (50 KiB)
- website/src/assets/jupyter4.png (80 KiB)
- website/src/assets/jupyter5.png (947 KiB)
- website/src/assets/linear1.png (161 KiB)
- website/src/assets/linear2.png (695 KiB)
- website/src/assets/linear3.png (232 KiB)
- website/src/assets/linear4.png (176 KiB)
- website/src/assets/linear5.png (926 KiB)
- website/src/assets/linear6.png (175 KiB)
- website/src/assets/linear7.png (197 KiB)
- website/src/assets/linear8.png (369 KiB)
- website/src/assets/openai-settings.png (131 KiB)
- website/src/assets/todoist1.png (247 KiB)
- website/src/assets/todoist2.png (383 KiB)
- website/src/assets/todoist3.png (328 KiB)
- website/src/assets/todoist4.png (216 KiB)
- website/src/assets/todoist5.png (514 KiB)
@ -1,106 +1,252 @@
|
||||
---
|
||||
title: Jan
|
||||
description: Build, run, and own your AI. From laptop to superintelligence.
|
||||
description: Working towards open superintelligence through community-driven AI
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
Jan AI,
|
||||
open superintelligence,
|
||||
AI ecosystem,
|
||||
self-hosted AI,
|
||||
local AI,
|
||||
private AI,
|
||||
self-hosted AI,
|
||||
llama.cpp,
|
||||
Model Context Protocol,
|
||||
MCP,
|
||||
GGUF models,
|
||||
MCP tools,
|
||||
Model Context Protocol
|
||||
large language model,
|
||||
LLM,
|
||||
]
|
||||
---
|
||||
|
||||
import { Aside } from '@astrojs/starlight/components';
|
||||
|
||||

|
||||
# Jan
|
||||
|
||||

|
||||
|
||||
## Jan's Goal
|
||||
|
||||
> Jan's goal is to build superintelligence that you can self-host and use locally.
|
||||
> We're working towards open superintelligence to make a viable open-source alternative to platforms like ChatGPT
|
||||
and Claude that anyone can own and run.
|
||||
|
||||
## What is Jan?
|
||||
## What is Jan Today
|
||||
|
||||
Jan is an open-source AI ecosystem that runs on your hardware. We're building towards open superintelligence - a complete AI platform you actually own.
|
||||
Jan is an open-source AI platform that runs on your hardware. We believe AI should be in the hands of many, not
|
||||
controlled by a few tech giants.
|
||||
|
||||
### The Ecosystem
|
||||
Today, Jan is:
|
||||
- **A desktop app** that runs AI models locally or connects to cloud providers
|
||||
- **A model hub** making the latest open-source models accessible
|
||||
- **A connector system** that lets AI interact with real-world tools via MCP
|
||||
|
||||
**Models**: We build specialized models for real tasks, not general-purpose assistants:
|
||||
- **Jan-Nano (32k/128k)**: 4B parameters designed for deep research with MCP. The 128k version processes entire papers, codebases, or legal documents in one go
|
||||
- **Lucy**: 1.7B model that runs agentic web search on your phone. Small enough for CPU, smart enough for complex searches
|
||||
- **Jan-v1**: 4B model for agentic reasoning and tool use, achieving 91.1% on SimpleQA
|
||||
Tomorrow, Jan aims to be a complete ecosystem where open models rival or exceed closed alternatives.
|
||||
|
||||
We also integrate the best open-source models - from OpenAI's gpt-oss to community GGUF models on Hugging Face. The goal: make powerful AI accessible to everyone, not just those with server farms.
|
||||
|
||||
**Applications**: Jan Desktop runs on your computer today. Web, mobile, and server versions coming in late 2025. Everything syncs, everything works together.
|
||||
|
||||
**Tools**: Connect to the real world through [Model Context Protocol (MCP)](https://modelcontextprotocol.io). Design with Canva, analyze data in Jupyter notebooks, control browsers, execute code in E2B sandboxes. Your AI can actually do things, not just talk about them.
|
||||
|
||||
<Aside type="tip">
|
||||
No account needed - just download and run. API keys are optional; bring your own to connect your favorite cloud models.
|
||||
<Aside type="note">
|
||||
We're building this with the open-source AI community, using the best available tools, and sharing everything
|
||||
we learn along the way.
|
||||
</Aside>
|
||||
|
||||
## The Jan Ecosystem
|
||||
|
||||
### Jan Apps
|
||||
**Available Now:**
|
||||
- **Desktop**: Full-featured AI workstation for Windows, Mac, and Linux
|
||||
|
||||
**Coming Late 2025:**
|
||||
- **Mobile**: Jan on your phone
|
||||
- **Web**: Browser-based access at jan.ai
|
||||
- **Server**: Self-hosted for teams
|
||||
- **Extensions**: Browser extension for Chromium-based browsers
|
||||
|
||||
### Jan Model Hub
|
||||
Making open-source AI accessible to everyone:
|
||||
- **Easy Downloads**: One-click model installation
|
||||
- **Jan Models**: Our own models optimized for local use
|
||||
- **Jan-v1**: 4B reasoning model specialized in web search
|
||||
- **Research Models**:
|
||||
- **Jan-Nano (32k/128k)**: 4B model for web search with MCP tools
|
||||
- **Lucy**: 1.7B mobile-optimized for web search
|
||||
- **Community Models**: Any GGUF from Hugging Face works in Jan
|
||||
- **Cloud Models**: Connect your API keys for OpenAI, Anthropic, Gemini, and more
|
||||
|
||||
|
||||
### Jan Connectors Hub
|
||||
Connect AI to the tools you use daily via [Model Context Protocol](./mcp):
|
||||
|
||||
**Creative & Design:**
|
||||
- **Canva**: Generate and edit designs
|
||||
|
||||
**Data & Analysis:**
|
||||
- **Jupyter**: Run Python notebooks
|
||||
- **E2B**: Execute code in sandboxes
|
||||
|
||||
**Web & Search:**
|
||||
- **Browserbase & Browser Use**: Browser automation
|
||||
- **Exa, Serper, Perplexity**: Advanced web search
|
||||
- **Octagon**: Deep research capabilities
|
||||
|
||||
**Productivity:**
|
||||
- **Linear**: Project management
|
||||
- **Todoist**: Task management
|
||||
|
||||
## Core Features
|
||||
|
||||
### Run Models Locally
|
||||
- Download any GGUF model from Hugging Face
|
||||
- Use OpenAI's gpt-oss models (120b and 20b)
|
||||
- Automatic GPU acceleration (NVIDIA/AMD/Intel/Apple Silicon)
|
||||
- OpenAI-compatible API at `localhost:1337`
|
||||
- **Run Models Locally**: Download any GGUF model from Hugging Face, use OpenAI's gpt-oss models,
|
||||
or connect to cloud providers
|
||||
- **OpenAI-Compatible API**: Local server at `localhost:1337` works with tools like
|
||||
[Continue](./server-examples/continue-dev) and [Cline](https://cline.bot/)
|
||||
- **Extend with MCP Tools**: Browser automation, web search, data analysis, and design tools, all
|
||||
through natural language
|
||||
- **Your Choice of Infrastructure**: Run on your laptop, self-host on your servers (soon), or use
|
||||
cloud when you need it
|
||||
|
||||
### Connect to Cloud (Optional)
|
||||
- Your API keys for OpenAI, Anthropic, etc.
|
||||
- Jan.ai cloud models (coming late 2025)
|
||||
- Self-hosted Jan Server (soon)
|
||||
## Philosophy
|
||||
|
||||
### Extend with MCP Tools
|
||||
Growing ecosystem of real-world integrations:
|
||||
- **Creative Work**: Generate designs with Canva
|
||||
- **Data Analysis**: Execute Python in Jupyter notebooks
|
||||
- **Web Automation**: Control browsers with Browserbase and Browser Use
|
||||
- **Code Execution**: Run code safely in E2B sandboxes
|
||||
- **Search & Research**: Access current information via Exa, Perplexity, and Octagon
|
||||
- **More coming**: The MCP ecosystem is expanding rapidly
|
||||
|
||||
## Architecture
|
||||
|
||||
Jan is built on:
|
||||
- [Llama.cpp](https://github.com/ggerganov/llama.cpp) for inference
|
||||
- [Model Context Protocol](https://modelcontextprotocol.io) for tool integration
|
||||
- Local-first data storage in `~/jan`
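Because storage is plain files in the data folder, you can inspect or back it up directly - for example (assuming the default `~/jan` location mentioned above):

```bash
# Look at what Jan keeps locally, then make a simple backup archive.
ls -la ~/jan
tar czf jan-backup.tar.gz -C ~ jan
```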
|
||||
|
||||
## Why Jan?
|
||||
|
||||
| Feature | Other AI Platforms | Jan |
|
||||
|:--------|:-------------------|:----|
|
||||
| **Deployment** | Their servers only | Your device, your servers, or our cloud |
|
||||
| **Models** | One-size-fits-all | Specialized models for specific tasks |
|
||||
| **Data** | Stored on their servers | Stays on your hardware |
|
||||
| **Cost** | Monthly subscription | Free locally, pay for cloud |
|
||||
| **Extensibility** | Limited APIs | Full ecosystem with MCP tools |
|
||||
| **Ownership** | You rent access | You own everything |
|
||||
|
||||
## Development Philosophy
|
||||
|
||||
1. **Local First**: Everything works offline. Cloud is optional.
|
||||
2. **User Owned**: Your data, your models, your compute.
|
||||
3. **Built in Public**: Watch our models train. See our code. Track our progress.
|
||||
Jan is built to be user-owned:
|
||||
- **Open Source**: Apache 2.0 license
|
||||
- **Local First**: Your data stays on your device. Internet is optional
|
||||
- **Privacy Focused**: We don't collect or sell user data. See our [Privacy Policy](./privacy)
|
||||
- **No Lock-in**: Export your data anytime. Use any model. Switch between local and cloud
|
||||
|
||||
<Aside>
|
||||
We're building AI that respects your choices. Not another wrapper around someone else's API.
|
||||
The best AI is the one you control. Not the one that others control for you.
|
||||
</Aside>
|
||||
|
||||
## System Requirements
|
||||
## The Path Forward
|
||||
|
||||
**Minimum**: 8GB RAM, 10GB storage
|
||||
**Recommended**: 16GB RAM, GPU (NVIDIA/AMD/Intel), 50GB storage
|
||||
**Supported**: Windows 10+, macOS 12+, Linux (Ubuntu 20.04+)
|
||||
### What Works Today
|
||||
- Run powerful models locally on consumer hardware
|
||||
- Connect to any cloud provider with your API keys
|
||||
- Use MCP tools for real-world tasks
|
||||
- Access transparent model evaluations
|
||||
|
||||
## What's Next?
|
||||
### What We're Building
|
||||
- More specialized models that excel at specific tasks
|
||||
- Expanded app ecosystem (mobile, web, extensions)
|
||||
- Richer connector ecosystem
|
||||
- An evaluation framework to build better models
|
||||
|
||||
### The Long-Term Vision
|
||||
We're working towards open superintelligence where:
|
||||
- Open models match or exceed closed alternatives
|
||||
- Anyone can run powerful AI on their own hardware
|
||||
- The community drives innovation, not corporations
|
||||
- AI capabilities are owned by users, not rented
|
||||
|
||||
<Aside type="caution">
|
||||
This is an ambitious goal without a guaranteed path. We're betting on the open-source community, improved
|
||||
hardware, and better techniques, but we're honest that this is a journey, not a destination we've reached.
|
||||
</Aside>
|
||||
|
||||
## Quick Start
|
||||
|
||||
1. [Download Jan](./quickstart) for your operating system
|
||||
2. Choose a model - download locally or add cloud API keys
|
||||
3. Start chatting or connect tools via MCP
|
||||
4. Build with our [local API](./api-server)
|
||||
|
||||
## Acknowledgements
|
||||
|
||||
Jan is built on the shoulders of giants:
|
||||
- [Llama.cpp](https://github.com/ggerganov/llama.cpp) for inference
|
||||
- [Model Context Protocol](https://modelcontextprotocol.io) for tool integration
|
||||
- The open-source community that makes this possible
|
||||
|
||||
## FAQs
|
||||
|
||||
<details>
|
||||
<summary><strong>What is Jan?</strong></summary>
|
||||
|
||||
Jan is an open-source AI platform working towards a viable alternative to Big Tech AI. Today it's a desktop app that runs models locally or connects to cloud providers. Tomorrow it aims to be a complete ecosystem rivaling platforms like ChatGPT and Claude.
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>How is this different from other AI platforms?</strong></summary>
|
||||
|
||||
Other platforms are models behind APIs you rent. Jan is a complete AI ecosystem you own. Run any model, use real tools through MCP, keep your data private, and never pay subscriptions for local use.
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>What models can I use?</strong></summary>
|
||||
|
||||
**Jan Models:**
|
||||
- Jan-Nano (32k/128k) - Research and analysis with MCP integration
|
||||
- Lucy - Mobile-optimized search (1.7B)
|
||||
- Jan-v1 - Reasoning and tool use (4B)
|
||||
|
||||
**Open Source:**
|
||||
- OpenAI's gpt-oss models (120b and 20b)
|
||||
- Any GGUF model from Hugging Face
|
||||
|
||||
**Cloud (with your API keys):**
|
||||
- OpenAI, Anthropic, Mistral, Groq, and more
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>What are MCP tools?</strong></summary>
|
||||
|
||||
MCP (Model Context Protocol) lets AI interact with real applications. Instead of just generating text, your AI can create designs in Canva, analyze data in Jupyter, browse the web, and execute code - all through conversation.
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>Is Jan compatible with my system?</strong></summary>
|
||||
|
||||
**Supported OS**:
|
||||
- [Windows 10+](/docs/desktop/windows#compatibility)
|
||||
- [macOS 12+](/docs/desktop/mac#compatibility)
|
||||
- [Linux (Ubuntu 20.04+)](/docs/desktop/linux)
|
||||
|
||||
**Hardware**:
|
||||
- Minimum: 8GB RAM, 10GB storage
|
||||
- Recommended: 16GB RAM, GPU (NVIDIA/AMD/Intel/Apple), 50GB storage
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>How realistic is 'open superintelligence'?</strong></summary>
|
||||
|
||||
Honestly? It's ambitious and uncertain. We believe the combination of rapidly improving open models, better consumer hardware, community innovation, and specialized models working together can eventually rival closed platforms. But this is a multi-year journey with no guarantees. What we can guarantee is that we'll keep building in the open, with the community, towards this goal.
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>What can Jan actually do today?</strong></summary>
|
||||
|
||||
Right now, Jan can:
|
||||
- Run models like Llama, Mistral, and our own Jan models locally
|
||||
- Connect to cloud providers if you want more power
|
||||
- Use MCP tools to create designs, analyze data, browse the web, and more
|
||||
- Work completely offline once models are downloaded
|
||||
- Provide an OpenAI-compatible API for developers
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>Is Jan really free?</strong></summary>
|
||||
|
||||
**Local use**: Always free, no catches
|
||||
**Cloud models**: You pay providers directly (we add no markup)
|
||||
**Jan cloud**: Optional paid services coming 2025
|
||||
|
||||
The core platform will always be free and open source.
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>How does Jan protect privacy?</strong></summary>
|
||||
|
||||
- Runs 100% offline once models are downloaded
|
||||
- All data stored locally in [Jan Data Folder](/docs/data-folder)
|
||||
- No telemetry without explicit consent
|
||||
- Open source code you can audit
|
||||
|
||||
<Aside type="caution">
|
||||
When using cloud providers through Jan, their privacy policies apply.
|
||||
</Aside>
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>Can I self-host Jan?</strong></summary>
|
||||
|
||||
Yes. Download directly or build from [source](https://github.com/menloresearch/jan). Jan Server for production deployments is coming late 2025.
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>When will mobile/web versions launch?</strong></summary>
|
||||
@ -113,87 +259,16 @@ All versions will sync seamlessly.
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>What models are available?</strong></summary>
|
||||
<summary><strong>How can I contribute?</strong></summary>
|
||||
|
||||
**Jan Models:**
|
||||
- **Jan-Nano (32k/128k)**: Deep research with MCP integration
|
||||
- **Lucy**: Mobile-optimized agentic search (1.7B)
|
||||
- **Jan-v1**: Agentic reasoning and tool use (4B)
|
||||
|
||||
**Open Source:**
|
||||
- OpenAI's gpt-oss models (120b and 20b)
|
||||
- Any GGUF model from Hugging Face
|
||||
|
||||
**Cloud (with your API keys):**
|
||||
- OpenAI, Anthropic, Mistral, Groq, and more
|
||||
|
||||
**Coming late 2025:**
|
||||
- More specialized models for specific tasks
|
||||
|
||||
[Watch live training progress →](https://train.jan.ai)
|
||||
- Code: [GitHub](https://github.com/menloresearch/jan)
|
||||
- Community: [Discord](https://discord.gg/FTk2MvZwJH)
|
||||
- Testing: Help evaluate models and report bugs
|
||||
- Documentation: Improve guides and tutorials
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>What are MCP tools?</strong></summary>
|
||||
<summary><strong>Are you hiring?</strong></summary>
|
||||
|
||||
MCP (Model Context Protocol) lets AI interact with real applications. Instead of just generating text, your AI can:
|
||||
- Create designs in Canva
|
||||
- Analyze data in Jupyter notebooks
|
||||
- Browse and interact with websites
|
||||
- Execute code in sandboxes
|
||||
- Search the web for current information
|
||||
|
||||
All through natural language conversation.
|
||||
Yes! We love hiring from our community. Check [Careers](https://menlo.bamboohr.com/careers).
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>How does Jan make money?</strong></summary>
|
||||
|
||||
- **Local use**: Always free
|
||||
- **Cloud features**: Optional paid services (coming late 2025)
|
||||
- **Enterprise**: Self-hosted deployment and support
|
||||
|
||||
We don't sell your data. We sell software and services.
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>Can I contribute?</strong></summary>
|
||||
|
||||
Yes. Everything is open:
|
||||
- [GitHub](https://github.com/janhq/jan) - Code contributions
|
||||
- [Model Training](https://jan.ai/docs/models) - See how we train
|
||||
- [Discord](https://discord.gg/FTk2MvZwJH) - Join discussions
|
||||
- [Model Testing](https://eval.jan.ai) - Help evaluate models
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>Is this just another AI wrapper?</strong></summary>
|
||||
|
||||
No. We're building:
|
||||
- Our own models trained for specific tasks
|
||||
- Complete local AI infrastructure
|
||||
- Tools that extend model capabilities via MCP
|
||||
- An ecosystem that works offline
|
||||
|
||||
Other platforms are models behind APIs you rent. Jan is a complete AI platform you own.
|
||||
</details>
|
||||
|
||||
<details>
|
||||
<summary><strong>What about privacy?</strong></summary>
|
||||
|
||||
**Local mode**: Your data never leaves your device. Period.
|
||||
**Cloud mode**: You choose when to use cloud features. Clear separation.
|
||||
|
||||
See our [Privacy Policy](./privacy).
|
||||
</details>
|
||||
|
||||
## Get Started
|
||||
|
||||
1. [Install Jan Desktop](./jan/installation) - Your AI workstation
|
||||
2. [Download Models](./jan/models) - Choose from gpt-oss, community models, or cloud
|
||||
3. [Explore MCP Tools](./mcp) - Connect to real applications
|
||||
4. [Build with our API](./api-reference) - OpenAI-compatible at localhost:1337
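As a quick sanity check of the local server, a request in the standard OpenAI chat-completions format looks roughly like this (a sketch - the model id is a placeholder, use one that appears in your Jan model list):

```bash
# Minimal chat request against Jan's OpenAI-compatible local server.
# If you configured an API key for the local server, also add:
#   -H "Authorization: Bearer <your key>"
curl http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "jan-v1-4b",
    "messages": [{"role": "user", "content": "Hello from the local API"}]
  }'
```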
|
||||
|
||||
---
|
||||
|
||||
**Questions?** Join our [Discord](https://discord.gg/FTk2MvZwJH) or check [GitHub](https://github.com/janhq/jan/).
|
||||
@ -1,6 +1,11 @@
|
||||
---
|
||||
title: Jan-v1
|
||||
description: 4B parameter model with strong performance on reasoning benchmarks
|
||||
sidebar:
|
||||
order: 0
|
||||
badge:
|
||||
text: New
|
||||
variant: tip
|
||||
---
|
||||
|
||||
import { Aside } from '@astrojs/starlight/components';
|
||||
|
||||
@ -0,0 +1,335 @@
|
||||
---
|
||||
title: Jupyter MCP
|
||||
description: Real-time Jupyter notebook interaction and code execution through MCP integration.
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
MCP,
|
||||
Model Context Protocol,
|
||||
Jupyter,
|
||||
data analysis,
|
||||
code execution,
|
||||
notebooks,
|
||||
Python,
|
||||
visualization,
|
||||
tool calling,
|
||||
GPT-5,
|
||||
OpenAI,
|
||||
]
|
||||
---
|
||||
|
||||
import { Aside } from '@astrojs/starlight/components';
|
||||
|
||||
[Jupyter MCP Server](https://jupyter-mcp-server.datalayer.tech/) enables real-time interaction with Jupyter notebooks, allowing AI models to edit, execute, and document code for data analysis and visualization. Instead of just generating code suggestions, AI can actually run Python code and see the results.
|
||||
|
||||
This integration gives Jan the ability to execute analysis, create visualizations, and iterate based on actual results - turning your AI assistant into a capable data science partner.
|
||||
|
||||
<Aside type="note">
|
||||
**Breaking Change**: Version 0.11.0+ renamed `room` to `document`. Check the [release notes](https://jupyter-mcp-server.datalayer.tech/releases) for details.
|
||||
</Aside>
|
||||
|
||||
## Available Tools
|
||||
|
||||
The Jupyter MCP Server provides [12 comprehensive tools](https://jupyter-mcp-server.datalayer.tech/tools/):
|
||||
|
||||
### Core Operations
|
||||
- `append_execute_code_cell`: Add and run code cells at notebook end
|
||||
- `insert_execute_code_cell`: Insert and run code at specific positions
|
||||
- `execute_cell_simple_timeout`: Execute cells with timeout control
|
||||
- `execute_cell_streaming`: Long-running cells with progress updates
|
||||
- `execute_cell_with_progress`: Execute with timeout and monitoring
|
||||
|
||||
### Cell Management
|
||||
- `append_markdown_cell`: Add documentation cells
|
||||
- `insert_markdown_cell`: Insert markdown at specific positions
|
||||
- `delete_cell`: Remove cells from notebook
|
||||
- `overwrite_cell_source`: Update existing cell content
|
||||
|
||||
### Information & Reading
|
||||
- `get_notebook_info`: Retrieve notebook metadata
|
||||
- `read_cell`: Examine specific cell content
|
||||
- `read_all_cells`: Get complete notebook state
|
||||
|
||||
<Aside type="caution">
|
||||
The MCP connects to **one notebook at a time**, not multiple notebooks. Specify your target notebook in the configuration.
|
||||
</Aside>
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- Jan with MCP enabled
|
||||
- Python 3.8+ with uv package manager
|
||||
- Docker installed
|
||||
- OpenAI API key for GPT-5 access
|
||||
- Basic understanding of Jupyter notebooks
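You can confirm the tooling prerequisites from a terminal before starting:

```bash
# Check that the required tools are installed and on your PATH.
python3 --version   # 3.8 or newer
uv --version
docker --version
```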
|
||||
|
||||
## Setup
|
||||
|
||||
### Enable MCP
|
||||
|
||||
1. Go to **Settings** > **MCP Servers**
|
||||
2. Toggle **Allow All MCP Tool Permission** ON
|
||||
|
||||

|
||||
|
||||
### Install uv Package Manager
|
||||
|
||||
If you don't have uv installed:
|
||||
|
||||
```bash
|
||||
# macOS and Linux
|
||||
curl -LsSf https://astral.sh/uv/install.sh | sh
|
||||
|
||||
# Windows
|
||||
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
|
||||
```
|
||||
|
||||
### Create Python Environment
|
||||
|
||||
Set up an isolated environment for Jupyter:
|
||||
|
||||
```bash
|
||||
# Create environment with Python 3.13
|
||||
uv venv .venv --python 3.13
|
||||
|
||||
# Activate environment
|
||||
source .venv/bin/activate # Linux/macOS
|
||||
# or
|
||||
.venv\Scripts\activate # Windows
|
||||
|
||||
# Install Jupyter dependencies
|
||||
uv pip install jupyterlab==4.4.1 jupyter-collaboration==4.0.2 ipykernel
|
||||
uv pip uninstall pycrdt datalayer_pycrdt
|
||||
uv pip install datalayer_pycrdt==0.12.17
|
||||
|
||||
# Add data science libraries
|
||||
uv pip install pandas numpy matplotlib altair
|
||||
```
|
||||
|
||||
### Start JupyterLab Server
|
||||
|
||||
Launch JupyterLab with authentication:
|
||||
|
||||
```bash
|
||||
jupyter lab --port 8888 --IdentityProvider.token heyheyyou --ip 0.0.0.0
|
||||
```
|
||||
|
||||

|
||||
|
||||
The server opens in your browser:
|
||||
|
||||

|
||||
|
||||
### Create Target Notebook
|
||||
|
||||
Create a new notebook named `for_jan.ipynb`:
|
||||
|
||||

|
||||
|
||||
### Configure MCP Server in Jan
|
||||
|
||||
Click `+` in MCP Servers section:
|
||||
|
||||
**Configuration for macOS/Windows:**
|
||||
- **Server Name**: `jupyter`
|
||||
- **Command**: `docker`
|
||||
- **Arguments**:
|
||||
```
|
||||
run -i --rm -e DOCUMENT_URL -e DOCUMENT_TOKEN -e DOCUMENT_ID -e RUNTIME_URL -e RUNTIME_TOKEN datalayer/jupyter-mcp-server:latest
|
||||
```
|
||||
- **Environment Variables**:
|
||||
- Key: `DOCUMENT_URL`, Value: `http://host.docker.internal:8888`
|
||||
- Key: `DOCUMENT_TOKEN`, Value: `heyheyyou`
|
||||
- Key: `DOCUMENT_ID`, Value: `for_jan.ipynb`
|
||||
- Key: `RUNTIME_URL`, Value: `http://host.docker.internal:8888`
|
||||
- Key: `RUNTIME_TOKEN`, Value: `heyheyyou`
|
||||
|
||||

|
||||
|
||||
## Using OpenAI's GPT-5
|
||||
|
||||
### Configure OpenAI Provider
|
||||
|
||||
Navigate to **Settings** > **Model Providers** > **OpenAI**:
|
||||
|
||||

|
||||
|
||||
### Add GPT-5 Model
|
||||
|
||||
Since GPT-5 is new, you'll need to manually add it to Jan:
|
||||
|
||||

|
||||
|
||||
<Aside type="note">
|
||||
**About GPT-5**: OpenAI's smartest, fastest, most useful model yet. It features built-in thinking capabilities, state-of-the-art performance across coding, math, and writing, and exceptional tool use abilities. GPT-5 automatically decides when to respond quickly versus when to think longer for expert-level responses.
|
||||
</Aside>
|
||||
|
||||
### Enable Tool Calling
|
||||
|
||||
Ensure tools are enabled for GPT-5:
|
||||
|
||||

|
||||
|
||||
## Usage
|
||||
|
||||
### Verify Tool Availability
|
||||
|
||||
Start a new chat with GPT-5. The tools bubble shows all available Jupyter operations:
|
||||
|
||||

|
||||
|
||||
### Initial Test
|
||||
|
||||
Start with establishing the notebook as your workspace:
|
||||
|
||||
```
|
||||
You have access to a jupyter notebook, please use it as our data analysis scratchpad. Let's start by printing "Hello Jan" in a new cell.
|
||||
```
|
||||
|
||||
GPT-5 creates and executes the code successfully:
|
||||
|
||||

|
||||
|
||||
### Advanced Data Analysis
|
||||
|
||||
Try a more complex task combining multiple operations:
|
||||
|
||||
```
|
||||
Generate synthetic data with numpy, move it to a pandas dataframe and create a pivot table, and then make a cool animated plot using matplotlib. Your use case will be sales analysis in the luxury fashion industry.
|
||||
```
|
||||
|
||||

|
||||
|
||||
Watch the complete output unfold:
|
||||
|
||||
<video width="100%" controls>
|
||||
<source src="/assets/videos/mcpjupyter.mp4" type="video/mp4" />
|
||||
Your browser does not support the video tag.
|
||||
</video>
|
||||
|
||||
## Example Prompts to Try
|
||||
|
||||
### Financial Analysis
|
||||
```
|
||||
Create a Monte Carlo simulation for portfolio risk analysis. Generate 10,000 scenarios, calculate VaR at 95% confidence, and visualize the distribution.
|
||||
```
|
||||
|
||||
### Time Series Forecasting
|
||||
```
|
||||
Generate synthetic time series data representing daily website traffic over 2 years with weekly seasonality and trend. Build an ARIMA model and forecast the next 30 days.
|
||||
```
|
||||
|
||||
### Machine Learning Pipeline
|
||||
```
|
||||
Build a complete classification pipeline: generate a dataset with 3 classes and 5 features, split the data, try multiple algorithms (RF, SVM, XGBoost), and create a comparison chart of their performance.
|
||||
```
|
||||
|
||||
### Interactive Dashboards
|
||||
```
|
||||
Create an interactive visualization using matplotlib widgets showing how changing interest rates affects loan payments over different time periods.
|
||||
```
|
||||
|
||||
### Statistical Testing
|
||||
```
|
||||
Generate two datasets representing A/B test results for an e-commerce site. Perform appropriate statistical tests and create visualizations to determine if the difference is significant.
|
||||
```
|
||||
|
||||
## Performance Considerations
|
||||
|
||||
<Aside type="caution">
|
||||
Multiple tools can quickly consume context windows, especially for local models. GPT-5's unified system with smart routing helps manage this, but local models may struggle with speed and context limitations.
|
||||
</Aside>
|
||||
|
||||
### Context Management
|
||||
- Each tool call adds to conversation history
|
||||
- With 12 tools available, the system prompt overhead is substantial
|
||||
- Local models may need reduced tool sets for reasonable performance
|
||||
- Consider disabling unused tools to conserve context
|
||||
|
||||
### Cloud vs Local Trade-offs
|
||||
- **Cloud models (GPT-5)**: Handle multiple tools efficiently with large context windows
|
||||
- **Local models**: May require optimization, reduced tool sets, or smaller context sizes
|
||||
- **Hybrid approach**: Use cloud for complex multi-tool workflows, local for simple tasks
|
||||
|
||||
## Security Considerations
|
||||
|
||||
<Aside type="caution">
|
||||
MCP provides powerful capabilities but requires careful security practices.
|
||||
</Aside>
|
||||
|
||||
### Authentication Tokens
|
||||
- **Always use strong tokens** - avoid simple passwords (see the example after this list)
|
||||
- **Never commit tokens** to version control
|
||||
- **Rotate tokens regularly** for production use
|
||||
- **Use different tokens** for different environments
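For the first point, one way to generate a token stronger than the example value used in this guide and pass it to the same launch command:

```bash
# Generate a random token and use it for both JupyterLab and the MCP env vars.
JUPYTER_TOKEN=$(openssl rand -hex 32)
echo "$JUPYTER_TOKEN"
jupyter lab --port 8888 --IdentityProvider.token "$JUPYTER_TOKEN" --ip 0.0.0.0
```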
|
||||
|
||||
### Network Security
|
||||
- JupyterLab is network-accessible with `--ip 0.0.0.0`
|
||||
- Consider using `--ip 127.0.0.1` for local-only access (see the command after this list)
|
||||
- Implement firewall rules to restrict access
|
||||
- Use HTTPS in production environments
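The loopback-only variant of the earlier launch command is shown below. Note that, depending on how Docker networking is set up, a container using `host.docker.internal` may not be able to reach a loopback-only server, so re-test the MCP connection after switching.

```bash
# Bind JupyterLab to the loopback interface only.
jupyter lab --port 8888 --IdentityProvider.token heyheyyou --ip 127.0.0.1
```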
|
||||
|
||||
### Code Execution Risks
|
||||
- AI has full Python execution capabilities
|
||||
- Review generated code before execution
|
||||
- Use isolated environments for sensitive work
|
||||
- Monitor resource usage and set limits
|
||||
|
||||
### Data Privacy
|
||||
- Notebook content is processed by AI models
|
||||
- When using cloud models like GPT-5, data leaves your system
|
||||
- Keep sensitive data in secure environments
|
||||
- Review your model provider's data policies
|
||||
|
||||
## Best Practices
|
||||
|
||||
### Environment Management
|
||||
- Use virtual environments for isolation
|
||||
- Document required dependencies
|
||||
- Version control your notebooks
|
||||
- Regular environment cleanup
|
||||
|
||||
### Performance Optimization
|
||||
- Start with simple operations
|
||||
- Monitor memory usage during execution
|
||||
- Close unused notebooks
|
||||
- Restart kernels when needed
|
||||
|
||||
### Effective Prompting
|
||||
- Be specific about desired outputs
|
||||
- Break complex tasks into steps
|
||||
- Ask for explanations with code
|
||||
- Request error handling in critical operations
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
**Connection Problems:**
|
||||
- Verify JupyterLab is running
|
||||
- Check token matches configuration
|
||||
- Confirm Docker can reach host
|
||||
- Test with curl to verify connectivity
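For that last check, a request against the Jupyter Server REST API should return a small JSON status document if the server is reachable and the token is accepted (assuming the token from the setup steps):

```bash
# Quick connectivity and token check against JupyterLab.
curl -s "http://localhost:8888/api/status?token=heyheyyou"
```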
|
||||
|
||||
**Execution Failures:**
|
||||
- Check Python package availability
|
||||
- Verify kernel is running
|
||||
- Look for syntax errors in generated code
|
||||
- Restart kernel if stuck
|
||||
|
||||
**Tool Calling Errors:**
|
||||
- Ensure model supports tool calling
|
||||
- Verify all 12 tools appear in chat
|
||||
- Check MCP server is active
|
||||
- Review Docker logs for errors
|
||||
|
||||
**API Rate Limits:**
|
||||
- Monitor OpenAI usage dashboard
|
||||
- Implement retry logic for transient errors
|
||||
- Consider fallback to local models
|
||||
- Cache results when possible
|
||||
|
||||
## Conclusion
|
||||
|
||||
The Jupyter MCP integration combined with GPT-5's advanced capabilities creates an exceptionally powerful data science environment. With GPT-5's built-in reasoning and expert-level intelligence, complex analyses that once required extensive manual coding can now be accomplished through natural conversation.
|
||||
|
||||
Whether you're exploring data, building models, or creating visualizations, this integration provides the computational power of Jupyter with the intelligence of GPT-5 - all within Jan's privacy-conscious interface.
|
||||
|
||||
Remember: with great computational power comes the responsibility to use it securely. Always validate generated code, use strong authentication, and be mindful of data privacy when using cloud-based models.
|
||||
@ -0,0 +1,270 @@
|
||||
---
|
||||
title: Linear MCP
|
||||
description: Manage software projects and issue tracking through natural language with Linear integration.
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
MCP,
|
||||
Model Context Protocol,
|
||||
Linear,
|
||||
project management,
|
||||
issue tracking,
|
||||
agile,
|
||||
software development,
|
||||
tool calling,
|
||||
]
|
||||
sidebar:
|
||||
badge:
|
||||
text: New
|
||||
variant: tip
|
||||
---
|
||||
|
||||
import { Aside } from '@astrojs/starlight/components';
|
||||
|
||||
[Linear MCP](https://linear.app) provides comprehensive project management capabilities through natural conversation. Transform your software development workflow by managing issues, projects, and team collaboration directly through AI.
|
||||
|
||||
## Available Tools
|
||||
|
||||
Linear MCP offers extensive project management capabilities:
|
||||
|
||||
### Issue Management
|
||||
- `list_issues`: View all issues in your workspace
|
||||
- `get_issue`: Get details of a specific issue
|
||||
- `create_issue`: Create new issues with full details
|
||||
- `update_issue`: Modify existing issues
|
||||
- `list_my_issues`: See your assigned issues
|
||||
- `list_issue_statuses`: View available workflow states
|
||||
- `list_issue_labels`: See and manage labels
|
||||
- `create_issue_label`: Create new labels
|
||||
|
||||
### Project & Team
|
||||
- `list_projects`: View all projects
|
||||
- `get_project`: Get project details
|
||||
- `create_project`: Start new projects
|
||||
- `update_project`: Modify project settings
|
||||
- `list_teams`: See all teams
|
||||
- `get_team`: Get team information
|
||||
- `list_users`: View team members
|
||||
|
||||
### Documentation & Collaboration
|
||||
- `list_documents`: Browse documentation
|
||||
- `get_document`: Read specific documents
|
||||
- `search_documentation`: Find information
|
||||
- `list_comments`: View issue comments
|
||||
- `create_comment`: Add comments to issues
|
||||
- `list_cycles`: View sprint cycles
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- Jan with experimental features enabled
|
||||
- Linear account (free for up to 250 issues)
|
||||
- Model with strong tool calling support
|
||||
- Active internet connection
|
||||
|
||||
<Aside type="note">
|
||||
Linear offers a generous free tier perfect for small teams and personal projects. Unlimited users, 250 active issues, and full API access included.
|
||||
</Aside>
|
||||
|
||||
## Setup
|
||||
|
||||
### Create Linear Account
|
||||
|
||||
1. Sign up at [linear.app](https://linear.app)
|
||||
2. Complete the onboarding process
|
||||
|
||||

|
||||
|
||||
Once logged in, you'll see your workspace:
|
||||
|
||||

|
||||
|
||||
### Enable MCP in Jan
|
||||
|
||||
<Aside type="caution">
|
||||
Enable **Experimental Features** in **Settings > General** if you don't see the MCP Servers option.
|
||||
</Aside>
|
||||
|
||||
1. Go to **Settings > MCP Servers**
|
||||
2. Toggle **Allow All MCP Tool Permission** ON
|
||||
|
||||
### Configure Linear MCP
|
||||
|
||||
Click the `+` button to add Linear MCP:
|
||||
|
||||
**Configuration:**
|
||||
- **Server Name**: `linear`
|
||||
- **Command**: `npx`
|
||||
- **Arguments**: `-y mcp-remote https://mcp.linear.app/sse`
|
||||
|
||||

|
||||
|
||||
### Authenticate with Linear
|
||||
|
||||
When you first use Linear tools, a browser tab will open for authentication:
|
||||
|
||||

|
||||
|
||||
Complete the OAuth flow to grant Jan access to your Linear workspace.
|
||||
|
||||
## Usage
|
||||
|
||||
### Select a Model with Tool Calling
|
||||
|
||||
For this example, we'll use kimi-k2 from Groq:
|
||||
|
||||
1. Add the model in Groq settings: `moonshotai/kimi-k2-instruct`
|
||||
|
||||

|
||||
|
||||
2. Enable tools for the model:
|
||||
|
||||

|
||||
|
||||
### Verify Available Tools
|
||||
|
||||
You should see all Linear tools in the chat interface:
|
||||
|
||||

|
||||
|
||||
### Epic Project Management
|
||||
|
||||
Watch AI transform mundane tasks into epic narratives:
|
||||
|
||||

|
||||
|
||||
## Creative Examples
|
||||
|
||||
### 🎭 Shakespearean Sprint Planning
|
||||
```
|
||||
Create Linear tickets in the '👋Jan' team for my AGI project as battles in a Shakespearean war epic. Each sprint is a military campaign, bugs are enemy spies, and merge conflicts are sword fights between rival houses. Invent unique epic titles and dramatic descriptions with battle cries and victory speeches. Characterize bugs as enemy villains and developers as heroic warriors in this noble quest for AGI glory. Make tasks like model training, testing, and deployment sound like grand military campaigns with honor and valor.
|
||||
```
|
||||
|
||||
### 🚀 Space Mission Development
|
||||
```
|
||||
Transform our mobile app redesign into a NASA space mission. Create issues where each feature is a mission objective, bugs are space debris to clear, and releases are launch windows. Add dramatic mission briefings, countdown sequences, and astronaut logs. Priority levels become mission criticality ratings.
|
||||
```
|
||||
|
||||
### 🏴‍☠️ Pirate Ship Operations
|
||||
```
|
||||
Set up our e-commerce platform project as a pirate fleet adventure. Features are islands to conquer, bugs are sea monsters, deployments are naval battles. Create colorful pirate-themed tickets with treasure maps, crew assignments, and tales of high seas adventure.
|
||||
```
|
||||
|
||||
### 🎮 Video Game Quest Log
|
||||
```
|
||||
Structure our API refactoring project like an RPG quest system. Create issues as quests with XP rewards, boss battles for major features, side quests for minor tasks. Include loot drops (completed features), skill trees (learning requirements), and epic boss fight descriptions for challenging bugs.
|
||||
```
|
||||
|
||||
### 🍳 Gordon Ramsay's Kitchen
|
||||
```
|
||||
Manage our restaurant app project as if Gordon Ramsay is the head chef. Create brutally honest tickets criticizing code quality, demanding perfection in UX like a Michelin star dish. Bugs are "bloody disasters" and successful features are "finally, some good code." Include Kitchen Nightmares-style rescue plans.
|
||||
```
|
||||
|
||||
## Practical Workflows
|
||||
|
||||
### Sprint Planning
|
||||
```
|
||||
Review all open issues in the Backend team, identify the top 10 by priority, and create a new sprint cycle called "Q1 Performance Sprint" with appropriate issues assigned.
|
||||
```
|
||||
|
||||
### Bug Triage
|
||||
```
|
||||
List all bugs labeled "critical" or "high-priority", analyze their descriptions, and suggest which ones should be fixed first based on user impact. Update their status to "In Progress" for the top 3.
|
||||
```
|
||||
|
||||
### Documentation Audit
|
||||
```
|
||||
Search our documentation for anything related to API authentication. Create issues for any gaps or outdated sections you find, labeled as "documentation" with detailed improvement suggestions.
|
||||
```
|
||||
|
||||
### Team Workload Balance
|
||||
```
|
||||
Show me all active issues grouped by assignee. Identify anyone with more than 5 high-priority items and suggest redistributions to balance the workload.
|
||||
```
|
||||
|
||||
### Release Planning
|
||||
```
|
||||
Create a project called "v2.0 Release" with milestones for: feature freeze, beta testing, documentation, and launch. Generate appropriate issues for each phase with realistic time estimates.
|
||||
```
|
||||
|
||||
## Advanced Integration Patterns
|
||||
|
||||
### Cross-Project Dependencies
|
||||
```
|
||||
Find all issues labeled "blocked" across all projects. For each one, identify what they're waiting on and create linked issues for the blocking items if they don't exist.
|
||||
```
|
||||
|
||||
### Automated Status Updates
|
||||
```
|
||||
Look at all issues assigned to me that haven't been updated in 3 days. Add a comment with a status update based on their current state and any blockers.
|
||||
```
|
||||
|
||||
### Smart Labeling
|
||||
```
|
||||
Analyze all unlabeled issues in our workspace. Based on their titles and descriptions, suggest appropriate labels and apply them. Create any missing label categories we need.
|
||||
```
|
||||
|
||||
### Sprint Retrospectives
|
||||
```
|
||||
Generate a retrospective report for our last completed cycle. List what was completed, what was pushed to next sprint, and create discussion issues for any patterns you notice.
|
||||
```
|
||||
|
||||
## Tips for Maximum Productivity
|
||||
|
||||
- **Batch Operations**: Create multiple related issues in one request
|
||||
- **Smart Templates**: Ask AI to remember your issue templates
|
||||
- **Natural Queries**: "Show me what John is working on this week"
|
||||
- **Context Awareness**: Reference previous issues in new requests
|
||||
- **Automated Workflows**: Set up recurring management tasks
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
**Authentication Issues:**
|
||||
- Clear browser cookies for Linear
|
||||
- Re-authenticate through the OAuth flow
|
||||
- Check Linear workspace permissions
|
||||
- Verify API access is enabled
|
||||
|
||||
**Tool Calling Errors:**
|
||||
- Ensure model supports multiple tool calls
|
||||
- Try breaking complex requests into steps
|
||||
- Verify all required fields are provided
|
||||
- Check Linear service status
|
||||
|
||||
**Missing Data:**
|
||||
- Refresh authentication token
|
||||
- Verify workspace access permissions
|
||||
- Check if issues are in archived projects
|
||||
- Ensure proper team selection
|
||||
|
||||
**Performance Issues:**
|
||||
- Linear API has rate limits (see dashboard)
|
||||
- Break bulk operations into batches
|
||||
- Cache frequently accessed data
|
||||
- Use specific filters to reduce data
|
||||
|
||||
<Aside type="tip">
|
||||
Linear's keyboard shortcuts work great alongside MCP! Use CMD+K for quick navigation while AI handles the heavy lifting.
|
||||
</Aside>
|
||||
|
||||
## Integration Ideas
|
||||
|
||||
Combine Linear with other MCP tools:
|
||||
|
||||
- **Serper + Linear**: Research technical solutions, then create implementation tickets
|
||||
- **Jupyter + Linear**: Analyze project metrics, generate data-driven sprint plans
|
||||
- **Todoist + Linear**: Sync personal tasks with work issues
|
||||
- **E2B + Linear**: Run code tests, automatically create bug reports
|
||||
|
||||
## Privacy & Security
|
||||
|
||||
Linear MCP uses OAuth for authentication, meaning:
|
||||
- Your credentials are never shared with Jan
|
||||
- Access can be revoked anytime from Linear settings
|
||||
- Data stays within Linear's infrastructure
|
||||
- Only requested permissions are granted
|
||||
|
||||
## Next Steps
|
||||
|
||||
Linear MCP transforms project management from clicking through interfaces into natural conversation. Whether you're planning sprints, triaging bugs, or crafting epic development sagas, AI becomes your project management companion.
|
||||
|
||||
Start with simple issue creation, then explore complex workflows like automated sprint planning and workload balancing. The combination of Linear's powerful platform with AI's creative capabilities makes project management both efficient and entertaining!
|
||||
@ -0,0 +1,261 @@
|
||||
---
|
||||
title: Todoist MCP
|
||||
description: Manage your tasks and todo lists through natural language with Todoist integration.
|
||||
keywords:
|
||||
[
|
||||
Jan,
|
||||
MCP,
|
||||
Model Context Protocol,
|
||||
Todoist,
|
||||
task management,
|
||||
productivity,
|
||||
todo list,
|
||||
tool calling,
|
||||
]
|
||||
sidebar:
|
||||
badge:
|
||||
text: New
|
||||
variant: tip
|
||||
---
|
||||
|
||||
import { Aside } from '@astrojs/starlight/components';
|
||||
|
||||
[Todoist MCP Server](https://github.com/abhiz123/todoist-mcp-server) enables AI models to manage your Todoist tasks through natural conversation. Instead of switching between apps, you can create, update, and complete tasks by simply chatting with your AI assistant.
|
||||
|
||||
## Available Tools
|
||||
|
||||
- `todoist_create_task`: Add new tasks to your todo list
|
||||
- `todoist_get_tasks`: Retrieve and view your current tasks
|
||||
- `todoist_update_task`: Modify existing tasks
|
||||
- `todoist_complete_task`: Mark tasks as done
|
||||
- `todoist_delete_task`: Remove tasks from your list
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- Jan with experimental features enabled
|
||||
- Todoist account (free or premium)
|
||||
- Model with strong tool calling support
|
||||
- Node.js installed
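You can confirm the Node.js prerequisite from a terminal; `npx` ships with Node.js and is what Jan uses to launch the server:

```bash
# Both commands should print a version number.
node --version
npx --version
```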
|
||||
|
||||
<Aside type="note">
|
||||
Todoist offers a generous free tier perfect for personal task management. Premium features add labels, reminders, and more projects.
|
||||
</Aside>
|
||||
|
||||
## Setup
|
||||
|
||||
### Create Todoist Account
|
||||
|
||||
1. Sign up at [todoist.com](https://todoist.com) or log in if you have an account
|
||||
2. Complete the onboarding process
|
||||
|
||||

|
||||
|
||||
Once logged in, you'll see your main dashboard:
|
||||
|
||||

|
||||
|
||||
### Get Your API Token
|
||||
|
||||
1. Click **Settings** (gear icon)
|
||||
2. Navigate to **Integrations**
|
||||
3. Click on the **Developer** tab
|
||||
4. Copy your API token (it's already generated for you)
|
||||
|
||||

|
||||
|
||||
### Enable MCP in Jan
|
||||
|
||||
<Aside type="caution">
|
||||
If you don't see the MCP Servers option, enable **Experimental Features** in **Settings > General** first.
|
||||
</Aside>
|
||||
|
||||
1. Go to **Settings > MCP Servers**
|
||||
2. Toggle **Allow All MCP Tool Permission** ON
|
||||
|
||||
### Configure Todoist MCP
|
||||
|
||||
Click the `+` button to add a new MCP server:
|
||||
|
||||
**Configuration:**
|
||||
- **Server Name**: `todoist`
|
||||
- **Command**: `npx`
|
||||
- **Arguments**: `-y @abhiz123/todoist-mcp-server`
|
||||
- **Environment Variables**:
|
||||
- Key: `TODOIST_API_TOKEN`, Value: `your_api_token_here`
|
||||
|
||||

|
||||
|
||||
## Usage
|
||||
|
||||
### Select a Model with Tool Calling
|
||||
|
||||
Open a new chat and select a model that excels at tool calling. Make sure tools are enabled for your chosen model.
|
||||
|
||||

|
||||
|
||||
### Verify Tools Available
|
||||
|
||||
You should see the Todoist tools in the tools panel:
|
||||
|
||||

|
||||
|
||||
### Start Managing Tasks
|
||||
|
||||
Now you can manage your todo list through natural conversation:
|
||||
|
||||

|
||||
|
||||
## Example Prompts
|
||||
|
||||
### Blog Writing Workflow
|
||||
```
|
||||
I need to write a blog post about AI and productivity tools today. Please add some tasks to my todo list to make sure I have a good set of steps to accomplish this task.
|
||||
```
|
||||
|
||||
The AI will create structured tasks like:
|
||||
- Research AI productivity tools
|
||||
- Create blog outline
|
||||
- Write introduction
|
||||
- Draft main sections
|
||||
- Add examples and screenshots
|
||||
- Edit and proofread
|
||||
- Publish and promote
|
||||
|
||||
### Weekly Meal Planning
|
||||
```
|
||||
Help me plan meals for the week. Create a grocery shopping list and cooking schedule for Monday through Friday, focusing on healthy, quick dinners.
|
||||
```
|
||||
|
||||
### Home Improvement Project
|
||||
```
|
||||
I'm renovating my home office this weekend. Break down the project into manageable tasks including shopping, prep work, and the actual renovation steps.
|
||||
```
|
||||
|
||||
### Study Schedule
|
||||
```
|
||||
I have a statistics exam in 2 weeks. Create a study plan with daily tasks covering all chapters, practice problems, and review sessions.
|
||||
```
|
||||
|
||||
### Fitness Goals
|
||||
```
|
||||
Set up a 30-day fitness challenge for me. Include daily workout tasks, rest days, and weekly progress check-ins.
|
||||
```
|
||||
|
||||
### Event Planning
|
||||
```
|
||||
I'm organizing a surprise birthday party for next month. Create a comprehensive task list covering invitations, decorations, food, entertainment, and day-of coordination.
|
||||
```
|
||||
|
||||
## Advanced Usage
|
||||
|
||||
### Task Management Commands
|
||||
|
||||
**View all tasks:**
|
||||
```
|
||||
Show me all my pending tasks for today
|
||||
```
|
||||
|
||||
**Update priorities:**
|
||||
```
|
||||
Make "Write blog introduction" high priority and move it to the top of my list
|
||||
```
|
||||
|
||||
**Bulk completion:**
|
||||
```
|
||||
Mark all my morning routine tasks as complete
|
||||
```
|
||||
|
||||
**Clean up:**
|
||||
```
|
||||
Delete all completed tasks from last week
|
||||
```
|
||||
|
||||
### Project Organization
|
||||
|
||||
Todoist supports projects, though the MCP may have limitations. Try:
|
||||
```
|
||||
Create a new project called "Q1 Goals" and add 5 key objectives as tasks
|
||||
```
|
||||
|
||||
### Recurring Tasks
|
||||
|
||||
Set up repeating tasks:
|
||||
```
|
||||
Add a daily task to review my calendar at 9 AM
|
||||
Add a weekly task for meal prep on Sundays
|
||||
Add a monthly task to pay bills on the 1st
|
||||
```
|
||||
|
||||
## Creative Use Cases
|
||||
|
||||
### 🎮 Game Development Sprint
|
||||
```
|
||||
I'm participating in a 48-hour game jam. Create an hour-by-hour task schedule covering ideation, prototyping, art creation, programming, testing, and submission.
|
||||
```
|
||||
|
||||
### 📚 Book Writing Challenge
|
||||
```
|
||||
I'm doing NaNoWriMo (writing a novel in a month). Break down a 50,000-word goal into daily writing tasks with word count targets and plot milestones.
|
||||
```
|
||||
|
||||
### 🌱 Garden Planning
|
||||
```
|
||||
It's spring planting season. Create a gardening schedule for the next 3 months including soil prep, planting dates for different vegetables, watering reminders, and harvest times.
|
||||
```
|
||||
|
||||
### 🎂 Baking Business Launch
|
||||
```
|
||||
I'm starting a home bakery. Create tasks for getting permits, setting up social media, creating a menu, pricing strategy, and first week's baking schedule.
|
||||
```
|
||||
|
||||
### 🏠 Moving Checklist
|
||||
```
|
||||
I'm moving to a new apartment next month. Generate a comprehensive moving checklist including utilities setup, packing by room, change of address notifications, and moving day logistics.
|
||||
```
|
||||
|
||||
## Tips for Best Results
|
||||
|
||||
- **Be specific**: "Add task: Call dentist tomorrow at 2 PM" works better than "remind me about dentist"
|
||||
- **Use natural language**: The AI understands context, so chat naturally
|
||||
- **Batch operations**: Ask to create multiple related tasks at once
|
||||
- **Review regularly**: Ask the AI to show your tasks and help prioritize
|
||||
- **Iterate**: If the tasks aren't quite right, ask the AI to modify them
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
**Tasks not appearing in Todoist:**
|
||||
- Verify API token is correct
|
||||
- Check Todoist website/app and refresh
|
||||
- Ensure MCP server shows as active
|
||||
|
||||
**Tool calling errors:**
|
||||
- Confirm model supports tool calling
|
||||
- Enable tools in model settings
|
||||
- Try a different model (Claude 3.5+ or GPT-4o recommended)
|
||||
|
||||
**Connection issues:**
|
||||
- Check internet connectivity
|
||||
- Verify Node.js installation
|
||||
- Restart Jan after configuration
|
||||
|
||||
**Rate limiting:**
|
||||
- Todoist API has rate limits
|
||||
- Space out bulk operations
|
||||
- Wait a moment between large task batches
|
||||
|
||||
<Aside type="tip">
|
||||
Todoist syncs across all devices. Tasks created through Jan instantly appear on your phone, tablet, and web app!
|
||||
</Aside>
|
||||
|
||||
## Privacy Note
|
||||
|
||||
Your tasks are synced with Todoist's servers. While the MCP runs locally, task data is stored in Todoist's cloud for sync functionality. Review Todoist's privacy policy if you're handling sensitive information.
|
||||
|
||||
## Next Steps
|
||||
|
||||
Combine Todoist MCP with other tools for powerful workflows:
|
||||
- Use Serper MCP to research topics, then create action items in Todoist
|
||||
- Generate code with E2B, then add testing tasks to your todo list
|
||||
- Analyze data with Jupyter, then create follow-up tasks for insights
|
||||
|
||||
Task management through natural language makes staying organized effortless. Let your AI assistant handle the overhead while you focus on getting things done!
|
||||
@ -1,12 +1,14 @@
|
||||
---
|
||||
title: Serper Search MCP
|
||||
description: Connect Jan to real-time web search with Google results through Serper API.
|
||||
sidebar:
|
||||
badge:
|
||||
text: New
|
||||
variant: tip
|
||||
---
|
||||
|
||||
import { Aside } from '@astrojs/starlight/components';
|
||||
|
||||
# Serper Search MCP
|
||||
|
||||
[Serper](https://serper.dev) provides Google search results through a simple API, making it perfect for giving AI models access to current web information. The Serper MCP integration enables Jan models to search the web and retrieve real-time information.
|
||||
|
||||
## Available Tools
|
||||
|
||||
@ -17,8 +17,6 @@ keywords:
|
||||
|
||||
import { Aside } from '@astrojs/starlight/components';
|
||||
|
||||
# QuickStart
|
||||
|
||||
Get up and running with Jan in minutes. This guide will help you install Jan, download a model, and start chatting immediately.
|
||||
|
||||
<ol>
|
||||
@ -46,30 +44,13 @@ We recommend starting with **Jan v1**, our 4B parameter model optimized for reas
|
||||
Jan v1 achieves 91.1% accuracy on SimpleQA and excels at tool calling, making it perfect for web search and reasoning tasks.
|
||||
</Aside>
|
||||
|
||||
**HuggingFace models:** Some require an access token. Add yours in **Settings > Model Providers > Llama.cpp > Hugging Face Access Token**.
|
||||
|
||||

|
||||
|
||||
### Step 3: Enable GPU Acceleration (Optional)
|
||||
|
||||
For Windows/Linux with compatible graphics cards:
|
||||
|
||||
1. Go to **Settings** > **Hardware**
|
||||
2. Toggle **GPUs** to ON
|
||||
|
||||

|
||||
|
||||
<Aside type="note">
|
||||
Install required drivers before enabling GPU acceleration. See setup guides for [Windows](/docs/desktop/windows#gpu-acceleration) & [Linux](/docs/desktop/linux#gpu-acceleration).
|
||||
</Aside>
|
||||
|
||||
### Step 4: Start Chatting
|
||||
### Step 3: Start Chatting
|
||||
|
||||
1. Click the **New Chat** icon
|
||||
2. Select your model in the input field dropdown
|
||||
3. Type your message and start chatting
|
||||
|
||||

|
||||

|
||||
|
||||
Try asking Jan v1 questions like:
|
||||
- "Explain quantum computing in simple terms"
|
||||
|
||||
@ -16,7 +16,6 @@ keywords:
|
||||
|
||||
import { Aside, Steps } from '@astrojs/starlight/components'
|
||||
|
||||
# Settings
|
||||
|
||||
Access Jan's settings by clicking the Settings icon in the bottom left corner.
|
||||
|
||||
|
||||