Merge pull request #1438 from janhq/framework

docs: jan framework principles
0xSage 2024-01-08 15:10:43 +08:00 committed by GitHub
commit b9f82fd778
29 changed files with 200 additions and 94 deletions

View File

@ -1,37 +1,60 @@
---
title: About Jan
description: Jan is a ChatGPT-alternative that runs on your own computer, with a local API server.
keywords:
[
Jan AI,
Jan,
ChatGPT alternative,
local AI,
private AI,
conversational AI,
no-subscription fee,
large language model,
]
---
Jan believes in the need for an **open source AI ecosystem**. We are focused on building the infra and tooling to allow open source AIs to compete on a level playing field with proprietary ones.
Jan's long-term technical endeavor is to build a cognitive framework for future robots that are practical, useful assistants for humans and businesses in everyday life.
## Quicklinks
- Core product vision for the [Jan Framework](/docs)
- R&D and model training efforts on [Discord](https://discord.gg/9NfUSyzp3y), via our small data center, which is free and open to all researchers who lack GPUs!
- Current implementations of the Jan Framework: [Jan Desktop](https://jan.ai/), [Nitro](https://nitro.jan.ai/)
## Why does Jan Exist?
### Mission
Our current mission is to allow humans and businesses to **own their AI, with the right to tinker, repair and innovate**.
:::tip
Our life-long mission is to **eliminate work, so humans can focus on creation, invention, and moral governance over robots**.
:::
### Ideal Customer
Our ideal customer is an AI enthusiast or business who has experienced some limitations with current AI solutions and is keen to find open source alternatives.
### Problems
Our ideal customer would use Jan to solve one of these problems.
_Control_
- Control (e.g. preventing vendor lock-in)
- Stability (e.g. runs predictably every time)
- Local-use (e.g. for speed, or for airgapped environments)
_Privacy_
- Data protection (e.g. personal data or company data)
- Privacy (e.g. nsfw)
_Customisability_
- Tinkerability (e.g. ability to change model, experiment)
- Niche Models (e.g. fine-tuned, domain-specific models that outperform OpenAI)
@ -39,43 +62,44 @@ Sources: [^1] [^2] [^3] [^4]
[^1]: [What are you guys doing that can't be done with ChatGPT?](https://www.reddit.com/r/LocalLLaMA/comments/17mghqr/comment/k7ksti6/?utm_source=share&utm_medium=web2x&context=3)
[^2]: [What's your main interest in running a local LLM instead of an existing API?](https://www.reddit.com/r/LocalLLaMA/comments/1718a9o/whats_your_main_interest_in_running_a_local_llm/)
[^3]: [Ask HN: What's the best self-hosted/local alternative to GPT-4?](https://news.ycombinator.com/item?id=36138224)
[^4]: [LoRAs](https://www.reddit.com/r/LocalLLaMA/comments/17mghqr/comment/k7mdz1i/?utm_source=share&utm_medium=web2x&context=3)
### Solution
Jan is a seamless user experience that runs on your personal computer, gluing together the different pieces of the open source AI ecosystem to provide an alternative to OpenAI's closed platform.
- We build a comprehensive, seamless platform that takes care of the technical chores across the stack required to run open source AI
- We run on top of a local folder of non-proprietary files that anyone can tinker with (yes, even other apps!)
- We provide open formats for packaging and distributing AI to run reproducibly across devices
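To make the last point concrete, here is a hedged sketch of what a folder-based model package might look like. The manifest fields and folder layout below are illustrative assumptions, not Jan's actual packaging spec:

```typescript
// Hypothetical sketch of a folder-based model package; the manifest fields
// below are illustrative assumptions, not Jan's actual packaging spec.
interface ModelManifest {
  id: string;               // e.g. "mistral-7b-instruct-q4"
  sourceUrl: string;        // where the weights can be downloaded from
  format: "gguf" | "other"; // weight file format
  parameters: {
    ctxLen: number;         // context window size
    ngl?: number;           // GPU layers to offload, if a GPU is present
  };
}

// Everything needed to run the model reproducibly lives in one plain folder:
//   models/mistral-7b-instruct-q4/
//     model.json   <- the manifest below, serialized
//     *.gguf       <- the weights
const manifest: ModelManifest = {
  id: "mistral-7b-instruct-q4",
  sourceUrl: "https://huggingface.co/<repo>/resolve/main/<file>.gguf",
  format: "gguf",
  parameters: { ctxLen: 4096, ngl: 32 },
};
```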
## How Jan Works
### Open Source
Jan is a startup with an open source business model. We believe in the need for an open source AI ecosystem, and are committed to building it.
- [Jan Framework](https://github.com/janhq/jan) (AGPLv3)
- [Jan Desktop Client & Local server](https://jan.ai) (AGPLv3, built on Jan Framework)
- [Nitro: run Local AI](https://github.com/janhq/nitro) (AGPLv3)
### Build in Public
We use GitHub to build in public and welcome anyone to join in.
- [Jan's Kanban](https://github.com/orgs/janhq/projects/5)
- [Jan's Roadmap](https://github.com/orgs/janhq/projects/5/views/29)
- [Jan's Newsletter](https://newsletter.jan.ai)
### Bootstrapped
Jan is currently a bootstrapped startup. We balance technical invention with the search for a sustainable business model.
We appreciate any business that can balance growth with cashflow/profitability.
### Remote Team
Jan has a fully-remote team. We are mainly based in APAC timezones. We use [Discord](https://discord.gg/af6SaTdzpx) and [GitHub](https://github.com/janhq) to work.
## Contact
@ -90,6 +114,6 @@ Drop us a message in our [Discord](https://discord.gg/af6SaTdzpx) and we'll get
### Careers
Jan has a culture of ownership, independent thought, and lightning fast execution. If you'd like to join us, we have open positions on our [careers page](https://janai.bamboohr.com/careers).
## Footnotes

View File

@ -0,0 +1,98 @@
---
title: Overview
slug: /docs
---
The following low-level docs are aimed at core contributors.
We cover how to contribute to the core framework (aka the `Core SDK`).
:::tip
If you are interested in **building on top of the framework**, like creating assistants or adding app level extensions, please refer to [developer docs](/developer) instead.
:::
## Jan Framework
At its core, Jan is a **cross-platform, local-first and AI native framework** that can be used to build anything.
### Extensions
Ultimately, we aim for a `VSCode`- or `Obsidian`-like SDK that allows **devs to build and customize complex and ethical AI applications for any use case** in less than 30 minutes.
In fact, the current Jan [Desktop Client](https://jan.ai/) is just a specific set of extensions & integrations built on top of this framework.
![Desktop is Extensions](./assets/ExtensionCallouts.png)
:::tip
We encourage devs to fork, customize, and open source their improvements for the greater community.
:::
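As a rough illustration of the extension model, an extension is conceptually a module with lifecycle hooks that the host app loads. The base class and hook names below are assumptions made for this sketch, not the framework's exact API; see the Code Entrypoint links further down for the real surface.

```typescript
// Conceptual sketch only: the base class and hook names are assumptions,
// not the framework's exact API.
abstract class ExtensionBase {
  abstract onLoad(): void;   // called when the host app activates the extension
  abstract onUnload(): void; // called when the extension is disposed
}

// A hypothetical extension that wires itself up on load.
class HelloExtension extends ExtensionBase {
  onLoad(): void {
    // Register inference providers, commands, UI panels, etc. here.
    console.log("HelloExtension loaded");
  }
  onUnload(): void {
    console.log("HelloExtension unloaded");
  }
}
```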
### Cross Platform
Jan follows [Clean Architecture](https://blog.cleancoder.com/uncle-bob/2012/08/13/the-clean-architecture.html) to the best of our ability. Though leaky abstractions remain (we're a fast moving, open source codebase), we do our best to build an SDK that allows devs to **build once, deploy everywhere.**
![Clean Architecture](./assets/CleanArchitecture.jpg)
**Supported Runtimes:**
- `Node Native Runtime`, good for server side apps
- `Electron Chromium`, good for Desktop Native apps
- `Capacitor`, good for Mobile apps (planned, not built yet)
- `Python Runtime`, good for MLOps workflows (planned, not built yet)
**Supported OS & Architectures:**
- Mac Intel & Silicon
- Windows
- Linux (through AppImage)
- Nvidia GPUs
- AMD ROCm (coming soon)
Read more:
- [Code Entrypoint](https://github.com/janhq/jan/tree/main/core)
- [Dependency Inversion](https://en.wikipedia.org/wiki/Dependency_inversion_principle)
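In practice, this means core logic depends on interfaces while each runtime binds its own implementation at the edge. A minimal dependency-inversion sketch follows; the interface and class names are illustrative, not the actual SDK surface.

```typescript
import { promises as fs } from "node:fs";

// Core logic depends only on an abstraction...
interface FileStore {
  read(path: string): Promise<string>;
  write(path: string, data: string): Promise<void>;
}

// ...so the same code can run under any runtime that supplies a FileStore.
async function saveThread(store: FileStore, id: string, content: string) {
  await store.write(`threads/${id}.json`, content);
}

// Each runtime provides its own implementation at the edge;
// here, a Node Native Runtime variant.
class NodeFileStore implements FileStore {
  read(path: string) {
    return fs.readFile(path, "utf8");
  }
  write(path: string, data: string) {
    return fs.writeFile(path, data, "utf8");
  }
}
```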
### Local First
Jan's data persistence happens on the user's local filesystem.
We implemented abstractions on top of `fs` and other core modules in an opinionated way, such that user data is saved in a folder-based framework that lets users easily package, export, and manage their data.
Future endeavors on this front include cross-device syncing, multi-user experiences, and more.
Long term, we want to integrate with folks working on [CRDTs](https://www.inkandswitch.com/local-first/), e.g. [Socket Runtime](https://www.theregister.com/2023/04/11/socket_runtime/) to deeply enable local-first software.
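For illustration, the data folder layout below is an assumption about how such a folder-based setup might look (not the exact on-disk spec), and the export helper is hypothetical:

```typescript
import { cp } from "node:fs/promises";

// Illustrative layout of the local data folder (assumed, not the exact spec):
//
//   jan/
//     models/      <- downloaded weights + manifests
//     threads/     <- one folder per conversation, plain JSON inside
//     assistants/  <- assistant definitions
//
// Because everything is plain files, exporting or backing up user data can be
// as simple as copying the folder; nothing is locked inside a database.
async function exportUserData(janDir: string, destination: string) {
  await cp(janDir, destination, { recursive: true });
}
```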
Read more:
- [Folder-based wrappers entrypoint](https://github.com/janhq/jan/blob/main/core/src/fs.ts)
- [Piping Node modules across infrastructures](https://github.com/janhq/jan/tree/main/core/src/node)
:::caution
Our local-first approach currently needs a lot of work. Please don't hesitate to refactor as you make your way through the codebase.
:::
### AI Native
We believe all software applications can be natively supercharged with AI primitives and embedded AI servers, including:
- OpenAI-compatible AI [types](https://github.com/janhq/jan/tree/main/core/src/types) and [core extensions](https://github.com/janhq/jan/tree/main/core/src/extensions) to support common functionality like making an inference call (a minimal call sketch follows after this list).
- Multiple inference engines through [extensions, integrations & wrappers](https://github.com/janhq/jan/tree/main/extensions/inference-nitro-extension). _Here, we'd like to thank the folks behind [llama.cpp](https://github.com/ggerganov/llama.cpp) and [TensorRT-LLM](https://github.com/NVIDIA/TensorRT-LLM), to whom we'll continue to contribute commits and fixes upstream._
- [Code Entrypoint](https://github.com/janhq/jan/tree/main/core/src/api)
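For example, a client can talk to a locally running OpenAI-compatible server with nothing but `fetch`. The port and model id below are placeholders; substitute whatever your local server and loaded model actually use.

```typescript
// Hedged sketch: call a local OpenAI-compatible endpoint with plain fetch.
// The port and model id are placeholders for whatever is running locally.
async function chat(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:1337/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "mistral-ins-7b-q4", // whichever local model is loaded
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  // The response shape mirrors OpenAI's chat completions API.
  return data.choices[0].message.content;
}
```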
## Fun Project Ideas
Beyond the current Jan client and UX, the Core SDK can be used to build many other AI-powered, privacy-preserving applications.
- `Game engine`: For AI enabled character games, procedural generation games
- `Health app`: For a personal healthcare app that improves habits
- Got ideas? Make a PR into this docs page!
If you are interested in tackling these issues, or have suggestions for integrations and other OSS tools we can use, please hit us up in [Discord](https://discord.gg/5rQ2zTv3be).
:::caution
Our open source license is copyleft, which means we encourage forks to stay open source and allow core contributors to merge things upstream.
:::

View File

@ -0,0 +1,9 @@
---
title: Integrations
---
Existing and upcoming third-party integrations built on top of the Jan Framework, from both the core development team and core contributors.
Suggestions? File an [issue here](https://github.com/janhq/jan/issues).

View File

@ -0,0 +1,7 @@
---
title: Langchain
---
:::caution
WIP
:::

View File

@ -0,0 +1,9 @@
---
title: LlamaCPP
---
## Quicklinks
- Jan Framework [Extension Code](https://github.com/janhq/jan/tree/main/extensions/inference-nitro-extension)
- ggerganov/llama.cpp [Source URL](https://github.com/ggerganov/llama.cpp)
- [Productized Wrapper](https://nitro.jan.ai/): a lower-effort way to use llama.cpp out of the box

View File

@ -0,0 +1,7 @@
---
title: Ollama
---
:::caution
Requested, committed, but not started
:::

View File

@ -0,0 +1,8 @@
---
title: OpenAI
---
## Quicklinks
- Jan Framework [Extension Code](https://github.com/janhq/jan/tree/main/extensions/inference-openai-extension)
- OpenAI API [Reference Docs](https://platform.openai.com/docs/api-reference)

View File

@ -0,0 +1,7 @@
---
title: OpenRouter
---
:::caution
Requested, committed, but not started
:::

View File

@ -0,0 +1,8 @@
---
title: TensorRT-LLM
---
## Quicklinks
- Jan Framework [Extension Code](https://github.com/janhq/jan/tree/main/extensions/inference-triton-trtllm-extension)
- TensorRT-LLM [Source URL](https://github.com/NVIDIA/TensorRT-LLM)

View File

@ -1,71 +0,0 @@
---
title: Overview
slug: /docs
---
The following low-level docs are aimed at core contributors and cover how to contribute to the core SDK.
:::tip
If you are interested in **building on top of the SDK**, like creating assistants or adding app level extensions, please refer to [developer docs](/developer) instead.
:::
## Jan Framework
At its core, Jan is a cross-platform, local-first and AI native framework that can be used to build anything. In fact, current features are all implemented as 3rd party extensions on top of this core SDK.
Ultimately, we aim for a VSCode or Obsidian like framework that allows devs to build and customize complex AI applications for their specific needs, in less than 30 minutes.
### Cross Platform
Jan follows [Clean Architecture](https://blog.cleancoder.com/uncle-bob/2012/08/13/the-clean-architecture.html) to the best of our ability. Though leaky abstractions remain (we're a fast moving, open source codebase), we do our best to build an SDK that allows devs to **build once, deploy everywhere.**
Currently, Jan supports:
- `Node Native Runtime`, good for server side apps
- `Electron Chromium`, good for Desktop Native apps
- `Capacitor`, good for Mobile apps (planned, not built yet)
- `Python Runtime`, good for MLOps workflows (planned, not built yet)
Currently, Jan works across:
- Mac Intel & Silicon
- Windows
- Ubuntu
- Nvidia GPUs
Read more:
- [Code Entrypoint](https://github.com/janhq/jan/tree/main/core)
- [Dependency Inversion](https://en.wikipedia.org/wiki/Dependency_inversion_principle)
### Local First
Jan's data persistence happens on the user's local filesystem.
We implemented abstractions on top of `fs` and other core modules in an opinionated way, such that user data is saved in a folder-based framework that lets users easily package, export, and manage their data.
Read more:
- [Folder-based fs wrapper](https://github.com/janhq/jan/blob/main/core/src/fs.ts)
- [Piping Node modules across infrastructures](https://github.com/janhq/jan/tree/main/core/src/node)
### AI Native
All software applications can be natively supercharged with an embedded AI server and AI abstractions, including:
- OpenAI Compatible AI [types](https://github.com/janhq/jan/tree/main/core/src/types) and [core extensions](https://github.com/janhq/jan/tree/main/core/src/extensions) to support common functionality like making an inference call.
- A lightweight, embedded C++ [inference engine](https://github.com/janhq/jan/tree/main/extensions/inference-nitro-extension) that's immediately callable from code.
- [Code Entrypoint](https://github.com/janhq/jan/tree/main/core/src/api)
## Fun Project Ideas
Beyond the current Jan client and UX, the Core SDK can be used to build many other AI-powered and privacy preserving applications.
- `Game engine`: For AI enabled character games, procedural generation games
- `Health app`: For a personal healthcare app that improves habits
- Got ideas? Make a PR into this docs page!
If you are interested in tackling these issues, or have suggestions for integrations and other OSS tools we can use, please hit us up in [Discord](https://discord.gg/5rQ2zTv3be).

Binary file not shown (new image, 105 KiB)

Binary file not shown (new image, 402 KiB)

BIN docs/docs/docs/image.png (new file)

Binary file not shown (new image, 32 KiB)