docs: jan framework principles

0xSage 2024-01-08 14:32:08 +08:00
parent ef98e35155
commit e14dbc596d
10 changed files with 40 additions and 10 deletions


@ -3,35 +3,50 @@ title: Overview
slug: /docs
---
The following low-level docs are aimed at core contributors.
We cover how to contribute to the core framework (also known as the `Core SDK`).
:::tip
If you are interested in **building on top of the framework**, such as creating assistants or adding app-level extensions, please refer to the [developer docs](/developer) instead.
:::
## Jan Framework
At its core, Jan is a **cross-platform, local-first, and AI-native framework** that can be used to build anything.
### Extensions
Ultimately, we aim for a `VSCode`- or `Obsidian`-like SDK that allows **devs to build and customize complex AI applications for their specific needs** in under 30 minutes.
In fact, the current Jan [Desktop Client](https://jan.ai/) is just a specific set of extensions & integrations built on top of this framework.
![Desktop is Extensions](./assets/ExtensionCallouts.png)
:::tip
We encourage devs to fork, customize, and open-source their improvements for the greater community.
:::
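To make the extension idea concrete, here is a minimal, hypothetical sketch of what an app-level extension might look like. The import path, class name, and lifecycle hooks are illustrative assumptions rather than the exact `Core SDK` surface; check the repository for the real entrypoints.

```typescript
// Hypothetical sketch of an app-level extension; names are assumptions, not the exact Core SDK API.
import { BaseExtension } from '@janhq/core'

export default class HelloExtension extends BaseExtension {
  // Called by the host app (desktop client, server, ...) when the extension is loaded
  onLoad(): void {
    console.log('HelloExtension loaded')
  }

  // Called when the extension is unloaded or the app shuts down
  onUnload(): void {
    console.log('HelloExtension unloaded')
  }
}
```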
### Cross Platform
Jan follows [Clean Architecture](https://blog.cleancoder.com/uncle-bob/2012/08/13/the-clean-architecture.html) to the best of our ability. Though leaky abstractions remain (we're a fast-moving, open-source codebase), we do our best to build an SDK that allows devs to **build once, deploy everywhere** (a rough sketch follows the lists below).
![Clean Architecture](./assets/CleanArchitecture.jpg)
**Supported Runtimes:**
- `Node Native Runtime`, good for server-side apps
- `Electron Chromium`, good for native desktop apps
- `Capacitor`, good for mobile apps (planned, not built yet)
- `Python Runtime`, good for MLOps workflows (planned, not built yet)
**Supported OS & Architectures:**
- macOS (Intel & Apple Silicon)
- Windows
- Ubuntu
- Linux (via AppImage)
- Nvidia GPUs
- AMD ROCm (coming soon)
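As a rough sketch of the Clean Architecture split described above, core logic can depend on an abstract port while each runtime supplies its own adapter. The names below (`FileStore`, `NodeFileStore`, `saveNote`) are made up for illustration and are not actual Core SDK interfaces.

```typescript
// Illustrative ports-and-adapters sketch; these interfaces are hypothetical, not the Core SDK's.
interface FileStore {
  read(path: string): Promise<string>
  write(path: string, data: string): Promise<void>
}

// Core logic is written once against the abstract port...
async function saveNote(store: FileStore, name: string, body: string): Promise<void> {
  await store.write(`${name}.md`, body)
}

// ...and each runtime (Node, Electron, mobile, ...) plugs in its own adapter.
class NodeFileStore implements FileStore {
  async read(path: string): Promise<string> {
    const fs = await import('node:fs/promises')
    return fs.readFile(path, 'utf8')
  }
  async write(path: string, data: string): Promise<void> {
    const fs = await import('node:fs/promises')
    await fs.writeFile(path, data, 'utf8')
  }
}
```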
Read more:
@ -44,19 +59,27 @@ Jan's data persistence happens on the user's local filesystem.
We implemented abstractions on top of `fs` and other core modules in an opinionated way, such that user data is saved in a folder-based structure that lets users easily package, export, and manage their data.
Future endeavors on this front include cross-device syncing, a multi-user experience, and more.
Long term, we want to integrate with folks working on [CRDTs](https://www.inkandswitch.com/local-first/), e.g. [Socket Runtime](https://www.theregister.com/2023/04/11/socket_runtime/) to deeply enable local-first software.
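As a rough illustration of the folder-based approach (not the actual core `fs` wrapper), each domain object can live in its own folder under the Jan data directory, so users can copy, export, or sync it with ordinary file tools. The paths and helper below are assumptions for the sake of the example.

```typescript
// Hypothetical sketch of folder-based persistence; paths and helpers are illustrative only.
import * as fs from 'node:fs/promises'
import * as path from 'node:path'

const JAN_DATA_DIR = path.join(process.env.HOME ?? '.', 'jan') // assumed location, not a guaranteed default

// Persist a thread as a folder containing a single JSON metadata file.
async function saveThread(threadId: string, metadata: object): Promise<void> {
  const threadDir = path.join(JAN_DATA_DIR, 'threads', threadId)
  await fs.mkdir(threadDir, { recursive: true })
  await fs.writeFile(path.join(threadDir, 'thread.json'), JSON.stringify(metadata, null, 2))
}
```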
Read more:
- [Folder-based wrappers entrypoint](https://github.com/janhq/jan/blob/main/core/src/fs.ts)
- [Piping Node modules across infrastructures](https://github.com/janhq/jan/tree/main/core/src/node)
:::caution
Our local-first approach needs a lot of work at the moment. Please don't hesitate to refactor as you make your way through the codebase.
:::
### AI Native
We believe all software applications can be natively supercharged with AI primitives and embedded AI servers.
Including:
- OpenAI-compatible AI [types](https://github.com/janhq/jan/tree/main/core/src/types) and [core extensions](https://github.com/janhq/jan/tree/main/core/src/extensions) to support common functionality like making an inference call (sketched after this list).
- A lightweight, embedded C++ [inference engine/wrapper](https://github.com/janhq/jan/tree/main/extensions/inference-nitro-extension) that's immediately callable from code. _Here we'd like to thank the folks behind [llama.cpp](https://github.com/ggerganov/llama.cpp) and [TensorRT-LLM](https://github.com/NVIDIA/TensorRT-LLM), to whose projects we'll continue to contribute commits & fixes upstream._
- [Code Entrypoint](https://github.com/janhq/jan/tree/main/core/src/api)
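As a hedged sketch of what "OpenAI-compatible" means in practice, the embedded inference server can be called with an ordinary chat-completions request. The URL, port, and model name below are placeholders, not guaranteed defaults; consult the inference extension linked above for the actual endpoint.

```typescript
// Sketch of an OpenAI-compatible chat request to a locally embedded inference server.
// Endpoint, port, and model name are assumptions for illustration.
async function localChat(prompt: string): Promise<string> {
  const response = await fetch('http://localhost:3928/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'local-model', // whichever model the engine currently has loaded
      messages: [{ role: 'user', content: prompt }],
    }),
  })
  const data = await response.json()
  return data.choices[0].message.content
}
```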
@ -69,3 +92,7 @@ Beyond the current Jan client and UX, the Core SDK can be used to build many oth
- Got ideas? Make a PR into this docs page!
If you are interested in tackling these issues, or have suggestions for integrations and other OSS tools we can use, please hit us up on [Discord](https://discord.gg/5rQ2zTv3be).
:::caution
Our open-source license is copyleft, which means we encourage forks to stay open source and allow core contributors to merge changes upstream.
:::

Binary files not shown. New images added: two assets (105 KiB and 402 KiB) and `docs/docs/docs/image.png` (32 KiB).


@ -0,0 +1,3 @@
---
title: Integrations
---
