
# Contributing to Jan

First off, thank you for considering contributing to Jan. It's people like you that make Jan such an amazing project.

Jan is an AI assistant that can run 100% offline on your device. Think ChatGPT, but private, local, and under your complete control. If you're thinking about contributing, you're already awesome - let's make AI accessible to everyone, one commit at a time.

## How Jan Actually Works

Jan is a desktop app that runs local AI models. Here's how the components actually connect:

```

┌──────────────────────────────────────────────────────────┐
│                   Web App (Frontend)                     │
│                      (web-app/)                          │
│  • React UI                                              │
│  • Chat Interface                                        │
│  • Settings Pages                                        │
│  • Model Hub                                             │
└────────────┬─────────────────────────────┬───────────────┘
             │                             │
             │ imports                     │ imports
             ▼                             ▼
  ┌──────────────────────┐      ┌──────────────────────┐
  │     Core SDK         │      │     Extensions       │
  │      (core/)         │      │   (extensions/)      │
  │                      │      │                      │
  │ • TypeScript APIs    │◄─────│ • Assistant Mgmt     │
  │ • Extension System   │ uses │ • Conversations      │
  │ • Event Bus          │      │ • Downloads          │
  │ • Type Definitions   │      │ • LlamaCPP           │
  └──────────┬───────────┘      └───────────┬──────────┘
             │                              │
             │   ┌──────────────────────┐   │
             │   │       Web App        │   │
             │   └──────────┬───────────┘   │
             │              │               │
             └──────────────┼───────────────┘
                            │
                            ▼
                        Tauri IPC
                    (invoke commands)
                            │
                            ▼
┌───────────────────────────────────────────────────────────┐
│                   Tauri Backend (Rust)                    │
│                      (src-tauri/)                         │
│                                                           │
│  • Window Management        • File System Access          │
│  • Process Control          • System Integration          │
│  • IPC Command Handler      • Security & Permissions      │
└───────────────────────────┬───────────────────────────────┘
                            │
                            │
                            ▼
┌───────────────────────────────────────────────────────────┐
│                   Tauri Plugins (Rust)                    │
│                   (src-tauri/plugins/)                    │
│                                                           │
│     ┌──────────────────┐        ┌──────────────────┐      │
│     │  Hardware Plugin │        │  LlamaCPP Plugin │      │
│     │                  │        │                  │      │
│     │ • CPU/GPU Info   │        │ • Process Mgmt   │      │
│     │ • Memory Stats   │        │ • Model Loading  │      │
│     │ • System Info    │        │ • Inference      │      │
│     └──────────────────┘        └──────────────────┘      │
└───────────────────────────────────────────────────────────┘

```

### The Communication Flow

1. **JavaScript Layer Relationships**
   - The Web App imports the Core SDK and Extensions as JavaScript modules
   - Extensions use the Core SDK for shared functionality
   - All run in the browser/webview context
2. **All Three → Backend: Through Tauri IPC** (see the sketch after this list)
   - Web App → Backend: `await invoke('app_command', data)`
   - Core SDK → Backend: `await invoke('core_command', data)`
   - Extensions → Backend: `await invoke('ext_command', data)`
   - Each component can independently call backend commands
3. **Backend → Plugins: Native Rust Integration**
   - The Backend loads plugins as Rust libraries
   - Direct function calls, no IPC overhead
4. **Response Flow**
   - Plugin → Backend → IPC → Requester (Web App/Core/Extension) → UI updates
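
To make step 2 concrete, here is a minimal sketch of the IPC call pattern, assuming Tauri v2's `@tauri-apps/api/core` package. The `get_system_info` command and its return shape are hypothetical stand-ins, not Jan's actual API; real commands are defined in `src-tauri/`.

```typescript
// Minimal sketch of an IPC call from the JavaScript side.
// The command name and return type are hypothetical placeholders.
import { invoke } from '@tauri-apps/api/core'

interface SystemInfo {
  cpuCores: number
  totalMemoryBytes: number
}

async function fetchSystemInfo(): Promise<SystemInfo | undefined> {
  try {
    // Crosses the webview → Rust boundary; Tauri routes the call to the
    // matching #[tauri::command] handler and resolves with its result.
    return await invoke<SystemInfo>('get_system_info')
  } catch (error) {
    console.error('IPC call failed:', error)
    return undefined
  }
}
```

The same pattern applies whether the caller is the Web App, the Core SDK, or an Extension - only the command names differ.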

### Real-World Example: Loading a Model

Here's what actually happens when you click "Download Llama 3" (a code sketch of the flow follows the list):

1. **Web App** (`web-app/`) - user clicks the download button
2. **Extension** (`extensions/download-extension`) - handles the download logic
3. **Tauri Backend** (`src-tauri/`) - actually downloads the file to disk
4. **Extension** (`extensions/llamacpp-extension`) - prepares the model for loading
5. **Tauri Plugin** (`src-tauri/plugins/llamacpp`) - starts the llama.cpp process
6. **Hardware Plugin** (`src-tauri/plugins/hardware`) - detects the GPU, optimizes settings
7. **Model ready!** - the user can start chatting
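
Here is a hedged sketch of that flow from the JavaScript side. Every command name and payload below (`download_file`, `get_hardware_info`, `load_llama_model`) is a made-up placeholder for illustration; the real logic lives in `extensions/download-extension`, `extensions/llamacpp-extension`, and `src-tauri/plugins/`.

```typescript
// Hypothetical sketch of the download → load flow; command names are placeholders.
import { invoke } from '@tauri-apps/api/core'

async function downloadAndLoadModel(modelUrl: string, modelId: string): Promise<void> {
  // Steps 2-3: the download extension asks the Rust backend to fetch the file to disk.
  const modelPath = await invoke<string>('download_file', {
    url: modelUrl,
    savePath: `models/${modelId}.gguf`,
  })

  // Step 6: the hardware plugin reports what this machine can handle.
  const hw = await invoke<{ gpuCount: number; totalVramBytes: number }>('get_hardware_info')

  // Steps 4-5: the llamacpp extension asks its plugin to spawn llama.cpp,
  // offloading layers to the GPU when one is available.
  await invoke('load_llama_model', {
    path: modelPath,
    gpuLayers: hw.gpuCount > 0 ? -1 : 0, // -1 = offload all layers (llama.cpp convention)
  })

  // Step 7: the model is ready; the chat UI can start streaming tokens.
}
```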

## Project Structure

```
jan/
├── web-app/              # React frontend (what users see)
├── src-tauri/            # Rust backend (system integration)
│   ├── src/core/         # Core Tauri commands
│   └── plugins/          # Tauri plugins (hardware, llamacpp)
├── core/                 # TypeScript SDK (API layer)
├── extensions/           # JavaScript extensions
│   ├── assistant-extension/
│   ├── conversational-extension/
│   ├── download-extension/
│   └── llamacpp-extension/
├── docs/                 # Documentation website
├── website/              # Marketing website
├── autoqa/               # Automated testing
├── scripts/              # Build utilities
│
├── package.json          # Root workspace configuration
├── Makefile              # Build automation commands
├── mise.toml             # Mise tool configuration  
├── LICENSE               # Apache 2.0 license
└── README.md             # Project overview

```

## Development Setup

### The Scenic Route (Build from Source)

**Prerequisites:**

- Node.js ≥ 20.0.0
- Yarn ≥ 1.22.0
- Rust (for Tauri)
- Make ≥ 3.81

#### Option 1: The Easy Way (Make)

```bash
git clone https://github.com/menloresearch/jan
cd jan
make dev
```

#### Option 2: The Easier Way (Mise)

```bash
git clone https://github.com/menloresearch/jan
cd jan

# Install mise
curl https://mise.run | sh

# Let mise handle everything
mise install    # installs Node.js, Rust, and other tools
mise dev        # runs the full development setup
```

## How Can I Contribute?

### Reporting Bugs

- Ensure the bug was not already reported by searching on GitHub under Issues
- If you're unable to find an open issue addressing the problem, open a new one
- Include your system specs and error logs - it helps a ton

### Suggesting Enhancements

- Open a new issue with a clear title and description
- Explain why this enhancement would be useful
- Include mockups or examples if you can

### Your First Code Contribution

**Choose Your Adventure:**

- Frontend UI and logic: `web-app/`
- Shared API declarations: `core/`
- Backend system integration: `src-tauri/`
- Business logic features: `extensions/`
- Dedicated backend handlers: `src-tauri/plugins/`

**The Process:**

1. Fork the repo
2. Create a new branch (`git checkout -b feature-name`)
3. Make your changes (and write tests!)
4. Commit your changes (`git commit -am 'Add some feature'`)
5. Push to the branch (`git push origin feature-name`)
6. Open a new Pull Request against the `dev` branch

## Testing

```bash
yarn test                    # All tests
cd src-tauri && cargo test   # Rust tests
cd autoqa && python main.py  # End-to-end tests
```

## Code Standards

### TypeScript/JavaScript

- TypeScript required (we're not animals)
- ESLint + Prettier
- Functional React components
- Proper typing (no `any` - seriously!), as shown in the sketch below
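
As a rough illustration of those rules (made up for this guide, not taken from Jan's codebase), a typed functional React component might look like this:

```tsx
// Hypothetical example: functional component, explicit prop types, no `any`.
import { useState } from 'react'

interface ModelCardProps {
  name: string
  sizeBytes: number
  onDownload: (name: string) => Promise<void>
}

export function ModelCard({ name, sizeBytes, onDownload }: ModelCardProps) {
  const [busy, setBusy] = useState(false)

  const handleDownload = async () => {
    setBusy(true)
    try {
      await onDownload(name)
    } finally {
      setBusy(false)
    }
  }

  return (
    <button onClick={handleDownload} disabled={busy}>
      {busy ? 'Downloading…' : `Download ${name} (${(sizeBytes / 1e9).toFixed(1)} GB)`}
    </button>
  )
}
```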

### Rust

- `cargo fmt` + `cargo clippy`
- `Result<T, E>` for error handling
- Document public APIs

## Git Conventions

### Branches

- `main` - stable releases
- `dev` - development (target this for PRs)
- `feature/*` - new features
- `fix/*` - bug fixes

### Commit Messages

- Use the present tense ("Add feature" not "Added feature")
- Be descriptive but concise
- Reference issues when applicable

Examples:

```
feat: add support for Qwen models
fix: resolve memory leak in model loading
docs: update installation instructions
```

## Troubleshooting

If things go sideways:

1. Check our troubleshooting docs
2. Clear everything and start fresh: `make clean`, then `make dev`
3. Copy your error logs and system specs
4. Ask for help in our Discord #🆘|jan-help channel

Common issues:

- Build failures: Check Node.js and Rust versions
- Extension not loading: Verify it's properly registered
- Model not working: Check hardware requirements and GPU drivers

## Getting Help

Stuck? Search the existing GitHub Issues, open a new one, or ask in our Discord #🆘|jan-help channel.

## License

Apache 2.0 - Because sharing is caring. See LICENSE for the legal stuff.

## Additional Notes

We're building something pretty cool here - an AI assistant that respects your privacy and runs entirely on your machine. Every contribution, no matter how small, helps make AI more accessible to everyone.

Thanks for being part of the journey. Let's build the future of local AI together! 🚀