vision update

Ramon Perez 2025-07-30 11:18:11 +10:00
parent 04353a289c
commit 9e43f61366
14 changed files with 181 additions and 192 deletions

Binary file not shown (image, 745 KiB).

@@ -13,112 +13,81 @@ import { Aside, Card, CardGrid } from '@astrojs/starlight/components';

We know it's hard, but we believe this will be possible in the next decade through a combination of models, applications, and tools. For this we are...

> **building Jan as the ecosystem that ties all of these together seamlessly, so that users, regardless of their technical background, can add intelligence to their day-to-day lives the way they would with tools they already use - only better.**

![Jan Vision](../../assets/jan-vision.png)

## Core Principles

### 1) Build the Full Stack

Models alone aren't enough. Neither are applications. Superintelligence requires models that understand your needs, tools that extend capabilities, and applications that tie it all together. We're building all three, openly.

### 2) You Choose Who Runs It

Run Jan on your laptop. Self-host it on your servers. Use our cloud. The same superintelligence works everywhere. Your data, your compute, your choice.

### 3) Start Simple, Scale Infinitely

Open Jan and start chatting. No setup required. When you need more - better models, advanced tools, team deployment - everything's there. The complexity scales with your ambition, not our architecture.

## The Path to Superintelligence

### Today: The Foundation

- **A desktop app** that works with both local and cloud-based models
- **Jan models** small enough to run on any laptop and powerful enough to scale on any server
- **Basic tools** enabled through MCP: search, file parsing, simple workflows

### Next 12 Months: Ecosystem

- **The Jan v1 models**: a specialized series of models with general capabilities, optimised for specific tasks like search, analysis, creative writing, and more
- **The Jan server**: self-hosted AI infrastructure for teams
- **Advanced tools** like browser use, deep research, and long-term memory that work across devices, excel at different day-to-day use cases, and scale with the needs of large teams
- **Cross-device sync** that lets you take your AI everywhere

### End State: Open Superintelligence

Not one massive model, but an ecosystem of specialized models, tools, and applications working together. Built in public. Owned by whoever runs it.

## Why This Matters: The Status Quo

Every other AI company wants to be your AI provider. We want you to own your AI.

- **OpenAI/Anthropic**: Their models, their servers, their rules
- **Open-source models**: Powerful but fragmented - no cohesive experience
- **Jan**: A complete ecosystem you can own, modify, and deploy however you want

## Watch Us Build

### Live Model Training

We train our models in public. Check the [models page](./models/jan-v1) to see:

- Real-time training progress
- Failed runs and what went wrong
- Models in testing before release

No black box. No "trust us, it's good." Watch the entire process from dataset to deployment.

### Help Evaluate Our Models

Every model needs real-world testing. Join our open evaluation platform, where you can:

- Compare model outputs side-by-side
- Test specific capabilities you care about
- Vote on which responses actually help
- Suggest improvements based on your use cases

Think LMArena, but you can see all the data, run your own evals, and directly influence what we train next.

[Test/evaluate our models here](link)

## Get Involved

We build in public. Everything from our model training to our product roadmap is open.

- [GitHub](link) - Contribute code
- [Handbook](link) - See how we train models
- [Discord](link) - Join the discussion
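The "vote on which responses actually help" flow maps naturally onto pairwise-preference rating, the mechanism LMArena popularized. As a hedged illustration only (this is not Jan's actual scoring code), here is the standard Elo update applied to a single vote between two hypothetical Jan models:

```javascript
// Illustrative sketch: aggregating pairwise "which response helped more?"
// votes into model ratings with the standard Elo update.
// NOT Jan's published evaluation code - just the textbook formula.

function expectedScore(ratingA, ratingB) {
  // Probability that model A beats model B under the Elo model
  return 1 / (1 + Math.pow(10, (ratingB - ratingA) / 400));
}

function updateRatings(ratings, winner, loser, k = 32) {
  const exp = expectedScore(ratings[winner], ratings[loser]);
  return {
    ...ratings,
    [winner]: ratings[winner] + k * (1 - exp),
    [loser]: ratings[loser] - k * (1 - exp),
  };
}

// Start every model at 1000 and fold in one user vote
let ratings = { "jan-search": 1000, "jan-write": 1000 };
ratings = updateRatings(ratings, "jan-search", "jan-write");
console.log(ratings); // { 'jan-search': 1016, 'jan-write': 984 }
```

With many votes, upsets against higher-rated models move ratings more than expected wins, which is what makes the leaderboard converge.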

@@ -1,29 +1,51 @@

---
title: Jan Models
description: Specialized AI models trained in public for real tasks.
sidebar:
  order: 1
banner:
  content: 'Watch live training progress below. First models releasing Q1 2025.'
---

import { Aside } from '@astrojs/starlight/components';

## Not Just Another Model Family

Jan Models aren't general-purpose chatbots. Each model is trained for specific tasks that matter in daily work: search, analysis, creative writing, coding, research. They work together, each handling what it does best.

### Current Training Status

| Model | Specialization | Size | Training Progress | Status |
|:------|:--------------|:-----|:------------------|:-------|
| Jan-Search | Web search + synthesis | 7B | ████████░░ 82% | Testing phase |
| Jan-Write | Creative + technical writing | 13B | ████░░░░░░ 41% | Active training |
| Jan-Analyze | Data analysis + reasoning | 13B | ██░░░░░░░░ 23% | Dataset prep |
| Jan-Code | Code generation + debugging | 7B | ░░░░░░░░░░ 0% | Starting Jan 2025 |

<Aside type="note">
Training runs live at [train.jan.ai](). See datasets, metrics, and even failed runs.
</Aside>

### Why Specialized Models?

One model can't excel at everything. When GPT-4o, o3, o4, or Claude 4 Sonnet write poetry, they use the same weights they use to do math, which can be inefficient and expensive.

Our approach:

- **Jan-Search** knows how to query, crawl, and synthesize
- **Jan-Write** understands tone, structure, and creativity
- **Jan-Analyze** excels at reasoning and data interpretation
- Models work together through the Jan orchestration layer

### Built for the Ecosystem

These aren't standalone models. They're designed to:

- Run efficiently on local hardware (quantized to 4-8GB)
- Work with Jan Tools (browser automation, file parsing, memory)
- Scale from laptop to server without code changes
- Share context and hand off tasks to each other

### Help Us Improve

Models are only as good as their real-world performance. [Test our models](link) against your actual use cases and vote on what works. We train on your feedback, not just benchmarks.
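The "Jan orchestration layer" mentioned above isn't documented yet, so the following is a purely hypothetical sketch of one way a dispatcher could pick a specialist for an incoming request. The model names come from the table above, but the keyword heuristic, the fallback model, and the function API are our invention:

```javascript
// Hypothetical dispatcher: match a request against each specialist's
// task profile and hand the prompt to the best fit. Illustration only -
// not Jan's published orchestration interface.

const SPECIALISTS = {
  "jan-search": ["search", "find", "lookup", "latest", "news"],
  "jan-write": ["write", "draft", "rewrite", "poem", "email"],
  "jan-analyze": ["analyze", "compare", "summarize", "explain"],
  "jan-code": ["code", "debug", "function", "implement", "refactor"],
};

function routeTask(prompt, fallback = "jan-nano") {
  const text = prompt.toLowerCase();
  let best = fallback;
  let bestHits = 0;
  for (const [model, keywords] of Object.entries(SPECIALISTS)) {
    // Count how many of this specialist's keywords appear in the prompt
    const hits = keywords.filter((kw) => text.includes(kw)).length;
    if (hits > bestHits) {
      best = model;
      bestHits = hits;
    }
  }
  return best;
}

console.log(routeTask("Debug this function for me")); // "jan-code"
console.log(routeTask("What's the latest news on GGUF?")); // "jan-search"
console.log(routeTask("Hello!")); // no match, falls back to "jan-nano"
```

A production router would more likely use a small classifier model than keywords, but the shape is the same: one cheap decision up front, then hand-off to the specialist.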

@@ -1,113 +1,111 @@

---
title: Jan Desktop
description: The foundation of your AI ecosystem. Runs on your hardware.
sidebar:
  order: 2
---

import { Aside, Card, CardGrid, Tabs, TabItem } from '@astrojs/starlight/components';

Jan Desktop is your local AI workstation. Download it, run your own models, or connect to cloud providers. Your computer, your choice.

## How It Works

### Default: Local Mode

Open Jan. Start chatting with Jan Nano. No internet, no account, no API keys. Your conversations never leave your machine.

### Optional: Cloud Mode

Need more power? Connect to:

- Your own Jan Server
- jan.ai (coming soon)
- Any OpenAI-compatible API

<Aside type="caution">
**Current limitation**: You need to download a model first (2-4GB). We're embedding Jan Nano in the app to fix this.
</Aside>

## Why Desktop First

Your desktop has the GPU, storage, and memory to run real AI models. Not toy versions. Not demos. The same models that power ChatGPT-scale applications.

More importantly, it becomes the hub for your other devices. Your phone connects to your desktop. Your team connects to your desktop. Everything stays in your control.

## Specifications

<CardGrid>
<Card title="Storage" icon="folder">
Everything lives in `~/jan`. Your data, your models, your configuration. Back it up, move it, delete it - it's just files.
</Card>
<Card title="API Server" icon="code">
OpenAI-compatible API at `localhost:1337`. Any tool that works with OpenAI works with Jan. No code changes.
</Card>
<Card title="GPU Support" icon="rocket">
NVIDIA CUDA acceleration out of the box. Automatically detects and uses available GPUs. CPU fallback always works.
</Card>
<Card title="Model Flexibility" icon="puzzle">
Run any GGUF model from Hugging Face. Or our models. Or your fine-tuned models. If it's GGUF, it runs.
</Card>
</CardGrid>

## System Requirements

**Minimum**: 8GB RAM, 10GB storage, any 64-bit OS from the last 5 years

**Recommended**: 16GB RAM, NVIDIA GPU, 50GB storage for multiple models

**Runs on**: Windows 10+, macOS 12+, Ubuntu 20.04+

## Installation

```bash
# macOS/Linux
curl -sSL https://jan.ai/install.sh | bash

# Windows
# Download from jan.ai/download
```

<Aside type="tip">
The first run downloads Jan Nano (4GB). After that, everything works offline.
</Aside>

## For Developers

### Use Jan as an OpenAI Drop-in

```javascript
import OpenAI from "openai";

// Your existing OpenAI code
const openai = new OpenAI({
  apiKey: "not-needed",
  baseURL: "http://localhost:1337/v1"
});

// Works exactly the same
const completion = await openai.chat.completions.create({
  model: "jan-nano",
  messages: [{ role: "user", content: "Hello" }]
});
```

### Available Endpoints

- `/v1/chat/completions` - Chat with any loaded model
- `/v1/models` - List available models
- `/v1/embeddings` - Generate embeddings
- `/routes` - See all available routes

## The Foundation

Jan Desktop isn't just an app. It's the foundation of your personal AI infrastructure:

1. **Today**: Run models locally, connect to cloud APIs
2. **Soon**: Your phone connects to your desktop
3. **Next**: Your desktop serves your team
4. **Future**: Your personal AI that knows you, runs everywhere

No subscriptions. No lock-in. Just software that's yours.

---

**Next steps**:

- [Download Jan Desktop](https://jan.ai/download)
- [Try Jan Models →](../models/jan-v1)
- [Explore Tools →](../tools/search)
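Because the local server speaks the OpenAI wire format, you don't even need the SDK; a plain `fetch` works. This sketch assumes Jan Desktop is running on the default port and that `jan-nano` has already been downloaded:

```javascript
// Minimal sketch: call the local OpenAI-compatible API with plain fetch.
// `localhost:1337` and the /v1 routes come from the docs above; the
// model name "jan-nano" is assumed to be installed already.

const BASE_URL = "http://localhost:1337/v1";

function buildChatRequest(model, userMessage) {
  // Standard OpenAI-style chat completions payload
  return {
    url: `${BASE_URL}/chat/completions`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

async function chat(model, userMessage) {
  const { url, options } = buildChatRequest(model, userMessage);
  const res = await fetch(url, options);
  const data = await res.json();
  // OpenAI-format responses put the reply in choices[0].message.content
  return data.choices[0].message.content;
}

// Usage (requires Jan Desktop running locally):
// chat("jan-nano", "Hello").then(console.log);
```

Separating payload construction from the network call keeps the request shape easy to inspect and test without a running server.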

```diff
@@ -26,8 +26,8 @@ import { Content } from '../../content/products/index.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```

```diff
@@ -26,8 +26,8 @@ import { Content } from '../../../content/products/models/jan-nano.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```

```diff
@@ -5,7 +5,7 @@ import { Content } from '../../../content/products/models/jan-v1.mdx';
 <StarlightPage
   frontmatter={{
-    title: 'Jan v1',
+    title: 'Models Overview',
     description: 'Full-featured AI model for desktop and server deployment',
     tableOfContents: false
   }}
@@ -26,8 +26,8 @@ import { Content } from '../../../content/products/models/jan-v1.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```

```diff
@@ -26,8 +26,8 @@ import { Content } from '../../../content/products/platforms/desktop.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```

```diff
@@ -26,8 +26,8 @@ import { Content } from '../../../content/products/platforms/jan-ai.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```

```diff
@@ -26,8 +26,8 @@ import { Content } from '../../../content/products/platforms/mobile.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```

```diff
@@ -26,8 +26,8 @@ import { Content } from '../../../content/products/platforms/server.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```

```diff
@@ -26,8 +26,8 @@ import { Content } from '../../../content/products/tools/browseruse.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```

```diff
@@ -26,8 +26,8 @@ import { Content } from '../../../content/products/tools/deepresearch.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```

```diff
@@ -26,8 +26,8 @@ import { Content } from '../../../content/products/tools/search.mdx';
 {
   label: 'Models',
   items: [
-    { label: 'Overview', link: '/products/models/jan-v1/' },
     { label: 'Jan Nano', link: '/products/models/jan-nano/' },
+    { label: 'Jan v1', link: '/products/models/jan-v1/' },
   ]
 },
 {
```