From 12012232ac21b43cb0fb7da2a995dd2b89cbb84d Mon Sep 17 00:00:00 2001
From: Emre Can Kartal <159995642+eckartal@users.noreply.github.com>
Date: Fri, 26 Sep 2025 18:13:55 +0800
Subject: [PATCH] Update README.md
---
README.md | 25 +++++++++----------------
1 file changed, 9 insertions(+), 16 deletions(-)
diff --git a/README.md b/README.md
index 656917634..bab883a75 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
-# Jan - Local AI Assistant
+# Jan - Open-source ChatGPT replacement
-
+
@@ -12,15 +12,13 @@
- Getting Started
- - Docs
+ Getting Started
+ - Community
- Changelog
- Bug reports
- - Discord
-Jan is an AI assistant that can run 100% offline on your device. Download and run LLMs with
-**full control** and **privacy**.
+Jan is an AI assistant that can run 100% offline on your device. Download and run LLMs with **full control** and **privacy**.
## Installation
@@ -30,40 +28,35 @@ The easiest way to get started is by downloading one of the following versions f
-| Platform | Stable | Nightly |
-| Windows | jan.exe | jan.exe |
-| macOS | jan.dmg | jan.dmg |
-| Linux (deb) | jan.deb | jan.deb |
-| Linux (AppImage) | jan.AppImage | jan.AppImage |
+| Platform | Stable |
+| Windows | jan.exe |
+| macOS | jan.dmg |
+| Linux (deb) | jan.deb |
+| Linux (AppImage) | jan.AppImage |
-Download from [jan.ai](https://jan.ai/) or [GitHub Releases](https://github.com/menloresearch/jan/releases).
+Download from [jan.ai](https://jan.ai/) or [GitHub Releases](https://github.com/menloresearch/jan/releases).
## Features
-- **Local AI Models**: Download and run LLMs (Llama, Gemma, Qwen, etc.) from HuggingFace
-- **Cloud Integration**: Connect to OpenAI, Anthropic, Mistral, Groq, and others
+- **Local AI Models**: Download and run LLMs (Llama, Gemma, Qwen, gpt-oss, etc.) from Hugging Face
+- **Cloud Integration**: Connect to OpenAI (GPT), Anthropic (Claude), Mistral, Groq, and others
- **Custom Assistants**: Create specialized AI assistants for your tasks
- **OpenAI-Compatible API**: Local server at `localhost:1337` for other applications
-- **Model Context Protocol**: MCP integration for enhanced capabilities
+- **Model Context Protocol**: MCP integration for agentic capabilities
- **Privacy First**: Everything runs locally when you want it to
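
The local server mentioned above speaks the OpenAI chat-completions wire format, so any OpenAI-style client can point at `localhost:1337`. A minimal stdlib-only sketch of what such a request looks like — the model name `llama3.2` is a placeholder for whichever model you have downloaded in Jan, and the `/v1/chat/completions` path is assumed from the OpenAI convention:

```python
import json
import urllib.request

# Jan's OpenAI-compatible endpoint (path assumed from the OpenAI convention).
URL = "http://localhost:1337/v1/chat/completions"

# Standard OpenAI-style request body; "llama3.2" is a placeholder model name.
payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Hello from the local API"}],
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the Jan server is running locally:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)
#     print(reply["choices"][0]["message"]["content"])
```

Because the format matches OpenAI's, existing SDKs work too by overriding their base URL to `http://localhost:1337/v1`.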
## Build from Source