small updates to readme, index, local-server
parent c5ea4cb8f9
commit 17da6d4ada
@@ -26,3 +26,23 @@ All commands are run from the root of the project, from a terminal:
| `bun preview` | Preview your build locally, before deploying |
| `bun astro ...` | Run CLI commands like `astro add`, `astro check` |
| `bun astro -- --help` | Get help using the Astro CLI |

## 📖 API Reference Commands

The website includes interactive API documentation. These commands help manage the OpenAPI specifications:

| Command | Action |
| :------------------------------- | :-------------------------------------------------------- |
| `bun run api:dev` | Start dev server with API reference at `/api` |
| `bun run api:local` | Start dev server with local API docs at `/api-reference/local` |
| `bun run api:cloud` | Start dev server with cloud API docs at `/api-reference/cloud` |
| `bun run generate:local-spec` | Generate/fix the local OpenAPI specification |
| `bun run generate:cloud-spec` | Generate the cloud OpenAPI specification from Jan Server |
| `bun run generate:cloud-spec-force` | Force update cloud spec (ignores cache/conditions) |

**API Reference Pages:**

- `/api` - Landing page with Local and Server API options
- `/api-reference/local` - Local API (llama.cpp) documentation
- `/api-reference/cloud` - Jan Server API (vLLM) documentation

The cloud specification is automatically synced via GitHub Actions on a daily schedule and can be manually triggered by the Jan Server team.
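The `generate:*-spec` commands above produce OpenAPI documents that the reference pages render. As a rough illustration of what a well-formed document needs, here is a minimal sanity check; the `check_spec` helper is a hypothetical sketch based on the OpenAPI 3.x spec, not part of the repo's tooling:

```python
# Hypothetical sanity check for a generated OpenAPI document.
# REQUIRED_KEYS follows the OpenAPI 3.x spec; this helper is an
# illustration, not a script from the Jan repository.
REQUIRED_KEYS = {"openapi", "info", "paths"}

def check_spec(spec: dict) -> list[str]:
    """Return a sorted list of problems found in a parsed OpenAPI document."""
    problems = sorted(k for k in REQUIRED_KEYS if k not in spec)
    # The info object must carry a title for the reference UI to show.
    if "info" in spec and "title" not in spec["info"]:
        problems.append("info.title")
    return problems

# A minimal spec passes; an empty document reports every missing key.
minimal = {"openapi": "3.1.0", "info": {"title": "Jan Local API"}, "paths": {}}
print(check_spec(minimal))  # []
print(check_spec({}))       # ['info', 'openapi', 'paths']
```

A check along these lines is the kind of fixing "Generate/fix the local OpenAPI specification" implies: catching structural gaps before the docs site tries to render the spec.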
@@ -26,6 +26,7 @@ export default defineConfig({
    starlight({
      title: '👋 Jan',
      favicon: 'favicon.ico',
      customCss: ['./src/styles/global.css'],
      head: [
        {
          tag: 'script',
@@ -242,11 +242,7 @@ const { title, description } = Astro.props;
    background: var(--sl-color-gray-6);
  }

  @media (max-width: 768px) {
    .custom-nav-links {
      display: none;
    }
  }

  @media (max-width: 1024px) {
    .custom-nav-links {
@@ -22,7 +22,7 @@ banner:
  We just launched something cool! 👋Jan now <a href="./jan/multi-modal">supports image 🖼️ attachments</a> 🎉
---

import { Aside } from '@astrojs/starlight/components';
import { Aside, LinkCard } from '@astrojs/starlight/components';
@@ -121,11 +121,10 @@ The best AI is the one you control. Not the one that others control for you.
- Run powerful models locally on consumer hardware
- Connect to any cloud provider with your API keys
- Use MCP tools for real-world tasks
- Access transparent model evaluations

### What We're Building
- More specialized models that excel at specific tasks
- Expanded app ecosystem (mobile, web, extensions)
- Expanded app ecosystem (mobile, self-hosted server, web, extensions)
- Richer connector ecosystem
- An evaluation framework to build better models
@@ -147,6 +146,13 @@ hardware, and better techniques, but we're honest that this is a journey, not a
2. Choose a model - download locally or add cloud API keys
3. Start chatting or connect tools via MCP
4. Build with our [local API](./api-server)
5. Explore the [API Reference](/api) for Local and Server endpoints

<LinkCard
  title="API Reference"
  description="Learn how to get started with the API Reference"
  href="/api"
/>

## Acknowledgements
@@ -3,7 +3,7 @@ title: Local API Server
description: Build AI applications with Jan's OpenAI-compatible API server.
---

import { Aside } from '@astrojs/starlight/components';
import { Aside, LinkCard } from '@astrojs/starlight/components';

Jan provides an OpenAI-compatible API server that runs entirely on your computer. Use the same API patterns you know from OpenAI, but with complete control over your models and data.
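Because the server speaks the OpenAI chat-completions format, any standard HTTP client works. A minimal sketch, using only the Python standard library; the endpoint and port come from the docs' own curl example, while the model name is a placeholder you would replace with a model downloaded in Jan:

```python
import json
import urllib.request

# Sketch of a chat-completions call against Jan's local server.
# "your-local-model" is a placeholder, not a real model ID.
payload = {
    "model": "your-local-model",
    "messages": [{"role": "user", "content": "Hello from the local API"}],
}

req = urllib.request.Request(
    "http://localhost:1337/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the Jan server is running; the response follows the
# standard OpenAI shape (choices[0].message.content).
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)
#     print(reply["choices"][0]["message"]["content"])
```

The same request body works unchanged against cloud providers, which is what makes mixing local and remote models straightforward.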
@@ -31,10 +31,17 @@ curl http://localhost:1337/v1/chat/completions \

## Documentation

- [**API Reference**](/api) - Interactive API documentation with Try It Out
- [**API Configuration**](./api-server) - Server settings, authentication, CORS
- [**Engine Settings**](./llama-cpp) - Configure llama.cpp for your hardware
- [**Server Settings**](./settings) - Advanced configuration options

<LinkCard
  title="API Reference"
  description="Learn how to get started with the API Reference"
  href="/api"
/>
## Integration Examples

### Continue (VS Code)

@@ -94,14 +101,14 @@ Jan implements the core OpenAI API endpoints. Some advanced features like functi

## Why Use Jan's API?

**Privacy** - Your data stays on your machine with local models
**Cost** - No API fees for local model usage
**Control** - Choose your models, parameters, and hardware
**Flexibility** - Mix local and cloud models as needed

## Related Resources

- [Models Overview](/docs/jan/manage-models) - Available models
- [Data Storage](/docs/jan/data-folder) - Where Jan stores data
- [Troubleshooting](/docs/jan/troubleshooting) - Common issues
- [GitHub Repository](https://github.com/janhq/jan) - Source code