---
title: Command Line Interface
description: Cortex CLI.
keywords:
[
Jan,
Customizable Intelligence,
LLM,
local AI,
privacy focus,
free and open source,
private and offline,
conversational AI,
no-subscription fee,
large language models,
Cortex,
Jan,
LLMs
]
---
import { Callout, Steps } from 'nextra/components'
import { Cards, Card } from 'nextra/components'
<Callout type="warning">
🚧 Cortex is under construction.
</Callout>
# Command Line Interface
The Cortex CLI is a user-friendly tool for managing and running large language models (LLMs), inspired by tools like the Docker and GitHub CLIs. Designed for straightforward installation and use, it simplifies the integration and management of LLMs.
<Callout type="info">
The Cortex CLI is OpenAI-compatible.
</Callout>
## Installation
To get started with the Cortex CLI, please see our guides:
- [Quickstart](/cortex/quickstart)
- [Device-specific installation](/cortex/installation)

These resources provide detailed instructions to ensure Cortex is set up correctly on your machine, accommodating various hardware environments.
## Usage
The Cortex CLI has a robust command set that streamlines your LLM interactions.
Check out the [CLI reference pages](/cortex/cli) for a comprehensive guide on all available commands and their specific functions.
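
As a quick orientation, the sketch below shows how you might explore the command set from the terminal. It assumes a standard `--help` flag; consult the CLI reference pages for the authoritative list of commands and options.

```bash
# List the available top-level commands (assumes a standard --help flag)
cortex --help

# Show the options for a specific command, e.g. chat
cortex chat --help
```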
## Storage
By default, the Cortex CLI stores model binaries, thread history, and other usage data in the global npm installation directory of `@janhq/cortex` (the path reported by `npm list -g @janhq/cortex`).
You can find the respective folders within the `/lib/node_modules/@janhq/cortex/dist/` subdirectory.
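
For example, the following shell sketch resolves that location and lists the contents of the `dist/` subdirectory. It uses `npm root -g` purely for convenience, and the folder layout it reveals may differ between installations.

```bash
# Resolve the globally installed Cortex package directory
CORTEX_ROOT="$(npm root -g)/@janhq/cortex"

# Inspect the data folders (model binaries, thread history, etc.) under dist/
ls "$CORTEX_ROOT/dist"
```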
<Callout type="info">
**Ongoing Development**:
- Customizable Storage Locations
- Database Integration
</Callout>
## CLI Syntax
The Cortex CLI improves the developer experience by incorporating command chaining and syntactic enhancements.
Command chaining combines multiple operations into a single command, streamlining complex workflows and simplifying the execution of multi-step processes.
### OpenAI API Equivalence
Cortex CLI commands are named after the corresponding OpenAI API methods as a matter of convention. This ensures a smooth transition for users already familiar with OpenAI's API.
For example:
- The `cortex chat` command is equivalent to the [`POST /v1/chat/completions` endpoint](/cortex/cortex-chat).
- The `cortex models get ID` command is equivalent to the [`GET /v1/models/{id}` endpoint](/cortex/cortex-models).
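
To illustrate the equivalence, the sketch below pairs the `cortex chat` command with the corresponding REST call against the local OpenAI-compatible server. The host, port, model ID, and payload fields shown are assumptions and may differ in your setup; see the API reference for the exact parameters.

```bash
# CLI: open a chat session
cortex chat

# Roughly equivalent REST call to the local OpenAI-compatible server
# (host, port, model ID, and payload fields are illustrative assumptions)
curl http://localhost:1337/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral",
    "messages": [{ "role": "user", "content": "Hello!" }]
  }'
```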
### Command Chaining
The Cortex CLI's command chaining support allows multiple commands to be executed in sequence with a simplified syntax. This approach reduces the complexity of command inputs and speeds up development tasks.
For example:
- The [`cortex run`](/cortex/cortex-run) command, inspired by Docker and GitHub, starts the model and the inference engine, and provides a command-line chat interface for easy testing.
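
As a rough sketch of what that chaining replaces, the commands below contrast a single `cortex run` invocation with the individual steps it stands in for. The sub-command names and the model ID are illustrative assumptions, not an exact breakdown of what `run` executes; refer to the [CLI reference pages](/cortex/cli) for the actual commands.

```bash
# One chained command: fetch the model if needed, start the inference
# engine, and open an interactive chat session
cortex run mistral

# Conceptually the same as performing each step yourself
# (sub-command names are illustrative, not exact)
cortex pull mistral           # download the model binary
cortex models start mistral   # load the model into the inference engine
cortex chat mistral           # open the chat interface
```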