feat: support RAG

chore: Update new model.json with multiple binaries

feat: Add updates for handling multiple model binaries

chore: Jan can see (vision support)

Update Model.json (#1005)

* add(mixtral): add model.json for mixtral

* archive some models and update model.json

* add(model): add pandora 10.7b

* fix(model): update description

* fix(model): bump version and change the featured model to Trinity

* fix(model): archive neuralchat

* fix(model): deprecate all old models

* fix(trinity): add cover image and change description

* fix(trinity): update cover png

* add(pandora): cover image

* fix(pandora): cover image

* chore: model desc nits

* fix(models): adjust the sizes for the Solar and Pandora models

* add(mixtral): description

---------

Co-authored-by: 0xSage <n@pragmatic.vc>

chore: reformat model.json and use new template

fix(Model): download/abort model (#1163)

* fix(Model): download/abort model
* fix: image preview

Signed-off-by: James <james@jan.ai>

---------

Signed-off-by: James <james@jan.ai>
Co-authored-by: James <james@jan.ai>
Co-authored-by: Louis <louis@jan.ai>

add preview and reduce re-render time across the chat screen

Signed-off-by: James <james@jan.ai>

store files under thread_id/files

Signed-off-by: James <james@jan.ai>

fix: Update llava 1.5 size

fix: Nitro extension path resolver

feat: Add upload preview clearance

chore: Update FileType to multiple targets

fix: delete file preview once new thread created

chore: Add langchain import

support storing PDF files

Signed-off-by: James <james@jan.ai>

feat: add retrieval tool in node runtime

fix: module import

Co-authored-by: Louis <louis-jan@users.noreply.github.com>

feat: Add AssistantTool type

chore: Add tool_retrieval_enabled to InferenceEngine

chore: Add AssistantTool to thread entity

chore: refactor tool retrieval base class

feat: Add handler for assistant with rag enabled

chore: Update inferenceEngine type properly

chore: Update inferenceEngine type properly

fix: Update retrieval tool

chore: main entry correction

refactor: tsconfig files

chore: Update ModelRuntimeParams type

refactor: Remove unused files

fix: wip

chore: remove unused console.log for FileUploadPreview

fix: Update mapping correctly for engine and proxyEngine

feat: Add proxyEngine to type ModelInfo

fix: WIP with test route

fix: Add bundleDependencies to package.json

chore: remove conversational history memory

fix: refactor data passing

refactor: remove unused code

fix: Update module

chore: export import correction

fix conflict

Signed-off-by: James <james@jan.ai>

fix: resolve conflicts after rebase

fix: Update llava 1.5 model json

feat: Add bakllava 1 model json

refactor: node module export, ES syntax and langchain import

fix: WIP

fix: WIP

fix: WIP

fix: external module import

fix: WIP

Add UI attachment for file upload

Prepare thumbnail UI for image attachments

chore: rebase model folder to dev branch

chore: remove multiple binaries related commits

fix: remove multiple binaries related commits part 2

fix: Remove transformer.js related deps

Fix truncated file name in attachment

remove unused code from image preview attachment

fix: remove multi binaries error

chore: remove commented code for ModelArtifacts type

Add dropzone for drag-and-drop attachments

Avoid conditionally rendering a stray 0 when checking length

fix shortcut symbol on Windows

avoid undefined tools

fix: set tool retrieval to true by default and disable changing it

chore: remove unused code

fix: Enable nitro embedding by default

fix: Update code WIP with nitro embedding

chore: remove unused running function

fix: assistant extension missing module

feat: Retrieval ingest, query and reforward

fix: Update hnswlib version conflict

fix: Add tool settings

fix: Update path to thread_id/memory

fix: Add support for nitro embedding usage

fix: RAG does not work with plain content message

fix(Model): #1662 imported model does not use gpu (#1723)

Signed-off-by: James <james@jan.ai>
Co-authored-by: James <james@jan.ai>

feat: allow users to update retrieval settings

chore: pass thread assistant settings to assistant extensions

chore: eslint fix

fix right panel border showing when there is no active thread

Update retrieval assistant settings layout

Rename SettingComponent file

change default value in core extension

add fake loader while generating response

fix conditional rendering of fake loader

remove unused import

Proper error message on file type

fix: loading indicator

fix: chunk size and overlap constraint

make drag and drop conditional on retrieval being enabled

fix: enable retrieval middleware as soon as its tool is enabled

fix: configure embedding engine according to request

fix: set retrieval to false by default

fix: engine json

chore: migrate assistant

disable panel collapse when retrieval or children is null

chore: remove unused log

chore: Bump nitro version to 0.2.14 for batch embedding

chore: remove unused console.log
hiro 2023-12-19 23:33:27 +07:00 committed by Louis
parent 38f757dd4d
commit 28e4405498
68 changed files with 1959 additions and 525 deletions

View File

@ -62,6 +62,7 @@ export enum FileManagerRoute {
getJanDataFolderPath = 'getJanDataFolderPath',
getResourcePath = 'getResourcePath',
fileStat = 'fileStat',
writeBlob = 'writeBlob',
}
export type ApiFunction = (...args: any[]) => any

View File

@ -1,4 +1,4 @@
import { FileStat } from "./types"
import { FileStat } from './types'
/**
* Writes data to a file at the specified path.
@ -6,6 +6,15 @@ import { FileStat } from "./types"
*/
const writeFileSync = (...args: any[]) => global.core.api?.writeFileSync(...args)
/**
* Writes blob data to a file at the specified path.
* @param path - The path to file.
* @param data - The blob data.
* @returns
*/
const writeBlob: (path: string, data: string) => Promise<any> = (path, data) =>
global.core.api?.writeBlob(path, data)
/**
* Reads the contents of a file at the specified path.
* @returns {Promise<any>} A Promise that resolves with the contents of the file.
@ -60,7 +69,6 @@ const syncFile: (src: string, dest: string) => Promise<any> = (src, dest) =>
*/
const copyFileSync = (...args: any[]) => global.core.api?.copyFileSync(...args)
/**
* Gets the file's stats.
*
@ -70,7 +78,6 @@ const copyFileSync = (...args: any[]) => global.core.api?.copyFileSync(...args)
const fileStat: (path: string) => Promise<FileStat | undefined> = (path) =>
global.core.api?.fileStat(path)
// TODO: Export `dummy` fs functions automatically
// Currently adding these manually
export const fs = {
@ -84,5 +91,6 @@ export const fs = {
appendFileSync,
copyFileSync,
syncFile,
fileStat
fileStat,
writeBlob,
}
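
A minimal usage sketch for the new writeBlob bridge (the thread id, message id, and base64 payload are illustrative; it assumes the core API bridge is registered, as with the other fs helpers):

import { fs, joinPath } from '@janhq/core'

// Persist a base64-encoded attachment under the thread's files folder.
// writeBlob decodes the base64 string on the main process and writes raw bytes.
const saveAttachment = async (threadId: string, messageId: string, base64Data: string) => {
  const filesDir = await joinPath(['file://threads', threadId, 'files'])
  if (!(await fs.existsSync(filesDir))) await fs.mkdirSync(filesDir)
  await fs.writeBlob(await joinPath([filesDir, `${messageId}.pdf`]), base64Data)
}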

View File

@ -2,6 +2,7 @@ import { FileSystemRoute } from '../../../api'
import { join } from 'path'
import { HttpServer } from '../HttpServer'
import { getJanDataFolderPath } from '../../utils'
import { normalizeFilePath } from '../../path'
export const fsRouter = async (app: HttpServer) => {
const moduleName = 'fs'
@ -13,10 +14,10 @@ export const fsRouter = async (app: HttpServer) => {
const result = await import(moduleName).then((mdl) => {
return mdl[route](
...body.map((arg: any) =>
typeof arg === 'string' && arg.includes('file:/')
? join(getJanDataFolderPath(), arg.replace('file:/', ''))
: arg,
),
typeof arg === 'string' && (arg.startsWith(`file:/`) || arg.startsWith(`file:\\`))
? join(getJanDataFolderPath(), normalizeFilePath(arg))
: arg
)
)
})
res.status(200).send(result)

View File

@ -2,6 +2,13 @@
* Assistant type defines the shape of an assistant object.
* @stored
*/
export type AssistantTool = {
type: string
enabled: boolean
settings: any
}
export type Assistant = {
/** Represents the avatar of the user. */
avatar: string
@ -22,7 +29,7 @@ export type Assistant = {
/** Represents the instructions for the object. */
instructions?: string
/** Represents the tools associated with the object. */
tools?: any
tools?: AssistantTool[]
/** Represents the file identifiers associated with the object. */
file_ids: string[]
/** Represents the metadata of the object. */

View File

@ -1,3 +1,5 @@
import { ContentType, ContentValue } from '../message'
/**
* The role of the author of this message.
*/
@ -13,7 +15,32 @@ export enum ChatCompletionRole {
*/
export type ChatCompletionMessage = {
/** The contents of the message. **/
content?: string
content?: ChatCompletionMessageContent
/** The role of the author of this message. **/
role: ChatCompletionRole
}
export type ChatCompletionMessageContent =
| string
| (ChatCompletionMessageContentText &
ChatCompletionMessageContentImage &
ChatCompletionMessageContentDoc)[]
export enum ChatCompletionMessageContentType {
Text = 'text',
Image = 'image_url',
Doc = 'doc_url',
}
export type ChatCompletionMessageContentText = {
type: ChatCompletionMessageContentType
text: string
}
export type ChatCompletionMessageContentImage = {
type: ChatCompletionMessageContentType
image_url: { url: string }
}
export type ChatCompletionMessageContentDoc = {
type: ChatCompletionMessageContentType
doc_url: { url: string }
}
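
For reference, a sketch of the multi-part content this union enables, mirroring what useSendChatMessage builds further down (the file path is a placeholder):

const userMessage = {
  role: ChatCompletionRole.User,
  content: [
    {
      type: ChatCompletionMessageContentType.Text,
      text: 'Summarize the attached report.',
    },
    {
      type: ChatCompletionMessageContentType.Doc,
      doc_url: { url: 'threads/<thread_id>/files/<message_id>.pdf' },
    },
  ],
} as ChatCompletionMessage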

View File

@ -1,5 +1,6 @@
import { ChatCompletionMessage, ChatCompletionRole } from '../inference'
import { ModelInfo } from '../model'
import { Thread } from '../thread'
/**
* The `ThreadMessage` type defines the shape of a thread's message object.
@ -35,7 +36,10 @@ export type ThreadMessage = {
export type MessageRequest = {
id?: string
/** The thread id of the message request. **/
/**
* @deprecated Use thread object instead
* The thread id of the message request.
*/
threadId: string
/**
@ -48,6 +52,10 @@ export type MessageRequest = {
/** Settings for constructing a chat completion request **/
model?: ModelInfo
/** The thread of this message is belong to. **/
// TODO: deprecate threadId field
thread?: Thread
}
/**
@ -62,7 +70,7 @@ export enum MessageStatus {
/** Message loaded with error. **/
Error = 'error',
/** Message is cancelled streaming */
Stopped = "stopped"
Stopped = 'stopped',
}
/**
@ -71,6 +79,7 @@ export enum MessageStatus {
export enum ContentType {
Text = 'text',
Image = 'image',
Pdf = 'pdf',
}
/**
@ -80,6 +89,8 @@ export enum ContentType {
export type ContentValue = {
value: string
annotations: string[]
name?: string
size?: number
}
/**

View File

@ -7,6 +7,7 @@ export type ModelInfo = {
settings: ModelSettingParams
parameters: ModelRuntimeParams
engine?: InferenceEngine
proxyEngine?: InferenceEngine
}
/**
@ -18,7 +19,8 @@ export enum InferenceEngine {
nitro = 'nitro',
openai = 'openai',
triton_trtllm = 'triton_trtllm',
hf_endpoint = 'hf_endpoint',
tool_retrieval_enabled = 'tool_retrieval_enabled',
}
export type ModelArtifact = {
@ -90,6 +92,13 @@ export type Model = {
* The model engine.
*/
engine: InferenceEngine
proxyEngine?: InferenceEngine
/**
* Is multimodal or not.
*/
visionModel?: boolean
}
export type ModelMetadata = {
@ -129,4 +138,5 @@ export type ModelRuntimeParams = {
stop?: string[]
frequency_penalty?: number
presence_penalty?: number
engine?: string
}
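
A sketch of how engine and proxyEngine pair up when the retrieval tool is active: the request is tagged with the tool_retrieval_enabled pseudo-engine while the original engine rides along in proxyEngine so the retrieval middleware can forward the request back to it afterwards (the model id is illustrative):

const modelRequest: ModelInfo = {
  id: 'trinity-v1-7b',                            // illustrative model id
  settings: {},
  parameters: {},
  engine: InferenceEngine.tool_retrieval_enabled, // handled by the retrieval middleware
  proxyEngine: InferenceEngine.nitro,             // the engine that actually runs inference
}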

View File

@ -1,2 +1,3 @@
export * from './threadEntity'
export * from './threadInterface'
export * from './threadEvent'

View File

@ -1,3 +1,4 @@
import { AssistantTool } from '../assistant'
import { ModelInfo } from '../model'
/**
@ -30,6 +31,7 @@ export type ThreadAssistantInfo = {
assistant_name: string
model: ModelInfo
instructions?: string
tools?: AssistantTool[]
}
/**

View File

@ -0,0 +1,4 @@
export enum ThreadEvent {
/** The `OnThreadStarted` event is emitted when a thread is started. */
OnThreadStarted = 'OnThreadStarted',
}

View File

@ -59,4 +59,20 @@ export function handleFileMangerIPCs() {
return fileStat
}
)
ipcMain.handle(
FileManagerRoute.writeBlob,
async (_event, path: string, data: string): Promise<void> => {
try {
const normalizedPath = normalizeFilePath(path)
const dataBuffer = Buffer.from(data, 'base64')
fs.writeFileSync(
join(getJanDataFolderPath(), normalizedPath),
dataBuffer
)
} catch (err) {
console.error(`writeFile ${path} result: ${err}`)
}
}
)
}

View File

@ -1,9 +1,9 @@
import { ipcMain } from 'electron'
import { FileSystemRoute } from '@janhq/core'
import { join } from 'path'
import { getJanDataFolderPath, normalizeFilePath } from '@janhq/core/node'
import fs from 'fs'
import { FileManagerRoute, FileSystemRoute } from '@janhq/core'
import { join } from 'path'
/**
* Handles file system operations.
*/
@ -15,7 +15,7 @@ export function handleFsIPCs() {
mdl[route](
...args.map((arg) =>
typeof arg === 'string' &&
(arg.includes(`file:/`) || arg.includes(`file:\\`))
(arg.startsWith(`file:/`) || arg.startsWith(`file:\\`))
? join(getJanDataFolderPath(), normalizeFilePath(arg))
: arg
)

View File

@ -3,26 +3,46 @@
"version": "1.0.0",
"description": "This extension enables assistants, including Jan, a default assistant that can call all downloaded models",
"main": "dist/index.js",
"module": "dist/module.js",
"node": "dist/node/index.js",
"author": "Jan <service@jan.ai>",
"license": "AGPL-3.0",
"scripts": {
"build": "tsc -b . && webpack --config webpack.config.js",
"build": "tsc --module commonjs && rollup -c rollup.config.ts",
"build:publish": "rimraf *.tgz --glob && npm run build && npm pack && cpx *.tgz ../../electron/pre-install"
},
"devDependencies": {
"@rollup/plugin-commonjs": "^25.0.7",
"@rollup/plugin-json": "^6.1.0",
"@rollup/plugin-node-resolve": "^15.2.3",
"@rollup/plugin-replace": "^5.0.5",
"@types/pdf-parse": "^1.1.4",
"cpx": "^1.5.0",
"rimraf": "^3.0.2",
"webpack": "^5.88.2",
"webpack-cli": "^5.1.4"
"rollup": "^2.38.5",
"rollup-plugin-define": "^1.0.1",
"rollup-plugin-sourcemaps": "^0.6.3",
"rollup-plugin-typescript2": "^0.36.0",
"typescript": "^5.3.3"
},
"dependencies": {
"@janhq/core": "file:../../core",
"@langchain/community": "0.0.13",
"hnswlib-node": "^1.4.2",
"langchain": "^0.0.214",
"path-browserify": "^1.0.1",
"pdf-parse": "^1.1.1",
"ts-loader": "^9.5.0"
},
"files": [
"dist/*",
"package.json",
"README.md"
],
"bundleDependencies": [
"@janhq/core",
"@langchain/community",
"hnswlib-node",
"langchain",
"pdf-parse"
]
}

View File

@ -0,0 +1,81 @@
import resolve from "@rollup/plugin-node-resolve";
import commonjs from "@rollup/plugin-commonjs";
import sourceMaps from "rollup-plugin-sourcemaps";
import typescript from "rollup-plugin-typescript2";
import json from "@rollup/plugin-json";
import replace from "@rollup/plugin-replace";
const packageJson = require("./package.json");
const pkg = require("./package.json");
export default [
{
input: `src/index.ts`,
output: [{ file: pkg.main, format: "es", sourcemap: true }],
// Indicate here external modules you don't wanna include in your bundle (i.e.: 'lodash')
external: [],
watch: {
include: "src/**",
},
plugins: [
replace({
NODE: JSON.stringify(`${packageJson.name}/${packageJson.node}`),
EXTENSION_NAME: JSON.stringify(packageJson.name),
VERSION: JSON.stringify(packageJson.version),
}),
// Allow json resolution
json(),
// Compile TypeScript files
typescript({ useTsconfigDeclarationDir: true }),
// Compile TypeScript files
// Allow bundling cjs modules (unlike webpack, rollup doesn't understand cjs)
commonjs(),
// Allow node_modules resolution, so you can use 'external' to control
// which external modules to include in the bundle
// https://github.com/rollup/rollup-plugin-node-resolve#usage
resolve({
extensions: [".js", ".ts", ".svelte"],
}),
// Resolve source maps to the original source
sourceMaps(),
],
},
{
input: `src/node/index.ts`,
output: [{ dir: "dist/node", format: "cjs", sourcemap: false }],
// Indicate here external modules you don't wanna include in your bundle (i.e.: 'lodash')
external: [
"@janhq/core/node",
"@langchain/community",
"langchain",
"langsmith",
"path",
"hnswlib-node",
],
watch: {
include: "src/node/**",
},
// inlineDynamicImports: true,
plugins: [
// Allow json resolution
json(),
// Compile TypeScript files
typescript({ useTsconfigDeclarationDir: true }),
// Allow bundling cjs modules (unlike webpack, rollup doesn't understand cjs)
commonjs({
ignoreDynamicRequires: true,
}),
// Allow node_modules resolution, so you can use 'external' to control
// which external modules to include in the bundle
// https://github.com/rollup/rollup-plugin-node-resolve#usage
resolve({
extensions: [".ts", ".js", ".json"],
}),
// Resolve source maps to the original source
// sourceMaps(),
],
},
];

View File

@ -1 +1,3 @@
declare const MODULE: string;
declare const NODE: string;
declare const EXTENSION_NAME: string;
declare const VERSION: string;

View File

@ -1,15 +1,151 @@
import { fs, Assistant, AssistantExtension } from "@janhq/core";
import { join } from "path";
import {
fs,
Assistant,
MessageRequest,
events,
InferenceEngine,
MessageEvent,
InferenceEvent,
joinPath,
executeOnMain,
AssistantExtension,
} from "@janhq/core";
export default class JanAssistantExtension extends AssistantExtension {
private static readonly _homeDir = "file://assistants";
controller = new AbortController();
isCancelled = false;
retrievalThreadId: string | undefined = undefined;
async onLoad() {
// making the assistant directory
if (!(await fs.existsSync(JanAssistantExtension._homeDir)))
fs.mkdirSync(JanAssistantExtension._homeDir).then(() => {
this.createJanAssistant();
});
const assistantDirExist = await fs.existsSync(
JanAssistantExtension._homeDir,
);
if (
localStorage.getItem(`${EXTENSION_NAME}-version`) !== VERSION ||
!assistantDirExist
) {
if (!assistantDirExist)
await fs.mkdirSync(JanAssistantExtension._homeDir);
// Write assistant metadata
this.createJanAssistant();
// Finished migration
localStorage.setItem(`${EXTENSION_NAME}-version`, VERSION);
}
// Events subscription
events.on(MessageEvent.OnMessageSent, (data: MessageRequest) =>
JanAssistantExtension.handleMessageRequest(data, this),
);
events.on(InferenceEvent.OnInferenceStopped, () => {
JanAssistantExtension.handleInferenceStopped(this);
});
}
private static async handleInferenceStopped(instance: JanAssistantExtension) {
instance.isCancelled = true;
instance.controller?.abort();
}
private static async handleMessageRequest(
data: MessageRequest,
instance: JanAssistantExtension,
) {
instance.isCancelled = false;
instance.controller = new AbortController();
if (
data.model?.engine !== InferenceEngine.tool_retrieval_enabled ||
!data.messages ||
!data.thread?.assistants[0]?.tools
) {
return;
}
const latestMessage = data.messages[data.messages.length - 1];
// Ingest the document if needed
if (
latestMessage &&
latestMessage.content &&
typeof latestMessage.content !== "string"
) {
const docFile = latestMessage.content[1]?.doc_url?.url;
if (docFile) {
await executeOnMain(
NODE,
"toolRetrievalIngestNewDocument",
docFile,
data.model?.proxyEngine,
);
}
}
// Load agent on thread changed
if (instance.retrievalThreadId !== data.threadId) {
await executeOnMain(NODE, "toolRetrievalLoadThreadMemory", data.threadId);
instance.retrievalThreadId = data.threadId;
// Update the text splitter
await executeOnMain(
NODE,
"toolRetrievalUpdateTextSplitter",
data.thread.assistants[0].tools[0]?.settings?.chunk_size ?? 4000,
data.thread.assistants[0].tools[0]?.settings?.chunk_overlap ?? 200,
);
}
if (latestMessage.content) {
const prompt =
typeof latestMessage.content === "string"
? latestMessage.content
: latestMessage.content[0].text;
// Retrieve the result
console.debug("toolRetrievalQuery", latestMessage.content);
const retrievalResult = await executeOnMain(
NODE,
"toolRetrievalQueryResult",
prompt,
);
// Update the message content
// Using the retrieval template with the result and query
if (data.thread?.assistants[0].tools)
data.messages[data.messages.length - 1].content =
data.thread.assistants[0].tools[0].settings?.retrieval_template
?.replace("{CONTEXT}", retrievalResult)
.replace("{QUESTION}", prompt);
}
// Filter out all the messages that are not text
data.messages = data.messages.map((message) => {
if (
message.content &&
typeof message.content !== "string" &&
(message.content.length ?? 0) > 0
) {
return {
...message,
content: [message.content[0]],
};
}
return message;
});
// Reroute the result to inference engine
const output = {
...data,
model: {
...data.model,
engine: data.model.proxyEngine,
},
};
events.emit(MessageEvent.OnMessageSent, output);
}
/**
@ -18,15 +154,21 @@ export default class JanAssistantExtension extends AssistantExtension {
onUnload(): void {}
async createAssistant(assistant: Assistant): Promise<void> {
const assistantDir = join(JanAssistantExtension._homeDir, assistant.id);
const assistantDir = await joinPath([
JanAssistantExtension._homeDir,
assistant.id,
]);
if (!(await fs.existsSync(assistantDir))) await fs.mkdirSync(assistantDir);
// store the assistant metadata json
const assistantMetadataPath = join(assistantDir, "assistant.json");
const assistantMetadataPath = await joinPath([
assistantDir,
"assistant.json",
]);
try {
await fs.writeFileSync(
assistantMetadataPath,
JSON.stringify(assistant, null, 2)
JSON.stringify(assistant, null, 2),
);
} catch (err) {
console.error(err);
@ -38,14 +180,17 @@ export default class JanAssistantExtension extends AssistantExtension {
// get all the assistant metadata json
const results: Assistant[] = [];
const allFileName: string[] = await fs.readdirSync(
JanAssistantExtension._homeDir
JanAssistantExtension._homeDir,
);
for (const fileName of allFileName) {
const filePath = join(JanAssistantExtension._homeDir, fileName);
const filePath = await joinPath([
JanAssistantExtension._homeDir,
fileName,
]);
if (filePath.includes(".DS_Store")) continue;
const jsonFiles: string[] = (await fs.readdirSync(filePath)).filter(
(file: string) => file === "assistant.json"
(file: string) => file === "assistant.json",
);
if (jsonFiles.length !== 1) {
@ -54,8 +199,8 @@ export default class JanAssistantExtension extends AssistantExtension {
}
const content = await fs.readFileSync(
join(filePath, jsonFiles[0]),
"utf-8"
await joinPath([filePath, jsonFiles[0]]),
"utf-8",
);
const assistant: Assistant =
typeof content === "object" ? content : JSON.parse(content);
@ -72,7 +217,10 @@ export default class JanAssistantExtension extends AssistantExtension {
}
// remove the directory
const assistantDir = join(JanAssistantExtension._homeDir, assistant.id);
const assistantDir = await joinPath([
JanAssistantExtension._homeDir,
assistant.id,
]);
await fs.rmdirSync(assistantDir);
return Promise.resolve();
}
@ -88,7 +236,24 @@ export default class JanAssistantExtension extends AssistantExtension {
description: "A default assistant that can use all downloaded models",
model: "*",
instructions: "",
tools: undefined,
tools: [
{
type: "retrieval",
enabled: false,
settings: {
top_k: 2,
chunk_size: 1024,
chunk_overlap: 64,
retrieval_template: `Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.
----------------
CONTEXT: {CONTEXT}
----------------
QUESTION: {QUESTION}
----------------
Helpful Answer:`,
},
},
],
file_ids: [],
metadata: undefined,
};
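
A sketch of how the retrieval_template above is filled in at query time, following handleMessageRequest in this file (assumes an assistant object shaped like the one created here; the context and question values are illustrative):

const tool = assistant.tools?.[0];       // the retrieval tool defined above
const retrievalResult = "…chunks returned by toolRetrievalQueryResult…";
const prompt = "What does the report conclude?";
const finalPrompt = tool?.settings?.retrieval_template
  ?.replace("{CONTEXT}", retrievalResult)
  .replace("{QUESTION}", prompt);
// finalPrompt replaces the latest message content before the request is
// re-emitted to the proxy engine.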

View File

@ -0,0 +1,13 @@
import fs from "fs";
import path from "path";
import { getJanDataFolderPath } from "@janhq/core/node";
// Sec: Do not send engine settings over requests
// Read it manually instead
export const readEmbeddingEngine = (engineName: string) => {
const engineSettings = fs.readFileSync(
path.join(getJanDataFolderPath(), "engines", `${engineName}.json`),
"utf-8",
);
return JSON.parse(engineSettings);
};
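
A small usage sketch (the engine name follows the engines/<name>.json convention used here; the file's exact shape is assumed):

// Reads <Jan data folder>/engines/openai.json; api_key is the field
// updateEmbeddingEngine falls back to for non-nitro engines.
const settings = readEmbeddingEngine("openai");
const hasKey = typeof settings.api_key === "string" && settings.api_key.length > 0;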

View File

@ -0,0 +1,39 @@
import { getJanDataFolderPath, normalizeFilePath } from "@janhq/core/node";
import { Retrieval } from "./tools/retrieval";
import path from "path";
const retrieval = new Retrieval();
export async function toolRetrievalUpdateTextSplitter(
chunkSize: number,
chunkOverlap: number,
) {
retrieval.updateTextSplitter(chunkSize, chunkOverlap);
return Promise.resolve();
}
export async function toolRetrievalIngestNewDocument(
file: string,
engine: string,
) {
const filePath = path.join(getJanDataFolderPath(), normalizeFilePath(file));
const threadPath = path.dirname(filePath.replace("files", ""));
retrieval.updateEmbeddingEngine(engine);
await retrieval.ingestAgentKnowledge(filePath, `${threadPath}/memory`);
return Promise.resolve();
}
export async function toolRetrievalLoadThreadMemory(threadId: string) {
try {
await retrieval.loadRetrievalAgent(
path.join(getJanDataFolderPath(), "threads", threadId, "memory"),
);
return Promise.resolve();
} catch (err) {
console.debug(err);
}
}
export async function toolRetrievalQueryResult(query: string) {
const res = await retrieval.generateResult(query);
return Promise.resolve(res);
}

View File

@ -0,0 +1,78 @@
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { formatDocumentsAsString } from "langchain/util/document";
import { PDFLoader } from "langchain/document_loaders/fs/pdf";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { readEmbeddingEngine } from "../../engine";
export class Retrieval {
public chunkSize: number = 100;
public chunkOverlap?: number = 0;
private retriever: any;
private embeddingModel: any = undefined;
private textSplitter?: RecursiveCharacterTextSplitter;
constructor(chunkSize: number = 4000, chunkOverlap: number = 200) {
this.updateTextSplitter(chunkSize, chunkOverlap);
this.embeddingModel = new OpenAIEmbeddings({});
}
public updateTextSplitter(chunkSize: number, chunkOverlap: number): void {
this.chunkSize = chunkSize;
this.chunkOverlap = chunkOverlap;
this.textSplitter = new RecursiveCharacterTextSplitter({
chunkSize: chunkSize,
chunkOverlap: chunkOverlap,
});
}
public updateEmbeddingEngine(engine: string): void {
// Engine settings are not compatible with the current embedding model params
// Switch case manually for now
const settings = readEmbeddingEngine(engine);
if (engine === "nitro") {
this.embeddingModel = new OpenAIEmbeddings(
{ openAIApiKey: "nitro-embedding" },
{ basePath: "http://127.0.0.1:3928/v1" },
);
} else {
// Fallback to OpenAI Settings
this.embeddingModel = new OpenAIEmbeddings({
configuration: {
apiKey: settings.api_key,
},
});
}
}
public ingestAgentKnowledge = async (
filePath: string,
memoryPath: string,
): Promise<any> => {
const loader = new PDFLoader(filePath, {
splitPages: true,
});
const doc = await loader.load();
const docs = await this.textSplitter!.splitDocuments(doc);
const vectorStore = await HNSWLib.fromDocuments(docs, this.embeddingModel);
return vectorStore.save(memoryPath);
};
public loadRetrievalAgent = async (memoryPath: string): Promise<void> => {
const vectorStore = await HNSWLib.load(memoryPath, this.embeddingModel);
this.retriever = vectorStore.asRetriever(2);
return Promise.resolve();
};
public generateResult = async (query: string): Promise<string> => {
if (!this.retriever) {
return Promise.resolve(" ");
}
const relevantDocs = await this.retriever.getRelevantDocuments(query);
const serializedDoc = formatDocumentsAsString(relevantDocs);
return Promise.resolve(serializedDoc);
};
}
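
An end-to-end sketch of the lifecycle the node entry points above drive (paths are illustrative; run inside an async context):

const retrieval = new Retrieval(1024, 64);
retrieval.updateEmbeddingEngine("nitro");  // or "openai", using the stored api_key
await retrieval.ingestAgentKnowledge("/jan/threads/<id>/files/doc.pdf", "/jan/threads/<id>/memory");
await retrieval.loadRetrievalAgent("/jan/threads/<id>/memory");
const context = await retrieval.generateResult("What is the document about?");
// `context` holds the serialized top-k chunks, ready to be spliced into retrieval_template.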

View File

@ -1,14 +1,20 @@
{
"compilerOptions": {
"target": "es2016",
"module": "ES6",
"moduleResolution": "node",
"outDir": "./dist",
"esModuleInterop": true,
"forceConsistentCasingInFileNames": true,
"strict": false,
"target": "es5",
"module": "ES2020",
"lib": ["es2015", "es2016", "es2017", "dom"],
"strict": true,
"sourceMap": true,
"declaration": true,
"allowSyntheticDefaultImports": true,
"experimentalDecorators": true,
"emitDecoratorMetadata": true,
"declarationDir": "dist/types",
"outDir": "dist",
"importHelpers": true,
"typeRoots": ["node_modules/@types"],
"skipLibCheck": true,
"rootDir": "./src"
},
"include": ["./src"]
"include": ["src"],
}

View File

@ -1,38 +0,0 @@
const path = require("path");
const webpack = require("webpack");
const packageJson = require("./package.json");
module.exports = {
experiments: { outputModule: true },
entry: "./src/index.ts", // Adjust the entry point to match your project's main file
mode: "production",
module: {
rules: [
{
test: /\.tsx?$/,
use: "ts-loader",
exclude: /node_modules/,
},
],
},
output: {
filename: "index.js", // Adjust the output file name as needed
path: path.resolve(__dirname, "dist"),
library: { type: "module" }, // Specify ESM output format
},
plugins: [
new webpack.DefinePlugin({
MODULE: JSON.stringify(`${packageJson.name}/${packageJson.module}`),
}),
],
resolve: {
extensions: [".ts", ".js"],
fallback: {
path: require.resolve("path-browserify"),
},
},
optimization: {
minimize: false,
},
// Add loaders and other configuration as needed for your project
};

View File

@ -4,15 +4,14 @@ import {
ConversationalExtension,
Thread,
ThreadMessage,
events,
} from '@janhq/core'
/**
* JSONConversationalExtension is a ConversationalExtension implementation that provides
* functionality for managing threads.
*/
export default class JSONConversationalExtension
extends ConversationalExtension
{
export default class JSONConversationalExtension extends ConversationalExtension {
private static readonly _homeDir = 'file://threads'
private static readonly _threadInfoFileName = 'thread.json'
private static readonly _threadMessagesFileName = 'messages.jsonl'
@ -119,6 +118,32 @@ export default class JSONConversationalExtension
])
if (!(await fs.existsSync(threadDirPath)))
await fs.mkdirSync(threadDirPath)
if (message.content[0].type === 'image') {
const filesPath = await joinPath([threadDirPath, 'files'])
if (!(await fs.existsSync(filesPath))) await fs.mkdirSync(filesPath)
const imagePath = await joinPath([filesPath, `${message.id}.png`])
const base64 = message.content[0].text.annotations[0]
await this.storeImage(base64, imagePath)
// if (fs.existsSync(imagePath)) {
// message.content[0].text.annotations[0] = imagePath
// }
}
if (message.content[0].type === 'pdf') {
const filesPath = await joinPath([threadDirPath, 'files'])
if (!(await fs.existsSync(filesPath))) await fs.mkdirSync(filesPath)
const filePath = await joinPath([filesPath, `${message.id}.pdf`])
const blob = message.content[0].text.annotations[0]
await this.storeFile(blob, filePath)
if (await fs.existsSync(filePath)) {
// Use file path instead of blob
message.content[0].text.annotations[0] = `threads/${message.thread_id}/files/${message.id}.pdf`
}
}
await fs.appendFileSync(threadMessagePath, JSON.stringify(message) + '\n')
Promise.resolve()
} catch (err) {
@ -126,6 +151,25 @@ export default class JSONConversationalExtension
}
}
async storeImage(base64: string, filePath: string): Promise<void> {
const base64Data = base64.replace(/^data:image\/\w+;base64,/, '')
try {
await fs.writeBlob(filePath, base64Data)
} catch (err) {
console.error(err)
}
}
async storeFile(base64: string, filePath: string): Promise<void> {
const base64Data = base64.replace(/^data:application\/pdf;base64,/, '')
try {
await fs.writeBlob(filePath, base64Data)
} catch (err) {
console.error(err)
}
}
async writeMessages(
threadId: string,
messages: ThreadMessage[]
@ -229,7 +273,11 @@ export default class JSONConversationalExtension
const messages: ThreadMessage[] = []
result.forEach((line: string) => {
messages.push(JSON.parse(line) as ThreadMessage)
try {
messages.push(JSON.parse(line) as ThreadMessage)
} catch (err) {
console.error(err)
}
})
return messages
} catch (err) {

View File

@ -1 +1 @@
0.2.12
0.2.14

View File

@ -40,6 +40,7 @@
"dependencies": {
"@janhq/core": "file:../../core",
"@rollup/plugin-replace": "^5.0.5",
"@types/os-utils": "^0.0.4",
"fetch-retry": "^5.0.6",
"path-browserify": "^1.0.1",
"rxjs": "^7.8.1",

View File

@ -50,7 +50,7 @@ export default class JanInferenceNitroExtension extends InferenceExtension {
ngl: 100,
cpu_threads: 1,
cont_batching: false,
embedding: false,
embedding: true,
};
controller = new AbortController();
@ -83,19 +83,19 @@ export default class JanInferenceNitroExtension extends InferenceExtension {
// Events subscription
events.on(MessageEvent.OnMessageSent, (data: MessageRequest) =>
this.onMessageRequest(data)
this.onMessageRequest(data),
);
events.on(ModelEvent.OnModelInit, (model: Model) =>
this.onModelInit(model)
this.onModelInit(model),
);
events.on(ModelEvent.OnModelStop, (model: Model) =>
this.onModelStop(model)
this.onModelStop(model),
);
events.on(InferenceEvent.OnInferenceStopped, () =>
this.onInferenceStopped()
this.onInferenceStopped(),
);
// Attempt to fetch nvidia info
@ -120,7 +120,7 @@ export default class JanInferenceNitroExtension extends InferenceExtension {
} else {
await fs.writeFileSync(
engineFile,
JSON.stringify(this._engineSettings, null, 2)
JSON.stringify(this._engineSettings, null, 2),
);
}
} catch (err) {
@ -148,7 +148,7 @@ export default class JanInferenceNitroExtension extends InferenceExtension {
this.getNitroProcesHealthIntervalId = setInterval(
() => this.periodicallyGetNitroHealth(),
JanInferenceNitroExtension._intervalHealthCheck
JanInferenceNitroExtension._intervalHealthCheck,
);
}

View File

@ -78,7 +78,7 @@ function stopModel(): Promise<void> {
* TODO: Should pass absolute of the model file instead of just the name - So we can modurize the module.ts to npm package
*/
async function runModel(
wrapper: ModelInitOptions
wrapper: ModelInitOptions,
): Promise<ModelOperationResponse | void> {
if (wrapper.model.engine !== InferenceEngine.nitro) {
// Not a nitro model
@ -96,7 +96,7 @@ async function runModel(
const ggufBinFile = files.find(
(file) =>
file === path.basename(currentModelFile) ||
file.toLowerCase().includes(SUPPORTED_MODEL_FORMAT)
file.toLowerCase().includes(SUPPORTED_MODEL_FORMAT),
);
if (!ggufBinFile) return Promise.reject("No GGUF model file found");
@ -133,7 +133,6 @@ async function runModel(
mmproj: path.join(modelFolderPath, wrapper.model.settings.mmproj),
}),
};
console.log(currentSettings);
return runNitroAndLoadModel();
}
}
@ -192,10 +191,10 @@ function promptTemplateConverter(promptTemplate: string): PromptTemplate {
const system_prompt = promptTemplate.substring(0, systemIndex);
const user_prompt = promptTemplate.substring(
systemIndex + systemMarker.length,
promptIndex
promptIndex,
);
const ai_prompt = promptTemplate.substring(
promptIndex + promptMarker.length
promptIndex + promptMarker.length,
);
// Return the split parts
@ -205,7 +204,7 @@ function promptTemplateConverter(promptTemplate: string): PromptTemplate {
const promptIndex = promptTemplate.indexOf(promptMarker);
const user_prompt = promptTemplate.substring(0, promptIndex);
const ai_prompt = promptTemplate.substring(
promptIndex + promptMarker.length
promptIndex + promptMarker.length,
);
// Return the split parts
@ -234,8 +233,8 @@ function loadLLMModel(settings: any): Promise<Response> {
.then((res) => {
log(
`[NITRO]::Debug: Load model success with response ${JSON.stringify(
res
)}`
res,
)}`,
);
return Promise.resolve(res);
})
@ -264,8 +263,8 @@ async function validateModelStatus(): Promise<void> {
}).then(async (res: Response) => {
log(
`[NITRO]::Debug: Validate model state success with response ${JSON.stringify(
res
)}`
res,
)}`,
);
// If the response is OK, check model_loaded status.
if (res.ok) {
@ -316,7 +315,7 @@ function spawnNitroProcess(): Promise<any> {
const args: string[] = ["1", LOCAL_HOST, PORT.toString()];
// Execute the binary
log(
`[NITRO]::Debug: Spawn nitro at path: ${executableOptions.executablePath}, and args: ${args}`
`[NITRO]::Debug: Spawn nitro at path: ${executableOptions.executablePath}, and args: ${args}`,
);
subprocess = spawn(
executableOptions.executablePath,
@ -327,7 +326,7 @@ function spawnNitroProcess(): Promise<any> {
...process.env,
CUDA_VISIBLE_DEVICES: executableOptions.cudaVisibleDevices,
},
}
},
);
// Handle subprocess output

View File

@ -15,6 +15,7 @@ import {
ThreadMessage,
events,
fs,
InferenceEngine,
BaseExtension,
MessageEvent,
ModelEvent,
@ -57,7 +58,7 @@ export default class JanInferenceOpenAIExtension extends BaseExtension {
// Events subscription
events.on(MessageEvent.OnMessageSent, (data) =>
JanInferenceOpenAIExtension.handleMessageRequest(data, this)
JanInferenceOpenAIExtension.handleMessageRequest(data, this),
);
events.on(ModelEvent.OnModelInit, (model: OpenAIModel) => {
@ -81,7 +82,7 @@ export default class JanInferenceOpenAIExtension extends BaseExtension {
try {
const engineFile = join(
JanInferenceOpenAIExtension._homeDir,
JanInferenceOpenAIExtension._engineMetadataFileName
JanInferenceOpenAIExtension._engineMetadataFileName,
);
if (await fs.existsSync(engineFile)) {
const engine = await fs.readFileSync(engineFile, "utf-8");
@ -90,7 +91,7 @@ export default class JanInferenceOpenAIExtension extends BaseExtension {
} else {
await fs.writeFileSync(
engineFile,
JSON.stringify(JanInferenceOpenAIExtension._engineSettings, null, 2)
JSON.stringify(JanInferenceOpenAIExtension._engineSettings, null, 2),
);
}
} catch (err) {
@ -98,7 +99,7 @@ export default class JanInferenceOpenAIExtension extends BaseExtension {
}
}
private static async handleModelInit(model: OpenAIModel) {
if (model.engine !== "openai") {
if (model.engine !== InferenceEngine.openai) {
return;
} else {
JanInferenceOpenAIExtension._currentModel = model;
@ -116,7 +117,7 @@ export default class JanInferenceOpenAIExtension extends BaseExtension {
}
private static async handleInferenceStopped(
instance: JanInferenceOpenAIExtension
instance: JanInferenceOpenAIExtension,
) {
instance.isCancelled = true;
instance.controller?.abort();
@ -130,7 +131,7 @@ export default class JanInferenceOpenAIExtension extends BaseExtension {
*/
private static async handleMessageRequest(
data: MessageRequest,
instance: JanInferenceOpenAIExtension
instance: JanInferenceOpenAIExtension,
) {
if (data.model.engine !== "openai") {
return;
@ -160,7 +161,7 @@ export default class JanInferenceOpenAIExtension extends BaseExtension {
...JanInferenceOpenAIExtension._currentModel,
parameters: data.model.parameters,
},
instance.controller
instance.controller,
).subscribe({
next: (content) => {
const messageContent: ThreadContent = {

View File

@ -3,13 +3,12 @@
"target": "es2016",
"module": "ES6",
"moduleResolution": "node",
"outDir": "./dist",
"esModuleInterop": true,
"forceConsistentCasingInFileNames": true,
"strict": false,
"skipLibCheck": true,
"rootDir": "./src"
"rootDir": "./src",
},
"include": ["./src"]
"include": ["./src"],
}

View File

@ -3,13 +3,12 @@
"target": "es2016",
"module": "ES6",
"moduleResolution": "node",
"outDir": "./dist",
"esModuleInterop": true,
"forceConsistentCasingInFileNames": true,
"strict": false,
"skipLibCheck": true,
"rootDir": "./src"
"rootDir": "./src",
},
"include": ["./src"]
"include": ["./src"],
}

View File

@ -17,6 +17,7 @@ import { activeThreadAtom } from '@/helpers/atoms/Thread.atom'
interface Props {
children: ReactNode
rightAction?: ReactNode
title: string
asChild?: boolean
hideMoreVerticalAction?: boolean
@ -25,6 +26,7 @@ export default function CardSidebar({
children,
title,
asChild,
rightAction,
hideMoreVerticalAction,
}: Props) {
const [show, setShow] = useState(true)
@ -53,27 +55,16 @@ export default function CardSidebar({
<div
className={twMerge(
'relative flex items-center justify-between pl-4',
show && 'border-b border-border'
show && children && 'border-b border-border'
)}
>
<span className="font-bold">{title}</span>
<div className="flex">
{!asChild && (
<>
{!hideMoreVerticalAction && (
<div
ref={setToggle}
className="cursor-pointer rounded-lg bg-zinc-100 p-2 pr-0 dark:bg-zinc-900"
onClick={() => setMore(!more)}
>
<MoreVerticalIcon className="h-5 w-5" />
</div>
)}
</>
)}
<div className="flex items-center ">
<button
onClick={() => setShow(!show)}
className="flex w-full flex-1 items-center space-x-2 rounded-lg bg-zinc-100 px-3 py-2 dark:bg-zinc-900"
onClick={() => {
if (!children) return
setShow(!show)
}}
className="flex w-full flex-1 items-center space-x-2 rounded-lg bg-zinc-100 py-2 pr-2 dark:bg-zinc-900"
>
<ChevronDownIcon
className={twMerge(
@ -82,6 +73,23 @@ export default function CardSidebar({
)}
/>
</button>
<span className="font-bold">{title}</span>
</div>
<div className="flex">
{rightAction && rightAction}
{!asChild && (
<>
{!hideMoreVerticalAction && (
<div
ref={setToggle}
className="cursor-pointer rounded-lg bg-zinc-100 p-2 px-3 dark:bg-zinc-900"
onClick={() => setMore(!more)}
>
<MoreVerticalIcon className="h-5 w-5" />
</div>
)}
</>
)}
</div>
{more && (

View File

@ -9,54 +9,26 @@ import {
TooltipTrigger,
} from '@janhq/uikit'
import { useAtomValue, useSetAtom } from 'jotai'
import { InfoIcon } from 'lucide-react'
import { useActiveModel } from '@/hooks/useActiveModel'
import useUpdateModelParameters from '@/hooks/useUpdateModelParameters'
import { getConfigurationsData } from '@/utils/componentSettings'
import { toSettingParams } from '@/utils/modelParam'
import { serverEnabledAtom } from '@/helpers/atoms/LocalServer.atom'
import {
engineParamsUpdateAtom,
getActiveThreadIdAtom,
getActiveThreadModelParamsAtom,
} from '@/helpers/atoms/Thread.atom'
type Props = {
name: string
title: string
enabled?: boolean
description: string
checked: boolean
onValueChanged?: (e: string | number | boolean) => void
}
const Checkbox: React.FC<Props> = ({ name, title, checked, description }) => {
const { updateModelParameter } = useUpdateModelParameters()
const threadId = useAtomValue(getActiveThreadIdAtom)
const activeModelParams = useAtomValue(getActiveThreadModelParamsAtom)
const modelSettingParams = toSettingParams(activeModelParams)
const engineParams = getConfigurationsData(modelSettingParams)
const setEngineParamsUpdate = useSetAtom(engineParamsUpdateAtom)
const serverEnabled = useAtomValue(serverEnabledAtom)
const { stopModel } = useActiveModel()
const Checkbox: React.FC<Props> = ({
title,
checked,
enabled = true,
description,
onValueChanged,
}) => {
const onCheckedChange = (checked: boolean) => {
if (!threadId) return
if (engineParams.some((x) => x.name.includes(name))) {
setEngineParamsUpdate(true)
stopModel()
} else {
setEngineParamsUpdate(false)
}
updateModelParameter(threadId, name, checked)
onValueChanged?.(checked)
}
return (
@ -80,7 +52,7 @@ const Checkbox: React.FC<Props> = ({ name, title, checked, description }) => {
<Switch
checked={checked}
onCheckedChange={onCheckedChange}
disabled={serverEnabled}
disabled={!enabled}
/>
</div>
)

View File

@ -120,13 +120,13 @@ const TopBar = () => {
</span>
</div>
</div>
<div
className={twMerge(
'absolute right-0 h-full w-80',
showing && 'border-l border-border'
)}
>
{activeThread && (
{activeThread && (
<div
className={twMerge(
'absolute right-0 h-full w-80',
showing && 'border-l border-border'
)}
>
<div className="flex h-full w-52 items-center justify-between px-4">
{showing && (
<div className="relative flex h-full items-center">
@ -227,8 +227,8 @@ const TopBar = () => {
/>
</div>
</div>
)}
</div>
</div>
)}
</div>
)}
<CommandSearch />

View File

@ -0,0 +1,39 @@
import React, { useEffect, useState } from 'react'
export default function GenerateResponse() {
const [loader, setLoader] = useState(0)
// This is fake loader please fix this when we have realtime percentage when load model
useEffect(() => {
if (loader === 24) {
setTimeout(() => {
setLoader(loader + 1)
}, 250)
} else if (loader === 50) {
setTimeout(() => {
setLoader(loader + 1)
}, 250)
} else if (loader === 78) {
setTimeout(() => {
setLoader(loader + 1)
}, 250)
} else if (loader === 85) {
setLoader(85)
} else {
setLoader(loader + 1)
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [loader])
return (
<div className=" mb-1 mt-2 py-2 text-center">
<div className="relative inline-block overflow-hidden rounded-lg border border-neutral-50 bg-gray-50 px-4 py-2 font-semibold text-gray-600 shadow-lg">
<div
className="absolute left-0 top-0 h-full bg-gray-200"
style={{ width: `${loader}%` }}
/>
<span className="relative z-10">Generating response...</span>
</div>
</div>
)
}

View File

@ -7,65 +7,26 @@ import {
TooltipTrigger,
} from '@janhq/uikit'
import { useAtomValue, useSetAtom } from 'jotai'
import { InfoIcon } from 'lucide-react'
import { useActiveModel } from '@/hooks/useActiveModel'
import useUpdateModelParameters from '@/hooks/useUpdateModelParameters'
import { getConfigurationsData } from '@/utils/componentSettings'
import { toSettingParams } from '@/utils/modelParam'
import { serverEnabledAtom } from '@/helpers/atoms/LocalServer.atom'
import {
engineParamsUpdateAtom,
getActiveThreadIdAtom,
getActiveThreadModelParamsAtom,
} from '@/helpers/atoms/Thread.atom'
type Props = {
title: string
enabled?: boolean
name: string
description: string
placeholder: string
value: string
onValueChanged?: (e: string | number | boolean) => void
}
const ModelConfigInput: React.FC<Props> = ({
title,
name,
enabled = true,
value,
description,
placeholder,
onValueChanged,
}) => {
const { updateModelParameter } = useUpdateModelParameters()
const threadId = useAtomValue(getActiveThreadIdAtom)
const activeModelParams = useAtomValue(getActiveThreadModelParamsAtom)
const modelSettingParams = toSettingParams(activeModelParams)
const engineParams = getConfigurationsData(modelSettingParams)
const setEngineParamsUpdate = useSetAtom(engineParamsUpdateAtom)
const { stopModel } = useActiveModel()
const serverEnabled = useAtomValue(serverEnabledAtom)
const onValueChanged = (e: React.ChangeEvent<HTMLTextAreaElement>) => {
if (!threadId) return
if (engineParams.some((x) => x.name.includes(name))) {
setEngineParamsUpdate(true)
stopModel()
} else {
setEngineParamsUpdate(false)
}
updateModelParameter(threadId, name, e.target.value)
}
return (
<div className="flex flex-col">
<div className="mb-2 flex items-center gap-x-2">
@ -86,9 +47,9 @@ const ModelConfigInput: React.FC<Props> = ({
</div>
<Textarea
placeholder={placeholder}
onChange={onValueChanged}
onChange={(e) => onValueChanged?.(e.target.value)}
value={value}
disabled={serverEnabled}
disabled={!enabled}
/>
</div>
)

View File

@ -22,6 +22,7 @@ import { extensionManager } from '@/extension'
import {
addNewMessageAtom,
updateMessageAtom,
generateResponseAtom,
} from '@/helpers/atoms/ChatMessage.atom'
import {
updateThreadWaitingForResponseAtom,
@ -34,6 +35,7 @@ export default function EventHandler({ children }: { children: ReactNode }) {
const { downloadedModels } = useGetDownloadedModels()
const setActiveModel = useSetAtom(activeModelAtom)
const setStateModel = useSetAtom(stateModelAtom)
const setGenerateResponse = useSetAtom(generateResponseAtom)
const updateThreadWaiting = useSetAtom(updateThreadWaitingForResponseAtom)
const threads = useAtomValue(threadsAtom)
@ -50,6 +52,7 @@ export default function EventHandler({ children }: { children: ReactNode }) {
const onNewMessageResponse = useCallback(
(message: ThreadMessage) => {
setGenerateResponse(false)
addNewMessage(message)
},
[addNewMessage]
@ -93,6 +96,7 @@ export default function EventHandler({ children }: { children: ReactNode }) {
const onMessageResponseUpdate = useCallback(
(message: ThreadMessage) => {
setGenerateResponse(false)
updateMessage(
message.id,
message.thread_id,
@ -102,7 +106,6 @@ export default function EventHandler({ children }: { children: ReactNode }) {
if (message.status === MessageStatus.Pending) {
return
}
// Mark the thread as not waiting for response
updateThreadWaiting(message.thread_id, false)

View File

@ -9,9 +9,17 @@ type Props = {
}
export const currentPromptAtom = atom<string>('')
export const fileUploadAtom = atom<FileInfo[]>([])
export const appDownloadProgress = atom<number>(-1)
export const searchAtom = atom<string>('')
export default function JotaiWrapper({ children }: Props) {
return <Provider>{children}</Provider>
}
export type FileType = 'image' | 'pdf'
export type FileInfo = {
file: File
type: FileType
}
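
A sketch of how a dropped file could feed this atom before sending (the handler name is hypothetical; setFileUpload comes from useSetAtom(fileUploadAtom)):

const onFileDrop = (file: File) => {
  // Hypothetical handler: classify the dropped file and stage it for the next message.
  const type: FileType = file.type === 'application/pdf' ? 'pdf' : 'image'
  setFileUpload([{ file, type }])
}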

View File

@ -1,6 +1,6 @@
export default function ShortCut(props: { menu: string }) {
const { menu } = props
const symbol = isMac ? '⌘' : 'Ctrl'
const symbol = isMac ? '⌘' : 'Ctrl + '
return (
<div className="inline-flex items-center justify-center rounded-full bg-secondary px-1 py-0.5 text-xs font-bold text-muted-foreground">

View File

@ -9,74 +9,36 @@ import {
TooltipPortal,
TooltipTrigger,
} from '@janhq/uikit'
import { useAtomValue, useSetAtom } from 'jotai'
import { InfoIcon } from 'lucide-react'
import { useActiveModel } from '@/hooks/useActiveModel'
import { useClickOutside } from '@/hooks/useClickOutside'
import useUpdateModelParameters from '@/hooks/useUpdateModelParameters'
import { getConfigurationsData } from '@/utils/componentSettings'
import { toSettingParams } from '@/utils/modelParam'
import { serverEnabledAtom } from '@/helpers/atoms/LocalServer.atom'
import {
engineParamsUpdateAtom,
getActiveThreadIdAtom,
getActiveThreadModelParamsAtom,
} from '@/helpers/atoms/Thread.atom'
type Props = {
name: string
title: string
enabled: boolean
description: string
min: number
max: number
step: number
value: number
onValueChanged: (e: string | number | boolean) => void
}
const SliderRightPanel: React.FC<Props> = ({
name,
title,
enabled,
min,
max,
step,
description,
value,
onValueChanged,
}) => {
const { updateModelParameter } = useUpdateModelParameters()
const threadId = useAtomValue(getActiveThreadIdAtom)
const serverEnabled = useAtomValue(serverEnabledAtom)
const activeModelParams = useAtomValue(getActiveThreadModelParamsAtom)
const modelSettingParams = toSettingParams(activeModelParams)
const engineParams = getConfigurationsData(modelSettingParams)
const setEngineParamsUpdate = useSetAtom(engineParamsUpdateAtom)
const { stopModel } = useActiveModel()
const [showTooltip, setShowTooltip] = useState({ max: false, min: false })
useClickOutside(() => setShowTooltip({ max: false, min: false }), null, [])
const onValueChanged = (e: number[]) => {
if (!threadId) return
if (engineParams.some((x) => x.name.includes(name))) {
setEngineParamsUpdate(true)
stopModel()
} else {
setEngineParamsUpdate(false)
}
updateModelParameter(threadId, name, e[0])
}
return (
<div className="flex flex-col">
<div className="mb-3 flex items-center gap-x-2">
@ -99,11 +61,11 @@ const SliderRightPanel: React.FC<Props> = ({
<div className="relative w-full">
<Slider
value={[value]}
onValueChange={onValueChanged}
onValueChange={(e) => onValueChanged?.(e[0])}
min={min}
max={max}
step={step}
disabled={serverEnabled}
disabled={!enabled}
/>
<div className="relative mt-2 flex items-center justify-between text-gray-400">
<p className="text-sm">{min}</p>
@ -118,18 +80,18 @@ const SliderRightPanel: React.FC<Props> = ({
min={min}
max={max}
value={String(value)}
disabled={serverEnabled}
disabled={!enabled}
onBlur={(e) => {
if (Number(e.target.value) > Number(max)) {
onValueChanged([Number(max)])
onValueChanged?.(Number(max))
setShowTooltip({ max: true, min: false })
} else if (Number(e.target.value) < Number(min)) {
onValueChanged([Number(min)])
onValueChanged?.(Number(min))
setShowTooltip({ max: false, min: true })
}
}}
onChange={(e) => {
onValueChanged([Number(e.target.value)])
onValueChanged?.(Number(e.target.value))
}}
/>
</TooltipTrigger>

View File

@ -14,6 +14,8 @@ import {
/**
* Stores all chat messages for all threads
*/
export const generateResponseAtom = atom<boolean>(false)
export const chatMessages = atom<Record<string, ThreadMessage[]>>({})
/**

View File

@ -7,7 +7,9 @@ import {
ThreadState,
Model,
} from '@janhq/core'
import { atom, useAtomValue, useSetAtom } from 'jotai'
import { atom, useAtom, useAtomValue, useSetAtom } from 'jotai'
import { fileUploadAtom } from '@/containers/Providers/Jotai'
import { generateThreadId } from '@/utils/thread'
@ -46,7 +48,7 @@ export const useCreateNewThread = () => {
const createNewThread = useSetAtom(createNewThreadAtom)
const setActiveThreadId = useSetAtom(setActiveThreadIdAtom)
const updateThread = useSetAtom(updateThreadAtom)
const [fileUpload, setFileUpload] = useAtom(fileUploadAtom)
const { deleteThread } = useDeleteThread()
const requestCreateNewThread = async (
@ -72,6 +74,7 @@ export const useCreateNewThread = () => {
const assistantInfo: ThreadAssistantInfo = {
assistant_id: assistant.id,
assistant_name: assistant.name,
tools: assistant.tools,
model: {
id: modelId,
settings: {},
@ -93,6 +96,9 @@ export const useCreateNewThread = () => {
// add the new thread on top of the thread list to the state
createNewThread(thread)
setActiveThreadId(thread.id)
// Delete the file upload state
setFileUpload([])
}
function updateThreadMetadata(thread: Thread) {

View File

@ -76,8 +76,25 @@ export const usePath = () => {
openFileExplorer(fullPath)
}
const onViewFile = async (id: string) => {
if (!activeThread) return
const activeThreadState = threadStates[activeThread.id]
if (!activeThreadState.isFinishInit) {
alert('Thread is not started yet')
return
}
const userSpace = await getJanDataFolderPath()
let filePath = undefined
filePath = await joinPath(['threads', `${activeThread.id}/files`, `${id}`])
if (!filePath) return
const fullPath = await joinPath([userSpace, filePath])
openFileExplorer(fullPath)
}
return {
onReviewInFinder,
onViewJson,
onViewFile,
}
}

View File

@ -1,3 +1,4 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
import { useEffect, useRef, useState } from 'react'
import {
@ -13,16 +14,20 @@ import {
Model,
ConversationalExtension,
MessageEvent,
InferenceEngine,
ChatCompletionMessageContentType,
AssistantTool,
} from '@janhq/core'
import { useAtom, useAtomValue, useSetAtom } from 'jotai'
import { ulid } from 'ulid'
import { selectedModelAtom } from '@/containers/DropdownListSidebar'
import { currentPromptAtom } from '@/containers/Providers/Jotai'
import { currentPromptAtom, fileUploadAtom } from '@/containers/Providers/Jotai'
import { toaster } from '@/containers/Toast'
import { getBase64 } from '@/utils/base64'
import { toRuntimeParams, toSettingParams } from '@/utils/modelParam'
import { useActiveModel } from './useActiveModel'
@ -30,6 +35,7 @@ import { useActiveModel } from './useActiveModel'
import { extensionManager } from '@/extension/ExtensionManager'
import {
addNewMessageAtom,
generateResponseAtom,
getCurrentChatMessagesAtom,
} from '@/helpers/atoms/ChatMessage.atom'
import {
@ -48,6 +54,7 @@ export default function useSendChatMessage() {
const updateThread = useSetAtom(updateThreadAtom)
const updateThreadWaiting = useSetAtom(updateThreadWaitingForResponseAtom)
const [currentPrompt, setCurrentPrompt] = useAtom(currentPromptAtom)
const setGenerateResponse = useSetAtom(generateResponseAtom)
const currentMessages = useAtomValue(getCurrentChatMessagesAtom)
const { activeModel } = useActiveModel()
@ -64,6 +71,7 @@ export default function useSendChatMessage() {
const setEngineParamsUpdate = useSetAtom(engineParamsUpdateAtom)
const [reloadModel, setReloadModel] = useState(false)
const [fileUpload, setFileUpload] = useAtom(fileUploadAtom)
useEffect(() => {
modelRef.current = activeModel
@ -135,6 +143,8 @@ export default function useSendChatMessage() {
}
const sendChatMessage = async () => {
setGenerateResponse(true)
if (!currentPrompt || currentPrompt.trim().length === 0) return
if (!activeThread) {
@ -160,6 +170,7 @@ export default function useSendChatMessage() {
const assistantId = activeThread.assistants[0].assistant_id ?? ''
const assistantName = activeThread.assistants[0].assistant_name ?? ''
const instructions = activeThread.assistants[0].instructions ?? ''
const tools = activeThread.assistants[0].tools ?? []
const updatedThread: Thread = {
...activeThread,
@ -168,6 +179,7 @@ export default function useSendChatMessage() {
assistant_id: assistantId,
assistant_name: assistantName,
instructions: instructions,
tools: tools,
model: {
id: selectedModel.id,
settings: settingParams,
@ -190,6 +202,12 @@ export default function useSendChatMessage() {
const prompt = currentPrompt.trim()
setCurrentPrompt('')
const base64Blob = fileUpload[0]
? await getBase64(fileUpload[0].file).then()
: undefined
const msgId = ulid()
const messages: ChatCompletionMessage[] = [
activeThread.assistants[0]?.instructions,
]
@ -210,16 +228,41 @@ export default function useSendChatMessage() {
.concat([
{
role: ChatCompletionRole.User,
content: prompt,
content:
selectedModel && base64Blob
? [
{
type: ChatCompletionMessageContentType.Text,
text: prompt,
},
{
type: ChatCompletionMessageContentType.Doc,
doc_url: {
url: `threads/${activeThread.id}/files/${msgId}.pdf`,
},
},
]
: prompt,
} as ChatCompletionMessage,
])
)
const msgId = ulid()
const modelRequest = selectedModel ?? activeThread.assistants[0].model
let modelRequest = selectedModel ?? activeThread.assistants[0].model
if (runtimeParams.stream == null) {
runtimeParams.stream = true
}
// Add middleware to the model request with tool retrieval enabled
if (
activeThread.assistants[0].tools?.some(
(tool: AssistantTool) => tool.type === 'retrieval' && tool.enabled
)
) {
modelRequest = {
...modelRequest,
engine: InferenceEngine.tool_retrieval_enabled,
proxyEngine: modelRequest.engine,
}
}
const messageRequest: MessageRequest = {
id: msgId,
threadId: activeThread.id,
@ -229,8 +272,44 @@ export default function useSendChatMessage() {
settings: settingParams,
parameters: runtimeParams,
},
thread: activeThread,
}
const timestamp = Date.now()
const content: any = []
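// Build the visible thread message content from the attachment type: image and
// PDF uploads carry the prompt text plus the base64 payload as an annotation,
// while a plain prompt falls back to a single text part.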
if (base64Blob && fileUpload[0]?.type === 'image') {
content.push({
type: ContentType.Image,
text: {
value: prompt,
annotations: [base64Blob],
},
})
}
if (base64Blob && fileUpload[0]?.type === 'pdf') {
content.push({
type: ContentType.Pdf,
text: {
value: prompt,
annotations: [base64Blob],
name: fileUpload[0].file.name,
size: fileUpload[0].file.size,
},
})
}
if (prompt && !base64Blob) {
content.push({
type: ContentType.Text,
text: {
value: prompt,
annotations: [],
},
})
}
const threadMessage: ThreadMessage = {
id: msgId,
thread_id: activeThread.id,
@ -239,18 +318,13 @@ export default function useSendChatMessage() {
created: timestamp,
updated: timestamp,
object: 'thread.message',
content: content,
}
addNewMessage(threadMessage)
if (base64Blob) {
setFileUpload([])
}
await extensionManager
.get<ConversationalExtension>(ExtensionTypeEnum.Conversational)


@ -1,3 +1,5 @@
import { useEffect } from 'react'
import {
InferenceEvent,
ExtensionTypeEnum,


@ -24,9 +24,6 @@ const nextConfig = {
config.plugins = [
...config.plugins,
new webpack.DefinePlugin({
PLUGIN_CATALOG: JSON.stringify(
'https://cdn.jsdelivr.net/npm/@janhq/plugin-catalog@latest/dist/index.js'
),
VERSION: JSON.stringify(packageJson.version),
ANALYTICS_ID:
JSON.stringify(process.env.ANALYTICS_ID) ?? JSON.stringify('xxx'),


@ -8,6 +8,7 @@
"build": "next build",
"start": "next start",
"lint": "eslint .",
"lint:fix": "eslint . --fix",
"format": "prettier --write \"**/*.{js,jsx,ts,tsx}\"",
"compile": "tsc --noEmit -p . --pretty"
},
@ -32,6 +33,7 @@
"posthog-js": "^1.95.1",
"react": "18.2.0",
"react-dom": "18.2.0",
"react-dropzone": "^14.2.3",
"react-hook-form": "^7.47.0",
"react-hot-toast": "^2.4.1",
"react-icons": "^4.12.0",


@ -0,0 +1,78 @@
import { useAtomValue } from 'jotai'
import { useCreateNewThread } from '@/hooks/useCreateNewThread'
import SettingComponentBuilder, {
SettingComponentData,
} from '../ModelSetting/SettingComponent'
import { activeThreadAtom } from '@/helpers/atoms/Thread.atom'
const AssistantSetting = ({
componentData,
}: {
componentData: SettingComponentData[]
}) => {
const activeThread = useAtomValue(activeThreadAtom)
const { updateThreadMetadata } = useCreateNewThread()
return (
<div className="flex flex-col">
{activeThread && componentData && (
<SettingComponentBuilder
componentData={componentData}
updater={(_, name, value) => {
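// Keep chunk_overlap from exceeding chunk_size whenever either slider changes.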
if (
activeThread.assistants[0].tools &&
(name === 'chunk_overlap' || name === 'chunk_size')
) {
if (
activeThread.assistants[0].tools[0]?.settings.chunk_size <
activeThread.assistants[0].tools[0]?.settings.chunk_overlap
) {
activeThread.assistants[0].tools[0].settings.chunk_overlap =
activeThread.assistants[0].tools[0].settings.chunk_size
}
if (
name === 'chunk_size' &&
value <
activeThread.assistants[0].tools[0].settings.chunk_overlap
) {
activeThread.assistants[0].tools[0].settings.chunk_overlap =
value
} else if (
name === 'chunk_overlap' &&
value > activeThread.assistants[0].tools[0].settings.chunk_size
) {
activeThread.assistants[0].tools[0].settings.chunk_size = value
}
}
updateThreadMetadata({
...activeThread,
assistants: [
{
...activeThread.assistants[0],
tools: [
{
type: 'retrieval',
enabled: false,
settings: {
...(activeThread.assistants[0].tools &&
activeThread.assistants[0].tools[0]?.settings),
[name]: value,
},
},
],
},
],
})
}}
/>
)}
</div>
)
}
export default AssistantSetting


@ -8,8 +8,11 @@ import { useAtomValue } from 'jotai'
import LogoMark from '@/containers/Brand/Logo/Mark'
import GenerateResponse from '@/containers/Loader/GenerateResponse'
import { MainViewState } from '@/constants/screens'
import { activeModelAtom } from '@/hooks/useActiveModel'
import { useGetDownloadedModels } from '@/hooks/useGetDownloadedModels'
import { useMainViewState } from '@/hooks/useMainViewState'
@ -18,12 +21,17 @@ import ChatItem from '../ChatItem'
import ErrorMessage from '../ErrorMessage'
import {
generateResponseAtom,
getCurrentChatMessagesAtom,
} from '@/helpers/atoms/ChatMessage.atom'
const ChatBody: React.FC = () => {
const messages = useAtomValue(getCurrentChatMessagesAtom)
const activeModel = useAtomValue(activeModelAtom)
const { downloadedModels } = useGetDownloadedModels()
const { setMainViewState } = useMainViewState()
const generateResponse = useAtomValue(generateResponseAtom)
if (downloadedModels.length === 0)
return (
@ -80,7 +88,10 @@ const ChatBody: React.FC = () => {
<ScrollToBottom className="flex h-full w-full flex-col">
{messages.map((message, index) => (
<div key={message.id}>
{(message.status !== MessageStatus.Pending ||
message.content.length > 0) && (
<ChatItem {...message} key={message.id} />
)}
{(message.status === MessageStatus.Error ||
message.status === MessageStatus.Stopped) &&
index === messages.length - 1 && (
@ -88,6 +99,15 @@ const ChatBody: React.FC = () => {
)}
</div>
))}
{activeModel &&
(generateResponse ||
(messages.length &&
messages[messages.length - 1].status ===
MessageStatus.Pending &&
!messages[messages.length - 1].content.length)) && (
<GenerateResponse />
)}
</ScrollToBottom>
)}
</Fragment>


@ -0,0 +1,254 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
import { useEffect, useRef, useState } from 'react'
import { InferenceEvent, MessageStatus, events } from '@janhq/core'
import {
Textarea,
Button,
Tooltip,
TooltipArrow,
TooltipContent,
TooltipPortal,
TooltipTrigger,
} from '@janhq/uikit'
import { useAtom, useAtomValue } from 'jotai'
import {
FileTextIcon,
ImageIcon,
StopCircle,
PaperclipIcon,
} from 'lucide-react'
import { twMerge } from 'tailwind-merge'
import { currentPromptAtom, fileUploadAtom } from '@/containers/Providers/Jotai'
import { useActiveModel } from '@/hooks/useActiveModel'
import { useClickOutside } from '@/hooks/useClickOutside'
import useSendChatMessage from '@/hooks/useSendChatMessage'
import FileUploadPreview from '../FileUploadPreview'
import ImageUploadPreview from '../ImageUploadPreview'
import { getCurrentChatMessagesAtom } from '@/helpers/atoms/ChatMessage.atom'
import {
activeThreadAtom,
getActiveThreadIdAtom,
waitingToSendMessage,
} from '@/helpers/atoms/Thread.atom'
const ChatInput: React.FC = () => {
const activeThread = useAtomValue(activeThreadAtom)
const { stateModel } = useActiveModel()
const messages = useAtomValue(getCurrentChatMessagesAtom)
const [currentPrompt, setCurrentPrompt] = useAtom(currentPromptAtom)
const { sendChatMessage } = useSendChatMessage()
const activeThreadId = useAtomValue(getActiveThreadIdAtom)
const [isWaitingToSend, setIsWaitingToSend] = useAtom(waitingToSendMessage)
const [fileUpload, setFileUpload] = useAtom(fileUploadAtom)
const textareaRef = useRef<HTMLTextAreaElement>(null)
const fileInputRef = useRef<HTMLInputElement>(null)
const imageInputRef = useRef<HTMLInputElement>(null)
const [showAttachmentMenus, setShowAttachmentMenus] = useState(false)
const onPromptChange = (e: React.ChangeEvent<HTMLTextAreaElement>) => {
setCurrentPrompt(e.target.value)
}
const refAttachmentMenus = useClickOutside(() => setShowAttachmentMenus(false))
useEffect(() => {
if (isWaitingToSend && activeThreadId) {
setIsWaitingToSend(false)
sendChatMessage()
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [isWaitingToSend, activeThreadId])
useEffect(() => {
if (textareaRef.current) {
textareaRef.current.style.height = '40px'
textareaRef.current.style.height = textareaRef.current.scrollHeight + 'px'
}
}, [currentPrompt])
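// Enter sends the message (or stops a pending inference); Shift+Enter inserts a newline.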
const onKeyDown = async (e: React.KeyboardEvent<HTMLTextAreaElement>) => {
if (e.key === 'Enter') {
if (!e.shiftKey) {
e.preventDefault()
if (messages[messages.length - 1]?.status !== MessageStatus.Pending)
sendChatMessage()
else onStopInferenceClick()
}
}
}
const onStopInferenceClick = async () => {
events.emit(InferenceEvent.OnInferenceStopped, {})
}
/**
* Handles the change event of the document file input by storing the selected
* file in the upload state. It is used to display the name of the selected file.
* @param event - The change event object.
*/
const handleFileChange = (event: React.ChangeEvent<HTMLInputElement>) => {
const file = event.target.files?.[0]
if (!file) return
setFileUpload([{ file: file, type: 'pdf' }])
setCurrentPrompt('Summarize this for me')
}
const handleImageChange = (event: React.ChangeEvent<HTMLInputElement>) => {
const file = event.target.files?.[0]
if (!file) return
setFileUpload([{ file: file, type: 'image' }])
setCurrentPrompt('What do you see in this image?')
}
const renderPreview = (fileUpload: any) => {
if (fileUpload.length > 0) {
if (fileUpload[0].type === 'image') {
return <ImageUploadPreview file={fileUpload[0].file} />
} else {
return <FileUploadPreview />
}
}
}
return (
<div className="mx-auto flex w-full flex-shrink-0 items-end justify-center space-x-4 px-8 py-4">
<div className="relative flex w-full flex-col">
{renderPreview(fileUpload)}
<Textarea
className={twMerge(
'max-h-[400px] resize-none overflow-y-hidden pr-20',
fileUpload.length && 'rounded-t-none'
)}
style={{ height: '40px' }}
ref={textareaRef}
onKeyDown={onKeyDown}
placeholder="Enter your message..."
disabled={stateModel.loading || !activeThread}
value={currentPrompt}
onChange={onPromptChange}
/>
<Tooltip>
<TooltipTrigger asChild>
<PaperclipIcon
size={20}
className="absolute bottom-2 right-4 cursor-pointer text-muted-foreground"
onClick={(e) => {
if (
fileUpload.length > 0 ||
(activeThread?.assistants[0].tools &&
!activeThread?.assistants[0].tools[0]?.enabled)
) {
e.stopPropagation()
} else {
setShowAttachmentMenus(!showAttachmentMenus)
}
}}
/>
</TooltipTrigger>
<TooltipPortal>
{(fileUpload.length > 0 ||
(activeThread?.assistants[0].tools &&
!activeThread?.assistants[0].tools[0]?.enabled)) && (
<TooltipContent side="top" className="max-w-[154px] px-3">
{fileUpload.length !== 0 && (
<span>
Currently, we only support one attachment at a time
</span>
)}
{activeThread?.assistants[0].tools &&
activeThread?.assistants[0].tools[0]?.enabled ===
false && (
<span>
Turn on Retrieval in Assistant Settings to use this
feature
</span>
)}
<TooltipArrow />
</TooltipContent>
)}
</TooltipPortal>
</Tooltip>
{showAttachmentMenus && (
<div
ref={refAttachmentMenus}
className="absolute bottom-10 right-0 w-36 cursor-pointer rounded-lg border border-border bg-background py-1 shadow"
>
<ul>
<li className="flex w-full cursor-not-allowed items-center space-x-2 px-4 py-2 text-muted-foreground opacity-50 hover:bg-secondary">
<ImageIcon size={16} />
<span className="font-medium">Image</span>
</li>
<li
className="flex w-full cursor-pointer items-center space-x-2 px-4 py-2 text-muted-foreground hover:bg-secondary"
onClick={() => {
fileInputRef.current?.click()
setShowAttachmentMenus(false)
}}
>
<FileTextIcon size={16} />
<span className="font-medium">Document</span>
</li>
</ul>
</div>
)}
</div>
<input
type="file"
className="hidden"
ref={imageInputRef}
value=""
onChange={handleImageChange}
accept="image/png, image/jpeg, image/jpg"
/>
<input
type="file"
className="hidden"
ref={fileInputRef}
value=""
onChange={handleFileChange}
accept="application/pdf"
/>
{messages[messages.length - 1]?.status !== MessageStatus.Pending ? (
<Button
size="lg"
disabled={
stateModel.loading ||
!activeThread ||
currentPrompt.trim().length === 0
}
themes="primary"
className="min-w-[100px]"
onClick={sendChatMessage}
>
Send
</Button>
) : (
<Button
size="lg"
themes="danger"
onClick={onStopInferenceClick}
className="min-w-[100px]"
>
<StopCircle size={24} />
</Button>
)}
</div>
)
}
export default ChatInput


@ -6,11 +6,11 @@ import { selectedModelAtom } from '@/containers/DropdownListSidebar'
import { getConfigurationsData } from '@/utils/componentSettings'
import { toSettingParams } from '@/utils/modelParam'
import SettingComponentBuilder from '../ModelSetting/SettingComponent'
import { getActiveThreadModelParamsAtom } from '@/helpers/atoms/Thread.atom'
const EngineSetting = ({ enabled = true }: { enabled?: boolean }) => {
const activeModelParams = useAtomValue(getActiveThreadModelParamsAtom)
const selectedModel = useAtomValue(selectedModelAtom)
@ -18,13 +18,18 @@ const EngineSetting = () => {
const modelSettingParams = toSettingParams(activeModelParams)
const componentData = getConfigurationsData(
modelSettingParams,
selectedModel
).toSorted((a, b) => a.title.localeCompare(b.title))
return (
<div className="flex flex-col">
<SettingComponentBuilder
componentData={componentData}
enabled={enabled}
selector={(e) => e.name !== 'prompt_template'}
/>
</div>
)
}


@ -0,0 +1,95 @@
import React from 'react'
type Props = {
type: string
}
const Icon: React.FC<Props> = ({ type }) => {
return (
<div className="relative">
<span className="absolute left-1/2 top-1/2 -translate-x-1/2 -translate-y-1/2 text-[10px] font-medium uppercase">
{type}
</span>
<svg
width="34"
height="42"
viewBox="0 0 34 42"
fill="none"
xmlns="http://www.w3.org/2000/svg"
>
<g filter="url(#filter0_dd_2991_12588)">
<path
d="M26.274 10.2068C25.3629 10.2055 24.4894 9.84283 23.8453 9.19837C23.2011 8.55392 22.8389 7.68029 22.838 6.76912V2H7.48584C6.89683 1.99978 6.31354 2.11561 5.7693 2.34086C5.22507 2.56611 4.73054 2.89637 4.31397 3.31279C3.8974 3.7292 3.56694 4.2236 3.34149 4.76776C3.11603 5.31191 3 5.89517 3 6.48417V33.5158C3 34.1048 3.11603 34.6881 3.34149 35.2322C3.56694 35.7764 3.8974 36.2708 4.31397 36.6872C4.73054 37.1036 5.22507 37.4339 5.7693 37.6591C6.31354 37.8844 6.89683 38.0002 7.48584 38H25.9158C27.105 38 28.2456 37.5275 29.0865 36.6866C29.9275 35.8457 30.3999 34.7051 30.3999 33.5158V10.2068H26.274Z"
fill="white"
/>
<path
d="M30.3998 10.2068H26.2739C25.3628 10.2055 24.4893 9.84283 23.8452 9.19837C23.201 8.55392 22.8388 7.68029 22.8379 6.76912V2L30.3998 10.2068Z"
fill="#A1A1AA"
/>
</g>
<defs>
<filter
id="filter0_dd_2991_12588"
x="0"
y="0"
width="33.3999"
height="42"
filterUnits="userSpaceOnUse"
colorInterpolationFilters="sRGB"
>
<feFlood floodOpacity="0" result="BackgroundImageFix" />
<feColorMatrix
in="SourceAlpha"
type="matrix"
values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0"
result="hardAlpha"
/>
<feOffset dy="1" />
<feGaussianBlur stdDeviation="1.5" />
<feColorMatrix
type="matrix"
values="0 0 0 0 0.0627451 0 0 0 0 0.0941176 0 0 0 0 0.156863 0 0 0 0.1 0"
/>
<feBlend
mode="normal"
in2="BackgroundImageFix"
result="effect1_dropShadow_2991_12588"
/>
<feColorMatrix
in="SourceAlpha"
type="matrix"
values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 127 0"
result="hardAlpha"
/>
<feMorphology
radius="1"
operator="erode"
in="SourceAlpha"
result="effect2_dropShadow_2991_12588"
/>
<feOffset dy="1" />
<feGaussianBlur stdDeviation="1" />
<feComposite in2="hardAlpha" operator="out" />
<feColorMatrix
type="matrix"
values="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0.1 0"
/>
<feBlend
mode="normal"
in2="effect1_dropShadow_2991_12588"
result="effect2_dropShadow_2991_12588"
/>
<feBlend
mode="normal"
in="SourceGraphic"
in2="effect2_dropShadow_2991_12588"
result="shape"
/>
</filter>
</defs>
</svg>
</div>
)
}
export default Icon


@ -0,0 +1,47 @@
import React from 'react'
import { useAtom, useSetAtom } from 'jotai'
import { XIcon } from 'lucide-react'
import { currentPromptAtom, fileUploadAtom } from '@/containers/Providers/Jotai'
import { toGibibytes } from '@/utils/converter'
import Icon from './Icon'
const FileUploadPreview: React.FC = () => {
const [fileUpload, setFileUpload] = useAtom(fileUploadAtom)
const setCurrentPrompt = useSetAtom(currentPromptAtom)
const onDeleteClick = () => {
setFileUpload([])
setCurrentPrompt('')
}
return (
<div className="flex flex-col rounded-t-lg border border-b-0 border-border p-4">
<div className="relative inline-flex w-60 space-x-3 rounded-lg bg-secondary p-4">
<Icon type={fileUpload[0].type} />
<div>
<h6 className="line-clamp-1 font-medium">
{fileUpload[0].file.name.replaceAll(/[-._]/g, ' ')}
</h6>
<p className="text-muted-foreground">
{toGibibytes(fileUpload[0].file.size)}
</p>
</div>
<div
className="absolute -right-2 -top-2 cursor-pointer rounded-full bg-foreground p-0.5"
onClick={onDeleteClick}
>
<XIcon size={14} className="text-background" />
</div>
</div>
</div>
)
}
export default FileUploadPreview


@ -0,0 +1,54 @@
import React, { useEffect } from 'react'
import { useState } from 'react'
import { useSetAtom } from 'jotai'
import { XIcon } from 'lucide-react'
import { currentPromptAtom, fileUploadAtom } from '@/containers/Providers/Jotai'
import { getBase64 } from '@/utils/base64'
type Props = {
file: File
}
const ImageUploadPreview: React.FC<Props> = ({ file }) => {
const [base64, setBase64] = useState<string | undefined>()
const setFileUpload = useSetAtom(fileUploadAtom)
const setCurrentPrompt = useSetAtom(currentPromptAtom)
useEffect(() => {
getBase64(file)
.then((base64) => setBase64(base64))
.catch((err) => console.error(err))
}, [file])
if (!base64) {
return
}
const onDeleteClick = () => {
setFileUpload([])
setCurrentPrompt('')
}
return (
<div className="flex flex-col rounded-t-lg border border-b-0 border-border p-4">
<div className="relative w-60 rounded-lg bg-secondary p-4">
<img src={base64} alt={file.name} className="object-cover" />
<h6 className="mt-2 line-clamp-1 font-medium">
{file.name.replaceAll(/[-._]/g, ' ')}
</h6>
<div
className="absolute -right-2 -top-2 cursor-pointer rounded-full bg-foreground p-0.5"
onClick={onDeleteClick}
>
<XIcon size={14} className="text-background" />
</div>
</div>
</div>
)
}
export default React.memo(ImageUploadPreview)


@ -0,0 +1,19 @@
import useSendChatMessage from '@/hooks/useSendChatMessage'
const MessageQueuedBanner: React.FC = () => {
const { queuedMessage } = useSendChatMessage()
return (
<div>
{queuedMessage && (
<div className="my-2 py-2 text-center">
<span className="rounded-lg border border-border px-4 py-2 shadow-lg">
Message queued. It can be sent once the model has started
</span>
</div>
)}
</div>
)
}
export default MessageQueuedBanner


@ -1,8 +1,22 @@
/* eslint-disable no-case-declarations */
import { useAtomValue, useSetAtom } from 'jotai'
import Checkbox from '@/containers/Checkbox'
import ModelConfigInput from '@/containers/ModelConfigInput'
import SliderRightPanel from '@/containers/SliderRightPanel'
import { useActiveModel } from '@/hooks/useActiveModel'
import useUpdateModelParameters from '@/hooks/useUpdateModelParameters'
import { getConfigurationsData } from '@/utils/componentSettings'
import { toSettingParams } from '@/utils/modelParam'
import {
engineParamsUpdateAtom,
getActiveThreadIdAtom,
getActiveThreadModelParamsAtom,
} from '@/helpers/atoms/Thread.atom'
export type ControllerType = 'slider' | 'checkbox' | 'input'
export type SettingComponentData = {
@ -30,14 +44,51 @@ type CheckboxData = {
checked: boolean
}
const SettingComponent = ({
componentData,
enabled = true,
selector,
updater,
}: {
componentData: SettingComponentData[]
enabled?: boolean
selector?: (e: SettingComponentData) => boolean
updater?: (
threadId: string,
name: string,
value: string | number | boolean
) => void
}) => {
const { updateModelParameter } = useUpdateModelParameters()
const threadId = useAtomValue(getActiveThreadIdAtom)
const activeModelParams = useAtomValue(getActiveThreadModelParamsAtom)
const modelSettingParams = toSettingParams(activeModelParams)
const engineParams = getConfigurationsData(modelSettingParams)
const setEngineParamsUpdate = useSetAtom(engineParamsUpdateAtom)
const { stopModel } = useActiveModel()
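// Changing an engine-level parameter requires the model to be reloaded, so the
// running model is stopped and the thread is flagged for an engine params update.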
const onValueChanged = (name: string, value: string | number | boolean) => {
if (!threadId) return
if (engineParams.some((x) => x.name.includes(name))) {
setEngineParamsUpdate(true)
stopModel()
} else {
setEngineParamsUpdate(false)
}
if (updater) updater(threadId, name, value)
else {
updateModelParameter(threadId, name, value)
}
}
const components = componentData
.filter((x) => (selector ? selector(x) : true))
.map((data) => {
switch (data.controllerType) {
case 'slider':
@ -52,6 +103,8 @@ const settingComponentBuilder = (
step={step}
value={value}
name={data.name}
enabled={enabled}
onValueChanged={(value) => onValueChanged(data.name, value)}
/>
)
case 'input':
@ -60,11 +113,13 @@ const settingComponentBuilder = (
return (
<ModelConfigInput
title={data.title}
enabled={enabled}
key={data.name}
name={data.name}
description={data.description}
placeholder={placeholder}
value={textValue}
onValueChanged={(value) => onValueChanged(data.name, value)}
/>
)
case 'checkbox':
@ -72,10 +127,12 @@ const settingComponentBuilder = (
return (
<Checkbox
key={data.name}
enabled={enabled}
name={data.name}
description={data.description}
title={data.title}
checked={checked}
onValueChanged={(value) => onValueChanged(data.name, value)}
/>
)
default:
@ -86,4 +143,4 @@ const settingComponentBuilder = (
return <div className="flex flex-col gap-y-4">{components}</div>
}
export default SettingComponent


@ -8,7 +8,7 @@ import { selectedModelAtom } from '@/containers/DropdownListSidebar'
import { getConfigurationsData } from '@/utils/componentSettings'
import { toRuntimeParams } from '@/utils/modelParam'
import SettingComponentBuilder from './SettingComponent'
import { getActiveThreadModelParamsAtom } from '@/helpers/atoms/Thread.atom'
@ -27,7 +27,10 @@ const ModelSetting = () => {
return (
<div className="flex flex-col">
<SettingComponentBuilder
componentData={componentData}
selector={(e) => e.name !== 'prompt_template'}
/>
</div>
)
}


@ -1,4 +1,4 @@
import { SettingComponentData } from './SettingComponent'
export const presetConfiguration: Record<string, SettingComponentData> = {
prompt_template: {
@ -141,4 +141,52 @@ export const presetConfiguration: Record<string, SettingComponentData> = {
value: 1,
},
},
// assistant
chunk_size: {
name: 'chunk_size',
title: 'Chunk Size',
description: 'Maximum number of tokens in a chunk',
controllerType: 'slider',
controllerData: {
min: 128,
max: 2048,
step: 128,
value: 1024,
},
},
chunk_overlap: {
name: 'chunk_overlap',
title: 'Chunk Overlap',
description: 'Number of tokens overlapping between two adjacent chunks',
controllerType: 'slider',
controllerData: {
min: 32,
max: 512,
step: 32,
value: 64,
},
},
top_k: {
name: 'top_k',
title: 'Top K',
description: 'Number of top-ranked documents to retrieve',
controllerType: 'slider',
controllerData: {
min: 1,
max: 5,
step: 1,
value: 2,
},
},
retrieval_template: {
name: 'retrieval_template',
title: 'Retrieval Template',
description:
'The template to use for retrieval. The following variables are available: {CONTEXT}, {QUESTION}',
controllerType: 'input',
controllerData: {
placeholder: 'Retrieval Template',
value: '',
},
},
}
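
For context, a minimal sketch of how a retrieval template with {CONTEXT} and {QUESTION} placeholders is typically filled before the final prompt reaches the model. This is illustrative only and not part of the diff; applyRetrievalTemplate, retrievedChunks, and userQuestion are hypothetical names.

const applyRetrievalTemplate = (
  template: string,
  context: string,
  question: string
): string =>
  template.replace('{CONTEXT}', context).replace('{QUESTION}', question)

// Example:
// applyRetrievalTemplate(retrievalTemplate, retrievedChunks.join('\n'), userQuestion)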


@ -0,0 +1,42 @@
import React, { Fragment, useCallback } from 'react'
import { Button } from '@janhq/uikit'
import LogoMark from '@/containers/Brand/Logo/Mark'
import { MainViewState } from '@/constants/screens'
import { useGetDownloadedModels } from '@/hooks/useGetDownloadedModels'
import { useMainViewState } from '@/hooks/useMainViewState'
const RequestDownloadModel: React.FC = () => {
const { downloadedModels } = useGetDownloadedModels()
const { setMainViewState } = useMainViewState()
const onClick = useCallback(() => {
setMainViewState(MainViewState.Hub)
}, [setMainViewState])
return (
<div className="mx-auto mt-8 flex h-full w-3/4 flex-col items-center justify-center text-center">
{downloadedModels.length === 0 && (
<Fragment>
<LogoMark
className="mx-auto mb-4 animate-wave"
width={56}
height={56}
/>
<h1 className="text-2xl font-bold">Welcome!</h1>
<p className="mt-1 text-base">
You need to download your first model
</p>
<Button className="mt-4" onClick={onClick}>
Explore The Hub
</Button>
</Fragment>
)}
</div>
)
}
export default React.memo(RequestDownloadModel)


@ -1,7 +1,8 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
import React from 'react'
import { InferenceEngine } from '@janhq/core'
import { Input, Textarea, Switch } from '@janhq/uikit'
import { atom, useAtomValue } from 'jotai'
@ -10,17 +11,20 @@ import { twMerge } from 'tailwind-merge'
import LogoMark from '@/containers/Brand/Logo/Mark'
import CardSidebar from '@/containers/CardSidebar'
import DropdownListSidebar, {
selectedModelAtom,
} from '@/containers/DropdownListSidebar'
import { useCreateNewThread } from '@/hooks/useCreateNewThread'
import { getConfigurationsData } from '@/utils/componentSettings'
import { toRuntimeParams, toSettingParams } from '@/utils/modelParam'
import AssistantSetting from '../AssistantSetting'
import EngineSetting from '../EngineSetting'
import ModelSetting from '../ModelSetting'
import SettingComponentBuilder from '../ModelSetting/SettingComponent'
import {
activeThreadAtom,
@ -33,18 +37,23 @@ const Sidebar: React.FC = () => {
const showing = useAtomValue(showRightSideBarAtom)
const activeThread = useAtomValue(activeThreadAtom)
const activeModelParams = useAtomValue(getActiveThreadModelParamsAtom)
const selectedModel = useAtomValue(selectedModelAtom)
const { updateThreadMetadata } = useCreateNewThread()
const modelEngineParams = toSettingParams(activeModelParams)
const modelRuntimeParams = toRuntimeParams(activeModelParams)
const componentDataAssistantSetting = getConfigurationsData(
(activeThread?.assistants[0]?.tools &&
activeThread?.assistants[0]?.tools[0]?.settings) ??
{}
)
const componentDataEngineSetting = getConfigurationsData(modelEngineParams)
const componentDataRuntimeSetting = getConfigurationsData(modelRuntimeParams)
return (
<div
className={twMerge(
'h-full flex-shrink-0 overflow-x-hidden border-l border-border bg-background pb-6 transition-all duration-100 dark:bg-background/20',
showing
? 'w-80 translate-x-0 opacity-100'
: 'w-0 translate-x-full opacity-0'
@ -122,21 +131,71 @@ const Sidebar: React.FC = () => {
}}
/>
</div>
<div>
{activeThread?.assistants[0]?.tools &&
componentDataAssistantSetting.length > 0 && (
<div className="mt-2">
<CardSidebar
title="Retrieval"
asChild
rightAction={
<Switch
name="retrieval"
className="mr-2"
checked={activeThread?.assistants[0].tools[0].enabled}
onCheckedChange={(e) => {
if (activeThread)
updateThreadMetadata({
...activeThread,
assistants: [
{
...activeThread.assistants[0],
tools: [
{
type: 'retrieval',
enabled: e,
settings:
(activeThread.assistants[0].tools &&
activeThread.assistants[0].tools[0]
?.settings) ??
{},
},
],
},
],
})
}}
/>
}
>
{activeThread?.assistants[0]?.tools[0].enabled && (
<div className="px-2 py-4">
<div className="mb-4">
<label
id="tool-title"
className="mb-2 inline-block font-bold text-zinc-500 dark:text-gray-300"
>
Embedding Engine
</label>
<div className="flex items-center justify-between">
<label className="font-medium text-zinc-500 dark:text-gray-300">
{selectedModel?.engine ===
InferenceEngine.openai
? 'OpenAI'
: 'Nitro'}
</label>
</div>
</div>
<AssistantSetting
componentData={componentDataAssistantSetting}
/>
</div>
)}
</CardSidebar>
</div>
)}
</div>
</div>
</CardSidebar>
<CardSidebar title="Model">
@ -145,7 +204,7 @@ const Sidebar: React.FC = () => {
<DropdownListSidebar />
</div>
{componentDataRuntimeSetting.length > 0 && (
<div className="mt-6">
<CardSidebar title="Inference Parameters" asChild>
<div className="px-2 py-4">
@ -161,13 +220,16 @@ const Sidebar: React.FC = () => {
<div className="mt-4">
<CardSidebar title="Model Parameters" asChild>
<div className="px-2 py-4">
<SettingComponentBuilder
componentData={componentDataEngineSetting}
selector={(x: any) => x.name === 'prompt_template'}
/>
</div>
</CardSidebar>
</div>
)}
{componentDataEngineSetting.length > 0 && (
<div className="my-4">
<CardSidebar title="Engine Parameters" asChild>
<div className="px-2 py-4">


@ -1,10 +1,23 @@
import React, { useEffect, useRef, useState } from 'react'
import {
ChatCompletionRole,
ContentType,
MessageStatus,
ThreadMessage,
} from '@janhq/core'
import {
Tooltip,
TooltipArrow,
TooltipContent,
TooltipPortal,
TooltipTrigger,
} from '@janhq/uikit'
import hljs from 'highlight.js'
import { useAtomValue } from 'jotai'
import { FolderOpenIcon } from 'lucide-react'
import { Marked, Renderer } from 'marked'
import { markedHighlight } from 'marked-highlight'
@ -13,12 +26,13 @@ import { twMerge } from 'tailwind-merge'
import LogoMark from '@/containers/Brand/Logo/Mark'
import BubbleLoader from '@/containers/Loader/Bubble'
import { useClipboard } from '@/hooks/useClipboard'
import { usePath } from '@/hooks/usePath'
import { toGibibytes } from '@/utils/converter'
import { displayDate } from '@/utils/datetime'
import Icon from '../FileUploadPreview/Icon'
import MessageToolbar from '../MessageToolbar'
import { getCurrentChatMessagesAtom } from '@/helpers/atoms/ChatMessage.atom'
@ -29,6 +43,7 @@ const SimpleTextMessage: React.FC<ThreadMessage> = (props) => {
text = props.content[0]?.text?.value ?? ''
}
const clipboard = useClipboard({ timeout: 1000 })
const { onViewFile } = usePath()
const marked: Marked = new Marked(
markedHighlight({
@ -77,7 +92,6 @@ const SimpleTextMessage: React.FC<ThreadMessage> = (props) => {
const isUser = props.role === ChatCompletionRole.User
const isSystem = props.role === ChatCompletionRole.System
const [tokenCount, setTokenCount] = useState(0)
const [lastTimestamp, setLastTimestamp] = useState<number | undefined>()
const [tokenSpeed, setTokenSpeed] = useState(0)
const messages = useAtomValue(getCurrentChatMessagesAtom)
@ -148,6 +162,7 @@ const SimpleTextMessage: React.FC<ThreadMessage> = (props) => {
</svg>
</div>
)}
<div
className={twMerge(
'text-sm font-extrabold capitalize',
@ -178,23 +193,80 @@ const SimpleTextMessage: React.FC<ThreadMessage> = (props) => {
</div>
<div className={twMerge('w-full')}>
<>
{props.content[0]?.type === ContentType.Image && (
<div className="group/image relative mb-2 inline-flex overflow-hidden rounded-xl">
<img
className="aspect-auto h-[300px]"
alt={props.content[0]?.text.name}
src={props.content[0]?.text.annotations[0]}
/>
<div className="absolute left-0 top-0 z-20 hidden h-full w-full bg-black/20 group-hover/image:inline-block" />
<Tooltip>
<TooltipTrigger asChild>
<div
className="absolute right-2 top-2 z-20 hidden h-8 w-8 cursor-pointer items-center justify-center rounded-md bg-background group-hover/image:flex"
onClick={() => onViewFile(`${props.id}.png`)}
>
<FolderOpenIcon size={20} />
</div>
</TooltipTrigger>
<TooltipPortal>
<TooltipContent side="top" className="max-w-[154px] px-3">
<span>Show in finder</span>
<TooltipArrow />
</TooltipContent>
</TooltipPortal>
</Tooltip>
</div>
)}
{props.content[0]?.type === ContentType.Pdf && (
<div className="group/file relative mb-2 inline-flex w-60 cursor-pointer gap-x-3 overflow-hidden rounded-lg bg-secondary p-4">
<div className="absolute left-0 top-0 z-20 hidden h-full w-full bg-black/20 backdrop-blur-sm group-hover/file:inline-block" />
<Tooltip>
<TooltipTrigger asChild>
<div
className="absolute right-2 top-2 z-20 hidden h-8 w-8 cursor-pointer items-center justify-center rounded-md bg-background group-hover/file:flex"
onClick={() =>
onViewFile(`${props.id}.${props.content[0]?.type}`)
}
>
<FolderOpenIcon size={20} />
</div>
</TooltipTrigger>
<TooltipPortal>
<TooltipContent side="top" className="max-w-[154px] px-3">
<span>Show in finder</span>
<TooltipArrow />
</TooltipContent>
</TooltipPortal>
</Tooltip>
<Icon type={props.content[0].type} />
<div>
<h6 className="line-clamp-1 font-medium">
{props.content[0].text.name?.replaceAll(/[-._]/g, ' ')}
</h6>
<p className="text-muted-foreground">
{toGibibytes(Number(props.content[0].text.size))}
</p>
</div>
</div>
)}
<div
className={twMerge(
'message flex flex-grow flex-col gap-y-2 text-[15px] font-normal leading-relaxed',
isUser
? 'whitespace-pre-wrap break-words'
: 'rounded-xl bg-secondary p-4'
)}
// eslint-disable-next-line @typescript-eslint/naming-convention
dangerouslySetInnerHTML={{ __html: parsedText }}
/>
</>
</div>
</div>
)


@ -84,6 +84,7 @@ export default function ThreadList() {
threads.map((thread, i) => {
const lastMessage =
threadStates[thread.id]?.lastMessage ?? 'No new message'
return (
<div
key={i}


@ -1,110 +1,123 @@
/* eslint-disable @typescript-eslint/naming-convention */
import React, { useEffect, useState } from 'react'
import { useDropzone } from 'react-dropzone'
import { useAtomValue, useSetAtom } from 'jotai'
import { UploadCloudIcon, XIcon } from 'lucide-react'
import { twMerge } from 'tailwind-merge'
import ModelReload from '@/containers/Loader/ModelReload'
import ModelStart from '@/containers/Loader/ModelStart'
import { currentPromptAtom, fileUploadAtom } from '@/containers/Providers/Jotai'
import { showLeftSideBarAtom } from '@/containers/Providers/KeyListener'
import { MainViewState } from '@/constants/screens'
import { useActiveModel } from '@/hooks/useActiveModel'
import { useGetDownloadedModels } from '@/hooks/useGetDownloadedModels'
import { useMainViewState } from '@/hooks/useMainViewState'
import useSendChatMessage from '@/hooks/useSendChatMessage'
import ChatBody from '@/screens/Chat/ChatBody'
import ThreadList from '@/screens/Chat/ThreadList'
import ChatInput from './ChatInput'
import RequestDownloadModel from './RequestDownloadModel'
import Sidebar from './Sidebar'
import { getCurrentChatMessagesAtom } from '@/helpers/atoms/ChatMessage.atom'
import {
activeThreadAtom,
engineParamsUpdateAtom,
getActiveThreadIdAtom,
waitingToSendMessage,
} from '@/helpers/atoms/Thread.atom'
import { activeThreadStateAtom } from '@/helpers/atoms/Thread.atom'
const ChatScreen: React.FC = () => {
const setCurrentPrompt = useSetAtom(currentPromptAtom)
const activeThread = useAtomValue(activeThreadAtom)
const { downloadedModels } = useGetDownloadedModels()
const showLeftSideBar = useAtomValue(showLeftSideBarAtom)
const { activeModel, stateModel } = useActiveModel()
const { setMainViewState } = useMainViewState()
const engineParamsUpdate = useAtomValue(engineParamsUpdateAtom)
const { queuedMessage, reloadModel } = useSendChatMessage()
const [dragOver, setDragOver] = useState(false)
const [dragRejected, setDragRejected] = useState({ code: '' })
const setFileUpload = useSetAtom(fileUploadAtom)
const { getRootProps, isDragReject } = useDropzone({
noClick: true,
multiple: false,
accept: {
// 'image/*': ['.png', '.jpg', '.jpeg'],
'application/pdf': ['.pdf'],
},
onDragOver: (e) => {
if (
e.dataTransfer.items.length === 1 &&
activeThread?.assistants[0].tools &&
activeThread?.assistants[0].tools[0]?.enabled
) {
setDragOver(true)
} else if (
activeThread?.assistants[0].tools &&
!activeThread?.assistants[0].tools[0]?.enabled
) {
setDragRejected({ code: 'retrieval-off' })
} else {
setDragRejected({ code: 'multiple-upload' })
}
},
onDragLeave: () => setDragOver(false),
onDrop: (files, rejectFiles) => {
if (
!files ||
files.length !== 1 ||
rejectFiles.length !== 0 ||
(activeThread?.assistants[0].tools &&
!activeThread?.assistants[0].tools[0]?.enabled)
)
return
const imageType = files[0]?.type.includes('image')
setFileUpload([{ file: files[0], type: imageType ? 'image' : 'pdf' }])
setDragOver(false)
if (imageType) {
setCurrentPrompt('What do you see in this image?')
} else {
setCurrentPrompt('Summarize this for me')
}
},
onDropRejected: (e) => {
if (
activeThread?.assistants[0].tools &&
!activeThread?.assistants[0].tools[0]?.enabled
) {
setDragRejected({ code: 'retrieval-off' })
} else {
setDragRejected({ code: e[0].errors[0].code })
}
setDragOver(false)
},
})
// TODO @faisal: replace this timeout once we have a snackbar component
useEffect(() => {
setTimeout(() => {
if (dragRejected.code) {
setDragRejected({ code: '' })
}
}, 2000)
}, [dragRejected.code])
const renderError = (code: string) => {
switch (code) {
case 'multiple-upload':
return 'Currently, we only support one attachment at a time'
case 'retrieval-off':
return 'Turn on Retrieval in Assistant Settings to use this feature'
case 'file-invalid-type':
return 'We do not support this file type'
default:
return 'Oops, something went wrong. Please try again.'
}
}
return (
@ -116,34 +129,68 @@ const ChatScreen = () => {
</div>
) : null}
<div className="relative flex h-full w-full flex-col overflow-auto bg-background">
<div
className="relative flex h-full w-full flex-col overflow-auto bg-background outline-none"
{...getRootProps()}
>
{dragRejected.code !== '' && (
<div className="absolute bottom-3 left-1/2 z-50 inline-flex w-full -translate-x-1/2 justify-center px-16">
<div className="flex items-start justify-between gap-x-4 rounded-lg bg-foreground px-4 py-2 text-white dark:border dark:border-border dark:bg-zinc-900">
<svg
width="20"
height="20"
viewBox="0 0 20 20"
fill="none"
xmlns="http://www.w3.org/2000/svg"
>
<path
fillRule="evenodd"
clipRule="evenodd"
d="M20 10C20 15.5228 15.5228 20 10 20H0.993697C0.110179 20 -0.332289 18.9229 0.292453 18.2929L2.2495 16.3195C0.843343 14.597 1.21409e-08 12.397 1.21409e-08 10C1.21409e-08 4.47715 4.47715 0 10 0C15.5228 0 20 4.47715 20 10ZM13.2071 6.79289C13.5976 7.18342 13.5976 7.81658 13.2071 8.20711L11.4142 10L13.2071 11.7929C13.5976 12.1834 13.5976 12.8166 13.2071 13.2071C12.8166 13.5976 12.1834 13.5976 11.7929 13.2071L10 11.4142L8.20711 13.2071C7.81658 13.5976 7.18342 13.5976 6.79289 13.2071C6.40237 12.8166 6.40237 12.1834 6.79289 11.7929L8.58579 10L6.79289 8.20711C6.40237 7.81658 6.40237 7.18342 6.79289 6.79289C7.18342 6.40237 7.81658 6.40237 8.20711 6.79289L10 8.58579L11.7929 6.79289C12.1834 6.40237 12.8166 6.40237 13.2071 6.79289Z"
fill="#F87171"
/>
</svg>
<p>{renderError(dragRejected.code)}</p>
<XIcon
size={24}
className="cursor-pointer"
onClick={() => setDragRejected({ code: '' })}
/>
</div>
</div>
)}
{dragOver && (
<div className="absolute z-50 mx-auto h-full w-full bg-background/50 p-8 backdrop-blur-lg">
<div
className={twMerge(
'flex h-full w-full items-center justify-center rounded-lg border border-dashed border-blue-500',
isDragReject && 'border-red-500'
)}
>
<div className="mx-auto w-1/2 text-center">
<div className="mx-auto inline-flex h-12 w-12 items-center justify-center rounded-full bg-blue-200">
<UploadCloudIcon size={24} className="text-blue-600" />
</div>
<div className="mt-4 text-blue-600">
<h6 className="font-bold">
{isDragReject
? 'Currently, we only support one PDF attachment at a time'
: 'Drop file here'}
</h6>
{!isDragReject && <p className="mt-2">(PDF)</p>}
</div>
</div>
</div>
</div>
)}
<div className="flex h-full w-full flex-col justify-between">
{activeThread ? (
<div className="flex h-full w-full overflow-y-auto overflow-x-hidden">
<ChatBody />
</div>
) : (
<div className="mx-auto mt-8 flex h-full w-3/4 flex-col items-center justify-center text-center">
{downloadedModels.length === 0 && (
<Fragment>
<LogoMark
className="mx-auto mb-4 animate-wave"
width={56}
height={56}
/>
<h1 className="text-2xl font-bold">Welcome!</h1>
<p className="mt-1 text-base">
You need to download your first model
</p>
<Button
className="mt-4"
onClick={() => setMainViewState(MainViewState.Hub)}
>
Explore The Hub
</Button>
</Fragment>
)}
</div>
<RequestDownloadModel />
)}
{!engineParamsUpdate && <ModelStart />}
@ -166,48 +213,9 @@ const ChatScreen = () => {
</span>
</div>
)}
<div className="mx-auto flex w-full flex-shrink-0 items-end justify-center space-x-4 px-8 py-4">
<Textarea
className="max-h-[400px] resize-none overflow-y-auto pr-20"
style={{ height: '40px' }}
ref={textareaRef}
onKeyDown={(e: KeyboardEvent<HTMLTextAreaElement>) =>
onKeyDown(e)
}
placeholder="Enter your message..."
disabled={stateModel.loading || !activeThread}
value={currentPrompt}
onChange={(e: ChangeEvent<HTMLTextAreaElement>) =>
onPromptChange(e)
}
/>
{messages[messages.length - 1]?.status !== MessageStatus.Pending ? (
<Button
size="lg"
disabled={
isDisabledChatbox || stateModel.loading || !activeThread
}
themes="primary"
className="min-w-[100px]"
onClick={sendChatMessage}
>
Send
</Button>
) : (
<Button
size="lg"
themes="danger"
onClick={onStopInferenceClick}
className="min-w-[100px]"
>
<StopCircle size={24} />
</Button>
)}
</div>
<ChatInput />
</div>
</div>
{/* Right side bar */}
{activeThread && <Sidebar />}
</div>


@ -24,7 +24,9 @@ import { MainViewState } from '@/constants/screens'
import { useCreateNewThread } from '@/hooks/useCreateNewThread'
import useDownloadModel from '@/hooks/useDownloadModel'
import { useDownloadState } from '@/hooks/useDownloadState'
import { getAssistants } from '@/hooks/useGetAssistants'
import { downloadedModelsAtom } from '@/hooks/useGetDownloadedModels'
import { useMainViewState } from '@/hooks/useMainViewState'


@ -41,7 +41,7 @@ import { toSettingParams } from '@/utils/modelParam'
import EngineSetting from '../Chat/EngineSetting'
import SettingComponentBuilder from '../Chat/ModelSetting/SettingComponent'
import { showRightSideBarAtom } from '../Chat/Sidebar'
@ -361,7 +361,11 @@ const LocalServerScreen = () => {
<div className="mt-4">
<CardSidebar title="Model Parameters" asChild>
<div className="px-2 py-4">
<SettingComponentBuilder
enabled={!serverEnabled}
componentData={componentDataEngineSetting}
selector={(x) => x.name === 'prompt_template'}
/>
</div>
</CardSidebar>
</div>
@ -371,7 +375,7 @@ const LocalServerScreen = () => {
<div className="my-4">
<CardSidebar title="Engine Parameters" asChild>
<div className="px-2 py-4">
<EngineSetting enabled={!serverEnabled} />
</div>
</CardSidebar>
</div>


@ -17,13 +17,13 @@
"incremental": true,
"plugins": [
{
"name": "next"
}
"name": "next",
},
],
"paths": {
"@/*": ["./*"]
}
"@/*": ["./*"],
},
},
"include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
"exclude": ["node_modules", "../electron"]
"exclude": ["node_modules"],
}


@ -4,7 +4,6 @@ import { APIFunctions } from '@janhq/core'
export {}
declare global {
declare const PLUGIN_CATALOG: string
declare const VERSION: string
declare const ANALYTICS_ID: string
declare const ANALYTICS_HOST: string

web/utils/base64.ts

@ -0,0 +1,9 @@
export const getBase64 = async (file: File): Promise<string> =>
new Promise((resolve, reject) => {
const reader = new FileReader()
reader.readAsDataURL(file)
reader.onload = () => {
const baseURL = reader.result
resolve(baseURL as string)
}
reader.onerror = (error) => reject(error)
})
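
A minimal usage sketch of this helper, illustrative only and not part of the diff (previewFile is a hypothetical name): the resolved data URL can be set directly as an image src or stored as a message annotation, which is how the preview components above consume it.

import { getBase64 } from '@/utils/base64'

const previewFile = async (file: File) => {
  // e.g. "data:image/png;base64,..." or "data:application/pdf;base64,..."
  const dataUrl = await getBase64(file)
  return dataUrl
}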


@ -1,13 +1,10 @@
import { Model } from '@janhq/core'
import { SettingComponentData } from '@/screens/Chat/ModelSetting/SettingComponent'
import { presetConfiguration } from '@/screens/Chat/ModelSetting/predefinedComponent'
export const getConfigurationsData = (
settings: object,
selectedModel?: Model
) => {
const componentData: SettingComponentData[] = []
@ -19,31 +16,35 @@ export const getConfigurationsData = (
return
}
if ('slider' === componentSetting.controllerType) {
const value = Number(settings[key as keyof typeof settings])
if ('value' in componentSetting.controllerData) {
componentSetting.controllerData.value = value
if ('max' in componentSetting.controllerData) {
switch (key) {
case 'max_tokens':
componentSetting.controllerData.max =
selectedModel?.parameters.max_tokens ||
componentSetting.controllerData.max ||
4096
break
case 'ctx_len':
componentSetting.controllerData.max =
selectedModel?.settings.ctx_len ||
componentSetting.controllerData.max ||
4096
break
}
}
}
} else if ('input' === componentSetting.controllerType) {
const value = settings[key as keyof typeof settings] as string
const placeholder = settings[key as keyof typeof settings] as string
if ('value' in componentSetting.controllerData)
componentSetting.controllerData.value = value
if ('placeholder' in componentSetting.controllerData)
componentSetting.controllerData.placeholder = placeholder
} else if ('checkbox' === componentSetting.controllerType) {
const checked = settings[key as keyof typeof settings] as boolean
if ('checked' in componentSetting.controllerData)
componentSetting.controllerData.checked = checked


@ -22,7 +22,7 @@ export const toRuntimeParams = (
for (const [key, value] of Object.entries(modelParams)) {
if (key in defaultModelParams) {
Object.assign(runtimeParams, { ...runtimeParams, [key]: value })
}
}
@ -47,7 +47,7 @@ export const toSettingParams = (
for (const [key, value] of Object.entries(modelParams)) {
if (key in defaultSettingParams) {
Object.assign(settingParams, { ...settingParams, [key]: value })
}
}