5457 Commits

Author SHA1 Message Date
Louis
af8404d627
fix: tests 2025-07-10 20:16:09 +07:00
Louis
5fe4cc6bab
chore: remove cortex install step 2025-07-10 20:09:26 +07:00
Louis
389721ba89
fix: build step 2025-07-10 16:49:21 +07:00
Louis
ca6f4f8977
test: fix failed tests 2025-07-10 16:25:47 +07:00
Louis
6e0218c084
Merge branch 'release/v0.7.0' into feat/inference-llamacpp-extension
# Conflicts:
#	.devcontainer/buildAppImage.sh
#	.github/workflows/template-tauri-build-linux-x64.yml
#	Makefile
#	core/src/node/extension/index.test.ts
#	package.json
#	src-tauri/tauri.conf.json
#	web-app/package.json
2025-07-10 15:36:41 +07:00
Louis
94d9304c0b
Merge branch 'dev' into release/v0.7.0 2025-07-10 15:33:21 +07:00
Minh141120
94ada5969b refactor: deprecate electron-checksum.py script 2025-07-10 15:05:51 +07:00
Minh141120
39256dad09 refactor: clean up tauri build workflow for macos 2025-07-10 15:05:21 +07:00
Minh141120
291a32759a refactor: clean up tauri build workflow for windows 2025-07-10 15:03:41 +07:00
Minh141120
3790bd5753 refactor: clean up tauri build workflow for linux 2025-07-10 15:01:52 +07:00
Minh141120
0fd346181c chore: enable active installation for windows installer 2025-07-10 14:57:55 +07:00
Nguyen Ngoc Minh
5b01d0c196
Merge pull request #5732 from DistractionRectangle/refactor/linux-build-process
Refactor/linux build process
2025-07-10 13:03:01 +07:00
Louis
08342b5b00
Merge pull request #5742 from menloresearch/feat/bump-version-of-llamacpp
feat: bump version of llama.cpp - b5857
2025-07-10 12:51:36 +07:00
Louis
10bb8527bd
feat: bump llama.cpp b5857 2025-07-10 11:51:27 +07:00
D. Rect.
a668204cdc refactor: pin linuxdeploy in make/yarn build process instead of github workflow
- pulls the fix for #5463 out of the GitHub release workflow and into
  the make/yarn build process
- implements a wrapper script that pins linuxdeploy and injects
  a new location for XDG_CACHE_HOME into the build pipeline,
  allowing .cache/tauri to be manipulated without tainting the
  host's .cache
- adds ./.cache (project_root/.cache) to the make clean and mise clean
  tasks
- removes .devcontainer/buildAppImage.sh, obsolete now that the extra
  build steps have been removed from the GitHub workflow and
  incorporated into the normal build process
- removes appimagetool from .devcontainer/postCreateCommand.sh,
  as it was only used by .devcontainer/buildAppImage.sh
2025-07-10 04:50:12 +00:00
D. Rect.
7d04d66a0b refactor: pull appimage packaging steps out of github linux release workflow
- pulled the AppImage packaging steps out of the release workflow into the new
  src-tauri/build-utils/buildAppImage.sh
- cleaned up yarn scripts:
  - moved multi-platform yarn scripts out of yarn build:tauri:<platform>
    into a generic yarn build:tauri
  - split yarn build:tauri:linux:win32 into separate yarn scripts so it's
    clearer what is specific to which platform
- added src-tauri/build-utils/buildAppImage.sh to the new yarn build:tauri:linux
  script

    This is also a good entry point for adding Flatpak builds in the future.

    Part of #5641
2025-07-10 04:50:12 +00:00
D. Rect.
4134917a45 refactor: split platform specific config out of tauri.conf.json
Allows for better per-platform default config. Currently the
default serves Windows/macOS fine, while it has to be tweaked
in order to build for Linux.

make build-tauri now runs successfully where it errored out before.
AppImages made with make alone are still incomplete, however, as there
are post-processing steps remaining in the GitHub release workflow that
bundle additional resources.

- split platform-specific config out of tauri.conf.json into auxiliary
  platform-specific config files, natively supported by Tauri

- pull improved defaults out of template-tauri-build-linux-x64.yml
  into the new tauri.linux.conf.json

- fix tauri-build-linux-x64.yml to use the new tauri.linux.conf.json
2025-07-10 04:50:12 +00:00
Louis
a8ed759a06 fix: model download - windows path issue 2025-07-10 09:42:36 +07:00
Louis
61cd33284d
Merge pull request #5579 from bob-ros2/de_de-i18n
Add language support for locale de-DE (Germany)
2025-07-10 08:52:18 +07:00
Louis
46c95ebb97
feat: bump version of llama.cpp - b5833 2025-07-10 08:29:56 +07:00
Louis
9567bd9a49
Merge branch 'dev' into de_de-i18n 2025-07-08 16:42:32 +07:00
hiento09
d2d8778425
chore: disable coverage check for external contributor pr (#5728) 2025-07-08 16:36:38 +07:00
Louis
2f02a228cc
fix: download on windows 2025-07-08 15:41:17 +07:00
Bob Ros
e160339c7b
Merge branch 'dev' into de_de-i18n 2025-07-08 09:38:09 +02:00
Faisal Amir
60b7e6a081
🐛fix: think tag auto expand like tool tag behavior (#5727) 2025-07-08 11:11:46 +07:00
Bob Ros
0a3185f88d
Merge branch 'dev' into de_de-i18n 2025-07-07 22:18:55 +02:00
Louis
b26ae7d0a4
ci: remove cortex build steps 2025-07-07 22:39:04 +07:00
Akarshan
d5ffc6a476
feat: Migrate Jan's API server to llamacpp-extension
Things to ponder:
- Now, the v1/models endpoint of the API server will return an empty
  list if no models are loaded
- Streaming v1/chat/completion routing works, as does v1/models; needs
  further testing
2025-07-07 20:52:00 +05:30
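
A minimal sketch of the v1/models behaviour described above, assuming a hypothetical in-memory registry of loaded models (names such as `loadedModels` and `listModels` are illustrative and not taken from the codebase):

```typescript
// Hypothetical sketch: v1/models only reflects models currently loaded
// by the llamacpp extension, so the list is empty until a model starts.
interface ModelEntry {
  id: string
  object: 'model'
  created: number
  owned_by: string
}

// Illustrative registry; the real extension would populate this when a
// model session is started and remove entries when it stops.
const loadedModels = new Map<string, ModelEntry>()

function listModels(): { object: 'list'; data: ModelEntry[] } {
  // Returns an empty list until at least one model has been loaded.
  return { object: 'list', data: Array.from(loadedModels.values()) }
}
```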
Louis
e3faf09ab2
chore: try fixing CI 2025-07-07 21:27:37 +07:00
Louis
5ce9bbb304
fix: linux build without cortex binaries 2025-07-07 18:56:04 +07:00
Louis
e75bb0de27
fix: libvulkan download 2025-07-07 18:40:25 +07:00
Louis
6b496ae413
fix: build issues 2025-07-07 18:27:45 +07:00
Faisal Amir
42d4d48362
fix: broken hero image landing page website (#5719) 2025-07-07 14:06:59 +07:00
Louis
ccab7f9119
Merge pull request #5389 from STRRL/feat/identify-jan-on-openrouter
feat: identify jan for openrouter
2025-07-07 12:51:55 +07:00
Louis
a2c59e9934
Merge pull request #5715 from menloresearch/release/v0.6.6
Merge Release/v0.6.6 into release 0.6.5
2025-07-07 11:43:04 +07:00
hiento09
641275ee79
chore: add coverage report comment (#5716) 2025-07-07 11:37:57 +07:00
hiento09
3287e8b300
chore: enable test coverage (#5710)
* chore: enable test coverage
2025-07-07 11:24:13 +07:00
Faisal Amir
977a8a5774
Update web-app/src/routes/settings/providers/index.tsx
Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
2025-07-07 11:19:00 +07:00
Faisal Amir
1422d94fac
🐛fix: make three dots default show 3 dots and can trigger with right click (#5712)
* 🐛fix: default show 3 dots

* enhancement: enable resizable left panel (#5713)

* enhancement: enable resizable left panel

* Update web-app/src/hooks/useLeftPanel.ts

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

---------

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

---------

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
2025-07-07 11:14:43 +07:00
Faisal Amir
a0be23b500
enhancement: show readme on detail each model (#5705)
* 🧹cleanup: linter and log

* Update web-app/src/routes/hub/$modelId.tsx

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

---------

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
2025-07-07 09:54:16 +07:00
Louis
9696589e79
Merge pull request #5389 from STRRL/feat/identify-jan-on-openrouter
feat: identify jan for openrouter
2025-07-07 09:32:46 +07:00
Louis
9975580497
Merge pull request #5358 from ethanova/allow-assistant-message-edits 2025-07-07 08:22:24 +07:00
Zhiqiang ZHOU
9ff3cbe63f
Merge remote-tracking branch 'upstream/dev' into feat/identify-jan-on-openrouter 2025-07-06 11:54:46 -07:00
Ethan Garber
5bf78a31d9 streaming content doesn't need to deal with edits, so it doesn't need the updateMessage function 2025-07-05 21:41:29 -04:00
Ethan Garber
a1ff097336 Merge branch 'dev' into allow-assistant-message-edits 2025-07-05 19:58:24 -04:00
Akarshan
d4a3d6a0d6
Refactor session PID types from string to number across backend and extension
- Changed `pid` field in `SessionInfo` from `string` to `number`/`i32` in TypeScript and Rust.
- Updated `activeSessions` map key from `string` to `number` to align with new PID type.
- Adjusted process monitoring logic to correctly handle numeric PIDs.
- Removed fallback UUID-based PID generation in favor of numeric fallback (-1).
- Added PID cleanup logic in `is_process_running` when the process is no longer alive.
- Bumped application version from 0.5.16 to 0.6.900 in `tauri.conf.json`.
2025-07-04 21:40:54 +05:30
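
A rough TypeScript-side sketch of the type change described above (the `pid` field, the numeric `activeSessions` key, and the -1 fallback follow the commit message; the rest of the `SessionInfo` shape is assumed):

```typescript
// Assumed shape: pid is now a number (i32 on the Rust side) rather than a string.
interface SessionInfo {
  pid: number       // previously a string; -1 is the numeric fallback
  modelId: string   // illustrative field, not confirmed by the commit
}

// Map key switched from string to number to match the new PID type.
const activeSessions = new Map<number, SessionInfo>()

function registerSession(info: SessionInfo): void {
  // Use the numeric fallback (-1) when no real PID is available,
  // replacing the old UUID-based fallback.
  const pid = Number.isInteger(info.pid) ? info.pid : -1
  activeSessions.set(pid, { ...info, pid })
}
```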
Akarshan
dbdc031583
chore: store session_info in backend as well for API server (WIP) 2025-07-04 20:31:30 +05:30
Sam Hoang Van
36c2024cb3
fix: update base URL for Anthropic provider (#5600) 2025-07-04 09:43:45 +07:00
Bob Ros
4665876698
Merge branch 'menloresearch:dev' into de_de-i18n 2025-07-03 23:35:01 +02:00
Akarshan
ffef7b9cab enhancement: Add custom Jinja chat template option
Adds a new configuration option `chat_template` to the Llama.cpp extension, allowing users to define a custom Jinja chat template for the model.

The template can be provided via a new input field in the settings, and if set, it will be passed to the Llama.cpp backend using the `--chat-template` argument. This enhances flexibility for users who require specific chat formatting beyond the GGUF default.

The `chat_template` is added to the `LlamacppConfig` type and conditionally pushed to the command arguments if it's provided. The placeholder text provides an example of a Jinja template structure.
2025-07-03 23:38:16 +07:00