5700 Commits

Author SHA1 Message Date
Louis
646f40d664
chore: token-js version bump 2025-07-16 11:37:40 +07:00
Ethan Garber
a3b95f01de add comment to justify style block 2025-07-15 21:14:24 -04:00
Ethan Garber
b0e66993fe set line number userSelect to none so that code can be copied without line number 2025-07-15 21:10:22 -04:00
Louis
bd3b8bff35
Merge pull request #5781 from menloresearch/test/add-missing-tests
test: add missing unit tests
2025-07-15 22:56:22 +07:00
Louis
9872a6e82a test: add missing unit tests 2025-07-15 22:29:28 +07:00
Louis
f083fafcfd
Merge pull request #5776 from menloresearch/fix/download-icon
🐛fix: download icon when left panel close
2025-07-15 11:44:29 +07:00
Louis
08fe2c27fd
fix: translations 2025-07-15 11:11:00 +07:00
Nguyen Ngoc Minh
f674c786ba
Merge pull request #5778 from menloresearch/fix/revert-windows-installation-mode
fix: revert installationmode in nsis template
2025-07-15 11:02:42 +07:00
Faisal Amir
02c049653e
🐛fix: revert back stat hover for three dots (#5777) 2025-07-15 10:34:02 +07:00
Minh141120
65bc24530f fix: change installationmode in nsis template 2025-07-15 10:23:22 +07:00
Louis
8e85c2fd06
fix: bump llama.cpp b5857 on windows 2025-07-15 10:15:03 +07:00
Faisal Amir
55b68df956 🐛fix: download icon when left panel close 2025-07-15 09:39:51 +07:00
Sam Hoang Van
9a76c94e22
update rmcp to fix issues (#5290) 2025-07-14 16:49:27 +07:00
Akarshan Biswas
dee98f41d1
Feat: Improved llamacpp Server Stability and Diagnostics (#5761)
* feat: Improve llamacpp server error reporting and model load stability

This commit introduces significant improvements to how the llamacpp server
process is managed and how its errors are reported.

Key changes:
- **Enhanced Error Reporting:** The llamacpp server's stdout and stderr
  are now piped and captured. If the llamacpp process exits prematurely
  or fails to start, its stderr output is captured and returned as a
  `LlamacppError`. This provides much more specific and actionable
  diagnostic information for users and developers.
- **Increased Model Load Timeout:** The `waitForModelLoad` timeout has
  been increased from 30 seconds to 240 seconds (4 minutes). This
  addresses issues where larger models or slower systems would
  prematurely time out during the model loading phase.
- **API Secret Update:** The internal API secret for the llamacpp
  extension has been updated from 'Jan' to 'JustAskNow'.
- **Version Bump:** The application version in `tauri.conf.json` has
  been incremented to `0.6.901`.

* fix: should not spam load requests

* test: add test to cover the fix

* refactor: clean up

* test: add more test case

---------

Co-authored-by: Louis <louis@jan.ai>
2025-07-14 11:55:44 +05:30
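The error-reporting change described above can be sketched roughly as follows. This is a minimal illustration, not the extension's actual code: `spawn_and_watch` and the `LlamacppError` wrapper are hypothetical names, and the real implementation presumably also polls a health endpoint rather than only watching the process.

```rust
use std::io::Read;
use std::process::{Command, Stdio};
use std::time::{Duration, Instant};

/// Hypothetical stand-in for the commit's `LlamacppError`.
#[derive(Debug)]
pub struct LlamacppError(pub String);

/// Spawn a server-like process with piped output. If it exits prematurely
/// (non-zero status) inside the startup window, capture its stderr and
/// return it as the error payload, as the commit describes.
pub fn spawn_and_watch(
    cmd: &str,
    args: &[&str],
    startup_window: Duration,
) -> Result<(), LlamacppError> {
    let mut child = Command::new(cmd)
        .args(args)
        .stdout(Stdio::piped()) // piped so output can be captured
        .stderr(Stdio::piped())
        .spawn()
        .map_err(|e| LlamacppError(format!("failed to start: {e}")))?;

    let start = Instant::now();
    loop {
        match child.try_wait() {
            // Premature failure: surface stderr as actionable diagnostics.
            Ok(Some(status)) if !status.success() => {
                let mut err = String::new();
                if let Some(mut pipe) = child.stderr.take() {
                    let _ = pipe.read_to_string(&mut err);
                }
                return Err(LlamacppError(err));
            }
            Ok(Some(_)) => return Ok(()), // clean exit
            // Still running after the window: treat startup as healthy.
            // (Upstream widened this window from 30 s to 240 s.)
            Ok(None) if start.elapsed() > startup_window => return Ok(()),
            Ok(None) => std::thread::sleep(Duration::from_millis(50)),
            Err(e) => return Err(LlamacppError(e.to_string())),
        }
    }
}
```

Capturing stderr at the point of failure is what turns a generic "process exited" message into the specific llama.cpp load error the commit aims to report.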
Akarshan Biswas
96ba42e411
feat: Add missing ctx-shift toggle (#5765)
* feat: Add missing ctx_shift

* fix typo

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>

* refine description

---------

Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
2025-07-14 11:51:34 +05:30
Louis
eaf4b1b954
Merge pull request #5757 from menloresearch/test/add-tests
test: add missing unit tests
2025-07-14 09:55:53 +07:00
Sherzod Mutalov
1ff62de3b6
Merge branch 'dev' into feat/old-mac-support 2025-07-13 16:29:36 +05:00
Louis
03bcd02002
test: add missing unit tests 2025-07-12 22:46:27 +07:00
Louis
c2790d9181
test: remove route tests 2025-07-12 21:35:49 +07:00
Louis
864ad50880
test: add missing tests 2025-07-12 21:29:51 +07:00
Louis
c5fd964bf2 test: add missing tests 2025-07-12 20:15:45 +07:00
Louis
191537884e
Merge pull request #5759 from menloresearch/fix/test-coverage-upload-lcov
fix: vitest config - coverage lcov
2025-07-12 20:14:42 +07:00
Louis
09a45baa5d
fix: vitest config - coverage lcov 2025-07-12 20:09:31 +07:00
Louis
c23c34583d
Merge pull request #4965 from menloresearch/feat/inference-llamacpp-extension
feat: llama.cpp provider extension to replace cortex.cpp
2025-07-11 09:25:47 +07:00
Louis
b2ce138ea0
test: add tests 2025-07-11 09:21:11 +07:00
Louis
b8259e7794 feat: add HF token setting 2025-07-11 00:05:52 +07:00
Louis
9cea579c8e
fix: build issue 2025-07-10 22:16:31 +07:00
Louis
114f2b9092
ci: attempt to upload cov 2025-07-10 21:51:13 +07:00
Louis
963ad448f5
fix: build 2025-07-10 21:23:04 +07:00
Nguyen Ngoc Minh
0f7f2a7b38
Merge pull request #5745 from menloresearch/refactor/tauri-workflow
refactor: tauri workflow
2025-07-10 21:18:26 +07:00
Louis
a770e08013
test: migrate jest to vitest 2025-07-10 21:14:21 +07:00
Louis
1c7a20be44
fix: linux build 2025-07-10 21:14:20 +07:00
Louis
bc0ea343cc
fix: remove legacy build step 2025-07-10 20:24:11 +07:00
Louis
37718d1e71
fix: build issue with legacy libs 2025-07-10 20:17:20 +07:00
Louis
af8404d627
fix: tests 2025-07-10 20:16:09 +07:00
Louis
5fe4cc6bab
chore: remove cortex install step 2025-07-10 20:09:26 +07:00
Louis
389721ba89
fix: build step 2025-07-10 16:49:21 +07:00
Louis
ca6f4f8977
test: fix failed tests 2025-07-10 16:25:47 +07:00
Louis
6e0218c084
Merge branch 'release/v0.7.0' into feat/inference-llamacpp-extension
# Conflicts:
#	.devcontainer/buildAppImage.sh
#	.github/workflows/template-tauri-build-linux-x64.yml
#	Makefile
#	core/src/node/extension/index.test.ts
#	package.json
#	src-tauri/tauri.conf.json
#	web-app/package.json
2025-07-10 15:36:41 +07:00
Louis
94d9304c0b
Merge branch 'dev' into release/v0.7.0 2025-07-10 15:33:21 +07:00
Minh141120
94ada5969b refactor: deprecate electron-checksum.py script 2025-07-10 15:05:51 +07:00
Minh141120
39256dad09 refactor: clean up tauri build workflow for macos 2025-07-10 15:05:21 +07:00
Minh141120
291a32759a refactor: clean up tauri build workflow for windows 2025-07-10 15:03:41 +07:00
Minh141120
3790bd5753 refactor: clean up tauri build workflow for linux 2025-07-10 15:01:52 +07:00
Minh141120
0fd346181c chore: enable active installation for window installer 2025-07-10 14:57:55 +07:00
Nguyen Ngoc Minh
5b01d0c196
Merge pull request #5732 from DistractionRectangle/refactor/linux-build-process
Refactor/linux build process
2025-07-10 13:03:01 +07:00
Louis
08342b5b00
Merge pull request #5742 from menloresearch/feat/bump-version-of-llamacpp
feat: bump version of llama.cpp - b5857
2025-07-10 12:51:36 +07:00
Louis
10bb8527bd
feat: bump llama.cpp b5857 2025-07-10 11:51:27 +07:00
D. Rect.
a668204cdc refactor: pin linuxdeploy in make/yarn build process instead of github workflow
- pulls fix for #5463 out of the github release workflow and into
  the make/yarn build process
- implements a wrapper script that pins linuxdeploy and injects
  a new location for XDG_CACHE_HOME into the build pipeline,
  allowing manipulation of .cache/tauri without tainting the host's
  .cache
- adds ./.cache (project_root/.cache) to make clean and mise clean
  task
- remove .devcontainer/buildAppImage.sh, obsolete now that extra
  build steps have been removed from the github workflow and
  incorporated in the normal build process
- remove appimagetool from .devcontainer/postCreateCommand.sh,
  as it was only used by .devcontainer/buildAppImage.sh
2025-07-10 04:50:12 +00:00
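The wrapper described in this commit could look something like the sketch below. All names here are illustrative, not the repository's actual script: the two pieces are pinning one linuxdeploy binary under the project's own `.cache` and injecting `XDG_CACHE_HOME` so the build never touches the host's `~/.cache`.

```shell
# Illustrative sketch of the wrapper idea (hypothetical function names).

# Resolve the pinned linuxdeploy path under project_root/.cache; fetch it
# only when missing (the actual download of a fixed release is elided).
pinned_linuxdeploy() {
    tool="$(pwd)/.cache/linuxdeploy-x86_64.AppImage"
    mkdir -p "$(dirname "$tool")"
    if [ ! -x "$tool" ]; then
        echo "fetch the pinned linuxdeploy release into $tool here" >&2
    fi
    printf '%s\n' "$tool"
}

# Run the real build command with tauri's cache redirected into the
# project, so .cache/tauri can be manipulated (and cleaned by `make clean`)
# without tainting the host's .cache.
with_project_cache() {
    XDG_CACHE_HOME="$(pwd)/.cache" "$@"
}
```

Keeping the cache inside the project root is also what makes the `make clean` / `mise clean` addition of `./.cache` sufficient to reset the build.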
D. Rect.
7d04d66a0b refactor: pull appimage packaging steps out of github linux release workflow
- pulled appimage packaging steps out of release workflow into new
  src-tauri/build-utils/buildAppImage.sh
- cleaned up yarn scripts:
  - moved multi platform yarn scripts out of yarn build:tauri:<platform>
    into generic yarn build:tauri
  - split yarn build:tauri:linux:win32 into separate yarn scripts so it's
    clearer what is specific to which platform
- added src-tauri/build-utils/buildAppImage.sh to new yarn build:tauri:linux
  yarn script

    This is also a good entry point to add flatpak builds in the future.

    Part of #5641
2025-07-10 04:50:12 +00:00
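The yarn-script split this commit describes might look like the fragment below. Only the script names `build:tauri`, `build:tauri:linux` and the `src-tauri/build-utils/buildAppImage.sh` path come from the commit; the command bodies are assumptions for illustration.

```json
{
  "scripts": {
    "build:tauri": "tauri build",
    "build:tauri:linux": "yarn build:tauri && bash src-tauri/build-utils/buildAppImage.sh",
    "build:tauri:win32": "yarn build:tauri"
  }
}
```

Splitting the combined `linux:win32` script this way makes it clear that only the Linux path carries the extra AppImage packaging step, which is also where a future flatpak step would slot in.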