Louis
19cb1c96e0
fix: llama.cpp backend download on windows (#5813)
* fix: llama.cpp backend download on windows
* test: add missing cases
* clean: linter
* fix: build
2025-07-20 16:58:09 +07:00
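A Windows download fix in this area usually comes down to platform branching when picking the release asset. A minimal sketch of that kind of branching, assuming a hypothetical asset-naming helper and naming scheme (the actual fix and asset names are not shown in the log):

```typescript
// Hypothetical sketch: pick a release-asset name per platform.
// The naming scheme and `cudaSuffix` parameter are assumptions,
// not the extension's real convention.
function backendAssetName(platform: string, arch: string, cudaSuffix?: string): string {
  // Windows releases commonly ship as .zip; Linux/macOS as .tar.gz.
  const ext = platform === "win32" ? "zip" : "tar.gz";
  const variant = cudaSuffix ? `-${cudaSuffix}` : "";
  return `llama-bin-${platform}-${arch}${variant}.${ext}`;
}
```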
Louis
8ca507c01c
feat: proxy support for the new downloader (#5795)
* feat: proxy support for the new downloader
* test: remove outdated test
* ci: clean up
2025-07-17 23:10:21 +07:00
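Proxy support for a downloader typically follows the `HTTPS_PROXY` / `NO_PROXY` convention, where certain hosts bypass the proxy. A sketch of the bypass matching under that assumption; `shouldBypassProxy` is a hypothetical helper, not the API this commit actually added:

```typescript
// Sketch of NO_PROXY-style matching: a comma-separated list of hosts
// and domain suffixes that should skip the proxy.
function shouldBypassProxy(host: string, noProxy: string): boolean {
  const h = host.toLowerCase();
  return noProxy
    .split(",")
    .map((e) => e.trim().toLowerCase())
    .filter((e) => e.length > 0)
    .some((entry) => {
      if (entry === "*") return true; // wildcard: bypass for everything
      const bare = entry.replace(/^\./, ""); // ".example.com" == "example.com"
      // Match the host itself and any subdomain of it.
      return h === bare || h.endsWith("." + bare);
    });
}
```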
Akarshan
37151ba926
feat: auto load and download default backend during first launch
2025-07-03 09:13:32 +05:30
Thien Tran
525cc93d4a
fix system cudart detection on linux
2025-07-02 12:27:34 +07:00
Thien Tran
65d6f34878
check for system libraries
2025-07-02 12:27:17 +07:00
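On Linux, one common way to check for a system-installed library such as cudart is scanning `ldconfig -p` output. A sketch of that parsing, assuming ldconfig's usual `libname (flags) => /path` line format; the helper name is hypothetical and the commits do not show the actual detection code:

```typescript
// Find a library in `ldconfig -p` output and return its resolved path,
// or null when the library is not in the linker cache.
function findSystemLib(ldconfigOutput: string, libName: string): string | null {
  for (const line of ldconfigOutput.split("\n")) {
    const trimmed = line.trim();
    if (trimmed.startsWith(libName)) {
      const arrow = trimmed.indexOf("=>");
      if (arrow !== -1) return trimmed.slice(arrow + 2).trim();
    }
  }
  return null;
}
```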
Thien Tran
622f4118c0
add placeholder for windows and linux arm
2025-07-02 12:27:17 +07:00
Thien Tran
f7bcf43334
update folder structure. small refactoring
2025-07-02 12:27:16 +07:00
Thien Tran
494a47aaa5
fix download condition
2025-07-02 12:27:14 +07:00
Thien Tran
f32ae402d5
fix CUDA version URL
2025-07-02 12:27:14 +07:00
Thien Tran
27146eb5cc
fix feature parsing
2025-07-02 12:27:14 +07:00
Thien Tran
a75d13f42f
fix version compare
2025-07-02 12:27:14 +07:00
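The classic version-compare bug is string comparison, which sorts `"b4500"` after `"b10000"` lexicographically. A numeric comparison sketch of the kind such a fix usually introduces; the helper is hypothetical, not the commit's actual code:

```typescript
// Compare two version strings numerically, segment by segment.
// A leading non-digit prefix (e.g. "b" in a build tag) is stripped first.
function compareVersions(a: string, b: string): number {
  const pa = a.replace(/^[^\d]*/, "").split(".").map(Number);
  const pb = b.replace(/^[^\d]*/, "").split(".").map(Number);
  const len = Math.max(pa.length, pb.length);
  for (let i = 0; i < len; i++) {
    const diff = (pa[i] ?? 0) - (pb[i] ?? 0); // missing segments count as 0
    if (diff !== 0) return Math.sign(diff);
  }
  return 0;
}
```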
Thien Tran
3490299f66
refactor get supported features. check driver version for cu11 and cu12
2025-07-02 12:27:13 +07:00
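Gating CUDA 11 vs CUDA 12 builds on the installed NVIDIA driver can be sketched as below. The thresholds are the commonly cited Linux minimum driver versions (450.80.02 for CUDA 11.x, 525.60.13 for CUDA 12.x); the helper names are assumptions, not the extension's real API:

```typescript
// Return true when `driver` is >= `minimum`, comparing dotted segments numerically.
function driverAtLeast(driver: string, minimum: string): boolean {
  const d = driver.split(".").map(Number);
  const m = minimum.split(".").map(Number);
  for (let i = 0; i < Math.max(d.length, m.length); i++) {
    const a = d[i] ?? 0;
    const b = m[i] ?? 0;
    if (a !== b) return a > b;
  }
  return true;
}

// Map an NVIDIA driver version to the CUDA backend variants it can run.
function supportedCudaVariants(driverVersion: string): string[] {
  const variants: string[] = [];
  if (driverAtLeast(driverVersion, "450.80.02")) variants.push("cu11");
  if (driverAtLeast(driverVersion, "525.60.13")) variants.push("cu12");
  return variants;
}
```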
Thien Tran
fbfaaf43c5
download CUDA libs if needed
2025-07-02 12:27:13 +07:00
Thien Tran
40cd7e962a
feat: download backend for llama.cpp extension (#5123)
* wip
* update
* add download logic
* add decompress. support delete file
* download backend upon selecting setting
* add some logging and notes
* add note on race condition
* remove then catch
* default to none backend. only download if it's not installed
* merge version and backend. fetch version from GH
* restrict scope of output_dir
* add note on unpack
2025-07-02 12:27:13 +07:00
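The body above mentions both "only download if it's not installed" and a race condition when the same backend is requested twice. One common shape for that guard is deduplicating in-flight downloads by key; a minimal sketch with hypothetical names, under the assumption of an in-memory registry:

```typescript
// Installed backends and downloads currently in flight (hypothetical state).
const installed = new Set<string>();
const inFlight = new Map<string, Promise<void>>();

// Ensure a backend exists: skip if installed, and if two callers race on the
// same key, the second awaits the first download instead of starting another.
async function ensureBackend(
  key: string,
  download: (k: string) => Promise<void>
): Promise<void> {
  if (installed.has(key)) return; // already present: nothing to do
  let pending = inFlight.get(key);
  if (!pending) {
    pending = download(key)
      .then(() => { installed.add(key); })
      .finally(() => { inFlight.delete(key); });
    inFlight.set(key, pending);
  }
  return pending;
}
```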