jan/extensions
Akarshan ea231676bf
fix: correct flash_attn and main_gpu flag checks in llamacpp extension
Previously the condition for `flash_attn` was always truthy, causing
unnecessary or incorrect `--flash-attn` arguments to be added. The
`main_gpu` check also used a loose inequality that could match
unintended values. The updated logic uses strict comparison and
correctly handles the empty-string case, so the command-line
arguments are generated only when appropriate.
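A minimal sketch of the corrected checks, assuming a settings object with `flash_attn` and `main_gpu` fields (the field names, types, and `buildArgs` helper are hypothetical, not the extension's actual code):

```typescript
// Hypothetical settings shape; the real extension's types may differ.
interface LlamacppSettings {
  flash_attn?: string;          // e.g. "on", "off", or "" when unset
  main_gpu?: number | string;   // GPU index, or "" when unset
}

function buildArgs(settings: LlamacppSettings): string[] {
  const args: string[] = [];

  // Before: a truthiness-style check (e.g. testing only that the field
  // exists) passed for "" and "off" too. A strict comparison adds the
  // flag only when it is explicitly enabled.
  if (settings.flash_attn === "on") {
    args.push("--flash-attn");
  }

  // Before: a loose inequality (`!=`) could coerce and match unintended
  // values. Strict checks that also exclude the empty string avoid
  // emitting a bogus `--main-gpu` argument.
  if (settings.main_gpu !== undefined && settings.main_gpu !== "") {
    args.push("--main-gpu", String(settings.main_gpu));
  }

  return args;
}
```

With this shape, an unset or empty setting produces no flag, while `{ flash_attn: "on", main_gpu: 0 }` yields both arguments.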
2025-10-30 19:49:55 +05:30