jan/extensions/llamacpp-extension
Akarshan Biswas 5ff7935d91
fix: include lm_head and embedding layers in totalLayers count (#6415)
The original calculation used only the `block_count` from the model metadata, which excludes the final LM head and the embedding layer. This caused an underestimation of the total number of layers and consequently an incorrect `layerSize` value. Adding `+2` accounts for these two missing layers, ensuring accurate model size metrics.
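A minimal TypeScript sketch of the corrected calculation, assuming the metadata exposes a block count and a total model size; the names `ModelMetadata` and `estimateLayerSize` are illustrative, not the extension's actual API:

```typescript
// Illustrative types and names only (not the extension's real API).
interface ModelMetadata {
  blockCount: number;     // `block_count` from the GGUF metadata (transformer blocks only)
  totalSizeBytes: number; // total model size on disk
}

function estimateLayerSize(meta: ModelMetadata): number {
  // block_count excludes the embedding layer and the final LM head,
  // so add 2 to get the true total layer count.
  const totalLayers = meta.blockCount + 2;
  // Average bytes per layer, used for memory/offloading estimates.
  return meta.totalSizeBytes / totalLayers;
}

// Example: a 32-block model of ~4 GiB is split over 34 layers, not 32.
const layerSize = estimateLayerSize({
  blockCount: 32,
  totalSizeBytes: 4 * 1024 ** 3,
});
console.log(`~${(layerSize / 1024 ** 2).toFixed(1)} MiB per layer`);
```

Without the `+2`, the per-layer size is slightly overestimated, which skews any downstream memory or offload calculations based on it.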
2025-09-11 11:40:39 +05:30