chore: fix typo
parent 5a4f60868d
commit 5bd87aadcc
@@ -27,7 +27,7 @@ We present a straightforward approach to adapting small, open-source models for
In short, (1) extending a general foundation model like [Mistral](https://huggingface.co/mistralai/Mistral-7B-v0.1) with strong math and coding skills, (2) training it on a high-quality synthetic dataset generated from the intended corpus, and (3) adding RAG capabilities can lead to significant accuracy improvements.
-Problems still arise with catastrophic forgetting in general tasks, commonly observed during continued fine-tuning [1]. In our case, this is likely exacerbated by our lack of access to Mistral’s original training dataset and various compression techniques used in our approach to keep the model small.
+Problems still arise with catastrophic forgetting in general tasks, commonly observed during specialized domain fine-tuning. In our case, this is likely exacerbated by our lack of access to Mistral’s original training dataset and various compression techniques used in our approach to keep the model small.
## Selecting a strong foundation model
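
The three-step recipe summarized in the hunk above pairs a fine-tuned model with retrieval over the intended corpus. Below is a minimal sketch of the RAG piece (step 3), assuming a fine-tuned checkpoint already exists; the `embed` helper, the toy corpus, and the query are illustrative placeholders, not part of the post's actual pipeline.

```python
import numpy as np

def embed(texts):
    """Placeholder embedder: deterministic random vectors keyed on the text.
    A real pipeline would call a sentence-embedding model here."""
    vecs = []
    for t in texts:
        rng = np.random.default_rng(abs(hash(t)) % (2**32))
        vecs.append(rng.normal(size=384))
    return np.stack(vecs)

def retrieve(query, corpus, corpus_emb, k=2):
    """Return the k corpus passages with the highest cosine similarity to the query."""
    q = embed([query])[0]
    sims = corpus_emb @ q / (np.linalg.norm(corpus_emb, axis=1) * np.linalg.norm(q) + 1e-8)
    top = np.argsort(-sims)[:k]
    return [corpus[i] for i in top]

def build_prompt(query, passages):
    """Prepend retrieved passages so the fine-tuned model answers from the corpus."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )

if __name__ == "__main__":
    # Toy domain corpus and query, invented purely for illustration.
    corpus = [
        "Widget v2 requires firmware 1.4 or later.",
        "The calibration routine must be run after every firmware update.",
        "Support tickets are triaged within one business day.",
    ]
    corpus_emb = embed(corpus)
    query = "When should the calibration routine be run?"
    prompt = build_prompt(query, retrieve(query, corpus, corpus_emb))
    print(prompt)  # this prompt would then be sent to the fine-tuned checkpoint
```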