Update website/src/content/blog/data-is-moat.mdx
Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
This commit is contained in:
parent 654920b3b4
commit fc56c418d6
@@ -102,7 +102,7 @@ In the open-source community, 2 notable examples of fine-tuning with Mistral as
## Conclusion
-The ownership and strategic use of pre-trained data serve as an invisible moat. It not only enables the tackling of complex challenges like catastrophic forgetting but also provides a baseline for continuous, targeted improvements. Although there is a solution to decomotralize, the cost remains reasonably high.
+The ownership and strategic use of pre-trained data serve as an invisible moat. It not only enables the tackling of complex challenges like catastrophic forgetting but also provides a baseline for continuous, targeted improvements. Although there is a solution to decentralize, the cost remains reasonably high.
Fully open pretrained + open weight