diff --git a/website/src/content/blog/data-is-moat.mdx b/website/src/content/blog/data-is-moat.mdx
index 2172ef0e0..5e238103a 100644
--- a/website/src/content/blog/data-is-moat.mdx
+++ b/website/src/content/blog/data-is-moat.mdx
@@ -102,7 +102,7 @@ In the open-source community, 2 notable examples of fine-tuning with Mistral as
 
 ## Conclusion
 
-The ownership and strategic use of pre-trained data serve as an invisible moat. It not only enables the tackling of complex challenges like catastrophic forgetting but also provides a baseline for continuous, targeted improvements. Although there is a solution to decomotralize, the cost remains reasonably high.
+The ownership and strategic use of pre-training data serve as an invisible moat. It not only enables tackling complex challenges like catastrophic forgetting but also provides a baseline for continuous, targeted improvements. Although decentralized alternatives exist, the cost remains substantial.
 
 Fully open pretrained + open weight
 