diff --git a/docs/src/pages/post/deepseek-r1-locally.mdx b/docs/src/pages/post/deepseek-r1-locally.mdx
index b36437439..a10ca9cf6 100644
--- a/docs/src/pages/post/deepseek-r1-locally.mdx
+++ b/docs/src/pages/post/deepseek-r1-locally.mdx
@@ -1,9 +1,5 @@
 ---
-<<<<<<< HEAD
 title: "Run DeepSeek R1 locally on your device (Beginner-Friendly Guide)"
-=======
-title: "Beginner's Guide: Run DeepSeek R1 Locally"
->>>>>>> origin/dev
 description: "A straightforward guide to running DeepSeek R1 locally for enhanced privacy, regardless of your background."
 tags: DeepSeek, R1, local AI, Jan, GGUF, Qwen, Llama
 categories: guides
@@ -14,7 +10,6 @@ ogImage: assets/run-deepseek-r1-locally-in-jan.jpg
 import { Callout } from 'nextra/components'
 import CTABlog from '@/components/Blog/CTA'
 
-<<<<<<< HEAD
 # Run DeepSeek R1 locally on your device (Beginner-Friendly Guide)
 
 ![image](./_assets/run-deepseek-r1-locally-in-jan.jpg)
@@ -24,13 +19,6 @@
 DeepSeek R1 is one of the best open-source models in the market right now, and y
 
 New to running AI models locally? Check out our [comprehensive guide on running AI models locally](/post/run-ai-models-locally) first. It covers essential concepts that will help you better understand this DeepSeek R1 guide.
 
-=======
-# Beginner's Guide: Run DeepSeek R1 Locally
-
-![image](./_assets/run-deepseek-r1-locally-in-jan.jpg)
-
-DeepSeek R1 brings state-of-the-art AI capabilities to your local machine. With optimized versions available for different hardware configurations, you can run this powerful model directly on your laptop or desktop computer. This guide will show you how to run open-source AI models like DeepSeek, Llama, or Mistral locally on your computer, regardless of your background.
->>>>>>> origin/dev
 Why use an optimized version?
 - Efficient performance on standard hardware