diff --git a/docs/src/pages/post/_assets/download-jan.jpg b/docs/src/pages/post/_assets/download-jan.jpg
new file mode 100644
index 000000000..f799260c7
Binary files /dev/null and b/docs/src/pages/post/_assets/download-jan.jpg differ
diff --git a/docs/src/pages/post/_assets/jan-hub-deepseek-r1.jpg b/docs/src/pages/post/_assets/jan-hub-deepseek-r1.jpg
new file mode 100644
index 000000000..12c0c6640
Binary files /dev/null and b/docs/src/pages/post/_assets/jan-hub-deepseek-r1.jpg differ
diff --git a/docs/src/pages/post/_assets/jan-hub-download-deepseek-r1-2.jpg b/docs/src/pages/post/_assets/jan-hub-download-deepseek-r1-2.jpg
new file mode 100644
index 000000000..24be4bd25
Binary files /dev/null and b/docs/src/pages/post/_assets/jan-hub-download-deepseek-r1-2.jpg differ
diff --git a/docs/src/pages/post/_assets/jan-hub-download-deepseek-r1.jpg b/docs/src/pages/post/_assets/jan-hub-download-deepseek-r1.jpg
new file mode 100644
index 000000000..83d9ab370
Binary files /dev/null and b/docs/src/pages/post/_assets/jan-hub-download-deepseek-r1.jpg differ
diff --git a/docs/src/pages/post/_assets/jan-library-deepseek-r1.jpg b/docs/src/pages/post/_assets/jan-library-deepseek-r1.jpg
new file mode 100644
index 000000000..6a54082dc
Binary files /dev/null and b/docs/src/pages/post/_assets/jan-library-deepseek-r1.jpg differ
diff --git a/docs/src/pages/post/_assets/jan-runs-deepseek-r1-distills.jpg b/docs/src/pages/post/_assets/jan-runs-deepseek-r1-distills.jpg
new file mode 100644
index 000000000..02ce847f4
Binary files /dev/null and b/docs/src/pages/post/_assets/jan-runs-deepseek-r1-distills.jpg differ
diff --git a/docs/src/pages/post/_assets/jan-system-prompt-deepseek-r1.jpg b/docs/src/pages/post/_assets/jan-system-prompt-deepseek-r1.jpg
new file mode 100644
index 000000000..f79e71af0
Binary files /dev/null and b/docs/src/pages/post/_assets/jan-system-prompt-deepseek-r1.jpg differ
diff --git a/docs/src/pages/post/_assets/run-deepseek-r1-locally-in-jan.jpg b/docs/src/pages/post/_assets/run-deepseek-r1-locally-in-jan.jpg
new file mode 100644
index 000000000..aa6980585
Binary files /dev/null and b/docs/src/pages/post/_assets/run-deepseek-r1-locally-in-jan.jpg differ
diff --git a/docs/src/pages/post/deepseek-r1-locally.mdx b/docs/src/pages/post/deepseek-r1-locally.mdx
new file mode 100644
index 000000000..688fa75a3
--- /dev/null
+++ b/docs/src/pages/post/deepseek-r1-locally.mdx
@@ -0,0 +1,109 @@
+---
+title: "Beginner's Guide: Run DeepSeek R1 Locally (Private)"
+description: "Quick steps on how to run DeepSeek R1 locally for full privacy. Perfect for beginners—no coding required."
+tags: DeepSeek, R1, local AI, Jan, GGUF, Qwen, Llama
+categories: guides
+date: 2025-01-31
+ogImage: assets/run-deepseek-r1-locally-in-jan.jpg
+---
+
+import { Callout } from 'nextra/components'
+import CTABlog from '@/components/Blog/CTA'
+
+# Beginner’s Guide: Run DeepSeek R1 Locally
+
+
+![Run DeepSeek R1 locally in Jan](./_assets/run-deepseek-r1-locally-in-jan.jpg)
+
+You can run DeepSeek R1 on your own computer! The full model is far too large for consumer hardware, so we'll use one of the smaller distilled versions that run well on regular computers.
+
+Why use a smaller version?
+- Works smoothly on most modern computers
+- Downloads much faster
+- Uses less storage space on your computer
+
+## Quick Steps at a Glance
+1. Download and install [Jan](https://jan.ai/) (just like any other app!)
+2. Pick a version that fits your computer
+3. Choose the best settings
+4. Set up a quick template & start chatting!
+
+Keep reading for a step-by-step guide with pictures.
+
+## Step 1: Download Jan
+[Jan](https://jan.ai/) is a free app that helps you run AI models on your computer. It works on Windows, Mac, and Linux, and it's super easy to use - no coding needed!
+
+
+![Download Jan](./_assets/download-jan.jpg)
+
+- Get Jan from [jan.ai](https://jan.ai)
+- Install it like you would any other app
+- That's it! Jan takes care of all the technical stuff for you
+
+## Step 2: Choose Your DeepSeek R1 Version
+DeepSeek R1 comes in different sizes. Let's help you pick the right one for your computer.
+
+
+<Callout>
+💡 **Not sure how much VRAM your computer has?**
+- Windows: Press Windows + R, type "dxdiag", press Enter, and click the "Display" tab
+- Mac: Click Apple menu > About This Mac > More Info > Graphics/Displays
+</Callout>
+
+Below is a detailed table showing which version you can run based on your computer's VRAM:
+
+| Version | Link to Paste into Jan Hub | Required VRAM for smooth performance |
+|---------|---------------------------|---------------|
+| Qwen 1.5B | [https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-1.5B-GGUF](https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-1.5B-GGUF) | 6GB+ VRAM |
+| Qwen 7B | [https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-7B-GGUF](https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-7B-GGUF) | 8GB+ VRAM |
+| Llama 8B | [https://huggingface.co/unsloth/DeepSeek-R1-Distill-Llama-8B-GGUF](https://huggingface.co/unsloth/DeepSeek-R1-Distill-Llama-8B-GGUF) | 8GB+ VRAM |
+| Qwen 14B | [https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-14B-GGUF](https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-14B-GGUF) | 16GB+ VRAM |
+| Qwen 32B | [https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-32B-GGUF](https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-32B-GGUF) | 24GB+ VRAM |
+| Llama 70B | [https://huggingface.co/unsloth/DeepSeek-R1-Distill-Llama-70B-GGUF](https://huggingface.co/unsloth/DeepSeek-R1-Distill-Llama-70B-GGUF) | 48GB+ VRAM |
+
+
+Quick Guide:
+- 6GB VRAM? Start with the 1.5B version - it's fast and works great!
+- 8GB VRAM? Try the 7B or 8B versions - good balance of speed and smarts
+- 16GB+ VRAM? You can run the larger versions for even better results
+
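Curious where these VRAM numbers come from? A Q4-quantized model needs roughly 4-5 bits per parameter, plus some headroom for the context cache and runtime buffers. The sketch below is just that back-of-the-envelope arithmetic, not anything Jan itself runs; the 4.5 bits-per-weight and 20% overhead figures are assumptions for illustration.

```python
# Rough rule of thumb: VRAM ≈ parameters * bits-per-weight / 8 bytes,
# plus ~20% overhead for the KV cache and runtime buffers.
# The defaults here are illustrative assumptions, not exact figures.
def approx_vram_gb(params_billions: float,
                   bits_per_weight: float = 4.5,
                   overhead: float = 1.2) -> float:
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return round(bytes_total * overhead / 1e9, 1)

for name, size in [("Qwen 1.5B", 1.5), ("Qwen 7B", 7.0),
                   ("Llama 8B", 8.0), ("Qwen 14B", 14.0)]:
    print(name, approx_vram_gb(size), "GB")  # e.g. Qwen 7B ≈ 4.7 GB
```

The estimates land comfortably under the table's recommendations, which is the point: the table leaves margin so the model runs smoothly rather than just barely fitting.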
+
+Ready to download? Here's how:
+1. Open Jan and click the button in the left sidebar to open Jan Hub
+2. Find the "Add Model" section (shown below)
+
+
+![Jan Hub in Jan](./_assets/jan-hub-deepseek-r1.jpg)
+
+3. Copy the link for your chosen version and paste it here:
+
+
+![Paste the model link into Jan Hub](./_assets/jan-hub-download-deepseek-r1.jpg)
+
+## Step 3: Choose Model Settings
+When adding your model, you'll see two options:
+
+
+<Callout>
+- **Q4:** Perfect for most users - fast and works great! ✨ (Recommended)
+- **Q8:** Slightly more accurate but needs more powerful hardware
+</Callout>
+
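The practical difference between Q4 and Q8 is bits per weight, which translates directly into file size and memory use. A back-of-the-envelope sketch, weights only and using the 7B model purely as an example:

```python
# Approximate GGUF weight size in GB: parameters (billions) * bits / 8.
# Weights only -- the context cache and runtime buffers add more on top.
def weight_size_gb(params_billions: float, bits: int) -> float:
    return round(params_billions * bits / 8, 1)

print(weight_size_gb(7, 4))  # Q4: 3.5 GB
print(weight_size_gb(7, 8))  # Q8: 7.0 GB
```

Q8 is roughly twice the size of Q4 for the same model, which is why it needs more powerful hardware for only a small accuracy gain.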
+## Step 4: Set Up & Start Chatting
+Almost done! Just one quick setup:
+
+1. Click Model Settings in the sidebar
+2. Find the Prompt Template box
+3. Copy and paste this exactly:
+
+
+```
+<|User|>{prompt}<|Assistant|>
+```
+
+![Set the prompt template in Jan](./_assets/jan-system-prompt-deepseek-r1.jpg)
+
+This helps DeepSeek understand when you're talking and when it should respond.
+
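Under the hood, that template is just a string with a `{prompt}` placeholder that gets replaced by your message before it reaches the model. A minimal sketch of the substitution (this mirrors the idea, not Jan's internal code):

```python
# The prompt template wraps the user's message in the special tokens
# DeepSeek R1 was trained on; {prompt} is replaced before inference.
TEMPLATE = "<|User|>{prompt}<|Assistant|>"

def render(user_message: str) -> str:
    return TEMPLATE.format(prompt=user_message)

print(render("What is 2 + 2?"))  # <|User|>What is 2 + 2?<|Assistant|>
```
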
+Now you're ready to start chatting!
+
+
+![DeepSeek R1 distills running locally in Jan](./_assets/jan-runs-deepseek-r1-distills.jpg)
+
+## Need help?
+
+
+Having trouble? We're here to help! [Join our Discord community](https://discord.gg/Exe46xPMbK) for support.
+
+<CTABlog />
+