What Is the Minimum Hardware to Run 671b Deepseek 3.1
Key topics: LLM, Hardware Requirements, AI Model Deployment
URL for Ollama: https://ollama.com/library/deepseek-v3.1/tags
Has anyone gotten this model running, and with what hardware?
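Before picking hardware, it helps to estimate how much memory the weights alone need. A minimal back-of-the-envelope sketch, using the published 671B total parameter count; the quantization levels shown are illustrative, and real runtimes need additional memory for the KV cache and runtime overhead:

```python
# Rough minimum memory just to hold the 671B weights at common
# quantization levels. Back-of-the-envelope figures, not measurements.

TOTAL_PARAMS = 671e9  # DeepSeek-V3.1 total parameter count

def weight_gb(bits_per_param: float) -> float:
    """Memory for the raw weights, in GB (1 GB = 1e9 bytes)."""
    return TOTAL_PARAMS * bits_per_param / 8 / 1e9

for name, bits in [("fp16", 16), ("q8", 8), ("q4", 4)]:
    print(f"{name}: ~{weight_gb(bits):.0f} GB")
```

Even at 4-bit quantization the weights come to roughly 335 GB, which is why this model is usually discussed in terms of multi-GPU servers or large-RAM CPU boxes rather than a single consumer card.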
Key moments
- Story posted: Aug 26, 2025 at 5:32 PM EDT
- First comment: Aug 26, 2025 at 5:34 PM EDT (2m after posting)
- Peak activity: 2 comments in the 0-1h window
- Latest activity: Aug 26, 2025 at 7:21 PM EDT
ID: 45032578 · Type: story · Last synced: 11/18/2025, 12:08:58 AM
This is `ollama run deepseek-r1:70b`, a 43 GB model, on a 32 GB desktop.
Beyond that it's just a question of how fast you want to go. Pure CPU will be slow, but it will work. There have been a few stories and comments here[1] about people running this model.
[1]: https://hn.algolia.com/?q=671B+DeepSeek
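The "pure CPU will be slow" point can be sketched with a bandwidth estimate: token generation is roughly memory-bandwidth bound, since each token must stream the active weights from RAM. DeepSeek-V3 is a mixture-of-experts model with about 37B parameters active per token; the 80 GB/s bandwidth figure below is an assumed rough number for a dual-channel DDR5 desktop:

```python
# Rough upper bound on CPU decode speed, treating generation as
# purely memory-bandwidth bound. All figures are approximations.

ACTIVE_PARAMS = 37e9    # params active per token (MoE architecture)
BITS_PER_PARAM = 4      # assume q4 quantization
BANDWIDTH_GBS = 80      # assumed dual-channel DDR5 bandwidth, GB/s

bytes_per_token = ACTIVE_PARAMS * BITS_PER_PARAM / 8
tokens_per_s = BANDWIDTH_GBS * 1e9 / bytes_per_token
print(f"~{tokens_per_s:.1f} tokens/s")  # prints "~4.3 tokens/s"
```

A few tokens per second is usable for experimentation, and server platforms with more memory channels would scale the estimate up proportionally; real throughput will be lower once compute and cache effects are included.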