
How much VRAM for DeepSeek R1 70B?

Topic starter

I'm planning to build a PC to run DeepSeek R1 70B locally. How much VRAM is required, and which GPU is best for this model?

3 Answers

Roughly 48 GB for a 4-bit quant. Recommended GPUs: 2x RTX 5090 (32 GB each), a single A100 80 GB, or an H100.
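For a sanity check, the weight footprint scales linearly with bits per parameter, so you can estimate it yourself. A quick back-of-envelope script (weights only; KV cache and activations add several GB on top):

```python
# Rough VRAM estimate for a 70B-parameter model at common quantization levels.
# Counts weights only; KV cache, activations, and runtime overhead come extra.
PARAMS = 70e9  # 70 billion parameters

def weight_gb(bits_per_param: float) -> float:
    """Convert bits-per-parameter into gigabytes of weight storage."""
    return PARAMS * bits_per_param / 8 / 1e9

for label, bits in [("FP16", 16), ("8-bit", 8), ("4-bit", 4)]:
    print(f"{label}: ~{weight_gb(bits):.0f} GB for weights alone")
```

That gives ~140 GB at FP16, ~70 GB at 8-bit, and ~35 GB at 4-bit. Add a few GB for KV cache and overhead and you land near the 48 GB figure above.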


From personal experience: running DeepSeek R1 70B with 4-bit quantization on a single RTX 6000 Ada (48 GB) works for inference at batch size 1. Larger batches or training require splitting the model across GPUs. For reference, LLaMA 70B in 4-bit uses ~40 GB VRAM, so expect similar here. If you're stuck with 24 GB cards (e.g., 3090/4090), you'll need 2-3 GPUs and tensor parallelism. Always monitor VRAM usage with nvidia-smi!
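If you go the 4-bit route, here's a minimal sketch using Hugging Face transformers with bitsandbytes. The model id is an assumption (check the exact repo name on Hugging Face), and device_map="auto" handles the sharding whether you have one 48 GB card or two or three 24 GB cards:

```python
# Sketch: load a 70B model in 4-bit via transformers + bitsandbytes.
# device_map="auto" spreads layers across all visible GPUs, so the same
# script covers a single 48 GB card or multiple 24 GB cards.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "deepseek-ai/DeepSeek-R1-Distill-Llama-70B"  # assumed repo name

quant = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant,
    device_map="auto",  # shards layers across available GPUs
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

While it loads and generates, run `watch -n 1 nvidia-smi` in another terminal to see per-GPU memory use in real time.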





50 GB or more, depending on quantization and context length.
