YouTuber Dave Lee, known for his Dave2D channel, recently showed off how Apple's latest Mac Studio with the M3 Ultra chip can run the massive DeepSeek R1 AI model entirely on-device. To pull this off, you need the top-tier configuration with 512GB of unified memory.
Mac Studio 2025 Breakdown
Lee's tests revealed that this giant AI model, with 671 billion parameters, runs smoothly on the Mac Studio. It takes up 404GB of storage, and macOS's GPU memory limit must be raised to 448GB, which users do manually with a Terminal command.
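By default, macOS caps how much of the unified memory the GPU is allowed to wire down; the cap can be raised with a `sysctl` command in Terminal. The sketch below shows the kind of command involved; the exact key name and a safe value depend on your macOS version and memory configuration, so treat it as an illustration rather than Dave2D's exact command:

```shell
# Raise the GPU wired-memory limit so the full model fits.
# 448GB expressed in megabytes: 448 * 1024 = 458752.
# NOTE: key name assumed for recent macOS; on some versions it is
# debug.iogpu.wired_limit_mb. The setting resets on reboot.
sudo sysctl iogpu.wired_limit_mb=458752
```

Leaving some headroom below the full 512GB keeps the operating system itself from being starved of memory.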
The secret sauce is the M3 Ultra's unified memory architecture, which lets the CPU and GPU share one large memory pool. It handles a 4-bit quantized version of DeepSeek R1 without breaking a sweat. Quantization trades a small amount of accuracy for a much smaller memory footprint, and the model still churns out about 17-18 tokens per second, plenty fast for most tasks.
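As a sanity check, the 404GB footprint lines up with simple arithmetic: 671 billion parameters at 4 bits each come to roughly 335GB of raw weights, with quantization metadata and any layers kept at higher precision accounting for the rest. A minimal sketch of that calculation:

```python
# Back-of-the-envelope size of a 4-bit quantization of a
# 671B-parameter model (illustrative arithmetic only).

PARAMS = 671e9        # DeepSeek R1 parameter count
BITS_PER_PARAM = 4    # 4-bit quantization

raw_bytes = PARAMS * BITS_PER_PARAM / 8   # bits -> bytes
raw_gb = raw_bytes / 1e9                  # decimal gigabytes

print(f"Raw 4-bit weights: {raw_gb:.1f} GB")
# The ~404GB on-disk figure is larger because quantized formats also
# store per-group scales, and some layers are not quantized.
```

The same formula explains why the unquantized model (16-bit weights) would need well over a terabyte of memory, far beyond even this Mac Studio.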
Even better, the Mac Studio does this while sipping less than 200 watts of power. If you tried this on a regular PC, you’d need several graphics cards guzzling around ten times more energy.
Why It Matters
Running big AI models at home is a game-changer for privacy. Think of things like medical data crunching—keeping it local avoids the risks of sending sensitive info to the cloud.
The Price Tag
This power comes with a hefty cost. A Mac Studio with the M3 Ultra and 512GB of RAM starts at about $10,000. Go all out with 16TB of storage and a fully loaded M3 Ultra (32-core CPU, 80-core GPU, and 32-core Neural Engine), and you're looking at $14,099. Still, for companies needing secure, local AI processing, it's a solid deal compared to other options.

Apple claims the M3 Ultra is its fastest chip yet, built by fusing two M3 Max chips with its "UltraFusion" interconnect. That doubles the resources of a single M3 Max, making it a beast for heavy-duty work.