DeepSeek, Where are they now?

Author: LOCS Automation Research
July 28, 2025
5 min read

Image: DeepSeek AI icon. Copyright © DeepSeek. Licensed under the MIT License.

DeepSeek, where are they now? Last spring the Hangzhou-based startup bought 10,000 Nvidia A100s and 50,000 H800s, an eye-watering haul that briefly rattled Wall Street and sent "DeepSeek Shock" headlines across tech-finance blogs. The pitch: match ChatGPT quality on cheaper, export-compliant chips.

The purchase ranked among the largest single AI-hardware buys on record, signaling DeepSeek's ambition to compete head-on with OpenAI and other Western AI giants and raising fresh questions about the trajectory of AI development in China.

A year later, how good is the model?

The March 2025 release, DeepSeek-V3-0324, weighs in at 685 billion parameters and posts an 88.5 on MMLU, slightly ahead of GPT-4o's 87.2, and tops GPT-4o on the HumanEval coding benchmark (82.6 vs. 80.5). Real-world testers still note occasional English fluency gaps, but Chinese output is strong.

Did the big GPU bet pay off?

According to Stanford's Cyber Policy Center, the company's A100 cluster sits two generations behind Nvidia's latest Blackwell chips, yet DeepSeek claims it can fine-tune V3 updates for under $5 million per training run, far below what U.S. peers spend. The gamble: lower capex today plus full control of its hardware stack.

Funding burn and runway

DeepSeek raised a reported $800 million Series C in mid-2024, earmarking roughly half for Nvidia hardware. The company publishes no revenue figures, but analysts estimate 18–24 months of runway at the current cloud-compute burn rate, enough for two major model refreshes.
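
The runway estimate is easy to sanity-check with back-of-envelope arithmetic. The Python sketch below assumes the reported $800 million raise and the "roughly half" hardware spend from above, plus an illustrative monthly burn range; none of these internal figures are public, so the burn numbers are placeholders chosen only to show how the 18–24 month range falls out.

    # Back-of-envelope runway check. The burn-rate range is an assumption
    # chosen to reproduce the 18-24 month analyst estimate; DeepSeek has
    # not disclosed its actual operating costs.
    raised = 800_000_000            # reported Series C (USD)
    hardware_spend = raised * 0.5   # "roughly half" earmarked for Nvidia hardware
    remaining = raised - hardware_spend

    for monthly_burn in (17_000_000, 22_000_000):  # assumed USD/month burn range
        months = remaining / monthly_burn
        print(f"${monthly_burn / 1e6:.0f}M/month burn -> ~{months:.0f} months of runway")
    # ~$22M/month gives ~18 months; ~$17M/month gives ~24 months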

Does it still make business sense?

  • Pricing: the V3 API starts at $1.80 per million tokens, roughly 40% cheaper than GPT‑4o (see the cost sketch after this list)
  • Language split: the best choice for Chinese or bilingual workloads; English output reaches parity on most factual Q&A but still trails on creative writing
  • Compliance: runs exclusively in mainland-China data centres; not ideal for firms needing U.S. or EU data residency
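
To make the pricing bullet concrete, here is a minimal cost sketch in Python. It takes the $1.80-per-million-token figure above at face value and infers a roughly $3.00 GPT-4o comparison price from the "40% cheaper" claim; both are treated as flat illustrative rates, ignoring input/output token tiers and caching discounts, and the 500-million-token workload is hypothetical.

    # Monthly cost comparison at a hypothetical workload of 500M tokens.
    # Prices are blended per-million-token figures from the bullet above;
    # the GPT-4o price is inferred from "40% cheaper", not an official quote.
    PRICE_PER_MILLION = {
        "deepseek-v3": 1.80,  # from the pricing bullet
        "gpt-4o": 3.00,       # implied: 1.80 / (1 - 0.40)
    }

    def monthly_cost(tokens: int, price_per_million: float) -> float:
        """Return the USD cost for a monthly token volume at a flat rate."""
        return tokens / 1_000_000 * price_per_million

    workload = 500_000_000  # hypothetical 500M tokens per month
    for model, price in PRICE_PER_MILLION.items():
        print(f"{model}: ${monthly_cost(workload, price):,.0f}/month")
    # deepseek-v3: $900/month, gpt-4o: $1,500/month

At that volume the discount works out to roughly $600 a month, a gap that matters far more at enterprise token counts than for small pilot projects.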

Bottom line. DeepSeek hasn't dethroned OpenAI, but the numbers show its Nvidia shopping spree wasn't a bust. For companies that need strong Chinese output at a discount—and aren't bound by Western data‑residency rules—DeepSeek V3 remains a credible, lower‑cost LLM worth testing alongside the usual U.S. giants.


Sources

  1. DeepSeek GPU purchase coverage — TechCrunch
  2. DeepSeek‑V3‑0324 model card (GitHub)
  3. PapersWithCode benchmark table (MMLU & HumanEval scores)
  4. Stanford Cyber Policy Center brief — China's Domestic LLM Race (PDF)
  5. Funding details — South China Morning Post (June 12, 2024)
  6. Nvidia H800 export‑compliant chip specs — Nvidia Developer Blog
  7. DeepSeek V3 pricing page
