Eleanor Olcott and Zijing Wu

In China, Liang Wenfeng is being celebrated as a hero this week, a digital David fighting America’s Big Tech Goliath, armed with a modest cluster of artificial intelligence chips and a small crack team of engineers. 

His computational projectile was a series of papers released by his AI start-up DeepSeek, which appeared to show that it was possible to build powerful large language models with far fewer Nvidia chips than US rivals use. Global investors, questioning whether pouring hundreds of billions of dollars into gargantuan AI computing clusters was necessary, wiped almost $600bn off Nvidia's market capitalisation as a result.

——

David Sacks:

New report by leading semiconductor analyst Dylan Patel shows that DeepSeek spent over $1 billion on its compute cluster. The widely reported $6M number is highly misleading, as it excludes capex and R&D, and at best describes the cost of the final training run only.