
The Pre-Training Wall and the Treadmill After It

May 9, 2026

CoRecursive: Coding Stories

Description

For a while now, I've been confusing Don with frontier-lab links late at night.

Ilya Sutskever told a NeurIPS audience that pre-training as we know it would unquestionably end. There's only one internet, and the data isn't growing. The frontier labs call this the pre-training wall.

A leaked Google memo from 2023 argued that they had no moat. R1 is on GitHub. Llama is on Hugging Face. Meanwhile, OpenAI's secondary-market valuation has climbed past $850 billion.

Don was confused. So he came over and we made an episode about it.