OpenAI’s o3 suggests AI models are scaling in new ways — but so are the costs

Last month, AI founders and investors told TechCrunch that we’re now in the “second era of scaling laws,” noting how established methods of improving AI models were showing diminishing returns. One promising new method they suggested could keep the gains coming was “test-time scaling,” which seems to be what’s behind the performance of OpenAI’s o3 model —…


Current AI scaling laws are showing diminishing returns, forcing AI labs to change course

AI labs traveling the road to superintelligent systems are realizing they might have to take a detour. “AI scaling laws,” the methods and expectations that labs have used to increase the capabilities of their models for the last five years, are now showing signs of diminishing returns, according to several AI investors, founders, and CEOs…
