Sagence is building analog chips to run AI

Graphics processing units (GPUs), the chips on which most AI models run, are energy-hungry beasts. As data centers incorporate GPUs at an accelerating pace, AI will drive a 160% increase in electricity demand by 2030, Goldman Sachs estimates. That trend isn't sustainable, argues Vishal Sarin, an analog and memory circuit designer. After…


Xscape is building multicolor lasers to connect chips within datacenters

The GPUs and other chips used to train AI communicate with each other inside data centers through “interconnects.” But those interconnects have limited bandwidth, which constrains AI training performance: a 2022 survey found that AI developers typically struggle to use more than 25% of a GPU’s capacity. One solution could be new interconnects with much higher…


Deal Dive: Human Native AI is building the marketplace for AI training licensing deals

AI systems and large language models need to be trained on massive amounts of data to be accurate, but they shouldn’t train on data they don’t have the rights to use. OpenAI’s licensing deals with The Atlantic and Vox, announced last week, show that both sides of the table are interested in landing these AI-training…


The AI paradox: Building creativity to protect against AI

Cultivating creativity in schools is vital for a future driven by artificial intelligence (AI). But while teachers embrace creativity as an essential 21st-century skill, a lack of valid and reliable creativity tests means schools struggle to assess student achievement. Now, a new machine-learning model developed by the University of South Australia is providing teachers…
