New book claims superintelligent AI development is racing toward global catastrophe

Published September 19, 2025

(NEW YORK) — A new book by two artificial intelligence researchers claims that the race to build superintelligent AI could spell doom for humanity.

In “If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All,” authors Eliezer Yudkowsky and Nate Soares claim that AI development is moving too fast and without proper safety measures.

“We tried a whole lot of things besides writing a book, and you really want to try all the things you can if you’re trying to prevent the utter extinction of humanity,” Yudkowsky told ABC News.

Yudkowsky says major tech companies claim superintelligent AI — a hypothetical form of AI that could possess intellectual abilities far exceeding humans — could arrive within two to three years. But he warns these companies may not fully understand the risks they’re taking.

Unlike the chatbots many people use today, superintelligent AI could be fundamentally different and more dangerous, according to Soares.

“Chatbots are a stepping stone. They [companies] are rushing to build smarter and smarter AIs,” he told ABC News.

The authors explain that modern AI systems are “grown” rather than built in traditional ways, making them harder to control. When these systems do unexpected things, developers can’t simply fix the code.

“When they threaten a New York Times reporter or engage in blackmail … that’s just a behavior that comes out of these AIs being grown. It’s not a behavior someone put in there on purpose,” Soares said.

Soares compared the gap between AI and human abilities to a professional NFL team playing against a high school team.

“You don’t know exactly what the plays are. You know who’s going to win,” he said. He suggested AI could potentially take over robots, create dangerous viruses or build infrastructure that overwhelms humanity.

While some argue AI could help solve humanity’s biggest challenges, Yudkowsky remains skeptical.

“The trouble is, we don’t have the technical capacity to make something that wants to help us,” he told ABC News.

The authors advocate a complete halt to superintelligent AI development.

“I don’t think you want a plan to get into a fight with something that is smarter than humanity,” Yudkowsky warned. “That’s a dumb plan.”

Copyright © 2025, ABC Audio. All rights reserved.

