Google's 'Big Red Button' Could Save the World

Google's "big red button" would act as an off switch for a rogue artificial intelligence agent. WikiCommons

In Superintelligence, Nick Bostrom's seminal book on artificial intelligence (AI), the Swedish philosopher warned that if machines surpass humans in terms of general intelligence, the result will be an existential catastrophe for humanity. Introducing a "superintelligent" system, Bostrom argued, would see humans replaced as the dominant life form on Earth—and potentially wiped out.

After reading Bostrom's book, billionaire entrepreneur Elon Musk tweeted a warning in 2014: "we need to be super careful with AI. Potentially more dangerous than nukes." When an open letter calling for more responsible oversight of AI research was published in 2015, Musk joined Bostrom as one of the first people to sign it.

Worth reading Superintelligence by Bostrom. We need to be super careful with AI. Potentially more dangerous than nukes.

— Elon Musk (@elonmusk) August 3, 2014

Ultimately, the main concern is that the first machine to surpass human capabilities will be impossible to switch off. Speaking at a TED (technology, entertainment and design) conference last year, Bostrom asked why Neanderthals hadn't "flicked the off switch" on humans when we became the dominant species.

"They certainly had reasons," Bostrom said. "The reason is that we are an intelligent adversary. We can anticipate threats and plan around them. But so could a super intelligent agent and it would be much better at that than we are."

The way to stop this from happening, according to Google, is a "big red button."

Fittingly, this solution comes from the very company Musk views as the biggest AI threat to humanity. Speaking at the Code Conference in California last week, Musk declined to mention Google by name but strongly implied it was the "only one" he was worried about.

Google has invested heavily in assembling some of the brightest minds in AI research, employing them to develop everything from self-driving cars to improved search algorithms. In 2014, Google acquired the London-based startup DeepMind for $500 million; the company has since become Google's AI flag bearer, making headlines for building the first computer program to beat a human champion at the board game Go.

It is researchers from Google DeepMind who have now put forward the idea of an off switch, in a peer-reviewed paper titled "Safely Interruptible Agents." The paper outlines a framework for preventing advanced machines from ignoring turn-off commands and becoming out-of-control rogue agents.

"Safe interruptibility can be useful to take control of a robot that is misbehaving and may lead to irreversible consequences," the paper states. "If such an agent is operating in real-time under human supervision, now and then it may be necessary for a human operator to press the big red button to prevent the agent from continuing a harmful sequence of actions—harmful either for the agent or for the environment—and lead the agent into a safer situation."

In short, the methods set out in this paper could one day save the world, or at least humanity, from an AI apocalypse. Bostrom himself has said it would be a "great tragedy" if human-level AI is never developed, as it holds the potential to cure diseases, eradicate poverty and advance civilization at an exponential rate. But first, safeguards need to be put in place, and one of them may well be Google's big red button.


About the writer


Anthony Cuthbertson is a staff writer at Newsweek, based in London.  

