Saturday, November 23, 2024


‘Everyone on Earth will die,’ top AI researcher warns


Humanity is unprepared to survive an encounter with a much smarter artificial intelligence, Eliezer Yudkowsky says.

Shutting down the development of advanced artificial intelligence systems around the globe and harshly punishing those violating the moratorium is the only way to save humanity from extinction, a high-profile AI researcher has warned.

Eliezer Yudkowsky, a co-founder of the Machine Intelligence Research Institute (MIRI), wrote an opinion piece for TIME magazine on Wednesday explaining why he did not sign a petition calling on “all AI labs to immediately pause for at least six months the training of AI systems more powerful than GPT-4,” the multimodal large language model released by OpenAI earlier this month.

Yudkowsky argued that the letter, signed by the likes of Elon Musk and Apple co-founder Steve Wozniak, was “asking for too little to solve” the problem posed by the rapid and uncontrolled development of AI.

“The most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die,” Yudkowsky wrote.

Surviving an encounter with a computer system that “does not care for us nor for sentient life in general” would require “precision and preparation and new scientific insights” that humanity lacks at the moment and is unlikely to obtain in the foreseeable future, he argued.

“A sufficiently intelligent AI won’t stay confined to computers for long,” Yudkowsky warned. He explained that the fact that it’s already possible to email DNA strings to laboratories to produce proteins will likely allow the AI “to build artificial life forms or bootstrap straight to postbiological molecular manufacturing” and get out into the world.

According to the researcher, an indefinite and worldwide moratorium on new major AI training runs has to be introduced immediately. “There can be no exceptions, including for governments or militaries,” he stressed.

International deals should be signed to place a ceiling on how much computing power anyone may use in training such systems, Yudkowsky insisted.

“If intelligence says that a country outside the agreement is building a GPU (graphics processing unit) cluster, be less scared of a shooting conflict between nations than of the moratorium being violated; be willing to destroy a rogue datacenter by airstrike,” he wrote.

The threat from artificial intelligence is so great that it should be made “explicit in international diplomacy that preventing AI extinction scenarios is considered a priority above preventing a full nuclear exchange,” he added.

Image credit: Tara Winstead


Source: RT News


1 COMMENT

  1. Oh no! The Terminators are coming! Mind you, if they all look like the ones in Ex Machina, it will be okay.

