Stephen Hawking's Warning: “Treating AI as Science Fiction Would Potentially Be Our Worst Mistake Ever”

By Editorial Team. Published on October 27, 2018, 16:28

“We should plan ahead,” warned physicist Stephen Hawking, who died in March 2018 and was buried next to Isaac Newton. “If a superior alien civilization sent us a text message saying, ‘We’ll arrive in a few decades,’ would we just reply, ‘OK, call us when you get here, we’ll leave the lights on’? Probably not, but this is more or less what has happened with AI.”

The memorial stone placed on top of Hawking’s grave included his most famous equation describing the entropy of a black hole. “Here Lies What Was Mortal Of Stephen Hawking,” read the words on the stone, which included an image of a black hole.

“I regard the brain as a computer,” observed Hawking, “which will stop working when its components fail. There is no heaven or afterlife for broken down computers; that is a fairy story for people afraid of the dark.”


Serious Concerns About the Future of Mankind

But before Hawking left our planet, he expressed serious concerns about the future of mankind. Foremost among them, as The Sunday Times of London reported, was what might prove to be our greatest, and last, invention: artificial intelligence.

Here is Hawking in his own words, from Stephen Hawking on Aliens, AI & The Universe: “While primitive forms of artificial intelligence developed so far have proved very useful, I fear the consequences of creating something that can match or surpass humans. Humans, who are limited by slow biological evolution, couldn’t compete and would be superseded. And in the future AI could develop a will of its own, a will that is in conflict with ours.”


In short, Hawking concluded, “the advent of super-intelligent AI would be either the best or the worst thing ever to happen to humanity. The real risk with AI isn’t malice, but competence. A super-intelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours we’re in trouble. You’re probably not an evil ant-hater who steps on ants out of malice, but if you’re in charge of a hydroelectric green-energy project and there’s an anthill in the region to be flooded, too bad for the ants. Let’s not place humanity in the position of those ants.”

The Last Word with Nick Bostrom and Amy Johnson

When we asked Nick Bostrom, director of the Future of Humanity Institute at the University of Oxford and author of Superintelligence: Paths, Dangers, Strategies, whether he agreed with Hawking that “treating AI as science fiction would potentially be our worst mistake ever,” he forebodingly replied in an email: “Yup.”

In a widely cited interview with The Guardian, the Oxford philosopher argued that sentient machines are a greater threat to humanity than climate change. “Before the prospect of an intelligence explosion, we humans are like small children playing with a bomb,” he writes. “We have little idea when the detonation will occur, though if we hold the device to our ear we can hear a faint ticking sound.”

“I imagine that part of what Hawking was getting at is that while fiction can be an excellent tool for exploring different possibilities, it can also incline us to view its subjects as impossible or unreal—as something that maybe exists in the future, but is separate from our current lives,” Amy Johnson, a Fellow at MIT’s Language & Technology Lab and the Berkman Klein Center for Internet & Society at Harvard, wrote in an email to The Daily Galaxy. “But the futures we create come directly from the choices we make right now. Further, when we treat AI as science fiction, we often overemphasize the forms and narratives that science fiction assigns to AI, which can make it difficult to recognize the very real ways that AI-based technologies and endeavors can harm society—both in the future but also now, already in the present.

“The last decade has given us a crash course in problems that come from treating the Internet as unreal,” Johnson added. “We don’t want to repeat that with AI. I’d imagine another part of what Hawking’s getting at is the problem of reversibility—many current AI-based tools are designed to have large-scale effects. We need to recognize these as real, not fictional, so that we approach decisions to employ such tools with caution, thoughtfulness, and a willingness to set aside AI-based options for others. Humans have an enormous well of creativity, there are always other options.”

Jackie Faherty, astrophysicist and Senior Scientist with AMNH, via The Guardian and The Times of London.

Image credit, top of page: with thanks to Church & State.


Editor: Jackie Faherty, astrophysicist and Senior Scientist with AMNH. Jackie was formerly a NASA Hubble Fellow at the Carnegie Institution for Science. Aside from a love of scientific research, she is a passionate educator and can often be found giving public lectures in the Hayden Planetarium. Her research team has won multiple grants from NASA, the NSF, and the Heising-Simons Foundation to support projects focused on characterising planet-like objects. She also co-founded the popular citizen science project Backyard Worlds: Planet 9, which invites the general public to help scan the solar neighbourhood for previously missed cold worlds. According to Google Scholar, Faherty has over 100 peer-reviewed articles in astrophysical journals, and she has been an invited speaker at universities and conferences across the globe. Jackie received the 2020 Vera Rubin Early Career Prize from the American Astronomical Society, an award that recognises scientists who have made an impact in the field of dynamical astronomy, and the 2021 Robert H. Goddard Award for science accomplishments.
