Stephen Hawking Also Thinks Human-Level AI May Lead to Our Destruction, by the Way
Dude sure loves singularities.
Robotics may not be Stephen Hawking’s field, but when the brilliant theoretical physicist has something to say about the future of science and the human race, it’s hard not to listen. Especially when what he says involves humanity’s destruction at the hands of AI we create. But hey, he’s already an expert on one kind of singularity, so why not shoot for a second?
Adding his voice to the warnings of others like Tesla Motors and SpaceX founder Elon Musk, Hawking is concerned that increasingly advanced AI poses a threat to humanity. If we ever manage to create AI that surpasses human intelligence, it could become a metaphorical runaway train, which would be kind of a pain.
At least, that’s what Hawking’s worried about. He told the BBC,
The primitive forms of artificial intelligence we already have have proved very useful. But I think the development of full artificial intelligence could spell the end of the human race. Once humans develop artificial intelligence, it would take off on its own and redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete and would be superseded.
The bit about AI came as part of an interview about how technology has already revolutionized Hawking’s life personally, as he’s been communicating through a speech synthesizer for decades. Recently, he received a new system with modern predictive text techniques, which is probably one of those “primitive forms” of AI that he finds useful.
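For the curious, here’s a minimal sketch of what word prediction like that looks like under the hood: a toy bigram model in Python that suggests the next word based on which words most often followed the previous one in some training text. This is a generic illustration of the idea, not a description of Hawking’s actual setup.

```python
from collections import Counter, defaultdict

# Toy bigram-based word prediction: count which word tends to follow which,
# then suggest the most frequent followers of the last word typed.
# Purely illustrative -- not the system described in the article.

def train(corpus: str) -> dict:
    """Build a table mapping each word to a Counter of the words that follow it."""
    words = corpus.lower().split()
    followers = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        followers[current][nxt] += 1
    return followers

def suggest(followers: dict, last_word: str, n: int = 3) -> list:
    """Return the n most likely next words after last_word."""
    return [word for word, _ in followers.get(last_word.lower(), Counter()).most_common(n)]

# Usage: train on whatever text you have, then ask for suggestions as you type.
model = train("the development of full artificial intelligence could spell "
              "the end of the human race the development of ai is fast")
print(suggest(model, "the"))  # e.g. ['development', 'end', 'human']
```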
Luckily for us, the most dangerous thing about AI in its current state is what it might accidentally text to our parents. We’ve got plenty of time to work out the kinks and maybe figure out how to avoid any unfortunate Ultron situations. Let’s hope this is a singularity we can prove Hawking wrong on.
(via The Daily Dot)