Tracking the next pandemic: Avian Flu Talk
Let's Hold Off on the Killer Robots
arirish
Admin Group Joined: June 19 2013 Location: Arkansas Status: Offline Points: 39215
Posted: July 28 2015 at 11:26am
Musk, Woz: Let's Hold Off on the Killer Robots
By Stephanie Mlot, July 28, 2015, 12:30pm EST

More than 1,000 artificial intelligence and robotics researchers have endorsed an open letter warning against autonomous weapons systems, and some of the industry's top names have signed on to support a ban.

Tesla CEO Elon Musk, Apple co-founder Steve Wozniak, and professor Stephen Hawking are among the signatories. Penned by the Future of Life Institute (FLI) and presented today at a conference in Buenos Aires, the letter warns that "deployment of such systems is—practically if not legally—feasible within years, not decades, and the stakes are high."

While autonomous weapons make the front line safer for soldiers, they may also lower the threshold for going to battle and likely result in more human casualties, according to the FLI. "The key question for humanity today is whether to start a global AI arms race or to prevent it from starting," the letter says. The authors argue that if any major military power were to forge ahead with AI weapon development, it would trigger a global arms race, much like the one that erupted over the atom bomb.

Still in its infancy, artificial intelligence is already used by tech giants like Google and by the Defense Advanced Research Projects Agency (DARPA). The latter, which boasts a strong focus on "rethinking military systems," has put AI to use in its Ground X-Vehicle Technology program: a semi-autonomous armored tank built last year with the aim of creating a more mobile, less expensive combat platform.

Still, not everyone is convinced of these systems' benefits. "[T]he endpoint of this technological trajectory is obvious," the FLI said in its letter. "Autonomous weapons will become the Kalashnikovs of tomorrow."
Cheaper and easier to construct than nuclear weapons, autonomous weapons would also be easier to sell on the black market. And no one wants armed quadcopters falling into the hands of terrorists, dictators, or warlords. "We therefore believe that a military AI arms race would not be beneficial for humanity," the FLI said. "There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people."

Last fall, Tesla and SpaceX chief Elon Musk declared that "with artificial intelligence, we are summoning the demon." Bill Gates expressed similar fears, saying during a Reddit Ask Me Anything in January that he is concerned about "super intelligence." Former Apple exec Steve Wozniak, meanwhile, has had a change of heart about the technology: initially terrified of AI, he recently backtracked, saying it could eventually benefit the human race. Still, Woz signed the FLI letter, alongside Skype co-founder Jaan Tallinn, professor Noam Chomsky, and Google DeepMind CEO Demis Hassabis. A founding member of the Future of Life Institute, Tallinn is joined in the group by Scientific Advisory Board members Hawking and Musk (as well as actors Alan Alda and Morgan Freeman).

"All technologies can be used for good and bad," Toby Walsh, professor of AI at the University of New South Wales in Australia, said in a statement. "Artificial intelligence is a technology that can be used to help tackle many of the pressing problems facing society today—inequality and poverty, the rising cost of health care, the impact of global warming… But it can also be used to inflict unnecessary harm." He added, "We need to make a decision today that will shape our future and determine whether we follow a path of good."

In April, a report from Human Rights Watch and Harvard Law School expressed similar concerns.

http://www.pcmag.com/article2/0,2817,2488632,00.asp
Buy more ammo!
jacksdad
Executive Admin Joined: September 08 2007 Location: San Diego Status: Offline Points: 47251
AI - specifically Artificial Super Intelligence - is a worry. We have no idea how the world would be changed by the emergence of machine intelligences with IQs so much greater than ours that they would be measured in the thousands or more. Scary stuff when you start researching it, and potentially not that far off.
http://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html
"Buy it cheap. Stack it deep"
"Any community that fails to prepare, with the expectation that the federal government will come to the rescue, will be tragically wrong." Michael Leavitt, HHS Secretary.
Guests
Guest Group
This says it all. From the movie Terminator 2: Judgment Day:
John Connor: We're not gonna make it, are we? People, I mean.
The Terminator: It's in your nature to destroy yourselves.
John Connor: Yeah. Major drag, huh?
Kilt-3
Valued Member Since 2006 Joined: February 04 2015 Status: Offline Points: 1365
Listen, and understand! That Terminator is out there! It can't be bargained with. It can't be reasoned with. It doesn't feel pity, or remorse, or fear. And it absolutely will not stop, ever, until you are dead.