Elon Musk, Stephen Hawking Warn Of Potentially Devastating “AI Arms Race”

[Photo: Kevork Djansezian/Getty Images]

The Future of Life Institute, a Boston-based research organization founded by Skype cofounder Jaan Tallinn and MIT cosmologist Max Tegmark, has published an open letter warning that artificially intelligent weapons could be in use within a decade and could have devastating effects on humanity. Prominent scientists and tech leaders, including Stephen Hawking, Steve Wozniak, Elon Musk, and Noam Chomsky, have endorsed the letter.

The letter refers to autonomous weapons, which can select and attack targets without human input, such as “armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria.” It goes on to explain that such weapons have been dubbed “the third revolution in warfare, after gunpowder and nuclear arms,” and that we could be on the brink of an arms race:

The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable. . . . Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group.

The letter asks the United Nations to ban “offensive autonomous weapons,” and urges AI research to focus instead on technology that will make battlefields safer and reduce civilian casualties.

Read the full text of the letter here.
