The bad guys use artificial intelligence too.
That's the sobering conclusion of a recent report, The Malicious Use of Artificial Intelligence, produced by 26 experts from 14 institutions, including the universities of Cambridge and Oxford, think tanks, and industry groups.
Artificial intelligence (AI) should be thought of as a dual-use technology, the authors argue, much like nuclear power and hacking tools: it has legitimate military and civilian applications, but it could also enable new forms of cybercrime and physical attacks.
"As AI capabilities become more powerful and widespread, we expect the growing use of AI systems to lead to the expansion of existing threats, the introduction of new threats, and a change to the typical character of threats," the report finds.
For example, AI could make an attack more effective by allowing it to be more finely targeted and harder to attribute, and better at exploiting vulnerabilities in conventional software or in commercial AI systems themselves, the authors argue.
"We also expect novel attacks that subvert cyber-physical systems (such as causing autonomous vehicles to crash) or involve physical systems that it would be infeasible to direct remotely (such as a swarm of thousands of micro-drones)," the report finds. It is reasonable to expect an AI-enabled attack within the next five years, the authors say.
Can a nightmare scenario be prevented? The report offers four recommendations.
First, policymakers should collaborate closely with technical researchers to prevent or mitigate attacks. Second, AI researchers and engineers must take the dual-use nature of their work seriously and allow misuse-related considerations to shape research priorities and practices.
Third, best practices should be identified in research areas with longer experience of dual-use concerns, such as computer security, and applied to AI where relevant. And fourth, innovation should be encouraged by actively expanding the range of stakeholders and domain experts involved in discussions of these challenges.