
Siri’s evil twin? Cybercriminals look to exploit voice-activated AI.


The weakest link in any security chain is usually a human, not a software program. That makes the rise of voice-activated artificial intelligence, or AI, a major concern for cybersecurity experts, who foresee new opportunities for scammers to take advantage of the intimacy that characterizes Siri, Cortana, and Alexa.


“It is only a matter of time before such software is put to criminal use,” the New York Times reports.

The lesson for consumers: If Siri asks you for personal information, you should think twice before providing it. 


About the author

Senior Writer Ainsley Harris joined Fast Company in 2014. Follow her on Twitter at @ainsleyoc.
