New Study Proves Digital Assistants Are Dangerous

It's worse than you think.


Digital assistants are always listening. That’s a really disturbing thought once you’ve processed it. While we assume these devices can’t actively record our conversations (and that doing so would be illegal), they are still awake, attentive, and waiting for you to call their names and give them something to do: Googling a question, playing something on Pandora, or ordering groceries, which is when things get fishy.

Digital assistants are becoming more complex and more helpful, which means they can store your bank account details and all sorts of other sensitive information so they can carry out complex tasks without asking for your guidance or permission every few minutes.

A new study (first reported by The New York Times) found that a song, or even a commercial, you listen to could hijack your device with commands that are undetectable to human ears, since it’s possible to hide these commands within ordinary bits of audio. This is known as a “Dolphin attack” because dolphins can hear frequencies that human ears don’t even register.

Luckily, tech companies are well aware of these risks and have features that could prevent an attack like this from happening. Google, Apple, and Amazon told The New York Times that they have countermeasures for these security threats, though they didn’t provide any specifics.
