Digital assistants are always listening, which is a disturbing thought once it sinks in. We assume they aren't actively recording our conversations (doing so without consent would generally be illegal), but these devices remain awake, attentive, and waiting for you to call their names and give them something to do: Googling something, playing a song on Pandora, or ordering something from the grocery store. That last part is where things get fishy.
Digital assistants are becoming more complex and more helpful, which means they can store your bank account details and all sorts of other sensitive information so they can carry out involved tasks without asking for your guidance or permission every few minutes.
A new study (first reported by the New York Times) finds that a song or even a commercial you listen to could hijack your device with commands that are undetectable to human ears; it's possible to hide these coded instructions inside ordinary-sounding audio. The technique is known as a "DolphinAttack" because, like dolphin calls, the hidden commands sit at frequencies that human ears don't register.
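To make the idea concrete, here is a toy sketch of the core trick: shifting a low-frequency "command" signal onto an ultrasonic carrier so its energy lands above the roughly 20 kHz limit of human hearing. This is an illustration only, not the researchers' actual code; all names and parameters below are assumptions, and the real attacks additionally rely on microphone hardware demodulating the ultrasonic signal back into the audible range.

```python
import numpy as np

SAMPLE_RATE = 96_000          # Hz; must exceed twice the carrier (Nyquist)
HUMAN_HEARING_LIMIT = 20_000  # Hz; approximate upper bound of human hearing

def ultrasonic_carrier(command_envelope, carrier_hz=25_000):
    """Amplitude-modulate a slow 'command' envelope onto an ultrasonic
    carrier, so the signal's energy sits above human hearing."""
    t = np.arange(len(command_envelope)) / SAMPLE_RATE
    return command_envelope * np.sin(2 * np.pi * carrier_hz * t)

# Toy stand-in for a spoken command: one second of a slow 5 Hz envelope.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
envelope = 0.5 * (1 + np.sin(2 * np.pi * 5 * t))
signal = ultrasonic_carrier(envelope)

# The spectral peak lands at the 25 kHz carrier, above the ~20 kHz limit,
# so a person standing next to the speaker would hear nothing.
spectrum = np.abs(np.fft.rfft(signal))
peak_hz = np.argmax(spectrum) * SAMPLE_RATE / len(signal)
print(peak_hz, peak_hz > HUMAN_HEARING_LIMIT)
```

A voice assistant's microphone, unlike a human ear, can still pick up and (through hardware nonlinearity) effectively recover the modulated command.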
Luckily, tech companies are well aware of these risks and have features that could prevent such an attack. Google, Apple, and Amazon told the New York Times that they have countermeasures for these security threats, though none of them provided specifics.