Saturday, April 20, 2024

These Toys Can Record Your Children And Send The Data To A Defense Contractor

We don’t have hoverboards or holographic phones yet, but our toys are definitely more futuristic. They can connect to the internet, chat with people around the world, respond in emotive or artificially intelligent ways, and learn from us—or about us. A murderous Chucky doll is no longer as scary as real dolls that track a child’s data.

Meet My Friend Cayla and the i-Que Intelligent Robot, two toys that, as Consumerist outlines in detail, are allegedly violating laws in the U.S. and overseas by collecting voice and usage data from the kids who call them playmates. They connect via Bluetooth to a parent’s smartphone and, using its internet connection, can answer children’s questions and take part in what seems like a conversation, kind of like Siri on an iPhone.

Consumerist describes the setup process:

Cayla in particular asks for multiple pieces of personal information — the child’s name, their parents’ names, their school name, their hometown, among other questions — so it can converse more naturally. The app also allows for location setting, and both the Cayla and i-Que apps collect users’ IP addresses. So far this is pretty straightforward. The Terms of Service for both toys say that they collect data in order to improve the way the toys work, and for “other services and products.” Researchers studied the way the toys work, the complaint continues, and it turns out that they send audio files to a third party: Nuance Communications’ servers at the company’s headquarters in Massachusetts.

Nuance is a defense contractor that sells voice biometric solutions to military, intelligence, and law enforcement agencies. Part of its privacy policy states, very shadily, that “We may use the information that we collect for our internal purposes to develop, tune, enhance, and improve our products and services, and for advertising and marketing consistent with this Privacy Policy.”

As with most technology these days, Cayla’s terms-of-use prompt pops up only once, during setup, and isn’t accessible again. It’s a tome of legal jargon that parents are expected to read, and without clicking “I agree” to all 3,800 words, they won’t be able to use the toy.

So, when a child is playing with one of these toys—which, by the way, don’t have a walkie-talkie-style function for a back-and-forth, but are always listening as long as they’re switched on—they’re potentially providing sensitive information about their identity or location to would-be hackers at worst, and to a defense contractor for research and development at best. This isn’t what we imagined with “He hears you when you’re sleeping, he knows when you’re awake.” For goodness’ sake.

[h/t Consumerist]
