OPINION:
Two federal lawsuits have been filed against Amazon alleging that the company’s voice assistant, Alexa, has routinely, secretly and unlawfully recorded millions of children’s voices and stored that data who knows where, for who knows how long, for who knows what reasons.
Parents, welcome to 2020.
It’s not just the playground bully that’s the big threat to watch and monitor. It’s the devices that come into the home.
These suits, filed on behalf of an 8-year-old in California and a 10-year-old in Massachusetts, are stepping stones to what their lawyers hope will turn into a class-action case involving nine states. Their argument?
Nine states — California, Florida, Illinois, Michigan, Maryland, Massachusetts, New Hampshire, Pennsylvania and Washington — have two-party consent laws that require full transparency for any audio recordings of conversations. That means the recorder must not only notify the person being recorded of any planned audio recording but must also obtain permission before recording.
“[W]hen such consent is not obtained,” said Travis Lenkner of Chicago’s Keller Lenkner, one of the law firms suing Amazon, the Recorder reported, “these state laws contain penalties, including set amounts of statutory damages per violation.”
The suit also states: “Alexa routinely records and voiceprints millions of children without their consent or the consent of their parents.”
The case, in its early stages, smells a bit of legal opportunism, the kind that can bring windfalls for trial attorneys but little for the plaintiffs they supposedly represent.
But the red flags the cases raise aren’t trifling.
It’s one thing to remind children of the dangers of the Internet, the permanency of the imprints they leave with every post, every social media message, every image created and shared. But that’s not enough.
Today’s parents need to be aware of the dangers they can bring into their own homes, masked as technological conveniences and aids, and plan — and warn — accordingly. Privacy, with an Alexa, is not guaranteed. It should not even be assumed.
• Cheryl Chumley can be reached at cchumley@washingtontimes.com or on Twitter, @ckchumley.