Big Ears, Long Memories: Are Home Assistants a Major Risk to Your Privacy?

August 10, 2021 | Internet Privacy
Don't Forget Privacy Concerns when You Invite Alexa, Siri or Google into Your Home

"Hey Siri, when will the kids be home?" "OK, Google, what time is my breakfast meeting tomorrow?" "Alexa, play some John Coltrane." If you own one of these intelligent home assistants, you get it. Why should you look up the details of tonight's TV lineup if Alexa can do it for you right?

Voice-activated assistants, also known as virtual personal assistants or VPAs, include Amazon's Echo and Dot with Alexa, Google's Nest Hub and Home, Microsoft's Cortana and Apple's HomePod Mini with Siri.

Reports differ on the total number of smart speakers the world has plugged in today, but the number of Internet of Things (IoT) speakers could top 500 million worldwide by 2023. Do Americans know enough about the privacy implications of using these tools? That's a critical question.

Big Ears, Big Risks

So why do Alexa and her buddies listen? All these devices listen to you, all the time; otherwise, they wouldn't know when you've asked for help. Your requests and confirmations can be studied so the devices "learn" more about your needs.

Accidental recordings can be comical or disastrous. One of the first cases of errant recording involved a Portland, Oregon, couple whose private conversation was recorded and sent to one of the husband's employees. Their Echo device had to "mishear" numerous commands to complete that task.

In 2018, a British government official discussing Syria and terrorism with colleagues in the House of Commons accidentally opened Siri on the phone in his coat pocket. Siri then schooled the politicians present on world events.

Even before these lapses occurred, Forbes.com published an opinion from a Florida attorney who warned about the dangers of using smart speakers in law offices, where client privacy must be sacrosanct.

Who's Listening?

Here's where VPAs get tricky: who is listening, and who has access? These gadgets are designed to "wake up" when you speak a specific word or tap a button, but they may activate when you haven't asked them to, or keep recording longer than you expect.

In Florida, a two-year-old criminal case has shown how Alexa could serve as a homeowner's alibi or an unwanted witness to a crime. The case recently went to trial, and prosecutors subpoenaed VPA recordings from the scene.

Law enforcement officers requested the audio after a Florida woman called 911 to report a fight she'd overheard between her friend and the friend's partner. Police arrived on the scene to find the friend dead. Officials charged the partner with a crime, and what Alexa overheard could be the deciding factor in the case.

This wasn't the first request made to gain access to possible smart speaker recordings. Amazon received its first in 2016, after the murder of an Arkansas man. The company has received thousands of such requests, primarily from law enforcement, in the past five years.

Always On, Always Recording?

"Anyone who has used voice assistants knows that they accidentally wake up and record when the "wake word" isn't spoken—for example, depending on the accent, "I'm sorry" may sound like the wake word "Hey, Siri," which causes Apple's Siri-enabled devices to start listening."

That's how researchers from Northeastern University and Imperial College London explained the phenomenon of accidental smart speaker recordings in a 2020 research paper.

The international team set up a string of speakers and made them listen to Netflix shows for hours. The results included "notable cases of consistent activations. For example, 20.7% of Google Home Mini activations and 17.7% of HomePod activations appear in more than 75% of our experiments."
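To see why sound-alike phrases trip these detectors, here is a minimal, purely illustrative Python sketch of threshold-based wake-word gating. Everything in it (the threshold value, the confidence scores, the function name) is hypothetical and simplified, not any vendor's actual code; real assistants run trained acoustic models on the device, but the gating logic is the same: once the detector's confidence clears a cutoff, recording begins, whether you meant to speak to it or not.

    # Purely illustrative: all names, scores and the threshold below are
    # hypothetical, not any vendor's actual code. Real assistants score
    # audio with a trained acoustic model; this sketch fakes those scores
    # to show why the thresholding step causes accidental recordings.

    THRESHOLD = 0.60  # lower catches more real commands, but also more false wakes

    def should_start_recording(confidence: float) -> bool:
        """The device starts capturing audio as soon as the detector's
        confidence clears the threshold, intended or not."""
        return confidence >= THRESHOLD

    # Hypothetical detector scores for phrases heard in a living room.
    # Accents and mumbling shift these scores, which is why the study
    # saw ordinary dialogue trigger recordings.
    heard = {
        "hey siri, set a timer": 0.97,  # genuine command: records, as expected
        "i'm sorry": 0.63,              # sound-alike: false activation
        "pass the salt": 0.12,          # clearly different: ignored
    }

    for phrase, confidence in heard.items():
        print(f"{phrase!r}: confidence={confidence:.2f}, "
              f"records={should_start_recording(confidence)}")

The trade-off the researchers documented lives in that one cutoff: raise it and the assistant misses real commands; lower it and more everyday conversation gets recorded.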

A Right to Privacy?

Reports on speaker risks have raised public awareness, but the tech is too complex for some consumers to wrangle. Still, most people probably believe they're protected from unreasonable search and seizure in their own homes. That's what the Bill of Rights covers, right? Yet recent court rulings have held that because recordings are shared with third parties by default, those protections may not apply.

In July 2021, four healthcare workers in Washington filed a class-action lawsuit against Amazon, claiming the company did not adequately disclose what the device would record. The group believes these smart assistants could violate HIPAA, the Health Insurance Portability and Accountability Act.

This lawsuit claims, "Despite Alexa's built-in listening and recording functionalities, Amazon failed to disclose that it makes, stores, analyzes and uses recordings of these interactions at the time plaintiffs and putative class members purchased their Alexa devices."

Perhaps Amazon did disclose its data-sharing practices deep in its privacy policy, but few consumers read those documents. Copyright infringement concerns may also limit disclosure, but privacy groups like the Electronic Frontier Foundation (EFF) want more transparency.

"We don't have all the answers about how to make smart speakers better, or more secure, but we are one hundred percent certain that banning people from finding out what's wrong with their smart speakers and punishing anyone who tries to improve them isn't helping," the non-profit advocacy group stated.

Shield Yourself

VPAs aren't going away any time soon because they provide many valuable services, especially for vision-impaired individuals and those with mobility challenges. You can customize your speaker by adding skills like "Read me the IDShield Blog" to its list. Alexa also offers an "Ask My Buddy" alert; when activated, it notifies the contacts you've designated.

Every maker of intelligent speakers claims that data privacy is its number one priority, but breaches happen, including in the cloud. These risks aren't imaginary. Once again, proactive steps can prove invaluable. Here's a list to get you started:

  • If the eavesdropping makes you uneasy, limit what your speaker hears. Disabling the microphone stops some features, but it also stops the listening. You'll find these choices under Settings in the Alexa, Google and Apple apps. Many speakers also have physical mute buttons.
  • Never forget that most of these devices grant their makers the broadest access by default; you can whittle those settings down to boost privacy.
  • Add extra security for online purchases if you don't want the kids to order a giant jumping castle delivered to your door.
  • Even busy people should read the company's privacy policy. Some require you to waive rights like trial by jury; some mandate arbitration.
  • With recordings landing in the cloud, what shouldn't be discussed around your speaker? Refrain from sharing credit card or banking information or Social Security numbers.
  • Speak clearly to Cortana and her sisters. Research has shown that when homeowners mumbled or spoke English with an accent, unwanted speaker activations increased noticeably.

Finally, know your speaker's shut-down or silence commands and use them routinely. You can politely say, "Please power off, Echo," but "Alexa, shut up" works, too.

IDShield is a product of Pre-Paid Legal Services, Inc. d/b/a LegalShield (“LegalShield”). LegalShield provides access to identity theft protection and restoration services. For complete terms, coverage, and conditions, please see an identity theft plan. All Licensed Private Investigators are licensed in the state of Oklahoma. This is meant to provide general information and is not intended to provide legal advice, render an opinion, or provide any specific recommendations.

Learn how IDShield can help protect your personally identifiable information