The plaintiffs, one of whom is a substance abuse counselor and another a healthcare customer service representative, said that the recording of their conversations with patients directly violated the Health Insurance Portability and Accountability Act (HIPAA), which they work under and which protects their patients’ privacy.

They cited a study by Northeastern University, which demonstrated how an Alexa device can be woken by “wake words” or phrases that trigger it to start recording and transmitting what it hears.

Asking the device something like, “Alexa, is it going to rain?” wakes it so it can answer the question.

The study looked at how often an Alexa might “wake up” accidentally. The researchers found that statements like “I care about,” “I messed up,” and “I got something” would trigger the Alexa, and even phrases like “head coach,” “pickle” and “I’m sorry” woke it up.

An article from Northeastern University quoted David Choffnes, an associate professor of computer sciences at the university, saying, “A lot of us, when we think about being in our home, we think that’s a private space where we can have conversations that are intended not to be shared. And now we have all these devices with microphones that could be taking those conversations and sharing them.”

Amazon, for its part, said, “Customers have several options to manage their recordings, including the option to not have their recordings saved at all and the ability to automatically delete recordings on an ongoing three- or 18-month basis.”