Why Remote Lawyers Should Mute Alexa
BY Kerrie Spencer
As we all work from home to flatten the COVID-19 curve, we must keep sensitive data protected. This is especially true for attorneys, judges and other legal staff. When thinking about data and information protection, not everyone thinks of Alexa or other smart listening devices.
Attorneys and other legal personnel working remotely who own an Alexa should be careful. Google and Amazon might be listening in.
Whether Google's voice assistant and Amazon's Alexa activate only when certain "wake-up" words are spoken is a matter of some speculation. Amazon and Google insist that their assistant devices do not record anything at all until they hear their keyword.
Alexa and other smart devices like it respond to wake-up words or keywords. For Alexa, those include "Alexa," "Amazon," "Computer" and "Echo." Other voice assistants use their own wake words: Siri uses "Hey Siri," Google Home uses "OK Google," and Windows 10 responds to "Hey Cortana" or "Cortana."
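To make the mechanism concrete, here is a minimal, hypothetical sketch of wake-word gating in Python. Everything in it (the WAKE_WORDS set, the transcribe stand-in, the frame-by-frame loop) is illustrative and operates on text rather than real audio; actual devices run a lightweight on-device acoustic model, and no real vendor API is used here.

```python
# Minimal, hypothetical sketch of wake-word gating.
# All names (WAKE_WORDS, transcribe, run_assistant) are illustrative;
# real devices process raw audio with an acoustic model, not text.

WAKE_WORDS = {"alexa", "echo", "computer"}

def transcribe(frame: str) -> str:
    """Stand-in for on-device keyword spotting; here we just lowercase text."""
    return frame.lower()

def run_assistant(audio_frames):
    """Discard frames until a wake word is heard, then record until 'stop'."""
    captured = []
    listening = False
    for frame in audio_frames:
        text = transcribe(frame)
        if not listening:
            # Pre-wake frames are dropped: per the vendors' claims, nothing
            # is stored or sent to the cloud at this stage.
            if any(word in text for word in WAKE_WORDS):
                listening = True  # wake word detected; start capturing
        else:
            captured.append(frame)  # post-wake audio is recorded
            if "stop" in text:
                listening = False
    return captured

frames = [
    "discussing the client's deposition",  # dropped: no wake word yet
    "Alexa, what's the weather today?",    # wake word fires here
    "privileged details spoken aloud",     # now being recorded
    "stop",
]
print(run_assistant(frames))
```

The same structure also shows the failure mode discussed below: if the matcher fires on a sound that merely resembles a wake word, everything said after it is captured, whether the speaker intended it or not.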
According to the U.K. law firm Mishcon de Reya LLP, all firm staff have been told to shut off or mute any listening device when discussing client matters while working at home. The preference is that no such devices be near an attorney's home workspace.
And this applies not just to smart speakers, but to any voice- or video-enabled device: video doorbells like Ring, closed-circuit TVs and baby monitors, name brand or not.
While this may seem a bit paranoid, there is something to be said for being extra cautious, particularly when dealing with sensitive information outside of a law firm's office. If details of a sensitive case are recorded, that recording becomes a serious confidentiality risk for the firm.
In 2019, according to a report by Consumer Intelligence Research Partners, there were 76 million listening assistants installed in the United States, and the trend shows no signs of slowing down.
Despite evidence that voice assistants can be triggered by something other than their wake-up word, Google and Amazon continue to insist that their devices are designed to record and store audio only when triggered. They also maintain that inadvertent triggering of a listening assistant is rare.
Yet it is not hard to imagine a listening device misinterpreting ordinary speech as a command to wake up, mistaking a sound from other people, the radio, TV or a computer for its trigger.
Researchers at Imperial College London and Northeastern University found, during testing, that listening devices can be activated inadvertently (without a wake-up word being spoken) anywhere from 1.5 to 19 times a day.
In the face of such revelations, Google now requires users to opt in before it keeps any voice recordings, and Amazon allows users to configure their assistants to automatically delete recordings and to opt out of manual review.
Ultimately, when dealing with legal matters, it is always better to go the extra mile to keep all files and information safe and secure. If that means muting or turning off a listening assistant, so be it.