Is Your Virtual Assistant Snooping On You?
Today, technology has made it possible to carry a personal assistant in our pockets or to have one in our homes, ready to serve at our beck and call. With a simple voice command such as “OK Google,” “Hey Siri,” or “Alexa,” a virtual assistant can tell us the weather, place orders online, check upcoming events on our calendars, send emails and messages, and even control our lights and thermostats.
While we spend our lives connected to the internet and oversharing on social media, we are able to control what we share with the world. However, virtual assistants are inherently different. They are “always on,” listening for the voice command, which raises interesting privacy concerns. Have you ever wondered whether your virtual assistant is snooping and spying on you? Who is actually listening? Where does the information go? How is it stored? Who has access?
Companies such as Apple, Google, and Amazon are acutely aware of the privacy concerns raised by their respective virtual assistants. As a result, these tech giants are quick to stress that customer privacy is their top priority. They point out that all data is anonymized and stored in ways that make it difficult to identify the customer, and claim that information is relayed to their servers only after the requisite voice command is received. But if a virtual assistant is waiting for a voice command, it stands to reason that its microphone is on and listening at all times. Are your conversations with loved ones in the privacy of your own home still private?
A murder investigation in Arkansas has revealed that user data is more easily recoverable and identifiable than the marketing suggests. When investigators discovered an Amazon Echo at the crime scene, the prosecutor was savvy enough to subpoena Amazon for the user data in the hope that the Echo had inadvertently picked up something during the murder. The subpoena directed Amazon to turn over all “audio recordings, transcribed records, or other text records related to communications and transactions between the Echo device and Amazon’s servers during the 48-hour period covering November 21-22, 2015.” Amazon refused to comply with the subpoena on privacy and First Amendment grounds. However, reports quickly surfaced that investigators were able to extract certain information from the device itself without Amazon’s cooperation. Exactly what information they were able to extract has not been shared with the public.
Given how Amazon’s marketing of Alexa touted the anonymization of user data and recordings, it was questionable whether Amazon could even comply with the subpoena and produce the requested information. In the end, the tug of war between the investigators and Amazon ended when the owner of the Echo authorized Amazon to turn over the recordings, which Amazon did later that same day. The case revealed that the conversations, data, and recordings associated with our virtual assistants are not truly anonymous.
In July of this year, a Google Home (although there are conflicting reports as to the exact device at issue) picked up a phrase uttered during a domestic dispute and called 911, allowing dispatchers to hear the dispute as it was about to turn violent, with the suspect pulling a gun on his girlfriend. A SWAT team quickly arrived on the scene, and after several hours of tense negotiations the suspect surrendered to the authorities. The authorities suspect that the device was triggered inadvertently and dialed 911 when the suspect asked his girlfriend whether she had “called the sheriffs.” While the smart device prevented a tragedy in this instance, the incident clearly demonstrates that these devices are always on and always listening.
Such developments raise interesting questions in the civil litigation context. If you speak to your attorney in the presence of a virtual assistant, does the attorney-client privilege protect the portions of the conversation captured by the device, even though the conversation has been heard (and likely recorded) by a third party? Will the user data and recordings associated with a specific device be discoverable if a litigant believes that a virtual assistant may have recorded conversations relevant to a case? Can a private litigant be compelled to authorize Amazon, Google, or Apple to turn over such data? Only time will tell, but if the stakes are high enough, litigants may soon face exactly these discovery requests.