Alexa, Google, and Siri – the big three of virtual assistants. They are becoming pervasive in every facet of our lives, ever present in phones, watches, cars, microwaves, smart rings, cameras, computers, speakers, doorbells, televisions, refrigerators, clocks, lights, outlets, toys, and probably headed to places we haven't yet imagined. As useful as they may be, is there a darker side to this technology we haven't fully considered or recognized?
A recent virtual round table brought together ILTA Members Jeff Brandt, Geoff Rhodes, Jonathan Burley, and David Nessen to discuss the potential risks law firms may unknowingly be facing.
Under the ABA Model Rules, attorneys take on many ethical obligations in representing their clients professionally and competently. These include the duty to safeguard the confidentiality of client information and the duty of competence, which extends to having a basic understanding of the technology they employ in the course of client representation. Do the attorneys at your firm understand the potential risks involved in using these virtual assistants?
Devices that contain any of "the big three" are, by default, configured to listen continuously for a wake word. When the device is activated, it records the request, along with anything else being said in the room. An article published by Bloomberg in 2019 indicates Amazon uses humans to review audio recordings to improve the accuracy of Alexa. Other articles report that Alexa recorded a conversation without the knowledge of the individuals being recorded and shared it with someone in one person's contact list.
How does this recording behavior stand up against the ethical obligation of confidentiality? One mitigating action the panel deemed appropriate is to modify your Acceptable Use Policy so that these devices are configured not to listen for the wake word; instead, the virtual assistant would be activated only with the push of a button.
It is not always clear how a device is configured. Do your attorneys know how to properly secure the device?
It is important to bring your General Counsel and/or Risk Partner up to speed on these devices, their intended uses, and the inherent risks they can pose. Educating these key stakeholders is vital to success. How many of them would sign up to be the first firm called out for a privacy breach via a virtual assistant?
While firms can't control everything or hope to stop every breach of confidential information, they should have a solid Incident Response Plan in place. These plans need to be communicated, and tabletop exercises using the plan should be conducted routinely. Firms should also consult their cyber insurance carriers to understand whether this type of data breach would be covered should the need arise.
The panelists stressed that, as with every cyber-based risk, educating your attorneys and the firm at large should be at the top of the list. With the ever-growing number of privacy regulations across the globe, everyone in the firm needs to understand the potential impact the devices they carry every day could have on the firm's operations.
For more information, here are some of the links shared during the virtual roundtable:
- https://www.popularmechanics.com/technology/security/a30666361/jeff-bezos-whatsapp-hack/ (How Jeff Bezos Got Hacked on WhatsApp)
- https://www.americanbar.org/events-cle/ecd/ondemand/284420331/ (Alexa, Siri, and Other Personal Digital Assistants: How Lawyers Use Them – and the Ethics of Using Them (On-Demand CLE))
- https://www.americanbar.org/content/dam/aba/administrative/law_national_security/ABA%20Formal%20Opinion%20477.authcheckdam.pdf (ABA Formal Opinion 477)