Learn to navigate the privacy challenges of popular smart speakers.

Virtual assistants are everywhere -- embedded in our phones, mobile applications, cars, smart speakers, consumer devices, and "smart tech" toys for kids. This article describes our findings from full privacy evaluations of the policies behind the most popular virtual assistants included in smart speakers -- Apple's Siri, Google Assistant, Amazon's Alexa, Facebook Portal, Microsoft's Cortana, and Samsung's Bixby -- and identifies the potential privacy risks and harms that may affect children and families who use these devices.

What is a Virtual Assistant? 

A virtual assistant is a piece of software that's part of many of the everyday devices used by parents, educators, students, and children. The software listens for and responds to voice commands. When a virtual assistant is activated on a mobile device or smart speaker, the microphone is enabled and the device listens for a common wake word (e.g., "Hey Siri," "OK Google," or "Alexa"). The device is not supposed to send audio from the microphone to the service unless it detects the wake word. Once a wake word is detected, the wake word and the audio that follows it, such as voice queries and commands (which may include ambient noise or background conversations), are sent to the company, stored in an audio format, and then transcribed into digital text.

Much attention has been paid to the retention and deletion of virtual assistant audio recordings, which contain information about the individual speaker's voice that could be used for voice-identification matching. In addition, voice recordings and queries are also transcribed and stored as search results, which, like text-based search queries in a web browser, can still be associated with a particular individual, household, or device even after the original voice recording has been deleted.
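
To make that flow concrete, here is a minimal sketch of the wake-word logic described above, simulated with typed text rather than real audio. The wake words and function names are illustrative assumptions, not any vendor's actual implementation.

```python
# A simplified simulation of the wake-word flow described above, using typed
# text in place of real audio so the example stays self-contained. The wake
# words and function names are illustrative, not any vendor's implementation.

WAKE_WORDS = ("hey siri", "ok google", "alexa")

def detect_wake_word(utterance: str) -> bool:
    """On-device check: nothing should leave the device unless this returns True."""
    return utterance.lower().startswith(WAKE_WORDS)

def send_to_service(utterance: str) -> str:
    """Stand-in for uploading the audio once the wake word is detected.
    In practice the service stores the recording and a text transcript,
    and the transcript can persist even after the audio is deleted."""
    transcript = utterance  # real services run speech-to-text here
    print(f"uploaded audio + transcript: {transcript!r}")
    return transcript

def listen(utterances):
    for utterance in utterances:
        if detect_wake_word(utterance):
            send_to_service(utterance)
        # otherwise the audio is (supposed to be) discarded on the device

listen([
    "what's for dinner?",            # no wake word: stays on the device
    "ok google what's the weather",  # wake word detected: uploaded and transcribed
])
```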

How do we use a Virtual Assistant?

Talking to Machines

When people talk about privacy, they usually talk about privacy from other people. People hardly have the language to talk about privacy from things. Yet here we are in 2019 talking to machines, telling them all sorts of interesting, private, and valuable information about where we intend to go, what we plan to buy, or even how we feel. The machines seldom keep our confidences, however, as they were designed by people who work for companies that desire or even need our personal information to continue to support their products.

The business models for the companies that offer virtual assistant products and services to the public have some elements in common. On the input side is a low- or no-cost device placed within easy reach of consumers. On the output side is a rich buffet of offerings for third-party data brokers and data processors, and eventually for advertisers and other companies hungry for consumers' personal information -- companies that include insurers, political consultants, and financial services.

Kids' Voices

Access to such a wealth of personal information can be a good thing and a bad thing. Using virtual assistants embedded in smart speakers in the home, children can access information that's available to anyone online. Children too young to type a URL or a search term into a web browser can speak simple voice commands and access this information. Research has shown that 6 in 10 parents say their kids interact with voice-activated virtual assistants, but actual usage could be even higher, especially for in-home devices that are not under lock and key or under constant observation.

It is difficult for parents to constantly monitor devices, and very young children can access a wide world of information with virtual assistants. Parents may also be using virtual assistants as kid-specific entertainment that, depending on the age of the child, borders on companionship or virtual childcare. In addition, parents should be aware of hidden costs and unauthorized purchases when using virtual assistants.

Furthermore, parents and educators should look beyond the numbers and wonder what smart speakers know about their kids that they don't know. More than 4 in 10 parents (43 percent) say their 6- to 8-year-old uses a smart device to help with homework. If it's math, it might be more than "help"; it might be "Siri, (please?) do my homework. Before dinner." A rather large majority (68 percent) of parents reported that their child has never said something mean, rude, or inappropriate to a voice-activated assistant. Of course, parents can only share with researchers what they witness unless they have access to a device's underlying data analytics.

Children and Data Privacy

Parents and educators value the ability to control and to understand what information voice-activated smart devices collect from their children and students. A common concern is whether parents who use voice-activated devices at home, or educators who use them in the classroom, know when children's voices are being recorded. And if so, do they know how to control what information is collected and whether their child's or student's voice data is being used to deliver personalized or targeted ads?

Almost all parents surveyed valued controlling and understanding what information is collected from their voice-activated devices. About 9 in 10 parents who use voice-activated devices say it is important to them to know when their family's voices are being recorded (93 percent), to control what information is collected about them (93 percent), and to control whether their family's voice data is being used to deliver more targeted ads (88 percent). These concerns break down into three parts: the facts, the feelings, and the future.

  • The facts: the actual situation -- are children being recorded?
  • The feelings: Then there are the feelings parents and educators may have about being recorded, often referred to as the "creepiness" factor. Creepy could include recording data without express permission or using the data for purposes beyond what the device was initially purchased to do.
  • The future: Beyond what is currently collected and how it is used, virtual assistants may store data and may at some point use the data in ways that no one -- not the children, nor the parents, nor even the device manufacturers and designers -- has yet imagined.

What Should Parents and Educators Do?

  • Turn off the device. Adults should turn the device on only when it is in use and when an adult is present to monitor use by children. The survey found that 41% of users have turned off a speaker because it had been activated by accident. Again, we can only ask people to report on what they've done, and it is worth treating this as the tip of the iceberg.
  • Turn off the speaker/microphone. Adults should check whether the speaker/microphone has been accidentally activated and turn it off except when it is in use. There may be many more incidents of accidental usage, recording, and data storage than parents are aware of.
  • Check which devices connect to the virtual assistant. Remove automatic activations and connections to appliances in the home and other devices.
  • Make your preferences known to companies and legislators. We can start with the knowledge that 71% of parents have taken or wanted to take steps to limit data collection -- about half of those think they have, and half want to but don't know how. This is the jumping-off point for action. The next step is to empower parents and educators so that they actually have this control and use it. Legislators can support this by mandating parental-control features and, when those don't fully protect kids, by requiring that the information can be deleted from devices and databases.
  • Make informed decisions about which virtual assistants to buy and use. This article is a snapshot of what virtual assistants are doing right now. Their business practices change rapidly as companies think creatively about how to gather, process, and sell data. In deciding whether to purchase or use a virtual assistant, consider the impact on you, on other adults who live in your home, on children who may use the device, and on guests in your home. Factor into your decision the cost of the device, purchases that may be made with the device, and the potential use of your personal information by the device manufacturer and other companies the device might share your data with over time.

How We Rate

At home and in schools and districts, parents and educators make decisions about privacy based on their specific needs -- and these needs can vary between children and students. The privacy evaluation process is designed to support parents and educators in making an informed decision, not replace the decision-making process. Our evaluation incorporates the specific needs and the decision-making process of parents, educators, schools, and districts into the following three ratings:

[Image: overview of the three privacy ratings]

Every privacy rating also includes an overall score. A higher score (up to 100%) means the product provides more transparent privacy policies with better practices to protect user data. The score is best used as an indicator of how much additional work a person will need to do to make an informed decision about a product. This use is directly related to the core work driving the evaluations: to help people make informed decisions about a service with less effort. The higher the number, the less effort required to make an informed and appropriate decision.

Key Findings

The following ratings and scores are from our privacy evaluations of the most popular smart speakers with virtual assistants. These products are available on mobile devices and integrated into smart speakers used at home, in businesses, and inside classrooms. The following table illustrates better, worse, and unclear practices for our privacy rating questions. Worse practices can put consumers' privacy at risk through the use of personal information for third-party marketing, advertising, tracking, or ad-profiling purposes. Blue means the product's policies disclose better practices, red means they disclose worse practices, and orange means the policies are unclear as to whether the vendor engages in the practice.

| Product | Rating | Overall Score | Sell Data | Third-Party Marketing | Behavioral Ads | Third-Party Tracking | Track Users | Ad Profile |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Apple's Siri | Pass | 79 | No | No | No | No | No | No |
| Google Assistant | Warning | 75 | No | Yes | Yes | Yes | Yes | Yes |
| Microsoft's Cortana | Warning | 75 | Unclear | Yes | Yes | Yes | Yes | Yes |
| Facebook Portal | Warning | 59 | No | Yes | Yes | Yes | Yes | Yes |
| Amazon's Alexa | Warning | 54 | No | No | Yes | Yes | Yes | Yes |
| Samsung's Bixby | Warning | 53 | Unclear | No | Yes | Yes | Yes | Yes |
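
For readers who want to slice these findings themselves, the table can be expressed as a small data structure. This is only a sketch: the values are copied from the table above, while the variable names and the subset of columns shown are our own choices.

```python
# A subset of the findings table above as plain Python data. "Unclear" means
# the policy does not say whether the vendor engages in the practice.
FINDINGS = {
    "Apple's Siri":        {"rating": "Pass",    "overall": 79, "sell_data": "No",      "behavioral_ads": "No"},
    "Google Assistant":    {"rating": "Warning", "overall": 75, "sell_data": "No",      "behavioral_ads": "Yes"},
    "Microsoft's Cortana": {"rating": "Warning", "overall": 75, "sell_data": "Unclear", "behavioral_ads": "Yes"},
    "Facebook Portal":     {"rating": "Warning", "overall": 59, "sell_data": "No",      "behavioral_ads": "Yes"},
    "Amazon's Alexa":      {"rating": "Warning", "overall": 54, "sell_data": "No",      "behavioral_ads": "Yes"},
    "Samsung's Bixby":     {"rating": "Warning", "overall": 53, "sell_data": "Unclear", "behavioral_ads": "Yes"},
}

# Products whose policies disclose that they do not serve behavioral ads.
no_behavioral_ads = [name for name, row in FINDINGS.items() if row["behavioral_ads"] == "No"]
print(no_behavioral_ads)  # ["Apple's Siri"]
```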

Our findings show that Siri received a "Pass" rating. The "Pass" rating reflects the highest relative overall score, based on Apple's transparent policies and its disclosure of better privacy practices (shown in blue). Unlike Google Assistant and Alexa, which collect voice data and associate transcribed audio text with an individual account, Siri audio recordings are assigned a unique identifier each time the voice assistant is activated. This means that Siri does not identify a specific user's voice recordings, because the recordings are not associated with a specific account or device. Separating the voice recording from an individual account is a privacy-protecting feature.

Google Assistant, Alexa, Facebook Portal, Cortana, and Bixby all received our orange "Warning" rating, because their terms were less transparent and disclosed comparatively fewer better practices in their policies. For example, all of these companies' policies disclose that they use personal information for third-party marketing, advertising, tracking, or ad-profiling purposes. However, in addition to each product's overall score, the more detailed concern-category scores in the chart below can help explain a product's rating and can help you make an informed decision about whether and how to use the product at home, at your company, or in the classroom, based on your privacy concerns in the following categories.

| Product | Rating | Overall Score | Data Collection | Data Sharing | Data Security | Data Rights | Data Sold |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Apple's Siri | Pass | 79 | 65 | 80 | 85 | 95 | 55 |
| Google Assistant | Warning | 75 | 65 | 90 | 95 | 95 | 50 |
| Microsoft's Cortana | Warning | 75 | 65 | 95 | 95 | 95 | 45 |
| Facebook Portal | Warning | 59 | 50 | 80 | 45 | 85 | 30 |
| Amazon's Alexa | Warning | 54 | 35 | 75 | 25 | 55 | 30 |
| Samsung's Bixby | Warning | 53 | 65 | 85 | 40 | 75 | 30 |

For example, from the chart you can see that Siri received the highest overall score and Bixby the lowest, with Siri doing better in almost every category. In addition, Google Assistant and Cortana received the same overall score but had different concern scores contributing to their overall points: Google Assistant had better practices for the Data Safety concern, while Cortana had better practices in place for protecting student data in the School Purpose category, as shown in the chart below.

| Product | Rating | Overall Score | Data Safety | Ads & Tracking | Parental Consent | School Purpose |
| --- | --- | --- | --- | --- | --- | --- |
| Apple's Siri | Pass | 79 | 60 | 85 | 70 | 10 |
| Google Assistant | Warning | 75 | 60 | 60 | 80 | 0 |
| Microsoft's Cortana | Warning | 75 | 45 | 60 | 70 | 20 |
| Facebook Portal | Warning | 59 | 85 | 55 | 10 | 0 |
| Amazon's Alexa | Warning | 54 | 40 | 65 | 70 | 0 |
| Samsung's Bixby | Warning | 53 | 40 | 60 | 50 | 0 |

In addition, Google Assistant, Facebook Portal, Alexa, and Bixby all completely ignored the questions in the School Purpose concern category and did not adequately address use in schools or classrooms or how they protect student data.

Full Evaluations

The following full privacy evaluations and hands-on security testing results go into more detail about each of the virtual assistant products that could be used at home, in businesses, and in classrooms by kids and families. Our privacy evaluation process considers only policies that have been made publicly available before an individual uses the application or service. Our hands-on security testing of the products looks at the 10 most critical security practices around the collection of information from a smart device and from a mobile application, and the transmission of information between the device, the application, and the internet.
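
As one simplified illustration of the kind of transmission check involved (a sketch, not our full testing methodology), the snippet below reports whether a given endpoint will negotiate an encrypted TLS connection. The hostname is a placeholder you would replace with the host your smart speaker or companion app actually contacts.

```python
import socket
import ssl

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Connect to host:port and report the negotiated TLS version, as a rough
    proxy for whether data sent to that endpoint can be encrypted in transit."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g., "TLSv1.3"

if __name__ == "__main__":
    # Placeholder endpoint; substitute the host your device or app talks to.
    print(negotiated_tls_version("example.com"))
```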

 

Siri

Summary:

  • Siri is a virtual assistant that is part of Apple's iOS, watchOS, macOS, and tvOS operating systems. The assistant uses voice queries and a natural-language user interface to answer questions, make recommendations, and perform actions by delegating requests to a set of Internet services. Apple's terms state that protecting children is an important priority for everyone at Apple. Apple believes in transparency and giving parents the information they need to determine what is best for their child. In addition, Apple states that Siri searches and requests are associated with a unique identifier and not an Apple ID -- so that those searches and requests are not associated with a particular individual and cannot be identified by advertisers or other organizations. Apple's terms state that security and privacy are fundamental to the design of all Apple hardware, software, and services. Lastly, Apple's terms state they understand the importance of taking extra precautions to protect the privacy and safety of children using Apple products and services. 

 

Google Assistant

Summary:

  • Google Home is a smart speaker device that integrates Google's virtual assistant, Google Assistant, which provides customized help to users across all their devices, including Google Home, the user's mobile phone, and more. With the Google Home app, users can set up, manage, and control their Google Home and Chromecast devices, along with thousands of connected home products like lights, cameras, thermostats and more. The Google Assistant app provides another way to launch Google Assistant if it's not already available on a mobile device.

 

Alexa

Summary:

  • Amazon indicates that its Alexa product allows users to instantly connect to Alexa to play music, control the user's smart home, and get information, news, weather, and more using just their voice. Amazon's policy states that Alexa will store user messages in the cloud so that they are available on the user's Alexa app and select Alexa-enabled products. Amazon's policy explains that it will gather personal information, and that the information it learns from customers helps it personalize and continually improve the user's Amazon experience. While the policy indicates that an account is required for the initial setup, for "hands-free" devices, like the Amazon Echo, a user can access Alexa by saying the wake word (Alexa, Echo, Amazon, or Computer). The policy indicates that parental controls are available, and parents can add or update certain information.

 

Facebook Portal

Summary:

  • Facebook describes its Portal product as a device that helps users connect with family and friends, share content and experiences, and discover photos, music, videos, and other content. Facebook is a social media and social networking service that allows users to connect with friends, family and other people they know and share photos and videos, send messages and get updates. The terms for Portal state that Facebook allows users to communicate with other users with messages or posts. For example, when users post on Facebook, they select the audience for the post, such as a group, all their friends, the public, or a customized list of people. The terms state that Facebook does not sell users' data to advertisers, including personal information like a user's name or the content of their Facebook posts. However, the terms state that Facebook may use the information they collect about a user, including information about their interests, actions and connections, to select and personalize ads, offers, and other sponsored content. The terms state that Facebook is broadly available to everyone, but a child cannot register an account to use Facebook if they are under 13 years old.

 

Cortana

Summary:

  • Microsoft's Cortana is an intelligent assistant that provides smart features and personalized experiences across a variety of devices, apps, and services. Microsoft's terms for Cortana state that a child can access social communication services, like Outlook and Skype, and can freely communicate and share data with other trusted and untrusted users of all ages. Microsoft's terms state that the personal information Cortana collects depends on the choices a user makes (including their privacy settings and whether or not they are signed in), the data they share with Cortana, and Cortana's capabilities (which vary depending on a user's operating system, device, and the services and apps they use). Microsoft's terms state they are committed to protecting the security of users' personal data. Lastly, the terms state if a child tries to register for an account, Microsoft will ask them to provide consent or authorization from a parent or guardian before they can use the service. 

 

Bixby

Summary:

  • Bixby is an intelligent personal assistant for Samsung devices that learns what users like and works with their favorite apps. The terms state that Samsung Social (formerly named Enhanced Features) is a bundled set of services designed to simplify and improve the ways in which users can share and connect with their friends. To provide the Bixby voice services, some voice commands may be transmitted (along with information about a user's device and its usage, including device identifiers) to a third-party service provider that converts their voice commands to text. In addition, the terms state that Samsung maintains administrative, technical and physical safeguards designed to protect the personal information they obtain through the services and the customization service against unlawful or unauthorized destruction, interference, loss, alteration, access, disclosure, or use. Lastly, Samsung's terms state that children under 13 years of age should not attempt to register for its services or send any personal information about themselves to Samsung.

 

 

Girard K.

Girard Kelly is an attorney focused on Internet, privacy, cybersecurity, and intellectual property law who thrives on cutting-edge legal issues and has a strong background in public policy, information technology, entrepreneurship, and emerging technologies.