What Teachers Need to Know About Using Smart Speakers in the Classroom

Key takeaways from new privacy evaluations, plus advice on how to use these devices with caution.

Erin Wilkey Oh | November 11, 2019

As smart speakers with virtual assistants become more common in our personal lives, it's no surprise that some teachers are starting to explore the benefits of bringing them into the classroom. From assisting with class routines to supporting English learners, the ways teachers are integrating these devices into their teaching seem innovative and promising.

But is it a good idea to use smart speakers in the classroom? Since their launch, devices like Amazon's Echo and Google Home have faced scrutiny from consumers, the media, and privacy advocates over the privacy risks they pose to users. From an Echo recording and emailing someone's private conversation, to the news that both Amazon and Google vetted and approved malicious third-party apps, the evidence is mounting that these concerns are well-founded.

When it comes to using smart speakers in classrooms, what do educators need to know to make the best decision for their students? The Common Sense Privacy Program recently released privacy evaluations of the six most popular smart speakers with virtual assistants: Apple Siri, Google Assistant, Amazon Alexa, Facebook Portal, Microsoft Cortana, and Samsung Bixby. The findings give us a clearer picture of the privacy issues related to using these devices, and can help teachers and administrators make more informed decisions when considering their use in the classroom.

Key Takeaways from Common Sense Smart Speaker Privacy Evaluations

None of the virtual assistants adequately address use in schools or classrooms. None of the privacy policies disclose how they protect student data. This means strong privacy laws like FERPA, COPPA, and California's SOPIPA, which protect student data and education records created by students in the school or classroom, may not apply because the virtual assistant is intended only to be used at home by consumers.

The majority of these devices make money from users' personal information. The privacy policies for Google Assistant, Amazon Alexa, Facebook Portal, Microsoft Cortana, and Samsung Bixby disclose that they use personal information for third-party marketing, advertising, tracking, or ad-profiling purposes. The cost of the devices is low to get them in front of as many users as possible. Companies can then sell the data they collect to third-party data brokers and data processors, and eventually to advertisers and other companies hungry for personal information from consumers, including insurers, political consultants, and education technology services.

Apple HomePod with Siri is the best choice for privacy. Unlike Google Assistant and Amazon Alexa, which collect voice data and associate transcribed audio text with an individual account, Apple assigns Siri audio recordings a unique identifier each time the voice assistant is activated. Because recordings are not associated with a specific account or device, Siri cannot tie a voice recording to an individual user.

Of the six assistants, only Apple's Siri earned a "Pass" in its privacy evaluation, which means the device's privacy policy meets Common Sense's minimum requirements for privacy safeguards. The other five virtual assistants, including Amazon Alexa and Google Assistant, received a "Warning" rating, which means their privacy policies do not meet Common Sense's minimum requirements for privacy safeguards and they should be used with caution.

Tips for Using Smart Speakers in the Classroom

In our personal lives, we make the choice every day to offer our data (or not to) as currency in exchange for convenience. If we use a Google account, purchase products from Amazon, sign up for giveaways, or use a supermarket member card, we are giving up some of our data. But when making this decision on behalf of our students, it's crucial that we are fully aware of the risks and thoughtful about the choices we make.

If you decide to bring a smart speaker into the classroom, consider these tips to help you use these devices with caution:

  • Check in with your administrator. Make sure that using a smart speaker aligns with your school's and district's technology policies. It's a good idea to get the principal and technology staff on board before you move forward.

  • Think through the learning value. Before implementing any new tech tool, take some time to think deeply about its learning value. How does the tool support the learning objectives? Does using the tool redefine what's possible for students? Is it simply substituting a tech-based tool for a non-tech one? If so, are there advantages to doing so?

  • Get parental consent/opt-in. It's important to communicate with parents and caregivers about the technology you're using in the classroom. Give them all the information they need, including how and why you're using the speaker. Outline the risks and rewards of using this new technology, and get their consent before you begin. 

  • Use the settings to reduce the risks. When setting up the device, check the settings (or the product manual) for options to limit data collection or sharing, and enter and share only the information you need in any associated apps.

  • Turn off (or mute) the device when it's not in use. This lessens the risk of unintentionally activating the speaker.

  • Co-create guidelines and expectations with your students. When do we use the speaker? What is it used for? Who gets to use it? What is OK to say?