Adapting consumer technology in educational contexts carries additional privacy concerns.
Pokémon GO launched with a splash -- and an immediate privacy hole -- in early July 2016. Despite this rocky start, the game enjoyed immediate popularity, and shortly after launch, people began talking about potential educational uses.
This tendency is understandable -- education technology advocates, gaming-in-learning advocates, and augmented and virtual reality (AR and VR, respectively) advocates have been looking for examples to demonstrate the potential, and Pokémon GO delivers on all three: accessible tech, with gameplay that gets people moving and that incorporates AR as a component of play.
However, Pokémon GO is decidedly not an educational game, despite calls from some corners that it represents the future of education. Because of the consistent calls for use of the game in educational settings, we made the decision to run a privacy evaluation on it. While the Privacy Evaluation Initiative focuses primarily on software with a clearer educational focus, Pokémon GO provides a useful case study for some of the potential trouble areas we see when consumer tech gets adapted into an educational context. While this post uses Pokémon GO as the vehicle for discussing potential issues, we want to emphasize that these issues extend beyond Pokémon GO.
Also, for people interested in using Pokémon GO in the classroom, the software provides a great starting point for discussing social, technical, data, and user experience issues with software development. Questions (and resources) that could be used to start conversations include:
- When designing software, how do we ensure that we're designing inclusively?
- How do we define or quantify security and privacy issues in software design?
- If an application has the potential to contribute to significant injury, careless behavior by users, international incidents, violence between players and nonplayers, and worse, what steps can (or should) a development company take to protect users?
- What can we learn about or infer from software design that incorporates advertisers into its source code?
All of these questions or concerns can use the example of Pokémon GO as a starting point, but the issues that will be brought up in these conversations are applicable beyond the game.
In looking at the terms of Pokémon GO, we were struck by how typical these terms are for a game and for software designed for the consumer market. Because of this, we wanted to break down some general observations that are relevant to Pokémon GO and also are broadly relevant when any consumer technology is brought into an educational context. In this post, we will highlight five issues that are common to consumer tech when it's adapted for use in an educational context:
- How personally identifiable information (PII) is defined
- The gray area between ownership and access
- Data sharing and de-identification
- Use and identification of third parties
- FERPA issues when consumer tech creates something that looks like part of an educational record
Personally Identifiable Information (PII)
Because PII is not defined in the terms, and because Pokémon GO developer Niantic Labs anticipates use of the game by people under the age of 13, the definition of personally identifiable information would default to the one in COPPA. Arguably, that definition applies only to players under 13 in the United States; people over 13, and people outside the U.S., would likely fall under different definitions.
Because Pokémon GO collects geolocation information, the COPPA definition of PII -- which includes geolocation information as PII -- is a key factor. The terms also state that the policies are governed by the laws of California (because Niantic is based in California). Because the precise definition of PII is not present in the terms, we need to look within California statute for definitions of what could be considered PII. Two sources for this definition include CalOPPA and California's data breach notification requirements -- and neither of these definitions includes geolocation information as PII.
As a result, we potentially have a situation where, in the U.S., we have at least two possible definitions of PII in play between people under and over 13, with geolocation data potentially excluded from one of the definitions. This lack of clarity is not unique to Pokémon GO; anyone considering using consumer technology in education is strongly advised to look at how PII is defined (or not defined) in the terms.
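To make the definitional gap concrete, here is a minimal sketch in Python. The field lists below are simplified illustrations of the two definitions discussed above (COPPA includes geolocation; California's breach-notification definition does not), not the actual statutory text, and the record fields are hypothetical.

```python
# Simplified, illustrative PII definitions -- NOT the full statutory lists.
# COPPA's definition of personal information includes precise geolocation;
# California's breach-notification definition does not.
COPPA_PII_FIELDS = {"name", "email", "geolocation", "photo", "audio"}
CA_BREACH_PII_FIELDS = {"name", "ssn", "drivers_license", "account_number"}

def pii_fields(record: dict, definition: set) -> set:
    """Return which fields of a record count as PII under a given definition."""
    return set(record) & definition

# A hypothetical player record of the kind a location-based game might hold.
player_record = {
    "name": "Ash K.",
    "email": "ash@example.com",
    "geolocation": (37.42, -122.08),
}

print(pii_fields(player_record, COPPA_PII_FIELDS))
print(pii_fields(player_record, CA_BREACH_PII_FIELDS))
```

The same record yields different PII under each definition: geolocation is PII for a player under 13 governed by COPPA, but falls outside the California breach-notification definition entirely.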
An easy fix here would be for Niantic Labs to update its terms and clarify exactly what it includes within its definition of PII.
The gray area between ownership and access
Questions around ownership are prevalent in both consumer and education technology. Frequently, access to data makes outright ownership unnecessary: if your neighbor owns a car and you can drive it whenever you want, you enjoy many of the benefits of ownership without owning anything. Similarly, given that the Pokémon GO terms state that user data, including PII, are a business asset of Niantic Labs, it's difficult to see how users retain anything resembling traditional ownership of their data.
Data sharing and de-identification
The Pokémon GO terms permit broad sharing of user data with third parties and define few limits on how these third parties can use or handle data. Some typical limitations that can be placed on third parties include prohibitions against combining data from multiple sources (often called "data enhancement") or attempting to re-identify data that has had identifiers removed from it. The Pokémon GO terms -- like many terms in the education and consumer markets -- do not consistently place either of these restrictions on third parties.
In some cases, the terms state that PII will be removed from data before it is shared with a third party. However, because PII is never clearly defined, it's difficult to tell what that promise means.
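The risk of sharing "de-identified" data without restrictions on data enhancement can be sketched in a few lines. All data below is made up; the point is only that stripping obvious identifiers while retaining precise geolocation leaves records that a third party can often re-link to individuals by joining against another dataset.

```python
# "De-identified" gameplay events: names and emails stripped, but precise
# coordinates retained. All records here are fabricated for illustration.
deidentified_events = [
    {"user": "anon_1", "lat": 37.4219, "lon": -122.0840, "time": "07:55"},
    {"user": "anon_2", "lat": 40.7484, "lon": -73.9857, "time": "08:10"},
]

# A hypothetical second dataset a data broker might hold, mapping known
# home or work coordinates to named individuals.
known_locations = {
    (37.4219, -122.0840): "Jane Doe",
    (40.7484, -73.9857): "John Roe",
}

def reidentify(events, locations):
    """Re-link 'anonymous' users to names by joining on their coordinates --
    the kind of data enhancement the terms do not clearly prohibit."""
    return {e["user"]: locations.get((e["lat"], e["lon"])) for e in events}

print(reidentify(deidentified_events, known_locations))
# {'anon_1': 'Jane Doe', 'anon_2': 'John Roe'}
```

This is why prohibitions on data enhancement and re-identification matter: without them, "PII removed" can be a largely cosmetic claim.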
The terms also state that data can be used by third parties to support delivering the services. In the case of Pokémon GO, at least one corporate advertiser has been written into the code, so the services literally contain advertising-specific functionality. Many vendors claim the right to share data broadly as part of delivering the services, and in many cases it's very unclear what that means. When the source code of the application contains a reference to a potential advertising partner, it becomes increasingly difficult to define what is meant by the "service" and how that differs from targeted ads based on data collected within the service.
A fix for this would be to clearly state what limits are placed on third-party vendors and to define what third parties are permitted to do when they support the services.
Use and identification of third parties
The terms name only a small number of third parties: Google, Facebook, and The Pokémon Company International (TPCI). Google and Facebook can be used for social login, and TPCI handles both federated logins and some of the parental-consent management for users under the age of 13. The terms also reference multiple unnamed third parties that perform a range of services, from processing log data to supporting demographic profiles, yet none of these vendors is listed.
This would be less of an issue if the limits placed on third parties and the definition of PII were clearly defined in the terms. A simple fix for this would be to list the third parties used, the roles they play in supporting the service, and an overview of the information they need to fulfill that role.
The main argument I hear against listing third-party vendors is that the list is a proprietary advantage, and that disclosing which vendors help deliver a product would put companies at a competitive disadvantage. This argument has some merit for smaller companies, which conceivably could have a key supporting vendor targeted by a larger competitor. The counter, of course, is that if a company sees its supply chain as its main asset, it should probably think harder about the value it adds and its longevity.
FERPA and educational records
FERPA-related issues have less to do with the terms of the app than with potential uses of the game in an educational context. If the app is used in a classroom, some uses could create student work that might be considered part of an educational record. This introduces yet another definition of PII, because FERPA contains a definition of PII that also does not include geolocation information.
Any FERPA-related issues are the responsibility of the school or district using the application, which would bear full responsibility for any resulting complications. If the school is using the application with students under the age of 13, additional obligations could arise under COPPA as well.
Nothing we have highlighted in Niantic Labs' terms for the Pokémon GO app is abnormal for consumer tech adapted to an educational context. While this post references the terms of Pokémon GO, the issues highlighted here apply to many consumer tech applications used in schools today.