2016: Getting Things Done and Just Getting Started

In 2016, the Privacy Evaluation Initiative was busy. But we are just getting started, and more is on its way.

January 12, 2017
Bill Fitzgerald, Director, Privacy Initiative


The Privacy Evaluation Initiative is an effort led by over 100 schools and districts. The central deliverables of the work are the individual app evaluations, but the work rests on a foundation of communication and outreach with a range of stakeholders. Evaluating the privacy and security practices of commonly used education technology requires a deep dive into the legal and practical implications of privacy policies and into the technical protections used to safeguard information. It also requires seeking out partners, asking questions, listening to different perspectives, and adjusting course when needed.

The daily work of the Privacy Evaluation Initiative requires an equal blend of custom software development, original research, technical expertise, a thorough understanding of legal compliance issues, an understanding of how education and pedagogy are mediated via technology, and an ongoing commitment to building communities and partnerships. In 2016, we covered a lot of ground, including these highlights:

  • Grew and staffed a consortium of schools and districts. In 2016, the District Consortium grew from 30 to 110 participating schools and districts. The Consortium meets every six to eight weeks; your school or district is welcome to join.
  • Maintained and updated the questions used to evaluate privacy terms. The original questions were based on a rubric used by Fairfax County Public Schools in Virginia and Houston Independent School District. Over time, the questions were expanded to cover additional details across a wide range of software, organized according to the Fair Information Practice Principles (FIPPs). The questions are freely available under a Creative Commons license to support both transparency in our work and broad availability of the tools and information required to evaluate privacy policies. In 2016, we released two versions: the initial version used to launch the platform and a revised version updated based on stakeholder feedback.
  • Maintained and updated a comprehensive legal and technical rationale behind the questions. The questions themselves provide the blueprint used during our evaluations of privacy policies. Statutory and regulatory requirements are included for each relevant question with parentheticals explaining the citation's requirements in plain English. Background rationale is also provided to illustrate in context why we highlight specific privacy and data-security issues. This legal and background rationale is also released under a Creative Commons license.
  • Published a 17,000-word information security primer. The primer covers the basics of information security testing, and it contains information on responsible disclosure and explanatory sections for non-technical people who want to learn more about information security. The information security primer is released under a Creative Commons license, and it's available as part of the Common Sense Privacy site and on GitHub.
  • Developed a quick triage process that helps identify common security and privacy data practices affecting many edtech apps. Several key indicators, observable while using an edtech app, show whether the app provides a basic level of security through encryption and basic privacy controls for children under 13 and for students in preschool through K-12.
  • Developed and maintained an API that captures the text of privacy policy terms. The Policy Crawler we developed grabs the text of privacy policies and stores both the raw HTML and a cleaned-up plain-text version. We currently have terms from around 1,200 software applications archived, and we check these policies periodically, saving a new copy whenever a change is detected since our last check. It's worth highlighting that the Policy Crawler API could be used with any text that is available online, from Wikipedia to a New York Times article to an openly licensed textbook.
  • Developed and maintained an application that provides the ability to annotate privacy policies and complete privacy evaluations. The annotator allows us to associate specific lines of a privacy policy with specific questions and include specific legal and technical information. At its core, the annotator is a tool for focused reading and comprehension; we currently use it to evaluate software and legal terms.
  • Surveyed 1,125 websites for encryption of logins. We surveyed websites used by youth to get a sense of how these vendors supported encryption and other security best practices and wrote a 5,000-word research summary on encryption practices.
  • Published a five-part, 6,000-word series on personal privacy. The series highlighted how privacy-protective habits can help reduce filter bubbles and increase information literacy.
  • Engaged in community outreach via webinars, in-person presentations, and keynotes. Presentations and events in 2016 included EduCon, South by Southwest EDU, ISTE, the Data Champions Summit, TCEA, ATLIS, Macomb County Community College, the IMS Global Leadership Institute, SIAA Symposium on Learning Ecosystems, NAED in Washington, D.C., Silicon Valley Community Foundation Innovation Conference, and webinars put on by the Future of Privacy Forum.
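To make the triage idea above concrete, here is a rough sketch of what checking a couple of indicators might look like in code. The function names, the specific flags, and the two example checks (an encrypted login and the presence of child-privacy controls) are illustrative assumptions; the initiative's actual triage criteria are more extensive than this.

```python
from urllib.parse import urlparse

def login_uses_encryption(login_url: str) -> bool:
    """A quick observable indicator: the login form should be served over HTTPS."""
    return urlparse(login_url).scheme == "https"

def triage(login_url: str, has_child_privacy_controls: bool) -> list:
    """Return a list of triage flags for an edtech app.

    An empty list means the app passed these (hypothetical) basic checks.
    """
    flags = []
    if not login_uses_encryption(login_url):
        flags.append("login not encrypted")
    if not has_child_privacy_controls:
        flags.append("missing privacy controls for children under 13")
    return flags
```

For example, an app whose login form is served over plain HTTP would be flagged immediately, without any deeper review of its policies.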
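The change-detection step of the Policy Crawler described above can be sketched roughly as follows. This is a minimal illustration, not the actual implementation: the cleanup regex and the function names are assumptions, and the real crawler's HTML-to-text pipeline is certainly more robust.

```python
import hashlib
import re

def clean(html: str) -> str:
    """Very rough HTML-to-text cleanup (illustrative only)."""
    text = re.sub(r"<[^>]+>", " ", html)   # strip tags
    return " ".join(text.split())          # collapse whitespace

def policy_changed(stored_hash: str, new_html: str):
    """Compare a freshly fetched policy against the last stored snapshot.

    Returns (changed, new_hash). A new copy would be archived only when
    the cleaned text differs from the previous snapshot.
    """
    new_hash = hashlib.sha256(clean(new_html).encode()).hexdigest()
    return new_hash != stored_hash, new_hash
```

Hashing the cleaned text rather than the raw HTML means cosmetic markup changes (reordered attributes, new styling) don't trigger a false "policy changed" alert.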
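At its core, the annotator described above links excerpts of a policy to evaluation questions. A data model for that association might look something like the sketch below; the class and field names are hypothetical and are not the initiative's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    question_id: str   # the evaluation question this excerpt addresses
    start_line: int    # first line of the policy excerpt
    end_line: int      # last line of the policy excerpt
    note: str = ""     # optional legal or technical commentary

@dataclass
class PolicyEvaluation:
    app_name: str
    annotations: list = field(default_factory=list)

    def excerpts_for(self, question_id: str) -> list:
        """All policy excerpts associated with one evaluation question."""
        return [a for a in self.annotations if a.question_id == question_id]
```

Structuring the evaluation this way is what makes the annotator a tool for focused reading: every answer to a question points back to the exact lines of the policy that support it.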

In 2017, our work will build on this solid foundation. Our plans include further research on encryption support, offering an updated analysis of what "normal" looks like for privacy in educational technology, and, of course, more thorough evaluations of commonly used edtech apps, including Google Classroom, Apple Classroom, and Microsoft Classroom. To stay up to date on our work, please follow the privacy initiative blog and our Twitter feed.
