Navigate the Privacy Evaluation Questions

These questions form part of the foundation of our evaluation process. You have several options for navigating these questions and learning more about data privacy. This page contains all of the questions, with background citations.

Triage

1: Observation

0.1.1: Policy Available

  • Are the policies for the specific service available (and not, for example, only the policies for the public-facing website)?
Citation:

0.1.2: Same Policy

  • Do Android or iOS app privacy policies link to the same privacy policy URL as the homepage policy?
Citation:

0.1.3: Default Encryption

  • Do the login page, and any pages accessed while logged in, use encryption (HTTPS)?
Citation:

0.1.4: Encryption Required

  • Do the login page, and any pages accessed while logged in, force the connection back to HTTPS if the URL is changed to HTTP?
Citation:
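Questions 0.1.3 and 0.1.4 can be verified by requesting the HTTP version of the login page and inspecting the redirect chain. Below is a minimal sketch using only Python's standard library; the redirect chain itself would be gathered with an HTTP client that records redirects, and the example.com URLs are placeholders, not a real service:

```python
from urllib.parse import urlparse

def forces_https(redirect_chain):
    """Given the ordered list of URLs visited after requesting the
    plain-HTTP version of a page, report whether the service upgrades
    the connection to HTTPS (questions 0.1.3 and 0.1.4)."""
    if not redirect_chain:
        return False
    # Only the scheme of the final URL matters for this check.
    return urlparse(redirect_chain[-1]).scheme == "https"

# A login page that redirects plain HTTP to HTTPS passes the check.
print(forces_https(["http://example.com/login",
                    "https://example.com/login"]))  # True

# A service that keeps serving over plain HTTP fails it.
print(forces_https(["http://example.com/login"]))  # False
```

In practice the chain would come from the response history of an HTTP client; the helper above only classifies that chain.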

0.1.5: Use Trackers

  • Does the application or service use trackers on its homepage, registration page, or while a user is logged in?
Citation:
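One way to triage question 0.1.5 is to scan a page's source for script tags that load from known tracker domains. The domain list below is a tiny illustrative sample, not a real blocklist; an actual evaluation would use a maintained list such as EasyPrivacy or Disconnect's tracker database:

```python
import re
from urllib.parse import urlparse

# Illustrative sample only; substitute a maintained tracker list.
KNOWN_TRACKER_DOMAINS = {"google-analytics.com", "doubleclick.net",
                         "facebook.net"}

def find_trackers(html):
    """Return the set of known tracker domains referenced by
    <script src=...> tags in the page source (question 0.1.5)."""
    found = set()
    for src in re.findall(r'<script[^>]+src=["\']([^"\']+)["\']', html, re.I):
        host = urlparse(src).netloc.lower()
        for tracker in KNOWN_TRACKER_DOMAINS:
            # Match the tracker domain itself or any subdomain of it.
            if host == tracker or host.endswith("." + tracker):
                found.add(tracker)
    return found

page = ('<html><script '
        'src="https://www.google-analytics.com/analytics.js">'
        '</script></html>')
print(find_trackers(page))  # {'google-analytics.com'}
```

A source scan of this kind only catches statically referenced scripts; trackers injected at runtime require inspecting live network traffic.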

2: Policy Available

  • Are hyperlinks to the vendor's policies available on the homepage and labeled Privacy Policy or Terms of Use?
Citation:
  • California Online Privacy Protection Act: (An operator of a service or application that collects personally identifiable information through the Internet about individual consumers from California who use or visit its service is required to conspicuously post a privacy policy) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(a)
  • California Online Privacy Protection Act: (An operator is required to post a conspicuous hyperlink that includes the word "privacy" to its actual privacy policy on the homepage or first significant page after entering the Web site, or an icon that hyperlinks to a Web page on which the actual privacy policy is posted, so that a reasonable person would notice it) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(b)(1)-(4)
Background:
  • A vendor should make their Policy recognizable by giving it a descriptive title, such as 'Privacy Policy' or 'Data Collection and Use Policy.' Make the Privacy Policy available in a single location; don't make users search for it in Terms of Service or Terms and Conditions statements, for example. Make the Policy conspicuously available on the website or from within the mobile app or other online service. If your app is available through an online store or other platform, also provide a link to the Policy there so that potential users can review it before downloading the app. Be prepared to provide a copy of or a link to the Policy to a school or school district for posting on their website. Schools and districts are increasingly receiving requests from parents to share the privacy policies of the online services they use. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 15.

0.2.2: Policy Accessible

  • Are the policies available in a human- and machine-readable format that is accessible on the web, on mobile devices, and with screen readers or assistive technologies?
Citation:
  • California Online Privacy Protection Act: (An online service or application is required to post a conspicuous hyperlink that includes the word "privacy" to its actual privacy policy on the homepage, or provide any other reasonably accessible means of making the privacy policy available for consumers of the online service) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(b)(5)
Background:
  • A vendor should make the Privacy Policy for the site or service easy for parents and educators to understand. Use plain language, for example, say what you currently do or don’t do, instead of what practices you 'may' employ at some future time. If there are practices that a vendor may only employ in some circumstances, specify those circumstances. Be concrete and specific about the data practices related to all the features of your site or service, explaining where appropriate that a school or district may choose not to use all of its features. Format the Policy with headers that identify key parts of the policy, such as Information We Collect, How We Use Your Information, Information We Share, Access to Your Information, Security, Effective Date, and Privacy Contact. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 16.

0.2.3: Allow Crawling

  • Do the policies allow machine crawling or indexing?
Citation:
  • Copyright Act of 1976: (Fair use of a copyrighted work, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright) See Copyright Act of 1976, 17 U.S.C. § 107
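Question 0.2.3 can be checked against the site's robots.txt rules. Below is a sketch using Python's standard-library robotparser, parsing a hypothetical robots.txt offline; the paths and rules are placeholders for whatever the vendor actually publishes:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: account pages are closed to crawlers,
# the privacy policy page is explicitly open.
robots_txt = """\
User-agent: *
Disallow: /account/
Allow: /privacy
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A policy page is crawlable/indexable if no rule disallows it
# for the evaluator's user agent (question 0.2.3).
print(parser.can_fetch("*", "https://example.com/privacy"))   # True
print(parser.can_fetch("*", "https://example.com/account/"))  # False
```

For a live check, `RobotFileParser.set_url(...)` plus `read()` fetches the real robots.txt instead of parsing a local string.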

0.2.4: Policy Purchase

  • Are the policies available on all product purchase or acquisition web pages?
Citation:

0.2.5: Policy Registration

  • Are the policies available on a new account registration webpage for review prior to a user creating a new account with the service or application?
Citation:

3: Account Type

0.3.1: Free Account

  • Can you create a free sample account with the application or service?

0.3.2: Access Code

  • Does the application or service require the purchase of hardware or a school access code to create an account?

0.3.3: Purchase Options

  • Does the application or service offer a separate paid version or in-app purchases?

4: Policy Errors

0.4.1: Policy Readable

  • Do the policies contain structural or typographical errors?

1: Transparency

Background:
  • Transparency: Consumers have a right to easily understandable and accessible information about privacy and security practices. Companies should provide clear descriptions of what personal data they collect, why they need the data, how they will use it, when they will delete the data or de-identify it from consumers, and for what purposes they may share personal data with third-parties. These notifications should be placed in locations in the application or service that are most useful to enabling consumers to gain a meaningful understanding of privacy implications and the ability to exercise Individual Control. If an online website, service, or application does not have a Privacy Policy, Terms of Service (TOS), and/or End User License Agreement (EULA) publicly available, then this evaluation tool should not be used, because there are no reliable or legally binding guarantees about how a user's data will be treated. Additionally, in many cases, the terms should contain specifics about how cookies and other trackers are used, a data breach notification policy, and other legal notices as applicable. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 14.
  • Transparency is one of the Fair Information Practice Principles (FIPPs) that underlie privacy laws and regulations around the world. The Organization for Economic Cooperation and Development (OECD) calls for transparency about developments, practices and policies with respect to personal data, as one of the guidelines intended to help harmonize national privacy legislation while supporting the data flow essential to international commerce. See CA DOJ, Making Your Privacy Practices Public, p. 3; See also The Organisation for Economic Co-operation and Development (OECD) Privacy Framework (2013).
  • The California Online Privacy Protection Act (CalOPPA), requires operators of commercial websites or online services that collect personal information on California consumers through a website to conspicuously post a privacy policy on the site and to comply with its policy. The privacy policy must, among other things, identify the categories of personally identifiable information (PII) collected about site visitors and the categories of third-parties with whom the operator may share the information. The privacy policy must also provide information on the operator's online tracking practices. An operator is in violation for failure to post a policy within 30 days of being notified of noncompliance, or if the operator either knowingly, willfully, or negligently fails to materially comply with the provisions of its policy. See California Business and Professions Code (B.P.C.) §§ 22575-22579; See CA DOJ, How to Read a Privacy Policy.
  • The FTC recommends the implementation of substantive privacy protections – such as data security, limitations on data collection and retention, and data accuracy – as well as procedural safeguards aimed at integrating the substantive principles into a company's everyday business operations. By shifting burdens away from consumers and placing obligations on businesses to treat consumer data in a responsible manner, these principles should afford consumers basic privacy protections without forcing them to read long, incomprehensible privacy notices to learn and make choices about a company's privacy practices. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 23.
  • Most privacy policies are generally ineffective for informing consumers about a company's data practices because they are too long, are difficult to comprehend, and lack uniformity. However, the policies still have value – they provide an important accountability function by educating consumer advocates, regulators, the media, and other interested parties about the company's data practices. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 61.
  • The FTC believes that privacy policy statements should contain some standardized elements, such as format and terminology, to allow consumers to compare the privacy practices of different companies and to encourage companies to compete on privacy. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 62.

1.1: Policy Version

1.1.1: Effective Date

  • Do the policies clearly indicate the version or effective date of the policies?
Citation:
Background:

1.1.2: Change Log

  • Do the policies clearly indicate a changelog or past versions of the policies are available for review?

1.2: Policy Coverage

1.2.1: Services Include

  • Do the policies clearly indicate the websites, services, or mobile applications that are covered by the policies?
Citation:

1.3: Policy Changes

1.3.1: Review Changes

  • Do the policies clearly indicate whether or not any updates or material changes to the policies will be accessible for review by a user prior to the new changes being effective?
Citation:

1.3.2: Effective Changes

  • Do the policies clearly indicate whether or not any updates or material changes to the policies are effective immediately and continued use of the application or service indicates consent?
Citation:
Background:

1.4: Policy Notice

1.4.1: Change Notice

  • Do the policies clearly indicate whether or not a user is notified if there are any material changes to the policies?
Citation:

1.4.2: Method Notice

  • Do the policies clearly indicate the method used to notify a user when policies are updated or materially change?
Citation:
Background:

1.4.3: Cookie Notice

  • Do the policies clearly indicate the vendor provides prominent notice on the homepage that the website or service uses cookies?
Citation:
  • General Data Protection Regulation: (The EU General Data Protection Regulation (GDPR) replaces the Data Protection Directive 95/46/EC and was designed to harmonize data privacy laws across Europe, to protect and empower all EU citizens data privacy, and to reshape the way organizations across the region approach data privacy) See General Data Protection Regulation (GDPR), (Regulation (EU) 2016/679)

1.5: Policy Contact

1.5.1: Vendor Contact

  • Do the policies clearly indicate whether or not a user can contact the vendor about any privacy policy questions, complaints, or material changes to the policies?
Citation:
Background:

1.6: Policy Principles

1.6.1: Quick Reference

  • Do the policies clearly indicate short explanations, layered notices, a table of contents, or privacy principles of the vendor?
Background:

1.7: Policy Language

1.7.1: Preferred Language

  • Do the policies clearly indicate they are available in a language other than English?
Citation:
  • General Data Protection Regulation: (The EU General Data Protection Regulation (GDPR) replaces the Data Protection Directive 95/46/EC and was designed to harmonize data privacy laws across Europe, to protect and empower all EU citizens data privacy, and to reshape the way organizations across the region approach data privacy) See General Data Protection Regulation (GDPR), (Regulation (EU) 2016/679)

1.8: Intended Use

1.8.1: Children Intended

  • Do the policies clearly indicate whether or not the service is intended to be used by children under the age of 13?
Citation:

1.8.2: Teens Intended

  • Do the policies clearly indicate whether or not the service is intended to be used by teens 13 to 18 years of age?
Citation:
  • Children's Online Privacy Protection Act: (A mixed audience site is directed to children, but does not target children as its "primary audience," instead targeting teens 13-to-18 years of age or adults. An operator of a mixed audience site is required to obtain age information from a user before collecting any information, and if a user identifies themselves as a child under the age of 13, the operator must obtain parental consent before any information is collected) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582

1.8.3: Adults Intended

  • Do the policies clearly indicate whether or not the service is intended to be used by adults over the age of 18?
Citation:
  • Children's Online Privacy Protection Act: (A general audience site is where the operator has no actual knowledge that a child under the age of 13 has registered an account or is using the service, and no age gate or parental consent is required before collection of information) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2

1.8.4: Parents Intended

  • Do the policies clearly indicate whether or not the service is intended to be used by parents or guardians?
Citation:

1.8.5: Students Intended

  • Do the policies clearly indicate whether or not the service is intended to be used by students in preschool or K-12?
Citation:

1.8.6: Teachers Intended

  • Do the policies clearly indicate whether or not the service is intended to be used by teachers?
Citation:

2: Focused Collection

Background:
  • Focused Collection: Consumers have a right to reasonable limits on the personal data that companies collect and retain. Companies should collect only as much personal data as they need to accomplish the purposes in which the data is collected. Companies should also securely dispose of or de-identify personal data once they no longer need it, unless they are under a legal obligation to do otherwise. If a company provides a clear understanding of all the data collected, a user can make an informed choice about the potential privacy implications of how their data are used. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 21.
  • The FTC recommends privacy best practices which include the principle that Companies should incorporate substantive privacy protections into their practices, such as data security, reasonable collection limits, sound retention practices, and data accuracy. It is a best practice for companies to inform customers about what information they are collecting in a clear and concise manner and to only collect the information that the company needs to complete their business purpose. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 23, 26.
  • As part of privacy by design, first-party companies and third-party data brokers should strive to assess their collection practices and, to the extent practical, collect only the data they need and properly dispose of the data as it becomes less useful. This is particularly important in light of companies' increased ability to collect, aggregate, and match consumer data and to develop secondary uses for the data in ways that consumers could never have contemplated when they provided the information. Sound data collection and disposal practices also reinforce data security, as collecting and storing large amounts of data not only increases the risk of a data breach or other unauthorized access but also increases the potential harm that could be caused. For example, identity thieves and other unscrupulous actors may be attracted to detailed consumer profiles maintained by data brokers that do not dispose of obsolete data, as this data could give them a clear picture of consumers' habits over time, thereby enabling them to predict passwords, answers to challenge questions, or other authentication credentials. See FTC, Data Brokers: A Call For Transparency and Accountability (May 2014), p. 55.
  • The federal government recommends that data collected in schools be used for educational purposes and continue to support investment and innovation that raise the level of performance across our schools. To promote this innovation, the government should explore how to modernize the privacy regulatory framework under the Family Educational Rights and Privacy Act (FERPA) and Children's Online Privacy Protection Act (COPPA) to ensure two complementary goals: 1) protecting students against their data being shared or used inappropriately, especially when that data is gathered in an educational context, and 2) ensuring that innovation in educational technology, including new approaches and business models, has ample opportunity to flourish. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 64.

2.1: Data Collection

2.1.1: Collect PII

  • Do the policies clearly indicate whether or not the vendor collects Personally Identifiable Information (PII)?
Citation:
  • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
  • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.1
Background:

2.1.2: PII Categories

  • Do the policies clearly indicate what categories of Personally Identifiable Information are collected by the application or service?
Citation:

2.1.3: Geolocation Data

  • Do the policies clearly indicate whether or not geolocation data are collected?
Citation:
  • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.1
  • Student Online Personal Information Protection Act: ("Covered Information" under SOPIPA is personally identifiable information that includes descriptive information or identifies a student that was created or provided by a student, parent, teacher, district staff, or gathered by an operator through the operation of the site) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(i)(1)-(3)
  • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
Background:
  • Location information collected in the mobile context is considered a persistent identifier that can be used to recognize a user over time and across different websites or online services. Geolocation data includes information sufficient to identify the latitude and longitude coordinates of a user that can correspond to a specific street, address, name of a city or town. If location data is collected and shared with third-parties, companies should work to provide consumers with more prominent notice and choices about its geolocation data collection, transfer, use, and disposal practices. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 33; See also U.S. v. Jones, 132 S. Ct. 945, 955 (2012)("GPS monitoring generates a precise, comprehensive record of a person's public movements that reflects a wealth of detail about her familial, political, professional, religious, and sexual associations").

2.1.4: Health Data

  • Do the policies clearly indicate whether or not any biometric data are collected?
Citation:
  • Family Educational Rights and Privacy Act: (A biometric record, as used in the definition of personally identifiable information, means a record of one or more measurable biological or behavioral characteristics that can be used for automated recognition of an individual. Examples include fingerprints; retina and iris patterns; voiceprints; DNA sequence; facial characteristics; and handwriting) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.1
  • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Student Online Personal Information Protection Act: ("Covered Information" under SOPIPA is personally identifiable information that includes descriptive information or identifies a student that was created or provided by a student, parent, teacher, district staff, or gathered by an operator through the operation of the site) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(i)(1)-(3)
Background:
  • Biometric data are physical or behavioral characteristics that can be used to identify unique individuals. Biometric technologies measure these unique characteristics electronically and match them against existing records to create a highly accurate identity management system. Fingerprints, retina scans, and voice and facial recognition are examples of physical identification technologies. Facial recognition uses the layout of facial features and their distance from one another for identification against a "gallery" of faces with similar characteristics. See Privacy Best Practice Recommendations For Commercial Biometric Use, NTIA Discussion Draft (July 22, 2015), p. 1.
  • The ability of facial recognition technology to identify consumers based solely on a photograph, create linkages between the offline and online world, and compile highly detailed dossiers of information, makes it especially important for companies using this technology to implement privacy by design concepts with robust choice and transparency policies. Such practices should include reducing the amount of time consumer information is retained, adopting reasonable security measures, and disclosing to consumers that the facial data collected may be used to link them to information from third-parties or publicly available sources. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 46.

2.1.5: Behavioral Data

  • Do the policies clearly indicate whether or not any behavioral data are collected?
Citation:
  • Children's Online Privacy Protection Act: (An operator is prohibited from including behavioral advertisements or amassing a profile of a child under the age of 13 without parental consent) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Family Educational Rights and Privacy Act: (A biometric record, as used in the definition of personally identifiable information, means a record of one or more measurable biological or behavioral characteristics that can be used for automated recognition of an individual. Examples include fingerprints; retina and iris patterns; voiceprints; DNA sequence; facial characteristics; and handwriting) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.1

2.1.6: Sensitive Data

  • Do the policies clearly indicate whether or not sensitive personal information is collected?

2.1.7: Usage Data

  • Do the policies clearly indicate whether or not the application or service collects non-personal information such as a user's persistent identifier, unique device ID, IP address, or other device information?
Citation:
  • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.1
  • Student Online Personal Information Protection Act: ("Covered Information" under SOPIPA is personally identifiable information that includes descriptive information or identifies a student that was created or provided by a student, parent, teacher, district staff, or gathered by an operator through the operation of the site) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(i)(1)-(3)
  • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
Background:
  • The Children's Online Privacy Protection Act (COPPA) defines "personal information" to include identifiers, such as a customer number held in a cookie, an IP address, a processor or device serial number, or a unique device identifier that can be used to recognize a user over time and across different websites or online services, even where such an identifier is not paired with other items of personal information. Companies should disclose in their privacy policy, and in their direct notice to parents, their collection, use or disclosure practices of persistent identifiers unless: (1) the company collects no other "personal information," and (2) persistent identifiers are collected on or through a company's site or service solely for the purpose of providing "support for the internal operations" of the site or service. See FTC, Complying with COPPA: Frequently Asked Questions, q. 6.
  • Persistent identifiers collected for the sole purpose of providing support for the internal operations of the website or online service do not require parental consent, so long as no other personal information is collected and the persistent identifiers are not used or disclosed to contact a specific individual, including through behavioral advertising; to amass a profile on a specific individual; or for any other purpose. See FTC, Complying with COPPA: Frequently Asked Questions, q. 5.
  • The data on students collected and maintained by Ed Tech can be extremely sensitive, including medical histories, social and emotional assessments, progress reports, and test results. Online services also collect new types of data, which were not contemplated by and may not be protected by federal privacy laws. New data types collected by Ed Tech include "metadata," such as a student’s location, how many attempts a student made to answer a question, and whether a student is using a desktop or a mobile device. Metadata can be put to good use to personalize learning and to improve educational products. It can also be used to influence or market to students or to their parents. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 3.
  • A vendor should describe the types or categories of student information that they acquire from schools, school districts, teachers, parents, or students. Data types may include behavioral data reflecting how a student used the site or service or what content the student has accessed or created through it, and transactional data, such as persistent unique identifiers, collected through the use of the site or service. While unique identifiers are evolving with technology, currently such identifiers include, but are not limited to, cookies, device IDs, IP addresses, and other data elements if used to identify devices or users. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 11.

2.1.8: Lunch Status

  • Do the policies clearly indicate whether or not the vendor collects information on free or reduced lunch status?
Citation:
  • The National School Lunch Act: (The NSLA defines penalties for the unauthorized sharing of personal information related to free and reduced lunch status for students) See The National School Lunch Act (NSLA), 42 U.S.C. §§1751-63
  • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.1
Background:
  • The National School Lunch Act (NSLA) requires school districts to provide free or reduced price lunches to all eligible children, including eligible children in schools that had not yet established school lunch programs. The NSLA aims to safeguard the health and well-being of children and defines penalties for the unauthorized sharing of personal information related to free and reduced lunch status for students. See 42 U.S.C. §§ 1751-63.

2.2: Data Source

2.2.1: Student Data

  • Do the policies clearly indicate whether or not the vendor collects personal information or education records from preK-12 students?
Citation:
Background:
  • The Family Educational Rights and Privacy Act of 1974 (FERPA) provides parents of students the right to access their children's student data or education records, and students 18 years of age and older the right to access their own education records. In addition, FERPA provides the right to have the records amended, and the right to have some control over the disclosure of personally identifiable information (PII) in the education records. Furthermore, strict guidelines govern the storage of student data, requiring organizations to maintain accurate and up-to-date records. See 20 U.S.C. § 1232g; 34 C.F.R. Part 99.1.
  • What are Education Records? FERPA defines educational records as records that are: (1) directly related to a student; and (2) maintained by an educational agency or institution or by a party acting for the agency or institution. These records include, but are not limited to, transcripts, class lists, student course schedules, health records, student financial information, and student disciplinary records. It is important to note that any of these records maintained by a third-party acting on behalf of a school or district are also considered education records. 20 U.S.C. § 1232g (a)(4)(A); 34 CFR § 99.3; See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 1; See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, p. 2.

2.2.2: Child Data

  • Do the policies clearly indicate whether or not the vendor collects personal information online from children under 13 years of age?
Citation:
Background:
  • The Children's Online Privacy Protection Act (COPPA) requires a privacy policy to list the kinds of personal information collected from children (for example, name, address, email address, hobbies, etc.), how the information is collected, and how the company uses the personal information. It also requires companies to indicate whether they disclose information collected from children to third-parties. If so, the company must also disclose the kinds of businesses in which the third-parties are engaged, the general purposes for which the information is used, and whether the third-parties have agreed to maintain the confidentiality and security of the information. See 15 U.S.C. § 6502; 16 C.F.R. Part 312.
  • If a company knows that a user of the online website or service is under the age of 13, the Children's Online Privacy Protection Act (COPPA) will impose more stringent requirements on the collection of information from those users. COPPA requires that operators seeking to collect, use, or disclose personal information from children under the age of 13, must first obtain verifiable parental consent. Even where a user is 13 or older, COPPA remains a source of best practices for companies that collect personal information from users, particularly when those users are still minors. See 15 U.S.C. §§ 6501-6506; 16 C.F.R. Part 312.
  • COPPA permits the collection of limited personal information from children under 13 for the purposes of: (1) Obtaining verified parental consent; (2) providing parents with a right to opt-out of an operator's use of a child's email address for multiple contacts of the child; and (3) to protect a child's safety on a website or online service. See 15 U.S.C. 6502(b)(2); 16 C.F.R. 312.5(c)(1)–(5).

2.3: Data Exclusion

2.3.1: Data Excluded

  • Do the policies clearly indicate whether or not the vendor does not collect specific types of data?

2.3.2: Coverage Excluded

  • Do the policies clearly indicate whether or not the vendor excludes specific types of collected data from coverage under its privacy policy?

2.4: Data Limitation

2.4.1: Collection Limitation

  • Do the policies clearly indicate whether or not the vendor limits the collection or use of information to only data that are specifically required for the application or service?
Citation:
Background:

3: Data Sharing

Background:
  • Data Sharing: Companies should address in their privacy policies whether data collected are shared or sold to third-parties, and whether data are shared in an aggregate or de-identified format. In addition, companies should disclose the roles of third-parties and their functions, and whether third-parties are contractually required to provide the same level of privacy protection, as well as the use of social plugins or federated logins.
  • The FTC recommends privacy principles apply to all commercial entities that collect or use consumer data that can be reasonably linked to a specific consumer, computer, or other device, unless the entity collects only non-sensitive data from fewer than 5,000 consumers per year and does not share the data with third-parties. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 22.
  • The FTC calls for privacy policies to be clearer, shorter, and more standardized to enable better comprehension and comparison of privacy practices. The FTC recommends companies provide consumer choice in situations where a company shares data with a third-party that it collects from a consumer, thereby giving consumers the ability to control the flow of their data to third-parties who might use or sell the data to others for enhancement. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 44, 61.
  • The Children's Online Privacy Protection Act (COPPA) requires a privacy policy to list the kinds of personal information collected from children (for example, name, address, email address, hobbies, etc.), how the information is collected, and how the company uses the personal information. It also requires companies to indicate whether they disclose information collected from children to third-parties. If so, the company must also disclose the kinds of businesses in which the third-parties are engaged, the general purposes for which the information is used, and whether the third-parties have agreed to maintain the confidentiality and security of the information. See 15 U.S.C. § 6502; 16 C.F.R. Part 312.
  • Under the Family Educational Rights and Privacy Act (FERPA), student data can be shared with a third-party if the vendor has been designated as a "school official," as defined, and that official can only use data that is part of an "educational record" for the specific purpose under which it was disclosed. However, student information that has been properly de-identified or that is shared under the "directory information" exception, is not protected by FERPA, and thus is not subject to FERPA's use and redisclosure limitations. Additionally, if a vendor has not been declared a "school official," any rights claimed by a vendor to sell or disclose student data should be identified and defined in the vendor's policies. See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 3-4.

3.1: Data Shared With Third Parties

3.1.1: Data Shared

  • Do the policies clearly indicate whether or not collected information (this includes data collected via automated tracking or usage analytics) is shared with third parties?
Citation:
Background:
  • Online educational services increasingly collect a large amount of contextual or transactional data as part of their operations, often referred to as "metadata." Metadata refer to information that provides meaning and context to other data being collected; for example, information about how long a particular student took to perform an online task has more meaning if the user knows the date and time when the student completed the activity, how many attempts the student made, and how long the student's mouse hovered over an item (potentially indicating indecision). See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 2-3.
  • Metadata that have been stripped of all direct and indirect identifiers are not considered protected information under FERPA, because the data are not PII. A provider that has been granted access to PII from education records under the "school official" exception may use any metadata that are not linked to FERPA-protected information for other purposes, unless otherwise prohibited by the terms of their agreement with the school or district. See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 2-3.

3.1.2: Data Categories

  • Do the policies clearly indicate what categories of information are shared with third parties?
Citation:
Background:
  • Consumers deserve more transparency about how their data is shared beyond the entities with which they do business directly, including "third-party" data collectors. This means ensuring that consumers are meaningfully aware of the spectrum of information collection and reuse as the number of firms that are involved in mediating their consumer experience or collecting information from them multiplies. The data services industry should follow the lead of the online advertising and credit industries and build a common website or online portal that lists companies, describes their data practices, and provides methods for consumers to better control how their information is collected and used or to opt-out of certain marketing uses. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 62.
  • What is the "School Official" Exception? In some cases, providers need PII from a student's education records in order to deliver the agreed-upon services. FERPA's school official exception to consent is most likely to apply to the schools' and districts' relationships with service providers. When schools and districts outsource institutional services or functions, FERPA permits the disclosure of PII from education records to contractors, consultants, volunteers, or other third-parties provided that the outside party meets specified requirements. See 34 C.F.R. § 99.31(a)(1)(i); See also PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 2; See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 3-5.

3.2: Data Use by Third Parties

3.2.1: Sharing Purpose

  • Do the policies clearly indicate the vendor's intention or purpose for sharing a user's personal information with third parties?
Citation:
Background:

3.2.2: Third-Party Analytics

  • Do the policies clearly indicate whether or not collected information is shared with third parties for analytics and tracking purposes?
Citation:
Background:

3.2.3: Third-Party Research

  • Do the policies clearly indicate whether or not collected information is shared with third parties for research or product improvement purposes?
Citation:
Background:

3.2.4: Third-Party Marketing

  • Do the policies clearly indicate whether or not personal information is shared with third parties for advertising or marketing purposes?
Citation:
Background:
  • The FTC agrees that the definition of first-party marketing should include the practice of contacting consumers across different channels. Regardless of the particular means of contact, receipt of a message from a company with which a consumer has interacted directly is likely to be consistent with the consumer's relationship with that company. If an offline or online retailer tracks a customer's activities on a third-party website, this is unlikely to be consistent with the customer's relationship with the retailer; thus, choice should be required. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 42; See also FTC Staff Report, Self-Regulatory Principles For Online Behavioral Advertising, pp. 26-28.

3.3: Data Not Shared With Third Parties

3.3.1: Exclude Sharing

  • Do the policies clearly indicate whether the vendor specifies the categories of information that will not be shared with third parties?

3.4: Data Sold To Third Parties

3.4.1: Data Sold

  • Do the policies clearly indicate whether or not a user's personal information is sold or rented to third parties?
Citation:
Background:

3.5: Third-Party Data Acquisition

3.5.1: Data Acquired

  • Do the policies clearly indicate whether or not a user's information is acquired from a third-party by the vendor?
Citation:
  • California Online Privacy Protection Act: (An operator is required to identify the categories of personally identifiable information that they collect about individual consumers who use or visit its website or online service) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(1)
  • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)

3.6: De-identified or Anonymized Data

3.6.1: Data Deidentified

  • Do the policies clearly indicate whether or not a user's information that is shared or sold to a third-party is only done so in an anonymous or de-identified format?
Citation:
Background:
  • There is nothing wrong with a provider using de-identified data for other purposes, because privacy statutes govern PII, not de-identified data. But because it can be difficult to fully de-identify data, as a best practice, an agreement between a company and third-party should prohibit re-identification and any future data transfers unless the third-party also agrees not to attempt re-identification. It is also a best practice to be specific about the de-identification process. De-identification typically requires more than just removing any obvious individual identifiers, as other demographic or contextual information can often be used to re-identify specific individuals. Retaining location and school information can also greatly increase the risk of re-identification. See PTAC, Protecting Student Privacy While Using Online Educational Services: Model Terms of Service, p. 3.
  • Properly de-identified data can reduce the risk of a person's sensitive personal information being disclosed, but data de-identification must be done carefully. Simple removal of direct identifiers from the data to be released does not constitute adequate de-identification. Properly performed de-identification involves removing or obscuring all identifiable information until all data that can lead to individual identification have been expunged or masked. Further, when making a determination as to whether the data have been sufficiently de-identified, it is necessary to take into consideration cumulative re-identification risk from all previous data releases and other reasonably available information. See PTAC, Data De-identification: An Overview of Basic Terms, p. 3.
  • FERPA allows properly de-identified data to be used for other purposes, though providers planning to use de-identified student data should be clear about their methodologies for de-identification. If de-identified data will be transferred to another party, it is a best practice to contractually prohibit the third-party from attempting to re-identify any student data. Providers should also acknowledge whether anonymized metadata—a type of de-identified or partially de-identified data—will be used, and for what purposes. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 3.
  • If a vendor shares covered information for the development and improvement of educational sites or services, they should de-identify and aggregate the information first. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14.
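The guidance above stresses that dropping direct identifiers alone is not adequate de-identification. A minimal, hypothetical sketch of that first step, plus generalization of quasi-identifiers (all field names and the ZIP/birth-year conventions here are illustrative assumptions, not from the cited guidance):

```python
# Illustrative only: removing direct identifiers is necessary but NOT
# sufficient de-identification; quasi-identifiers must also be treated.
DIRECT_IDENTIFIERS = {"name", "email", "student_id", "ssn"}

def strip_direct_identifiers(record: dict) -> dict:
    """Drop direct identifiers and generalize common quasi-identifiers."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Generalize quasi-identifiers: keep a 3-digit ZIP prefix and birth year.
    if "zip_code" in cleaned:
        cleaned["zip_code"] = cleaned["zip_code"][:3] + "XX"
    if "birth_date" in cleaned:
        cleaned["birth_date"] = cleaned["birth_date"][:4]  # year only
    return cleaned

record = {"name": "Jane Doe", "email": "jane@example.com",
          "zip_code": "94110", "birth_date": "2008-04-01", "score": 87}
print(strip_direct_identifiers(record))
# {'zip_code': '941XX', 'birth_date': '2008', 'score': 87}
```

Even after this step, the remaining quasi-identifiers can still be linked to outside data, which is why the guidance recommends contractual prohibitions on re-identification.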

3.6.2: Deidentified Process

  • Do the policies clearly indicate whether or not a user's personal information is de-identified with a reasonable level of justified confidence, or the vendor provides links to any information that describes their de-identification process?
Citation:
Background:
  • While data shared in the aggregate can reduce the risk of re-identifying anonymous individuals, it does not completely eliminate the risk, and sharing of aggregate data should be carefully reviewed. The aggregation of student-level data into school-level (or higher) reports removes much of the risk of disclosure, since no direct identifiers (such as a name, Social Security Number, or student ID) are present in the aggregated tables. Some risk of disclosure does remain, however, in circumstances where one or more students possess a unique or uncommon characteristic (or a combination of characteristics) that would allow them to be identified in the data table (this commonly occurs with small ethnic subgroup populations), or where some easily observable characteristic corresponds to an unrelated category in the data table (e.g., if a school reports that 100% of males in grade 11 scored at "Below Proficient" on an assessment). In these cases, some level of disclosure avoidance is necessary to prevent disclosure in the aggregate data table. See PTAC, Frequently Asked Questions—Disclosure Avoidance (Oct 2012), p. 2.
  • FERPA allows properly de-identified data to be used for other purposes, though providers planning to use de-identified student data should be clear about their methodologies for de-identification. If de-identified data will be transferred to another party, it is a best practice to contractually prohibit the third-party from attempting to re-identify any student data. Providers should also acknowledge whether anonymized metadata—a type of de-identified or partially de-identified data—will be used, and for what purposes. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 3.
  • A company must take reasonable measures to ensure that the data is de-identified. This means that the company must achieve a reasonable level of justified confidence that the data cannot reasonably be used to infer information about, or otherwise be linked to, a particular consumer, computer, or other device. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 21.
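The disclosure-avoidance problem described above (small subgroups in aggregate tables) is commonly addressed with cell suppression. A hypothetical sketch, with an illustrative threshold and data not drawn from the cited sources:

```python
# Primary small-cell suppression for an aggregate table: counts below a
# minimum cell size are masked. Threshold and data are assumptions.
MIN_CELL_SIZE = 10

def suppress_small_cells(table: dict) -> dict:
    """Replace counts below the threshold with a suppression marker."""
    return {group: (count if count >= MIN_CELL_SIZE else "*")
            for group, count in table.items()}

below_proficient = {"Grade 11 male": 42, "Grade 11 female": 3,
                    "Grade 12 male": 17}
print(suppress_small_cells(below_proficient))
# {'Grade 11 male': 42, 'Grade 11 female': '*', 'Grade 12 male': 17}
```

Note that real disclosure avoidance also requires complementary (secondary) suppression: if row or column totals are published, a single masked cell can be recovered by subtraction.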
3.7: Third-Party Links

3.7.1: Outbound Links

  • Do the policies clearly indicate whether or not outbound links on the site to third-party external websites are age appropriate?
Citation:
  • Children's Internet Protection Act: (If an operator provides third-party links on its site that link to potentially non-age appropriate information for children, then the operator must provide notice upon clicking a third-party link that a user is leaving the website) See Children's Internet Protection Act (CIPA), 47 U.S.C. § 254
Background:

3.8: Third-Party Data Access

3.8.1: Authorized Access

  • Do the policies clearly indicate whether or not a third party is authorized to access a user's information?
Citation:

3.9: Third-Party Data Collection

3.9.1: Third-Party Collection

  • Do the policies clearly indicate whether or not a user's personal information is collected by a third party?
Citation:
Background:

3.10: Third-Party Data Misuse

3.10.1: Data Misuse

  • Do the policies clearly indicate whether or not a user's information can be deleted from a third party by the vendor, if found to be misused by the third party?

3.11: Third-Party Service Providers

3.11.1: Third-Party Providers

  • Do the policies clearly indicate whether or not third-party services are used to support the internal operations of the vendor's application or service?
Citation:
Background:
  • Disclosure of personal information for the "internal operations" of the website or online service, means activities necessary for the site or service to maintain or analyze its functioning; perform network communications; authenticate users or personalize content; serve contextual advertising or cap the frequency of advertising; protect the security or integrity of the user, website, or online service; ensure legal or regulatory compliance; or fulfill a request of a child. See 16 C.F.R. 312.2; See also FTC, Complying with COPPA: Frequently Asked Questions, q. 5.

3.11.2: Third-Party Roles

  • Do the policies clearly indicate the role of third-party service providers?
Citation:

3.12: Third-Party Affiliates

3.12.1: Third-Party Categories

  • Do the policies clearly indicate the categories of third parties, subsidiaries, or affiliates with whom the vendor shares data?
Citation:
Background:

3.13: Third-Party Policies

3.13.1: Third-Party Policy

  • Do the policies clearly indicate whether a link to a third-party service provider, data processor, partner, or affiliate's privacy policy is available for review?

3.14: Third-Party Data Combination

3.14.1: Vendor Combination

  • Do the policies clearly indicate whether or not data collected or maintained by the vendor can be augmented, extended, or combined with data from third party sources?
Citation:

3.14.2: Third-Party Combination

  • Do the policies clearly indicate whether or not data shared with third parties can be augmented, extended, or combined with data from additional third party sources?
Citation:

3.15: Third-Party Authentication

3.15.1: Social Login

  • Do the policies clearly indicate whether or not social or federated login is supported to use the service?
Citation:
  • California Privacy of Pupil Records: (Prohibits schools, school districts, county offices of education, and charter schools from collecting or maintaining information about pupils from social media for any purpose other than school or pupil safety, without notifying each parent or guardian and providing the pupil with access and an opportunity to correct or delete such information) See California Privacy of Pupil Records, Cal. Ed. Code § 49073.6

3.15.2: Social Collection

  • Do the policies clearly indicate whether or not the vendor collects information from social or federated login providers?
Citation:
  • California Privacy of Pupil Records: (Prohibits schools, school districts, county offices of education, and charter schools from collecting or maintaining information about pupils from social media for any purpose other than school or pupil safety, without notifying each parent or guardian and providing the pupil with access and an opportunity to correct or delete such information) See California Privacy of Pupil Records, Cal. Ed. Code § 49073.6

3.15.3: Social Sharing

  • Do the policies clearly indicate whether or not the vendor shares information with social or federated login providers?
Background:

3.16: Third-Party Contractual Obligations

3.16.1: Third-Party Limits

  • Do the policies clearly indicate whether or not third parties have contractual limits on how they can use personal information that is shared or sold to them?
Citation:
Background:
  • A company that transfers data to another company should not place emphasis on the disclosures themselves, but on whether a disclosure leads to a use of personal data that is inconsistent with the context of its collection or a consumer's expressed desire to control the data. Thus, if a company transfers personal data to a third party, it remains accountable and thus should hold the recipient accountable—through contracts or other legally enforceable instruments. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 22.
  • A company's data would not be "reasonably linkable" to a particular consumer or device to the extent that the company implements three significant protections for that data: (1) a given data set is not reasonably identifiable, (2) the company publicly commits not to re-identify it, and (3) the company requires any downstream users of the data to keep it in de-identified form. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 21.
  • The ability to re-identify "anonymous" data supports the FTC's framework application to data that can be reasonably linked to a consumer or device, because consumers' privacy interest in data goes beyond what is strictly labeled PII. There exists a legitimate interest for consumers in having control over how companies collect and use aggregated or de-identified data, browser fingerprints, and other types of non-PII. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 18-19.
  • Properly de-identified data can reduce the risk of a person's sensitive personal information being disclosed, but data de-identification must be done carefully. Simple removal of direct identifiers from the data to be released does not constitute adequate de-identification. Properly performed de-identification involves removing or obscuring all identifiable information until all data that can lead to individual identification have been expunged or masked. Further, when making a determination as to whether the data have been sufficiently de-identified, it is necessary to take into consideration cumulative re-identification risk from all previous data releases and other reasonably available information. See PTAC, Data De-identification: An Overview of Basic Terms, p. 3.
  • A vendor should contractually require their service providers who receive covered information acquired through the site or service to use the information only to provide the contracted service, not to further disclose the information, to implement and maintain reasonable security procedures and practices as required by law, and to return or delete covered information at the completion of the contract. Include a requirement that any service providers notify the vendor immediately of any unauthorized disclosure of the student information in their custody, and then act promptly to provide proper notice as required by law. Make clear to service providers that they may separately face liability for the mishandling of student data. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 13.

3.16.2: Combination Limits

  • Do the policies clearly indicate whether or not third parties have contractual limits that prohibit re-identification or combining data with other data sources that are shared or sold to them?
Citation:
  • Children's Online Privacy Protection Act: (An operator must take reasonable steps to release a child's personal information only to service providers and third parties who are capable of maintaining the confidentiality, security, and integrity of the information, and provide assurances that they contractually maintain the information in the same manner) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.8
Background:
  • When data are collected in one context and combined with data from other sources or different contexts, it increases the potential for an individual's privacy to be compromised. Combining data from multiple sources is part of the process of creating a digital profile of a student. Combining data from multiple sources can also be used to re-identify data sets that have been de-identified, or to identify individuals within data sets that have been shared as anonymous aggregated data. A privacy policy that prohibits third-parties from re-identifying anonymous aggregated data provides an additional level of privacy protection for users. See PTAC, Data De-identification: An Overview of Basic Terms.
  • The FTC recommends that third-party data brokers take reasonable precautions to ensure that downstream users of their data do not use it for eligibility determinations or for unlawful discriminatory purposes. Of course, the use of race, color, religion, and certain other categories to make credit, insurance, and employment decisions is already against the law, but data brokers should help ensure that the information does not unintentionally go to unscrupulous entities that would be likely to use it for unlawful discriminatory purposes. Similarly, data brokers should conduct due diligence to ensure that data that they intend for marketing or risk mitigation purposes is not used to deny consumers credit, insurance, employment, or the like. See FTC, Data Brokers: A Call For Transparency and Accountability (May 2014), pp. 55-56.
  • A company that transfers data to another company should not place emphasis on the disclosures themselves, but on whether a disclosure leads to a use of personal data that is inconsistent with the context of its collection or a consumer's expressed desire to control the data. Thus, if a company transfers personal data to a third party, it remains accountable and thus should hold the recipient accountable—through contracts or other legally enforceable instruments. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 22.
  • The FTC's framework application applies to data that, while not yet linked to a particular consumer, computer, or device, may reasonably become so. There is significant evidence demonstrating that technological advances and the ability to combine disparate pieces of data can lead to identification of a consumer, computer, or device even if the individual pieces of data do not constitute PII. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 20.
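The re-identification risk described above can be made concrete with a linkage sketch: two releases that individually contain no direct identifiers, but share quasi-identifiers, can be joined to single out a person. All data and field names here are hypothetical:

```python
# Illustrative linkage attack: a "de-identified" data set joined with a
# separate roster on shared quasi-identifiers (ZIP + birth year).
deidentified_grades = [
    {"zip": "94110", "birth_year": 2008, "grade": "B"},
    {"zip": "95060", "birth_year": 2007, "grade": "A"},
]
public_roster = [
    {"name": "Student X", "zip": "94110", "birth_year": 2008},
    {"name": "Student Y", "zip": "90210", "birth_year": 2009},
]

def link(records, roster):
    """Join on quasi-identifiers; a unique hit re-identifies the record."""
    matches = []
    for r in records:
        hits = [p for p in roster
                if p["zip"] == r["zip"] and p["birth_year"] == r["birth_year"]]
        if len(hits) == 1:  # exactly one candidate => re-identified
            matches.append((hits[0]["name"], r["grade"]))
    return matches

print(link(deidentified_grades, public_roster))
# [('Student X', 'B')]
```

This is why the contractual limits discussed above prohibit not only re-identification itself but also combining shared data with additional sources.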

4: Respect for Context

Background:
  • Respect for Context: Consumers have a right to expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data. Companies should limit their use and disclosure of personal data to those purposes that are consistent with both the relationship that they have with consumers and the context in which consumers originally disclosed the data, unless required by law to do otherwise. If companies will use or disclose personal data for other purposes, they should provide heightened Transparency and Individual Choice by disclosing these other purposes in a manner that is prominent and easily actionable by consumers at the time of data collection. If, subsequent to collection, companies decide to use or disclose personal data for purposes that are inconsistent with the context in which the data was disclosed, they must provide heightened measures of Transparency and Individual Choice to consumers. Finally, the age and familiarity with technology of consumers who engage with a company are important elements of context. Companies should fulfill the obligations under this principle in ways that are appropriate for the age and sophistication of consumers, which may require greater protections for personal data obtained from children and teenagers than for adults. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 15.
  • The FTC recommends as a best practice that companies give their users clear and prominent notice and obtain affirmative express consent prior to making certain material retroactive changes to their privacy practices. For practices requiring choice, companies should offer the choice at a time and in a context in which the consumer is making a decision about his or her data. Companies should obtain affirmative express consent before (1) using consumer data in a materially different manner than claimed when the data was collected; or (2) collecting sensitive data for certain purposes. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 57-60.
  • Companies should present choices to consumers in a prominent, relevant, and easily accessible place at a time and in a context when it matters to them, and make privacy statements clearer, shorter, and more standardized. In addition, companies should provide consumers with reasonable access to their data, and undertake consumer education efforts to improve consumers' understanding of how companies collect, use, and share their data. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 60.
  • The Children's Online Privacy Protection Act (COPPA) requires verifiable parental consent to share data, but companies should also send a notice and request for express opt-in consent from parents if there are material changes in the collection, use, or disclosure practices of the company to which the parent had previously agreed. Although this requirement is not expressly required by law, it is generally considered a best practice and encouraged by the FTC. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 57-58.
  • Identifying and defining specific classes of data can be paired with more precise definitions of what data are collected, and can allow for protecting that data in ways that go beyond the minimum protections required by privacy laws. Different classes of data can also make privacy policies more comprehensible by explicitly stating that the data of students, parents, and teachers may be classified differently, with appropriate levels of protection.

4.1: Data Use

4.1.1: Purpose Limitation

  • Do the policies clearly indicate whether or not the vendor limits the use of data collected by the application to the educational purpose for which it was collected?
Citation:
Background:
  • Any PII from a student's education record that the provider receives under FERPA's "school official" exception may only be used for the specific purpose for which it was disclosed (i.e., to perform the outsourced institutional service or function, and the school or district must have direct control over the use and maintenance of the PII by the provider receiving the PII). Further, under FERPA's school official exception, the provider may not share or sell FERPA-protected information, or re-use it for any other purposes, except as directed by the school or district and as permitted by FERPA. See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, p. 5.

4.1.2: Data Purpose

  • Do the policies clearly indicate the context or purpose in which data are collected?
Background:

4.2: Data Classification

4.2.1: Data Type

  • Do the policies clearly indicate specific types of personal information (PII, Non-PII, Children's PII, Sensitive information, etc.)?

4.3: User Classification

4.3.1: Account Type

  • Do the policies clearly indicate different types or classes of user accounts?

4.4: Data Combination

4.4.1: Combination Type

  • Do the policies clearly indicate whether or not Personally Identifiable Information (PII) combined with non-PII would be treated as PII?
Citation:
Background:
  • When data are collected in one context and combined with data from other sources or different contexts, it increases the potential for an individual's privacy to be compromised. Combining data from multiple sources is part of the process of creating a digital profile of a student. Combining data from multiple sources can also be used to re-identify data sets that have been de-identified, or to identify individuals within data sets that have been shared as anonymous aggregated data. A privacy policy that prohibits third-parties from re-identifying anonymous aggregated data provides an additional level of privacy protection for users. See PTAC, Data De-identification: An Overview of Basic Terms.
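The re-identification risk described above can be sketched in a few lines. Everything below is hypothetical (the records, the names, and the field choices); the join keys stand in for typical quasi-identifiers such as ZIP code, birth date, and gender:

```python
# Hypothetical "anonymous" records: no names, only quasi-identifiers.
anonymous_scores = [
    {"zip": "02138", "dob": "1990-07-31", "gender": "F", "score": 88},
    {"zip": "94105", "dob": "1985-01-02", "gender": "M", "score": 72},
]

# Hypothetical public roster containing the same quasi-identifiers plus names.
public_roster = [
    {"name": "J. Doe", "zip": "02138", "dob": "1990-07-31", "gender": "F"},
    {"name": "A. Smith", "zip": "94105", "dob": "1985-01-02", "gender": "M"},
]

def reidentify(anon_rows, roster, keys=("zip", "dob", "gender")):
    """Join two datasets on quasi-identifiers; a unique match re-identifies a row."""
    index = {}
    for person in roster:
        index.setdefault(tuple(person[k] for k in keys), []).append(person["name"])
    matches = {}
    for row in anon_rows:
        names = index.get(tuple(row[k] for k in keys), [])
        if len(names) == 1:  # the quasi-identifiers single out one individual
            matches[names[0]] = row["score"]
    return matches

print(reidentify(anonymous_scores, public_roster))
# {'J. Doe': 88, 'A. Smith': 72}
```

Neither dataset contains PII on its own by a narrow definition, yet the combination links each score to a named person, which is why policies prohibiting re-identification of aggregated or de-identified data add real protection.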

4.5: Data Notice

4.5.1: Context Notice

  • Do the policies clearly indicate whether or not notice is provided to a user if the vendor changes the context in which data are collected?

4.6: Data Changes

4.6.1: Practice Changes

  • Do the policies clearly indicate whether or not the vendor will obtain consent if its data collection practices change or become inconsistent with contractual requirements?

4.7: Enforcement Context

4.7.1: Community Guidelines

  • Do the policies clearly indicate whether or not the vendor may terminate a user's account if they engage in any prohibited activities?

5: Individual Control

Background:
  • Individual Control: Consumers have a right to exercise control over what personal data companies collect from them and how they use it. Companies should provide consumers appropriate control over the personal data that consumers share with others and over how companies collect, use, or disclose personal data. Companies should enable these choices by providing consumers with easily used and accessible mechanisms that reflect the scale, scope, and sensitivity of the personal data that they collect, use, or disclose. Moreover, companies should remain cognizant of the sensitivity of the uses they make of personal data based on the context in which it was collected. Companies should offer consumers clear and simple choices, presented at times and in ways that enable consumers to make meaningful decisions about personal data collection, use, and disclosure. Companies should offer consumers means to withdraw or limit consent that are as accessible and easily used as the methods for granting consent in the first place. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 11.
  • Companies that collect and use consumer data should provide easy-to-use choice mechanisms that allow consumers to control whether their data is collected and how it is used. To ensure that choice is most effective, the FTC recommends that a company should provide the choice mechanism at a time and in a context that is relevant to consumers – generally at the point the company collects the consumer's information. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 35.
  • A school or district should maintain ownership of a student's data. If a school shares personal information with an outside third-party performing institutional functions or services, then the outside party must remain under the direct control of the agency or institution with respect to the use and maintenance of education records. See PTAC, Protecting Student Privacy While Using Online Educational Services: Model Terms of Service; 34 C.F.R. § 99.31(a)(1)(i)(B)(2).

5.1: User Content

5.1.1: User Submission

  • Do the policies clearly indicate whether or not a user can create or upload content to the service?

5.1.2: Content Control

  • Do the policies clearly indicate whether or not a user's content is stored with the vendor or a third party?
Citation:

5.2: Data Consent

5.2.1: Opt-In Consent

  • Do the policies clearly indicate whether or not the vendor requests opt-in consent from a user at the time information is collected?
Citation:
Background:

5.3: Data Restriction

5.3.1: Restriction Notice

  • Do the policies clearly indicate whether or not the vendor is able to restrict or remove a user's content without notice or consent?
Citation:
  • The Communications Decency Act of 1996: (No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or any action taken to enable or make available to information content providers or others the technical means to restrict access to material) See The Communications Decency Act of 1996 (CDA), 47 U.S.C. 230(c)
  • Digital Millennium Copyright Act: (The provider of a service or application that has removed or disabled access to material or activity claimed to be infringing must take reasonable steps to promptly notify the subscriber that it has removed or disabled access to their material) See Digital Millennium Copyright Act (DMCA), 17 U.S.C. § 512(g)(2)(A)

5.4: Data Settings

5.4.1: User Control

  • Do the policies clearly indicate whether or not a user can control the vendor's or a third party's use of their information through privacy settings?
Background:
  • While notice and consent remains fundamental in many contexts, it is important to examine whether a greater focus on how data is used and reused would be a more productive basis for managing privacy rights in a big data environment. It may be that creating mechanisms for individuals to participate in the use and distribution of his or her information after it is collected is actually a better and more empowering way to allow people to access the benefits that derive from their information. Privacy protections must also evolve in a way that accommodates the social good that can come of big data use. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 61.

5.5: Data Disclosure

5.5.1: Disclosure Consent

  • Do the policies clearly indicate whether or not a user can provide consent or opt out of the disclosure of their data to a third party?
Citation:
  • California "Shine the Light": (California's "Shine the Light" law refers to information sharing disclosure requirements for companies that do business with California residents to allow customers to opt-out of information sharing, or make a detailed disclosure of how personal information was shared for direct marketing purposes) See Information Sharing Disclosure, Cal. Civ. Code §§1798.83-1798.84

5.5.2: Disclosure Request

  • Do the policies clearly indicate whether or not a user can request the vendor to disclose all the personal information or records collected about them or shared with third parties?
Citation:

5.5.3: Disclosure Notice

  • Do the policies clearly indicate whether or not, in the event the vendor discloses information in response to a government or legal request, the vendor will contact the affected user, school, parent, or student with notice of the request, so they may choose to seek a protective order or other legal remedy?
Citation:
  • Family Educational Rights and Privacy Act: (An educational agency or institution may disclose information for lawful reasons if they make a reasonable effort to notify the parent or eligible student of the order or subpoena in advance of compliance, so that the parent or eligible student may seek protective action) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.31(a)(9)(ii)
  • California Electronic Communications Privacy Act: (Prohibits a government entity from compelling the production of or access to electronic communication information or electronic device information without a search warrant, wiretap order, order for electronic reader records, or subpoena issued under specified conditions, except in emergency situations) See California Electronic Communications Privacy Act, Cal. Pen. Code §§ 1546-1546.4

5.6: Data Rights

5.6.1: Data Ownership

  • Do the policies clearly indicate whether or not a student, educator, parent, or the school retains ownership of the Intellectual Property rights in the data collected or uploaded to the application or service?
Citation:
Background:

5.7: Data License

  • Do the policies clearly indicate whether or not the vendor may claim a copyright license to the data or content collected from a user?
  • Do the policies clearly indicate whether or not the vendor may limit its copyright license of a user's data?
  • Do the policies clearly indicate whether or not the vendor provides notice to a user if their content is removed or disabled because of a claim it violates the Intellectual Property rights of others?
Citation:
  • Digital Millennium Copyright Act: (The provider of a service or application that has removed or disabled access to material or activity claimed to be infringing must take reasonable steps to promptly notify the subscriber that it has removed or disabled access to their material) See Digital Millennium Copyright Act (DMCA), 17 U.S.C. § 512(g)(2)(A)
Background:
  • The Digital Millennium Copyright Act (DMCA) establishes procedures for proper notification and rules to take down a user's content that violates the copyrights of others. Under the notice and takedown procedure, a copyright owner submits a notification under penalty of perjury, including a list of specified elements to the service provider's designated agent. If, upon receiving a proper notification, the service provider promptly removes or blocks access to the material identified in the notification, the provider is exempt from liability. However, the service provider is required to provide adequate notice to the affected user, who then has the ability to respond to the notice and takedown by filing a counter notification. See U.S. Copyright Office Summary, The Digital Millennium Copyright Act (DMCA), p. 12; See also 17 U.S.C. § 512(c)(3); 17 U.S.C. § 512(g)(1).

6: Access and Accuracy

Background:
  • Access and Accuracy: Consumers have a right to access and correct personal data in usable formats, in a manner that is appropriate to the sensitivity of the data and the risk of adverse consequences to consumers if the data is inaccurate. Companies should use reasonable measures to ensure they maintain accurate personal data. Companies also should provide consumers with reasonable access to personal data that they collect or maintain about them, as well as the appropriate means and opportunity to correct inaccurate data or request its deletion or use limitation. Companies that handle personal data should construe this principle in a manner consistent with freedom of expression and freedom of the press. In determining what measures they may use to maintain accuracy and to provide access, correction, deletion, or suppression capabilities to consumers, companies may also consider the scale, scope, and sensitivity of the personal data that they collect or maintain and the likelihood that its use may expose consumers to financial, physical, or other material harm. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 19.
  • Providing a student with the ability to access and export data from a website or service has the potential to allow a student to interact directly with and derive more benefit from the data collected within an application. Additionally, a robust data export feature would support content audits over time. For organizations that are concerned about parents and students losing the ability to move their data when they need to, and use it as they wish, data portability addresses many of these issues by empowering students with direct access to their data. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 67-71.
  • FERPA does not provide any specific requirements for educational agencies and institutions regarding disposition or destruction of the data they collect or maintain themselves, other than requiring them to safeguard FERPA-protected data from unauthorized disclosure, and not to destroy any education records if there is an outstanding request to inspect or review them. When educational agencies and institutions disclose (or "share") PII from education records with third parties under an applicable exception to FERPA's written consent requirement, however, additional legal requirements regarding destruction of that PII may apply. See PTAC, Best Practices for Data Destruction, p. 2.
  • The Family Educational Rights and Privacy Act of 1974 (FERPA), provides parents of students the right to access their children's Student Data or education records, and Students 18 years of age and older the right to access their own education records. In addition, FERPA provides the right to have the records amended, and the right to have some control over the disclosure of personally identifiable information (PII) in the education records. Furthermore, strict storage guidelines surround Student Data which require organizations to maintain accurate, and up-to-date records. See 20 U.S.C. § 1232g; 34 C.F.R. Part 99.1.
  • United States Constitutional law has long recognized that privacy interests co-exist alongside fundamental First Amendment rights to freedom of speech, freedom of the press, and freedom of association. Individuals and members of the press exercising their free speech rights may well speak about other individuals and include personal information in their speech. A company's privacy policy should be interpreted with full respect for First Amendment values, especially for non-commercial speakers and individuals exercising freedom of the press, balancing those values against the privacy interests of the individual seeking to restrict access to that speech. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 20.

6.1: Data Access

6.1.1: Access Data

  • Do the policies clearly indicate whether or not the vendor provides a method to access and review a user's personal information for authorized individuals?
Citation:

6.1.2: Restrict Access

  • Do the policies clearly indicate whether or not the vendor may restrict access for unauthorized individuals to a user's data?
Citation:
Background:

6.1.3: Review Data

  • Do the policies clearly indicate whether or not there is a process available for the school, parents, or eligible students to review student information?
Citation:
Background:

6.2: Data Integrity

6.2.1: Maintain Accuracy

  • Do the policies clearly indicate whether or not the vendor takes steps to maintain the accuracy of data they collect and store?
Citation:

6.3: Data Correction

6.3.1: Data Modification

  • Do the policies clearly indicate whether or not the vendor provides the ability to modify a user's inaccurate data for authorized individuals?
Citation:

6.3.2: Modification Process

  • Do the policies clearly indicate whether or not there is a process for the school, parents, or eligible students to modify inaccurate student information?
Citation:
Background:

6.3.3: Modification Request

  • Do the policies clearly indicate whether or not the school, parents, or eligible students may submit a request to the vendor to modify a student's inaccurate personal information?
Citation:
Background:

6.3.4: Modification Notice

  • Do the policies clearly indicate how long the vendor has to modify a user's inaccurate data after given notice?

6.4: Data Retention

6.4.1: Retention Policy

  • Do the policies clearly indicate the vendor's data retention policy, including any data sunsets or time period after which a user's data will be automatically deleted if they are inactive on the application or service?
Background:

6.4.2: Retention Limits

  • Do the policies clearly indicate whether or not the vendor will limit the retention of a user's data unless a valid request to inspect data is made?
Citation:

6.5: Data Deletion

6.5.1: Deletion Purpose

  • Do the policies clearly indicate whether or not the vendor will delete a user's personal information when the data are no longer necessary to complete the purpose for which it was collected?
Citation:
  • Children's Online Privacy Protection Act: (An operator may retain information collected from a child only as long as necessary to fulfill the purpose for which it was collected and must delete the information using reasonable measures to prevent unauthorized use) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.10
  • California AB 1584 - Privacy of Pupil Records: (A local educational agency that enters into a contract with a third party must ensure the contract contains a certification that a pupil's records shall not be retained or available to the third party upon completion of the terms of the contract and a description of how that certification will be enforced) See California AB 1584 - Privacy of Pupil Records, Cal. Ed. Code § 49073.1(b)(7)

6.5.2: Account Deletion

  • Do the policies clearly indicate whether or not a user's data are deleted upon account cancellation or termination?
Citation:

6.5.3: User Deletion

  • Do the policies clearly indicate whether or not a user can delete all of their personal and non-personal information from the vendor?
Citation:
  • California Online Privacy Protection Act: (If the operator maintains a process for a consumer to review and request changes to any of their personally identifiable information they must provide a description of that process) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(2)
  • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582

6.5.4: Deletion Process

  • Do the policies clearly indicate whether or not there is a process for the school, parent, or eligible student to delete a student's personal information?
Citation:
Background:

6.5.5: Deletion Notice

  • Do the policies clearly indicate how long the vendor may take to delete a user's data after given notice?

6.6: Data Portability

6.6.1: User Export

  • Do the policies clearly indicate whether or not a user can export or download their data, including any user-created content on the application or service?
Citation:
Background:

6.6.2: Legacy Contact

  • Do the policies clearly indicate whether or not a user may assign an authorized account manager or legacy contact to access and download their data in the event the account becomes inactive?
Citation:

7: Data Transfer

Background:
  • Data Transfer: Companies should disclose their privacy practices of data ownership, notice, and choice to a user before onward transfer of personal information assets to a third-party occurs. Transfer of a user's personal information should only be permitted where the third-party recipient provides the same level of privacy protection for the data. A company transferring user data should clearly indicate in their policies how they handle data transfer during a potential bankruptcy, merger, or acquisition.
  • When a company goes out of business, a user's data can often be sold as an asset to another company. Policies that allow a user to delete their data in the event of a bankruptcy, or that clearly indicate a user's data will not be sold as part of any bankruptcy proceedings provide a higher level of privacy protection. Additionally, the method by which a user would be notified if their data will be transferred should be clearly identified in a company's policies. See The Bankruptcy Abuse Prevention and Consumer Protection Act of 2005 (BAPCPA), 11 U.S.C. §363.
  • When two companies merge, a user's data that was collected and protected under one company's privacy policy can become subject to a different set of privacy policies and legal terms. If this happens, it is important that the user be informed either before, during, or after the transfer occurs. A user should be notified of any data transfer, whether the notification occurs before or after their data is transferred, and regardless of whether the user can delete their data or opt out of the data transfer process.
  • While FERPA does not specify that education records shared under some of its exceptions must be returned or destroyed at the end of the contract, it is a best practice to require this. Data return or destruction helps limit the amount of personal information available to third-parties and prevents improper disclosure. This provision also helps schools and districts maintain control over the appropriate use and maintenance of FERPA protected student information. See PTAC, Protecting Student Privacy While Using Online Educational Services: Model Terms of Service, p. 6.

7.1: Data Handling

7.1.1: Transfer Data

  • Do the policies clearly indicate whether or not a user's data can be transferred by the vendor in the event of a merger, acquisition, or bankruptcy?
Citation:

7.1.2: Data Assignment

  • Do the policies clearly indicate whether or not the vendor can assign its rights or delegate its duties under the policies to a third party without notice or consent?

7.2.1: Transfer Notice

  • Do the policies clearly indicate whether or not a user will be notified and allowed to provide consent to a data transfer to a third-party successor, in the event of a vendor bankruptcy, merger, or acquisition?

7.3: Transfer Request

7.3.1: Delete Transfer

  • Do the policies clearly indicate whether or not a user can request to delete their data prior to its transfer to a third-party successor in the event of a vendor bankruptcy, merger, or acquisition?

7.4: Onward Contractual Obligations

7.4.1: Contractual Limits

  • Do the policies clearly indicate whether or not the third-party successor of a data transfer is contractually required to provide the same level of privacy protections as the vendor?
Citation:
Background:

8: Security

Background:

8.1: User Identity

8.1.1: Verify Identity

  • Do the policies clearly indicate whether or not a user's identity is verified with personal information collected by the vendor or third party?
Citation:

8.2: User Account

8.2.1: Account Required

  • Do the policies clearly indicate whether or not the vendor requires a user to create an account with a username and password in order to use the Service?

8.2.2: Managed Account

  • Do the policies clearly indicate whether or not the vendor provides user managed accounts for a parent, teacher, school or district?

8.2.3: Two-Factor Protection

  • Do the policies clearly indicate whether or not the security of a user's account is protected by two-factor authentication?
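As background on what this protection typically involves: a common second factor is a time-based one-time password (TOTP, RFC 6238), in which the server and the user's device derive a short code from a shared secret and the current time step. A minimal standard-library sketch of the mechanism (not any particular vendor's implementation):

```python
import base64
import hashlib
import hmac
import struct

def totp(secret_b32: str, timestamp: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", timestamp // step)       # 8-byte big-endian time step
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

For the RFC 6238 test secret (`GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ`, the base32 encoding of the ASCII string "12345678901234567890") at Unix time 59, this yields the published eight-digit test value 94287082.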

8.3: Third-party Security

8.3.1: Security Agreement

  • Do the policies clearly indicate whether or not a third party with access to a user's information is contractually required to provide the same level of security protections as the vendor?
Citation:
Background:

8.4: Data Confidentiality

8.4.1: Reasonable Security

  • Do the policies clearly indicate whether or not reasonable security standards are used to protect the confidentiality of a user's personal information?
Citation:
Background:

8.4.2: Employee Access

  • Do the policies clearly indicate whether or not the vendor implements physical access controls or limits employee access to user information?
Citation:
  • California AB 1584 - Privacy of Pupil Records: (A local educational agency that enters into a contract with a third party must ensure the contract contains a description of the actions the third party will take, including the designation and training of responsible individuals, to ensure the security and confidentiality of pupil records) See California AB 1584 - Privacy of Pupil Records, Cal. Ed. Code § 49073.1(b)(5)
Background:

8.5: Data Transmission

8.5.1: Transit Encryption

  • Do the policies clearly indicate whether or not all data in transit is encrypted?
Citation:
  • California Data Breach Notification Requirements: (A person or business that owns, licenses, or maintains personal information about a California resident is required to implement and maintain reasonable security procedures and practices appropriate to the nature of the information, and to protect the personal information from unauthorized access, destruction, use, modification, or disclosure) See California Data Breach Notification Requirements, Cal. Civ. Code § 1798.81.5
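One way an evaluator might spot-check the "forces encryption" aspect of transit security is to verify that a plain-HTTP request is redirected to HTTPS on the same host. A minimal sketch, assuming the evaluator has already captured the `Location` header of the redirect response; `example.com` is a placeholder, not a vendor from this evaluation:

```python
from urllib.parse import urlparse

def upgrades_to_https(requested_url: str, location_header: str) -> bool:
    """True if a redirect sends an http:// request to https:// on the same host."""
    orig = urlparse(requested_url)
    dest = urlparse(location_header)
    return (orig.scheme == "http"
            and dest.scheme == "https"
            and dest.hostname == orig.hostname)

# A 301/302 response with "Location: https://example.com/login" passes the check:
print(upgrades_to_https("http://example.com/login", "https://example.com/login"))  # True
# Staying on plain HTTP fails it:
print(upgrades_to_https("http://example.com/login", "http://example.com/login"))   # False
```

Comparing hostnames guards against counting a redirect to a different domain as an encryption upgrade.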

8.6: Data Storage

8.6.1: Storage Encryption

  • Do the policies clearly indicate whether or not all data at rest is encrypted?
Citation:
  • California Data Breach Notification Requirements: (A person or business that owns, licenses, or maintains personal information about a California resident is required to implement and maintain reasonable security procedures and practices appropriate to the nature of the information, and to protect the personal information from unauthorized access, destruction, use, modification, or disclosure) See California Data Breach Notification Requirements, Cal. Civ. Code § 1798.81.5

8.6.2: Data Control

  • Do the policies clearly indicate whether or not personal information is stored outside the direct control of the vendor?

8.7: Data Breach

8.7.1: Breach Notice

  • Do the policies clearly indicate whether or not the vendor provides notice in the event of a data breach or unauthorized disclosure of a user's information?
Citation:
Background:
  • The breach notification laws in California and the 46 other states are similar in many ways, because most are modeled on the original California law. All of them require notifying individuals when their personal information has been breached, prefer written notification but allow using the "substitute method" in certain situations, allow for a law enforcement delay, and provide an exemption from the requirement to notify when data is encrypted and the keys required to decrypt the data are still secure. However, there are some differences, primarily in three areas: (1) the notification trigger, (2) the timing for notification, and (3) the definition of covered information. See CA DOJ, California Data Breach Report (2016).
  • A vendor should develop and describe the process for notifying schools or school districts, parents, legal guardians, or eligible students, as well as any appropriate government agencies, of any unauthorized disclosure of student information. Determine whether the incident and the types of data involved also require notification under California's breach notification law, and if so, take appropriate action. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 15.
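The encryption safe harbor described above lends itself to a compact decision rule. The sketch below is a hypothetical illustration (the function and parameter names are assumptions, and actual notification triggers vary by state); it captures only the common rule that individual notice is excused when breached data was encrypted and the decryption keys remain secure:

```python
def breach_notice_required(data_encrypted: bool, keys_secure: bool) -> bool:
    """Return True if individual breach notification is likely required.

    Mirrors the common statutory safe harbor: no notice is required
    only when the breached data was encrypted AND the keys needed to
    decrypt it remain secure. (A rule-of-thumb sketch, not legal advice;
    triggers, timing, and covered information differ by state.)
    """
    return not (data_encrypted and keys_secure)

# Encrypted data with secure keys falls under the safe harbor.
assert breach_notice_required(data_encrypted=True, keys_secure=True) is False
# Encrypted data with compromised keys still triggers notice.
assert breach_notice_required(data_encrypted=True, keys_secure=False) is True
# Unencrypted data always triggers notice.
assert breach_notice_required(data_encrypted=False, keys_secure=True) is True
```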

9: Responsible Use

Background:
  • Responsible Use: Companies should address appropriate levels of communication, sharing, and visibility between students and adults, and describe how they handle issues related to cyberbullying and reporting inappropriate content. In addition, companies should address the rights of various stakeholders to audit and review the social interactions between users.
  • In order to ensure students, citizens, and consumers of all ages have the ability to adequately protect themselves from data use and abuse, it is important that they develop fluency in understanding the ways in which data can be collected and shared, how algorithms are employed and for what purposes, and what tools and techniques they can use to protect themselves. Although such skills will never replace regulatory protections, increased digital literacy will better prepare individuals to live in a world saturated by data. Digital literacy—understanding how personal data is collected, shared, and used— should be recognized as an essential skill in K-12 education and be integrated into the standard curriculum. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 64.
  • The Children's Internet Protection Act (CIPA) requires that schools who receive federal funding have in place an Internet safety policy that addresses the safety and security of minors when using forms of direct electronic communication such as e-mail and chat rooms. Schools are also required to have in place measures designed to restrict access to materials that are age-restricted and potentially harmful to minors. In addition, any communication of personal data in a public forum or chat room by a child under 13 falls within the definition of a "disclosure" under COPPA. Therefore, such a disclosure could constitute an unauthorized disclosure if parental consent was not obtained beforehand. See CA DOJ, Staying Private in Public: How to Limit Your Exposure on Social Network Sites.
  • COPPA prohibits an operator from conditioning a child's participation in a game, the offering of a prize, or another activity on the child's disclosing more personal information than is reasonably necessary to participate in the activity. See 15 U.S.C. §§ 6501-6506; 16 C.F.R. Part 312.

9.1: Social Interactions

9.1.1: Safe Interactions

  • Do the policies clearly indicate whether or not a user can interact with other users, or students can interact with other students in the same classroom or school?
Citation:

9.1.2: Unsafe Interactions

  • Do the policies clearly indicate whether or not a user can interact with strangers, including adults?
Citation:

9.1.3: Share Profile

  • Do the policies clearly indicate whether or not information must be shared or revealed by a user in order to participate in social interactions?
Citation:

9.2: Data Visibility

9.2.1: Visible Data

  • Do the policies clearly indicate whether or not a user's personal information can be displayed publicly in any way?
Citation:

9.2.2: Profile Visibility

  • Do the policies clearly indicate whether or not a user's personal information can be displayed publicly, outside the context of social interactions?
Citation:

9.2.3: Control Visibility

  • Do the policies clearly indicate whether or not a user has control over how their personal information is displayed to others?

9.3: Report Content

9.3.1: Block Content

  • Do the policies clearly indicate whether or not an educator, parent, or a school has the ability to filter or block inappropriate content, or social interactions with unauthorized individuals?
Citation:
  • Children's Internet Protection Act: (A K-12 school under E-Rate discounts is required to adopt a policy of Internet safety for minors that includes monitoring the online activities of minors and the operation of a technology protection measure with respect to any of its computers with Internet access that protects against access to visual depictions that are obscene, child pornography, or harmful to minors) See Children's Internet Protection Act (CIPA), 47 U.S.C. § 254(h)(5)(B)
  • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
  • The Communications Decency Act of 1996: (A provider of an interactive computer service shall notify the customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors) See The Communications Decency Act of 1996 (CDA), 47 U.S.C. 230(d)

9.3.2: Report Abuse

  • Do the policies clearly indicate whether or not a user can report abuse or cyber-bullying?

9.4: Monitor and Review

9.4.1: Monitor Content

  • Do the policies clearly indicate whether or not user content is reviewed, screened, or monitored by the vendor?

9.4.2: Filter Content

  • Do the policies clearly indicate whether or not the vendor takes reasonable measures to delete all personal information from a user's postings before they are made publicly visible?
Citation:
  • Children's Online Privacy Protection Act: (An operator may prevent collection of personal information if it takes reasonable measures to delete all or virtually all personal information from a child's postings before they are made public and also to delete the information from its records) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
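As an illustration of the kind of "reasonable measures" COPPA contemplates, the sketch below strips email addresses and phone numbers from a posting before it is made public. It is a hedged, minimal example: the `redact_personal_info` helper and its patterns are hypothetical and far from exhaustive (a production filter would also need to handle names, addresses, and other identifiers):

```python
import re

# Hypothetical patterns for illustration only; real filters need far
# broader coverage (names, street addresses, usernames, etc.).
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def redact_personal_info(post: str) -> str:
    """Remove email addresses and phone numbers before a posting goes public."""
    post = EMAIL.sub("[removed]", post)
    post = PHONE.sub("[removed]", post)
    return post

assert (redact_personal_info("Contact me at kid@example.com or 555-123-4567!")
        == "Contact me at [removed] or [removed]!")
```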

9.4.3: Moderate Interactions

  • Do the policies clearly indicate whether or not social interactions between users on the website or application are moderated?
Citation:
  • Children's Online Privacy Protection Act: (An operator may prevent collection of personal information if it takes reasonable measures to delete all or virtually all personal information from a child's postings before they are made public and also to delete the information from its records) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
Background:

9.4.4: Log Interactions

  • Do the policies clearly indicate whether or not social interactions are logged by the vendor?

9.4.5: School Audit

  • Do the policies clearly indicate whether or not social interactions may be audited by a school or district?

9.4.6: Parent Audit

  • Do the policies clearly indicate whether or not social interactions may be audited by a parent or guardian?

9.4.7: User Audit

  • Do the policies clearly indicate whether or not social interactions may be audited by a user or eligible student?

9.5: Internet Safety

9.5.1: Safe Tools

  • Do the policies clearly indicate whether or not tools and processes that support safe and appropriate social interactions on the application or service are provided by the vendor?
Citation:
  • Children's Internet Protection Act: (A K-12 school under E-Rate discounts is required to adopt a policy of Internet safety for minors that includes monitoring the online activities of minors and the operation of a technology protection measure with respect to any of its computers with Internet access that protects against access to visual depictions that are obscene, child pornography, or harmful to minors) See Children's Internet Protection Act (CIPA), 47 U.S.C. § 254(h)(5)(B)
  • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
  • The Communications Decency Act of 1996: (A provider of an interactive computer service shall notify the customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors) See The Communications Decency Act of 1996 (CDA), 47 U.S.C. 230(d)

10: Advertising

Background:
  • Advertising: Companies should address when and where they provide advertising and whether they engage in traditional or targeted advertising practices. In addition, a company should define the role of third parties in serving advertisements to different audiences that include students, parents, teachers, or the school and the compliance issues implicated as a result. Companies should also address how they collect advertising data, display advertising content, and how they market their products and services based on demographic information.
  • The FTC maintains the view that affiliates are third-parties, and a consumer choice mechanism is necessary unless the affiliate relationship is clear to consumers. However, where an affiliate relationship is hidden – such as between an online publisher that provides content to consumers through its website and an ad network that invisibly tracks consumers' activities on the site – marketing from the affiliate would not be consistent with a transaction on, or the consumer's relationship with, that website. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 42.
  • While data mining or scanning may sometimes be a necessary component of online services (e.g., for malware/spam detection or personalization tools), schools and districts should prohibit any mining or scanning for targeted advertising directed to students or their parents. Such provisions could lead to a violation of COPPA, FERPA, or the PPRA. See PTAC, Protecting Student Privacy While Using Online Educational Services: Model Terms of Service, p. 5.
  • The Protection of Pupil Rights Amendment (PPRA) provides parents with certain rights with regard to marketing activities in schools. Specifically, the PPRA requires that a school district must, with exceptions, directly notify parents of students who are scheduled to participate in activities involving the collection, disclosure, or use of personal information collected from students for marketing purposes, or to sell or otherwise provide that information to others for marketing purposes, and to give parents the opportunity to opt-out of these activities. See 20 U.S.C. § 1232h(c)(2)(C)(i); See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, p. 6.
  • The PPRA also requires districts to develop and adopt policies, in consultation with parents, about marketing activities. PPRA has an important exception, however, as neither parental notice, the opportunity to opt out, nor the development and adoption of policies is required for school districts to use a student's personal information for the exclusive purpose of developing, evaluating, or providing educational products or services for students or schools. See 20 U.S.C. § 1232h(c)(1)(E) and (c)(4)(A); See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, p. 6.
  • It is important to remember that even though PPRA only applies to K-12 institutions, there is no time-limit on the limitations governing the use of personal information collected from students for marketing purposes. So, for example, while PPRA would not limit the use of information collected from college students for marketing, it would restrict the use of information collected from students while they were still in high school (if no notice or opportunity to opt-out was provided) even after those students graduate. See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, p. 6.

10.1: Vendor Communications

10.1.1: Service Messages

  • Do the policies clearly indicate whether or not a user will receive service-related or administrative email or text message communications from the vendor or a third party?

10.2: Traditional Advertising

10.2.1: Traditional Ads

  • Do the policies clearly indicate whether or not traditional advertisements are displayed to a user based on webpage content, but not a user's data?
Background:

10.3: Behavioral Advertising

10.3.1: Behavioral Ads

  • Do the policies clearly indicate whether or not behavioral or contextual advertising based on a student's personal information is displayed?
Citation:
Background:
  • Online behavioral or targeted advertising is the practice of collecting information about consumers' online interests in order to deliver targeted advertising to them. This system of advertising revolves around ad networks that can track individual consumers—or at least their devices—across different websites. When organized according to unique identifiers, this data can provide a potentially wide-ranging view of individual use of the Internet. These individual behavioral profiles allow advertisers to target ads based on inferences about individual interests, as revealed by Internet use. Targeted ads are generally more valuable and efficient than purely contextual ads and provide revenue that supports an array of free online content and services. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), pp. 11-12.
  • The FTC recommends that affirmative express consent is appropriate when a company uses sensitive data for any marketing, whether first or third-party. When health or children's information is involved, for example, the likelihood that data misuse could lead to embarrassment, discrimination, or other harms is increased. This risk exists regardless of whether the entity collecting and using the data is a first-party or a third-party that is unknown to the consumer. In light of the heightened privacy risks associated with sensitive data, first parties should provide a consumer choice mechanism at the time of data collection. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 47.
  • The FTC believes affirmative express consent for first-party marketing using sensitive data should be limited. Certainly, where a company's business model is designed to target consumers based on sensitive data – including data about children, financial and health information, Social Security numbers, and certain geolocation data – the company should seek affirmative express consent before collecting the data from those consumers. On the other hand, the risks to consumers may not justify the potential burdens on general audience businesses that incidentally collect and use sensitive information. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 47-48.
  • If a vendor displays targeted advertising they should not use any information, including covered information and persistent unique identifiers, acquired through the site or service as a basis for targeting advertising to a specific student or other user. This includes both advertising delivered on the site or service that acquired the information and advertising delivered on any other site or service based on that information. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 12.

10.4: Ad Tracking

10.4.1: Third-Party Tracking

  • Do the policies clearly indicate whether or not third-party advertising services or tracking technologies collect any information from a user of the application or service?
Citation:

10.4.2: Track Users

  • Do the policies clearly indicate whether or not a user's information is used to track and target advertisements on other third-party websites or services?
Citation:
  • Children's Online Privacy Protection Act: (An operator is prohibited from sharing a persistent identifier collected from children that can be used to recognize and track a user over time and across different websites or services without verifiable parental consent) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Student Online Personal Information Protection Act: (An operator is prohibited from tracking a student across websites with targeted advertising) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(1)(B)
  • California Online Privacy Protection Act: (An operator is required to disclose whether other third parties may collect personally identifiable information about a consumer's online activities over time and across different Web sites) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(6)
  • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.1
  • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
Background:
  • The FTC recommends that where a company has a first-party relationship with a consumer for delivery of a specific service, but also tracks the consumer's activities across other parties' websites, such tracking is unlikely to be consistent with the context of the consumer's first-party relationship with the entity. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 41.
  • The FTC agrees that the definition of first-party marketing should include the practice of contacting consumers across different channels. Regardless of the particular means of contact, receipt of a message from a company with which a consumer has interacted directly is likely to be consistent with the consumer's relationship with that company. If an offline or online retailer tracks a customer's activities on a third-party website, this is unlikely to be consistent with the customer's relationship with the retailer; thus, choice should be required. See FTC 2012, p. 42; See also FTC Staff Report, Self-Regulatory Principles For Online Behavioral Advertising, pp. 26-28.

10.4.3: Ad Profile

  • Do the policies clearly indicate whether or not the vendor allows third parties to use a student's data to create a profile, engage in data enhancement, social advertising, or target advertising to students, parents, teachers, or the school?
Citation:
Background:

10.5: Filtered Advertising

10.5.1: Child Ads

  • Do the policies clearly indicate whether or not advertisements are displayed to children under 13 years of age?
Citation:
Background:
  • The FTC restricts advertisements that may be misleading to children. Advertising to children under the age of 13 is particularly scrutinized, as research shows that these children are especially vulnerable because they are unable to understand an advertisement's persuasive intent. Self-regulatory guidelines are also published by the Children's Advertising Review Unit, which is a branch of the advertising self-regulatory program administered by the Council of Better Business Bureaus. The guidelines generally provide that any advertising to young children should be clearly distinguishable from the other content. See Children's Advertising Review Unit, Self-Regulatory Program for Children's Advertising.
  • Third-party data brokers should implement better measures to refrain from collecting information from children and teens, particularly in marketing products. As to children under 13, COPPA already requires certain online services to refrain from collecting personal information from this age group without parental consent. The principles underlying COPPA could apply equally to information collected offline from children. As to teens, the FTC previously has noted that they often lack the judgment to appreciate the long-term consequences of, for example, posting personal information on the Internet. See FTC, Data Brokers: A Call For Transparency and Accountability (May 2014), p. 55.

10.5.2: Filter Ads

  • Do the policies clearly indicate whether or not advertisements that are age-inappropriate for minors are filtered (e.g., alcohol, gambling, violent, or sexual content)?
Citation:
  • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
  • Children's Internet Protection Act: (A K-12 school under E-Rate discounts is required to adopt a policy of Internet safety for minors that includes monitoring the online activities of minors and the operation of a technology protection measure with respect to any of its computers with Internet access that protects against access to visual depictions that are obscene, child pornography, or harmful to minors) See Children's Internet Protection Act (CIPA), 47 U.S.C. § 254(h)(5)(B)
Background:
  • Advertising to children in school presents a variety of legal issues. Several states have laws that place restrictions on the advertising of products or services that have inappropriate content for children, such as alcohol and firearms. Additionally, contextual advertising would likely be permissible as "support for internal operations" of a service or application, in contrast to behaviorally targeted advertising, which implicates several privacy laws such as CIPA, COPPA, and FERPA that restrict the use of personal information without parental consent.

10.6: Marketing Communications

10.6.1: Marketing Messages

  • Do the policies clearly indicate whether or not the vendor may send marketing emails, text messages, or other related communications that may be of interest to a user?
Citation:

10.6.2: Third-Party Promotions

  • Do the policies clearly indicate whether or not the vendor allows a user to participate in any sweepstakes, contests, surveys, or other similar promotions?
Citation:

10.7: Unsubscribe

10.7.1: Unsubscribe Ads

  • Do the policies clearly indicate whether or not a user can opt out of traditional, contextual, or behavioral advertising?
Citation:

10.7.2: Unsubscribe Marketing

  • Do the policies clearly indicate whether or not a user can opt out of or unsubscribe from vendor or third-party marketing communications?
Citation:
  • Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003: (The sender of a commercial electronic communication may not require that any recipient pay any fee, provide any information other than the recipient's electronic mail address and opt-out preferences, or take any other steps except sending a reply electronic mail message or visiting a single Internet Web page, in order to submit a request not to receive future commercial electronic mail messages from the sender) See Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003 (CAN-SPAM), 16 C.F.R. Part 316.5

10.8: Do Not Track

10.8.1: DoNotTrack Response

  • Do the policies clearly indicate whether or not the vendor responds to a "Do Not Track" signal or other opt-out mechanisms from a user?
Citation:
  • California Online Privacy Protection Act: (An operator is required to disclose how they respond to Web browser "Do Not Track" signals or other mechanisms that provide consumers the ability to opt-out of the collection of personally identifiable information about their online activities over time and across third-party Web sites) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(5)
Background:
  • A Do Not Track system should be implemented universally to cover all parties that would track consumers. The choice mechanism should be easy to find, easy to understand, and easy to use. Any choices offered should be persistent and should not be overridden if, for example, consumers clear their cookies or update their browsers. A Do Not Track system should be comprehensive, effective, and enforceable. It should opt consumers out of behavioral tracking through any means and not permit technical loopholes. Finally, an effective Do Not Track system should go beyond simply opting consumers out of receiving targeted advertisements; it should opt them out of collection of behavioral data for all purposes other than those that would be consistent with the context of the interaction (e.g., preventing click-fraud or collecting de-identified data for analytics purposes). See California Business and Professions Code §§ 22575(b)(5)-(6); See also FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 53.
  • Even as we focus more on data use, consumers still have a valid interest in "Do Not Track" tools that help them control when and how their data is collected. Strengthening these tools is especially important because there is now a growing array of technologies available for recording individual actions, behavior, and location data across a range of services and devices. Public surveys indicate a clear and overwhelming demand for these tools, and the government and private sector must continue working to evolve privacy-enhancing technologies in step with improved consumer services. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 62.
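Mechanically, honoring a "Do Not Track" signal is straightforward: the browser sends a `DNT: 1` request header, and the server adjusts what it collects. The sketch below is a hypothetical server-side check (the `honors_do_not_track` and `choose_tracking` helpers and the policy labels are assumptions for illustration, not a complete implementation; in WSGI the header arrives as `environ["HTTP_DNT"]`):

```python
def honors_do_not_track(headers: dict) -> bool:
    """Return True when the browser sent the DNT: 1 request header."""
    return headers.get("DNT") == "1"

def choose_tracking(headers: dict) -> str:
    """Pick a data-collection mode for this request (illustrative labels).

    A CalOPPA-conscious operator would also disclose in its privacy
    policy how it responds to this signal.
    """
    if honors_do_not_track(headers):
        # Limit collection to context-consistent purposes (e.g., analytics).
        return "analytics-only"
    return "full"

assert choose_tracking({"DNT": "1"}) == "analytics-only"
assert choose_tracking({}) == "full"
```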

10.8.2: DoNotTrack Description

  • Do the policies clearly indicate whether the vendor provides a hyperlink to a description, including the effects, of any program or protocol the vendor follows that offers consumers a choice not to be tracked?
Citation:

11: Compliance

Background:
  • Compliance: Consumers have a right to have personal data handled by companies with appropriate measures that follow Fair Information Practice Principles (FIPPs) and are in compliance with FERPA, COPPA, and the PPRA. Companies should be accountable to enforcement authorities and consumers for adhering to these principles and federal laws. Companies also should hold employees responsible for adhering to these principles and should train their employees as appropriate to handle personal data consistently and regularly evaluate their performance in this regard. Where appropriate, companies should conduct both full internal audits and external audits of third-party affiliates. Companies that disclose personal data to third parties should at a minimum ensure that the recipients are under enforceable contractual obligations to adhere to these principles, unless they are required by law to do otherwise. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 21.
  • Companies that discover they are not in compliance with the rules under COPPA should take immediate remedial actions. First, until a company's website or online service comes into compliance, they must stop collecting, disclosing, or using personal information from children under age 13. Second, companies should carefully review their information practices and online privacy policy. In conducting internal and external audits, companies should look closely at what information they collect, how they collect it, how they use it, whether the information is necessary for the activities on the site or online service, whether they have adequate mechanisms for providing parents with notice and obtaining verifiable consent, whether they have adequate methods for parents to review and delete their children's information, and whether they employ adequate data security, retention, and deletion practices. A court can hold companies that violate the rules under COPPA liable for civil penalties of up to $16,000 per violation. The amount of civil penalties a court assesses may turn on a number of factors, including the egregiousness of the violations, whether the operator has previously violated COPPA, the number of children involved, the amount and type of personal information collected, how the information was used, whether it was shared with third parties, and the size of the company. See FTC, Complying with COPPA: Frequently Asked Questions.
  • Companies that discover they are not in compliance with the rules under FERPA should take immediate remedial actions. A parent of a student under the age of 18 at an elementary or secondary school or a student who is at least 18 years of age or attending a postsecondary institution at any age ("eligible student") may file a written complaint with the Family Policy Compliance Office (FPCO) regarding an alleged violation of a school's failure to comply with his or her rights under FERPA. A parent of an eligible student generally may not file a complaint under FERPA, as the rights afforded to parents are transferred to the student when he or she becomes an eligible student. The FERPA complaint resolution process is designed to identify problems with FERPA implementation in educational agencies and institutions, to ensure compliance with FERPA requirements, and to act to prevent future violations of FERPA. If a violation is substantiated, the FPCO may require specific corrective action (e.g., revise policy or procedures, or conduct training) to bring the educational agency or institution into compliance with FERPA requirements. When the educational agency or institution has completed the required corrective action, FPCO closes the complaint. See Department of Education (DOE), Family Policy Compliance Office, Filing a Complaint Under the Family Educational Rights and Privacy Act (FERPA).

11.1: Children under 13

11.1.1: Actual Knowledge

  • Do the policies clearly indicate whether or not the vendor has actual knowledge that personal information from children under 13 years of age is collected by the application or service?
Citation:
  • Children's Online Privacy Protection Act: (A general audience site is where the operator has no actual knowledge that a child under the age of 13 has registered an account or is using the service, and no age gate or parental consent is required before collection of information) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Children's Online Privacy Protection Act: (A mixed audience site is where the site is directed to children, but does not target children as its "primary audience," but rather teens 13-to-18 years of age or adults. An operator of a mixed audience site is required to obtain age information from a user before collecting any information and if a user identifies themselves as a child under the age of 13, the operator must obtain parental consent before any information is collected) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Children's Online Privacy Protection Act: (A site directed to children is where the operator has actual knowledge the site is collecting information from children under the age of 13 and parental consent is required before any collection or use of information) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Children's Online Privacy Protection Act: (A vendor that may obtain actual knowledge that it is collecting information from a child must not encourage a child to disclose more information than is reasonably necessary through an age verification mechanism. An age gate should: be age-neutral; not encourage falsification; list day, month, and year; give no prior warning that children under 13 will be blocked; and prevent multiple attempts) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.3(d)
Background:
  • The Children's Online Privacy Protection Act (COPPA) requires an operator to post a link to a notice of its information practices on the homepage of its web site or online service and in each area of its web site where it collects "Personal Information" from children. An operator of a general audience web site with a separate children's area must also post a link to its privacy policy on the homepage of the children's area. See 15 U.S.C. §§ 6501-6506; 16 C.F.R. Part 312
  • COPPA applies anytime an operator of a website or online service has actual knowledge that it collects, maintains, uses, or discloses personal information from a child under 13. In these situations an operator is generally required to obtain verifiable parental consent.
  • COPPA requires companies to establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children. Companies should minimize what they collect in the first place and take reasonable steps to release personal information only to service providers and third-parties capable of maintaining its confidentiality, security, and integrity. Always obtain assurances that third-parties will live up to their contractual privacy responsibilities. Also, companies should hold on to personal information only as long as is reasonably necessary for the purpose for which it was collected. They should securely dispose of it once they no longer have a legitimate reason for retaining it. See FTC, Six-Step Compliance Plan for Your Business.
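The age-gate requirements cited above (a neutral prompt for day, month, and year; no advance warning that under-13 users will be blocked; no repeated attempts) can be illustrated with a minimal sketch. The function names (`age_on`, `age_gate`), the session-flag retry mechanism, and the return values are illustrative assumptions, not terms from the statute or the FTC's rule:

```python
from datetime import date
from typing import Optional

COPPA_AGE = 13  # age below which COPPA requires verifiable parental consent

def age_on(birth: date, today: date) -> int:
    """Compute whole years of age as of `today`."""
    years = today.year - birth.year
    # Subtract one year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (birth.month, birth.day):
        years -= 1
    return years

def age_gate(birth: date, session: dict, today: Optional[date] = None) -> str:
    """Neutral age gate: the prompt asks only for day, month, and year,
    gives no prior warning that under-13 users will be gated, and records
    the result in the session so a retry with a different birth date is
    denied.  Returns 'allow', 'parental_consent_required', or 'blocked'.
    """
    today = today or date.today()
    if session.get("age_gate_failed"):
        return "blocked"  # prevent multiple attempts in the same session
    if age_on(birth, today) < COPPA_AGE:
        session["age_gate_failed"] = True
        return "parental_consent_required"
    return "allow"
```

In this sketch, a user who first enters an under-13 birth date cannot simply resubmit an older date in the same session, which is the "prevent multiple attempts" property the rule describes.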

11.1.2: Child Audience

  • Do the policies clearly indicate whether or not the application or service is directed to children under 13, or (even if for an older audience) would the service appeal to children under 13 years of age?
Citation:
  • Children's Online Privacy Protection Act: (An exception for a general audience site exists if the site would appeal to children under 13 years of age, which would take into account several factors that include subject matter, visual content, age of models, and activities provided) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Children's Online Privacy Protection Act: (A mixed audience site is where the site is directed to children, but does not target children as its "primary audience," but rather teens 13-to-18 years of age or adults. An operator of a mixed audience site is required to obtain age information from a user before collecting any information and if a user identifies themselves as a child under the age of 13, the operator must obtain parental consent before any information is collected) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
Background:
  • COPPA applies to operators of websites or online services that are directed to children. A child is defined as an individual under the age of 13. An online service which is not specifically targeted at children under the age of 13 may still be considered directed at children, if the service contains content that would appeal to children under the age of 13. The FTC looks at a variety of factors to see if a site or service is directed to children under 13, including the subject matter of the site or service, visual and audio content, the use of animated characters or other child-oriented activities and incentives, the age of models, the presence of child celebrities or celebrities who appeal to kids, ads on the site or service that are directed to children, and other reliable evidence about the age of the actual or intended audience. See FTC, Six-Step Compliance Plan for Your Business.

11.1.3: COPPA Notice

  • Do the policies clearly indicate whether or not the vendor describes: (1) what information is collected from children under 13 years of age, (2) how that information is used, and (3) its disclosure practices of that information?
Citation:

11.1.4: COPPA Offline

  • Do the policies clearly indicate whether or not the vendor collects personal information from children under 13 years of age "offline"?
Citation:
Background:
  • Congress limited the scope of COPPA to only information that an operator collects online from a child. COPPA does not govern information collected by an operator offline that is later placed online. See 15 U.S.C. 6501(8) (defining personal information as individually identifiable information about an individual collected online)

11.1.5: Restrict Account

  • Do the policies clearly indicate whether or not the vendor restricts creating an account for a child under 13 years of age?
Citation:

11.1.6: Restrict Purchase

  • Do the policies clearly indicate whether or not the vendor restricts in-app purchases for a child under 13 years of age?
Citation:

11.1.7: Safe Harbor

  • Do the policies clearly indicate whether or not the application or service participates in an FTC approved COPPA safe harbor program?
Citation:
Background:
  • An operator may satisfy its obligations under COPPA by participating in a safe harbor program. The safe harbor programs are self-regulatory frameworks developed by industry groups and approved by the FTC. FTC-approved COPPA safe harbor programs offer parental notification and consent systems for operators who are members of their programs. In addition, the FTC recognizes that these and other common consent mechanisms could benefit operators (especially smaller ones) and parents if they offer a proper means for providing notice and obtaining verifiable parental consent, as well as ongoing controls for parents to manage their children's accounts. The FTC recommends operators use a common consent mechanism to assist in providing notice and obtaining consent, because they are ultimately responsible for ensuring that the notice accurately and completely reflects their information collection practices and that the consent mechanism is reasonably designed to reach the parent. See 78 Fed. Reg. 3972, 3989; See FTC, Complying with COPPA: Frequently Asked Questions, q. 13.

11.2: Teens 13-18

11.2.1: Teen Data

  • Do the policies clearly indicate whether or not personal information from teens 13 to 18 years of age is collected?
Citation:
  • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
Background:
  • Third-party data brokers should implement better measures to refrain from collecting information from children and teens, particularly in marketing products. As to children under 13, COPPA already requires certain online services to refrain from collecting personal information from this age group without parental consent. The principles underlying COPPA could apply equally to information collected offline from children. As to teens, the FTC previously has noted that they often lack the judgment to appreciate the long-term consequences of, for example, posting personal information on the Internet. See FTC, Data Brokers: A Call For Transparency and Accountability (May 2014), p. 55.

11.3: Students in K-12

11.3.1: School Purpose

  • Do the policies clearly indicate whether or not the application or service is primarily used for preschool or K-12 school purposes and was designed and marketed for preschool or K-12 school purposes?
Citation:

11.3.2: Education Records

  • Do the policies clearly indicate the process by which education records are entered into the application or service? For example, are data entered by district staff, school employees, parents, teachers, students, or some other person?
Citation:
Background:

11.3.3: FERPA Notice

  • Do the policies clearly indicate whether or not the vendor provides a separate agreement that gives notice to users of their rights under FERPA?
Citation:
Background:
  • FERPA is a Federal law that protects personally identifiable information in students' education records from unauthorized disclosure. It affords parents the right to access their child's education records, the right to seek to have the records amended, and the right to have some control over the disclosure of personally identifiable information from the education records. When a student turns 18 or enters a postsecondary institution at any age, the rights under FERPA transfer from the parents to the student ("eligible student"). 20 U.S.C. § 1232g; 34 C.F.R. Part 99; See also PTAC, Responsibilities of Third-Party Service Providers under FERPA, pp. 1-3.
  • FERPA denies federal funding to educational agencies or institutions that have a practice or policy of permitting the release of student information without parental consent. There is an exception where such information is released to "school officials" who have been determined by the educational agency or institution to have a legitimate educational interest.
  • A vendor should describe the procedures for a parent, legal guardian, or eligible student to review and correct covered information. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14.

11.3.4: School Official

  • Do the policies clearly indicate whether or not the vendor is under the direct control of the educational institution and designated as a 'school official' under FERPA?
Citation:
Background:

11.3.5: Directory Information

  • Do the policies clearly indicate whether or not the vendor discloses student information as 'Directory Information' under a FERPA exception?
Citation:
Background:
  • What is the "Directory Information" Exception? It is an exception to parental consent that permits the disclosure of PII from education records under FERPA. Information designated by the school or district as directory information may be disclosed without consent and used without restriction in conformity with the policy, unless the parent, guardian, or eligible student opts out. Examples of directory information about students include name, address, telephone number, email address, date and place of birth, grade level, sports participation, and honors or awards received. Before a school or district can disclose directory information, it must first provide public notice to parents and eligible students of the types of information designated as directory information, the intended uses for the information, and the right of parents or eligible students to "opt out" of having their information shared. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 3; See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 3-4.

11.4.1: Parental Consent

  • Do the policies clearly indicate whether or not 'verifiable parental consent' should be obtained before the vendor collects or discloses personal information?
Citation:
  • Do the policies clearly indicate whether or not a parent can consent to the collection and use of their child's personal information without also consenting to the disclosure of the information to third parties?
Citation:
  • Children's Online Privacy Protection Act: (An operator can not condition a child's participation in the service with sharing any collected information with third parties. A parent is required to have the ability to consent to the collection and use of their child's personal information without also consenting to the disclosure of the information to third parties) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.5(a)(2)
  • Do the policies clearly indicate whether or not the vendor responds to a request from a parent or guardian to prevent further collection of their child's information?
Citation:

11.4.4: Delete Child

  • Do the policies clearly indicate whether or not the vendor deletes personal information from a student or child under 13 years of age if collected without parental consent?
Citation:
  • Do the policies clearly indicate whether or not the vendor provides direct notice to parents of its collection and disclosure practices, with a method to provide verifiable parental consent, under COPPA?
Citation:
Background:
  • Under most circumstances an operator is required to obtain verifiable parental consent before the collection, use, or disclosure of personal information from children under the age of 13. The method used to obtain parental consent must be reasonably calculated (taking into account available technology) to ensure that the person providing consent is actually the child's parent.

11.4.6: Internal Operations

  • Do the policies clearly indicate whether or not the vendor can collect and use personal information from children without parental consent to support the 'internal operations' of the vendor's website or service?
Citation:

11.4.7: COPPA Exception

  • Do the policies clearly indicate whether or not the vendor collects personal information from children without verifiable parental consent for the sole purpose of trying to obtain consent under COPPA?
Citation:

11.4.8: FERPA Exception

  • Do the policies clearly indicate whether or not the vendor may disclose personal information without verifiable parental consent under a FERPA exception?
Citation:
  • Do the policies clearly indicate whether or not responsibility or liability for obtaining verifiable parental consent is transferred to the school or district?

11.5.1: Policy Jurisdiction

  • Do the policies clearly indicate the vendor's jurisdiction that applies to the construction, interpretation, and enforcement of the policies?
Background:
  • The vendor's policies should describe the jurisdiction or state where disputes will be resolved. Typically, disputes are settled by Alternative Dispute Resolution (ADR) through binding arbitration, in which an arbitrator's award may be entered as a judgment in any court having jurisdiction.

11.5.2: Dispute Resolution

  • Do the policies clearly indicate whether or not the vendor requires a user to waive the right to a jury trial, or settle any disputes by Alternative Dispute Resolution (ADR)?
Background:
  • The vendor's policies should describe the legal process used to determine how disputes will be resolved. Any dispute arising out of an alleged breach of the policies will likely be settled by Alternative Dispute Resolution (ADR) through binding arbitration before an arbitration or mediation service, such as the American Arbitration Association or a similar provider.

11.5.3: Class Waiver

  • Do the policies clearly indicate whether or not the vendor requires waiver of any rights to join a class action lawsuit?

11.5.4: Law Enforcement

  • Do the policies clearly indicate whether or not the vendor can use or disclose a user's data under a requirement of applicable law, to comply with a legal process, respond to governmental requests, enforce their own policies, for assistance in fraud detection and prevention, or to protect the rights, privacy, safety or property of the vendor, its users, or others?
Citation:
Background:

11.6: Certification

11.6.1: Privacy Pledge

  • Do the policies clearly indicate whether or not the vendor has signed any privacy pledges or received any other privacy certifications?
Background:
  • Privacy protection depends on companies being accountable to consumers as well as to agencies that enforce consumer data privacy protections. However, compliance goes beyond external accountability to encompass practices through which companies prevent lapses in their privacy commitments or detect and remedy any lapses that may occur. Companies that can demonstrate that they live up to their privacy commitments have powerful means of maintaining and strengthening consumer trust. A company's own evaluation can prove invaluable to this process. The appropriate evaluation technique, which could be a self-assessment and need not necessarily be a full audit, will depend on the size, complexity, and nature of a company's business, as well as the sensitivity of the data involved. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 22.

11.7: International Laws

11.7.1: GDPR Transfer

  • Do the policies clearly indicate whether or not a user's data are subject to international data jurisdiction laws, such as a privacy shield or safe harbor framework, that protect the cross-border transfer of a user's data?
Citation:
  • General Data Protection Regulation: (The EU General Data Protection Regulation (GDPR) replaces the Data Protection Directive 95/46/EC and was designed to harmonize data privacy laws across Europe, to protect and empower all EU citizens' data privacy, and to reshape the way organizations across the region approach data privacy) See General Data Protection Regulation (GDPR), (Regulation (EU) 2016/679)

11.7.2: GDPR Contact

  • Do the policies clearly indicate whether or not the vendor provides a Data Protection Officer (DPO) or other contact to ensure GDPR compliance?
Citation:
  • General Data Protection Regulation: (The EU General Data Protection Regulation (GDPR) replaces the Data Protection Directive 95/46/EC and was designed to harmonize data privacy laws across Europe, to protect and empower all EU citizens' data privacy, and to reshape the way organizations across the region approach data privacy) See General Data Protection Regulation (GDPR), (Regulation (EU) 2016/679)

11.8: Compliance Assessment

11.8.1: Accountability Audit

  • Do the policies clearly indicate whether or not the data privacy or security practices of the vendor are internally or externally audited to ensure compliance?

Licensing and Attribution

The Privacy Evaluation Questions are released under a Creative Commons Attribution Non-Commercial Share-Alike 4.0 License. If you use these questions in your non-commercial project, please credit Common Sense Media as the author, and link back to the announcement post.

This is an example of proper attribution for the Questions: The Privacy Evaluation Questions were authored by Common Sense Media, and are reusable under the terms of a Creative Commons Attribution Non-Commercial Share-Alike 4.0 License.