Basic Evaluation Questions

These questions form the foundation of our basic evaluation process and comprise 35 of the most important questions from our full 150-question evaluation. You have several options for navigating these questions and learning more about data privacy. This page contains all of the basic questions, with background citations.


Assessment:

0.1.1: (Policy Available) Are the privacy policies for the specific product (vs. the company website) made publicly available?

Citation:

Background:

  • A company’s terms of service outline the relationship between the user and the company. The terms contain rules for what activities and content users are permitted to engage in and share on a company’s services, and as such, these terms can directly affect users’ freedom of expression rights. Companies can also take action against users for violating the conditions described in the terms. Given this, we expect companies to ensure that users can easily locate these terms and understand what they mean. See Ranking Digital Rights, F1.
  • Privacy policies address how companies collect, manage, use, and secure information about users as well as information provided by users. Given this, companies should ensure that users can easily locate the policy and make an effort to help users understand what it means. See Ranking Digital Rights, P1.

0.1.3: (Default Encryption) Does the login page use encryption with HTTPS?

Citation:

0.1.4: (Encryption Required) Are HTTP requests made to the login page redirected to HTTPS?

Citation:
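
These two indicators differ from the policy questions that follow in that they can be verified directly against the product. The sketch below is one way an evaluator might spot-check both; it assumes Python 3 with the third-party requests library, and the host and login path are hypothetical placeholders. It fetches the login page over HTTPS (0.1.3), then confirms that a plain HTTP request is redirected to an HTTPS URL (0.1.4).

```python
import requests

HOST = "example.com"   # hypothetical host; substitute the product's domain
LOGIN_PATH = "/login"  # hypothetical path; substitute the product's login page

def check_default_encryption(host: str, path: str) -> bool:
    """0.1.3: the login page should load successfully over HTTPS."""
    resp = requests.get(f"https://{host}{path}", timeout=10)
    return resp.ok

def check_encryption_required(host: str, path: str) -> bool:
    """0.1.4: a plain HTTP request should redirect to an HTTPS URL."""
    resp = requests.get(f"http://{host}{path}", timeout=10, allow_redirects=False)
    location = resp.headers.get("Location", "")
    return resp.status_code in (301, 302, 307, 308) and location.startswith("https://")

if __name__ == "__main__":
    print("0.1.3 Default Encryption:", check_default_encryption(HOST, LOGIN_PATH))
    print("0.1.4 Encryption Required:", check_encryption_required(HOST, LOGIN_PATH))
```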

1: Transparency

1.1: Policy Version

1.1.1: (Effective Date) Do the policies clearly indicate the version or effective date of the policies?

Citation:

Background:

1.2: Policy Notice

1.3: Policy Changes

1.4: Policy Coverage

1.5: Policy Contact

1.6: Policy Principles

1.7: Policy Language

1.8: Intended Use

1.8.1: (Children Intended) Do the policies clearly indicate whether or not the product is intended to be used by children under the age of 13?

Citation:

1.8.5: (Students Intended) Do the policies clearly indicate whether or not the product is intended to be used by students in preschool or K-12?

Citation:

2: Focused Collection

2.1: Data Collection

2.1.1: (Collect PII) Do the policies clearly indicate whether or not the vendor collects Personally Identifiable Information (PII)?

Citation:

  • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
  • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
  • General Data Protection Regulation: (“personal data” means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person) See General Data Protection Regulation (GDPR), Definitions, Art. 4(1)

Background:

  • FERPA defines the term personally identifiable information (PII) to include direct identifiers (such as a student's or other family member's name) and indirect identifiers (such as a student's date of birth, place of birth, or mother's maiden name). Indirect identifiers include metadata about a student's interaction with an application or service, and even aggregate information can be considered PII under FERPA if a reasonable person in the school community could identify individual students based on the indirect identifiers together with other reasonably available information, including other public information. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 2; See PTAC, Protecting Student Privacy While Using Online Educational Services: Model Terms of Service, p. 2.
  • Companies collect a wide range of personal information from users—from personal details and account profiles to a user’s activities and location. We expect companies to clearly disclose what user information they collect and how they do so. See Ranking Digital Rights, P3.
  • The term “user information” appears in many indicators throughout the Privacy category. An expansive interpretation of user information is defined as: “any data that is connected to an identifiable person, or may be connected to such a person by combining datasets or utilizing data-mining techniques.” As further explanation, user information is any data that documents a user’s characteristics and/or activities. This information may or may not be tied to a specific user account. This information includes, but is not limited to, personal correspondence, user-generated content, account preferences and settings, log and access data, data about a user’s activities or preferences collected from third parties either through behavioral tracking or purchasing of data, and all forms of metadata. See Ranking Digital Rights, P3.

2.2: Data Source

2.3: Data Exclusion

2.4: Data Limitation

2.4.1: (Collection Limitation) Do the policies clearly indicate whether or not the vendor limits the collection or use of information to only data that are specifically required for the product?

Citation:

Background:

3: Data Sharing

3.1: Data Shared With Third Parties

3.1.1: (Data Shared) Do the policies clearly indicate whether or not collected information (including data collected via automated tracking or usage analytics) is shared with third parties?

3.1.2: (Data Categories) Do the policies clearly indicate what categories of information are shared with third parties?

3.2: Data Use by Third Parties

3.2.4: (Third-Party Marketing) Do the policies clearly indicate whether or not personal information is shared with third parties for advertising or marketing purposes?

Citation:

Background:

3.3: Data Not Shared With Third Parties

3.4: Data Sold to Third Parties

3.4.1: (Data Sold) Do the policies clearly indicate whether or not a user's personal information is sold or rented to third parties?

Citation:

Background:

3.5: Third-Party Data Acquisition

3.6: Third-Party Links

3.7: Third-Party Data Access

3.8: Third-Party Data Collection

3.9: Third-Party Data Misuse

3.10: Third-Party Service Providers

3.11: Third-Party Affiliates

3.12: Third-Party Policies

3.13: Third-Party Data Combination

3.14: Third-Party Authentication

3.14.1: (Social Login) Do the policies clearly indicate whether or not social or federated login is supported to use the product?

Citation:

  • California Privacy of Pupil Records: (Prohibits schools, school districts, county offices of education, and charter schools from collecting or maintaining information about pupils from social media for any purpose other than school or pupil safety, without notifying each parent or guardian and providing the pupil with access and an opportunity to correct or delete such information) See California Privacy of Pupil Records, Cal. Ed. Code § 49073.6(c)

3.15: De-identified or Anonymized Data

3.16: Third-Party Contractual Obligations

3.16.1: (Third-Party Limits) Do the policies clearly indicate whether or not the vendor imposes contractual limits on how third parties can use personal information that the vendor shares or sells to them?

Citation:

  • Children's Online Privacy Protection Act: (An operator must take reasonable steps to release a child's personal information only to service providers and third parties who are capable of maintaining the confidentiality, security, and integrity of the information, and provide assurances that they contractually maintain the information in the same manner) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.8
  • Family Educational Rights and Privacy Act: (An exception for disclosing personally identifiable information without obtaining parental consent exists for sharing data with a third party who is considered a "school official" with a legitimate educational interest, and under direct control of the school for the use and maintenance of education records) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.31(a)(1)(i)(B)
  • Student Online Personal Information Protection Act: (An operator may disclose student information to a third party service provider, but the third party is prohibited from using the information for any purpose other than providing the service) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(4)(E)(i)
  • Student Online Personal Information Protection Act: (A third party service provider may not disclose student information to any subsequent third party) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(4)(E)(ii)
  • General Data Protection Regulation: (The processor shall not engage another processor without prior specific or general written authorisation of the controller. In the case of general written authorisation, the processor shall inform the controller of any intended changes concerning the addition or replacement of other processors, thereby giving the controller the opportunity to object to such changes.) See General Data Protection Regulation (GDPR), Processor, Art. 28(2)
  • General Data Protection Regulation: (Processing by a processor shall be governed by a contract or other legal act under Union or Member State law, that is binding on the processor with regard to the controller and that sets out the subject-matter and duration of the processing, the nature and purpose of the processing, the type of personal data and categories of data subjects and the obligations and rights of the controller.) See General Data Protection Regulation (GDPR), Processor, Art. 28(3)
  • General Data Protection Regulation: (Where a processor engages another processor for carrying out specific processing activities on behalf of the controller, the same data protection obligations as set out in the contract or other legal act between the controller and the processor ... shall be imposed on that other processor by way of a contract or other legal act under Union or Member State law, in particular providing sufficient guarantees to implement appropriate technical and organisational measures in such a manner that the processing will meet the requirements of this Regulation. Where that other processor fails to fulfil its data protection obligations, the initial processor shall remain fully liable to the controller for the performance of that other processor's obligations.) See General Data Protection Regulation (GDPR), Processor, Art. 28(4)
  • General Data Protection Regulation: (The processor and any person acting under the authority of the controller or of the processor, who has access to personal data, shall not process those data except on instructions from the controller) See General Data Protection Regulation (GDPR), Processing under the authority of the controller or processor, Art. 29

Background:

  • A company that transfers data from one company to another should not place emphasis on the disclosures themselves, but on whether a disclosure leads to a use of personal data that is inconsistent with the context of its collection or a consumer's expressed desire to control the data. Thus, if a company transfers personal data to a third party, it remains accountable and thus should hold the recipient accountable—through contracts or other legally enforceable instruments. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 22.
  • A company's data would not be "reasonably linkable" to a particular consumer or device to the extent that the company implements three significant protections for that data: (1) a given data set is not reasonably identifiable, (2) the company publicly commits not to re-identify it, and (3) the company requires any downstream users of the data to keep it in de-identified form. See FTC, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers (2012), p. 21.
  • The ability to re-identify "anonymous" data supports the FTC's framework application to data that can be reasonably linked to a consumer or device, because consumers' privacy interest in data goes beyond what is strictly labeled PII. There exists a legitimate interest for consumers in having control over how companies collect and use aggregated or de-identified data, browser fingerprints, and other types of non-PII. See FTC, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers (2012), pp. 18-19.
  • Properly de-identified data can reduce the risk of a person's sensitive personal information being disclosed, but data de-identification must be done carefully. Simple removal of direct identifiers from the data to be released does not constitute adequate de-identification. Properly performed de-identification involves removing or obscuring all identifiable information until all data that can lead to individual identification have been expunged or masked. Further, when making a determination as to whether the data have been sufficiently de-identified, it is necessary to take into consideration cumulative re-identification risk from all previous data releases and other reasonably available information. (A toy illustration of this re-identification risk appears after this list.) See PTAC, Data De-identification: An Overview of Basic Terms, p. 3.
  • A vendor should contractually require its service providers who receive covered information acquired through the site or service to use the information only to provide the contracted service, not to further disclose the information, to implement and maintain reasonable security procedures and practices as required by law, and to return or delete covered information at the completion of the contract. The vendor should also require that service providers notify it immediately of any unauthorized disclosure of the student information in their custody, and then act promptly to provide proper notice as required by law, and should make clear to service providers that they may separately face liability for the mishandling of student data. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 13.
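
As a toy illustration of the re-identification risk noted in the PTAC bullet above (the records and quasi-identifiers here are hypothetical), the sketch below computes k-anonymity: the size of the smallest group of records sharing the same combination of quasi-identifiers. A result of k = 1 means at least one individual remains uniquely identifiable even after direct identifiers such as names have been removed.

```python
from collections import Counter

# Hypothetical rows with direct identifiers (names) already removed.
records = [
    {"zip": "94105", "birth_year": 2008, "gender": "F"},
    {"zip": "94105", "birth_year": 2008, "gender": "F"},
    {"zip": "94110", "birth_year": 2007, "gender": "M"},
]

# Group records by their quasi-identifier combination.
groups = Counter((r["zip"], r["birth_year"], r["gender"]) for r in records)

# k-anonymity: size of the smallest group. k == 1 means some individual
# remains uniquely identifiable despite the removal of direct identifiers.
k = min(groups.values())
print(f"k-anonymity = {k}")  # prints 1 here: the 94110 row is unique
```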

4: Respect for Context

4.1: Data Use

4.2: Data Combination

4.3: Data Notice

4.4: Data Changes

4.5: Policy Enforcement

5: Individual Control

5.1: User Content

5.1.1: (User Submission) Do the policies clearly indicate whether or not a user can create or upload content to the product?

5.2: User Consent

5.3: Remedy Process

5.4: Data Settings

5.5: Data Disclosure

5.6: Intellectual Property

6: Access and Accuracy

6.1: Data Access

6.1.1: (Access Data) Do the policies clearly indicate whether or not the vendor provides authorized individuals a method to access a user's personal information?

Citation:

6.2: Data Integrity

6.3: Data Correction

6.3.1: (Data Modification) Do the policies clearly indicate whether or not the vendor provides authorized individuals with the ability to modify a user's inaccurate data?

Citation:

  • California Online Privacy Protection Act: (If the operator maintains a process for a consumer to review and request changes to any of their personally identifiable information they must provide a description of that process) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(2)
  • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller without undue delay the rectification of inaccurate personal data concerning him or her. Taking into account the purposes of the processing, the data subject shall have the right to have incomplete personal data completed, including by means of providing a supplementary statement.) See General Data Protection Regulation (GDPR), Right to rectification, Art. 16

6.4: Data Retention

6.5: Data Deletion

6.5.4: (Deletion Process) Do the policies clearly indicate whether or not the vendor provides a process for the school, parent, or eligible student to delete a student's personal information?

Citation:

Background:

6.6: Data Portability

7: Data Transfer

7.1: Data Handling

7.1.1: (Transfer Data) Do the policies clearly indicate whether or not the vendor can transfer a user's data in the event of the vendor's merger, acquisition, or bankruptcy?

Citation:

7.2: Transfer Request

7.3: Onward Contractual Obligations

8: Security

8.1: User Identity

8.2: User Account

8.2.1: (Account Required) Do the policies clearly indicate whether or not the vendor requires a user to create an account with a username and password in order to use the product?

8.2.2: (Managed Account) Do the policies clearly indicate whether or not the vendor provides user-managed accounts for a parent, teacher, school, or district?

8.3: Third-party Security

8.4: Data Confidentiality

8.4.1: (Reasonable Security) Do the policies clearly indicate whether or not reasonable security standards are used to protect the confidentiality of a user's personal information?

Citation:

Background:

  • A vendor should provide a general description of the technical, administrative, and physical safeguards it uses to protect student information from unauthorized access, destruction, use, modification, or disclosure. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14.
  • A vendor should implement and maintain reasonable security measures appropriate to the nature of the student information, including covered information, acquired through its site or service. The vendor should designate and train someone responsible and use a risk management process: identify data assets, assess threats and vulnerabilities, apply appropriate controls, monitor their effectiveness, and repeat the process. As discussed in the California Data Breach Report, the Center for Internet Security's Critical Security Controls is a good starting point for high-priority security controls. The Federal Trade Commission's Start with Security also offers helpful guidance. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 15.
  • This indicator is applicable to internet and mobile ecosystem companies. Companies hold significant amounts of user information, making them targets for malicious actors. We expect companies to help users protect themselves against such threats. Companies should clearly disclose that they use advanced authentication techniques to prevent unauthorized access to user accounts and information. We also expect companies to provide users with tools that enable them to secure their accounts and to know when their accounts may be compromised. See Ranking Digital Rights, P17.

8.5: Data Transmission

8.5.1: (Transit Encryption) Do the policies clearly indicate whether or not all data in transit is encrypted?

Citation:

  • California Data Breach Notification Requirements: (A person or business that owns, licenses, or maintains personal information about a California resident is required to implement and maintain reasonable security procedures and practices appropriate to the nature of the information, and to protect the personal information from unauthorized access, destruction, use, modification, or disclosure) See California Data Breach Notification Requirements, Cal. Civ. Code § 1798.81.5
  • General Data Protection Regulation: (Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: (a) the pseudonymisation and encryption of personal data) See General Data Protection Regulation (GDPR), Security of processing, Art. 32(1)(a)

Background:

  • Encryption is an important tool for protecting freedom of expression and privacy. The UN Special Rapporteur on Freedom of Expression has stated unequivocally that encryption and anonymity are essential for the exercise and protection of human rights. We expect companies to clearly disclose that user communications are encrypted by default, that transmissions are protected by “perfect forward secrecy,” that users have an option to turn on end-to-end encryption, and whether the company offers end-to-end encryption by default. For mobile ecosystems, we expect companies to clearly disclose that they enable full-disk encryption. See Ranking Digital Rights, P16.
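
Whether a service actually negotiates TLS with forward secrecy can be checked empirically as well as read out of a policy. The following is a minimal sketch, assuming Python's standard ssl and socket modules and a hypothetical example.com host, that reports the negotiated protocol and cipher suite and flags ephemeral key exchange (all TLS 1.3 suites, or ECDHE/DHE suites under TLS 1.2).

```python
import socket
import ssl

def inspect_tls(host: str, port: int = 443) -> dict:
    """Report the negotiated TLS protocol version and cipher suite."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cipher_name, tls_version, _bits = tls.cipher()
            return {
                "protocol": tls_version,
                "cipher": cipher_name,
                # TLS 1.3 suites always use ephemeral key exchange; for
                # TLS 1.2, ECDHE/DHE in the suite name indicates the same.
                "forward_secrecy": (
                    tls_version == "TLSv1.3"
                    or "ECDHE" in cipher_name
                    or "DHE" in cipher_name
                ),
            }

print(inspect_tls("example.com"))  # hypothetical host
```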

8.6: Data Storage

8.6.1: (Storage Encryption) Do the policies clearly indicate whether or not all data at rest is encrypted?

Citation:

  • California Data Breach Notification Requirements: (A person or business that owns, licenses, or maintains personal information about a California resident is required to implement and maintain reasonable security procedures and practices appropriate to the nature of the information, and to protect the personal information from unauthorized access, destruction, use, modification, or disclosure) See California Data Breach Notification Requirements, Cal. Civ. Code § 1798.81.5
  • General Data Protection Regulation: (Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: (a) the pseudonymisation and encryption of personal data) See General Data Protection Regulation (GDPR), Security of processing, Art. 32(1)(a)
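
For context on what encryption at rest involves in practice, here is a minimal sketch of authenticated symmetric encryption of a record before it is written to storage, the kind of measure GDPR Art. 32(1)(a) contemplates. It assumes Python's third-party cryptography package and a hypothetical student record, and is an illustration only, not a statement about any vendor's implementation; real deployments hinge on key management, which is out of scope here.

```python
from cryptography.fernet import Fernet

# In practice the key must come from a key-management service; storing it
# next to the ciphertext defeats the purpose of encryption at rest.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b'{"student_id": 1234, "name": "Jane Doe"}'  # hypothetical PII
ciphertext = fernet.encrypt(record)  # what actually gets written to disk
assert fernet.decrypt(ciphertext) == record
```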

8.7: Data Breach

8.7.1: (Breach Notice) Do the policies clearly indicate whether or not the vendor provides notice in the event of a data breach to affected individuals?

Citation:

Background:

  • The breach notification laws in California and the 46 other states are similar in many ways, because most are modeled on the original California law. All of them require notifying individuals when their personal information has been breached, prefer written notification but allow using the "substitute method" in certain situations, allow for a law enforcement delay, and provide an exemption from the requirement to notify when data is encrypted and the keys required to decrypt the data are still secure. However, there are some differences, primarily in three areas: (1) the notification trigger, (2) the timing for notification, and (3) the definition of covered information. See CA DOJ, California Data Breach Report (2016).
  • A vendor should develop and describe the process for notifying schools or school districts, parents, legal guardians, or eligible students, as well as any appropriate government agencies, of any unauthorized disclosure of student information. Determine whether the incident and the types of data involved also require notification under California's breach notification law, and if so, take appropriate action. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 15.
  • When the security of users' data has been compromised due to a data breach, companies should have clearly disclosed processes in place for addressing the security threat and for notifying affected users. Given that data breaches can result in significant threats to an individual's financial or personal security, in addition to exposing private information, companies should make these security processes publicly available. Individuals can then make informed decisions and consider the potential risks before signing up for a service or giving a company their information. Company press releases or blog posts addressing a data breach after it has occurred do not qualify as sufficient disclosure for this indicator. We expect companies to have formal policies in place regarding their handling of data breaches if and when they occur, and to make information about these policies and commitments public. See Ranking Digital Rights, P15.

9: Responsible Use

9.1: Social Interactions

9.1.1: (Safe Interactions) Do the policies clearly indicate whether or not a user can interact with trusted users?

Citation:

9.2: Data Visibility

9.2.1: (Visible Data) Do the policies clearly indicate whether or not a user's personal information can be displayed publicly in any way?

Citation:

9.3: Monitor and Review

9.3.2: (Filter Content) Do the policies clearly indicate whether or not the vendor takes reasonable measures to delete all personal information from a user's postings before they are made publicly visible?

Citation:

  • Children's Online Privacy Protection Act: (An operator may prevent collection of personal information if it takes reasonable measures to delete all or virtually all personal information from a child's postings before they are made public and also to delete the information from its records) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2

Background:

  • Companies may employ staff to review content and/or user activity or they may rely on community flagging mechanisms that allow users to flag other users’ content and/or activity for company review. See Ranking Digital Rights, F3.

9.3.3: (Moderating Interactions) Do the policies clearly indicate whether or not social interactions between users of the product are moderated?

Citation:

  • Children's Online Privacy Protection Act: (An operator may prevent collection of personal information if it takes reasonable measures to delete all or virtually all personal information from a child's postings before they are made public and also to delete the information from its records) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2

Background:

9.4: Report Content

9.5: Internet Safety

10: Advertising

10.1: Vendor Communications

10.2: Traditional Advertising

10.2.1: (Traditional Ads) Do the policies clearly indicate whether or not traditional advertisements are displayed to a user based on a webpage's content, and not that user's data?

Citation:

10.3: Behavioral Advertising

10.3.1: (Behavioral Ads) Do the policies clearly indicate whether or not behavioral advertising is displayed based on a user's personal information?

Citation:

Background:

  • Online behavioral or targeted advertising is the practice of collecting information about consumers' online interests in order to deliver targeted advertising to them. This system of advertising revolves around ad networks that can track individual consumers—or at least their devices—across different websites. When organized according to unique identifiers, this data can provide a potentially wide-ranging view of individual use of the Internet. These individual behavioral profiles allow advertisers to target ads based on inferences about individual interests, as revealed by Internet use. Targeted ads are generally more valuable and efficient than purely contextual ads and provide revenue that supports an array of free online content and services. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), pp. 11-12.
  • The FTC recommends that affirmative express consent is appropriate when a company uses sensitive data for any marketing, whether first- or third-party. When health or children's information is involved, for example, the likelihood that data misuse could lead to embarrassment, discrimination, or other harms is increased. This risk exists regardless of whether the entity collecting and using the data is a first party or a third party that is unknown to the consumer. In light of the heightened privacy risks associated with sensitive data, first parties should provide a consumer choice mechanism at the time of data collection. See FTC, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers (2012), p. 47.
  • The FTC believes the circumstances requiring affirmative express consent for first-party marketing using sensitive data are limited. Certainly, where a company's business model is designed to target consumers based on sensitive data – including data about children, financial and health information, Social Security numbers, and certain geolocation data – the company should seek affirmative express consent before collecting the data from those consumers. On the other hand, the risks to consumers may not justify the potential burdens on general audience businesses that incidentally collect and use sensitive information. See FTC, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers (2012), pp. 47-48.
  • If a vendor displays targeted advertising they should not use any information, including covered information and persistent unique identifiers, acquired through the site or service as a basis for targeting advertising to a specific student or other user. This includes both advertising delivered on the site or service that acquired the information and advertising delivered on any other site or service based on that information. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 12.

10.4: Ad Tracking

10.4.1: (Third-Party Tracking) Do the policies clearly indicate whether or not third-party advertising services or tracking technologies collect any information from a user of the product?

Citation:

10.4.2: (Track Users) Do the policies clearly indicate whether or not a user's information is used to track users and display targeted advertisements on other third-party websites or services?

Citation:

  • Children's Online Privacy Protection Act: (An operator is prohibited from sharing a persistent identifier collected from children that can be used to recognize and track a user over time and across different websites or services without verifiable parental consent) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Student Online Personal Information Protection Act: (An operator is prohibited from tracking a student across websites with targeted advertising) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(1)(A)
  • California Online Privacy Protection Act: (An operator is required to disclose whether other third parties may collect personally identifiable information about a consumer's online activities over time and across different Web sites) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(6)
  • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
  • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582

Background:

10.4.3: (Data Profile) Do the policies clearly indicate whether or not the vendor allows third parties to use a student's data to create an automated profile, engage in data enhancement, conduct social advertising, or target advertising to students, parents, teachers, or the school?

Citation:

Background:

10.5: Filtered Advertising

10.6: Marketing Communications

10.7: Unsubscribe

10.8: Do Not Track

11: Compliance

11.1: Children Under 13

11.2: Students in K-12

11.2.1: (School Purpose) Do the policies clearly indicate whether or not the product is primarily used, designed, and marketed for preschool or K-12 school purposes?

Citation:

11.3: Parental Consent

11.3.1: (Parental Consent) Do the policies clearly indicate whether or not the vendor or third party obtains verifiable parental consent before they collect or disclose personal information?

Citation:

11.3.5: (Consent Method) Do the policies clearly indicate whether or not the vendor provides notice to parents or guardians of the methods to provide verifiable parental consent under COPPA?

Citation:

  • Children's Online Privacy Protection Act: (An operator is required to provide direct notice to parents describing what information is collected, how information is used, its disclosure practices and exceptions) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.4(b)
  • Children's Online Privacy Protection Act: (Existing methods to obtain verifiable parental consent include: (i) Providing a consent form to be signed by the parent and returned to the operator by postal mail, facsimile, or electronic scan; (ii) Requiring a parent, in connection with a monetary transaction, to use a credit card, debit card, or other online payment system that provides notification of each discrete transaction to the primary account holder; (iii) Having a parent call a toll-free telephone number staffed by trained personnel; (iv) Having a parent connect to trained personnel via video-conference; (v) Verifying a parent's identity by checking a form of government-issued identification against databases of such information, where the parent's identification is deleted by the operator from its records promptly after such verification is complete) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.5(b)(i)-(v)
  • Children's Online Privacy Protection Act: (If an operator does not “disclose” children's personal information, they may use an email coupled with additional steps to provide assurances that the person providing the consent is the parent. Such additional steps include: Sending a confirmatory email to the parent following receipt of consent, or obtaining a postal address or telephone number from the parent and confirming the parent's consent by letter or telephone call. An operator that uses this method must provide notice that the parent can revoke any consent given in response to the earlier email.) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.5(b)(vi)

Background:

  • Under most circumstances an operator is required to obtain verifiable parental consent before the collection, use, or disclosure of personal information from children under the age of 13. The method used to obtain parental consent must be reasonably calculated (taking into account available technology) to ensure that the person providing consent is actually the child's parent.

11.4: Legal Requirements

11.5: Certification

11.6: International Laws