In conversations about student data privacy, the terms "privacy," "security," and "encryption" are often used interchangeably. While the terms are related, they are distinct concepts. In this post, we will break down how they overlap and where they differ.
But at the outset, I need to emphasize that this post will be incomplete; a comprehensive treatment of these terms and the distinctions among them would be a good subject for a book. Details will be left out. If you're not OK with that, feel free to stop reading now. I imagine the Kardashians are up to something curious or interesting -- feel free to check that out.
Rather than being comprehensive, this post is intended as a starting point for people looking to learn more about these concepts.
Privacy is arguably the least technical element in this conversation. There are two facets to privacy we will highlight here:
- It's possible to have great security and bad privacy practices.
- We often speak about "privacy" without clarifying "private from whom."
Great security and bad privacy
A vendor can go to extreme lengths to make sure that data can only be accessed by the vendor or the partners of the vendor. However, if the vendor reserves the right to sell your data to whomever they want, whenever they want, that's not great for your privacy. The ways that vendors can use the data they acquire from you are generally spelled out in their terms of service -- so, if a vendor reserves rights to share and reuse your data in their terms, and you agree to those terms, you have given the vendor both data and the permission to use that data.
Who is that private from, really?
Different people think of different things when we say the word "private." In most cases, when we think about privacy, we focus on things we don't want other people to know. When we're working with technology, though, the concept of "other people" gets abstract and impersonal pretty quickly.
When we use services that store a record of what we've done (and it's worth noting that "done" means read, said, searched for, liked, shared, and moused over and includes how long we have done any of these things), the "private" things we do are handed over to systems that have perfect memories. This changes the nature of what "private" can mean. For the purposes of this post, we'll use four categories of people who might be interested in us over time and how that impacts our privacy.
- Criminal: This is the category people agree about most: the people stealing data, perpetrating identity theft, and using a range of attacks to gain unauthorized access to data with malicious intent.
- Personal: There is also broad agreement about personal privacy. None of us wants Great Uncle Wilfred to know about our dating life, or to bring it up at Thanksgiving. The ability to control which of our acquaintances knows what is something we all want.
- Corporate: There is less agreement here, as one person's desire for privacy often runs counter to a data broker's or a marketer's business plan. Consider services such as Facebook, Instagram, Twitter, Snapchat, and Pinterest: the "privacy settings" these vendors provide might offer a degree of personal privacy, but they do nothing to prevent the vendor from knowing, storing, and profiting from everything you do online. This often includes tracking you all over the Web (via cookies and local shared objects), tracking you in real life (via location information collected by a mobile app), and buying additional data about you from a data broker.
- State: There is also less agreement about what constitutes an appropriate level of protection or freedom from state-sponsored surveillance. While people have been aware of the inclination of the state to violate privacy in the name of security and law enforcement throughout history, the Snowden leaks helped create clarity about what this looks like in the present day.
(As an aside, the data-use practices within politics should possibly be included in this list.)
Many conversations about privacy never move past criminal activity or personal compromise. However, corporate- and state-level data collection and use expose us to risk as well: as the Ashley Madison and OPM breaches recently illustrated, both corporate and state data collection pose criminal and personal risks.
For people looking to learn more about the various factors at play in larger privacy conversations, I strongly recommend Frank Pasquale's recent book, The Black Box Society. The book itself is great, and the footnotes are an incredible source of information.
In very general terms, security means how data is protected from unauthorized access and use. Encryption is a part of security, but far from the only part. If a systems administrator leaves a username and password on a Post-it note stuck to a monitor, that undercuts the value of encrypting the servers. Human error can result in snafus such as a popular tech start-up's W-2s being emailed to a scammer.
If people email passwords to one another -- or store passwords online in a Google spreadsheet -- a system with fantastic technical security can be compromised by a person with limited technical abilities who happens to stumble onto the passwords. Phishing and social-engineering attacks exploit human judgment to sidestep technical security measures. If a CSV file of user information is transferred via SpiderOak and then copied to an unencrypted USB key, the protection provided by secure file transfer is immediately undone by storing sensitive information in plain text on a portable device that's easy to lose. In short, security is the combination of technical and human factors that, taken together, decreases the risk of unauthorized access to or use of information.
Encryption is an element of security but not the only element. It is, however, a big part of the foundation upon which security, and our hopes for privacy, rest.
Encryption is often used in general terms as a monolithic construct, as in: "We need to fight to protect encryption" or "Only criminals need encryption."
However, the general conversation rarely gets into the different ways that information can be encrypted. Additionally, there are differences between encrypting a device (such as a hard drive), data within an app, and data in transit between an app and a server or another user.
As an example, all of the following questions look at possible uses of encryption for a standard application:
- Does the application encrypt data at rest on the device where the data is stored?
- If the application pushes data to a remote server for storage, is the data encrypted while in transit to and from the remote location?
- Is the data encrypted while stored at the remote location?
- If the remote location uses multiple servers to support the application, is communication among those servers encrypted?
If the answer to any of these questions is no, then, arguably, the data is not getting the full benefits of encryption. To further complicate matters, if a vendor encrypts data at rest, and encrypts data moving between servers, and encrypts data moving between servers and applications, but that vendor can still decrypt that data, then there is no guarantee that the benefits of encryption will protect an individual user. When vendors can decrypt the data on their hardware, then the data is only as secure -- and the information stored only as private -- as the vendor is able or willing to protect that encryption.
True end-to-end encryption (wherein the data is encrypted before it leaves the application, is sent via an encrypted connection, and is only decrypted at its final destination) is the ideal, but often a vendor will function as a middleman, storing and archiving the data before sending it along to its intended recipient. This is one of many reasons that the encryption debate looks different for vendors that make hardware relative to vendors that build software.
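The difference key custody makes can be sketched in a few lines of Python. The functions below are a deliberately simplified stand-in for a real cipher (a hand-rolled XOR construction like this should never be used in practice); the names and the two scenarios are illustrative assumptions, not any vendor's actual design. The point is only this: whoever holds the key can read the data. When the vendor holds the key, "encrypted at rest" still means the vendor can read everything; with end-to-end encryption, the vendor relays ciphertext it has no key to open.

```python
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key + nonce + counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream; prepend the random nonce."""
    nonce = secrets.token_bytes(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def toy_decrypt(key: bytes, blob: bytes) -> bytes:
    """Anyone holding `key` can invert the XOR and recover the plaintext."""
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))

# Scenario 1: the vendor holds the key. The data is encrypted at rest,
# but the vendor can decrypt it whenever it likes.
vendor_key = secrets.token_bytes(32)
stored = toy_encrypt(vendor_key, b"a student's search history")
print(toy_decrypt(vendor_key, stored))

# Scenario 2: end-to-end. Only the two endpoints share the key; the
# vendor relays an opaque blob and sees only ciphertext bytes.
endpoint_key = secrets.token_bytes(32)
relayed_by_vendor = toy_encrypt(endpoint_key, b"a private message")
print(toy_decrypt(endpoint_key, relayed_by_vendor))
```

In real systems the same contrast shows up as who generates and stores the keys: a vendor-managed key lives on the vendor's servers, while end-to-end keys exist only on users' devices.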
In very general terms, hardware manufacturers fighting for encryption are protecting user data, and it is in their best interest to do so: if they fail to protect user data, they lose user trust, and people stop buying their products.
In equally general terms, many application vendors fighting for encryption have a more complicated position. A small number have been vocal supporters of encryption for years: those who offer true end-to-end encryption, or who implement encryption where the user, not the vendor, retains control of the keys. However, the legal battle between Apple and the FBI over encryption has elicited broad support from within the tech community, including companies that use data to power advertising and user profiling. For companies whose business is predicated on access to and use of a large data set of sensitive user information, strong encryption is essential to their business interests.
In their external communications, they can get a public-relations win by advancing the position that they are defending people's right to privacy -- and it needs to be noted that their loud public support is a very good thing. Internally, however, encryption protects the biggest asset these companies possess: the data sets they have collected, and the communications they have about their work. This is where the paradox of strong security with questionable privacy practice comes into play: Why should encryption give large companies an additional tool to protect the means by which they compromise the privacy of individuals?
And the answer is that, without encryption available to individuals or small companies, none of us has a chance to enjoy even limited privacy. If we -- people with less access to technical and financial resources than the more wealthy or connected -- want to have a chance at maintaining our privacy, encryption is one of the tools we must have at our disposal. The fact that it's also useful to companies that make a living by mining our information and -- arguably -- eroding our privacy doesn't change the reality that encryption is essential for the rest of us, too.
Note: I'd like to thank Jeff Graham for critical feedback on drafts of this piece.