Personally Identifiable Information (PII) is data that can be used to directly or indirectly identify a specific individual, such as a Social Security number, driver’s license number, or login name. PII is important to protect as a part of overall data security because PII is often used by banks, healthcare providers, and government agencies as a means of unlocking information or of proving your identity in order to get access to loans, credit cards, health records, etc.
PII has continued to be a hot topic, both in terms of legal protections and in the numerous high-profile hacking cases that have occurred, including the break-ins to the Democratic National Committee emails that fueled discussion during, and even after, the 2016 presidential election.
Even a small piece of PII can be used to expose individuals; consider AOL anonymous user 4417749. In 2006, AOL anonymized and released data on the searches being performed on its platform in hopes of helping academic researchers. However, one particular anonymous user had made various searches that enabled the user to be identified, including: “numb fingers”, “dog that urinates on everything”, “60 single men”, “landscapers in Lilburn, Ga”, and the last name “Arnold.” With only the last name as a potential piece of PII, researchers were still able to identify anonymous AOL user 4417749 as Thelma Arnold, a 62-year-old widow from Lilburn, Georgia, who often researched medical ailments for her friends and loved her three dogs.
So, if someone’s search history alone can potentially identify them to others, what should be on the list of PII that must be carefully guarded? While many technology professionals tend to think that they know exactly what PII is (“I know it when I see it”), there are multiple approaches to categorizing it in federal regulations that may help shed light on the challenges (see Paul M. Schwartz & Daniel J. Solove, The PII Problem: Privacy and a New Concept of Personally Identifiable Information, 86 N.Y.U. L. Rev. 1815 (2011)):
The Tautological Approach
This approach defines PII as any information that identifies a person. For example, the Video Privacy Protection Act (VPPA), which was enacted to keep private the sale and rental of home videos, defines PII as “… information which identifies a person.” Of course, the problem with this approach is that it provides no actual guidance on what does and does not identify a person. For example, a Social Security number (SSN) would be included, but what about a U.S. postal address, which might identify a family rather than an individual? What about an IP address?
The Non-Public Approach
The Gramm-Leach-Bliley Act (GLB Act), enacted to regulate how personal information is used by financial institutions, defines PII as “… nonpublic personal information.” The issue with this type of definition is that it, too, fails to specify which information can identify a person.
The Specific-Types Approach
The federal Children’s Online Privacy Protection Act (COPPA), enacted to help ensure the privacy of children under the age of 13 while using the Internet, uses this third approach by providing a list of items that constitute PII, including: first and last name, physical address, SSN, e-mail address, telephone number, etc. COPPA goes further to include “… any other identifier that the [Federal Trade Commission (FTC)] determines permits the physical or online contacting of a specific individual.” Among the identifiers the FTC has added are IP addresses. While this approach is the most concrete and allows the list of PII to expand in the future, it still fails to fully encompass future technologies that may be used to identify individual people, or combinations of pieces of information that together identify an individual.
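One practical appeal of the specific-types approach is that an enumerated list can be checked mechanically. The sketch below is a deliberately naive illustration, not a production PII scanner: the regex patterns and the `find_pii` helper are assumptions chosen for this example, and real detectors must handle formatting variants, context, and false positives (an IP-address pattern, for instance, will also match other dotted numbers).

```python
import re

# Naive regex heuristics for a few COPPA-style identifier types.
# These patterns are illustrative assumptions only; production PII
# detection requires context-aware tooling, not bare regexes.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "ip_address": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def find_pii(text):
    """Return a dict mapping identifier type to the matches found in text."""
    hits = {}
    for label, pattern in PII_PATTERNS.items():
        matches = pattern.findall(text)
        if matches:
            hits[label] = matches
    return hits
```

For example, scanning the string “Contact jane@example.com or call 555-123-4567” would flag an email address and a telephone number. The limitation the article describes shows up immediately: a scanner like this can only ever find the types it was told about, which is exactly why COPPA delegates authority to the FTC to extend the list.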
As regulators continue to grapple with the definition of PII, it is incumbent on technology professionals to keep their systems secure and to update security policies to encompass advances in technology. While no single approach may be bulletproof, a continued focus on security and privacy will help ensure the best outcomes.