Data Privacy is a Right Not a Privilege

My public interest technology journey began when I was working in the public sector over a decade ago and started utilizing big data. The interdependency of data and technology quickly became obvious, as did the amount of personal information that was accessible. I was tasked with developing and implementing data privacy protocols, a practice I continue today. Most recently, my work on the 2020 United States Census as a Public Interest Technology (PIT) Census Fellow at New America exposed data challenges I couldn't have imagined years ago, chief among them how easy it has become to reidentify an individual when too much information about them is released. 

As a PIT Census Fellow at New America, I was charged to “work as a technical project manager, translating between the government, tech, and civil rights organizations to ensure the IT readiness of the 2020 Census.” There was a lot to do. 

The 2020 Census involved a number of technical changes. Most notably, it was the first time census responses could be submitted online. It was also the first time the census would be promoted on social media, making misinformation and disinformation real concerns. In addition, census data products containing information about age and race would be released into a commercial data and computing environment more powerful than ever before. 

During my tenure — and in partnership with The Leadership Conference on Civil and Human Rights — I worked with dozens of civil rights organizations, researchers, and other census stakeholders on the issues of IT readiness, cybersecurity, data quality, confidentiality, and more. 

One of the most complex and interesting topics concerned the Census Bureau’s disclosure avoidance system, specifically the decision to make differential privacy the main component of that system. Differential privacy allows the bureau to mathematically balance privacy and data utility. The Census Bureau settled on this methodology after conducting an experiment on the 2010 data products to see whether it could reveal the underlying dataset. What it found was jarring: information from individual census responses was discoverable for 52 million Americans, and that may be just the tip of the iceberg. (Read more about this in my recent op-ed in The Hill.)
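The core idea behind differential privacy can be illustrated with a toy example. To be clear, the Census Bureau's actual disclosure avoidance system is far more elaborate than this; what follows is only a minimal sketch of the basic mechanism, calibrated random noise, with function names of my own invention:

```python
import random


def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution,
    built as a random sign times an exponential draw."""
    sign = 1 if random.random() < 0.5 else -1
    return sign * random.expovariate(1.0 / scale)


def private_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    person changes the count by at most 1), so Laplace noise with
    scale 1/epsilon is enough to mask any individual's presence.
    """
    return true_count + laplace_noise(1.0 / epsilon)
```

The parameter epsilon is the privacy/utility dial: a small epsilon means larger noise and stronger privacy but less accurate published counts, while a large epsilon means the opposite. Balancing that trade-off for thousands of published tables is the hard policy question the bureau faced.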

Confidential information and the potential for reidentification are precisely why commercial data brokers and nefarious actors want census data: they can’t get anything like it anywhere else. This should be a stark reminder that our data privacy practices need to evolve. 

Data privacy has two components: agency and transparency. We are asked for our data all the time, including through the requirement to participate in the census. What concerns me is that once our data is handed over, we have little to no visibility into, or ownership of, how it is used. Website and app agreements, as cumbersome as they can be, rarely spell out how our data will be used, and many organizations have no data policy at all, leaving us all vulnerable to exposure.

But this can be turned around. By incorporating new ways of modeling and implementing data privacy protections, we can enhance existing practices and protect vulnerable communities. In this issue of The Commons, I’ve asked a few colleagues I met through my census work to explain the unique ways their communities are harmed by a lack of data privacy, giving more examples of the uphill battle we all face. This includes commentary from Corbin Streett of the National Network to End Domestic Violence, a Q&A with Maya Berry of the Arab American Institute, and a story by Emily Chi of Asian Americans Advancing Justice. 

Collectively, they give good, relevant, and timely examples of how practitioners can expand their existing data privacy methods. My hope, and my ask of fellow practitioners, is that we move beyond traditional data privacy practices and incorporate some of the suggestions provided. After all, data privacy isn’t a privilege, it’s a right.

Maria Filippelli is a Public Interest Technology Census fellow at New America. Filippelli has a background in urban planning, civic technology, and data science. For the past two and a half years, she has worked with dozens of civil rights organizations to navigate the technical changes to the 2020 Census.