It seems the one thing technologists, companies, consumers, and policymakers can all agree on these days is the need for data privacy reform. But what do meaningful data privacy rights look like, and how do we ensure that they actually protect all members of our society? What good are more robust privacy rights if individuals are unable to exercise them? It is certainly important to establish regulations and to hold companies accountable, but companies must be held accountable to each of their users and stakeholders, not just to agencies or enforcers.
For too long, privacy has remained a luxury, because privacy policies and solutions are often designed for the most privileged. For example, although every user must agree to a technology, service, application, or platform's terms and conditions, including a privacy policy, few people meaningfully understand or access their rights. In theory, we have all exercised our opt-in and opt-out choices. We can decline third-party tracking, delete our search histories, and access information about what kinds of personal information are collected and how they are used. In reality, though, few of us actually do any of the above. A user must be well-educated, digitally literate, English-proficient, and in possession of abundant free time to exercise their privacy rights in any meaningful way.
Protecting Everyone’s Data
There is unfortunately a significant overlap between the populations who are most vulnerable to privacy violations and material harms and those who are unable to access and exercise their privacy rights. For example, Asian Americans Advancing Justice (AAJC) works with community partners who represent and serve Asian American elderly populations that may not be English-proficient. Up to half of some Asian American communities report limited English proficiency. Some only have access to mobile technologies and limited broadband. Does a cookie notice really make a difference if it appears only in English, even on a non-English site? How about if it leaves no option except an "I accept" button that takes up half of a mobile device's screen and prevents a user from proceeding without consent?
Most users simply do not take the time, or do not have the ability, to understand what they are consenting to when they click the "I agree" button that follows several screens full of legal language. And when they do agree, elderly users may be among the most at risk. Without an understanding of how tracking, data collection, and data sharing work, aging adults with limited English proficiency can fall victim to fraudulent offers and demands, have personal information exposed, or unknowingly allow their sensitive data to be used for purposes they would never knowingly authorize.
Other vulnerable populations include differently documented immigrants, whose data may be shared with entities like U.S. Immigration and Customs Enforcement (ICE) or the Department of Homeland Security without their knowledge or consent. One recent privacy violation that brought material harm to a vulnerable population was reported last year: various mobile applications targeting Muslim users were found to be selling users' geolocation data to data brokers, contractors, and military entities.
According to the Vice story, “Even if a user examines an app’s privacy policy, they may not ultimately realize how many different industries, companies, or government agencies are buying some of their most sensitive data.” And even after the fact, is the average user able to take action against the violation? Legal options are often limited to those with the most resources. Our most vulnerable populations are often unable to bring private action against a company doing such egregious things or seek compensation for the use of their data even if they have the right to it.
Finding the Right Solutions
While it is important to ensure companies make informed consent, data portability, deletion rights, and data reports available to users, it is also critical that users are able to take advantage of these rights and avoid privacy risks and harms. All communities, but especially communities of color, the elderly, non-English speakers, and other historically marginalized and disadvantaged populations, must be educated. They need to understand why data privacy is so important and how they can take steps to protect themselves. That requires data privacy reform.
Our policy suggestions include making privacy policies, notices, and choices easy to understand and available in multiple languages. If a company can serve ads to and profit from users who speak languages other than English, it should be obligated to uphold the promises it makes to those users in the same languages they speak. The process to opt out of, correct, delete, or move data should be simple, easy, and quick (and also available in multiple languages). Companies and legislators alike should be conducting research, collecting feedback, and testing solutions directly with community members to ensure the user experience is accessible and the protections are actionable. Public interest technologists must lead the way.
To provide meaningful privacy rights to our communities and create meaningful data privacy reform, we must first understand the needs and lived experiences of our communities. Designing privacy policies and solutions without the input, questions, and concerns of the very people we aim to serve will inevitably fail. Privacy reform must center our communities. Public interest technologists can help.
Emily Chi is the Assistant Director of Telecommunications, Technology, and Media at Asian Americans Advancing Justice-AAJC, where she is responsible for developing and providing public policy research, strategies, analysis, and education on telecommunications, technology, and media diversity.