The idea and practice of public interest technology (PIT) isn’t exactly mainstream.
There are many people making inroads, though. New America’s Public Interest Technology University Network (PIT-UN) is just one example. The group of 36 public and private universities is creating degree and certificate programs that cut across disciplines, educating scientists, technologists, and sociologists, among others, to go out and create PIT in the world. These universities are also working directly with public and private organizations, creating fellowships, apprenticeships, and internships to bring these newly minted PIT practitioners into every level of our economy.
Harvard University has been part of PIT-UN from the very beginning. This fall, the school will launch a new Public Interest Technology Lab, helmed by Latanya Sweeney, a Professor of Government and Technology in Residence within the Department of Government who previously served as the CTO at the Federal Trade Commission. The Commons sat down with Prof. Sweeney, who is also the director of Harvard’s Data Privacy Lab in IQSS, and Jinyan Zang, research assistant and Ph.D. candidate in the Department of Government, to discuss the plans for the new PIT Lab.
Tell us about the PIT Lab at Harvard. What was the impetus for this new work?
Zang: We already have an innovation pipeline on campus — an established path for technology-based startups and for projects people work on in class or in research to roll out: find investors, get mentors, incubate, scale, and become private enterprises. Obviously, there are lots of problems that technology can address that are independent of the for-profit road. You have foundations and governments and other sorts of entities that are really excited about these technology projects and want connections to the researchers and students on campus, to take those projects from hypothetical and cool to really being able to make a difference in the world. That’s the idea behind the PIT Lab — how to address that need, create a convenient space, and give these kinds of projects access to our pool of resources.
As Chief Technology Officer of the FTC, Professor Sweeney saw a clear need to figure out how to connect policy and policymaking with the technology being developed — to inform both worlds about where the needs are, where the resources are, and how to connect the two. That focus started the work we’re doing teaching classes in the Government Department here at Harvard — creating a program specifically around technology science, where research and learning center on questions of how technology impacts society and how we develop technology to address the needs of society.
What were some of the first implementations of the program?
Prof. Sweeney: We’ve been on an incredible journey these last few years. One of our classes — it has taken on different forms, so we call them the “save the world” classes — is where all these pieces come together. We take students regardless of their academic training and show them how they can have an impact on the world by defining a project. We teach them how to think about it in ways that expose unforeseen consequences and help navigate the world to a better place. Those projects have changed business practices at companies like Facebook, they’ve produced new laws, and regulations have been spawned by them. And these are students who, after understanding how to think about these problems, have actually made a real-world impact.
The idea of having what we call the tech lab — the Public Interest Tech Lab — is a natural extension of exactly what we’ve been doing for years, but now with a larger set of scholars and students, to have an even broader impact on the world.
Is there a reason that public interest technology seems to be having its moment?
Prof. Sweeney: One way of thinking about this trajectory over time is to go back to the origin of the [Harvard] Data Privacy Lab, from which [the PIT Lab] started building. The Data Privacy Lab came about because of the conflict between privacy and the technology companies that were dominating the scene — how could society enjoy privacy with these new technologies? But in reality, privacy was only the first wave of challenges that technology was going to bring to our society. We were also on the front lines with the first projects to show discrimination in algorithms. Now there’s a whole area around it — this idea that an algorithm could break a law, break the Civil Rights Act. An algorithm being against the law in terms of its behavior was profound.
Then, as we flash forward to 2016, it’s almost as if our entire society, on every democratic value, was being challenged by what technology allowed or didn’t allow. So we found ourselves all of a sudden expanding into the election space. We were among the first to talk about the bots we found in the 2016 election, and among the first to report the weaknesses of voter registration websites. And all of a sudden, you see that an arc of our society has been around technology creating a new form of technocracy, and our response to that has been a broadening of scope.
Now we come to 2020, and we have major projects — the 2020 census and the 2020 election, since many of the vulnerabilities we found in 2016 are still there. So we had to come up with new innovations in these spaces. And then came the pandemic, and we’ve found ourselves innovating in that space as well. It almost feels at this moment in time that if we don’t find a way to do what we do bigger than what’s within our small shop — to explode it into the larger scene so that we get an army doing this — we’re going to lose the battle. We’re just going to be overwhelmed by this tidal wave.
Jin mentioned that the first two projects to come out of the Harvard PIT lab are related to COVID-19. What are they?
Prof. Sweeney: Around May, Google and Apple got together and said they were going to help with contact tracing, and they came out with anonymous apps, which sounds really great, because the idea is that you could know whether or not you’ve been exposed to COVID without anybody necessarily knowing who you are. But when you actually understand what’s going on with contact tracing, the question is: given one person I know is infected, how do I get the other people that person may have infected to quarantine so they don’t keep infecting others? If I have an anonymous app and people are pushing the button — sometimes wrongly — I also create a lot of false positives, building a larger and larger network of people who believe they should be quarantined. So in reality, human interaction in contact tracing is quite effective. A human who will sit and talk with the person who’s infected can help me understand who I really care about protecting, and which of those encounters are incidental and shouldn’t really count.
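To make the false-positive point concrete, here is a toy sketch — not any real app’s logic, and with made-up names and contact data — of how one mistaken self-report in a purely anonymous notification scheme grows the set of people told to quarantine, with no human in the loop to filter out incidental encounters:

```python
# Toy model only: illustrates how self-reported, anonymous exposure
# notifications can balloon the quarantine list. All data is hypothetical.

# Hypothetical contact log: who was near whom during the exposure window.
contacts = {
    "A": ["B", "C", "D"],
    "B": ["E", "F"],
    "C": ["G"],
    "D": [],
    "E": ["H"],
}

def notified_set(reported_positives, contacts):
    """Everyone flagged when each reported positive notifies all contacts."""
    flagged = set()
    for person in reported_positives:
        flagged.update(contacts.get(person, []))
    return flagged

# One true positive: only that person's direct contacts are flagged.
print(notified_set(["A"], contacts))        # {'B', 'C', 'D'}

# Add one mistaken self-report ("pushing the button wrongly"):
# the flagged set grows, and nobody can check which encounters mattered,
# because the anonymous app has no human tracer in the loop.
print(notified_set(["A", "E"], contacts))   # {'B', 'C', 'D', 'H'}
```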
We began asking how we could build technology to make human contact tracing effective. How do we solve the policy problems, the privacy problems, and the technological problems to push as much privacy and utility into the system as possible? We came up with two really innovative technologies. One is called TraceFi and the other is a new kind of app platform. Both of them share location information, but with cross-checking, privacy, and security combinations. The only person who actually sees the data is our human contact tracer [who the app is built for], and they only see the data when there’s a positive case that involves that data. We find that this is the way forward and can make a huge difference in effective contact tracing. Harvard pledged at the beginning of the pandemic that any intellectual property produced at Harvard would be free. We are honoring that pledge, and will provide the infrastructure needed to those who can actually use it with human contact tracers.
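The interview does not describe how TraceFi or the app platform is actually built, but the access model Prof. Sweeney describes — location data is collected yet visible only to the human contact tracer, and only when a confirmed positive case involves it — could be sketched roughly like this. The class names, the credential check, and the data layout are illustrative assumptions, not the real design:

```python
# A minimal sketch, assuming a simple "sealed store" access model:
# anyone can submit location records, nobody can browse them, and only an
# authorized human contact tracer can read the slice tied to a confirmed case.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class LocationRecord:
    person_id: str
    place: str
    timestamp: str

@dataclass
class SealedStore:
    _records: list[LocationRecord] = field(default_factory=list)
    _confirmed_positives: set[str] = field(default_factory=set)

    def submit(self, record: LocationRecord) -> None:
        # Anyone can contribute data, but no one can browse it afterward.
        self._records.append(record)

    def confirm_positive(self, person_id: str) -> None:
        # Only a confirmed case unlocks anything downstream.
        self._confirmed_positives.add(person_id)

    def records_for_tracer(self, tracer_credential: str, case_id: str) -> list[LocationRecord]:
        # Data is released only to the authorized tracer, and only the
        # records that overlap with a confirmed positive case.
        if tracer_credential != "authorized-tracer":  # stand-in for real auth
            raise PermissionError("only the human contact tracer may view data")
        if case_id not in self._confirmed_positives:
            return []  # no positive case involving this data: nothing is revealed
        case_places = {(r.place, r.timestamp)
                       for r in self._records if r.person_id == case_id}
        return [r for r in self._records if (r.place, r.timestamp) in case_places]
```

The ordering is the point of the sketch: submitting data reveals nothing, and it is the confirmation of a positive case, checked by a human tracer, that unlocks only the overlapping records — mirroring the “they only see the data when there’s a positive case that involves that data” description above.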
Why is public interest technology important?
Prof. Sweeney: The reality is that technology is taking over the world. An arbitrary design decision by someone in technology changes the way we live our lives. But what happened to democracy? We don’t vote for those technologists. We don’t even know their names. And yet Twitter, for instance, now decides the rules of free speech. The list goes on and on. So how do we shore up and create the counterbalances that are needed for our democracy to survive? This is what’s at the heart of public interest technology. PIT is how we create that accountability.