Lindsey Barrett, a staff attorney at Georgetown Law, does not hold back when she sees bad actors in the tech space.
Part of the Institute for Public Representation (IPR) Communications & Technology Clinic, Barrett advocates for the clinic's non-profit clients in areas relating to technology and the public interest, like consumer privacy, children's privacy, and media accessibility. We spoke with her as part of our Election 2020 package, covering big issues in tech, where the candidates stand, and what politicians can do about the data abusers. This interview has been edited and condensed.
Ben Powers: What big questions around tech should we be paying more attention to?
Barrett: Corruption is a big one. There’s no major issue where the law isn’t either under-inclusive or deeply skewed towards allowing industry impunity. We can't change things unless we are able to tamp down on how lobbyists are able to shape policy, and ensure the expertise that Congress has access to is independent. Privacy, particularly consumer privacy, is a really big one under the tech umbrella. Privacy can unfortunately get siloed into talking about Facebook and Google and nothing else. But we're talking about data that's collected from us and that law enforcement has access to, in 50 different ways, and none of it is trivial.
We've progressed a lot in how we characterize privacy problems and the real risks they pose. It’s less and less a tenable or serious position for companies to come out and say that a privacy law would cause the industry and its beautiful innovation to come crumbling down. We know that's not true.
It’s also a less serious position to say that people “don't care about privacy” or “because they don't care, they don't deserve protections from it.” We've had visceral examples demonstrating why that idea isn’t true. We know that ad tech companies and data brokers hoover up every bit of information about us that they can, make assessments of us based on that, and sell them to the highest bidder.
We know that those assessments can affect or determine whether we can rent an Airbnb, go to a bar, and afford health insurance or college. None of this is trivial. As the rhetoric moves in a positive direction, we need it reflected in meaningful privacy protections: laws that make it possible for people to sue to vindicate privacy violations, executive liability where appropriate, and measures that would make privacy law something companies take seriously instead of laughing off because the risk of violating it is so low.
Powers: What are ways that people are harmed by abuses of privacy and data?
Barrett: When a company has bad data security practices, that company lets you get hacked, and now you're subject to identity fraud, with all the anxieties about time, money, and everything else that entails. Then you have actual safety risks. There's been a whole series of stories and investigations into telecom giants selling location data, and you can't come up with a more horrifying safety risk than a bail bondsman (who can have access to that data) deciding he wants to stalk his girlfriend that day. There are concrete and dangerous safety implications to consumer privacy violations.
Other harms come in how the data or technology is used. We know many important life decisions are made accessible or mediated through algorithms. The information collected about you determines how you are characterized in ways that you can't see and won't have access to. These can impact everything from educational and job opportunities to being able to rent an Airbnb.
Powers: How do you give a privacy law teeth?
Barrett: We need a basic level of privacy laws that treat privacy as a civil right and a human right. We need privacy laws that understand how privacy decision-making is constrained. We need a privacy law that understands how data uses can limit life opportunities. We need a privacy law with penalties that companies take seriously. After the FTC settlements with Facebook and YouTube were reported last year, you saw their stock prices go up. That is a concrete demonstration of how the incentives of our current privacy laws are working. We need better enforcement, whether that's empowering the FTC or creating a new agency. And we need a private right of action.
Powers: So do privacy plaintiffs not have the right to sue companies that abuse their own privacy agreements?
Barrett: Big "it depends" here. The long answer: It depends on the kind of privacy violation, because many privacy laws do not provide individuals with the right to sue violators, but instead vest enforcement authority solely in an agency and/or state attorneys general. Even with a privacy law that has a private right of action, the company might have buried an arbitration clause in its terms of service. Plaintiffs are shunted into a process with no transparency, where the company is at a strategic advantage, including the choice of arbitrator and applicable rules. And where privacy plaintiffs are able to sue, courts have long been unduly parsimonious in their perception of privacy injuries for the purposes of standing doctrine. So, the short answer: rarely. Suing is expensive and it's hard.
Powers: How are campaigns addressing these areas?
Barrett: Some candidates have pushed ideas that have become popular and other candidates have seized on those. Elizabeth Warren's tech proposals have been subsequently embraced by other candidates, which is great because they're really good ideas. [Bernie] Sanders said that he supported a right to repair after she came out for one. [Andrew] Yang said he supported reviving the Office of Technology Assessment in Congress after she did. And the whole field has had to address the problems of anti-competitiveness and consolidation in the tech sector after she put out her plan to break up big tech. Whether or not they're committed to the actual full bones of the idea or just like the way it sounds, is another question. Warren and Sanders appreciate the need for broad legal reforms and recognize a broad corruption problem.
I find myself gravitating to Warren's tech-related proposals because of her precision, ambition, and her prioritization of rooting out corruption. Her plans reflect careful deliberations and consultation on niche, but crucial, issues — she was the first to suggest a national right to repair, the first to come out for supporting reviving Congress's Office of Technology Assessment, and her push for antitrust reform has entirely reshaped the debate. Her anti-corruption reforms are crucial because at the end of the day, the biggest tech policy difficulty isn't figuring out how to draft effective laws, it's figuring out how to enact anything meaningful at all when industry has billions of dollars to burn on lobbying Congress, state legislatures, and the FTC and FCC.
Sanders has a number of exciting tech policy proposals, and exhibits a clear and necessary capacity to name villains and tackle the biggest policy problems at their root. I'm thrilled that he supports banning law enforcement’s uses of facial recognition; commercial uses are dangerous too, but he's helping to move the conversation in the right direction. His public broadband plan is a little sparse on detail but otherwise excellent. And I love that he supports a tax on digital advertising. The digital ecosystem is heavily skewed towards corporate profitability and against meaningful rights for consumers.
None of the other candidates have demonstrated a desire to constrain corporate power to the extent that Sanders and Warren have, which gives me little reason to think that their policies will be sufficient to restore any kind of equilibrium to our corporate-friendly tech policy ecosystem.
[Pete] Buttigieg criticized Warren's antitrust plan as being inappropriate for targeting specific companies, which is, well, how antitrust works. His coziness with Silicon Valley, and his enthusiasm for a "freedom of choice" framing in healthcare (another area, like privacy, where "freedom to choose" functionally means "freedom to be taken advantage of by companies"), also bode poorly for the kinds of policies he would put forward or support.
Powers: You argue that Silicon Valley is just one part of the attack on our privacy. Can you explain?
Barrett: By siloing this conversation in Silicon Valley, we're giving short shrift to companies that are doing the same thing. When it comes to ad tech and tracking, AT&T and Verizon are both in the ad tech business. Verizon had the biggest COPPA fine ever assessed until it was topped by TikTok and then YouTube. They were illegally tracking kids and making money off of them. AT&T is buying reams of location data and sexual orientation information on people from Grindr. These companies are engaging in practices like those of tech companies that are incredibly problematic, but they also have their own issues. They're lobbying against municipal broadband, against any kind of meaningful competition reform, against broadband privacy rules, and against meaningful state and federal privacy legislation. Not to mention getting net neutrality murdered.