None of Your Business: Claiming Our Digital Privacy Rights, Reclaiming Democracy
We plug into the real-world Matrix – the digital Wild West of surveillance capitalism that dominates this Age of Information. Behind it is the unholy alliance between Big Tech and Big Brother. Privacy is the first casualty and democracy dies with it. Our guide is Cindy Cohn, executive director of the Electronic Frontier Foundation, with her decades of experience challenging digital authoritarianism.
Featuring
Cindy Cohn, the Executive Director of the Electronic Frontier Foundation since 2015, served as EFF’s Legal Director as well as its General Counsel from 2000 to 2015. Among other honors, Ms. Cohn was named to The Non-Profit Times 2020 Power & Influence TOP 50 list, and in 2018, Forbes included Ms. Cohn as one of America’s Top 50 Women in Tech.
Credits
- Executive Producer: Kenny Ausubel
- Written by: Kenny Ausubel
- Additional production and writing: Leo Hornak
- Senior Producer and Station Relations: Stephanie Welch
- Program Engineer and Music Supervisor: Emily Harris
- Producer: Teo Grossman
- Host and Consulting Producer: Neil Harvey
Resources
Cindy Cohn – The Climate Fight is Digital | Bioneers 2024 Keynote
This is an episode of the Bioneers: Revolution from the Heart of Nature series. Visit the radio and podcast homepage to find out how to hear the program on your local station and how to subscribe to the podcast.
Subscribe to the Bioneers: Revolution from The Heart of Nature podcast
Transcript
Neil Harvey (Host): In this program, we’ll plug into the real-world Matrix – the digital Wild West of surveillance capitalism that dominates this Age of Information. Behind it is the unholy alliance between Big Tech and Big Brother.
Privacy is the first casualty and democracy dies with it. Our guide is Cindy Cohn, executive director of the Electronic Frontier Foundation, with her decades of experience challenging digital authoritarianism.
This is “None of Your Business: Claiming Our Digital Privacy Rights, Reclaiming Democracy”… on the Bioneers: Revolution from the Heart of Nature.
Host: In the year 2000, just 25% of the world’s information was digitized. Shoshana Zuboff – the author of “The Age of Surveillance Capitalism” – points out that when the dot-com bubble burst, Google was a small startup with a potent search engine, but scant revenues. Until – she says – Google “learned how to combine massive data flows of personal information with advanced computational analyses to predict where an ad should be placed for maximum ‘click through.’”
It would prove a world-changing act.
But the catch is that this magic trick of prediction was dependent on an insatiable appetite for data. That hunger soon scaled by orders of magnitude with the advent of smartphones, apps, and all manner of cameras, devices and sensors – and now AI.
Says Zuboff: “User ignorance was understood as crucial to success. Each new product was a means to more ‘engagement,’ a euphemism used to conceal illicit extraction operations.”
In other words, while you’re searching Google, Google’s real purpose is searching you – as secretly and profitably as it can get away with. As the saying goes, when something is free, you are the product – because knowledge is power.
It’s asymmetric warfare that Zuboff compares to “one-way mirror operations.”
Facebook followed close behind Google in this systematic abolition of personal privacy. So would the other Tech overlords, setting the table to dominate the 21st century economy as the richest and most powerful corporations in history.
Cindy Cohn is Executive Director of the Electronic Frontier Foundation, the world’s oldest and largest digital civil liberties organization. She says ceding control of your data brings all manner of unintended consequences – for everyone.
Cindy Cohn spoke at a Bioneers conference.
Cindy Cohn (CC): The surveillance part of it is very pervasive, and it’s built into everything. We know that a lot of the inferences that these systems draw are not right and are discriminatory. If Facebook knows that you’re a woman, you’re unlikely to see ads for becoming a CEO. If you’re a Black person, you’re more likely to see ads for becoming a bus driver and not ads for something that might pay a higher salary. Targeted advertising is bad just in and of itself, but it also really can supercharge a lot of things in our society that we’re trying to combat.
But even when they are right, I don’t think that’s the world that most of us want to live in. So it’s been disturbing to work so hard to try to build a space for private conversation and private activities online and see this business model just crush that.
Host: That business model achieved warp speed after 9/11. Any quaint concerns the government had about restrictive digital regulations favoring online privacy vaporized in a national-security fever dream to attain what the Pentagon dubbed “Total Information Awareness.” As the CIA’s chief technology officer put it: “Collect everything and hang onto it forever.”
Naturally, the government turned eagerly to private tech corporations. The cover of national security neatly allowed the military-intelligence complex to bypass Congressional regulation and pesky legal and constitutional privacy protections.
CC: So these two powerful forces in our society – the people who want to make money and the criminal justice or national security justice, whatever you want to call them, the cops – they’re aligned in wanting to build an Internet where there is no privacy, there is very weak security, and we are not in charge.
And I think it’s really important that we talk not only about the corporate side but the governmental side. The surveillance part of surveillance capitalism isn’t just talking about the companies. And there are real serious ramifications for people around the world, and for movements around the world.
Host: One iconic rallying cry of early digital enthusiasts was: “Information wants to be free.” It began as a liberatory vision of democratic access to information and institutional transparency. But in practice, what tech monopolies want is free unfettered access to your information to claim as their private property.
So, how did we get here?
CC: I got involved with EFF in the early ‘90s, and my hair was not silver then. The promise of digital technologies to me involved building a secure and private way for more people to talk to each other than they could with technologies before – so this idea that we have to be able to figure out how to make change among a wider range of people, the technology made that possible. Doesn’t make it inevitable, but it made it possible. And that was one of the founding things.
I’m a lawyer by training, and I did human rights work before I stumbled into the digital age, so I’ve always been interested in how technology can facilitate the promise of human rights. So, you know, how can we build a movement that will actually help us better control our lives?
How can we make all the world’s knowledge available to all the world’s people, and how can we make voices heard that couldn’t be heard before? Those were three things that I saw – and I’m not the only one – that we saw as the possibility of this new global digital technology. And what happened, in my view, is that the B-School people got involved, right? The business school.
Host: In the 1990s, there was broad bipartisan consensus about the promise of these novel global digital technologies: to connect people, facilitate the sharing of knowledge and culture, and advance democracy.
But already by the 1980s, in backroom meetings between Congress and the Reagan and Bush Administrations, the interest being served was not the public interest. Instead, federal policy would ensure that the nascent world wide web would provide private enterprise with the biggest profit-making opportunity in history – unregulated by government.
The Clinton Administration also excluded the public from the table, readily handing this game-changing global communications nervous system over to the so-called “free market” to sort it out.
But, of course, it was Congress, not the free market, that shelled out copious taxpayer dollars to build the information highway.
Back in those Before Times, one of Cindy Cohn’s first projects focused on the most critical political-economic variable at the heart of the matter: privacy.
CC: I met some folks who were involved in the free software movement. They asked me one day if I would take on a lawsuit involving freeing up encryption technology from government regulations. Encryption technology is how you have privacy online. And at the time, in the 1990s, it was controlled by the U.S. government like a munition. So on the U.S. munitions list, along with, you know, surface-to-air missiles and tanks, was software with the capability of maintaining secrecy. We sued, and the regulations were thrown out as unconstitutional.
So, I believe strongly that we all deserve the right to have a private conversation. And that’s true whether we’re using digital services or non-digital services that it’s part of our human rights. It’s part of us as humanity. So a good part of my career has been spent trying to make that happen.
Host: The landmark court decision meant the public had the right to use encryption technology to keep their internet communications private – by default. The reversal of fortune came in the wake of 9/11.
During this time, Cohn and EFF launched a campaign to restrict online snooping by the National Security Agency. The hyper-secretive government intelligence branch would only be forced out of the shadows in 2013 by whistle-blower Edward Snowden…
CC: I was suing to try to stop the NSA from doing mass spying on everyone in 2006. And the government maintained that we were making it all up, we didn’t know what we were talking about; they certainly would never spy on Americans. And then they got caught. Mr. Snowden provided evidence that confirmed that what we had been saying for, at that point, seven years, was true.
And that was what inspired him, because he believes that it’s important that a government be straight with its people. And, you know, I don’t like that he’s stuck in Russia. I will tell you that the place you are standing when they take your passport away is the place you will stay. So if they want to give him his passport back, he’d be delighted to come home. But they need to drop the death penalty-based espionage charges that they have against him, because he didn’t engage in espionage. He engaged in turning on the lights for all the rest of us and stopping the lying. And then the government had to come clean, and they had to admit it.
If it weren’t for Mr. Snowden, they would still be lying to us about what they’re doing. And that’s why we owe him a debt of thanks. And we can’t change our government unless we know the truth about what they’re doing. And as a result of what he did, we have scaled back some of the mass spying, especially the telephone records program.
And EFF is part of a large coalition that helped encrypt the web in the aftermath of the Snowden revelations. The work that I and other people did is part of why we have things like Signal, things like WhatsApp, that let people communicate securely and privately over digital networks. And we went from a very low percentage of web transactions being encrypted to well over 90% being encrypted now. EFF has a plug-in for Firefox and Chrome called Privacy Badger that blocks third-party trackers.
So both stuff that people might use directly and then stuff that’s deep in the undercurrent of our digital world, we’re involved, pushing for people’s rights at every level.
Host: After Snowden’s shocking revelations, Congress decided to solve the problem by simply legalizing much of the NSA’s illegal spying.
But it’s not just federal intelligence agencies that use and abuse surveillance capitalism’s arsenal. The tools of “street-level” surveillance are ubiquitous. They’re embedded in downtown city advertising kiosks, automated license plate readers, bodycams, drones and who knows where else.
Funding for these tools got a big boost when cities across the country funneled a significant portion of $350 billion in Covid relief to police departments, with little or no public debate.
In response, EFF created the Atlas of Surveillance. It maps the kinds of tools to which local police forces have access, and where they’re being used.
CC: Local police and the feds get this equipment without any of us knowing it, without any local accountability. And getting some local accountability, a set of ordinances that we call CCOPS ordinances, is really an important first step to figuring out what’s going on and empowering people to take the steps they need to roll it back.
We know these technologies get used on people who are engaged in climate activism, and Indigenous people, and marginalized people at a disproportionate rate. If you look at the mapping we did as part of the Atlas of Surveillance, or at another analysis of where the automated license plate readers are in Oakland – those of you who live in the Bay Area, I bet you can predict where those are. They are targeted at marginalized communities, the people who are already over-policed, and the lily-white hills not so much. What’s going on there, right?
So, EFF has a set of materials that we call surveillance self-defense materials. These are materials for people who are engaged in various kinds of activities where we think they might come under special surveillance or even broad surveillance by governments or companies. And we know that this has hit the climate justice community very hard already.
Host: Cindy Cohn says the privacy stakes are sky-high for journalists and activists swarmed by a digital armada of unreasonable or illegal surveillance tools.
Take Standing Rock, for example. As one of the most powerful environmental mobilizations of the last decade, it was organized by a coalition of Indigenous Peoples and non-Native allies to resist an extension of the Dakota Access Pipeline through Indigenous territories.
Protests on the ground mushroomed amid global media attention, and digital privacy became a fierce front in the struggle.
CC: We sent some folks to Standing Rock because they were using these things called IMSI catchers. These are fake cell phone towers that handle people’s calls, but also track who’s in the area based upon the identifying number, called an IMSI, that your cell phone provides.
So we did some research about that, and we’re continuing to track it. We think this is a pretty go-to tool for police and other law enforcement when there is mass protest activity and they want to get identifying information, or close to it, for the people who go there.
In an early IMSI catcher case that we handled, we found that law enforcement was using this technology by basically lying to a judge about what it did, and we were able to uncover the lie and require law enforcement to get an actual court order, based on truthful information about what they were doing, before using it.
So even when we can’t block it entirely, we can begin to scale it back and bring things within the realm of the rule of law. And I think that these are things that are going to be important for this community, because the climate justice movement wants to be out in the streets. It wants to be loud and wants to be making noise.
So those are some of what I think of as prevention tools that we make available. We also help people who might be collecting other people’s information – whether they’re doing reporting of one kind or another – figure out how best to keep that information safe: not just when your own information is on the line, but when the information of people who might have trusted you with it is on the line. And there’s a set of things that journalists should think about, including how to safely delete material if things start to go wrong and you don’t want to have that information in your hands.
Host: Another egregious example of digital surveillance is Pegasus. It’s spyware that can eavesdrop on calls, read texts, locate passwords, gather information from apps on a device, and activate a device’s camera and microphone to make you an unwitting spy.
In short: it can strip-search and hijack your smartphone without your knowledge or consent. Pegasus was designed by an Israeli cyber-arms company with close military ties. The Israeli government has licensed it to other governments and corporations, which use it to track, repress and sometimes kill journalists and dissidents…
CC: So the lack of a secure ability to have a private conversation online is important, and it’s not just important because I believe it’s part of dignity, but there are quite clearly lives at stake for these choices that we’re making about what kind of systems we want to build.
Host: When we return, Cindy Cohn says there are ways out of the Matrix, and our democracy depends on it.
I’m Neil Harvey. You’re listening to the Bioneers: Revolution from the Heart of Nature.
Host: What feels like the overwhelming power of Big Tech and Big Brother is actually core to their strategy: Induce a sense of inevitability, helplessness and resignation, and keep people on the dark side of the one-way mirror.
CC: I think a lot of companies would like us to feel that way, and so there’s a lot of corporate pushing in that direction. You know, very famously, the head of Sun Microsystems said, “Privacy is dead. Get over it.” It serves the interests of the tech industry if we engage in what my friend Eva Galperin calls “privacy nihilism,” right?
I mean, we can build a world where technology supports us. I just interviewed Alvaro Bedoya, who’s an FTC commissioner, and I asked him: What does the world look like if we get it right? He said we live in a world where technology supports dignity, it supports human rights, and it supports a life where we can live and work with pride.
And I live a life where the things that I do online and the places I go and who I see are my business and nobody else’s business. And I can share them if I want, but I don’t have to – that the focus of my technology is to support me, not having a secondary business model, not having a secondary interest, not having tracking.
And of course – I’m a straight up OG civil liberties person – not available to the government unless it has proper process and a warrant. You know, that’s, again, kind of the place I started in this work: the idea that we should be able to have a private conversation online.
We’re still working on it. We have lots of tools, we’ve come a long way, but that’s still the goal, and I still think we can get there.
The first thing that has to happen is that people have to believe that they can have a way out. They have to have a vision of what it looks like if we get it right. Everything flows from that. I’ve sat down in congressional offices with staffers and members of Congress and said, we need a world beyond Facebook, and they look at me blankly. They can’t imagine a world beyond Facebook.
Many people can’t leave Facebook. They can’t leave the big platforms. That’s where their community is. There are people who run their businesses there, but they ought to be able to have other tools that help them interoperate on it with a different deal than the one that Facebook is offering.
Host: After twenty years of a Wild West virtually without regulation, resistance is building to reclaim democratic governance.
The U.S. federal government and dozens of states are taking serious antitrust actions against Google, Apple, Facebook, Amazon, and Microsoft. As is the European Union.
Breaking up these behemoths can allow real competition, and it would also allow more flexibility and autonomy for users and their local communities at the technical level, in actual networks.
CC: And so the question is: How do we move from a “one platform to rule them all, let’s see if we can make our dictators as benevolent as possible” model, to one where we get rid of the dictators? Right?
The Internet started as something that was decentralized. In fact, one of the founding ideas of the Internet was that it was a communications mechanism that would stay up even if part of it went down. This is the kind of “military” view sometimes you hear of the early Internet. It’s not the only story, but it’s definitely one of them.
And so re-decentralizing the Internet, bringing us to smaller places, communities that decide for themselves what the rules ought to be, and decide for themselves who else they want to communicate with and on what terms.
It means we can experiment with business models. It means that we could have maybe a non-business model. Maybe we have the, you know, community-funded social network. There are several of these in development right now – municipal social networks, philanthropic social networks, and then of course the volunteer ones, as well.
So we’re moving from, you know, one department store to rule them all to a town with small businesses. And then let’s see what could develop there.
Host: But even breaking up these Goliaths won’t address the fundamental issues of data privacy and behavioral manipulation. It’s about a lot more than targeted ads.
These radical new conditions of the digital information society require new rights. It starts with the freedom to choose if, when and how we share our data, and who owns or profits from them.
CC: The lack of comprehensive privacy protection for us is leading to a lot of these situations.
We have a very strong anti-wiretapping law in the United States. We have a very strong one in California. If you want to wiretap a conversation – if you want to listen to a conversation – you have to get both ends of the conversation to agree. That is why there’s such a difference between how our voices and our faces are treated. We don’t have companies collecting the conversations that we have and selling them into the data broker world. That’s because there is a strong wiretap act that prohibits that.
So we can do this. Right? We did it with voice. We need to be able to do this with our faces now, and with visual stuff as well. Whether you care about corporate surveillance or law enforcement surveillance, they are the same thing, practically, most of the time. Even the big cases that we do against the NSA – that I did against the NSA – are because AT&T lets the NSA tap into the Internet backbone.
So there are corporations in every piece of this. And so if anybody says, well, I worry about law enforcement but I don’t worry about Amazon – or, I worry about Google, they’re scarier to me than law enforcement – they are misunderstanding the systems at work. It is the same, and you can’t just pick one. We have to address both of them.
Host: But the real linchpin for digital democracy and privacy rights is legislation that makes your data none of Big Tech’s business – by law and by default. The European Union has now instituted such laws and regulations protecting user privacy, and California has passed similar laws.
Cindy Cohn believes it’s the responsibility of government to connect crucial new rights to the public good and to the authentic needs of people and society.
CC: One of the fallacies that I think people sometimes get stuck in is the idea that their personal choices are the only thing that matters. One of the things that is important to remember is that some of the things we need to fix, we need to fix through collective action. We need to write our congressmen. We need to get the law changed. We need to participate in social movements and social growth. Your own personal privacy choices – whether they’re exactly the most privacy-protective, or whatever tools you use – those are fine, and we can make those choices, but don’t get stuck there.
We need to support the development and deployment of other systems that better serve us. And our individual choices are not the only thing – in fact, they’re probably not the most important thing. We need committed political, legal and creative innovation and action in order to move toward this better world.
That work is somewhat lobbying, somewhat geeky and technical – trying to make it so that we have more control over our experiences online, so they’re not just dictated by these gigantic platforms whose terms of service nobody can read, but which we all have to agree to and are then limited by.
So smaller communities, more community involvement, more systems that let us be in charge of our experiences online. Those are all ideas that are being developed by people right now, and a lot of our work at EFF is to try to create the legal and policy space for them to grow.
Host: Cindy Cohn, executive director of the Electronic Frontier Foundation. “None of Your Business: Claiming Our Digital Privacy Rights, Reclaiming Democracy…”