Public Media for Central Pennsylvania
The Internet Got Ensh*ttified: Monopoly Power and the Fight for Digital Democracy

[News Over Noise episode 306 title graphic]

From the decline of Google search to the hidden economics of surveillance and algorithmic coercion, science fiction author and activist Cory Doctorow talks with Matt Jordan and guest host Jenna Spinelli about how monopolies distort our information ecosystem, erode public trust, and supercharge disinformation. But it’s not all doom and gloom: they also explore real-world strategies for reclaiming digital space—from antitrust reform to coalition building to radical imagination.

About the Guest:

Cory Doctorow is a science fiction author, activist, and journalist. He is the author of many books, most recently “Picks and Shovels” and “The Lost Cause,” a solarpunk science fiction novel of hope amidst the climate emergency. His most recent nonfiction books include “The Internet Con: How to Seize the Means of Computation,” a Big Tech disassembly manual, and “Chokepoint Capitalism,” about monopoly platforms and creative labor markets. He coined the term “ensh*ttification” to describe the decay of online platforms; the word was named Word of the Year by the American Dialect Society, the Macquarie Dictionary, and New Scientist. He works for the Electronic Frontier Foundation, serves as an MIT Media Lab research affiliate, a visiting professor of computer science at Open University, and a visiting professor of practice at the University of North Carolina’s School of Library and Information Science, and he co-founded the UK Open Rights Group.

Episode Transcript:

Matt Jordan: In the past few decades, the average legroom on a commercial flight has shrunk from 35 inches to as little as 28. And that's if you actually make it onto your flight, because now you're also contending with overbooking delays and sudden cancellations. Why? Because the airline industry is controlled by just a few major players, and when there's no real competition, consumers get squeezed.
Now, apply that same logic to the digital media landscape. If news and information flow through platforms that prioritize engagement over public interest, what happens to journalism once the public has no viable alternative besides the dominant players? What happens to democracy? To dig into these questions, we're talking with Cory Doctorow. Cory is a science fiction author, activist, and journalist who's been studying big tech's impact on society for years. His most recent nonfiction books include The Internet Con: How to Seize the Means of Computation and Chokepoint Capitalism. He also coined a term to describe how platforms degrade over time, a word so spot-on it was named Word of the Year by the American Dialect Society, the Macquarie Dictionary, and New Scientist. I'm Matt Jordan, and on this special episode of News Over Noise, I'm joined by guest co-host Jenna Spinelle. Jenna hosts and produces Democracy Works, a podcast from the McCourtney Institute for Democracy at Penn State and WPSU. Jenna and Cory, welcome to News Over Noise.

Cory Doctorow: Thank you very much. It's a pleasure to be here.

Matt Jordan: The internet has become a gateway for information about democracy, and one of the first stops that people take is often Google. How have non-competitive policies degraded the information that we find there?

Cory Doctorow: Well, I think a lot of us have an intuitive sense that Google is not serving results of the same quality that we were accustomed to. And certainly, there's been some empirical work that suggests that that's the case. But understanding where that came from really only came into focus during the antitrust trial last year, where there were some memos published detailing an extraordinary internal conflict at Google in 2019, when the company became quite alarmed that it had stopped posting the kind of growth that it had historically posted. And there was a good reason for that. Google's got a 90% market share. We already search for all the things we're ever going to search for. Any fool thought that comes into our head, we type it into a search box, and every search box is connected to Google. And so, they weren't going to grow by signing up more people, and they weren't going to grow by getting the people who were signed up to do more searching. So, what were they going to do to continue to post growth, please Wall Street, and maintain the incredible earnings-to-valuation ratio that's such a huge advantage for them and the source of a lot of money? In the memos that the DOJ published, which it acquired through discovery, we see that at that moment, two Google executives came into conflict with one another. One, Prabhakar Raghavan, was the head of revenue. The other, Ben Gomes, was the head of search. And Prabhakar Raghavan's idea, broadly speaking, was that if they made search worse, so that the first time you searched, you didn't get your answer and you'd have to refine your query twice, three times, four times, that would be more bites at the apple, more chances to show you ads, and they could post growth this way. Gomes and Raghavan go back and forth pretty ferociously, and in the end, Gomes is sent to the hinterlands. He's running educational technology for Google now. I guess that makes him the Chromebook czar or something.
And Raghavan becomes a very important person at Google, and Google's quality starts to precipitously decline right about that time.

Matt Jordan: So, what does that do to citizens who are trying to get a sense of how the world is going? What does that degradation do to their ability to be informed?

Cory Doctorow: Well, I think that for a lot of us, the first effect that we noticed was that it was like being partially lobotomized, because we had stopped remembering where files were on the internet, and we just remembered which search terms we needed to type into Google to find them. And so, the first effect, I think, for people who are really engaged with this stuff, was that a lot of our own personal touchstone references became harder to find and may just have disappeared at that point. Subsequently, you get a Google that just seems a lot more vulnerable to search engine optimization in every domain. Commercially, this has been the subject of a lot of commentary: if you're trying to find reviews of the best air purifiers or the best new laptop or whatever, you are just going to find the people who are best at SEO, not the people who are best at reviewing technology. A lot of them were affiliated with some of the very big brands like Forbes and so on, who just had this hidden constellation of services designed around SEO slop and generating affiliate fees. And then there are people who just want to get political news. I think a lot of those SEO techniques became very useful for extremely well-funded organizations that ran public advocacy campaigns, like tech shops or SEO shops. And it was the kind of PragerU-ification of all of our news of every kind. And you're just getting dragged into some very weird corners of the internet.

Jenna Spinelle: So, Cory, a lot of what you've been describing, the deterioration of search engines and also of social media platforms and tech platforms more broadly, seems to line up with the rise of populism around the world. And I wonder to what extent some of that behavior, some of those trends, may actually just be a reaction to what's been happening in our tech environment.

Cory Doctorow: Well, this is where the fact that populism has more than one meaning becomes very useful. Populism in its original sense is a leftist, anti-authoritarian, anti-big-business movement. What's sometimes called right-wing populism is a little weird; it's like calling something left-wing fascism, which doesn't really make any sense, but we know what we mean when we say right-wing populism. It has, I think, some of the same vibes as populism, historic and contemporary, like the sense that big business is ripping you off and that life isn't as good as it used to be. You can hear that in the MAGA right. And if you close your eyes and you don't listen very carefully, it can sound a little like a Bernie Sanders rally. Naomi Klein talks about this in her wonderful book, Doppelganger. She talks about how often the right has these warped, mirror, bizarro-world versions of leftist causes, and she's not the first person to observe it. As she points out, in the 19th century, they used to call antisemitism the socialism of fools. If you have observed that there's a group of rich people who seem to be running the world to their own benefit, to the detriment of everyone else, and you come to the conclusion not that we live under a system that inevitably produces such an elite, and that this is a systemic problem, but instead that it's because there are Jewish bankers secretly running the world, you've gotten most of the way there and taken a horrible turn off the course. And so I think that when you live under conditions of monopoly, as we do, with most of our sectors collapsed into just a handful of firms, a cartel capturing their regulators, completely unable to be disciplined by their workers, holding the whip hand over them, it's easy, and I think correct, to feel like you are living in an environment where you are the food for the machine, and also that you can't trust what anyone tells you.
I'm not in any position to assess most of the truth claims that I need to get right in order to survive the day. I'm not going to audit the software on the anti-lock brakes on my car or whatever. And yet, I'm pretty sure that the institutions we rely on to validate claims about what's safe and what isn't, what's good and what isn't, are not good. And they weren't good before the unscheduled rapid mid-air disassembly that Elon Musk is doing now. I have chronic pain. And when I lived in the UK, my wife had a job that came with private insurance. So, I went to see a fancy psychopharmacologist on Harley Street, where all the fancy doctors are. And he said, I've got great news for you. Opioids are safe. You can just take opioids every day for the rest of your life, and you won't have any pain. And I did my own research, and I concluded that the pharma companies were run by evil billionaires who would murder me for $1, and that the regulators who were supposed to stop them from doing that were in their pocket. And when I heard anti-vaxxers talking about why they weren't going to get the COVID vaccine (which is something I believe in; I've had all my jabs, I get perfect 5G reception no matter where I am, I glow in the dark, I've had so many vaccines), when I hear them say, well, I don't trust the pharma companies and I don't think the regulators are keeping them honest, I don't think they're wrong about that. I think that's perfectly rational. And so, I think that regulatory capture and the ability to abuse people with impunity are downstream of monopolization. And tech monopolies are among the most prominent and harmful monopolies that we come into contact with on a daily basis. And as digital tech finds its way into other industries, some of the tech industry's favorite scams become part of those other industries.
Nurses now are booked through gig apps that check their bank balance, and if they owe a lot of credit card debt, the app offers them a lower wage, because people who are financially desperate are willing to take lower wages. I think that does make people ripe for a politics of rage and a politics of weaponized skepticism: media literacy gone wrong, where no matter what someone says, you say, well, you can't trust an expert, and it just leaves you in this epistemological void where you're a sucker for any con man who sounds like they know what they're saying.

Matt Jordan: It's interesting, the early impulse of the internet was this huge democratic spur. I remember, as a media studies guy, that in the early days we would have these professors come in wearing parachute pants, telling us about the Arab Spring and the Jasmine Revolution, that this was the way of the world, that the internet was inevitably democratic. But I think as it's gotten captured, as it's gotten crapified, one of the things that has happened is that the impulse toward search, toward finding things out without the gatekeepers, the curators, has been turned on its head in this bizarro world that you're talking about.

Cory Doctorow: And I don't think it's wrong to say that helping people with disfavored views find each other and mobilize is a big change to our democratic system. I am a person who spent his teens and 20s riding a bicycle around Toronto, wheat-pasting up handbills, trying to get people out to street marches. If you don't think that social media is an important change in how we mobilize people in the streets, I've got a bucket of wheatpaste. You spend a couple of nights out in subzero weather and come back, and you tell me whether you think social media is important to organizing movements. But I agree that the capture of these platforms was only possible because the platforms themselves collapsed into such a small number of speech forums. When there were a lot of different speech forums that weren't all under one roof, that you couldn't just lay hands on, when there wasn't one throat to choke, it was harder for states to exercise control just as a logistical matter. There might be hundreds of similarly sized places where people are gathering to talk, but once it's all on Facebook, you just have to suborn Facebook. And once Facebook is a listed corporation, and Mark Zuckerberg, although he controls the majority of the voting shares, is highly exposed to changes in Facebook's share price, more so than any other person in the world really, then there are a lot of ways to exercise leverage over those platforms. And of course, the people who run the platforms are themselves billionaires and are, to a large extent, class allies of the authoritarian project to make being a billionaire stable. Being a billionaire is intrinsically unstable. This is Thomas Piketty's idea: that if there's enough inequality, eventually you're spending so much money to stop people from building a guillotine on your lawn that you might as well just build some hospitals so they stop trying.

Matt Jordan: If you're just joining us, this is News Over Noise. I'm Matt Jordan, and I'm joined by guest co-host Jenna Spinelle, talking with Cory Doctorow, a science fiction author, activist, and journalist, about the threat an oligopoly-owned digital media system poses to democracy. One of the things that was always so fascinating about the early internet, there's an interesting quote about it from C. Edwin Baker, who was an early proponent of public media: "The key goal, the key value served by ownership dispersal, is that it directly embodies a fairer, more democratic allocation of communicative power." So this notion that distributed communicative power is also distributed risk; it's a kind of distributed due diligence in a democracy, allowing deliberation to go on that is not centralized. And it's an interesting twist or turn that has happened as platforms have taken over: everything's become much more centralized. The whole regulatory structure, or deregulatory structure, that allowed platforms to monopolize things was, of course, sold as the antithesis of centralization, the idea that free markets are the only thing keeping us from becoming Stalin. But what has happened is that, in fact, those unregulated markets that allow for monopoly and monopsony power have created a centralized, non-distributed communicative environment. That means we have less opinion, less experience to draw from. We have less information as a democracy.

Cory Doctorow: And you said it's deregulatory, and then you said it's regulatory, and then you said it's deregulatory. And the reason for that is that it's all of the above. It's not just that we deregulated mergers and allowed these companies to merge to monopoly and so on. It's that a lot of our regulatory response to the obvious, undeniable, and often very urgent pathologies of these monopolistic platforms has been regulation that itself started from the assumption, really, that the monopolies were natural and permanent, and that anything that created a compliance burden that would stop someone from entering the market was tolerable because no one was going to enter the market anyway. The internet was fully cooked. It also gave the platforms ammunition to use in policy forums and in the court of public opinion to argue that we shouldn't encourage more competition because they've been asked to step in and fill a safeguarding role with respect to their users. So, all these rules about hate speech, harassment, disinformation, and so on, each of which amounts to a mandate for platforms to observe, surveil, and intervene in the communications of billions of users, really cut against any effort to, for example, say, oh, well, you also have to have an API, a programmatic gateway, so that someone who leaves Facebook can go to a small platform run by a community group but can still communicate with the people who didn't leave Facebook. If you do that, Facebook will say, as they have: how are we going to police hate speech if we can't observe all the people on our platform? We have to be able to do things like not just observe what people say, but look at things like, well, there's a group of people who gather over here and have a private conversation, and then they disperse across the platform and have a series of public conversations.
And this is the characteristic conduct of a troll farm or a troll army who are getting together, planning a strategy, maybe for victimizing trans people or women or leftists. And if we can't do like NSA style signals intelligence on everything everyone says to everyone else, how are we ever going to stop the hate speech? And so, what we did was we created the environment in which the platforms would amass gargantuan amounts of power. And then we made rules to force them to use that power wisely. And those rules have been a disaster. They have not worked; the platforms do not use their power wisely. But those rules require that the platforms stay big, and they cut against rules that make the platforms less important. You can either make the platforms the first line of defense, or you can make it so that you can leave the platform. But you can't do both.

Jenna Spinelle: So, Matt mentioned democratic deliberation, and there are lots of non-profits and universities and even some companies that are trying to figure out how to bring democratic deliberation online. I guess I wonder if you see a path forward there, if you think that the internet and democratic deliberation are fundamentally compatible in some way.

Cory Doctorow: I think they absolutely are. And I think that even the internet as currently constituted is fundamentally compatible with democratic deliberation. Look at things like Vermont's Front Porch Forum; I don't know if you're familiar with this. It's a really exciting place where each person gets to post once a day, and by default, you only see the posts from your immediate neighborhood. And it's really effective at rallying people to address local issues. It's very civil. People on all sides of the political spectrum participate in it. It's become very central to life in Vermont. So it's clear that the fact that the sunshine can't reach the forest floor is choking out a lot of these things, like Front Porch Forum, but also that where light does penetrate the canopy, you do get all kinds of super interesting things popping up. They may not be stable in a long-term configuration; it may be that what works for Front Porch Forum in 2025 won't work in the political environment of 2030 or 2050. That's fine. But one of the things I know from being in political movements all my life is that the fact that you did have a community with other people at some point in the past means that in the future, it's easier to return to those people.

Matt Jordan: It's interesting. One of the things we associate with democratic deliberation, though, is the notion of a public. And one of the things the internet affords are these not-exactly-public places where people who are like-minded can find each other and can say things without having to be accountable as one has to be in public. And that's a feature, I think, that we haven't quite figured out how to manage. You can hide behind anonymity. You can amplify your stuff with a trillion bots. So, having something like you're describing with the Vermont platform really requires a notion of authenticating what an authentic user is. And so always a question for me in terms of the internet is: how do you scale those things?

Cory Doctorow: Do you need to scale those things? Look, anonymity has played a really important role in democratic deliberation as well. This is the home of the Federalist Papers here in America. There's always been a place for anonymous speech in this. And anyone who's read about the red scares and so on knows that the simplistic formulation, that if you're not willing to sign your name to your political views they're not firmly held, really doesn't understand how movements change. And I would tell people that we live in a world today, although it's changing again, where things that were illegal in living memory, like so-called interracial marriage or being gay, are not just legal but widely accepted. And we view the fact that it used to be unlawful to be those things or do those things as a stain on our national conscience. So that's a really strong case for democracy emerging not just out of public forums, but out of private forums and the solidaristic energy that comes from being in a private space and discussing things. That was hugely important to movements that were about human rights and democracy; that is how these big alliances got built, and that's how they changed our world. So, I would put hiring a million bots to amplify your message in a completely different bucket from having private places or anonymous deliberation.

Matt Jordan: How much of the bad stuff that's on the internet could be fixed by better privacy laws?

Cory Doctorow: I think a lot of what's wrong on the internet has some nexus with it. It may not be all that we need to do, so it may not be sufficient, but it is necessary. So, think about things like deepfake porn, which is a huge problem in elementary and secondary schools now. And it's obviously very gendered. And I don't know that you could have stopped people from doing it outright by having a privacy law, but it certainly would have made it harder. For one thing, it would have made it a lot harder for the firms that offer these tools, not as general-purpose tools, but tools specifically designed and marketed as deepfake porn tools, to raise capital. It would have made it a lot harder for them to operate onshore. It would have made it a lot harder for them to buy ads on Instagram and other platforms. They just would have been in a much worse place. And I think that the evidence about how social media affects young people is a lot muddier than someone like Jonathan Haidt would have you believe. I think Jonathan Haidt's books and work cite a lot of research that is objectively very low quality: small sample sizes, poor research methodology, and so on. And I think that the more robust analysis says that it's a complicated picture. It helps some people, it hurts other people, whatever. But I think we can all agree that, to the extent that there are harms we're worried about that arise out of algorithmic targeting, like if you think that being exposed to anorexia content is making your kid anorexic, which seems like a pretty reasonable hypothesis to me, removing surveillance advertising from the ecosystem means that maybe you'll get algorithmic amplification of pro-anorexia content if your kid searches for it, but they're never going to have it shoved down their throat through surveillance-based algorithmic amplification. And that would be really important.
We have a lot of racial inequality in this country. It's obviously getting a lot worse now. We're not going to solve it merely by eliminating things like algorithmic bias in hiring, lending, housing, and so on. But if we do take surveillance and algorithmic filtering out of finance approvals, out of finance advertising, out of finance product matching, out of jail and bail decisions, out of child protective services, and so on, all these things where they're fed by the commercial surveillance sector and then emerge as the determinants of policy outcomes, if we just take that stuff out, it's going to be harder to do racism at scale. We built a racism machine. Getting rid of the racism machine isn't going to stop people from being racist, but they'll have to do it by hand. And that is a superior outcome to being able to automate it and do it at scale.

Matt Jordan: I'm interested, too, in a lot of the things that you write about and talk about that have to do with targeting; a lot of that has to do with gathering all this data. So, democracy is always predicated on this non-coercive lifeworld. And so many of the apps and platforms require that surveillance data in order to coerce, though they might use the word leverage, in order to make themselves more profitable.

Cory Doctorow: Personalized pricing. I think when we hear pricing, we often think about the buy side. So, the price of eggs went up because they figured out that you can pay more. But it's also labor pricing, the sell side. The wage you're offered for your labor is increasingly personalized as well, through algorithmic wage discrimination in the gig economy and elsewhere. That personalized pricing nominally could be used to offer you a lower price based on your lesser ability to pay. You can imagine an airline where they know that there's an empty seat, and they're just going to fly it empty if you're not in that seat, and they know that you don't have enough money to pay the full fare, and so they offer you a discount. In practice, in a non-competitive environment, that's not how this works. In practice, when you factor power into your analysis, when you have a powerful group of extremely wealthy firms and individuals, personalized pricing just becomes a way to pay you less and charge you more. And the instantaneous refactoring of prices from moment to moment isn't used to make the lettuce $0.10 cheaper per pound because it's been on the shelf all day. It's used to figure out, as you walk into the grocery store, that you can afford to pay more, and it jacks the prices up on all the electronic shelf tags.

Matt Jordan: Maybe a final question: there's a lot going on in the world that seems to be fracturing, that has us worried about the state of things. But you've talked about how this also presents a lot of opportunity. So how might some of the things that we're seeing today provide an opening for fixing some of the problems?

Cory Doctorow: Well, let me tell you first about something extremely anti-democratic that's been going on in plain sight for decades. Since the WTO and even before, the US Trade Representative has gone all around the world and insisted that countries that wanted to trade with America would have to adopt laws that were favorable to large American firms, and that this was a precondition for tariff-free access to American markets. And in particular, a lot of these laws that were instituted at the behest of the US Trade Representative are IP laws that make it illegal to do things like reverse engineer an iPhone so that you can use a domestic app store. Apple takes $0.30 out of every app dollar that's spent everywhere in the world. And so that means that if you're a newspaper in Canada and your subscribers buy their subscriptions through the App Store, 30 cents out of every dollar goes to Apple to process that payment. The normal payment processing rates in North America, which are considered usurious in the rest of the world, are 3%. In Europe, it's like half a percent to 1%. And Apple's charging 30% and making tens of billions of dollars off of it. Well, imagine if Canada, Mexico, Europe, the whole Global South, if we all got rid of these laws that prohibit us from competing head to head with American firms, and not just any American firms, the largest American companies, and competed with them directly in the lines of business that make them the most money, the highest-margin, highest-dollar lines of business. We could destroy those rackets everywhere in the world all at once, bring the dollar value of those rackets to zero for these large American firms that are funding the dismantling of America, and create software products that give people more control over their own digital lives, products that will be impossible to prevent from leaking over the border into the US. I would say Canada doesn't have to stop at exporting reasonably priced pharmaceuticals.
We can also export the tools of digital freedom to Americans. This is a really interesting possibility, and it is a recovery of something important and democratic: laws that are set up by people voting for what's best for themselves, not by deliberation that takes place behind closed doors in these multilateral and bilateral trade bodies. And it's not just the IP laws. You've also got stuff like investor-state dispute settlement agreements that come about as a result of these free trade agreements. That's where large companies can sue governments for enacting labor, environmental, or safety-standards laws that erode their profits. You have tobacco companies suing Australia over plain-packaging laws for cigarettes, that sort of thing. They're also part of these trade packages. So, if Trump's going to unilaterally dismantle the global system of free trade, well, the global system of free trade gets us some things I like. But holy moly, has it been a way to export some of the worst, most anti-democratic policies the world has ever seen. If we're going to get rid of it, well, let's seize the moment.

Jenna Spinelle: It's a good place to end.

Cory Doctorow: Thank you.

Matt Jordan: Cory, thanks so much for joining us.

Cory Doctorow: Oh, it's my pleasure. Thanks for having me on.

Matt Jordan: That's it for this episode of News Over Noise. Our guest was Cory Doctorow, a science fiction author, activist, and journalist. To hear the full interview, including extended content, listen wherever you get your podcasts or at newsovernoise.org. I'm Matt Jordan. Until next time, stay well and well-informed. News Over Noise is produced by the Penn State Donald P. Bellisario College of Communications and WPSU. This program has been funded by the Office of the Executive Vice President and Provost at Penn State and is part of the Penn State News Literacy Initiative.

[END OF TRANSCRIPT]

Episode Credits:

Producer: Lindsey Whissel Fenton

Audio Engineers: Mickey Klein, Scott Gros, Clint Yoder

News Over Noise is a co-production of WPSU and Penn State’s Bellisario College of Communications. This program has been funded by the office of the Executive Vice President and Provost at Penn State and is part of the Penn State News Literacy Initiative.

Tags
News Over Noise: Season 3, News Over Noise, News Literacy
Lindsey Whissel Fenton, MEd, CT, is an Emmy Award-winning filmmaker, international speaker, and grief educator.
Matt Jordan is head of the Department of Film Production and Media Studies in the Donald P. Bellisario College of Communications at Penn State University, and Director of the News Literacy Initiative.
Cory Barker, PhD, is an assistant teaching professor in the Film Production & Media Studies department and co-host of News Over Noise.
Jenna Spinelle is the Communications Specialist for the McCourtney Institute for Democracy at Penn State. She is responsible for shaping all of the institute's external communication, including website content, social media, multimedia, and media outreach.