Taming the Digital Juggernaut - WhoWhatWhy

Mark Zuckerberg, Tim Cook, Jeff Bezos, Sundar Pichai
Facebook CEO Mark Zuckerberg, Apple CEO Tim Cook, Amazon CEO Jeff Bezos, and Google CEO Sundar Pichai testified by video conference before a House Judiciary subcommittee on antitrust law on July 29, 2020. Photo credit: C-SPAN

In our current news cycle, very few stories get through the great wall of politics. This week, the Department of Justice’s newly announced antitrust suit against Google and the growing power of tech companies are a stark reminder of how digital technology shapes our lives, in ways both more complex and potentially darker than most of us can imagine.

In this week’s WhoWhatWhy podcast we talk with Cory Doctorow, whose award-winning novels Little Brother, Homeland, and Attack Surface provide a glimpse into the dystopian future of such technology.

In addition to his tech-savvy fiction, Doctorow is a special consultant to the Electronic Frontier Foundation, an associate at the MIT Media Lab, and a visiting professor of computer science at the Open University in the United Kingdom. 

Doctorow lays bare the symbiotic relationship between politics and tech. He explains how policymakers’ inability to grasp what computers can and can’t do has led to our current crisis of failed oversight and regulation.

Where the recent documentary The Social Dilemma dramatizes the dangers of social media addiction, Doctorow argues that mass surveillance, control, and manipulation all have their roots in laws that have either been unenforced or undermined by corporate-friendly legislation. 

Indeed, he believes that, until now, it has been in the interest of the government to allow these companies to get as big and as powerful as they are. But, like Frankenstein’s monster, they have now broken loose from all effective constraints.

He laments that what was once exciting about working in technology has been co-opted by a combination of money and lifestyle. He shows why these companies can’t govern themselves — but remains optimistic that a blueprint exists for the government to bring the digital behemoths under control for the public good.

Click HERE to Download MP3


Full Text Transcript: 

As a service to our readers, we provide transcripts with our podcasts. We try to ensure that these transcripts do not include errors. However, due to time constraints, we are not always able to proofread them as closely as we would like. Should you spot any errors, we’d be grateful if you would notify us.

Jeff Schechtman: Welcome to the WhoWhatWhy Podcast. I’m your host, Jeff Schechtman. This week was one of those times when the confluence of modern life simply screams at us. We were overwhelmed with political news, political division, and the perception of endless ongoing change. The only things that seemed to break through the politics in the past month were Apple’s introduction of the iPhone 12 and the Justice Department’s lawsuit against Google for what it sees as monopolistic practices. It’s a stark reminder of the ways in which politics and technology dominate our culture. And while the two converge in social media, which is what we seem to talk about the most, the reality of their nexus in shaping the future is both more complex and potentially darker than most of us can imagine.

Jeff Schechtman: We’re going to talk about that in this week’s podcast with my guest, Cory Doctorow. He’s a regular contributor to The Guardian, Locus, and many other publications. He’s a special consultant to the Electronic Frontier Foundation, an associate at the MIT Media Lab, and a visiting professor of computer science at the Open University. He’s the author of the award-winning novels Little Brother, Homeland, and Attack Surface. And it is my pleasure to welcome Cory Doctorow to the WhoWhatWhy Podcast. Cory, thanks so much for joining us.

Cory Doctorow: It’s absolutely my pleasure. Thank you.

Jeff Schechtman: Talk a little bit first about how you see this symbiotic relationship that continues to grow almost exponentially between politics and technology.

Cory Doctorow: Well, look, computers have this wonderful characteristic, which is that they’re universal. It’s something that comes out of computer science and is not well understood in our policy sphere: everything that any computer can do can be done by every other computer, although some do it more slowly and some do it more quickly. And that means that every task we can imagine can be turned into a computer program, which means that there are lots of people trying to figure out how to make computers faster, and when they do, everyone else benefits. So all of that is really important, and it means that computers are kind of eating the world. And we have a real problem with our policymakers failing to grasp the implications of this. They know that computers are important, but they don’t really understand the underlying technology very well. And so our policy is generally out of step with what computers can and can’t do.
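
That universality is easiest to see in code. The sketch below is only an illustration of the principle Doctorow describes, not something from the conversation: a few lines of Python running on any ordinary machine can interpret programs written for a completely different, minimal machine, in this case the deliberately tiny Brainfuck language, which is Turing-complete despite having only eight instructions.

```python
# A toy illustration of universality: one machine (Python on your laptop)
# interpreting a program written for a completely different, minimal machine.
# Anything the toy machine can compute, the host can compute by simulating it,
# just more slowly.

def run(program: str, tape_size: int = 30000) -> str:
    tape = [0] * tape_size
    out = []
    ptr = 0          # data pointer into the tape
    pc = 0           # program counter into the instruction string
    jumps = {}       # matching-bracket table for loops
    stack = []
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(program):
        c = program[pc]
        if c == ">":
            ptr += 1
        elif c == "<":
            ptr -= 1
        elif c == "+":
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-":
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".":
            out.append(chr(tape[ptr]))
        elif c == "[" and tape[ptr] == 0:
            pc = jumps[pc]   # skip the loop body
        elif c == "]" and tape[ptr] != 0:
            pc = jumps[pc]   # jump back to the start of the loop
        pc += 1
    return "".join(out)

# The classic "Hello World!" written for the toy machine:
HELLO = ("++++++++[>++++[>++>+++>+++>+<<<<-]>+>+>->>+[<]<-]"
         ">>.>---.+++++++..+++.>>.<-.<.+++.------.--------.>>+.>++.")
print(run(HELLO))
```

The host is slower than purpose-built hardware would be, but nothing the toy machine can compute is out of its reach, which is the point.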

Cory Doctorow: We sometimes see law enforcement, for example, demanding that computers be designed so that their encryption holds up against a criminal or a blackmailer or a spy, but gives way to a police officer trying to access a criminal’s phone. We don’t know how to make that encryption. That encryption is just not possible, and wanting it badly is not enough. The same goes for deliberately introducing defects or vulnerabilities into the encryption that we rely on, for example, to make sure that when your pacemaker gets a software update, it comes from the manufacturer and not from a third party who’s going to make you drop dead where you stand. We can’t really afford to indulge in wishful thinking. We really have to confront this head on. And so all of this, I think, is one of the central crises of our policy and regulation.

Jeff Schechtman: One of the things that’s happening also is that the technology is moving so much faster that it gets further and further away from the policy.

Cory Doctorow: Well, yes and no. So definitely the engineering got faster, right? How many transistors we can fit in a square centimeter of chip space, what we can do with that, what kind of cool programs people can write and how they use them, even just how much expertise there is in the world, how many people feel comfortable picking up a computer and trying something new with it. All of that’s growing really fast. But the underlying stuff, the underlying theory about what computers can and can’t do, that’s actually pretty static. The computer science part, as opposed to the computer engineering part, hasn’t changed much since the World War II era, when people like John von Neumann and Alan Turing were laying down the groundwork for what we build today.

Cory Doctorow: I wrote Little Brother 14 years ago; it came out 12 years ago. And I built it on the assumption that the science would remain static, the engineering would continue to grow, and our policymakers would continue to not come to grips with it. And one of the reasons that book, a techno-thriller that’s now 14 years old, is still current, still gets read, is that all those things are still true. And so the problems it addresses map really tightly onto the problems that we keep having now with mass surveillance, control, and manipulation. And so do the solutions we find, the ways people seize the means of computation to resist all of that.

Jeff Schechtman: And as AI gets added into the equation and more and more surveillance gets added into it, where we’re literally surveilled all the time, it becomes even more complex and more frightening.

Cory Doctorow: It really does. It really does. Although, again, I have a slightly different view from most people here, I think. I don’t think machine learning really should be called AI, because it’s nothing like artificial intelligence. I think that term obscures more than it illuminates. Really, what we should understand machine learning as is statistical inference. It looks back in time and it says, “Look, when I see these things, they happen at the same time. In the future, if I see one of them, I’m going to predict that the other one’s going to happen.” And sometimes that’s really stable, right? Like, if you see two eyes, a nose, and a mouth, it’s probably a face, right? And so you can make that kind of stable prediction. Although it doesn’t always work. People who have smart doorbells sometimes get these endless streams of alerts because the doorbell has seen melting snow on their walk and found a face in it and keeps saying, “Look, I found a face! I found a face! There’s a face at your door.”

Cory Doctorow: But it isn’t destiny, and these predictions are not causal. Machine learning systems don’t understand that one thing causes another, or what that causal relationship might be. And so they get confused, and sometimes those predictions are really wrong, which isn’t to say that they’re not harmful. If you observe that every poor person who goes before a judge is remanded until their trial, you might infer that judges are making good decisions and that poor people are likely to require being held until trial. That’s not true. What’s actually being surfaced by that data is that judges discriminate against poor people. But when you embed that in a machine learning system, a theory-free system that doesn’t try to understand causal relations, what you end up doing is taking the historic patterns of injustice and adding a veneer of empiricism to them, an empirical facewash that makes it look like math instead of discrimination.
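
To see how that veneer of empiricism gets applied, here is a minimal sketch. It is our illustration with invented numbers, not anything from Doctorow or from a real dataset: a “theory-free” model that simply counts what judges did in a biased history and then reports the bias back as a confident-looking prediction.

```python
# A toy illustration: a model trained on biased historical decisions learns
# the bias and presents it as an empirical prediction. All data is invented.
from collections import Counter, defaultdict

# Historical court records: (is_poor, was_detained_pretrial).
# The pattern built into this fake history is discrimination, not risk.
history = ([(True, True)] * 90 + [(True, False)] * 10
           + [(False, True)] * 20 + [(False, False)] * 80)

# "Training": just count how often each group was detained in the past.
detained = defaultdict(Counter)
for is_poor, was_detained in history:
    detained[is_poor][was_detained] += 1

def predicted_detention_rate(is_poor: bool) -> float:
    counts = detained[is_poor]
    return counts[True] / (counts[True] + counts[False])

# The model now "predicts" that poor defendants should be held 90% of the
# time, not because poverty causes flight risk, but because past judges
# detained poor people 90% of the time.
print(f"Predicted detention rate, poor defendant:     {predicted_detention_rate(True):.0%}")
print(f"Predicted detention rate, non-poor defendant: {predicted_detention_rate(False):.0%}")
```

Nothing in the model knows why the historical detentions happened; it only knows that they did, which is exactly the failure mode Doctorow describes.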

Jeff Schechtman: What about when the means of gathering this information become more monopolistic in terms of who controls them? How does that enter into the equation?

Cory Doctorow: Yeah, that’s definitely a real problem. And you know, one of the reasons that we collect a lot of this data is for advertising. And the actual empirical studies on advertising that’s based on behavior, where I look at all the things that you’ve looked at on the web and hold a real-time auction when you visit a web page and try to figure out exactly what ad to show you, show that this performs about the same as, or sometimes maybe a little better than, a contextual ad, which is an ad where I say, “Oh, look, you’re reading an article about shoes. I’ll show you an ad for shoes.” Something that doesn’t require any of that data gathering, something that doesn’t require any of that privacy invasion or any of the risks that come with it, whether that’s the risk that the data will leak, or that it will be misused by the company, or that it will be sold, or that an oppressive government might seize it or buy it and use it. Those are all things that we’ve seen happening just this year.

Cory Doctorow: And so really, I think we have to understand this kind of data-hungry acquisition by companies as an epiphenomenon, the result of two things. The first is monopoly: when an industry is really monopolized, when there are only four or five big companies, the way the web is now, where it’s like five giant websites filled with screenshots of text from the other four, they make a lot of money by dint of having a monopoly. Half of the advertising dollars just get funneled straight into these ad tech platforms, not into the publishers. And they can spend that money in concert. When you all sit around a single table, you can come up with an agreement about what you want to do and what you want from governments. So that’s the first part: you get to distort the policy process so that this manifestly dangerous and unjust practice of over-collecting and over-retaining data can happen with impunity.

Cory Doctorow: The other thing that happens is that states engage in these kinds of public-private partnerships from hell, where they say, “Okay, we want to spy on everyone, but we don’t want to spend the money to spy on everyone. Why don’t we let businesses crop up that over-collect data. We won’t regulate them to stop them from doing it. And then we’ll go and ask them for that data.” The way that police did just this week, when they went to Google and started requisitioning keyword searches, right? “Tell me all the people who searched for this thing.” If the NSA had to build that from scratch, a record of everyone who searches for something on the internet, it would cost them billions and billions of dollars, and they would struggle to manage it. But Google can do it with its shareholders’ money. And then the cops can show up and ask for that data. And so governments are incentivized not to curb the worst data practices of these companies. Instead, they encourage them, because it gives them a free source of data that they can use for mass surveillance.

Jeff Schechtman: The other part of it is the way the law is written with regard to this, in that monopolies in and of themselves are not illegal. It’s only the degree to which they actually do damage to somebody, to the consumer.

Cory Doctorow: Yeah. You know, you’re really singing my tune here. This is a drum I beat a lot. Before the 1980s, before the Reagan era, we used to presumptively ban companies from merging with major competitors, or buying their smaller competitors, or creating these vertical monopolies. If you were a rail company, you couldn’t own a freight company, because your customers were freight companies. And if your customers were competing with you for access to your rails, well, of course you’d always beat them, and you wouldn’t get a fair and competitive market. And Reagan had a guy named Robert Bork. Your listeners may know him as the guy who was such a Nixonian criminal that the Senate wouldn’t approve him for the Supreme Court. And Bork was a kind of court sorcerer to Reagan. And he said that we shouldn’t enforce monopoly laws unless we can show immediate consumer harm, which is to say that unless you can show that, after a company does something that’s banned in the law, it will raise prices the next day, we should allow it to do it.

Cory Doctorow: And proven consumer harm, in that model, is effectively impossible to show. That’s how we ended up in a world where we have five publishers, four movie studios, three record labels, two beer companies, and only one eyeglass company, one that owns every major retailer, from LensCrafters to Sears Optical to Sunglass Hut, and every major eyewear brand, from Coach to Dolce & Gabbana to Oliver Peoples, and the largest lens lab, Essilor, and the largest insurer, EyeMed, all under one roof.

Cory Doctorow: And although each of those acquisitions was undertaken with a promise that they wouldn’t raise prices, 10 years later glasses cost 1,000% more than they did when the acquisition spree hit. It’s that rare manufactured good that gets more expensive. And this pattern is repeated, not just in beer and not just in eyeglasses, but in hospital emergency rooms, automotive, logistics, even professional wrestling, which went from 30 leagues to one in 30 years. And those wrestlers were reclassified as contractors. They lost their medical insurance, and now they’re begging on GoFundMe for pennies so they can die with dignity of work-related injuries in their 50s. So all of us, the workers and the customers, get hurt when industries consolidate.

Jeff Schechtman: It’s interesting because one of the arguments that we’ve always heard about technology is that it would lower the barriers to entry for business and make the landscape more competitive when in fact it has resulted in the opposite.

Cory Doctorow: Well, it certainly can. And you know, technology on its own is actually pretty good at that. It’s just technology plus law that makes it bad.

Cory Doctorow: In the history of technology, we see a lot of what I call competitive compatibility. That’s when you have a company that dominates, and a new company comes in and makes a compatible product, a product that works with the dominant product, and uses that to gain a foothold, to turn what looks like a walled garden or a network-effect advantage into a source of customers. So when Facebook started, it had this huge problem: it was competing with a company called MySpace that was owned by one of the shrewdest, greediest billionaires in commercial history, Rupert Murdoch. And everyone who wanted to use social media already had a MySpace account. And even though Facebook had a better service, it didn’t matter how good the service was if all your friends were still on MySpace. And no one could figure out how to get all their friends to leave MySpace all at once.

Cory Doctorow: And so Facebook made this really cool tool. If you gave it your MySpace login and password, it would scrape the messages waiting for you over on MySpace and put them in your Facebook inbox, let you reply to them, and then push your replies back out to MySpace. So you could have a foot in both camps. Just like, you know, you can have a car and either take it to the manufacturer’s dealership or take it to an independent mechanic. You don’t have to say, “Oh, well, I’m just going to throw away the car because I don’t like my dealership.” You can have a foot in both camps.

Cory Doctorow: And what’s happened in the years since is that every tech giant used this tactic, from Apple cloning Microsoft Office formats to make the iWork suite, to Google pretending to be a web user in order to make a copy of every webpage on the internet. And they have all since enacted laws and policies and standards and rules that make it illegal to do to them what they did to everyone else. They’ve gone up the ladder and they’ve kicked it away, in the fashion of every pirate who’s ever dreamt of becoming an admiral. When they did it, they called it the progress of competitive industry. And when people do it to them, if you try to make an app store for the iPhone, or if you try to scrape Google results, or if you try to allow people to leave Facebook but still communicate with their friends there, they tell you that you’re a pirate, that you’re a crook, that you’re violating the rules.

Jeff Schechtman: Is this cyclical? If we look at analogies to the Industrial Revolution or to other periods, with their ups and downs and cyclical nature, do we see any patterns that will perhaps repeat themselves with respect to what we’re talking about now?

Cory Doctorow: Well, they say history doesn’t repeat itself, but sometimes it rhymes. And certainly, if you look back on the history of the New Deal and the trustbusting era, you see a lot of analogies to what’s going on now. First of all, you see a lot of inaction. The Sherman Act was the cornerstone of the trustbusting movement against the rail barons and the oil barons and the steel barons. That act was decades old and just hadn’t been enforced. What FDR did was not so much make new laws as change the enforcement regimes for existing ones. And certainly we have laws on the books today that, if they were enforced, could end all of this stuff pretty darn quick.

Cory Doctorow: The other thing that you saw was the coming together of a coalition. Just as, say, in the ‘70s, there was a time when people who were worried about what we today call ecology thought of themselves as single-issue people. Some people cared about whales or owls or the ozone layer, but they didn’t really have a way of talking about all of those things as being part of one fight. And the term “ecology” took a thousand issues and turned them into a movement, a movement with a thousand ways to get involved.

Cory Doctorow: Well, today there are so many people who’ve been harmed by monopolies in so many domains, whether that’s finance or energy or any of the others. I mean, cable and telecoms is obviously a very big one right now. The big carriers in the U.S. divided up the country like the Pope dividing up the New World and said, “You stay over there and I’ll stay over here. We won’t compete with each other.” And they offered the slowest, most expensive, least reliable service of all the rich countries in the world. The place where the internet was born has the most expensive, least effective broadband in the world. And that was bad before the pandemic, but now that everything we do involves the internet, it’s really bad. And we see how bad it is when they go bankrupt, right?

Cory Doctorow: Frontier went bust at the start of the pandemic, and its disclosures revealed that its own calculations showed it could offer gigabit broadband to 3 million households, and I can’t even impress on you how fast and amazing that would be. But they chose not to. Not because they wouldn’t make $800 million over 10 years if they did it; they were sure they would. But the executives of the company were all paid in stock, and the analysts whose ratings control the price of that stock would downrank any company that made investments that amortized over more than five years. And so they left 3 million households on DSL, 20th-century copper lines, basically wrapped in newspaper, dipped in tar, and draped over shrubs. And that’s what those people are getting their internet over now.

Cory Doctorow: So the coalitions that come together of people who have been harmed by monopolies will be like the coalitions of people who came together because they worried about environmental issues. And when we realize that we’re not angry about wrestling or beer or eyewear or broadband, but that what we’re really angry about is monopoly, that’s when I think we’ll make a real change.

Jeff Schechtman: You’ve written about all of this in the context of science fiction, in books like your Little Brother series and its most recent installment, Attack Surface, where employees of these companies have to confront the reality of what the companies are doing. How does all this tie together with the larger issues that we’re talking about?

Cory Doctorow: The connection between all of this and where we are now is in this moral reckoning that technologists are starting to have with themselves, people who got involved with technology because they were excited by the power it gave them, and maybe even because they were scared of what would happen if that technology went awry. They saw the power of it, and on the other hand, they feared the potential for abuse. And it comes at a time when the employees of these monopolists, people who have basically given up on the Silicon Valley dream of two people in a garage challenging a giant and replaced it with, “Well, I’ll get a job at one of those giants, and I’ll get paid well and get massages on Tuesdays, and there’ll be kombucha on tap in the cafeteria,” realize that they’re spending their careers building technology to take away the freedom that they were excited by when they first used a computer, whether that’s facial recognition, or surveillance tools, or stuff that helps ICE or dictators around the world enact programs of disenfranchisement, disempowerment, and ultimately real oppression.

Cory Doctorow: The people who built the tools that caught Jamal Khashoggi and lured him to the Saudi consulate in Istanbul, where he was dismembered, those people got involved with computers because they loved how much power it gave them. And so this is about people pulling back from the brink. Realizing, as 20,000 Googlers did when they walked off the job, that kombucha on tap and massages on Wednesdays are a poor substitute for being able to go to your deathbed knowing that you were a force for good in the world. And realizing that, in a world where there’s more demand for technologists than there are technologists, they have the power to change the world for the better, and if they don’t use it, they are part of the problem.

Jeff Schechtman: Is the surveillance aspect of this something that is just so inherently built into the technology (and there were those who saw it coming at the very beginning) that it really is just an inevitable part of all of this?

Cory Doctorow: I don’t think it’s inevitable. I mean, clearly, no one came down off a mountain with two stone tablets and said, “Sergey Brin, thou must stop rotating thine log files and instead mine them for actionable market intelligence,” right? You can run a search engine that doesn’t spy on people. There’s nothing about having a search engine that makes you spy on people. We had them before and we can have them again.

Cory Doctorow: Instead, I think we need to understand surveillance as the product of companies getting away with stuff because they don’t have competitors. Nobody wants surveillance, right? If I said, “Look, here are two identical search engines. They both perform equally well, but one of them spies on you,” you wouldn’t use the one that spied on you. For all the talk about how people love personalized ads, if that were the case, then one in four web users wouldn’t have installed ad blockers. It’s the largest consumer boycott in history.

Cory Doctorow: And so there’s nothing inevitable about that. At the same time, technology gives us unprecedented access to privacy. Technology allows any person to talk to any other person using encryption that cannot be broken. With a modern cipher, if it is well implemented, you could take every atom in the universe, make it into a computer, and set it to work guessing the key that unlocks a message, and you would run out of universe before you ran out of possible keys. Now, that doesn’t stop someone from tying you to a chair and hitting you with a rubber hose until you tell them what the key is. And so what we have to understand this encryption and this privacy technology as giving us is a temporary respite from surveillance by illegitimate governments, a respite in which we can organize a resistance that demands legitimacy, that demands human rights. The ultimate bulwark we have against surveillance is not self-help measures; it’s what self-help measures get us, which is good government.
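
The arithmetic behind that claim is easy to sanity-check. The figures below are rough, illustrative orders of magnitude of our own, not numbers cited in the conversation, and they assume a well-implemented 256-bit cipher and an attacker with an absurd amount of hardware:

```python
# Back-of-the-envelope: why brute-forcing a well-implemented 256-bit key is
# hopeless. All figures are rough, illustrative order-of-magnitude estimates.
key_bits = 256
keys = 2 ** key_bits                      # ~1.2e77 possible keys

machines = 10 ** 9                        # a billion dedicated machines
guesses_per_machine_per_sec = 10 ** 12    # each testing a trillion keys per second
seconds_per_year = 3.15e7
age_of_universe_years = 1.4e10

# On average the attacker finds the key after searching half the keyspace.
years_to_find = (keys / 2) / (machines * guesses_per_machine_per_sec * seconds_per_year)
print(f"Expected years to find the key: {years_to_find:.1e}")
print(f"That is roughly {years_to_find / age_of_universe_years:.1e} times the age of the universe.")
```

Even granting the attacker a billion machines each testing a trillion keys a second, the expected search time dwarfs the age of the universe many times over, which is why, as Doctorow notes, the practical attack is the rubber hose and the endpoint, not the math.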

Jeff Schechtman: Have we moved on from the point of people not caring about these things, of people wanting, in a strange way, as they maybe did in the early days of the internet, to live a kind of public life online?

Cory Doctorow: Well, you know, I don’t think that’s the case. I mean, again, all of the ad blocking and tracker blocking and other activities that people undertake are certainly evidence to the contrary. Sometimes people talk about this generationally. They say, “Look at all these kids posting their selfies.” But one of the things we know about kids is that if they’re sharing potentially unwisely on a network, it is often a network where they believe that the authority figures in their life cannot look at them, where their parents are not, where their teachers are not, where other people who could immediately punish them are not. And kids can be good reasoners, but they lack context. Some lessons you learn through hard experience, right? Until you’ve gone for a job and had your old social media history called up and shown to you, it can be hard to understand what the long-term consequences of it are.

Cory Doctorow: So they understand that they need privacy; they just misjudge who they need privacy from. They think it’s their parents and their teachers. Who it really is, is their future bosses, and maybe the border guards who pull up a big database when they cross a border, and other people who will ultimately have much more power over them than their parents do, and much less interest in their well-being. And I think that if it were the case that people didn’t value privacy, we wouldn’t see all of this energy spent on stopping encryption and stopping privacy tools from being built into our networks, and all of these anti-ad-block measures showing up on websites, because people just wouldn’t be using those tools. The fact that people migrate to those tools tells you about their preferences.

Jeff Schechtman: Does politics, as we understand it today, have the ability to address any of this? Or does this have to be something that emerges from the companies themselves, out of business decisions more than political decisions?

Cory Doctorow: I don’t think that the companies have demonstrated that their capacity for self-governance generates good outcomes. It tends to generate pretty toxic ones. The mistakes pile on the mistakes pile on the mistakes and they outrun them by just not having competitors. I think we have to believe that governments can make good decisions about nuanced technical questions, because the evidence for that is all around us.

Cory Doctorow: Food hygiene is an incredibly nuanced technical question. More people have died of bad meals than of any other cause in human history. And yet here we are, not dead from dinner. And the reason we’re not dead from dinner is not because Congress is full of people who understand microbiology. The reason we’re not dead is because Congress is full of people who respect a legitimate process, in which neutral adjudicators gather experts with different views. Those experts make those views public in a way that is legible to other people, including other experts. The adjudicators make a decision. They show their work. They recuse themselves if they have a conflict of interest. And then, when they’re done, if new evidence comes to light that could produce a better outcome, evidence that tells them they were operating on bad data, they have a way to revisit the decision.

Cory Doctorow: And that process is the reason that the reinforced steel joist over your head right now, the one that keeps your roof from falling on your head, is still intact. It’s why your car doesn’t explode in low-velocity rear collisions. It’s why planes didn’t fall out of the sky until the 737 Max.

Cory Doctorow: And what we see is that when companies, either in practice or formally, start to regulate themselves, either because they’re so concentrated that they can hijack their regulators, or because they’re so concentrated that the regulators stop even trying and say, “No, you go ahead and self-certify,” that’s when the process becomes untenable. And that’s when the outcomes stop reflecting empirical evidence and start reflecting the shareholder preferences of the firms that dominate the industry.

Jeff Schechtman: Cory Doctorow. Cory, I thank you so much for spending time with us.

Cory Doctorow: The pleasure was absolutely mine. Thank you very much.

Jeff Schechtman: Thanks a lot. And thank you for listening and for joining us here on Radio WhoWhatWhy. I hope you join us next week for another Radio WhoWhatWhy Podcast. I’m Jeff Schechtman.

Jeff Schechtman: If you liked this podcast, please feel free to share and help others find it by rating and reviewing it on iTunes. You can also support this podcast and all the work we do by going to WhoWhatWhy.org/donate.

Related front page panorama photo credit: Adapted by WhoWhatWhy from re:publica / Flickr (CC BY-SA 2.0).

Author

  • Jeff Schechtman

    Jeff Schechtman's career spans movies, radio stations, and podcasts. After spending twenty-five years in the motion picture industry as a producer and executive, he immersed himself in journalism, radio, and, more recently, the world of podcasts. To date, he has conducted over ten thousand interviews with authors, journalists, and thought leaders. Since March 2015, he has produced almost 500 podcasts for WhoWhatWhy.

