Five years ago, Nicco Mele warned that technology — particularly social media — was taking power from big institutions and giving it to individuals.
When used for good, he said in his 2013 book “The End of Big: How the Internet Makes David the New Goliath,” new technologies could empower individuals, give smaller players a fighting chance and challenge incumbents. But there was also a dark side to the power shift, warned Mele, the director of the Shorenstein Center on Media, Politics, and Public Policy at the Harvard Kennedy School.
The following interview has been condensed and edited for clarity.
When you wrote ‘The End of Big’ in 2013, you seemed largely optimistic about the effect the internet and technology were having on society. Five years on, has that changed?
My publisher said the first draft of the book I turned in was too dark, that I had to find a way to make it more hopeful. So I tried to make it more hopeful in a way that had integrity, that I could stand by.
I don’t want to be the one who’s always saying the sky is falling. I’m a genuinely optimistic person. And I do see cause for optimism. I grew up with the internet, and it was a powerful force for good in my life. That’s the internet I want to preserve and encourage.
The thesis of the book is that technology pushes power from institutions to individuals, and sometimes that’s good. It’s good when the institution is corrupt and needs reform and won’t change itself. It’s good when the institution has acquired too much power. It’s good when you’re giving individuals the power to do beautiful things in the world.
But it’s bad when the institutions are performing a vital role and you’re allowing any individual to disrupt it. It’s bad when you’re talking about terrorism. It’s bad when you’re talking about the integrity of information in our communities.
Looking at the spread of misinformation in recent years, would you say we’ve entered ‘bad’ territory?
There’s no doubt in my mind that the ones that have been harmed most by it are news organizations, political parties and government.
It was pretty clear to me five years ago that they had already lost their power. Right now, they're grappling with the reality of that and trying to understand what it means. The problem is that new changes will be upon us, and I'm worried that by the time they come to terms with that loss, they'll miss the crisis that's already growing.
The unintended consequences of AI and algorithms.
As currently designed, the algorithms that deliver our information feed our worst selves. It's debilitating.
So should the technology companies designing these algorithms be held responsible for the divisions they’re creating?
I profoundly think they should.
Here’s a story: Henry Luce was the publisher of Time and Life magazines, and he was arguably the most influential human being in the U.S. in the late 1940s because his publications were in most American homes. He hated FDR (Franklin D. Roosevelt). He was an old-fashioned conservative, and he thought FDR was a commie pinko liberal. He wanted to go to war with FDR in his publications. But something held him back. So he gave a bunch of money to his good friend Robert Hutchins, who was president of the University of Chicago, and he asked Hutchins to answer the question: What are the responsibilities for someone like me who owns and controls media to the institutions of democracy?
Hutchins started this thing called the Hutchins Commission. After a few years, the commission published its findings, and Henry Luce hated them because, what did they say? They said, "You do have special responsibilities to democracy because of your power to shape and sway public opinion."
This was following World War II, and there was a real sense that media shaping the public was a very dangerous thing. It carried special responsibility. And that view is wholly absent from people like Rupert Murdoch and Mark Zuckerberg. Zuckerberg appears to give it some lip service, but it’s not clear to me there’s a deep understanding of what that means.
How can technology companies be held accountable?
Ultimately, the only thing that will matter is political action. It’s not a satisfying answer, I know.
At the same time, you’ve said you don’t believe technology should take all the blame.
There’s something deeper happening culturally. I reject the argument that it’s technology’s fault that we’re more shallow. My general view is technology is not inherently good or bad. It boils down to two things: the values our culture carries, and the quality of our leaders. They’re interrelated.
So is this a moral decay issue?
I don’t want to sound like a ninny, but I do think there’s something in our culture that has become less substantive. I don’t fully understand that, but I want to.
As we approach the midterm election, have Americans at least gotten better at managing false and viral information?
There’s now orders of magnitude more discussion about it, and maybe we’ve raised awareness a bit, but I don’t think we’ve made any progress. Look at the school shooting and how it has been covered and the idea that these children are paid actors.
I’m pretty grim about where this is all headed. I think the fundamental nature of justice in American democracy is at stake in a really profound way, and it’s happening in slow motion.
What’s the silver lining?
The good news is that climate change will render a lot of this irrelevant as it totally dismantles modern society.
I thought you were an optimist!
Well, OK, I believe in human ingenuity and American innovation. I’m not going to count us out just yet. But as I said in my book, if America persists in my lifetime, I’ll be stunned. Where we’re at right now is grim and we have to imagine a better future.