Mark Bergen, MPP'12. Photo by David Paul Morris.

Mark Bergen, MPP’12, a technology reporter for Bloomberg, has covered Google since 2015. His debut book, Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination (Viking, 2022), is a character-driven account of the video-sharing platform and a fascinating look at the inner workings of the company. The following is edited from a recent conversation with Bergen, who discusses YouTube’s influence, its future, its missteps, and the pros and cons of content moderation.

How did this book come about?

After 2016, YouTube became an increasingly important part of Google’s business – and then a bigger part of Google’s headaches. As I started to cover YouTube’s myriad scandals, I realized readers didn’t have a lot of context for the factors that brought us to where we are today with the company and platform.

YouTube is a beast with many tentacles. What does the book focus on?

YouTube is probably the biggest digital economy online, which might surprise people. The company shares revenue with millions of people, from individual broadcasters and creators to huge media companies and record labels. In some ways, the book is a history of the company since its founding in 2005. And in many ways, it’s a broader history of content moderation and the internet. YouTube has many of the same problems as Facebook and Twitter, but the decisions the company makes have larger ripple effects on millions of people’s careers and livelihoods.

Is there one scandal that really exemplifies YouTube’s issues?

YouTube invented “broadcast yourself,” but the company never imagined that someone would livestream themselves committing mass murder. The book opens with the Christchurch shooting in 2019, when a white nationalist murdered 51 people at a mosque and an Islamic center in New Zealand. The shooter broadcast the shooting on Facebook Live – and then it jumped over to YouTube, as things do on the internet. After the company took it down, people re-uploaded the footage. YouTube had to play this horrific game of stomping out and removing footage as quickly as possible. When the New Zealand government did an analysis of the shooting, it determined that the shooter’s worldview was largely informed by YouTube. He learned how to use weapons from YouTube tutorials. He had this vision of the world at risk from Muslim migration in part because of his media consumption on YouTube.

That was a critical turning point for the company. After that incident, it changed a lot of its policies around hate speech and what is allowed on the platform.

In the aftermath of January 6th, do you think the government should take a more active role in moderating the kinds of speech on YouTube’s platform? What are the broader policy implications—and their likelihood of getting solved?

It doesn’t seem very likely. There are several bills floating around, but Congress hasn’t really acted on any of them. The one time federal regulators actually cracked down on Big Tech was when the Federal Trade Commission fined YouTube in 2019 for violating the Children’s Online Privacy Protection Act. YouTube has changed dramatically since then. The company started a YouTube Kids app, which was meant to be a safe space for children. There were questions about whether or not that was a legitimately safe space. And it was pretty clear that the audience on YouTube Kids was very meager compared to the flagship app.

What surprised you the most in your research?

There’s an internal meme at Google about how people outside the company think it’s this really well-run Lamborghini – and then inside it’s this chaotic, Richard Scarry-type thing. People inside YouTube and Google generally have good intentions; for example, they’re all about LGBTQ rights, and they want YouTube to be a place where queer creators can get a voice and have opportunities and commercial success in ways that they can’t on mainstream media. But for a long time, their algorithms were automatically taking these channels down a peg – refusing to run ads on videos that mentioned words like “gay” and “lesbian.” And the company had no idea what to do about this problem.

What does YouTube do particularly well?

As troubled as the creator business has been, no other platform on the internet has figured it out. TikTok is still nowhere close to paying people the way a successful media business does. Facebook has tried many times. Twitter and Spotify and Twitch are all trying. But YouTube is light-years ahead of them all.

Their most genius product is Content ID, a rights-management tool that automatically identifies copyrighted material so YouTube can share the advertising revenue with the copyright holders. Early on, it kept them from being sued out of existence.

And the company has been really good on the technical side. When’s the last time you had to wait more than a millisecond for your YouTube video to load?

Was there a moment when YouTube burst onto the scene, or was it an incremental thing?

There were several moments. The one that put them on the map in some ways was “Lazy Sunday.” Remember that? A rap about cupcakes and The Chronicles of Narnia on SNL in 2005. Then Justin Bieber was discovered on YouTube.

As of 2012, YouTube wasn’t yet profitable. They were an underdog to Facebook. 2012 was a really significant year in which they changed their algorithmic system to get the most engagement. They set a goal to get people watching YouTube for a billion hours every day, which they hit in 2016.

Are there any famous missteps over the years that you highlight in the book?

So many. There was a video called “Innocence of Muslims” that was actually a trailer for a movie that was never made—an extremely Islamophobic video that depicted Muhammad, the Muslim prophet, as this barbarian. This was around the same time as Benghazi, and YouTube got pulled into the swirl of geopolitics. There were massive protests around the video. And the State Department came to YouTube and said, “Take this video down, please.” YouTube refused. They had this principle: “We’re not going to bow to governments, including our own.”

Is YouTube a symbol of Silicon Valley—smart people doing something that sounded great but turned into something totally out of their control?

My book starts with a quote from Mary Shelley’s Frankenstein. So, I would certainly say yes. It is mass media that is really for the first time not programmed by editors or producers or educators. It is mass media programmed by algorithms. I think, by design, it’s something that the company doesn’t want to control.

Who are the main players that drive the book’s narrative?

One of my main characters is Claire Stapleton, a longtime Google employee who went to work for YouTube. She was one of the leaders of a pretty massive walkout around sexual harassment issues inside the company. She came in with rose-colored glasses in her view of Google’s role in the world; by the end of her career, she came out with a very different perspective. I talked to a lot of early YouTubers, like Freddie Wong and Danny Diamond. I followed PewDiePie’s career: he was very tight with YouTube, and then came a series of scandals and a pretty public falling-out with the company.

How did your education at Harris inform what you do?

I was focused on housing policy. I like thinking about organizational theory, which UChicago and Harris are really good at. What I’ve tried to do in journalism is explain Google as this powerful organization and apparatus. Being a little bit more familiar with issues around copyright, children’s online privacy, content moderation—all these really major policy issues—I felt comfortable reading the primary texts and coming away with enough of an understanding to feel confident in writing about it. 

Is YouTube still innovating in a way that’s going to keep them on top for years?

I think you can say the Facebook kingdom is at risk of crumbling far more than YouTube. I make this point in the book: YouTube has had advertisers boycotting it. It’s had so many different crises, but its viewership numbers keep growing. There was a YouTube employee who described why no one boycotts the site. They said, “You can’t boycott electricity.”