Analysis

‘Facebook is the biggest problem we have for democracy,’ former investor says

April 26, 2019
Roger McNamee. Photo by Evan Peng/The Stanford Daily


Roger McNamee is an investor with 35 years of experience in Silicon Valley, including stints at Silver Lake Partners and Elevation Partners—a venture firm he co-founded in 2004 with U2’s Bono.

McNamee was also an early investor in Facebook and a mentor to Mark Zuckerberg, an experience he has now turned into a book whose title sums up his time working with the company: Zucked: Waking Up to the Facebook Catastrophe.

McNamee has made news because of his assertion that the platform is a danger to public health and democracy. He discusses fundamental flaws in the design of Facebook’s algorithms and business model—flaws, he says, “that allow bad actors to exploit it and harm innocent people.”


McNamee says that the culture of the technology platforms—and Silicon Valley at large—echoes a larger shift in American capitalism, one that previously relied on government to “set the rules and enforce them across the entire economy” but now favors disruption and monopolistic tendencies. This, he says, “causes employees to be indifferent to the negative side effects of their success.”

Earlier this month, McNamee spoke with Ann Grimes, associate director of the Brown Institute for Media Innovation, at the Stanford Graduate School of Business about his transition from tech advocate to critic. What follows is an edited version of his remarks, organized by theme.


The perfect idea

When Mark started Facebook, he literally started at the perfect moment. In 2006 there were a few things going on. The PayPal mafia, Peter Thiel, Reid Hoffman, Elon Musk, had two unbelievably powerful insights. The first was that the Internet was shifting from a web of pages to a web of people. The limitations of technology that had constrained everybody around here for the prior 50 years, forcing us to focus on the narrowest, most important functionality people needed at any one time, [had lifted]. For the first time, you were able to create global products. You were able to create services like Amazon Web Services that allowed entrepreneurs to buy the $100 million to $200 million worth of infrastructure you [previously] had to build just to launch a product. This meant the cost of a start-up was going to go from $100 million to $10 million. It meant that instead of being a 45- to 50-year-old entrepreneur, you could be in your twenties.

So Mark started Facebook at the perfect moment. I was absolutely convinced, even in 2006, when it was available only to high school and college students through their school email address, when it was just a picture with relationship status, that he was going to be bigger than Google. I was convinced of that because I had watched MySpace, Friendster, and America Online all fail in their social activity, in my opinion, because a lot of trolls and bad actors were on those platforms [and Facebook initially encouraged users to authenticate their identities].


From advisor and investor to skeptic and critic

The culture of the valley at the time was shifting: replacing the hippie-libertarian movement of Steve Jobs, that notion of empowering people, with a different kind of libertarianism that was really more about building a monopoly, disrupting, and dominating. You didn’t really worry about the rules. You just kind of went and did things. You didn’t ask permission; you begged forgiveness. That culture was very uncomfortable for me, and in the early days I think Mark had a different view, or at least that’s what I thought he had, so I was blissfully a fan.

I stopped being an advisor in 2009 because the things I was good at, he didn’t need anymore. So I’m just a cheerleader, not interacting with Mark or Sheryl [Sandberg]. Then I retired in December 2015.

In January [2016], I started to see things that didn’t fit my preconceived notions of this company… First, during the Democratic primary in New Hampshire, I started to see [anti-Hillary] memes coming out on Facebook groups associated with the Bernie Sanders campaign…. They were spreading widely among my friends in a way that suggested somebody was spending money to get people in the group whose only purpose was to spread misogynistic images. That struck me as really weird.

In June of 2016, the United Kingdom voted in the Brexit referendum to leave the European Union. The outcome came as a huge shock. There was an eight-point swing on the day of the referendum. It occurred to me, Wow, what if the Facebook tools that allow ideas to spread so rapidly, what if there’s something about them that gives an unfair advantage to really intense, nasty messages over neutral ones? Because the “Leave” campaign had this intense xenophobic message and the “Remain” campaign was, Hey, stay the course. I thought, if that’s true, that’s really bad for democracy. But I don’t know, I have no data….

Two other things happened: The US Department of Housing and Urban Development cites Facebook for advertising tools that enable discrimination in housing, the very thing the company was sued over a few weeks ago… Then the intelligence agencies say the Russians are hacking and interfering in the US election.

At that point … I reach out directly to Mark and Sheryl. They’re my friends. And so on the 30th of October, nine days before the election, I sent them a draft of an op-ed I was planning to publish. I say, “Guys, I think there is something wrong with the business model and algorithms that allow bad actors to harm innocent people and we’ve got to get on top of it.”


Facebook’s reaction

They got right back to me, and they couldn’t have been more friendly. But they didn’t embrace it like it was a real business issue; they treated it like a PR problem. They handed me off to a real good friend of mine, Dan Rose. But Dan’s job is essentially to contain the PR problem. He goes: “We don’t have a problem. Section 230 of the Communications Decency Act says we’re a platform, not a media company, so we’re not responsible for what third parties do.”


Turning point

The election happens. At that point I go, “Dan, you have got to do what Johnson & Johnson did when some asshole put poison in bottles of Tylenol in Chicago in 1982.” The very day the story broke, the CEO of Johnson & Johnson pulled every bottle of Tylenol off every retail shelf in the US and Canada, no questions asked, and didn’t put them back until they had tamper-proof packaging. I said, “Dan, you have to leap to the defense of the people… You are in the trust business. If you let the trust get broken, you will never earn it back.” I spent three months begging them to do this. And he’s just hiding behind Section 230. Eventually, I gave up. [Section 230 of the 1996 Communications Decency Act gives Internet companies broad immunity from content flowing over their lines, treating them much like a phone company].

It’s at that point that I face a moment of truth, which maybe one or more of you will face at some point in your life. I saw something really wrong, something that raised issues for democracy and for civil rights. I was retired. I could have sat back and said this is somebody else’s problem. But the other side of the coin was, I’d been a small part of this and I’d profited from it. And I felt a moral and emotional need to take what I knew and try to start a conversation.


The problem? It’s bigger than Facebook

Facebook exists in the culture here in the Valley, which permeates the culture of the United States of America, where we have deregulated for 44 years. We have essentially blown up the standard version of capitalism, the one where the government sets the rules and enforces them across the entire economy so that it’s fair for everyone. We have shifted to a world with very few rules and almost no enforcement. Businesses are encouraged to grab whatever they can grab while they can grab it.

Peter Drucker, who was a great management guru in the industrial age when I was young, would say: “There are five stakeholders: shareholders, employees, the communities where employees live, customers, and suppliers.” It was the duty of management to balance the interests of all five, because that’s how you maintain a zen-like harmony over time. But over the last 40 years, we’ve abandoned four of the five. Essentially all companies in the economy abuse their suppliers, many of them abuse their customers, they couldn’t care less about the communities where people live, and many of them have treated their employees badly. That has been standard operating procedure.

Now, the incentives are misaligned and it’s way bigger than Facebook, way bigger than Silicon Valley…. If Rex Tillerson is allowed to conduct a separate foreign policy at Exxon in contravention of sanctions against Russia, if Wells Fargo bank can get away with what is essentially fraud, if the entire banking industry can do 2008 and not be punished, it’s really hard to expect Silicon Valley to have a higher set of standards… I want to hold them to a higher standard, but I want to hold everyone to a higher standard than we’ve held them to for the last few years.


Users are the fuel

Facebook is the biggest problem we have for democracy…. But much bigger is behavioral prediction… and that’s the incredible genius of Google. In 2002, Google behaved just like a classic market player. They had one product, the search engine. They gathered data from the people who used it and used that data to make the product better. The business model was based on ad targeting related to purchase intent. You want to go on vacation? You look up where to go, airlines, hotels.

But when they looked at the analysis, they realized that search was capturing only a small percentage of the data and… in order to make the engine better, they asked: is there any signal in the rest? They realized that only one percent of the value is in the stuff you put in. Most of it is in the unclaimed data, the metadata: the tracking, the browsing experience. If they can get all this data, they can expand the value of what they are selling to advertisers.

They create Gmail. Now Gmail is my identity, attached to purchase intent. They insert ads and find an excuse to read people’s emails. Hm, this is wrong? We’ll enhance it… give it away for free… won’t tell anyone we’re reading emails before we target the ads. [Then] they took away the ads [in Gmail] but kept scanning emails… If you are in the business of exchanging services for ads, where does reading email fit in?


…Then Google Glass, where there’s lots of facial recognition, lots of [tracking] individuals walking, driving. We called them “Glassholes.”

So they go back to the lab. They repackage it as a videogame, spin it out as a separate company, Niantic, and call it Pokémon Go. They get a billion people wandering around on their smartphones taking pictures of everything. Now the behavioral thing is really interesting… If we put a Pokémon on private property, will people climb over a fence? Yep. If you put a Pokémon in a Starbucks where they can buy some coffee, will users go there? You bet. Put one in a third Starbucks that offers 10 cents off and see which one they go for? Yes, they can do that, too.

Then what happens? They go, There’s all this other data that other people own. Let’s go buy it. So they go to your bank. They go to Datalogix. They go to Experian and Equifax. They go to the cellular carriers and get all your location information. They’ve already got the data from Uber and Lyft. They go to health and wellness apps and they get all that stuff. They do all this tracking. They scan your documents. They build a data avatar for each and every one of us, whether you are on the platform or not. The problem is that only one percent of the value is in the stuff you put in. Most of it is in the metadata… and that is what they are selling to advertisers.

So now we consumers are in the behavioral manipulation business, but that’s not what we signed up for. What’s wrong with the model? I’m not your customer. I’m not even your product. I’m the fuel, the source of this data. Google perfected this, but now Facebook, Amazon, and Microsoft are playing this game also.


Unintended consequences

Think about what engagement is. If you are in the behavioral prediction business, the problem is you want to build a habit, then you want to feed the habit with things that engage people. It turns out that people are most engaged by stuff that triggers fight or flight, which would be outrage or fear, or real conspiracy theories, or disinformation. Nobody really knows why this information is so successful, but it is… Essentially what this is about is you want to get people into an unstable state to find out what they’re really like. What happens when you expose them to a certain sentence? How do they react? What happens when you expose them to hate speech? What happens when you expose them to the threat of, let’s say, a measles outbreak? The problem with the platform business model is that it depends on hate speech, fear, outrage, disinformation…

I’m not suggesting that any of these people are bad or that they did this on purpose. These are the unintended consequences of a well-intentioned strategy. In Facebook’s case, the well-intentioned strategy was to connect the whole world, to expose all the world’s information. The problem was they were in such a hurry that they failed to put in circuit breakers. They failed to think about a containment strategy, because that’s the culture of the time.

This is just like the chemical industry 30 years ago. Chemical companies used to be able to pour mercury into freshwater…. You’re a gas station? You poured used oil into the sewer. And for 50 years nobody said anything. But then the externalities started piling up and people said, This is no good. And they went back to the guys who created it and said, Hey, toxic chemicals? That’s your problem. Clean it up. I think we are looking at toxic visual stuff and it’s time to change the culture.


Solutions

I want to use antitrust laws… and restart a regulatory infrastructure—that’s really hard when the FTC and antitrust division of the DOJ have done nothing related to consumer harm since the Microsoft case. But working with people in economics and law at the university, we may have found a way to use [free market-style] Chicago School antitrust against these platforms.

The model is very straightforward. We’ve always said that the services are free. But that’s actually not correct. It’s a barter of services for personal data. The data is the currency. And what happens, and it’s really obvious whether you look at Facebook or Google, is that the price of data has been rising geometrically, at a very steep slope.

The marker for that is simply data per user. The suite of services really hasn’t been changing that much, and the individual value of an action on the service doesn’t change much. And yet the value of the data they are getting is going up very significantly. Last year a Nobel Prize was awarded for similar analysis.

We can use the antitrust approach that we used in the AT&T [break-up]. You create MCI and Sprint by giving them access to really low rates through the long-distance model. What you are going to do is really simple. Any new startup that has a business model that fits the pattern we’re looking for gets free access to advertising for customer acquisition on the big platforms. Let’s just say it’s free up to a million people. Just think about that. What would that do for innovation?

If you want to solve the problem, you have to take away the incentive to have inappropriate stuff there in the first place. I don’t want to be in the censor business. I want to be in the business of eliminating the amplification of the worst content, the content that essentially creates political polarization and all these unhealthy outcomes.

The first step is to ask political questions. Why is it legal for somebody to scan your email or open your documents? Why is it legal for there to be third-party commerce in your most personal data? Why is it even legitimate to capture data on children at all? That’s the debate we need to have in 2020. And that’s the debate, interestingly enough, that brings people together, because it doesn’t matter whether they are left or right. This is an issue of right or wrong.



Ann Grimes currently serves as Associate Director of the Brown Institute for Media Innovation at Stanford University where she also teaches classes in media innovation, entrepreneurship and design. Previously, she served as Director of Stanford’s Graduate Program in Journalism and before that held senior editorial positions at The Wall Street Journal and The Washington Post.