
Facebook's Sheryl Sandberg On Data Privacy Fail: 'We Were Way Too Idealistic'

Apr 6, 2018
Originally published on April 6, 2018 at 9:30 am

After weeks of remaining conspicuously out of sight, Facebook Chief Operating Officer Sheryl Sandberg told NPR's Steve Inskeep that she doesn't know if companies other than Cambridge Analytica exploited users' private data without their consent.

"We don't know," she said, leaning into a black leather swivel chair at the company's headquarters in Menlo Park, Calif., on Thursday.

Sandberg said Facebook has launched an investigation and audit to determine whether user information has been compromised by other firms.

"As we find those, we're going to notify people," she said.

The interview with NPR is part of an intense damage control tour the company has undertaken to respond to users, politicians and regulators who are outraged to learn that for years, Facebook had left people's personal information open for mining by third-party app developers.

Chief Executive Officer Mark Zuckerberg is scheduled to testify at congressional hearings next Tuesday and Wednesday.

Sandberg outlined some of the steps the company is taking to repair the damage: On Monday, Facebook will begin notifying the 87 million users whose information may have been compromised and given to Cambridge Analytica, a data mining firm used by the Trump campaign.

But, she said, the company will not be making the same kind of individual outreach to the estimated 126 million Facebook users who were exposed to Russian disinformation, including through the use of pages and accounts linked to a "troll factory."

"We really believed in social experiences. We really believed in protecting privacy. But we were way too idealistic. We did not think enough about the abuse cases," she said.

Facebook, the world's largest social media company, is in the middle of a reputational crisis and faces questions from lawmakers and regulatory agencies after the political research firm Cambridge Analytica collected information on as many as 87 million people without their permission. Previous estimates had put the number of users affected at 50 million.

Now the company, which has lost about $100 billion in stock value since February, is reviewing its data policies — and changing some of them — to find better methods of protecting user data.

And its leaders are apologizing.

"We know that we did not do enough to protect people's data," Sandberg said. "I'm really sorry for that. Mark [Zuckerberg] is really sorry for that, and what we're doing now is taking really firm action."

The Federal Trade Commission is looking into whether Facebook violated a 2011 consent decree by allowing third parties to have unrestricted access to user data without users' permission and contrary to user preferences and expectations.

The penalties for violating the order would be devastating, even for Facebook. At $40,000 per violation, and with as many as 87 million users potentially affected, the total cost could theoretically run into the billions.

Sandberg contended that Facebook is, and has been for years, in compliance with the agreement. She also said the company has been in constant contact with the FTC. But as for the steps Facebook has taken so far to better protect people's privacy, Sandberg readily admitted they should have been taken years ago.

Asked about any forthcoming governmental oversight, she insisted, "We're not even waiting for regulation."

Sandberg also touched on another contentious issue for Facebook: the role it will play in the upcoming elections and in the U.S. political system as a whole.

"We certainly know people want accurate information, not false news, on Facebook, and we take that really seriously," she said.

Looking ahead to the midterm elections and the 2020 presidential race, Sandberg said, "We want to make sure that there's no foreign interference." Facebook plans to prevent that by "taking very aggressive steps on ads transparency."

"I think what really matters is that we learn from what's happened. Security is an ongoing game," she said.

It is a game that will continue to challenge the company. "You build something; someone's going to try to get around it. This is going to keep happening," Sandberg said.

Sandberg has been criticized in recent weeks for her absence from the growing global debate about Facebook's failure to protect user privacy and what it must do to move forward.

Until now, Sandberg had contributed little to that conversation, taking several days after the Cambridge Analytica revelation to write a Facebook post explaining a few of the company's next steps. It piggybacked on a similar mea culpa from Zuckerberg.

A day later, in an interview with CNBC, she admitted Facebook had violated its users' trust.

Sandberg joined Facebook as second in command in 2008 and was then touted as the adult in the room who would help company co-founder Zuckerberg by bringing management expertise she had gained from six years at Google.

Her time at Facebook coincides with a period of tremendous growth at the company. The site now has about 2.2 billion users. She has been critical to Facebook's success in becoming a behemoth in advertising. The company makes almost all of its revenue and profit from ads, in large part because of the strategies implemented under Sandberg, especially as Facebook moved to mobile. As of 2015, one in five minutes spent in mobile apps was on Facebook.

And in that decade, Facebook's massive trove of user data has been used to help companies create highly targeted ads. That has made it a dominant force in digital advertising.

Copyright 2018 NPR. To see more, visit http://www.npr.org/.

STEVE INSKEEP, HOST:

Sheryl Sandberg, one of the top executives of Facebook, says she had to come to grips with the reality - a building full of social media trolls in St. Petersburg linked to the Russian government spread election disinformation in the United States using Facebook.

SHERYL SANDBERG: In 2016, the Russian Internet Research Agency interfered in the election on our platform, and that was something we should have caught, we should have known about. We didn't. Now we've learned.

INSKEEP: That learning is what Sandberg wants to emphasize now. Over the past year, Facebook seemed to downplay fake news on the platform. Then it had to acknowledge Russian trolls made significant use of Facebook. Next week, the company faces congressional hearings about sharing its users' data and more. And as we met Sheryl Sandberg in the Facebook headquarters building, the 2018 elections loomed. Facebook said for years it was not a publisher, just a platform, not entirely responsible for what billions of people post there, no matter how deceptive those posts may be. Disasters of recent years have forced the company to shift its approach somewhat. And when we referred to the company as a publisher, Sheryl Sandberg did not question it.

What do you think your company's role is as a publisher in this year's election and in the presidential election that's coming in a few years?

SANDBERG: Well, we certainly know that people want accurate information, not false news on Facebook. And we take that really seriously, and we just want to make sure that there's no foreign interference. We are also really taking very aggressive steps on ads transparency.

INSKEEP: The company says it will disclose who pays for political ads on Facebook.

SANDBERG: We're also building an archive of political ads that will run forward and build for four years so you'll always have, once it builds up, four years of data where, for any political ad, you'll be able to see who ran it, who paid for it, how much they spent and the demographics of who saw it. Again, industry-leading transparency.

INSKEEP: Because it's clear to you that in 2016 it's hard for anybody to know. Or, it was hard at the time for anybody to know just how money was being spent and by whom.

SANDBERG: Well, this hasn't happened in our industry. And that's why, again, we're not waiting for the regulation to happen to do this. We're doing it because we think that transparency is really important.

INSKEEP: Since the 2016 election, Facebook has taken steps to deemphasize news shared by media companies. Articles shared by your friends get more prominence. Now it plans more steps. News organizations widely rated as credible will get more play while those deemed not so credible will get less. Outside fact-checkers will help to examine articles, and users will be warned when they try to share doubtful ones.

Are you comfortable being the censor, which is effectively what you would have to be, wouldn't it?

SANDBERG: We're trying to have very good community standards. We're open about what those community standards are all around the world, and we're going to get increasingly open about this. We want to make sure people understand, you know, there's no place for terrorism. There's no place for hate. There's no place for bullying. We don't sell your data, ever. We don't give your information to advertisers. You're not allowed to put, you know, hate content on our site. With news, we rely on third parties. We don't believe we can be the world's fact-checkers, but that doesn't mean we don't have a big responsibility.

INSKEEP: A company that aspired to connect the world has begun to face demands that it occasionally break the connection.

You probably know that there was a leaked memo from 2016 from a Facebook executive who said we care so much about connecting people that even if we connected people who used our platforms to coordinate a terrorist attack, we're fine with that because we're still just connecting people.

SANDBERG: Right. So...

INSKEEP: That was 2016. Do you still believe that?

SANDBERG: We never believed that. The person who wrote it, named Boz, never believed it. He's a provocative guy and was trying to spark debate. But Mark never believed it. I never believed it. So terrorism...

INSKEEP: So maybe it was hyperbole. But he was leaning in a direction he did believe, that maybe you cared too much about this...

SANDBERG: Well, let's go to the example.

INSKEEP: ...Too little about other things.

SANDBERG: Let's go to the example.

INSKEEP: Sure.

SANDBERG: There's no place for terrorism on our platform. We've worked really hard on this. Ninety-nine percent of the ISIS content we're able to take down, we now find before it's even posted. We've worked very closely with law enforcement all across the world to make sure there is no terrorism content on our site, and that's something we care about very deeply.

INSKEEP: But what about the broader point? Essentially he was saying the company's values are out of whack - we're interested in one really big important thing, perhaps to the exclusion of other things.

SANDBERG: Again, that memo is wrong, and he said he didn't mean it. And Mark and I certainly never agreed. We never only cared about one thing. We cared about social sharing, and we cared about privacy. That's why we put the controls in place. I think the balance was off because we didn't foresee as many bad use cases, and that balance has shifted and shifted hard now.

INSKEEP: That's part of our talk with Sheryl Sandberg of Facebook. She's talking to people like us in part to prepare the ground for an event next week. Her boss, Mark Zuckerberg, takes questions before Congress. And NPR congressional reporter Kelsey Snell is with us. Hey there, Kelsey.

KELSEY SNELL, BYLINE: Hi there.

INSKEEP: What do lawmakers want to know?

SNELL: Well, they want to know a lot because they have been asking for Mark Zuckerberg to come and testify before Congress for a long time - for years, in fact. And he has put that off, and they have sent other people, other representatives from Facebook. But there will be a lot of pent-up energy and a lot of pent-up questions for Zuckerberg, not just about Cambridge Analytica and the security situation, but about Facebook's role and social media's role in data security and the way people's information is shared.

INSKEEP: But let me ask what the point is, Kelsey, because when we were talking with Sheryl Sandberg, one of the things she said in the full interview is there's not really very much regulatory activity going on in Congress. There's only one piece of legislation that she even knew about that seemed mildly significant. Are lawmakers actually considering anything that would in any way rein in or regulate Facebook?

SNELL: Even some Democrats, who are more open to the idea of regulation, say that it would be hard in this environment to pass any new legislation that regulates Facebook or other social media sites. But I think it's interesting what we heard her say there about voluntary transparency. That is a way to stave off any inklings of regulation that might be brewing in Congress, and it kind of sets up a situation where Congress may not want to crack down now. But these things take time. Hearings traditionally are the start, not the end, of something in Congress. So it's kind of this moment where Congress is acknowledging a national conversation, stepping in, saying that they're paying attention. But we may not see them actually respond with legislation or with any real action for some time.

INSKEEP: How significant is Facebook's promise to be more transparent about who is paying for political ads? I mean, I'm asking you as a political reporter, was it hard to tell who was spending money, how, in the 2016 election?

SNELL: Yeah. And this new transparency, it will give new information, but it's hard to know just from what she's saying right now how that information will be accessed, how deep the information will go. Right now we as reporters have access to a fairly in-depth research opportunity to kind of go through political ads that are, you know, that exist now. And we need to know what this will look like from them.

INSKEEP: Kelsey, thanks.

SNELL: Thank you.

INSKEEP: That's NPR's Kelsey Snell. Transcript provided by NPR, Copyright NPR.