
Facebook responds to whistleblower's claim that company chose profits over the public

ARI SHAPIRO, HOST:

First came documents - tens of thousands of pages of internal Facebook files. They show Facebook knows its algorithms promote hate and misinformation and that such content keeps people engaged. They also contain internal research showing that Facebook's Instagram app hurts some teens' mental health and exacerbates body image issues. Now the whistleblower who leaked those documents is speaking out. Frances Haugen is a former Facebook employee who spoke to "60 Minutes" last night.

(SOUNDBITE OF TV SHOW, "60 MINUTES")

FRANCES HAUGEN: The thing I saw on Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook over and over again chose to optimize for its own interests, like making more money.

SHAPIRO: We'll note that the company is among NPR's financial supporters. And we're joined now by Facebook executive Neil Potts. He is vice president for trust and safety. Welcome to ALL THINGS CONSIDERED.

NEIL POTTS: Hey, Ari. How are you? Neil Potts. Thank you.

SHAPIRO: All right - good to have you here. So let me explain Facebook's calculus this way. If the company changed the algorithm to make people safer and deprioritized hateful and divisive posts, then people would spend less time on the site and click on fewer ads. The company would make less money. Do you agree that polarization leads to engagement, which translates to money for Facebook?

POTTS: I think I would categorically deny that. I think that accusation is just a bit unfounded. At Facebook, what we are designing our products to do is to increase meaningful experiences. So whether those are meaningful social interactions - I think that is the change that the leaker mentions in her "60 Minutes" interview - or having just a positive social experience on the platform, that is what we want the product ultimately to provide. That makes an environment where people will come to Facebook, where they will come to Instagram and have a - just a better time. And that's really our bottom line.

SHAPIRO: So I just want to be clear. Are you saying it may be the case that polarization leads to more engagement, but we're not trying to polarize our users, so that's not what we want the algorithm to do? Or are you saying, no, the premise that more extreme views will lead people to engage more is just fundamentally wrong?

POTTS: I think just taking a step back on that question of polarization - I think polarization has, you know, existed long before Facebook and will exist long after, perhaps, we're all no longer here. That being said, at Facebook, we're not designing anything to be for the sensational or clickbait-y (ph) or engagement bait-y (ph) ways that polarization may be seen. And that, I think, was being accused of Facebook and the leadership decisions on the product. Our products are designed to have more meaningful social interactions - those interactions with your friends, your loved ones, your close-knit groups - to ensure that you're coming to the platform to have a - just a positive social experience, one where you can have not only meaningful experiences but things that you will find perhaps unfettered positive value in.

SHAPIRO: Well, let's talk specifically about some of the safeguards that were put in place in the run-up to the 2020 election. These were to reduce misinformation, and Haugen says many of those protections went away after the election. We now see indictments against some of the January 6 rioters that cite Facebook posts. So how does Facebook defend its decision to dissolve its Civic Integrity Unit now that you know what happened just after the election, when some of those safeguards were no longer in place?

POTTS: Thanks, Ari. I think I want to make sure that we're very clear here. The Civic Integrity Unit has not been dissolved in any way. Actually, it has been, in many ways, promoted throughout the company. The team is now sitting with other teams so that we can take lessons learned about elections.

SHAPIRO: So this was a centralized team, and now its individual responsibilities have been farmed out to other teams.

POTTS: It's still within our centralized integrity team, the broader team that focuses on other issues, including civic integrity but also issues that are, I think, at the top of everyone's mind, including issues around health and COVID-19. You can see a lot of the similarities between the way that we've treated elections on the platform with our elections hub as well as what we are doing around COVID-19 and the COVID-19 hub. So those teams are still in practice, doing work and actually working on civic issues in elections across the globe.

SHAPIRO: Let me turn to Instagram and how it affects young people. Here's how Haugen characterized the company's research last night on "60 Minutes."

(SOUNDBITE OF TV SHOW, "60 MINUTES")

HAUGEN: Facebook's own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers. It's that it is distinctly worse than other forms of social media.

SHAPIRO: And she says that research found 13.5% of teen girls said Instagram makes thoughts of suicide worse. Seventeen percent said it worsened eating disorders. And when Congress asked about this, CEO Mark Zuckerberg misstated the company's research and then withheld the damaging information that came out in this leak. How does Facebook justify that?

POTTS: I don't think Mark withheld any of the research. I think one of the issues here - we're, you know, very committed to doing research. We recognize the value so we can close the gaps or reduce the gaps. But I want to put a bit of the context of this research into the field as well. This research was done over a - for boys and girls of teenage years over 12 different areas. And the majority actually - the majority of respondents, who were both boys and girls, said that Facebook has - Facebook and Instagram have a net positive effect on their mental health in these areas. And for a subset of people who already were struggling with things like anxiety, nervousness - excuse me - depression, coming to Facebook, in 11 out of the 12 areas, they actually felt better leaving Facebook and Instagram after engaging.

On body image, I think that was one of the issues raised last week. There is a subset of people - of girls - who felt worse leaving. We recognize that, and we'll use that research to try to close those gaps. That's why I say we do that research.

SHAPIRO: You say you want to do this research to close these gaps. In another part of the program, I speak with a 21-year-old journalist about these issues, and she's been using these platforms since she was a teen. I asked her whether she believes Facebook's assurances that it can use this research, close these gaps, do better, fix these problems. And here's what she said.

NINA ROEHL: Well, if they could have, why haven't they already, right? This is not new. So I would love to say yes, that they can clean it up themselves. But my question then is, why haven't they already?

SHAPIRO: Neil Potts, what would you say to her?

POTTS: I would say to her, we're investing millions of dollars, billions of dollars on these issues to make sure that we are arriving at the right solutions. For any one person that experiences these harms, we want to make sure that we try to eliminate them. But on balance, we are doing the work, investing in the research so we know how to approach these issues and even sending interventions to people who may be impacted by such harms.

SHAPIRO: All right. Neil Potts is vice president for trust and safety policy at Facebook. Thank you for speaking with us.

POTTS: Thank you.

SHAPIRO: NPR tech correspondent Shannon Bond has been listening in to that interview with us. And, Shannon, what stood out to you from what we just heard?

SHANNON BOND, BYLINE: Well, you know, ever since all of this broke with Haugen's claims and the reporting in The Wall Street Journal, you know, we've been hearing Facebook push really hard against what she's saying and what these documents seem to reveal, especially this core premise - right? - that the company is putting profit ahead of safety. So we just heard Neil Potts say, you know, those are unfounded accusations.

But I think the real question here is, you know, maybe Facebook wasn't designed that way. You know, it wasn't designed, for example, to make, you know, engaging and negative content, you know, go viral. But if that is happening, what is Facebook doing about it? I think it gets at that last question you asked him, right? What is Facebook doing? And that's why we're seeing so much more pressure right now on the company about transparency.

SHAPIRO: I also just need to ask you briefly about the worldwide outage that we've seen today across many of Facebook's platforms.

BOND: That's right. I mean, the company says this is a network issue they're trying to fix. But Facebook, Instagram, WhatsApp - they've all been down since before noon Eastern. That's a long time - many hours. It's not really clear what is going on here. And certainly, you know, this sort of timing is leading a lot of people to make a lot of jokes about this. But, you know, it just shows you how reliant we all are on these platforms.

SHAPIRO: NPR tech correspondent Shannon Bond following the story of Facebook. And we will be following tomorrow's testimony before the Senate panel as well. Thanks for your coverage.

BOND: Thanks, Ari.

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.