Facebook ‘Puts Engagement and Growth Before the Health and Welfare of Democracy’

Janine Jackson interviewed Free Press’s Tim Karr about challenging Facebook for the October 29, 2021, episode of CounterSpin. This is a lightly edited transcript.

Gizmodo: Facebook Has No Clue How to Solve Its Image Problem, Leaked Doc Shows

Gizmodo (10/27/21)

Janine Jackson: Gizmodo reports leaked documents from Facebook showing that the company’s internal research found that people don’t trust it, that they are confused by its content moderation rules, and that, as the reporter puts it, “nobody believed Facebook was motivated by anything but fat stacks of cash.”

This comports with revelations from whistleblower Frances Haugen, and with earlier reporting, that point to a company that worse than failed to stop the spread of disinformation, including around Covid-19 and the Stop the Steal movement behind the January 6 insurrection; that has a troubling record of targeted content removal abroad, including, we’re learning, against Palestinians; that enables products and features it knows are harmful to millions of young users; and for which all of this, crucially, seems to be not a bug but a feature.

The company’s reported consideration of a name change is unlikely to deflect the public, legislative and regulatory scrutiny it now faces. But what that scrutiny will amount to concretely is still to be determined.

Here to catch us up is Tim Karr. He’s senior director of strategy and communications at Free Press. He joins us now by phone from New Jersey. Welcome back to CounterSpin, Tim Karr.

Tim Karr: Always great to be with you.

JJ: Before we talk about what we’re learning, and around what we’re learning, I wonder if you would talk to us a bit about how we are coming to learn this stuff. Folks may have seen the Wall Street Journal publishing “The Facebook Files.” But a number of people have been working on these whistleblower documents, and on other materials. What do you think is meaningful about what’s going on maybe behind the scenes here?

60 Minutes: Whistleblower: Facebook is misleading the public on progress against hate speech, violence, misinformation

60 Minutes (10/4/21)

TK: This is really a remarkable series of events. As you mentioned, we first started learning about what are called the “Facebook Papers” when the Wall Street Journal did a fairly extensive exposé, a number of stories, a couple of weeks ago. We also learned that there was a whistleblower involved who had also provided documents to Congress and to the SEC. We came to know that whistleblower when 60 Minutes did an exposé and interviewed Frances Haugen. Congress then called her to testify, and there was a lot of news around that.

And then there was a sort of second wave where a number of news outlets, including some pretty major names, like the New York Times, the Associated Press, CNN, started reporting stories. And it was really this sort of avalanche of stories that happened earlier this week and continues to this day.

And what’s happening here is that Frances Haugen has a pretty sophisticated PR operation. When she decided to come forward as a whistleblower, she received support from Whistleblower Aid, which is a group that provides legal protection to whistleblowers, and also from a PR agency. And the PR agency decided that not only were they going to provide these thousands of pages of documents to the Wall Street Journal, they wanted to make sure that the story remained in the headlines for weeks.

So what they did was work with a consortium of 17 news outlets and say: We’re going to give you bits and pieces, and every day you’re going to get a new bunch of documents, and then you can report on them. And they’ve created this structure by which they’re slowly doling out chunks of these documents to this consortium, with the plan that this will last for six weeks.

So through the end of November, we will be seeing new daily news items provided through this structure that was created.

JJ: Trying to keep it from being a flash in the pan, like a one-day story.

TK: Yeah, I think it’s really smart in certain ways. But I also know, having spoken to some of the reporters, that it’s frustrating. A lot of them feel like they’re being played like puppets.

JJ: Right.

TK: That this PR firm working for Frances Haugen has kind of got them over a barrel. And there’s been a lot of dissent within journalism about how this is structured, and some have gone so far as to reject this deal and go about their own reporting on these documents and on Facebook without being beholden to this process.

JJ: It sounds like information that we just want to get out. And as savvy as it might be to play into the way we know reporters work, where maybe they’ll cover it for two days and then drop it, if it’s information that’s valuable and meaningful to the public, you just kind of want to get it out there, right?

Tim Karr, Free Press

Tim Karr: “Indeed, Facebook has moved fast and broken things. But some of the things that they have broken include the lives of people.”

TK: Well, yes, it’s incredibly valuable information. I mean, Facebook’s unofficial motto used to be, “Move fast and break things.” And we now know, because of these documents, that indeed Facebook has moved fast and broken things. But some of the things that they have broken include the lives of people in places like the Philippines and Myanmar and Ethiopia. Facebook has also moved fast and broken trust in our democratic institutions and emergency healthcare systems.

They thought that they could hide these facts, and we have this whistleblower to thank for bringing a lot of this to light. So it is important that we know these things. And I think, seriously, it has caused irreversible damage to Facebook, which is now called Meta, by the way. They just announced it.

JJ: Oh boy.

TK: In particular, I don’t know that Mark Zuckerberg, the founder, and Sheryl Sandberg will outlast this. I think there’s a lot of serious thinking out there about them having to step down as a result of this.

JJ: We should note that we are very much in medias res with this. Things are changing around us: there’s an FTC lawsuit, there’s the US attorney general, there’s legislation in the works. And, no, I don’t think a rebranding is going to trick people into losing track of who this is connected to.

You’ve run through some of the specific impacts that have been revealed: pushing users to extremist groups, not checking disinformation, targeting ethnic minorities abroad. It’s worth saying that sometimes this is shuttled off as a social media issue, as though it were not about real human beings. And one takeaway from these revelations is that communities, communities of color, LGBTQ communities, are really at risk from campaigns of hate and harassment and violence that, we’re learning, Facebook foments intentionally, as it were, or at least doesn’t stop once it’s aware of them.

TK: Yes, and one of the more important things that we’ve found through this process of exposure is that Facebook doesn’t devote moderation or AI-filtering resources to languages spoken outside of the United States. For example, Arabic has 22 distinct dialects, and Facebook’s artificial intelligence can’t really tell the difference. And if you’re talking about Covid disinformation in Spanish, the AI is unable to determine whether that’s a violation of Facebook’s rules or not. So there’s been a real failure when it comes to non-English disinformation spreading over the network.

And it’s not only a problem, as many of these reports have revealed, in countries like India, in the Middle East, in North Africa, in Myanmar, Ethiopia and elsewhere. It’s a problem in the United States where we have a number of diaspora communities who don’t speak English, and often rely on Facebook in their own languages as a source of news and information. And Facebook just hasn’t dedicated the resources to vetting those languages. So we find that the spread of disinformation on Covid or on the 2020 election results, for example, is far worse in non-English-speaking communities that use Facebook.

NYT: Facebook, Show Us the Mess

New York Times (10/27/21)

JJ: I want to talk about what responses these revelations seem to call for, and where they might come from. But I did want to note this New York Times piece, with the kind of icky headline “Facebook, Show Us the Mess,” the point of which was that perhaps the public and Facebook would benefit if these kinds of “rare, unvarnished” glimpses into its workings, such as the Facebook Files offer, weren’t so rare. And the Times column says that that “might make the company a little more trustworthy and understood.”

That piece reminded me of a piece by Cynthia Khoo, of the Center on Privacy and Technology, about the trap, if you will, of transparency as an end in itself–

TK: Right.

JJ: –when what we need is accountability. Don’t show us the mess, fix it. I wonder if you would talk about what serious responses to the harms that have been revealed about Facebook might look like.

TK: Mark Zuckerberg and other Facebook executives have appeared before Congress multiple times over well more than a year. I think we’ve seen in a lot of those hearings that members of Congress were just kind of incapable of talking about how to regulate, how to provide some sort of official oversight, to prevent all of the harm that Facebook is causing. And so this process has helped advance that thinking.

A lot of the interesting work that’s being done in Congress is about looking at the business model, a model that puts engagement and growth before the health and welfare of a multiracial democracy, and about starting to question the way data is used, and how data is abused in discriminatory ways, so that ads about job opportunities, for example, can be shown to white people but not to others. Facebook had the capacity to target ads in that way. It still does.

And so there is a role for Congress to push not only for, as you say, transparency; transparency is only part of the picture. We also need to make sure that if data is being collected, it’s being used in a way that protects the civil rights of individuals and can’t be used in discriminatory ways. The FTC also has the authority to conduct a rulemaking on how not just Facebook but other social media platforms use data.

And so we’ve been very involved in organizing support for action in Congress and at the FTC, the Federal Trade Commission, to launch those sorts of rulemaking proceedings, so that we can create a stronger regulatory framework to prevent these types of abuses from happening again.

JJ: We’ve been speaking with Tim Karr. He’s senior director of strategy and communications at Free Press. You can follow their work on Facebook and a range of other issues online at FreePress.net. Thank you so much, Tim Karr, for joining us this week on CounterSpin.

TK: Thanks, Janine.

This content originally appeared on FAIR and was authored by Janine Jackson.
