Facebook VP of Content Policy Monika Bickert

On this episode of Yahoo Finance Presents, Facebook Vice President of Content Policy Monika Bickert sat down with Yahoo Finance’s Daniel Howley to discuss the recent whistleblower report stemming from documents leaked by ex-employee Frances Haugen. In this interview, Bickert addresses the moderation systems of Facebook and Instagram, the challenges of moderating content internationally and in foreign languages, the claim in the leaked documents that Instagram negatively affects teens, and the role Mark Zuckerberg plays in moderation at Facebook.

Video Transcript

[MUSIC PLAYING]

DANIEL HOWLEY: I’m Dan Howley. This is “Yahoo Finance Presents.” We’re here with Facebook VP of Content Policy Monika Bickert. Monika, I want to start off by talking to you about hate speech on Facebook.

Usually, when Facebook discusses some of the graphic content on the platform, it says that 99% of the content it removes is caught by its own systems before anybody sees it. But in the report, Frances Haugen, the whistleblower, says that only 3% to 5% of hate speech is caught, and only 1% of violence and incitement content is caught. So I guess, what amount of content is Facebook able to catch in these instances?

MONIKA BICKERT: Sure. Thanks, Dan. And I want to start by saying I think there have been a lot of mischaracterizations today, including by this former employee, who did not work on these issues and who, even with the documents she stole, I think has mischaracterized what they say and what they mean about our work here.

So in terms of what we do to remove content from the platform, we actually publish a quarterly report, which includes not only the amount of content that we remove in these areas and how much of that we find ourselves, which you referenced earlier, but also what the underlying prevalence of the content is. That means: what did we miss?

So for things like hate speech, that prevalence, which used to be higher when we first started proactively identifying this sort of content through technology, is now down to less than a tenth of a percent. And the numbers are similar for things like violence and incitement or terror organizations. So if you look at those reports, which, again, we publish every quarter, you can see the progress that we’ve made and how we’ve driven this sort of content to low levels.

Now, look, the nature of social media is that you have people sharing content in real time. So we are not perfect at catching everything that comes through the door. But we’ve invested significantly, billions of dollars, in building a safer platform, and that includes building technical systems that over time will allow us to do this work better and better.

DANIEL HOWLEY: So I want to talk about the international side of that. We’ve talked about the US and the prevalence of hate speech, and the work on hate speech, there. But some of the documents revealed that Facebook’s capabilities aren’t as robust in foreign markets, and there was discussion of human trafficking content and cartels recruiting on Facebook. I guess, how is Facebook working to minimize that? And does it have enough resources to cover international markets?

MONIKA BICKERT: Language is a challenge. And one of the reasons it’s a challenge is because when you’re building technical tools to find abusive content or violating content, you need examples so that you can train the machines to go and find this stuff. And so we have an approach where we train our machine learning on hundreds of thousands of examples. And we get better and better over time.

And often we’ll start doing something in one language, and then, as we build the capability, we roll it out to other languages. With hate speech, for instance, we initially focused on English. We focused on Burmese. We focused on languages where we had special reasons to be concerned about the potential for hate speech. But over time we broaden that, and we’ve certainly done that with hate speech. And we’ll continue to get better at the technical components of this.

DANIEL HOWLEY: Is it something that Facebook is working on in Ethiopia with the violent clashes that are going on there?

MONIKA BICKERT: Yes. Specifically, in countries like Ethiopia, where there is either conflict going on or a risk of it, we have special teams who work proactively. As soon as we identify that there may be a risk, they’re working on the ground to identify the NGOs, the academics, and others who can let us know what the trends might be and what the risks might be.

We train our reviewers to understand the lingo and the topics we need to be concerned about. And then sometimes we actually put bespoke policies in place, where we’ll say: this is a term, or this is a trend, that we’re seeing in this region, and we need to make sure we’re on top of it.

We started building those relationships pretty early on in my nine years of doing this job, but they’ve gotten a lot more robust in, I would say, the past three or four years, as we’ve significantly invested in building what we call our network of trusted partners. These are organizations around the world focused on safety issues, or on how speech can affect certain communities in different areas, so that we can take those learnings and make sure we’re doing what we can to keep the site safe.

DANIEL HOWLEY: I want to quickly pivot to the issue of young girls and the body image discussion. You know, that’s been something that’s obviously come up a lot. Is Facebook doing anything to address this and to prevent kids from running into body image issues on services like Instagram?

MONIKA BICKERT: Yes, and I want to correct the record on the way that report has been mischaracterized. First, when it comes to research on safety, we work with safety academics around the world. And on research writ large, our researchers participated in more than 400 peer-reviewed research articles published last year alone.

The stolen documents contain not a peer-reviewed research article but a survey of a small number, I think around 40, of teen Instagram users who were already struggling with mental health issues. And what it showed is that for every issue, anxiety, body image issues, thoughts of self-harm, the majority of both boys and girls said that Instagram made things better or didn’t have a material effect.

But, look, I spent my career before Facebook in child safety and criminal prosecution. And I know I speak for all of us who work on safety at Facebook when I say that even one young person having a bad experience is too many. It’s why we’re investing in the technology we are. It’s also why we’ve built the robust network of safety partners that we have. And the research we’re doing is so that we can understand when people are having a bad experience and then, to your point, actually build new products and tools that can help.

Examples would include, if you’re a young person on Instagram now, hiding the number of likes your content gets so that you don’t feel that peer pressure, or giving people tools to restrict who can follow them so that they don’t have to worry about bullying and harassment. Those are the sorts of things we’re developing based on the research we do.

DANIEL HOWLEY: And I guess I just have one more question. There’s been discussion of moderation and revenue, whether moderating content impacts revenue, or vice versa. Does Mark have any input into content moderation at Facebook?

MONIKA BICKERT: Mark’s very involved in these issues. And it’s actually because of that involvement that we’ve seen the investment internally in understanding and getting better on these issues. It’s why you’ve seen the Oversight Board, the independent board we built to make decisions on Facebook content. It’s also why you’ve seen proactive calls for regulation, time and time again, from Mark and from others at the company. I put out a paper on this a couple of years ago.

We think that governments need to have a stronger voice in these really important issues. And we’ve engaged with regulators around the world, including in the US, the UK, and France. We’ll continue to do that. But we think regulation is a big part of getting this industry, and getting the public and social media, to a better place.

DANIEL HOWLEY: And then just as a quick follow-up, are you aware of the documents that were leaked? Have you been able to look over them yourself?

MONIKA BICKERT: In fact, we actually published the Instagram youth survey that was leaked, because these are things we want to help put in context. I want to be clear again that the employee who took these documents didn’t work on these issues. Just like a reporter who reads a colleague’s story and then says, well, I’m an expert in this topic, having read these things doesn’t make you an expert. And I think we’ve seen some serious mischaracterizations that I’m happy to be here today, Dan, to help set the record straight on.

DANIEL HOWLEY: All right, thank you very much, Monika Bickert, VP of Content Policy at Facebook.

MONIKA BICKERT: Thank you.

https://finance.yahoo.com/video/yahoo-finance-presents-facebook-vp-100000843.html