At Facebook we get things wrong but we take our safety role seriously | Monika Bickert

Our reviewing of difficult posts and images is complex and challenging. We appreciate the Guardian revealing how tough it is to get the balance right

Last month, people shared several horrific videos on Facebook of Syrian children in the aftermath of a chemical weapons attack. The videos, which also appeared elsewhere on the internet, showed the children shaking, struggling to breathe and eventually dying.

The images were deeply shocking, so much so that we placed a warning screen in front of them and made sure they were only visible to adults. But they also prompted international outrage and renewed attention on the plight of Syrians.

The Guardian's reporting on how Facebook deals with difficult issues and images such as this gets a lot of things right. Reviewing online material on a global scale is complex, challenging and essential. The articles and the training materials published alongside them show just how hard it can be to identify what is harmful, and what is necessary to preserve people's ability to share freely. As the person in charge of doing this work for Facebook, I want to explain how and where we draw the line.

On an average day, more than a billion people will use Facebook. They will share posts in dozens of languages: everything from photos and status updates to live videos. A very small percentage of those will be reported to us and investigated by our moderation teams. The range of issues is broad, from bullying and hate speech to terrorism and war crimes, and complex. Designing policies that both keep people safe and enable them to share freely means understanding emerging social issues and the way they manifest themselves online, and being able to respond quickly to millions of reports a week from people all over the world.

For our reviewers, there is another hurdle: understanding context.

It's hard to judge the intent behind one post, or the risk implied in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or to speak out against it?

Someone with a dark sense of humour posts a joke about suicide. Are they just being themselves, or is it a cry for help?

Cultural context is part of it too. In the UK, being critical of the monarchy might be acceptable. In some parts of the world it will get you a jail sentence. It's easy to comply with a clear-cut law, but most of the time what's acceptable is more about norms and expectations. Social attitudes are constantly evolving, and every society has its flash points. New ways to tell stories and share images can bring these tensions to the surface faster than ever.

Our approach is to try to set policies that keep people safe and enable them to share freely. We aim to remove any credible threat of violence, and we respect local laws. We don't always share the details of our policies, because we don't want to encourage people to find workarounds, but we do publish our Community Standards, which set out what is and isn't allowed on Facebook, and why.

Our standards change over time as our community grows and social issues around the world evolve. We are in constant dialogue with experts and local organisations, on everything from child safety to terrorism to human rights.

Sometimes this means our policies can seem counterintuitive. As the Guardian reported, experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted and help, but to take them down afterwards to prevent copycats.

Sometimes this is not enough to prevent tragedy, but sometimes it is.

When a girl in Georgia, USA, attempted suicide on Facebook Live two weeks ago, her friends were able to notify police, who managed to reach her in time. We are aware of at least half a dozen other cases like this from the past few months.

We also try hard to stay objective. The cases we review aren't the easy ones: by definition, something is reviewed when it falls within a grey area. Art and pornography aren't always easily distinguished, but we've found that digitally generated images of nudity are more likely to be pornographic than handmade ones, so our policy reflects that.

There's a big difference between general expressions of anger and specific calls for a named individual to be harmed, so we allow the former but don't permit the latter.

Read more: https://www.theguardian.com/commentisfree/2017/may/22/facebook-get-things-wrong-but-safety-role-seriously