Facebook has reversed its decision to delete an abortion charity’s page, saying an “error” had caused it to be banned under the company’s policies on “promoting and encouraging drug use.”
Yesterday, the Dutch group Women on Waves announced that its sister organization, Women on Web, had its Facebook page taken down over violation of community policies, but today in a quick reversal, Facebook reinstated the group’s page.
“Facebook is a place for people and organisations to campaign for the things that matter to them,” said the company. “Women on Web is an example of that. In this instance the account was disabled in error but has now been restored. We apologise for this and for any inconvenience caused.”
The Women on Web group, which gives advice to women on accessing abortion pills and procedures, had the account of its founder and director, Dr. Rebecca Gomperts, temporarily blocked in 2012 due to postings of images showing how women could use the abortion-inducing drug Misoprostol.
The group says that its Facebook page, “publishes news, scientific information and the protocols of the World Health Organization and Women on Web has answered over half a million emails to women who needed scientific, accurate information essential for their health and life.”
Earlier this month, Facebook founder Mark Zuckerberg announced that the company would be hiring 3,000 new employees to review material posted on its site to remove offensive and illegal content.
The move is seen as a response to growing criticism of Facebook and other content providers such as Google and Twitter, whose platforms have hosted footage of shootings, murders and other graphic content. In late April, to name one instance, a man in Thailand live-streamed the murder of his daughter on Facebook. The video was reportedly viewed 370,000 times and was only removed 24 hours after being uploaded.
“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” wrote Zuckerberg in a post on May 3rd.
Historically, Facebook has relied primarily on human staffers to review material and handle complaints over content, but the company announced late last year its plans to create an automatic flagging system to scour pages for offensive content. Joaquin Candela, the company’s director of applied machine learning, said that Facebook was increasingly relying on artificial intelligence to locate offensive material, describing the system as “an algorithm that detects nudity, violence, or any of the things that are not according to our policies.”
Facebook says that it makes its own calls on what constitutes offensive material, citing examples such as nudity, privacy violations, bullying, harassment and hate speech. The difficulty in policing Facebook’s content made Canadian headlines back in 2015, for example, when then-Prime Minister Stephen Harper’s Facebook page, which had posted a statement on the terror attacks in Paris, received hundreds of comments, some of which fell within the domain of hate speech and yet remained on the PM’s page even weeks later.
The job of policing its own content will only get more demanding, however, as Facebook continues to add more users worldwide. In its first quarterly report for 2017, the company said it had reached 1.94 billion monthly active users, 300 million more than one year ago, and said it expected to reach the two billion mark within weeks.