
The social network is once again under fire, after an investigation reveals that Facebook moderates very few images depicting child abuse.

On Facebook, moderation is regularly criticized by rights advocates. Whether it concerns the regulation of political advertisements, the promotion of firearms, incitement to hatred or, more recently, a VIP list unveiled by the Wall Street Journal allowing certain celebrities to bypass automatic moderation, the list is long, as are the scandals surrounding it.

This time, a new subject has set off the powder keg. According to a New York Times investigation citing an internal training document, images of child abuse are reportedly poorly moderated on Mark Zuckerberg’s platform.

Why age matters

When reviewing a report of child abuse, Facebook reportedly has great difficulty determining the age of the children accurately. When in doubt, the New York Times explains, “young people are treated as adults and the images are not reported to the authorities”. A loophole that lets a large amount of child abuse content slip through every year.

In the document brought to the newspaper’s attention, moderators are instructed to treat as adults anyone whose age cannot be determined with certainty. This is a major problem, since the video then slips through the cracks of moderation, and past the authorities as well. Under the standard procedure, all images of abuse must be reported to the National Center for Missing & Exploited Children (NCMEC), while those involving “only” adults are generally just removed from the platform.

Facebook plays the privacy card to justify itself

That said, Facebook is not such a poor performer when it comes to moderating problematic content involving minors. Every year, several million videos of suspected child sexual abuse are reported and handed over to the competent authorities. However, the New York Times investigation argues that this weakness in age recognition also causes a large number of sexual abuse cases to go unreported.

By relying on an identification method more than 50 years old to assess “the progressive phases of puberty”, Facebook reportedly allows thousands, perhaps millions, of child abuse videos to slip through the cracks. For its part, the company defends its moderation system tooth and nail: according to Antigone Davis, head of safety at Facebook, treating users as adults in case of doubt simply serves to protect users’ privacy. Especially since Meta could be held legally liable in the event of a false report.

