On Wednesday, Facebook said its moderators removed 8.7 million user images of child nudity during the last quarter, aided by Artificial Intelligence (AI) and Machine Learning (ML) technologies that automatically flag such photos.
The ML tool, rolled out over the last year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook’s ban on photos that show minors in a sexualised context. A similar system, also disclosed on Wednesday, catches users engaged in “grooming,” or befriending minors for sexual exploitation.
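Facebook has not published implementation details, but the description implies two independent signals, a nudity score and a child-presence score, combined before enforcement. The following is a minimal sketch of that idea; the model outputs, names and thresholds are assumptions for illustration, not Facebook’s actual system:

```python
# Illustrative sketch: flag an image only when it scores high for BOTH
# nudity and the presence of a child, as the article describes.
# Thresholds and field names are hypothetical.

from dataclasses import dataclass

NUDITY_THRESHOLD = 0.9  # assumed cutoff for a nudity classifier
MINOR_THRESHOLD = 0.8   # assumed cutoff for a child-presence classifier

@dataclass
class ImageScores:
    nudity: float  # probability the image contains nudity, in [0, 1]
    minor: float   # probability the image contains a child, in [0, 1]

def should_flag(scores: ImageScores) -> bool:
    """Queue an image for human review only when both signals fire."""
    return scores.nudity >= NUDITY_THRESHOLD and scores.minor >= MINOR_THRESHOLD

# A high nudity score alone (e.g. adult content) is not flagged here;
# it is the combination of the two signals that triggers enforcement.
print(should_flag(ImageScores(nudity=0.95, minor=0.10)))  # False
print(should_flag(ImageScores(nudity=0.95, minor=0.92)))  # True
```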
Facebook’s global head of safety, Antigone Davis, told Reuters in an interview that the “machine helps us prioritise” and “more efficiently queue” problematic content for the company’s trained team of reviewers, who work to prevent the sexual exploitation of children across online technologies.
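Ranking flagged items by model confidence so reviewers see the likeliest violations first is a standard triage pattern; a hedged sketch of what “prioritise and more efficiently queue” could look like (the queue and scoring below are illustrative, not Facebook’s internals):

```python
import heapq

# Hypothetical review queue: items with a higher model-estimated chance of
# a violation are surfaced to human reviewers first. heapq is a min-heap,
# so scores are negated to pop the highest score first.
review_queue: list[tuple[float, str]] = []

def enqueue(post_id: str, violation_score: float) -> None:
    heapq.heappush(review_queue, (-violation_score, post_id))

def next_for_review() -> str:
    _neg_score, post_id = heapq.heappop(review_queue)
    return post_id

enqueue("post-a", 0.55)
enqueue("post-b", 0.97)
print(next_for_review())  # "post-b": the highest-scoring item comes first
```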
The company is exploring applying the same technology to its Instagram app.
Under pressure from regulators and lawmakers, Facebook has sped up the removal of extremist and illicit material. Machine learning programs that sift through the billions of pieces of content users post every day are essential to that plan.
Machine learning is imperfect, and news agencies and advertisers are among those that have complained this year about Facebook’s automated systems wrongly blocking their posts.
Facebook’s community standards ban even family photos of lightly clothed children uploaded with “good intentions,” out of concern about how others might abuse such images. The company had not previously disclosed figures on child nudity removals, though some such images would have been counted among the 21 million posts and comments it removed in the first quarter for sexual activity and adult nudity.
On Wednesday, Facebook’s share price fell 5 per cent.

The child grooming system evaluates factors such as how many people have blocked a particular user and whether that user quickly attempts to contact many children, Davis said.
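The article names only two concrete signals: how many users have blocked an account, and rapid attempts to contact many children. A minimal sketch combining just those two signals into a review score; the weights and threshold are assumptions, not Facebook’s:

```python
from dataclasses import dataclass

@dataclass
class AccountActivity:
    blocks_received: int       # how many people have blocked this user
    minors_contacted_24h: int  # distinct minors contacted in the last day

# Hypothetical weights and threshold, for illustration only.
BLOCK_WEIGHT = 1.0
CONTACT_WEIGHT = 2.0
REVIEW_THRESHOLD = 10.0

def grooming_risk(a: AccountActivity) -> float:
    """Combine the two signals the article mentions into a single score."""
    return (BLOCK_WEIGHT * a.blocks_received
            + CONTACT_WEIGHT * a.minors_contacted_24h)

def needs_review(a: AccountActivity) -> bool:
    return grooming_risk(a) >= REVIEW_THRESHOLD

print(needs_review(AccountActivity(blocks_received=2, minors_contacted_24h=1)))  # False
print(needs_review(AccountActivity(blocks_received=8, minors_contacted_24h=5)))  # True
```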
Michelle DeLaune, chief operating officer at the National Center for Missing and Exploited Children (NCMEC), said the organisation expects to receive about 16 million child porn tips worldwide this year from Facebook and other tech companies, up from 10 million last year.
Author: THE HANS INDIA
Published at: Thu, 25 Oct 2018 15:46:12 +0530