Meta - the parent company of Facebook, Instagram, and WhatsApp - is facing significant challenges in keeping pedophiles off its social media platforms.
According to an investigation by The Wall Street Journal, there are disturbing instances where Instagram and Facebook actively promote accounts associated with pedophilia, exacerbating an already perilous situation.
The report, which follows up on earlier coverage in the same publication, is damning evidence of Meta's continued evasion of accountability, after repeated accusations of user privacy breaches, promotion of disinformation, and deception of advertisers.
The Wall Street Journal produced the investigation in collaboration with researchers from Stanford University and the University of Massachusetts Amherst.
It revealed, for example, that Instagram's algorithms were linked to a widespread network distributing explicit content involving minors, and that Facebook still hosts plenty of communities sharing child abuse material.
Despite Meta establishing a child-safety task force in response, the WSJ report indicates that, five months after the first evidence surfaced, the company still has a substantial amount of work ahead, as its platforms continue to promote pedophilic content.
On Facebook, there are entire groups dedicated to sharing inappropriate content involving children.
Meta's efforts, including a task force of more than 100 employees that banned pedophile-related hashtags, have fallen short of resolving a problem the investigation describes as "alarming" in scale.
The WSJ cited, among others, the Canadian Centre for Child Protection, which discovered Instagram accounts with up to 10 million followers livestreaming videos of child abuse. Additionally, there are Facebook groups with hundreds of thousands of users openly celebrating incest and inappropriate content involving children, the WSJ's investigation claims.
Community f..king standards
Sadly, even when these groups were brought to Meta's attention, the company asserted that they did not violate its "Community Standards," despite clear indications, such as the term "Incest" being part of a Facebook group's name.
Meta claims to be refining tools to limit the spread of pedophilic accounts, and says it has removed 16,000 accounts that violated its child safety policies since July.
However, the company is reluctant to rein in the tools that algorithmically recommend content, a crucial revenue source. Moreover, amid ongoing layoffs that have cut staff responsible for reviewing suspected inappropriate content, Meta appears to have made minimal progress in addressing the issue.
This lack of decisive action is concerning, especially given the company's shrinking workforce and the strain on its content moderation teams as it leans ever more heavily on artificial intelligence.