Stanford researchers find that Mastodon has a huge child abuse material problem

Mastodon, the decentralized network seen as a viable alternative to Twitter, is rife with child sexual abuse material (CSAM), according to a new study by Stanford’s Internet Observatory, reported by The Washington Post. In just two days, the researchers found 112 instances of known CSAM across 325,000 posts on the platform, with the first instance appearing after just five minutes of searching.

To conduct its research, the Internet Observatory scanned the 25 most popular Mastodon instances for CSAM. The researchers used Google’s SafeSearch API to identify explicit images, as well as PhotoDNA, a tool that detects known, previously flagged CSAM. During the search, the team found 554 pieces of content matching hashtags or keywords commonly used by online child sexual abuse groups, all of which were identified as explicit with the “highest confidence” by Google SafeSearch.
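For context, here is a minimal sketch of how explicit-image flagging with Google Cloud Vision’s SafeSearch annotation might look. The researchers’ actual pipeline is not published in this article; the file path, threshold, and function name below are illustrative assumptions.

```python
# Minimal sketch: flagging explicit images with Google Cloud Vision's
# SafeSearch annotation. Illustrative only -- not the researchers' pipeline.
# Assumes the google-cloud-vision package and application credentials are configured.
from google.cloud import vision


def safesearch_flags(image_bytes: bytes) -> dict:
    """Return SafeSearch likelihood labels for a single image."""
    client = vision.ImageAnnotatorClient()
    response = client.safe_search_detection(image=vision.Image(content=image_bytes))
    annotation = response.safe_search_annotation
    likelihood = vision.Likelihood
    return {
        "adult": likelihood(annotation.adult).name,
        "violence": likelihood(annotation.violence).name,
        "racy": likelihood(annotation.racy).name,
    }


if __name__ == "__main__":
    with open("sample.jpg", "rb") as f:  # hypothetical local file
        flags = safesearch_flags(f.read())
    # Treat VERY_LIKELY as the "highest confidence" signal the study refers to.
    if flags["adult"] == "VERY_LIKELY":
        print("Image flagged for human review:", flags)
```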

Open posting of CSAM is “disturbingly widespread”

The researchers also found 713 uses of the top 20 CSAM-related hashtags across the Fediverse in posts containing media, as well as 1,217 text-only posts pointing to “off-site CSAM trafficking or grooming of minors”. The study states that open posting of CSAM is “disturbingly widespread”.

As an example, the researchers point to the extended mastodon.xyz server outage we noted earlier this month, an incident that resulted from CSAM posted to Mastodon. In a post about the incident, the server’s sole maintainer said he was alerted to content containing CSAM, but noted that moderation is done in his spare time and can take up to a few days; this isn’t a huge operation like Meta with a worldwide team of contractors, it’s just one person.

Although he said he had dealt with the offending content, the registrar of the mastodon.xyz domain suspended it anyway, leaving the server inaccessible to users until he could reach someone to restore the listing. After the issue was resolved, the mastodon.xyz administrator said the registrar had added the domain to a “false positive” list to prevent future takedowns. However, as the researchers point out, “the reason the action took place was not a false positive.”

“We got more PhotoDNA hits in a two-day period than we’ve probably had in the entire history of any kind of social media analysis at our organization, and it’s not even close,” David Thiel, one of the report’s researchers, told The Washington Post. He attributed much of this to a lack of the tooling that centralized social media platforms use to address child safety concerns.

As decentralized networks like Mastodon grow in popularity, so do concerns about safety. Decentralized networks don’t use the same approach to moderation as mainstream sites such as Facebook, Instagram, and Reddit. Instead, each decentralized instance controls its own moderation, which can lead to inconsistency across the Fediverse. That’s why the researchers recommend that networks like Mastodon adopt more robust moderation tools, along with PhotoDNA integration and CyberTipline reporting.
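To illustrate the kind of known-hash check the researchers recommend instances integrate, here is a rough sketch of screening uploaded media against a vetted hash list before a post is federated. PhotoDNA itself is a licensed perceptual-hashing service and is not shown; the plain SHA-256 comparison and the function and variable names below are illustrative assumptions only.

```python
# Rough sketch of a known-hash screen in an instance's moderation pipeline.
# PhotoDNA is a licensed perceptual-hashing service; this stand-in uses plain
# SHA-256, which only catches byte-identical files. Names here are hypothetical.
import hashlib

# In practice this set would be populated from a vetted hash list supplied by
# a child-safety organization, not hard-coded by the instance operator.
KNOWN_HASHES: set[str] = set()


def media_hash(data: bytes) -> str:
    """Hash uploaded media bytes for comparison against the known-hash list."""
    return hashlib.sha256(data).hexdigest()


def screen_upload(data: bytes, report_queue: list[str]) -> bool:
    """Return True if the upload may proceed; otherwise hold it and queue a report."""
    digest = media_hash(data)
    if digest in KNOWN_HASHES:
        # Matched media is blocked from federating and queued for a
        # CyberTipline-style report handled by a human moderator.
        report_queue.append(digest)
        return False
    return True
```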
