Twitter Failing To Deal With Child Sexual Abuse Material, Says Stanford Internet Observatory


Twitter has failed to remove images of child sexual abuse over recent months—even though they were flagged as such, a new report will allege this week.

Stanford Internet Observatory researchers say the company failed to act on 40 items of Child Sexual Abuse Material (CSAM) between March and May of this year.

The researchers used Microsoft’s PhotoDNA to search for images containing CSAM. PhotoDNA automatically hashes images and compares them against known illegal images of minors held in the National Center for Missing & Exploited Children (NCMEC) database; the scan returned 40 matches.
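The matching step works roughly like this: every image is reduced to a fingerprint, and that fingerprint is looked up in a set of fingerprints of known illegal material. PhotoDNA itself is proprietary and uses a perceptual hash that tolerates resizing and re-encoding; the sketch below is a simplified stand-in that uses an exact SHA-256 digest instead, with made-up byte strings standing in for image files.

```python
import hashlib

def hash_image(data: bytes) -> str:
    """Return a fingerprint for an image's raw bytes.

    Stand-in only: SHA-256 matches byte-identical files, whereas
    PhotoDNA's perceptual hash also matches altered copies.
    """
    return hashlib.sha256(data).hexdigest()

def find_matches(images: dict[str, bytes], known_hashes: set[str]) -> list[str]:
    """Return the IDs of images whose fingerprint is in the known-bad set."""
    return [img_id for img_id, data in images.items()
            if hash_image(data) in known_hashes]

# Hypothetical database of known-bad fingerprints and a batch of uploads.
known = {hash_image(b"known-bad-image")}
uploads = {"post_1": b"harmless image", "post_2": b"known-bad-image"}
print(find_matches(uploads, known))  # -> ['post_2']
```

The key design point is that the platform never needs to store the illegal images themselves: it only stores and compares fingerprints supplied by a clearinghouse such as NCMEC.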

The team reports that “the investigation found problems with Twitter’s CSAM detector mechanisms. We reported this issue in April to NCMEC, but the problem persisted.”

Having no Trust and Safety contact at Twitter, the researchers briefed an intermediary instead. Twitter was notified of the problem, and the issue appears to have been resolved by May 20.

Research of this kind is about to become far harder, or at any rate far more expensive, following Elon Musk’s decision to charge $42,000 per month for access to Twitter’s previously free API. The Stanford Internet Observatory has recently been forced to stop using the enterprise tier of the API, and the free tier provides only read-only access. There are also concerns that researchers may be forced to delete data previously collected under an agreement.

The Stanford Internet Observatory has been a constant thorn in Twitter’s side since it highlighted disinformation spread on the platform during the 2020 U.S. presidential election; Musk called the group a “propaganda system” at the time.

The Wall Street Journal will publish further findings from the research later this month.

The report notes that Twitter “is not the sole platform that deals with CSAM nor is it the main focus of our upcoming study,” and the researchers thank Twitter for its help in improving child safety.

Twitter Safety announced in January that it was “moving faster than ever” to remove CSAM.

Several reports since then have shown that CSAM remains a problem on the platform. The New York Times reported in February that, after Elon Musk’s takeover, Twitter took twice as long to remove CSAM flagged by child safety groups.

The company still replies to press queries with a poop emoji.

The post Twitter Failing To Deal With Child Sexual Abuse Material, Says Stanford Internet Observatory appeared first on Social Media Explorer.
