JBloodthorn (@JBloodthorn@kbin.social):

The content in question is unfortunately something that has become very common in recent months: CSAM (child sexual abuse material), generally AI-generated.

So either AI is now somehow generating entire children, abusing them, and uploading video of it...

...or they are counting "CSAM-like" images as CSAM.
