skullgiver,
@skullgiver@popplesburger.hilciferous.nl

That’s putting a lot of faith in CLIP, though. The thing is, to get CLIP to detect things like child porn reliably, you do need to train it to make that distinction. In my experience, CLIP tends to make up keywords, or at least misunderstand the situation, surprisingly often.
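
For what it’s worth, the “zero-shot” approach people put that faith in looks roughly like the sketch below, using the publicly released CLIP weights via Hugging Face. The model name, image path, and labels here are benign placeholders purely for illustration, not anything tuned for moderation:

```python
# Minimal zero-shot classification sketch with public CLIP weights.
# Model name, image path, and labels are illustrative placeholders only.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("photo.jpg")
labels = ["a photo of a cat", "a photo of a dog", "a photo of a landscape"]

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    # Similarity score between the image and each candidate label
    logits = model(**inputs).logits_per_image
probs = logits.softmax(dim=-1)[0]

for label, p in zip(labels, probs.tolist()):
    print(f"{label}: {p:.3f}")
```

The point is that CLIP only scores an image against whatever text prompts you hand it; it has no reliable, trained notion of a sensitive distinction unless someone actually fine-tunes it on labelled data for that distinction.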

If it weren’t super illegal and super unethical, AI could easily distinguish normal porn from illegal porn if you fed it enough tagged data of both. Categorisation is something these models are very good at, after all. That’s never ever going to happen (imagine the poor schmuck hired to tag child rape for a dollar a day, horrific), but it’s the only way I’d trust AI with something like this.

I think we need more research into this field. I’m also at least a little mad that AI companies release these models into the wild before the science is ready to prevent them from becoming child rape image generators for the mentally ill. Companies just seem to throw their hands in the air and go “well, we didn’t program it to do that, not our fault!” and deny any responsibility for what they’ve created.
