evranch,

In this specific scenario, you wouldn’t want to remove the watermark.

The watermark would be the only thing that defines the content as “harmless” AI-generated content, which for the sake of discussion is being presented as legal. Remove the watermark, and as far as the law knows, you’re in possession of real CSAM and you’re on the way to prison.

The real concern would be adding the watermark to the real thing, to let it slip through the cracks. However, not only would this be computationally expensive if it were properly implemented, but I would assume the only goal in marketing the real thing would be to sell it to the worst of the worst, people who get off on the fact that children were abused to create it. And in that case, if AI is indistinguishable from the real thing, how do you sell criminal content when everyone assumes it's fake?

Anyway, I agree with other commenters that this entire can of worms should be left tightly shut. We don't need to encourage pedophilia in any way. "Regular" porn has already experienced selection pressure to the point where the taboo is now mainstream. We don't need to create a new market for bored porn viewers looking for something shocking.
