dipshit, (edited)

They aren’t asking devs to be admins or for admins to be devs. They specifically called out the developers because code exists to filter child sexual abuse material, disseminated by organizations such as the FBI and law enforcement, which can be implemented for image uploading.

Yeah? I doubt this is true, but I could be wrong. You make it sound like preventing CSAM is as simple as importing a library, which I find dubious. Companies have been trying to filter out this material automatically for decades, and yet they still have to employ humans to do it manually because automated means don’t really work. This is why companies like Reddit and Facebook have trust and safety teams to do this work.

Edit: I googled and could not find this database. I’m thinking it’s a myth.

NOBODY in this comment section is advocating for uploading fucking child sexual abuse material. That is a strawman you are setting up. Nobody is advocating for allowing the uploading of child sexual abuse material, or for the “material to be up on lemmy instances”. NOBODY is suggesting that a single instance going down is “the world is ending”. NOBODY is asking for “100’s of mods to specifically address this one user’s posting of CSAM”.

ahem there were users who uploaded CSAM. Those are the users who were advocating for uploading CSAM, because they uploaded CSAM.

I’m literally arguing with people who are saying that the community shouldn’t have been shut down because it’s big, and that shutting down the community (not the CSAM) poses a threat to the fediverse. Maybe, but CSAM poses a legal threat, which is much greater than the threat of low engagement.

You’re setting up a strawman argument nobody is proposing. The criticism is that, at this moment, the developers of Lemmy have not implemented a method for automatically vetting uploaded images for CSAM without requiring “100’s of mods”, which is what resulted in the condition that “taking the community down is the only option here”.

Yeah, that doesn’t exist, as I’ve mentioned previously. You make it sound like getting CSAM off Lemmy is as simple as writing some code. If it were, why don’t Facebook and Reddit do this?

Perhaps the wording of the original post was not precise and accurate enough for your full and complete understanding of the intent and meaning behind it. In this post, I have attempted to elucidate that intent and meaning to a degree which I hope is understandable to you.

You’re not understanding how CSAM detection works or is handled.

The grim reality is this: cameras exist, children exist, adults exist, the internet exists, and the second a crime is committed, it is not added to an FBI database. IF such an FBI database existed, and IF it were useful (and not just a database of hashes for bit-perfect copies of CSAM), and IF it were updated whenever evidence of the crime surfaced… IF all of those things were true, THEN there would still likely be a huge swath of CSAM out there that could be posted at any time and that would NOT be detected.
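To make the “hashes for bit-perfect copies” point concrete, here is a minimal sketch in Python of what exact-hash matching at an upload endpoint would look like. Everything here is hypothetical: KNOWN_HASHES stands in for whatever hash list an instance might have, not any real feed.

```python
import hashlib

# Hypothetical set of known-bad SHA-256 digests; in a real deployment this
# would be loaded from whatever hash list the instance had access to.
KNOWN_HASHES: set[str] = set()

def is_known_exact_copy(image_bytes: bytes) -> bool:
    """Return True only if the upload is a bit-perfect copy of a listed image."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# The catch: re-encoding, resizing, or flipping a single pixel yields a
# completely different digest, so this flags nothing but exact copies.
```

That limitation is the whole point: a plain hash database only ever catches exact re-uploads, never new or altered material.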

Again, ask yourself: IF such a database existed, then WHY don’t Reddit, Twitter, and Facebook use it? Hell, why doesn’t every site use it?

Pedophiles, instead of downvoting me, why not explain yourselves?
