I am going to be disabling image uploads and image serving, moving to moderated signups, and instituting some extensive block lists on infosec.pub due to the pervasive problems with CSAM attacks on lemmy instances.
No, it hasn’t happened to any of our instances yet, but I don’t need that headache. And if anyone does try it, I promise you that I will make it my life’s mission to see that those responsible are convicted and rotting in prison where they belong. ❤️
Edit: h/t to @infosec_jcp for pointing out the problem to me.
@claushoumann child sexual assault material - basically stuffing an instance with many posts that contain child pornography originating from many different accounts/sources
Because no photo uploads is really lame. I understand not having mods because I have seen some really horrible pictures online before that I wish I could unsee, such as dismembered bodies and cannibalism and such.
@ludiusvox it doesn’t have much to do with moderators. We have them - lemmy has some shortcomings that make it easier for bad actors to post pictures without being detected.
@jerry@infosec_jcp That's quite a daring kind of attack to conduct, given that it raises the likelihood of various legal entities hunting them down. Pretty senseless, too.
@jerry@infosec_jcp Sorry for my ignorance but I had to look this one up to know and understand what it is.
Just in case there are others like me who need a little more explanation: CSAM stands for Child Sexual Abuse Material.
Forgive me again if the rest of this is inappropriate or may somehow be informative to the criminal element out there... and yes... catch them, and grind them into chum for the sea...
Just this last week, I had to send a thoughtfully worded email to supervisors at Microsoft letting them know that it is tone-deaf at best to be sending emails directed at InfoSec staff with the subject "CSAM" in it - this time a Newsletter from their Microsoft "Customer Success Account Managers".
The rest of the world knows that acronym by a very different definition, one with specific filters and actions associated with it when such messages come in.
@Enigma@infosec_jcp I am old enough to remember it being the acronym for Cyber Security Awareness Month. I am still a little miffed about that being taken over, candidly.
@jerry@Enigma@infosec_jcp Child Porn is quite an appropriate term for it. I don't know why there is always this drive to throw new terms and acronyms after everything that already exists.
Especially since the old term covers self-made images and videos, whereas CSAM kinda misses the boat.
@jerry@infosec_jcp "Rotting in prison" isn't my favorite outcome for anyone, but I agree this needs to be nipped in the bud. As far as tech solutions to social problems, moderation and block-lists I'm entirely behind. (And moderation is honestly a social solution to a social problem, at its core.)