jerry,
@jerry@infosec.exchange avatar

I am going to be disabling image uploads and image serving, moving to moderated signups, and instituting some extensive block lists on infosec.pub due to the pervasive problems with CSAM attacks on lemmy instances.

No, it’s not happened to any of our instances yet, but I don’t need that headache. And if anyone does, I promise you that I will make it my life’s mission to see that those responsible are convicted and rotting in prison where they belong. ❤️

Edit: h/t to @infosec_jcp for pointing out the problem to me.

claushoumann,
@claushoumann@mastodon.social avatar

@jerry @infosec_jcp what’s a CSAM attack?

jerry,
@jerry@infosec.exchange avatar

@claushoumann child sexual abuse material - basically, stuffing an instance with many posts that contain child pornography, originating from many different accounts/sources

wally3k,
@wally3k@infosec.exchange avatar

@jerry @infosec_jcp Would it be worth running through your existing served images with https://github.com/db0/lemmy-safety just in case? 🙂​

jerry,
@jerry@infosec.exchange avatar

@wally3k thanks for that. I will investigate - I'll have to get a server with a GPU, but it looks promising
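For anyone wondering what such a scan involves, here is a minimal sketch of the general pattern only, not the actual lemmy-safety tool or its interface: walk the directory where pict-rs keeps stored images, score each file with an image-safety classifier, and quarantine anything over a threshold for admin review. The storage path, threshold, and score_image() below are placeholders; the real tool does the scoring with a GPU-backed model, which is why a GPU server comes up here.

```python
# Illustrative sketch only, not the lemmy-safety tool itself: walk the
# pict-rs image store, score each file with an image-safety classifier,
# and quarantine anything flagged so an admin can review it. The paths,
# threshold, and score_image() are placeholders.
from pathlib import Path
import shutil

PICTRS_DIR = Path("/var/lib/pictrs/files")          # assumed storage path
QUARANTINE_DIR = Path("/var/lib/pictrs/quarantine")
THRESHOLD = 0.9                                     # arbitrary cutoff


def score_image(path: Path) -> float:
    """Placeholder scorer that flags nothing; swap in a real model here
    (lemmy-safety uses a GPU-backed classifier for this step)."""
    return 0.0


def scan() -> None:
    QUARANTINE_DIR.mkdir(parents=True, exist_ok=True)
    for image in PICTRS_DIR.rglob("*"):
        if not image.is_file():
            continue
        if score_image(image) >= THRESHOLD:
            # Move rather than delete so a human can review before purging.
            shutil.move(str(image), str(QUARANTINE_DIR / image.name))


if __name__ == "__main__":
    scan()
```

In practice you would point this at wherever your instance actually keeps its pict-rs data (or at an object-storage bucket, if that is where images live) and have a human review the quarantine directory before anything is purged.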

ludiusvox,
@ludiusvox@mastodon.online avatar

@jerry @infosec_jcp
That is not fun. I am glad I am on the mastodon.online server.

jerry,
@jerry@infosec.exchange avatar

@ludiusvox @infosec_jcp mastodon.online is a mastodon instance, not a lemmy instance. They are quite different things.

ludiusvox,
@ludiusvox@mastodon.online avatar

@jerry @infosec_jcp

Because having no photo uploads is really lame. I understand not having mods because I have seen some really horrible pictures online before that I wish I could unsee, such as dismembered bodies and cannibalism and such.

jerry,
@jerry@infosec.exchange avatar

@ludiusvox it doesn’t have much to do with moderators. We have them - lemmy has some shortcomings that make it easier for bad actors to post pictures without being detected.

ksaj,
@ksaj@infosec.exchange avatar

@jerry @infosec_jcp That's quite a daring kind of attack to conduct, given that it raises the likelihood of various legal entities hunting them down. Pretty senseless, too.

akmartinez,
@akmartinez@infosec.exchange avatar

@jerry @infosec_jcp Sorry for my ignorance, but I had to look this one up to understand what it is.

Just in case there are others like me who need a little more explanation: CSAM stands for Child Sexual Abuse Material.

Forgive me again if the rest of this is inappropriate or may somehow be informative to the criminal element out there... and yes... catch them, and grind them into chum for the sea...

https://news.ycombinator.com/item?id=28229657

Enigma,
@Enigma@infosec.exchange avatar

@jerry @infosec_jcp Good on ya!

Just this last week, I had to send a thoughtfully worded email to supervisors at Microsoft letting them know that it is tone-deaf at best to send emails directed at InfoSec staff with "CSAM" in the subject line - this time a newsletter from their "Customer Success Account Managers".

The rest of the world knows that acronym by a very different definition, one with specific filters and actions associated with it when such messages come in.

jerry,
@jerry@infosec.exchange avatar

@Enigma @infosec_jcp I am old enough to remember it being the acronym for Cyber Security Awareness Month. I am still a little miffed about that being taken over, candidly.

ksaj, (edited )
@ksaj@infosec.exchange avatar

@jerry @Enigma @infosec_jcp Child Porn is quite an appropriate term for it. I don't know why there is always this drive to throw new terms and acronyms at everything that already exists.

Especially since the old term covers self-made images and videos, whereas CSAM kinda misses the boat.

jerry,
@jerry@infosec.exchange avatar

@ksaj @Enigma @infosec_jcp yeah, I think it was intended to make the terminology more menacing for a variety of obvious reasons.

NosirrahSec,
@NosirrahSec@infosec.exchange avatar

@ksaj @jerry @Enigma @infosec_jcp Context.

Words matter.

Pornography implies consent.
CSAM does not.

jerry,
@jerry@infosec.exchange avatar

@NosirrahSec @ksaj @Enigma that’s a very fair point

ksaj,
@ksaj@infosec.exchange avatar

@jerry @NosirrahSec @Enigma It would be a fair point if it were true. Find even one definition of pornography that says anything about permission.

https://www.google.com/search?q=define%3Apornography

ksaj,
@ksaj@infosec.exchange avatar

@NosirrahSec @jerry @Enigma @infosec_jcp There are all kinds of porn that do not imply consent. We don't have special 4 letter acronyms for them.

Porn means only that it is for sexual gratification. Here, find me a definition that even implies permission.

https://www.google.com/search?q=define%3Apornography

NosirrahSec,
@NosirrahSec@infosec.exchange avatar

@ksaj @jerry @Enigma @infosec_jcp yikes.

You're really trying to argue semantics right now?

ksaj,
@ksaj@infosec.exchange avatar

@NosirrahSec @jerry @Enigma @infosec_jcp @jerry brought it up. Yell at him lol.

jerry,
@jerry@infosec.exchange avatar

@ksaj @NosirrahSec @Enigma @infosec_jcp my bad. Please stand down.

ksaj,
@ksaj@infosec.exchange avatar

@jerry @NosirrahSec @Enigma @infosec_jcp Hehe My apologies for agreeing. 🤣

jbaggs,
@jbaggs@infosec.exchange avatar

@jerry @infosec_jcp "Rotting in prison" isn't my favorite outcome for anyone, but I agree this needs to be nipped in the bud. As far as tech solutions to social problems go, I'm entirely behind moderation and block lists. (And moderation is honestly a social solution to a social problem, at its core.)
