Lemmyshitpost community closed until further notice

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do to stop it: since we changed our registration policy, the attackers simply post from other instances.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators, and if it wasn’t his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed the bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff made some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we’ve felt helpless. Anyway, I hope we can announce something more positive soon.

CharlesDarwin,
@CharlesDarwin@lemmy.world avatar

All I can say is that I’m sorry this is happening to you guys. Are there specific ways others can help?

DarthBueller,

Relevant: Excellent congressional legal memo explaining the current US law on CSAM (as of 2022) Link

KrisND,

lemmy.world isn’t in the US, at least right now.

Imotali,
@Imotali@lemmy.world avatar

You think the US cares about that? Europeans still have to de facto prove they aren’t Americans, or their European banks withhold US taxes from them.

The US does not and has not ever cared about international borders unless they benefit the US.

ricdeh,
@ricdeh@lemmy.world avatar

That’s a super dumb claim; it’s literally fake news and entirely nonfactual.

ClarkDoom,

Never been more glad that I don’t follow shitposters or shitpost communities. Can’t believe that community is as big as it is anyway considering how annoying shitposting is.

HelloHotel,
@HelloHotel@lemmy.world avatar

It was relatively innocent from what I remember, just strange memes.

cubedsteaks,

Seriously. I’m not surprised by this at all because shitposting welcomes 4chan and 8chan types that have all kinds of access to CSAM

zakobjoa,
@zakobjoa@lemmy.world avatar

lemmyshitpost was very wholesome, I can assure you.

cubedsteaks,

that doesn’t really matter when it comes to attacks.

zakobjoa,
@zakobjoa@lemmy.world avatar

These attacks didn’t happen on lemmyshitpost because there’s shitposts on lemmyshitpost. They happened because it was one of the largest communities. This is an attack, not some lost wackos trying to establish a CSAM sharing community.

cubedsteaks,

I don’t think you understood my original comment.

zakobjoa,
@zakobjoa@lemmy.world avatar

No, I did, but lemmyshitpost did not welcome 4chan/8chan type posters. It was supportive, inclusive and weird. Very much unlike chan culture.

cubedsteaks,

I see why you’re confused because you think I was saying they were allowed when they weren’t.

No, the term “shitpost” is enough to draw them in.

Xeknos,
@Xeknos@lemmy.world avatar

There’s shitposting and then there’s mean-spirited shitposting. Lemmy’s community was the former, 4/8chan is the latter.

the_post_of_tom_joad,

I’m quite pleased with this development, honestly, since I’ve only seen posts I hate coming from there.

newIdentity,

You might find them annoying; I think they’re funny.

iquanyin,
@iquanyin@lemmy.world avatar

Same. I’ve never understood the appeal, of doing it but especially of “consuming” it.

ekZepp,
@ekZepp@lemmy.world avatar

Any news regarding the situation? (Tue 12:59)

HelloHotel,
@HelloHotel@lemmy.world avatar

Tue Aug 29 1:04:32 PM EDT 2023,

Haven’t heard much yet; new comments are appearing rather slowly, so we may be in some approve-only mode.

Dramachad,
@Dramachad@lemmy.world avatar

deleted_by_moderator

    KrisND,

    Few things…

    • Style is absolutely horrible…
    • Rules are WAY worse than any Lemmy instance I’ve seen.
    • Pretty sure visiting that site put me on a watch list.

    There are so many lemmy instances to join instead lol https://join-lemmy.org/

    Dramachad,
    @Dramachad@lemmy.world avatar

    The community is pretty good, no CSAM as far as I know

    newIdentity,

    People posting CP usually do it to harm a site or community. This also happened with Reddit, and multiple communities were shut down because of it.

    Dramachad,
    @Dramachad@lemmy.world avatar

    Pretending it’s the “other” posting CP has been a longstanding fascist tactic going back to before Reddit.

    ptrckstr, (edited )

    I’m afraid the fediverse will need a crowdsec-like decentralized banning platform. Get banned on one platform for this shit, get banned everywhere.

    I’m willing to participate in fleshing that out.

    Edit: it’s just an idea, I do not have all the answers, otherwise I’d be building it.

    HelloHotel,
    @HelloHotel@lemmy.world avatar

    Maybe FIDO for identity purposes is a good idea. Maybe some process that takes a week to calculate an identity token, plus an approval and rejection system for known tokens.

    atticus88th,

    [ptrck has been permanently banned from all social media]

    Katana314,

    What you’re basically talking about is centralization. And, as much as it has tremendous benefits of convenience, I think a lot of people here can cite their own feelings as to why that’s generally bad. It’s a hard call to make.

    rbar,

    They didn’t say anything about implementation. Why couldn’t you build tooling to keep it decentralized? Servers or even communities could choose to ban from their own communities based on a heuristic over the moderation actions published by other communities. At the end of the day it is still individual communities making their own decisions.

    I just wouldn’t be so quick to shoot this down.

    Rambi,

    There is something similar to that for Minecraft servers: a website/plugin where people’s bans get added, so other admins can check usernames there to see if someone’s a troll or whatever and ban them straight away before they cause issues. So it’s definitely possible to do in a decentralised way.

    newIdentity,

    Soo… You want to centralize a decentralized platform…

    ptrckstr,

    You can have a local banlist supplemented by a shared banlist containing these CSAM individuals for example.

    thisisawayoflife,

    That ban list could be a set of rich objects. The user that was banned, date of action, community it happened in, reason, server it happened at. Sysops could choose to not accept any bans from a particular site. Make things fairly granular so there’s flexibility to account for bad actor sysops.
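The “rich ban object” idea above can be sketched concretely. This is only a minimal illustration in Python; the record fields and the `accept_bans` helper are hypothetical names for the sake of the sketch, not anything Lemmy actually implements:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class BanRecord:
    # One entry in a hypothetical shared, federated banlist
    user: str        # who was banned, e.g. "@spammer@bad.example"
    banned_on: date  # date of the moderation action
    community: str   # community where the action happened
    reason: str      # moderator's stated reason
    server: str      # instance that issued the ban

def accept_bans(records, distrusted_servers):
    # Sysops can choose not to accept bans from particular sites
    return [r for r in records if r.server not in distrusted_servers]

bans = [
    BanRecord("@a@x.example", date(2023, 8, 29), "shitpost", "CSAM", "x.example"),
    BanRecord("@b@y.example", date(2023, 8, 29), "memes", "spam", "y.example"),
]
print([r.user for r in accept_bans(bans, {"y.example"})])  # ['@a@x.example']
```

Each instance would keep its own local list and merge in shared records only from servers it trusts, so the final decision stays with the individual community rather than a central authority.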

    newIdentity,

    But how do you know that these people actually spread CSAM and someone isn’t abusing their power?

    BradleyUffner,

    There is no way that could get abused… Like say, by hosting your own instance and banning anyone you want.

    ptrckstr,

    Anything can be abused, but you can also build proper safeguards.

    HelloHotel,
    @HelloHotel@lemmy.world avatar

    It might be in the works, comment

    ImpossibleRubiksCube,

    I think a ban list is, unfortunately, contrary to the notion of decentralization, as otherwise warranted as it is in this instance.

    What about a centralized list of data on major offenders, which could be made subscribable by instances? Perhaps a way of calculating probability of origin, matched with offenses made by that origin in the past? That way an instance could take it under advisory, and take actions suiting them?

    This is only the beginning of an idea, but if it is possible to collaborate without centralizing, we should explore it.

    CreeperODeath,

    I feel like this would be difficult to enforce

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    Do people think that someone has to use the same email address, or the same username? If someone uses a different email, username, and IP address (don’t argue semantics; it can be done, always could be, and always has been), then whatever you put into the list can’t be applied to them.

    Even if you ask for IDs, people can fake those. It’s illegal, sure, but so is what these assholes did, and that didn’t really stop them, now did it?

    Draconic_NEO,
    @Draconic_NEO@lemmy.world avatar

    We already have that; it’s called prison. You can’t go on the internet from prison (at least I’d assume so; it wouldn’t make much sense if people could). That’s not 100%, since people need to be caught for it to work, but once they are, it certainly is.

    Though other global ban solutions don’t really work well, because they require a certain level of compliance that criminals aren’t going to follow through with (i.e. not committing identity theft). They can also be abused by malicious actors to falsely ban people (especially with the whole identity-theft thing).

    ObviouslyNotBanana,
    @ObviouslyNotBanana@lemmy.world avatar

    Well that’s fucking horrible. Jesus christ on a bicycle.

    HexesofVexes,

    Sounds like the 4chan raids of old.

    Batten down, report the offenders to the authorities, and then clean up the mess!

    Good job so far ^_^

    johnthebeboptist,

    Yeah, in good and in bad, this place is a lot like “the old days of the internet”, but thankfully more in the good way than the bad. People keep complaining about shit I haven’t seen for a second, constantly there are actions from the mod/admin side about shit I’ve never even seen etc. Even without proper mod tools and shit it seems like everything is working out quite well. To which I and all of us owe a huge thank you to the people working on this stuff.

    Thank you people! Thank you for making this place feel like home and something greater than what we’ve had for a long ass time.

    HelloHotel,
    @HelloHotel@lemmy.world avatar

    Mods, keep yourselves safe, be open with your support groups (don’t isolate), and clear and shred your caches often.

    We will get through this together.

    BoldRaven,

    They can out-post us. They have nothing better to do… 😬

    ZestycloseReception8,

    Isn’t this what 8chan is for? Seriously, what the fuck is wrong with people who think that CSAM is appropriate shitpost material? The Internet really is a fucked-up place. Is there a way to set up something to automatically remove inappropriate posts, similar to YouTube’s system for automatically removing inappropriate content?

    HelloHotel,
    @HelloHotel@lemmy.world avatar

    It’s an attack on the fediverse.

    BoldRaven,

    thanks, soldier 😆

    ArtisinalBS,

    Legally, in the US, you would have to remove it from public view, keep it stored on the server, collect identifying information about the uploader, and send all that data to the authorities, all while maintaining a low false-positive rate.
    Kind of a tall order.

    middlemanSI,

    I know no details but smell Elon stink…

    BoldRaven,

    It is a somewhat identifiable musk indeed… I like to say it’s that of Pepé Le Pew.

    Darth_vader__,
    @Darth_vader__@discuss.online avatar

    What if we use deep-learning-based automoderators to instantly nuke these posts as they appear? For privacy and efficiency, let’s make the model open source and community-maintained… maybe even start a separate donation fund for maintaining it, and maybe even make it a public API!

    SamboT,

    For sure let us know when it’s done

    bilal,

    How will these models be trained? Surely they would need a huge amount of such data to train on.

    query,

    Yeah, that can’t be a community or open-source project. Maybe the result can be open source and free to implement, but even if you can artificially generate the training data, that all has to be a strictly controlled process.

    Darth_vader__,
    @Darth_vader__@discuss.online avatar

    Maybe we can make an SFW bot and train it on general porn… most of the communities do not allow porn anyway, right?

    CreeperODeath,

    I agree But it’s also something to be careful of

    Diprount_Tomato,
    @Diprount_Tomato@lemmy.world avatar

    OK, how can that AI distinguish those posts in a way that doesn’t produce false detections? That’s the main issue with leaving the job of mods to an AI.

    Darth_vader__,
    @Darth_vader__@discuss.online avatar

    Instead of removing, maybe hide the post and let a mod review it? I say let it report all pornographic posts and let the mods choose what should be allowed.
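A hedged sketch of that flow: the classifier only hides and queues, never deletes, and a human makes the final call. The threshold, field names, and `triage` function here are illustrative assumptions, not any real Lemmy API:

```python
REVIEW_THRESHOLD = 0.5  # assumed cutoff for "suspect"; would need tuning

def triage(post, score, review_queue):
    # Hide suspect posts from public view and queue them for a human mod;
    # nothing is deleted automatically.
    if score >= REVIEW_THRESHOLD:
        post["hidden"] = True
        review_queue.append(post)
    return post

queue = []
triage({"id": 1, "hidden": False}, 0.92, queue)  # flagged: hidden and queued
triage({"id": 2, "hidden": False}, 0.10, queue)  # below threshold: stays public
print(len(queue))  # 1
```

The design trade-off is exactly the one discussed above: a low threshold catches raids early at the cost of more posts waiting in the mod queue.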

    Diprount_Tomato,
    @Diprount_Tomato@lemmy.world avatar

    That’s great for detecting possible raids, but once they’re detected, it seems like an over-bureaucratisation of the process.

    ricdeh,
    @ricdeh@lemmy.world avatar

    I think that it’s a wrong assumption that a human moderator would be any more reliable than a sufficiently elaborate and trained machine learning system

    TheWoozy,

    It sounds like a lot of work for an uncertain outcome. Please volunteer to head up the project!

    scarabic,

    Thank you.

    solstice,

    I’d like to do my part and report there are a ton of assholes on lemmy. It’s REALLY bad here and I’ve been around a while.

    saltedFish,

    That’s the portion of the user base that didn’t leave reddit voluntarily.

    Candelestine,

    Dingding.

    I also think we’ve caught the attention of some haters that just don’t like hopeful people or nice things.

    Iceblade02,

    Wow, this is awful. Huge kudos to y’all for holding on through this. It’s obviously a deliberate attack on the fediverse by malicious actors.

    CreeperODeath,

    Yeah I don’t blame you for being frustrated

    HiddenLayer5, (edited )
    @HiddenLayer5@lemmy.ml avatar

    This whole situation is shitty all around, I was really hoping it would be an isolated incident and that they wouldn’t come back. I think I speak for everyone when I say that the bastards posting CSAM need to be jailed or worse. Disagree with an instance or community all you want, the instant you pull something like this, you’ve lost every single argument and are irredeemably a horrible person not worthy of touching a computer ever again.

    Once again pedophiles ruin everything nice. I know saying this isn’t that helpful since I can’t do anything about it, but I’m sorry this is happening on your instance (and is getting federated to other instances). Don’t worry about inconveniencing regular users; taking action against CSAM is far more important, and any halfway reasonable user will understand.

    Some resources for reporting CSAM if you come across it anywhere (not just Lemmy):

    US: www.missingkids.org/cybertipline

    Canada: cybertip.ca/app/en/

    International: www.inhope.org

    Last but not least, a reminder that if you accidentally load CSAM on your device, in most cases it will get cached to your storage, because the majority of apps and browsers employ disk caching by default. You should at the very least clear your caches and then trim and fully wipe the free space on your device (maybe also directly shred the actual files if you can do that/know how to).

    Also know if you have any mandatory reporting laws where you live and comply with them. (EDIT: another commenter mentioned that in some jurisdictions you might actually not be allowed to delete the files immediately and, presumably, have to contact police right away to report the active file on your device.) CYA to prevent yourself from getting screwed because of someone else’s horrible acts.

    Also, something I’ve been thinking about since this whole thing started: it might be helpful to use a no-disk-cache browser/app (or disable disk caching on your current browser/app if you can) if you do wish to keep using Lemmy, at least until this whole thing blows over. That way you can just close the page/program and reboot your device, and the local copy should be gone, especially since flash storage devices cannot be reliably wiped with the “fill the drive with blank data” method (I’m not sure how big the risk is of it ending up in the swapfile or otherwise sticking around, or at what point it stops counting as possession). Being exposed to CSAM is a nightmare for this reason, and unfortunately there seem to be no good resources on what to do if you’re exposed.

    I am not a lawyer and no part of this comment is legal advice.

    ToyDork,
    @ToyDork@lemmy.zip avatar

    Unfortunately, from what I’ve heard about American laws, you actually CAN’T delete it and have to report cached content to the police.

    Not sure about my own country, but I do know that kind of content is basically radioactive in most jurisdictions worldwide.

    HiddenLayer5, (edited )
    @HiddenLayer5@lemmy.ml avatar

    If that’s the case then I stand corrected, again, not a lawyer and not giving any legal advice. I’m in Canada and I need to check my local laws as well, seems like a really important thing to know in general for anyone that uses the internet. I also think governments pretty much across the board need to do a way better job of actually educating people on exactly what to do if they are accidentally exposed to CSAM, because it seems like those resources are basically nonexistent.
