Lemmyshitpost community closed until further notice

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do because the attackers simply post from other instances now that we have changed our registration policy.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. And if it wasn’t his community, it would have been another one. And it is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember we also had to deal with people posting scat not too long ago, so this isn’t the first time we have felt helpless. Anyway, I hope we can announce something more positive soon.

Feathercrown,

Fuck these trolls

expatriado,

troll is too mild of a word for these people

Feathercrown,

How about “pedophile”? I mean, they had to have the images to post them.

jarfil,

“Terrorist”. Having the images doesn’t mean they liked them; they used them to terrorize a whole community, though.

HelloHotel, (edited )
@HelloHotel@lemmy.world avatar

“Pedophile-enabled terrorist” or “pedophilic terrorist”, depending on the person.

It still means they can tolerate CSAM, or are normalized to it enough that they feel something other than disgust during “shipping and handling”.

AlpacaChariot,

All of your comments have “banned” next to them for me (logged in via lemmy.world using Liftoff) - any idea why?

I assume you’re not actually banned…?

Whitehat93875,
@Whitehat93875@lemmy.world avatar

They were banned because they were defending pedophilia (advocating for them to be able to get off to what turns them on) and also trolling very aggressively. You can look it up in the modlog on the website; not sure if apps implement the modlog yet, though.

AlpacaChariot,

Ah thanks, I’ve seen it a few times but thought it was a bug. Why are people like this!

Whitehat93875,
@Whitehat93875@lemmy.world avatar

I have no clue; people can be quite toxic and horrible. Also noticed that they reduced his ban, not sure why; defending pedophilia is pretty bad and definitely carries legal risk, but it’s not my call to make.

jarfil, (edited )

Yeah, got banned for forgetting that some axioms give people a free pass to say whatever they want, no matter how they say it… and replying in kind is forbidden. My bad.

Whitehat93875,
@Whitehat93875@lemmy.world avatar

You were banned because you were arguing for why people shouldn’t be arrested for possession of CSAM, trolling and straw-manning in the replies, and on top of that attempting to seriously and honestly advocate for pedophiles on another community, which is at best borderline illegal (anyone can check the modlog on that one if they don’t believe me; I wouldn’t make such claims if they weren’t true).

So to summarize, you were banned for:

  • Trolling
  • Promoting Illegal Activities (Pedophilia and CSAM)

PM_Your_Nudes_Please,

Yeah, this isn’t just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don’t care whether you sought out the CSAM, because it still exists on your device regardless of intent.

The laws about possessing CSAM are written in a way that any plausible deniability is removed, specifically to prevent pedophiles from being able to go “oh lol a buddy sent that to me as a joke” and getting acquitted. The courts don’t care why you have CSAM on your server. All they care about is the fact that you do. And since you own the server, you own the CSAM and they’ll prosecute you for it.

gammasfor,

And it’s not just the instance admins who would be at risk. Any time you view an image, your device makes a local copy of it, meaning every person who viewed the image, even accidentally, is at risk as well.

CharlesDarwin,
@CharlesDarwin@lemmy.world avatar

Sounds like a digital form of SWATing.

pensivepangolin,

Yeah honestly report all of those accounts to law enforcement. It’s unlikely they’d be able to do much, I assume, but these people are literally distributing CSAM.

Whitehat93875,
@Whitehat93875@lemmy.world avatar

That’s not a troll; CSAM goes well beyond trolling. “Pedophile” would be a more accurate term for them.

jarfil,

deleted_by_author

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    A person who is attracted to children is an evil and disgusting person, someone being a pedophile isn’t just “liking something”, they are a monster.

    jarfil,

    deleted_by_author

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    This is a serious problem we are discussing, please don’t use this as an opportunity to inject bad-faith arguments.

    Edit: Wow, your post history is a lot of the same garbage. There is no point in attempting to reason with you; you seem to be defending CSAM or just trolling (really awful and severe trolling, I might add; CSAM isn’t something to joke or troll about).

    jarfil, (edited )

    deleted_by_author

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    Ah, I see what’s going on: you’re salty that they closed the shitposting community, so you’re trolling here, going so far as to compare gays and Jews to pedophilia (which is extremely bigoted and incorrect) or to downplay the horrific acts that led to closing that community and registrations in order to protect the rest of the instance.

    Also I’d appreciate it if you didn’t edit what I said when quoting me, thanks.

    jarfil,

    deleted_by_moderator

    Whitehat93875, (edited )
    @Whitehat93875@lemmy.world avatar

    Amazing, you think that being against pedophiles makes someone a bigot. Do you think the same applies to being against Nazis as well? Paradox of tolerance much?

    There’s a very clear difference between the arguments you cited and pedophilia, which is that pedophilia causes physical and psychological harm to the child who is the victim, and is also extremely psychologically harmful to the people who witness such events or the abuse material produced from them. I don’t think you realize how severe this is, or you wouldn’t be trying to defend such heinous acts right now.

    jarfil,

    deleted_by_author

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    🤓

    AceLucario,

    Nothing you say matters, because you’re defending pedophiles. Whoops.

    jarfil, (edited )

    deleted_by_author

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    Lol, you do realize blocking on Lemmy just means you ignore them; it’s not going to silence them or, more critically, prevent them from downvoting or reporting you like it does on Reddit.

    You’re just going to stop seeing them, they can keep calling you out for it. You just made it easier for them.

    AceLucario,

    Oh no. Anyways…

    else,

    People spreading CSAM are beyond terrible, but I don’t agree with this generalization. Pedophiles don’t choose to be pedophiles, so as long as they control themselves and avoid harming anyone, I don’t like labelling them as evil monsters.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    Okay, fair enough. I wasn’t considering those people since they’ll never really be labelled pedophiles due to hiding their mental illness from the world (only really disclosing it to mental health professionals if ever) and never acting on it. It’s the actions committed or fantasies endorsed that differentiate a monster from someone who is merely mentally ill.

    **Disclaimer:** I would like to say that it is valid for me or anyone else to say pedophiles are mentally ill, but it is not valid to say that gay people are mentally ill. Homosexuality is a legitimate sexual orientation, because people pursuing homosexual attractions with other consenting adults does not harm others (this is not debatable, and I WILL report any homophobic arguments I see popping up as a result here). Pedophilia is not like this at all, because anyone who pursues it will cause harm: children cannot consent, and therefore these interactions are harmful to them and everyone involved, which is why it is a mental illness (again, this is also not debatable; anything that causes harm to others by acting on it is not a valid orientation but a mental illness, and the same goes for rapists).

    else,

    I wasn’t considering those people since they’ll never really be labelled pedophiles due to hiding their mental illness from the world (only really disclosing it to mental health professionals if ever) and never acting on it.

    I guess this is also the difference between getting caught or not. To me, a rapist that is never caught is still a rapist even if they are never labelled as such, so a pedophile that hides it forever successfully is still a pedophile.

    Disclaimer

    Yes agreed.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    I guess this is also the difference between getting caught or not. To me, a rapist that is never caught is still a rapist even if they are never labelled as such, so a pedophile that hides it forever successfully is still a pedophile.

    Agreed. If they did it but hid it and never were publicly labeled as a pedophile they are still a pedophile because of their actions and intentions, even if they were never caught and/or cancelled/convicted for it.

    khannie,
    @khannie@lemmy.world avatar

    Simply having it to post makes you culpable. It’s way beyond trolling.

    jarfil,

    deleted_by_author

    PeleSpirit,

    Are you promoting the acceptance of pedophilia? Cuz that’s what it sounds like and maybe they should start looking at your account first.

    jarfil,

    deleted_by_author

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    Do you know why we have possession laws against CSAM in the first place? It’s because people buy and sell abuse material in underground markets; it’s another way they profit off the abuse of children. This is nothing like drug possession laws (which are stupid), because the material is literally a direct product of the abuse of children, and many of the people in possession likely helped the criminals in order to obtain it (either directly or by paying them for it).

    So yes, in this case it does make sense to criminally charge people for possession of something like this, considering the direct connection CSAM has to child trafficking and child sexual abuse; and when you defend it by arguing against possession laws, it makes it seem like you support these criminals.

    jarfil,

    deleted_by_author

    GodzillaFanboy129,

    Oh boy, do we have a very hot take here. Lmao, this guy just tried to compare someone smoking a joint, doing shrooms, or even cocaine to the act of sexually abusing a child and trying to sell the pictures.

    Don’t even bother engaging with this guy; he’s clearly a troll mindlessly parroting other people’s arguments. Report, block, move on.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    Oh, by the way, MAP stuff really doesn’t look good on you (that’s in your comment history). Yeah, maybe you think I’m a terrible person because I think drugs should be treated less harshly, but you have literally said in other comments that pedophiles should be allowed to get off on what turns them on (which, I remind you, is exploitation of minors). That is a very different stance from “people shouldn’t be beaten and arrested for snorting coke”; you’re literally advocating for people to be allowed to produce and consume abuse material, and claiming that it’s acceptable for people to be pedophiles and pursue their attractions instead of getting help. I don’t know how you don’t see what is wrong with that. Seriously, this is either really bad trolling (way too far) or you’re one of them.

    Too bad, I’m “kid agnostic”; they might as well be cars or dragons —drawn or otherwise—, I don’t care whether they’re “kid” or “grownup” cars or dragons.

    I think I now know which one it is if that statement from the horse’s mouth is to be believed…

    HelloHotel, (edited )
    @HelloHotel@lemmy.world avatar

    A lot of stuff the government bans doesn’t align with morality. In this instance it fucking does.

    Yes, getting someone to drop this on your hard drive, even if it’s explicitly labeled “cache”, is equivalent to evidence planting. It puts you in danger of our laws falsely finding you guilty (misunderstandings are a thing; I don’t know the level of risk). The advice from our governments is “delete it immediately”. Follow it as completely as you can. Most devices don’t broadcast your hard drive contents without warning, giving you time to delete it. Since that isn’t true for iPhones, it’s a risk to one’s personal freedom to browse Lemmy on an iPhone until we can get this CSAM issue resolved.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    What he’s saying is FBI CHECK THAT GUY’S HARD DRIVE!

    jarfil,

    deleted_by_author

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    FBI doesn’t use pitchforks bud, they have tasers and guns.

    jarfil,

    deleted_by_author

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    🤓

    PeleSpirit,

    And I think we all agree with that.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    I had no idea how right I was at the time I commented that, since I hadn’t scrolled far back enough in his comment history but yeah, way more right than I would’ve liked 😬

    HelloHotel, (edited )
    @HelloHotel@lemmy.world avatar

    Yes, it’s like a virus: the FBI will target anyone who is a host, anyone who has it on their drive (edit: intent may be relevant, I’m no expert). The only way to stay safe is to rid yourself of it. Delete it.

    Lemmy mods, keep yourselves safe: please don’t use an iPhone to moderate. If you’re on Linux (I think Windows too), use BleachBit on your browser cache and do the “vacuum” operation.

    On Android, to clear the cache (or see better written instructions here):
    - Go to where you see all your apps, find your client, tap and hold on its icon, tap “app info”, go to “storage”, and tap “clear cache” (if you’re paranoid, “clear data”, and lose your sign-in, settings and other local data).

    To manually vacuum (can’t find better instructions):
    - Download an app called “termux”; it doesn’t need any permissions for this task.
    - When you see a black screen with text, type clear and hit enter.
    - Then type or paste { echo writing big file of random; cat /dev/urandom >file-gets-big; rm file-gets-big -v; } and hit enter. Your phone and the program cat will complain about being out of storage; if rm gets run, it will be fixed again. If it still complains or termux crashes, uninstall and reinstall termux; the vacuum process is finished.

    Some people know at a glance whether these steps are safe or not; others do not. Never follow instructions you don’t understand; verify that I haven’t led you to do something dumb.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    I’d say the proper word is ‘criminal.’

    CoderKat,

    Yeah. A troll might post something like a ton of oversized images of pig buttholes. Who the fuck even has access to CSAM to post? That’s something you only have on hand if you’re a predator already. Nor is it something you can shrug off like “lol I was only trolling”. It’s a crime that will send you to jail for years. It’s a major crime that gets entire police units dedicated to it. It’s a huuuuge deal and I cannot even fathom what kind of person would risk years in prison to sabotage an internet forum.

    HawlSera,

    My thoughts exactly, like if they were just spamming goatsee or something, that would be one thing…

    But this raises several questions, and they can only have grimdark answers.

    HelloHotel,
    @HelloHotel@lemmy.world avatar

    Don’t forget they are doing this to harm others; they deserve the name “e-terrorist” or similar. They are still absolutely pedophiles. They’re bombing out a space, not trying to set up shop.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    I would definitely agree that this would very likely count as cyber terrorism, and if it doesn’t, it should.

    GreenMario,

    Criminals.

    RightHandOfIkaros,

    Trolls? In most regions of the planet, I am fairly certain their actions would be considered criminal.

    DLSchichtl,

    To be fair, that is exactly what trolls did in the wild west days of the net. Back when not many people had home computers, much less the internet. CP and gore, the proverbial shit flung by the troll chimps.

    3ntranced,

    The Internet is essentially a small microbiome of beautiful flora and fauna that grew on top of a lake of sewage.

    jarfil,

    The Internet is a reflection of humanity, minus some of the fear of getting punched in the face.

    PM_Your_Nudes_Please,

    Yeah, back in the Limewire/Napster/etc days, it wasn’t unheard of for people to troll by relabeling CSAM as a popular movie or TV show. Oh, you wanted to download the latest Friends episode? Congrats, now you have CSAM because a troll uploaded it with the title “Friends S10E7.mov”

    STRIKINGdebate2,
    @STRIKINGdebate2@lemmy.world avatar

    I would like to extend my sincerest apologies to all of the users here who liked Lemmy Shitposting. I feel like I let the situation grow too far out of control before getting help. Don’t worry, I am not quitting; I fully intend on staying around. The other two deserted the community, but I won’t. DM me if you wish to apply for mod.

    Sincerest thanks to the admin team for dealing with this situation. I wish I linked in with you all earlier.

    lwadmin,
    @lwadmin@lemmy.world avatar

    @Striker this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators this can not be stopped. Lemmy needs better moderation tools.

    rob_t_firefly,
    @rob_t_firefly@lemmy.world avatar

    Hopefully the devs will take the lesson from this incident and put some better tools together.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    There’s a Matrix room for building mod tools here; maybe we should bring up this issue there, just in case they aren’t already aware.

    WhiskyTangoFoxtrot,

    Or we’ll finally accept that the core Lemmy devs aren’t capable of producing a functioning piece of software and fork it.

    Bread,

    It’s not easy to build a social media app, and forking it won’t make it any easier to solve this particular problem. Joining forces to tackle an inevitable problem is the only solution. The Lemmy devs are more than willing to accept pull requests for software improvements.

    x1gma,

    And who’s gonna maintain the fork? Even less developers from a split community? You have absolutely no idea what you’re talking about.

    dandroid,

    This isn’t your fault. Thank you for all you have done in regards to this situation thus far.

    gabe,

    Please, please, please do not blame yourself for this. This is not your fault. You did what you were supposed to do as a mod and stepped up and asked for help when you needed to, lemmy just needs better tools. Please take care of yourself.

    Blaze,
    @Blaze@discuss.tchncs.de avatar

    It’s not your fault, thank you for your job!

    Mr_Blott,

    Definitely not your fault mate, you did what anyone would do, it’s a new community and shit happens

    rob_t_firefly,
    @rob_t_firefly@lemmy.world avatar

    Thanks for your work. The community was appreciated.

    Thief,

    Thank you for your help. It is appreciated.

    Touching_Grass,

    Really feel for you having to deal with this.

    Nerd02,
    @Nerd02@lemmy.basedcount.com avatar

    You don’t have to apologize for having done your job. You did everything right and we appreciate it a lot. I’ve spent the whole day trying to remove this shit from my own instance and understanding how purges, removals and pictrs work. I feel you, my man. The only ones at fault here are the sickos who shared that stuff, you keep holding on.

    reverendsteveii,

    You didn’t do anything wrong, this isn’t your fault and we’re grateful for the effort. These monsters will be slain, and we will get our community back.

    Draconic_NEO,
    @Draconic_NEO@lemmy.world avatar

    It’s not your fault, these people attacked and we don’t have the proper moderation tools to defend ourselves yet. Hopefully in the future this will change though. As it stands you did the best that you could.

    FlyingSquid,
    @FlyingSquid@lemmy.world avatar

    I love your community and I know it is hard for you to handle this but it isn’t your fault! I hope no one here blames you because it’s 100% the fault of these sick freaks posting CSAM.

    GONADS125,

    You’ve already had to take all that on, don’t add self-blame on top of it. This wasn’t your fault and no reasonable person would blame you. I really feel for what you and the admins have had to endure.

    Don’t hesitate to reach out to supports or speak to a mental health professional if you’ve picked up trauma from the shit you’ve had to see. There’s no shame in getting help.

    Becoming,
    @Becoming@lemmy.world avatar

    As so many others have said, there’s no need for an apology. Thank you for all of the work that you have been doing!

    The fact that you are staying on as mod speaks to your character and commitment to the community.

    MsPenguinette,

    You do a great job. I’ve reported quite a few shit heads there and it gets handled well and quickly. You have no way of knowing if some roach is gonna die after getting squashed or if they are going to keep coming back

    aport,

    Contact the FBI

    Ertebolle,

    This is good advice; I suspect they're outside of the FBI's jurisdiction, but they could also be random idiots, in which case they're random idiots who are about to become registered sex offenders.

    CantSt0pPoppin,
    @CantSt0pPoppin@lemmy.world avatar

    I have to wonder if Interpol could help with issues like this. I know there are agencies that work together globally to help protect missing and exploited children.

    GeekFTW,
    @GeekFTW@kbin.social avatar

    'Criminal activity should be reported to your local or national police. INTERPOL does not carry out investigations or arrest people; this is the responsibility of national police.'

    From their website.

    assassin_aragorn,

    What the hell do they actually do then?

    GeekFTW,
    @GeekFTW@kbin.social avatar

    "Interpol provides investigative support, expertise and training to law enforcement worldwide, focusing on three major areas of transnational crime: terrorism, cybercrime and organized crime. Its broad mandate covers virtually every kind of crime, including crimes against humanity, child pornography, drug trafficking and production, political corruption, intellectual property infringement, as well as white-collar crime. The agency also facilitates cooperation among national law enforcement institutions through criminal databases and communications networks. Contrary to popular belief, Interpol is itself not a law enforcement agency."
    https://en.wikipedia.org/wiki/Interpol

    assassin_aragorn,

    Huh. Thanks!

    Ab_intra,
    @Ab_intra@lemmy.world avatar

    FBI would be great in this case tbh. They have the resources.

    TheTimeKnife,
    @TheTimeKnife@lemmy.world avatar

    The FBI reports it to Interpol, I believe; Interpol is more like an international warrant system built from treaties.

    The_Picard_Maneuver,
    @The_Picard_Maneuver@lemmy.world avatar

    They might be, but I’d imagine most countries have laws on the books about this sort of stuff too.

    droans,

    And it’s something that the nations usually have no issues cooperating with.

    The FBI has assisted in a lot of global raids related to CSAM.

    assassin_aragorn,

    There are few situations where pretty much everyone universally agrees to work together. This is one of those situations. Across cultures and nations, pedos are seen as some of the most vile people alive.

    dylanTheDeveloper,
    @dylanTheDeveloper@lemmy.world avatar

    The FBI has offices in a lot of other countries and works with local law enforcement.

    www.fbi.gov/contact-us/international-offices

    Can’t really hide from them unless you live in North Korea or Russia

    jarfil,

    Wait, is this like China having police offices in other countries?

    I knew the US collects taxes on their citizens no matter where they live, but isn’t this kind of excessive? Wasn’t INTERPOL supposed to take care of international crime?

    dylanTheDeveloper, (edited )
    @dylanTheDeveloper@lemmy.world avatar

    For more than eight decades, the FBI has stationed special agents and other personnel overseas. We help protect Americans back home by building relationships with principal law enforcement, intelligence, and security services around the globe.

    It is similar to China’s international police but keep in mind quite a few other countries have a similar setup

    jarfil,

    I’m just surprised that it’s FBI personnel, I thought the CIA was in charge of international affairs, with INTERPOL acting as liaison for the FBI with other countries.

    IIRC in the EU we have EUROPOL acting as liaison between the national law enforcement branches, and while there is nothing stopping personnel from one country to enter another, I don’t think they do. But maybe that’s more like the state vs. federal jurisdictions in the US. On the other hand, it’s been some time since I’ve looked deeper into it, and things keep changing.

    gowan,
    @gowan@reddthat.com avatar

    You might be surprised to discover that the USA does not believe their laws end at their borders. That is why Kim Dotcom was arrested by the FBI in NZ for violating US law.

    In this case I doubt any LE agency abroad wouldn’t like the tip off.

    Touching_Grass,

    I’m not saying anybody takes CSAM less seriously. But I wish the American government went after minor CSAM events as much as they go after copyright/IP violations. It’s not like Mike Pompeo flew out to other countries to strong-arm them into new laws to prevent CSAM like they have done with pirates who threatened Hollywood profits.

    gowan,
    @gowan@reddthat.com avatar

    We did that back in the 1990s under Clinton and a bit under GWB as well. If I recall correctly we got really firm with Japan about how they needed to ban CP there.

    jarfil, (edited )

    There is no CP and no porn in Japan… add some tiny censor bars, and it’s just some wholesome family tentacle fun!

    That one backfired spectacularly.

    gowan,
    @gowan@reddthat.com avatar

    As I recall the laws at the time were that you could not show pubic hair so naked kids who had no pubes were fine.

    jarfil,

    TIL. Oh well, it probably will keep backfiring as long as Japan insists on having “morality laws” instead of something more objective.

    jarfil,

    I wish the American government went after minor CSAM events as much as they go after copyright/IP violations.

    Easy: claim copyright/IP on the CSAM… uh, no, wait…

    Resonosity,

    Yeah, there was even that case where a citizen and resident of Mexico was arrested and detained in the US for breaking US law, even though it technically didn’t apply to them since they were under Mexican sovereignty… Borders mean little to the US.

    synae,
    @synae@lemmy.sdf.org avatar

    Perhaps most importantly, it establishes that the mods/admins/etc of the community are not complicit in dissemination of the material. If anyone (isp, cloud provider, law enforcement, etc) tries to shut them down for it, they can point to their active and prudent engagement of proper authorities.

    Railing5132,

    More importantly, and germane to our conversation, the FBI has the contacts and motivation to work with their international partners wherever the data leads.

    ivanafterall,
    @ivanafterall@kbin.social avatar

    This isn't as crazy as it may sound either. I saw a similar situation, contacted them with the information I had, and the field agent was super nice/helpful and followed up multiple times with calls/updates.

    Daisyifyoudo,

    This doesn’t sound crazy in the least. It sounds like exactly what should be done.

    BitOneZero,

    Yeah, what do people think the FBI is for… this isn’t crazy. They can get access to ISP logs, VPN provider logs, etc.

    deweydecibel,

    I think what they’re saying is that contacting the FBI may seem daunting to someone who has never dealt with something like this before, but that they don’t need to worry about it. Just contact them.

    akippnn,

    Under US jurisdiction, yeah. It could be slightly more difficult depending on the country; LEGAT can’t conduct unilateral operations, so they’ll have to cooperate with foreign authorities. These assholes can get away with exploiting jurisdictional boundaries. Hopefully they will be caught, but oh well.

    CantSt0pPoppin,
    @CantSt0pPoppin@lemmy.world avatar

    This is seriously sad and awful that people would go this far to derail a community. It makes me concerned for other communities as well. Since they have succeeded in getting shitpost closed, does this mean they will just move on to the next community? That being said, here is some very useful information on the subject and what can be done to help curb CSAM.

    • The National Center for Missing & Exploited Children (NCMEC) CyberTipline: You can report CSAM to the CyberTipline online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.

    • The National Sexual Assault Hotline: If you or someone you know has been sexually assaulted, you can call the National Sexual Assault Hotline at 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.

    • The National Child Abuse Hotline: If you suspect child abuse, you can call the National Child Abuse Hotline at 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.

    • Thorn: Thorn is a non-profit organization that works to fight child sexual abuse. They provide resources on how to prevent CSAM and how to report it.

    • Stop It Now!: Stop It Now! is an organization that works to prevent child sexual abuse. They provide resources on how to talk to children about sexual abuse and how to report it.

    • Childhelp USA: Childhelp USA is a non-profit organization that provides crisis intervention and prevention services to children and families. They have a 24/7 hotline at 1-800-422-4453.

    Here are some tips to prevent CSAM:

    • Talk to your children about online safety and the dangers of CSAM.

    • Teach your children about the importance of keeping their personal information private.

    • Monitor your children’s online activity.

    • Be aware of the signs of CSAM, such as children being secretive or withdrawn, or having changes in their behavior.

    • Report any suspected CSAM to the authorities immediately.

    over_clox,

    So far I have not seen such disgusting material, but I’m saving this comment in case I ever need the information.

    Are there any other numbers or sites people can contact in countries other than the USA?

    Blaze,
    @Blaze@discuss.tchncs.de avatar

    It’s probably going to be country dependent

    over_clox,

    Of course, yes. But I’ve discovered that cell phones are even programmed to translate emergency numbers.

    In the USA, our main emergency number is 911, but I found out (quite by accident), that dialing 08 brings you to emergency services.

    en.wikipedia.org/…/List_of_emergency_telephone_nu…

    Pat12,

    There are just two full-time developers on this project and they seem to have other priorities. No offense to them but it doesn’t inspire much faith for the future of Lemmy.

    This doesn’t seem like a respectful comment to make. People have responsibilities; they aren’t paid for this. It doesn’t seem fair to make criticisms of something when we aren’t doing anything to provide a solution. A better comment would be: “There are just 2 full-time developers on this project and they have other priorities. We are working on increasing the number of full-time developers.”

    khannie,
    @khannie@lemmy.world avatar

    I agree with you, I’d just gently suggest that it’s borne of what is probably significant upset at having to deal with what they’re having to deal with.

    TsarVul,
    @TsarVul@lemmy.world avatar

    Imagine if you were the owner of a really large computer with CSAM in it. And there is in fact no good way to prevent creeps from putting more into it. And when police come to have a look at your CSAM, you are liable for legal bullshit. Now imagine you had dependents. You would also be well past the point of being respectful.

    On that note, the captain db0 has raised an issue on the github repository of LemmyNet, requesting essentially the ability to add middleware that checks the nature of uploaded images (issue #3920 if anyone wants to check). Point being, the ball is squarely in their court now.

    postmateDumbass,

    I think the FBI or equivalent keeps a record of hashes for known CSAM, and middleware should be able to compare against that. Hopefully, if a match is found, kill the post and forward all info on to LE.

    malloc,

    Interesting. But aren’t hashes unique to a specific photo? Just a single change to the photo would inevitably change its hash.

    I think Apple was going to implement a similar system and deploy it to all iPhones/Macs in some iOS/macOS update. However, it was eventually 86’d due to privacy concerns from many people and the potential for abuse and/or false positives.

    A system like this might work on a small scale though as part of moderating tools. Not sure where you would get a constantly updated database of CSAM hashes though.

    AeonFelis,

    Interesting. But aren’t hashes unique to a specific photo? Just a single change to the photo would inevitably change its hash.

    Most people are lazy and stupid, so maybe hash checking is enough to catch a huge portion (probably more than 50%, maybe even 80% or 90%?) of the people posting CSAM who don’t bother (or know how) to alter it?

    TechnoBabble,

    I’m almost positive they’ve been developing an image recognition AI that will make slightly altering CSAM photos obsolete.

    Here’s hoping.

    dipshit,

    A hash would change if even one bit changed in that file. This could be from corruption, automated resizing by photo processing tools (most sites will resize photos if you give them one that is too big), saving a lossy file again and again (adding more JPEG artifacts), etc… This is why there aren’t many automated tools for this kind of detection. Sites that have tried by using skin tones in a photo have failed spectacularly.

    I’ve never heard of this FBI middleware. Does anyone have the link to this? I’d like to understand what tools are available to combat this as I’ve been considering starting my own instance for some time now.
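
A quick, purely illustrative Python sketch of the point above (the byte string is a stand-in for real image data; this is not from any actual moderation tool): flipping a single bit yields a completely different cryptographic digest.

```python
import hashlib

original = bytearray(b"pretend these are the bytes of an image file")
modified = bytearray(original)
modified[0] ^= 0x01  # flip a single bit

print(hashlib.sha256(bytes(original)).hexdigest())
print(hashlib.sha256(bytes(modified)).hexdigest())
# The two digests are completely unrelated, which is why exact-match hashes
# break as soon as a file is resized, recompressed, or even slightly corrupted.
```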

    postmateDumbass,

    In my utopia world, the FBI has a team updating the DB.

    The utopia algorithm would hash multiple subsets of the picture so cropping or watermarking wouldn’t break the test (assuming the ‘crux’ of the CSAM would most likely be unaltered?), and maybe handle simple image transformations (color, tint, gamma, etc.) with a formula.

    reev,

    What you’re talking about is digital (aka forensic) watermarking.

    MsPenguinette,

    IMO, scanning images before posting them to a forum is an utterly and completely different world from having your photo collection scanned, especially in context and scale.

    snowe,
    @snowe@programming.dev avatar

    You can already protect your instance using CloudFlare’s CSAM protection, and sorry to say it, but I would not use db0’s solution. It is more likely to get you in trouble than to help you out. I posted about it in their initial thread, but they are not warning people about actual legal requirements that apply in many places, and their script can get you put in jail (yes, put in jail for deleting CSAM).

    TsarVul,
    @TsarVul@lemmy.world avatar

    The developers of LemmyNet are being asked for the ability to define a subroutine by which uploaded images are preprocessed and then denied or passed. There is no such feature right now. Even if they wanted to use CloudFlare CSAM protection, they couldn’t. That’s the entire problem. This preprocessing routine could use Microsoft PhotoDNA and Google CSAI Match, it could use a self-hosted alternative as db0 desires, or it could even be your own custom solution that doesn’t destroy CSAM but stores it on a computer you own and stops it from being posted.

    snowe,
    @snowe@programming.dev avatar

    Even if they wanted to use CloudFlare CSAM protection, they couldn’t.

    ? CF’s solution happens at the DNS level. It has absolutely nothing to do with lemmy and there’s nothing the devs could do to change that.

    TsarVul,
    @TsarVul@lemmy.world avatar

    Yeah I just looked it up. Serving stuff through CF does a check for illicit material. Pretty neat. Be that as it may, the original complaint is that Lemmy is lacking moderation tools. Such a moderation tool would be something that disallows CSAM even being stored in the server in the first place.

    Graphine,

    I mean, the “other priorities” comment does seem to be in bad taste. But as for the comment on the future of Lemmy, I dunno. I feel like they’re just being realistic. I think the majority of us understand the devs have lives but if things don’t get sorted out soon enough it could impact the future of Lemmy.

    Blaze,
    @Blaze@discuss.tchncs.de avatar

    we are working on increasing the number of full time developers.

    I see where you are coming from, but who is supposed to make this statement, LW admins? Because it’s not their role. And if it’s Lemmy devs, then it shouldn’t be we.

    Pat12,

    I see where you are coming from, but who is supposed to make this statement, LW admins? Because it’s not their role. And if it’s Lemmy devs, then it shouldn’t be we.

    whoever came up with “we should have full time developers” and is managing that team should be the person thinking of how to help the full time developers given the increased responsibilities/work load

    ttmrichter,
    @ttmrichter@lemmy.world avatar

    Are you volunteering?

    No?

    Then shut up and let the adults talk about how to solve things.

    ToxicWaste,

    Lemmy is developed open source and the people operating the servers are not the same people writing the source code.

    While I do not agree with the salty comment made about an amazing open source project, they corrected it. Maybe this is a great opportunity for people to contribute. Not everyone needs to be a programmer to provide value to a project like this. Sources can be found here: github.com/LemmyNet

    Blaze,
    @Blaze@discuss.tchncs.de avatar

    They just edited their comment

    MoistWanted,

    I can’t seem to find the AMA thread from the devs, but I remember they said they actually are being paid by some Dutch organisation.

    BuddyTheBeefalo, (edited )

    Funded by nlnet.nl

    plus ~$4,400/month from donations via:

    opencollective.com/lemmy

    www.patreon.com/dessalines

    liberapay.com/Lemmy

    liberapay.com/dessalines/

    liberapay.com/nutomic/

    They also take Bitcoin, Ethereum, Monero and Cardano.

    join-lemmy.org/donate

    Deftdrummer,

    Good bot

    HobbitFoot,

    No one is paid for this, but moderation is going to become a problem for Lemmy and the volunteers who are admins are going to need support.

    Pat12,

    No one is paid for this, but moderation is going to become a problem for Lemmy and the volunteers who are admins are going to need support.

    Yes, that’s what I’m saying. We should acknowledge that we are fortunate to have dedicated volunteer devs and work on helping/supporting them.

    HobbitFoot,

    We definitely should acknowledge the volunteer devs supporting the platform, but we need to address that there may be issues with the tools for mods as is and we need the paid devs to pull back from only coding and do more design of the architecture that can be filled in by volunteer devs.

    DogMuffins,

    There are paid devs?

    HobbitFoot,

    There are donation pages that fund the two devs. They haven’t complained about the funds yet.

    DogMuffins,

    Oh. Is there any indication of how much they may have actually received via these donation pages?

    The vast majority of FOSS projects receive hardly anything in donations - even those with many users.

    The term “paid dev” implies a salaried position. I would be astonished if the amount they’ve received is anything like a salary given the time requirements.

    can,

    They do get a certain amount from a foundation each time they reach a certain milestone. Perhaps those milestones need to be adjusted.

    TacoButtPlug,
    @TacoButtPlug@sh.itjust.works avatar

    Wish I was a dev. I’d jump in to help so fast.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    Maybe you could start with making pull requests to help, and maybe also writing them an application on Matrix. I’m not being snarky, just pointing out that it’s easier to help than you might think.

    TacoButtPlug,
    @TacoButtPlug@sh.itjust.works avatar

    I have no idea what a pull request or matrix is but I’ll start reading about them.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    Matrix is a secure chat protocol used by the devs to message each other.

    A pull request is a way of proposing and contributing code on git-based platforms like GitHub, GitLab, and Codeberg.

    TacoButtPlug,
    @TacoButtPlug@sh.itjust.works avatar

    Yea, thank you. I found the github list but yea… guess it’s a good time to learn!

    lagomorphlecture,

    I think taco butt plug meant that they aren’t a developer, like at all, so can’t help with coding or PRs or anything.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    Fair enough.

    lemann,

    I’m a dev, but I’m in no way familiar with Rust (or, more importantly, the code structure).

    Very early on I also had a look at the codebase for their join-lemmy.org site to see if I could contribute some UX changes to make it less text-heavy, but the framework they use for the UI is something I’m not familiar with either.

    Perhaps they’re both things to revisit when I have more spare time…

    sab, (edited )

    You don’t become a developer by wishing. Here’s a tutorial if you want to learn

    (edit: Rust, not Go)

    TacoButtPlug,
    @TacoButtPlug@sh.itjust.works avatar

    Thank you!!!

    GivingEuropeASpook,
    @GivingEuropeASpook@lemm.ee avatar

    Thing is, if this continues to be a problem and if the userbase/admins of instances are organised, we can shift those priorities. They may not have envisioned this being a problem with the work they decided to work on for the next several months. Truly, the solution is to get more developers involved so that more can happen at once.

    danielton,
    @danielton@lemmy.world avatar

    Seriously. We need to cut them some slack because nobody expected Reddit to go full Elon in May.

    GivingEuropeASpook,
    @GivingEuropeASpook@lemm.ee avatar

    Exactly, and Mastodon had been kinda gunning for Twitter for years before Elon went full Elon, so they were primed for the influx. Lemmy, I think, expected to have years to go before its userbase would similarly skyrocket.

    danielton,
    @danielton@lemmy.world avatar

    Yeah, Reddit was famously open to third party developers for 15 years or so, and now they and their bootlickers are claiming they didn’t know that there were third party apps using the API to browse the whole site.

    Even the Apollo dev said nothing but good things about Reddit because they were very transparent with him until they decided to paywall the API. Nobody saw this coming.

    antonim, (edited )

    People have responsibilities

    Exactly - when you create a site, you have a responsibility to make sure it’s not used to distribute child porn.

    Pat12,

    Exactly - when you create a site, you have a responsibility to make sure it’s not used to distribute child porn.

    That burden should not rest on 2 people.

    antonim,

    Then the logical conclusion is that the 2 people should find some other people to share the burden.

    I really don’t see how my statement is controversial. This is sadly how the internet works, regardless of how much or how little you can invest into your site - you need mechanisms to fight off against such spam and malice.

    dipshit,

    DEVELOPERS produce software to help people post images and text online. Nothing bad about that.

    ADMINS install the developers software on a server and run it as an instance.

    MODS (if any exist besides the admin) moderate the instance to keep illegal content off the site.

    USERS may choose to use the software to post CSAM.

    None of these groups of people are getting paid for their time. USERS generally don’t take much legal risk for what’s posted, as instance owners don’t ask for personally identifiable information from users.

    Sites like reddit, although we all hate it, do make a profit, and some of that profit is used to pay “trust and safety” teams who are paid (generally not very well, usually in underdeveloped or developing countries) to wade through thousands of pictures of CSAM, SA, DV/IPV and other violent material, taking it down as it gets posted to facebook, reddit, other major online properties.

    ---

    Developers, admins and mods are generally doing this in their free time. Not sure how many people realize this but developers, admins and mods are also people who need to eat - developers have a skill of developing software, so many open source devs are also employed and contribute to open source in their off time. Admins may be existing sysadmins at companies but admin lemmy instances in their off time. Mods do it to protect the community and the instance itself.

    USERS can be a bit self-important at times. We get it, you all generate the content on this site. Some content isn’t just unwanted though, it’s illegal and if not responded to quickly could mean not only a shutdown instance but also possible jailtime for admins, who ultimately will be the ones who are running a “reddit-like site” or “a haven for child porn”.

    dragontamer,

    Not that I’m familiar with Rust at all, but… perhaps we need to talk about this.

    The only thing that could have prevented this is better moderation tools. And while a lot of the instance admins have been asking for this, it doesn’t seem to be on the developers roadmap for the time being. There are just two full-time developers on this project and they seem to have other priorities. No offense to them but it doesn’t inspire much faith for the future of Lemmy.

    Let’s be productive. What exactly are the moderation features needed, and what would be easiest to implement in the Lemmy source code? Are you talking about a mass ban of users from specific instances? A ban on new accounts from certain instances? Like, what moderation tool exactly is needed here?

    TsarVul,
    @TsarVul@lemmy.world avatar

    I guess it’d be a matter of incorporating something that hashes whatever it is that’s being uploaded. One takes that hash and checks it against a database of known CSAM. If match, stop upload, ban user and complain to closest officer of the law. Reddit uses PhotoDNA and CSAI-Match. This is not a simple task.
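
A rough sketch of that flow, for illustration only (the digest set, ban_user, and report_to_authorities are hypothetical stand-ins; a real deployment would use a perceptual-hash service such as PhotoDNA rather than an exact SHA-256 match):

```python
import hashlib

# Hypothetical database of known-bad digests; real lists come from vetted
# organisations and are not distributed publicly.
known_bad_digests = {"0" * 64}

def ban_user(user: str) -> None:
    print(f"banned {user}")  # stand-in for a real admin action

def report_to_authorities(user: str, digest: str) -> None:
    print(f"reported {user} / {digest}")  # stand-in for e.g. a CyberTipline report

def handle_upload(user: str, image_bytes: bytes) -> bool:
    """Return True if the upload may proceed, False if it was blocked."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in known_bad_digests:
        ban_user(user)
        report_to_authorities(user, digest)
        return False  # stop the upload
    return True       # allow the upload
```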

    Touching_Grass,

    Couldn’t one small change in the picture change the whole hash?

    TsarVul,
    @TsarVul@lemmy.world avatar

    Good question. Yes. Also artefacts from compression can fuck it up. However hash comparison returns percentage of match. If match is good enough, it is CSAM. Davai ban. There is bigger issue however for developers of Lemmy, I assume. It is a philosophical pizdec. It is that if we elect to use PhotoDNA and CSAI Match, Lemmy is now at the whims of Microsoft and Google respectively.

    what_is_a_name,

    Mod tools are not Lemmy. Give admins and mods an option. Even a paid one. Hell, admins of Lemmy.world could have us donate extra to cover the costs of API services.

    TsarVul,
    @TsarVul@lemmy.world avatar

    I agree. Perhaps what Lemmy developers can do is they can put slot for generic middleware before whatever the POST request is in Lemmy API for uploading content? This way, owner of instance can choose to put whatever middleware for CSAM they want. This way, we are not dependent on developers of Lemmy for solution to pedo problem.

    Lmaydev,

    Honestly I’d rather that than see shit like this any day.

    Serinus,

    The bigger thing is that hash detection tools don’t want to give access to just anyone, and just anyone can run a Lemmy instance. The concern is that you’re effectively giving the CSAM people a way to know if they’ll be detected.

    Perhaps they can allow some of the biggest Lemmy instances to use the tech, but I wouldn’t expect it to be available to everyone.

    shagie,

    Facebook and Reddit don’t have local CSAM detection but rather use Google’s APIs.

    This isn’t something that any average user can get access to. Even the largest Lemmy instances are small compared to Reddit and Facebook… and they don’t have local testing either.

    Part of this is also a “this isn’t just detecting and blocking but also automated reporting”.

    Furthermore, Lemmy is AGPL, and providing a Lemmy instance with such an implementation would run the risk that the implementation couldn’t remain closed source (an AGPL license violation).

    Nollij,

    If they hash the file binary data, like CRC32 or SHA, yes. But there are other hash types out there, which are more like “fingerprints” of an image. Think of how Shazam or Sound Hound can recognize a song playing, despite the extra wind, static, etc that’s present. There are similar algorithms for images/videos.

    No idea how difficult those are to implement, though.
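
For the curious, here is a toy average-hash (“aHash”) sketch of that fingerprint idea in Python; it assumes the Pillow package is installed and is far less robust than the industrial systems (PhotoDNA, PDQ) that real platforms rely on:

```python
from PIL import Image  # assumes Pillow is installed

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to 8x8 grayscale and set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance means 'looks similar'."""
    return bin(a ^ b).count("1")

# Two images whose hashes differ in only a few of the 64 bits are treated as
# a match, so recompression or a one-pixel edit usually does not evade it.
```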

    Railcar8095,

    There are FOSS applications that can do that (czkawka, for example). What I’m not sure about is whether the specific algorithm used is available and, more importantly, whether the CSAM hashes are available to general audiences. I would assume that if they are, any attacker could check first and make just the right amount of changes.

    reverendsteveii,

    One bit, in fact. Luckily there are other ways of comparing images without actually showing them to human eyes that allow you to calculate a percentage of similarity.

    diffuselight,

    None of that really works anymore in the age of AI inpainting. Hashes / perceptual hashing worked well before, but the people doing this are specifically interested in causing destruction and chaos with this content. They don’t need it to be authentic to do that.

    It’s a problem that requires AI on the defensive side but even that is just going to be eternal arms race. This problem cannot be solved with technology, only mitigated.

    The ability to exchange hashes on moderation actions against content may offer a way out, but it will change the decentralized nature of everything - basically bringing us back to the early days of Usenet, the Usenet Death Penalty, etc.

    dragontamer,

    Not true.

    A simple CAPTCHA got rid of a huge set of idiotic script kiddies. CSAM being what it is, it could (and should) result in an immediate IP ban. So if you’re “dumb” enough to try to upload a well-known CSAM hash, then you absolutely deserve the harshest immediate ban, automatically.


    You’re pretty much like the story of the economist who refuses to believe that $20 exists on a sidewalk. “Oh, but if that $20 really existed on the sidewalk there, then it would have been arbitraged away already”. Well guess what? Human nature ain’t economic theory. Human nature ain’t cybersecurity.

    Idiots will do dumb, easy attacks because they’re dumb and easy. We need to defend against the dumb-and-easy attacks, before spending more time working on the harder, rarer attacks.

    AustralianSimon,
    @AustralianSimon@lemmy.world avatar

    You don’t get their IP when they post from other instances. I’m surprised this hasn’t resulted in defederation.

    anlumo,

    Well, my home instance has defederated from lemmy.world due to this, that’s why I had to create a local account here.

    AustralianSimon,
    @AustralianSimon@lemmy.world avatar

    I mean defedding the instances the CSAM is coming from but also yes.

    rolaulten,

    I’m sorry, but you don’t want to use permanent IP bans. Most residential circuits are DHCP, meaning banning via IP only has a short-term positive effect.

    That said, automatic scanning for known hashes, and automatic reporting to relevant authorities with relevant details, should be doable (provided there is a database somewhere - I honestly have never looked).

    MrPoopyButthole,
    @MrPoopyButthole@lemmy.world avatar

    I think it would be an AI autoscan that flags some posts for mod approval before they show up to the public, and perhaps more fine-grained controls for how media is posted, like for instance only allowing certain image hosting sites and no directly uploaded images.

    Agamemnon,
    @Agamemnon@lemmy.world avatar

    Speculating:

    Restricting posting from accounts that don’t meet some adjustable criteria, like account age, comment count, prior moderation action, or average comment length (upvote quota maybe not, because not all instances use it).

    Automatic hash comparison of uploaded images with database of registered illegal content.
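
A minimal sketch of such a gate, combining both ideas (the account fields, thresholds, and upstream hash-check flag are invented for illustration; nothing like this exists in Lemmy today):

```python
from dataclasses import dataclass

@dataclass
class Account:
    age_days: int
    comment_count: int
    prior_mod_actions: int

# Hypothetical, instance-adjustable thresholds.
MIN_AGE_DAYS = 7
MIN_COMMENTS = 20

def may_post_image(acct: Account, hash_matches_known_bad: bool) -> bool:
    """Gate image posts on account history plus an upstream hash check."""
    if hash_matches_known_bad:
        return False  # the automatic hash comparison already flagged it
    if acct.prior_mod_actions > 0:
        return False  # previously moderated accounts get no benefit of the doubt
    return acct.age_days >= MIN_AGE_DAYS and acct.comment_count >= MIN_COMMENTS
```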

    dragontamer,

    On various old-school forums, there’s a simple (and automated) system of trust that progresses from new users (who might be spam)… where every new user might need a manual “approve post” before it shows up. (And this existed in Reddit in some communities too).

    And then full powers are granted to the user eventually (or, in the case of Stack Overflow, automated access to the moderator queue).

    MossyFeathers,

    What are the chances of a hash collision in this instance? I know accidental hash collisions are usually super rare, but with enough people it’d probably still happen every now and then, especially if the system is designed to detect images similar to the original illegal image (to catch any minor edits).

    Is there a way to use multiple hashes from different sources to help reduce collisions? For an example, checking both the MD5 and SHA256 hashes instead of just one or the other, and then it only gets flagged if both match within a certain degree.

    TsarVul,
    @TsarVul@lemmy.world avatar

    Traditional hash like MD5 and SHA256 are not locality-sensitive. Can’t be used to detect match with certain degree. Otherwise, yes you are correct. Perceptual hashes can create false positive. Very unlikely, but yes it is possible. This is not a problem with perfect solution. Extraordinary edge cases must be resolved on a case by case basis.

    And yes, simplest solution must be implemented first always. Tracking post reputation, captcha before post, wait for account to mature before can post, etc. The problem is that right now the only defense we have access to are mods. Mods are people, usually with eyeballs. Eyeballs which will be poisoned by CSAM so we can post memes and funnies without issues. This is not fair to them. We must do all we can, and if all we can includes perceptual hashing, we have moral obligation to do so.

    MossyFeathers, (edited )

    Something I thought about that might be helpful is if mods had the ability to add a post delay on a community basis. Basically, the delay would be moderator adjustable, but only moderators and admins would be able to see the post for X number of minutes after being posted. It’d help for situations like ongoing attacks where you don’t necessarily want to have to manually approve posts, but you want a chance to catch any garbage before the post goes public.

    Edit: and yeah, one of the reasons I’m aware that perceptual hashes can have collisions is because a number of image viewers/cataloging tools like xnview mp or hydrus network use hash collisions to help identify duplicate images. However, I’ve seen collisions between unrelated images when lowering the sensitivity which is why I was wondering if there was a way to use multiple hashing algorithms to help reduce false positives without sacrificing the usefulness of it.
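    A hedged sketch of that multi-algorithm idea, again with the Python imagehash library: require two different perceptual hashes (pHash and dHash here, with illustrative thresholds) to both report a near-match before anything is flagged.

        import imagehash
        from PIL import Image

        def near_duplicate(known_path: str, upload_path: str) -> bool:
            # Flag only if two independent perceptual hashes both call it a near-match.
            known, upload = Image.open(known_path), Image.open(upload_path)
            phash_close = (imagehash.phash(known) - imagehash.phash(upload)) <= 8
            dhash_close = (imagehash.dhash(known) - imagehash.dhash(upload)) <= 8
            return phash_close and dhash_close

    Requiring agreement from both cuts down on false positives, at the cost of missing more heavily edited copies - which is the trade-off being asked about.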

    Natanael,

    Or just making posts approval only with a mod queue

    fkn,

    I’m surprised this isn’t linked; there are services that do this for you.

    And they are free.

    blog.cloudflare.com/the-csam-scanning-tool/

    MsPenguinette,

    I believe there are several readily available databases of hashes of CSAM material for exactly this kind of scanning. Looks like there are some open source ones.

    Some top results: github.com/topics/csam

    This looks to be the top project: prostasia.org/project/csam-scanning-plugins/

    BuddyTheBeefalo,

    Could they not just change one pixel to get another hash?

    gabe,

    I think having a means of viewing uploaded images as an admin would be helpful, as well as disabling external image caching. Like an “uploaded” gallery for admins to view that can potentially hook into PhotoDNA/CSAI-Match or whatever.

    BURN,

    Probably hashing and scanning any uploaded media against some of the known DBs of CSAM hashes.

    Iirc that’s how Reddit/FB/Insta/Etc. handle it

    shagie,

    They’re sent to a 3rd party that does the checks. For example developers.cloudflare.com/cache/…/csam-scanning/

    The actual DB of hashes isn’t released to the public as it would enable those who traffic in such content to use it to find the material that doesn’t match much more easily.

    protectingchildren.google/#tools-to-fight-csam

    That appears to be the one that Facebook and Reddit use.

    CoderKat,

    The sad thing is that all we can usually do is make it harder for attackers. Which is absolutely still worth doing, to be clear. But if an attacker wants to cause trouble badly enough, there’s always ways around everything. Eg, image detection can be foiled with enough transformation, account age limits can be gotten past by a patient attacker. Minimum karma can be botted (even easier than ever with AI) and Lemmy is especially easy to bot karma because you can just spin up an instance with all the bots your heart desires. If posts have to be approved, attackers can even just hotlink to innocent images and then change the image after it’s approved.

    Law enforcement can do a lot more than we can, by subpoenaing ISPs or VPNs. But law enforcement is slow and unreliable, so that’s also imperfect.

    Serinus,

    The best feature the current Lemmy devs could work on is making the process to onboard new devs smoother. We shouldn’t expect anything more than that for the near future.

    I haven’t actually tried cloning and compiling, so if anyone has comments here they’re more than welcome.

    AustralianSimon,
    @AustralianSimon@lemmy.world avatar

    Reddit had automod which was highly configurable.

    over_clox,

    Reddit automod is also a source for all the porn communities. Have you ever checked automod comment history?

    Yeah, I have. Like 2/3 of automod comments are in porn communities.

    reddit.com/…/a_dump_of_random_subreddits_from_aut…

    AustralianSimon,
    @AustralianSimon@lemmy.world avatar

    What? Reddit automod is not a source for porn. What you’re seeing is the large quantity of content it reacts to there.

    It literally reads your config in your wiki and performs actions based on that. The porn communities using it are using it to moderate their subs. You can look at the post history: www.reddit.com/user/AutoModerator It is commenting on posts IN those communities as a reaction to triggers, but isn’t posting porn (unless they put that in their config).

    Not worth it if you don’t moderate on Reddit, but read the how-to docs for Reddit automod; it is an excellent tool for spam management, and the source from before Reddit acquired it (and made it shit) is open. www.reddit.com/wiki/…/full-documentation
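    For a sense of what that wiki config looks like, here is an illustrative rule in AutoModerator’s YAML syntax (the thresholds and wording are made up; “filter” holds the post in the mod queue for manual review):

        type: submission
        author:
            account_age: "< 7 days"
            combined_karma: "< 10"
        action: filter
        action_reason: "New account - hold for manual review"

    Something similar for Lemmy would need equivalent moderation hooks, which is part of what people in this thread are asking for.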

    over_clox,

    No shit, ya don’t say?

    Where the hell you think I got that list from? I literally filtered every single subreddit that AutoModerator replied in for like three months.

    Bruh you’re preaching to the person that accumulated the data. That’s the data it puked up. I can’t help it that most of them happen to be filth communities.

    AustralianSimon, (edited )
    @AustralianSimon@lemmy.world avatar

    So you should understand that what you said is invalid. Automod doesn’t post porn without a subreddit owner configuring it to, and just because 2/3 of its comments are in NSFW subs doesn’t mean it is posting that content, just that it’s working more there.

    We could 100% take advantage of a similar tool, maybe with some better controls on what mods can make it do. I’m working to bring BotDefence to Lemmy because it is needed.

    over_clox,

    You completely missed the point.

    By the statistics of the data I found, most of the subreddits using AutoModerator are filth communities.

    So you can reverse that, check AutoModerator comment history, and find a treasure trove of filth.

    I can’t help that these are the facts I dug up, but yeah AutoModerator is most active in porn communities.

    AustralianSimon,
    @AustralianSimon@lemmy.world avatar

    Too stupid to argue with. You don’t even understand your own “data”.

    over_clox,

    No no, I am well aware it’s a bot account which is programmed by moderators to filter out certain things and perform other automated tasks.

    It just so happens that many of the communities that AutoModerator has to take action in are filth communities.

    snowe,
    @snowe@programming.dev avatar

    That statement is just outright wrong though. They could easily use Cloudflare’s CSAM monitoring and it never would have been a problem. A lot of people in these threads, including admins, have absolutely no idea what they’re talking about.

    sunaurus,
    @sunaurus@lemm.ee avatar

    Cloudflare CSAM protection is not available outside of the US, unfortunately.

    snowe,
    @snowe@programming.dev avatar

    There are several other solutions including ones from Microsoft and Facebook.

    SubArcticTundra,

    I was just discussing this under another post, and it turns out that the Germans have already developed a rule-based auto moderator that they use on their instance:

    github.com/Dakkaron/SquareModBot

    This could be adopted by lemmy.world by simply modifying the config file

    Ghostalmedia,
    @Ghostalmedia@lemmy.world avatar

    The amount of people in these comments asking the mods not to cave is bonkers.

    This isn’t Reddit. These are hobbyists without legal teams to a) fend off false allegations or b) comply with laws that they don’t have any deep understanding of.

    obinice,
    @obinice@lemmy.world avatar

    Yeah, you’ve got to think of this place like the big forums of 20 years ago, they’re just run by a tiny handful of regular people, and having to deal with solicitors and other such stuff is entirely out of the question.

    If something’s bad, you lock it down and purge it until it’s not bad any more. Unfortunately that’s the best you can do with such minimal resources as a regular member of the public, and for those that don’t like it, there’s other forums out there.

    This isn’t one single huge monopoly thing like Reddit, where you either stay or leave forever; if you don’t like how one is being run, just sign up on a different one. Takes the stress out of it :-)

    T156,

    Lemmy instances are also international, which would cause more problems.

    This instance is Finnish, and lemmy.ml is Malaysian(?), each with their own separate administration teams. Whereas Reddit knows that all of the Reddit servers are in a few known countries, and they have the capacity to make big changes across all of them as needed.

    Zeth0s,

    Server and URL are not necessarily in the same location. A URL is just a human-friendly alias for the IP of a server, which can be located anywhere.

    T156,

    ml is Mali’s TLD. They reclaimed it recently, but the instance itself is hosted in Malaysia, if memory serves.

    Poppa_Mo,

    This is flat out disgusting. It’s extremely questionable that someone has an arsenal of this crap to spread to begin with. I hope they catch charges.

    Whitehat93875,
    @Whitehat93875@lemmy.world avatar

    deleted_by_author

    Piecemakers3Dprints,
    @Piecemakers3Dprints@lemmy.world avatar

    You must realize the logical fallacy in that statement, right?

    HawlSera,

    See that’s the part of this that bothers me most… Why do they have so much of it? Why do they feel comfortable letting others know they have so much of it? Why are they posting it on an open forum?

    The worst part is, there is not a single god damn answer to ANY of those that wouldn’t keep a sane person up at night… shudder

    mayo,

    I’m sure it’s not hard to find on the dark web. Child porn is one of those horrible things that is probably a lot more widespread than anyone wants to know.

    I don’t really get why they are doing this though.

    HawlSera,

    I hate to get conspiratorial, but it’s possible Reddit paid some people to do this to snuff out the competition.

    Isn’t Spez a pedophile?

    utopianfiat,

    I hope the devs take this seriously as an existential threat to the fediverse. Lemmyshitpost was one of the largest communities on the network both in AUPH and subscribers. If taking the community down is the only option here, that’s extremely insufficient and bodes death for the platform at the hands of uncontrolled spam.

    dipshit, (edited )

    I’ve just finished arguing with other lemmy users about how admins aren’t interested in taking on your legal risk. That was for the topic of piracy. CSAM is another issue entirely. Not only can lemmy users not expect to see a CSAM-friendly instance, lemmy users should expect to be deanonymized by law enforcement. Fuck around with kids and find out.

    Downvote this message if you are a pedophile, as I’m taking the stance that CSAM should not be allowed on lemmy servers.

    utopianfiat,

    Are you seriously conflating my position with arguing that CSAM should be allowed?

    dipshit,

    Are people having a difficult time reading today? It’s not just you. Maybe it’s this topic and how it intermeshes with technology. Some people seem to think that there’s a technical solution for this already (one that works as well if not better than human moderators).

    No, I don’t think you personally are advocating for CSAM to be allowed. I think commenters are getting a little uppity about missing out on their favorite community while the admins deal with content that is:

    • harmful to children
    • damaging to the admin’s psyche
    • damaging to the user’s psyche
    • against the law

    Imagine you owned an instance, and you found 100 moderators for your communities. You rest your head on the pillow and go to sleep. You wake up and find that some user has written a script to post CSAM on all your communities, because “fuck you that’s why”. You get on the line with your moderators and they tell you they’ve been battling this all night, just banning people and deleting comments on sight. They tell you they’ve had to turn off a few communities and that some users are complaining. Your hard work for weeks and months to get this instance to a healthy place is being tested. You get an email from your hosting service, saying that they have reports that your site contains CSAM and that’s against ToS - they give you a day to get it under control before they boot your server or turn it over to police. Imagine in this case you make the drastic move to simply pull the plug - taking the entire instance offline until you can sort it through. Now imagine some users come in and start complaining about how you, dear admin, are killing the fediverse. Personally, I have no sympathy for those users who complain about their community or instance being taken offline while admins deal with real shit.

    utopianfiat,

    I think commenters are getting a little uppity

    What praytell the fuck do you mean by this term specifically

    dipshit,

    I’ve spent the better part of this morning explaining to people the fact that a community needs to be shut down in order for volunteers to work on cleaning it up in the time they have available.

    commenters seem to be pretty upset that something as “drastic” as turning off a community needs to be done. Some commenters have gone so far as to say that the policy of turning off communities in response to handling CSAM is what will “kill the fediverse”.

    I think the normal response to this is: “Wow, this sucks. Thanks admins for doing your best work. I understand the community may not come back for a bit, take all the time you need!”. Yet, I hear “it’s the dev’s fault for not putting in the code for blocking CSAM and taking a community offline is unacceptable”. I call that “uppity” but there’s probably better words for it.

    cubedsteaks,

    Maybe it’s this topic and how it intermeshes with technology. Some people seem to think that there’s a technical solution for this already (one that works as well if not better than human moderators).

    So 4chan has this problem a lot, but they are also based in the US where it’s most definitely illegal, and they IP ban people and I think for the most part it works. It did suck though - I don’t go on there anymore, but in the last few years I did, if I was on mobile, I would often get hit with a region ban because so many people in that area were banned that they just decided to block an entire IP region to prevent anyone else posting illegal content.

    maybe look into IP and region banning to prevent someone from just making new accounts.

    dipshit,

    You’re discussing how to ban people, this isn’t the problem.

    The problem is this: In the last hour, 10,000 images were uploaded. Some of those contain CSAM. Now, you have 1 hour to find all the CSAM photos (0 to 10,000 of them). In the next hour, another 10,000 images will be uploaded, some of them containing CSAM…

    Unless you have a lot of human moderators, you’re going to use automated tools and get false positives or false negatives.

    A site like 4chan banning whole regions isn’t a great example of handling this well. I don’t think I need to explain (but maybe I do) that one person in a region who is posting CSAM doesn’t mean the entire region posts CSAM. You could just opt to block all regions by pulling the site off the internet. Not to mention, does this now mean that 4chan allows CSAM for certain regions? Yikes. “Children can be abused only in these countries” “I’m sorry but your country’s laws prevent images of children being abused, so this content is banned”. Yikes.

    maybe look into IP and region banning to prevent someone from just making new accounts.

    Again, the technical issue isn’t on banning. Here’s the code to ban user at IP 1.2.3.4:

    if ( $_SERVER['REMOTE_ADDR'] === '1.2.3.4' ) { die('nope!'); }

    Here’s the code to ban a user at a specific region (pseudocode):

    $geoip = new GeoIPDB(); $region = $geoip->get_region( '1.2.3.4' ); if ( $region === 'USA' ) { die('nope!'); }

    This isn’t difficult.

    Now, for the code to DETECT CSAM:

    look for skin tone tints (take into account all skin tone colors), look for quantity of skin on image (this would make close-ups of arms possible nude detections), detect a person in the photo, determine the person’s age by the photo… don’t detect images of art or of artful nudes, etc… or you know this is a lot of work, let’s make the humans detect instead.

    cubedsteaks,

    Region banning would prevent anyone in the area from posting. I even mentioned that I used to come across bans meant for other people. In the case of 4chan, when they region ban, it’s possible someone else will be prevented from posting.

    Not to mention, does this now mean that 4chan allows CSAM for certain regions? Yikes

    No, it’s against their TOS entirely. It’s readable on their site and they do enforce rules even though they also enable people to be shitty in other ways.

    Now, if you want to talk about legality in other countries - that’s a different discussion. The internet is open to the WORLD. And all I would be comfortable confirming is that it’s definitely illegal in the US where I am. I’m not gonna get into other countries where it might not be illegal. I don’t know enough about those places to be able to tell you more.

    Basically a region ban would be similar to just pulling that instance down. Preventing whatever region that person was posting in would prevent them from posting as well as making local accounts to try and post more.

    When I was downtown where I live and got a ban that wasn’t meant for me (I was just in the region that was banned), I was able to appeal my ban. In order to appeal, you have to be good at using your words because a person has to sit there and read the appeal to make the decision to unban or not. Mine always went through but I also am capable of talking things out and I’m smart enough to know when to properly explain myself.

    Other people didn’t get their appeals and I would see them complain about it elsewhere.

    Anyway, you don’t need to condescend to me. I’m not against what you’re saying. I agree with a lot of what you said in other comments.

    dipshit,

    I mentioned this before but I’m sorry that I didn’t see who I was responding to. I usually respond on the internet to ideas, not people. Today I’ve been responding a lot to the idea that CSAM is easy to fix and that for reasons unknown it just hasn’t been done with lemmy and the way it’s being done with lemmy isn’t “the right way”.

    GeoIP databases aren’t perfect, which is another problem entirely. It’s better than pulling the plug on the entire internet, sure, but it has its own problems.

    I was responding to the idea of gating csam content via geoip as “yikes” because I can’t find myself personally allowing CSAM in some countries, because it’s “legal” in those countries. This is a moral argument I’m making, but I am happy imposing the US law as it relates to CSAM being illegal (not US law such as FOSTA/KOSA, etc… those are a different can of worms entirely) on other countries. Or to put it another way, as an admin, if I get an email saying “actually bro in country xyz we get to abuse children”, it won’t sway me into allowing that content in that country. IF someone in that country wants to put up a site for that country, that’s their problem (and if I could intervene and prevent them from doing so, I would).

    cubedsteaks,

    Today I’ve been responding a lot to the idea that CSAM is easy to fix and that for reasons unknown it just hasn’t been done with lemmy and the way it’s being done with lemmy isn’t “the right way”.

    Right, it’s definitely not an easy fix, and Lemmy doesn’t even operate the way other sites do, but today I’m learning that these instances seem to be easily exploitable.

    The reason I mentioned region banning is because it definitely worked. There weren’t people uploading 10000 images of CSA cause if you tried to, you’d get banned so hard that you’d ruin it for other people posting nearby.

    I was responding to the idea of gating csam content via geoip as “yikes” because I can’t find myself personally allowing CSAM in some countries, because it’s “legal” in those countries

    I agree. Honestly, if I was in charge in any way - those countries just wouldn’t be allowed access. And that does happen. I used to work for an app where we had people working in the Philippines who couldn’t access the app itself. We had to just give them info and they would feed it to the customers. And it was because their country is blocked from viewing the app in the first place. They’re just straight up not allowed to use it there.

    Like I’m totally with you. Fuck MAPs, fuck all of em. If some archaic country still participates in something that is obviously harmful to people - yeah, impose these laws on them. Tell them to fuck off until they stop this shit.

    And lets be real. It’s gonna be years before they ever stop.

    dipshit,

    The reason I mentioned region banning is because it definitely worked. There weren’t people uploading 10000 images of CSA cause if you tried to, you’d get banned so hard that you’d ruin it for other people posting nearby.

    That’s an interesting point. I didn’t take into account that 4chan might use region banning as a way to shame other anons by removing access from their country. That’s an interesting approach and I guess that’s something that lemmy admins could use in their toolbag. Users would absolutely hate it more than a simple community being banned, but whatever works, or at least helps decrease the amount of this in existence.

    cubedsteaks,

    Oh right, and it wasn’t country based. I’m in a large city and only the downtown region of my city was banned. If I went back home, I could easily get on the site and post.

    dipshit, (edited )

    Yeah, there isn’t any geographic information required when giving out IP addresses. Companies like MaxMind maintain a large “GeoIP” database that tries to match IP addresses with locations and is pretty good, but isn’t 100%. VPNs also make the situation worse; as it sounds like with 4chan, someone could VPN to a country they didn’t like, post CSAM and get that country banned. It also means users can circumvent the bans with VPNs. All of this makes region banning pretty useless. Its value likely comes from users who don’t know how to use VPNs, but do know what 4chan is.
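    For reference, a lookup against one of those databases is only a few lines with the geoip2 Python library (the database path and IP below are placeholders, and the answer is only as good as MaxMind’s data):

        import geoip2.database

        reader = geoip2.database.Reader("GeoLite2-Country.mmdb")  # MaxMind's free country-level DB
        response = reader.country("203.0.113.7")                  # placeholder IP
        print(response.country.iso_code)                          # e.g. "US" - a best guess, not ground truth
        reader.close()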

    And then there was the issue with people distributing CSAM via sprays on counterstrike servers. Just joining a server would mean downloading those sprays onto your machine without your knowledge, just so they could be visible in game (which is more or less how the web works anyway), but the point here was that someone could create hundreds of sprays with CSAM and then, even if they were never used, every player would have that content somewhere on their computer.

    Edit: not sure why I brought up the counterstrike thing. I think I was trying to make the point thats unfortunately sometimes this material seems to be weaponized. All around awful.

    cubedsteaks,

    Yeah I mean, I only mentioned how they do this on 4chan with region bans because they have worked. Believe it or not, people don’t just get VPNs to spam CP there.

    They used to do that back in like what… 2005 ish? Maybe even into like 2008, or around that era? In all my years wasted on that site, no one was just spamming CP and getting away with it. It was always taken down quickly, the bans were put in place and the spamming of illegal content stopped.

    And people there definitely use VPNs but to do things like torrent or pirate usually. They aren’t all dumb, they just have bad morals in general over there.

    I’m familiar with the CS sprays you’re talking about! This was an issue in other games too, where you could upload images as a spray to put on a wall in a game. I believe my ex and I used to spray the image from Goatse in one of the earlier Call of Duty games.

    My ex ran his own server though. If someone uploaded CP, he would just ban them.

    dragontamer,

    Again, the technical issue isn’t on banning. Here’s the code to ban user at IP 1.2.3.4:

    How does an IP Ban work when this attack came through a different, legitimate, federated Lemmy server?

    Katana314,

    I don’t think the comment above was trying to express dissatisfaction towards Lemmy’s hosts for failure to respond. They’re simply stating that the way things are all set up, much as we might like it, has serious problems - ones that may end up being considered unsolvable. As you said, we might be heading for an eventual plug pull.

    It’s like pointing out that cars produce fossil fuel exhaust. It sucks, and we’re seeing it as unsustainable, but there’s no convenient alternative yet.

    dipshit,

    Things are set up the way they are because it’s the best way that admins (not just of lemmy instances but of major sites like reddit and facebook) have found to handle these situations.

    You could take it a step further and give law enforcement their own backdoor to your site, as Facebook has done, but I would not advocate for that solution. We are in a special place in the internet where we can somewhat self-police our own content, assuming we actually self-police our own content. The way we do this is the way these admins are currently handling this.

    It may be reasonable to think that sites like reddit and facebook have it all figured out, but all they have is similar code to what lemmy has, plus a bit more money to pay some content moderators on trust and safety to actually remove this content before users get a chance to see it. The difference between those sites and lemmy is $$$ and that’s not something that’s likely to change anytime soon.

    dipshit, (edited )

    Sorry to hear about your investment in lemmy. How much did you end up investing? It just sounds like you’re very unsatisfied with the value that lemmy has provided.

    Personally, I don’t pay for lemmy. Lemmy is free, as far as I understood it. As it is free, I can’t really dictate the legal risk that the admins have to go through, as I do not have power over them, and because I treat them as humans.

    But yeah, I guess if you have a good reason, they really should be falling over backwards to moderate all the CSAM away from your favorite community. You are an all-powerful being.

    Edit: Sarcasm on the internet doesn’t work well so let me be frank: admins aren’t responsible for going to jail for a user’s desire to post CSAM. admins have a right to shut down a community that posts CSAM or remove CSAM or any material they find objectionable from their site. Admins take on the legal risk of the content on the site and OWE USERS NOTHING. Y’all can “the customer is always right” all you want but if you aren’t the one paying you aren’t the customer and you aren’t right.

    wanderingmagus,

    I feel like you didn’t actually read their comment before posting, !dipshit

    It has nothing to do with Lemmyshitpost being their “favorite community” and they never mentioned “investing” or “value”. That’s all from you. Stop strawmanning their position. They were criticizing the ease with which entire communities can be taken down by single individuals. Additionally, it seems you are contradicting your own post from 20 minutes prior to your current comment. Perhaps you responded to the wrong comment?

    dipshit, (edited )

    I read it. Let’s read it together again.

    I hope the devs take this seriously as an existential threat to the fediverse.

    The developers who build lemmy aren’t able to put in CSAM blocking code. That’s not how this works. I assume the commenter meant to say “admins” here, as developers write the code, they don’t admin the sites. If a developer has a lemmy instance they admin, then they are both a dev and an admin. Lemmy wasn’t built for CSAM sharing specifically, it is a site that allows sharing of CSAM as much as reddit or facebook do. The devs can’t do much about this. The admins and mods can.

    Lemmyshitpost was one of the largest communities on the network both in AUPH and subscribers.

    Neat. Irrelevant, but cool.

    If taking the community down is the only option here, that’s extremely insufficient and bodes death for the platform at the hands of uncontrolled spam.

    This I take issue with, and is what I mostly responded to.

    “If taking the community down is the only option here” well no, it’s not. We could just get 100’s of mods to specifically address this one user’s posting of CSAM. Hey, anyone want to moderate the site? Oh right, and they’ll need to be vetted, and they’ll need to keep doing this on the side for free as volunteers since lemmy is volunteer run…

    “that’s extremely insufficient” hard disagree. A community is liable for the content on it. If we put a CSAM post up on a site and leave it around for a few minutes, that’s one thing. If it’s left up for days and weeks, that’s quite another problem entirely. The minute that an admin or mod saw CSAM material, they did the right thing by shutting that down. Even if it means downtime for users. Oh no! Users can’t read lemmyshitpost and now the world is ending.

    “and bodes death for the platform at the hands of uncontrolled spam.” Welcome to the internet, where all platforms are at the risk of uncontrolled spam. At first it was just email, but then it was bulletin boards, and then message boards, and then forums, and then community-moderated forums like reddit and lemmy. This has and will be a problem. This isn’t a new concern for lemmy devs or admins or mods, they all are aware that this can happen and is why they do what they do. Turning off the community is a viable option, and is what has happened in larger companies too while they cleaned up the mess.

    Additionally, it seems you are contradicting your own post from 20 minutes prior to your current comment. Perhaps you responded to the wrong comment?

    I’ve been very consistent in my arguments. Show me the contradiction and I’ll address it.

    TL;DR: users cannot expect to be allowed to post CSAM material on lemmy instances. Allowing CSAM material to be up on lemmy instances constitutes a legal risk for admin owners, and thus we cannot leave it up. Blocking a community (even if it’s like the bestest and most favorited and most subscribed and everyone loves it and wow just super-duper community) is a viable means of blocking all CSAM on that community while it is cleaned up. To suggest that the community should have stayed online is asinine. To suggest that the admins should not have blocked a community to combat CSAM is asinine. Trust admins to do their jobs.

    wanderingmagus,

    They aren’t asking devs to be admins or for admins to be devs. They specifically called out the developers because code exists to filter child sexual abuse material, disseminated by organizations such as the FBI and law enforcement, which can be implemented for image uploading.

    NOBODY in this comment section is advocating for uploading fucking child sexual abuse material. That is a strawman you are setting up. Nobody is advocating for allowing the uploading of child sexual abuse material, or for the “material to be up on lemmy instances”. NOBODY is suggesting that a single instance going down is “the world is ending”. NOBODY is asking for “100’s of mods to specifically address this one user’s posting of CSAM”.

    You’re setting up a strawman argument nobody is proposing. The criticism is that, at this moment, the developers of Lemmy have not implemented a method for automatically vetting uploaded images for CSAM without requiring “100’s of mods”, which is what resulted in the condition that “taking the community down is the only option here”.

    Perhaps the wording of the original post was not precise and accurate enough for your full and complete understanding of the intent and meaning behind it. In this post, I have attempted to elucidate that intent and meaning to a degree which I hope is understandable to you.

    dipshit, (edited )

    They aren’t asking devs to be admins or for admins to be devs. They specifically called out the developers because code exists to filter child sexual abuse material, disseminated by organizations such as the FBI and law enforcement, which can be implemented for image uploading.

    Yeah? I doubt this is true but I could be wrong. You make it sound like preventing CSAM is as simple as importing a library, something I find dubious. Companies have been trying to filter out this material in an automated fashion for decades and yet they still have to employ humans to do it manually because automated means don’t really work. This is why companies like Reddit and Facebook have trust and safety teams to do this work.

    Edit: I googled and could not find this database. I’m thinking it’s a myth.

    NOBODY in this comment section is advocating for uploading fucking child sexual abuse material. That is a strawman you are setting up. Nobody is advocating for allowing the uploading of child sexual abuse material, or for the “material to be up on lemmy instances”. NOBODY is suggesting that a single instance going down is “the world is ending”. NOBODY is asking for “100’s of mods to specifically address this one user’s posting of CSAM”.

    ahem there were users who uploaded CSAM. Those are the users who were advocating for uploading CSAM, because they uploaded CSAM.

    I’m literally arguing with people who are saying that they shouldn’t have shut down the community because it’s big, and that shutting down the community (not CSAM) poses a threat to the fediverse. Maybe, but CSAM poses a legal threat, which is much greater than the threat of low engagement.

    You’re setting up a strawman argument nobody is proposing. The criticism is that, at this moment, the developers of Lemmy have not implemented a method for automatically vetting uploaded images for CSAM without requiring “100’s of mods”, which is what resulted in the condition that “taking the community down is the only option here”.

    Yeah, that doesn’t exist, as I’ve mentioned previously. You make it sound like getting CSAM off lemmy is as simple as writing some code - if it were, why don’t facebook and reddit do this?

    Perhaps the wording of the original post was not precise and accurate enough for your full and complete understanding of the intent and meaning behind it. In this post, I have attempted to elucidate that intent and meaning to a degree which I hope is understandable to you.

    You’re not understanding how CSAM detection works or is handled.

    The grim reality is this: cameras exist, children exist, adults exist, the internet exists, and the second that a crime is committed, it is not added to an FBI database. If such an FBI database existed and IF it was useful (and not just a database of hashes for bit-perfect copies of CSAM) and IF it were updated when evidence of the crime surfaces… IF all of those things are true, THEN it means there’s still likely a huge swath of CSAM material out there that could be posted at any time, and that would NOT be detected.

    Again ask yourself, IF such a database existed, then WHY does reddit, twitter, facebook, hell, why doesn’t every or any site use it?

    Pedophiles, instead of downvoting me, why not explain yourself?

    wanderingmagus,

    As another commenter posted below:

    But tools do exist. PhotoDNA by Microsoft. Although much more user-friendly implementation if you use Cloudflare, related links:

    As far as I am aware, every major site does use it in addition to manual vetting for any flagged “borderline” or “uncertain” results caught up in the filter.

    Blaze,
    @Blaze@discuss.tchncs.de avatar

    Db0 even created a tool for Lemmy:

    lemmy.dbzer0.com/post/2896209

    dipshit,

    As far as I am aware, every major site does use it in addition to manual vetting for any flagged “borderline” or “uncertain” results caught up in the filter.

    I think this is where you could be wrong here. I appreciate the links, I’ll look into those in more detail. My best understanding is that these tools generate so many false-positives and false-negatives that it’s not worth using them. It may be a first line of defense until real humans get to see them, but my point is that humans are still needed. When humans are included because the system isn’t 100%, it means humans do the labor and as such, with limited time, humans need to determine when they can do the labor - sometimes shutting down a community is the best way to stop the flood while they clean up the mess.

    ricdeh,
    @ricdeh@lemmy.world avatar

    This is just a matter of confirmation bias from your side now. You stubbornly refuse to accept factual information very helpfully delivered to you by users who have many better things to do than respond to your inquiries, and you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question. And while you do all that, you belittle the other users in the community by referring to your supposedly superior knowledge and experience, however somehow failing to provide any data or secondary sources to back up your claims.

    dipshit,

    This is just a matter of confirmation bias from your side now. You stubbornly refuse to accept factual information very helpfully delivered to you by users who have many better things to do than respond to your inquiries, and you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question. And while you do all that, you belittle the other users in the community by referring to your supposedly superior knowledge and experience, however somehow failing to provide any data or secondary sources to back up your claims.

    Hello Richard, thanks for taking the time to present your criticism. I take some issue with it, but I believe it’s due to a misunderstanding. I’ll explain.

    This is just a matter of confirmation bias from your side now.

    I do think there is some confirmation bias at play here but I’m thinking you may be surprised where it’s coming from.

    You stubbornly refuse to accept factual information very helpfully delivered to you by users who have many better things to do than respond to your inquiries, and you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question.

    This factual information, to the best as I can understand from reading the comments is that these tools exist. I don’t deny that they exist. I didn’t deny they exist.

    users who have many better things to do than respond to your inquiries

    Unfortunately, it would seem you are wrong here, as they have responded to my inquiries. Or, are you talking about the people who haven’t responded to my inquiries? Either way, I agree that there are people who have or haven’t responded to my inquiries that do have better things to do. What is also true is that the people who responded to my inquiries, even if they had better things to do, still responded to my inquires.

    you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question.

    But, I didn’t, is the thing. In the cases where people have provided links to projects, I’ve thanked them. I’ve mentioned my skepticism and I’ve also wished them well on their projects.

    “Advanced and reliable” are marketing terms, and I don’t really care to use them as they have no meaning. Advanced how? In that it uses neural networks? NEAT! Reliable how? In that they work 100% of the time? That they don’t generate false-positives or false-negatives? That they don’t degrade the user experience? These are questions worth asking… but… let me be clear: they are questions worth asking for the sake of improving these tools and maintaining the user experience; these are not meant to discourage use of such tools. I believe admins should use all tools available to them, including turning the servers off if they need to - and that toolbelt includes AI-based tools and scripts.

    And while you do all that, you belittle the other users in the community by referring to your supposedly superior knowledge and experience, however somehow failing to provide any data or secondary sources to back up your claims.

    Richard, is it, or can I call you dick?

    Either way, Richard, I don’t claim to have superior knowledge. I honestly thought that would come across in my username. Sorry that it did not. I’m a dipshit.

    Like, an actual dipshit. I’m dumber than a lot of people on here. That doesn’t mean that I’m the dumbest, but it doesn’t mean I’m the smartest either. I’m far from being the smartest on this site, and that’s not impostor syndrome. If I sound smart it’s just because I do know a bit on some topics relating to technology and development, and because I have many interests in many topics. What I have at best is a baseline understanding, and I try to remain humble about it. I got pretty emotional in this thread, calling people who disagreed with my very much hard-to-disagree-with stances that “CSAM bad” and “admins should feel free using all tools available to remove and prevent CSAM” names that got me a 3-day ban. A ban which is now ending, allowing me to finally respond to you.

    You see the problem here is that the people I’ve been responding to are people who are misunderstanding some things, which I’m trying to clear up. I believe this is one of the main draw to any message board system, for most people. We like to communicate and share knowledge.

    however somehow failing to provide any data or secondary sources to back up your claims.

    I don’t walk around with sources handy for everything I say, ready to cite them in every single post. I also don’t see many people doing this. I don’t lie, and I’m happy to provide links, sources, whatever - when asked for them. No one’s asked for sources for my claims that “CSAM bad” or that “admins should feel free using all the tools available to them…” or even when I say that AI tools aren’t perfect and lead to false-positives and false-negatives. But, I can give them to you if you want to see them. I don’t talk out my ass. One easy example I’ve been giving people is what’s happening and has been happening for years to this one youtuber, who constantly gets her videos flagged as having a child in them, despite her being 30 and only having herself in her videos. Her experience is valid, as there are some people who will constantly be falsely identified (which again is just stating facts, not suggesting these tools shouldn’t be used - that would suggest that these tools cannot be improved, which they can and are).

    Richard, your comment was a slam-dunk. I’m just not sure you hit the right net. Please let me know if I am misunderstanding anything.

    Yours, Dipshit.

    utopianfiat,

    The developers who build lemmy aren’t able to put in CSAM blocking code. That’s not how this works.

    They absolutely can, and every forum under the sun has tools and extensions to help with this. Fucking 4chan has code specifically dedicated to deal with CSAM. You have no clue what you’re talking about.

    Oh no! Users can’t read lemmyshitpost and now the world is ending.

    Replace this with !technology, or !selfhosted, or !announcements. “Oh no, users can’t read the entire site” yes that is the definition of the end of the site.

    You’re not seeing that this isn’t a lemmyshitpost issue, it’s an “any popular community on lemmy” issue. Snarkily taking potshots at lemmyshitpost as a community doesn’t change it.

    Turning off the community is a viable option

    It’s not “not an option”, it’s the last resort. It’s like saying that your only option when you see a roach in your apartment is to burn the whole building down. Because doing it means you don’t have a community anymore, and without communities the site has no purpose.

    dipshit,

    They absolutely can, and every forum under the sun has tools and extensions to help with this. Fucking 4chan has code specifically dedicated to deal with CSAM. You have no clue what you’re talking about.

    Cool, name them and give me links then. I could not find such on the internet. There is software that tries to detect this, but even youtube’s algorithm is incorrectly flagging fully clothed 30 year old women as children.

    Links or you’re talking out your ass.

    Replace this with !technology, or !selfhosted, or !announcements. “Oh no, users can’t read the entire site” yes that is the definition of the end of the site.

    Replace a site with CSAM and you’ll find it’s not a site you’ll want to go to in the first place. READ the original post where it was mentioned this is a stop-gap measure.

    You’re not seeing that this isn’t a lemmyshitpost issue, it’s an “any popular community on lemmy” issue. Snarkily taking potshots at lemmyshitpost as a community doesn’t change it.

    You’re… offended that I had snark on the topic of lemmySHITPOST? surely, you are joking.

    My point is not that this community is shit and that’s why this happened.

    My point is that this is a community on a lemmy instance that was flooded with CSAM, and was shutdown because of the flood of CSAM.

    It’s not “not an option”, it’s the last resort. It’s like saying that your only option when you see a roach in your apartment is to burn the whole building down. Because doing it means you don’t have a community anymore, and without communities the site has no purpose.

    You do see how turning a community off and then on again isn’t the same thing as burning down a house (and unburning it again)?

    You do realize that we’re talking about a literal crime against children vs your ability to see memes? Fuck off with your self-importance.

    utopianfiat,

    Replace a site with CSAM and you’ll find it’s not a site you’ll want to go to in the first place.

    Are you being intentionally dense or do you not understand that it’s my point? If someone can flood lemmy with CSAM so easily that the only way to stop it is a site shutdown, then there are not sufficient mitigation measures in place.

    dipshit,

    Are you being intentionally dense or do you not understand that it’s my point? If someone can flood lemmy with CSAM so easily that the only way to stop it is a site shutdown, then there are not sufficient mitigation measures in place.

    Yes, this is the Internet. Take your statement and replace “lemmy” with “reddit” “facebook” “9gag” “imgur” etc… No site has “sufficient mitigation measures in place” as CSAM continues to flood the internet.

    “Flooding” a site with CSAM is a matter of opinion. If one person posted one image of CSAM on my instance, that would be flooding - that’s one image too many. It’s not like there’s some magic threshold of the amount of CSAM allowed on a site. All sites use human moderators to detect CSAM, and all sites who do this have teams that are far too small and far too underpaid for the most part.

    Underpaid being the keyword here, as lemmy admins are volunteers. I would think that the threshold for “flooding” a lemmy instance with CSAM would be far lower than that of a major for-profit site.

    cubedsteaks,

    Yes, this is the Internet. Take your statement and replace “lemmy” with “reddit” “facebook” “9gag” “imgur” etc… No site has “sufficient mitigation measures in place” as CSAM continues to flood the internet.

    it’s true, if I remember correctly, tumblr was removed from the App Store because of CSA issues. I could be remembering wrong and maybe it was the Google Play Store.

    dipshit,

    This is likely going to be an issue for any site that hosts NSFW content. It’s easier to have mods just ban all NSFW content than to try to figure out whether the people in the content are consenting adults.

    If the admins of nsfw instances aren’t already on high alert, they should be now.

    cubedsteaks,

    It’s easier to have mods just ban all NSFW content than to try to figure out whether the people in the content are consenting adults.

    Yeah, I fully agree with that. If people want porn, they should go to porn sites.

    cubedsteaks,

    Cool, name them and give me links then. I could not find such on the internet. There is software that tries to detect this, but even youtube’s algorithm is incorrectly flagging fully clothed 30 year old women as children.

    have you ever talked to janitors and mods on 4chan? Good luck getting any info out of them.

    dipshit,

    Do you realize that 4chan isn’t the full internet? That these programs that you already know of can exist outside of 4chan? I’m asking you - the person who knows of these apps - to provide links to back up your claims.

    cubedsteaks,

    I’m not the other person you responded to and I never claimed to have any apps or links.

    I’m just telling you how this works on 4chan. I’m aware that’s not the entire internet obviously - your sarcasm needs work considering we are both here on Lemmy, ie, not 4chan.

    If anyone on there is using these programs/apps/whatever, they’re not just gonna tell other people about them.

    And as far as I know (I haven’t been on 4chan in like 3 years now), they region ban for CSAM.

    dipshit,

    I don’t really keep track of who I’m responding to. I just respond to comments. Sorry if I got mixed up here.

    My comments still stand, though the snark isn’t directed towards you specifically.

    I just want people to understand that there isn’t a solution that we all think exists in other places and not here. The solution is largely people. People who get PTSD from viewing and moderating these images. It’s not a good solution but it’s the best solution we have so far.

    The other truth of the matter is, if the Internet itself were to hypothetically shut down, this content would just be distributed via other means. The one nice thing about the internet is that lots of stuff is traceable back to the person who posted the infringing material.

    cubedsteaks,

    I just want people to understand that there isn’t a solution that we all think exists in other places and not here

    I agree with you on this. I agree with a lot of things you’re saying.

    part of responding means you know who you are talking to. I know it can be easy to forget someone is behind the screen here but there are in fact other human beings responding to you.

    Just try to be more considerate, if anything.

    newIdentity,

    Everyone got your sarcasm. We just think the Lemmyverse has no chance when it’s flooded with child porn

    dipshit, (edited )

    deleted_by_moderator

    DarthBueller,

    People are downvoting you because you’re acting like a dick.

    despotic_machine,
    @despotic_machine@lemmy.world avatar

    I don’t think they’re acting.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    If you’re a pedophile and disagree with me - instead of downvoting, why not explain yourself?

    People have been, but you’re not truly listening, Internet Warrior.

    dipshit,

    Whatever you say, kid.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    Whatever you say, kid.

    I’m over 50, but you keep doing you, Internet Warrior, as it just proves my point.

    dipshit,

    Your age does not come across in your writing. You sound like a kid, which is why I called you one.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    Your age does not come across in your writing.

    Well, let’s see …

    People have been, but you’re not truly listening

    That sounds to you like a sentence a young person would say, punctuation and all?

    dipshit,

    Yes.

    It doesn’t to you? How young do you think these people are? If you’re 50, there’s 49 different ages which are younger than you. Not all of them know how to write like that, but I would say at least the ones with a high school education do.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    Yes. It doesn’t to you?

    These days especially, I see very few young people who bother with things like commas, multiple paragraphs, or using words like “truly”.

    dipshit,

    You may want to keep your day job, and not move into the online-age-detecting sector, truly.

    Sorry for miss-generationing you.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    You may want to keep your day job, and not move into the online-age-detecting sector, truly.

    Well, for the record, you’re the one who started identifying the age of somebody online by what they wrote. So I would just advise you to follow your own advice.

    Sorry for miss-generationing you.

    Wow, assuming you weren’t just being sarcastic, an actual apology. Thanks. There’s still hope for you.

    cubedsteaks,

    I agree with a lot of what you said and upvoted you but you really need to just stop calling people pedos for disagreeing with you.

    I’m a victim of CSAM myself and you can take a look through my comment history where I talked about it in depth more. I hate pedos just as much as you do but going around calling people pedos isn’t going to do anything but upset people.

    dipshit,

    I’m taking the radical stance that CSAM isn’t a good thing, should be reported to law enforcement and that the site with CSAM can be shut down as a viable option for handling CSAM material.

    I’m getting downvotes from people who disagree with me on this “radical” stance. People who disagree that CSAM is a problem, that CSAM is a concern. I don’t have a lot of sympathy for people who promote CSAM like the people who downvoted my posts. I don’t care about the loss of internet points, I care that these worthless shits are still on lemmy, so yes, I call them what they are.

    newIdentity,

    You’re completely misinterpreting everything we said. If we shut down every site with CSAM, the internet wouldn’t exist. We don’t disagree that CSAM is a problem. We disagree with your solution.

    dipshit,

    You’re completely misinterpreting everything we said.

    Not at all. I am completely understanding you.

    If we shut down every site with CSAM, the internet wouldn’t exist.

    You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You must be smarter than this.

    We don’t disagree that CSAM is a problem. We disagree with your solution.

    My solution which is to remove CSAM? My solution to turn off communities while the CSAM issue is cleaned up? What about those solutions do you disagree with?

    Another question for you: if your house is flooding due to a burst pipe, what do you do first:

    a) get all the water out of the house
    b) turn off the water coming into the house

    My solution would be to do step B followed by step A. Your solution appears to be to just do step A, which means you’ll constantly be flooded and never have enough manpower to dry your house.

    I’d bet money that the following will happen:

    1. community gets turned off
    2. csam gets deleted, posters are identified, information turned over to law enforcement
    3. community gets turned back on.

    In the meantime, folks missing the community are free to go elsewhere on the internet. Why? Because CSAM is a crime which depicts Sexual Assault and the evidence is posted online. It’s not a matter of just deleting content, it’s also a matter of turning the people posting that content over to the police so they can be held accountable for their crimes.

    newIdentity,

    You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You must be smarter than this.

    Sorry let me word this correctly: social media wouldn’t exist.

    My solution to turn off communities while the CSAM issue is cleaned up? What about those solutions do you disagree with?

    No, your solution is to permanently shut down Lemmy, since there is the possibility of CSAM being on one instance. The community it’s posted in doesn’t matter. They can just keep spamming CSAM and the mods can’t do anything about it except shutting down the instance/community. Unless there are better tools to moderate. That’s basically what everyone wants. We want better tools and more automation so the job gets easier. It’s better to have a picture that is wrongly flagged as CSAM removed than to leave up one that is CSAM.

    The problem is that it won’t stop and that it will happen again.

    I’d bet money that the following will happen:

    1. community gets turned off
    2. csam gets deleted, posters are identified, information turned over to law enforcement
    3. community gets turned back on.

    You’re wrong at step 2. The posters might’ve used Tor, which basically makes it impossible to identify them. Also, in most cases LE doesn’t do shit, so the spamming won’t stop (unless someone other than LE does something against it). We can’t only rely on LE to do their job. We need better moderation tools.

    Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?

    It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.

    dipshit,

    Sorry let me word this correctly: social media wouldn’t exist.

And this is hardly the argument you think it is. Again, it’s not true of all social media sites, but let’s steelman your argument for a moment and say that you are referring only to the major social media sites.

Well then, we have a problem, don’t we? What’s something the major social media sites have that lemmy doesn’t? Ad revenue, to the tune of millions of dollars. What do they do with that revenue? Well, some of it goes to pay real humans whose entire job is simply seeking out and destroying CSAM content on the site.

So then how does lemmy, with only enough money to pay hosting costs, if that… deal with CSAM when a user wants to create a botnet that posts CSAM to lemmy instances all day? My answer is: the admins do whatever they think is necessary, including turning off the community for a bit. They have my full support in this.

No, your solution is to permanently shut down Lemmy since there is the possibility of CSAM being on one instance. The community it’s posted in doesn’t matter. They can just keep spamming CSAM and the mods can’t do anything about it except shutting down the instance/community, unless there are better tools to moderate. That’s basically what everyone wants: better tools and more automation so the job gets easier. It’s better to have a picture that was wrongly flagged as CSAM removed than to leave up one that actually is CSAM.

    You’re strawmanning my argument. I’ve never said forever. I’ve said while the community gets cleaned up. I’ve even described a timeline below.

The better tools you want to moderate are your own eyeballs. I’ve said this before, but there have been many attempts at making automated CSAM detection tools and they just don’t work as well as needed, requiring humans to intervene. These humans are paid by major social media networks, but not by volunteer networks.

    The problem is that it won’t stop and will happen again.

    Yes, this is the internet! No one has a solution to stop CSAM from happening. We aren’t discussing that. We are discussing how to handle it WHEN it happens.

You’re wrong at step 2. The posters might’ve used Tor, which basically makes it impossible to identify them. Also, in most cases LE doesn’t do shit, so the spamming won’t stop (unless someone other than LE does something against it). We can’t only rely on LE to do their job. We need better moderation tools.

    No, I’m correct about step 2, which I described as: “csam gets deleted, posters are identified, information turned over to law enforcement”

    I’ll break it down further:

    1. CSAM gets deleted from the instance. Admins and mods can do this, and they do this already.
2. Posters are identified. Admins and mods can do this, and might do this already. TO BE CLEAR, they can identify the users by IP address and user agent, that’s about it. The rest of it… is…
3. “Information turned over to law enforcement” … left up to law enforcement. “Hello police, I’m the owner of xyz.com and today a user at 23.43.23.22 posted CSAM on my site at this time. The user has been banned and we have given you all the information we have on them.” The cops can get a warrant for the ISP and go from there.
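To make step 3 concrete, here’s a rough sketch (not anything the admins actually run; the log format, file name, and account name are all made up for illustration) of pulling a banned account’s requests out of an access log so they can be handed to law enforcement:

```python
import json

# Hypothetical JSON-lines access log, one request per line, e.g.:
# {"user": "spammer123", "ip": "23.43.23.22", "user_agent": "...", "time": "...", "post_id": 42}
LOG_FILE = "access.log.jsonl"   # made-up path
BANNED_USER = "spammer123"      # made-up account name

def build_report(log_file: str, username: str) -> str:
    """Collect every request made by `username` into a plain-text summary for law enforcement."""
    lines = [f"Report for banned account '{username}':"]
    with open(log_file, encoding="utf-8") as f:
        for raw in f:
            entry = json.loads(raw)
            if entry.get("user") == username:
                lines.append(
                    f"- {entry['time']}  ip={entry['ip']}  "
                    f"agent={entry['user_agent']}  post={entry['post_id']}"
                )
    return "\n".join(lines)

if __name__ == "__main__":
    print(build_report(LOG_FILE, BANNED_USER))
```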

Oh yeah, Tor. Well, we’re getting deep off topic here, but go on youtube and see some defcon talks about how Tor users are identified. You may think you’re slick going on Tor, but then you open up facebook or check your gmail and it’s all over.

Either way, I’m not speaking to the success of catching CSAM posters, only to what the admins are likely already doing.

    Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?

    Nothing, which is why social media sites dedicate teams of mods to handle this exact thing. It’s a cat and mouse game. But not playing the game and not trying to remove this content means the admins face legal trouble.

    It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.

    This makes no sense to me. What was your point? Yes, one image is easier to delete than thousands of images. I don’t see how that plays into any of what we have been discussing though.

    newIdentity,

I don’t want to write a long text so here is the short version: these automated tools are not perfect, but they don’t have to be. They just have to be good enough to block most of it. The rest can be done through manual labor, which people have also done voluntarily on reddit. Reporting needs to get easier, and rate limiting can keep spammers from flooding the report queue.

To be clear, I don’t have anything against temporarily shutting down a community filled with CP until everything is cleared up. But we need better solutions to make moderation easier in the future, so it doesn’t need to go this far and stays more manageable.

    I’m sorry for the grammatical mistakes. I’m really tired right now and should probably go to bed.

    dipshit,

I agree with most of what you’ve written, just one small issue:

The rest can be done through manual labor, which people have also done voluntarily on reddit.

    You’re probably right that some volunteers handle this content on reddit. By this I mean, mods are volunteers and sometimes mods handle this content.

    My point however has been that big social media sites can’t rely on volunteers to handle this content. Reddit, along with facebook and other major sites (but not twitter, as elon just removed this team) has a team of people who pick up the slack where the automated tools leave off. These people are paid, and usually not well, but enough so that it’s their job to remove this content (as opposed to it being a volunteer gig they do on the side). I’ll say that again: these people are paid to look at photographs of CSAM and other psychologically damaging content all day, usually for pennies.

But we need better solutions to make moderation easier in the future, so it doesn’t need to go this far and stays more manageable.

I fully agree with you. It’s just, as a dev who has toyed around with AI and has been working on code for decades now, I don’t see a clear path forward. I am also not an expert in these tools, so I can’t speak specifically to how well they work. I can only say that they don’t work so well that humans are not required. Ideally, we want tools that work so well humans won’t be required (as it’s a psychologically damaging job), but at the same time, we don’t want legit users to be misflagged either.

The other day there was a link posted to hackerne.ws by a youtube creator who keeps needing to re-enable comments on her shorts. The youtube algorithm keeps disabling comments on her shorts because it thinks there’s a child in the video - it’s only ever been her, and while she is petite in stature, she’s also 30 years old. She’s been reaching out to youtube for over 3-4 years now and they still haven’t fixed the issue. Each video she uploads, she needs to turn on comments manually, which affects her engagement. While nowhere near comparable to the sin of CSAM, it’s also not right for a legit user to be penalized just because of the way she looks - because the algorithm cannot properly determine her age.

Youtube is a good example of how difficult it is to moderate something like this. A while ago, youtube revealed that “a year’s worth of content is uploaded every minute” (or maybe it was every hour? still)… Consider how many people would be required to watch every minute of uploaded video, multiplied by each minute in their day. Youtube requires automated tools, and community reporting, and likely also has a team of mods. And it’s still imperfect.

    So to be clear, you’re not wrong, it’s just a very difficult problem to solve.

    cubedsteaks,

    I mean, I think people are downvoting you for other reasons.

    Obviously I agree with you that CSAM is bad. It happened to me and ruined my fucking life for like all of my teen years and then most of my early 20s.

    But calling people names is pointless. Especially when it comes off like a baseless accusation.

    dipshit,

    Noted. You’ll have to excuse the fact that I don’t really care about calling people names on the internet if the content of their message promotes abuse.

    cubedsteaks,

    Yeah I get that for sure. I mean, if I knew someone was some kind of MAP idiot who was trying to fight for the rights of pedos, I’d call them names too. Idiot seems fitting for that lol

    cubedsteaks,

    I’m a victim of CSAM and my dad exploited me for several websites.

    I get being upset about this. But it’s not the end of the world for a site. Lemmy is still totally fine and I have been using it without seeing any CSAM and the only knowledge I even have of this is from posts like OP’s.

    Like this isn’t a good time to be just down on the site and pessimistic.

    utopianfiat,

I have a recurring donation to the instance, but that’s beside the point.

    dipshit,

    You’re right, it is. You may be the sole person donating, and maybe you of all people have the “right” to have your opinion “respected” for donating. My point is that by and large, the CSAM posters and most people who use this site aren’t directly paying for a service which contractually obligates them to take part in the site or service, let alone by posting CSAM.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    Fyi, admins have the protection of federal law to not be held responsible, as long as they take action when it happens.

    They have very low to zero legal risk, as long as they’re doing their job.

    IANAL, but I can read laws.

    dipshit,

    Fyi, admins have the protection of federal law to not be held responsible, as long as they take action when it happens.

    Correct, emphasis mine. As long as they take action when it happens being the key phrase here.

IANAL but from what I understand, doing something to take action (removing content, disabling communities, banning users, all of the above) shows that they are working to remove the content. This is why, previously, when having conversations with people about the topic of piracy, I mentioned DMCA takedown notices and how the companies I’ve worked at responded to those with extreme importance (sometimes the higher ups would walk over to the devs and make sure the content was deleted).

I’m annoyed at people in this thread who believe that the admins did the wrong thing because turning off communities could cause users to go to another instance - who cares, this is bigger than site engagement. I’m annoyed at people who think that the devs had access to code which could prevent this issue but chose not to implement it - this is a larger and much more difficult problem that can’t just be coded away; it usually requires humans to verify the code is working and to correct false positives and false negatives.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    You misunderstood what I meant by the part that you highlighted of my comment.

I’m speaking of Safe Harbor provisions, not having to take active DMCA actions. They’re two very different things.

    dipshit,

    Yes, and believe it or not, I’ve been discussing both with people.

I use DMCA actions because they are easily understood. People get copyright strikes. People pirate music.

    Safe Harbor provisions are not as easily understood, but basically amount to (IANAL) “if the administrator removes the offending content in a reasonable amount of time when they learn about the offending content, then we’re all good”. It’s not a safe haven for illicit content, it’s more of a “well, you didn’t know so we can’t really fault you for it” sort of deal. But when admins know about the content, they need to take action.

    systemglitch,

    It doesn’t bode well.

    1984,
    @1984@lemmy.today avatar

    Lemmy is new for all of us. I don’t see any other solution right now. You got some ideas how to handle it better?

    I think better mod tools are needed but it will take time. Doesn’t mean the platform will die, but means we may have to deal with some stuff like this.

    utopianfiat,

It’s a hard problem, but it absolutely is an existential risk. Spam is an existential risk. A platform that collapses under spam will either remain too small to be relevant or collapse from unusability. I’m sorry, but I don’t think your response completely grasps the number of forums, social media sites, wikis, etc. that have been completely crushed by spam.

    1984,
    @1984@lemmy.today avatar

    I admit I haven’t kept track of that, true.

    SamboT,

    Solution now. Better solution later.

    utopianfiat,

    That’s what I’m hoping - we can’t just burn down any community with spam

    godless,
    @godless@lemmy.world avatar

    Fucking bastards. I don’t even know what beef they have with the community and why, but using THAT method to get them to shut down is nothing short of despicable. What absolute scum.

    aceshigh,
    @aceshigh@lemmy.world avatar

Jealousy is a powerful emotion.

    MajorHavoc,

    I would not be shocked to learn this was being organized or funded by one or several of the major stockholders of current popular social media sites. (Speaking of fucking bastards.)

    iforgotmyinstance,

    Please get some legal advice, this is so fucked up.

    PugJesus,

    Genuine question: won’t they just move to spamming CSAM in other communities?

    givesomefucks,

    With how slow Lemmy moves anyways, it wouldn’t be hard to make everything “mod approved” if it’s a picture/video.
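Purely as a sketch of that idea (not how Lemmy actually works; the names here are made up), a pre-approval queue could hold back anything with media until a mod signs off:

```python
from dataclasses import dataclass

@dataclass
class Post:
    id: int
    author: str
    has_media: bool          # picture/video attached
    approved: bool = False   # text-only posts skip the queue

class ModQueue:
    """Media posts wait here until a moderator approves them."""
    def __init__(self) -> None:
        self.pending: dict[int, Post] = {}
        self.visible: dict[int, Post] = {}

    def submit(self, post: Post) -> None:
        if post.has_media:
            self.pending[post.id] = post    # held back until reviewed
        else:
            post.approved = True
            self.visible[post.id] = post    # text posts go live immediately

    def approve(self, post_id: int) -> None:
        post = self.pending.pop(post_id)
        post.approved = True
        self.visible[post_id] = post

    def reject(self, post_id: int) -> None:
        self.pending.pop(post_id, None)     # never shown publicly

queue = ModQueue()
queue.submit(Post(1, "alice", has_media=True))
queue.approve(1)                            # nothing with media is visible until this happens
```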

    Blaze,
    @Blaze@discuss.tchncs.de avatar

    This, or blocking self hosting pictures

    ButtholeSpiders,
    @ButtholeSpiders@startrek.website avatar

    Honestly, this sounds like the best start until they develop better moderation tools.

    Anonymousllama,

    This seems like the better approach. Let other sites who theoretically have image detection in place sort this out. We can just link to images hosted elsewhere

    user224,
    @user224@lemmy.sdf.org avatar

    I generally use imgur anyway because I don’t like loading my home instance with storage + bandwidth. Imgur is simply made for it.

    devious,

Yes, and only whitelist trusted image hosting services (that is, ones that have the resources to deal with any illegal material).
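A minimal sketch of that kind of check, assuming a hand-picked allowlist (the hosts below are only examples, not a recommendation):

```python
from urllib.parse import urlparse

# Example allowlist; an instance would pick hosts it actually trusts.
TRUSTED_IMAGE_HOSTS = {"imgur.com", "i.imgur.com", "catbox.moe"}

def is_allowed_image_url(url: str) -> bool:
    """Accept an image link only if it points at a trusted host (or a subdomain of one)."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == h or host.endswith("." + h) for h in TRUSTED_IMAGE_HOSTS)

print(is_allowed_image_url("https://i.imgur.com/example.png"))   # True
print(is_allowed_image_url("https://sketchy.example/img.png"))   # False
```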

    Darth_vader__,
    @Darth_vader__@discuss.online avatar

The problem is those sites can also misuse the same tools in a way that harms the privacy of their users. We shouldn’t resort to “hacks” to fix real problems, like using client-side scanning to break E2EE. One solution might be an open-sourced and community-maintained automod bot…

    assassin_aragorn,

    This seems like a really good solution for the time being.

    skullgiver,
    @skullgiver@popplesburger.hilciferous.nl avatar

How many mods do you think will want to delete pictures of child abuse all day long? Normal users won’t be affected, but the mods will just leave.

    Besides that, images and videos can be embedded into comments and DMs as well. Lemmy needs a better moderation system to fight these trolls.

    droans,

    Not-so-fun fact - the FBI has a hard limit on how long an individual agent can spend on CSAM related work. Any agent that does so is mandated to go to therapy afterwards.

    It’s not an easy task at all and does emotionally destroy you. There’s a reason why you can find dozens of different tools to automate the detection and reporting.

    skullgiver,
    @skullgiver@popplesburger.hilciferous.nl avatar

    Exactly. You can only see so many pictures of babies getting raped before you start going crazy.

    Unfortunately, there is no free and open source CSAM hash database. Companies like Microsoft provide free APIs, but the reliable services are either centralised by big tech companies or used exclusively by law enforcement. There is scanning technology that is open source, but without a database of verified hashes that technology is rather useless.
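For what it’s worth, the matching side is the easy part once a hash list exists; the verified database is what’s missing. A minimal sketch using the open-source imagehash library and a purely hypothetical local hash file:

```python
import imagehash                 # pip install imagehash pillow
from PIL import Image

# Hypothetical local file of known-bad perceptual hashes, one hex string per line.
# No open, verified database of this kind exists; that's exactly the gap described above.
HASH_LIST = "known_hashes.txt"
MAX_DISTANCE = 5                 # Hamming distance that still counts as a match (tunable)

def load_known_hashes(path: str) -> list[imagehash.ImageHash]:
    with open(path, encoding="utf-8") as f:
        return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

def is_flagged(image_path: str, known: list[imagehash.ImageHash]) -> bool:
    """Flag an upload if its perceptual hash is close to any known-bad hash."""
    h = imagehash.phash(Image.open(image_path))
    return any(h - bad <= MAX_DISTANCE for bad in known)

known = load_known_hashes(HASH_LIST)
if is_flagged("upload.jpg", known):
    print("Hold for human review and report")   # never auto-publish a match
```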

    CharlesDarwin,
    @CharlesDarwin@lemmy.world avatar

    Yep. I know someone that does related work for a living, and there are definite time limits and so on for exactly the reasons you say. This kind of stuff leaves a mark on normal people.

    SubArcticTundra,

    Or it could even just ask 50 random instance users to approve it. To escape this, >50% of accounts would have to be bots, which is unlikely.
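A quick sketch of how that could work mechanically (entirely hypothetical, just random sampling plus a majority threshold):

```python
import random

def pick_reviewers(active_users: list[str], sample_size: int = 50) -> list[str]:
    """Pick reviewers at random so an attacker can't predict who will see the post."""
    return random.sample(active_users, min(sample_size, len(active_users)))

def is_approved(votes: dict[str, bool]) -> bool:
    """Publish only if a strict majority of the sampled reviewers approve."""
    return sum(votes.values()) > len(votes) / 2

reviewers = pick_reviewers([f"user{i}" for i in range(1000)])
votes = {r: True for r in reviewers}   # stand-in for real votes
print(is_approved(votes))              # True
```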

    XylightNotAdmin,

    But then people would have to see the horrible content first

    SubArcticTundra,

    That definitely is a downside

    MargotRobbie,

    We have been fighting the CSAM (Child Sexual Assault Material) posts all day but there is nothing we can do because they will just post from another instance since we changed our registration policy.

It’s likely that we’ll be seeing a large number of instances switch to whitelist-based federation instead of the current blacklist-based one, especially for niche instances that do not want to deal with this at all (and I don’t blame them).
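Conceptually, whitelist federation just flips the default: instead of accepting activity from everyone except blocked instances, you accept only from instances you’ve explicitly listed. A sketch of that decision (not Lemmy’s actual code; the domains are just examples):

```python
from urllib.parse import urlparse

ALLOWED_INSTANCES = {"lemmy.world", "discuss.tchncs.de"}   # example allowlist
BLOCKED_INSTANCES = {"spam.example"}                       # example blocklist

def accept_activity(actor_url: str, allowlist_mode: bool) -> bool:
    """Decide whether to accept a federated activity based on the sending instance."""
    domain = (urlparse(actor_url).hostname or "").lower()
    if allowlist_mode:
        return domain in ALLOWED_INSTANCES     # everything not listed is rejected
    return domain not in BLOCKED_INSTANCES     # open by default, block known bad actors

print(accept_activity("https://lemmy.world/u/someone", allowlist_mode=True))   # True
print(accept_activity("https://random.example/u/x", allowlist_mode=True))      # False
```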

    fuzzzerd,

Basically ruins the fediverse concept, as it will ultimately lock out all small or self-hosted instances, which really is a shame.

    MargotRobbie,

I don’t think it does, since self-hosters can just federate with any number of instances they would like to interact with and not worry about the rest.

    stewie3128,

Those instances would have to actively federate with the self-hosters though. Blacklist is better than whitelist - keep the internet open.

    T156,

    Although a blacklist also wouldn’t stop someone from firing up their instance, and posting CSAM to the bigger instances that way.

    It’s a downside of the Federation model. If an instance is blacklisted, they could just shut it down, fire up a new one, and get around it that way.

    pjhenry1216,

    Whitelisting basically puts power in the hands of a few and suddenly the fediverse is capable of being bought. I’d prefer looking into other options if possible prior to essentially putting the nail in the coffin of an open fediverse.

    T156,

    That’s always the case, since people will automatically flock to the larger and more stable instances, and anyone who’s looking to buy in has the financial resources to build one of those.

    It’s part of the reason why lemmy.world and lemmy.ml got so big, since they were the largest, and most active instances, which made them the most attractive places for new users to flock to.

For example, XMPP is an open standard, but it was effectively bought and shut down by Google when they opened up their Talk service and then shut that down after a while. The service still exists, and you can fire up a new server/join one, but it’s dead for all intents and purposes.

    Lemmy isn’t any more resistant to that than XMPP was.

    pjhenry1216,

    Yeah, but notice that Google didn’t have to deal with the dark web or CSAM or ridiculous amounts of money to cover their purported tracks.

    UnfortunateShort,

    I mean, you could have some sort of application system and allow instances to pull from you by default. That way you’re “read only” until approved by other instances

    HexesofVexes,

    Sounds like the 4chan raids of old.

Batten down, report the offenders to the authorities, and then clean up the mess!

    Good job so far ^_^

    johnthebeboptist,

Yeah, for good and for bad, this place is a lot like “the old days of the internet”, but thankfully more in the good way than the bad. People keep complaining about shit I haven’t seen for a second, and there are constantly actions from the mod/admin side about shit I’ve never even seen, etc. Even without proper mod tools and shit, it seems like everything is working out quite well. For that, I and all of us owe a huge thank you to the people working on this stuff.

    Thank you people! Thank you for making this place feel like home and something greater than what we’ve had for a long ass time.

    HelloHotel,
    @HelloHotel@lemmy.world avatar

Mods, keep yourselves safe, be open with your support groups (don’t isolate), and clear/shred your cache often.

    We will get thru this together

    BoldRaven,

They can out-post us. They have nothing better to do… 😬

    bemenaker,

    I assume you’ve contacted the FBI, but if not PLEASE DO.

    emergencyfood,

    lemmy.world is based in Finland.

    bemenaker,

    Didn’t realize that. Thanks.

    raptir,

    Yes, the Finnish Bureau of Investigation

    recapitated,

    They’re the ones that can finish the job.

    Dogs_cant_look_up,

    The Finnish Bureau of Icecoldmotherfuckers

    Bluetreefrog,

    Gold!

    SubArcticTundra,

    In that case probably Europol or the Finnish police/cybersecurity agency or something?

    El_illuminacho,

It’s always fun to see Americans assume the internet is all American as well.

    Moghul,

    Threads on this issue have been posted both on Reddit and on Lemmy before, and despite Americans protesting that they don’t mean it or they don’t think that way, some people do actually say that in the comments. They assume everyone is American because “the internet is American” and “the website is American”.

    What the fuck is “the bay area”?

    ObviouslyNotBanana,
    @ObviouslyNotBanana@lemmy.world avatar

    What the fuck is “the bay area”?

    I assume that’s where the pirates are.

    despotic_machine,
    @despotic_machine@lemmy.world avatar

    And the real estate sharks!

    bemenaker,

    What the fuck is “the bay area”?

The San Francisco Bay Area, aka Silicon Valley, aka the BIRTHPLACE of computers. DARPA (US military) created the internet, but when it was opened to the public, again it was “The Bay Area” that made it usable to the world.

    It has a little bit of importance to computing.

    Moghul,

    The point wasn’t that it’s not important, it’s that it’s so generic in name as to mean nothing to many outside of the US. The reason I mentioned it is because I have had to Google which bay area Americans refer to.

    Consider me an alien confused about which moon you’re talking about when you say ‘the moon’

    bemenaker,

Fair point. Yes, Americans are egocentric in most cases. My original comment that got everyone’s panties in a bunch was because most of the tech companies and web hosting sites are in the US, or based here, giving US law enforcement jurisdiction. I didn’t know where lemmy.world was based until someone in this thread told me.

    obinice,
    @obinice@lemmy.world avatar

    Most people in the world don’t live in that one very specific country, it’s somewhat presumptuous to assume they do, and I’m a bit tired of those assumptions only ever coming from people from a very specific place.

    I put up with it for a long while without saying anything, but it’s been 15 years and they’re doing it more now than ever. I’m kinda… worn down now.

    Can’t we all just accept that the world is vast and amazing, and exists beyond our own borders? :-(

    qjkxbmwvz,

    It’s not a crazy leap of faith to assume Lemmy demographics are similar to reddit demographics, where the USA makes up the plurality by a huge margin statista.com/…/reddit-global-active-user-distribu…

    Fedizen,

    a lot of the fediverse is rooted in the EU so I’m not entirely sure the demographics are 1:1 with reddit.

    Dark_Arc,
    @Dark_Arc@social.packetloss.gg avatar

    Can’t we all just accept that the world is vast and amazing, and exists beyond our own borders? :-(

So can I count on you to make sure every comment you make on social media doesn’t reference the wrong country or culture, or generally get something wrong?

    Y’all act like Americans just get up in the morning and say “to hell with it, everywhere is 'Merica.” The person just tried to help; hell they might not even be an American and just assumed (at least) one of the site admins is.

    it’s been 15 years and they’re doing it more now than ever

The pure unnecessary drama over what’s easily a simple mistake is ridiculous. I can’t even comprehend why you care. And beyond that, why not just tell this person “well actually, it’s X they should call because Y”?

Like this good egg did … sh.itjust.works/comment/2716913

    Redredme,

    Yes. Everywhere I go Americans (and us Dutch) act like the entire world is theirs. (but I’m American! Or in case of us Dutch: ik zit hier en ik moet bier. En die bbq gaat niet uit. En techno moet hard.)

    I more or less concur with the other guy. He said it in a shitty way though.

    Problem is, there is no “world police”. ('obligatory: ’ MURICA FUCK YEAH!)

Calling the FBI, the Essex police department, Clouseau’s Sûreté or even Interpol will do jack shit in 99% of these cases because of the simple fact that they don’t have any jurisdiction. Add to that that they couldn’t care less because narcotics and gang violence are bigger problems, and you end up with reality: it will only cost time and solve nothing. Why? Because IT crimes are hard and often pricey to investigate with very little returns. (yeeeh, we arrested a 14 year old pimply faced youth, a minor, so he will be out in 2 days with a slap on the wrist)

    And finally: The FBI does not have global reach. This is a global platform.

    tsz,

What does “moet bier” or “moet hard” mean?

    Aceticon, (edited )

    In this context:

    “ik moet bier” - I need beer/I must have beer (literally “I must beer”)

    “techno moet hard” - Techno (music) must (go) hard.

I vaguely recognize this whole thing as a Dutch saying. (Not Dutch myself)

    The pleasant wordplay in that saying is because “moet” there is used with two slightly different meanings, as “having need” and as “must”.

    tsz,

    Yeah I’ve just not seen moet alone like that. I like it lol.

    Aceticon,

Well, the first form is a pretty common variant for basic human needs if you want to make it sound desperate in a funny way (for example, “ik moet voed” rather than “ik moet eten”), but the second, whilst making sense, is not common and was probably chosen for literary reasons (whilst not 100% sure, I think these might be lyrics from a Dutch song).

    Dark_Arc,
    @Dark_Arc@social.packetloss.gg avatar

    And finally: The FBI does not have global reach. This is a global platform.

    www.fbi.gov/contact-us/international-offices

    wanderingmagus,

    The FBI work hand in hand with the CIA, who can and do set up black sites all over the world for enhanced interrogation and extrajudicial detention, and work extremely closely with the Five Eyes and NATO. I’d call that global reach.

    bemenaker,

    The FBI does work with Interpol, and every national police agency the US is friendly with.

    redexplosives,

    It’s not that deep

    bemenaker,

And most of the big tech companies are based where? And most of the big hosting services are based where? So assuming that contacting the US authorities makes sense isn’t unreasonable. I don’t know where lemmy.world is doing their hosting, but since most of the big providers are based in the US, which would make it US jurisdiction, it isn’t an unreasonable assumption.

    ObviouslyNotBanana, (edited )
    @ObviouslyNotBanana@lemmy.world avatar

    They* should contact Europol.

    cactusupyourbutt,

They should contact the local authorities; that’s probably the easiest.

    ObviouslyNotBanana,
    @ObviouslyNotBanana@lemmy.world avatar

    Might be.

    newIdentity,

They probably haven’t. Also, they won’t do anything as long as you aren’t a large corporation. Even if you are, they might not do anything. See the Epic Games hack in the early 2000s.
