Lemmyshitpost community closed until further notice

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do: since we changed our registration policy, the attackers simply post from other instances.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators, and if it wasn’t his community it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed that bit about the moderator tools. It came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember we also had to deal with people posting scat not too long ago, so this isn’t the first time we have felt helpless. Anyway, I hope we can announce something more positive soon.

CreeperODeath,

Yeah I don’t blame you for being frustrated

Iceblade02,

Wow, this is awful. Huge kudos to y’all for holding on through this. It’s obviously a deliberate attack on the fediverse by malicious actors.

solstice,

I’d like to do my part and report there are a ton of assholes on lemmy. It’s REALLY bad here and I’ve been around a while.

saltedFish,

That’s the portion of the user base that didn’t leave reddit voluntarily.

Candelestine,

Dingding.

I also think we’ve caught the attention of some haters that just don’t like hopeful people or nice things.

scarabic,

Thank you.

Darth_vader__,
@Darth_vader__@discuss.online avatar

what if we use deep-learning-based automoderators to instantly nuke these posts as they appear? For privacy and efficiency let’s make the model open source and community maintained… maybe even start a separate donation for maintaining this, and maybe even make it a public API!

SamboT,

For sure let us know when it’s done

bilal,

How will these models be trained? Surely it would need a huge amount of such data to be trained on.

query,

Yeah, that can’t be a community or open source project. Maybe the result can be open source and free to implement, but even if you can artificially generate the training data, that all has to be a strictly controlled process.

Darth_vader__,
@Darth_vader__@discuss.online avatar

maybe we can make an SFW bot and train it on general porn… most of the communities do not allow porn anyway, right?

CreeperODeath,

I agree, but it’s also something to be careful of.

Diprount_Tomato,
@Diprount_Tomato@lemmy.world avatar

Ok, how can that AI distinguish those posts in a way that doesn’t produce false detections? That’s the main issue with leaving the job of mods to an AI.

Darth_vader__,
@Darth_vader__@discuss.online avatar

instead of removing, maybe hide the post and let a mod review it? I say let it flag all pornographic posts and let the mods choose what should be allowed
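
A rough sketch of what that hook could look like, purely as illustration (classify() stands in for whatever community-maintained model gets built; the threshold and helper names are invented, not an existing Lemmy API):

    <?php
    // Sketch of "flag and hide, let a human decide". Every name here is
    // illustrative; none of this is a real Lemmy API.

    function classify(string $imagePath): float
    {
        // Stand-in for a call to an external, community-run model that
        // returns a 0..1 "likely pornographic" score.
        return 0.0;
    }

    function on_image_posted(string $postId, string $imagePath, array &$reviewQueue): void
    {
        $score = classify($imagePath);
        if ($score >= 0.8) { // arbitrary threshold, for illustration only
            // The post would be hidden from public view (not deleted) until
            // a moderator approves or removes it.
            $reviewQueue[] = ['post' => $postId, 'score' => $score];
        }
    }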

Diprount_Tomato,
@Diprount_Tomato@lemmy.world avatar

That’s great for detecting possible raids, but once they’re detected I see it as an over-bureaucratisation of the process

ricdeh,
@ricdeh@lemmy.world avatar

I think it’s a wrong assumption that a human moderator would be any more reliable than a sufficiently elaborate and well-trained machine learning system

TheWoozy,

It sounds like a lot of work for an uncertain outcome. Please volunteer to head up the project!

ZestycloseReception8,

isn’t this what 8chan is for? Seriously, what the fuck is wrong with people who think that CSAM is appropriate shitpost material? The Internet really is a fucked-up place. Is there a way to set up something to automatically remove inappropriate posts, similar to YouTube’s system for automatically removing inappropriate content?

HelloHotel,
@HelloHotel@lemmy.world avatar

It’s an attack on the fediverse

BoldRaven,

thanks, soldier 😆

ArtisinalBS,

Legally, in the US, you would have to remove it from public view, keep it stored on the server, collect identifying information about the uploader, and send all that data to the authorities - all that while maintaining a low false-positive rate.
Kind of a tall order.
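
Sketched as a handler, just to make that order of operations concrete (every path and field name is invented; in the US the actual reporting channel is NCMEC’s CyberTipline, and none of this is legal advice):

    <?php
    // Illustrative order of operations only; all paths and names are made up.
    function quarantine_and_report(string $postId, string $publicPath, array $uploader): void
    {
        // 1. Remove from public view immediately, but keep the file stored.
        $quarantined = '/var/quarantine/' . basename($publicPath);
        rename($publicPath, $quarantined);

        // 2. Preserve identifying information about the uploader.
        $report = [
            'post'     => $postId,
            'file'     => $quarantined,
            'uploader' => $uploader, // account name, IP, timestamps, etc.
            'seen_at'  => date('c'),
        ];

        // 3. Queue the package for the authorities (in the US, NCMEC's CyberTipline).
        file_put_contents("/var/quarantine/report-$postId.json", json_encode($report));
    }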

middlemanSI,

I know no details but smell Elon stink…

BoldRaven,

’tis a somewhat identifiable musk indeed… I like to say it’s that Pepé Le Pew

HexesofVexes,

Sounds like the 4chan raids of old.

Batten down, report the offenders to the authorities, and then clean up the mess!

Good job so far ^_^

johnthebeboptist,

Yeah, in good and in bad, this place is a lot like “the old days of the internet”, but thankfully more in the good way than the bad. People keep complaining about shit I haven’t seen for a second; there are constantly actions from the mod/admin side about shit I’ve never even seen, etc. Even without proper mod tools and whatnot, it seems like everything is working out quite well. For which I, and all of us, owe a huge thank you to the people working on this stuff.

Thank you people! Thank you for making this place feel like home and something greater than what we’ve had for a long ass time.

HelloHotel,
@HelloHotel@lemmy.world avatar

Mods, keep yourselves safe, be open to your support groups (don’t isolate), and clear/shred your cache often.

We will get through this together

BoldRaven,

They can out-post us. They have nothing better to do… 😬

ObviouslyNotBanana,
@ObviouslyNotBanana@lemmy.world avatar

Well that’s fucking horrible. Jesus christ on a bicycle.

ptrckstr, (edited )

I’m afraid the fediverse will need a crowdsec-like decentralized banning platform. Get banned on one platform for this shit, get banned everywhere.

I’m willing to participate in fleshing that out.

Edit: it’s just an idea, I do not have all the answers, otherwise I’d be building it.

HelloHotel,
@HelloHotel@lemmy.world avatar

Maybe FIDO for identity purposes is a good idea. Maybe some process that takes a week to calculate an identity token, plus an approval and rejection system for known tokens.

atticus88th,

[ptrck has been permanently banned from all social media]

Katana314,

What you’re basically talking about is centralization. And, as much as it has tremendous benefits of convenience, I think a lot of people here can cite their own feelings as to why that’s generally bad. It’s a hard call to make.

rbar,

They didn’t say anything about implementation. Why couldn’t you build tooling to keep it decentralized? Servers or even communities could choose to ban from their own communities based on a heuristic over the moderation actions published by other communities. At the end of the day it is still individual communities making their own decisions.

I just wouldn’t be so quick to shoot this down.

Rambi,

There is something similar to that for Minecraft servers: a website/plugin where people’s bans get added, so other admins can check usernames there to see if someone is a troll or whatever and ban them straight away before they cause issues. So it’s definitely possible to do in a decentralised way.

newIdentity,

Soo… You want to centralize a decentralized platform…

ptrckstr,

You can have a local banlist supplemented by a shared banlist containing these CSAM individuals for example.

thisisawayoflife,

That ban list could be a set of rich objects: the user that was banned, the date of the action, the community it happened in, the reason, and the server it happened at. Sysops could choose not to accept any bans from a particular site. Make things fairly granular so there’s flexibility to account for bad-actor sysops.
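
To picture it, a ban event along those lines plus a per-instance trust filter might look something like this (field names and the trust list are hypothetical):

    <?php
    // Hypothetical shape for a shared ban event, using the fields described above.
    $banEvent = [
        'user'      => '@spammer@bad.example',
        'date'      => '2023-08-28T14:00:00Z',
        'community' => 'lemmyshitpost',
        'server'    => 'lemmy.world', // where the ban action happened
        'reason'    => 'CSAM',
    ];

    // Each instance keeps its own list of origin servers whose bans it accepts,
    // which is the escape hatch against bad-actor sysops.
    function should_apply_ban(array $event, array $trustedServers): bool
    {
        return in_array($event['server'], $trustedServers, true);
    }

    var_dump(should_apply_ban($banEvent, ['lemmy.world', 'discuss.tchncs.de'])); // bool(true)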

newIdentity,

But how do you know that these people actually spread CSAM and someone isn’t abusing their power?

BradleyUffner,

There is no way that could get abused… Like say, by hosting your own instance and banning anyone you want.

ptrckstr,

Anything can be abused, but you can also build proper safeguards.

HelloHotel,
@HelloHotel@lemmy.world avatar

It might be in the works, comment

ImpossibleRubiksCube,

I think a ban list is, unfortunately, contrary to the notion of decentralization; as otherwise warranted as it is in this instance.

What about a centralized list of data on major offenders, which instances could subscribe to? Perhaps a way of calculating the probability of origin, matched with offenses made by that origin in the past? That way an instance could take it under advisement and take whatever actions suit them.

This is only the beginning of an idea, but if it is possible to collaborate without centralizing, we should explore it.

CreeperODeath,

I feel like this would be difficult to enforce

Whitehat93875,
@Whitehat93875@lemmy.world avatar

Do people think that someone has to use the same email address, or the same username? If someone uses a different email, username, and IP address (don’t try to argue semantics; it can, always could, and always has been done), then whatever you put into the list can’t be applied to them.

Even if you ask for IDs, people can fake those. That’s illegal, sure, but so is what these assholes did, and it didn’t really stop them, now did it?

Draconic_NEO,
@Draconic_NEO@lemmy.world avatar

We already have that, it’s called prison. You can’t go on the internet from prison (at least I’d assume so; it wouldn’t make much sense if people could). That’s not 100%, since people need to be caught for it to work, but once they are it certainly is.

Though other global ban solutions don’t really work well, because they require a certain level of compliance that criminals aren’t going to follow through with (i.e. not committing identity theft). They can also be abused by malicious actors to falsely ban people (especially with the whole identity theft thing).

Dramachad,
@Dramachad@lemmy.world avatar

deleted_by_moderator

    KrisND,

    Few things…

    • Style is absolutely horrible…
    • Rules are WAY worse than any lemmy I’ve seen.
    • Pretty sure visiting that site put me on a watch list.

    There are so many lemmy instances to join instead lol https://join-lemmy.org/

    Dramachad,
    @Dramachad@lemmy.world avatar

    The community is pretty good, no CSAM as far as I know

    newIdentity,

    People posting CP usually do it to harm a site or community. This also happened with reddit, and multiple communities were shut down because of it.

    Dramachad,
    @Dramachad@lemmy.world avatar

    Pretending it’s the “other” posting CP has been a longstanding fascist tactic going back to before Reddit.

    ekZepp,
    @ekZepp@lemmy.world avatar

    Any news regarding the situation? (Tue 12:59)

    HelloHotel,
    @HelloHotel@lemmy.world avatar

    Tue Aug 29 1:04:32 PM EDT 2023,

    Haven’t heard much yet; new comments are appearing rather slowly, so we may be in some approve-only mode.

    ClarkDoom,

    Never been more glad that I don’t follow shitposters or shitpost communities. Can’t believe that community is as big as it is anyway considering how annoying shitposting is.

    HelloHotel,
    @HelloHotel@lemmy.world avatar

    It was relatively innocent from what I remember, just strange memes.

    cubedsteaks,

    Seriously. I’m not surprised by this at all because shitposting welcomes 4chan and 8chan types that have all kinds of access to CSAM

    zakobjoa,
    @zakobjoa@lemmy.world avatar

    lemmyshitpost was very wholesome, I can assure you.

    cubedsteaks,

    that doesn’t really matter when it comes to attacks.

    zakobjoa,
    @zakobjoa@lemmy.world avatar

    These attacks didn’t happen on lemmyshitpost because there’s shitposts on lemmyshitpost. They happened because it was one of the largest communities. This is an attack, not some lost wackos trying to establish a CSAM sharing community.

    cubedsteaks,

    I don’t think you understood my original comment.

    zakobjoa,
    @zakobjoa@lemmy.world avatar

    No, I did, but lemmyshitpost did not welcome 4chan/8chan type posters. It was supportive, inclusive and weird. Very much unlike chan culture.

    cubedsteaks,

    I see why you’re confused because you think I was saying they were allowed when they weren’t.

    No, the term “shitpost” is enough to draw them in.

    Xeknos,
    @Xeknos@lemmy.world avatar

    There’s shitposting and then there’s mean-spirited shitposting. Lemmy’s community was the former, 4/8chan is the latter.

    the_post_of_tom_joad,

    I’m quite pleased with this development per se, since I’ve only seen posts I hate coming from there.

    newIdentity,

    You might find them annoying, I think they’re funny.

    iquanyin,
    @iquanyin@lemmy.world avatar

    same. I’ve never understood the appeal of doing it, but especially of “consuming” it.

    DarthBueller,

    Relevant: Excellent congressional legal memo explaining the current US law on CSAM (as of 2022) Link

    KrisND,

    lemmy.world isn’t in the US, at least right now.

    Imotali,
    @Imotali@lemmy.world avatar

    You think the US cares about that? Europeans still de facto have to prove they aren’t Americans, or their European banks withhold American taxes.

    The US does not and has not ever cared about international borders unless they benefit the US.

    ricdeh,
    @ricdeh@lemmy.world avatar

    That’s a super dumb claim, it’s literally fake news and entirely nonfactual.

    CharlesDarwin,
    @CharlesDarwin@lemmy.world avatar

    All I can say is that I’m sorry this is happening to you guys. Are there specific ways others can help?

    utopianfiat,

    I hope the devs take this seriously as an existential threat to the fediverse. Lemmyshitpost was one of the largest communities on the network both in AUPH and subscribers. If taking the community down is the only option here, that’s extremely insufficient and bodes death for the platform at the hands of uncontrolled spam.

    dipshit, (edited )

    I’ve just finished arguing with other lemmy users about how admins aren’t interested in taking on your legal risk. That was for the topic of piracy. CSAM is another issue entirely. Not only can lemmy users not expect to see a CSAM-friendly instance, lemmy users should expect to be deanonymized by law enforcement. Fuck around with kids and find out.

    Downvote this message if you are a pedophile, as I’m taking the stance that CSAM should not be allowed on lemmy servers.

    utopianfiat,

    Are you seriously conflating my position with arguing that CSAM should be allowed?

    dipshit,

    Are people having a difficult time reading today? It’s not just you. Maybe it’s this topic and how it intermeshes with technology. Some people seem to think that there’s a technical solution for this already (one that works as well if not better than human moderators).

    No, I don’t think you personally are advocating for CSAM to be allowed. I think commenters are getting a little uppity about missing out on their favorite community while the admins deal with content that is:

    • harmful to children
    • damaging to the admin’s psyche
    • damaging to the user’s psyche
    • against the law

    Imagine you owned an instance, and you found 100 moderators for your communities. You rest your head on the pillow and go to sleep. You wake up and find that some user has written a script to post CSAM in all your communities, because “fuck you that’s why”. You get on the line with your moderators and they tell you they’ve been battling this all night, banning people and deleting comments on sight. They tell you they’ve had to turn off a few communities and that some users are complaining.

    Your hard work of weeks and months to get this instance to a healthy place is being tested. You get an email from your hosting service saying they have reports that your site contains CSAM and that’s against their ToS - they give you a day to get it under control before they boot your server or turn it over to the police. Imagine in this case you make the drastic move to simply pull the plug - taking the entire instance offline until you can sort it through.

    Now imagine some users come in and start complaining about how you, dear admin, are killing the fediverse. Personally, I have no sympathy for those users who complain about their community or instance being taken offline while admins deal with real shit.

    utopianfiat,

    I think commenters are getting a little uppity

    What, pray tell, the fuck do you mean by this term specifically?

    dipshit,

    I’ve spent the better part of this morning explaining to people the fact that a community needs to be shut down in order for volunteers to work on cleaning it up in the time they have available.

    Commenters seem to be pretty upset that something as “drastic” as turning off a community needs to be done. Some commenters have gone so far as to say that the policy of turning off communities in response to CSAM is what will “kill the fediverse”.

    I think the normal response to this is: “Wow, this sucks. Thanks, admins, for doing your best work. I understand the community may not come back for a bit; take all the time you need!”. Yet I hear “it’s the devs’ fault for not putting in the code for blocking CSAM, and taking a community offline is unacceptable”. I call that “uppity”, but there are probably better words for it.

    cubedsteaks,

    Maybe it’s this topic and how it intermeshes with technology. Some people seem to think that there’s a technical solution for this already (one that works as well if not better than human moderators).

    So 4chan has this problem a lot, but they are also based in the US, where it’s most definitely illegal, and they IP ban people, and I think for the most part it works. It did suck though - I don’t go on there anymore, but in my last few years there, if I was on mobile I would often get hit with a region ban: so many people in that area had been banned that they just decided to block an entire IP region to prevent anyone else posting illegal content.

    maybe look into IP and region banning to prevent someone from just making new accounts.

    dipshit,

    You’re discussing how to ban people; that isn’t the problem.

    The problem is this: In the last hour, 10,000 images were uploaded. Some of those contain CSAM. Now, you have 1 hour to find all the CSAM photos (0 to 10,000 of them). In the next hour, another 10,000 images will be uploaded, some of them containing CSAM…

    Unless you have a lot of human moderators, you’re going to use automated tools and get false-positives or false-negatives.

    A site like 4chan banning whole regions isn’t a great example of handling this well. I don’t think I need to explain (but maybe I do) that one person in a region posting CSAM doesn’t mean the entire region posts CSAM. By that logic you could just block all regions by pulling the site off the internet. Not to mention, does this now mean that 4chan allows CSAM for certain regions? Yikes. “Children can be abused only in these countries.” “I’m sorry, but your country’s laws prevent images of children being abused, so this content is banned.” Yikes.

    maybe look into IP and region banning to prevent someone from just making new accounts.

    Again, the technical issue isn’t on banning. Here’s the code to ban user at IP 1.2.3.4:

    if ($_SERVER['REMOTE_ADDR'] === '1.2.3.4') { die('nope!'); }

    Here’s the code to ban a user at a specific region (pseudocode):

    $geoip = new GeoIPDB(); // stand-in for a real GeoIP library
    $region = $geoip->get_region('1.2.3.4');
    if ($region === 'USA') { die('nope!'); }

    This isn’t difficult.
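
    For reference, here’s roughly what that region check looks like with a real library, MaxMind’s geoip2/geoip2 package for PHP (the database path is illustrative); it’s barely longer:

        <?php
        require 'vendor/autoload.php'; // geoip2/geoip2, installed via Composer
        use GeoIp2\Database\Reader;

        $reader = new Reader('/usr/share/GeoIP/GeoLite2-Country.mmdb'); // path is illustrative
        $record = $reader->country($_SERVER['REMOTE_ADDR']);
        if ($record->country->isoCode === 'US') { // ISO country code, not 'USA'
            die('nope!');
        }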

    Now, for the code to DETECT CSAM:

    look for skin-tone tints (taking into account all skin tone colors), look for the quantity of skin in the image (this would make close-ups of arms possible nude detections), detect a person in the photo, determine the person’s age from the photo… don’t flag images of art or artful nudes, etc… or, you know, this is a lot of work; let’s have humans detect it instead.

    cubedsteaks,

    Region banning would prevent anyone in the area from posting. I even mentioned that I used to come across bans meant for other people. In the case of 4chan, when they region ban, it’s possible someone else will be prevented from posting.

    Not to mention, does this now mean that 4chan allows CSAM for certain regions? Yikes.

    No, it’s against their TOS entirely. It’s readable on their site, and they do enforce rules even though they also enable people to be shitty in other ways.

    Now, if you want to talk about legality in other countries - that’s a different discussion. The internet is open to the WORLD. And all I would be comfortable confirming is that it’s definitely illegal in the US where I am. I’m not gonna get into other countries where it might not be illegal. I don’t know enough about those places to be able to tell you more.

    Basically a region ban would be similar to just pulling that instance down. Blocking whatever region that person was posting from would prevent them from posting, as well as from making local accounts to try and post more.

    When I was downtown where I live and got hit by a ban that wasn’t meant for me (I was just in the banned region), I was able to appeal it. In order to appeal, you have to be good at using your words, because a person has to sit there, read the appeal, and make the decision to unban or not. Mine always went through, but I’m also capable of talking things out and smart enough to know how to properly explain myself.

    Other people didn’t get their appeals and I would see them complain about it elsewhere.

    Anyway, you don’t need to condescend to me. I’m not against what you’re saying. I agree with a lot of what you said in other comments.

    dipshit,

    I mentioned this before but I’m sorry that I didn’t see who I was responding to. I usually respond on the internet to ideas, not people. Today I’ve been responding a lot to the idea that CSAM is easy to fix and that for reasons unknown it just hasn’t been done with lemmy and the way it’s being done with lemmy isn’t “the right way”.

    GeoIP databases aren’t perfect, which is another problem entirely. It’s better than pulling the plug on the entire internet, sure, but it has its own problems.

    I was responding to the idea of gating csam content via geoip as “yikes” because I can’t find myself personally allowing CSAM in some countries, because it’s “legal” in those countries. This is a moral argument I’m making, but I am happy imposing the US law as it relates to CSAM being illegal (not US law such as FOSTA/KOSA, etc… those are a different can of worms entirely) on other countries. Or to put it another way, as an admin, if I get an email saying “actually bro in country xyz we get to abuse children”, it won’t sway me into allowing that content in that country. IF someone in that country wants to put up a site for that country, that’s their problem (and if I could intervene and prevent them from doing so, I would).

    cubedsteaks,

    Today I’ve been responding a lot to the idea that CSAM is easy to fix and that for reasons unknown it just hasn’t been done with lemmy and the way it’s being done with lemmy isn’t “the right way”.

    Right, it’s definitely not an easy fix, and Lemmy doesn’t even operate the way other sites do, but today I’m learning that these instances seem to be easily exploitable.

    The reason I mentioned region banning is because it definitely worked. There weren’t people uploading 10,000 images of CSA, because if you tried to, you’d get banned so hard that you’d ruin it for other people posting nearby.

    I was responding to the idea of gating csam content via geoip as “yikes” because I can’t find myself personally allowing CSAM in some countries, because it’s “legal” in those countries

    I agree. Honestly, if I were in charge in any way, those countries just wouldn’t be allowed access. And that does happen. I used to work for an app where we had people working in the Philippines who couldn’t access the app itself. We had to just give them info and they would feed it to the customers. That was because their country is blocked from viewing the app in the first place; they’re just straight up not allowed to use it there.

    Like I’m totally with you. Fuck MAPs, fuck all of em. If some archaic country still participates in something that is obviously harmful to people - yeah, impose these laws on them. Tell them to fuck off until they stop this shit.

    And let’s be real. It’s gonna be years before they ever stop.

    dipshit,

    The reason I mentioned region banning is because it definitely worked. There weren’t people uploading 10,000 images of CSA, because if you tried to, you’d get banned so hard that you’d ruin it for other people posting nearby.

    That’s an interesting point. I didn’t take into account that 4chan might use region banning as a way to shame other anons by removing access for their area. That’s an interesting approach, and I guess it’s something lemmy admins could add to their toolbag. Users would absolutely hate it more than a simple community ban, but whatever works, or at least helps decrease the amount of this in existence.

    cubedsteaks,

    Oh right, and it wasn’t country based. I’m in a large city and only the downtown region of my city was banned. If I went back home, I could easily get on the site and post.

    dipshit, (edited )

    Yeah, there isn’t any geographic information required when giving out IP addresses. Companies like MaxMind maintain large “GeoIP” databases that try to match IP addresses with locations, and they’re pretty good, but not 100%. VPNs make the situation worse: as it sounds like with 4chan, someone could VPN to a country they didn’t like, post CSAM, and get that country banned. It also means users can circumvent the bans with VPNs. All of this makes region banning pretty weak. Its value likely comes from users who don’t know how to use VPNs but do know what 4chan is.

    And then there was the issue of people distributing CSAM via sprays on Counter-Strike servers. Just joining a server would mean downloading those sprays onto your machine without your knowledge, just so they could be visible in game (which is more or less how the web works anyway). The point is that someone could create hundreds of sprays with CSAM, and even if they were never used, every player would have that content somewhere on their computer.

    Edit: not sure why I brought up the counterstrike thing. I think I was trying to make the point thats unfortunately sometimes this material seems to be weaponized. All around awful.

    cubedsteaks,

    Yeah, I mean, I only mentioned how they do this on 4chan with region bans because they have worked. Believe it or not, people don’t just get VPNs to spam CP there.

    They used to do that back in like what… 2005-ish? Maybe even into like 2008, or around that era? In all my years wasted on that site, no one was just spamming CP and getting away with it. It was always taken down quickly, the bans were put in place, and the spamming of illegal content stopped.

    And people there definitely use VPNs but to do things like torrent or pirate usually. They aren’t all dumb, they just have bad morals in general over there.

    I’m familiar with the CS sprays you’re talking about! This was an issue in other games too, where you could upload images as a spray to put on a wall in the game. I believe my ex and I used to spray the image from Goatse in one of the earlier Call of Duty games.

    My ex ran his own server though. If someone uploaded CP, he would just ban them.

    dragontamer,

    Again, the technical issue isn’t on banning. Here’s the code to ban user at IP 1.2.3.4:

    How does an IP Ban work when this attack came through a different, legitimate, federated Lemmy server?

    Katana314,

    I don’t think the comment above was trying to express dissatisfaction towards Lemmy’s hosts for failure to respond. They’re simply stating that the way things are all set up, much as we might like it, has serious problems - ones that may end up being considered unsolvable. As you said, we might be heading for an eventual plug pull.

    It’s like pointing out that cars produce fossil fuel exhaust. It sucks, and we’re seeing it as unsustainable, but there’s no convenient alternative yet.

    dipshit,

    Things are setup the way they are because it’s the best way that admins (not just of lemmy instances but of major sites like reddit and facebook) have found to handle these situations.

    You could take it a step further and give law enforcement their own backdoor to your site, as Facebook has done, but I would not advocate for that solution. We are in a special place in the internet where we can somewhat self-police our own content, assuming we actually self-police our own content. The way we do this is the way these admins are currently handling this.

    It may be reasonable to think that sites like reddit and facebook have it all figured out, but all they have is similar code to what lemmy has, but with a bit more money to pay some content moderators on trust and safety to actually remove this content before users get a chance to see it. The difference between those sites and lemmy is $$$ and that’s not something that’s likely to change anytime soon.

    dipshit, (edited )

    Sorry to hear about your investment in lemmy. How much did you end up investing? It just sounds like you’re very unsatisfied with the value that lemmy has provided.

    Personally, I don’t pay for lemmy. Lemmy is free, as far as I understand it. It being free, I can’t really dictate the legal risk that the admins have to take on, as I do not have power over them, and because I treat them as humans.

    But yeah, I guess if you have a good reason, they really should be falling over backwards to moderate all the CSAM away from your favorite community. You are an all-powerful being.

    Edit: Sarcasm on the internet doesn’t work well, so let me be frank: admins aren’t responsible for going to jail for a user’s desire to post CSAM. Admins have a right to shut down a community that posts CSAM, or to remove CSAM or any material they find objectionable from their site. Admins take on the legal risk of the content on the site and OWE USERS NOTHING. Y’all can “the customer is always right” all you want, but if you aren’t the one paying, you aren’t the customer and you aren’t right.

    wanderingmagus,

    I feel like you didn’t actually read their comment before posting, !dipshit

    It has nothing to do with Lemmyshitpost being their “favorite community” and they never mentioned “investing” or “value”. That’s all from you. Stop strawmanning their position. They were criticizing the ease with which entire communities can be taken down by single individuals. Additionally, it seems you are contradicting your own post from 20 minutes prior to your current comment. Perhaps you responded to the wrong comment?

    dipshit, (edited )

    I read it. Let’s read it together again.

    I hope the devs take this seriously as an existential threat to the fediverse.

    The developers who build lemmy aren’t able to put in CSAM-blocking code. That’s not how this works. I assume the commenter meant to say “admins” here, as developers write the code; they don’t admin the sites. If a developer has a lemmy instance they admin, then they are both a dev and an admin. Lemmy wasn’t built for CSAM sharing specifically; it is a site that allows sharing of CSAM as much as reddit or facebook do. The devs can’t do much about this. The admins and mods can.

    Lemmyshitpost was one of the largest communities on the network both in AUPH and subscribers.

    Neat. Irrelevant, but cool.

    If taking the community down is the only option here, that’s extremely insufficient and bodes death for the platform at the hands of uncontrolled spam.

    This I take issue with, and is what I mostly responded to.

    “If taking the community down is the only option here” - well no, it’s not. We could just get hundreds of mods to specifically address this one user’s posting of CSAM. Hey, anyone want to moderate the site? Oh right, they’ll need to be vetted, and they’ll need to keep doing this on the side for free as volunteers, since lemmy is volunteer-run…

    “that’s extremely insufficient” - hard disagree. A community is liable for the content on it. If a CSAM post goes up on a site and stays around for a few minutes, that’s one thing. If it’s left up for days and weeks, that’s quite another problem entirely. The minute an admin or mod saw CSAM material, they did the right thing by shutting that down, even if it meant downtime for users. Oh no! Users can’t read lemmyshitpost and now the world is ending.

    “and bodes death for the platform at the hands of uncontrolled spam.” Welcome to the internet, where all platforms are at risk of uncontrolled spam. At first it was just email, then bulletin boards, then message boards, then forums, and then community-moderated forums like reddit and lemmy. This has been and will continue to be a problem. This isn’t a new concern for lemmy devs or admins or mods; they are all aware this can happen, and it is why they do what they do. Turning off the community is a viable option, and it is what has happened at larger companies too while they cleaned up the mess.

    Additionally, it seems you are contradicting your own post from 20 minutes prior to your current comment. Perhaps you responded to the wrong comment?

    I’ve been very consistent in my arguments. Show me the contradiction and I’ll address it.

    TL;DR: users cannot expect to be allowed to post CSAM material on lemmy instances. Allowing CSAM material to stay up on lemmy instances constitutes a legal risk for admin owners, and thus we cannot leave it up. Blocking a community (even if it’s like the bestest and most favorited and most subscribed and everyone loves it and wow just super-duper community) is a viable means of blocking all CSAM in that community while it is cleaned up. To suggest that the community should have stayed online is asinine. To suggest that the admins should not have blocked a community to combat CSAM is asinine. Trust admins to do their jobs.

    wanderingmagus,

    They aren’t asking devs to be admins or for admins to be devs. They specifically called out the developers because code exists to filter child sexual abuse material, disseminated by organizations such as the FBI and law enforcement, which can be implemented for image uploading.

    NOBODY in this comment section is advocating for uploading fucking child sexual abuse material. That is a strawman you are setting up. Nobody is advocating for allowing the uploading of child sexual abuse material, or for the “material to be up on lemmy instances”. NOBODY is suggesting that a single instance going down is “the world is ending”. NOBODY is asking for “100’s of mods to specifically address this one user’s posting of CSAM”.

    You’re setting up a strawman argument nobody is proposing. The criticism is that, at this moment, the developers of Lemmy have not implemented a method for automatically vetting uploaded images for CSAM without requiring “100’s of mods”, which is what resulted in the condition that “taking the community down is the only option here”.

    Perhaps the wording of the original post was not precise and accurate enough for your full and complete understanding of the intent and meaning behind it. In this post, I have attempted to elucidate that intent and meaning to a degree which I hope is understandable to you.

    dipshit, (edited )

    They aren’t asking devs to be admins or for admins to be devs. They specifically called out the developers because code exists to filter child sexual abuse material, disseminated by organizations such as the FBI and law enforcement, which can be implemented for image uploading.

    Yeah? I doubt this is true, but I could be wrong. You make it sound like preventing CSAM is as simple as importing a library, which I find dubious. Companies have been trying to filter out this material in an automated fashion for decades, and yet they still have to employ humans to do it manually because automated means don’t really work. This is why companies like Reddit and Facebook have trust and safety teams to do this work.

    Edit: I googled and could not find this database. I’m thinking it’s a myth.

    NOBODY in this comment section is advocating for uploading fucking child sexual abuse material. That is a strawman you are setting up. Nobody is advocating for allowing the uploading of child sexual abuse material, or for the “material to be up on lemmy instances”. NOBODY is suggesting that a single instance going down is “the world is ending”. NOBODY is asking for “100’s of mods to specifically address this one user’s posting of CSAM”.

    Ahem, there were users who uploaded CSAM. Those are the users who were advocating for uploading CSAM, because they uploaded CSAM.

    I’m literally arguing with people who are saying that the community shouldn’t have been shut down because it’s big, and that shutting down the community (not the CSAM) poses a threat to the fediverse. Maybe, but CSAM poses a legal threat, which is much greater than the threat of low engagement.

    You’re setting up a strawman argument nobody is proposing. The criticism is that, at this moment, the developers of Lemmy have not implemented a method for automatically vetting uploaded images for CSAM without requiring “100’s of mods”, which is what resulted in the condition that “taking the community down is the only option here”.

    Yeah, that doesn’t exist, as I’ve mentioned previously. You make it sound like getting CSAM off lemmy is as simple as writing some code - if it were, why don’t facebook and reddit do this?

    Perhaps the wording of the original post was not precise and accurate enough for your full and complete understanding of the intent and meaning behind it. In this post, I have attempted to elucidate that intent and meaning to a degree which I hope is understandable to you.

    You’re not understanding how CSAM detection works or is handled.

    The grim reality is this: cameras exist, children exist, adults exist, the internet exists, and the second a crime is committed, it is not added to an FBI database. IF such an FBI database existed, and IF it was useful (and not just a database of hashes for bit-perfect copies of CSAM), and IF it were updated when evidence of the crime surfaces… IF all of those things are true, THEN it still means there’s likely a huge swath of CSAM material out there that could be posted at any time and would NOT be detected.

    Again, ask yourself: IF such a database existed, then WHY don’t reddit, twitter, and facebook use it? Hell, why doesn’t every site, or any site, use it?

    Pedophiles, instead of downvoting me, why not explain yourselves?

    wanderingmagus,

    As another commenter posted below:

    But tools do exist. PhotoDNA by Microsoft. There’s also a much more user-friendly implementation if you use Cloudflare; related links:

    As far as I am aware, every major site does use it in addition to manual vetting for any flagged “borderline” or “uncertain” results caught up in the filter.

    Blaze,
    @Blaze@discuss.tchncs.de avatar

    Db0 even created a tool for Lemmy:

    lemmy.dbzer0.com/post/2896209

    dipshit,

    As far as I am aware, every major site does use it in addition to manual vetting for any flagged “borderline” or “uncertain” results caught up in the filter.

    I think this is where you could be wrong here. I appreciate the links, I’ll look into those in more detail. My best understanding is that these tools generate so many false-positives and false-negatives that it’s not worth using them. It may be a first line of defense until real humans get to see them, but my point is that humans are still needed. When humans are included because the system isn’t 100%, it means humans do the labor and as such, with limited time, humans need to determine when they can do the labor - sometimes shutting down a community is the best way to stop the flood while they clean up the mess.
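
    To make the “first line of defense” point concrete: tools like PhotoDNA match uploads against hashes of already-known material. A deliberately oversimplified sketch of that flow, using an exact SHA-256 where the real tools use perceptual hashes (so this version would miss any resized or re-encoded copy; the blocklist is hypothetical):

        <?php
        // Oversimplified on purpose: PhotoDNA-style tools use perceptual hashes
        // that survive resizing and re-encoding; an exact hash like this does not.
        function first_line_check(string $imagePath, array $knownBadHashes): string
        {
            $hash = hash_file('sha256', $imagePath);
            if (in_array($hash, $knownBadHashes, true)) {
                return 'remove-and-report'; // a known match can be handled automatically
            }
            return 'needs-human'; // anything novel still lands on a person
        }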

    ricdeh,
    @ricdeh@lemmy.world avatar

    This is just a matter of confirmation bias from your side now. You stubbornly refuse to accept factual information very helpfully delivered to you by users who have many better things to do than respond to your inquiries, and you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question. And while you do all that, you belittle the other users in the community by referring to your supposedly superior knowledge and experience, however somehow failing to provide any data or secondary sources to back up your claims.

    dipshit,

    This is just a matter of confirmation bias from your side now. You stubbornly refuse to accept factual information very helpfully delivered to you by users who have many better things to do than respond to your inquiries, and you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question. And while you do all that, you belittle the other users in the community by referring to your supposedly superior knowledge and experience, however somehow failing to provide any data or secondary sources to back up your claims.

    Hello Richard, thanks for taking the time to present your criticism. I take some issue with it, but I believe it’s due to a misunderstanding. I’ll explain.

    This is just a matter of confirmation bias from your side now.

    I do think there is some confirmation bias at play here but I’m thinking you may be surprised where it’s coming from.

    You stubbornly refuse to accept factual information very helpfully delivered to you by users who have many better things to do than respond to your inquiries, and you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question.

    This factual information, as best as I can understand from reading the comments, is that these tools exist. I don’t deny that they exist; I didn’t deny they exist.

    users who have many better things to do than respond to your inquiries

    Unfortunately, it would seem you are wrong here, as they have responded to my inquiries. Or are you talking about the people who haven’t responded to my inquiries? Either way, I agree that there are people, whether they responded to my inquiries or not, who have better things to do. What is also true is that the people who responded to my inquiries, even if they had better things to do, still responded to my inquiries.

    you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for the use case in question.

    But, I didn’t, is the thing. In the cases where people have provided links to projects, I’ve thanked them. I’ve mentioned my skepticism and I’ve also wished them well on their projects.

    “Advanced and reliable” are marketing terms, and I don’t really care to use them, as they have no meaning. Advanced how? In that it uses neural networks? NEAT! Reliable how? In that they work 100% of the time? That they don’t generate false-positives or false-negatives? That they don’t degrade the user experience? These are questions worth asking… but let me be clear: they are questions worth asking for the sake of improving these tools and maintaining the user experience; they are not meant to discourage use of such tools. I believe admins should use all tools available to them, including turning the servers off if they need to - and that toolbelt includes AI-based tools and scripts.

    And while you do all that, you belittle the other users in the community by referring to your supposedly superior knowledge and experience, however somehow failing to provide any data or secondary sources to back up your claims.

    Richard, is it, or can I call you dick?

    Either way, Richard, I don’t claim to have superior knowledge. I honestly thought that would come across in my username. Sorry that it did not. I’m a dipshit.

    Like, an actual dipshit. I’m dumber than a lot of people on here. That doesn’t mean that I’m the dumbest, but it doesn’t mean I’m the smartest either. I’m far from being the smartest on this site, and that’s not impostor syndrome. If I sound smart it’s just because I do know a bit on some topics relating to technology and development, and because I have many interests in many topics. What I have at best is a baseline understanding, and I try to remain humble about it. I got pretty emotional in this thread, calling people who disagreed with my very much hard-to-disagree-with stances that “CSAM bad” and “admins should feel free using all tools available to remove and prevent CSAM” names that got me a 3-day ban. A ban which is now ending, allowing me to finally respond to you.

    You see, the problem here is that the people I’ve been responding to are people who are misunderstanding some things, which I’m trying to clear up. I believe this is one of the main draws of any message board system, for most people. We like to communicate and share knowledge.

    however somehow failing to provide any data or secondary sources to back up your claims.

    I don’t walk around with sources handy for everything I say, ready to cite them in every single post. I also don’t see many people doing this. I don’t lie, and I’m happy to provide links, sources, whatever - when asked for them. No one’s asked for sources for my claims that “CSAM bad” or that “admins should feel free using all the tools available to them…” or even when I say that AI tools aren’t perfect and lead to false-positives and false-negatives. But, I can give them to you if you want to see them. I don’t talk out my ass. One easy example I’ve been giving people is what’s happening and has been happening for years to this one youtuber, who constantly gets her videos flagged as having a child in them, despite her being 30 and only having herself in her videos. Her experience is valid, as there are some people who will constantly be falsely identified (which again is just stating facts, not suggesting these tools shouldn’t be used - that would suggest that these tools cannot be improved, which they can and are).

    Richard, your comment was a slam-dunk. I’m just not sure you hit the right net. Please let me know if I am misunderstanding anything.

    Yours, Dipshit.

    utopianfiat,

    The developers who build lemmy aren’t able to put in CSAM blocking code. That’s not how this works.

    They absolutely can, and every forum under the sun has tools and extensions to help with this. Fucking 4chan has code specifically dedicated to dealing with CSAM. You have no clue what you’re talking about.

    Oh no! Users can’t read lemmyshitpost and now the world is ending.

    Replace this with !technology, or !selfhosted, or !announcements. “Oh no, users can’t read the entire site” - yes, that is the definition of the end of the site.

    You’re not seeing that this isn’t a lemmyshitpost issue, it’s an “any popular community on lemmy” issue. Snarkily taking potshots at lemmyshitpost as a community doesn’t change it.

    Turning off the community is a viable option

    It’s not “not an option”, it’s the last resort. It’s like saying that your only option when you see a roach in your apartment is to burn the whole building down. Because doing it means you don’t have a community anymore, and without communities the site has no purpose.

    dipshit,

    They absolutely can, and every forum under the sun has tools and extensions to help with this. Fucking 4chan has code specifically dedicated to deal with CSAM. You have no clue what you’re talking about.

    Cool, name them and give me links then. I could not find such tools on the internet. There is software that tries to detect this, but even YouTube’s algorithm is incorrectly flagging fully clothed 30-year-old women as children.

    Links or you’re talking out your ass.

    Replace this with !technology, or !selfhosted, or !announcements. “Oh no, users can’t read the entire site” - yes, that is the definition of the end of the site.

    Replace a site with CSAM and you’ll find it’s not a site you’ll want to go to in the first place. READ the original post where it was mentioned this is a stop-gap measure.

    You’re not seeing that this isn’t a lemmyshitpost issue, it’s an “any popular community on lemmy” issue. Snarkily taking potshots at lemmyshitpost as a community doesn’t change it.

    You’re… offended that I had snark on the topic of lemmySHITPOST? surely, you are joking.

    My point is not that this community is shit and that’s why this happened.

    My point is that this is a community on a lemmy instance that was flooded with CSAM, and was shutdown because of the flood of CSAM.

    It’s not “not an option”, it’s the last resort. It’s like saying that your only option to seeing a roach in your apartment is to burn the whole building down. Because doing it means you don’t have a community anymore, and without communities the site has no purpose.

    You do see how turning a community off and then on again isn’t the same thing as burning down a house (and un-burning it again)?

    You do realize that we’re talking about a literal crime against children vs your ability to see memes? Fuck off with your self-importance.

    utopianfiat,

    Replace a site with CSAM and you’ll find it’s not a site you’ll want to go to in the first place.

    Are you being intentionally dense or do you not understand that it’s my point? If someone can flood lemmy with CSAM so easily that the only way to stop it is a site shutdown, then there are not sufficient mitigation measures in place.

    dipshit,

    Are you being intentionally dense or do you not understand that it’s my point? If someone can flood lemmy with CSAM so easily that the only way to stop it is a site shutdown, then there are not sufficient mitigation measures in place.

    Yes, this is the Internet. Take your statement and replace “lemmy” with “reddit” “facebook” “9gag” “imgur” etc… No site has “sufficient mitigation measures in place” as CSAM continues to flood the internet.

    “Flooding” a site with CSAM is a matter of opinion. If one person posted one image of CSAM on my instance, that would be flooding - that’s one image too many. It’s not like there’s some magic threshold for the amount of CSAM allowed on a site. All sites use human moderators to detect CSAM, and all sites who do this have teams that are far too small and far too underpaid for the most part.

    Underpaid being the keyword here, as lemmy admins are volunteers. I would think that the threshold for “flooding” a lemmy instance with CSAM would be far lower than that of a major for-profit site.

    cubedsteaks,

    Yes, this is the Internet. Take your statement and replace “lemmy” with “reddit” “facebook” “9gag” “imgur” etc… No site has “sufficient mitigation measures in place” as CSAM continues to flood the internet.

    It’s true; if I remember correctly, Tumblr was removed from the App Store because of CSA issues. I could be remembering wrong, and maybe it was the Google Play Store.

    dipshit,

    This is likely going to be an issue for any site that hosts NSFW content. It’s easier to have mods just ban all NSFW content than to try to figure out whether the people in the content are consenting adults.

    If the admins of NSFW instances aren’t already on high alert, they should be now.

    cubedsteaks,

    It’s easier to have mods just ban all NSFW content than to try to figure out whether the people in the content are consenting adults.

    Yeah, I fully agree with that. If people want porn, they should go to porn sites.

    cubedsteaks,

    Cool, name them and give me links then. I could not find such tools on the internet. There is software that tries to detect this, but even YouTube’s algorithm is incorrectly flagging fully clothed 30-year-old women as children.

    have you ever talked to janitors and mods on 4chan? Good luck getting any info out of them.

    dipshit,

    Do you realize that 4chan isn’t the full internet? That these programs that you already know of can exist outside of 4chan? I’m asking you - the person who knows of these apps - to provide links to back up your claims.

    cubedsteaks,

    I’m not the other person you responded to, and I never claimed to have any apps or links.

    I’m just telling you how this works on 4chan. I’m aware that’s not the entire internet, obviously - your sarcasm needs work, considering we are both here on Lemmy, i.e., not 4chan.

    If anyone on there is using these programs/apps/whatever, they’re not just gonna tell other people about them.

    And as far as I know - I haven’t been on 4chan in like 3 years now - they region ban for CSAM.

    dipshit,

    I don’t really keep track of who I’m responding to. I just respond to comments. Sorry if I got mixed up here.

    My comments still stand, though the snark isn’t directed towards you specifically.

    I just want people to understand that there isn’t a solution that we all think exists in other places and not here. The solution is largely people. People who get PTSD from viewing and moderating these images. It’s not a good solution but it’s the best solution we have so far.

    The other truth of the matter is that if the Internet itself were to hypothetically shut down, this content would just be distributed via other means. The one nice thing about the internet is that lots of stuff is traceable back to the person who posted the infringing material.

    cubedsteaks,

    I just want people to understand that there isn’t a solution that we all think exists in other places and not here

    I agree with you on this. I agree with a lot of things you’re saying.

    Part of responding means knowing who you are talking to. I know it can be easy to forget someone is behind the screen here, but there are in fact other human beings responding to you.

    Just try to be more considerate, if anything.

    newIdentity,

    Everyone got your sarcasm. We just think the Lemmyverse has no chance when it’s flooded with child porn

    dipshit, (edited )

    deleted_by_moderator

    DarthBueller,

    People are downvoting you because you’re acting like a dick.

    despotic_machine,
    @despotic_machine@lemmy.world avatar

    I don’t think they’re acting.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    If you’re a pedophile and disagree with me - instead of downvoting, why not explain yourself?

    People have been, but you’re not truly listening, Internet Warrior.

    dipshit,

    Whatever you say, kid.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    Whatever you say, kid.

    I’m over 50, but you keep doing you, Internet Warrior, as it just proves my point.

    dipshit,

    Your age does not come across in your writing. You sound like a kid, which is why I called you one.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    Your age does not come across in your writing.

    Well, lets see …

    People have been, but you’re not truly listening

    That sounds to you like a sentence a young person would say, punctuation and all?

    dipshit,

    Yes.

    It doesn’t to you? How young do you think these people are? If you’re 50, there’s 49 different ages which are younger than you. Not all of them know how to write like that, but I would say at least the ones with a high school education do.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    Yes. It doesn’t to you?

    These days especially, I see very few young people who bother with things like commas, multiple paragraphs, or using words like “truly”.

    dipshit,

    You may want to keep your day job, and not move into the online-age-detecting sector, truly.

    Sorry for mis-generationing you.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    You may want to keep your day job, and not move into the online-age-detecting sector, truly.

    Well, for the record, you’re the one who started identifying the age of somebody online by what they wrote. So I would just advise you to follow your own advice.

    Sorry for mis-generationing you.

    Wow, assuming you weren’t just being sarcastic, an actual apology. Thanks. There’s still hope for you.

    cubedsteaks,

    I agree with a lot of what you said and upvoted you but you really need to just stop calling people pedos for disagreeing with you.

    I’m a victim of CSAM myself and you can take a look through my comment history where I talked about it in depth more. I hate pedos just as much as you do but going around calling people pedos isn’t going to do anything but upset people.

    dipshit,

    I’m taking the radical stance that CSAM isn’t a good thing, that it should be reported to law enforcement, and that shutting down a site with CSAM on it is a viable option for handling CSAM material.

    I’m getting downvotes from people who disagree with me on this “radical” stance. People who disagree that CSAM is a problem, that CSAM is a concern. I don’t have a lot of sympathy for people who promote CSAM like the people who downvoted my posts. I don’t care about the loss of internet points, I care that these worthless shits are still on lemmy, so yes, I call them what they are.

    newIdentity,

    You’re completely misinterpreting everything we said. If we shut down every site with CSAM on it, the internet wouldn’t exist. We agree that CSAM is a problem. We disagree with your solution.

    dipshit,

    You’re completely misinterpreting everything we said.

    Not at all. I am completely understanding you.

    If we shut down every site with CSAM on it, the internet wouldn’t exist.

    You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You must be smarter than this.

    We agree that CSAM is a problem. We disagree with your solution.

    My solution which is to remove CSAM? My solution to turn off communities while the CSAM issue is cleaned up? What about those solutions do you disagree with?

    Another question for you: if your house is flooding due to a burst pipe, what do you do first:

    a) get all the water out of the house
    b) turn off the water coming into the house

    My solution would be to do step B followed by step A. Your solution appears to be to just do step A, which means you’ll constantly be flooded and never have enough manpower to dry out your house.

    I’d bet money that the following will happen:

    1. community gets turned off
    2. csam gets deleted, posters are identified, information turned over to law enforcement
    3. community gets turned back on.

    In the meantime, folks missing the community are free to go elsewhere on the internet. Why? Because CSAM is a crime that depicts sexual assault, and the evidence is posted online. It’s not just a matter of deleting content, it’s also a matter of turning the people posting that content over to the police so they can be held accountable for their crimes.

    newIdentity,

    You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You must be smarter than this.

    Sorry let me word this correctly: social media wouldn’t exist.

    My solution to turn off communities while the CSAM issue is cleaned up? What about those solutions do you disagree with?

    No, your solution is to permanently shut down Lemmy because there’s a possibility of CSAM being on an instance. The community it’s posted in doesn’t matter. They can just keep spamming CSAM and the mods can’t do anything about it except shutting down the instance/community, unless there are better tools to moderate. That’s basically what everyone wants: better tools and more automation so the job gets easier. It’s better to have a picture wrongly flagged and removed as CSAM than to leave up one that actually is CSAM.

    The problem is that it won’t stop and that it will happen again.

    I’d bet money that the following will happen:

    1. community gets turned off
    2. csam gets deleted, posters are identified, information turned over to law enforcement
    3. community gets turned back on.

    You’re wrong at step 2. The posters might’ve used Tor, which basically makes it impossible to identify them. Also, in most cases LE doesn’t do shit, so the spamming won’t stop (unless someone other than LE does something about it). We can’t rely on LE alone to do their job. We need better moderation tools.

    Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?

    It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.

    dipshit,

    Sorry let me word this correctly: social media wouldn’t exist.

    And this is hardly the argument you think it is. Again, it’s not true of all social media sites, but let’s steelman your argument for a moment and say you’re referring only to the major social media sites.

    Well then, we have a problem, don’t we? What’s something the major social media sites have that Lemmy doesn’t? Ad revenue, to the tune of millions of dollars. What do they do with that revenue? Well, some of it goes to pay real humans whose entire job is simply seeking out and destroying CSAM content on the site.

    So then how does Lemmy, with only enough money to pay hosting costs, if that… deal with CSAM when a user wants to create a botnet that posts CSAM to Lemmy instances all day? My answer is: the admins do whatever they think is necessary, including turning off the community for a bit. They have my full support in this.

    No, your solution is to permanently shut down Lemmy because there’s a possibility of CSAM being on an instance. The community it’s posted in doesn’t matter. They can just keep spamming CSAM and the mods can’t do anything about it except shutting down the instance/community, unless there are better tools to moderate. That’s basically what everyone wants: better tools and more automation so the job gets easier. It’s better to have a picture wrongly flagged and removed as CSAM than to leave up one that actually is CSAM.

    You’re strawmanning my argument. I never said forever; I said while the community gets cleaned up. I’ve even described a timeline below.

    The better moderation tools you want are your own eyeballs. I’ve said this before, but there have been many attempts at building automated CSAM detection tools and they just don’t work as well as needed, requiring humans to intervene. These humans are paid by the major social media networks, but not by volunteer networks.
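    To spell out what I mean by humans intervening, here’s a minimal sketch of that kind of pipeline (all names and thresholds are made up for illustration, not anything Lemmy actually ships):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Upload:
        post_id: int
        image_bytes: bytes

    def classifier_score(image: bytes) -> float:
        """Stand-in for an automated detector, returning confidence 0.0-1.0.
        Building a model that is actually reliable here is the unsolved part."""
        raise NotImplementedError

    def triage(upload: Upload, review_queue: list[Upload]) -> str:
        score = classifier_score(upload.image_bytes)
        if score >= 0.95:
            # High confidence: pull it down right away, but a human still
            # verifies, because false positives happen even up here.
            review_queue.append(upload)
            return "removed_pending_review"
        if score >= 0.50:
            # The murky middle: hide from public view, let a human decide.
            review_queue.append(upload)
            return "hidden_pending_review"
        return "published"  # low score: goes live, still reportable by users
    ```

    Notice that every branch that matters still ends in a human review queue. That queue is the part the big networks pay people to drain all day.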

    The problem is that it won’t stop and will happen again.

    Yes, this is the internet! No one has a solution to stop CSAM from happening. We aren’t discussing that. We are discussing how to handle it WHEN it happens.

    You’re wrong at step 2. The posters might’ve used Tor, which basically makes it impossible to identify them. Also, in most cases LE doesn’t do shit, so the spamming won’t stop (unless someone other than LE does something about it). We can’t rely on LE alone to do their job. We need better moderation tools.

    No, I’m correct about step 2, which I described as: “csam gets deleted, posters are identified, information turned over to law enforcement”

    I’ll break it down further:

    1. CSAM gets deleted from the instance. Admins and mods can do this, and they do this already.
    2. posters are identified. Admins and mods can do this, and might do this already. TO BE CLEAR, they can identify the users by IP address and user agent, that’s about it. The rest of it… is…
    3. “information turned over to law enforcement” … left up to law enforcement. “Hello police, I’m the owner of xyz.com and today a user at 23.43.23.22 posted CSAM on my site at this time. The user has been banned and we’ve given you all the information we have on them.” The cops can get a warrant for the ISP and go from there.
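    For what it’s worth, steps 2 and 3 mostly come down to preserving metadata before you hit delete. A rough sketch, with hypothetical field names (not real Lemmy internals):

    ```python
    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone
    import hashlib
    import json

    @dataclass
    class AbuseRecord:
        post_id: int
        uploader_ip: str   # from the server's access logs
        user_agent: str
        observed_at: str   # UTC timestamp
        sha256: str        # hash of the file, so the file itself isn't kept

    def preserve_then_delete(post_id: int, ip: str, ua: str, data: bytes) -> str:
        """Write the evidence record FIRST; only then purge the content."""
        record = AbuseRecord(
            post_id=post_id,
            uploader_ip=ip,
            user_agent=ua,
            observed_at=datetime.now(timezone.utc).isoformat(),
            sha256=hashlib.sha256(data).hexdigest(),
        )
        report = json.dumps(asdict(record), indent=2)
        # ...store `report` somewhere durable, then delete the post itself.
        return report  # this JSON is what gets handed to law enforcement
    ```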

    Oh yeah, Tor. Well, we’re getting deep off topic here, but go on YouTube and watch some DEF CON talks about how Tor users get identified. You may think you’re slick going on Tor, but then you open up Facebook or check your Gmail and it’s all over.

    Either way, I’m not speaking to the success of catching CSAM posters, I’m only speaking to what the admins are most likely already doing.

    Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?

    Nothing, which is why social media sites dedicate teams of mods to handle this exact thing. It’s a cat and mouse game. But not playing the game and not trying to remove this content means the admins face legal trouble.

    It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.

    This makes no sense to me. What was your point? Yes, one image is easier to delete than thousands of images. I don’t see how that plays into any of what we have been discussing though.

    newIdentity,

    I don’t want to write a long text, so here’s the short version: these automated tools are not perfect, but they don’t have to be. They just have to be good enough to block most of it. The rest can be done through manual labor, which people have also done voluntarily on Reddit. Reporting needs to get easier, and spammers can be slowed down by rate limiting them.
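    For example, even a dumb hash check against a shared blocklist of known-bad material would catch the reposts. A sketch, assuming a hypothetical shared blocklist (real tooling like PhotoDNA uses perceptual hashes, so crops and re-encodes still match):

    ```python
    import hashlib

    # Hypothetical shared blocklist of known-bad image hashes, synced
    # between instances the way spam/DNS blocklists are shared today.
    KNOWN_BAD_HASHES: set[str] = set()

    def is_known_bad(image: bytes) -> bool:
        # Exact matching only catches byte-identical reposts; perceptual
        # hashing would be needed to catch modified copies as well.
        return hashlib.sha256(image).hexdigest() in KNOWN_BAD_HASHES

    def on_upload(image: bytes) -> str:
        if is_known_bad(image):
            return "blocked_and_reported"  # never reaches the feed
        return "normal_flow"  # leftovers go to reports and human review
    ```

    Not perfect, like I said, but it would stop the same images being spammed over and over, and only the remainder needs manual eyes.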

    To be clear, I don’t have anything against temporarily shutting down a community filled with CP until everything is cleared up. But we need better solutions to make it easier in the future, so it doesn’t need to go this far and stays more manageable.

    I’m sorry for the grammatical mistakes. I’m really tired right now and should probably go to bed.

    dipshit,

    I agree with most of what you’ve written, just one small issue:

    The rest can be done through manual labor, which people have also done voluntarily on Reddit.

    You’re probably right that some volunteers handle this content on reddit. By this I mean, mods are volunteers and sometimes mods handle this content.

    My point, however, has been that big social media sites can’t rely on volunteers to handle this content. Reddit, along with Facebook and other major sites (but not Twitter, as Elon just removed this team), has a team of people who pick up the slack where the automated tools leave off. These people are paid, and usually not well, but enough so that it’s their job to remove this content (as opposed to it being a volunteer gig they do on the side). I’ll say that again: these people are paid to look at photographs of CSAM and other psychologically damaging content all day, usually for pennies.

    But we need better solutions to make it easier in the future, so it doesn’t need to go this far and stays more manageable.

    I fully agree with you. It’s just, as a dev who has toyed around with AI and has been working on code for decades now, I don’t see a clear path forward. I’m also not an expert in these tools, so I can’t speak specifically to how well they work. I can only say that they don’t work so well that humans are not required. Ideally, we want tools that work so well humans won’t be required (as it’s a psychologically damaging job), but at the same time, we don’t want legit users to be misflagged either. The other day there was a link posted to hackerne.ws by a YouTube creator who keeps needing to re-enable comments on her shorts. The YouTube algorithm keeps disabling comments on her shorts because it thinks there’s a child in the video - it’s only ever been her, and while she is petite in stature, she’s also 30 years old. She’s been reaching out to YouTube for 3-4 years now and they still haven’t fixed the issue. Each video she uploads, she needs to turn on comments manually, which affects her engagement. While nowhere near comparable to the sin of CSAM, it’s also not right for a legit user to be penalized just because of the way she looks - because the algorithm cannot properly determine her age.

    YouTube is a good example of how difficult it is to moderate something like this. A while ago, YouTube revealed that “a year’s worth of content is uploaded every minute” (or maybe it was every hour? still)… Consider how many people would be required to watch every minute of uploaded video, multiplied across every minute of their day. YouTube requires automated tools, and community reporting, and likely also has a team of mods. And it’s still imperfect.
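    Taking the “every minute” version of that quote at face value, the back-of-the-envelope math is absurd:

    ```python
    MINUTES_PER_YEAR = 365 * 24 * 60    # 525,600 minutes of video...
    UPLOAD_RATE = MINUTES_PER_YEAR      # ...arriving every single minute

    uploaded_per_day = UPLOAD_RATE * 24 * 60   # ~757 million minutes/day
    mod_shift = 8 * 60                         # one mod watching 8h/day

    mods_needed = uploaded_per_day / mod_shift
    print(f"{mods_needed:,.0f} full-time reviewers")   # 1,576,800
    ```

    That’s over 1.5 million people doing nothing but watching uploads in real time, before double-checking a single disagreement.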

    So to be clear, you’re not wrong, it’s just a very difficult problem to solve.

    cubedsteaks,

    I mean, I think people are downvoting you for other reasons.

    Obviously I agree with you that CSAM is bad. It happened to me and ruined my fucking life for like all of my teen years and then most of my early 20s.

    But calling people names is pointless. Especially when it comes off like a baseless accusation.

    dipshit,

    Noted. You’ll have to excuse the fact that I don’t really care about calling people names on the internet if the content of their message promotes abuse.

    cubedsteaks,

    Yeah I get that for sure. I mean, if I knew someone was some kind of MAP idiot who was trying to fight for the rights of pedos, I’d call them names too. Idiot seems fitting for that lol

    cubedsteaks,

    I’m a victim of CSAM and my dad exploited me for several websites.

    I get being upset about this. But it’s not the end of the world for a site. Lemmy is still totally fine. I’ve been using it without seeing any CSAM, and the only knowledge I even have of this is from posts like OP’s.

    Like this isn’t a good time to be just down on the site and pessimistic.

    utopianfiat,

    I have a recurring donation to the instance, but that’s beside the point.

    dipshit,

    You’re right, it is. You may be the sole person donating, and maybe you of all people have the “right” to have your opinion “respected” for donating. My point is that, by and large, the CSAM posters and most people who use this site aren’t directly paying for a service that would contractually obligate the site to them, let alone entitle them to post CSAM.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    Fyi, admins have the protection of federal law to not be held responsible, as long as they take action when it happens.

    They have very low to zero legal risk, as long as they’re doing their job.

    IANAL, but I can read laws.

    dipshit,

    Fyi, admins have the protection of federal law to not be held responsible, as long as they take action when it happens.

    Correct, emphasis mine. As long as they take action when it happens being the key phrase here.

    IANAL, but from what I understand, doing something to take action (removing content, disabling communities, banning users, all of the above) shows that they are working to remove the content. This is why, in earlier conversations about piracy, I mentioned DMCA takedown notices and how seriously the companies I’ve worked at treated them (sometimes the higher-ups would walk over to the devs and make sure the content was deleted).

    I’m annoyed at people in this thread who believe the admins did the wrong thing because turning off communities could cause users to go to another instance - who cares, this is bigger than site engagement. I’m annoyed at people who think the devs had access to code which could prevent this issue but chose not to implement it - this is a larger and much more difficult problem that can’t just be coded away; it usually involves humans verifying the code is working and correcting false positives and false negatives.

    CosmicCleric,
    @CosmicCleric@lemmy.world avatar

    You misunderstood what I meant by the part that you highlighted of my comment.

    I’m speaking of Safe Harbor provisions, not having to take active DMCA actions. They’re two very different things.

    dipshit,

    Yes, and believe it or not, I’ve been discussing both with people.

    I use DMCA actions as an example because they’re easily understood. People get copyright strikes. People pirate music.

    Safe Harbor provisions are not as easily understood, but basically amount to (IANAL): “if the administrator removes the offending content in a reasonable amount of time once they learn about it, then we’re all good”. It’s not a safe haven for illicit content; it’s more of a “well, you didn’t know, so we can’t really fault you for it” sort of deal. But once admins know about the content, they need to take action.

    systemglitch,

    It doesn’t bode well.

    1984,
    @1984@lemmy.today avatar

    Lemmy is new for all of us. I don’t see any other solution right now. You got some ideas how to handle it better?

    I think better mod tools are needed, but that will take time. It doesn’t mean the platform will die, but it does mean we may have to deal with stuff like this in the meantime.

    utopianfiat,

    It’s a hard problem, but it absolutely is an existential risk. Spam is an existential risk. A platform that collapses under spam will either remain too small to be relevant or collapse from unusability. I’m sorry, but I don’t think your response completely grasps the number of forums, social media sites, wikis, etc. that have been completely crushed by spam.

    1984,
    @1984@lemmy.today avatar

    I admit I haven’t kept track of that, true.

    SamboT,

    Solution now. Better solution later.

    utopianfiat,

    That’s what I’m hoping - we can’t just burn down every community that gets hit with spam.
