ChonkyOwlbear,

The blowjob section on the site is fascinating. So many pictures of women with dicks growing out of them.

neintynein,

Yup. First one genuinely startled me

cley_faye,

“Are we ready”, in the sense that for now it’s 95% garbage and 5% completely generic but passable looking stuff? Eh.

But, as this will increase in quality, the answer would be… who cares. It would suffer from the same major issues as other large models: sourcing data, and how we decide the rights of the output. As for it being porn… maybe there’s no point in focusing on that specific issue.

Harpsist,

I have been waiting for this day ever since I first heard about AI-generated images a decade or so ago.

themeatbridge,

Does it say something about society that our automatons are better at creating simulated genitals than they are at hands?

Bop,

On a visual level, we are more interested in genitals than hands? Also, faces.

lloram239,

They suck quite a lot at genitals too. But what makes hands especially tricky is simply that they are pretty damn complex. A hand has five fingers that can all move independently, the hand can rotate in all kinds of ways, and the individual parts of a hand can all occlude each other. There is a lot of stuff you have to get right to produce a good-looking hand, and it is especially difficult when you are just a simple 2D algorithm that has little idea of 3D structure or motion.

blackstampede,

They suck quite a lot at genitals

themeatbridge,

Are we still doing “phrasing”?

Bel_Shamharoth,

Said Ripley to the android Bishop.

douglasg14b,

It says that we are biologically predisposed to sex, which we are, like animals, which we are.

It doesn’t say anything about society, it just confirms the human condition.

themeatbridge,

But… why are hands so difficult?

elrik,

I’m sure it would have the same difficulty generating five properly articulating penises attached to a limb.

themeatbridge,

I’ll admit I have some difficulty with that concept myself.

teemrokit,

Ever watch the “Everything Everywhere All at Once” hotdog scenes?

vaultdweller013,

Fully Articulated Hand Penises!

joelfromaus,

Went and had a look and it’s some of the funniest stuff I’ve seen all day! A few images come close to realism but a lot of them are the sort of AI fever dream stuff that you could not make up.

DaBabyAteMaDingo,

Super hot and spicy take incoming: AI will be able to make very realistic child porn and we might actually see huge drop in sexual child abuse.

I hate to even type that sentence, btw.

pinkdrunkenelephants, (edited )

The fact that it can make CP at all is the reason why it needs to be banned outright.

EDIT: Counting 8 9 10 11 butthurt pedophiles afraid their new CP source will be banned

DaBabyAteMaDingo,

Why’s that? There are no children being hurt.

DaBabyAteMaDingo, (edited )

No one is butthurt. I have no interest in CP (thank fucking god) but if it means people get their rocks off at home without *hurting any kids then I’m all for it.

What’s interesting is you have a strong disdain for fake porn but no real argument against it other than “heeeyuck kiddy porn bad aaahheeeyuuck”. 😂

Edit: no real arguments and just downvotes? Seems like a typical facts vs feelings argument ¯\_(ツ)_/¯

pinkdrunkenelephants,
DaBabyAteMaDingo,

Did you make that for me?

I’m actually flattered! 😂

lightnsfw,

Nah, just hook it up to some predator drones and build a pedo hunting skynet.

crackajack,

It’s actually not totally different from Germany’s already existing approach. Germany treats pedophiles in their centres and allows sexual offenders to watch “child porn” but with actors who look underage. I don’t know if the approach works, but it is something I just heard about.

DaBabyAteMaDingo,

Funny how just bringing up a solution that, although uncomfortable, reduces the cases of sexual abuse against kids without any victims gets you branded as a pedo.

I just want kids to stop getting abused lol

Zikeji,

It already is being used to make CSAM. I work for a hosting provider and just the other day we closed an account because they were intentionally hosting AI generated CSAM.

pinkdrunkenelephants,

Welp that’s horrifying

DaBabyAteMaDingo,

Can I ask why AI generated media is considered CSAM if there are no victims? I don’t like furry porn but it’s not bestiality. I don’t like loli shit but it’s not CP (well, technically it is, but it’s not real kids is my point). How is it any different?

Is it gross? Obviously, but I’m biased as I don’t like kiddie shit but no one is getting hurt and if it helps reduce sexual abuse cases against kids, why wouldn’t you be in favor of it?

I don’t understand how this is unreasonable. If AI generated CP increased the stats of kids being harmed then I’d be vehemently opposed. I know it’s a touchy subject but you can’t just write it off if it works for the greater good, no?

Zikeji,

The report came from a (non-US) government agency. It wasn’t reported as AI generated, that was what we discovered.

But it highlights the reality - while AI generated content may be considered fairly obvious for now, it won’t be forever. Real CSAM could be mixed in at some point, or, hell, the characters generating it could be feeding it real CSAM to have it recreate it in a manner that makes it harder to detect.

So what does this mean for hosting providers? We continuously receive reports for a client and each time we have to review it and what, use our best judgement to decide if it’s AI generated? We add the client to a list and ignore CSAM reports for them? We have to tell the government that it’s not “real CSAM” and expect it to end there?

No legitimate hosting provider is going to knowingly host CSAM, AI generated or not. We aren’t going to invest legal resources into defending that, nor are we going to jeopardize the mental well-being of our staff by increasing the frequency of those reports.

DaBabyAteMaDingo, (edited )

But it highlights the reality - while AI generated content may be considered fairly obvious for now, it won’t be forever. Real CSAM could be mixed in at some point, or, hell, the characters generating it could be feeding it real CSAM to have it recreate it in a manner that makes it harder to detect.

Very true and I would like to look into it further. Being able to disguise real content with an AI label could make things harder for people that detect and report these types of issues.

So what does this mean for hosting providers? We continuously receive reports for a client and each time we have to review it and what, use our best judgement to decide if it’s AI generated? We add the client to a list and ignore CSAM reports for them? We have to tell the government that it’s not “real CSAM” and expect it to end there?

I don’t understand the logic behind this. If it’s your job to analyze and deduce whether certain content is or is not acceptable, why shouldn’t you make assessments on a case by case basis? Even if you remove CSAM from the equation you still have to continuously sift through content and report any and all illegal activities - regardless of its frequency.

No legitimate hosting provider is going to knowingly host CSAM, AI generated or not. We aren’t going to invest legal resources into defending that, nor are we going to jeopardize the mental well-being of our staff by increasing the frequency of those reports.

And it’s the right of any website or hosting provider to not show any content they deem unsuitable for its viewers. But this is a non sequitur - illegal activities will never stop and it’s the duty of people like you to help combat the distribution of such materials. I appreciate all the work people like you do and it’s a job I couldn’t handle. CP exists and will continue to exist. It’s just an ugly truth. I’m just asking a very uncomfortable question that will hopefully result in a very positive answer: can AI generated CP reduce the harm done to children?

Here’s a very interesting article on the potential positive effects of AI generated CP

Btw I appreciate your input in all of this. It means a lot coming from someone actually involved with this sort of thing.

Edit: and to your point, the article ends with a very real warning:

“Of course, using AI-generated images as a form of rehabilitation, alongside existing forms of therapy and treatment, is not the same as allowing its unbridled proliferation on the web.

“There’s a world of difference between the potential use of this content in controlled psychiatric settings versus what we’re describing here, which is just, anybody can access these tools to create anything that they want in any setting,” said Portnoff, from Thorn.”

Zikeji, (edited )

I don’t understand the logic behind this. If it’s your job to analyze and deduce whether certain content is or is not acceptable, why shouldn’t you make assessments on a case by case basis?

The bit about “ignoring it” was more in jest. We do review each report and handle it on a case by case basis. My point with this statement is that someone hosting questionable content is going to generate a lot of reports, regardless of whether it is illegal or not, and we won’t take an operating loss and let them keep hosting with us.

Usually we try to determine whether it was intentional or not; if someone is hosting CSAM and is quick and responsive in resolving the issue, we generally won’t immediately terminate them for it. But even if they (our client) are a victim, we are not required to host for them, and after a certain point we will terminate them.

So when we receive a complaint about a user hosting CSAM, and on review we see they are hosting a site advertising itself as a place for users to distribute AI generated CP, we aren’t going to let them continue hosting with us.

Even if you remove CSAM from the equation you still have to continuously sift through content and report any and all illegal activities - regardless of its frequency.

This is not an accurate statement, at least in the U.S. where we are based. We are not (yet) required to sift through any and all content uploaded on our servers (not to mention the complexity of such an undertaking making it virtually impossible at our level). There have been a few laws proposed that would have changed that, as we’ve seen in the news from time to time. We are required to handle reports we receive about our clients.

Keep in mind when I say we are a hosting provider, I’m referring to pretty high up the chain - we provide hosting to clients that would, say, host a Lemmy instance, or a Discord bot, or a personal NextCloud server, to name a few examples. A common dynamic is how much abuse your hosting provider is willing to put up with, and if you recall, with the CSAM attacks on Lemmy instances part of the discussion was risking getting their servers shut down.

Which is valid, hosting providers will only put up with so much risk to their infrastructure, reputation, and / or staff. Which is why people who run sites like Lemmy or image hosting services do usually want to take an active role in preventing abuse - whether or not they are legally liable won’t matter when we pull the plug because they are causing us an operating loss.

And it’s the right of any … [continued]

I’m just going to reply to the rest of your statement down here, I think I did not make my intent/purpose clear enough. I originally replied to your statement talking about AI being used to make CP in the future by providing a personal anecdote about it already happening. To which you asked a question as to why I defined AI generated CP as CSAM, and I clarified. I wasn’t actually responding to the rest of that message. I was not touching the topic or discussion of what impact it might have on the actual abuse of children, merely providing my opinion as to why, whether legal or not, hosting providers aren’t ever going to host that content.

The content will be hosted either way. Whether it is merely relegated to “offshore” providers but still accessible via normal means and not treated as criminal content, or becomes another part of the dark web, will be determined at some point in the future. It hasn’t become a huge issue yet but it is rapidly approaching that point.

hesusingthespiritbomb,

I personally can’t wait for AI to cause a thotmarket collapse.

shami,

Technically, if you’re versed enough, you can already do that, but it takes some effort.

PoliticalAgitator,

But then who will you treat like shit?

PetDinosaurs,

We’ve programmed a robot to be treated like shit.

vaultdweller013,

I also gave it clinical depression!

hesusingthespiritbomb,

The AI eGirl, but it’ll be alright because I’ll pay CeoGPT an extra $5/month for a model that’s into that shit.

PoliticalAgitator,

Gotcha. You’re fine with paying someone to pretend they’d willingly fuck you, you’re just not comfortable with the money for it going anywhere except into an old white billionaire’s pocket.

I’m sure there’s nothing to unpack there.

hesusingthespiritbomb,

Nah the old white billionaire will be replaced by AI too.

FarceMultiplier,

There’s always ‘self’.

Seudo,

AI has no fucks to give to people who are not ready.

jiggle,

PixAI is better than pornpen IMO

IHaveTwoCows,

That will never not look like thinly veiled pedophilia to me

joel1974,

deleted_by_author

    JackbyDev,

    That’s very wishful thinking, I’m afraid.

    Cool_Name,

    Reduced, maybe, but I don’t think it will impact things significantly. The types of people who participate in that do not want an alternative because they do it because they want to hurt and abuse a real person. They want to have power over another human being and make them suffer. An endless stream of porn isn’t going to fix that sickness.

    AzureRT,

    Not sure how people will be so into this shit. It’s all so generic looking

    Psythik,

    AI is still a brand new tech. It’s like getting mad at AM radio for being staticky and low quality. It’ll improve with time as we get better tech.

    Personally I can’t wait to see what the future holds for AI porn. I’m imagining being able to get exactly what you want with a single prompt, and it looks just as real as reality. No more opening 50 tabs until you find the perfect video. Sign me the fuck up.

    severien,

    Cam girls are going to lose their jobs.

    WhyDoesntThisThingWork,

    I really couldn’t care less.

    Stumblinbear,

    There will always be a market for the “real thing”

    severien,

    Sure, but a much smaller one.

    JackbyDev,

    The article mentioned that at least one OnlyFans creator reached out to make a model of their own content and also mentioned that some OnlyFans creators outsource writers to chat with fans. I don’t think this will meaningfully affect cam girls’ jobs. Once we are able to make live animated images realtime with convincing speech and phrases then probably.

    CoolCat38,

    Good

    BreakDecks,

    The actual scary use case for AI porn is that if you can get 50 or more photos of the same person’s face (almost anyone with an Instagram account), you can train your own LoRA model to generate believable images of them, which means you can now make “generic looking” porn with pretty much any person you want to see in it. Basically the modern equivalent of gluing cutouts of your crush’s face onto the Playboy centerfold, only with automated distribution over the Internet…

    lloram239,

    Using a LoRA was the old way; these days you can use Roop, FaceSwapLab or ReActor, which not only work with as little as a single good photo, they also produce better looking results than a LoRA. There is no time consuming training either: just drag & drop an image and you get results in a couple of seconds.

    pinkdrunkenelephants,

    So how will any progressive politician be able to be elected then? Because all the fascists would have to do is generate porn with their opponent’s likeness to smear them.

    Or even worse, deepfake evidence of rape.

    Or even worse than that, generate CSAM with their likeness portrayed abusing a child.

    They could use that to imprison not only their political opponents, but anyone for anything, and people would think whoever is being disappeared this week actually is a pedophile or a rapist and think nothing of it.

    Actual victims’ movements would be chopped off at the knee, because now there’s no definitive way to prove an actual rape happened since defendants could credibly claim real videos are just AI generated crap and get acquitted. No rape or abuse claims would ever be believed because there is now no way to establish objective truth.

    This would leave the fascists open to do whatever they want to anybody with no serious consequences.

    But no one cares because they want AI to do their homework for them so they don’t have to think, write, or learn to be creative on their own. They want to sit around on their asses and do nothing.

    hyperhopper,

    People will have to learn to stop believing everything they see. This has been possible with Photoshop for even more than a decade now. All that’s changed is that it takes less skill and time now.

    pinkdrunkenelephants,

    That’s not possible when AI-generated images are impossible to distinguish from reality, or even from expertly done photoshops. The practice, and generative AI as a whole, needs to be banned. They’re putting AI in Photoshop too, so ban that garbage too.

    It has to stop. We can’t allow the tech industry to enable fascism and propaganda.

    yuriy,

    This reads like satire.

    Serinus,

    You’re proposing to build a dam with duct tape.

    pinkdrunkenelephants,

    Nah, that Thanos I-am-inevitable shit doesn’t work on me. They can ban AI, you all just don’t want it because generative AI allows you to steal other people’s talent so you can pretend you have your own

    lloram239,

    You are literally trying to fight math.

    CoolCat38,

    Can’t tell whether this is bait or if you are seriously that much of a Luddite.

    pinkdrunkenelephants,

    Oh look at that, they just released pictures of you raping a 4-year-old, off to prison with you. Never mind they’re not real. That’s the world you wanted and those are the consequences you’re going to get if you don’t stop being lazy and learn to reject terrible things on ethical grounds.

    Silinde,

    Because that’s called libel and is very much illegal in practically any country on earth - and depending on the country it’s either easy or trivial to put forth and win a libel case in court, since the onus is on the defendant to prove what they said was entirely true, and “just trust me and this actress I hired, bro” doesn’t cut it.

    pinkdrunkenelephants,

    So what happens when evildoers give AI-generated deepfakes to news media so they can avoid liability?

    Silinde,

    The burden of liability will then fall on the media company, which can then be sued for not carrying out due diligence in reporting.

    Liz,

    We’re going to go back to the old model of trust, before videos and photos existed. Consistent, coherent stories from sources known to be trustworthy will be key. Physical evidence will be helpful as well.

    pinkdrunkenelephants,

    But then people will say “Well how do we know they’re not lying?” and then it’s back to square 1.

    Victims might not ever be able to get justice again if this garbage is allowed to continue. Society’s going so off-track.

    Castigant,

    How often does video evidence of rape exist, though? I don’t think this really changes anything for most victims.

    pinkdrunkenelephants,

    See Steubenville, Ohio, where the dumb motherfuckers date raped a girl and put the video on Facebook.

    People do shit like that.

    IHaveTwoCows,

    And they will respond to this fascist abuse by telling everyone to vote harder and donate money

    Pratai,

    As long as there will be simps, there will be a need for this trash.

    randon31415,

    When I first heard Stable Diffusion was going open source, I knew this would happen. The only thing I’m surprised at is that it took almost 2 years.

    lloram239,

    It went quite a bit faster than that. StableDiffusion has only been out for about 13 months and this started about three months after that with Unstable Diffusion. What this article is reporting on is already quite a few months old and quite a bit behind what you can do with a local install of StableDiffusion/Automatic1111/ControlNet/etc. (see CivitAI).

    RBWells,

    Meh. It’s all only women and so samey samey. Not sexy IMO, but I don’t think fake is necessarily not hot, art can be, certainly.

    Zerfallen,

    You can change it to men, but most of the results are intersex(?) or outright women anyway. I guess the training data is heavily weighted toward examples of women.

    funkless_eck,

    “eh I’ll take a look”

    first thing I see is a woman on her back with her legs behind her head, smooth skin where her genitals should be and nipples in the middle of her buttocks.

    “alright then”

    AdmiralShat,

    Great, more man made horrors beyond comprehension

    Geriatrickid,

    Non-Euclidean anatomy

    chemical_cutthroat,

    BBC Escher.

    1847953620,

    World’s first mandelboobs - fractal porn enthusiast community incoming

    Nihilore,

    Yeh this site has nothing on some of the insane ai creators on pixiv

    Neon,

    woman with a dinosaur and a woman without legs for me

    Flambo,

    if xkcd was right about jpeggy porn being niche, I’d bank on terrible AI porn becoming a niche in the future too.

    lightnsfw,

    Once again pornography is setting unrealistic standards for women.

    sock,

    i prefer the pregnant woman with huge boobs instead of a pregnant stomach (and also less huge boobs where they normally are)
