AlolanYoda,

Clearly nobody related to this article has ever tried exploring stable diffusion models; all the most popular ones have an extreme bias towards young Asian women!

luthis,

I decided to give the stupid article a quick read to confirm its stupidity.

“How and whether artificial intelligence manages to solve these issues are yet to be seen.”

Definitely stupid.

How: LoRAs.

Whether: Already been solved for like, a year maybe?

Rage bait. Silly uninformed rage bait.

luthis,

There’s been huge discussion on this already: lemmy.nz/post/684888

Sorry, not sure how to ! post so it opens in your instance.

TL;DR

Any result is going to be biased. If it generated a crab wearing lederhosen, that's obviously a bias towards crabs. You can't not have a biased output, because the prompting is what controls the bias. There's no cause for concern here: the model outputs, by default, the general trend of the data it was trained on. If it had been trained on crabs, it would generate crab-like images.

You can fix bias with LoRAs and good prompting.
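For what that looks like in practice: a minimal sketch using the Hugging Face diffusers library, with a hypothetical base model ID and a hypothetical LoRA weights file standing in for a real debiasing adapter. The import is deferred into the function so the sketch reads as a sketch, not a working install.

```python
def generate_with_lora(prompt: str, lora_path: str, out_path: str = "out.png"):
    """Generate an image with a LoRA applied, steering the model away
    from its default (training-data) output distribution.

    `lora_path` is a hypothetical .safetensors adapter trained to
    counteract the bias you care about; the base model ID below is
    just an example.
    """
    # Deferred import: requires `pip install diffusers torch`.
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5"  # example base model
    )
    # Merge the low-rank adapter into the pipeline's UNet/text encoder.
    pipe.load_lora_weights(lora_path)

    image = pipe(prompt).images[0]
    image.save(out_path)
    return image
```

The prompt itself does the rest of the steering; the LoRA just shifts what the model reaches for by default.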

icepuncher69,

Honestly she looked hotter before. AI made her look like a Becky.

HardlightCereal,

AI made her look uncanny valley

pizzaiolo,

Looked better before

inspxtr,

the original look has (more) personality and soul

Couldbealeotard,

Better Off Ted vibes

Erk,

Came here for this
