Comments


200fifty, to sneerclub in "As always, pedophilia is not the same as ephebophilia." - Eliezer Yudkowsky, actual quote
@200fifty@awful.systems avatar

import qualified Urbit.Ob.Ob as Ob (fein, fynd)

Why

200fifty, to sneerclub in Reply guy EY attempts incredibly convoluted offer to meet him half-way by implying AI body pillows are a vanguard threat that will lead to human extinction...
@200fifty@awful.systems avatar

Oh man, I won’t be able to unsee this, lol

200fifty, to sneerclub in this week's LW chud who is the sort of anti-wokeist who says he isn't right wing writes about human interaction from first principles. 100 upvotes.
@200fifty@awful.systems avatar

The economic incentive is coming from the popularity of stir-fry.

200fifty, to sneerclub in this week's LW chud who is the sort of anti-wokeist who says he isn't right wing writes about human interaction from first principles. 100 upvotes.
@200fifty@awful.systems avatar

I’m picturing some kind of flour-sifting Juicero-type smart device

200fifty, (edited) to sneerclub in this week's LW chud who is the sort of anti-wokeist who says he isn't right wing writes about human interaction from first principles. 100 upvotes.
@200fifty@awful.systems avatar

Feel like the very beginning of this is not completely crazy (I’ve also thought in the past that straight people often perform “attractiveness” more for the approval of their same-sex friends) but it seems to kind of jump off the evo-psych deep end after that, lol

Also you can’t build a bunch of assumptions about “we should organize society this way” while ignoring the existence of LGBT people, and then go “yeah I know I ignored them but it simplified my analysis.” Like yeah it simplifies the analysis to ignore a bunch of stuff that actually exists in reality, but… then that means maybe your conclusions about how to structure society are wrong??

edit: also this quote is choice:

I don’t know if this really happens. But even if not, the fiction does a great job of highlighting the dynamic I’m thinking of.

200fifty, to sneerclub in Who Is @BasedBeffJezos, The Leader Of The Tech Elite’s ‘E/Acc’ Movement?
@200fifty@awful.systems avatar

It’s like pickup artistry on a societal scale.

It really does illustrate the way they see culture not as, like, a beautiful evolving dynamic system that makes life worth living, but instead as a stupid game to be won or a nuisance getting in the way of their world domination efforts

200fifty, to sneerclub in loving the EA forum on how the problem with spending the charity money on a castle was the public relations
@200fifty@awful.systems avatar

The problem is just transparency, you see – if they could just show people the math that led them to determining that this would save X million more lives, then everyone would realize that it was actually a very good and sensible decision!

200fifty, (edited) to sneerclub in the effectively altruistic AI's fans are gonna pipe bomb a Planned Parenthood
@200fifty@awful.systems avatar

Is this a correct characterisation of the EA community? That they all harbour anti-abortion sentiment but for whatever reason permit abortion?

I actually wouldn’t be surprised if this were the case – the whole schtick of a lot of these people is “worrying about increasing the number of future possibly-existing humans, even at the cost of the suffering of actually-existing humans”, so being anti-abortion honestly seems not too far out of their wheelhouse?

Like I think in the EAverse you can just kinda go “well this makes people have fewer kids, which means fewer QALYs, therefore we all know it’s obviously bad and I don’t really need to justify it.” (with bonus internet contrarian points if you are justifying some terrible thing using your abstract math, because that means you’re Highly Decoupled and Very Smart.) See also the quote elsewhere in this thread about the guy defending child marriage for similar reasons.

200fifty, to sneerclub in Serious Yud or Joking Yud?
@200fifty@awful.systems avatar

I think he means script as in, literally a series of lines to say to your doctor to magically hack their brain into giving you the prescription you need (gee, I wonder how these people ever got into pickup artistry!), not a script as in prescription. I think it’s not about cost, it’s about doctors… prescribing you the wrong thing for some reason so you have to lie to them to get the correct medication? Is this some conspiracy theory I’m not aware of, lol

200fifty, to sneerclub in Yud offers more weight loss discourse
@200fifty@awful.systems avatar

Yeah, it’s definitely really hard. The hard part is not “knowing that eating less food will make you lose weight,” it’s actually doing the thing without suffering from willpower failure. But, even given that, Yudkowsky seems to be arguing here that eating fewer calories won’t make you lose weight, because such a simplistic model can’t possibly be true (analogizing it to the silly idea that eating less mass will make you lose weight.)

However, uh, his conclusion does contradict empirical reality. For most people, this would be a sign that they should reconsider their chain of logic, but I guess for him it is instead a sign that empirical reality is incorrect.

200fifty, to sneerclub in Yud offers more weight loss discourse
@200fifty@awful.systems avatar

I don’t really know enough about metabolism to say why his example is wrong, but I will say that I lost 30 lbs by counting calories, at pretty much exactly the rate the calorie counting predicted, so I’m gonna have to say his first-principles reasoning about why that’s impossible is probably wrong.
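For reference, the naive calorie-deficit arithmetic the commenter is describing can be sketched with the common ~3500 kcal-per-pound rule of thumb (an approximation; the commenter doesn't say exactly how they computed their prediction, and real metabolism adapts over time):

```python
# Rule-of-thumb energy content of one pound of body fat.
# This is an approximation, not a precise physiological constant.
KCAL_PER_LB = 3500

def predicted_loss_lbs(daily_deficit_kcal: float, days: int) -> float:
    """Predicted weight loss from a sustained daily calorie deficit."""
    return daily_deficit_kcal * days / KCAL_PER_LB

# e.g. a 500 kcal/day deficit held for 210 days:
print(predicted_loss_lbs(500, 210))  # 30.0 lbs
```

This is exactly the kind of simple linear model Yudkowsky is arguing can't be right, and exactly the one the commenter says matched their experience.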

200fifty, to sneerclub in Good Guy Orange Site refuses to believe rationalists/EAs can be as bad as we're describing and is sure we're just exaggerating
@200fifty@awful.systems avatar

it’s a shame, because gender transition stuff is probably one of the most successful “human biohacking” type things in common use today, and it’s also just… really cool. alas, bigotry

200fifty, to sneerclub in Good Guy Orange Site refuses to believe rationalists/EAs can be as bad as we're describing and is sure we're just exaggerating
@200fifty@awful.systems avatar

“We can transcend the limitations of our physical bodies via technology! Wait, no, not like that!”

200fifty, to sneerclub in Cold viruses and bitcoin mining oh noes
@200fifty@awful.systems avatar

What I don’t get is, ok, even granting the insane Eliezer assumption that LLMs can become arbitrarily smart and learn to reverse hash functions or whatever because it helps them predict the next word sometimes… humans don’t entirely understand biology ourselves! How is the LLM going to acquire the knowledge of biology to know how to do things humans can’t do when it doesn’t have access to the physical world, only things humans have written about it?

Even if it is using its godly intelligence to predict the next word, wouldn’t it only be able to predict the next word as it relates to things that have already been discovered through experiment? What’s his proposed mechanism for it to suddenly start deriving all of biology from first principles?

I guess maybe he thinks all of biology is “in” the DNA and it’s just a matter of simulating the ‘compilation’ process with enough fidelity to have a 100% accurate understanding of biology, but that just reveals how little he actually understands the field. Like, come on dude, that’s such a common tech nerd misunderstanding of biology that xkcd made fun of it, get better material

200fifty, to sneerclub in Big Yud and the Methods of Compilation
@200fifty@awful.systems avatar

I mean they’ll use an LLM instead of going to therapy too…
