200fifty
@200fifty@awful.systems

200fifty,

Oh man, I won’t be able to unsee this, lol

200fifty,

import qualified Urbit.Ob.Ob as Ob (fein, fynd)

Why

200fifty, (edited)

Feel like the very beginning of this is not completely crazy (I’ve also thought in the past that straight people often perform “attractiveness” more for the approval of their same-sex friends) but it seems to kind of jump off the evo-psych deep end after that, lol

Also you can’t build a bunch of assumptions about “we should organize society this way” while ignoring the existence of LGBT people, and then go “yeah I know I ignored them but it simplified my analysis.” Like yeah it simplifies the analysis to ignore a bunch of stuff that actually exists in reality, but… then that means maybe your conclusions about how to structure society are wrong??

edit: also this quote is choice:

I don’t know if this really happens. But even if not, the fiction does a great job of highlighting the dynamic I’m thinking of.

200fifty,

I’m picturing some kind of flour-sifting Juicero-type smart device

200fifty,

The economic incentive is coming from the popularity of stir-fry.

200fifty,

It’s like pickup artistry on a societal scale.

It really does illustrate the way they see culture not as, like, a beautiful evolving dynamic system that makes life worth living, but instead as a stupid game to be won or a nuisance getting in the way of their world domination efforts

200fifty,

The problem is just transparency, you see – if they could just show people the math that led them to determining that this would save X million more lives, then everyone would realize that it was actually a very good and sensible decision!

200fifty, (edited)

Is this a correct characterisation of the EA community? That they all harbour anti-abortion sentiment but for whatever reason permit abortion?

I actually wouldn’t be surprised if this were the case – the whole schtick of a lot of these people is “worrying about increasing the number of future possibly-existing humans, even at the cost of the suffering of actually-existing humans”, so being anti-abortion honestly seems not too far out of their wheelhouse?

Like I think in the EAverse you can just kinda go “well this makes people have fewer kids which means fewer QALYs therefore we all know it’s obviously bad and I don’t really need to justify it.” (with bonus internet contrarian points if you are justifying some terrible thing using your abstract math, because that means you’re Highly Decoupled and Very Smart.) See also the quote elsewhere in this thread about the guy defending child marriage for similar reasons.

Serious Yud or Joking Yud? (nitter.net)

AI doctors will revolutionize medicine! You’ll go to a service hosted in Thailand that can’t take credit cards, and pay in crypto, to get a correct diagnosis. Then another VISA-blocked AI will train you in following a script that will get a human doctor to give you the right diagnosis, without tipping that doctor off that...

200fifty,

I think he means script as in, literally a series of lines to say to your doctor to magically hack their brain into giving you the prescription you need (gee, I wonder how these people ever got into pickup artistry!), not a script as in prescription. I think it’s not about cost, it’s about doctors… prescribing you the wrong thing for some reason so you have to lie to them to get the correct medication? Is this some conspiracy theory I’m not aware of, lol

200fifty,

I don’t really know enough about metabolism to say why his example is wrong, but I will say that I lost 30 lbs by counting calories, at pretty much exactly the rate that the calorie-counting predicted, so I’m gonna have to say his first-principles reasoning about why that’s impossible is probably wrong
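
For reference, the back-of-the-envelope model in question — a sketch assuming the usual ~3500 kcal-per-pound rule of thumb, with a made-up function name, not a serious physiological model:

KCAL_PER_POUND = 3500  # common rule-of-thumb energy content of a pound of body fat

def predicted_loss_lbs(daily_deficit_kcal: float, days: int) -> float:
    """Pounds lost under a naive, sustained daily calorie deficit."""
    return daily_deficit_kcal * days / KCAL_PER_POUND

# e.g. a 500 kcal/day deficit held up for 30 weeks:
print(predicted_loss_lbs(500, 30 * 7))  # -> 30.0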

200fifty,

Yeah, it’s definitely really hard. The hard part is not “knowing that eating less food will make you lose weight,” it’s actually doing the thing without suffering from willpower failure. But, even given that, Yudkowsky seems to be arguing here that eating fewer calories won’t make you lose weight, because such a simplistic model can’t possibly be true (analogizing it to the silly idea that eating less mass will make you lose weight.)

However, uh, his conclusion does contradict empirical reality. For most people, this would be a sign that they should reconsider their chain of logic, but I guess for him it is instead a sign that empirical reality is incorrect.

200fifty,

What I don’t get is, ok, even granting the insane Eliezer assumption that LLMs can become arbitrarily smart and learn to reverse hash functions or whatever because it helps them predict the next word sometimes… humans don’t entirely understand biology ourselves! How is the LLM going to acquire the knowledge of biology to know how to do things humans can’t do when it doesn’t have access to the physical world, only things humans have written about it?

Even if it is using its godly intelligence to predict the next word, wouldn’t it only be able to predict the next word as it relates to things that have already been discovered through experiment? What’s his proposed mechanism for it to suddenly start deriving all of biology from first principles?

I guess maybe he thinks all of biology is “in” the DNA and it’s just a matter of simulating the ‘compilation’ process with enough fidelity to have a 100% accurate understanding of biology, but that just reveals how little he actually understands the field. Like, come on dude, that’s such a common tech nerd misunderstanding of biology that xkcd made fun of it, get better material

200fifty,

“We can transcend the limitations of our physical bodies via technology! Wait, no, not like that!”

200fifty,

it’s a shame, because gender transition stuff is probably one of the most successful “human biohacking” type things in common use today, and it’s also just… really cool. alas, bigotry

200fifty,

yeah, my first thought was, what if you want to comment out code in this future? does that just not work anymore? lol

200fifty,

I mean they’ll use an LLM instead of going to therapy too…

200fifty,

since we both have the High IQ feat you should be agreeing with me, after all we share the same privileged access to absolute truth. That we aren’t must mean you are unaligned/need to be further cleansed of thetans.

They have to agree, it’s mathematically proven by Aumann’s Agreement Theorem!
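
(For the record, the theorem is narrower than the vibe: if two Bayesian agents start from a common prior and their posteriors for some event are common knowledge between them, then those posteriors must be equal — roughly,

common prior + common knowledge of posteriors q1, q2 ⟹ q1 = q2

and the “common prior” and “common knowledge” premises are doing all of the work.)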

200fifty,

The winning votes will become investments into the post, binding the CONTENT_EXCRECATOR to CREATE_THE_CONTENT and based on some configurable metric (post score, ad revenue etc.) the investment will accrue dividends

I’m in, but only if this part is handled by fractionalizing an NFT linking to the original post on your custom blockchain

200fifty,

🎶 everybody wants to rule the world 🎶

200fifty,

Yud’s brilliant response is that it makes no sense to describe this as trauma, because you don’t get traumatized by physics class, right?

Isn’t this literally formally fallacious? “There exist non-traumatizing true things” doesn’t imply “all true things are non-traumatizing.”

Ordinarily I’m not one to harp on logical fallacies, but come on Yudkowsky, you’re supposed to be Mr. Rational!
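
(Spelled out, the invalid step: ∃x (True(x) ∧ ¬Traumatizing(x)) ⊬ ∀x (True(x) → ¬Traumatizing(x)).)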

200fifty,

I actually personally happen to think it’s bad when people die but you do you weird lesswrong guy

200fifty,

During the interview, Kat openly admitted to not being productive but shared that she still appeared to be productive because she gets others to do work for her. She relies on volunteers who are willing to do free work for her, which is her top productivity advice.

Productivity pro tip: you can get a lot more done if you can just convince other people to do your work for you for free

200fifty,

this is long and meandering on purpose

Well, at least they admit it.

200fifty,

hero images and their consequences have been a disaster something something something

200fifty,

There’s something infuriating about this. Making basic errors that show you don’t have the faintest grasp on what people are arguing about, and then acting like the people who take the time to get Ph.Ds and don’t end up agreeing with your half-baked arguments are just too stupid to be worth listening to is outrageous.

Hey, that’s what we’ve been saying for years!

200fifty,

I don’t even get his point. You can voluntarily use your freedom to constrain yourself already. What, is the Food Optimizer gonna knock down your door and force-feed you McDonald’s? Has vegetarianism become illegal? Clearly what he’s actually mad about is that the state won’t let him involuntarily constrain others

200fifty,

The whole “autogynephilia” thing has always kind of struck me as similar to the “you gotta stay constantly vigilant because the devil is constantly trying to tempt men into having gay sex” thing. Like, yeah, if you conceptualize it as pathological, you’re gonna feel like there’s something wrong with you. But it only feels weird when you’re feeling it from the “wrong side,” so to speak.

I think this blog got posted to sneerclub before though and yeah it’s kinda too sad to make fun of. This post is a couple years old now but it looks like they’re still blogging in this vein… I hope eventually they’re able to come to terms with their true feelings.

18+ Silicon Valley’s Quest to Build God and Control Humanity (www.thenation.com)

The world we have is ugly enough, but tech capitalists desire an even uglier one. The logical conclusion of having a society run by tech capitalists interested in elite rule, eugenics, and social control is ecological ruin and a world dominated by surveillance and apartheid. A world where our technological prowess is finely...

200fifty,

be here or be sneer, i guess

i sort of depend on this community to not go insane working in tech, so i’m happy it’s continuing
