Feel like the very beginning of this is not completely crazy (I’ve also thought in the past that straight people often perform “attractiveness” more for the approval of their same-sex friends) but it seems to kind of jump off the evo-psych deep end after that, lol
Also you can’t build a bunch of assumptions about “we should organize society this way” while ignoring the existence of LGBT people, and then go “yeah I know I ignored them but it simplified my analysis.” Like yeah it simplifies the analysis to ignore a bunch of stuff that actually exists in reality, but… then that means maybe your conclusions about how to structure society are wrong??
edit: also this quote is choice:
I don’t know if this really happens. But even if not, the fiction does a great job of highlighting the dynamic I’m thinking of.
It really does illustrate the way they see culture not as, like, a beautiful evolving dynamic system that makes life worth living, but instead as a stupid game to be won or a nuisance getting in the way of their world domination efforts
The problem is just transparency, you see – if they could just show people the math that led them to determining that this would save X million more lives, then everyone would realize that it was actually a very good and sensible decision!
Is this a correct characterisation of the EA community? That they all harbour anti-abortion sentiment but for whatever reason permit abortion?
I actually wouldn’t be surprised if this were the case – the whole schtick of a lot of these people is “worrying about increasing the number of future possibly-existing humans, even at the cost of the suffering of actually-existing humans”, so being anti-abortion honestly seems not too far out of their wheelhouse?
Like I think in the EAverse you can just kinda go “well this makes people have fewer kids which means fewer QALYs therefore we all know it’s obviously bad and I don’t really need to justify it.” (with bonus internet contrarian points if you are justifying some terrible thing using your abstract math, because that means you’re Highly Decoupled and Very Smart.) See also the quote elsewhere in this thread about the guy defending child marriage for similar reasons.
I think he means script as in, literally a series of lines to say to your doctor to magically hack their brain into giving you the prescription you need (gee, I wonder how these people ever got into pickup artistry!), not a script as in prescription. I think it’s not about cost, it’s about doctors… prescribing you the wrong thing for some reason so you have to lie to them to get the correct medication? Is this some conspiracy theory I’m not aware of, lol
Yeah, it’s definitely really hard. The hard part is not “knowing that eating less food will make you lose weight,” it’s actually doing the thing without suffering from willpower failure. But, even given that, Yudkowsky seems to be arguing here that eating fewer calories won’t make you lose weight, because such a simplistic model can’t possibly be true (analogizing it to the silly idea that eating less mass will make you lose weight.)
However, uh, his conclusion does contradict empirical reality. For most people, this would be a sign that they should reconsider their chain of logic, but I guess for him it is instead a sign that empirical reality is incorrect.
I don’t really know enough about metabolism to say why his example is wrong, but I will say that I lost 30 lbs by counting calories, at pretty much exactly the rate that the calorie counting predicted, so I’m gonna have to say his first-principles reasoning about why that’s impossible is probably wrong.
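For what it’s worth, the “simplistic model” he’s dunking on really is just arithmetic. Here’s a quick sketch using the common ~3500 kcal per pound rule of thumb (to be clear, that constant is a rough population-level approximation, not gospel, and the deficit numbers below are made-up examples):

```python
KCAL_PER_POUND = 3500  # rough rule of thumb, not a precise physiological constant

def days_to_lose(pounds: float, daily_deficit_kcal: float) -> float:
    """Naive estimate: days needed to lose `pounds` at a constant daily calorie deficit."""
    return pounds * KCAL_PER_POUND / daily_deficit_kcal

# e.g. 30 lbs at a hypothetical 500 kcal/day deficit:
print(days_to_lose(30, 500))  # 210 days, i.e. about 7 months
```

Obviously real metabolisms adapt and the deficit isn’t constant, but in my experience the naive version tracked my actual loss closely enough that “this model can’t possibly be true” is a stretch.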
it’s a shame, because gender transition stuff is probably one of the most successful “human biohacking” type things in common use today, and it’s also just… really cool. alas, bigotry
What I don’t get is, ok, even granting the insane Eliezer assumption that LLMs can become arbitrarily smart and learn to reverse hash functions or whatever because it helps them predict the next word sometimes… humans don’t entirely understand biology ourselves! How is the LLM going to acquire the knowledge of biology to know how to do things humans can’t do when it doesn’t have access to the physical world, only things humans have written about it?
Even if it is using its godly intelligence to predict the next word, wouldn’t it only be able to predict the next word as it relates to things that have already been discovered through experiment? What’s his proposed mechanism for it to suddenly start deriving all of biology from first principles?
I guess maybe he thinks all of biology is “in” the DNA and it’s just a matter of simulating the ‘compilation’ process with enough fidelity to have a 100% accurate understanding of biology, but that just reveals how little he actually understands the field. Like, come on dude, that’s such a common tech nerd misunderstanding of biology that xkcd made fun of it, get better material