I am the journeyer from the valley of the dead Sega consoles. With the blessings of Sega Saturn, the gaming system of destruction, I am the Scout of Silence… Sailor Saturn.
"As always, pedophilia is not the same as ephebophilia." - Eliezer Yudkowsky, actual quote (www.lesswrong.com)
I somehow missed this one until now. Apparently it was once mentioned in the comments on the old sneerclub but I don’t think it got a proper post, and I think it deserves one.
The SSC subreddit ponders the difference between Bayesianism and plain old bias (old.reddit.com)
Existential Comics on rationalism and parmesan (existentialcomics.com)
Who Is @BasedBeffJezos, The Leader Of The Tech Elite’s ‘E/Acc’ Movement? (www.forbes.com)
At various points on Twitter, Jezos has defined effective accelerationism as “a memetic optimism virus,” “a meta-religion,” “a hypercognitive biohack,” “a form of spirituality,” and “not a cult.” …
Urbit use case found: community for nofap cultists "now actually working!! best possible contact method, use this pls"
read.easypeasymethod.org...
Good Guy Orange Site refuses to believe rationalists/EAs can be as bad as we're describing and is sure we're just exaggerating (news.ycombinator.com)
Big Yud and the Methods of Compilation (nitter.net)
In today’s episode, Yud tries to predict the future of computer science.
18+ our new piece on SBF gets stuck into the rationalists (davidgerard.co.uk)
Eliezer compliments Musk, Musk negs Eliezer (nitter.net)
Some light sneerclub content in these dark times....
Rationalist literary criticism by SBF, found on the birdsite (awful.systems)
original is here, but you aren’t missing any context, that’s the twit....
a scrawny nerd in a basement writes (www.lesswrong.com)
(whatever the poster looks like and wherever they live, their personality is a scrawny nerd in a basement)
18+ Do Novelists Need to Worry About AI? (In which I rant for entirely too long about Yudkowsky and his ilk.) (www.reddit.com)
TL;DR:...
that time Scott told the rationalists how adderall would turn them into geniuses of finance and even named specific rationalists he thought should get into adderall (slatestarcodex.com)
this btw is why we now see some of the TPOT rationalists microdosing street meth as a substitute. also that they’re idiots, of course....
Sequence classic: "I don’t think you could get up to 99.99% confidence for assertions like “53 is a prime number.”" (www.lesswrong.com)
that time the rationalists decided that I, personally, was why anyone called them a "cult", and not, say, anything they said or did themselves (reddragdiva.tumblr.com)
you have to read down a bit, but really, I’m apparently still the Satan figure. awesome.
I am extremely curious what the general take around here is on the Singularity
First, let me say that what broke me from the herd at lesswrong was specifically the calls for AI pauses. Somehow ‘rationalists’ are so certain advanced AI will kill everyone in the future (pDoom = 100%!) that they need to commit whatever violent acts are necessary to stop AI from being developed....
Yud goes full seed oil-ist (nitter.net)
LessWrong: Where’s the economic incentive for wokism coming from? (www.lesswrong.com)
yes really, that’s literally the title of the post. (archive copy, older archive copy) LessWrong goes full Motte....
LessWrong: video games > IQ tests (www.lesswrong.com)
Video games also have potential legal advantages over IQ tests for companies. You could argue that “we only hire people good at video games to get people who fit our corporate culture of liking video games” but that argument doesn’t work as well for IQ tests....
The Control Problem: Unsolved or Unsolvable? (www.lesswrong.com)
I will never get over how the pretty girl in the attached photo is LITERALLY the example Roko (of Basilisk fame) gave of the bad ending for humanity (awful.systems)
really: archive.ph/p0jPI...