I’m noticing that people who criticize him on that subreddit are being downvoted, while he’s being upvoted.
I wouldn’t be surprised if, as part of his prodigious self-promotion of this overlong and tendentious screed, he’s steered some of his more sympathetic followers to some of these forums.
Actually, it’s the Wikipedia subreddit thread I meant to refer to.
As a longtime listener to Tech Won’t Save Us, I was pleasantly surprised by my phone’s notification about this week’s episode. David was charming and interesting in equal measure. I mostly knew Jack Dorsey as the absentee CEO of Twitter who let the site stagnate under his watch, but there were a lot of little details about his moderation-phobia and fash-adjacency that I wasn’t aware of.
By the way, I highly recommend the podcast to the TechTakes crowd. They cover many of the same topics from a similar perspective.
For me it gives off huge Dr. Evil vibes.
If you ever get tired of searching for pics, you could always go the lazy route and fall back on AI-generated images. But then you’d have to accept that in a few years your posts would have the equivalent of a GeoCities webring stamped on them.
Trace seems a bit… emotional. You ok, Trace?
But will my insurance cover a visit to Dr. Spicy Autocomplete?
So now Steve Sailer has shown up in this essay’s comments, complaining about how Wikipedia has been unfairly stifling scientific racism.
Birds of a feather and all that, I guess.
what is the entire point of singling out Gerard for this?
He’s playing to his audience, which includes a substantial number of people with lifetime subscriptions to the Unz Review, Taki’s crapazine and Mankind Quarterly.
why it has to be quite that long
Welcome to the rationalist-sphere.
Scott Alexander, by far the most popular rationalist writer besides perhaps Yudkowsky himself, had written the most comprehensive rebuttal of neoreactionary claims on the internet.
Hey Trace, since you’re undoubtedly reading this thread, I’d like to make a plea. I know Scott Alexander Siskind is one of your personal heroes, but maybe you should consider digging up some dirt on him too. You might learn a thing or two.
Until a month ago, TW was the long-time researcher for “Blocked and Reported”, the podcast hosted by Katie ‘TERF’ Herzog and relentless sealion Jesse Singal.
She seems to do this kind of thing a lot.
According to a comment, she apparently claimed on Facebook that, due to her post, “around 75% of people changed their minds based on the evidence!”
After someone questioned how she knew it was 75%:
Update: I changed the wording of the post to now state: 𝗔𝗿𝗼𝘂𝗻𝗱 𝟳𝟓% 𝗼𝗳 𝗽𝗲𝗼𝗽𝗹𝗲 𝘂𝗽𝘃𝗼𝘁𝗲𝗱 𝘁𝗵𝗲 𝗽𝗼𝘀𝘁, 𝘄𝗵𝗶𝗰𝗵 𝗶𝘀 𝗮 𝗿𝗲𝗮𝗹𝗹𝘆 𝗴𝗼𝗼𝗱 𝘀𝗶𝗴𝗻*
And the * at the bottom says: Did some napkin math guesstimates based on the vote count and karma. Wide error bars on the actual ratio. And of course this is not proof that everybody changed their mind. There’s a lot of reasons to upvote the post or down vote it. However, I do think it’s a good indicator.
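For the curious, the “napkin math” presumably goes something like this, assuming a simple score = upvotes − downvotes model (real karma systems weight votes, so the error bars are even wider than she lets on):

```python
def upvote_ratio(total_votes: int, net_karma: int) -> float:
    """Guesstimate the share of upvotes, assuming score = upvotes - downvotes.
    Real karma systems weight votes, so treat this as a rough guess."""
    upvotes = (total_votes + net_karma) / 2
    return upvotes / total_votes

# Hypothetical numbers: 200 total votes netting a score of +100 -> 75% upvoted.
print(upvote_ratio(200, 100))  # 0.75
```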
She then goes on to talk about how she made the Facebook post private because she didn’t think it should be reposted in places where it’s not appropriate to lie and make things up.
Clown. Car.
What a bunch of monochromatic, hyper-privileged, rich-kid grifters. It’s like a nonstop frat party for rich nerds. The photographs and captions make it obvious:
The gang going for a hiking adventure with AI safety leaders. Alice/Chloe were surrounded by a mix of uplifting, ambitious entrepreneurs and a steady influx of top people in the AI safety space.
The gang doing pool yoga. Later, we did pool karaoke. Iguanas everywhere.
Alice and Kat meeting in “The Nest” in our jungle Airbnb.
Alice using her surfboard as a desk, co-working with Chloe’s boyfriend.
The gang celebrating… something. I don’t know what. We celebrated everything.
Alice and Chloe working in a hot tub. Hot tub meetings are a thing at Nonlinear. We try to have meetings in the most exciting places. Kat’s favorite: a cave waterfall.
Alice’s “desk” even comes with a beach doggo friend!
Working by the villa pool. Watch for monkeys!
Sunset dinner with friends… every day!
These are not serious people. Effective altruism in a nutshell.
My attention span is not what it used to be, and I couldn’t force myself to get to the end of this. A summary or TLDR (on the part of the original author) would have been helpful.
What is it with rationalists and their inability to write with concision? Is there a gene for bloviation that also predisposes them to the cult? Or are they all just mimicking Yud’s irritating style?
Stephen Jay Gould’s The Mismeasure of Man is always a good place to start.
This is good:
Take the sequence {1,2,3,4,x}. What should x be? Only someone who is clueless about induction would answer 5 as if it were the only answer (see Goodman’s problem in a philosophy textbook or ask your closest Fat Tony) [Note: We can also apply here Wittgenstein’s rule-following problem, which states that any of an infinite number of functions is compatible with any finite sequence. Source: Paul Bogossian]. Not only clueless, but obedient enough to want to think in a certain way.
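To make the Goodman/Wittgenstein point concrete: pick any value you like for the fifth term and there is a perfectly respectable rule, e.g. a degree-4 polynomial, that produces 1, 2, 3, 4 and then your value. A quick sketch (the target values are arbitrary):

```python
import numpy as np

# For ANY target value x there is a degree-4 polynomial f with
# f(1)=1, f(2)=2, f(3)=3, f(4)=4 and f(5)=x, so the first four
# terms don't pin down the "rule" at all.
for target in (5, 42, -17, 3.14):
    xs = np.array([1, 2, 3, 4, 5], dtype=float)
    ys = np.array([1, 2, 3, 4, target], dtype=float)
    f = np.poly1d(np.polyfit(xs, ys, deg=4))  # exact fit through all 5 points
    print(target, [round(float(f(t)), 6) for t in (1, 2, 3, 4, 5)])
```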
Also this:
If, as psychologists show, MDs and academics tend to have a higher “IQ” that is slightly informative (higher, but on a noisy average), it is largely because to get into schools you need to score on a test similar to “IQ”. The mere presence of such a filter increases the visible mean and lowers the visible variance. Probability and statistics confuse fools.
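The filter effect he’s describing is just truncation of a distribution, and a toy simulation shows it: keep only the people above a cutoff and the selected group’s mean rises while its spread shrinks (the numbers below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of noisy test scores: mean 100, sd 15.
scores = rng.normal(loc=100, scale=15, size=100_000)

# The "filter": only people above a cutoff get through.
admitted = scores[scores > 115]

print(f"population: mean={scores.mean():.1f}, sd={scores.std():.1f}")
print(f"admitted:   mean={admitted.mean():.1f}, sd={admitted.std():.1f}")
# The selected group shows a higher mean and a smaller spread purely
# because of the selection step.
```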
And:
If someone came up w/a numerical “Well Being Quotient” WBQ or “Sleep Quotient”, SQ, trying to mimic temperature or a physical quantity, you’d find it absurd. But put enough academics w/physics envy and race hatred on it and it will become an official measure.
“TempleOS on the blockchain”
Ok, that’s some quality sneer. A bit esoteric, but perfect for those who know anything about TempleOS.
Yeah, Behe’s one of the leading lights (dimmest bulbs?) of the so-called “Intelligent Design” movement: a molecular biologist who knows just enough molecular biology to construct strawman arguments about evolution. Siskind being impressed by him tells me everything I need to know about Siskind’s susceptibility to truly stupid ideas.
As anyone who’s been paying attention already knows, LLMs are merely mimics that provide the “illusion of understanding”.