If you can’t get through two short paragraphs without equating Stalinism and “social justice”, you may be a cockwomble.
Welp, time to start the thread with fresh Awful for everyone to regret:
r/phenotypes
Here’s a start:
Given the enormous environmental cost of Large Generative AI Models and their foundation upon exploited labor, justifying their use in telecommunications is an uphill task. Since their output is, in the technical sense of the term, bullshit, climbing that hill has no merit.
Man, now I’m bummed that I don’t have a cult trying to distribute translations of my Daria fic in which Jane becomes Hell Priest of the Cenobites.
I think it could be very valuable to alignment-pill these people.
Zoom and enhance!
alignment-pill
The inability to hear what their own words sound like is terminal. At this stage, we can only provide palliative care, i.e., shoving into lockers.
[Fiction] [Comic] Effective Altruism and Rationality meet at a Secular Solstice afterparty
When the very first thing you say about a character is that they “have money in crypto”, you may already be doing it wrong
“The Publisher of the Journal “Nature” Is Emailing Authors of Scientific Papers, Offering to Sell Them AI Summaries of Their Own Work”, by Maggie Harrison Dupré at Futurism:
Springer Nature, the stalwart publisher of scientific journals including the prestigious Nature as well as the nearly 200-year-old magazine Scientific American, is approaching the authors of papers in its journals with AI-generated “Media Kits” to summarize and promote their research.
In an email to journal authors obtained by Futurism, Springer told the scientists that its AI tool will “maximize the impact” of their research, saying the $49 package will return “high-quality” outputs for marketing and communication purposes. The publisher’s sell for the package hinges on the argument that boiling down complex, jargon-laden research into digestible soundbites for press releases and social media copy can be difficult and time-consuming — making it, Springer asserts, a task worth automating.
internally at Meta:
- trans and nonbinary themes stripped from Messenger
- enforcement policy now allows for the denial of trans people’s existence
- tampons removed from men’s restrooms
- DEI programs shuttered
- Kaplan briefed top conservative influencers the night before policy changes were announced
My favorite quote from flipping through LessWrong to find something passingly entertaining:
You only multiply the SAT z-score by 0.8 if you’re selecting people on high SAT score and estimating the IQ of that subpopulation, making a correction for regressional Goodhart. Rationalists are more likely selected for high g which causes both SAT and IQ
(From the comments for “The average rationalist IQ is about 122”.)
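For anyone who wants to see the arithmetic being argued over: the dispute is whether to shrink the group’s average SAT z-score by the SAT–IQ correlation (regression toward the mean) before converting it to an IQ figure. A minimal sketch below, with assumed numbers (a 0.8 correlation and a ~1.9 average SAT z-score are illustrative placeholders, not figures from the post):

```python
# Rough sketch of the quoted arithmetic, not an endorsement of it.
# Assumptions (not from the thread): SAT-IQ correlation of 0.8 and an
# average self-reported SAT z-score of about 1.9 for the surveyed group.

SAT_IQ_CORRELATION = 0.8   # claimed correlation, used as the shrinkage factor
IQ_MEAN, IQ_SD = 100, 15   # conventional IQ scale

def iq_estimate_from_sat_z(sat_z: float, selected_on_sat: bool) -> float:
    """Estimate group IQ from an average SAT z-score.

    If the group was selected for high SAT scores, regression toward the
    mean says to shrink the z-score by the SAT-IQ correlation; the quoted
    commenter argues the group was selected on g instead, so no shrinkage.
    """
    z = sat_z * SAT_IQ_CORRELATION if selected_on_sat else sat_z
    return IQ_MEAN + IQ_SD * z

print(iq_estimate_from_sat_z(1.9, selected_on_sat=True))   # ~122.8, the "corrected" figure
print(iq_estimate_from_sat_z(1.9, selected_on_sat=False))  # ~128.5, without the correction
```

With those assumed inputs the shrunk version lands near the post’s headline figure of 122, and the commenter’s preferred no-shrinkage version lands higher, which is the whole disagreement.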
Saying that Excel is not and never was a good solution for any problem feels like a rather blinkered, programmer-brained take.
Ah, here’s the post I was thinking of; I missed it somehow.
Yud is against seed oils, right? Or was that Siskind? I have a vague memory of the topic coming up but was unable to substantiate it in the 22 seconds of archive-searching that I was willing to do.
“What festivals of atonement, what sacred games shall we have to invent? Is not the greatness of this deed too great for us?”
su;dr
(saw URL; didn’t read)
You don’t even have to be well-known to get crank attention. Post anything with “quantum” in the title on the arXiv and they’ll find your e-mail.
Source: this is one of the few times when I can say “trust me, bro” and be entirely sincere about it
eyelid twitches
No worries