CinnasVerses

joined 4 months ago
[–] CinnasVerses@awful.systems 6 points 11 hours ago* (last edited 11 hours ago)

how is Tesla stock a thing?

Edward Niedermeyer wrote a book to answer that very question (cheating on taxes + organizing gangs of invested fanboys to suppress negative news online)

Stock markets in the rest of the developed world seem less bubbly than the US market.

It looks like this site requires https:// or http:// to recognize a link as an external link; otherwise it prepends awful.systems/ and treats it as an internal link

[–] CinnasVerses@awful.systems 11 points 1 day ago* (last edited 1 day ago) (9 children)

"U" for "you" was when I became confident who "Nina" was. The blogger feels like yet another person who is caught up in intersecting subcultures of bad people but can't make herself leave. She takes a lot of deep lore like "what is Hereticon?" for granted and is still into crypto.

She links someone called Sonia Joseph who mentions "the consensual non-consensual (cnc) sex parties and heavy LSD use of some elite AI researchers ... leads (sic) to some of the most coercive and fucked up social dynamics that I have ever seen." Joseph says she is Canadian but worked in the Bay Area tech scene. Cursed phrase: agi cnc sex parties

I have never heard of a wing of these people in Canada. There are a few Effective Altruists in Toronto but I don't know if they are the LessWrong kind or the bednet kind. I thought this was basically a US and Oxford scene (plus Jaan Tallinn).

The Substack and a Rationalist web magazine are both called Asterisk.

[–] CinnasVerses@awful.systems 29 points 3 days ago (1 children)

There is an old principle in software development not to make the GUI too pretty until the back end works, because managers and customers will think it's ready when they can click around buttons with nice shading and animations. I think slopware is like that: people see a demo that appears to work and don't see what maintaining it and integrating it with other systems is like.

[–] CinnasVerses@awful.systems 5 points 4 days ago* (last edited 4 days ago)

Yud has posted:

Among my friends who came to my attention for psychelics[sic] having had any significant impact on them, positive or negative, I would say the mean result has been overwhelmingly, heartbreakingly negative. Please seriously consider not doing drugs.

He seems very worried about his tendency to procrastinate (akrasia in LessWrong jargon) and his inability to make himself exercise, so self-control is a live concern for him. His public position is that LSD might have medical uses, but it has harmed people close to him.

[–] CinnasVerses@awful.systems 7 points 4 days ago (3 children)

AFAIK, the people in this space who have acknowledged using LSD and other psychedelics are gwern, Aella, and QiaochuYuan (during his rationalist phase). Scott Alexander hinted that he might have tried it; Eliezer Yudkowsky is interested in psychedelic therapy but also tells readers to please not use LSD. Can any of you name anyone else in this space who has talked about dropping acid?

I don't want to get into "A says that B dropped acid" in a StubSack thread.

The 2016 Nootropics Survey results and Nootropics Survey 2020 Results suggest that it was popular with anonymous SlateStar readers.

[–] CinnasVerses@awful.systems 12 points 5 days ago (1 children)

There is a better timeline where Eliezer Yudkowsky got a better religious education, understood that he was having messianic thoughts reinforced by Orson Scott Card's Mormonism, and ended up working in a cafe, writing pulp fiction, and participating in the local kink scene (I think Scott Alexander knows damn well what he is doing and thinks the Truth about the Lesser Breeds is more important).

[–] CinnasVerses@awful.systems 7 points 6 days ago* (last edited 6 days ago) (8 children)

I first sighted Nick Bostrom in a series of mad-science-flavoured erotic horror comics on the Internet Archive (The Apsinthion Protocol and Progress in Research by the same writer). I wish more people had the sense to keep those ideas in the world of weird fiction like Charlie Stross does

The comic I linked is pretty tame (particularly the first few pages with the Bostrom reference). The whole series contains a wide variety of squicks so use your judgement before exploring.

[–] CinnasVerses@awful.systems 6 points 6 days ago (1 children)

Yes, not mutilating infants' genitals is the default choice! A few nations have that custom, like a few have the custom of stretching necks, or footbinding, or piercing ears. It's not even a very old custom in the USA (early 20th century, I think, whereas in North Africa and West Asia it goes back thousands of years).

[–] CinnasVerses@awful.systems 2 points 1 week ago

Argentina and Russia are the usual examples of countries which had great futures in 1913 and threw them away with a series of bad decisions.

[–] CinnasVerses@awful.systems 6 points 1 week ago

Rationalists 🤝 Postrationalists Writing screeds about how following the advice of Internet posts and self-medicating with controlled substances can be good if you are very smart

[–] CinnasVerses@awful.systems 8 points 1 week ago* (last edited 1 week ago)

Also had a beef with Aella when they were both in Austin!

They are not beating the allegations that a Postrationalist is a Rationalist who admits Yud is just a dude and their goals are religious goals.

Update: There is a partially cached Facebook post where MacDonald states what he claims is Eigenrobot's government name and says that Eigenrobot's "(self admitted) BPD wife used to want to fuck me" (Google and Bing saved snippets, but do not share the full cache). That could have been what got him kicked out of VibeCamp for doxxing.

 

It's almost the end of the year, so most US nonprofits which want to remain nonprofits have filed Form 990 for 2024, including some run by our dear friends. This is a mandatory financial report.

  • Lightcone Infrastructure is here. They operate LessWrong and the Lighthaven campus in Berkeley but list no physical assets; someone on Reddit says that they let fellow travelers like Scott Alexander use their old rented office for free. "We are a registered 501(c)3 and are IMO the best bet you have for converting money into good futures for humanity." They also published a book and website with common-sense, data-based advice for Democratic Party leaders called Deciding to Win, which I am sure fills a gap in the literature. Edit: their November 2024 call for donations, which talks about how they spent $16.5m on real estate and $6m on renovations and then saw donations collapse, is here; an analysis is here
  • CFAR is here. They seem to own the campus in Berkeley but it is encumbered with a mortgage ("Land, buildings, and equipment ... less depreciation; $22,026,042 ... Secured mortgages and notes payable, $20,848,988"). I don't know what else they do since they stopped teaching rationality workshops in 2016 or so and pivoted to worrying about building Colossus. They have nine employees with salaries from $112k to $340k plus a president paid $23k/year
  • MIRI is here. They pay Yud ($599,970 in 2024!) and after failing to publish much research on how to build Friend Computer they pivoted to arguing that Friend Computer might not be our friend. Edit: they had about $16 million in mostly financial assets (cash, investments, etc.) at end of year but spent $6.5m against $1.5m of revenue in 2024. They received $25 million in 2021 and ever since they have been consuming those funds rather than investing them and living off the interest.
  • BEMC Foundation is here. This husband-and-wife organization gives about $2 million/year each to Vox Future Perfect and GiveWell from an initial $38m in capital (so they can keep giving for decades without adding more capital). Edit: The size of the donations to Future Perfect and GiveWell swing from year to year so neither can count on the money, and they gave out $6.4m in 2024 which is not sustainable.
  • The Clear Fund (GiveWell) is here. They have the biggest wad of cash and the highest cashflow.
  • Edit: Open Philanthropy (now Coefficient Giving) is here (they have two sister organizations). David Gerard says they are mainly a way for Dustin Moskovitz, the co-founder of Facebook, to organize donations, like the Gates, Carnegie, and Rockefeller foundations. They used to fund Lightcone.
  • Edit: Animal Charity Evaluators is here. They have funded Vox Future Perfect (in 2020-2021) and the longtermist kind of animal welfare ("if humans eating pigs is bad, isn't whales eating krill worse?")
  • Edit: Survival and Flourishing Fund does not seem to be a charity. Whereas a Lightcone staffer says that SFF funds Lightcone, SFF says that it just connects applicants to donors and evaluates grant applications. So who exactly is providing the money? Sometimes it's Jaan Tallinn of Skype and Kazaa.
  • Centre for Effective Altruism is mostly British but has a US wing since March 2025 https://projects.propublica.org/nonprofits/organizations/333737390
  • Edit: Giving What We Can seems like a mainstream "bednets and deworming pills" type of charity
  • Edit: GiveDirectly Inc is an excellent idea in principle (give money to poor people overseas and let them figure out how best to use it), but their auditor flagged them for "material noncompliance" and "material weakness in internal controls". The mistakes don't seem sinister (they classified $39 million of donations as conditional rather than unconditional, i.e. with more restrictions than they actually had). GiveDirectly, Giving What We Can, and GiveWell are all much better funded than the core LessWrong organizations.

Since CFAR seems to own Lighthaven, it's curious that Lightcone head Oliver Habryka threatens to sell it if Lightcone shuts down. One might almost imagine that the boundaries between all these organizations are not as clear as the org charts make them seem. SFGate says that it cost $16.5 million plus renovations:

Who are these owners? The property belongs to a limited liability company called Lightcone Rose Garden, which appears to be a stand-in for the nonprofit Center for Applied Rationality and its project, Lightcone Infrastructure. Both of these organizations list the address, 2740 Telegraph Ave., as their home on public filings. They’ve renovated the inn, named it Lighthaven, and now use it to host events, often related to the organizations’ work in cognitive science, artificial intelligence safety and “longtermism.”

Habryka was boasting about the campus in 2024 and said that Lightcone budgeted $6.25 million on renovating the campus that year. It also seems odd for a nonprofit to spend money renovating a property that belongs to another nonprofit.

On LessWrong Habryka also mentions "a property we (Lightcone) own right next to Lighthaven, which is worth around $1M" and which they could use as collateral for a loan. Lightcone's 2024 paperwork listed the only assets as cash and accounts receivable. So either they are passing around assets like the last plastic cup at a frat party, or they bought this recently while the dispute with the trustees was ongoing, or Habryka does not know what his organization actually owns.

The California end seems to be burning money, as many movements with apocalyptic messages and inexperienced managers do. Revenue was significantly less than expenses and assets of CFAR are close to liabilities. CFAR/Lightcone do not have the $4.9 million liquid assets which the FTX trustees want back and claim their escrow company lost another $1 million of FTX's money.

 

People connected to LessWrong and the Bay Area surveillance industry often cite David Chapman's "Geeks, Mops, and Sociopaths in Subculture Evolution" to understand why their subcultures keep getting taken over by jerks. Chapman is a Buddhist mystic who seems rationalist-curious. Some people use the term postrationalist.

Have you noticed that Chapman presents the founders of nerdy subcultures as innocent nerds being pushed around by the mean suits? But today we know that the founders of Longtermism and LessWrong all had ulterior motives: Scott Alexander and Nick Bostrom were into race pseudoscience, and Yudkowsky had his kinks (and was also into eugenics and Libertarianism). HPMOR teaches that intelligence is the measure of human worth, and the use of intelligence is to manipulate people. Mollie Gleiberman makes a strong argument that "bednet" effective altruism with short-term measurable goals was always meant as an outer doctrine to prepare people to hear the inner doctrine about how building God and expanding across the Universe would be the most effective altruism of all. And there were all the issues within LessWrong and Effective Altruism around substance use, abuse of underpaid employees, and bosses who felt entitled to hit on subordinates. A '60s rocker might have been cheated by his record label, but that does not get him off the hook for crashing a car while high on nose candy and deep inside a groupie.

I don't know whether Chapman was naive or creating a smokescreen. Had he ever met the thinkers he admired in person?

 

Form 990 for these organizations mentions many names I am not familiar with, such as Tyler Emerson. Many people in these spaces have romantic or housing partnerships with each other, and many attend meetups and cons together. A MIRI staffer claims that Peter Thiel funded them from 2005 to 2009, and we now know when Jeffrey Epstein donated. Publishing such a thing is not very nice, since these are living persons frequently accused of questionable behavior which never goes to court (and some may have left the movement), but does a concise list of dates, places, and known connections exist?

Maybe that social graph would be more of a dot. So many of these people date each other and serve on each other's boards and live in the SF Bay Area, Austin TX, the NYC area, or Oxford, England. On the enshittified site people talk about their Twitter and Tumblr connections.

 

We often mix up two bloggers named Scott. One of Jeffrey Epstein's victims says that she was abused by a white-haired psychology professor or Harvard professor named Stephen. In 2020, Vice observed that two Harvard faculty members with known ties to Epstein fit that description (a Steven and a Stephen). The older of the two taught the younger. The younger denies that he met or had sex with the victim. What kind of workplace has two people who can be reasonably suspected of an act like that?

I am being very careful about talking about this.

 

An opposition between altruism and selfishness seems important to Yud. 23-year-old Yud said "I was pretty much entirely altruistic in terms of raw motivations" and his Pathfinder fic has a whole theology of selfishness. His protagonists have a deep longing to be world-historical figures and be admired by the world. Dreams of controlling and manipulating people to get what you want are woven into his community like mould spores in a condemned building.

Has anyone unpicked this? Is talking about selfishness and altruism common on LessWrong, like pretending to use Bayesian statistics?

 

I used to think that psychiatry-blogging was Scott Alexander's most useful/least harmful writing, because it's his profession and an underserved topic. But he has his agenda to preach race pseudoscience and 1920s-type eugenics, and he has written in some ethical grey areas, like stating a named friend's diagnosis and desired course of treatment. He is in a community where many people tell themselves that their substance use is medicinal and want prescriptions. Someone on SneerClub thinks he mixed up psychosis and schizophrenia in a recent post.

If you are in a registered profession like psychiatry, it can be dangerous to casually comment on your colleagues. Regardless, has anyone with relevant qualifications ever commented on his psychiatry blogging and whether it is a good representation of the state of knowledge?

31
submitted 4 months ago* (last edited 4 months ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems
 

Bad people who spend too long on social media call normies NPCs, as in video-game NPCs who follow a closed behavioural loop. Wikipedia says this slur was popular with the Twitter far right in October 2018. Two years before that, Maciej Ceglowski warned:

I've even seen people in the so-called rationalist community refer to people who they don't think are effective as ‘Non Player Characters’, or NPCs, a term borrowed from video games. This is a horrible way to look at the world.

Sometime in 2016, an anonymous coward on 4Chan wrote:

I have a theory that there are only a fixed quantity of souls on planet Earth that cycle continuously through reincarnation. However, since the human growth rate is so severe, the soulless extra walking flesh piles around us are NPC’s (sic), or ultimate normalfags, who autonomously follow group think and social trends in order to appear convincingly human.

Kotaku says that this post was rediscovered by the far right in 2018.

Scott Alexander's novel Unsong has an angel tell a human character that there was a shortage of divine light for creating souls so "I THOUGHT I WOULD SOLVE THE MORAL CRISIS AND THE RESOURCE ALLOCATION PROBLEM SIMULTANEOUSLY BY REMOVING THE SOULS FROM PEOPLE IN NORTHEAST AFRICA SO THEY STOPPED HAVING CONSCIOUS EXPERIENCES." He posted that chapter in August 2016 (unsongbook.com). Was he reading or posting on 4chan?

Did any posts on LessWrong use this insult before August 2016?

Edit: In HPMOR by Eliezer Yudkowsky (written in 2009 and 2010), rationalist Harry Potter calls people who don't do what he tells them NPCs. I don't think Yud's Harry says they have no souls but he has contempt for them.
