CinnasVerses

joined 6 months ago

Hofstadter and Hesse seem to be namechecked on LW much more often than Leo Strauss. I wonder if Scott Alexander talks about Strauss over coffee if he trusts you, because so much of what our friends do is supposed to fool the vulgar masses while the wise smile and know the hidden truth.

I wonder if the real secret to Vassar's influence is that he influenced the leaders of Bay Area LW like Alexander, Anissimov, Constantin, Zvi Mowshowitz, and Salamon.

Thanks! I don't get the impression that Michael Vassar posts or publishes a lot under his own name; he seems to prefer cornering susceptible people at events and then having private conversations and correspondence with the ones who respond in a promising way. The clearest description of his jailbreaking which I have read is by Scott Alexander in a back-and-forth with Jessica Taylor (and we know Scott Alexander tries to hide some of the beliefs he cares the most about).

In a LessWrong thread people just point to a deleted Twitter account and some YouTube videos by Vassar.

RationalWiki briefly mentions earlier woo about brain hemispheres.

[–] CinnasVerses@awful.systems 4 points 2 days ago (2 children)

That specific instance of Archive Today seems to have been taken over by activists who edit their copies of some pages and carried out a DDoS attack (although all I know comes from social media posts and news stories). https://www.avclub.com/archiveis-under-fbi-investigation

[–] CinnasVerses@awful.systems 3 points 2 days ago (4 children)

Well, I think the Buddhist idea that the self is an illusion goes back 2500 years or more, but Douglas Richard Hofstadter might have introduced nerdy American sci-fi fans to the idea.

[–] CinnasVerses@awful.systems 4 points 2 days ago

The Cut seems to like articles on cults and abuse within small groups, since they have an article on the Zizians, and one on a Neo-Tantric sex group where Aella would feel at home.

[–] CinnasVerses@awful.systems 6 points 3 days ago (6 children)

I heard somewhere that "there is no unitary self" can be a Buddhist teaching and TPOT draws on Western Buddhism. There is work to be done figuring out where they got their eclectic mix of techniques and terminology.

[–] CinnasVerses@awful.systems 8 points 3 days ago (12 children)

Has anyone heard of the Internal Family Systems Model? One of the CFAR founders said he relied on it when he was designing self-help workshops. The IFS encourages you to see yourself as a system of entities and talk to them separately, and that reminds me of Ziz Lasota's two-hemispheres theory and Michael Vassar's jailbreaking.

[–] CinnasVerses@awful.systems 5 points 4 days ago

Forming a single legal entity would have made it hard to protect the other projects if the CFAR side had lost a lawsuit over abuse of a minor at a CFAR event, or if Lightcone had lost a judgment over taking money from FTX and had to sell the Rose Garden property. I know these people don't do "fear of frequent consequences of ordinary human weaknesses," but that is a big risk.

I also wonder who served as treasurer and bookkeeper for each project. If one person served both projects, he or she could have caused all kinds of trouble, even if there were separate bank accounts.

[–] CinnasVerses@awful.systems 5 points 4 days ago (3 children)

Back in 2019, Ben Pace of Lightcone said that CFAR and Lightcone were one legal entity, but two boards with no overlap. Did CFAR + Lightcone really spend $22 million on real estate in Berkeley without spending a few grand to create a separate nonprofit and separate the finances? In 2024, CFAR still had the real estate and the mortgage on its books. https://www.lesswrong.com/posts/eR7Su77N2nK3e5YRZ/the-lesswrong-team-is-now-lightcone-infrastructure-come-work-3

I have never opened a US business bank account, but I would think it would be hard to keep the bank accounts separate if one organization has no independent legal existence, and transactions in the millions or tens of millions tempt the most righteous person to stick his fingers in the till.

[–] CinnasVerses@awful.systems 6 points 4 days ago

If Duncan Sabien or Eliezer Yudkowsky admitted what they were doing, they would have to take on more responsibility. If they moved from chanting "we have noticed the skulls" to expecting every serious member to be able to describe times that nerdy subcultures they were not part of went wrong, they would have to give up most of what they do.

[–] CinnasVerses@awful.systems 9 points 5 days ago* (last edited 5 days ago) (2 children)

CFAR seems to have pivoted back to focusing on the workshops. Their winter 2025/2026 fundraiser only raised $10k with a goal of $125k. The curriculum sounds very New Age:

If you’ve been to a CFAR workshop in the ~2015-2020 era, you should expect that current ones: ... Have roughly 1/3rd new content, mostly aimed at practical ways to be less “seeing like a state” when applying rationality techniques, and to be more “a proud gardener of the living processes inside you / a free person with increasing powers of authorship.” (We've been calling this thread "honoring who-ness.")

No masks in their photo of a workshop posted February 2025 (2024 was a pretty bad year for airborne infections where I live, and alienated educated young people are more likely to wear respirators than normies, so I would expect to see someone in that room wearing an N95 or Flo). If building warm and nurturing relationships is important, then it helps to be able to eat together and see each other's faces. The venue is about a 90-minute drive from Oakland, CA (the East Bay).

This paragraph leapt out at me:

On Day 4 of the four day workshop, we spent three and a half hours on an activity called Questing, in which participants took turns being the “hero” (who worked on whatever they liked) and the “sidekick” (who assisted at the hero’s direction) for ~10 minute chunks. This activity was extremely well-liked (did best of all activities on our survey; many said many great things about it).

If you read that and say "doesn't that sound like Effort Exchange in the Dragon Army Barracks?" you should go home and rethink the regrettable things you learn on the Internet. I look forward to reading the book on LessWrong, the splinter sects, and just how much they had in common after a hard day gardening in a post-apocalyptic wasteland.

Before FTX collapsed, my model of LW was something like cryptozoology enthusiasts who trade posts and sometimes meet at a con; now it's more like Scientology. Early Scientology offered a community and a path to self-improvement.

 

Does anyone know what this June 2019 text from Epstein is about? I have added some links to RationalWiki and Wikipedia ~~but not corrected spelling~~ and corrected OCR errors. Was it at one of the institutions he sponsored like MIT Media Lab? Or more like his conference in the Virgin Islands? It seems to mix mainstream figures and people in the Libertarian/LessWrong network.

Another correspondent in 2016 suggested inviting Scott Alexander Siskind to speak at a different event Epstein was involved in. The correspondent has a Substack which cites Siskind in 2025.

Obviously just because Epstein had heard of a public figure does not mean that they knew him.

Epstein's words begin below:

  • List for summer talks:
  • David Pizarro, Professor of Psychology and Philosopher at Cornell University
  • Eric Weinstein, Mathematician
  • Matthew Putman, Scientist
  • Paul Saffo, Technology Forecaster, and Professor of Engineering
  • Laurie Santos, Professor of Psychology and Cognitive Science
  • Janna Levin, Theoretical Cosmologist
  • Ev Williams, Internet Entrepreneur
  • Phoebe Waller-Bridge, Author
  • Heiner Goebbels, Composer, and Director
  • Martine Rothblatt, Lawyer and Entrepreneur
  • Peter Thiel, Venture Capitalist, and Entrepreneur
  • Richard Thaler, Behavioral Economics
  • Barbara Tversky, Professor of Psychology
  • Michael Vassar, Futurist, Activist
  • Bret Weinstein, Biologist, and Evolutionary Theorist
  • Susan Hockfield, MIT President, Professor of Neuroscience
  • David Deutsch, Physicist
  • Eliezer Yudkowsky, AI Researcher
  • N. Jeremy Kasdin, Astrophysicist
  • Carl Zimmer, Science Writer
  • Douglas Rushkoff, Media Theorist
  • Eric Topol, Cardiologist
  • Dustin Yellin, Artist
  • Sherry Turkle, Professor of Social Studies
  • Taylor Mac, Actor
  • Stephen Johnson, Author
  • Martin Hagglund, Swedish Philosopher and Scholar of Modernist Literature
  • Thomas Metzinger, Philosopher, and Professor of Theoretical Philosophy
  • Bjarke Ingels, Danish Architect, Founder of BIG, currently working on Floating Cities/Sustainable Habitats project
  • Kai-Fu Lee, Venture Capitalist, Technology Executive, and AI Expert, developed the world's first speaker-independent continuous speech recognition system
  • Poppy Crum, Neuroscientist, and Technologist, Chief Scientist at Dolby Laboratories, Adjunct Professor at Stanford University (Computer Research in Music)
  • Neil Burgess, Researcher, and Professor of Cognitive Neuroscience, investigating the role of the hippocampus in spatial navigation
  • Paul Bloom, Psychologist, and Researcher exploring how children and adults understand the physical and social world, with a special focus on language, religion and morality
  • Brian Cox, Physicist, and Professor of Particle Physics, Presenter of Science Programs
  • Eythor Bender, CEO of Berkeley Bionics, Innovator and Business Leader in human augmentation (bionics and robotics)
  • Gwynne Shotwell, President and COO at SpaceX, Engineer, listed in 2018 as the 59th most powerful woman in the world by Forbes
  • Jaap de Roode, Associate Professor of Evolution (of parasites) and Ecology, focusing on how parasites attack monarch butterflies and in return how butterflies have the ability to self-medicate
  • Jim Holt, American Philosopher, and Contributor to the New York Times writing on string theory, time, the universe, and philosophy
  • Vijay Kumar, Indian Roboticist and UPS Foundation Professor in School of Engineering & Applied Science; became Dean of Penn Engineering, studies flying and cooperative robots
  • Hugh Herr, Biophysicist, Engineer, and Rock Climber, builds prosthetic knees, legs, and ankles that fuse biomechanics with microprocessors at MIT
  • Gabriel Zucman, French Economist at UC Berkeley, best known for his research on tax havens, inequalities, and global wealth
  • Fei-Fei Li, Professor of Computer Science, Director of Stanford's Human-Centered AI, works as Chief Scientist of AI/ML of Google Cloud
  • Dennis Hong, Korean American Mechanical Engineer, Professor and Founding Director of RoMeLa (Robotics & Mechanisms Laboratory) of the Mechanical & Aerospace Engineering Department at UCLA
  • Misha (Mikhail) Leonidovich Gromov, American
 

It's almost the end of the year, so most US nonprofits which want to remain nonprofits have filed Form 990 for 2024, including some run by our dear friends. This is a mandatory financial report.

  • Lightcone Infrastructure is here. They operate LessWrong and the Lighthaven campus in Berkeley but list no physical assets; someone on Reddit says that they let fellow travelers like Scott Alexander use their old rented office for free. "We are a registered 501(c)3 and are IMO the best bet you have for converting money into good futures for humanity." They also published a book and website with common-sense, data-based advice for Democratic Party leaders called Deciding to Win, which I am sure fills a gap in the literature. Edit: their November 2024 call for donations, which talks about how they spent $16.5m on real estate and $6m on renovations and then saw donations collapse, is here; an analysis is here
  • CFAR is here. They seem to own the campus in Berkeley, but it is encumbered with a mortgage ("Land, buildings, and equipment ... less depreciation; $22,026,042 ... Secured mortgages and notes payable, $20,848,988"). I don't know what else they do, since they stopped teaching rationality workshops in 2016 or so and pivoted to worrying about building Colossus. They have nine employees with salaries from $112k to $340k, plus a president paid $23k/year.
  • MIRI is here. They pay Yud ($599,970 in 2024!) and after failing to publish much research on how to build Friend Computer they pivoted to arguing that Friend Computer might not be our friend. Edit: they had about $16 million in mostly financial assets (cash, investments, etc.) at end of year but spent $6.5m against $1.5m of revenue in 2024. They received $25 million in 2021 and ever since they have been consuming those funds rather than investing them and living off the interest.
  • BEMC Foundation is here. This husband-and-wife organization gives about $2 million/year each to Vox Future Perfect and GiveWell from an initial $38m in capital (so they can keep giving for decades without adding more capital). Edit: The size of the donations to Future Perfect and GiveWell swings from year to year, so neither can count on the money, and they gave out $6.4m in 2024, which is not sustainable.
  • The Clear Fund (GiveWell) is here. They have the biggest wad of cash and the highest cashflow.
  • Edit: Open Philanthropy (now Coefficient Giving) is here (they have two sister organizations). David Gerard says they are mainly a way for Dustin Moskovitz, the co-founder of Facebook, to organize donations, like the Gates, Carnegie, and Rockefeller foundations. They used to fund Lightcone.
  • Edit: Animal Charity Evaluators is here. They have funded Vox Future Perfect (in 2020-2021) and the longtermist kind of animal welfare ("if humans eating pigs is bad, isn't whales eating krill worse?")
  • Edit: Survival and Flourishing Fund does not seem to be a charity. A Lightcone staffer says that SFF funds Lightcone, while SFF says that they just connect applicants to donors and evaluate grant applications. So who exactly is providing the money? Sometimes it's Jaan Tallinn of Skype and Kazaa.
  • Centre for Effective Altruism is mostly British but has a US wing since March 2025 https://projects.propublica.org/nonprofits/organizations/333737390
  • Edit: Giving What We Can seems like a mainstream "bednets and deworming pills" type of charity
  • Edit: GiveDirectly Inc is an excellent idea in principle (give money to poor people overseas and let them figure out how best to use it), but their auditor flagged them for material noncompliance and a material weakness in internal controls. The mistakes don't seem sinister (they classified $39 million of donations as conditional rather than unconditional, i.e. with more restrictions than they actually had). GiveDirectly, Giving What We Can, and GiveWell are all much better funded than the core LessWrong organizations.

Since CFAR seems to own Lighthaven, it's curious that Lightcone head Oliver Habryka threatens to sell it if Lightcone shuts down. One might almost imagine that the boundaries between all these organizations are not as clear as the org charts make it seem. SFGate says that it cost $16.5 million plus renovations:

Who are these owners? The property belongs to a limited liability company called Lightcone Rose Garden, which appears to be a stand-in for the nonprofit Center for Applied Rationality and its project, Lightcone Infrastructure. Both of these organizations list the address, 2740 Telegraph Ave., as their home on public filings. They’ve renovated the inn, named it Lighthaven, and now use it to host events, often related to the organizations’ work in cognitive science, artificial intelligence safety and “longtermism.”

Habryka was boasting about the campus in 2024 and said that Lightcone budgeted $6.25 million for renovating the campus that year. It also seems odd for a nonprofit to spend money renovating a property that belongs to another nonprofit.

On LessWrong Habryka also mentions "a property we (Lightcone) own right next to Lighthaven, which is worth around $1M" and which they could use as collateral for a loan. Lightcone's 2024 paperwork listed the only assets as cash and accounts receivable. So either they are passing around assets like the last plastic cup at a frat party, or they bought this recently while the dispute with the trustees was ongoing, or Habryka does not know what his organization actually owns.

The California end seems to be burning money, as many movements with apocalyptic messages and inexperienced managers do. Revenue was significantly less than expenses, and CFAR's assets are close to its liabilities. CFAR/Lightcone do not have the $4.9 million in liquid assets which the FTX trustees want back, and they claim their escrow company lost another $1 million of FTX's money.

 

People connected to LessWrong and the Bay Area surveillance industry often cite David Chapman's "Geeks, Mops, and Sociopaths in Subculture Evolution" to understand why their subcultures keep getting taken over by jerks. Chapman is a Buddhist mystic who seems rationalist-curious. Some people use the term postrationalist.

Have you noticed that Chapman presents the founders of nerdy subcultures as innocent nerds being pushed around by the mean suits? But today we know that the founders of Longtermism and LessWrong all had ulterior motives: Scott Alexander and Nick Bostrom were into race pseudoscience, and Yudkowsky had his kinks (and was also into eugenics and Libertarianism). HPMOR teaches that intelligence is the measure of human worth, and the use of intelligence is to manipulate people. Mollie Gleiberman makes a strong argument that "bednet" effective altruism with short-term measurable goals was always meant as an outer doctrine to prepare people to hear the inner doctrine about how building God and expanding across the Universe would be the most effective altruism of all. And there were all the issues within LessWrong and Effective Altruism around substance use, abuse of underpaid employees, and bosses who felt entitled to hit on subordinates. A '60s rocker might have been cheated by his record label, but that does not get him off the hook for crashing a car while high on nose candy and deep inside a groupie.

I don't know whether Chapman was naive or creating a smokescreen. Had he ever met the thinkers he admired in person?

 

Form 990 for these organizations mentions many names I am not familiar with, such as Tyler Emerson. Many people in these spaces have romantic or housing partnerships with each other, and many attend meetups and cons together. A MIRI staffer claims that Peter Thiel funded them from 2005 to 2009, and we now know when Jeffrey Epstein donated. Publishing such a thing is not very nice, since these are living persons frequently accused of questionable behavior which never goes to court (and some may have left the movement), but does a concise list of dates, places, and known connections exist?

Maybe that social graph would be more of a dot. So many of these people date each other and serve on each other's boards and live in the SF Bay Area, Austin TX, the NYC area, or Oxford, England. On the enshittified site people talk about their Twitter and Tumblr connections.

 

We often mix up two bloggers named Scott. One of Jeffrey Epstein's victims says that she was abused by a white-haired psychology professor or Harvard professor named Stephen. In 2020, Vice observed that two Harvard faculty members with known ties to Epstein fit that description (a Steven and a Stephen). The older of the two taught the younger. The younger denies that he met or had sex with the victim. What kind of workplace has two people who can be reasonably suspected of an act like that?

I am being very careful about talking about this.

 

An opposition between altruism and selfishness seems important to Yud. 23-year-old Yud said "I was pretty much entirely altruistic in terms of raw motivations" and his Pathfinder fic has a whole theology of selfishness. His protagonists have a deep longing to be world-historical figures and be admired by the world. Dreams of controlling and manipulating people to get what you want are woven into his community like mould spores in a condemned building.

Has anyone unpicked this? Is talking about selfishness and altruism common on LessWrong, like pretending to use Bayesian statistics?

 

I used to think that psychiatry-blogging was Scott Alexander's most useful/least harmful writing, because it's his profession and an underserved topic. But he has his agenda to preach race pseudoscience and 1920s-type eugenics, and he has written in some ethical grey areas, like stating a named friend's diagnosis and desired course of treatment. He is in a community where many people tell themselves that their substance use is medicinal and want prescriptions. Someone on SneerClub thinks he mixed up psychosis and schizophrenia in a recent post.

If you are in a registered profession like psychiatry, it can be dangerous to casually comment on your colleagues. Regardless, has anyone with relevant qualifications ever commented on his psychiatry blogging and whether it is a good representation of the state of knowledge?

32
submitted 6 months ago* (last edited 6 months ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems
 

Bad people who spend too long on social media call normies NPCs, as in video-game non-player characters who follow a closed behavioural loop. Wikipedia says this slur was popular with the Twitter far right in October 2018. Two years before that, Maciej Ceglowski warned:

I've even seen people in the so-called rationalist community refer to people who they don't think are effective as ‘Non Player Characters’, or NPCs, a term borrowed from video games. This is a horrible way to look at the world.

Sometime in 2016, an anonymous coward on 4Chan wrote:

I have a theory that there are only a fixed quantity of souls on planet Earth that cycle continuously through reincarnation. However, since the human growth rate is so severe, the soulless extra walking flesh piles around us are NPC’s (sic), or ultimate normalfags, who autonomously follow group think and social trends in order to appear convincingly human.

Kotaku says that this post was rediscovered by the far right in 2018.

Scott Alexander's novel Unsong has an angel tell a human character that there was a shortage of divine light for creating souls so "I THOUGHT I WOULD SOLVE THE MORAL CRISIS AND THE RESOURCE ALLOCATION PROBLEM SIMULTANEOUSLY BY REMOVING THE SOULS FROM PEOPLE IN NORTHEAST AFRICA SO THEY STOPPED HAVING CONSCIOUS EXPERIENCES." He posted that chapter in August 2016 (unsongbook.com). Was he reading or posting on 4chan?

Did any posts on LessWrong use this insult before August 2016?

Edit: In HPMOR by Eliezer Yudkowsky (written in 2009 and 2010), rationalist Harry Potter calls people who don't do what he tells them NPCs. I don't think Yud's Harry says they have no souls but he has contempt for them.
