istewart

joined 8 months ago
[–] istewart@awful.systems 12 points 1 month ago (4 children)

He will never stop to reflect that his "philosophy," such as it is, is explicitly tailored for avaricious, power-hungry narcissists, soooooo

[–] istewart@awful.systems 13 points 1 month ago

Obvious joke is obvious, but

The essay brims with false dichotomies, logical inconsistencies, half-baked metaphors, and allusions to genocide. It careens from Romanian tractor factories to Harvard being turned “into dust. Into quarks” with the coherence of a meth-addled squirrel.

Harvard isn't already full of Quarks?

[–] istewart@awful.systems 11 points 1 month ago (1 children)

Another thread worth pulling is that biotechnology and synthetic biology have turned out to be substantially harder to master than anticipated, and they never seemed to be the primary area of expertise for most of these people anyway. I don't have a copy of any of Kurzweil's books at hand to check his predicted timelines for that stuff, but they're surely way off.

Faulty assumptions about the biological equivalence of digital neural network algorithms have done a lot of unexamined heavy lifting in driving the current AI bubble, and keeping the harder stuff on the fringes of the conversation. That said, I don't doubt that a few refugees from the bubble-burst will attempt to inflate the next bubble on the back of speculative biotech, and I've seen a couple of signs of that already.

[–] istewart@awful.systems 5 points 1 month ago (2 children)

For my money, 2015/16 Adams trying to sell Trump as a "master persuader" while also desperately pretending not to be an explicit Trump supporter himself was probably the most entertaining he's ever been. Once he switched from skimmable text blogging to livestreaming, though, he wanted to waste too much of my time to be interesting anymore.

[–] istewart@awful.systems 7 points 1 month ago (2 children)

"This Is What Yudkowsky Actually Believes" seems like a subtitle that would get heavy use in a future episode of South Park about Cartman dropping out after one semester at community college.

[–] istewart@awful.systems 9 points 1 month ago

Yes, Kurzweil desperately trying to create some kind of a scientific argument, as well as people with university affiliations like Singer and MacAskill pushing EA, are what give this stuff institutional strength. Yudkowsky and LW are by no means less influential, but they're at best a student club that only aspires to be a proper curriculum. It's surely no coincidence that they're anchored in Berkeley, adjacent to the university's famous student-led DeCal program.

FWIW, my capsule summary of TPOT/"post-rationalists" is that they're people who thought that advanced degrees and/or adjacency to VC money would yield more remuneration and influence than they actually did. Equally burned out, just further along the same path.

[–] istewart@awful.systems 10 points 1 month ago* (last edited 1 month ago) (3 children)

I've been contemplating this, and I agree with most everyone else about leaning heavily into the cult angle and explaining it as a mutant hybrid between Scientology-style UFO religions and Christian dispensationalist Book of Revelation eschatology. The latter may be especially useful in explaining it to USians. My mom (who works in an SV-adjacent job) sent me this Vanity Fair article the other day about Garry Tan grifting his way into non-denominational prosperity gospel Christianity: https://www.vanityfair.com/news/story/christianity-was-borderline-illegal-in-silicon-valley-now-its-the-new-religion. She was wondering if it was "just another fad for these people," and I had to explain that no, not really: their AI bullshit is so outlandish that some of them feel the need to pivot back toward something more mainstream to keep growing their following.

I also prefer to highlight Kurzweil's obsession with perpetual exponential growth curves as a central point. That's often where I start when I'm explaining all of this to somebody. It provides the foundation for the bullshit towers that Yudkowsky and friends have erected. I also think that, long-term, the historiography of this stuff will lean more heavily on Kurzweil as a source than on Yudkowsky, because Kurzweil is better organized and professionally published. His work will most likely be the main source in the lower-division undergraduate/AP high school history texts that cover this stuff as a background trend of the 2010s/2020s. Right now we live in the peak days of the LessWrong bullshit volcano plume, but ultimately it will probably be interpreted by the specialized upper-division texts that grow out of people's PhD theses.


[–] istewart@awful.systems 7 points 1 month ago

Huh, 2 paradigm shifts is about what it takes to get my old Beetle up to freeway speed, maybe big Yud is onto something

[–] istewart@awful.systems 6 points 1 month ago

It is what happened to look good in the valley between the Adderall comedown and yesterday evening's edible really starting to hit

[–] istewart@awful.systems 9 points 1 month ago (2 children)

Just had a video labeled "auto-dubbed" pop up in my YouTube feed for the first time. Not sure whether the author chose it. Too bad; it looks like a fascinating problem to see explained, but I don't think I'm going to trust an AI feature I just saw for the first time to explain it. (And perhaps more crucially, I'm a bit afraid of what anime fans will have to say about this.)

[–] istewart@awful.systems 7 points 1 month ago

And the photos from a previous event are an ocean of whiteness. Hard to argue that they're not, uh, cultivating a certain demographic...

[–] istewart@awful.systems 9 points 1 month ago (1 children)

I propose we pool funds, buy an old motel on the other side of the city limits in Oakland, and rename it Farthaven
