Published: June 2, 2024

Thread from the PauseAI discord. This is not a dunk on any of these people; I believe they’re victims of a malicious ideology. “Mental health support groups” “Months of grief” “I panicked for 6 months… to save my family”

[Images: four Discord screenshots, shared by Angantýr]

Not everyone is equally crushed by doomer memetics. “It’s important to remember that all we have is expert *opinions*… experts are wrong all the time”

[Image: Discord screenshot, shared by Angantýr]

“I see no exit” “Even if this is the end, let’s make some noise” “It is such a heavy burden to our children. It hurts me very much” “I don’t know… if I have laughed for a year now”

[Images: four Discord screenshots, shared by Angantýr]

“Even if we all die… you’ll be glad that you tried to fight it” “All the advancements in AI have made me *really* anxious. It’s borderline depression at this point” “The logical thing to me would be to go around and tell everyone I see that the end of the world is near” “I

[Images: four Discord screenshots, shared by Angantýr]

@BasedNorthmathr When you hear that AI presents an existential risk (86% of AI researchers believe the alignment problem is real and important), there are a couple of ways you can respond. At first, you deny it, of course. That's the default response from most people, including most in

@PauseAI Hrm interesting, however the opinions of experts don’t concern me, and anybody who thinks it reasonable to try to throttle our ability to solve problems with sweeping regulations is my ideological enemy

@BasedNorthmathr If AI brings significant x-risk, their distress is justified. Do you have any object-level arguments?

@killerstorm Counterpoint: you cannot predict the future

@BasedNorthmathr something. serious.

@ns123abc Serious and awful

@BasedNorthmathr There's a whole cult built around AI doom https://x.com/kanzure/status/1...

@kanzure I know! Bad epistemology designed to capture autists and recruit them into fronts for sex-cults (MIRI) and have them obsess over a future that lacks any good explanations

@BasedNorthmathr I had no idea

@MPrinParr It’s a damn shame

@BasedNorthmathr someone needs to dunk on these people they are dangerous

@NftCelestials They’re misguided

@BasedNorthmathr Lots of parallels with what I see in climate action groups. There is a lot of cope and a general sense of doomerism. In the end some make it through and join an ESG consulting group 🤦🏽

@BasedNorthmathr Other than the last screenshot, all of those could be talking about any number of things, including covid doomerism, climate doomerism, AI doomerism lol That says something

@AlignmentGuts Deferring to experts is a disastrous way to think, and is one of the commonalities. If you don’t know any better, someone with more credentials than you can tell you anything, and you’ll be inclined to believe it

@BasedNorthmathr I fear somebody from a community like this is gonna go full Ted sooner or later. Or maybe full Yates with their children, to save them from Roko's Basilisk.

@impershblknight They should honestly be more worried about the P(nutcase) from their own ranks. It’s very easy to justify sending a pipe bomb to OAI offices if you genuinely think they’re trying to end the world
