Published: April 20, 2023

These numbers are B.S. The figures come from: https://aiimpacts.org/2022-exp... They asked over 4,000 people who happened to publish at two 2021 AI conferences, and only 17% (~700 people) responded. That doesn't represent 50% of AI researchers! And it's hardly an unbiased sample.

@MelMitchell1 Based on the dataset for this survey: Only 149 answered the "Extinction from AI" Q. = 20% of the 738 respondents (3% of the 4,271 AI experts contacted) Only 162 answered "Extinction from human failure to control AI" Q. = 22% of the 738 respondents (3.7% of AI experts contacted)
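The percentages in the tweet above can be sanity-checked directly from the counts it cites (149 and 162 answers, 738 respondents, 4,271 experts contacted). A minimal sketch, recomputing those ratios:

```python
# Recompute the response-rate percentages cited in the thread.
# The raw counts (738 respondents, 4,271 contacted, 149 and 162
# answers per question) come from the tweet; only the arithmetic
# is done here.
contacted = 4271
respondents = 738

for question, answers in [
    ("Extinction from AI", 149),
    ("Extinction from human failure to control AI", 162),
]:
    pct_of_respondents = 100 * answers / respondents
    pct_of_contacted = 100 * answers / contacted
    print(f"{question}: {pct_of_respondents:.0f}% of respondents, "
          f"{pct_of_contacted:.1f}% of those contacted")
```

This reproduces the ~20% and ~22% figures; the shares of all contacted experts land around 3.5% and 3.8%, consistent with the roughly 3% and 3.7% quoted in the tweet.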

Image in tweet by Melanie Mitchell

@MelMitchell1 AI Impacts FUNDING: MIRI (Yudkowsky), Survival & Flourishing Fund, The Centre for Effective Altruism (Oxford), Effective Altruism Funds, Open Philanthropy, Fathom Radiant. Previously: The Future of Life Institute, the Future of Humanity Institute (Bostrom), FTX Future Fund (RIP).

Image in tweet by Melanie Mitchell

@MelMitchell1 Tristan Harris cites this study in the "AI Dilemma," on his podcast, on NBC, and in his New York Times op-ed with Raskin and Harari. The New York Times REALLY loves to share this stat as well; it has already appeared in Klein's column and podcast, Wallace-Wells' op-ed, and The Morning newsletter.

Image in tweet by Melanie Mitchell

@DrTechlash @MelMitchell1 "Prediction is very difficult, especially if it's about the future." -Niels Bohr. Also, Yogi Berra. Also, maybe, George Box. But on the uncertainty of future events, I like Mike Tyson: "Everyone has a plan until they get punched in the mouth." Or maybe that was Napoleon, not sure.

@NeilRaden @MelMitchell1 "We should not overstate the results of these surveys," @MaxCRoser summarized. "Experts in a particular technology are not necessarily experts in making predictions about the future of that technology." 🎯 https://ourworldindata.org/ai-...

@DrTechlash @MelMitchell1 Just realized: this is also wrong because he's not describing the question correctly. The question he's describing got an answer of 5%, not 10%.

@DrTechlash @MelMitchell1 It is recommended and general practice for a survey sample to be at most 10% of the population. And not answering a question does not tell you whether people agree or disagree.

@DrTechlash @MelMitchell1 Extinction from the energy resources used. The rise in energy expenditure from AI use can be expected to far surpass the rise we saw with cloud resources. That alone will tip us over in terms of climate change.

@DrTechlash @MelMitchell1 Kindly read the whole dataset, where "How positive or negative do you expect the overall impact of this to be on humanity, in the long run? - Extremely bad (e.g. human extinction)" had 559 answers and an average of 17%.
