Scott Alexander

Scott Alexander (b. 1984) is a LessWrong-rationalist blogger. After graduating magna cum laude with a bachelor's degree in Philosophy,[1] he earned an MD and then completed a residency in psychiatry. Scott Alexander is his pen name. As is customary in the writing of psychiatrists and psychologists, he mashes up details of different patients when he writes about them, fictionalizing the accounts so that his patients cannot be identified.

He began writing on Less Wrong under the name Yvain, and then branched out into his own blog, Slate Star Codex (a near-anagram of "Scott Alexander"). SSC has become one of the top-tier blogs for LessWrong-style rationalists; it and his related Tumblr are linchpins of the LessWrong Diaspora.[2]

SSC posts tend to range from long to extremely long. Alexander uses Twitter[3] and Tumblr[4] to post short/frivolous posts and puns.

Notable internet publications include his giant anti-neoreactionary FAQ,[5] his map of the rationalist blogosphere, and a long collection of quotations from actual computer scientists on the subject of why we should take AI risk seriously. Additionally, he posted a lengthy and famed criticism[6] of feminism, which had been spurred by a feminist backlash[7][8] against a blog comment[9] by MIT professor Scott Aaronson.

Alexander is a frequent attendee of local LessWrong-rationalist meetups[10] in the US, and organizes some of them himself.

Political and social views

He does not always censor racist and sexist opinions in his comments section (except on open threads, where race and gender discussions are always banned), which some of his fellow LessWrong-style rationalists have a problem with.[11]

Iranian secularist Kaveh Mousavi, while agreeing with Alexander that the intellectually bankrupt sections of the social justice community should be heavily critiqued, has nonetheless criticized Alexander himself for having an Americentric view of social issues, for creating a false equivalence between social justice advocates and social conservatives, and for downplaying discrimination against women and minorities in Western countries.[12] It is worth noting, though, that Alexander has been willing to defend the parts of social justice he views as worthwhile, such as the use of trigger warnings,[13] and to acknowledge that discrimination still exists and carries massive economic costs.[14]

Neoreaction

Alexander is critical of neoreactionaries, having written what is generally regarded as the definitive takedown of neoreaction,[5] though, per the header, he later took back some of the points he made in it. Nevertheless, his blogroll is full of neoreactionaries, his comment section contains a lot of neoreactionary discussion, and he knows a pile of them personally. He keeps discussing their ideas on his blog, and, for example, treats Mencius Moldbug's Unqualified Reservations blog as an obvious go-to reference his readers will immediately understand when he's talking about gay relationship counselling.[15]

Feminism

Alexander identifies as neither a feminist nor an anti-feminist,[3] but feels he has been unfairly associated with both.

He has talked of "the sane 30%-or-so of feminists"[16][17] and described some essays as "blurring the already thin line between feminism and literally Voldemort".[18][19]

In "SSC on Feminism", a post meant to clarify his position on feminism and feminist issues, he described his negative attitude towards parts of the movement:

I think there’s a whole corner of Internet feminism – the Jezebel, Gawker, and Modal Tumblr User faction – which is really scary. [...]

This strain is absolutely not the entirety of the movement – but it has become a big enough piece of the movement, and sufficiently dangerous to anybody who doesn’t share their views, that I think it really needs talking about and can’t be dismissed as “a few bad apples”. [...]

I will sometimes complain about “feminists” in a way that doesn’t necessarily mean the millions of feminists who follow good discussion norms and treat other people with respect. I’m trying to generalize less now and be much more precise about how I mean only a certain strain, but I have left the older posts untouched.

Libertarianism

I feel pretty okay about both being sort of a libertarian and writing an essay arguing against libertarianism, because the world generally isn’t libertarian enough but the sorts of people who read long online political essays generally are way more libertarian than can possibly be healthy.[20]

Communism

He is highly critical of communism, and has more generally been persistently critical of what he views as millenarian ideologies, i.e., those holding that a catastrophe will destroy the current system, then (handwave) a new Golden Age will arise from the ashes.[21] Much as with neoreaction, this hasn't stopped him from writing long book reports and getting very interested in, for example, the details of central planning in the USSR.[22]

Existential risks

Alexander believes that the risks of superintelligent AIs (e.g. the risk of one misconstruing our goals and turning us all into paperclips) have been repeatedly misrepresented and downplayed by the media; that while immediate disaster is unlikely, the threat is worth taking seriously; and that now is a good time to research it.[23]

However, Alexander, who echoes the views of the Machine Intelligence Research Institute (MIRI), Stephen Hawking, Elon Musk, and Nick Bostrom on this, is neither an AI researcher nor a computer scientist (and the same goes for most of the "researchers" at MIRI, including Eliezer Yudkowsky). An actual AI researcher, Richard Loosemore, has criticized the assumptions behind many of the MIRI-style superintelligent-AI doomsday scenarios, pointing out that an AI which believed it had correctly interpreted the core goals of humanity yet got them so hideously wrong would not in fact be worthy of the name "intelligent" at all, and that this is not merely a naming issue but a basic design issue for AIs.[24]

Effective altruism

Alexander finds the logic of effective altruism difficult to accept intellectually, having come up with a very counterintuitive thought experiment about it, but is inclined to offer the movement his moral support anyway.[citation needed] He is a big supporter of charity on similar grounds and often gives speeches on efficient charity,[25] and he currently supports Giving What We Can, a project which attempts to separate effective charities from inefficient ones.[26]

Race and IQ

Alexander identifies with the 'hereditarian left',[27] and considers The Bell Curve co-author Charles Murray to be a close ideological ally.[28][29] He has also expressed support for Gregory Cochran and Henry Harpending's hypothesis that the frequency of congenital diseases among Ashkenazi Jews (of which Alexander is one) is caused by selection for intelligence.[30] There is almost nothing he won't try to apply human biodiversity to, e.g. Harry Potter.[31]

The Slate Star Codex comments section and the /r/slatestarcodex subreddit are even more extreme on this issue, with Cochran, Steve Sailer, and Emil O. W. Kirkegaard all having taken part in the discussion.

/r/slatestarcodex

As usual, you can make anything worse by adding Reddit. /r/slatestarcodex is an unofficial fan forum for the blog; Scott comments occasionally, but he didn't start it and doesn't run it (he's nominally a moderator but doesn't do anything). The culture wars (a regular weekly thread) and scientific racism, er, "human biodiversity" are regular and upvoted topics. Of course, much more offensive than the racism is objecting to the racism, which gets you a day's ban.[32] According to one moderator, "A belief in HBD doesn’t automatically equate to racism", somehow.

The moderators have a partial registry of bans.[33]

In popular culture

Dark Enlightenment philosopher Nick Land's 2014 psychological horror novella Phyl-Undhu features a technological cult reminiscent of LessWrong and a character called "Alex Scott" who expresses some of Scott's ideas on the Doomsday Hypothesis, complete with an intelligence at the end of time that you can communicate with, and a cultist, pushed out of the cult, who "wants to have not thought certain things."

References

  1. Five Years And One Week Of Less Wrong
  2. https://wiki.lesswrong.com/wiki/Rationalist_movement
  3. https://twitter.com/slatestarcodex
  4. http://slatestarscratchpad.tumblr.com/
  5. The Anti-Reactionary FAQ
  6. http://slatestarcodex.com/2015/01/01/untitled/
  7. http://www.newstatesman.com/laurie-penny/on-nerd-entitlement-rebel-alliance-empire
  8. Amanda Marcotte (December 30, 2014). "MIT professor explains: The real oppression is having to learn to talk to women". http://www.rawstory.com/2014/12/mit-professor-explains-the-real-oppression-is-having-to-learn-to-talk-to-women/
  9. http://www.scottaaronson.com/blog/?p=2091#comment-326664
  10. Less Wrong meetup groups
  11. Answer by Caio Camargo to "How true is the statement 'the comment threads on Slate Star Codex are a nightmare to read through'?"
  12. http://www.patheos.com/blogs/marginoferr/2015/06/16/the-irregular-symmetry/
  13. The Wonderful Thing about Triggers. Slate Star Codex, May 30, 2014
  14. http://slatestarcodex.com/2013/04/20/social-justice-for-the-highly-demanding-of-rigor/
  15. http://slatestarcodex.com/2015/12/01/setting-the-default/
  16. [1]
  17. He specified in the comments: The word “sane” in that context should not be taken to mean “stupid” or even “holds stupid views”, but rather “willing to hold rational discussions about their views with someone they are tempted to consider an evil enemy, based on the Principle of Charity” ([2]). The evil enemy in question being neoreactionaries.
  18. Scott Alexander, Radicalizing the Romanceless. Slate Star Codex, August 31, 2014
  19. He apparently regrets the popularity of this phrase, saying "NO NEED TO TAKE THIS ONE SENTENCE OUT OF CONTEXT AND TRY TO SPREAD IT ALL OVER THE INTERNET", though it really doesn't improve at all with context.
  20. All Debates Are Bravery Debates by Scott Alexander (June 9, 2013) Slate Star Codex.
  21. http://slatestarcodex.com/2016/09/28/ssc-endorses-clinton-johnson-or-stein/
  22. http://slatestarcodex.com/2014/09/24/book-review-red-plenty/
  23. http://slatestarcodex.com/2015/05/29/no-time-like-the-present-for-ai-safety-work/
  24. http://ieet.org/index.php/IEET/more/loosemore20140724
  25. http://slatestarcodex.com/2013/04/05/investment-and-inefficient-charity/
  26. https://www.givingwhatwecan.org/
  27. https://slatestarcodex.com/2017/05/09/links-517-rip-van-linkle/
  28. https://slatestarcodex.com/2016/05/23/three-great-articles-on-poverty-and-why-i-disagree-with-all-of-them/
  29. https://slatestarcodex.com/2017/04/12/clarification-to-sacred-principles-as-exhaustible-resources/
  30. https://slatestarcodex.com/2017/05/26/the-atomic-bomb-considered-as-hungarian-high-school-science-fair-project/
  31. Despite apparently never having read them. He can tell you all about Jensen, though.
  32. subthread (archive)
  33. https://www.reddit.com/r/slatestarcodex/wiki/bans