
Talk:LessWrong


This Internet-related article has been awarded BRONZE status for quality. It's getting there, but could be better with improvement.

This article is of MID importance to the wiki.

See RationalWiki:Article rating for more information.



This page is automatically archived by Archivist
Archives for this talk page: <1>, <2>, <3>, <4>, <5>, <6>, <7>, <8>, <9>, <10>, <11>, <12>, <13>, <14>, <15>, <16>, <17>, <18>, <19>, <20>


Doing something for people distressed by the basilisk

moved to the shiny new Talk:Roko's basilisk

SomethingAwful thread

[1] - includes potentially useful criticisms of LW and of this very article (e.g. [2]) - David Gerard (talk) 13:31, 13 July 2014 (UTC)

Crossing the cult event horizon

I've finally given up looking for anything redeeming in LessWrong after a knowledgeable critic - who I'm refraining from naming here - posted saying he couldn't deal with ongoing harassment across the net from LW members, as it's been affecting his health - I've seen them stalking his posts and calling him a liar wherever they find him, and I would shrug this off but I certainly wouldn't expect anyone else to - and asking how it could stop. Yudkowsky responded with something that reminded me of Scientology's steps A-E. (Which are similarly something Scientology does when its harassment gets a victim to the begging stage.) Am I overreacting, or is this actually creepy and borderline insane?

(This is why a pile of reference links on Roko's basilisk just went blank.) - David Gerard (talk) 16:31, 18 December 2014 (UTC)

*jaw drops* I think that at least a part of the response belongs as a quote in the article on Yudkowsky.--ZooGuard (talk) 19:22, 18 December 2014 (UTC)
Honestly, this feels like it crosses a line to racketeering, since by making demands, Yudkowsky crossed a line to being an active proponent of the harassment. I've always been mildly amused at LW, but "Admit I'm the best or harassment continues" is not a position that should be tolerated in a sane country. Ikanreed (talk) 19:33, 18 December 2014 (UTC)
I had never heard of LW until reading the article on RW a few years back, so I went and gave it a look. For reasons that are irrelevant to this discussion, I came to dislike the LW community with some rapidity. This example of cultish behavior is not surprising in the slightest. --Inquisitor (talk) 19:45, 18 December 2014 (UTC)
Well, I mean, it's kinda what happens any time a bunch of atheists get together and accidentally start a religion. The community's central ideas become immutable truths. Honestly, I appreciate the existence of LessWrong, because it gives our dark, corporate-controlled cyberpunk future that inevitable cult of computer worshipers that the genre demands. The fact that they aren't a motorcycle gang too is a bit depressing, though. Ikanreed (talk) 19:53, 18 December 2014 (UTC)
On the one hand, he was speaking for himself there, and he pointed out that "I do not control anyone except myself." On the other hand, he indirectly controls people other than himself. He's in a position of being able to set a (small) mob on someone, but I don't know if he could get the same mob to stop. (We've already seen how bad he is at Internet PR...) Player 03 (talk) 21:45, 2 September 2015 (UTC)

Yeah, I've always gotten the creepy cult vibe from LW, especially given the way that site regulars seem to have a compelling need to bring up Yudkowsky and the site in pretty much any context. "Greetings, sir, have you heard the Gospel of Yudkowsky?" Any bets on when one of them will start a compound and call for other LWies to move there? Unfortunately they'll probably wind up tarring the whole rationalist/skeptic subculture by association if they get into the public eye, but whatcha gonna do. In a nice bit of irony, the champions of rationalism are displaying many of the same irrational behaviors they focus so much on criticizing: leader worship, shunning outsiders, believing that things will happen if they wish really hard. --Ymir (talk) 00:34, 19 December 2014 (UTC)

Yeah it definitely felt "off" to me as well. But I freely admit this is due in part to my own biases. To me, a community of individuals that is earnestly seeking to be "less wrong" would sound like: "Hey! I discovered this cool thing today! Anybody else know about this? Let's talk about it!". Instead LW read a lot like "Gather round children and come bathe in the master's newest revelation. Oh, you disagree? How charming. Go and drink deeply of the master's Sequences. Once you have understood and accepted them in fullness, you too will be welcomed into the body." --Inquisitor (talk) 01:06, 19 December 2014 (UTC)

I suspect this article needs another rewrite. I am probably cursed with the greatest quantity of detailed information, but might come across as excessively pissed off. Anyone else feeling trepidatious? - David Gerard (talk) 15:09, 19 December 2014 (UTC)

The recent name-change to MIRI also gives off a "We're not nutbars, honestly!" vibe. Nebuchadnezzar (talk) 22:32, 19 December 2014 (UTC)
That was actually a deal with Ray Kurzweil - people kept assuming they were part of his organisation and there was a lot of confusion, so they sold the name, singularity.org and the Singularity Summit to him (quite friendly, they all know each other and he's on their advisory board) - David Gerard (talk) 22:53, 19 December 2014 (UTC)

The quote isn't representative of the community's response, and when I attempted to mention other users' responses, David Gerard removed it. If this is going to be about Yudkowsky in particular, why isn't it here? Player 03 (talk) 21:49, 2 September 2015 (UTC)

Dunno. You might ask DG directly. Mʀ. Wʜɪsᴋᴇʀs, Esϙᴜɪʀᴇ (talk/stalk) 22:40, 2 September 2015 (UTC)

Harassment or SIWOTI?

The opening paragraph from the post in question: "You may know me as the guy who posts a lot of controversial stuff about LW and MIRI. I don't enjoy doing this and do not want to continue with it. One reason being that the debate is turning into a flame war. Another reason is that I noticed that it does affect my health negatively [...]"

So when he says "it does affect my health negatively," what does "it" refer to?

In the first sentence, he talks about how he posts controversial stuff, and in the second, he says he doesn't enjoy doing so anymore. So maybe "posting controversial stuff" is what was hurting his health. (In other words, he's saying he has SIWOTI syndrome, and he's getting health problems because he's so addicted to refuting claims.)

But then again, the third sentence brings up a debate-turned-flame war, which could certainly be a source of stress. This is right before "it does affect my health," so the flame war has to be the "it," right? And a lopsided flame war probably does count as harassment...

Except that that's ignoring the "one reason...another reason" construction. Sentences three and four are separate, which is why I think the "it" refers to sentence one instead.

tl;dr: The page says "He posted to LessWrong saying that the ongoing harassment had been affecting his health," but when I read his post, he seems to be saying something else entirely. I suggest replacing it with something like "He posted to LessWrong saying that the stress of debating an entire community was affecting his health." Player 03 (talk) 23:38, 2 September 2015 (UTC)

Noting for future use

  • Post on IEET:[3] (can't find any other detail on this)
MIRI/SIAI tried to “take over” the transhumanist group HumanityPlus 3.5 years ago, when four SIAI members ran for H+’s Board. SIAI ran a sordid, pushy, insulting campaign, bribing voters, accusing opponents of “racism”, deriding Board members as “freaky… bat-shit crazy [with] broken reasoning abilities.” MIRI failed in their attempt to colonize H+, but they’ve successfully wormed their way into the heart of EA.
  • Transhumanism wasn't always as furiously right-wing as it is now. A similar colonisation happened in 2008-2009, when the libertarians moved in and took over from the more socialist types. From THE POLITICS OF TRANSHUMANISM AND THE TECHNO-MILLENNIAL IMAGINATION, 1626–2030 by James J. Hughes (a PDF I have here):
The elective affinity between libertarian politics and Singularity can be partly explained by the idea of technological inevitability. Collective agency is not required to ensure the Singularity, and human governments are too slow and stupid to avert the catastrophic possibilities of superintelligence, if there are any. Only small groups of computer scientists working to create the first superintelligence with core “friendliness code” could have any effect on deciding between catastrophe and millennium.

This latter project, building a friendly AI, is the focus of the largest Singularitarian organization, the Singularity Institute for Artificial Intelligence (SIAI), headed by the autodidact philosopher Eliezer Yudkowsky. In “Millennial Tendencies in Responses to Apocalyptic Threats” (Hughes 2008), I parse Yudkowsky and the SIAI as the “messianic” version of Singularitarianism, arguing that their semi-monastic endeavor to build a literal deus ex machina to protect humanity from the Terminator is a form of magical thinking. The principal backer of the SIAI is the conservative Christian transhumanist billionaire Peter Thiel. Like the Extropians, Thiel is an anarcho-capitalist envisioning a stateless future and a funder of the Seasteading Foundation, which works to create independent floating city-states in international waters. He is also the principal funder of the Methuselah Foundation, which works on anti-aging research. In 2011 and 2012 Thiel was the principal financier of the SuperPAC backing libertarian Republican Ron Paul, and he supports other conservative foundations and political projects on the right.

In 2009 the libertarians and Singularitarians launched a campaign to take over the World Transhumanist Association Board of Directors, pushing out the Left in favor of allies like Milton Friedman’s grandson and Seasteader leader Patri Friedman. Since then the libertarians and Singularitarians, backed by Thiel’s philanthropy, have secured extensive hegemony in the transhumanist community. As the global capitalist system spiraled into the crisis in which it remains, partly created by the speculation of hedge fund managers like Thiel, the left-leaning majority of transhumanists around the world have increasingly seen the contradiction between the millennialist escapism of the Singularitarians and practical concerns of ensuring that technological innovation is safe and its benefits universally enjoyed. While the alliance of Left and libertarian transhumanists held together until 2008 in the belief that the new biopolitical alignments were as important as the older alignments around political economy, the global economic crisis has given new life to the technoprogressive tendency, those who want to organize for a more egalitarian world and transhumanist technologies, a project with a long Enlightenment pedigree and distinctly millenarian possibilities.
In surveys I conducted in 2003, 2005, and 2007 of the global membership of the World Transhumanist Association, left-wing transhumanists outnumbered conservative and libertarian transhumanists 2-to-1 (Humanity+ 2008). By 2007, 16 percent of respondents specifically self-identified as “technoprogressive.”
  • su3su2u1 on LessWrong's strongly anti-scientific viewpoint, and how this is not just incidental:[4]

- David Gerard (talk) 17:25, 6 August 2015 (UTC)

A great example of the anti-science shit is the time Yudkowsky waded into nuclear physics; I ranted about it: [5]. TL;DR: Yudkowsky skimmed through some popular book about the Manhattan Project, skipped any bits that were even remotely technical, and came away with the feeling that it must be really easy and thus that scientists are idiots. Dmytry (talk) 19:28, 4 October 2015 (UTC)

Peter Thiel is a Trump delegate

[6]. Hipocrite (talk) 21:01, 25 May 2016 (UTC)

It's irrelevant. Fuzzy "Cat" Potato, Jr. (talk/stalk) 22:16, 25 May 2016 (UTC)
Actually, Yudkowsky's been going well out of his way (on Facebook, so not reliably linkable from here) to loudly say how it's not relevant at all and that Thiel totally isn't even a Trump supporter (which doesn't match with Thiel's well-documented political views), so it may be more relevant than you think - David Gerard (talk) 08:50, 26 May 2016 (UTC)

Of course the LessWrong crowd is pro-Gamergate

Source. 204.11.142.106 (talk) 15:52, 7 June 2016 (UTC)
