
Talk:LessWrong




Doing something for people distressed by the basilisk

moved to the shiny new Talk:Roko's basilisk

Noting for future use

  • Post on IEET:[1] (can't find any other detail on this)
MIRI/SIAI tried to “take over” the transhumanist group HumanityPlus 3.5 years ago, when four SIAI members ran for H+’s Board. SIAI ran a sordid, pushy, insulting campaign, bribing voters, accusing opponents of “racism”, deriding Board members as “freaky… bat-shit crazy [with] broken reasoning abilities.” MIRI failed in their attempt to colonize H+, but they’ve successfully wormed their way into the heart of EA.
  • Transhumanism wasn't always as furiously right-wing as it is now. A similar colonisation happened in 2008-2009, when the libertarians moved in and took over from the more socialist types. From The Politics of Transhumanism and the Techno-Millennial Imagination, 1626–2030 by James J. Hughes (a PDF I have here):

The elective affinity between libertarian politics and Singularity can be partly explained by the idea of technological inevitability. Collective agency is not required to ensure the Singularity, and human governments are too slow and stupid to avert the catastrophic possibilities of superintelligence, if there are any. Only small groups of computer scientists working to create the first superintelligence with core “friendliness code” could have any effect on deciding between catastrophe and millennium.

This latter project, building a friendly AI, is the focus of the largest Singularitarian organization, the Singularity Institute for Artificial Intelligence (SIAI), headed by the autodidact philosopher Eliezer Yudkowsky. In “Millennial Tendencies in Responses to Apocalyptic Threats” (Hughes 2008), I parse Yudkowsky and the SIAI as the “messianic” version of Singularitarianism, arguing that their semi-monastic endeavor to build a literal deus ex machina to protect humanity from the Terminator is a form of magical thinking. The principal backer of the SIAI is the conservative Christian transhumanist billionaire Peter Thiel. Like the Extropians, Thiel is an anarcho-capitalist envisioning a stateless future and funder of the Seasteading Foundation, which works to create independent floating city-states in international waters. He also is the principal funder of the Methuselah Foundation, which works on anti-aging research. In 2011 and 2012 Thiel was the principal financier of the SuperPAC backing libertarian Republican Ron Paul, and he supports other conservative foundations and political projects on the right.

In 2009 the libertarians and Singularitarians launched a campaign to take over the World Transhumanist Association Board of Directors, pushing out the Left in favor of allies like Milton Friedman’s grandson and Seasteader leader Patri Friedman. Since then the libertarians and Singularitarians, backed by Thiel’s philanthropy, have secured extensive hegemony in the transhumanist community. As the global capitalist system spiraled into the crisis in which it remains, partly created by the speculation of hedge fund managers like Thiel, the left-leaning majority of transhumanists around the world have increasingly seen the contradiction between the millennialist escapism of the Singularitarians and practical concerns of ensuring that technological innovation is safe and its benefits universally enjoyed. While the alliance of Left and libertarian transhumanists held together until 2008 in the belief that the new biopolitical alignments were as important as the older alignments around political economy, the global economic crisis has given new life to the technoprogressive tendency, those who want to organize for a more egalitarian world and transhumanist technologies, a project with a long Enlightenment pedigree and distinctly millenarian possibilities.

In surveys I conducted in 2003, 2005, and 2007 of the global membership of the World Transhumanist Association, left-wing transhumanists outnumbered conservative and libertarian transhumanists 2-to-1 (Humanity+ 2008). By 2007 16 percent of respondents specifically self-identified as “technoprogressive.”

  • su3su2u1 on LessWrong's strongly anti-scientific viewpoint, and how this is not just incidental:[2]

- David Gerard (talk) 17:25, 6 August 2015 (UTC)

A great example of the anti-science shit is the time Yudkowsky waded into nuclear physics; I ranted about it: [3]. TL;DR: Yudkowsky skimmed through some popular book about the Manhattan Project, skipped any bits that were even remotely technical, and got the feeling that it must be really easy and thus scientists are idiots. Dmytry (talk) 19:28, 4 October 2015 (UTC)

Peter Thiel is a Trump delegate

[4]. Hipocrite (talk) 21:01, 25 May 2016 (UTC)

It's irrelevant. FuzzyCatPotato of the Dazzling Suzukis (talk/stalk) 22:16, 25 May 2016 (UTC)
Actually, Yudkowsky's been going well out of his way (on Facebook, so not reliably linkable from here) to loudly say how it's not relevant at all and that Thiel totally isn't even a Trump supporter (which doesn't match with Thiel's well-documented political views), so it may be more relevant than you think - David Gerard (talk) 08:50, 26 May 2016 (UTC)

Of course the LessWrong crowd is pro-Gamergate

Source. 204.11.142.106 (talk) 15:52, 7 June 2016 (UTC)

which is the correct way to be. — Unsigned, by: 70.198.56.162 / talk

LW on RW

[5]: as 'the rat officer (tailed grade)' would say 'Meow.' 82.44.143.26 (talk) 19:37, 1 November 2016 (UTC)

This is known -- it's from 2012. Herr FuzzyKatzenPotato (talk/stalk) 21:28, 1 November 2016 (UTC)

Referring to the Less Wrong article or [6]/[7]? (Mild timewaster warning on the latter) 82.44.143.26 (talk) 18:05, 2 November 2016 (UTC)

Both. FuzzyCatPotato of the Prepossessing Furnaces (talk/stalk) 21:24, 2 November 2016 (UTC)

I don't understand this section

The inner reaches of LessWrong can get a little ... insular. Here's Michael Vassar, former President of MIRI and an active member of the Bay Area transhumanist community, talking to Harper's in 2014:[1]

It was getting late. I asked him about the rationalist community. Were they really going to save the world? From what?

“Imagine there is a set of skills,” he said. “There is a myth that they are possessed by the whole population, and there is a cynical myth that they’re possessed by 10 percent of the population. They’ve actually been wiped out in all but about one person in three thousand.” It is important, Vassar said, that his people, “the fragments of the world,” lead the way during “the fairly predictable, fairly total cultural transition that will predictably take place between 2020 and 2035 or so.”

Using these principles, Vassar founded the failed medical advice startup MetaMed, which was built on the assumption that rational thinking from first principles would beat the conventions of the field, and rapidly went bust.

What about the quote is "insular"? What "principles" from the quote were used in MetaMed? The quote seems to be saying, "actually us rationalists are a super-duper minority, also the apocalypse is coming" -- not what either of the taglines says. Herr FuzzyKatzenPotato (talk/stalk) 23:51, 5 June 2017 (UTC)

the "his people" part implies that the rare rational people are the people on the site and everyone else is not only helplessly irrational but helpless when it comes to the problems of the future.Vorarchivist (talk) 13:58, 12 October 2017 (UTC)