Talk:LessWrong/Archive19


This is an archive page, last updated 3 May 2016. Please do not make edits to this page.

Peer review[edit]

The following sentence from the 'Finance' section, which I am going to remove now, is factually incorrect:

...although in 10 years nothing has been published in a peer reviewed journal.

A list of peer-reviewed papers is here (see especially the last comment at the bottom).

Let us please try to be charitable in this respect. XiXiDu (talk) 12:11, 6 August 2013 (UTC)

Thank you for the improvement to the article.--ADtalkModerator 13:27, 6 August 2013 (UTC)
By the way, note that "Artificial General Intelligence" is a book by Goertzel and Pennachin, not a peer reviewed journal. "Global Catastrophic Risks" is likewise a book. So the sentence was not factually incorrect until 2012, when they published two papers in the "International Journal of Machine Consciousness", a journal which, as you can guess from the name, publishes philosophical speculation about machine consciousness rather than technical papers about AI. Dmytry (talk) 21:27, 8 August 2013 (UTC)

Total utilitarianism vs. utilitarianism[edit]

The following sentence in the Roko's Basilisk section seems to be overloaded:

Yudkowsky has also advocated total utilitarianism, under which it would be justified to torture one person for 50 years to prevent dust specks in the eyes of sufficiently large numbers of people.

First of all, Yudkowsky claims to be an average utilitarian. Secondly, the distinction between average and total utilitarianism seems to be irrelevant with respect to choosing torture in the scenario outlined by Yudkowsky.

I am going to change it to simply 'utilitarianism' now. Please let me know if you disagree with that change, and why. XiXiDu (talk) 15:34, 6 August 2013 (UTC)

Sounds good enough, not that average utilitarianism is any less problematic than total (total utility will peak at some reasonable tradeoff between the number of people and quality of life, whereas average utility will peak at one highest-utility person). This answer to a 3^^^3 dilemma would by itself be rather harmless (if fairly ill-justified and somewhat bizarre), but combined with their other beliefs about the future, you get all sorts of crazy stuff. Dmytry (talk) 08:38, 7 August 2013 (UTC)
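A minimal sketch of that peak claim, under toy assumptions of my own: a fixed resource budget R split evenly among n people, with per-person utility log(share), which goes negative below a subsistence share of 1. The utility function and numbers are illustrative assumptions, not anything from LW.

    import math

    R = 1000.0  # hypothetical total resource budget (an assumption)

    def person_utility(share):
        # toy log utility; negative below a subsistence share of 1
        return math.log(share)

    ns = range(1, 1001)
    best_total = max(ns, key=lambda n: n * person_utility(R / n))  # total view
    best_average = max(ns, key=lambda n: person_utility(R / n))    # average view

    print(best_total)    # 368, i.e. ~R/e: an interior population/quality tradeoff
    print(best_average)  # 1: everything concentrated on a single best-off person

Under these assumptions the total peaks at n = R/e, an interior tradeoff between headcount and quality of life, while the average is maximized by concentrating everything on one person, which is the contrast the comment above describes.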
They seldom follow their own beliefs. Pascal's mugging is just one example. Another example is that nobody uses Bayesian probability except when facing well-defined problems. If you read Yudkowsky's answers in this thread you'll notice that he's acting based on intuition, rather than trusting the math.
I've been noticing this from the very beginning. There is a lot of cognitive dissonance (click the link for how friendly AI might lead to worse outcomes than unfriendly AI). The most recent example is their work on the Löbian Obstacle Problem, in conjunction with their belief that an AI undergoing recursive self-improvement is a danger because, among other things, it wants to protect its utility function. If an AI is unable to predict, with high confidence, that an improved version is going to preserve its values (e.g. maximizing paperclips), then it won't improve itself, because an improved version of itself with different values has negative expected utility. This means that if they solve that problem, they make unfriendly AI more likely. XiXiDu (talk) 10:26, 7 August 2013 (UTC)
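A back-of-envelope version of that expected-utility argument, with entirely made-up placeholder numbers (the utilities and confidence levels are assumptions for illustration, not from anyone's actual work):

    # toy expected-utility check for the self-improvement argument above
    U_STATUS_QUO = 100.0   # utility of not self-modifying (placeholder)
    U_PRESERVED = 1000.0   # utility if the successor keeps its values (placeholder)
    U_DRIFTED = -10000.0   # utility if the successor's values drift (placeholder)

    def eu_of_modifying(p_preserve):
        return p_preserve * U_PRESERVED + (1 - p_preserve) * U_DRIFTED

    for p in (0.5, 0.9, 0.99, 0.999):
        print(p, eu_of_modifying(p) > U_STATUS_QUO)  # False, False, True, True

On these numbers self-modification only beats the status quo above roughly 92% confidence in value preservation, which is the shape of the argument: the harder the preservation problem, the less any goal-directed AI would self-improve.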

Improving the RationalWiki entry for LessWrong (?)[edit]

I know nothing about LW so it's down to someone who does to check this. Scream!! (talk) 19:30, 8 August 2013 (UTC)

They did have 2 publications in "a peer reviewed journal" in 2012 and a few in 2013, so that passage became obsolete. But "turned out to be factually incorrect" sounds like Alexander Kruel got misled by the wording of their post and thought RW had stated facts that were never correct. This is not the case - the pre-2012 "peer reviewed" publications were not in a peer reviewed journal but in a book, edited by someone they had thrown money at, and yes, that makes a difference. Yudkowsky's utilitarianism, which shifts between valuing a vast interstellar civilization highly because of its sheer size and being described as average utilitarianism, can't be pinned down as either total or average, so that change is correct. As for the disengagement from the practical: they do have rather hilarious practical ideas, such as hacking a dog food dispenser to dispense treats to combat procrastination via conditioning, or, more harmfully, self-medicating with stimulant drugs (including illegal ones) to improve functioning. Dmytry (talk) 11:38, 9 August 2013 (UTC)
There are talk sections above, made for each of the changes. They've been pretty solid so far.--ADtalkModerator 12:53, 9 August 2013 (UTC)

Criticism and reply[edit]

This one is funny.
ArisKatsaris:

Rationalwiki has repeatedly proven itself utterly uninterested in factuality, accuracy or fair representation of whatever it is that it talks about. I've had personal experience with that in their LessWrong thread, where it went as follows: (1) I remarked in their talk page about several factual errors (factual falsehoods, not just issues of tone or fairness) (2) one of their better members does correct one of them (3) two others revert it back to the falsehood, while the rest of them make fun of me in the talk page for actually challenging their falsehoods, (and the remaining falsehoods remain unaltered).

Rationalwiki should be considered a den of malicious liars: It was created as an American Democrat response to the Republicans' Conservapedia, and it has to a degree cloned its ethos from those malicious liars, while supporting the nominally opposite political side.

And as for "rationalists", they're the sort of rationalists who think that the pinnacle of "rationalism" is atheism and will mock any idea more radical than atheism.

Politus:

Disclaimer: I am not a RationalWiki member, all of my findings are from the last hour or so as I've been reading (and chuckling) at the RationalWiki articles on EY, LessWrong, and Poko's Gorgon (or, in the local parlance, It-That-Shall-Not-Be-Named). The talk pages are particularly amusing, because watching self-described rationalists (of the RW and LW varieties) get into plebeian flamewars tickles the Discordian in me.

That being said, take Aris with a grain of salt. Participation in the LessWrong party aside, he's not exactly the most independent and objective source on RationalWiki. I've been reading through the talk pages on these articles and he's in all of them, and in all of them he comes across as the Platonic ideal of a -if you'll pardon the term- butthurt fanboy. Whenever anyone says stuff about EY, or LW, or questions both, or questions the plausibility of some facet of the LW-verse, or cracks wise at their expense, Aris is there to fight them tooth and nail. Some tools in his arsenal include the classic (paraphrase) "Hahaha wow that was funny thanks for the laugh" and the "I'M obsessed? YOU'RE obsessed!" His flame wars with Dmytry are the sort of classic, pseudo-rational froths, devoid of self-awareness that used to be the bread and butter of /r/subredditdrama before the advent of contemporary metadrama.

At one point, when someone goes so far as to humorously refer to Poko's Gorgon as LW's equivalent to a scary campfire story - "scary campfire stories for bored amateur philosophers" to be precise - Aris does the mature and rational thing. He, with all the humor of a German bureaucrat, goes "So now you're complaining because LessWrong is NOT banning scary stories?" and (I'm assuming) stormed up to his room, slammed his door shut, and started listening to Avril Lavigne.

Hehe.--ADtalkModerator 04:33, 28 August 2013 (UTC)

Yudkowsky SAT scores[edit]

Excuse me if this post is a little off topic, but this talkpage seems to get a bit of activity. It has been claimed by several people, including Yudkowsky himself (in his autobiography, I think), that he scored 1410 on the SATs at the age of 11 and 1600 at the age of 15. This struck me as quite the achievement; being from Australia I have never taken the SATs, but I assume they are as difficult as any other university entrance exam. Problem is, the only reference I can find for this claim is buried in a book by Damien Broderick called 'The Spike'. From the synopsis it seems that the book is heavily sympathetic to the transhumanist worldview, and I think it's not unlikely that the author simply accepted the claim without further investigation. Furthermore, I can imagine these claims simply becoming part of the personality cult around Yudkowsky without anyone ever verifying them. Does anyone have any better evidence that Yudkowsky did indeed achieve these scores on the SAT? For anyone that knows better than me (which really would be anyone): is there a way to check other people's SAT scores, or will this claim forever live on as undisprovable? Tielec01 (talk) 01:37, 25 June 2013 (UTC)

Just to let y'all know, the SATs are scored out of 2400, so while I'm not saying 1600 is unimpressive for a 16-year-old, I don't think it nearly qualifies someone as a genius, considering that the test is designed for 17- and 18-year-olds and 1600 would really only qualify one for a mid-tier college.--107.3.149.35 (talk) 05:17, 3 October 2013 (UTC)
You're just dating yourself... the SATs changed their scoring methods in 2005. For those of us who took it before then (i.e. people who are older than 23 or so) a 1600 is very impressive.
Not that his score is relevant or interesting, mind you.--ADtalkModerator 12:28, 3 October 2013 (UTC)
Wikipedia has only that source too. Interesting... Osaka Sun (talk) 01:59, 25 June 2013 (UTC)
I don't know if SAT scores are a matter of public record, but I very much doubt it. When he presumably took the test, it was in two sections: math and verbal. I've been told that the mean score on each test is scaled to 500, with 100 points being one standard deviation. 800+800, or a total of 1600, would mean the person scored at least three standard deviations above the mean on both sections, to within the accuracy of the tests.
I believe there were a few kids in my somewhat selective high school who came close to 1600, with at least two or three 800 scores seen in my class of about 200. I may have been one of those kids, but nowadays it still costs me two bucks for a cup of coffee at the corner store. Just saying— mere mortals can score that high, just not many of them. I'm not sure if a score of 1600 should be categorized as "that's nice" or "BFD." Sprocket J Cogswell (talk) 02:09, 25 June 2013 (UTC)
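To put rough numbers on the scaled model described above (mean 500, SD 100 per section), here is a quick check; the normal model is only a ballpark, since real score distributions are lumpier and truncated at 800:

    from statistics import NormalDist

    z = (800 - 500) / 100           # 3 standard deviations above the mean
    tail = 1 - NormalDist().cdf(z)  # upper-tail probability under a normal model
    print(f"about 1 in {1 / tail:,.0f}")  # about 1 in 741 per section

Two 800s would be rarer still, though the sections are strongly correlated, so nowhere near the naive squared odds.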
Well I bombed out on my SAT equivalent and coffee costs me 5-7 dollars a cup. Could be a coincidence, or it could be because I live in a mining-driven boom town. It sounds plausible that Yudkowsky could have scored 1410/1600, and I would consider it impressive, albeit unrelated to his other claims. However, having spent a touch of time around high-IQ communities (not as a member), I am well accustomed to outlandish claims of IQ scores. Under further exploration these claims often fall apart; for example, "I was assessed by a psychologist to have an IQ of 160 but I never sat a test". Anyway, I shouldn't spend too much time speculating; I have no reason to think that Yudkowsky is embellishing the truth, except that the claim is quite extraordinary and he does have a habit of self-aggrandising. Tielec01 (talk) 03:25, 25 June 2013 (UTC)
Yeah... out of 100,000 people, one has 1-in-100,000 performance, and far more than one may make extraordinary claims. Dmytry (talk) 11:23, 25 June 2013 (UTC)
Are we getting a wee bit obsessive here? I await the debate over Yudkowsky's shoe size. Nebuchadnezzar (talk) 21:59, 25 June 2013 (UTC)
Yeah point taken Neb, but I reserve the right to esoteric curiosities. I assure you, if Yudkowsky claimed a size 22 foot I would probably chase it up too; especially given his habit of grandiose claims. In any case, it sounds like his SAT scores would be too hard to determine either way, so I'll give him the benefit of the doubt. Tielec01 (talk) 00:42, 26 June 2013 (UTC)
One of the people working there claimed (in a private conversation) that he got a scan of a document emailed to him by one of Yudkowsky's parents, for the 1410. Dunno how much evidence that is, though. It's quite a silly topic, yeah. Dmytry (talk) 07:41, 20 July 2013 (UTC)

So, if Yudkowsky never attended either high school or college, why did he need to take the SAT? And why does God need a starship? tuttlemsm 09:17, 4 August 2013 (UTC)

If my memory of an Extropians' mailing list post from long ago is accurate, he took the SAT (at a much earlier age than usual for college admission) to participate in Northwestern University's Midwest Academic Talent Search ("NUMATS"). JimF (talk) 14:58, 7 August 2013 (UTC)

I hit 1530, if memory serves, with no "studying" and no previous try, at 16. That qualifies me for Mensa, which I find to be fairly embarrassing. To them. 800 on the Physics test and over 600 on the French (both taken out of vanity). Vanity. Vanity seems to permeate the "high IQ" sector. It also permeates the "less wrong" sector, in my opinion. ħumanUser talk:Human 06:04, 6 August 2013 (UTC)