Talk:LessWrong/Archive4
What is bayesianism, hahahaha.
I didn't see this before:
http://lesswrong.com/lw/1to/what_is_bayesianism/
Would it be fair to say that they just reused the name of a dead mathematician for some ideological thing similar in nature to 'dianetics'? (edit: i.e. a bunch of truisms presented as an awesome revelation that justifies the crazy beliefs) That's pretty clever. Everyone with a clue gets distracted into arguing fine points of statistics. And you can't describe it with a short phrase like 'X sucks' without a zillion qualifiers. Dmytry (talk) 09:18, 11 June 2012 (UTC)
- Much like how people who love to go on about Occam's razor often show very little or no interest in what William of Occam himself thought about the subject. (((Zack Martin)))™ 20:25, 11 June 2012 (UTC)
- More like how people who go on about quantum anything typically know nothing about the actual math there and can't do an interferometer without confusing i with a phase of 180° (see the sketch at the end of this section). Dmytry (talk) 09:45, 12 June 2012 (UTC)
- Or how people who have never opened a physics book confidently declare that MWI is "obviously correct".--Baloney Detection (talk) 20:33, 12 June 2012 (UTC)
- The core tenet of "LW rationalism" (I think): without knowing the subtleties, rationalists know better. I like to picture mission statements like "devoted to refining the art of human rationality" as a spring pulling together two fenceposts in the ground, one going a hundred meters deep labelled "what we are doing" and the other shallowly buried in the sand, labelled "what we consider rationality to be". Add a tough string and the definition of rationality is sent flying. Dmytry (talk) 21:25, 12 June 2012 (UTC)
- Some people seem to think that because LWians call themselves "rational", they are so. They are rational in the same way as Objectivists are: in word but not in deed.--Baloney Detection (talk) 18:29, 17 June 2012 (UTC)
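(Editor's note, not part of the original thread: a minimal numerical sketch of the interferometer point above, assuming the standard symmetric 50/50 beamsplitter convention in which each reflection contributes a factor of i, i.e. a 90° phase shift. Swapping that factor for -1, a 180° shift, makes the beamsplitter non-unitary, so the output intensities no longer add up to the input power.)

```python
# Illustrative sketch only: a Mach-Zehnder interferometer built from two
# lossless symmetric 50/50 beamsplitters. Reflection contributes a factor
# of 1j (a 90-degree phase shift); substituting -1 (a 180-degree shift)
# breaks unitarity, so energy is no longer conserved.
import numpy as np

def mach_zehnder_outputs(phase, reflection_factor=1j):
    """Return (port1, port2) output intensities for unit input in one arm."""
    r = reflection_factor / np.sqrt(2)  # reflected amplitude at each beamsplitter
    t = 1 / np.sqrt(2)                  # transmitted amplitude
    # First beamsplitter splits the input; the upper arm picks up exp(i*phase).
    a_upper = r * np.exp(1j * phase)
    a_lower = t
    # Second beamsplitter recombines the two arms into the two output ports.
    out1 = t * a_upper + r * a_lower
    out2 = r * a_upper + t * a_lower
    return abs(out1) ** 2, abs(out2) ** 2

for phi in (0.0, np.pi / 2, np.pi):
    with_i = mach_zehnder_outputs(phi, reflection_factor=1j)       # i convention
    with_minus1 = mach_zehnder_outputs(phi, reflection_factor=-1)  # "180 degrees"
    print(f"phase={phi:.2f}  i: {with_i}  -1: {with_minus1}")
```

With the factor of i the two output intensities always sum to 1 (all the light leaves one port at zero path difference); with -1 the totals come out as 2 at zero phase and 0 at a phase of π, i.e. unphysical, which is the kind of i-versus-180° confusion the comment is pointing at.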
A critique of the friendly AI "problem"
I wrote a short essay about the supposedly urgent problem that the SIAI has set itself to solve: Essay:Why there is no problem of friendly AI. Constructive criticism very much appreciated!--Baloney Detection (talk) 22:37, 15 June 2012 (UTC)
Criticism of LW on Luke Muehlhauser's blog
On Luke Muehlhauser's (the executive director of the SIAI) now-defunct atheist blog (where he started promoting LW), some interesting criticism of LW came from commenters on this post[1]. Scroll down to the comment by antiplastic. I'll quote a part here:
- "Yudkowsky’s manifest incompetence in technical philosophy, his utter lack of engagement with the professional computer science community, his petulant dismissal of even mild criticism from Luke, his endless invocation of the same handful of catchphrases repeated reflexively by his followers (“taboo your words!” “bayes!” “Bayes!!!” “BAYES!!!!”), the creepy interest these “bootcamps” seem to take in the moral and spiritual railroading of the participants — none of these things seems to register.
- ...
- Speaking as the most militant atheist you will ever meet, it would have been better if Luke had remained some sort of moderated Christian with a host of silly supernatural beliefs who had nonetheless managed to outgrow the cult-mentality of fundamentalism, than to lose the former while hanging on to the latter like we’ve seen over the last few months. It’s especially painful because when he first appeared on the scene he was one of the more articulate and passionate and promising advocates for “my” side, and now he’s gone all Anakin on us."
Pretty damning!--Baloney Detection (talk) 23:42, 16 June 2012 (UTC)
- Yeah, except the discourse level on that blog was for shit, and most of the whining philosophers (including the one you quote) were butthurt Christians given to obfuscatory apologetics as if they were saying something that would actually refute atheism. I appreciate you're searching for any stick you can, but this isn't a good one - David Gerard (talk) 13:21, 17 June 2012 (UTC)
- I think it is true, though, that Luke just went from religion into a cult. Cults prey upon religious de-converts. He's been given truth in the form of EY the guru, who supposedly understands some badass mathematics of truth (Solomonoff induction), except not only does the guru not have a clue, there's a formal proof that you can't just 'do' Solomonoff induction (see the sketch at the end of this section), and in any case LW is no MathOverflow; nobody would post a hard math problem there and expect an answer. Dmytry (talk) 14:26, 17 June 2012 (UTC)
- That sounds like an interesting experiment, actually - David Gerard (talk) 16:59, 17 June 2012 (UTC)
- "most of the whining philosophers (including the one you quote) were butthurt Christians" Did you even read the quoted part? He wrote "Speaking as the most militant atheist you will ever meet", so no, not a Christian. A few other commenters were though. But keep sticking up for pseudoscience promoters (no, cryonics is not feasible, whatever Yudkowsky might tell you: http://www.quackwatch.com/04ConsumerEducation/QA/cryonics.html ) who have nightmares about robots.
- @Dmytry: Yes, I think that's true. Before Muehlhauser was into LW, he was pretty deeply into Alonzo Fyfe's desire utilitarianism (I couldn't find any difference between Fyfe's brand of utilitarianism and preference utilitarianism, which is what most modern utilitarians advocate, hence I fail to see what's so amazing about Fyfe's stuff) and often wrote about it on his blog. But I think he is fundamentally smart, and as the quoted blog commenter said, his engagement with LW will probably blow over in five years or so. He is all too prone to believing he is saving the world, leading to some odd behavior.--Baloney Detection (talk) 15:06, 17 June 2012 (UTC)
- Meanwhile he's making a crackpot organization less honest and better at collecting money, via not understanding the technical details even at the level of being an honest crackpot. Actually, now I see what is honest about Eliezer. He named his thing 'singularity institute' because all he ever worked on was the hyper-singularity scenario. Add someone like Luke - entirely technically clueless, to the point of being incapable of being an honest crank - and it's transitioning into a bona-fide fraud. edit: actually they have an article on that, 'belief in belief'; EY was or is a believer in some crackpot hypotheses, whereas Luke doesn't believe in the crackpottery, he just believes it's true. At which point he gets himself moral carte blanche to just go ahead and optimize strings (such as names and descriptions) for the purpose of maximizing the donations, at which he has no restraint, lacking the technical understanding to be an honest crank. Dmytry (talk) 05:24, 20 June 2012 (UTC)
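(Editor's note, not part of the original thread: for reference on the Solomonoff induction point above, a sketch of the standard definition and the formal sense in which it cannot just be 'done'. The Solomonoff prior of a string x sums 2^-|p| over every program p that makes a universal prefix machine output something beginning with x; evaluating that sum requires knowing which programs halt, so the prior is only lower-semicomputable: it can be approximated from below but never computed exactly.)

```latex
% Solomonoff prior under a universal prefix machine U; "x*" means any
% string extending x. The sum ranges over all programs that halt with
% such an output, which is where the halting problem (and hence the
% uncomputability) comes in.
M(x) \;=\; \sum_{p \,:\, U(p) \,=\, x*} 2^{-|p|}
```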
Yudkowsky as a crank
I checked Yudkowsky against some crank trait lists, and he certainly has more than his fair share of them.
Common characteristics of cranks (from Wikipedia[2])
- 1. Cranks overestimate their own knowledge and ability, and underestimate that of acknowledged experts.
- 2. Cranks insist that their alleged discoveries are urgently important.
- 3. Cranks rarely, if ever, acknowledge any error, no matter how trivial.
- 4. Cranks love to talk about their own beliefs, often in inappropriate social situations, but they tend to be bad listeners, being uninterested in anyone else's experience or opinions.
How Yudkowsky does, in my estimation:
1. Yes. 2. Yes. 3. Yes. 4. I don't know.
- Dunno about 3, I think not quite; plus people are dicks online in general. Dmytry (talk) 21:32, 17 June 2012 (UTC)
Wikipedia continues:
- Many cranks:
- 1. seriously misunderstand the mainstream opinion to which they believe that they are objecting,
- 2. stress that they have been working out their ideas for many decades, and claim that this fact alone entails that their belief cannot be dismissed as resting upon some simple error,
- 3. compare themselves with Galileo or Copernicus (or in a religious context, Noah), implying that the mere unpopularity of some belief is in itself evidence of plausibility,
- 4. claim that their ideas are being suppressed, typically by secret intelligence organizations, mainstream science, powerful business interests, or other groups which, they allege, are terrified by the possibility of their revolutionary insights becoming widely known,
- 5. appear to regard themselves as persons of unique historical importance.
Again my estimation of Yudkowsky's performance:
1. Yes. 2. No. 3. No. 4. No. 5. Sort of. He doesn't necessarily view himself as being of unique historical importance, but he clearly views his friendly AI stuff in that light.
- the fanboys do 2 for him, http://lesswrong.com/lw/cef/how_many_people_here_agree_with_holden_actually/6l5t . See +22 Dmytry (talk) 21:32, 17 June 2012 (UTC)
Martin Gardner in his Fads and Fallacies in the Name of Science[3] also had a list of crank characteristics:
- The second characteristic of the crank (which also contributes to his or her isolation) is the tendency to paranoia. There are five ways in which this tendency is likely to be manifested.
- 1. The pseudo-scientist considers himself a genius.
- 2. He regards other researchers as stupid, dishonest or both.
- 3. He believes there is a campaign against his ideas, a campaign comparable to the persecution of Galileo or Pasteur. He may attribute his 'persecution' to a conspiracy by a scientific 'masonry' who are unwilling to admit anyone to their inner sanctum without appropriate initiation.
- 4. Instead of side-stepping the mainstream, the pseudo-scientist attacks it head-on: The most revered scientist is Einstein so Gardner writes that Einstein is the most likely establishment figure to be attacked.
- 5. He has a tendency to use complex jargon, often making up words and phrases. Gardner compares this to the way that schizophrenics talk in what psychiatrists call 'neologisms', "words which have meaning to the patient, but sound like Jabberwocky to everyone else."
My estimation of Yudkowsky's performance below:
1. Unsure. Does he? His fans certainly do. 2. Yes (in the field of AI). 3. No. 4. Yes (except he is not attacking any particular scientist). 5. Yes.
Another characteristic of cranks, according to Gardner, is that they work almost in isolation from the scientific community (see the Wikipedia link for more on this). This is also true of Yudkowsky.
Do you disagree with my judgements on the various scores here? Are they correct? I can only base my judgements on what I have seen.--Baloney Detection (talk) 19:53, 17 June 2012 (UTC)
- Well, I don't consider him to be an honest crank TBH. Too much success getting money. More like a cult starter, with the cranky beliefs at the level of the dragon in the garage. Dmytry (talk) 21:32, 17 June 2012 (UTC)
- I'm conflicted about whether Yudkowsky is honest or not. Perhaps he once believed it, but now only keeps it going to make a living. On the regular job market, he doesn't have much to offer, really. Hardly anyone has any use for hand-waving about friendly AI. I do know, though, that certain LW beliefs are very likely to be false, almost certainly false (like cryonics and the singularity). I added a crank section on the Eliezer Yudkowsky entry, please help improve it.--Baloney Detection (talk) 21:55, 18 June 2012 (UTC)
- I think he honestly believes he might be right (as in, not certainly wrong), and honestly believes he's doing nothing morally wrong, but that sort of belief is usually produced by simply not caring enough to think things through. It may also be the case that he and the rest of SI don't see anything but material utilitarianism (where the utility is necessarily material in nature), see it as the only philosophical doctrine, or the best one, see anything not adhering to such a doctrine as not generally intelligent, and mix up their notion of intelligence with everyone else's. I.e. thorough confusion and the absence of any motivation to think anything through. I imagine a clinical psychopath of the kind that would be swindling people out of money just like this would necessarily be genuinely concerned about AI as well (edit: i.e. assuming he believes AI will be built), so the AI fear has got to be genuine in some way. Dmytry (talk) 06:33, 19 June 2012 (UTC)
Question to the defenders of LW on RW
I know there are quite a few here who I think have a net positive view of LW (David Gerard and Armondikov spring to mind), and who tend to water the criticism of LW down. While I'm all in favor of having multiple viewpoints around (though it does make the entry look contradictory), I'd like to know how you view the LW promotion of cryonics, singularitarianism, Bayes being opposed to science and such wooish beliefs. And of course the infamous Roko's basilisk. Do you agree with LW on these issues? Do you not? Why or why not?--Baloney Detection (talk) 18:45, 19 June 2012 (UTC)
- The good bits are good (the writing on cognitive biases is good stuff, as noted) and the bad bits are bad. You appear to be having difficulty comprehending that something can have good bits and stupid bits at the same time, insisting that anything must be entirely one or the other. This is incorrect - David Gerard (talk) 19:07, 19 June 2012 (UTC)
- It's quite possible that I'm falling victim to the human tendency of black-and-white thinking. But even so, do you endorse cryonics and singularitarianism? Do you believe Bayes' theorem is opposed to science?--Baloney Detection (talk) 19:10, 19 June 2012 (UTC)
- Do I endorse cryonics? Well, I wrote large chunks of cryonics and its talk page, so why don't you turn your baloney detector on the history of each - David Gerard (talk) 06:46, 20 June 2012 (UTC)
- One can probably find good and true bits in how Scientologists justify their nonsense. Pretty much any screwed-up ideology has a bunch of truisms and generally good things as its core tenets, too. First there was the AI belief and AI charity and cryonics. Then there came the writings on biases, created during the process of rationalizing why everyone else should be ignored. You got any clue about the topic? Availability heuristic. You are a programmer? Of course you're wrong about Solomonoff induction because you're used to making software fast (and someone who probably just read some popularization book would know better and have all sorts of earth-shattering God-disproving insights right away which somehow none of the mathematicians who came up with that stuff could see (note: I am an atheist, but I am sick of pseudomathematical disproofs of God just as much as I am sick of pseudomathematical proofs of God)). And so on and so forth. Also, what's so good about the biases work? I see made-up pop psychology that qualifies for pseudoscience status. Biases are complicated. I also see the belief that somehow the lack of biases would lead to truth, i.e. that the study of biases is improving the 'art of human rationality', etc. - a baseless assumption that doesn't even make much sense (even the complete absence of a particular form of stupidity is not intelligence, for there is a pretty much infinite number of ways to be stupid). Dmytry (talk) 22:36, 19 June 2012 (UTC)
- No, LessWrong is nothing like Scientology, and that's a completely silly comparison. And trivialises Scientology the way casually comparing people to Hitler trivialises the Nazis and what they did.
- I'll note here that I happen to know really lots and lots about Scientology, so I can speak knowledgeably about the comparison.
- L. Ron Hubbard was an amazing bullshitter. He was pretty smart, but his real genius was knowing how to manipulate people. He was a true sociopath: empathetic but not sympathetic; both willing and able to do whatever it took to get other people to do what he wanted. He had no ethics and no sense of true or false, and frequently believed his own lies.
- Scientology was founded after the failure of Dianetics: Hubbard wanted to make lots of money without work, so he came up with a popular psychology. When that collapsed (because you can't bullshit forever), he formed Scientology to hide behind the cover of religion. This was enough, and it did really alarmingly well, and was only finally taken out by the Internet. It still exists, but it's a cripple like Christian Science (which was the same kind of vicious in its day, though not to the same degree).
- Hubbard was also a terrible writer. In the pulp days, he churned out barely readable tosh that could be edited into printable form, and did so reliably in first drafts, so editors loved him. (The one good novel he wrote, Fear, was only good because John W. Campbell absolutely rode his arse to make it good.) He continued the same writing habits in Scientology. He made it up as he went along, and the contradictions, stupidities and impossibilities became a feature, as he got people to blame themselves for the failure of his stupidity. He didn't just let them do this, he made it a rule that if the text seems stupid, it's because you misunderstood a word. The mechanism was the basic religious one of "everything good is from the religion, everything bad is from me, I must try harder."
- Scientology was conceived as deeply evil shit and carried forward as deeply evil shit. It was a pretty efficient machine for separating marks from their money, for the glory of Hubbard.
- LessWrong isn't like any of that, in any way at all. The SIAI was an obscurity only transhumanists cared about. Yudkowsky started the blog Overcoming Bias with Robin Hanson (a non-Austrian economist at GMU who has two degrees in physics; he invented the idea of the prediction market) to blog about cognitive biases, and about what he thought about stuff, because he was blocked on writing and thought blogging every day or two would help him get this stuff out. What is now "The Sequences" was a bunch of blog posts in sequence, and didn't pretend to be anything else.
- The SIAI is not a cult headquarters. It's a semi-incompetent charity, as is really incredibly normal amongst charities. It's only just getting its shit together with Luke doing the daily running. Asking for money does not make an organisation inherently evil, even if they're asking for money for things you think are incredibly stupid; this does not constitute financial abuse.
- There are many lesswrongers who seem to treat it as if it were an object to worship. This is because humans are stupid, and because they're smart and socially inept kids who've never had an interest in their lives before. They usually get over it - I read all of LessWrong from the beginning to 2011 and saw names come and go on the typical online community cycle of 12-18 months, often with a dramatic door-slam. LessWrong is basically a waste of time, and is best consumed as Internet television.
- Scientology is basically the Godwin example of cults; comparing any damn thing to Scientology makes actually abusive cults that are not as bad as Scientology seem benign. LessWrong is not Scientology. Not even slightly. I know what cults are, and they really aren't it. I have looked at it very closely with this precise question in mind, because a friend was getting into LW jargon-spouting and several other friends and I were actually a bit worried. (I'm not worried now, and the friend is fine and still on LW.) If Yudkowsky really wanted a cult, he's quite smart enough and knows enough about human minds and the highly exploitable biases they are prone to that he could start one. Starting a cult is not hard at all - the main prerequisite is no ethics and not caring about truth or falsity. (There are, of course, people like me on the internet who simply delight in giving cults a hard time, so the environment is not so cult-friendly unless they cut people off from the start. Examples abound, but not very much on the internet.)
- Their stupidities are stupid, their fans are fanboys, their good bits are fine, that's quite sufficient. The stupidities fully warrant horselaughs, as well as the 10,000 syllogisms. You're mistaking stupidity for malice, and this is leading you to make silly comparisons - David Gerard (talk) 21:26, 21 June 2012 (UTC)
- Nebuchadnezzar (talk) 22:30, 21 June 2012 (UTC)
- Yeah, pretty much what I'd say. There's good, there's not-so-good. I'd be lying if I said that wasn't applicable to RationalWiki. Or back when I was on Star Trek forums. Or the theatre group I work with. Or my own research group. Or my own writing. Hanging on every word Yudkowsky says and bigging up LW as some hub of extreme pseudoscience is pretty much turning this page into Cult of Schlafly Mk2. Not entirely a bad thing, of course, as RW should always aim to diversify in who it pisses off, but as DG says, putting it on par with Scientology is laughable. bomination 23:16, 21 June 2012 (UTC)
- "Hubbard wanted to make lots of money without work" Well and EY wanted to make some money without work, and isn't so talented. The LWism got the bootcamps and thought reform and every other feature of a nasty cult (complete with doomsday prophecy), which all those things Armondikov speaks of do not have, but cults do. The scientology been extremely demonized but it's just the same mundane thing at larger scale and after longer time, and done by a more talented more ambitious asshole. edit: and also, lesswrong.com is just online support group where they draw dedicated folks from, into various rationality bootcamps n stuff Dmytry (talk) 07:19, 22 June 2012 (UTC)
- You can indeed find superficial correspondences. But that still doesn't make the comparison anything other than specious. You're still holding to a completely wrong track here.
- "Hubbard wanted to make lots of money without work" Well and EY wanted to make some money without work, and isn't so talented. The LWism got the bootcamps and thought reform and every other feature of a nasty cult (complete with doomsday prophecy), which all those things Armondikov speaks of do not have, but cults do. The scientology been extremely demonized but it's just the same mundane thing at larger scale and after longer time, and done by a more talented more ambitious asshole. edit: and also, lesswrong.com is just online support group where they draw dedicated folks from, into various rationality bootcamps n stuff Dmytry (talk) 07:19, 22 June 2012 (UTC)
- Yeah, pretty much what I'd say. There's good, there's not-so-good. I'd be lying if I said that wasn't applicable to RationalWiki. Or back when I was on Star Trek forums. Or the theatre group I work with. Or my own research group. Or my own writing. Hanging on every word Yudkowsky says and bigging up LW as some hub of extreme pseudoscience is pretty much turning this page into Cult of Schlafly Mk2. Not entirely a bad thing, of course, as RW should always aim to diversify in who it pisses off, but as DG says, putting it on par with Scientology is laughable. bomination 23:16, 21 June 2012 (UTC)
- Nebuchadnezzar (talk) 22:30, 21 June 2012 (UTC)
- As I note: if EY aspires to be a cult leader, where's the cult? He's smart, he's a good writer (certainly a hell of a lot better than Hubbard) and he knows all he would need to about human cognitive biases. LW is a terrible idea for a cult, if he were an actual sociopath (another word not to be thrown around lightly - it actually means something, it isn't a synonym for "weird" or "stupid" or "talks stupid philosophy"), he'd do a tremendously better job of it.
- I don't intend to trivialise LW/SIAI's problems - there's plenty of stupidity there that needs documenting (which is why this article is here). But that doesn't require assuming the level of malice you're assuming.
- Please imagine for a moment: what if LW and SIAI's stupid bits were not the products of malice, but of perfectly normal stupidity? What would the world look like? How would it look different to how it looks to you now? - David Gerard (talk) 10:22, 22 June 2012 (UTC)
- Another thing to consider: if Yudkowsky were like Hubbard, he wouldn't have deleted Roko's post, but rather enthusiastically embraced the idea. Scaring your members into ponying up cash for the cause is a key feature of cults, yet he rejected this method, exposing himself to the censorship charge instead. LW also featured the results of GiveWell's audit of SIAI despite their negative impression. I highly doubt that Scientology would ever conduct a transparent review of how its leadership spends the organization's money. Cults do all they can to keep outside criticism from reaching their members, LW is somewhat open to it at least as far as technicalities and practices rather than core beliefs are concerned. Even the most incompetent cult leader would by now have figured out that such liberties massively undercut their solicitations, so it's unfair to insinuate that they're just after the money. Röstigraben (talk) 12:37, 22 June 2012 (UTC)
- Roko's idea was pointing out a fault with the concept of friendliness and with the 'decision theory' that he [EY] invented. Hence EY getting very angry. He also did in fact act out the entire taking-it-very-seriously thing, after it gave nightmares to a donor, which I don't think helped to do anything about the donor's fears. I do not buy into the dichotomy between 'malice' and 'stupidity', especially as applied to a utilitarian who rationalizes. What exactly is the difference between incompetence and believing in your own lies anyway? Ohh, I see: if it is a past villain whom we vilified to the level of a caricature James Bond antagonist, then it was belief in his own lies and all-around evil, while otherwise it's honest self-delusion. The way I see it, it is the same mundane shit: no caricature archvillains who wake up in the morning and think, 'how can I be evil today'; everyone thinks they're doing good, using some entirely broken logic to justify their actions (which is awfully easy if you believe that prevention of dust specks in sufficiently many eyes can justify even 50 years of the worst torture, and when you seriously believe your actions affect the immense number of people who will ever live). Nobody evil ever puts effort into checking whether they're wrong about what they're doing. They do not care, that's the thing. Only caricature villains properly evaluate their actions, deem them evil, and then do them. Regarding GiveWell's 'audit', I imagine the choice was very simple: publish it on LW or have it published on GiveWell, with the former being the lower loss. edit: and I don't believe the argument that "he could have done better if he wanted to make a cult" works. Look at this: http://www.timesunion.com/local/article/Secrets-of-NXIVM-2880885.php . Consider that this whole improving-the-art-of-human-rationality thing is very new compared to NXIVM. Consider that Keith Raniere is by all metrics most likely to be much, much smarter than EY, with much better talent at cult making. Consider that Keith had prior experience doing nasty shit. Consider that EY is working online and has to reach people via the internet, which is a novel task for a cult. Dmytry (talk) 13:54, 22 June 2012 (UTC)
- That's actually another point - how, exactly, would you run a cult or sect when all you have is an online platform plus occasional small-scale meetings? Cults are all about isolating their members from the rest of the world, so nobody can interfere with the brainwashing. Personal face-to-face interactions are very important when you're trying to exert psychological pressure on people, not to mention really controlling their lives. On a platform such as LW, you can always leave without a trace, you can switch usernames, nobody has any clue about whom else you're in contact with, and most importantly, communication happens in one huge forum for everyone to see. That's totally anathema to all the methods of social control that cult leaders like to employ. Seriously, just because there are some slight tendencies towards a cult of personality surrounding Yudkowsky, that does not mean that the community actually resembles a cultist organization. Röstigraben (talk) 14:35, 22 June 2012 (UTC)
Let me find again that post from some girl who was dropping out of school and moving to work on existential risk together with Eliezer, and her father was concerned. LW is a recruitment ground. There is an enormous advantage to online operation: you can reach a huge number of people to pick from; if you are going for sheer devotion, it's hard to beat. Also, if you look at NXIVM, it is clear it takes a lot of intelligence/talent/luck to be this successful; I don't see the argument "he could have done better if he was making a cult" working. Dmytry (talk) 18:06, 22 June 2012 (UTC)
- Here's an example: http://lesswrong.com/lw/1sa/things_you_cant_countersignal/ but I think I had another person or another post in mind. Also, on all the 'he could have done better': it's way beyond the majority of people to even get one person to join like this; I am not buying that he could have done better, especially as he prefers to see what he's doing in a positive light via rationalization (which is how bad things typically get done). Dmytry (talk) 18:14, 22 June 2012 (UTC)
- This talk page is starting to remind me of that chapter in Them where a bunch of activists organize a protest against that grave and dire threat to our society, David Icke! Nebuchadnezzar (talk) 20:13, 22 June 2012 (UTC)
- That's it? After years of insidious brainwashing and alarmism about the dangers of unfriendly AIs, Eliezer's got...one teenager dropping out of high school so she can visit SIAI? Worst. Cult. Ever. For the vast majority of his would-be disciples, their "sheer devotion" apparently exhausts itself in logging on to a website to discuss his pet ideas and contribute to their shared cognitive framework, all of it entirely voluntary. No traces of cult-like structures, power relationships or control. Admitting that outside critics and dissenters may have a point and exposing members to said criticism, instead of demonizing them and denouncing their statements as lies. Giving up your potentially most effective tools for fleecing believers and keeping them in line (as Roko's idea would essentially have worked out to the same threat of eternal torture that proved to be quite effective with Christians). Writing lengthy texts about how people can avoid misperceptions instead of manipulating them to your advantage. Sorry, but nobody but an idiot would have made that many obvious mistakes. The far more reasonable explanation is that he's not trying to build a cult. Yes, he wants to inspire a movement, he clearly enjoys the attention and flattery which he's getting from his community, and he vastly overestimates the kind of contribution SIAI could possibly make to the field, but that's about it. Simple human failings that nobody's immune to, not even if you've spent years analyzing cognitive biases, but nowhere near the kind of intentional deception and lust for power that cult leaders exhibit. Röstigraben (talk) 08:56, 23 June 2012 (UTC)
- It's just one teenager whose post I could find on short notice (edit: and it's a counterexample to your idea that it is all happening on the internet; there's this real-world part, which is poorly visible, and which people join). There are also the folks wasting considerable money on SI promoting pseudoscience. Also, are you sure that the creators of other cults actually "wanted to make a cult", movie-villain style? I think it just sort of grows on people who at the start genuinely think they can improve the world (or, if we vilified them, 'believe in their own lies'), while getting a little well-justified slice for themselves. Dmytry (talk) 13:04, 23 June 2012 (UTC)
- @David Gerard: I don't know much about Scientology, but I do know a little about Objectivism, and I don't think the comparison between LW and Objectivism is misplaced. Is Objectivism a cult? Objectivists are not going to commit mass-suicide anytime soon, and I'd guess most people find Objectivism through the Internet these days (in academia, their views are fringe, just like many of those popular on LW). I think the word "cult" may give the wrong connotations. Take a look at Shermer's The Unlikeliest Cult In History and see how well it matches LW. Still, my main motivation to oppose LW is that it promotes crankery and pseudoscience, and unfortunately is attracting otherwise smart people (Muehlhauser was the best thing that could ever happen to the SIAI).--Baloney Detection (talk) 20:33, 27 June 2012 (UTC)