Talk:LessWrong/Archive10

This is an archive page, last updated 3 May 2016. Please do not make edits to this page.

LW fascination with donating to have Kim Suozzi frozen.

That's pretty weird for people who aspire to 'rationality'. There are a lot of people who die for lack of access to well-proven medical care, except that's mundane and doesn't make the news. Meanwhile, Kim is a brain cancer patient, with the obvious implications for cryonics. Some idiot told her of cryonics, which imo is irresponsible because a: it is unlikely to work and b: she may need to somehow kill herself months before her condition kills her to have any chance at it. Terrible predicament, made only more terrible by cryonics. Selling woo to dying people is just bad. (I do hope that she gets sufficient donations so she can have some hope, given that the people probably wouldn't have been donating anywhere else otherwise, but the whole vile business of a cryonics company extracting a profit from this suffering... makes me wish hell was real just so that anyone using this for profit can end up in it.) Dmytry (talk) 08:54, 28 August 2012 (UTC)

Cryonics is not profitable. Cryonicists are desperately sincere, and terrible at running organisations. The Kim Suozzi issue is pure foolish distraction as far as the quest for rationality goes, and demonstrates that LW is fundamentally a transhumanist hangout with some good rationality bits - David Gerard (talk) 11:44, 28 August 2012 (UTC)
"Cryonicists are desperately sincere, and terrible at running organisations." Well, duh. I've said for years that the cryonics movement needs some people with successful business experience to run their organizations. I've also advocated developing open innovation models to expand greatly the numbers of human minds working on cryonics' many problems, and at their own expense. The experience of incentive prizes suggests that you get a multiplier effect of about 10 to 1 in terms of others' voluntary mobilization of resources. In other words, if you offer a $1 million prize for solving a hard problem, others will spend around $10 million of their own money to try to solve the problem and win the prize. Unfortunately I fear that Eliezer Yudkowsky shows signs of becoming the go-to authority figure in cryonics in the coming decades, even though he hasn't earned that status and he brings a lot of just bad, sterile ideas along with him. Advancedatheist (talk) 16:12, 30 September 2012 (UTC)
Okay, so put a load of computer scientists in one place with a fascination with how totally fucking awesome the future will be, mix in the average human fears over the permanent nature of death, and add the "religion is inherently irrational" trope. An obsession with transhumanism and cryonics is likely to drop out of that equation as easily as evolution by natural selection drops out of "descent with modification". Scarlet A (Moderator) 12:01, 28 August 2012 (UTC)
Quite a lot of transhumanists aren't computer scientists, they're people who were utterly taken with the ideas, i.e. the science fiction. LW is a transhumanist site, seeded from the SL4 list of the 2000s and many of the people (including EY) came from the Extropians list in the late '90s. Sci-fi all the way down - David Gerard (talk) 13:11, 28 August 2012 (UTC)
Transhumanism doesn't work as a philosophy of life, at least not now, because it assumes a false underlying premise, namely, that we live in a time of exponential technological progress. (Yeah, like we live in a technologically progressive "space age" as well, even though that went bust in the 1970s.) So I have to laugh at transhumanists when the guys who speak at their own conferences, for example Peter Thiel and Tyler Cowen, have lately decided that we live in a technologically stagnant era instead. You'd think more transhumanists would wise up around the time they turn 40 and undergo the middle-aged reality check that the world shows no signs of accommodating their futurist beliefs. Advancedatheist (talk) 16:23, 30 September 2012 (UTC)
I just note that the extreme levels of reductionism required to take this in seem common amongst those who work with computers rather than, say, human brains for a living. "Sci-fi nerd" surely correlates just as well, if not better. Scarlet A (Moderator) 13:33, 28 August 2012 (UTC)
The bitter grumbler in me feels that if that's true, those computer science people more heavily involved with LW certainly can't be very GOOD computer science people. Just thinking about the enormous problem of mapping a working, active instance of a human brain, much less reconstructing it, in the context of computer science, engineering, or programming, makes my head hurt. The fact that I have never seen any of them even trying effectively to solve it (rather than just saying 'well in the FUTURE we will have it because someone will have invented it!' or 'someone did X, maybe this is conceptually related, SINGULARITY IS NIGH!') does not make me very confident about them. The best engineers and programmers I know are constantly thinking of problems and constantly trying to solve them. LW often just looks like a bunch of lazy pseudoscience babble groupies to me. Not like problem-solvers at all. KnightOfTL;DR (walls of text while-u-wait) 13:49, 28 August 2012 (UTC)
Perhaps I'm just misreading the demographic based on a few instances. Scarlet A (Moderator) 14:35, 28 August 2012 (UTC)
Which of them is a computer science person anyway? They barely ever post programming challenges beyond totally beginner level, they have no math puzzle threads, and they down-vote whatever puzzles are posted that are not totally trivial. And of course there's the entire AI idiocy. Their reductionism is of the infantile kind: the magic of progress will explain it, and that's what I believe, and screw anyone who's actually trying to explain. Ultimately the community only has people who do not experience revulsion towards any of Yudkowsky's self-important writings, where he goes on about how smart he is for adopting a view that other people invented, predominantly either for the wrong reasons or for the reasons other people invented (or people unfamiliar with said writings). @David Gerard: Unprofitable even with those high-tier options like whole-body preservation, which is the equivalent of a personal hot tub on an airliner? Dmytry (talk) 15:47, 28 August 2012 (UTC)

No, Yvain does not "believe in the paranormal"

(previously posted here under the heading "Yvain believes in the paranormal"):

One little known fact is that there have been studies done about paranormal effects, and that when certain effects are looked into the studies consistently show a very small but statistically very significant positive effect, some of which have been of the sort that there's a billion to one chance of it happening by coincidence (I used to be able to phrase that in terms of z-scores and such, but it's been a while). When skeptics have heard about this, they have just said the studies were flawed, either providing no evidence for it or else pointing out trivial "problems" of the sort that it's impossible to avoid and which no one would think twice about if it were in a normal medical study or something. My favorite story is when James Randi himself did a study to prove that ESP didn't exist, got a small but very significant positive result, pretended the study never existed, and lied about it in court (eventually being caught). On the subject of Randi, I will also point out that his million dollars to anyone who can prove the paranormal is mostly a fraud, with him rudely refusing people who have actually shown talent and mostly using it to find the stupidest mock-psychics he can get and make fun of them with a large public audience.--Baloney Detection (talk) 16:51, 25 September 2012 (UTC)
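
(An aside on the statistics invoked above: a one-tailed p-value of one in a billion corresponds to a z-score of about 6, which is presumably the sort of figure the "z-scores and such" remark gestures at. A minimal Python sketch of the conversion, assuming a normal approximation and using SciPy; the numbers are illustrative, not from any particular study:)

    from scipy.stats import norm

    p = 1e-9            # a "billion to one" chance under the null hypothesis
    z = norm.isf(p)     # inverse survival function: the z with P(Z > z) = p
    print(round(z, 2))  # 6.0 -- roughly a six-sigma result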

-- Hello, RationalWiki. Yvain here. I notice one of you has hunted down one paragraph I wrote about a decade ago when I was a teenager and Less Wrong did not exist, taken it incredibly out of context, phrased it in the present tense as "Yvain believes", then posted it as what can only be interpreted as an attempt to tar all of Less Wrong with it. Classy!
I do think it's important that parapsychology studies often come up positive, not because it proves the paranormal but because it proves that science is really messy - that even well-conducted studies frequently come up with the wrong results. I like the phrase that parapsychology is "a control group for science" such that if lots of parapsychology studies come up positive, we should be suspicious of, say, the latest exciting drug study or sociology study performed with the same sort of statistical methods.
Ten years ago when I wrote that livejournal post, I don't know if I'd developed this idea quite as fully. But I definitely believed (and still believe) that "science" describes methods and not results. If a study comes out that seems to support psi effects, I think it's better science to try to repeat it or honestly investigate its methodology, than it is to ignore or try to bury it because "it got the wrong results" - which I interpreted some skeptics as trying to do. As a teenager, this might have been because I thought it was possible (though unlikely) that maybe everyone was wrong and there really were psi effects. Now I am much less likely to take that possibility seriously, but I still think it sets a dangerous precedent in other areas to dismiss studies whose results we find unpalatable before we're sure we've located the flaw.
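(An aside illustrating the "control group for science" point above: with a fixed significance threshold, studies of a nonexistent effect still come up "positive" at a predictable rate. A minimal simulation sketch in Python; the study count, sample size, and one-tailed z-test are illustrative assumptions, not anything from the original post:)

    import random
    import statistics

    random.seed(0)
    n_studies, n_subjects = 10_000, 30
    positives = 0

    for _ in range(n_studies):
        # each simulated study measures a truly null effect (mean 0, sd 1)
        sample = [random.gauss(0, 1) for _ in range(n_subjects)]
        se = statistics.stdev(sample) / n_subjects ** 0.5
        if statistics.fmean(sample) / se > 1.645:  # one-tailed cut-off at alpha = 0.05
            positives += 1

    print(positives / n_studies)  # about 0.05: ~5% of null studies look "significant"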
I'm not sure what teenage-me was referring to regarding Randi. I can't find the incident online. The closest I can come up with is the CSICOP fiasco with that astrologer in the '80s that caused Rawlins to resign. But Randi wasn't personally involved in that one, so it looks like teenage-me was just plain mistaken there.
If you'll allow me to speak for myself instead of putting up inflammatory headlines about my beliefs without informing me, I can tell you that I think there's well below a 1% chance that the occasional positive results that crop up in parapsychological studies represent any kind of a real effect beyond good experiment design being really really hard.
I assume it would be impolite for me to delete this, but I'm sure not going to be upset if some administrator here or someone else with the proper authority goes ahead and does so. — Unsigned, by: 67.164.95.212 / talk / contribs
RW generally doesn't delete silly stuff from talk pages except in extreme circumstances; it tends to be left as individual comment, and people can make fun of the person being silly, as you've done quite well at here. Per recent archives, most commenters here love your stuff, often more than Eliezer's - David Gerard (talk) 06:25, 27 September 2012 (UTC)
Note also that the vast majority of BaloneyDetection's and Dmytry's contributions are on this talk page, so please don't conflate them with the RationalWiki community in general (if such a thing exists any more).--ZooGuard (talk) 08:31, 27 September 2012 (UTC)
Addendum to that note, many RW editors find this talk page embarrassing to watch. Scarlet A (Moderator) 10:34, 27 September 2012 (UTC)
I definitely didn't mean to accuse the community in general. If BaloneyDetection = Dmytry or someone similar, I guess that makes it somewhat expected, without reflecting on any of the rest of you. Thanks for your help. - Yvain
They're definitely different people by writing style - David Gerard (talk) 06:13, 28 September 2012 (UTC)
My mistake, Yvain, you don't believe in the paranormal, and I apologize for this claim about you. I assumed it reflected your current views, perhaps naively so (I personally have changed online identities several times since 2004, not being as consistent in that regard as you apparently have been). Since there are some wacky beliefs in the LW environment, it wouldn't surprise me if some of them did believe in the paranormal.--Baloney Detection (talk) 20:52, 10 October 2012 (UTC)

Where is Yudkowsky's Nobel Prize?

Nobel Prizes have recently been announced. Somehow Yudkowsky didn't receive the one in physics for his stunning discovery that Bayes' theorem proves the MWI of quantum mechanics. What's wrong with the world? Must be those scientists who are stupid outside the lab and don't grasp the genius of Yudkowsky.--Baloney Detection (talk) 20:54, 10 October 2012 (UTC)

And where is Higgs' prize? Scarlet A (Moderator) 19:45, 14 October 2012 (UTC)
Are you seriously comparing the two?--Baloney Detection (talk) 21:29, 22 October 2012 (UTC)
Are you seriously making the argument that someone who amounts to, effectively, an internet blogger would be noticed by the Nobel Prize Committee? Prove to me that Yudkowsky doesn't get a Nobel Prize in the year 2023 and you might be on to a vaguely reasonable argument. Scarlet A (Moderator) 20:01, 26 October 2012 (UTC)

Or at least the Peace Prize for his work in creating the Friendly AI. Advancedatheist (talk) 04:19, 16 October 2012 (UTC)
The Nobel prize is shit, TBH. There was a prize for lobotomy, for example, without evidence that it worked. IMO, marketing your ideas directly to the naive in fields requiring specialized knowledge pretty much always implies crackpottery and/or BS. Dmytry (talk) 16:00, 20 October 2012 (UTC)

Info on the SL4 mailing list

Kudos to David Gerard for the stuff on the origin of LW from the SL4 mailing list. I agree with you that LW is a transhumanist site in pop cog-sci garb. If you have any more info on the SL4 mailing list, please consider adding it to the entry.--Baloney Detection (talk) 19:41, 14 October 2012 (UTC)

The obvious Google search turns up the obvious. The list has an archive and all - David Gerard (talk) 20:01, 14 October 2012 (UTC)
Ah yes, that's true.--Baloney Detection (talk) 21:27, 22 October 2012 (UTC)

Is it fair to call Yudkowsky a crank?

There is a thread about SIAI's arrogance problem. The comment section is a goldmine both for critical comments about EY/SIAI and for, uhm, strong devotion to it.

This is just delicious:

"Personally, I think that a lot of Eliezer's arrogance is deserved. He's explained most of the big questions in philosophy either by personally solving them or by brilliantly summarizing other people's problems. CFAI was way ahead of its time, as TDT still is. So he can feel smug. He's got a reputation as an arrogant eccentric genius anyway."

Well, if Yudkowsky is such a genius, why doesn't it show in real life? Why doesn't Yudkowsky write articles for scientific/academic journals? Why isn't Yudkowsky a guest at any scientific conference (Singularity Summits don't count, as they are hosted by his own organization and aren't really scientific conferences)? His achievements amount to having a lot of karma on LW and writing Harry Potter (rape) fanfic. Put up or shut up!--Baloney Detection (talk) 20:25, 31 July 2012 (UTC)

Jesus christ. STILL???? At what point does this discussion of one person on one site become outright trolling?? --Godot (L'important c'est d'aimer) 20:31, 31 July 2012 (UTC)
Well it's on the discussion page of said site...--Baloney Detection (talk) 20:42, 31 July 2012 (UTC)

Look up Yudkowsky on OKCupid (though he might've deleted it already because of the discussion at http://lesswrong.com/r/discussion/lw/ds4/article_about_lw_faith_hope_and_singularity/ ). Or the Wikipedia talk page about him. Dmytry (talk) 07:12, 1 August 2012 (UTC)

Well it's right here. After reading his self-description, I'm speechless...--Baloney Detection (talk) 22:05, 3 August 2012 (UTC)
Jealous? Looks to me like most of what he writes there is reasonable as self-disclosure. He's not "normal." So? Looks like he's having fun. I don't necessarily approve, but, hey, I've got my own priors. --Abd (talk) 02:27, 3 November 2012 (UTC)
Gosh, what a sick egomaniac. And subtly sexist to boot, judging by the "You should message me if" section. Though going by his writings (where he tries to rationalize gender bias with essentialism as some kind of special clause in contradiction of all his other writings about overcoming bias) it's hardly surprising. - LucidFox (talk) 03:54, 14 September 2012 (UTC)
I'd guess that LucidFox isn't going to be messaging him. Seems to me like he's got a filter running that works, I'd guess, unless he's missing out on the perfect opportunity. Yeah, he makes an assumption about "rationalist women." So he's a guy. Surprised?
LucidFox (nice name, by the way) seems to have come here with Yudkowsky on the brain.[1] --Abd (talk) 02:27, 3 November 2012 (UTC)