Talk:LessWrong
Doing something for people distressed by the basilisk
- moved to the shiny new Talk:Roko's basilisk
Of course the LessWrong crowd is pro-gamergate
Source. 204.11.142.106 (talk) 15:52, 7 June 2016 (UTC)
- which is the correct way to be. — Unsigned, by: 70.198.56.162 / talk
LW on RW
[1]: as 'the rat officer (tailed grade)' would say 'Meow.' 82.44.143.26 (talk) 19:37, 1 November 2016 (UTC)
- This is known -- it's from 2012. Cømяade FυzzчCαтPøтαтø (talk/stalk) 21:28, 1 November 2016 (UTC)
Referring to the Less Wrong article or [2]/[3]? (Mild timewaster warning on the latter) 82.44.143.26 (talk) 18:05, 2 November 2016 (UTC)
- Both. Mʀ. Wʜɪsᴋᴇʀs, Esϙᴜɪʀᴇ (talk/stalk) 21:24, 2 November 2016 (UTC)
I don't understand this section
The inner reaches of LessWrong can get a little ... insular. Here's Michael Vassar, former President of MIRI and an active member of the Bay Area transhumanist community, talking to Harper's in 2014:[1]
It was getting late. I asked him about the rationalist community. Were they really going to save the world? From what?
“Imagine there is a set of skills,” he said. “There is a myth that they are possessed by the whole population, and there is a cynical myth that they’re possessed by 10 percent of the population. They’ve actually been wiped out in all but about one person in three thousand.” It is important, Vassar said, that his people, “the fragments of the world,” lead the way during “the fairly predictable, fairly total cultural transition that will predictably take place between 2020 and 2035 or so.”
Using these principles, Vassar founded the failed medical-advice startup MetaMed, which was built on the assumption that rational thinking from first principles would beat the conventions of the field, and which rapidly went bust.
What about the quote is "insular"? What "principles" from the quote were used in MetaMed? The quote seems to be saying, "actually us rationalists are a super-duper minority, also the apocalypse is coming" -- not what either of the taglines says. Herr FuzzyKatzenPotato (talk/stalk) 23:51, 5 June 2017 (UTC)
- the "his people" part implies that the rare rational people are the people on the site and everyone else is not only helplessly irrational but helpless when it comes to the problems of the future.Vorarchivist (talk) 13:58, 12 October 2017 (UTC)
EA orgs praising AI pseudoscience charity. Is it useful?
The BoN seems to think so; I'm not so sure, but I'm definitely not committed to leaving it out. My reasoning is that Effective Altruism is kind of a captured movement, belonging more-or-less wholesale to LessWrong types, who take the most interest in it. The criticism we include from them represents voices that want the original intention of EA to be carried out, and the stuff added is just LWers committing to their delusion.
That's a very narrative take on why I like the way the article is, and frankly I'm not that invested in it. Your reasons are okay; I'm gonna make this case just in case anyone else cares and leave your change in. ikanreed 🐐Bleat at me 19:22, 21 August 2018 (UTC)
Comment: The article in its current form wants readers to believe that even EA charity evaluators do not believe that MIRI is a good organisation. However, while this might have been the case in 2012, it is no longer true now. Leaving the article as it is misinforms readers who put weight on the views of EA organisations. It might be the case that this wiki's community does not believe in EA, but that is not the case for everyone who might read it. This is misleading and wrong. If it is the consensus view of this wiki's community that EA evaluators are wrong, you should not mention them; citing them when it supports your case against MIRI but not otherwise is not rational discourse. — Unsigned, by: 192.76.8.65 / talk / contribs
- Can't really disagree with that reasoning, just because it doesn't tell the story I want. Go ahead and make that change. ikanreed 🐐Bleat at me 02:28, 22 August 2018 (UTC)
- The reason is that the LWers have been pushing non-AI-risk people out. It is worth noting that Karnofsky tried to walk back his critique later. But the details belong on effective altruism - David Gerard (talk) 16:20, 24 August 2018 (UTC)
- That is just a plain lie, and a minute of googling would have made that obvious: the share of LWers in the EA movement is lower than ever, and LW dropped out of the top 5 things that get people into EA http://effective-altruism.com/ea/1h5/ea_survey_2017_series_how_do_people_get_into_ea/ but I give up on changing these articles now. It is not worth my time to fight the 99% of this community who just have an unexplainable hate for LW and anything remotely related to it. It is very unfortunate that people who care about rational discourse and end up here by accident (like I did) have to deal with this kind of misinformation, but that can't be changed. But I really would appreciate it if someone could explain to me where this hate comes from. I know that this wiki is not about rationality but has a left/progressive background, whereas some (but by far not the majority, and certainly not me) of LW contributors are libertarian, but is that really a reason to hate a whole movement where people are obviously trying to do good? Are there not worse people you could direct your hate at if you need an enemy you can bash to feel righteous? It makes me really sad to see people whose values I fully support on most topics (being a left-leaning atheist) use the same tools of censorship and misinformation that I thought were exclusive to fringe right-wing movements. I do not expect a reply (I rather expect this comment to be removed as well), but I am genuinely curious.
- Edit: It just occurred to me that I might have fallen into the trap of assuming people are evil when misinformation explains things equally well. I am quite new to EA and have never followed LessWrong very actively, so I cannot say how these communities used to be when you formed your terrible opinion of them. So I would encourage anyone who reads this to go to a local EA meetup and maybe open LessWrong once. I am reasonably sure that you cannot possibly go to EA meetups and continue to have the impression that the people there are somehow enemies of your values. You might disagree with some of their goals and methods to change things, but the people there are just genuinely good people. — Unsigned, by: 192.76.8.65 / talk
- Good and Evil are ultimately fictitious labels that humans arbitrarily assign to individuals and groups based on their biases and worldviews. ☭Comrade GC☭Ministry of Praise 19:54, 24 August 2018 (UTC)
- P.S. You would be surprised how many cranks end up being labeled as "good people" by their supporters. Ultimately such defense attempts come off as more childish and naive than anything else. ☭Comrade GC☭Ministry of Praise 19:59, 24 August 2018 (UTC)
- yeah, I think any edits this IP wants to make will need to go via the talk page first - David Gerard (talk) 20:33, 24 August 2018 (UTC)
- You guys both ignored that I provided evidence that the last edit was based on wrong facts, and you still don't care? LW does not run EA, and the influence of LW people in EA is declining; nevertheless, the site now says that LW people have bullied everyone else out of EA, because the truth is that people not related to LW now believe in MIRI's work and you don't like that. I find it an acceptable view to not care what the EA movement thinks and say it's all bogus, but again, in this case you should not selectively quote EA charity evaluators when they support your view. I am still really confused why you hate LW so much that you have stopped caring about the truth, because as far as I can tell other parts of this wiki do usually stick to facts, even about things this community does not like. Can anyone elaborate on that? — Unsigned, by: 192.76.8.65 / talk
- Two points: A) Please sign your posts. B) I'm waiting for an agreement on the talkpage. ☭Comrade GC☭Ministry of Praise 19:03, 25 August 2018 (UTC)
- I do not know how to sign posts. I still find it weird that there needs to be agreement before facts based on statistics can be added to an article, while there did not need to be agreement for the last edit, which made a false claim based on the impression of one guy at an EA conference. Don't you kind of feel that that is a double standard? — Unsigned, by: 192.76.8.65 / talk / contribs