Essay talk:The end of faith and the myth of rationality
I was reading some of your links (I'm waiting to read your essay till you stop editing it...), and just don't get why some of these anti-religion types care so damn much. Don't get me wrong, I'm anti any religion that belittles gays, abuses women, or demeans science. But I don't get why it's necessary to proclaim that "there can be no case where religion and science can co-exist; religion necessarily means science is wrong" (or something). Why??? What the fuck is so wrong with religion, and why are these guys (who I generally respect as writers and scientists... WEIT is one of my favorite sites, he likes cats!) so damned "angry" about someone saying "religion fills a void in my life"? OK, rant over. Tell us when you are done!--GodotGet over it!. 00:23, 4 November 2011 (UTC)
- I was just editing for punctuation/grammar. Read the section on sacred values for the answer to your question. Nebuchadnezzar (talk) 00:28, 4 November 2011 (UTC)
- "the dichotmy of reason and emotion is (an illusion)". I think i love you. :-) It's hard to work in semi-science (soft science?) fields like linguistics, and thinking fields like philo and religious studies (study of, not belief in), when "emotion" and "gut instinct" (which all humans both possess and use on a reagular basis, ans which are very much part of the moral game) are discounted. I don't know if it's changed in teh 10 years since I've been in formal academics, but emotion was just nay sayed. it had nothing to do with science, other than making us more the animal and less teh rational human. Anyhow, still reading, just had to clap. --GodotGet over it!. 00:33, 4 November 2011 (UTC)
- Times have changed. More scholars are joining the Revolution, comrade. Nebuchadnezzar (talk) 22:20, 4 November 2011 (UTC)
- "the dichotmy of reason and emotion is (an illusion)". I think i love you. :-) It's hard to work in semi-science (soft science?) fields like linguistics, and thinking fields like philo and religious studies (study of, not belief in), when "emotion" and "gut instinct" (which all humans both possess and use on a reagular basis, ans which are very much part of the moral game) are discounted. I don't know if it's changed in teh 10 years since I've been in formal academics, but emotion was just nay sayed. it had nothing to do with science, other than making us more the animal and less teh rational human. Anyhow, still reading, just had to clap. --GodotGet over it!. 00:33, 4 November 2011 (UTC)
- I have an invite to an event titled "God and Science: Must We Choose?"; I could swing by, make some notes and report back if you like. Also, should we ever get around to formalising that "Gut/Brain" ratio thing that was mentioned very briefly a couple of weeks ago? I know it would be heretical for most *ahem* rationalists to say that you can get away with not thinking too much about certain things, but it's a more prevalent opinion than they might think. moral 10:05, 4 November 2011 (UTC)
- I don't know that it can be quantified. Call me an "irrationalist," then. Nebuchadnezzar (talk) 22:20, 4 November 2011 (UTC)
- Okay then. Though, to be honest, the word "rational" and its derivatives get bandied about so much that I think they're all starting to go the way of "feminism", in that everyone claims to be one, no one agrees what it means, and No True Scotsman rules practically every conversation ("Proper rationalists follow cryonics!" "Proper rationalists should never read Dawkins!" "Proper rationalists never swear!" "Proper rationalists don't claim ownership of the word rationalists!!"). Unfortunately, the RW name is something we're kinda stuck with... whoops. "Irrationalist" has a sarcastic ring to it, though. pathetic 23:02, 4 November 2011 (UTC)
- Indeed. This was in fact a stealth attempt to proselytize for the one true religion. Convert! All other paths are philosophical suicide. Nebuchadnezzar (talk) 00:21, 5 November 2011 (UTC)
Sacred values[edit]
You are saying that "reason" is a sacred value of rationalists, and that they replace one set of misconceptions with another. It might be true, and there is certainly something quasi-religious about it. However, it doesn't mean there is no advantage in this exchange. Rationalism is a more useful misconception than, let's say, fundamentalist Islam. --Tweenk (talk) 04:02, 6 November 2011 (UTC)
- By misconceptions, I'm referring to misconceptions of religion as a cultural phenomenon (e.g., Harris reading the Holocaust as mainly a religiously motivated event vs., say, Dinesh D'Souza's "Hitler was an atheist" nonsense). This is separate from sacred values. The problem with reason as sacred value in the case of Harris et al is that it overlooks the question Atran is posing: "How do we as scientists advance reason in an inherently unreasonable world?" They can't answer that question effectively because they're stuck in the rut of thinking people always ought to be reasonable. Nebuchadnezzar (talk) 22:00, 6 November 2011 (UTC)
- I think you overestimate the degree to which the myth of man as a rational animal is still prevalent in the scientific and skeptical communities. Most people are aware that there are such things as cognitive biases, and scientists are probably the demographic that is most likely to recognize that these biases severely impede our ability to draw correct conclusions. Furthermore, whether we are currently rational has little to do with our ability to "advance reason." If we recognize that we are mostly irrational, we can simply have less credence in the accuracy of our beliefs, and in the meantime work on techniques to become more accurate. Believing that we are already rational is not necessary for us to do this. Tetronian you're clueless 22:14, 6 November 2011 (UTC)
- Awareness of cognitive biases doesn't translate to using that knowledge effectively for self-criticism. As Robin Hanson writes, "The fact that I can identify a particular bias in those I disagree with is only very weak evidence that I am more right than they." The rational actor model is indeed dying off (not as quickly as I would like, but it's getting there), but it is ingrained enough that people will misuse it consciously or unconsciously. That's why I have to laugh when I hear things about science "replacing" religion. Nebuchadnezzar (talk) 22:49, 6 November 2011 (UTC)
- It's one thing to recognise cognitive biases; actually implementing that knowledge to avoid them is a different skill entirely. I think this is why Feynman said "philosophy of science is about as useful to scientists as ornithology is to birds". You can learn all you like about confirmation bias, falsifiability, the problem of induction and so on, but it won't help you one jot when you're in a lab looking at an infra-red spectrum wondering 'what the fuck?'. If you want to "advance reason in an unreasonable world", I would say you have to understand this difference between a merely academic understanding of rational thought and a practical application of it - and the first port of call, as you've both pointed out, is to start by assuming we are irrational creatures. I'd go one further than this and try to figure out not how to destroy irrationality (I've seen the Less Wrong crowd occasionally discuss "barriers" to human rationality; I think this is the wrong approach) but how to actually embrace it and, if necessary, exploit it. E.g., no amount of statistical rigour will convince someone a vaccine is (statistically) safe any more than presenting road accident statistics is going to convince someone to stop driving. We learned this from the MMR debacle, and so many doctors and science bloggers have suggested fighting back with anecdotal evidence, because that information is salient. This takes advantage of our inherent tendency to prioritise certain types of information over others regardless of objectivity or accuracy - which would be a fair description of what we mean when we say "human irrationality", wouldn't you think? The problem then becomes about balance and you're at square one again. Anyway, I've probably waffled off the point. bomination 11:57, 7 November 2011 (UTC)
- ADK: I just don't see why systematic biases are something to "embrace." Sure, you can exploit them when trying to manipulate people, and to a certain extent you can use some cognitive habits to reduce bias. But if you know that your own reasoning is untrustworthy, shouldn't the obvious next step be to make it better? And I disagree with the claim that knowing about induction and falsifiability won't help you in a lab - I think you greatly underrate the helpfulness of this knowledge because you already know these things. But if you pulled a random person off the street who knows nothing about falsifiability or experimental design and asked them to do science, there's no way in hell they could do as good a job as a trained scientist.
- Nebby: Cool, I didn't know you're an Overcoming Bias fan. Hanson is right about meta-bias, but his argument doesn't invalidate the larger claim that cognitive biases can be reduced - it just implies that being able to recognize the bias isn't good enough. We can still use other methods to circumvent irrational thought processes, like this one or this one. Tetronian you're clueless 13:38, 7 November 2011 (UTC)
- Embrace them because we can't remove them, embrace as in "accept they're there and work around them, not against them". I don't mean everyone should be irrational, I'm just saying to take a different tack by turning a problem into its solution. Otherwise we're fighting against the grain - which I'm sure will satisfy people who exist only in internet forums and wish to rid themselves of these horrible biases, but it's not going to have any practical use to humanity. narchist 13:41, 7 November 2011 (UTC)
- What is the difference between working around them and working against them? Tetronian you're clueless 13:42, 7 November 2011 (UTC)
- Some people want to destroy biases, but that's not going to work except for some individuals who will end up leading boring lives. Anyone who preaches rationalism will invariably violate one of their commandments at some point, not necessarily proving that they're full of shit, but proving that practicing and preaching are two different things. One is possible, one less so. What I would say is to accept that irrational biases are there, and turn them into something useful or at least find a way of dealing with them that doesn't involve trying to eradicate them - because that will fail and you'll probably end up looking like an asshole/hypocrite. For example, you don't need to tell people to stop listening to salient anecdotal evidence and only trust rigorous statistics - stats are meaningless, anecdotes have salience, so this isn't going to work. What you need to do is to get people to recognise which anecdotes are worth paying most attention to and why - I say "people" as opposed to a generic "you" because this whole human rationality thing is utterly pointless if the only rational person is a single dweeb sat at his computer pretending to have all the answers. Approaching it in a way that you can swing people's biases to work for them is somewhat different, I think, than just saying "this is confirmation bias, don't do it", because people are always going to do it - that's why they're called cognitive biases. Now, this doesn't mean you can't talk about cognitive biases and the usual suspects of rationalism (or even that you can't worship at the Church of Bayes any more) but that we need to stop treating biases as something to be killed and more like something to be tamed and controlled, so that we use them when appropriate. Hyper-rationality doesn't help you survive crossing a road. d hominem 13:59, 7 November 2011 (UTC)
- I suppose this is only because I'm more interested in "skepticism" than raw "rationality". The former has relevance, the latter can easily be dismissed as boring, mundane and academic. I want to see 5 straight days of rain and think "tomorrow must be sunny, I want a BBQ" and not have to think of the Gambler's fallacy (yes, I know, weather isn't a resetting system where each day is independent of the previous one), but I don't write a scientific paper riddled with biases - e.g., I'm currently working on a bridging hydride system and the last thing I want is to attempt to publish it without rigorously testing it. I would also like to be unreasonable when proving that in some cases unreasonableness is justified! :P bomination 14:14, 7 November 2011 (UTC)
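(As a purely illustrative aside on the Gambler's fallacy point above: a minimal Python sketch with a toy fair-coin model - nothing to do with real weather - showing that a run of five tails doesn't make heads any more "due" on the next flip.)

```python
# Toy simulation: estimate P(heads) on the flip immediately following
# a streak of five tails, for an independent fair coin.
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

next_after_streak = []
for i in range(5, len(flips)):
    if not any(flips[i - 5:i]):              # previous five flips were all tails
        next_after_streak.append(flips[i])   # record the following flip

# Still roughly 0.5 - the streak carries no information about the next flip.
print(sum(next_after_streak) / len(next_after_streak))
```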
- Sorry, I still don't get it. Can you taboo "killed" for me? If I do something like this to avoid rationalization, am I "killing" the bias or "controlling" it? While I'll be the first to agree that being aware of a bias is insufficient (I wrote something to this effect a while ago), if I could rewire my brain to rationalize less, I'd do it, because doing so would make me more accurate. (I assume that would qualify as "killing.")
- We might also have to taboo "rationality," because I think we're using the word in different ways. I'm using the word in the sense of these definitions. Rationality should help you cross the road, and if it doesn't, you're not doing it right. It's not about purging emotion or becoming a Straw Vulcan (WARNING: TVTropes), it's about being accurate and Getting Shit Done. Tetronian you're clueless 21:54, 7 November 2011 (UTC)
The Straw Vulcan view is essentially the basis for the rational actor model, which is the problem. ADK, I think, understands and has restated what Atran means by "leveraging" irrationality. Your average person doesn't understand the scientific method, and even when they do, they can still deny its findings for personal reasons. For example, in one study, conservatives were more likely to believe in global warming if the solution was framed as nuclear power (see here, second paragraph up from the bottom). It's the "double ethical bind" that Stephen Schneider talked about: "On the one hand, as scientists we are ethically bound to the scientific method, in effect promising to tell the truth, the whole truth, and nothing but — which means that we must include all the doubts, the caveats, the ifs, ands, and buts. On the other hand, we are not just scientists but human beings as well. And like most people we'd like to see the world a better place, which in this context translates into our working to reduce the risk of potentially disastrous climatic change. To do that we need to get some broadbased support, to capture the public's imagination. That, of course, entails getting loads of media coverage. So we have to offer up scary scenarios, make simplified, dramatic statements, and make little mention of any doubts we might have. This 'double ethical bind' we frequently find ourselves in cannot be solved by any formula. Each of us has to decide what the right balance is between being effective and being honest. I hope that means being both." Nebuchadnezzar (talk) 22:27, 7 November 2011 (UTC)
- I agree with all of that, and I see the utility of leveraging people's biases to better society. Where I disagree with ADK is with respect to the usefulness of overcoming biases on an individual level. As I understand ADK's position, doing so isn't particularly advantageous or helpful, especially because being aware of biases isn't enough to defeat them. I agree with the latter point, but not with the former. Tetronian you're clueless 03:24, 8 November 2011 (UTC)
- I think the general idea is, as Nassim Taleb (sorry, verbal tic) would say: "Do crazy things (break furniture once in a while), like the Greeks and stay "rational" in larger decisions." Nebuchadnezzar (talk) 22:06, 8 November 2011 (UTC)
- I don't think that's a good analogy. Breaking furniture has therapeutic value - you feel better afterwards, and you're less likely to get angry at other people. On the other hand, neglecting rationality on unimportant issues can break your entire epistemology. So if that's Taleb's argument, then I disagree. (Note: I'm not suggesting that you do an intensive analysis and meta-analysis for every decision you make. But some modes of cognition, especially rationalization, should be ultra-double-extra forbidden all of the time.) Tetronian you're clueless 15:21, 9 November 2011 (UTC)
- Yeah, it doesn't quite work as an analogy because we've shifted ground from belief to action, which means ethics is going to come into the picture and the definition of "rationality" changes. Taleb is talking about the latter, so then we have to find some dividing line (or do we?) as you admit it's not necessary to analyze every decision intensively (unlike Damasio's patients). Nebuchadnezzar (talk) 16:48, 17 November 2011 (UTC)
- Yeah, the two goals certainly can come into conflict. My point is simply that there are some modes of thought - rationalization being the prime example - that can cripple your truth-seeking ability in exchange for negligible short-term instrumental gains. In addition, epistemic rationality is necessary in order to make good decisions - it's hard to make the optimal choice if you know little about your options. This means that allowing yourself to be biased can cripple you both instrumentally and epistemically. That's why I am ok with using certain cognitive shortcuts, but not ok with ones that can seriously damage your truth-seeking ability. Tetronian you're clueless 20:50, 17 November 2011 (UTC)
- The point is to know, on the meta-cognitive level, that you're acting a-rationally. Relating to bounded rationality, Herb Simon also coined the term satisficing to describe this. If the cost of obtaining enough information to make an optimal choice is too high, then the merely adequate choice becomes the "optimal" choice. Nebuchadnezzar (talk) 21:18, 17 November 2011 (UTC)
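(Illustrative aside on Simon's satisficing, in the sense described above: a minimal, hypothetical Python sketch - the function and threshold are invented for illustration, not taken from Simon - of stopping the search at the first "good enough" option instead of paying the cost of evaluating everything to find the true optimum.)

```python
def satisfice(options, evaluate, aspiration):
    """Return the first option whose value meets the aspiration level.

    `options` is an iterable of candidates, `evaluate` scores a candidate,
    and `aspiration` is the "good enough" threshold. If nothing clears the
    bar, fall back to the best option seen so far.
    """
    best, best_value = None, float("-inf")
    for option in options:
        value = evaluate(option)      # each evaluation is a search cost
        if value >= aspiration:
            return option             # adequate beats further searching
        if value > best_value:
            best, best_value = option, value
    return best

# Toy usage: pick a restaurant rated at least 4.0 rather than ranking them all.
ratings = {"noodle bar": 3.5, "taqueria": 4.2, "bistro": 4.8}
print(satisfice(ratings, ratings.get, aspiration=4.0))  # "taqueria", not the strictly better "bistro"
```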
- (Pedantic Psych Aside) The "letting off steam" hypothesis is dead and buried (point #2). Nebuchadnezzar (talk) 22:45, 17 November 2011 (UTC)