Essay talk:Inventory of Fringe Beliefs
If anyone is willing to help me with this I would be grateful; it's the start of what I think might be interesting work. What I am trying to figure out now is which sub-topics in each of the dimensions are worth asking about. The goal is to hit "big" issues, not small obscure ideas that only a few people believe. For example, alien visitations or abductions is a better topic than lizard aliens in disguise ruling the UN. Any suggested sub-topics? Any feedback in general? tmtoulouse 22:20, 3 December 2009 (UTC)
- Sure, I'll help. This reminds me of a book I once read that had a passage that went something like this: if you believe a magical man made the world in 6 days, you are a respected member of society and people admire you for your faith in this modern world. If you believe green men visit the earth, you can still be a respected, if eccentric, person. If you believe that there are fairies at the bottom of the garden, you are carted off as mentally unstable. - π 22:25, 3 December 2009 (UTC)
- I'll help too, but as Pi points out we have to be very careful not to be subjective. After all, it is difficult to quantify how crazy a particular belief is, particularly when comparing unfalsifiable beliefs. Before we can begin making a system to rank fringe beliefs, we need to come up with a system to create that system. Tetronian you're clueless 22:38, 3 December 2009 (UTC)
- "What is your religion" might be an easy question. If measuring fringe beliefs is what you want you could look at the numbers of each religion (perhaps only in the region studied). For example, Roman Catholic scores low, snake handling high. Unfortunately this isn't perhaps a good reasoning as atheism would classify as a fringe belief and Mormonism as perfectly rational if you do a study in Salt Lake City. Pietrow 11:28, 4 December 2009 (UTC)
- I can help out, so long as it's something that can be done remotely, either written or on the phone. --Ask me about your mother 14:57, 9 December 2009 (UTC)
- Doubleplusinteresting idea. Do you just want ideas for questions and someone to look it over? The subjectivity is very difficult to avoid; however, it's probably best to avoid the "what is your religion" question directly and ask about specifics of belief, while still maintaining some generalisations that you could use for categorising. E.g., "I believe that an omnipotent power created the world recently despite evidence to the contrary" (although I must admit that that is very loaded). Perhaps "I believe that there is more to life than evidence that I receive through my senses", or "I believe it's all about interpretation". At least start off with that sort of thing to get an insight into the motives behind the specific beliefs, and then start asking things like "Horoscopes are a good way to predict the future". Of course, I could be talking bollocks and have completely misread what you're talking about. gnostic 16:15, 12 December 2009 (UTC)
- I think you need to clarify "fringe belief". I would have thought that a "fringe belief" is a belief held by a comparatively small percentage of the world's population. We can contrast this with "mainstream beliefs" - those held by the majority.
- But things like creationism, astrology and homoeopathy are believed in by substantial sections of the population. It would probably depend on how you sample your population, but in some areas these could be mainstream beliefs.
- So I think you need to make it something like "unsubstantiated beliefs" or something of that nature.--BobNot Jim 16:36, 12 December 2009 (UTC)
- I agree with Bob.
- Also, the more that I think about it, the more this seems incredibly hypocritical. We get on Schlafly's case for making little essays like "Why do non-conservatives exist", "What triggers reconsideration of liberal beliefs" and "Quantifying liberal style" (well, that one was MarkGall, but you get the idea). This concept is basically the same thing. It's condescending and far from being scientific. Tetronian you're clueless 17:11, 12 December 2009 (UTC)
- I don't necessarily think it's condescending, but I do think it needs to be sharpened up. Consider atheism. It's not a mainstream belief in the US - so is atheism a "fringe belief"?--BobNot Jim 17:15, 12 December 2009 (UTC)
- Oh, I figured that by "fringe" Trent meant "anything that is bullshit" (in his/our opinion). That's why I think it is somewhat Schlafly-esque. Tetronian you're clueless 17:44, 12 December 2009 (UTC)
- It's not particularly hypocritical. Whereas certain people's "essays" just pull stuff from their own arseholes, this concerns legitimate research where experiments are going to be carried out. gnostic 18:05, 12 December 2009 (UTC)
- Yes, but wouldn't a person who is a dogmatic atheist be considered "fringe" by Trent's experiment as outlined in the last paragraph of the essay? It seems to me that the experiment is really testing how and why people believe certain things, not the validity or absurdity of the things themselves. Tetronian you're clueless 18:10, 12 December 2009 (UTC)
Actually, now that I think about it even more, wouldn't he first have to establish how and why people believe mainstream things before he got onto fringe topics? Though I guess the most usual reason people believe things is "everybody else believes them". So is he really talking about "culturally unusual beliefs"?--BobNot Jim 18:25, 12 December 2009 (UTC)
- My point exactly. Hell, if you believe this article then everyone engages in a little bit of rationalization rather than reasoning. Tetronian you're clueless 18:29, 12 December 2009 (UTC)
I sort of dropped all this in place before I really had time to go into it much; the end of a semester is always a nightmare. The main idea here is to look into some potential mechanisms for the formation of "magical thinking": belief structures that are based around incredibly flimsy, though highly salient, data. I want to avoid topics such as religion as much as possible, and instead focus on less "mainstream" ideas. I want to hit a somewhat broad range of topics, though, since people will often latch onto certain ideas in one field while openly disdaining ideas with similar evidence in other areas. Someone may easily dismiss an NWO conspiracy theory, but be damn sure that Bigfoot exists. I have set up the broad categories of topics in an attempt to cover a wide area for where these kinds of ideas develop. I guess what I am really looking for is recommendations for the kinds of things to be asking people about.
My current hypothesis stems from my work modeling the dopamine systems in the dorsal striatum. I have developed models and have been testing the idea that this dopamine system regulates a signal of "surprise and salience" that is used to learn and make decisions, particularly when learning is correlative in nature. The rat-pushing-the-lever-for-food kind of thing. Evidence is mounting that there are significant differences in endogenous levels of dopamine in the general population, and that these differences affect behavior. We have a behavioral test that is known to correlate with dopamine levels, validated by comparing people's responses on the test against PET scans that measure dopamine levels. So I can now use a simple behavioral test to help sort people into groups with high and low endogenous dopamine levels.
If the dopamine signal cues salience when processing correlative evidence, a higher level of dopamine should make certain kinds of observations solidify and generalize more quickly. Evidence that is surprising and salient in its own right should be given significantly more power in people with these higher dopamine levels.
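A minimal sketch of that idea in code (illustrative only, not the actual model; the gain values and learning rate are invented) might look like a prediction-error update whose step size is scaled by a tonic dopamine level:

```python
# Sketch: a Rescorla-Wagner-style update in which a hypothetical
# "dopamine gain" scales how strongly a surprising outcome moves beliefs.

def update_value(value, outcome, dopamine_gain, alpha=0.1):
    """One learning step: prediction error weighted by dopamine gain."""
    prediction_error = outcome - value  # surprise: outcome vs. prediction
    return value + alpha * dopamine_gain * prediction_error

# A high-dopamine learner generalizes quickly from one striking observation...
v_high = update_value(value=0.0, outcome=1.0, dopamine_gain=3.0)  # 0.30
# ...while a low-dopamine learner barely moves on the same evidence.
v_low = update_value(value=0.0, outcome=1.0, dopamine_gain=0.5)   # 0.05
```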
My question is whether some of these mechanisms could underlie the development of belief in paranormal/fringe belief systems. Fringe can mean more than just relative proportion in the population; I like to think of fringe in terms of "reality." Anyway, I want to create a questionnaire that can simply be used to test and see who holds what kinds of beliefs. Things I think I am interested in are psychics, UFOs, NWO conspiracies, cryptozoology, astrology, extreme forms of medical quackery, healing from a distance, causes of disease, etc. Any thoughts on any of this would be appreciated, particularly suggestions on the kinds of things to look for. tmtoulouse 18:12, 14 December 2009 (UTC)
- Would I be right in saying that your hypothesis is that because people may have high dopamine levels in a particular part of the brain, they'll be swayed by relatively weak evidence, such as an anecdote or just someone in authority mentioning it? Hence skepticism - a demand for more evidence before acceptance - could be "caused" by lower levels; am I right in this interpretation? If so, I think it could be a lot more complex - what's the difference between a belief and a non-belief in this respect, for instance? - and you'd have to look at why people believe anything, not just fringe beliefs (and you have to admit that your definition of fringe is unfortunately too subjective at the moment). Specifically, you want to look at how people actually acquire these beliefs. gnostic 19:48, 14 December 2009 (UTC)
- I have been skirting the edges of the details to avoid getting too technical, but I think it's confusing things, so let me break it down a little more. I have been working on applying reinforcement learning models from machine learning to correlative learning tasks in animals. The idea is to look at what kinds of algorithms might be at work, using a system where you set up a prediction for what will happen when an action is performed, perform the action, then use the feedback of what actually happens to adjust your prediction. The internal model of the relationship between action choice and predicted outcomes, mediated by a range of factors such as context, serves as a world model for how you think your environment works.
- As these models are built and adjusted, data is constantly coming in. When something comes in that is both surprising (i.e., it greatly differs from predictions) and salient (relevant to the agent and the task), a choice has to be made about whether to try to integrate that information across all other data points, or whether it represents a special case.
- In my model, the dopamine signal is what regulates whether surprising and salient data are integrated generally or processed specifically. A concrete example makes it clearer. An animal is deciding on the value of pushing a lever to get a food pellet; it was trained while very hungry. Later you take it out, feed it until it's full, then put it back in; it pushes the lever, but it's no longer hungry. The predicted reward is significantly different from the actual reward. The animal has to decide whether it is going to "devalue" the food pellet across all contexts, or whether there is something special about the context (in this case, satiety) that means the data has to be specially processed.
- So in more general terms, dopamine serves as a regulator for deciding whether surprising and salient data should be weighed with information across many contexts, or whether a single "special case" can serve to produce a separate internal model.
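- A toy illustration of that regulator role (my rendering of the description above, not the actual model; the threshold, data structures and numbers are invented) routes each outcome either into the general model or into a context-specific special case:

```python
# Toy sketch of dopamine as a router: a surprising outcome either updates
# the general world model or gets filed as a context-specific special case.
# SURPRISE_THRESHOLD and the dict layout are illustrative assumptions.

SURPRISE_THRESHOLD = 2.0

def process_outcome(models, context, outcome, dopamine_signal, alpha=0.1):
    general = models["general"]
    surprise = abs(outcome - general) * dopamine_signal
    if surprise > SURPRISE_THRESHOLD:
        # Special-case processing: a separate model for this context,
        # learned almost in one shot.
        models.setdefault(context, outcome)
    else:
        # Ordinary processing: fold the outcome into the general model.
        models["general"] = general + alpha * (outcome - general)
    return models

models = {"general": 1.0}  # learned value of lever-pressing when hungry
# The sated animal presses the lever; the reward is now worthless, and a
# strong dopamine signal flags the mismatch:
process_outcome(models, context="sated", outcome=0.0, dopamine_signal=3.0)
# -> {'general': 1.0, 'sated': 0.0}: devalued only in the satiety context.
```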
- Step back and look at some of the hallmark aspects of belief in pseudoscience and conspiracy theories. Theory building is dominated by correlational evidence from extreme examples, which fails to be weighed against counter-evidence of a less salient nature. So rather than saying anything about what lower dopamine levels may correlate with, I am specifically interested in the hypothesis of whether high dopamine levels, which I theorize lead to exaggerated special-case processing of salient information in world-model building, lead to a higher incidence of beliefs that rely heavily on that type of evidence and evidence processing. tmtoulouse 20:12, 14 December 2009 (UTC)
- Yes, that's much clearer (you should forgive me for being a bit thick regarding this, the closest I've got to neuroscience is a single lecture on applying hyperpolarisation to MRI brain scans). So it's all about the options "this new thing is just a special case" or "this new thing is going to change my worldview", right? The animal is thinking (possibly just subconsciously) that "oh, the reward is a food pellet. But I'm not hungry so that's not much of a reward" - therefore it has a choice of thinking that the food pellet is no longer a reward, or just realising that this is only the case because it isn't particularly hungry. I think I'm with you on the pseudoscience bit now, it's about how the evidence is processed and prioritised - although I thought if it was processed as "a special case" then it would lead to less belief in pseudoscience, because I interpret this "special case" thing as basically something more dismissive, e.g., "that anecdote about a homeopath curing your mother is just a special case, it still doesn't work". gnostic 20:43, 14 December 2009 (UTC)
- Not a matter of being dense, just a matter of experience and expertise in a field. I avoid getting super technical by default, since glazed-over eyes quickly taught me that answering the question "what is it that you do, exactly?" should be done in Twitter-length doses.
- In my model, the world-model building uses a Bayesian-type learning algorithm: predictions, states, actions, etc. are updated as new data come in. At any given point in time a probability distribution can easily be constructed for the internal prediction of a consequence. This probability distribution has a mean and a variance. Dopamine serves as a signal for how far the current experienced outcome falls outside the internal probability distribution. When the dopamine signal is high, it signals that the current outcome is far outside the norm or predicted outcome and should be processed separately. Less dopamine means that it should be integrated with information from more standard outcomes.
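- Numerically, that description amounts to something like a standardized prediction error (a minimal sketch assuming a Gaussian predictive distribution; the 2-sigma cutoff is made up for the demo):

```python
import math

# Sketch: the internal prediction is a distribution with a mean and a
# variance; the dopamine-like signal measures how many standard deviations
# the current outcome falls from the prediction. The Gaussian form and the
# cutoff value are assumptions for illustration.

def dopamine_signal(outcome, mean, variance):
    return abs(outcome - mean) / math.sqrt(variance)

def classify(outcome, mean, variance, cutoff=2.0):
    z = dopamine_signal(outcome, mean, variance)
    return "process separately" if z > cutoff else "integrate with the norm"

print(classify(outcome=1.1, mean=1.0, variance=0.04))  # integrate with the norm
print(classify(outcome=2.0, mean=1.0, variance=0.04))  # process separately
```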
- To call once more upon a concrete example, let's look at food aversion. If an animal is made sick after it eats something that it has eaten many times before without getting ill, it doesn't really develop an aversion to that food, not without repeated pairings. However, if you inject a drug that increases the levels of dopamine in the brain, a single pairing can be enough to cause aversion to that food, particularly in the context in which it was delivered when the animal got sick. The dopamine gives weight to the idea that an anomalous data point should not merely be treated as an outlier and integrated with all other experience, but rather as a data point for a unique occurrence that can create almost instant learning and changes in action choice.
- So the idea is that high levels of dopamine mean that instead of integrating a piece of anecdotal evidence with all of your experience of that action and outcome, you instead start to create one-shot learning sessions for highly salient and surprising information. tmtoulouse 21:05, 14 December 2009 (UTC)
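- Pulling the pieces together, a short simulation in the same spirit (all numbers are invented for illustration and not fit to any data) shows the food-aversion case: under a high dopamine gain a single sickness pairing flips the learned value, while under a normal gain it is absorbed as an outlier:

```python
# Illustrative simulation of the food-aversion example: twenty safe meals,
# then one surprising sickness. A hypothetical dopamine gain amplifies the
# learning rate on surprising trials only.

def learn(outcomes, dopamine_gain, alpha=0.1):
    value = 0.0
    for outcome in outcomes:
        error = outcome - value
        # Surprising, salient outcomes get dopamine-amplified weight.
        gain = dopamine_gain if abs(error) > 0.5 else 1.0
        value += alpha * gain * error
    return value

history = [1.0] * 20 + [-1.0]  # twenty safe meals, then one sickness

print(round(learn(history, dopamine_gain=1.0), 2))  # 0.69: outlier absorbed
print(round(learn(history, dopamine_gain=5.0), 2))  # -0.03: one-shot aversion
```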