Logical fallacy

We reason from our bedrock beliefs, not to them. Infanticide and slavery are not forbidden in our society because the arguments against these practices are stronger than the arguments in favor of them, but because the practices revolt us. We would not listen to anyone who cared to make arguments in favor of them.
—Judge Richard Posner, Overcoming Law (technically an Appeal to Disgust, showing why pure logic is best left to Vulcans).

A logical fallacy is an error in the logic of an argument[1][2] that prevents it from being logically valid or logically sound, but need not always prevent it from swaying people's minds.[note 1]

Examples of fallacies include the straw-man fallacy, in which one distorts another person's argument, usually to make it easier to attack. As with most fallacies, the straw man may result from sloppy thinking, or, more dubiously, may be deployed on purpose. Another common fallacy is the ad hominem, in which one attacks the person making the argument (their history, personality, ideology, etc.), even though the validity of an argument is entirely independent of the character of the person who makes it; this is frequently seen in political discourse. A third common fallacy is the non sequitur, in which someone draws a conclusion that the premises do not logically support. When a fallacy is identified, one should be careful not to assume that the conclusion is wrong merely because it was reached through fallacious reasoning; that assumption is itself a well-known fallacy, the fallacy fallacy. One may be wrong in how one arrived at a conclusion, but that doesn't mean the conclusion itself is wrong: it may still be true on the basis of some other premise or argument. For example, one may say:

P1: Bears are animals.
P2: All two-legged animals are mammals.
C: Therefore a bear is a mammal.

There are multiple problems with this argument. One of the premises is false (there are two-legged animals that are not mammals),[note 2] the second premise is confusing anyway since bears have four legs, and the conclusion is a non sequitur: the first premise doesn't state how many legs bears have, so the conclusion would not follow from the premises even if they were correct. However, it is not wise to conclude that, just because one or more premises are false and/or the argument isn't sound, the conclusion must be wrong. In this case, bears actually are mammals, though one should arrive at that conclusion through an entirely different argument.

Logical fallacies often result from particular quirks of human intuition. A logical fallacy is not necessarily a Bayesian fallacy:[3] in a given circumstance, the conclusion one jumps to may well be more likely true than not, and if a particular train of thought has been correct more often than not (or close enough to correct to have a positive impact on survival and reproduction), it gets baked into human thinking as a heuristic.[4] When the heuristic is applied outside its reasonable bounds, it becomes a cognitive bias.

The problem is that this can leave one grievously wrong about reality.[5] So one may form an opinion via a heuristic (System 1 thinking), but one then needs to show one's working to make sure one hasn't just said something silly (System 2 thinking).

This particularly applies to thinking about science, because scientific thinking is unintuitive for most people unless trained into it; and to arguing your points in general, because heuristics are full of glaring exceptions.

Explanation[edit]

See the main article on this topic: Syllogism

Must be used in argument[edit]

One common error when first learning about logical fallacies is to fail to realise that a fallacy can only be present if it is used as part of an argument.[6] For example, "So-and-so is a socialist" is not an ad hominem fallacy (see below) because it is simply a statement; so-and-so may indeed be a socialist. "So-and-so is a socialist, therefore they are wrong" is an ad hominem because a conclusion is being drawn, and the conclusion has nothing to do with the premise: it attacks the opponent, not the opponent's argument. This can be more complicated than it sounds, however, because the conclusion that they are wrong is often implied rather than explicitly stated.

Likewise, "You are an idiot" is merely an assertion. Further, "you are saying idiotic things, therefore you are an idiot" may be a valid argument regardless of whether the premise (the opponent is saying idiotic things) is true. However, it is only a sound argument in the event that the premise is true, and if "saying idiotic things" makes one an idiot. (Even geniuses have said idiotic things. Just ask their spouses.)

Validity versus truth[edit]

Just because an argument is valid does not mean the conclusion is true. A valid argument simply means that if the premises are true, the conclusion must be true as well. A sound argument is a valid argument with the additional requirement that the premises (and thus the conclusion) are true.[4] For instance, consider the following argument.

P1: All humans are cows.
P2: All cows are plants.
C: All humans are plants.

Although both the premises and the conclusion are false, this is still a valid argument, because if the premises were true, the conclusion would have to be true as well. Since at least one premise is false, the argument is valid but not sound.[5]
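
To make the valid-but-unsound distinction concrete, here is a minimal illustrative sketch (in Python, with a made-up three-element universe) that brute-forces every possible model and confirms that the form "all A are B; all B are C; therefore all A are C" has no counterexample; that is, the form is valid no matter how badly its premises fail to describe the real world.

```python
from itertools import chain, combinations

# A minimal brute-force check of the argument form
#   P1: All A are B.   P2: All B are C.   C: All A are C.
# over every assignment of A, B, C as subsets of a small universe.
# "Valid" means: in every model where both premises hold, the conclusion holds too.

UNIVERSE = [0, 1, 2]  # three objects are enough to illustrate the point

def subsets(xs):
    """Every subset of xs, as tuples."""
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

def all_are(xs, ys):
    """'All X are Y' is true exactly when X is a subset of Y."""
    return set(xs) <= set(ys)

counterexamples = [
    (A, B, C)
    for A in subsets(UNIVERSE)
    for B in subsets(UNIVERSE)
    for C in subsets(UNIVERSE)
    if all_are(A, B) and all_are(B, C) and not all_are(A, C)
]

# No model makes both premises true and the conclusion false, so the *form* is
# valid, even though "all humans are cows" is false in the actual world.
print(counterexamples)  # -> []
```

Soundness, by contrast, is a claim about the actual world, and no amount of checking the form alone can establish it.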

Fallacy-dropping[edit]

What we have here is a blatant example of argument by assertion. It's therefore clear your mother was a whore, and you flunked out of elementary school
—Colonel Custer[7]

It is not acceptable to merely state that one's opponent is using a fallacy (as above). One must explain how the opponent's argument is fallacious (e.g., they claim that you are a shill), why it is wrong (there's no evidence that you are a paid government disinformation agent), and what that means for their argument (if you're not a shill, then your arguments can't be handwaved away).[8]

This need not be a drawn-out paragraph. Even "your ad hominem is irrelevant to my argument, so my argument stands" is sufficient.

Otherwise, one runs the risk of fallacy dropping — claiming someone's argument is wrong without bothering to explain why — which comes dangerously close to ad hominem. (It's equivalent to shouting "your logic is bad!" and claiming victory.)

A related concept is that of logic chopping,[9] where the tools of logic are used unhelpfully and only serve to obfuscate a conversation. This can include fallacy dropping, or nitpicking at statements rather than focusing on the actual discussion.

Reductio ad absurdum[edit]

One of the techniques that is often used to expose fallacies is reductio ad absurdum. When using this technique, one attempts to show that an argument is fallacious by showing that an argument with the same form can be used to produce a conclusion known to be false. For example, if someone commits the fallacy of affirming the consequent, one might say "by your logic, we can prove that 'Elvis Presley was a US President', as follows: If Elvis was a US President, he was famous. Elvis was famous. Therefore, he must have been a US President". It is an example of modus tollens with the form "If the logic of Argument A is valid, then Conclusion C follows from the set of true Premises P. But C is false. So the logic of A is not valid".
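
The gap between the invalid form (affirming the consequent) and the valid one (modus tollens) can be checked mechanically. The sketch below is a small, purely illustrative truth-table search: it hunts for a row in which the premises are true and the conclusion false. Affirming the consequent has such a counterexample row (an Elvis who is famous but was never President); modus tollens does not.

```python
from itertools import product

def implies(p, q):
    """Material conditional: 'if p then q'."""
    return (not p) or q

# Affirming the consequent:  P -> Q.  Q.  Therefore P.         (invalid)
# Modus tollens:             P -> Q.  Not Q.  Therefore not P.  (valid)
forms = [
    ("affirming the consequent",
     lambda p, q: implies(p, q) and q,        # premises
     lambda p, q: p),                         # conclusion
    ("modus tollens",
     lambda p, q: implies(p, q) and not q,
     lambda p, q: not p),
]

for name, premises, conclusion in forms:
    counterexamples = [(p, q) for p, q in product([True, False], repeat=2)
                       if premises(p, q) and not conclusion(p, q)]
    print(name, "->", counterexamples)

# affirming the consequent -> [(False, True)]
#   i.e. P = "Elvis was a US President" (false), Q = "Elvis was famous" (true)
# modus tollens -> []   (no way to make the premises true and the conclusion false)
```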

The exact details of how to use reductio ad absurdum are complex, so we shall refer you to the page on reductio ad absurdum instead of trying to recap them here. A good example of reductio ad absurdum in action is the page on the flat Earth model, which shows the absurdities that arise when one takes that model of the Earth seriously (which some people still do).

The relationship of paradoxes to logical fallacies[edit]

Logical fallacies are a common theme of paradoxes (infinite regress, circular definitions, and equivocation). [The artist] Patrick Hughes outlines three laws of the paradox:[10]

Self-reference: An example is "This statement is false", a form of the liar paradox. The statement is referring to itself. Another example of self-reference is the question of whether the barber shaves himself in the barber paradox. One more example would be "Is the answer to this question 'No'?"

Contradiction: "This statement is false"; the statement cannot be false and true at the same time. Another example of contradiction is a man who asks a genie to grant the wish that wishes couldn't come true. This contradicts itself: if the genie grants his wish, he did not grant his wish, and if he refuses to grant it, then he did indeed grant it (well, technically, he still didn't, since he can't both grant and refuse to grant wishes), making it impossible either to grant or not grant the wish because the wish contradicts itself.

Vicious circularity, or infinite regress: "This statement is false"; if the statement is true, then the statement is false, thereby making the statement true. Another example of vicious circularity is the following group of statements:

"The following sentence is true."
"The previous sentence is false."

W. V. Quine distinguished between three classes of paradoxes:[11]

  • A veridical paradox produces a result that appears absurd but is demonstrated to be true nonetheless.
  • A falsidical paradox establishes a result that not only appears false but actually is false, due to a fallacy in the demonstration. The various invalid mathematical proofs (e.g., that 1 = 2) are classic examples, generally relying on a hidden division by zero. Another example is the inductive form of the horse paradox, which falsely generalises from true specific statements. Zeno's paradoxes are 'falsidical', concluding, for example, that a flying arrow never reaches its target or that a speedy runner cannot catch up to a tortoise with a small head-start.
  • A paradox that is in neither class may be an antinomy, which reaches a self-contradictory result by properly applying accepted ways of reasoning. For example, the Grelling–Nelson paradox points out genuine problems in our understanding of the ideas of truth and description.

A fourth kind, which may be alternatively interpreted as a special case of the third kind, has sometimes been described since Quine's work.

  • A paradox that is both true and false at the same time and in the same sense is called a dialetheia. In Western logics, it is often assumed, following Aristotle, that no dialetheia exist, but they are sometimes accepted in Eastern traditions (e.g. in the Mohists,[12] the Gongsun Longzi,[13] and in Zen[14]) and in paraconsistent logics. For example, it would be mere equivocation or a matter of degree to both affirm and deny that "John is here" when John is halfway through the door, but it is self-contradictory to both affirm and deny the event simultaneously. (This is equivocation owing to the ambiguity of the term here: Does it mean the general vicinity, the building, the room, or a particular spot one metre to the speaker's left? Once this ambiguity is resolved by defining terms, the apparent paradox promptly vanishes in a puff of logic.)

Types[edit]

There is no consensus among philosophers about how best to organize fallacies. They can be classified as inductive or deductive, as formal or informal, by the psychological factors that lead people to commit them, or by the epistemological or logical factors that underlie them.[15] Another problem when organizing fallacies is that many of them fit into several categories at once. Consider, for example, the equality fallacy. It is a fallacy of ambiguity, as it is often unclear what people mean when they say one should be treated "equally". It is a political correctness fallacy, as liberal politicians advocate the idea that one should be offended if people are not treated "equally". It is a jumping to conclusions fallacy, as it assumes that the blind should be treated "equally" to someone with 20/20 vision (which is clearly a logical error if one works at the DMV or is responsible for hiring referees). It is an appeal to self-evident truth fallacy, as the Founding Fathers of the USA claimed that "all men are created equal" is a self-evident truth, even though it is quite clear that all men are not created equal: some men are smarter than others, stronger than others, taller than others, etc. It is also a loaded language fallacy, as the term is imbued with emotional connotations. It is also a conditional fallacy, as there are logical ways to use the word equality, such as the right to be judged equally by the law. And if you doubt that the promotion of equality can be fallacious, consider how laws that promote equality in the USA, namely the 1964 Civil Rights Act and the Equal Employment Opportunity Act of 1972, have led to the prison rape of women by male guards.[16][17][18] But we digress.

Due to the difficulty of organizing fallacies, sites will often simply list them alphabetically and not attempt to organize them at all. While this works on some level, it is not very helpful for understanding fallacies, as there are a lot of them; people need some sort of schema in order to make sense of them, and for this reason this page has organized them for you. The primary division used is between formal and informal fallacies. A formal fallacy is an argument whose conclusion does not necessarily follow even when its premises are true, because the argument's logical structure is invalid. An informal fallacy, on the other hand, is contingent on the argument's content or possibly the motive of the arguer. Within the informal category, the page further subdivides fallacies into fallacies of presumption, fallacies of relevance, and fallacies of clarity. This way of categorizing fallacies is mentioned by the Internet Encyclopedia of Philosophy,[15] is used elsewhere (such as at Wikiversity), and is one of the more common ways of organizing fallacies. Another site used as a guide for organizing fallacies was Fallacy Files. In addition to the formal and informal sections, this page has added a section for conditional fallacies, which are broader categories that have both a non-fallacious and a fallacious component, and a section for argumentative fallacies, which are fallacious ways of presenting information that incorporate informal fallacies. While this list is quite extensive, it is not comprehensive, as there are subfallacies, and fallacies specific to particular fields of study, that may not have been mentioned.

Formal[edit]

See the main article on this topic: Formal fallacy

All formal fallacies are forms of invalid (generally deductive) reasoning and are specific types of non sequitur.[15]

Syllogistic fallacy[edit]

See the main article on this topic: Syllogistic fallacy

A syllogistic fallacy is any instance in which a syllogism with incorrect structure is used.[19][20]

  1. Four-term fallacy: Any syllogism in which four terms are present, instead of the mandatory three, often due to using an ambiguous term in a premise of a logical syllogism.[19][21]
  2. Enthymeme: When an unstated premise is necessary for logical validity.
    1. Argument from incredulity: P1: One can't imagine how X could be true. P2: (unstated) If X is true, then one could imagine how X could be true. C: X is false. The converse of this fallacy is the Argument from credulity: P1: One can imagine how X could be true. P2: (unstated) If one could imagine how X could be true, then X is true. C: X is true.[21]
    2. Commutation of Conditionals, known as illicit (false) conversion in quantificational logic and as confusion of the inverse in statistics (If P then Q. Therefore, if Q then P.), is an argument which needs to state P if and only if Q in order to be valid, but does not.[19][21][15]
    3. It is possible to interpret these particular propositional fallacies as complementary enthymemes:
      1. Affirming a disjunct (also depends on P or Q being ambiguous between inclusive and exclusive or): P or Q. P. Therefore, not Q.[19][21]
      2. Denying a conjunct (or false dilemma): Not both P and Q. Not P. Therefore, Q.[19][21]
    4. It is possible to interpret these particular propositional fallacies as not stating If P then Q or If Q then P (i. e. the principle of totality for material implication):
      1. Affirming the consequent: If P then Q. Q. Therefore, P.[19][21][15]
      2. Denying the antecedent: If P, then Q. Not P. Therefore, not Q.[19][21][15]
      3. Negating antecedent and consequent (also known as improper transposition): If P then Q. Therefore, if not-P then not-Q. (This also leaves unstated half of the conclusion that would be necessary for logical validity.)
      4. Confusion of the inverse: Given two events A and B, the probability of A happening given that B has happened is assumed to be about the same as the probability of B given A. More formally, P(A|B) is assumed to be approximately equal to P(B|A). It is a fallacy one encounters when misapplying Bayes' Theorem (as are base rate fallacies in general).[22][19][21] A worked numerical sketch appears after this list.
    5. It is possible to interpret these particular fallacies of quantificational logic as having an unstated premise which is necessary for logical validity:
      1. Existential assumption: All X, if they existed, would be Y. All Y that exist are Z. (An X exists.) All X are Z.[19][21][20]
      2. Some Are/Some Are Not (also known as unwarranted contrast and negative conclusion from affirmative premises or affirmative conclusion from a negative premise): (Any S which exists is P or it is not.) Some S are P. Therefore, some S are not P. or vice versa[19][21]
    6. Emotional appeals frequently appear as enthymemes because they depend on evaluating an argument based on feelings rather than logic.[19][21][15] Emotional appeals, while primarily syllogistic, are often also informal fallacies.
      1. Appeal to novelty (argumentum ad novitatem): Arguing that a claim is valid because it is novel.[21][15]
      2. What's the harm: It's just some water (and your payment of $50); what harm can it do?
      3. Appeal to nature (argumentum ad naturam) — Arguing that something is good because it is "natural".[21]
      4. Appeal to ancient wisdom: It's right because the Maya/Chinese/Hebrews said it thousands of years ago![21]
      5. Appeal to age: It's right/wrong because the claim-maker is old/young.[23]
      6. Argument to the Purse (argumentum ad crumenam): Using one's possession of money (or lack of it) to prove the truth of a claim.[21]
        1. Appeal to wealth: I have made a bunch of money so anything I tweet must be true![21][15]
        2. Appeal to poverty: My lack of money proves that I know the secret to happiness.[21]
      7. Appeal to Accomplishment (also known as appeal to success): I have three doctorates at BS University and have written 7 best-selling books on the subject of the quantum dream states therefore anything I say is true.[21]
      8. Appeal to confidence/Trust: Trust me, I know what I'm doing.[21]
      9. Appeal to Intuition: I have a gut feeling that something is true, therefore it is (even if it has at most only a weak factual basis).[21]
      10. Appeal to gravity: I'm the only one up here who takes this seriously. Disregard these jokers — I have the truth.
      11. Appeal to Desperation: Something must be done about these illegal immigrants. Let's build a border wall![21]
      12. Appeal to normality: It is normal in America to be in debt therefore it is nothing to worry about. Let me tell you about our financing options…[21]
      13. Appeal to Common Folk: Joe the Plumber is a common man. He says to do X. You are a common man, therefore X is the right thing to do.[21]
      14. Appeal to closure: Crime X occurred. Person Y looked suspicious, but no evidence connected him to it and no other suspects were found, thus facilitating the scapegoating of person Y, whom the police then arrest out of a desire to close the case.[21]
      15. Appeal to flattery: What a lovely fallacy you have there! You must be a smart person, someone who'd find quantum healing quite fascinating.[21]
      16. Appeal to (insert your favorite emotion).[21]
        1. Appeal to shame/Ridicule (reductio ad ridiculum): Would you say that in front of your mother (if you knew how ridiculous she thought it was)?
        2. Appeal to Pity (argumentum ad misericordiam): Using the emotion of pity to distract from the truth of an argument. Many people in the USA view "innocent by reason of insanity" as an example of this fallacy, which has led to the adoption of "guilty but mentally ill" laws.[19][21][24]
        3. Argumentum ad fastidium: Ugh, that's so gross — it must be false.
        4. Appeal to anger (argumentum ad iram): When Rush Limbaugh's anger is used to prove the claims he is making.[21]
        5. Appeal to hatred/spite (argumentum ad odium): Don't you hate it when people point out your logical fallacies? I know I do. So come join me in my campaign against logic![19][21]
        6. Appeal to pride (argumentum ad superbiam): You know what is wrong with this country? People are no longer proud to be an American. Well I am and together we can make America great again![19]
        7. Appeal to fear (argumentum ad metum, argumentum in terrorem): We're surrounded by logical fallacies! RUN!![19][21]
      17. Loaded language (also known as prejudicial language): Using terms such as “hard-working Americans” that elicit strong emotions in the listener in order to establish the truth of an argument.[21][15]
    7. Argument by Pigheadedness: Stubbornly refusing to accept rational counter arguments to one’s position without providing any reasons as to why the counter arguments are wrong.[21]
    8. Fallacies of vacuity are the ultimate enthymemes because they "(do) not establish what the proponent of the argument intended because (they don't) put forward a substantive claim in favor of the conclusion [(i. e. a substantive premise)]."[25] Thought-terminating clichés often contain this type of fallacy.
      1. Self-sealing arguments can't be argued against because they are constructed in a way that seals them off from criticism, even though they are based on invalid reasoning or speculative premises and are therefore Not even wrong or Fractally wrong (or unfalsifiable); at the same time, it is doubtful that they can ever truly be proven correct.[25][21]
        1. Hypothesis contrary to fact: If Alexander the Great hadn't died, then the Greek Empire wouldn’t have fought amongst itself and the world would be far more intelligent than it is now as it would have been run by the Greeks.[21]
        2. Conspiracy theory: Of course I can't prove JFK was assassinated by Ted Cruz's dad. The government has covered it up![21]
      2. Meaningless Question: How much wood would a woodchuck chuck if a woodchuck would chuck wood?[21]
      3. Circular reasoning (also known as circulus in demonstrando) and begging the question (also known as petitio principii): Assuming the initial point. Claim A assumes A is true. Therefore, claim A is true.[19][21][25][15][24]
        1. Homunculus fallacy: If I say X, and then say that X proves that X is true, then I win![21]
        2. Subverted Support: Trying to explain how some phenomenon occurred when there is no evidence that the phenomenon occurred. Example – Conservative Christian explanations of the great flood and Noah's Ark.[21]
        3. Appeal to faith: Arguing that one must use faith rather than reason to understand something to be true.[21]
        4. Complex question fallacy (also known as plurium interrogationum or loaded question): Asking a question, which has an assumption built into it, so that it can't be answered without appearing to confirm the assumption, or else appearing evasive by questioning the assumption. To be distinguished from a leading question, which is not a fallacy, but is a way of suggesting the desired answer by how the question is phrased. "Are you still beating your wife?" is a loaded or complex question, for it assumes that at one time you did beat your wife; while "You weren't beating your wife, were you?" is a leading question, for it suggests the simple answer no.[21][15][24]
        5. Tautology: A is true because A is true (Necessitarian determinism strengthens this to A is true because A always had to be true), e.g. the just because fallacy (not to be confused with ipse dixit, a.k.a. because I said so): Student: Why is the Sun yellow? Teacher: Just because.[21]
      4. Inconsistencies: An error in logic that concerns compound propositions, two of whose parts contradict one another in such a way that both cannot be true. (i.e., P and not P; If P then not P. P. Therefore, not P: which is also ambiguous between modus tollens and modus ponens)[19][21]
        1. Self-refuting idea (also known as contradictio in adjecto and conflicting conditions): A claim that on closer inspection disagrees with itself.[21][15]
          1. Stolen concept fallacy: When the thing you are seeking to disprove requires the existence of the thing you are trying to disprove. Example: "Logic can't possibly be a way to derive truth. To prove to you why I think this way…"[21]
          2. Kettle Logic: A collection of arguments made to try and prove a point but the arguments contradict one another.[21]
        2. Double standard: Using one set of criteria for one person (or group of people) and another set of criteria for a different person (or group of people) when only one set of criteria should be used.[21][15]
        3. Special pleading: When universal rules no longer apply in this specific instance. Examples:
          1. I know the law says that there is no left turn on red, but I was in a real hurry…[19][21][15]
          2. Notable Effort: You have made a notable effort while in prison, therefore life imprisonment no longer means spending your life in prison.[21]
      5. Having your cake: If-by-whiskey: Using words with strong connotations to hide the fact that one is supporting both sides of an issue and therefore not stating a position.[21]
      6. Argument by assertion: If you say something enough times, it eventually becomes true and therefore you win the argument. The less kind name for this is Argumentum ad nauseam (argument by repetition): If you say something often enough to make people vomit, you win. The reverse side of this is argumentum e[x] nausea: If people have told you something often enough to make you vomit, you win by saying anything that is not that.[21]
        1. Argument from silence (argumentum e[x] silentio): The lack of response to my point(s) makes my point(s) correct!/The lack of response to my counterpoint(s) to your point(s) makes your point(s) incorrect![21] Example: the Silent Majority.
      7. Circular definition: Fails to establish any new information about its referent.[21]
      8. Phantom distinction (also known as distinction without a difference): When someone spends time arguing for the superiority of one term over another (rather than the intended debate), yet there is no effective difference.[21]
      9. Deepity: A statement which equivocates between one vacuous meaning it has, which is true, and another, which, though sounding profound, is false if it is at all meaningful.
  3. Universal conclusion from a particular premise: Asserting some universal fact from particular premises. Only IAA/AIA, IIA and IOE/OIE syllogisms commit this fallacy without drawing an impossible conclusion (AIA also has an undistributed middle term and OIE also has an illicit minor)
    1. Some S are M. All M are P. Therefore, all S are P./All S are M. Some M are P. Therefore, all S are P
    2. Some S are M. Some M are P. Therefore, all S are P. (i.e., induction)
    3. Some S are M. Some P are not M. Therefore, no S are P./Some S are not M. Some P are M. Therefore, no S are P.
  4. Negative conclusion from affirmative premises: Asserting some negative fact from positive premises.
  5. Affirmative conclusion from a negative premise: Asserting some positive fact from negative premises.[19][21][20]
  6. Fallacy of exclusive premises (also known as two negative premises): a categorical syllogism that is invalid because both of its premises are negative.[19][21]
  7. Illicit process: Incorrectly concluding for all of a set when the premises apply to only some of a set. Specifically, the illicit major and illicit minor.[19]
    1. Illicit major: All A are B. No C are A. Therefore, no C are B.[19][21]
    2. Illicit minor: All A are B. All A are C. Therefore, all C are B.[19][21]
  8. Undistributed middle: the middle term in a categorical syllogism is not distributed in either the minor premise or the major premise.[19][21][15]
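
The confusion-of-the-inverse entry above treats P(A|B) and P(B|A) as interchangeable. The sketch below plugs made-up numbers (the prevalence, sensitivity, and false positive rate are assumptions chosen purely for illustration) into Bayes' theorem to show how far apart the two can be; the same arithmetic underlies the base rate and prosecutor's fallacies listed under probabilistic fallacies below.

```python
# Illustrative numbers only: a rare condition and a fairly accurate test.
prevalence = 0.01           # P(condition)
sensitivity = 0.95          # P(positive | condition)
false_positive_rate = 0.05  # P(positive | no condition)

# Bayes' theorem:
#   P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_condition_given_positive = sensitivity * prevalence / p_positive

print(f"P(positive | condition) = {sensitivity:.2f}")                 # 0.95
print(f"P(condition | positive) = {p_condition_given_positive:.2f}")  # ~0.16

# The two conditional probabilities differ by a factor of about six here;
# treating them as interchangeable (or ignoring the 1% base rate) wildly
# overstates what a positive result actually shows.
```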

Fallacies of quantificational logic (also known as fallacies of predicate logic)[edit]

A logical mistake involving quantifiers, such as the difference between "some" and "all".[19]

  1. Illicit contraposition: No S are P. Therefore, no non-P are non-S.[19][21]
  2. Quantifier-shift fallacy (or scope fallacy): Every X has the relation R to some Y. Therefore, some Y has the inverse of relation R to every X.[19][21][15]
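
A quick way to see why the quantifier shift fails is to build a small counterexample model. The sketch below uses a made-up three-person "loves" relation (the names and the relation are purely illustrative): every person loves someone, yet no single person is loved by everyone.

```python
# Quantifier-shift counterexample over a tiny made-up domain.
# "Every person loves someone" is true; "someone is loved by every person" is not.
people = ["alice", "bob", "carol"]
loves = {("alice", "bob"), ("bob", "carol"), ("carol", "alice")}

every_x_loves_some_y = all(any((x, y) in loves for y in people) for x in people)
some_y_loved_by_every_x = any(all((x, y) in loves for x in people) for y in people)

print(every_x_loves_some_y)     # True:  each person loves somebody
print(some_y_loved_by_every_x)  # False: no one person is loved by everybody
```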

Probabilistic fallacy[edit]

When the conclusion reached from the premises of an argument violates the laws of probability.[19][note 3]

  1. Base rate fallacy (also known as base rate neglect): Incorrectly ignoring statistical information in favor of irrelevant information to make a judgment.[19][21]
    1. Prosecutor's fallacy: Jurisprudence in the USA can be described as poorly executed statistical inference done by three unqualified statisticians before a statistically ignorant jury. In Europe, this is not necessarily the case, as they appear to understand the importance of Bayesian inference.[note 4] The prosecutor's fallacy occurs when someone overemphasizes the weight of the evidence proving someone's guilt.[27][15] Very frequently, this comes down to treating the allegation itself as proof (blindly trusting the accuser and assuming the guilt of the accused), i.e. assuming that the alleged victim is telling the truth when that assumption is wrong. This is an ipse dixit fallacy committed by overzealous prosecutors.[note 5]
    2. Defense attorney's Fallacy: When someone downplays the weight of the evidence proving someone's guilt.[27] Very frequently, this comes down to blaming the victim when a victim's actions are used as proof that some offense against them was justified or didn't occur. This is an ad hominem fallacy commonly used by defense attorneys in cases involving rape.
  2. Multiple comparisons fallacy: A group of statistical studies shows that out of N studies, B number of studies produced result C and D number of studies produced result E. The media reports "Studies show E," ignoring result C.[19][21]
  3. Overfitting: Failing to ignore data outliers resulting in a model that is not representative of the general trend of the data set.[29][15]
  4. Data dredging (also known as post-designation, data fishing and the Texas sharpshooter fallacy): This is when you test all kinds of different hypotheses against the same set of data until you find something that is statistically significant, which you then use as an ad hoc conclusion without looking for corroborating data (or using any that you already know). This is a fallacy because that statistical result is most likely due to chance.[29] This is also a pattern recognition error.[19][21][15]
  5. Conjunction fallacy: A is a subset of B. Therefore, A is more probable than B.[19][21][15]
  6. Disjunction fallacy: Judging event A to be more probable than "event A or event B", even though a single event can never be more probable than a disjunction that includes it.[21]
  7. Gambler's fallacy: I lost the last twenty dice rolls — I'm due for a win, so I had better double down! Conversely, I won the last twenty dice rolls — What if I'm due for a loss? At least I can absorb it if I don't get carried away with my next bet.[19][21][15] (A small simulation after this list shows why this reasoning fails.)
  8. Clustering illusion:
    1. The Hot hand fallacy: I am on a hot streak! Just one more hand! I can't lose!
    2. Drought fallacy: I am in a drought! No more hands! I must lose![19][21]
  9. Countless counterfeits fallacy: A lot of bad evidence for something means good evidence for it also exists. True believers often use this, treating evidence as though whether any particular instance is true or false is a matter of luck, so if there's a lot, there's almost certainly something true in it – e.g. a true alien or paranormal sighting, or a piece of yet-unknown slam-dunk evidence that a grand conspiracy is true.[30]
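
Several of the entries above (the gambler's fallacy, the hot hand, and the drought) turn on the same mistake about independent events. The simulation below is a rough illustrative sketch with arbitrary parameters: it flips a fair coin many times and checks whether the flip immediately following a five-loss streak wins any more often than usual. It doesn't; it stays at roughly 50%.

```python
import random

# Rough sketch with arbitrary parameters: does a losing streak on a fair 50/50
# bet make the *next* outcome any more likely to be a win? (It shouldn't.)
random.seed(0)  # for reproducibility

def win_rate_after_losing_streak(streak_length, trials=200_000):
    wins_after_streak = observations = losses_in_a_row = 0
    for _ in range(trials):
        win = random.random() < 0.5
        if losses_in_a_row >= streak_length:
            observations += 1
            wins_after_streak += win
        losses_in_a_row = 0 if win else losses_in_a_row + 1
    return wins_after_streak / observations

# Roughly 0.5: the coin is not "due" for anything after five losses in a row.
print(round(win_rate_after_losing_streak(5), 3))
```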

Bad reasons fallacy[edit]

Fallacy fallacy (argumentum ad logicam):

  1. Forward: Argument A for the conclusion B is fallacious. Therefore, B is false.
  2. Converse: The conclusion B is false. Therefore, Argument A for B is fallacious.
  3. Inverse: Argument A for the conclusion B is not fallacious. Therefore, B is true.
  4. Contrapositive: The conclusion B is true. Therefore, Argument A for B is not fallacious.[19][21]

Fallacy of modal logic[edit]

The Fallacy of modal logic is a formal fallacy in which modalities play a role in creating a fallacious argument.[19]

Modal Scope Fallacy: A fallacy in which an unwarranted degree of necessity falls on the conclusion of an argument. An example would be "if Barack is President, then he must be 35 years old or older,"[31] since it is not his presidency that causes him to be this age.

Masked man fallacy[edit]

The masked man fallacy (also known as illicit substitution of identicals) is a fallacy that involves confusion between extensions and intensions.[19][21][15] Effectively, conflating knowing something with knowing it under all of its names. E.g. "I know who Bruce Wayne is, but I don't know who Batman is. Therefore, Bruce Wayne is not Batman."

Informal[edit]

See the main article on this topic: Informal fallacy

Fallacies of presumption[edit]

Fallacies of presumption occur when one uses a fallacious or unwarranted assumption to establish a conclusion.[15]

Jumping to conclusions[edit]

See the main article on this topic: Jumping to conclusions

Jumping to a conclusion occurs when coming to a judgement without taking the time to rationally evaluate the merits of the argument.[21][15]

  1. Accident Fallacy (a dicto simpliciter ad dictum secundum quid): When a rule of thumb is taken to be universally true.[19][21][15]
    1. Ecological fallacy: Interpreting statistical data about a group to make inferences about an individual of that group and coming to an incorrect conclusion.[21]
    2. Stereotyping fallacy: Assuming that all individuals of a group have a certain characteristic when this doesn't hold true for all individuals. Example: Stipulating that "All men are taller than women," is a stereotype as there are women who are taller than most men and men that are shorter than most women. On the other hand, stipulating "Men are usually taller than women." is not a stereotype fallacy. Rather, it is an accurate statistical statement.[21][15]
  2. Hasty generalization (also known as overgeneralization and conversely the fallacy of accident): Taking a few specifics and making a general rule out of them, without the few specifics adequately representing the entire group. This is frequently due to an Unrepresentative Sample (also known as biased sample fallacy and selection bias) leading to one drawing a conclusion about a population based upon a sample that isn't reflective of the population that it is supposed to represent.[21][19][15]
    1. Self–selection — a fallacious way of collecting data where the participants who choose to participate in the study are not likely to be representative of the population that it is supposed to represent (such as online polls).[15]
    2. Double counting — When something is counted twice, resulting in a statistical error. For example, let's say one wishes to determine what percentage of people have a medical condition that could be described as intersex. In order to tabulate this figure, one could include people who have a genetic disorder called MRKH syndrome and people who have a phenotype called vaginal hypoplasia. If one includes both groups unquestioningly, there is likely to be double counting, as MRKH syndrome causes vaginal hypoplasia.[21]
    3. Survivorship fallacy: An unrepresentative sample where 'survivors', 'winners' or 'high performers' are cherry picked to form an optimistically-biased sample. Example: Say 80% of people in a population are Christian and 20% are not, and car crashes kill and spare people independent of their religious beliefs. 80% of the survivors say that Christianity saved their lives, which leaves the false impression that Christianity works, because the 80% of the dead who were also Christians don't get to tell how Christianity failed to save them… because they are dead.[21]
    4. Reverse survivorship/Casualty fallacy: An unrepresentative sample where 'casualties', 'losers' or 'low performers' are cherry picked to form a pessimistically-biased sample.[32]
    5. Small sample: Using a sample size that is too small to generate statistically relevant conclusions due to insufficient data.[15]
      1. Insufficient statistics: the drawing of statistical conclusions from the small sample size.[15]
      2. Overprecision (also known as fake precision): Assuming a prediction is exactly correct for any given point.[21][15]
      3. Anecdotal evidence or pragmatic fallacy: Using anecdotal evidence to make a general point. Example: Lorenzen Wright married his high-school sweetheart who is now on trial for his murder. This proves that you should never marry your high-school sweetheart. Cherry picking (also known as one-sidedness, suppressed evidence, and the fallacy of exclusion among others) is the result of intentionally only using information that supports one's desired general point and ignoring the evidence that contradicts it.[19][21][15]
        1. Apex fallacy: Using the best/worst group to generalize to the whole group, e.g. nutpicking where one is using examples that are insane(ly great) to represent a group.[15]
        2. Lack of proportion: Exaggerating or downplaying and/or contradicting a piece(s) of evidence that one is using to reach a conclusion.[15]
          1. Disregarding known science: Making a claim (without good evidence) that ignores and/or contradicts a scientifically-substantiated fact.[15]
          2. Exaggeration: Overemphasizing information of questionable relevance when coming to a conclusion.[15]
        3. Argument by selective reading: Acting as if the weakest argument made by an opponent was the only one made and focusing one's rebuttal on only that argument.[21]
        4. Oversimplification: Making a complicated issue appear simple when it really isn’t.[15][19]
    6. Ludic fallacy: Presuming that your statistical model works in situations where it doesn't.[21]
    7. Selective attention: Focusing on certain particulars of an argument while ignoring other aspects of it, such as in the case of the availability heuristic when certain facts are more easily recalled than others, resulting in an unrepresentative sample from which to draw conclusions.[21][15]
      1. Misleading vividness: A few dramatic events such as plane crashes give the mistaken impression that it is unsafe to fly when, in fact, it is statistically safer to fly than it is to drive.[21][15]
      2. Spotlight fallacy: when highly publicized data on a group is incorrectly assumed to represent a different or larger group, e.g. the tokenism of a rich politician who goes to a homeless shelter on Thanksgiving and shakes a couple of hands for about an hour then goes home to his or her mansion. News reports show the politician shaking hands. The next day, the politician gives tax breaks to the 1% and raises taxes on the middle class.[21][15]
      3. Historian's fallacy: Dixiecrat fallacy: Dems supported segregation! Dems are racist![21]
      4. Retrospective determinism: Assuming that because an event occurred under a set of circumstances, that it was bound to happen under those circumstances.
      5. Confirmation bias: Seeing only evidence that supports one's hypothesis and overlooking evidence that would contradict it. The toupée fallacy strengthens this with the claim that one is inherently not privy to this evidence.[15]

No True Scotsman[edit]

See the main article on this topic: No True Scotsman

When a group is redefined on the spot because the original claim about it is indefensible: someone points out the group's obvious deficiencies, so the claimant revises the claim, most frequently by carving out numerous exceptions to make it "accurate", and then acts as if the revised claim were the same as the original one, even though it no longer has any real meaning. Examples:

  1. P: Christians are inherently moral people, but there are Catholic priests who have molested altar boys (because people with authority tend to abuse it), which is inherently immoral. C1: Therefore the molested altar boys, and not the priests (if anybody in that scandal could have been), were the true Christians. C2: Therefore, the true Christians are the ones that have no "true" authority within the church.[21][15]
  2. Except for 9/11, the invasion of Iraq, the federal response to Katrina, and the financial crisis, George W. Bush's tenure as President proved that he was skilled at his job.[21]

Category mistake[edit]

See the main article on this topic: Category mistake

Confusing what is true of a part with what is true of the whole.

  1. Fallacy of composition: Individual things a whole entity comprises have characteristics A, B and C etc., therefore the whole entity has characteristics A, B and C.[19][21][15][24]
  2. Fallacy of division: The whole entity has characteristics A, B and C therefore its parts have characteristics A, B and C.[19][21][15][24]

False dilemma[edit]

False dilemma (also known as the Black-or-White fallacy or false dichotomy): When two opposing views are presented as the only options when they are not.[19][21][15]

The alternative advance is when both of the options presented to you are essentially the same thing, just worded differently.[21]

False equivalence[edit]

False equivalence: When you presume that two things are the same when they are not.[33][15]

  1. Moral equivalence: Arguing that two things are morally equal, even though they are not.[33]
  2. Political correctness fallacy: When you presume that people's ideas are of equal value or are equally true when they are not. (Think Galileo.) In the case of argument to moderation (argumentum ad temperantiam), one is technically presuming that somewhere between two disparate positions, both of which are partially incorrect, there must be a compromise position that is correct. Examples:
    1. Medicare Part D was such a great bill because it was a compromise between the positions of the Republicans and the Democrats. In fact, it was so great that pharmaceutical companies are price-fixing drugs, violating anti-trust laws, and costing American taxpayers billions of dollars. Here is a 60 Minutes expose[34] of just how great the compromise between Republicans and Democrats is for America.
    2. Winner-take-all is an anti-democratic way to run a multi-candidate election, but people would have "too much of a vote" in a purely proportional system. Therefore, Republican Presidential Primaries began to be run so as to award delegates to all candidates, so long as no candidate has won an absolute majority of the vote in that state.[21] This also involves an excluded middle, as it ignores gerrymandering of districts.
  3. Balance fallacy — Giving equal weighting to both sides of an argument, even if one really doesn't deserve the time.[15]
  4. The fallacy fallacy – the presumption that because a claim has been poorly argued, or a fallacy has been committed, the claim itself must be wrong.

Fallacies of relevance[edit]

Red herring: A group of fallacies which bring up facts or issues which are irrelevant to the argument often in an attempt to distract the opponent and/or audience.[19][21][15][24]

  1. Rights To Ought: The speaker deflects criticism of a behaviour or statement by declaring that they have the 'right' to perform said action. This is utterly irrelevant: just because you can do something does not mean it is desirable, pragmatic, or beneficial in any way to anyone.[35]
    1. Ignoratio elenchi: Missing the point by refuting something that is not stated. Related to the straw man.
  2. Appeal to Force: Using force or the threat of force to gain acceptance of one's conclusion.

Argument from ignorance[edit]

Argument from ignorance (argumentum ad ignorantiam): When it is claimed that a proposition is true because it has not yet been proven false or that it is false because it has not yet been proven true.[19][21][15]

  1. Science doesn't know everything: P1: If science (or a person) can't explain X, then Y is true. P2: Science can't explain X. C: Y is true.
    1. Moving the goalposts: Science explains/discovers the X of the first premise, refuting the support for Y. New requirement: Well, if science can't explain Z, then Y is true. Science explains Z. New requirement: Well, if science can't explain W… Inflation of conflict results if the argument treats incomplete agreement about a certain X, Z, or W as sufficient grounds for concluding that nobody knows anything at all.[21]
    2. Confusing the currently unexplained with the unexplainable: Science hasn't explained how the Big Bang began (X), therefore Y = it will forever remain unknown.[21]
    3. One single proof: Dismissing all circumstantial evidence in favor of a single "smoking gun" that may not (and may not need to) exist.
  2. Shifting of the burden of proof (onus probandi): When one asserts something to be true without evidence for one's position, or against it in the case of a negative proof (also known as proving non-existence), and then one asks people to prove them wrong. (A person asserting a fact is the one who has to have proof, not the other way around.)[21][15]
  3. Missing data fallacy: One's hypothesis has been proven wrong. One asserts, "Well there is yet to be discovered information that will prove my flawed hypothesis or conclusion to be true."[21]
  4. Appeal to complexity: I can't understand something therefore no one else can either. This could be due to Willful ignorance.[21]

Genetic fallacy[edit]

Genetic fallacy: Occurs when the origin of a claim is used to establish truth or falsehood rather than the claim's current factual merits.[19][21][15]

Appeal to false authority[edit]

Appeal to false authority (argumentum ad verecundiam): Incorrectly asserting that respect given to some authority proves the assertion to be true.[19][21][24]

  1. Ultracrepidarianism: When a source is quoted outside their expertise, as if expertise in one field extended to another.
    1. Professor of nothing: When a source is introduced as "Prof." or "Dr.", yet they aren't, or their credentials are from a diploma mill.
    2. Appeal to celebrity: When a source is supposedly authoritative because of the respect people give them.[21]
  2. False attribution: Using an unreliable, fabricated, irrelevant or other form of untrustworthy source as the basis of one's argument.[21]
    1. Appeal to definition (argumentum ad dictionarium): If the dictionary says what I think something means, the dictionary is right. If not, find a new dictionary. The etymological fallacy occurs when this fallacy confuses the original meaning of a word and its current meaning.[19][21][15]
    2. Generalization from fictional evidence: Using a fake story to make a general point.
    3. Linking to authority: When a source is "cited" in-text yet the reference doesn't exist / is irrelevant / says something else.
    4. Anonymous authority: When a source is quoted (or supposedly quoted), but no name is given, e.g. because the person citing it doesn't have first-hand knowledge of it but knows somebody (who knows somebody…) who said that this is what it said. (It is not hearsay if the source itself is cited saying it, so do that instead.)[21]
    5. Quote mining (it is also a fallacy of accent): When an authority is selectively quoted to distort their views, or misquoting someone to gain the appearance of authority.[19][21][15]
    6. Invincible authority: When a source is the entirety of an argument (which one knows due to amazing familiarity with the source/argument if not complete omniscience).[21]
  3. Alleged certainty: Asserting that a conclusion is certain because "everyone" knows it to be true (even though there are people who would rationally disagree with the conclusion).[21]
    1. Appeal to common sense: Arguing that “common sense" supports one's favored conclusion. As many of the readers of this page probably realize, there is no agreement over what constitutes common sense. And if it does exist, it can be awfully fallacious.[21]
    2. Appeal to self-evident truth: Arguing that something is true because it is "self-evident". What is or is not self-evident is highly debatable and subjective by definition. But that (self-evidently) doesn't stop you from fantasy projection, or expecting other people to accept your subjective interpretation of experiences as the basis for objective truth.[21]
    3. Proof surrogate: To prove X, I will assert it to be true without providing any evidence for my conclusion, but I will assert it confidently so you will believe me.[21][15]
  4. Blind authority fallacy (also known as the appeal to the law and appeal to Heaven or deus vult): When one believes something to be true simply because the person saying it is in charge, e.g. the "rights to ought fallacy" of confusing what one has a legal right to do with what one ought to do. Which is to say, one has the legal right to protest a march for breast cancer awareness. Having that right doesn't mean that one ought to do it though.[21] Amongst certain audiences, "Bible-believing scientists", though only a sub-set of the larger group of "all scientists", have greater credibility...
  5. Appeal to consequences of a belief (argumentum ad consequentiam): Whether something is true or not depends on whether the consequences of it being true are desirable or undesirable, e.g. appeal to force (argumentum ad baculum), wherein one uses force or the threat of it to support one's argument; here the negative consequences are brought about by the person making the argument.[19][21][15][24]
    1. Argument by censorship: I have created silence; this shows that my point cannot be responded to!
    2. Galileo gambit — If someone is going against the tide of popular thinking, for which people have even died (argumentum ad martyrdom), they must be correct because Galileo was right, while in reality, Galileo was right because he had evidence.[21]
    3. (Self-)Righteousness fallacy: Assuming that if a person (whether self or other) has good intentions, then they also know the truth.[21]
    4. Wishful thinking: The desire for something, especially if improbable (appeal to possibility) or even impossible, to be true makes it true.[19][21][15]
  6. Argumentum ad populum (also known as the bandwagon fallacy, appeal to common belief, and the authority of the many - among others): Most people believe X to be true therefore it must be true.[19][21][15][24]
    1. Groupthink: When one reasons the same way everyone else does in their group out of a desire for social acceptance or because one is too stupid to think independently.[15]
    2. Appeal to tradition (argumentum ad antiquitatem): Because it's always been that way, it's absolutely the right way![21][15]
    3. Appeal to popularity (argumentum ad numeram): The popular thing to do or believe in is also the right thing to do or believe in, even in spite of a Silent Majority precluding much, if any evidence of its popularity.[21][15]
  7. Ipse dixit: When a source is the person making the argument. Example: Person X stole 10 million dollars from me because I said so, even if I don't actually have 10 million dollars for anybody to steal.

Ad hoc[edit]

Ad hoc (meaning literally, "for this"): When some idea is asserted purely to shore up some other idea.[21][15]

  1. Lying: Intentionally saying something that isn't true.[15]
    1. Argumentum ex culo: When some fact is cited to defend something, but the fact is entirely fictional.
    2. Rationalization (also known as making excuses): Inventing a reason for something instead of giving the real reason. Example: "I can't go on a date with you because I am too busy with school right now to get involved with someone."[21][15]
  2. Misrepresentation: A mischaracterization of an opposing position, very often a straw man constructed for greater rhetorical flexibility, like what Ayn Rand did with socialism.[19][21][15]
  3. Non causa pro causa (also known as false cause) is an enthymeme which does not state that everything is the effect of something else, especially in certain forms:
    1. Post hoc, ergo propter hoc: Because event A happened before B, A must have caused B.[19][20][15][24]
    2. Cum hoc, ergo propter hoc (also known as correlation does not imply causation): Concluding that because A is correlated with B, A caused B.[19][21][15]
      1. Confounding causation (also known as joint effect): Asserting X causes Y when, in reality, X and Y are both caused by Z (either simultaneously or sequentially).[20][15][19][36]
      2. Coincidence: Asserting X causes Y when, in reality, the correlation is a statistical anomaly.[19][36]
      3. Reverse causation (or wrong direction): When a cause is mistakenly considered an effect.[20][15][19][36]
    3. Regression fallacy: Something naturally fluctuates. For example, a person gets sick on occasion. When they get sick, they take snake oil as a cure-all. They later feel better because they have reverted to the mean which for them is feeling healthy. They falsely conclude snake oil was a cure even though they only reverted to the mean.[19][21][15]
    4. Magical thinking (or superstitious thinking): Making causal connections between A and B based upon superstition rather than evidence. I danced for rain. It rains a week later. I caused the rain. I wore my lucky baseball cap. My team won. My wearing of the baseball cap has the power to make my team win.[21][15]
    5. Fallacy of the single cause (also known as causal oversimplification, causal reductionism, and the reduction fallacy): When it is assumed that there is a single, simple cause of an outcome when in reality it may have been caused by a number of jointly sufficient causes, e.g. the insignificant cause, where one minor factor out of several contributing factors is treated as the sole cause.[21][15][20]
  4. Irrelevant reason — When one uses premises that are not relevant to the issue at hand.[15]
    1. Psychogenetic Fallacy: Assuming that there is a psychological reason why an argument is invalid. Example: You think I am dumb because you are on your period.[21]
    2. Confusing an explanation with an excuse: Assuming that someone’s explanation for bad behavior somehow excuses it.[21][15]
  5. Slothful induction (also known as appeal to coincidence): Ignoring the strongest conclusion of an inductive argument to focus on a weaker one.[21]
    1. Least Plausible Hypothesis: Favoring a hypothesis with a lower probability of likelihood over one that is far more probable.[21]
    2. Far-Fetched Hypothesis: Favoring a hypothesis that is not plausible over the more probable hypothesis.[21][15]
  6. Smokescreen: offering up irrelevant information to obscure the relevant information.[15]
    1. Quantum Physics Fallacy: Hmmm… how do I prove point X? Oh I know. People don't understand quantum physics so I will say that point X is proven by the uncertainty principle.[21]
    2. Zero-Sum Fallacy: Hmmm… now how do I prove point Y? I think I will use game theory and call it a zero-sum game. This is a fallacy commonly found in economics. There are valid ways to use game theory in economics, but you have got to be smart about it, like this guy.[33]
    3. Spiritual Fallacy: When something can not be explained using conventional logic, the person claims that it's correct in a 'spiritual' way. Examples:
      1. the Holy Trinity of Christianity (i.e., "The Holy Spirit, God, and Jesus are all one entity but they are also three separate entities at the same time").
      2. the interpretation of the Bible as somehow forbidding male-male sex absolutely (also includes Quote Mining and Destruction in Translation)[21]
    4. Chewbacca Defense: A parody of Johnnie Cochran's famous closing argument in the O. J. Simpson trial: "Cochran: Ladies and gentlemen of this supposed jury, I have one final thing I want you to consider. Ladies and gentlemen, this is Chewbacca. Chewbacca is a Wookiee from the planet Kashyyyk. But Chewbacca lives on the planet Endor. Now think about it; that does not make sense!" (Technically, most of this is true within Star Wars.)
    5. Ad hominem: When the source of the argument is attacked, rather than their idea.[19][21][15][24]
      1. Association fallacy: When someone's associations are used as evidence against their ideas.[19][21][15]
        1. Bad Seed: Arguing that the "Apple doesn't fall far from the tree."[15]
        2. The Hitler Card (also known as Reductio ad Hitlerum or Hitler Ate Sugar[37]): Hitler spoke German and you are learning to speak German as a second language, therefore your arguments have no merit as you are just like Hitler.[19][21]
      2. Appeal to bias (also known as ad hominem circumstantial and vested interest): Arguing that someone's argument has no merit because he or she stands to profit from it being true in some way.[21][15] The Shill gambit (also known as faulty motives) is the form of this fallacy asserting an arguer is working for someone and spreading disinformation.[15]
      3. Ad Fidentiam (argumentum ad fidentiam): attacking a person's self-confidence.[21] Argumentum ad cellarium is the form of this fallacy specifically accusing the arguer of still being in "mom's basement".
      4. Poisoning the well and demonization: Where an opponent is pre-painted as (unequivocally) terrible.[21][15]
      5. Tu quoque (argumentum ad hominem tu quoque): Where a criticism is falsely dismissed because its author is also guilty of the charge. Whataboutism is the form of this fallacy which includes red herrings or balance fallacies.[19][21][15][24]
      6. Subjectivist fallacy (also known as: relativist fallacy): When some objective fact is asserted to be true for some people but not true for others.[21][15]
      7. Damning with faint praise: When someone is attacked through praise of an achievement that isn't praiseworthy or isn't significantly praiseworthy, suggesting that no achievements worthy of praise exist.
      8. Tone argument: If you can't keep it civil, you clearly can't make truthful statements!
      9. Identity fallacy (also known as Bulverism): When the truth of an argument is judged by the arguer's physical appearance, social class, or other form of social identity. Example: Chinese immigrant: "Not all Chinese people are good at math." Person 2: "Yes they are. And why should I believe you? You are Chinese!"[21]
      10. Fallacy of opposition (also known as the Gadarene swine fallacy or the traitorous critic fallacy [ergo decedo]): When someone's opposition to your opinion is taken as proof of their incorrectness. Example: Person 1: "No foreign country has as many problems with gun violence as America, because of their tougher gun laws." Person 2: "Well, if you like them all so much more, which one are you going to move to?"[21][15]
      11. Two wrongs make a right: A Hatfield: "A McCoy killed our kin! That ain't right! Let's get 'em!" (kills a McCoy). A McCoy: "A Hatfield killed our kin! That ain't right! Let's get 'em!" (kills a Hatfield). (Repeat.)[19][21][15]
    6. Emotional appeal: Evaluating an argument based on feelings rather than logic.[19][21][15] These appeals are usually syllogistic; see the Emotional appeal section above.
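
As a concrete illustration of the correlation-causation traps above (confounding in particular), here is a minimal Python sketch; the variables, numbers, and the ice-cream/drowning interpretation are invented for the example and are not drawn from the cited sources. A hidden variable Z drives both X and Y, so the two correlate strongly even though neither causes the other.

import random

random.seed(42)

# Illustrative numbers only (not from the article's sources).
# Z is the hidden confounder (say, summer heat); X and Y both depend on Z
# (say, ice-cream sales and drowning incidents), but neither causes the other.
n = 10_000
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 0.5) for zi in z]  # X = Z + noise; Y appears nowhere here
y = [zi + random.gauss(0, 0.5) for zi in z]  # Y = Z + noise; X appears nowhere here

def pearson(a, b):
    # Plain Pearson correlation coefficient.
    mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - mean_a) * (bi - mean_b) for ai, bi in zip(a, b))
    var_a = sum((ai - mean_a) ** 2 for ai in a)
    var_b = sum((bi - mean_b) ** 2 for bi in b)
    return cov / (var_a * var_b) ** 0.5

print(f"corr(X, Y) = {pearson(x, y):.2f}")  # roughly 0.8, despite zero causal link between X and Y

Concluding from the high correlation that ice cream causes drowning would be cum hoc ergo propter hoc; the confounder (heat) accounts for the whole pattern, which is exactly what "confounding causation" describes.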

Weak analogy[edit]

Weak analogy: Using an analogy whose similarities are too weak or irrelevant to prove or disprove the argument at hand.[19][21][15][24]

  1. Faulty Comparison: Comparing two things as if they were related when they are not, in order to convey the idea that one is better than the other. Example: Motorcycle X gets 5 times better gas mileage than the best-selling automobile Y.[21][15]
  2. Incomplete Comparison: A comparison that fails to state what it is being compared to. Example: Our garbage bags are 40% stronger![21]
  3. Extended analogy (Reductio ad Hitlerum): Saying something is bad because Hitler (allegedly) did it. Sometimes called "Hitler ate sugar."[19][21]
  4. Appeal to the Moon (argumentum ad lunam): Arguing that if we can put a man on the Moon, then surely we can cure trisomy 13.[21]
  5. Appeal to Extremes: Misrepresenting a reasonable argument by using extreme examples to try to make it look fallacious.[21]

Fallacies of clarity/ambiguity/vagueness[edit]

Fallacies of clarity/ambiguity/vagueness (equivocations): Fallacies that produce logical confusion through a lack of logical or linguistic precision, often by (consciously or unconsciously) substituting the meaning a word has in one context for another context where it does not apply. Intensional fallacies (also known as the ambiguous middle term) and extensional fallacies both depend on using words or phrases that are open to more than one interpretation and treating the different meanings of the same word or object as equivalent when the differences matter, though each type depends on this in a different way.[19][21][15][24]

  1. Argument of the Beard (also known as the fallacy of the heap and the continuum fallacy): When one argues that there is no difference between two extremes of a spectrum because one is not sure when a man goes from being clean shaven to having a beard.[19][21][15]
    1. Science was wrong before: And therefore it can never be right.
    2. Wronger than wrong: The fallacy of assuming that different degrees of "wrong" are the same.
    3. Not as bad as (also known as relative privation): A moral fallacy that says because B is worse than A, A should be seen as something good. Example: Sure you may have lost your arm, but at least it wasn't both of your legs.[21]
    4. Nirvana fallacy: Claiming that a realistic solution is useless because it is not as good as an idealized perfect solution.[21][15]
  2. Slippery slope: A leads to B which leads to C which leads to D which leads E which leads to zebras having relations with elephants.[19][21][15][24]
  3. Fallacy of definition: Fallacies that create confusion about the exact meaning of a word or phrase. The most obvious of these is the circular definition, which fails to establish any new information about its referent.[21]
    1. Phantom distinction (also known as distinction without a difference): When someone spends time arguing for the superiority of one term over another (rather than the intended debate), yet there is no effective difference.[21]
    2. Definist fallacy: When one makes up definitions with no real meaning and/or with loaded language in order to make one's position easier to defend.[21][15]
    3. Failure to elucidate (obscurum per obscurius): Purposefully making a definition more difficult than it needs to be.[21][15]
      1. Proof by Intimidation (argumentum verbosum): Purposefully making one's argument incomprehensible in order to intimidate those who would object to the premises if they could understand what was being said.[21]
      2. Deepity: "Love is more than just chemicals. It is also the quantum fluctuations of the sublime."
    4. Suppressed correlative: Attempting to redefine two mutually exclusive options so that one encompasses the other. Example: Person 1: That haunted house was pretty good. Were you scared or not? Person 2: Well, if you define scared as not having complete understanding of the future, then I am always scared.[21]
  4. Fallacy of accent: When the meaning of a text is changed by which word or words are stressed, and the intended stress is unclear. For example, "She is a born-again virgin?" expresses a different kind of disbelief depending on which word is stressed. The fallacy occurs when a word is stressed one way in statement 1 and another way in statement 2. Extreme forms of this fallacy, such as quoting out of context (also known as contextomy), omit the would-be unstressed word(s) outright to distort the meaning of a text.[19][21][15]
  5. Mistaking the map for the territory: When a representation of something is treated as if it were the thing it represents.
    1. Reification (also known as hypostatisation): When an abstraction is treated as if it was something concrete.[21][15]
    2. Anthropomorphism: the attribution of human traits to animals, a deity(s), or inanimate objects (pathetic fallacy).[21][15]
  6. Appeal to equality: Using the ambiguous and emotionally charged word "equality" to argue that people, things, or concepts (places, ideas, or data) should be treated equally, when what exactly that means is far from apparent.[21]
  7. Use-mention error: Confusing a word that describes a thing with the thing itself. Example: Anselm's ontological argument.[21]
  8. Deepity: A statement that equivocates between two readings: one vacuous reading that is true, and another that sounds profound but is false if it is meaningful at all.
  9. Fallacy of amphiboly: When a sentence, because of its grammar, structure, or punctuation, can be interpreted in multiple ways.[19][15][24]
    1. Scope fallacy: When the scope of a logical operator (e.g., "not", or "some", "every", and "all" in the case of the fallacy of every and all) is vague and allows for misinterpretation and incorrect conclusions; see the worked example after this list.[19][15][21]
    2. Type-token fallacy: A fallacy that confuses types of things with tokens (individual instances or counts of things). Example: Person 1: "We sell dozens of signs (types), anything from stop signs to deer crossing signs." Person 2: "How do you stay in business if you only sell dozens of signs?" (tokens)[21]
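
A worked example of the scope fallacy, using the stock textbook sentence "All that glitters is not gold" (the formalization is illustrative and not taken from the cited sources). The two readings differ only in where the negation sits:

\neg\forall x\,(\mathrm{Glitters}(x) \rightarrow \mathrm{Gold}(x)) \quad \text{("not everything that glitters is gold": true)}
\forall x\,(\mathrm{Glitters}(x) \rightarrow \neg\mathrm{Gold}(x)) \quad \text{("nothing that glitters is gold": false, since gold itself glitters)}

Sliding between the narrow-scope and wide-scope readings lets an arguer treat the plausible first claim as if it established the much stronger second one.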

Conditional[edit]

See the main article on this topic: Conditional fallacy

For the purpose of this list, a conditional "fallacy" is an argument that may or may not be fallacious depending on how the argument is constructed. The fallacious forms of the argument can be placed in the informal category section (and many such fallacies are already listed there).

  1. Appeal to authority: When an appeal to authority is done correctly, then it can be called an appeal to a qualified authority and is not a fallacy. When it is done incorrectly, it can be called an appeal to false authority. Determining what is or is not a qualified authority is the subject of epistemology and is beyond the scope of this fallacy list. While determining an authority's qualifications is often viewed from a scientific vantage point, it is not limited to that field of study. The "He said she said" problem is also a question of whether an authority is qualified or not.[15]
  2. Is/ought problem (also known as Hume's law): The is/ought problem stipulates that "what is" is fundamentally distinct from "what ought to be". Consider the issue of black rhinos. Descriptions of what is happening to black rhinos (the "is") cannot by themselves determine whether rhinos ought to be environmentally protected or allowed to go extinct (the "ought"). The ought is a human value associated with the "what is", but it is not a "what is" itself, as it is contingent on subjective experience, which varies from person to person (though some oughts are more universal than others). Most of the problems pertaining to the is/ought problem have been placed in the conditional category, as they are contingent on the values a person or society stipulates to be true, though some of the fallacies associated with the is/ought problem have been placed in the informal category because of their close relationship to other fallacies. For example, the "rights to ought fallacy" was placed under the "appeal to the law" fallacy due to their close association. The following fallacies associated with the is/ought problem are often regarded as examples of fallacious reasoning.[15]
    1. Moralistic fallacy: Concluding "what ought" determines "what is". Example: Homosexuality ought not to occur and therefore it is not something that is natural.[21]
    2. Naturalistic fallacy: Concluding "what is" determines "what ought". Example: Pedophilia is natural and therefore it ought to be allowed.[21][15]
    3. McNamara fallacy: Making a decision based only upon things that can be quantified and ignoring things that have a qualitative component. Example: Quantitative argument made: "Denying education to people who are here illegally will save taxpayers X amount of dollars." Qualitative argument ignored: a seven-year-old is on the street instead of in school because their parents are at work and they have no adult supervision.[21]
    4. Economic fallacies: Economics can be thought of as a collective ought: we ought to promote laissez-faire capitalism, or we ought to promote socialism, for example. Given that economics is entangled with what people believe ought to occur, economic fallacies can be characterized as a subcategory of the is/ought problem.
      1. Hyperbolic discounting: Choosing to ignore the future in order to focus on present rewards (see the sketch after this list). Example: Present: Fracking boosts the local economy! Future: Florida is under the sea.[15]
      2. Sunk cost: "I have spent X amount of money searching for this sunken treasure and it's not anywhere I thought it would be. Well, I had better spend some more money, or else all the money I have already spent will have been wasted." Example: Oak Island money pit.[21]
      3. Broken window fallacy: A fallacy that asserts that the destruction of property in things like natural disasters actually boosts the economy. It fails to factor in what the money would otherwise be used for if it wasn't being used for reconstruction.[21]
      4. Just in case fallacy: Basing one's judgement on the worst-case scenario without adequately factoring in the cost-to-benefit ratio that would cause one to come to a different conclusion. For example, one could conclude that one should spend money on flood insurance for a home in the middle of the Mojave desert due to the very unlikely scenario that changing weather patterns could cause one's home to be caught in an unprecedented deluge of water.[21]
      5. Game theory fallacies: These fallacies are conditional. When game theory is done properly, as it was by John Nash, it is not a fallacy; when it is done improperly, it is ad hoc.[33]
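
To make hyperbolic discounting concrete, here is a minimal Python sketch; the payoffs, delays, and discount parameters are invented for illustration and are not taken from the cited sources. A hyperbolic discounter values a reward at V/(1 + k*d) after a delay d, which produces a preference reversal that a consistent exponential discounter (V*exp(-r*d)) never shows.

from math import exp

# Illustrative parameters only (not from the article's sources).
def hyperbolic(value, delay_weeks, k=1.0):
    # Present value under hyperbolic discounting: V / (1 + k * delay).
    return value / (1 + k * delay_weeks)

def exponential(value, delay_weeks, r=0.5):
    # Present value under exponential discounting: V * exp(-r * delay).
    return value * exp(-r * delay_weeks)

# Choice 1: $100 now versus $120 in one week.
print(hyperbolic(100, 0), hyperbolic(120, 1))      # 100.0 vs 60.0  -> grab the $100 now
# Choice 2: the same pair pushed a year out: $100 in 52 weeks vs $120 in 53 weeks.
print(hyperbolic(100, 52), hyperbolic(120, 53))    # ~1.89 vs ~2.22 -> now wait for the $120
# An exponential discounter ranks both pairs the same way, so no reversal occurs.
print(exponential(100, 0) > exponential(120, 1))   # True
print(exponential(100, 52) > exponential(120, 53)) # True

The reversal is the "ignore the future" pattern the hyperbolic discounting entry describes: the nearer reward looms disproportionately large only while it is close at hand.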

Argumentative[edit]

For the purposes of this list, argumentative fallacies are ones that occur in communication, whether verbal or written. These fallacies often incorporate many of the informal fallacies listed above when presenting information.

  1. Having Your Cake (If-by-whiskey): Using words with strong connotations to hide the fact that one is supporting both sides of an issue and therefore not stating a position.[21]
  2. Slanting: Presenting a false representation of a particular argument by misrepresenting, falsifying, misconstruing, and/or suppressing evidence.[15]
    1. Lying with statistics: Using flawed statistics or a biased presentation of a statistical outcome to convey the idea that one's position has more support for it than it does.[21]
    2. Argument by gibberish: "Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children" (the title for Andrew Wakefield's paper in the Lancet). Alphabet soup results from boiling this fallacy down to acronyms.[21]
    3. Smear tactic: Attacking an opponent's character or position in an untruthful way.[15]
    4. Spin doctoring: Seeking to deceive people by presenting deceptive information that creates a distorted view of reality tailored to one's agenda.[21]
    5. Shoehorning: Current event X happens. Someone with an agenda uses X to show how their agenda is correct even though there is no rational connection between the two events. Example: A devastating earthquake hits Haiti. A religious commentator says Haiti is paying for its pact with the devil.[21]
  3. Style over substance fallacy: Using language or rhetoric (ethos or pathos) to enhance the appeal of an argument, but not its validity, or arguing the method of presentation affects the truth of a claim.[21]
    1. Escape hatch: When some rhetorical technique is used to evade the burden of proof.
    2. Handwave: The act of glossing over a difficult component in an argument, by ignoring or distracting from it.
    3. Argument by fast talking: When one talks like an auctioneer to convey the idea that one is really intelligent and therefore must be right.[21]
    4. Gish Gallop: The debate tactic of drowning your opponent in a flood of individually-weak arguments in order to prevent rebuttal of the whole argument collection without great effort.
    5. Argument by personal charm: Using one's charm or sex appeal to win over an audience rather than reasoned arguments.[21]
    6. Argument by emotive language: Using loaded language in an argument instead of rational arguments based on statements of fact in order to persuade the listener to one's position.[21]
    7. Hypnotic bait and switch: Beginning with a series of uncontroversial statements that the listener will agree with, then switching to a controversial statement in the hope that the listener will agree with that as well; it is a common sales technique. Motte and bailey combines this with equivocation between the uncontroversial but not very useful statements and the more useful but much more controversial one, allowing you to pretend that the listener doesn't really disagree with you.[21]
  4. Argument by pigheadedness: Stubbornly refusing to accept rational counter-arguments to one's position without providing any reasons as to why the counter arguments are wrong.[21]
  5. Quibbling and logic chopping: Focusing on a minor point and falsely believing that this minor point undermines the larger issue. Sometimes extreme precision about what one is saying is needed, such as in scientific papers, but in everyday life it is often more useful to speak in general terms than to get bogged down in details.[21][15]
  6. Argumentum ad nauseam (argument by repetition): If you say something often enough to make people vomit, you win. The reverse side of this is Argumentum e[x] nausea: If people have told you something often enough to make you vomit, you win by saying anything that is not that.[21]
    1. Argument from silence (argumentum e[x] silentio): The lack of response to my point(s) makes my point(s) correct!/The lack of response to my counterpoint(s) to your point(s) makes your point(s) incorrect![21]
    2. Argument by assertion: If you say something enough times, it eventually becomes true and therefore you win the argument.
  7. Avoiding the issue and avoiding the question: These are often ambiguous because the lead-in does not clearly tell the respondent whether they are meant to address a specific question or an entire issue. Example:
Person 1: There are reports of you having an affair with your intern. (Ambiguous: is this the one affair Person 2 is reportedly having with one particular intern, or one of multiple affairs Person 2 is concurrently having with several interns?)
Person 2: Let me tell you about my new tax plan.[21][15]

Fallacy collections[edit]

There are lots of fallacy collections on the Web. Some of them promote a particular agenda, but most fallacies listed in them are real and turn up in arguments every day. Unfortunately, many of the collections are now deprecated.

Here is a list of websites, ordered roughly by usefulness:

  1. Wikipedia
  2. Visualization: Rhetological Fallacies, InformationIsBeautiful.net
  3. Master List of Logical Fallacies, University of Texas at El Paso
  4. Fallacy Files
    1. Taxonomy of Logical Fallacies
    2. Glossary
    3. What is a logical fallacy?
  5. Your Logical Fallacy Is
  6. Internet Encyclopedia of Philosophy
  7. Secular Web
  8. Nizkor Project
  9. Skeptic's Dictionary
  10. About.com: Agnosticism/Atheism
  11. Arthur Schopenhauer
  12. Stephen's Guide to the Logical Fallacies
  13. Dr. Michael LaBossiere
  14. Free Dictionary
  15. Bruce Thompson
  16. Don Lindsay
  17. Art of Debate
  18. George Boeree
  19. Philosophy in Action
  20. Daniel Kies
  21. L. Van Warren
  22. Agent Orange
  23. Humanist Discussion Group

Deprecated ones, listed ad hoc:

  1. Sinclair Community College
  2. Global Tester
  3. Anti-Mormon Illogic
  4. Objectivism
  5. Evolution_V_Creation forums
  6. Peter A. Angeles
  7. Sine Wave
  8. Carleton University
  9. P5
  10. Mathenomicon
  11. Vanessa Hall
  12. J. P. Craig
  13. Informal Fallacies
  14. Autonomist
  15. Gordon, Hanks, & Zhu
  16. Freemasonry
  17. Taking Sides
  18. Jeff Richardson
  19. Chisnell.com

In a nutshell[edit]



See also[edit]

The Russian-language version of this article is Логическая ошибка.


For those of you in the mood, RationalWiki has a fun article about Justification generator.

External links[edit]

Notes[edit]

  1. See Essay:Why are logical fallacies effective?
  2. For instance, birds.
  3. This was placed in the formal category because statistics is based on mathematical logical proofs.
  4. Using some form of statistical inference is the only mathematically valid way to evaluate evidence and determine the likelihood of guilt or innocence, but using Bayesian statistics is against US law (see the worked example after these notes).[26]
  5. According to a federally-funded study made available online by the National Criminal Justice Reference Service, 2-8% of reported rapes in Los Angeles, California were false allegations.[28]
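
As a worked illustration of why base rates matter when weighing evidence of guilt (the numbers are invented for the example and are not taken from the cited study): suppose a forensic test matches an innocent person 1% of the time, the suspect pool contains 1,000 people, and exactly one of them is guilty. Bayes' theorem gives:

P(\text{guilty} \mid \text{match})
  = \frac{P(\text{match} \mid \text{guilty})\,P(\text{guilty})}
         {P(\text{match} \mid \text{guilty})\,P(\text{guilty}) + P(\text{match} \mid \text{innocent})\,P(\text{innocent})}
  = \frac{1 \times 0.001}{1 \times 0.001 + 0.01 \times 0.999}
  \approx 0.09

Despite the "99% accurate" match, the probability of guilt on this evidence alone is only about 9%; jumping from the 1% false-positive rate to near-certain guilt is the base-rate neglect the note refers to.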

References[edit]

  1. A Logical Vacation by Julia Nefsky (2005) Philosophy Now 51:7-10.
  2. What is a logical fallacy? Fallacy Files.
  3. Probabilistic Fallacy Fallacy Files.
  4. Logical Fallacies The Skeptics Guide to the Universe (archived from March 18, 2019).
  5. Logical Fallacies
  6. Fallacies by Michael C. Labossiere (1995) The Nizkor Project (archived from January 3, 2015).
  7. Wealthy Americans can win any fight! (c. 2016) Reddit.
  8. FAQ: Logical fallacies and how to beat them by Tony Koutsoumbos (Oct 23, 2015) Medium.
  9. Logic Chopping Logically Fallacious.
  10. Vicious Circles and Infinity — A Panoply of Paradoxes by Patrick Hughes (1975) Penguin Books. ISBN 0385099177.
  11. The Ways of Paradox, and Other Essays by W.V. Quine (1966) Random House.
  12. See the Wikipedia article on School of Names. See the Wikipedia article on Warring States period. "Miscellaneous paradoxes" Stanford Encyclopedia of Philosophy
  13. Studies in Chinese Philosophy and Philosophical Literature by Angus Charles Graham (1990) State University of New York Press. ISBN 0791404498. p. 334.
  14. Chung-ying Cheng (1973) "On Zen (Ch’an) Language and Zen Paradoxes" Journal of Chinese Philosophy, V. 1 (1973) pp. 77-102
  15. Fallacies by Bradley Dowden, Internet Encyclopedia of Philosophy.
  16. Cross-Gender Supervision in Prison and the Constitutional Right of Prisoners to Remain Free from Rape by Flyn L. Flesher (2007) William & Mary Journal of Women and the Law, Volume 13, Issue 3, Article 8.
  17. An Ethical Dilemma In Corrections by Albert De Amicis (August 21, 2005) National Criminal Justice Reference Service.
  18. Barring Male Guards for Female Inmates Might Violate Title VII, Ninth Circuit Rules by Kevin P. McGowan (July 8, 2014) Bloomberg BNA (archived from July 18, 2016).
  19. Fallacy Files by Gary N. Curtis, accessed 2018.
  20. Index (13 August 1996) Stephen's Guide to the Logical Fallacies (archived from December 30, 2005).
  21. Logically Fallacious by Bo Bennett.
  22. The inverse fallacy: An account of deviations from Bayes's theorem and the additivity principle by Gaëlle Villejoubert & David R. Mandel (2002) Memory & Cognition 30(2):171-178.
  23. [1]
  24. Fallacies by Hans Hansen (first published May 29, 2015; substantive revision June 29, 2019) Stanford Encyclopedia of Philosophy.
  25. Lecture 4, Philosophy 404/English 501/EDTE 404 & 504 by Michael O'Rourke (June 17 & 21, 1999) University of Idaho.
  26. Neglect the Base Rate: It's the law! by Christoph Engel (December 2012) Preprints of the Max Planck Institute for Research on Collective Goods.
  27. Prosecutor and Defense Fallacies Forensics: Examining the Evidence.
  28. Policing and Prosecuting Sexual Assault in Los Angeles City and County: A Collaborative Study in Partnership with the Los Angeles Police Department, the Los Angeles County Sheriff’s Department, and the Los Angeles County District Attorney’s Office by Cassia Spohn & Katharine Tellis (2012) National Criminal Justice Service. page 49.
  29. 4 Common Data Fallacies That You Need To Know (2017) KDNuggets.
  30. Johnson, D.K. (2018). Countless Counterfeits. In Bad Arguments (eds R. Arp, S. Barbone and M. Bruce). DOI: 10.1002/9781119165811.ch24 (Non-paywall link for chapter.)
  31. Modal (Scope) Fallacy Logically Fallacious.
  32. Reverse Survivorship Bias Investopedia.
  33. False equivalence by Tim Harding (August 14, 2015) The Logical Place.
  34. The problem with prescription drug prices: What one city did to fight high drug prices reveals a drug supply chain in which just about every link can benefit when prices go up by Lesley Stahl (May 6, 2018) 60 Minutes, CBS.
  35. Rights To Ought Fallacy Logically Fallacious.
  36. Causal Reasoning iSTAR Assessment: Inquiry for Scientific Thinking and Reasoning, filed in Dimensions of Scientific Reasoning on April 11, 2011.
  37. Hitler Ate Sugar TV Tropes.