Logical fallacies


Classical rhetorical fallacies

Below is a list and brief description of logical fallacies commonly found in persuasive or argumentative writing and speaking. Such fallacies were first described by ancient Greek and Roman writers (especially orators and rhetoric teachers), so some of them have Latin names. These rhetorical techniques have been used throughout history by orators, scholars, and ordinary people alike. Today they are often used in advertising and politics, and occasionally even in professional and academic persuasive speaking or writing. Some examples may fall under more than one of the categories below. Learning to watch for these fallacies is part of what Carl Sagan calls “baloney detection” – detecting false, nonsensical, or deceptive claims.

1 Errors of causality & association (including emotionalistic arguments)

1.1 False cause

(post hoc fallacy; post hoc, ergo propter hoc; reductive fallacy; oversimplification; correlation fallacy)

Simplistic cause-and-effect relationships are given for complex problems or issues; two events are given as cause-and-effect, when the relationship may be much more complicated or non-existent. This is also an error of evidence (below).

Poverty causes crime. (It’s a factor, but not the cause.)

The welfare system is causing a breakdown in American families.

In the 1950s, for example, after some nuclear bomb tests, the US was hit by a severe winter; some claimed that the bomb tests were responsible, even though meteorologists showed that the weather was caused by a predictable shift in the Gulf Stream. Politicians rely heavily on false cause, e.g., by taking credit for economic achievements that happened while they were in office, or by blaming incumbents for economic problems that happened during their terms.

One might claim (facetiously) that eating ice cream causes drowning. In the summer there are more drownings, and people eat more ice cream. Of course, the real cause is the summer heat, which leads people both to eat ice cream and to go swimming more, hence the increase in drownings. So although a correlation exists between ice cream and drowning, ice cream does not cause drowning; the causal relationship is much more complex.

The famous cautionary phrase is “correlation does not prove causation.” There may exist a statistical correlation even if there is no real causal effect. When a study shows a correlation between A and B, there are usually five possibilities: [1] A causes B; [2] B causes A; [3] A and B both partly cause each other; [4] A and B are both caused by a third factor, C; and [5] the observed correlation was due purely to chance. The possibility of a chance correlation can be assessed by statistical tests that indicate whether the data are sufficient to establish a real correlation.
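
To illustrate possibility [5], a permutation test can estimate how easily an observed correlation could arise by chance. Below is a minimal Python sketch (the pearson_r helper and the monthly figures are purely illustrative, not real data):

```python
import random

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def permutation_p_value(xs, ys, trials=10000):
    """Fraction of random shufflings of ys whose correlation with xs is at
    least as strong as the observed one.  A large fraction means the
    observed correlation could easily be due to chance alone."""
    observed = abs(pearson_r(xs, ys))
    shuffled = list(ys)
    hits = 0
    for _ in range(trials):
        random.shuffle(shuffled)
        if abs(pearson_r(xs, shuffled)) >= observed:
            hits += 1
    return hits / trials

# Hypothetical monthly figures: ice cream sales and drownings both peak in summer.
ice_cream = [20, 25, 40, 60, 90, 120, 140, 130, 80, 50, 30, 20]
drownings = [1, 1, 2, 3, 6, 9, 11, 10, 5, 3, 2, 1]
print(permutation_p_value(ice_cream, drownings))  # tiny: a real correlation, but not causation
```

Note that a small result says only that the correlation is unlikely to be chance (possibility [5]); it cannot distinguish among possibilities [1] through [4].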

This fallacy can be used, for instance, to claim that exposure to a substance has adverse health effects. In such a situation, there may be a statistical correlation even if there is no real effect. For example, early studies of EMFs (electromagnetic fields) from high-voltage power lines seemed to show a link between the power lines and cancer. The real cause was that poorer people tended to live in areas with such power lines, and these people were more likely to get cancer for other reasons, such as poor health care and diet.

1.2 Ad hominem

(argumentum ad hominem, argument to the person/man)

Such an argument focuses on an individual’s personal life or character and ignores real issues; it is common in political campaigning and advertising.

We shouldn’t adopt the proposed health care plan, because its advocates are simply a bunch of socialists.

1.3 Ad populum

(argumentum ad populum, argument to the people)

This is an emotionalistic appeal to common values or deep biases of the masses; like ad hominem, it is common in politics and advertising.

This fine, patriotic, church-going American deserves your vote.

Variations of this fallacy that are common in advertising and politics include:

  • snob appeal: Advertisements that appeal to desire for status and wealth, e.g., by associating the product with use by high-status or wealthy people.
  • bandwagon: “Are you the only one on your block who doesn’t have a box of Super Choco-Bombs cereal?”
  • flattery: “You are obviously a very intelligent person, so can I get you to take a look at our encyclopedia?”
  • guilt by association: “This man is a communist, because he associates with other known communists.”

1.4 Appeal to ignorance (ad ignorantiam)

A writer asserts that a claim must be true simply because no one can disprove it; in doing so, the writer evades his/her responsibility and unfairly shifts the burden of proof onto the reader/listener.

Although doctors say that wearing copper bracelets to relieve arthritis has no medical basis, they haven’t shown that the bracelets don’t help, or that they cause any harm, so you should buy them.

1.5 Genetic fallacy

The writer assumes that the nature or character of a person, object, or idea can be judged based on its origins.

She’s from Arkansas, so she must be stupid.

He couldn’t have done such a thing; he’s from a good family.

Acupuncture can’t be considered an acceptable medical technique, since it came from ancient China.

1.6 Red herring

According to an old belief, dragging a red smoked herring (a particularly strong-smelling fish) across a trail would divert a pack of hunting dogs from the scent into another direction. So a red herring is a diversionary tactic that sidetracks an argument and diverts the reader’s attention with an irrelevant point. For example, if two candidates in a debate are discussing each other’s qualifications for holding office, one might introduce a red herring by bringing up questions about the other’s alleged socialist or “radical” connections, or travels to a communist country — totally unrelated to the discussion of qualifications.

1.7 Argument from false authority

We should accept a claim simply because some respected person tells us to do so.

Michael Jordan uses this product, so you should too.

You should believe this, because this famous doctor says it’s true.

1.8 Argument from adverse consequences

One is asked not to accept a position because doing so would require accepting unpleasant consequences that stem from it.

If you don’t believe in Santa Claus, who’s going to bring you your Christmas presents? (a threat from a parent to a skeptical child)

1.9 Tu quoque (“you also”)

The writer evades an issue or deflects a charge or question by accusing the opponent of the same or something similar.

Who are you to criticize me for cheating on my taxes when you pad your expense account so lavishly?

1.10 Naturalistic fallacy and moralistic fallacy (appeals to nature)

A claim is made based on the assumption that what is natural (e.g., what occurs in nature or arose via evolution) is inherently good, right, or moral, and that whatever is not “natural” is wrong or immoral. In fact, what is natural may be morally neutral, or may have nothing to do with human morality or truth claims.

These vitamins are synthetic, not natural, and thus, are inferior. (Vitamins are simply chemicals, so scientists would say that the source does not matter.)

Warfare should be tolerated because it is part of the violent and natural instinct of human nature.

The naturalistic fallacy (inferring what is good from what is natural) overlaps with the moralistic fallacy (assuming that what is morally desirable is to be found in nature).

If other animal species engage in adultery or don’t stay with their partners, then why can’t we choose the same lifestyle?

These fallacies also overlap with the “is-ought” problem or fallacy – deducing an “ought” from an “is,” i.e., assuming that things should be as they are in nature or in the world. Such fallacies were invoked in the days of social Darwinism – misusing natural selection as a basis for human society.


2 Logical and syllogistic errors

2.1 False syllogism

A conclusion based on faulty assumptions, e.g.,

Socrates is a man.
All men are liars.
∴ Socrates is a liar.

2.2 Begging the question (loaded assertion, circular reasoning/argument)

A writer “begs the question” by assuming a premise of the argument to be already proved or disproved without providing evidence for this assumption; this is a sort of circular reasoning.

Since evolution is simply a theory, and has never been observably proven, public schools should teach any other theory of origins alongside evolution.

Are you going to listen to this liar, or impeach him like you should?! (His reputation as a liar here is asserted, not proven.)

2.3 Either/or fallacy

(black and white fallacy, false dilemma fallacy, fallacy of insufficient options)

Writers and speakers often try to force their audience to choose between two conflicting alternatives by suggesting that no other options or middle ground exist — an unfair oversimplification of options.

America — love it or leave it!

Those who reject socialism are merely neo-fascists.

If you don’t like our capitalist system, you’re some kind of communist!

2.4 Loaded question (complex question)

A question is worded unfairly so that any kind of answer will support the writer’s assumptions.

Have you stopped beating your wife?

How long have you been consorting with known Mafia types?

How often have you been cheating on your taxes?

2.5 Non sequitur (“it doesn’t follow”)

One point or argument does not follow logically from the preceding one, i.e., no logical relationship exists between two or more supposedly connected ideas.

He has my vote for senator, because he has the best-run campaign. (What does a campaign organization have to do with qualifications?)

I’m going to buy a computer with an Intel chip, because Intel has the largest market share. (Market share says nothing about which chip is best for you.)

2.6 Comment on the obvious

A statement that is obviously true but so general or uninformative that it does not actually prove any point.

If we don’t do anything about the drug problem, millions of Americans will continue to suffer from drug addiction, drug-related crime, and other social hazards. (While this is unarguably true, this obvious statement does not support the writer’s arguments in favor of his/her proposed policies or solutions.)

2.7 Slippery slope

One creates an irrational fear that by accepting a valid argument you will be drawn in turn toward similar but less valid ones, until you are persuaded to accept completely unacceptable arguments. In other words, accepting one argument supposedly leads automatically to accepting more extreme claims.

If you accept a nationalized health care system, you’ll be on the road toward socialism, and then communism.

2.8 False analogy

The writer draws a comparison between two essentially unlike things, based on too few similarities. Because the two things are alike in a few respects, they supposedly must be similar in other respects as well. Analogies can only illustrate a point, not prove it.

Non-human primates care for their young and protect their weak members. Why then must contemporary humans go excessively beyond this, with their Social Security, child care, welfare, national health care, etc., to protect every conceivable class of weak or infirm? (To be consistent, one must ask why we speak to each other when apes do not.)


3 Lexical / semantic fallacies

3.1 Euphemism

Using euphemisms to “soften” or hide the truth, e.g., when the Pentagon speaks of “collateral losses or damage” instead of “civilians killed”.

3.2 Misuse of jargon

Impressive jargon or academic vocabulary is used merely to make a weak argument sound impressive.

3.3 Equivocation

Using different definitions of the same word. Those who doubt evolution criticize it as “just a theory,” but ‘theory’ in science does not mean a hypothesis, conjecture, or guess, and it does not necessarily mean something unproven; it means an explanatory conceptual framework, a set of ideas, which might have been proven true and still be called a theory. Another famous example of equivocation occurred when Bill Clinton was asked if he had been alone with Ms. Lewinsky; he answered that it depends on what one means by “being alone.” The examples below likewise change word meanings within an argument.

Evolution is just a theory, and therefore, not true.

Socrates is a man. All men are pigs. Thus, Socrates is a pig.

3.4 Hyperbole

Exaggerated words, emotionalistic wording, or extreme examples are selected to make a point.

3.5 Tautology

An empty statement composed of simpler statements in such a way that it is logically true whether the simpler statements are true or false (x = x, not x = y); e.g., “Either it will rain tomorrow or it will not rain tomorrow”.

If we don’t succeed, we will fail.
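
Whether a compound statement is a tautology can be checked mechanically by evaluating it under every possible truth assignment. A minimal Python sketch (the is_tautology helper is illustrative, not a standard library function):

```python
from itertools import product

def is_tautology(formula, n_vars):
    """True if the Boolean formula holds under every assignment of its variables."""
    return all(formula(*values) for values in product([True, False], repeat=n_vars))

# "Either it will rain tomorrow or it will not rain tomorrow" (x or not x): always true.
print(is_tautology(lambda rain: rain or not rain, 1))   # True
# "x = y" is not a tautology: it fails whenever x and y differ.
print(is_tautology(lambda x, y: x == y, 2))             # False
```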

3.6 Weasel words

Biased wording used to present doubtful, controversial, or arguable ideas as if they were facts or valid ideas, with wording like “some/many scientists agree that...” (which scientists? no specific credible scientists are cited) or “many linguists would point out that Eskimos have 100 words for snow” (who, exactly?).


4 Empirical fallacies (errors of evidence)

4.1 Hasty generalization

This is a conclusion based on too little evidence or unrepresentative evidence.

If the team is losing, the coach should be fired.

He always screws up important projects. (“always” may not be true — just sometimes)

4.2 Overgeneralization

If the only lawyers you know of are dishonest, middle-aged men who cheat on their wives, it would still be inaccurate to conclude that all lawyers are like that. When the general public is not directly knowledgeable about a subject, it is easy to mislead them. For example, if a person interviewed on TV says, “Most autistics are hopelessly incurable if raised without parents or normal education,” many people will remember only the first part and will then believe that most autistics are hopelessly incurable. This is especially problematic on TV, e.g., when journalists or talk show hosts interview one individual as representative of a whole class of people.

When it comes to science and health news, the overgeneralization is often made not by the original researchers but by others interpreting the data, such as the commercially owned news media, which often misreport health news and make claims that the researchers never made. A study might report a correlation between X and Y without making any claims about X causing Y; further studies would be needed to validate the correlation and to confirm any causal relationship.

4.3 Insufficient statistical evidence

The statistics cited do not support the conclusion; or the sample size of the study is too small to support meaningful conclusions; or samples or statistics are arbitrarily selected; or one study is cited when more studies are needed to confirm the results. This is related to overgeneralization, e.g., in how the news media report health and science news.

4.4 Biased samples

In statistical studies, scientists hope to get a sample that is representative of the whole population. This can be achieved, at least in theory, by randomly selecting a group of subjects who adequately reflect the whole population. However, sample bias can come into play, leading to questionable results; that is, the sample is not fully random and not very representative. This can happen for various reasons. Health researchers may fail to recruit a fully diverse group because of how and where they advertised to recruit subjects.

Researchers conducting public opinion polls might also fail to get a proper sample because of how they did their polling. A famous example occurred in the 1948 presidential election: based on faulty polling data, the Chicago Tribune printed the headline “Dewey defeats Truman.” The next morning the grinning President-Elect, Harry Truman, was photographed holding a newspaper bearing the mistaken headline. The error is often attributed in part to polling by telephone: survey research was then in its infancy, and few researchers or pollsters realized that a sample of telephone users was not representative of the general population, since telephones were not yet in common use in 1948 and those who had them tended to be wealthier people with stable addresses. The poll data were also two weeks old by election day.
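
A toy simulation shows how such a response bias skews a poll (all numbers are hypothetical): suppose 70% of a population supports a candidate, but supporters are only half as likely as opponents to respond.

```python
import random

random.seed(1)

# Hypothetical population: 1 = supporter (70%), 0 = opponent (30%).
population = [1] * 7000 + [0] * 3000

def biased_poll(pop, n):
    """Sample n respondents, where supporters respond only half the time."""
    responses = []
    while len(responses) < n:
        person = random.choice(pop)
        if person == 0 or random.random() < 0.5:   # opponents always respond
            responses.append(person)
    return sum(responses) / n

def random_poll(pop, n):
    """An unbiased simple random sample of n people."""
    return sum(random.sample(pop, n)) / n

print("true support:  0.70")
print(f"random sample: {random_poll(population, 1000):.2f}")   # close to 0.70
print(f"biased sample: {biased_poll(population, 1000):.2f}")   # closer to 0.54
```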

4.5 Observational selection

Noticing only the observations that tend to form the patterns one wants to see, and ignoring those that either don’t fit or form undesirable patterns.

4.6 Data mining/dredging

Data mining might refer to examining large data sets to look (in a legitimate way) for patterns. But data dredging is an abuse of data mining, in which one looks through large amounts of data to find any correlation, without any pre-defined choice of a hypothesis to be tested. Without proper procedures, it is often possible to find spurious but apparently statistically significant results in almost any large data set.
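
A toy demonstration of why dredging “works” (every variable below is pure noise, so any hit is spurious by construction; the pearson_r helper repeats the illustrative one sketched earlier):

```python
import random

random.seed(0)

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# 200 random "predictors" measured on 30 subjects, plus one random "outcome":
# no real relationship exists anywhere in this data set.
n_subjects, n_vars = 30, 200
predictors = [[random.gauss(0, 1) for _ in range(n_subjects)] for _ in range(n_vars)]
outcome = [random.gauss(0, 1) for _ in range(n_subjects)]

# Dredge: flag every predictor whose correlation with the outcome crosses
# |r| > 0.36, roughly the p < 0.05 threshold for a sample of 30.
hits = [i for i, xs in enumerate(predictors) if abs(pearson_r(xs, outcome)) > 0.36]
print(f"{len(hits)} of {n_vars} pure-noise variables look 'significant'")  # typically around 10
```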

4.7 Selective reporting

This is related to data dredging. The easiest and most common examples involve reporting a group of results that follow a pattern consistent with the preferred hypothesis while ignoring other results that contradict it. A common method of selective reporting is to perform a study that tests a large number of dependent (response) variables at the same time. For example, a medical study might use as dependent variables the probability of survival, the average number of days spent in the hospital, the patient’s self-reported level of pain, and others. Testing many variables at once increases the likelihood that at least one of them will, by chance, show a correlation with the independent (explanatory) variable. A drug company might then selectively report only the positive correlation that seems to show that drug X has an effect, and ignore the negative results. Similarly, psychic researchers can conduct many trials and report only the “good” results showing evidence for ESP.
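
The arithmetic behind this is straightforward: if each outcome variable independently has a 5% chance of showing a spurious “significant” result, the chance that at least one of k variables does so grows quickly with k, as this sketch shows:

```python
# Chance of at least one false positive when k independent outcome
# variables are each tested at the conventional p < 0.05 level.
alpha = 0.05
for k in (1, 5, 10, 20):
    p_any = 1 - (1 - alpha) ** k
    print(f"{k:2d} variables tested -> {p_any:.0%} chance of a spurious 'finding'")
# 1 -> 5%, 5 -> 23%, 10 -> 40%, 20 -> 64%
```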


5 Complex fallacies

5.1 Special pleading

This is an extended fallacy in which one presents an unfairly one-sided view of an issue. Although the particular points may be valid, the whole argument is biased and fails to consider valid opposing points. For example, some famous and intelligent writers like Mark Twain and Jean-Paul Sartre have argued against religion by focusing solely on its negative aspects (religious extremism, the Crusades, wrongs committed by organized religion, religious fundamentalism, the prevalence of suffering and evil in the world, etc.), without conceding any good points to religion (humanitarian benefits, ethical and moral teachings, positive social and personal benefits, etc.).

5.2 Strawman argument

One argues against a theory or idea not by objectively criticizing the idea itself, but by attacking a misrepresentation of it. An incorrect, distorted representation of the idea is set up, like a fake straw man, and attacked. Or, if a theory exists in several different versions or interpretations among its adherents, a critic attacks one particular version and then claims to have discredited the entire theory. For example, controversy has raged in psychology and linguistics over whether human language ability is innate or environmentally acquired. Opponents of innateness have sometimes attacked only one version of innateness theory without considering other versions, or have misrepresented the theory in order to attack it.


6 Cognitive biases

Our minds do not always operate in logical ways; in fact, our brains often use shortcuts to help us make decisions and judgments quickly. These consist of biases (subconscious preferences, dispositions, and tendencies that affect our thinking) and heuristics (problem-solving shortcuts). Many of these were first documented by the pioneering psychologists Amos Tversky and Daniel Kahneman.

6.1 Hindsight bias

(Knew-it-all-along effect, creeping determinism) This is the tendency, after an event, to believe that it was clearly predictable, though at the time no clear evidence existed for its likelihood. In the first famous psychological study of this, researchers asked participants to guess the outcomes of US President Richard Nixon’s 1972 visits to China and the USSR. After the events, the participants were asked to recall or reconstruct the likelihoods they had assigned to different outcomes; their recollections were heavily skewed by the actual outcomes.

6.2 Availability heuristic

People tend to rely on more memorable or immediate examples when making evaluations and decisions, and to forget or ignore less memorable options. We assume that if something can be recalled easily, it must be important, and we tend to weight recent information more heavily. Thus, Americans tend to focus heavily on terrorism in some decisions or evaluations of events, even though such events are very rare compared to other, more likely dangers. People in general dwell on the potential dangers of flying, even though they are much more likely to be injured or killed in other kinds of accidents (like car accidents) and flying is statistically safer than other modes of transportation. In Tversky and Kahneman’s first famous study of this, they asked subjects whether the letter K is more likely to occur as the first or the third letter of randomly selected English words. Most guessed first, because it is far easier to think of words beginning with K (kangaroo, kitchen, kale) than words with K as the third letter (acknowledge, ask), even though K actually appears more often in the third position.

6.3 Representativeness heuristic (base rate fallacy)

We often misjudge what is typical or representative, ignoring underlying base rates. In experiments, subjects will often overestimate what percentage of the population are doctors or lawyers, or how likely it is that a person matching a certain description is a lawyer, doctor, or the like. This is often an effective decision-making heuristic, but it sometimes leads us astray.
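
A worked example of why the base rate matters (all numbers here are hypothetical): even if a personality description fits 90% of lawyers, a person matching it is still probably not a lawyer when lawyers are rare in the population.

```python
# Bayes' rule with hypothetical numbers: P(lawyer | description matches).
base_rate      = 0.004   # assumed fraction of the population who are lawyers
p_match_lawyer = 0.90    # the description fits 90% of lawyers...
p_match_other  = 0.10    # ...but also fits 10% of everyone else

p_match = p_match_lawyer * base_rate + p_match_other * (1 - base_rate)
posterior = p_match_lawyer * base_rate / p_match
print(f"P(lawyer | matching description) = {posterior:.1%}")   # about 3.5%
```

Subjects who judge by representativeness alone act as if the answer were near 90%, ignoring the 0.4% base rate.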

6.4 Anchoring effect

We can be misled into incorrect evaluations by an initial baseline or starting point (“anchor”). In one classic study by Tversky and Kahneman, subjects were asked whether the percentage of African countries that are members of the United Nations is larger or smaller than 65%. When they then had to guess the true percentage, their answers correlated strongly with the arbitrary number they had been given. This works even with wildly exaggerated anchors (“Is the average temperature of San Francisco above or below 558 degrees?”). Not surprisingly, this can be misused by politicians, salespeople, advertisers, and courtroom lawyers to manipulate people.

6.5 Confirmation bias

We tend to search for, select, remember, or focus on information that supports our beliefs, and to ignore information that runs contrary to them. This is especially so for emotionally charged or personal issues, including religious beliefs, political beliefs, and hot-button social issues. For this reason, people hold strong commitments to, and even overconfidence in, their own beliefs and are rarely persuaded by evidence to the contrary. Horoscopes provide an example: people who believe in astrology, upon reading their daily horoscope, will focus on and remember experiences that are consistent with its very general statements, and ignore details that do not conform to them. Especially in social media, we filter out sources of information that run counter to our beliefs and perceptions, and form networks where we hear only information consistent with our beliefs.

6.6 Gambler’s fallacy

A gambler believes that a series of losses means that he or she is surely about to have good luck. However, the outcome of such an event is not related to the outcomes of previous events: playing cards, coins, and other such objects have no memory or magical properties, and one coin toss or card draw is completely independent of previous tosses or draws. The same holds for life in general: a string of bad luck does not mean that something good is about to happen (or vice versa), and if something good does happen, it is not caused by the preceding bad events.
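
A quick simulation makes the independence concrete (a minimal sketch; the streak length of five is arbitrary): after any run of five tails, heads is still just as likely as ever.

```python
import random

random.seed(42)

# After a run of five tails ("losses"), is heads any more likely on the
# next toss?  Track the current tails streak and record what happens next.
streak = 0
outcomes_after_streak = []
for _ in range(1_000_000):
    heads = random.random() < 0.5
    if streak >= 5:                       # the previous five tosses were all tails
        outcomes_after_streak.append(heads)
    streak = 0 if heads else streak + 1

p = sum(outcomes_after_streak) / len(outcomes_after_streak)
print(f"P(heads after 5 tails) ~= {p:.3f}")   # ~0.500: the coin is never "due"
```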

6.7 Cognitive dissonance

This refers to the mental discomfort of holding two contradictory beliefs, or of being confronted with evidence contrary to one’s beliefs. It often leads to rationalization in order to preserve one’s personal beliefs, especially if the contrary views or evidence are perceived as a threat to one’s sense of self. Such rationalization can be used to dismiss contrary evidence, or to protect one’s sense of self by altering one’s views of other people or ideas. For example, when someone points out that you treated person X unfairly, you might justify yourself by rationalizing your behavior and by expressing negative views of X; in the process, you might come to view X more negatively in order to justify your behavior. This is a sort of defensive, face-saving, or self-esteem-preserving device.