Ad Hominem Argument

Argumentum ad hominem ("argument directed at the man") is a logical fallacy that involves irrelevant responses directed at the personality of an opponent instead of the content of his or her claim. An ad hominem attack is intended to steer attention away from the issue under debate and toward the debater or the person addressing the issue. Attacks may include derogatory statements about personal traits or characteristics, condemnation of the person's behavior, or speculation about the individual's motives or special interests. This entry discusses ad hominem attacks and related fallacies, along with their impact on corporate reputation, since most smear campaigns in modern politics and business follow this pattern. Ad hominem criticism is innately deceptive, as it defies the principles of an ethical argument.
A fallacy is a kind of error in reasoning. The list of fallacies below contains 231 names of the most common fallacies, and it provides brief explanations and examples of each of them. Fallacious arguments should not be persuasive, but they too often are. Fallacies may be created unintentionally, or they may be created intentionally in order to deceive other
people. The vast majority of the commonly identified fallacies involve arguments, although some involve only explanations, or definitions, or other products of reasoning. Sometimes the term “fallacy” is used even more broadly to indicate any false belief or cause of a false belief. The list below includes some fallacies of these sorts, but most are fallacies that involve kinds of errors made while arguing informally in natural language. A charge of fallacious reasoning always needs to be
justified. The burden of proof is on your shoulders when you claim that someone’s reasoning is fallacious. Even if you do not explicitly give your reasons, it is your responsibility to be able to give them if challenged. An informal fallacy is fallacious because of both its form and its content. The formal fallacies are fallacious only because of their logical form. For example, the Slippery Slope Fallacy is an informal fallacy that has the following form: Step 1 often leads to step 2.
Step 2 often leads to step 3. Step 3 often leads to…until we reach an obviously unacceptable step, so step 1 is not acceptable. That form occurs in both good arguments and fallacious arguments. The quality of an argument of this form depends crucially on the probabilities of going from one step to another. The probabilities involve the argument’s content, not merely its form. The discussion below that precedes the long alphabetical list of fallacies begins with an account of the ways in
which the term “fallacy” is imprecise. Attention then turns to the number of competing and overlapping ways to classify fallacies of argumentation. For pedagogical purposes, researchers in the field of fallacies disagree about the following topics: which name of a fallacy is more helpful to students’ understanding; whether some fallacies should be de-emphasized in favor of others; and which is the best taxonomy of the fallacies. Researchers in the field are also deeply divided about how to
define the term “fallacy” itself, how to define certain fallacies, and whether any theory of fallacies at all should be pursued if that theory’s goal is to provide necessary and sufficient conditions for distinguishing between fallacious and non-fallacious reasoning generally. Analogously, there is doubt in the field of ethics regarding whether researchers should pursue the goal of providing necessary and sufficient conditions for distinguishing moral actions from immoral ones. The first known systematic study of fallacies was due to Aristotle in his De Sophisticis Elenchis (Sophistical Refutations), an appendix to the Topics. He listed thirteen types.
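The earlier point about the Slippery Slope form, that the argument's quality turns on the step-to-step probabilities, can be made concrete. Here is a minimal Python sketch; the function name and the simplifying assumption that the transitions are independent are illustrative choices, not part of the original discussion:

```python
# Probability of sliding all the way down a slippery slope, assuming
# (for illustration) that each step-to-step transition is independent.

def chance_of_final_step(transition_probs):
    """Multiply the step probabilities to get the chance of reaching the end."""
    result = 1.0
    for p in transition_probs:
        result *= p
    return result

# Five transitions that each happen "often" (80%) still leave the final
# step far from certain, so this slope is not very slippery:
print(chance_of_final_step([0.8] * 5))   # 0.8**5 = 0.32768

# Only when every transition is nearly certain does the final step
# become probable:
print(chance_of_final_step([0.99] * 5))  # about 0.951
```

On this reading, pointing out one low transition probability anywhere in the chain is enough to show that a particular slippery-slope argument is weak, even though other arguments of the very same form are strong.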
After the Dark Ages, fallacies were again studied systematically in Medieval Europe. This is why so many fallacies have Latin names. The third major period of study of the fallacies began in the later twentieth century, due to renewed interest from the disciplines of philosophy, logic, communication studies, rhetoric, psychology, and artificial intelligence. The more frequent the error within public discussion and debate, the more likely it is to have a name. That is one reason why there is
no specific name for the fallacy of subtracting five from thirteen and concluding that the answer is seven, though the error is common. The term “fallacy” is not a precise term. One reason is that it is ambiguous. It can refer either to (a) a kind of error in an argument, (b) a kind of error in reasoning (including arguments, definitions, explanations, and so forth), (c) a false belief, or (d) the cause of any of the previous errors including what are normally referred to as “rhetorical
techniques.” Philosophers who are researchers in fallacy theory prefer to emphasize (a), but their lead is often not followed in textbooks and public discussion. Regarding (d), ill health, being a bigot, being hungry, being stupid, and being hypercritical of our enemies are all sources of error in reasoning, so they could qualify as fallacies of kind (d), but they are not included in the list below. On the other hand, wishful thinking, stereotyping, being superstitious,
rationalizing, and having a poor sense of proportion are sources of error and are included in the list below, though they wouldn’t be included in a list devoted only to faulty arguments. Thus there is a certain arbitrariness to what appears in lists such as this. What have been left off the list below are the following persuasive techniques commonly used to influence others and to cause errors in reasoning: apple polishing, using propaganda techniques, ridiculing, being sarcastic,
selecting terms with strong negative or positive associations, using innuendo, and weaseling. All of these techniques are worth knowing about if one wants to reason well. In describing the fallacies below, the custom is followed of not distinguishing between a reasoner using a fallacy and the reasoning itself containing the fallacy. Real arguments are often embedded within a very long discussion. Richard Whately, one of the greatest of the 19th century researchers into informal logic,
wisely said, “A very long discussion is one of the most effective veils of Fallacy; …a Fallacy, which when stated barely…would not deceive a child, may deceive half the world if diluted in a quarto volume.” The importance of understanding the common fallacy labels is that they provide an efficient way to communicate criticisms of someone’s reasoning. However, there are a variety of ways to label fallacies, and there are
a number of competing and overlapping ways to classify fallacies. For example, the fallacies of argumentation can be classified as either formal or informal. A formal fallacy can be detected by examining the logical form of the reasoning, whereas an informal fallacy depends upon the content of the reasoning and possibly the purpose of the reasoning. That is, informal fallacies are errors of reasoning that cannot easily be expressed in our system of formal logic (such as symbolic, deductive,
predicate logic). The list below contains very few formal fallacies. Fallacious arguments also can be classified as deductive or inductive, depending upon whether the fallacious argument is most properly assessed by deductive standards or instead by inductive standards. Deductive standards demand deductive validity, but inductive standards require inductive strength such as making the conclusion more likely. Fallacies can be divided into
categories according to the psychological factors that lead people to use them, and they can also be divided into categories according to the epistemological or logical factors that cause the error. In the latter division there are three categories: (1) the reasoning is invalid but is presented as if it were a valid argument, or else it is inductively much weaker than it is presented as being, (2) the argument has an unjustified premise, or (3) some relevant evidence has been ignored or
suppressed. Regarding (2), a premise can be justified or warranted at a time even if we later learn that the premise was false, and it can be justified if we are reasoning about what would have happened even when we know it didn’t happen. Similar fallacies are often grouped together under a common name intended to bring out how the fallacies are similar. Here are three examples. Fallacies of relevance include fallacies that occur due to reliance on an
irrelevant reason. Ad Hominem, Appeal to Pity, and Affirming the Consequent are examples of fallacies of relevance. Accent,
Amphiboly and Equivocation are examples of fallacies of ambiguity. The fallacies of illegitimate presumption include Begging the Question, False Dilemma,
No True Scotsman, Complex Question, and Suppressed Evidence. It is commonly claimed that giving a fallacy a name and studying it will help the student identify the fallacy in the future and will steer them away from using the
fallacy in their own reasoning. As Steven Pinker says in The Stuff of Thought (p. 129), "If a language provides a label for a complex concept, that could make it easier to think about the concept, because the mind can handle it as a single package when juggling a set of ideas, rather than having to keep each of its components in the air separately. It can also give a concept an additional label in long-term memory, making it more easily retrievable than ineffable concepts or those with more roundabout verbal descriptions." Fallacy theory is criticized by some teachers of informal reasoning for its over-emphasis on poor reasoning rather than good reasoning.
Do colleges teach the Calculus by emphasizing all the ways one can make mathematical mistakes? The critics want more emphasis on the forms of good arguments and on the implicit rules that govern proper discussion designed to resolve a difference of opinion. But there has been little systematic study of which emphasis is more successful. Researchers disagree about how to define the very term “fallacy.” Focusing just on fallacies in sense (a)
above, namely fallacies of argumentation, some researchers define a fallacy as an argument that is deductively invalid or that has very little inductive strength. Because examples of false dilemma, inconsistent premises, and begging the question are valid arguments in this
sense, this definition misses some standard fallacies. Other researchers say a fallacy is a mistake in an argument that arises from something other than merely false premises. But the false dilemma fallacy is due to false premises. Still other researchers define a fallacy as an argument that is not good. Good arguments are then defined as those that are deductively valid or inductively strong, and that contain only true, well-established premises, but are not question-begging. A complaint with
this definition is that its requirement of truth would improperly lead to calling too much scientific reasoning fallacious; every time a new scientific discovery caused scientists to label a previously well-established claim as false, all the scientists who used that claim as a premise would become fallacious reasoners. This consequence of the definition is acceptable to some researchers but not to others. Because informal reasoning regularly deals with hypothetical reasoning and with premises
for which there is great disagreement about whether they are true or false, many researchers would relax the requirement that every premise must be true. One widely accepted definition defines a fallacious argument as one that either is deductively invalid or is inductively very weak or contains an unjustified premise or that ignores relevant evidence that is available and that should be known by the arguer. Finally, yet another theory of fallacy says a fallacy is a failure to provide adequate
proof for a belief, the failure being disguised to make the proof look adequate. Other researchers recommend characterizing a fallacy as a violation of the norms of good reasoning, the rules of critical discussion, dispute resolution, and adequate communication. The difficulty with this approach is that there is so much disagreement about how to characterize these norms. In addition, all the above definitions are often augmented with some remark to the effect that the fallacies are
likely to persuade many reasoners. It is notoriously difficult to be very precise about this vague and subjective notion of being likely to persuade, and some researchers in fallacy theory have therefore recommended dropping the notion in favor of “can be used to persuade.” Some researchers complain that all the above definitions of fallacy are too broad and do not distinguish between mere blunders and actual fallacies, the more serious errors. Researchers in the field are deeply
divided, not only about how to define the term “fallacy” and how to define some of the individual fallacies, but also about whether any general theory of fallacies at all should be pursued if that theory’s goal is to provide necessary and sufficient conditions for distinguishing between fallacious and non-fallacious reasoning generally. Analogously, there is doubt in the field of ethics whether researchers should pursue the goal of providing necessary and sufficient conditions for distinguishing
moral actions from immoral ones. How do we defend the claim that an item of reasoning should be labeled as a particular fallacy? A major goal in the field of informal logic is to provide some criteria for each fallacy. Schwartz presents the challenge this way: "Fallacy labels have their use. But fallacy-label texts tend not to provide useful criteria for applying the labels. Take the so-called ad verecundiam fallacy, the fallacious appeal to authority. Just when is it committed? Some appeals to authority are fallacious; most are not. A fallacious one meets the following condition: The expertise of the putative authority, or the relevance of that expertise to the point at issue, are in question. But the hard work comes in judging and showing that this condition holds, and that is where the fallacy-label texts leave off. Or rather, when a text goes further, stating clear, precise, broadly applicable criteria for applying fallacy labels, it provides a critical instrument more fundamental than a taxonomy of fallacies and hence to that extent goes beyond the fallacy-label approach. The further it goes in this direction, the less it needs to emphasize or even to use fallacy labels" (Schwartz, 232). The controversy here is the extent to which it is better to teach students what Schwartz calls "the critical instrument" than to teach the fallacy-label approach. Is the fallacy-label approach better for some kinds
of fallacies than others? If so, which others? Another controversy involves the relationship between the fields of logic and rhetoric. In the field of rhetoric, the primary goal is to persuade the audience. The audience is not going to be persuaded by an otherwise good argument with true premises unless they believe those premises are true. Philosophers tend to de-emphasize this difference between rhetoric and informal logic, and they concentrate on arguments that should fail to
convince the ideally rational reasoner rather than on arguments that are likely not to convince audiences who hold certain background beliefs. Given specific pedagogical goals, how pedagogically effective is this de-emphasis? Advertising in magazines and on television is designed to achieve visual persuasion. And a hug or the fanning of fumes from freshly baked donuts out onto the sidewalk are occasionally used for visceral persuasion. There is some controversy among researchers in
informal logic as to whether the reasoning involved in this nonverbal persuasion can always be assessed properly by the same standards that are used for verbal reasoning. Consulting the list below will give a general idea of the kind of error involved in passages to which the fallacy name is applied. However, simply applying the fallacy name to a passage cannot substitute for a detailed examination of the passage and its context or
circumstances, because there are many instances of reasoning to which a fallacy name might seem to apply, yet, on further examination, it is found that in these circumstances the reasoning is really not fallacious.

Accent

The Accent Fallacy is a fallacy of ambiguity due to the different ways a word or syllable is emphasized or accented. Also called Accentus, Misleading Accent, and Prosody. Example: A member of Congress is asked by a reporter if she is in favor of the President's new missile defense system, and she responds, "I'm in favor of a missile defense system that effectively defends America." With an emphasis on the word "favor," her response is likely to be for the President's missile defense system. With an emphasis, instead, on the word "effectively," her remark is likely to be against the President's missile defense system. And by using neither emphasis, she can later claim that her response was on either side of the issue. For an example of the Fallacy of Accent involving the accent of a syllable within a single word, consider the word "invalid" in the sentence, "Did you mean the invalid one?" When we accent the first syllable, we are speaking of a sick person, but when we accent the second syllable, we are speaking of an argument failing to meet the deductive standard of being valid. By not supplying the accent, and by not supplying additional information to help us disambiguate, we are committing the Fallacy of Accent.

Accident

We often arrive at a generalization but don't or can't list all the exceptions. When we then reason with the generalization as if it has no exceptions, our reasoning contains the Fallacy of Accident. This fallacy is sometimes called the "Fallacy of Sweeping Generalization." Example: People should keep their promises, right? I loaned Dwayne my knife, and he said he'd return it. Now he is refusing to give it back, but I need it right now to slash up my neighbors who disrespected me. People should keep their promises, but there are exceptions to this generalization, as in this case of the psychopath who wants Dwayne to keep his promise to return the knife.

Ad Hoc Rescue

Psychologically, it is understandable that you would try to rescue a cherished belief from trouble. When faced
with conflicting data, you are likely to mention how the conflict will disappear if some new assumption is taken into account. However, if there is no good reason to accept this saving assumption other than that it works to save your cherished belief, your rescue is an Ad Hoc Rescue. Example: Yolanda: If you take four of these tablets of vitamin C every day, you will never get a cold. Juanita: I tried that last year for several months, and still
got a cold. Yolanda: Did you take the tablets every day? Juanita: Yes. Yolanda: Well, I’ll bet you bought some bad tablets. The burden of proof is definitely on Yolanda’s shoulders to prove that Juanita’s vitamin C tablets were probably “bad”—that is, not really vitamin C. If Yolanda can’t do so, her attempt to rescue her hypothesis (that vitamin C prevents colds) is simply a dogmatic refusal to face up to the
possibility of being wrong.

Ad Hominem

Your reasoning contains this fallacy if you make an irrelevant attack on the arguer and suggest that this attack undermines the argument itself. "Ad Hominem" means "to the person," as in being "directed at the person." Example: What she says about Johannes Kepler's astronomy of the 1600s must be just so much garbage. Do you realize she's only fifteen years old? This attack may undermine the young woman's credibility as a scientific authority, but it does not undermine her reasoning itself, because her age is irrelevant to the quality of her reasoning. That reasoning should stand or fall on the scientific evidence, not on the arguer's age or anything else about her personally. The major difficulty with labeling a piece of reasoning an Ad Hominem Fallacy is deciding whether the personal attack is relevant or irrelevant. For example, attacks on a person for their immoral sexual conduct are irrelevant to the quality of their mathematical reasoning, but they are relevant to arguments promoting the person for a leadership position in a church or mosque. If the fallacious reasoner points out irrelevant circumstances that the reasoner is in, such as the arguer's having a vested interest in people accepting the position, then the fallacy may be called a Circumstantial Ad Hominem. If the fallacious attack points out some despicable trait of the arguer, it may be called an Abusive Ad Hominem. An Ad Hominem that attacks an arguer by attacking the arguer's associates is called the Fallacy of Guilt by Association. If the fallacy focuses on a complaint about the origin of the arguer's views, then it is a kind of Genetic Fallacy. If the fallacy is due to claiming the person does not practice what is preached, it is the Tu Quoque Fallacy. Two Wrongs Do Not Make a Right is also a type of Ad Hominem fallacy. The intentional use of the ad hominem fallacy is a tactic used by all dictators and authoritarian leaders. If you say something critical of them or their regime, their immediate response is to attack you as unreliable, as being a puppet of the enemy, or as being a traitor.

Affirming the Consequent

If you have enough evidence to affirm the consequent of a conditional and then suppose that as a result you have sufficient reason for affirming the antecedent, your reasoning contains the Fallacy of Affirming the Consequent. This formal fallacy is often mistaken for Modus Ponens, which is a valid form of reasoning also using a conditional. A conditional is an if-then statement; the if-part is the antecedent, and the then-part is the consequent. The following argument affirms the consequent that she does speak Portuguese. Example: If she's Brazilian, then she speaks Portuguese. Hey, she does speak Portuguese. So, she is Brazilian. If the arguer believes or suggests that the premises definitely establish that she is Brazilian, then the argumentation contains the fallacy. See the Non Sequitur Fallacy for more discussion of this point.

Ambiguity

Any fallacy that turns on ambiguity. See the fallacies of Amphiboly, Accent, and Equivocation. Amphiboly is ambiguity of syntax. Equivocation is ambiguity of semantics. Accent is ambiguity of emphasis.

Amphiboly

This is an error due to taking a grammatically ambiguous phrase in two different ways during the reasoning. Example: Tests show that the dog is not part wolf, as the owner suspected. Did the owner suspect the dog was part wolf, or was not part wolf? Who knows? The sentence is ambiguous and needs to be rewritten to remove the fallacy. Unlike Equivocation, which is due to multiple meanings of a phrase, Amphiboly is due to syntactic ambiguity, that is, ambiguity caused by multiple ways of understanding the grammar of the phrase.

Anecdotal Evidence

This is fallacious generalizing on the basis of a story that provides an inadequate sample. If you discount evidence arrived at by systematic search or by testing in favor of a few firsthand stories, then your reasoning contains the fallacy of overemphasizing anecdotal evidence. Example: Yeah, I've read the health warnings on those cigarette packs
and I know about all that health research, but my brother smokes, and he says he's never been sick a day in his life, so I know smoking can't really hurt you.

Anthropomorphism

This is the error of projecting uniquely human qualities onto something that isn't human. Usually this occurs with projecting the human qualities onto animals, but when it is done to nonliving things, as in calling the storm cruel, the Pathetic Fallacy is created. The fallacy is also, though less commonly, called the Disney Fallacy or the Walt Disney Fallacy. Example: My dog is wagging his tail and running around me. Therefore, he knows that I love him. The fallacy would be averted if the speaker had said, "My dog is wagging his tail and running around me. Therefore, he is happy to see me." Animals are likely to have some human emotions, but not the ability to ascribe knowledge to other beings. Your dog knows where it buried its bone, but not that you also know where the bone is.

Appeal to Authority

You appeal to authority if you back up your reasoning by saying that it is supported by what some authority says on the subject. Most reasoning of this kind is not fallacious, and much of our knowledge properly comes from listening to authorities. However, appealing to authority as a reason to believe something is fallacious whenever the authority appealed to is not really an authority in this particular subject, when the authority cannot be trusted to tell the truth, when authorities disagree on this subject (except for the occasional lone wolf), when the reasoner misquotes the authority, and so forth. Although spotting a fallacious appeal to authority often requires some background knowledge about the subject or the authority, in brief it can be said that it is fallacious to accept the words of a supposed authority when we should be suspicious of the authority's words. Example: The moon is covered with dust because the president of our neighborhood association said so. This is a Fallacious Appeal to Authority because, although the president is an authority on many neighborhood matters, you are given no reason to believe the president is an authority on the composition of the moon. It would be better to appeal to some astronomer or geologist. A TV commercial that gives you a testimonial from a famous film star who wears a Wilson watch and that suggests you, too, should wear that brand of watch is using a fallacious appeal to authority. The film star is an authority on how to act, not on which watch is best for you.

Appeal to Consequence

Arguing that a belief is false because it implies something you'd rather not believe. Also called Argumentum ad Consequentiam. Example: That
can’t be Senator Smith there in the videotape going into her apartment. If it were, he’d be a liar about not knowing her. He’s not the kind of man who would lie. He’s a member of my congregation. Smith may or may not be the person in that videotape, but this kind of arguing should not convince us that it’s someone else in the videotape. Your reasoning contains the Fallacy of Appeal to Emotions when someone’s appeal to you to accept their claim
is accepted merely because the appeal arouses your feelings of anger, fear, grief, love, outrage, pity, pride, sexuality, sympathy, relief, and so forth. Example of appeal to relief from grief: [The speaker knows he is talking to an aggrieved person whose house is worth much more than $100,000.] You had a great job and didn’t deserve to lose it. I wish I could help somehow. I do have one idea. Now your family needs financial security even more. You need cash. I can help you.
Here is a check for $100,000. Just sign this standard sales agreement, and we can skip the realtors and all the headaches they would create at this critical time in your life. There is nothing wrong with using emotions when you argue, but it’s a mistake to use emotions as the key premises or as tools to downplay relevant information. Regarding the Fallacy of Appeal to Pity, it is proper to pity people who
have had misfortunes, but if as the person’s history instructor you accept Max’s claim that he earned an A on the history quiz because he broke his wrist while playing in your college’s last basketball game, then you’ve used the fallacy of appeal to pity. See Scare Tactic. The Fallacy of
Appeal to Ignorance comes in two forms: (1) Not knowing that a certain statement is true is taken to be a proof that it is false. (2) Not knowing that a statement is false is taken to be a proof that it is true. The fallacy occurs in cases where absence of evidence is not good enough evidence of absence. The fallacy uses an unjustified attempt to shift the burden of proof. The fallacy is also called “Argument from Ignorance.” Example: Nobody has ever proved to me there’s
a God, so I know there is no God. This kind of reasoning is generally fallacious. It would be proper reasoning only if the proof attempts were quite thorough, and it were the case that, if the being or object were to exist, then there would be a discoverable proof of this. Another common example of the fallacy involves ignorance of a future event: You people have been complaining about the danger of Xs ever since they were invented, but there’s never been any big problem with
Xs, so there’s nothing to worry about. The Fallacy of Appeal to Money uses the error of supposing that, if something costs a great deal of money, then it must be better, or supposing that if someone has a great deal of money, then they’re a better person in some way unrelated to having a great deal of money. Similarly it’s a mistake to suppose that if something is cheap it must be of inferior quality, or to suppose that if someone is poor financially then they’re
poor at something unrelated to having money. Example: He’s rich, so he should be the president of our Parents and Teachers Organization. See Appeal to the People. See Appeal to Emotions. See Appeal to Emotions. See Appeal to the People. See Appeal to the People. See
Appeal to the People. If you suggest too strongly that someone’s claim or argument is correct simply because it’s what most everyone believes, then your reasoning contains the Fallacy of Appeal to the People. Similarly, if you suggest too strongly that someone’s claim or argument is mistaken simply because it’s not what most everyone believes, then your reasoning also uses the fallacy.
Agreement with popular opinion is not necessarily a reliable sign of truth, and deviation from popular opinion is not necessarily a reliable sign of error, but if you assume it is and do so with enthusiasm, then you are using this fallacy. It is essentially the same as the fallacies of Ad Numerum, Appeal to the Gallery, Appeal to the Masses, Argument from Popularity, Argumentum ad Populum, Common Practice, Mob Appeal, Past Practice, Peer Pressure, and Traditional Wisdom. The “too strongly”
mentioned above is important in the description of the fallacy because what most everyone believes is, for that reason, somewhat likely to be true, all things considered. However, the fallacy occurs when this degree of support is overestimated. Example: You should turn to channel 6. It’s the most watched channel this year. This is fallacious because of its implicitly accepting the questionable premise that the most watched channel this year is, for that
reason alone, the best channel for you. If you stress the idea of appealing to a new idea held by the gallery, masses, mob, peers, people, and so forth, then it is a Bandwagon Fallacy. See Appeal to Emotions (fear). See
Appeal to Authority. See Appeal to Emotions. See Appeal to Ignorance. See Appeal to
Emotions. See Appeal to the People. See Ad …. without the word “Argumentum.” Argumentum Consensus GentiumSee Appeal to Traditional Wisdom. Availability HeuristicWe have an unfortunate instinct to base an important decision on an easily recalled, dramatic example, even though we know the example is atypical. It is a specific version of the fallacy of Confirmation Bias. Example:
This reasoning commits the Fallacy of the Availability Heuristic because the reasoner would realize, if he stopped to think for a moment, that many more lives are saved by wearing seat belts than by not wearing them, and the video of the woman unable to unbuckle her seat belt in the car crash depicts an atypical situation. The name of this fallacy is not very memorable, but it is in common use.

Avoiding the Issue
A reasoner who is supposed to address an issue but instead goes off on a tangent is properly accused of using the Fallacy of Avoiding the Issue. Also called missing the point, straying off the subject, digressing, and not sticking to the issue. Example:
However, the fallacy isn’t used by a reasoner who says that some other issue must first be settled and then continues by talking about this other issue, provided the reasoner is correct in claiming this dependence of one issue upon the other.

Avoiding the Question
The Fallacy of Avoiding the Question is a type of Fallacy of Avoiding the Issue that occurs when the issue is how to answer some question. The fallacy occurs when someone’s answer doesn’t really respond to the question asked. The fallacy is also called “Changing the Question.” Example:
Bad Seed
Attempting to undermine someone’s reasoning by pointing out their “bad” family history, when it is an irrelevant point. See Genetic Fallacy.

Bald Man
See Line-Drawing.

Bandwagon
If you suggest that someone’s claim is correct simply because it’s what most everyone is coming to believe, then you are using the Bandwagon Fallacy. Get up here with us on the wagon where the band is playing, and go where we go, and don’t think too much about the reasons. The Latin term for this Fallacy of Appeal to Novelty is Argumentum ad Novitatem. Example:
Like its close cousin, the Fallacy of Appeal to the People, the Bandwagon Fallacy needs to be carefully distinguished from properly defending a claim by pointing out that many people have studied the claim and have come to a reasoned conclusion that it is correct. What most everyone believes is likely to be true, all things considered, and if one defends a claim on those grounds, this is not a fallacious inference. What is fallacious is to be swept up by the excitement of a new idea or new fad and to unquestioningly give it too high a degree of your belief solely on the grounds of its new popularity, perhaps thinking simply that ‘new is better.’ The key ingredient that is missing from a bandwagon fallacy is knowledge that an item is popular because of its high quality.

Begging the Question
A form of circular reasoning in which a conclusion is derived from premises that presuppose the conclusion. Normally, the point of good reasoning is to start out at one place and end up somewhere new, namely having reached the goal of increasing the degree of reasonable belief in the conclusion. The point is to make progress, but in cases of begging the question there is no progress. Example:
The president is saying basically that women shouldn’t fight bulls because women shouldn’t fight bulls. This reasoning isn’t making any progress. Insofar as the conclusion of a deductively valid argument is “contained” in the premises from which it is deduced, this containing might seem to be a case of presupposing, and thus any deductively valid argument might seem to be begging the question. It is still an open question among logicians as to why some deductively valid arguments are considered to be begging the question and others are not. Some logicians suggest that, in informal reasoning with a deductively valid argument, if the conclusion is psychologically new insofar as the premises are concerned, then the argument isn’t an example of the fallacy. Other logicians suggest that we need to look instead to surrounding circumstances, not to the psychology of the reasoner, in order to assess the quality of the argument. For example, we need to look to the reasons that the reasoner used to accept the premises. Was the premise justified on the basis of accepting the conclusion? A third group of logicians say that, in deciding whether the fallacy is present, more evidence is needed. We must determine whether any premise that is key to deducing the conclusion is adopted rather blindly or instead is a reasonable assumption made by someone accepting their burden of proof. The premise would here be termed reasonable if the arguer could defend it independently of accepting the conclusion that is at issue.

Beside the Point
Arguing for a conclusion that is not relevant to the current issue. Also called Irrelevant Conclusion. It is a form of the Red Herring Fallacy.

Biased Generalizing
Generalizing from a biased sample. Using an unrepresentative sample and overestimating the strength of an argument based on that sample.

Biased Sample
See Unrepresentative Sample.

Biased Statistics
See Unrepresentative Sample.

Bifurcation
See Black-or-White.
Black-or-White
The Black-or-White fallacy or Black-White fallacy is a False Dilemma Fallacy that limits you unfairly to only two choices, as if you were made to choose between black and white. Example:
A proper challenge to this fallacy could be to say, “I do want to prevent the destruction of our environment, but I don’t want to give $20 to your fund. You are placing me between a rock and a hard place.” The key to diagnosing the Black-or-White Fallacy is to determine whether the limited menu is fair or unfair. Simply saying, “Will you contribute $20 or won’t you?” is not unfair. The black-or-white fallacy is often committed intentionally in jokes such as: “My toaster has two settings—burnt and off.” In thinking about this kind of fallacy it is helpful to remember that everything is either black or not black, but not everything is either black or white.

Caricaturization
Attacking a person’s argument by presenting a caricaturization is a form of the Straw Man Fallacy and the Ad Hominem Fallacy. A critical thinker should attack the real man, not a caricaturization of the man. Ditto for women, of course. The Caricaturization Fallacy is the same as the Fallacy of Refutation by Caricature.

Changing the Question
This is another name for the Fallacy of Avoiding the Question.

Cherry-Picking the Evidence
This is another name for the Fallacy of Suppressed Evidence.

Circular Reasoning
The Fallacy of Circular Reasoning occurs when the reasoner begins with what he or she is trying to end up with. Here is Steven Pinker’s example:
The most well known examples of circular reasoning are cases of the Fallacy of Begging the Question. Here the circle is as short as possible. However, if the circle is very much larger, including a wide variety of claims and a large set of related concepts, then the circular reasoning can be informative and so is not considered to be fallacious. For example, a dictionary contains a large circle of definitions that use words which are defined in terms of other words that are also defined in the dictionary. Because the dictionary is so informative, it is not considered as a whole to be fallacious. However, a small circle of definitions is considered to be fallacious. In properly-constructed recursive definitions, defining a term by using that same term is not fallacious. For example, here is an appropriate recursive definition of the term “a stack of coins.” Basis step: Two coins, with one on top of the other, is a stack of coins. Recursion step: If p is a stack of coins, then adding a coin on top of p produces a stack of coins. For a deeper discussion of circular reasoning see Infinitism in Epistemology.

Circumstantial Ad Hominem
See Ad Hominem, Circumstantial.

Clouding the Issue
See Smokescreen.

Common Belief
See Appeal to the People and Traditional Wisdom.

Common Cause
This fallacy occurs during causal reasoning when a causal connection between two kinds of events is claimed when evidence is available indicating that both are the effect of a common cause. Example:
However, it’s the rain that’s the common cause of both.

Common Practice
See Appeal to the People and Traditional Wisdom.

Complex Question
You use this fallacy when you frame a question so that some controversial presupposition is made by the wording of the question. Example:
The question unfairly presumes the controversial claim that the policy really is a waste of money. The Fallacy of Complex Question is a form of Begging the Question.

Composition
The Composition Fallacy occurs when someone mistakenly assumes that a characteristic of some or all the individuals in a group is also a characteristic of the group itself, the group “composed” of those members. It is the converse of the Division Fallacy. Example:
Confirmation Bias
The tendency to look for evidence in favor of one’s controversial hypothesis and not to look for disconfirming evidence, or to pay insufficient attention to it. This is the most common kind of Fallacy of Selective Attention, and it is the foundation of many conspiracy theories. Example:
Using the Fallacy of Confirmation Bias is usually a sign that one has adopted some belief dogmatically and isn’t willing to disconfirm the belief, or is too willing to interpret ambiguous evidence so that it conforms to what one already believes. Confirmation bias often reveals itself in the fact that people of opposing views can each find support for those views in the same piece of evidence.

Conjunction
Mistakenly supposing that event E is less likely than the conjunction of events E and F. Here is an example from the psychologists Daniel Kahneman and Amos Tversky. Example:
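Whatever the particular example, the underlying arithmetic is fixed: a conjunction can never be more probable than either of its conjuncts taken alone. In the standard notation of probability theory:

```latex
P(E \wedge F) \;=\; P(E)\,P(F \mid E) \;\le\; P(E),
\qquad \text{since } 0 \le P(F \mid E) \le 1 .
```

Kahneman and Tversky found that subjects often rate a richly detailed conjunction as more probable than one of its conjuncts on its own, which this inequality shows can never be correct.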
Confusing an Explanation with an Excuse
Treating someone’s explanation of a fact as if it were a justification of the fact. Explaining a crime should not be confused with excusing the crime, but it too often is.
Consensus Gentium
Fallacy of Argumentum Consensus Gentium (argument from the consensus of the nations). See Traditional Wisdom.

Consequence
See Appeal to Consequence.

Contextomy
See Quoting out of Context.

Converse Accident
If we reason by paying too much attention to exceptions to the rule, and generalize on the exceptions, our reasoning contains this fallacy. This fallacy is the converse of the Accident Fallacy. It is a kind of Hasty Generalization, by generalizing too quickly from a peculiar case. Example:
The original generalization is “Turtles live longer than tarantulas.” There are exceptions, such as the turtle bought from the pet store. Rather than seeing this for what it is, namely an exception, the reasoner places too much trust in this exception and generalizes on it to produce the faulty generalization that turtles bought from pet stores do not live longer than tarantulas.

Cover-up
See Suppressed Evidence.

Cum Hoc, Ergo Propter Hoc
Latin for “with this, therefore because of this.” This is a False Cause Fallacy that doesn’t depend on time order (as does the post hoc fallacy), but on any other chance correlation of the supposed cause being in the presence of the supposed effect. Example:
Curve Fitting
Curve fitting is the process of constructing a curve that has the best fit to a series of data points. The curve is a graph of some mathematical function. The function or functional relationship might be between variable x and variable y, where x is the time of day and y is the temperature of the ocean. When you collect data about some relationship, you inevitably collect information that is affected by noise or statistical fluctuation. If you create a function between x and y that is too sensitive to your data, you will be overemphasizing the noise and producing a function that has less predictive value than necessary. If you create your function by interpolating, that is, by drawing straight line segments between all the adjacent data points, or if you create a polynomial function that exactly fits every data point, it is likely that your function will be worse than if you’d produced a function with a smoother curve. Your original error of too closely fitting the data points is called the Fallacy of Curve Fitting or the Fallacy of Overfitting. Example:
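The overfitting error can be seen in a small numerical sketch (the data here are invented for illustration): noisy samples of a simple linear trend are fit once with a straight line and once with a degree-9 polynomial that passes through every data point, and the wiggly interpolant ends up tracking the noise rather than the trend.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy samples of the true relationship y = 2x.
x = np.linspace(0.0, 1.0, 10)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)

# A smooth degree-1 fit versus a degree-9 polynomial that
# exactly interpolates all ten noisy points.
smooth = np.polynomial.Polynomial.fit(x, y, deg=1)
overfit = np.polynomial.Polynomial.fit(x, y, deg=9)

# Compare each fit against the noise-free trend on a fine grid.
grid = np.linspace(0.0, 1.0, 201)
dev_smooth = np.max(np.abs(smooth(grid) - 2.0 * grid))
dev_overfit = np.max(np.abs(overfit(grid) - 2.0 * grid))
# The interpolant has (near) zero error on the data points,
# yet strays farther from the true trend between them.
```

The degree-9 curve reproduces every training point almost exactly but deviates from the true line more than the straight-line fit does in between the points, which is precisely the loss of predictive value the entry describes.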
Definist
The Definist Fallacy occurs when someone unfairly defines a term so that a controversial position is made easier to defend. Same as the Persuasive Definition. Example:
Denying the Antecedent
You are using this fallacy if you deny the antecedent of a conditional and then suppose that doing so is a sufficient reason for denying the consequent. This formal fallacy is often mistaken for Modus Tollens, a valid form of argument using the conditional. A conditional is an if-then statement; the if-part is the antecedent, and the then-part is the consequent. Example:
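Because this is a formal fallacy, its invalidity can be checked mechanically. The sketch below (the helper names are mine) enumerates all truth-value assignments, looking for a row where the premises “if P then Q” and “not P” are true while the conclusion “not Q” is false:

```python
from itertools import product

def implies(p, q):
    # Material conditional: false only when p is true and q is false.
    return (not p) or q

# Rows where both premises hold but the conclusion fails.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and (not p) and not (not q)
]

# The same search for Modus Tollens: premises "if P then Q"
# and "not Q", conclusion "not P".
mt_counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and (not q) and not (not p)
]
```

The first search finds the row P false, Q true, so the form is invalid; the Modus Tollens search finds no counterexample row, which is why the two forms are so easily confused yet differ in validity.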
Disregarding Known Science
This fallacy is committed when a person makes a claim that knowingly or unknowingly disregards well-known science, science that weighs against the claim. They should know better. This fallacy is a form of the Fallacy of Suppressed Evidence. Example:
Digression
See Avoiding the Issue.

Distraction
See Smokescreen.

Division
Merely because a group as a whole has a characteristic, it often doesn’t follow that individuals in the group have that characteristic. If you suppose that it does follow, when it doesn’t, your reasoning contains the Fallacy of Division. It is the converse of the Composition Fallacy. Example:
Domino
See Slippery Slope.

Double Standard
There are many situations in which you should judge two things or people by the same standard. If in one of those situations you use different standards for the two, your reasoning contains the Fallacy of Using a Double Standard. Example:
This example is a fallacy if it can be presumed that men and women should have to meet the same standard for becoming a Post Office employee.

Either/Or
See Black-or-White.

Equivocation
Equivocation is the illegitimate switching of the meaning of a term that occurs twice during the reasoning; it is the use of one word taken in two ways. The fallacy is a kind of Fallacy of Ambiguity. Example:
The term “nobody” changes its meaning without warning in the passage. Equivocation can sometimes be very difficult to detect, as in this argument from Walter Burleigh: If I call you a swine, then I call you an animal.

Etymological
The Etymological Fallacy occurs whenever someone falsely assumes that the meaning of a word can be discovered from its etymology or origins. Example:
Every and All
The Fallacy of Every and All turns on errors due to the order or scope of the quantifiers “every” and “all” and “any.” This is a version of the Scope Fallacy. Example:
In proposing this fallacious argument, Aristotle believed the common end is the supreme good, so he had a rather optimistic outlook on the direction of history.

Exaggeration
When we overstate or overemphasize a point that is a crucial step in a piece of reasoning, then we are guilty of the Fallacy of Exaggeration. This is a kind of error called Lack of Proportion. Example:
When we exaggerate in order to make a joke, though, we do not use the fallacy because we do not intend to be taken literally.

Excluded Middle
See False Dilemma or Black-or-White.

False Analogy
The problem is that the items in the analogy are too dissimilar. When reasoning by analogy, the fallacy occurs when the analogy is irrelevant or very weak or when there is a more relevant disanalogy. See also Faulty Comparison. Example:
False Balance
A specific form of the False Equivalence Fallacy that occurs in the context of news reporting, in which the reporter misleads the audience by suggesting the evidence on two sides of an issue is equally balanced, when the reporter knows that one of the two sides is an extreme outlier. Reporters regularly commit this fallacy in order to appear “fair and balanced.” Example:
False Cause
Improperly concluding that one thing is a cause of another. The Fallacy of Non Causa Pro Causa is another name for this fallacy. Its four principal kinds are the Post Hoc Fallacy, the Fallacy of Cum Hoc, Ergo Propter Hoc, the Regression Fallacy, and the Fallacy of Reversing Causation. Example:
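One common route into the fallacy is treating a bare correlation as causal when a common cause explains it. The toy simulation below (all quantities are invented for illustration) drives two otherwise unrelated series with the same underlying variable, and the two end up strongly correlated with each other:

```python
import numpy as np

rng = np.random.default_rng(1)

# A common cause: daily temperature.
temperature = rng.normal(loc=25.0, scale=5.0, size=1000)

# Two series that respond to temperature but never to each other.
ice_cream_sales = 10.0 * temperature + rng.normal(scale=5.0, size=1000)
pool_attendance = 3.0 * temperature + rng.normal(scale=5.0, size=1000)

# Pearson correlation between the two effect series.
r = np.corrcoef(ice_cream_sales, pool_attendance)[0, 1]
# r comes out close to 1, yet concluding that ice cream sales
# cause pool attendance would be a False Cause error.
```

The strong correlation is entirely manufactured by the shared temperature variable, which is why checking for a common cause belongs in any causal argument.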
False Dichotomy
See False Dilemma or Black-or-White.

False Dilemma
A reasoner who unfairly presents too few choices and then implies that a choice must be made among this short menu of choices is using the False Dilemma Fallacy, as does the person who accepts this faulty reasoning. Example:
The pollster is committing the fallacy by limiting you to only those choices. What about the choice of “no times per week”? Think of the unpleasant choices as being the horns of a bull that is charging toward you. By demanding other choices beyond those on the unfairly limited menu, you thereby “go between the horns” of the dilemma, and are not gored. The fallacy is called the “False Dichotomy Fallacy” or the “Black-or-White” Fallacy when the unfair menu contains only two choices, and thus two horns.

False Equivalence
The Fallacy of False Equivalence is committed when someone implies falsely (and usually indirectly) that the two sides on some issue have basically equivalent evidence, while knowingly covering up the fact that one side’s evidence is much weaker. A form of the Fallacy of Suppressed Evidence. Example:
Far-Fetched Hypothesis
This is the fallacy of offering a bizarre (far-fetched) hypothesis as the correct explanation without first ruling out more mundane explanations. Example:
Faulty Comparison
If you try to make a point about something by comparison, and if you do so by comparing it with the wrong thing, then your reasoning uses the Fallacy of Faulty Comparison or the Fallacy of Questionable Analogy. Example:
Shouldn’t Durell hiking boots be compared with other hiking boots, not with tennis shoes?

Faulty Generalization
A fallacy produced by some error in the process of generalizing. See Hasty Generalization or Unrepresentative Generalization for examples.

Faulty Motives
An irrelevant appeal to the motives of the arguer, and supposing that this revelation of their motives will thereby undermine their reasoning. A kind of Ad Hominem Fallacy. Example:
Formal Fallacy
Formal fallacies are all the cases or kinds of reasoning that fail to be deductively valid. Formal fallacies are also called Logical Fallacies or Invalidities. That is, they are deductively invalid arguments that are too often believed to be deductively valid. Example:
This might at first seem to be a good argument, but actually it is fallacious because it has the same logical form as the following more obviously invalid argument:
Nearly all the infinity of types of invalid inferences have no specific fallacy names.

Four Terms
The Fallacy of Four Terms (quaternio terminorum) occurs when four rather than three categorical terms are used in a standard-form syllogism. Example:
The word “banks” occurs as two distinct terms, namely river bank and financial bank, so this example also is an equivocation. Without an equivocation, the four-term fallacy is trivially invalid.

Gambler’s
This fallacy occurs when the gambler falsely assumes that the history of outcomes will affect future outcomes. Example:
The fallacious move was to conclude that the probability of the next toss coming up tails must be more than a half. The assumption that it’s a fair coin is important because, if the coin comes up heads five times in a row, one would otherwise become suspicious that it’s not a fair coin and therefore properly conclude that heads is more likely on the next toss.

Genetic
A critic uses the Genetic Fallacy if the critic attempts to discredit or support a claim or an argument because of its origin (genesis) when such an appeal to origins is irrelevant. Example:
Fortune cookies are not reliable sources of information about what gift to buy, but the reasons the person is willing to give are likely to be quite relevant and should be listened to. The speaker is committing the Genetic Fallacy by paying too much attention to the genesis of the idea rather than to the reasons offered for it. If I learn that your plan for building the shopping center next to the Johnson estate originated with Johnson himself, who is likely to profit from the deal, then my request that the planning commission not accept your proposal without independent verification of its merits wouldn’t be committing the genetic fallacy. Because appeals to origins are sometimes relevant and sometimes irrelevant and sometimes on the borderline, in those latter cases it can be very difficult to decide whether the fallacy has been committed. For example, if Sigmund Freud shows that the genesis of a person’s belief in God is their desire for a strong father figure, then does it follow that their belief in God is misplaced, or is Freud’s reasoning committing the Genetic Fallacy?

Group Think
A reasoner uses the Group Think Fallacy if he or she substitutes pride of membership in the group for reasons to support the group’s policy. If that’s what our group thinks, then that’s good enough for me. It’s what I think, too. “Blind” patriotism is a rather nasty version of the fallacy. Example:
Guilt by Association
Guilt by Association is a version of the Ad Hominem Fallacy in which a person is said to be guilty of error because of the group he or she associates with. The fallacy occurs when we unfairly try to change the issue to be about the speaker’s circumstances rather than about the speaker’s actual argument. Also called “Ad Hominem, Circumstantial.” Example:
Has any evidence been presented here that Acheson’s actions are inappropriate in regard to communism? This sort of reasoning is an example of McCarthyism, the technique of smearing liberal Democrats that was so effectively used by the late Senator Joe McCarthy in the early 1950s. In fact, Acheson was strongly anti-communist and the architect of President Truman’s firm policy of containing Soviet power.

Hasty Conclusion
See Jumping to Conclusions.

Hasty Generalization
A Hasty Generalization is a Fallacy of Jumping to Conclusions in which the conclusion is a generalization. See also Biased Statistics. Example:
In any Hasty Generalization the key error is to overestimate the strength of an argument that is based on too small a sample for the implied confidence level or error margin. In this argument about Nicaragua, using the word “all” in the conclusion implies zero error margin. With zero error margin you’d need to sample every single person in Nicaragua, not just two people.

Heap
See Line-Drawing.

Hedging
You are hedging if you refine your claim simply to avoid counterevidence and then act as if your revised claim is the same as the original. Example:
You do not use the fallacy if you explicitly accept the counterevidence, admit that your original claim is incorrect, and then revise it so that it avoids that counterevidence.

Hooded Man
This is an error in reasoning due to confusing the knowing of a thing with the knowing of it under all its various names or descriptions. Example:
Hyperbolic Discounting
The Fallacy of Hyperbolic Discounting occurs when someone too heavily weighs the importance of a present reward over a significantly greater reward in the near future, but only slightly differs in their valuations of those two rewards if they are to be received in the far future. The person’s preferences are biased toward the present. Example: When asked to decide between receiving an award of $50 now or $60 tomorrow, the person chooses the $50; however, when asked to decide between receiving $50 in two years or $60 in two years and one day, the person chooses the $60. If the person is in a situation in which $50 now will solve their problem but $60 tomorrow will not, then there is no fallacy in having a bias toward the present.

Hypostatization
The error of inappropriately treating an abstract term as if it were a concrete one. Also known as the Fallacy of Misplaced Concreteness and the Fallacy of Reification. Example:
Nature isn’t capable of making decisions. The point can be made without reasoning fallaciously by saying: “Which organisms live and which die is determined by natural causes.” Whether a phrase commits the fallacy depends crucially upon whether the use of the inaccurate phrase is inappropriate in the situation. In a poem, it is appropriate and very common to reify nature, hope, fear, forgetfulness, and so forth, that is, to treat them as if they were objects or beings with intentions. In any scientific claim, it is inappropriate.

Ideology-Driven Argumentation
This occurs when an arguer presupposes some aspect of their own ideology that they are unable to defend. Example:
The arguer is presupposing a liberal ideology which implies that permitting private citizens to carry concealed handguns increases crime and decreases safety. If the arguer is unable to defend this presumption, then the fallacy is committed regardless of whether the presumption is defensible. If the senator were to accept this liberal ideology, then the senator is likely to accept the arguer’s conclusion, and the argument could be considered to be effective, but still it would be fallacious—such is the difference between rhetoric and logic.

Ignoratio Elenchi
See Irrelevant Conclusion. Also called missing the point.

Ignoring a Common Cause
See Common Cause.

Ignoring Inconvenient Data
See Suppressed Evidence.

Incomplete Evidence
See Suppressed Evidence.

Improper Analogy
Another name for the Fallacy of False Analogy.

Inconsistency
The fallacy occurs when we accept an inconsistent set of claims, that is, when we accept a claim that logically conflicts with other claims we hold. Example:
That last remark implies the speaker does generalize, although the speaker doesn’t notice this inconsistency with what is said.

Inductive Conversion
Improperly reasoning from a claim of the form “All As are Bs” to “All Bs are As” or from one of the form “Many As are Bs” to “Many Bs are As” and so forth. Example:
The term “conversion” is a technical term in formal logic.

Insufficient Statistics
Drawing a statistical conclusion from a set of data that is clearly too small. Example:
This fallacy is a form of the Fallacy of Jumping to Conclusions.

Intensional
The mistake of treating different descriptions or names of the same object as equivalent even in those contexts in which the differences between them matter. Reporting someone’s beliefs or assertions or making claims about necessity or possibility can be such contexts. In these contexts, replacing a description with another that refers to the same object is not valid and may turn a true sentence into a false one. Example:
Michelle said no such thing. The faulty reasoner illegitimately assumed that what is true of a person under one description will remain true when said of that person under a second description even in this context of indirect quotation. What was true of the person when described as “her new neighbor Stalnaker” is that Michelle said she wants to meet him, but it wasn’t legitimate for me to assume this is true of the same person when he is described as “a spy for North Korea.” Extensional contexts are those in which it is legitimate to substitute equals for equals with no worry. But any context in which this substitution of co-referring terms is illegitimate is called an intensional context. Intensional contexts are produced by quotation, modality, and intentionality (propositional attitudes). Intensionality is failure of extensionality, thus the name “Intensional Fallacy.”

Invalid Reasoning
An invalid inference. An argument can be assessed by deductive standards to see if the conclusion would have to be true if the premises were to be true. If the argument cannot meet this standard, it is invalid. An argument is invalid only if it is not an instance of any valid argument form. The Fallacy of Invalid Reasoning is a formal fallacy. Example:
This invalid argument is an instance of Denying the Antecedent. Any invalid inference that is also inductively very weak is a Non Sequitur.

Irrelevant Conclusion
The conclusion that is drawn is irrelevant to the premises; it misses the point. Example:
The testimony of Thompson may be relevant to a request for leniency, but it is irrelevant to any claim about the defendant not being near the murder scene. Other examples of this fallacy are Ad Hominem, Appeal to Authority, Appeal to Emotions, and Argument from Ignorance.

Irrelevant Reason
This fallacy is a kind of Non Sequitur in which the premises are wholly irrelevant to drawing the conclusion. Example:
Is-Ought
The Is-Ought Fallacy occurs when a conclusion expressing what ought to be so is inferred from premises expressing only what is so, in which it is supposed that no implicit or explicit ought-premises are needed. There is controversy in the philosophical literature regarding whether this type of inference is always fallacious. Example:
This argument would not use the fallacy if there were an implicit premise indicating that he is a person and that persons should not torture other beings.

Jumping to Conclusions
It is not always a mistake to make a quick decision, but when we draw a conclusion without taking the trouble to acquire enough of the relevant evidence, our reasoning commits the fallacy of jumping to conclusions, provided there was sufficient time to acquire and assess that extra evidence, and provided that the extra effort it takes to get the evidence isn’t prohibitive. Example:
Hold on. Before concluding that you should buy it, ask yourself whether you need to buy another car and, if so, whether you should lease or rent or just borrow a car when you need to travel by car. If you do need to buy a car, you ought to have someone check its operating condition, or else you should make sure you get a guarantee about the car’s being in working order. And, if you stop to think about it, there may be other factors you should consider before making the purchase, such as its age, size, appearance, and mileage.

Lack of Proportion
The Fallacy of Lack of Proportion occurs either by exaggerating or downplaying or simply not noticing a point that is a crucial step in a piece of reasoning. You exaggerate when you make a mountain out of a molehill. You downplay when you suppress relevant evidence. The Genetic Fallacy blows the genesis of an idea out of proportion. Example:
The speaker is blowing these isolated incidents out of proportion. Millions of tourists visit Russia with no problems. Another example occurs when the speaker simply lacks the information needed to give a factor its proper proportion or weight:
The speaker does not realize all experts agree that electric and magnetic fields caused by home wiring are harmless. However, touching the metal within those wires is very dangerous.

Line-Drawing
If we improperly reject a vague claim because it is not as precise as we’d like, then we are using the Line-Drawing Fallacy. Being vague is not being hopelessly vague. Also called the Bald Man Fallacy, the Fallacy of the Heap, and the Sorites Fallacy. Example:
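The reasoning pattern behind the Sorites name can be set out formally; writing H(n) for the vague claim “n grains of sand make a heap,” the paradoxical chain runs:

```latex
H(10{,}000), \qquad \forall n \,\bigl( H(n) \rightarrow H(n-1) \bigr)
\;\;\therefore\;\; H(1)
```

Each single step looks harmless, yet the chain ends in absurdity. The Line-Drawing Fallacy is the further mistake of concluding that, because no precise n can be identified where H(n) first fails, the vague predicate itself must be rejected.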
Loaded Language
Loaded language is emotive terminology that expresses value judgments. When used in what appears to be an objective description, the terminology unfortunately can cause the listener to adopt those values when in fact no good reason has been given for doing so. Also called Prejudicial Language. Example:
This broadcast is an editorial posing as a news report.

Loaded Question
Asking a question in a way that unfairly presumes the answer. This fallacy occurs commonly in polls, especially push polls, which are polls designed to push information onto the person being polled and not designed to learn the person’s views. Example:
Logic Chopping
Obscuring the issue by using overly technical logic tools, especially the techniques of formal symbolic logic, that focus attention on trivial details. A form of Smokescreen and Quibbling.

Logical
See Formal.

Lying
A fallacy of reasoning that depends on intentionally saying something that is known to be false. If the lying occurs in an argument’s premise, then it is an example of the Fallacy of Questionable Premise. Example:
Roosevelt was never assassinated.

Maldistributed Middle
See Undistributed Middle.

Many Questions
See Complex Question.

Misconditionalization
See Modal Fallacy.

Misleading Accent
See the Fallacy of Accent.

Misleading Vividness
When the Fallacy of Jumping to Conclusions is due to a special emphasis on an anecdote or other piece of evidence, then the Fallacy of Misleading Vividness has occurred. Example:
The vivid anecdote is the story about Uncle Harry. Too much emphasis is placed on it and not enough on the statistics from the Surgeon General.

Misplaced Concreteness
Mistakenly supposing that something is a concrete object with independent existence, when it’s not. Also known as the Fallacy of Reification and the Fallacy of Hypostatization. Example:
John mistakenly supposed a group or set of concrete objects is also a concrete object. A less metaphysical example would be a situation where John says a criminal was caught by K-9 aid and thereby supposes that K-9 aid is some sort of concrete object. John could have expressed the same point less misleadingly by saying a K-9 dog aided in catching a criminal.

Misplaced Burden of Proof
Committing the error of trying to get someone else to prove you are wrong, when it is your responsibility to prove you are correct. Example:
If someone says, “I saw a green alien from outer space,” you properly should ask for some proof. If the person responds with no more than something like, “Prove I didn’t,” then they are not accepting their burden of proof and are improperly trying to place it on your shoulders.

Misrepresentation
If the misrepresentation occurs on purpose, then it is an example of lying. If the misrepresentation occurs during a debate in which there is misrepresentation of the opponent’s claim, then it would be the cause of a Straw Man Fallacy.

Missing the Point
See Irrelevant Conclusion.

Mob Appeal
See Appeal to the People.

Modal
This is the error of treating modal conditionals as if the modality applies only to the then-part of the conditional when it more properly applies to the entire conditional. Example:
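The scope distinction this entry turns on can be written symbolically. A minimal sketch in standard modal notation, with □ for “necessarily” (the letters P and Q are illustrative abbreviations for the two clauses of a modal conditional, not part of the original example):

```latex
% Let P abbreviate the if-clause and Q the then-clause.
% Fallacious reading: the necessity attaches only to the consequent.
P \rightarrow \Box Q
% Correct reading: the necessity governs the whole conditional.
\Box (P \rightarrow Q)
```

From □(P → Q) together with P one may infer Q, but not □Q; the stronger, necessitated conclusion is licensed only by the fallacious narrow-scope reading.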
This apparently valid argument is invalid. It is not necessarily true that James has more than one child; it’s merely true that he has more than one child. He could have had no children. It is logically possible that James has no children even though he actually has two. The solution to the fallacy is to see that the premise “If James has two children, then he necessarily has more than one child” requires the modality “necessarily” to apply logically to the entire conditional “If James has two children, then he has more than one child,” even though grammatically it applies only to “he has more than one child.” The Modal Fallacy is the best known of the infinitely many errors involving modal concepts. Modal concepts include necessity, possibility, and so forth.

Monte Carlo
See Gambler’s Fallacy.

Name Calling
See Ad Hominem.

Naturalistic
On a broad interpretation of this fallacy, it applies to any attempt to argue from an “is” to an “ought,” that is, from a list of facts to a conclusion about what ought to be done. Example:
Here is another example. Owners of financially successful companies are more successful than poor people in the competition for wealth, power, and social status. Therefore, the poor deserve to be poor. There is considerable disagreement among philosophers regarding what sorts of arguments the term “Naturalistic Fallacy” legitimately applies to.

Neglecting a Common Cause
See Common Cause.

No Middle Ground
See False Dilemma.

No True Scotsman
This error is a kind of Ad Hoc Rescue of one’s generalization in which the reasoner re-characterizes the situation solely in order to escape refutation of the generalization. Example:
Non Causa Pro Causa
This label is Latin for mistaking the “non-cause for the cause.” See False Cause.

Non Sequitur
When a conclusion is supported only by extremely weak reasons or by irrelevant reasons, the argument is fallacious and is said to be a Non Sequitur. However, we usually apply the term only when we cannot think of how to label the argument with a more specific fallacy name. Any deductively invalid inference is a non sequitur if it is also very weak when assessed by inductive standards. Example:
The following is not an example: “If she committed the murder, then there’d be his blood stains on her hands. His blood stains are on her hands. So, she committed the murder.” This deductively invalid argument uses the Fallacy of Affirming the Consequent, but it isn’t a non sequitur because it has significant inductive strength.

Obscurum per Obscurius
Explaining something obscure or mysterious by something that is even more obscure or more mysterious. Example:
One-Sidedness
See the related fallacies of Confirmation Bias, Slanting, and Suppressed Evidence.

Opposition
Being opposed to someone’s reasoning because of who they are, usually because of what group they are associated with. See the Fallacy of Guilt by Association.

Over-Fitting
See Curve Fitting.

Overgeneralization
See Sweeping Generalization.

Oversimplification
You oversimplify when you cover up relevant complexities or make a complicated problem appear to be much simpler than it really is. Example:
Whom to vote for should be decided by considering quite a number of issues in addition to Cuban trade. When an oversimplification results in falsely implying that a minor causal factor is the major one, then the reasoning also uses the False Cause Fallacy.

Past Practice
See Traditional Wisdom.

Pathetic
The Pathetic Fallacy is a mistaken belief due to attributing peculiarly human qualities to inanimate objects (but not to animals). The fallacy is caused by anthropomorphism. Example:
Peer Pressure
See Appeal to the People.

Persuasive Definition
Some people try to win their arguments by getting you to accept their faulty definition. If you buy into their definition, they’ve practically persuaded you already. Same as the Definist Fallacy. Poisoning the Well when presenting a definition would be an example of using a persuasive definition. Example:
Perfectionist
If you remark that a proposal or claim should be rejected solely because it doesn’t solve the problem perfectly, in cases where perfection isn’t really required, then you’ve used the Perfectionist Fallacy. Example:
Petitio Principii
See Begging the Question.

Poisoning the Well
Poisoning the well is a preemptive attack on a person in order to discredit their testimony or argument in advance of their giving it. A person who thereby becomes unreceptive to the testimony reasons fallaciously and has become a victim of the poisoner. This is a kind of Ad Hominem Circumstantial Fallacy. Example:
Post Hoc
Suppose we notice that an event of kind A is followed in time by an event of kind B, and then hastily leap to the conclusion that A caused B. If so, our reasoning contains the Post Hoc Fallacy. Correlations are often good evidence of causal connection, so the fallacy occurs only when the leap to the causal conclusion is done “hastily.” The Latin term for the fallacy is Post Hoc, Ergo Propter Hoc (“After this, therefore because of this”). It is a kind of False Cause Fallacy. Example:
Your background knowledge should tell you that this pattern probably won’t continue in the future; it’s just an accidental correlation that tells you nothing about the cause of your team’s wins.

Prejudicial Language
See Loaded Language.

Proof Surrogate
Substituting a distracting comment for a real proof. Example:
This comment is trying to avoid a serious disagreement about whether one should vote Republican.

Prosecutor’s Fallacy
This is the mistake of over-emphasizing the strength of a piece of evidence while paying insufficient attention to the context. Example:
That is fallacious reasoning, and if you are on the jury you should not be convinced. Here’s why. The prosecutor paid insufficient attention to the pool of potential suspects. Suppose that pool has six million people who could have committed the crime, all other things being equal. If the forensic lab had tested all those people, they’d find that about one in every two thousand of them would have a hair match, and that is 3,000 people. The suspect is just one of those 3,000, so the suspect is very probably innocent unless the prosecutor can provide more evidence. The prosecutor over-emphasized the strength of a piece of evidence by focusing on one suspect while paying insufficient attention to the context, which suggests a pool of many more suspects.

Prosody
See the Fallacy of Accent.

Quantifier Shift
Confusing the phrase “For all x there is some y” with “There is some (one) y such that for all x.” Example:
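The quantifier-shift error can be made concrete with a small “loves” relation. A minimal sketch in Python, using invented data rather than the article’s own example: every person loves somebody, yet nobody is loved by everybody, so the shifted claim does not follow from the original.

```python
people = ["alice", "bob", "carol"]
# Illustrative relation: who loves whom (made-up data for the demonstration)
loves = {
    "alice": {"bob"},
    "bob": {"carol"},
    "carol": {"alice"},
}

# "For all x there is some y": every person loves at least one person.
everybody_loves_someone = all(len(loves[x]) > 0 for x in people)

# "There is some y such that for all x": one person loved by everybody.
someone_loved_by_all = any(all(y in loves[x] for x in people) for y in people)

print(everybody_loves_someone)   # True
print(someone_loved_by_all)      # False -- the shifted claim fails here
```

The first statement is true of this relation while the second is false, which is exactly why the inference from the first form to the second is invalid.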
The error is also made if you argue from “Everybody loves someone” to “There is someone whom everybody loves.”

Question Begging
See Begging the Question.

Questionable Analogy
See False Analogy.

Questionable Cause
See False Cause.

Questionable Premise
If you have sufficient background information to know that a premise is questionable or unlikely to be acceptable, then you use this fallacy if you accept an argument based on that premise. This broad category of fallacies of argumentation includes Appeal to Authority, False Dilemma, Inconsistency, Lying, Stacking the Deck, Straw Man, Suppressed Evidence, and many others.

Quibbling
We quibble when we complain about a minor point and falsely believe that this complaint somehow undermines the main point. To avoid this error, the logical reasoner will not make a mountain out of a molehill nor take people too literally. Logic Chopping is a kind of quibbling. Example:
Quoting out of Context
If you quote someone, but select the quotation so that essential context is not available and therefore the person’s views are distorted, then you’ve quoted “out of context.” Quoting out of context in an argument creates a Straw Man Fallacy. The fallacy is also called “contextomy.” Example:
Jones’s selective quotation is fallacious because it makes Smith appear to advocate this immoral activity when the context makes it clear that he doesn’t.

Rationalization
We rationalize when we inauthentically offer reasons to support our claim. We are rationalizing when we give someone a reason to justify our action even though we know this reason is not really our own reason for our action, usually because the offered reason will sound better to the audience than our actual reason. Example:
Red Herring
A red herring is a smelly fish that would distract even a bloodhound. It is also a digression that leads the reasoner off the track of considering only relevant information. Example:
Bringing up the issue of working conditions and the committee is the red herring diverting us from the main issue of whether Senate Bill 47 unfairly hurts business. An intentional false lead in a criminal investigation is another example of a red herring.

Refutation by Caricature
See the Fallacy of Caricaturization.

Regression
This fallacy occurs when regression to the mean is mistaken for a sign of a causal connection. Also called the Regressive Fallacy. It is a kind of False Cause Fallacy. Example:
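Regression to the mean is easy to demonstrate by simulation. A minimal sketch in Python with invented numbers (everyone here has the same underlying ability, so all score differences are luck): a group selected for extreme first scores drifts back toward the population average on re-measurement, with no cause involved.

```python
import random

random.seed(0)
population_mean = 100

# Each "score" is the same underlying ability plus random luck.
def score():
    return population_mean + random.gauss(0, 15)

first = [score() for _ in range(10_000)]
second = [score() for _ in range(10_000)]

# Select the people who scored in the extreme top tail the first time.
extreme = [i for i, s in enumerate(first) if s > 130]

avg_first = sum(first[i] for i in extreme) / len(extreme)
avg_second = sum(second[i] for i in extreme) / len(extreme)

print(f"Extreme group, first measurement: {avg_first:.1f}")
print(f"Same group, second measurement:   {avg_second:.1f}")
# The drop needs no causal explanation: luck doesn't repeat, so the
# second scores regress toward the population mean of 100.
```

Anyone who explains the drop by pointing to a cause (a slump, a jinx, a new policy) is committing the Regressive Fallacy.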
There is most probably nothing causing people from Dayton to be more like the average resident of the U.S.; rather, extreme results are simply regressing toward the mean.

Reification
Considering a word to be referring to an object, when the meaning of the word can be accounted for more mundanely without assuming the object exists. Also known as the Fallacy of Misplaced Concreteness and the Fallacy of Hypostatization. Example:
He is treating “fate” as if it is naming some object, when it would be less misleading, but also less poetic, to say the introduction suggests that listeners will resign themselves to accepting whatever events happen to them. The Fallacy also occurs when someone says, “I succumbed to nostalgia.” Without committing the fallacy, one can make the same point by saying, “My mental state caused actions that would best be described as my reflecting an unusual desire to return to some past period of my life.” Another common way the Fallacy is used is when someone says that if you understand what “Sherlock Holmes” means, then Sherlock Holmes exists in your understanding. The larger point being made in this last example is that nouns can be meaningful without referring to an object, yet those who commit the Fallacy of Reification do not understand this point.

Reversing Causation
Drawing an improper conclusion about causation due to a causal assumption that reverses cause and effect. A kind of False Cause Fallacy. Example:
The false assumption here is that having a big boat helps cause you to be an officer in MEP, whereas the reverse is true: being an officer causes you to have the high income that enables you to purchase a big boat.

Scapegoating
If you unfairly blame an unpopular person or group of people for a problem, then you are scapegoating. This is a kind of Fallacy of Appeal to Emotions. Example:
Scare Tactic
If you suppose that terrorizing your opponent is giving him a reason for believing that you are correct, then you are using a scare tactic and reasoning fallaciously. Example:
David has given the editor a financial reason not to publish, but he has not given a relevant reason why the story is not newsworthy. David’s tactics are scaring the editor, but it’s the editor who uses the Scare Tactic Fallacy, not David. David has merely used a scare tactic. This fallacy’s name emphasizes the cause of the fallacy rather than the error itself. See also the related Fallacy of Appeal to Emotions.

Scope
The Scope Fallacy is caused by improperly changing or misrepresenting the scope of a phrase. Example:
The first sentence has ambiguous scope. It was probably originally meant in this sense: every concerned citizen who believes (of someone that this person is living in the US and is a terrorist) should make a report to the authorities. But the speaker is clearly taking the sentence in its other, less plausible sense: every concerned citizen who believes (that there is someone or other living in the US who is a terrorist) should make a report to the authorities. Scope fallacies usually are Amphibolies.

Secundum Quid
See Accident and Converse Accident, two versions of the fallacy.

Selective Attention
Improperly focusing attention on certain things and ignoring others. Example:
The pessimist who pays attention to all the bad news and ignores the good news thereby uses the Fallacy of Selective Attention. The remedy for this fallacy is to pay attention to all the relevant evidence. The most common examples of selective attention are the fallacy of Suppressed Evidence and the fallacy of Confirmation Bias. See also the Sharpshooter’s Fallacy.

Self-Fulfilling Prophecy
The fallacy occurs when the act of prophesying will itself produce the effect that is prophesied, but the reasoner doesn’t recognize this and believes the prophecy is a significant insight. Example:
The prediction will fulfill itself, so to speak, and the students’ reasoning contains the fallacy. This fallacy can be dangerous in an atmosphere of potential war between nations when the leader of a nation predicts that their nation will go to war against their enemy. This prediction could very well precipitate an enemy attack because the enemy calculates that if war is inevitable then it is to their military advantage not to get caught by surprise.

Self-Selection
A Biased Generalization in which the bias is due to self-selection for membership in the sample used to make the generalization. Example:
The problem here is that the callers selected themselves for membership in the sample, but clearly the sample is unlikely to be representative of Americans.

Sharpshooter’s
The Sharpshooter’s Fallacy gets its name from someone shooting a rifle at the side of a barn and then going over and drawing a target and bull’s-eye concentrically around the bullet hole. The fallacy is caused by overemphasizing random results or making selective use of coincidence. See the Fallacy of Selective Attention. Example:
Slanting
This error occurs when the issue is not treated fairly because of misrepresenting the evidence by, say, suppressing part of it, or misconstruing some of it, or simply lying. See the following related fallacies: Confirmation Bias, Lying, Misrepresentation, Questionable Premise, Quoting out of Context, Straw Man, Suppressed Evidence.

Slippery Slope
Suppose someone claims that a first step (in a chain of causes and effects, or a chain of reasoning) will probably lead to a second step that in turn will probably lead to another step and so on until a final step ends in trouble. If the likelihood of the trouble occurring is exaggerated, the Slippery Slope Fallacy is present. Example:
The form of a Slippery Slope Fallacy looks like this:
The key claim in the fallacy is that taking the first step will lead to the final, unacceptable step. Arguments of this form may or may not be fallacious depending on the probabilities involved in each step. The analyst asks how likely it is that taking the first step will lead to the final step. For example, if A leads to B with a probability of 80 percent, and B leads to C with a probability of 80 percent, and C leads to D with a probability of 80 percent, is it likely that A will eventually lead to D? No, not at all; there is only about a 50 percent chance. The proper analysis of a slippery slope argument depends on sensitivity to such probabilistic calculations. Regarding terminology, if the chain of reasoning A, B, C, D, …, Z is about causes, then the fallacy is called the Domino Fallacy.

Small Sample
This is the fallacy of using too small a sample. If the sample is too small to provide a representative sample of the population, and if we have the background information to know that there is this problem with sample size, yet we still accept the generalization based upon the sample results, then we use the fallacy. This fallacy is the Fallacy of Hasty Generalization, but it emphasizes statistical sampling techniques. Example:
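The probability arithmetic in the Slippery Slope entry above is easy to verify. A minimal sketch in Python, multiplying the 80 percent step probabilities under the assumption that the steps are independent:

```python
# P(A eventually leads to D) through independent steps A->B, B->C, C->D,
# each succeeding with probability 0.80 (the figures from the Slippery
# Slope discussion above).
step_probabilities = [0.80, 0.80, 0.80]

p_chain = 1.0
for p in step_probabilities:
    p_chain *= p

print(f"Three 80% steps: {p_chain:.3f}")    # 0.512, about a coin flip
print(f"Ten 80% steps:   {0.80 ** 10:.3f}")  # a long chain rarely completes
```

Three individually likely steps already leave the final outcome near even odds, which is why exaggerating the certainty of the chain is fallacious.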
How big a sample do you need to avoid the fallacy? Relying on background knowledge about a population’s lack of diversity can reduce the sample size needed for the generalization. With a completely homogeneous population, a sample of one is large enough to be representative of the population; if we’ve seen one electron, we’ve seen them all. However, eating in one restaurant is not like eating in any restaurant, so far as getting sick is concerned. We cannot place a specific number on sample size below which the fallacy is produced unless we know about the homogeneity of the population and the margin of error and the confidence level.

Smear Tactic
A smear tactic is an unfair characterization either of the opponent or of the opponent’s position or argument. Smearing the opponent causes an Ad Hominem Fallacy. Smearing the opponent’s argument causes a Straw Man Fallacy.

Smokescreen
This fallacy occurs by offering too many details in order either to obscure the point or to cover up counter-evidence. In the latter case it would be an example of the Fallacy of Suppressed Evidence. If you produce a smokescreen by bringing up an irrelevant issue, then you produce a Red Herring Fallacy. Sometimes called Clouding the Issue. Example:
There is no recipe to follow in distinguishing smokescreens from reasonable appeals to caution and care.

Sorites
See Line-Drawing.

Special Pleading
Special pleading is a form of inconsistency in which the reasoner doesn’t apply his or her principles consistently. It is the fallacy of applying a general principle to various situations but not applying it to a special situation that interests the arguer, even though the general principle properly applies to that special situation, too. Example:
In our example, the principle of helping the police is applied to investigations of police officers but not to one’s neighbors.

Specificity
Drawing an overly specific conclusion from the evidence. A kind of jumping to conclusions. Example:
Stacking the Deck
See Suppressed Evidence and Slanting.

Stereotyping
Using stereotypes as if they are accurate generalizations for the whole group is an error in reasoning. Stereotypes are general beliefs we use to categorize people, objects, and events; but these beliefs are overstatements that shouldn’t be taken literally. For example, consider the stereotype “She’s Mexican, so she’s going to be late.” This conveys a mistaken impression of all Mexicans. On the other hand, even though most Mexicans are punctual, a German is more apt to be punctual than a Mexican, and this fact is said to be the “kernel of truth” in the stereotype. The danger in our using stereotypes is that speakers or listeners will not realize that even the best stereotypes are accurate only when taken probabilistically. As a consequence, the use of stereotypes can breed racism, sexism, and other forms of bigotry. Example:
This argument is deductively valid, but it’s unsound because it rests on a false, stereotypical premise. The grain of truth in the stereotype is that the average German doesn’t dance sambas as well as the average South American, but to overgeneralize and presume that ALL Germans are poor samba dancers compared to South Americans is a mistake called “stereotyping.”

Straw Man
Your reasoning contains the Straw Man Fallacy whenever you attribute an easily refuted position to your opponent, one that the opponent wouldn’t endorse, and then proceed to attack the easily refuted position (the straw man) believing you have thereby undermined the opponent’s actual position. If the misrepresentation is on purpose, then the Straw Man Fallacy is caused by lying. Example (a debate before the city council):
The speaker has twisted what his opponent said; the opponent never said, nor even indirectly suggested, that everybody who ever came to America from another country somehow oppressed the Indians.

Style Over Substance
Unfortunately, the style with which an argument is presented is sometimes taken as adding to the substance or strength of the argument. Example:
Subjectivist
The Subjectivist Fallacy occurs when it is mistakenly supposed that a good reason to reject a claim is that truth on the matter is relative to the person or group. Example:
Superstitious Thinking
Reasoning deserves to be called superstitious if it is based on reasons that are well known to be unacceptable, usually due to unreasonable fear of the unknown, trust in magic, or an obviously false idea of what can cause what. A belief produced by superstitious reasoning is called a superstition. The fallacy is an instance of the False Cause Fallacy. Example:
It may be a good idea not to walk under ladders, but a proper reason to believe this is that workers on ladders occasionally drop things and that ladders might have dripping wet paint that could damage your clothes. An improper reason for not walking under ladders is that it is bad luck to do so.

Suppressed Evidence
Intentionally failing to use information suspected of being relevant and significant is committing the fallacy of suppressed evidence. This fallacy usually occurs when the information counts against one’s own conclusion. Perhaps the arguer is not mentioning that experts have recently objected to one of his premises. The fallacy is a kind of Fallacy of Selective Attention. Example:
This appears to be a good argument, but you’d change your assessment of the argument if you learned the speaker has intentionally suppressed the relevant evidence that the company’s Cray Mac 11 was purchased from his brother-in-law at a 30 percent higher price than it could have been purchased elsewhere, and if you learned that a recent unbiased analysis of ten comparable computers placed the Cray Mac 11 near the bottom of the list. If the relevant information is not intentionally suppressed but rather inadvertently overlooked, the fallacy of suppressed evidence also is said to occur, although the fallacy’s name is misleading in this case. The fallacy is also called the Fallacy of Incomplete Evidence and Cherry-Picking the Evidence. See also Slanting.

Sweeping Generalization
See Fallacy of Accident.

Syllogistic
Syllogistic fallacies are kinds of invalid categorical syllogisms. This list contains the Fallacy of Undistributed Middle and the Fallacy of Four Terms, and a few others, though there are a great many such formal fallacies.

Tokenism
If you interpret a merely token gesture as an adequate substitute for the real thing, you’ve been taken in by tokenism. Example:
If you accept this line of reasoning, you have been taken in by tokenism.

Traditional Wisdom
If you say or imply that a practice must be OK today simply because it has been the apparently wise practice in the past, then your reasoning contains the fallacy of traditional wisdom. Procedures that are being practiced and that have a tradition of being practiced might or might not be able to be given a good justification, but merely saying that they have been practiced in the past is not always good enough, in which case the fallacy is present. Also called Argumentum Consensus Gentium when the traditional wisdom is that of nations. Example:
The “of course” is the problem. The traditional wisdom of IBM being the right buy is some reason to buy IBM next time, but it’s not a good enough reason in a climate of changing products, so the “of course” indicates that the Fallacy of Traditional Wisdom has occurred. The fallacy is essentially the same as the fallacies of Appeal to the Common Practice, Gallery, Masses, Mob, Past Practice, People, Peers, and Popularity.

Tu Quoque
The Fallacy of Tu Quoque occurs in our reasoning if we conclude that someone’s argument not to perform some act must be faulty because the arguer himself or herself has performed it. Similarly, when we point out that the arguer doesn’t practice what he or she preaches, and then suppose that there must be an error in the preaching for only this reason, we are reasoning fallaciously and creating a Tu Quoque. This is a kind of Ad Hominem Circumstantial Fallacy. Example:
Discovering that a speaker is a hypocrite is a reason to be suspicious of the speaker’s reasoning, but it is not a sufficient reason to discount it.

Two Wrongs Do Not Make a Right
When you defend your wrong action as being right because someone previously has acted wrongly, you are using the fallacy called “two wrongs do not make a right.” This is a special kind of Ad Hominem Fallacy. Example:
Undistributed Middle
In syllogistic logic, failing to distribute the middle term over at least one of the other terms is the fallacy of undistributed middle. Also called the Fallacy of Maldistributed Middle. Example:
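The invalidity of this form can be checked mechanically with sets. A minimal sketch in Python, using illustrative cat/dog/animal sets that are stand-ins rather than the article’s original example:

```python
# Counterexample to the form: All C are A; All D are A; therefore All C are D.
cats    = {"whiskers", "tom"}
dogs    = {"rex", "fido"}
animals = cats | dogs   # the middle term covers both groups

all_cats_are_animals = cats <= animals    # All C are A: True
all_dogs_are_animals = dogs <= animals    # All D are A: True
all_cats_are_dogs    = cats <= dogs       # All C are D: False

print(all_cats_are_animals, all_dogs_are_animals, all_cats_are_dogs)
# True True False -- true premises, false conclusion, so the form is invalid
```

Both premises come out true while the conclusion comes out false, and a single such counterexample is enough to show the syllogistic form is invalid.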
The middle term (“animals”) is in the predicate of both universal affirmative premises and therefore is undistributed. This formal fallacy has the logical form: All C are A. All D are A. Therefore, all C are D.

Unfalsifiability
This error in explanation occurs when the explanation contains a claim that is not falsifiable, because there is no way to check on the claim. That is, there would be no way to show the claim to be false if it were false. Example:
This could be the correct explanation of his lying, but there’s no way to check on whether it’s correct. You can check whether he’s twitching and moaning, but this won’t be evidence about whether a supernatural force is controlling his body. The claim that he’s possessed can’t be verified if it’s true, and it can’t be falsified if it’s false. So, the claim is too odd to be relied upon for an explanation of his lying. Relying on the claim is an instance of fallacious reasoning.

Unrepresentative Generalization
If the plants on my plate are not representative of all plants, then the following generalization should not be trusted. Example:
The set of plants on my plate is called “the sample” in the technical vocabulary of statistics, and the set of all plants is called “the target population.” If you are going to generalize on a sample, then you want your sample to be representative of the target population, that is, to be like it in the relevant respects. This fallacy is the same as the Fallacy of Unrepresentative Sample.

Unrepresentative Sample
If the means of collecting the sample from the population are likely to produce a sample that is unrepresentative of the population, then a generalization upon the sample data is an inference using the fallacy of unrepresentative sample. A kind of Hasty Generalization. When some of the statistical evidence is expected to be relevant to the results but is hidden or overlooked, the fallacy is called Suppressed Evidence. There are many ways to bias a sample. Knowingly selecting atypical members of the population produces a biased sample. Example:
Most people’s background information is sufficient to tell them that people at this sort of convention are unlikely to be representative, that is, are likely to be atypical members of the rest of society. Having a small sample does not by itself cause the sample to be biased. Small samples are OK if there is a corresponding large margin of error or low confidence level. Large samples can be unrepresentative, too. Example:
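That a bigger sample cannot cure a biased collection method can be shown by simulation. A minimal sketch in Python with invented numbers: support for a policy differs between hobbyists and everyone else, and only hobbyists (say, convention attendees) end up in the sample, so the estimate stays wrong no matter how large the sample grows.

```python
import random

random.seed(1)

# Illustrative population: 30% are hobbyists (90% support the policy),
# 70% are non-hobbyists (40% support).
def person():
    hobbyist = random.random() < 0.30
    supports = random.random() < (0.90 if hobbyist else 0.40)
    return hobbyist, supports

population = [person() for _ in range(100_000)]
true_rate = sum(s for _, s in population) / len(population)  # about 0.55

# Biased collection: only hobbyists respond, however many we poll.
hobbyist_answers = [s for h, s in population if h]
for n in (100, 1000, 10_000):
    est = sum(hobbyist_answers[:n]) / n
    print(f"n={n:>6}: estimated support {est:.2f} (true rate {true_rate:.2f})")
```

Every estimate hovers near the hobbyists’ 90 percent rather than the population’s 55 percent; enlarging a biased sample only makes the wrong answer more precise.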
Getting a larger sample size does not overcome sampling bias.

Untestability
See Unfalsifiability.

Vested Interest
The Vested Interest Fallacy occurs when a person argues that someone’s claim is incorrect or their recommended action is not worthy of being followed because the person is motivated by their interest in gaining something by it, with the implication that were it not for this vested interest the person wouldn’t make the claim or recommend the action. Because this reasoning attacks the reasoner rather than the reasoning itself, it is a kind of Ad Hominem Fallacy. Example:
This is fallacious reasoning by the speaker because whether Samantha is giving good advice about Anderson ought to depend on Anderson’s qualifications, not on whether Samantha will or won’t get a nice job if he’s elected.

Victory by Definition
Same as the fallacy of Persuasive Definition.

Weak Analogy
See False Analogy.

Willed Ignorance
I’ve got my mind made up, so don’t confuse me with the facts. This is usually a case of the Traditional Wisdom Fallacy. Example:
Wishful Thinking
A reasoner who suggests that a claim is true, or false, merely because he or she strongly hopes it is, is using the fallacy of wishful thinking. Wishing something is true is not a relevant reason for claiming that it is actually true. Example:
You-Too
This is an informal name for the Tu Quoque fallacy.

7. References and Further Reading
Research on the fallacies of informal logic is regularly published in the following journals: Argumentation, Argumentation and Advocacy, Informal Logic, Philosophy and Rhetoric, and Teaching Philosophy.

Author Information
Bradley Dowden