Is there such a thing as the "ill-informed opponent" fallacy?
What name is given to the fallacy of assuming your opponent holds their position only because they don't have all the facts?
I don’t mean the fallacy of not having all the facts. I mean a fallacy you have about your opponent. If I had to think of a name for it, I might call it “the fallacy of the ill-informed opponent”. But I’m hoping there’s a more official name for it.
Example: an atheist thinks that a theist, if only they would read about evolution, would no longer believe in God. On the other side, a theist believes that an atheist, if only they would read the Bible, would start believing in God.
Both thoughts are fallacious, because it’s possible to read about evolution and still believe in God, and it’s possible to read the Bible and still be an atheist.
I think this is a very important fallacy to identify, because there are countless atheist/theist debates where one side constantly bombards the other with information, in the mistaken belief that their position is due to a lack of information.
asked on Monday, May 08, 2023 11:28:28 AM by Michael
I think this is just a false belief people have. We assume that our ideas are correct and, if only people had the 'correct information', they'd adopt our obviously correct ideas. That isn't the case though, especially when dealing with moral judgments - see the 'is-ought' distinction.
If someone put forward a view, and someone else rejected it saying "you're wrong, you don't have all the facts" - with no elaboration - it would be an unsupported claim.
That said, it is fine to suggest - after pointing out the flaws in someone's argument - that they should read up on something if they show clear gaps in their knowledge. It might not change their mind, but at least they'd be able to argue competently for their position. The problem is when it is used as carte blanche to disregard all opposing thought as uninformed.
answered on Monday, May 08, 2023 01:44:23 PM by TrappedPrior (RotE)
Mchasewalker
It's basically ad hominem guilt by association: it assumes that because someone is labeled as, or identified with, a general group, they must be less than educated, ill-informed, or ignorant of a subject. Theists are ignorant of science; atheists are ignorant of the Bible. Both are false assumptions.
answered on Monday, May 08, 2023 12:12:04 PM by Mchasewalker
Dr. Richard
A lot here depends upon the goal of the discussion. If it is to change the other person’s mind, then the “battle of facts” will usually fail.
In my experience, people never change their beliefs by being punched in the head with facts. Most people believe what they believe because they want to believe what they currently believe. Facts are not important. Michael Shermer made this addition to Cognitive Dissonance Theory in his book, “Why People Believe Weird Things.” So, if your goal is to change another person’s belief, I think you must use a different approach.
Peter Boghossian suggested a strategy to change a person’s belief. To be successful, he said the person whose belief you want to change must reconsider how he arrived at the belief under discussion. If your goal is to change his mind, as distinct from pontificating (which is better done in front of a mirror), then you need to get him thinking about how he arrived at the belief.
Boghossian’s book, “How to Have Impossible Conversations,” is an excellent manual on how to do this. He suggests asking questions. For example:
“I’ve come to a different conclusion and I’m having a hard time understanding where you’re coming from. I assume you must know some things about this subject that I don’t. Could you tell me more about where you’re coming from on that so I can understand better?”
The more ignorance you admit, the more readily your partner in the conversation will step in with an explanation to help you understand. And the more they attempt to explain, the more likely they are to realize the limits of their knowledge and epistemological errors made along the way.
If you ask someone a direct question and he obfuscates or refuses to answer, ask him to ask you the same question, and you answer it. Other Boghossian suggestions:
“That’s an interesting perspective. What leads you to conclude that?”
Say, “I’m skeptical,” not “I disagree.”
“On a scale from 1 to 10, with 1 being no confidence and 10 being absolute confidence, how confident are you that the belief is true?”
“I’m not sure how I’d get to where you are, at an X. I want to see what I’m missing. Would you help walk me through it?”
“I am not trying to convince you of anything. I’m curious and would like to ask some questions to learn more.”
The idea is to reverse the usual dynamic: instead of defending a belief because you think you should hold it, claim to hold your belief and wish you could stop believing it, if only your discussion partner could show you the error of your ways. The point is, you want to get them thinking about the process that led to the conclusion and not about the conclusion itself.
All of this deals with the Fallacy of Subjectivism. Subjectivism is not only a way of adopting conclusions on subjective grounds, but also — and probably more often — a way of evading the grounds. Some people have perfected the skill of ignoring what they don’t want to see, and most of us indulge in this habit from time to time. Heuristics are hell. If I put the statement into a proposition, it takes the form: “I don’t want to accept p. Therefore p isn’t true.” That’s the fallacy of subjectivism.
answered on Tuesday, May 09, 2023 11:19:09 AM by Dr. Richard
Comments
Petra Liverani writes:
It's interesting that you mention the Fallacy of Subjectivism, of which I was unaware, because I see the tendency of my identical twin to fall into that fallacy, and I wonder why she does and I don't. For example, she'll say, "I can't believe something if it doesn't make sense to me," to which my response would be, "I can understand that, but unless you have agreed-upon facts for others to work with, why should anyone go along with your lack of belief just because something doesn't make sense to you?" I have no interest in my own opinion on a fact-based matter; if I want to convince others of what I think is true, I want to ensure I can point to relevant facts - at least what I believe to be facts.
It's so frustrating, because facts really do prove things, although certainly facts can sometimes lead one astray too - there is the question of determining the most relevant facts. The nature of reality is that all the facts, one way or another, will ultimately favour the correct hypothesis.
posted on Wednesday, May 10, 2023 08:12:40 PM
Dr. Richard writes: [To Petra Liverani]
Perhaps you need to change your approach.
In my experience, people never change their beliefs by being punched in the head with facts. Most people believe what they believe because they want to believe what they currently believe. Facts are not important. Michael Shermer made this addition to Cognitive Dissonance Theory in his book, “Why People Believe Weird Things.” So, if your goal is to change another person’s belief, I think you must use a different approach.
Peter Boghossian suggested a strategy to change a person’s belief. To be successful, he said the person whose belief you want to change must reconsider how he arrived at the belief under discussion. If your goal is to change his mind, as distinct from pontificating (which is better done in front of a mirror), then you need to get him thinking about how he arrived at the belief.
Boghossian’s book, “How to Have Impossible Conversations,” is an excellent manual on how to do this.
His book is a good investment in conversations, beyond your identical twin.
posted on Thursday, May 11, 2023 11:40:32 AM
Petra Liverani writes:
[To Dr. Richard]
Could not agree more that most people believe what they believe because they want to believe whatever it is ... but my question is why? Why do people want to believe one thing rather than another? Why not always simply want to believe whatever is true? There are a number of things I believe to be true which I find very unpalatable and would rather weren't true, but if that's the reality, so be it. If something is unpalatable, we can understand people's reluctance to believe it, and that is often what I believe to be people's obstacle in accepting something as true: its unpalatability. But sometimes it doesn't seem as if the truth is unpalatable, and yet people still don't want to believe it.
I'm perfectly happy to be punched in the head with facts, because I recognise that it is facts that tell the truth, but I realise that for other people it's not the same. Quite often they have no desire to discuss the matter seriously, so even if you don't try to bash them over the head with facts, they're still very difficult to discuss the subject with.
EDIT: I didn't realise it, but I'm already familiar with Peter Boghossian, from watching a YouTube video where he conducts an exercise engaging students on campus on the topic of gender and gets a very hostile reaction from some. Very interesting exercise.
EDIT 2: Oh dear, his technique didn't work very well on this occasion. Oh my goodness. The thing is, I think both sides make interesting points, but the emotionality on one side destroys the discussion. https://youtu.be/VUOP_lY7DT8
posted on Sunday, May 14, 2023 06:42:54 AM
Dr. Richard writes:
[To Petra Liverani]
The question of "why" enters the field of psychology. The list is long. If anyone can figure out a solution to that question, he will become a psychological hero. Unfortunately, as you note, and so does Boghossian, some people's beliefs are not amenable to change. I sure had that reinforced with my recent book www.deathbypitbull.com. As you noted about a Boghossian video, those people often become belligerent and attack with ad hominem comments.
posted on Sunday, May 14, 2023 11:09:56 AM
Petra Liverani writes: [To Dr. Richard]
Often you can see how someone came to their belief, especially when you held the belief yourself at one point but came to see that the belief was false.
I mentioned unpalatability, but I forgot there's another factor that seems incredibly powerful: anchoring in one's belief. There are certain things I believed quite strongly, but when I could see so clearly how I was led to that belief, and that not only clear evidence but good reason required a change of mind, I changed it, no problem. Yet I find other people struggle with this change of belief in a way I find truly unfathomable; they have such a strong attachment to their belief even when it is exposed as false from quite a number of angles ... and while I can understand this phenomenon when the required belief is unpalatable, it is difficult to understand when it is not unpalatable.
A truth’s initial commotion is directly proportional to how deeply the lie was believed. It wasn’t the world being round that agitated people, but that the world wasn’t flat.
When a well-packaged web of lies has been sold gradually to the masses over generations, the truth will seem utterly preposterous and its speaker a raving lunatic.
The ideal tyranny is that which is ignorantly self-administered by its victims. The most perfect slaves are, therefore, those which blissfully and unawaredly enslave themselves.
Dresden James
posted on Monday, May 15, 2023 01:31:54 AM
Dr. Richard writes: [To Petra Liverani]
What you are describing is, I think, Festinger’s Question.
Festinger’s Question comes from his famous 1956 book, “When Prophecy Fails.” Suppose (1) an individual believes something with his whole heart and soul; (2) he has taken irrevocable actions because of it; and (3) he is then presented with evidence, unequivocal and undeniable evidence, evidence he himself fully accepts as true, that his first belief is wrong. Festinger’s question is: What will happen?
The answer, well documented by six decades of subsequent research, shows that people respond to dissonant beliefs using three key strategies.
First, they can ignore the dissonant belief, in essence saying, “I don’t want to believe it. Therefore it isn’t true.” This, as the psychologists would say, is a form of repression. This is pure subjectivism, holding the primacy of consciousness to be true instead of the primacy of existence.
Second, they can reduce the importance of the conflicting belief. This is evident by phrases such as “I’ll think about it tomorrow,” meaning I have more important things to consider. This, as the psychologists would say, is a form of evasion.
Third, they can make the newer conflicting belief consistent with the older existing belief by twisting the evidence, then claiming the beliefs are not really in conflict. This, as the psychologists would say, is rationalization. Michael Shermer calls it “motivated reasoning.”
What Festinger did not expect was that people did not question their beliefs. Quite the opposite. Researchers were astonished to find that people became stronger in their irrational beliefs after having been presented with unequivocal and undeniable evidence that the subject himself entirely accepted as true. For example, if they believed in a flat earth and were then presented with undeniable evidence of a spherical planet, they became firmer in their flat-earth belief. Today, psychologists call this the “Backfire Effect.”
The most difficult beliefs for people to examine are those beliefs that have been (1) held for a long time, (2) adopted before the age of reason, and (3) most often repeated.
This explains why it is impossible to have a conversation on the two subjects one should never discuss socially: religious and political beliefs, both of which are drilled into children from the time they are born.
One may say, “every belief should be open to reexamination upon the presentation of credible evidence,” but attempting to live up to that standard is difficult and takes a concentrated effort.
posted on Monday, May 15, 2023 12:28:36 PM
Petra Liverani writes:
[To Dr. Richard]
"Suppose (1) an individual believes something with his whole heart and soul; (2) he has taken irrevocable actions because of it; and (3) he is then presented with evidence, unequivocal and undeniable evidence, evidence he himself fully accepts as true, that his first belief is wrong. Festinger’s question is: What will happen?"
But sometimes no actions are taken and effectively nothing will change except the belief itself and the change in communication of that belief. And we can sometimes see that communication of the new belief should even be easier to achieve than communication of the old belief - but still the person resists the new belief seemingly purely from resistance caused by anchoring in their belief - that's the way it seems to me anyway.
So why don't we do a test, Dr Richard? Is there any mainstream dogma you've learnt is a lie, where now holding the truth about it would have you labelled a crackpot?
posted on Tuesday, May 16, 2023 05:21:02 AM
Dr. Richard writes: [To Petra Liverani]
I don't understand the paragraph beginning with: "But sometimes no actions are taken and effectively nothing will change except the belief itself and the change in communication of that belief"
posted on Tuesday, May 16, 2023 10:41:36 AM
Petra Liverani writes: [To Dr. Richard]
What I mean is:
There is no need for (2) (he has taken irrevocable actions because of [his beliefs]) for a person to resist changing their belief. For example, whether the earth is round or flat, people don't behave differently; it's just a belief that has no bearing on how we conduct our lives, and yet when it emerged that the earth was round and not flat, people seemingly resisted the idea. As my quote says, it wasn't the earth being round that was the problem; it was that it wasn't flat. If no one had had any notion about the shape of the earth prior to its being discovered to be round, no doubt its shape would have been met with much less, or even no, resistance. It's more understandable, of course, that we might resist changing a belief we have invested in with actions - the sunk-cost fallacy might be more inclined to influence our thinking - but I have noticed that even where it's just a matter of belief, where actions play no part, people are still very resistant to changing that belief.
In terms of communication, I can see that where people wish to communicate what they believe to be true, but what they believe is a difficult belief for others to take on, they are still very resistant to modifying their belief to what actually is true (when their belief can be shown to be not quite right). They seem to give absolutely no thought to the fact that a modified belief might be easier for others to accept, and that it is therefore worth investigating the challenge to their belief to see if the evidence supports changing it.
posted on Wednesday, May 17, 2023 12:20:33 AM
Dr. Richard writes: [To Petra Liverani]
The responses to your statements here are in Festinger's book. For example, "A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point."