Coming from a group of academics in the nineteen-seventies, the contention that people can't think straight was shocking. Both studies (you guessed it) were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. Here's how the Dartmouth study framed it: people typically receive corrective information "within objective news reports pitting two sides of an argument against each other," which is significantly more ambiguous than receiving a correct answer from an omniscient source. The Grinch, A Christmas Carol, Star Wars. Isn't it amazing how when someone is wrong and you tell them the factual, sometimes scientific, truth, they quickly admit they were wrong? Before you can criticize an idea, you have to reference that idea. For example, when you drive down the road, you do not have full access to every aspect of reality, but your perception is accurate enough that you can avoid other cars and conduct the trip safely. We don't always believe things because they are correct. "Because it threatens their worldview or self-concept," they wrote. Consider the richness of human visual perception. You have to slide down it. Elizabeth Kolbert, The New Yorker, February 2017. You can't expect someone to change their mind if you take away their community too. To change social behavior, change individual minds. Red, White & Royal Blue. Curiosity is the driving force. Voters and individual policymakers can have misconceptions. It also primes a person for misinformation. These misperceptions are bad for public policy and social health. Scouts, meanwhile, are like intellectual explorers, slowly trying to map the terrain with others. 
The way to change people's minds is to become friends with them, to integrate them into your tribe, to bring them into your circle. It's complex and deeply contextual, and naturally balances our awareness of the obvious with a sensitivity to nuance. Victory is the operative emotion. If they abandon their beliefs, they run the risk of losing social ties. In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, like the conviction that vaccines are hazardous. A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry. So clearly facts can and do change our minds, and the idea that they do is a huge part of culture today. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats (the human equivalent of the cat around the corner), it's a trait that should have been selected against. Among the many, many issues our forebears didn't worry about were the deterrent effects of capital punishment and the ideal attributes of a firefighter. At the end of the study, the students who favored capital punishment before reading the fake data were now even more in favor of it, and those who were already against the death penalty were even more opposed. 
Last month, The New Yorker published an article called "Why Facts Don't Change Our Minds," in which the author, Elizabeth Kolbert, reviews some research showing that even "reasonable-seeming people are often totally irrational." Silence is death for any idea. How do such behaviors serve us? On the Come Up. Even when confronted with new facts, people are reluctant to change their minds because we don't like feeling wrong, confused or insecure, writes Tali Sharot, an associate professor of cognitive neuroscience and author of The Influential Mind: What the Brain Reveals About Our Power to Change Others. For all the large-scale political solutions which have been proposed to salve ethnic conflict, there are few more effective ways to promote tolerance between suspicious neighbours than to force them to eat supper together. Perhaps it is not difference, but distance that breeds tribalism and hostility. This is the tendency that we have to favor information that confirms our existing beliefs. Inevitably, Kolbert is right: confirmation bias is a big issue. You take to social media and it stokes the rage. "Telling me, 'Your midwife's right.'" She even helps prove this by being biased in her article herself, whether intentionally or not. Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. Article Analysis of "Why Facts Don't Change Our Minds" by Elizabeth Kolbert: Every person in the world has some kind of bias. And here our dependence on other minds reinforces the problem. 
The backfire effect has been observed in various scenarios, such as in the case of people supporting a political candidate. They were then asked to explain their responses, and were given a chance to modify them if they identified mistakes. The students were provided with fake studies for both sides of the argument. A typical flush toilet has a ceramic bowl filled with water. Confirmation bias is the tendency to selectively pay attention to information that supports our beliefs and to ignore information that contradicts them. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. This, I think, is a good method for actually changing someone's mind. What sort of attitude toward risk did they think a successful firefighter would have? All of these are movies, and though fictitious, they would not exist as they do today if humans could not change their beliefs, because they would not feel at all realistic or relatable. Reason, they argue with a compelling mix of real-life and experimental evidence, is not geared to solitary use, to arriving at better beliefs and decisions on our own. The author of The Sixth Extinction (2014), Elizabeth Kolbert, wrote an article for The New Yorker magazine in February 2017 entitled "Why Facts Don't Change Our Minds: New Discoveries About the Human Mind Show the Limitations of Reason" (The New Yorker, February 27, 2017). As proximity increases, so does understanding. 
Once again, they were given the chance to change their responses. So, basically, when hearing information, we pick a side and that, in turn, simply reinforces our view. "But I know where she's coming from, so she is probably not being fully accurate," the Republican might think while half-listening to the Democrat's explanation. There is another reason bad ideas continue to live on, which is that people continue to talk about them. Why is human thinking so flawed, particularly if it's an adaptive behavior that evolved over millennia? As Julia Galef so aptly puts it: people often act like soldiers rather than scouts. It was like "the light had left his eyes," Maranda recalled her saying. I thought about changing the title, but nobody is allowed to copyright titles and enough time has passed now, so I'm sticking with it. Each time you attack a bad idea, you are feeding the very monster you are trying to destroy. In this case, the failure was particularly impressive, since two data points would never have been enough information to generalize from. Instead, many of us will continue to argue something that simply isn't true. Cooperation is difficult to establish and almost as difficult to sustain. "Even after the evidence for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs," the researchers noted. Presented with someone else's argument, we're quite adept at spotting the weaknesses. They began studying the backfire effect, which they define as a phenomenon by which corrections actually increase misperceptions among the group in question, if those corrections contradict their views. 
I believe more evidence for why confirmation bias is impossible to avoid, and is very dangerous, could include groups such as the KKK, neo-Nazis, and anti-vaxxers, though some of these became more prevalent after the article was published. Virtually everyone in the United States, and indeed throughout the developed world, is familiar with toilets. The me that exists today completely contradicts the religious me I grew up as, but I allowed myself to weigh the facts that contradicted what I so dearly believed in. Ideas can only be remembered when they are repeated. Though half the notes were indeed genuine (they'd been obtained from the Los Angeles County coroner's office), the scores were fictitious. As a journalist, I see it pretty much every day. (Don't even get me started on fake news.) But some days, it's just too exhausting to argue the same facts over and over again. Shaw describes the motivated reasoning that happens in these groups: "You're in a position of defending your choices no matter what information is presented," he says, "because if you don't, it." The article often takes an evolutionary standpoint when using in-depth analysis of why the human brain functions as it does. But back to the article: Kolbert is clearly onto something in saying that confirmation bias needs to change, but she neglects the fact that in many cases, facts do change our minds. We want to fit in, to bond with others, and to earn the respect and approval of our peers. Such a mouse, bent on confirming its belief that there are no cats around, would soon be dinner. If your model of reality is wildly different from the actual world, then you struggle to take effective actions each day. They were presented with pairs of suicide notes. The farther off base they were about the geography, the more likely they were to favor military intervention. Hidden Brain is hosted by Shankar Vedantam and produced by Parth Shah, Jennifer Schmidt, Rhaina Cohen, Thomas Lu and Laura Kwerel. 
These groups thrive on confirmation bias and help prove the argument that Kolbert is making: that something needs to change. Many months ago, I was getting ready to publish it, and what happens? In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. How can we avoid losing our minds when trying to talk facts? The more you repeat a bad idea, the more likely people are to believe it. The Harvard psychologist Steven Pinker put it this way: "People are embraced or condemned according to their beliefs, so one function of the mind may be to hold beliefs that bring the belief-holder the greatest number of allies, protectors, or disciples, rather than beliefs that are most likely to be true." In "Why Facts Don't Change Our Minds," an article by Elizabeth Kolbert, the main bias talked about is confirmation bias, also known as myside bias. Her arguments, while strong, could still be better with studies or examples where facts did change people's minds. The students in the high-score group said that they thought they had, in fact, done quite well (significantly better than the average student) even though, as they'd just been told, they had zero grounds for believing this. It's something that's been popping up a lot lately thanks to the divisive 2016 presidential election. The word kind originated from the word kin. When you are kind to someone, it means you are treating them like family. Participants were asked to answer a series of simple reasoning problems. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others' begins. 
Next thing you know, you're firing off inflammatory posts to soon-to-be-former friends. New facts often do not change people's minds. You can't jump down the spectrum. Rational agents would be able to think their way to a solution. They are motivated by wishful thinking. Others discovered that they were hopeless. Friendship does. So while Kolbert does have a very important message for her readers, she does not give it to them in the unbiased way that it should have been presented and that the readers deserved. The students were asked to respond to two studies. Research shows that we are internally rewarded when we can influence others with our ideas and engage in debate. One minute he was fine, and the next, he was autistic. However, the proximity required by a meal (something about handing dishes around, unfurling napkins at the same moment, even asking a stranger to pass the salt) disrupts our ability to cling to the belief that the outsiders who wear unusual clothes and speak in distinctive accents deserve to be sent home or assaulted. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. But hey, I'm writing this article and now I have a law named after me, so that's cool. However, truth and accuracy are not the only things that matter to the human mind.
When confronted with an uncomfortable set of facts, the tendency is often to double down on one's current position rather than publicly admit to being wrong. The essay on why facts don't alter our beliefs is pertinent to the area of research that I am involved in as well. Why do you want to criticize bad ideas in the first place? It suggests that humans will often abandon rational reasoning in favour of their long-held beliefs, because the capacity to reason evolved not to present the logical reasoning behind an idea but to win arguments with others. Or merit-based pay for teachers? What's going on here? It's the reason even facts don't change our minds. "I'm just supposed to let these idiots get away with this?" Let me be clear. Some real-life examples include Elizabeth Warren and Ronald Reagan, both of whom at one point in life had facts change their minds and switched political parties (one from Republican to Democrat, and the other the reverse). At the center of this approach is a question Tiago Forte poses beautifully: "Are you willing to not win in order to keep the conversation going?" The brilliant Japanese writer Haruki Murakami once wrote, "Always remember that to argue, and win, is to break down the reality of the person you are arguing against." They want to save face and avoid looking stupid. "Once formed," the researchers observed dryly, "impressions are remarkably perseverant." Here is how to lower the temperature. It makes a difference. If we (or our friends or the pundits on CNN) spent less time pontificating and more trying to work through the implications of policy proposals, we'd realize how clueless we are and moderate our views. The Atlantic never had to issue a retraction, because they had four independent sources who were there and could confirm Trump in fact said this. A very good read. 
For example, our opinions on military spending may be fixed (despite the presentation of new facts) until the day our son or daughter decides to enlist. If the goal is to actually change minds, then I don't believe criticizing the other side is the best approach. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our hypersociability. Mercier and Sperber prefer the term "myside bias." Humans, they point out, aren't randomly credulous. Some students discovered that they had a genius for the task. Humans' disregard of facts in favour of information that confirms their original beliefs shows the flaws in human reasoning. The desire that humans have to always be right is supported by confirmation bias. When most people think about the human capacity for reason, they imagine that facts enter the brain and valid conclusions come out. If reason is designed to generate sound judgments, then it's hard to conceive of a more serious design flaw than confirmation bias. But what if the human capacity for reason didn't evolve to help us solve problems; what if its purpose is to help people survive being near each other? From my experience: (1) keep emotions out of the exchange; (2) discuss, don't attack (no ad hominem and no ad Hitlerum); (3) listen carefully and try to articulate the other position accurately; (4) show . For example, we often reason emotionally, and it leads us into particular sets of thoughts that may impact our behaviour; only later do we discover that there was unresolved anger lying beneath the emotional reasoning. This error leads the individual to stop gathering information when the evidence gathered so far confirms the views (prejudices) one would like to be true. 
This does not sound ideal, so how did we come to be this way? Once again, midway through the study, the students were informed that they'd been misled, and that the information they'd received was entirely fictitious. Sloman and Fernbach see in this result a little candle for a dark world. Don't waste time explaining why bad ideas are bad. And the best place to ponder a threatening idea is in a non-threatening environment. If the source of the information has well-known beliefs (say a Democrat is presenting an argument to a Republican), the person receiving accurate information may still look at it as skewed. In a well-run laboratory, there's no room for myside bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them. This, they write, "may be the only form of thinking that will shatter the illusion of explanatory depth and change people's attitudes." It's no wonder, then, that today reason often seems to fail us. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. By Elizabeth Kolbert, February 19, 2017. It is hard to change one's mind after it has been set to believe a certain way. A recent example is the anti-vax leader saying drinking your urine can cure Covid; meanwhile, almost any scientist or major news program would tell you otherwise. 
This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group. They begin their book, The Knowledge Illusion: Why We Never Think Alone (Riverhead), with a look at toilets. It disseminates their BS. What happened? Why don't facts change our minds? Still, an essential puzzle remains: How did we come to be this way? Any idea that is sufficiently different from your current worldview will feel threatening. She has written for The New Yorker since 1999. Among the other half, suddenly people became a lot more critical. And why would someone continue to believe a false or inaccurate idea anyway? At any given moment, a field may be dominated by squabbles, but, in the end, the methodology prevails. Cognitive psychology and neuroscience studies have found that the exact opposite is often true when it comes to politics: people form opinions based on emotions, such as fear, contempt and anger, rather than relying on facts. You have to give them somewhere to go. In a new book, The Enigma of Reason (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. Not usually, anyway. The rational argument is dead, so what do we do? Technically, your perception of the world is a hallucination. People have a tendency to base their choices on their feelings rather than the information presented to them. "The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man." Nearly sixty per cent now rejected the responses that they'd earlier been satisfied with. 
Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. George had a small son and played golf. In the weeks before John Wayne Gacy's scheduled execution, he was far from reconciled to his fate. The backfire effect is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance. I allowed myself to realize that there was so much more to the world than being satisfied with what one has known all one's life, believing everything that confirms it and disregarding anything that goes even slightly against it, thereby contradicting Kolbert's idea that confirmation bias is unavoidable and one of our most primitive instincts. What are the odds of that? There must be some way, they maintain, to convince people that vaccines are good for kids, and handguns are dangerous. Government and private policies are often based on misperceptions, cognitive distortions, and sometimes flat-out wrong beliefs. Humans are irrational creatures. Plus, you can tell your family about Clear's Law of Recurrence over dinner and everyone will think you're brilliant. The New Yorker's Elizabeth Kolbert reviews The Enigma of Reason by cognitive scientists Hugo Mercier and Dan Sperber, former Member (1981-82) in the School of Social Science: "If reason is designed to generate sound judgments, then it's hard to conceive of a more serious design flaw than confirmation bias." Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.