Belief perseverance



Belief perseverance (also known as conceptual conservatism[1]) is maintaining a belief despite new information that firmly contradicts it.[2] Such beliefs may even be strengthened when others attempt to present evidence debunking them, a phenomenon known as the backfire effect (compare boomerang effect).[3] For example, in a 2014 article in The Atlantic, journalist Cari Romm describes a study of vaccination hesitancy in which subjects expressed concerns about the side effects of flu shots. After being told that the vaccination was completely safe, they became even less willing to accept it: the reassurance led them to distrust the vaccine even more, reinforcing the belief they already held.[4][5]

There are three kinds of backfire effect: the Familiarity Backfire Effect (from making myths more familiar), the Overkill Backfire Effect (from providing too many arguments), and the Worldview Backfire Effect (from providing evidence that threatens someone's worldview).[6] According to Cook and Lewandowsky (2011), a number of techniques can be used to debunk misinformation: emphasizing the core facts rather than the myth, giving an explicit warning that the upcoming information is false, and providing an alternative explanation to fill the gap left by debunking the misinformation.[7] However, more recent studies have provided evidence that backfire effects are not as likely as once thought.[8]

Since rationality involves conceptual flexibility,[9][10] belief perseverance is consistent with the view that human beings act at times in an irrational manner. Philosopher F.C.S. Schiller holds that belief perseverance "deserves to rank among the fundamental 'laws' of nature".[11]

Evidence from experimental psychology

According to Lee Ross and Craig A. Anderson, "beliefs are remarkably resilient in the face of empirical challenges that seem logically devastating".[12]

The first study of belief perseverance was carried out by Festinger, Riecken, and Schachter.[13] These psychologists spent time with members of a doomsday cult who believed the world would end on December 21, 1954.[13] Despite the failure of the forecast, most believers continued to adhere to their faith.[13][14][15] In When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World (1956) and A Theory of Cognitive Dissonance (1957), Festinger proposed that human beings strive for internal psychological consistency in order to function mentally in the real world.[13] A person who experiences internal inconsistency tends to become psychologically uncomfortable and is motivated to reduce the cognitive dissonance.[13][14][16] Such a person tends to make changes that justify the stressful behavior, either by adding new parts to the cognition causing the dissonance (rationalization) or by avoiding circumstances and contradictory information likely to increase its magnitude (confirmation bias).[13][14][16]

When asked to reappraise probability estimates in light of new information, subjects displayed a marked tendency to give insufficient weight to the new evidence. They refused to treat the failed prediction as bearing on the overall validity of their faith, and in some cases reported an even stronger faith in their religion than before.[17]
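
The studies themselves did not model this formally, but the tendency can be illustrated against a Bayesian benchmark. Below is a minimal Python sketch, with all numbers invented for illustration: a fully Bayesian observer multiplies the prior odds by the full likelihood ratio of the evidence, while a "conservative" observer raises that ratio to a power w < 1, one common way of formalizing insufficient weight given to new evidence.

    # Hypothetical illustration only; none of these numbers come from the study.
    def update(prior, likelihood_ratio, w=1.0):
        """Update P(belief) given evidence.

        likelihood_ratio = P(evidence | belief) / P(evidence | not belief).
        w = 1 gives the full Bayesian update; w < 1 under-weights the evidence.
        """
        prior_odds = prior / (1 - prior)
        posterior_odds = prior_odds * likelihood_ratio ** w
        return posterior_odds / (1 + posterior_odds)

    prior = 0.90  # strong initial belief
    lr = 0.01     # evidence (a failed prophecy) that strongly disfavors the belief

    print(round(update(prior, lr), 3))         # 0.083: the belief should collapse
    print(round(update(prior, lr, w=0.1), 3))  # 0.85: near-total perseverance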

In a separate study, mathematically capable teenagers and adults were given seven arithmetic problems and first asked to estimate the answers by hand. They were then asked for exact answers using a calculator that had been rigged to produce increasingly erroneous results (e.g., displaying 452.4 for 252 × 1.2, which is actually 302.4). About half of the participants went through all seven tasks while commenting on their estimating abilities or tactics, yet never let go of the belief that calculators are infallible; they simply refused to admit that their previous assumptions about calculators could have been incorrect.[18]
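
The arithmetic is easy to verify. The following Python sketch is a hypothetical reconstruction of the setup: the study reports only that the displayed answers grew increasingly erroneous, so the specific problems (beyond the first) and the growing offsets below are invented for illustration.

    # Hypothetical reconstruction; the error schedule is assumed, not from the study.
    problems = [(252, 1.2), (318, 1.4), (417, 1.6)]  # only the first pair is from the text
    offsets = [150, 300, 600]                        # invented, increasingly large errors

    for (a, b), offset in zip(problems, offsets):
        true_answer = a * b               # e.g. 252 * 1.2 = 302.4
        displayed = true_answer + offset  # e.g. 452.4 on the rigged display
        print(f"{a} x {b}: true {true_answer:.1f}, calculator shows {displayed:.1f}")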

Lee Ross and Craig A. Anderson led some subjects to the false belief that there existed a positive correlation between a firefighter's stated preference for taking risks and their occupational performance. Other subjects were told that the correlation was negative. The participants were then thoroughly debriefed and informed that there was no link between risk taking and performance. Post-debriefing interviews nevertheless pointed to significant levels of belief perseverance.[19]

In another study, subjects spent about four hours following the instructions of a hands-on instructional manual. At a certain point, the manual introduced a formula which led them to believe that spheres are 50 percent larger than they actually are. Subjects were then given an actual sphere and asked to determine its volume, first by using the formula and then by filling the sphere with water, transferring the water to a box, and directly measuring the volume of the water in the box. In the last experiment in this series, all 19 subjects held a Ph.D. in a natural science, were employed as researchers or professors at two major universities, and carried out the comparison between the two volume measurements a second time with a larger sphere. All but one of these scientists clung to the spurious formula despite their empirical observations.[20]
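
The size of the discrepancy the subjects chose to ignore is easy to compute. A minimal Python sketch, assuming the spurious formula simply inflated the true volume by 50 percent (the actual formula used in the study is not reproduced here, and the radius is invented):

    import math

    def true_volume(r):
        """Actual volume of a sphere of radius r: (4/3) * pi * r**3."""
        return (4 / 3) * math.pi * r ** 3

    def spurious_volume(r):
        """Assumed rigged formula: overstates the volume by 50 percent."""
        return 1.5 * true_volume(r)

    r = 10.0  # invented radius, in centimeters
    print(round(true_volume(r), 1))      # 4188.8 cm^3, what the water measurement shows
    print(round(spurious_volume(r), 1))  # 6283.2 cm^3, what the formula predicts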


In cultural innovations

Physicist Max Planck wrote that "the new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it".[21] For example, the heliocentric theory of the ancient Greek astronomer Aristarchus of Samos had to be rediscovered about 1,800 years later, and even then it underwent a major struggle before astronomers took its veracity for granted.[22]

Belief perseverance also operates within the individual, where giving up a conviction can be slow and reluctant. "When the decisive facts did at length obtrude themselves upon my notice," wrote the chemist Joseph Priestley, "it was very slowly, and with great hesitation, that I yielded to the evidence of my senses."[23]

In education

Students often "cling to ideas that form part of their world view even when confronted by information that does not coincide with this view."[24] For example, students may spend months studying the solar system and do well on related tests, yet still believe that the phases of the moon are produced by Earth's shadow. What they learned was unable to displace the beliefs they had held before acquiring that knowledge.[25]

Causes

The causes of belief perseverance remain unclear. Experiments in the 2010s suggest that neurochemical processes in the brain underlie the strong attentional bias of reward learning. Similar processes could underlie belief perseverance.[26]

Peter Marris suggests that the process of abandoning a conviction is similar to the working out of grief. "The impulse to defend the predictability of life is a fundamental and universal principle of human psychology." Human beings possess "a deep-rooted and insistent need for continuity".[27]

Philosopher of science Thomas Kuhn points to the resemblance between conceptual change and Gestalt perceptual shifts (e.g., the difficulty encountered in seeing the hag as a young lady). Hence, the difficulty of switching from one conviction to another could be traced to the difficulty of rearranging one's perceptual or cognitive field.[28]

See also



  • Anussava - Do not go upon what has been acquired by repeated hearing.

References



  1. Cite error: Invalid <ref> tag; no text was provided for refs named :0
  2. Template:Cite book
  3. Silverman, Craig (June 17, 2011). "The Backfire Effect: More on the press's inability to debunk bad information". Columbia Journalism Review. New York City: Columbia University.
  4. Romm, Cari (December 12, 2014). "Vaccine Myth-Busting Can Backfire". The Atlantic.
  5. Nyhan, Brendan; Reifler, Jason (January 9, 2015). "Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information". Vaccine.
  6. Template:Cite journal
  7. Cook, J.; Lewandowsky, S. (November 5, 2011). The Debunking Handbook. St. Lucia, Australia: University of Queensland.
  8. Template:Citation
  9. Template:Cite book
  10. Template:Cite book
  11. Template:Cite book
  12. Template:Cite book
  13. Template:Cite journal
  14. Template:Cite journal
  15. Template:Cite book
  16. Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford, California: Stanford University Press.
  17. Template:Cite book
  18. Template:Cite journal
  19. Template:Cite journal
  20. Template:Cite journal
  21. Template:Cite book
  22. Template:Cite book
  23. Template:Cite book
  24. Template:Cite journal
  25. Template:Cite journal
  26. Template:Cite journal
  27. Template:Cite book
  28. Template:Cite book