Thomas Gilovich has been researching what he sums up as "human judgment" for nearly four decades. He is currently the chair of the psychology department at Cornell University and has written four books, one of which I chose to read for this class: "How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life". In a nutshell, "How We Know What Isn't So" examines the cognitive errors and biases that underlie many common human beliefs and understandings. The book does so by analyzing Gilovich's own extensive research in this field and by referencing related research done by his peers.
Gilovich breaks down the causes of the questionable beliefs we have developed in the modern era into two major categories - cognitive determinants and motivational/social determinants. The cognitive determinants relate to the "Ways of Thinking" lecture from our class and revolve around things like misinterpreting data, finding patterns where none exist, and making poor decisions based on ambiguous information. We saw several examples of this in our lectures covering psychic crime detectives and cold readings. The motivational and social determinants include generally self-serving beliefs - "believing what we want to believe" - and, moreover, assuming that others share our beliefs: the "false consensus effect".
What I found to be the most interesting part of the book, however, was when Gilovich pulled all of his research and reasoning together and explained some of the "questionable and erroneous beliefs" that are most commonplace today, and how they may be negatively affecting people and, perhaps, society as a whole. Gilovich touches on three major topics: belief in ineffective "alternative" health practices, belief in the effectiveness of questionable interpersonal strategies, and belief in ESP. Most relevant to me was his analysis of belief in ineffective health practices, and of why the field of medicine and health is where erroneous beliefs do the most harm to the human race. From "psychic surgeons" to faith healers to baseless healing rituals and unproven medications, humans continue to defy even the most basic logic by forsaking proven, scientifically researched medical treatments in favor of the perceived positive outcomes and anecdotal evidence of quasi-medical procedures and treatments.
Our very first lecture touched on a simple but major example of a pseudoscientific "medical device": the "Q-RAY bracelet", which supposedly "balances negative ions and positive ions" to improve wellness and performance. The entire sales pitch for this bracelet rests on vague descriptions, no clinical evidence, and the subjective, anecdotal responses of actual users, who "cannot believe" how great they feel and call it "magic". Later in our lecture series, we learned about many of the tactics used by psychics and mediums who perform "cold readings", and as it turns out, nearly everything in the Q-RAY sales pitch fits the characteristics of a cold reading - vague claims, testimonial- and opinion-based "statistics", and "shotgunning", or spreading the bracelet's possible positive effects so widely that no one can really determine whether it is helpful or not. The bracelet supposedly "optimizes bio-energy" (though the makers never explain what "bio-energy" is), "promotes a more active, better lifestyle" (with no explanation of what "better" even refers to), and "provides an overall sense of well-being". Could these "benefits" possibly be any more vague when one uses scientific thinking rather than fast thinking? Not surprisingly, clinical studies found that the bracelet performed no better than a placebo... in other words, it had no real effect of its own.
Though, as humans, we've come a long way from what would today be considered archaic, almost ridiculous medical practices like trephination (cutting holes in the skull) and lobotomy (severing parts of the brain) to treat mental disorders, the anti-vaccination movement - based almost solely on fear-mongering and erroneous conclusions - shows that we really haven't learned much when it comes to connecting scientific evidence with perceived outcomes. Gilovich's book was published nearly 30 years ago, yet the "post hoc" fallacy (from the Latin "post hoc ergo propter hoc", "after this, therefore because of this") is as relevant as ever: an outcome follows a perceived treatment, so it must be because of that treatment. As Gilovich points out, almost any outcome can appear to support a treatment's effectiveness... or ineffectiveness, for that matter. Ironically, we again tie back to the "believing what we want to believe" hypothesis when it comes to why we, as humans, continue to make these cognitive errors in the first place. Three decades later, this fallacy has reared its ugly head again, and the very phenomenon that Gilovich researched and wrote about has brought about a major regression in modern medicine.
The causes of autism are not yet fully understood, though researchers are making great strides in understanding both its causes and its effects. Unfortunately, because medical science has not yet established a cause with certainty, many have speculated that vaccines cause autism and have made the irrational decision not to vaccinate their children. Those supporting the anti-vaccination movement have little to no science on which to base their decision; the only studies purporting to find a link between vaccinations and autism have been shown to be fraudulent. Nevertheless, "anti-vaxxers", as they are known, have remained steadfast in their convictions, claiming that vaccines can and have caused autism despite no evidence supporting the claim: post hoc ergo propter hoc. What was thought to be an innocent, personal decision is now putting the lives of children at large in jeopardy. Diseases like measles - declared eliminated in the United States in 2000 - have returned, infecting thousands of children in this country alone, and measles still kills many children worldwide. As one can clearly see, erroneous human reasoning can literally cost lives.
As Gilovich points out, "what is most important, then, is not dispelling particular erroneous beliefs (although there is surely some merit in that), but creating an understanding of how we form erroneous beliefs". If we understand how and why we form erroneous beliefs, and if we think more slowly - more scientifically - rather than relying solely on fast thinking and cognitive heuristics, we will undoubtedly be better off as individuals and, subsequently, better off as human beings. Think critically, employ all of the elements of thought, and have the courage and confidence to be a fair-minded thinker.
I've compiled a few relevant and interesting videos that supplement my report and complement Gilovich's book. I hope you find them as interesting as I did!
Tom Gilovich on why it's worth studying human judgment and decision making:
Ethan Lindenberger, a teenager who went unvaccinated as a child and later chose to get vaccinated, explains the dangers of misinformation and how erroneous thinking can put lives at risk:
Tom Gilovich lecture covering many of the topics he researched and wrote about in his book:
References
Cherry, K. (2019, June 17). How Heuristics Help You Make Quick Decisions or Biases. Retrieved from https://www.verywellmind.com/what-is-a-heuristic-2795235
Kahneman, D. (2012, June 15). Of 2 Minds: How Fast and Slow Thinking Shape Perception and Choice. Retrieved from https://www.scientificamerican.com/article/kahneman-excerpt-thinking-fast-and-slow/
Post hoc ergo propter hoc. (2019, July 26). Retrieved from https://en.wikipedia.org/wiki/Post_hoc_ergo_propter_hoc
The anti-vaccination movement. (2018, September 13). Retrieved from https://measlesrubellainitiative.org/anti-vaccination-movement/
Tom Gilovich. (n.d.). Retrieved from https://gilovich.socialpsychology.org/