21 Apr The Psychology of Vaccine Denial and The New Anti-Intellectualism
I don’t know if this could really be called “new”, but it’s a form of anti-intellectualism that usually goes unnoticed. I find it particularly frustrating because I so often see it among people who claim to respect knowledge, education, and expertise. It is an ironic lack of respect for that same knowledge, education, and expertise.
The Psychology of Vaccine Denial
I’m sure you’re wondering what I’m talking about here, so I will get to the point. Rebecca Watson wrote a short piece, published in Skeptical Inquirer Online, that seems to question the potential effectiveness of a bill currently in the works in California which would eliminate non-medical exemptions for vaccination requirements to attend public school. I say “seems to” because it’s actually unclear.
My main criticism of the piece is not that what she says is blatantly wrong; it is that the piece doesn’t go anywhere and the research cited doesn’t support the weak, barely identifiable thesis at all. It is disjointed and doesn’t flow well. The transition from the topic of education to that of the bill is a huge leap. Her conclusion makes little sense given the rest of the piece. It’s only a few paragraphs (very short for SI), but in those few paragraphs she manages to treat some important research shallowly and selectively, missing the valuable knowledge that a nuanced look at the findings would provide. I won’t make that mistake here.
She cites two articles; the first mention is this:
Researchers Brendan Nyhan and Jason Reifler have spent the past few years conducting studies that seem specifically designed to depress science communicators. Last year, they published a paper in which they showed that correcting myths about the MMR vaccination actually decreased a parent’s intention to vaccinate.
What’s missing is that this was only true among those “with the least favorable vaccine attitudes”.
Even showing participants images of sick children was counterproductive, increasing their belief that vaccines are connected with autism.
Yes, but the “even” part is very misleading. Emotional pleas such as describing disease risks and showing images of or telling stories about children with diseases all increased this belief, but education refuting a link successfully reduced that same belief.
From the article:
“Autism correction” is most effective in reducing agreement with the autism misperception. Strong agreement declines from a predicted probability of 8.9% to 5.1% (and likewise for other response options). By contrast, the predicted probability of strong agreement increases to 12.6% for “Disease images.” Similarly, the predicted probability of believing serious side effects from MMR are very likely increased from 7.7% among control subjects to 13.8% in the “Disease narrative” condition.
This combination of results tells us a lot about what is happening when people are confronted with different strategies, yet nothing Watson wrote went beyond the few bits she selected from the abstract.
For the second citation, Watson writes:
Last month, they conducted a similar test using the common belief that the flu vaccine causes the flu. The results were the same: correcting the misconception only decreased the subjects’ self-reported intention to get vaccinated.
But this is what the article’s abstract actually says:
Corrective information adapted from the Centers for Disease Control and Prevention (CDC) website significantly reduced belief in the myth that the flu vaccine can give you the flu as well as concerns about its safety. However, the correction also significantly reduced intent to vaccinate among respondents with high levels of concern about vaccine side effects – a response that was not observed among those with low levels of concern.
After reading the article, I can tell you that the education measures worked across the board: at every concern level, educating people about the vaccine significantly reduced belief in the myth. However, those with the most concern about side effects dug in when it came to intent to vaccinate. Not everybody, only those with the most concern. (BTW, they didn’t conduct the test “last month”.)
These are finer points, but they are far from trivial. The details are what tell us what’s going on. I would not expect someone without an education in psychology to recognize the implications, although Abbie Smith, who reviewed the first study in a blog post last year, managed quite a bit of insight (which she wrote about with care, describing the findings in detail and not speculating beyond what happened in the study).
And this is where the anti-intellectualism is most apparent in Rebecca’s piece:
At this point, we can only guess as to the reason why this happens.
No, Rebecca, at this point, you can only guess. So, if you don’t know why it happens, then nobody does?
To anyone who has studied decision making, reason, attention, or just about any area of social psychology for a few years, this statement is absurd. The pattern of results found in these two studies is exactly what I would have predicted. We have decades of research telling us why this happens.
For a much more in-depth look, I will ask you to read Mistakes Were Made, But Not By Me by Carol Tavris and Elliot Aronson. The nutshell is that people do all sorts of mental gymnastics to reduce something we call cognitive dissonance–a tension between contradictory attitudes or an attitude and a behavior–in ways that allow us to avoid changing the behavior or strongly-held attitude.
In this case, when those holding strong anti-vaccine attitudes accept that their expressed reasons (e.g., autism) for those attitudes are invalid, they simply find another reason to maintain the attitude (e.g., side effects).
People are invested in the choice not to vaccinate, not the reason for the choice.
This would be especially true for those who have acted on that choice. The alternative is to accept that they have put their children at risk for no reason.
So, although Watson is not incorrect in reporting that some approaches backfired, she failed to see or report the nuances in these findings that tell us why and what we might do about it. And there’s more that I would not expect a layperson to recognize.
These studies only measured attitudes immediately following education–education that worked in dispelling myths about those vaccines. What I would like to see is follow-up research examining attitudes months or years afterward. What happens, for example, when people are educated, then given time to change their attitudes without threat to their egos and identities? I predict that a large portion of them will change their minds. Much of the resistance is probably rooted in ego threat. Giving people time and space may allow them to save face while changing the attitude to reduce the cognitive dissonance associated with the conflicting ideas.
Furthermore, in these laboratory studies, parents are asked to report their attitudes prior to exposure to the materials. This is a form of declaration, committing people to a viewpoint that they then feel compelled to defend. That’s not what happens in real-world situations.
So her next paragraph…
Do people hold their anti-vaccination beliefs so deeply that correcting a misconception only encourages them to spend time digging around for another reason to hate vaccines? If so, then the answer may be to address the underlying reasons for the belief instead of the scientific facts.
How are these two sentences connected? That people find other reasons to maintain a behavior or attitude is not evidence that there is some hidden reason. And expressed reasons are precisely what the studies she cited addressed. She’s come full-circle with nothing at all to show for it.
Cognitive dissonance and the unconscious strategies people use to reduce it are human nature. We cannot “address” human nature so easily. We can educate people about human nature and how it does not always lead us to the best decisions to meet our goals (and by “we”, I mean people who have studied human nature, such as social scientists with years of training and knowledge), but of course that’s a much broader goal. Increasing vaccination rates is a public safety issue that must be addressed with more urgency and specificity.
Finally, all of this came down to this one guess of hers:
For instance, perhaps the belief is rooted in a fear of government control over individual choices.
Um, seriously? This came out of nowhere as if she just didn’t have an ending to her story, or perhaps couldn’t come up with a good segue to get to the one thing that she meant to talk about: the California bill that may eliminate personal belief exemptions for unvaccinated kids to attend public schools. Rebecca’s logic is that if fear of vaccine harm is actually rooted in fear of government control, then the bill might make matters worse.
This is a huge leap. For one thing, she cites no research showing that fear of government control has anything to do with the average vaccine denier’s choices. Even if it did, the very research she cited shows that removing all government involvement in vaccination would not change intent to vaccinate (those with the most concern would simply find another thing to worry about). But more importantly, she begs the question:
But will the law (which already exists in West Virginia and Mississippi) only encourage the anti-government anti-vaccine activists to band together and renew their efforts to fight for their freedom to harm innocent kids?
Well, now, why not take a few minutes and do a little research to find out how laws affect vaccine rates?
According to Pew, “states with the strictest immunization laws tend to have the highest immunization rates” (they have a nice graph sorted by vaccination rates and Mississippi is at the top).
Not surprising. People tend to follow the law, and if they want to send their kids to public schools, they must vaccinate. But does this change attitudes? I think a lot of people would say that they don’t care, as long as it changes the behavior, but I think we can all agree that changing the attitude would be best.
And stricter immunization laws will change attitudes and beliefs about vaccines. How do I know this? Simple: cognitive dissonance.
One early finding in studies of cognitive dissonance theory is that it is often easier for people to change an attitude than it is to change a behavior. We have seen numerous examples of this, not only in laboratory studies, but in real-world behaviors such as smoking and exercise habits. Once invested in a behavior, the attitude follows as a matter of reducing the tension because we are invested in the behavior, not the reason for the behavior.
In this case, the biggest thing currently in the way of attitude change is the personal belief exemption. Remove that, and behaviors must change. Once behaviors change, attitudes will follow, especially with education, which will pave the way for attitude change by giving parents a way to engage in the behavior (of sending their child to public school) without dissonance. This is especially true when the parent has not declared their attitude prior to education, as they do in a laboratory study.
What the research cited suggests, when included in the context of decades of psychological research about the relationships among attitudes, behaviors, and values, is that a combination of stricter laws and education correcting myths about vaccines is not only highly likely to increase vaccination rates, it will also decrease perceptions of risk of harm from vaccines. Giving people facts does indeed work. It works to educate people about facts. If you want them to change their attitudes, however, you need to dig a little bit deeper.
To head off what will surely be the first thing Watson’s supporters will point out: what’s the difference if we came to the same conclusion? Her argument is “I think we should try X because nothing else seems to work” and mine is that X is what the science suggests. Only one of these is a valid argument. The ends do not justify the means.
The New Anti-Intellectualism
The implication that just anyone can write about this stuff with authority is the kind of anti-intellectualism I’m referring to in the title.
And before you assume that I am saying that skeptics have nothing to say, think again. Pseudoscience and fraud, the core of skepticism, are not science. Skepticism is a field in and of itself, very distinct from science. It includes scientific thinking and it benefits, as every field does, from the products of science, but it is not science.
This piece is poorly researched, weak, and reads like a book report that someone started, put away, then suddenly realized it was due and wrote the rest while the other kids were watching a film in class. A big part of that is the fact that Watson simply does not know the field, something she has demonstrated repeatedly. Yet, knowing her track record in this area, CFI decided to commission and publish this. The poor quality of the piece is a side effect of overconfidence coupled with a lack of expertise, but it further points to a huge drop in standards by SI Online. Not that SI hasn’t published some misses in the past, but this sad little piece is just one of many lesser-quality articles recently appearing there, including one in which the author describes party/county fair psychics as harmless fun.
So, am I saying journalists and other non-scientists (e.g., skeptics) should never write about science? No, I am not.
It is fine for non-experts to write on topics when they do so with great care. I cannot stress this enough.
A non-expert can do a great job when they do a proper amount of research by talking with experts (rather than spending a few minutes Googling and picking sentences out of abstracts that one believes supports one’s already-formed opinion), when they discuss experiments and studies accurately without omitting important details, when they properly credit the sources of ideas and opinions, when they follow what they find rather than start with a conclusion and attempt to support it, and when they refrain from stating their personal opinions as authoritative. Watson rarely appears to do any of those things when she writes about science. She simply writes and speaks with an air of confidence and that seems to be enough to make some people think that she is clever and knowledgeable.
Good science journalism allows the researchers’ voices to be heard, not the author’s. Think about that.
I have written before about the dangers and hypocrisy of speaking and writing on topics which require expertise one does not have (here, here, and here, for example–two of which are also about Rebecca Watson). It’s actually a topic that has received a lot of coverage, from Massimo Pigliucci’s talk at TAM8 to articles by Daniel Loxton (yes, I’ve linked to them both before and for good reason). I expect to see this kind of thing all over the blogosphere, but to see it on the Skeptical Inquirer’s site is disheartening, especially on the heels of other pieces that fall far below their old standards.
As I stated in this post more than three years ago, we (skeptics in general) criticize Jenny McCarthy and Bill Maher because they don’t have the expertise to make the statements they make. We criticize “the Food Babe” and many, many others for the same reasons. We tell people not to take medical advice from a Playboy Bunny and a talk show host, yet we (skeptics again) give a microphone to a blogger to talk about the psychology of vaccine denial simply because she calls herself “Skepchick”? How is this justified?
Now, I am perfectly aware that many people don’t believe that psychology is a science or that expertise in the field is actually a thing. I deal with that kind of anti-intellectualism every day. But I am still stunned when I see such blatant disregard for it among people and organizations who wave the flag of “listen to the experts” when it suits their purposes. CFI, you should be ashamed.
Originally published on ICBS Everywhere