When it comes to infectious diseases, public health officials face a matrix of questions before they can effectively respond to an emerging health crisis.
Can we fully define the threat? Do we understand the scope of exposure? Do we have a treatment? Can we prevent its spread? Is there a vaccine?
Often, the answers to those questions are a mix of “yes,” “no,” and “sort of.” However, when it comes to 1 of the most dangerous emerging threats, the answer to all 5 questions is a decided no. Health misinformation might sound like a problem for the communications industry, but it has become an increasingly dangerous public health concern. And unlike so many other threats, this one presents a nearly insurmountable challenge: it spreads on the internet.
Joseph A. Hill, MD, PhD, a professor of medicine and microbiology at the University of Texas Southwestern Medical Center, raised the alarm about misinformation earlier this year when he and more than 2 dozen colleagues penned a widely published editorial warning about the dangers of medical misinformation.
The co-authors, all editors-in-chief of major cardiovascular journals, warned that medical misinformation is a crisis, and one that extends far beyond fear-mongering about vaccines. Patients are increasingly second-guessing treatment recommendations based on incomplete, misinterpreted, or downright false information.
“One significant cause of suboptimal utilization of our prodigious tool chest is medical misinformation hyped through the internet, television, chat rooms, and social media,” Hill, the editor-in-chief of Circulation, and colleagues wrote. “In many instances, celebrities, activists, and politicians convey false information; not uncommonly, authors with purely venal motives participate.”
The results can be deadly—anything from a patient suffering unnecessarily from a treatable condition to outbreaks of an entirely preventable disease like measles.
In an email to Contagion®, Hill said he believes the great majority of patients trust their physicians’ advice, even if they don’t always perfectly adhere to recommendations or prescriptions.
“I do believe that the group(s) predisposed to rejecting this advice is a small, but highly vocal, subset,” he said. “That said, their not pursuing science-based advice puts themselves, and often others (eg, herd effect immunity) at risk.”
Vish Viswanath, PhD, professor of health communication at the Harvard T. H. Chan School of Public Health, said he doesn’t think public trust in physicians is any lower than it used to be.
Rather, he said the problem is 2-fold. First, it’s easier than ever before to spread information generally, thanks to social media and the internet more broadly. Second, it’s easier than ever to access scientific and medical information.
“There is greater specialized information available than was ever the case before,” he said, regardless of the reader’s ability to understand the information. Thus, for some patients, the democratization of scientific literature doesn’t result in more knowledge, so much as in more confusion and anxiety.
Stopping the Crisis
Typically, in a public health crisis, physicians would use a variety of avenues to address the problem. The first is prevention—trying to stop misinformation before it starts. The second is treating the individual patient—clearing up misinformation at the point of care. A third avenue is combatting existing misinformation at the population level—finding ways to stop or curb misinformation once it’s out in the wild. None of the avenues is easy when it comes to health care misinformation.
A Vaccine for Misinformation?
Stopping misinformation before it starts is particularly difficult. Viswanath sorts bad health information into 2 categories: disinformation, the intentional spreading of false information; and misinformation, the spreading of false information due to an honest misinterpretation or misunderstanding.
“We know that some individuals legitimately, genuinely misinterpret [health information] with good intentions because they don’t have sufficient training and analytical skills in a particular area,” he said.
He notes that this often occurs when patients mistake correlation for causation, as happened when patients anecdotally reported that their children seemed to act differently after receiving a vaccine. The difference between correlation and causation is widely known and respected in the scientific and medical communities, but it can be difficult for the general public to appreciate.
Even if one could stop bad actors from creating disinformation, there would be no way to stop misinformation.
Stopping Misinformation at the Point of Care
So what about stopping misinformation in the exam room? That’s not particularly easy, either.
Hill said he has yet to find a strategy that works well to combat patient skepticism or resistance based on misinformation, but he doesn’t think physicians ought to try to argue their patients into accepting the science.
“Right or wrong, I tell my patients, ‘Your insurance company is paying my institution for me to provide you with the best advice based on the latest science, to answer your questions, and to lay out the justification for my advice. I have done that. Now, it’s your body, and you’re free to accept or reject my advice,’” he said.
Viswanath added that such conversations put physicians in a bind: there is no billing code for talking a patient out of a misinformed health care decision. Moreover, a lengthy conversation with a misinformed patient ultimately detracts from the time a physician can spend with other patients who might have more pressing medical matters.
Stopping Misinformation on a Large Scale
Viswanath said physicians can’t be expected to combat such a major problem alone. Rather, there needs to be a shift in the ecosystem of health information. The question is: What would such a shift look like?
“I think that’s a collective responsibility we all have,” he said. “We as a society, both media, publishers, scientists, scientific institutions, all of us have to collectively think through what it is we can do to really minimize this problem. We can’t eliminate this problem.”
Some of the misinformation comes from media companies. Hill said he doesn’t trust the media to correct the problem on its own, in part because the economics of the internet likely wouldn’t support hiring the kinds of experts and fact-checkers necessary to get the science right.
“All that said, government oversight/regulation may well be required, but it would be optimal in my opinion for the purveyors of social media content to manifest the social conscience—the moral integrity—to do this on their own,” he said.
Social media platforms have been under increasing pressure to do more to stop the spread of misinformation, and some, notably Instagram, have begun taking steps toward that end.
But the problem is not just about dubious sources. Misinformation can also come from the spread of information gleaned from seemingly credible sources. A case in point is a spike in patients requesting genetic testing to see if their children carry a particular mutation of the gene MTHFR. There is no scientific evidence that the gene has anything to do with adverse responses to vaccines. However, a decade-old study published in a prominent journal suggested that a variation in the gene might be connected with patient responses to vaccines.
The anti-vaccination movement seized upon the study, prompting some parents to order genetic testing on their children in hopes of predicting potential negative impacts of vaccines. When The Atlantic wrote about the phenomenon earlier this year, the study’s senior author conceded that the research doesn’t hold up and “isn’t a valid study by today’s methodology.” And yet, the study is easily accessible, in full and for free, via a quick Google search.
In other cases, a study might be perfectly sound, but the public simply misinterprets it or makes unjustified conclusions.
Viswanath said he has a “libertarian” approach to information, so he doesn’t think making these journals inaccessible is the solution to misinterpretation. Besides, locking up the information would only lead to more conspiracy theories, he said. Rather, publishers need to find ways to add context or labeling to steer readers down the right track.
“We know labels work,” he said. “If you take the tobacco example, labeling works. It’s honest labeling to say that smoking kills.”
Hill said publishers and scientists ought to be held responsible, too.
Although misinformation and disinformation have received significant news coverage, Viswanath said there hasn’t been enough talk about solutions. That will have to change if the public health crisis of misinformation is ever to be contained, he said.
“My plea is that we as institutions engaged in health and health information really have to think about this much more seriously than we have been doing and come up with more institutional-level solutions,” he said.