
Colorado doctors, advocates tackle “pervasive” health misinformation head-on

Before the pandemic, the conventional wisdom in public health was that trying to engage with misinformation was like throwing water on a grease fire — nearly guaranteed to spread the problem.

But it’s become clear over the last few years that bad information about vaccines and other hot health topics is already so widespread that the first time someone encounters a myth is unlikely to be in a post debunking it, said Emily Clancy, communications director for Immunize Colorado.

A 2021 poll by the Kaiser Family Foundation found 78% of adults had heard at least one myth about COVID-19 vaccines and either believed it or weren’t sure whether it was true.

For example, if almost no one had heard the idea that COVID-19 vaccines could change their DNA, pointing out that they can’t might spread the lie to people who wouldn’t have otherwise heard it. But as is, almost everyone has heard those kinds of falsehoods, so that’s not much of a concern, Clancy said.

“Because misinformation is so pervasive, we also recognize that when we don’t address the myth directly, we risk not sharing accurate information and leaving people with only the misinformation they may have seen,” she said.

More than three years after COVID-19 became a household word, the debate about how to reach people with correct information and reduce the odds they’ll make decisions based on lies is ongoing. Some experts have raised concerns that the United States could actually be less prepared for a future pandemic than it was for COVID-19, because many people distrust public health leaders and 30 states limited their power to take steps like temporarily closing businesses or requiring people to quarantine.

In February, U.S. Rep. Diana DeGette, a Denver Democrat, introduced a bill in Congress that would appropriate $45 million over four years to study how to disseminate correct information and fight misinformation, especially during public health emergencies. It would also direct the U.S. Department of Health and Human Services to conduct an education campaign directed at populations that are most often targeted with misinformation, to give them tools to vet data and sources.

The bill is still stuck in a committee, though, and not many proposals are gaining traction while Congress is preoccupied with a fight over raising the debt ceiling.

Dr. Jon Samet, dean of the Colorado School of Public Health, said the problem of misinformation isn’t new; the tobacco industry created the playbook for disinformation during its long battle to suppress the truth that smoking kills. (Misinformation is unintentionally wrong — think of that person on Facebook who’s convinced drug companies are hiding a cure for cancer — whereas disinformation is something that the spreader knows isn’t right, but chooses to push anyway.)

While disagreement and skepticism are healthy and part of the scientific process, what the tobacco industry did was create the impression of doubt so the public would be confused about the true state of the science, Samet said.

People who spread disinformation today use many of the same tactics, but their financial self-interest is sometimes less obvious than the tobacco industry’s, Samet said. For example, conspiracy theorist Alex Jones made millions selling supplements, but his listeners may not have realized that was why he was telling them to reject vaccines and proven medications, he said.

“This is not a new problem, but it’s incarnated in new ways because of the internet and because of politics,” he said.

Charlatans have always existed, but now technology allows them to reach a large audience in a way that they couldn’t decades ago, Dr. Robert Califf, commissioner of the U.S. Food and Drug Administration, said during a visit to Denver in January. The FDA can regulate drugs and devices that claim to treat, prevent or cure a disease, but online sellers can just shut down one site and pop up somewhere else, he said. And of course, they can avoid the game of regulatory whack-a-mole altogether by sticking to vague claims, such as that a supplement “boosts immunity” — a phrase that sounds legitimate to laypeople but has no medical meaning.

“A lot of people who are saying ‘don’t use the vaccine’ are selling something else and they’re making a lot of money doing it,” he said.

Misinformation also tends to morph over time. When enthusiasm for the anti-parasitic drug ivermectin as a COVID-19 therapy began to wane — whether because proof stacked up that it didn’t work or because public fear of the disease had lessened — its biggest backers shifted to promote it as a treatment for flu and respiratory syncytial virus, according to The Washington Post. There’s no evidence it works against either virus in humans; many things kill viruses in petri dishes, but either don’t work in the body or would have to be given at unsafe doses to have an effect.

The challenge in combating those kinds of claims is that many people don’t trust the FDA, or government in general, and doctors aren’t taught how to deal with misinformation, Califf said. More patients are saying they did their own research — which could mean anything from reading medical journals to watching YouTube videos — and physicians need to know how to respond to that, he said.

A survey in February 2022 found about 37% of adults said they trusted the Centers for Disease Control and Prevention’s recommendations “a great deal,” while about a quarter said the same of their state and local health departments. Doctors and nurses had the highest levels of trust, with 54% and 48% of people, respectively, saying they trusted their health recommendations a great deal.

Not everyone thinks it’s the role of public health or the government to try to persuade people. Weld County Commissioner Scott James, who spoke on behalf of the county’s health department, said he thought that health officials actually lost trust with his constituents when trying to persuade them to take the COVID-19 vaccine. (The vaccine significantly reduces the risk of hospitalization or death.) People don’t like to be told what to do, especially in the West, he said.

“As a county department of health, we deal with fact and the only thing we can control is what we put out,” he said.

Dr. Patrick Pevoto, an obstetrician-gynecologist and president of the Colorado Medical Society, said that sharing facts is important, but that the best thing doctors can do is show that they’re listening and they take their patients’ concerns seriously. That can mean admitting when they don’t know something or when the science is still evolving, he said.

“I think people trust that,” he said.

Pevoto said he has hope that people who’ve accepted disinformation can change their minds, based on his family’s experience. When his brother got COVID-19, he first tried hydroxychloroquine, an antimalarial drug that was touted as a cure even as evidence mounted that it didn’t help with the virus. Eventually, his brother got sick enough that he sought mainstream care, which changed his view of who was trustworthy on health matters, Pevoto said.

There’s no quick solution to misinformation, and some of the most effective efforts are one-on-one conversations that include both data and people’s personal stories, Clancy said. It’s especially helpful to pair a vaccine expert with someone who is trusted in a particular community, she said.

“These are often much more effective than any social media or digital campaign,” she said.
