Let’s keep the vaccine misinformation problem in perspective


The most vaccine-skeptical public figures, such as Tucker Carlson or Senator Ron Johnson (R-Wisconsin), understand that. They don’t need to spread demonstrable lies. They can simply dwell, night after night, on aberrant cases of serious side effects. Or they can selectively present the results of scientific studies or government communications in a way that seems to suggest something disturbing about the virus or the vaccines. Or they can bypass the scientific question entirely in favor of speculation about how the government’s vaccination campaign is really about social control. Like any illusionist, they know that the most powerful tool at their disposal is not misinformation, but misdirection.

This subtle distinction is often lost on members of the media and the political establishment. Sometimes “misinformation” becomes a catch-all term for any content used to deter people from getting their shots, whether it is objectively false or not. A recent New York Times article about the influential anti-vaxxer Joseph Mercola, for example, titled “The Most Influential Spreader of Coronavirus Misinformation Online,” concluded by noting that Mercola had written a Facebook post suggesting that the Pfizer vaccine was only 39 percent effective against infection with the Delta variant. Mercola was accurately relaying the conclusions of a real study, one that had been covered by the mainstream media. The Times article, however, faulted him for not mentioning the study’s other finding: that the vaccine was 91 percent effective against serious illness.

To be sure, Mercola, an osteopathic physician who has made a fortune selling “natural” health products often pitched as alternatives to vaccines, would have done his followers a service by sharing that data point as well. Cherry-picking real statistics to cast doubt on vaccines is dangerous. But to sweep this example under the heading of misinformation is to engage in conceptual drift. Misleading is not the same as false, and the difference is not simply semantic. Facebook, YouTube and Twitter are rightly under immense pressure to do more to curb the spread of dangerous lies on their platforms. They often take their cues from established media organizations. It would be a troubling development for free speech online if, in the name of preventing real-world harm, platforms routinely removed as “misinformation” posts that contain nothing objectively false. It is hard enough to distinguish truth from falsehood at scale. It would be unwise to ask platforms to take responsibility for judging whether someone’s interpretation of the facts – their opinion on a matter of public policy – is acceptable or not.

“Misinformation definitely makes it worse,” said Gordon Pennycook, a behavioral psychologist at the University of Regina. “There are people who believe things that are wrong, and they read those things on the internet. That is certainly happening.” But, Pennycook continued, “the more you focus on that, the less you talk about the ways in which people are hesitant that have nothing to do with misinformation.”

In his research, Pennycook conducts experiments to understand how people actually respond to misinformation online. In one study, he and his co-authors tested whether people would be persuaded by the claim in a fake headline after being exposed to it online. (Sample headline: “Mike Pence: Gay Conversion Therapy Saved My Marriage.”) In one phase of the experiment, exposure to the fake headline raised the number of people who rated the claim as accurate from 38 to 72. You could look at that and say that online misinformation boosted belief by 89 percent. Or you could note that there were 903 participants in total, meaning the headlines worked on only about 4 percent of them.
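To make the arithmetic behind those two framings explicit, here is a minimal sketch using only the figures quoted above (38, 72, and 903); the variable names are illustrative, not taken from the study itself.

```python
# Two framings of the same experimental result, using the figures quoted above.
believers_before = 38    # rated the fake claim accurate without exposure
believers_after = 72     # rated it accurate after exposure to the fake headline
total_participants = 903

# Framing 1: relative increase in belief (72 is ~89% more than 38)
relative_increase = (believers_after - believers_before) / believers_before
print(f"Belief boosted by {relative_increase:.0%}")               # ~89%

# Framing 2: share of all participants actually swayed (34 out of 903)
share_swayed = (believers_after - believers_before) / total_participants
print(f"Headlines worked on {share_swayed:.1%} of participants")  # ~3.8%, i.e. about 4%
```

The same 34 people account for both numbers; the only thing that changes is the denominator.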

The current debate over vaccine misinformation sometimes seems to assume that we live in the 89 percent world, but the 4 percent figure is probably the more useful benchmark. It would still be a serious problem if only a small percentage of Facebook or YouTube users were susceptible to vaccine misinformation. They would be more likely to refuse the vaccine, get sick, and spread the virus – and perhaps their false beliefs – to others. At the same time, it is important to keep in mind that somewhere around a third of American adults are still choosing not to be vaccinated. Even if Facebook and YouTube could erase all anti-vax content from their platforms overnight, it would take only a bite out of a much bigger problem.




