The AstraZeneca Vaccine Crisis in Europe Wasn’t About Science at All

Scared citizens cannot be easily convinced by expertise that feels remote.

A crisis was brewing in Europe over the safety of the Oxford-AstraZeneca vaccine against COVID-19. Europeans in multiple countries had reported blood clots and abnormal bleeding after receiving it, occasionally requiring hospitalization. In response, regulators across Europe, where this vaccine is most widely distributed, suspended its use while they reviewed its safety. But scientific and medical experts were frustrated. They emphasized that the incidence of blood clots among vaccine recipients was actually much lower than in the general population and that the vaccine is safe. They worried that the regulators’ response would amplify vaccine hesitancy and increase potentially deadly COVID-19 infections. The science, they insisted, was clear and should be trusted.

But this crisis wasn’t about science at all. It was about public trust, and scared citizens cannot be easily convinced by expertise that feels remote. Our solutions need to reflect that.

There is a long-standing perception that vaccine hesitancy is the result of public ignorance or dismissal of science. But studies show that vaccine hesitancy is the result of mistrust in our governing institutions, including those dedicated to science and technology. Citizens are alarmed by the often cozy relationships between regulators and the industries they oversee and frustrated with the role of private interests in the research enterprise.

My own research, conducted through the Technology Assessment Project at the University of Michigan, adds two more types of institutional failure that lead to citizen distrust. We have seen serious problems due to what sociologist Diane Vaughan calls the “normalization of deviance,” in which unsafe bureaucratic practices come to seem normal if they don’t cause immediate catastrophe. Reviewing the space shuttle Challenger disaster, Vaughan discovered that NASA’s organizational culture made it essentially impossible for managers to hear engineers’ concerns about the weaknesses of the O-ring, the technical component ultimately blamed for the explosion because it failed in exceedingly cold temperatures. Similarly, organizational problems at the US Centers for Disease Control and Prevention led to its faulty COVID-19 test early in the pandemic.

In addition, marginalized communities often feel that the decisions made by governments and technical experts simply do not represent them. Consider the recent water crisis in Flint, Michigan. Soon after leaders changed the city’s water source in order to save money, residents (54% of whom are Black) noticed that their water had a foul smell and was discolored. They began losing their hair, and their skin was breaking out in rashes. Some people died of Legionnaires’ disease. They cried out to local experts and government officials, including environmental and health regulators, but their concerns were summarily dismissed for months. This episode inflamed existing concerns that government officials did not respect community knowledge or needs, and to this day many residents remain deeply skeptical that their water is safe to drink and use.
