Before the Next Disease Spreads, This One Already Has

As global health experts continue to warn of the next “Disease X”, the WHO’s placeholder term for an unknown pathogen capable of causing a severe, fast-moving outbreak, the world faces a new kind of epidemic that may prove just as deadly as the disease itself: misinformation. In an age where information spreads faster than viruses, falsehoods, half-truths, and conspiracy theories can undermine public health efforts, accelerate the spread of disease, and erode trust in the institutions meant to protect us.

During a fast-moving outbreak, time is critical. Containing a highly infectious disease requires swift action: early detection, public cooperation, timely isolation, vaccination where possible, and trust in health authorities.

However, the rise of social media and instant messaging platforms has created a parallel outbreak of misinformation, one that can travel farther, faster, and deeper into communities than any virus. In some cases, misinformation is so potent that it leads people to reject lifesaving guidance, delay seeking care, or even attack health workers and scientists.

Consider what might happen in a hypothetical outbreak of a new respiratory disease. Within hours of the first case being reported, posts begin to circulate online claiming the illness is a government hoax, or worse, a bioweapon. Others suggest the disease was caused by a new 5G network rollout or that a particular religious group is responsible. In the absence of clear, trustworthy information, people begin to panic, and panic feeds on confusion. Homemade remedies go viral: boiling herbs, drinking industrial bleach, inhaling smoke. Meanwhile, public health officials struggle to get accurate information to people who no longer know whom to trust.

This challenge is not hypothetical. The COVID-19 pandemic, the mpox (formerly monkeypox) outbreaks, and past Ebola responses have all shown how damaging misinformation can be in a health crisis. During the early months of COVID-19, conspiracy theories about the origin of the virus and the safety of vaccines spread widely, often outpacing official communications. In many communities, vaccine hesitancy fueled by misinformation led to preventable deaths and prolonged the pandemic.

One of the main drivers of misinformation is a lack of trust in governments, in scientific institutions, and in the media. In many parts of the world, especially where people have experienced marginalization, historical abuse, or state neglect, there is a strong basis for skepticism. Public health efforts, no matter how well intentioned, can be viewed through a lens of suspicion. When institutions fail to communicate transparently or appear to prioritize political or economic agendas over people’s well-being, they create fertile ground for misinformation to thrive.

In a future outbreak of a highly infectious disease, the spread of misinformation could also be amplified by artificial intelligence tools, deepfakes, and bot networks designed to destabilize public trust. Malicious actors may exploit the crisis to spread fear, erode faith in public health, or even stoke geopolitical unrest. This adds another layer of complexity to managing a health emergency: not only must we combat the disease, we must also protect the information ecosystem surrounding it.

So, what can be done? Fighting misinformation requires more than posting facts; people don’t always trust facts when they come from institutions they distrust. Instead, public health communication must become human-centered, empathetic, and inclusive. Trusted local voices, such as community leaders, religious figures, educators, and healthcare workers, should be empowered to share clear, culturally relevant information. Messaging must be delivered in local languages and through the channels people actually use, whether that’s radio, WhatsApp, TikTok, or town hall meetings.

In addition, governments and tech companies must take more responsibility for moderating content during public health emergencies. Algorithms that prioritize sensational or polarizing content can be deadly during an outbreak. Partnerships between health authorities and social media platforms can help to flag misleading content and elevate trusted sources. But these actions must be transparent and respect free speech, or they risk further eroding public trust.
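To make the point about ranking concrete, here is a minimal, purely illustrative sketch in Python. It is not any real platform’s algorithm; the posts, engagement numbers, credibility scores, and blending weight are all invented for the example. It simply shows how an engagement-only feed pushes sensational posts to the top, while blending in a source-credibility signal elevates official guidance instead.

```python
# Toy illustration only: hypothetical posts and scores, not a real ranking system.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    engagement: float   # e.g. shares per hour (made-up numbers)
    credibility: float  # 0..1 rating of the source (made-up numbers)

posts = [
    Post("MIRACLE CURE doctors don't want you to know!", engagement=950, credibility=0.05),
    Post("Health ministry: wash hands, isolate if symptomatic", engagement=120, credibility=0.95),
    Post("The outbreak is a hoax, ignore the guidance", engagement=700, credibility=0.10),
]

def rank_by_engagement(posts):
    """Engagement-only ranking: sensational content rises to the top."""
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

def rank_with_credibility(posts, weight=0.7):
    """Blend normalized engagement with source credibility to elevate trusted sources."""
    return sorted(
        posts,
        key=lambda p: (1 - weight) * (p.engagement / 1000) + weight * p.credibility,
        reverse=True,
    )

print([p.text for p in rank_by_engagement(posts)])    # sensational post first
print([p.text for p in rank_with_credibility(posts)]) # official guidance first
```

Even in this toy example, a single weighting choice decides whether the official guidance or the “miracle cure” reaches people first, which is why transparency about such choices matters during an emergency.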

Lastly, the public must be better prepared before the next outbreak occurs. Just as we run drills for fires and earthquakes, we need public education campaigns about misinformation and its dangers. Digital literacy, critical thinking, and how to evaluate sources should be taught in school curricula and adult education. Communities equipped to question what they see online, check sources, and consult professionals are far less likely to fall victim to misinformation.

In a future epidemic, the virus itself may not be the only or even the greatest threat. A population divided by confusion, fear, and misinformation is vulnerable, not just to disease but to deeper societal breakdown. To prevent this, we must start treating the infodemic with the same urgency as the epidemic. Our ability to respond effectively to the next global health crisis may well depend on it.
