The 2 AM Scroll: Inside the Health Misinformation Storm
Reference
Borges do Nascimento IJ, Pizarro AB, Almeida JM, Azzopardi-Muscat N, Gonçalves MA, Björklund M, et al. (2022) Infodemics and health misinformation: A systematic review of reviews. Bulletin of the World Health Organization, 100(9), 544–561.
Video Lay Summary

Lay Summary Author
Anat Eldar
View at The Collaborative Library Website
Scientists traced how health misinformation spreads and uncovered a hidden problem: the studies meant to fight it are often of poor quality themselves.
It’s two in the morning. You’re scrolling in the dark, looking for answers—but all you find is noise. Your aunt shares: “Drinking bleach kills COVID”. Your neighbor writes: “Vaccines contain microchips”. Your colleague forwards: “5G towers spread the virus”. You just want to keep your family safe, but it’s hard to know what to believe. The problem isn’t just the virus—it’s the flood of information that comes with it. Scientists have a word for this: an infodemic—when truth and falsehood spread together so quickly that people can’t tell them apart. And this pattern repeats with every outbreak or disaster—whether it’s COVID, Ebola, or Zika.
Health misinformation isn’t just confusing; it’s dangerous. It can make people delay care, skip vaccines, or lose trust in doctors. It spreads fear and confusion and leads to wasted resources and misapplied science. As false claims flooded the internet, scientists rushed to study the problem, trying to understand what was spreading and why. But with new studies appearing so quickly, another question emerged: could we really trust the science about misinformation itself?
So, what did the scientists do next? These questions sparked a global investigation. An international team joined forces, led by Israel Júnior Borges do Nascimento from Brazil and David Novillo-Ortiz from the World Health Organization’s European office, with colleagues from Colombia, Denmark, and Sweden. Instead of studying one rumor or one social-media platform, the team did something new—they studied the research itself.
Usually, one scientist looks at a single question, for example, how many tweets about Ebola were false. That’s one study. Then other scientists collect many of those small studies into a systematic review—a study that pulls results together, like completing one part of a large puzzle. But even those reviews can miss the full picture. So, this team went one step higher—bringing all those reviews together to see what science as a whole really knows about how misinformation spreads. They searched the world’s biggest databases of medical research in 2022, reviewing nearly ten thousand papers.
From that mountain of data, they found 31 major review studies, of which only 17 were fully published in scientific journals with full text available. These covered about a thousand smaller studies from different countries, platforms, and health topics. The researchers registered their systematic review with PROSPERO, a public database that tracks reviews to make them transparent and avoid duplication. They followed international reporting standards called PRISMA 2020 and QUOROM, which help make reviews clear and reliable. They asked four main questions:
1. How well do current studies deal with the special features of infodemics (too much or false information)?
2. What kinds of information about infodemics appear in systematic reviews?
3. What challenges, opportunities, and recommendations do those reviews mention?
4. How good is the quality and reporting of the existing reviews themselves?
To make sense of this mountain of data, they grouped the evidence into key themes:
• The harm misinformation causes: how false claims affect people’s decisions and emotions.
• Where it comes from and how it spreads: who creates and shares it, and how it travels.
• How common it is online: how much misinformation appears on social platforms.
• When social media helps instead of hurts: cases where it actually supports public health.
• What works to correct false claims: which methods best counter misinformation?
But they didn’t stop there. They realized that even the studies about misinformation might have weaknesses of their own. They wanted to know not just what researchers found, but how well they did their work. To find out, two of the researchers used a checklist with 16 quality characteristics, called AMSTAR 2, to see how carefully each review had been carried out. They asked things like:
• Did the authors plan their methods in advance to avoid bias, i.e., to avoid giving a wrong picture of what’s really true?
• Did they double-check their data for mistakes?
• Did they assess whether the original studies were designed and analyzed properly?
They compared their results, discussed any differences, and made sure their agreement was strong—measured with a statistic called Cohen’s kappa (κ). A score above 0.85 meant their ratings matched well. After collecting and checking all those studies, the team asked two key questions:
1. What’s already known about infodemics (when truth and falsehood spread together so quickly that people can’t tell them apart)?
2. How good is the research we’re relying on?
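For curious readers, the agreement statistic mentioned above, Cohen’s kappa (κ), compares how often two raters actually agree with how often they would agree by chance alone. Here is a minimal sketch of the calculation; the reviewer labels below are hypothetical examples, not data from the study:

```python
# Illustrative sketch: Cohen's kappa measures how much two raters agree
# beyond what chance alone would produce.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Return Cohen's kappa for two equal-length lists of category labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: the fraction of items both raters labeled the same.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement: what two independent raters with these label
    # frequencies would agree on by chance.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Two hypothetical reviewers rating five reviews (labels are made up):
a = ["critically low", "critically low", "low", "critically low", "low"]
b = ["critically low", "critically low", "low", "low", "low"]
print(round(cohens_kappa(a, b), 2))  # prints 0.62
```

A kappa of 1.0 means perfect agreement and 0 means no better than chance; the study’s reviewers reported agreement above 0.85, well into the “strong” range.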
What did the study reveal?
Across the reviewed studies, misinformation appeared frequently online.
How common is it?
Between 0.2% and 28.8% of health posts contained false or misleading claims—anywhere from 1 in 500 to nearly 1 in 3, depending on the topic.
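The conversion from percentages to “one in N” figures is simple arithmetic and easy to check. The two percentages come from the review; the snippet itself is only an illustration:

```python
# Convert prevalence percentages to "about 1 post in N" figures.
low, high = 0.002, 0.288  # 0.2% and 28.8% expressed as fractions

print(round(1 / low))      # 500 -> about 1 post in 500
print(round(1 / high, 1))  # 3.5 -> roughly 1 post in 3 to 4
```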
Where does it spread?
Mostly through Twitter, Facebook, YouTube, and Instagram, where information travels fast and far.
What harm does it cause?
False health information was linked to fear, confusion, delayed care, wasted resources, vaccine hesitancy, and stress among both the public and health workers. Some studies also noted increased pressure on hospitals and even hostility toward medical staff.
When can social media help?
Some reviews showed that these same platforms can also save lives—helping governments issue early warnings, track outbreaks, and share verified updates when used responsibly. Communities even created “fact-check” groups where users helped one another find reliable information.
What works to stop false claims?
Corrections from experts, doctors, and scientists were found to work far better than those from friends or influencers. Quick, calm explanations backed by credible sources had the strongest impact, especially when shared early.
How strong is the research itself?
When the team rated the quality of the 17 published reviews, 16 came out “critically low” and only one “low.” Overall confidence in the evidence was low to very low, showing that much of what we know about misinformation still rests on weak foundations.
Why does it matter?
Right now, governments, health agencies, and tech companies make real decisions based on this kind of evidence. But if that evidence is weak, those decisions risk missing the mark. Still, there’s reason for hope.
This research gives us an initial idea of where false information may spread fastest, whose voices people trust most, what types of corrections work best, and what standards future studies must meet. So next time, we don’t have to be caught off guard by health misinformation.
What couldn’t this study show?
Because almost all the reviews they examined were low in quality, the findings should be taken with caution. The team only analyzed published scientific reviews, not government reports or unpublished studies, which might have painted a different picture. Moreover, since they were reviewing other researchers’ work, their conclusions depended on how carefully those earlier studies were done.
Finally, much of the evidence came from COVID-19 research. We still don’t know if misinformation behaves the same way during other emergencies, like natural disasters or future outbreaks.
What can we do next time?
So here we are. The next time your phone buzzes with a shocking health claim, you’ll face a choice: Will you share it or pause and check before passing it on?
This study is a wake-up call. Misinformation doesn’t just spread online; it spreads through weak science, rushed research, and decisions made before the evidence is ready. Fixing it starts with all of us: scientists, policymakers, and ordinary people. We need better studies, stronger standards, and a shared commitment to accuracy over speed. But the study also gives us hope. We now know where false information spreads, how it harms, and what helps stop it. Truth can still win when we slow down, think critically, and verify before we share.
Your phone is a tool. It can spread fear, or it can spread facts. Pause. Verify. Protect.
Key takeaways:
For everyone
• Health misinformation online ranges widely—from less than 1% to nearly 1 in 3 posts.
• Experts correct misinformation best—and speed matters.
• Always pause and question before you share.
For scientists
• Much of the existing research on health misinformation and infodemics is low quality—we need stronger methods and transparency.
For policy makers
• Decisions must rest on trusted evidence, not speed.
• Health literacy is a long-term investment in public safety.
For the next crisis
• We know what spreads, what works, and what to fix; this time, we don’t have to be caught off guard.
• Tackling infodemics and the spread of false health information takes action from many sectors, not just the health field: fair and effective laws and policies, public awareness campaigns, better presentation of health topics in the media, and stronger digital and health literacy skills so everyone can recognize and question unreliable information.
Lay Summary License
This lay summary is distributed under the Creative Commons Attribution–NonCommercial 4.0 International license (CC BY-NC 4.0).

