As search and rescue teams in Texas continue looking for those lost in extreme flash floods and communities try to piece their lives back together, claims quickly spread about what happened and who was to blame.
Many on the left blamed the Trump administration’s cuts to the National Weather Service. On the right, keyboard warriors accused cloud seeding technologies of causing the devastating floods. Others in the community spread news of the miraculous survival of some of those caught in the flood.
These claims and accusations have been called misinformation, commonly understood as “false” or “misleading” information. The floods in Texas have inundated news cycles with a broader discussion of what misinformation is, how it works, and the impacts it can have.
It is not surprising that Americans are worried about misinformation. Recent polling by the Cato Institute shows that Americans believe misinformation is the greatest threat to their freedom. This finding is true for Republicans and Democrats, though they likely consider misinformation to be a threat for different reasons.
Other polls have reported that 80 percent of Americans view misinformation as a major problem. And according to a 2023 Pew poll, 55 percent of Americans believe the U.S. government should take action to restrict false information, even if it limits freedom of information.
Research on misinformation, though, shows that it is not as serious a threat as it is made out to be, and we must be careful that in our efforts to address it, we don’t make matters worse.
Misinformation is an incredibly subjective issue to which people respond in complex ways. In fact, misinformation is most often adopted and spread by those who are already predisposed to believe it, as we can see clearly in the recent events in Texas.
The cycle is familiar: Politically motivated actors spread false or misleading information that was too good to check because it reinforced their beliefs. Similarly, locals hoping for some good news shared and believed information that they desperately wanted to be true, but sadly, it was not. And as often happens during significant disasters, false or misleading information spreads because of the rapidly evolving nature of the tragedy — we often simply don’t know what the truth is yet.
So, while misinformation can be harmful, it is often more of a symptom than a disease. Research shows that misinformation itself often does not change the beliefs and actions of those who encounter it; rather, it tends to reinforce existing beliefs or behaviors. In that sense, misinformation does not have the powerful impact that the media and political world commonly ascribe to it.
Unfortunately, despite this evidence minimizing its impact and power, the clouds of misinformation loom large over our society today. Americans have been told for years now that we are in the midst of an “infodemic” of powerful misinformation that infects our minds like a virus. For example, last year, the World Economic Forum’s risk report labeled AI-powered misinformation and disinformation as the greatest threat facing the world in the next couple of years. The volume of academic research, books, journalism and fact-checking resources devoted to the topic has surged over the past decade.
Rather than panic about misinformation and open the door to government censorship, we should address the threat from the ground up, not the top down. For tech companies, this means rebuilding user trust and helping users become better consumers of information. Tools like community notes, which are being adopted or tested in some form by X, Meta, TikTok, YouTube and other platforms, are likely to help users trust the fact-checks they see. And efforts to “pre-bunk” misinformation through better media literacy will help by empowering users.
When the government begins funding counter-misinformation research, things tend to go awry. This may sound counterintuitive, but we often disagree about what misinformation is and tend to favor our political biases, as seen in the news around the Texas floods. So when the government doles out money to research misinformation, it is inevitably funding those biases, which over time contributes to polarization and a lack of trust in our institutions.
Similarly, the U.S. government should limit what it deems “foreign disinformation” to include only the most clear-cut and harmful cases. When not handled carefully, such efforts can turn, and have turned, into government attacks on Americans’ speech and political views (see the intelligence experts who got the Hunter Biden laptop story wrong), further polarizing and degrading Americans’ trust in their leaders.
The floodwaters are receding in Texas, but the storm of misinformation still rages within our society. Instead of doubling down on misplaced panic over misinformation, we must trust and help Americans discover the truth. More speech and more discussion, not less speech and more government control, are how we sort through information and find a brighter tomorrow.
David Inserra is a fellow for free expression and technology at the Cato Institute.