There was a time when proof ended arguments. You could point to a document, a photograph, a number in a ledger, and people would at least pause. Evidence once had authority. It didn’t settle everything, but it carried weight. Now it’s just another weapon. The more proof we produce, the less anyone believes it.
Somewhere in the past decade, fact-checking turned into its own faith. It has priests, rituals, and heretics. You don’t seek truth in the old sense; you seek affirmation that your side’s version of it remains pure. To “debunk” something has become a form of worship, not inquiry. The goal isn’t clarity anymore—it’s victory. And victory, in the digital age, means clicks, not consensus.
You can see this every time a story breaks. Within minutes, experts line up to “contextualize” it, which usually means sanding off the edges that make their side look bad. Screens fill with colored checkmarks and authoritative graphics explaining what’s “mostly true,” “misleading,” or “false.” The words sound scientific, but they’re just categories of convenience. Even the cleanest fact-checks can’t escape the human urge to frame reality in our own image.
When proof became content, it lost its gravity. Everything became debatable, even the camera feed. A photo is evidence until someone claims it's a deepfake. A statistic is gospel until the wrong person quotes it. The internet trained us to distrust not only the source but the existence of facts themselves. The result is epistemic freefall—a society that treats proof as a challenge, not a foundation.
This is how conspiracy culture thrives. You don’t need to invent data; you just need to claim someone else’s data is corrupted. Every liar borrows the language of skepticism. Every grifter claims to be the “real journalist.” Every demagogue says, “Do your own research.” And millions do—by scrolling through the same handful of feeds that confirm what they already thought.
The worst part is that the institutions built to protect evidence helped destroy it. Media outlets chasing engagement learned that outrage outperforms nuance. Universities, once proud of teaching critical thinking, built entire departments around identity and grievance. Government agencies spun their own narratives, then acted surprised when no one believed them. Proof doesn't work when every referee wears a jersey.
I still remember when “show your work” meant something. In the classroom, it was about transparency—you laid out the steps so anyone could follow your reasoning. In public life now, showing your work just gives opponents a road map to tear you apart. So people stop explaining. They just assert. The tone replaces the trail. The argument becomes performance art.
There's a difference between data and meaning. We've buried meaning under so much information that people have stopped trying to find it. We have dashboards for everything—disease, debt, crime, temperature—but no shared vocabulary for interpreting any of it. We argue about the metrics while reality decays. Proof can't help us if we don't agree on what it's for.
Sometimes I think we’ve confused skepticism with nihilism. The first questions; the second devours. Healthy doubt checks power; pathological doubt destroys trust. What we have now is a marketplace of distrust, and proof can’t survive there. It’s too fragile. It depends on good faith, on some shared willingness to believe that facts still matter. Without that, even the best evidence turns to noise.
The answer isn’t to build bigger databases or launch another round of “disinformation” task forces. It’s to rebuild the habit of meaning. Proof has to live somewhere—in people’s judgment, not just in files. We need to teach how to think again, not just what to know. Otherwise, every photo, every chart, every document will keep dissolving on contact with disbelief.
Truth doesn't vanish when proof stops working; it hides. It waits for a quieter time, when we remember that knowing something isn't the same as winning. Maybe then we'll rediscover what proof was for—to bind us to reality, not to each other's factions.