How to spot a low-quality source before you share it
The hardest thing about evaluating sources is that the cues we instinctively look for — does the site look professional, does the writing feel confident — turn out to be poor signals. A polished site with a confident voice can still be a single anonymous person reposting other outlets' work with a partisan slant. A clumsy-looking site with rough copy can still be a careful local reporter doing primary work nobody else is doing. The trustworthiness signal is usually elsewhere.
The first signal, and the fastest shortcut, is how the source treats uncertainty. Reputable outlets, especially on developing stories, signal what they don't yet know. They use phrases like "according to," "the official said," "we have not independently verified," "earlier reports stated X but those reports relied on Y." Low-quality sources flatten all of this into confident assertion. The presence of careful language is a stronger signal than the absence of typos.
The second signal is corrections. A serious outlet has a corrections page or a visible correction history on its articles. Mistakes happen everywhere; what differs is whether they're acknowledged. An outlet that has never had to correct a story is either incredibly lucky or doesn't admit when it's wrong, and the latter is more likely.
The third signal is the byline. Real journalism has a person's name on it. Often the name links to a bio with prior work, and sometimes to a phone number or email address. Aggregator sites, content farms, and many partisan sites use generic bylines or no byline at all, because the goal of the operation isn't to hold individual journalists accountable — it's to publish volume.
The fourth signal is the source list inside the article. Stories about an event should reference original documents, named officials, or eyewitnesses. Stories sourced only from "reports indicate" or "according to social media" are usually one or two layers away from any actual reporting, and that game of telephone is where errors compound.
A fast practical test for a story you're tempted to share: try to find the original source it's based on. If the story is actually based on a primary document, court filing, or named official, the original is usually one or two clicks away. If you can't find an original — or if every reference traces back to other aggregators of aggregators — you're looking at a story whose factual basis is unverifiable, and sharing it makes you part of the chain.
The fifth signal, and the one that takes the longest to develop, is calibration over time. After a year of paying attention to which sources got which stories right and wrong, you build a personal map. Some outlets you used to trust slip; some you'd dismissed turn out to be reliable on a specific beat. Maintain the map mentally and update it. There's no shortcut.
The piece we'd recommend on this is a long, calmly written guide from a former newsroom editor. It walks through ten real-world examples of stories that broke down in different ways, and the cues that would have flagged each one before it spread. The case studies on the stories that fooled experienced journalists are the most humbling and the most educational.
NapMap editorial
Curated content recommendations from independent publishers.