The sort of mental heuristics I use to assess information really depends on the nature and context of the information.
If it’s some sort of scientific claim, then I’ll want to see if there’s peer-reviewed research, if it’s been replicated, if there’s a meta-analysis. A single piece of research can easily be wrong, or superseded by newer studies, for example. At least in science, the more confident the scientists are and the more of them agree, the better the sign that something is ‘true’.
When it comes to more general human affairs, society, politics, economics, etc., things get muddy, because it’s nearly all value-laden. The motivations are more about upholding comforting worldviews and gratifying egos. ‘Truth’ in these realms may be viscerally opposed by the majority. You can’t trust the popularity of an opinion, or even the number of sources, because they can simply reflect common doctrines and values. Here the dissident views can be more correct, or nearer to the truth, than the popular ones.
Sometimes I just stop and think about what it is I’m reading or seeing. A lot passes into people’s heads because they don’t even consider that it might be untrustworthy.
I suspect being able to detect irony might be a related skill, because it also requires questioning and assessing the information, and being open to the idea that it’s simply a joke.