Epistemic status: parody well-intentioned with known mistakes (see comments on LessWrong), more confident about clarification at the end
Followup to: Strong Evidence is Common
One time, someone asked me if I was evil. I said "no". Afterward, they probably still weren't convinced I wasn't evil. I'm guessing they wouldn't have happily accepted a bet at 20:1 odds that I don't deliberately hurt people.
The prior odds that someone is evil are realistically 1:10. If their posterior odds are still above 1:20, then the likelihood ratio of me saying "no" must be less than 2:1, or roughly 1 bit of evidence. That's a little bit of evidence.
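The odds-form arithmetic above is short enough to check directly. A minimal sketch, using the post's assumed numbers (1:10 prior, 2:1 likelihood ratio):

```python
from math import log2

# Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
# Assumed prior odds that someone is evil (from the text): 1:10.
prior_odds = 1 / 10

# Suppose "no" is twice as likely from a non-evil person as from an evil one,
# a 2:1 likelihood ratio against being evil (so it halves the odds of evil).
likelihood_ratio = 1 / 2

posterior_odds = prior_odds * likelihood_ratio
print(posterior_odds)   # 0.05, i.e. odds of 1:20
print(log2(2))          # 1.0 -- a single bit of evidence
```

Anything weaker than that 2:1 ratio leaves the posterior odds above 1:20, which is why the listener shouldn't take the bet.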
Seeing an Onion headline say "X did Y" is teeny evidence that X didn't Y. Someone telling you "I am not gay" is small evidence that they are not gay. Putting a complicated, unclear technical claim into ChatGPT and getting "correct" is little evidence that the claim is correct. Tiny odds ratios lurk behind many encounters.
On average, people are overconfident, but 12% aren't. That's prior odds of 12:88, so it takes roughly 8:1 evidence just to conclude someone isn't overconfident. Asking if they're overconfident and getting "yes" won't be enough, and might be negative.
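The same odds-form calculation, with the 12% figure from the text, shows how much evidence is needed just to reach even odds:

```python
from math import log2

# Assumed prior (from the text): 12% of people aren't overconfident.
prior_odds = 0.12 / 0.88   # odds of "not overconfident", about 1:7.3

# Likelihood ratio needed to push those odds up to even (1:1):
needed_lr = 1 / prior_odds
print(round(needed_lr, 1))        # 7.3 -- call it roughly 8:1
print(round(log2(needed_lr), 1))  # 2.9 bits of evidence
```

And a "yes" to "are you overconfident?" is, if anything, slightly more likely from the non-overconfident, which is why it can count as negative evidence.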
Running through Bayes' Rule explicitly might produce a bias towards significant values. Extraordinary claims require extraordinary evidence, and a lot of the evidence you get is quite ordinary.
This is not to contradict the original, but to remind you not to take it too far.
Strong evidence is common (but certainly not guaranteed) for questions with mostly-honest sources and tiny priors, such as asking someone their name.
In many of those cases, the priors are tiny since there are thousands or millions of plausible options you'd consider before the evidence.
Weak evidence is common when the source of evidence is likely to lie or be stupid, and that situation seems to be especially common with binary questions.