
Tips To Interpret Scientific Claims

Don't we read and talk about research findings far more often than we critically discuss the methods that produced them?

Trained in clinical epidemiology, I believe we should discuss methods more, especially before we change clinical protocols for treatments and diagnostics. EBM!

Nature published a very nice article on how to interpret research claims. It was aimed at non-scientists, but I think the advice is worth highlighting for a medical audience as well. (Read the article in full text here.)

The 20 tips are...

  1. Chance causes variation (results can be due to chance)
  2. No measurement is exact (as if we didn't know)
  3. Bias is rife (it certainly is)
  4. Bigger is usually better for sample size (yes!)
  5. Correlation does not imply causation (we all know this, but we tend to forget that)
  6. Regression to the mean can mislead (it does)
  7. Extrapolating beyond the data is risky (and puts patients at risk)
  8. Beware the base-rate fallacy (it is hard to diagnose uncommon conditions)
  9. Controls are important (or rather, they are essential, and it is essential to select controls right)
  10. Randomization avoids bias (or at least reduces bias)
  11. Seek replication, not pseudoreplication (research needs to be replicated)
  12. Scientists are human (and therefore imperfect)
  13. Significance is significant (but confidence intervals are more important than p-values)
  14. Separate no effect from non-significance (absence of evidence is not evidence of absence)
  15. Effect size matters (but remember that effects tend to decrease with study size, i.e. the world is not as good as it seems to be in small trials)
  16. Study relevance limits generalizations (i.e. don't generalize findings among 33-weekers to 23-weekers)
  17. Feelings influence risk perception (that's why we tend to be more afraid in a plane than in a car, despite the higher risk of dying in a car)
  18. Dependencies change the risks (some factors or events are related, in additive or multiplicative ways)
  19. Data can be dredged or cherry-picked (see #12)
  20. Extreme measurements may mislead (and usually do not have a single cause)
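Tip 8, the base-rate fallacy, can be made concrete with a quick Bayes calculation. The prevalence, sensitivity, and specificity below are made-up illustrative numbers (not from the Nature article), but they show why it is hard to diagnose uncommon conditions: even with a quite accurate test, most positives are false when the condition is rare.

```python
# Illustrating the base-rate fallacy (tip 8) with assumed numbers:
prevalence = 0.001      # assume 1 in 1000 patients has the condition
sensitivity = 0.99      # P(test positive | disease)
specificity = 0.95      # P(test negative | no disease)

# Overall probability of a positive test (true + false positives)
p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Bayes' theorem: probability of disease given a positive test
ppv = sensitivity * prevalence / p_pos

print(f"P(disease | positive test) = {ppv:.1%}")  # about 1.9%
```

So despite 99% sensitivity, a positive result here means less than a 2% chance of disease, which is the intuition the tip warns us we lack.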
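Tip 6, regression to the mean, is also easy to demonstrate in a short simulation (my own sketch, not from the article). If we select patients because their first measurement was extreme, their second measurement will look "improved" even with no treatment at all, simply because measurement noise does not repeat itself.

```python
import random

random.seed(1)

# Everyone has the same true value; both measurements are just noise around it.
n = 10_000
true_value = 100.0
m1 = [true_value + random.gauss(0, 10) for _ in range(n)]
m2 = [true_value + random.gauss(0, 10) for _ in range(n)]

# Select "abnormal" patients based on the first measurement only
extreme = [(a, b) for a, b in zip(m1, m2) if a > 115]

avg_first = sum(a for a, _ in extreme) / len(extreme)
avg_second = sum(b for _, b in extreme) / len(extreme)

print(f"first measurement (selected):  {avg_first:.1f}")   # well above 115
print(f"second measurement (same pts): {avg_second:.1f}")  # back near 100
```

The apparent "treatment effect" between the two measurements is pure selection artifact, which is exactly what makes before/after comparisons in extreme groups so misleading.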
