More on Research Follies

April 16, 2023 | Commentary

If you want to understand issues in good research, read this entire study.

Good research is hard: designing a study to avoid confounders, bias, and other issues, collecting good data, and doing solid statistical analysis. It isn't hard, however, when you know what you want to find and just make everything up to get there. That is very, very, very easy in the so-called "social sciences". This is an interesting study in which 73 research teams were given the same set of survey data on attitudes toward immigration. Despite many of the teams using similar statistical methods, they reached widely varying results on whether more immigration reduced or increased support for social policies such as public pensions, health care, and housing. Here is the money quote: "research teams reported both widely diverging numerical findings and substantive conclusions despite identical start conditions". The varying numerical findings are the real head-scratcher, but again, you can find what you want to find. While the authors attribute the outcome to hidden factors, I strongly suspect that if you look more closely at the researchers' beliefs, you will find a correlation with the results they reported. But I think the other important lesson is the inherent uncertainty in statistical analysis and model building. Raw data often tells you as much as, or more than, all the sophisticated statistics do. (PNAS Study)
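How two defensible analyses of the same data can reach opposite conclusions is easy to see in a toy simulation. This is a minimal sketch with made-up data (not the PNAS study's): the true effect of `x` on `y` is negative, but a lurking group variable `g` drives both upward, so a model that omits `g` reports a positive slope while a model that includes it reports the correct negative one.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical data-generating process: g is a confounder that raises both
# x and y; the true direct effect of x on y is -1.
g = rng.integers(0, 2, n).astype(float)
x = 2.0 * g + rng.normal(0.0, 0.5, n)
y = -1.0 * x + 4.0 * g + rng.normal(0.0, 0.5, n)

def first_coef(X, y):
    """OLS coefficient on the first column of X (intercept appended)."""
    X = np.column_stack([X, np.ones(len(y))])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[0]

naive = first_coef(x, y)                        # specification 1: ignore g
adjusted = first_coef(np.column_stack([x, g]), y)  # specification 2: control for g

print(f"naive slope: {naive:+.2f}   adjusted slope: {adjusted:+.2f}")
```

Both regressions are "solid statistics" on identical data; the sign of the answer is decided entirely by the modeling choice, which is the kind of hidden analytic freedom the study documents.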

One Comment

  • Susan Wasson says:

    My sister is a veterinarian with a PhD in population genetics. I once griped to her about the garbage concept of “relative risk” and how it is used to trick stupid doctors into prescribing minimally effective treatments.

    Her response was, “maybe they make mountains out of molehills because there aren’t any mountains left to find.”

    When given relative risk, I always compute actual risk from the data provided. NNT is also helpful. Unfortunately, none of this does any good if the drug company behind the research throws out any data not supporting its conclusion. John Ioannidis should get a lot more exposure for his work on the lack of replicability in the medical literature.
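The arithmetic behind the comment's point is simple. This is a minimal sketch using hypothetical trial numbers (not from any real study) showing how a headline "50% relative risk reduction" can correspond to a 1-percentage-point absolute risk reduction and a number needed to treat of 100:

```python
# Hypothetical trial counts, for illustration only.
control_events, control_n = 20, 1000   # 2.0% event rate without the drug
treated_events, treated_n = 10, 1000   # 1.0% event rate with the drug

arc = control_events / control_n       # absolute risk, control arm
art = treated_events / treated_n       # absolute risk, treated arm

relative_risk = art / arc              # 0.50 -> "cuts risk in half"
rrr = 1 - relative_risk                # relative risk reduction: 50%
arr = arc - art                        # absolute risk reduction: 0.01 (1 point)
nnt = 1 / arr                          # number needed to treat: 100

print(f"RR={relative_risk:.2f}  RRR={rrr:.0%}  ARR={arr:.1%}  NNT={nnt:.0f}")
```

The relative figure sounds dramatic; the absolute figures tell you that 100 patients must be treated for one to benefit, which is the mountain-versus-molehill distinction the comment is making.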