It’s not just galaxies and atoms. It’s your doctor with the results of a cancer biopsy. It’s your tax advisor telling you not to claim that deduction. It’s your investment advisor telling you whether you can ever afford to retire. It might even be your decision that you can exceed the speed limit by 10 mph this time without risking a citation.
These are examples of scientific truth. They’re all instances where we rely on someone’s years of expertise and knowledge of statistics to advise us in an area where the answers are not black or white. Scientific truth is not simply a NASA report of finding water on Mars, which some bozo will deny anyway. Science is a way to be happy by connecting with the real world.
Some people want scientific truth to be Science’s answer to Faith. That’s a waste of time. Religion gives the certainty of revelation and it’s 100% true within the world of its believers. Scientific truth, on the other hand, is never achieved. Science may be 50% or 95% confident, but it is never 100% certain about anything.
Nevertheless, we rely on scientific truth every time we read the warning label on a medication or read a nutrition label at the grocery store. So it’s worth asking, what is scientific truth, exactly? More to the point, when “experts” claim something is true, or mostly true, when can we believe them?
An interesting new piece of research bears on scientific truth and what we can believe. It’s reported in PLOS Medicine, an online journal.
A group of Spanish and German researchers with lead author Maira Bes-Rastrollo looked at the association between sugar-sweetened beverages and weight gain. They asked the question, if researchers are financed by the beverage industry, does it bias the results of their research?
You can anticipate the answer: a resounding YES. But this is not a flippant essay by some New Age revolutionary who had the answer before he even began. The researchers made a careful study with controls to protect against their own biases. Here’s how they proceeded:
– They searched three major research databases for articles in English, Spanish or French relating to soft drinks and weight; of the 405 articles found, 17 were “systematic reviews” that purported to objectively analyze a large number of research publications.
– Two researchers, blinded to the review authors’ funding sources and stated conflicts of interest, independently extracted the eighteen “conclusions” stated in the 17 articles.
– A third researcher independently found 93.3% agreement between the two sets of conclusions; this allowed sorting the articles into those that did, and those that didn’t, find an association between drinking sugary beverages and weight gain.
– Another researcher, without seeing any of the article content, classified the articles according to their association with the beverage industry, based on funding or stated conflicts of interest.
In other words, each researcher operated “blind,” drawing limited conclusions without being exposed to other information that might have biased the researcher’s own analysis.
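The cross-check between the two blinded extractors is a simple inter-rater agreement calculation. Here is a minimal Python sketch of that idea; the labels below are made up for illustration (the paper’s actual conclusions and its 93.3% figure are not reproduced here):

```python
# Percent agreement between two independent, blinded raters.
# Labels are hypothetical examples, not the study's actual data:
# "assoc" = concluded sugary drinks are linked to weight gain,
# "none"  = concluded no association.
rater_a = ["assoc", "assoc", "none", "assoc", "none"]
rater_b = ["assoc", "assoc", "none", "none",  "none"]

matches = sum(x == y for x, y in zip(rater_a, rater_b))
agreement = matches / len(rater_a)
print(f"agreement: {agreement:.1%}")  # 4 of 5 labels match
```

In the actual study, disagreements were resolved by a third researcher; a more rigorous version would report Cohen’s kappa, which corrects for agreement expected by chance.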
When the results were assembled, here’s what the researchers found:
– 61% of the conclusions (11 out of 18) found an adverse association between sugary beverage consumption and obesity; none reported any significant benefit.
However, when the results were sorted by financial ties to the beverage industry, there was a decided tilt in the conclusions:
– 6 studies, with 6 conclusions, had evident financial ties to beverage manufacturers (although three of them stated that the funding source did not affect their results!). 83% of these conclusions (5 out of 6) found no association between soft drinks and obesity.
– 11 studies, with 12 conclusions, revealed no conflict of interest. 83% of these conclusions (10 out of 12) found that sugary drinks do cause weight gain.
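The counts above form a 2×2 table, funding source versus conclusion, and the lopsidedness is easy to check with a little arithmetic. The Python sketch below uses the conclusion counts quoted above; the Fisher exact test is my own illustration of how one might test the split, not a calculation taken from the paper:

```python
from math import comb

# 2x2 table of conclusions (not studies), from the counts above:
#                    found association   found none
# industry-funded           1                 5
# independent              10                 2
a, b = 1, 5    # industry-funded conclusions
c, d = 10, 2   # independent conclusions

# Proportion of each group concluding there is an association
print(f"industry-funded: {a/(a+b):.0%} found an association")
print(f"independent:     {c/(c+d):.0%} found an association")

# One-sided Fisher exact test: the probability of seeing this few
# "association" conclusions among the industry-funded group purely
# by chance, given the row and column totals (hypergeometric sum).
n_total = a + b + c + d          # 18 conclusions
k_assoc = a + c                  # 11 found an association
n_row = a + b                    # 6 industry-funded
p = sum(comb(k_assoc, k) * comb(n_total - k_assoc, n_row - k)
        for k in range(a + 1)) / comb(n_total, n_row)
print(f"one-sided Fisher exact p = {p:.4f}")
```

With these counts the exact p-value comes out around 0.013, i.e., a split this extreme would arise by chance only about once in eighty times, which is why the tilt is hard to dismiss even with so few studies.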
As Berkeley Wellness summarized, the results are “disconcerting, but not-so-surprising.” An essay discussing bias in biomedical research found a similar result, concluding that bias in scientific research is a pervasive problem.
Regardless of how hard researchers try to be objective, and regardless of what they claim in their articles, bias creeps into the work, which is unhappy news for anyone seeking truth. Here is the main finding of the research:
Scientific truth is biased by who is paying for the work.
What a lousy, disheartening result! No wonder only 30% of Americans trust scientists “completely” or “a lot.” This 30% level of trust places scientists just above auto mechanics and below nursing home operators.
So if truth, absolute truth, is not what science is offering, exactly why is scientific truth of any use to us? Here we must leave the realm of fact and resort to speculation instead.
Science Speculation: Scientific truth is like infinity or heaven: something unattainable here on Earth. An ideal. However, even if it’s not attainable, scientific truth still points a direction. If we can look at the stated results and weed out the biases, we may find a truth that is “true enough” to be useful, or at least truer than what we previously believed.
Scientific truth, like other kinds of knowing, can be judged by its source. If the subject is something we really care about, such as whether our drinking water is infected or whether our child needs an expensive medical treatment, we want to know at least three things about the people giving the opinion: their motives, their knowledge and their honesty.
Sometimes we can judge motive simply by following the money. Does the expert gain financially from what he is saying? Will the surgeon be able to buy a speedboat with the money from the back surgery he is recommending to me? But sometimes motive is more subtle. The platelet scientists featured in the previous blog are principals in a company whose financial success may depend in part on what they say about their work. That is not necessarily a bad thing, because who has a better chance to make a new technology succeed than the people with the deepest understanding of it? Other experts may have a big psychological investment in being right or being seen as a guru, and that may affect their pronouncements.
For scientific publications, we generally rely on peer review to judge the accuracy of the work. We also judge knowledge level by the reputation of the speaker or her institution. However, as discussed in earlier blogs (such as STEM Education #2: Politics Trumps Science), the opinions of friends and role models strongly affect our judgment in matters of fact and science. We have to guard against our own bias as well as the bias of experts!
We usually assume that an expert is speaking honestly, and I would hate to live in a world where we could not safely make that assumption. However, sometimes people, including scientists, shade the truth or out-and-out lie for reasons of their own. You may recall the recent controversy about Korean stem cell research, but no nationality or scientific field is proof against dishonesty. Perhaps the surest way to judge honesty is to first verify that motive is clean and knowledge is sufficient; then, rely on our previous experience with the expert, either personally or via friends. That approach is not sure proof against a pathological liar or someone who places no value on the truth, but it may be the best we can do in practice.
Have you ever been misled by a supposed “scientific truth”?
Drawing Credit: johnny automatic, on openclipart.org