Biased research: Science and the quest for pure objectivity

Even scientific research is prone to bias, whether conscious or unconscious

“Our number one mission is to be objective, but various unconscious influences on the scientific mind make bias-free research virtually impossible,” says Luke O’Neill, professor of biochemistry at TCD. Photograph: iStockphoto


Nullius in Verba is the motto of the Royal Society, the oldest scientific organisation of its kind in existence. It translates as “Take no one’s word for it” and embodies the noble principle that all scientists should strive for objectivity in their work.

Sadly, striving is all anyone can hope for. Human bias affects all scientific research, whatever the study, without exception. Whether it’s conscious or unconscious, explicitly or implicitly motivated by an agenda, the pursuit of pure objectivity – for now at least – is as futile as a lab rat’s efforts to outrun a spinning wheel.

“The greatest trick the devil ever pulled was convincing the world he didn’t exist.” This quote from the 1995 movie The Usual Suspects makes sense in this context. Pure objectivity is not attainable but, in some respects, we must continue to fool ourselves into believing it is. That’s according to Luke O’Neill, professor of biochemistry at Trinity College Dublin, who was elected a fellow of the Royal Society in 2016.

“Our number one mission is to be objective, but psychologists have suggested various unconscious influences on the scientific mind make bias-free research virtually impossible,” he says.

There are emotionally charged subjects – such as climate change or the question of when life begins in the context of abortion – where human bias is brazenly present in the overall corpus of research.

Cognitive bias

However, bias has an impact on seemingly uncontroversial research areas too. False findings in a multitude of disciplines are often later uncovered, not because of some avoidable error made in the research process, but because scientists may have unwittingly fooled themselves. Cognitive biases, or common ways of thinking that lead one to believe incorrect but convenient – or in some cases more attractive – conclusions, can happen to us all.

“Motivated reasoning”, or the interpretation of observations to fit a particular mode of thinking, is one such example.

“I think this kind of bias lies at the root of what has been called the ‘replication crisis’ within social psychology,” says Art Markman, a psychology professor at the University of Texas in Austin. “There have been numerous studies in the last decade with ‘headline-grabbing’ results that, ultimately, couldn’t be replicated elsewhere after they were published.”

The standard-bearer for these kinds of studies, says Markman, is research conducted by John Bargh, professor of psychology at Yale University’s Acme Lab. “He and his colleagues demonstrated that if you had college students unscramble sentences containing words relating to the stereotype for old people, the participants walked more slowly when leaving the lab than if they unscrambled sentences that did not relate to a stereotype.”

A plethora of other studies in “social embodiment” carried out elsewhere have failed to replicate, rendering their findings at best questionable and at worst invalid.

One study suggested if you asked people to create facial gestures signalling pleasure or displeasure, it made people’s ratings of liking an item stronger or weaker. Another suggested when subjects held a heavy clipboard, it made them feel more important. There was even a study that suggested people who washed their hands after performing an ethically questionable act judged themselves less severely in terms of how guilty they felt.

“No one is suggesting the researchers were guilty of committing fraud, rather they were interpreting their findings in the best possible light in order to make a splash,” says Markman.

Hard habit to break

Bias is just as prevalent in the “hard” sciences too. A couple of years ago, a bacterium that supposedly lived on arsenic was found in Mono Lake in California. The research discovery, published in 2010, made the front pages of newspapers worldwide but, two years later, it was revealed (to much less media fanfare) that the bacterium in the lake had in fact been living off phosphorus deposits.

“I think scientists really wanted this one to be true because if it were, it would have had major consequences for our understanding of what supports life, the question of extraterrestrials etc,” says Prof O’Neill.

Prof O’Neill isn’t surprised, though, that there was little or no media attention when the discovery of the “arsenic life bacterium” was found to be bogus.

“Negative evidence doesn’t interest most scientists, even less the public. It is human nature to be motivated towards positive results. You’re far more likely to get a paper published showing which pathogen causes a certain disease, than a paper listing all the pathogens that don’t cause the same disease. But that research is potentially just as important.”

Prof O’Neill suggests the metric for scientific “success” needs to change.

“We must move away from publication as the ultimate aim of research to a model where the primary focus of researchers upon completion of a study is to find another lab to successfully replicate their study in a different setting. This won’t negate the impact of bias entirely but it will bring us one step closer to truly living up to the motto Nullius in Verba.”

Make it interesting: The biggest bias?

Researchers at the Hobby-Eberly Telescope Dark Energy Experiment (Hetdex) at McDonald Observatory in west Texas don’t have to worry about the natural bias to study what interests them. The research team have just begun a long-term study of the darkness in between all the fun stuff in space – stars, planets, gases and so on – in the hope of learning more about dark energy, an unknown form of energy believed to permeate all of space and to be the driving force behind the accelerating expansion of the universe.

“All astronomers have preconceived notions about what they want to study, what objects are interesting,” says Matthew Shetrone, senior research scientist at Hetdex. “What we’re doing is pointing our telescope towards seemingly blank parts of the sky, and if we discover something interesting, like a star or something other than what we understand to be dark energy, we discard it and move on to another blank spot in the sky in the hope we see nothing.”

How does bias affect research as apparently neutral as this? First of all, determining what’s interesting from what is not is ultimately a human decision. Pointing the telescope in one direction or another on a given night is also subjective, regardless of any explicit intention. Even the location of the telescope itself in the northern hemisphere impacts upon the objectivity of any research output from Hetdex.