How scientists succumb to corruption and cook results
Horrible bosses can cause misery in any kind of business, but in science, they wield uniquely destructive power. In a recent survey compiled by the journal Nature, a number of young scientists reported that they felt pressured to find “particular results” that would presumably please their bosses, as opposed to the truth. That’s a problem for society at large, since it degrades the integrity of research that we’re supporting.
Last week, a number of experts weighed in for a special section of the journal devoted to the interconnected problems of bad laboratory leadership and compromised research integrity. One reason some experts see more integrity trouble now is that there's a growing power chasm in science. So many young people train to be scientists that only a small fraction of those earning Ph.D.s will be able to get the most coveted jobs — tenure-track academic positions. And those who do may still have to spend years in low-paying, insecure post-doctoral fellowships. It's a source of cheap labor, but with a hidden cost.
In an ideal situation, people should be able to pick their bosses and colleagues based on character, said C.K. Gunsalus, who heads the National Center for Professional and Research Ethics at the University of Illinois. It’s tricky to pick your bosses in fields where there’s a prevailing sense that you’re lucky to have a job at all.
Scientific integrity became a public issue in the 1980s after a couple of dramatic cases involving famous researchers, but experts say they’ve come to recognize that while the press still focuses on the lone bad actors, most cases involve multiple researchers — some cutting corners or skewing results, and others going along with it.
“Most of us believe it’s the smaller lapses of integrity that are more important than the few cases of serious misconduct,” said Nicholas Steneck, who studies research integrity at the University of Michigan. The Nature survey, which included more than 3,200 scientists, revealed something about how people can collectively get into trouble. About one in five junior scientists called their workplaces “stressful,” “tense” or “toxic.”
The journal reported that more than half of the survey respondents said they felt pressured to get a “particular result.” The bosses — or principal investigators, as they’re usually known — responded much more positively about their own leadership skills than did those working under them. Those in junior positions were much more likely to report that it wasn’t clear what was expected of them, that they lacked communication with their bosses, and that they were encouraged to cut corners.
Gunsalus, the University of Illinois integrity expert, co-authored a piece on misconduct for the Nature section, citing three examples from her own investigations. In one, a student was wrongly listed as an author on a paper that later became the subject of an ethics investigation; in another, a famous researcher was credited despite having no real involvement in the research. In the third, two junior scientists quit a lab team after noting that the raw data from some medical scans didn't support the conclusion their boss was promoting.
She said she believes that most people who breach standards in science do not set out to cheat: “Nobody wakes up in the morning and says, ‘This is the day I decide to screw it all up.’” Junior people have good intentions, she said, but don’t know what to do when their superiors do something wrong — a problem she’s detailed in writings on how to be a whistleblower and survive.
Gunsalus has broken down nine steps that lead to misconduct, which roughly make the apt acronym "tragedies": temptation, rationalization, ambition, group and authority pressure, entitlement, deception, incrementalism, embarrassment, and stupid systems. The competitive nature of science and the egos of the leaders can set up both temptation and rationalization. Those without power may think they need to cheat to get by, while those with more clout may be sure they know the answers to scientific problems, and feel the ends justify less-than-honest means. The last part, stupid systems, describes the publish-or-perish culture, which encourages quantity of publications over quality.
Incrementalism is one of the most interesting elements: the tendency to start with small integrity violations and gradually ramp up. A litany of small charges, for example, ultimately led to a New York Times takedown of a powerful cancer biology lab at Ohio State University.
The experts agreed that the power disparities in science can lead to power abuses, and that there’s little or no training in leadership offered to those who find themselves heading groups. To make matters worse, said Gunsalus, in academic circles people are often suspicious of anyone who actively wants a leadership position — even if that person has a gift for management. It’s seen as careerist, she said.
For younger people, getting caught up in toxic work situations or groups with integrity problems is indeed tragic. “There never comes a time in your career,” said Gunsalus, “where you say, ‘Now I have enough power that I can start being ethical.’”