Scientific misconduct: The ultimate negative career move


  • The views here expressed are solely those of the author and are not meant to reflect the policy of either Wiley-Liss, Inc., or the American Association of Anatomists.

Science functions as the engine driving technologic advances because, at its core, science is objective. In fact, objective reporting of experimental data is the guiding principle of scientific investigation. The integrated circuit does, or does not, perform to specifications. The monoclonal antibody does, or does not, recognize an epitope signal over background. A breast cancer drug trial does, or does not, show the drug to confer significant benefit beyond that of placebo.

Honesty in experimental design, data collection, and reporting of results is a necessity for scientific objectivity. Some experimental results may only hint at a complete answer — or at worst, appear uninformative. Incremental answers themselves will likely beget even more questions that require answers. This progression is fundamental to the scientific method: each datum, no matter how insignificant, builds upon its predecessors and likewise forms a foundation for further investigation. This is how the Truth, in scientific terms, is discerned. Should research be based on questionable or falsified data generated in the past, then scientific truth—and therefore progress—become suspect.

Or so I was taught in graduate school, at our annual Symposia on Scientific Integrity. These symposia were organized in the early 1990s, in the wake of several high-profile cases of scientific misconduct publicized widely in the popular and scientific media. Attendance was “mandatory” for all students, postdoctoral fellows, and faculty (somehow, technicians were omitted). I say “mandatory” because I knew of several faculty who at the time never managed to attend; perhaps they were insulted that our respected institution would even dream they could fall victim to the temptation of falsifying data, plagiarizing text from an obscure source, or knowingly misrepresenting authorship on a publication. These egregious offenses to the principles of science were simply things that happened in the other lab, at the other institution, and not in ours.

Two recent, high-profile misconduct cases surely left researchers in France and Texas scratching their heads in stunned disbelief that blatant offenses to the scientific method could have occurred at their own institutions. An article in this issue of The New Anatomist discusses a powerful and reproducible new method for digitally manipulating images for scientific purposes. However, Gilbert and Richards (2000) caution about the ethics of image modification by describing a recent case of digital malfeasance. It was uncovered when editors at Nature noticed striking similarities between a submitted photograph of a new species of Indonesian coelacanth, which a French group claimed to have captured, and a photo of a Southeast Asian specimen caught by other researchers and published in Nature a few years earlier. The image submitted by the French group was shown to be a forgery.

Even more unsettling is the recent case of a University of Texas immunologist who had compulsively fabricated data throughout his graduate school and postdoctoral years (Malakoff, 2000). The researcher's fraud was uncovered after he had moved on to an industrial position. At first, his former university colleagues pursuing similar research were unable to reproduce his experimental results, which had already been published. They invited him back to the lab as an untenured faculty member to let him personally reproduce his data. Having reason to suspect foul play, they devised a sting operation and caught him generating fraudulent data. Officials of the federal Office of Research Integrity found considerable evidence that he had been systematically falsifying data since he was a graduate student, notably with samples sent to him by outside collaborators. He had to retract at least four published articles and part of another report, thus infuriating former colleagues and leaving holes in the published literature. His career (deservedly) unraveling, he has also been barred from receiving federal research grants, may incur financial penalties, and could have his 1996 Ph.D. revoked.

There is no place for this kind of dishonest behavior at any level of scientific research, or at any level of society. Not only does it violate the law, it shows utter disrespect to one's colleagues, collaborators, competitors, and the scientific method. Even more contemptible: each instance of misconduct spreads mistrust and cynicism like a contagion, causing yet more scientists to question deeply the veracity of the published literature on which they rely for their own research.

Since the coelacanth episode, for example, I cannot look at an image in either a submitted or published manuscript without briefly wondering if maybe, just maybe, the image had been fabricated. What's more, the public, who pay for the majority of research through taxes, will become less and less willing to support a research enterprise that appears rife with lies, exaggerations, and misconduct — never mind the fact that the vast majority of researchers are reputable and honest.

That an air of cynicism and mistrust in science is palpable today may be no surprise to some, especially in an age when even the highest elected officials in the land, and some candidates for these offices, seem to make or break the rules of honesty and professional conduct on a whim. No one is totally immune to temptation, but how can we help prevent otherwise intelligent scientists from succumbing to the stupid negative career choice so gently called “scientific misconduct”? How can the temptation of misconduct be most effectively combated? Surely in training young researchers we have the opportunity to squelch such temptations, by consistently demanding high standards both in the classroom and in publication, but also by targeting the issue specifically.

This raises some important questions. Should institutions, for example, be required to develop accredited ethics education programs, such as were established at my graduate school? Should they also develop “scared straight” programs in order to inculcate a knee-jerk response away from the temptation to tweak or fudge or fix or enhance? Should all faculty be required to sign a pledge stating that they will fight for truth in the lab and brand the souls of their students with righteousness?

It is insulting to me as a scientist that I even feel the need to make the above suggestions, let alone write this editorial. I know what is right and what is wrong, as does everyone reading these words. As an editor, in order to sleep at night I must believe that honesty reigns supreme among the authors and reviewers on whom I rely. I must believe that the dishonest are few and far between.

But facts are facts: fabrication and falsification subtly pervade our scientific enterprise, apparently at a growing rate. Fame, fortune, or even increasing pressure from the competition for ever-smaller slices of the research funding pie are simply not excuses for dishonesty. At the same time, draconian measures that establish mistrust of scientific professionals as the norm are hardly an acceptable remedy.

Perhaps scientific societies can help by broadcasting news regarding misconduct cases to their membership or taking active roles in integrity education at their annual meetings. In the end, however, we can only hope that the scientific method weeds out all cases of scientific fraud, as it did for the two cases mentioned above. The more that would-be denigrators of scientific integrity realize that they will ultimately be caught, the more likely it is that they will not succumb to the temptation of breaking the rules of conduct in the first place.