A typical reaction to global warming skepticism is to point to all the institutions that endorse global warming and argue that, if it were false, a grand conspiracy would be required.
I argue that no conspiracy is needed; it is enough for incentives to align in a certain direction. The awarding of grants, the publication of papers and the media attention all point one way, and there is positive feedback between them.
As reported in the New York Times, Diederik Stapel literally made up results of psychological experiments that were never done. It is not necessary to go quite that far.
Fraud like Stapel’s — brazen and careless in hindsight — might represent a lesser threat to the integrity of science than the massaging of data and selective reporting of experiments. The young professor who backed the two student whistle-blowers told me that tweaking results — like stopping data collection once the results confirm a hypothesis — is a common practice. “I could certainly see that if you do it in more subtle ways, it’s more difficult to detect,” Ap Dijksterhuis, one of the Netherlands’ best known psychologists, told me.
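The practice Dijksterhuis describes, stopping data collection as soon as the results confirm the hypothesis, measurably inflates false positives. A minimal simulation of my own (not from the article) tests pure noise two ways: once at a fixed sample size, and once "peeking" at the p-value after every batch and stopping on the first significant result. For simplicity it uses a z-test with known variance 1.

```python
import math
import random

def z_pvalue(sample):
    # Two-sided z-test against mean 0, assuming known variance 1.
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n)
    # p = 2 * (1 - Phi(|z|)), with Phi computed via erf.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def false_positive_rate(peeking, trials=2000, batch=10, max_n=100, alpha=0.05):
    random.seed(0)
    hits = 0
    for _ in range(trials):
        data = []
        significant = False
        while len(data) < max_n:
            # Collect another batch of pure noise -- the null hypothesis is true.
            data.extend(random.gauss(0, 1) for _ in range(batch))
            if peeking and z_pvalue(data) < alpha:
                significant = True  # stop as soon as the result "confirms" it
                break
        if not peeking:
            significant = z_pvalue(data) < alpha  # test once, at full n
        if significant:
            hits += 1
    return hits / trials

print("fixed n: ", false_positive_rate(peeking=False))  # close to the nominal 0.05
print("peeking: ", false_positive_rate(peeking=True))   # substantially inflated
```

With these settings the honest procedure rejects at roughly the nominal 5% rate, while the peeking procedure rejects the true null several times as often. Nothing in any individual dataset is fabricated; the fraud is entirely in the stopping rule.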
Journals and reviewers can play a part:
If Stapel was solely to blame for making stuff up, the report stated, his peers, journal editors and reviewers of the field’s top journals were to blame for letting him get away with it. The committees identified several practices as “sloppy science” — misuse of statistics, ignoring of data that do not conform to a desired hypothesis and the pursuit of a compelling story no matter how scientifically unsupported it may be.
The adjective “sloppy” seems charitable. Several psychologists I spoke to admitted that each of these more common practices was as deliberate as any of Stapel’s wholesale fabrications. Each was a choice made by the scientist every time he or she came to a fork in the road of experimental research — one way pointing to the truth, however dull and unsatisfying, and the other beckoning the researcher toward a rosier and more notable result that could be patently false or only partly true. What may be most troubling about the research culture the committees describe in their report are the plentiful opportunities and incentives for fraud. “The cookie jar was on the table without a lid” is how Stapel put it to me once. Those who suspect a colleague of fraud may be inclined to keep mum because of the potential costs of whistle-blowing.
So there are incentives to take the easy path of painting a simple, neat picture, because it is more persuasive and more saleable.
Stapel did not deny that his deceit was driven by ambition. But it was more complicated than that, he told me. He insisted that he loved social psychology but had been frustrated by the messiness of experimental data, which rarely led to clear conclusions. His lifelong obsession with elegance and order, he said, led him to concoct sexy results that journals found attractive. “It was a quest for aesthetics, for beauty — instead of the truth,” he said.
What the public didn’t realize, he said, was that academic science, too, was becoming a business. “There are scarce resources, you need grants, you need money, there is competition,” he said. “Normal people go to the edge to get that money. Science is of course about discovery, about digging to discover the truth. But it is also communication, persuasion, marketing. I am a salesman.”
It is not just money; the rewards also include the respect and admiration of one’s peers. In my talk on open source software on Friday I mentioned that this is one of the reasons individuals give away their source code or donate their time to open source projects. It feels good to make something that others find impressive.
I am lucky enough to work in software. There, the most aesthetically pleasing solution is usually the best one. And software cannot easily be faked; it becomes apparent very quickly when it does not work. I can imagine software that appears to do what it claims to do without actually doing it, such as an encryption program that leaks your secrets. Open source software has largely solved this problem. In fact, science could learn a lot from open source software.
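The "encryption program that leaks your secrets" is easy to construct. Here is a deliberately backdoored toy of my own invention, not a real program: to a user who only sees inputs and outputs, it looks like a working cipher, but a single read of the source reveals that the key is shipped along with the ciphertext. Open review of the source is exactly what catches this.

```python
import secrets

def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Looks like encryption: XOR each byte with a repeating key stream.
    body = bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))
    # The backdoor: the key is quietly prepended to the output,
    # so anyone who intercepts the "ciphertext" can decrypt it.
    return key + body

key = secrets.token_bytes(16)
ct = xor_encrypt(b"attack at dawn", key)

# An eavesdropper, never given the key, recovers it straight from the output:
leaked_key = ct[:16]
recovered = bytes(b ^ leaked_key[i % 16] for i, b in enumerate(ct[16:]))
print(recovered)  # prints b'attack at dawn'
```

Black-box testing would never flag this; the ciphertext even looks random. The only reliable defense is that the source is open and someone bothers to read it, which is the analogue of reviewers demanding raw data and methods rather than trusting a tidy result.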