April 2014


Social scientist outed for “massive fraud”:

When colleagues called the work of Dutch psychologist Diederik Stapel too good to be true, they meant it as a compliment. But a preliminary investigative report released on October 31 gives literal meaning to the phrase, detailing years of data manipulation and blatant fabrication by the prominent Tilburg University researcher.

“We have some 30 papers in peer-reviewed journals where we are actually sure that they are fake, and there are more to come,” says Pim Levelt, chair of the committee that investigated Stapel’s work at the university.

Stapel’s eye-catching studies on aspects of social behaviour such as power and stereotyping garnered wide press coverage. For example, in a recent Science paper (which the investigation has not identified as fraudulent), Stapel reported that untidy environments encouraged discrimination (Science 332, 251-253; 2011).

“Somebody used the word ‘wunderkind’,” says Miles Hewstone, a social psychologist at the University of Oxford, UK. “He was one of the bright thrusting young stars of Dutch social psychology — highly published, highly cited, prize-winning, worked with lots of people, and very well thought of in the field.”

Luckily, this time, he was caught out. This underscores why the social sciences need far more oversight than they currently have.

4 comments to Problematic

  • Joshua

    Again with the meat. Grow up.

    Meat aside, fabrication of data is thankfully very rare. It is eye-catching when it is caught, but most scientists are ethical people who adhere to professional norms. At the very least, scientists are well aware of the dangers of unethical behavior, and whether through goodness or fear of being caught, most keep to the ethical side of the line.

    Social scientists are a weird breed, but not in a fabricating-data kind of way. What passes for social science is not always social, not always science, and not always both. Most work in social science uses so-called secondary data, meaning data someone else collected (usually large, well-funded organizations that have every interest in not fabricating it). Social psychologists, who rely on experiments, are different in that they usually collect their own data. Rarely does anyone analyze someone else’s experiment, so “primary” data collection is the norm for them.

    I’m not in social psychology.

  • If you can’t stand the meat, get out of the kitchen.

    What has become clear is that the politicization of science (AGW, for example) has called scientific ethics into question as never before.

    As the social sciences, by their very nature, cannot be judged the same way as chemistry (consistent, replicable results), there needs to be a different way of gauging their accuracy.

    I’m not in social psychology.

  • Joshua

    Some social science can be replicated: statistical analysis of data can easily be replicated when the researcher provides sufficient information on what he/she/shim/herm did. What’s not replicable are most so-called qualitative studies, such as interviewing certain populations at particular time points (we can’t go back and re-examine, say, the Yanomamo of the 1950s).

    Experiments can be replicated, and should be.

    I am not a social psychologist.

  • Social science still doesn’t have the consistency of chemistry or physics. There’s statistical analysis of social science data, but the data isn’t as consistent as the charge of an electron.

    No one is saying experiments cannot be replicated.

    I am not a social psychologist.