Failed replications, questioning authority, and outdated mentalities

Researchers face a tricky situation when another scientist fails to replicate their work. Do you double down on your original findings, admit your work was likely a false positive, or do something between these poles? And how do you respond when a colleague’s work fails to replicate? All these questions were raised again after a series of articles and revelations cast further doubt on one of the most famous studies in social psychology: the Stanford Prison Experiment (SPE) by Philip Zimbardo. I won’t retread that ground here, as you can go and read the excellent articles yourself (the ones I recommend are Blum, 2018; Haslam, Reicher, Ranz-Schleifer, & Van Bavel, 2018; and Resnick, 2018). Zimbardo’s response has been predictable: reject the criticisms, give a wordy reply which doesn’t fully deal with the points raised, and restate his faith in the original findings.

Other researcher responses

What has been more interesting (to me at least) is the response of other figures in the field. Zimbardo posted a message to the Society for Personality and Social Psychology (SPSP) listserv (a discussion forum for SPSP members) which characterises the criticisms as “rather virulent attacks”.

As Patrick Forscher states, it’s notable that he presents these attacks as having been published in various blogs (ignoring the original work critical of the SPE by Reicher & Haslam, 2006). My interpretation is that he views blogs as scientifically less valid, and therefore that the criticisms shouldn’t be taken as seriously[note]I don’t think Zimbardo believes the criticisms should be ignored just because they have been published on blogs, because he clearly hasn’t done so.[/note]. Obviously I’m biased, but I disagree with the idea that blogs are a priori less valid (for detailed reasons why, I recommend Lakens, 2017).

But the main aspect I want to focus on is how other prominent researchers have responded to the criticisms. I admit I don’t have access to the forum so I may be missing some important/relevant responses, though I will update this blog post as more come to light. The examples below are the most eye-opening (with the last being particularly noteworthy):

The commenters don’t deal directly with any of the points raised by the criticisms, nor do they touch on methodological issues. They hint at “unreferred sources”, which seems strange given the extensive referencing in the articles criticising the work[note]With the exception of Resnick (2018), which is an interview with Zimbardo himself.[/note]. My suspicion is that this alludes to the fact that some of the criticisms come in the form of blog posts, but I can’t be certain. And they all leap to the defence of Zimbardo and his work. This is perhaps understandable, given his standing in the field and the popularity of the SPE, but rather worrying in light of (in my opinion) the devastating critiques of the study[note]This isn’t to say all of the responses I’ve seen have been defensive: I’m sure there are those in the forum raising these methodological criticisms, and almost all of the responses I’ve seen on social media have been critical of the SPE.[/note].

Results over methods?

But these are not the most conspicuous comments. The line that is most revealing about the mentality of some psychologists within SPSP (and the field as a whole) is: “The fact some slapstick psychologist couldn’t replicate Baumeister or Claude and then says that both are wrong is frankly insulting”. This emphasis on results, rather than on the quality of the methods, is worrying. Apparently, the fact that the replicators were unsuccessful makes them “slapstick psychologist[s]”[note]Presumably, they would have been paragons of excellence had they been successful.[/note]. It doesn’t matter whether their methods were more rigorous[note]They were; see e.g. Hagger et al. (2016).[/note]; what matters is the result. There is ample evidence that focusing on results rather than the quality of methods is detrimental to science (e.g. Munafò et al., 2017). Many would argue the field’s collective fixation on results rather than good practice has led us into the “reproducibility crisis”[note]See Nosek, Spies, & Motyl (2012) for one example.[/note]. I agree psychological science is hard (as Sanjay Srivastava has argued), but this appeal to the difficulty of psychology often seems to be a deflection from dealing with very real methodological problems.

How do we respond?

The responses given by some researchers aren’t surprising, but they do reflect an outdated view of how science should be performed. I am uncomfortable with these comments going unchallenged. This is why I applaud Yuthika Girme for asking excellent questions of one of the responses, and Patrick Forscher’s attempt to “redirect the conversation to the evidence”. He has started a Google doc to write a collective response to the listserv discussion. I fully support this aim and would encourage you to sign it if you haven’t already. I’m not expecting it to achieve much in and of itself, but hopefully it can be another small step towards encouraging everyone to adopt better practices and a more critical outlook.


Blum, B. (2018). The lifespan of a lie. Medium. Accessible at:

Hagger, M. S., Chatzisarantis, N. L. D., Alberts, H., et al. (2016). A multilab preregistered replication of the ego-depletion effect. Perspectives on Psychological Science, 11(4), 546–573.

Haslam, S. A., Reicher, S. D., Ranz-Schleifer, N., & Van Bavel, J. J. (2018). Rethinking the ‘nature’ of brutality: Uncovering the role of identity leadership in the Stanford Prison Experiment. Accessible at:

Lakens, D. (2017). Five reasons blog posts are of higher scientific quality than journal articles. Accessible at:

Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021.

Nosek, B., Spies, J., & Motyl, M. (2012). Scientific Utopia: II. Restructuring incentives and practices to promote truth over publishability. Accessible at:

Reicher, S. D., & Haslam, S. A. (2006). Rethinking the psychology of tyranny: The BBC Prison Experiment. British Journal of Social Psychology, 45, 1–40.

Resnick, B. (2018). Philip Zimbardo defends the Stanford Prison Experiment, his most famous work. Accessible at:

Srivastava, S. (2009). Making progress in the hardest science. Accessible at:

Zimbardo, P. (2018). Statement from Philip Zimbardo. Accessible at:

4 responses to “Failed replications, questioning authority, and outdated mentalities”

  1. As a moral psychologist, I must as a matter of conscience endorse this.


    1. Love it!


  2. For what it’s worth, a couple of weeks ago James Heathers tweeted about criticisms of the Stanford Prison Experiment and pointed out that the first methodological criticism of it was published in 1975, two years after the paper reporting the study. Here is his tweet, and here is the paper in question.


    1. Yes, you’re absolutely right. There is a long history of criticism of the SPE; apologies if my blog post didn’t acknowledge that fact explicitly.

