Christopher J. Ferguson, PhD
Academic psychology is currently experiencing one of the most difficult times I can recall during my years as a student or faculty member. Much of this is occurring as an aspect of the overarching replication crisis in psychology, worsened in many respects by acrimony and confusion rather than met with a “let’s fix it” mentality. However, our crisis is not merely a product of discovering that many past findings have been difficult to replicate. Indeed, a refinement in methods would usually be something to boast about, but I am concerned that the replication crisis is particularly embarrassing given our field’s past shamelessness in encouraging the disingenuous over-promotion of results from individual studies.
I’ve discussed this issue of “Death by Press Release” before (Ferguson, 2015). If you are inclined to “kill” the credibility of your study through a dubious press release, here are some common approaches:
- Although the study was done with a limited convenience sample of college students, be sure to imply that the findings generalize to all humans, everywhere.
- The press release is a good place to highlight the statistically significant findings while ignoring any inconvenient non-significant results.
- Though the study might have been done with some dodgy laboratory analog outcomes, be sure to apply the findings to important real-life phenomena. Reference a recent important event if you find even a vague similarity.
- Don’t be shy about using emotionally valenced words. Could your study be evidence of an “epidemic,” “crisis” or need to “sound the alarm?” Alternatively, are your results “astonishing” or “enormous” or “groundbreaking?”
- Be sure to say you were “surprised” by your results even if you were not.
- Effect sizes were tiny? Don’t mention that. Or, why not convert everything to a binomial effect size display? Sure, it misleadingly dichotomizes your key variables and eliminates any important control variables, and readers may not understand what those percentages mean, but they will sound important!
- Frame the whole thing as an uncertainty that existed until your study popped into being. Who could doubt the outcome of your study, right?
OK, I’m having a bit of fun with this, and I’m sure people could think of other examples to add to the list. These observations, however, come with some caveats. First, Death by Press Release is probably an aspect of human nature … we are all inclined to be excited about our results and want to present them in the best possible light. In the spirit of Simmons et al. (2011), I do not exclude myself from among the naughty press-releasers. Second, in most cases the intent is surely excitement rather than purposeful dishonesty. Third, our professional organizations, such as the American Psychological Association (APA), have often set awful examples for scholars through their own press releases and misleading policy statements, and by nudging scholars to be conclusive rather than appropriately qualified in discussing their findings.
Media psychology has been particularly hard hit by Death by Press Release which, across realms ranging from media violence to media/body dissatisfaction to “sexy media,” has resulted in public misconceptions of “consistent” pools of research that simply do not exist.
This problem of Death by Press Release can be seen in a recent article on youth violence (Bushman et al., 2016). Granted, even the original article grossly exaggerates the evidence for media effects, stating that “Exposure to media violence is significantly related to violent criminal behavior, although the effects are smaller than for aggressive behavior” (p. 23), failing to acknowledge disconfirming evidence on both violent outcomes (DeCamp, 2015) and aggression (Przybylski et al., 2014), and mischaracterizing one article (Savage & Yancey, 2008) as supporting such links when its authors concluded otherwise. Citation bias, sadly, remains a circle of scholarly hell for media psychology articles (Babor & McGovern, 2008). At least the article (Bushman et al., 2016) was inclined, perhaps begrudgingly, to acknowledge some data suggesting that using violent media takes kids away from other antisocial activities (e.g., Dahl & DellaVigna, 2009). The APA’s (2016) press release, however, dropped all caution and, despite the absence of clear evidence, asserted an extreme link between media violence and youth gun violence. The original article already contained significant problems and biases, but the APA press release only made matters worse, significantly misinforming the public. This, despite the fact that criminologists increasingly call links between media violence and serious acts of gun violence a “myth” (Fox & DeLateur, 2014).
We must remember that groups like the APA are guilds that promote our profession. As a member, I believe this is good in many respects, but it can also make for poor science. The APA’s (2015) press release for the already controversial task force on video game violence is another good example. The press release quoted the task force chair, Mark Appelbaum, as stating, “However, the link between violence in video games and increased aggression in players is one of the most studied and best established in the field,” ignoring the considerable controversies and null results in this area that conflict with such an exaggerated claim. Fortunately, in this case, most media outlets (e.g., Wofford, 2015) did pick up on the controversies and often offered critical coverage of the biased task force. This episode highlights once again the degree to which press releases can be used to misinform rather than inform the public about the science in a field. In this case, the press release got “caught,” but often that is not the case.
These are, of course, just two examples that undoubtedly reflect a larger problem in our field. There’s little question that bigger, flashier claims attract more news media attention, and it’s natural to bask in the glow of media coverage. But making big claims, particularly claims based on weak data, can do real damage to the field’s credibility. The replication crisis, although it would have come regardless, might have been less visible, acrimonious, or damaging to the credibility of psychological science had it not arrived on the heels of lofty, grandiose, counterintuitive, and sometimes arrogant claims in so many study press releases. With a bit of humility, we can do better!
Babor, T. F., & McGovern, T. (2008). Dante’s inferno: Seven deadly sins in scientific publishing and how to avoid them. In T. F. Babor, K. Stenius, S. Savva, & J. O’Reilly (Eds.), Publishing addiction science: A guide for the perplexed (2nd ed., pp. 153-171). Essex, UK: Multi-Science Publishing Company, Ltd.
Bushman, B. J., Newman, K., Calvert, S. L., Downey, G., Dredze, M., Gottfredson, M., … Webster, D. W. (2016). Youth violence: What we know and what we need to know. American Psychologist, 71, 17-39. doi:10.1037/a0039687
Dahl, G., & DellaVigna, S. (2009). Does movie violence increase violent crime? The Quarterly Journal of Economics, 124, 677-734.
DeCamp, W. (2015). Impersonal agencies of communication: Comparing the effects of video games and other risk factors on violence. Psychology of Popular Media Culture, 4, 296-304. doi:10.1037/ppm0000037
Ferguson, C. J. (2015). “Everybody knows psychology is not a real science”: Public perceptions of psychology and how we can improve our relationship with policymakers, the scientific community, and the general public. American Psychologist, 70, 527-542.
Fox, J., & DeLateur, M. (2014). Mass shootings in America: Moving beyond Newtown. Homicide Studies, 18, 125-145. doi:10.1177/1088767913510297
Przybylski, A. K., Deci, E. L., Rigby, C. S., & Ryan, R. M. (2014). Competence-impeding electronic games and players’ aggressive feelings, thoughts, and behaviors. Journal of Personality and Social Psychology, 106, 441-457. doi:10.1037/a0034820
Savage, J., & Yancey, C. (2008). The effects of media violence exposure on criminal aggression: A meta-analysis. Criminal Justice and Behavior, 35, 1123-1136.
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22, 1359-1366. doi:10.1177/0956797611417632