Monday, July 27, 2015

Median publication delays at 38 APA journals

Last week, I had a paper accepted at the Journal of Personality and Social Psychology.  This acceptance is good for me, as JPSP is one of the more prestigious journals in my field.  However, given how grueling the review process has been, it's hard for me to feel happy about this acceptance -- based on my records, this paper spent about 17 months in review, and right now it has only been accepted, not published.

Of course, I am far from the only person whose paper has spent a long time in the limbo between acceptance and publication.  In fact, based on two analyses of papers in PubMed, this experience seems distressingly common.  For example, Steve Royle found that papers submitted to cell biology journals in 2013 and indexed by PubMed take about 100 days to go from received to accepted and another 120 days to go from accepted to published, for a total of 220 days.  In another analysis, Daniel Himmelstein analyzed the time between acceptance and publication for 3,476 journals indexed by PubMed in 2014.  I didn't see an overall median lag time, but most of the lags seem to be between 50 and 60 days.

Both of these analyses focus primarily on biology journals, and primarily on journals indexed by PubMed.  For example, if you search for "J Pers Soc Psychol", the PubMed abbreviation for JPSP, on the Himmelstein site, you will not find this journal listed -- possibly because JPSP does not report the receipt and acceptance dates for each article in PubMed.  This leads me to my question: Do the Royle and Himmelstein analyses reflect the typical delays at psychology journals?
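The calculation underlying both analyses is simple enough to sketch. Given each paper's received, accepted, and published dates, the review lag is the days from receipt to acceptance and the publication lag is the days from acceptance to publication; the reported figures are medians over all papers. Here is a minimal sketch in Python -- the dates below are purely illustrative, not real PubMed records:

```python
from datetime import date
from statistics import median

# Hypothetical (received, accepted, published) dates for a few papers.
# These are made-up values for illustration only.
papers = [
    (date(2013, 1, 10), date(2013, 4, 20), date(2013, 8, 15)),
    (date(2013, 2, 1),  date(2013, 5, 30), date(2013, 9, 1)),
    (date(2013, 3, 5),  date(2013, 6, 10), date(2013, 10, 20)),
]

# Days from receipt to acceptance (review lag),
# and from acceptance to publication (publication lag).
review_lags  = [(acc - rec).days for rec, acc, pub in papers]
publish_lags = [(pub - acc).days for rec, acc, pub in papers]

print(median(review_lags), median(publish_lags))
```

With real PubMed data, the same two medians would reproduce the kind of summary Royle reported (roughly 100 days to acceptance, 120 more to publication).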

Tuesday, July 14, 2015

Mapping "prejudice" research reveals its preoccupation with implicit bias

One of the many difficulties of doing social science is that the concepts that we study are often fuzzy.  Precisely defining concepts like "attitudes", "cognition", and the "self" can be challenging, which sometimes leads to dramatic differences in how scientists use the terms.

The challenges are only enhanced when the object of study is a politically charged concept like my chosen field of study, prejudice.  I believe this fuzziness in the definition of "prejudice" has exerted a distorting influence on research on the topic, affecting the questions researchers ask, the measures researchers use, and the interventions researchers develop.

Today, I'm going to focus on a small piece of this issue by answering the following questions:
  1. When contemporary researchers choose to study "prejudice", how do they use the term?
  2. What does contemporary researchers' use of the term "prejudice" reveal about their (often unstated) definitions of the term?

Tuesday, June 16, 2015

Idealized vs actual psychological science

I have been reading recently about the philosophy of science, which has got me thinking about the scientific method, both as it's taught in most psychology classes and as it's commonly practiced in psychology.  This thinking has led me to the following conclusion: the version of the scientific method that is usually taught in psychology classes is a farce, to the detriment of the science as a whole.  Let me explain.

Monday, June 15, 2015

What we can learn from the LaCour data fabrication incident

[Photo caption: Mike LaCour, author of a paper on canvassing that was later retracted]
About two weeks ago, news broke that Michael LaCour, the first author of a study about how, purportedly, gay canvassers can successfully improve people's attitudes toward gay men and same-sex marriage initiatives, likely fabricated his data. Although news about fraud is always troubling, this news was particularly troubling -- after all, the study was published in Science, one of the most high-profile journals for scientific research (as the joke goes, the shorter the title, the more prestigious the journal). In addition, the methods of the study appeared to be rigorous, and the findings just "felt good" -- according to the study, brief, 20-minute conversations with canvassers who admitted they were gay created dramatic changes in attitude that persisted up to nine months.

Tuesday, September 20, 2011

Diederik Stapel and the frequency of scientific shenanigans

On August 27, two junior researchers working with the Dutch social psychologist Diederik Stapel at Tilburg University contacted a university administrator with suspicions that their senior colleague was using faked data.  Data fabrication is one of the worst forms of academic shenanigans that fall under the broad umbrella of "academic misconduct", so the allegation was quite serious.  This is especially true because Diederik Stapel was in the early stages of a prolific scientific career; he served on the editorial board of six different academic journals and had received the 2007 "Early Career Award" from the International Society for Self and Identity (ISSI).  He had also published many articles that received generous press attention, including one in Science that claimed that messy environments promote discrimination.

Nonetheless, a little over a week and one university investigation later, Stapel admitted to making up data and was sacked from Tilburg University.