On that recent Publishing Economics study: How Slow? Why Slow? Is Slow Productive? How to Fix Slow?

Andreas Ortmann
Mar 12, 2024

The long time it takes in economics for papers to go from submission to acceptance and publication has been a well-known and well-publicized concern.

In the just-published Journal of Economic Literature issue, Hadavand, Hamermesh, and Wilson (HHW) provide considerable new evidence on that issue.

Here is what their abstract says:

Economics publishing proceeds much more slowly than in the natural sciences, and more slowly than in the other social sciences and finance. It is relatively even slower at the extremes. Much of the lag, especially at the extremes, arises from authors’ dilatory behavior in revisions. Additional rounds of resubmissions at top economics journals are related to additional citations; but conditional on resubmission, the delays are unrelated to greater scholarly attention. We offer several proposals for speeding publication, including no-revision policies such as Economic Inquiry’s, the use of “cascading referee reports,” limits on authors’ time revising, and limits on editors waiting for dilatory referees.

Specifically, answering the question of How Slow?, HHW conclude that it took on average two and three years to acceptance and publication, respectively, in four of the Top 5 journals and REStat, while it took about half that for a couple of roughly matched journals in political science and psychology, and about one fifth in Nature and PNAS (see Figure 1, p. 272). So, yes, the publishing process in economics seems to continue to be slow, although one could argue back and forth about the comparability of these selected journals (e.g., the acceptance rates for these journals differ significantly, with the top econ journals sitting at about 6 percent, the three political science and psychology journals at around 10 percent, and Nature and PNAS at 8 and 15 percent, respectively). For no particular reason, let me note that the acceptance rate of Experimental Economics sits at 11 percent.

Answering the question of Why Slow?, HHW (controlling for factors such as pages, references included, cumulative WoS citations, Google Scholar citations, etc.) find that “the main proximate determinant of inter-decile differences in the speed of publication is the huge rise in the amount of time spent in authors’ hands” (p. 276). Figure 2, on p. 277, details this rise starkly. It is an important finding in light of the usual gripe-fests in places like Facebook’s Reviewer 2 Must Be Stopped! page, which tend to blame editors and reviewers for everything that goes wrong, in keeping with the very high opinion that many authors have of themselves.

Answering the question of Is Slow Productive?, HHW (again controlling for a number of potential confounders such as number of rounds, time spent in the journals’ and authors’ hands respectively, previous citations/experience of the most-cited author, number and gender of authors, and number of pages and references) find that “the 51 percent of articles that require a third round (two resubmissions) are related to greater subsequent citations than the 27 percent of papers that go through only two rounds. … On the other hand, the marginal additional citations related to the fourth round (the 22 percent of articles that are resubmitted, resubmitted again, and then accepted after yet another resubmission) is smaller, although positive” (pp. 281–82).

Answering the question of How to Fix Slow?, HHW discuss three options.

First, fast-tracking, as initiated by Economic Inquiry in 2007. Using confidential data obtained from the Western Economic Association International for more than 800 published papers (and more than 5,000 rejected manuscripts), HHW find that “fast-track papers were only slightly, albeit statistically significantly, more likely to be accepted for publication than those submitted through the regular track. … There is little difference in the time between submission and first decision among accepted papers along the two tracks. Rejection times are also similarly distributed across tracks.” (p. 285) Obviously there is a good chance that selection effects confound these findings, and indeed there is some evidence that “more successful (in terms of prior scholarly impact) and more senior authors were more likely to choose the fast track.” (p. 285) There is also a discussion (section 3.2) about the fast-tracking options at AER: Insights and the various affiliated AEJs. The data are too recent to allow replication of some of the earlier analyses in HHW, but the authors suggest that the practice of “cascading referee reports” (i.e., the handing down of referee reports from the AER to one of the affiliated AEJs) is a promising strategy for shortening the time to publication.

Second, use of desk rejections. Footnote 28 on p. 287 suggests that the desk-rejection rates at the AER, Econometrica, the Journal of Political Economy, the Quarterly Journal of Economics, and the Review of Economic Studies are in the 40–60 percent range. Again, for no particular reason, let me note that Experimental Economics’s desk-rejection rate sits at the upper end of this range.

Third, strict enforcement of limited revision time. As mentioned, the authors document that some authors take extraordinary numbers of months on revisions, and they seem to attribute this to procrastination (see their discussion on p. 289). That is possible, but surely there are other reasons (heavy teaching loads, parental leave, serious illness, etc.). HHW also propose to limit refereeing/editing time. As elsewhere in this article, here too we encounter considerable heterogeneity: “almost two-thirds of referees solicited agree to do the job and complete it within three months.” (p. 290) However, 20 percent take more than three months, 5 percent are never heard from again, and 17 percent quickly decline refereeing requests. I have no problem with the latter (everyone is maxed out these days), and I have reason to believe that these numbers are even higher outside of the top econ journals considered by HHW. The authors also suggest we impose limits on the time manuscripts spend with referees and/or editors. Lovely thought, except that referees and editors *typically* do not get paid for the services they provide. To my surprise, I learned that one top five journal pays its editors $51,500 per annum, and another pays $32,000, with the editor-in-chief being paid $64,000 (p. 291). So “firing” editors or blacklisting referees is not the most obvious and effective strategy, at least for journals outside of the top five.

There are people who, in light of the dismal time it takes to get manuscripts published in economics, have argued that econs should abandon the current system, allow everyone to publish their stuff on their private websites and/or the many archives that have popped up recently, and then let the market work its magic. I addressed this proposal last year. I fear any such arrangement would disadvantage the have-nots (junior folks included) even more.

As the famous saying has it:

“No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time …”

So, too, I fear for the publishing process in economics.

Update 23 March 2024 (correspondence with Dan H 13–14 March 2024):

Andreas (if I may): Many thanks for the very nice write-up about our JEL piece in Medium (which I saw on your FB page). Good to get the word out that our profession is very busy shooting itself in the foot in the way that we organize publishing.


Dan Hamermesh

You are more than welcome, Dan. Yes, as a profession we are not doing too well in this respect but it is not clear to me what the alternative is. We know from the Nature experiment many years back that ex-post review is not a viable alternative. All best and greetings from Down Under.


I think the solution is clear. One revision, that’s all, with a deadline. If the paper can’t be done with one revision, it shouldn’t be given an R&R.

Dan Hamermesh

But isn’t one of your findings that the 2nd round for sure adds value?

Best, Andreas

Indeed it does; but it is marginal. And I wonder if it’s worth the extra time spent. Perhaps if editors did their job better and knew they had one additional bite out of the author’s apple, they would concentrate their redo comments more than they do now.

Dan H

