Tuesday, 27 January 2015

ESRC Success Rate Drops to 10% for July Open Call

'Swindon, we have a problem'
My colleague Brian Lingley drew my attention to a rather alarming set of data from the ESRC last week. The scores and funding outcomes published in November show that it received 144 responsive-mode applications for the July open call. Of those, only 14 were funded, giving a success rate of 10%.

Is this an anomaly? It is certainly well below the 25% ESRC average for 2013-14 given in the Times Higher round-up published at the same time. More alarming, perhaps, was the number of high-quality applications that fell below the funding cut-off, which sat in the 8-8.9 score range. In fact, there were more unsuccessful proposals in this range than successful ones.
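For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch. The 144 and 14 figures are those quoted above from the published outcomes; the adjustment for rejects and withdrawals is picked up in the comments below.

    # Rough check of the July open call figures quoted above.
    applications = 144   # responsive-mode applications received for July
    funded = 14          # applications funded

    raw_rate = funded / applications
    print(f"Raw success rate: {raw_rate:.1%}")        # 9.7%, i.e. the ~10% headline

    # Around 30 proposals were office rejects, referee rejects or withdrawals
    # (see the comments below), so never really competed for funding.
    competitive = applications - 30
    adjusted_rate = funded / competitive
    print(f"Adjusted success rate: {adjusted_rate:.1%}")   # 12.3%

    # Either way, well short of the 25% ESRC average reported for 2013-14.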

Thus, according to the ESRC's own definitions, it has had to turn away more than half of the 'excellent proposals which are of significant value, and are highly likely to make a very important scientific contribution and/or will significantly enhance the development of the applicant's academic career.'

As the scores demonstrate, the ESRC's push to increase the quality of applications has clearly had an effect: 45% of applications were scored seven or above ('very good...significant value...likely to make an important contribution'). However, this push for quality has not had a dampening effect on demand.

The current system serves neither the applicants nor the ESRC well. Applicants will have put a huge amount of work into devising the project and developing the proposal. To be told by the ESRC that, whilst the proposal was 'excellent...of significant value, and...highly likely to make a very important scientific contribution,' it would not be funded is galling. The project is as good as some that were funded; what more can they do?

Similarly, for the ESRC, combing through 144 applications to identify the 14 that will be funded seems like a colossal waste of time. It's inefficient, and that's not a good look with the Nurse Review just around the corner.

So what's to be done? I don't want to revisit old arguments here (I explored the problems with, and potential solutions to, peer review here), but I do believe we need to rethink the adversarial way we assess applications, and move to a more iterative, supportive and nurturing system of proposal development that works with the funders.

However, I recognise that this won't happen any time soon. In the meantime, I think the least the ESRC should do is introduce a simple two-stage process, so that those with no hope of progressing are winnowed out early, and those with potential are allowed to develop further.

In addition, second-stage rejects should get constructive feedback and be allowed to reapply. By doing so, the ESRC would counter the sense of hopeless futility and despondency that results from all those highly rated applications going unfunded, and it would help to allay the widespread disillusionment with the present system. And restoring confidence in the system would be good for applicants, the ESRC and the social science research base as a whole.


4 comments:

  1. I agree with you. Unfortunately we are in a political climate that tends to define "excellence" in relative terms like "top 10%". Your argument could thus be read as saying that we are now at a total funding level where we can't even fund the top 10%, and that those who are not close to excellent are not applying. Of course, you've also got a system that is DISCOURAGING those who are not excellent from applying.

    The kind of thing you envision is more like what was happening at NSERC in Canada. For a long time they had something like an 80% success rate, providing program funding that might not be equivalent to the request but nevertheless allowed researchers to sustain research programs and build towards higher funding levels. The powers that be saw that 80% as evidence that they had no standards and were probably funding less than excellent research. A whole new system has been instituted that is causing much frustration.

    I'm not sure what the answer is, but I'm pretty sure it involves throwing more money at research which no one will commit to.

  2. Of the 144 proposals, 30 were marked as ref reject, office reject or withdrawn, so they could be discounted from these figures - this would increase the success rate to 12.3%, which is still pitiful but slightly better than 9.7%.

  3. Good point, ZtF! Thanks for that. Yes, a little better, but still in scratch-card territory (https://www.researchprofessional.com/0/rr/funding/know-how/Straight-Talk/2015/2/straight-talk-success-rates-and-scratch-cards.html)

  4. It gets better (worse?).
    The Nov 14 figures, just released, show a massive 10% increase to a success rate of 11% - and that's without adjusting for rejects, which could push it to the heady heights of almost 14%.
