Showing posts with label Grant Assessment Panel.

Tuesday, 8 April 2014

Thoughts from an ESRC Mock Panel

Last month we held an ESRC mock panel as part of the Grants Factory. This is a really useful exercise; it gives participants a flavour of the discussions and debates that take place in a real Grant Assessment Panel (GAP), but it also gives them an idea of the tough assessment their application will have to go through.

Whilst I've posted elsewhere on this blog about what makes a good application (eg here, here, and here), a couple of points were raised at the workshop that I thought were worth repeating:

  • Firstly, panellists rarely read your proposal in the strict order in which it's presented. The two panellists who led the workshop said that they normally read the JeS form first, to get a sense of what the project is about, then skip to the reviewers' comments and the PI's response, before returning to the Case for Support. Your response, then, is crucial. This is true of all the Councils. A panellist for one of the other Research Councils said the following after returning from a panel meeting:
'The PI's responses were key and a substantial number of these were badly done (serving simply to refute or to point out disagreement between reviewers rather than rebut with argument, to clarify or to accept reviewers' suggestions).  Not all PIs made use of the whole space allowed.  Spending time reminding the panel of the positive things that reviewers had said was a waste of space if there were substantial issues to be addressed...I'm certain that I saw applications that would have received a higher final grade (and possibly funding) if the PI response had been better done.'
  • Secondly, most panellists won't have a background in your area. The GAPs are quite broad (see their disciplinary configuration here), so you need to make sure of two things: first, that you explain your research in a way that an intelligent general reader can understand; and second, that your methodology is watertight. Why? Because although the panellists might not understand the specifics of your project, they will all understand (or think they understand) the underlying methodology. So that is where they're going to pick holes. In particular, you need to be strong on how you will analyse the data. Try to pre-empt any problems they might see in your methodology, and head them off at the pass.
The final mock panel of the year will focus on the EPSRC, and will take place on 4 June. Drop me a line if you want to take part.

ESRC Changes Grant Assessment Panels

The ESRC has decided to change the configuration of its Grant Assessment Panels. These changes will take effect from November 2014.

Previously, the three panels covered the ESRC's remit as follows:


From November they will be:


My understanding is that these changes came about partly to balance out disproportionate workloads, but partly because (to quote my source) 'panels were getting set in their ways'.

This makes sense. I wouldn't want the panels to ossify (to borrow the ESRC's own term). Nevertheless, this does seem like an odd mix, and there are some strange divisions, such as 'economics' and 'economic and social history' being in separate panels. Similarly, I would have thought that 'Social Policy' had more in common with 'Sociology' and 'Socio-Legal Studies' than, say, international relations.

But the ESRC's task was a thankless one. They were never going to please everyone. I wish them well with this, and I do hope - for all our sakes - that the new arrangement works. For those who might be nervous about how their application might be viewed under the new system, there's still time to get your proposal in to be viewed in July under the final meeting of the old panels.


Friday, 8 February 2013

ESRC Peer Review Process

As I said in the previous post, I took part in a PGCHE mock panel today. In preparation I did some background reading on the ESRC process, and thought it would be interesting to set down the way they assess applications.

  • First Stage – The ESRC receives the application
    ◦ Roughly 10% of applications are rejected at this stage on technicalities, such as not having the right attachments, sections left incomplete, or the format not being adhered to.

  • Second Stage – External Reviewers
    ◦ Each application is sent to at least 3 academic reviewers, and a user reviewer (if relevant). These are identified using keywords.
    ◦ Reviewers use a scale of 1 (low) to 6 (high) to rate the application.
    ◦ Applications are assessed on:
      ▪ originality and potential contribution to knowledge;
      ▪ research design and methods;
      ▪ value for money;
      ▪ outputs, dissemination and impact;
      ▪ scheme-specific criteria (not relevant to responsive mode).
    ◦ Any application with an average score of less than 4/6 is rejected at this stage. This applies to about 30% of applications.

  • Third Stage – Introducers
    ◦ The remaining applications are allocated to Grant Assessment Panel (GAP) members who act as 'introducers' at the panel meeting. There are usually two introducers per application.
    ◦ The ESRC tries to match each application to the GAP member with the most relevant experience. However, there are only three GAPs, each covering a wide range of disciplines, so applications may well be introduced by someone with limited knowledge or understanding of the discipline.
    ◦ The disciplines covered by the three GAPs are:

      GAP A: Education; Psychology; Linguistics
      GAP B: Sociology; Social Work; Social Policy; Socio-Legal Studies; Area Studies; Anthropology; Statistics and Methods; Politics and International Studies; Science and Technology Studies
      GAP C: Economics; Management; Demography; Environmental Planning; Geography; History


    ◦ If it is felt that no one on the GAP has relevant experience, the application can be cross-referred to another GAP, or even to another Research Council.
    ◦ Each introducer gets around 7-10 proposals to assess per meeting, and 4-5 weeks to write their assessments. These assessments highlight key strengths and identify any weaknesses that need to be addressed.
    ◦ Based on this assessment, they rate the application on a scale of 1 (low) to 10 (high).
    ◦ The ESRC analyses these scores and works out approximately what score an application needs in order to go forward to the GAP meeting. Roughly 30% of applications are rejected at this stage, and realistically only those scoring 6 or above are likely to be funded.

  • Fourth Stage – The Panel Meeting
    ◦ Each application is introduced by its two GAP members. It is not always clear beforehand who will be the leading introducer and who the seconder.
    ◦ The panel works through the applications in order of the introducers' scores: those with the highest scores are discussed first, those with the lowest last.
    ◦ The panel works with PDFs, which save on paper but make it difficult to refer back, check, and follow the discussion of applications easily.
    ◦ ESRC officers are present, and do have input:
      ▪ by saying roughly how many applications they can fund in that round;
      ▪ by highlighting any problems with applications that have not been picked up before (eg they have been submitted to the Council before).
    ◦ Whilst the panel will take a steer from the introducers, the panel discussions allow for proposals to be pulled up or down the rankings. Most of the discussion is around marginal or controversial proposals.
    ◦ The Chair is key. S/he summarises the discussion and, if there is no consensus, has the final say.
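For the programmatically minded, the staged filtering above can be sketched as a short Python snippet. This is purely illustrative: the application records, field names and the fixed introducer cut-off of 6 are my own assumptions for the example, not ESRC software (in practice the ESRC sets the cut-off each round).

```python
# Illustrative sketch only: modelling the four-stage triage described above.
# All data and field names here are hypothetical.

def average(scores):
    return sum(scores) / len(scores)

def triage(applications):
    # First stage: office check for completeness (roughly 10% fail here).
    stage1 = [a for a in applications if a["complete"]]
    # Second stage: at least 3 external reviewers score 1-6;
    # an average below 4 means rejection (about 30% of applications).
    stage2 = [a for a in stage1 if average(a["reviewer_scores"]) >= 4]
    # Third stage: two introducers score 1-10; realistically only
    # averages of 6 or above are fundable (cut-off assumed fixed here).
    stage3 = [a for a in stage2 if average(a["introducer_scores"]) >= 6]
    # Fourth stage: the panel discusses survivors in descending score order.
    return sorted(stage3,
                  key=lambda a: average(a["introducer_scores"]),
                  reverse=True)

apps = [
    {"id": "A", "complete": True,  "reviewer_scores": [5, 5, 4], "introducer_scores": [7, 8]},
    {"id": "B", "complete": True,  "reviewer_scores": [3, 4, 3], "introducer_scores": [5, 5]},
    {"id": "C", "complete": False, "reviewer_scores": [6, 6, 6], "introducer_scores": [9, 9]},
]
print([a["id"] for a in triage(apps)])  # ['A']
```

Note how application C, despite excellent scores, falls at the first technical hurdle: a reminder that compliance matters before quality is even considered.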

Thursday, 27 October 2011

From Me to You: ESRC Panellists' Advice to Reviewers

The ESRC has recently held workshops - or 'masterclasses' - for members of its peer review college. These involved some of the Grant Assessment Panel (GAP) members talking about their experience, what they have to bear in mind when assessing applications, and the importance of the reviewers in the process.
As you can imagine, this provided some interesting insights. The GAP members were generally grateful to the reviewers, and recognised their reliance on the reviewers' knowledge to make their decisions. Some points to highlight:

Firstly, the process itself:


  • Each application gets sent to at least 3 academic reviewers, and a user reviewer (if relevant)

  • If the average score for these is above 4/6, they get sent to GAP members who will act as 'introducers' at the panel meeting. Introducers usually get 7-10 proposals each to assess each meeting, and 4-5 weeks to write their assessments.

  • Each application has two introducers assessing it. If the average introducers' score is above 4/10, it goes to panel, though only those scoring 6 or above are likely to be funded. Thus, as I've said before, it's worth noting how important the introducers are.

  • However, the panel discussions allow for proposals that fall below this to be pulled up the rankings.

  • Most of the discussion is around marginal or controversial proposals.

What are the core assessment criteria for reviewers?

  • scientific quality and intellectual contribution;

  • originality and potential contribution to knowledge;

  • timeliness;

  • robustness of research design and methods;

  • value for money;

  • outputs, dissemination and impact.

So what should reviewers bear in mind when assessing an application?

  • Give yourself enough time to properly assess the application.

  • Judge only what is written, not what you imagine the project to be. If the applicant hasn't made clear what they're going to do, that is their fault.

  • Base your judgement on the research question the applicant asks. Is it interesting/important? Are the methods appropriate for answering the question?

  • Evaluate, don't advocate. Not every proposal can be funded. Be frank, and indicate risk versus benefit.

  • Justify your arguments, and provide constructive criticism. It's important that (a) the panellists understand why you have scored as you have, and (b) your score matches your comments. And, of course, constructive comments are useful to the applicant if they are rejected.

And what should they not do?

  • be personal or aggressive;

  • be too brief or too verbose;

  • be ambiguous;

  • make inappropriate, irrelevant or polemical remarks;

  • forget that reviewing research proposals is different from reviewing papers: here, the research is speculative, so you have to evaluate the likely results;

  • forget to draw attention to ways in which the proposal meets specific assessment criteria particularly well, and to point to any major logical flaws, contradictions or omissions;

  • forget what the point of the review is, namely to weigh up the positive aspects of the proposal against the negative ones.

Following the workshops, the ESRC has updated its website with FAQs and a checklist (short and long versions). It's worth having a look at these to get an idea of what the reviewers will be considering when they start to read your proposal.