Reviewers and panel members have been issued with new assessment criteria/forms to reflect these changes. For applicants, the national importance section must now be included in the case for support.
Monday, 31 October 2011
Thursday, 27 October 2011
As you can imagine, this provided some interesting insights. The GAP members were generally grateful to the reviewers, and recognised their reliance on the reviewers' knowledge to make their decisions. Some points to highlight:
Firstly, the process itself:
- Each application gets sent to at least 3 academic reviewers, and a user reviewer (if relevant)
- If the average score for these is above 4/6, they get sent to GAP members who will act as 'introducers' at the panel meeting. Introducers usually get 7-10 proposals each to assess each meeting, and 4-5 weeks to write their assessments.
- Each application is assessed by two introducers. If the average introducers' score is above 4/10, it goes to panel. Only those scoring 6 or above are likely to be funded. Thus, as I've said before, it's worth noting how important the introducers are.
- However, the panel discussions allow for proposals that fall below this to be pulled up the rankings.
- Most of the discussion is around marginal or controversial proposals.
What are the core assessment criteria for reviewers?
- scientific quality and intellectual contribution;
- originality and potential contribution to knowledge;
- robustness of research design and methods;
- value for money;
- outputs, dissemination and impact.
So what should reviewers bear in mind when assessing an application?
- Give yourself enough time to properly assess the application.
- Judge only what is written, not what you imagine the project to be. If the applicant hasn't made clear what they're going to do, that is their fault.
- Base your judgement on the research question the applicant asks. Is it interesting/important? Are the methods appropriate for answering the question?
- Evaluate, don't advocate. Not every proposal can be funded. Be frank, and indicate risk versus benefit.
- Justify your arguments, and provide constructive criticism. It's important that (a) the panellists understand why you have scored as you have, and that (b) your score matches your comments. And, of course, it's useful for the applicant if they are rejected.
And what should they not do?
- be personal or aggressive;
- be too brief or too verbose;
- be ambiguous;
- make inappropriate, irrelevant or polemic remarks;
- forget that reviewing research proposals is different from reviewing papers: here, the research is speculative, so you have to evaluate the likely results;
- forget to draw attention to ways in which the proposal meets specific assessment criteria particularly well, and to point to any major logical flaws, contradictions or omissions;
- forget what the point of the review is, namely to weigh up the positive aspects of the proposal against the negative ones.
After the workshops the ESRC has updated its website with FAQs and a checklist (short and long). It's worth having a look at these, to get an idea of what the reviewers will be considering when they start to read your proposal.
Tuesday, 25 October 2011
- Weds 9 May TBC Eurovision: the pros and cons of European funding (Dr Jenny Billings, Prof Simon Thompson)
- Weds 30 May 10am -12 Writing Better Bids (Prof David Shemmings)
- Mon 11 June TBC Winning Fellowship Funds (Prof Paul Allain, Prof Darren Griffin)
Please get in touch if you would like more information or want to reserve a place.
- details of the shiny new Peer Review system that was launched at the beginning of October;
- a snapshot of a selection of recent awards;
- information about the new Grants Factory programme;
- all the funding gossip from Brussels and Swindon;
- an overview of the research interests of the 25 new academics who have joined the University recently;
- pass notes on the REF;
- contact details for everyone in Research Services.
Friday, 21 October 2011
Thursday, 20 October 2011
- the information should be harvested automatically and electronically;
- the system should be voluntary.
- Follows the trail of grants through individual HEIs' financial systems. This can tell them who is funded (via the HR system), including PI, Co-I, RA and students, where the money is being spent (via the procurement system), and who they are subcontracting to or collaborating with (via the finance system);
- Follows the trail of outputs, by linking with the patent office and publication databases;
- Follows the individual via various CV systems, such as Vivo, Harvard Catalyst and Eurocris;
- Analyses the areas funded by the federal agencies by scanning and machine reading the applications, and doing a key word analysis.
- What expertise there is in a particular area, or at a particular site;
- Where there is a shortage of expertise;
- How much funding has been put into any discipline area;
- Areas of overlap between funders;
- What has been funded in any geographical location (such as a state);
- What has been funded in any institution, or for an individual;
- What local or national businesses have benefited from the funding;
- The outputs and outcomes from any funded project;
- The career development of anyone associated with the project.
The Wellcome Trust spends £650m per year, roughly equivalent to the MRC. Liz Allen, the Senior Evaluator Adviser at the Trust, outlined the challenges faced and tools used in understanding and quantifying the effect of this funding.
She highlighted a number of challenges in assessing impact in the biosciences:
- the time frame involved: for instance, it took Robert Edwards 18 years from first developing the technique for IVF to the first successful ‘test tube baby’
- the serendipity of science: for instance, Alec Jeffreys has been quoted as saying ‘our discovery of DNA fingerprinting was of course totally accidental...but at least we had the sense to realise what we had stumbled upon.’
- Attribution and contribution: there is often a long and diverse list of people and organisations involved in the evolution of a piece of research. For instance, John Todd had funding from Wellcome, MRC, JDRF and NIH. In addition, there is the ‘ripple effect’, the value of negative findings, and the ‘counterfactual’ question: what would have happened if x didn’t discover y?
- Understanding impact in terms of ‘progress’ rather than ‘success’;
- With this in mind, the Trust looks for traditional indicators of progress, including: publications, people and training, products and interventions, software and databases, engagement with communities, policy, advocacy and influence, funding, and awards and prizes.
- In addition, it has explored ‘new’ indicators, such as article-level metrics (such as those pioneered by PLoS), post-publication peer review (such as F1000), clinical guidelines (NICE has recently digitised all citations used for clinical guidelines, and the Trust can check who has been involved), and the separate contributions of all members of a research team (via the ORCID initiative).
- The Trust is also making use of ‘softer’ indicators, such as case studies. However, it recognises the dangers inherent in this. Allen turned to another quote, this time from John Allen Clements: ‘when you ask the memory to reconstruct daily events from forty years ago, you’ve got to be appropriately cautious.’ The Trust needs to let award winners know from the start what it might be interested in, so that they can note and monitor it.
- Wellcome Trust Career Checker. This is a recent development, and aims to track the career choices and progression of cohorts of individuals after their funding has finished. What have they gone on to do?
Which is quite compelling. Sure, assessing the effect of biomedical research is a tough call, but Allen rounded off with a quote from Mary Lasker: ‘if you think research is expensive, try disease.’
Tuesday, 18 October 2011
A number of funders are inviting people to become involved in their decision making. These include the following:
- AHRC - Peer Review College
- BBSRC - Council
- EPSRC - Council
- ESRC – Council
- MRC – Council
- STFC – Council
- NIHR - Programme Advisory Board & Refereeing
Do consider putting yourself forward for these vacancies. Being involved with funders raises your profile, but also gives you the opportunity to influence their policy and direction.
Friday, 14 October 2011
Thursday, 13 October 2011
- Other Research Outputs
- Staff Development
- Further Funding
Monday, 10 October 2011
- EIT: there's no separate provision for the European Institute of Technology. To me, this suggests that they want closer integration with other parts of the Framework Programme, but does it also mean that the EIT is quietly being sidelined or shelved, reabsorbed into the body from which it emerged?
- ERC: the latest proposals don't specify the different schemes as 'objectives'. This suggests that the EC wants to allow the ERC room to develop and introduce new schemes as and when necessary. Which, in turn, suggests that the EC has confidence in the Council, and is going to allow it a little more independence.
- Societal Challenges: this is based around six multidisciplinary areas. These are evolving as we speak, but there have been some interesting developments recently. These include 'Smart, Green & Integrated Transport,' for which the EC has added a new section on 'evidence-based transport policy for the long term.' So the EC is wanting to expand future transport beyond the scientific and technical to include socio-economic policy implications. 'Resource Efficiency & Climate' has changed its name to 'Climate Action & Resource Efficiency including Raw Materials', which makes clearer the overall aim and direction of the challenge, and ecosystems have been made more of a priority within this. 'Inclusive, Innovative & Secure Societies', which is closest to the current 'Socioeconomic Sciences & Humanities', has been simplified, and appears to be moving away from the FP7 theme from which it emerged. Whilst the Humanities were never a huge player in FP7, it looks like they will be even less so under Horizon 2020.
- Marie Curie: the EC is obviously wanting to drive home exactly what each of the Marie Curie schemes will do. They've ditched the original headings, and gone for headings that provide more explanation of what each scheme is intended for. For example 'Research staff exchange' becomes 'stimulating innovation through cross-fertilisation of knowledge.' Got it?
- Industrial Leadership & Competitiveness Frameworks: not a lot of change here, although in the most recent drafts the EC is emphasising the need for these underlying technologies to link more explicitly to the societal challenges.
Thursday, 6 October 2011
So if it's that easy, why do so many people fail? Askew suggested it was down to time. You need time to not only draft the application but, way before you set pen to paper, time to lay the foundations. So here's a quick run down of what you should be doing, now, to prepare.
- Think. Askew singled out one of the hapless workshop participants and asked, 'what's your strategy for getting European funding?' Like an embarrassed schoolboy the participant mumbled and looked at his shoes. As would the rest of us if he'd picked on us. The truth is most universities have a laissez-faire attitude to applying. Askew, however, suggested that we should all be thinking strategically: what are our strengths? What are our weaknesses? What are our connections? Where should our focus be? Identify those strengths, those networks, and build on them. Don't leave it to chance, or to those on the periphery of European research, to play the tune.
- Talk. Once you've established a European strategy, you need to lay the foundations for your consortium. Who are the best people working in your area? Who should you approach to be part of a consortium? Not everyone need be an equal partner, but equally there should be no 'makeweights' or padding. Each partner should have a clear purpose. Once identified, you need to sound them out and set ground rules for the collaboration. If you're coordinating, you will be the one held responsible, and you don't want to be left to pick up the bill should a partner renege on a collaboration agreement.
You also need to talk to the Commission. Get a sense of what's on the horizon. Now is the time to start establishing contacts with Commission officials and project officers. They are there to help. Later, as the bid develops, they can clarify the intentions of the call, so that you don't end up pushing your project down a false trail. Your relationship should continue once your project's off the ground and you need to provide progress reports. Don't be scared of picking up the phone to the Commission (or, in fact, to the NCPs) to get an insight into their thinking. Better still, spend the money on a Eurostar ticket to Brussels for an informal talk.
- Plan. So you've identified your strengths and you have in place your partners. Now is the time to think about the project itself. One person - preferably with English as their native language - needs to pull it together and draft the application. It must appear to be coherent and unified, not like some kind of clippings album, with pieces taken from a selection of different newspapers. Each work package should interlink and interweave with the others; it should be interdependent and integral with the whole. It should be written in plain English, with acronyms spelt out and explained where necessary, and any jargon or slang cut out. Spell out everything, and don't assume anything. Just because you think you've got a global reputation, or your university is the toast of the UK, that doesn't mean that a Latvian evaluator will have heard of you or your institution.
- Write. As you draft your application, you should keep in mind the assessment criteria that the EC will use. There are three elements, each of which gets a score out of five:
- Science and Technology
- Implementation
- Impact
The first of these is usually well met by applicants, albeit with a little too much context. The second is often so-so, and the third is frequently dire. Recent signals from the Commission are that they are tiring of poor impact programmes, so think seriously about how you will disseminate the findings of your research, and how you will engage with stakeholders. As with Research Council applications, it is a good idea to have an 'advisory group' that includes end users who can guide you in your research, and ensure that you are meeting the needs of those who may benefit from the research.
The evaluation itself is, in the eyes of Askew, fair, balanced and objective. There is no truth in the belief that lobbying has any effect, or that the EC expects consortia to be balanced and equal, with members from north and south Europe, or from new and old member states. The consortium has to 'make sense' (see above), and that's it. In the peer review meeting there is a member of the Commission on hand to ensure fair play, and to object if they sense that the rapporteur is not heeding the views of all panellists, or being too partisan.
As well as the evaluation criteria, you should bear in mind that, if your application is successful, you will have to go through a gruelling negotiation. At this stage the Commission will meet with you to discuss the nitty-gritty of your project. They might present you with questions and queries that were raised by evaluators, such as unnecessary costs, or an unbalanced consortium. They may ask you to strip these out, and this may well affect your overall project. So pre-empt this by checking both the eligibility and the necessity of all components of your project.
- Submit. Submit early, and often. Each time you submit via the EPSS system it overwrites what has already been submitted. Don't leave it until the last moment, only to find the software crashes, leaving you out in the cold.
Wednesday, 5 October 2011
- There has to be a strong link between the impact and the research upon which it is based. It's not enough to be working generally in that area; you need to highlight the project, and the findings of the project, and make clear how these led on to the resultant impact.
- The research has to have been undertaken whilst you were at the University. It's fine if it began elsewhere, but at least part of it has to have happened after you arrived at Kent;
- It helps to have quantifiable indicators of impact. Whilst HEFCE define impact very broadly (note their definition in the checklist here), it will help you to objectively demonstrate your impact if you are able to show some figures to back up your claims.
- The impact has to have happened already. Unlike RCUK's understanding of impact, HEFCE's is backward looking. It's past, not potential. You have to be describing impact that has already been felt.
- It is better to write in the third person. This adds to the sense of an objective, impersonal analysis of the impact (as does having quantifiables, see above), which will help give your case study substance and credibility.
Tuesday, 4 October 2011
Applications are sought from academics at all stages of their career and, if chosen, you will serve a four-year term. If you want to be nominated do get in touch with your CV. I will pass your details on to Prof John Baldock, the PVC Research, who will put forward nominations on behalf of the University.
Monday, 3 October 2011
Now in some ways you can see the logic of this. In strained financial times it may be better to prioritise the important, high-quality work that has the potential to make a difference to science globally, as well as to bolster the UK's competitiveness nationally.
However, as you can imagine, those who are adversely affected by this prioritisation are angry about what they see as the fairly arbitrary algorithm by which it has been decided. Have a look at the sub-GCSE graph above. This is known as the 'Bourne Graph'. It is a visual representation of how EPSRC see the relative value of subjects within its remit.
But how were these relative positions decided? What scale is being used along the X and Y axes? Hmm. It's not clear, and EPSRC's reticence on this is not helping. People are thinking the worst. As a York-based organic chemist comments on his blog, 'if one didn't know better you may be forgiven for thinking it had been thought up on the back of a fag packet over a pint in the pub after work.'
Helpfully, he provides a similarly inane graph for his relationship with fruit and vegetables, as follows:
Prof Timothy Gowers, another blogger, has tried to work out where they're coming from by deconstructing the newspeak pronouncements coming out of EPSRC.
More seriously, the sector's disquiet has resulted in letters from the chemists, statements from the mathematicians, and articles from the physicists, as well as a call from, well, everybody (in the shape of the Royal Society) to 'pause' the strategy. David 'Smalls' Delpy responded, specifically to the chemists, saying that he felt their pain, but ultimately it was their own fault for getting too much of the budget recently. Or words to that effect. Elsewhere he's poured oil on the troubled waters by saying that the complaints were an 'overreaction', backed up by 'relatively little' evidence.
The storm has been rumbling on since July, and there's no sign of it abating any time soon. If anything, it's growing in strength, and there's hope that, as Dylan said, 'the loser now will be later to win.' Whilst I have sympathy for EPSRC, and believe it acted in good faith, I think this kind of engineering is dangerous and ultimately fruitless.
Remember Robert Edwards, who was awarded the Nobel Prize for Medicine this time last year? He had developed in vitro fertilisation, which has led, since 1978, to millions of 'test tube babies.' Well, when he turned to his sector's funder, the MRC, in 1971 they turned him down. At the time his discipline wasn't of interest, as the politics of the day suggested the world was heading for Malthusian destruction. If there hadn't been a private funder on hand his research may well have withered on the vine.
His is a cautionary tale. The allocation of research funding shouldn't be left up to politicians and apparatchiks (like me): it should be up to peers and contemporaries to decide what should be prioritised. Only then will the best, bravest and brightest have an equal chance - from whatever discipline.