
Research impact - what is it and how is it assessed by the NHMRC?

Updated: Apr 4

The NHMRC recently published a report summarising its evaluation of the Research Impact Track Record Assessment (RITRA) framework.

The RITRA framework, used to assess researcher track records, was developed and implemented in 2019, when the new Synergy Grants and Investigator Grants schemes were initiated alongside Ideas Grants and Clinical Trials and Cohort Studies Grants.

What is research impact according to the NHMRC?

The NHMRC defines research impact as:

» the effect of research after it has been adopted, adapted for use, or used to inform further research

» the verifiable outcomes from research and not the prospective or anticipated effects of the research – it can include research that leads to a decision not to use a particular diagnostic, treatment or health policy.

How is research impact assessed in track record assessments?

Research impact is one of 3 major components of track record assessment for Investigator and Synergy Grants. Applicants are required to choose the type of research impact (knowledge, health, economic or social) and explain, in 3 separate fields of 3,000 characters each:

1. the reach and significance of the impact

2. the research program’s contribution to the impact

3. the applicant’s contribution to the research program.

Each field must contain corroborating evidence, which counts towards that field’s character limit.

The RITRA framework was designed to shift grant application assessment away from traditional bibliometric measures of track record towards a greater emphasis on later-stage outcomes and impacts.

A timely review

With 5 rounds of Investigator Grants completed under this track record assessment framework, it was deemed time to review the structure and assess whether it was achieving its aims. The review was an evaluation of the process rather than of the framework’s influence on long-term research translation and impact.

For those who don’t want to wade through the 82 pages of the full report (although it is a great read if you have the time!), we touch on some key points of interest below.

How was the review conducted?

There were two main components to the review:

1) a survey of applicants and peer reviewers of the 2023 Investigator Grants round

2) analysis of the complete database of submitted grant applications, covering the scores for each section and text/content analysis of the research impact sections.

Major outcomes/recommendations:

1. Changes could be made to reduce duplication in the application. From the point of view of both applicants and reviewers, there is often substantial replication of content across the research impact sections and between the research impact and publication sections. This may result in the future removal of one or more of the research impact sections to streamline the content, although it is not clear at this point how duplication between research impact and publications would or could be addressed.

2. Corroborating evidence could be removed from the research impact section character count. This would provide a more level playing field, letting applicants describe the relevant research impact without having to trim their explanation to accommodate long or unwieldy website links or other forms of evidence. This takes us back in time to how this section used to be presented – with the explanation provided in one text field of 2,000 characters, and the evidence provided in another of 1,000 characters. However, this recommendation goes even further, supporting an open-ended/unlimited evidence field – to accommodate whatever and however much evidence researchers need to bring to bear to show their research impact.

3. The domination of knowledge impact and how this is evidenced. Knowledge impact was selected by 91% of applicants across all Investigator Grant leadership levels. Text analysis indicated that even those who selected other impact types also reported evidence of knowledge impact within their narrative. Moreover, the evidence provided for knowledge impact was more often the generation of knowledge that may eventually have impact than actual evidence of realised research impact. On this basis, the recommendation is to change the requirements so that applicants describe their pathway to impact through engagement with research end users.

4. Revision of category/score descriptors. Peer reviewers, applicants and the NHMRC all need to understand the category/score descriptors in the same way, to ensure that the differences between scores (e.g. exceptional (7), outstanding (6), excellent (5), etc.) are clear and consistently applied.

Revision of the RITRA framework is expected to be guided by an expert working group, with updates piloted in the Investigator Grants scheme for the 2025 and 2026 funding rounds.

Watch this space when the 2025 Investigator Grants round opens in June to see if any of these recommended changes have been rapidly implemented by the NHMRC.

