Academics Allegedly Using ChatGPT for Grant Application Peer Reviews
Australian researchers have raised concerns over the use of the generative AI chatbot ChatGPT in peer reviews of grant applications, prompting the Australian Research Council (ARC) to remind academics of the importance of maintaining confidentiality in the review process. The ARC_Tracker Twitter account said it had been told that assessor reports for ARC Discovery Projects were being generated with ChatGPT. These reports were criticized for lacking critical feedback and instead resembling a summary of the original proposal. The use of AI in peer reviews may constitute a breach of confidentiality, according to the ARC.
Rigorous Review Process for ARC Discovery Projects
Discovery Projects is the ARC's highly competitive scheme for government research grants of up to $500,000. With a success rate of only 15 to 20 percent, the scheme is known for its rigorous evaluation process. However, concerns have been raised about the quality and authenticity of some assessor reports, leading to suspicions that ChatGPT was used to generate them. The ARC_Tracker account owner received expert assessments that appeared to be simple rearrangements of the grant proposals, lacking any critique or unique insight.
Confidentiality Breach Cautionary Statement from the ARC
In response to these allegations, the ARC has released a statement advising peer reviewers against using AI tools, including ChatGPT, in their assessments. The council emphasized that sharing materials outside of the closed research management system, including generative AI tools, could potentially breach confidentiality obligations. While the ARC did not confirm the frequency of ChatGPT usage in peer reviews, it acknowledged the need to address confidentiality challenges presented by the use of generative AI.
Factors Influencing the Use of ChatGPT in Peer Reviews
ARC_Tracker highlighted two main factors contributing to the potential adoption of ChatGPT in peer reviews. First, academics often face overwhelming workloads, leaving limited time to review grant proposals thoroughly. Second, the ARC's lack of clear policies on the use of AI, specifically generative text engines like ChatGPT, may have led reviewers to resort to such tools. The statement released by the ARC was criticized for failing to mention generative AI explicitly, causing confusion among academics.
The Need for Clear Guidelines and Preemptive Policies
ARC_Tracker argued that the ARC's confidentiality policy does not clearly prohibit or restrict the use of AI text generators like ChatGPT. The account owner emphasized the importance of concise, straightforward policies to prevent such issues, urging the ARC to be proactive and address community concerns in a timely manner rather than relying on complex policies that are difficult to interpret.
Implications for Grant Program Administration
The use of generative AI tools presents confidentiality and security challenges, not only for the ARC but also for other grant programs. While the extent of ChatGPT usage in peer reviews requires further examination, universities such as the University of Melbourne have already used AI detection models to identify students' use of AI tools. As the ARC considers these issues, it has assured applicants that their concerns will be addressed in accordance with existing policies.
Ensuring the integrity of the peer review process is crucial for maintaining the credibility and fairness of grant applications. The potential use of AI tools like ChatGPT in this process raises important questions about maintaining confidentiality and the quality of assessments. The ARC’s warning serves as a reminder to researchers to prioritize ethical practices and avoid breaching confidentiality obligations. Developing clear guidelines and policies around the use of AI in peer reviews can help prevent such issues and provide a more transparent and effective evaluation process.
For more news and updates on artificial intelligence and technology, visit GPT News Room.