
Hundreds of students are using AI to cheat on their assignments – but universities have worked out how to catch them and it’s simpler than you might think


Waves of students are turning to artificial intelligence to write their assignments, but universities are doubling down on methods to catch them.

Sydney University has revealed that 330 assignments were found to have been completed using artificial intelligence in 2023, and the University of NSW recently said it had similarly found a ‘new wave’ of cheaters emerging.

OpenAI’s ChatGPT, which helped fuel the AI boom, has emerged as the tool of choice for lazy students, accounting for 60.2 percent of total AI industry site visits, according to Visual Capitalist.

The tool draws on the vast amounts of text it was trained on from the web, combining, reformulating and rewriting that material to answer questions posed to it – with varying levels of accuracy and impressiveness.

Marking systems have struggled to keep pace with the use of artificial intelligence in the classroom, but recent developments, combined with a crackdown on sloppy grading by university watchdogs, are making it harder for those who would rather let a robot write for them.

Although UNSW did not reveal how many AI-assisted assignments had been caught, its academic misconduct report recorded a significant increase in offences in 2023, the Sydney Morning Herald reports.

A ‘new wave’ of fraudulent assignments using AI has been found by universities in 2023

There is a widespread consensus among students that the use of artificial intelligence is undetectable, but Deakin University cheating detection expert Professor Phillip Dawson said close readings of AI-assisted work show otherwise.

Prof Dawson said Turnitin, a plagiarism-detection tool that now offers AI detection, is only good at catching AI-generated work if the student is ‘an idiot’.

“Most of the research that shows good detection rates is based on the assumption that someone is just copying and pasting, and they’re not asking ChatGPT to rephrase or rewrite,” he said.

Students are increasingly using AI tools to help them complete their work

A Sydney University spokeswoman told the newspaper that it was actually much easier to spot fraud simply through closer inspection by human markers.

“If (an assignment) contains different language, is irrelevant to the question, has false references or fails to answer the question asked, we investigate it and use the Turnitin AI tool as part of this process along with a number of indicators of misconduct,” she said.

Turnitin’s regional vice president, James Thorley, agreed that the tool was meant to be part of the review process, not the be-all and end-all of AI detection.

The higher education sector’s watchdog, the Tertiary Education Quality and Standards Agency (TEQSA), in June required all higher education providers to draw up action plans on how they will stamp out AI-assisted cheating.

These plans must carefully consider how each institution will ensure the integrity of its education.

Prof Dawson said unless students are being monitored during an assessment, markers must assume they can turn to AI to finish it for them.

Both he and Mr Thorley agreed that universities must now navigate the difficult landscape of how much students can use AI before it becomes a major problem.

A Sydney University spokeswoman said markers read assignments with more attention to detail to weed out cheaters

Mr Thorley said the universities his company had consulted with were encouraging the use of generative AI ‘in the right framework and right guidelines’.

An English lecturer at Sydney University, Associate Professor Huw Griffiths, told the publication that he had already integrated ChatGPT into his coursework.

Prof Griffiths said the use of artificial intelligence allowed students to ‘understand their own agency’ by discovering its limitations compared with traditional research sources.

The University of Technology Sydney (UTS) is also taking a similar approach by encouraging staff to discuss AI tools with students.

UTS’s idea behind embracing tools like ChatGPT is that staff can ‘invite students to actively engage … and to reflect critically on how they can be used’.
