Lecturer Begs Students to Skim Read AI-Generated Assignments Before Submission
4 March 2026
ISSUE NO. 15
ADELAIDE — An Administrative Law lecturer has appealed to students to “at least skim read” their AI-generated assignments before submission, following what she described as “a semester of entirely avoidable errors.”
Dr Eleanor Grant, who teaches judicial review and statutory interpretation, said the university initially attempted to prohibit the use of artificial intelligence tools in assessment tasks. The policy was quietly abandoned after it became apparent that AI functionality is now embedded in Microsoft Word, Google search results, email drafting tools, and several students’ smartwatches.
“We briefly required students to declare they had not used AI,” Dr Grant said. “It became clear that this would require most of the cohort to make a false declaration.”
The university has since adopted a policy permitting AI “as a support tool,” provided students engage critically with the material.
Dr Grant confirmed the difficulty is not the use of AI, but the absence of any review.
“I am not asking them to write it themselves,” she clarified. “I am asking them to read it once.”
Recent submissions in her Administrative Law course have included essays on jurisdictional error citing the United States Supreme Court, references to “federal agencies” rather than Commonwealth decision-makers, and repeated discussion of the Fifth Amendment.
One 2,500-word paper analysing procedural fairness under Australian law began with:
“Certainly. Here is a comprehensive overview tailored to your request.”
Another concluded with:
“As a large language model, I cannot provide legal advice, but I can outline general principles.”
Several essays retained visible prompt instructions, including:
“Rewrite this in a more academic tone.”
“Add three High Court cases.”
“Expand this argument.”
In one submission addressing review under the Administrative Decisions (Judicial Review) Act 1977 (Cth), the student inserted “[Insert relevant High Court authority here]” and left it unchanged.
“If you see bracketed instructions in your own assignment,” Dr Grant said, “that is generally an indication it has not been proofread.”
The most common substantive error involved analysing Australian judicial review as though it were American constitutional litigation.
“If your answer to a question about jurisdictional error includes the President,” she noted, “something has gone wrong.”
When Dr Grant sought guidance from the faculty on enforcement, she received the following from the Academic Integrity Helpdesk:
“Thank you for your enquiry. Your concern is important to us. We understand that AI is evolving rapidly in the higher education ecosystem. Please refer to the attached guidance framework for holistic pedagogical alignment.”
The email concluded with, “This response may have been generated automatically.”
Students have defended their approach on efficiency grounds.
“It gives you a structure,” said one second-year student. “You just adjust it a bit.”
Dr Grant confirmed that “adjusting it a bit” would represent significant progress.
“At minimum,” she said, “remove the American spelling of ‘defense’ and ensure the High Court of Australia is not described as ‘the Supreme Court of Australia.’”
At press time, Dr Grant confirmed she had decided to streamline the marking process by uploading the submissions into Copilot and asking it to provide structured feedback.
“It can make sense of what they were trying to generate,” she said. “Or at least it says it can. And it is more familiar with American law than I am.”
This article is as real as a student’s medical certificate at exam time. It is parody.