The Problems with NSW’s Generative AI Practice Note
The NSW Chief Justice recently issued Practice Note SC Gen 23 – Use of Generative Artificial Intelligence (“Gen AI”). The main problem with this Practice Note becomes apparent through a simple example: opening a Word document containing a recent Trust Distributions paper.
The Omnipresence of Gen AI
Microsoft Copilot, now embedded in MS Word and the entire Microsoft suite, automatically generates summaries when documents are opened and constantly offers suggestions as the user types. While these suggestions are often more of a nuisance than a help (much like GDPR popups), they demonstrate that we are always just one click away from inserting Gen AI content – even accidentally. This makes Gen AI perpetually present, extending far beyond the Practice Note’s carve-out for AI that “merely corrects spelling or grammar, provides transcription, assists with formatting and otherwise does not generate substantive content”.
The Universal Challenge
Given Microsoft Word’s near-universal adoption by lawyers (Microsoft arguably being the world’s premier legal tech company), Copilot’s Gen AI capabilities are potentially available to all subscribing lawyers. Furthermore, Gen AI is increasingly integrated into various platforms and apps, making it virtually impossible for practitioners in a modern software ecosystem to completely avoid its use, whether knowingly or unknowingly.
The Separation Dilemma
While the NSW general guidance makes a commendable attempt to navigate this rapidly evolving technological landscape by trying to separate functions such as search and autocorrect, the distinction is increasingly untenable. Take Bing internet search, for example: Gen AI now supplies the first search results, and on closer inspection Gen AI is fundamental both to generating webpage content and to training search algorithms.
Fundamental Implementation Issues
The Practice Note’s approach of defining Gen AI by its outcomes (text generation versus other uses) rather than its fundamental nature creates several governance challenges:
- Who will monitor and enforce the Practice Note?
- What penalties will apply for inadvertent breaches, especially given the Note’s apparent technological misunderstandings?
- Could lawyers using outdated software systems report others for inevitable Gen AI use?
- Should lawyers seek preemptive authorization for Gen AI use, given its inseparability from modern systems?
- How will widespread technical breaches affect enforcement against deliberate violations?
The Value of Human Oversight
One commendable aspect of the Practice Note is its emphasis on human review of Gen AI material. The Note specifically requires verification that all citations, legal and academic authorities, case law, and legislative references:
“(a) exist, (b) are accurate, and (c) are relevant to the proceedings”
A Simpler Solution
The Practice Note would have been more effective had it simply reminded legal practitioners of their existing obligations to properly supervise their staff and their technology, and to carefully review court submissions. The specific prohibitions it contains are already covered by existing ethical requirements.
Creating specific versions of existing prohibitions based on misunderstandings of new technology is unnecessary – just as we don’t need new laws to prevent Gen AI-assisted murder when murder is already illegal!