An Australian lawyer has been referred to the New South Wales Office of the Legal Services Commissioner (OLSC) after admitting to using ChatGPT to generate court documents containing nonexistent case citations in an immigration case.
Federal Circuit and Family Court Justice Rania Skaros made the referral on Friday, with the lawyer’s name redacted from the ruling. The case emerged when the lawyer submitted an amended application and outline of submissions to the court in October 2024, which were found to contain fabricated case citations and quotes.
“Both documents contained citations to cases and alleged quotes from the tribunal’s decision which were nonexistent,” Justice Skaros noted in her ruling.
Lawyer Admits to Using AI for Court Documents
The lawyer acknowledged his error on November 19, expressing deep regret for the mistakes. During a hearing on November 25, he admitted to using artificial intelligence to draft the documents. According to Justice Skaros, “The [lawyer] stated that he had used AI to identify Australian cases, but it provided him with nonexistent case law.”
The court expressed significant concern about the incident, particularly regarding the lawyer’s failure to verify the accuracy of the submitted documents. Justice Skaros highlighted that “a considerable amount of time had been spent by the court and my associates checking the citations and attempting to find the purported authorities.”
Lawyer Uses ChatGPT for Case Summary, Faces Scrutiny
In his defense, the lawyer submitted an affidavit citing time constraints and health issues as factors that led to his decision to use AI. The judgment revealed that “he accessed the site known as ChatGPT, inserted some words and the site prepared a summary of cases for him.” More concerning was the admission that “he said the summary read well, so he incorporated the authorities and references into his submissions without checking the details.”
The immigration minister’s counsel argued for the referral to the OLSC, emphasizing the public interest in addressing AI misuse in legal proceedings. They stressed that “such conduct would continue to occur and must be ‘nipped in the bud’.”
This incident marks the second case in Australia in which a lawyer has faced regulatory scrutiny for AI misuse, following a similar incident in Melbourne last year, when a lawyer admitted to using AI in a family court case that also produced false citations.
AI’s Growing Pains in the Legal Field
The New South Wales Supreme Court has responded to these concerns by implementing new practice guidelines. Starting Monday, these guidelines will restrict the use of generative AI in legal proceedings, specifically prohibiting its use in generating affidavits, witness statements, character references, or other materials intended for evidence or cross-examination.
Justice Skaros emphasized that the use of generative AI in legal proceedings remains a developing issue requiring careful consideration. The lawyer, who was reportedly deeply embarrassed by the incident, has since taken steps to improve his understanding of AI technology.
This case underscores the growing pains of integrating AI into the legal field. The legal profession faces a critical challenge: balancing the potential benefits of AI tools with the need to uphold the integrity of legal processes and documentation.
As AI becomes more prevalent, questions arise about its impact on traditional legal practices. This case highlights the complex intersection of technological advancement and the fundamental principles of law, forcing the legal community to navigate uncharted territory to ensure fairness, accuracy, and accountability in the age of artificial intelligence.