Lawyers caution that AI is not a replacement for professional advice
The Victorian Legal Services Board has issued a warning on the rising use of AI in legal practices, saying it carries multiple risks.
Legal regulators across Australia have jointly issued strict guidelines on the use of artificial intelligence (AI) by lawyers, identifying areas where professional standards could be breached if the technology is not carefully monitored.
Some law practices are already using large language models (LLMs)—forms of AI that can process and generate text—and are even developing in-house versions using their own data.
The initiative was led by the Victorian Legal Services Board and Commissioner, but the resulting guidance has also been adopted by the Law Society of New South Wales (NSW) and the Legal Practice Board of Western Australia.
“While enjoying the benefits of AI, it’s important for lawyers to remember that it’s their duty to provide accurate legal information, not the duty of the AI program they use,” warned Victorian Legal Services Commissioner Fiona McLeay.
“Unlike a professionally trained lawyer, AI can’t exercise superior judgement, or provide ethical and confidential services.”
Clients were relying on access to professional expertise when engaging the services of a lawyer, she said.
AI Can ‘Hallucinate’
The new guidance points out that AI chatbots “cannot reason, understand, or advise” and cautions lawyers that they remain responsible for exercising their own forensic judgement when advising clients.
“No tool based on current LLMs can be free of ‘hallucinations’ (i.e. responses which are fluent and convincing, but inaccurate),” the guideline says.
“Generative AI tools can be biased, and cannot understand human psychology or other external complicating factors that may be relevant.”
And if they do use AI, lawyers should ensure it does not unnecessarily increase costs for clients—for example, through additional time spent verifying or correcting its output beyond what traditional methods would have required.
Among the other risks is loss of client confidentiality.
“Lawyers cannot safely enter confidential, sensitive or privileged client information into public AI chatbots/copilots (like ChatGPT), or any other public tools,” the guideline warns.
Law Society of NSW President Brett McGrath said the adoption of the statement reflected lawyers’ need, and willingness, to adapt to changing technology.
“In more than 200 years of legal practice in Australia, technology has evolved from parchment and quill to digital communication, remote working and most recently, the widespread availability of AI. This statement reflects lawyers’ commitment to upholding the rule of law, protecting individual rights and freedoms and promoting access to justice,” McGrath said.