Lawyer Competence in the Age of AI

AI adoption by lawyers is growing. Yet concerns and fears remain about how ethical duties and professional obligations apply when lawyers use this technology.

In addition to their duties as officers of the court and to the administration of justice, lawyers owe a duty of competence, a duty reflected in jurisdictions across the world. This edition focuses on what legal competence could mean in the dawning age of AI.

What does competence in the age of AI mean?

Whilst the duty of competence is reflected in many jurisdictions, for example Australia, New Zealand, the United Kingdom, Canada and the United States, some jurisdictions expand the existing duty to expressly include technology. The American Bar Association, for example, has expanded its commentary on the duty of competence to include a duty of technological competence. This LinkedIn contribution by Professor Jaen offers a comprehensive comparative study of the position in the United States in this exact context.

For those of us in the Antipodes, the situation can seem more challenging, because it is only recently that the duty of competence has been recognised as extending to technology in legal practice. Most competence findings to date arise from general file and deadline mismanagement (something proper use of technology could certainly assist with), and technology is not directly mentioned in those cases.

The use of AI is of concern because much of what AI (or at least the ChatGPT variety) appears to do is legal research and legal writing. The issue lies not in the technology but in our understanding of it. AI is an enormous field of technology; referring to it generically is like a lawyer referring to "public law" when specifying "criminal law" would be far more helpful in context. The reality is that, just as in legal practice, the devil with AI is in the details.

Looking at how lawyers have got into trouble with AI, the only example so far is a US lawyer whose competence was called into question after citing an AI-hallucinated case. Other concerns seem less about client complaints or individual lawyer conduct and more about conflicts with professional standards generally. For example, most popular AI systems are Large Language Models (LLMs), which tend to amplify the biases in the content used to train them. Such bias has the potential to cut against professional standards, including the duty to the court and the administration of justice. A further concern relates to the general-purpose nature of ChatGPT, and whether a law-specific AI tool should be used rather than one that is not built for legal work.

How can you improve your competence in the age of AI?

Returning to competence generally, failing to sufficiently check the case law in accordance with the doctrine of precedent fits squarely within the realm of competence. This is something we are all taught in law school, and it does not depend on any particular technology. That should provide some relief: the skills learnt there are still relevant.

Similarly, lawyers who do not familiarise themselves with technology could fall foul of duties beyond competence. Technology is transforming law at a rate that produces advances with the potential to aid a lawyer's duty to the administration of justice, clients' interests, and obligations to the courts. It stands to reason, then, that failing to avail themselves of the technological tools available could lead to findings of ethical failure.

This is not a situation where lawyers can stick their heads in the sand and wait for this revolution to go by.

The technology is useful for lawyers, and clients are more likely to choose the lawyer who has optimised their practice to deliver the efficiencies available. It does mean, though, that those efficiencies cannot come at the cost of general legal competence.

A roadmap to lawyer competence with AI

Here's a brief roadmap to navigating lawyer competence in this age of AI:

  • Seek efficiencies where there is no lawyering involved. Examples include process and productivity tools that are not directly related to legal analysis or research.
  • Seek out specific legal tools, especially where they relate to analysis and research. If a research tool is not specifically a legal one, it is unlikely to meet the benchmark of suitability.
  • Do not dismiss technology because of the incompetence of others. Their mistakes offer an easy lesson in where the traps lie, not a lesson to avoid the technology altogether.

If you find this newsletter useful, please consider sharing it with a friend or colleague. If you're inspired and want to discuss, I'm open to all sorts of conversations around law, lawyering, technology and wellness.
