Why You Shouldn’t Rely on AI for Legal Advice

With the rise of ChatGPT and Google’s AI Mode, more and more people are using generative AI tools to explain complex topics in simpler terms. However, these tools are no substitute for a qualified lawyer when it comes to legal matters. Australian law is detailed, constantly changing, and heavily dependent on the specifics and nuances of your situation, which AI simply cannot interpret reliably. Whether you’re making a workers compensation claim or need advice on another area of law, relying on AI can lead to misinformation, costly errors and missed deadlines.

How Generative AI Tools Work

AI chatbots generate answers by predicting words based on patterns in existing text. They can’t actually “think” for themselves; they simply decide what the next most likely word in a sentence should be. They don’t understand context, intent or consequence, and will very rarely ask for this information before presenting an answer. This means they can’t interpret the important nuances of your case.
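To make this concrete, here is a deliberately oversimplified sketch of how next-word prediction works. The vocabulary and probabilities below are invented purely for illustration; real chatbots score many thousands of possible words at every step, but the principle is the same: the tool picks a statistically likely continuation, not a verified fact.

```python
# A toy model of next-word prediction. The words and probabilities here
# are made up for illustration only; real chatbots learn these patterns
# from enormous amounts of text and score far larger vocabularies.

next_word_probs = {
    "compensation": 0.41,  # statistically common after "workers"
    "rights": 0.22,
    "union": 0.14,
    "strike": 0.12,
    "entitlements": 0.11,
}

def predict_next_word(probs: dict[str, float]) -> str:
    """Return the statistically most likely continuation."""
    return max(probs, key=probs.get)

print("workers", predict_next_word(next_word_probs))
# Prints "workers compensation": a plausible continuation, chosen with no
# check that it is accurate, current, or relevant to your jurisdiction.
```

Notice that nothing in this process checks whether the output is true. That is why an AI answer can sound confident and authoritative while still being wrong.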

AI is designed to provide general information, not tailored legal advice. It can summarise legislation or explain basic legal concepts, but it cannot consider the unique details of your case. For example, if you ask about workers compensation eligibility in Western Australia, AI might list broad criteria, but it won’t weigh the medical evidence, employer obligations or timeframes relevant to your specific claim. Only a qualified lawyer can assess your evidence, apply the correct legislation, and represent your interests.

Why AI Often Gets the Law Wrong

Outdated Information

One of the biggest problems with AI is that it may not reflect current law. For example, WA’s workers compensation framework was overhauled in July 2024, and many approved forms were updated as recently as July 2025. Yet AI models still frequently cite the repealed Workers Compensation and Injury Management Act 1981, which is no longer valid. This is partly because training data has a fixed cutoff (the latest version of ChatGPT is only trained on data up to September 2024) and is dominated by decades of material written under the old Act. Acting on outdated information could lead to invalid claims or incorrect calculations.

AI “Hallucinations” and Fabricated Cases

Hallucinations are AI-generated responses that present false or misleading information as fact. AI tools have been caught inventing legal cases and citing judgments that don’t exist. For example, in February 2025, an Australian lawyer used ChatGPT to prepare submissions in an immigration case. It was later found that 17 of the cases cited in the applicants’ documents did not exist [1].

Wrong Jurisdictions

AI often draws on sources from other states or even overseas. An answer about workers compensation in Western Australia might reference New South Wales or Queensland laws that have no relevance to your case. Applying another jurisdiction’s principles could seriously compromise your legal position.

How AI Can Mislead Workers Compensation Claimants

Workers compensation claims involve multiple stages, from medical assessments and collecting evidence to insurer correspondence and dispute resolution. AI cannot guide you through each stage or anticipate the legal strategies employers or insurers might use. Some people submit self-drafted WorkCover claims based on AI-generated templates, only to have them rejected for missing essential details; others follow incorrect timeframes and lose entitlements as a result. A rejected claim can lead to months of delays, lost income and expensive appeals. Early legal advice from a qualified workers compensation lawyer could prevent this, and often costs less than fixing mistakes later.

AI and Client Confidentiality

Legal professional privilege is a fundamental protection that keeps confidential conversations and documents between a lawyer and their client from being disclosed. This protection allows clients to speak openly and honestly with their lawyers. However, a client can waive this privilege by failing to treat the information as confidential. Disclosing your case details to a publicly available AI model, for example, poses a major risk to that confidentiality.

Legal profession regulators from Western Australia, New South Wales, and Victoria have specifically warned that lawyers cannot safely put confidential, sensitive or privileged client information into public AI chatbots [2]. Using these tools can be seen as acting in a way that is inconsistent with preserving confidentiality, which could potentially harm your case.

Better Alternatives to AI for Legal Help

AI can be a useful tool for getting general information, but it has its limits. Be cautious and avoid over-relying on AI tools, as they are not a substitute for legal advice. A qualified lawyer can interpret your circumstances, anticipate legal challenges, and advocate on your behalf (all things AI cannot do). There are also services such as WorkCover WA, which can offer limited guidance and support in relation to workers compensation.

If you have been injured at work or are currently going through the compensation claims process, it’s best to get legal advice from a qualified lawyer. We can help protect your rights and guide you towards the best possible outcome.