In a world increasingly shaped by AI, a growing number of people are turning to it for private client legal advice. An individual may think their situation is straightforward, and AI can appear a quick and simple tool to use: it generates drafts in seconds and predicts outcomes based on data, seemingly resolving an administrative and emotive burden that may have been put off for months or even years.
According to recent research conducted by Censuswide, 72% of UK adults aged 30 to 34 would consider using AI for will-writing. However, many people do not realise that they are putting their estate and loved ones at risk through errors, ambiguous wording that can invite disputes, or a failure to meet legal formalities. Moreover, no matter how advanced the AI model, or how many times a query is refined in pursuit of a different answer (often simply the answer the user wishes to hear), AI cannot replace a trusted human advisor. AI mimics empathy, using carefully chosen words and, in some cases, human-like intonation, but it does not match the warmth, reassurance, or emotional intelligence of a skilled private client practitioner.
Although AI is set to become a significant feature of most workplaces, private client legal advice must surely remain fundamentally human and personal. The best advisors listen and support; they become trusted confidants, helping clients navigate personal decisions, grief, vulnerability, and family dynamics. Whilst AI may simulate these human qualities, it lacks the empathy gained from years of legal practice and life experience.
Empathy in Private Client practice
Empathy is essential in private client matters, allowing practitioners to truly understand a client’s values, fears, and intentions. In some situations, this means advising against a technically perfect solution if it does not align with a client’s emotional needs or family dynamics.
Advisers offer not only legal solutions but also authenticity, compassion, and insight. Private client lawyers are entrusted with clients’ personal concerns about life and death; conversations of this nature require sensitivity, irreplaceable by a machine.
A skilled practitioner notices far more than just words. A meaningful pause, hesitation before an admission, the emotion behind a decision, a sideways glance: all speak volumes. What is left unspoken is often of the greatest significance, and reading between the lines can lead to the most meaningful moments in client meetings. These subtleties are neither detected nor interpreted by AI. A chatbot cannot begin to understand the complexity of grief, years of sibling rivalry, or the silent dignity of a widowed client trying to maintain equilibrium in a broken family after a death.
Putting AI to the test
Question 1 to AI:
‘I want to leave my home to my partner, but my kids don’t like him’
Response: AI provides a technical overview of the advantages and disadvantages of life interest trusts and right of occupation clauses. It does not address the emotional implications of such a decision or guide a client through the difficult family conversations that may follow. It offers to draft ‘simple wording for your Will’, the result of which could be devastating: although this may be a common scenario, and AI recognises as much, the advisable outcome is rarely as straightforward as adding simple wording to a Will.
Question 2 to AI:
‘My brother died without a Will and I’m not sure what to do’
Response: AI provides a practical, empathetic response (‘I’m sorry to hear about your brother—dealing with grief and legal matters at the same time can feel overwhelming’) and offers to draft letters to financial institutions for date-of-death values. AI is unaware, however, that family disharmony may lead to a claim on the estate under the Inheritance (Provision for Family and Dependants) Act 1975.
The risks of using AI in place of professional advice
When advising on the preparation of a Will and on estate planning, it is essential to understand a client’s intentions and to consider any grounds that could give rise to a challenge to the Will after death. Practitioners should keep detailed attendance notes both when taking instructions and when the Will is signed, which reduces the risk of a successful claim based on, for example, undue influence or a lack of knowledge and approval of the content. Furthermore, for a Will to be valid, it must be signed and witnessed in accordance with the correct formalities. AI platforms cannot ensure compliance with any of these procedures.
AI can also make errors, and the danger is that, where it replaces professional advice, a client is likely to be oblivious to them. In the recent case of B Zzaman v HMRC [2025] UKFTT, Mr Zzaman represented himself at the First-tier Tribunal, relying on AI-generated arguments that used legal references which were neither correct nor relevant and cited case law that had nothing to do with tax or his personal situation.
Empathy over automation
In private client matters, trust, discretion, and emotional sensitivity are paramount, and human connection remains irreplaceable. Empathy is not a soft skill here; it is a foundation that no algorithm can truly deliver.
Advising vulnerable clients is both a responsibility and a privilege. Unlike a chatbot, lawyers are not only accountable for the advice they provide to clients, but are also bound by the stringent standards and regulations set by the Solicitors Regulation Authority and the Law Society.
If you have questions or concerns and need to speak to a private client lawyer, please contact Charlotte Pollard.
This article is for general information purposes only and does not constitute legal or professional advice. It should not be used as a substitute for legal advice relating to your particular circumstances. Please note that the law may have changed since the date of this article.