A busy few days
Unregistered psychologists; jury reforms; football bans; AI; and public office
Family judges should not grant permission for expert evidence to be given in children proceedings by “an expert ‘psychologist’ who is neither registered by a relevant statutory body nor chartered by the British Psychological Society” unless there is no alternative, the president of the High Court family division ruled in a judgment delivered on Friday.
Sir Andrew McFarlane was explaining why he had set aside a “flawed judgment” delivered by District Judge G Smith in December 2019. Smith ruled that two children of a separated couple should move immediately to live with their father. At that time the elder, a girl, was 12 years old and her brother was nine.
Smith had heard evidence from Melanie Gill, a psychologist. Without going on to hear from any other witness, the judge concluded that the children had been alienated from their father as a result of the mother’s highly negative attitude towards him.
But, said McFarlane, Smith had fallen into a basic error “by not establishing the factual matrix first, in particular whether there had been domestic abuse, before considering any expert evaluation”.
He added:
The submissions of the mother’s counsel — that that was what should have happened — were spot-on. They were based on the long-established principle that judges decide the facts and experts advise on the basis of those facts — and not the other way around as was, unfortunately, the case here.
At the High Court hearing last month, the boy’s father did not oppose an application from his former wife that their “troubled” son, now 16, should move in with her.
“This judgment is not about Melanie Gill,” McFarlane stressed. “It is, much more worryingly, about the failure of the system to act as it should have done in discharging its responsibility to protect the children and to prioritise their welfare needs.”
But he drew attention to the fact that Gill does not have a clinical or therapeutic practice in which she sees patients. She is not registered with the Health and Care Professions Council.
His 40-page ruling, he explained, was “about those individuals who hold themselves out as ‘psychologists’ and are willing to be instructed in family court cases but who are neither registered, nor chartered, as psychologists.”
McFarlane had been told of a claim by Gill on social media that she had been “exonerated” by a judgment the family division president had given three years ago in a case reported as Re C (“Parental Alienation”: Instruction of Expert).
“If such a claim has been made by Ms Gill,” he said on Friday, “she has fundamentally misunderstood the court’s judgment in Re C which was critical of her claim to any form of expert qualification and which strongly cautioned any court in the future from instructing an expert, such as Ms Gill, who is neither registered nor regulated.”
The context of Friday’s judgment — which goes further than McFarlane’s earlier ruling — was guidance from the Family Justice Council just over a year ago that so-called “parental alienation syndrome” has no evidential basis and is considered a harmful pseudo-science. In accordance with recent practice for anonymised judgments, McFarlane gave his ruling a useful name: Re Y (Experts and Alienating Behaviour: The Modern Approach).
Lammy to announce jury reforms
According to a report on Friday afternoon from the Telegraph’s well-informed home affairs editor Charles Hymas, David Lammy is to press ahead this week with his planned jury reforms.
“The justice secretary is due to unveil the legislation without any alterations despite fierce opposition from within his own party,” Hymas says. “Government sources said polling and canvassing returns showed the plans are not causing the same backlash among voters on the doorsteps, nor registering as a big issue in focus groups.”
Subscribers to A Lawyer Writes will recall that Sir Brian Leveson told me to expect the proposals this month. But detailed analysis of the government’s plans will have to wait for publication of Lammy’s bill and accompanying documents.
Don’t take us for fools, say tribunal judges
“The Upper Tribunal cannot afford to have its limited resources absorbed by representatives who place false information before the tribunal,” three Upper Tribunal judges said in a ruling published on Friday. “The citation of cases which do not exist sends [the] judge on a fool’s errand. The time spent on such an errand is at the expense of other judicial business and is not in the interests of justice.”
Sitting in the immigration and asylum chamber and exercising the Hamid jurisdiction that I explained here last May, Upper Tribunal Judges Lindsley, Keith and Blundell said solicitors “must ensure that fee-earners under their supervision are aware of the dangers of using non-specialist artificial intelligence (AI) for legal research and drafting”.
They continued:
Failure to do so, or to undertake appropriate checks on the drafting of fee-earners, is likely to result in a referral to the Solicitors Regulation Authority or other professional body.
A supervisor who fails to ensure that the work of a more junior fee-earner does not contain false cases or citations is likely to be more culpable than a lawyer who fails to ensure that his own work is free from such “hallucinations”.
The claim form by which judicial review is sought in the Upper Tribunal has now been amended so as to require a legal representative to confirm by a statement of truth that any authority cited within the form or in any documents appended to it (a) exists; (b) may be located using the citation provided; and (c) supports the proposition of law for which it is cited…
Uploading confidential documents into an open-source AI tool, such as ChatGPT, is to place this information on the internet in the public domain — and thus to breach client confidentiality and waive legal privilege — and any such conduct might itself warrant referral to the Solicitors Regulation Authority and should, in any event, be referred to the Information Commissioner’s Office.
The Upper Tribunal was dealing with two cases. It was hearing the first to decide whether to refer Tahir Mehmood Mohammed of TMF Immigration Lawyers to the Immigration Advice Authority for investigation. This turned out to be unnecessary because he had already reported himself to the Solicitors Regulation Authority.
Asked to explain how a “fake case” had been included in grounds of appeal, Mohammed thought he might have inadvertently used the “AI mode” of a Google search. The tribunal observed that “the danger in using AI for legal research is not confined to generative AI models such as ChatGPT”. It said that Google AI was just as likely to generate false results that might at first appear accurate.
The second case involved Zubair Rasheed of City Law Practice in Birmingham. I won’t trouble readers with the details but this was the tribunal’s decision:
We accept that Mr Rasheed is a solicitor with an unblemished record of fifteen years’ practice and we accept that he genuinely regrets what occurred in this case. Having taken into account what was said in Ayinde, however, we consider that the inclusion of false citations in the grounds for judicial review and Mr Rasheed’s failure to supervise the work undertaken by his brother in this and an unknown number of other cases necessitates a referral to the Solicitors Regulation Authority.
We do not consider their mother’s illness to provide an adequate explanation for those failures and we consider that our concerns about Mr Rasheed’s oral evidence serve to strengthen the case for referral. It will be for the Solicitors Regulation Authority to decide what if any action to take thereafter.
Police intelligence
Talking of AI, the House of Commons home affairs committee published its report yesterday on the risk assessment by West Midlands Police that led to a ban on fans of Maccabi Tel Aviv attending the Europa League fixture at Aston Villa on 6 November.
MPs found that West Midlands Police had relied on inaccurate information on the behaviour of Maccabi Tel Aviv fans in reaching a view of them as unusually high-risk. Officers “failed to do even basic due diligence on the information they received. This included false information that was generated by AI”.
Specifically, a police reference to a non-existent West Ham match had been created by Microsoft Copilot.
The committee observed:
Former Chief Constable Guildford was not informed ahead of giving oral evidence on 6 January that Microsoft Copilot AI had been used to generate the erroneous information about a match between West Ham and Maccabi Tel Aviv. On this basis we can only conclude that the former chief constable did not intentionally mislead the committee.
However, by 6 January 2026 we understand that the use of AI had been disclosed within West Midlands Police. So it is reasonable to expect that former Chief Constable Guildford and Assistant Chief Constable O’Hara should have been accurately briefed on this matter.
Having been asked specifically about the use of AI on 1 December 2025, it demonstrates a remarkable lack of professional curiosity on the part of the former chief constable not to interrogate the evidential basis to furnish himself with accurate information ahead of our session on 6 January. The fact that he was able to give the committee incorrect information on two separate occasions is more evidence of the poor due diligence which West Midlands Police applied to information in this case.
West Midlands Police are not the only public body involved in the ban to come in for criticism. The committee said:
While we cannot conclude that the safety advisory group’s decision was made because of political pressure, on the basis of the evidence we have seen we also cannot conclude with any confidence that the decision was not politically influenced. It is clear that on this occasion councillors with a stated political aim had a disproportionate opportunity to influence safety advisory group decision-making on a deeply divisive political issue…
We recommend that the government takes the necessary steps to ensure that elected politicians cannot sit on safety advisory groups.
The government had “increased tension around the fixture but was ineffectual in enabling Maccabi Tel Aviv fans to attend”, say MPs:
The Home Office failed to recognise the significance of the decision and escalate appropriately, which is surprising given that it had already been asked by No 10 for information regarding the fixture. We believe that early intervention could have been achieved in a way that was sensitive to operational independence.
This would have been preferable to the action the government did take in publicly challenging a policing decision before the event had taken place, without full consideration of the evidence that supported that decision.
Dame Karen Bradley MP, chair of the home affairs committee, said:
It is an extraordinary measure to decide to ban fans from attending a fixture, particularly in the cultural and political climate that this occurred in. It is therefore crucial that the decision-making process, and the information underpinning it, is beyond reproach.
Instead, there appears to have been a “that’ll do” attitude. Banning Maccabi Tel Aviv fans would make policing the match much easier. To justify this step, information that showed the Maccabi fans to be a high risk was trusted without proper scrutiny. Shockingly, this included unverified information generated by AI. While Maccabi Tel Aviv fans were falsely characterised as unusually violent, the threat posed by local communities was downplayed and too little care was given to the impact on the Jewish community in Birmingham.
Government intervention was clumsy and came too late, and we reject the government’s argument that it could only intervene once the decision was taken.
Misconduct in public office
It has been a busy few days. Between Thursday morning and Friday lunchtime, I gave 16 broadcast explanations of misconduct in public office. This seemed to impress other broadcasters:
Even more incredibly, if the broadcasters’ on-screen captions were accurate, I was in two different locations at the same time. Sky got it right, by the way.