The use of AI facial recognition technology to determine the ages of child asylum seekers in the UK threatens to result in more of them being deemed to be adults, campaigners have warned.
The Home Office this month began inviting companies to bid to provide the technology with a view to rolling it out over the course of 2026, as Britain continues to deal with migrants arriving by small boats.
The Border Security and Asylum Minister Angela Eagle has said using AI is the “most cost-effective option” for fixing flaws in the system that assesses the ages of unaccompanied asylum seekers.
Flaws in that system have led to hundreds of children being wrongly deemed to be over 18 and placed in accommodation with adults, where they are vulnerable to abuse and exploitation. But refugee and human rights groups warn that AI technology could produce even less accurate evaluations than the visual assessments it would replace.

Kamena Dorling, director of policy at the Helen Bamber Foundation, told The National the government’s proposals to use AI are “concerning, unless significant safeguards are put in place”.
“Existing evidence has found that AI can be even less accurate and more biased than human decision-making when judging a person’s age, with similar patterns of errors,” she said.
Anna Bacciarelli, senior AI researcher at Human Rights Watch, said the technology has so far been trialled only in a handful of supermarkets and on websites, to check whether customers buying age-restricted goods are under 25, not whether a person is over 18.
“Experimenting with unproven technology to determine whether a child should be granted protections they desperately need and are legally entitled to is cruel and unconscionable,” she told The National.
All asylum seekers who arrive in the UK on their own and give their age as being under 18 are subject to an initial assessment by immigration officers when they are taken ashore at Dover.

They are then also assessed by social workers from local councils, who have responsibility for housing unaccompanied asylum seeker children and placing them in education. They are meant to go through a formal process known as a Merton Assessment, although in many cases that doesn't happen.
A report by the UK’s chief inspector of borders and immigration found the age assessment process was “perfunctory” and that this has led to children being put into adult accommodation. The Helen Bamber Foundation revealed in 2024 that at least 681 children had been wrongly assessed to be adults.
Legal challenges in the courts by asylum seekers wrongly deemed adults have exposed the problems of visual assessments. In one case, a Syrian male deemed by officials to be 28 turned out to be 17.
In another case, a shaving manual produced by Gillette was used to determine that a child was an adult. These children were forced to share rooms with strangers in adult asylum accommodation, and many are now also ending up in adult prisons after being prosecuted for arriving illegally.
The Helen Bamber Foundation cites academic studies which cast doubt on the ability of AI to accurately assess age. In one of them, researchers from Ben-Gurion University of the Negev in Israel found in a 2022 study that when it comes to assessing age, “as it turns out AI platforms make many of the same errors as humans, although to a larger extent”.
Ms Bacciarelli said the turn to AI to assess the ages of asylum seeker children comes after governments in Europe tried various other methods, such as measuring bone density, which have not lived up to expectations. “What we're really worried about with the introduction of AI is that it’s a quick-fix solution to a complex problem,” she said.
She believes the technology likely to be used was not designed for children who are seeking asylum.
She explained that algorithms used for age estimation identify patterns such as the distance between nostrils and the texture of skin. But these cannot account for children who have “aged prematurely from trauma and violence”.
“It doesn't take into account the particular facial features of someone arriving in this country by boat. They may have had their face weathered by exposure to salt water, may have experienced a lot of trauma, not slept and suffered malnutrition. Algorithms cannot grasp how malnutrition, dehydration, sleep deprivation and exposure to salt water during a dangerous sea crossing might profoundly alter a child’s face.”
Ms Dorling said that even if AI is introduced, the final decision on whether an asylum seeker is a child should always rest with qualified social workers and properly trained border staff.
“Replacing poor human decision-making with facial recognition technology that is not sufficiently accurate will not address the current safeguarding crisis,” she said.
The Home Office this week announced the expansion of live facial recognition technology as part of a “targeted” crackdown on high-harm offenders.
A Home Office representative said: “Robust age assessments are a vital tool in maintaining border security.
“We will start to modernise that process in the coming months through the testing of fast and effective AI Age Estimation technology at key Border Force locations, with a view to fully integrate Facial Age Estimation into the current age assessment system over the course of 2026.”