A new survey of 2,000 UK adults, commissioned by Face Int and conducted by Opinium, has found:
- 69% believe the public should have a say in how facial recognition technology (FRT) is used in the UK
- 61% worry errors could get people into trouble for things they have not done
- 57% are concerned about how images of their face are stored
More than two-thirds of Britons believe the public should be consulted on how facial recognition technology is used, according to new research by Face Int, which also highlights concerns around errors, data privacy and surveillance.
Face Int commissioned Opinium to survey 2,000 UK adults. It found that 69% of respondents believe the public should have a say in how FRT is deployed across the UK.
Concerns around the reliability of the technology and data privacy also emerged as key issues. Some 61% of respondents worry that errors could see individuals wrongly implicated in incidents, while more than half (57%) are concerned about how images of their face are stored.
There is also a broader unease about the implications of the technology. Over half (57%) of respondents believe facial recognition represents a step towards a surveillance society, while 39% think there is a risk of racial bias within FRT systems. More than half (54%) also said facial recognition gives them an eerie sense that “Big Brother” is watching them constantly.
Attitudes are not entirely negative, however: just over half (54%) agreed that people should not worry about facial recognition if they have done nothing wrong.
Tony Kounnis, CEO of Face Int, said: “These findings show that public concern around facial recognition is not abstract or ideological. People are worried about very specific issues – whether the technology can make mistakes, how their data is stored and used, and what it means in the context of wider surveillance.
“That matters for organisations using the technology. It means the conversation cannot stop at whether the technology works. There needs to be clear justification for its use, transparency around how it is deployed, and strong safeguards in place to protect individuals and their data.
“Just as importantly, there is a clear expectation from the public to have a say in how facial recognition is used. Organisations need to recognise that and engage with it, rather than treat it as a purely technical or operational decision.”