The use of automated facial recognition technology (AFR) has caused outrage across the world, particularly through the irresponsible practices of its users.
The issue of AFR has many facets, including whether the technology is used by public or private bodies, where developers can source individuals’ images, the confirmed bias in most versions of the technology, and to what extent the consent of members of the public is necessary.
Unfortunately, despite there being so much we still don’t understand about this technology, police forces have been utilising it across the world, and questions have been raised about its use at BLM protests.
When the police are using this new technology on members of the public, we can rightfully question whether they are doing so legally or fairly.
Today’s judgement in the UK’s Court of Appeal helps to answer some of those questions.
Ed Bridges v South Wales Police
Ed Bridges’ case started in 2018, when he argued that South Wales Police had acted unlawfully in using AFR to capture his image twice.
In the first court case, the High Court did agree that AFR had the potential to be unlawful – because it could infringe Mr Bridges’ right to privacy – but found that the way South Wales Police had used it was justified. Mr Bridges and the Liberty legal team appealed this decision to the Court of Appeal, and on 11 August 2020 that court overturned the previous decision.
Liberty’s argument was based on five points, and the court agreed with three of them. Each will have a vital impact on the future of AFR in this country. Below are the implications to keep in mind, and a brief explanation of the points which produced them.
1. The government and local authorities will be forced to properly consider the implications of AFR in the UK, and produce clear guidance on how, if at all, it can be properly used.
The High Court had already agreed that the use of AFR has an effect on the public’s right to privacy (under Article 8 of the European Convention on Human Rights). However, this right can lawfully be infringed when doing so is necessary and proportionate. The High Court had decided that it was in this case.
Liberty disagreed. If the state wants to justify its actions when it infringes the privacy rights of a member of the public, one thing it needs to prove is that its actions are legal according to the state’s own laws.
This is important, so that the actions of the police remain transparent and can be held to a reasonable standard. Otherwise it would be very hard to judge which actions are justified, as the police would be able to do anything they wanted – including illegal acts – and then claim justification afterwards.
In this case, there is very little guidance on how AFR can be used. The limits to the police’s powers were therefore not sufficiently clear, and there was too much uncertainty as to what they were and weren’t supposed to be doing. This makes it very hard to show that their actions were justified.
This means that the government and the Surveillance Camera Commissioner will have to discuss the proper use of AFR and put some clarifying legislation in place. This will be a very useful exercise to consider the implications of this technology on the public, and we look forward to this discussion.
2. The fact that AFR infringes on the public’s right to privacy is now undisputed, and any future use of it will have to take this into account
South Wales Police is required to carry out a proper impact assessment before using new technology. This assessment is a very important document, as it helps the police understand how their actions might affect the public and avoid any negative impacts.
However, when South Wales Police put together their impact assessment, they started from the assumption that they would not be infringing the public’s right to privacy. They therefore never put plans in place to limit this infringement (which did exist), or to limit the negative impact it could have on people like Mr Bridges.
This means that any future use of AFR must start from the basis that its use CAN (and usually will) infringe the public’s right to privacy. The user must then find ways to limit or justify this.
3. The racial bias found within most AFR must be considered in any future use by a public authority
One of the major ongoing issues of AFR is that it is regularly found to be biased. Numerous studies have found that it has difficulty differentiating the faces of black people and women.
One potential explanation is bias in the data used to train the technology. Because developers mainly ran images of white men through the system to teach it, it became adept at learning the differences between their faces. As fewer images of black people or women were put through the system, it had more difficulty learning the differences between theirs.
Whether this is the true reason or not, the impact is that there are far more false positives for black people and women. That means far more black people and women are likely to be flagged up by the system as being connected to a crime when they were not.
Unfortunately, South Wales Police failed to factor this into their use of the system. The Equality Act 2010 requires them to consider whether their actions will result in discrimination against a particular group. However, when they filled in their impact assessment, they never properly investigated this. They simply continued on the assumption that it would not, despite extensive evidence to the contrary.
Therefore, when they used the technology, they exposed every black person and woman they encountered to its documented discrimination, and made no effort to stop this.
This means that, in the future, racial biases in the system will have to be investigated, acknowledged and accounted for. Hopefully, this can force AFR developers and users to remove this bias from the system.