
153. In our view, and on the facts of this case, there is an air of unreality about the
Claimant’s contention. There is no suggestion that, as at April 2017 when the
AFR Locate trial commenced, SWP either recognised or ought to have
recognised that the software it had licensed might operate in a way that was
indirectly discriminatory. Indeed, even now there is no firm evidence that the
software does produce results that suggest indirect discrimination. Rather, the
Claimant’s case rests on what is said by Dr Anil Jain, an expert witness. In his
first statement dated 30th September 2018, Dr Jain commented to the effect
that the accuracy of AFR systems generally could depend on the dataset used
to “train” the system. He did not, however, make any specific comment about
the dataset used by SWP or about the accuracy of the NeoFace Watch software
that SWP has licensed. Dr Jain went no further than to say that if SWP did not
know the contents of the dataset used to train its system “it would be difficult
for SWP to confirm whether the technology is in fact biased”. The opposite is,
of course, also true.

154. In a statement dated 26th November 2018 made on behalf of SWP, Dominic
Edgell, an officer in the SWP’s Digital Services Division, provided information
about the rate of false positive matches based on deployments of AFR Locate
between May 2017 and June 2018. That was that the rate of false positives
was proportionally higher for men than women; and that the proportion of
female false positive alerts compared to the total number of female alerts was
higher than the proportion of male false positive alerts to the total number of
male alerts. When Mr. Edgell investigated this, he concluded that the higher
proportion of female false positives was the consequence of two watchlist
female faces which had significant generic features. His evidence is that the
variation was because these specific faces were on the watchlists, not the
consequence of gender bias. Mr. Edgell also explained that he reviewed the
use of AFR Locate for bias based on ethnic origin. His results suggested no
such bias.

155. In a second statement dated 25th January 2019, Dr Jain commented as follows
on AFR Locate (at paragraph 15):
“I cannot comment on whether AFR Locate has a discriminatory
impact as I do not have access to the data sets on which the
system is trained and therefore cannot analyse the biases in those
data sets. For the same reason, the defendant is not in a position
to evaluate the discriminatory impact of AFR Locate. However,
bias has been found to be a feature of common AFR systems.”
and then on Mr. Edgell’s evidence (at paragraphs 34 to 35):
“34. Mr. Edgell concludes that he has seen no gender bias when
using AFR technology. Despite there being proportionally more
false positive female alerts than false positive male alerts, he
explains this as being due to the presence of two “lambs” …
