Judgment Approved by the court for handing down.
R (Bridges) -v- CC South Wales & ors
“To minimise any impact of bias as a result of gender, the
NeoFace Algorithm training data set contains roughly equal
quantities of male and female faces.”
At para. 24 he states that the NeoFace Algorithm training data includes a wide spectrum
of different ethnicities and has been collected from sources in regions of the world to
ensure a comprehensive and representative mix. He states that great care, effort and
cost are incurred by NEC, as a socially responsible major corporation, to ensure that this
is the case.
197. Dr Jain responded to Mr Roberts’ statement in a second witness statement dated 25
January 2019. He fairly acknowledges, at para. 15, that he cannot comment on whether
AFR Locate has a discriminatory impact as he does not have access to the datasets on
which the system is trained and therefore cannot analyse the biases in those datasets.
He goes on to say, however, that bias has been found to be a feature of common AFR
systems and that SWP themselves are not in a position to evaluate the discriminatory
impact of AFR Locate.
198. At paras. 24-28, Dr Jain specifically responds to the witness statement of Mr Roberts.
He expresses the opinion that what Mr Roberts says is not sufficient to be able to
determine that the NeoFace algorithm is not biased towards a particular demographic
group. To make this determination, he says, a thorough evaluation needs to be done of
the demographic composition of the NeoFace algorithm training dataset. Dr Jain states,
at para. 28, that without that information SWP are not able to assess whether the
training dataset is or may be biased.
199. We acknowledge that it is not the role of this Court to adjudicate on the different points
of view expressed by Mr Roberts and Dr Jain. That would not be appropriate in a claim
for judicial review, still less on appeal. The fact remains, however, that SWP have
never sought to satisfy themselves, either directly or by way of independent
verification, that the software program in this case does not have an unacceptable bias
on grounds of race or sex. There is evidence, in particular from Dr Jain, that programs
for AFR can sometimes have such a bias. Dr Jain cannot comment on this particular
software but that is because, for reasons of commercial confidentiality, the
manufacturer is not prepared to divulge the details so that it could be tested. That may
be understandable but, in our view, it does not enable a public authority to discharge its
own, non-delegable, duty under section 149.
200. Finally, we would note that the Divisional Court placed emphasis on the fact that SWP
continue to review events against the section 149(1) criteria. It said that this is the
approach required by the PSED in the context of a trial process. With respect, we do
not regard that proposition as correct in law. The PSED does not differ according to
whether something is a trial process or not. If anything, it could be said that, before or
during the course of a trial, it is all the more important for a public authority to acquire
relevant information in order to conform to the PSED and, in particular, to avoid
indirect discrimination on racial or gender grounds.
201. In all the circumstances, therefore, we have reached the conclusion that SWP have not
done all that they reasonably could to fulfil the PSED. We would hope that, as AFR is
a novel and controversial technology, all police forces that intend to use it in the future