Parabon’s technology “does not tell you the exact number of millimeters between the eyes or the ratio between the eyes, nose, and mouth,” says Greytak. Without that level of precision, facial recognition algorithms cannot return accurate results. Deriving such precise measurements from DNA would require fundamentally new scientific discoveries, she says, and “the papers that have tried to do prediction at that level have not had much luck.” Greytak says Parabon predicts only the general shape of someone’s face (though the scientific feasibility of even that level of prediction has also been questioned).
Police have been known to run forensic sketches based on witness descriptions through facial recognition systems. A 2019 study from Georgetown Law’s Center on Privacy & Technology found that at least half a dozen police agencies in the US “permit, if not encourage” the use of forensic sketches, whether hand-drawn or computer-generated, as input photos for face recognition systems. AI experts have warned that such a process is likely to lead to lower levels of accuracy.
Corsight has also been criticized in the past for exaggerating the capabilities and accuracy of its face recognition system, which it has called the “most ethical facial recognition system for highly challenging conditions,” according to a slide deck presentation available online. In a technology demo for IPVM last November, Corsight’s chief executive, Watts, said that Corsight’s face recognition system could identify someone wearing a mask — not just a surgical mask, but a ski mask. IPVM reported that running Corsight’s AI on a masked face yielded a 65% confidence score, Corsight’s own measure of how likely it is that the captured face matches one in its database, and noted that the mask in question was more accurately described as a balaclava or neck gaiter, as opposed to a ski mask with only mouth and eye cutouts.
Broader issues with the accuracy of face recognition technology have been well documented (including by MIT Technology Review). They are more pronounced when photographs are poorly lit or taken at extreme angles, and when the subjects have darker skin, are women, or are very old or very young. Privacy advocates and the public have also criticized face recognition technology, particularly systems such as Clearview AI, which scrapes social media as part of its matching database.
Law enforcement use of the technology is particularly charged: Boston, Minneapolis, and San Francisco are among the many cities that have banned it. Amazon and Microsoft have stopped selling face recognition products to police, and IBM has withdrawn its face recognition software from the market.
“The idea that you’ll be able to create something with the level of finesse and fidelity needed to do a face search — to me, it’s ridiculous,” says Albert Fox Cahn, a civil rights lawyer and executive director of the Surveillance Technology Oversight Project, which works extensively on issues related to face recognition systems. “It’s pseudoscience.”
Dzemila Sero, a researcher in the Computational Imaging Group of Centrum Wiskunde & Informatica, the national research institute for mathematics and computer science in the Netherlands, says the science to support such a system has not yet been sufficiently developed, at least not in public. Sero says the catalog of genes needed to produce accurate depictions of faces from DNA samples is currently incomplete, citing Human Longevity’s 2017 study.
In addition, factors such as environment and aging have significant effects on faces that cannot be captured by DNA phenotyping, and research has shown that individual genes do not affect the appearance of a person’s face as much as their sex and ancestry do. “Premature attempts to implement this technique are likely to undermine trust and support for genomic research and garner no societal benefit,” she told MIT Technology Review in an email.