NYPD Used Celebrity Lookalikes to Alter Facial Recognition Results, Researchers Say


NEW YORK — The use of facial recognition by law enforcement continues to be a hot topic. While it has the potential to be a valuable tool, there is no guarantee it will be used responsibly.

A new report on law enforcement’s use of surveillance technology said the New York Police Department abused its facial recognition system by editing suspects’ photos.

Researchers from Georgetown's Center on Privacy and Technology, which specializes in facial recognition, obtained documents for their investigation into the NYPD, NBC News reports.

They found that the department was editing photos and uploading celebrity lookalikes into the facial recognition software in an effort to identify people wanted for crimes.

The report details an April 2017 case in which NYPD investigators were trying to identify a man caught on surveillance video stealing beer from a CVS. The researchers said the image was low quality and did not produce any potential matches in the facial recognition system.

A detective, however, noted that the suspect looked like actor Woody Harrelson, so an image of the actor was submitted in the suspect’s place.

From a new list of results, detectives found a man they believed was a match and arrested him, according to the researchers.

The report also found evidence that the NYPD doctored images of suspects to make them look more like mugshots, replacing facial features with those from photos of models found on Google.

“These techniques amount to the fabrication of facial identity points: at best an attempt to create information that isn’t there in the first place and at worst introducing evidence that matches someone other than the person being searched for,” the report says.

The report calls for a ban on police use of the technology, like the one San Francisco passed just last week.

“It doesn’t matter how accurate facial recognition algorithms are if police are putting very subjective, highly edited or just wrong information into their systems,” says Clare Garvie, the report’s author and a senior associate at the Center on Privacy and Technology. “They’re not going to get good information out. They’re not going to get valuable leads. There’s a high risk of misidentification. And it violates due process if they’re using it and not sharing it with defense attorneys.”

The report also documented incidents of officers misusing facial recognition technology in Maricopa County, Ariz.; Washington County, Ore.; and Pinellas County, Fla.

“At least half a dozen police departments across the country permit, if not encourage, the use of face recognition searches on forensic sketches — hand drawn or computer generated composite faces based on descriptions that a witness has offered,” the report states.

The Washington County Sheriff’s Office said in a statement that it “has actually never used a sketch with our facial recognition program for an actual case. A sketch has only been used for demonstration purposes, in a testing environment.”

The Pinellas County Sheriff’s Office responded similarly, while the Maricopa County Sheriff’s Office said it no longer maintains a facial recognition system.

The NYPD initially fought Georgetown’s efforts to obtain information about how its facial recognition system worked, but ultimately handed over thousands of pages of documents.

The department said in a statement that it “has been deliberate and responsible in its use of facial recognition technology” and has used it to solve a variety of crimes.

Many officers say facial recognition technology helps solve cases that would otherwise go cold, such as homicides, rapes, and attacks in the city’s subway system, where the suspect is often unidentified.

The department did not dispute the facts stated in the Georgetown report, but said it is reviewing its facial recognition protocols.

The report also provides recommendations for law enforcement agencies that choose to continue to use face recognition in their investigations:

  • Stop using celebrity look-alike probe images.
  • Stop submitting artist or composite sketches to face recognition systems not expressly designed for this purpose.
  • Follow minimum photo quality standards, such as pixel density and the percentage of the face that must be visible in the original photo (illustrated in the sketch after this list).
  • Carefully document any edits made to the image and their results.
  • Prohibit the use of face recognition as a positive identification under any circumstance.
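To make the quality recommendations above concrete, here is a minimal sketch of how an agency might screen a probe photo before submitting it. The report does not publish numeric thresholds, so the `ProbePhoto` fields, the `screen_probe` function, and both threshold values below are illustrative assumptions, not figures from the Georgetown report.

```python
from dataclasses import dataclass

# Illustrative thresholds only -- the report recommends minimum photo quality
# standards, but these specific numbers are assumptions for demonstration.
MIN_INTEREYE_PIXELS = 60       # assumed minimum pixel distance between eye centers
MIN_FACE_VISIBLE_RATIO = 0.8   # assumed minimum fraction of the face visible

@dataclass
class ProbePhoto:
    """Measurements an analyst or face detector records for a probe image."""
    intereye_pixels: float     # pixel distance between the eyes (proxy for pixel density)
    face_visible_ratio: float  # fraction of the face unoccluded in the original photo
    edits_documented: bool     # whether any edits to the image were logged

def screen_probe(photo: ProbePhoto) -> list[str]:
    """Return the reasons a probe photo fails the recommended standards."""
    problems = []
    if photo.intereye_pixels < MIN_INTEREYE_PIXELS:
        problems.append("pixel density below the minimum standard")
    if photo.face_visible_ratio < MIN_FACE_VISIBLE_RATIO:
        problems.append("too little of the face is visible in the original photo")
    if not photo.edits_documented:
        problems.append("edits to the image were not documented")
    return problems

# Example: a low-quality surveillance still like the one in the CVS case above
print(screen_probe(ProbePhoto(intereye_pixels=34,
                              face_visible_ratio=0.6,
                              edits_documented=False)))
```

Under checks like these, the blurry CVS still would have been rejected outright rather than swapped for a celebrity photo.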

You can see the full report and its recommendations here.


Editor’s Note: This story first ran in Security Sales and Integration’s sister publication Campus Safety.
