Toronto-area police adopt facial recognition tech linked to Black man’s wrongful arrest in New Jersey | CBC News

A New Jersey man who was wrongly jailed after being misidentified through facial recognition software has a message for two Ontario police agencies now using the same technology.

“There’s clear evidence that it doesn’t work,” Nijeer Parks said.

Parks, now 36, spent 10 days behind bars for a January 2019 theft and assault on a police officer that he didn’t commit. He said he was released after he provided evidence he was in another city, making a money transfer at the time of the offence. Prosecutors dropped the case the following November, according to an internal police report.

Investigators identified Parks as a suspect using facial recognition technology, according to police documents provided as part of a lawsuit filed by Parks’s lawyer against several defendants, including police and the mayor of Woodbridge, N.J. The lawsuit names French tech firm Idemia as the developer of the software.

Police in Peel and York regions, near Toronto, announced in late May they were jointly implementing Idemia’s technology, which they will use to compare existing mugshots with crime scene images of suspects and persons of interest. 

Parks said his case highlights the limitations of such software.

“He doesn’t look anything like me,” Parks, who is Black, said of the man in the picture that police used to identify him. “I’m like … you’re basically telling me we all look alike.”

The photo had come from a fake Tennessee driver’s licence the suspect provided to officers at the scene of the theft, according to a police report submitted as a court exhibit in the civil case. 

A suspect handed police in Woodbridge, N.J., this fake Tennessee driver’s licence bearing the name Jamal Owens. (United States District Court for the District of New Jersey exhibit)

The man was accused of stealing snacks from a hotel gift shop in Woodbridge, N.J., and nearly running over an officer as he later sped away. 

Two days later, an investigator emailed a Woodbridge detective a PDF file containing a “good possible hit on facial recognition,” according to court exhibits reviewed by CBC News. 

“That’s him,” the detective replied, referring to the suspect from the hotel incident. 

Parks was arrested and charged with a series of offences, including aggravated assault and resisting arrest. According to a transcript of his police interview, he told an investigator he had, in fact, never been to Woodbridge, which is roughly 40 kilometres from his home in Paterson, N.J. 

Parks recently described the ordeal to CBC as an “out-of-body experience, because it was something that I couldn’t believe was happening.”

Parks speaks to CBC News from his home in Paterson, N.J. (CBC)

In Ontario, police insist they’ve implemented safeguards to prevent a mismatch from resulting in an arrest.

“It’s the human element,” York Regional Police Const. Kevin Nebrija told CBC. He said investigators will personally “look at the match and see if that supports other evidence that we’ve obtained.”

York and Peel police each said separately that the software would be used as an additional tool to provide investigative leads and would not serve as the sole basis for an arrest. They also said the system would not be used to analyze live video.

“Idemia Face Expert will be used to aid human decision-making, not replace it,” Peel Regional Police Deputy Chief Nick Milinovich said in a video posted online. “It will improve public safety for everyone.”

Allegations of ‘biased technology’

Research has repeatedly pointed to shortcomings in facial recognition technology, particularly the risk it will misidentify racialized individuals.

Parks’s lawsuit partly blames his wrongful arrest on the “misuse of biased technology.” 

The Township of Woodbridge declined CBC’s request for comment on the matter, as the case remains in litigation.

A representative for Idemia did not respond to emailed questions.

The American Civil Liberties Union (ACLU) earlier this year filed a court brief in support of Parks, stating “officers unreasonably relied on a shaky lead from fundamentally unreliable technology.”

“As in this case, the harms of [facial recognition technology] misidentification disproportionately fall on Black Americans,” the ACLU wrote.

The U.S. General Services Administration, which oversees federal contractors, said in a 2022 report that such tools disproportionately failed to match African Americans in its tests.

In a video posted online, a Peel Regional Police officer demonstrates Idemia’s facial recognition software. (Peel Regional Police)

Yuan Stevens, an academic associate at McGill University’s Centre of Genomics and Policy in Montreal, said there should be more transparency about the way facial recognition algorithms are refined.

“It’s actually very possible that Idemia’s database was trained on white European faces, [so] people of colour, such as myself, would be more wrongfully suspected of a crime more often.” 

Stevens said Black and Indigenous faces are frequently overrepresented in mugshots, since such databases “contain images of people who are subject to heightened scrutiny and surveillance by the police.”

Idemia cited as most accurate

Idemia has disputed allegations of bias in its software.

In slides prepared for a 2018 presentation titled “Face Recognition Evaluation @ Idemia,” a representative wrote the company’s algorithm has the “same [false positive identification rate] for Black or white subjects, male or female.”

An online demonstration of Idemia’s Face Expert software shows it can be used to match images from a database with faces seen in CCTV video. (Idemia)

York Regional Police said on their website “in the past five years, facial recognition technology has made tremendous strides in accuracy and demographic differences,” citing data from the U.S. National Institute of Standards and Technology (NIST).

Among a list of vendors, NIST ranked Idemia’s algorithm in 2022 as the most accurate on a false match rate fairness test.

Ontario Provincial Police said they’re looking into implementing a similar program, while evaluating “accuracy, privacy implications and potential biases associated with facial comparison.”

The RCMP said it has asked some third-party vendors to disable facial recognition functions integrated into tools used by the national police force.

In 2014, Calgary police became the first police agency in Canada to use facial recognition technology, launching a system designed by NEC Corporation of America.

The Toronto Police Service said it’s been using facial recognition since 2018. Its website also lists NEC as the technology provider. 

Investigators in both cities briefly used the controversial Clearview AI system, which searched images of the public scraped from the internet.

Peel and York police said they discussed their plan with the province’s Information and Privacy Commissioner.

The commissioner’s office told CBC it “does not endorse, approve or certify” any program it’s consulted on.

The office provides public guidance for police agencies seeking to use facial recognition to search through mugshot databases.

As for Parks, he and his lawyer have requested a summary judgment, meaning their case wouldn’t need to go to trial. His lawyer, Daniel Sexton, said he’s also been in talks to settle the case out of court.

“I don’t want to see anyone go through what I went through,” Parks said.
