A facial recognition company lied to a school about how racist its algorithm is

The company said its software was only pretty racist, when it's actually almost twice as bad at recognizing Black faces as advertised.

After an arduous battle with parents, civil liberties groups, and even the New York State Education Department (NYSED), the Lockport City School District began using facial recognition software in January. Motherboard reports that the software, SN Technologies' AEGIS system, is even worse at identifying Black faces than the statistics shared with the district suggested. The algorithm, meant in part to alert local authorities to a mass shooting, also mistakes brooms for guns.

Don’t trust the algorithm or its maker — For a while, conversations about racism in technology centered on a lack of diversity and on products like automatic sinks that can't detect dark skin. Over the past decade, and more earnestly in the past five years, attention has shifted to racism baked into algorithms, and facial recognition has become the most ominous AI of all. Understandably, when parents of color in the Lockport City School District, where 11 percent of students are Black, heard the technology was coming to their kids' schools, they fought back.

In a study released last year, the U.S. National Institute of Standards and Technology (NIST) confirmed that even the best facial recognition software has issues with race, gender, and age. Patrick Grother, a NIST scientist who oversaw racial bias testing, told Motherboard — and Lockport's lawyers, last summer — that the system data shared with Lockport didn't match the agency's actual testing data.

An October 2019 audit by the accounting firm Freed Maxick found that the algorithm used by the AEGIS system was far worse at identifying Black people than the company had advertised. SN Technologies claimed the system misidentified Black men only twice as often as white men, and Black women 10 times as often. NIST's tests actually put those figures at four times and 16 times as often, respectively.

What does this mean for students — Late last year, NYSED unceremoniously got on board with the district's push for facial recognition software. Now the pandemic has rendered the $2.7 million spent on the system essentially moot, since widespread mask wearing defeats it. The money came from New York's Smart Schools Bond Act funding, which other districts have more aptly invested in new laptops and better internet connectivity.

A state bill banning facial recognition in schools has passed the legislature and is expected to be signed by Governor Andrew Cuomo before the year is over, but 11 other districts are still trying to follow in Lockport's footsteps. In the meantime, the New York Civil Liberties Union (NYCLU) is representing a group of Lockport parents in a lawsuit against the district.