In a move bound to please privacy activists, New York lawmakers have passed a moratorium on the use of Aegis, a facial recognition system, in schools until 2022. The decision comes after the state was hit with a blistering lawsuit from the New York Civil Liberties Union, according to the Lockport Union-Sun and Journal.
All the moratorium needs now to bring the New York education department in line is a green light from Governor Andrew Cuomo.
Key background — In June 2019, Lockport initiated a series of tests on facial recognition and eventually went live with its Aegis program earlier in 2020. It was a controversial move and immediately elicited disapproval from privacy watchdogs. In June, according to the Lockport Union-Sun and Journal, the NYCLU sued the state's education department for greenlighting Lockport's Aegis system. The lawsuit highlighted the possibility of racial bias and deep privacy breaches as consequences of using the program.
The concerns are not too different from what we've previously seen with the notorious Clearview AI, which enjoys a cozy rapport with the alt-right, or with Amazon's facial recognition system, which, among other mishaps, falsely matched 28 members of Congress to mugshots.
Supporters of facial recognition disagree — Not everyone is happy with the decision, though. Proponents argue that the Lockport City school district could benefit from the program, given that schools are expected to open multiple entry points for COVID-19 temperature screening. Aegis, they say, would let the district monitor its student population more effectively.
In a comment to the Lockport Union-Sun and Journal, superintendent Michelle Bradley asserted that Aegis would not in "any way record or retain biometric information relating to students or any other individuals on district grounds."
What critics say — As critics have pointed out, though, the use of such technology could pose serious risks to students, particularly Black and Brown students. In an NYCLU statement, deputy director of the Education Policy Center Stephanie Coyle said:
This is especially important as schools across the state begin to acknowledge the experiences of Black and Brown students being policed in schools and funneled into the school-to-prison pipeline. Facial recognition is notoriously inaccurate, especially when it comes to identifying women and people of color. For children, whose appearances change rapidly as they grow, biometric technologies’ accuracy is even more questionable.
The potential ramifications of such a program could be devastating — and even irreparable — for a young student. "False positives, where the wrong student is identified," Coyle explained, "can result in traumatic interactions with law enforcement, loss of class time, disciplinary action, and potentially a criminal record."