surveillance

Intel’s classroom AI scans student faces for boredom and frustration

Software made by Class and Intel is designed to give teachers insight into online lessons, but could inadvertently encroach on students' privacy.


Technology company Intel has partnered with virtual school software company Class to develop AI facial analysis software that detects student boredom and confusion in online classes, Protocol reports. Class, whose software extends Zoom for virtual classrooms, announced the partnership with Intel in a March 23 blog post.

The analytics system that detects student facial expressions was developed by Intel using videos of students in real-life classrooms. A team of psychologists viewed the videos, labeled the emotions they could detect in students, and fed that labeled data to an algorithmic model that categorizes student emotions.
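As a rough sketch of what that kind of supervised pipeline looks like (this is not Intel's actual code; the label set, feature representation, and model choice here are all illustrative assumptions), expert-labeled frames become training data for a classifier:

```python
# Minimal sketch of a supervised emotion-classification pipeline.
# NOT Intel's implementation: the labels, features, and model below
# are stand-in assumptions for illustration only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

# Hypothetical label set attached by psychologists, one label per frame.
LABELS = ["engaged", "bored", "confused", "frustrated"]

# Assume each video frame has already been reduced to a fixed-length
# vector of facial features (e.g., landmark distances). Random data
# stands in for real extracted features here.
rng = np.random.default_rng(0)
n_frames, n_features = 1000, 64
X = rng.normal(size=(n_frames, n_features))      # stand-in feature vectors
y = rng.integers(0, len(LABELS), size=n_frames)  # stand-in expert labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Train a classifier on the labeled frames, then check how well it
# generalizes to held-out frames.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test), target_names=LABELS))
```

The weak link in any such system is the training labels themselves: if the psychologists' readings of student faces are inconsistent or culturally biased, the model learns and scales those judgments.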

The software could help instructors get feedback from students' faces during online classes, but it could also threaten student privacy and weaken genuine teacher-student relationships. Even if the tool helps teachers realize when lessons are flying over students' heads (or boring them to death), critics worry the automated emotion detection could also be used as a dystopian, impersonal substitute for person-to-person communication.

Panopticon Academy — EdTech has the potential to free knowledge or to burden students with excessive surveillance; it can be a pedagogical boon or a punitive nightmare. Sorry, your grade today was a B because you blinked a little too long during the lecture. Perhaps if you had furrowed your eyebrows in frustration, your instructor would have gotten feedback to slow down the lesson.

The way people express emotions varies across cultures and situations. Even the best facial recognition software shows accuracy disparities across race, gender, and age, according to a study by the U.S. National Institute of Standards and Technology (NIST).

It’s everywhere — Facial recognition software is seeping into more areas of life, from the “emotion-monitoring systems” used in Chinese prisons and businesses, to police departments’ use of the notorious Clearview AI (which holds a cozy rapport with the alt-right).

In schools, facial recognition systems have been controversial. After a lawsuit from the New York Civil Liberties Union, New York lawmakers passed a moratorium on facial recognition in schools, targeting the Aegis system, whose maker misrepresented how poorly its model identifies Black faces. Higher education, meanwhile, is shelling out for technologies that use pupil movement as a proxy for honesty and flood the internet with "honeypot" sites tempting students to cheat. How's that helpful?