Facial Recognition In The Classroom Tells Teachers When Students Are Spacing Out

Instead of filming teachers to evaluate their performance, the EngageSense system analyzes student faces and spits back metrics about whether the lesson is keeping their interest.

Cameras in the classroom are usually trained on the teacher. Watch, and you’ll see what works, the thinking goes. Bill Gates showed off one classroom Flip cam setup in a TED Talk earlier this year. Afterwards he said: “One day we’d like every classroom in America to look something like that.”

But engineers at SensorStar Labs in Queens, New York, have an idea that takes the recorded classroom to Big Brother levels: cameras for every student’s face, and an algorithm to analyze the footage.

“This idea of adding the cameras and being able to use that information to assist teachers to improve their lessons is already underway,” says SensorStar Labs co-founder Sean Montgomery. “Where this is trying to add a little value on top of that is to make it less work for the teachers.”

Montgomery repeatedly emphasized that the technology, which SensorStar calls EngageSense, is in the research and development phase. But in its current form, it uses webcams to shoot students’ faces and computer vision algorithms to analyze their gaze (are their eyes darting around or watching the teacher?) and expression (smiling? frowning? confused?). That, coupled with audio, can be transformed into a rough, automated metric of student engagement throughout the day. After a lesson, a teacher could boot up EngageSense and see, with a glance at the dashboard, when students were paying rapt attention, and at what points they became confused or distracted. “By looking at maybe just a couple of high points and a couple of low points, you get enough takeaway that the next day you can try to do more of the good stuff and less of the less-good stuff,” Montgomery says.
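The pipeline Montgomery describes (gaze plus expression plus audio, blended into an engagement metric, then mined for high and low points) can be sketched roughly as follows. This is a minimal illustration, not EngageSense's actual model: the `FrameReading` schema, the signal weights, and the function names are all assumptions made up for the example.

```python
from dataclasses import dataclass

@dataclass
class FrameReading:
    """One per-frame observation (hypothetical schema, not EngageSense's)."""
    timestamp: float          # seconds into the lesson
    gaze_on_teacher: float    # 0.0-1.0: fraction of students watching the teacher
    expression_valence: float # -1.0 (frowning/confused) to 1.0 (smiling)
    audio_level: float        # 0.0-1.0: normalized classroom noise

def engagement_score(frame: FrameReading) -> float:
    """Blend the signals into a rough 0-1 engagement metric.
    The weights here are illustrative guesses."""
    return (0.5 * frame.gaze_on_teacher
            + 0.3 * (frame.expression_valence + 1.0) / 2.0
            + 0.2 * (1.0 - frame.audio_level))

def high_and_low_points(frames, k=2):
    """Return the k highest- and k lowest-engagement moments, so a
    teacher can glance at 'a couple of high points and a couple of
    low points' rather than rewatch the whole lesson."""
    ranked = sorted(frames, key=engagement_score)
    return ranked[-k:][::-1], ranked[:k]
```

In a real system the per-frame numbers would come from computer vision models running on the webcam footage; the point of the sketch is only the last step, reducing a day of readings to a handful of reviewable moments.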

In other words, instead of watching the video and taking notes à la Gates’s model classroom, in the EngageSense classroom, the software takes the notes. Beyond that, the concept is still being refined, as SensorStar Labs looks both for funding and for schools to give EngageSense a real-world trial. “We’re looking for the right type of partnerships,” says Montgomery.

The biggest bump on the road to real-world use is likely data privacy. “This information is something that teachers already have access to in the classroom,” Montgomery says. “It’s just what the teacher can already see with their eyes and what the teacher can already hear with their ears.” But it takes little imagination to picture administrators or law enforcement using those recordings for purposes other than low-stakes feedback.

For the students, Montgomery envisions a system that he compares to checking “yes” when an Android app asks for access to your GPS data. “If you wanted to give the teacher the ability to better teach to your [child], that service would then require you to give some level of permission for that teacher to access your [child’s] data,” Montgomery suggests.

But first, he’ll have to demonstrate that it actually makes for better teaching.

“This is something that has to be validated,” he says. “But I think there’s so much information that could be useful to teachers, it’s inevitable that information is going to come to bear fruit.”

About the author

Stan Alcorn is a print, radio and video journalist, regularly reporting for WNYC and NPR. He grew up in New Mexico.