
Surveillance technology, designed to ensure students don’t cheat at remote tests, has become a source of horror and frustration in an already unbelievably difficult school year.

Remote test-taking software is an inaccurate, privacy-invading mess

[Source images: Radila Radilova/iStock; oatintro/iStock]

By Albert Fox Cahn and Grace Deng

“Don’t make any sudden movements.” “Look directly ahead.” “Don’t speak.”

These commands sound like what you’d hear during an arrest, not during your finals, but they are actually just a portion of the rules that remote proctors police in the age of online testing. During the COVID-19 pandemic, automated and human overseers are peering into students’ bedrooms through an increasingly invasive array of academic spy tools.

2020 was already an awful time to try to learn, let alone prove yourself on high-stakes exams. The unequal playing field that defined academic testing prior to the pandemic has become even more distorted by the Silicon Valley gold rush to sell as much surveillance equipment to schools as possible—all in the name of thwarting cheating. According to our recent report, Snooping Where We Sleep, a majority of colleges and universities have started using this tech since the start of the pandemic. And now, social media is filled with horror stories of test takers tearfully describing being wrongfully flagged as cheaters or suffering technical difficulties mid-test.

But there isn’t conclusive evidence the technology is needed. Maybe students cheat more during remote exams, maybe they cheat less. The research on this topic from before the pandemic began is actually quite mixed, though some recent studies have pointed to evidence of cheating.

What we do know for certain is that these invasive surveillance tools only make it more likely for students of color and low-income students to fail. Students who were already struggling to find reliable Wi-Fi, studying in fast food parking lots and street corners, can fail a test simply because they don’t have a bedroom of their own and have to take tests with family in the room.

This is just one of the reasons that more students are flunking tests this year. Fairfax County, one of the nation’s largest school districts, saw an 83% jump in failing grades this semester. And the results may be even more devastating for younger children, for whom one bad semester can derail an entire academic career.

The technology that’s powering this wave of academic surveillance is actually fairly primitive. Firms like ProctorU and ExamSoft use facial recognition to identify who’s logging in for a test. Sadly, this is the same facial recognition technology that has been shown to be more error-prone for women, non-binary individuals, and students of color. Even worse, systems like ExamSoft don’t just require you to pass facial recognition tests once or twice. Instead, they analyze your face throughout the exam.

Artificially intelligent movement tracking sounds high-tech, but the reality can be anything but. These simplistic algorithms flag students with disabilities as cheaters because of facial tics, eye movements, and other involuntary acts.

Even more subtle differences in posture, typing, and movement could be enough to flag a student as suspicious. For these algorithms, mere difference is suspect, and disability is a crime.

These machines can be invasive in the best of times, but if anything goes wrong, they can be utterly unforgiving. Students who menstruate reported having to choose whether to risk a failing grade or use sanitary products when their periods unexpectedly began mid-test.


Even before this unrelenting year, 25% of students suffered from severe test anxiety. Layer on unprecedented political tensions, an economic downturn, and a once-in-a-century pandemic, and students are at their breaking points. And frequently, online proctoring is breaking them.

Taking a step back, it is truly perplexing what the schooling system has enacted. The point of testing is to ascertain academic achievement, but no one can sincerely claim that is what schools and universities are doing. Software firms erect a façade of fairness, but the real results are not a measure of achievement. They’re merely a biased and arbitrary sorting system.

We shouldn’t force students to use this badly coded software to maintain a testing regime that was already broken. If anything, this should be a moment to abandon the misguided mindset that test scores were ever a true measure of achievement, let alone potential. This is a time to nurture students with systems centered on compassion for what they’re experiencing, not with technology that makes everything worse.

This story has been updated with more information about cheating during remote learning.


Cahn (@FoxCahn) is the founder and executive director of the Surveillance Technology Oversight Project (S.T.O.P.) at the Urban Justice Center, a New York-based civil rights and privacy group and a fellow at the Engelberg Center for Innovation Law & Policy at N.Y.U. School of Law.

Deng (@gracesdeng) is a communications intern at the Surveillance Technology Oversight Project.
