Students of color are flagged to teachers because Proctorio testing software cannot see them


Proctorio, exam monitoring software designed to prevent students from cheating during tests, relies on open source software with a history of racial bias issues, according to a report by Motherboard. The problem was uncovered by a student who investigated how the software performs facial detection and found that it failed to detect Black faces more than half the time.

Proctorio and other similar programs are designed to keep an eye on students as they take tests. However, many students of color reported having problems getting the software to see their faces, sometimes resorting to extreme measures to get it to recognize them. This can cause real problems for students: Proctorio flags them to instructors if it does not detect their faces.

After hearing about these issues anecdotally, Akash Satheesan decided to examine the facial detection methods used by the software. He found that it looked and behaved identically to the face detection models bundled with OpenCV, an open source computer vision library that can be used to detect faces (and which has had racial bias issues in the past). He then ran tests against OpenCV directly, using a dataset designed to evaluate how well machine vision algorithms handle faces of different ethnicities. According to his second blog post, the results weren't good.
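To make the kind of test Satheesan describes concrete, here is a minimal sketch in Python that runs OpenCV's stock Haar-cascade frontal-face detector over a folder of labeled face images and reports a per-group detection rate. This is not Satheesan's actual test harness, and the faces/<group> folder layout is hypothetical; only the OpenCV detector and API calls are standard.

```python
# Minimal sketch: measure how often OpenCV's pretrained frontal-face
# detector finds a face at all, broken down by demographic group.
# Assumes a hypothetical layout: faces/<group>/<image files>.
import os

import cv2

# OpenCV ships pretrained Haar cascades; this is the default frontal-face model.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detection_rate(image_dir: str) -> float:
    """Fraction of images in image_dir in which at least one face is detected."""
    hits, total = 0, 0
    for name in os.listdir(image_dir):
        img = cv2.imread(os.path.join(image_dir, name))
        if img is None:  # skip files that aren't readable images
            continue
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        hits += 1 if len(faces) > 0 else 0
        total += 1
    return hits / total if total else 0.0

# One detection rate per group subfolder.
rates = {group: detection_rate(os.path.join("faces", group))
         for group in os.listdir("faces")}
for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.1%} of images had a face detected")
```

Detection rates well below 100% for every group, with large gaps between groups, are exactly the failure mode Satheesan reported.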

[Image] The test results of the software Proctorio relies on were not good. (Graphics: Akash Satheesan, ProctorNinja)

Not only did the software fail to detect Black faces more than half the time, it was not particularly good at detecting faces of any ethnicity: the highest success rate was under 75%. For its report, Motherboard contacted a security researcher, who was able to validate both the results and Satheesan's analysis. Proctorio itself confirms that it uses OpenCV on its licensing page, though it doesn't go into detail about how.

In a statement to Motherboard, a spokesperson for Proctorio said Satheesan's tests prove the software only seeks to detect faces, not recognize the identities attached to them. (Face detection merely locates a face in the frame; face recognition matches it to a specific person.) While that may be a small comfort for college students who rightly worry about the privacy implications of monitoring software, it doesn't answer the accusations of racial bias at all.

This isn’t the first time Proctorio has been called out for failing to handle a range of faces: the problems the software caused for students of color were cited by one university as a reason it would not renew its contract with the company. Senator Richard Blumenthal (D-CT) has even challenged the company over bias in monitoring software.

While racial bias in code is nothing new, it is particularly distressing to see it affect students who are just trying to do their schoolwork, especially in a year when distance learning is the only option available to some.


