Students of color are flagged to their teachers because testing software can't see them


Proctorio, a piece of exam monitoring software designed to prevent students from cheating while taking tests, relies on open-source software with a history of racial bias, according to a report by Motherboard. The problem was discovered by a student who examined how the software performs facial detection and found that it fails to detect Black faces more than half the time.

Proctorio and other similar programs are designed to keep an eye on students as they take tests. However, many students of color have reported problems getting the software to see their faces, sometimes having to resort to extreme measures to make sure it recognizes them. This can cause real problems for those students: Proctorio will flag them to their instructors if it doesn't detect a face.

After hearing anecdotal reports about these issues, Akash Satheesan decided to investigate the facial detection methods used by the software. He found that they looked and performed identically to OpenCV, an open-source computer vision library that can be used to detect faces (and which has had issues with racial bias in the past). After learning this, he ran tests with OpenCV and a data set designed to validate how well machine vision algorithms handle different faces. According to his second blog post, the results were not good.
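
To give a sense of what this kind of open-source face detection looks like in practice, here is a minimal sketch using one of OpenCV's bundled Haar cascade classifiers. This is purely illustrative and is not Proctorio's or Satheesan's actual code; the input file name and detection parameters are assumptions for the example.

```python
# Minimal sketch of face *detection* with OpenCV's bundled Haar cascade.
# Illustrative only; file names and thresholds are assumptions.
import cv2

# OpenCV ships pretrained Haar cascade XML files with the library.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("student_webcam_frame.jpg")   # hypothetical input frame
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)   # cascades operate on grayscale

# detectMultiScale returns bounding boxes for any faces it finds;
# an empty result is the "no face detected" failure mode described above.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if len(faces) == 0:
    print("No face detected")
else:
    for (x, y, w, h) in faces:
        print(f"Face found at x={x}, y={y}, w={w}, h={h}")
```

A proctoring tool that relies on a detector like this would flag a student whenever the detector returns no bounding boxes, which is exactly the scenario the testing below measures.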

The results of testing the software that Proctorio relies on were not good. Chart: Akash Satheesan, Proctor Ninja

Not only did the software fail to detect Black faces more than half the time, it wasn't particularly good at detecting faces of any ethnicity: the highest hit rate was less than 75 percent. For its report, Motherboard contacted a security researcher, who was able to validate both Satheesan's results and his analysis. Proctorio also confirms that it uses OpenCV on its licenses page, though it doesn't go into detail about how.

In a statement to Motherboard, a Proctorio spokesperson said that Satheesan's tests prove the software only detects faces, rather than recognizing the identities associated with them. While that may be a (small) consolation for students who are rightly worried about the privacy implications of proctoring software, it does nothing to address the allegations of racial bias.

This isn't the first time Proctorio has been called out for failing to detect diverse faces: the problems it caused for students of color were cited by one university as a reason it would not renew its contract with the company. Senator Richard Blumenthal (D-CT) has also called out the company when discussing bias in proctoring software.

While racial bias in code is nothing new, it is especially troubling to see it affect students who are just trying to do their schoolwork, in a year when remote learning is the only option available to some.