Big Proctor: Online Proctoring Problems and How FERPA can Promote Student Data Due Process

3 Notre Dame J. of Emerging Technology 74 (2023).

When the pandemic forced schools to shift to remote education, school administrators worried that unsupervised exams would lead to widespread cheating. Many turned to online proctoring technologies that use facial recognition, algorithmic profiling, and invasive surveillance to detect and deter academic misconduct. It was an “epic fail.”

Intrusive and unproven remote proctoring systems turned out to be inaccurate, unfair—and often ineffectual. The software did not account for foreseeable student diversity, leading to misidentification and false flags that disadvantaged test-takers from marginalized communities. Educators implemented proctoring software without sufficient transparency, training, and oversight. As a result, students suffered privacy, academic, reputational, pedagogical, and psychological harms.

Online proctoring problems prompted significant public backlash but no systemic reform. Students have little recourse under existing legal frameworks, including current biometric privacy, consumer protection, and antidiscrimination laws. Student privacy laws like the Family Educational Rights and Privacy Act (FERPA) also offer minimal protection against schools’ use of education technology. However, FERPA’s overlooked rights of review, explanation, and contestation offer a stop-gap solution to promote algorithmic accountability and due process.

