Proctoring vendors made little or no effort to consult students or work through edge cases before selling their technology into hundreds of colleges and universities reeling from COVID-19. Clearly, these solutions have failed to meet faculty and student needs, let alone respect their civil rights. Ultimately, the purpose of proctoring technology is not to stop elite college students from cheating. It is to expand access to education and testing while preserving the value and integrity of formal degrees. Remote proctoring is about giving students “the right to learn,” says SAB member Dylan Singh, an accounting student at the University of Southern California.
The right to learn varies by population and environment. SAB member Clara Brewer, for example, noted that proctoring technology is not equipped for hands-on laboratory tests, the kind she takes as a neurobiology, physiology, and behavior major at the University of California, Davis. It would be backwards to design STEM tests around the limited abilities of proctoring technologies rather than the other way around. Vendors have yet to meet the actual needs of faculty and students like Clara.
Dylan worries about the grade school children he tutors in Los Angeles near USC. Education and proctoring solutions, made for big-budget universities, are too dry and disengaging for young learners. Cheating isn’t the issue here. The challenge is making children feel invested in learning and testing outside a traditional classroom.
Clark Chung, who studies naval architecture and marine engineering at the University of Michigan, emphasized that testing environments change across borders. A South Korean student, connected to some of the world’s best internet infrastructure, is in a different situation than a student in Afghanistan working with a 2G connection and no webcam. Should Afghan students be denied a coding certificate just because they don’t have high-speed internet and a new computer?
Efforts to democratize digital education will falter until students from all places and backgrounds can test on an even playing field. If technologists get proctoring right, students worldwide could learn lucrative trades, earn marketable certifications, and transform their lives. What will that take?
Vendors in this space have caused serious harm to their industry’s reputation and the students who have been forced to go along with their missteps. To earn their trust again, vendors should adopt several practices:
1. Give students control over their personal data. Remote proctoring faces a conundrum. On the one hand, students feel nervous about being video recorded and having that footage stored in the cloud by a vendor. It feels invasive. On the other hand, vendors need that data to train their AI and improve the proctoring experience. AI can learn the difference between a water bottle and a cheat sheet, but not without training examples from real exams.
A compromise may resemble the “right to be forgotten” clause in Europe’s General Data Protection Regulation (GDPR). Tell students exactly what data is collected, where it’s stored, how it’s used, and when (or if) it will be deleted. Let students delete or obtain their data whenever they choose.
2. Make proctoring less contingent on Wi-Fi and computing power. Students with spotty Wi-Fi or slower computers suffer most from proctoring technology. Often, these students live in rural areas or underserved urban neighborhoods and lack access to university-grade computing resources. Their exams are more likely to be disrupted and flagged for cheating.
If that happens in the U.S., how are these solutions supposed to improve access to learning in, say, Central Asia and Sub-Saharan Africa, where cell towers often provide internet connectivity? Proctoring solutions need the ability to run on any web device at speeds as low as 300 kbps, barely enough bandwidth to download two text emails per second.
3. Keep humans in the loop. Live proctoring isn’t scalable. Proctors aren’t paid particularly well, and after several hours they grow fatigued. AI doesn’t suffer those disadvantages, but it doesn’t understand context. There are many reasons why a student might gaze away, glance at the ceiling, or move their cell phone.
Rather than assume students are guilty, vendors should bring these instances to the attention of a human in the loop without disrupting the student’s exam. The human proctor can dismiss innocent behavior and mark instances where an AI has behaved in discriminatory ways. After the exam, the software can ask students to comment on any unusual event while it’s fresh in their memory. The proctoring AI should start a conversation, not a trial.
Sooner or later, the “Ed Scare” will end the way many scares do: through courage, empathy, and innovation. The students have spoken. Now it’s time for vendors to listen.
Julie Allegro Maples W90 WG04 is co-founder and managing director of FYRFLY Venture Partners, a seed stage venture firm investing at the intersection of “data + intelligence.” She is also the founder of the V Foundation Wine Celebration which has raised $118 million for cancer research since inception.
(This contribution originally appeared at Wharton Magazine with the title “The Surveillance State of Education” on May 3, 2021)