
Your Online Exam Experience Matters: How 4 Students Rate the Most Popular Platforms

Since the mass migration to online education in 2020, students are increasingly speaking out about the damaging effects of many popular proctoring platforms. However, concerns about online invigilation have existed for years.

A 2017 study at the University of Minnesota found that 40% of students were comfortable or very comfortable taking exams in a remotely proctored environment with a live human proctor, while 60% were somewhat or not at all comfortable. When asked specifically about the presence of the proctor, only 30% reported being comfortable, and a full 70% said they were somewhat or not at all comfortable. This discomfort can feed test anxiety, which is compounded by the fact that, prior to the exam, about half of students did not feel they had the expertise to set up, use, or navigate the proctoring software. Although most students ultimately felt the test environment was conducive to test-taking, 38% did not.

These results are concerning. And while it is reasonable to believe that the rapidly growing prevalence of remote proctoring has increased familiarity with the technology, online exam experiences have not necessarily improved. In fact, the ramifications of poor experiences are more visible than ever, and the relatively simple concerns about software setup have been replaced by disturbing testimonies and complex ethical questions.

“There are so many systematic barriers.”

In the spring of 2020, Areeb Been Khan was fresh out of law school. In preparation for the bar exam, he was required to take a mock exam using ExamSoft. ExamSoft uses facial recognition algorithms to verify the test-taker’s identity and prevent impersonation—a process that failed for Khan. Although the company claimed the issue was “inadequate lighting,” the problems persisted regardless of lighting conditions, leading Khan to conclude that the issue wasn’t lighting at all; it was the color of his skin. “There are so many systematic barriers preventing people like me from obtaining these degrees—and this is just another example of that.”

Research suggests he’s right; people of color have experienced the effects of poor facial recognition for years. As Klint Finley wrote in Wired:

Facial-recognition software works by training algorithms with thousands, or preferably millions, of examples and then testing the results. Researchers say the problematic facial-recognition systems likely were given too few black faces and can only identify them under ideal lighting conditions.

These technological shortcomings aren’t just frustrating; they can directly impact test-takers’ future opportunities. Perhaps even more disturbing, they perpetuate historical inequalities and discriminatory practices against racialized groups. This has significant implications for both individuals and communities and can deeply erode trust in educational institutions.

“I knew that if I had stayed home, there was no way I would pass.”

For Jazi, a 19-year-old student at the University of Texas at San Antonio, the online exam experience was compromised for different reasons. When Jazi was sent back home to study due to Covid, she had to take care of her younger siblings. She also had to take online exams using Proctorio.

Students using Proctorio must take exams in quiet places; any noise may be flagged as a violation of the rules. But when you take care of kids while completing an exam, noises both on and off screen are difficult to avoid—like when Jazi’s youngest sibling banged doors in the middle of a test.

“I was ignoring him the entire time, just saying, ‘Please, God, don’t let them email me about the sound,’” she said in a recent New York Times article. “You constantly ask, ‘Why is this so hard?’ Because you know it doesn’t have to be that way. But it is.” The situation became so bad that she used all her savings to move back to campus the following semester. “I knew that if I had stayed home, there was no way I would pass.”

Even those in typical student living environments run into issues. For Nkiru Chigbogwu, the presence of roommates talking in another room or knocking on her door intensified the stress of test-taking. “All you can think about after the exam is like, was I flagged, will they think I was cheating?” she says. In the absence of real-time feedback, she has taken to emailing professors after exams to explain any potential trigger, just in case.

The environmental requirements of some online proctoring solutions have an outsized effect on students who care for children or younger siblings, who live with roommates or in large multi-generational households, or who simply lack a private space in which to complete exams. In practice, this tends to affect groups that already face barriers to higher education: women, single parents, people of color, students with precarious housing, and low-income students.

“It feels like an invasion of privacy.”

Most remote proctoring software is notoriously bad at accurately identifying potential violations by students with certain disabilities or medical conditions. This is because the underlying AI algorithms are not trained to distinguish movements caused by physical and behavioral differences from those caused by attempts to cheat on exams.

Many students know this and seek accommodations to protect themselves from inaccurate judgments. However, accommodations can make students feel vulnerable in new ways: simply requesting one may be seen as an attempt to skirt the rules, and the process itself carries potential invasions of privacy.

This was the experience of Sabrina Navarro, a junior at California State University, Fullerton, who has a neurological condition that causes facial tics. Her tics were no different in a remote test environment, but their implications were: suddenly, they became a threat to her academic performance, since proctoring software might flag them as abnormal movements. This is not unusual; many neurodivergent students find that the online exam experience exacerbates the very symptoms that may be perceived as test violations. For Navarro, accommodations seemed like the only way to give herself a fair shot.

To apply for accommodations, students must submit medical records to prove their diagnosis. Revealing such personal information to an educational institution is invasive, and the accommodations themselves can amplify that exposure. “Just the fact that professors might have access to seeing me ticcing, over and over again—it feels like an invasion of privacy with something that all my life, I’ve been pretty good at hiding.”

This leaves students in a difficult position: do you ask for accommodation and feel your privacy stripped away, or do you leave yourself vulnerable to false flagging? These are choices students should not have to make. They’re also choices not all students have. Those who don’t have an extensively documented history of disability or a clear diagnosis may not believe their application for accommodation will be successful.

The online exam experience matters for everyone involved. As more and more students go public with their stories, it is becoming evident that most remote proctoring solutions leave much to be desired.

Not only do the problems caused by inferior products threaten students’ academic success and professional opportunities, but they also damage relationships between students and institutions and can tarnish the reputations of even the finest schools. For some, they even invite cheating. As one student says, “I feel morally conflicted. I am at the point where I’m like, if I’m paying you thousands of dollars for an education and you’re not doing your job, then I don’t have to do mine either.”

Rosalyn’s innovative invigilation solution is designed to promote academic integrity without compromising student dignity or invading their privacy. To do this, we:

  • Understand the student experience. Providing students with the tools to succeed requires insight into their perspectives. At Rosalyn, we created an industry-first Student Advisory Board to promote an ongoing dialogue and provide a space where students can freely express their thoughts and share their experiences. This feedback allows us to continuously refine our invigilation system to meet the evolving needs of test-takers.
  • Make AI smarter. Many of the problems with the most popular online exam platforms arise from the use of limited datasets that skew white, male, able-bodied, and American, perpetuating a host of biases. In contrast, Rosalyn uses a large and ever-expanding proprietary dataset that includes test-takers of all skin colors, ethnicities, abilities, and genders. By training our AI on this diverse dataset, the invigilation process becomes more accurate and more equitable.
  • Prioritize human judgment. Technology opens tremendous opportunities for students and allows education to continue even amid a global pandemic. But technology doesn’t replace human judgment. Rosalyn’s advanced human-in-the-loop system can tell the difference between a crying baby and a prohibited conversation because it requires that all potential violations be reviewed by a human proctor. This means that false positives are reduced, and students are never without recourse.

By building Rosalyn around these principles, our system supports fair, comfortable test-taking experiences that both students and institutions can trust.
