Comparing E-Proctoring Software to Hydroxychloroquine: An Apt Analogy

November 4, 2020
Image courtesy of patrisyu at FreeDigitalPhotos.net


To help educators and administrators understand why I urge caution, and even skepticism, about e-proctoring software and other surveillance technologies, such as those that lock down students’ Internet browsers, here’s an analogy I have been using that seems to resonate:

In my opinion, e-proctoring software is to higher education what Hydroxychloroquine has been to the COVID-19 virus.

It’s not that e-proctoring software is bad; it’s that it was never designed to be used under the current conditions. Some colleagues would disagree with me and argue that this kind of software is bad in principle. I accept their position. Let’s look at this through the eyes of a scholar trained to reserve judgement on an issue until there is evidence to back it up. If we assume the software was designed for a specific purpose – to invigilate exams taken via a computer – then it fulfills that purpose. So, in that sense, it does what it is supposed to do. However, that is not the whole story.

We can turn to Hydroxychloroquine as an analogy to help us understand why we should be skeptical.

Hydroxychloroquine is an anti-malaria drug, also used to treat arthritis. It was never designed to be used against the SARS-CoV-2 (COVID-19) virus. Hasty attempts to research the coronavirus, including studies on Hydroxychloroquine, have resulted in numerous papers being retracted from scientific journals. People ran to this drug as a possible antidote to the coronavirus, just as schools are running to e-proctoring software as an antidote to exam cheating. Neither e-proctoring software nor Hydroxychloroquine was designed to be used during the current pandemic. People flocked to both as if they were some kind of magic pill that would solve a massively complex problem, without sufficient evidence that either would actually do what they so desperately wanted it to do.

The reality is that there is scant scientific data to show that e-proctoring actually works in the way that people want it to, that is, to provide a way of addressing academic misconduct during the pandemic. By “scientific data” I do not mean sales pitches. I am talking about independent scholarly studies undertaken by qualified academic researchers employed at reputable universities. By “independent scholarly studies” I mean research that has not been funded in any way by the companies that produce the products. That kind of research is terrifyingly lacking.

We need to back up for a minute and look at why we invigilate exams in the first place. To invigilate means “to keep watch over”. Keeping watch over students while they write an exam is part of ensuring that testing conditions are fair and objective.

The point of a test, in scientific terms, is to control all variables except one. In traditional testing, the conditions under which the test is administered are held constant: an exam hall with desks separated, the same lighting and environment for all test-takers, the same length of time permitted to take the test, the same invigilation, and so on. All variables are presumably controlled except one: the student’s knowledge of the subject matter. That’s what’s being tested.

Exams are administered in what could be termed academically sterile environments. In an ideal situation, academic hygiene is the starting point for administering a test. Invigilation is just one aspect of ensuring academic hygiene during testing; it is not the only factor that ensures testing conditions control for all possible variables except a student’s knowledge of the subject matter.

During the pandemic, with the shift to remote learning, we cannot control all the variables. We simply cannot assure an academically hygienic environment for testing. Students may have absolutely no control over who else is present in their living/studying quarters. They may have no control over a family member (including their own children) who might enter a room unannounced during a test. The conditions under which students are being tested during the pandemic are not academically hygienic. And that’s not their fault.

E-proctoring may address one aspect of exam administration: invigilation. It cannot, however, ensure that all variables are controlled.

As an academic integrity scholar, I am distressed by the lack of objective, peer-reviewed data about e-proctoring software. Schools have turned to e-proctoring software as if it were some kind of magic pill that will make academic cheating go away. We have insufficient evidence to substantiate that e-proctoring software, or any technology for that matter, can serve as a substitute for an in-person academically hygienic testing environment.

Schools that were using e-proctoring before the pandemic, such as Thompson Rivers University or Athabasca University in Canada, offered students a choice: take their exams online, at home, using an e-proctoring service, or drive to an in-person exam centre. During the pandemic, that choice has been taken away.

We all want an antidote to academic misconduct during remote learning, but I urge educators and administrators to think like scholars and scientists. In other words, approach this “solution” with caution, and even skepticism. At present, we lack sufficient evidence to make informed decisions. Educators need to be just as skeptical about this technology and how it works under pandemic conditions as physicians and the FDA have been about using Hydroxychloroquine as a treatment for the coronavirus. The idea that Hydroxychloroquine is effective against the coronavirus is a myth. The idea that e-proctoring software is an effective replacement for in-person exams is also a myth, one perpetuated by the companies that sell the product.

Forcing surveillance technology on students against their will during a pandemic is tantamount to forcing an untested treatment on a patient; it is unethical in the extreme.

______

Share or Tweet this: Comparing E-Proctoring Software to Hydroxychloroquine: An Apt Analogy – https://drsaraheaton.wordpress.com/2020/11/04/comparing-e-proctoring-software-to-hydroxychloroquine-an-apt-analogy/

This blog has had over 2 million views thanks to readers like you. If you enjoyed this post, please “like” it or share it on social media. Thanks!

Sarah Elaine Eaton, PhD, is a faculty member in the Werklund School of Education, and the Educational Leader in Residence, Academic Integrity, University of Calgary, Canada. Opinions are my own and do not represent those of the University of Calgary.


Canadian English: Not Just a Hybrid of American and British English

December 13, 2011

A number of years ago I had the pleasure of participating in a professional development workshop in Cuba for English teachers there. Due to the large number of Canadian tourists in Cuba, the teachers were intensely interested in “Canadian” English. They asked, “What is Canadian English?”, “How does it differ from British English? Or American English?” and “Is there really such a thing as ‘Canadian’ words?”

I wrote a paper on the topic of Canadian English for the workshop and I must say, I was surprised at how much I learned about my own language!

The National Adult Literacy Database (NALD) has archived the paper and has made it available for all researchers, teachers and literacy practitioners free of charge. Download your free copy of “Canadian English: Not Just a Hybrid of British and American English”.

________________

Like this post? Share or Tweet it: Canadian English: Not Just a Hybrid of American and British English http://wp.me/pNAh3-15H

Update – January 2018 – This blog has had over 1.8 million views thanks to readers like you. If you enjoyed this post, please “like” it or share it on social media. Thanks!

Sarah Elaine Eaton is a faculty member in the Werklund School of Education, University of Calgary, Canada.


How and why my students wrote their own final exam

December 13, 2010

I teach a first-year university course called “Effective Learning”. This semester, topics included managing exam stress, how to prepare for exams, and strategies to use during a test, such as reading over the exam before you start writing and answering the questions you know first. Most of the assessment I did for this class was strength-based evaluation such as group projects, evaluated presentations and portfolios. We did one test at the end of the semester.

I decided to engage the students in the exam development process. We spent time in class reviewing what types of exam questions were acceptable (e.g. multiple choice, short answer, essay) and what content would be covered. The questions were based on material from the two textbooks, as well as materials from in-class presentations and discussions. All the material covered from the first day of the semester was to be included in the final exam.

Earlier in the semester, students had worked with a partner to deliver a presentation synthesizing two readings each. For the development of the test questions, students worked with the same partner and prepared questions on the readings they had presented some weeks earlier. Students were challenged to come up with at least 5 questions per chapter and to include more than one type of question (multiple choice, short answer, etc.).

Students prepared test questions and handed them in to me. I compiled them into one document, noting which questions related to which chapters in the text or readings from the course pack. I also noted which students had contributed which questions. The questions were distributed to all students for study purposes. The result was a 10-page study guide composed of potential test questions that they themselves had generated.

I let them know that I would be selecting from their contributed test questions and that I would also be adding some questions of my own that would not be shared before the exam.

The process of having students develop test questions proved to be a useful learning exercise for them. They got to experience what it is like to write exam questions and the thought process that goes into it. Knowing that this was not simply an academic exercise, but that some of these questions would actually appear on the final exam, added a much-needed element of authenticity. Students took the exercise seriously when they knew it would affect their peers.

Finally, they reported being more engaged with both the material and the study process when they had the opportunity to contribute questions. Suddenly it wasn’t an exam inflicted upon them so much as a challenge they had co-developed and were ready to take on.

Related post:

Course design: 7 ways I engaged my students in the process http://wp.me/pNAh3-nV

_____________

Share or Tweet this post: How and why my students wrote their own final exam http://wp.me/pNAh3-o2

Update – January 2018 – This blog has had over 1.8 million views thanks to readers like you. If you enjoyed this post, please “like” it or share it on social media. Thanks!

Sarah Elaine Eaton is a faculty member in the Werklund School of Education, University of Calgary, Canada.
