Considering authentic assessments as a tool to help prevent academic dishonesty.
We are living in a time of unprecedented access to technology, much of which can make upholding academic integrity more challenging than ever. When looking to secure Canvas exams against academic dishonesty, one application faculty can turn to is Respondus, which can “lock down” learners’ browsers and help remotely proctor exams. When using these tools, it is important to consider the implications they have for our courses and our relationships with our students. In this blog, we will explore Respondus proctoring tools and alternative assessment strategies for addressing academic dishonesty.
What is Respondus?
Respondus Lockdown Browser is a tool designed to limit academic misconduct when students take assessments, such as quizzes and exams. When Lockdown Browser is enabled on a device, students are restricted from navigating to webpages, using search engines, or taking screenshots. Students download and install the Respondus software on their devices to access their exams. This software interacts with other systems on the computer in order to operate, and, in a few cases where technical support is needed, it will collect and store private information to assist. These practices are described in Respondus’s privacy policies, which are linked in the Canvas Course Template available to Ball State University community members.
Respondus Monitor is another tool offered to Ball State faculty to limit academic misconduct. This proctoring tool uses the camera on a student’s computer and AI analysis of facial and eye movements to serve as a “live” proctor. The instructor can choose to watch this live during the exam attempt or view a recording afterward. Although the instructor is the only person with access to the recording, this requires students to allow visual access to their testing space, which may be a private space off campus. An instructor also has the option to require a photo of an ID or images of the student’s surroundings. To provide this heightened level of security, Respondus Monitor accesses all data processed on the computer during the testing session and stores that data on an encrypted server for up to five years. While both of these tools offer additional safeguards against academic dishonesty, they do not prevent all potential cheating.
Academic Integrity Through Authentic Assessment
In reviewing online discussions of proctoring tools, a particular thread on Reddit stood out. One student on the thread described the program as “malware.” Another wrote, “I wouldn’t let it go anywhere near my own devices,” adding, “these privacy violations terrify me…these applications are a huge violation of privacy.”
This might seem confusing at first when considering how much the modern student connects through social media. As one community college professor helpfully identifies, “There’s a difference between posting information yourself — often the carefully curated version of a life you want to convey — and having a proctoring service require you to scan your bedroom before a test for cheat sheets or open books” (Mangan, 2021).
When considering the specific workings of and discourse around Respondus, we can be reflective about how we employ these tools in our own classes. In researching this, I became aware of how our decisions about tools like Respondus affect our classroom as an active community. Using tools focused on policing has consequences for how our students feel about their agency in a classroom. It may be beneficial for us to consider how different forms of assessment can close off avenues for academic dishonesty before students even begin an assignment.
Authentic assessment, according to Grant Wiggins, occurs when “we directly examine student performance on worthy intellectual tasks.” These direct examinations involve students producing products that apply their knowledge, and thus can help mitigate our dishonesty concerns. Wiggins contrasts many of his examples of authentic assessments with typical test questions. Typical test questions, such as multiple choice or true/false, require specific correct responses, must be unknown to students in advance, and can be easily scored. For example, this would be giving a student the definition of a type of rhetorical appeal and asking them to identify the correct term.
In contrast, authentic assessments ask for justifications of solutions, must be known to students beforehand, and require more thought to evaluate. An example would be giving students a brief passage of writing or a video of a speech and asking them to identify which rhetorical appeals are present and to provide evidence. This question could be asked in an online exam using the quiz tool in Canvas; however, because students must produce unique examples that require them to think beyond regurgitating notes, it limits the opportunity for dishonesty. A carefully crafted assessment can also disincentivize the act of cheating by giving students the opportunity to produce practical material they can use in the future.
Typical questions are still useful in a course, especially when assessing a grasp of terminology or smaller concepts that build upon one another. Authentic assessment does not have to replace these typical questions but could be used as a way to vary exam items and measure skills in ways that disincentivize academic dishonesty. Following the above examples related to rhetorical appeals, you could ask students a series of typical questions on identifying the definitions of appeals that leads to an authentic assessment asking them to apply those appeals. This way, the typical items help build toward the authentic assessment. You incentivize understanding over memorization, which in turn disincentivizes dishonesty, building a kind of proctor within your students and in how they relate to the exam.
Reflecting on Integrity
Respondus Lockdown Browser and Monitor can mitigate academic dishonesty. There are many situations where they may be the most appropriate tools to use. However, meaningful assessment design should be considered a viable option as well. Different ways to assess learning can disincentivize academic dishonesty and may have the added benefit of making exams feel less like a tension point between us and our students. This tension is ultimately connected to a desire for our students to behave in morally consistent ways. Can promoting this type of learning help our students develop their own ethical standards?
Authentic assessments can be a big lift, considering the energy and care required to evaluate them. There are some courses where implementing them would be more feasible than others. What are some of the other teaching choices we can make that can help us mitigate dishonesty?
References
Mangan, Katherine. “The Surveilled Student.” The Chronicle of Higher Education, February 15, 2021. https://www.chronicle.com/article/the-surveilled-student
No_Info_Retained. “Respondus Lockdown Browser – An Invasion of Privacy???” Reddit, May 3, 2021. https://www.reddit.com/r/ucla/comments/n43pom/respondus_lockdown_browser_violation_of_privacy/
Respondus. “Additional Privacy Information – Respondus Monitor.” Effective November 20. https://web.respondus.com/privacy/privacy-additional-monitor/
Wiggins, Grant. “The Case for Authentic Assessment.” Practical Assessment, Research & Evaluation 2, no. 2 (1990).
Comments:
Hi Caleb,
You know this is a very hot topic for me right now.
This article is important, but the author needs to also make the point that Respondus Monitor does not detect the ways students cheat. It does not detect multiple devices being used. It does not detect eye movements used to read multiple devices. It does not report conversations with others in the room. It records these things, but it does not flag them as a problem. It is quite useless if a student has decided before the exam to cheat, because there are many ways to fool it and the students are aware of all of them. Respondus should not even be a choice in a large class because there is no way for an instructor to watch every minute of 96 videos to catch the 30 seconds where someone reads the answers the student gets from Google via a phone on the desk. All that said, the onus to “ask the right/authentic” questions should not be on the instructor. If a student cannot answer a very basic multiple choice question about what the lung does, that student shouldn’t progress in health care, for example. No one would want that person as a doctor. The vocabulary here, “authentic,” makes it sound like asking someone which two organs are most responsible for acid-base balance in the body is somehow “not authentic, fake, unimportant” — when it is far more important than a rambling answer based on a creative interpretation of what is real and what is false. Much of medicine is choosing the right option among multiple choices — very complicated multiple choice. Yes, it is also an art which requires creative answers — but for the creative answers to have any meaning, you have to know the facts first, lest you get someone who wants to cure covid with the creative answer of bleach.
Hello Jennifer,
Thank you for your comment! I agree the vocabulary of “authentic assessment” is imperfect. I find it helpful to think of it as a way for students to demonstrate understanding of concepts in more applied or practical ways than to think about it as “Authentic vs. Inauthentic Assessments.” In recent years, people have taken to referring to it as “alternative” or “performance” assessment instead, which might feel better.
I take your point about something like knowledge of the workings of the lung being important, and that foundational information shouldn’t be ignored. In a situation like that, it might be beneficial to blend the use of “traditional” assessments, such as multiple choice questions, and build them toward a more applied “alternative” assessment that asks students to diagnose a problem in the lung, using that knowledge to make an inference and an argument. It would be less about asking them to be creative with the facts and more about giving them the opportunity to demonstrate they understand why the facts matter.
I feel we are speaking to similar frustrations with proctoring in that it often feels as if it’s not possible to make a test “cheating proof.” Similarly, I don’t believe it would be possible to construct the “perfect” question which would prevent this behavior. Instead, it’s my hope that we think of our assessment design as an opportunity to potentially mitigate this frustration by both helping students understand why we are asking them questions and creating an environment that dissipates feelings of conflict we might have with them about academic honesty.
I appreciate the discussion and the opportunity to talk through these ideas more!