Hi all, this question has resulted in a LOT of excellent replies. Thank you for the thoughtful discussion. The roundup is below. Amy
*****
You might check in with your local ABS Directors to see if they have any “tips”. They have been doing some online assessment to measure skills gains, and they have to use national testing standards.
*****
Our college has been using Honorlock for online proctoring for the last year. We piloted the software with and for Math, and it went really poorly. Our instructors were using a mix of MyOpenMath, native LMS exams, and a few other vendors (Honorlock supports third-party exam hosts). Regardless of where the students took their test, we ran into continual issues with the monitoring AI: whenever students looked down to work on their scratch paper, it would disrupt the test, force them to reauthenticate, and generate a pile of false flags that made the reports meaningless for instructors. This might be less of a problem with a proctoring service that uses real people to monitor the entire session, but any system that relies on AI did not work well at all.
For faculty giving traditional exams it has worked fine. But for math, with its external components, online proctoring was neither successful nor effective.
*****
While I don't want to promote Examity, they have supported us with the use of "unusual" exam types, including computer science programming on a Linux command line, etc. I bet they'd be able to support something like MOM. It's human proctoring and you can provide guidance like "it's math, so they are allowed sheets of paper and to do problems by hand."
I might stir something up here, but I'll go ahead and mention that you can use Zoom to proctor exams. Lots of schools are doing it, and there are lots of guides (e.g., https://keepteaching.ucdavis.edu/test/zoom) that walk faculty through the process. The upsides to using Zoom are as follows:
- You already have it
- Instructors can proctor the exam themselves (and experience how tedious and absurd the process is and hopefully consider revisiting assessment altogether.)
- Furloughed college staff can proctor exams (our testing centers have been largely shuttered or doing other work the last year)
- It fits the "security theater" threshold of making it seem like you're at least trying to have some proctoring in place.
- You're not creating further costs/barriers for students by sending money to yet another tech company with questionable practices (I'd like to see a show of hands from people who actually like working with their proctoring vendor)
- Students are likely familiar with Zoom by now.
- Breakout rooms can support multiple exam takers at the same time.
I've been thrilled that our leadership has put a moratorium on proctoring during the pandemic. I hope we largely continue this except for situations where required by external accreditation.
*****
From snippets of conversations I’ve heard on our campus, everyone is struggling to find a better path forward.
We have several sections using MyOpenMath (MOM). Some instructors are choosing to create their own exams and deploy them through our LMS. This allows more control over randomized questions (a larger quiz bank), but often that isn't enough, and instructors want to use Respondus to proctor the quizzes online (via webcam or a "live" Zoom session). We provide Respondus as a resource to all instructors but do our best to discourage general use. It's 100% not a magic-bullet solution.
I can't speak for the institution, but I see this as an ethical crisis. Online proctoring software/solutions are invasive, technically problematic, and expensive. Simply put, they suck, and they perpetuate high-stakes testing, which has always disproportionately affected those who need academic assistance the most — and we know it (There, I said it!). Outsourcing the proctoring entirely only exacerbates the communication and technical issues students face.
On the flip side, asking instructors for more of their time to watch hundreds of student presentations, revise how math concepts are assessed, explore complex gamification, and deploy hundreds of activities instead of high-stakes exams is a tremendous shift and involves solving a different mountain of issues.
In short: no easy answers here, but I want to see this conversation continue in all corners of our institutions.
*****
I have some thoughts, but I am certain that I have not figured this out completely, so I'm interested in anyone else's ideas.
1) I try to make at least 50% of my exams consist of problems where students can't just look up the answer (that way, if their only test strategy is to look up answers, the best they can hope for is 50%):
- I have a lot of problems where I show them a student's solution for a problem and ask them to identify and articulate/explain the mistake that was made
- Instead of giving them a function to integrate, I ask them to give a definite integral whose value is 4 (for example)
- Or, I ask them to sketch the graph of a function whose definite integral over a given interval would be 0
- I ask them to sketch the graph of a function given various limits, and/or information about the derivative and second derivative
Stuff like that; I have to grade all of these by hand, but these are at least not easy to look up (I don't think).
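As a quick illustration of the "reverse" prompts above (my own example, not one of the original poster's): asked for a definite integral whose value is 4, a student might write

```latex
\int_{0}^{2} 2x \, dx = \Bigl[ x^2 \Bigr]_{0}^{2} = 4 - 0 = 4
```

and for the "definite integral equal to 0" sketch, any odd function over a symmetric interval works, e.g. f(x) = x on [-1, 1], since the signed areas cancel. The point is that there are many correct answers, so a lookup site is no help.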
2) For the other problems that students can look up, I agree with this person that it is usually very easy to tell. The student's work will be very weird (or completely wrong) and unlike anything from the course material or what we've done in class. When I see this, I only give the student 1 point (out of n) for a correct answer and I just say something like "I don't understand what you did here; I will give you full credit for this problem if you join me during office hours this week and walk me through it". I put this right in the feedback box next to each problem. So far, not a single student has taken me up on this, which pretty much confirms what I suspected. If they were to meet with me and explain their solution to my satisfaction, I would have no problem restoring their points.
Also, FWIW, I do not use any proctoring software. I monitor them via Google Meet (or sometimes I just check them in on Google Meet and then leave them be, if it is a group that I trust). I also require them to scan and email their work to me within 1 hour of completing the exam. I do not believe I am preventing all cheating, but I am preventing most. The students who I am not keeping from cheating are at least not getting a very good grade. I don't really see any point to the lockdown browser, because students can access anything on their phone.
*****
What we did last fall was to try to integrate LockDown Browser (LDB) with MyOpenMath exams. One big issue here is that LDB can only be used with Canvas quizzes, but if you import MyOpenMath via LTI it can only come in as assignments. So we used a 'hack' to get the two to work together: the MyOpenMath assessment was added as an assignment, LDB was tied to a quiz, and the quiz opened the MyOpenMath exam through an allowed web URL. Actually, the hack was pretty inventive.
Now, when I tested this method I had no issues, and about 80-85% of my students had no issues either. That said, 10-15% did run into trouble: running both LDB and MyOpenMath took a lot of computing power, which caused problems to load slowly (very slowly), some problems would not submit, and in a lot of these cases the delays eventually got so long that students couldn't get back into the exam. Adding to the frustration, no one in the district is an 'expert' in both MyOpenMath and Canvas, so there was no technical support for this hack. Since it was a workaround, the Canvas folks were not sure what to do either.
We ditched that approach and are currently using CARES funds to pay for ProctorU for our remote proctoring. We have some faculty who proctor via Zoom, which lets multiple students share their screens so the instructor can see each student on the webcam along with what is on their screen. We are hoping to use ProctorU again in the fall, but to prepare I have moved my exams into Canvas so they can be used with LDB without going through a hack. The upside is that now there will be real technical support for students who have issues.
All of this said, none of this eliminates cheating. Just google "how to cheat on lockdown browser" or "how to cheat on ProctorU" and there is plenty of info on beating both. The best we can do with remote testing is make cheating as hard as possible, so some of the ideas mentioned by the other teacher are good ones to implement (timed exams, uploading work, etc.).
I'm not sure if this was much help, but wanted to let you know what we have been through at SCC since we cannot test on campus and our testing center currently doesn't do testing (I suggested changing the name, but got shot down :-p).
Would love to hear what others are doing as this is a real issue going forward. Online enrollment continues to grow and exam integrity should be important to all instructors.
*****
- For questions that are best answered in multiple-choice format (some content is too difficult to test in ways deeper than multiple-choice), a strict time limit (something like no more than 1 minute per question) is a strong disincentive to cheating. I also give my students questions that are similar to homework questions they have seen (so it's reasonable to expect students to be able to answer in 1 minute, for those who study and prepare). These questions are almost always randomized in some way, so students can't simply memorize correct answers (or look them up on Chegg in bulk).
- For questions that can be asked in essay/free-form format, that is still a good format to use, since trivial attempts at cheating (i.e., copy-and-paste from "online resources" like Chegg) are easily detected. A time limit here (about 10 minutes per question that takes a paragraph or so to answer) also discourages cheating, as students have a limited amount of time to "seek help." Another useful MyOpenMath feature is the assessment-level "Add Work" option: the questions can direct students to focus on answering within the time limit, and students then organize and attach their work after the time limit expires (either for partial credit or as a requirement for receiving full credit, when the question is manually graded).