MyOpenMath and proctoring


Amy Hofer

May 12, 2021, 4:36:03 PM
to occdla@lists.chemeketa.edu, Open Textbook Network, SPARC Libraries & OER Forum, CCCOER Advisory
Hi all, this message is cross-posted. I'll share back a roundup of responses.  

An Oregon college is talking about what they can take forward from this period to improve equity and access for students. The math department is considering online exams, so that remote students don’t have to go to a testing center, but they have concerns about online proctoring.

Some of their discussion:

With students working remotely, there's little point in administering assessments that ask students to come up with a single answer; it's simply too easy to cheat, and there seems to be a full ecosystem for MyOpenMath cheating. Instead, teachers are focusing more on assessing students' conceptual understanding of the mathematics, and they've had to do that without being able to gauge students' body language or talk in person. But assessing conceptual understanding isn't an easy transition to make, especially within our MyOpenMath environment.

Does anyone have thoughts or suggestions regarding online proctoring from other schools using MyOpenMath?

Thanks, 
Amy  

--
Amy Hofer (she/her)
Statewide Open Education Program Director

Linn-Benton Community College
6500 SW Pacific Blvd.
Albany, OR 97321

Andrew Park

May 12, 2021, 4:48:19 PM
to Amy Hofer, CCCOER Advisory
This is indeed a challenging question to answer comprehensively; I'll just share what has worked in my case (all these approaches were implemented on MyOpenMath, some pre-COVID and others post-COVID):
  • For questions that are best answered in multiple-choice format (some content is hard to test in ways deeper than multiple choice), a strict time limit (something like no more than 1 minute per question) is a strong disincentive to cheating. I also give my students questions that are similar to homework questions they have seen, so it's reasonable to expect students who study and prepare to answer in 1 minute. These questions are almost always randomized in some way, so students can't simply memorize correct answers (or look them up on Chegg in bulk); a rough sketch of the randomization idea follows this list.
  • For questions that can be asked in essay/free-form format, that is still a good format to use, as trivial attempts at cheating (i.e. copy-and-paste from "online resources" like Chegg) are easily detected. A time limit here (about 10 minutes for each question that takes a paragraph or so to answer) also discourages cheating, since students have a limited amount of time to "seek help." Another useful MyOpenMath feature is the assessment-level "Add Work" option: the questions on the assessment can direct students to focus on answering within the time limit, then organize and attach their work after the time limit expires (either for partial credit or as a requirement for receiving full credit, when the question is manually graded).
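
(Not MyOpenMath's own question-authoring language, just a rough Python sketch of the randomization idea in the first bullet: every student sees the same question structure with different numbers, so answers can't be memorized or shared in bulk. The question type, number ranges, and 1-minute limit below are illustrative assumptions, not anything taken from MyOpenMath itself.)

    import random

    def make_variant(seed):
        """Generate one randomized variant of a 'solve a*x + b = c' item.
        Illustrative only; the parameter ranges are made up."""
        rng = random.Random(seed)      # seed per student, so a regrade reproduces the same variant
        a = rng.randint(2, 9)
        b = rng.randint(1, 20)
        x = rng.randint(-10, 10)       # pick the answer first...
        c = a * x + b                  # ...so the numbers come out clean
        return {
            "prompt": f"Solve for x: {a}x + {b} = {c}",
            "answer": x,
            "time_limit_seconds": 60,  # strict per-question limit, per the bullet above
        }

    # One variant per (hypothetical) student ID: same structure, different numbers.
    for student_id in (101, 102, 103):
        q = make_variant(student_id)
        print(student_id, q["prompt"], "->", q["answer"])
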
I have had some success with 1-on-1 follow-up meetings after written assessments (this is like an oral exam, except I promise my students that I will limit my questions to topics they have already seen in written form), but this is rather time-consuming. Really, the purpose is to discourage more "sophisticated cheaters" who have someone else take the written assessment for them; while MyOpenMath does allow you to use proctoring software, I don't use it myself.

P.S. A lockdown browser (e.g. Safe Exam Browser, https://safeexambrowser.org/download_en.html) has been used with MyOpenMath by other folks before, but these tools are meant more for in-person proctoring on a computer. Maybe there is a way to make a setup like this work if you have a live proctor who monitors students' video feeds. I have to say I work from the assumption that it's difficult to ensure no cheating when students are not present in person, so all the measures I take (time limits and follow-up meetings) are only meant to reduce the "effectiveness" of cheating, in hopes of discouraging dishonest students and encouraging honest students.


Dempsey, Megan

May 21, 2021, 9:43:55 AM
to CCCOER Advisory
I find this to be an interesting discussion as cheating in the online environment is something that our institution - like most others I'm sure - has been grappling with this year. However, I wonder how time limits on questions align with student accommodations for extra time? I have a child who processes slowly (not in college yet, but I'm sure this issue won't go away by the time he gets there!) and even in the best circumstances, he simply takes longer to process information and respond with an answer than other people do. I have also heard (although not read the research) that online proctoring software often discriminates against students with disabilities by interpreting movements associated with their disability as cheating behaviors. 

It's a really difficult conundrum to pitch open resources as promoting equity while also trying to surround them with anti-cheating measures that potentially erase any equity gained. I'm struggling mentally with that balance!

Megan


Andrew Park

May 21, 2021, 11:50:49 AM
to Dempsey, Megan, CCCOER Advisory
In terms of making ADA accommodations technically possible, this is easily done in MyOpenMath: there is a "time limit multiplier" associated with each student in a class, and when a student has an ADA accommodation for additional time, this multiplier can be set to whatever it needs to be (1.5x, 2.0x, etc.).
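
(Just to make the arithmetic concrete, here is a tiny Python sketch of how an extended-time multiplier scales a base time limit. The function and the numbers are mine for illustration; they are not MyOpenMath internals.)

    def adjusted_time_limit(base_minutes, multiplier):
        """Scale a base exam time limit by a per-student accommodation multiplier."""
        return base_minutes * multiplier

    # A 60-minute exam under common extended-time accommodations:
    for m in (1.0, 1.5, 2.0):
        print(f"{m}x multiplier -> {adjusted_time_limit(60, m):.0f} minutes")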

P.S. There is a larger question, of course, of whether this formulaic additional time adequately addresses individual situations. I don't think that's one I am equipped to handle well within the limits of a 100% online environment.

Amy Hofer

May 21, 2021, 12:34:36 PM
to occdla@lists.chemeketa.edu, Open Textbook Network, SPARC Libraries & OER Forum, CCCOER Advisory
Hi all, this question has resulted in a LOT of excellent replies. Thank you for the thoughtful discussion. The roundup is below. Amy

*****

You might check in with local ABS Directors and see if they have any "tips." They have been doing some online assessment to measure skill gains, and they have to use national testing standards.

*****

Our college has been using Honorlock for online proctoring for the last year. We piloted the software with and for Math, and it went really poorly. Our instructors were using a mix of MyOpenMath, native LMS exams, and maybe a few other vendors (Honorlock supports third-party exam hosts). Regardless of where the students took their test, we ran into continual issues with the monitoring AI: whenever students looked down to work on their scratch paper, it would disrupt the test, make them reauthenticate, and generate a bunch of false flags that made the assessment meaningless for instructors. This might be less of a problem with a proctoring service that uses real people to do the monitoring the entire time, but for any system that uses AI it did not work well at all.

For faculty giving traditional exams it’s worked fine. But for math with its external components, online proctoring wasn’t successful or effective.

*****

While I don't want to promote Examity, they have supported us with the use of "unusual" exam types, including computer science programming on a Linux command line, etc. I bet they'd be able to support something like MOM. It's human proctoring and you can provide guidance like "it's math, so they are allowed sheets of paper and to do problems by hand."

I might stir something up here, but I'll go ahead and mention that you can use Zoom to proctor exams. Lots of schools are doing it and there are lots of guides (e.g. https://keepteaching.ucdavis.edu/test/zoom) that walk faculty through the process. The upsides to using Zoom are as follows:

- You already have it
- Instructors can proctor the exam themselves (and experience how tedious and absurd the process is and hopefully consider revisiting assessment altogether.)
- Furloughed college staff can proctor exams (our testing centers have been largely shuttered or doing other work the last year)
- It fits the "security theater" threshold of making it seem like you're at least trying to have some proctoring in place.
- You're not further creating costs/barriers for students by sending money to yet another tech company with some questionable practices (I'd like to see hands raised by people who actually like working with their proctoring vendor)
- Students are likely familiar with Zoom by now.
- Breakout rooms can support multiple exam takers at the same time.

I've been thrilled that our leadership has put a moratorium on proctoring during the pandemic. I hope we largely continue this except for situations where required by external accreditation.

*****

From snippets of conversations I’ve heard on our campus, everyone is struggling to find a better path forward.

We have several sections utilizing MyOpenMath (MOM). Some instructors are choosing to create their own exams and deploy them through our LMS. This allows more control over random questions (a larger quiz bank) but often isn't enough, and instructors want to use Respondus to proctor the quizzes online (via webcam or a "live" Zoom session). We provide Respondus as a resource to all instructors, but do our best to discourage general use. It's 100% not a magic-bullet solution.

I can't speak for the institution, but I see this as an ethical crisis. Online proctoring software/solutions are invasive, technically problematic, and expensive. Simply, they suck and perpetuate high-stakes testing, which has always disproportionately affected those who need academic assistance the most, and we know it (there, I said it!). Outsourcing the proctoring entirely only exacerbates communication and technical issues faced by students.

On the flip side, asking instructors for more of their time to watch hundreds of student presentations, revisit math concepts, explore complex gamification, and deploy hundreds of activities instead of high-stakes exams is a tremendous shift and involves solving a different mountain of issues.

In short: no easy answers here, but I want to see this conversation continue in all corners of our institutions.

*****

I have some thoughts, but I am certain that I have not figured this out completely, so I'm interested in anyone else's ideas.

1)  I try to make at least 50% of my exams consist of problems where students can't just look up the answer (this way, if their only test strategy is to look up answers, the best they can hope for is 50%):
   - I have a lot of problems where I show them a student's solution to a problem and ask them to identify and articulate/explain the mistake that was made
   - Instead of giving them a function to integrate, I ask them to give a definite integral whose value is 4, for example (see the worked examples after this list)
   - Or, I ask them to sketch the graph of a function whose definite integral over a given interval would be 0
   - I ask them to sketch the graph of a function given various limits, and/or information about the derivative and second derivative
  Stuff like that; I have to grade all of these by hand, but they are at least not easy to look up (I don't think).
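
  (To make the integral prompts above concrete, here is the kind of answer a student could give; these particular integrals are my illustration, not examples from the actual exam.)

    % A definite integral whose value is 4:
    \[ \int_{0}^{2} 2x \, dx = \left[ x^{2} \right]_{0}^{2} = 4 - 0 = 4 \]

    % A function whose definite integral over a given interval is 0
    % (an odd function over a symmetric interval works):
    \[ \int_{-1}^{1} x^{3} \, dx = \left[ \frac{x^{4}}{4} \right]_{-1}^{1} = \frac{1}{4} - \frac{1}{4} = 0 \]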

2)  For the other problems that students can look up, I agree with this person that it is usually very easy to tell.  The student's work will be very weird (or completely wrong) and unlike anything from the course material or what we've done in class.  When I see this, I only give the student 1 point (out of n) for a correct answer and I just say something like "I don't understand what you did here; I will give you full credit for this problem if you join me during office hours this week and walk me through it".  I put this right in the feedback box next to each problem.  So far, not a single student has taken me up on this, which pretty much confirms what I suspected.  If they were to meet with me and explain their solution to my satisfaction, I would have no problem restoring their points.

Also, FWIW, I do not use any proctoring software.  I monitor them via Google Meet (or sometimes I just check them in on Google Meet and then leave them be, if it is a group that I trust).  I also require them to scan and email their work to me within 1 hour of completing the exam.  I do not believe I am preventing all cheating, but I am preventing most.  The students who I am not keeping from cheating are at least not getting a very good grade.  I don't really see any point to the lockdown browser, because students can access anything on their phone.  

*****

What we did last fall was to try to integrate LockDown Browser (LDB) with MyOpenMath exams. One big issue here is that LDB can only be used with Canvas quizzes, but if you import MyOpenMath via LTI it can only come in as an assignment. Thus, we used a 'hack' to get the two to work together: the MyOpenMath assessment was added as an assignment, LDB was tied to a quiz, and the quiz opened the MyOpenMath exam through an allowed web URL. Actually the hack was pretty inventive.

Now, when I tested this method, I had no issues, and about 80-85% of my students had no issue. That said, 10-15% did have issues: running both LDB and MyOpenMath took a lot of computing power, which caused problems to load slowly (very slowly), some problems would not submit, and in a lot of these cases the delay eventually got so long that students couldn't get back into the exam. Adding to the frustration, we do not have anyone in the district who is an 'expert' with both MyOpenMath and Canvas, and thus there was no technical support for this hack. Since it was a workaround, the Canvas folks were not sure what to do.

We ditched that approach and currently are using CARES funds to pay for ProctorU for our remote proctoring. We have some faculty who proctor via Zoom, which allows multiple students to share their screens so the instructor can see each student on webcam along with what is on their screen. We are hoping to be able to use ProctorU in the fall, but to prepare I have moved my exams into Canvas so they can be used with LDB without having to go through a hack. The upside here is that now there will be true technical support for students who have issues.

All of this said, none of this eliminates cheating. Just google "how to cheat on lockdown browser" or "ProctorU" and there is plenty of info on how to cheat both. The best we can do with remote testing is make it as hard as possible to cheat, so some of the ideas mentioned by the other teacher (timed exams, uploading work, etc.) are good ones to implement.

I'm not sure if this was much help, but wanted to let you know what we have been through at SCC since we cannot test on campus and our testing center currently doesn't do testing (I suggested changing the name, but got shot down :-p).

Would love to hear what others are doing as this is a real issue going forward. Online enrollment continues to grow and exam integrity should be important to all instructors.

