Mastery learning XBlock


Mattijs Joosten

Apr 18, 2017, 11:17:30 AM
to General Open edX discussion
I'm thinking about writing one or more XBlocks to present problems exactly the way Khan Academy does: in a modal, asking students to get a certain number right in a row.
I'd like the XBlock to live at the subsection level instead of the unit level, so I can use existing problem types.

The philosophy behind Khan Academy is called mastery learning, so what I would like to create is a mastery learning XBlock.
Because learners could get questions wrong a lot, the library would need to contain perhaps one or two hundred variations on the same question. Question variations could be generated outside Open edX in OLX and imported, so I'm not very concerned about that.
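For instance, a small script along these lines could stamp out the variants as OLX problem files ready for import (a rough sketch; the template, the name lists, and the file layout are all made up for illustration):

import random
from pathlib import Path
from string import Template

# Illustrative OLX template for a capa numerical-response problem; the
# placeholders are filled in per variant before importing into Studio.
TEMPLATE = Template("""<problem>
  <numericalresponse answer="$answer">
    <p>$firstname $lastname (age $age) pays a monthly premium of $premium euros.
       How much basic health insurance premium is that per year?</p>
    <formulaequationinput/>
  </numericalresponse>
</problem>
""")

FIRSTNAMES = ["Liam", "Emma", "Noah", "Sophie"]
LASTNAMES = ["Jansen", "de Vries", "Bakker", "Visser"]

def write_variants(out_dir="problem", count=150):
    """Write `count` OLX problem files, one variant per file."""
    Path(out_dir).mkdir(exist_ok=True)
    for i in range(count):
        premium = random.randint(90, 160)
        xml = TEMPLATE.substitute(
            firstname=random.choice(FIRSTNAMES),
            lastname=random.choice(LASTNAMES),
            age=random.randint(18, 80),
            premium=premium,
            answer=premium * 12,
        )
        Path(out_dir, f"insurance_{i:03d}.xml").write_text(xml, encoding="utf-8")

if __name__ == "__main__":
    write_variants()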

My questions at this point are:
Is an XBlock at the subsection level a good idea? Could I capture and 'hijack' the grading events of child XBlocks?

There is not much documentation on writing XBlocks that live above the unit level. Maybe writing a 'mastery learning' version of every major problem type would be better, but somehow that doesn't feel right, or very DRY (don't repeat yourself, an important coding principle).
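Roughly what I have in mind, in XBlock terms, is a wrapper block with has_children = True that hosts existing problem blocks. A very rough sketch follows; I'm not certain about the exact child-rendering calls, so treat the runtime methods and imports here as assumptions on my part:

from xblock.core import XBlock
from xblock.fields import Integer, Scope
from xblock.fragment import Fragment  # may be web_fragments.fragment on newer releases

class MasteryWrapperXBlock(XBlock):
    """Hypothetical wrapper block that hosts existing problem blocks as children."""

    has_children = True

    required_streak = Integer(default=5, scope=Scope.settings,
                              help="Correct answers in a row needed for mastery")
    current_streak = Integer(default=0, scope=Scope.user_state)

    def student_view(self, context=None):
        # Show one child problem at a time; which child to show would be
        # driven by the streak state (omitted here).
        frag = Fragment()
        if self.children:
            child = self.runtime.get_block(self.children[0])
            child_frag = self.runtime.render_child(child, 'student_view', context)
            frag.add_frag_resources(child_frag)
            frag.add_content(child_frag.content)
        # Reacting when a child problem is graded is the part I don't know
        # how to do yet; that is the question above.
        return frag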

Thanks for your input.




Piotr Mitros

Apr 18, 2017, 2:22:46 PM
to General Open edX discussion
Good idea for a block. Not the best name. I've had mastery learning in Open edX from day 0 in two forms:
  • Open-ended authentic assessments with immediate feedback. 
  • Parameterized problems, where students attempt new variants of a problem until they get it right. 
To get into the nitty gritty:
  • You should be able to capture grading events in JavaScript from child XBlocks. The way this is done is a little crude right now, but rewriting client-side event handling with proper JS event listeners has been on our to-do list for a while (if no one has done it yet).
  • Looking at randomized problem banks may be helpful.
The major use for this tool would probably be MCQs and simple assessments, where the existing edX models don't work too well. What I've seen there is that the required run of correct answers needs to grow with the number of attempts; with a true/false question, a student guessing at random won't need many attempts to hit three correct answers in a row.
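As a back-of-the-envelope check (the numbers here are just arithmetic, not data from any course), both the chance of a guessed streak and the expected number of guesses are easy to compute:

from functools import lru_cache

def prob_streak(attempts, run, p):
    """Probability of hitting a run of `run` correct answers somewhere within
    `attempts` independent attempts, each correct with probability `p`."""
    @lru_cache(maxsize=None)
    def never(remaining, current):
        # Probability of never reaching the run, given the current run length.
        if current >= run:
            return 0.0
        if remaining == 0:
            return 1.0
        return p * never(remaining - 1, current + 1) + (1 - p) * never(remaining - 1, 0)
    return 1.0 - never(attempts, 0)

def expected_attempts(run, p):
    """Expected attempts until the first run of `run` successes (standard result)."""
    return sum(p ** -i for i in range(1, run + 1))

# True/false guessing (p = 0.5): a streak of 3
print(prob_streak(8, 3, 0.5))      # ~0.42 within eight attempts
print(expected_attempts(3, 0.5))   # 14 attempts on average
# A streak of 6 with a 1-in-10 guess rate (e.g. numerical input)
print(expected_attempts(6, 0.1))   # over a million attempts on average

So a streak of three is close to a coin flip for a true/false guesser, while five or six in a row on a numerical input is effectively unguessable.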

Piotr

Mattijs Joosten

Apr 19, 2017, 3:47:58 AM
to General Open edX discussion
Thank you very much for your post, Piotr. I really appreciate it.

My idea for a block actually started with parameterized problems, with variables substituted by values from a list.

For example:
"$firstname $lastname (age $age) has Dutch basic health insurance and no supplementary insurance."

to generate:
"Liam Jansen (age 62) has Dutch basic health insurance and no supplementary insurance."

I figured I'd generate the problems outside of Open edX and then import them.
But you've had parameterized problems for years. Is there a way to use lists, like for $firstname and $lastname in the example?

About the block:
Instead of generating new problems until students get one right, I'd like to keep the same problem in front of the student until they get that right.
And then generate (or select) a new problem until they get X right in a row.
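In rough Python, the flow I have in mind would look something like this (just a toy model of the state, with nothing platform-specific; the reset-on-wrong rule is my reading of "X right in a row"):

import random

class StreakSession:
    """Toy model of the 'X correct in a row' flow, independent of Open edX."""

    def __init__(self, problem_ids, required_streak=5):
        self.pool = list(problem_ids)
        self.required_streak = required_streak
        self.streak = 0
        self.current = random.choice(self.pool)

    @property
    def mastered(self):
        return self.streak >= self.required_streak

    def record_attempt(self, correct):
        """Update the state after a graded attempt on the current problem."""
        if correct:
            self.streak += 1
            if not self.mastered:
                # Move on to a fresh problem for the next step of the streak.
                self.current = random.choice(self.pool)
        else:
            # A wrong answer breaks the streak, and the learner keeps the
            # same problem in front of them until they get it right.
            self.streak = 0
        return self.current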

I think a good place to start for me would be numerical input questions, followed by checkboxes and multiple choice.
With numerical input a student might guess one right, but the odds of guessing five or six right in a row would be low.

I'm not really comfortable yet rewriting client-side event handling. I love Open edX, but I'm relatively new to coding.
Where I could start is by taking "Numerical Input with Hints and Feedback" and adding the "Get X right in a row" functionality.
Then I could apply what I learned to other question types or to an overarching block that handles its children's events.

I'd love to use parameters and lists from the start to generate questions on the fly.
This avoids filling a question bank with a hundred or so problems while most students might only use ten.

The name could be more descriptive, like "X correct in a row", just "correct in a row" or something containing "serial" or "streak".

What do you think?

Piotr Mitros

Apr 21, 2017, 11:39:49 AM
to General Open edX discussion
Virtually all problems in 6.002x were parameterized this way. The syntax is based on LON-CAPA, but with Python scripts instead of Perl scripts (and LON-CAPA documents this nicely). Jerry Sussman (who did the plurality of the work on the course, including writing all of these problems) released a copy of the course under a free and open license a couple of years ago, so it's a good place to look for examples:


When we were first developing the course, the values changed each time you submitted a problem. While this works well for simple one-concept Khan-style questions, it was a disaster for problems with a lot of algebra. One of the first platform enhancements was to allow randomization to be set per-student, so each student saw different values (to prevent cheating) but a given student's values stayed constant. That had issues for student collaboration, so in the end we set the course to use the same random number seed for all students and all problems.
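To answer the list question concretely: the script block in a capa problem is plain Python, so picking from lists works the same way as picking numbers. A sketch of what the body of such a <script type="loncapa/python"> element might look like (the names and values are illustrative, not taken from 6.002x):

# Body of a <script type="loncapa/python"> element in the problem OLX.
# Variables defined here can be referenced as $firstname, $age, $answer, etc.
# in the surrounding problem text; 'random' is seeded according to the
# problem's randomization setting.
import random

firstname = random.choice(["Liam", "Emma", "Noah", "Sophie"])
lastname = random.choice(["Jansen", "de Vries", "Bakker", "Visser"])
age = random.randint(18, 80)

monthly_premium = random.randint(90, 160)
answer = monthly_premium * 12   # e.g. checked by <numericalresponse answer="$answer">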

The settings are documented here:

If you just want to use this type of problem, an easy platform enhancement would be to extend this setting with an additional option that looks for a run, or better yet, for a good estimate of mastery (allowing for both occasional errors and correct guesses).
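For the "estimate of mastery" option, one standard choice (not the only one, and not something the platform does today) is Bayesian Knowledge Tracing, which models slips and guesses explicitly. The update step is only a few lines; the parameter values below are illustrative:

def bkt_update(p_know, correct, slip=0.1, guess=0.2, learn=0.15):
    """One Bayesian Knowledge Tracing step: update the probability that the
    learner has mastered the skill, given one graded attempt.

    slip:  probability of answering wrong despite knowing the skill
    guess: probability of answering right without knowing it
    learn: probability of acquiring the skill on this attempt
    """
    if correct:
        cond = p_know * (1 - slip) / (p_know * (1 - slip) + (1 - p_know) * guess)
    else:
        cond = p_know * slip / (p_know * slip + (1 - p_know) * (1 - guess))
    # Chance of learning the skill during or right after the attempt.
    return cond + (1 - cond) * learn

# Declare mastery once the estimate crosses a threshold, e.g. 0.95.
p = 0.2
for outcome in [True, True, False, True, True, True]:
    p = bkt_update(p, outcome)
print(p)

Compared with a fixed run length, an estimate like this degrades gracefully when a learner slips once near the end of a long streak.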

Piotr

Mattijs Joosten

Apr 23, 2017, 2:09:38 AM
to General Open edX discussion
Thank you very much for your time and your suggestions, Dr. Mitros. I'm a big fan of Open edX, am learning more about it every day, and hope to increase its popularity in the Netherlands. You've visited the Netherlands; do you know any people or groups working with Open edX here?

Piotr Mitros

Apr 23, 2017, 7:28:37 AM
to General Open edX discussion
Delft is one of my favorite partners. They have a commitment to free and open licensing of educational content, and this commitment is very important: I originally created the MITx/Open edX platform because I believed education should continue to be a public good (as it has been in the past), run in students' interest, and with high levels of transparency. I was concerned about the way digital technology pushes education to become proprietary. Delft is one of the few organizations that continue to use Creative Commons licenses, despite strong structural pressure to do otherwise.

Piotr

Mattijs Joosten

Aug 18, 2017, 7:11:43 AM
to General Open edX discussion
Update: Régis Behmo and I have posted a proposal for an open source 'streak' XBlock on edxchange.

https://edxchange.opencraft.com/t/building-the-n-correct-answers-in-a-row-aka-streak-xblock/156

We could really use one or more partners to share the development costs. If you're reading this, would like to see this kind of mastery learning in Open edX, and want a say in its exact implementation, let us know on edxchange!