nbautoeval - a tool for designing auto-evaluated exercises for learning Python


Thierry Parmentelat

Jan 18, 2017, 5:21:18 AM
to Teaching with Jupyter Notebooks
Hi folks

I am new to this list, but it sounds like the right place to introduce a - very lightweight - tool for auto-evaluated exercises in Python programming

This is a by-product of a [MOOC on Python (in French)](https://www.fun-mooc.fr/courses/inria/41001S03/session03/about) that we have already run three times on the French OpenEdX-based MOOC hosting platform https://www.fun-mooc.fr/

Anyway, the tool itself lets you define Python objects that describe an exercise for students

Each exercise is about writing a Python function, or class, or regexp, that must fulfil some requirements; it essentially contains a teacher implementation and some test inputs.

An exercise can then be embedded in a notebook, where of course the requirements are explained in plain text.
The framework can then automatically compare the outputs of the teacher's and the student's versions,
so that each student can check her own implementation through visual green/red feedback
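
To make this concrete, here is a minimal, self-contained sketch of the idea; the class and names below are hypothetical, chosen for illustration only, and are not nbautoeval's actual API:

```python
# Hypothetical sketch of the core mechanism -- NOT nbautoeval's actual API.
# An exercise bundles a reference (teacher) implementation with test inputs,
# and checks a student's submission by comparing outputs on each input.

class ExerciseSketch:
    def __init__(self, reference, inputs):
        self.reference = reference   # teacher's implementation
        self.inputs = inputs         # list of argument tuples to test with

    def correction(self, student_function):
        """Run both implementations on every input and report pass/fail."""
        all_ok = True
        for args in self.inputs:
            expected = self.reference(*args)
            try:
                obtained = student_function(*args)
            except Exception as exc:    # a crashing attempt counts as a failure
                obtained = exc
            ok = obtained == expected
            all_ok = all_ok and ok
            # nbautoeval renders this as a green/red HTML table in the notebook;
            # plain text stands in for that here
            print("{} {!r} -> {!r} (expected {!r})".format(
                "OK" if ok else "KO", args, obtained, expected))
        return all_ok

# teacher side: a reference implementation plus test inputs
def reference_double(x):
    return 2 * x

exo_double = ExerciseSketch(reference_double, [(0,), (3,), (-5,)])

# student side, typically in a notebook cell: submit an attempt for feedback
def double(x):
    return x + x

exo_double.correction(double)
```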

In case there is interest, this is available on GitHub here

with a Binder link here to see a few real examples


---
I am certainly not suggesting that the code is ready to be reused as is - my primary target while writing it was the MOOC contents

However, I'd be eager to know if this kind of thing is of interest to others, and also whether there are other, better-structured frameworks that already support similar features, or where similar features could easily be added.

-- Thierry

Jessica B. Hamrick

Jan 18, 2017, 1:16:38 PM
to Thierry Parmentelat, Teaching with Jupyter Notebooks
Hi Thierry,

Thanks for sending this; it looks very useful!

I'm curious -- have you had any issues with students introspecting into the exercise module to find the answer code and then just copying it?
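
For context: since the answer code ships as plain Python, the standard library makes recovering it trivial, e.g. assuming a hypothetical exercise object that keeps its reference implementation in a `solution` attribute:

```python
import inspect

# 'exo' and its 'solution' attribute are hypothetical names used for
# illustration; any function object importable from the exercise module
# can be inspected this way.
print(inspect.getsource(exo.solution))
```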

Cheers,
Jess



Thierry Parmentelat

Jan 18, 2017, 5:47:02 PM
to Jessica B. Hamrick, Teaching with Jupyter Notebooks
Hi Jessica

Thanks for your interest


You’re making a very good point :)

Short answer is: not that I know of, and I do not care that much in my context.
As I may say somewhere in the README, so far this has been deployed in a context where auto-evaluated also means it's for the student's sake only.
I do collect statistics, but have not been able to connect this to the hosting MOOC platform per se.
All this amounts to saying that students would only be cheating themselves :)

But yes, you're right: if grades were given based on this, it would be a game changer.
To be honest I haven't given that angle much thought yet, but I don't see any easy way to implement tight protection.

— Thierry