Hi Tim,

Thanks for these comments - some of the other aspects look quite useful. The part about previous years is something that we should talk to Ed Tech people about. I have cc'ed the list so that others can see your email and can comment.

Best,
david

On 10/01/2023 16:24, Evans, Tim S wrote:
Hi David,
After our discussion in the meeting I looked up Gradescope and found the following UK-based discussion of this software:
https://www.elearning.fse.manchester.ac.uk/fseta/gradescope/
It has some advantages/disadvantages bullet points that might be useful.
Gradescope is much more than just MOSS, so we might not want the other stuff. For instance, it contains AI “marking” of assignments, and it generally reproduces all the structure of Blackboard/Turnitin reports but for code. MOSS is just the Turnitin replacement, though it is more limited than Turnitin. I'm not sure we need all that other guff or whether it is helpful; perhaps it could be if we learn to use it.
My biggest concern is that MOSS only checks code against what you submit to it. I submit all the code (now 9 years' worth) to MOSS and that works for me. I am not sure it is possible to add this old code to Gradescope when checking an assignment. Gradescope does NOT appear to compare code outside the one assignment, so previous years' work is not checked, nor does Gradescope check against GitHub or anywhere else on the internet. It is not simply Turnitin for code in that sense: I don't need to submit past reports for the same project, as Turnitin finds matches for those (only trivial text matches so far).

I find the most serious problems are people using bits of code that were placed on GitHub in previous years (I found 20 examples then stopped looking), then changing the variable names and sometimes a bit more. I think Gradescope is designed for simple code examples and for catching in-class copying. It is not designed for bigger projects (3rd-year computational physics) where chunks of standard code reappear legitimately – you can only implement Maxwell's equations and Runge-Kutta in so many ways – but where code is most likely to be copied from previous years.

I also find the code in my 3rd-year course is too long/hard to check by eye; I have to use grep to check for identical strings to be sure. [Labels and comments are the best giveaways, as students forget to change those when changing variable names. The best I found was a misspelt word in a comment: they had changed the part of the word that had been used in an old variable name!]
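For what it is worth, here is a rough sketch in Python of the kind of comment/string check I end up doing by hand, just to illustrate the idea. It assumes one folder per student under a "submissions" directory and Python source files; the names are made up and this is not what MOSS or Gradescope actually do internally.

from pathlib import Path
from collections import defaultdict

def comment_lines(path):
    """Yield stripped comment lines from a Python source file."""
    for line in path.read_text(errors="ignore").splitlines():
        stripped = line.strip()
        if stripped.startswith("#"):
            yield stripped

def shared_comments(submissions_dir):
    """Map each comment line to the set of students whose code contains it."""
    seen = defaultdict(set)
    for student_dir in Path(submissions_dir).iterdir():
        if not student_dir.is_dir():
            continue
        for source in student_dir.rglob("*.py"):
            for comment in comment_lines(source):
                seen[comment].add(student_dir.name)
    # Keep only comments that turn up in more than one student's submission.
    return {c: names for c, names in seen.items() if len(names) > 1}

if __name__ == "__main__":
    for comment, students in shared_comments("submissions").items():
        print(f"{comment!r} appears in: {', '.join(sorted(students))}")

Anything this flags still needs checking by eye, of course, since shared comments can come from legitimately shared lab templates.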
So I am not convinced Gradescope does the job I want. However, it seems to be the only option other than MOSS. Gradescope might be a partial solution for others, though, e.g. covering 1st-year lab exercises. I'd like to test the AI marking out; that could be interesting.
Thanks for all your work with the committee
Tim
Dr Tim Evans,
Senior Lecturer, Theoretical Physics group, Centre for Complexity Science, Room H609, Physics Dept., Imperial College London, SW7 2AZ
020 7594 7837 http://imperial.ac.uk/people/t.evans
@netplexity http://netplexity.org/