Dear David et al,

Thank you for arranging the meeting on Tuesday. Here are my notes and actions from the meeting; let me know if there is anything you need updated:

- Response time when tickets are raised is a general issue.
  ACTION: Okan to look into this and produce reports. Okan also stated that the intention is to give all areas suitable escalation paths so that important/urgent issues can be dealt with accordingly.

- Clusters are used to deliver courses, which is critical to education delivery.
  - Teaching suites: 120 machines. [We need to understand the full inventory.] Common core build; various apps including LabVIEW; simulation software (Geant4), which is not a commercial package; admin rights on machines when they are set up. One lab is running Linux (old Oracle licence), which needs to be moved on to the new RHEL licence. It was not clear whether Physics technicians have admin access to these machines.
  - Microprocessors lab: AppsAnywhere can be flaky, causing intermittent issues where the only solution is to ask the student to move to another machine. Regular maintenance of the machines and software is needed; the AppsAnywhere installation of Python and Anaconda is not working properly.
  - Masters teaching lab: heterogeneous lab; Jupyter notebooks; Anaconda was not updating; the Python install can also be flaky. We could look at starting an overhaul around Easter. Certain users are having issues; a self-recovery solution would be ideal.
  ACTION: It would be useful to have a full view of the labs and the machines in them (location, number of machines, OS/installation pack needed); could you please look into starting to prepare this. - David Colling
  ACTION: Identify an allocated person to work with David and the Physics team to address the immediate issues (e.g. the Python/Anaconda issues). [Subsequent note: Okan asked Simon Thompson to raise a problem ticket and start looking into this.]
  ACTION: Identify an allocated person to work with David and the Physics team to prepare a plan for making the clusters set up and run as required. Use the setup of the Instrumentation course as an example. - Okan Kibaroglu

- Use of the H: drive needs to be tested on Linux machines. [I checked with Ingrid after the meeting that this is different from Group Space, and unfortunately we do not have a long grace period to get a solution in place.]
  ACTION: Check with Ingrid about any H: drive solutions and the progress made. - Okan Kibaroglu [Subsequent to the meeting, I found that solutions are being looked into. We will arrange for this to be tested as a matter of priority.]

- Regular preventative maintenance of the clusters, both hardware and software, would be useful to reduce the number of issues.
  ACTION: Arrange regular preventative maintenance of the clusters. - Okan Kibaroglu

- Plagiarism suite for code: Gradescope, with MOSS built in, is now being marketed by Turnitin.
  ACTION: Check the availability of Gradescope via Turnitin. - Okan Kibaroglu [After the meeting, I asked Adrian Thomas, the Product Owner, to check this.]

- Review of software on the Software Hub, as we have many old versions.
  ACTION: How are we doing the review of the software available on the Hub? - Okan Kibaroglu

Best Regards,

Okan Kibaroglu (he/him)
Interim Head of Customer Success
Imperial College London
Information and Communication Technologies
South Kensington, London, UK SW7 2AZ
Tel: +44 (0)20 7594 1614

-----Original Message-----
From: physics-departmental-computing-bounces@imperial.ac.uk <physics-departmental-computing-bounces@imperial.ac.uk> On Behalf Of David Colling
Sent: 10 January 2023 17:00
To: Evans, Tim S <t.evans@imperial.ac.uk>
Cc: physics-departmental-computing <physics-departmental-computing@imperial.ac.uk>
Subject: Re: [Physics-Departmental-Computing] MOSS and Gradescope - plagiarism comments

Hi Tim,

Thanks for these comments - some of the other aspects look quite useful.
The part about previous years is something that we should talk to the Ed Tech people about. I have cc'ed the list so that others can see your email and can comment.

Best,
david

On 10/01/2023 16:24, Evans, Tim S wrote:
Hi David,
After our discussion in the meeting, I looked up Gradescope and found the following UK-based discussion of the software:
https://www.elearning.fse.manchester.ac.uk/fseta/gradescope/
It has some advantages/disadvantages bullet points that might be useful.
Gradescope is much more than just MOSS, so we might not want the other stuff. For instance, it contains AI “marking” of assignments. It generally reproduces all the structure of Blackboard/Turnitin reports, but for code. MOSS is just the Turnitin replacement, though it is more limited than Turnitin. I am not sure we need all that other guff or whether it is helpful. Perhaps it can be, if we learn to use it.
My biggest concern is that MOSS only checks code against what you submit. So I submit all the code (now 9 years' worth) to MOSS, and that works for me. I am not sure it is possible to add this old code to Gradescope when checking an assignment. So Gradescope does NOT appear to compare code outside the one assignment, meaning previous years' work is not checked. Nor does Gradescope check against GitHub or anywhere else on the internet. It is not simply Turnitin for code in that sense – I don't need to submit past reports for the same project; Turnitin finds matches for those (only trivial text matches so far). I find the most serious problems are people using bits of code that were placed on GitHub in previous years (I found 20 examples then stopped looking) and then changing the variable names and sometimes a bit more. I think Gradescope is designed for simple code examples and to catch in-class copying. It is not designed for bigger projects (3rd year computational physics) where chunks of standard code reappear legitimately – you can only implement Maxwell's equations and Runge-Kutta in so many ways – but where code is most likely to be copied from previous years. I also find the code in my 3rd year course is too long/hard to check by eye. I have to use grep to check for identical strings to be sure. [Labels and comments are the best giveaway, as students forget to change those when changing variable names; the best I found was a misspelt word in a comment, because they had changed the part of the word that had been used in an old variable name!]
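As a concrete illustration of the grep approach above, here is a minimal sketch (the file names and contents are invented for illustration, not real submissions) that flags lines – typically comments – appearing verbatim in two pieces of submitted code:

```shell
# Hypothetical demo: two toy "submissions" that share one verbatim
# comment line (misspellings and all, the classic giveaway).
mkdir -p /tmp/grep_demo && cd /tmp/grep_demo

cat > student_a.py <<'EOF'
# intialise the feild grid
x = 0
EOF

cat > student_b.py <<'EOF'
# intialise the feild grid
y = 0
EOF

# -F: fixed strings, -x: whole-line match, -f: take patterns from a file.
# Prints every line of student_b.py that also appears verbatim in
# student_a.py -- here, only the shared (misspelt) comment survives.
grep -Fxf student_a.py student_b.py
```

Running this prints the shared comment line only; the differing code lines (`x = 0` vs `y = 0`) are not matched, which is exactly the "identical strings" check described above, just automated pairwise.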
So I am not convinced Gradescope does the job I want. However, it seems to be the only option other than MOSS. Gradescope might be a partial solution for others, though, e.g. to cover 1st year lab exercises. I'd like to test the AI marking out; that could be interesting.
Thanks for all your work with the committee.
Tim
Dr Tim Evans,
Senior Lecturer, Theoretical Physics group, Centre for Complexity Science, Room H609, Physics Dept., Imperial College London, SW7 2AZ
020 7594 7837 http://imperial.ac.uk/people/t.evans
@netplexity http://netplexity.org/
_______________________________________________ Physics-Departmental-Computing mailing list Physics-Departmental-Computing@imperial.ac.uk https://mailman.ic.ac.uk/mailman/listinfo/physics-departmental-computing