Dear Members of the Physics Computing Committee,

You will all no doubt be aware of the reorganisation being planned in ICT. From correspondence with some of you today, it is clear that some of you have read the white paper and others have not. I am not sure how secret this document is meant to be, but I have been sent so many copies that I view it as essentially public, so I have attached it to this email. I would urge you all to read it carefully, especially the staff planning at the end of the document.

best, david
Hi All,

I would encourage you to circulate this white paper to all the more computer-dependent members of your group. The things that particularly worry me are the loss of the whole RCS team (including the HPC and RSE teams), the reduced services to users (including AV services), and the constraints on what ICT will be able to support.

I would be very interested in the thoughts and opinions of other people.

Best, david

On 5 June 2020 20:07:52 BST, David Colling <d.colling@imperial.ac.uk> wrote:
Dear Physics Computing Committee,

As physicists, we have a lot of case studies where our unusual needs have been met by bespoke local ICT support. Providing a dossier of evidence of this might be a useful and concrete thing we can do. My comments from the EXSS perspective:

* The Research Computing Service disappears as a separate entity, and seems to be lumped in, without any management, into the 'portfolio function'. The provision of high performance computing at Imperial has been a real success story over the last 15 years. We have an enormous number of people in EXSS doing computational materials research, even within groups that are mainly experimental, who would not individually be able to support this work with either hardware or expertise. This has led to some extremely high quality science, and has only been possible because of the support and training provided by RCS / ICT.

* The white paper makes multiple negative references to the cost of supporting obsolete hardware and software from 'shadow ICT'. Many of the EXSS experimental rigs run custom hardware and drivers, often tied to obsolete versions of Windows, so this may cause issues. We might find that suddenly no help is offered for any technical issues with these old machines, and that we are no longer allowed to connect them to the intranet.

* Something similar might be the case for the CNC machines in the workshop.

* Generally, we have very varied and heterogeneous requirements in EXSS, and have definitely benefited from bespoke solutions and local expertise.
The CMTH/EXSS 'cluster' group operations manager (Carolyn Dale) is concerned:

"I am in complete shock about this. At the start of the lockdown, at our first admin team meeting from home, we were asked by Luke if we had time to complete an ICT survey about the service we receive, which, after what they have done, now seems pointless. From my own admin team's view (I made them aware of the redundancies on Friday, as I had spoken to Martin Morris), we are very concerned about the level of service we would receive, as we often have to ask for one-to-one help to assist new students/visitors and with problems with our own computers and group laptops. That message comes from the 3 of us in the cluster office."

Prof Nelson added after I distributed the above to the EXSS PIs:

* EXSS has a relatively large number of students and RAs doing computational research alongside experimental work. Since, as an experimental group, EXSS does not have and could not support its own computational support staff, the existence of the HPC and the RCS service has been absolutely critical in allowing us to work.

* EXSS staff and students have developed software that they have published open source or otherwise want to make available for wider use. The services provided by ICT have been essential in allowing that to happen. At least one such piece of software currently forms part of a Physics Department REF case study.

* The old MRes in plastic electronics and the new MRes that we are opening this October have a significant computational teaching component, in which we use networked computers and software to teach concepts in the physics and chemistry of materials. We are stepping up the computational teaching this year due to the pandemic. The support of ICT has been critical in allowing the computational teaching to be done (I can give specific examples of last-minute help given by ICT staff), since this was not part of the standard UG or PG teaching programmes.
The computational teaching is considered a relatively innovative part of our MRes programme.

* I have an ERC grant largely for computational work. The support of the HPC and RCS was essential to demonstrate that the work could be done within EXSS.

Others (like you?) are likely to have similar examples.

On Sat, 6 Jun 2020 at 14:20, David Colling <d.colling@imperial.ac.uk> wrote:
_______________________________________________ Physics-Departmental-Computing mailing list Physics-Departmental-Computing@imperial.ac.uk https://mailman.ic.ac.uk/mailman/listinfo/physics-departmental-computing
Some specific examples from SPAT where ICT have performed, and still perform, critical roles (not a comprehensive sweep by any means, but it certainly shows that ICT is NOT failing in all aspects):

Rich

-----------------------------------------------------------------------------------------------------------------------------------------

[1] Here is an example of how the RSE team helped solve a unique problem for the Physics Department's Cassini team, ensuring the continuity of provision of unique data to the community. The text below was submitted to the "President's Award for Research Support" (I don't know the outcome).

Statement of support

Please include a statement of support from an individual not involved in the nominee's direct line of management

Dr Richard Bantges, Department of Physics:

The scientific elements of the MAGDA data preservation project precluded employing a contract web development team, but the expertise of the Research Computing Service was a perfect fit. The RSE team quickly understood the complexities of the problem at hand, rapidly prototyping web pages and visualisations for us to review. Without the help of Chris Cave-Ayland and Mark Woodbridge in particular, we would have struggled to meet our commitment to maintain a web-based quick-look visualisation tool for accessing and distributing Cassini mission data for the benefit of the global research community. Beyond our expectations, the team showed an attention to detail and a proactive approach that was not only effective but very reassuring. They provide an invaluable and unique service to the research community at Imperial.

[Richard is the Scientific Project Manager for MAGDA, a separate project to which the RSE team have recently contributed.
This work was commissioned by the Head of the Department of Physics and sponsored by the Vice-Provost for Research and Enterprise via the College's Strategic Development Fund]

-----------------------------------------------------------------------------------------------------------------------------------------

[2] Another example of how ICT has been successful was the support for the Cassini end-of-mission phase (Apr-Sep 2017). This required:

1. Set-up of a dedicated Cassini Virtual Team, consisting of support members from the OS Team, Networks and the Data Centre
2. Provision by the OS Team of a series of customised virtual machines to act as backup systems for Windows Server 2000 systems that could not be upgraded
3. Specialist UNIX support to help support Solaris 10 (SPARC) systems
4. Advanced network support to ensure seamless dedicated VPN links between Imperial College London and NASA's JPL
5. Liaison between the College's Estates division and Networks to provide multiple electrical power sources for computing and network infrastructure resilience

Certain aspects of the above were called upon in the final stages of the mission, and ICT helped ensure that the Imperial College Cassini team were able to monitor data from its magnetometer onboard the Cassini spacecraft (built in the Magnetometer lab in Huxley 6M), thus ensuring the highest quality science data were obtained at end of mission.

-----------------------------------------------------------------------------------------------------------------------------------------

[3] Currently the space mission Solar Orbiter, with another of SPAT's magnetometers on board, is relying on RCS-hosted VMs to provide near real-time data analysis and monitoring. This solution was chosen for resilience and the level of service offered by the RCS.
-----Original Message----- From: Jarvist Moore Frost, Sent: 08 June 2020 13:44, Subject: Re: [Physics-Departmental-Computing] ICT reorganisation
Forward: I am copying this to Michelle, especially for the "personal concerns on the ICT restructuring process" near the end (given Paul French's suggestion about appropriate channels).

––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––

Hi Dave & All,

Just a few quick examples of how ICT affects my research & teaching. I believe that my colleagues in PLAS have similar opinions about the research:

* I believe that the current ICT setup has proved effective & resilient through its rapid and often bespoke response in keeping teaching and research working through the current pandemic, especially in the first critical weeks of lockdown.

* Research – The Research Computing Service at Imperial is the envy of my colleagues employed at other universities, both in the UK and internationally.

* The RCS has provided a first-class service at all levels: HPC hardware, provision of training, expert advice on HPC code engineering, etc.

* It provides my research team (and the research teams of other colleagues) with the basis to go on and secure access to national (e.g. ARCHER) and international (e.g. PRACE) HPC. It enables us to pump-prime and optimise our codes, which are pre-requisites for tier-1 and tier-0 HPC access. It also allows us to perform modest production runs, which means our high-fidelity production runs on tier-1 and tier-0 HPC can be obtained with less resource (or, alternatively, we get more quantity and/or quality of publishable results for the fixed resource that we can competitively obtain).

* All 12 of my PhD students since 2004 have used it, as have all 4 of my postdocs. Their training and research output would not have been possible without it. As an example of the value that RCS has added: from this set, 4 hold faculty positions (or the equivalent in national labs).

* The research teams of (at least) 7 out of 10 academics in PLAS use the Imperial RCS (i.e. cx1) in their research.
The 4 non-experimental academics (like me) use it extensively. Most of the experimental teams perform supporting HPC simulations (e.g. 2D3V particle-in-cell, or 3D magnetohydrodynamics). You can't publish experimental results in PRL or other high-impact journals in our field without this these days!

* Teaching – ICT has provided a very stable, consistent and useful platform for delivering UG teaching. It could be better. It could be a lot worse. But it has a track record of being fit enough for purpose, through the 220+ students Physics graduates each year. I run the 2nd year UG computing module in my department (I have had the pleasure(?) of doing this for ~10 years, since 2005). I have also lectured the 3rd year Computational Physics course for 3 years. The whole cohort takes 2nd year computing; passing it is a requirement for UG progression. ICT currently understands that the best solution for the needs of people like me, tasked with delivering the actual teaching (i.e. academics), is not necessarily the best solution from a purely "computing technology" point of view.

* Every time the provided software is changed, I need to invest a considerable amount of time effectively debugging the course materials. This is time taken away from improving the academic quality of the material, from personal interaction with students, from writing grant proposals and running my research team, and quite simply a loss of "space" to be truly innovative in research and teaching.

* I believe that the current ICT staff and management understand this.

* The ICT restructuring, led by someone with no prior university operational experience, is happening so fast that I fear it cannot possibly capture such requirements. (Happy to be proved wrong, of course!)

Sorry it is a bit wordy. "Research" is probably the most evidenced example of ICT's worth.
"Teaching" probably best exemplifies my serious concerns about the whole ICT restructuring process:

* Speed & inappropriate timing.
* Lack of understanding of the end-users' needs and priorities (or at least those of a typical "Academic").
* #1 priority is to deliver teaching for 2020/21 and to have ICT stability (a known environment, sufficient ICT bodies in place, personal connections to get things done; heck ... to have a functioning ICT at all!) to deliver what is looking to be a challenging 2021/22, teaching-wise.
* #2 priority is to have a functioning HPC service.

Best wishes, Robert

PS: I am copying this in to Michelle to save her having to read a separate email from me on this! This also gives me a little more time to assess those MSci reports that are becoming ever more urgent! Thanks to the ICT restructuring, my teaching duties for today have been utterly disrupted. Probably a prime example of an indirect cost of the ICT restructuring process, and one I doubt the architects understand. Following the logic further as a thought experiment: late MSci assessment ==> possible delay in the cohort graduating (==> possible loss in future income from loss of kudos, etc. / me getting fired/reprimanded).

Dr Robert J. Kingham
Plasma Physics Group
Imperial College London
rj.kingham@imperial.ac.uk
Professional Web Page <http://www3.imperial.ac.uk/people/rj.kingham>

I work flexibly and aspire to a sensible work-life balance. I'm sending this email now because it suits me, but I don't expect you to read or respond outside of your own working hours. During the COVID-19 pandemic it may take me longer than usual to reply to emails, due to my care & education responsibilities for my school-age children.
participants (4)

- Bantges, Richard J
- David Colling
- Jarvist Moore Frost
- Kingham, Robert J