Draft email to distribute regarding HX2 and future needs
Hi All,

I promised both the Physics Computing Committee and the FRCC that I would send around an email for people to circulate to acquire information about HX2 and other future needs. I started, and then thought that it might be more fruitful to circulate a draft first and then refine it slightly. So below is a (fairly rough) draft for comment. Comments are very welcome.

Best,
david

Subject: Input into future research computing needs

Dear All,

As you may know, the College's HPC platform has been upgraded, with HX1 coming online earlier this year. HX1 is a cluster of over 18k cores (Intel Xeon Platinum 8358) and 60 GPUs (Nvidia A100s). This year the focus is on updating the data storage, with a move away from the current RDS to a more structured Research Data Facility (RDF). The RDF will be coming online later this year.

We are now looking for input on what will be needed both next year and further into the future, on a 3 to 5 year timescale.

Next year the plan is to decommission the College's main High Throughput Cluster (HTC), CX3, and replace it with a new system. The budget available for this purchase will be approximately £5M. Because this will be our first fully water-cooled system, we are negotiating space in a new data centre with this capability, and so this resource will not come online until the summer of 2025. We are seeking your input as to what we should purchase. As this is replacing an HTC system, it is assumed that the majority of the money will be spent on another HTC system; however, if your research needs other shapes of resource, then please let us know. Also, even if what you want is an HTC resource, please tell us what parameters are important to you (e.g. memory/core, access to GPUs, throughput from disk, etc.).

*** Different for different departments - this is the physics example *** Please give us feedback via your representatives on the Departmental Computing Committee.

We are also trying to establish what the medium-term computing needs will be, so please can you also discuss these with your representatives on the Departmental Computing Committee. These may be expected needs from specific instruments/experiments/projects or general growth in computing needs. We will then try to amalgamate these into a coherent input for future purchases.

Best,
david
Dear All,

Attached are the minutes from last week's FRCC meeting. The next meeting will be on 6th November.

Many thanks,
Craig

-----Original Message-----
From: David Colling <d.colling@imperial.ac.uk>
Sent: Tuesday, July 16, 2024 3:48 PM
To: Richards, Andrew <a.j.richards@imperial.ac.uk>; Bryce, Craig T <c.bryce@imperial.ac.uk>; Bearpark, Michael J <m.bearpark@imperial.ac.uk>; Vilar Compte, Ramon <r.vilar@imperial.ac.uk>; Bresme, Fernando <f.bresme@imperial.ac.uk>; Keaveny, Eric E <e.keaveny@imperial.ac.uk>; Sternberg, Michael J E <m.sternberg@imperial.ac.uk>; Pearse, Will <will.pearse@imperial.ac.uk>; Staffell, Iain L <i.staffell@imperial.ac.uk>; Pengelly, Ellen <e.pengelly@imperial.ac.uk>; Bantges, Richard J <r.bantges@imperial.ac.uk>; Michalickova, Katerina <k.michalickova@imperial.ac.uk>; physics-departmental-computing <physics-departmental-computing@imperial.ac.uk>
Cc: Armstrong-Brown, Sophie <s.armstrong-brown@imperial.ac.uk>; Buchaca-Domingo, Ester <e.buchaca-domingo@imperial.ac.uk>; David Colling <david.colling@gmail.com>; Kibaroglu, Okan <o.kibaroglu@imperial.ac.uk>; Willson, Thomas H <t.willson@imperial.ac.uk>; Oram, Debbie <d.oram@imperial.ac.uk>; Lumley, Emily K <e.lumley@imperial.ac.uk>
Subject: Draft email to distribute regarding HX2 and future needs
Dear All,

I have a spreadsheet of all the machines with possible outward-facing vulnerabilities across FoNS (thanks Tom). Most are minor or not a problem, but there are a few that really should be addressed. I have been chasing up those in Physics, but I am not sure what is the best thing to do about other departments. Do people want me to chase, or would it be better coming from the representatives of that department? I think the latter, but I would be interested in your opinions.

Best,
david
Dear David,

Best from the department representatives. They will be a "known face" to the people being asked to change their computing practices, so the message may be received more easily. Plus, that takes some of the burden off you!

Kind regards,
Iain

-----Original Message-----
From: David Colling <d.colling@imperial.ac.uk>
Sent: 29 October 2024 19:08
To: Bryce, Craig T <c.bryce@imperial.ac.uk>; Richards, Andrew <a.j.richards@imperial.ac.uk>; Bearpark, Michael J <m.bearpark@imperial.ac.uk>; Vilar Compte, Ramon <r.vilar@imperial.ac.uk>; Bresme, Fernando <f.bresme@imperial.ac.uk>; Keaveny, Eric E <e.keaveny@imperial.ac.uk>; Sternberg, Michael J E <m.sternberg@imperial.ac.uk>; Pearse, Will <will.pearse@imperial.ac.uk>; Staffell, Iain L <i.staffell@imperial.ac.uk>; Pengelly, Ellen <e.pengelly@imperial.ac.uk>; Bantges, Richard J <r.bantges@imperial.ac.uk>; Michalickova, Katerina <k.michalickova@imperial.ac.uk>; physics-departmental-computing <physics-departmental-computing@imperial.ac.uk>
Cc: David Colling <david.colling@gmail.com>; Kibaroglu, Okan <o.kibaroglu@imperial.ac.uk>; Willson, Thomas H <t.willson@imperial.ac.uk>; Oram, Debbie <d.oram@imperial.ac.uk>; Lumley, Emily K <e.lumley@imperial.ac.uk>
Subject: FoNS vulnerabilities
Dear David,

Many thanks for this. Would it be possible for ICT to liaise directly with the PIs? Perhaps we can discuss this point at our next meeting?

Best wishes,
Fernando.

--
FERNANDO BRESME
Professor of Chemical Physics FRSC
Department of Chemistry
Imperial College London
White City Campus
207C, Molecular Sciences Research Hub
London, W12 0BZ
E f.bresme@imperial.ac.uk
T +44 (0)20 7594 5886

________________________________
From: David Colling <d.colling@imperial.ac.uk>
Sent: 29 October 2024 6:07 PM
Subject: FoNS vulnerabilities
participants (4)
- Bresme, Fernando
- Bryce, Craig T
- David Colling
- Staffell, Iain L