Dear All,

The upgrade of the DIRAC server has now finished; please let us know of any problems.

Please note that python3 is now the default and we recommend upgrading your UI to python3 using the instructions provided here:
https://www.gridpp.ac.uk/wiki/Quick_Guide_to_Dirac

For our cvmfs DIRAC UI users: *After cvmfs syncs the next time - this might take a day*:
/cvmfs/dirac.egi.eu/dirac/bashrc_gridpp will point to a python3 install; the legacy python2 install can be found here:
/cvmfs/dirac.egi.eu/dirac/bashrc_gridpp_py2

Regards,
Daniela

--
-----------------------------------------------------------
daniela.bauer@imperial.ac.uk
HEP Group/Physics Dep
Imperial College London, SW7 2BW
Tel: Working from home, please use email.
http://www.hep.ph.ic.ac.uk/~dbauer/
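For anyone switching their cvmfs UI, a minimal sketch of what that looks like in practice (the bashrc paths are the ones above; the proxy step and the group name are only an illustration - substitute your own VO's group):

  # python3 UI (default once cvmfs has synced)
  source /cvmfs/dirac.egi.eu/dirac/bashrc_gridpp
  # legacy python2 UI, if you still need it:
  # source /cvmfs/dirac.egi.eu/dirac/bashrc_gridpp_py2

  # then get a proxy as usual (placeholder group, use your own)
  dirac-proxy-init -g gridpp_user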
Dear All,

Well, that didn't go well. It looks like the DIRAC environment has started leaking into user jobs (again).

About 2.5 years ago we went through a lengthy spell of code development (and lively discussion with the core DIRAC team) to ensure that the DIRAC tools were available to our users while keeping the DIRAC environment separate from the user-defined one. Back then the battleground was the LD_LIBRARY_PATH - something you might have noticed is still clear even in the new setup; if it weren't, that would have raised an alarm much earlier.

We've been scrambling behind the scenes trying to come up with at least a temporary solution, including trying to downgrade various bits of DIRAC code, but so far we haven't found a winning combination.

We have reported this to core DIRAC:
https://github.com/DIRACGrid/DIRAC/discussions/6277
Snoplus has filed a GGUS ticket for this:
https://ggus.eu/?mode=ticket_info&ticket_id=158152

If you run into similar issues, please let us know, via GGUS or on this list (please not in private emails, it's hard to point a developer at a private email). If your stuff still works, can you please let us know as well, whether or not you know what made it work.

Apologies,
Daniela
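If you want to check your own jobs for leakage, one minimal diagnostic payload is a plain bash script (nothing DIRAC-specific is assumed) that prints the suspect variables from inside the job:

  #!/bin/bash
  # print the library path and anything DIRAC- or python-related
  # that has made it into the payload environment
  echo "LD_LIBRARY_PATH=${LD_LIBRARY_PATH}"
  env | grep -i -E 'dirac|python' | sort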
Hi Daniela,

I've noticed I'm unable to direct my jobs to files in my sandbox after the update. A test job that I ran before the UI update now fails, as it's unable to find the file in my sandbox - although it is uploaded.

To point the job at the script, I originally used (and this worked):
"foo bar ./file_in_sandbox.sh args"

This now produces an error log with:
FATAL: stat /home/pltcom01/file_in_sandbox.sh: no such file or directory

As a test, uploading this file to cvmfs and directing the job to the cvmfs version instead works successfully.

Is it possible that the environment leakage is causing this problem? Or is there a change to the UI to use the sandbox?

Many thanks,
Roden
Hi Roden,

Your files are being included correctly in the sandbox (you can verify this by downloading an input sandbox via the web UI, or with the dirac-wms-job-get-input command). The error is generated when you start your singularity container inside the job.

The new version of DIRAC includes its own version of singularity, which is given precedence over any locally available versions (this is a side-effect of the DIRAC environment change). The new singularity behaviour is that on start-up it changes the current directory (CWD) to $HOME, but your scripts are almost always in some kind of site/job-specific scratch directory. E.g. your script might be in /scratch/jobid/myscript.sh, but if you run "singularity <args> ./myscript.sh", it'll try to run it from /home/batchuser/myscript.sh instead: hence the "no such file or directory" error.

Can you please try adding this flag to your singularity options?

  -H "$(pwd):/home"

This will set the home directory for the container to be simply "/home" inside the container; the CWD will be set to /home and will contain the current scratch directory from outside the container. This should then work exactly as it used to; please let us know if it doesn't or you find any other issues.

For debugging jobs, you can run them on your local node with the UI sourced: this gives an almost identical environment to running on the grid now (which is one of the advantages of the new environment set-up).

Regards,
Simon
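To spell out the change Simon describes, a before/after sketch of the payload command - the exec sub-command, image path and script name are placeholders for whatever your job actually runs; only the -H option is the fix itself:

  # before: the DIRAC-shipped singularity starts in $HOME, so a relative
  # ./file_in_sandbox.sh is looked up there and fails
  singularity exec /cvmfs/some/image.sif ./file_in_sandbox.sh args

  # after: bind the job's scratch directory (the CWD, which holds the
  # sandbox files) to /home inside the container
  singularity exec -H "$(pwd):/home" /cvmfs/some/image.sif ./file_in_sandbox.sh args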
Hi Simon,

Thank you, that's worked like a charm.

Regards,
Roden
participants (3)
- Daniela Bauer
- Derveni, Roden
- Simon Fayer