Please read: phasing out of lcg-utils - action required
Hi All,

With the advent of SL7 it is necessary to switch DIRAC to use the gfal tools for file management, e.g. input and output data. At the moment this is handled by the lcg tools within DIRAC; however, these are not available on SL7. Unfortunately it's an all-or-nothing approach, so we need to switch all sites in DIRAC in one go.

**** This requires all DIRAC UIs to be upgraded to v6r17p18 (or higher), so if you are using a DIRAC UI, please upgrade to this version by May 18th: https://www.gridpp.ac.uk/wiki/Quick_Guide_to_Dirac#Dirac_client_installation ****

If you are using Ganga *from CVMFS*, we will talk to the Ganga developers and you should not have to do anything.

Note that if you are using the lcg tools explicitly within your jobs and they run at SL6 sites, these are still available, but your file transfers within DIRAC will be handled by gfal.

Regards,
Daniela

--
Sent from the pit of despair
-----------------------------------------------------------
daniela.bauer@imperial.ac.uk
HEP Group/Physics Dep
Imperial College London, SW7 2BW
Tel: +44-(0)20-75947810
http://www.hep.ph.ic.ac.uk/~dbauer/
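A quick way to confirm the upgrade took effect (a minimal sketch; the path below is whatever directory your dirac-install created, so adjust it to your own installation):

source /path/to/dirac_ui/bashrc   # environment file created by the DIRAC UI install
dirac-version                     # should report v6r17p18 or higher after the upgrade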
On 04/05/17 10:27, Daniela Bauer wrote:
Note that if you are using the lcg tools explicitly within your jobs and they run at SL6 sites, these are still available, but your file transfers within DIRAC will be handled by gfal.
It might be good to ask job submitters to switch to the GFAL tools within their jobs soonish, since SL6 sites will dwindle before long. The GFAL tools should already be working OK on SL6 grid sites, so the same jobs could (perhaps) run on SL7 clusters if by chance they land there. We'll talk at the Ops Meeting about the transition, I hope.

In any case, if you need an SL7 test cluster, you can use hepgrid5.ph.liv.ac.uk, which is an SL6 ARC CE fronting a mini-cluster (48 slots) with SL7 worker nodes. I just thought I'd mention it. ILC jobs are running on it now, so they are getting a bit of extra grid time while the going is good, but other VO job submitters could test their set-ups on it if they wish.

Cheers,
Ste
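For a job that currently stages files with lcg-cp, the switch is essentially a one-line change (a sketch only; the storage element and path below are placeholders, not real endpoints):

# lcg-utils version (SL6 worker nodes only):
#   lcg-cp file://$PWD/output.txt srm://some.se.example/dpm/example/home/myvo/output.txt
# GFAL version (works on SL6 and SL7):
gfal-copy file://$PWD/output.txt srm://some.se.example/dpm/example/home/myvo/output.txt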
--
Steve Jones                          sjones@hep.ph.liv.ac.uk
Grid System Administrator            office: 220
High Energy Physics Division         tel (int): 43396
Oliver Lodge Laboratory              tel (ext): +44 (0)151 794 3396
University of Liverpool              http://www.liv.ac.uk/physics/hep/
Hi Daniela,

So just to check: as long as we update our DIRAC UI, we don't need to change our code? So if we have something like lcg-ls -l <url>, we don't have to find the gfal equivalent?

Hi Stephen,

Can you direct me to a good page that has the gfal equivalents of the lcg functions? At the moment I am having to look at many pages and I'm still not able to find all the equivalent functions in gfal (see the above example).

David
--
Dr David John Auty
Centre for Particle Physics
2-090 CCIS
University of Alberta, Edmonton
Alberta, Canada, T6G 2E1
Office Phone: 780 4920338
Mobile Phone: 780 9340002
On 04/05/17 15:45, David Auty wrote:
Can you direct me to a good page that has the gfal equivalents of the lcg functions? At the moment I am having to look at many pages and I'm still not able to find all the equivalent functions in gfal (see the above example).

Hi David,
I'm afraid I'm hardly any wiser. I did some googling to find out how to test that the GFAL tools work well at our site (Liverpool), and made a little test job from what I read (see below). I'm cc'ing this to our "Storage Group" to see if anyone can recommend some formal documentation on GFAL. So ... any answers?

Cheers,
Ste

--- that gfal test job ---

#!/bin/bash
echo "Copying in grid3.txt from some remote place"
gfal-copy srm://hepgrid11.ph.liv.ac.uk/dpm/ph.liv.ac.uk/home/dteam/sjones.d/grid3.txt file://$PWD/grid3.txt
echo "Get rid of remote grid3.txt"
gfal-rm srm://hepgrid11.ph.liv.ac.uk/dpm/ph.liv.ac.uk/home/dteam/sjones.d/grid3.txt
echo "Copy back the new file to the remote place"
gfal-copy file://$PWD/grid3.txt srm://hepgrid11.ph.liv.ac.uk/dpm/ph.liv.ac.uk/home/dteam/sjones.d/grid3.txt
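One assumption the job makes that is worth spelling out for anyone reusing it: a valid grid proxy must be available in the job's environment. On a UI you would obtain one with something like (dteam here matching the VO in the SURLs above):

voms-proxy-init --voms dteam   # create a VOMS proxy before running the gfal commands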
Hi Steve (everyone),

While I look for more detailed comparisons, this might help (from 2013 or so, when we actually started retiring lcg-utils - they've been retired since around 2014, btw!):

https://indico.egi.eu/indico/event/2189/contribution/2/material/slides/1.pdf

This was a presentation giving an overview of the differences between lcg-utils and gfal2...

Sam
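For the everyday commands, the rough one-to-one correspondences seem to be the following (a sketch, not an authoritative list; <url> stands for any URL the tools support, e.g. an SRM SURL):

# lcg-utils                 ->  gfal2-utils (approximate equivalents)
#   lcg-ls -l <url>         ->  gfal-ls -l <url>
#   lcg-cp <src> <dst>      ->  gfal-copy <src> <dst>
#   lcg-del <url>           ->  gfal-rm <url>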
Of course, gfal-ls is broken in the current version of the DIRAC externals, so hmpf. This "let's replace lcg-utils before we have a supported replacement" approach has been getting on my nerves, possibly since 2013. I'll deal with the DIRAC failure once I've made some headway on my other tickets.

Daniela
Well, strictly, gfal2 was supposed to be the supported replacement... and the experiments were asked to engage with this back in 2012/13! To be fair to the gfal guys, whilst the documentation for their tools could be much, much better, they've also not had the best amount of feedback from the people they were trying to support.

[FTS3, btw, is basically a gfal2-powered service, so really you've all been using gfal2 things since that was rolled out.]
Hi David,

To answer your first question: yes, lcg-ls -l will still be available, *unless* you are running on an SL7 node. Having said this, sites shouldn't enable a VO on their SL7 queues without checking with the VO first.

We think we have now worked out a fix for the gfal-ls issues; Simon is about to write an 'angry system manager email(TM)' to the DIRAC developers. On that note, it might be worth waiting for this fix to go into the DIRAC UI before upgrading. I'll let the list know.

Regards,
Daniela
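In the meantime, a job that might land on either platform can guard the call itself (a sketch; the SURL below is a placeholder, not a real endpoint):

#!/bin/bash
# Use lcg-ls where lcg-utils are still present (SL6 worker nodes),
# otherwise fall back to the gfal equivalent (e.g. on SL7).
SURL="srm://some.se.example/dpm/example/home/myvo/somedir"   # placeholder
if command -v lcg-ls >/dev/null 2>&1; then
    lcg-ls -l "$SURL"
else
    gfal-ls -l "$SURL"
fi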
Participants (4): Daniela Bauer, David Auty, Sam Skipsey, Stephen Jones