Build issue on Imperial clusters/Intel CC
Hi,

We've been experiencing a problem with the Firedrake build using the Intel compilers/Intel MPI on the Imperial clusters. We have a workaround (see attached), but it would be nice if the build were fixed. The workaround basically builds PETSc separately so that we can put a couple of symlinks into the petsc/include directory: they are links to mpi.h and mpio.h. If we don't do that, PyOP2 fails to find the headers in the Intel MPI environment. The include paths are set in the mpi module but seem to be lost when it comes to building PyOP2.

Thanks,
Bob

Bob Cregan
HPC Systems Analyst, Information & Communication Technologies
Imperial College London, South Kensington Campus, London, SW7 2AZ
T: 07712388129
E: b.cregan@imperial.ac.uk
W: www.imperial.ac.uk/ict/rcs
@imperialRCS @imperialRSE
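For reference, the workaround amounts to something like the following sketch (the PETSc repository, the configure options, and the header location under I_MPI_ROOT are illustrative assumptions; the attachment has the exact commands):

    # Build PETSc separately, outside firedrake-install.
    git clone https://github.com/firedrakeproject/petsc.git
    cd petsc
    ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
    # configure prints the exact "make ... all" invocation to run next.

    # Symlink the Intel MPI headers into petsc/include so the PyOP2 build
    # can find them. I_MPI_ROOT is set by the Intel MPI module; the exact
    # header path below is an assumption and varies between MPI versions.
    ln -s "$I_MPI_ROOT/intel64/include/mpi.h"  include/mpi.h
    ln -s "$I_MPI_ROOT/intel64/include/mpio.h" include/mpio.h

    # Then install Firedrake against the pre-built PETSc.
    export PETSC_DIR="$PWD"
    cd .. && firedrake-install --honour-petsc-dir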
Hi Bob,
On 20 Mar 2019, at 11:28, Cregan, Bob <b.cregan@imperial.ac.uk> wrote:
We've been experiencing a problem with the Firedrake build using the Intel compilers/Intel MPI on the Imperial clusters. [...]
Hmm, firedrake-install just builds with whatever mpicc/mpif90 it finds. You can control this with:

    firedrake-install --mpicc your-mpicc --mpicxx ...

Do you happen to have a log from a build that failed? (I note that it seems like madness that one has to set the I_MPI_CC environment variable for the MPI headers to be found: surely the compiler wrapper mpicc should arrange for that to happen.)

Also, if you use --honour-petsc-dir, then the PETSC_CONFIGURE_OPTIONS line is ignored (firedrake-install doesn't build PETSc in this case).

Cheers,
Lawrence
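To illustrate the --honour-petsc-dir route mentioned above, a minimal sketch (the paths and the arch name are placeholders):

    # With --honour-petsc-dir, firedrake-install skips its own PETSc build
    # and PETSC_CONFIGURE_OPTIONS has no effect; PETSC_DIR/PETSC_ARCH must
    # point at a PETSc you have already built and configured yourself.
    export PETSC_DIR=/path/to/petsc
    export PETSC_ARCH=your-arch        # placeholder
    firedrake-install --honour-petsc-dir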
Hi,

See attached; I'm not sure what build options were used for that particular one, I had to get it from backup. Setting the compiler works for all the various sub-builds; it was just PyOP2 that could not find the MPI header files.

Bob
On 20 Mar 2019, at 11:54, Cregan, Bob <b.cregan@imperial.ac.uk> wrote:
See attached; I'm not sure what build options were used for that particular one, I had to get it from backup.
Setting the compiler works for all the various sub-builds; it was just PyOP2 that could not find the MPI header files.
Ah, thanks. It looks like we are not careful about how we control environment variables for the various sub-builds; cf. https://github.com/firedrakeproject/firedrake/issues/1396

Lawrence
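Until that issue is fixed, one possible manual workaround, on the assumption that the PyOP2 sub-build simply fails to inherit the compiler settings (whether its build honours CC from the environment is itself an assumption), is to export them before installing:

    # Assumption: the PyOP2 sub-build picks up CC/MPICC from the
    # environment if set; export them so every sub-build inherits them.
    export CC=mpicc
    export MPICC=mpicc
    firedrake-install --mpicc "$(which mpicc)" --mpicxx "$(which mpicxx)"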
Hi Bob,

I think we have now managed to fix this in firedrake-install. Can you try things out and let us know? You'll need to get a new version from the download page and then do:

    firedrake-install --your-args --mpicc /path/to/mpicc --mpicxx /path/to/mpicxx --mpif90 /path/to/mpif90 --mpiexec /path/to/mpiexec

This does two things:

1. Ensures that the compilers are set correctly when we install things.
2. Adds symlinks from firedrake/bin/{mpicc, mpicxx, ...} to the provided MPI compilers in the virtualenv (this way, in an activated virtualenv, the compilers all point at the right place).

Cheers,
Lawrence
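Once installed, the new symlinks can be checked from inside the activated virtualenv (the venv path below assumes the default install location):

    # The MPI wrappers on PATH should now resolve to symlinks in
    # firedrake/bin that point at the compilers given to the installer.
    source firedrake/bin/activate
    which mpicc                    # expect .../firedrake/bin/mpicc
    readlink -f "$(which mpicc)"   # expect the path passed via --mpicc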
Hi Lawrence,

The system is down now; I will test as soon as I can.

Bob
participants (2)
- Cregan, Bob
- Lawrence Mitchell