OK, in this case you don't even need to load the old OpenMPI anymore.
    
Does it work now?
    
This is what I have done: deleted Firedrake and reinstalled it with the command:
python firedrake-install --no-package-manager --disable-ssh
From: Homolya, Miklós <m.homolya14@imperial.ac.uk>
Sent: Tuesday, 30 August 2016 10:53:52
To: Floriane Gidel [RPG]; firedrake
Subject: Re: [firedrake] Saving txt files while running in parallel

OK, so in case of a system upgrade which brings new versions (not just bugfixes) of the C compiler, MPI library, etc., I suggest doing a fresh installation of Firedrake. Currently, this isn't a scenario that firedrake-update can handle correctly.
From: Floriane Gidel [RPG] <mmfg@leeds.ac.uk>
Sent: 30 August 2016 10:47:36
To: Homolya, Miklós; firedrake
Subject: RE: [firedrake] Saving txt files while running in parallel

Hi Miklós,
Basically I had installed Firedrake on my Linux machine and the programs were running correctly, but later on it stopped working. I asked IT and they said:
"It seems that Firedrake got broken after an update of one of its many dependencies. Firedrake requires an older version of OpenMPI to function correctly. You need to load the module mpi/compat-openmpi16-x86_64 to activate this older version."
With that module loaded I get no errors from MPI, but I do get other errors such as those I sent earlier (OSError, ValueError, ...), which I don't get on my laptop. It is not on a cluster.
From: Miklós Homolya <m.homolya14@imperial.ac.uk>
Sent: Tuesday, 30 August 2016 10:07:42
To: firedrake@imperial.ac.uk; Floriane Gidel [RPG]
Subject: Re: [firedrake] Saving txt files while running in parallel

Answers inlined below.
On 30/08/16 07:11, Floriane Gidel [RPG] wrote:
Well, what is the value of N?

mesh = UnitIntervalMesh(N)
raises the error
ValueError: Number of cells must be a positive integer
while the file runs correctly on my laptop with another version of Firedrake.
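For what it's worth, this error usually means N is not a plain positive Python int at that point, e.g. because it was read from a command line or a text file and is still a string. A minimal sketch of the failure mode (make_interval_mesh is a hypothetical stand-in that mimics the constructor's argument check; it is not Firedrake code):

```python
def make_interval_mesh(ncells):
    """Hypothetical stand-in for UnitIntervalMesh's argument validation."""
    if not isinstance(ncells, int) or ncells <= 0:
        raise ValueError("Number of cells must be a positive integer")
    return list(range(ncells))  # placeholder for the real mesh object

N = "50"  # e.g. parsed from sys.argv or a text file: still a string
try:
    make_interval_mesh(N)
except ValueError as err:
    print(err)  # -> Number of cells must be a positive integer

mesh = make_interval_mesh(int(N))  # converting to int first succeeds
```

So it is worth printing type(N) and its value just before the UnitIntervalMesh call.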
It's not correct to say that Firedrake requires a specific MPI implementation, or that it requires a specific version of OpenMPI. Firedrake, in principle at least, should work with any MPI implementation.

I did not encounter any issue while running the installation script, but I know that Firedrake requires a version of OpenMPI different from the one installed on my machine,
The install script happens to install OpenMPI on Ubuntu and Mac OS X, but that's just for convenience. If you have another MPI implementation installed, with the --no-package-manager option firedrake-install should just pick that up and use it.
so I load the required version before sourcing Firedrake. Maybe this is not enough for Firedrake to access the required version?

Below are the commands I used to load the module and install Firedrake:

module load mpi/compat-openmpi16-x86_64
python firedrake-install --no-package-manager --disable-ssh

and to source the Firedrake environment before running simulations:

module load mpi/compat-openmpi16-x86_64
source firedrake/bin/activate

Does Firedrake use the loaded version of OpenMPI with these commands, or is there something missing?

Firedrake uses whatever MPI implementation provides mpicc and similar commands. You can check e.g. what

$ mpicc -v

says.
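The same check can be done programmatically (a generic sketch, not Firedrake-specific): the first mpicc on the PATH is the one the build will pick up.

```python
import shutil

def active_mpicc():
    """Full path of the first mpicc found on PATH, or None if there is none."""
    return shutil.which("mpicc")

# After 'module load mpi/compat-openmpi16-x86_64' this should point into the
# compat-openmpi16 tree rather than at the system default compiler wrapper.
print(active_mpicc())
```

If this prints a path outside the loaded module's directory (or None), the module environment is not reaching the shell that runs Firedrake.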
It is important to use the same MPI when installing and when using Firedrake, but you seem to have done this correctly.
I wonder, though, why you loaded an OpenMPI implementation... If this is on a cluster, you should use whatever the "native" MPI of that cluster is.
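One way to keep the install-time and run-time MPI consistent is a small wrapper script; this is a sketch based only on the commands quoted above (the module name and venv path are taken from this thread and may differ on another machine):

```shell
#!/bin/sh
# firedrake-env.sh -- hypothetical wrapper: load the same MPI module that was
# active at install time, activate the Firedrake virtualenv, then run the
# given command inside that environment.
module load mpi/compat-openmpi16-x86_64
. firedrake/bin/activate
exec "$@"
```

Usage would be e.g. ./firedrake-env.sh python simulation.py, so the module can never be forgotten before sourcing Firedrake.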