Hi Miklós,

Basically I had installed Firedrake on my Linux machine and the programs were running correctly, but later on it stopped working. I asked IT and they said: "It seems that Firedrake got broken after an update of one of its many dependencies. Firedrake requires an older version of OpenMPI to function correctly. You need to load the module mpi/compat-openmpi16-x86_64 to activate this older version."

By doing this I get no error coming from MPI, but I do get other errors such as those I sent earlier (OSError, ValueError...), and I don't have them on my laptop.

It is not on a cluster.

________________________________
From: Miklós Homolya <m.homolya14@imperial.ac.uk>
Sent: Tuesday, 30 August 2016 10:07:42
To: firedrake@imperial.ac.uk; Floriane Gidel [RPG]
Subject: Re: [firedrake] Saving txt files while running in parallel

Answers inlined below.

On 30/08/16 07:11, Floriane Gidel [RPG] wrote:
> mesh = UnitIntervalMesh(N)
> raises the error "ValueError: Number of cells must be a positive integer", while the file runs correctly on my laptop with another version of Firedrake.

Well, what is the value of N?

> I did not encounter any issue while running the installation script, but I know that Firedrake requires a version of OpenMPI different from the one installed on my machine,

It's not correct to say that Firedrake requires a specific MPI implementation, or that it requires a specific version of OpenMPI. Firedrake should, in principle at least, work with any MPI implementation. The install script happens to install OpenMPI on Ubuntu and Mac OS X, but that is just for convenience. If you have another MPI implementation installed, firedrake-install with the --no-package-manager option should simply pick it up and use it.

> so I load the required version before sourcing Firedrake. Maybe this is not enough for Firedrake to access the required version? Below are the commands I used to load the module and install Firedrake:
>
>     module load mpi/compat-openmpi16-x86_64
>     python firedrake-install --no-package-manager --disable-ssh
>
> and to source the Firedrake environment before running simulations:
>
>     module load mpi/compat-openmpi16-x86_64
>     source firedrake/bin/activate
>
> Does Firedrake use the loaded version of OpenMPI with these commands, or is there something missing?

Firedrake uses whatever MPI implementation provides mpicc and similar commands. You can check, for example, what

    $ mpicc -v

says. It is important to use the same MPI when installing and when using Firedrake, but you seem to have done this correctly. I wonder, though, why you loaded an OpenMPI module at all... If this is on a cluster, you should use whatever the "native" MPI of that cluster is.
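
For reference, a minimal check (a sketch, assuming a bash shell and the environment-modules setup described above) to confirm which MPI the activated Firedrake environment actually picks up:

    # load the same MPI module that was loaded at install time, then activate Firedrake
    module load mpi/compat-openmpi16-x86_64
    source firedrake/bin/activate

    # confirm which mpicc is on the PATH; it should come from the loaded
    # compat-openmpi16 module, not from the system default (e.g. /usr/bin/mpicc)
    which mpicc
    mpicc -v

If "which mpicc" resolves to the system MPI rather than the loaded module, the install-time and run-time environments are using different MPIs, which is worth ruling out before chasing the OSError/ValueError any further.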