Hi Alessandro,

Thanks for your message. Can you give us some further information about the error you’re getting when trying to run with MPI?

What error are you seeing? Are you using Linux? If so, which distribution and version? Also, which MPI implementation and version are you using? I assume you've built Nektar++ against either OpenMPI or MPICH; if you could tell us which one, and which version, we can try to offer some further advice.
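In the meantime, a quick way to gather that information (assuming mpirun is on your PATH) is:

```shell
# Print the first line of the MPI launcher's version banner, if one is installed.
# OpenMPI and MPICH both support the --version flag on mpirun.
if command -v mpirun >/dev/null 2>&1; then
    mpirun --version | head -n 1
else
    echo "mpirun not found on PATH"
fi
```

If you paste that output, along with the exact error message mpirun prints when the solver fails, that will help us narrow things down.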

Kind regards,

Jeremy

On 15 Feb 2021, at 11:38, Alessandro Alberto de Lima <aalima@usp.br> wrote:

 
Hi.
My name is Alessandro. I'm from Brazil and a new user of Nektar++.
I work with Professor Bruno Carmo and Professor Julio Meneghini at the University of São Paulo.
I'm trying to simulate 3D flow around a cylinder. I compiled the serial version of Nektar++ on my PC with no problems, but the MPI build did not go so well.
I ran the 2D flow-around-a-cylinder case from the tutorial with no problems.
Then I compiled Nektar++ 5.0.1 with the options

cmake -DNEKTAR_BUILD_DEMOS=ON -DNEKTAR_BUILD_TESTS=ON -DNEKTAR_USE_FFTW=ON -DNEKTAR_USE_MPI=ON ..

I tried to run the 3D case described in the incns-taylor-green-vortex.pdf with the command:
mpirun --hostfile myHost -np 8 IncNavierStokesSolver TGV64_mesh.xml TGV64_conditions.xml

and 

mpirun -np 8 IncNavierStokesSolver TGV64_mesh.xml TGV64_conditions.xml

I tried both ways, and both result in MPI execution errors.
Can you send me more specific examples of how to compile, set up, and run 2.5D and 3D cases with periodic boundary conditions?
Thanks for your help.

Alessandro
_______________________________________________
Nektar-users mailing list
Nektar-users@imperial.ac.uk
https://mailman.ic.ac.uk/mailman/listinfo/nektar-users