Dear Nektar++ developers,
I am currently trying to run a CWIPI–Nektar++ coupled example on ARCHER2, in which a Python program (the CWIPI source side) sends data to the Nektar++ AcousticSolver to compute the APE. I would like to confirm whether my setup steps are correct, and to ask for advice from anyone who has successfully used CWIPI with Nektar++.
1. CWIPI side (Python)
CWIPI initialization is done as follows:

from mpi4py import MPI
from pycwp import pycwp  # import path may differ depending on the CWIPI install

comm = MPI.COMM_WORLD
code_name = "PyApp"
coupled_code_name = "NekApp"

intra = pycwp.init(comm, [code_name], True)

cpl = pycwp.Coupling(
    code_name,
    "LambFile",  # matches the coupling NAME in the Nektar++ XML
    coupled_code_name,
    # ... remaining Coupling constructor arguments omitted in this excerpt
)
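As a sanity check, something like the following could go right after pycwp.init (a sketch, assuming init returns one intra-communicator per local code name):

intra_comm = intra[0]
# With srun --ntasks=2 --multi-prog, the world size should be 2 and the
# PyApp intra-communicator should contain exactly this one rank.
print(f"[PyApp] world {comm.Get_rank()}/{comm.Get_size()}, "
      f"intra {intra_comm.Get_rank()}/{intra_comm.Get_size()}", flush=True)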
2. Nektar++ side
In the Nektar++ XML condition file (Condition_try.xml), the coupling section is:
<COUPLING NAME="LambFile" TYPE="Cwipi">
    <I PROPERTY="RemoteName"       VALUE="PyApp"/>
    <I PROPERTY="ReceiveSteps"     VALUE="1"/>
    <!-- the source variables are set to 0 in the source function -->
    <I PROPERTY="ReceiveVariables" VALUE="F_0_p,F_0_u,F_0_v"/>
    <I PROPERTY="NotLocMethod"     VALUE="keep"/>
</COUPLING>
Nektar++ is started with:
AcousticSolver --cwipi 'NekApp' mesh.xml Condition.xml --verbose
so the local code name is "NekApp", matching the coupled_code_name in the Python script.
3. ARCHER2 run setup
I created two shell scripts:
run_cwipi.sh:
export MPICH_SMP_SINGLE_COPY_MODE=CMA
source ...
exec python -u cwipi_source.py
run_nek.sh (Nektar++ side):
export ...
exec AcousticSolver --cwipi 'NekApp' mesh.xml Condition.xml --verbose
and a multi-prog configuration file, multi.conf:
0 ./run_cwipi.sh
1 ./run_nek.sh
The run command on ARCHER2 is:
srun --ntasks=2 --multi-prog multi.conf
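For completeness, the batch wrapper around this command looks roughly like the sketch below (the partition, QoS, and account values are placeholders based on the standard ARCHER2 SLURM setup, not my exact script):

#!/bin/bash
#SBATCH --job-name=cwipi-nektar
#SBATCH --nodes=1
#SBATCH --ntasks=2
#SBATCH --time=00:20:00
#SBATCH --partition=standard   # placeholder: typical ARCHER2 partition
#SBATCH --qos=standard         # placeholder: typical ARCHER2 QoS
#SBATCH --account=<budget>     # placeholder: project budget code

# Rank 0 runs the Python source side, rank 1 runs Nektar++;
# each run_*.sh script sets its own environment before exec'ing the code.
srun --ntasks=2 --multi-prog multi.conf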
4. The issue
When running, both sides print the CWIPI initialization banner:
cwipi 1.3.0 initializing
------------------------
but neither side progresses past this point, so CWIPI initialization never appears to complete.
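To rule out a launcher problem, a minimal probe could be swapped into both lines of multi.conf in place of the real codes (a hypothetical stand-in script, not part of my actual setup) to confirm that srun --multi-prog gives the two programs a shared MPI_COMM_WORLD of size 2:

# probe_world.py -- hypothetical stand-in for either line of multi.conf.
# If the launcher sets up one shared world communicator, both ranks
# report "of 2"; two separate size-1 worlds would explain a CWIPI hang.
from mpi4py import MPI

comm = MPI.COMM_WORLD
print(f"rank {comm.Get_rank()} of {comm.Get_size()}", flush=True)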
5. Request for advice
I would like to confirm:
- Is my CWIPI–Nektar++ code-name mapping correct (PyApp ↔ NekApp, "LambFile" coupling name)?
- Is srun --multi-prog with a shared MPI_COMM_WORLD the recommended way for intra-communicator mode?
- Are there known restrictions in CWIPI–Nektar++ coupling when each code only has one MPI rank?
- Could there be additional configuration required on the Nektar++ side for CWIPI to complete initialization?
Any guidance or examples from someone who has successfully run CWIPI with Nektar++ would be very helpful.
Thank you in advance for your time and assistance.
Best regards,
Dao