Bug in IncNavierStokes solver terminal output for SVV with MPI
Hi Jacques, thank you for your message. I'm sorry, I sent my last answer with some additional info and the files directly to Henrik (see below). The files are attached to this message as txt. Thanks and all the best Alex
---------------------------------------------------------------
Hello Henrik,
thank you very much for your answer and for pointing out the relevant routines; I'll have a look! Attached to this mail you'll find the necessary files (mesh + session).
Please note that no Homo1D-SVV option is included in the SOLVERINFO, but the respective parameters are set under PARAMETERS. I tested it with a containerized Nektar++ 5.5.0 on my local machine, with Nektar++ 5.5.0 on our HPC cluster, and with a local installation of Nektar++ 5.3.0, and the issue always came up. Sometimes you have to restart it several times before the terminal output changes, but most of the time incorrect output is shown (Homo1D-SVV activated), with the Homo1D-SVV parameter values taken from the session file, even though the option is not included in the SOLVERINFO.
Just for the sake of completeness: I invoke the solver with mpirun -n 16 IncNavierStokesSolver -f mesh.xml session.xml
All the best Alex
Sent securely with [Proton Mail](https://proton.me/).
Henrik Wüstenberg <henrik.wuestenberg@hotmail.de> wrote on Friday, 30 August 2024 at 10:41 a.m.:
Hi Alex,
I have had a quick look; the output you are reporting is generated by a call to EquationSystem::PrintSummary, which then calls v_GenerateSummary. This function is only called if rank == 0, so I am not sure why you would see different behaviour between MPI and serial execution.
If you are keen to take a look yourself: the parent function is in library/SolverUtils/EquationSystem.cpp::GenerateSummary, and for the VelocityCorrectionScheme you would have virtual calls from child to parent in the following files (see the sketch after this list):
- solvers/IncNavierStokesSolver/VelocityCorrectionScheme.cpp
- (library/SolverUtils/AdvectionSystem.cpp) - no definition here
- library/SolverUtils/UnsteadySystem.cpp
- library/SolverUtils/EquationSystem.cpp
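For orientation, here is a compilable toy sketch of that dispatch pattern. It is not the real Nektar++ code (the actual v_GenerateSummary overrides take a SummaryList argument and live in the files listed above), but it shows how the rank-0 guard and the child-to-parent virtual calls fit together:

```cpp
#include <iostream>

// Toy sketch of the child-to-parent v_GenerateSummary chain
// (simplified; the real Nektar++ signatures take a SummaryList&).
class EquationSystem
{
public:
    virtual ~EquationSystem() = default;

    // Public entry point; the summary is only printed on rank 0.
    void PrintSummary(int rank)
    {
        if (rank == 0)
        {
            v_GenerateSummary();
        }
    }

protected:
    virtual void v_GenerateSummary()
    {
        std::cout << "EquationType, Session Name, ...\n";
    }
};

class UnsteadySystem : public EquationSystem
{
protected:
    void v_GenerateSummary() override
    {
        EquationSystem::v_GenerateSummary(); // parent entries first
        std::cout << "Time Step, No. of Steps, ...\n";
    }
};

class VelocityCorrectionScheme : public UnsteadySystem
{
protected:
    void v_GenerateSummary() override
    {
        UnsteadySystem::v_GenerateSummary();
        std::cout << "Splitting Scheme, Smoothing-SpecHP, ...\n";
        // The Smoothing-Homo1D line is only emitted when the
        // corresponding flag is set.
    }
};

int main()
{
    VelocityCorrectionScheme scheme;
    scheme.PrintSummary(0); // rank 0 prints the full summary
    return 0;
}
```

Since only rank 0 prints, the line appearing and disappearing between runs suggests the flag itself is not deterministic on rank 0.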
Could you share the session file for your case? (without the .xml ending, as Outlook filters those)
Best wishes, Henrik
---------------------------------------------------------------
From: nektar-users-bounces@imperial.ac.uk <nektar-users-bounces@imperial.ac.uk> on behalf of Alexander Schukmann <alexander.schukmann@protonmail.com> Sent: Tuesday, 27 August 2024 18:02 To: nektar-users@imperial.ac.uk <nektar-users@imperial.ac.uk> Subject: [Nektar-users] Bug in IncNavierStokes solver terminal output for SVV with MPI
Hello everyone,
I noticed some odd behavior of the IncNavierStokesSolver's terminal output regarding SVV when using MPI for Quasi-3D cases. When I try to use SVV in the spectral/hp planes only, i.e. without SVV along the homogeneous direction, the terminal output sometimes includes the SVV-Homo1D info, even though it's not specified under the SOLVERINFO tag in the session file! When I re-run the same case, it randomly disappears, as you can see here for two consecutive runs with identical settings:
nektar@0d6fdc64e227:~/host_pwd/run$ mpirun -n 16 IncNavierStokesSolver mesh.xml session.xml
=======================================================================
EquationType: UnsteadyNavierStokes
Session Name: CylReD3900_quad_coarse2_Pgeo3
Spatial Dim.: 3
Max SEM Exp. Order: 4
Num. Processes: 16
Quasi-3D: Homogeneous in z-direction
Expansion Dim.: 3
Num. Hom. Modes (z): 64
Hom. length (LZ): 3.14159
FFT Type: FFTW
Projection Type: Continuous Galerkin
Advect. advancement: explicit
Diffuse. advancement: implicit
Time Step: 0.004
No. of Steps: 1
Checkpoints (steps): 1
Integration Type: IMEX
Splitting Scheme: Velocity correction (strong press. form)
Dealiasing: Homogeneous1D + spectral/hp
Smoothing-SpecHP: SVV (spectral/hp DG Kernel (diff coeff = 0.10000000000000001*Uh/p))
=======================================================================
nektar@0d6fdc64e227:~/host_pwd/run$ mpirun -n 16 IncNavierStokesSolver -f mesh.xml session.xml
=======================================================================
EquationType: UnsteadyNavierStokes
Session Name: CylReD3900_quad_coarse2_Pgeo3
Spatial Dim.: 3
Max SEM Exp. Order: 4
Num. Processes: 16
Quasi-3D: Homogeneous in z-direction
Expansion Dim.: 3
Num. Hom. Modes (z): 64
Hom. length (LZ): 3.14159
FFT Type: FFTW
Projection Type: Continuous Galerkin
Advect. advancement: explicit
Diffuse. advancement: implicit
Time Step: 0.004
No. of Steps: 1
Checkpoints (steps): 1
Integration Type: IMEX
Splitting Scheme: Velocity correction (strong press. form)
Dealiasing: Homogeneous1D + spectral/hp
Smoothing-SpecHP: SVV (spectral/hp DG Kernel (diff coeff = 0.10000000000000001*Uh/p))
Smoothing-Homo1D: SVV (Homogeneous1D - Exp Kernel(cut-off = 0.75, diff coeff = 0.10000000000000001))
=======================================================================
The first run does not output the "Smoothing-Homo1D" info, while the second one includes it, which makes it impossible to know whether the feature is activated or not. If no parameters for the Homo1D-SVV are specified under the PARAMETERS tag, it uses the SVV parameters that are set for the spectral/hp SVV. Otherwise those values are overwritten, even though, as I said before, the feature is turned off under the SOLVERINFO tag.
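To make that parameter behaviour concrete, here is a small self-contained sketch of the fallback Alex describes. The ToySession class is an illustration, not the Nektar++ session API, and the parameter names are assumptions based on the thread's context: absent Homo1D values fall back to the spectral/hp ones, and present values take over even when the feature was never enabled under SOLVERINFO.

```cpp
#include <iostream>
#include <map>
#include <string>

// Toy stand-in for a session-file parameter lookup (not the Nektar++ API).
struct ToySession
{
    std::map<std::string, double> params;

    // Return the named parameter if present, otherwise a default value.
    double LoadParameter(const std::string &name, double def) const
    {
        auto it = params.find(name);
        return it != params.end() ? it->second : def;
    }
};

int main()
{
    ToySession session;
    session.params["SVVDiffCoeff"] = 0.1; // spectral/hp SVV coefficient

    // Homo1D value falls back to the spectral/hp one when absent...
    double homo1d = session.LoadParameter(
        "SVVDiffCoeffHomo1D", session.LoadParameter("SVVDiffCoeff", 0.1));
    std::cout << "Homo1D diff coeff: " << homo1d << "\n"; // prints 0.1

    // ...and is overridden when present, even if the feature itself was
    // never switched on under SOLVERINFO.
    session.params["SVVDiffCoeffHomo1D"] = 0.05;
    homo1d = session.LoadParameter("SVVDiffCoeffHomo1D", homo1d);
    std::cout << "Homo1D diff coeff: " << homo1d << "\n"; // prints 0.05
    return 0;
}
```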
Running on one CPU shows the expected behavior.
So it would be interesting to know whether this only concerns the terminal output or if the feature really is activated despite not being defined by the user.
All the best Alex
Sent securely with [Proton Mail](https://proton.me/).
Hello Jacques, no worries! Thank you very much for the clarification and the fix. All the best Alex

On Wednesday, 18 September 2024 at 17:17, Xing, Jacques <j.xing@imperial.ac.uk> wrote:
Hi Alex,
Sorry for the delayed response. I think that the problem might be due to an inappropriate initialization of the parameter m_useHomo1DSpecVanVisc, which causes inconsistent behaviour when using MPI. I have a proposed fix here: https://gitlab.nektar.info/nektar/nektar/-/merge_requests/1880
You can fetch this specific MR using
git fetch origin merge-requests/1880/head:fix-svv
git checkout fix-svv
Cheers, Jacques
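To illustrate the failure mode Jacques describes, here is a small self-contained sketch. It is a toy class, not the actual Nektar++ implementation; only the member name m_useHomo1DSpecVanVisc is taken from the real code. A bool member that is assigned only when the option is enabled, and otherwise never written, holds an indeterminate value, so each MPI rank (a separate process with its own memory contents) may see a different result from run to run:

```cpp
#include <iostream>

// Toy sketch (not the actual Nektar++ class) of the bug: a flag that is
// only written when the session file enables the option. Reading it when
// the option is absent is undefined behaviour, so the summary on rank 0
// can differ between otherwise identical runs.
class SchemeSketch
{
public:
    bool m_useHomo1DSpecVanVisc; // BUG: no initializer
    // bool m_useHomo1DSpecVanVisc = false; // FIX: in-class initializer

    void InitFromSession(bool optionPresentInSolverInfo)
    {
        if (optionPresentInSolverInfo)
        {
            m_useHomo1DSpecVanVisc = true;
        }
        // If the option is absent, the member is never written.
    }

    void PrintSummary() const
    {
        if (m_useHomo1DSpecVanVisc) // reads indeterminate memory
        {
            std::cout << "Smoothing-Homo1D: SVV ...\n";
        }
    }
};

int main()
{
    SchemeSketch scheme;
    scheme.InitFromSession(false); // option not in SOLVERINFO
    scheme.PrintSummary();         // may or may not print, run to run
    return 0;
}
```

With the commented-out in-class initializer instead, the flag always has a defined value and the output matches the serial behaviour Alex observed, which is consistent with the fix proposed in the MR linked above.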
participants (2)
- Alexander Schukmann
- Xing, Jacques