mpirun error when using steady-state solver
Dear all,

I am trying to compute a base flow with the steady-state solver using selective frequency damping. Recently, the error shown in the attached screenshot was raised when I tried to run the IncNavierStokesSolver in parallel (mpirun -np) for my new mesh-refined case. However, this error has never occurred before when:
1. running the same session file/settings of this mesh-refined case in serial;
2. running other cases in parallel, all of which have much coarser meshes.

So I would like to know what triggers this kind of problem and how to deal with it. Thanks in advance for the help.

Best regards,
Mona
Hi 依然,

This error occurs in the global linear system solver (GlobalSysSoln), which solves the global matrix of the Poisson or the Helmholtz equation. If you don't set GlobalSysSoln in SOLVERINFO, the defaults are used: in serial, the default GlobalSysSoln is a direct solver, DirectMultiLevelStaticCond; in parallel, the default is a conjugate-gradient (CG) iterative method with a maximum of 5000 iterations. In your case, the CG iteration evidently did not converge within 5000 iterations.

If your simulation is 2D or 3DH1D (with one homogeneous direction), you can use a parallel direct solver:

    <I PROPERTY="GlobalSysSoln" VALUE="XxtMultiLevelStaticCond" />

If your simulation is 3D, you can add a preconditioner to accelerate the CG iteration:

    <GLOBALSYSSOLNINFO>
        <V VAR="u,v,w">
            <I PROPERTY="GlobalSysSoln" VALUE="IterativeStaticCond" />
            <I PROPERTY="Preconditioner" VALUE="LowEnergyBlock"/>
            <I PROPERTY="SuccessiveRHS" VALUE="8" />
            <I PROPERTY="IterativeSolverTolerance" VALUE="1e-3"/>
        </V>
        <V VAR="p">
            <I PROPERTY="GlobalSysSoln" VALUE="IterativeStaticCond" />
            <I PROPERTY="Preconditioner" VALUE="FullLinearSpaceWithLowEnergyBlock"/>
            <I PROPERTY="SuccessiveRHS" VALUE="8" />
            <I PROPERTY="IterativeSolverTolerance" VALUE="1e-3"/>
        </V>
    </GLOBALSYSSOLNINFO>

Cheers,
Ankang
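[Editor's note: a minimal sketch of where the GlobalSysSoln option sits in a Nektar++ session file, for readers less familiar with the format. The file name session.xml, the process count, and the EQTYPE/SolverType/Driver entries shown here are illustrative assumptions for an SFD steady-state run, not taken from Mona's case.]

    <NEKTAR>
        <CONDITIONS>
            <SOLVERINFO>
                <!-- Incompressible Navier-Stokes with the SFD steady-state driver (illustrative) -->
                <I PROPERTY="EQTYPE"        VALUE="UnsteadyNavierStokes" />
                <I PROPERTY="SolverType"    VALUE="VelocityCorrectionScheme" />
                <I PROPERTY="Driver"        VALUE="SteadyState" />
                <!-- Parallel direct solver suggested above for 2D / 3DH1D cases -->
                <I PROPERTY="GlobalSysSoln" VALUE="XxtMultiLevelStaticCond" />
            </SOLVERINFO>
            <!-- For a 3D case, omit the GlobalSysSoln line above and instead place the
                 GLOBALSYSSOLNINFO block from the reply here, alongside SOLVERINFO. -->
        </CONDITIONS>
    </NEKTAR>

The case would then be run in parallel with, e.g., mpirun -np 8 IncNavierStokesSolver session.xml (the process count is illustrative).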
Hi Ankang,

Thank you very much! My problem has been solved and I benefited a lot from your kind help. Thank you!

Best regards,
Mona
participants (2)
- Gao, Ankang
- 李 依然