VCSMapping and parallel execution
Dear all,

I am a new user. When I tried the VCSMapping test case (nektar/solvers/IncNavierStokesSolver/Tests/CylFlow_Mov_mapping.xml), it failed to run in parallel with the following error:

    mpirun -np 8 IncNavierStokesSolver CylFlow_Mov_mapping.xml
    Fatal : Level 0 assertion violation
    A parallel solver must be used when run in parallel.

I had to run it in serial to get the final result. However, the mesh grid and the position of the cylinder remain the same as in the initial field, even when I try different values of "NumSteps".

I just followed the user guide and used

    FieldConvert CylFlow_Mov_mapping.xml CylFlow_Mov_mapping.fld cyl.vtu

to get the converted result. My guess is that CylFlow_Mov_mapping.xml only contains the initial mesh data and CylFlow_Mov_mapping.fld only contains the field data, i.e. the deformed mesh information is missed by this command, so I think this is a post-processing problem.

Could you please help me find out what I have missed and fix this? I would appreciate your help.

Kind regards,
Hanbo
Hi Hanbo,

You are getting this error in parallel because this test case uses the option

    <I PROPERTY="GlobalSysSoln" VALUE="DirectMultiLevelStaticCond"/>

and the direct solver only works in serial. You can run it in parallel by changing this option to IterativeStaticCond or XxtMultiLevelStaticCond.

To visualize the movement, you have to use the mapping module during post-processing, i.e.

    FieldConvert -m mapping CylFlow_Mov_mapping.xml CylFlow_Mov_mapping.fld cyl.vtu

This way the vtu file will have additional fields xCoord and yCoord containing the deformed x and y coordinates. If you are using ParaView, you can then create a Calculator with the expression "xCoord*iHat + yCoord*jHat + coordsZ*kHat" and activate the "Coordinate results" option to visualize the deformed geometry.

Cheers,
Douglas
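In case it helps, a minimal sketch of that change, assuming the property lives in the SOLVERINFO block of CylFlow_Mov_mapping.xml (the surrounding entries in your copy of the file will differ):

    <SOLVERINFO>
        <!-- other properties left unchanged -->
        <!-- replace the serial direct solver with a parallel-capable one -->
        <I PROPERTY="GlobalSysSoln" VALUE="IterativeStaticCond" />
    </SOLVERINFO>

XxtMultiLevelStaticCond can be substituted as the VALUE in exactly the same way.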
Dear all,

I am running the timestepper of the IncNavierStokesSolver for a Quasi-3D problem in parallel. As I mentioned in a previous email, the default combination of linear solver and preconditioner (allegedly IterativeStaticCond) seems to be producing non-physical oscillations in the pressure field.

I have tried to follow the suggestion that non-default settings (different for the velocity and pressure fields), like those given as an example in the user guide, might improve the situation:

    <GLOBALSYSSOLNINFO>
        <V VAR="u,v,w">
            <I PROPERTY="GlobalSysSoln"            VALUE="IterativeStaticCond" />
            <I PROPERTY="Preconditioner"           VALUE="LowEnergyBlock"/>
            <I PROPERTY="IterativeSolverTolerance" VALUE="1e-8"/>
        </V>
        <V VAR="p">
            <I PROPERTY="GlobalSysSoln"            VALUE="IterativeStaticCond" />
            <I PROPERTY="Preconditioner"           VALUE="FullLinearSpaceWithLowEnergyBlock"/>
            <I PROPERTY="IterativeSolverTolerance" VALUE="1e-6"/>
        </V>
    </GLOBALSYSSOLNINFO>

However, these preconditioners seem incompatible with Quasi-3D discretisations and the solver crashes with an error. I have tried different combinations, but the only one that seems to work is Diagonal for "u,v,w" and FullLinearSpaceWithDiagonal for "p" (a sketch is included below).

Are these the defaults when a mere IterativeStaticCond is specified in SOLVERINFO, or when nothing is said at all? If so, what are the default values for IterativeSolverTolerance? Are these tolerances the same for the velocity and pressure fields? Is there any other preconditioning option that I can use with Quasi-3D problems?

Cheers
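For concreteness, the only combination I have managed to run looks like the following sketch (same structure as the block above; the tolerance values are simply carried over from that block and may well not be the right choice):

    <GLOBALSYSSOLNINFO>
        <V VAR="u,v,w">
            <I PROPERTY="GlobalSysSoln"            VALUE="IterativeStaticCond" />
            <I PROPERTY="Preconditioner"           VALUE="Diagonal"/>
            <I PROPERTY="IterativeSolverTolerance" VALUE="1e-8"/>
        </V>
        <V VAR="p">
            <I PROPERTY="GlobalSysSoln"            VALUE="IterativeStaticCond" />
            <I PROPERTY="Preconditioner"           VALUE="FullLinearSpaceWithDiagonal"/>
            <I PROPERTY="IterativeSolverTolerance" VALUE="1e-6"/>
        </V>
    </GLOBALSYSSOLNINFO>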
Hi,

That’s an interesting observation. I’m not sure I can claim that I’ve seen the same behaviour, but occasionally the iterative solver has been a bit unstable (particularly when e.g. using random noise to start up a transitional problem in the quasi-3D case). There are a few things you might try:

- Drop the solver tolerance. The default is 1e-9, but maybe something is going wrong with the normalisation we use on the pressure field, which is making the pressure field converge to the wrong values.

- Try another solver type. Another option in parallel is XxtStaticCond or XxtMultiLevelStaticCond, both of which are parallel direct solvers and avoid iterative solvers completely. These could be quite a bit slower, depending on the size of your mesh.

- Alternatively, if your mesh isn’t too big, you can set the 'npz' command-line option to the same number of processors as your parallel simulation. In this case each plane will lie on a single processor, and you can then use a serial direct solver such as DirectMultiLevelStaticCond (a sketch follows this message).

- If none of these solve the problem, I’d also advise looking at the mesh/boundary conditions/etc. to make sure nothing has been set incorrectly.

Hope this helps.

Thanks,
Dave
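To make the npz suggestion concrete, a rough sketch assuming the flag is spelled --npz, 8 MPI processes, and a session file named session.xml (the names and counts are placeholders for your own case):

    mpirun -np 8 IncNavierStokesSolver --npz 8 session.xml

together with the serial direct solver selected in the session file, for example

    <I PROPERTY="GlobalSysSoln" VALUE="DirectMultiLevelStaticCond" />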
participants (4)
- David Moxey
- Douglas Serson
- F Mellibovsky
- Hanbo JIANG