Re: [Nektar-users] forces computation in parallel computations
Hi,

I have tried the preconditioner choices you suggested, but they do not seem to work for quasi-3D simulations (I am using Fourier expansions in one direction). I have tried FullLinearSpace instead of LowEnergyBlock, but Nektar exits with the same error message. Is there any preconditioner, other than Diagonal, that I can use?

By the way, I have the impression (I might be wrong) that the problem with the force oscillations in parallel computations could be related to the data from all nodes not being available when the pressure forces are computed. Could it be that the MPI communication is missing a "wait" statement and that the forces are computed from incomplete data?

Cheers

On 13/01/18 17:42, Spencer Sherwin wrote:
Hi,
When running in parallel the code uses an iterative solver, whereas in serial we use a direct multi-level static condensation technique. If nothing else is specified, the default preconditioner for the iterative solver is a diagonal preconditioner, which is often not very effective and, for a complex mesh, can lead to pressure fluctuations.
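For reference, the global system solver can also be selected explicitly in the SOLVERINFO section of the session file. A minimal sketch follows (the property values shown are standard options; please check the user guide for the full list and for which combinations are valid in parallel):

    <SOLVERINFO>
      <!-- Serial default: direct multi-level static condensation -->
      <I PROPERTY="GlobalSysSoln" VALUE="DirectMultiLevelStaticCond" />
      <!-- In parallel an iterative solver is used instead, e.g. -->
      <!-- <I PROPERTY="GlobalSysSoln" VALUE="IterativeStaticCond" /> -->
    </SOLVERINFO>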
From what has been stated in the thread below, the likely explanation is that adding the GlobalSysSolnInfo section, as suggested in the user guide, selects a more advanced preconditioner; this gives a more robust pressure solution and hence less oscillatory forces.
Cheers Spencer
On 12 Jan 2018, at 09:18, Feifei Tong <feifei.tong@uwa.edu.au> wrote:
Hi Mellibovsky,
I am not an expert, but the same issue bothered me for a while. I finally managed to solve it by adding a <GLOBALSYSSOLNINFO> section to the XML session file (page 31, User Guide 4.4.0); a sketch of such a section is given below. This not only smooths the force signal but also appears to speed up the simulation by an order of magnitude. Hopefully someone can give a good explanation of why.
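Something along these lines (a sketch modelled on the user-guide example; the variable names, preconditioner choices and tolerances are illustrative and should be adapted to your own case):

    <GLOBALSYSSOLNINFO>
      <V VAR="u,v,w">
        <I PROPERTY="GlobalSysSoln"            VALUE="IterativeStaticCond" />
        <I PROPERTY="Preconditioner"           VALUE="LowEnergyBlock" />
        <I PROPERTY="IterativeSolverTolerance" VALUE="1e-8" />
      </V>
      <V VAR="p">
        <I PROPERTY="GlobalSysSoln"            VALUE="IterativeStaticCond" />
        <I PROPERTY="Preconditioner"           VALUE="FullLinearSpaceWithLowEnergyBlock" />
        <I PROPERTY="IterativeSolverTolerance" VALUE="1e-6" />
      </V>
    </GLOBALSYSSOLNINFO>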
Kind regards, Feifei
-----Original Message-----
From: nektar-users-bounces@imperial.ac.uk On Behalf Of F Mellibovsky
Sent: Friday, 12 January 2018 4:47 PM
To: nektar-users
Subject: [Nektar-users] forces computation in parallel computations
Dear all,
When running in parallel (72 cores), aggregate quantities such as body forces (particularly pressure forces) seem to include a certain degree of low-amplitude randomness in their evolution. The high-frequency random oscillations appear to be on the order of the sampling period, which suggests some kind of issue with gathering data from all parallel processes when the aggregate quantities are computed.
Is there some precaution that must be taken at compilation time, or in setting up the cluster parallel environment, to avoid this issue?
I attach one such force time series to illustrate this. The viscous forces seem to be fine, while the pressure (and total) forces oscillate at high frequency.
Cheers

_______________________________________________
Nektar-users mailing list
Nektar-users@imperial.ac.uk
https://mailman.ic.ac.uk/mailman/listinfo/nektar-users
Spencer Sherwin FREng, FRAeS
Head, Aerodynamics
Professor of Computational Fluid Mechanics
Department of Aeronautics, Imperial College London
South Kensington Campus, London, SW7 2AZ, UK
s.sherwin@imperial.ac.uk
+44 (0)20 7594 5052
http://www.imperial.ac.uk/people/s.sherwin/