Hi Alex,

Thanks for your email. Unfortunately, this issue does exist when AWGN initial conditions are used in parallel simulations.

I am currently working on a fix for this, which seems to come down to enforcing C0 continuity at the partition boundaries.

A workaround is to generate the AWGN initial condition in serial (-np 1) by setting TFinal = 0; let's say the solution field is written to solution.fld.
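
For example, the initialisation session could contain something along these lines (just a sketch: the u,v,w,p variable set, the base-flow value and the noise variance are placeholders for your actual setup; awgn() is the expression function for additive white Gaussian noise):

    <PARAMETERS>
        <P> TFinal = 0 </P>   <!-- no time stepping, only write the initial field -->
        ...
    </PARAMETERS>

    <FUNCTION NAME="InitialConditions">
        <E VAR="u" VALUE="1.0 + awgn(1e-3)" />
        <E VAR="v" VALUE="awgn(1e-3)" />
        <E VAR="w" VALUE="awgn(1e-3)" />
        <E VAR="p" VALUE="0" />
    </FUNCTION>

Running this in serial, e.g.

    mpirun -np 1 IncNavierStokesSolver solution.xml

writes the field to solution.fld (the output is named after the session file).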

Then start the parallel run from solution.fld.
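
To do that, point the InitialConditions function of your parallel session at the field file instead of the awgn() expressions, e.g. (the variable list, session name and processor count are again placeholders):

    <FUNCTION NAME="InitialConditions">
        <F VAR="u,v,w,p" FILE="solution.fld" />
    </FUNCTION>

    mpirun -np 8 IncNavierStokesSolver run.xml

Since the noise is then generated only once, in serial, the parallel run should no longer pick up inconsistent values at the partition boundaries.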

Cheers,
Chi Hin



From: nektar-users-bounces@imperial.ac.uk <nektar-users-bounces@imperial.ac.uk> on behalf of Alexander Schukmann <alexander.schukmann@protonmail.com>
Sent: 25 March 2024 10:07
To: nektar-users <nektar-users@imperial.ac.uk>
Subject: [Nektar-users] Problem with MPI + Quasi-3D + AWGN initialization for ICNS solver

Hello everybody,

I am currently running parallel quasi-3D (Hom1D) simulations with the incompressible Navier-Stokes solver in Nektar++ version 5.3.0. When the velocity field is initialized with additive white Gaussian noise in the homogeneous z-direction, the data transfer at the partition boundaries appears to be faulty, as can be seen here:

[attached image: g853.png]

I don't know whether this problem still exists in the current Nektar++ version, but I couldn't find anything about it in the user archives and therefore wanted to draw attention to it.

All the best,
Alex