Hi Fabian,

Thanks for getting in touch and apologies for the slow response. I have a few thoughts that might help.

- I would say that from your pictures the meshes look very coarse, so you could definitely try some additional refinement.

- Another source of instability can arise through the default quadrature points being used for inner products etc. You might find that using a more consistent integration helps, particularly if the grid is not well resolved. Depending on your mesh and setup, you can use an expansion tag like:

<E COMPOSITE="C[0]" BASISTYPE="Modified_A,Modified_A" NUMMODES="3,3" POINTSTYPE="GaussLobattoLegendre,GaussLobattoLegendre" NUMPOINTS="6,6" FIELDS="rho,rhou,rhov,E" />

Note that this would use a polynomial order of 2 (NUMMODES = 3) whilst using double the number of grid points for integration (NUMPOINTS = 6); the default is one extra (NUMPOINTS = 4), which is generally sufficient but which, in the case of marginal resolution, may lead to significant aliasing error. It will therefore be more expensive but may be more robust.

- I am not sure our CFL estimate for the compressible solver is all that great, so I would probably stick to a constant timestep, at least in the short term, to investigate the reason for this instability. This is something we need to work on.

- You should be aware that for these explicit schemes, larger polynomial orders have a greater impact on the size of the maximum stable timestep, so it is perhaps best to limit yourself to p = 3-5; p = 7 might be quite restrictive on the timestep. It is probably better to try grid refinement rather than increasing the polynomial order.

If none of this helps we can also see if Gianmarco has some input, since he ran these cases extensively during his PhD and encountered many similar problems!

Cheers,
Dave
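For reference, switching to a constant timestep in the session file just means replacing the CFL-based control with an explicit TimeStep parameter. A minimal sketch of the `<PARAMETERS>` block (the numerical values here are placeholders for illustration, not a recommendation):

```xml
<PARAMETERS>
    <!-- Remove any CFL-based control, e.g. <P> CFL = 0.9 </P>,
         and set an explicit fixed step instead: -->
    <P> TimeStep      = 1e-5   </P>  <!-- placeholder; choose based on your mesh -->
    <P> NumSteps      = 100000 </P>
    <P> IO_CheckSteps = 1000   </P>
    <P> IO_InfoSteps  = 100    </P>
</PARAMETERS>
```

Starting from a deliberately small fixed step and increasing it until the instability reappears is a simple way to separate a timestep problem from a resolution problem.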
On 2 Dec 2016, at 09:53, Selbach, Fabian <fabian.selbach@student.uni-siegen.de> wrote:
Hi Douglas, Dear users,
thanks for your explanation of the EulerADCFE.
Your hint to decrease the CFL does not help to stabilize the simulation. Decreasing the time step myself to a very small value (corresponding to CFL=0.4) and using another time integration method also makes no difference; the simulation terminates at the same time. However, I observed that using fewer CPUs for the parallel simulation helps it remain stable for longer, but eventually it terminates too. Could this be a problem due to the parallelization?
I will try to generate a new and better grid, since this is the last idea I have.
Does anybody else have an idea what could cause these oscillations?
Thanks in advance!
Best regards
Fabian From: Serson, Douglas [d.serson14@imperial.ac.uk] Sent: Wednesday, 30 November 2016 15:17 To: Selbach, Fabian; nektar-users Subject: Re: [Nektar-users] Nonphysical terminated simulation - CompressibleFlowSolver
Hi Fabian,
I don't know what is causing these oscillations (maybe you need a lower CFL), but regarding your other question: EulerADCFE allows using artificial diffusion for shock capture, while EulerCFE is just the standard Euler solver. For EulerADCFE you need to set the "ShockCaptureType" property (possible values are "Smooth" and "NonSmooth") and the parameters described in section 9.4.1 of the user guide.
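For completeness, the shock-capture settings live in the session file's `SOLVERINFO` block. A minimal sketch is below; the additional parameters each variant needs (e.g. the sensor/viscosity constants for the "Smooth" option) should be taken from section 9.4.1 of the user guide rather than from this example:

```xml
<SOLVERINFO>
    <I PROPERTY="EQTYPE"           VALUE="EulerADCFE" />
    <!-- Possible values: "Smooth" or "NonSmooth" -->
    <I PROPERTY="ShockCaptureType" VALUE="NonSmooth"  />
</SOLVERINFO>
```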
In the most recent version of the master branch (this was changed just a few days ago), it is also possible to use shock capture directly with EulerCFE. In this newer version, EulerCFE and EulerADCFE are the same thing, with the second name kept for backwards compatibility.
Cheers, Douglas From: nektar-users-bounces@imperial.ac.uk <nektar-users-bounces@imperial.ac.uk> on behalf of Selbach, Fabian <fabian.selbach@student.uni-siegen.de> Sent: 30 November 2016 13:21:42 To: nektar-users Subject: [Nektar-users] Nonphysical terminated simulation - CompressibleFlowSolver
Dear users,
I hope you can help one more time. I am getting nonphysical oscillations during my airfoil simulation, at which point the simulation terminates with the message: "NaN found during time integration". For more information on the error and parameters, see the pictures below. I tried to solve this with a higher polynomial degree, up to p=7 (which generates 1.5 million grid points here for a 2D simulation), but the simulation still terminates. The time step itself is computed from a fixed CFL=0.9 condition. Maybe this is the wrong approach because this condition is not sufficient?
So can somebody please give me a hint about this? Do I have to use another kind of grid, because the resolution of the boundary at the peak of the airfoil is not high enough, or because the grid lines have to be orthogonal to the surface? I am already using the spherigons tool and have tried to get a smooth boundary for my airfoil.
I am using the CompressibleFlowSolver in parallel runs on up to 96 CPUs, solving both the Euler equations (EulerCFE) and the Navier-Stokes equations (NavierStokesCFE). The simulation terminates in both cases.
By the way, what is the difference between EulerCFE and EulerADCFE (there is no information about it anywhere)? I tried to run the NACA0012 example from the paper "Nektar++: An open-source spectral/hp element framework". Its conditions use EulerADCFE, but when running this example the simulation does not terminate; however, it also does not change (the values of every variable in the whole domain are constant over time - no changes?).
Thanks for any help!
Best regards
Fabian
Visualization of the grid at a polynomial degree of p=2 just before terminating:
<Screenshot_2016-11-29_14-04-05.png>
With auto-scale: <Screenshot from 2016-11-29 14-27-59.png>
Same solution with scaling by user: <Screenshot from 2016-11-29 15-33-10.png>
Please take into account the scaling above.
Error and conditions for example p=2:
<Screenshot from 2016-11-29 14-32-53.png> <Screenshot from 2016-11-29 14-33-22.png> <Screenshot from 2016-11-29 14-33-38.png>
Visualization of the grid at a polynomial degree of p=7 just before terminating:
<Screenshot from 2016-11-29 16-18-39.png>
(a few time steps after that the simulation terminates and the values go to inf...)
_______________________________________________ Nektar-users mailing list Nektar-users@imperial.ac.uk https://mailman.ic.ac.uk/mailman/listinfo/nektar-users
-- David Moxey (Research and Teaching Fellow) d.moxey@imperial.ac.uk | www.imperial.ac.uk/people/d.moxey Room 364, Department of Aeronautics, Imperial College London, London, SW7 2AZ, UK.