Dear Prof. Sherwin,

             That’s perfectly fine, thank you for handling my request and for approving the GitLab registration. Yes, the request was made by me under the email sdebbahi@usthb.dz. I appreciate your help with this.

With gratitude, best regards,
Saad


On Thu, 3 Oct 2024 at 10:05, Sherwin, Spencer J <s.sherwin@imperial.ac.uk> wrote:

Hi Saad,

 

Sorry for the delay in approving the request. Was it made under the email account sdebbahi@usthb.dz? If so, I have now approved the GitLab account.

 

Best,

Spencer.

 

 

From: nektar-users-bounces@imperial.ac.uk <nektar-users-bounces@imperial.ac.uk> on behalf of Debbahi Saad <saad.debbahi@gmail.com>
Date: Thursday, 3 October 2024 at 09:14
To: Henrik Wüstenberg <henrik.wuestenberg@hotmail.de>, nektar-users <nektar-users@imperial.ac.uk>
Subject: Re: [Nektar-users] Memory leak with variable viscosity using Incompressible Navier-Stokes solver

Dear all,

 

            I'm trying to get the above-cited issue, a memory leak in the incompressible Navier-Stokes solver when using a VarCoeffMap, reported to the developer team or contributors by opening an issue on GitLab. However, my GitLab registration request on the Nektar++ domain is still pending approval.

In the meantime, is there an alternative way to reach the developer team to report the issue so it can hopefully be addressed?

 

Best regards,

Saad

 

On Wed, 18 Sept 2024 at 17:29, Debbahi Saad <saad.debbahi@gmail.com> wrote:

Dear Henrik,

 

        Thank you for your feedback on this issue, and sorry for the late reply.

I have attached here the session file that reproduces the memory leak with the IncNavierStokesSolver when the VarCoeffMap entries for viscosity are filled by a user function defined inside the session file, namely the nuOfT function.

I would certainly have opened an issue on the Nektar++ GitLab, but since I am having difficulties accessing my university mailbox, I ended up signing up with this Gmail address, which does not give access to the GitLab.

 

Is there any way around this to get the issue relayed to GitLab?

 

Sorry for the inconvenience; I'm looking forward to following up on this issue.

 

Thank you in advance, all the best  

 

Saad

 

 

On Mon, 16 Sept 2024 at 15:21, Henrik Wüstenberg <henrik.wuestenberg@hotmail.de> wrote:

Dear Saad,

 

Many thanks for sharing your issue here.

 

We have had this issue previously. It occurs due to time-dependent coefficients in the matrices. The global matrix is assembled from local/elemental matrices; while the global matrix's memory is freed at every time step, the local matrices are not necessarily freed, which can lead to a memory leak.
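
Purely to illustrate the mechanism (this is a toy sketch, not Nektar++'s actual data structures): if the assembled operators are cached in a map keyed by their coefficients, then a coefficient set that changes every time step produces a new key, and hence a new cached entry, at every step:

    #include <cstddef>
    #include <cstdio>
    #include <functional>
    #include <map>
    #include <vector>

    // Toy model of an operator cache keyed by (a hash of) the variable
    // coefficients. A coefficient field that changes every time step hashes to
    // a new key, so every lookup misses, another set of "matrices" is
    // allocated, and the entries from earlier steps are never evicted.
    int main()
    {
        std::map<std::size_t, std::vector<double>> cache;

        for (int step = 0; step < 5; ++step)
        {
            std::size_t key = std::hash<double>{}(0.01 * step); // changes every step
            cache.emplace(key, std::vector<double>(1000, 0.0)); // new entry every step
            std::printf("step %d: cached operator sets = %zu\n", step, cache.size());
        }
        return 0;
    }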

 

If you could open an issue on https://gitlab.nektar.info/nektar/nektar and attach your session and mesh file, we can implement a fix for this issue.

 

Best wishes,
Henrik

 


From: nektar-users-bounces@imperial.ac.uk <nektar-users-bounces@imperial.ac.uk> on behalf of Debbahi Saad <saad.debbahi@gmail.com>
Sent: Wednesday, 11 September 2024 20:00
To: nektar-users <nektar-users@imperial.ac.uk>
Subject: [Nektar-users] Memory leak with variable viscosity using Incompressible Navier-Stokes solver

 


 

Dear all,

 

            I am using a locally compiled version of Nektar++ release 5.6.0 (which otherwise runs well) to solve a fully 3D plane Poiseuille flow geometry.

I'm trying to add variable (temperature-dependent) viscosity by defining a function in the Nektar++ session file, to be evaluated and passed to the VarCoeffMap entries D00, D11 and D22, as suggested by M. R. Rouhanian a while ago. In doing so, I've come to the conclusion that the incompressible Navier-Stokes solver accumulates memory without releasing it throughout the time-advancement process.

 

Key points:

- Correct behavior with constant viscosity: no problem, the IncNavierStokesSolver runs fine.

- Source file modified: nektar-v5.6.0/solvers/IncNavierStokesSolver/EquationSystems/VelocityCorrectionScheme.cpp (line 890, the VelocityCorrectionScheme::v_SolveViscous function).

- Monotonic increase in memory usage when enabling variable viscosity through any of the possible 'nu' dependencies, i.e. nu(Temp), nu(x,y,z), nu(t), nu(x,y,z,t).

- Memory usage easily reaches 10 GB after 20 to 30 timesteps, spread roughly evenly over the 4 spawned IncNavierStokesSolver processes, compared to 4 x 140 MB = 560 MB for the constant-viscosity case.

- Memory usage only ever increases; none of the surplus is released from one timestep to the next, it just keeps adding up.

- No crash: the calculation carries on, with a noticeable slowdown due to the viscosity-array update, but it is not usable in practice, since any serious simulation would claim a few terabytes of leaked memory.

- Adding a few cout statements that print the array sizes shows that the arrays involved keep the same size, so there is no issue there (a sketch of a simple per-timestep resident-memory probe follows this list).
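
Not Nektar-specific, just a minimal Linux-only sketch of the per-timestep resident-memory probe mentioned above (it reads the VmRSS line from /proc/self/status; the helper name is my own and not part of the solver):

    #include <fstream>
    #include <iostream>
    #include <string>

    // Print the resident set size (VmRSS) of the current process as reported
    // by the Linux kernel, tagged with the current timestep number.
    static void PrintResidentMemory(int timestep)
    {
        std::ifstream status("/proc/self/status");
        std::string line;
        while (std::getline(status, line))
        {
            if (line.rfind("VmRSS:", 0) == 0) // line starts with "VmRSS:"
            {
                std::cout << "timestep " << timestep << ": " << line << std::endl;
                break;
            }
        }
    }

Calling this once per timestep next to the existing cout statements makes the growth easy to log and to compare between the constant- and variable-viscosity runs.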

 

It seems to be a memory-management issue or a missing destructor call that I'm not able to pinpoint or fix.

 

Any help with this issue or suggestions on how to fix it would be greatly appreciated.

 

Thank you all in advance, wishing you all the best,

 

Saad

 

Code details:

My modified VelocityCorrectionScheme::v_SolveViscous function (file attached here) reads:

 

void VelocityCorrectionScheme::v_SolveViscous(
    const Array<OneD, const Array<OneD, NekDouble>> &Forcing,
    const Array<OneD, const Array<OneD, NekDouble>> &inarray,
    Array<OneD, Array<OneD, NekDouble>> &outarray, const NekDouble aii_Dt)
{
    StdRegions::ConstFactorMap factors;
    StdRegions::VarCoeffMap varCoeffMap       = StdRegions::NullVarCoeffMap;
    MultiRegions::VarFactorsMap varFactorsMap = MultiRegions::NullVarFactorsMap;

    AppendSVVFactors(factors, varFactorsMap);
    ComputeGJPNormalVelocity(inarray, varCoeffMap);

    /*
    switch (coordim)
    {
        case 3:
            Exp->GetCoords(xc0, xc1, xc2);
            break;
        default:
            ASSERTL0(false, "Coordim not valid");
            break;
    }
    */

    //////////////////////////////////////////////////////////////////////////
    // Modifications to account for variable viscosity start here:
    //////////////////////////////////////////////////////////////////////////

    int nuflag = 0; // initialized to 0; fall back to constant viscosity if nu(T) is not defined
    time_t rawTime;
    time(&rawTime);
    // Wall-clock time cast to NekDouble (note: this is the system time, not the
    // simulation time).
    NekDouble time = static_cast<NekDouble>(rawTime);

    // If the function nuOfT is defined in the session file, evaluate it and set
    // the variable viscosity.
    if (m_session->DefinesFunction("nuOfT"))
    {
        Array<OneD, NekDouble> xxc0, xxc1, xxc2; // arrays to hold the coordinates

        int nqv;         // total number of quadrature points
        int coordim = 3; // default dimension
        LibUtilities::EquationSharedPtr nuOfTfunc;
        nqv = m_fields[0]->GetTotPoints();

        // Allocate zero-valued arrays of size nqv to receive the space coordinates.
        xxc0 = Array<OneD, NekDouble>(nqv, 0.0);
        xxc1 = Array<OneD, NekDouble>(nqv, 0.0);
        xxc2 = Array<OneD, NekDouble>(nqv, 0.0);

        m_fields[0]->GetCoords(xxc0, xxc1, xxc2);
        coordim = m_fields[0]->GetCoordim(0);

        // Zero-valued array to hold the viscosity value at each point in space.
        Array<OneD, NekDouble> nuOfT(nqv, 0.0);
        nuOfTfunc = m_session->GetFunction("nuOfT", 0);

        // Evaluate the declared nu(T) function and assign the result to the
        // three (identical) diagonal components of the variable-coefficient map.
        nuOfTfunc->Evaluate(xxc0, xxc1, xxc2, nuOfT);
        varCoeffMap[StdRegions::eVarCoeffD00] = nuOfT;
        varCoeffMap[StdRegions::eVarCoeffD11] = nuOfT;
        varCoeffMap[StdRegions::eVarCoeffD22] = nuOfT;

        nuflag = 1; // set the variable-viscosity flag
    }

    // Solve Helmholtz system and put in physical space
    for (int i = 0; i < m_nConvectiveFields; ++i)
    {
        // Add diffusion coefficient to GJP matrix operator (implicit part)
        if (m_useGJPStabilisation)
        {
            factors[StdRegions::eFactorGJP] = m_GJPJumpScale / m_diffCoeff[i];
        }

        // Set up coefficients for the equation
        factors[StdRegions::eFactorLambda] = 1.0 / aii_Dt / m_diffCoeff[i];
        if (nuflag == 1)
        {
            // The nuOfT function is defined, so set the time factor;
            // otherwise continue with constant viscosity.
            factors[StdRegions::eFactorTime] = time;
        }

        m_fields[i]->HelmSolve(Forcing[i], m_fields[i]->UpdateCoeffs(), factors,
                               varCoeffMap, varFactorsMap);
        m_fields[i]->BwdTrans(m_fields[i]->GetCoeffs(), outarray[i]);
    }
}
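
A possible stop-gap that I have not tested (a speculative sketch: it assumes ExpList::ClearGlobalLinSysManager() is available in 5.6.0 and actually releases the cached systems, both of which should be verified) would be to clear the cached operators at the end of v_SolveViscous whenever the variable-viscosity path is taken, at the cost of re-assembling them every time step:

    // Speculative workaround, not verified: when variable viscosity is active,
    // clear the cached global linear systems after the viscous solves so that
    // the per-step operators are rebuilt instead of accumulating.
    if (nuflag == 1)
    {
        for (int i = 0; i < m_nConvectiveFields; ++i)
        {
            m_fields[i]->ClearGlobalLinSysManager();
        }
    }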

 

 

Sample function node from the XML session file:

        <FUNCTION NAME="nuOfT">
            <E EVARS="c1" VALUE="Ra*Pr*c1" />
        </FUNCTION>