Questions on using HelmSolve, varCoeffMap and varFactorsMap
Dear all,

I have a few questions on using the HelmSolve function together with non-zero varCoeffMap and varFactorsMap input arguments. Consider the following equation on a given domain:

    div [ eps(x) * grad(U) ] + lambda(x) * U = f(x)

Is it correct to say that varCoeffMap can be set up to provide eps(x), while varFactorsMap can be set up to provide lambda(x) for the HelmSolve function? If so, how does the constant factor [eFactorLambda] get treated if varFactorsMap is also defined?

With these assumptions, I have moved forward and set up an example. The Unsteady Diffusion equation system from the ADR solver was used as the basis; the modified source files are attached. The example is not meant to represent a particular physical problem, it just checks functionality. A varCoeffMap [StdRegions::eVarCoeffLaplacian] and a varFactorsMap [StdRegions::eFactorDiffCoefff] were set up based on the total number of quadrature points and are simply filled with random small values at every time step. A 2D problem was set up in a unit square domain (the xml file is also attached).

Time marching the problem has revealed that RAM consumption grows steadily and without bound if the varFactorsMap is recomputed at every time step, regardless of the total available RAM, provided the total number of time steps is sufficiently high. I am running the code under Ubuntu, and all source files were updated from the repository on March 2nd, prior to any modifications.

Is this due to an improper use of varFactorsMap within the example, or is it a memory management problem? Am I missing source files which should be updated to provide the intended functionality? It must be noted that updating varCoeffMap at every step does not generate the same memory problem.

Any information or advice would be much appreciated, and apologies for the lengthy email :)

Best regards,
Oliviu
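For concreteness, the kind of per-time-step update described above might look like the following minimal sketch (the helper name, the use of rand(), and the member access are illustrative assumptions, not the attached code):

    // Rough sketch only: a hypothetical helper that refills the spatially
    // varying coefficient at each time step, in the spirit of the modified
    // UnsteadyDiffusion test described above.
    #include <cstdlib>
    #include <LibUtilities/BasicUtils/SharedArray.hpp>
    #include <StdRegions/StdRegions.hpp>
    #include <MultiRegions/ExpList.h>

    using namespace Nektar;

    void RefillVarCoeffs(const MultiRegions::ExpListSharedPtr &field,
                         StdRegions::VarCoeffMap              &varcoeffs)
    {
        const int nq = field->GetTotPoints();   // total quadrature points

        Array<OneD, NekDouble> eps(nq);
        for (int i = 0; i < nq; ++i)
        {
            // small random perturbation about a unit diffusivity
            eps[i] = 1.0 + 1.0e-3 * (std::rand() / (NekDouble) RAND_MAX);
        }

        // eps(x) enters the Helmholtz/Laplacian operator as a variable coefficient
        varcoeffs[StdRegions::eVarCoeffLaplacian] = eps;
    }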
Dear Oliviu,

TL;DR: By changing the coefficients in your matrix at each time step, you are generating local elemental matrices at each time step, which are then accumulating in the matrix managers. You would need to clear these managers at each time step to avoid the increasing memory usage. See detail below.

On 15/03/18 21:00, Oliviu Sugar wrote:
Consider the following equation on a given domain: div [ eps(x) * grad(U) ] + lambda(x) * U = f(x)
Is it correct to say that varCoeffMap can be set up to provide eps(x), while varFactorsMap can be set up to provide lambda(x) for the HelmSolve function? If so, how does the constant factor [eFactorLambda] get treated if varFactorsMap is also defined?
The (const) factors are scalar, spatially independent constants used in the construction of an operator, while the varcoeffs are intended for spatially varying coefficients such as the eps(x) above. If you wanted a spatially varying lambda coefficient, you would need to modify the mass-matrix varcoeff, which is used when building the Helmholtz operator.
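To make this distinction concrete, a rough sketch of how the two maps might be populated and passed to HelmSolve is given below; eVarCoeffMass as the name of the mass-matrix variable coefficient, and the exact HelmSolve argument list, are assumptions to verify against your Nektar++ version:

    #include <StdRegions/StdRegions.hpp>
    #include <MultiRegions/ExpList.h>

    using namespace Nektar;

    // Illustrative only: scalar constants go into the ConstFactorMap, while
    // per-quadrature-point arrays go into the VarCoeffMap.
    void SolveWithVariableCoefficients(
        const MultiRegions::ExpListSharedPtr &field,
        const Array<OneD, const NekDouble>   &forcing,
        const Array<OneD, NekDouble>         &eps,        // eps(x), size nq
        const Array<OneD, NekDouble>         &lambdaOfX)  // lambda(x), size nq
    {
        StdRegions::ConstFactorMap factors;
        factors[StdRegions::eFactorLambda] = 1.0;               // constant part

        StdRegions::VarCoeffMap varcoeffs;
        varcoeffs[StdRegions::eVarCoeffLaplacian] = eps;        // variable eps(x)
        varcoeffs[StdRegions::eVarCoeffMass]      = lambdaOfX;  // variable lambda(x)
                                                                // (assumed enum name)

        // The exact HelmSolve signature differs between versions (some also
        // take a FlagList and/or a varfactors map); treat this call as schematic.
        field->HelmSolve(forcing, field->UpdateCoeffs(), factors, varcoeffs);
    }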
With these assumptions, I have moved forward to set up an example. The Unsteady Diffusion equation system from the ADR solver was used as the basis. The modified source files are attached.
The example is not meant to represent a particular physical problem; it just checks functionality.
A varCoeffMap [StdRegions::eVarCoeffLaplacian] and a varFactorsMap [StdRegions::eFactorDiffCoefff] were set up based on the total number of quadrature points and are simply filled with random small values at every time step.
A 2D problem was set up in a unit square domain (the xml file is also attached).
Time marching the problem has revealed that RAM consumption grows steadily and without bound if the varFactorsMap is recomputed at every time step, regardless of the total available RAM, provided the total number of time steps is sufficiently high. I am running the code under Ubuntu, and all source files were updated from the repository on March 2nd, prior to any modifications.
The generation of matrices (elemental and global) in Nektar++ is defined using a 'key' which specifies the operator, shape, basis, and any constant/variable coefficients. This is combined into a 'hash' and stored in a manager (of the C++ design-pattern sort) along with the generated matrix, so that the matrix can be reused if subsequent requests for the same key are made (e.g. operators on regular StdRegions elements).

As a consequence, by changing the variable coefficients at each time step, different matrices are generated at each time step and stored separately in the manager. Since this sort of time-dependent matrix usage is not typical of Nektar++, we do not really have a clean way of handling it at the moment.

We do have a ClearManager member function on the NekManager to remove previously generated matrices from specific pools of matrices in a not-so-elegant, brute-force manner. You will notice this is called indirectly through the ClearGlobalLinSysManager() call in your UnsteadySystem.cpp file, but this will only clear those in the "GlobalLinSys" pool, not those in the elemental matrix pools. For those you would need to clear, for example, the "QuadExpMatrix" and "QuadExpStaticCondMatrix" pools, and similarly for the other element types.

Hope that helps explain your observations and at least steers you towards a solution!

Kind regards,
Chris
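As a concrete (and deliberately brute-force) illustration of the above, clearing the quadrilateral elemental pools at each time step might look roughly like the sketch below; the static ClearManager(poolName) call and the template arguments are assumptions based on how the managers appear to be declared in LocalRegions, so check them against your source tree:

    #include <LibUtilities/BasicUtils/NekManager.hpp>
    #include <LibUtilities/LinearAlgebra/NekTypeDefs.hpp>
    #include <LocalRegions/MatrixKey.h>

    using namespace Nektar;

    // Call once per time step, after the implicit solve, to discard elemental
    // matrices built for the now-stale variable coefficients.
    void ClearElementalMatrixPools()
    {
        using LocalRegions::MatrixKey;

        LibUtilities::NekManager<MatrixKey, DNekScalMat,
            MatrixKey::opLess>::ClearManager("QuadExpMatrix");
        LibUtilities::NekManager<MatrixKey, DNekScalBlkMat,
            MatrixKey::opLess>::ClearManager("QuadExpStaticCondMatrix");

        // Repeat for the other element types in the mesh, e.g. "TriExpMatrix",
        // "TriExpStaticCondMatrix", and so on.
    }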
Is this due to an improper use of varFactorsMap within the example, or is it a memory management problem? Am I missing source files which should be updated to provide this intended functionality? It must be noted that updating varCoeffMap at every step does not generate the same memory problem.
Any information or advice would be much appreciated, and apologies for the lengthy email :)
Best regards,
Oliviu
--
Chris Cantwell
Imperial College London
South Kensington Campus
London SW7 2AZ
Email: c.cantwell@imperial.ac.uk
www.imperial.ac.uk/people/c.cantwell
Dear Chris,

Many thanks for the information. I will continue to work on this problem and look at clearing the memory for the elemental matrices, for the various element shapes, at each time step.

Best regards,
Oliviu
participants (2):
- Chris Cantwell
- Oliviu Sugar