Dear All,

As far as I understand, Gmsh only creates the geometry and the mesh. How can we define the boundary conditions, solver info, etc. in Nektar++'s XML file? Do we write them into the .xml file manually, or is there another way to do this?

Regards,
Kamil
Hi Kamil,

You have to generate the specific boundary conditions by hand, but follow the instructions on the wiki under Tutorial -> Mesh Convert.

Cheers,
Spencer

Spencer Sherwin
Professor of Computational Fluid Mechanics, Department of Aeronautics, Imperial College London
South Kensington Campus, London SW7 2AZ
s.sherwin@imperial.ac.uk
+44 (0) 20 759 45052
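[Editor's note: as an illustration of what "by hand" means here, the CONDITIONS block appended to the converter's XML output might look roughly like the sketch below. It is modelled on the Helmholtz examples in the user guide; the solver properties, composite ID C[1], parameter values and forcing expression are placeholders for illustration only, not taken from Kamil's case.]

================
<NEKTAR>
    <!-- GEOMETRY and a default EXPANSIONS section come from the mesh converter output -->
    <CONDITIONS>
        <SOLVERINFO>
            <I PROPERTY="EQTYPE"     VALUE="Helmholtz"  />
            <I PROPERTY="Projection" VALUE="Continuous" />
        </SOLVERINFO>
        <PARAMETERS>
            <P> Lambda = 1.0 </P>      <!-- illustrative value -->
        </PARAMETERS>
        <VARIABLES>
            <V ID="0"> u </V>
        </VARIABLES>
        <BOUNDARYREGIONS>
            <B ID="0"> C[1] </B>       <!-- composite written by the converter -->
        </BOUNDARYREGIONS>
        <BOUNDARYCONDITIONS>
            <REGION REF="0">
                <D VAR="u" VALUE="0" />   <!-- Dirichlet; use N for Neumann -->
            </REGION>
        </BOUNDARYCONDITIONS>
        <FUNCTION NAME="Forcing">
            <E VAR="u" VALUE="sin(PI*x)*sin(PI*y)" />
        </FUNCTION>
    </CONDITIONS>
</NEKTAR>
================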
Hi there,

After installing Nektar++, ctest gives the following output. What may be the reason for these failing cases? Maybe they are related to the optional SMV package, which I wasn't able to install. And how important is SMV for run-time efficiency?

Thank you,
C. Sert

95% tests passed, 16 tests failed out of 302

The following tests FAILED:
    111 - MultiRegions_Helmholtz2D_CG_P7_Modes_AllBCs_iter_sc_par3 (Failed)
    112 - MultiRegions_Helmholtz2D_CG_P7_Modes_AllBCs_iter_ml_par3 (Failed)
    113 - MultiRegions_Helmholtz3D_CG_Hex_AllBCs_iter_ml_par3 (Failed)
    114 - MultiRegions_Helmholtz3D_CG_Prism_iter_ml_par3 (Failed)
    116 - MultiRegions_Helmholtz3D_HDG_Prism_par2 (Failed)
    117 - MultiRegions_Helmholtz3D_HDG_Hex_AllBCs_par2 (Failed)
    260 - ADRSolver_Advection3D_m12_DG_hex_periodic_par (Failed)
    261 - ADRSolver_ImDiffusion_Hex_Periodic_m5_par (Failed)
    262 - ADRSolver_Helmholtz3D_CubePeriodic_par (Failed)
    263 - ADRSolver_Helmholtz3D_CubeDirichlet_par (Failed)
    264 - ADRSolver_Helmholtz3D_CubePeriodic_RotateFace_par (Failed)
    291 - IncNavierStokesSolver_ChanFlow_3DH1D_Parallel_mode1 (Failed)
    292 - IncNavierStokesSolver_ChanFlow_3DH1D_Parallel_mode2 (Failed)
    293 - IncNavierStokesSolver_ChanFlow_m3_par (Failed)
    294 - IncNavierStokesSolver_Tet_channel_m8_par (Failed)
    302 - FieldConvert_chan3D_vort_par (Failed)
Hi Cuneyt,

Can I ask:

- What system is this installed on?
- What version of Nektar++ are you using?
- Please run ctest --output-on-failure and post the output (and the output of output.out and output.err if they are referenced).

Since it is only parallel tests that are failing for you, it is likely to be an MPI issue. This will be unrelated to the SMV package, which is a feature that is still under development.

Thanks,
Dave
Hi David,

This is a cluster of computers with a CentOS 6.5 frontend. I installed Nektar++ 3.4.0.

As an example, Test 111 fails as follows:

================
Command: mpirun -np 3 /cfd/cusert/nektar++-3.4.0/builds/library/Demos/MultiRegions/Helmholtz2D -I GlobalSysSoln=IterativeStaticCond Helmholtz2D_P7_AllBCs.xml 1>output.out 2>output.err
================

output.err includes the following:

================
error while loading shared libraries: libboost_system.so.1.49.0: cannot open shared object file: No such file or directory
================

Boost is installed as a third-party library and my LD_LIBRARY_PATH is set as explained in Nektar++'s FAQ page. I can see the above-mentioned library in the .../ThirdParty/dist/lib directory. MPI was detected by ccmake in the /opt/openmpi directory.

Can this be related to calling mpirun without the job scheduler? I am not really experienced with MPI.

Cuneyt
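[Editor's note: the FAQ setting Cuneyt refers to presumably amounts to exporting the bundled ThirdParty library directory before re-running the tests. A minimal sketch follows; the install prefix shown is taken from the command above but the exact directory layout is an assumption and should be adjusted to the local build tree.]

================
# Make the bundled Boost libraries visible to the dynamic linker
# (path is illustrative; adjust to your own build tree)
export LD_LIBRARY_PATH=/cfd/cusert/nektar++-3.4.0/ThirdParty/dist/lib:$LD_LIBRARY_PATH

# Re-run the failing tests with verbose output, e.g. only those matching "par"
ctest --output-on-failure -R par
================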
Hi Cuneyt,

It is possible that your LD_LIBRARY_PATH is not being passed through to the executable by mpirun. Depending on your distribution of MPI, you could try to pass it through manually; for example, with OpenMPI you can do:

mpirun -x LD_LIBRARY_PATH ...

but it is entirely dependent on the system/distribution. You might contact your cluster administrator if you continue to have problems; they can probably be of more assistance with this.

Thanks,
Dave
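[Editor's note: applied to the failing Test 111 command quoted earlier, this suggestion would look roughly as follows. This is only a sketch; the -x option is OpenMPI-specific, and the paths are simply those from Cuneyt's report.]

================
# Forward the caller's LD_LIBRARY_PATH to every MPI rank (OpenMPI syntax)
mpirun -x LD_LIBRARY_PATH -np 3 \
    /cfd/cusert/nektar++-3.4.0/builds/library/Demos/MultiRegions/Helmholtz2D \
    -I GlobalSysSoln=IterativeStaticCond Helmholtz2D_P7_AllBCs.xml \
    1>output.out 2>output.err
================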
I just read the following on the FAQ page; it sounds like the reason for my failed tests: "In a cluster environment, using PBS for example, the mpiexec command should be used."

Cuneyt
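[Editor's note: for completeness, running a case under PBS rather than directly on the frontend would involve a job script along these lines. This is only a sketch; the job name, resource request, executable path and queue settings are hypothetical placeholders that depend entirely on the local cluster configuration.]

================
#!/bin/bash
#PBS -N helmholtz2d            # hypothetical job name
#PBS -l nodes=1:ppn=3          # request 3 cores on one node (site-specific syntax)
#PBS -l walltime=00:10:00

cd $PBS_O_WORKDIR              # run from the directory the job was submitted from

# Launch through the scheduler-aware mpiexec, as the FAQ recommends
mpiexec -np 3 ./Helmholtz2D -I GlobalSysSoln=IterativeStaticCond Helmholtz2D_P7_AllBCs.xml
================

The script would then be submitted with qsub rather than invoking mpirun on the frontend directly.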
participants (4)

- Cuneyt Sert
- David Moxey
- Kamil Ozden
- Sherwin, Spencer J