Hi David,

This is a cluster of computers with a CentOS 6.5 frontend. I installed Nektar++ 3.4.0. As an example, test 111 fails as follows:

================
Command: mpirun -np 3 /cfd/cusert/nektar++-3.4.0/builds/library/Demos/MultiRegions/Helmholtz2D -I GlobalSysSoln=IterativeStaticCond Helmholtz2D_P7_AllBCs.xml 1>output.out 2>output.err
================

output.err includes the following:

================
error while loading shared libraries: libboost_system.so.1.49.0: cannot open shared object file: No such file or directory
================

Boost is installed as a third-party library and my LD_LIBRARY_PATH is set as explained in Nektar++'s FAQ page. I can see the above-mentioned library in the .../ThirdParty/dist/lib directory. MPI was detected by ccmake in the /opt/openmpi directory. Could this be related to calling mpirun without the job scheduler? I am not really experienced with MPI.
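In case it helps, this is the kind of check I could try next (a sketch only, assuming Open MPI, whose mpirun accepts -x to forward an environment variable to the launched processes; the ThirdParty path below is a placeholder for my actual install location):

================
# Confirm the executable can resolve the Boost library on the frontend
ldd /cfd/cusert/nektar++-3.4.0/builds/library/Demos/MultiRegions/Helmholtz2D | grep boost

# Put the ThirdParty lib directory on the search path (substitute the real path)
export LD_LIBRARY_PATH=<path-to>/ThirdParty/dist/lib:$LD_LIBRARY_PATH

# Explicitly forward LD_LIBRARY_PATH to every MPI rank
mpirun -np 3 -x LD_LIBRARY_PATH /cfd/cusert/nektar++-3.4.0/builds/library/Demos/MultiRegions/Helmholtz2D -I GlobalSysSoln=IterativeStaticCond Helmholtz2D_P7_AllBCs.xml
================

Cuneyt

Quoting David Moxey <d.moxey@imperial.ac.uk>: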
Hi Cuneyt,
Can I ask:
- what system is this installed on?
- what version of Nektar++ are you using?
- please run ctest --output-on-failure and post the output (and the contents of output.out and output.err if they are referenced), as sketched below
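To narrow things down, something like this should re-run only the failing parallel tests (a sketch; the -R pattern simply matches the _par suffix in the test names you posted):

================
# Run from the Nektar++ build directory
ctest --output-on-failure -R _par
================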
Since it is only parallel tests that are failing for you, it is likely to be an MPI issue. This will be unrelated to the SMV package, which is a feature that is still under development.
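As a quick sanity check that MPI itself works outside the job scheduler, you could try something like (a sketch, assuming mpirun is on your PATH):

================
# Does a trivial parallel launch work at all?
mpirun -np 3 hostname

# Is this the same MPI installation that CMake detected at configure time?
which mpirun
================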
Thanks,
Dave
On 4 Jul 2014, at 02:47, Cuneyt Sert <csert@metu.edu.tr> wrote:
Hi there,

After installing Nektar++, ctest gives the following output. What might be the reason for these failing tests? Maybe they are related to the optional SMV package, which I wasn't able to install. And how important is SMV for run-time efficiency?

Thank you,
C. Sert
95% tests passed, 16 tests failed out of 302
The following tests FAILED:
	111 - MultiRegions_Helmholtz2D_CG_P7_Modes_AllBCs_iter_sc_par3 (Failed)
	112 - MultiRegions_Helmholtz2D_CG_P7_Modes_AllBCs_iter_ml_par3 (Failed)
	113 - MultiRegions_Helmholtz3D_CG_Hex_AllBCs_iter_ml_par3 (Failed)
	114 - MultiRegions_Helmholtz3D_CG_Prism_iter_ml_par3 (Failed)
	116 - MultiRegions_Helmholtz3D_HDG_Prism_par2 (Failed)
	117 - MultiRegions_Helmholtz3D_HDG_Hex_AllBCs_par2 (Failed)
	260 - ADRSolver_Advection3D_m12_DG_hex_periodic_par (Failed)
	261 - ADRSolver_ImDiffusion_Hex_Periodic_m5_par (Failed)
	262 - ADRSolver_Helmholtz3D_CubePeriodic_par (Failed)
	263 - ADRSolver_Helmholtz3D_CubeDirichlet_par (Failed)
	264 - ADRSolver_Helmholtz3D_CubePeriodic_RotateFace_par (Failed)
	291 - IncNavierStokesSolver_ChanFlow_3DH1D_Parallel_mode1 (Failed)
	292 - IncNavierStokesSolver_ChanFlow_3DH1D_Parallel_mode2 (Failed)
	293 - IncNavierStokesSolver_ChanFlow_m3_par (Failed)
	294 - IncNavierStokesSolver_Tet_channel_m8_par (Failed)
	302 - FieldConvert_chan3D_vort_par (Failed)
_______________________________________________
Nektar-users mailing list
Nektar-users@imperial.ac.uk
https://mailman.ic.ac.uk/mailman/listinfo/nektar-users