Hi Florian,

thanks! I now get a different (and probably unrelated) error message. After reinstalling PyOP2 and firedrake in a different directory to tidy up my directory structure, it crashes (even on a single processor) with the message below. The point where it falls over seems to be this:

[...]
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/host.py", line 654, in compile
    if self._kernel._is_blas_optimized:
AttributeError: 'Kernel' object has no attribute '_is_blas_optimized'

Thanks,

Eike

Started at Mon Sep 29 15:46:19 BST 2014
Running helmholtz
Mon Sep 29 15:46:22 2014: [PE_0]: cpumask set to 1 cpu on nid00942, cpumask = 000000000000000000000000000000000000000000000001
+------------------------+
! Mixed Helmholtz solver !
+------------------------+
Running on 1 MPI processes

*** Parameters ***
Output:
    savetodisk = False
    output_dir = output
Grid:
    ref_count_coarse = 3
    nlevel = 4
Mixed system:
    higher_order = False
    verbose = 2
    schur_diagonal_only = False
    lump_mass = True
    preconditioner = Multigrid
    maxiter = 20
    tolerance = 1e-05
    ksp_type = gmres
Pressure solve:
    lump_mass = True
    maxiter = 10
    tolerance = 1e-05
    verbose = 1
    ksp_type = cg
Multigrid:
    mu_relax = 1.0
    lump_mass = True
    n_postsmooth = 1
    n_coarsesmooth = 1
    n_presmooth = 1

Number of cells on finest grid = 327680

Traceback (most recent call last):
  File "/work/n02/n02/eike//git_workspace/firedrake-helmholtzsolver/source/driver.py", line 159, in <module>
    lumped_mass_fine = lumpedmass.LumpedMassRT0(V_velocity)
  File "/fs2/n02/n02/eike/git_workspace/firedrake-helmholtzsolver/source/pressuresolver/lumpedmass.py", line 178, in __init__
    solver_parameters=self.project_solver_param)
  File "/work/n02/n02/eike/git_workspace/firedrake/firedrake/function.py", line 134, in project
    return projection.project(b, self, *args, **kwargs)
  File "/work/n02/n02/eike/git_workspace/firedrake/firedrake/projection.py", line 93, in project
    form_compiler_parameters=form_compiler_parameters)
  File "/work/n02/n02/eike/git_workspace/firedrake/firedrake/solving.py", line 904, in solve
    _solve_varproblem(*args, **kwargs)
  File "/work/n02/n02/eike/git_workspace/firedrake/firedrake/solving.py", line 934, in _solve_varproblem
    solver.solve()
  File "<string>", line 2, in solve
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/profiling.py", line 168, in wrapper
    return f(*args, **kwargs)
  File "/work/n02/n02/eike/git_workspace/firedrake/firedrake/solving.py", line 305, in solve
    self.snes.solve(None, v)
  File "SNES.pyx", line 418, in petsc4py.PETSc.SNES.solve (src/petsc4py.PETSc.c:149863)
  File "petscsnes.pxi", line 232, in petsc4py.PETSc.SNES_Function (src/petsc4py.PETSc.c:29845)
  File "/work/n02/n02/eike/git_workspace/firedrake/firedrake/solving.py", line 227, in form_function
    with self._F_tensor.dat.vec_ro as v:
  File "/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/petsc_base.py", line 90, in vec_context
    self._force_evaluation()
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/base.py", line 1522, in _force_evaluation
    _trace.evaluate(reads, writes)
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/base.py", line 154, in evaluate
    comp._run()
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/base.py", line 3808, in _run
    return self.compute()
  File "<string>", line 2, in compute
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/profiling.py", line 168, in wrapper
    return f(*args, **kwargs)
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/base.py", line 3816, in compute
    self._compute(self.it_space.iterset.core_part)
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/sequential.py", line 151, in _compute
    fun(*self._jit_args, argtypes=self._argtypes, restype=None)
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/host.py", line 629, in __call__
    return self.compile(argtypes, restype)(*args)
  File "/work/n02/n02/eike/git_workspace/PyOP2/pyop2/host.py", line 654, in compile
    if self._kernel._is_blas_optimized:
AttributeError: 'Kernel' object has no attribute '_is_blas_optimized'

************************************************************************************************************************
***            WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document             ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

/work/n02/n02/eike//git_workspace/firedrake-helmholtzsolver/source/driver.py on a arch-linux2-cxx-opt named nid00942 with 1 processor, by eike Mon Sep 29 15:46:33 2014
Using Petsc Development GIT revision: v3.5.2-313-g42857b6  GIT Date: 2014-09-18 09:00:38 +0100

                         Max       Max/Min        Avg      Total
Time (sec):           1.023e+01      1.00000   1.023e+01
Objects:              2.790e+02      1.00000   2.790e+02
Flops:                0.000e+00      0.00000   0.000e+00  0.000e+00
Flops/sec:            0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Messages:         0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Message Lengths:  0.000e+00      0.00000   0.000e+00  0.000e+00
MPI Reductions:       0.000e+00      0.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                          e.g., VecAXPY() for real vectors of length N --> 2N flops
                          and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total
 0:      Main Stage: 1.0231e+01 100.0%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00        0.0%  0.000e+00   0.0%

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
          %T - percent time in this phase         %F - percent flops in this phase
          %M - percent messages in this phase     %L - percent message lengths in this phase
          %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

VecSet                 9 1.0 3.1409e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyBegin       1 1.0 9.5367e-07 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyEnd         1 1.0 1.4118e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
DMPlexInterp           1 1.0 2.4271e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
DMPlexStratify         9 1.0 2.0777e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  2  0  0  0  0   2  0  0  0  0     0
SFSetGraph            12 1.0 5.3091e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type                  Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Container            24             14       8008     0
                 Viewer             1              0          0     0
              Index Set            52             47    7921640     0
      IS L to G Mapping             2              0          0     0
                Section           105             29      19372     0
                 Vector            12              3       9696     0
                 Matrix             1              0          0     0
         Preconditioner             1              0          0     0
          Krylov Solver             1              0          0     0
        DMKSP interface             1              0          0     0
                   SNES             1              0          0     0
         SNESLineSearch             1              0          0     0
                 DMSNES             1              0          0     0
       Distributed Mesh            19              7      31992     0
Star Forest Bipartite Graph        38             19      15200     0
        Discrete System            19              7       5544     0
========================================================================================================================
Average time to get PetscTime(): 9.53674e-08
#PETSc Option Table entries:
-firedrake_snes_0_ksp_rtol 1e-08
-firedrake_snes_0_ksp_type cg
-firedrake_snes_0_pc_type jacobi
-firedrake_snes_0_snes_type ksponly
-log_summary
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --CC=cc --CC_LINKER_FLAGS=-dynamic --CFLAGS=-dynamic --CXX=CC --CXXFLAGS=-dynamic --CXX_LINKER_FLAGS=-dynamic --FC=ftn --FC_LINKER_FLAGS=-dynamic --FFLAGS=-dynamic --download-chaco --download-ctetgen=1 --download-hypre=1 --download-metis=1 --download-ml=1 --download-mumps=1 --download-parmetis=1 --download-ptscotch=1 --download-suitesparse=1 --download-superlu_dist=1 --download-triangle --with-blacs-include=/opt/cray/libsci/12.2.0/GNU/48/sandybridge/include --with-blacs=1 --with-fortran-interfaces=1 --with-blas-lib="[/opt/cray/libsci/12.2.0/GNU/48/sandybridge/lib/libsci_gnu.so]" --with-c-support --with-clanguage=C++ --with-debugging=0 --with-hypre=1 --with-lapack-lib="[/opt/cray/libsci/12.2.0/GNU/48/sandybridge/lib/libsci_gnu.so]" --with-metis=1 --with-ml=1 --with-mpi-dir=/opt/cray/mpt/6.3.1/gni/mpich2-gnu/48 --with-mumps=1 --with-parmetis=1 --with-ptscotch=1 --with-scalapack-include=/opt/cray/libsci/12.2.0/GNU/48/sandybridge/include --with-scalapack-lib="[/opt/cray/libsci/12.2.0/GNU/48/sandybridge/lib/libsci_gnu.so]" --with-scalapack=1 --with-blacs-lib="[/opt/cray/libsci/12.2.0/GNU/48/sandybridge/lib/libsci_gnu.so]" --with-shared-libraries=1 --with-spai=1 --with-suitesparse=1 --with-superlu_dist=1 -O3 -Wl,-Bdynamic -g PETSC_ARCH=arch-linux2-cxx-opt --download-spai=1
-----------------------------------------
Libraries compiled on Fri Sep 19 19:52:46 2014 on eslogin002
Machine characteristics: Linux-3.0.101-0.5.2-default-x86_64-with-SuSE-11-x86_64
Using PETSc directory: /work/y07/y07/fdrake/petsc
Using PETSc arch: arch-linux2-cxx-opt
-----------------------------------------
Using C compiler: CC -dynamic -O -fPIC ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: ftn -dynamic -fPIC -O ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/work/y07/y07/fdrake/petsc/arch-linux2-cxx-opt/include -I/work/y07/y07/fdrake/petsc/include -I/work/y07/y07/fdrake/petsc/include -I/work/y07/y07/fdrake/petsc/arch-linux2-cxx-opt/include -I/fs2/y07/y07/fdrake/petsc/arch-linux2-cxx-opt/include -I/opt/cray/mpt/6.3.1/gni/mpich2-gnu/48/include
-----------------------------------------
Using C linker: CC
Using Fortran linker: ftn
Using libraries: -Wl,-rpath,/work/y07/y07/fdrake/petsc/arch-linux2-cxx-opt/lib -L/work/y07/y07/fdrake/petsc/arch-linux2-cxx-opt/lib -lpetsc -Wl,-rpath,/fs2/y07/y07/fdrake/petsc/arch-linux2-cxx-opt/lib -L/fs2/y07/y07/fdrake/petsc/arch-linux2-cxx-opt/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -Wl,-rpath,/opt/cray/libsci/12.2.0/GNU/48/sandybridge/lib -L/opt/cray/libsci/12.2.0/GNU/48/sandybridge/lib -lsci_gnu -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -lspai -lsuperlu_dist_3.3 -lHYPRE -lml -lsci_gnu -lparmetis -lmetis -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -ltriangle -lX11 -lpthread -lchaco -lhwloc -lctetgen -lssl -lcrypto -Wl,-rpath,/opt/cray/mpt/6.3.1/gni/mpich2-gnu/48/lib -L/opt/cray/mpt/6.3.1/gni/mpich2-gnu/48/lib -Wl,-rpath,/opt/cray/atp/1.7.2/lib -L/opt/cray/atp/1.7.2/lib -Wl,-rpath,/opt/gcc/4.8.2/snos/lib/gcc/x86_64-suse-linux/4.8.2 -L/opt/gcc/4.8.2/snos/lib/gcc/x86_64-suse-linux/4.8.2 -Wl,-rpath,/opt/gcc/4.8.2/snos/lib64 -L/opt/gcc/4.8.2/snos/lib64 -Wl,-rpath,/opt/gcc/4.8.2/snos/lib -L/opt/gcc/4.8.2/snos/lib -lgfortran -lm -lgfortran -lm -lmpichf90_gnu_48 -lm -lquadmath -lm -lmpichcxx_gnu_48 -lstdc++ -lrt -lm -lz -Wl,-rpath,/opt/cray/mpt/6.3.1/gni/mpich2-gnu/48/lib -L/opt/cray/mpt/6.3.1/gni/mpich2-gnu/48/lib -Wl,-rpath,/opt/cray/libsci/12.2.0/GNU/48/sandybridge/lib -L/opt/cray/libsci/12.2.0/GNU/48/sandybridge/lib -Wl,-rpath,/opt/cray/atp/1.7.2/lib -L/opt/cray/atp/1.7.2/lib -Wl,-rpath,/opt/gcc/4.8.2/snos/lib/gcc/x86_64-suse-linux/4.8.2 -L/opt/gcc/4.8.2/snos/lib/gcc/x86_64-suse-linux/4.8.2 -Wl,-rpath,/opt/gcc/4.8.2/snos/lib64 -L/opt/gcc/4.8.2/snos/lib64 -Wl,-rpath,/opt/gcc/4.8.2/snos/lib -L/opt/gcc/4.8.2/snos/lib -ldl -lmpich_gnu_48 -lsci_gnu_48_mpi_mp -lsci_gnu_48_mp -lAtpSigHandler -lAtpSigHCommData -lgfortran -lpthread -lgcc_s -ldl
-----------------------------------------

Application 11287991 exit codes: 1
Application 11287991 resources: utime ~4s, stime ~2s, Rss ~539200, inblocks ~36587, outblocks ~799
Finished at Mon Sep 29 15:46:34 BST 2014

On 29 Sep 2014, at 13:56, Florian Rathgeber <f.rathgeber10@imperial.ac.uk> wrote:
On 29/09/14 11:08, Eike Mueller wrote:
Dear all,
just to let you know that it is working now, i.e. I can run the Helmholtz solver on 4 processes on ARCHER.
I needed the local_par-loop branch from the latest version of PyOP2 and also the multigrid branch from the latest firedrake, so I pulled these two into my work directory. I then added
export PYTHONPATH=$WORK/PyOP2:${PYTHONPATH}
export PYTHONPATH=$WORK/firedrake:${PYTHONPATH}
to my job script (attached). However, despite that, Python seems to load the pre-installed PyOP2 module first; below is the output of sys.path.
I could fix this by adding
import sys
sys.path.insert(0, '/work/n02/n02/eike/PyOP2')
[...]
at the top of my main python script. Any ideas why I had to do this?
The reason for this is a "feature" of setuptools/distutils: it generates a file easy-install.pth containing the paths of installed eggs, and those paths are pushed to the *beginning* of sys.path, even ahead of entries from $PYTHONPATH. I have fixed this now, so you no longer need to fiddle with sys.path in your script.
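A quick way to double-check which copy actually gets picked up, and which entries the .pth file injected ahead of $PYTHONPATH (just a sketch, run from the job script or an interactive session):

  # where does the imported PyOP2 actually live?
  python -c "import pyop2; print pyop2.__file__"
  # which sys.path entries come from installed eggs rather than $PYTHONPATH?
  python -c "import sys; print '\n'.join(p for p in sys.path if p.endswith('.egg'))"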
Florian
Thanks,
Eike
['/fs2/n02/n02/eike/git_workspace/firedrake-helmholtzsolver/source', '/fs2/y07/y07/fdrake/decorator-3.4.0/lib/python2.7/site-packages/decorator-3.4.0-py2.7.egg', '/work/y07/y07/fdrake/PyOP2/lib/python2.7/site-packages/PyOP2-0.11.0_99_g54eb9ea_dirty-py2.7-linux-x86_64.egg', '/work/y07/y07/fdrake/ffc/lib/python2.7/site-packages/FFC-1.4.0_-py2.7-linux-x86_64.egg', '/work/y07/y07/fdrake/decorator-3.4.0/lib/python2.7/site-packages/decorator-3.4.0-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/python_hostlist-1.14-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/pymongo-2.7.2-py2.7-linux-x86_64.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/GridDataFormats-0.2.4-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/setuptools-2.2-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/extasy.coco-0.1-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/radical.pilot-0.18-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/MDAnalysis-0.8.1-py2.7-linux-x86_64.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/python_hostlist-1.14-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/pymongo-2.7.2-py2.7-linux-x86_64.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/radical.utils-0.7.7-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/saga_python-0.18-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/GridDataFormats-0.2.4-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/apache_libcloud-0.15.1-py2.7.egg', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/setuptools-2.2-py2.7.egg', '/work/n02/n02/eike/firedrake', '/work/n02/n02/eike/PyOP2', '/work/n02/n02/eike/firedrake-bench', '/work/n02/n02/eike/pybench', '/work/y07/y07/fdrake/firedrake/lib/python2.7/site-packages', '/work/y07/y07/fdrake/PyOP2/lib/python2.7/site-packages', '/work/y07/y07/fdrake/ufl/lib/python2.7/site-packages', '/work/y07/y07/fdrake/scientificpython/lib/python2.7/site-packages', '/work/y07/y07/fdrake/psutil/lib/python2.7/site-packages', '/work/y07/y07/fdrake/mpi4py/lib/python2.7/site-packages', '/work/y07/y07/fdrake/instant/lib/python2.7/site-packages', '/work/y07/y07/fdrake/fiat/lib/python2.7/site-packages', '/work/y07/y07/fdrake/ffc/lib/python2.7/site-packages', '/work/y07/y07/fdrake/decorator-3.4.0/lib/python2.7/site-packages', '/work/y07/y07/fdrake/petsc/arch-linux2-cxx-opt/lib/python2.7/site-packages', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7', '/opt/cray/sdb/1.0-1.0501.48084.4.48.ari/lib64/py', '/work/y07/y07/cse/anaconda/1.9.2/lib/python27.zip', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/plat-linux2', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/lib-tk', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/lib-old', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/lib-dynload', '/work/y07/y07/cse/anaconda/1.9.2/lib/python2.7/site-packages/PIL', '/opt/cray/sdb/1.0-1.0501.48084.4.48.ari/lib64/py']
On 15/09/14 09:37, Florian Rathgeber wrote:
On 15/09/14 09:20, Eike Mueller wrote:
Hi Florian,
I'm having another go at this now.
export PYTHONPATH=${FDRAKE_DIR}/petsc4py/lib/python2.7/site-packages:${PYTHONPATH}
export PYTHONPATH=${FDRAKE_DIR}/ufl/lib/python2.7/site-packages/:${PYTHONPATH}
export PYTHONPATH=${FDRAKE_DIR}/fiat/lib/python2.7/site-packages/:${PYTHONPATH}
export PYTHONPATH=${FDRAKE_DIR}/ffc/lib/python2.7/site-packages/:${PYTHONPATH}
I added this to the template file on the wiki.

This *should* be added by loading fdrake-python-env.
* The https cloning issue I believe is resolved, at least I can't reproduce it anymore.

All seems to work fine for me now.
* The PyOP2 installation issue related to NumPy is a bug [1] in setuptools 2.2 which comes with anaconda. For now, the best thing to do is to remove/comment line 138 in PyOP2/setup.py which declares the setup dependency on NumPy.
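If you prefer not to edit the file by hand, something like this should do the same thing (untested sketch, assuming GNU sed and that the setup_requires declaration is still on line 138):

  cd PyOP2
  sed -i '138s/^/# /' setup.py   # comment out the setup_requires line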
I commented out the line 'setup_requires=setup_requires,' in setup.py and the PyOP2 installation goes through without problems.
* The anaconda/environment modules conflict is resolved: I have removed ANACONDA_LIB from the LD_LIBRARY_PATH and added it to LD_RUN_PATH instead. This means Python extension modules get an RPATH to that directory baked in and therefore can find the right libpython2.7.so without any LD_LIBRARY_PATH being set (which is preferable anyway).
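If you want to verify that the RPATH really ends up in an extension module, a sketch along these lines should work (assuming readelf is available and petsc4py is built):

  # print the RPATH/RUNPATH entries baked into the petsc4py extension module
  readelf -d $(python -c "from petsc4py import PETSc; print PETSc.__file__") | grep -E 'R(UN)?PATH'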
Works for me now, I can run the module command even after loading the firedrake environment.
Unfortunately, Firedrake's setup.py manually sets an RPATH for PETSc and therefore the LD_RUN_PATH is ignored. This patch is required:
diff --git a/setup.py b/setup.py
index 00838f8..350d51d 100644
--- a/setup.py
+++ b/setup.py
@@ -70,6 +70,7 @@ setup(name='firedrake',
                          include_dirs=include_dirs,
                          libraries=["petsc"],
                          extra_link_args=["-L%s/lib" % d for d in petsc_dirs] +
-                         ["-Wl,-rpath,%s/lib" % d for d in petsc_dirs]),
+                         ["-Wl,-rpath,%s/lib" % d for d in petsc_dirs] +
+                         ["-Wl,-rpath,%s/lib" % sys.prefix]),
               Extension('evtk.cevtk', evtk_sources,
                         include_dirs=[np.get_include()])])
Can you send me that patch as a separate text file, please? When I copied it out of your email it didn't work.

I have pushed this to firedrake master.
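If the patch gets mangled by the mail client again, saving it to a file and applying it with git avoids the copy-and-paste whitespace problems (just a sketch, the file name is arbitrary):

  cd firedrake
  git apply rpath.patch   # apply the saved patch instead of copying from the mail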
Still something not quite right with the python. When I run a 'Hello World!' test, i.e.
import sys
print 'Hello World!'
sys.exit(0)
I still get
python: error while loading shared libraries: libpython2.7.so.1.0: cannot open shared object file: No such file or directory

That works for me. Not sure why this would fail since you're not loading an extension module. Is this on the login node with anaconda?
but I guess this is due to the unresolved issue with anaconda that you reported to the helpdesk, so I will wait for what comes back from them.

Indeed, as reported to the helpdesk, it was premature to think the LD_LIBRARY_PATH was no longer required. I'm currently stuck on mpi4py even though all the RPATHs are correctly set afaict. For now you'll need to add $ANACONDA_LIBS to LD_LIBRARY_PATH in your firedrake.tpl.
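i.e. something along these lines in firedrake.tpl (the exact variable name for the anaconda lib directory may differ in your template):

  # anaconda lib dir on ARCHER; adjust if your template already defines a variable for it
  export LD_LIBRARY_PATH=/work/y07/y07/cse/anaconda/1.9.2/lib:${LD_LIBRARY_PATH}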
Florian
Thanks,
Eike
For FFC it's trickier; I'm still trying to figure out what's needed.
Keep us posted on how things are progressing!
Florian
I will go ahead and edit the wiki.
Eike
On 8 Sep 2014, at 15:53, Eike Mueller <E.Mueller@bath.ac.uk> wrote:
> Hi Florian,
>
> thanks a lot for your reply and for updating the wiki, I got a bit
> further now.
>
>> You're exactly right again: it seems that anaconda contains a
>> version of
>> the TCL libraries which are incompatible with those used by the
>> environment modules. I'm not sure how to work around this, since we
>> need
>> the anaconda libs (as you figured out later) - will open a ticket.
> ok, I will use the environment script as it is and not worry about the
> fact that the anaconda library breaks the module command at the
> moment. Let's see if the support can figure something out.
>
>> I've changed the wiki page to clone via https, thanks!
> It works with the updated instructions on the webpage. Strangely, if I
> clone a repository into a directory, delete this directory and then
> try to clone again with exactly the same command, nothing happens.
>
>> This seems to happen because (for reasons I can't figure out)
>> setuptools
>> doesn't detect that NumPy is already installed and builds it from
>> source. The primary cause of the error you're seeing is this:
>> https://stackoverflow.com/a/21621493
> I will ignore this then, and just call the setup.py script twice in
> the PyOP2 directory.
>
>> This is because you've removed the anaconda lib path, which also
>> contains the Python libraries themselves, from LD_LIBRARY_PATH
> I tried a simpler hello-world script where I swap python -> anaconda
> and run a python script. This does work.
> Still trying to run the poisson example from the benchmark repository,
> but the ARCHER queues seem to be terribly slow, even if I just run on
> one node on the debug queue :-(
>
>> Absolutely, feel free to edit the wiki if you find more errors or
>> things
>> that are unclear.
>>
> I think I already have access to the repository, so will edit if
> necessary.
>
> Thanks a lot,
>
> Eike
>
>> Florian
>>
>>> Cheers,
>>>
>>> Eike
>>>
>>> *** content of firedrake.env ***
>>>
>>> module swap PrgEnv-cray PrgEnv-gnu
>>> module unload python
>>> module add anaconda
>>>
>>> # Add fdrake module path
>>> export FDRAKE_DIR=/work/y07/y07/fdrake
>>> module use "$FDRAKE_DIR/modules"
>>>
>>> # Load build environment
>>> module load fdrake-build-env
>>> module load fdrake-python-env
>>>
>>> LD_LIBRARY_PATH=`echo $LD_LIBRARY_PATH | sed
>>> 's/\/work\/y07\/y07\/cse\/anaconda\/1.9.2\/lib//g'`
>>>
>>> export
>>> PYTHONPATH=$WORK/firedrake-bench:$WORK/firedrake-bench/pybench:$WORK/firedrake:$WORK/PyOP2:$PYTHONPATH
_______________________________________________
firedrake mailing list
firedrake@imperial.ac.uk
https://mailman.ic.ac.uk/mailman/listinfo/firedrake