[yt-users] inline yt in parallel

Christine Simpson csimpson at astro.columbia.edu
Thu Apr 11 12:00:17 PDT 2013


Hi Matt,

So I have a script called user_script.py in my run directory, where I have the enzo executable, parameter file, etc., and from which I'm running the enzo simulation.  What do you mean by 'yt directory'?  Do you mean site-packages?  The script I'm testing is just this (from the yt docs):

from yt.pmods import *

def main():
    # Wrap the enzo simulation currently running in this process
    pf = EnzoStaticOutputInMemory()

    pc = PlotCollection(pf)
    pc.add_slice("Density", 1)
    pc.save()

If I just type python2.7 user_script.py on the command line, it simply returns (although I have to change yt.pmods to yt.mods for even that to work).  I mean, there's no enzo output in memory unless the script is running inside an enzo simulation, right?
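I guess nothing ever calls main() in that case, and there's no in-memory output to wrap anyway.  The closest I can get to a standalone smoke test is probably something like the sketch below, pointing at a regular on-disk output instead (untested, and "DD0000/DD0000" is just a placeholder path):

from yt.mods import *

def main():
    # Outside of enzo there is no in-memory output, so load a normal
    # on-disk dataset instead (placeholder path).
    pf = load("DD0000/DD0000")

    pc = PlotCollection(pf)
    pc.add_slice("Density", 1)
    pc.save()

if __name__ == "__main__":
    # enzo calls main() itself when running inline; standalone we have to.
    main()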

Thanks for your help,

Christine
 
On Apr 11, 2013, at 2:55 AM, Matthew Turk wrote:

> Hi Christine,
> 
> On Thu, Apr 11, 2013 at 12:34 AM, Christine Simpson
> <csimpson at astro.columbia.edu> wrote:
>> Hi all,
>> 
>> Thanks for your observations and suggestions.  I had neglected to
>> install mpi4py, which was the original problem.  I installed that and
>> I can now run parallel yt scripts; however, I'm still having trouble
>> using inline yt.  I've pasted the error I now get below.  It is not
>> very informative (to me at least); the keyboard interrupt is the
>> symptom, not the cause of the problem, I think.  I'm doing this on
>> trestles, and I tried to use their parallel debugger ddt to get some
>> more information.  ddt seems to indicate that one of the processes is
>> looking for a file called mpi4py.MPI.c in the /tmp directory, which I
>> don't really understand and may be a red herring.  I don't have any
>> problems with single-processor jobs.  I installed yt with shared
>> libraries by adding the --enable-shared flag to the configure
>> statement for python in the install script.  I've also pasted the
>> enzo make file that I'm using below.  I'm thinking that I have
>> somehow messed up the libraries or include files.  If anyone has
>> successfully used inline yt on trestles and has any advice, I'd love
>> to hear it.
>> 
> 
> So this could probably be better covered by the documentation, but the
> inline yt process looks for a script called user_script.py in the yt
> directory, within which it will call the main() function.  This
> function can get access to the in-memory output by doing something
> like "pf = EnzoStaticOutputInMemory()" which will query the
> appropriate items.  Note that you can't access raw data like
> "sphere['Density']" but you can do operations like
> "sphere.quantities['Extrema']('Density')" and so on; anything that
> uses an opaque object is fine, but arrays of concatenated data
> generally aren't.
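> 
> For concreteness, user_script.py would look something like this (a
> rough sketch, untested; the sphere center and radius are just
> placeholder values in code units):
> 
>     from yt.pmods import *
> 
>     def main():
>         # Wraps the simulation currently running in this process
>         pf = EnzoStaticOutputInMemory()
>         sphere = pf.h.sphere([0.5, 0.5, 0.5], 0.1)
>         # Derived quantities on opaque objects work inline ...
>         mins, maxs = sphere.quantities["Extrema"]("Density")[0]
>         # ... but concatenated raw data like sphere["Density"] won't.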
> 
> If you do have the script "user_script.py" in your directory, then
> this generally means that there's a syntax error or something else
> preventing it from being imported.  I think if you have gotten this
> far and you don't have user_script.py, you probably are fine for the
> libraries and so on.  If you do have it, are you able to run it with
> "python2.7 user_script.py" ?
> 
> -Matt
> 
>> Thanks for all your help
>> Christine
>> 
>> Error:
>> 
>> MPI_Init: NumberOfProcessors = 3
>> warning: the following parameter line was not interpreted:
>> TestStarParticleEnergy  =  0.00104392468495
>> warning: the following parameter line was not interpreted:
>> TestStarParticleDensity  =  1.0
>> warning: the following parameter line was not interpreted:
>> TestStarParticleStarMass  =  100.0
>> ****** ReadUnits:  2.748961e+37 1.000000e-24 3.018025e+20 3.150000e+13 *******
>> Global Dir set to .
>> Initialdt in ReadParameterFile = 4.815337e-05
>> InitializeNew: Starting problem initialization.
>> Central Mass: 6813.382812
>> Allocated 1 particles
>> Initialize Exterior
>> ExtBndry: BoundaryRank = 3
>> ExtBndry: GridDimension = 104 104 104
>> ExtBndry: NumberOfBaryonFields = 6
>> InitializeExternalBoundaryFace
>> SimpleConstantBoundary FALSE
>> End of set exterior
>> InitializeNew: Initial grid hierarchy set
>> InitializeNew: Partition Initial Grid 0
>> Enter CommunicationPartitionGrid.
>> PartitionGrid (on all processors): Layout = 1 1 3
>> NumberOfNewGrids = 3
>> GridDims[0]:  98
>> GridDims[1]:  98
>> GridDims[2]:  33 32 33
>> StartIndex[0]:  0
>> StartIndex[1]:  0
>> StartIndex[2]:  0 33 65
>> Call ZeroSUS on TopGrid
>> ENZO_layout 1 x 1 x 3
>> Grid structure: 1576
>> SubGrids structure: 4728
>> Re-set Unigrid = 0
>> Grid distribution
>> Delete OldGrid
>> OldGrid deleted
>> Exit CommunicationPartitionGrid.
>> InitializeNew: Finished problem initialization.
>> Initializing Python interface
>> Successfully read in parameter file StarParticleTest.enzo.
>> INITIALIZATION TIME =   9.38615084e-01
>> Beginning parallel import block.
>> MPI process (rank: 1) terminated unexpectedly on trestles-12-20.local
>> Exit code -5 signaled from trestles-12-20
>> Traceback (most recent call last):
>>  File "<string>", line 1, in <module>
>>  File "./user_script.py", line 1, in <module>
>>    from yt.pmods import *
>>  File "/home/csimpson/yt-x86_64-shared/src/yt-hg/yt/pmods.py", line 364, in <module>
>>    from yt.mods import *
>>  File "/home/csimpson/yt-x86_64-shared/src/yt-hg/yt/pmods.py", line 234, in __import_hook__
>>    q, tail = __find_head_package__(parent, name)
>>  File "/home/csimpson/yt-x86_64-shared/src/yt-hg/yt/pmods.py", line 323, in __find_head_package__
>>    q = __import_module__(head, qname, parent)
>>  File "/home/csimpson/yt-x86_64-shared/src/yt-hg/yt/pmods.py", line 268, in __import_module__
>>    pathname,stuff,ierror = mpi.bcast((pathname,stuff,ierror))
>>  File "/home/csimpson/yt-x86_64-shared/src/yt-hg/yt/pmods.py", line 201, in bcast
>>    return MPI.COMM_WORLD.bcast(obj,root)
>> KeyboardInterrupt
>> Caught fatal exception:
>> 
>>   'Importing user_script failed!'
>> at InitializePythonInterface.C:108
>> 
>> Backtrace:
>> 
>> BT symbol: ./enzo.exe [0x41ff8a]
>> BT symbol: ./enzo.exe [0x727e14]
>> BT symbol: ./enzo.exe [0x421147]
>> BT symbol: /lib64/libc.so.6(__libc_start_main+0xf4) [0x3c0121d994]
>> BT symbol: ./enzo.exe(__gxx_personality_v0+0x3d9) [0x41fea9]
>> terminate called after throwing an instance of 'EnzoFatalException'
>> 
>> Make file:
>> 
>> #=======================================================================
>> #
>> # FILE:        Make.mach.trestles
>> #
>> # DESCRIPTION: Makefile settings for the Trestles Resource at SDSC/UCSD
>> #
>> # AUTHOR:      John Wise (jwise at astro.princeton.edu)
>> #
>> # DATE:        07 Dec 2010
>> #
>> #
>> #=======================================================================
>> 
>> MACH_TEXT  = Trestles
>> MACH_VALID = 1
>> MACH_FILE  = Make.mach.trestles
>> 
>> MACHINE_NOTES = "MACHINE_NOTES for Trestles at SDSC/UCSD: \
>>        Load these modules, \
>>        'module add intel/11.1 mvapich2/1.5.1p1'"
>> 
>> #-----------------------------------------------------------------------
>> # Compiler settings
>> #-----------------------------------------------------------------------
>> 
>> LOCAL_MPI_INSTALL = /home/diag/opt/mvapich2/1.5.1p1/intel/
>> LOCAL_PYTHON_INSTALL = /home/csimpson/yt-x86_64-shared/
>> #LOCAL_COMPILER_DIR = /opt/pgi/linux86-64/10.5
>> LOCAL_COMPILER_DIR = /opt/intel/Compiler/11.1/072
>> LOCAL_HYPRE_INSTALL =
>> 
>> # With MPI
>> 
>> MACH_CPP       = cpp
>> MACH_CC_MPI    = $(LOCAL_MPI_INSTALL)/bin/mpicc # C compiler when using MPI
>> MACH_CXX_MPI   = $(LOCAL_MPI_INSTALL)/bin/mpicxx # C++ compiler when using MPI
>> MACH_FC_MPI    = $(LOCAL_MPI_INSTALL)/bin/mpif90 # Fortran 77 compiler when using MPI
>> MACH_F90_MPI   = $(LOCAL_MPI_INSTALL)/bin/mpif90 # Fortran 90 compiler when using MPI
>> MACH_LD_MPI    = $(LOCAL_MPI_INSTALL)/bin/mpicxx # Linker when using MPI
>> 
>> # Without MPI
>> 
>> MACH_CC_NOMPI  = $(LOCAL_COMPILER_DIR)/bin/intel64/icc # C compiler when not using MPI
>> MACH_CXX_NOMPI = $(LOCAL_COMPILER_DIR)/bin/intel64/icpc # C++ compiler when not using MPI
>> MACH_FC_NOMPI  = $(LOCAL_COMPILER_DIR)/bin/intel64/ifort # Fortran 77 compiler when not using MPI
>> MACH_F90_NOMPI = $(LOCAL_COMPILER_DIR)/bin/intel64/ifort # Fortran 90 compiler when not using MPI
>> MACH_LD_NOMPI  = $(LOCAL_COMPILER_DIR)/bin/intel64/icpc # Linker when not using MPI
>> 
>> #-----------------------------------------------------------------------
>> # Machine-dependent defines
>> #-----------------------------------------------------------------------
>> # Defines for the architecture; e.g. -DSUN, -DLINUX, etc.
>> MACH_DEFINES   = -DLINUX -DH5_USE_16_API
>> 
>> #-----------------------------------------------------------------------
>> # Compiler flag settings
>> #-----------------------------------------------------------------------
>> 
>> 
>> MACH_CPPFLAGS = -P -traditional
>> MACH_CFLAGS   =
>> MACH_CXXFLAGS =
>> MACH_FFLAGS   =
>> MACH_F90FLAGS =
>> MACH_LDFLAGS  =
>> 
>> #-----------------------------------------------------------------------
>> # Precision-related flags
>> #-----------------------------------------------------------------------
>> 
>> MACH_FFLAGS_INTEGER_32 = -i4
>> MACH_FFLAGS_INTEGER_64 = -i8
>> MACH_FFLAGS_REAL_32    = -r4
>> MACH_FFLAGS_REAL_64    = -r8
>> 
>> #-----------------------------------------------------------------------
>> # Optimization flags
>> #-----------------------------------------------------------------------
>> 
>> MACH_OPT_WARN        = -Wall # Flags for verbose compiler warnings
>> MACH_OPT_DEBUG       = -O0 -g # Flags for debugging
>> # Flags for high conservative optimization
>> #MACH_OPT_HIGH        = -O1 -ftz -mieee-fp -fp-speculation=off -prec-sqrt -prec-div
>> MACH_OPT_HIGH        = -O2
>> # Note that this breaks determinism, which is why it's commented out!
>> #
>> MACH_OPT_AGGRESSIVE  = -O3 # Flags for aggressive optimization
>> # This is the best we can do, from what I can tell.
>> #MACH_OPT_AGGRESSIVE  = -O1 -ftz -mieee-fp -fp-speculation=off -prec-sqrt -prec-div
>> 
>> #-----------------------------------------------------------------------
>> # Includes
>> #-----------------------------------------------------------------------
>> 
>> LOCAL_INCLUDES_MPI    =
>> LOCAL_INCLUDES_HDF5   = -I/home/csimpson/yt-x86_64-shared/include # HDF5 includes
>> LOCAL_INCLUDES_HYPRE  =
>> LOCAL_INCLUDES_PAPI   = # PAPI includes
>> LOCAL_INCLUDES_PYTHON = -I$(LOCAL_PYTHON_INSTALL)/include/python2.7 \
>>                        -I$(LOCAL_PYTHON_INSTALL)/lib/python2.7/site-packages/numpy/core/include
>> 
>> MACH_INCLUDES         = $(LOCAL_INCLUDES_HDF5)
>> MACH_INCLUDES_PYTHON  = $(LOCAL_INCLUDES_PYTHON)
>> MACH_INCLUDES_MPI     = $(LOCAL_INCLUDES_MPI)
>> MACH_INCLUDES_HYPRE   = $(LOCAL_INCLUDES_HYPRE)
>> MACH_INCLUDES_PAPI    = $(LOCAL_INCLUDES_PAPI)
>> 
>> #-----------------------------------------------------------------------
>> # Libraries
>> #-----------------------------------------------------------------------
>> 
>> LOCAL_LIBS_MPI    =
>> LOCAL_LIBS_HDF5   = -L/home/csimpson/yt-x86_64-shared/lib -lhdf5 # HDF5 libraries
>> LOCAL_LIBS_HYPRE  =
>> LOCAL_LIBS_PAPI   = # PAPI libraries
>> LOCAL_LIBS_PYTHON  = -L$(LOCAL_PYTHON_INSTALL)/lib -lpython2.7 \
>>                     -lreadline -ltermcap -lutil
>> 
>> #LOCAL_LIBS_MACH   = -L$(LOCAL_COMPILER_DIR)/lib \
>> #                       -lpgf90 -lpgf90_rpm1 -lpgf902 -lpgf90rtl -lpgftnrtl -lrt
>> LOCAL_LIBS_MACH  = -L$(LOCAL_COMPILER_DIR)/lib/intel64 -lifcore -lifport
>> 
>> 
>> MACH_LIBS         = $(LOCAL_LIBS_HDF5) $(LOCAL_LIBS_MACH)
>> MACH_LIBS_MPI     = $(LOCAL_LIBS_MPI)
>> MACH_LIBS_HYPRE   = $(LOCAL_LIBS_HYPRE)
>> MACH_LIBS_PAPI    = $(LOCAL_LIBS_PAPI)
>> MACH_LIBS_PYTHON  = $(LOCAL_LIBS_PYTHON)
>> 
>> 



