[yt-users] parallel yt on ranger

Christine Simpson csimpson at astro.columbia.edu
Thu Apr 7 19:40:54 PDT 2011


Thanks Matt.  I was able to install mpi4py with this and now it works.

Christine
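
In case it helps anyone who finds this thread later: with mpi4py installed
into the yt Python (via pip, as Matt suggested), the job script that ended
up working is roughly the sketch below.  The SGE directives are the ones
from my earlier script; the run time, job name, and queue are just what I
happened to use.

```shell
#!/bin/sh
#  SGE job script for Ranger.  ibrun reads the task count from the
#  "-pe" line, so there is no mpirun and no -np on the launch line.
#$ -l h_rt=00:05:00
#$ -pe 8way 16
#$ -N test_parallel_yt
#$ -o $JOB_NAME.o$JOB_ID
#$ -q development
#$ -V
#$ -cwd

# Sanity check: make sure the Python that will run the job can actually
# import mpi4py before launching the parallel tasks.
python2.7 -c "from mpi4py import MPI" || exit 1

ibrun python2.7 test_parallel_yt.py --parallel
```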

On Thu, 2011-04-07 at 19:47 -0400, Matthew Turk wrote:
> Hi Christine,
> 
> Unfortunately, no, you have to install mpi4py yourself.  You may be
> able to do this using:
> 
> pip install mpi4py
> 
> I've been looking at ways of including this in the installer script,
> but I've been a bit stymied by the myriad of MPI wrappers and names.
> I'm optimistic I have *some* kind of solution, but I haven't had a
> chance to test it in detail.  For now it's a bit DIY.
> 
> -Matt
> 
> On Thu, Apr 7, 2011 at 7:45 PM, Christine Simpson
> <csimpson at astro.columbia.edu> wrote:
> > Hi Britton,
> >
> > Yes, this was my understanding of ibrun, and what you suggested is the
> > first thing I tried.  It did not work, however, and I was led to try
> > something like ibrun mpirun from some old discussions on the list.
> >
> > However, looking back at the crash, there was no useful info in the
> > .o file, but from the .e file it looks like the job actually got to
> > the script and then failed to import some modules, specifically the
> > mpi4py module.  Does that come with the yt package?  The script runs
> > fine on the command line.  Here is the output from the .e file,
> > followed by the script:
> >
> > [each of the eight MPI tasks printed the same traceback; the
> > interleaved copies have been collapsed into one:]
> >
> > Traceback (most recent call last):
> >   File "test_parallel_yt.py", line 1, in <module>
> >     from yt.mods import *
> >   File "/share/home/01112/tg803911/yt_4apr2011/yt-x86_64/src/yt-hg/yt/mods.py", line 44, in <module>
> >     from yt.data_objects.api import \
> >   File "/share/home/01112/tg803911/yt_4apr2011/yt-x86_64/src/yt-hg/yt/data_objects/api.py", line 34, in <module>
> >     from hierarchy import \
> >   File "/share/home/01112/tg803911/yt_4apr2011/yt-x86_64/src/yt-hg/yt/data_objects/hierarchy.py", line 40, in <module>
> >     from yt.utilities.parallel_tools.parallel_analysis_interface import \
> >   File "/share/home/01112/tg803911/yt_4apr2011/yt-x86_64/src/yt-hg/yt/utilities/parallel_tools/parallel_analysis_interface.py", line 47, in <module>
> >     from mpi4py import MPI
> > ImportError: No module named mpi4py
> > mpispawn.c:303 Unexpected exit status
> >
> > Child exited abnormally!
> > Killing remote processes...DONE
> >
> >
> > script:
> > from yt.mods import *
> > from yt.visualization.api import PlotCollection
> >
> > import matplotlib.colorbar as cb
> >
> > path = "/scratch/01112/tg803911/halo88_therm_feed_1e-6_LW_RadiationShield_lmax12/DD0010/"
> > fn = "output_0010"
> > pf = load(path + fn)               # load the dataset
> > pc = PlotCollection(pf)
> > pc.add_projection("Density", 0)    # project Density along the x-axis
> > pc.set_width(20, 'kpc')
> > pc.save(fn)
> >
> > On Thu, 2011-04-07 at 17:24 -0400, Britton Smith wrote:
> >> Hi Christine,
> >>
> >> mpirun and ibrun do basically the same thing; on Ranger, you just
> >> want to use ibrun.  You also don't need to specify the number of
> >> processors on Ranger; ibrun gets that from the "-pe 8way 16" line at
> >> the top of your job script.
> >>
> >> The command in your script should be:
> >>
> >> ibrun python2.7 test_parallel_yt.py --parallel
> >>
> >> Britton
> >>
> >> On Thu, Apr 7, 2011 at 5:20 PM, Christine Simpson
> >> <csimpson at astro.columbia.edu> wrote:
> >>         Hi all,
> >>
> >>         I'm trying to run yt in parallel on ranger.  I gather from
> >>         previous
> >>         messages to this list that others have had issues in the past,
> >>         but I
> >>         haven't been able to find something that works from those
> >>         posts.
> >>
> >>         I'm trying to run a test script that does a simple projection.
> >>
> >>         First I tried this (from the yt docs):
> >>
> >>         ibrun mpirun -np 8 python2.7 test_parallel_yt.py --parallel
> >>
> >>         I got this output in the output file:
> >>         TACC: Setting memory limits for job 1896644 to unlimited KB
> >>         TACC: Dumping job script:
> >>         --------------------------------------------------------------------------------
> >>         #  Submit this script using the "qsub" command.
> >>         #  Use the "qstat" command to check the status of a job.
> >>         #
> >>         #$-l h_rt=00:05:00
> >>         #$-pe 8way 16
> >>         #$-N test_parallel_yt
> >>         #$-o $JOB_NAME.o$JOB_ID
> >>         #$-q development
> >>         #$-M csimpson at astro.columbia.edu
> >>         #$-m be
> >>         #$-V
> >>         #$-cwd
> >>
> >>         ibrun mpirun -np 8 python2.7 test_parallel_yt.py --parallel
> >>
> >>
> >>
> >>
> >>         --------------------------------------------------------------------------------
> >>         TACC: Done.
> >>         TACC: Starting up job 1896644
> >>         TACC: Setting up parallel environment for MVAPICH ssh-based
> >>         mpirun.
> >>         TACC: Setup complete. Running job script.
> >>         TACC: starting parallel tasks...
> >>         Warning: Command line arguments for program should be given
> >>         after the program name.  Assuming that test_parallel_yt.py is
> >>         a command line argument for the program.
> >>         [the warning above appeared once for each of the eight tasks]
> >>         Warning: Command line arguments for program should be given
> >>         after the program name.  Assuming that --parallel is a
> >>         command line argument for the program.
> >>         Missing: program name
> >>         Program python2.7 either does not exist, is not
> >>         executable, or is an erroneous argument to mpirun.
> >>         Warning: Command line arguments for program should be given
> >>         after the program name.  Assuming that --parallel is a
> >>         command line argument for the program.
> >>         TACC: MPI job exited with code: 1
> >>         TACC: Shutting down parallel environment.
> >>         TACC: Shutdown complete. Exiting.
> >>         TACC: Cleaning up after job: 1896644
> >>         TACC: Done.
> >>
> >>
> >>         I also tried this:
> >>
> >>         ibrun mpi4py -np 8 python2.7 test_parallel_yt.py --parallel
> >>
> >>         and got this output:
> >>         TACC: Setting memory limits for job 1896674 to unlimited KB
> >>         TACC: Dumping job script:
> >>         --------------------------------------------------------------------------------
> >>         #!/bin/sh
> >>         #
> >>         #  Submit this script using the "qsub" command.
> >>         #  Use the "qstat" command to check the status of a job.
> >>         #
> >>         #$-l h_rt=00:05:00
> >>         #$-pe 8way 16
> >>         #$-N test_parallel_yt
> >>         #$-o $JOB_NAME.o$JOB_ID
> >>         #$-q development
> >>         #$-M csimpson at astro.columbia.edu
> >>         #$-m be
> >>         #$-V
> >>         #$-cwd
> >>
> >>         ibrun mpi4py -np 8 python2.7 test_parallel_yt.py --parallel
> >>
> >>
> >>
> >>
> >>         --------------------------------------------------------------------------------
> >>         TACC: Done.
> >>         TACC: Starting up job 1896674
> >>         TACC: Setting up parallel environment for MVAPICH ssh-based
> >>         mpirun.
> >>         TACC: Setup complete. Running job script.
> >>         TACC: starting parallel tasks...
> >>         TACC: MPI job exited with code: 1
> >>         TACC: Shutting down parallel environment.
> >>         TACC: Shutdown complete. Exiting.
> >>         TACC: Cleaning up after job: 1896674
> >>         TACC: Done.
> >>
> >>
> >>         Any ideas?  I guess in the first instance it is not finding
> >>         python, but
> >>         the test script I run works fine on the command line and doing
> >>         -V should
> >>         import the same environment settings, right?  I guess in the
> >>         second
> >>         instance I'm using the wrong mpi call.  I found info on that
> >>         call in
> >>         some old posts to the email list.
> >>
> >>         Christine
> >>
> >>
> >>         _______________________________________________
> >>         yt-users mailing list
> >>         yt-users at lists.spacepope.org
> >>         http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
> >>
> >
> >
> 