[yt-users] Projection Performance

Richard P Wagner rpwagner at sdsc.edu
Tue May 1 22:03:02 PDT 2012


Thanks for the suggestions. Fifteen minutes or so for the projection is much better than what I saw. I will try running in parallel, and I suspect this will help significantly. I didn't the first time because the higher-redshift data was so quick. Although I'm wondering if the slowdown was also because I had cranked up the figure size.

For my use case, I'm not sure if it's better to save the projections for later use or to write the results immediately. I'm trying to write scripts that create high-resolution images and can also serve as examples. Other than dialing in the colormaps for the examples, there won't be any need to reload them. In fact, the next step will be to create fixed resolution buffers for saving to HDF5 files.
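Something like this is what I have in mind for that last step (a rough, untested sketch assuming the yt 2.x interface; the dataset path, bounds, resolution, and output filename are just placeholders):

from yt.mods import *
from yt.visualization.fixed_resolution import FixedResolutionBuffer
import h5py

pf = load("RD0036/RedshiftOutput0036")  # placeholder dataset path
proj = pf.h.proj(0, "Density")          # unweighted density projection along x

# Pixelize the adaptive projection onto a fixed 4096^2 grid; the bounds
# are (xmin, xmax, ymin, ymax) in domain coordinates.
frb = FixedResolutionBuffer(proj, (0.0, 1.0, 0.0, 1.0), (4096, 4096))

# Dump the buffer to an HDF5 file with h5py.
with h5py.File("L7_z0_density_x.h5", "w") as f:
    f.create_dataset("Density", data=frb["Density"])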

--Rick

On May 1, 2012, at 6:11 PM, Britton Smith wrote:

To add to what Sam said, I think you can force yt to save the projections of the individual regions by adding the node_name=<string> keyword argument to the projection call.  As long as you give unique names, there shouldn't be a problem, and the same script should be able to reload them from the .yt file.
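Something along these lines, if I remember the keyword placement right (the region bounds and the name are just examples):

# An inner sub-volume of the box, purely as an example.
reg = pf.h.region([0.5, 0.5, 0.5], [0.25, 0.25, 0.25], [0.75, 0.75, 0.75])

# A unique node_name lets this projection be stored in, and later reloaded
# from, the .yt file alongside the full-box projections.
proj = pf.h.proj(0, "Density", source=reg, node_name="L7_z0_inner_half_x")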

Britton

On Tue, May 1, 2012 at 9:05 PM, Sam Skillman <samskillman at gmail.com> wrote:
Hi Rick,

For me, running on a single 2.93 GHz Xeon core, it takes 9.818254e+02 seconds to project the LCL7 for unweighted density. With the change I mention below, on 8 cores, this brought the time down to 1.796410e+02 seconds.  If you have a parallel file system, the speedup will probably be even better.

For the light cone, you may benefit quite a lot by swapping out the quad tree projection for the (for now) much more parallel overlap projection method and running in parallel.  This works best for naturally domain-decomposed simulations like the light cone.  Note this is only faster in parallel; the quad tree method is much faster in serial. You would do this with:

pf.h.proj = pf.h.overlap_proj

after you've loaded the parameter file, and then run your script (as is) with mpirun -np N your_script.py --parallel
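In context, a rough sketch of the parallel version (the path and field are placeholders, and if I remember right yt only turns on its MPI parallelism when --parallel is passed on the command line):

from yt.mods import *

pf = load("light_cone/DD0000/DD0000")  # placeholder dataset path

# Swap the default quad tree projection for the overlap projection,
# which parallelizes much better for domain-decomposed data.
pf.h.proj = pf.h.overlap_proj

# Each axis projection will now be distributed across the MPI tasks.
for axis in range(3):
    pf.h.proj(axis, "Density")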

You can probably get decent speedups for N = 32 or 64. There are some scaling performance plots in the yt method paper from about a year ago.

I might also suggest forgetting about saving all of these through the plot collection and instead just making all the projections first.  They will be saved in the .yt file and can later be recalled in serial, where you can play around with the various figures.  This happens automatically, so when you try to load up a plot collection with, for example, a density projection, it will just read it in from the .yt file.  That way you can get all the adaptive projections and pan/zoom/export however you want later.  Note that this method of saving the projections will only work for the full-box projections; the region-based projections are not currently saved automatically.
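A sketch of that two-pass workflow, again assuming the yt 2.x interface (the path, center, and output name are placeholders):

from yt.mods import *

pf = load("RD0036/RedshiftOutput0036")  # placeholder dataset path

# Pass 1 (parallel or serial): just compute the full-box projections.
# They get serialized into the .yt file automatically.
for axis in range(3):
    pf.h.proj(axis, "Density")

# Pass 2 (serial, later): the plot collection finds the stored projections
# in the .yt file instead of recomputing them.
pc = PlotCollection(pf, center=[0.5, 0.5, 0.5])
for axis in range(3):
    pc.add_projection("Density", axis)
pc.save("L7_z0")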

Anyways, those are my two cents.
Sam





On Tue, May 1, 2012 at 6:15 PM, Richard P Wagner <rpwagner at sdsc.edu> wrote:
Hi,

I wanted to build a sequence of projections using various colormaps along each axis. The data set I'm using is the z = 0 one from the L7 simulation some of you are familiar with. Here's a paste of my current script:
 http://paste.yt-project.org/show/2335/

(The early sys.exit is deliberate.)

Does anyone have an estimate of how long this projection should take when run serially? After two hours it was still going without having produced the first image. The same plots done on the z = 2.75 data took about 15 minutes (although that data has about 1/5 the number of grids).

I will also gladly take advice on better methods for the projections, or on whether doing this in parallel is worth it.

Thanks,
Rick

_______________________________________________
yt-users mailing list
yt-users at lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
