[yt-users] How to efficiently control the memory in yt with large simulation?

Romain Teyssier romain.teyssier at gmail.com
Fri Dec 6 06:54:48 PST 2013


One possibility would be to define a bounding box, so that yt loads RAMSES data only where it falls inside that box.
This can be done efficiently using the Hilbert key.
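
For illustration, such an interface might look something like the sketch
below. The bbox keyword shown here is hypothetical, it is only meant to
show what a bounding-box-restricted load could look like; the Hilbert
keys recorded in the info file are what would let yt open only the CPU
files that intersect the box.

=====
from yt.mods import *

# Hypothetical interface (the 'bbox' keyword is assumed here, not an
# existing option): restrict reading to a sub-volume, given in code
# units, so that only the RAMSES CPU files whose Hilbert-key ranges
# intersect this box need to be opened.
bbox = [[0.45, 0.55], [0.45, 0.55], [0.45, 0.55]]
ds = load("../output_00048/info_00048.txt", bbox=bbox)
=====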

Romain

On 06 Dec 2013, at 15:49, Matthew Turk <matthewturk at gmail.com> wrote:

> Hi Junhwan,
> 
> There's some more background on this issue here:
> 
> http://lists.spacepope.org/pipermail/yt-users-spacepope.org/2013-November/004265.html
> 
> Basically, what it amounts to is:
> 
> * Right now RAMSES uses too much memory and duplicates the full mesh
> on every processor
> * This is not how it will always be
> * Unfortunately, changing this can't be prioritized in the next couple of weeks
> 
> Your mesh is *particularly* large compared to what we've dealt with
> before.  As it stands, I would be surprised if it worked.  The fix for
> this would be to change the octrees to be parsed on demand rather than
> at instantiation of the RAMSESHierarchy.  Sam Geen and I have talked a
> bit about this, and he may be interested in working on it.  It's a
> change that will also happen (differently) for N-body datasets, and it
> is definitely planned, but it isn't being prioritized at this very
> moment because of other pressing concerns.
> 
> On Thu, Dec 5, 2013 at 10:47 PM, Junhwan Choi (최준환)
> <choi.junhwan at gmail.com> wrote:
>> Hi all,
>> 
>> I am trying to make some visualizations (density/temperature
>> projections) of a large simulation with yt.
>> The simulation is a 4096^3 unigrid RAMSES simulation.
>> 
>> I tried to make basic density and temperature projection plots with
>> the following script, but I ran into memory problems and the yt run crashed.
>> =====
>> from yt.mods import *
>> 
>> ds = load("../output_00048/info_00048.txt",
>>           fields=["Density", "x-velocity", "y-velocity", "z-velocity",
>>                   "Pressure", "Metallicity", "Rad"])
>> center = [0., 0., 0.]
>> 
>> pw = ProjectionPlot(ds, "x", ("gas", "Density"),
>>                     weight_field="Density", center=center)
>> pw.zoom(1.01)
>> pw.save("allviewGas")
>> pw = ProjectionPlot(ds, "y", ("gas", "Density"),
>>                     weight_field="Density", center=center)
>> pw.zoom(1.01)
>> pw.save("allviewGas")
>> pw = ProjectionPlot(ds, "z", ("gas", "Density"),
>>                     weight_field="Density", center=center)
>> pw.zoom(1.01)
>> pw.save("allviewGas")
>> 
>> pw = ProjectionPlot(ds, "x", "Temperature",
>>                     weight_field="Density", center=center)
>> pw.zoom(1.01)
>> pw.save("allviewGas")
>> pw = ProjectionPlot(ds, "y", "Temperature",
>>                     weight_field="Density", center=center)
>> pw.zoom(1.01)
>> pw.save("allviewGas")
>> pw = ProjectionPlot(ds, "z", "Temperature",
>>                     weight_field="Density", center=center)
>> pw.zoom(1.01)
>> pw.save("allviewGas")
>> =====
>> 
>> Is there any way to reduce the memory usage when I make these visualizations?
>> If I use slices instead of projections, will that save memory?
>> I saw that yt supports parallelization.  Does parallelization
>> distribute the data across processes at the beginning of the read?
>> (But I would prefer to avoid parallelization for the moment.)
> 
> In principle, yes, slices will considerably reduce the memory usage,
> modulo the overhead of having all of your octrees in memory at once --
> which will still be substantial.
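> 
> For reference, a slice-based version of the script above might look
> like the sketch below (assuming the same dataset and field names;
> SlicePlot takes the same arguments as ProjectionPlot, minus the weight
> field):
> 
> =====
> from yt.mods import *
> 
> ds = load("../output_00048/info_00048.txt",
>           fields=["Density", "x-velocity", "y-velocity", "z-velocity",
>                   "Pressure", "Metallicity", "Rad"])
> center = [0., 0., 0.]
> 
> # A slice only reads the cells intersecting one plane, so far less
> # field data is touched than in a projection, although the full octree
> # metadata is still loaded at instantiation.
> for ax in ("x", "y", "z"):
>     for field in ("Density", "Temperature"):
>         sp = SlicePlot(ds, ax, field, center=center)
>         sp.zoom(1.01)
>         sp.save("allviewGasSlice")
> =====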
> 
> I will spend some time thinking about whether there's a hotfix we can
> apply to make this work for you right now, as is, but I suspect it may
> be a little while before it can be properly implemented.
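> 
> As for the parallelism question: yt can decompose projections across
> MPI tasks, but with the current RAMSES frontend each task still builds
> the full octree, so parallelism alone will not reduce the per-process
> memory.  If you do want to try it, a minimal sketch (assuming yt 2.x
> with mpi4py installed; depending on the version you may also need to
> pass --parallel on the command line) would be:
> 
> =====
> # launched as, e.g.:  mpirun -np 16 python make_projections.py
> # (make_projections.py is just a placeholder script name)
> from yt.pmods import *   # parallel-aware counterpart of yt.mods in yt 2.x
> 
> ds = load("../output_00048/info_00048.txt",
>           fields=["Density", "x-velocity", "y-velocity", "z-velocity",
>                   "Pressure", "Metallicity", "Rad"])
> 
> # The projection itself is decomposed across the MPI tasks.
> pw = ProjectionPlot(ds, "x", "Density", weight_field="Density")
> pw.save("allviewGas")
> =====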
> 
> -Matt
> 
>> 
>> Thank you in advance,
>> Junhwan
