[yt-users] Handling large files
Adam Jacobs
adam.jacobs at stonybrook.edu
Wed Nov 26 16:49:40 PST 2014
Is it unreasonable to expect yt on a workstation with 4x3 GHz cores
and ~20 GB of RAM to handle a ~100 GB dataset? I'm trying to select a
subset of the dataset with cut_region(), but I still run into hangs or
run out of RAM. For example, when I call write_out() on the cut region,
yt consumes all 20 GB of RAM and I have to kill the process. Is there a
preferred method for loading only a subset of the data? I don't need
anywhere near the full 100 GB.
I'm using yt dev and looking at BoxLib/Maestro data.
--
Adam Jacobs
Department of Physics and Astronomy, PhD Candidate
Stony Brook University
http://astro.sunysb.edu/amjacobs/