[yt-users] memory required estimate

gso at physics.ucsd.edu
Fri Mar 18 12:13:11 PDT 2011


Hi everyone, I have a simple memory question...

I was running a script that uses extract_connected_sets on a small 64^3 test
dataset and it worked fine (using the install script with python2.7), but when
I tried it on a large 1024^3 dataset I got the error at the end of this email.
This was on a Triton PDAF node with 8 cores and 64 GB of memory (I was using
only 1 core; I requested the extra cores only for their memory). I believe I
may have underestimated the amount of memory I needed.
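
For reference, the relevant part of my script looks roughly like this (a
minimal sketch only; the dataset path, field name, and contour bounds below
are placeholders, and I am loading the data with the usual yt 2.x
load()/all_data() pattern):

    from yt.mods import *

    pf = load("DD0100/DD0100")      # placeholder path to the 1024^3 dataset
    dd = pf.h.all_data()            # data object covering the whole domain

    field = "IonFraction"           # stand-in name for my derived field
    num_levels = 1                  # placeholder contour parameters
    ion_min, ion_max = 1.0e-3, 1.0

    # same call and indexing as in the traceback below
    new_region = dd.extract_connected_sets(field, num_levels,
                                           ion_min, ion_max)[1][0]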

If I run extract_connected_sets on a derived field built from two other
fields on a 1024^3 cube, I estimated the memory I need as:

1024^3 (number of cells)
x (2 + 1) (number of fields: the two source fields plus the derived field)
x 8 bytes (double precision)

Multiplying those numbers gives ~25 GB of memory, which I had. Am I missing
some other factors?
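
Spelled out, my back-of-the-envelope estimate was just the following (a quick
sketch of the arithmetic; it only counts the raw field arrays and ignores any
temporary copies or contour bookkeeping):

    cells = 1024**3            # number of cells
    n_fields = 2 + 1           # two source fields plus the derived field
    bytes_per_value = 8        # double precision

    total = cells * n_fields * bytes_per_value
    print "%.1f GiB (%.1f GB)" % (total / 1024.0**3, total / 1e9)
    # 24.0 GiB (25.8 GB)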

From
G.S.

First pass 100% |||||||||||||||||||||||||||||||||||||||||||||||| Time: 01:30:35
Calculating joins   0% |                                      | ETA:  --:--:--
Traceback (most recent call last):
  File "create_bubble.py", line 55, in <module>
    new_region = dd.extract_connected_sets(field, num_levels, ion_min, ion_max)[1][0]
  File "/home/gso/TritonYT/src/yt-hg/yt/data_objects/data_containers.py", line 2330, in extract_connected_sets
    cached_fields)
  File "/home/gso/TritonYT/src/yt-hg/yt/analysis_modules/level_sets/contour_finder.py", line 303, in identify_cont$
    boundary_tree = amr_utils.construct_boundary_relationships(fd)
  File "ContourFinding.pyx", line 92, in yt.utilities.amr_utils.construct_boundary_relationships (yt/utilities/amr$
IndexError: Out of bounds on buffer access (axis 0)

