[Yt-dev] HDF5 HaloFinder preloading

matthewturk at gmail.com
Tue May 5 09:32:29 PDT 2009


Looks mostly okay, but I think we need to rethink the initial mass
calculation before it gets committed.


http://codereview.appspot.com/59041/diff/1/2
File HaloFinding.py (right):

http://codereview.appspot.com/59041/diff/1/2#newcode197
Line 197:
Not sure we need this?

http://codereview.appspot.com/59041/diff/1/2#newcode433
Line 433: padded, LE, RE, self._data_source = self._partition_hierarchy_3d(padding=self.padding)
We should get rid of this initial step of partitioning the entire
hierarchy, and instead move to using a DerivedQuantity.  I'd say
something like

all_data = self.pf.h.all_data()
all_data.quantities["TotalQuantity"]("ParticleMassMsun", lazy_reader=True)

which will automatically parallelize.  This would eliminate most of the
following lines, and we could avoid the mpi_allsum as well.  I'll need to
add preloading support to the DerivedQuantity (DQ) object, however.
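For concreteness, here is a rough sketch of how the replacement could
look, assuming "pf" is the already-loaded parameter file and that
"TotalQuantity" hands back the summed value for the requested field; the
helper name total_particle_mass is just illustrative:

def total_particle_mass(pf):
    # Sketch only: pf is the loaded parameter file, and the field name
    # matches the one quoted above.
    all_data = pf.h.all_data()
    # lazy_reader=True lets the derived quantity iterate over grids and
    # parallelize on its own, so the explicit mpi_allsum goes away.
    return all_data.quantities["TotalQuantity"]("ParticleMassMsun",
                                                lazy_reader=True)

That would keep the initial hierarchy partitioning out of the mass
calculation entirely, which is the point of the suggestion above.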

http://codereview.appspot.com/59041/diff/1/2#newcode450
Line 450: self._data_source.hierarchy.queue)
Should be fine.

http://codereview.appspot.com/59041


