[yt-users] TotalQuantity vs. summing manually

Matthew Turk matthewturk at gmail.com
Sat Dec 21 05:43:35 PST 2013


Hi Geoffrey,

On Fri, Dec 20, 2013 at 9:02 PM, Geoffrey So <gsiisg at gmail.com> wrote:
> Hi all,
>
> I found that using an Enzo dataset I was getting slightly different
> numbers when
> 1) using the list of dark matter particles selected by creation_time < 0.0
>
> In [97]: sph_dm = sph['creation_time'] < 0.0
> In [98]: print "%12.12e" % (sph['ParticleMassMsun'][sph_dm]).sum()
> 1.211311468567e+11
>
> 2) compared to summing the dark matter particles inside a 3D container with
> TotalQuantity
>
> In [101]: print "%12.12e" % (sph.quantities['TotalQuantity']('Dark_Matter_Density')[0]*vol/Msun)
> 1.188937185993e+11
>
> I'm wondering if the field and particles are handled differently when being
> counted as inside or outside the 3D container?
>

I think John's answer is completely correct and likely identifies the
dominant problem, but there's a funny thing that happens if you're not
extra careful with big arrays.  Here's an example of something I ran
into a couple of years ago:

import numpy

# This is an array filled with 512**3 = 134217728 ones.
arr = numpy.ones((512, 512, 512), dtype="float32")
print arr.sum()                   # accumulates in float32
print arr.size                    # 134217728
print arr.sum(dtype = "float64")  # accumulates in float64; prints 134217728.0

The output isn't what you might expect, because numpy uses the dtype
of the array for the accumulator: summing a float32 array accumulates
in float32, and once the running total reaches 2**24 = 16777216,
adding another 1.0 no longer changes it, so a naive left-to-right
accumulation comes out well short of arr.size.  The derived quantities
-- thanks to Doug Rudd -- are uniformly careful about upgrading their
accumulators to 64 bits.  Again, I don't think this is the dominant
effect here, but it is something to be aware of.
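
If you want to rule this out on your end, a minimal sketch (reusing
sph, vol, and Msun from your script, so treat those names as
placeholders) would be to force a 64-bit accumulator on the manual
particle sum and compare it against the derived quantity:

import numpy

# Manual sum of dark matter particle masses with an explicit float64
# accumulator, using the same selection as in your question.
sph_dm = sph['creation_time'] < 0.0
manual = numpy.sum(sph['ParticleMassMsun'][sph_dm], dtype="float64")
print "%12.12e" % manual

# The derived quantity, for comparison, already accumulates in 64 bits.
total = sph.quantities['TotalQuantity']('Dark_Matter_Density')[0] * vol / Msun
print "%12.12e" % total

If the two numbers still disagree after that, the remaining difference
is presumably down to John's point about how the particles and the
gridded field are selected, not to precision.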

-Matt

> From
> G.S.
>
> _______________________________________________
> yt-users mailing list
> yt-users at lists.spacepope.org
> http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
>


