[Yt-dev] parallel profiler problem

Matthew Turk matthewturk at gmail.com
Tue Mar 24 11:36:51 PDT 2009


Okay, I have pulled up some source code and the RD0035 dataset and
looked at this.  Here's what's going on.

The sphere object gets all candidate grids from a first pass:

http://yt.enzotools.org/browser/trunk/yt/lagos/HierarchyType.py#L418
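
Roughly, that first pass does something like the following (a
simplified, self-contained sketch, not the verbatim HierarchyType.py
code; the array names mirror the snippets further down):

import numpy as na  # yt's customary alias for numpy

def first_pass_sphere_grids(grid_left_edge, grid_right_edge, grid_dxs,
                            center, radius):
    # grid_left_edge, grid_right_edge: (Ngrids, 3) bounding boxes
    # grid_dxs: (Ngrids, 1) cell widths; center: (3,); radius: scalar
    grid_centers = 0.5 * (grid_left_edge + grid_right_edge)
    dist = na.sqrt(((grid_centers - center) ** 2).sum(axis=1))
    long_axis = (grid_right_edge - grid_left_edge).max(axis=1)
    # Keep grids whose cells are at least as fine as the sphere radius
    # and whose bounding box could plausibly overlap the sphere.
    return na.where(na.logical_and((grid_dxs <= radius)[:, 0],
                                   dist < (radius + long_axis)))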

This is the first-pass estimate of which grids could be members, which
then gets cut down in the actual sphere object; these two grids are
red herrings.  Note that this pass requires grid dx <= radius, which
will be important in a moment.  There are 36 particles within this
region, which I was able to find by turning off the criterion that
grids have dx <= the radius of the sphere.  The ids of the grids in
which these particles live are:

array([ 162648.,  162648.,  162648.,  162648.,  162648.,  162648.,
        162648.,  162648.,  162648.,  162648.,  162648.,  162648.,
        162648.,  162648.,  162648.,  162648.,  162648.,  162648.,
        162648.,  162757.,  162757.,  162757.,  162757.,  162757.,
        162757.,  162757.,  162757.,  162757.,  162757.,  162757.,
        162757.,  162757.,  162757.,  162757.,  162757.,  162757.])

and the dx's are:

array([ 0.00097656,  0.00097656,  0.00097656,  0.00097656,  0.00097656,
        0.00097656,  0.00097656,  0.00097656,  0.00097656,  0.00097656,
        0.00097656,  0.00097656,  0.00097656,  0.00097656,  0.00097656,
        0.00097656,  0.00097656,  0.00097656,  0.00097656,  0.00097656,
        0.00097656,  0.00097656,  0.00097656,  0.00097656,  0.00097656,
        0.00097656,  0.00097656,  0.00097656,  0.00097656,  0.00097656,
        0.00097656,  0.00097656,  0.00097656,  0.00097656,  0.00097656,
        0.00097656])

I'm torn about this issue.  On the one hand, I don't really like the
idea of giving back data from grids with dx > radius.  On the other
hand, particles are not necessarily supposed to behave the same way as
gridded fields, so perhaps in this case it would be appropriate.
Changing the requirement from

gridI = na.where(na.logical_and((self.gridDxs <= radius)[:,0],
                                (dist < (radius + long_axis))) == 1)

to

gridI = na.where(na.logical_and((self.gridDxs <= 2.0*radius)[:,0],
                                (dist < (radius + long_axis))) == 1)

would satisfy me, and would work for this particular use case.  What
do you think?
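
To put numbers on it: the sphere Stephen is cutting (quoted below) has
a radius of 8.156776433e-04, while the grids hosting the particles
have dx = 0.00097656, so they fail the current dx <= radius cut but
would pass a dx <= 2*radius cut:

# Quick check with the values from this thread.
radius = 8.156776433e-04   # from the pf.h.sphere() call below
dx = 0.00097656            # dx of the particle-hosting grids
print(dx <= radius)        # False: rejected by the current criterion
print(dx <= 2.0 * radius)  # True: accepted with the 2*radius criterion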

I guess a secondary question would be about the meaning of this halo:
it's 36 particles, but they all lie within a radius smaller than the
dx of their host grid.  Once I've changed the grid-selection
requirement to 2*radius, we end up with only three cells included.  A
profile over three cells would be pretty uninformative.
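
A rough volume estimate agrees with that cell count: the sphere
encloses about (4/3)*pi*(radius/dx)**3, or roughly 2.4, cell volumes,
so picking up about three cells is what you'd expect.  (The exact
count depends on where cell centers fall relative to the sphere, so
this is only a sanity check.)

from math import pi
radius = 8.156776433e-04
dx = 0.00097656
print((4.0 / 3.0) * pi * (radius / dx) ** 3)  # ~2.4 cell volumes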

-Matt

On Tue, Mar 24, 2009 at 11:03 AM, Stephen Skory <stephenskory at yahoo.com> wrote:
>
> Hi guys,
>
>> Maybe you could try increasing the value of PROFILE_RADIUS_THRESHOLD in HaloProfiler.py.  Right now, it's set to 2, meaning that it will allow profiles for halos whose outer radii are only 2 times the inner radii.  That's probably too generous.  Even halos of that size should probably be ignored.
>
> The radius of this problem halo divided by pf.h.get_smallest_dx() is 26, so I don't think that's the issue.
>
> Looking at the hierarchy, below are the two grids listed in sphere._grids. The sphere being cut is
>
> sp = pf.h.sphere([6.983622952e-01, 3.006639125e-01, 4.988738710e-01],8.156776433e-04)
>
> I notice that the center of the halo isn't in these grids. Could that be the problem?
>
> Grid = 163293
> GridRank          = 3
> GridDimension     = 14 22 20
> GridStartIndex    = 3 3 3
> GridEndIndex      = 10 18 16
> GridLeftEdge      = 0.693359375 0.296875 0.4921875
> GridRightEdge     = 0.697265625 0.3046875 0.4990234375
> Time              = 584.25598986586
> SubgridsAreStatic = 0
> FileName       = RD0035.cpu0150
> GroupName      = /Grid00163293
> NumberOfBaryonFields = 7
> FieldType = 0 1 2 4 5 6 22
> CourantSafetyNumber    = 0.100000
> PPMFlatteningParameter = 0
> PPMDiffusionParameter  = 0
> PPMSteepeningParameter = 0
> NumberOfParticles   = 273
> GravityBoundaryType = 2
> Pointer: Grid[163293]->NextGridThisLevel = 0
> Pointer: Grid[163293]->NextGridNextLevel = 163294
>
>
> Grid = 163917
> GridRank          = 3
> GridDimension     = 16 14 10
> GridStartIndex    = 3 3 3
> GridEndIndex      = 12 10 6
> GridLeftEdge      = 0.701171875 0.296875 0.498046875
> GridRightEdge     = 0.7060546875 0.30078125 0.5
> Time              = 584.25598986586
> SubgridsAreStatic = 0
> FileName       = RD0035.cpu0150
> GroupName      = /Grid00163917
> NumberOfBaryonFields = 7
> FieldType = 0 1 2 4 5 6 22
> CourantSafetyNumber    = 0.100000
> PPMFlatteningParameter = 0
> PPMDiffusionParameter  = 0
> PPMSteepeningParameter = 0
> NumberOfParticles   = 84
> GravityBoundaryType = 2
> Pointer: Grid[163917]->NextGridThisLevel = 0
> Pointer: Grid[163917]->NextGridNextLevel = 163918
>
>  _______________________________________________________
> sskory at physics.ucsd.edu           o__  Stephen Skory
> http://physics.ucsd.edu/~sskory/ _.>/ _Graduate Student
> ________________________________(_)_\(_)_______________
>


