[Yt-dev] more non-unique particles in enzo and how to deal with them
Eric Hallman
hallman13 at gmail.com
Fri Dec 4 15:51:10 PST 2009
Stephen,
so is this fix (to enzo) only in WOC and not in the dev trunk?
On Dec 4, 2009, at 4:37 PM, Stephen Skory wrote:
> Britton,
>
>> Stephen's latest update of his new parallel hop to check for unique-
>> ness of particle indices has shed light on another problem. In a
>> recent simulation run with the devel/trunk version of enzo, I have
>> found the particle numbers are no longer unique when star particles
>> are generated. Originally, the star particle indices were offset by
>> the number of dark matter particles, such that if you had N dm
>> particles and M star particles, the dm particle indices went from 0
>> to N-1 and the star particle indices went from N to N+M-1.
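>> Concretely, with made-up small counts, that original scheme looks
>> like this (just an illustration of the numbering, nothing from the
>> dataset itself):

```python
# Original Enzo scheme: N dm particles take indices 0..N-1,
# M star particles take indices N..N+M-1, so all indices are unique.
N, M = 4, 3  # made-up counts for illustration
dm_indices = list(range(N))           # [0, 1, 2, 3]
star_indices = list(range(N, N + M))  # [4, 5, 6]

# No index appears in both populations.
assert len(set(dm_indices + star_indices)) == N + M
print(dm_indices, star_indices)
```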
>
> Ji-hoon Kim actually addressed this issue last month
> <http://hg.spacepope.org/week-of-code/rev/bc61ccd0f247>, which you
> may not be aware of. Unfortunately, it's too late for this dataset!
>
>> However, now it seems that the star particle indices simply go from
>> 0 to M-1. Before I go any further, I will mention that the star particles
>> can be differentiated from the dark matter particles by the
>> particle_type data, so we're not totally screwed. The point is
>> that for the time being, hop no longer works. The question is
>> this: what is the best solution for dealing with this? I am more
>> than willing to write something to remap the particle indices, but
>> that doesn't fix the underlying problem. I have already done
>> something similar to this for a different application, so I can do
>> it quickly. OR, something can be done to hop to read in particle
>> type. Clearly, this question is mainly for Stephen, but anyone
>> should feel free to weigh in. What is the best solution?
>
> I see several options. You could filter out all the stars like this:
>
> notstars = (self._data_source["particle_type"] != star)
>
> and then in HaloFinding.py add a '[notstars]' to all the arrays
> where the particle data is handled and handed off to Parallel HOP.
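> A minimal sketch of that mask, assuming numpy arrays and a
> hypothetical star particle_type value of 2 (check the value your
> Enzo build actually writes):

```python
import numpy as np

# Hypothetical star particle_type code; the real value comes from Enzo.
STAR = 2

# Toy data: star indices collide with dm indices, as in the broken dataset.
particle_type = np.array([1, 1, 2, 1, 2])
particle_index = np.array([0, 1, 0, 2, 1])

notstars = particle_type != STAR

# Apply the same mask to every per-particle array before it is
# handed off to Parallel HOP, so only dark matter particles remain.
dm_index = particle_index[notstars]
print(dm_index)  # [0 1 2]
```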
> If you have 'creation_time', you could just try setting
> 'dm_only=True' in the HaloFinder call.
>
> I think you could easily write a script to fix the dataset itself
> (on a copy, not the original, of course). Use h5py to open the
> files, and read in the particle_index and particle_type fields. So
> something like this:
>
> pi = file["Grid00000234"]["particle_index"][:]
> ct = file["Grid00000234"]["particle_type"][:]
> tochange = (ct == star_type)  # star_type = the particle_type value for stars
> pi += tochange * n_dm  # n_dm = the number of dark matter particles
>
> and then delete the old particle_index field from the file, and
> replace it with this new pi array.
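> Here is the index fix itself as a small testable function, applied to
> arrays already read in; star_type=2 is an assumed type code, and the
> h5py open/read/delete/replace steps around it are as described above:

```python
import numpy as np

def remap_star_indices(pi, ct, n_dm, star_type=2):
    """Shift star-particle indices up by the dark matter count so they
    run from n_dm to n_dm + M - 1.

    pi, ct: particle_index and particle_type arrays for one grid, e.g.
    read via h5py as f["Grid00000234"]["particle_index"][:].
    star_type=2 is an assumption; use the value your Enzo build writes.
    """
    pi = pi.copy()
    pi[ct == star_type] += n_dm
    return pi

# Three dm particles (indices 0..2) and two stars wrongly numbered 0..1:
pi = np.array([0, 1, 2, 0, 1])
ct = np.array([1, 1, 1, 2, 2])
print(remap_star_indices(pi, ct, n_dm=3))  # [0 1 2 3 4]
```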
>
> I'll let you know what else I come up with.
>
> Good luck!
>
> _______________________________________________________
> sskory at physics.ucsd.edu o__ Stephen Skory
> http://physics.ucsd.edu/~sskory/ _.>/ _Graduate Student
> ________________________________(_)_\(_)_______________
>
> _______________________________________________
> Yt-dev mailing list
> Yt-dev at lists.spacepope.org
> http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org
Eric Hallman
Google Voice: (312) 725-HMAN
hallman13 at gmail.com