[Yt-dev] particle io issue

Britton Smith brittonsmith at gmail.com
Sat Nov 28 09:53:14 PST 2009


Done.
http://yt.enzotools.org/ticket/230

On Sat, Nov 28, 2009 at 10:35 AM, Matthew Turk <matthewturk at gmail.com> wrote:

> Hi Britton,
>
> ParticleIO is still kind of new, I guess!  I think there's something
> wrong with the dependency calculator in this case -- something about
> the hackery around particle_mass and "particle mass".  There are two
> ways forward that I think we need to consider -- global field name
> translations, and fixing the field dependencies.  If you file a ticket
> for this problem, I'll examine it this coming week...
>
> -Matt
>
> On Fri, Nov 27, 2009 at 2:43 PM, Britton Smith <brittonsmith at gmail.com> wrote:
> > Hey everyone,
> >
> > The following error pertains to the yt branch of the hg repo.
> >
> > I'm getting a particle io error when using the various yt halo finders
> > in parallel.  It doesn't appear to have anything to do with the halo
> > finders, but I don't know what else uses ParticleIO.py.  This only
> > happens when running in parallel.
> >
> > P000 yt         INFO       2009-11-27 15:16:59,775 Getting ParticleMassMsun using ParticleIO
> > Setting period equal to 1.000000
> > Setting period equal to 1.000000
> > Setting period equal to 1.000000
> > Traceback (most recent call last):
> >   File "do_hop.py", line 5, in <module>
> >     h = FOFHaloFinder(pf)
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/HaloFinding.py", line 1024, in __init__
> >     self._parse_halolist(1.)
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/HaloFinding.py", line 747, in _parse_halolist
> >     this_max_dens = halo.maximum_density_location()
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/ParallelTools.py", line 130, in single_proc_results
> >     return func(self, *args, **kwargs)
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/HaloFinding.py", line 322, in maximum_density_location
> >     return self.center_of_mass()
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/ParallelTools.py", line 130, in single_proc_results
> >     return func(self, *args, **kwargs)
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/HaloFinding.py", line 306, in center_of_mass
> >     pm = self["ParticleMassMsun"]
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/ParallelTools.py", line 130, in single_proc_results
> >     return func(self, *args, **kwargs)
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/HaloFinding.py", line 136, in __getitem__
> >     return self.data.particles[key][self.indices]
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/ParticleIO.py", line 46, in __getitem__
> >     self.get_data(key)
> >   File "/Users/britton/Documents/work/yt-hg/yt/lagos/ParticleIO.py", line 104, in get_data
> >     if len(to_add) != 1: raise KeyError
> > KeyError
> >
> > I checked the contents of to_add, and it was a list with two items (hence
> > the exception):
> > ['particle_mass', 'particle_mass']
> >
> > I was able to get around this by removing non-unique entries in to_add,
> > but I don't think that really fixes the underlying problem.  Anyone have
> > any ideas?
> >
> > Regards,
> >
> > Britton
> >
> > _______________________________________________
> > Yt-dev mailing list
> > Yt-dev at lists.spacepope.org
> > http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org
> >
> >
>
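[Archive note: the workaround Britton describes -- stripping duplicate
entries from to_add before the length check in ParticleIO.get_data -- can
be sketched as below. This is an illustrative reconstruction, not the
actual yt code; the helper name deduplicate_fields is hypothetical.]

```python
# Hypothetical sketch of the workaround: deduplicate the to_add field
# list while preserving order, assuming to_add is a plain list of
# field-name strings as in the reported case.
def deduplicate_fields(to_add):
    seen = set()
    unique = []
    for field in to_add:
        if field not in seen:
            seen.add(field)
            unique.append(field)
    return unique

# The duplicated list from the bug report:
to_add = ['particle_mass', 'particle_mass']
to_add = deduplicate_fields(to_add)
# len(to_add) is now 1, so a check like
#     if len(to_add) != 1: raise KeyError
# would no longer fire for this input -- though, as noted above, this
# masks rather than fixes the dependency-calculation bug.
```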

