[yt-dev] A particle approach to model Ray and AbsorptionSpectrum

Bili Dong - Gmail qobilidop at gmail.com
Sat Sep 10 10:44:53 PDT 2016


Hi all,

I came up with an idea to bring a particle approach to modeling Ray and
AbsorptionSpectrum. Before I move on to implement it in yt, I'd like to
explain what it is and get feedback on it.

The situation is that I have an SPH simulation, and I want to model the Ray
(in order to get the AbsorptionSpectrum) as accurately as possible.
Currently, when we create a Ray object, it is always created from the
deposited grid. Although that is a good approximation to the true particle
representation, it is still not the most accurate approach. I'd like to be
able to do it in the particle way (as in SPLASH). In the long term, I know
that Matt and Meagan are working on a new system for particle datasets. The
work I'm proposing could be thought of as lying on top of that: the method
could be made faster using Matt and Meagan's work, but the main
infrastructure would stay the same.

To introduce what I plan to do, let's have a look at the first figure here
<http://yt-project.org/docs/dev/analyzing/analysis_modules/light_ray_generator.html>.
The core concept of a Ray object is the *path length*, *`dl`*. Basically,
if we combine the normal fields with the `dl` field, we get a Ray object.
Now imagine that instead of a ray intersecting a lot of grid cells, we have
a ray intersecting a lot of SPH particles. How do we define *`dl`* then? We
could define it as the *integral of the SPH kernel along the intersection*!
And that's the whole trick. From this we can define a particle Ray that
looks the 'same' as the original grid Ray. Any analysis built on top of the
Ray object, AbsorptionSpectrum for example, then doesn't need to change
much; it will simply work differently when provided with a different kind
of Ray object.
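To make the definition concrete, here is a minimal sketch of the kernel
line integral for a single particle. This is my own illustration, not yt
code: it assumes the standard M4 cubic spline kernel with support radius
2h, and uses simple numerical quadrature along the chord with impact
parameter b.

```python
import numpy as np

def cubic_spline_w(r, h):
    """Standard M4 cubic spline SPH kernel in 3D, support radius 2h."""
    q = np.asarray(r) / h
    sigma = 1.0 / (np.pi * h**3)  # 3D normalization
    w = np.where(
        q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0),
    )
    return sigma * w

def kernel_line_integral(b, h, n_samples=1000):
    """dl for one particle: integrate W(sqrt(b^2 + s^2), h) ds along the
    chord through the particle's support sphere (radius 2h), where b is
    the ray's impact parameter relative to the particle."""
    if b >= 2.0 * h:
        return 0.0  # the ray misses the particle's smoothing sphere
    s_max = np.sqrt((2.0 * h) ** 2 - b**2)
    s = np.linspace(-s_max, s_max, n_samples)
    r = np.sqrt(b**2 + s**2)
    # W vanishes at the endpoints (r = 2h), so a plain Riemann sum is
    # effectively a trapezoid rule here.
    return float(np.sum(cubic_spline_w(r, h)) * (s[1] - s[0]))
```

A quick sanity check of the definition: integrating these `dl` values over
all impact parameters, weighted by 2*pi*b, recovers the kernel's unit
normalization.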

The main difficulty in the implementation is the construction of the
particle `dl` field. Currently I'm doing it by brute force: computing
`dl` for all the particles and masking out those with zero values. Matt
and Meagan's work will accelerate this by providing neighbor information,
so the computation could then be done on a small set of particles. The
brute-force method is not unbearably slow, though, so accelerating it
could be left for future work.
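As a sketch of that brute-force step (hypothetical function and variable
names, not the actual implementation): compute every particle's impact
parameter relative to the ray segment, evaluate the kernel line integral
for each, and keep only the particles with nonzero `dl`.

```python
import numpy as np

def _kernel_line_integral(b, h, n_samples=512):
    """Numerical line integral of a cubic spline kernel (support 2h)
    along a chord with impact parameter b."""
    if b >= 2.0 * h:
        return 0.0
    s_max = np.sqrt((2.0 * h) ** 2 - b**2)
    s = np.linspace(-s_max, s_max, n_samples)
    q = np.sqrt(b**2 + s**2) / h
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0)) / (np.pi * h**3)
    return float(np.sum(w) * (s[1] - s[0]))

def particle_ray_dl(positions, hsml, ray_start, ray_end):
    """Brute force: compute `dl` for every particle, then mask out zeros.

    positions: (N, 3) particle positions; hsml: (N,) smoothing lengths.
    Returns (indices, dl) for the particles the ray actually intersects.
    """
    n = ray_end - ray_start
    length = np.linalg.norm(n)
    n = n / length
    rel = positions - ray_start
    t = rel @ n                                       # closest approach along the ray
    b = np.linalg.norm(rel - np.outer(t, n), axis=1)  # impact parameters
    # Keep particles whose closest approach lies on the segment and
    # within the kernel support radius (2h for the cubic spline).
    hit = (b < 2.0 * hsml) & (t > 0.0) & (t < length)
    dl = np.array([_kernel_line_integral(bi, hi)
                   for bi, hi in zip(b[hit], hsml[hit])])
    return np.nonzero(hit)[0], dl
```

Neighbor information would simply replace the all-particle arrays here
with the subset near the ray; the masking and the integral stay the same.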

I have an external implementation of the particle approach and have used
it in my current research. I have compared results from the particle
method with those from Trident, and they agree statistically, as expected
(thanks to Cameron for the help). Now that it looks mature, I'd like to
implement it in yt.

If anyone has any comments, opinions and suggestions, I'd like to hear them.

Thanks for reading,

Bili