[yt-users] SFR analysis on 3200 cube data

gso at physics.ucsd.edu
Thu Jun 16 18:15:20 PDT 2011


Hi everyone, I recently got hold of some 3200-cube data and was planning
on plotting the star formation rate, but the job always ends suddenly
without any error messages, and I'm wondering if anyone else has seen
this behavior.

This is yt devel running on Lens. The current version of the code is:
749ce1a696de (yt) tip
Using 1 node, I requested all 16 cores but am running this in serial to
give all of the node's memory (64 GB) to that one core.

This is data at a high redshift with only 230595 particles.  But I don't
know how much extra data yt reads in to calculate the SFR (I thought it's
only the particles' ages and masses), so this may be due to a memory issue,
but usually there are OOM messages if I hit a memory limit.
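For what it's worth, the calculation I had in mind needs only creation times and masses: bin the stellar mass formed by creation time and divide by the bin width. A minimal NumPy sketch (the array names and synthetic inputs here are illustrative stand-ins, not yt's actual API):

```python
import numpy as np

# Stand-ins for the per-particle arrays yt reads (creation time and mass).
rng = np.random.default_rng(0)
creation_time = rng.uniform(0.0, 5.0e8, 1000)   # yr
particle_mass = np.full(1000, 1.0e4)            # Msun per particle

# Histogram the mass formed by creation time; SFR per bin is the
# mass formed divided by the bin width, giving Msun/yr.
n_bins = 50
edges = np.linspace(creation_time.min(), creation_time.max(), n_bins + 1)
mass_formed, _ = np.histogram(creation_time, bins=edges, weights=particle_mass)
sfr = mass_formed / np.diff(edges)              # Msun/yr in each time bin
```

With only ~230k particles, these arrays are a few megabytes at most, which is why I suspect the crash isn't in this part of the calculation.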

The printout from YT ends with:

yt : [INFO     ] 2011-06-15 15:01:39,700 Adding Gravitating_Mass_Field to
list of fields
yt : [INFO     ] 2011-06-15 15:01:39,700 Adding kphHeI to list of fields
yt : [INFO     ] 2011-06-15 15:01:39,701 Adding Particle_Density to list
of fields
yt : [INFO     ] 2011-06-15 15:01:39,701 Adding PhotoGamma to list of fields
yt : [INFO     ] 2011-06-15 15:01:39,701 Adding kphHeII to list of fields
yt : [INFO     ] 2011-06-15 15:01:39,701 Adding kphHI to list of fields
yt : [INFO     ] 2011-06-15 15:01:39,701 Adding Gravitational_Potential to
list of fields
Warning: invalid value encountered in sqrt
Warning: invalid value encountered in sqrt
yt : [INFO     ] 2011-06-15 15:01:39,939 Getting creation_time using
ParticleIO

From
G.S.
