[yt-users] SFR analysis on 3200 cube data
gso at physics.ucsd.edu
Fri Jun 24 15:07:48 PDT 2011
Sorry for the late reply, Stephen; I've been really sick for the last couple of
days. I'm getting back to trying to analyze this 3200 cube data.

> Are you saying that your first example of how to read the particles
> (adapted from what I gave you) ran out of memory? Let me know!

I believe it did, but again, as the script went through each of the
EnzoGrid_#### it stopped at ####=2549 without any sort of warning. It was
progressing fine from 1 to 2549, and before it prints 2550 the log just
ends.
Here's the script:
print "starting imports"
from yt.mods import *
from yt.analysis_modules.star_analysis.api import *
print "loaded modules"

pf = load("RD0017/RD0017")
print "loaded datafile"
dd = pf.h.all_data()
print "loaded all data"

sm = []
ct = []
for grid in pf.h.grids:
    print grid
    this_ct = grid['creation_time']
    this_sm = grid['ParticleMassMsun']
    select = (this_ct > 0)
    ext_ct = this_ct[na.where(select)]
    print na.size(ext_ct)
    ct.extend(ext_ct)
    sm.extend(this_sm[na.where(select)])

sfr = StarFormationRate(pf, star_mass=na.array(sm),
                        star_creation_time=na.array(ct),
                        volume=dd.volume('mpc'))
sfr.write_out(name="StarFormationRate.out")
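As a side note, the na.where() calls in the loop are redundant: a boolean
array can index directly. A minimal NumPy sketch of the same filtering step,
on made-up data for a single grid (not using yt):

```python
import numpy as np  # yt's "na" is just an alias for numpy

# Synthetic stand-in for one grid's particle fields; dark matter
# particles have creation_time <= 0, star particles have it > 0.
this_ct = np.array([-1.0, 0.5, -1.0, 2.0, 0.0, 3.5])   # creation_time
this_sm = np.array([10.0, 1.0, 12.0, 2.0, 11.0, 3.0])  # ParticleMassMsun

select = this_ct > 0       # boolean mask: True only for star particles
ext_ct = this_ct[select]   # direct boolean indexing, no np.where needed
ext_sm = this_sm[select]   # same mask picks the matching star masses
```

This gives the same result as this_ct[na.where(select)] but skips building
the intermediate index arrays.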
------------------------------------------------------------------
Here's an excerpt of the log:
yt : [INFO ] 2011-06-22 14:26:25,216 Parameters: current_time = 17.4164302991
yt : [INFO ] 2011-06-22 14:26:25,217 Parameters: domain_dimensions = [3200, 3200, 3200]
yt : [INFO ] 2011-06-22 14:26:25,217 Parameters: domain_left_edge = [ 0. 0. 0.]
yt : [INFO ] 2011-06-22 14:26:25,217 Parameters: domain_right_edge = [ 1. 1. 1.]
yt : [INFO ] 2011-06-22 14:26:25,217 Parameters: cosmological_simulation = 1
yt : [INFO ] 2011-06-22 14:26:25,217 Parameters: current_redshift = 11.9999998306
yt : [INFO ] 2011-06-22 14:26:25,218 Parameters: omega_lambda = 0.73
yt : [INFO ] 2011-06-22 14:26:25,218 Parameters: omega_matter = 0.27
yt : [INFO ] 2011-06-22 14:26:25,218 Parameters: hubble_constant = 0.7
Parsing Hierarchy 0% | | ETA: --:--:--
.... (progress-bar updates trimmed; the bar advances steadily) ....
Parsing Hierarchy 74% |---------------------------- | ETA: 00:00:00
yt : [INFO ] 2011-06-22 14:26:27,785 Adding Gravitating_Mass_Field to list of fields
yt : [INFO ] 2011-06-22 14:26:27,785 Adding kphHeI to list of fields
yt : [INFO ] 2011-06-22 14:26:27,785 Adding Particle_Density to list of fields
yt : [INFO ] 2011-06-22 14:26:27,786 Adding PhotoGamma to list of fields
yt : [INFO ] 2011-06-22 14:26:27,786 Adding kphHeII to list of fields
yt : [INFO ] 2011-06-22 14:26:27,786 Adding kphHI to list of fields
yt : [INFO ] 2011-06-22 14:26:27,786 Adding Gravitational_Potential to list of fields
Warning: invalid value encountered in sqrt
Warning: invalid value encountered in sqrt
starting imports
loaded modules
loaded datafile
loaded all data
EnzoGrid_0001
0
....
EnzoGrid_0020
0
EnzoGrid_0021
5395
EnzoGrid_0022
0
...
EnzoGrid_1315
1258
EnzoGrid_1316
0
EnzoGrid_1317
0
EnzoGrid_1318
0
EnzoGrid_1319
1634
EnzoGrid_1320
0
....
EnzoGrid_2548
0
EnzoGrid_2549
0
Enzo
As you can see, most of the grids print 0, meaning they contain no star
particles, and the largest count found so far was a couple thousand. But to
get those star particles the script has to load all of the particle data
first, including the dark matter. Should I print out how many particles it
loads (star + DM)? Or is there something obviously wrong with the script
that would cause it to fail? I've tested this script on a small dataset and
it gave results identical to the cookbook example.

I was also wondering: when we load a portion of an array, say
a = dd["creation_time"][0:100], do we only hold the first 100 creation_time
values in memory, or does yt load the whole array and then pick out the
first 100? If it's the former, we could just loop over the array in slices
(the next one would be [100:200]) until we reach the end. Super slow, but
it would fit in memory...
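The slicing loop I have in mind would look something like this. This is a
minimal NumPy sketch on a synthetic array standing in for
dd["creation_time"], just to show the chunking pattern; whether it actually
saves memory depends on whether yt materializes the full array before
slicing, which is the question above:

```python
import numpy as np

# Synthetic stand-in for dd["creation_time"]: negative for dark matter,
# positive for star particles.
creation_time = np.random.uniform(-1.0, 1.0, size=10000)

chunk = 100    # number of elements to process per pass
selected = []  # accumulates creation times of star particles (> 0)

# Walk the array one slice at a time: [0:100], [100:200], ...
for start in range(0, creation_time.size, chunk):
    piece = creation_time[start:start + chunk]
    selected.extend(piece[piece > 0])
```

The chunked result matches filtering the whole array in one shot; the open
question is only whether the per-slice reads avoid loading the full array.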
From
G.S.
> Hi Geoffrey,
>
>> Even if the above method worked, loading each grid onto 1
>> processor/node,
>> it would only alleviate the memory problem so much, because potentially
>> a
>> LOT of the particles can be on a single grid, which will still overload
>> the memory sometimes, so this isn't as good as the parallel HOP's KD
>> tree
>> way of cutting up the particles for load balancing.
>
> Are you saying that your first example of how to read the particles
> (adapted from what I gave you) ran out of memory? Let me know!
>
> Regarding running out of memory, as particle IO works in yt currently,
> a grid with lots of particles can still be a problem, even in parallel
> HOP. If you are in fact running out of memory due to a really heavy
> grid, we should think about how to address that.
>
> --
> Stephen Skory
> s at skory.us
> http://stephenskory.com/
> 510.621.3687 (google voice)
> _______________________________________________
> yt-users mailing list
> yt-users at lists.spacepope.org
> http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
>