[yt-users] units of cic mesh output from covering_grid

Matthew Turk matthewturk at gmail.com
Wed Jul 2 04:48:39 PDT 2014


On Wed, Jul 2, 2014 at 1:06 AM, Nathan Goldbaum <nathan12343 at gmail.com> wrote:
>
> On Tue, Jul 1, 2014 at 4:31 PM, Nathan Goldbaum <nathan12343 at gmail.com>
> wrote:
>>
>> Hi Brendan,
>>
>> Not sure exactly what's happening.  In general it's difficult to remotely
>> debug something like this without a simple test case that one of us can run
>> locally.  Can you perhaps make a test that illustrates the issue using one
>> of the datasets on yt-project.org/data?
>>
>>
>> On Tue, Jul 1, 2014 at 3:05 PM, Brendan Griffen
>> <brendan.f.griffen at gmail.com> wrote:
>>>
>>> I have made a mesh using covering_grid:
>>>
>>> unit_base = {'UnitLength_in_cm'         : 3.08568e+21,
>>>              'UnitMass_in_g'            :   1.989e+43,
>>>              'UnitVelocity_in_cm_per_s' :      100000}
>>>
>>> pf = load(basepath+ext+'.0.hdf5', unit_base=unit_base, bounding_box=bbox)
>>>
>>
>> FWIW, you shouldn't need to set the units or the bbox by hand; yt can read
>> this information from the hdf5 file.
>
>
> Hmm, looks like I was a little hasty earlier.  This statement is based on
> the OWLS data format, which is a little more descriptive than the default
> Gadget HDF5 data format.  In particular, units and bounding box information
> are written to disk along with the particle data in the hdf5 files.
>
> If your dataset acts like the public Gadget distribution, you will need to
> provide both the bbox and the unit information.
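
Right -- for a snapshot in the vanilla Gadget format, the load call would look
roughly like the following (just a sketch: the filename is a placeholder, and
what units yt takes the bounding box to be in is exactly the question that
comes up below):

import numpy as np
import yt

# code units from the run's parameter file (these particular values are
# 1 kpc in cm, 1e10 Msun in g, and 1 km/s in cm/s)
unit_base = {'UnitLength_in_cm'         : 3.08568e+21,
             'UnitMass_in_g'            : 1.989e+43,
             'UnitVelocity_in_cm_per_s' : 1.0e5}

# bounding box for the particle positions; how these numbers are interpreted
# (kpc vs. Mpc, with or without /h) is part of what is being discussed below
bbox = np.array([[0., 100.], [0., 100.], [0., 100.]])

ds = yt.load('snapshot.0.hdf5',          # placeholder filename
             unit_base=unit_base, bounding_box=bbox)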

For cosmological Gadget simulations, yt will fall back on "sane defaults" for
the units, and it can get the bbox by reading the "BoxSize" value from the
header.  These defaults are:

 * Length => 1.0 Mpc/h (comoving)
 * Velocity => 1e5 cm/s (proper)
 * Mass => 1e10 Msun / h

I think these came from the Gadget manual.
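
A quick way to sanity-check what yt actually picked is to load the snapshot
without any unit_base or bbox and inspect the dataset attributes (yt-3.0
attribute names; the filename is just a placeholder):

import yt

ds = yt.load('snap_033.0.hdf5')    # placeholder; no unit_base / bounding_box

print(ds.length_unit)              # default length unit
print(ds.mass_unit)                # default mass unit
print(ds.velocity_unit)            # default velocity unit
print(ds.domain_width)             # taken from the "BoxSize" header value
print(ds.hubble_constant)          # h, for the /h factors above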

-Matt

>
>>
>>
>>>
>>> In [3]: bbox
>>> Out[3]:
>>> array([[   0.,  100.],
>>>        [   0.,  100.],
>>>        [   0.,  100.]])
>
>
> This might be the issue, depending on what units the bbox is interpreted to
> be in. For what it's worth, the OWLS dataset on yt-project.org/data seems to
> be a 25 comoving megaparsec / h box:
>
> $ yt load snapshot_033/snap_033.0.hdf5
> IPython 2.1.0 -- An enhanced Interactive Python.
> ?         -> Introduction and overview of IPython's features.
> %quickref -> Quick reference.
> help      -> Python's own help system.
> object?   -> Details about 'object', use 'object??' for extra details.
>
> In [1]: ds = yt.load('snapshot_033/snap_033.0.hdf5')
>
> In [2]: ds.domain_left_edge
> Out[2]: YTArray([ 0.,  0.,  0.]) code_length
>
> In [3]: ds.domain_right_edge
> Out[3]: YTArray([ 25.,  25.,  25.]) code_length
>
> In [4]: ds.length_unit
> Out[4]: 3.085678e+24 cmcm/h
>
>
>>>
>>> ndim = 512
>>> level = int(math.log(ndim, 2))
>>>
>>> cg = pf.covering_grid(level=level, left_edge=[0, 0, 0],
>>>                       dims=[ndim, ndim, ndim])
>>> arr = cg['deposit', 'all_cic']
>>>
>>> When I do arr.mean() etc. I get the output:
>>>
>>>  --> Doing dimension: 512
>>>
>>>         == Mass density ==
>>>    --- Max:  2.367e-13 [g/cm^3]
>>>    --- Min:  0.000e+00 [g/cm^3]
>>>    --- Mean: 1.764e-21 [g/cm^3]
>>>
>>>         == Number density ==
>>>    --- Max:  1.158e+11 [cm^-3]
>>>    --- Min:  0.000e+00 [cm^-3]
>>>    --- Mean: 8.630e+02 [cm^-3]
>>>
>>> Written out mesh:
>>> /bigbang/data/bgriffen/c2ray/cicfiles/parent/512/density/z0.000.dat
>>>
>>> I thought it returned the mesh in g/cm^3.  Compared to just using the
>>> particle number, box width, and particle mass, I get (mass & number density):
>>>
>>> orig_mass = (head.massarr[1] * 10**10 * head.nall[1] / head.hubble) \
>>>             / (head.boxwidth / head.hubble)**3
>>>
>>> # just total_mass/total_volume, where "head" => header info of the HDF5
>>> # Gadget snapshot
>>>
>>> In [29]: orig_mass*MsolMpc3_to_gcm3
>>> Out[29]: 2.701e-30
>>>
>>> In [30]: orig_mass*MsolMpc3_to_gcm3*coeff
>>> Out[30]: 1.321e-06
>>>
>>> These values seem reasonable, but the ones coming back from yt don't.  Can
>>> anyone confirm the units of the mesh yt returns?  Does it take care of the
>>> Hubble parameter as read from the header block?  Although that would only
>>> account for a minor discrepancy - the result is orders of magnitude off at
>>> this stage.
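
One direct way to answer the units question: the array handed back by the
covering grid is a YTArray, so it carries its units with it and you can simply
print them (a sketch, continuing from the cg defined above):

arr = cg['deposit', 'all_cic']

print(arr.units)                            # the units yt is actually using
print(arr.mean())
print(arr.mean().in_units('Msun/Mpc**3'))   # compare against the hand calculation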
>
> yt should detect your dataset as being cosmological and apply the
> appropriate scaling factors.
>
> That said, I don't have a cosmological Gadget HDF5 dataset to test with, so I
> can't easily do an apples-to-apples comparison with what you're trying to
> accomplish.
>
> My best advice would be to step through the code in the Gadget frontend with a
> debugger and try to figure out what's going wrong.  We haven't had a large
> number of users for the Gadget frontend yet, so there's likely a lot of room
> for improvement.  Patches are more than welcome.
>
> If anyone happens to have a reasonably sized (< 1 GB) cosmological Gadget
> HDF5 dataset, it would be an excellent addition to yt-project.org/data.
>
>>>
>>> Thanks.
>>>
>>> Brendan
>>>
>>
>


