[yt-users] Analyzing SPH data
Alankar Dutta
dutta.alankar at gmail.com
Sat Mar 4 01:43:09 PST 2017
Oh, I forgot to mention that the full dataset is around 950 GB.
On Sat, Mar 4, 2017 at 1:02 PM, Alankar Dutta <dutta.alankar at gmail.com>
wrote:
> Hello,
>
> This is only one part of the multipart snapshot of the simulation. I am
> sending you the header information from the snapshot file:
>
> {'O0': 0.27500000000000002,
> 'Ol': 0.72499999999999998,
> 'boxsize': 100000.0,
> 'flag_age': 1,
> 'flag_cooling': 1,
> 'flag_delaytime': 0,
> 'flag_fb': 1,
> 'flag_fh2': 0,
> 'flag_metals': 1,
> 'flag_potential': 3,
> 'flag_sfr': 1,
> 'flag_tmax': 0,
> 'h': 0.70199999999999996,
> 'massTable': array([ 0. , 0.00110449, 0. , 0. , 0. , 0. ]),
> 'nbndry': 76247,
> 'nbulge': 0,
> 'ndisk': 0,
> 'ndm': 1459617792,
> 'nfiles': 1024,
> 'ngas': 1343721867,
> 'npartThisFile': array([10045237, 10239661, 0, 0, 451148, 67], dtype=uint32),
> 'npartTotal': array([1343721867, 1459617792, 0, 0, 503500456,
> 76247], dtype=uint32),
> 'npartTotalHW': array([1, 1, 0, 0, 0, 0], dtype=uint32),
> 'nstar': 503500456,
> 'redshift': 1.0000000010567627,
> 'rhocrit': 2.707660428120944e-29,
> 'time': 0.49999999973580939}
>
>
>
> I extracted this information using a separate Python program.
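>
> For anyone curious, the header can be read with a short NumPy script along
> these lines (a sketch assuming the standard 256-byte Gadget-2 binary header
> wrapped in 4-byte record markers; the field order follows the public
> Gadget-2 io_header and may differ for modified codes):
>
> import numpy as np
>
> def read_gadget_header(filename):
>     # The header is a 256-byte block bracketed by two 4-byte record markers.
>     with open(filename, "rb") as f:
>         np.fromfile(f, dtype=np.int32, count=1)           # leading record marker
>         npart = np.fromfile(f, dtype=np.uint32, count=6)  # particles in this file
>         mass = np.fromfile(f, dtype=np.float64, count=6)  # per-type mass table
>         time, redshift = np.fromfile(f, dtype=np.float64, count=2)
>         flag_sfr, flag_fb = np.fromfile(f, dtype=np.int32, count=2)
>         npart_total = np.fromfile(f, dtype=np.uint32, count=6)
>         flag_cooling, nfiles = np.fromfile(f, dtype=np.int32, count=2)
>         boxsize, O0, Ol, h = np.fromfile(f, dtype=np.float64, count=4)
>     return {"npartThisFile": npart, "massTable": mass, "time": time,
>             "redshift": redshift, "npartTotal": npart_total,
>             "nfiles": nfiles, "boxsize": boxsize, "O0": O0, "Ol": Ol, "h": h}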
>
> Alankar Dutta
>
> On Sat, Mar 4, 2017 at 12:45 PM, Nathan Goldbaum <nathan12343 at gmail.com>
> wrote:
>
>>
>>
>> On Sat, Mar 4, 2017 at 12:25 AM, Nathan Goldbaum <nathan12343 at gmail.com>
>> wrote:
>>
>>>
>>>
>>> On Sat, Mar 4, 2017 at 12:13 AM, Alankar Dutta <dutta.alankar at gmail.com>
>>> wrote:
>>>
>>>> Hello,
>>>>
>>>> Modifying the GADGET simulation isn't possible at the moment, because it
>>>> was developed by someone else who has already published a paper based on
>>>> it. I want to use the existing simulation snapshots to make mock X-ray
>>>> maps from them.
>>>>
>>>
>>> I'm talking about modifying yt, not Gadget.
>>>
>>> Is the file you attached to your other e-mail just a single file in a
>>> multi-file dataset? How large is the full dataset? Is there a way to
>>> produce a complete dataset in this output format that isn't prohibitively
>>> large?
>>>
>>
>> I've opened a pull request that allows me to do IO on the data you
>> attached: https://bitbucket.org/yt_analysis/yt/pull-requests/2537
>>
>> This allows me to read your data in, getting sensible values for e.g.
>> position. I suspect we're not using the correct field specification because
>> I see this warning:
>>
>> yt : [WARNING ] 2017-03-04 01:06:09,109 Your Gadget-2 file may have
>> extra columns or different precision! (1814947576 file vs 1486279952
>> computed)
>>
>> yt supports a number of field specifications out of the box, see:
>>
>> https://bitbucket.org/yt_analysis/yt/src/3eca2ae80ab14a48b643d3055d7d3c0933fa77ae/yt/frontends/gadget/definitions.py?at=yt&fileviewer=file-view-default#definitions.py-50
>>
>> Do you happen to know which fields are in your output file?
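>>
>> For reference, once the layout is known it can be passed to yt.load as a
>> custom field specification, along these lines (a sketch; the field names
>> below are just yt's default Gadget spec and the snapshot name is a
>> placeholder -- the actual ordering has to match your files):
>>
>> import yt
>>
>> # Hypothetical layout -- replace with the actual block order of the snapshot.
>> # A (name, ptype) tuple restricts a block to one particle type.
>> my_field_spec = (
>>     "Coordinates",
>>     "Velocities",
>>     "ParticleIDs",
>>     "Mass",
>>     ("InternalEnergy", "Gas"),
>>     ("Density", "Gas"),
>>     ("SmoothingLength", "Gas"),
>> )
>>
>> ds = yt.load("snapshot_000.0", field_spec=my_field_spec)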
>>
>>
>>>
>>>
>>>>
>>>>
>>>> Alankar Dutta
>>>>
>>>> On Mar 4, 2017 11:38 AM, "Nathan Goldbaum" <nathan12343 at gmail.com>
>>>> wrote:
>>>>
>>>>> I think the most straightforward thing to do here is to fix the Gadget
>>>>> frontend so it properly reads in Gadget binary data with positions written
>>>>> in double precision.
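>>>>>
>>>>> As an aside, a quick way to check whether positions were written in single
>>>>> or double precision is to compare the record-size marker in front of the
>>>>> position block with the particle count in the header. A rough sketch,
>>>>> assuming a SnapFormat=1 (plain unformatted) Gadget-2 binary file:
>>>>>
>>>>> import numpy as np
>>>>>
>>>>> def position_precision(filename):
>>>>>     with open(filename, "rb") as f:
>>>>>         f.seek(4)  # skip the record marker in front of the 256-byte header
>>>>>         npart = int(np.fromfile(f, dtype=np.uint32, count=6).sum())
>>>>>         f.seek(4 + 256 + 4)  # jump past the header and its two markers
>>>>>         pos_bytes = int(np.fromfile(f, dtype=np.uint32, count=1)[0])
>>>>>     if pos_bytes == npart * 3 * 8:
>>>>>         return "float64"
>>>>>     if pos_bytes == npart * 3 * 4:
>>>>>         return "float32"
>>>>>     return "unknown (extra columns or a different layout?)"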
>>>>>
>>>>> Is there any chance you can generate a smallish test dataset in your
>>>>> Gadget output format that we can use for debugging purposes? With that
>>>>> available it should be straightforward to add support. You can share the
>>>>> dataset using the yt curldrop (https://docs.hub.yt/services.html#curldrop)
>>>>> or a cloud file-sharing service like Dropbox or Google Drive.
>>>>>
>>>>> Unfortunately there isn't a way to load SPH data without a
>>>>> full-fledged frontend right now.
>>>>>
>>>>> We do have a load_particles function which allows creating a dataset
>>>>> from particle data loaded as numpy arrays, but it's currently not possible
>>>>> to use it to load SPH data.
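>>>>>
>>>>> For completeness, this is roughly what the load_particles interface looks
>>>>> like for plain (non-SPH) particle data -- a sketch with made-up arrays and
>>>>> units, just to show the call signature:
>>>>>
>>>>> import numpy as np
>>>>> import yt
>>>>>
>>>>> n = 10000
>>>>> data = {
>>>>>     "particle_position_x": np.random.random(n),
>>>>>     "particle_position_y": np.random.random(n),
>>>>>     "particle_position_z": np.random.random(n),
>>>>>     "particle_mass": np.ones(n),
>>>>> }
>>>>> bbox = np.array([[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]])
>>>>> ds = yt.load_particles(data, length_unit=3.086e24,  # cm per Mpc
>>>>>                        mass_unit=1.989e33,          # g per Msun
>>>>>                        bbox=bbox)
>>>>> ad = ds.all_data()
>>>>> print(ad["io", "particle_mass"].sum())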
>>>>>
>>>>> I'm currently actively working on improving support for SPH data in yt,
>>>>> and adding the ability to load SPH data with load_particles is one of the
>>>>> things I've added in that branch of the code. Hopefully this work will be
>>>>> stabilized sometime in the next few months.
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Mar 3, 2017 at 11:45 PM, Alankar Dutta <
>>>>> dutta.alankar at gmail.com> wrote:
>>>>>
>>>>>> Hello yt-community,
>>>>>>
>>>>>> I am a beginner with yt. I have an array of data that I have read from
>>>>>> a Gadget simulation snapshot whose format is not directly supported by
>>>>>> yt at present (I have confirmed this by discussing the issue with the
>>>>>> community earlier). The array has position, velocity, density, mass,
>>>>>> internal energy, and smoothing length information for the gas particles.
>>>>>> How can I use this to make slice plots or other useful visualizations?
>>>>>>
>>>>>> Alankar Dutta,
>>>>>> Third year Undergraduate,
>>>>>> Physics Department,
>>>>>> Presidency University, India
>>>>>>
>>>>>
>>>
>>
>