[yt-users] Analyzing SPH data

Nathan Goldbaum nathan12343 at gmail.com
Sat Mar 4 12:33:44 PST 2017


Not right now, no, although I think it would be straightforward to modify
yt to be able to do this. We would need to add a way to indicate that you
don't want to load the full dataset (e.g. as a keyword argument to the
load() function), and then turn off the machinery that detects if a Gadget
binary file is part of a multi-file dataset.
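
As a sketch of what the user-facing API might look like (the load_subset
keyword here is hypothetical and does not exist in yt yet; it only
illustrates the feature request above):

import yt

# Hypothetical keyword: treat the named file as a standalone dataset
# instead of detecting and loading its sibling files.
ds = yt.load("snapshot_068.0", load_subset=True)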

Would you mind opening an issue so we don't lose track of this feature
request?

https://bitbucket.org/yt_analysis/yt/issues/new

I can try to take a stab at this sometime in the next couple of weeks. If
you'd like to try adding the feature yourself, we would welcome a pull
request that implements it.

On Sat, Mar 4, 2017 at 2:27 PM Alankar Dutta <dutta.alankar at gmail.com>
wrote:

> Is it somehow possible to read only a few of the files from the entire
> dataset? If so, how?
>
> Alankar Dutta
>
> On Mar 5, 2017 1:51 AM, "Nathan Goldbaum" <nathan12343 at gmail.com> wrote:
>
> So unfortunately the current support for SPH data is not able to ingest a
> dataset as big as yours, at least not on a server with only as much RAM as
> you have available.
>
> In principle we could make it possible to load just one of your files at a
> time; that would definitely work. However, I don't think the current
> support for particle data will ever scale to a dataset as big as yours.
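>
> To put numbers on this: building yt's particle index allocates one 64-bit
> Morton key per particle up front (that's the np.empty call in the traceback
> below). A back-of-the-envelope sketch using the particle count from your
> log:
>
> import numpy as np
>
> n_particles = 1.190e10                    # "Allocating for 1.190e+10 particles"
> key_bytes = np.dtype("uint64").itemsize   # 8 bytes per Morton key
> print(n_particles * key_bytes / 1024**3)  # ~88.7 GiB for the index alone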
>
> I'm currently working on improving the scaling and peak memory consumption
> of yt for SPH particle data. Unfortunately this work is not yet ready for
> anyone who isn't OK with some features being broken. I'm actively working
> on stabilizing the code, and it will hopefully land in a stable release of
> yt in the medium term. If you'd like more information about this effort, or
> are interested in helping out, take a look at the yt enhancement proposal
> describing the work:
>
> https://bitbucket.org/yt_analysis/ytep/pull-requests/67
>
> I'm sorry that I don't have a nicer answer for you here. Hopefully in the
> medium-term future yt will be able to easily work with extremely large SPH
> datasets like yours.
>
> -Nathan
>
> On Sat, Mar 4, 2017 at 12:51 PM, Alankar Dutta <dutta.alankar at gmail.com>
> wrote:
>
> Hello,
>
> I used the following code. It automatically tries to read my entire Gadget
> snapshot dataset (1024 files):
>
> import yt
>
> # Path to the first file of the 1024-file Gadget snapshot.
> loc = '/media/alankar/Seagate Expansion Drive/mb2/snapshots/snapdir_068/'
> fname = loc + 'snapshot_068.0'
>
> ds = yt.load(fname)
> ds.index          # building the particle index touches all 1024 files
> ad = ds.all_data()
>
> px = yt.ProjectionPlot(ds, 'x', ('gas', 'density'))
> px.show()
>
> I get the following error message:
>
> [Spyder console output: a long "Reloaded modules: yt, yt.funcs, ..." module
> listing omitted]
> yt : [INFO     ] 2017-03-05 00:12:08,986 Calculating time from 5.000e-01 to be 1.881e+17 seconds
> yt : [INFO     ] 2017-03-05 00:12:09,002 Assuming length units are in kpc/h (comoving)
> yt : [INFO     ] 2017-03-05 00:12:09,027 Parameters: current_time              = 1.8806506178595517e+17 s
> yt : [INFO     ] 2017-03-05 00:12:09,044 Parameters: domain_dimensions         = [2 2 2]
> yt : [INFO     ] 2017-03-05 00:12:09,064 Parameters: domain_left_edge          = [ 0.  0.  0.]
> yt : [INFO     ] 2017-03-05 00:12:09,081 Parameters: domain_right_edge         = [ 100000.  100000.  100000.]
> yt : [INFO     ] 2017-03-05 00:12:09,098 Parameters: cosmological_simulation   = 1
> yt : [INFO     ] 2017-03-05 00:12:09,114 Parameters: current_redshift          = 1.0000000010567627
> yt : [INFO     ] 2017-03-05 00:12:09,131 Parameters: omega_lambda              = 0.725
> yt : [INFO     ] 2017-03-05 00:12:09,145 Parameters: omega_matter              = 0.275
> yt : [INFO     ] 2017-03-05 00:12:09,160 Parameters: hubble_constant           = 0.702
> yt : [INFO     ] 2017-03-05 00:12:25,556 Allocating for 1.190e+10 particles (index particle type 'all')
> Traceback (most recent call last):
>
>   File "<ipython-input-5-0d0fb587bd5e>", line 1, in <module>
>     runfile('/home/alankar/Documents/X ray map/dev.py',
> wdir='/home/alankar/Documents/X ray map')
>
>   File
> "/home/alankar/anaconda3/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py",
> line 866, in runfile
>     execfile(filename, namespace)
>
>   File
> "/home/alankar/anaconda3/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py",
> line 102, in execfile
>     exec(compile(f.read(), filename, 'exec'), namespace)
>
>   File "/home/alankar/Documents/X ray map/dev.py", line 23, in <module>
>     ds.index
>
>   File "/home/alankar/Documents/yt/yt/data_objects/static_output.py", line
> 501, in index
>     self, dataset_type=self.dataset_type)
>
>   File
> "/home/alankar/Documents/yt/yt/geometry/particle_geometry_handler.py", line
> 39, in __init__
>     super(ParticleIndex, self).__init__(ds, dataset_type)
>
>   File "/home/alankar/Documents/yt/yt/geometry/geometry_handler.py", line
> 50, in __init__
>     self._setup_geometry()
>
>   File
> "/home/alankar/Documents/yt/yt/geometry/particle_geometry_handler.py", line
> 50, in _setup_geometry
>     self._initialize_particle_handler()
>
>   File
> "/home/alankar/Documents/yt/yt/geometry/particle_geometry_handler.py", line
> 99, in _initialize_particle_handler
>     self._initialize_indices()
>
>   File
> "/home/alankar/Documents/yt/yt/geometry/particle_geometry_handler.py", line
> 121, in _initialize_indices
>     morton = np.empty(self.total_particles, dtype="uint64")
>
> MemoryError
>
>
> Alankar Dutta
>
>
>
> On Sat, Mar 4, 2017 at 11:56 PM, Alankar Dutta <dutta.alankar at gmail.com>
> wrote:
>
> Thanks Nathan, that works. Can you suggest a way to share my entire 960 GB
> snapshot (1024 files) with you? Then you can have a look at it if you are
> interested in adding further support.
>
> Alankar Dutta
>
> On Sat, Mar 4, 2017 at 11:11 PM, Nathan Goldbaum <nathan12343 at gmail.com>
> wrote:
>
> Oops, typo, should be https://bitbucket.org/ngoldbaum/yt
>
>
> On Sat, Mar 4, 2017 at 11:40 AM Alankar Dutta <dutta.alankar at gmail.com>
> wrote:
>
> Hello,
>
>
> $ hg pull -r f4c5c13 https://bitbucket.org/yt_analysis/yt
> This is giving the following error:
> pulling from https://bitbucket.org/yt_analysis/yt
> abort: unknown revision 'f4c5c13'!
>
> Alankar Dutta
>
>
> On Sat, Mar 4, 2017 at 10:46 PM, Nathan Goldbaum <nathan12343 at gmail.com>
> wrote:
>
> To test my pull request you're going to want to build yt from a clone of
> the mercurial repository after pulling in my changes.
>
> $ conda uninstall yt
> $ hg clone https://bitbucket.org/yt_analysis/yt
> $ cd yt
> $ hg pull -r f4c5c13 https://bitbucket.org/yt_analysis/yt
> $ hg update f4c5c13
> $ python setup.py develop
>
> You'll need a compilation environment set up for that to succeed. If you
> don't have compilers installed it will be less straightforward to test
> this. I'd urge you *not* to hand-edit your installed version of yt from
> conda, as that will create more headaches for you if you make a mistake and
> forget to correct it.
>
>
> On Sat, Mar 4, 2017 at 10:34 AM Alankar Dutta <dutta.alankar at gmail.com>
> wrote:
>
> Hello,
>
> I have modified the yt files as given in the pull request. I get the
> following error:
>
> Traceback (most recent call last):
>   File "dev.py", line 9, in <module>
>     import yt
>   File
> "/home/alankar/anaconda3/lib/python3.6/site-packages/yt/__init__.py", line
> 133, in <module>
>     frontends = _frontend_container()
>   File
> "/home/alankar/anaconda3/lib/python3.6/site-packages/yt/frontends/api.py",
> line 52, in __init__
>     setattr(self, frontend, importlib.import_module(_mod))
>   File "/home/alankar/anaconda3/lib/python3.6/importlib/__init__.py", line
> 126, in import_module
>     return _bootstrap._gcd_import(name[level:], package, level)
>   File
> "/home/alankar/anaconda3/lib/python3.6/site-packages/yt/frontends/eagle/api.py",
> line 17, in <module>
>     from .data_structures import \
>   File
> "/home/alankar/anaconda3/lib/python3.6/site-packages/yt/frontends/eagle/data_structures.py",
> line 21, in <module>
>     from yt.frontends.gadget.data_structures import \
>   File
> "/home/alankar/anaconda3/lib/python3.6/site-packages/yt/frontends/gadget/data_structures.py",
> line 28, in <module>
>     from yt.frontends.sph.data_structures import \
> ImportError: cannot import name 'SPHDataset'
>
> The following is my code:
> import yt
> ds = yt.load("snapshot_068.0")
>
> Alankar Dutta
>
> On Sat, Mar 4, 2017 at 3:13 PM, Alankar Dutta <dutta.alankar at gmail.com>
> wrote:
>
> Oh, I forgot to mention that the full dataset is around 950 GB.
>
> On Sat, Mar 4, 2017 at 1:02 PM, Alankar Dutta <dutta.alankar at gmail.com>
> wrote:
>
> Hello,
>
> This is only one part of the multipart snapshot of the simulation. I am
> sending you the header information from the snapshot file:
>
> {'O0': 0.27500000000000002,
>  'Ol': 0.72499999999999998,
>  'boxsize': 100000.0,
>  'flag_age': 1,
>  'flag_cooling': 1,
>  'flag_delaytime': 0,
>  'flag_fb': 1,
>  'flag_fh2': 0,
>  'flag_metals': 1,
>  'flag_potential': 3,
>  'flag_sfr': 1,
>  'flag_tmax': 0,
>  'h': 0.70199999999999996,
>  'massTable': array([ 0.        ,  0.00110449,  0.        ,  0.        ,  0.        ,  0.        ]),
>  'nbndry': 76247,
>  'nbulge': 0,
>  'ndisk': 0,
>  'ndm': 1459617792,
>  'nfiles': 1024,
>  'ngas': 1343721867,
>  'npartThisFile': array([10045237, 10239661,        0,        0,   451148,       67], dtype=uint32),
>  'npartTotal': array([1343721867, 1459617792,          0,          0,  503500456,
>              76247], dtype=uint32),
>  'npartTotalHW': array([1, 1, 0, 0, 0, 0], dtype=uint32),
>  'nstar': 503500456,
>  'redshift': 1.0000000010567627,
>  'rhocrit': 2.707660428120944e-29,
>  'time': 0.49999999973580939}
>
>
>
> I extracted this information using a separate Python program.
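>
> In case it is useful, a minimal sketch of how such a header can be read with
> Python's struct module, assuming the standard little-endian Gadget-2 binary
> layout (a 256-byte header wrapped in 4-byte Fortran record markers); the
> exact layout of a given snapshot may differ:
>
> import struct
>
> def read_gadget_header(fname):
>     with open(fname, "rb") as f:
>         struct.unpack("<I", f.read(4))                    # leading record marker (256)
>         npart = struct.unpack("<6I", f.read(24))          # particles in this file
>         mass = struct.unpack("<6d", f.read(48))           # per-type mass table
>         time, redshift = struct.unpack("<2d", f.read(16))
>         flag_sfr, flag_fb = struct.unpack("<2i", f.read(8))
>         npart_total = struct.unpack("<6I", f.read(24))    # particles across all files
>         flag_cooling, nfiles = struct.unpack("<2i", f.read(8))
>         boxsize, O0, Ol, h = struct.unpack("<4d", f.read(32))
>         return dict(npartThisFile=npart, massTable=mass, time=time,
>                     redshift=redshift, npartTotal=npart_total, nfiles=nfiles,
>                     boxsize=boxsize, O0=O0, Ol=Ol, h=h)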
>
> Alankar Dutta
>
> On Sat, Mar 4, 2017 at 12:45 PM, Nathan Goldbaum <nathan12343 at gmail.com>
> wrote:
>
>
>
> On Sat, Mar 4, 2017 at 12:25 AM, Nathan Goldbaum <nathan12343 at gmail.com>
> wrote:
>
>
>
> On Sat, Mar 4, 2017 at 12:13 AM, Alankar Dutta <dutta.alankar at gmail.com>
> wrote:
>
> Hello,
>
> Modifying the GADGET simulation isn't possible at the moment, because it
> was developed by someone else who has published a paper on it, and I want
> to use those simulation snapshots to make a mock X-ray map.
>
>
> I'm talking about modifying yt, not Gadget.
>
> Is the file you attached to your other e-mail just a single file in a
> multi-file dataset? How large is the full dataset? Do you not have a way to
> produce a full dataset in this output format that's not prohibitively large?
>
>
> I've opened a pull request that allows me to do IO on the data you
> attached: https://bitbucket.org/yt_analysis/yt/pull-requests/2537
>
> This allows me to read your data in, getting sensible values for e.g.
> position. I suspect we're not using the correct field specification because
> I see this warning:
>
> yt : [WARNING  ] 2017-03-04 01:06:09,109 Your Gadget-2 file may have extra
> columns or different precision! (1814947576 file vs 1486279952 computed)
>
> yt supports a number of field specifications out of the box, see:
>
>
> https://bitbucket.org/yt_analysis/yt/src/3eca2ae80ab14a48b643d3055d7d3c0933fa77ae/yt/frontends/gadget/definitions.py?at=yt&fileviewer=file-view-default#definitions.py-50
>
> Do you happen to know which fields are in your output file?
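>
> For reference, a field specification in yt is just an ordered tuple of
> block names; entries given as (name, type) tuples apply only to one
> particle type. If none of the built-in specifications match, a custom one
> can be passed to load(). A sketch based on the "default" specification
> linked above (whether your snapshot carries extra blocks beyond these is
> exactly what needs to be determined):
>
> my_field_spec = (
>     "Coordinates",
>     "Velocities",
>     "ParticleIDs",
>     "Mass",
>     ("InternalEnergy", "Gas"),
>     ("Density", "Gas"),
>     ("SmoothingLength", "Gas"),
> )
> ds = yt.load(fname, field_spec=my_field_spec)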
>
> Alankar Dutta
>
> On Mar 4, 2017 11:38 AM, "Nathan Goldbaum" <nathan12343 at gmail.com> wrote:
>
> I think the most straightforward thing to do here is to fix the Gadget
> frontend so that it properly reads Gadget binary data with positions
> written in double precision.
>
> Is there any chance you can generate a smallish test dataset in your Gadget
> output format that we can use for debugging purposes? With that available
> it should be straightforward to add support. You can share the dataset
> using the yt curldrop (https://docs.hub.yt/services.html#curldrop) or a
> cloud file-sharing service like Dropbox or Google Drive.
>
> Unfortunately there isn't a way to load SPH data without a full-fledged
> frontend right now.
>
> We do have a load_particles function which allows creating a dataset from
> particle data loaded as numpy arrays, but it's currently not possible to
> use it to load SPH data.
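>
> For non-SPH particle data, usage looks roughly like this (a sketch; the
> units and array contents are arbitrary placeholders):
>
> import numpy as np
> import yt
>
> n = 10000
> data = {
>     "particle_position_x": np.random.random(n),
>     "particle_position_y": np.random.random(n),
>     "particle_position_z": np.random.random(n),
>     "particle_mass": np.ones(n),
> }
> bbox = np.array([[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]])
> ds = yt.load_particles(data, length_unit="Mpc", mass_unit=2e33, bbox=bbox)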
>
> I'm actively working on improving support for SPH data in yt, and the
> ability to load SPH data with load_particles is one of the things I've
> added in that branch of the code. Hopefully this work will be stabilized
> sometime in the next few months.
>
>
>
> On Fri, Mar 3, 2017 at 11:45 PM, Alankar Dutta <dutta.alankar at gmail.com>
> wrote:
>
> Hello yt-community,
>
> I am a beginner with yt. I have an array of data which I have read from a
> Gadget simulation snapshot. It is not directly supported by yt at present
> (I have confirmed this by discussing the issue with the community before).
> The array has position, velocity, density, mass, internal energy, and
> smoothing-length information for the gas particles. How can I use this to
> make slice plots or other useful visualizations?
>
> Alankar Dutta,
> Third year Undergraduate,
> Physics Department,
> Presidency University, India
>
> _______________________________________________
> yt-users mailing list
> yt-users at lists.spacepope.org
> http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
>
>

