[yt-svn] commit/yt: 8 new changesets

commits-noreply at bitbucket.org
Sat Jul 18 06:51:40 PDT 2015


8 new commits in yt:

https://bitbucket.org/yt_analysis/yt/commits/0d6ac4368c99/
Changeset:   0d6ac4368c99
Branch:      yt
User:        BW Keller
Date:        2015-07-14 21:26:16+00:00
Summary:     Fixes Issue #1035.  The problem was that code_length changes its
meaning after it is applied.
Affected #:  1 file

diff -r b7b8cdbad2505dea1391bddbe3ab6e5d65f2e9b6 -r 0d6ac4368c991d9fd829d41fd4499c52971045e9 yt/frontends/tipsy/data_structures.py
--- a/yt/frontends/tipsy/data_structures.py
+++ b/yt/frontends/tipsy/data_structures.py
@@ -180,7 +180,7 @@
                 self.domain_left_edge = None
                 self.domain_right_edge = None
         else: 
-            bbox = self.arr(self.bounding_box, 'code_length', dtype="float64")
+            bbox = self.arr(self.bounding_box, dtype="float64")
             if bbox.shape == (2, 3):
                 bbox = bbox.transpose()
             self.domain_left_edge = bbox[:,0]


https://bitbucket.org/yt_analysis/yt/commits/8720621b3491/
Changeset:   8720621b3491
Branch:      yt
User:        BW Keller
Date:        2015-07-15 00:12:23+00:00
Summary:     Tipsy datasets were not being generated with smoothing lengths, which led to strange
behaviour (biases in calculations of cell-averaged quantities and plots).
Affected #:  1 file

diff -r 0d6ac4368c991d9fd829d41fd4499c52971045e9 -r 8720621b3491489baf0dfe6a235a928433590052 yt/frontends/tipsy/fields.py
--- a/yt/frontends/tipsy/fields.py
+++ b/yt/frontends/tipsy/fields.py
@@ -16,6 +16,8 @@
 #-----------------------------------------------------------------------------
 
 from yt.frontends.sph.fields import SPHFieldInfo
+from yt.fields.particle_fields import add_volume_weighted_smoothed_field, add_nearest_neighbor_field
+from yt.utilities.physical_constants import mp, kb
 
 class TipsyFieldInfo(SPHFieldInfo):
     aux_particle_fields = {
@@ -44,3 +46,29 @@
                 self.aux_particle_fields[field[1]] not in self.known_particle_fields:
                 self.known_particle_fields += (self.aux_particle_fields[field[1]],)
         super(TipsyFieldInfo,self).__init__(ds, field_list, slice_info)
+
+    def setup_particle_fields(self, ptype, *args, **kwargs):
+
+        # setup some special fields that only make sense for SPH particles
+
+        if ptype in ("PartType0", "Gas"):
+            self.setup_gas_particle_fields(ptype)
+
+        super(TipsyFieldInfo, self).setup_particle_fields(
+            ptype, *args, **kwargs)
+
+
+    def setup_gas_particle_fields(self, ptype):
+
+        def _smoothing_length(field, data):
+            # For now, we hardcode num_neighbors.  We should make this configurable
+            # in the future.
+            num_neighbors = 64
+            fn, = add_nearest_neighbor_field(ptype, "particle_position", self, num_neighbors)
+            return data[ptype, 'nearest_neighbor_distance_%d' % num_neighbors]
+
+        self.add_field(
+            (ptype, "smoothing_length"),
+            function=_smoothing_length,
+            particle_type=True,
+            units="code_length")


https://bitbucket.org/yt_analysis/yt/commits/a8ba56ca7f9b/
Changeset:   a8ba56ca7f9b
Branch:      bugfix-1035
User:        BW Keller
Date:        2015-07-15 00:15:20+00:00
Summary:     Bugfix for Issue #1035
Affected #:  0 files



https://bitbucket.org/yt_analysis/yt/commits/f1f968e3cd05/
Changeset:   f1f968e3cd05
Branch:      bugfix-1035
User:        BW Keller
Date:        2015-07-17 15:55:42+00:00
Summary:     Leave bbox as a NumPy array until it has units assigned.
Affected #:  1 file

diff -r a8ba56ca7f9ba525ced99ecc6b99dc6bd9ff184f -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd yt/frontends/tipsy/data_structures.py
--- a/yt/frontends/tipsy/data_structures.py
+++ b/yt/frontends/tipsy/data_structures.py
@@ -180,7 +180,7 @@
                 self.domain_left_edge = None
                 self.domain_right_edge = None
         else: 
-            bbox = self.arr(self.bounding_box, dtype="float64")
+            bbox = np.array(self.bounding_box, dtype="float64")
             if bbox.shape == (2, 3):
                 bbox = bbox.transpose()
             self.domain_left_edge = bbox[:,0]
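
The net effect of this pair of fixes is to keep the bounding box a plain NumPy
array until the dataset's unit registry exists, and only then attach
code_length. A sketch of the resulting call pattern; the bounding-box values
and dataset path are hypothetical:

    import numpy as np
    import yt

    # Keep the bbox a plain float64 array; 'code_length' only becomes
    # meaningful once the dataset's unit system has been set up.
    bbox = np.array([[-0.5, 0.5], [-0.5, 0.5], [-0.5, 0.5]], dtype="float64")
    ds = yt.load("TipsyGalaxy/galaxy.00300", bounding_box=bbox)  # hypothetical path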


https://bitbucket.org/yt_analysis/yt/commits/fa91228a919d/
Changeset:   fa91228a919d
Branch:      yt
User:        bwkeller
Date:        2015-07-17 17:20:44+00:00
Summary:     Merged yt_analysis/yt into yt
Affected #:  41 files

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -11,6 +11,7 @@
 yt/analysis_modules/halo_finding/rockstar/rockstar_interface.c
 yt/analysis_modules/ppv_cube/ppv_utils.c
 yt/frontends/ramses/_ramses_reader.cpp
+yt/frontends/sph/smoothing_kernel.c
 yt/geometry/fake_octree.c
 yt/geometry/grid_container.c
 yt/geometry/grid_visitors.c
@@ -40,6 +41,7 @@
 yt/utilities/lib/mesh_utilities.c
 yt/utilities/lib/misc_utilities.c
 yt/utilities/lib/Octree.c
+yt/utilities/lib/GridTree.c
 yt/utilities/lib/origami.c
 yt/utilities/lib/pixelization_routines.c
 yt/utilities/lib/png_writer.c

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a README
--- a/README
+++ b/README
@@ -20,4 +20,5 @@
 For more information on installation, what to do if you run into problems, or 
 ways to help development, please visit our website.
 
-Enjoy!
\ No newline at end of file
+Enjoy!
+

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a doc/install_script.sh
--- a/doc/install_script.sh
+++ b/doc/install_script.sh
@@ -641,7 +641,7 @@
 TORNADO='tornado-4.0.2'
 ZEROMQ='zeromq-4.0.5'
 ZLIB='zlib-1.2.8'
-SETUPTOOLS='setuptools-16.0'
+SETUPTOOLS='setuptools-18.0.1'
 
 # Now we dump all our SHA512 files out.
 echo '856220fa579e272ac38dcef091760f527431ff3b98df9af6e68416fcf77d9659ac5abe5c7dee41331f359614637a4ff452033085335ee499830ed126ab584267  Cython-0.22.tar.gz' > Cython-0.22.tar.gz.sha512
@@ -669,7 +669,7 @@
 echo '93591068dc63af8d50a7925d528bc0cccdd705232c529b6162619fe28dddaf115e8a460b1842877d35160bd7ed480c1bd0bdbec57d1f359085bd1814e0c1c242  tornado-4.0.2.tar.gz' > tornado-4.0.2.tar.gz.sha512
 echo '0d928ed688ed940d460fa8f8d574a9819dccc4e030d735a8c7db71b59287ee50fa741a08249e356c78356b03c2174f2f2699f05aa7dc3d380ed47d8d7bab5408  zeromq-4.0.5.tar.gz' > zeromq-4.0.5.tar.gz.sha512
 echo 'ece209d4c7ec0cb58ede791444dc754e0d10811cbbdebe3df61c0fd9f9f9867c1c3ccd5f1827f847c005e24eef34fb5bf87b5d3f894d75da04f1797538290e4a  zlib-1.2.8.tar.gz' > zlib-1.2.8.tar.gz.sha512
-echo '38a89aad89dc9aa682dbfbca623e2f69511f5e20d4a3526c01aabbc7e93ae78f20aac566676b431e111540b41540a1c4f644ce4174e7ecf052318612075e02dc  setuptools-16.0.tar.gz' > setuptools-16.0.tar.gz.sha512
+echo '9b318ce2ee2cf787929dcb886d76c492b433e71024fda9452d8b4927652a298d6bd1bdb7a4c73883a98e100024f89b46ea8aa14b250f896e549e6dd7e10a6b41  setuptools-18.0.1.tar.gz' > setuptools-18.0.1.tar.gz.sha512
 # Individual processes
 [ -z "$HDF5_DIR" ] && get_ytproject $HDF5.tar.gz
 [ $INST_ZLIB -eq 1 ] && get_ytproject $ZLIB.tar.gz

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a doc/source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -1,7 +1,7 @@
 {
  "metadata": {
   "name": "",
-  "signature": "sha256:2cc168b2c1737c67647aa29892c0213e7a58233fa53c809f9cd975a4306e9bc8"
+  "signature": "sha256:487383ec23a092310522ec25bd02ad2eb16a3402c5ed3d2b103d33fe17697b3c"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -70,6 +70,13 @@
      ]
     },
     {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "<font color='red'>**NOTE**</font>: Currently, use of the SZpack library to create S-Z projections in yt is limited to Python 2.x."
+     ]
+    },
+    {
      "cell_type": "heading",
      "level": 2,
      "metadata": {},

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a doc/source/analyzing/fields.rst
--- a/doc/source/analyzing/fields.rst
+++ b/doc/source/analyzing/fields.rst
@@ -174,7 +174,7 @@
 
 Field plugins can be loaded dynamically, although at present this is not
 particularly useful.  Plans for extending field plugins to dynamically load, to
-enable simple definition of common types (gradient, divergence, etc), and to
+enable simple definition of common types (divergence, curl, etc), and to
 more verbosely describe available fields, have been put in place for future
 versions.
 

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a doc/source/cookbook/fit_spectrum.py
--- a/doc/source/cookbook/fit_spectrum.py
+++ b/doc/source/cookbook/fit_spectrum.py
@@ -10,10 +10,10 @@
 def _OVI_number_density(field, data):
     return data['H_number_density']*2.0
 
-# Define a function that will accept a ds and add the new field 
+# Define a function that will accept a ds and add the new field
 # defined above.  This will be given to the LightRay below.
 def setup_ds(ds):
-    ds.add_field("O_p5_number_density", 
+    ds.add_field(("gas","O_p5_number_density"),
                  function=_OVI_number_density,
                  units="cm**-3")
 
@@ -62,7 +62,7 @@
 
 # Get all fields that need to be added to the light ray
 fields = ['temperature']
-for s, params in species_dicts.iteritems():
+for s, params in species_dicts.items():
     fields.append(params['field'])
 
 # Make a light ray, and set njobs to -1 to use one core
@@ -79,7 +79,7 @@
 sp = AbsorptionSpectrum(900.0, 1400.0, 50000)
 
 # Iterate over species
-for s, params in species_dicts.iteritems():
+for s, params in species_dicts.items():
     # Iterate over transitions for a single species
     for i in range(params['numLines']):
         # Add the lines to the spectrum
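
This cookbook change combines a Python 3 port (iteritems to items) with the
yt-3.x convention that fields are addressed by (field_type, field_name)
tuples. A condensed sketch of the new add_field call, taken from the diff:

    import yt

    def _OVI_number_density(field, data):
        return data['H_number_density'] * 2.0

    def setup_ds(ds):
        # yt-3.x addresses fields by (field_type, field_name) tuples
        ds.add_field(("gas", "O_p5_number_density"),
                     function=_OVI_number_density,
                     units="cm**-3")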

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a doc/source/cookbook/free_free_field.py
--- a/doc/source/cookbook/free_free_field.py
+++ /dev/null
@@ -1,105 +0,0 @@
-### THIS RECIPE IS CURRENTLY BROKEN IN YT-3.0
-### DO NOT TRUST THIS RECIPE UNTIL THIS LINE IS REMOVED
-
-import numpy as np
-import yt
-# Need to grab the proton mass from the constants database
-from yt.utilities.physical_constants import mp
-
-exit()
-# Define the emission field
-
-keVtoerg = 1.602e-9  # Convert energy in keV to energy in erg
-KtokeV = 8.617e-08  # Convert degrees Kelvin to degrees keV
-sqrt3 = np.sqrt(3.)
-expgamma = 1.78107241799  # Exponential of Euler's constant
-
-
-def _FreeFree_Emission(field, data):
-
-    if data.has_field_parameter("Z"):
-        Z = data.get_field_parameter("Z")
-    else:
-        Z = 1.077  # Primordial H/He plasma
-
-    if data.has_field_parameter("mue"):
-        mue = data.get_field_parameter("mue")
-    else:
-        mue = 1./0.875  # Primordial H/He plasma
-
-    if data.has_field_parameter("mui"):
-        mui = data.get_field_parameter("mui")
-    else:
-        mui = 1./0.8125  # Primordial H/He plasma
-
-    if data.has_field_parameter("Ephoton"):
-        Ephoton = data.get_field_parameter("Ephoton")
-    else:
-        Ephoton = 1.0  # in keV
-
-    if data.has_field_parameter("photon_emission"):
-        photon_emission = data.get_field_parameter("photon_emission")
-    else:
-        photon_emission = False  # Flag for energy or photon emission
-
-    n_e = data["density"]/(mue*mp)
-    n_i = data["density"]/(mui*mp)
-    kT = data["temperature"]*KtokeV
-
-    # Compute the Gaunt factor
-
-    g_ff = np.zeros(kT.shape)
-    g_ff[Ephoton/kT > 1.] = np.sqrt((3./np.pi)*kT[Ephoton/kT > 1.]/Ephoton)
-    g_ff[Ephoton/kT < 1.] = (sqrt3/np.pi)*np.log((4./expgamma) *
-                                                 kT[Ephoton/kT < 1.]/Ephoton)
-
-    eps_E = 1.64e-20*Z*Z*n_e*n_i/np.sqrt(data["temperature"]) * \
-        np.exp(-Ephoton/kT)*g_ff
-
-    if photon_emission:
-        eps_E /= (Ephoton*keVtoerg)
-
-    return eps_E
-
-yt.add_field("FreeFree_Emission", function=_FreeFree_Emission)
-
-# Define the luminosity derived quantity
-def _FreeFreeLuminosity(data):
-    return (data["FreeFree_Emission"]*data["cell_volume"]).sum()
-
-
-def _combFreeFreeLuminosity(data, luminosity):
-    return luminosity.sum()
-
-yt.add_quantity("FreeFree_Luminosity", function=_FreeFreeLuminosity,
-                combine_function=_combFreeFreeLuminosity, n_ret=1)
-
-ds = yt.load("GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0150")
-
-sphere = ds.sphere(ds.domain_center, (100., "kpc"))
-
-# Print out the total luminosity at 1 keV for the sphere
-
-print("L_E (1 keV, primordial) = ", sphere.quantities["FreeFree_Luminosity"]())
-
-# The defaults for the field assume a H/He primordial plasma.
-# Let's set the appropriate parameters for a pure hydrogen plasma.
-
-sphere.set_field_parameter("mue", 1.0)
-sphere.set_field_parameter("mui", 1.0)
-sphere.set_field_parameter("Z", 1.0)
-
-print("L_E (1 keV, pure hydrogen) = ", sphere.quantities["FreeFree_Luminosity"]())
-
-# Now let's print the luminosity at an energy of E = 10 keV
-
-sphere.set_field_parameter("Ephoton", 10.0)
-
-print("L_E (10 keV, pure hydrogen) = ", sphere.quantities["FreeFree_Luminosity"]())
-
-# Finally, let's set the flag for photon emission, to get the total number
-# of photons emitted at this energy:
-
-sphere.set_field_parameter("photon_emission", True)
-
-print("L_ph (10 keV, pure hydrogen) = ", sphere.quantities["FreeFree_Luminosity"]())

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a doc/source/cookbook/simulation_analysis.py
--- a/doc/source/cookbook/simulation_analysis.py
+++ b/doc/source/cookbook/simulation_analysis.py
@@ -2,11 +2,11 @@
 yt.enable_parallelism()
 import collections
 
-# Enable parallelism in the script (assuming it was called with 
+# Enable parallelism in the script (assuming it was called with
 # `mpirun -np <n_procs>` )
 yt.enable_parallelism()
 
-# By using wildcards such as ? and * with the load command, we can load up a 
+# By using wildcards such as ? and * with the load command, we can load up a
 # Time Series containing all of these datasets simultaneously.
 ts = yt.load('enzo_tiny_cosmology/DD????/DD????')
 
@@ -16,7 +16,7 @@
 # Create an empty dictionary
 data = {}
 
-# Iterate through each dataset in the Time Series (using piter allows it 
+# Iterate through each dataset in the Time Series (using piter allows it
 # to happen in parallel automatically across available processors)
 for ds in ts.piter():
     ad = ds.all_data()
@@ -31,6 +31,6 @@
 # Print out all the values we calculated.
 print("Dataset      Redshift        Density Min      Density Max")
 print("---------------------------------------------------------")
-for key, val in od.iteritems(): 
+for key, val in od.items(): 
     print("%s       %05.3f          %5.3g g/cm^3   %5.3g g/cm^3" % \
            (key, val[1], val[0][0], val[0][1]))

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a doc/source/cookbook/time_series.py
--- a/doc/source/cookbook/time_series.py
+++ b/doc/source/cookbook/time_series.py
@@ -12,7 +12,7 @@
 
 storage = {}
 
-# By using the piter() function, we can iterate on every dataset in 
+# By using the piter() function, we can iterate on every dataset in
 # the TimeSeries object.  By using the storage keyword, we can populate
 # a dictionary where the dataset is the key, and sto.result is the value
 # for later use when the loop is complete.
@@ -25,13 +25,13 @@
     sphere = ds.sphere("c", (100., "kpc"))
     # Calculate the entropy within that sphere
     entr = sphere["entropy"].sum()
-    # Store the current time and sphere entropy for this dataset in our 
+    # Store the current time and sphere entropy for this dataset in our
     # storage dictionary as a tuple
     store.result = (ds.current_time.in_units('Gyr'), entr)
 
 # Convert the storage dictionary values to a Nx2 array, so the can be easily
 # plotted
-arr = np.array(storage.values())
+arr = np.array(list(storage.values()))
 
 # Plot up the results: time versus entropy
 plt.semilogy(arr[:,0], arr[:,1], 'r-')
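
Several of these cookbook fixes are the same Python 3 port: dict.iteritems()
no longer exists, and dict.keys()/dict.values() return views rather than
lists, so they must be materialized before indexing or array conversion. A
sketch of the pattern with placeholder data:

    import numpy as np

    storage = {"DD0000": (0.5, 1.2), "DD0001": (0.4, 1.5)}

    for key, val in storage.items():   # .iteritems() is gone in Python 3
        print(key, val)

    # dict views are not lists; wrap in list() before building an array
    arr = np.array(list(storage.values()))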

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a doc/source/developing/intro.rst
--- a/doc/source/developing/intro.rst
+++ b/doc/source/developing/intro.rst
@@ -142,3 +142,77 @@
 federated database for simulation outputs, and so on and so forth.
 
 yt is an ambitious project.  Let's be ambitious together.
+
+yt Community Code of Conduct
+----------------------------
+
+The community of participants in open source 
+Scientific projects is made up of members from around the
+globe with a diverse set of skills, personalities, and
+experiences. It is through these differences that our
+community experiences success and continued growth. We
+expect everyone in our community to follow these guidelines
+when interacting with others both inside and outside of our
+community. Our goal is to keep ours a positive, inclusive,
+successful, and growing community.
+
+As members of the community,
+
+- We pledge to treat all people with respect and
+  provide a harassment- and bullying-free environment,
+  regardless of sex, sexual orientation and/or gender
+  identity, disability, physical appearance, body size,
+  race, nationality, ethnicity, and religion. In
+  particular, sexual language and imagery, sexist,
+  racist, or otherwise exclusionary jokes are not
+  appropriate.
+
+- We pledge to respect the work of others by
+  recognizing acknowledgment/citation requests of
+  original authors. As authors, we pledge to be explicit
+  about how we want our own work to be cited or
+  acknowledged.
+
+- We pledge to welcome those interested in joining the
+  community, and realize that including people with a
+  variety of opinions and backgrounds will only serve to
+  enrich our community. In particular, discussions
+  relating to pros/cons of various technologies,
+  programming languages, and so on are welcome, but
+  these should be done with respect, taking proactive
+  measure to ensure that all participants are heard and
+  feel confident that they can freely express their
+  opinions.
+
+- We pledge to welcome questions and answer them
+  respectfully, paying particular attention to those new
+  to the community. We pledge to provide respectful
+  criticisms and feedback in forums, especially in
+  discussion threads resulting from code
+  contributions.
+
+- We pledge to be conscientious of the perceptions of
+  the wider community and to respond to criticism
+  respectfully. We will strive to model behaviors that
+  encourage productive debate and disagreement, both
+  within our community and where we are criticized. We
+  will treat those outside our community with the same
+  respect as people within our community.
+
+- We pledge to help the entire community follow the
+  code of conduct, and to not remain silent when we see
+  violations of the code of conduct. We will take action
+  when members of our community violate this code such as
+  contacting confidential at yt-project.org (all emails sent to
+  this address will be treated with the strictest
+  confidence) or talking privately with the person.
+
+This code of conduct applies to all
+community situations online and offline, including mailing
+lists, forums, social media, conferences, meetings,
+associated social events, and one-to-one interactions.
+
+The yt Community Code of Conduct was adapted from the 
+`Astropy Community Code of Conduct 
+<http://www.astropy.org/about.html#codeofconduct>`_,
+which was partially inspired by the PSF code of conduct.

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a doc/source/examining/loading_data.rst
--- a/doc/source/examining/loading_data.rst
+++ b/doc/source/examining/loading_data.rst
@@ -1079,6 +1079,76 @@
 
 .. _loading-pyne-data:
 
+Halo Catalog Data
+-----------------
+
+yt has support for reading halo catalogs produced by Rockstar and the inline 
+FOF/SUBFIND halo finders of Gadget and OWLS.  The halo catalogs are treated as 
+particle datasets where each particle represents a single halo.  At this time, 
+yt does not have the ability to load the member particles for a given halo.  
+However, once loaded, further halo analysis can be performed using 
+:ref:`halo_catalog`.
+
+In the case where halo catalogs are written to multiple files, one must only 
+give the path to one of them.
+
+Gadget FOF/SUBFIND
+^^^^^^^^^^^^^^^^^^
+
+The two field types for GadgetFOF data are "Group" (FOF) and "Subhalo" (SUBFIND).
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("gadget_fof_halos/groups_042/fof_subhalo_tab_042.0.hdf5")
+   ad = ds.all_data()
+   # The halo mass
+   print ad["Group", "particle_mass"]
+   print ad["Subhalo", "particle_mass"]
+   # Halo ID
+   print ad["Group", "particle_identifier"]
+   print ad["Subhalo", "particle_identifier"]
+   # positions
+   print ad["Group", "particle_position_x"]
+   # velocities
+   print ad["Group", "particle_velocity_x"]
+
+Multidimensional fields can be accessed through the field name followed by an 
+underscore and the index.
+
+.. code-block:: python
+
+   # x component of the spin
+   print ad["Subhalo", "SubhaloSpin_0"]
+
+OWLS FOF/SUBFIND
+^^^^^^^^^^^^^^^^
+
+OWLS halo catalogs have a very similar structure to regular Gadget halo catalogs.  
+The two field types are "FOF" and "SUBFIND".
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("owls_fof_halos/groups_008/group_008.0.hdf5")
+   ad = ds.all_data()
+   # The halo mass
+   print ad["FOF", "particle_mass"]
+
+Rockstar
+^^^^^^^^
+
+Rockstar halo catalogs are loaded by providing the path to one of the .bin files.
+The single field type available is "halos".
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("rockstar_halos/halos_0.0.bin")
+   ad = ds.all_data()
+   # The halo mass
+   print ad["halos", "particle_mass"]
+
 PyNE Data
 ---------
 

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/analysis_modules/absorption_spectrum/absorption_spectrum_fit.py
--- a/yt/analysis_modules/absorption_spectrum/absorption_spectrum_fit.py
+++ b/yt/analysis_modules/absorption_spectrum/absorption_spectrum_fit.py
@@ -1011,7 +1011,7 @@
 
     """
     f = h5py.File(file_name, 'w')
-    for ion, params in lineDic.iteritems():
+    for ion, params in lineDic.items():
         f.create_dataset("{0}/N".format(ion),data=params['N'])
         f.create_dataset("{0}/b".format(ion),data=params['b'])
         f.create_dataset("{0}/z".format(ion),data=params['z'])

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/analysis_modules/cosmological_observation/light_cone/light_cone.py
--- a/yt/analysis_modules/cosmological_observation/light_cone/light_cone.py
+++ b/yt/analysis_modules/cosmological_observation/light_cone/light_cone.py
@@ -343,7 +343,7 @@
             del output["object"]
 
         # Combine results from each slice.
-        all_slices = all_storage.keys()
+        all_slices = list(all_storage.keys())
         all_slices.sort()
         for my_slice in all_slices:
             if save_slice_images:

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/analysis_modules/halo_finding/halo_objects.py
--- a/yt/analysis_modules/halo_finding/halo_objects.py
+++ b/yt/analysis_modules/halo_finding/halo_objects.py
@@ -22,6 +22,7 @@
 import glob
 import os
 import os.path as path
+from functools import cmp_to_key
 from collections import defaultdict
 from yt.extern.six import add_metaclass
 from yt.extern.six.moves import zip as izip
@@ -39,7 +40,7 @@
     TINY
 from yt.utilities.physical_ratios import \
      rho_crit_g_cm3_h2
-    
+
 from .hop.EnzoHop import RunHOP
 from .fof.EnzoFOF import RunFOF
 
@@ -138,9 +139,9 @@
         c[2] = self["particle_position_z"] - self.ds.domain_left_edge[2]
         com = []
         for i in range(3):
-            # A halo is likely periodic around a boundary if the distance 
+            # A halo is likely periodic around a boundary if the distance
             # between the max and min particle
-            # positions are larger than half the box. 
+            # positions are larger than half the box.
             # So skip the rest if the converse is true.
             # Note we might make a change here when periodicity-handling is
             # fully implemented.
@@ -444,7 +445,7 @@
         Msun2g = mass_sun_cgs
         rho_crit = rho_crit * ((1.0 + z) ** 3.0)
         # Get some pertinent information about the halo.
-        self.mass_bins = self.ds.arr(np.zeros(self.bin_count + 1, 
+        self.mass_bins = self.ds.arr(np.zeros(self.bin_count + 1,
                                               dtype='float64'),'Msun')
         dist = np.empty(thissize, dtype='float64')
         cen = self.center_of_mass()
@@ -475,7 +476,7 @@
         self.overdensity = self.mass_bins * Msun2g / \
             (4./3. * math.pi * rho_crit * \
             (self.radial_bins )**3.0)
-        
+
     def _get_ellipsoid_parameters_basic(self):
         np.seterr(all='ignore')
         # check if there are 4 particles to form an ellipsoid
@@ -501,7 +502,7 @@
         for axis in range(np.size(DW)):
             cases = np.array([position[axis],
                                 position[axis] + DW[axis],
-                              position[axis] - DW[axis]])        
+                              position[axis] - DW[axis]])
             # pick out the smallest absolute distance from com
             position[axis] = np.choose(np.abs(cases).argmin(axis=0), cases)
         # find the furthest particle's index
@@ -571,7 +572,7 @@
     _name = "RockstarHalo"
     # See particle_mask
     _radjust = 4.
-    
+
     def maximum_density(self):
         r"""Not implemented."""
         return -1
@@ -635,11 +636,11 @@
     def get_ellipsoid_parameters(self):
         r"""Calculate the parameters that describe the ellipsoid of
         the particles that constitute the halo.
-        
+
         Parameters
         ----------
         None
-        
+
         Returns
         -------
         tuple : (cm, mag_A, mag_B, mag_C, e0_vector, tilt)
@@ -650,7 +651,7 @@
               #. mag_C as a float.
               #. e0_vector as an array.
               #. tilt as a float.
-        
+
         Examples
         --------
         >>> params = halos[0].get_ellipsoid_parameters()
@@ -662,22 +663,22 @@
             basic_parameters[4], basic_parameters[5]]), basic_parameters[6]]
         toreturn.extend(updated)
         return tuple(toreturn)
-    
+
     def get_ellipsoid(self):
         r"""Returns an ellipsoidal data object.
-        
+
         This will generate a new, empty ellipsoidal data object for this
         halo.
-        
+
         Parameters
         ----------
         None.
-        
+
         Returns
         -------
         ellipsoid : `yt.data_objects.data_containers.YTEllipsoidBase`
             The ellipsoidal data object.
-        
+
         Examples
         --------
         >>> ell = halos[0].get_ellipsoid()
@@ -686,7 +687,7 @@
         ell = self.data.ds.ellipsoid(ep[0], ep[1], ep[2], ep[3],
             ep[4], ep[5])
         return ell
-    
+
 class HOPHalo(Halo):
     _name = "HOPHalo"
     pass
@@ -763,14 +764,14 @@
             self.size, key)
         if field_data is not None:
             if key == 'particle_index':
-                #this is an index for turning data sorted by particle index 
+                #this is an index for turning data sorted by particle index
                 #into the same order as the fields on disk
                 self._pid_sort = field_data.argsort().argsort()
             #convert to YTArray using the data from disk
             if key == 'particle_mass':
                 field_data = self.ds.arr(field_data, 'Msun')
             else:
-                field_data = self.ds.arr(field_data, 
+                field_data = self.ds.arr(field_data,
                     self.ds._get_field_info('unknown',key).units)
             self._saved_fields[key] = field_data
             return self._saved_fields[key]
@@ -856,21 +857,21 @@
             basic_parameters[4], basic_parameters[5]]), basic_parameters[6]]
         toreturn.extend(updated)
         return tuple(toreturn)
-    
+
     def get_ellipsoid(self):
-        r"""Returns an ellipsoidal data object.        
+        r"""Returns an ellipsoidal data object.
         This will generate a new, empty ellipsoidal data object for this
         halo.
-        
+
         Parameters
         ----------
         None.
-        
+
         Returns
         -------
         ellipsoid : `yt.data_objects.data_containers.YTEllipsoidBase`
             The ellipsoidal data object.
-        
+
         Examples
         --------
         >>> ell = halos[0].get_ellipsoid()
@@ -947,11 +948,11 @@
     def maximum_density(self):
         r"""Undefined for text halos."""
         return -1
-    
+
     def maximum_density_location(self):
         r"""Undefined, default to CoM"""
         return self.center_of_mass()
-    
+
     def get_size(self):
         # Have to just get it from the sphere.
         return self["particle_position_x"].size
@@ -964,8 +965,8 @@
     def __init__(self, data_source, dm_only=True, redshift=-1):
         """
         Run hop on *data_source* with a given density *threshold*.  If
-        *dm_only* is True (default), only run it on the dark matter particles, 
-        otherwise on all particles.  Returns an iterable collection of 
+        *dm_only* is True (default), only run it on the dark matter particles,
+        otherwise on all particles.  Returns an iterable collection of
         *HopGroup* items.
         """
         self._data_source = data_source
@@ -1051,7 +1052,7 @@
         ellipsoid_data : bool.
             Whether to print the ellipsoidal information to the file.
             Default = False.
-        
+
         Examples
         --------
         >>> halos.write_out("HopAnalysis.out")
@@ -1144,10 +1145,10 @@
     _halo_dt = np.dtype([('id', np.int64), ('pos', (np.float32, 6)),
         ('corevel', (np.float32, 3)), ('bulkvel', (np.float32, 3)),
         ('m', np.float32), ('r', np.float32), ('child_r', np.float32),
-        ('vmax_r', np.float32), 
+        ('vmax_r', np.float32),
         ('mgrav', np.float32), ('vmax', np.float32),
         ('rvmax', np.float32), ('rs', np.float32),
-        ('klypin_rs', np.float32), 
+        ('klypin_rs', np.float32),
         ('vrms', np.float32), ('J', (np.float32, 3)),
         ('energy', np.float32), ('spin', np.float32),
         ('alt_m', (np.float32, 4)), ('Xoff', np.float32),
@@ -1221,9 +1222,9 @@
         """
         Read the out_*.list text file produced
         by Rockstar into memory."""
-        
+
         ds = self.ds
-        # In order to read the binary data, we need to figure out which 
+        # In order to read the binary data, we need to figure out which
         # binary files belong to this output.
         basedir = os.path.dirname(self.out_list)
         s = self.out_list.split('_')[-1]
@@ -1523,12 +1524,14 @@
                 id += 1
 
         def haloCmp(h1, h2):
+            def cmp(a, b):
+                return (a > b) - (a < b)
             c = cmp(h1.total_mass(), h2.total_mass())
             if c != 0:
                 return -1 * c
             if c == 0:
                 return cmp(h1.center_of_mass()[0], h2.center_of_mass()[0])
-        self._groups.sort(haloCmp)
+        self._groups.sort(key=cmp_to_key(haloCmp))
         sorted_max_dens = {}
         for i, halo in enumerate(self._groups):
             if halo.id in self._max_dens:
@@ -1873,7 +1876,7 @@
 
 class LoadTextHaloes(GenericHaloFinder, TextHaloList):
     r"""Load a text file of halos.
-    
+
     Like LoadHaloes, but when all that is available is a plain
     text file. This assumes the text file has the 3-positions of halos
     along with a radius. The halo objects created are spheres.
@@ -1882,7 +1885,7 @@
     ----------
     fname : String
         The name of the text file to read in.
-    
+
     columns : dict
         A dict listing the column name : column number pairs for data
         in the text file. It is zero-based (like Python).
@@ -1890,7 +1893,7 @@
         Any column name outside of ['x', 'y', 'z', 'r'] will be attached
         to each halo object in the supplementary dict 'supp'. See
         example.
-    
+
     comment : String
         If the first character of a line is equal to this, the line is
         skipped. Default = "#".
@@ -1915,7 +1918,7 @@
     Parameters
     ----------
     fname : String
-        The name of the Rockstar file to read in. Default = 
+        The name of the Rockstar file to read in. Default =
         "rockstar_halos/out_0.list'.
 
     Examples
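
The haloCmp change in this diff is the standard Python 3 port of
comparison-based sorting: the builtin cmp() and the cmp= argument to
list.sort() are gone, so a local cmp() is defined and the comparator is
wrapped with functools.cmp_to_key. A self-contained sketch with placeholder
halo dicts:

    from functools import cmp_to_key

    def cmp(a, b):                        # Python 3 dropped the builtin cmp()
        return (a > b) - (a < b)

    def halo_cmp(h1, h2):
        # Heaviest halo first; break ties on x position
        c = cmp(h1["mass"], h2["mass"])
        return -c if c != 0 else cmp(h1["x"], h2["x"])

    halos = [{"mass": 2.0, "x": 0.1}, {"mass": 5.0, "x": 0.7}]
    halos.sort(key=cmp_to_key(halo_cmp))  # list.sort() no longer takes cmp=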

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/analysis_modules/level_sets/clump_handling.py
--- a/yt/analysis_modules/level_sets/clump_handling.py
+++ b/yt/analysis_modules/level_sets/clump_handling.py
@@ -20,7 +20,8 @@
 from yt.fields.derived_field import \
     ValidateSpatial
 from yt.funcs import mylog
-    
+from yt.extern.six import string_types
+
 from .clump_info_items import \
     clump_info_registry
 from .clump_validators import \
@@ -268,7 +269,7 @@
 
 def write_clump_index(clump, level, fh):
     top = False
-    if not isinstance(fh, file):
+    if isinstance(fh, string_types):
         fh = open(fh, "w")
         top = True
     for q in range(level):
@@ -285,7 +286,7 @@
 
 def write_clumps(clump, level, fh):
     top = False
-    if not isinstance(fh, file):
+    if isinstance(fh, string_types):
         fh = open(fh, "w")
         top = True
     if ((clump.children is None) or (len(clump.children) == 0)):
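
The clump_handling change works around the removal of the builtin file type
in Python 3: instead of testing "not a file object", the code now tests for a
path string via six's string_types. A sketch of the idiom; ensure_open is a
hypothetical helper name:

    from yt.extern.six import string_types

    def ensure_open(fh, mode="w"):
        # Python 3 has no builtin `file` type; test for a path string instead
        if isinstance(fh, string_types):
            fh = open(fh, mode)
        return fh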

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/data_objects/time_series.py
--- a/yt/data_objects/time_series.py
+++ b/yt/data_objects/time_series.py
@@ -130,7 +130,7 @@
     def __new__(cls, outputs, *args, **kwargs):
         if isinstance(outputs, string_types):
             outputs = get_filenames_from_glob_pattern(outputs)
-        ret = super(DatasetSeries, cls).__new__(cls, *args, **kwargs)
+        ret = super(DatasetSeries, cls).__new__(cls)
         try:
             ret._pre_outputs = outputs[:]
         except TypeError:
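
The one-line DatasetSeries fix is another Python 3 rule: once __new__ is
overridden, object.__new__() accepts only the class, so *args/**kwargs must
not be forwarded. A minimal reproduction with a hypothetical stand-in class:

    class Series(object):
        def __new__(cls, outputs, *args, **kwargs):
            # Forwarding *args/**kwargs to object.__new__() raises
            # TypeError on Python 3; pass only the class.
            return super(Series, cls).__new__(cls)

        def __init__(self, outputs, parallel=True):
            self.outputs = outputs
            self.parallel = parallel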

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/api.py
--- a/yt/frontends/api.py
+++ b/yt/frontends/api.py
@@ -27,6 +27,7 @@
     'fits',
     'flash',
     'gadget',
+    'gadget_fof',
     'gdf',
     'halo_catalog',
     'http_stream',

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/chombo/io.py
--- a/yt/frontends/chombo/io.py
+++ b/yt/frontends/chombo/io.py
@@ -76,7 +76,7 @@
         for key, val in self._handle.attrs.items():
             if key.startswith('component_'):
                 comp_number = int(re.match('component_(\d+)', key).groups()[0])
-                field_dict[val] = comp_number
+                field_dict[val.decode('utf-8')] = comp_number
         self._field_dict = field_dict
         return self._field_dict
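
The chombo fix addresses h5py under Python 3, where HDF5 string attributes
come back as bytes and must be decoded before use as field names. A sketch;
the file name is hypothetical:

    import h5py

    with h5py.File("chombo_output.hdf5", "r") as f:   # hypothetical file
        for key, val in f.attrs.items():
            # h5py hands back HDF5 strings as bytes under Python 3
            if isinstance(val, bytes):
                val = val.decode("utf-8")
            print(key, val)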
 

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/fits/io.py
--- a/yt/frontends/fits/io.py
+++ b/yt/frontends/fits/io.py
@@ -32,7 +32,7 @@
     def _read_particle_coords(self, chunks, ptf):
         pdata = self.ds._handle[self.ds.first_image].data
         assert(len(ptf) == 1)
-        ptype = ptf.keys()[0]
+        ptype = list(ptf.keys())[0]
         x = np.asarray(pdata.field("X"), dtype="=f8")
         y = np.asarray(pdata.field("Y"), dtype="=f8")
         z = np.ones(x.shape)
@@ -43,7 +43,7 @@
     def _read_particle_fields(self, chunks, ptf, selector):
         pdata = self.ds._handle[self.ds.first_image].data
         assert(len(ptf) == 1)
-        ptype = ptf.keys()[0]
+        ptype = list(ptf.keys())[0]
         field_list = ptf[ptype]
         x = np.asarray(pdata.field("X"), dtype="=f8")
         y = np.asarray(pdata.field("Y"), dtype="=f8")

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/fits/misc.py
--- a/yt/frontends/fits/misc.py
+++ b/yt/frontends/fits/misc.py
@@ -12,14 +12,18 @@
 
 import numpy as np
 import base64
-from yt.extern.six.moves import StringIO
+from yt.extern.six import PY3
 from yt.fields.derived_field import ValidateSpatial
 from yt.utilities.on_demand_imports import _astropy
 from yt.funcs import mylog, get_image_suffix
 from yt.visualization._mpl_imports import FigureCanvasAgg
 from yt.units.yt_array import YTQuantity, YTArray
 from yt.utilities.fits_image import FITSImageData
-
+if PY3:
+    from io import BytesIO as IO
+else:
+    from yt.extern.six.moves import StringIO as IO
+    
 import os
 
 def _make_counts(emin, emax):
@@ -255,12 +259,12 @@
 
     def _repr_html_(self):
         ret = ''
-        for k, v in self.plots.iteritems():
+        for k, v in self.plots.items():
             canvas = FigureCanvasAgg(v)
-            f = StringIO()
+            f = IO()
             canvas.print_figure(f)
             f.seek(0)
-            img = base64.b64encode(f.read())
+            img = base64.b64encode(f.read()).decode()
             ret += r'<img style="max-width:100%%;max-height:100%%;" ' \
                    r'src="data:image/png;base64,%s"><br>' % img
         return ret
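
The fits/misc.py change pairs a binary-safe buffer with an explicit decode:
the Agg canvas writes PNG bytes, so the buffer must be BytesIO on Python 3,
and base64.b64encode returns bytes that need .decode() before interpolation
into an HTML string. A sketch with stand-in image bytes:

    import base64
    from io import BytesIO

    buf = BytesIO()
    buf.write(b"\x89PNG\r\n")   # stand-in for canvas.print_figure(buf)
    buf.seek(0)
    # b64encode returns bytes on Python 3; decode before string formatting
    img = base64.b64encode(buf.read()).decode()
    html = '<img src="data:image/png;base64,%s">' % img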

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/gadget/data_structures.py
--- a/yt/frontends/gadget/data_structures.py
+++ b/yt/frontends/gadget/data_structures.py
@@ -387,7 +387,7 @@
     @classmethod
     def _is_valid(self, *args, **kwargs):
         need_groups = ['Header']
-        veto_groups = ['FOF']
+        veto_groups = ['FOF', 'Group', 'Subhalo']
         valid = True
         try:
             fh = h5py.File(args[0], mode='r')

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/gadget_fof/__init__.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/__init__.py
@@ -0,0 +1,15 @@
+"""
+API for HaloCatalog frontend.
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/gadget_fof/api.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/api.py
@@ -0,0 +1,26 @@
+"""
+API for GadgetFOF frontend
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2015, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+from .data_structures import \
+     GadgetFOFDataset
+
+from .io import \
+     IOHandlerGadgetFOFHDF5
+
+from .fields import \
+     GadgetFOFFieldInfo
+
+from . import tests

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/gadget_fof/data_structures.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/data_structures.py
@@ -0,0 +1,246 @@
+"""
+Data structures for GadgetFOF frontend.
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+from collections import defaultdict
+import h5py
+import numpy as np
+import stat
+import weakref
+import struct
+import glob
+import time
+import os
+
+from .fields import \
+    GadgetFOFFieldInfo
+
+from yt.utilities.cosmology import \
+    Cosmology
+from yt.utilities.definitions import \
+    mpc_conversion, sec_conversion
+from yt.utilities.exceptions import \
+    YTException
+from yt.utilities.logger import ytLogger as \
+    mylog
+from yt.geometry.particle_geometry_handler import \
+    ParticleIndex
+from yt.data_objects.static_output import \
+    Dataset, \
+    ParticleFile
+from yt.frontends.gadget.data_structures import \
+    _fix_unit_ordering
+import yt.utilities.fortran_utils as fpu
+from yt.units.yt_array import \
+    YTArray, \
+    YTQuantity
+
+class GadgetFOFParticleIndex(ParticleIndex):
+    def __init__(self, ds, dataset_type):
+        super(GadgetFOFParticleIndex, self).__init__(ds, dataset_type)
+
+    def _calculate_particle_index_starts(self):
+        # Halo indices are not saved in the file, so we must count by hand.
+        # File 0 has halos 0 to N_0 - 1, file 1 has halos N_0 to N_0 + N_1 - 1, etc.
+        particle_count = defaultdict(int)
+        offset_count = 0
+        for data_file in self.data_files:
+            data_file.index_start = dict([(ptype, particle_count[ptype]) for
+                                           ptype in data_file.total_particles])
+            data_file.offset_start = offset_count
+            for ptype in data_file.total_particles:
+                particle_count[ptype] += data_file.total_particles[ptype]
+            offset_count += data_file.total_offset
+
+    def _calculate_file_offset_map(self):
+        # After the FOF  is performed, a load-balancing step redistributes halos 
+        # and then writes more fields.  Here, for each file, we create a list of 
+        # files which contain the rest of the redistributed particles.
+        ifof = np.array([data_file.total_particles["Group"]
+                         for data_file in self.data_files])
+        isub = np.array([data_file.total_offset
+                         for data_file in self.data_files])
+        subend = isub.cumsum()
+        fofend = ifof.cumsum()
+        istart = np.digitize(fofend - ifof, subend - isub) - 1
+        iend = np.clip(np.digitize(fofend, subend), 0, ifof.size - 2)
+        for i, data_file in enumerate(self.data_files):
+            data_file.offset_files = self.data_files[istart[i]: iend[i] + 1]
+
+    def _detect_output_fields(self):
+        # TODO: Add additional fields
+        dsl = []
+        units = {}
+        for dom in self.data_files:
+            fl, _units = self.io._identify_fields(dom)
+            units.update(_units)
+            dom._calculate_offsets(fl)
+            for f in fl:
+                if f not in dsl: dsl.append(f)
+        self.field_list = dsl
+        ds = self.dataset
+        ds.particle_types = tuple(set(pt for pt, ds in dsl))
+        # This is an attribute that means these particle types *actually*
+        # exist.  As in, they are real, in the dataset.
+        ds.field_units.update(units)
+        ds.particle_types_raw = ds.particle_types
+            
+    def _setup_geometry(self):
+        super(GadgetFOFParticleIndex, self)._setup_geometry()
+        self._calculate_particle_index_starts()
+        self._calculate_file_offset_map()
+    
+class GadgetFOFHDF5File(ParticleFile):
+    def __init__(self, ds, io, filename, file_id):
+        super(GadgetFOFHDF5File, self).__init__(ds, io, filename, file_id)
+        with h5py.File(filename, "r") as f:
+            self.header = dict((field, f.attrs[field]) \
+                               for field in f.attrs.keys())
+    
+class GadgetFOFDataset(Dataset):
+    _index_class = GadgetFOFParticleIndex
+    _file_class = GadgetFOFHDF5File
+    _field_info_class = GadgetFOFFieldInfo
+    _suffix = ".hdf5"
+
+    def __init__(self, filename, dataset_type="gadget_fof_hdf5",
+                 n_ref=16, over_refine_factor=1,
+                 unit_base=None, units_override=None):
+        self.n_ref = n_ref
+        self.over_refine_factor = over_refine_factor
+        if unit_base is not None and "UnitLength_in_cm" in unit_base:
+            # We assume this is comoving, because in the absence of comoving
+            # integration the redshift will be zero.
+            unit_base['cmcm'] = 1.0 / unit_base["UnitLength_in_cm"]
+        self._unit_base = unit_base
+        if units_override is not None:
+            raise RuntimeError("units_override is not supported for GadgetFOFDataset. "+
+                               "Use unit_base instead.")
+        super(GadgetFOFDataset, self).__init__(filename, dataset_type,
+                                                 units_override=units_override)
+
+    def _parse_parameter_file(self):
+        handle = h5py.File(self.parameter_filename, mode="r")
+        hvals = {}
+        hvals.update((str(k), v) for k, v in handle["/Header"].attrs.items())
+        hvals["NumFiles"] = hvals["NumFiles"]
+
+        self.dimensionality = 3
+        self.refine_by = 2
+        self.unique_identifier = \
+            int(os.stat(self.parameter_filename)[stat.ST_CTIME])
+
+        # Set standard values
+        self.domain_left_edge = np.zeros(3, "float64")
+        self.domain_right_edge = np.ones(3, "float64") * hvals["BoxSize"]
+        nz = 1 << self.over_refine_factor
+        self.domain_dimensions = np.ones(3, "int32") * nz
+        self.cosmological_simulation = 1
+        self.periodicity = (True, True, True)
+        self.current_redshift = hvals["Redshift"]
+        self.omega_lambda = hvals["OmegaLambda"]
+        self.omega_matter = hvals["Omega0"]
+        self.hubble_constant = hvals["HubbleParam"]
+
+        cosmology = Cosmology(hubble_constant=self.hubble_constant,
+                              omega_matter=self.omega_matter,
+                              omega_lambda=self.omega_lambda)
+        self.current_time = cosmology.t_from_z(self.current_redshift)
+
+        self.parameters = hvals
+        prefix = os.path.abspath(
+            os.path.join(os.path.dirname(self.parameter_filename), 
+                         os.path.basename(self.parameter_filename).split(".", 1)[0]))
+        
+        suffix = self.parameter_filename.rsplit(".", 1)[-1]
+        self.filename_template = "%s.%%(num)i.%s" % (prefix, suffix)
+        self.file_count = len(glob.glob(prefix + "*" + self._suffix))
+        if self.file_count == 0:
+            raise YTException(message="No data files found.", ds=self)
+        self.particle_types = ("Group", "Subhalo")
+        self.particle_types_raw = ("Group", "Subhalo")
+        
+        handle.close()
+
+    def _set_code_unit_attributes(self):
+        # Set a sane default for cosmological simulations.
+        if self._unit_base is None and self.cosmological_simulation == 1:
+            mylog.info("Assuming length units are in Mpc/h (comoving)")
+            self._unit_base = dict(length = (1.0, "Mpccm/h"))
+        # The other same defaults we will use from the standard Gadget
+        # defaults.
+        unit_base = self._unit_base or {}
+        
+        if "length" in unit_base:
+            length_unit = unit_base["length"]
+        elif "UnitLength_in_cm" in unit_base:
+            if self.cosmological_simulation == 0:
+                length_unit = (unit_base["UnitLength_in_cm"], "cm")
+            else:
+                length_unit = (unit_base["UnitLength_in_cm"], "cmcm/h")
+        else:
+            raise RuntimeError
+        length_unit = _fix_unit_ordering(length_unit)
+        self.length_unit = self.quan(length_unit[0], length_unit[1])
+        
+        if "velocity" in unit_base:
+            velocity_unit = unit_base["velocity"]
+        elif "UnitVelocity_in_cm_per_s" in unit_base:
+            velocity_unit = (unit_base["UnitVelocity_in_cm_per_s"], "cm/s")
+        else:
+            if self.cosmological_simulation == 0:
+                velocity_unit = (1e5, "cm/s")
+            else:
+                velocity_unit = (1e5, "cmcm/s")
+        velocity_unit = _fix_unit_ordering(velocity_unit)
+        self.velocity_unit = self.quan(velocity_unit[0], velocity_unit[1])
+
+        # We set hubble_constant = 1.0 for non-cosmology, so this is safe.
+        # Default to 1e10 Msun/h if mass is not specified.
+        if "mass" in unit_base:
+            mass_unit = unit_base["mass"]
+        elif "UnitMass_in_g" in unit_base:
+            if self.cosmological_simulation == 0:
+                mass_unit = (unit_base["UnitMass_in_g"], "g")
+            else:
+                mass_unit = (unit_base["UnitMass_in_g"], "g/h")
+        else:
+            # Sane default
+            mass_unit = (1.0, "1e10*Msun/h")
+        mass_unit = _fix_unit_ordering(mass_unit)
+        self.mass_unit = self.quan(mass_unit[0], mass_unit[1])
+
+        if "time" in unit_base:
+            time_unit = unit_base["time"]
+        elif "UnitTime_in_s" in unit_base:
+            time_unit = (unit_base["UnitTime_in_s"], "s")
+        else:
+            time_unit = (1., "s")        
+        self.time_unit = self.quan(time_unit[0], time_unit[1])
+
+    @classmethod
+    def _is_valid(self, *args, **kwargs):
+        need_groups = ['Group', 'Header', 'Subhalo']
+        veto_groups = ['FOF']
+        valid = True
+        try:
+            fh = h5py.File(args[0], mode='r')
+            valid = all(ng in fh["/"] for ng in need_groups) and \
+              not any(vg in fh["/"] for vg in veto_groups)
+            fh.close()
+        except:
+            valid = False
+            pass
+        return valid
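
Taken together, the new frontend loads a FOF/SUBFIND catalog as a particle
dataset with "Group" and "Subhalo" types. A usage sketch; the sample path is
the one used in the docs added above, and if no unit_base is given masses
default to 1e10 Msun/h code units:

    import yt

    # Any one file of a multi-file catalog works; yt finds the rest.
    ds = yt.load("gadget_fof_halos/groups_042/fof_subhalo_tab_042.0.hdf5")
    ad = ds.all_data()
    print(ad["Group", "particle_mass"].in_units("Msun"))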

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/gadget_fof/fields.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/fields.py
@@ -0,0 +1,48 @@
+"""
+GadgetFOF-specific fields
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2015, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+from yt.funcs import mylog
+from yt.fields.field_info_container import \
+    FieldInfoContainer
+from yt.units.yt_array import \
+    YTArray
+
+m_units = "code_mass"
+p_units = "code_length"
+v_units = "code_velocity"
+
+class GadgetFOFFieldInfo(FieldInfoContainer):
+    known_other_fields = (
+    )
+
+    known_particle_fields = (
+        ("GroupPos_0", (p_units, ["Group", "particle_position_x"], None)),
+        ("GroupPos_1", (p_units, ["Group", "particle_position_y"], None)),
+        ("GroupPos_2", (p_units, ["Group", "particle_position_z"], None)),
+        ("GroupVel_0", (v_units, ["Group", "particle_velocity_x"], None)),
+        ("GroupVel_1", (v_units, ["Group", "particle_velocity_y"], None)),
+        ("GroupVel_2", (v_units, ["Group", "particle_velocity_z"], None)),
+        ("GroupMass",  (m_units, ["Group", "particle_mass"], None)),
+        ("GroupLen",   ("",      ["Group", "particle_number"], None)),
+        ("SubhaloPos_0", (p_units, ["Subhalo", "particle_position_x"], None)),
+        ("SubhaloPos_1", (p_units, ["Subhalo", "particle_position_y"], None)),
+        ("SubhaloPos_2", (p_units, ["Subhalo", "particle_position_z"], None)),
+        ("SubhaloVel_0", (v_units, ["Subhalo", "particle_velocity_x"], None)),
+        ("SubhaloVel_1", (v_units, ["Subhalo", "particle_velocity_y"], None)),
+        ("SubhaloVel_2", (v_units, ["Subhalo", "particle_velocity_z"], None)),
+        ("SubhaloMass",  (m_units, ["Subhalo", "particle_mass"], None)),
+        ("SubhaloLen",   ("",      ["Subhalo", "particle_number"], None)),
+)

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/gadget_fof/io.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/io.py
@@ -0,0 +1,207 @@
+"""
+GadgetFOF data-file handling function
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+import h5py
+import numpy as np
+
+from yt.utilities.exceptions import *
+from yt.funcs import mylog
+
+from yt.utilities.io_handler import \
+    BaseIOHandler
+
+from yt.utilities.lib.geometry_utils import compute_morton
+
+class IOHandlerGadgetFOFHDF5(BaseIOHandler):
+    _dataset_type = "gadget_fof_hdf5"
+
+    def __init__(self, ds):
+        super(IOHandlerGadgetFOFHDF5, self).__init__(ds)
+        self.offset_fields = set([])
+
+    def _read_fluid_selection(self, chunks, selector, fields, size):
+        raise NotImplementedError
+
+    def _read_particle_coords(self, chunks, ptf):
+        # This will read chunks and yield the results.
+        chunks = list(chunks)
+        data_files = set([])
+        for chunk in chunks:
+            for obj in chunk.objs:
+                data_files.update(obj.data_files)
+        for data_file in sorted(data_files):
+            with h5py.File(data_file.filename, "r") as f:
+                for ptype, field_list in sorted(ptf.items()):
+                    pcount = data_file.total_particles[ptype]
+                    coords = f[ptype]["%sPos" % ptype].value.astype("float64")
+                    coords = np.resize(coords, (pcount, 3))
+                    x = coords[:, 0]
+                    y = coords[:, 1]
+                    z = coords[:, 2]
+                    yield ptype, (x, y, z)
+
+    def _read_offset_particle_field(self, field, data_file, fh):
+        field_data = np.empty(data_file.total_particles["Group"], dtype="float64")
+        fofindex = np.arange(data_file.total_particles["Group"]) + data_file.index_start["Group"]
+        for offset_file in data_file.offset_files:
+            if fh.filename == offset_file.filename:
+                ofh = fh
+            else:
+                ofh = h5py.File(offset_file.filename, "r")
+            subindex = np.arange(offset_file.total_offset) + offset_file.offset_start
+            substart = max(fofindex[0] - subindex[0], 0)
+            subend = min(fofindex[-1] - subindex[0], subindex.size - 1)
+            fofstart = substart + subindex[0] - fofindex[0]
+            fofend = subend + subindex[0] - fofindex[0]
+            field_data[fofstart:fofend + 1] = ofh["Subhalo"][field][substart:subend + 1]
+        return field_data
+                    
+    def _read_particle_fields(self, chunks, ptf, selector):
+        # Now we have all the sizes, and we can allocate
+        chunks = list(chunks)
+        data_files = set([])
+        for chunk in chunks:
+            for obj in chunk.objs:
+                data_files.update(obj.data_files)
+        for data_file in sorted(data_files):
+            with h5py.File(data_file.filename, "r") as f:
+                for ptype, field_list in sorted(ptf.items()):
+                    pcount = data_file.total_particles[ptype]
+                    if pcount == 0: continue
+                    coords = f[ptype]["%sPos" % ptype].value.astype("float64")
+                    coords = np.resize(coords, (pcount, 3))
+                    x = coords[:, 0]
+                    y = coords[:, 1]
+                    z = coords[:, 2]
+                    mask = selector.select_points(x, y, z, 0.0)
+                    del x, y, z
+                    if mask is None: continue
+                    for field in field_list:
+                        if field in self.offset_fields:
+                            field_data = \
+                              self._read_offset_particle_field(field, data_file, f)
+                        else:
+                            if field == "particle_identifier":
+                                field_data = \
+                                  np.arange(data_file.total_particles[ptype]) + \
+                                  data_file.index_start[ptype]
+                            elif field in f[ptype]:
+                                field_data = f[ptype][field].value.astype("float64")
+                            else:
+                                fname = field[:field.rfind("_")]
+                                field_data = f[ptype][fname].value.astype("float64")
+                                my_div = field_data.size // pcount
+                                if my_div > 1:
+                                    field_data = np.resize(field_data, (pcount, my_div))
+                                    findex = int(field[field.rfind("_") + 1:])
+                                    field_data = field_data[:, findex]
+                        data = field_data[mask]
+                        yield (ptype, field), data
+
+    def _initialize_index(self, data_file, regions):
+        pcount = sum(data_file.total_particles.values())
+        morton = np.empty(pcount, dtype='uint64')
+        if pcount == 0: return morton
+        mylog.debug("Initializing index % 5i (% 7i particles)",
+                    data_file.file_id, pcount)
+        ind = 0
+        with h5py.File(data_file.filename, "r") as f:
+            if not f.keys(): return None
+            dx = np.finfo(f["Group"]["GroupPos"].dtype).eps
+            dx = 2.0*self.ds.quan(dx, "code_length")
+
+            for ptype in data_file.ds.particle_types_raw:
+                if data_file.total_particles[ptype] == 0: continue
+                pos = f[ptype]["%sPos" % ptype].value.astype("float64")
+                pos = np.resize(pos, (data_file.total_particles[ptype], 3))
+                pos = data_file.ds.arr(pos, "code_length")
+
+                # These are 32 bit numbers, so we give a little leeway.
+                # Otherwise, for big sets of particles, we will often bump
+                # into the domain edges.  This helps alleviate that.
+                np.clip(pos, self.ds.domain_left_edge + dx,
+                             self.ds.domain_right_edge - dx, pos)
+                if np.any(pos.min(axis=0) < self.ds.domain_left_edge) or \
+                   np.any(pos.max(axis=0) > self.ds.domain_right_edge):
+                    raise YTDomainOverflow(pos.min(axis=0),
+                                           pos.max(axis=0),
+                                           self.ds.domain_left_edge,
+                                           self.ds.domain_right_edge)
+                regions.add_data_file(pos, data_file.file_id)
+                morton[ind:ind+pos.shape[0]] = compute_morton(
+                    pos[:,0], pos[:,1], pos[:,2],
+                    data_file.ds.domain_left_edge,
+                    data_file.ds.domain_right_edge)
+                ind += pos.shape[0]
+        return morton
+
+    def _count_particles(self, data_file):
+        with h5py.File(data_file.filename, "r") as f:
+            pcount = {"Group": f["Header"].attrs["Ngroups_ThisFile"],
+                      "Subhalo": f["Header"].attrs["Nsubgroups_ThisFile"]}
+            data_file.total_offset = 0 # need to figure out how subfind works here
+            return pcount
+
+    def _identify_fields(self, data_file):
+        fields = []
+        pcount = data_file.total_particles
+        if sum(pcount.values()) == 0: return fields, {}
+        with h5py.File(data_file.filename, "r") as f:
+            for ptype in self.ds.particle_types_raw:
+                if data_file.total_particles[ptype] == 0: continue
+                fields.append((ptype, "particle_identifier"))
+                my_fields, my_offset_fields = \
+                  subfind_field_list(f[ptype], ptype, data_file.total_particles)
+                fields.extend(my_fields)
+                self.offset_fields = self.offset_fields.union(set(my_offset_fields))
+        return fields, {}
+
+def subfind_field_list(fh, ptype, pcount):
+    fields = []
+    offset_fields = []
+    for field in fh.keys():
+        if isinstance(fh[field], h5py.Group):
+            my_fields, my_offset_fields = \
+              subfind_field_list(fh[field], ptype, pcount)
+            fields.extend(my_fields)
+            offset_fields.extend(my_offset_fields)
+        else:
+            if not fh[field].size % pcount[ptype]:
+                my_div = fh[field].size // pcount[ptype]
+                fname = fh[field].name[fh[field].name.find(ptype) + len(ptype) + 1:]
+                if my_div > 1:
+                    for i in range(my_div):
+                        fields.append((ptype, "%s_%d" % (fname, i)))
+                else:
+                    fields.append((ptype, fname))
+            elif ptype == "Subfind" and \
+              not fh[field].size % fh["/Subfind"].attrs["Number_of_groups"]:
+                # These are actually Group fields, but they were written after 
+                # a load balancing step moved halos around and thus they do not
+                # correspond to the halos stored in the Group group.
+                my_div = fh[field].size // fh["/Subfind"].attrs["Number_of_groups"]
+                fname = fh[field].name[fh[field].name.find(ptype) + len(ptype) + 1:]
+                if my_div > 1:
+                    for i in range(my_div):
+                        fields.append(("Group", "%s_%d" % (fname, i)))
+                else:
+                    fields.append(("Group", fname))
+                offset_fields.append(fname)
+            else:
+                mylog.warn("Cannot add field (%s, %s) with size %d." % \
+                           (ptype, fh[field].name, fh[field].size))
+                continue
+    return fields, offset_fields

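For reference, the "%s_%d" naming scheme above flattens multidimensional
SUBFIND datasets into one field per component.  A minimal sketch of the
reshaping convention in _read_particle_fields, with hypothetical sizes
standing in for the h5py dataset:

    import numpy as np

    # A dataset holding pcount * my_div values is reshaped to
    # (pcount, my_div), and the "_0", "_1", ... suffix picks a column.
    pcount = 4                              # hypothetical halo count
    raw = np.arange(12, dtype="float64")    # stands in for f[ptype][fname].value
    my_div = raw.size // pcount             # 3 components per halo
    data = np.resize(raw, (pcount, my_div))
    findex = 0                              # from a name like "SubhaloSpin_0"
    print(data[:, findex])                  # [0. 3. 6. 9.]
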
diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/gadget_fof/setup.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/setup.py
@@ -0,0 +1,13 @@
+#!/usr/bin/env python
+import setuptools
+import os
+import sys
+import os.path
+
+
+def configuration(parent_package='', top_path=None):
+    from numpy.distutils.misc_util import Configuration
+    config = Configuration('gadget_fof', parent_package, top_path)
+    config.make_config_py()  # installs __config__.py
+    #config.make_svn_version_py()
+    return config

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/gadget_fof/tests/test_outputs.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/tests/test_outputs.py
@@ -0,0 +1,56 @@
+"""
+GadgetFOF frontend tests using gadget_fof datasets
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+import os.path
+from yt.testing import \
+    assert_equal
+from yt.utilities.answer_testing.framework import \
+    FieldValuesTest, \
+    requires_ds, \
+    requires_file, \
+    data_dir_load
+from yt.frontends.gadget_fof.api import GadgetFOFDataset
+
+p_types  = ("Group", "Subhalo")
+p_fields = ("particle_position_x", "particle_position_y",
+            "particle_position_z", "particle_velocity_x",
+            "particle_velocity_y", "particle_velocity_z",
+            "particle_mass", "particle_identifier")
+_fields = tuple([(p_type, p_field) for p_type in p_types
+                                   for p_field in p_fields])
+
+# a dataset with empty files
+g5 = "gadget_fof_halos/groups_005/fof_subhalo_tab_005.0.hdf5"
+g42 = "gadget_fof_halos/groups_042/fof_subhalo_tab_042.0.hdf5"
+
+
+ at requires_ds(g5)
+def test_fields_g5():
+    ds = data_dir_load(g5)
+    yield assert_equal, str(ds), os.path.basename(g5)
+    for field in _fields:
+        yield FieldValuesTest(g5, field, particle_type=True)
+
+
+ at requires_ds(g42)
+def test_fields_g42():
+    ds = data_dir_load(g42)
+    yield assert_equal, str(ds), os.path.basename(g42)
+    for field in _fields:
+        yield FieldValuesTest(g42, field, particle_type=True)
+
+ at requires_file(g42)
+def test_GadgetFOFDataset():
+    assert isinstance(data_dir_load(g42), GadgetFOFDataset)

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/halo_catalog/io.py
--- a/yt/frontends/halo_catalog/io.py
+++ b/yt/frontends/halo_catalog/io.py
@@ -39,7 +39,7 @@
         data_files = set([])
         # Only support halo reading for now.
         assert(len(ptf) == 1)
-        assert(ptf.keys()[0] == "halos")
+        assert(list(ptf.keys())[0] == "halos")
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
@@ -57,7 +57,7 @@
         data_files = set([])
         # Only support halo reading for now.
         assert(len(ptf) == 1)
-        assert(ptf.keys()[0] == "halos")
+        assert(list(ptf.keys())[0] == "halos")
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)

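For context on the list(ptf.keys())[0] changes here and in the other
frontends below: Python 3's dict.keys() returns a view that does not
support indexing.  A minimal illustration:

    ptf = {"halos": ["particle_mass"]}
    # Python 2: ptf.keys() is a plain list, so ptf.keys()[0] works.
    # Python 3: dict_keys has no __getitem__, so index a list copy instead.
    assert list(ptf.keys())[0] == "halos"
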
diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/owls/io.py
--- a/yt/frontends/owls/io.py
+++ b/yt/frontends/owls/io.py
@@ -70,7 +70,7 @@
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
-        for data_file in sorted(data_files):
+        for data_file in sorted(data_files, key=lambda x: x.filename):
             f = _get_h5_handle(data_file.filename)
             # This double-reads
             for ptype, field_list in sorted(ptf.items()):
@@ -88,7 +88,7 @@
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
-        for data_file in sorted(data_files):
+        for data_file in sorted(data_files, key=lambda x: x.filename):
             f = _get_h5_handle(data_file.filename)
             for ptype, field_list in sorted(ptf.items()):
                 if data_file.total_particles[ptype] == 0:

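The explicit sort key matters under Python 3, where sorted() raises a
TypeError for objects that define no ordering.  A sketch with a
hypothetical stand-in for the data_file objects:

    class DataFile(object):
        # Hypothetical stand-in; defines no __lt__, like the real objects.
        def __init__(self, filename):
            self.filename = filename

    files = [DataFile("halos.1.hdf5"), DataFile("halos.0.hdf5")]
    # sorted(files) would raise TypeError on Python 3; a key restores a
    # deterministic order (operator.attrgetter("filename") is equivalent).
    for df in sorted(files, key=lambda x: x.filename):
        print(df.filename)   # halos.0.hdf5, then halos.1.hdf5
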
diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/owls_subfind/data_structures.py
--- a/yt/frontends/owls_subfind/data_structures.py
+++ b/yt/frontends/owls_subfind/data_structures.py
@@ -27,11 +27,14 @@
 from .fields import \
     OWLSSubfindFieldInfo
 
-from yt.utilities.cosmology import Cosmology
+from yt.utilities.cosmology import \
+    Cosmology
 from yt.utilities.definitions import \
     mpc_conversion, sec_conversion
 from yt.utilities.exceptions import \
-     YTException
+    YTException
+from yt.utilities.logger import ytLogger as \
+     mylog
 from yt.geometry.particle_geometry_handler import \
     ParticleIndex
 from yt.data_objects.static_output import \
@@ -170,6 +173,7 @@
         # The other same defaults we will use from the standard Gadget
         # defaults.
         unit_base = self._unit_base or {}
+
         if "length" in unit_base:
             length_unit = unit_base["length"]
         elif "UnitLength_in_cm" in unit_base:
@@ -182,7 +186,6 @@
         length_unit = _fix_unit_ordering(length_unit)
         self.length_unit = self.quan(length_unit[0], length_unit[1])
 
-        unit_base = self._unit_base or {}
         if "velocity" in unit_base:
             velocity_unit = unit_base["velocity"]
         elif "UnitVelocity_in_cm_per_s" in unit_base:
@@ -191,6 +194,7 @@
             velocity_unit = (1e5, "cm/s")
         velocity_unit = _fix_unit_ordering(velocity_unit)
         self.velocity_unit = self.quan(velocity_unit[0], velocity_unit[1])
+
         # We set hubble_constant = 1.0 for non-cosmology, so this is safe.
         # Default to 1e10 Msun/h if mass is not specified.
         if "mass" in unit_base:
@@ -205,7 +209,14 @@
             mass_unit = (1.0, "1e10*Msun/h")
         mass_unit = _fix_unit_ordering(mass_unit)
         self.mass_unit = self.quan(mass_unit[0], mass_unit[1])
-        self.time_unit = self.quan(unit_base["UnitTime_in_s"], "s")
+
+        if "time" in unit_base:
+            time_unit = unit_base["time"]
+        elif "UnitTime_in_s" in unit_base:
+            time_unit = (unit_base["UnitTime_in_s"], "s")
+        else:
+            time_unit = (1., "s")
+        self.time_unit = self.quan(time_unit[0], time_unit[1])
 
     @classmethod
     def _is_valid(self, *args, **kwargs):

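The new time-unit logic follows the same fallback chain as the length,
velocity, and mass units above.  A standalone sketch of the precedence:

    def pick_time_unit(unit_base):
        # An explicit "time" entry wins, then the Gadget-style
        # "UnitTime_in_s" parameter, then a default of one second.
        if "time" in unit_base:
            return unit_base["time"]
        elif "UnitTime_in_s" in unit_base:
            return (unit_base["UnitTime_in_s"], "s")
        return (1.0, "s")

    print(pick_time_unit({}))                          # (1.0, 's')
    print(pick_time_unit({"UnitTime_in_s": 3.08e16}))  # (3.08e+16, 's')
    print(pick_time_unit({"time": (1.0, "Gyr")}))      # (1.0, 'Gyr')
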
diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/ramses/data_structures.py
--- a/yt/frontends/ramses/data_structures.py
+++ b/yt/frontends/ramses/data_structures.py
@@ -491,13 +491,22 @@
         """
         Generates the conversion to various physical _units based on the parameter file
         """
-        # Note that unit_l *already* converts to proper!
-        # Also note that unit_l must be multiplied by the boxlen parameter to
-        # ensure we are correctly set up for the current domain.
-        length_unit = self.parameters['unit_l']
-        boxlen = self.parameters['boxlen']
-        density_unit = self.parameters['unit_d']
-        mass_unit = density_unit * (length_unit * boxlen)**3
+        # Note that for all units given in the info file, the boxlen
+        # still needs to be folded in, as shown below.
+
+        boxlen = self.parameters['boxlen']
+        length_unit = self.parameters['unit_l'] * boxlen
+        density_unit = self.parameters['unit_d'] / boxlen**3
+
+        # In the mass unit, the factors of boxlen cancel back out, so
+        # this is equivalent to unit_d*unit_l**3.
+
+        mass_unit = density_unit * length_unit**3
+
+        # Cosmological runs are done in lookback conformal time.
+        # To convert to proper time, the time unit is calculated from
+        # the expansion factor.  This is not yet done here.
+
         time_unit = self.parameters['unit_t']
         magnetic_unit = np.sqrt(4*np.pi * mass_unit /
                                 (time_unit**2 * length_unit))

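A quick check of the boxlen cancellation claimed in the comment above,
with hypothetical info-file values:

    unit_l = 3.0857e21   # hypothetical unit_l (1 kpc in cm)
    unit_d = 6.77e-23    # hypothetical unit_d (g/cm**3)
    boxlen = 2.0

    length_unit = unit_l * boxlen
    density_unit = unit_d / boxlen**3
    mass_unit = density_unit * length_unit**3

    # The boxlen factors cancel: mass_unit == unit_d * unit_l**3.
    assert abs(mass_unit - unit_d * unit_l**3) < 1e-9 * mass_unit
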
diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/ramses/io.py
--- a/yt/frontends/ramses/io.py
+++ b/yt/frontends/ramses/io.py
@@ -20,7 +20,12 @@
     BaseIOHandler
 from yt.utilities.logger import ytLogger as mylog
 import yt.utilities.fortran_utils as fpu
-from yt.extern.six.moves import cStringIO
+from yt.extern.six import PY3
+
+if PY3:
+    from io import BytesIO as IO
+else:
+    from cStringIO import StringIO as IO
 
 class IOHandlerRAMSES(BaseIOHandler):
     _dataset_type = "ramses"
@@ -37,7 +42,7 @@
                 f = open(subset.domain.hydro_fn, "rb")
                 # This contains the boundary information, so we skim through
                 # and pick off the right vectors
-                content = cStringIO(f.read())
+                content = IO(f.read())
                 rv = subset.fill(content, fields, selector)
                 for ft, f in fields:
                     d = rv.pop(f)

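The cStringIO-to-BytesIO switch is needed because files opened in "rb"
mode yield bytes, which Python 3's StringIO rejects.  A minimal
illustration:

    from io import BytesIO

    # open(fn, "rb").read() returns bytes; on Python 3 these must be
    # wrapped in BytesIO, since io.StringIO only accepts str.
    content = BytesIO(b"\x00\x01\x02\x03")
    print(content.read(2))   # b'\x00\x01'
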
diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/rockstar/io.py
--- a/yt/frontends/rockstar/io.py
+++ b/yt/frontends/rockstar/io.py
@@ -28,6 +28,7 @@
 from yt.utilities.lib.geometry_utils import compute_morton
 
 from yt.geometry.oct_container import _ORDER_MAX
+from operator import attrgetter
 
 class IOHandlerRockstarBinary(BaseIOHandler):
     _dataset_type = "rockstar_binary"
@@ -45,12 +46,11 @@
         data_files = set([])
         # Only support halo reading for now.
         assert(len(ptf) == 1)
-        assert(ptf.keys()[0] == "halos")
+        assert(list(ptf.keys())[0] == "halos")
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
-        
-        for data_file in sorted(data_files):
+        for data_file in sorted(data_files, key=attrgetter("filename")):
             pcount = data_file.header['num_halos']
             with open(data_file.filename, "rb") as f:
                 f.seek(data_file._position_offset, os.SEEK_SET)
@@ -66,11 +66,11 @@
         data_files = set([])
         # Only support halo reading for now.
         assert(len(ptf) == 1)
-        assert(ptf.keys()[0] == "halos")
+        assert(list(ptf.keys())[0] == "halos")
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
-        for data_file in sorted(data_files):
+        for data_file in sorted(data_files, key=attrgetter("filename")):
             pcount = data_file.header['num_halos']
             with open(data_file.filename, "rb") as f:
                 for ptype, field_list in sorted(ptf.items()):

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/frontends/setup.py
--- a/yt/frontends/setup.py
+++ b/yt/frontends/setup.py
@@ -17,6 +17,7 @@
     config.add_subpackage("fits")
     config.add_subpackage("flash")
     config.add_subpackage("gadget")
+    config.add_subpackage("gadget_fof")
     config.add_subpackage("gdf")
     config.add_subpackage("halo_catalog")
     config.add_subpackage("http_stream")
@@ -34,11 +35,12 @@
     config.add_subpackage("athena/tests")
     config.add_subpackage("boxlib/tests")
     config.add_subpackage("chombo/tests")
+    config.add_subpackage("eagle/tests")
     config.add_subpackage("enzo/tests")
-    config.add_subpackage("eagle/tests")
     config.add_subpackage("fits/tests")
     config.add_subpackage("flash/tests")
     config.add_subpackage("gadget/tests")
+    config.add_subpackage("gadget_fof/tests")
     config.add_subpackage("moab/tests")
     config.add_subpackage("owls/tests")
     config.add_subpackage("owls_subfind/tests")

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/utilities/fits_image.py
--- a/yt/utilities/fits_image.py
+++ b/yt/utilities/fits_image.py
@@ -122,7 +122,7 @@
         for key in fields:
             if key not in exclude_fields:
                 if hasattr(img_data[key], "units"):
-                    self.field_units[key] = str(img_data[key].units)
+                    self.field_units[key] = img_data[key].units
                 else:
                     self.field_units[key] = "dimensionless"
                 mylog.info("Making a FITS image of field %s" % key)

diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/visualization/color_maps.py
--- a/yt/visualization/color_maps.py
+++ b/yt/visualization/color_maps.py
@@ -11,7 +11,6 @@
 # The full license is in the file COPYING.txt, distributed with this software.
 #-----------------------------------------------------------------------------
 import numpy as np
-from yt.extern.six.moves import zip as izip
 
 import matplotlib
 import matplotlib.colors as cc
@@ -86,9 +85,9 @@
                 194.5*_vs**2.88+99.72*np.exp(-77.24*(_vs-0.742)**2.0)
               + 45.40*_vs**0.089+10.0)/255.0
 
-cdict = {'red':zip(_vs,_kamae_red,_kamae_red),
-         'green':zip(_vs,_kamae_grn,_kamae_grn),
-         'blue':zip(_vs,_kamae_blu,_kamae_blu)}
+cdict = {'red':np.transpose([_vs,_kamae_red,_kamae_red]),
+         'green':np.transpose([_vs,_kamae_grn,_kamae_grn]),
+         'blue':np.transpose([_vs,_kamae_blu,_kamae_blu])}
 add_cmap('kamae', cdict)
 
 # This one is a simple black & green map
@@ -151,9 +150,9 @@
 _vs = np.linspace(0,1,256)
 for k,v in list(_cm.color_map_luts.items()):
     if k not in yt_colormaps and k not in mcm.cmap_d:
-        cdict = { 'red': zip(_vs,v[0],v[0]),
-                  'green': zip(_vs,v[1],v[1]),
-                  'blue': zip(_vs,v[2],v[2]) }
+        cdict = { 'red': np.transpose([_vs,v[0],v[0]]),
+                  'green': np.transpose([_vs,v[1],v[1]]),
+                  'blue': np.transpose([_vs,v[2],v[2]]) }
         add_cmap(k, cdict)
 
 def _extract_lookup_table(cmap_name):
@@ -393,9 +392,9 @@
     #   Second number is the (0..1) number to interpolate to when coming *from below*
     #   Third number is the (0..1) number to interpolate to when coming *from above*
     _vs = np.linspace(0,1,256)
-    cdict = {'red':   zip(_vs, cmap[:,0], cmap[:,0]),
-             'green': zip(_vs, cmap[:,1], cmap[:,1]),
-             'blue':  zip(_vs, cmap[:,2], cmap[:,2])}
+    cdict = {'red':   np.transpose([_vs, cmap[:,0], cmap[:,0]]),
+             'green': np.transpose([_vs, cmap[:,1], cmap[:,1]]),
+             'blue':  np.transpose([_vs, cmap[:,2], cmap[:,2]])}
 
     if name is not None:
         add_cmap(name, cdict)

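On Python 3, zip() returns a one-shot iterator, which matplotlib's
colormap machinery cannot index or iterate more than once; np.transpose
builds the equivalent (N, 3) array of (x, y0, y1) rows eagerly.  A small
demonstration:

    import numpy as np

    _vs = np.linspace(0, 1, 4)
    v = _vs**2
    rows = np.transpose([_vs, v, v])   # same rows zip() would have produced
    print(rows.shape)                  # (4, 3)
    print(rows[0])                     # [0. 0. 0.]
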
diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/visualization/fixed_resolution.py
--- a/yt/visualization/fixed_resolution.py
+++ b/yt/visualization/fixed_resolution.py
@@ -27,7 +27,6 @@
 import numpy as np
 import weakref
 import re
-import string
 
 class FixedResolutionBuffer(object):
     r"""
@@ -178,13 +177,13 @@
             pstr = m.string[m.start()+1:m.end()-1]
             segments = fname.split("_")
             for i,s in enumerate(segments):
-                segments[i] = string.capitalize(s)
+                segments[i] = s.capitalize()
                 if s == pstr:
                     ipstr = i
             element = segments[ipstr-1]
             roman = pnum2rom[pstr[1:]]
             label = element + '\ ' + roman + '\ ' + \
-                string.join(segments[ipstr+1:], '\ ')
+                '\ '.join(segments[ipstr+1:])
         else:
             label = fname
         return label

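The string module's function forms were removed in Python 3; the
equivalent str methods work under both interpreters.  For example:

    segments = ["h", "alpha", "emission"]
    # string.capitalize(s) and string.join(seq, sep) are gone in Python 3.
    print([s.capitalize() for s in segments])   # ['H', 'Alpha', 'Emission']
    print("\\ ".join(segments))                 # h\ alpha\ emission
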
diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/visualization/image_writer.py
--- a/yt/visualization/image_writer.py
+++ b/yt/visualization/image_writer.py
@@ -170,7 +170,7 @@
         bitmap_array = np.concatenate([bitmap_array.astype('uint8'),
                                        alpha_channel], axis=-1)
     if transpose:
-        bitmap_array = bitmap_array.swapaxes(0,1)
+        bitmap_array = bitmap_array.swapaxes(0,1).copy(order="C")
     if filename is not None:
         pw.write_png(bitmap_array, filename)
     else:

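swapaxes returns a strided view rather than C-contiguous memory, which a
PNG writer that hands the raw buffer to C code cannot handle; hence the
explicit copy.  For example:

    import numpy as np

    a = np.zeros((4, 6, 4), dtype="uint8")
    b = a.swapaxes(0, 1)
    print(b.flags["C_CONTIGUOUS"])                   # False
    print(b.copy(order="C").flags["C_CONTIGUOUS"])   # True
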
diff -r 8720621b3491489baf0dfe6a235a928433590052 -r fa91228a919dda447788e79a9a51ca9728bbfb9a yt/visualization/profile_plotter.py
--- a/yt/visualization/profile_plotter.py
+++ b/yt/visualization/profile_plotter.py
@@ -222,7 +222,7 @@
             plot_spec = [plot_spec.copy() for p in profiles]
 
         ProfilePlot._initialize_instance(self, profiles, label, plot_spec, y_log)
-
+        
     @validate_plot
     def save(self, name=None, suffix=None):
         r"""
@@ -530,15 +530,20 @@
                 xma = xmax
             extrema = {p.x_field: ((xmi, str(p.x.units)), (xma, str(p.x.units)))}
             units = {p.x_field: str(p.x.units)}
+            if self.x_log is None:
+                logs = None
+            else:
+                logs = {p.x_field: self.x_log}
             for field in p.field_map.values():
                 units[field] = str(p.field_data[field].units)
             self.profiles[i] = \
                 create_profile(p.data_source, p.x_field,
                                n_bins=len(p.x_bins)-1,
-                               fields=p.field_map.values(),
+                               fields=list(p.field_map.values()),
                                weight_field=p.weight_field,
                                accumulation=p.accumulation,
                                fractional=p.fractional,
+                               logs=logs,
                                extrema=extrema, units=units)
         return self
 
@@ -1146,6 +1151,14 @@
         extrema = {p.x_field: ((xmin, str(p.x.units)), (xmax, str(p.x.units))),
                    p.y_field: ((p.y_bins.min(), str(p.y.units)),
                                (p.y_bins.max(), str(p.y.units)))}
+        if self.x_log is not None or self.y_log is not None:
+            logs = {}
+        else:
+            logs = None
+        if self.x_log is not None:
+            logs[p.x_field] = self.x_log
+        if self.y_log is not None:
+            logs[p.y_field] = self.y_log
         deposition = getattr(self.profile, "deposition", None)
         if deposition is None:
             additional_kwargs = {'accumulation': p.accumulation,
@@ -1155,11 +1168,12 @@
         self.profile = create_profile(
             p.data_source,
             [p.x_field, p.y_field],
-            p.field_map.values(),
+            list(p.field_map.values()),
             n_bins=[len(p.x_bins)-1, len(p.y_bins)-1],
             weight_field=p.weight_field,
             units=units,
             extrema=extrema,
+            logs=logs,
             **additional_kwargs)
         for field in zunits:
             self.profile.set_field_unit(field, zunits[field])
@@ -1201,6 +1215,14 @@
         extrema = {p.x_field: ((p.x_bins.min(), str(p.x.units)),
                                (p.x_bins.max(), str(p.x.units))),
                    p.y_field: ((ymin, str(p.y.units)), (ymax, str(p.y.units)))}
+        if self.x_log is not None or self.y_log is not None:
+            logs = {}
+        else:
+            logs = None
+        if self.x_log is not None:
+            logs[p.x_field] = self.x_log
+        if self.y_log is not None:
+            logs[p.y_field] = self.y_log
         deposition = getattr(self.profile, "deposition", None)
         if deposition is None:
             additional_kwargs = {'accumulation': p.accumulation,
@@ -1210,11 +1232,12 @@
         self.profile = create_profile(
             p.data_source,
             [p.x_field, p.y_field],
-            p.field_map.values(),
+            list(p.field_map.values()),
             n_bins=[len(p.x_bins)-1, len(p.y_bins)-1],
             weight_field=p.weight_field,
             units=units,
             extrema=extrema,
+            logs=logs,
             **additional_kwargs)
         for field in zunits:
             self.profile.set_field_unit(field, zunits[field])

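The three logs hunks above share one idea: only pass a logs dict to
create_profile when the user has overridden an axis, so the profile
keeps its own defaults otherwise.  A condensed sketch, with `p` standing
in for the existing profile object:

    def build_logs(p, x_log, y_log):
        # No override on either axis: let create_profile choose defaults.
        if x_log is None and y_log is None:
            return None
        logs = {}
        if x_log is not None:
            logs[p.x_field] = x_log
        if y_log is not None:
            logs[p.y_field] = y_log
        return logs
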

https://bitbucket.org/yt_analysis/yt/commits/a4f19a0f6144/
Changeset:   a4f19a0f6144
Branch:      bugfix-1035
User:        bwkeller
Date:        2015-07-17 17:21:24+00:00
Summary:     Merged yt into bugfix-1035
Affected #:  42 files

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -11,6 +11,7 @@
 yt/analysis_modules/halo_finding/rockstar/rockstar_interface.c
 yt/analysis_modules/ppv_cube/ppv_utils.c
 yt/frontends/ramses/_ramses_reader.cpp
+yt/frontends/sph/smoothing_kernel.c
 yt/geometry/fake_octree.c
 yt/geometry/grid_container.c
 yt/geometry/grid_visitors.c
@@ -40,6 +41,7 @@
 yt/utilities/lib/mesh_utilities.c
 yt/utilities/lib/misc_utilities.c
 yt/utilities/lib/Octree.c
+yt/utilities/lib/GridTree.c
 yt/utilities/lib/origami.c
 yt/utilities/lib/pixelization_routines.c
 yt/utilities/lib/png_writer.c

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 README
--- a/README
+++ b/README
@@ -20,4 +20,5 @@
 For more information on installation, what to do if you run into problems, or 
 ways to help development, please visit our website.
 
-Enjoy!
\ No newline at end of file
+Enjoy!
+

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 doc/install_script.sh
--- a/doc/install_script.sh
+++ b/doc/install_script.sh
@@ -641,7 +641,7 @@
 TORNADO='tornado-4.0.2'
 ZEROMQ='zeromq-4.0.5'
 ZLIB='zlib-1.2.8'
-SETUPTOOLS='setuptools-16.0'
+SETUPTOOLS='setuptools-18.0.1'
 
 # Now we dump all our SHA512 files out.
 echo '856220fa579e272ac38dcef091760f527431ff3b98df9af6e68416fcf77d9659ac5abe5c7dee41331f359614637a4ff452033085335ee499830ed126ab584267  Cython-0.22.tar.gz' > Cython-0.22.tar.gz.sha512
@@ -669,7 +669,7 @@
 echo '93591068dc63af8d50a7925d528bc0cccdd705232c529b6162619fe28dddaf115e8a460b1842877d35160bd7ed480c1bd0bdbec57d1f359085bd1814e0c1c242  tornado-4.0.2.tar.gz' > tornado-4.0.2.tar.gz.sha512
 echo '0d928ed688ed940d460fa8f8d574a9819dccc4e030d735a8c7db71b59287ee50fa741a08249e356c78356b03c2174f2f2699f05aa7dc3d380ed47d8d7bab5408  zeromq-4.0.5.tar.gz' > zeromq-4.0.5.tar.gz.sha512
 echo 'ece209d4c7ec0cb58ede791444dc754e0d10811cbbdebe3df61c0fd9f9f9867c1c3ccd5f1827f847c005e24eef34fb5bf87b5d3f894d75da04f1797538290e4a  zlib-1.2.8.tar.gz' > zlib-1.2.8.tar.gz.sha512
-echo '38a89aad89dc9aa682dbfbca623e2f69511f5e20d4a3526c01aabbc7e93ae78f20aac566676b431e111540b41540a1c4f644ce4174e7ecf052318612075e02dc  setuptools-16.0.tar.gz' > setuptools-16.0.tar.gz.sha512
+echo '9b318ce2ee2cf787929dcb886d76c492b433e71024fda9452d8b4927652a298d6bd1bdb7a4c73883a98e100024f89b46ea8aa14b250f896e549e6dd7e10a6b41  setuptools-18.0.1.tar.gz' > setuptools-18.0.1.tar.gz.sha512
 # Individual processes
 [ -z "$HDF5_DIR" ] && get_ytproject $HDF5.tar.gz
 [ $INST_ZLIB -eq 1 ] && get_ytproject $ZLIB.tar.gz

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 doc/source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -1,7 +1,7 @@
 {
  "metadata": {
   "name": "",
-  "signature": "sha256:2cc168b2c1737c67647aa29892c0213e7a58233fa53c809f9cd975a4306e9bc8"
+  "signature": "sha256:487383ec23a092310522ec25bd02ad2eb16a3402c5ed3d2b103d33fe17697b3c"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -70,6 +70,13 @@
      ]
     },
     {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "<font color='red'>**NOTE**</font>: Currently, use of the SZpack library to create S-Z projections in yt is limited to Python 2.x."
+     ]
+    },
+    {
      "cell_type": "heading",
      "level": 2,
      "metadata": {},

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 doc/source/analyzing/fields.rst
--- a/doc/source/analyzing/fields.rst
+++ b/doc/source/analyzing/fields.rst
@@ -174,7 +174,7 @@
 
 Field plugins can be loaded dynamically, although at present this is not
 particularly useful.  Plans for extending field plugins to dynamically load, to
-enable simple definition of common types (gradient, divergence, etc), and to
+enable simple definition of common types (divergence, curl, etc), and to
 more verbosely describe available fields, have been put in place for future
 versions.
 

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 doc/source/cookbook/fit_spectrum.py
--- a/doc/source/cookbook/fit_spectrum.py
+++ b/doc/source/cookbook/fit_spectrum.py
@@ -10,10 +10,10 @@
 def _OVI_number_density(field, data):
     return data['H_number_density']*2.0
 
-# Define a function that will accept a ds and add the new field 
+# Define a function that will accept a ds and add the new field
 # defined above.  This will be given to the LightRay below.
 def setup_ds(ds):
-    ds.add_field("O_p5_number_density", 
+    ds.add_field(("gas","O_p5_number_density"),
                  function=_OVI_number_density,
                  units="cm**-3")
 
@@ -62,7 +62,7 @@
 
 # Get all fields that need to be added to the light ray
 fields = ['temperature']
-for s, params in species_dicts.iteritems():
+for s, params in species_dicts.items():
     fields.append(params['field'])
 
 # Make a light ray, and set njobs to -1 to use one core
@@ -79,7 +79,7 @@
 sp = AbsorptionSpectrum(900.0, 1400.0, 50000)
 
 # Iterate over species
-for s, params in species_dicts.iteritems():
+for s, params in species_dicts.items():
     # Iterate over transitions for a single species
     for i in range(params['numLines']):
         # Add the lines to the spectrum

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 doc/source/cookbook/free_free_field.py
--- a/doc/source/cookbook/free_free_field.py
+++ /dev/null
@@ -1,105 +0,0 @@
-### THIS RECIPE IS CURRENTLY BROKEN IN YT-3.0
-### DO NOT TRUST THIS RECIPE UNTIL THIS LINE IS REMOVED
-
-import numpy as np
-import yt
-# Need to grab the proton mass from the constants database
-from yt.utilities.physical_constants import mp
-
-exit()
-# Define the emission field
-
-keVtoerg = 1.602e-9  # Convert energy in keV to energy in erg
-KtokeV = 8.617e-08  # Convert degrees Kelvin to degrees keV
-sqrt3 = np.sqrt(3.)
-expgamma = 1.78107241799  # Exponential of Euler's constant
-
-
-def _FreeFree_Emission(field, data):
-
-    if data.has_field_parameter("Z"):
-        Z = data.get_field_parameter("Z")
-    else:
-        Z = 1.077  # Primordial H/He plasma
-
-    if data.has_field_parameter("mue"):
-        mue = data.get_field_parameter("mue")
-    else:
-        mue = 1./0.875  # Primordial H/He plasma
-
-    if data.has_field_parameter("mui"):
-        mui = data.get_field_parameter("mui")
-    else:
-        mui = 1./0.8125  # Primordial H/He plasma
-
-    if data.has_field_parameter("Ephoton"):
-        Ephoton = data.get_field_parameter("Ephoton")
-    else:
-        Ephoton = 1.0  # in keV
-
-    if data.has_field_parameter("photon_emission"):
-        photon_emission = data.get_field_parameter("photon_emission")
-    else:
-        photon_emission = False  # Flag for energy or photon emission
-
-    n_e = data["density"]/(mue*mp)
-    n_i = data["density"]/(mui*mp)
-    kT = data["temperature"]*KtokeV
-
-    # Compute the Gaunt factor
-
-    g_ff = np.zeros(kT.shape)
-    g_ff[Ephoton/kT > 1.] = np.sqrt((3./np.pi)*kT[Ephoton/kT > 1.]/Ephoton)
-    g_ff[Ephoton/kT < 1.] = (sqrt3/np.pi)*np.log((4./expgamma) *
-                                                 kT[Ephoton/kT < 1.]/Ephoton)
-
-    eps_E = 1.64e-20*Z*Z*n_e*n_i/np.sqrt(data["temperature"]) * \
-        np.exp(-Ephoton/kT)*g_ff
-
-    if photon_emission:
-        eps_E /= (Ephoton*keVtoerg)
-
-    return eps_E
-
-yt.add_field("FreeFree_Emission", function=_FreeFree_Emission)
-
-# Define the luminosity derived quantity
-def _FreeFreeLuminosity(data):
-    return (data["FreeFree_Emission"]*data["cell_volume"]).sum()
-
-
-def _combFreeFreeLuminosity(data, luminosity):
-    return luminosity.sum()
-
-yt.add_quantity("FreeFree_Luminosity", function=_FreeFreeLuminosity,
-                combine_function=_combFreeFreeLuminosity, n_ret=1)
-
-ds = yt.load("GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0150")
-
-sphere = ds.sphere(ds.domain_center, (100., "kpc"))
-
-# Print out the total luminosity at 1 keV for the sphere
-
-print("L_E (1 keV, primordial) = ", sphere.quantities["FreeFree_Luminosity"]())
-
-# The defaults for the field assume a H/He primordial plasma.
-# Let's set the appropriate parameters for a pure hydrogen plasma.
-
-sphere.set_field_parameter("mue", 1.0)
-sphere.set_field_parameter("mui", 1.0)
-sphere.set_field_parameter("Z", 1.0)
-
-print("L_E (1 keV, pure hydrogen) = ", sphere.quantities["FreeFree_Luminosity"]())
-
-# Now let's print the luminosity at an energy of E = 10 keV
-
-sphere.set_field_parameter("Ephoton", 10.0)
-
-print("L_E (10 keV, pure hydrogen) = ", sphere.quantities["FreeFree_Luminosity"]())
-
-# Finally, let's set the flag for photon emission, to get the total number
-# of photons emitted at this energy:
-
-sphere.set_field_parameter("photon_emission", True)
-
-print("L_ph (10 keV, pure hydrogen) = ", sphere.quantities["FreeFree_Luminosity"]())

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 doc/source/cookbook/simulation_analysis.py
--- a/doc/source/cookbook/simulation_analysis.py
+++ b/doc/source/cookbook/simulation_analysis.py
@@ -2,11 +2,11 @@
 yt.enable_parallelism()
 import collections
 
-# Enable parallelism in the script (assuming it was called with 
+# Enable parallelism in the script (assuming it was called with
 # `mpirun -np <n_procs>` )
 yt.enable_parallelism()
 
-# By using wildcards such as ? and * with the load command, we can load up a 
+# By using wildcards such as ? and * with the load command, we can load up a
 # Time Series containing all of these datasets simultaneously.
 ts = yt.load('enzo_tiny_cosmology/DD????/DD????')
 
@@ -16,7 +16,7 @@
 # Create an empty dictionary
 data = {}
 
-# Iterate through each dataset in the Time Series (using piter allows it 
+# Iterate through each dataset in the Time Series (using piter allows it
 # to happen in parallel automatically across available processors)
 for ds in ts.piter():
     ad = ds.all_data()
@@ -31,6 +31,6 @@
 # Print out all the values we calculated.
 print("Dataset      Redshift        Density Min      Density Max")
 print("---------------------------------------------------------")
-for key, val in od.iteritems(): 
+for key, val in od.items(): 
     print("%s       %05.3f          %5.3g g/cm^3   %5.3g g/cm^3" % \
            (key, val[1], val[0][0], val[0][1]))

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 doc/source/cookbook/time_series.py
--- a/doc/source/cookbook/time_series.py
+++ b/doc/source/cookbook/time_series.py
@@ -12,7 +12,7 @@
 
 storage = {}
 
-# By using the piter() function, we can iterate on every dataset in 
+# By using the piter() function, we can iterate on every dataset in
 # the TimeSeries object.  By using the storage keyword, we can populate
 # a dictionary where the dataset is the key, and sto.result is the value
 # for later use when the loop is complete.
@@ -25,13 +25,13 @@
     sphere = ds.sphere("c", (100., "kpc"))
     # Calculate the entropy within that sphere
     entr = sphere["entropy"].sum()
-    # Store the current time and sphere entropy for this dataset in our 
+    # Store the current time and sphere entropy for this dataset in our
     # storage dictionary as a tuple
     store.result = (ds.current_time.in_units('Gyr'), entr)
 
 # Convert the storage dictionary values to a Nx2 array, so the can be easily
 # plotted
-arr = np.array(storage.values())
+arr = np.array(list(storage.values()))
 
 # Plot up the results: time versus entropy
 plt.semilogy(arr[:,0], arr[:,1], 'r-')

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 doc/source/developing/intro.rst
--- a/doc/source/developing/intro.rst
+++ b/doc/source/developing/intro.rst
@@ -142,3 +142,77 @@
 federated database for simulation outputs, and so on and so forth.
 
 yt is an ambitious project.  Let's be ambitious together.
+
+yt Community Code of Conduct
+----------------------------
+
+The community of participants in open source 
+scientific projects is made up of members from around the
+globe with a diverse set of skills, personalities, and
+experiences. It is through these differences that our
+community experiences success and continued growth. We
+expect everyone in our community to follow these guidelines
+when interacting with others both inside and outside of our
+community. Our goal is to keep ours a positive, inclusive,
+successful, and growing community.
+
+As members of the community,
+
+- We pledge to treat all people with respect and
+  provide a harassment- and bullying-free environment,
+  regardless of sex, sexual orientation and/or gender
+  identity, disability, physical appearance, body size,
+  race, nationality, ethnicity, and religion. In
+  particular, sexual language and imagery, sexist,
+  racist, or otherwise exclusionary jokes are not
+  appropriate.
+
+- We pledge to respect the work of others by
+  recognizing acknowledgment/citation requests of
+  original authors. As authors, we pledge to be explicit
+  about how we want our own work to be cited or
+  acknowledged.
+
+- We pledge to welcome those interested in joining the
+  community, and realize that including people with a
+  variety of opinions and backgrounds will only serve to
+  enrich our community. In particular, discussions
+  relating to pros/cons of various technologies,
+  programming languages, and so on are welcome, but
+  these should be done with respect, taking proactive
+  measures to ensure that all participants are heard and
+  feel confident that they can freely express their
+  opinions.
+
+- We pledge to welcome questions and answer them
+  respectfully, paying particular attention to those new
+  to the community. We pledge to provide respectful
+  criticisms and feedback in forums, especially in
+  discussion threads resulting from code
+  contributions.
+
+- We pledge to be conscientious of the perceptions of
+  the wider community and to respond to criticism
+  respectfully. We will strive to model behaviors that
+  encourage productive debate and disagreement, both
+  within our community and where we are criticized. We
+  will treat those outside our community with the same
+  respect as people within our community.
+
+- We pledge to help the entire community follow the
+  code of conduct, and to not remain silent when we see
+  violations of the code of conduct. We will take action
+  when members of our community violate this code such as
+  contacting confidential at yt-project.org (all emails sent to
+  this address will be treated with the strictest
+  confidence) or talking privately with the person.
+
+This code of conduct applies to all
+community situations online and offline, including mailing
+lists, forums, social media, conferences, meetings,
+associated social events, and one-to-one interactions.
+
+The yt Community Code of Conduct was adapted from the 
+`Astropy Community Code of Conduct 
+<http://www.astropy.org/about.html#codeofconduct>`_,
+which was partially inspired by the PSF code of conduct.

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 doc/source/examining/loading_data.rst
--- a/doc/source/examining/loading_data.rst
+++ b/doc/source/examining/loading_data.rst
@@ -1079,6 +1079,76 @@
 
 .. _loading-pyne-data:
 
+Halo Catalog Data
+-----------------
+
+yt has support for reading halo catalogs produced by Rockstar and the inline 
+FOF/SUBFIND halo finders of Gadget and OWLS.  The halo catalogs are treated as 
+particle datasets where each particle represents a single halo.  At this time, 
+yt does not have the ability to load the member particles for a given halo.  
+However, once loaded, further halo analysis can be performed using 
+:ref:`halo_catalog`.
+
+In the case where halo catalogs are written to multiple files, one need only 
+give the path to one of them.
+
+Gadget FOF/SUBFIND
+^^^^^^^^^^^^^^^^^^
+
+The two field types for GadgetFOF data are "Group" (FOF) and "Subhalo" (SUBFIND).
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("gadget_fof_halos/groups_042/fof_subhalo_tab_042.0.hdf5")
+   ad = ds.all_data()
+   # The halo mass
+   print ad["Group", "particle_mass"]
+   print ad["Subhalo", "particle_mass"]
+   # Halo ID
+   print ad["Group", "particle_identifier"]
+   print ad["Subhalo", "particle_identifier"]
+   # positions
+   print ad["Group", "particle_position_x"]
+   # velocities
+   print ad["Group", "particle_velocity_x"]
+
+Multidimensional fields can be accessed through the field name followed by an 
+underscore and the index.
+
+.. code-block:: python
+
+   # x component of the spin
+   print ad["Subhalo", "SubhaloSpin_0"]
+
+OWLS FOF/SUBFIND
+^^^^^^^^^^^^^^^^
+
+OWLS halo catalogs have a very similar structure to regular Gadget halo catalogs.  
+The two field types are "FOF" and "SUBFIND".
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("owls_fof_halos/groups_008/group_008.0.hdf5")
+   ad = ds.all_data()
+   # The halo mass
+   print ad["FOF", "particle_mass"]
+
+Rockstar
+^^^^^^^^
+
+Rockstar halo catalogs are loaded by providing the path to one of the .bin files.
+The single field type available is "halos".
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("rockstar_halos/halos_0.0.bin")
+   ad = ds.all_data()
+   # The halo mass
+   print ad["halos", "particle_mass"]
+
 PyNE Data
 ---------
 

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/analysis_modules/absorption_spectrum/absorption_spectrum_fit.py
--- a/yt/analysis_modules/absorption_spectrum/absorption_spectrum_fit.py
+++ b/yt/analysis_modules/absorption_spectrum/absorption_spectrum_fit.py
@@ -1011,7 +1011,7 @@
 
     """
     f = h5py.File(file_name, 'w')
-    for ion, params in lineDic.iteritems():
+    for ion, params in lineDic.items():
         f.create_dataset("{0}/N".format(ion),data=params['N'])
         f.create_dataset("{0}/b".format(ion),data=params['b'])
         f.create_dataset("{0}/z".format(ion),data=params['z'])

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/analysis_modules/cosmological_observation/light_cone/light_cone.py
--- a/yt/analysis_modules/cosmological_observation/light_cone/light_cone.py
+++ b/yt/analysis_modules/cosmological_observation/light_cone/light_cone.py
@@ -343,7 +343,7 @@
             del output["object"]
 
         # Combine results from each slice.
-        all_slices = all_storage.keys()
+        all_slices = list(all_storage.keys())
         all_slices.sort()
         for my_slice in all_slices:
             if save_slice_images:

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/analysis_modules/halo_finding/halo_objects.py
--- a/yt/analysis_modules/halo_finding/halo_objects.py
+++ b/yt/analysis_modules/halo_finding/halo_objects.py
@@ -22,6 +22,7 @@
 import glob
 import os
 import os.path as path
+from functools import cmp_to_key
 from collections import defaultdict
 from yt.extern.six import add_metaclass
 from yt.extern.six.moves import zip as izip
@@ -39,7 +40,7 @@
     TINY
 from yt.utilities.physical_ratios import \
      rho_crit_g_cm3_h2
-    
+
 from .hop.EnzoHop import RunHOP
 from .fof.EnzoFOF import RunFOF
 
@@ -138,9 +139,9 @@
         c[2] = self["particle_position_z"] - self.ds.domain_left_edge[2]
         com = []
         for i in range(3):
-            # A halo is likely periodic around a boundary if the distance 
+            # A halo is likely periodic around a boundary if the distance
             # between the max and min particle
-            # positions are larger than half the box. 
+            # positions are larger than half the box.
             # So skip the rest if the converse is true.
             # Note we might make a change here when periodicity-handling is
             # fully implemented.
@@ -444,7 +445,7 @@
         Msun2g = mass_sun_cgs
         rho_crit = rho_crit * ((1.0 + z) ** 3.0)
         # Get some pertinent information about the halo.
-        self.mass_bins = self.ds.arr(np.zeros(self.bin_count + 1, 
+        self.mass_bins = self.ds.arr(np.zeros(self.bin_count + 1,
                                               dtype='float64'),'Msun')
         dist = np.empty(thissize, dtype='float64')
         cen = self.center_of_mass()
@@ -475,7 +476,7 @@
         self.overdensity = self.mass_bins * Msun2g / \
             (4./3. * math.pi * rho_crit * \
             (self.radial_bins )**3.0)
-        
+
     def _get_ellipsoid_parameters_basic(self):
         np.seterr(all='ignore')
         # check if there are 4 particles to form an ellipsoid
@@ -501,7 +502,7 @@
         for axis in range(np.size(DW)):
             cases = np.array([position[axis],
                                 position[axis] + DW[axis],
-                              position[axis] - DW[axis]])        
+                              position[axis] - DW[axis]])
             # pick out the smallest absolute distance from com
             position[axis] = np.choose(np.abs(cases).argmin(axis=0), cases)
         # find the furthest particle's index
@@ -571,7 +572,7 @@
     _name = "RockstarHalo"
     # See particle_mask
     _radjust = 4.
-    
+
     def maximum_density(self):
         r"""Not implemented."""
         return -1
@@ -635,11 +636,11 @@
     def get_ellipsoid_parameters(self):
         r"""Calculate the parameters that describe the ellipsoid of
         the particles that constitute the halo.
-        
+
         Parameters
         ----------
         None
-        
+
         Returns
         -------
         tuple : (cm, mag_A, mag_B, mag_C, e0_vector, tilt)
@@ -650,7 +651,7 @@
               #. mag_C as a float.
               #. e0_vector as an array.
               #. tilt as a float.
-        
+
         Examples
         --------
         >>> params = halos[0].get_ellipsoid_parameters()
@@ -662,22 +663,22 @@
             basic_parameters[4], basic_parameters[5]]), basic_parameters[6]]
         toreturn.extend(updated)
         return tuple(toreturn)
-    
+
     def get_ellipsoid(self):
         r"""Returns an ellipsoidal data object.
-        
+
         This will generate a new, empty ellipsoidal data object for this
         halo.
-        
+
         Parameters
         ----------
         None.
-        
+
         Returns
         -------
         ellipsoid : `yt.data_objects.data_containers.YTEllipsoidBase`
             The ellipsoidal data object.
-        
+
         Examples
         --------
         >>> ell = halos[0].get_ellipsoid()
@@ -686,7 +687,7 @@
         ell = self.data.ds.ellipsoid(ep[0], ep[1], ep[2], ep[3],
             ep[4], ep[5])
         return ell
-    
+
 class HOPHalo(Halo):
     _name = "HOPHalo"
     pass
@@ -763,14 +764,14 @@
             self.size, key)
         if field_data is not None:
             if key == 'particle_index':
-                #this is an index for turning data sorted by particle index 
+                #this is an index for turning data sorted by particle index
                 #into the same order as the fields on disk
                 self._pid_sort = field_data.argsort().argsort()
             #convert to YTArray using the data from disk
             if key == 'particle_mass':
                 field_data = self.ds.arr(field_data, 'Msun')
             else:
-                field_data = self.ds.arr(field_data, 
+                field_data = self.ds.arr(field_data,
                     self.ds._get_field_info('unknown',key).units)
             self._saved_fields[key] = field_data
             return self._saved_fields[key]
@@ -856,21 +857,21 @@
             basic_parameters[4], basic_parameters[5]]), basic_parameters[6]]
         toreturn.extend(updated)
         return tuple(toreturn)
-    
+
     def get_ellipsoid(self):
-        r"""Returns an ellipsoidal data object.        
+        r"""Returns an ellipsoidal data object.
         This will generate a new, empty ellipsoidal data object for this
         halo.
-        
+
         Parameters
         ----------
         None.
-        
+
         Returns
         -------
         ellipsoid : `yt.data_objects.data_containers.YTEllipsoidBase`
             The ellipsoidal data object.
-        
+
         Examples
         --------
         >>> ell = halos[0].get_ellipsoid()
@@ -947,11 +948,11 @@
     def maximum_density(self):
         r"""Undefined for text halos."""
         return -1
-    
+
     def maximum_density_location(self):
         r"""Undefined, default to CoM"""
         return self.center_of_mass()
-    
+
     def get_size(self):
         # Have to just get it from the sphere.
         return self["particle_position_x"].size
@@ -964,8 +965,8 @@
     def __init__(self, data_source, dm_only=True, redshift=-1):
         """
         Run hop on *data_source* with a given density *threshold*.  If
-        *dm_only* is True (default), only run it on the dark matter particles, 
-        otherwise on all particles.  Returns an iterable collection of 
+        *dm_only* is True (default), only run it on the dark matter particles,
+        otherwise on all particles.  Returns an iterable collection of
         *HopGroup* items.
         """
         self._data_source = data_source
@@ -1051,7 +1052,7 @@
         ellipsoid_data : bool.
             Whether to print the ellipsoidal information to the file.
             Default = False.
-        
+
         Examples
         --------
         >>> halos.write_out("HopAnalysis.out")
@@ -1144,10 +1145,10 @@
     _halo_dt = np.dtype([('id', np.int64), ('pos', (np.float32, 6)),
         ('corevel', (np.float32, 3)), ('bulkvel', (np.float32, 3)),
         ('m', np.float32), ('r', np.float32), ('child_r', np.float32),
-        ('vmax_r', np.float32), 
+        ('vmax_r', np.float32),
         ('mgrav', np.float32), ('vmax', np.float32),
         ('rvmax', np.float32), ('rs', np.float32),
-        ('klypin_rs', np.float32), 
+        ('klypin_rs', np.float32),
         ('vrms', np.float32), ('J', (np.float32, 3)),
         ('energy', np.float32), ('spin', np.float32),
         ('alt_m', (np.float32, 4)), ('Xoff', np.float32),
@@ -1221,9 +1222,9 @@
         """
         Read the out_*.list text file produced
         by Rockstar into memory."""
-        
+
         ds = self.ds
-        # In order to read the binary data, we need to figure out which 
+        # In order to read the binary data, we need to figure out which
         # binary files belong to this output.
         basedir = os.path.dirname(self.out_list)
         s = self.out_list.split('_')[-1]
@@ -1523,12 +1524,14 @@
                 id += 1
 
         def haloCmp(h1, h2):
+            def cmp(a, b):
+                return (a > b) - (a < b)
             c = cmp(h1.total_mass(), h2.total_mass())
             if c != 0:
                 return -1 * c
             if c == 0:
                 return cmp(h1.center_of_mass()[0], h2.center_of_mass()[0])
-        self._groups.sort(haloCmp)
+        self._groups.sort(key=cmp_to_key(haloCmp))
         sorted_max_dens = {}
         for i, halo in enumerate(self._groups):
             if halo.id in self._max_dens:
@@ -1873,7 +1876,7 @@
 
 class LoadTextHaloes(GenericHaloFinder, TextHaloList):
     r"""Load a text file of halos.
-    
+
     Like LoadHaloes, but when all that is available is a plain
     text file. This assumes the text file has the 3-positions of halos
     along with a radius. The halo objects created are spheres.
@@ -1882,7 +1885,7 @@
     ----------
     fname : String
         The name of the text file to read in.
-    
+
     columns : dict
         A dict listing the column name : column number pairs for data
         in the text file. It is zero-based (like Python).
@@ -1890,7 +1893,7 @@
         Any column name outside of ['x', 'y', 'z', 'r'] will be attached
         to each halo object in the supplementary dict 'supp'. See
         example.
-    
+
     comment : String
         If the first character of a line is equal to this, the line is
         skipped. Default = "#".
@@ -1915,7 +1918,7 @@
     Parameters
     ----------
     fname : String
-        The name of the Rockstar file to read in. Default = 
+        The name of the Rockstar file to read in. Default =
         "rockstar_halos/out_0.list'.
 
     Examples

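Python 3 removes both the builtin cmp() and the cmp= argument to sort(),
hence the local helper and functools.cmp_to_key above.  A runnable
sketch, with (mass, x) tuples standing in for halo objects:

    from functools import cmp_to_key

    def cmp(a, b):
        return (a > b) - (a < b)

    def haloCmp(h1, h2):
        # Descending total mass; ties broken by ascending x position.
        c = cmp(h1[0], h2[0])
        if c != 0:
            return -1 * c
        return cmp(h1[1], h2[1])

    halos = [(1e12, 0.3), (5e12, 0.1), (1e12, 0.2)]
    # Most massive halo first, then equal-mass halos ordered by x.
    print(sorted(halos, key=cmp_to_key(haloCmp)))
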
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/analysis_modules/level_sets/clump_handling.py
--- a/yt/analysis_modules/level_sets/clump_handling.py
+++ b/yt/analysis_modules/level_sets/clump_handling.py
@@ -20,7 +20,8 @@
 from yt.fields.derived_field import \
     ValidateSpatial
 from yt.funcs import mylog
-    
+from yt.extern.six import string_types
+
 from .clump_info_items import \
     clump_info_registry
 from .clump_validators import \
@@ -268,7 +269,7 @@
 
 def write_clump_index(clump, level, fh):
     top = False
-    if not isinstance(fh, file):
+    if isinstance(fh, string_types):
         fh = open(fh, "w")
         top = True
     for q in range(level):
@@ -285,7 +286,7 @@
 
 def write_clumps(clump, level, fh):
     top = False
-    if not isinstance(fh, file):
+    if isinstance(fh, string_types):
         fh = open(fh, "w")
         top = True
     if ((clump.children is None) or (len(clump.children) == 0)):

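Python 3 has no builtin `file` type, so the check is inverted: test
whether the argument is a path string and open it if so.  A sketch of
the pattern, with string_types pinned to its Python 3 value:

    string_types = (str,)   # what yt.extern.six provides on Python 3

    def open_if_needed(fh):
        # A path was passed: open it and note that we own the handle.
        if isinstance(fh, string_types):
            return open(fh, "w"), True
        # Already an open file-like object owned by the caller.
        return fh, False
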
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/data_objects/time_series.py
--- a/yt/data_objects/time_series.py
+++ b/yt/data_objects/time_series.py
@@ -130,7 +130,7 @@
     def __new__(cls, outputs, *args, **kwargs):
         if isinstance(outputs, string_types):
             outputs = get_filenames_from_glob_pattern(outputs)
-        ret = super(DatasetSeries, cls).__new__(cls, *args, **kwargs)
+        ret = super(DatasetSeries, cls).__new__(cls)
         try:
             ret._pre_outputs = outputs[:]
         except TypeError:

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/api.py
--- a/yt/frontends/api.py
+++ b/yt/frontends/api.py
@@ -27,6 +27,7 @@
     'fits',
     'flash',
     'gadget',
+    'gadget_fof',
     'gdf',
     'halo_catalog',
     'http_stream',

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/chombo/io.py
--- a/yt/frontends/chombo/io.py
+++ b/yt/frontends/chombo/io.py
@@ -76,7 +76,7 @@
         for key, val in self._handle.attrs.items():
             if key.startswith('component_'):
                 comp_number = int(re.match('component_(\d+)', key).groups()[0])
-                field_dict[val] = comp_number
+                field_dict[val.decode('utf-8')] = comp_number
         self._field_dict = field_dict
         return self._field_dict
 

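The decode in the chombo hunk above is needed because h5py returns string-valued HDF5 attributes as bytes under Python 3, and bytes keys would never match later str lookups. A hedged sketch that writes and re-reads a throwaway file:

    import h5py

    with h5py.File("attrs_demo.h5", "w") as f:
        f.attrs["component_0"] = b"density"   # stored as bytes

    field_dict = {}
    with h5py.File("attrs_demo.h5", "r") as f:
        for key, val in f.attrs.items():
            if isinstance(val, bytes):        # bytes on py3, str on py2
                val = val.decode("utf-8")
            field_dict[val] = int(key.split("_")[-1])
    print(field_dict)                         # {'density': 0}
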
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/fits/io.py
--- a/yt/frontends/fits/io.py
+++ b/yt/frontends/fits/io.py
@@ -32,7 +32,7 @@
     def _read_particle_coords(self, chunks, ptf):
         pdata = self.ds._handle[self.ds.first_image].data
         assert(len(ptf) == 1)
-        ptype = ptf.keys()[0]
+        ptype = list(ptf.keys())[0]
         x = np.asarray(pdata.field("X"), dtype="=f8")
         y = np.asarray(pdata.field("Y"), dtype="=f8")
         z = np.ones(x.shape)
@@ -43,7 +43,7 @@
     def _read_particle_fields(self, chunks, ptf, selector):
         pdata = self.ds._handle[self.ds.first_image].data
         assert(len(ptf) == 1)
-        ptype = ptf.keys()[0]
+        ptype = list(ptf.keys())[0]
         field_list = ptf[ptype]
         x = np.asarray(pdata.field("X"), dtype="=f8")
         y = np.asarray(pdata.field("Y"), dtype="=f8")

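Both fits/io.py hunks above fix the same incompatibility: dict.keys() returns a list on Python 2 but a non-indexable view object on Python 3, so ptf.keys()[0] raises a TypeError there. In miniature:

    ptf = {"io": ["particle_mass"]}
    # ptype = ptf.keys()[0]        # TypeError under Python 3
    ptype = list(ptf.keys())[0]    # portable
    # With exactly one key asserted, next(iter(ptf)) is an equivalent
    # alternative that avoids building an intermediate list.
    assert ptype == next(iter(ptf)) == "io"
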
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/fits/misc.py
--- a/yt/frontends/fits/misc.py
+++ b/yt/frontends/fits/misc.py
@@ -12,14 +12,18 @@
 
 import numpy as np
 import base64
-from yt.extern.six.moves import StringIO
+from yt.extern.six import PY3
 from yt.fields.derived_field import ValidateSpatial
 from yt.utilities.on_demand_imports import _astropy
 from yt.funcs import mylog, get_image_suffix
 from yt.visualization._mpl_imports import FigureCanvasAgg
 from yt.units.yt_array import YTQuantity, YTArray
 from yt.utilities.fits_image import FITSImageData
-
+if PY3:
+    from io import BytesIO as IO
+else:
+    from yt.extern.six.moves import StringIO as IO
+    
 import os
 
 def _make_counts(emin, emax):
@@ -255,12 +259,12 @@
 
     def _repr_html_(self):
         ret = ''
-        for k, v in self.plots.iteritems():
+        for k, v in self.plots.items():
             canvas = FigureCanvasAgg(v)
-            f = StringIO()
+            f = IO()
             canvas.print_figure(f)
             f.seek(0)
-            img = base64.b64encode(f.read())
+            img = base64.b64encode(f.read()).decode()
             ret += r'<img style="max-width:100%%;max-height:100%%;" ' \
                    r'src="data:image/png;base64,%s"><br>' % img
         return ret

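The _repr_html_ rewrite above touches three Python 3 pain points at once: iteritems() is gone, PNG output is binary and therefore needs a BytesIO rather than a StringIO, and base64.b64encode returns bytes that must be decoded before interpolation into an HTML string. A self-contained sketch of the same pipeline, assuming matplotlib is installed:

    import base64
    from io import BytesIO

    from matplotlib.figure import Figure
    from matplotlib.backends.backend_agg import FigureCanvasAgg

    fig = Figure()
    fig.add_subplot(111).plot([0, 1], [1, 0])
    canvas = FigureCanvasAgg(fig)

    buf = BytesIO()                                # PNG data is binary
    canvas.print_figure(buf)
    buf.seek(0)
    img = base64.b64encode(buf.read()).decode()    # bytes -> str
    html = '<img src="data:image/png;base64,%s"><br>' % img
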
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/gadget/data_structures.py
--- a/yt/frontends/gadget/data_structures.py
+++ b/yt/frontends/gadget/data_structures.py
@@ -387,7 +387,7 @@
     @classmethod
     def _is_valid(self, *args, **kwargs):
         need_groups = ['Header']
-        veto_groups = ['FOF']
+        veto_groups = ['FOF', 'Group', 'Subhalo']
         valid = True
         try:
             fh = h5py.File(args[0], mode='r')

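The widened veto list above keeps the plain Gadget HDF5 frontend from claiming the FOF/subhalo catalog files that the new gadget_fof frontend (added below) recognizes. The need/veto validation idiom, reduced to a sketch with a hypothetical filename:

    import h5py

    def claims_file(filename, need_groups, veto_groups):
        # Claim a file only if every required group exists and no
        # vetoed group (marking some other file type) does.
        try:
            with h5py.File(filename, mode="r") as fh:
                return (all(g in fh["/"] for g in need_groups) and
                        not any(g in fh["/"] for g in veto_groups))
        except (IOError, OSError):
            return False

    # claims_file("snap_042.0.hdf5", ["Header"], ["FOF", "Group", "Subhalo"])
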
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/gadget_fof/__init__.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/__init__.py
@@ -0,0 +1,15 @@
+"""
+API for GadgetFOF frontend.
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/gadget_fof/api.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/api.py
@@ -0,0 +1,26 @@
+"""
+API for GadgetFOF frontend
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2015, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+from .data_structures import \
+     GadgetFOFDataset
+
+from .io import \
+     IOHandlerGadgetFOFHDF5
+
+from .fields import \
+     GadgetFOFFieldInfo
+
+from . import tests

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/gadget_fof/data_structures.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/data_structures.py
@@ -0,0 +1,246 @@
+"""
+Data structures for GadgetFOF frontend.
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+from collections import defaultdict
+import h5py
+import numpy as np
+import stat
+import weakref
+import struct
+import glob
+import time
+import os
+
+from .fields import \
+    GadgetFOFFieldInfo
+
+from yt.utilities.cosmology import \
+    Cosmology
+from yt.utilities.definitions import \
+    mpc_conversion, sec_conversion
+from yt.utilities.exceptions import \
+    YTException
+from yt.utilities.logger import ytLogger as \
+    mylog
+from yt.geometry.particle_geometry_handler import \
+    ParticleIndex
+from yt.data_objects.static_output import \
+    Dataset, \
+    ParticleFile
+from yt.frontends.gadget.data_structures import \
+    _fix_unit_ordering
+import yt.utilities.fortran_utils as fpu
+from yt.units.yt_array import \
+    YTArray, \
+    YTQuantity
+
+class GadgetFOFParticleIndex(ParticleIndex):
+    def __init__(self, ds, dataset_type):
+        super(GadgetFOFParticleIndex, self).__init__(ds, dataset_type)
+
+    def _calculate_particle_index_starts(self):
+        # Halo indices are not saved in the file, so we must count by hand.
+        # File 0 has halos 0 to N_0 - 1, file 1 has halos N_0 to N_0 + N_1 - 1, etc.
+        particle_count = defaultdict(int)
+        offset_count = 0
+        for data_file in self.data_files:
+            data_file.index_start = dict([(ptype, particle_count[ptype]) for
+                                           ptype in data_file.total_particles])
+            data_file.offset_start = offset_count
+            for ptype in data_file.total_particles:
+                particle_count[ptype] += data_file.total_particles[ptype]
+            offset_count += data_file.total_offset
+
+    def _calculate_file_offset_map(self):
+        # After the FOF is performed, a load-balancing step redistributes halos
+        # and then writes more fields.  Here, for each file, we create a list of
+        # files which contain the rest of the redistributed particles.
+        ifof = np.array([data_file.total_particles["Group"]
+                         for data_file in self.data_files])
+        isub = np.array([data_file.total_offset
+                         for data_file in self.data_files])
+        subend = isub.cumsum()
+        fofend = ifof.cumsum()
+        istart = np.digitize(fofend - ifof, subend - isub) - 1
+        iend = np.clip(np.digitize(fofend, subend), 0, ifof.size - 2)
+        for i, data_file in enumerate(self.data_files):
+            data_file.offset_files = self.data_files[istart[i]: iend[i] + 1]
+
+    def _detect_output_fields(self):
+        # TODO: Add additional fields
+        dsl = []
+        units = {}
+        for dom in self.data_files:
+            fl, _units = self.io._identify_fields(dom)
+            units.update(_units)
+            dom._calculate_offsets(fl)
+            for f in fl:
+                if f not in dsl: dsl.append(f)
+        self.field_list = dsl
+        ds = self.dataset
+        ds.particle_types = tuple(set(pt for pt, ds in dsl))
+        # This is an attribute that means these particle types *actually*
+        # exist.  As in, they are real, in the dataset.
+        ds.field_units.update(units)
+        ds.particle_types_raw = ds.particle_types
+            
+    def _setup_geometry(self):
+        super(GadgetFOFParticleIndex, self)._setup_geometry()
+        self._calculate_particle_index_starts()
+        self._calculate_file_offset_map()
+    
+class GadgetFOFHDF5File(ParticleFile):
+    def __init__(self, ds, io, filename, file_id):
+        super(GadgetFOFHDF5File, self).__init__(ds, io, filename, file_id)
+        with h5py.File(filename, "r") as f:
+            self.header = dict((field, f.attrs[field]) \
+                               for field in f.attrs.keys())
+    
+class GadgetFOFDataset(Dataset):
+    _index_class = GadgetFOFParticleIndex
+    _file_class = GadgetFOFHDF5File
+    _field_info_class = GadgetFOFFieldInfo
+    _suffix = ".hdf5"
+
+    def __init__(self, filename, dataset_type="gadget_fof_hdf5",
+                 n_ref=16, over_refine_factor=1,
+                 unit_base=None, units_override=None):
+        self.n_ref = n_ref
+        self.over_refine_factor = over_refine_factor
+        if unit_base is not None and "UnitLength_in_cm" in unit_base:
+            # We assume this is comoving, because in the absence of comoving
+            # integration the redshift will be zero.
+            unit_base['cmcm'] = 1.0 / unit_base["UnitLength_in_cm"]
+        self._unit_base = unit_base
+        if units_override is not None:
+            raise RuntimeError("units_override is not supported for GadgetFOFDataset. "+
+                               "Use unit_base instead.")
+        super(GadgetFOFDataset, self).__init__(filename, dataset_type,
+                                                 units_override=units_override)
+
+    def _parse_parameter_file(self):
+        handle = h5py.File(self.parameter_filename, mode="r")
+        hvals = {}
+        hvals.update((str(k), v) for k, v in handle["/Header"].attrs.items())
+        hvals["NumFiles"] = hvals["NumFiles"]
+
+        self.dimensionality = 3
+        self.refine_by = 2
+        self.unique_identifier = \
+            int(os.stat(self.parameter_filename)[stat.ST_CTIME])
+
+        # Set standard values
+        self.domain_left_edge = np.zeros(3, "float64")
+        self.domain_right_edge = np.ones(3, "float64") * hvals["BoxSize"]
+        nz = 1 << self.over_refine_factor
+        self.domain_dimensions = np.ones(3, "int32") * nz
+        self.cosmological_simulation = 1
+        self.periodicity = (True, True, True)
+        self.current_redshift = hvals["Redshift"]
+        self.omega_lambda = hvals["OmegaLambda"]
+        self.omega_matter = hvals["Omega0"]
+        self.hubble_constant = hvals["HubbleParam"]
+
+        cosmology = Cosmology(hubble_constant=self.hubble_constant,
+                              omega_matter=self.omega_matter,
+                              omega_lambda=self.omega_lambda)
+        self.current_time = cosmology.t_from_z(self.current_redshift)
+
+        self.parameters = hvals
+        prefix = os.path.abspath(
+            os.path.join(os.path.dirname(self.parameter_filename), 
+                         os.path.basename(self.parameter_filename).split(".", 1)[0]))
+        
+        suffix = self.parameter_filename.rsplit(".", 1)[-1]
+        self.filename_template = "%s.%%(num)i.%s" % (prefix, suffix)
+        self.file_count = len(glob.glob(prefix + "*" + self._suffix))
+        if self.file_count == 0:
+            raise YTException(message="No data files found.", ds=self)
+        self.particle_types = ("Group", "Subhalo")
+        self.particle_types_raw = ("Group", "Subhalo")
+        
+        handle.close()
+
+    def _set_code_unit_attributes(self):
+        # Set a sane default for cosmological simulations.
+        if self._unit_base is None and self.cosmological_simulation == 1:
+            mylog.info("Assuming length units are in Mpc/h (comoving)")
+            self._unit_base = dict(length = (1.0, "Mpccm/h"))
+        # The other defaults we will use are taken from the standard
+        # Gadget defaults.
+        unit_base = self._unit_base or {}
+        
+        if "length" in unit_base:
+            length_unit = unit_base["length"]
+        elif "UnitLength_in_cm" in unit_base:
+            if self.cosmological_simulation == 0:
+                length_unit = (unit_base["UnitLength_in_cm"], "cm")
+            else:
+                length_unit = (unit_base["UnitLength_in_cm"], "cmcm/h")
+        else:
+            raise RuntimeError
+        length_unit = _fix_unit_ordering(length_unit)
+        self.length_unit = self.quan(length_unit[0], length_unit[1])
+        
+        if "velocity" in unit_base:
+            velocity_unit = unit_base["velocity"]
+        elif "UnitVelocity_in_cm_per_s" in unit_base:
+            velocity_unit = (unit_base["UnitVelocity_in_cm_per_s"], "cm/s")
+        else:
+            if self.cosmological_simulation == 0:
+                velocity_unit = (1e5, "cm/s")
+            else:
+                velocity_unit = (1e5, "cmcm/s")
+        velocity_unit = _fix_unit_ordering(velocity_unit)
+        self.velocity_unit = self.quan(velocity_unit[0], velocity_unit[1])
+
+        # We set hubble_constant = 1.0 for non-cosmology, so this is safe.
+        # Default to 1e10 Msun/h if mass is not specified.
+        if "mass" in unit_base:
+            mass_unit = unit_base["mass"]
+        elif "UnitMass_in_g" in unit_base:
+            if self.cosmological_simulation == 0:
+                mass_unit = (unit_base["UnitMass_in_g"], "g")
+            else:
+                mass_unit = (unit_base["UnitMass_in_g"], "g/h")
+        else:
+            # Sane default
+            mass_unit = (1.0, "1e10*Msun/h")
+        mass_unit = _fix_unit_ordering(mass_unit)
+        self.mass_unit = self.quan(mass_unit[0], mass_unit[1])
+
+        if "time" in unit_base:
+            time_unit = unit_base["time"]
+        elif "UnitTime_in_s" in unit_base:
+            time_unit = (unit_base["UnitTime_in_s"], "s")
+        else:
+            time_unit = (1., "s")        
+        self.time_unit = self.quan(time_unit[0], time_unit[1])
+
+    @classmethod
+    def _is_valid(self, *args, **kwargs):
+        need_groups = ['Group', 'Header', 'Subhalo']
+        veto_groups = ['FOF']
+        valid = True
+        try:
+            fh = h5py.File(args[0], mode='r')
+            valid = all(ng in fh["/"] for ng in need_groups) and \
+              not any(vg in fh["/"] for vg in veto_groups)
+            fh.close()
+        except Exception:
+            valid = False
+        return valid

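For orientation, a hedged usage sketch of the new frontend; the catalog path below is hypothetical, and yt.load reaches GadgetFOFDataset through the _is_valid check defined above:

    import yt

    ds = yt.load("gadget_fof_halos/groups_042/fof_subhalo_tab_042.0.hdf5")
    print(ds.particle_types_raw)            # ('Group', 'Subhalo')

    ad = ds.all_data()
    print(ad["Group", "particle_mass"].in_units("Msun/h"))
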
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/gadget_fof/fields.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/fields.py
@@ -0,0 +1,48 @@
+"""
+GadgetFOF-specific fields
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2015, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+from yt.funcs import mylog
+from yt.fields.field_info_container import \
+    FieldInfoContainer
+from yt.units.yt_array import \
+    YTArray
+
+m_units = "code_mass"
+p_units = "code_length"
+v_units = "code_velocity"
+
+class GadgetFOFFieldInfo(FieldInfoContainer):
+    known_other_fields = (
+    )
+
+    known_particle_fields = (
+        ("GroupPos_0", (p_units, ["Group", "particle_position_x"], None)),
+        ("GroupPos_1", (p_units, ["Group", "particle_position_y"], None)),
+        ("GroupPos_2", (p_units, ["Group", "particle_position_z"], None)),
+        ("GroupVel_0", (v_units, ["Group", "particle_velocity_x"], None)),
+        ("GroupVel_1", (v_units, ["Group", "particle_velocity_y"], None)),
+        ("GroupVel_2", (v_units, ["Group", "particle_velocity_z"], None)),
+        ("GroupMass",  (m_units, ["Group", "particle_mass"], None)),
+        ("GroupLen",   ("",      ["Group", "particle_number"], None)),
+        ("SubhaloPos_0", (p_units, ["Subhalo", "particle_position_x"], None)),
+        ("SubhaloPos_1", (p_units, ["Subhalo", "particle_position_y"], None)),
+        ("SubhaloPos_2", (p_units, ["Subhalo", "particle_position_z"], None)),
+        ("SubhaloVel_0", (v_units, ["Subhalo", "particle_velocity_x"], None)),
+        ("SubhaloVel_1", (v_units, ["Subhalo", "particle_velocity_y"], None)),
+        ("SubhaloVel_2", (v_units, ["Subhalo", "particle_velocity_z"], None)),
+        ("SubhaloMass",  (m_units, ["Subhalo", "particle_mass"], None)),
+        ("SubhaloLen",   ("",      ["Subhalo", "particle_number"], None)),
+)

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/gadget_fof/io.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/io.py
@@ -0,0 +1,207 @@
+"""
+GadgetFOF data-file handling functions
+
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+import h5py
+import numpy as np
+
+from yt.utilities.exceptions import *
+from yt.funcs import mylog
+
+from yt.utilities.io_handler import \
+    BaseIOHandler
+
+from yt.utilities.lib.geometry_utils import compute_morton
+
+class IOHandlerGadgetFOFHDF5(BaseIOHandler):
+    _dataset_type = "gadget_fof_hdf5"
+
+    def __init__(self, ds):
+        super(IOHandlerGadgetFOFHDF5, self).__init__(ds)
+        self.offset_fields = set([])
+
+    def _read_fluid_selection(self, chunks, selector, fields, size):
+        raise NotImplementedError
+
+    def _read_particle_coords(self, chunks, ptf):
+        # This will read chunks and yield the results.
+        chunks = list(chunks)
+        data_files = set([])
+        for chunk in chunks:
+            for obj in chunk.objs:
+                data_files.update(obj.data_files)
+        for data_file in sorted(data_files):
+            with h5py.File(data_file.filename, "r") as f:
+                for ptype, field_list in sorted(ptf.items()):
+                    pcount = data_file.total_particles[ptype]
+                    coords = f[ptype]["%sPos" % ptype].value.astype("float64")
+                    coords = np.resize(coords, (pcount, 3))
+                    x = coords[:, 0]
+                    y = coords[:, 1]
+                    z = coords[:, 2]
+                    yield ptype, (x, y, z)
+
+    def _read_offset_particle_field(self, field, data_file, fh):
+        field_data = np.empty(data_file.total_particles["Group"], dtype="float64")
+        fofindex = np.arange(data_file.total_particles["Group"]) + data_file.index_start["Group"]
+        for offset_file in data_file.offset_files:
+            if fh.filename == offset_file.filename:
+                ofh = fh
+            else:
+                ofh = h5py.File(offset_file.filename, "r")
+            subindex = np.arange(offset_file.total_offset) + offset_file.offset_start
+            substart = max(fofindex[0] - subindex[0], 0)
+            subend = min(fofindex[-1] - subindex[0], subindex.size - 1)
+            fofstart = substart + subindex[0] - fofindex[0]
+            fofend = subend + subindex[0] - fofindex[0]
+            field_data[fofstart:fofend + 1] = ofh["Subhalo"][field][substart:subend + 1]
+        return field_data
+                    
+    def _read_particle_fields(self, chunks, ptf, selector):
+        # Now we have all the sizes, and we can allocate
+        chunks = list(chunks)
+        data_files = set([])
+        for chunk in chunks:
+            for obj in chunk.objs:
+                data_files.update(obj.data_files)
+        for data_file in sorted(data_files):
+            with h5py.File(data_file.filename, "r") as f:
+                for ptype, field_list in sorted(ptf.items()):
+                    pcount = data_file.total_particles[ptype]
+                    if pcount == 0: continue
+                    coords = f[ptype]["%sPos" % ptype].value.astype("float64")
+                    coords = np.resize(coords, (pcount, 3))
+                    x = coords[:, 0]
+                    y = coords[:, 1]
+                    z = coords[:, 2]
+                    mask = selector.select_points(x, y, z, 0.0)
+                    del x, y, z
+                    if mask is None: continue
+                    for field in field_list:
+                        if field in self.offset_fields:
+                            field_data = \
+                              self._read_offset_particle_field(field, data_file, f)
+                        else:
+                            if field == "particle_identifier":
+                                field_data = \
+                                  np.arange(data_file.total_particles[ptype]) + \
+                                  data_file.index_start[ptype]
+                            elif field in f[ptype]:
+                                field_data = f[ptype][field].value.astype("float64")
+                            else:
+                                fname = field[:field.rfind("_")]
+                                field_data = f[ptype][fname].value.astype("float64")
+                                my_div = field_data.size // pcount  # floor division keeps my_div an int on py3
+                                if my_div > 1:
+                                    field_data = np.resize(field_data, (pcount, my_div))
+                                    findex = int(field[field.rfind("_") + 1:])
+                                    field_data = field_data[:, findex]
+                        data = field_data[mask]
+                        yield (ptype, field), data
+
+    def _initialize_index(self, data_file, regions):
+        pcount = sum(data_file.total_particles.values())
+        morton = np.empty(pcount, dtype='uint64')
+        if pcount == 0: return morton
+        mylog.debug("Initializing index % 5i (% 7i particles)",
+                    data_file.file_id, pcount)
+        ind = 0
+        with h5py.File(data_file.filename, "r") as f:
+            if not f.keys(): return None
+            dx = np.finfo(f["Group"]["GroupPos"].dtype).eps
+            dx = 2.0*self.ds.quan(dx, "code_length")
+
+            for ptype in data_file.ds.particle_types_raw:
+                if data_file.total_particles[ptype] == 0: continue
+                pos = f[ptype]["%sPos" % ptype].value.astype("float64")
+                pos = np.resize(pos, (data_file.total_particles[ptype], 3))
+                pos = data_file.ds.arr(pos, "code_length")
+                
+                # These are 32-bit numbers, so we give a little leeway.
+                # Otherwise, for big sets of particles, we will often bump into
+                # the domain edges.  This helps alleviate that.
+                np.clip(pos, self.ds.domain_left_edge + dx,
+                             self.ds.domain_right_edge - dx, pos)
+                if np.any(pos.min(axis=0) < self.ds.domain_left_edge) or \
+                   np.any(pos.max(axis=0) > self.ds.domain_right_edge):
+                    raise YTDomainOverflow(pos.min(axis=0),
+                                           pos.max(axis=0),
+                                           self.ds.domain_left_edge,
+                                           self.ds.domain_right_edge)
+                regions.add_data_file(pos, data_file.file_id)
+                morton[ind:ind+pos.shape[0]] = compute_morton(
+                    pos[:,0], pos[:,1], pos[:,2],
+                    data_file.ds.domain_left_edge,
+                    data_file.ds.domain_right_edge)
+                ind += pos.shape[0]
+        return morton
+
+    def _count_particles(self, data_file):
+        with h5py.File(data_file.filename, "r") as f:
+            pcount = {"Group": f["Header"].attrs["Ngroups_ThisFile"],
+                      "Subhalo": f["Header"].attrs["Nsubgroups_ThisFile"]}
+            data_file.total_offset = 0 # need to figure out how subfind works here
+            return pcount
+
+    def _identify_fields(self, data_file):
+        fields = []
+        pcount = data_file.total_particles
+        if sum(pcount.values()) == 0: return fields, {}
+        with h5py.File(data_file.filename, "r") as f:
+            for ptype in self.ds.particle_types_raw:
+                if data_file.total_particles[ptype] == 0: continue
+                fields.append((ptype, "particle_identifier"))
+                my_fields, my_offset_fields = \
+                  subfind_field_list(f[ptype], ptype, data_file.total_particles)
+                fields.extend(my_fields)
+                self.offset_fields = self.offset_fields.union(set(my_offset_fields))
+        return fields, {}
+
+def subfind_field_list(fh, ptype, pcount):
+    fields = []
+    offset_fields = []
+    for field in fh.keys():
+        if isinstance(fh[field], h5py.Group):
+            my_fields, my_offset_fields = \
+              subfind_field_list(fh[field], ptype, pcount)
+            fields.extend(my_fields)
+            offset_fields.extend(my_offset_fields)  # accumulate nested offset fields
+        else:
+            if not fh[field].size % pcount[ptype]:
+                my_div = fh[field].size // pcount[ptype]
+                fname = fh[field].name[fh[field].name.find(ptype) + len(ptype) + 1:]
+                if my_div > 1:
+                    for i in range(my_div):
+                        fields.append((ptype, "%s_%d" % (fname, i)))
+                else:
+                    fields.append((ptype, fname))
+            elif ptype == "Subfind" and \
+              not fh[field].size % fh["/Subfind"].attrs["Number_of_groups"]:
+                # These are actually Group fields, but they were written after 
+                # a load balancing step moved halos around and thus they do not
+                # correspond to the halos stored in the Group group.
+                my_div = fh[field].size // fh["/Subfind"].attrs["Number_of_groups"]
+                fname = fh[field].name[fh[field].name.find(ptype) + len(ptype) + 1:]
+                if my_div > 1:
+                    for i in range(my_div):
+                        fields.append(("Group", "%s_%d" % (fname, i)))
+                else:
+                    fields.append(("Group", fname))
+                offset_fields.append(fname)
+            else:
+                mylog.warn("Cannot add field (%s, %s) with size %d." % \
+                           (ptype, fh[field].name, fh[field].size))
+                continue
+    return fields, offset_fields

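The _identify_fields/subfind_field_list pair above exposes each on-disk N-by-k dataset as scalar component fields suffixed _0, _1, ..., and _read_particle_fields slices them back out by the same rule. The convention in miniature:

    import numpy as np

    pcount = 4
    raw = np.arange(pcount * 3, dtype="float64")   # stand-in for GroupPos

    my_div = raw.size // pcount                    # 3 components per halo
    data = np.resize(raw, (pcount, my_div))
    fields = dict(("GroupPos_%d" % i, data[:, i]) for i in range(my_div))
    print(sorted(fields))     # ['GroupPos_0', 'GroupPos_1', 'GroupPos_2']
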
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/gadget_fof/setup.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/setup.py
@@ -0,0 +1,13 @@
+#!/usr/bin/env python
+import setuptools
+import os
+import sys
+import os.path
+
+
+def configuration(parent_package='', top_path=None):
+    from numpy.distutils.misc_util import Configuration
+    config = Configuration('gadget_fof', parent_package, top_path)
+    config.make_config_py()  # installs __config__.py
+    #config.make_svn_version_py()
+    return config

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/gadget_fof/tests/test_outputs.py
--- /dev/null
+++ b/yt/frontends/gadget_fof/tests/test_outputs.py
@@ -0,0 +1,56 @@
+"""
+GadgetFOF frontend tests using gadget_fof datasets
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+import os.path
+from yt.testing import \
+    assert_equal
+from yt.utilities.answer_testing.framework import \
+    FieldValuesTest, \
+    requires_ds, \
+    requires_file, \
+    data_dir_load
+from yt.frontends.gadget_fof.api import GadgetFOFDataset
+
+p_types  = ("Group", "Subhalo")
+p_fields = ("particle_position_x", "particle_position_y",
+            "particle_position_z", "particle_velocity_x",
+            "particle_velocity_y", "particle_velocity_z",
+            "particle_mass", "particle_identifier")
+_fields = tuple([(p_type, p_field) for p_type in p_types
+                                   for p_field in p_fields])
+
+# a dataset with empty files
+g5 = "gadget_fof_halos/groups_005/fof_subhalo_tab_005.0.hdf5"
+g42 = "gadget_fof_halos/groups_042/fof_subhalo_tab_042.0.hdf5"
+
+
+@requires_ds(g5)
+def test_fields_g5():
+    ds = data_dir_load(g5)
+    yield assert_equal, str(ds), os.path.basename(g5)
+    for field in _fields:
+        yield FieldValuesTest(g5, field, particle_type=True)
+
+
+@requires_ds(g42)
+def test_fields_g42():
+    ds = data_dir_load(g42)
+    yield assert_equal, str(ds), os.path.basename(g42)
+    for field in _fields:
+        yield FieldValuesTest(g42, field, particle_type=True)
+
+@requires_file(g42)
+def test_GadgetFOFDataset():
+    assert isinstance(data_dir_load(g42), GadgetFOFDataset)

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/halo_catalog/io.py
--- a/yt/frontends/halo_catalog/io.py
+++ b/yt/frontends/halo_catalog/io.py
@@ -39,7 +39,7 @@
         data_files = set([])
         # Only support halo reading for now.
         assert(len(ptf) == 1)
-        assert(ptf.keys()[0] == "halos")
+        assert(list(ptf.keys())[0] == "halos")
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
@@ -57,7 +57,7 @@
         data_files = set([])
         # Only support halo reading for now.
         assert(len(ptf) == 1)
-        assert(ptf.keys()[0] == "halos")
+        assert(list(ptf.keys())[0] == "halos")
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/owls/io.py
--- a/yt/frontends/owls/io.py
+++ b/yt/frontends/owls/io.py
@@ -70,7 +70,7 @@
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
-        for data_file in sorted(data_files):
+        for data_file in sorted(data_files, key=lambda x: x.filename):
             f = _get_h5_handle(data_file.filename)
             # This double-reads
             for ptype, field_list in sorted(ptf.items()):
@@ -88,7 +88,7 @@
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
-        for data_file in sorted(data_files):
+        for data_file in sorted(data_files, key=lambda x: x.filename):
             f = _get_h5_handle(data_file.filename)
             for ptype, field_list in sorted(ptf.items()):
                 if data_file.total_particles[ptype] == 0:

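The explicit sort key above matters on Python 3, where objects without comparison methods cannot be ordered at all, and set iteration order is arbitrary in any case; keying on the filename restores a deterministic read order. (The rockstar hunk further down does the same with operator.attrgetter, which is equivalent.) The failure in isolation:

    class DataFile(object):
        def __init__(self, filename):
            self.filename = filename

    files = {DataFile("snap.%d.hdf5" % i) for i in (2, 0, 1)}
    # sorted(files)               # TypeError on Python 3: '<' not supported
    ordered = sorted(files, key=lambda f: f.filename)
    print([f.filename for f in ordered])
    # ['snap.0.hdf5', 'snap.1.hdf5', 'snap.2.hdf5']
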
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/owls_subfind/data_structures.py
--- a/yt/frontends/owls_subfind/data_structures.py
+++ b/yt/frontends/owls_subfind/data_structures.py
@@ -27,11 +27,14 @@
 from .fields import \
     OWLSSubfindFieldInfo
 
-from yt.utilities.cosmology import Cosmology
+from yt.utilities.cosmology import \
+    Cosmology
 from yt.utilities.definitions import \
     mpc_conversion, sec_conversion
 from yt.utilities.exceptions import \
-     YTException
+    YTException
+from yt.utilities.logger import ytLogger as \
+     mylog
 from yt.geometry.particle_geometry_handler import \
     ParticleIndex
 from yt.data_objects.static_output import \
@@ -170,6 +173,7 @@
         # The other defaults we will use are taken from the standard
         # Gadget defaults.
         unit_base = self._unit_base or {}
+
         if "length" in unit_base:
             length_unit = unit_base["length"]
         elif "UnitLength_in_cm" in unit_base:
@@ -182,7 +186,6 @@
         length_unit = _fix_unit_ordering(length_unit)
         self.length_unit = self.quan(length_unit[0], length_unit[1])
 
-        unit_base = self._unit_base or {}
         if "velocity" in unit_base:
             velocity_unit = unit_base["velocity"]
         elif "UnitVelocity_in_cm_per_s" in unit_base:
@@ -191,6 +194,7 @@
             velocity_unit = (1e5, "cm/s")
         velocity_unit = _fix_unit_ordering(velocity_unit)
         self.velocity_unit = self.quan(velocity_unit[0], velocity_unit[1])
+
         # We set hubble_constant = 1.0 for non-cosmology, so this is safe.
         # Default to 1e10 Msun/h if mass is not specified.
         if "mass" in unit_base:
@@ -205,7 +209,14 @@
             mass_unit = (1.0, "1e10*Msun/h")
         mass_unit = _fix_unit_ordering(mass_unit)
         self.mass_unit = self.quan(mass_unit[0], mass_unit[1])
-        self.time_unit = self.quan(unit_base["UnitTime_in_s"], "s")
+
+        if "time" in unit_base:
+            time_unit = unit_base["time"]
+        elif "UnitTime_in_s" in unit_base:
+            time_unit = (unit_base["UnitTime_in_s"], "s")
+        else:
+            time_unit = (1., "s")        
+        self.time_unit = self.quan(time_unit[0], time_unit[1])
 
     @classmethod
     def _is_valid(self, *args, **kwargs):

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/ramses/data_structures.py
--- a/yt/frontends/ramses/data_structures.py
+++ b/yt/frontends/ramses/data_structures.py
@@ -491,13 +491,22 @@
         """
         Generates the conversion to various physical _units based on the parameter file
         """
-        # Note that unit_l *already* converts to proper!
-        # Also note that unit_l must be multiplied by the boxlen parameter to
-        # ensure we are correctly set up for the current domain.
-        length_unit = self.parameters['unit_l']
-        boxlen = self.parameters['boxlen']
-        density_unit = self.parameters['unit_d']
-        mass_unit = density_unit * (length_unit * boxlen)**3
+        # Please note that for all units given in the info file, the boxlen
+        # still needs to be folded in, as shown below!
+
+        boxlen = self.parameters['boxlen']
+        length_unit = self.parameters['unit_l'] * boxlen
+        density_unit = self.parameters['unit_d'] / boxlen**3
+
+        # In the mass unit, the factors of boxlen cancel back out, so this
+        # is equivalent to unit_d * unit_l**3.
+
+        mass_unit = density_unit * length_unit**3
+
+        # Cosmological runs are done in lookback conformal time.
+        # To convert to proper time, the time unit is calculated from
+        # the expansion factor.  This is not yet done here!
+
         time_unit = self.parameters['unit_t']
         magnetic_unit = np.sqrt(4*np.pi * mass_unit /
                                 (time_unit**2 * length_unit))

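A quick numerical check of the boxlen bookkeeping in the hunk above: folding boxlen into the length unit and dividing it out of the density unit cancels in the mass unit, leaving unit_d * unit_l**3 exactly as the new comments claim. The values below are made up:

    unit_l, unit_d, boxlen = 3.0857e21, 1.0e-24, 2.0   # arbitrary test values

    length_unit = unit_l * boxlen
    density_unit = unit_d / boxlen**3
    mass_unit = density_unit * length_unit**3          # boxlen**3 cancels

    assert abs(mass_unit - unit_d * unit_l**3) <= 1e-9 * mass_unit
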
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/ramses/io.py
--- a/yt/frontends/ramses/io.py
+++ b/yt/frontends/ramses/io.py
@@ -20,7 +20,12 @@
     BaseIOHandler
 from yt.utilities.logger import ytLogger as mylog
 import yt.utilities.fortran_utils as fpu
-from yt.extern.six.moves import cStringIO
+from yt.extern.six import PY3
+
+if PY3:
+    from io import BytesIO as IO
+else:
+    from cStringIO import StringIO as IO
 
 class IOHandlerRAMSES(BaseIOHandler):
     _dataset_type = "ramses"
@@ -37,7 +42,7 @@
                 f = open(subset.domain.hydro_fn, "rb")
                 # This contains the boundary information, so we skim through
                 # and pick off the right vectors
-                content = cStringIO(f.read())
+                content = IO(f.read())
                 rv = subset.fill(content, fields, selector)
                 for ft, f in fields:
                     d = rv.pop(f)

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/rockstar/io.py
--- a/yt/frontends/rockstar/io.py
+++ b/yt/frontends/rockstar/io.py
@@ -28,6 +28,7 @@
 from yt.utilities.lib.geometry_utils import compute_morton
 
 from yt.geometry.oct_container import _ORDER_MAX
+from operator import attrgetter
 
 class IOHandlerRockstarBinary(BaseIOHandler):
     _dataset_type = "rockstar_binary"
@@ -45,12 +46,11 @@
         data_files = set([])
         # Only support halo reading for now.
         assert(len(ptf) == 1)
-        assert(ptf.keys()[0] == "halos")
+        assert(list(ptf.keys())[0] == "halos")
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
-        
-        for data_file in sorted(data_files):
+        for data_file in sorted(data_files, key=attrgetter("filename")):
             pcount = data_file.header['num_halos']
             with open(data_file.filename, "rb") as f:
                 f.seek(data_file._position_offset, os.SEEK_SET)
@@ -66,11 +66,11 @@
         data_files = set([])
         # Only support halo reading for now.
         assert(len(ptf) == 1)
-        assert(ptf.keys()[0] == "halos")
+        assert(list(ptf.keys())[0] == "halos")
         for chunk in chunks:
             for obj in chunk.objs:
                 data_files.update(obj.data_files)
-        for data_file in sorted(data_files):
+        for data_file in sorted(data_files, key=attrgetter("filename")):
             pcount = data_file.header['num_halos']
             with open(data_file.filename, "rb") as f:
                 for ptype, field_list in sorted(ptf.items()):

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/setup.py
--- a/yt/frontends/setup.py
+++ b/yt/frontends/setup.py
@@ -17,6 +17,7 @@
     config.add_subpackage("fits")
     config.add_subpackage("flash")
     config.add_subpackage("gadget")
+    config.add_subpackage("gadget_fof")
     config.add_subpackage("gdf")
     config.add_subpackage("halo_catalog")
     config.add_subpackage("http_stream")
@@ -34,11 +35,12 @@
     config.add_subpackage("athena/tests")
     config.add_subpackage("boxlib/tests")
     config.add_subpackage("chombo/tests")
+    config.add_subpackage("eagle/tests")
     config.add_subpackage("enzo/tests")
-    config.add_subpackage("eagle/tests")
     config.add_subpackage("fits/tests")
     config.add_subpackage("flash/tests")
     config.add_subpackage("gadget/tests")
+    config.add_subpackage("gadget_fof/tests")
     config.add_subpackage("moab/tests")
     config.add_subpackage("owls/tests")
     config.add_subpackage("owls_subfind/tests")

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/frontends/tipsy/fields.py
--- a/yt/frontends/tipsy/fields.py
+++ b/yt/frontends/tipsy/fields.py
@@ -16,6 +16,8 @@
 #-----------------------------------------------------------------------------
 
 from yt.frontends.sph.fields import SPHFieldInfo
+from yt.fields.particle_fields import add_volume_weighted_smoothed_field, add_nearest_neighbor_field
+from yt.utilities.physical_constants import mp, kb
 
 class TipsyFieldInfo(SPHFieldInfo):
     aux_particle_fields = {
@@ -44,3 +46,29 @@
                 self.aux_particle_fields[field[1]] not in self.known_particle_fields:
                 self.known_particle_fields += (self.aux_particle_fields[field[1]],)
         super(TipsyFieldInfo,self).__init__(ds, field_list, slice_info)
+
+    def setup_particle_fields(self, ptype, *args, **kwargs):
+
+        # setup some special fields that only make sense for SPH particles
+
+        if ptype in ("PartType0", "Gas"):
+            self.setup_gas_particle_fields(ptype)
+
+        super(TipsyFieldInfo, self).setup_particle_fields(
+            ptype, *args, **kwargs)
+
+
+    def setup_gas_particle_fields(self, ptype):
+
+        def _smoothing_length(field, data):
+            # For now, we hardcode num_neighbors.  We should make this configurable
+            # in the future.
+            num_neighbors = 64
+            fn, = add_nearest_neighbor_field(ptype, "particle_position", self, num_neighbors)
+            return data[ptype, 'nearest_neighbor_distance_%d' % num_neighbors]
+
+        self.add_field(
+            (ptype, "smoothing_length"),
+            function=_smoothing_length,
+            particle_type=True,
+            units="code_length")

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/utilities/fits_image.py
--- a/yt/utilities/fits_image.py
+++ b/yt/utilities/fits_image.py
@@ -122,7 +122,7 @@
         for key in fields:
             if key not in exclude_fields:
                 if hasattr(img_data[key], "units"):
-                    self.field_units[key] = str(img_data[key].units)
+                    self.field_units[key] = img_data[key].units
                 else:
                     self.field_units[key] = "dimensionless"
                 mylog.info("Making a FITS image of field %s" % key)

diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/visualization/color_maps.py
--- a/yt/visualization/color_maps.py
+++ b/yt/visualization/color_maps.py
@@ -11,7 +11,6 @@
 # The full license is in the file COPYING.txt, distributed with this software.
 #-----------------------------------------------------------------------------
 import numpy as np
-from yt.extern.six.moves import zip as izip
 
 import matplotlib
 import matplotlib.colors as cc
@@ -86,9 +85,9 @@
                 194.5*_vs**2.88+99.72*np.exp(-77.24*(_vs-0.742)**2.0)
               + 45.40*_vs**0.089+10.0)/255.0
 
-cdict = {'red':zip(_vs,_kamae_red,_kamae_red),
-         'green':zip(_vs,_kamae_grn,_kamae_grn),
-         'blue':zip(_vs,_kamae_blu,_kamae_blu)}
+cdict = {'red':np.transpose([_vs,_kamae_red,_kamae_red]),
+         'green':np.transpose([_vs,_kamae_grn,_kamae_grn]),
+         'blue':np.transpose([_vs,_kamae_blu,_kamae_blu])}
 add_cmap('kamae', cdict)
 
 # This one is a simple black & green map
@@ -151,9 +150,9 @@
 _vs = np.linspace(0,1,256)
 for k,v in list(_cm.color_map_luts.items()):
     if k not in yt_colormaps and k not in mcm.cmap_d:
-        cdict = { 'red': zip(_vs,v[0],v[0]),
-                  'green': zip(_vs,v[1],v[1]),
-                  'blue': zip(_vs,v[2],v[2]) }
+        cdict = { 'red': np.transpose([_vs,v[0],v[0]]),
+                  'green': np.transpose([_vs,v[1],v[1]]),
+                  'blue': np.transpose([_vs,v[2],v[2]]) }
         add_cmap(k, cdict)
 
 def _extract_lookup_table(cmap_name):
@@ -393,9 +392,9 @@
     #   Second number is the (0..1) number to interpolate to when coming *from below*
     #   Third number is the (0..1) number to interpolate to when coming *from above*
     _vs = np.linspace(0,1,256)
-    cdict = {'red':   zip(_vs, cmap[:,0], cmap[:,0]),
-             'green': zip(_vs, cmap[:,1], cmap[:,1]),
-             'blue':  zip(_vs, cmap[:,2], cmap[:,2])}
+    cdict = {'red':   np.transpose([_vs, cmap[:,0], cmap[:,0]]),
+             'green': np.transpose([_vs, cmap[:,1], cmap[:,1]]),
+             'blue':  np.transpose([_vs, cmap[:,2], cmap[:,2]])}
 
     if name is not None:
         add_cmap(name, cdict)

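The color map hunks above all fix the same thing: zip() returns a one-shot iterator on Python 3, and matplotlib's colormap machinery needs an indexable, re-iterable sequence of (x, y0, y1) rows. An (N, 3) array serves on both interpreters:

    import numpy as np

    _vs = np.linspace(0, 1, 4)
    chan = _vs ** 2
    rows = np.transpose([_vs, chan, chan])   # shape (4, 3): (x, y0, y1) rows
    print(rows.shape)
    # list(zip(_vs, chan, chan)) would also work, but a bare zip() object
    # is exhausted after one pass and would leave the colormap empty.
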
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/visualization/fixed_resolution.py
--- a/yt/visualization/fixed_resolution.py
+++ b/yt/visualization/fixed_resolution.py
@@ -27,7 +27,6 @@
 import numpy as np
 import weakref
 import re
-import string
 
 class FixedResolutionBuffer(object):
     r"""
@@ -178,13 +177,13 @@
             pstr = m.string[m.start()+1:m.end()-1]
             segments = fname.split("_")
             for i,s in enumerate(segments):
-                segments[i] = string.capitalize(s)
+                segments[i] = s.capitalize()
                 if s == pstr:
                     ipstr = i
             element = segments[ipstr-1]
             roman = pnum2rom[pstr[1:]]
             label = element + '\ ' + roman + '\ ' + \
-                string.join(segments[ipstr+1:], '\ ')
+                '\ '.join(segments[ipstr+1:])
         else:
             label = fname
         return label

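The string-module functions dropped above were removed outright in Python 3; the equivalent str methods exist on both interpreters:

    segments = "h_p1_number_density".split("_")
    segments = [s.capitalize() for s in segments]   # was string.capitalize(s)
    label = r"\ ".join(segments[2:])                # was string.join(..., '\ ')
    print(label)                                    # Number\ Density
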
diff -r f1f968e3cd056f51c5d2703ca33829ebb0b62acd -r a4f19a0f6144c12944b0cbfd351e887309d185a0 yt/visualization/image_writer.py
--- a/yt/visualization/image_writer.py
+++ b/yt/visualization/image_writer.py
@@ -170,7 +170,7 @@
         bitmap_array = np.concatenate([bitmap_array.astype('uint8'),
                                        alpha_channel], axis=-1)
     if transpose:
-        bitmap_array = bitmap_array.swapaxes(0,1)
+        bitmap_array = bitmap_array.swapaxes(0,1).copy(order="C")
     if filename is not None:
         pw.write_png(bitmap_array, filename)
     else:

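The .copy(order="C") added above reflects that swapaxes returns a strided view, while the PNG writer expects a C-contiguous buffer:

    import numpy as np

    bitmap = np.zeros((4, 8, 4), dtype="uint8")
    view = bitmap.swapaxes(0, 1)
    print(view.flags["C_CONTIGUOUS"])        # False: just a strided view
    contig = view.copy(order="C")
    print(contig.flags["C_CONTIGUOUS"])      # True: safe to write out
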
This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt/commits/bc88878b5537/
Changeset:   bc88878b5537
Branch:      bugfix-1035
User:        BW Keller
Date:        2015-07-17 17:23:36+00:00
Summary:     Leave those changes for another PR
Affected #:  1 file

diff -r a4f19a0f6144c12944b0cbfd351e887309d185a0 -r bc88878b553718fc1a40772a913408871379e680 yt/frontends/tipsy/fields.py
--- a/yt/frontends/tipsy/fields.py
+++ b/yt/frontends/tipsy/fields.py
@@ -16,8 +16,6 @@
 #-----------------------------------------------------------------------------
 
 from yt.frontends.sph.fields import SPHFieldInfo
-from yt.fields.particle_fields import add_volume_weighted_smoothed_field, add_nearest_neighbor_field
-from yt.utilities.physical_constants import mp, kb
 
 class TipsyFieldInfo(SPHFieldInfo):
     aux_particle_fields = {
@@ -46,29 +44,3 @@
                 self.aux_particle_fields[field[1]] not in self.known_particle_fields:
                 self.known_particle_fields += (self.aux_particle_fields[field[1]],)
         super(TipsyFieldInfo,self).__init__(ds, field_list, slice_info)
-
-    def setup_particle_fields(self, ptype, *args, **kwargs):
-
-        # setup some special fields that only make sense for SPH particles
-
-        if ptype in ("PartType0", "Gas"):
-            self.setup_gas_particle_fields(ptype)
-
-        super(TipsyFieldInfo, self).setup_particle_fields(
-            ptype, *args, **kwargs)
-
-
-    def setup_gas_particle_fields(self, ptype):
-
-        def _smoothing_length(field, data):
-            # For now, we hardcode num_neighbors.  We should make this configurable
-            # in the future.
-            num_neighbors = 64
-            fn, = add_nearest_neighbor_field(ptype, "particle_position", self, num_neighbors)
-            return data[ptype, 'nearest_neighbor_distance_%d' % num_neighbors]
-
-        self.add_field(
-            (ptype, "smoothing_length"),
-            function=_smoothing_length,
-            particle_type=True,
-            units="code_length")


https://bitbucket.org/yt_analysis/yt/commits/7427f58f1825/
Changeset:   7427f58f1825
Branch:      yt
User:        jzuhone
Date:        2015-07-18 13:51:32+00:00
Summary:     Merged in bwkeller/yt/bugfix-1035 (pull request #1641)

Bugfix for 1035
Affected #:  2 files

diff -r eef682fe125dc8f36c86c89453fc7ab6481c994d -r 7427f58f1825c352d68365489c615c7a25c59c5f yt/frontends/tipsy/data_structures.py
--- a/yt/frontends/tipsy/data_structures.py
+++ b/yt/frontends/tipsy/data_structures.py
@@ -180,7 +180,7 @@
                 self.domain_left_edge = None
                 self.domain_right_edge = None
         else: 
-            bbox = self.arr(self.bounding_box, 'code_length', dtype="float64")
+            bbox = np.array(self.bounding_box, dtype="float64")
             if bbox.shape == (2, 3):
                 bbox = bbox.transpose()
             self.domain_left_edge = bbox[:,0]

Repository URL: https://bitbucket.org/yt_analysis/yt/
