[yt-svn] commit/yt: 5 new changesets

commits-noreply@bitbucket.org
Mon Jul 28 16:22:51 PDT 2014


5 new commits in yt:

https://bitbucket.org/yt_analysis/yt/commits/080a6a7e0e74/
Changeset:   080a6a7e0e74
Branch:      yt-3.0
User:        pshriwise
Date:        2014-03-19 21:47:31
Summary:     Converted lines coming out of triangle_plane_intersect to match plot coords.
Affected #:  1 file

diff -r d6c3a32952d89450b2b2c8cba9352cc665ea2192 -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 yt/visualization/plot_modifications.py
--- a/yt/visualization/plot_modifications.py
+++ b/yt/visualization/plot_modifications.py
@@ -1356,6 +1356,13 @@
         plot._axes.hold(True)
         xax, yax = x_dict[plot.data.axis], y_dict[plot.data.axis]
         l_cy = triangle_plane_intersect(plot.data.axis, plot.data.coord, self.vertices)[:,:,(xax, yax)]
+        # Convert numpy array to a YT array
+        l_cy = [YTArray(line, input_units="code_length") for line in l_cy]
+        # Convert points individually
+        for line in l_cy:
+            line[0] = self.convert_to_plot(plot,line[0])
+            line[1] = self.convert_to_plot(plot,line[1])
+        # create the line collection using the new points
         lc = matplotlib.collections.LineCollection(l_cy, **self.plot_args)
         plot._axes.add_collection(lc)
         plot._axes.hold(False)
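
For orientation, here is a hedged sketch of how this code path might be exercised from the user side; the `annotate_triangle_facets` entry point and the dataset/vertex values are assumptions for illustration, not part of the commit:

    import numpy as np
    from yt.mods import load, SlicePlot

    # Hypothetical dataset; the callback only needs an (n_triangles, 3, 3)
    # array of vertex coordinates in code_length units.
    ds = load("IsolatedGalaxy/galaxy0030/galaxy0030")
    vertices = np.random.random((100, 3, 3))

    slc = SlicePlot(ds, "z", "density")
    # With this changeset, each triangle/plane intersection segment is wrapped
    # in a YTArray of code_length and passed through convert_to_plot() before
    # the matplotlib LineCollection is built.
    slc.annotate_triangle_facets(vertices, plot_args={"colors": "black"})
    slc.save()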


https://bitbucket.org/yt_analysis/yt/commits/15c6f41cb462/
Changeset:   15c6f41cb462
Branch:      yt-3.0
User:        MatthewTurk
Date:        2014-04-02 21:25:40
Summary:     Merging from experimental branch
Affected #:  327 files

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -36,6 +36,7 @@
 yt/utilities/lib/mesh_utilities.c
 yt/utilities/lib/misc_utilities.c
 yt/utilities/lib/Octree.c
+yt/utilities/lib/origami.c
 yt/utilities/lib/png_writer.c
 yt/utilities/lib/PointsInVolume.c
 yt/utilities/lib/QuadTree.c

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e CITATION
--- a/CITATION
+++ b/CITATION
@@ -29,3 +29,28 @@
    adsurl = {http://adsabs.harvard.edu/abs/2011ApJS..192....9T},
   adsnote = {Provided by the SAO/NASA Astrophysics Data System}
 }
+
+yt can also make use of other functionality.  If you use ORIGAMI, we ask
+that you please cite the ORIGAMI paper:
+
+@ARTICLE{2012ApJ...754..126F,
+   author = {{Falck}, B.~L. and {Neyrinck}, M.~C. and {Szalay}, A.~S.},
+    title = "{ORIGAMI: Delineating Halos Using Phase-space Folds}",
+  journal = {\apj},
+archivePrefix = "arXiv",
+   eprint = {1201.2353},
+ primaryClass = "astro-ph.CO",
+ keywords = {dark matter, galaxies: halos, large-scale structure of universe, methods: numerical},
+     year = 2012,
+    month = aug,
+   volume = 754,
+      eid = {126},
+    pages = {126},
+      doi = {10.1088/0004-637X/754/2/126},
+   adsurl = {http://adsabs.harvard.edu/abs/2012ApJ...754..126F},
+  adsnote = {Provided by the SAO/NASA Astrophysics Data System}
+}
+
+The main homepage for ORIGAMI can be found here:
+
+http://icg.port.ac.uk/~falckb/origami.html

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/cheatsheet.tex
--- a/doc/cheatsheet.tex
+++ b/doc/cheatsheet.tex
@@ -217,21 +217,21 @@
 in the snapshot. \\
 \texttt{val, loc = pf.h.find\_max("Density")} \textemdash\ Find the \texttt{val}ue of
 the maximum of the field \texttt{Density} and its \texttt{loc}ation. \\
-\texttt{sp = pf.h.sphere(}{\it cen}\texttt{,}{\it radius}\texttt{)} \textemdash\   Create a spherical data 
+\texttt{sp = pf.sphere(}{\it cen}\texttt{,}{\it radius}\texttt{)} \textemdash\   Create a spherical data 
 container. {\it cen} may be a coordinate, or ``max'' which 
 centers on the max density point. {\it radius} may be a float in 
 code units or a tuple of ({\it length, unit}).\\
 
-\texttt{re = pf.h.region({\it cen}, {\it left edge}, {\it right edge})} \textemdash\ Create a
+\texttt{re = pf.region({\it cen}, {\it left edge}, {\it right edge})} \textemdash\ Create a
 rectilinear data container. {\it cen} is required but not used.
 {\it left} and {\it right edge} are coordinate values that define the region.
 
-\texttt{di = pf.h.disk({\it cen}, {\it normal}, {\it radius}, {\it height})} \textemdash\ 
+\texttt{di = pf.disk({\it cen}, {\it normal}, {\it radius}, {\it height})} \textemdash\ 
 Create a cylindrical data container centered at {\it cen} along the 
 direction set by {\it normal}, with total length
  2$\times${\it height} and with radius {\it radius}. \\
  
- \texttt{bl = pf.h.boolean({\it constructor})} \textemdash\ Create a boolean data
+ \texttt{bl = pf.boolean({\it constructor})} \textemdash\ Create a boolean data
  container. {\it constructor} is a list of pre-defined non-boolean 
  data containers with nested boolean logic using the
  ``AND'', ``NOT'', or ``OR'' operators. E.g. {\it constructor=}
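
Taken together, the renamed calls in this hunk amount to the following hedged sketch (dataset name and numeric values are illustrative):

    from yt.mods import *

    pf = load("RedshiftOutput0005")
    # Data containers now hang directly off the dataset, not off pf.h:
    sp = pf.sphere("max", (10.0, "kpc"))        # centered on the max-density point
    re = pf.region([0.5]*3, [0.4]*3, [0.6]*3)   # center (unused), left edge, right edge
    di = pf.disk([0.5]*3, [0.0, 0.0, 1.0],
                 (10.0, "kpc"), (3.0, "kpc"))   # center, normal, radius, height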

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/coding_styleguide.txt
--- a/doc/coding_styleguide.txt
+++ b/doc/coding_styleguide.txt
@@ -60,7 +60,7 @@
  * Avoid Enzo-isms.  This includes but is not limited to:
    * Hard-coding parameter names that are the same as those in Enzo.  The
      following translation table should be of some help.  Note that the
-     parameters are now properties on a StaticOutput subclass: you access them
+     parameters are now properties on a Dataset subclass: you access them
      like pf.refine_by .
      * RefineBy => refine_by
      * TopGridRank => dimensionality
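
In user code the translated parameters are read as plain attributes on the dataset, e.g. (output name illustrative):

    from yt.mods import *

    pf = load("DD0010/DD0010")
    print pf.refine_by        # formerly Enzo's RefineBy
    print pf.dimensionality   # formerly Enzo's TopGridRank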

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/extensions/notebook_sphinxext.py
--- a/doc/extensions/notebook_sphinxext.py
+++ b/doc/extensions/notebook_sphinxext.py
@@ -1,9 +1,10 @@
-import os, shutil, string, glob
+import os, shutil, string, glob, re
 from sphinx.util.compat import Directive
 from docutils import nodes
 from docutils.parsers.rst import directives
 from IPython.nbconvert import html, python
-from runipy.notebook_runner import NotebookRunner
+from IPython.nbformat.current import read, write
+from runipy.notebook_runner import NotebookRunner, NotebookError
 
 class NotebookDirective(Directive):
     """Insert an evaluated notebook into a document
@@ -57,12 +58,8 @@
 
         skip_exceptions = 'skip_exceptions' in self.options
 
-        try:
-            evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval,
-                                               skip_exceptions=skip_exceptions)
-        except:
-            # bail
-            return []
+        evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval,
+                                           skip_exceptions=skip_exceptions)
 
         # Create link to notebook and script files
         link_rst = "(" + \
@@ -138,11 +135,20 @@
     # Create evaluated version and save it to the dest path.
     # Always use --pylab so figures appear inline
     # perhaps this is questionable?
-    nb_runner = NotebookRunner(nb_in=nb_path, pylab=True)
-    nb_runner.run_notebook(skip_exceptions=skip_exceptions)
+    notebook = read(open(nb_path), 'json')
+    nb_runner = NotebookRunner(notebook, pylab=False)
+    try:
+        nb_runner.run_notebook(skip_exceptions=skip_exceptions)
+    except NotebookError as e:
+        print ''
+        print e
+        # Return the traceback, filtering out ANSI color codes.
+        # http://stackoverflow.com/questions/13506033/filtering-out-ansi-escape-sequences
+        return 'Notebook conversion failed with the following traceback: \n%s' % \
+            re.sub(r'\\033[\[\]]([0-9]{1,2}([;@][0-9]{0,2})*)*[mKP]?', '', str(e))
     if dest_path is None:
         dest_path = 'temp_evaluated.ipynb'
-    nb_runner.save_notebook(dest_path)
+    write(nb_runner.nb, open(dest_path, 'w'), 'json')
     ret = nb_to_html(dest_path)
     if dest_path == 'temp_evaluated.ipynb':
         os.remove(dest_path)
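
Stripped of the Sphinx plumbing, the new evaluation path boils down to this standalone sketch (same runipy/IPython calls as in the hunk; file names are illustrative):

    from IPython.nbformat.current import read, write
    from runipy.notebook_runner import NotebookRunner, NotebookError

    # Read the notebook, execute it, and write the evaluated copy back out.
    notebook = read(open("example.ipynb"), 'json')
    nb_runner = NotebookRunner(notebook, pylab=False)
    try:
        nb_runner.run_notebook(skip_exceptions=False)
    except NotebookError as e:
        print e  # the extension instead returns the traceback, ANSI codes stripped
    write(nb_runner.nb, open("example_evaluated.ipynb", 'w'), 'json')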

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/extensions/notebookcell_sphinxext.py
--- a/doc/extensions/notebookcell_sphinxext.py
+++ b/doc/extensions/notebookcell_sphinxext.py
@@ -35,12 +35,7 @@
 
         skip_exceptions = 'skip_exceptions' in self.options
 
-        try:
-            evaluated_text = \
-                evaluate_notebook('temp.ipynb', skip_exceptions=skip_exceptions)
-        except:
-            # bail
-            return []
+        evaluated_text = evaluate_notebook('temp.ipynb', skip_exceptions=skip_exceptions)
 
         # create notebook node
         attributes = {'format': 'html', 'source': 'nb_path'}

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/helper_scripts/show_fields.py
--- a/doc/helper_scripts/show_fields.py
+++ b/doc/helper_scripts/show_fields.py
@@ -17,7 +17,7 @@
 everywhere, "Enzo" fields in Enzo datasets, "Orion" fields in Orion datasets,
 and so on.
 
-Try using the ``pf.h.field_list`` and ``pf.h.derived_field_list`` to view the
+Try using the ``pf.field_list`` and ``pf.derived_field_list`` to view the
 native and derived fields available for your dataset respectively. For example
 to display the native fields in alphabetical order:
 
@@ -25,7 +25,7 @@
 
   from yt.mods import *
   pf = load("Enzo_64/DD0043/data0043")
-  for i in sorted(pf.h.field_list):
+  for i in sorted(pf.field_list):
     print i
 
 .. note:: Universal fields will be overridden by a code-specific field.

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/how_to_develop_yt.txt
--- a/doc/how_to_develop_yt.txt
+++ b/doc/how_to_develop_yt.txt
@@ -79,8 +79,8 @@
       This is where interfaces to codes are created.  Within each subdirectory of
       yt/frontends/ there must exist the following files, even if empty:
 
-      * data_structures.py, where subclasses of AMRGridPatch, StaticOutput and
-        GridGeometryHandler are defined.
+      * data_structures.py, where subclasses of AMRGridPatch, Dataset and
+        GridIndex are defined.
       * io.py, where a subclass of IOHandler is defined.
       * misc.py, where any miscellaneous functions or classes are defined.
       * definitions.py, where any definitions specific to the frontend are
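
A hedged skeleton of such a frontend under the new names could look like the following (import paths per yt-3.0; the MyCode* identifiers are placeholders, not a real frontend):

    # data_structures.py (sketch)
    from yt.data_objects.grid_patch import AMRGridPatch
    from yt.geometry.grid_geometry_handler import GridIndex
    from yt.data_objects.static_output import Dataset

    class MyCodeGrid(AMRGridPatch):
        _id_offset = 0

    class MyCodeIndex(GridIndex):
        grid = MyCodeGrid

    class MyCodeDataset(Dataset):
        _index_class = MyCodeIndex

    # io.py (sketch)
    from yt.utilities.io_handler import BaseIOHandler

    class MyCodeIOHandler(BaseIOHandler):
        _dataset_type = "mycode"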

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/install_script.sh
--- a/doc/install_script.sh
+++ b/doc/install_script.sh
@@ -503,10 +503,8 @@
     cd $LIB
     if [ ! -z `echo $LIB | grep h5py` ]
     then
-        shift
 	( ${DEST_DIR}/bin/python2.7 setup.py build --hdf5=${HDF5_DIR} $* 2>&1 ) 1>> ${LOG_FILE} || do_exit
     else
-        shift
         ( ${DEST_DIR}/bin/python2.7 setup.py build   $* 2>&1 ) 1>> ${LOG_FILE} || do_exit
     fi
     ( ${DEST_DIR}/bin/python2.7 setup.py install    2>&1 ) 1>> ${LOG_FILE} || do_exit

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/Particle_Trajectories.ipynb
--- a/doc/source/analyzing/analysis_modules/Particle_Trajectories.ipynb
+++ b/doc/source/analyzing/analysis_modules/Particle_Trajectories.ipynb
@@ -1,6 +1,7 @@
 {
  "metadata": {
-  "name": ""
+  "name": "",
+  "signature": "sha256:874e85c86cd80a516bb61775b566cd46766c60bdf8f865336bf9dd3505f83821"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -18,7 +19,8 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "from yt.imods import *\n",
+      "%matplotlib inline\n",
+      "from yt.mods import *\n",
       "from yt.analysis_modules.api import ParticleTrajectories\n",
       "from yt.config import ytcfg\n",
       "path = ytcfg.get(\"yt\", \"test_data_dir\")"
@@ -220,7 +222,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "sp = pf.h.sphere(\"max\", (0.5, \"mpc\"))\n",
+      "sp = pf.sphere(\"max\", (0.5, \"mpc\"))\n",
       "indices = sp[\"particle_index\"][sp[\"particle_type\"] == 1]"
      ],
      "language": "python",
@@ -238,7 +240,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "my_fns = glob.glob(path+\"/enzo_tiny_cosmology/DD*/*.hierarchy\")\n",
+      "my_fns = glob.glob(path+\"/enzo_tiny_cosmology/DD*/*.index\")\n",
       "my_fns.sort()\n",
       "trajs = ParticleTrajectories(my_fns, indices)"
      ],

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -1,6 +1,7 @@
 {
  "metadata": {
-  "name": ""
+  "name": "",
+  "signature": "sha256:e5d3c629592c8aacbabf2e3fab2660703298886b8de6f36eb7cdc1f60b726496"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -88,7 +89,8 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "from yt.imods import *\n",
+      "%matplotlib inline\n",
+      "from yt.mods import *\n",
       "from yt.analysis_modules.api import SZProjection\n",
       "\n",
       "pf = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/clump_finding.rst
--- a/doc/source/analyzing/analysis_modules/clump_finding.rst
+++ b/doc/source/analyzing/analysis_modules/clump_finding.rst
@@ -22,7 +22,7 @@
 selecting only those clumps that are gravitationally bound.
 
 Once the clump-finder has finished, the user can write out a set of quantities for each clump in the 
-hierarchy.  Additional info items can also be added.  We also provide a recipe
+index.  Additional info items can also be added.  We also provide a recipe
 for finding clumps in :ref:`cookbook-find_clumps`.
 
 Treecode Optimization
@@ -85,7 +85,7 @@
   from yt.mods import *
   
   pf = load("DD0000")
-  sp = pf.h.sphere([0.5, 0.5, 0.5], radius=0.1)
+  sp = pf.sphere([0.5, 0.5, 0.5], radius=0.1)
   
   ratio = sp.quantities["IsBound"](truncate=False, include_thermal_energy=True,
       treecode=True, opening_angle=2.0)
@@ -98,7 +98,7 @@
   from yt.mods import *
   
   pf = load("DD0000")
-  sp = pf.h.sphere([0.5, 0.5, 0.5], radius=0.1)
+  sp = pf.sphere([0.5, 0.5, 0.5], radius=0.1)
   
   ratio = sp.quantities["IsBound"](truncate=False, include_thermal_energy=True,
       treecode=False)
@@ -151,7 +151,7 @@
 region of analysis. Up to about 100,000 cells,
 the treecode is actually slower than the brute-force method. This is due to
 the fact that with fewer cells, smaller geometric distances,
-and a shallow AMR hierarchy, the treecode
+and a shallow AMR index, the treecode
 method has very little chance to be applied. The calculation is overall
 slower due to the overhead of the treecode method & startup costs. This
 explanation is further strengthened by the fact that the accuracy of the

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/ellipsoid_analysis.rst
--- a/doc/source/analyzing/analysis_modules/ellipsoid_analysis.rst
+++ b/doc/source/analyzing/analysis_modules/ellipsoid_analysis.rst
@@ -107,7 +107,7 @@
 
 .. code-block:: python
 
-  ell = pf.h.ellipsoid(ell_param[0],
+  ell = pf.ellipsoid(ell_param[0],
   ell_param[1],
   ell_param[2],
   ell_param[3],

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/photon_simulator.rst
--- a/doc/source/analyzing/analysis_modules/photon_simulator.rst
+++ b/doc/source/analyzing/analysis_modules/photon_simulator.rst
@@ -35,7 +35,7 @@
 
 .. code:: python
 
-    from yt.imods import *
+    from yt.mods import *
     from yt.analysis_modules.api import *
     from yt.utilities.cosmology import Cosmology
 
@@ -89,7 +89,7 @@
 
 .. code:: python
 
-    sp = pf.h.sphere("c", (250., "kpc"))
+    sp = pf.sphere("c", (250., "kpc"))
 
 This will serve as our ``data_source`` that we will use later. Next, we
 need to create the ``SpectralModel`` instance that will determine how
@@ -445,7 +445,7 @@
 
 .. code:: python
 
-   sphere = pf.h.sphere(pf.domain_center, 1.0/pf["mpc"])
+   sphere = pf.sphere(pf.domain_center, 1.0/pf["mpc"])
        
    A = 6000.
    exp_time = 2.0e5

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/radmc3d_export.rst
--- a/doc/source/analyzing/analysis_modules/radmc3d_export.rst
+++ b/doc/source/analyzing/analysis_modules/radmc3d_export.rst
@@ -18,7 +18,7 @@
 
 To compute thermal emission intensities, RADMC-3D needs a file called
 "dust_density.inp" that specifies the density of dust for every cell in the AMR
-hierarchy. To generate this file, first import the RADMC-3D exporter, which 
+index. To generate this file, first import the RADMC-3D exporter, which 
 is not loaded into your environment by default:
 
 .. code-block:: python
@@ -73,7 +73,7 @@
 
 The file format required for line emission is slightly different. The following script will generate 
 two files, one called "numberdens_co.inp", which contains the number density of CO molecules
-for every cell in the hierarchy, and another called "gas-velocity.inp", which is useful if you want 
+for every cell in the index, and another called "gas-velocity.inp", which is useful if you want 
 to include doppler broadening.
 
 .. code-block:: python
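
(The example that follows this directive is cut off by the hunk.) As a hedged illustration of the exporter calls the passage describes, assuming the RadMC3DWriter API of this era and an illustrative dataset:

    from yt.mods import *
    from yt.analysis_modules.radmc3d_export.api import RadMC3DWriter

    # Model dust as a fixed fraction of the gas density (ratio illustrative).
    dust_to_gas = 0.01
    def _DustDensity(field, data):
        return dust_to_gas * data["Density"]
    add_field("DustDensity", function=_DustDensity)

    pf = load("DD0010/DD0010")
    writer = RadMC3DWriter(pf)
    writer.write_amr_grid()                                    # grid structure
    writer.write_dust_file("DustDensity", "dust_density.inp")  # what RADMC-3D reads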

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/running_halofinder.rst
--- a/doc/source/analyzing/analysis_modules/running_halofinder.rst
+++ b/doc/source/analyzing/analysis_modules/running_halofinder.rst
@@ -448,7 +448,7 @@
   pf = load('data0458')
   # Note that the first term below, [0.5]*3, defines the center of
   # the region and is not used. It can be any value.
-  sv = pf.h.region([0.5]*3, [0.21, .21, .72], [.28, .28, .79])
+  sv = pf.region([0.5]*3, [0.21, .21, .72], [.28, .28, .79])
   halos = HaloFinder(pf, subvolume = sv)
   halos.write_out("sv.out")
 
@@ -493,10 +493,10 @@
   from yt.analysis_modules.halo_finding.rockstar.api import RockstarHaloFinder
 
   #find all of our simulation files
-  files = glob.glob("Enzo_64/DD*/\*hierarchy")
+  files = glob.glob("Enzo_64/DD*/\*index")
   #hopefully the file name order is chronological
   files.sort()
-  ts = TimeSeriesData.from_filenames(files[:])
+  ts = DatasetSeries.from_filenames(files[:])
   rh = RockstarHaloFinder(ts)
   rh.run()
 
@@ -522,7 +522,7 @@
     the width of the smallest grid element in the simulation from the
     last data snapshot (i.e. the one where time has evolved the
     longest) in the time series:
-    ``pf_last.h.get_smallest_dx() * pf_last['mpch']``.
+    ``pf_last.index.get_smallest_dx() * pf_last['mpch']``.
   * ``total_particles``, if supplied, this is a pre-calculated
     total number of dark matter
     particles present in the simulation. For example, this is useful
@@ -624,12 +624,12 @@
     
     def main():
         import enzo
-        pf = EnzoStaticOutputInMemory()
+        pf = EnzoDatasetInMemory()
         mine = ytcfg.getint('yt','__topcomm_parallel_rank')
         size = ytcfg.getint('yt','__topcomm_parallel_size')
 
         # Call rockstar.
-        ts = TimeSeriesData([pf])
+        ts = DatasetSeries([pf])
         outbase = "./rockstar_halos_%04d" % pf['NumberOfPythonTopGridCalls']
         rh = RockstarHaloFinder(ts, num_readers = size,
             outbase = outbase)

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/star_analysis.rst
--- a/doc/source/analyzing/analysis_modules/star_analysis.rst
+++ b/doc/source/analyzing/analysis_modules/star_analysis.rst
@@ -52,7 +52,7 @@
   from yt.mods import *
   from yt.analysis_modules.star_analysis.api import *
   pf = load("data0030")
-  re = pf.h.region([0.5,0.5,0.5], [0.4,0.5,0.6], [0.5,0.6,0.7])
+  re = pf.region([0.5,0.5,0.5], [0.4,0.5,0.6], [0.5,0.6,0.7])
   # This puts the particle data for *all* the particles in the region re
   # into the arrays sm and ct.
   sm = re["ParticleMassMsun"]
@@ -148,7 +148,7 @@
 
 .. code-block:: python
 
-  re = pf.h.region([0.5,0.5,0.5], [0.4,0.5,0.6], [0.5,0.6,0.7])
+  re = pf.region([0.5,0.5,0.5], [0.4,0.5,0.6], [0.5,0.6,0.7])
   spec.calculate_spectrum(data_source=re)
 
 If a subset of stars is desired, call it like this. ``star_mass`` is in units
@@ -157,7 +157,7 @@
 
 .. code-block:: python
 
-  re = pf.h.region([0.5,0.5,0.5], [0.4,0.5,0.6], [0.5,0.6,0.7])
+  re = pf.region([0.5,0.5,0.5], [0.4,0.5,0.6], [0.5,0.6,0.7])
   # This puts the particle data for *all* the particles in the region re
   # into the arrays sm, ct and metal.
   sm = re["ParticleMassMsun"]

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/sunrise_export.rst
--- a/doc/source/analyzing/analysis_modules/sunrise_export.rst
+++ b/doc/source/analyzing/analysis_modules/sunrise_export.rst
@@ -18,7 +18,7 @@
 	from yt.mods import *
 	import numpy as na
 
-	pf = ARTStaticOutput(file_amr)
+	pf = ARTDataset(file_amr)
 	potential_value,center=pf.h.find_min('Potential_New')
 	root_cells = pf.domain_dimensions[0]
 	le = np.floor(root_cells*center) #left edge
@@ -69,7 +69,7 @@
 
 	for x,a in enumerate(zip(pos,age)): #loop over stars
 	    center = x*pf['kpc']
-	    grid,idx = find_cell(pf.h.grids[0],center)
+	    grid,idx = find_cell(pf.index.grids[0],center)
 	    pk[i] = grid['Pk'][idx]
 
 This code is how Sunrise calculates the pressure, so we can add our own derived field:
@@ -114,7 +114,7 @@
 Sanity Check: Gas & Stars Line Up
 ---------------------------------
 
-If you add your star particles separately from the gas cell hierarchy, then it is worth checking that they still lined up once they've been loaded into Sunrise. This is fairly easy to do with a useful 'auxiliary' run. In Sunrise, set all of your rays to zero, (nrays_nonscatter, nrays_scatter,nrays_intensity,nrays_ir ) except for nrays_aux, and this will produce an mcrx FITS file with a gas map, a metals map, a temperature*gass_mass map and a stellar map for each camera. As long as you keep some cameras at theta,phi = 0,0 or 90,0, etc., then a standard yt projection down the code's xyz axes should look identical:
+If you add your star particles separately from the gas cell index, then it is worth checking that they still line up once they've been loaded into Sunrise. This is fairly easy to do with a useful 'auxiliary' run. In Sunrise, set all of your rays to zero (nrays_nonscatter, nrays_scatter, nrays_intensity, nrays_ir) except for nrays_aux, and this will produce an mcrx FITS file with a gas map, a metals map, a temperature*gas_mass map and a stellar map for each camera. As long as you keep some cameras at theta,phi = 0,0 or 90,0, etc., then a standard yt projection down the code's xyz axes should look identical:
 
 .. code-block:: python
 

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/analysis_modules/two_point_functions.rst
--- a/doc/source/analyzing/analysis_modules/two_point_functions.rst
+++ b/doc/source/analyzing/analysis_modules/two_point_functions.rst
@@ -872,7 +872,7 @@
     
     # We work in simulation's units, these are for conversion.
     vol_conv = pf['cm'] ** 3
-    sm = pf.h.get_smallest_dx()**3
+    sm = pf.index.get_smallest_dx()**3
     
     # Our density limit, in gm/cm**3
     dens = 2e-31

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/creating_derived_fields.rst
--- a/doc/source/analyzing/creating_derived_fields.rst
+++ b/doc/source/analyzing/creating_derived_fields.rst
@@ -88,7 +88,7 @@
 
    >>> from yt.mods import *
    >>> pf = load("GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100")
-   >>> pf.h.field_list
+   >>> pf.field_list
    ['dens', 'temp', 'pres', 'gpot', 'divb', 'velx', 'vely', 'velz', 'magx', 'magy', 'magz', 'magp']
    >>> pf.field_info['dens']._units
    '\\rm{g}/\\rm{cm}^{3}'
@@ -295,8 +295,6 @@
      (*Advanced*) Should this field appear in the dropdown box in Reason?
    ``not_in_all``
      (*Advanced*) If this is *True*, the field may not be in all the grids.
-   ``projection_conversion``
-     (*Advanced*) Which unit should we multiply by in a projection?
 
 How Do Units Work?
 ------------------
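
With ``projection_conversion`` removed from the option list above, a field definition of this era reduces to the remaining keywords; a hedged sketch with a hypothetical field:

    from yt.mods import *

    # Hypothetical derived field; note there is no projection_conversion keyword.
    def _dens_squared(field, data):
        return data["dens"] * data["dens"]

    add_field("dens_squared", function=_dens_squared,
              units=r"\rm{g}^{2}/\rm{cm}^{6}")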

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/external_analysis.rst
--- a/doc/source/analyzing/external_analysis.rst
+++ b/doc/source/analyzing/external_analysis.rst
@@ -21,7 +21,7 @@
    pf = load("DD0010/DD0010")
    rt_grids = []
 
-   for grid in pf.h.grids:
+   for grid in pf.index.grids:
        rt_grid = radtrans.RegularBox(
             grid.LeftEdge, grid.RightEdge,
             grid["density"], grid["temperature"], grid["metallicity"])

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/generating_processed_data.rst
--- a/doc/source/analyzing/generating_processed_data.rst
+++ b/doc/source/analyzing/generating_processed_data.rst
@@ -43,7 +43,7 @@
 
 .. code-block:: python
 
-   sl = pf.h.slice(0, 0.5)
+   sl = pf.slice(0, 0.5)
    frb = FixedResolutionBuffer(sl, (0.3, 0.5, 0.6, 0.8), (512, 512))
    my_image = frb["density"]
 
@@ -98,7 +98,7 @@
 
 .. code-block:: python
 
-   source = pf.h.sphere( (0.3, 0.6, 0.4), 1.0/pf['pc'])
+   source = pf.sphere( (0.3, 0.6, 0.4), 1.0/pf['pc'])
    profile = BinnedProfile1D(source, 128, "density", 1e-24, 1e-10)
    profile.add_fields("cell_mass", weight = None)
    profile.add_fields("temperature")
@@ -128,7 +128,7 @@
 
 .. code-block:: python
 
-   source = pf.h.sphere( (0.3, 0.6, 0.4), 1.0/pf['pc'])
+   source = pf.sphere( (0.3, 0.6, 0.4), 1.0/pf['pc'])
    prof2d = BinnedProfile2D(source, 128, "density", 1e-24, 1e-10, True,
                                     128, "temperature", 10, 10000, True)
    prof2d.add_fields("cell_mass", weight = None)
@@ -163,7 +163,7 @@
 
 To calculate the values along a line connecting two points in a simulation, you
 can use the object :class:`~yt.data_objects.data_containers.AMRRayBase`,
-accessible as the ``ray`` property on a hierarchy.  (See :ref:`using-objects`
+accessible as the ``ray`` property on an index.  (See :ref:`using-objects`
 for more information on this.)  To do so, you can supply two points and access
 fields within the returned object.  For instance, this code will generate a ray
 between the points (0.3, 0.5, 0.9) and (0.1, 0.8, 0.5) and examine the density
@@ -171,7 +171,7 @@
 
 .. code-block:: python
 
-   ray = pf.h.ray(  (0.3, 0.5, 0.9), (0.1, 0.8, 0.5) )
+   ray = pf.ray(  (0.3, 0.5, 0.9), (0.1, 0.8, 0.5) )
    print ray["density"]
 
 The points are ordered, but the ray is also traversing cells of varying length,

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/ionization_cube.py
--- a/doc/source/analyzing/ionization_cube.py
+++ b/doc/source/analyzing/ionization_cube.py
@@ -8,14 +8,14 @@
 def IonizedHydrogen(field, data):
     return data["HII_Density"]/(data["HI_Density"]+data["HII_Density"])
 
-ts = TimeSeriesData.from_filenames("SED800/DD*/*.hierarchy", parallel = 8)
+ts = DatasetSeries.from_filenames("SED800/DD*/*.index", parallel = 8)
 
 ionized_z = np.zeros(ts[0].domain_dimensions, dtype="float32")
 
 t1 = time.time()
 for pf in ts.piter():
     z = pf.current_redshift
-    for g in parallel_objects(pf.h.grids, njobs = 16):
+    for g in parallel_objects(pf.index.grids, njobs = 16):
         i1, j1, k1 = g.get_global_startindex() # Index into our domain
         i2, j2, k2 = g.get_global_startindex() + g.ActiveDimensions
         # Look for the newly ionized gas

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/objects.rst
--- a/doc/source/analyzing/objects.rst
+++ b/doc/source/analyzing/objects.rst
@@ -40,8 +40,7 @@
 
    add_enzo_field("Cooling_Time", units=r"\rm{s}",
                   function=NullFunc,
-                  validators=ValidateDataField("Cooling_Time"),
-                  projection_conversion="1")
+                  validators=ValidateDataField("Cooling_Time"))
 
 Note that we used the ``NullFunc`` function here.  To add a derived field,
 which is not expected to necessarily exist on disk, use the standard
@@ -83,7 +82,7 @@
 
 .. code-block:: python
 
-   sp = pf.h.sphere([0.5, 0.5, 0.5], 10.0/pf['kpc'])
+   sp = pf.sphere([0.5, 0.5, 0.5], 10.0/pf['kpc'])
 
 and then look at the temperature of its cells within it via:
 
@@ -107,8 +106,8 @@
 .. code-block:: python
 
    pf = load("my_data")
-   print pf.h.field_list
-   print pf.h.derived_field_list
+   print pf.field_list
+   print pf.derived_field_list
 
 When a field is added, it is added to a container that hangs off of the
 parameter file, as well.  All of the field creation options
@@ -132,11 +131,11 @@
 Available Objects
 -----------------
 
-Objects are instantiated by direct access of a hierarchy.  Each of the objects
-that can be generated by a hierarchy are in fact fully-fledged data objects
+Objects are instantiated by direct access of an index.  Each of the objects
+that can be generated by an index are in fact fully-fledged data objects
 respecting the standard protocol for interaction.
 
-The following objects are available, all of which hang off of the hierarchy
+The following objects are available, all of which hang off of the index
 object.  To access them, you would do something like this (as for a
 :class:`region`):
 
@@ -144,7 +143,7 @@
 
    from yt.mods import *
    pf = load("RedshiftOutput0005")
-   reg = pf.h.region([0.5, 0.5, 0.5], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0])
+   reg = pf.region([0.5, 0.5, 0.5], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0])
 
 .. include:: _obj_docstrings.inc
 
@@ -243,15 +242,15 @@
 .. notebook-cell::
 
    from yt.mods import *
-   pf = load("enzo_tiny_cosmology/DD0046/DD0046")
-   ad = pf.h.all_data()
-   total_mass = ad.quantities["TotalQuantity"]("cell_mass")
+   ds = load("enzo_tiny_cosmology/DD0046/DD0046")
+   ad = ds.all_data()
+   total_mass = ad.quantities.total_mass()
    # now select only gas with 1e5 K < T < 1e7 K.
-   new_region = ad.cut_region(['grid["temperature"] > 1e5',
-                               'grid["temperature"] < 1e7'])
-   cut_mass = new_region.quantities["TotalQuantity"]("cell_mass")
+   new_region = ad.cut_region(['obj["temperature"] > 1e5',
+                               'obj["temperature"] < 1e7'])
+   cut_mass = new_region.quantities.total_mass()
    print "The fraction of mass in this temperature range is %f." % \
-     (cut_mass[0] / total_mass[0])
+     (cut_mass / total_mass)
 
 The ``cut_region`` function generates a new object containing only the cells 
 that meet all of the specified criteria.  The sole argument to ``cut_region`` 
@@ -267,7 +266,7 @@
    from yt.mods import *
    pf = load("enzo_tiny_cosmology/DD0046/DD0046")
    ad = pf.h.all_data()
-   new_region = ad.cut_region(['grid["density"] > 1e-29'])
+   new_region = ad.cut_region(['obj["density"] > 1e-29'])
    plot = ProjectionPlot(pf, "x", "density", weight_field="density",
                          data_source=new_region)
    plot.save()
@@ -292,7 +291,7 @@
 
 .. code-block:: python
 
-   sp = pf.h.sphere("max", (1.0, 'pc'))
+   sp = pf.sphere("max", (1.0, 'pc'))
    contour_values, connected_sets = sp.extract_connected_sets(
         "density", 3, 1e-30, 1e-20)
 
@@ -369,20 +368,20 @@
 :mod:`~yt.utilities.ParameterFileStorage` via :class:`~yt.utilities.ParameterFileStorage.ParameterFileStore`.)
 
 To save an object, you can either save it in the ``.yt`` file affiliated with
-the hierarchy or as a standalone file.  For instance, using
-:meth:`~yt.data_objects.hierarchy.save_object` we can save a sphere.
+the index or as a standalone file.  For instance, using
+:meth:`~yt.data_objects.index.save_object` we can save a sphere.
 
 .. code-block:: python
 
    from yt.mods import *
    pf = load("my_data")
-   sp = pf.h.sphere([0.5, 0.5, 0.5], 10.0/pf['kpc'])
+   sp = pf.sphere([0.5, 0.5, 0.5], 10.0/pf['kpc'])
 
    pf.h.save_object(sp, "sphere_to_analyze_later")
 
 
 In a later session, we can load it using
-:meth:`~yt.data_objects.hierarchy.load_object`:
+:meth:`~yt.data_objects.index.load_object`:
 
 .. code-block:: python
 
@@ -399,14 +398,14 @@
    from yt.mods import *
 
    pf = load("my_data")
-   sp = pf.h.sphere([0.5, 0.5, 0.5], 10.0/pf['kpc'])
+   sp = pf.sphere([0.5, 0.5, 0.5], 10.0/pf['kpc'])
 
    sp.save_object("my_sphere", "my_storage_file.cpkl")
 
 This will store the object as ``my_sphere`` in the file
 ``my_storage_file.cpkl``, which will be created or accessed using the standard
 python module :mod:`shelve`.  Note that if a filename is not supplied, it will
-be saved via the hierarchy, as above.
+be saved via the index, as above.
 
 To re-load an object saved this way, you can use the shelve module directly:
 
@@ -430,6 +429,6 @@
           loading and storing objects -- so in theory you could even save a
           list of objects!
 
-This method works for clumps, as well, and the entire clump hierarchy will be
+This method works for clumps, as well, and the entire clump index will be
 stored and restored upon load.
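
The shelve-based reload referred to above (its code block is elided by the hunk) is a short stdlib affair along these lines:

    import shelve

    # Open the storage file written by save_object() and pull the sphere out.
    obj_file = shelve.open("my_storage_file.cpkl")
    sp = obj_file["my_sphere"]
    obj_file.close()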
 

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/parallel_computation.rst
--- a/doc/source/analyzing/parallel_computation.rst
+++ b/doc/source/analyzing/parallel_computation.rst
@@ -166,7 +166,7 @@
 Spatial Decomposition
 +++++++++++++++++++++
 
-During this process, the hierarchy will be decomposed along either all three
+During this process, the index will be decomposed along either all three
 axes or along an image plane, if the process is that of projection.  This type
 of parallelism is overall less efficient than grid-based parallelism, but it
 has been shown to obtain good results overall.
@@ -283,7 +283,7 @@
 -----------------------------
 
 The same :func:`parallel_objects` machinery discussed above is turned on by
-default when using a ``TimeSeriesData`` object (see :ref:`time-series-analysis`)
+default when using a ``DatasetSeries`` object (see :ref:`time-series-analysis`)
 to iterate over simulation outputs.  The syntax for this is very simple.  As an
 example, we can use the following script to find the angular momentum vector in
 a 1 pc sphere centered on the maximum density cell in a large number of
@@ -292,7 +292,7 @@
 .. code-block:: python
 
    from yt.pmods import *
-   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel = True)
+   ts = DatasetSeries.from_filenames("DD*/output_*", parallel = True)
    sphere = ts.sphere("max", (1.0, "pc"))
    L_vecs = sphere.quantities["AngularMomentumVector"]()
 
@@ -302,15 +302,15 @@
 explicitly set ``parallel = True`` as in the above example. 
 
 One could get the same effect by iterating over the individual parameter files
-in the TimeSeriesData object:
+in the DatasetSeries object:
 
 .. code-block:: python
 
    from yt.pmods import *
-   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel = True)
+   ts = DatasetSeries.from_filenames("DD*/output_*", parallel = True)
    my_storage = {}
    for sto,pf in ts.piter(storage=my_storage):
-       sphere = pf.h.sphere("max", (1.0, "pc"))
+       sphere = pf.sphere("max", (1.0, "pc"))
        L_vec = sphere.quantities["AngularMomentumVector"]()
        sto.result_id = pf.parameter_filename
        sto.result = L_vec
@@ -329,7 +329,7 @@
 .. code-block:: python
 
    from yt.pmods import *
-   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel = 4)
+   ts = DatasetSeries.from_filenames("DD*/output_*", parallel = 4)
    sphere = ts.sphere("max", (1.0, "pc"))
    L_vecs = sphere.quantities["AngularMomentumVector"]()
 

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/time_series_analysis.rst
--- a/doc/source/analyzing/time_series_analysis.rst
+++ b/doc/source/analyzing/time_series_analysis.rst
@@ -17,11 +17,11 @@
        process_output(pf)
 
 But this is not really very nice.  This ends up requiring a lot of maintenance.
-The :class:`~yt.data_objects.time_series.TimeSeriesData` object has been
+The :class:`~yt.data_objects.time_series.DatasetSeries` object has been
 designed to remove some of this clunkiness and present an easier, more unified
 approach to analyzing sets of data.  Even better,
-:class:`~yt.data_objects.time_series.TimeSeriesData` works in parallel by
-default (see :ref:`parallel-computation`), so you can use a ``TimeSeriesData``
+:class:`~yt.data_objects.time_series.DatasetSeries` works in parallel by
+default (see :ref:`parallel-computation`), so you can use a ``DatasetSeries``
 object to quickly and easily parallelize your analysis.  Since doing the same
 analysis task on many simulation outputs is 'embarrassingly' parallel, this
 naturally allows for almost arbitrary speedup - limited only by the number of
@@ -33,9 +33,9 @@
 creating your own, and these operators can be applied either to datasets on the
 whole or to subregions of individual datasets.
 
-The simplest mechanism for creating a ``TimeSeriesData`` object is to use the
+The simplest mechanism for creating a ``DatasetSeries`` object is to use the
 class method
-:meth:`~yt.data_objects.time_series.TimeSeriesData.from_filenames`.  This
+:meth:`~yt.data_objects.time_series.DatasetSeries.from_filenames`.  This
 method accepts a list of strings that can be supplied to ``load``.  For
 example:
 
@@ -43,7 +43,7 @@
 
    from yt.mods import *
    filenames = ["DD0030/output_0030", "DD0040/output_0040"]
-   ts = TimeSeriesData.from_filenames(filenames)
+   ts = DatasetSeries.from_filenames(filenames)
 
 This will create a new time series, populated with the output files ``DD0030``
 and ``DD0040``.  This object, here called ``ts``, can now be analyzed in bulk.
@@ -53,29 +53,29 @@
 .. code-block:: python
 
    from yt.mods import *
-   ts = TimeSeriesData.from_filenames("*/*.hierarchy")
+   ts = DatasetSeries.from_filenames("*/*.index")
 
 Analyzing Each Dataset In Sequence
 ----------------------------------
 
-The :class:`~yt.data_objects.time_series.TimeSeriesData` object has two primary
+The :class:`~yt.data_objects.time_series.DatasetSeries` object has two primary
 methods of iteration.  The first is a very simple iteration, where each object
 is returned for iteration:
 
 .. code-block:: python
 
    from yt.mods import *
-   ts = TimeSeriesData.from_filenames("*/*.hierarchy")
+   ts = DatasetSeries.from_filenames("*/*.index")
    for pf in ts:
        print pf.current_time
 
 This can also operate in parallel, using
-:meth:`~yt.data_objects.time_series.TimeSeriesData.piter`.  For more examples,
+:meth:`~yt.data_objects.time_series.DatasetSeries.piter`.  For more examples,
 see:
 
  * :ref:`parallel-time-series-analysis`
  * The cookbook recipe for :ref:`cookbook-time-series-analysis`
- * :class:`~yt.data_objects.time_series.TimeSeriesData`
+ * :class:`~yt.data_objects.time_series.DatasetSeries`
 
 Prepared Time Series Analysis
 -----------------------------
@@ -97,13 +97,13 @@
 .. code-block:: python
 
    from yt.mods import *
-   ts = TimeSeries.from_filenames("*/*.hierarchy")
+   ts = TimeSeries.from_filenames("*/*.index")
    max_rho = ts.tasks["MaximumValue"]("density")
 
 When we call the task, the time series object executes the task on each
 component parameter file.  The results are then returned to the user.  More
 complex, multi-task evaluations can be conducted by using the
-:meth:`~yt.data_objects.time_series.TimeSeriesData.eval` call, which accepts a
+:meth:`~yt.data_objects.time_series.DatasetSeries.eval` call, which accepts a
 list of analysis tasks.
 
 Analysis Tasks Applied to Objects
@@ -122,7 +122,7 @@
 .. code-block:: python
 
    from yt.mods import *
-   ts = TimeSeries.from_filenames("*/*.hierarchy")
+   ts = TimeSeries.from_filenames("*/*.index")
    sphere = ts.sphere("max", (1.0, "pc"))
    L_vecs = sphere.quantities["AngularMomentumVector"]()
 
@@ -155,7 +155,7 @@
    print ms
 
 This allows you to create your own analysis tasks that will be then available
-to time series data objects.  Since ``TimeSeriesData`` objects iterate over
+to time series data objects.  Since ``DatasetSeries`` objects iterate over
 filenames in parallel by default, this allows for transparent parallelization. 
 
 .. _analyzing-an-entire-simulation:
@@ -165,7 +165,7 @@
 
 The parameter file used to run a simulation contains all the information 
 necessary to know what datasets should be available.  The ``simulation`` 
-convenience function allows one to create a ``TimeSeriesData`` object of all 
+convenience function allows one to create a ``DatasetSeries`` object of all 
 or a subset of all data created by a single simulation.
 
 .. note:: Currently only implemented for Enzo.  Other simulation types coming 
@@ -179,7 +179,7 @@
   my_sim = simulation('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo',
                       find_outputs=False)
 
-Then, create a ``TimeSeriesData`` object with the :meth:`get_time_series` 
+Then, create a ``DatasetSeries`` object with the :meth:`get_time_series` 
 function.  With no additional keywords, the time series will include every 
 dataset.  If the **find_outputs** keyword is set to True, a search of the 
 simulation directory will be performed looking for potential datasets.  These 
@@ -249,7 +249,7 @@
    the requested times or redshifts.  If None, the nearest output is always 
    taken.  Default: None.
 
- * **parallel** (*bool*/*int*): If True, the generated TimeSeriesData will 
+ * **parallel** (*bool*/*int*): If True, the generated DatasetSeries will 
    divide the work such that a single processor works on each dataset.  If an
    integer is supplied, the work will be divided into that number of jobs.
    Default: True.
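
Putting the pieces together, a hedged end-to-end sketch (parameter file as in the example above; piter() distributes outputs across processors when run in parallel):

    from yt.mods import *

    my_sim = simulation('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo',
                        find_outputs=False)
    # With no keywords, every dataset of the simulation is included.
    my_sim.get_time_series()
    for pf in my_sim.piter():
        print pf.current_time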

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/units/1)_Symbolic_Units.ipynb
--- a/doc/source/analyzing/units/1)_Symbolic_Units.ipynb
+++ b/doc/source/analyzing/units/1)_Symbolic_Units.ipynb
@@ -1,7 +1,7 @@
 {
  "metadata": {
   "name": "",
-  "signature": "sha256:982174bfd01e41ea6510dc5cbff556dc82e6f13b8d0c6189324f3ca7a81b702a"
+  "signature": "sha256:52f186664831f5290b31ec433114927b9771e224bd79d0c82dd3d9a8d9c09bf6"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -227,6 +227,119 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
+      "When working with a YTArray with complicated units, you can use `unit_array` and `unit_quantity` to conveniently apply units to data:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "test_array = YTArray(np.random.random(20), 'erg/s')\n",
+      "\n",
+      "print test_array"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`unit_quantity` returns a `YTQuantity` with a value of 1.0 and the same units as the array it is attached to."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print test_array.unit_quantity"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`unit_array` returns a `YTArray` with the same units and shape as the array it is attached to and with all values set to 1.0."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print test_array.unit_array"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "These are useful when doing arithmetic:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print test_array + 1.0*test_array.unit_quantity"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print test_array + np.arange(20)*test_array.unit_array"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For convenience, `unit_quantity` is also available via `uq` and `unit_array` is available via `ua`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print test_array.uq\n",
+      "\n",
+      "print test_array.unit_quantity == test_array.uq"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from numpy import array_equal\n",
+      "\n",
+      "print test_array.ua\n",
+      "\n",
+      "print array_equal(test_array.ua, test_array.unit_array)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
       "Unit metadata is encoded in the `units` attribute that hangs off of `YTArray` or `YTQuantity` instances:"
      ]
     },
@@ -249,6 +362,14 @@
      "outputs": []
     },
     {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "Arithmetic with `YTQuantity` and `YTArray`"
+     ]
+    },
+    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
@@ -342,10 +463,15 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
+      "from yt.utilities.exceptions import YTUnitOperationError\n",
+      "\n",
       "a = YTQuantity(3, 'm')\n",
       "b = YTQuantity(5, 'erg')\n",
       "\n",
-      "print a+b"
+      "try:\n",
+      "    print a+b\n",
+      "except YTUnitOperationError as e:\n",
+      "    print e"
      ],
      "language": "python",
      "metadata": {},

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/units/2)_Data_Selection_and_fields.ipynb
--- a/doc/source/analyzing/units/2)_Data_Selection_and_fields.ipynb
+++ b/doc/source/analyzing/units/2)_Data_Selection_and_fields.ipynb
@@ -1,7 +1,7 @@
 {
  "metadata": {
   "name": "",
-  "signature": "sha256:8be3d1eb683160bad5efbb176d4bb310b50e8af4f1a4ad356edb2ae5d5a227d6"
+  "signature": "sha256:8e1a5db9e3869bcf761ff39c5a95d21458b7c4205f00da3d3f973d398422a466"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -37,8 +37,8 @@
       "from yt.mods import *\n",
       "ds = load('IsolatedGalaxy/galaxy0030/galaxy0030')\n",
       "          \n",
-      "dd = ds.h.all_data()\n",
-      "maxval, maxloc = ds.h.find_max('density')\n",
+      "dd = ds.all_data()\n",
+      "maxval, maxloc = ds.find_max('density')\n",
       "\n",
       "dens = dd['density']"
      ],
@@ -105,23 +105,60 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "YTArray defines several user-visible member functions: \n",
+      "YTArray defines several user-visible member functions that allow data to be converted from one unit system to another:\n",
       "\n",
-      "* `convert_to_units`\n",
-      "* `convert_to_cgs`\n",
       "* `in_units`\n",
       "* `in_cgs`\n",
-      "* `to_ndarray`\n",
-      "\n",
-      "The first two functions do in-place operations while the second two return copies of the original array in the new unit:"
+      "* `convert_to_units`\n",
+      "* `convert_to_cgs`"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The first method, `in_units`, returns a copy of the array in the units denoted by a string argument:"
      ]
     },
     {
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "print dd['density'].in_units('Msun/pc**3')\n",
-      "\n",
+      "print dd['density'].in_units('Msun/pc**3')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The second, `in_cgs`, returns a copy of the array converted into the base units of yt's CGS unit system:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print (dd['pressure']/dd['density'])\n",
+      "print (dd['pressure']/dd['density']).in_cgs()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The next two methods do in-place conversions:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
       "dens = dd['density']\n",
       "print dens\n",
       "\n",
@@ -136,14 +173,14 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "One possibly confusing wrinkle in this is that the unit conversions are always done 'in place' so if you try to query `dd['density']` again, you'll find that it has been converted to solar masses per cubic parsec:"
+      "One possibly confusing wrinkle when using in-place conversions: if you query `dd['density']` again, you'll find that it has been converted to solar masses per cubic parsec:"
      ]
     },
     {
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "print dens\n",
+      "print dd['density']\n",
       "\n",
       "dens.convert_to_units('g/cm**3')\n",
       "\n",
@@ -176,10 +213,18 @@
      "outputs": []
     },
     {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "Working with views and converting to ndarray"
+     ]
+    },
+    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "Of course, the data is always available as a raw NumPy array.  There are two ways to copy the contents of a `YTArray` into an ndarray:"
+      "There are two ways to convert the data into a numpy array.  The most straightforward and safe way to do this is to create a copy of the array data.  The following cell demonstrates four equivalent ways of doing this, in increasing degree of terseness."
      ]
     },
     {
@@ -188,8 +233,12 @@
      "input": [
       "import numpy as np\n",
       "\n",
-      "print dd['cell_mass'].to_ndarray()\n",
-      "print np.array(dd['cell_mass'])"
+      "dens = dd['cell_mass']\n",
+      "\n",
+      "print dens.to_ndarray()\n",
+      "print np.array(dens)\n",
+      "print dens.value\n",
+      "print dens.v"
      ],
      "language": "python",
      "metadata": {},
@@ -199,7 +248,14 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "Similarly, one can get a view into the underlying array data:"
+      "Since we have a copy of the data, we can mess with it however we wish without disturbing the original data returned by the yt data object."
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Another way to touch the raw array data is to get a _view_.  A numpy view is a lightweight array interface to a memory buffer. There are four ways to create views of YTArray instances:"
      ]
     },
     {
@@ -207,7 +263,9 @@
      "collapsed": false,
      "input": [
       "print dd['cell_mass'].ndarray_view()\n",
-      "print dd['cell_mass'].view(np.ndarray)"
+      "print dd['cell_mass'].view(np.ndarray)\n",
+      "print dd['cell_mass'].ndview\n",
+      "print dd['cell_mass'].d"
      ],
      "language": "python",
      "metadata": {},
@@ -217,15 +275,17 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "When working with views, rememeber that you are touching the raw array data and no longer have any of the unit checking provided by the unit system."
+      "When working with views, remember that you are touching the raw array data and no longer have any of the unit checking provided by the unit system.  This can be useful where it might be more straightforward to treat the array as if it didn't have units but without copying the data."
      ]
     },
     {
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "density = dd['density'].ndarray_view()\n",
-      "density[0:10] = 0\n",
+      "density_values = dd['density'].d\n",
+      "density_values[0:10] = 0\n",
+      "\n",
+      "# The original array was updated\n",
       "print dd['density']"
      ],
      "language": "python",

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/units/4)_Comparing_units_from_different_datasets.ipynb
--- a/doc/source/analyzing/units/4)_Comparing_units_from_different_datasets.ipynb
+++ b/doc/source/analyzing/units/4)_Comparing_units_from_different_datasets.ipynb
@@ -1,7 +1,7 @@
 {
  "metadata": {
   "name": "",
-  "signature": "sha256:97e766f98a390fee4e2e87d8ea55ba5855e4f08ed6a1bfe031eaf71fb41c4822"
+  "signature": "sha256:448380e74a746d19dc1eecfe222c0e798a87a4ac285e4f50e2598316086c5ee8"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -96,13 +96,13 @@
      "input": [
       "from yt.mods import *\n",
       "\n",
-      "ts = TimeSeriesData.from_filenames(\"Enzo_64/DD????/data????\")\n",
+      "ts = DatasetSeries.from_filenames(\"Enzo_64/DD????/data????\")\n",
       "\n",
       "storage = {}\n",
       "\n",
-      "for sto, pf in ts.piter(storage=storage):\n",
-      "    sto.result_id = pf.current_time\n",
-      "    sto.result = pf.length_unit\n",
+      "for sto, ds in ts.piter(storage=storage):\n",
+      "    sto.result_id = ds.current_time\n",
+      "    sto.result = ds.length_unit\n",
       "\n",
       "if is_root():\n",
       "    for t in sorted(storage.keys()):\n",

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/units/5)_Units_and_plotting.ipynb
--- a/doc/source/analyzing/units/5)_Units_and_plotting.ipynb
+++ b/doc/source/analyzing/units/5)_Units_and_plotting.ipynb
@@ -1,7 +1,7 @@
 {
  "metadata": {
   "name": "",
-  "signature": "sha256:f22e2acb721ada2e47ad0736deb074ac1bb919b82021b387d8d5624a53889025"
+  "signature": "sha256:981baca6958c75f0d84bbc24be7d2b75af5957d36aa3eb4ba725d9e47a85f80d"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -29,9 +29,8 @@
      "collapsed": false,
      "input": [
       "from yt.mods import *\n",
-      "ds = load('HiResIsolatedGalaxy/DD0044/DD0044')\n",
-      "\n",
-      "slc = SlicePlot(ds, 2, 'density', center=[0.53, 0.53, 0.53], width=(15, 'kpc'))\n",
+      "ds = load('IsolatedGalaxy/galaxy0030/galaxy0030')\n",
+      "slc = SlicePlot(ds, 2, 'density', center=[0.5, 0.5, 0.5], width=(15, 'kpc'))\n",
       "slc.set_figure_size(6)"
      ],
      "language": "python",
@@ -83,7 +82,12 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "slc.set_unit('density', 'Msun')"
+      "from yt.utilities.exceptions import YTUnitConversionError\n",
+      "\n",
+      "try:\n",
+      "    slc.set_unit('density', 'Msun')\n",
+      "except YTUnitConversionError as e:\n",
+      "    print e"
      ],
      "language": "python",
      "metadata": {},
@@ -102,7 +106,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "dd = ds.h.all_data()\n",
+      "dd = ds.all_data()\n",
       "plot = ProfilePlot(dd, 'density', 'temperature', weight_field='cell_mass')\n",
       "plot.show()"
      ],

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/analyzing/units/data_selection_and_fields.rst
--- a/doc/source/analyzing/units/data_selection_and_fields.rst
+++ b/doc/source/analyzing/units/data_selection_and_fields.rst
@@ -3,7 +3,7 @@
 Data selection and fields
 =========================
 
-.. notebook:: 2)_Data_selection_and_fields.ipynb
+.. notebook:: 2)_Data_Selection_and_fields.ipynb
 
 Derived Fields
 --------------

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/bootcamp/2)_Data_Inspection.ipynb
--- a/doc/source/bootcamp/2)_Data_Inspection.ipynb
+++ b/doc/source/bootcamp/2)_Data_Inspection.ipynb
@@ -1,6 +1,7 @@
 {
  "metadata": {
-  "name": ""
+  "name": "",
+  "signature": "sha256:15cdc35ddb8b1b938967237e17534149f734f4e7a61ebd37d74b675f8059da20"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -13,14 +14,14 @@
      "source": [
       "# Starting Out and Loading Data\n",
       "\n",
-      "We're going to get started by loading up yt.  This next command brings all of the libraries into memory and sets up our environment.  Note that in most scripts, you will want to import from ``yt.mods`` rather than ``yt.imods``.  But using ``yt.imods`` gets you some nice stuff for the IPython notebook, which we'll use below."
+      "We're going to get started by loading up yt.  This next command brings all of the libraries into memory and sets up our environment."
      ]
     },
     {
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "from yt.imods import *"
+      "from yt.mods import *"
      ],
      "language": "python",
      "metadata": {},
@@ -37,7 +38,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+      "ds = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
      ],
      "language": "python",
      "metadata": {},
@@ -49,14 +50,14 @@
      "source": [
       "## Fields and Facts\n",
       "\n",
-      "When you call the `load` function, yt tries to do very little -- this is designed to be a fast operation, just setting up some information about the simulation.  Now, the first time you access the \"hierarchy\" (shorthand is `.h`) it will read and load the mesh and then determine where data is placed in the physical domain and on disk.  Once it knows that, yt can tell you some statistics about the simulation:"
+      "When you call the `load` function, yt tries to do very little -- this is designed to be a fast operation, just setting up some information about the simulation.  Now, the first time you access the \"index\" it will read and load the mesh and then determine where data is placed in the physical domain and on disk.  Once it knows that, yt can tell you some statistics about the simulation:"
      ]
     },
     {
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "pf.h.print_stats()"
+      "ds.print_stats()"
      ],
      "language": "python",
      "metadata": {},
@@ -73,7 +74,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "pf.h.field_list"
+      "ds.field_list"
      ],
      "language": "python",
      "metadata": {},
@@ -90,7 +91,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "pf.h.derived_field_list"
+      "ds.derived_field_list"
      ],
      "language": "python",
      "metadata": {},
@@ -107,7 +108,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "print pf.field_info[\"gas\", \"vorticity_x\"].get_source()"
+      "print ds.field_info[\"gas\", \"vorticity_x\"].get_source()"
      ],
      "language": "python",
      "metadata": {},
@@ -124,7 +125,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "print pf.domain_width"
+      "print ds.domain_width"
      ],
      "language": "python",
      "metadata": {},
@@ -141,9 +142,9 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "print pf.domain_width.in_units(\"kpc\")\n",
-      "print pf.domain_width.in_units(\"au\")\n",
-      "print pf.domain_width.in_units(\"mile\")"
+      "print ds.domain_width.in_units(\"kpc\")\n",
+      "print ds.domain_width.in_units(\"au\")\n",
+      "print ds.domain_width.in_units(\"mile\")"
      ],
      "language": "python",
      "metadata": {},
@@ -162,7 +163,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "print pf.h.grid_left_edge"
+      "print ds.index.grid_left_edge"
      ],
      "language": "python",
      "metadata": {},
@@ -172,14 +173,14 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "But, you may have to access information about individual grid objects!  Each grid object mediates accessing data from the disk and has a number of attributes that tell you about it.  The hierarchy (`pf.h` here) has an attribute `grids` which is all of the grid objects."
+      "But, you may have to access information about individual grid objects!  Each grid object mediates accessing data from the disk and has a number of attributes that tell you about it.  The index (`ds.index` here) has an attribute `grids` which is all of the grid objects."
      ]
     },
     {
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "print pf.h.grids[0]"
+      "print ds.index.grids[0]"
      ],
      "language": "python",
      "metadata": {},
@@ -189,7 +190,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "g = pf.h.grids[0]\n",
+      "g = ds.index.grids[0]\n",
       "print g"
      ],
      "language": "python",
@@ -258,7 +259,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "gs = pf.h.select_grids(pf.h.max_level)"
+      "gs = ds.index.select_grids(ds.index.max_level)"
      ],
      "language": "python",
      "metadata": {},
@@ -302,7 +303,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "for f in pf.h.field_list:\n",
+      "for f in ds.field_list:\n",
       "    fv = g[f]\n",
       "    if fv.size == 0: continue\n",
       "    print f, fv.min(), fv.max()"
@@ -326,7 +327,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "sp = pf.h.sphere(\"max\", (10, 'kpc'))"
+      "sp = ds.sphere(\"max\", (10, 'kpc'))"
      ],
      "language": "python",
      "metadata": {},

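Most of this hunk is the mechanical rename from `pf`/`pf.h` to `ds`/`ds.index`; condensed into one sketch, the new spellings used throughout the notebook look like this (same sample dataset assumed):

    from yt.mods import *

    ds = load("IsolatedGalaxy/galaxy0030/galaxy0030")
    ds.print_stats()                    # was pf.h.print_stats()
    print ds.field_list                 # was pf.h.field_list
    g = ds.index.grids[0]               # grid objects now live on ds.index
    sp = ds.sphere("max", (10, 'kpc'))  # data objects hang directly off ds
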
diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/bootcamp/3)_Simple_Visualization.ipynb
--- a/doc/source/bootcamp/3)_Simple_Visualization.ipynb
+++ b/doc/source/bootcamp/3)_Simple_Visualization.ipynb
@@ -1,6 +1,7 @@
 {
  "metadata": {
-  "name": ""
+  "name": "",
+  "signature": "sha256:eb5fbf5eb55a9c8997c687f072c8c6030e74bef0048a72b4f74a06893c11b80a"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -20,7 +21,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "from yt.imods import *"
+      "from yt.mods import *"
      ],
      "language": "python",
      "metadata": {},
@@ -37,8 +38,8 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "pf = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
-      "print \"Redshift =\", pf.current_redshift"
+      "ds = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "print \"Redshift =\", ds.current_redshift"
      ],
      "language": "python",
      "metadata": {},
@@ -57,7 +58,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "p = ProjectionPlot(pf, \"y\", \"density\")\n",
+      "p = ProjectionPlot(ds, \"y\", \"density\")\n",
       "p.show()"
      ],
      "language": "python",
@@ -134,7 +135,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "p = ProjectionPlot(pf, \"z\", [\"density\", \"temperature\"], weight_field=\"density\")\n",
+      "p = ProjectionPlot(ds, \"z\", [\"density\", \"temperature\"], weight_field=\"density\")\n",
       "p.show()"
      ],
      "language": "python",
@@ -169,7 +170,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "v, c = pf.h.find_max(\"density\")\n",
+      "v, c = ds.find_max(\"density\")\n",
       "p.set_center((c[0], c[1]))\n",
       "p.zoom(10)"
      ],
@@ -188,8 +189,8 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "pf = load(\"Enzo_64/DD0043/data0043\")\n",
-      "s = SlicePlot(pf, \"z\", [\"density\", \"velocity_magnitude\"], center=\"max\")\n",
+      "ds = load(\"Enzo_64/DD0043/data0043\")\n",
+      "s = SlicePlot(ds, \"z\", [\"density\", \"velocity_magnitude\"], center=\"max\")\n",
       "s.set_cmap(\"velocity_magnitude\", \"kamae\")\n",
       "s.zoom(10.0)"
      ],
@@ -242,7 +243,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "s = SlicePlot(pf, \"x\", [\"density\"], center=\"max\")\n",
+      "s = SlicePlot(ds, \"x\", [\"density\"], center=\"max\")\n",
       "s.annotate_contour(\"temperature\")\n",
       "s.zoom(2.5)"
      ],
@@ -271,4 +272,4 @@
    "metadata": {}
   }
  ]
-}
\ No newline at end of file
+}

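With the same renames applied, the plotting workflow in this notebook reduces to a few lines; a sketch using the dataset paths from the notebook:

    from yt.mods import *

    ds = load("enzo_tiny_cosmology/DD0046/DD0046")
    p = ProjectionPlot(ds, "y", "density")
    v, c = ds.find_max("density")   # find_max moved from pf.h to ds
    p.set_center((c[0], c[1]))
    p.zoom(10)
    p.save()                        # writes a png named after the dataset
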
diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/bootcamp/4)_Data_Objects_and_Time_Series.ipynb
--- a/doc/source/bootcamp/4)_Data_Objects_and_Time_Series.ipynb
+++ b/doc/source/bootcamp/4)_Data_Objects_and_Time_Series.ipynb
@@ -1,6 +1,7 @@
 {
  "metadata": {
-  "name": ""
+  "name": "",
+  "signature": "sha256:41293a66cd6fd5eae6da2d0343549144dc53d72e83286999faab3cf21d801f51"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -13,14 +14,16 @@
      "source": [
       "# Data Objects and Time Series Data\n",
       "\n",
-      "Just like before, we will load up yt."
+      "Just like before, we will load up yt.  Since we'll be using pylab to plot some data in this notebook, we additionally tell matplotlib to place plots inline inside the notebook."
      ]
     },
     {
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "from yt.imods import *"
+      "%matplotlib inline\n",
+      "from yt.mods import *\n",
+      "from matplotlib import pylab"
      ],
      "language": "python",
      "metadata": {},
@@ -32,7 +35,7 @@
      "source": [
       "## Time Series Data\n",
       "\n",
-      "Unlike before, instead of loading a single dataset, this time we'll load a bunch which we'll examine in sequence.  This command creates a `TimeSeriesData` object, which can be iterated over (including in parallel, which is outside the scope of this bootcamp) and analyzed.  There are some other helpful operations it can provide, but we'll stick to the basics here.\n",
+      "Unlike before, instead of loading a single dataset, this time we'll load a bunch which we'll examine in sequence.  This command creates a `DatasetSeries` object, which can be iterated over (including in parallel, which is outside the scope of this bootcamp) and analyzed.  There are some other helpful operations it can provide, but we'll stick to the basics here.\n",
       "\n",
       "Note that you can specify either a list of filenames, or a glob (i.e., asterisk) pattern in this."
      ]
@@ -41,7 +44,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "ts = TimeSeriesData.from_filenames(\"enzo_tiny_cosmology/*/*.hierarchy\")"
+      "ts = DatasetSeries(\"enzo_tiny_cosmology/*/*.hierarchy\")"
      ],
      "language": "python",
      "metadata": {},
@@ -53,7 +56,7 @@
      "source": [
       "### Example 1: Simple Time Series\n",
       "\n",
-      "As a simple example of how we can use this functionality, let's find the min and max of the density as a function of time in this simulation.  To do this we use the construction `for pf in ts` where `pf` means \"Parameter File\" and `ts` is the \"Time Series\" we just loaded up.  For each parameter file, we'll create an object (`dd`) that covers the entire domain.  (`all_data` is a shorthand function for this.)  We'll then call the Derived Quantity `Extrema`, and append the min and max to our extrema outputs."
+      "As a simple example of how we can use this functionality, let's find the min and max of the density as a function of time in this simulation.  To do this we use the construction `for ds in ts` where `ds` means \"Dataset\" and `ts` is the \"Time Series\" we just loaded up.  For each parameter file, we'll create an object (`dd`) that covers the entire domain.  (`all_data` is a shorthand function for this.)  We'll then call the `extrema` Derived Quantity, and append the min and max to our extrema outputs."
      ]
     },
     {
@@ -62,10 +65,10 @@
      "input": [
       "rho_ex = []\n",
       "times = []\n",
-      "for pf in ts:\n",
-      "    dd = pf.h.all_data()\n",
+      "for ds in ts:\n",
+      "    dd = ds.all_data()\n",
       "    rho_ex.append(dd.quantities.extrema(\"density\"))\n",
-      "    times.append(pf.current_time.in_units(\"ys\"))\n",
+      "    times.append(ds.current_time.in_units(\"Gyr\"))\n",
       "rho_ex = np.array(rho_ex)"
      ],
      "language": "python",
@@ -107,16 +110,16 @@
      "input": [
       "mass = []\n",
       "zs = []\n",
-      "for pf in ts:\n",
-      "    halos = HaloFinder(pf)\n",
-      "    dd = pf.h.all_data()\n",
+      "for ds in ts:\n",
+      "    halos = HaloFinder(ds)\n",
+      "    dd = ds.all_data()\n",
       "    total_mass = dd.quantities.total_quantity(\"cell_mass\").in_units(\"Msun\")\n",
       "    total_in_baryons = 0.0\n",
       "    for halo in halos:\n",
       "        sp = halo.get_sphere()\n",
       "        total_in_baryons += sp.quantities.total_quantity(\"cell_mass\").in_units(\"Msun\")\n",
       "    mass.append(total_in_baryons/total_mass)\n",
-      "    zs.append(pf.current_redshift)"
+      "    zs.append(ds.current_redshift)"
      ],
      "language": "python",
      "metadata": {},
@@ -158,7 +161,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "ray = pf.h.ray([0.1, 0.2, 0.3], [0.9, 0.8, 0.7])\n",
+      "ray = ds.ray([0.1, 0.2, 0.3], [0.9, 0.8, 0.7])\n",
       "pylab.semilogy(ray[\"t\"], ray[\"density\"])"
      ],
      "language": "python",
@@ -208,9 +211,9 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
-      "v, c = pf.h.find_max(\"density\")\n",
-      "sl = pf.h.slice(0, c[0])\n",
+      "ds = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "v, c = ds.find_max(\"density\")\n",
+      "sl = ds.slice(0, c[0])\n",
       "print sl[\"index\", \"x\"], sl[\"index\", \"z\"], sl[\"pdx\"]\n",
       "print sl[\"gas\", \"density\"].shape"
      ],
@@ -222,7 +225,7 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "If we want to do something interesting with a Slice, we can turn it into a `FixedResolutionBuffer`.  This object can be queried and will return a 2D array of values."
+      "If we want to do something interesting with a `Slice`, we can turn it into a `FixedResolutionBuffer`.  This object can be queried and will return a 2D array of values."
      ]
     },
     {
@@ -240,7 +243,7 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "yt provides a few functions for writing arrays to disk, particularly in image form.  Here we'll write out the log of Density, and then use IPython to display it back here.  Note that for the most part, you will probably want to use a `PlotWindow` for this, but in the case that it is useful you can directly manipulate the data."
+      "yt provides a few functions for writing arrays to disk, particularly in image form.  Here we'll write out the log of `density`, and then use IPython to display it back here.  Note that for the most part, you will probably want to use a `PlotWindow` for this, but in the case that it is useful you can directly manipulate the data."
      ]
     },
     {
@@ -270,7 +273,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "cp = pf.h.cutting([0.2, 0.3, 0.5], \"max\")\n",
+      "cp = ds.cutting([0.2, 0.3, 0.5], \"max\")\n",
       "pw = cp.to_pw(fields = [\"density\"])"
      ],
      "language": "python",
@@ -329,7 +332,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "cg = pf.h.covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "cg = ds.covering_grid(2, [0.0, 0.0, 0.0], ds.domain_dimensions * 2**2)\n",
       "print cg[\"density\"].shape"
      ],
      "language": "python",
@@ -347,7 +350,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "scg = pf.h.smoothed_covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "scg = ds.smoothed_covering_grid(2, [0.0, 0.0, 0.0], ds.domain_dimensions * 2**2)\n",
       "print scg[\"density\"].shape"
      ],
      "language": "python",
@@ -358,4 +361,4 @@
    "metadata": {}
   }
  ]
-}
\ No newline at end of file
+}

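The time-series loop updated above is worth seeing in one piece; a minimal script version, with sample data assumed and times in the corrected `in_units("Gyr")`:

    from yt.mods import *
    import numpy as np

    ts = DatasetSeries("enzo_tiny_cosmology/*/*.hierarchy")
    rho_ex, times = [], []
    for ds in ts:
        dd = ds.all_data()          # a region covering the full domain
        rho_ex.append(dd.quantities.extrema("density"))
        times.append(ds.current_time.in_units("Gyr"))
    rho_ex = np.array(rho_ex)       # shape (n_outputs, 2): per-output min, max
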
diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/bootcamp/5)_Derived_Fields_and_Profiles.ipynb
--- a/doc/source/bootcamp/5)_Derived_Fields_and_Profiles.ipynb
+++ b/doc/source/bootcamp/5)_Derived_Fields_and_Profiles.ipynb
@@ -1,6 +1,7 @@
 {
  "metadata": {
-  "name": ""
+  "name": "",
+  "signature": "sha256:a19d451f3b4dcfeed448caa22c2cac35c46958e0646c19c226b1e467b76d0718"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -20,7 +21,9 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "from yt.imods import *"
+      "%matplotlib inline\n",
+      "from yt.mods import *\n",
+      "from matplotlib import pylab"
      ],
      "language": "python",
      "metadata": {},
@@ -32,7 +35,7 @@
      "source": [
       "## Derived Fields\n",
       "\n",
-      "This is an example of the simplest possible way to create a derived field.  All derived fields are defined by a function and some metadata; that metadata can include units, LaTeX-friendly names, conversion factors, and so on.  Fields can be defined in the way in the next cell.  What this does is create a function which accepts two arguments and then provide the units for that field.  In this case, our field is `Dinosaurs` and our units are `Trex/s`.  The function itself can access any fields that are in the simulation, and it does so by requesting data from the object called `data`."
+      "This is an example of the simplest possible way to create a derived field.  All derived fields are defined by a function and some metadata; that metadata can include units, LaTeX-friendly names, conversion factors, and so on.  Fields can be defined in the way in the next cell.  What this does is create a function which accepts two arguments and then provide the units for that field.  In this case, our field is `dinosaurs` and our units are `K*cm/s`.  The function itself can access any fields that are in the simulation, and it does so by requesting data from the object called `data`."
      ]
     },
     {
@@ -58,8 +61,8 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
-      "dd = pf.h.all_data()\n",
+      "ds = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "dd = ds.all_data()\n",
       "print dd.quantities.keys()"
      ],
      "language": "python",
@@ -70,7 +73,7 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "One interesting question is, what are the minimum and maximum values of dinosaur production rates in our isolated galaxy?  We can do that by examining the `Extrema` quantity -- the exact same way that we would for Density, Temperature, and so on."
+      "One interesting question is, what are the minimum and maximum values of dinosaur production rates in our isolated galaxy?  We can do that by examining the `extrema` quantity -- the exact same way that we would for density, temperature, and so on."
      ]
     },
     {
@@ -113,7 +116,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "sp = pf.h.sphere(\"max\", (10.0, 'kpc'))\n",
+      "sp = ds.sphere(\"max\", (10.0, 'kpc'))\n",
       "bv = sp.quantities.bulk_velocity()\n",
       "L = sp.quantities.angular_momentum_vector()\n",
       "rho_min, rho_max = sp.quantities.extrema(\"density\")\n",
@@ -133,7 +136,7 @@
       "\n",
       "We do this using the objects `Profile1D`, `Profile2D`, and `Profile3D`.  The first two are the most common since they are the easiest to visualize.\n",
       "\n",
-      "This first set of commands manually creates a `BinnedProfile1D` from the sphere we created earlier, binned in 32 bins according to density between `rho_min` and `rho_max`, and then takes the density-weighted average of the fields `Temperature` and (previously-defined) `Dinosaurs`.  We then plot it in a loglog plot."
+      "This first set of commands manually creates a profile object the sphere we created earlier, binned in 32 bins according to density between `rho_min` and `rho_max`, and then takes the density-weighted average of the fields `temperature` and (previously-defined) `dinosaurs`.  We then plot it in a loglog plot."
      ]
     },
     {
@@ -152,7 +155,7 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "Now we plot the `Dinosaurs` field."
+      "Now we plot the `dinosaurs` field."
      ]
     },
     {
@@ -197,10 +200,10 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "sp_small = pf.h.sphere(\"max\", (50.0, 'kpc'))\n",
+      "sp_small = ds.sphere(\"max\", (50.0, 'kpc'))\n",
       "bv = sp_small.quantities[\"BulkVelocity\"]()\n",
       "\n",
-      "sp = pf.h.sphere(\"max\", (0.1, 'Mpc'))\n",
+      "sp = ds.sphere(\"max\", (0.1, 'Mpc'))\n",
       "rv1 = sp.quantities[\"Extrema\"](\"radial_velocity\")\n",
       "\n",
       "sp.clear_data()\n",

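The notebook's `dinosaurs` definition falls outside this hunk, but the renamed field and its `K*cm/s` units pin down its shape. A hypothetical definition consistent with that text (the product of temperature and speed is one field that carries those units):

    from yt.mods import *

    # hypothetical field body; anything returning temperature * speed
    # would carry the K*cm/s units quoted in the notebook
    def _dinosaurs(field, data):
        return data["temperature"] * data["velocity_magnitude"]

    add_field("dinosaurs", units="K*cm/s", function=_dinosaurs)

    ds = load("IsolatedGalaxy/galaxy0030/galaxy0030")
    dd = ds.all_data()
    print dd.quantities.extrema("dinosaurs")
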
diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/bootcamp/6)_Volume_Rendering.ipynb
--- a/doc/source/bootcamp/6)_Volume_Rendering.ipynb
+++ b/doc/source/bootcamp/6)_Volume_Rendering.ipynb
@@ -1,6 +1,7 @@
 {
  "metadata": {
-  "name": ""
+  "name": "",
+  "signature": "sha256:2929940fc3977b495aa124dee851f7602d61e073ed65407dd95e7cf597684b35"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -20,9 +21,8 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "from yt.imods import *\n",
-      "import yt.units as u\n",
-      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+      "from yt.mods import *\n",
+      "ds = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
      ],
      "language": "python",
      "metadata": {},
@@ -45,7 +45,7 @@
      "input": [
       "tf = ColorTransferFunction((-28, -24))\n",
       "tf.add_layers(4, w=0.01)\n",
-      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], (20, 'kpc'), 512, tf, fields=[\"density\"])\n",
+      "cam = ds.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], (20, 'kpc'), 512, tf, fields=[\"density\"])\n",
       "cam.show()"
      ],
      "language": "python",
@@ -56,7 +56,7 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "If we want to apply a clipping, we can specify the `clip_ratio`.  This will clip the upper bounds to this value times the `std()` of the image array."
+      "If we want to apply a clipping, we can specify the `clip_ratio`.  This will clip the upper bounds to this value times the standard deviation of the values in the image array."
      ]
     },
     {
@@ -82,7 +82,7 @@
      "input": [
       "tf = ColorTransferFunction((-28, -25))\n",
       "tf.add_layers(4, w=0.03)\n",
-      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], (20.0, 'kpc'), 512, tf, no_ghost=False)\n",
+      "cam = ds.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], (20.0, 'kpc'), 512, tf, no_ghost=False)\n",
       "cam.show(clip_ratio=4.0)"
      ],
      "language": "python",

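As in the other notebooks, the volume rendering interface moves from `pf.h.camera` to `ds.camera`; a compact sketch of the workflow in this hunk, with transfer-function bounds in log10 of density per the notebook:

    from yt.mods import *

    ds = load("IsolatedGalaxy/galaxy0030/galaxy0030")
    tf = ColorTransferFunction((-28, -24))  # bounds in log10(g/cm**3)
    tf.add_layers(4, w=0.01)                # four isocontour-like layers
    cam = ds.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], (20, 'kpc'), 512, tf,
                    fields=["density"])
    cam.show(clip_ratio=4.0)  # clip bright pixels at 4 standard deviations
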
diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/cookbook/aligned_cutting_plane.py
--- a/doc/source/cookbook/aligned_cutting_plane.py
+++ b/doc/source/cookbook/aligned_cutting_plane.py
@@ -6,7 +6,7 @@
 # Create a 15 kpc radius sphere, centered on the domain center.  Note that this
 # sphere is very small compared to the size of our final plot, and it has a
 # non-axially aligned L vector.
-sp = pf.h.sphere("center", (15.0, "kpc"))
+sp = pf.sphere("center", (15.0, "kpc"))
 
 # Get the angular momentum vector for the sphere.
 L = sp.quantities.angular_momentum_vector()

diff -r 080a6a7e0e745a354cb65dd00cb7447ddb9f0e29 -r 15c6f41cb4624b680f8170c11caec6e9cd19623e doc/source/cookbook/boolean_data_objects.py
--- a/doc/source/cookbook/boolean_data_objects.py
+++ b/doc/source/cookbook/boolean_data_objects.py
@@ -2,22 +2,22 @@
 
 pf = load("Enzo_64/DD0043/data0043") # load data
 # Make a few data objects to start.
-re1 = pf.h.region([0.5, 0.5, 0.5], [0.4, 0.4, 0.4], [0.6, 0.6, 0.6])
-re2 = pf.h.region([0.5, 0.5, 0.5], [0.5, 0.5, 0.5], [0.6, 0.6, 0.6])
-sp1 = pf.h.sphere([0.5, 0.5, 0.5], 0.05)
-sp2 = pf.h.sphere([0.1, 0.2, 0.3], 0.1)
+re1 = pf.region([0.5, 0.5, 0.5], [0.4, 0.4, 0.4], [0.6, 0.6, 0.6])
+re2 = pf.region([0.5, 0.5, 0.5], [0.5, 0.5, 0.5], [0.6, 0.6, 0.6])
+sp1 = pf.sphere([0.5, 0.5, 0.5], 0.05)
+sp2 = pf.sphere([0.1, 0.2, 0.3], 0.1)
 # The "AND" operator. This will make a region identical to re2.
-bool1 = pf.h.boolean([re1, "AND", re2])
+bool1 = pf.boolean([re1, "AND", re2])
 xp = bool1["particle_position_x"]
 # The "OR" operator. This will make a region identical to re1.
-bool2 = pf.h.boolean([re1, "OR", re2])
+bool2 = pf.boolean([re1, "OR", re2])
 # The "NOT" operator. This will make a region like re1, but with the corner
 # that re2 covers cut out.
-bool3 = pf.h.boolean([re1, "NOT", re2])
+bool3 = pf.boolean([re1, "NOT", re2])
 # Disjoint regions can be combined with the "OR" operator.
-bool4 = pf.h.boolean([sp1, "OR", sp2])
+bool4 = pf.boolean([sp1, "OR", sp2])
 # Find oddly-shaped overlapping regions.
-bool5 = pf.h.boolean([re2, "AND", sp1])
+bool5 = pf.boolean([re2, "AND", sp1])
 # Nested logic with parentheses.
 # This is re1 with the oddly-shaped region cut out.
-bool6 = pf.h.boolean([re1, "NOT", "(", re1, "AND", sp1, ")"])
+bool6 = pf.boolean([re1, "NOT", "(", re1, "AND", sp1, ")"])

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt/commits/49796e66b871/
Changeset:   49796e66b871
Branch:      yt-3.0
User:        MatthewTurk
Date:        2014-04-02 21:27:22
Summary:     Attempting to vectorize the annotate_triangles.
Affected #:  1 file

diff -r 15c6f41cb4624b680f8170c11caec6e9cd19623e -r 49796e66b871558d52782a49b8c29e780eb86ca0 yt/visualization/plot_modifications.py
--- a/yt/visualization/plot_modifications.py
+++ b/yt/visualization/plot_modifications.py
@@ -1368,13 +1368,13 @@
     def __call__(self, plot):
         plot._axes.hold(True)
         xax, yax = x_dict[plot.data.axis], y_dict[plot.data.axis]
-        l_cy = triangle_plane_intersect(plot.data.axis, plot.data.coord, self.vertices)[:,:,(xax, yax)]
-        # Convert numpy array to a YT array
-        l_cy = [YTArray(line, input_units="code_length") for line in l_cy]
-        # Convert points individually
-        for line in l_cy:
-            line[0] = self.convert_to_plot(plot,line[0])
-            line[1] = self.convert_to_plot(plot,line[1])
+        if not hasattr(self.vertices, "in_units"):
+            vertices = plot.data.pf.arr(self.vertices, "code_length")
+        else:
+            vertices = self.vertices
+        l_cy = triangle_plane_intersect(plot.data.axis, plot.data.coord, vertices)[:,:,(xax, yax)]
+        l_cy[:,0,:] = self.convert_to_plot(plot, l_cy[:,0,:])
+        l_cy[:,1,:] = self.convert_to_plot(plot, l_cy[:,1,:])
         # create the line collection using the new points
         lc = matplotlib.collections.LineCollection(l_cy, **self.plot_args)
         plot._axes.add_collection(lc)


https://bitbucket.org/yt_analysis/yt/commits/6de5eaad0968/
Changeset:   6de5eaad0968
Branch:      yt-3.0
User:        pshriwise
Date:        2014-04-04 20:47:50
Summary:     Updated the process for converting triangle plane intersection points to plot coordinates.
Affected #:  1 file

diff -r 49796e66b871558d52782a49b8c29e780eb86ca0 -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 yt/visualization/plot_modifications.py
--- a/yt/visualization/plot_modifications.py
+++ b/yt/visualization/plot_modifications.py
@@ -1373,9 +1373,14 @@
         else:
             vertices = self.vertices
         l_cy = triangle_plane_intersect(plot.data.axis, plot.data.coord, vertices)[:,:,(xax, yax)]
-        l_cy[:,0,:] = self.convert_to_plot(plot, l_cy[:,0,:])
-        l_cy[:,1,:] = self.convert_to_plot(plot, l_cy[:,1,:])
-        # create the line collection using the new points
+        # reformat for conversion to plot coordinates
+        l_cy = np.rollaxis(l_cy,0,3)
+        # convert all line starting points
+        l_cy[0] = self.convert_to_plot(plot,l_cy[0])
+        # convert all line ending points
+        l_cy[1] = self.convert_to_plot(plot,l_cy[1])
+        # roll the segment axis back to the front
+        l_cy = np.rollaxis(l_cy,2,0)
+        # create line collection and add it to the plot
         lc = matplotlib.collections.LineCollection(l_cy, **self.plot_args)
         plot._axes.add_collection(lc)
         plot._axes.hold(False)

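The reshaping trick in this commit is easier to see on a bare array: `np.rollaxis` moves the endpoint axis to the front so each set of endpoints can be converted in a single vectorized call, then rolls the segment axis back. A toy sketch with a stand-in for `convert_to_plot`:

    import numpy as np

    # fake intersection output: 5 segments x 2 endpoints x 2 plot coordinates
    l_cy = np.random.random((5, 2, 2))

    def to_plot(points):
        # stand-in for convert_to_plot: any vectorized coordinate map works
        return 2.0 * points - 1.0

    l_cy = np.rollaxis(l_cy, 0, 3)  # -> (endpoint, coord, segment)
    l_cy[0] = to_plot(l_cy[0])      # convert all starting points at once
    l_cy[1] = to_plot(l_cy[1])      # convert all ending points at once
    l_cy = np.rollaxis(l_cy, 2, 0)  # -> back to (segment, endpoint, coord)
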

https://bitbucket.org/yt_analysis/yt/commits/fc533325d312/
Changeset:   fc533325d312
Branch:      yt-3.0
User:        MatthewTurk
Date:        2014-07-29 01:15:27
Summary:     Merging from Pat
Affected #:  605 files

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -7,6 +7,7 @@
 rockstar.cfg
 yt_updater.log
 yt/frontends/artio/_artio_caller.c
+yt/analysis_modules/halo_finding/rockstar/rockstar_groupies.c
 yt/analysis_modules/halo_finding/rockstar/rockstar_interface.c
 yt/frontends/ramses/_ramses_reader.cpp
 yt/frontends/sph/smoothing_kernel.c
@@ -41,6 +42,7 @@
 yt/utilities/lib/PointsInVolume.c
 yt/utilities/lib/QuadTree.c
 yt/utilities/lib/RayIntegrators.c
+yt/utilities/lib/ragged_arrays.c
 yt/utilities/lib/VolumeIntegrator.c
 yt/utilities/lib/grid_traversal.c
 yt/utilities/lib/GridTree.c

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 .hgtags
--- a/.hgtags
+++ b/.hgtags
@@ -5160,3 +5160,4 @@
 954d1ffcbf04c3d1b394c2ea05324d903a9a07cf yt-3.0a2
 f4853999c2b5b852006d6628719c882cddf966df yt-3.0a3
 079e456c38a87676472a458210077e2be325dc85 last_gplv3
+f327552a6ede406b82711fb800ebcd5fe692d1cb yt-3.0a4

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 CREDITS
--- a/CREDITS
+++ b/CREDITS
@@ -2,15 +2,21 @@
 
 Contributors:   
                 Tom Abel (tabel at stanford.edu)
-                David Collins (dcollins at physics.ucsd.edu)
+                Gabriel Altay (gabriel.altay at gmail.com)
+                Kenza Arraki (karraki at gmail.com)
+                Alex Bogert (fbogert at ucsc.edu)
+                David Collins (dcollins4096 at gmail.com)
                 Brian Crosby (crosby.bd at gmail.com)
                 Andrew Cunningham (ajcunn at gmail.com)
+                Miguel de Val-Borro (miguel.deval at gmail.com)
                 Hilary Egan (hilaryye at gmail.com)
                 John Forbes (jforbes at ucolick.org)
+                Sam Geen (samgeen at gmail.com)
                 Nathan Goldbaum (goldbaum at ucolick.org)
                 Markus Haider (markus.haider at uibk.ac.at)
                 Cameron Hummels (chummels at gmail.com)
                 Christian Karch (chiffre at posteo.de)
+                Ben W. Keller (kellerbw at mcmaster.ca)
                 Ji-hoon Kim (me at jihoonkim.org)
                 Steffen Klemer (sklemer at phys.uni-goettingen.de)
                 Kacper Kowalik (xarthisius.kk at gmail.com)
@@ -21,18 +27,23 @@
                 Chris Malone (chris.m.malone at gmail.com)
                 Josh Maloney (joshua.moloney at colorado.edu)
                 Chris Moody (cemoody at ucsc.edu)
+                Stuart Mumford (stuart at mumford.me.uk)
                 Andrew Myers (atmyers at astro.berkeley.edu)
                 Jill Naiman (jnaiman at ucolick.org)
+                Desika Narayanan (dnarayan at haverford.edu)
                 Kaylea Nelson (kaylea.nelson at yale.edu)
                 Jeff Oishi (jsoishi at gmail.com)
+                Brian O'Shea (bwoshea at gmail.com)
                 Jean-Claude Passy (jcpassy at uvic.ca)
+                John Regan (john.regan at helsinki.fi)
                 Mark Richardson (Mark.L.Richardson at asu.edu)
                 Thomas Robitaille (thomas.robitaille at gmail.com)
                 Anna Rosen (rosen at ucolick.org)
                 Douglas Rudd (drudd at uchicago.edu)
                 Anthony Scopatz (scopatz at gmail.com)
                 Noel Scudder (noel.scudder at stonybrook.edu)
-                Devin Silvia (devin.silvia at colorado.edu)
+                Pat Shriwise (shriwise at wisc.edu)
+                Devin Silvia (devin.silvia at gmail.com)
                 Sam Skillman (samskillman at gmail.com)
                 Stephen Skory (s at skory.us)
                 Britton Smith (brittonsmith at gmail.com)
@@ -42,8 +53,10 @@
                 Stephanie Tonnesen (stonnes at gmail.com)
                 Matthew Turk (matthewturk at gmail.com)
                 Rich Wagner (rwagner at physics.ucsd.edu)
+                Michael S. Warren (mswarren at gmail.com)
                 Andrew Wetzel (andrew.wetzel at yale.edu)
                 John Wise (jwise at physics.gatech.edu)
+                Michael Zingale (michael.zingale at stonybrook.edu)
                 John ZuHone (jzuhone at gmail.com)
 
 Several items included in the yt/extern directory were written by other

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 MANIFEST.in
--- a/MANIFEST.in
+++ b/MANIFEST.in
@@ -7,4 +7,8 @@
 include doc/extensions/README doc/Makefile
 prune doc/source/reference/api/generated
 prune doc/build/
-recursive-include yt/utilities/kdtree *.f90 *.v Makefile LICENSE
+recursive-include yt/analysis_modules/halo_finding/rockstar *.py *.pyx
+prune yt/frontends/_skeleton
+prune tests
+graft yt/gui/reason/html/resources
+exclude clean.sh .hgchurn

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/README
--- a/doc/README
+++ b/doc/README
@@ -5,6 +5,6 @@
 http://sphinx.pocoo.org/
 
 Because the documentation requires a number of dependencies, we provide
-pre-build versions online, accessible here:
+pre-built versions online, accessible here:
 
-http://yt-project.org/docs/
+http://yt-project.org/docs/dev-3.0/

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/cheatsheet.tex
--- a/doc/cheatsheet.tex
+++ b/doc/cheatsheet.tex
@@ -3,7 +3,7 @@
 \usepackage{calc}
 \usepackage{ifthen}
 \usepackage[landscape]{geometry}
-\usepackage[colorlinks = true, linkcolor=blue, citecolor=blue, urlcolor=blue]{hyperref}
+\usepackage[hyphens]{url}
 
 % To make this come out properly in landscape mode, do one of the following
 % 1.
@@ -101,9 +101,13 @@
 Documentation \url{http://yt-project.org/doc/index.html}.
 Need help? Start here \url{http://yt-project.org/doc/help/} and then
 try the IRC chat room \url{http://yt-project.org/irc.html},
-or the mailing list \url{http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org}.
-{\bf Installing yt:} The easiest way to install yt is to use the installation script
-found on the yt homepage or the docs linked above.
+or the mailing list \url{http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org}. \\
+
+\subsection{Installing yt} The easiest way to install yt is to use the
+installation script found on the yt homepage or the docs linked above.  If you
+already have python set up with \texttt{numpy}, \texttt{scipy},
+\texttt{matplotlib}, \texttt{h5py}, and \texttt{cython}, you can also use
+\texttt{pip install yt}.
 
 \subsection{Command Line yt}
 yt, and its convenience functions, are launched from a command line prompt.
@@ -118,9 +122,8 @@
 \texttt{yt stats} {\it dataset} \textemdash\ Print stats of a dataset. \\
 \texttt{yt update} \textemdash\ Update yt to most recent version.\\
 \texttt{yt update --all} \textemdash\ Update yt and dependencies to most recent version. \\
-\texttt{yt instinfo} \textemdash\ yt installation information. \\
+\texttt{yt version} \textemdash\ yt installation information. \\
 \texttt{yt notebook} \textemdash\ Run the IPython notebook server. \\
-\texttt{yt serve} ({\it dataset}) \textemdash\  Run yt-specific web GUI ({\it dataset} is optional).\\
 \texttt{yt upload\_image} {\it image.png} \textemdash\ Upload PNG image to imgur.com. \\
 \texttt{yt upload\_notebook} {\it notebook.nb} \textemdash\ Upload IPython notebook to hub.yt-project.org.\\
 \texttt{yt plot} {\it dataset} \textemdash\ Create a set of images.\\
@@ -132,16 +135,8 @@
  paste.yt-project.org. \\ 
 \texttt{yt pastebin\_grab} {\it identifier} \textemdash\ Print content of pastebin to
  STDOUT. \\
- \texttt{yt hub\_register} \textemdash\ Register with
-hub.yt-project.org. \\
-\texttt{yt hub\_submit} \textemdash\ Submit hg repo to
-hub.yt-project.org. \\
-\texttt{yt bootstrap\_dev} \textemdash\ Bootstrap a yt 
-development environment. \\
 \texttt{yt bugreport} \textemdash\ Report a yt bug. \\
 \texttt{yt hop} {\it dataset} \textemdash\  Run hop on a dataset. \\
-\texttt{yt rpdb} \textemdash\ Connect to running rpd 
- session. 
 
 \subsection{yt Imports}
 In order to use yt, Python must load the relevant yt modules into memory.
@@ -149,37 +144,40 @@
 used as part of a script.
 \newlength{\MyLen}
 \settowidth{\MyLen}{\texttt{letterpaper}/\texttt{a4paper} \ }
-\texttt{from yt.mods import \textasteriskcentered}  \textemdash\ 
-Load base yt  modules. \\
+\texttt{import yt}  \textemdash\ 
+Load yt. \\
 \texttt{from yt.config import ytcfg}  \textemdash\ 
 Used to set yt configuration options.
- If used, must be called before importing any other module.\\
-\texttt{from yt.analysis\_modules.api import \textasteriskcentered}   \textemdash\ 
-Load all yt analysis modules. \\
+If used, must be called before importing any other module.\\
 \texttt{from yt.analysis\_modules.\emph{halo\_finding}.api import \textasteriskcentered}  \textemdash\ 
 Load halo finding modules. Other modules
 are loaded in a similar way by swapping the 
 {\em emphasized} text.
 See the \textbf{Analysis Modules} section for a listing and short descriptions of each.
 
-\subsection{Numpy Arrays}
-Simulation data in yt is returned in Numpy arrays. The Numpy package provides a wealth of built-in
-functions that operate on Numpy arrays. Here is a very brief list of some useful ones.
-Please see \url{http://docs.scipy.org/doc/numpy/reference/} for the full
-numpy documentation.\\
-\settowidth{\MyLen}{\texttt{multicol} }
+\subsection{YTArray}
+Simulation data in yt is returned as a YTArray.  YTArray is a numpy array that
+has unit data attached to it and can automatically handle unit conversions and
+detect unit errors. Just like a numpy array, YTArray provides a wealth of
+built-in functions to calculate properties of the data in the array. Here is a
+very brief list of some useful ones.
+\settowidth{\MyLen}{\texttt{multicol} }\\
+\texttt{v = a.in\_cgs()} \textemdash\ Return the array in CGS units \\
+\texttt{v = a.in\_units('Msun/pc**3')} \textemdash\ Return the array in solar masses per cubic parsec \\ 
 \texttt{v = a.max(), a.min()} \textemdash\ Return maximum, minimum of \texttt{a}. \\
-\texttt{index = a.argmax(), a.argmin()} \textemdash\ Return index of max, 
+\texttt{index = a.argmax(), a.argmin()} \textemdash\ Return index of max,
 min value of \texttt{a}.\\
 \texttt{v = a[}{\it index}\texttt{]} \textemdash\ Select a single value from \texttt{a} at location {\it index}.\\
-\texttt{b = a[}{\it i:j}\texttt{]} \textemdash\ Select the slice of values from \texttt{a} between
+\texttt{b = a[}{\it i:j}\texttt{]} \textemdash\ Select the slice of values from
+\texttt{a} between
 locations {\it i} to {\it j-1} saved to a new Numpy array \texttt{b} with length {\it j-i}. \\
-\texttt{sel = (a > const)}  \textemdash\ Create a new boolean Numpy array \texttt{sel}, of the same shape as \texttt{a},
+\texttt{sel = (a > const)} \textemdash\ Create a new boolean Numpy array
+\texttt{sel}, of the same shape as \texttt{a},
 that marks which values of \texttt{a > const}. Other operators (e.g. \textless, !=, \%) work as well.\\
-\texttt{b = a[sel]} \textemdash\ Create a new Numpy array \texttt{b} made up of elements from \texttt{a} that correspond to elements of \texttt{sel}
+\texttt{b = a[sel]} \textemdash\ Create a new Numpy array \texttt{b} made up of
+elements from \texttt{a} that correspond to elements of \texttt{sel}
 that are {\it True}. In the above example \texttt{b} would be all elements of \texttt{a} that are greater than \texttt{const}.\\
-\texttt{a.dump({\it filename.dat})} \textemdash\ Save \texttt{a} to the binary file {\it filename.dat}.\\
-\texttt{a = np.load({\it filename.dat})} \textemdash\ Load the contents of {\it filename.dat} into \texttt{a}.
+\texttt{a.write\_hdf5({\it filename.h5})} \textemdash\ Save \texttt{a} to the hdf5 file {\it filename.h5}.\\
 
 \subsection{IPython Tips}
 \settowidth{\MyLen}{\texttt{multicol} }
@@ -196,6 +194,7 @@
 \texttt{\%hist} \textemdash\ Print recent command history.\\
 \texttt{\%quickref} \textemdash\ Print IPython quick reference.\\
 \texttt{\%pdb} \textemdash\ Automatically enter the Python debugger at an exception.\\
+\texttt{\%debug} \textemdash\ Drop into a debugger at the location of the last unhandled exception. \\
 \texttt{\%time, \%timeit} \textemdash\ Find running time of expressions for benchmarking.\\
 \texttt{\%lsmagic} \textemdash\ List all available IPython magics. Hint: \texttt{?} works with magics.\\
 
@@ -208,68 +207,52 @@
 After that, simulation data is generally accessed in yt using {\it Data Containers} which are Python objects
 that define a region of simulation space from which data should be selected.
 \settowidth{\MyLen}{\texttt{multicol} }
-\texttt{pf = load(}{\it dataset}\texttt{)} \textemdash\   Reference a single snapshot.\\
-\texttt{dd = pf.h.all\_data()} \textemdash\ Select the entire volume.\\
-\texttt{a = dd[}{\it field\_name}\texttt{]} \textemdash\ Saves the contents of {\it field} into the
-numpy array \texttt{a}. Similarly for other data containers.\\
-\texttt{pf.h.field\_list} \textemdash\ A list of available fields in the snapshot. \\
-\texttt{pf.h.derived\_field\_list} \textemdash\ A list of available derived fields
+\texttt{ds = yt.load(}{\it dataset}\texttt{)} \textemdash\   Reference a single snapshot.\\
+\texttt{dd = ds.all\_data()} \textemdash\ Select the entire volume.\\
+\texttt{a = dd[}{\it field\_name}\texttt{]} \textemdash\ Copies the contents of {\it field} into the
+YTArray \texttt{a}. Similarly for other data containers.\\
+\texttt{ds.field\_list} \textemdash\ A list of available fields in the snapshot. \\
+\texttt{ds.derived\_field\_list} \textemdash\ A list of available derived fields
 in the snapshot. \\
-\texttt{val, loc = pf.h.find\_max("Density")} \textemdash\ Find the \texttt{val}ue of
+\texttt{val, loc = ds.find\_max("Density")} \textemdash\ Find the \texttt{val}ue of
 the maximum of the field \texttt{Density} and its \texttt{loc}ation. \\
-\texttt{sp = pf.sphere(}{\it cen}\texttt{,}{\it radius}\texttt{)} \textemdash\   Create a spherical data 
+\texttt{sp = ds.sphere(}{\it cen}\texttt{,}{\it radius}\texttt{)} \textemdash\   Create a spherical data 
 container. {\it cen} may be a coordinate, or ``max'' which 
 centers on the max density point. {\it radius} may be a float in 
 code units or a tuple of ({\it length, unit}).\\
 
-\texttt{re = pf.region({\it cen}, {\it left edge}, {\it right edge})} \textemdash\ Create a
+\texttt{re = ds.region({\it cen}, {\it left edge}, {\it right edge})} \textemdash\ Create a
 rectilinear data container. {\it cen} is required but not used.
 {\it left} and {\it right edge} are coordinate values that define the region.
 
-\texttt{di = pf.disk({\it cen}, {\it normal}, {\it radius}, {\it height})} \textemdash\ 
+\texttt{di = ds.disk({\it cen}, {\it normal}, {\it radius}, {\it height})} \textemdash\ 
 Create a cylindrical data container centered at {\it cen} along the 
 direction set by {\it normal}, with total length
  2$\times${\it height} and with radius {\it radius}. \\
  
- \texttt{bl = pf.boolean({\it constructor})} \textemdash\ Create a boolean data
- container. {\it constructor} is a list of pre-defined non-boolean 
- data containers with nested boolean logic using the
- ``AND'', ``NOT'', or ``OR'' operators. E.g. {\it constructor=}
- {\it [sp, ``NOT'', (di, ``OR'', re)]} gives a volume defined
- by {\it sp} minus the patches covered by {\it di} and {\it re}.\\
- 
-\texttt{pf.h.save\_object(sp, {\it ``sp\_for\_later''})} \textemdash\ Save an object (\texttt{sp}) for later use.\\
-\texttt{sp = pf.h.load\_object({\it ``sp\_for\_later''})} \textemdash\ Recover a saved object.\\
+\texttt{ds.save\_object(sp, {\it ``sp\_for\_later''})} \textemdash\ Save an object (\texttt{sp}) for later use.\\
+\texttt{sp = ds.load\_object({\it ``sp\_for\_later''})} \textemdash\ Recover a saved object.\\
 
 
-\subsection{Defining New Fields \& Quantities}
-\texttt{yt} expects on-disk fields, fields generated on-demand and in-memory. Quantities reduce a field (e.g. "Density") defined over an object (e.g. "sphere") to get a single value (e.g. "Mass"). \\
-\texttt{def \_MetalMassMsun({\it field},{\it data})}\\
-\texttt{\hspace{4 mm} return data["Metallicity"]*data["CellMassMsun"]}\\
-\texttt{add\_field("MetalMassMsun",function=\_MetalMassMsun)}\\
-Define a new quantity; note the first function operates on grids and data objects and the second on the results of the first. \\
-\texttt{def \_TotalMass(data): }\\
-\texttt{\hspace{4 mm} baryon\_mass = data["CellMassMsun"].sum()}\\
-\texttt{\hspace{4 mm} particle\_mass = data["ParticleMassMsun"].sum()}\\
-\texttt{\hspace{4 mm} return baryon\_mass, particle\_mass}\\
-\texttt{def \_combTotalMass(data, baryon\_mass, particle\_mass):}\\
-\texttt{\hspace{4 mm} return baryon\_mass.sum() + particle\_mass.sum()}\\
-\texttt{add\_quantity("TotalMass", function=\_TotalMass,}\\
-\texttt{\hspace{4 mm} combine\_function=\_combTotalMass, n\_ret = 2)}\\
-
-
+\subsection{Defining New Fields}
+\texttt{yt} supports on-disk fields, fields generated on-demand, and in-memory fields.
+Fields can be created either before a dataset is loaded, using \texttt{add\_field}:
+\texttt{def \_metal\_mass({\it field},{\it data})}\\
+\texttt{\hspace{4 mm} return data["metallicity"]*data["cell\_mass"]}\\
+\texttt{add\_field("metal\_mass", units='g', function=\_metal\_mass)}\\
+Or added to an existing dataset using \texttt{ds.add\_field}:
+\texttt{ds.add\_field("metal\_mass", units='g', function=\_metal\_mass)}\\
 
 \subsection{Slices and Projections}
 \settowidth{\MyLen}{\texttt{multicol} }
-\texttt{slc = SlicePlot(pf, {\it axis}, {\it field}, {\it center=}, {\it width=}, {\it weight\_field=}, {\it additional parameters})} \textemdash\ Make a slice plot
-perpendicular to {\it axis} of {\it field} weighted by {\it weight\_field} at (code-units) {\it center} with 
-{\it width} in code units or a (value, unit) tuple. Hint: try {\it SlicePlot?} in IPython to see additional parameters.\\
+\texttt{slc = yt.SlicePlot(ds, {\it axis or normal vector}, {\it field}, {\it center=}, {\it width=}, {\it weight\_field=}, {\it additional parameters})} \textemdash\ Make a slice plot
+perpendicular to {\it axis} (specified via 'x', 'y', or 'z') or a normal vector for an off-axis slice of {\it field} weighted by {\it weight\_field} at (code-units) {\it center} with 
+{\it width} in code units or a (value, unit) tuple. Hint: try {\it yt.SlicePlot?} in IPython to see additional parameters.\\
 \texttt{slc.save({\it file\_prefix})} \textemdash\ Save the slice to a png with name prefix {\it file\_prefix}.
 \texttt{.save()} works similarly for the commands below.\\
 
-\texttt{prj = ProjectionPlot(pf, {\it axis}, {\it field}, {\it addit. params})} \textemdash\ Make a projection. \\
-\texttt{prj = OffAxisSlicePlot(pf, {\it normal}, {\it fields}, {\it center=}, {\it width=}, {\it depth=},{\it north\_vector=},{\it weight\_field=})} \textemdash Make an off-axis slice. Note this takes an array of fields. \\
-\texttt{prj = OffAxisProjectionPlot(pf, {\it normal}, {\it fields}, {\it center=}, {\it width=}, {\it depth=},{\it north\_vector=},{\it weight\_field=})} \textemdash Make an off axis projection. Note this takes an array of fields. \\
+\texttt{prj = yt.ProjectionPlot(ds, {\it axis}, {\it field}, {\it addit. params})} \textemdash\ Make a projection. \\
+\texttt{prj = yt.OffAxisProjectionPlot(ds, {\it normal}, {\it fields}, {\it center=}, {\it width=}, {\it depth=},{\it north\_vector=},{\it weight\_field=})} \textemdash Make an off axis projection. Note this takes an array of fields. \\
 
 \subsection{Plot Annotations}
 \settowidth{\MyLen}{\texttt{multicol} }
@@ -299,51 +282,37 @@
 The \texttt{my\_plugins.py} file \textemdash\ Add functions, derived fields, constants, or other commonly-used Python code to yt.
 
 
-
-
 \subsection{Analysis Modules}
 \settowidth{\MyLen}{\texttt{multicol}}
 The import name for each module is listed at the end of each description (see \textbf{yt Imports}).
 
 \texttt{Absorption Spectrum} \textemdash\ (\texttt{absorption\_spectrum}). \\
 \texttt{Clump Finder} \textemdash\ Find clumps defined by density thresholds (\texttt{level\_sets}). \\
-\texttt{Coordinate Transformation} \textemdash\ (\texttt{coordinate\_transformation}). \\
 \texttt{Halo Finding} \textemdash\ Locate halos of dark matter particles (\texttt{halo\_finding}). \\
-\texttt{Halo Mass Function} \textemdash\ Find halo mass functions from data and from theory (\texttt{halo\_mass\_function}). \\
-\texttt{Halo Profiling} \textemdash\ Profile and project multiple halos (\texttt{halo\_profiler}). \\
-\texttt{Halo Merger Tree} \textemdash\ Create a database of halo mergers (\texttt{halo\_merger\_tree}). \\
 \texttt{Light Cone Generator} \textemdash\ Stitch datasets together to perform analysis over cosmological volumes. \\
 \texttt{Light Ray Generator} \textemdash\ Analyze the path of light rays.\\
-\texttt{Radial Column Density} \textemdash\ Calculate column densities around a point (\texttt{radial\_column\_density}). \\
 \texttt{Rockstar Halo Finding} \textemdash\ Locate halos of dark matter using the Rockstar halo finder (\texttt{halo\_finding.rockstar}). \\
 \texttt{Star Particle Analysis} \textemdash\ Analyze star formation history and assemble spectra (\texttt{star\_analysis}). \\
 \texttt{Sunrise Exporter} \textemdash\ Export data to the sunrise visualization format (\texttt{sunrise\_export}). \\
-\texttt{Two Point Functions} \textemdash\ Two point correlations (\texttt{two\_point\_functions}). \\
 
 
 \subsection{Parallel Analysis}
-\settowidth{\MyLen}{\texttt{multicol}}
-Nearly all of yt is parallelized using MPI.
-The {\it mpi4py} package must be installed for parallelism in yt.
-To install {\it pip install mpi4py} on the command line usually works.
+\settowidth{\MyLen}{\texttt{multicol}} 
+Nearly all of yt is parallelized using
+MPI.  The {\it mpi4py} package must be installed for parallelism in yt.  To
+install it, running {\it pip install mpi4py} on the command line usually works.
 Execute python in parallel similar to this:\\
-{\it mpirun -n 12 python script.py --parallel}\\
-This command may differ for each system on which you use yt;
-please consult the system documentation for details on how to run parallel applications.
+{\it mpirun -n 12 python script.py}\\
+The file \texttt{script.py} must call \texttt{yt.enable\_parallelism()} to
+turn on yt's parallelism.  If this doesn't happen, all cores will execute the
+same serial yt script.  This command may differ for each system on which you use
+yt; please consult the system documentation for details on how to run parallel
+applications.
 
-\texttt{from yt.pmods import *} \textemdash\ Load yt faster when in parallel.
-This replaces the usual \texttt{from yt.mods import *}.\\
 \texttt{parallel\_objects()} \textemdash\ A way to parallelize analysis over objects
 (such as halos or clumps).\\
 
 
-\subsection{Pre-Installed Versions}
-\settowidth{\MyLen}{\texttt{multicol}}
-yt is pre-installed on several supercomputer systems.
-
-\textbf{NICS Kraken} \textemdash\ {\it module load yt} \\
-
-
 \subsection{Mercurial}
 \settowidth{\MyLen}{\texttt{multicol}}
 Please see \url{http://mercurial.selenic.com/} for the full Mercurial documentation.
@@ -365,8 +334,7 @@
 \subsection{FAQ}
 \settowidth{\MyLen}{\texttt{multicol}}
 
-\texttt{pf.field\_info[`field'].take\_log = False} \textemdash\ When plotting \texttt{field}, do not take log.
-Must enter \texttt{pf.h} before this command. \\
+\texttt{slc.set\_log('field', False)} \textemdash\ When plotting \texttt{field}, use linear scaling instead of log scaling.
 
 
 %\rule{0.3\linewidth}{0.25pt}

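The rewritten parallelism section boils down to one new call in user scripts; a minimal sketch, with the dataset path borrowed from the helper scripts below and launched with something like {\it mpirun -n 12 python script.py}:

    import yt
    yt.enable_parallelism()  # without this, every MPI task runs the same serial script

    ds = yt.load("RD0005-mine/RedshiftOutput0005")  # assumed sample dataset
    dd = ds.all_data()
    # derived quantities are computed cooperatively across MPI tasks
    print dd.quantities.extrema("density")
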
diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/coding_styleguide.txt
--- a/doc/coding_styleguide.txt
+++ b/doc/coding_styleguide.txt
@@ -34,11 +34,11 @@
  * Do not import "*" from anything other than "yt.funcs".
  * Internally, only import from source files directly -- instead of:
 
-   from yt.visualization.api import PlotCollection
+   from yt.visualization.api import ProjectionPlot
 
    do:
 
-   from yt.visualization.plot_collection import PlotCollection
+   from yt.visualization.plot_window import ProjectionPlot
 
  * Numpy is to be imported as "np", after a long time of using "na".
  * Do not use too many keyword arguments.  If you have a lot of keyword
@@ -49,7 +49,7 @@
  * Don't create a new class to replicate the functionality of an old class --
    replace the old class.  Too many options makes for a confusing user
    experience.
- * Parameter files are a last resort.
+ * Parameter files external to yt are a last resort.
  * The usage of the **kwargs construction should be avoided.  If they cannot
    be avoided, they must be explained, even if they are only to be passed on to
    a nested function.
@@ -61,7 +61,7 @@
    * Hard-coding parameter names that are the same as those in Enzo.  The
      following translation table should be of some help.  Note that the
      parameters are now properties on a Dataset subclass: you access them
-     like pf.refine_by .
+     like ds.refine_by .
      * RefineBy => refine_by
      * TopGridRank => dimensionality
      * TopGridDimensions => domain_dimensions

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/docstring_example.txt
--- a/doc/docstring_example.txt
+++ b/doc/docstring_example.txt
@@ -73,7 +73,7 @@
     Examples
     --------
     These are written in doctest format, and should illustrate how to
-    use the function.  Use the variables 'pf' for the parameter file, 'pc' for
+    use the function.  Use the variables 'ds' for the dataset, 'pc' for
     a plot collection, 'c' for a center, and 'L' for a vector. 
 
     >>> a=[1,2,3]

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/docstring_idioms.txt
--- a/doc/docstring_idioms.txt
+++ b/doc/docstring_idioms.txt
@@ -19,7 +19,7 @@
 useful variable names that correspond to specific instances that the user is
 presupposed to have created.
 
-   * `pf`: a parameter file, loaded successfully
+   * `ds`: a dataset, loaded successfully
    * `sp`: a sphere
    * `c`: a 3-component "center"
    * `L`: a 3-component vector that corresponds to either angular momentum or a
@@ -43,7 +43,7 @@
 To indicate the return type of a given object, you can reference it using this
 construction:
 
-    This function returns a :class:`PlotCollection`.
+    This function returns a :class:`ProjectionPlot`.
 
 To reference a function, you can use:
 
@@ -51,4 +51,4 @@
 
 To reference a method, you can use:
 
-    To add a projection, use :meth:`PlotCollection.add_projection`.
+    To add a projection, use :meth:`ProjectionPlot.set_width`.

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/extensions/notebook_sphinxext.py
--- a/doc/extensions/notebook_sphinxext.py
+++ b/doc/extensions/notebook_sphinxext.py
@@ -15,8 +15,13 @@
     required_arguments = 1
     optional_arguments = 1
     option_spec = {'skip_exceptions' : directives.flag}
+    final_argument_whitespace = True
 
-    def run(self):
+    def run(self):
+        # check if there are spaces in the notebook name
+        nb_path = self.arguments[0]
+        if ' ' in nb_path: raise ValueError(
+            "Due to issues with docutils stripping spaces from links, white "
+            "space is not allowed in notebook filenames '{0}'".format(nb_path))
         # check if raw html is supported
         if not self.state.document.settings.raw_enabled:
             raise self.warning('"%s" directive disabled.' % self.name)
@@ -24,10 +29,11 @@
         # get path to notebook
         source_dir = os.path.dirname(
             os.path.abspath(self.state.document.current_source))
-        nb_basename = os.path.basename(self.arguments[0])
+        nb_filename = self.arguments[0]
+        nb_basename = os.path.basename(nb_filename)
         rst_file = self.state_machine.document.attributes['source']
         rst_dir = os.path.abspath(os.path.dirname(rst_file))
-        nb_abs_path = os.path.join(rst_dir, nb_basename)
+        nb_abs_path = os.path.abspath(os.path.join(rst_dir, nb_filename))
 
         # Move files around.
         rel_dir = os.path.relpath(rst_dir, setup.confdir)
@@ -89,7 +95,6 @@
         return [nb_node]
 
 
-
 class notebook_node(nodes.raw):
     pass
 
@@ -109,6 +114,7 @@
     # http://imgur.com/eR9bMRH
     header = header.replace('<style', '<style scoped="scoped"')
     header = header.replace('body {\n  overflow: visible;\n  padding: 8px;\n}\n', '')
+    header = header.replace("code,pre{", "code{")
 
     # Filter out styles that conflict with the sphinx theme.
     filter_strings = [
@@ -120,8 +126,16 @@
     ]
     filter_strings.extend(['h%s{' % (i+1) for i in range(6)])
 
+    line_begin_strings = [
+        'pre{',
+        'p{margin'
+        ]
+
     header_lines = filter(
         lambda x: not any([s in x for s in filter_strings]), header.split('\n'))
+    header_lines = filter(
+        lambda x: not any([x.startswith(s) for s in line_begin_strings]), header_lines)
+
     header = '\n'.join(header_lines)
 
     # concatenate raw html lines

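The switch from `nb_basename` to `nb_filename` above matters because taking only the basename silently dropped any subdirectory component of the directive argument. A standalone sketch of the old and new path resolution, with made-up paths:

    import os

    rst_dir = "/docs/source/analyzing"
    argument = "analysis_modules/Halo_Analysis.ipynb"  # notebook in a subdirectory

    # Old behavior: the subdirectory is dropped, pointing at the wrong file.
    old_path = os.path.join(rst_dir, os.path.basename(argument))
    print(old_path)  # /docs/source/analyzing/Halo_Analysis.ipynb

    # New behavior: the relative path is honored.
    new_path = os.path.abspath(os.path.join(rst_dir, argument))
    print(new_path)  # /docs/source/analyzing/analysis_modules/Halo_Analysis.ipynb
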
diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/helper_scripts/parse_cb_list.py
--- a/doc/helper_scripts/parse_cb_list.py
+++ b/doc/helper_scripts/parse_cb_list.py
@@ -2,7 +2,7 @@
 import inspect
 from textwrap import TextWrapper
 
-pf = load("RD0005-mine/RedshiftOutput0005")
+ds = load("RD0005-mine/RedshiftOutput0005")
 
 output = open("source/visualizing/_cb_docstrings.inc", "w")
 

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/helper_scripts/parse_dq_list.py
--- a/doc/helper_scripts/parse_dq_list.py
+++ b/doc/helper_scripts/parse_dq_list.py
@@ -2,7 +2,7 @@
 import inspect
 from textwrap import TextWrapper
 
-pf = load("RD0005-mine/RedshiftOutput0005")
+ds = load("RD0005-mine/RedshiftOutput0005")
 
 output = open("source/analyzing/_dq_docstrings.inc", "w")
 
@@ -29,7 +29,7 @@
                             docstring = docstring))
                             #docstring = "\n".join(tw.wrap(docstring))))
 
-dd = pf.h.all_data()
+dd = ds.all_data()
 for n,func in sorted(dd.quantities.functions.items()):
     print n, func
     write_docstring(output, n, func[1])

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/helper_scripts/parse_object_list.py
--- a/doc/helper_scripts/parse_object_list.py
+++ b/doc/helper_scripts/parse_object_list.py
@@ -2,7 +2,7 @@
 import inspect
 from textwrap import TextWrapper
 
-pf = load("RD0005-mine/RedshiftOutput0005")
+ds = load("RD0005-mine/RedshiftOutput0005")
 
 output = open("source/analyzing/_obj_docstrings.inc", "w")
 
@@ -27,7 +27,7 @@
     f.write(template % dict(clsname = clsname, sig = sig, clsproxy=clsproxy,
                             docstring = 'physical-object-api'))
 
-for n,c in sorted(pf.h.__dict__.items()):
+for n,c in sorted(ds.__dict__.items()):
     if hasattr(c, '_con_args'):
         print n
         write_docstring(output, n, c)

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/helper_scripts/show_fields.py
--- a/doc/helper_scripts/show_fields.py
+++ b/doc/helper_scripts/show_fields.py
@@ -17,15 +17,15 @@
 everywhere, "Enzo" fields in Enzo datasets, "Orion" fields in Orion datasets,
 and so on.
 
-Try using the ``pf.field_list`` and ``pf.derived_field_list`` to view the
+Try using the ``ds.field_list`` and ``ds.derived_field_list`` to view the
 native and derived fields available for your dataset respectively. For example
 to display the native fields in alphabetical order:
 
 .. notebook-cell::
 
   from yt.mods import *
-  pf = load("Enzo_64/DD0043/data0043")
-  for i in sorted(pf.field_list):
+  ds = load("Enzo_64/DD0043/data0043")
+  for i in sorted(ds.field_list):
     print i
 
 .. note:: Universal fields will be overridden by a code-specific field.

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/install_script.sh
--- a/doc/install_script.sh
+++ b/doc/install_script.sh
@@ -567,8 +567,10 @@
 
 mkdir -p ${DEST_DIR}/data
 cd ${DEST_DIR}/data
-echo 'de6d8c6ea849f0206d219303329a0276b3cce7c051eec34377d42aacbe0a4f47ac5145eb08966a338ecddd2b83c8f787ca9956508ad5c39ee2088ad875166410  xray_emissivity.h5' > xray_emissivity.h5.sha512
-get_ytdata xray_emissivity.h5
+echo 'de6d8c6ea849f0206d219303329a0276b3cce7c051eec34377d42aacbe0a4f47ac5145eb08966a338ecddd2b83c8f787ca9956508ad5c39ee2088ad875166410  cloudy_emissivity.h5' > cloudy_emissivity.h5.sha512
+[ ! -e cloudy_emissivity.h5 ] && get_ytdata cloudy_emissivity.h5
+echo '0f714ae2eace0141b1381abf1160dc8f8a521335e886f99919caf3beb31df1fe271d67c7b2a804b1467949eb16b0ef87a3d53abad0e8160fccac1e90d8d9e85f  apec_emissivity.h5' > apec_emissivity.h5.sha512
+[ ! -e apec_emissivity.h5 ] && get_ytdata apec_emissivity.h5
 
 # Set paths to what they should be when yt is activated.
 export PATH=${DEST_DIR}/bin:$PATH
@@ -586,11 +588,11 @@
 FREETYPE_VER='freetype-2.4.12'
 H5PY='h5py-2.1.3'
 HDF5='hdf5-1.8.11'
-IPYTHON='ipython-1.1.0'
+IPYTHON='ipython-2.1.0'
 LAPACK='lapack-3.4.2'
 PNG=libpng-1.6.3
 MATPLOTLIB='matplotlib-1.3.0'
-MERCURIAL='mercurial-2.8'
+MERCURIAL='mercurial-3.0'
 NOSE='nose-1.3.0'
 NUMPY='numpy-1.7.1'
 PYTHON_HGLIB='python-hglib-1.0'
@@ -608,23 +610,21 @@
 echo '3f53d0b474bfd79fea2536d0a9197eaef6c0927e95f2f9fd52dbd6c1d46409d0e649c21ac418d8f7767a9f10fe6114b516e06f2be4b06aec3ab5bdebc8768220  Forthon-0.8.11.tar.gz' > Forthon-0.8.11.tar.gz.sha512
 echo '4941f5aa21aff3743546495fb073c10d2657ff42b2aff401903498638093d0e31e344cce778980f28a7170c6d29eab72ac074277b9d4088376e8692dc71e55c1  PyX-0.12.1.tar.gz' > PyX-0.12.1.tar.gz.sha512
 echo '3df0ba4b1cfef5f02fb27925de4c2ca414eca9000af6a3d475d39063720afe987287c3d51377e0a36b88015573ef699f700782e1749c7a357b8390971d858a79  Python-2.7.6.tgz' > Python-2.7.6.tgz.sha512
-echo '172f2bc671145ebb0add2669c117863db35851fb3bdb192006cd710d4d038e0037497eb39a6d01091cb923f71a7e8982a77b6e80bf71d6275d5d83a363c8d7e5  rockstar-0.99.6.tar.gz' > rockstar-0.99.6.tar.gz.sha512
 echo '276bd9c061ec9a27d478b33078a86f93164ee2da72210e12e2c9da71dcffeb64767e4460b93f257302b09328eda8655e93c4b9ae85e74472869afbeae35ca71e  blas.tar.gz' > blas.tar.gz.sha512
 echo '00ace5438cfa0c577e5f578d8a808613187eff5217c35164ffe044fbafdfec9e98f4192c02a7d67e01e5a5ccced630583ad1003c37697219b0f147343a3fdd12  bzip2-1.0.6.tar.gz' > bzip2-1.0.6.tar.gz.sha512
 echo 'a296dfcaef7e853e58eed4e24b37c4fa29cfc6ac688def048480f4bb384b9e37ca447faf96eec7b378fd764ba291713f03ac464581d62275e28eb2ec99110ab6  reason-js-20120623.zip' > reason-js-20120623.zip.sha512
 echo '609a68a3675087e0cc95268574f31e104549daa48efe15a25a33b8e269a93b4bd160f4c3e8178dca9c950ef5ca514b039d6fd1b45db6af57f25342464d0429ce  freetype-2.4.12.tar.gz' > freetype-2.4.12.tar.gz.sha512
 echo '2eb7030f8559ff5cb06333223d98fda5b3a663b6f4a026949d1c423aa9a869d824e612ed5e1851f3bf830d645eea1a768414f73731c23ab4d406da26014fe202  h5py-2.1.3.tar.gz' > h5py-2.1.3.tar.gz.sha512
 echo 'e9db26baa297c8ed10f1ca4a3fcb12d6985c6542e34c18d48b2022db73014f054c8b8434f3df70dcf44631f38b016e8050701d52744953d0fced3272d7b6b3c1  hdf5-1.8.11.tar.gz' > hdf5-1.8.11.tar.gz.sha512
-echo '46b8ae25df2ced674b3b3629070aafac955ba3aa2a5e749f8e63ef1f459126e1c4a9a03661406151622590a90c73b527716ad71bc626f57f52b51abfae0f43ca  ipython-1.1.0.tar.gz' > ipython-1.1.0.tar.gz.sha512
+echo '68c15f6402cacfd623f8e2b70c22d06541de3616fdb2d502ce93cd2fdb4e7507bb5b841a414a4123264221ee5ffb0ebefbb8541f79e647fcb9f73310b4c2d460  ipython-2.1.0.tar.gz' > ipython-2.1.0.tar.gz.sha512
 echo '8770214491e31f0a7a3efaade90eee7b0eb20a8a6ab635c5f854d78263f59a1849133c14ef5123d01023f0110cbb9fc6f818da053c01277914ae81473430a952  lapack-3.4.2.tar.gz' > lapack-3.4.2.tar.gz.sha512
 echo '887582e5a22e4cde338aa8fec7a89f6dd31f2f02b8842735f00f970f64582333fa03401cea6d01704083403c7e8b7ebc26655468ce930165673b33efa4bcd586  libpng-1.6.3.tar.gz' > libpng-1.6.3.tar.gz.sha512
 echo '990e3a155ca7a9d329c41a43b44a9625f717205e81157c668a8f3f2ad5459ed3fed8c9bd85e7f81c509e0628d2192a262d4aa30c8bfc348bb67ed60a0362505a  matplotlib-1.3.0.tar.gz' > matplotlib-1.3.0.tar.gz.sha512
-echo 'b08dcd746728d89f1f96036f39df1608fad0ff863ae48fe12424b1645936ebbf59b9068b93fe3c7cfd2036db046df3dc814119f89a827bd5f008d32f323d45a8  mercurial-2.8.tar.gz' > mercurial-2.8.tar.gz.sha512
+echo '8cd387ea0d74d5ed01b58d5ef8e3fb408d4b05f7deb45a02e34fbb931fd920aafbfcb3a9b52a027ebcdb562837198637a0e51f2121c94e0fcf7f7d8c016f5342  mercurial-3.0.tar.gz' > mercurial-3.0.tar.gz.sha512
 echo 'a3b8060e415560a868599224449a3af636d24a060f1381990b175dcd12f30249edd181179d23aea06b0c755ff3dc821b7a15ed8840f7855530479587d4d814f4  nose-1.3.0.tar.gz' > nose-1.3.0.tar.gz.sha512
 echo 'd58177f3971b6d07baf6f81a2088ba371c7e43ea64ee7ada261da97c6d725b4bd4927122ac373c55383254e4e31691939276dab08a79a238bfa55172a3eff684  numpy-1.7.1.tar.gz' > numpy-1.7.1.tar.gz.sha512
 echo '9c0a61299779aff613131aaabbc255c8648f0fa7ab1806af53f19fbdcece0c8a68ddca7880d25b926d67ff1b9201954b207919fb09f6a290acb078e8bbed7b68  python-hglib-1.0.tar.gz' > python-hglib-1.0.tar.gz.sha512
 echo 'c65013293dd4049af5db009fdf7b6890a3c6b1e12dd588b58fb5f5a5fef7286935851fb7a530e03ea16f28de48b964e50f48bbf87d34545fd23b80dd4380476b  pyzmq-13.1.0.tar.gz' > pyzmq-13.1.0.tar.gz.sha512
-echo '172f2bc671145ebb0add2669c117863db35851fb3bdb192006cd710d4d038e0037497eb39a6d01091cb923f71a7e8982a77b6e80bf71d6275d5d83a363c8d7e5  rockstar-0.99.6.tar.gz' > rockstar-0.99.6.tar.gz.sha512
 echo '80c8e137c3ccba86575d4263e144ba2c4684b94b5cd620e200f094c92d4e118ea6a631d27bdb259b0869771dfaeeae68c0fdd37fdd740b9027ee185026e921d4  scipy-0.12.0.tar.gz' > scipy-0.12.0.tar.gz.sha512
 echo '96f3e51b46741450bc6b63779c10ebb4a7066860fe544385d64d1eda52592e376a589ef282ace2e1df73df61c10eab1a0d793abbdaf770e60289494d4bf3bcb4  sqlite-autoconf-3071700.tar.gz' > sqlite-autoconf-3071700.tar.gz.sha512
 echo '2992baa3edfb4e1842fb642abf0bf0fc0bf56fc183aab8fed6b3c42fbea928fa110ede7fdddea2d63fc5953e8d304b04da433dc811134fadefb1eecc326121b8  sympy-0.7.3.tar.gz' > sympy-0.7.3.tar.gz.sha512
@@ -657,7 +657,6 @@
 get_ytproject $NOSE.tar.gz
 get_ytproject $PYTHON_HGLIB.tar.gz
 get_ytproject $SYMPY.tar.gz
-get_ytproject $ROCKSTAR.tar.gz
 if [ $INST_BZLIB -eq 1 ]
 then
     if [ ! -e $BZLIB/done ]
@@ -816,6 +815,7 @@
         YT_DIR=`dirname $ORIG_PWD`
     elif [ ! -e yt-hg ]
     then
+        echo "Cloning yt"
         YT_DIR="$PWD/yt-hg/"
         ( ${HG_EXEC} --debug clone https://bitbucket.org/yt_analysis/yt-supplemental/ 2>&1 ) 1>> ${LOG_FILE}
         # Recently the hg server has had some issues with timeouts.  In lieu of
@@ -824,9 +824,9 @@
         ( ${HG_EXEC} --debug clone https://bitbucket.org/yt_analysis/yt/ ./yt-hg 2>&1 ) 1>> ${LOG_FILE}
         # Now we update to the branch we're interested in.
         ( ${HG_EXEC} -R ${YT_DIR} up -C ${BRANCH} 2>&1 ) 1>> ${LOG_FILE}
-    elif [ -e yt-3.0-hg ] 
+    elif [ -e yt-hg ]
     then
-        YT_DIR="$PWD/yt-3.0-hg/"
+        YT_DIR="$PWD/yt-hg/"
     fi
     echo Setting YT_DIR=${YT_DIR}
 fi
@@ -943,14 +943,19 @@
 # Now we build Rockstar and set its environment variable.
 if [ $INST_ROCKSTAR -eq 1 ]
 then
-    if [ ! -e Rockstar/done ]
+    if [ ! -e rockstar/done ]
     then
-        [ ! -e Rockstar ] && tar xfz $ROCKSTAR.tar.gz
         echo "Building Rockstar"
-        cd Rockstar
+        if [ ! -e rockstar ]
+        then
+            ( hg clone http://bitbucket.org/MatthewTurk/rockstar 2>&1 ) 1>> ${LOG_FILE}
+        fi
+        cd rockstar
+        ( hg pull 2>&1 ) 1>> ${LOG_FILE}
+        ( hg up -C tip 2>&1 ) 1>> ${LOG_FILE}
         ( make lib 2>&1 ) 1>> ${LOG_FILE} || do_exit
         cp librockstar.so ${DEST_DIR}/lib
-        ROCKSTAR_DIR=${DEST_DIR}/src/Rockstar
+        ROCKSTAR_DIR=${DEST_DIR}/src/rockstar
         echo $ROCKSTAR_DIR > ${YT_DIR}/rockstar.cfg
         touch done
         cd ..

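The `.sha512` sidecar files written above store the expected digest alongside the filename in the usual two-column checksum format. A rough Python equivalent of the verification step (the `verify_sha512` helper is our own, for illustration):

    import hashlib

    def verify_sha512(filename):
        """Check a downloaded file against its '<digest>  <name>' .sha512 sidecar."""
        expected = open(filename + ".sha512").read().split()[0]
        digest = hashlib.sha512()
        with open(filename, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest() == expected

    # e.g. verify_sha512("cloudy_emissivity.h5")
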
diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/source/_static/agogo_yt.css
--- a/doc/source/_static/agogo_yt.css
+++ /dev/null
@@ -1,41 +0,0 @@
- at import url("agogo.css");
- at import url("http://fonts.googleapis.com/css?family=Crimson+Text");
- at import url("http://fonts.googleapis.com/css?family=Droid+Sans");
-
-div.document ul {
-  margin-left: 1.5em;
-  margin-top: 0.0em;
-  margin-bottom: 1.0em;
-}
-
-div.document li.toctree-l1 {
-  margin-bottom: 0.5em;
-}
-
-table.contentstable {
-  width: 100%;
-}
-
-table.contentstable td {
-  padding: 5px 15px 0px 15px;
-}
-
-table.contentstable tr {
-  border-bottom: 1px solid black;
-}
-
-a.biglink {
-  line-height: 1.2em;
-}
-
-a tt.xref {
-  font-weight: bolder;
-}
-
-table.docutils {
-  width: 100%;
-}
-
-table.docutils td {
-  width: 50%;
-}

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/source/_static/custom.css
--- /dev/null
+++ b/doc/source/_static/custom.css
@@ -0,0 +1,8 @@
+blockquote {
+    font-size: 16px;
+    border-left: none;
+}
+
+dd {
+    margin-left: 30px;
+}
\ No newline at end of file

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/source/_templates/layout.html
--- a/doc/source/_templates/layout.html
+++ b/doc/source/_templates/layout.html
@@ -35,3 +35,5 @@
     </div>
 {%- endblock %}
 
+{# Custom CSS overrides #}
+{% set bootswatch_css_custom = ['_static/custom.css'] %}

diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/source/analyzing/_dq_docstrings.inc
--- a/doc/source/analyzing/_dq_docstrings.inc
+++ b/doc/source/analyzing/_dq_docstrings.inc
@@ -1,43 +1,20 @@
 
 
-.. function:: Action(action, combine_action, filter=None):
+.. function:: angular_momentum_vector()
 
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._Action`.)
-   This function evals the string given by the action arg and uses 
-   the function thrown with the combine_action to combine the values.  
-   A filter can be thrown to be evaled to short-circuit the calculation 
-   if some criterion is not met.
-   :param action: a string containing the desired action to be evaled.
-   :param combine_action: the function used to combine the answers when done lazily.
-   :param filter: a string to be evaled to serve as a data filter.
-
-
-
-.. function:: AngularMomentumVector():
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._AngularMomentumVector`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.AngularMomentumVector`.)
    This function returns the mass-weighted average angular momentum vector.
 
 
+.. function:: bulk_velocity():
 
-.. function:: BaryonSpinParameter():
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._BaryonSpinParameter`.)
-   This function returns the spin parameter for the baryons, but it uses
-   the particles in calculating enclosed mass.
-
-
-
-.. function:: BulkVelocity():
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._BulkVelocity`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.BulkVelocity`.)
    This function returns the mass-weighted average velocity in the object.
 
 
+.. function:: center_of_mass(use_cells=True, use_particles=False):
 
-.. function:: CenterOfMass(use_cells=True, use_particles=False):
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._CenterOfMass`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.CenterOfMass`.)
    This function returns the location of the center
    of mass. By default, it computes of the *non-particle* data in the object. 
    
@@ -51,112 +28,64 @@
 
 
 
-.. function:: Extrema(fields, non_zero=False, filter=None):
+.. function:: extrema(fields, non_zero=False, filter=None):
 
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._Extrema`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.Extrema`.)
    This function returns the extrema of a set of fields
    
    :param fields: A field name, or a list of field names
    :param filter: a string to be evaled to serve as a data filter.
 
 
+.. function:: max_location(field):
 
-.. function:: IsBound(truncate=True, include_thermal_energy=False, treecode=True, opening_angle=1.0, periodic_test=False, include_particles=True):
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._IsBound`.)
-   This returns whether or not the object is gravitationally bound. If this
-   returns a value greater than one, it is bound, and otherwise not.
-   
-   Parameters
-   ----------
-   truncate : Bool
-       Should the calculation stop once the ratio of
-       gravitational:kinetic is 1.0?
-   include_thermal_energy : Bool
-       Should we add the energy from ThermalEnergy
-       on to the kinetic energy to calculate 
-       binding energy?
-   treecode : Bool
-       Whether or not to use the treecode.
-   opening_angle : Float 
-       The maximal angle a remote node may subtend in order
-       for the treecode method of mass conglomeration may be
-       used to calculate the potential between masses.
-   periodic_test : Bool 
-       Used for testing the periodic adjustment machinery
-       of this derived quantity.
-   include_particles : Bool
-       Should we add the mass contribution of particles
-       to calculate binding energy?
-   
-   Examples
-   --------
-   >>> sp.quantities["IsBound"](truncate=False,
-   ... include_thermal_energy=True, treecode=False, opening_angle=2.0)
-   0.32493
-
-
-
-.. function:: MaxLocation(field):
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._MaxLocation`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.MaxLocation`.)
    This function returns the location of the maximum of a set
    of fields.
 
 
+.. function:: min_location(field):
 
-.. function:: MinLocation(field):
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._MinLocation`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.MinLocation`.)
    This function returns the location of the minimum of a set
    of fields.
 
 
 
-.. function:: ParticleSpinParameter():
+.. function:: spin_parameter(use_gas=True, use_particles=True):
 
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._ParticleSpinParameter`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.SpinParameter`.)
    This function returns the spin parameter for the baryons, but it uses
    the particles in calculating enclosed mass.
 
 
+.. function:: total_mass():
 
-.. function:: StarAngularMomentumVector():
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._StarAngularMomentumVector`.)
-   This function returns the mass-weighted average angular momentum vector 
-   for stars.
-
-
-
-.. function:: TotalMass():
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._TotalMass`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.TotalMass`.)
    This function takes no arguments and returns the sum of cell masses and
    particle masses in the object.
 
 
+.. function:: total_quantity(fields):
 
-.. function:: TotalQuantity(fields):
-
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._TotalQuantity`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.TotalQuantity`.)
    This function sums up a given field over the entire region
    
    :param fields: The fields to sum up
 
 
 
-.. function:: WeightedAverageQuantity(field, weight):
+.. function:: weighted_average_quantity(field, weight):
 
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._WeightedAverageQuantity`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.WeightedAverageQuantity`.)
    This function returns an averaged quantity.
    
    :param field: The field to average
    :param weight: The field to weight by
 
-.. function:: WeightedVariance(field, weight):
+.. function:: weighted_variance(field, weight):
 
-   (This is a proxy for :func:`~yt.data_objects.derived_quantities._WeightedVariance`.)
+   (This is a proxy for :func:`~yt.data_objects.derived_quantities.WeightedVariance`.)
     This function returns the variance of a field.
 
     :param field: The target field

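For reference, the lower-cased quantities documented above are invoked as methods on a data object's `quantities` attribute. A brief sketch, reusing the Enzo sample dataset mentioned elsewhere in these docs:

    import yt

    ds = yt.load("Enzo_64/DD0043/data0043")
    ad = ds.all_data()

    print(ad.quantities.total_mass())
    print(ad.quantities.extrema("density"))
    print(ad.quantities.weighted_average_quantity("temperature", "cell_mass"))
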
diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/source/analyzing/_obj_docstrings.inc
--- a/doc/source/analyzing/_obj_docstrings.inc
+++ b/doc/source/analyzing/_obj_docstrings.inc
@@ -1,12 +1,12 @@
 
 
-.. class:: boolean(self, regions, fields=None, pf=None, **field_parameters):
+.. class:: boolean(self, regions, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRBooleanRegionBase`.)
 
 
-.. class:: covering_grid(self, level, left_edge, dims, fields=None, pf=None, num_ghost_zones=0, use_pbar=True, **field_parameters):
+.. class:: covering_grid(self, level, left_edge, dims, fields=None, ds=None, num_ghost_zones=0, use_pbar=True, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRCoveringGridBase`.)
@@ -24,13 +24,13 @@
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRCuttingPlaneBase`.)
 
 
-.. class:: disk(self, center, normal, radius, height, fields=None, pf=None, **field_parameters):
+.. class:: disk(self, center, normal, radius, height, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRCylinderBase`.)
 
 
-.. class:: ellipsoid(self, center, A, B, C, e0, tilt, fields=None, pf=None, **field_parameters):
+.. class:: ellipsoid(self, center, A, B, C, e0, tilt, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMREllipsoidBase`.)
@@ -48,79 +48,79 @@
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRFixedResCuttingPlaneBase`.)
 
 
-.. class:: fixed_res_proj(self, axis, level, left_edge, dims, fields=None, pf=None, **field_parameters):
+.. class:: fixed_res_proj(self, axis, level, left_edge, dims, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRFixedResProjectionBase`.)
 
 
-.. class:: grid_collection(self, center, grid_list, fields=None, pf=None, **field_parameters):
+.. class:: grid_collection(self, center, grid_list, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRGridCollectionBase`.)
 
 
-.. class:: grid_collection_max_level(self, center, max_level, fields=None, pf=None, **field_parameters):
+.. class:: grid_collection_max_level(self, center, max_level, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRMaxLevelCollectionBase`.)
 
 
-.. class:: inclined_box(self, origin, box_vectors, fields=None, pf=None, **field_parameters):
+.. class:: inclined_box(self, origin, box_vectors, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRInclinedBoxBase`.)
 
 
-.. class:: ortho_ray(self, axis, coords, fields=None, pf=None, **field_parameters):
+.. class:: ortho_ray(self, axis, coords, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMROrthoRayBase`.)
 
 
-.. class:: overlap_proj(self, axis, field, weight_field=None, max_level=None, center=None, pf=None, source=None, node_name=None, field_cuts=None, preload_style='level', serialize=True, **field_parameters):
+.. class:: overlap_proj(self, axis, field, weight_field=None, max_level=None, center=None, ds=None, source=None, node_name=None, field_cuts=None, preload_style='level', serialize=True, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRProjBase`.)
 
 
-.. class:: periodic_region(self, center, left_edge, right_edge, fields=None, pf=None, **field_parameters):
+.. class:: periodic_region(self, center, left_edge, right_edge, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRPeriodicRegionBase`.)
 
 
-.. class:: periodic_region_strict(self, center, left_edge, right_edge, fields=None, pf=None, **field_parameters):
+.. class:: periodic_region_strict(self, center, left_edge, right_edge, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRPeriodicRegionStrictBase`.)
 
 
-.. class:: proj(self, axis, field, weight_field=None, max_level=None, center=None, pf=None, source=None, node_name=None, field_cuts=None, preload_style=None, serialize=True, style='integrate', **field_parameters):
+.. class:: proj(self, axis, field, weight_field=None, max_level=None, center=None, ds=None, source=None, node_name=None, field_cuts=None, preload_style=None, serialize=True, style='integrate', **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRQuadTreeProjBase`.)
 
 
-.. class:: ray(self, start_point, end_point, fields=None, pf=None, **field_parameters):
+.. class:: ray(self, start_point, end_point, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRRayBase`.)
 
 
-.. class:: region(self, center, left_edge, right_edge, fields=None, pf=None, **field_parameters):
+.. class:: region(self, center, left_edge, right_edge, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRRegionBase`.)
 
 
-.. class:: region_strict(self, center, left_edge, right_edge, fields=None, pf=None, **field_parameters):
+.. class:: region_strict(self, center, left_edge, right_edge, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRRegionStrictBase`.)
 
 
-.. class:: slice(self, axis, coord, fields=None, center=None, pf=None, node_name=False, **field_parameters):
+.. class:: slice(self, axis, coord, fields=None, center=None, ds=None, node_name=False, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRSliceBase`.)
@@ -132,13 +132,13 @@
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase`.)
 
 
-.. class:: sphere(self, center, radius, fields=None, pf=None, **field_parameters):
+.. class:: sphere(self, center, radius, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRSphereBase`.)
 
 
-.. class:: streamline(self, positions, length=1.0, fields=None, pf=None, **field_parameters):
+.. class:: streamline(self, positions, length=1.0, fields=None, ds=None, **field_parameters):
 
    For more information, see :ref:`physical-object-api`
    (This is a proxy for :class:`~yt.data_objects.data_containers.AMRStreamlineBase`.)

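In practice these containers are created through methods on the dataset itself, with `ds` replacing the old `pf` keyword throughout. A short sketch with the same illustrative dataset path as above:

    import yt

    ds = yt.load("Enzo_64/DD0043/data0043")

    sp = ds.sphere("c", (10.0, "kpc"))              # sphere at the domain center
    ray = ds.ray([0.2, 0.2, 0.2], [0.8, 0.8, 0.8])  # start_point, end_point
    print(sp["density"].max())
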
diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/source/analyzing/analysis_modules/Halo_Analysis.ipynb
--- /dev/null
+++ b/doc/source/analyzing/analysis_modules/Halo_Analysis.ipynb
@@ -0,0 +1,412 @@
+{
+ "metadata": {
+  "name": "",
+  "signature": "sha256:c423bcb9e3370a4581cbaaa8e764b95ec13e665aa3b46d452891d76cc79d7acf"
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "heading",
+     "level": 1,
+     "metadata": {},
+     "source": [
+      "Full Halo Analysis"
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "Creating a Catalog"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Here we put everything together to perform some realistic analysis. First we load a full simulation dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import yt\n",
+      "from yt.analysis_modules.halo_analysis.api import *\n",
+      "import tempfile\n",
+      "import shutil\n",
+      "import os\n",
+      "\n",
+      "# Create temporary directory for storing files\n",
+      "tmpdir = tempfile.mkdtemp()\n",
+      "\n",
+      "# Load the data set with the full simulation information\n",
+      "data_ds = yt.load('Enzo_64/RD0006/RedshiftOutput0006')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we load a rockstar halos binary file. This is the output from running the rockstar halo finder on the dataset loaded above. It is also possible to require the HaloCatalog to find the halos in the full simulation dataset at runtime by specifying a `finder_method` keyword."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# Load the rockstar data files\n",
+      "halos_ds = yt.load('rockstar_halos/halos_0.0.bin')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "From these two loaded datasets we create a halo catalog object. No analysis is done at this point, we are simply defining an object we can add analysis tasks to. These analysis tasks will be run in the order they are added to the halo catalog object."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# Instantiate a catalog using those two paramter files\n",
+      "hc = HaloCatalog(data_ds=data_ds, halos_ds=halos_ds, \n",
+      "                 output_dir=os.path.join(tmpdir, 'halo_catalog'))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The first analysis task we add is a filter for the most massive halos; those with masses great than $10^{14}~M_\\odot$. Note that all following analysis will only be performed on these massive halos and we will not waste computational time calculating quantities for halos we are not interested in. This is a result of adding this filter first. If we had called `add_filter` after some other `add_quantity` or `add_callback` to the halo catalog, the quantity and callback calculations would have been performed for all halos, not just those which pass the filter."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": true,
+     "input": [
+      "# Filter out less massive halos\n",
+      "hc.add_filter(\"quantity_value\", \"particle_mass\", \">\", 1e14, \"Msun\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "Finding Radial Profiles"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Our first analysis goal is going to be constructing radial profiles for our halos. We would like these profiles to be in terms of the virial radius. Unfortunately we have no guarantee that values of center and virial radius recorded by the halo finder are actually physical. Therefore we should recalculate these quantities ourselves using the values recorded by the halo finder as a starting point."
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The first step is going to be creating a sphere object that we will create radial profiles along. This attaches a sphere data object to every halo left in the catalog."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# attach a sphere object to each halo whose radius extends to twice the radius of the halo\n",
+      "hc.add_callback(\"sphere\", factor=2.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Next we find the radial profile of the gas overdensity along the sphere object in order to find the virial radius. `radius` is the axis along which we make bins for the radial profiles. `[(\"gas\",\"overdensity\")]` is the quantity that we are profiling. This is a list so we can profile as many quantities as we want. The `weight_field` indicates how the cells should be weighted, but note that this is not a list, so all quantities will be weighted in the same way. The `accumulation` keyword indicates if the profile should be cummulative; this is useful for calculating profiles such as enclosed mass. The `storage` keyword indicates the name of the attribute of a halo where these profiles will be stored. Setting the storage keyword to \"virial_quantities_profiles\" means that the profiles will be stored in a dictionary that can be accessed by `halo.virial_quantities_profiles`."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# use the sphere to calculate radial profiles of gas density weighted by cell volume in terms of the virial radius\n",
+      "hc.add_callback(\"profile\", x_field=\"radius\",\n",
+      "                y_fields=[(\"gas\", \"overdensity\")],\n",
+      "                weight_field=\"cell_volume\", \n",
+      "                accumulation=False,\n",
+      "                storage=\"virial_quantities_profiles\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we calculate the virial radius of halo using the sphere object. As this is a callback, not a quantity, the virial radius will not be written out with the rest of the halo properties in the final halo catalog. This also has a `profile_storage` keyword to specify where the radial profiles are stored that will allow the callback to calculate the relevant virial quantities. We supply this keyword with the same string we gave to `storage` in the last `profile` callback."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# Define a virial radius for the halo.\n",
+      "hc.add_callback(\"virial_quantities\", [\"radius\"], \n",
+      "                profile_storage = \"virial_quantities_profiles\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now that we have calculated the virial radius, we delete the profiles we used to find it."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "hc.add_callback('delete_attribute','virial_quantities_profiles')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now that we have calculated virial quantities we can add a new sphere that is aware of the virial radius we calculated above."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "hc.add_callback('sphere', radius_field='radius_200', factor=5,\n",
+      "                field_parameters=dict(virial_radius=('quantity', 'radius_200')))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Using this new sphere, we calculate a gas temperature profile along the virial radius, weighted by the cell mass."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "hc.add_callback('profile', 'virial_radius', [('gas','temperature')],\n",
+      "                storage='virial_profiles',\n",
+      "                weight_field='cell_mass', \n",
+      "                accumulation=False, output_dir='profiles')\n"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "As profiles are not quantities they will not automatically be written out in the halo catalog; thus in order to be reloadable we must write them out explicitly through a callback of `save_profiles`. This makes sense because they have an extra dimension for each halo along the profile axis. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# Save the profiles\n",
+      "hc.add_callback(\"save_profiles\", storage=\"virial_profiles\", output_dir=\"profiles\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We then create the halo catalog. Remember, no analysis is done before this call to create. By adding callbacks and filters we are simply queuing up the actions we want to take that will all run now."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": true,
+     "input": [
+      "hc.create()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "Reloading HaloCatalogs"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally we load these profiles back in and make a pretty plot. It is not strictly necessary to reload the profiles in this notebook, but we show this process here to illustrate that this step may be performed completely separately from the rest of the script. This workflow allows you to create a single script that will allow you to perform all of the analysis that requires the full dataset. The output can then be saved in a compact form where only the necessarily halo quantities are stored. You can then download this smaller dataset to a local computer and run any further non-computationally intense analysis and design the appropriate plots."
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can load a previously saved halo catalog by using the `load` command. We then create a `HaloCatalog` object from just this dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "halos_ds =  yt.load(os.path.join(tmpdir, 'halo_catalog/halo_catalog.0.h5'))\n",
+      "\n",
+      "hc_reloaded = HaloCatalog(halos_ds=halos_ds,\n",
+      "                          output_dir=os.path.join(tmpdir, 'halo_catalog'))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      " Just as profiles are saved seperately throught the `save_profiles` callback they also must be loaded separately using the `load_profiles` callback."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "hc_reloaded.add_callback('load_profiles', storage='virial_profiles',\n",
+      "                         output_dir='profiles')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Calling `load` is the equivalent of calling `create` earlier, but defaults to to not saving new information. This means that the callback to `load_profiles` is not run until we call `load` here."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": true,
+     "input": [
+      "hc_reloaded.load()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "Plotting Radial Profiles"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In the future ProfilePlot will be able to properly interpret the loaded profiles of `Halo` and `HaloCatalog` objects, but this functionality is not yet implemented. In the meantime, we show a quick method of viewing a profile for a single halo."
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The individual `Halo` objects contained in the `HaloCatalog` can be accessed through the `halo_list` attribute. This gives us access to the dictionary attached to each halo where we stored the radial profiles."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "halo = hc_reloaded.halo_list[0]\n",
+      "\n",
+      "radius = halo.virial_profiles['virial_radius']\n",
+      "temperature = halo.virial_profiles[u\"('gas', 'temperature')\"]\n",
+      "\n",
+      "# Remove output files, that are no longer needed\n",
+      "shutil.rmtree(tmpdir)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Here we quickly use matplotlib to create a basic plot of the radial profile of this halo. When `ProfilePlot` is properly configured to accept Halos and HaloCatalogs the full range of yt plotting tools will be accessible."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "%matplotlib inline\n",
+      "import matplotlib.pyplot as plt\n",
+      "import numpy as np\n",
+      "\n",
+      "plt.plot(np.array(radius), np.array(temperature))\n",
+      "\n",
+      "plt.semilogy()\n",
+      "plt.xlabel(r'$\\rm{R/R_{vir}}$')\n",
+      "plt.ylabel(r'$\\rm{Temperature\\/\\/(K)}$')\n",
+      "\n",
+      "plt.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

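Condensed into a single script, the catalog-creation stage of the notebook above looks roughly like the following sketch; it assumes the same sample data and omits the temporary-directory handling and the reloading stage:

    import yt
    from yt.analysis_modules.halo_analysis.api import HaloCatalog

    data_ds = yt.load("Enzo_64/RD0006/RedshiftOutput0006")
    halos_ds = yt.load("rockstar_halos/halos_0.0.bin")

    hc = HaloCatalog(data_ds=data_ds, halos_ds=halos_ds)
    hc.add_filter("quantity_value", "particle_mass", ">", 1e14, "Msun")
    hc.add_callback("sphere", factor=2.0)
    hc.add_callback("profile", x_field="radius",
                    y_fields=[("gas", "overdensity")],
                    weight_field="cell_volume",
                    storage="virial_quantities_profiles")
    hc.add_callback("virial_quantities", ["radius"],
                    profile_storage="virial_quantities_profiles")
    hc.create()
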
diff -r 6de5eaad0968a448dfa0bee841cf81dc170d5094 -r fc533325d312b53c8c2a816fe3ecb8a42de8d8e8 doc/source/analyzing/analysis_modules/PPVCube.ipynb
--- /dev/null
+++ b/doc/source/analyzing/analysis_modules/PPVCube.ipynb
@@ -0,0 +1,307 @@
+{
+ "metadata": {
+  "name": "",
+  "signature": "sha256:56a8d72735e3cc428ff04b241d4b2ce6f653019818c6fc7a4148840d99030c85"
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Detailed spectra of astrophysical objects sometimes allow for determinations of how much of the gas is moving with a certain velocity along the line of sight, thanks to Doppler shifting of spectral lines. This enables \"data cubes\" to be created in RA, Dec, and line-of-sight velocity space. In yt, we can use the `PPVCube` analysis module to project fields along a given line of sight traveling at different line-of-sight velocities, to \"mock-up\" what would be seen in observations."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import yt\n",
+      "import numpy as np\n",
+      "\n",
+      "from yt.analysis_modules.ppv_cube.api import PPVCube"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To demonstrate this functionality, we'll create a simple unigrid dataset from scratch of a rotating disk galaxy. We create a thin disk in the x-y midplane of the domain of three cells in height in either direction, and a radius of 10 kpc. The density and azimuthal velocity profiles of the disk as a function of radius will be given by the following functions:"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Density: $\\rho(r) \\propto r^{\\alpha}$"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Velocity: $v_{\\theta}(r) \\propto \\frac{r}{1+(r/r_0)^{\\beta}}$"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "where for simplicity we won't worry about the normalizations of these profiles. "
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "First, we'll set up the grid and the parameters of the profiles:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "nx,ny,nz = (256,256,256) # domain dimensions\n",
+      "R = 10. # outer radius of disk, kpc\n",
+      "r_0 = 3. # scale radius, kpc\n",
+      "beta = 1.4 # for the tangential velocity profile\n",
+      "alpha = -1. # for the radial density profile\n",
+      "x, y = np.mgrid[-R:R:nx*1j,-R:R:ny*1j] # cartesian coordinates of x-y plane of disk\n",
+      "r = np.sqrt(x*x+y*y) # polar coordinates\n",
+      "theta = np.arctan2(y, x) # polar coordinates"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Second, we'll construct the data arrays for the density and the velocity of the disk. Since we have the tangential velocity profile, we have to use the polar coordinates we derived earlier to compute `velx` and `vely`. Everywhere outside the disk, all fields are set to zero.  "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "dens = np.zeros((nx,ny,nz))\n",
+      "dens[:,:,nz/2-3:nz/2+3] = (r**alpha).reshape(nx,ny,1) # the density profile of the disk\n",
+      "vel_theta = r/(1.+(r/r_0)**beta) # the azimuthal velocity profile of the disk\n",
+      "velx = np.zeros((nx,ny,nz))\n",
+      "vely = np.zeros((nx,ny,nz))\n",
+      "velx[:,:,nz/2-3:nz/2+3] = (-vel_theta*np.sin(theta)).reshape(nx,ny,1) # convert polar to cartesian\n",
+      "vely[:,:,nz/2-3:nz/2+3] = (vel_theta*np.cos(theta)).reshape(nx,ny,1) # convert polar to cartesian\n",
+      "dens[r > R] = 0.0\n",
+      "velx[r > R] = 0.0\n",
+      "vely[r > R] = 0.0"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally, we'll package these data arrays up into a dictionary, which will then be shipped off to `load_uniform_grid`. We'll define the width of the grid to be `2*R` kpc, which will be equal to 1  `code_length`. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "data = {}\n",
+      "data[\"density\"] = (dens,\"g/cm**3\")\n",
+      "data[\"velocity_x\"] = (velx, \"km/s\")\n",
+      "data[\"velocity_y\"] = (vely, \"km/s\")\n",
+      "data[\"velocity_z\"] = (np.zeros((nx,ny,nz)), \"km/s\") # zero velocity in the z-direction\n",
+      "bbox = np.array([[-0.5,0.5],[-0.5,0.5],[-0.5,0.5]]) # bbox of width 1 on a side with center (0,0,0)\n",
+      "ds = yt.load_uniform_grid(data, (nx,ny,nz), length_unit=(2*R,\"kpc\"), nprocs=1, bbox=bbox)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To get a sense of what the data looks like, we'll take a slice through the middle of the disk:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "slc = yt.SlicePlot(ds, \"z\", [\"density\",\"velocity_x\",\"velocity_y\",\"velocity_magnitude\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "slc.set_log(\"velocity_x\", False)\n",
+      "slc.set_log(\"velocity_y\", False)\n",
+      "slc.set_log(\"velocity_magnitude\", False)\n",
+      "slc.set_unit(\"velocity_magnitude\", \"km/s\")\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Which shows a rotating disk with a specific density and velocity profile. Now, suppose we wanted to look at this disk galaxy from a certain orientation angle, and simulate a 3D FITS data cube where we can see the gas that is emitting at different velocities along the line of sight. We can do this using the `PPVCube` class. First, let's assume we rotate our viewing angle 60 degrees from face-on, from along the z-axis into the y-axis. We'll create a normal vector:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "i = 60.*np.pi/180.\n",
+      "L = [0.0,np.sin(i),np.sin(i)]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Next, we need to specify a field that will serve as the \"intensity\" of the emission that we see. For simplicity, we'll simply choose the gas density as this field, though it could be any field (including derived fields) in principle. We also need to specify the dimensions of the data cube, and optionally we may choose the bounds in line-of-sight velocity that the data will be binned into. Otherwise, the bounds will simply be set to the negative and positive of the largest speed in the dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cube = PPVCube(ds, L, \"density\", dims=(200,100,50), velocity_bounds=(-1.5,1.5,\"km/s\"))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Following this, we can now write this cube to a FITS file:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cube.write_fits(\"cube.fits\", clobber=True, length_unit=(5.0,\"deg\"))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now, we'll look at the FITS dataset in yt and look at different slices along the velocity axis, which is the \"z\" axis:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "ds = yt.load(\"cube.fits\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# Specifying no center gives us the center slice\n",
+      "slc = yt.SlicePlot(ds, \"z\", [\"density\"])\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import yt.units as u\n",
+      "# Picking different velocities for the slices\n",
+      "new_center = ds.domain_center\n",
+      "new_center[2] = ds.spec2pixel(-1.0*u.km/u.s)\n",
+      "slc = yt.SlicePlot(ds, \"z\", [\"density\"], center=new_center)\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "new_center[2] = ds.spec2pixel(0.7*u.km/u.s)\n",
+      "slc = yt.SlicePlot(ds, \"z\", [\"density\"], center=new_center)\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "new_center[2] = ds.spec2pixel(-0.3*u.km/u.s)\n",
+      "slc = yt.SlicePlot(ds, \"z\", [\"density\"], center=new_center)\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we project all the emission at all the different velocities along the z-axis, we recover the entire disk:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prj = yt.ProjectionPlot(ds, \"z\", [\"density\"], proj_style=\"sum\")\n",
+      "prj.set_log(\"density\", True)\n",
+      "prj.set_zlim(\"density\", 1.0e-3, 0.2)\n",
+      "prj.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

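As a sanity check on the viewing normal used above: tilting the line of sight by an angle i from the z-axis toward the y-axis gives the unit vector (0, sin i, cos i). A short numpy verification:

    import numpy as np

    i = 60. * np.pi / 180.
    L = np.array([0.0, np.sin(i), np.cos(i)])

    print(np.linalg.norm(L))            # 1.0 -- a unit vector
    print(np.degrees(np.arccos(L[2])))  # 60.0 -- the angle from the z-axis
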
This diff is so big that we needed to truncate the remainder.

Repository URL: https://bitbucket.org/yt_analysis/yt/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.


