[yt-svn] commit/yt-doc: 104 new changesets

commits-noreply at bitbucket.org commits-noreply at bitbucket.org
Sat Nov 2 12:20:49 PDT 2013


104 new commits in yt-doc:

https://bitbucket.org/yt_analysis/yt-doc/commits/092d1a15c496/
Changeset:   092d1a15c496
User:        chummels
Date:        2013-10-28 18:31:20
Summary:     Modifying front page of the yt docs to make more succinct.
Affected #:  3 files

diff -r a059cbb57ddf355a8e311bc1816c0728351deee4 -r 092d1a15c49663209f810f1d42c54c1e50556aac source/examining/index.rst
--- /dev/null
+++ b/source/examining/index.rst
@@ -0,0 +1,4 @@
+Examining Data
+==============
+
+How to examine a dataset on disk.

diff -r a059cbb57ddf355a8e311bc1816c0728351deee4 -r 092d1a15c49663209f810f1d42c54c1e50556aac source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -2,131 +2,39 @@
 ===========
 
 yt is a community-developed analysis and visualization toolkit for
-astrophysical simulation data.  yt runs both interactively and
-non-interactively, and has been designed to support as many operations as
-possible in parallel. 
+examining datasets in a variety of scientific disciplines.  yt is developed 
+in Python under the open-source model.  yt currently supports several 
+astrophysical simulation code formats, as well as support for :ref:`loading-numpy-array`
+for unsupported data formats.  Fully-supported codes 
+include: `Enzo <http://enzo-project.org/>`_, 
+`Orion <http://flash.uchicago.edu/~rfisher/orion/>`_,
+`Nyx <https://ccse.lbl.gov/Research/NYX/index.html>`_, 
+`FLASH <http://flash.uchicago.edu/website/home/>`_, 
+Piernik NEED LINK; 
+and partially-supported codes include: 
+Castro NEED LINK, 
+ART (NMSU) NEED LINK, 
+Maestro NEED LINK, and
+RAMSES NEED LINK.
 
-yt provides full support for several simulation codes in the current release:
+yt uses a three-pronged approach to interacting with data:
 
- * `Enzo <http://enzo-project.org/>`_ 
- * Orion
- * `Nyx <https://ccse.lbl.gov/Research/NYX/index.html>`_
- * `FLASH <http://flash.uchicago.edu/website/home/>`_
- * Piernik
+ * Examine Data - Access data directly on disk with a variety of helper classes that make this task easier.
+ * Visualize Data - Generate plots, images, and movies to better understand your raw datasets.
+ * Analyze Data - Use a variety of additional analysis routines to derive physical results from your data.
 
-We also provide limited support for Castro, NMSU-ART, and Maestro.  A limited
-amount of RAMSES IO is provided, but full support  for RAMSES will not be
-completed until the 3.0 release of yt.
-
-If you use ``yt`` in a paper, you are highly encouraged to submit the
-repository containing the scripts you used to analyze and visualize your data
-to the `yt Hub <http://hub.yt-project.org/>`_, and we ask that you consider
-citing our `method paper <http://adsabs.harvard.edu/abs/2011ApJS..192....9T>`_,
-as well.  If you are looking to use ``yt``, then check out the `yt Hub
-<http://hub.yt-project.org/>`_ for ideas of how other people used ``yt`` to
-generate worthwhile analysis.  We encourage you to explore the source code and
-even consider :ref:`contributing <contributing-code>` your enhancements and
-scripts.
-
-For more information, please visit `our homepage <http://yt-project.org/>`_
-and for help, please see :ref:`asking-for-help`.
-
-.. raw:: html
-
-   <h2>Getting Started</h2>
-   <table class="contentstable" align="center">
-   
-     <tr valign="top">
-       <td width="25%">
-         <p>
-           <a href="welcome/index.html">Welcome to yt!</a>
-         </p>
-       </td>
-       <td width="75%">
-         <p class="linkdescr">What's yt all about?</p>
-       </td>
-     </tr>
-   
-     <tr valign="top">
-       <td width="25%">
-         <p>
-           <a href="orientation/index.html">yt Orientation</a>
-         </p>
-       </td>
-       <td width="75%">
-         <p class="linkdescr">Quickly get up and running with yt: zero to
-         sixty. (For sixty to seventy, see <a href="bootcamp.html">the
-         bootcamp!</a>)</p>
-       </td>
-     </tr>
-   
-     <tr valign="top">
-       <td width="25%">
-         <p>
-           <a href="help/index.html">How to Ask for Help</a>
-         </p>
-       </td>
-       <td width="75%">
-         <p class="linkdescr">Some guidelines on how and where to ask for help with yt</p>
-       </td>
-     </tr>
-   
-     <tr valign="top">
-       <td width="25%">
-         <p>
-           <a href="cookbook/index.html">The Cookbook</a>
-         </p>
-       </td>
-       <td width="75%">
-         <p class="linkdescr">A bunch of illustrated examples of how to do things</p>
-       </td>
-     </tr>
- 
-     <tr valign="top">
-       <td width="25%">
-         <p>
-           <a href="faq/index.html">FAQ</a>
-         </p>
-       </td>
-       <td width="75%">
-         <p class="linkdescr">Frequently Asked Questions: answered for you!</p>
-       </td>
-     </tr>
- 
-     <tr valign="top">
-       <td width="25%">
-         <p>
-           <a href="cheatsheet.pdf">Cheat Sheet</a>
-         </p>
-       </td>
-       <td width="75%">
-         <p class="linkdescr">A cheat sheet for yt you can print out</p>
-       </td>
-     </tr>
- 
-   </table>
- 
-   <h2>User Guide</h2>
+Documentation Highlights
+========================
 
 .. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
 
-   welcome/index
-   orientation/index
+   orientation/index 
    bootcamp
-   workshop
    help/index
-   interacting/index
-   configuration
    cookbook/index
+   examining/index
+   visualizing/index
    analyzing/index
-   visualizing/index
-   analysis_modules/index
-   advanced/index
    getting_involved/index
-   api/api   
-   field_list
-   faq/index
-   changelog
-
-
+   reference/index

diff -r a059cbb57ddf355a8e311bc1816c0728351deee4 -r 092d1a15c49663209f810f1d42c54c1e50556aac source/reference/index.rst
--- /dev/null
+++ b/source/reference/index.rst
@@ -0,0 +1,3 @@
+Reference Materials
+===================
+

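The new front page points readers at :ref:`loading-numpy-array` for unsupported
data formats.  As a minimal sketch of what that looks like in practice (assuming
the stream frontend's ``load_uniform_grid``; the array here is made up):

.. code-block:: python

  from yt.mods import *
  import numpy as np

  # A made-up 64^3 uniform-resolution Density field in a unit box.
  data = dict(Density=np.random.random((64, 64, 64)))
  pf = load_uniform_grid(data, [64, 64, 64], sim_unit_to_cm=1.0)
  print pf.h.field_list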

https://bitbucket.org/yt_analysis/yt-doc/commits/783eedc2dba9/
Changeset:   783eedc2dba9
User:        chummels
Date:        2013-10-28 18:43:28
Summary:     Finished updating links to simulation codes in front page.
Affected #:  1 file

diff -r 092d1a15c49663209f810f1d42c54c1e50556aac -r 783eedc2dba9666bc41f64ed5706256df331b411 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -10,12 +10,12 @@
 `Orion <http://flash.uchicago.edu/~rfisher/orion/>`_,
 `Nyx <https://ccse.lbl.gov/Research/NYX/index.html>`_, 
 `FLASH <http://flash.uchicago.edu/website/home/>`_, 
-Piernik NEED LINK; 
+`Piernik <http://arxiv.org/abs/0901.0104>`_;
 and partially-supported codes include: 
-Castro NEED LINK, 
-ART (NMSU) NEED LINK, 
-Maestro NEED LINK, and
-RAMSES NEED LINK.
+`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_,
+`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_,
+`Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
+`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_.
 
 yt uses a three-pronged approach to interacting with data:
 


https://bitbucket.org/yt_analysis/yt-doc/commits/941f87ecc071/
Changeset:   941f87ecc071
User:        chummels
Date:        2013-10-28 18:52:00
Summary:     Moving python references from quickstart page to python introduction
Affected #:  1 file

diff -r 783eedc2dba9666bc41f64ed5706256df331b411 -r 941f87ecc071630b1beffe072559343c254e939e source/orientation/python_introduction.rst
--- a/source/orientation/python_introduction.rst
+++ b/source/orientation/python_introduction.rst
@@ -732,3 +732,12 @@
 
 We'll now move on to talking more about how to use yt, both from a scripting
 perspective and interactively.
+
+Python and Related References
++++++++++++++++++++++++++++++
+    * `Python quickstart <http://docs.python.org/tutorial/>`_
+    * `Learn Python the Hard Way <http://learnpythonthehardway.org/index>`_
+    * `Byte of Python <http://www.swaroopch.com/notes/Python>`_
+    * `Dive Into Python <http://diveintopython.org>`_
+    * `Numpy docs <http://docs.numpy.org/>`_
+    * `Matplotlib docs <http://matplotlib.sf.net>`_


https://bitbucket.org/yt_analysis/yt-doc/commits/b2cdc65ad1fe/
Changeset:   b2cdc65ad1fe
User:        chummels
Date:        2013-10-28 20:35:20
Summary:     Updating installation instructions and moving them to top-level.
Affected #:  3 files

diff -r 941f87ecc071630b1beffe072559343c254e939e -r b2cdc65ad1fec26b822eff87f83cc5883d59747a source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -29,7 +29,7 @@
 .. toctree::
    :maxdepth: 1
 
-   orientation/index 
+   installing
    bootcamp
    help/index
    cookbook/index

diff -r 941f87ecc071630b1beffe072559343c254e939e -r b2cdc65ad1fec26b822eff87f83cc5883d59747a source/installing.rst
--- /dev/null
+++ b/source/installing.rst
@@ -0,0 +1,96 @@
+Installing yt
+-------------
+
+yt is a Python package (with some components written in C), using NumPy as a
+computation engine, Matplotlib for some visualization tasks and Mercurial for
+version control.  Because installation of all of these interlocking parts can be
+time-consuming, yt provides an installation script which downloads and builds
+a fully-isolated Python + Numpy + Matplotlib + HDF5 + Mercurial installation.  
+yt supports Linux and OSX deployment, with the possibility of deployment on 
+other Unix-like systems (XSEDE resources, clusters, etc.).  Windows is not
+supported.
+
+To get the installation script, download it from:
+
+.. code-block:: bash
+
+  http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
+
+By default, it will install an array of items, with an option to also download
+the current stable version of Enzo.  All of the script's options are listed at
+the top of the file; you should be able to open it and edit it without any
+knowledge of bash syntax.  To execute it, run:
+
+.. code-block:: bash
+
+  $ bash install_script.sh
+
+Because the installer is downloading and building a variety of packages from
+source, this will likely take a while (e.g. 20 minutes), but it will report
+its status at the command line throughout.
+
+If you receive errors during this process, the installer will provide you 
+with a large amount of information to assist in debugging your problems.  The 
+file ``yt_install.log`` will contain all of the STDOUT and STDERR from the entire
+installation process, so it is usually quite long.  By looking at the
+last few hundred lines (e.g. ``tail -500 yt_install.log``), you can often
+figure out what went wrong.  If you have problems, though, do not hesitate to
+:ref:`contact us <asking-for-help>` for assistance.
+
+Activating Your Installation
+----------------------------
+
+Once the installation has completed, the installer will print instructions on
+how to set up your shell environment using the activate script.  You must
+execute this script in order for your system to recognize yt.  Either add it
+to your login script, or execute it in each shell session before working
+with yt.
+
+.. code-block:: bash
+
+  $ source <yt installation directory>/bin/activate
+
+If you use csh or tcsh as your shell, activate that version of the script:
+
+.. code-block:: bash
+
+  $ source <yt installation directory>/bin/activate.csh
+
+If you prefer not to execute external scripts on your computer, you can set
+the shell variables manually.  ``YT_DEST`` needs to point to the root of the
+directory containing the install.  By default, this will be ``yt-<arch>``, where
+``<arch>`` is your machine's architecture (usually ``x86_64`` or ``i386``).  You
+will also need to set ``LD_LIBRARY_PATH`` and ``PYTHONPATH`` to contain
+``$YT_DEST/lib`` and ``$YT_DEST/python2.7/site-packages``, respectively.
+
+Alternative Installation Methods
+--------------------------------
+
+If you want to forgo the use of the install script, you need to make sure
+you have yt's dependencies installed on your system.  These include: a C compiler, 
+``HDF5``, ``Freetype``, ``libpng``, ``python``, ``cython``, ``numpy``, and 
+``matplotlib``.  From here, you can use ``pip`` to install yt as:
+
+.. code-block:: bash
+
+  $ pip install yt
+
+If you choose this installation method, you do not need to run the activation
+script.
+
+Testing Your Installation
+-------------------------
+
+To make sure everything is installed properly, try running yt at
+the command line:
+
+.. code-block:: bash
+
+  $ yt --help
+
+If this works, you should get a list of the various command-line options for
+yt, which means you have successfully installed yt.  Congratulations!  
+
+If you get an error, follow the instructions it gives you to debug the problem.  
+Do not hesitate to :ref:`contact us <asking-for-help>` so we can help you
+figure it out.

diff -r 941f87ecc071630b1beffe072559343c254e939e -r b2cdc65ad1fec26b822eff87f83cc5883d59747a source/orientation/installing.rst
--- a/source/orientation/installing.rst
+++ /dev/null
@@ -1,53 +0,0 @@
-Installing yt
--------------
-
-yt is a Python package (with some components written in C), using NumPy as a
-computation engine, Matplotlib for some visualization tasks and Mercurial for
-version control.  Installing all of these components can be a daunting task,
-particularly as the Python ecosystem of packages is rapidly evolving.  Frankly,
-one of the *last* things a computational scientist wants to do is to install a
-bunch of packages and deal with the interlocking parts, when really the goal is
-to just simply look at some data.
-
-To that end, the yt project provides an installation script for the toolchain
-upon which yt builds, which contains a fully-isolated Python + Numpy +
-Matplotlib + HDF5 + Mercurial installation.  This installation script has been
-tested on most of the Teragrid as well as on a number of private clusters and
-Linux and OS X machines; in fact, if it doesn't work, that's considered a bug
-and we would endeavor to fix it.  yt supports Linux and OSX deployment, with
-the possibility of deployment on other Unix-like systems.  Windows is not
-supported.
-
-To get the installation script, download it from:
-
-.. code-block:: bash
-
-  http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
-
-By default, it will install an array of items, with an option to also download
-the current stable version of Enzo.  The script has all its options at the top
-of the script; you should be able to open it and edit it without any knowledge
-of bash syntax.
-
-.. code-block:: bash
-
-  $ bash install_script.sh
-
-It will start out by telling you a little bit about itself and what it's
-installing, and then continue on for some time while it downloads, builds, and
-installs (into an isolated directory) everything you need to run yt.
-
-Once it has completed, there will be instructions on how to set up your shell
-environment to use yt.  **You should follow these, or else yt may not work, or
-may simply fail -- in unexpected ways!**
-
-One thing that we will use for the rest of the orientation is the environment
-variable ``YT_DEST``, which is output at the end of the installation process.
-If you use the ``activate`` script as described in the instructions printed by
-the install script, you will be all set.
-
-If you'd like to do it manually, ``YT_DEST`` needs to point to the root of the
-directory containing the install. By default, this will be ``yt-<arch>``, where
-``<arch>`` is your machine's architecture (usually ``x86_64`` or ``i386``). You will also
-need to set ``LD_LIBRARY_PATH`` and ``PYTHONPATH`` to contain ``$YT_PATH/lib`` and ``$YT_DEST/python2.7/site-packages``, respectively.
-


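Since the manual alternative above hinges on ``YT_DEST``, ``LD_LIBRARY_PATH``,
and ``PYTHONPATH`` being set correctly, here is a small hedged sanity check from
Python (it assumes nothing beyond the standard library and a working install):

.. code-block:: python

  import os

  # Report the variables the manual setup asks you to define.
  for var in ("YT_DEST", "LD_LIBRARY_PATH", "PYTHONPATH"):
      print var, "=", os.environ.get(var, "(not set)")

  # If the environment is right, this import should succeed.
  from yt.mods import *
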
https://bitbucket.org/yt_analysis/yt-doc/commits/9b17ad5d2f5e/
Changeset:   9b17ad5d2f5e
User:        MatthewTurk
Date:        2013-10-28 19:39:36
Summary:     Adding in the IPython notebooks from the Bootcamp.
Affected #:  6 files

diff -r 941f87ecc071630b1beffe072559343c254e939e -r 9b17ad5d2f5e6a8dceca95d9ccb0c29275a344a8 source/bootcamp/Data Inspection.ipynb
--- /dev/null
+++ b/source/bootcamp/Data Inspection.ipynb	
@@ -0,0 +1,396 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Starting Out and Loading Data\n",
+      "\n",
+      "We're going to get started by loading up yt.  This next command brings all of the libraries into memory and sets up our environment.  Note that in most scripts, you will want to import from ``yt.mods`` rather than ``yt.imods``.  But using ``yt.imods`` gets you some nice stuff for the IPython notebook, which we'll use below."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now that we've loaded yt, we can load up some data.  Let's load the `IsolatedGalaxy` dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Fields and Facts\n",
+      "\n",
+      "When you call the `load` function, yt tries to do very little -- this is designed to be a fast operation, just setting up some information about the simulation.  Now, the first time you access the \"hierarchy\" (shorthand is `.h`) it will read and load the mesh and then determine where data is placed in the physical domain and on disk.  Once it knows that, yt can tell you some statistics about the simulation:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.print_stats()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also tell you the fields it found on disk:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.field_list"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "And, all of the fields it thinks it knows how to generate:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.derived_field_list"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also transparently generate fields.  However, we encourage you to examine exactly what yt is doing when it generates those fields.  To see, you can ask for the source of a given field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.field_info[\"VorticityX\"].get_source()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt stores information about the domain of the simulation:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.domain_width"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also convert this into various units:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.domain_width * pf[\"kpc\"]\n",
+      "print pf.domain_width * pf[\"au\"]\n",
+      "print pf.domain_width * pf[\"miles\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Mesh Structure\n",
+      "\n",
+      "If you're using a simulation type that has grids (for instance, here we're using an Enzo simulation) you can examine the structure of the mesh.  For the most part, you probably won't have to use this unless you're debugging a simulation or examining in detail what is going on."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.grid_left_edge"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "But, you may have to access information about individual grid objects!  Each grid object mediates accessing data from the disk and has a number of attributes that tell you about it.  The hierarchy (`pf.h` here) has an attribute `grids` which is all of the grid objects."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.grids[0]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g = pf.h.grids[0]\n",
+      "print g"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Grids have dimensions, extents, level, and even a list of Child grids."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.ActiveDimensions"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.LeftEdge, g.RightEdge"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.Level"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.Children"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Advanced Grid Inspection\n",
+      "\n",
+      "If we want to examine grids only at a given level, we can!  Not only that, but we can load data and take a look at various fields.\n",
+      "\n",
+      "*This section can be skipped!*"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "gs = pf.h.select_grids(pf.h.max_level)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g2 = gs[0]\n",
+      "print g2\n",
+      "print g2.Parent\n",
+      "print g2.get_global_startindex()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print g2[\"Density\"][:,:,0]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print (g2.Parent.child_mask == 0).sum() * 8\n",
+      "print g2.ActiveDimensions.prod()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for f in pf.h.field_list:\n",
+      "    fv = g[f]\n",
+      "    if fv.size == 0: continue\n",
+      "    print f, fv.min(), fv.max()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for f in sorted(pf.h.field_list):\n",
+      "    fv = g[f]\n",
+      "    if fv.size == 0: continue\n",
+      "    print f, fv.min(), fv.max()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Examining Data in Regions\n",
+      "\n",
+      "yt provides data object selectors.  In subsequent notebooks we'll examine these in more detail, but we can select a sphere of data and perform a number of operations on it.  yt makes it easy to operate on fluid fields in an object in *bulk*, but you can also examine individual field values.\n",
+      "\n",
+      "This creates a sphere selector positioned at the most dense point in the simulation that has a radius of 10 kpc."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp = pf.h.sphere(\"max\", (10, 'kpc'))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can calculate a bunch of bulk quantities.  Here's that list, but there's a list in the docs, too!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp.quantities.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Let's look at the total mass.  This is how you call a given quantity.  yt calls these \"Derived Quantities\".  We'll talk about a few in a later notebook."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp.quantities[\"TotalMass\"]()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

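The notebook above mentions that you can examine individual field values in a
data object, not just bulk quantities, but only demonstrates the latter.  A
minimal sketch of direct field access, reconstructing the sphere from the
notebook:

.. code-block:: python

  from yt.mods import *

  pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
  sp = pf.h.sphere("max", (10, 'kpc'))
  # Field access on a data object returns a flat array of cell values.
  print sp["Density"].shape
  print sp["Density"].min(), sp["Density"].max()
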
diff -r 941f87ecc071630b1beffe072559343c254e939e -r 9b17ad5d2f5e6a8dceca95d9ccb0c29275a344a8 source/bootcamp/Data Objects and Time Series.ipynb
--- /dev/null
+++ b/source/bootcamp/Data Objects and Time Series.ipynb	
@@ -0,0 +1,361 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Data Objects and Time Series Data\n",
+      "\n",
+      "Just like before, we will load up yt."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Time Series Data\n",
+      "\n",
+      "Unlike before, instead of loading a single dataset, this time we'll load a bunch which we'll examine in sequence.  This command creates a `TimeSeriesData` object, which can be iterated over (including in parallel, which is outside the scope of this bootcamp) and analyzed.  There are some other helpful operations it can provide, but we'll stick to the basics here.\n",
+      "\n",
+      "Note that you can specify either a list of filenames, or a glob (i.e., asterisk) pattern in this."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "ts = TimeSeriesData.from_filenames(\"enzo_tiny_cosmology/*/*.hierarchy\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Example 1: Simple Time Series\n",
+      "\n",
+      "As a simple example of how we can use this functionality, let's find the min and max of the density as a function of time in this simulation.  To do this we use the construction `for pf in ts` where `pf` means \"Parameter File\" and `ts` is the \"Time Series\" we just loaded up.  For each parameter file, we'll create an object (`dd`) that covers the entire domain.  (`all_data` is a shorthand function for this.)  We'll then call the Derived Quantity `Extrema`, and append the min and max to our extrema outputs."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "rho_ex = []\n",
+      "times = []\n",
+      "for pf in ts:\n",
+      "    dd = pf.h.all_data()\n",
+      "    rho_ex.append(dd.quantities[\"Extrema\"](\"Density\")[0])\n",
+      "    times.append(pf.current_time * pf[\"years\"])\n",
+      "rho_ex = np.array(rho_ex)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we plot the minimum and the maximum:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.semilogy(times, rho_ex[:,0], '-xk')\n",
+      "pylab.semilogy(times, rho_ex[:,1], '-xr')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Example 2: Advanced Time Series\n",
+      "\n",
+      "Let's do something a bit different.  Let's calculate the total mass inside halos and outside halos.\n",
+      "\n",
+      "This actually touches a lot of different pieces of machinery in yt.  For every parameter file, we will run the halo finder HOP.  Then, we calculate the total mass in the domain.  Then, for each halo, we calculate the sum of the baryon mass in that halo.  We'll keep running tallies of these two things."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "mass = []\n",
+      "zs = []\n",
+      "for pf in ts:\n",
+      "    halos = HaloFinder(pf)\n",
+      "    dd = pf.h.all_data()\n",
+      "    total_mass = dd.quantities[\"TotalQuantity\"](\"CellMassMsun\")[0]\n",
+      "    total_in_baryons = 0.0\n",
+      "    for halo in halos:\n",
+      "        sp = halo.get_sphere()\n",
+      "        total_in_baryons += sp.quantities[\"TotalQuantity\"](\"CellMassMsun\")[0]\n",
+      "    mass.append(total_in_baryons/total_mass)\n",
+      "    zs.append(pf.current_redshift)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now let's plot them!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.loglog(zs, mass, '-xb')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Data Objects\n",
+      "\n",
+      "Time series data have many applications, but most of them rely on examining the underlying data in some way.  Below, we'll see how to use and manipulate data objects.\n",
+      "\n",
+      "### Ray Queries\n",
+      "\n",
+      "yt provides the ability to examine rays, or lines, through the domain.  Note that these are not periodic, unlike most other data objects.  We create a ray object and can then examine quantities of it.  Rays have the special fields `t` and `dts`, which correspond to the time the ray enters a given cell and the distance it travels through that cell.\n",
+      "\n",
+      "To create a ray, we specify the start and end points."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "ray = pf.h.ray([0.1, 0.2, 0.3], [0.9, 0.8, 0.7])\n",
+      "pylab.semilogy(ray[\"t\"], ray[\"Density\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"dts\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"t\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"x\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Slice Queries\n",
+      "\n",
+      "While slices are often used for visualization, they can be useful for other operations as well.  yt regards slices as multi-resolution objects.  They are an array of cells that are not all the same size; it only returns the cells at the highest resolution that it intersects.  (This is true for all yt data objects.)  Slices and projections have the special fields `px`, `py`, `pdx` and `pdy`, which correspond to the coordinates and half-widths in the pixel plane."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "v, c = pf.h.find_max(\"Density\")\n",
+      "sl = pf.h.slice(0, c[0])\n",
+      "print sl[\"x\"], sl[\"z\"], sl[\"pdx\"]\n",
+      "print sl[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to do something interesting with a Slice, we can turn it into a `FixedResolutionBuffer`.  This object can be queried and will return a 2D array of values."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "frb = sl.to_frb((50.0, 'kpc'), 1024)\n",
+      "print frb[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt provides a few functions for writing arrays to disk, particularly in image form.  Here we'll write out the log of Density, and then use IPython to display it back here.  Note that for the most part, you will probably want to use a `PlotWindow` for this, but in the case that it is useful you can directly manipulate the data."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "write_image(np.log10(frb[\"Density\"]), \"temp.png\")\n",
+      "from IPython.core.display import Image\n",
+      "Image(filename = \"temp.png\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Off-Axis Slices\n",
+      "\n",
+      "yt provides not only slices, but off-axis slices that are sometimes called \"cutting planes.\"  These are specified by (in order) a normal vector and a center.  Here we've set the normal vector to `[0.2, 0.3, 0.5]` and the center to be the point of maximum density.\n",
+      "\n",
+      "We can then turn these directly into plot windows using `to_pw`.  Note that the `to_pw` and `to_frb` methods are available on slices, off-axis slices, and projections, and can be used on any of them."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cp = pf.h.cutting([0.2, 0.3, 0.5], \"max\")\n",
+      "pw = cp.to_pw(fields = [\"Density\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once we have our plot window from our cutting plane, we can show it here."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pw.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can, as noted above, do the same with our slice:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pws = sl.to_pw(fields=[\"Density\"])\n",
+      "pws.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Covering Grids\n",
+      "\n",
+      "If we want to access a 3D array of data that spans multiple resolutions in our simulation, we can use a covering grid.  This will return a 3D array of data, drawing from up to the resolution level specified when creating the data.  For example, if you create a covering grid that spans two child grids of a single parent grid, it will fill those zones covered by a zone of a child grid with the data from that child grid.  Where it is covered only by the parent grid, the cells from the parent grid will be duplicated (appropriately) to fill the covering grid.\n",
+      "\n",
+      "There are two different types of covering grids: unsmoothed and smoothed.  Smoothed grids will be filled through a cascading interpolation process; they will be filled at level 0, interpolated to level 1, filled at level 1, interpolated to level 2, filled at level 2, etc.  This will help to reduce edge effects.  Unsmoothed covering grids will not be interpolated, but rather values will be duplicated multiple times.\n",
+      "\n",
+      "Here we create an unsmoothed covering grid at level 2, with the left edge at `[0.0, 0.0, 0.0]` and with dimensions equal to those that would cover the entire domain at level 2.  We can then ask for the Density field, which will be a 3D array."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cg = pf.h.covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "print cg[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this example, we do exactly the same thing: except we ask for a *smoothed* covering grid, which will reduce edge effects."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "scg = pf.h.smoothed_covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "print scg[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

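The notebook above notes that ``TimeSeriesData.from_filenames`` accepts either
a glob pattern or an explicit list of filenames, but only exercises the glob
form.  A short sketch of the list form (the output names below are
illustrative, not taken from the datasets):

.. code-block:: python

  from yt.mods import *

  # Illustrative filenames; substitute outputs you actually have on disk.
  fns = ["enzo_tiny_cosmology/DD0010/DD0010",
         "enzo_tiny_cosmology/DD0020/DD0020"]
  ts = TimeSeriesData.from_filenames(fns)
  for pf in ts:
      print pf, pf.current_time
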
diff -r 941f87ecc071630b1beffe072559343c254e939e -r 9b17ad5d2f5e6a8dceca95d9ccb0c29275a344a8 source/bootcamp/Derived Fields and Profiles.ipynb
--- /dev/null
+++ b/source/bootcamp/Derived Fields and Profiles.ipynb	
@@ -0,0 +1,304 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Derived Fields and Profiles\n",
+      "\n",
+      "One of the most powerful features in yt is the ability to create derived fields that act and look exactly like fields that exist on disk.  This means that they will be generated on demand and can be used anywhere a field that exists on disk would be used.  Additionally, you can create them by just writing python functions."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Derived Fields\n",
+      "\n",
+      "This is an example of the simplest possible way to create a derived field.  All derived fields are defined by a function and some metadata; that metadata can include units, LaTeX-friendly names, conversion factors, and so on.  Fields can be defined in the way in the next cell.  What this does is create a function which accepts two arguments and then provide the units for that field.  In this case, our field is `Dinosaurs` and our units are `Trex/s`.  The function itself can access any fields that are in the simulation, and it does so by requesting data from the object called `data`."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "@derived_field(units = \"Trex/s\")\n",
+      "def Dinosaurs(field, data):\n",
+      "    return data[\"Density\"]**(2.0/3.0) * data[\"VelocityMagnitude\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "One important thing to note is that derived fields must be defined *before* any datasets are loaded.  Let's load up our data and take a look at some quantities."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "dd = pf.h.all_data()\n",
+      "print dd.quantities.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "One interesting question is, what are the minimum and maximum values of dinosaur production rates in our isolated galaxy?  We can do that by examining the `Extrema` quantity -- the exact same way that we would for Density, Temperature, and so on."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print dd.quantities[\"Extrema\"](\"Dinosaurs\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can do the same for the average quantities as well."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print dd.quantities[\"WeightedAverageQuantity\"](\"Dinosaurs\", weight=\"Temperature\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## A Few Other Quantities\n",
+      "\n",
+      "We can ask other quantities of our data, as well.  For instance, this sequence of operations will find the most dense point, center a sphere on it, calculate the bulk velocity of that sphere, calculate the baryonic angular momentum vector, and then the density extrema.  All of this is done in a memory conservative way: if you have an absolutely enormous dataset, yt will split that dataset into pieces, apply intermediate reductions and then a final reduction to calculate your quantity."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp = pf.h.sphere(\"max\", (10.0, 'kpc'))\n",
+      "bv = sp.quantities[\"BulkVelocity\"]()\n",
+      "L = sp.quantities[\"AngularMomentumVector\"]()\n",
+      "(rho_min, rho_max), = sp.quantities[\"Extrema\"](\"Density\")\n",
+      "print bv, L, rho_min, rho_max"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Profiles\n",
+      "\n",
+      "yt provides the ability to bin in 1, 2 and 3 dimensions.  This means discretizing in one or more dimensions of phase space (density, temperature, etc) and then calculating either the total value of a field in each bin or the average value of a field in each bin.\n",
+      "\n",
+      "We do this using the objects `BinnedProfile1D`, `BinnedProfile2D`, and `BinnedProfile3D`.  The first two are the most common since they are the easiest to visualize.\n",
+      "\n",
+      "This first set of commands manually creates a `BinnedProfile1D` from the sphere we created earlier, binned in 32 bins according to density between `rho_min` and `rho_max`, and then takes the Density-weighted average of the fields `Temperature` and (previously-defined) `Dinosaurs`.  We then plot it in a loglog plot."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof = BinnedProfile1D(sp, 32, \"Density\", rho_min, rho_max)\n",
+      "prof.add_fields([\"Temperature\", \"Dinosaurs\"], weight=\"Density\")\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"Temperature\"], \"-x\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we plot the `Dinosaurs` field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.loglog(prof[\"Density\"], prof[\"Dinosaurs\"], '-x')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to see the total mass in every bin, we add the `CellMassMsun` field with no weight.  Specifying `weight=None` will simply take the total value in every bin and add that up."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof.add_fields([\"CellMassMsun\"], weight=None)\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"CellMassMsun\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can also specify accumulation, which sums all the bins, from left to right.  Note that for 2D and 3D profiles, this needs to be a tuple of length 2 or 3."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof.add_fields([\"CellMassMsun\"], weight=None, accumulation=True)\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"CellMassMsun\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Advanced Derived Fields\n",
+      "\n",
+      "*This section can be skipped!*\n",
+      "\n",
+      "You can also define fields that require extra zones.  This is useful, for instance, if you want to take the average, or apply a stencil.  yt provides fields like `DivV` that do this internally.  This example is a very busy example of how to do it.  You need to specify the validator `ValidateSpatial` with the number of extra zones *on each side* of the grid that you need, and then inside your function you need to return a field *with those zones stripped off*.  So by necessity, the arrays returned by `data[something]` will have larger spatial extent than what should be returned by the function itself.  If you specify that you need 0 extra zones, this will also work and will simply supply a `grid` object for the field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "@derived_field(name = \"AveragedTemperature\",\n",
+      "               validators = [ValidateSpatial(1)],\n",
+      "               units = r\"K\")\n",
+      "def _AveragedTemperature(field, data):\n",
+      "    nx, ny, nz = data[\"Temperature\"].shape\n",
+      "    new_field = na.zeros((nx-2,ny-2,nz-2), dtype='float64')\n",
+      "    weight_field = na.zeros((nx-2,ny-2,nz-2), dtype='float64')\n",
+      "    i_i, j_i, k_i = na.mgrid[0:3,0:3,0:3]\n",
+      "    for i,j,k in zip(i_i.ravel(),j_i.ravel(),k_i.ravel()):\n",
+      "        sl = [slice(i,nx-(2-i)),slice(j,ny-(2-j)),slice(k,nz-(2-k))]\n",
+      "        new_field += data[\"Temperature\"][sl] * data[\"CellMass\"][sl]\n",
+      "        weight_field += data[\"CellMass\"][sl]\n",
+      "    # Now some fancy footwork\n",
+      "    new_field2 = na.zeros((nx,ny,nz))\n",
+      "    new_field2[1:-1,1:-1,1:-1] = new_field/weight_field\n",
+      "    return new_field2"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now, once again, we can access `AveragedTemperature` just like any other field.  Note that because it requires ghost zones, this will be a much slower process!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "dd = pf.h.all_data()\n",
+      "(tmin, tmax), (atmin, atmax) = dd.quantities[\"Extrema\"]([\"Temperature\", \"AveragedTemperature\"])\n",
+      "print tmin, tmax, atmin, atmax\n",
+      "print tmin / atmin, tmax / atmax"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Field Parameters\n",
+      "\n",
+      "Field parameters are a method of passing information to derived fields.  For instance, you might pass in information about a vector you want to use as a basis for a coordinate transformation.  yt often uses things like `bulk_velocity` to identify velocities that should be subtracted off.  Here we show how that works:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp_small = pf.h.sphere(\"max\", (1.0, 'kpc'))\n",
+      "bv = sp_small.quantities[\"BulkVelocity\"]()\n",
+      "\n",
+      "sp = pf.h.sphere(\"max\", (0.1, 'mpc'))\n",
+      "rv1 = sp.quantities[\"Extrema\"](\"RadialVelocity\")\n",
+      "\n",
+      "sp.clear_data()\n",
+      "sp.set_field_parameter(\"bulk_velocity\", bv)\n",
+      "rv2 = sp.quantities[\"Extrema\"](\"RadialVelocity\")\n",
+      "\n",
+      "print bv\n",
+      "print rv1\n",
+      "print rv2"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

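The notebook above introduces ``BinnedProfile1D``, ``BinnedProfile2D``, and
``BinnedProfile3D`` but only exercises the 1D form.  A hedged sketch of a 2D
phase profile, mirroring the 1D call; the bin bounds here are placeholders,
and in practice you would take them from ``dd.quantities["Extrema"]`` as the
notebook does:

.. code-block:: python

  from yt.mods import *

  pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
  sp = pf.h.sphere("max", (10.0, 'kpc'))
  # 32x32 bins in (Density, Temperature), both log-spaced; the bounds
  # below are placeholders for this sketch.
  prof2d = BinnedProfile2D(sp, 32, "Density", 1e-28, 1e-24, True,
                           32, "Temperature", 1e1, 1e8, True)
  prof2d.add_fields(["CellMassMsun"], weight=None)
  print prof2d["CellMassMsun"].shape
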
diff -r 941f87ecc071630b1beffe072559343c254e939e -r 9b17ad5d2f5e6a8dceca95d9ccb0c29275a344a8 source/bootcamp/Introduction.ipynb
--- /dev/null
+++ b/source/bootcamp/Introduction.ipynb
@@ -0,0 +1,93 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Welcome to the yt bootcamp!\n",
+      "\n",
+      "In this brief tutorial, we'll go over how to load up data, analyze things, inspect your data, and make some visualizations.\n",
+      "\n",
+      "But, before we begin, there are a few places to go if you run into trouble.\n",
+      "\n",
+      "**The yt homepage is at http://yt-project.org/**\n",
+      "\n",
+      "## Source of Help\n",
+      "\n",
+      "There are three places to check for help:\n",
+      "\n",
+      " * The documentation: http://yt-project.org/doc/\n",
+      " * The IRC Channel (`#yt` on `chat.freenode.net`, also at http://yt-project.org/irc.html)\n",
+      " * The `yt-users` mailing list, at http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org\n",
+      "\n",
+      "## Sources of Information\n",
+      "\n",
+      "The first place to go for information about any kind of development is BitBucket at https://bitbucket.org/yt_analysis/yt/ , which contains a bug tracker, the source code, and links to other useful places.\n",
+      "\n",
+      "You can find recipes in the documentation ( http://yt-project.org/doc/ ) under the \"Cookbook\" section.\n",
+      "\n",
+      "There is a portal with access to data and IPython notebooks at http://hub.yt-project.org/ .\n",
+      "\n",
+      "## How to Update yt\n",
+      "\n",
+      "If you ever run into a situation where you need to update your yt installation, simply type this on the command line:\n",
+      "\n",
+      "`yt update`\n",
+      "\n",
+      "This will automatically update it for you.\n",
+      "\n",
+      "## Acquiring the datasets for this tutorial\n",
+      "\n",
+      "To access the datasets that are used in these bootcamp tutorials, you can either download them manually at http://yt-project.org/data/, or run this next cell by pressing `Shift-Enter` inside it.  It may take a few minutes.\n",
+      "\n",
+      "## What's Next?\n",
+      "\n",
+      "The Notebooks are meant to be explored in this order:\n",
+      "\n",
+      "1. Introduction\n",
+      "2. Data Inspection\n",
+      "3. Simple Visualization\n",
+      "4. Data Objects and Time Series\n",
+      "5. Derived Fields and Profiles\n",
+      "6. Volume Rendering"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "!curl -sSO http://yt-project.org/data/enzo_tiny_cosmology.tar\n",
+      "print \"Got enzo_tiny_cosmology\"\n",
+      "!tar xf enzo_tiny_cosmology.tar\n",
+      "!curl -sSO http://yt-project.org/data/Enzo_64.tar\n",
+      "print \"Got Enzo_64\"\n",
+      "!tar xf Enzo_64.tar\n",
+      "!curl -sSO http://yt-project.org/data/IsolatedGalaxy.tar\n",
+      "print \"Got IsolatedGalaxy\"\n",
+      "!tar xf IsolatedGalaxy.tar\n",
+      "print \"All done!\""
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 941f87ecc071630b1beffe072559343c254e939e -r 9b17ad5d2f5e6a8dceca95d9ccb0c29275a344a8 source/bootcamp/Simple Visualization.ipynb
--- /dev/null
+++ b/source/bootcamp/Simple Visualization.ipynb	
@@ -0,0 +1,285 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Simple Visualizations of Data\n",
+      "\n",
+      "Just like in our first notebook, we have to load yt and then some data."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For this notebook, we'll load up a cosmology dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "print \"Redshift =\", pf.current_redshift"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In the terms that yt uses, a projection is a line integral through the domain.  This can either be unweighted (in which case a column density is returned) or weighted, in which case an average value is returned.  Projections are, like all other data objects in yt, full-fledged data objects that churn through data and present that to you.  However, we also provide a simple method of creating Projections and plotting them in a single step.  This is called a Plot Window, here specifically known as a `ProjectionPlot`.  One thing to note is that in yt, we project all the way through the entire domain at a single time.  This means that the first call to projecting can be somewhat time consuming, but panning, zooming and plotting are all quite fast.\n",
+      "\n",
+      "yt is designed to make it easy to make nice plots and straightforward to modify those plots directly.  The cookbook in the documentation includes detailed examples of this."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p = ProjectionPlot(pf, \"y\", \"Density\")\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The `show` command simply sends the plot to the IPython notebook.  You can also call `p.save()` which will save the plot to the file system.  This function accepts an argument, which will be pre-prended to the filename and can be used to name it based on the width or to supply a location.\n",
+      "\n",
+      "Now we'll zoom and pan a bit."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(2.0)\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.pan_rel((0.1, 0.0))\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(10.0)\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.pan_rel((-0.25, -0.5))\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(0.1)\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we specify multiple fields, each time we call `show` we get multiple plots back.  Same for `save`!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p = ProjectionPlot(pf, \"z\", [\"Density\", \"Temperature\"], weight_field=\"Density\")\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can adjust the colormap on a field-by-field basis."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.set_cmap(\"Temperature\", \"hot\")\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "And, we can re-center the plot on different locations.  One possible use of this would be to make a single `ProjectionPlot` which you move around to look at different regions in your simulation, saving at each one."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "v, c = pf.h.find_max(\"Density\")\n",
+      "p.set_center(c)\n",
+      "p.zoom(10)\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Okay, let's load up a bigger simulation (from `Enzo_64` this time) and make a slice plot."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"Enzo_64/DD0043/data0043\")\n",
+      "s = SlicePlot(pf, \"z\", [\"Density\", \"VelocityMagnitude\"], center=\"max\")\n",
+      "s.set_cmap(\"VelocityMagnitude\", \"kamae\")\n",
+      "s.zoom(10.0)\n",
+      "s.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can adjust the logging of various fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.set_log(\"VelocityMagnitude\", True)\n",
+      "s.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt provides many different annotations for your plots.  You can see all of these in the documentation, or if you type `s.annotate_` and press tab, a list will show up here.  We'll annotate with velocity arrows."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.annotate_velocity()\n",
+      "s.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Contours can also be overlaid:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s = SlicePlot(pf, \"x\", [\"Density\"], center=\"max\")\n",
+      "s.annotate_contour(\"Temperature\")\n",
+      "s.zoom(2.5)\n",
+      "s.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally, we can save out to the file system."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.save()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 941f87ecc071630b1beffe072559343c254e939e -r 9b17ad5d2f5e6a8dceca95d9ccb0c29275a344a8 source/bootcamp/Volume Rendering.ipynb
--- /dev/null
+++ b/source/bootcamp/Volume Rendering.ipynb	
@@ -0,0 +1,95 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# A Brief Demo of Volume Rendering\n",
+      "\n",
+      "This shows a small amount of volume rendering.  Really, just enough to get your feet wet!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To create a volume rendering, we need a camera and a transfer function.  We'll use the `ColorTransferFunction`, which accepts (in log space) the minimum and maximum bounds of our transfer function.  This means behavior for data outside these values is undefined.\n",
+      "\n",
+      "We then add on \"layers\" like an onion.  The `add_layers` function can accept a width (here specified) in data units, and also a color map.  Here we add on four layers.\n",
+      "\n",
+      "Finally, we create a camera.  The focal point is `[0.5, 0.5, 0.5]`, the width is 20 kpc (including front-to-back integration) and we specify a transfer function.  Once we've done that, we call `show` to actually cast our rays and display them inline."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "tf = ColorTransferFunction((-28, -24))\n",
+      "tf.add_layers(4, w=0.01)\n",
+      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 20.0/pf['kpc'], 512, tf)\n",
+      "cam.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to apply a clipping, we can specify the `clip_ratio`.  This will clip the upper bounds to this value times the `std()` of the image array."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cam.show(clip_ratio=4)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "There are several other options we can specify.  Note that here we have turned on the use of ghost zones, shortened the data interval for the transfer function, and widened our Gaussian layers."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "tf = ColorTransferFunction((-28, -25))\n",
+      "tf.add_layers(4, w=0.03)\n",
+      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 20.0/pf['kpc'], 512, tf, no_ghost=False)\n",
+      "cam.show(clip_ratio=4.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file


https://bitbucket.org/yt_analysis/yt-doc/commits/559912db681f/
Changeset:   559912db681f
User:        MatthewTurk
Date:        2013-10-28 19:45:28
Summary:     Removing lots of stuff from the config section.
Affected #:  1 file

diff -r 9b17ad5d2f5e6a8dceca95d9ccb0c29275a344a8 -r 559912db681f8fc681e70f00c31b546e9d56532a source/configuration.rst
--- a/source/configuration.rst
+++ b/source/configuration.rst
@@ -74,39 +74,22 @@
 Available Configuration Options
 -------------------------------
 
-The following parameters are available.
+The following external parameters are available.  A number of parameters are
+used internally.
 
-* ``__parallel`` (default: ``'False'``): Internal parameter governing whether this
-  run is being executed in parallel or not.  Can be read from.
-* ``__parallel_rank`` (default: ``'0'``): Internal parameter governing the rank of
-  the current processor.  Can be read from.
-* ``__parallel_size`` (default: ``'1'``): Internal parameter governing the size of
-  the parallel job.  Can be read from.
 * ``coloredlogs`` (default: ``'False'``): Should logs be colored?
-* ``inline`` (default: ``'False'``): Internal parameter indicating whether this
-  session is being called from within a running simulation code.
 * ``loadfieldplugins`` (default: ``'True'``): Do we want to load the plugin file?
+* ``pluginfilename`` (default: ``'my_plugins.py'``): The name of our plugin file.
 * ``logfile`` (default: ``'False'``): Should we output to a log file in the
   filesystem?
 * ``loglevel`` (default: ``'20'``): What is the threshold (0 to 50) for outputting
   log files?
-* ``maximumstoredpfs`` (default: ``'500'``): How many parameter files should be
-  tracked between sessions?
 * ``notebook_password`` (default: empty): If set, this will be fed to the
   IPython notebook created by ``yt notebook``.  Note that this should be an
   sha512 hash, not a plaintext password.  Starting ``yt notebook`` with no
   setting will provide instructions for setting this.
-* ``onlydeserialize`` (default: ``'False'``): If true, only pull from .yt files,
-  never add to them.
-* ``parameterfilestore`` (default: ``'parameter_files.csv'``): The name of the file
-  in ``$HOME/.yt/`` in which parameter files will be tracked.
-* ``pluginfilename``  (default ``'my_plugins.py'``) The name of our plugin file.
 * ``serialize`` (default: ``'True'``): Are we allowed to write to the ``.yt`` file?
 * ``sketchfab_api_key`` (default: empty): API key for http://sketchfab.com/ for
   uploading AMRSurface objects.
-* ``storeparameterfiles`` (default: ``'False'``): Should we track parameter files
-  between sessions?
 * ``suppressStreamLogging`` (default: ``'False'``): If true, execution mode will be
   quiet.
-* ``timefunctions`` (default: ``'False'``): Should (some) functions report their
-  runtime?


https://bitbucket.org/yt_analysis/yt-doc/commits/ac4cf29c2667/
Changeset:   ac4cf29c2667
User:        MatthewTurk
Date:        2013-10-28 19:48:53
Summary:     Moving around some documents.
Affected #:  9 files

diff -r 559912db681f8fc681e70f00c31b546e9d56532a -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 source/analyzing/index.rst
--- a/source/analyzing/index.rst
+++ b/source/analyzing/index.rst
@@ -4,10 +4,8 @@
 .. toctree::
    :maxdepth: 2
 
-   loading_data
    objects
    particles
    creating_derived_fields
    generating_processed_data
    time_series_analysis
-   low_level_inspection

diff -r 559912db681f8fc681e70f00c31b546e9d56532a -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 source/analyzing/loading_data.rst
--- a/source/analyzing/loading_data.rst
+++ /dev/null
@@ -1,336 +0,0 @@
-.. _loading-data:
-
-Loading Data
-============
-
-This section contains information on how to load data into ``yt``, as well as
-some important caveats about different data formats.
-
-.. _loading-numpy-array:
-
-Generic Array Data
-------------------
-
-Even if your data is not strictly related to fields commonly used in
-astrophysical codes or your code is not supported yet, you can still feed it to
-``yt`` to use its advanced visualization and analysis facilities. The only
-requirement is that your data can be represented as one or more uniform,
-three-dimensional NumPy arrays. Assuming that you have your data in ``arr``,
-the following code:
-
-.. code-block:: python
-
-   import numpy as np
-   from yt.frontends.stream.api import load_uniform_grid
-
-   data = dict(Density = arr)
-   bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
-   pf = load_uniform_grid(data, arr.shape, 3.08e24, bbox=bbox, nprocs=12)
-
-will create a ``yt``-native parameter file ``pf`` that treats your array as a
-density field in a cubic domain 3 Mpc on a side (3 * 3.08e24 cm) and
-simultaneously divides the domain into 12 chunks, so that you can take
-advantage of the underlying parallelism.
-
-Particle fields are detected as one-dimensional fields. The number of
-particles is set by the ``number_of_particles`` key in
-``data``. Particle fields are then added as one-dimensional arrays in
-a similar manner to the three-dimensional grid fields:
-
-.. code-block:: python
-
-   import numpy as np
-   from yt.frontends.stream.api import load_uniform_grid
-
-   data = dict(Density = dens,
-               number_of_particles = 1000000,
-               particle_position_x = posx_arr,
-               particle_position_y = posy_arr,
-               particle_position_z = posz_arr)
-   bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
-   pf = load_uniform_grid(data, dens.shape, 3.08e24, bbox=bbox, nprocs=12)
-
-where in this example the particle position fields have been assigned.
-``number_of_particles`` must match the length of the particle arrays. If no
-particle arrays are supplied, then ``number_of_particles`` is assumed to be zero.
-
-.. rubric:: Caveats
-
-* Units will be incorrect unless the data has already been converted to cgs.
-* Particles may be difficult to integrate.
-* Data must already reside in memory.
-
-.. _loading-enzo-data:
-
-Enzo Data
----------
-
-Enzo data is fully supported and cared for by Matthew Turk.  To load an Enzo
-dataset, you can use the ``load`` command provided by ``yt.mods`` and supply to
-it the parameter file name.  This would be the name of the output file, and it
-contains no extension.  For instance, if you have the following files:
-
-.. code-block:: none
-
-   DD0010/
-   DD0010/data0010
-   DD0010/data0010.hierarchy
-   DD0010/data0010.cpu0000
-   DD0010/data0010.cpu0001
-   DD0010/data0010.cpu0002
-   DD0010/data0010.cpu0003
-
-You would feed the ``load`` command the filename ``DD0010/data0010`` as
-mentioned.
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("DD0010/data0010")
-
-.. rubric:: Caveats
-
-* There are no major caveats for Enzo usage
-* Units should be correct, if you utilize standard unit-setting routines.  yt
-  will notify you if it cannot determine the units, although this
-  notification will be passive.
-* 2D and 1D data are supported, but the extraneous dimensions are set to be
-  of length 1.0
-
-.. _loading-orion-data:
-
-Orion Data
-----------
-
-Orion data is fully supported and cared for by Jeff Oishi.  This method should
-also work for CASTRO and MAESTRO data, which are cared for by Matthew Turk and
-Chris Malone, respectively.  To load an Orion dataset, you can use the ``load``
-command provided by ``yt.mods`` and supply to it the plotfile directory name.
-**You must also have the** ``inputs`` **file in the base directory.**  For
-instance, if you were in a directory with the following files:
-
-.. code-block:: none
-
-   inputs
-   pltgmlcs5600/
-   pltgmlcs5600/Header
-   pltgmlcs5600/Level_0
-   pltgmlcs5600/Level_0/Cell_H
-   pltgmlcs5600/Level_1
-   pltgmlcs5600/Level_1/Cell_H
-   pltgmlcs5600/Level_2
-   pltgmlcs5600/Level_2/Cell_H
-   pltgmlcs5600/Level_3
-   pltgmlcs5600/Level_3/Cell_H
-   pltgmlcs5600/Level_4
-   pltgmlcs5600/Level_4/Cell_H
-
-You would feed it the filename ``pltgmlcs5600``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("pltgmlcs5600")
-
-.. rubric:: Caveats
-
-* There are no major caveats for Orion usage
-* Star particles are not supported at the current time
-
-.. _loading-flash-data:
-
-FLASH Data
-----------
-
-FLASH HDF5 data is *mostly* supported and cared for by John ZuHone.  To load a
-FLASH dataset, you can use the ``load`` command provided by ``yt.mods`` and
-supply to it the file name of a plot file or checkpoint file.  Particle files
-are not currently loadable by themselves, because they typically lack grid
-information.  For instance, if you were in a directory with the following
-files:
-
-.. code-block:: none
-
-   cosmoSim_coolhdf5_chk_0026
-
-You would feed it the filename ``cosmoSim_coolhdf5_chk_0026``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("cosmoSim_coolhdf5_chk_0026")
-
-If you have a FLASH particle file that was created at the same time as
-a plotfile or checkpoint file (therefore having particle data
-consistent with the grid structure of the latter), its data may be loaded with the
-``particle_filename`` optional argument:
-
-.. code-block:: python
-
-    from yt.mods import *
-    pf = load("radio_halo_1kpc_hdf5_plt_cnt_0100", particle_filename="radio_halo_1kpc_hdf5_part_0100")
-
-.. rubric:: Caveats
-
-* Please be careful that the units are correctly utilized; yt assumes cgs
-* Velocities and length units will be scaled to comoving coordinates if yt is
-  able to discern you are examining a cosmology simulation; particle and grid
-  positions will not be.
-* Domains may be visualized assuming periodicity.
-
-.. _loading-ramses-data:
-
-RAMSES Data
------------
-
-RAMSES data enjoys preliminary support and is cared for by Matthew Turk.  If
-you are interested in taking a development or stewardship role, please contact
-him.  To load a RAMSES dataset, you can use the ``load`` command provided by
-``yt.mods`` and supply to it the ``info*.txt`` filename.  For instance, if you
-were in a directory with the following files:
-
-.. code-block:: none
-
-   output_00007
-   output_00007/amr_00007.out00001
-   output_00007/grav_00007.out00001
-   output_00007/hydro_00007.out00001
-   output_00007/info_00007.txt
-   output_00007/part_00007.out00001
-
-You would feed it the filename ``output_00007/info_00007.txt``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("output_00007/info_00007.txt")
-
-.. rubric:: Caveats
-
-* Please be careful that the units are correctly set!  This may not be the
-  case for RAMSES data
-* Upon instantiation of the hierarchy, yt will attempt to regrid the entire
-  domain to ensure minimum-coverage from a set of grid patches.  (This is
-  described in the yt method paper.)  This is a time-consuming process and it
-  has not yet been written to be stored between calls.
-* Particles are not supported
-* Parallelism will not be terribly efficient for large datasets
-* There may be occasional segfaults on multi-domain data, which do not
-  reflect errors in the calculation
-
-If you are interested in helping with RAMSES support, we are eager to hear from
-you!
-
-.. _loading-art-data:
-
-ART Data
---------
-
-ART data enjoys preliminary support and is cared for by Christopher Moody.
-Please contact the ``yt-dev`` mailing list if you are interested in using yt
-for ART data, or if you are interested in assisting with development of yt to
-work with ART data.
-
-At the moment, the ART octree is 'regridded' at each level to make the native
-octree look more like a mesh-based code. As a result, the initial outlay
-is roughly 60 seconds to grid octs onto a mesh. This will be improved in
-``yt-3.0``, where octs will be supported natively. 
-
-To load an ART dataset you can use the ``load`` command provided by
-``yt.mods`` and pass it the gas mesh file. yt will attempt to find the
-complementary dark matter and stellar particle header and data files.
-However, your simulations may not follow the same naming convention.
-
-So for example, a single snapshot might have a series of files looking like
-this:
-
-.. code-block:: none
-
-   10MpcBox_csf512_a0.300.d    #Gas mesh
-   PMcrda0.300.DAT             #Particle header
-   PMcrs0a0.300.DAT            #Particle data (positions,velocities)
-   stars_a0.300.dat            #Stellar data (metallicities, ages, etc.)
-
-The ART frontend tries to find the associated files matching the above, but
-if that fails you can specify ``file_particle_header``, ``file_particle_data``,
-and ``file_star_data`` in addition to specifying the gas mesh. You also have
-the option of gridding particles and assigning them onto the meshes.
-This process is in beta, and for the time being it's probably best to leave
-``do_grid_particles=False`` as the default.
-
-To speed up the loading of an ART file, you have a few options. You can turn 
-off the particles entirely by setting ``discover_particles=False``. You can
-also only grid octs up to a certain level, ``limit_level=5``, which is useful
-when debugging by artificially creating a 'smaller' dataset to work with.
-
-Finally, when stellar ages are computed we 'spread' the ages evenly within a
-smoothing window. By default this is turned on and set to 10 Myr. To turn this
-off you can set ``spread=False``, and you can tweak the age smoothing window
-by specifying the window in seconds, ``spread=1.0e7*365*24*3600``.
-
-.. code-block:: python
-    
-   from yt.mods import *
-
-   fn = "/u/cmoody3/data/art_snapshots/SFG1/10MpcBox_csf512_a0.460.d"
-   pf = load(fn, discover_particles=True, grid_particles=2, limit_level=3)
-   pf.h.print_stats()
-   dd = pf.h.all_data()
-   print np.sum(dd['particle_type'] == 0)
-
-In the above example code, the first line imports the standard yt functions,
-followed by defining the gas mesh file. It's loaded only through level 3, but
-grids particles onto meshes on level 2 and higher. Finally, we create a data
-container and ask it to gather the ``particle_type`` array. In this case
-``type==0`` marks the most highly-refined dark matter particles, and we print
-out how many of these high-resolution particles we find in the simulation.
-Typically, however, you shouldn't have to specify any keyword arguments to
-load in a dataset.
-
-.. _loading-amr-data:
-
-Generic AMR Data
-----------------
-
-It is possible to create a native ``yt`` parameter file from a list of Python
-dictionaries, each describing a rectangular patch of data, possibly at varying
-resolution.
-
-.. code-block:: python
-
-   import numpy as np
-   from yt.frontends.stream.api import load_amr_grids
-
-   grid_data = [
-       dict(left_edge = [0.0, 0.0, 0.0],
-            right_edge = [1.0, 1.0, 1.0],
-            level = 0,
-            dimensions = [32, 32, 32],
-            number_of_particles = 0),
-       dict(left_edge = [0.25, 0.25, 0.25],
-            right_edge = [0.75, 0.75, 0.75],
-            level = 1,
-            dimensions = [32, 32, 32],
-            number_of_particles = 0)
-   ]
-  
-   for g in grid_data:
-       g["Density"] = np.random.random(g["dimensions"]) * 2**g["level"]
-  
-   pf = load_amr_grids(grid_data, [32, 32, 32], 1.0)
-
-Particle fields are supported by adding one-dimensional arrays to each grid's
-dict and setting the ``number_of_particles`` key:
-
-.. code-block:: python
-
-    for g in grid_data:
-        g["number_of_particles"] = 100000
-        g["particle_position_x"] = np.random.random(g["number_of_particles"])
-
-.. rubric:: Caveats
-
-* Units will be incorrect unless the data has already been converted to cgs.
-* Some functions may behave oddly, and parallelism will be disappointing or
-  non-existent in most cases.
-* No consistency checks are performed on the hierarchy
-* Data must already reside in memory.
-* Consistency between particle positions and grids is not checked;
-  ``load_amr_grids`` assumes that particle positions associated with one grid are
-  not bounded within another grid at a higher level, so this must be
-  ensured by the user prior to loading the grid data. 

diff -r 559912db681f8fc681e70f00c31b546e9d56532a -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 source/analyzing/low_level_inspection.rst
--- a/source/analyzing/low_level_inspection.rst
+++ /dev/null
@@ -1,117 +0,0 @@
-Low-Level Data Inspection
-=========================
-
-yt can not only provide high-level access to data, such as through slices,
-projections, object queries and the like, but it can also provide low-level
-access to data.
-
-.. note:: This section is tuned for patch- or block-based simulations.  Future
-          versions of yt will enable more direct access to particle and oct
-          based simulations.  For now, these are represented as patches, with
-          the attendant properties.
-
-For a more basic introduction, see :ref:`first_steps` and more specifically
-:ref:`grid_inspection`.
-
-Examining Grid Hierarchies
---------------------------
-
-yt organizes grids in a hierarchical fashion; a coarser grid that contains (or
-overlaps with) a finer grid is referred to as its parent.  yt tracks these
-relationships only a single level of refinement at a time.  To access grids,
-use the ``grids`` attribute on a
-:class:`~yt.data_objects.hierarchy.AMRHierarchy` object.  (For fast
-operations, a number of additional arrays prefixed with ``grid`` are also
-available, such as ``grid_left_edges`` and so on.)  This is an array of
-:class:`~yt.data_objects.grid_patch.AMRGridPatch` instances, each of which can
-be queried for either data or hierarchy information.
-
-The :class:`~yt.data_objects.grid_patch.AMRGridPatch` object itself provides
-the following attributes:
-
- * ``Children``: a list of grids contained within this one, of one higher level
-   of refinement
- * ``Parent``: a single object or a list of objects this grid is contained
-   within, one level of refinement coarser
- * ``child_mask``: a mask of 0's and 1's, representing where no finer data is
-   available in refined grids (1) or where this grid is covered by finer regions
-   (0).  Note that to get back the final data contained within a grid, one can
-   multiply a field by this attribute.
- * ``child_indices``: a mask of booleans, where False indicates no finer data
-   is available.  This is essentially the inverse of ``child_mask``.
- * ``child_index_mask``: a mask of indices into the ``pf.h.grids`` array of the
-   child grids.
- * ``LeftEdge``: the left edge, in native code coordinates, of this grid
- * ``RightEdge``: the right edge, in native code coordinates, of this grid
- * ``dds``: the width of a cell in this grid
- * ``id``: the id (not necessarily the index) of this grid.  Defined such that
-   subtracting the property ``_id_offset`` gives the index into ``pf.h.grids``.
- * ``NumberOfParticles``: the number of particles in this grid
- * ``OverlappingSiblings``: a list of sibling grids that this grid overlaps
-   with.  Likely only defined for Octree-based codes.
-
-In addition, the method
-:meth:`~yt.data_objects.grid_patch.AMRGridPatch.get_global_startindex` can be
-used to get the integer coordinates of the grid's left edge.  These integer
-coordinates are defined with respect to the current level; this means that they
-are the offset of the left edge, with respect to the left edge of the domain,
-divided by the local ``dds``.
-
-To traverse a series of grids, this type of construction can be used:
-
-.. code-block:: python
-
-   g = pf.h.grids[1043]
-   g2 = g.Children[1].Children[0]
-   print g2.LeftEdge
-
-Examining Grid Data
--------------------
-
-Once you have identified a grid you wish to inspect, there are two ways to
-examine data.  You can either ask the grid to read the data and pass it to you
-as normal, or you can manually intercept the data from the IO handler and
-examine it before it has been unit converted.  This allows for much more raw
-data inspection.
-
-To access data that has been read in the typical fashion and unit-converted as
-normal, you can access the grid as you would a normal object:
-
-.. code-block:: python
-
-   g = pf.h.grids[1043]
-   print g["Density"]
-   print g["Density"].min()
-
-To access the raw data, you have to call the IO handler from the hierarchy
-instead.  This is somewhat more low-level.
-
-.. code-block:: python
-
-   g = pf.h.grids[1043]
-   rho = pf.h.io.pop(g, "Density")
-
-This field will be the raw data found in the file.
-
-Finding Data at Fixed Points
-----------------------------
-
-One of the most common questions asked of data is: what is the value *at this
-specific point*?  While there are several ways to answer this question, a few
-helper routines are provided as well.  To identify the
-finest-resolution (i.e., most canonical) data at a given point, use
-:meth:`~yt.data_objects.hierarchy.AMRHierarchy.find_field_value_at_point`.
-This accepts a position (in coordinates of the domain) and returns the field
-values for one or multiple fields.
-
-To identify all the grids that intersect a given point, the function 
-:meth:`~yt.data_objects.hierarchy.AMRHierarchy.find_point` will return indices
-and objects that correspond to it.  For instance:
-
-.. code-block:: python
-
-   gs, gi = pf.h.find_point((0.5, 0.6, 0.9))
-   for g in gs:
-       print g.Level, g.LeftEdge, g.RightEdge
-
-Note that this doesn't return just the finest-resolution (canonical) grid, but
-also all of the parent grids that overlap with that point.

diff -r 559912db681f8fc681e70f00c31b546e9d56532a -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 source/configuration.rst
--- a/source/configuration.rst
+++ /dev/null
@@ -1,95 +0,0 @@
-.. _configuration-file:
-
-Configuration File
-==================
-
-Configuration File Format
--------------------------
-
-yt will look for and recognize the file ``$HOME/.yt/config`` as a configuration
-file, containing several options that can be modified and adjusted to control
-runtime behavior.  For example, a sample ``$HOME/.yt/config`` file could look
-like:
-
-.. code-block:: none
-    
-   [yt]
-   loglevel = 1
-   maximumstoredpfs = 10000
-
-This configuration file would set the logging threshold much lower, enabling
-much more voluminous output from yt.  Additionally, it increases the number of
-parameter files tracked between instantiations of yt.
-
-Configuration Options At Runtime
---------------------------------
-
-In addition to setting parameters in the configuration file itself, you can set
-them at runtime.  
-
-.. warning:: Several parameters are only accessed when yt starts up: therefore,
-   if you want to modify any configuration parameters at runtime, you should
-   execute the appropriate commands at the *very top* of your script!
-
-This involves importing the configuration object and then setting a given
-parameter to be equal to a specific string.  Note that even for items that
-accept integers, floating points and other non-string types, you *must* set
-them to be a string or else the configuration object will consider them broken.
-
-Here is an example script, where we adjust the logging at startup:
-
-.. code-block:: python
-
-   from yt.config import ytcfg
-   ytcfg["yt", "loglevel"] = "1"
-
-   from yt.mods import *
-   pf = load("my_data0001")
-   pf.h.print_stats()
-
-This has the same effect as setting ``loglevel = 1`` in the configuration file.
-
-Setting Configuration On the Command Line
------------------------------------------
-
-Options can also be set directly on the command line by specifying a
-command-line option.  For instance, if you are running the script
-``my_script.py`` you can specify a configuration option with the ``--config``
-argument.  As an example, to lower the log level (thus making it more verbose)
-you can specify:
-
-.. code-block:: bash
-
-   $ python2.7 my_script.py --config loglevel=1
-
-Any configuration option specific to yt can be specified in this manner.  One
-common configuration option would be to disable serialization:
-
-.. code-block:: bash
-
-   $ python2.7 my_script.py --config serialize=False
-
-This way projections are always re-created.
-
-Available Configuration Options
--------------------------------
-
-The following external parameters are available.  A number of parameters are
-used internally.
-
-* ``coloredlogs`` (default: ``'False'``): Should logs be colored?
-* ``loadfieldplugins`` (default: ``'True'``): Do we want to load the plugin file?
-* ``pluginfilename`` (default: ``'my_plugins.py'``): The name of our plugin file.
-* ``logfile`` (default: ``'False'``): Should we output to a log file in the
-  filesystem?
-* ``loglevel`` (default: ``'20'``): What is the threshold (0 to 50) for outputting
-  log files?
-* ``notebook_password`` (default: empty): If set, this will be fed to the
-  IPython notebook created by ``yt notebook``.  Note that this should be an
-  sha512 hash, not a plaintext password.  Starting ``yt notebook`` with no
-  setting will provide instructions for setting this.
-* ``serialize`` (default: ``'True'``): Are we allowed to write to the ``.yt`` file?
-* ``sketchfab_api_key`` (default: empty): API key for http://sketchfab.com/ for
-  uploading AMRSurface objects.
-* ``suppressStreamLogging`` (default: ``'False'``): If true, execution mode will be
-  quiet.

diff -r 559912db681f8fc681e70f00c31b546e9d56532a -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 source/examining/index.rst
--- a/source/examining/index.rst
+++ b/source/examining/index.rst
@@ -2,3 +2,9 @@
 ==============
 
 How to examine a dataset on disk.
+
+.. toctree::
+   :maxdepth: 2
+
+   loading
+   low_level_inspection

diff -r 559912db681f8fc681e70f00c31b546e9d56532a -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 source/examining/loading_data.rst
--- /dev/null
+++ b/source/examining/loading_data.rst
@@ -0,0 +1,336 @@
+.. _loading-data:
+
+Loading Data
+============
+
+This section contains information on how to load data into ``yt``, as well as
+some important caveats about different data formats.
+
+.. _loading-numpy-array:
+
+Generic Array Data
+------------------
+
+Even if your data is not strictly related to fields commonly used in
+astrophysical codes or your code is not supported yet, you can still feed it to
+``yt`` to use its advanced visualization and analysis facilities. The only
+requirement is that your data can be represented as one or more uniform,
+three-dimensional NumPy arrays. Assuming that you have your data in ``arr``,
+the following code:
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.frontends.stream.api import load_uniform_grid
+
+   data = dict(Density = arr)
+   bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
+   pf = load_uniform_grid(data, arr.shape, 3.08e24, bbox=bbox, nprocs=12)
+
+will create a ``yt``-native parameter file ``pf`` that treats your array as a
+density field in a cubic domain 3 Mpc on a side (3 * 3.08e24 cm) and
+simultaneously divides the domain into 12 chunks, so that you can take
+advantage of the underlying parallelism.
+
+Particle fields are detected as one-dimensional fields. The number of
+particles is set by the ``number_of_particles`` key in
+``data``. Particle fields are then added as one-dimensional arrays in
+a similar manner to the three-dimensional grid fields:
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.frontends.stream.api import load_uniform_grid
+
+   data = dict(Density = dens,
+               number_of_particles = 1000000,
+               particle_position_x = posx_arr,
+               particle_position_y = posy_arr,
+               particle_position_z = posz_arr)
+   bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
+   pf = load_uniform_grid(data, dens.shape, 3.08e24, bbox=bbox, nprocs=12)
+
+where in this example the particle position fields have been assigned.
+``number_of_particles`` must match the length of the particle arrays. If no
+particle arrays are supplied, then ``number_of_particles`` is assumed to be zero.
+
+.. rubric:: Caveats
+
+* Units will be incorrect unless the data has already been converted to cgs.
+* Particles may be difficult to integrate.
+* Data must already reside in memory.
+
+.. _loading-enzo-data:
+
+Enzo Data
+---------
+
+Enzo data is fully supported and cared for by Matthew Turk.  To load an Enzo
+dataset, you can use the ``load`` command provided by ``yt.mods`` and supply to
+it the parameter file name.  This would be the name of the output file, and it
+contains no extension.  For instance, if you have the following files:
+
+.. code-block:: none
+
+   DD0010/
+   DD0010/data0010
+   DD0010/data0010.hierarchy
+   DD0010/data0010.cpu0000
+   DD0010/data0010.cpu0001
+   DD0010/data0010.cpu0002
+   DD0010/data0010.cpu0003
+
+You would feed the ``load`` command the filename ``DD0010/data0010`` as
+mentioned.
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("DD0010/data0010")
+
+.. rubric:: Caveats
+
+* There are no major caveats for Enzo usage
+* Units should be correct, if you utilize standard unit-setting routines.  yt
+  will notify you if it cannot determine the units, although this
+  notification will be passive.
+* 2D and 1D data are supported, but the extraneous dimensions are set to be
+  of length 1.0
+
+.. _loading-orion-data:
+
+Orion Data
+----------
+
+Orion data is fully supported and cared for by Jeff Oishi.  This method should
+also work for CASTRO and MAESTRO data, which are cared for by Matthew Turk and
+Chris Malone, respectively.  To load an Orion dataset, you can use the ``load``
+command provided by ``yt.mods`` and supply to it the plotfile directory name.
+**You must also have the** ``inputs`` **file in the base directory.**  For
+instance, if you were in a directory with the following files:
+
+.. code-block:: none
+
+   inputs
+   pltgmlcs5600/
+   pltgmlcs5600/Header
+   pltgmlcs5600/Level_0
+   pltgmlcs5600/Level_0/Cell_H
+   pltgmlcs5600/Level_1
+   pltgmlcs5600/Level_1/Cell_H
+   pltgmlcs5600/Level_2
+   pltgmlcs5600/Level_2/Cell_H
+   pltgmlcs5600/Level_3
+   pltgmlcs5600/Level_3/Cell_H
+   pltgmlcs5600/Level_4
+   pltgmlcs5600/Level_4/Cell_H
+
+You would feed it the filename ``pltgmlcs5600``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("pltgmlcs5600")
+
+.. rubric:: Caveats
+
+* There are no major caveats for Orion usage
+* Star particles are not supported at the current time
+
+.. _loading-flash-data:
+
+FLASH Data
+----------
+
+FLASH HDF5 data is *mostly* supported and cared for by John ZuHone.  To load a
+FLASH dataset, you can use the ``load`` command provided by ``yt.mods`` and
+supply to it the file name of a plot file or checkpoint file.  Particle files
+are not currently loadable by themselves, because they typically lack grid
+information.  For instance, if you were in a directory with the following
+files:
+
+.. code-block:: none
+
+   cosmoSim_coolhdf5_chk_0026
+
+You would feed it the filename ``cosmoSim_coolhdf5_chk_0026``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("cosmoSim_coolhdf5_chk_0026")
+
+If you have a FLASH particle file that was created at the same time as
+a plotfile or checkpoint file (therefore having particle data
+consistent with the grid structure of the latter), its data may be loaded with the
+``particle_filename`` optional argument:
+
+.. code-block:: python
+
+    from yt.mods import *
+    pf = load("radio_halo_1kpc_hdf5_plt_cnt_0100", particle_filename="radio_halo_1kpc_hdf5_part_0100")
+
+.. rubric:: Caveats
+
+* Please be careful that the units are correctly utilized; yt assumes cgs
+* Velocities and length units will be scaled to comoving coordinates if yt is
+  able to discern you are examining a cosmology simulation; particle and grid
+  positions will not be.
+* Domains may be visualized assuming periodicity.
+
+.. _loading-ramses-data:
+
+RAMSES Data
+-----------
+
+RAMSES data enjoys preliminary support and is cared for by Matthew Turk.  If
+you are interested in taking a development or stewardship role, please contact
+him.  To load a RAMSES dataset, you can use the ``load`` command provided by
+``yt.mods`` and supply to it the ``info*.txt`` filename.  For instance, if you
+were in a directory with the following files:
+
+.. code-block:: none
+
+   output_00007
+   output_00007/amr_00007.out00001
+   output_00007/grav_00007.out00001
+   output_00007/hydro_00007.out00001
+   output_00007/info_00007.txt
+   output_00007/part_00007.out00001
+
+You would feed it the filename ``output_00007/info_00007.txt``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("output_00007/info_00007.txt")
+
+.. rubric:: Caveats
+
+* Please be careful that the units are correctly set!  This may not be the
+  case for RAMSES data
+* Upon instantiation of the hierarchy, yt will attempt to regrid the entire
+  domain to ensure minimum-coverage from a set of grid patches.  (This is
+  described in the yt method paper.)  This is a time-consuming process and it
+  has not yet been written to be stored between calls.
+* Particles are not supported
+* Parallelism will not be terribly efficient for large datasets
+* There may be occasional segfaults on multi-domain data, which do not
+  reflect errors in the calculation
+
+If you are interested in helping with RAMSES support, we are eager to hear from
+you!
+
+.. _loading-art-data:
+
+ART Data
+--------
+
+ART data enjoys preliminary support and is cared for by Christopher Moody.
+Please contact the ``yt-dev`` mailing list if you are interested in using yt
+for ART data, or if you are interested in assisting with development of yt to
+work with ART data.
+
+At the moment, the ART octree is 'regridded' at each level to make the native
+octree look more like a mesh-based code. As a result, the initial outlay
+is roughly 60 seconds to grid octs onto a mesh. This will be improved in
+``yt-3.0``, where octs will be supported natively. 
+
+To load an ART dataset you can use the ``load`` command provided by
+``yt.mods`` and pass it the gas mesh file. yt will attempt to find the
+complementary dark matter and stellar particle header and data files.
+However, your simulations may not follow the same naming convention.
+
+So for example, a single snapshot might have a series of files looking like
+this:
+
+.. code-block:: none
+
+   10MpcBox_csf512_a0.300.d    #Gas mesh
+   PMcrda0.300.DAT             #Particle header
+   PMcrs0a0.300.DAT            #Particle data (positions,velocities)
+   stars_a0.300.dat            #Stellar data (metallicities, ages, etc.)
+
+The ART frontend tries to find the associated files matching the above, but
+if that fails you can specify ``file_particle_header``, ``file_particle_data``,
+and ``file_star_data`` in addition to specifying the gas mesh. You also have
+the option of gridding particles and assigning them onto the meshes.
+This process is in beta, and for the time being it's probably best to leave
+``do_grid_particles=False`` as the default.
+
+To speed up the loading of an ART file, you have a few options. You can turn 
+off the particles entirely by setting ``discover_particles=False``. You can
+also only grid octs up to a certain level, ``limit_level=5``, which is useful
+when debugging by artificially creating a 'smaller' dataset to work with.
+
+Finally, when stellar ages are computed we 'spread' the ages evenly within a
+smoothing window. By default this is turned on and set to 10 Myr. To turn this
+off you can set ``spread=False``, and you can tweak the age smoothing window
+by specifying the window in seconds, ``spread=1.0e7*365*24*3600``.
+
+.. code-block:: python
+    
+   from yt.mods import *
+
+   fn = "/u/cmoody3/data/art_snapshots/SFG1/10MpcBox_csf512_a0.460.d"
+   pf = load(fn, discover_particles=True, grid_particles=2, limit_level=3)
+   pf.h.print_stats()
+   dd = pf.h.all_data()
+   print np.sum(dd['particle_type'] == 0)
+
+In the above example code, the first line imports the standard yt functions,
+followed by defining the gas mesh file. It's loaded only through level 3, but
+grids particles onto meshes on level 2 and higher. Finally, we create a data
+container and ask it to gather the ``particle_type`` array. In this case
+``type==0`` marks the most highly-refined dark matter particles, and we print
+out how many of these high-resolution particles we find in the simulation.
+Typically, however, you shouldn't have to specify any keyword arguments to
+load in a dataset.
+
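+As a sketch of that typical case (assuming the companion particle and stellar
+files follow the naming convention shown above, so that the frontend can
+discover them itself):
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   # the PMcrda*, PMcrs0a* and stars_* files are found automatically
+   pf = load("10MpcBox_csf512_a0.300.d")
+   pf.h.print_stats()
+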
+.. _loading-amr-data:
+
+Generic AMR Data
+----------------
+
+It is possible to create a native ``yt`` parameter file from a list of Python
+dictionaries, each describing a rectangular patch of data, possibly at varying
+resolution.
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.frontends.stream.api import load_amr_grids
+
+   grid_data = [
+       dict(left_edge = [0.0, 0.0, 0.0],
+            right_edge = [1.0, 1.0, 1.0],
+            level = 0,
+            dimensions = [32, 32, 32],
+            number_of_particles = 0),
+       dict(left_edge = [0.25, 0.25, 0.25],
+            right_edge = [0.75, 0.75, 0.75],
+            level = 1,
+            dimensions = [32, 32, 32],
+            number_of_particles = 0)
+   ]
+  
+   for g in grid_data:
+       g["Density"] = np.random.random(g["dimensions"]) * 2**g["level"]
+  
+   pf = load_amr_grids(grid_data, [32, 32, 32], 1.0)
+
+Particle fields are supported by adding one-dimensional arrays to each grid's
+dict and setting the ``number_of_particles`` key:
+
+.. code-block:: python
+
+    for g in grid_data:
+        g["number_of_particles"] = 100000
+        g["particle_position_x"] = np.random.random(g["number_of_particles"])
+
+.. rubric:: Caveats
+
+* Units will be incorrect unless the data has already been converted to cgs.
+* Some functions may behave oddly, and parallelism will be disappointing or
+  non-existent in most cases.
+* No consistency checks are performed on the hierarchy
+* Data must already reside in memory.
+* Consistency between particle positions and grids is not checked;
+  ``load_amr_grids`` assumes that particle positions associated with one grid are
+  not bounded within another grid at a higher level, so this must be
+  ensured by the user prior to loading the grid data. 

diff -r 559912db681f8fc681e70f00c31b546e9d56532a -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 source/examining/low_level_inspection.rst
--- /dev/null
+++ b/source/examining/low_level_inspection.rst
@@ -0,0 +1,117 @@
+Low-Level Data Inspection
+=========================
+
+yt can not only provide high-level access to data, such as through slices,
+projections, object queries and the like, but it can also provide low-level
+access to data.
+
+.. note:: This section is tuned for patch- or block-based simulations.  Future
+          versions of yt will enable more direct access to particle and oct
+          based simulations.  For now, these are represented as patches, with
+          the attendant properties.
+
+For a more basic introduction, see :ref:`first_steps` and more specifically
+:ref:`grid_inspection`.
+
+Examining Grid Hierarchies
+--------------------------
+
+yt organizes grids in a hierarchical fashion; a coarser grid that contains (or
+overlaps with) a finer grid is referred to as its parent.  yt tracks these
+relationships only a single level of refinement at a time.  To access grids,
+use the ``grids`` attribute on a
+:class:`~yt.data_objects.hierarchy.AMRHierarchy` object.  (For fast
+operations, a number of additional arrays prefixed with ``grid`` are also
+available, such as ``grid_left_edges`` and so on.)  This is an array of
+:class:`~yt.data_objects.grid_patch.AMRGridPatch` instances, each of which can
+be queried for either data or hierarchy information.
+
+The :class:`~yt.data_objects.grid_patch.AMRGridPatch` object itself provides
+the following attributes:
+
+ * ``Children``: a list of grids contained within this one, of one higher level
+   of refinement
+ * ``Parent``: a single object or a list of objects this grid is contained
+   within, one level of refinement coarser
+ * ``child_mask``: a mask of 0's and 1's, representing where no finer data is
+   available in refined grids (1) or where this grid is covered by finer regions
+   (0).  Note that to get back the final data contained within a grid, one can
+   multiply a field by this attribute (see the sketch after this list).
+ * ``child_indices``: a mask of booleans, where False indicates no finer data
+   is available.  This is essentially the inverse of ``child_mask``.
+ * ``child_index_mask``: a mask of indices into the ``pf.h.grids`` array of the
+   child grids.
+ * ``LeftEdge``: the left edge, in native code coordinates, of this grid
+ * ``RightEdge``: the right edge, in native code coordinates, of this grid
+ * ``dds``: the width of a cell in this grid
+ * ``id``: the id (not necessarily the index) of this grid.  Defined such that
+   subtracting the property ``_id_offset`` gives the index into ``pf.h.grids``.
+ * ``NumberOfParticles``: the number of particles in this grid
+ * ``OverlappingSiblings``: a list of sibling grids that this grid overlaps
+   with.  Likely only defined for Octree-based codes.
+
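+As a quick sketch of the ``child_mask`` idiom mentioned in the list above
+(``1043`` is just an arbitrary grid index):
+
+.. code-block:: python
+
+   g = pf.h.grids[1043]
+   # zero out cells covered by finer grids, keeping only the data for
+   # which this grid is the finest available
+   rho = g["Density"] * g.child_mask
+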
+In addition, the method
+:meth:`~yt.data_objects.grid_patch.AMRGridPatch.get_global_startindex` can be
+used to get the integer coordinates of the grid's left edge.  These integer
+coordinates are defined with respect to the current level; this means that they
+are the offset of the left edge, with respect to the left edge of the domain,
+divided by the local ``dds``.
+
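+A minimal sketch of querying this index:
+
+.. code-block:: python
+
+   g = pf.h.grids[1043]
+   # offset of the grid's left edge from the domain's left edge,
+   # in units of this grid's cell width
+   print g.get_global_startindex()
+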
+To traverse a series of grids, this type of construction can be used:
+
+.. code-block:: python
+
+   g = pf.h.grids[1043]
+   g2 = g.Children[1].Children[0]
+   print g2.LeftEdge
+
+Examining Grid Data
+-------------------
+
+Once you have identified a grid you wish to inspect, there are two ways to
+examine data.  You can either ask the grid to read the data and pass it to you
+as normal, or you can manually intercept the data from the IO handler and
+examine it before it has been unit converted.  This allows for much more raw
+data inspection.
+
+To access data that has been read in the typical fashion and unit-converted as
+normal, you can access the grid as you would a normal object:
+
+.. code-block:: python
+
+   g = pf.h.grids[1043]
+   print g["Density"]
+   print g["Density"].min()
+
+To access the raw data, you have to call the IO handler from the hierarchy
+instead.  This is somewhat more low-level.
+
+.. code-block:: python
+
+   g = pf.h.grids[1043]
+   rho = pf.h.io.pop(g, "Density")
+
+This field will be the raw data found in the file.
+
+Finding Data at Fixed Points
+----------------------------
+
+One of the most common questions asked of data is: what is the value *at this
+specific point*?  While there are several ways to answer this question, a few
+helper routines are provided as well.  To identify the
+finest-resolution (i.e., most canonical) data at a given point, use
+:meth:`~yt.data_objects.hierarchy.AMRHierarchy.find_field_value_at_point`.
+This accepts a position (in coordinates of the domain) and returns the field
+values for one or multiple fields.
+
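+As a minimal sketch of this call (the fields-then-position argument order
+shown here is an assumption; the field name and position are arbitrary):
+
+.. code-block:: python
+
+   # Density in the finest-resolution cell containing this point
+   v = pf.h.find_field_value_at_point(["Density"], (0.5, 0.6, 0.9))
+   print v
+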
+To identify all the grids that intersect a given point, the function 
+:meth:`~yt.data_objects.hierarchy.AMRHierarchy.find_point` will return indices
+and objects that correspond to it.  For instance:
+
+.. code-block:: python
+
+   gs, gi = pf.h.find_point((0.5, 0.6, 0.9))
+   for g in gs:
+       print g.Level, g.LeftEdge, g.RightEdge
+
+Note that this doesn't return just the finest-resolution (canonical) grid, but
+also all of the parent grids that overlap with that point.

diff -r 559912db681f8fc681e70f00c31b546e9d56532a -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 source/reference/configuration.rst
--- /dev/null
+++ b/source/reference/configuration.rst
@@ -0,0 +1,95 @@
+.. _configuration-file:
+
+Configuration File
+==================
+
+Configuration File Format
+-------------------------
+
+yt will look for and recognize the file ``$HOME/.yt/config`` as a configuration
+file, containing several options that can be modified and adjusted to control
+runtime behavior.  For example, a sample ``$HOME/.yt/config`` file could look
+like:
+
+.. code-block:: none
+    
+   [yt]
+   loglevel = 1
+   maximumstoredpfs = 10000
+
+This configuration file would set the logging threshold much lower, enabling
+much more voluminous output from yt.  Additionally, it increases the number of
+parameter files tracked between instantiations of yt.
+
+Configuration Options At Runtime
+--------------------------------
+
+In addition to setting parameters in the configuration file itself, you can set
+them at runtime.  
+
+.. warning:: Several parameters are only accessed when yt starts up: therefore,
+   if you want to modify any configuration parameters at runtime, you should
+   execute the appropriate commands at the *very top* of your script!
+
+This involves importing the configuration object and then setting a given
+parameter to be equal to a specific string.  Note that even for items that
+accept integers, floating points and other non-string types, you *must* set
+them to be a string or else the configuration object will consider them broken.
+
+Here is an example script, where we adjust the logging at startup:
+
+.. code-block:: python
+
+   from yt.config import ytcfg
+   ytcfg["yt", "loglevel"] = "1"
+
+   from yt.mods import *
+   pf = load("my_data0001")
+   pf.h.print_stats()
+
+This has the same effect as setting ``loglevel = 1`` in the configuration file.
+
+Setting Configuration On the Command Line
+-----------------------------------------
+
+Options can also be set directly on the command line by specifying a
+command-line option.  For instance, if you are running the script
+``my_script.py`` you can specify a configuration option with the ``--config``
+argument.  As an example, to lower the log level (thus making it more verbose)
+you can specify:
+
+.. code-block:: bash
+
+   $ python2.7 my_script.py --config loglevel=1
+
+Any configuration option specific to yt can be specified in this manner.  One
+common configuration option would be to disable serialization:
+
+.. code-block:: bash
+
+   $ python2.7 my_script.py --config serialize=False
+
+This way projections are always re-created.
+
+Available Configuration Options
+-------------------------------
+
+The following external parameters are available.  A number of parameters are
+used internally.
+
+* ``coloredlogs`` (default: ``'False'``): Should logs be colored?
+* ``loadfieldplugins`` (default: ``'True'``): Do we want to load the plugin file?
+* ``pluginfilename`` (default: ``'my_plugins.py'``): The name of our plugin file.
+* ``logfile`` (default: ``'False'``): Should we output to a log file in the
+  filesystem?
+* ``loglevel`` (default: ``'20'``): What is the threshold (0 to 50) for outputting
+  log files?
+* ``notebook_password`` (default: empty): If set, this will be fed to the
+  IPython notebook created by ``yt notebook``.  Note that this should be an
+  sha512 hash, not a plaintext password (see the sketch after this list).
+  Starting ``yt notebook`` with no setting will provide instructions for
+  setting this.
+* ``serialize`` (default: ``'True'``): Are we allowed to write to the ``.yt`` file?
+* ``sketchfab_api_key`` (default: empty): API key for http://sketchfab.com/ for
+  uploading AMRSurface objects.
+* ``suppressStreamLogging`` (default: ``'False'``): If true, execution mode will be
+  quiet.
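+
+As a minimal sketch of generating such a hash with IPython's ``passwd``
+helper (the plaintext password here is just a placeholder):
+
+.. code-block:: python
+
+   from IPython.lib import passwd
+
+   # produces a salted hash of the form 'sha512:<salt>:<hexdigest>'
+   print passwd("my_password", "sha512")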

diff -r 559912db681f8fc681e70f00c31b546e9d56532a -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 source/reference/index.rst
--- a/source/reference/index.rst
+++ b/source/reference/index.rst
@@ -1,3 +1,10 @@
 Reference Materials
 ===================
 
+These are reference materials for using yt.
+
+
+.. toctree::
+   :maxdepth: 2
+
+   configuration


https://bitbucket.org/yt_analysis/yt-doc/commits/539df5bc6c9e/
Changeset:   539df5bc6c9e
User:        MatthewTurk
Date:        2013-10-28 19:51:40
Summary:     Loading data was missing.
Affected #:  1 file

diff -r ac4cf29c266739cde042c06ba6df3a712cbcd3d6 -r 539df5bc6c9ee670bb84e01760fd6c9ff613d8a5 source/examining/index.rst
--- a/source/examining/index.rst
+++ b/source/examining/index.rst
@@ -6,5 +6,5 @@
 .. toctree::
    :maxdepth: 2
 
-   loading
+   loading_data
    low_level_inspection


https://bitbucket.org/yt_analysis/yt-doc/commits/c20c394f32bc/
Changeset:   c20c394f32bc
User:        MatthewTurk
Date:        2013-10-28 19:52:38
Summary:     Update source.conf for 2.6.
Affected #:  1 file

diff -r 539df5bc6c9ee670bb84e01760fd6c9ff613d8a5 -r c20c394f32bc4520fd63902bf8e116fce6c240b3 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -46,16 +46,16 @@
 
 # General information about the project.
 project = u'yt'
-copyright = u'2012, the yt Project'
+copyright = u'2013, the yt Project'
 
 # The version info for the project you're documenting, acts as replacement for
 # |version| and |release|, also used in various other places throughout the
 # built documents.
 #
 # The short X.Y version.
-version = '2.5'
+version = '2.6'
 # The full version, including alpha/beta/rc tags.
-release = '2.5'
+release = '2.6dev'
 
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
@@ -238,7 +238,7 @@
 
 # Example configuration for intersphinx: refer to the Python standard library.
 intersphinx_mapping = {'http://docs.python.org/': None,
-                       'http://ipython.org/ipython-doc/rel-0.10/html/': None,
+                       'http://ipython.org/ipython-doc/rel-1.10/html/': None,
                        'http://docs.scipy.org/doc/numpy/': None,
                        'http://matplotlib.sourceforge.net/': None,
                        }


https://bitbucket.org/yt_analysis/yt-doc/commits/bb1365132027/
Changeset:   bb1365132027
User:        MatthewTurk
Date:        2013-10-28 19:53:37
Summary:     So long, workshop!
Affected #:  1 file

diff -r c20c394f32bc4520fd63902bf8e116fce6c240b3 -r bb1365132027344adebd1bc86e8466b71d0f1661 source/workshop.rst
--- a/source/workshop.rst
+++ /dev/null
@@ -1,231 +0,0 @@
-yt Workshop Materials
-=====================
-
-In late January, 2012, the first yt Users' Workshop was held at the FLASH
-Center in Chicago.  
-
-The workshop website, http://yt-project.org/workshop2012/, includes historical
-information as well as details on how to download the data used for the
-workshop.  Below are recordings of the talks, as well as links to the
-appropriate slide decks.
-
-Introduction to yt
-------------------
-
-Presented by Matthew Turk
-
-.. youtube:: TsirUjX7fWs
-
-Slides are `available <https://bitbucket.org/MatthewTurk/yt.workshop2012.introduction/src/tip/workshop_intro.pdf>`__
-
-Repository is `available <https://bitbucket.org/MatthewTurk/yt.workshop2012.introduction/>`__
-
-Objects in yt
--------------
-
-Presented by Britton Smith
-
-.. youtube:: GMHeimeHdo4
-
-Slides are `available <https://bitbucket.org/brittonsmith/yt.workshop2012.objects/src/tip/output/yt_objects.pdf>`__
-
-Repository is `available <https://bitbucket.org/brittonsmith/yt.workshop2012.objects>`__
-
-Visualization with yt
----------------------
-
-Presented by Stephen Skory and John ZuHone
-
-.. youtube:: jNWd4YUbcLY
-
-Slides are `available <https://bitbucket.org/sskory/yt.workshop2012.simpleviz/src/tip/SimpleViz.pdf>`__
-
-Repository is `available <https://bitbucket.org/sskory/yt.workshop2012.simpleviz/>`__
-
-General Analysis in yt
-----------------------
-
-Presented by Sam Skillman
-
-.. youtube:: zUqSYpWPYDw
-
-Slides are `available <https://bitbucket.org/samskillman/yt.workshop2012.general_analysis/src/tip/output/yt_general_analysis.pdf>`__
-
-Repository is `available <https://bitbucket.org/samskillman/yt.workshop2012.general_analysis/>`__
-
-Finding Your Way
-----------------
-
-Presented by Cameron Hummels
-
-.. youtube:: hat3QAO4s-I
-
-Slides are `available <https://bitbucket.org/chummels/yt.workshop2012.findingyourway/src/tip/yt.workshop2012.findingyourway.pdf>`__
-
-Repository is `available <https://bitbucket.org/chummels/yt.workshop2012.findingyourway/>`__
-
-Fields and Derived Quantities
------------------------------
-
-Presented by Matthew Turk
-
-.. youtube:: NSwy8Lw1Uvk
-
-Slides are `available <https://bitbucket.org/MatthewTurk/yt.workshop2012.fields/src/tip/output/yt_fields.pdf>`__
-
-Repository is `available <https://bitbucket.org/MatthewTurk/yt.workshop2012.fields/>`__
-
-Parallelism in yt
------------------
-
-Presented by Sam Skillman
-
-.. youtube:: QKWZ-jmqQCQ
-
-Slides are `available <https://bitbucket.org/samskillman/yt.workshop2012.parallelism/src/tip/yt_parallelism.pdf>`__
-
-Repository is `available <https://bitbucket.org/samskillman/yt.workshop2012.parallelism>`__
-
-Advanced Viz hands-on
----------------------
-
-Presented by Jeff Oishi
-
-.. youtube:: MQEKXsTmrAs
-
-Slides are `available <https://bitbucket.org/jsoishi/yt.workshop2012.advanced_visualization/src/tip/output/advanced_visualization.pdf>`__
-
-Repository is `available <https://bitbucket.org/jsoishi/yt.workshop2012.advanced_visualization>`__
-
-EPS Writer
-----------
-
-Presented by John Wise
-
-.. youtube:: Xs_FzJ0ZRiU
-
-Slides are `available <https://bitbucket.org/jsoishi/yt.workshop2012.advanced_visualization/src/tip/output/eps-writer.pdf>`__
-
-Repository not available.
-
-Time Series Analysis
---------------------
-
-Presented by Britton Smith
-
-.. youtube:: Sg_G2QOcxz0
-
-Slides are `available <https://bitbucket.org/brittonsmith/yt.workshop2012.time-series/src/tip/output/time_series.pdf>`__
-
-Repository is `available <https://bitbucket.org/brittonsmith/yt.workshop2012.time-series/>`__
-
-Beginning Volume Rendering
---------------------------
-
-Presented by Cameron Hummels
-
-.. youtube:: euV4KVm4nMw
-
-Slides are `available <https://bitbucket.org/chummels/yt.workshop2012.beginningvr/src/tip/yt.workshop2012.beginning_VR.pdf>`__
-
-Repository is `available <https://bitbucket.org/chummels/yt.workshop2012.beginningvr/>`__
-
-Advanced Volume Rendering
--------------------------
-
-Presented by Sam Skillman
-
-.. youtube:: zra79xb7BP4
-
-Slides are `available <https://bitbucket.org/samskillman/yt.workshop2012.advanced_rendering/src/tip/output/yt_advanced_rendering.pdf>`__
-
-Repository is `available <https://bitbucket.org/samskillman/yt.workshop2012.advanced_rendering>`__
-
-Using external tools with yt
-----------------------------
-
-Presented by John ZuHone
-
-.. youtube:: q3_whMA1BQU
-
-Slides are `available <https://bitbucket.org/jzuhone/yt.external_analysis_examples/src/tip/yt_workshop_external_tools.pdf>`__
-
-Repository is `available <https://bitbucket.org/jzuhone/yt.external_analysis_examples/>`__
-
-Hands-On: Advanced Data Objects
--------------------------------
-
-Presented by Stephen Skory
-
-.. youtube:: _JlFsrqoEBI
-
-Slides are `available <https://bitbucket.org/sskory/yt.workshop2012.datacontainers/src/tip/DataContainers.pdf>`__
-
-Repository is `available <https://bitbucket.org/sskory/yt.workshop2012.datacontainers>`__
-
-Clump Finding
--------------
-
-Presented by Britton Smith
-
-.. youtube:: 19PCVyVj6oo
-
-Slides are `available <https://bitbucket.org/brittonsmith/yt.workshop2012.clump-finding/src/tip/clump_finding_full.pdf>`__
-
-Repository is `available <https://bitbucket.org/brittonsmith/yt.workshop2012.clump-finding/>`__
-
-DVCS with hg
-------------
-
-Presented by Matthew Turk
-
-.. youtube:: BjfUfcqSCYQ
-
-Slides are `available <https://bitbucket.org/MatthewTurk/yt.workshop2012.dvcs/src/tip/output/dvcs.pdf>`__
-
-Repository is `available <https://bitbucket.org/MatthewTurk/yt.workshop2012.dvcs>`__
-
-What We Aren't Showing You
---------------------------
-
-Presented by Cameron Hummels
-
-.. youtube:: qcVlf5p_1ZI
-
-Slides are `available <https://bitbucket.org/chummels/yt.workshop2012.misc/src/tip/yt.workshop2012.what_we_dont_show.pdf>`__
-
-Repository is `available <https://bitbucket.org/chummels/yt.workshop2012.misc/>`__
-
-Development Overview
---------------------
-
-Presented by Sam Skillman
-
-.. youtube:: DG8MJIzeKyA
-
-Slides are `available <https://bitbucket.org/samskillman/yt.workshop2012.development_overview/src/tip/yt_development_overview.pdf>`__
-
-Repository is `available <https://bitbucket.org/samskillman/yt.workshop2012.development_overview/>`__
-
-Testing and Documentation
--------------------------
-
-Presented by Stephen Skory
-
-.. youtube:: OMeqR38KNN4
-
-Slides are `available <https://bitbucket.org/sskory/yt.workshop2012.documentation/src/tip/Documentation.pdf>`__
-
-Repository is `available <https://bitbucket.org/sskory/yt.workshop2012.documentation>`__
-
-Adding a new code frontend
---------------------------
-
-Presented by Matthew Turk
-
-Video not available.
-
-Slides are `available <https://bitbucket.org/MatthewTurk/yt.workshop2012.frontends/src/tip/output/frontends.pdf>`__
-
-Repository is `available <https://bitbucket.org/MatthewTurk/yt.workshop2012.frontends/>`__
-


https://bitbucket.org/yt_analysis/yt-doc/commits/62e32737bacc/
Changeset:   62e32737bacc
User:        MatthewTurk
Date:        2013-10-28 19:57:17
Summary:     Moving analysis modules under analyzing data, splitting into general and
astrophysics.
Affected #:  99 files

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/ParallelHaloFinder.pdf
Binary file source/analysis_modules/ParallelHaloFinder.pdf has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/2ptcorrelation.png
Binary file source/analysis_modules/_images/2ptcorrelation.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/2ptcorrelation.svg
--- a/source/analysis_modules/_images/2ptcorrelation.svg
+++ /dev/null
@@ -1,186 +0,0 @@
[186 lines of SVG markup for the two-point correlation figure
(2ptcorrelation.svg) omitted; the figure shows two volume elements labeled
dV1 and dV2 connected by a separation vector r12.]

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/31micron.png
Binary file source/analysis_modules/_images/31micron.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/LightCone_full_small.png
Binary file source/analysis_modules/_images/LightCone_full_small.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/PDF.png
Binary file source/analysis_modules/_images/PDF.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/PDF.svgz
Binary file source/analysis_modules/_images/PDF.svgz has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/ParallelHaloFinder.png
Binary file source/analysis_modules/_images/ParallelHaloFinder.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/ParallelHaloFinder.svg
--- a/source/analysis_modules/_images/ParallelHaloFinder.svg
+++ /dev/null
@@ -1,617 +0,0 @@
[617 lines of SVG markup for the parallel halo finder figure
(ParallelHaloFinder.svg) omitted; the figure shows three panels (i, ii, iii)
with haloes labeled 1-3 and regions labeled A-D.]

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/SED.png
Binary file source/analysis_modules/_images/SED.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/SFR.png
Binary file source/analysis_modules/_images/SFR.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/TreecodeCellsBig.png
Binary file source/analysis_modules/_images/TreecodeCellsBig.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/TreecodeCellsSmall.png
Binary file source/analysis_modules/_images/TreecodeCellsSmall.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/TreecodeOpeningAngleBig.png
Binary file source/analysis_modules/_images/TreecodeOpeningAngleBig.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/halo_mass_function.png
Binary file source/analysis_modules/_images/halo_mass_function.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/lightray.png
Binary file source/analysis_modules/_images/lightray.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/merger_tree_ex.png
Binary file source/analysis_modules/_images/merger_tree_ex.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/mw3_0420.png
Binary file source/analysis_modules/_images/mw3_0420.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/profiles.png
Binary file source/analysis_modules/_images/profiles.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/projections.png
Binary file source/analysis_modules/_images/projections.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/spectrum_full.png
Binary file source/analysis_modules/_images/spectrum_full.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/spectrum_zoom.png
Binary file source/analysis_modules/_images/spectrum_zoom.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/struct_fcn_subvolumes0.png
Binary file source/analysis_modules/_images/struct_fcn_subvolumes0.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/struct_fcn_subvolumes0.svgz
Binary file source/analysis_modules/_images/struct_fcn_subvolumes0.svgz has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/struct_fcn_subvolumes1.png
Binary file source/analysis_modules/_images/struct_fcn_subvolumes1.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/struct_fcn_subvolumes1.svgz
Binary file source/analysis_modules/_images/struct_fcn_subvolumes1.svgz has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/struct_fcn_subvolumes2.png
Binary file source/analysis_modules/_images/struct_fcn_subvolumes2.png has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/_images/struct_fcn_subvolumes2.svgz
Binary file source/analysis_modules/_images/struct_fcn_subvolumes2.svgz has changed

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/absorption_spectrum.rst
--- a/source/analysis_modules/absorption_spectrum.rst
+++ /dev/null
@@ -1,104 +0,0 @@
-.. _absorption_spectrum:
-
-Making an Absorption Spectrum
-=============================
-.. sectionauthor:: Britton Smith <brittonsmith at gmail.com>
-
-Absorption line spectra, such as those shown below, can be made with data created
-by the :ref:`light-ray-generator`.  For each element of the ray, column densities
-are calculated by multiplying the number density within a grid cell by the path
-length of the ray through the cell.  Line profiles are generated using a Voigt
-profile based on the temperature field.  The lines are then shifted according to
-the redshift recorded by the light ray tool and (optionally) the line of sight
-peculiar velocity.  Inclusion of the peculiar velocity requires setting
-**get_los_velocity** to True in the call to :meth:`make_light_ray`.
-
-The spectrum generator will output a file containing the wavelength and normalized flux.  
-It will also output a text file listing all important lines.
-
-.. image:: _images/spectrum_full.png
-   :width: 500
-
-An absorption spectrum for the wavelength range from 900 to 1800 Angstroms made with 
-a light ray extending from z = 0 to z = 0.4.
-
-.. image:: _images/spectrum_zoom.png
-   :width: 500
-
-A zoom-in of the above spectrum.
-
-Creating an Absorption Spectrum
--------------------------------
-
-To instantiate an AbsorptionSpectrum object, the arguments required are the minimum and 
-maximum wavelengths, and the number of wavelength bins.
-
-.. code-block:: python
-
-  from yt.analysis_modules.api import AbsorptionSpectrum
-
-  sp = AbsorptionSpectrum(900.0, 1800.0, 10000)
-
-Adding Features to the Spectrum
--------------------------------
-
-Absorption lines and continuum features can then be added to the spectrum.  To add a 
-line, you must know some properties of the line: the rest wavelength, f-value, gamma value, 
-and the atomic mass in amu of the atom.  Below, we will add the H Lyman-alpha line.
-
-.. code-block:: python
-  
-  my_label = 'HI Lya'
-  field = 'HI_NumberDensity'
-  wavelength = 1215.6700 # Angstroms
-  f_value = 4.164E-01
-  gamma = 6.265e+08
-  mass = 1.00794
-  
-  sp.add_line(my_label, field, wavelength, f_value, gamma, mass, label_threshold=1.e10)
-
-In the above example, the *field* argument tells the spectrum generator which field from the 
-ray data to use to calculate the column density.  The **label_threshold** keyword tells the 
-spectrum generator to add all lines above a column density of 10 :superscript:`10` 
-cm :superscript:`-2` to the text line list.  If None is provided, as is the default, no 
-lines of this type will be added to the text list.
-
-Continuum features whose optical depths follow a power law can also be added.
-Below, we will add the H Lyman continuum.
-
-.. code-block:: python
-
-  my_label = 'HI Lya'
-  field = 'HI_NumberDensity'
-  wavelength = 912.323660 # Angstroms
-  normalization = 1.6e17
-  index = 3.0
-  
-  sp.add_continuum(my_label, field, wavelength, normalization, index)
-
-Making the Spectrum
--------------------
-
-Once all the lines and continuum are added, it is time to make a spectrum out of 
-some light ray data.
-
-.. code-block:: python
-
-  wavelength, flux = sp.make_spectrum('lightray.h5', output_file='spectrum.fits', 
-                                      line_list_file='lines.txt',
-                                      use_peculiar_velocity=True)
-
-A spectrum will be made using the specified ray data and the wavelength and flux arrays 
-will also be returned.  If **use_peculiar_velocity** is set to False, the lines will only 
-be shifted according to the redshift.
-
-Three output file formats are supported for writing out the spectrum: FITS, HDF5,
-and ASCII.  The file format used is based on the extension provided in the
-**output_file** keyword: '.fits' for a FITS file, '.h5' for an HDF5 file, and
-anything else for an ASCII file.
-
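-For example, writing the spectrum as HDF5 rather than FITS only requires
-changing the extension (a sketch reusing the hypothetical filenames from above):
-
-.. code-block:: python
-
-  # the '.h5' extension selects the HDF5 writer
-  wavelength, flux = sp.make_spectrum('lightray.h5', output_file='spectrum.h5',
-                                      line_list_file='lines.txt')
-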
-.. note:: To write out a FITS file, you must install the `pyfits <http://www.stsci.edu/resources/software_hardware/pyfits>`_ module.
-
-What can I do with this?
-------------------------
-
-Try :ref:`quick_start_fitting`

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/clump_finding.rst
--- a/source/analysis_modules/clump_finding.rst
+++ /dev/null
@@ -1,186 +0,0 @@
-.. _clump_finding:
-
-Clump Finding
-=============
-.. sectionauthor:: Britton Smith <britton.smith at colorado.edu>
-
-yt has the ability to identify topologically disconnected structures in a dataset
-using any available field.  This is powered by a contouring algorithm that runs in
-a recursive fashion.  The user specifies the initial data object in which the
-clump-finding will occur, the field over which the contouring will be done, the
-upper and lower limits of the initial contour, and the contour increment.
-
-The clump finder begins by creating a single contour of the specified field over
-the entire range given.  For every isolated contour identified in the initial
-iteration, contouring is repeated with the same upper limit as before, but with
-the lower limit increased by the specified increment.  This is repeated for every
-isolated group until the lower limit is equal to the upper limit.
-
-Often very tiny clumps can appear as groups of only a few cells that happen to be slightly 
-overdense (if contouring over density) with respect to the surrounding gas.  The user may 
-specify criteria that clumps must meet in order to be kept.  The most obvious example is 
-selecting only those clumps that are gravitationally bound.
-
-Once the clump-finder has finished, the user can write out a set of quantities
-for each clump in the hierarchy.  Additional info items can also be added.  We
-also provide a recipe for finding clumps in :ref:`cookbook-find_clumps`; a
-minimal example is sketched below.
-
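-This sketch drives the clump finder directly.  The ``Clump`` and ``find_clumps``
-names are assumed to be importable from the level_sets analysis module, and the
-dataset name and contour choices are hypothetical:
-
-.. code-block:: python
-
-  from yt.mods import *
-  from yt.analysis_modules.level_sets.api import Clump, find_clumps
-
-  pf = load("DD0000")
-  data_source = pf.h.sphere([0.5, 0.5, 0.5], radius=0.1)
-
-  field = "Density"
-  c_min = data_source[field].min()  # lower limit of the initial contour
-  c_max = data_source[field].max()  # upper limit of the initial contour
-  step = 2.0                        # contour increment
-
-  master_clump = Clump(data_source, None, field)
-  find_clumps(master_clump, c_min, c_max, step)
-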
-Treecode Optimization
----------------------
-
-.. sectionauthor:: Stephen Skory <s at skory.us>
-.. versionadded:: 2.1
-
-As mentioned above, the user has the option to limit clumps to those that are
-gravitationally bound.
-The correct and accurate way to calculate if a clump is gravitationally
-bound is to do the full double sum:
-
-.. math::
-
-  PE = \sum_{i=1}^N \sum_{j=i+1}^N \frac{G M_i M_j}{r_{ij}}
-
-where :math:`PE` is the gravitational potential energy of :math:`N` cells,
-:math:`G` is the
-gravitational constant, :math:`M_i` is the mass of cell :math:`i`, 
-and :math:`r_{ij}` is the distance
-between cell :math:`i` and :math:`j`.
-The number of calculations required
-grows with the square of :math:`N`. Therefore, for large clumps with many cells, the
-test for boundedness can take a significant amount of time.
-
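-As a concrete illustration of that scaling, the full double sum can be written
-directly in NumPy (a sketch for illustration, not the routine yt uses internally):
-
-.. code-block:: python
-
-  import numpy as np
-
-  G = 6.673e-8  # gravitational constant in cgs units
-
-  def potential_energy(mass, pos):
-      """Brute-force O(N^2) pairwise gravitational potential energy."""
-      pe = 0.0
-      for i in range(len(mass)):
-          # distances from cell i to every cell j > i
-          r = np.sqrt(((pos[i + 1:] - pos[i]) ** 2).sum(axis=1))
-          pe += (G * mass[i] * mass[i + 1:] / r).sum()
-      return pe
-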
-An effective way to greatly speed up this calculation with minimal error
-is to use the treecode approximation pioneered by
-`Barnes and Hut (1986) <http://adsabs.harvard.edu/abs/1986Natur.324..446B>`_.
-This method of calculating gravitational potentials works by
-grouping individual masses that are located close together into a larger conglomerated
-mass with a geometric size equal to the distribution of the individual masses.
-For a mass cell that is sufficiently distant from the conglomerated mass,
-the gravitational calculation can be made using the conglomerate, rather than
-each individual mass, which saves time.
-
-The decision whether or not to use a conglomerate depends on the accuracy control
-parameter ``opening_angle``. Using the small-angle approximation, a conglomerate
-may be used if its geometric size subtends an angle no greater than the
-``opening_angle`` upon the remote mass. The default value is
-``opening_angle = 1``, which gives errors well under 1%. A value of 
-``opening_angle = 0`` is identical to the full O(N^2) method, and larger values
-will speed up the calculation and sacrifice accuracy (see the figures below).
-
-The treecode method is iterative. Conglomerates may themselves form larger
-conglomerates. And if a larger conglomerate does not meet the ``opening_angle``
-criterion, the smaller conglomerates are tested as well. This iteration of 
-conglomerates will
-cease once the level of the original masses is reached (this is what happens
-for all pair calculations if ``opening_angle = 0``).
-
-Below are some examples of how to control the usage of the treecode.
-
-This example will calculate the ratio of the potential energy to kinetic energy
-for a spherical clump using the treecode method with an opening angle of 2.
-The default opening angle is 1.0:
-
-.. code-block:: python
-  
-  from yt.mods import *
-  
-  pf = load("DD0000")
-  sp = pf.h.sphere([0.5, 0.5, 0.5], radius=0.1)
-  
-  ratio = sp.quantities["IsBound"](truncate=False, include_thermal_energy=True,
-      treecode=True, opening_angle=2.0)
-
-This example will accomplish the same as the above, but will use the full
-N^2 method.
-
-.. code-block:: python
-  
-  from yt.mods import *
-  
-  pf = load("DD0000")
-  sp = pf.h.sphere([0.5, 0.5, 0.5], radius=0.1)
-  
-  ratio = sp.quantities["IsBound"](truncate=False, include_thermal_energy=True,
-      treecode=False)
-
-Here the treecode method is specified for clump finding (this is the default).
-Please see the link above for the full example of how to find clumps (the
-trailing backslash is important!):
-
-.. code-block:: python
-  
-  function_name = 'self.data.quantities["IsBound"](truncate=True, \
-      include_thermal_energy=True, treecode=True, opening_angle=2.0) > 1.0'
-  master_clump = amods.level_sets.Clump(data_source, None, field,
-      function=function_name)
-
-To turn off the treecode, simply set ``treecode=False`` in the
-example above.
-
-Treecode Speedup and Accuracy Figures
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Two datasets are used to make the three figures below. Each is a zoom-in
-simulation with high resolution in the middle with AMR, and then lower
-resolution static grids on the periphery. In this way they are very similar to
-a clump in a full-AMR simulation, where there are many AMR levels stacked
-around a density peak. One dataset has a total of 3 levels of AMR, and
-the other has 10 levels, but in other ways are very similar.
-
-The first figure shows the effect of varying the opening angle on the speed
-and accuracy of the treecode. The tests were performed using the L=10
-dataset on a clump with approximately 118,000 cells. The speedup of the
-treecode is shown in green, and the accuracy in blue, with the opening angle
-on the x-axis.
-
-With an ``opening_angle`` = 0, the accuracy is perfect, but the treecode is
-less than half as fast as the brute-force method. However, by an
-``opening_angle`` of 1, the treecode is nearly twice as fast, with
-only about 0.2% error. This trend continues up to an ``opening_angle`` of 8,
-beyond which larger opening angles have no further effect due to geometry.
-
-.. image:: _images/TreecodeOpeningAngleBig.png
-   :width: 450
-   :height: 400
-
-Note that the accuracy is always below 1. The treecode will always underestimate
-the gravitational binding energy of a clump.
-
-In this next figure, the ``opening_angle`` is kept constant at 1, but the
-number of cells is varied on the L=3 dataset by slowly expanding a spherical
-region of analysis. Up to about 100,000 cells,
-the treecode is actually slower than the brute-force method. This is because
-with fewer cells, smaller geometric distances,
-and a shallow AMR hierarchy, the treecode
-method has very little chance to be applied, and the calculation is overall
-slower due to the overhead and startup costs of the treecode method. This
-explanation is further strengthened by the fact that the accuracy of the
-treecode method stays perfect for the first couple thousand cells, indicating
-that the treecode method is not being applied over that range.
-
-Once the number of cells gets high enough, and the size of the region becomes
-large enough, the treecode method can work its magic and becomes advantageous.
-
-.. image:: _images/TreecodeCellsSmall.png
-   :width: 450
-   :height: 400
-
-The saving grace to the figure above is that for small clumps, a difference of
-50% in calculation time is on the order of a second or less, which is tiny
-compared to the minutes saved for the larger clumps where the speedup can
-be greater than 3.
-
-The final figure is identical to the one above, but for the L=10 dataset.
-Due to the higher number of AMR levels, which translates into more opportunities
-for the treecode method to be applied, the treecode becomes faster than the
-brute-force method at only about 30,000 cells. The accuracy shows a different
-behavior, with a dip and a rise, and overall lower accuracy. However, at all
-times the error is still well under 1%, and the time savings are significant.
-
-.. image:: _images/TreecodeCellsBig.png
-   :width: 450
-   :height: 400
-
-The figures above show that the treecode method is generally very advantageous,
-and that the error introduced is minimal.
\ No newline at end of file

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/ellipsoid_analysis.rst
--- a/source/analysis_modules/ellipsoid_analysis.rst
+++ /dev/null
@@ -1,166 +0,0 @@
-.. _ellipsoid_analysis:
-
-Halo Ellipsoid Analysis
-=======================
-.. sectionauthor:: Geoffrey So <gso at physics.ucsd.edu>
-
-.. warning:: This is my first attempt at modifying the yt source code,
-   so the program may be bug ridden.  Please send an email to yt-dev,
-   addressed to Geoffrey So, if you discover something wrong with this
-   portion of the code.
-
-Purpose
--------
-
-The purpose of creating this feature in yt is to analyze field
-properties that surround dark matter haloes.  Originally, this was
-usually done with the sphere 3D container, but since many halo
-particles are linked together in a more elongated shape, I thought it
-would be better to use an ellipsoid 3D container to wrap around the
-particles.  This way, less of the particle-free space around the
-halo is included when analyzing field properties in the region the
-particles are supposed to occupy.
-
-General Overview
-----------------
-
-In order to use the ellipsoid 3D container object, one must supply it
-with a center, the magnitudes of the semi-principal axes, the direction
-of the first semi-principal axis, and the tilt angle (the rotation angle
-about the y-axis that aligns the first semi-principal axis with the
-x-axis once it lies in the x-z plane).
-
-Once those parameters are determined, the function "ellipsoid" will
-return the 3D object, and users will be able to get field attributes
-from the data object just as they would from spheres, cylinders etc.
-
-Example
--------
-
-To use the ellipsoid container to get field information, you
-will have to first determine the ellipsoid's parameters.  This can be
-done with the haloes obtained from halo finding, but essentially it
-takes the information:
-
-  #. Center position x,y,z
-  #. List of particle positions x,y,z
-
-and calculates the ellipsoid information needed for the 3D container.
-
-What I usually do is get this information from the halo finder output
-files in the .h5 HDF5 binary format. I load them into memory using the
-LoadHaloes() function instead of reading in the ASCII output.
-
-Halo Finding
-~~~~~~~~~~~~
-.. code-block:: python
-
-  from yt.mods import *
-  from yt.analysis_modules.halo_finding.api import *
-
-  pf = load('RD0006/RD0006')
-  halo_list = parallelHF(pf)
-  halo_list.dump('MyHaloList')
-
-Ellipsoid Parameters
-~~~~~~~~~~~~~~~~~~~~
-.. code-block:: python
-
-  from yt.mods import *
-  from yt.analysis_modules.halo_finding.api import *
-
-  pf = load('RD0006/RD0006')
-  haloes = LoadHaloes(pf, 'MyHaloList')
-
-Once the halo information is saved and loaded into the data object
-``haloes``, you can loop over the list of haloes and do
-
-.. code-block:: python
-
-  ell_param = haloes[0].get_ellipsoid_parameters()
-
-This will return 6 items
-
-  #. The center of mass as an array.
-  #. A as a float.  (Must have A>=B)
-  #. B as a float.  (Must have B>=C)
-  #. C as a float.  (Must have C > cell size)
-  #. e0 vector as an array.  (now normalized automatically in the code)
-  #. tilt as a float.
-
-The center of mass would be the same one as returned by the halo
-finder.  The A, B, C are the magnitudes of the ellipsoid's
-semi-principal axes, from largest to smallest. "e0" is the direction
-of the largest semi-principal axis (the axis with magnitude A),
-returned normalized.  The "tilt" is an angle measured in radians.  It
-is best described as follows: after rotating about the z-axis to align
-e0 with x in the x-y plane, and then rotating about the y-axis to
-align e0 completely with the x-axis, the tilt is the remaining rotation
-about the x-axis needed to align both e1 with the y-axis and e2 with
-the z-axis.
-
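-In other words, the tilt is the final rotation of a z-y-x rotation
-sequence.  The elementary rotation matrices involved can be sketched in
-NumPy (these helpers are purely illustrative and not part of yt):
-
-.. code-block:: python
-
-  import numpy as np
-
-  def Rz(a):  # rotation about the z-axis by angle a (radians)
-      c, s = np.cos(a), np.sin(a)
-      return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
-
-  def Ry(a):  # rotation about the y-axis
-      c, s = np.cos(a), np.sin(a)
-      return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
-
-  def Rx(a):  # rotation about the x-axis; the tilt is this final angle
-      c, s = np.cos(a), np.sin(a)
-      return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
-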
-Ellipsoid 3D Container
-~~~~~~~~~~~~~~~~~~~~~~
-
-Once the parameters are obtained from the get_ellipsoid_parameters()
-function, or chosen by the user, they can be passed to the
-ellipsoid container as:
-
-.. code-block:: python
-
-  ell = pf.h.ellipsoid(ell_param[0],
-                       ell_param[1],
-                       ell_param[2],
-                       ell_param[3],
-                       ell_param[4],
-                       ell_param[5])
-  dens = ell.quantities['TotalQuantity']('Density')[0]
-
-This way, "ell" will be the ellipsoid container, and "dens" will be
-the total density of the ellipsoid in an unigrid simulation.  One can
-of course use this container object with parameters that they come up
-with, the ellipsoid parameters do not have to come from the Halo
-Finder.  And of course, one can use the ellipsoid container with other
-derived fields or fields that they are interested in.
-
-Drawbacks
----------
-
-Since this is a first attempt, there are many drawbacks and corners
-cut.  Many things listed here will be amended when I have time.
-
-* The ellipsoid 3D container, like the boolean object, does not contain
-  particle position and velocity information.
-* This currently assumes periodic boundary conditions, so if an
-  ellipsoid center is at the edge, it will return part of the opposite
-  edge's field information.  I will try to put in the option to turn off
-  periodicity in the future.
-* This method gives a minimalistic ellipsoid centered around the
-  center of mass that contains all the particles, but sometimes people
-  prefer an inertial tensor triaxial ellipsoid described in 
-  `Dubinski, Carlberg 1991
-  <http://adsabs.harvard.edu/abs/1991ApJ...378..496D>`_.  I have that
-  method composed but it is not fully tested yet.
-* The method to obtain information from the halo still uses the center
-  of mass as the center of the ellipsoid, so it is not producing the
-  smallest possible ellipsoid that contains the particles.  Starting
-  from the center of the particle distribution would require an
-  O(:math:`N^2`) operation; right now I'm trying to limit
-  everything to O(:math:`N`) operations.  If the particle count does
-  not get too large, I may implement the O(:math:`N^2`) operation.
-* Currently the list of haloes can be analyzed using object
-  parallelism (one halo per core), but I'm not sure if haloes will get
-  big enough soon that other forms of parallelism will be needed to
-  analyze them due to memory constraint.
-* This has only been tested on unigrid simulation data, not AMR.  In
-  unigrid simulations, I can take "dens" from the example and divide
-  it by the total number of cells to get the average density; in AMR
-  one would need to do a volume-weighted average instead.
-
-Thanks
-------
-
-Big thanks to the yt-user and yt-dev community that have been so
-supportive.  Special thanks to Stephen Skory for help in coding some
-functions that I'm not familiar with, Britton Smith's advice to shave
-off redundant data, Matt Turk for encouraging me to even start on
-this trek, and Dave Collins for getting ideas straight in my head.

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/fitting_procedure.rst
--- a/source/analysis_modules/fitting_procedure.rst
+++ /dev/null
@@ -1,138 +0,0 @@
-.. _fitting_procedure:
-
-Procedure for Generating Fits
-=============================
-.. sectionauthor:: Hilary Egan <hilary.egan at colorado.edu>
-
-To generate a fit for a spectrum, :py:func:`generate_total_fit` is called.
-This function controls the identification of line complexes, the fitting
-of a series of absorption lines for each appropriate species, the checking
-of those fits, and returns the results of the fits.
-
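-A minimal sketch of the calling pattern is below; the wavelength and flux
-arrays would come from an ``AbsorptionSpectrum``, and the species dictionary
-shown is a hypothetical stand-in for the real parameter dictionaries:
-
-.. code-block:: python
-
-  # hypothetical species parameter dictionary for H I Lyman alpha;
-  # the real dictionaries also carry initial guesses and allowed
-  # ranges for column density, b-value, and redshift
-  HI_parameters = {'name': 'HI',
-                   'f': [4.164E-01],
-                   'Gamma': [6.265E8],
-                   'wavelength': [1215.67],
-                   'mass': 1.00794}
-  speciesDicts = {'HI': HI_parameters}
-
-  fitted_lines, fitted_flux = generate_total_fit(wavelength, flux,
-                                                 ['HI'], speciesDicts)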
-
-Finding Line Complexes
-----------------------
-Line complexes are found using the :py:func:`find_complexes` function. The
-process by which line complexes are found involves walking through
-the array of flux in order from minimum to maximum wavelength, and finding
-series of spatially contiguous cells whose flux is less than some limit.
-These regions are then checked in terms of an additional flux limit and size.
-The bounds of all the passing regions are then listed and returned. Those
-bounds that cover an exceptionally large region of wavelength space will be
-broken up if a suitable cut point is found. This method is only appropriate
-for noiseless spectra.
-
-The optional parameter **complexLim** (default = 0.999) controls the limit
-that triggers the identification of a spatially contiguous region of flux
-that could be a line complex. This number should be very close to 1 but not
-exactly equal. It should also be at least an order of magnitude closer to 1
-than the **fitLim** parameter discussed below, because a line complex where
-the flux of the trough is very close to the flux of the edge can be incredibly
-unstable when optimizing.
-
-The **fitLim** parameter controls the maximum flux that the trough
-of the region can have and still be considered a line complex. This
-effectively controls the sensitivity to very low column absorbers. The default
-value is **fitLim** = 0.99. If a region is identified where the flux of the trough
-is greater than this value, the region is simply ignored.
-
-The **minLength** parameter controls the minimum number of array elements 
-that an identified region must have. This value must be greater than or
-equal to 3 as there are a minimum of 3 free parameters that must be fit.
-Default is **minLength** = 3.
-
-The **maxLength** parameter controls the maximum number of array elements
-that an identified region can have before it is split into separate regions.
-The default is **maxLength** = 1000. This should be adjusted based on the
-resolution of the spectrum to remain appropriate. The default value
-corresponds to a wavelength window of roughly 50 angstroms.
-
-The **splitLim** parameter controls how exceptionally large regions are split.
-When such a region is identified by having more array elements than
-**maxLength**, the point of maximum flux (or minimum absorption) in the 
-middle two quartiles is identified. If that point has a flux greater than
-or equal to **splitLim**, then two separate complexes are created: one from
-the lower wavelength edge to the minimum absorption point and the other from
-the minimum absorption point to the higher wavelength edge. The default
-value is **splitLim** = .99, but it should not drastically affect results, so
-long as the value is reasonably close to 1.
-
-
-Fitting a Line Complex
-----------------------
-
-After a complex is identified, it is fitted by iteratively adding and 
-optimizing a set of Voigt Profiles for a particular species until the
-region is considered successfully fit. The optimizing is accomplished
-using scipy's least squares optimizer. This requires an initial estimate
-of the parameters to be fit (column density, b-value, redshift) for each
-line.
-
-Each time a line is added, the guess of the parameters is based on
-the difference between the line complex and the fit so far. For the first line
-this just means the initial guess is based solely on the flux of the line
-complex. The column density is taken from the initial column density given
-in the species parameters dictionary. If the line is saturated (some portion
-of the flux has a value less than .1) then the larger initial column density
-guess is chosen. If the flux is relatively high (all values > .9) then the
-smaller initial guess is given. These values are chosen to make optimization
-faster and more stable by being closer to the actual value, but the final
-results of fitting should not depend on them as they merely provide a
-starting point. 
-
-After the parameters for a line are optimized for the first time, the 
-optimized parameters are then used for the initial guess on subsequent 
-iterations with more lines. 
-
-The complex is considered successfully fit when the sum of the squares of
-the difference between the flux generated from the fit and the desired flux
-profile is less than **errBound**. **errBound** is related to the optional
-parameter **maxAvgError** of :py:func:`generate_total_fit` by the number
-of array elements in the region, such that **errBound** = number of elements *
-**maxAvgError**.
-
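-Expressed as code, for a hypothetical flux array ``region_flux`` and a chosen
-**maxAvgError**:
-
-.. code-block:: python
-
-  # the error bound scales with the size of the region being fit
-  errBound = len(region_flux) * maxAvgError
-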
-There are several other conditions under which the cycle of adding and 
-optimizing lines will halt. If the error of the optimized fit from adding
-a line is an order of magnitude worse than the error of the fit without
-that line, then it is assumed that the fitting has become unstable and 
-the latest line is removed. Lines are also prevented from being added if
-the total number of lines is greater than the number of elements in the flux
-array being fit divided by 3. This is because there must not be more free
-parameters in a fit than the number of points to constrain them. 
-
-
-Checking Fit Results
---------------------
-
-After an acceptable fit for a region is determined, there are several steps
-the algorithm must go through to validate the fits. 
-
-First, the parameters must be in a reasonable range. This is a check to make 
-sure that the optimization did not become unstable and generate a fit that
-diverges wildly outside the region where the fit was performed. This way, even
-if a particular complex cannot be fit, the rest of the spectrum fitting still
-behaves as expected. The range of acceptability for each parameter is given
-in the species parameter dictionary. These are merely broad limits that will
-prevent numerical instability rather than physical limits.
-
-In cases where a single species generates multiple lines (as in the OVI 
-doublet), the fits are then checked for higher wavelength lines. Originally
-the fits are generated only considering the lowest wavelength fit to a region.
-This is because we perform the fitting of complexes in order from the lowest
-wavelength to the highest, so any contribution to a complex being fit must
-come from the lower wavelength as the higher wavelength contributions would
-already have been subtracted out after fitting the lower wavelength. 
-
-Saturated Lyman Alpha Fitting Tools
------------------------------------
-
-In cases where a large or saturated line (there exists a point in the complex
-where the flux is less than .1) fails to be fit properly on the first pass, a
-more robust set of fitting tools is used to try to remedy the situation.
-The basic approach is to simply try a much wider range of initial parameter
-guesses in order to find the true optimization minimum, rather than getting
-stuck in a local minimum. A set of hard-coded initial parameter guesses
-for Lyman alpha lines is given by the function :py:func:`get_test_lines`.
-Also included in these parameter guesses is an initial guess of a high
-column cool line overlapping a lower column warm line, indicative of a
-broad Lyman alpha (BLA) absorber.

diff -r bb1365132027344adebd1bc86e8466b71d0f1661 -r 62e32737bacce37678bf875d946872f58b9e2b4b source/analysis_modules/halo_mass_function.rst
--- a/source/analysis_modules/halo_mass_function.rst
+++ /dev/null
@@ -1,159 +0,0 @@
-.. _halo_mass_function:
-
-Halo Mass Function
-==================
-.. sectionauthor:: Stephen Skory <sskory at physics.ucsd.edu>
-.. versionadded:: 1.6
-
-The Halo Mass Function extension is capable of outputting the halo mass function
-for a collection of haloes (input), and/or an analytical fit over a given mass
-range for a set of specified cosmological parameters.
-
-This extension is based on code generously provided by Brian O'Shea.
-
-General Overview
-----------------
-
-In order to run this extension on a dataset, the haloes need to be located
-(using HOP, FOF or Parallel HOP, see :ref:`halo_finding`),
-and their virial masses determined using the
-HaloProfiler (see :ref:`halo_profiling`).
-Please see the step-by-step how-to which puts these steps together
-(:ref:`hmf_howto`).
-If an optional analytical fit is desired, the correct initial
-cosmological parameters will need to be input as well. These initial parameters
-are not stored in an Enzo dataset, so they must be set by hand.
-An analytical fit can be found without referencing a particular dataset or
-set of haloes, but all the cosmological parameters need to be set by hand.
-
-Analytical Fits
----------------
-
-There are five analytical fits to choose from.
-
-  1. `Press-Schechter (1974) <http://adsabs.harvard.edu/abs/1974ApJ...187..425P>`_
-  2. `Jenkins (2001) <http://adsabs.harvard.edu/abs/2001MNRAS.321..372J>`_
-  3. `Sheth-Tormen (2002) <http://adsabs.harvard.edu/abs/2002MNRAS.329...61S>`_
-  4. `Warren (2006) <http://adsabs.harvard.edu/abs/2006ApJ...646..881W>`_
-  5. `Tinker (2008) <http://adsabs.harvard.edu/abs/2008ApJ...688..709T>`_
-
-We encourage reading each of the primary sources.
-In general, we recommend the Warren fitting function because it matches
-simulations over a wide range of masses very well.
-The Warren fitting function is the default (equivalent to not specifying
-``fitting_function`` in ``HaloMassFcn()``, below).
-The Tinker fit is for the :math:`\Delta=300` fits given in the paper, which
-appears to fit HOP threshold=80.0 fairly well.
-
-Analyze Simulated Haloes
-------------------------
-
-If an analytical fit is not needed, it is simple to analyze a set of 
-haloes. The ``halo_file`` needs to be specified, and
-``fitting_function`` does not need to be specified.
-``num_sigma_bins`` is how many bins the halo masses are sorted into.
-The default is 360. ``mass_column`` is the zero-indexed column of the
-``halo_file`` file that contains the halo masses. The default is 5, which
-corresponds to the sixth column of data in the file.
-
-.. code-block:: python
-
-  from yt.mods import *
-  from yt.analysis_modules.halo_mass_function.api import *
-  pf = load("data0030")
-  hmf = HaloMassFcn(pf, halo_file="FilteredQuantities.out", num_sigma_bins=200,
-                    mass_column=5)
-
-Attached to ``hmf`` is the convenience function ``write_out``, which saves
-the halo mass function to a text file. By default, both the halo analysis (``haloes``) and
-fit (``fit``) are written to (different) text files, but they can be turned on or off
-explicitly. ``prefix`` sets the name used for the file(s). The haloes file
-is named ``prefix-haloes.dat``, and the fit file ``prefix-fit.dat``.
-Continued from above, invoking this command:
-
-.. code-block:: python
-
-  hmf.write_out(prefix='hmf', fit=False, haloes=True)
-
-will save the haloes data to a file named ``hmf-haloes.dat``. The contents
-of the ``-haloes.dat`` file are three columns, which can be read back as
-shown after this list:
-
-  1. log10 of mass (Msolar, NOT Msolar/h) for this bin.
-  2. mass (Msolar/h) for this bin.
-  3. cumulative number density of halos (per Mpc^3, NOT h^3/Mpc^3) in this bin.
-
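-Since the output is plain text, it can be read back with, for example, NumPy
-(a sketch, assuming the prefix used above):
-
-.. code-block:: python
-
-  import numpy as np
-
-  # columns: log10(M), M (Msolar/h), cumulative number density (Mpc^-3)
-  log_mass, mass, n_cumulative = np.loadtxt("hmf-haloes.dat", unpack=True)
-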
-Analytical Halo Mass Function Fit
----------------------------------
-
-When an analytical fit is desired, in nearly all cases several cosmological
-parameters will need to be specified by hand. These parameters are not
-stored with Enzo datasets. In the case where both the haloes and an analytical
-fit are desired, the analysis is instantiated as below.
-``sigma8input``, ``primordial_index`` and ``omega_baryon0`` should be set to
-the same values as
-``PowerSpectrumSigma8``, ``PowerSpectrumPrimordialIndex`` and
-``CosmologyOmegaBaryonNow`` from the
-`inits <http://lca.ucsd.edu/projects/enzo/wiki/UserGuide/RunningInits>`_
-parameter file used to set up the simulation.
-``fitting_function`` is set to one of the values 1 through 5 from the list of
-available fits above.
-
-.. code-block:: python
-
-  from yt.mods import *
-  from yt.analysis_modules.halo_mass_function.api import *
-  pf = load("data0030")
-  hmf = HaloMassFcn(pf, halo_file="FilteredQuantities.out",
-                    sigma8input=0.9, primordial_index=1., omega_baryon0=0.06,
-                    fitting_function=4)
-  hmf.write_out(prefix='hmf')
-
-Both the ``-haloes.dat`` and ``-fit.dat`` files are written to disk.
-The contents of the ``-fit.dat`` file are four columns:
-
-  1. log10 of mass (Msolar, NOT Msolar/h) for this bin.
-  2. mass (Msolar/h) for this bin.
-  3. (dn/dM)*dM (differential number density of halos, per Mpc^3, NOT h^3/Mpc^3) in this bin.
-  4. cumulative number density of halos (per Mpc^3, NOT h^3/Mpc^3) in this bin.
-
-Below is an example of the output for both the haloes and the (Warren)
-analytical fit, for three datasets. The black lines are the calculated
-halo mass functions, and the blue lines the analytical fit set by initial
-conditions. This simulation shows typical behavior, in that there are too
-few small haloes compared to the fit due to lack of mass and gravity resolution
-for small haloes. But at higher mass ranges, the simulated haloes are quite close
-to the analytical fit.
-
-.. image:: _images/halo_mass_function.png
-   :width: 350
-   :height: 400
-
-The analytical fit can be found without referencing a particular dataset. In this
-case, all the various cosmological parameters need to be specified by hand.
-``omega_matter0`` is the fraction of universe that is made up of matter
-(baryons and dark matter). ``omega_lambda0`` is the fractional proportion due
-to dark energy. In a flat universe, ``omega_matter0`` + ``omega_lambda0`` = 1.
-``this_redshift`` is the redshift for which you wish to generate a fit.
-``log_mass_min`` and ``log_mass_max`` are the logarithmic ends of the mass range for which
-you wish to calculate the fit.
-
-.. code-block:: python
-
-  from yt.mods import *
-  from yt.analysis_modules.halo_mass_function.api import *
-  hmf = HaloMassFcn(None, omega_matter0=0.3, omega_lambda0=0.7,
-                    omega_baryon0=0.06, hubble0=.7, this_redshift=0.,
-                    log_mass_min=8., log_mass_max=13., sigma8input=0.9,
-                    primordial_index=1., fitting_function=1)
-  hmf.write_out(prefix="hmf-press-schechter", fit=True, haloes=False)
-
-It is possible to access the output of the halo mass function without saving
-to disk. The content is stored in arrays hanging off the ``HaloMassFcn``
-object, and can be plotted directly as sketched after this list:
-
-  * ``hmf.logmassarray`` for log10 of mass bin.
-  * ``hmf.massarray`` for mass bin.
-  * ``hmf.dn_M_z`` for (dn/dM)*dM (analytical fit).
-  * ``hmf.nofmz_cum`` for cumulative number density of halos (analytical fit).
-  * ``hmf.dis`` for cumulative number density of halos (from the provided
-    halo information).
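-
-For example, a quick comparison of the simulated and analytical mass functions
-can be plotted from these arrays (a sketch, assuming matplotlib is installed):
-
-.. code-block:: python
-
-  import matplotlib.pyplot as plt
-
-  plt.semilogy(hmf.logmassarray, hmf.dis, label="simulated haloes")
-  plt.semilogy(hmf.logmassarray, hmf.nofmz_cum, label="analytical fit")
-  plt.xlabel("log10(M) [Msolar]")
-  plt.ylabel("cumulative number density [Mpc^-3]")
-  plt.legend()
-  plt.savefig("hmf_compare.png")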

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/4bee173c6ffc/
Changeset:   4bee173c6ffc
User:        MatthewTurk
Date:        2013-10-28 19:59:31
Summary:     Moving a few more into reference.
Affected #:  5 files

diff -r 62e32737bacce37678bf875d946872f58b9e2b4b -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e source/changelog.rst
--- a/source/changelog.rst
+++ /dev/null
@@ -1,434 +0,0 @@
-.. _changelog:
-
-ChangeLog
-=========
-
-
-This is a non-comprehensive log of changes to the code.
-
-Contributors
-------------
-
-Here are all of the contributors to the code base, in alphabetical order.
-
- * Tom Abel
- * David Collins
- * Brian Crosby
- * Andrew Cunningham
- * Nathan Goldbaum
- * Markus Haider
- * Cameron Hummels
- * Christian Karch
- * Ji-hoon Kim
- * Steffen Klemer
- * Kacper Kowalik
- * Michael Kuhlen
- * Eve Lee
- * Yuan Li
- * Chris Malone
- * Josh Moloney
- * Chris Moody
- * Andrew Myers
- * Jeff Oishi
- * Jean-Claude Passy
- * Mark Richardson
- * Thomas Robitaille
- * Anna Rosen
- * Anthony Scopatz
- * Devin Silvia
- * Sam Skillman
- * Stephen Skory
- * Britton Smith
- * Geoffrey So
- * Casey Stark
- * Elizabeth Tasker
- * Stephanie Tonnesen
- * Matthew Turk
- * Rick Wagner
- * John Wise
- * John ZuHone
-
-Version 2.5
------------
-
-Many below-the-surface changes happened in yt 2.5 to improve reliability, the
-fidelity of the answers, and to streamline the user interface.  The major change in
-this release has been the immense expansion in testing of yt.  We now have over
-2000 unit tests (run on every commit, thanks to both Kacper Kowalik and Shining
-Panda) as well as answer testing for FLASH, Enzo, Chombo and Orion data.
-
-The Stream frontend, which can construct datasets in memory, has been improved
-considerably.  It's now easier than ever to load data from disk.  If you know
-how to get volumetric data into Python, you can use either the
-``load_uniform_grid`` function or the ``load_amr_grid`` function to create an
-in-memory parameter file that yt can analyze.
-
-yt now supports the Athena code.
-
-yt is now focusing on providing first class support for the IPython notebook.
-In this release, plots can be displayed inline.  The Reason HTML5 GUI will be
-merged with the IPython notebook in a future release.
-
-Install Script Changes:
-~~~~~~~~~~~~~~~~~~~~~~~
-
- * SciPy can now be installed
- * Rockstar can now be installed
- * Dependencies can be updated with "yt update --all"
- * Cython has been upgraded to 0.17.1
- * Python has been upgraded to 2.7.3
- * h5py has been upgraded to 2.1.0
- * hdf5 has been upgraded to 1.8.9
- * matplotlib has been upgraded to 1.2.0
- * IPython has been upgraded to 0.13.1
- * Forthon has been upgraded to 0.8.10
- * nose has been added
- * sympy has been added
- * python-hglib has been added
-
-We've also improved support for installing on OSX, Ubuntu and OpenSUSE.
-
-Most Visible Improvements
-~~~~~~~~~~~~~~~~~~~~~~~~~
-
 * Nearly 200 pull requests and over 1000 changesets have been merged since yt
-   2.4 was released on August 2nd, 2012.
- * numpy is now imported as np, not na.  na will continue to work for the
-   foreseeable future.
 * You can now get a `yt cheat sheet <http://yt-project.org/docs/2.5/cheatsheet.pdf>`_!
- * yt can now load simulation data created by Athena.
- * The Rockstar halo finder can now be installed by the install script
- * SciPy can now be installed by the install script
- * Data can now be written out in two ways:
-
-   * Sidecar files containing expensive derived fields can be written and
-     implicitly loaded from.
-   * GDF files, which are portable yt-specific representations of full
-     simulations, can be created from any parameter file.  Work is underway on
-     a pure C library that can be linked against to load these files into
-     simulations.
-
- * The "Stream" frontend, for loading raw data in memory, has been greatly
-   expanded and now includes initial conditions generation functionality,
-   particle fields, and simple loading of AMR grids with ``load_amr_grids``.
- * Spherical and Cylindrical fields have been sped up and made to have a
-   uniform interface.  These fields can be the building blocks of more advanced
-   fields.
- * Coordinate transformations have been sped up and streamlined. It is now
-   possible to convert any scalar or vector field to a new cartesian, spherical,
-   or cylindrical coordinate system with an arbitrary orientation. This makes it
-   possible to do novel analyses like profiling the toroidal and poloidal
-   velocity as a function of radius in an inclined disk.
- * Many improvements to the EnzoSimulation class, which can now find many
-   different types of data.
- * Image data is now encapsulated in an ImageArray class, which carries with it
-   provenance information about its trajectory through yt.
- * Streamlines now query at every step along the streamline, not just at every
-   cell.
- * Surfaces can now be extracted and examined, as well as uploaded to
-   Sketchfab.com for interactive visualization in a web browser.
- * allsky_projection can now accept a datasource, making it easier to cut out
-   regions to examine.
- * Many, many improvements to PlotWindow.  If you're still using
-   PlotCollection, check out ``ProjectionPlot``, ``SlicePlot``,
-   ``OffAxisProjectionPlot`` and ``OffAxisSlicePlot``.
- * PlotWindow can now accept a timeseries instead of a parameter file.
- * Many fixes for 1D and 2D data, especially in FLASH datasets.
- * Vast improvements to the particle file handling for FLASH datasets.
- * Particles can now be created ex nihilo with CICSample_3.
- * Rockstar halo finding is now a targeted goal.  Support for using Rockstar
-   has improved dramatically.
- * Increased support for tracking halos across time using the FOF halo finder.
- * The command ``yt notebook`` has been added to spawn an IPython notebook
-   server, and the ``yt.imods`` module can replace ``yt.mods`` in the IPython
-   Notebook to enable better integration.
- * Metallicity-dependent X-ray fields have now been added.
- * Grid lines can now be added to volume renderings.
- * Volume rendering backend has been updated to use an alpha channel, fixing
-   parallel opaque volume renderings.  This also enables easier blending of 
-   multiple images and annotations to the rendering. Users are encouraged
-   to look at the capabilities of the ``ImageArray`` for writing out renders,
-   as updated in the cookbook examples. Volume renders can now be saved with
-   an arbitrary background color.
- * Periodicity, or alternately non-periodicity, is now a part of radius
-   calculations.
 * The AMRKDTree has been rewritten.  This allows parallelism with MPI
-   process counts other than powers of 2, arbitrary sets of grids, and
-   splitting of unigrids.
- * Fixed Resolution Buffers and volume rendering images now utilize a new 
-   ImageArray class that stores information such as data source, field names,
-   and other information in a .info dictionary. See the ``ImageArray``
-   docstrings for more information on how they can be used to save to a bitmap
-   or hdf5 file.
-
-Version 2.4
------------
-
-The 2.4 release was particularly large, encompassing nearly a thousand
-changesets and a number of new features.
-
-To help you get up to speed, we've made an IPython notebook file demonstrating
-a few of the changes to the scripting API.  You can
-`download it here <http://yt-project.org/files/yt24.ipynb>`_.
-
-Most Visible Improvements
-~~~~~~~~~~~~~~~~~~~~~~~~~
-
- * Threaded volume renderer, completely refactored from the ground up for
-   speed and parallelism.
- * The Plot Window (see :ref:`simple-inspection`) is now fully functional!  No
-   more PlotCollections, and full, easy access to Matplotlib axes objects.
- * Many improvements to Time Series analysis:
-    * EnzoSimulation now integrates with TimeSeries analysis!
-    * Auto-parallelization of analysis and parallel iteration
-    * Memory usage when iterating over parameter files reduced substantially
- * Many improvements to Reason, the yt GUI
-    * Addition of "yt reason" as a startup command
-    * Keyboard shortcuts in projection & slice mode: z, Z, x, X for zooms,
-      hjkl, HJKL for motion
-    * Drag to move in projection & slice mode
-    * Contours and vector fields in projection & slice mode
-    * Color map selection in projection & slice mode
-    * 3D Scene
- * Integration with the all new yt Hub ( http://hub.yt-project.org/ ): upload
-   variable resolution projections, slices, project information, vertices and
-   plot collections right from the yt command line!
-
-Other Changes
-~~~~~~~~~~~~~
-
- * :class:`~yt.visualization.plot_window.ProjectionPlot` and 
-   :class:`~yt.visualization.plot_window.SlicePlot` supplant the functionality
-   of PlotCollection.
- * Camera path creation from keyframes and splines
- * Ellipsoidal data containers and ellipsoidal parameter calculation for halos
- * PyX and ZeroMQ now available in the install script
- * Consolidation of unit handling
- * HDF5 updated to 1.8.7, Mercurial updated to 2.2, IPython updated to 0.12
- * Preview of integration with Rockstar halo finder
- * Improvements to merger tree speed and memory usage
- * Sunrise exporter now compatible with Sunrise 4.0
- * Particle trajectory calculator now available!
- * Speed and parallel scalability improvements in projections, profiles and HOP
- * New Vorticity-related fields
- * Vast improvements to the ART frontend
- * Many improvements to the FLASH frontend, including full parameter reads,
-   speedups, and support for more corner cases of FLASH 2, 2.5 and 3 data.
- * Integration of the Grid Data Format frontend, and a converter for Athena
-   data to this format.
- * Improvements to command line parsing
- * Parallel import improvements on parallel filesystems
-   (``from yt.pmods import *``)
- * proj_style keyword for projections, for Maximum Intensity Projections
-   (``proj_style = "mip"``)
- * Fisheye rendering for planetarium rendering
- * Profiles now provide \*_std fields for standard deviation of values
- * Generalized Orientation class, providing 6DOF motion control
- * parallel_objects iteration now more robust, provides optional barrier.
-   (Also now being used as underlying iteration mechanism in many internal
-   routines.)
- * Dynamic load balancing in parallel_objects iteration.
- * Parallel-aware objects can now be pickled.
- * Many new colormaps included
- * Numerous improvements to the PyX-based eps_writer module
- * FixedResolutionBuffer to FITS export.
- * Generic image to FITS export.
- * Multi-level parallelism for extremely large cameras in volume rendering
- * Light cone and light ray updates to fit with current best practices for
-   parallelism
-
-Version 2.3 
------------
-
-`(yt 2.3 docs) <http://yt-project.org/docs/2.3>`_
- * Multi-level parallelism
- * Real, extensive answer tests
- * Boolean data regions (see :ref:`boolean_data_objects`)
- * Isocontours / flux calculations (see :ref:`extracting-isocontour-information`)
- * Field reorganization (see :ref:`types_of_fields`)
- * PHOP memory improvements
- * Bug fixes for tests
- * Parallel data loading for RAMSES, along with other speedups and improvements
-   there
- * WebGL interface for isocontours and a pannable map widget added to Reason
-   (see :ref:`within-reason`)
- * Performance improvements for volume rendering
- * Adaptive HEALpix support (see :ref:`adaptive_healpix_volume_rendering`)
- * Column density calculations (see :ref:`radial-column-density`)
- * Massive speedup for 1D profiles
- * Lots more, bug fixes etc.
- * Substantial improvements to the documentation, including
-   :ref:`manual-plotting` and a revamped :ref:`orientation`.
-
-Version 2.2
------------
-
-`(yt 2.2 docs) <http://yt-project.org/docs/2.2>`_
- * Command-line submission to the yt Hub (http://hub.yt-project.org/)
- * Initial release of the web-based GUI Reason, designed for efficient remote
-   usage over SSH tunnels (see :ref:`reason`)
- * Absorption line spectrum generator for cosmological simulations (see
-   :ref:`absorption_spectrum`)
- * Interoperability with ParaView for volume rendering, slicing, and so forth
- * Support for the Nyx code
- * An order of magnitude speed improvement in the RAMSES support
- * Quad-tree projections, speeding up the process of projecting by up to an
-   order of magnitude and providing better load balancing
- * “mapserver” for in-browser, Google Maps-style slice and projection
-   visualization (see :ref:`mapserver`)
- * Many bug fixes and performance improvements
- * Halo loader (see :ref:`load_haloes`)
-
-Version 2.1
------------
-
-`(yt 2.1 docs) <http://yt-project.org/docs/2.1>`_
- * HEALpix-based volume rendering for 4pi, allsky volume rendering
- * libconfig is now included
- * SQLite3 and Forthon now included by default in the install script
- * Development guide has been lengthened substantially and a development
-   bootstrap script (:ref:`bootstrap-dev`) is now included.
- * Installation script now installs Python 2.7 and HDF5 1.8.6
- * iyt now tab-completes field names
- * Halos can now be stored on-disk much more easily between HaloFinding runs.
- * Halos found inline in Enzo can be loaded and merger trees calculated
- * Support for CASTRO particles has been added
- * Chombo support updated and fixed
- * New code contributions 
- * Contour finder has been sped up by a factor of a few
- * Constrained two-point functions are now possible, for LOS power spectra
- * Time series analysis (:ref:`time-series-analysis`) now much easier
- * Stream Lines now a supported 1D data type (:class:`AMRStreamlineBase`)
- * Stream Lines now able to be calculated and plotted (:ref:`streamlines`)
- * In situ Enzo visualization now much faster
- * "gui" source directory reorganized and cleaned up
- * Cython now a compile-time dependency, reducing the size of source tree
-   updates substantially
- * ``yt-supplemental`` repository now checked out by default, containing
-   cookbook, documentation, handy mercurial extensions, and advanced plotting
-   examples and helper scripts.
- * Pasteboards now supported and available 
- * Parallel yt efficiency improved by removal of barriers and improvement of
-   collective operations
-
-Version 2.0
------------
-
- * Major reorganization of the codebase for speed, ease of modification, and maintainability
- * Re-organization of documentation and addition of Orientation Session
- * Support for FLASH code
- * Preliminary support for MAESTRO, CASTRO, ART, and RAMSES (contributions welcome!)
- * Perspective projection for volume rendering
- * Exporting to Sunrise
- * Preliminary particle rendering in volume rendering visualization
- * Drastically improved parallel volume rendering, via kD-tree decomposition
- * Simple merger tree calculation for FOF catalogs
- * New and greatly expanded documentation, with a "source" button
-
-Version 1.7
------------
-
- * Direct writing of PNGs (see :ref:`image_writer`)
- * Multi-band image writing (see :ref:`image_writer`)
- * Parallel halo merger tree (see :ref:`merger_tree`)
- * Parallel structure function generator (see :ref:`two_point_functions`)
- * Image pan and zoom object and display widget.
- * Parallel volume rendering (see :ref:`volume_rendering`)
- * Multivariate volume rendering, allowing for multiple forms of emission and
-   absorption, including approximate scattering and Planck emissions. (see
-   :ref:`volume_rendering`)
- * Added Camera interface to volume rendering (See :ref:`volume_rendering`)
- * Off-axis projection (See :ref:`volume_rendering`)
- * Stereo (toe-in) volume rendering (See :ref:`volume_rendering`)
- * DualEPS extension for better EPS construction
- * yt now uses Distribute instead of SetupTools
- * Better ``iyt`` initialization for GUI support
- * Rewritten, memory conservative and speed-improved contour finding algorithm
- * Speed improvements to volume rendering
- * Preliminary support for the Tiger code
- * Default colormap is now ``algae``
- * Lightweight projection loading with ``projload``
- * Improvements to `yt.data_objects.time_series`
- * Improvements to :class:`yt.extensions.EnzoSimulation` (See
-   :ref:`analyzing-an-entire-simulation`)
- * Removed ``direct_ray_cast``
- * Fixed bug causing double data-read in projections
- * Added Cylinder support to ParticleIO
- * Fixes for 1- and 2-D Enzo datasets
- * Preliminary, largely non-functional Gadget support
- * Speed improvements to basic HOP
- * Added physical constants module
- * Beginning to standardize and enforce docstring requirements, changing to
-   ``autosummary``-based API documentation.
-
-Version 1.6.1
--------------
-
- * Critical fixes to ParticleIO
- * Halo mass function fixes for comoving coordinates
- * Fixes to halo finding
- * Fixes to the installation script
- * "yt instinfo" command to report current installation information as well as
-   auto-update some types of installations
- * Optimizations to the volume renderer (2x-26x reported speedups)
-
-Version 1.6
------------
-
-Version 1.6 is a point release, primarily notable for the new parallel halo
-finder (see :ref:`halo_finding`)
-
- * (New) Parallel HOP ( http://arxiv.org/abs/1001.3411 , :ref:`halo_finding` )
- * (Beta) Software ray casting and volume rendering
-   (see :ref:`volume_rendering`)
- * Rewritten, faster and better contouring engine for clump identification
- * Spectral Energy Distribution calculation for stellar populations
-   (see :ref:`synthetic_spectrum`)
- * Optimized data structures such as the hierarchy
- * Star particle analysis routines
-   (see :ref:`star_analysis`)
- * Halo mass function routines (see :ref:`hmf_howto`)
- * Completely rewritten, massively faster and more memory efficient Particle IO
- * Fixes for plots, including normalized phase plots
- * Better collective communication in parallel routines
- * Consolidation of optimized C routines into ``amr_utils``
- * Many bug fixes and minor optimizations 
-
-Version 1.5
------------
-
-Version 1.5 features many new improvements, most prominently that of the
-addition of parallel computing abilities (see :ref:`parallel-computation`) and
-generalization for multiple AMR data formats, specifically both Enzo and Orion.
-
- * Rewritten documentation
- * Fully parallel slices, projections, cutting planes, profiles,
-   quantities
- * Parallel HOP
- * Friends-of-friends halo finder
- * Object storage and serialization
- * Major performance improvements to the clump finder (factor of five)
- * Generalized domain sizes
- * Generalized field info containers
- * Dark Matter-only simulations
- * 1D and 2D simulations
- * Better IO for HDF5 sets
- * Support for the Orion AMR code
- * Spherical re-gridding
- * Halo profiler
- * Disk image stacker
- * Light cone generator
- * Callback interface improved
- * Several new callbacks
- * New data objects -- ortho and non-ortho rays, limited ray-tracing
- * Fixed resolution buffers
- * Spectral integrator for CLOUDY data
- * Substantially better interactive interface
- * Performance improvements *everywhere*
- * Command-line interface to *many* common tasks
- * Isolated plot handling, independent of PlotCollections
-
-Version 1.0
------------
-
- * Initial release!

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/38edb65efe94/
Changeset:   38edb65efe94
User:        MatthewTurk
Date:        2013-10-28 20:06:46
Summary:     Moving tons of stuff around and splitting up the "Advanced" section.
Affected #:  30 files

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/_static/axes.c
--- a/source/advanced/_static/axes.c
+++ /dev/null
@@ -1,15 +0,0 @@
-#include "axes.h"
-
-void calculate_axes(ParticleCollection *part,
-    double *ax1, double *ax2, double *ax3)
-{
-    int i;
-    for (i = 0; i < part->npart; i++) {
-        if (ax1[0] > part->xpos[i]) ax1[0] = part->xpos[i];
-        if (ax2[0] > part->ypos[i]) ax2[0] = part->ypos[i];
-        if (ax3[0] > part->zpos[i]) ax3[0] = part->zpos[i];
-        if (ax1[1] < part->xpos[i]) ax1[1] = part->xpos[i];
-        if (ax2[1] < part->ypos[i]) ax2[1] = part->ypos[i];
-        if (ax3[1] < part->zpos[i]) ax3[1] = part->zpos[i];
-    }
-}

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/_static/axes.h
--- a/source/advanced/_static/axes.h
+++ /dev/null
@@ -1,10 +0,0 @@
-typedef struct structParticleCollection {
-     long npart;
-     double *xpos;
-     double *ypos;
-     double *zpos;
-} ParticleCollection;
-
-void calculate_axes(ParticleCollection *part,
-         double *ax1, double *ax2, double *ax3);
-

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/_static/axes_calculator.pyx
--- a/source/advanced/_static/axes_calculator.pyx
+++ /dev/null
@@ -1,37 +0,0 @@
-import numpy as np
-cimport numpy as np
-cimport cython
-from stdlib cimport malloc, free
-
-cdef extern from "axes.h":
-    ctypedef struct ParticleCollection:
-            long npart
-            double *xpos
-            double *ypos
-            double *zpos
-
-    void calculate_axes(ParticleCollection *part,
-             double *ax1, double *ax2, double *ax3)
-
-def examine_axes(np.ndarray[np.float64_t, ndim=1] xpos,
-                 np.ndarray[np.float64_t, ndim=1] ypos,
-                 np.ndarray[np.float64_t, ndim=1] zpos):
-    cdef double ax1[3], ax2[3], ax3[3]
-    cdef ParticleCollection particles
-    cdef int i
-
-    particles.npart = len(xpos)
-    particles.xpos = <double *> xpos.data
-    particles.ypos = <double *> ypos.data
-    particles.zpos = <double *> zpos.data
-
-    for i in range(particles.npart):
-        particles.xpos[i] = xpos[i]
-        particles.ypos[i] = ypos[i]
-        particles.zpos[i] = zpos[i]
-
-    calculate_axes(&particles, ax1, ax2, ax3)
-
-    return ( (ax1[0], ax1[1], ax1[2]),
-             (ax2[0], ax2[1], ax2[2]),
-             (ax3[0], ax3[1], ax3[2]) )

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/_static/axes_calculator_setup.txt
--- a/source/advanced/_static/axes_calculator_setup.txt
+++ /dev/null
@@ -1,25 +0,0 @@
-NAME = "axes_calculator"
-EXT_SOURCES = ["axes.c"]
-EXT_LIBRARIES = []
-EXT_LIBRARY_DIRS = []
-EXT_INCLUDE_DIRS = []
-DEFINES = []
-
-from distutils.core import setup
-from distutils.extension import Extension
-from Cython.Distutils import build_ext
-
-ext_modules = [Extension(NAME,
-                 [NAME+".pyx"] + EXT_SOURCES,
-                 libraries = EXT_LIBRARIES,
-                 library_dirs = EXT_LIBRARY_DIRS,
-                 include_dirs = EXT_INCLUDE_DIRS,
-                 define_macros = DEFINES)
-]
-
-setup(
-  name = NAME,
-  cmdclass = {'build_ext': build_ext},
-  ext_modules = ext_modules
-)
-

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/creating_datatypes.rst
--- a/source/advanced/creating_datatypes.rst
+++ /dev/null
@@ -1,41 +0,0 @@
-.. _creating-objects:
-
-Creating 3D Datatypes
-=====================
-
-The three-dimensional datatypes in yt follow a fairly simple protocol.  The
-basic principle is that if you want to define a region in space, that region
-must be identifiable from some sort of cut applied against the cells --
-typically, in yt, this is done by examining the geometry.  (The
-:class:`yt.data_objects.data_containers.ExtractedRegionBase` type is a notable
-exception to this, as it is defined as a subset of an existing data object.)
-
-In principle, you can define any number of 3D data objects, as long as the
-following methods are implemented to protocol specifications.
-
-.. function:: __init__(self, *args, **kwargs)
-
-   This function can accept any number of arguments but must eventually call
-   AMR3DData.__init__.  It is used to set up the various parameters that
-   define the object.
-
-.. function:: _get_list_of_grids(self)
-
-   This function must set the property _grids to be a list of the grids
-   that should be considered to be a part of the data object.  Each of these
-   will be partly or completely contained within the object.
-
-.. function:: _is_fully_enclosed(self, grid)
-
-   This function returns true if the entire grid is part of the data object
-   and false if it is only partly enclosed.
-
-.. function:: _get_cut_mask(self, grid)
-
-   This function returns a boolean mask in the shape of the grid.  All of the
-   cells set to 'True' will be included in the data object and all of those set
-   to 'False' will be excluded.  Typically this is done via some logical
-   operation.
-
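-In rough outline, a new object implementing this protocol might look like the
-following sketch.  The class name, constructor signature, and box-selection
-logic here are illustrative assumptions, not yt API; only the four protocol
-methods matter:
-
-.. code-block:: python
-
-   import numpy as na
-
-   class AMRBoxBase(AMR3DData):
-       """A hypothetical axis-aligned box, shown only to illustrate the protocol."""
-       def __init__(self, left_edge, right_edge, pf=None, **kwargs):
-           # The real AMR3DData.__init__ takes more parameters; this sketch
-           # only forwards the basics.
-           AMR3DData.__init__(self, center=None, fields=None, pf=pf, **kwargs)
-           self.left_edge = na.array(left_edge)
-           self.right_edge = na.array(right_edge)
-
-       def _get_list_of_grids(self):
-           # Keep every grid whose bounding box overlaps ours.
-           self._grids = [g for g in self.hierarchy.grids
-                          if na.all(g.RightEdge > self.left_edge)
-                          and na.all(g.LeftEdge < self.right_edge)]
-
-       def _is_fully_enclosed(self, grid):
-           return (na.all(grid.LeftEdge >= self.left_edge)
-               and na.all(grid.RightEdge <= self.right_edge))
-
-       def _get_cut_mask(self, grid):
-           # Include cells whose centers fall inside the box.
-           cm = na.ones(grid.ActiveDimensions, dtype="bool")
-           for i, ax in enumerate("xyz"):
-               cm &= ((grid[ax] >= self.left_edge[i])
-                    & (grid[ax] <  self.right_edge[i]))
-           return cm
-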
-For a good example of how to do this, see the
-:class:`yt.data_objects.data_containers.AMRCylinderBase` source code.

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/creating_derived_quantities.rst
--- a/source/advanced/creating_derived_quantities.rst
+++ /dev/null
@@ -1,31 +0,0 @@
-.. _creating_derived_quantities:
-
-Creating Derived Quantities
----------------------------
-
-The basic idea is that you need to be able to operate both on a set of data,
-and a set of sets of data.  (If this is not possible, the quantity needs to be
-added with the ``force_unlazy`` option.)
-
-Two functions are necessary.  One will operate on arrays of data, either fed
-from each grid individually or fed from the entire data object at once.  The
-second one takes the results of the first, either as lists of arrays or as
-single arrays, and returns the final values.  For an example, we look at the
-``TotalMass`` function:
-
-.. code-block:: python
-
-   def _TotalMass(data):
-       baryon_mass = data["CellMassMsun"].sum()
-       particle_mass = data["ParticleMassMsun"].sum()
-       return baryon_mass, particle_mass
-   def _combTotalMass(data, baryon_mass, particle_mass):
-       return baryon_mass.sum() + particle_mass.sum()
-   add_quantity("TotalMass", function=_TotalMass,
-                combine_function=_combTotalMass, n_ret = 2)
-
-Once the two functions have been defined, we then call :func:`add_quantity` to
-tell it the function that defines the data, the collator function, and the
-number of values that get passed between them.  In this case we return both the
-particle and the baryon mass, so we have two total values passed from the main
-function into the collator.
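-
-Once registered, the quantity can be called by name on any data object.
-Assuming the usual ``yt.mods`` workflow, usage looks something like:
-
-.. code-block:: python
-
-   from yt.mods import *
-
-   pf = load("DD0010/DD0010")
-   dd = pf.h.all_data()
-   # Combines the per-grid results via _combTotalMass behind the scenes.
-   total_mass = dd.quantities["TotalMass"]()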

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/creating_frontend.rst
--- a/source/advanced/creating_frontend.rst
+++ /dev/null
@@ -1,143 +0,0 @@
-.. _creating_frontend:
-
-Creating A New Code Frontend
-============================
-
-``yt`` is designed to support analysis and visualization of data from multiple
-different simulation codes, although it has so far been most successfully
-applied to Adaptive Mesh Refinement (AMR) data. For a list of codes and the
-level of support they enjoy, we've created a handy table of code support levels.
-We'd like to support a broad range of codes, both AMR-based and otherwise. To
-add support for a new code, a few things need to be put into place. These
-necessary structures can be classified into a couple categories:
-
- * Data meaning: This is the set of parameters that convert the data into
-   physically relevant units; things like spatial and mass conversions, time
-   units, and so on.
- * Data localization: These are structures that help make a "first pass" at data
-   loading. Essentially, we need to be able to make a first pass at guessing
-   where data in a given physical region would be located on disk. With AMR
-   data, this is typically quite easy: the grid patches are the "first pass" at
-   localization.
- * Data reading: This is the set of routines that actually perform a read of
-   either all data in a region or a subset of that data.
-
-Data Meaning Structures
------------------------
-
-If you are interested in adding a new code, be sure to drop us a line on
-`yt-dev <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_!
-
-To get started, make a new directory in ``yt/frontends`` with the name of your
-code -- you can start by copying into it the contents of the ``stream``
-directory, which is a pretty empty format. You'll then have to create a subclass
-of ``StaticOutput``. This subclass will need to handle conversion between the
-different physical units and the code units; for the most part, the examples of
-``OrionStaticOutput`` and ``EnzoStaticOutput`` should be followed, but
-``ChomboStaticOutput``, as a slightly newer addition, can also be used as an
-instructive example -- be sure to add an ``_is_valid`` classmethod that will
-verify if a filename is valid for that output type, as that is how "load" works.
-
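-A minimal skeleton might look like the following; the class names and the
-magic-string check are placeholders, not real yt or on-disk conventions:
-
-.. code-block:: python
-
-    class MyCodeStaticOutput(StaticOutput):
-        _hierarchy_class = MyCodeHierarchy
-
-        @classmethod
-        def _is_valid(cls, *args, **kwargs):
-            # Sniff the file cheaply; load() calls this on every frontend.
-            try:
-                return open(args[0]).read(8) == "MYCODE01"
-            except IOError:
-                return False
-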
-A new set of fields must be added in the file ``fields.py`` in that directory.
-For the most part this means subclassing ``CodeFieldInfoContainer`` and adding
-the necessary fields specific to that code. Here is the Chombo field container:
-
-.. code-block:: python
-
-    from UniversalFields import *
-    class ChomboFieldContainer(CodeFieldInfoContainer):
-        _shared_state = {}
-        _field_list = {}
-    ChomboFieldInfo = ChomboFieldContainer()
-    add_chombo_field = ChomboFieldInfo.add_field
-
-The field container is a shared state object, which is why we explicitly set
-``_shared_state`` equal to a mutable.
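-
-Fields are then registered through the container's ``add_field`` alias.  As a
-hedged example (the field name, formula, and units below are made up):
-
-.. code-block:: python
-
-    import numpy as na
-
-    def _SoundSpeed(field, data):
-        # Assumes an ideal gas with gamma = 5/3.
-        return na.sqrt(5.0/3.0 * data["Pressure"] / data["Density"])
-    add_chombo_field("SoundSpeed", function=_SoundSpeed,
-                     units=r"\rm{cm}/\rm{s}")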
-
-Data Localization Structures
-----------------------------
-
-As of right now, the "grid patch" mechanism is going to remain in yt; however, in
-the future that may change. As such, some other output formats -- like Gadget --
-may be shoe-horned in, slightly.
-
-Hierarchy
-^^^^^^^^^
-
-To set up data localization, an ``AMRHierarchy`` subclass must be added in the
-file ``data_structures.py``. The hierarchy object must override the following
-methods:
-
- * ``_detect_fields``: ``self.field_list`` must be populated as a list of
-   strings corresponding to "native" fields in the data files.
- * ``_setup_classes``: it's probably safe to crib this from one of the other
-   ``AMRHierarchy`` subclasses.
- * ``_count_grids``: this must set self.num_grids to be the total number of
-   grids in the simulation.
- * ``_parse_hierarchy``: this must fill in ``grid_left_edge``,
-   ``grid_right_edge``, ``grid_particle_count``, ``grid_dimensions`` and
-   ``grid_levels`` with the appropriate information. Additionally, ``grids``
-   must be an array of grid objects that already know their IDs.
- * ``_populate_grid_objects``: this initializes the grids by calling
-   ``_prepare_grid`` and ``_setup_dx`` on all of them.  Additionally, it should
-   set up ``Children`` and ``Parent`` lists on each grid object.
- * ``_setup_unknown_fields``: If a field is in the data file that yt doesn't
-   already know, this is where you make a guess at it.
- * ``_setup_derived_fields``: ``self.derived_field_list`` needs to be made a
-   list of strings that correspond to all derived fields valid for this
-   hierarchy.
-
-For the most part, the ``ChomboHierarchy`` should be the first place to look for
-hints on how to do this; ``EnzoHierarchy`` is also instructive.
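-
-In skeleton form, that might start out like this; every method body below is
-a placeholder to be filled in from your format's metadata:
-
-.. code-block:: python
-
-    class MyCodeHierarchy(AMRHierarchy):
-        grid = MyCodeGrid   # the grid class, defined below
-
-        def _detect_fields(self):
-            # In practice, read these from the data file itself.
-            self.field_list = ["Density", "Pressure"]
-
-        def _count_grids(self):
-            self.num_grids = 1   # parse the real count from your header
-
-        def _parse_hierarchy(self):
-            # Fill grid_left_edge, grid_right_edge, grid_dimensions,
-            # grid_levels and grid_particle_count, then build self.grids
-            # as an array of grid objects that know their own IDs.
-            pass
-
-        def _populate_grid_objects(self):
-            for g in self.grids:
-                g._prepare_grid()
-                g._setup_dx()
-            # ... then hook up Parent and Children on each grid.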
-
-Grids
-^^^^^
-
-A new grid object, subclassing ``AMRGridPatch``, will also have to be added.
-This should go in ``data_structures.py``. For the most part, this may be all
-that is needed:
-
-.. code-block:: python
-
-    class ChomboGrid(AMRGridPatch):
-        _id_offset = 0
-        __slots__ = ["_level_id"]
-        def __init__(self, id, hierarchy, level = -1):
-            AMRGridPatch.__init__(self, id, filename = hierarchy.hierarchy_filename,
-                                  hierarchy = hierarchy)
-            self.Parent = []
-            self.Children = []
-            self.Level = level
-
-
-Even the most complex grid object, ``OrionGrid``, is still relatively simple.
-
-Data Reading Functions
-----------------------
-
-In ``io.py``, there are a number of IO handlers that handle the mechanisms by
-which data is read off disk.  To implement a new data reader, you must subclass
-``BaseIOHandler`` and override the following methods:
-
- * ``_read_field_names``: this routine accepts a grid object and must return all
-   the fields in the data file affiliated with that grid. It is used at the
-   initialization of the ``AMRHierarchy`` but likely not later.
- * ``modify``: This accepts a field from a data file and returns it ready to be
-   used by yt. This is used in Enzo data for preloading.
- * ``_read_data_set``: This accepts a grid object and a field name and must
-   return that field, ready to be used by yt as a NumPy array. Note that this
-   presupposes that any actions done in ``modify`` (above) have been executed.
- * ``_read_data_slice``: This accepts a grid object, a field name, an axis and
-   an (integer) coordinate, and it must return a slice through the array at that
-   value.
- * ``preload``: (optional) This accepts a list of grids and a list of datasets
-   and it populates ``self.queue`` (a dict keyed by grid id) with dicts of
-   datasets.
- * ``_read_exception``: (property) This is a tuple of exceptions that can be
-   raised by the data reading to indicate a field does not exist in the file.
-
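-In skeleton form, again with placeholder names:
-
-.. code-block:: python
-
-    class IOHandlerMyCode(BaseIOHandler):
-        _data_style = "mycode"
-
-        def _read_field_names(self, grid):
-            # Return the on-disk field names affiliated with this grid.
-            pass
-
-        def _read_data_set(self, grid, field):
-            # Return the full 3D NumPy array for `field` on `grid`.
-            pass
-
-        def _read_data_slice(self, grid, field, axis, coord):
-            # A simple (if wasteful) fallback: read everything, then index.
-            sl = [slice(None)] * 3
-            sl[axis] = coord
-            return self._read_data_set(grid, field)[tuple(sl)]
-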
-
-And that just about covers it. Please feel free to email
-`yt-users <http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_ or
-`yt-dev <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_ with
-any questions, or to let us know you're thinking about adding a new code to yt.

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/debugdrive.rst
--- a/source/advanced/debugdrive.rst
+++ /dev/null
@@ -1,127 +0,0 @@
-.. _debug-drive:
-
-Debugging and Driving YT
-========================
-
-There are several different convenience functions that allow you to control YT
-in perhaps unexpected and unorthodox manners.  These will allow you to conduct
-in-depth debugging of processes that may be running in parallel on multiple
-processors, as well as providing a mechanism of signalling to YT that you need
-more information about a running process.  Additionally, YT has a built-in
-mechanism for optional reporting of errors to a central server.  All of these
-allow for more rapid development and debugging of any problems you might
-encounter.
-
-Additionally, ``yt`` is able to leverage existing developments in the IPython
-community for parallel, interactive analysis.  This allows you to initialize
-multiple YT processes through ``mpirun`` and interact with all of them from a
-single, unified interactive prompt.  This enables and facilitates parallel
-analysis without sacrificing interactivity and flexibility.
-
-.. _pastebin:
-
-The Pastebin
-------------
-
-A pastebin is a website where you can easily copy source code and error
-messages to share with yt developers or your collaborators. At
-http://paste.yt-project.org/ a pastebin is available for placing scripts.  With
-``yt``, the script ``yt_lodgeit.py`` is distributed and wrapped in
-the ``pastebin`` and ``pastebin_grab`` commands, which allow for command-line
-uploading and downloading of pasted snippets.  To upload a script you
-would supply it to the command:
-
-.. code-block:: bash
-
-   $ yt pastebin some_script.py
-
-The URL will be returned.  If you'd like it to be marked 'private' and not show
-up in the list of pasted snippets, supply the argument ``--private``.  All
-snippets are given either numbers or hashes.  To download a pasted snippet, you
-would use the ``pastebin_grab`` option:
-
-.. code-block:: bash
-
-   $ yt pastebin_grab 1768
-
-The snippet will be output to the terminal, so output redirection can be used to
-store it in a file.
-
-.. _error-reporting:
-
-Error Reporting with the Pastebin
-+++++++++++++++++++++++++++++++++
-
-If you are having troubles with ``yt``, you can have it paste the error report
-to the pastebin by running your problematic script with the ``--paste`` option:
-
-.. code-block:: bash
-
-   $ python2.7 some_problematic_script.py --paste
-
-The ``--paste`` option has to come after the name of the script.  When the
-script dies and prints its error, it will also submit that error to the
-pastebin and return a URL for the error.  When reporting your bug, include this
-URL and then the problem can be debugged more easily.
-
-For more information on asking for help, see :ref:`asking-for-help`.
-
-Signaling YT to Do Something
-----------------------------
-
-During startup, ``yt`` inserts handlers for two operating system-level signals.
-These provide two diagnostic methods for interacting with a running process.
-Signalling the python process that is running your script with these signals
-will induce the requested behavior.  
-
-   SIGUSR1
-     This will cause the python code to print a stack trace, showing exactly
-     where in the function stack it is currently executing.
-   SIGUSR2
-     This will cause the python code to insert an IPython session wherever it
-     currently is, with all local variables in the local namespace.  It should
-     allow you to change the state variables.
-
-If your ``yt``-running process has PID 5829, you can signal it to print a
-traceback with:
-
-.. code-block:: bash
-
-   $ kill -SIGUSR1 5829
-
-Note, however, that if the code is currently inside a C function, the signal
-will not be handled, and the stacktrace will not be printed, until it returns
-from that function.
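-
-The handlers themselves are ordinary Python signal handlers.  A stripped-down
-sketch of the traceback half (not yt's exact code) would be:
-
-.. code-block:: python
-
-   import signal, sys, traceback
-
-   def print_traceback(signo, frame):
-       # Dump the current call stack of the running process to stderr.
-       traceback.print_stack(frame, file=sys.stderr)
-
-   signal.signal(signal.SIGUSR1, print_traceback)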
-
-.. _remote-debugging:
-
-Remote and Disconnected Debugging
----------------------------------
-
-If you are running a parallel job that fails, often it can be difficult to do a
-post-mortem analysis to determine what went wrong.  To facilitate this, ``yt``
-has implemented an `XML-RPC <http://en.wikipedia.org/wiki/XML-RPC>`_ interface
-to the Python debugger (``pdb``) event loop.  
-
-Running with the ``--rpdb`` option will cause any uncaught exception during
-execution to spawn this interface, which will sit and wait for commands,
-exposing the full Python debugger.  Additionally, a frontend to this is
-provided through the ``yt`` command.  So if you run the command:
-
-.. code-block:: bash
-
-   $ mpirun -np 4 python2.7 some_script.py --parallel --rpdb
-
-and it reaches an error or an exception, it will launch the debugger.
-Additionally, instructions will be printed for connecting to the debugger.
-Each of the four processes will be accessible via:
-
-.. code-block:: bash
-
-   $ yt rpdb 0
-
-where ``0`` indicates process number 0.
-
-For security reasons, this will only work on local processes; to connect on a
-cluster, you will have to execute the command ``yt rpdb`` on the node on which
-that process was launched.

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/developing.rst
--- a/source/advanced/developing.rst
+++ /dev/null
@@ -1,436 +0,0 @@
-.. _contributing-code:
-
-How to Develop yt
-=================
-
-.. note:: If you already know how to use version control and are comfortable
-   with handling it yourself, the quickest way to contribute to yt is to `fork
-   us on BitBucket <http://hg.yt-project.org/yt/fork>`_, `make your changes
-   <http://mercurial.selenic.com/>`_, and issue a `pull request
-   <http://hg.yt-project.org/yt/pull>`_.  The rest of this document is just an
-   explanation of how to do that.
-
-yt is a community project!
-
-We are very happy to accept patches, features, and bugfixes from any member of
-the community!  yt is developed using mercurial, primarily because it enables
-very easy and straightforward submission of changesets.  We're eager to hear
-from you, and if you are developing yt, we encourage you to subscribe to the
-`developer mailing list
-<http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_
-
-Please feel free to hack around, commit changes, and send them upstream.  If
-you're new to Mercurial, these three resources are pretty great for learning
-the ins and outs:
-
-   * http://hginit.com/
-   * http://hgbook.red-bean.com/read/
-   * http://mercurial.selenic.com/
-
-The commands that are essential for using mercurial include the following (a
-typical workflow combining them is sketched after this list):
-
-   * ``hg commit`` which commits changes in the working directory to the
-     repository, creating a new "changeset object."
-   * ``hg add`` which adds a new file to be tracked by mercurial.  This does
-     not change the working directory.
-   * ``hg pull`` which pulls (from an optional path specifier) changeset
-     objects from a remote source.  The working directory is not modified.
-   * ``hg push`` which sends (to an optional path specifier) changeset objects
-     to a remote source.  The working directory is not modified.
-   * ``hg log`` which shows a log of all changeset objects in the current
-     repository.  Use ``-g`` to show a graph of changeset objects and their
-     relationship.
-   * ``hg update`` which (with an optional "revision" specifier) updates the
-     state of the working directory to match a changeset object in the
-     repository.
-   * ``hg merge`` which combines two changesets to make a union of their lines
-     of development.  This updates the working directory.
-
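-Strung together, a typical local cycle might look like this (the repository
-paths are placeholders for your own fork):
-
-.. code-block:: bash
-
-   $ hg pull https://bitbucket.org/yt_analysis/yt/    # fetch upstream changesets
-   $ hg update                                        # sync the working directory
-   $ hg add new_module.py                             # track a new file
-   $ hg commit -m "Add a new module"                  # record the change locally
-   $ hg push https://bitbucket.org/YourUsername/yt/   # publish to your fork
-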
-Keep in touch, and happy hacking!  We also provide `doc/coding_styleguide.txt`
-and an example of a fiducial docstring in `doc/docstring_example.txt`.  Please
-read them before hacking on the codebase, and feel free to email any of the
-mailing lists for help with the codebase.
-
-.. _bootstrap-dev:
-
-Submitting Changes
-------------------
-
-We provide a brief introduction to submitting changes here.  yt thrives on the
-strength of its communities (see http://arxiv.org/abs/1301.7064 for further
-discussion), and we encourage contributions from any user.  While we do not
-discuss in detail version control, mercurial or the advanced usage of
-BitBucket, we do provide an outline of how to submit changes and we are happy
-to provide further assistance or guidance.
-
-Licensing
-+++++++++
-
-All contributed code must be GPL-compatible; we ask that you consider licensing
-under the GPL version 3, but we will consider submissions of code that are
-BSD-like licensed as well.  If you'd rather not license in this manner, but
-still want to contribute, just drop me a line and I'll put a link on the main
-wiki page to wherever you like!
-
-Requirements for Code Submission
-++++++++++++++++++++++++++++++++
-
-Modifications to the code typically fall into one of three categories, each of
-which have different requirements for acceptance into the code base.  These
-requirements are in place for a few reasons -- to make sure that the code is
-maintainable, testable, and that we can easily include information about
-changes in changelogs during the release procedure.  (See `YTEP-0008
-<https://ytep.readthedocs.org/en/latest/YTEPs/YTEP-0008.html>`_ for more
-detail.)
-
-  * New Features
-
-    * New unit tests (possibly new answer tests) (See :ref:`testing`)
-    * Docstrings for public API
-    * Addition of new feature to the narrative documentation
-    * Addition of cookbook recipe
-    * Issue created on issue tracker, to ensure this is added to the changelog
-
-  * Extension or Breakage of API in Existing Features
-
-    * Update existing narrative docs and docstrings
-    * Update existing cookbook recipes
-    * Modify or create new unit tests (See :ref:`testing`)
-    * Issue created on issue tracker, to ensure this is added to the changelog
-
-  * Bug fixes
-
-    * Unit test is encouraged, to ensure breakage does not happen again in the
-      future.
-    * Issue created on issue tracker, to ensure this is added to the changelog
-
-When submitting, you will be asked to make sure that your changes meet all of
-these requirements.  They are pretty easy to meet, and we're also happy to help
-out with them.  In :ref:`code-style-guide` there is a list of handy tips for
-how to structure and write your code.
-
-How to Use Mercurial with yt
-++++++++++++++++++++++++++++
-
-This document doesn't cover detailed mercurial use, but on IRC we are happy to
-walk you through any troubles you might have.  Here are some suggestions
-for using mercurial with yt:
-
-  * Named branches are to be avoided.  Try using bookmarks (``hg bookmark``) to
-    track work.  (`More <http://mercurial.selenic.com/wiki/Bookmarks>`_)
-  * Make sure you set a username in your ``~/.hgrc`` before you commit any
-    changes!  All of the tutorials above will describe how to do this as one of
-    the very first steps.
-  * When contributing changes, you might be asked to make a handful of
-    modifications to your source code.  We'll work through how to do this with
-    you, and try to make it as painless as possible.
-  * Please avoid deleting your yt forks, as that eliminates the code review
-    process from BitBucket's website.
-  * In all likelihood, you only need one fork.  To keep it in sync, you can
-    sync from the website.  (See Bitbucket's `Blog Post
-    <http://blog.bitbucket.org/2013/02/04/syncing-and-merging-come-to-bitbucket/>`_
-    about this.)
-  * If you run into any troubles, stop by IRC (see :ref:`irc`) or the mailing
-    list.
-
-Building yt
-+++++++++++
-
-If you have made changes to any C or Cython (``.pyx``) modules, you have to
-rebuild yt.  If your changes have exclusively been to Python modules, you will
-not need to re-build, but (see below) you may need to re-install.  
-
-If you are running from a clone that is executable in-place (i.e., has been
-installed via the installation script or you have run ``setup.py develop``) you
-can rebuild these modules by executing:
-
-.. code-block:: bash
-
-   python2.7 setup.py develop
-
-If you have previously "installed" via ``setup.py install`` you have to
-re-install:
-
-.. code-block:: bash
-
-   python2.7 setup.py install
-
-Only one of these two options is needed.  yt may require you to specify the
-location to libpng and hdf5.  This can be done through files named ``png.cfg``
-and ``hdf5.cfg``.  If you are using the installation script, these will already
-exist.
-
-Making and Sharing Changes
-++++++++++++++++++++++++++
-
-The simplest way to submit changes to yt is to commit changes in your
-``$YT_DEST/src/yt-hg`` directory, fork the repository on BitBucket, push the
-changesets to your fork, and then issue a pull request.  If you will be
-developing much more in-depth features for yt, you will also likely want to
-set an alias for your fork's path in your ``.hg/hgrc`` (see the push step
-below).
-
-Here's a more detailed flowchart of how to submit changes.
-
-  #. If you have used the installation script, the source code for yt can be
-     found in ``$YT_DEST/src/yt-hg``.  (Below, in :ref:`reading-source`, 
-     we describe how to find items of interest.)  Edit the source file you are
-     interested in and test your changes.  (See :ref:`testing` for more
-     information.)
-  #. Fork yt on BitBucket.  (This step only has to be done once.)  You can do
-     this at: https://bitbucket.org/yt_analysis/yt/fork .  Call this repository
-     ``yt``.
-  #. Commit these changes, using ``hg commit``.  This can take an argument
-     which is a series of filenames, if you have some changes you do not want
-     to commit.
-  #. If your changes include new functionality or cover an untested area of the
-     code, add a test.  (See :ref:`testing` for more information.)  Commit
-     these changes as well.
-  #. Push your changes to your new fork using the command::
-
-        hg push https://bitbucket.org/YourUsername/yt/
- 
-     If you end up doing considerable development, you can set an alias in the
-     file ``.hg/hgrc`` to point to this path.
-  #. Issue a pull request at
-     https://bitbucket.org/YourUsername/yt/pull-request/new
-
-During the course of your pull request you may be asked to make changes.  These
-changes may be related to style issues, correctness issues, or even requesting
-tests.  The process for responding to pull request code review is relatively
-straightforward.
-
-  #. Make requested changes, or leave a comment indicating why you don't think
-     they should be made.
-  #. Commit those changes to your local repository.
-  #. Push the changes to your fork::
-
-        hg push https://bitbucket.org/YourUsername/yt/
-
-  #. Update your pull request by visiting
-     https://bitbucket.org/YourUsername/yt/pull-request/new
-
-How to Write Documentation
-++++++++++++++++++++++++++
-
-The process for writing documentation is identical to the above, except that
-instead of ``yt_analysis/yt`` you should be forking and pushing to
-``yt_analysis/yt-doc``.  All the source for the documentation is written in
-`Sphinx <http://sphinx-doc.org/>`_, which uses ReST for markup.
-
-Cookbook recipes go in ``source/cookbook/`` and must be added to one of the
-``.rst`` files in that directory.
-
-How To Get The Source Code For Editing
---------------------------------------
-
-yt is hosted on BitBucket, and you can see all of the yt repositories at
-http://hg.yt-project.org/ .  With the yt installation script you should have a
-copy of Mercurial for checking out pieces of code.  Make sure you have followed
-the steps above for bootstrapping your development (to assure you have a
-bitbucket account, etc.)
-
-In order to modify the source code for yt, we ask that you make a "fork" of the
-main yt repository on bitbucket.  A fork is simply an exact copy of the main
-repository (along with its history) that you will now own and can make
-modifications as you please.  You can create a personal fork by visiting the yt
-bitbucket webpage at https://bitbucket.org/yt_analysis/yt/ .  After logging in,
-you should see an option near the top right labeled "fork".  Click this option,
-and then click the fork repository button on the subsequent page.  You now have
-a forked copy of the yt repository for your own personal modification.
-
-This forked copy exists on the bitbucket repository, so in order to access
-it locally, follow the instructions at the top of that webpage for that
-forked repository, namely run at a local command line:
-
-.. code-block:: bash
-
-   $ hg clone http://bitbucket.org/<USER>/<REPOSITORY_NAME>
-
-This downloads that new forked repository to your local machine, so that you
-can access it, read it, make modifications, etc.  It will put the repository in
-a local directory of the same name as the repository in the current working
-directory.  You can see any past state of the code by using the hg log command.
-For example, the following command would show you the last 5 changesets
-(modifications to the code) that were submitted to that repository.
-
-.. code-block:: bash
-
-   $ cd <REPOSITORY_NAME>
-   $ hg log -l 5
-
-Using the revision specifier (the number or hash identifier next to each
-changeset), you can update the local repository to any past state of the
-code (a previous changeset or version) by executing the command:
-
-.. code-block:: bash
-
-   $ hg up revision_specifier
-
-Lastly, if you want to use this new downloaded version of your yt repository
-as the *active* version of yt on your computer (i.e. the one which is executed
-when you run yt from the command line or ``from yt.mods import *``),
-then you must "activate" it using the following commands from within the
-repository directory.
-
-In order to do this for the first time with a new repository, you have to
-copy some config files over from your yt installation directory (where yt
-was initially installed from the install_script.sh).  Try this:
-
-.. code-block:: bash
-
-   $ cp $YT_DEST/src/yt-hg/*.cfg <REPOSITORY_NAME>
-
-and then every time you want to "activate" a different repository of yt.
-
-.. code-block:: bash
-
-   $ cd <REPOSITORY_NAME>
-   $ python2.7 setup.py develop
-
-This will rebuild all C modules as well.
-
-.. _reading-source:
-
-How To Read The Source Code
----------------------------
-
-If you just want to *look* at the source code, you already have it on your
-computer.  Go to the directory where you ran the install_script.sh, then
-go to ``$YT_DEST/src/yt-hg`` .  In this directory are a number of
-subdirectories with different components of the code, although most of them
-are in the yt subdirectory.  Feel free to explore here.
-
-   ``frontends``
-      This is where interfaces to codes are created.  Within each subdirectory of
-      yt/frontends/ there must exist the following files, even if empty:
-
-      * ``data_structures.py``, where subclasses of AMRGridPatch, StaticOutput
-        and AMRHierarchy are defined.
-      * ``io.py``, where a subclass of IOHandler is defined.
-      * ``misc.py``, where any miscellaneous functions or classes are defined.
-      * ``definitions.py``, where any definitions specific to the frontend are
-        defined.  (i.e., header formats, etc.)
-
-   ``visualization``
-      This is where all visualization modules are stored.  This includes plot
-      collections, the volume rendering interface, and pixelization frontends.
-
-   ``data_objects``
-      All objects that handle data, processed or unprocessed, not explicitly
-      defined as visualization are located in here.  This includes the base
-      classes for data regions, covering grids, time series, and so on.  This
-      also includes derived fields and derived quantities.
-
-   ``analysis_modules``
-      This is where all mechanisms for processing data live.  This includes
-      things like clump finding, halo profiling, halo finding, and so on.  This
-      is something of a catchall, but it serves as a level of greater
-      abstraction than simply data selection and modification.
-
-   ``gui``
-      This is where all GUI components go.  Typically this will be some small
-      tool used for one or two things, which contains a launching mechanism on
-      the command line.
-
-   ``utilities``
-      All broadly useful code that doesn't clearly fit in one of the other
-      categories goes here.
-
-
-If you're looking for a specific file or function in the yt source code, use
-the unix find command:
-
-.. code-block:: bash
-
-   $ find <DIRECTORY_TREE_TO_SEARCH> -name '<FILENAME>'
-
-The above command will find the FILENAME in any subdirectory in the
-DIRECTORY_TREE_TO_SEARCH.  Alternatively, if you're looking for a function
-call or a keyword in an unknown file in a directory tree, try:
-
-.. code-block:: bash
-
-   $ grep -R <KEYWORD_TO_FIND> <DIRECTORY_TREE_TO_SEARCH>
-
-This can be very useful for tracking down functions in the yt source.
-
-.. _code-style-guide:
-
-Code Style Guide
-----------------
-
-To keep things tidy, we try to stick with a couple simple guidelines.
-
-General Guidelines
-++++++++++++++++++
-
- * In general, follow `PEP-8 <http://www.python.org/dev/peps/pep-0008/>`_ guidelines.
- * Classes are ConjoinedCapitals, methods and functions are
-   ``lowercase_with_underscores``.
- * Use 4 spaces, not tabs, to represent indentation.
- * Line widths should not be more than 80 characters.
- * Do not use nested classes unless you have a very good reason to, such as
-   requiring a namespace or class-definition modification.  Classes should live
-   at the top level.  ``__metaclass__`` is exempt from this.
- * Do not use unnecessary parentheses in conditionals.  ``if((something) and
-   (something_else))`` should be rewritten as ``if something and
-   something_else``.  Python is more forgiving than C.
- * Avoid copying memory when possible. For example, don't do ``a =
-   a.reshape(3,4)`` when ``a.shape = (3,4)`` will do, and ``a = a * 3`` should be
-   ``na.multiply(a, 3, a)``.  (A short example follows this list.)
- * In general, avoid all double-underscore method names: ``__something`` is
-   usually unnecessary.
- * Doc strings should describe input, output, behavior, and any state changes
-   that occur on an object.  See the file `doc/docstring_example.txt` for a
-   fiducial example of a docstring.
-
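-As a short example of the memory-copy guideline, here is the in-place idiom
-next to the rebinding one, using the ``na`` import mandated below:
-
-.. code-block:: python
-
-   import numpy as na
-
-   a = na.arange(12, dtype="float64")
-
-   # Rebinding idioms the guide discourages:
-   b = a.reshape(3, 4)    # new array object (and sometimes a copy)
-   c = a * 3              # always allocates a whole new array
-
-   # In-place equivalents:
-   a.shape = (3, 4)       # reshapes the existing array in place
-   na.multiply(a, 3, a)   # writes the product back into a's own buffer
-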
-API Guide
-+++++++++
-
- * Do not import "*" from anything other than ``yt.funcs``.
- * Internally, only import from source files directly; instead of: ``from
-   yt.visualization.api import PlotCollection`` do
-   ``from yt.visualization.plot_collection import PlotCollection``.
- * Numpy is to be imported as ``na`` not ``np``.  While this may change in the
-   future, for now this is the correct idiom.
- * Do not use too many keyword arguments.  If you have a lot of keyword
-   arguments, then you are doing too much in ``__init__`` and not enough via
-   parameter setting.
- * In function arguments, place spaces after commas.  ``def something(a,b,c)``
-   should be ``def something(a, b, c)``.
- * Don't create a new class to replicate the functionality of an old class --
-   replace the old class.  Too many options makes for a confusing user
-   experience.
- * Parameter files external to yt are a last resort.
- * The usage of the ``**kwargs`` construction should be avoided.  If they
-   cannot be avoided, they must be explained, even if they are only to be
-   passed on to a nested function.
- * Constructor APIs should be kept as *simple* as possible.
- * Variable names should be short but descriptive.
- * No global variables!
-
-Variable Names and Enzo-isms
-++++++++++++++++++++++++++++
-
- * Avoid Enzo-isms.  This includes but is not limited to:
-
-   + Hard-coding parameter names that are the same as those in Enzo.  The
-     following translation table should be of some help.  Note that the
-     parameters are now properties on a StaticOutput subclass: you access them
-     like ``pf.refine_by`` .
-
-     - ``RefineBy`` => ``refine_by``
-     - ``TopGridRank`` => ``dimensionality``
-     - ``TopGridDimensions`` => ``domain_dimensions``
-     - ``InitialTime`` => ``current_time``
-     - ``DomainLeftEdge`` => ``domain_left_edge``
-     - ``DomainRightEdge`` => ``domain_right_edge``
-     - ``CurrentTimeIdentifier`` => ``unique_identifier``
-     - ``CosmologyCurrentRedshift`` => ``current_redshift``
-     - ``ComovingCoordinates`` => ``cosmological_simulation``
-     - ``CosmologyOmegaMatterNow`` => ``omega_matter``
-     - ``CosmologyOmegaLambdaNow`` => ``omega_lambda``
-     - ``CosmologyHubbleConstantNow`` => ``hubble_constant``
-
-   + Do not assume that the domain runs from 0 to 1.  This is not true
-     everywhere.

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/external_analysis.rst
--- a/source/advanced/external_analysis.rst
+++ /dev/null
@@ -1,411 +0,0 @@
-Using yt with External Analysis Tools
-=====================================
-
-yt can be used as a ``glue`` code between simulation data and other methods of
-analyzing data.  Its facilities for understanding units, disk IO and data
-selection set it up ideally to use other mechanisms for analyzing, processing
-and visualizing data.
-
-Calling External Python Codes
------------------------------
-
-Calling external Python codes is very straightforward.  For instance, if you had a
-Python code that accepted a set of structured meshes and then post-processed
-them to apply radiative feedback, one could imagine calling it directly:
-
-.. code-block:: python
-
-   from yt.mods import *
-   import radtrans
-
-   pf = load("DD0010/DD0010")
-   rt_grids = []
-
-   for grid in pf.h.grids:
-       rt_grid = radtrans.RegularBox(
-            grid.LeftEdge, grid.RightEdge,
-            grid["Density"], grid["Temperature"], grid["Metallicity"])
-       rt_grids.append(rt_grid)
-       grid.clear_data()
-
-   radtrans.process(rt_grids)
-
-Or if you wanted to run a population synthesis module on a set of star
-particles (and you could fit them all into memory) it might look something like
-this:
-
-.. code-block:: python
-
-   from yt.mods import *
-   import pop_synthesis
-
-   pf = load("DD0010/DD0010")
-   dd = pf.h.all_data()
-   star_masses = dd["StarMassMsun"]
-   star_metals = dd["StarMetals"]
-
-   pop_synthesis.CalculateSED(star_masses, star_metals)
-
-If you have a code that's written in Python that you are having trouble getting
-data into from yt, please feel encouraged to email the users list and we'll
-help out.
-
-Calling Non-Python External Codes
----------------------------------
-
-Independent of its ability to process, analyze and visualize data, yt can also
-serve as a mechanism for reading and selecting simulation data.  In this way,
-it can be used to supply data to an external analysis routine written in
-Fortran, C or C++.  This document describes how to supply that data, using the
-example of a simple code that calculates the best axes that describe a
-distribution of particles as a starting point.  (The underlying method is left
-as an exercise for the reader; we're only currently interested in the function
-specification and structs.)
-
-If you have written a piece of code that performs some analysis function, and
-you would like to include it in the base distribution of yt, we would be happy
-to do so; drop us a line or see :ref:`contributing-code` for more information.
-
-To accomplish the process of linking Python with our external code, we will be
-using a language called `Cython <http://www.cython.org/>`_, which is
-essentially a superset of Python that compiles down to C.  It is aware of NumPy
-arrays, and it is able to massage data between the interpreted language Python
-and C, Fortran or C++.  It will be much easier to utilize routines and analysis
-code that have been separated into subroutines that accept data structures, so
-we will assume that our halo axis calculator accepts a set of structs.
-
-Our Example Code
-++++++++++++++++
-
-Here is the ``axes.h`` file in our imaginary code, which we will then wrap:
-
-.. code-block:: c
-
-   typedef struct structParticleCollection {
-        long npart;
-        double *xpos;
-        double *ypos;
-        double *zpos;
-   } ParticleCollection;
-   
-   void calculate_axes(ParticleCollection *part, 
-            double *ax1, double *ax2, double *ax3);
-
-There are several components to this analysis routine which we will have to
-wrap.
-
-   #. We have to wrap the creation of an instance of ``ParticleCollection``.
-   #. We have to transform a set of NumPy arrays into pointers to doubles.
-   #. We have to create a set of doubles into which ``calculate_axes`` will be
-      placing the values of the axes it calculates.
-   #. We have to turn the return values back into Python objects.
-
-Each of these steps can be handled in turn, and we'll be doing it using Cython
-as our interface code.
-
-Setting Up and Building Our Wrapper
-+++++++++++++++++++++++++++++++++++
-
-To get started, we'll need to create two files:
-
-.. code-block:: bash
-
-   axes_calculator.pyx
-   axes_calculator_setup.py
-
-These can go anywhere, but it might be useful to put them in their own
-directory.  The contents of ``axes_calculator.pyx`` will be left for the next
-section, but we will need to put some boilerplate code into
-``axes_calculator_setup.py``.  As a quick sidenote, you should call these
-whatever is most appropriate for the external code you are wrapping;
-``axes_calculator`` is probably not the best bet.
-
-Here's a rough outline of what should go in ``axes_calculator_setup.py``:
-
-.. code-block:: python
-
-   NAME = "axes_calculator"
-   EXT_SOURCES = []
-   EXT_LIBRARIES = ["axes_utils", "m"]
-   EXT_LIBRARY_DIRS = ["/home/rincewind/axes_calculator/"]
-   EXT_INCLUDE_DIRS = []
-   DEFINES = []
-
-   from distutils.core import setup
-   from distutils.extension import Extension
-   from Cython.Distutils import build_ext
-
-   ext_modules = [Extension(NAME,
-                    [NAME+".pyx"] + EXT_SOURCES,
-                    libraries = EXT_LIBRARIES,
-                    library_dirs = EXT_LIBRARY_DIRS,
-                    include_dirs = EXT_INCLUDE_DIRS,
-                    define_macros = DEFINES)
-   ]
-
-   setup(
-     name = NAME,
-     cmdclass = {'build_ext': build_ext},
-     ext_modules = ext_modules
-   )
-
-The only variables you should have to change in this are the first six, and
-possibly only the first one.  We'll go through these variables one at a time.  
-
-``NAME``
-   This is the name of our source file, minus the ``.pyx``.  We're also
-   mandating that it be the name of the module we import.  You're free to
-   modify this.
-``EXT_SOURCES``
-   Any additional sources can be listed here.  For instance, if you are only
-   linking against a single ``.c`` file, you could list it here -- if our axes
-   calculator were fully contained within a file called ``calculate_my_axes.c``
-   we could link against it using this variable, and then we would not have to
-   specify any libraries.  This is usually the simplest way to do things, and in
-   fact, yt makes use of this itself for things like HEALpix and interpolation
-   functions.
-``EXT_LIBRARIES``
-   Any libraries that will need to be linked against (like ``m``!) should be
-   listed here.  Note that these are the name of the library minus the leading
-   ``lib`` and without the trailing ``.so``.  So ``libm.so`` would become ``m``
-   and ``libluggage.so`` would become ``luggage``.
-``EXT_LIBRARY_DIRS``
-   If the libraries listed in ``EXT_LIBRARIES`` reside in some other directory
-   or directories, those directories should be listed here.  For instance,
-   ``["/usr/local/lib", "/home/rincewind/luggage/"]`` .
-``EXT_INCLUDE_DIRS``
-   If any header files have been included that live in external directories,
-   those directories should be included here.
-``DEFINES``
-   Any define macros that should be passed to the C compiler should be listed
-   here; if they just need to be defined, then they should be specified to be
-   defined as "None."  For instance, if you wanted to pass ``-DTWOFLOWER``, you
-   would set this to equal: ``[("TWOFLOWER", None)]``.
-
-To build our extension, we would run:
-
-.. code-block:: bash
-
-   $ python2.7 axes_calculator_setup.py build_ext -i
-
-Note that since we don't yet have an ``axes_calculator.pyx``, this will fail.
-But once we have it, it ought to run.
-
-Writing and Calling our Wrapper
-+++++++++++++++++++++++++++++++
-
-Now we begin the tricky part, of writing our wrapper code.  We've already
-figured out how to build it, which is halfway to being able to test that it
-works, and we now need to start writing Cython code.
-
-For a more detailed introduction to Cython, see the Cython documentation at
-http://docs.cython.org/ .  We'll cover a few of the basics for wrapping code
-however.
-
-To start out with, we need to open up and edit our file,
-``axes_calculator.pyx``.  Open this in your favorite version of vi (mine is
-vim) and we will get started by declaring the struct we need to pass in.  But
-first, we need to include some header information:
-
-.. code-block:: cython
-
-   import numpy as np
-   cimport numpy as np
-   cimport cython
-   from stdlib cimport malloc, free
-
-These lines simply import and "Cython import" some common routines.  For more
-information about what is already available, see the Cython documentation.  For
-now, we need to start translating our data.
-
-To do so, we tell Cython both where the struct should come from, and then we
-describe the struct itself.  One fun thing to note is that if you don't need to
-set or access all the values in a struct, and it just needs to be passed around
-opaquely, you don't have to include them in the definition.  For an example of
-this, see the ``png_writer.pyx`` file in the yt repository.  Here's the syntax
-for pulling in (from a file called ``axes_calculator.h``) a struct like the one
-described above:
-
-.. code-block:: cython
-
-   cdef extern from "axes_calculator.h":
-       ctypedef struct ParticleCollection:
-           long npart
-           double *xpos
-           double *ypos
-           double *zpos
-
-So far, pretty easy!  We've basically just translated the declaration from the
-``.h`` file.  Now that we have done so, any other Cython code can create and
-manipulate these ``ParticleCollection`` structs -- which we'll do shortly.
-Next up, we need to declare the function we're going to call, which looks
-nearly exactly like the one in the ``.h`` file.  (One common problem is that
-Cython doesn't know what ``const`` means, so just remove it wherever you see
-it.)  Declare it like so:
-
-.. code-block:: cython
-
-       void calculate_axes(ParticleCollection *part,
-                double *ax1, double *ax2, double *ax3)
-
-Note that this is indented one level, to indicate that it, too, comes from
-``axes_calculator.h``.  The next step is to create a function that accepts
-arrays and converts them to the format the struct likes.  We declare our
-function just like we would a normal Python function, using ``def``.  You can
-also use ``cdef`` if you only want to call a function from within Cython.  We
-want to call it from Python, too, so we just use ``def``.  Note that we don't
-here specify types for the various arguments.  In a moment we'll refine this to
-have better argument types.
-
-.. code-block:: cython
-
-   def examine_axes(xpos, ypos, zpos):
-       cdef double ax1[3], ax2[3], ax3[3]
-       cdef ParticleCollection particles
-       cdef int i
-
-       particles.npart = len(xpos)
-       particles.xpos = <double *> malloc(particles.npart * sizeof(double))
-       particles.ypos = <double *> malloc(particles.npart * sizeof(double))
-       particles.zpos = <double *> malloc(particles.npart * sizeof(double))
-
-       for i in range(particles.npart):
-           particles.xpos[i] = xpos[i]
-           particles.ypos[i] = ypos[i]
-           particles.zpos[i] = zpos[i]
-
-       calculate_axes(&particles, ax1, ax2, ax3)
-
-       free(particles.xpos)
-       free(particles.ypos)
-       free(particles.zpos)
-
-       return ( (ax1[0], ax1[1], ax1[2]),
-                (ax2[0], ax2[1], ax2[2]),
-                (ax3[0], ax3[1], ax3[2]) )
-
-This does the rest.  Note that we've weaved in C-type declarations (ax1, ax2,
-ax3) and Python access to the variables fed in.  This function will probably be
-quite slow -- because it doesn't know anything about the variables xpos, ypos,
-zpos, it won't be able to speed up access to them.  Now we will see what we can
-do by declaring them to be of array-type before we start handling them at all.
-We can do that by annotating in the function argument list.  But first, let's
-test that it works.  From the directory in which you placed these files, run:
-
-.. code-block:: bash
-
-   $ python2.7 axes_calculator_setup.py build_ext -i
-
-Now, create a sample file that feeds in the particles:
-
-.. code-block:: python
-
-    import numpy as np
-    import axes_calculator
-
-    xpos, ypos, zpos = np.random.random((3, 1000))
-    axes_calculator.examine_axes(xpos, ypos, zpos)
-
-Most of the time in that function is spent in converting the data.  So now we
-can go back and we'll try again, rewriting our converter function to believe
-that it's being fed arrays from NumPy:
-
-.. code-block:: cython
-
-   def examine_axes(np.ndarray[np.float64_t, ndim=1] xpos,
-                    np.ndarray[np.float64_t, ndim=1] ypos,
-                    np.ndarray[np.float64_t, ndim=1] zpos):
-       cdef double ax1[3], ax2[3], ax3[3]
-       cdef ParticleCollection particles
-       cdef int i
-
-       particles.npart = len(xpos)
-       particles.xpos = <double *> malloc(particles.npart * sizeof(double))
-       particles.ypos = <double *> malloc(particles.npart * sizeof(double))
-       particles.zpos = <double *> malloc(particles.npart * sizeof(double))
-
-       for i in range(particles.npart):
-           particles.xpos[i] = xpos[i]
-           particles.ypos[i] = ypos[i]
-           particles.zpos[i] = zpos[i]
-
-       calculate_axes(&particles, ax1, ax2, ax3)
-
-       free(particles.xpos)
-       free(particles.ypos)
-       free(particles.zpos)
-
-       return ( (ax1[0], ax1[1], ax1[2]),
-                (ax2[0], ax2[1], ax2[2]),
-                (ax3[0], ax3[1], ax3[2]) )
-
-This should be substantially faster, assuming you feed it arrays.
-
-Now, there's one last thing we can try.  If we know our function won't modify
-our arrays, and they are C-Contiguous, we can simply grab pointers to the data:
-
-.. code-block:: cython
-
-   def examine_axes(np.ndarray[np.float64_t, ndim=1] xpos,
-                    np.ndarray[np.float64_t, ndim=1] ypos,
-                    np.ndarray[np.float64_t, ndim=1] zpos):
-       cdef double ax1[3], ax2[3], ax3[3]
-       cdef ParticleCollection particles
-       cdef int i
-
-       particles.npart = len(xpos)
-       particles.xpos = <double *> xpos.data
-       particles.ypos = <double *> ypos.data
-       particles.zpos = <double *> zpos.data
-
-       # No copy loop is needed here: the struct's pointers alias the
-       # NumPy buffers directly.
-
-       calculate_axes(&particles, ax1, ax2, ax3)
-
-       return ( (ax1[0], ax1[1], ax1[2]),
-                (ax2[0], ax2[1], ax2[2]),
-                (ax3[0], ax3[1], ax3[2]) )
-
-But note!  This will break or do weird things if you feed it arrays that are
-non-contiguous.
-
-At this point, you should have a mostly working piece of wrapper code.  And it
-was pretty easy!  Let us know if you run into any problems, or if you are
-interested in distributing your code with yt.
-
-A complete set of files is available with this documentation.  These are
-slightly different, so that the whole thing will simply compile, but they
-provide a useful example.
-
- * `axes.c <../_static/axes.c>`_
- * `axes.h <../_static/axes.h>`_
- * `axes_calculator.pyx <../_static/axes_calculator.pyx>`_
- * `axes_calculator_setup.py <../_static/axes_calculator_setup.txt>`_
-
-Exporting Data from yt
-----------------------
-
-yt is installed alongside h5py.  If you need to export your data from yt, to
-share it with people or to use it inside another code, h5py is a good way to do
-so.  You can write out complete datasets with just a few commands.  You have to
-import h5py, and then save things out into a file.
-
-.. code-block:: python
-
-   import h5py
-   f = h5py.File("some_file.h5")
-   f.create_dataset("/data", data=some_data)
-
-This will create ``some_file.h5`` if necessary and add a new dataset
-(``/data``) to it.  Writing out in ASCII should be relatively straightforward.
-For instance:
-
-.. code-block:: python
-
-   f = open("my_file.txt", "w")
-   for halo in halos:
-       x, y, z = halo.center_of_mass()
-       f.write("%0.2f %0.2f %0.2f\n" % (x, y, z))
-   f.close()
-
-This example could be extended to work with any data object's fields, as well.

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/ionization_cube.py
--- a/source/advanced/ionization_cube.py
+++ /dev/null
@@ -1,37 +0,0 @@
-from yt.mods import *
-from yt.utilities.parallel_tools.parallel_analysis_interface \
-    import communication_system
-import h5py, glob, time
-
-@derived_field(name = "IonizedHydrogen",
-               units = r"\frac{\rho_{HII}}{\rho_{H}}")
-def IonizedHydrogen(field, data):
-    return data["HII_Density"]/(data["HI_Density"]+data["HII_Density"])
-
-ts = TimeSeriesData.from_filenames("SED800/DD*/*.hierarchy", parallel = 8)
-
-ionized_z = np.zeros(ts[0].domain_dimensions, dtype="float32")
-
-t1 = time.time()
-for pf in ts.piter():
-    z = pf.current_redshift
-    for g in parallel_objects(pf.h.grids, njobs = 16):
-        i1, j1, k1 = g.get_global_startindex() # Index into our domain
-        i2, j2, k2 = g.get_global_startindex() + g.ActiveDimensions
-        # Look for the newly ionized gas
-        newly_ion = ((g["IonizedHydrogen"] > 0.999)
-                   & (ionized_z[i1:i2,j1:j2,k1:k2] < z))
-        ionized_z[i1:i2,j1:j2,k1:k2][newly_ion] = z
-        g.clear_data()
-
-print "Iteration completed  %0.3e" % (time.time()-t1)
-comm = communication_system.communicators[-1]
-for i in range(ionized_z.shape[0]):
-    ionized_z[i,:,:] = comm.mpi_allreduce(ionized_z[i,:,:], op="max")
-    print "Slab % 3i has minimum z of %0.3e" % (i, ionized_z[i,:,:].max())
-t2 = time.time()
-print "Completed.  %0.3e" % (t2-t1)
-
-if comm.rank == 0:
-    f = h5py.File("IonizationCube.h5", "w")
-    f.create_dataset("/z", data=ionized_z)

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/parallel_computation.rst
--- a/source/advanced/parallel_computation.rst
+++ /dev/null
@@ -1,533 +0,0 @@
-.. _parallel-computation:
-
-Parallel Computation With YT
-============================
-
-YT has been instrumented with the ability to compute many -- most, even --
-quantities in parallel.  This utilizes the package
-`mpi4py <http://code.google.com/p/mpi4py>`_ to parallelize via the Message
-Passing Interface (MPI), which is typically available on clusters.
-
-.. _capabilities:
-
-Capabilities
-------------
-
-Currently, YT is able to perform the following actions in parallel:
-
- * Projections (:ref:`projection-plots`)
- * Slices (:ref:`slice-plots`)
- * Cutting planes (oblique slices) (:ref:`off-axis-slices`)
- * Derived Quantities (total mass, angular momentum, etc) (:ref:`creating_derived_quantities`,
-   :ref:`derived-quantities`)
- * 1-, 2-, and 3-D profiles (:ref:`generating-profiles-and-histograms`)
- * Halo finding (:ref:`halo_finding`)
- * Merger tree (:ref:`merger_tree`)
- * Two point functions (:ref:`two_point_functions`)
- * Volume rendering (:ref:`volume_rendering`)
- * Radial column density (:ref:`radial-column-density`)
- * Isocontours & flux calculations (:ref:`extracting-isocontour-information`)
-
-This list covers just about every action YT can take!  Additionally, almost all
-scripts will benefit from parallelization without any modification.  The goal
-of Parallel-YT has been to retain API compatibility and abstract all
-parallelism.
-
-Setting Up Parallel YT
-----------------------
-
-To run scripts in parallel, you must first install
-`mpi4py <http://code.google.com/p/mpi4py>`_.
-Instructions for doing so are provided on the mpi4py website, but you may
-have luck by just running: 
-
-.. code-block:: bash
-
-    $ pip install mpi4py
-
-Once that has been installed, you're all done!  You just need to launch your 
-scripts with ``mpirun`` (or equivalent) and signal to YT that you want to run 
-them in parallel.  In general, that's all it takes to get a speed benefit on a 
-multi-core machine.  Here is an example on an 8-core desktop:
-
-.. code-block:: bash
-
-    $ mpirun -np 8 python script.py --parallel
-
-Throughout its normal operation, yt keeps you aware of what is happening with
-regular messages to stderr, usually prefaced with:
-
-.. code-block:: bash
-
-    yt : [INFO   ] YYYY-MM-DD HH:MM:SS
-
-However, when operating in parallel mode, yt prefixes this log output with the
-rank of each of your processors, as in:
-
-.. code-block:: bash
-
-    P000 yt : [INFO   ] YYYY-MM-DD HH:MM:SS
-    P001 yt : [INFO   ] YYYY-MM-DD HH:MM:SS
-
-in the case of two cores being used.
-
-It's important to note that all of the operations listed in :ref:`capabilities`
-work in parallel -- and no additional work is necessary to parallelize them.
-Furthermore, the ``yt`` command itself recognizes the ``--parallel`` option, so
-those commands will work in parallel as well.
-
-The Derived Quantities and Profile objects must both have the ``lazy_reader``
-option set to ``True`` when they are instantiated.  This makes them operate on
-a grid-by-grid, decomposed basis.  In ``yt`` version 1.5 and the trunk, this
-has recently been made the default.
-
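-For example, a minimal sketch of a grid-decomposed derived quantity call
-(assuming ``pf`` comes from ``load``; the field name is illustrative):
-
-.. code-block:: python
-
-   dd = pf.h.all_data()
-   # lazy_reader makes this operate grid-by-grid, so the work
-   # decomposes naturally across MPI tasks.
-   total = dd.quantities["TotalQuantity"]("CellMassMsun", lazy_reader=True)
-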
-.. warning:: If you manually interact with the filesystem via
-   functions, not through YT, you will have to ensure that you only
-   execute these functions on the root processor.  You can do this
-   with the function :func:`only_on_root`. If you have only a few
-   lines of code that interact with the filesystem
-   (e.g. ``pyplot.savefig('blah.png')``), you can wrap them in an if
-   statement, using yt's :func:`is_root` which returns True only for
-   the root process. See :ref:`cookbook-time-series-analysis` for
-   an example.
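-
-For instance, a minimal sketch of the :func:`is_root` pattern (the dataset
-name and the plot itself are illustrative):
-
-.. code-block:: python
-
-   from yt.pmods import *
-   import matplotlib
-   matplotlib.use("Agg")
-   import matplotlib.pyplot as plt
-
-   pf = load("RD0035/RedshiftOutput0035")
-   v, c = pf.h.find_max("Density")   # runs in parallel on every task
-
-   if is_root():
-       # Only the root processor touches the filesystem.
-       plt.plot(c, "o")
-       plt.savefig("blah.png")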
-
-yt.pmods
---------
-
-yt.pmods is a drop-in replacement module for yt.mods, enabled by changing the
-``from yt.mods import *`` calls in yt scripts to ``from yt.pmods import *``.
-It should enable more efficient use of parallel filesystems, if you are
-running on such a system.
-
-For instance, the following script, which we'll save as ``my_script.py``:
-
-.. code-block:: python
-
-   from yt.pmods import *
-   pf = load("RD0035/RedshiftOutput0035")
-   v, c = pf.h.find_max("Density")
-   print v, c
-   p = ProjectionPlot(pf, "x", "Density")
-   p.save()
-
-will execute the finding of the maximum density and the projection in parallel
-if launched in parallel.  Note that the usual ``from yt.mods import *`` has 
-been replaced by ``from yt.pmods import *``.
-To run this script at the command line you would execute:
-
-.. code-block:: bash
-
-   $ mpirun -np 16 python2.7 my_script.py --parallel
-
-if you wanted it to run in parallel on 16 cores (you can always change the
-number of cores you want to run on).  If you run into problems, then you can
-use :ref:`remote-debugging` to examine what went wrong.
-
-Types of Parallelism
---------------------
-
-In order to divide up the work, YT will attempt to send different tasks to
-different processors.  However, to minimize inter-process communication, YT
-will decompose the information in different ways based on the task.
-
-Spatial Decomposition
-+++++++++++++++++++++
-
-During this process, the hierarchy will be decomposed along either all three
-axes or, in the case of projections, along an image plane.  This type of
-parallelism is generally less efficient than grid-based parallelism, but it
-has been shown to obtain good results overall.
-
-The following operations use spatial decomposition:
-
-  * Halo finding
-  * Merger tree
-  * Two point functions
-  * Volume rendering
-  * Radial column density
-
-Grid Decomposition
-++++++++++++++++++
-
-The alternative to spatial decomposition is a simple round-robin of the grids.
-This process allows YT to pool data access to a given Enzo data file, which
-ultimately results in faster read times and better parallelism.
-
-The following operations use grid decomposition:
-
-  * Projections
-  * Slices
-  * Cutting planes
-  * Derived Quantities
-  * 1-, 2-, and 3-D profiles
-  * Isocontours & flux calculations
-
-Object-Based
-++++++++++++
-
-In a fashion similar to grid decomposition, computation can be parallelized
-over objects. This is especially useful for
-`embarrassingly parallel <http://en.wikipedia.org/wiki/Embarrassingly_parallel>`_
-tasks where the items to be worked on can be split into separate chunks and
-saved to a list. The list is then split up and each MPI task performs parts of
-it independently.
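-
-A minimal sketch of the pattern (the file names are placeholders; a full,
-heavily-commented version appears in the next section):
-
-.. code-block:: python
-
-   from yt.pmods import *
-
-   fns = ["DD0001/data0001", "DD0002/data0002"]  # any Python list of work items
-   for fn in parallel_objects(fns, 2):
-       pf = load(fn)  # each MPI task processes its own subset of fns
-       print pf.current_time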
-
-.. _parallelizing-your-analysis:
-
-Parallelizing Your Analysis
----------------------------
-
-It is easy within YT to parallelize a list of tasks, as long as those tasks
-are independent of one another.
-Using object-based parallelism, the function :func:`parallel_objects` will
-automatically split up a list of tasks over the specified number of processors
-(or cores).
-Please see this heavily-commented example:
-
-.. code-block:: python
-   
-   # This is necessary to prevent a race-condition where each copy of
-   # yt attempts to save information about datasets to the same file on disk,
-   # simultaneously. This will be fixed, eventually...
-   from yt.config import ytcfg; ytcfg["yt","serialize"] = "False"
-   # As always...
-   from yt.pmods import *
-   import glob
-   
-   # The number 4, below, is the number of processes to parallelize over, which
-   # is generally equal to the number of MPI tasks the job is launched with.
-   # If num_procs is set to zero or a negative number, the for loop below
-   # will be run such that each iteration of the loop is done by a single MPI
-   # task. Put another way, setting it to zero means that no matter how many
-   # MPI tasks the job is run with, num_procs will default to the number of
-   # MPI tasks automatically.
-   num_procs = 4
-   
-   # fns is a list of all Enzo hierarchy files in directories one level down.
-   fns = glob.glob("*/*.hierarchy")
-   fns.sort()
-   # This dict will store information collected in the loop, below.
-   # Inside the loop each task will have a local copy of the dict, but
-   # the dict will be combined once the loop finishes.
-   my_storage = {}
-   # In this example, because the storage option is used in the
-   # parallel_objects function, the loop yields a tuple, which gets used
-   # as (sto, fn) inside the loop.
-   # In the loop, sto is essentially my_storage, but a local copy of it.
-   # If data does not need to be combined after the loop is done, the line
-   # would look like:
-   #       for fn in parallel_objects(fns, num_procs):
-   for sto, fn in parallel_objects(fns, num_procs, storage = my_storage):
-       # Open a data file, remembering that fn is different on each task.
-       pf = load(fn)
-       dd = pf.h.all_data()
-       # This copies fn and the min/max of density to the local copy of
-       # my_storage
-       sto.result_id = fn
-       sto.result = dd.quantities["Extrema"]("Density")
-       # Makes and saves a plot of the gas density.
-       p = ProjectionPlot(pf, "x", "Density")
-       p.save()
-   # At this point, as the loop exits, the local copies of my_storage are
-   # combined such that all tasks now have an identical and full version of
-   # my_storage. Until this point, each task is unaware of what the other
-   # tasks have produced.
-   # Below, the values in my_storage are printed by only one task. The other
-   # tasks do nothing.
-   if ytcfg.getint("yt", "__topcomm_parallel_rank") == 0:
-       for fn, vals in sorted(my_storage.items()):
-           print fn, vals
-
-The example above can be modified to loop over anything that can be saved to
-a Python list: halos, data files, arrays, and more.
-
-.. _parallel-time-series-analysis:
-
-Parallel Time Series Analysis
------------------------------
-
-The same :func:`parallel_objects` machinery discussed above is turned on by
-default when using a ``TimeSeriesData`` object (see :ref:`time-series-analysis`)
-to iterate over simulation outputs.  The syntax for this is very simple.  As an
-example, we can use the following script to find the angular momentum vector in
-a 1 pc sphere centered on the maximum density cell in a large number of
-simulation outputs:
-
-.. code-block:: python
-
-   from yt.pmods import *
-   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel = True)
-   sphere = ts.sphere("max", (1.0, "pc"))
-   L_vecs = sphere.quantities["AngularMomentumVector"]()
-
-Note that this script can be run in serial or parallel with an arbitrary number
-of processors.  When running in parallel, each output is given to a different
-processor.  By default, parallel is set to ``True``, so you do not have to
-explicitly set ``parallel = True`` as in the above example. 
-
-One could get the same effect by iterating over the individual parameter files
-in the TimeSeriesData object:
-
-.. code-block:: python
-
-   from yt.pmods import *
-   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel = True)
-   my_storage = {}
-   for sto,pf in ts.piter(storage=my_storage):
-       sphere = pf.h.sphere("max", (1.0, "pc"))
-       L_vec = sphere.quantities["AngularMomentumVector"]()
-       sto.result_id = pf.parameter_filename
-       sto.result = L_vec
-
-   L_vecs = []
-   for fn, L_vec in sorted(my_storage.items()):
-       L_vecs.append(L_vec)
-
-
-You can also request a fixed number of processors to calculate each
-angular momentum vector.  For example, this script will calculate each angular
-momentum vector using 4 workgroups, splitting up the pool of available
-processors.  Note that ``parallel = 1`` implies that the analysis will be run
-using 1 workgroup, whereas ``parallel = True`` will run with Nprocs workgroups.
-
-.. code-block:: python
-
-   from yt.pmods import *
-   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel = 4)
-   sphere = ts.sphere("max", (1.0, "pc"))
-   L_vecs = sphere.quantities["AngularMomentumVector"]()
-
-If you do not want to use ``parallel_objects`` parallelism when using a
-TimeSeries object, set ``parallel = False``.  When running Python in parallel,
-this will use all of the available processors to evaluate the requested
-operation on each simulation output.  Some care, and possibly trial and error,
-might be necessary to find the correct settings for your simulation outputs.
-
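-For instance, a minimal sketch (the file pattern is a placeholder):
-
-.. code-block:: python
-
-   from yt.pmods import *
-
-   # With parallel = False, all processors cooperate on each output in turn.
-   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel = False)
-   for pf in ts.piter():
-       v, c = pf.h.find_max("Density")
-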
-Parallel Performance, Resources, and Tuning
--------------------------------------------
-
-Optimizing parallel jobs in YT is difficult; there are many parameters that
-affect how well and quickly the job runs.  In many cases, the only way to find
-out what the minimum (or optimal) number of processors is, or amount of memory
-needed, is through trial and error.  However, this section will attempt to
-provide some insight into what are good starting values for a given parallel
-task.
-
-Grid Decomposition
-++++++++++++++++++
-
-In general, these types of parallel calculations scale very well with the
-number of processors.
-They are also fairly memory-conservative.
-The two limiting factors are therefore the number of grids in the dataset,
-and the speed of the disk the data is stored on.
-There is no point in running a parallel job of this kind with more processors
-than grids, because the extra processors will do absolutely nothing, and will
-in fact probably just serve to slow down the whole calculation due to the extra
-overhead.
-The speed of the disk is also a consideration - if it is not a high-end parallel
-file system, adding more tasks will not speed up the calculation if the disk
-is already swamped with activity.
-
-The best advice for this sort of calculation is to run with just a few
-processors and go from there, seeing if the runtime improves noticeably.
-
-**Projections, Slices, and Cutting Planes**
-
-Projections, slices and cutting planes are the most common methods of creating
-two-dimensional representations of data.  All three have been parallelized in a
-grid-based fashion.
-
- * Projections: projections are parallelized utilizing a quad-tree approach.
-   Data is loaded for each processor, typically by a process that consolidates
-   open/close/read operations, and each grid is then iterated over and cells
-   are deposited into a data structure that stores values corresponding to
-   positions in the two-dimensional plane.  This provides excellent load
-   balancing, and in serial is quite fast.  However, as of yt 2.3, the
-   operation by which quadtrees are joined across processors scales poorly;
-   while memory consumption scales well, the time to completion does not.  As
-   such, projections can often be done very fast when operating only on a single
-   processor!  The quadtree algorithm can be used inline (and, indeed, it is
-   for this reason that it is slow.)  It is recommended that you attempt to
-   project in serial before projecting in parallel; even for the very largest
-   datasets (Enzo 1024^3 root grid with 7 levels of refinement) in the absence
-   of IO the quadtree algorithm takes only three minutes or so on a decent
-   processor.
- * Slices: to generate a slice, grids that intersect a given slice are iterated
-   over and their finest-resolution cells are deposited.  The grids are
-   decomposed via standard load balancing.  While this operation is parallel,
-   **it is almost never necessary to slice a dataset in parallel**, as all data is
-   loaded on demand anyway.  The slice operation has been parallelized so as to
-   enable slicing when running *in situ*.
- * Cutting planes: cutting planes are parallelized exactly as slices are.
-   However, in contrast to slices, because the data-selection operation can be
-   much more time consuming, cutting planes often benefit from parallelism.
-
-Object-Based
-++++++++++++
-
-As with grid decomposition, it does not help to run with more processors than
-the number of objects to be iterated over.
-There is also the matter of the kind of work being done on each object, and
-whether it is disk-intensive, cpu-intensive, or memory-intensive.
-It is up to the user to figure out what limits the performance of their script,
-and use the correct amount of resources, accordingly.
-
-Disk-intensive jobs are limited by the speed of the file system, as above,
-and extra processors beyond its capability are likely counter-productive.
-It may require some testing or research (e.g. supercomputer documentation)
-to find out what the file system is capable of.
-
-If the job is CPU-intensive, it's best to use as many processors as is
-possible and practical.
-
-For a memory-intensive job, each processor needs to be able to allocate enough
-memory, which may mean using fewer than the maximum number of tasks per compute
-node, and increasing the number of nodes.
-The memory used per processor should be estimated and compared to the memory
-available on each compute node; that comparison dictates how many tasks to run
-per node.  For example, if each task needs roughly 8 GB and each node has
-32 GB of memory, run at most four tasks per node.
-After that, the number of processors used overall is dictated by the
-disk system or the CPU-intensity of the job.
-
-
-Domain Decomposition
-++++++++++++++++++++
-
-The various types of analysis that utilize domain decomposition use it in
-different enough ways that they are discussed separately.
-
-**Halo-Finding**
-
-Halo finding, along with the merger tree that uses halo finding, operates
-on the particles in the volume, and is therefore mostly grid-agnostic.
-Generally, the biggest concern for halo finding is the amount of memory needed.
-There is subtle art in estimating the amount of memory needed for halo finding,
-but a rule of thumb is that Parallel HOP (:func:`parallelHF`) is the most
-memory-intensive, followed by plain HOP (:func:`HaloFinder`),
-with Friends of Friends (:func:`FOFHaloFinder`) being
-the most memory-conservative.
-It has been found that :func:`parallelHF` needs roughly
-1 MB of memory per 5,000
-particles, although recent work has improved this and the memory requirement
-is now somewhat smaller.  By that rule of thumb, a 512^3-particle simulation
-(roughly 1.3 x 10^8 particles) would need on the order of 27 GB in aggregate.
-This is a good starting point for estimating the memory required for
-halo finding.
-
-**Two point functions**
-
-Please see :ref:`tpf_strategies` for more details.
-
-**Volume Rendering**
-
-The simplest way to think about volume rendering, and the radial column density
-module that uses it, is that it load-balances over the grids in the dataset.
-Each processor is given a roughly equal-sized volume to operate on.
-In practice, there are just a few things to keep in mind when doing volume
-rendering.
-First, it only uses a power of two number of processors.
-If the job is run with 100 processors, only 64 of them will actually do anything.
-Second, the absolute maximum number of processors is the number of grids.
-But in order to keep work distributed evenly, typically the number of processors
-should be no greater than one-eighth or one-quarter the number of processors
-that were used to produce the dataset.
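-
-For example, a launch like the following (the script name is illustrative)
-keeps all tasks busy, whereas ``-np 100`` would leave 36 of them idle:
-
-.. code-block:: bash
-
-    $ mpirun -np 64 python render_script.py --parallel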
-
-Additional Tips
----------------
-
-  * Don't be afraid to change how a parallel job is run. Change the
-    number of processors, or memory allocated, and see if things work better
-    or worse. After all, it's just a computer, it doesn't pass moral judgment!
-
-  * Similarly, human time is more valuable than computer time. Try increasing
-    the number of processors, and see if the runtime drops significantly.
-    There will be a sweet spot between speed of run and the waiting time in
-    the job scheduler queue; it may be worth trying to find it.
-
-  * If you are using object-based parallelism but doing CPU-intensive computations
-    on each object, you may find that setting ``num_procs`` equal to the 
-    number of processors per compute node can lead to significant speedups.
-    By default, most MPI implementations will assign tasks to processors on a
-    'by-slot' basis, so this setting will tell yt to do computations on a single
-    object using only the processors on a single compute node.  A nice application
-    for this type of parallelism is calculating a list of derived quantities for 
-    a large number of simulation outputs.
-
-  * It is impossible to tune a parallel operation without understanding what's
-    going on. Read the documentation, look at the underlying code, or talk to
-    other yt users. Get informed!
-    
-  * Sometimes it is difficult to know if a job is cpu, memory, or disk
-    intensive, especially if the parallel job utilizes several of the kinds of
-    parallelism discussed above. In this case, it may be worthwhile to put
-    some simple timers in your script (as below) around different parts.
-    
-    .. code-block:: python
-    
-       from yt.pmods import *
-       import time
-       
-       pf = load("DD0152")
-       t0 = time.time()
-       bigstuff, hugestuff = StuffFinder(pf)
-       BigHugeStuffParallelFunction(pf, bigstuff, hugestuff)
-       t1 = time.time()
-       for i in range(1000000):
-           tinystuff, ministuff = GetTinyMiniStuffOffDisk("in%06d.txt" % i)
-           array = TinyTeensyParallelFunction(pf, tinystuff, ministuff)
-           SaveTinyMiniStuffToDisk("out%06d.txt" % i, array)
-       t2 = time.time()
-       
-       if ytcfg.getint("yt", "__topcomm_parallel_rank") == 0:
-           print "BigStuff took %.5e sec, TinyStuff took %.5e sec" % (t1 - t0, t2 - t1)
-  
-  * Remember that if the script handles disk IO explicitly, and does not use
-    a built-in yt function to write data to disk,
-    care must be taken to
-    avoid `race-conditions <http://en.wikipedia.org/wiki/Race_conditions>`_.
-    Be explicit about which MPI task writes to disk using a construction
-    something like this:
-    
-    .. code-block:: python
-       
-       if ytcfg.getint("yt", "__topcomm_parallel_rank") == 0:
-           file = open("out.txt", "w")
-           file.write(stuff)
-           file.close()
-
-  * Many supercomputers allow users to ssh into the nodes that their job is
-    running on.
-    Job schedulers often include the names of the nodes in use in their
-    notification emails; alternatively, a command like ``qstat -f NNNN``, where
-    ``NNNN`` is the job ID, will also show this information.
-    By ssh-ing into nodes, the memory usage of each task can be viewed in
-    real-time as the job runs (using ``top``, for example),
-    and can give valuable feedback about the
-    resources the task requires.
-    
-An Advanced Worked Example
---------------------------
-
-Below is a script used to calculate the redshift of first 99.9% ionization in a
-simulation.  This script was designed to analyze a set of 100 outputs on
-Gordon, running on 128 processors.  This script goes through five phases:
-
- #. Define a new derived field, which calculates the fraction of ionized
-    hydrogen as a function only of the total hydrogen density.
- #. Load a time series up, specifying ``parallel = 8``.  This means that it
-    will decompose into 8 jobs.  So if we ran on 128 processors, we would have
-    16 processors assigned to each output in the time series.
- #. Create a big cube that will hold our results for this set of processors.
-    Note that this will be only for each output considered by this processor,
-    and this cube will not necessarily be filled in every cell.
- #. For each output, distribute the grids to each of the sixteen processors
-    working on that output.  Each of these takes the max of the ionized
-    redshift in their zone versus the accumulation cube.
- #. Iterate over slabs and find the maximum redshift in each slab of our
-    accumulation cube.
-
-At the end, the root processor (of the global calculation) writes out an
-ionization cube that contains the redshift of first reionization for each zone
-across all outputs.
-
-.. literalinclude:: ionization_cube.py

diff -r 4bee173c6ffca15c9c4e41ee8c16c4670148e90e -r 38edb65efe940bac12681b6508f7d47e2caebb27 source/advanced/plugin_file.rst
--- a/source/advanced/plugin_file.rst
+++ /dev/null
@@ -1,56 +0,0 @@
-.. _plugin-file:
-
-The Plugin File
-===============
-
-The plugin file is a means of modifying the available fields, quantities, data
-objects and so on without modifying the source code of yt.  The plugin file
-will be executed if it is detected, and it must be located at:
-
-.. code-block:: bash
-
-   $HOME/.yt/my_plugins.py
-
-The code in this file can thus add fields, add derived quantities, add
-datatypes, and on and on.  It is executed at the bottom of ``yt.mods``, and so
-it is provided with the entire namespace available in the module ``yt.mods`` --
-which is the primary entry point to yt, and which contains most of the
-functionality of yt.  For example, if I created a plugin file containing:
-
-.. code-block:: python
-
-   def _myfunc(field, data):
-       return np.random.random(data["Density"].shape)
-   add_field("SomeQuantity", function=_myfunc)
-
-then all of my data objects would have access to the field ``SomeQuantity``,
-even if it is never used anywhere else.
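-
-For instance, any script that imports ``yt.mods`` could then use the new field
-directly (the dataset name here is illustrative):
-
-.. code-block:: python
-
-   from yt.mods import *
-
-   pf = load("RD0035/RedshiftOutput0035")
-   dd = pf.h.all_data()
-   # The field defined in the plugin file is available everywhere.
-   print dd["SomeQuantity"]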
-
-You can also define other convenience functions in your plugin file.  For
-instance, you could define some variables or functions, and even import common
-modules:
-
-.. code-block:: python
-
-   import os
-
-   HOMEDIR="/home/username/"
-   RUNDIR="/scratch/runs/"
-
-   def load_run(fn):
-       if not os.path.exists(RUNDIR + fn):
-           return None
-       return load(RUNDIR + fn)
-
-In this case, we've written ``load_run`` to look in a specific directory to see
-if it can find an output with the given name.  So now we can write scripts that
-use this function:
-
-.. code-block:: python
-
-   from yt.mods import *
-
-   my_run = load_run("hotgasflow/DD0040/DD0040")
-
-And because we have imported from ``yt.mods`` we have access to the
-``load_run`` function defined in our plugin file.

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/a8c48dfbdd88/
Changeset:   a8c48dfbdd88
User:        MatthewTurk
Date:        2013-10-28 20:10:13
Summary:     Moving getting involved around.
Affected #:  5 files

diff -r 38edb65efe940bac12681b6508f7d47e2caebb27 -r a8c48dfbdd886a2f362f24c14e335f281470517e source/analyzing/index.rst
--- a/source/analyzing/index.rst
+++ b/source/analyzing/index.rst
@@ -9,4 +9,6 @@
    creating_derived_fields
    generating_processed_data
    time_series_analysis
+   external_analysis
+   parallel_computation
    analysis_modules/index

diff -r 38edb65efe940bac12681b6508f7d47e2caebb27 -r a8c48dfbdd886a2f362f24c14e335f281470517e source/developing/index.rst
--- /dev/null
+++ b/source/developing/index.rst
@@ -0,0 +1,18 @@
+Getting Involved with yt
+========================
+
+There are many ways to get involved with yt -- participating in the mailing
+list, helping people out in IRC, providing suggestions for the documentation,
+and contributing code!
+
+.. toctree::
+   :maxdepth: 2
+    
+   intro
+   developing
+   testing
+   debugdrive
+   creating_datatypes
+   creating_derived_fields
+   creating_derived_quantities
+   creating_frontend

diff -r 38edb65efe940bac12681b6508f7d47e2caebb27 -r a8c48dfbdd886a2f362f24c14e335f281470517e source/developing/intro.rst
--- /dev/null
+++ b/source/developing/intro.rst
@@ -0,0 +1,156 @@
+.. _getting-involved:
+
+Getting Involved
+================
+
+There are *lots* of ways to get involved with yt, as a community and as a
+technical system -- not all of them just contributing code, but also
+participating in the community, helping us with designing the websites, adding
+documentation, and sharing your scripts with others.
+
+Coding is only one way to be involved!
+
+Communication Channels
+----------------------
+
+There are four main communication channels for yt:
+
+ * We have an IRC channel, on ``irc.freenode.net`` in ``#yt``, which can be a
+   bit less on-topic than the mailing lists.  You can connect through our web
+   gateway without any special client, at http://yt-project.org/irc.html .
+   *IRC is the first stop for conversation!*
+ * `yt-users <http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_
+   is a relatively high-traffic mailing list where people are encouraged to ask
+   questions about the code, figure things out and so on.
+ * `yt-dev <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_ is
+   a much lower-traffic mailing list designed to focus on discussions of
+   improvements to the code, ideas about planning, development issues, and so
+   on.
+ * `yt-svn <http://lists.spacepope.org/listinfo.cgi/yt-svn-spacepope.org>`_ is
+   the (now-inaccurately titled) mailing list where all pushes to the primary
+   repository are sent.
+
+The easiest way to get involved with yt is to read the mailing lists, hang out
+in IRC, and participate.  If someone asks a question you know the answer to (or
+have your own question about!) write back and answer it.
+
+If you have an idea about something, suggest it!  We not only welcome
+participation, we encourage it.
+
+.. _share-your-scripts:
+
+Share Your Scripts
+------------------
+
+The next easiest way to get involved with yt is to participate in the `yt Hub
+<http://hub.yt-project.org/>`_.  This is a place where scripts, paper
+repositories, documents and so on can be submitted to share with the broader
+community.
+
+If you have a repository on `BitBucket <https://bitbucket.org/>`_ then you can
+simply submit it through the ytHub submit link.   Otherwise, we provide the
+``yt hubsubmit`` command, which will guide you through the process of creating
+a mercurial repository, uploading it to BitBucket, and then submitting it
+directly to the Hub.
+
+This is one of the best ways to get involved in the community!  We would love
+to have more examples that show complex or advanced behavior -- and if you have
+used such scripts to write a paper, that too would be an amazing contribution.
+
+Documentation and Screencasts
+-----------------------------
+
+The yt documentation -- which you are reading right now -- is constantly being
+updated, and it is a task we would very much appreciate assistance with.
+Whether that is adding a section, updating an outdated section, contributing
+typo or grammatical fixes, adding a FAQ, or increasing coverage of
+functionality, it would be very helpful if you wanted to help out.
+
+The easiest way to help out is to fork the repository:
+
+http://hg.yt-project.org/yt-doc/fork
+
+and then make your changes in your own fork.  When you are done, issue a pull
+request through the website for your new fork, and we can comment back and
+forth and eventually accept your changes.
+
+One of the more interesting things we have been attempting lately is to add
+screencasts to the documentation -- these are recordings of people executing
+sessions in a terminal or in a web browser, showing off functionality and
+describing how to do various things.  These provide a more dynamic and
+engaging way of demonstrating functionality and teaching methods.
+
+One easy place to record screencasts is with `Screencast-O-Matic
+<http://www.screencast-o-matic.com/>`_ but there are many to choose from.  Once
+you have recorded it, let us know and be sure to add it to the
+`yt Vimeo group <http://vimeo.com/groups/ytgallery>`_.  We'll then link to it
+from the documentation!
+
+Gallery Images and Videos
+-------------------------
+
+If you have an image or video you'd like to display in the image or video
+galleries, getting it included is easy!  For the image, you can either fork the
+`yt homepage repository <http://bitbucket.org/MatthewTurk/yt-homepage>`_ and
+add it there, or email it to us and we'll add it to the `Image Gallery
+<http://hg.yt-project.org/yt/wiki/ImageGallery>`_.  If you have a video, just
+add it to the `yt Vimeo group <http://vimeo.com/groups/ytgallery>`_.
+
+We're eager to show off the images you make with yt, so please feel free to
+drop `us <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_ a
+line and let us know if you've got something great!
+
+Technical Contributions
+-----------------------
+
+Contributing code is another excellent way to participate -- whether it's
+bug fixes, new features, analysis modules, or a new code frontend.  See 
+:ref:`creating_frontend` for more details.
+
+The process is pretty simple: fork on BitBucket, make changes, issue a pull
+request.  We can then go back and forth with comments in the pull request, but
+usually we end up accepting.
+
+For more information, see :ref:`contributing-code`, where we spell out how to
+get up and running with a development environment, how to commit, and how to
+use BitBucket.
+
+Online Presence
+---------------
+
+Some of these fall under the other items, but if you'd like to help out with
+the website or any of the other ways yt is presented online, please feel free!
+Almost everything is kept in hg repositories on BitBucket, and it is very easy
+to fork and contribute back changes.
+
+Please feel free to dig in and contribute changes.
+
+Word of Mouth
+-------------
+
+If you're using yt and it has increased your productivity, please feel
+encouraged to share that information.  Cite our `paper
+<http://adsabs.harvard.edu/abs/2011ApJS..192....9T>`_, tell your colleagues,
+and just spread word of mouth.  By telling people about your successes, you'll
+help bring more eyes and hands to the table -- in this manner, by increasing
+participation, collaboration, and simply spreading the limits of what the code
+is asked to do, we hope to help scale the utility and capability of yt with the
+community size.
+
+Feel free to `blog <http://blog.yt-project.org/>`_ about, `tweet
+<http://twitter.com/yt_astro>`_ about and talk about what you are up to!
+
+Long-Term Projects
+------------------
+
+There are some wild-eyed, out-there ideas that have been bandied about for the
+future directions of yt -- some of them even written into the mission
+statement.  The ultimate goal is to move past simple analysis and visualization
+of data and begin to approach it from the other side: generating data and
+running solvers.  We also hope to increase its ability to act as an in situ
+analysis code, by presenting a unified protocol.  Other projects include
+interfacing with ParaView and VisIt, creating a web GUI for running
+simulations, creating a run-tracker that follows simulations in progress, a
+federated database for simulation outputs, and so on and so forth.
+
+yt is an ambitious project.  Let's be ambitious together.

diff -r 38edb65efe940bac12681b6508f7d47e2caebb27 -r a8c48dfbdd886a2f362f24c14e335f281470517e source/getting_involved/index.rst
--- a/source/getting_involved/index.rst
+++ /dev/null
@@ -1,156 +0,0 @@
-.. _getting-involved:
-
-Getting Involved
-================
-
-There are *lots* of ways to get involved with yt, as a community and as a
-technical system -- not all of them just contributing code, but also
-participating in the community, helping us with designing the websites, adding
-documentation, and sharing your scripts with others.
-
-Coding is only one way to be involved!
-
-Communication Channels
-----------------------
-
-There are four main communication channels for yt:
-
- * We have an IRC channel, on ``irc.freenode.net`` in ``#yt``, which can be a
-   bit less on-topic than the mailing lists.  You can connect through our web
-   gateway without any special client, at http://yt-project.org/irc.html .
-   *IRC is the first stop for conversation!*
- * `yt-users <http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_
-   is a relatively high-traffic mailing list where people are encouraged to ask
-   questions about the code, figure things out and so on.
- * `yt-dev <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_ is
-   a much lower-traffic mailing list designed to focus on discussions of
-   improvements to the code, ideas about planning, development issues, and so
-   on.
- * `yt-svn <http://lists.spacepope.org/listinfo.cgi/yt-svn-spacepope.org>`_ is
-   the (now-inaccurately titled) mailing list where all pushes to the primary
-   repository are sent.
-
-The easiest way to get involved with yt is to read the mailing lists, hang out
-in IRC, and participate.  If someone asks a question you know the answer to (or
-have your own question about!) write back and answer it.
-
-If you have an idea about something, suggest it!  We not only welcome
-participation, we encourage it.
-
-.. _share-your-scripts:
-
-Share Your Scripts
-------------------
-
-The next easiest way to get involved with yt is to participate in the `yt Hub
-<http://hub.yt-project.org/>`_.  This is a place where scripts, paper
-repositories, documents and so on can be submitted to share with the broader
-community.
-
-If you have a repository on `BitBucket <https://bitbucket.org/>`_ then you can
-simply submit it through the ytHub submit link.   Otherwise, we provide the
-``yt hubsubmit`` command, which will guide you through the process of creating
-a mercurial repository, uploading it to BitBucket, and then submitting it
-directly to the Hub.
-
-This is one of the best ways to get involved in the community!  We would love
-to have more examples that show complex or advanced behavior -- and if you have
-used such scripts to write a paper, that too would be an amazing contribution.
-
-Documentation and Screencasts
------------------------------
-
-The yt documentation -- which you are reading right now -- is constantly being
-updated, and it is a task we would very much appreciate assistance with.
-Whether that is adding a section, updating an outdated section, contributing
-typo or grammatical fixes, adding a FAQ, or increasing coverage of
-functionality, it would be very helpful if you wanted to help out.
-
-The easiest way to help out is to fork the repository:
-
-http://hg.yt-project.org/yt-doc/fork
-
-and then make your changes in your own fork.  When you are done, issue a pull
-request through the website for your new fork, and we can comment back and
-forth and eventually accept your changes.
-
-One of the more interesting things we have been attempting lately is to add
-screencasts to the documentation -- these are recordings of people executing
-sessions in a terminal or in a web browser, showing off functionality and
-describing how to do various things.  These provide a more dynamic and
-engaging way of demonstrating functionality and teaching methods.
-
-One easy place to record screencasts is with `Screencast-O-Matic
-<http://www.screencast-o-matic.com/>`_ but there are many to choose from.  Once
-you have recorded it, let us know and be sure to add it to the
-`yt Vimeo group <http://vimeo.com/groups/ytgallery>`_.  We'll then link to it
-from the documentation!
-
-Gallery Images and Videos
--------------------------
-
-If you have an image or video you'd like to display in the image or video
-galleries, getting it included is easy!  For the image, you can either fork the
-`yt homepage repository <http://bitbucket.org/MatthewTurk/yt-homepage>`_ and
-add it there, or email it to us and we'll add it to the `Image Gallery
-<http://hg.yt-project.org/yt/wiki/ImageGallery>`_.  If you have a video, just
-add it to the `yt Vimeo group <http://vimeo.com/groups/ytgallery>`_.
-
-We're eager to show off the images you make with yt, so please feel free to
-drop `us <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_ a
-line and let us know if you've got something great!
-
-Technical Contributions
------------------------
-
-Contributing code is another excellent way to participate -- whether it's
-bug fixes, new features, analysis modules, or a new code frontend.  See 
-:ref:`creating_frontend` for more details.
-
-The process is pretty simple: fork on BitBucket, make changes, issue a pull
-request.  We can then go back and forth with comments in the pull request, but
-usually we end up accepting.
-
-For more information, see :ref:`contributing-code`, where we spell out how to
-get up and running with a development environment, how to commit, and how to
-use BitBucket.
-
-Online Presence
----------------
-
-Some of these fall under the other items, but if you'd like to help out with
-the website or any of the other ways yt is presented online, please feel free!
-Almost everything is kept in hg repositories on BitBucket, and it is very easy
-to fork and contribute back changes.
-
-Please feel free to dig in and contribute changes.
-
-Word of Mouth
--------------
-
-If you're using yt and it has increased your productivity, please feel
-encouraged to share that information.  Cite our `paper
-<http://adsabs.harvard.edu/abs/2011ApJS..192....9T>`_, tell your colleagues,
-and just spread word of mouth.  By telling people about your successes, you'll
-help bring more eyes and hands to the table -- in this manner, by increasing
-participation, collaboration, and simply spreading the limits of what the code
-is asked to do, we hope to help scale the utility and capability of yt with the
-community size.
-
-Feel free to `blog <http://blog.yt-project.org/>`_ about, `tweet
-<http://twitter.com/yt_astro>`_ about and talk about what you are up to!
-
-Long-Term Projects
-------------------
-
-There are some wild-eyed, out-there ideas that have been bandied about for the
-future directions of yt -- some of them even written into the mission
-statement.  The ultimate goal is to move past simple analysis and visualization
-of data and begin to approach it from the other side: generating data and
-running solvers.  We also hope to increase its ability to act as an in situ
-analysis code, by presenting a unified protocol.  Other projects include
-interfacing with ParaView and VisIt, creating a web GUI for running
-simulations, creating a run-tracker that follows simulations in progress, a
-federated database for simulation outputs, and so on and so forth.
-
-yt is an ambitious project.  Let's be ambitious together.

diff -r 38edb65efe940bac12681b6508f7d47e2caebb27 -r a8c48dfbdd886a2f362f24c14e335f281470517e source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -36,5 +36,5 @@
    examining/index
    visualizing/index
    analyzing/index
-   getting_involved/index
+   developing/index
    reference/index


https://bitbucket.org/yt_analysis/yt-doc/commits/10234ebc9e63/
Changeset:   10234ebc9e63
User:        MatthewTurk
Date:        2013-10-28 20:12:15
Summary:     Adding some things to the hgignore
Affected #:  1 file

diff -r a8c48dfbdd886a2f362f24c14e335f281470517e -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -6,3 +6,4 @@
 _temp/*
 **/.DS_Store
 RD0005-mine/*
+source/bootcamp/.ipynb_checkpoints/


https://bitbucket.org/yt_analysis/yt-doc/commits/3d2f5202b07e/
Changeset:   3d2f5202b07e
User:        MatthewTurk
Date:        2013-10-28 20:18:12
Summary:     Merging from Nathan
Affected #:  16 files

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 extensions/notebook_sphinxext.py
--- /dev/null
+++ b/extensions/notebook_sphinxext.py
@@ -0,0 +1,151 @@
+import os, shutil, string
+from sphinx.util.compat import Directive
+from docutils import nodes
+from docutils.parsers.rst import directives
+from IPython.nbconvert import html, python
+from runipy.notebook_runner import NotebookRunner
+from jinja2 import FileSystemLoader
+
+class NotebookDirective(Directive):
+    """Insert an evaluated notebook into a document
+
+    This uses runipy and nbconvert to transform a path to an unevaluated notebook
+    into html suitable for embedding in a Sphinx document.
+    """
+    required_arguments = 1
+    optional_arguments = 0
+
+    def run(self):
+        # check if raw html is supported
+        if not self.state.document.settings.raw_enabled:
+            raise self.warning('"%s" directive disabled.' % self.name)
+
+        # get path to notebook
+        source_dir = os.path.dirname(
+            os.path.abspath(self.state.document.current_source))
+        nb_basename = os.path.basename(self.arguments[0])
+        rst_file = self.state_machine.document.attributes['source']
+        rst_dir = os.path.abspath(os.path.dirname(rst_file))
+        nb_abs_path = os.path.join(rst_dir, nb_basename)
+
+        # Move files around.
+        dest_dir = os.path.abspath(os.path.join(setup.app.builder.outdir,
+                                                os.path.dirname(nb_abs_path)))
+        if not os.path.exists(dest_dir):
+            os.makedirs(dest_dir)
+
+        rel_dir = os.path.relpath(rst_dir, setup.confdir)
+        place = os.path.join(dest_dir, rel_dir)
+        if not os.path.isdir(place): os.makedirs(place)
+        dest_path = os.path.join(place, nb_basename)
+        dest_path_eval = string.replace(dest_path, '.ipynb', '_evaluated.ipynb')
+        dest_path_script = string.replace(dest_path, '.ipynb', '.py')
+
+        # Copy unevaluated script
+        try:
+            shutil.copyfile(nb_abs_path, dest_path)
+        except IOError:
+            raise RuntimeError("Unable to copy notebook to build destination.")
+
+        # Create python script version
+        unevaluated_text = nb_to_html(nb_abs_path)
+        script_text = nb_to_python(nb_abs_path)
+        f = open(dest_path_script, 'w')
+        f.write(script_text.encode('utf8'))
+        f.close()
+
+        # Create evaluated version and save it to the dest path.
+        # Always use --pylab so figures appear inline
+        # perhaps this is questionable?
+        nb_runner = NotebookRunner(nb_in=nb_abs_path, pylab=True)
+        nb_runner.run_notebook()
+        nb_runner.save_notebook(dest_path_eval)
+        evaluated_text = nb_to_html(dest_path_eval)
+
+        # Create link to notebook and script files
+        link_rst = "(" + \
+                   formatted_link(dest_path) + "; " + \
+                   formatted_link(dest_path_eval) + "; " + \
+                   formatted_link(dest_path_script) + \
+                   ")"
+
+        self.state_machine.insert_input([link_rst], rst_file)
+
+        # create notebook node
+        attributes = {'format': 'html', 'source': 'nb_path'}
+        nb_node = nodes.raw('', evaluated_text, **attributes)
+        (nb_node.source, nb_node.line) = \
+            self.state_machine.get_source_and_line(self.lineno)
+
+        # add dependency
+        self.state.document.settings.record_dependencies.add(nb_abs_path)
+
+        return [nb_node]
+
+class notebook_node(nodes.raw):
+    pass
+
+def nb_to_python(nb_path):
+    """convert notebook to python script"""
+    exporter = python.PythonExporter()
+    output, resources = exporter.from_filename(nb_path)
+    return output
+
+def nb_to_html(nb_path):
+    """convert notebook to html"""
+    exporter = html.HTMLExporter(template_file='full')
+    output, resources = exporter.from_filename(nb_path)
+    header = output.split('<head>', 1)[1].split('</head>',1)[0]
+    body = output.split('<body>', 1)[1].split('</body>',1)[0]
+
+    # http://imgur.com/eR9bMRH
+    header = header.replace('<style', '<style scoped="scoped"')
+    header = header.replace('body{background-color:#ffffff;}\n', '')
+    header = header.replace('body{background-color:white;position:absolute;'
+                            'left:0px;right:0px;top:0px;bottom:0px;'
+                            'overflow:visible;}\n', '')
+    header = header.replace('body{margin:0;'
+                            'font-family:"Helvetica Neue",Helvetica,Arial,'
+                            'sans-serif;font-size:13px;line-height:20px;'
+                            'color:#000000;background-color:#ffffff;}', '')
+    header = header.replace('\na{color:#0088cc;text-decoration:none;}', '')
+    header = header.replace(
+        'a:focus{color:#005580;text-decoration:underline;}', '')
+    header = header.replace(
+        '\nh1,h2,h3,h4,h5,h6{margin:10px 0;font-family:inherit;font-weight:bold;'
+        'line-height:20px;color:inherit;text-rendering:optimizelegibility;}'
+        'h1 small,h2 small,h3 small,h4 small,h5 small,'
+        'h6 small{font-weight:normal;line-height:1;color:#999999;}'
+        '\nh1,h2,h3{line-height:40px;}\nh1{font-size:35.75px;}'
+        '\nh2{font-size:29.25px;}\nh3{font-size:22.75px;}'
+        '\nh4{font-size:16.25px;}\nh5{font-size:13px;}'
+        '\nh6{font-size:11.049999999999999px;}\nh1 small{font-size:22.75px;}'
+        '\nh2 small{font-size:16.25px;}\nh3 small{font-size:13px;}'
+        '\nh4 small{font-size:13px;}', '')
+    header = header.replace('background-color:#ffffff;', '', 1)
+
+    # concatenate raw html lines
+    lines = ['<div class="ipynotebook">']
+    lines.append(header)
+    lines.append(body)
+    lines.append('</div>')
+    return '\n'.join(lines)
+
+def formatted_link(path):
+    return "`%s <%s>`__" % (os.path.basename(path), path)
+
+def visit_notebook_node(self, node):
+    self.visit_raw(node)
+
+def depart_notebook_node(self, node):
+    self.depart_raw(node)
+
+def setup(app):
+    setup.app = app
+    setup.config = app.config
+    setup.confdir = app.confdir
+
+    app.add_node(notebook_node,
+                 html=(visit_notebook_node, depart_notebook_node))
+
+    app.add_directive('notebook', NotebookDirective)
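+
+# Example usage from a .rst document (a sketch; the notebook path is
+# resolved relative to the document that contains the directive):
+#
+#   .. notebook:: Data_Inspection.ipynb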

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/Data_Inspection.ipynb
--- /dev/null
+++ b/source/bootcamp/Data_Inspection.ipynb
@@ -0,0 +1,396 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Starting Out and Loading Data\n",
+      "\n",
+      "We're going to get started by loading up yt.  This next command brings all of the libraries into memory and sets up our environment.  Note that in most scripts, you will want to import from ``yt.mods`` rather than ``yt.imods``.  But using ``yt.imods`` gets you some nice stuff for the IPython notebook, which we'll use below."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now that we've loaded yt, we can load up some data.  Let's load the `IsolatedGalaxy` dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Fields and Facts\n",
+      "\n",
+      "When you call the `load` function, yt tries to do very little -- this is designed to be a fast operation, just setting up some information about the simulation.  Now, the first time you access the \"hierarchy\" (shorthand is `.h`) it will read and load the mesh and then determine where data is placed in the physical domain and on disk.  Once it knows that, yt can tell you some statistics about the simulation:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.print_stats()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also tell you the fields it found on disk:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.field_list"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "And, all of the fields it thinks it knows how to generate:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.derived_field_list"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also transparently generate fields.  However, we encourage you to examine exactly what yt is doing when it generates those fields.  To see, you can ask for the source of a given field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.field_info[\"VorticityX\"].get_source()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt stores information about the domain of the simulation:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.domain_width"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also convert this into various units:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.domain_width * pf[\"kpc\"]\n",
+      "print pf.domain_width * pf[\"au\"]\n",
+      "print pf.domain_width * pf[\"miles\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Mesh Structure\n",
+      "\n",
+      "If you're using a simulation type that has grids (for instance, here we're using an Enzo simulation) you can examine the structure of the mesh.  For the most part, you probably won't have to use this unless you're debugging a simulation or examining in detail what is going on."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.grid_left_edge"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "But, you may have to access information about individual grid objects!  Each grid object mediates accessing data from the disk and has a number of attributes that tell you about it.  The hierarchy (`pf.h` here) has an attribute `grids` which is all of the grid objects."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.grids[0]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g = pf.h.grids[0]\n",
+      "print g"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Grids have dimensions, extents, level, and even a list of Child grids."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.ActiveDimensions"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.LeftEdge, g.RightEdge"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.Level"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.Children"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Advanced Grid Inspection\n",
+      "\n",
+      "If we want to examine grids only at a given level, we can!  Not only that, but we can load data and take a look at various fields.\n",
+      "\n",
+      "*This section can be skipped!*"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "gs = pf.h.select_grids(pf.h.max_level)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g2 = gs[0]\n",
+      "print g2\n",
+      "print g2.Parent\n",
+      "print g2.get_global_startindex()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print g2[\"Density\"][:,:,0]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print (g2.Parent.child_mask == 0).sum() * 8\n",
+      "print g2.ActiveDimensions.prod()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for f in pf.h.field_list:\n",
+      "    fv = g[f]\n",
+      "    if fv.size == 0: continue\n",
+      "    print f, fv.min(), fv.max()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for f in sorted(pf.h.field_list):\n",
+      "    fv = g[f]\n",
+      "    if fv.size == 0: continue\n",
+      "    print f, fv.min(), fv.max()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Examining Data in Regions\n",
+      "\n",
+      "yt provides data object selectors.  In subsequent notebooks we'll examine these in more detail, but we can select a sphere of data and perform a number of operations on it.  yt makes it easy to operate on fluid fields in an object in *bulk*, but you can also examine individual field values.\n",
+      "\n",
+      "This creates a sphere selector positioned at the most dense point in the simulation that has a radius of 10 kpc."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp = pf.h.sphere(\"max\", (10, 'kpc'))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
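+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "As a quick sketch of examining individual field values (rather than bulk operations), the sphere can hand back the raw values for every cell it contains.  This assumes the `Density` field exists in this dataset, as it does for Enzo data."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp[\"Density\"].size\n",
+      "print sp[\"Density\"].min(), sp[\"Density\"].max()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },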
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can calculate a bunch of bulk quantities.  Here's that list, but there's a list in the docs, too!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp.quantities.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Let's look at the total mass.  This is how you call a given quantity.  yt calls these \"Derived Quantities\".  We'll talk about a few in a later notebook."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp.quantities[\"TotalMass\"]()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/Data_Objects_and_Time_Series.ipynb
--- /dev/null
+++ b/source/bootcamp/Data_Objects_and_Time_Series.ipynb
@@ -0,0 +1,361 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Data Objects and Time Series Data\n",
+      "\n",
+      "Just like before, we will load up yt."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Time Series Data\n",
+      "\n",
+      "Unlike before, instead of loading a single dataset, this time we'll load a bunch which we'll examine in sequence.  This command creates a `TimeSeriesData` object, which can be iterated over (including in parallel, which is outside the scope of this bootcamp) and analyzed.  There are some other helpful operations it can provide, but we'll stick to the basics here.\n",
+      "\n",
+      "Note that you can specify either a list of filenames, or a glob (i.e., asterisk) pattern in this."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "ts = TimeSeriesData.from_filenames(\"enzo_tiny_cosmology/*/*.hierarchy\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
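+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "As a sketch of the list-of-filenames form, you could instead pass explicit paths (these particular outputs are hypothetical and may not all exist in your copy of the dataset):"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# Hypothetical explicit list; the glob form above is the equivalent shortcut.\n",
+      "fns = [\"enzo_tiny_cosmology/DD0000/DD0000.hierarchy\",\n",
+      "       \"enzo_tiny_cosmology/DD0046/DD0046.hierarchy\"]\n",
+      "ts_list = TimeSeriesData.from_filenames(fns)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },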
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Example 1: Simple Time Series\n",
+      "\n",
+      "As a simple example of how we can use this functionality, let's find the min and max of the density as a function of time in this simulation.  To do this we use the construction `for pf in ts` where `pf` means \"Parameter File\" and `ts` is the \"Time Series\" we just loaded up.  For each parameter file, we'll create an object (`dd`) that covers the entire domain.  (`all_data` is a shorthand function for this.)  We'll then call the Derived Quantity `Extrema`, and append the min and max to our extrema outputs."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "rho_ex = []\n",
+      "times = []\n",
+      "for pf in ts:\n",
+      "    dd = pf.h.all_data()\n",
+      "    rho_ex.append(dd.quantities[\"Extrema\"](\"Density\")[0])\n",
+      "    times.append(pf.current_time * pf[\"years\"])\n",
+      "rho_ex = np.array(rho_ex)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we plot the minimum and the maximum:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.semilogy(times, rho_ex[:,0], '-xk')\n",
+      "pylab.semilogy(times, rho_ex[:,1], '-xr')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Example 2: Advanced Time Series\n",
+      "\n",
+      "Let's do something a bit different.  Let's calculate the total mass inside halos and outside halos.\n",
+      "\n",
+      "This actually touches a lot of different pieces of machinery in yt.  For every parameter file, we will run the halo finder HOP.  Then, we calculate the total mass in the domain.  Then, for each halo, we calculate the sum of the baryon mass in that halo.  We'll keep running tallies of these two things."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "mass = []\n",
+      "zs = []\n",
+      "for pf in ts:\n",
+      "    halos = HaloFinder(pf)\n",
+      "    dd = pf.h.all_data()\n",
+      "    total_mass = dd.quantities[\"TotalQuantity\"](\"CellMassMsun\")[0]\n",
+      "    total_in_baryons = 0.0\n",
+      "    for halo in halos:\n",
+      "        sp = halo.get_sphere()\n",
+      "        total_in_baryons += sp.quantities[\"TotalQuantity\"](\"CellMassMsun\")[0]\n",
+      "    mass.append(total_in_baryons/total_mass)\n",
+      "    zs.append(pf.current_redshift)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now let's plot them!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.loglog(zs, mass, '-xb')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Data Objects\n",
+      "\n",
+      "Time series data have many applications, but most of them rely on examining the underlying data in some way.  Below, we'll see how to use and manipulate data objects.\n",
+      "\n",
+      "### Ray Queries\n",
+      "\n",
+      "yt provides the ability to examine rays, or lines, through the domain.  Note that these are not periodic, unlike most other data objects.  We create a ray object and can then examine quantities of it.  Rays have the special fields `t` and `dts`, which correspond to the time the ray enters a given cell and the distance it travels through that cell.\n",
+      "\n",
+      "To create a ray, we specify the start and end points."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "ray = pf.h.ray([0.1, 0.2, 0.3], [0.9, 0.8, 0.7])\n",
+      "pylab.semilogy(ray[\"t\"], ray[\"Density\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"dts\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"t\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"x\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Slice Queries\n",
+      "\n",
+      "While slices are often used for visualization, they can be useful for other operations as well.  yt regards slices as multi-resolution objects.  They are an array of cells that are not all the same size; it only returns the cells at the highest resolution that it intersects.  (This is true for all yt data objects.)  Slices and projections have the special fields `px`, `py`, `pdx` and `pdy`, which correspond to the coordinates and half-widths in the pixel plane."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "v, c = pf.h.find_max(\"Density\")\n",
+      "sl = pf.h.slice(0, c[0])\n",
+      "print sl[\"x\"], sl[\"z\"], sl[\"pdx\"]\n",
+      "print sl[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to do something interesting with a Slice, we can turn it into a `FixedResolutionBuffer`.  This object can be queried and will return a 2D array of values."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "frb = sl.to_frb((50.0, 'kpc'), 1024)\n",
+      "print frb[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt provides a few functions for writing arrays to disk, particularly in image form.  Here we'll write out the log of Density, and then use IPython to display it back here.  Note that for the most part, you will probably want to use a `PlotWindow` for this, but in the case that it is useful you can directly manipulate the data."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "write_image(np.log10(frb[\"Density\"]), \"temp.png\")\n",
+      "from IPython.core.display import Image\n",
+      "Image(filename = \"temp.png\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Off-Axis Slices\n",
+      "\n",
+      "yt provides not only slices, but off-axis slices that are sometimes called \"cutting planes.\"  These are specified by (in order) a normal vector and a center.  Here we've set the normal vector to `[0.2, 0.3, 0.5]` and the center to be the point of maximum density.\n",
+      "\n",
+      "We can then turn these directly into plot windows using `to_pw`.  Note that the `to_pw` and `to_frb` methods are available on slices, off-axis slices, and projections, and can be used on any of them."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cp = pf.h.cutting([0.2, 0.3, 0.5], \"max\")\n",
+      "pw = cp.to_pw(fields = [\"Density\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once we have our plot window from our cutting plane, we can show it here."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pw.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can, as noted above, do the same with our slice:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pws = sl.to_pw(fields=[\"Density\"])\n",
+      "pws.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Covering Grids\n",
+      "\n",
+      "If we want to access a 3D array of data that spans multiple resolutions in our simulation, we can use a covering grid.  This will return a 3D array of data, drawing from up to the resolution level specified when creating the data.  For example, if you create a covering grid that spans two child grids of a single parent grid, it will fill those zones covered by a zone of a child grid with the data from that child grid.  Where it is covered only by the parent grid, the cells from the parent grid will be duplicated (appropriately) to fill the covering grid.\n",
+      "\n",
+      "There are two different types of covering grids: unsmoothed and smoothed.  Smoothed grids will be filled through a cascading interpolation process; they will be filled at level 0, interpolated to level 1, filled at level 1, interpolated to level 2, filled at level 2, etc.  This will help to reduce edge effects.  Unsmoothed covering grids will not be interpolated, but rather values will be duplicated multiple times.\n",
+      "\n",
+      "Here we create an unsmoothed covering grid at level 2, with the left edge at `[0.0, 0.0, 0.0]` and with dimensions equal to those that would cover the entire domain at level 2.  We can then ask for the Density field, which will be a 3D array."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cg = pf.h.covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "print cg[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this example, we do exactly the same thing: except we ask for a *smoothed* covering grid, which will reduce edge effects."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "scg = pf.h.smoothed_covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "print scg[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/Derived_Fields_and_Profiles.ipynb
--- /dev/null
+++ b/source/bootcamp/Derived_Fields_and_Profiles.ipynb
@@ -0,0 +1,316 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Derived Fields and Profiles\n",
+      "\n",
+      "One of the most powerful features in yt is the ability to create derived fields that act and look exactly like fields that exist on disk.  This means that they will be generated on demand and can be used anywhere a field that exists on disk would be used.  Additionally, you can create them by just writing python functions."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [],
+     "prompt_number": 1
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Derived Fields\n",
+      "\n",
+      "This is an example of the simplest possible way to create a derived field.  All derived fields are defined by a function and some metadata; that metadata can include units, LaTeX-friendly names, conversion factors, and so on.  Fields can be defined in the way in the next cell.  What this does is create a function which accepts two arguments and then provide the units for that field.  In this case, our field is `Dinosaurs` and our units are `Trex/s`.  The function itself can access any fields that are in the simulation, and it does so by requesting data from the object called `data`."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "@derived_field(units = \"Trex/s\")\n",
+      "def Dinosaurs(field, data):\n",
+      "    return data[\"Density\"]**(2.0/3.0) * data[\"VelocityMagnitude\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [],
+     "prompt_number": 2
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "One important thing to note is that derived fields must be defined *before* any datasets are loaded.  Let's load up our data and take a look at some quantities."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "dd = pf.h.all_data()\n",
+      "print dd.quantities.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [
+      {
+       "output_type": "stream",
+       "stream": "stdout",
+       "text": [
+        "['MinLocation', 'StarAngularMomentumVector', 'WeightedVariance', 'TotalMass', 'AngularMomentumVector', 'TotalQuantity', 'IsBound', 'WeightedAverageQuantity', 'CenterOfMass', 'BulkVelocity', 'ParticleSpinParameter', 'Action', 'Extrema', 'MaxLocation', 'BaryonSpinParameter']\n"
+       ]
+      }
+     ],
+     "prompt_number": 4
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "One interesting question is, what are the minimum and maximum values of dinosaur production rates in our isolated galaxy?  We can do that by examining the `Extrema` quantity -- the exact same way that we would for Density, Temperature, and so on."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print dd.quantities[\"Extrema\"](\"Dinosaurs\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [
+      {
+       "output_type": "stream",
+       "stream": "stdout",
+       "text": [
+        "[(2.2146366774504352e-20, 9.1573883828992124e-09)]\n"
+       ]
+      }
+     ],
+     "prompt_number": 5
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can do the same for the average quantities as well."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print dd.quantities[\"WeightedAverageQuantity\"](\"Dinosaurs\", weight=\"Temperature\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## A Few Other Quantities\n",
+      "\n",
+      "We can ask other quantities of our data, as well.  For instance, this sequence of operations will find the most dense point, center a sphere on it, calculate the bulk velocity of that sphere, calculate the baryonic angular momentum vector, and then the density extrema.  All of this is done in a memory conservative way: if you have an absolutely enormous dataset, yt will split that dataset into pieces, apply intermediate reductions and then a final reduction to calculate your quantity."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp = pf.h.sphere(\"max\", (10.0, 'kpc'))\n",
+      "bv = sp.quantities[\"BulkVelocity\"]()\n",
+      "L = sp.quantities[\"AngularMomentumVector\"]()\n",
+      "(rho_min, rho_max), = sp.quantities[\"Extrema\"](\"Density\")\n",
+      "print bv, L, rho_min, rho_max"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Profiles\n",
+      "\n",
+      "yt provides the ability to bin in 1, 2 and 3 dimensions.  This means discretizing in one or more dimensions of phase space (density, temperature, etc) and then calculating either the total value of a field in each bin or the average value of a field in each bin.\n",
+      "\n",
+      "We do this using the objects `BinnedProfile1D`, `BinnedProfile2D`, and `BinnedProfile3D`.  The first two are the most common since they are the easiest to visualize.\n",
+      "\n",
+      "This first set of commands manually creates a `BinnedProfile1D` from the sphere we created earlier, binned in 32 bins according to density between `rho_min` and `rho_max`, and then takes the Density-weighted average of the fields `Temperature` and (previously-defined) `Dinosaurs`.  We then plot it in a loglog plot."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof = BinnedProfile1D(sp, 32, \"Density\", rho_min, rho_max)\n",
+      "prof.add_fields([\"Temperature\", \"Dinosaurs\"], weight=\"Density\")\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"Temperature\"], \"-x\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we plot the `Dinosaurs` field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.loglog(prof[\"Density\"], prof[\"Dinosaurs\"], '-x')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to see the total mass in every bin, we add the `CellMassMsun` field with no weight.  Specifying `weight=None` will simply take the total value in every bin and add that up."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof.add_fields([\"CellMassMsun\"], weight=None)\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"CellMassMsun\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can also specify accumulation, which sums all the bins, from left to right.  Note that for 2D and 3D profiles, this needs to be a tuple of length 2 or 3."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof.add_fields([\"CellMassMsun\"], weight=None, accumulation=True)\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"CellMassMsun\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Advanced Derived Fields\n",
+      "\n",
+      "*This section can be skipped!*\n",
+      "\n",
+      "You can also define fields that require extra zones.  This is useful, for instance, if you want to take the average, or apply a stencil.  yt provides fields like `DivV` that do this internally.  This example is a very busy example of how to do it.  You need to specify the validator `ValidateSpatial` with the number of extra zones *on each side* of the grid that you need, and then inside your function you need to return a field *with those zones stripped off*.  So by necessity, the arrays returned by `data[something]` will have larger spatial extent than what should be returned by the function itself.  If you specify that you need 0 extra zones, this will also work and will simply supply a `grid` object for the field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "@derived_field(name = \"AveragedTemperature\",\n",
+      "               validators = [ValidateSpatial(1)],\n",
+      "               units = r\"K\")\n",
+      "def _AveragedTemperature(field, data):\n",
+      "    nx, ny, nz = data[\"Temperature\"].shape\n",
+      "    new_field = na.zeros((nx-2,ny-2,nz-2), dtype='float64')\n",
+      "    weight_field = na.zeros((nx-2,ny-2,nz-2), dtype='float64')\n",
+      "    i_i, j_i, k_i = na.mgrid[0:3,0:3,0:3]\n",
+      "    for i,j,k in zip(i_i.ravel(),j_i.ravel(),k_i.ravel()):\n",
+      "        sl = [slice(i,nx-(2-i)),slice(j,ny-(2-j)),slice(k,nz-(2-k))]\n",
+      "        new_field += data[\"Temperature\"][sl] * data[\"CellMass\"][sl]\n",
+      "        weight_field += data[\"CellMass\"][sl]\n",
+      "    # Now some fancy footwork\n",
+      "    new_field2 = na.zeros((nx,ny,nz))\n",
+      "    new_field2[1:-1,1:-1,1:-1] = new_field/weight_field\n",
+      "    return new_field2"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now, once again, we can access `AveragedTemperature` just like any other field.  Note that because it requires ghost zones, this will be a much slower process!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "dd = pf.h.all_data()\n",
+      "(tmin, tmax), (atmin, atmax) = dd.quantities[\"Extrema\"]([\"Temperature\", \"AveragedTemperature\"])\n",
+      "print tmin, tmax, atmin, atmax\n",
+      "print tmin / atmin, tmax / atmax"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Field Parameters\n",
+      "\n",
+      "Field parameters are a method of passing information to derived fields.  For instance, you might pass in information about a vector you want to use as a basis for a coordinate transformation.  yt often uses things like `bulk_velocity` to identify velocities that should be subtracted off.  Here we show how that works:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp_small = pf.h.sphere(\"max\", (1.0, 'kpc'))\n",
+      "bv = sp_small.quantities[\"BulkVelocity\"]()\n",
+      "\n",
+      "sp = pf.h.sphere(\"max\", (0.1, 'mpc'))\n",
+      "rv1 = sp.quantities[\"Extrema\"](\"RadialVelocity\")\n",
+      "\n",
+      "sp.clear_data()\n",
+      "sp.set_field_parameter(\"bulk_velocity\", bv)\n",
+      "rv2 = sp.quantities[\"Extrema\"](\"RadialVelocity\")\n",
+      "\n",
+      "print bv\n",
+      "print rv1\n",
+      "print rv2"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/Introduction.ipynb
--- a/source/bootcamp/Introduction.ipynb
+++ b/source/bootcamp/Introduction.ipynb
@@ -45,7 +45,7 @@
       "\n",
       "## Acquiring the datasets for this tutorial\n",
       "\n",
-      "To access the datasets that are used in these bootcamp tutorials, you can either download them manually at http://yt-project.org/data/, or run this next cell by pressing `Shift-Enter` inside it.  It may take a few minutes.\n",
+      "To access the datasets that are used in these bootcamp tutorials, you can either download them manually at http://yt-project.org/data/.\n",
       "\n",
       "## What's Next?\n",
       "\n",
@@ -58,33 +58,6 @@
       "5. Derived Fields and Profiles\n",
       "6. Volume Rendering"
      ]
-    },
-    {
-     "cell_type": "code",
-     "collapsed": false,
-     "input": [
-      "!curl -sSO http://yt-project.org/data/enzo_tiny_cosmology.tar\n",
-      "print \"Got enzo_tiny_cosmology\"\n",
-      "!tar xf enzo_tiny_cosmology.tar\n",
-      "!curl -sSO http://yt-project.org/data/Enzo_64.tar\n",
-      "print \"Got Enzo_64\"\n",
-      "!tar xf Enzo_64.tar\n",
-      "!curl -sSO http://yt-project.org/data/IsolatedGalaxy.tar\n",
-      "print \"Got IsolatedGalaxy\"\n",
-      "!tar xf IsolatedGalaxy.tar\n",
-      "print \"All done!\""
-     ],
-     "language": "python",
-     "metadata": {},
-     "outputs": []
-    },
-    {
-     "cell_type": "code",
-     "collapsed": false,
-     "input": [],
-     "language": "python",
-     "metadata": {},
-     "outputs": []
     }
    ],
    "metadata": {}

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/Simple_Visualization.ipynb
--- /dev/null
+++ b/source/bootcamp/Simple_Visualization.ipynb
@@ -0,0 +1,274 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Simple Visualizations of Data\n",
+      "\n",
+      "Just like in our first notebook, we have to load yt and then some data."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For this notebook, we'll load up a cosmology dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "print \"Redshift =\", pf.current_redshift"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In the terms that yt uses, a projection is a line integral through the domain.  This can either be unweighted (in which case a column density is returned) or weighted, in which case an average value is returned.  Projections are, like all other data objects in yt, full-fledged data objects that churn through data and present that to you.  However, we also provide a simple method of creating Projections and plotting them in a single step.  This is called a Plot Window, here specifically known as a `ProjectionPlot`.  One thing to note is that in yt, we project all the way through the entire domain at a single time.  This means that the first call to projecting can be somewhat time consuming, but panning, zooming and plotting are all quite fast.\n",
+      "\n",
+      "yt is designed to make it easy to make nice plots and straightforward to modify those plots directly.  The cookbook in the documentation includes detailed examples of this."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p = ProjectionPlot(pf, \"y\", \"Density\")\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The `show` command simply sends the plot to the IPython notebook.  You can also call `p.save()` which will save the plot to the file system.  This function accepts an argument, which will be pre-prended to the filename and can be used to name it based on the width or to supply a location.\n",
+      "\n",
+      "Now we'll zoom and pan a bit."
+     ]
+    },
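+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# The prefix string here is arbitrary; saved filenames will start with it.\n",
+      "p.save(\"projection_example_\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },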
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(2.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.pan_rel((0.1, 0.0))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(10.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.pan_rel((-0.25, -0.5))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(0.1)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we specify multiple fields, each time we call `show` we get multiple plots back.  Same for `save`!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p = ProjectionPlot(pf, \"z\", [\"Density\", \"Temperature\"], weight_field=\"Density\")\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can adjust the colormap on a field-by-field basis."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.set_cmap(\"Temperature\", \"hot\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "And, we can re-center the plot on different locations.  One possible use of this would be to make a single `ProjectionPlot` which you move around to look at different regions in your simulation, saving at each one."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "v, c = pf.h.find_max(\"Density\")\n",
+      "p.set_center((c[0], c[1]))\n",
+      "p.zoom(10)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Okay, let's load up a bigger simulation (from `Enzo_64` this time) and make a slice plot."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"Enzo_64/DD0043/data0043\")\n",
+      "s = SlicePlot(pf, \"z\", [\"Density\", \"VelocityMagnitude\"], center=\"max\")\n",
+      "s.set_cmap(\"VelocityMagnitude\", \"kamae\")\n",
+      "s.zoom(10.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can adjust the logging of various fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.set_log(\"VelocityMagnitude\", True)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt provides many different annotations for your plots.  You can see all of these in the documentation, or if you type `s.annotate_` and press tab, a list will show up here.  We'll annotate with velocity arrows."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.annotate_velocity()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Contours can also be overlaid:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s = SlicePlot(pf, \"x\", [\"Density\"], center=\"max\")\n",
+      "s.annotate_contour(\"Temperature\")\n",
+      "s.zoom(2.5)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally, we can save out to the file system."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.save()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/Volume_Rendering.ipynb
--- /dev/null
+++ b/source/bootcamp/Volume_Rendering.ipynb
@@ -0,0 +1,95 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# A Brief Demo of Volume Rendering\n",
+      "\n",
+      "This shows a small amount of volume rendering.  Really, just enough to get your feet wet!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To create a volume rendering, we need a camera and a transfer function.  We'll use the `ColorTransferFunction`, which accepts (in log space) the minimum and maximum bounds of our transfer function.  This means behavior for data outside these values is undefined.\n",
+      "\n",
+      "We then add on \"layers\" like an onion.  This function can accept a width (here specified) in data units, and also a color map.  Here we add on four layers.\n",
+      "\n",
+      "Finally, we create a camera.  The focal point is `[0.5, 0.5, 0.5]`, the width is 20 kpc (including front-to-back integration) and we specify a transfer function.  Once we've done that, we call `show` to actually cast our rays and display them inline."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "tf = ColorTransferFunction((-28, -24))\n",
+      "tf.add_layers(4, w=0.01)\n",
+      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 20.0/pf['kpc'], 512, tf)\n",
+      "cam.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
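+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If you want to keep the image, the camera can also write it to disk with `snapshot`; the filename here is just an example."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cam.snapshot(\"volume_render.png\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },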
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to apply a clipping, we can specify the `clip_ratio`.  This will clip the upper bounds to this value times the `std()` of the image array."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cam.show(clip_ratio=4)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "There are several other options we can specify.  Note that here we have turned on the use of ghost zones, shortened the data interval for the transfer function, and widened our gaussian layers."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "tf = ColorTransferFunction((-28, -25))\n",
+      "tf.add_layers(4, w=0.03)\n",
+      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 20.0/pf['kpc'], 512, tf, no_ghost=False)\n",
+      "cam.show(clip_ratio=4.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/data_inspection.rst
--- /dev/null
+++ b/source/bootcamp/data_inspection.rst
@@ -0,0 +1,4 @@
+Data Inspection
+---------------
+
+.. notebook:: Data_Inspection.ipynb

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/data_objects_and_time_series.rst
--- /dev/null
+++ b/source/bootcamp/data_objects_and_time_series.rst
@@ -0,0 +1,4 @@
+Data Objects and Time Series
+----------------------------
+
+.. notebook:: Data_Objects_and_Time_Series.ipynb

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/derived_fields_and_profiles.rst
--- /dev/null
+++ b/source/bootcamp/derived_fields_and_profiles.rst
@@ -0,0 +1,4 @@
+Derived Fields and Profiles
+---------------------------
+
+.. notebook:: Derived_Fields_and_Profiles.ipynb

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/index.rst
--- /dev/null
+++ b/source/bootcamp/index.rst
@@ -0,0 +1,49 @@
+yt Bootcamp
+===========
+
+We have been developing a sequence of materials that can be run in the IPython
+notebook that walk through how to look at data and how to operate on data.
+These are not meant to be detailed walkthroughs, but simply short
+introductions.  Their purpose is to let you explore, interactively, some common
+operations that can be done on data with yt!
+
+To get started with the bootcamp, you need to download the repository and start
+the IPython notebook.  If you have mercurial installed, the easiest way to get
+the repository is:
+
+.. code-block:: bash
+
+   hg clone https://bitbucket.org/yt_analysis/yt-doc
+
+If you don't, you can download it from `here
+<https://bitbucket.org/yt_analysis/yt-doc/get/tip.tar.bz2>`_.
+
+Now you can start the IPython notebook and begin:
+
+.. code-block:: bash
+
+   cd yt-doc/source/bootcamp
+   yt notebook
+
+This command will give you information about the Notebook Server and how to
+access it.  Once you have done so, choose "Introduction" from the list of
+notebooks, which includes an introduction and information about how to download
+the sample data.
+
+.. warning:: The pre-filled notebooks are *far* less fun than running them
+             yourself!  Check out the repo and give it a try.
+
+Here are the notebooks, which have been filled in for inspection:
+
+.. toctree::
+   :maxdepth: 1
+
+   introduction
+   data_inspection
+   simple_visualization
+   data_objects_and_time_series
+   derived_fields_and_profiles
+   volume_rendering
+
+Let us know if you would like to contribute other example notebooks, or have
+any suggestions for how these can be improved.

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/introduction.rst
--- /dev/null
+++ b/source/bootcamp/introduction.rst
@@ -0,0 +1,4 @@
+Introduction
+------------
+
+.. notebook:: Introduction.ipynb

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/simple_visualization.rst
--- /dev/null
+++ b/source/bootcamp/simple_visualization.rst
@@ -0,0 +1,4 @@
+Simple Visualization
+--------------------
+
+.. notebook:: Simple_Visualization.ipynb

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/bootcamp/volume_rendering.rst
--- /dev/null
+++ b/source/bootcamp/volume_rendering.rst
@@ -0,0 +1,4 @@
+Volume Rendering
+----------------
+
+.. notebook:: Volume_Rendering.ipynb

diff -r 10234ebc9e63308c3b61c617d58a6df1c98acd9a -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -30,7 +30,7 @@
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx',
               'sphinx.ext.pngmath', 'sphinx.ext.viewcode',
               'sphinx.ext.autosummary', 'numpydocmod', 'youtube',
-              'yt_cookbook', 'yt_colormaps']
+              'yt_cookbook', 'yt_colormaps', 'notebook_sphinxext']
 
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['_templates']


https://bitbucket.org/yt_analysis/yt-doc/commits/9134220793d0/
Changeset:   9134220793d0
User:        MatthewTurk
Date:        2013-10-28 20:28:13
Summary:     Only do autosummary when not on RTD.
Affected #:  1 file

diff -r 3d2f5202b07eb59377dc4c1f670ed5c44e946bc9 -r 9134220793d09b64103b8f1ba34f5701fc40df57 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -29,9 +29,12 @@
 # coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx',
               'sphinx.ext.pngmath', 'sphinx.ext.viewcode',
-              'sphinx.ext.autosummary', 'numpydocmod', 'youtube',
+              'numpydocmod', 'youtube',
               'yt_cookbook', 'yt_colormaps', 'notebook_sphinxext']
 
+if not on_rtd:
+    extensions.append('sphinx.ext.autosummary')
+
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['_templates']
 


https://bitbucket.org/yt_analysis/yt-doc/commits/654e3ec17767/
Changeset:   654e3ec17767
User:        MatthewTurk
Date:        2013-10-28 20:28:35
Summary:     Merging
Affected #:  4 files

diff -r 9134220793d09b64103b8f1ba34f5701fc40df57 -r 654e3ec17767b1b916732bfbc3653cf713319cdf extensions/notebook_sphinxext.py
--- a/extensions/notebook_sphinxext.py
+++ b/extensions/notebook_sphinxext.py
@@ -1,4 +1,4 @@
-import os, shutil, string
+import os, shutil, string, glob
 from sphinx.util.compat import Directive
 from docutils import nodes
 from docutils.parsers.rst import directives
@@ -29,24 +29,22 @@
         nb_abs_path = os.path.join(rst_dir, nb_basename)
 
         # Move files around.
-        dest_dir = os.path.abspath(os.path.join(setup.app.builder.outdir,
-                                                os.path.dirname(nb_abs_path)))
+        rel_dir = os.path.relpath(rst_dir, setup.confdir)
+        dest_dir = os.path.join(setup.app.builder.outdir, rel_dir)
+        dest_path = os.path.join(dest_dir, nb_basename)
+
         if not os.path.exists(dest_dir):
             os.makedirs(dest_dir)
 
-        rel_dir = os.path.relpath(rst_dir, setup.confdir)
-        place = os.path.join(dest_dir, rel_dir)
-        if not os.path.isdir(place): os.makedirs(place)
-        dest_path = os.path.join(place, nb_basename)
-        dest_path_eval = string.replace(dest_path, '.ipynb', '_evaluated.ipynb')
-        dest_path_script = string.replace(dest_path, '.ipynb', '.py')
-
         # Copy unevaluated script
         try:
             shutil.copyfile(nb_abs_path, dest_path)
         except IOError:
             raise RuntimeError("Unable to copy notebook to build destination.")
 
+        dest_path_eval = string.replace(dest_path, '.ipynb', '_evaluated.ipynb')
+        dest_path_script = string.replace(dest_path, '.ipynb', '.py')
+
+        # Create python script version
         unevaluated_text = nb_to_html(nb_abs_path)
         script_text = nb_to_python(nb_abs_path)
@@ -80,6 +78,11 @@
         # add dependency
         self.state.document.settings.record_dependencies.add(nb_abs_path)
 
+        # clean up png files left behind by notebooks.
+        png_files = glob.glob("*.png")
+        for file in png_files:
+            os.remove(file)
+
         return [nb_node]
 
 class notebook_node(nodes.raw):

diff -r 9134220793d09b64103b8f1ba34f5701fc40df57 -r 654e3ec17767b1b916732bfbc3653cf713319cdf source/analyzing/analysis_modules/quick_start_fitting.rst
--- a/source/analyzing/analysis_modules/quick_start_fitting.rst
+++ b/source/analyzing/analysis_modules/quick_start_fitting.rst
@@ -3,6 +3,7 @@
 Fitting an Absorption Spectrum
 ==============================
 .. sectionauthor:: Hilary Egan <hilary.egan at colorado.edu>
+
 This tool can be used to fit absorption spectra, particularly those
 generated using the (``AbsorptionSpectrum``) tool. For more details
 on its uses and implementation please see (`Egan et al. (2013)

diff -r 9134220793d09b64103b8f1ba34f5701fc40df57 -r 654e3ec17767b1b916732bfbc3653cf713319cdf source/api/api.rst
--- a/source/api/api.rst
+++ b/source/api/api.rst
@@ -25,7 +25,7 @@
    ~yt.visualization.plot_collection.PlotCollectionInteractive
    ~yt.visualization.fixed_resolution.FixedResolutionBuffer
    ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
-   ~yt.visualization.plot_collection.get_multi_plot
+   ~yt.visualization.base_plot_types.get_multi_plot
 
 Data Sources
 ------------
@@ -321,9 +321,7 @@
 
 Absorption spectra fitting:
 
-.. autosummary::
-    :toctree: generated/
-    ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
+.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
 
 Sunrise exporting:
 
@@ -525,14 +523,16 @@
    :toctree: generated/
 
    ~yt.config.YTConfigParser
-   ~yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
-   ~yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
    ~yt.utilities.parameter_file_storage.ParameterFileStore
    ~yt.data_objects.data_containers.FakeGridForParticles
    ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
    ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
    ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
 
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
+
+
 Testing Infrastructure
 ----------------------
 

diff -r 9134220793d09b64103b8f1ba34f5701fc40df57 -r 654e3ec17767b1b916732bfbc3653cf713319cdf source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -246,5 +246,5 @@
                        'http://matplotlib.sourceforge.net/': None,
                        }
 
-if not on_rtd:
-    autosummary_generate = glob.glob("api/api.rst")
+#if not on_rtd:
+#    autosummary_generate = glob.glob("api/api.rst")


https://bitbucket.org/yt_analysis/yt-doc/commits/c07a6a7d40ae/
Changeset:   c07a6a7d40ae
User:        MatthewTurk
Date:        2013-10-28 20:35:19
Summary:     Switching over which bootcamp we use.
Affected #:  2 files

diff -r 654e3ec17767b1b916732bfbc3653cf713319cdf -r c07a6a7d40ae603328818697e8e1320388f6060a source/bootcamp.rst
--- a/source/bootcamp.rst
+++ /dev/null
@@ -1,46 +0,0 @@
-yt Bootcamp
-===========
-
-We have been developing a sequence of materials that can be run in the IPython
-notebook that walk through how to look at data and how to operate on data.
-These are not meant to be detailed walkthroughs, but simply short
-introductions.  Their purpose is to let you explore, interactively, some common
-operations that can be done on data with yt!
-
-To get started with the bootcamp, you need to download the repository and start
-the IPython notebook.  The easiest way, if you have mercurial installed, to get
-the repository is to:
-
-.. code-block:: bash
-
-   hg clone https://bitbucket.org/yt_analysis/bootcamp2012/
-
-If you don't, you can download it from `here
-<https://bitbucket.org/yt_analysis/bootcamp2012/get/tip.tar.bz2>`_
-
-Now you can start the IPython notebook and begin:
-
-.. code-block:: bash
-
-   cd bootcamp2012
-   yt notebook
-
-This command will give you information about the Notebook Server and how to
-access it.  Once you have done so, choose "Introduction" from the list of
-notebooks, which includes an introduction and information about how to download
-the sample data.
-
-.. warning:: The pre-filled out notebooks are *far* less fun than running them
-             yourselves!  Check out the repo and give it a try.
-
-Here are the notebooks, which have been filled in for inspection:
-
-  * `Introduction <http://hub.yt-project.org/nb/pvvbhr>`_
-  * `Data Inspection <http://hub.yt-project.org/nb/elryrr>`_
-  * `Simple Visualization <https://hub.yt-project.org/nb/oqq55m>`_
-  * `Data Objects and Time Series <https://hub.yt-project.org/nb/vu3lia>`_
-  * `Derived Fields and Profiles <https://hub.yt-project.org/nb/bspn4w>`_
-  * `Volume Rendering <https://hub.yt-project.org/nb/guo7w6>`_
-
-Let us know if you would like to contribute other example notebooks, or have
-any suggestions for how these can be improved.

diff -r 654e3ec17767b1b916732bfbc3653cf713319cdf -r c07a6a7d40ae603328818697e8e1320388f6060a source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -30,7 +30,7 @@
    :maxdepth: 1
 
    orientation/index 
-   bootcamp
+   bootcamp/index
    help/index
    cookbook/index
    examining/index


https://bitbucket.org/yt_analysis/yt-doc/commits/7fcfd22a668c/
Changeset:   7fcfd22a668c
User:        chummels
Date:        2013-10-28 20:42:21
Summary:     Merging.
Affected #:  3 files

diff -r c07a6a7d40ae603328818697e8e1320388f6060a -r 7fcfd22a668c8e04f4b59985bd7a150fe9d8c548 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -29,7 +29,7 @@
 .. toctree::
    :maxdepth: 1
 
-   orientation/index 
+   installing
    bootcamp/index
    help/index
    cookbook/index

diff -r c07a6a7d40ae603328818697e8e1320388f6060a -r 7fcfd22a668c8e04f4b59985bd7a150fe9d8c548 source/installing.rst
--- /dev/null
+++ b/source/installing.rst
@@ -0,0 +1,96 @@
+Installing yt
+-------------
+
+yt is a Python package (with some components written in C), using NumPy as a
+computation engine, Matplotlib for some visualization tasks and Mercurial for
+version control.  Because installation of all of these interlocking parts can be
+time-consuming, yt provides an installation script which downloads and builds
+a fully-isolated Python + NumPy + Matplotlib + HDF5 + Mercurial installation.
+yt supports Linux and OSX deployment, with the possibility of deployment on
+other Unix-like systems (XSEDE resources, clusters, etc.).  Windows is not
+supported.
+
+To get the installation script, download it (e.g. with ``wget``) from:
+
+.. code-block:: bash
+
+  $ wget http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
+
+By default, it will install an array of items, with an option to also download
+the current stable version of Enzo.  The script has all its options at the top
+of the script; you should be able to open it and edit it without any knowledge
+of bash syntax.  To execute it, run:
+
+.. code-block:: bash
+
+  $ bash install_script.sh
+
+Because the installer is downloading and building a variety of packages from
+source, this will likely take a while (e.g. 20 minutes), but you will get 
+updates of its status at the command line throughout.
+
+If you receive errors during this process, the installer will provide you
+with a large amount of information to assist in debugging the problem.  The
+file ``yt_install.log`` will contain all of the STDOUT and STDERR from the
+entire installation process, so it is usually quite long.  By looking at the
+last few hundred lines (e.g. ``tail -n 500 yt_install.log``), you can often
+figure out what went wrong.  If you have problems, though, do not hesitate to
+:ref:`contact us <asking-for-help>` for assistance.
+
+Activating Your Installation
+----------------------------
+
+Once the installation has completed, there will be instructions on how to set up
+your shell environment to use yt via the activate script.  You must source
+this script in order for yt to be properly recognized by your system.  You can
+either add it to your login script, or source it in each shell session
+prior to working with yt.
+
+.. code-block:: bash
+
+  $ source <yt installation directory>/bin/activate
+
+If you use csh or tcsh as your shell, source the csh version of the script instead:
+
+.. code-block:: bash
+
+  $ source <yt installation directory>/bin/activate.csh
+
+If you don't like executing outside scripts on your computer, you can set 
+the shell variables manually.  ``YT_DEST`` needs to point to the root of the
+directory containing the install. By default, this will be ``yt-<arch>``, where
+``<arch>`` is your machine's architecture (usually ``x86_64`` or ``i386``). You 
+will also need to set ``LD_LIBRARY_PATH`` and ``PYTHONPATH`` to contain
+``$YT_DEST/lib`` and ``$YT_DEST/python2.7/site-packages``, respectively.
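+
+As a minimal sketch (assuming a bash-like shell and an ``x86_64`` install in
+your home directory; adjust the paths to match your system):
+
+.. code-block:: bash
+
+  $ export YT_DEST=$HOME/yt-x86_64
+  $ export LD_LIBRARY_PATH=$YT_DEST/lib:$LD_LIBRARY_PATH
+  $ export PYTHONPATH=$YT_DEST/python2.7/site-packages:$PYTHONPATH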
+
+Alternative Installation Methods
+--------------------------------
+
+If you want to forego the use of the install script, you need to make sure 
+you have yt's dependencies installed on your system.  These include: a C compiler, 
+``HDF5``, ``Freetype``, ``libpng``, ``python``, ``cython``, ``numpy``, and 
+``matplotlib``.  From here, you can use ``pip`` to install yt as:
+
+.. code-block:: bash
+
+  $ pip install yt
+
+If you choose this installation method, you do not need to run the activation
+script.
+
+Testing Your Installation
+-------------------------
+
+To make sure everything is installed properly, try running yt at
+the command line:
+
+.. code-block:: bash
+
+  $ yt --help
+
+If this works, you should get a list of the various command-line options for
+yt, which means you have successfully installed yt.  Congratulations!  
+
+If you get an error, follow the instructions it gives you to debug the problem.  
+Do not hesitate to :ref:`contact us asking-for-help` so we can help you 
+figure it out.

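For the alternative ``pip`` route described in the file above, satisfying the
listed dependencies might look like the following on a Debian-style system (a
sketch only; the system package names are illustrative and vary by platform):

.. code-block:: bash

   # System-level build dependencies (names are illustrative):
   $ sudo apt-get install build-essential libhdf5-serial-dev \
         libfreetype6-dev libpng-dev python-dev
   # Python-level dependencies, then yt itself:
   $ pip install cython numpy matplotlib
   $ pip install yt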
diff -r c07a6a7d40ae603328818697e8e1320388f6060a -r 7fcfd22a668c8e04f4b59985bd7a150fe9d8c548 source/orientation/installing.rst
--- a/source/orientation/installing.rst
+++ /dev/null
@@ -1,53 +0,0 @@
-Installing yt
--------------
-
-yt is a Python package (with some components written in C), using NumPy as a
-computation engine, Matplotlib for some visualization tasks and Mercurial for
-version control.  Installing all of these components can be a daunting task,
-particularly as the Python ecosystem of packages is rapidly evolving.  Frankly,
-one of the *last* things a computational scientist wants to do is to install a
-bunch of packages and deal with the interlocking parts, when really the goal is
-to just simply look at some data.
-
-To that end, the yt project provides an installation script for the toolchain
-upon which yt builds, which contains a fully-isolated Python + Numpy +
-Matplotlib + HDF5 + Mercurial installation.  This installation script has been
-tested on most of the Teragrid as well as on a number of private clusters and
-Linux and OS X machines; in fact, if it doesn't work, that's considered a bug
-and we would endeavor to fix it.  yt supports Linux and OSX deployment, with
-the possibility of deployment on other Unix-like systems.  Windows is not
-supported.
-
-To get the installation script, download it from:
-
-.. code-block:: bash
-
-  http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
-
-By default, it will install an array of items, with an option to also download
-the current stable version of Enzo.  The script has all its options at the top
-of the script; you should be able to open it and edit it without any knowledge
-of bash syntax.
-
-.. code-block:: bash
-
-  $ bash install_script.sh
-
-It will start out by telling you a little bit about itself and what it's
-installing, and then continue on for some time while it downloads, builds, and
-installs (into an isolated directory) everything you need to run yt.
-
-Once it has completed, there will be instructions on how to set up your shell
-environment to use yt.  **You should follow these, or else yt may not work, or
-may simply fail -- in unexpected ways!**
-
-One thing that we will use for the rest of the orientation is the environment
-variable ``YT_DEST``, which is output at the end of the installation process.
-If you use the ``activate`` script as described in the instructions printed by
-the install script, you will be all set.
-
-If you'd like to do it manually, ``YT_DEST`` needs to point to the root of the
-directory containing the install. By default, this will be ``yt-<arch>``, where
-``<arch>`` is your machine's architecture (usually ``x86_64`` or ``i386``). You will also
-need to set ``LD_LIBRARY_PATH`` and ``PYTHONPATH`` to contain ``$YT_PATH/lib`` and ``$YT_DEST/python2.7/site-packages``, respectively.
-


https://bitbucket.org/yt_analysis/yt-doc/commits/8de2781ad3c0/
Changeset:   8de2781ad3c0
User:        MatthewTurk
Date:        2013-10-28 20:45:11
Summary:     This makes the notebook extension only conditionally imported.
Affected #:  1 file

diff -r c07a6a7d40ae603328818697e8e1320388f6060a -r 8de2781ad3c02b9dd999d675b60627a1f73a1554 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -29,12 +29,17 @@
 # coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx',
               'sphinx.ext.pngmath', 'sphinx.ext.viewcode',
-              'numpydocmod', 'youtube',
-              'yt_cookbook', 'yt_colormaps', 'notebook_sphinxext']
+              'numpydocmod', 'youtube', 'yt_cookbook', 'yt_colormaps']
 
 if not on_rtd:
     extensions.append('sphinx.ext.autosummary')
 
+try:
+    import runipy
+    extensions.append('notebook_sphinxext')
+except ImportError:
+    pass
+
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['_templates']
 


https://bitbucket.org/yt_analysis/yt-doc/commits/c082f7c4c2da/
Changeset:   c082f7c4c2da
User:        MatthewTurk
Date:        2013-10-28 20:51:59
Summary:     Adding pandoc check.
Affected #:  1 file

diff -r 8de2781ad3c02b9dd999d675b60627a1f73a1554 -r c082f7c4c2dae5ec713edd7914dbf09a7fe16a7a source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -36,6 +36,7 @@
 
 try:
     import runipy
+    import IPython.nbconvert.utils.pandoc
     extensions.append('notebook_sphinxext')
 except ImportError:
     pass


https://bitbucket.org/yt_analysis/yt-doc/commits/6b92ba058e7e/
Changeset:   6b92ba058e7e
User:        MatthewTurk
Date:        2013-10-28 20:54:34
Summary:     One more check.  Don't build notebooks if on rtd.
Affected #:  1 file

diff -r c082f7c4c2dae5ec713edd7914dbf09a7fe16a7a -r 6b92ba058e7ebbf6f255a9ecb6eadb23d6a126bb source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -37,7 +37,8 @@
 try:
     import runipy
     import IPython.nbconvert.utils.pandoc
-    extensions.append('notebook_sphinxext')
+    if not on_rtd:
+        extensions.append('notebook_sphinxext')
 except ImportError:
     pass
 

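Taken together, the three ``conf.py`` commits above build up the following
guard around the notebook extension (a consolidated sketch; the ``on_rtd``
check is assumed to be the usual ReadTheDocs environment test defined earlier
in ``conf.py``):

.. code-block:: python

   import os

   # Assumed ReadTheDocs detection, as conventionally written in conf.py:
   on_rtd = os.environ.get('READTHEDOCS', None) == 'True'

   extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx',
                 'sphinx.ext.pngmath', 'sphinx.ext.viewcode',
                 'numpydocmod', 'youtube', 'yt_cookbook', 'yt_colormaps']

   if not on_rtd:
       extensions.append('sphinx.ext.autosummary')

   # Enable the notebook extension only when its runtime dependencies
   # (runipy and a pandoc-aware nbconvert) import cleanly, and never on RTD.
   try:
       import runipy
       import IPython.nbconvert.utils.pandoc
       if not on_rtd:
           extensions.append('notebook_sphinxext')
   except ImportError:
       pass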

https://bitbucket.org/yt_analysis/yt-doc/commits/3995f667fbed/
Changeset:   3995f667fbed
User:        chummels
Date:        2013-10-28 20:55:51
Summary:     Merging.
Affected #:  3 files

diff -r 6b92ba058e7ebbf6f255a9ecb6eadb23d6a126bb -r 3995f667fbedfdc99186fc5845162ff73e9326ae source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -29,7 +29,7 @@
 .. toctree::
    :maxdepth: 1
 
-   orientation/index 
+   installing
    bootcamp/index
    help/index
    cookbook/index

diff -r 6b92ba058e7ebbf6f255a9ecb6eadb23d6a126bb -r 3995f667fbedfdc99186fc5845162ff73e9326ae source/installing.rst
--- /dev/null
+++ b/source/installing.rst
@@ -0,0 +1,96 @@
+Installing yt
+-------------
+
+yt is a Python package (with some components written in C), using NumPy as a
+computation engine, Matplotlib for some visualization tasks and Mercurial for
+version control.  Because installation of all of these interlocking parts can 
+time-consuming, yt provides an installation script which downloads and builds
+a fully-isolated Python + Numpy + Matplotlib + HDF5 + Mercurial installation.  
+yt supports Linux and OSX deployment, with the possibility of deployment on 
+other Unix-like systems (XSEDE resources, clusters, etc.).  It Windows is not
+supported.
+
+To get the installation script, download it from:
+
+.. code-block:: bash
+
+  http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
+
+By default, it will install an array of items, with an option to also download
+the current stable version of Enzo.  The script has all its options at the top
+of the script; you should be able to open it and edit it without any knowledge
+of bash syntax.  To execute it, run:
+
+.. code-block:: bash
+
+  $ bash install_script.sh
+
+Because the installer is downloading and building a variety of packages from
+source, this will likely take a while (e.g. 20 minutes), but you will get 
+updates of its status at the command line throughout.
+
+If you receive errors during this process, the installer will provide you 
+with a large amount of information to assist in debugging your problems.  The 
+file `yt_install.log` will contain all of the STDOUT and STDERR from the entire 
+installation process, so it is usually quite cumbersome.  By looking at the 
+last few hundred lines (i.e. `tail -500 yt_install.log`), you can potentially 
+figure out what went wrong.  If you have problems, though, do not hesitate to 
+:ref:`contact us <asking-for-help>` for assitance.
+
+Activating Your Installation
+----------------------------
+
+Once the installation has completed, there will be instructions on how to set up 
+your shell environment to use yt using the activate script.  You must execute 
+this script in order to have yt properly recognized by your system.  You can 
+either add it to your login script, or you must execute it in a shell session 
+prior to working with yt.
+
+.. code-block:: bash
+
+  $ source <yt installation directory>/bin/activate
+
+If you use csh or tcsh as your shell, source the csh version of the script instead:
+
+.. code-block:: bash
+
+  $ source <yt installation directory>/bin/activate.csh
+
+If you don't like executing outside scripts on your computer, you can set 
+the shell variables manually.  ``YT_DEST`` needs to point to the root of the
+directory containing the install. By default, this will be ``yt-<arch>``, where
+``<arch>`` is your machine's architecture (usually ``x86_64`` or ``i386``). You 
+will also need to set ``LD_LIBRARY_PATH`` and ``PYTHONPATH`` to contain 
+``$YT_PATH/lib`` and ``$YT_DEST/python2.7/site-packages``, respectively.
+
+Alternative Installation Methods
+--------------------------------
+
+If you want to forego the use of the install script, you need to make sure 
+you have yt's dependencies installed on your system.  These include: a C compiler, 
+``HDF5``, ``Freetype``, ``libpng``, ``python``, ``cython``, ``numpy``, and 
+``matplotlib``.  From here, you can use ``pip`` to install yt as:
+
+.. code-block:: bash
+
+  $ pip install yt
+
+If you choose this installation method, you do not need to run the activation
+script.
+
+Testing Your Installation
+-------------------------
+
+To make sure everything is installed properly, try running yt at
+the command line:
+
+.. code-block:: bash
+
+  $ yt --help
+
+If this works, you should get a list of the various command-line options for
+yt, which means you have successfully installed yt.  Congratulations!  
+
+If you get an error, follow the instructions it gives you to debug the problem.  
+Do not hesitate to :ref:`contact us <asking-for-help>` so we can help you
+figure it out.

diff -r 6b92ba058e7ebbf6f255a9ecb6eadb23d6a126bb -r 3995f667fbedfdc99186fc5845162ff73e9326ae source/orientation/installing.rst
--- a/source/orientation/installing.rst
+++ /dev/null
@@ -1,53 +0,0 @@
-Installing yt
--------------
-
-yt is a Python package (with some components written in C), using NumPy as a
-computation engine, Matplotlib for some visualization tasks and Mercurial for
-version control.  Installing all of these components can be a daunting task,
-particularly as the Python ecosystem of packages is rapidly evolving.  Frankly,
-one of the *last* things a computational scientist wants to do is to install a
-bunch of packages and deal with the interlocking parts, when really the goal is
-to just simply look at some data.
-
-To that end, the yt project provides an installation script for the toolchain
-upon which yt builds, which contains a fully-isolated Python + Numpy +
-Matplotlib + HDF5 + Mercurial installation.  This installation script has been
-tested on most of the Teragrid as well as on a number of private clusters and
-Linux and OS X machines; in fact, if it doesn't work, that's considered a bug
-and we would endeavor to fix it.  yt supports Linux and OSX deployment, with
-the possibility of deployment on other Unix-like systems.  Windows is not
-supported.
-
-To get the installation script, download it from:
-
-.. code-block:: bash
-
-  http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
-
-By default, it will install an array of items, with an option to also download
-the current stable version of Enzo.  The script has all its options at the top
-of the script; you should be able to open it and edit it without any knowledge
-of bash syntax.
-
-.. code-block:: bash
-
-  $ bash install_script.sh
-
-It will start out by telling you a little bit about itself and what it's
-installing, and then continue on for some time while it downloads, builds, and
-installs (into an isolated directory) everything you need to run yt.
-
-Once it has completed, there will be instructions on how to set up your shell
-environment to use yt.  **You should follow these, or else yt may not work, or
-may simply fail -- in unexpected ways!**
-
-One thing that we will use for the rest of the orientation is the environment
-variable ``YT_DEST``, which is output at the end of the installation process.
-If you use the ``activate`` script as described in the instructions printed by
-the install script, you will be all set.
-
-If you'd like to do it manually, ``YT_DEST`` needs to point to the root of the
-directory containing the install. By default, this will be ``yt-<arch>``, where
-``<arch>`` is your machine's architecture (usually ``x86_64`` or ``i386``). You will also
-need to set ``LD_LIBRARY_PATH`` and ``PYTHONPATH`` to contain ``$YT_PATH/lib`` and ``$YT_DEST/python2.7/site-packages``, respectively.
-


https://bitbucket.org/yt_analysis/yt-doc/commits/5ac4330a4601/
Changeset:   5ac4330a4601
User:        chummels
Date:        2013-10-28 21:11:55
Summary:     Making toctree function correctly.
Affected #:  1 file

diff -r 3995f667fbedfdc99186fc5845162ff73e9326ae -r 5ac4330a4601005dd70f3385ef0e7e2046fe4735 source/installing.rst
--- a/source/installing.rst
+++ b/source/installing.rst
@@ -1,5 +1,5 @@
 Installing yt
--------------
+=============
 
 yt is a Python package (with some components written in C), using NumPy as a
 computation engine, Matplotlib for some visualization tasks and Mercurial for


https://bitbucket.org/yt_analysis/yt-doc/commits/e3af838cd29d/
Changeset:   e3af838cd29d
User:        chummels
Date:        2013-10-28 21:12:59
Summary:     fixing top level page.
Affected #:  1 file

diff -r 5ac4330a4601005dd70f3385ef0e7e2046fe4735 -r e3af838cd29d354ace579c190ee7a7871f8874b0 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -23,8 +23,8 @@
  * Visualize Data - Generate plots, images, and movies for better understanding your raw datasets.
  * Analyze Data - Use a variety of additional analysis routines to derive real-world results associated with your data.
 
-Documentation Highlights
-========================
+Documentation
+=============
 
 .. toctree::
    :maxdepth: 1


https://bitbucket.org/yt_analysis/yt-doc/commits/09fd04060fce/
Changeset:   09fd04060fce
User:        chummels
Date:        2013-10-28 23:37:54
Summary:     Updating developers docs.
Affected #:  1 file

diff -r e3af838cd29d354ace579c190ee7a7871f8874b0 -r 09fd04060fce8294843bc23ad0bda01cfb979177 source/developing/index.rst
--- a/source/developing/index.rst
+++ b/source/developing/index.rst
@@ -1,5 +1,13 @@
-Getting Involved with yt
-========================
+Developing in yt
+================
+
+yt is an open-source project with a community of contributing scientists.
+While you can use the existing framework within yt to help answer questions 
+about your own datasets, yt thrives by the addition of new functionality by
+users just like yourself.  Maybe you have a new data format that you would like
+supported, a new derived quantity that you feel should be included, or a new
+way of visualizing data--please add them to the code base!  We are eager to 
+help you make it happen.
 
 There are many ways to get involved with yt -- participating in the mailing
 list, helping people out in IRC, providing suggestions for the documentation,


https://bitbucket.org/yt_analysis/yt-doc/commits/df776067740b/
Changeset:   df776067740b
User:        chummels
Date:        2013-10-29 00:24:21
Summary:     Updating the top-level page and some of its links.  Also correcting a few things on the bootcamp and installation pages.
Affected #:  4 files

diff -r 09fd04060fce8294843bc23ad0bda01cfb979177 -r df776067740b8c57e2eb456e6e869a8e0207bc01 source/bootcamp/index.rst
--- a/source/bootcamp/index.rst
+++ b/source/bootcamp/index.rst
@@ -1,32 +1,33 @@
 yt Bootcamp
 ===========
 
-We have been developing a sequence of materials that can be run in the IPython
-notebook that walk through how to look at data and how to operate on data.
-These are not meant to be detailed walkthroughs, but simply short
-introductions.  Their purpose is to let you explore, interactively, some common
-operations that can be done on data with yt!
+yt Bootcamp is a series of worked examples of how to use much of the 
+functionality of yt.  These are not meant to be detailed walkthroughs but simple,
+short introductions to give you a taste of what the code can do.
 
-To get started with the bootcamp, you need to download the repository and start
-the IPython notebook.  The easiest way, if you have mercurial installed, to get
-the repository is to:
+There are two ways in which you can go through the bootcamp: interactively and 
+non-interactively.  We recommend the interactive method, but if you're pressed 
+for time, you can non-interactively go through the following pages and view the
+worked examples.
+
+To execute the bootcamp interactively, you need to download the repository 
+and start the IPython notebook.  The easiest way to get the repository is to 
+use your already-installed Mercurial program to grab it:
 
 .. code-block:: bash
 
    hg clone https://bitbucket.org/yt_analysis/yt-doc
 
-If you don't, you can download it from `here
-<https://bitbucket.org/yt_analysis/yt-doc/get/tip.tar.bz2>`_
-
-Now you can start the IPython notebook and begin:
+Now start the IPython notebook from within the repository:
 
 .. code-block:: bash
 
    cd yt-doc/source/bootcamp
    yt notebook
 
-This command will give you information about the Notebook Server and how to
-access it.  Once you have done so, choose "Introduction" from the list of
+This command will give you information about the notebook server and how to
+access it.  You then simply point your web browser at the address it reports.
+Once you have done so, choose "Introduction" from the list of
 notebooks, which includes an introduction and information about how to download
 the sample data.
 

diff -r 09fd04060fce8294843bc23ad0bda01cfb979177 -r df776067740b8c57e2eb456e6e869a8e0207bc01 source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -1,9 +1,8 @@
 .. _cookbook:
 
-Example Scripts
+The yt Cookbook
 ===============
 
-
 yt scripts can be a bit intimidating, and at times a bit obtuse.  But there's a
 lot you can do, and this section of the manual will assist with figuring out
 how to do some fairly common tasks -- which can lead to combining these, with

diff -r 09fd04060fce8294843bc23ad0bda01cfb979177 -r df776067740b8c57e2eb456e6e869a8e0207bc01 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -19,9 +19,9 @@
 
 yt uses a three-pronged approach to interacting with data:
 
- * Examine Data - Access data directly on disk with a variety of helper classes for making this task easier.
- * Visualize Data - Generate plots, images, and movies for better understanding your raw datasets.
- * Analyze Data - Use a variety of additional analysis routines to derive real-world results associated with your data.
+ * Visualize Data - Generate plots, images, and movies for better understanding your datasets
+ * Analyze Data - Use additional analysis routines to derive real-world results from your data
+ * Examine Data - Directly access raw data with helper functions for making this task easier
 
 Documentation
 =============
@@ -30,11 +30,11 @@
    :maxdepth: 1
 
    installing
-   bootcamp/index
-   help/index
+   yt Bootcamp: A Worked Introduction <bootcamp/index>
+   How to Get Help <help/index>
    cookbook/index
-   examining/index
    visualizing/index
    analyzing/index
+   examining/index
    developing/index
    reference/index

diff -r 09fd04060fce8294843bc23ad0bda01cfb979177 -r df776067740b8c57e2eb456e6e869a8e0207bc01 source/installing.rst
--- a/source/installing.rst
+++ b/source/installing.rst
@@ -1,13 +1,13 @@
-Installing yt
-=============
+Getting and Installing yt
+=========================
 
 yt is a Python package (with some components written in C), using NumPy as a
 computation engine, Matplotlib for some visualization tasks and Mercurial for
 version control.  Because installation of all of these interlocking parts can 
-time-consuming, yt provides an installation script which downloads and builds
+be time-consuming, yt provides an installation script which downloads and builds
 a fully-isolated Python + Numpy + Matplotlib + HDF5 + Mercurial installation.  
 yt supports Linux and OSX deployment, with the possibility of deployment on 
-other Unix-like systems (XSEDE resources, clusters, etc.).  It Windows is not
+other Unix-like systems (XSEDE resources, clusters, etc.).  Windows is not
 supported.
 
 To get the installation script, download it from:
@@ -16,10 +16,10 @@
 
   http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
 
-By default, it will install an array of items, with an option to also download
-the current stable version of Enzo.  The script has all its options at the top
-of the script; you should be able to open it and edit it without any knowledge
-of bash syntax.  To execute it, run:
+By default, it will install an array of items, but there are additional packages
+that can be downloaded and installed (e.g. SciPy and Enzo). The script has
+all of these options at the top of the file. You should be able to open it and 
+edit it without any knowledge of bash syntax.  To execute it, run:
 
 .. code-block:: bash
 
@@ -31,19 +31,19 @@
 
 If you receive errors during this process, the installer will provide you 
 with a large amount of information to assist in debugging your problems.  The 
-file `yt_install.log` will contain all of the STDOUT and STDERR from the entire 
-installation process, so it is usually quite cumbersome.  By looking at the 
-last few hundred lines (i.e. `tail -500 yt_install.log`), you can potentially 
-figure out what went wrong.  If you have problems, though, do not hesitate to 
-:ref:`contact us <asking-for-help>` for assitance.
+file ``yt_install.log`` will contain all of the ``STDOUT`` and ``STDERR`` from 
+the entire installation process, so it is usually quite cumbersome.  By looking 
+at the last few hundred lines (i.e. ``tail -500 yt_install.log``), you can 
+potentially figure out what went wrong.  If you have problems, though, do not 
+hesitate to :ref:`contact us <asking-for-help>` for assistance.
 
 Activating Your Installation
 ----------------------------
 
 Once the installation has completed, there will be instructions on how to set up 
-your shell environment to use yt using the activate script.  You must execute 
-this script in order to have yt properly recognized by your system.  You can 
-either add it to your login script, or you must execute it in a shell session 
+your shell environment to use yt by executing the activate script.  You must 
+run this script in order to have yt properly recognized by your system.  You can 
+either add it to your login script or execute it in each shell session
 prior to working with yt.
 
 .. code-block:: bash
@@ -61,7 +61,7 @@
 directory containing the install. By default, this will be ``yt-<arch>``, where
 ``<arch>`` is your machine's architecture (usually ``x86_64`` or ``i386``). You 
 will also need to set ``LD_LIBRARY_PATH`` and ``PYTHONPATH`` to contain 
-``$YT_PATH/lib`` and ``$YT_DEST/python2.7/site-packages``, respectively.
+``$YT_DEST/lib`` and ``$YT_DEST/python2.7/site-packages``, respectively.
 
 Alternative Installation Methods
 --------------------------------
@@ -69,7 +69,8 @@
 If you want to forego the use of the install script, you need to make sure 
 you have yt's dependencies installed on your system.  These include: a C compiler, 
 ``HDF5``, ``Freetype``, ``libpng``, ``python``, ``cython``, ``numpy``, and 
-``matplotlib``.  From here, you can use ``pip`` to install yt as:
+``matplotlib``.  From here, you can use ``pip`` (which comes with ``Python``)
+to install yt as:
 
 .. code-block:: bash
 

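For the manual activation route described above, the shell setup amounts to
three variables (a bash sketch; ``$HOME/yt-x86_64`` stands in for your actual
``yt-<arch>`` installation directory):

.. code-block:: bash

   # Root of the isolated installation:
   export YT_DEST=$HOME/yt-x86_64
   # Make the bundled shared libraries and Python packages visible:
   export LD_LIBRARY_PATH=$YT_DEST/lib:$LD_LIBRARY_PATH
   export PYTHONPATH=$YT_DEST/python2.7/site-packages:$PYTHONPATH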

https://bitbucket.org/yt_analysis/yt-doc/commits/b5f8c3ea88e9/
Changeset:   b5f8c3ea88e9
User:        chummels
Date:        2013-10-29 00:38:28
Summary:     Updating the how to get help page.
Affected #:  2 files

diff -r df776067740b8c57e2eb456e6e869a8e0207bc01 -r b5f8c3ea88e9b8488823257966f57ed753ad8070 source/help/index.rst
--- a/source/help/index.rst
+++ b/source/help/index.rst
@@ -1,6 +1,6 @@
 .. _asking-for-help:
 
-Asking for Help
+How to Get Help
 ===============
 
 If you run into problems with ``yt``, you should feel **encouraged** to ask for
@@ -8,17 +8,37 @@
 mailing list.  If something doesn't work for you, it's in everyone's best
 interests to make sure that it gets fixed.
 
+.. _update-the-code:
+
+Try Updating yt
+---------------
+
+Sometimes the pace of development in yt is pretty fast, particularly in the
+development branch, so a fix to your problem may have already been developed
+by the time you encounter it.  Many users' problems can simply be corrected
+by updating to the latest version of the code and/or its dependencies.  You 
+can update yt's source by running:
+
+.. code-block:: bash
+
+  $ yt update 
+
+or update yt's source along with its software dependencies by running:
+
+.. code-block:: bash
+
+  $ yt update --all
+
 .. _search-the-documentation:
 
 Search the Documentation
 ------------------------
 
-The first thing that you want to do if you encounter an issue with yt is to 
-do a cursory check of the documentation.  This doesn't mean you have to read 
-all of the docs top-to-bottom, but you should at least run a search to see
-if relevant topics have been answered in the docs.  Click on the search field
-to the right of this window and enter your text.  Another good place to look 
-for answers in the documentation is our :ref:`faq` page.
+The documentation holds the answers to many everyday problems.  This doesn't
+mean you have to read all of the docs top-to-bottom, but you should at least 
+run a search to see if relevant topics have been answered in the docs.  Click 
+on the search field to the right of this window and enter your text.  Another 
+good place to look for answers in the documentation is our :ref:`faq` page.
 
 .. _mailing-list:
 
@@ -57,8 +77,11 @@
 #. What it is that went wrong, and how you knew it went wrong.
 #. A traceback if appropriate -- see :ref:`error-reporting` for some help with
    that.
-#. If possible, the smallest number of steps that can reproduce the problem. If you're demonstrating the bug with code, you may find the :ref:`pastebin` useful.If you've got an image output that demonstrates your problem, you may find the :ref:`upload-image` function useful.
-#. Which version of the code you are using.
+#. If possible, the smallest number of steps that can reproduce the problem. 
+   If you're demonstrating the bug with code, you may find the :ref:`pastebin` 
+   useful.  If you've got an image output that demonstrates your problem, you may
+   find the :ref:`upload-image` function useful.
+#. Which version of the code you are using (i.e. the output of ``yt instinfo``).
 
 When you email the list, providing this information can help the developers
 understand what you did, how it went wrong, and any potential fixes or similar

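For the last item in that checklist, the version report comes straight from
the command line (a sketch; assumes the yt command-line tools are on your
path):

.. code-block:: bash

   $ yt instinfo   # prints version information about your yt installation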
diff -r df776067740b8c57e2eb456e6e869a8e0207bc01 -r b5f8c3ea88e9b8488823257966f57ed753ad8070 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -31,7 +31,7 @@
 
    installing
    yt Bootcamp: A Worked Introduction <bootcamp/index>
-   How to Get Help <help/index>
+   help/index
    cookbook/index
    visualizing/index
    analyzing/index


https://bitbucket.org/yt_analysis/yt-doc/commits/bbfc729dc89a/
Changeset:   bbfc729dc89a
User:        chummels
Date:        2013-10-29 00:57:11
Summary:     Updating cookbook to reflect now having example scripts and example notebooks.
Affected #:  2 files

diff -r b5f8c3ea88e9b8488823257966f57ed753ad8070 -r bbfc729dc89ad65053da0458d4c2fb445f35300f source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -3,13 +3,20 @@
 The yt Cookbook
 ===============
 
-yt scripts can be a bit intimidating, and at times a bit obtuse.  But there's a
-lot you can do, and this section of the manual will assist with figuring out
-how to do some fairly common tasks -- which can lead to combining these, with
-other Python code, into more complicated and advanced tasks.
+yt provides a great deal of functionality to the user, but sometimes it can 
+be a bit complex.  This section of the documentation lays out example recipes
+for how to do a variety of tasks.  Most of the early, simple code 
+demonstrations are small scripts which you can easily copy and paste into
+your own code; however, as we move to more complex tasks, the recipes move to 
+IPython notebooks to display intermediate steps.  All of these recipes are
+available for download via a link next to each recipe.
 
-All of the data used here is freely available from http://yt-project.org/data/
-where you will find links to download individual datasets.
+Getting the Sample Data
+-----------------------
+
+All of the data used in the cookbook is freely available 
+`here <http://yt-project.org/data/>`_, where you will find links to download 
+individual datasets.
 
 If you want to take a look at more complex recipes, or submit your own,
 check out the `yt Hub <http://hub.yt-project.org>`_.
@@ -18,6 +25,8 @@
    `fork <http://bitbucket.org/yt_analysis/yt-doc/fork>`_
    the documentation repository!
 
+Example Scripts
+---------------
 .. toctree::
    :maxdepth: 2
 
@@ -26,3 +35,10 @@
    complex_plots
    cosmological_analysis
    constructing_data_objects
+
+Example Notebooks
+-----------------
+.. toctree::
+   :maxdepth: 2
+
+   notebook_tutorial

diff -r b5f8c3ea88e9b8488823257966f57ed753ad8070 -r bbfc729dc89ad65053da0458d4c2fb445f35300f source/cookbook/notebook_tutorial.rst
--- /dev/null
+++ b/source/cookbook/notebook_tutorial.rst
@@ -0,0 +1,4 @@
+Notebook Tutorial
+-----------------
+
+This is a placeholder for the notebook tutorial that Nathan is going to write.


https://bitbucket.org/yt_analysis/yt-doc/commits/9acb2084f9e9/
Changeset:   9acb2084f9e9
User:        chummels
Date:        2013-10-29 01:05:10
Summary:     Minor change to the main page header.
Affected #:  1 file

diff -r bbfc729dc89ad65053da0458d4c2fb445f35300f -r 9acb2084f9e9976b56f1fc399c4bfddd4d481feb source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -1,4 +1,4 @@
-yt Overview
+What is yt?
 ===========
 
 yt is a community-developed analysis and visualization toolkit for


https://bitbucket.org/yt_analysis/yt-doc/commits/d6f0c6509ea8/
Changeset:   d6f0c6509ea8
User:        chummels
Date:        2013-10-29 01:13:36
Summary:     Merged yt_analysis/yt-doc into default
Affected #:  1 file

diff -r 9acb2084f9e9976b56f1fc399c4bfddd4d481feb -r d6f0c6509ea8a70d0d5b23290d9a4c39c8c66c6a source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -253,5 +253,5 @@
                        'http://matplotlib.sourceforge.net/': None,
                        }
 
-#if not on_rtd:
-#    autosummary_generate = glob.glob("api/api.rst")
+if not on_rtd:
+    autosummary_generate = glob.glob("api/api.rst")


https://bitbucket.org/yt_analysis/yt-doc/commits/5c26acf5e060/
Changeset:   5c26acf5e060
User:        ngoldbaum
Date:        2013-10-28 02:50:51
Summary:     Adding a notebook sphinx extension.

This includes a new sphinx directive, notebook, that accepts the path to an
unevaluated notebook as an argument.  It outputs an evaluated version of the
notebook inline in the sphinx document.

There is a bit of monkeypatching for the notebook CSS we get from IPython's
html exporter.  The search-replaces I'm using may not work with pre-IPython 1.0
notebooks or future notebook versions.
Affected #:  2 files

diff -r fbf2479e6f0c50b48b111cced08a09e6ea97af35 -r 5c26acf5e060030d7425265981240f7e37db4786 extensions/notebook_sphinxext.py
--- /dev/null
+++ b/extensions/notebook_sphinxext.py
@@ -0,0 +1,151 @@
+import os, shutil, string
+from sphinx.util.compat import Directive
+from docutils import nodes
+from docutils.parsers.rst import directives
+from IPython.nbconvert import html, python
+from runipy.notebook_runner import NotebookRunner
+from jinja2 import FileSystemLoader
+
+class NotebookDirective(Directive):
+    """Insert an evaluated notebook into a document
+
+    This uses runipy and nbconvert to transform a path to an unevaluated notebook
+    into html suitable for embedding in a Sphinx document.
+    """
+    required_arguments = 1
+    optional_arguments = 0
+
+    def run(self):
+        # check if raw html is supported
+        if not self.state.document.settings.raw_enabled:
+            raise self.warning('"%s" directive disabled.' % self.name)
+
+        # get path to notebook
+        source_dir = os.path.dirname(
+            os.path.abspath(self.state.document.current_source))
+        nb_basename = os.path.basename(self.arguments[0])
+        rst_file = self.state_machine.document.attributes['source']
+        rst_dir = os.path.abspath(os.path.dirname(rst_file))
+        nb_abs_path = os.path.join(rst_dir, nb_basename)
+
+        # Move files around.
+        dest_dir = os.path.abspath(os.path.join(setup.app.builder.outdir,
+                                                os.path.dirname(nb_abs_path)))
+        if not os.path.exists(dest_dir):
+            os.makedirs(dest_dir)
+
+        rel_dir = os.path.relpath(rst_dir, setup.confdir)
+        place = os.path.join(dest_dir, rel_dir)
+        if not os.path.isdir(place): os.makedirs(place)
+        dest_path = os.path.join(place, nb_basename)
+        dest_path_eval = string.replace(dest_path, '.ipynb', '_evaluated.ipynb')
+        dest_path_script = string.replace(dest_path, '.ipynb', '.py')
+
+        # Copy unevaluated script
+        try:
+            shutil.copyfile(nb_abs_path, dest_path)
+        except IOError:
+            raise RuntimeError("Unable to copy notebook to build destination.")
+
+        # Create unevaluated HTML and Python script versions
+        unevaluated_text = nb_to_html(nb_abs_path)
+        script_text = nb_to_python(nb_abs_path)
+        f = open(dest_path_script, 'w')
+        f.write(script_text.encode('utf8'))
+        f.close()
+
+        # Create evaluated version and save it to the dest path.
+        # Always use --pylab so figures appear inline
+        # perhaps this is questionable?
+        nb_runner = NotebookRunner(nb_in=nb_abs_path, pylab=True)
+        nb_runner.run_notebook()
+        nb_runner.save_notebook(dest_path_eval)
+        evaluated_text = nb_to_html(dest_path_eval)
+
+        # Create link to notebook and script files
+        link_rst = "(" + \
+                   formatted_link(dest_path) + "; " + \
+                   formatted_link(dest_path_eval) + "; " + \
+                   formatted_link(dest_path_script) + \
+                   ")"
+
+        self.state_machine.insert_input([link_rst], rst_file)
+
+        # create notebook node
+        attributes = {'format': 'html', 'source': 'nb_path'}
+        nb_node = nodes.raw('', evaluated_text, **attributes)
+        (nb_node.source, nb_node.line) = \
+            self.state_machine.get_source_and_line(self.lineno)
+
+        # add dependency
+        self.state.document.settings.record_dependencies.add(nb_abs_path)
+
+        return [nb_node]
+
+class notebook_node(nodes.raw):
+    pass
+
+def nb_to_python(nb_path):
+    """convert notebook to python script"""
+    exporter = python.PythonExporter()
+    output, resources = exporter.from_filename(nb_path)
+    return output
+
+def nb_to_html(nb_path):
+    """convert notebook to html"""
+    exporter = html.HTMLExporter(template_file='full')
+    output, resources = exporter.from_filename(nb_path)
+    header = output.split('<head>', 1)[1].split('</head>',1)[0]
+    body = output.split('<body>', 1)[1].split('</body>',1)[0]
+
+    # http://imgur.com/eR9bMRH
+    header = header.replace('<style', '<style scoped="scoped"')
+    header = header.replace('body{background-color:#ffffff;}\n', '')
+    header = header.replace('body{background-color:white;position:absolute;'
+                            'left:0px;right:0px;top:0px;bottom:0px;'
+                            'overflow:visible;}\n', '')
+    header = header.replace('body{margin:0;'
+                            'font-family:"Helvetica Neue",Helvetica,Arial,'
+                            'sans-serif;font-size:13px;line-height:20px;'
+                            'color:#000000;background-color:#ffffff;}', '')
+    header = header.replace('\na{color:#0088cc;text-decoration:none;}', '')
+    header = header.replace(
+        'a:focus{color:#005580;text-decoration:underline;}', '')
+    header = header.replace(
+        '\nh1,h2,h3,h4,h5,h6{margin:10px 0;font-family:inherit;font-weight:bold;'
+        'line-height:20px;color:inherit;text-rendering:optimizelegibility;}'
+        'h1 small,h2 small,h3 small,h4 small,h5 small,'
+        'h6 small{font-weight:normal;line-height:1;color:#999999;}'
+        '\nh1,h2,h3{line-height:40px;}\nh1{font-size:35.75px;}'
+        '\nh2{font-size:29.25px;}\nh3{font-size:22.75px;}'
+        '\nh4{font-size:16.25px;}\nh5{font-size:13px;}'
+        '\nh6{font-size:11.049999999999999px;}\nh1 small{font-size:22.75px;}'
+        '\nh2 small{font-size:16.25px;}\nh3 small{font-size:13px;}'
+        '\nh4 small{font-size:13px;}', '')
+    header = header.replace('background-color:#ffffff;', '', 1)
+
+    # concatenate raw html lines
+    lines = ['<div class="ipynotebook">']
+    lines.append(header)
+    lines.append(body)
+    lines.append('</div>')
+    return '\n'.join(lines)
+
+def formatted_link(path):
+    return "`%s <%s>`__" % (os.path.basename(path), path)
+
+def visit_notebook_node(self, node):
+    self.visit_raw(node)
+
+def depart_notebook_node(self, node):
+    self.depart_raw(node)
+
+def setup(app):
+    setup.app = app
+    setup.config = app.config
+    setup.confdir = app.confdir
+
+    app.add_node(notebook_node,
+                 html=(visit_notebook_node, depart_notebook_node))
+
+    app.add_directive('notebook', NotebookDirective)

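Once the extension is enabled, embedding a notebook from a reST page is a
single directive (a sketch; ``my_notebook.ipynb`` is a hypothetical path,
resolved relative to the including document):

.. code-block:: rst

   .. notebook:: my_notebook.ipynb

The directive copies the notebook into the build tree, evaluates it with
runipy, and inlines the resulting HTML, along with download links for the
unevaluated, evaluated, and script versions.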
diff -r fbf2479e6f0c50b48b111cced08a09e6ea97af35 -r 5c26acf5e060030d7425265981240f7e37db4786 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -30,7 +30,7 @@
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx',
               'sphinx.ext.pngmath', 'sphinx.ext.viewcode',
               'sphinx.ext.autosummary', 'numpydocmod', 'youtube',
-              'yt_cookbook', 'yt_colormaps']
+              'yt_cookbook', 'yt_colormaps', 'notebook_sphinxext']
 
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['_templates']


https://bitbucket.org/yt_analysis/yt-doc/commits/945aaa353b3d/
Changeset:   945aaa353b3d
User:        ngoldbaum
Date:        2013-10-28 04:52:40
Summary:     Adding bootcamps to the docs proper.
Affected #:  14 files

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/Data_Inspection.ipynb
--- /dev/null
+++ b/source/bootcamp/Data_Inspection.ipynb
@@ -0,0 +1,396 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Starting Out and Loading Data\n",
+      "\n",
+      "We're going to get started by loading up yt.  This next command brings all of the libraries into memory and sets up our environment.  Note that in most scripts, you will want to import from ``yt.mods`` rather than ``yt.imods``.  But using ``yt.imods`` gets you some nice stuff for the IPython notebook, which we'll use below."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now that we've loaded yt, we can load up some data.  Let's load the `IsolatedGalaxy` dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Fields and Facts\n",
+      "\n",
+      "When you call the `load` function, yt tries to do very little -- this is designed to be a fast operation, just setting up some information about the simulation.  Now, the first time you access the \"hierarchy\" (shorthand is `.h`) it will read and load the mesh and then determine where data is placed in the physical domain and on disk.  Once it knows that, yt can tell you some statistics about the simulation:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.print_stats()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also tell you the fields it found on disk:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.field_list"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "And, all of the fields it thinks it knows how to generate:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.derived_field_list"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also transparently generate fields.  However, we encourage you to examine exactly what yt is doing when it generates those fields.  To see, you can ask for the source of a given field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.field_info[\"VorticityX\"].get_source()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt stores information about the domain of the simulation:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.domain_width"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also convert this into various units:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.domain_width * pf[\"kpc\"]\n",
+      "print pf.domain_width * pf[\"au\"]\n",
+      "print pf.domain_width * pf[\"miles\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Mesh Structure\n",
+      "\n",
+      "If you're using a simulation type that has grids (for instance, here we're using an Enzo simulation) you can examine the structure of the mesh.  For the most part, you probably won't have to use this unless you're debugging a simulation or examining in detail what is going on."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.grid_left_edge"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "But, you may have to access information about individual grid objects!  Each grid object mediates accessing data from the disk and has a number of attributes that tell you about it.  The hierarchy (`pf.h` here) has an attribute `grids` which is all of the grid objects."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.grids[0]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g = pf.h.grids[0]\n",
+      "print g"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Grids have dimensions, extents, level, and even a list of Child grids."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.ActiveDimensions"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.LeftEdge, g.RightEdge"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.Level"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.Children"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Advanced Grid Inspection\n",
+      "\n",
+      "If we want to examine grids only at a given level, we can!  Not only that, but we can load data and take a look at various fields.\n",
+      "\n",
+      "*This section can be skipped!*"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "gs = pf.h.select_grids(pf.h.max_level)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g2 = gs[0]\n",
+      "print g2\n",
+      "print g2.Parent\n",
+      "print g2.get_global_startindex()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print g2[\"Density\"][:,:,0]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print (g2.Parent.child_mask == 0).sum() * 8\n",
+      "print g2.ActiveDimensions.prod()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for f in pf.h.field_list:\n",
+      "    fv = g[f]\n",
+      "    if fv.size == 0: continue\n",
+      "    print f, fv.min(), fv.max()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for f in sorted(pf.h.field_list):\n",
+      "    fv = g[f]\n",
+      "    if fv.size == 0: continue\n",
+      "    print f, fv.min(), fv.max()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Examining Data in Regions\n",
+      "\n",
+      "yt provides data object selectors.  In subsequent notebooks we'll examine these in more detail, but we can select a sphere of data and perform a number of operations on it.  yt makes it easy to operate on fluid fields in an object in *bulk*, but you can also examine individual field values.\n",
+      "\n",
+      "This creates a sphere selector positioned at the most dense point in the simulation that has a radius of 10 kpc."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp = pf.h.sphere(\"max\", (10, 'kpc'))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can calculate a bunch of bulk quantities.  Here's that list, but there's a list in the docs, too!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp.quantities.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Let's look at the total mass.  This is how you call a given quantity.  yt calls these \"Derived Quantities\".  We'll talk about a few in a later notebook."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp.quantities[\"TotalMass\"]()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

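The same inspection steps pulled out of the notebook above as a plain script
(a sketch; assumes yt 2.x, the ``IsolatedGalaxy`` sample dataset, and a
``YT_DATA_DIR`` environment variable ending in a trailing slash, as the
notebook does):

.. code-block:: python

   import os
   from yt.mods import *  # script-friendly form of the notebook's yt.imods

   pf = load(os.environ["YT_DATA_DIR"] + "IsolatedGalaxy/galaxy0030/galaxy0030")

   # Accessing the hierarchy reads the mesh and locates the data on disk.
   pf.h.print_stats()
   print pf.h.field_list                # fields found on disk
   print pf.domain_width * pf["kpc"]    # domain width in kiloparsecs

   # A sphere at the most dense point with a 10 kpc radius, plus one of
   # the bulk "derived quantities" computed over it.
   sp = pf.h.sphere("max", (10, 'kpc'))
   print sp.quantities["TotalMass"]()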
diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/Data_Objects_and_Time_Series.ipynb
--- /dev/null
+++ b/source/bootcamp/Data_Objects_and_Time_Series.ipynb
@@ -0,0 +1,361 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Data Objects and Time Series Data\n",
+      "\n",
+      "Just like before, we will load up yt."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Time Series Data\n",
+      "\n",
+      "Unlike before, instead of loading a single dataset, this time we'll load a bunch which we'll examine in sequence.  This command creates a `TimeSeriesData` object, which can be iterated over (including in parallel, which is outside the scope of this bootcamp) and analyzed.  There are some other helpful operations it can provide, but we'll stick to the basics here.\n",
+      "\n",
+      "Note that you can specify either a list of filenames, or a glob (i.e., asterisk) pattern in this."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "ts = TimeSeriesData.from_filenames(os.environ[\"YT_DATA_DIR\"]+\"enzo_tiny_cosmology/*/*.hierarchy\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Example 1: Simple Time Series\n",
+      "\n",
+      "As a simple example of how we can use this functionality, let's find the min and max of the density as a function of time in this simulation.  To do this we use the construction `for pf in ts` where `pf` means \"Parameter File\" and `ts` is the \"Time Series\" we just loaded up.  For each parameter file, we'll create an object (`dd`) that covers the entire domain.  (`all_data` is a shorthand function for this.)  We'll then call the Derived Quantity `Extrema`, and append the min and max to our extrema outputs."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "rho_ex = []\n",
+      "times = []\n",
+      "for pf in ts:\n",
+      "    dd = pf.h.all_data()\n",
+      "    rho_ex.append(dd.quantities[\"Extrema\"](\"Density\")[0])\n",
+      "    times.append(pf.current_time * pf[\"years\"])\n",
+      "rho_ex = np.array(rho_ex)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we plot the minimum and the maximum:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.semilogy(times, rho_ex[:,0], '-xk')\n",
+      "pylab.semilogy(times, rho_ex[:,1], '-xr')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Example 2: Advanced Time Series\n",
+      "\n",
+      "Let's do something a bit different.  Let's calculate the total mass inside halos and outside halos.\n",
+      "\n",
+      "This actually touches a lot of different pieces of machinery in yt.  For every parameter file, we will run the halo finder HOP.  Then, we calculate the total mass in the domain.  Then, for each halo, we calculate the sum of the baryon mass in that halo.  We'll keep running tallies of these two things."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "mass = []\n",
+      "zs = []\n",
+      "for pf in ts:\n",
+      "    halos = HaloFinder(pf)\n",
+      "    dd = pf.h.all_data()\n",
+      "    total_mass = dd.quantities[\"TotalQuantity\"](\"CellMassMsun\")[0]\n",
+      "    total_in_baryons = 0.0\n",
+      "    for halo in halos:\n",
+      "        sp = halo.get_sphere()\n",
+      "        total_in_baryons += sp.quantities[\"TotalQuantity\"](\"CellMassMsun\")[0]\n",
+      "    mass.append(total_in_baryons/total_mass)\n",
+      "    zs.append(pf.current_redshift)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now let's plot them!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.loglog(zs, mass, '-xb')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Data Objects\n",
+      "\n",
+      "Time series data have many applications, but most of them rely on examining the underlying data in some way.  Below, we'll see how to use and manipulate data objects.\n",
+      "\n",
+      "### Ray Queries\n",
+      "\n",
+      "yt provides the ability to examine rays, or lines, through the domain.  Note that these are not periodic, unlike most other data objects.  We create a ray object and can then examine quantities of it.  Rays have the special fields `t` and `dts`, which correspond to the time the ray enters a given cell and the distance it travels through that cell.\n",
+      "\n",
+      "To create a ray, we specify the start and end points."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "ray = pf.h.ray([0.1, 0.2, 0.3], [0.9, 0.8, 0.7])\n",
+      "pylab.semilogy(ray[\"t\"], ray[\"Density\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"dts\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"t\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"x\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Slice Queries\n",
+      "\n",
+      "While slices are often used for visualization, they can be useful for other operations as well.  yt regards slices as multi-resolution objects.  They are an array of cells that are not all the same size; it only returns the cells at the highest resolution that it intersects.  (This is true for all yt data objects.)  Slices and projections have the special fields `px`, `py`, `pdx` and `pdy`, which correspond to the coordinates and half-widths in the pixel plane."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "v, c = pf.h.find_max(\"Density\")\n",
+      "sl = pf.h.slice(0, c[0])\n",
+      "print sl[\"x\"], sl[\"z\"], sl[\"pdx\"]\n",
+      "print sl[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to do something interesting with a Slice, we can turn it into a `FixedResolutionBuffer`.  This object can be queried and will return a 2D array of values."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "frb = sl.to_frb((50.0, 'kpc'), 1024)\n",
+      "print frb[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt provides a few functions for writing arrays to disk, particularly in image form.  Here we'll write out the log of Density, and then use IPython to display it back here.  Note that for the most part, you will probably want to use a `PlotWindow` for this, but in the case that it is useful you can directly manipulate the data."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "write_image(np.log10(frb[\"Density\"]), \"temp.png\")\n",
+      "from IPython.core.display import Image\n",
+      "Image(filename = \"temp.png\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Off-Axis Slices\n",
+      "\n",
+      "yt provides not only slices, but off-axis slices that are sometimes called \"cutting planes.\"  These are specified by (in order) a normal vector and a center.  Here we've set the normal vector to `[0.2, 0.3, 0.5]` and the center to be the point of maximum density.\n",
+      "\n",
+      "We can then turn these directly into plot windows using `to_pw`.  Note that the `to_pw` and `to_frb` methods are available on slices, off-axis slices, and projections, and can be used on any of them."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cp = pf.h.cutting([0.2, 0.3, 0.5], \"max\")\n",
+      "pw = cp.to_pw(fields = [\"Density\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once we have our plot window from our cutting plane, we can show it here."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pw.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can, as noted above, do the same with our slice:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pws = sl.to_pw(fields=[\"Density\"])\n",
+      "pws.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Covering Grids\n",
+      "\n",
+      "If we want to access a 3D array of data that spans multiple resolutions in our simulation, we can use a covering grid.  This will return a 3D array of data, drawing from up to the resolution level specified when creating the data.  For example, if you create a covering grid that spans two child grids of a single parent grid, it will fill those zones covered by a zone of a child grid with the data from that child grid.  Where it is covered only by the parent grid, the cells from the parent grid will be duplicated (appropriately) to fill the covering grid.\n",
+      "\n",
+      "There are two different types of covering grids: unsmoothed and smoothed.  Smoothed grids will be filled through a cascading interpolation process; they will be filled at level 0, interpolated to level 1, filled at level 1, interpolated to level 2, filled at level 2, etc.  This will help to reduce edge effects.  Unsmoothed covering grids will not be interpolated, but rather values will be duplicated multiple times.\n",
+      "\n",
+      "Here we create an unsmoothed covering grid at level 2, with the left edge at `[0.0, 0.0, 0.0]` and with dimensions equal to those that would cover the entire domain at level 2.  We can then ask for the Density field, which will be a 3D array."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cg = pf.h.covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "print cg[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this example, we do exactly the same thing: except we ask for a *smoothed* covering grid, which will reduce edge effects."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "scg = pf.h.smoothed_covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "print scg[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file
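
As a script-form aside, the ray and slice queries demonstrated in this
notebook compose naturally.  A minimal sketch, assuming ``YT_DATA_DIR``
points at the sample data and using only the yt 2.x calls shown above:

.. code-block:: python

   from yt.mods import *
   import os
   import pylab

   pf = load(os.environ["YT_DATA_DIR"] + "IsolatedGalaxy/galaxy0030/galaxy0030")

   # Sample Density along a (non-periodic) ray; "t" parametrizes the
   # position along the ray at which each cell is entered.
   ray = pf.h.ray([0.1, 0.2, 0.3], [0.9, 0.8, 0.7])
   pylab.semilogy(ray["t"], ray["Density"])
   pylab.savefig("ray_density.png")

   # Turn a slice through the density maximum into a fixed-resolution
   # 1024x1024 buffer and write the log of Density out as an image.
   v, c = pf.h.find_max("Density")
   sl = pf.h.slice(0, c[0])
   frb = sl.to_frb((50.0, 'kpc'), 1024)
   write_image(na.log10(frb["Density"]), "slice_density.png")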

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/Derived_Fields_and_Profiles.ipynb
--- /dev/null
+++ b/source/bootcamp/Derived_Fields_and_Profiles.ipynb
@@ -0,0 +1,316 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Derived Fields and Profiles\n",
+      "\n",
+      "One of the most powerful features in yt is the ability to create derived fields that act and look exactly like fields that exist on disk.  This means that they will be generated on demand and can be used anywhere a field that exists on disk would be used.  Additionally, you can create them by just writing python functions."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [],
+     "prompt_number": 1
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Derived Fields\n",
+      "\n",
+      "This is an example of the simplest possible way to create a derived field.  All derived fields are defined by a function and some metadata; that metadata can include units, LaTeX-friendly names, conversion factors, and so on.  Fields can be defined in the way in the next cell.  What this does is create a function which accepts two arguments and then provide the units for that field.  In this case, our field is `Dinosaurs` and our units are `Trex/s`.  The function itself can access any fields that are in the simulation, and it does so by requesting data from the object called `data`."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "@derived_field(units = \"Trex/s\")\n",
+      "def Dinosaurs(field, data):\n",
+      "    return data[\"Density\"]**(2.0/3.0) * data[\"VelocityMagnitude\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [],
+     "prompt_number": 2
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "One important thing to note is that derived fields must be defined *before* any datasets are loaded.  Let's load up our data and take a look at some quantities."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "dd = pf.h.all_data()\n",
+      "print dd.quantities.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [
+      {
+       "output_type": "stream",
+       "stream": "stdout",
+       "text": [
+        "['MinLocation', 'StarAngularMomentumVector', 'WeightedVariance', 'TotalMass', 'AngularMomentumVector', 'TotalQuantity', 'IsBound', 'WeightedAverageQuantity', 'CenterOfMass', 'BulkVelocity', 'ParticleSpinParameter', 'Action', 'Extrema', 'MaxLocation', 'BaryonSpinParameter']\n"
+       ]
+      }
+     ],
+     "prompt_number": 4
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "One interesting question is, what are the minimum and maximum values of dinosaur production rates in our isolated galaxy?  We can do that by examining the `Extrema` quantity -- the exact same way that we would for Density, Temperature, and so on."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print dd.quantities[\"Extrema\"](\"Dinosaurs\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [
+      {
+       "output_type": "stream",
+       "stream": "stdout",
+       "text": [
+        "[(2.2146366774504352e-20, 9.1573883828992124e-09)]\n"
+       ]
+      }
+     ],
+     "prompt_number": 5
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can do the same for the average quantities as well."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print dd.quantities[\"WeightedAverageQuantity\"](\"Dinosaurs\", weight=\"Temperature\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## A Few Other Quantities\n",
+      "\n",
+      "We can ask other quantities of our data, as well.  For instance, this sequence of operations will find the most dense point, center a sphere on it, calculate the bulk velocity of that sphere, calculate the baryonic angular momentum vector, and then the density extrema.  All of this is done in a memory conservative way: if you have an absolutely enormous dataset, yt will split that dataset into pieces, apply intermediate reductions and then a final reduction to calculate your quantity."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp = pf.h.sphere(\"max\", (10.0, 'kpc'))\n",
+      "bv = sp.quantities[\"BulkVelocity\"]()\n",
+      "L = sp.quantities[\"AngularMomentumVector\"]()\n",
+      "(rho_min, rho_max), = sp.quantities[\"Extrema\"](\"Density\")\n",
+      "print bv, L, rho_min, rho_max"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Profiles\n",
+      "\n",
+      "yt provides the ability to bin in 1, 2 and 3 dimensions.  This means discretizing in one or more dimensions of phase space (density, temperature, etc) and then calculating either the total value of a field in each bin or the average value of a field in each bin.\n",
+      "\n",
+      "We do this using the objects `BinnedProfile1D`, `BinnedProfile2D`, and `BinnedProfile3D`.  The first two are the most common since they are the easiest to visualize.\n",
+      "\n",
+      "This first set of commands manually creates a `BinnedProfile1D` from the sphere we created earlier, binned in 32 bins according to density between `rho_min` and `rho_max`, and then takes the Density-weighted average of the fields `Temperature` and (previously-defined) `Dinosaurs`.  We then plot it in a loglog plot."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof = BinnedProfile1D(sp, 32, \"Density\", rho_min, rho_max)\n",
+      "prof.add_fields([\"Temperature\", \"Dinosaurs\"], weight=\"Density\")\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"Temperature\"], \"-x\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we plot the `Dinosaurs` field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.loglog(prof[\"Density\"], prof[\"Dinosaurs\"], '-x')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to see the total mass in every bin, we add the `CellMassMsun` field with no weight.  Specifying `weight=None` will simply take the total value in every bin and add that up."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof.add_fields([\"CellMassMsun\"], weight=None)\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"CellMassMsun\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can also specify accumulation, which sums all the bins, from left to right.  Note that for 2D and 3D profiles, this needs to be a tuple of length 2 or 3."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof.add_fields([\"CellMassMsun\"], weight=None, accumulation=True)\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"CellMassMsun\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Advanced Derived Fields\n",
+      "\n",
+      "*This section can be skipped!*\n",
+      "\n",
+      "You can also define fields that require extra zones.  This is useful, for instance, if you want to take the average, or apply a stencil.  yt provides fields like `DivV` that do this internally.  This example is a very busy example of how to do it.  You need to specify the validator `ValidateSpatial` with the number of extra zones *on each side* of the grid that you need, and then inside your function you need to return a field *with those zones stripped off*.  So by necessity, the arrays returned by `data[something]` will have larger spatial extent than what should be returned by the function itself.  If you specify that you need 0 extra zones, this will also work and will simply supply a `grid` object for the field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "@derived_field(name = \"AveragedTemperature\",\n",
+      "               validators = [ValidateSpatial(1)],\n",
+      "               units = r\"K\")\n",
+      "def _AveragedTemperature(field, data):\n",
+      "    nx, ny, nz = data[\"Temperature\"].shape\n",
+      "    new_field = na.zeros((nx-2,ny-2,nz-2), dtype='float64')\n",
+      "    weight_field = na.zeros((nx-2,ny-2,nz-2), dtype='float64')\n",
+      "    i_i, j_i, k_i = na.mgrid[0:3,0:3,0:3]\n",
+      "    for i,j,k in zip(i_i.ravel(),j_i.ravel(),k_i.ravel()):\n",
+      "        sl = [slice(i,nx-(2-i)),slice(j,ny-(2-j)),slice(k,nz-(2-k))]\n",
+      "        new_field += data[\"Temperature\"][sl] * data[\"CellMass\"][sl]\n",
+      "        weight_field += data[\"CellMass\"][sl]\n",
+      "    # Now some fancy footwork\n",
+      "    new_field2 = na.zeros((nx,ny,nz))\n",
+      "    new_field2[1:-1,1:-1,1:-1] = new_field/weight_field\n",
+      "    return new_field2"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now, once again, we can access `AveragedTemperature` just like any other field.  Note that because it requires ghost zones, this will be a much slower process!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "dd = pf.h.all_data()\n",
+      "(tmin, tmax), (atmin, atmax) = dd.quantities[\"Extrema\"]([\"Temperature\", \"AveragedTemperature\"])\n",
+      "print tmin, tmax, atmin, atmax\n",
+      "print tmin / atmin, tmax / atmax"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Field Parameters\n",
+      "\n",
+      "Field parameters are a method of passing information to derived fields.  For instance, you might pass in information about a vector you want to use as a basis for a coordinate transformation.  yt often uses things like `bulk_velocity` to identify velocities that should be subtracted off.  Here we show how that works:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp_small = pf.h.sphere(\"max\", (1.0, 'kpc'))\n",
+      "bv = sp_small.quantities[\"BulkVelocity\"]()\n",
+      "\n",
+      "sp = pf.h.sphere(\"max\", (0.1, 'mpc'))\n",
+      "rv1 = sp.quantities[\"Extrema\"](\"RadialVelocity\")\n",
+      "\n",
+      "sp.clear_data()\n",
+      "sp.set_field_parameter(\"bulk_velocity\", bv)\n",
+      "rv2 = sp.quantities[\"Extrema\"](\"RadialVelocity\")\n",
+      "\n",
+      "print bv\n",
+      "print rv1\n",
+      "print rv2"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file
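
The field parameter machinery shown at the end of this notebook can also be
read from inside your own derived fields via ``get_field_parameter``.  A
hedged sketch -- the field name below is made up for illustration and is not
part of yt's field registry:

.. code-block:: python

   from yt.mods import *
   import os

   # Hypothetical derived field: subtract the x-component of whatever
   # "bulk_velocity" has been set on the data object, defaulting to zero
   # if the parameter has not been supplied.
   @derived_field(name="XVelocityMinusBulk", units=r"\rm{cm}/\rm{s}")
   def _XVelocityMinusBulk(field, data):
       bv = data.get_field_parameter("bulk_velocity")
       if bv is None:
           bv = na.zeros(3)
       return data["x-velocity"] - bv[0]

   pf = load(os.environ["YT_DATA_DIR"] + "IsolatedGalaxy/galaxy0030/galaxy0030")
   sp = pf.h.sphere("max", (10.0, 'kpc'))
   sp.set_field_parameter("bulk_velocity", sp.quantities["BulkVelocity"]())
   print sp.quantities["Extrema"]("XVelocityMinusBulk")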

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/Introduction.ipynb
--- /dev/null
+++ b/source/bootcamp/Introduction.ipynb
@@ -0,0 +1,76 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Welcome to the yt bootcamp!\n",
+      "\n",
+      "In this brief tutorial, we'll go over how to load up data, analyze things, inspect your data, and make some visualizations.\n",
+      "\n",
+      "But, before we begin, there are a few places to go if you run into trouble.\n",
+      "\n",
+      "**The yt homepage is at http://yt-project.org/**\n",
+      "\n",
+      "## Source of Help\n",
+      "\n",
+      "There are three places to check for help:\n",
+      "\n",
+      " * The documentation: http://yt-project.org/doc/\n",
+      " * The IRC Channel (`#yt` on `chat.freenode.net`, also at http://yt-project.org/irc.html)\n",
+      " * The `yt-users` mailing list, at http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org\n",
+      "\n",
+      "## Sources of Information\n",
+      "\n",
+      "The first place to go for information about any kind of development is BitBucket at https://bitbucket.org/yt_analysis/yt/ , which contains a bug tracker, the source code, and links to other useful places.\n",
+      "\n",
+      "You can find recipes in the documentation ( http://yt-project.org/doc/ ) under the \"Cookbook\" section.\n",
+      "\n",
+      "There is a portal with access to data and IPython notebooks at http://hub.yt-project.org/ .\n",
+      "\n",
+      "## How to Update yt\n",
+      "\n",
+      "If you ever run into a situation where you need to update your yt installation, simply type this on the command line:\n",
+      "\n",
+      "`yt update`\n",
+      "\n",
+      "This will automatically update it for you.\n",
+      "\n",
+      "## Acquiring the datasets for this tutorial\n",
+      "\n",
+      "To access the datasets that are used in these bootcamp tutorials, you can either download them manually at http://yt-project.org/data/, or run this next cell by pressing `Shift-Enter` inside it.  It may take a few minutes.\n",
+      "\n",
+      "## What's Next?\n",
+      "\n",
+      "The Notebooks are meant to be explored in this order:\n",
+      "\n",
+      "1. Introduction\n",
+      "2. Data Inspection\n",
+      "3. Simple Visualization\n",
+      "4. Data Objects and Time Series\n",
+      "5. Derived Fields and Profiles\n",
+      "6. Volume Rendering"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "!export YT_DATA_DIR=$HOME/Documents/test/"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/Simple_Visualization.ipynb
--- /dev/null
+++ b/source/bootcamp/Simple_Visualization.ipynb
@@ -0,0 +1,274 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Simple Visualizations of Data\n",
+      "\n",
+      "Just like in our first notebook, we have to load yt and then some data."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For this notebook, we'll load up a cosmology dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "print \"Redshift =\", pf.current_redshift"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In the terms that yt uses, a projection is a line integral through the domain.  This can either be unweighted (in which case a column density is returned) or weighted, in which case an average value is returned.  Projections are, like all other data objects in yt, full-fledged data objects that churn through data and present that to you.  However, we also provide a simple method of creating Projections and plotting them in a single step.  This is called a Plot Window, here specifically known as a `ProjectionPlot`.  One thing to note is that in yt, we project all the way through the entire domain at a single time.  This means that the first call to projecting can be somewhat time consuming, but panning, zooming and plotting are all quite fast.\n",
+      "\n",
+      "yt is designed to make it easy to make nice plots and straightforward to modify those plots directly.  The cookbook in the documentation includes detailed examples of this."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p = ProjectionPlot(pf, \"y\", \"Density\")\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The `show` command simply sends the plot to the IPython notebook.  You can also call `p.save()` which will save the plot to the file system.  This function accepts an argument, which will be pre-prended to the filename and can be used to name it based on the width or to supply a location.\n",
+      "\n",
+      "Now we'll zoom and pan a bit."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(2.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.pan_rel((0.1, 0.0))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(10.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.pan_rel((-0.25, -0.5))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(0.1)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we specify multiple fields, each time we call `show` we get multiple plots back.  Same for `save`!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p = ProjectionPlot(pf, \"z\", [\"Density\", \"Temperature\"], weight_field=\"Density\")\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can adjust the colormap on a field-by-field basis."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.set_cmap(\"Temperature\", \"hot\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "And, we can re-center the plot on different locations.  One possible use of this would be to make a single `ProjectionPlot` which you move around to look at different regions in your simulation, saving at each one."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "v, c = pf.h.find_max(\"Density\")\n",
+      "p.set_center((c[0], c[1]))\n",
+      "p.zoom(10)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Okay, let's load up a bigger simulation (from `Enzo_64` this time) and make a slice plot."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"Enzo_64/DD0043/data0043\")\n",
+      "s = SlicePlot(pf, \"z\", [\"Density\", \"VelocityMagnitude\"], center=\"max\")\n",
+      "s.set_cmap(\"VelocityMagnitude\", \"kamae\")\n",
+      "s.zoom(10.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can adjust the logging of various fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.set_log(\"VelocityMagnitude\", True)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt provides many different annotations for your plots.  You can see all of these in the documentation, or if you type `s.annotate_` and press tab, a list will show up here.  We'll annotate with velocity arrows."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.annotate_velocity()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Contours can also be overlaid:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s = SlicePlot(pf, \"x\", [\"Density\"], center=\"max\")\n",
+      "s.annotate_contour(\"Temperature\")\n",
+      "s.zoom(2.5)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally, we can save out to the file system."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.save()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file
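
The re-centering note above suggests a simple scripting pattern: reuse a
single plot window, adjusting and saving as you go.  A minimal sketch,
assuming ``YT_DATA_DIR`` points at the sample data used in this notebook:

.. code-block:: python

   from yt.mods import *
   import os

   pf = load(os.environ["YT_DATA_DIR"] + "enzo_tiny_cosmology/DD0046/DD0046")
   p = ProjectionPlot(pf, "y", "Density")
   for i in range(5):
       p.zoom(2.0)              # zoom in by a factor of two each step
       p.save("zoom_%02d" % i)  # the argument is prepended to the filename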

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/Volume_Rendering.ipynb
--- /dev/null
+++ b/source/bootcamp/Volume_Rendering.ipynb
@@ -0,0 +1,95 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# A Brief Demo of Volume Rendering\n",
+      "\n",
+      "This shows a small amount of volume rendering.  Really, just enough to get your feet wet!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To create a volume rendering, we need a camera and a transfer function.  We'll use the `ColorTransferFunction`, which accepts (in log space) the minimum and maximum bounds of our transfer function.  This means behavior for data outside these values is undefined.\n",
+      "\n",
+      "We then add on \"layers\" like an onion.  This function can accept a width (here specified) in data units, and also a color map.  Here we add on four layers.\n",
+      "\n",
+      "Finally, we create a camera.  The focal point is `[0.5, 0.5, 0.5]`, the width is 20 kpc (including front-to-back integration) and we specify a transfer function.  Once we've done that, we call `show` to actually cast our rays and display them inline."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "tf = ColorTransferFunction((-28, -24))\n",
+      "tf.add_layers(4, w=0.01)\n",
+      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 20.0/pf['kpc'], 512, tf)\n",
+      "cam.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to apply a clipping, we can specify the `clip_ratio`.  This will clip the upper bounds to this value times the `std()` of the image array."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cam.show(clip_ratio=4)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "There are several other options we can specify.  Note that here we have turned on the use of ghost zones, shortened the data interval for the transfer function, and widened our gaussian layers."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "tf = ColorTransferFunction((-28, -25))\n",
+      "tf.add_layers(4, w=0.03)\n",
+      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 20.0/pf['kpc'], 512, tf, no_ghost=False)\n",
+      "cam.show(clip_ratio=4.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file
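
For scripted (rather than inline) use, the camera's ``snapshot`` method
casts the rays and writes the image straight to disk.  A short sketch
under the same assumptions as the notebook above:

.. code-block:: python

   from yt.mods import *
   import os

   pf = load(os.environ["YT_DATA_DIR"] + "IsolatedGalaxy/galaxy0030/galaxy0030")
   tf = ColorTransferFunction((-28, -24))
   tf.add_layers(4, w=0.01)
   cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 20.0/pf['kpc'], 512, tf)
   # clip_ratio behaves as in cam.show(): clip at this many standard deviations.
   cam.snapshot("volume_render.png", clip_ratio=4.0)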

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/data_inspection.rst
--- /dev/null
+++ b/source/bootcamp/data_inspection.rst
@@ -0,0 +1,4 @@
+Data Inspection
+---------------
+
+.. notebook:: Data_Inspection.ipynb

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/data_objects_and_time_series.rst
--- /dev/null
+++ b/source/bootcamp/data_objects_and_time_series.rst
@@ -0,0 +1,4 @@
+Data Objects and Time Series
+----------------------------
+
+.. notebook:: Data_Objects_and_Time_Series.ipynb

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/derived_fields_and_profiles.rst
--- /dev/null
+++ b/source/bootcamp/derived_fields_and_profiles.rst
@@ -0,0 +1,4 @@
+Derived Fields and Profiles
+---------------------------
+
+.. notebook:: Derived_Fields_and_Profiles.ipynb

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/index.rst
--- /dev/null
+++ b/source/bootcamp/index.rst
@@ -0,0 +1,49 @@
+yt Bootcamp
+===========
+
+We have been developing a sequence of materials that can be run in the IPython
+notebook and that walk through how to look at and operate on data.
+These are not meant to be detailed walkthroughs, but simply short
+introductions.  Their purpose is to let you explore, interactively, some common
+operations that can be done on data with yt!
+
+To get started with the bootcamp, you need to download the repository and start
+the IPython notebook.  If you have mercurial installed, the easiest way to get
+the repository is to run:
+
+.. code-block:: bash
+
+   hg clone https://bitbucket.org/yt_analysis/yt-doc
+
+If you don't have mercurial, you can download the repository from `here
+<https://bitbucket.org/yt_analysis/yt-doc/get/tip.tar.bz2>`_.
+
+Now you can start the IPython notebook and begin:
+
+.. code-block:: bash
+
+   cd yt-doc/source/bootcamp
+   yt notebook
+
+This command will give you information about the Notebook Server and how to
+access it.  Once you have done so, choose "Introduction" from the list of
+notebooks; that notebook includes an overview and information about how to download
+the sample data.
+
+.. warning:: The pre-filled notebooks are *far* less fun than running them
+             yourself!  Check out the repo and give it a try.
+
+Here are the notebooks, which have been filled in for inspection:
+
+.. toctree::
+   :maxdepth: 1
+
+   introduction
+   data_inspection
+   simple_visualization
+   data_objects_and_time_series
+   derived_fields_and_profiles
+   volume_rendering
+
+Let us know if you would like to contribute other example notebooks, or have
+any suggestions for how these can be improved.

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/introduction.rst
--- /dev/null
+++ b/source/bootcamp/introduction.rst
@@ -0,0 +1,4 @@
+Introduction
+------------
+
+.. notebook:: Introduction.ipynb

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/simple_visualization.rst
--- /dev/null
+++ b/source/bootcamp/simple_visualization.rst
@@ -0,0 +1,4 @@
+Simple Visualization
+--------------------
+
+.. notebook:: Simple_Visualization.ipynb

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/bootcamp/volume_rendering.rst
--- /dev/null
+++ b/source/bootcamp/volume_rendering.rst
@@ -0,0 +1,4 @@
+Volume Rendering
+----------------
+
+.. notebook:: Volume_Rendering.ipynb

diff -r 5c26acf5e060030d7425265981240f7e37db4786 -r 945aaa353b3db63e91e0e0434686a4a02c27bcfd source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -113,7 +113,7 @@
 
    welcome/index
    orientation/index
-   bootcamp
+   bootcamp/index
    workshop
    help/index
    interacting/index


https://bitbucket.org/yt_analysis/yt-doc/commits/d6c52ef557d5/
Changeset:   d6c52ef557d5
User:        ngoldbaum
Date:        2013-10-29 10:15:35
Summary:     Adding two new extensions and updating the narrative plotting docs.

This adds two new sphinx directives, notebook-cell and python-script.

The former evaluates the content of the directive and embeds html for an
evaluated IPython notebook cell.

The latter evaluates the content of the directive and embeds a literalinclude
of the script itself as well as (for now) any images created by the directive.
This could be straightforwardly extended to include data files and stdout as well.
Affected #:  5 files

diff -r d6f0c6509ea8a70d0d5b23290d9a4c39c8c66c6a -r d6c52ef557d52edc5a10ebd8685bd8a6d33457d9 extensions/notebook_sphinxext.py
--- a/extensions/notebook_sphinxext.py
+++ b/extensions/notebook_sphinxext.py
@@ -4,7 +4,6 @@
 from docutils.parsers.rst import directives
 from IPython.nbconvert import html, python
 from runipy.notebook_runner import NotebookRunner
-from jinja2 import FileSystemLoader
 
 class NotebookDirective(Directive):
     """Insert an evaluated notebook into a document
@@ -52,13 +51,7 @@
         f.write(script_text.encode('utf8'))
         f.close()
 
-        # Create evaluated version and save it to the dest path.
-        # Always use --pylab so figures appear inline
-        # perhaps this is questionable?
-        nb_runner = NotebookRunner(nb_in=nb_abs_path, pylab=True)
-        nb_runner.run_notebook()
-        nb_runner.save_notebook(dest_path_eval)
-        evaluated_text = nb_to_html(dest_path_eval)
+        evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval)
 
         # Create link to notebook and script files
         link_rst = "(" + \
@@ -71,7 +64,7 @@
 
         # create notebook node
         attributes = {'format': 'html', 'source': 'nb_path'}
-        nb_node = nodes.raw('', evaluated_text, **attributes)
+        nb_node = notebook_node('', evaluated_text, **attributes)
         (nb_node.source, nb_node.line) = \
             self.state_machine.get_source_and_line(self.lineno)
 
@@ -85,6 +78,8 @@
 
         return [nb_node]
 
+
+
 class notebook_node(nodes.raw):
     pass
 
@@ -134,6 +129,20 @@
     lines.append('</div>')
     return '\n'.join(lines)
 
+def evaluate_notebook(nb_path, dest_path=None):
+    # Create evaluated version and save it to the dest path.
+    # Always use --pylab so figures appear inline
+    # perhaps this is questionable?
+    nb_runner = NotebookRunner(nb_in=nb_path, pylab=True)
+    nb_runner.run_notebook()
+    if dest_path is None:
+        dest_path = 'temp_evaluated.ipynb'
+    nb_runner.save_notebook(dest_path)
+    ret = nb_to_html(dest_path)
+    if dest_path == 'temp_evaluated.ipynb':
+        os.remove(dest_path)
+    return ret
+
 def formatted_link(path):
     return "`%s <%s>`__" % (os.path.basename(path), path)
 

diff -r d6f0c6509ea8a70d0d5b23290d9a4c39c8c66c6a -r d6c52ef557d52edc5a10ebd8685bd8a6d33457d9 extensions/notebookcell_sphinxext.py
--- /dev/null
+++ b/extensions/notebookcell_sphinxext.py
@@ -0,0 +1,64 @@
+import os, shutil, string, glob, io
+from sphinx.util.compat import Directive
+from docutils.parsers.rst import directives
+from IPython.nbconvert import html, python
+from IPython.nbformat import current
+from runipy.notebook_runner import NotebookRunner
+from jinja2 import FileSystemLoader
+from notebook_sphinxext import \
+    notebook_node, nb_to_html, nb_to_python, \
+    visit_notebook_node, depart_notebook_node, \
+    evaluate_notebook
+
+class NotebookCellDirective(Directive):
+    """Insert an evaluated notebook cell into a document
+
+    This uses runipy and nbconvert to transform an inline python
+    script into html suitable for embedding in a Sphinx document.
+    """
+    required_arguments = 0
+    optional_arguments = 0
+    has_content = True
+
+    def run(self):
+        # check if raw html is supported
+        if not self.state.document.settings.raw_enabled:
+            raise self.warning('"%s" directive disabled.' % self.name)
+
+        # Construct notebook from cell content
+        content = "\n".join(self.content)
+        with open("temp.py", "w") as f:
+            f.write(content)
+
+        convert_to_ipynb('temp.py', 'temp.ipynb')
+
+        evaluated_text = evaluate_notebook('temp.ipynb')
+
+        # create notebook node
+        attributes = {'format': 'html', 'source': 'nb_path'}
+        nb_node = notebook_node('', evaluated_text, **attributes)
+        (nb_node.source, nb_node.line) = \
+            self.state_machine.get_source_and_line(self.lineno)
+
+        # clean up
+        files = glob.glob("*.png") + ['temp.py', 'temp.ipynb']
+        for file in files:
+            os.remove(file)
+
+        return [nb_node]
+
+def setup(app):
+    setup.app = app
+    setup.config = app.config
+    setup.confdir = app.confdir
+
+    app.add_node(notebook_node,
+                 html=(visit_notebook_node, depart_notebook_node))
+
+    app.add_directive('notebook-cell', NotebookCellDirective)
+
+def convert_to_ipynb(py_file, ipynb_file):
+    with io.open(py_file, 'r', encoding='utf-8') as f:
+        notebook = current.reads(f.read(), format='py')
+    with io.open(ipynb_file, 'w', encoding='utf-8') as f:
+        current.write(notebook, f, format='ipynb')

diff -r d6f0c6509ea8a70d0d5b23290d9a4c39c8c66c6a -r d6c52ef557d52edc5a10ebd8685bd8a6d33457d9 extensions/pythonscript_sphinxext.py
--- /dev/null
+++ b/extensions/pythonscript_sphinxext.py
@@ -0,0 +1,80 @@
+from sphinx.util.compat import Directive
+from subprocess import Popen, PIPE
+from docutils.parsers.rst import directives
+from docutils import nodes
+import os, glob, shutil, uuid, re
+
+class PythonScriptDirective(Directive):
+    """Execute an inline python script and display images.
+
+    This uses exec to execute an inline python script, copies
+    any images produced by the script, and embeds them in the document
+    along with the script.
+
+    """
+    required_arguments = 0
+    optional_arguments = 0
+    has_content = True
+
+    def run(self):
+        # Construct paths
+        rst_file = self.state_machine.document.attributes['source']
+        rst_dir = os.path.abspath(os.path.dirname(rst_file))
+        source_dir = os.path.dirname(
+            os.path.abspath(self.state.document.current_source))
+        rel_dir = os.path.relpath(rst_dir, setup.confdir)
+        dest_dir = os.path.abspath(os.path.join(setup.app.builder.outdir,
+                                                source_dir))
+
+        if not os.path.exists(dest_dir):
+            os.makedirs(dest_dir)  # create the output directory if it does not exist
+
+        # Construct script from cell content
+        content = "\n".join(self.content)
+        with open("temp.py", "w") as f:
+            f.write(content)
+
+        # Use sphinx logger?
+        print ""
+        print content
+        print ""
+
+        codeproc = Popen(['python', 'temp.py'], stdout=PIPE)
+        out = codeproc.stdout.read()
+
+        images = sorted(glob.glob("*.png"))
+        fns = []
+        for im in images:
+            fns.append(str(uuid.uuid4()) + ".png")
+            shutil.move(im, os.path.join(dest_dir, fns[-1]))
+
+        os.remove("temp.py")
+
+        code = content
+
+        literal = nodes.literal_block(code,code)
+        literal['language'] = 'python'
+
+        images = []
+        for fn in fns:
+            images.append(nodes.image(uri="./"+fn, width="600px"))
+        return [literal] + images
+
+def setup(app):
+    app.add_directive('python-script', PythonScriptDirective)
+    setup.app = app
+    setup.config = app.config
+    setup.confdir = app.confdir
+
+    app.connect('build-finished', cleanup)
+
+# http://stackoverflow.com/questions/136505/searching-for-uuids-in-text-with-regex
+PATTERN = \
+    "[a-fA-F0-9]{8}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{12}"
+
+def cleanup(app, exception):
+    """ Cleanup all png files with UUID filenames in the source """ 
+    for root,dirnames,filenames in os.walk(app.srcdir):
+        matches = re.findall(PATTERN, "\n".join(filenames))
+        for match in matches:
+            os.remove(os.path.join(root, match+".png"))

diff -r d6f0c6509ea8a70d0d5b23290d9a4c39c8c66c6a -r d6c52ef557d52edc5a10ebd8685bd8a6d33457d9 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -33,12 +33,14 @@
 
 if not on_rtd:
     extensions.append('sphinx.ext.autosummary')
+    extensions.append('pythonscript_sphinexext')
 
 try:
     import runipy
     import IPython.nbconvert.utils.pandoc
     if not on_rtd:
         extensions.append('notebook_sphinxext')
+        extensions.append('notebookcell_sphinxext')
 except ImportError:
     pass
 

diff -r d6f0c6509ea8a70d0d5b23290d9a4c39c8c66c6a -r d6c52ef557d52edc5a10ebd8685bd8a6d33457d9 source/visualizing/plots.rst
--- a/source/visualizing/plots.rst
+++ b/source/visualizing/plots.rst
@@ -1,3 +1,4 @@
+
 .. _how-to-make-plots:
 
 How to Make Plots
@@ -51,34 +52,34 @@
 slice along, a field to plot, and, optionally, a coordinate to center
 the plot on and the width of the plot window.  For example:
 
-.. code-block:: python
+.. python-script::
 
    from yt.mods import *
-   pf = load("RedshiftOutput0005")
-   slc = SlicePlot(pf, 'z', 'Density', [0.2,0.3,0.8], (20,'kpc'))
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'Density', center=[0.53, 0.53, 0.53], width=(20,'kpc'))
    slc.save()
 
 The above example will display an annotated plot of a slice of the
 Density field in a 20 kpc square window centered on the coordinate
-(0.2,0.3) in the x-y plane.  The axis to slice along is keyed to the
+(0.53,0.53) in the x-y plane.  The axis to slice along is keyed to the
 letter 'z', corresponding to the z-axis.  Finally, the image is saved to
 a png file.
 
 Conceptually, you can think of the SlicePlot as an adjustable window
 into the data. For example:
 
-.. code-block:: python
+.. python-script::
 
    from yt.mods import *
-   pf = load("RedshiftOutput0005")
-   slc = SlicePlot(pf, 'z','Pressure')
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z','Pressure', center=[0.53, 0.53, 0.53])
    slc.save()
-   slc.zoom(10)
+   slc.zoom(30)
    slc.save('zoom')
 
-will save a rendering of the pressure field in a slice along the z
+will save a slice of the pressure field along the z
 axis across the entire simulation domain followed by another plot that
-is zoomed in by a factor of 10 with respect to the original
+is zoomed in by a factor of 30 with respect to the original
 image. With these sorts of manipulations, one can easily pan and zoom
 onto an interesting region in the simulation and adjust the
 boundaries of the region to visualize on the fly.
@@ -88,16 +89,16 @@
 and many other annotations, including user-customizable annotations.
 For example:
 
-.. code-block:: python
+.. python-script::
 
    from yt.mods import *
-   pf = load("RedshiftOutput0005")
-   slc = SlicePlot(pf, 'x', 'VorticitySquared',width=(10,'au'),center='max')
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
    slc.annotate_grids()
    slc.save()
 
-will plot the VorticitySquared in a 10 au slice through the z-axis
-centered on the highest density point in the simulation domain.
+will plot the VorticitySquared in a 10 kiloparsec slice along the
+z-axis centered on the highest density point in the simulation domain.
 Before saving the plot, the script annotates it with the grid
 boundaries, which are drawn as thick black lines by default.
 
@@ -117,11 +118,12 @@
 :class:`~yt.visualization.plot_window.ProjectionPlot` object.  For
 example:
 
-.. code-block:: python
+.. python-script::
  
    from yt.mods import *
-   pf = load("RedshiftOutput0005")
-   prj = ProjectionPlot(pf,0,'Density',weight_field=None,max_level=2)
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   prj = ProjectionPlot(pf, 2, 'Density', center=[0.53, 0.53, 0.53],
+                        width=(25, 'kpc'), weight_field=None)
    prj.save()
 
 will create a projection of the Density field along the z axis, plot it,
@@ -147,13 +149,14 @@
 instantiated by specifying a parameter file, the normal to the cutting
 plane, and the name of the fields to plot.  For example:
 
-.. code-block:: python
+.. python-script::
 
    from yt.mods import *
-   pf = load("RedshiftOutput0005")
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
    L = [1,1,0] # vector normal to cutting plane
-   north_vector = [1,-1,0]
-   cut = OffAxisSlicePlot(pf,L,'Density',north_vector=north_vector)
+   north_vector = [-1,1,0]
+   cut = OffAxisSlicePlot(pf, L, 'Density', width=(25, 'kpc'),
+                          center=[0.53, 0.53, 0.53], north_vector=north_vector)
    cut.save()
 
 creates an off-axis slice in the plane perpendicular to ``L``,
@@ -187,13 +190,14 @@
 used in custom plots.  This snippet creates an off axis
 projection through a simulation.
 
-.. code-block:: python
+.. python-script::
 
    from yt.mods import *
-   pf = load("RedshiftOutput0005")
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
    L = [1,1,0] # vector normal to cutting plane
-   north_vector = [1,-1,0]
-   W = [0.2, 0.2, 0.2]
+   north_vector = [-1,1,0]
+   W = [0.02, 0.02, 0.02]
+   c = [0.53, 0.53, 0.53]
    N = 512
    image = off_axis_projection(pf, c, L, W, N, "Density")
    write_image(na.log10(image), "%s_offaxis_projection.png" % pf)
@@ -208,13 +212,15 @@
 ``OffAxisSlicePlot``, requiring only an open dataset, a direction
 to project along, and a field to project.  For example:
 
-.. code-block:: python
+.. python-script::
 
    from yt.mods import *
-   pf = load("RedshiftOutput0005")
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
    L = [1,1,0] # vector normal to cutting plane
-   north_vector = [1,-1,0]
-   prj = OffAxisProjectionPlot(pf,L,'Density',north_vector=north_vector)
+   north_vector = [-1,1,0]
+   prj = OffAxisProjectionPlot(pf,L,'Density',width=(25, 'kpc'), 
+                               center=[0.53, 0.53, 0.53], 
+                               north_vector=north_vector)
    prj.save()
 
 OffAxisProjectionPlots can also be created with a number of
@@ -255,7 +261,7 @@
 .. code-block:: python
 
    from yt.mods import *
-   pf = load("RedshiftOutput0005")
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
    pc = PlotCollection(pf, 'c')
 
 will load the data, then proceed to construct a plot collection
@@ -285,10 +291,14 @@
 (:meth:`~yt.visualization.plot_collection.PlotCollection.add_profile_sphere`).
 For instance, to give the plot collection an object to profile:
 
-.. code-block:: python
+.. python-script::
 
-   my_galaxy = pf.h.disk([0.3, 0.5, 0.8], [0.4, 0.5, 0.1], 0.01, 0.001)
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   pc = PlotCollection(pf, 'c')
+   my_galaxy = pf.h.disk([0.53, 0.53, 0.53], [0.0, 0.0, 1.0], 0.01, 0.003)
    pc.add_profile_object(my_galaxy, ["Density", "Temperature"])
+   pc.save()
 
 This will create a :class:`yt.data_objects.data_containers.AMRCylinderBase`
 centered at [0.53, 0.53, 0.53], with a normal vector of [0.0, 0.0, 1.0], radius of
@@ -348,18 +358,26 @@
 available.  For instance, to generate a 2D distribution of mass enclosed in
 density and temperature bins, you can do:
 
-.. code-block:: python
+.. python-script::
 
-   pc.add_phase_sphere(100.0, 'au', ["Density", "Temperature", "CellMassMsun"],
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   pc = PlotCollection(pf, 'c')
+   pc.add_phase_sphere(50, 'kpc', ["Density", "Temperature", "CellMassMsun"],
                        weight = None)
+   pc.save()
 
 If you would rather see the average value of a field as a function of two other
 fields, you can simply omit the *weight* parameter and let it take its default.
 This would look something like:
 
-.. code-block:: python
+.. python-script::
 
-   pc.add_phase_sphere(100.0, 'au', ["Density", "Temperature", "H2I_Fraction"])
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   pc = PlotCollection(pf, 'c')
+   pc.add_phase_sphere(50, 'kpc', ["Density", "Temperature", "HI_Fraction"])
+   pc.save()
 
 We could also construct our own objects and supply those to
 :meth:`~yt.visualization.plot_collection.PlotCollection.add_phase_object`.
@@ -385,16 +403,19 @@
 IPython notebook you can connect to.  You need to additionally change the
 import statement you use:
 
-.. code-block:: python
+.. notebook-cell::
 
    from yt.imods import *
 
 This will set up a number of helper functions and enable interactive plotting.
 Now when you create a plot window you can call ``.show()`` to see it inline:
 
-.. code-block:: python
+.. notebook-cell::
 
-   p = ProjectionPlot(pf, "x", "Density")
+   from yt.imods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, "x", "Density", center='m', width=(10,'kpc'),
+                      weight_field='Density')
    p.show()
 
 The image will appear inline.


https://bitbucket.org/yt_analysis/yt-doc/commits/e339485d15b2/
Changeset:   e339485d15b2
User:        ngoldbaum
Date:        2013-10-29 10:17:03
Summary:     Fixing a silly typo.
Affected #:  1 file

diff -r d6c52ef557d52edc5a10ebd8685bd8a6d33457d9 -r e339485d15b29241581801dc606969f64cfe4eba source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -33,7 +33,7 @@
 
 if not on_rtd:
     extensions.append('sphinx.ext.autosummary')
-    extensions.append('pythonscript_sphinexext')
+    extensions.append('pythonscript_sphinxext')
 
 try:
     import runipy


https://bitbucket.org/yt_analysis/yt-doc/commits/c57cd2b9e38c/
Changeset:   c57cd2b9e38c
User:        ngoldbaum
Date:        2013-10-29 18:30:26
Summary:     Bumping the heading level of the subheadings.
Affected #:  1 file

diff -r e339485d15b29241581801dc606969f64cfe4eba -r c57cd2b9e38c188d5f61a2318c8605afa49e9afc source/visualizing/plots.rst
--- a/source/visualizing/plots.rst
+++ b/source/visualizing/plots.rst
@@ -46,7 +46,7 @@
 .. _slice-plots:
 
 Slice Plots
-~~~~~~~~~~~
+-----------
 
 Slice plots can be created by supplying a parameter file, an axis to
 slice along, a field to plot, and, optionally, a coordinate to center
@@ -109,7 +109,7 @@
 .. _projection-plots:
 
 Projection Plots
-~~~~~~~~~~~~~~~~
+----------------
 
 Using a fast adaptive projection, ``yt`` is able to quickly project
 simulation data along the coordinate axes.
@@ -138,8 +138,8 @@
 
 .. _off-axis-slices:
 
-Off Axis Slices
-~~~~~~~~~~~~~~~
+Off Axis Slie Plots
+-------------------
 
 Off axis slice plots can be generated in much the same way as
 grid-aligned slices.  Off axis slices use
@@ -171,8 +171,8 @@
 
 .. _off-axis-projections:
 
-Off Axis Projections
-~~~~~~~~~~~~~~~~~~~~
+Off Axis Projection Plots
+-------------------------
 
 Off axis projection plots are also available.  Internally, off axis projections are
 created using :ref:`the-camera-interface` by applying the
@@ -271,8 +271,8 @@
 
 .. _how-to-make-1d-profiles:
 
-1D Profiles
-~~~~~~~~~~~
+1D Profile Plots
+----------------
 
 1D profiles are used to calculate the average or the sum of a given quantity
 with respect to a second quantity.  This means the "average density as a
@@ -344,7 +344,7 @@
 .. _how-to-make-2d-profiles:
 
 2D Phase Plots
-~~~~~~~~~~~~~~
+--------------
 
 2D phase plots function in much the same way as 1D profile plots.  You can either
 create and supply an object yourself, or allow the plot collection to create a
@@ -387,7 +387,7 @@
 .. _interactive-plotting:
 
 Interactive Plotting
-~~~~~~~~~~~~~~~~~~~~
+--------------------
 
 The best way to interactively plot data is through the IPython notebook.  Many
 detailed tutorials on using the IPython notebook can be found at


https://bitbucket.org/yt_analysis/yt-doc/commits/ac966ace3312/
Changeset:   ac966ace3312
User:        ngoldbaum
Date:        2013-10-29 18:33:45
Summary:     Fixing a silly typo
Affected #:  1 file

diff -r c57cd2b9e38c188d5f61a2318c8605afa49e9afc -r ac966ace33129721fc2c583f506bb855d967d0b3 source/visualizing/plots.rst
--- a/source/visualizing/plots.rst
+++ b/source/visualizing/plots.rst
@@ -138,8 +138,8 @@
 
 .. _off-axis-slices:
 
-Off Axis Slie Plots
--------------------
+Off Axis Slice Plots
+--------------------
 
 Off axis slice plots can be generated in much the same way as
 grid-aligned slices.  Off axis slices use


https://bitbucket.org/yt_analysis/yt-doc/commits/bd4a2fec8113/
Changeset:   bd4a2fec8113
User:        chummels
Date:        2013-10-29 18:46:28
Summary:     Merging.
Affected #:  0 files



https://bitbucket.org/yt_analysis/yt-doc/commits/ba8d78c138fc/
Changeset:   ba8d78c138fc
User:        chummels
Date:        2013-10-29 20:56:54
Summary:     Updating front page to address some of the issues raised by other developers.  Cleaning it up.
Affected #:  1 file

diff -r bd4a2fec8113bcc75582643466466d7a0ede91bd -r ba8d78c138fcedf7b36164a5570ef33c7f51ead7 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -17,24 +17,115 @@
 `Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
 `RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_.
 
-yt uses a three-pronged approach to interacting with data:
-
- * Visualize Data - Generate plots, images, and movies for better understanding your datasets
- * Analyze Data - Use additional analysis routines to derive real-world results from your data
- * Examine Data - Directly access raw data with helper functions for making this task easier
-
 Documentation
 =============
 
+.. raw:: html
+
+   <table class="contentstable" align="center">
+
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="installing.html">Installation</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Getting and Installing yt</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="bootcamp/index.html">yt Bootcamp</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Demonstrations of what yt can do</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="cookbook/index.html">The Cookbook</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Example recipes for how to accomplish a variety of tasks</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="visualizing/index.html">Visualizing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Make plots, projections, volume renderings, movies, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="analyzing/index.html">Analyzing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Use analysis tools to extract results from your data</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="examining/index.html">Examining Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Load data and directly access raw values for low-level analysis</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="developing/index.html">Developing in yt</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Tailoring yt to your exact use case</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="reference/index.html">Reference Materials</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Lists of fields, quantities, classes, functions, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="help/index.html">Getting help</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">What to do if you run into problems</p>
+       </td>
+     </tr>
+
+   </table>
+
 .. toctree::
-   :maxdepth: 1
+   :hidden:
 
    installing
-   yt Bootcamp: A Worked Introduction <bootcamp/index>
-   help/index
+   yt Bootcamp <bootcamp/index>
    cookbook/index
    visualizing/index
    analyzing/index
    examining/index
    developing/index
    reference/index
+   help/index


https://bitbucket.org/yt_analysis/yt-doc/commits/82495c67b0d8/
Changeset:   82495c67b0d8
User:        chummels
Date:        2013-10-29 21:06:54
Summary:     Updating code list on front page.
Affected #:  1 file

diff -r ba8d78c138fcedf7b36164a5570ef33c7f51ead7 -r 82495c67b0d8a478cb96276d122f1b736588d974 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -13,9 +13,8 @@
 `Piernik <http://arxiv.org/abs/0901.0104>`_;
 and partially-supported codes include: 
 `Castro <https://ccse.lbl.gov/Research/CASTRO/>`_,
-`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_,
 `Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
-`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_.
+and `Pluto <http://plutocode.ph.unito.it/>`_
 
 Documentation
 =============


https://bitbucket.org/yt_analysis/yt-doc/commits/082432fdc539/
Changeset:   082432fdc539
User:        chummels
Date:        2013-10-29 21:10:43
Summary:     Fixing typo.
Affected #:  1 file

diff -r 82495c67b0d8a478cb96276d122f1b736588d974 -r 082432fdc53946f4eac39a602b2cbac3bb86f97e source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -14,7 +14,7 @@
 and partially-supported codes include: 
 `Castro <https://ccse.lbl.gov/Research/CASTRO/>`_,
 `Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
-and `Pluto <http://plutocode.ph.unito.it/>`_
+and `Pluto <http://plutocode.ph.unito.it/>`_.
 
 Documentation
 =============


https://bitbucket.org/yt_analysis/yt-doc/commits/2794c294d94c/
Changeset:   2794c294d94c
User:        chummels
Date:        2013-10-29 22:00:24
Summary:     Updating code support in docs (pulled from wikis).  Not up to date, and needs to be updated.
Affected #:  2 files

diff -r 082432fdc53946f4eac39a602b2cbac3bb86f97e -r 2794c294d94c021a1dc3913e675f2d2d971cc964 source/reference/code_support.rst
--- /dev/null
+++ b/source/reference/code_support.rst
@@ -0,0 +1,49 @@
+
+.. _code-support:
+
+Code Support
+============
+
+Levels of Support for Various Codes
+-----------------------------------
+
+yt provides frontends to support several different simulation code formats 
+as inputs.  Below is a list showing what level of support is provided for
+each code.
+
+|
+
++------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Capability       | Enzo | Orion | FLASH | Nyx  | Piernik | Athena | Castro | Maestro | Pluto | Chombo |
++==================+======+=======+=======+======+=========+========+========+=========+=======+========+
+| Fluid Quantities |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
++------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Particles        |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
++------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Parameters       |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
++------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Units            |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
++------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Read on Demand   |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
++------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Load Raw Data    |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
++------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Level of Support | Full | Full  | Full  | Full |  Full   |  Full  |  Part  |  Part   | Part  |  Part  |
++------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+
+|
+
+If you have a dataset from a code not yet supported, you can either 
+input your data using the :ref:`loading-numpy-array` format, or help us to 
+:ref:`creating_frontend` for this new format..
+
+Future Codes to Support
+-----------------------
+
+A major overhaul of the code was required in order to cleanly support 
+additional codes.  Development in the yt 3.x branch has begun and provides 
+support for codes such as 
+`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_, 
+`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_, and 
+`Gadget <http://www.mpa-garching.mpg.de/gadget/>`_.  Please switch to that 
+version of yt for the most up-to-date support for those codes.

diff -r 082432fdc53946f4eac39a602b2cbac3bb86f97e -r 2794c294d94c021a1dc3913e675f2d2d971cc964 source/reference/index.rst
--- a/source/reference/index.rst
+++ b/source/reference/index.rst
@@ -7,6 +7,7 @@
 .. toctree::
    :maxdepth: 2
 
+   code_support
    configuration
    field_list
    changelog

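The fallback recommended above for not-yet-supported codes is the stream frontend.  A minimal sketch, assuming the yt 2.x signature load_uniform_grid(data, domain_dimensions, sim_unit_to_cm); the array and unit scale are illustrative only:

    import numpy as np
    from yt.frontends.stream.api import load_uniform_grid

    # A toy 64^3 density cube standing in for an unsupported code's output.
    data = {"Density": np.random.random((64, 64, 64))}
    # 3.08e24 cm (~1 Mpc) per code unit; value chosen for illustration.
    pf = load_uniform_grid(data, data["Density"].shape, 3.08e24)
    print(pf.h.all_data()["Density"].mean())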

https://bitbucket.org/yt_analysis/yt-doc/commits/d456ff13b274/
Changeset:   d456ff13b274
User:        chummels
Date:        2013-10-29 22:02:50
Summary:     Fixing typo.
Affected #:  1 file

diff -r 2794c294d94c021a1dc3913e675f2d2d971cc964 -r d456ff13b274f63e087b623d7b7663c31241e3cd source/reference/code_support.rst
--- a/source/reference/code_support.rst
+++ b/source/reference/code_support.rst
@@ -34,8 +34,8 @@
 |
 
 If you have a dataset from a code not yet supported, you can either 
-input your data using the :ref:`loading-numpy-array` format, or help us to 
-:ref:`creating_frontend` for this new format..
+input your data using the :ref:`loading-numpy-array` format, or help us by 
+:ref:`creating_frontend` for this new format.
 
 Future Codes to Support
 -----------------------


https://bitbucket.org/yt_analysis/yt-doc/commits/d395795dbd65/
Changeset:   d395795dbd65
User:        chummels
Date:        2013-10-29 22:10:11
Summary:     Updating front page to incorporate all of the frontends.
Affected #:  1 file

diff -r d456ff13b274f63e087b623d7b7663c31241e3cd -r d395795dbd65e5d181618aeb3fd9d80bc9e1d78d source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -5,13 +5,14 @@
 examining datasets in a variety of scientific disciplines.  yt is developed 
 in Python under the open-source model.  yt currently supports several 
 astrophysical simulation code formats, as well as support for :ref:`loading-numpy-array`
-for unsupported data formats.  Fully-supported codes 
-include: `Enzo <http://enzo-project.org/>`_, 
+for unsupported data formats.  :ref:`code-support` is included for:
+`Enzo <http://enzo-project.org/>`_, 
 `Orion <http://flash.uchicago.edu/~rfisher/orion/>`_,
 `Nyx <https://ccse.lbl.gov/Research/NYX/index.html>`_, 
 `FLASH <http://flash.uchicago.edu/website/home/>`_, 
-`Piernik <http://arxiv.org/abs/0901.0104>`_;
-and partially-supported codes include: 
+`Piernik <http://arxiv.org/abs/0901.0104>`_,
+`Athena <https://trac.princeton.edu/Athena/>`_,
+`Chombo <http://chombo.lbl.gov>`_, 
 `Castro <https://ccse.lbl.gov/Research/CASTRO/>`_,
 `Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
 and `Pluto <http://plutocode.ph.unito.it/>`_.

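All of the frontends listed above sit behind a single entry point: load() sniffs the on-disk format and returns the matching StaticOutput subclass.  Both paths below are hypothetical sample outputs:

    from yt.mods import load

    pf_enzo = load("DD0010/moving7_0010")                  # Enzo
    pf_flash = load("sloshing_low_res_hdf5_plt_cnt_0300")  # FLASH
    print(pf_enzo.h.field_list)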

https://bitbucket.org/yt_analysis/yt-doc/commits/1c2b9d4289a8/
Changeset:   1c2b9d4289a8
User:        MatthewTurk
Date:        2013-10-29 19:31:46
Summary:     Updating Changelog for 2.6.
Affected #:  1 file

diff -r bd4a2fec8113bcc75582643466466d7a0ede91bd -r 1c2b9d4289a885ec6bc0fbd2465ebb4368a82d0a source/reference/changelog.rst
--- a/source/reference/changelog.rst
+++ b/source/reference/changelog.rst
@@ -15,6 +15,8 @@
  * David Collins
  * Brian Crosby
  * Andrew Cunningham
+ * Hilary Egan
+ * John Forbes
  * Nathan Goldbaum
  * Markus Haider
  * Cameron Hummels
@@ -24,17 +26,22 @@
  * Kacper Kowalik
  * Michael Kuhlen
  * Eve Lee
+ * Sam Leitner
  * Yuan Li
  * Chris Malone
  * Josh Moloney
  * Chris Moody
  * Andrew Myers
+ * Jill Naiman
+ * Kaylea Nelson
  * Jeff Oishi
  * Jean-Claude Passy
  * Mark Richardson
 * Thomas Robitaille
  * Anna Rosen
+ * Douglas Rudd
  * Anthony Scopatz
+ * Noel Scudder
  * Devin Silvia
  * Sam Skillman
  * Stephen Skory
@@ -45,9 +52,98 @@
  * Stephanie Tonnesen
  * Matthew Turk
  * Rick Wagner
+ * Andrew Wetzel
  * John Wise
  * John ZuHone
 
+Version 2.6
+-----------
+
+This is a scheduled release, bringing to a close the development in the 2.5
+series.  Below are the itemized, aggregate changes since version 2.5.
+
+Major changes:
+
+  * yt is now licensed under the 3-clause BSD license.
+  * HEALpix has been removed for the time being, as a result of licensing
+    incompatibility.
+  * The addition of a frontend for the Pluto code
+  * The addition of an OBJ exporter to enable transparent and multi-surface
+    exports of surfaces to Blender and Sketchfab
+  * New absorption spectrum analysis module with documentation
+  * Adding ability to draw lines with Grey Opacity in volume rendering
+  * Updated physical constants to reflect 2010 CODATA data
+  * Dependency updates (including IPython 1.0)
+  * Better notebook support for yt plots
+  * Considerably (10x+) faster kD-tree building for volume rendering
+  * yt can now export to RADMC3D
+  * Athena frontend now supports Static Mesh Refinement and units (
+    http://hub.yt-project.org/nb/7l1zua )
+  * Fix long-standing bug for plotting arrays with range of zero
+  * Adding option to have interpolation based on non-uniform bins in
+    interpolator code
+  * Upgrades to most of the dependencies in the install script
+  * ProjectionPlot now accepts a data_source keyword argument
+
+Minor or bugfix changes:
+
+  * Fix for volume rendering on the command line
+  * map_to_colormap will no longer return out-of-bounds errors
+  * Fixes for dds in covering grid calculations
+  * Library searching for build process is now more reliable
+  * Unit fix for "VorticityGrowthTimescale" field
+  * Pyflakes stylistic fixes
+  * Number density added to FLASH
+  * Many fixes for Athena frontend
+  * Radius and ParticleRadius now work for reduced-dimensionality datasets
+  * Source distributions now work again!
+  * Athena data now 64 bits everywhere
+  * Grid displays on plots are now shaded to reflect the level of refinement
+  * show_colormaps() is a new function for displaying all known colormaps
+  * PhasePlotter by default now adds a colormap.
+  * System build fix for POSIX systems
+  * Fixing domain offsets for halo centers-of-mass
+  * Removing some Enzo-specific terminology in the Halo Mass Function
+  * Addition of coordinate vectors on volume render
+  * Pickling fix for extracted regions
+  * Addition of some tracer particle annotation functions
+  * Better error message for "yt" command
+  * Fix for radial vs poloidal fields
+  * Piernik 2D data handling fix
+  * Fixes for FLASH current redshift
+  * PlotWindows now have a set_font function and a new default font setting
+  * Colorbars less likely to extend off the edge of a PlotWindow
+  * Clumps overplotted on PlotWindows are now correctly contoured
+  * Many fixes to light ray and profiles for integrated cosmological analysis
+  * Improvements to OpenMP compilation
+  * Typo in value for km_per_pc (not used elsewhere in the code base) has been
+    fixed
+  * Enable parallel IPython notebook sessions (
+    http://hub.yt-project.org/nb/qgn19h )
+  * Change (~1e-6) to particle_density deposition, enabling it to be used by
+    FLASH and other frontends
+  * Addition of is_root function for convenience in parallel analysis sessions
+  * Additions to Orion particle reader
+  * Fixing TotalMass for case when particles not present
+  * Fixing the density threshold for HOP and pHOP to match the merger tree
+  * Reason can now plot with latest plot window
+  * Issues with VelocityMagnitude and aliases with velo have been corrected in
+    the FLASH frontend
+  * Halo radii are calculated correctly for domains that do not start at 0,0,0.
+  * Halo mass function now works for non-Enzo frontends.
+  * Bug fixes for directory creation, typos in docstrings
+  * Speed improvements to ellipsoidal particle detection
+  * Updates to FLASH fields
+  * CASTRO frontend bug fixes
+  * Fisheye camera bug fixes
+  * Answer testing now includes plot window answer testing
+  * Athena data serialization
+  * load_uniform_grid can now decompose dims >= 1024.  (#537)
+  * Axis unit setting works correctly for unit names  (#534)
+  * ThermalEnergy is now calculated correctly for Enzo MHD simulations (#535)
+  * Radius fields had an asymmetry in periodicity calculation (#531)
+  * Boolean regions can now be pickled (#517)
+
 Version 2.5
 -----------
 

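One of the 2.6 items above, the new data_source keyword argument for ProjectionPlot, restricts a projection to an arbitrary data object.  A sketch; the dataset path and sphere radius are placeholders:

    from yt.mods import load, ProjectionPlot

    pf = load("DD0010/moving7_0010")         # hypothetical dataset
    c = [0.5, 0.5, 0.5]
    sp = pf.h.sphere(c, 100.0 / pf["kpc"])   # 100 kpc sphere, in code units
    # New in 2.6: only cells inside sp contribute to the projection.
    ProjectionPlot(pf, "x", "Density", data_source=sp).save()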

https://bitbucket.org/yt_analysis/yt-doc/commits/48f08ce63a07/
Changeset:   48f08ce63a07
User:        MatthewTurk
Date:        2013-10-29 22:21:38
Summary:     Merging from Matt's changelog
Affected #:  1 file

diff -r d395795dbd65e5d181618aeb3fd9d80bc9e1d78d -r 48f08ce63a07ffc86ba11128fc03c7900c62b097 source/reference/changelog.rst
--- a/source/reference/changelog.rst
+++ b/source/reference/changelog.rst
@@ -15,6 +15,8 @@
  * David Collins
  * Brian Crosby
  * Andrew Cunningham
+ * Hilary Egan
+ * John Forbes
  * Nathan Goldbaum
  * Markus Haider
  * Cameron Hummels
@@ -24,17 +26,22 @@
  * Kacper Kowalik
  * Michael Kuhlen
  * Eve Lee
+ * Sam Leitner
  * Yuan Li
  * Chris Malone
  * Josh Moloney
  * Chris Moody
  * Andrew Myers
+ * Jill Naiman
+ * Kaylea Nelson
  * Jeff Oishi
  * Jean-Claude Passy
  * Mark Richardson
 * Thomas Robitaille
  * Anna Rosen
+ * Douglas Rudd
  * Anthony Scopatz
+ * Noel Scudder
  * Devin Silvia
  * Sam Skillman
  * Stephen Skory
@@ -45,9 +52,98 @@
  * Stephanie Tonnesen
  * Matthew Turk
  * Rick Wagner
+ * Andrew Wetzel
  * John Wise
  * John ZuHone
 
+Version 2.6
+-----------
+
+This is a scheduled release, bringing to a close the development in the 2.5
+series.  Below are the itemized, aggregate changes since version 2.5.
+
+Major changes:
+
+  * yt is now licensed under the 3-clause BSD license.
+  * HEALpix has been removed for the time being, as a result of licensing
+    incompatibility.
+  * The addition of a frontend for the Pluto code
+  * The addition of an OBJ exporter to enable transparent and multi-surface
+    exports of surfaces to Blender and Sketchfab
+  * New absorption spectrum analysis module with documentation
+  * Adding ability to draw lines with Grey Opacity in volume rendering
+  * Updated physical constants to reflect 2010 CODATA data
+  * Dependency updates (including IPython 1.0)
+  * Better notebook support for yt plots
+  * Considerably (10x+) faster kD-tree building for volume rendering
+  * yt can now export to RADMC3D
+  * Athena frontend now supports Static Mesh Refinement and units (
+    http://hub.yt-project.org/nb/7l1zua )
+  * Fix long-standing bug for plotting arrays with range of zero
+  * Adding option to have interpolation based on non-uniform bins in
+    interpolator code
+  * Upgrades to most of the dependencies in the install script
+  * ProjectionPlot now accepts a data_source keyword argument
+
+Minor or bugfix changes:
+
+  * Fix for volume rendering on the command line
+  * map_to_colormap will no longer return out-of-bounds errors
+  * Fixes for dds in covering grid calculations
+  * Library searching for build process is now more reliable
+  * Unit fix for "VorticityGrowthTimescale" field
+  * Pyflakes stylistic fixes
+  * Number density added to FLASH
+  * Many fixes for Athena frontend
+  * Radius and ParticleRadius now work for reduced-dimensionality datasets
+  * Source distributions now work again!
+  * Athena data now 64 bits everywhere
+  * Grid displays on plots are now shaded to reflect the level of refinement
+  * show_colormaps() is a new function for displaying all known colormaps
+  * PhasePlotter by default now adds a colormap.
+  * System build fix for POSIX systems
+  * Fixing domain offsets for halo centers-of-mass
+  * Removing some Enzo-specific terminology in the Halo Mass Function
+  * Addition of coordinate vectors on volume render
+  * Pickling fix for extracted regions
+  * Addition of some tracer particle annotation functions
+  * Better error message for "yt" command
+  * Fix for radial vs poloidal fields
+  * Piernik 2D data handling fix
+  * Fixes for FLASH current redshift
+  * PlotWindows now have a set_font function and a new default font setting
+  * Colorbars less likely to extend off the edge of a PlotWindow
+  * Clumps overplotted on PlotWindows are now correctly contoured
+  * Many fixes to light ray and profiles for integrated cosmological analysis
+  * Improvements to OpenMP compilation
+  * Typo in value for km_per_pc (not used elsewhere in the code base) has been
+    fixed
+  * Enable parallel IPython notebook sessions (
+    http://hub.yt-project.org/nb/qgn19h )
+  * Change (~1e-6) to particle_density deposition, enabling it to be used by
+    FLASH and other frontends
+  * Addition of is_root function for convenience in parallel analysis sessions
+  * Additions to Orion particle reader
+  * Fixing TotalMass for case when particles not present
+  * Fixing the density threshold for HOP and pHOP to match the merger tree
+  * Reason can now plot with latest plot window
+  * Issues with VelocityMagnitude and aliases with velo have been corrected in
+    the FLASH frontend
+  * Halo radii are calculated correctly for domains that do not start at 0,0,0.
+  * Halo mass function now works for non-Enzo frontends.
+  * Bug fixes for directory creation, typos in docstrings
+  * Speed improvements to ellipsoidal particle detection
+  * Updates to FLASH fields
+  * CASTRO frontend bug fixes
+  * Fisheye camera bug fixes
+  * Answer testing now includes plot window answer testing
+  * Athena data serialization
+  * load_uniform_grid can now decompose dims >= 1024.  (#537)
+  * Axis unit setting works correctly for unit names  (#534)
+  * ThermalEnergy is now calculated correctly for Enzo MHD simulations (#535)
+  * Radius fields had an asymmetry in periodicity calculation (#531)
+  * Boolean regions can now be pickled (#517)
+
 Version 2.5
 -----------
 


https://bitbucket.org/yt_analysis/yt-doc/commits/3824fefa586b/
Changeset:   3824fefa586b
User:        MatthewTurk
Date:        2013-10-29 22:28:36
Summary:     Updating the code support.
Affected #:  2 files

diff -r 48f08ce63a07ffc86ba11128fc03c7900c62b097 -r 3824fefa586bafd5392f7e5e19a806935453d565 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -3,19 +3,19 @@
 
 yt is a community-developed analysis and visualization toolkit for
 examining datasets in a variety of scientific disciplines.  yt is developed 
-in Python under the open-source model.  yt currently supports several 
-astrophysical simulation code formats, as well as support for :ref:`loading-numpy-array`
-for unsupported data formats.  :ref:`code-support` is included for:
-`Enzo <http://enzo-project.org/>`_, 
-`Orion <http://flash.uchicago.edu/~rfisher/orion/>`_,
-`Nyx <https://ccse.lbl.gov/Research/NYX/index.html>`_, 
-`FLASH <http://flash.uchicago.edu/website/home/>`_, 
-`Piernik <http://arxiv.org/abs/0901.0104>`_,
-`Athena <https://trac.princeton.edu/Athena/>`_,
-`Chombo <http://chombo.lbl.gov>`_, 
-`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_,
-`Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
-and `Pluto <http://plutocode.ph.unito.it/>`_.
+in Python under the open-source model.  In version 2.6, yt currently supports
+several astrophysical simulation code formats, as well as support for
+:ref:`loading-numpy-array` for unsupported data formats.  :ref:`code-support`
+is included for: `Enzo <http://enzo-project.org/>`_, `Orion
+<http://flash.uchicago.edu/~rfisher/orion/>`_, `Nyx
+<https://ccse.lbl.gov/Research/NYX/index.html>`_, `FLASH
+<http://flash.uchicago.edu/website/home/>`_, `Piernik
+<http://arxiv.org/abs/0901.0104>`_, `Athena
+<https://trac.princeton.edu/Athena/>`_, `Chombo <http://chombo.lbl.gov>`_,
+`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_, `Maestro
+<https://ccse.lbl.gov/Research/MAESTRO/>`_, and `Pluto
+<http://plutocode.ph.unito.it/>`_.  (Development of additional codes, including
+particle codes and octree codes, is taking place in yt 3.0.)
 
 Documentation
 =============

diff -r 48f08ce63a07ffc86ba11128fc03c7900c62b097 -r 3824fefa586bafd5392f7e5e19a806935453d565 source/reference/code_support.rst
--- a/source/reference/code_support.rst
+++ b/source/reference/code_support.rst
@@ -13,23 +13,25 @@
 
 |
 
-+------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
-| Capability       | Enzo | Orion | FLASH | Nyx  | Piernik | Athena | Castro | Maestro | Pluto | Chombo |
-+==================+======+=======+=======+======+=========+========+========+=========+=======+========+
-| Fluid Quantities |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
-+------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
-| Particles        |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
-+------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
-| Parameters       |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
-+------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
-| Units            |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
-+------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
-| Read on Demand   |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
-+------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
-| Load Raw Data    |   Y  |   Y   |   Y   |  Y   |         |        |        |         |       |        |
-+------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
-| Level of Support | Full | Full  | Full  | Full |  Full   |  Full  |  Part  |  Part   | Part  |  Part  |
-+------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Capability           | Enzo | Orion | FLASH | Nyx  | Piernik | Athena | Castro | Maestro | Pluto | Chombo |
++======================+======+=======+=======+======+=========+========+========+=========+=======+========+
+| Fluid Quantities     |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Particles            |   Y  |   Y   |   Y   |  Y   |   N/A   |   N    |   Y    |   N     |   N   |    N   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Parameters           |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Units                |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Read on Demand       |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Load Raw Data        |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Part of test suite   |   Y  |   Y   |   Y   |  Y   |    N    |   N    |   Y    |   N     |   N   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Level of Support     | Full | Full  | Full  | Full |  Full   |  Full  |  Part  |  Part   | Part  |  Part  |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
 
 |
 
@@ -47,3 +49,5 @@
 `ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_, and 
 `Gadget <http://www.mpa-garching.mpg.de/gadget/>`_.  Please switch to that 
 version of yt for the most up-to-date support for those codes.
+
+Additionally, in yt 3.0 the Boxlib formats have been unified and streamlined.

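The "Read on Demand" and "Load Raw Data" rows in the table above correspond to deferred field reads and direct grid access.  A short sketch; the dataset path is hypothetical:

    from yt.mods import load

    pf = load("DD0010/moving7_0010")   # hypothetical dataset
    dd = pf.h.all_data()               # "Read on Demand": fields read when indexed
    print(dd["Density"].max())

    g = pf.h.grids[0]                  # "Load Raw Data": one grid's raw array
    print(g["Density"].shape)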

https://bitbucket.org/yt_analysis/yt-doc/commits/e8f3afe8c5f8/
Changeset:   e8f3afe8c5f8
User:        MatthewTurk
Date:        2013-10-29 22:52:10
Summary:     Moving API under reference.
Affected #:  4 files

diff -r 3824fefa586bafd5392f7e5e19a806935453d565 -r e8f3afe8c5f8059a2b1a3819f9bb93282b6340ad source/api/api.rst
--- a/source/api/api.rst
+++ /dev/null
@@ -1,563 +0,0 @@
-API Reference
-=============
-
-Plots and the Plotting Interface
---------------------------------
-
-SlicePlot and ProjectionPlot
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_window.SlicePlot
-   ~yt.visualization.plot_window.OffAxisSlicePlot
-   ~yt.visualization.plot_window.ProjectionPlot
-   ~yt.visualization.plot_window.OffAxisProjectionPlot
-
-PlotCollection
-^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_collection.PlotCollection
-   ~yt.visualization.plot_collection.PlotCollectionInteractive
-   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
-   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
-   ~yt.visualization.base_plot_types.get_multi_plot
-
-Data Sources
-------------
-
-.. _physical-object-api:
-
-Physical Objects
-^^^^^^^^^^^^^^^^
-
-These are the objects that act as physical selections of data, describing a
-region in space.  These are not typically addressed directly; see
-:ref:`available-objects` for more information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.data_containers.AMRCoveringGridBase
-   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
-   ~yt.data_objects.data_containers.AMRCylinderBase
-   ~yt.data_objects.data_containers.AMRGridCollectionBase
-   ~yt.data_objects.data_containers.AMRRayBase
-   ~yt.data_objects.data_containers.AMROrthoRayBase
-   ~yt.data_objects.data_containers.AMRStreamlineBase
-   ~yt.data_objects.data_containers.AMRProjBase
-   ~yt.data_objects.data_containers.AMRRegionBase
-   ~yt.data_objects.data_containers.AMRSliceBase
-   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
-   ~yt.data_objects.data_containers.AMRSphereBase
-   ~yt.data_objects.data_containers.AMRSurfaceBase
-
-Time Series Objects
-^^^^^^^^^^^^^^^^^^^
-
-These are objects that either contain and represent or operate on series of
-datasets.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.time_series.TimeSeriesData
-   ~yt.data_objects.time_series.TimeSeriesDataObject
-   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
-   ~yt.data_objects.time_series.AnalysisTaskProxy
-
-Frontends
----------
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.grid_patch.AMRGridPatch
-   ~yt.data_objects.hierarchy.AMRHierarchy
-   ~yt.data_objects.static_output.StaticOutput
-
-Enzo
-^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.enzo.data_structures.EnzoGrid
-   ~yt.frontends.enzo.data_structures.EnzoHierarchy
-   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
-
-Orion
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.orion.data_structures.OrionGrid
-   ~yt.frontends.orion.data_structures.OrionHierarchy
-   ~yt.frontends.orion.data_structures.OrionStaticOutput
-
-FLASH
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.flash.data_structures.FLASHGrid
-   ~yt.frontends.flash.data_structures.FLASHHierarchy
-   ~yt.frontends.flash.data_structures.FLASHStaticOutput
-
-Chombo
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.chombo.data_structures.ChomboGrid
-   ~yt.frontends.chombo.data_structures.ChomboHierarchy
-   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
-
-RAMSES
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.ramses.data_structures.RAMSESGrid
-   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
-   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
-
-Derived Datatypes
------------------
-
-Profiles and Histograms
-^^^^^^^^^^^^^^^^^^^^^^^
-
-These types are used to sum data up and either return that sum or return an
-average.  Typically they are more easily used through the
-`yt.visualization.plot_collection` interface.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.profiles.BinnedProfile1D
-   ~yt.data_objects.profiles.BinnedProfile2D
-   ~yt.data_objects.profiles.BinnedProfile3D
-
-Halo Finding and Particle Functions
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Halo finding can be executed using these types.  Here we list the main halo
-finders as well as a few other supplemental objects.
-
-.. rubric:: Halo Finders
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
-   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
-
-You can also operate on the Halo and HaloList objects themselves:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.Halo
-   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
-
-There are also functions for loading halos from disk:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
-
-We have several methods that work to create merger trees:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
-
-You can use Halo catalogs generated externally as well:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
-
-Halo Profiling
-^^^^^^^^^^^^^^
-
-yt provides a comprehensive halo profiler that can filter, center, and analyze
-halos en masse.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
-
-
-Two Point Functions
-^^^^^^^^^^^^^^^^^^^
-
-These functions are designed to create correlations or other results of
-operations acting on two spatially-distinct points in a data source.  See also
-:ref:`two_point_functions`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
-   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
-
-Field Types
------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.field_info_container.DerivedField
-   ~yt.data_objects.field_info_container.FieldInfoContainer
-   ~yt.data_objects.field_info_container.ValidateDataField
-   ~yt.data_objects.field_info_container.ValidateGridType
-   ~yt.data_objects.field_info_container.ValidateParameter
-   ~yt.data_objects.field_info_container.ValidateProperty
-   ~yt.data_objects.field_info_container.ValidateSpatial
-
-Image Handling
---------------
-
-For volume renderings and fixed resolution buffers the image object returned is
-an ``ImageArray`` object, which has useful functions for image saving and 
-writing to bitmaps.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.image_array.ImageArray
-   ~yt.data_objects.image_array.ImageArray.write_png
-   ~yt.data_objects.image_array.ImageArray.write_hdf5
-
-Extension Types
----------------
-
-Coordinate Transformations
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
-   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
-
-Cosmology, Star Particle Analysis, and Simulated Observations
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-For the generation of stellar SEDs.  (See also :ref:`star_analysis`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
-
-Light cone generation and simulation analysis.  (See also
-:ref:`light-cone-generator`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
-   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
-
-Absorption and X-ray spectra and spectral lines:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
-
-Absorption spectra fitting:
-
-.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
-
-Sunrise exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
-
-RADMC-3D exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
-
-Radial Column Density
-^^^^^^^^^^^^^^^^^^^^^
-
-If you'd like to calculate the column density out to a given point, from a
-specified center, yt can provide that information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
-
-Volume Rendering
-^^^^^^^^^^^^^^^^
-
-See also :ref:`volume_rendering`.
-
-Here are the primary entry points:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.Camera
-   ~yt.visualization.volume_rendering.camera.off_axis_projection
-   ~yt.visualization.volume_rendering.camera.allsky_projection
-
-These objects set up the way the image looks:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
-
-There are also advanced objects for particular use cases:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
-   ~yt.visualization.volume_rendering.camera.FisheyeCamera
-   ~yt.visualization.volume_rendering.camera.MosaicCamera
-   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
-   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
-   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
-   ~yt.visualization.volume_rendering.camera.StereoPairCamera
-
-Streamlining
-^^^^^^^^^^^^
-
-See also :ref:`streamlines`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.streamlines.Streamlines
-
-Image Writing
-^^^^^^^^^^^^^
-
-These functions are all used for fast writing of images directly to disk,
-without calling matplotlib.  This can be very useful for high-cadence outputs
-where colorbars are unnecessary or for volume rendering.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.image_writer.multi_image_composite
-   ~yt.visualization.image_writer.write_bitmap
-   ~yt.visualization.image_writer.write_projection
-   ~yt.visualization.image_writer.write_fits
-   ~yt.visualization.image_writer.write_image
-   ~yt.visualization.image_writer.map_to_colors
-   ~yt.visualization.image_writer.strip_colormap_data
-   ~yt.visualization.image_writer.splat_points
-   ~yt.visualization.image_writer.annotate_image
-   ~yt.visualization.image_writer.scale_image
-
-We also provide a module that is very good for generating EPS figures,
-particularly with complicated layouts.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.eps_writer.DualEPS
-   ~yt.visualization.eps_writer.single_plot
-   ~yt.visualization.eps_writer.multiplot
-   ~yt.visualization.eps_writer.multiplot_yt
-   ~yt.visualization.eps_writer.return_cmap
-
-.. _image-panner-api:
-
-Derived Quantities
-------------------
-
-See :ref:`derived-quantities`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.derived_quantities._AngularMomentumVector
-   ~yt.data_objects.derived_quantities._BaryonSpinParameter
-   ~yt.data_objects.derived_quantities._BulkVelocity
-   ~yt.data_objects.derived_quantities._CenterOfMass
-   ~yt.data_objects.derived_quantities._Extrema
-   ~yt.data_objects.derived_quantities._IsBound
-   ~yt.data_objects.derived_quantities._MaxLocation
-   ~yt.data_objects.derived_quantities._ParticleSpinParameter
-   ~yt.data_objects.derived_quantities._TotalMass
-   ~yt.data_objects.derived_quantities._TotalQuantity
-   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
-
-.. _callback-api:
-
-Callback List
--------------
-
-
-See also :ref:`callbacks`.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_modifications.ArrowCallback
-   ~yt.visualization.plot_modifications.ClumpContourCallback
-   ~yt.visualization.plot_modifications.ContourCallback
-   ~yt.visualization.plot_modifications.CoordAxesCallback
-   ~yt.visualization.plot_modifications.CuttingQuiverCallback
-   ~yt.visualization.plot_modifications.GridBoundaryCallback
-   ~yt.visualization.plot_modifications.HopCircleCallback
-   ~yt.visualization.plot_modifications.HopParticleCallback
-   ~yt.visualization.plot_modifications.LabelCallback
-   ~yt.visualization.plot_modifications.LinePlotCallback
-   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
-   ~yt.visualization.plot_modifications.ParticleCallback
-   ~yt.visualization.plot_modifications.PointAnnotateCallback
-   ~yt.visualization.plot_modifications.QuiverCallback
-   ~yt.visualization.plot_modifications.SphereCallback
-   ~yt.visualization.plot_modifications.TextLabelCallback
-   ~yt.visualization.plot_modifications.TitleCallback
-   ~yt.visualization.plot_modifications.UnitBoundaryCallback
-   ~yt.visualization.plot_modifications.VelocityCallback
-
-Function List
--------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.convenience.load
-   ~yt.funcs.deprecate
-   ~yt.funcs.ensure_list
-   ~yt.funcs.get_pbar
-   ~yt.funcs.humanize_time
-   ~yt.funcs.insert_ipython
-   ~yt.funcs.iterable
-   ~yt.funcs.just_one
-   ~yt.funcs.only_on_root
-   ~yt.funcs.paste_traceback
-   ~yt.funcs.pdb_run
-   ~yt.funcs.print_tb
-   ~yt.funcs.rootonly
-   ~yt.funcs.time_execution
-   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
-
-Miscellaneous Types
--------------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.config.YTConfigParser
-   ~yt.utilities.parameter_file_storage.ParameterFileStore
-   ~yt.data_objects.data_containers.FakeGridForParticles
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
-
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
-
-
-Testing Infrastructure
-----------------------
-
-The first set of functions are all provided by NumPy.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_array_equal
-   ~yt.testing.assert_almost_equal
-   ~yt.testing.assert_approx_equal
-   ~yt.testing.assert_array_almost_equal
-   ~yt.testing.assert_equal
-   ~yt.testing.assert_array_less
-   ~yt.testing.assert_string_equal
-   ~yt.testing.assert_array_almost_equal_nulp
-   ~yt.testing.assert_allclose
-   ~yt.testing.assert_raises
-
-These are yt-provided functions:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_rel_equal
-   ~yt.testing.amrspace
-   ~yt.testing.fake_random_pf
-   ~yt.testing.expand_keywords

diff -r 3824fefa586bafd5392f7e5e19a806935453d565 -r e8f3afe8c5f8059a2b1a3819f9bb93282b6340ad source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -256,4 +256,4 @@
                        }
 
 if not on_rtd:
-    autosummary_generate = glob.glob("api/api.rst")
+    autosummary_generate = glob.glob("reference/api/api.rst")

diff -r 3824fefa586bafd5392f7e5e19a806935453d565 -r e8f3afe8c5f8059a2b1a3819f9bb93282b6340ad source/reference/api/api.rst
--- /dev/null
+++ b/source/reference/api/api.rst
@@ -0,0 +1,563 @@
+API Reference
+=============
+
+Plots and the Plotting Interface
+--------------------------------
+
+SlicePlot and ProjectionPlot
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_window.SlicePlot
+   ~yt.visualization.plot_window.OffAxisSlicePlot
+   ~yt.visualization.plot_window.ProjectionPlot
+   ~yt.visualization.plot_window.OffAxisProjectionPlot
+
+PlotCollection
+^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_collection.PlotCollection
+   ~yt.visualization.plot_collection.PlotCollectionInteractive
+   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
+   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
+   ~yt.visualization.base_plot_types.get_multi_plot
+
+Data Sources
+------------
+
+.. _physical-object-api:
+
+Physical Objects
+^^^^^^^^^^^^^^^^
+
+These are the objects that act as physical selections of data, describing a
+region in space.  These are not typically addressed directly; see
+:ref:`available-objects` for more information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.data_containers.AMRCoveringGridBase
+   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
+   ~yt.data_objects.data_containers.AMRCylinderBase
+   ~yt.data_objects.data_containers.AMRGridCollectionBase
+   ~yt.data_objects.data_containers.AMRRayBase
+   ~yt.data_objects.data_containers.AMROrthoRayBase
+   ~yt.data_objects.data_containers.AMRStreamlineBase
+   ~yt.data_objects.data_containers.AMRProjBase
+   ~yt.data_objects.data_containers.AMRRegionBase
+   ~yt.data_objects.data_containers.AMRSliceBase
+   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
+   ~yt.data_objects.data_containers.AMRSphereBase
+   ~yt.data_objects.data_containers.AMRSurfaceBase
+
+Time Series Objects
+^^^^^^^^^^^^^^^^^^^
+
+These are objects that either contain and represent or operate on series of
+datasets.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.time_series.TimeSeriesData
+   ~yt.data_objects.time_series.TimeSeriesDataObject
+   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
+   ~yt.data_objects.time_series.AnalysisTaskProxy
+
+Frontends
+---------
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.grid_patch.AMRGridPatch
+   ~yt.data_objects.hierarchy.AMRHierarchy
+   ~yt.data_objects.static_output.StaticOutput
+
+Enzo
+^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.enzo.data_structures.EnzoGrid
+   ~yt.frontends.enzo.data_structures.EnzoHierarchy
+   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
+
+Orion
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.orion.data_structures.OrionGrid
+   ~yt.frontends.orion.data_structures.OrionHierarchy
+   ~yt.frontends.orion.data_structures.OrionStaticOutput
+
+FLASH
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.flash.data_structures.FLASHGrid
+   ~yt.frontends.flash.data_structures.FLASHHierarchy
+   ~yt.frontends.flash.data_structures.FLASHStaticOutput
+
+Chombo
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.chombo.data_structures.ChomboGrid
+   ~yt.frontends.chombo.data_structures.ChomboHierarchy
+   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
+
+RAMSES
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.ramses.data_structures.RAMSESGrid
+   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
+   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
+
+Derived Datatypes
+-----------------
+
+Profiles and Histograms
+^^^^^^^^^^^^^^^^^^^^^^^
+
+These types are used to sum data up and either return that sum or return an
+average.  Typically they are more easily used through the
+`yt.visualization.plot_collection` interface.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.profiles.BinnedProfile1D
+   ~yt.data_objects.profiles.BinnedProfile2D
+   ~yt.data_objects.profiles.BinnedProfile3D
+
+Halo Finding and Particle Functions
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Halo finding can be executed using these types.  Here we list the main halo
+finders as well as a few other supplemental objects.
+
+.. rubric:: Halo Finders
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
+   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
+
+You can also operate on the Halo and HaloList objects themselves:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.Halo
+   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
+
+There are also functions for loading halos from disk:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
+
+We have several methods that work to create merger trees:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
+
+You can use Halo catalogs generated externally as well:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
+
+Halo Profiling
+^^^^^^^^^^^^^^
+
+yt provides a comprehensive halo profiler that can filter, center, and analyze
+halos en masse.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
+
+
+Two Point Functions
+^^^^^^^^^^^^^^^^^^^
+
+These functions are designed to create correlations or other results of
+operations acting on two spatially-distinct points in a data source.  See also
+:ref:`two_point_functions`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
+   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
+
+Field Types
+-----------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.field_info_container.DerivedField
+   ~yt.data_objects.field_info_container.FieldInfoContainer
+   ~yt.data_objects.field_info_container.ValidateDataField
+   ~yt.data_objects.field_info_container.ValidateGridType
+   ~yt.data_objects.field_info_container.ValidateParameter
+   ~yt.data_objects.field_info_container.ValidateProperty
+   ~yt.data_objects.field_info_container.ValidateSpatial
+
+Image Handling
+--------------
+
+For volume renderings and fixed resolution buffers the image object returned is
+an ``ImageArray`` object, which has useful functions for image saving and 
+writing to bitmaps.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.image_array.ImageArray
+   ~yt.data_objects.image_array.ImageArray.write_png
+   ~yt.data_objects.image_array.ImageArray.write_hdf5
+
+Extension Types
+---------------
+
+Coordinate Transformations
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
+   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
+
+Cosmology, Star Particle Analysis, and Simulated Observations
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+For the generation of stellar SEDs.  (See also :ref:`star_analysis`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
+
+Light cone generation and simulation analysis.  (See also
+:ref:`light-cone-generator`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
+   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
+
+Absorption and X-ray spectra and spectral lines:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
+
+Absorption spectra fitting:
+
+.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
+
+Sunrise exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
+
+RADMC-3D exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
+
+Radial Column Density
+^^^^^^^^^^^^^^^^^^^^^
+
+If you'd like to calculate the column density from a specified center out to
+a given point, yt can provide that information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
+
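+The pattern, with illustrative values, is to construct the object and then
+expose its output as a derived field:
+
+.. code-block:: python
+
+   from yt.mods import *
+   from yt.analysis_modules.radial_column_density.api import \
+       RadialColumnDensity
+
+   pf = load("DD0043/data0043")
+   rcd = RadialColumnDensity(pf, 'NumberDensity', [0.5, 0.5, 0.5],
+                             max_radius=0.5)
+
+   def _RCDNumberDensity(field, data):
+       return rcd._build_derived_field(data)
+   add_field('RCDNumberDensity', _RCDNumberDensity, units=r'1/\rm{cm}^2')
+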
+Volume Rendering
+^^^^^^^^^^^^^^^^
+
+See also :ref:`volume_rendering`.
+
+Here are the primary entry points:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.Camera
+   ~yt.visualization.volume_rendering.camera.off_axis_projection
+   ~yt.visualization.volume_rendering.camera.allsky_projection
+
+These objects set up the way the image looks:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
+
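+Putting a camera together with a transfer function, a basic rendering might be
+set up like this (dataset path, bounds, and view parameters are illustrative):
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("DD0043/data0043")
+   # Map log10(Density) over ten orders of magnitude onto colors.
+   tf = ColorTransferFunction((-30, -20))
+   tf.add_layers(5, w=0.02)
+   c = [0.5, 0.5, 0.5]   # center of the image
+   L = [1.0, 1.0, 1.0]   # viewing direction
+   W = 0.3               # width of the image plane in domain units
+   N = 512               # pixels per side
+   cam = pf.h.camera(c, L, W, N, tf, fields=['Density'])
+   im = cam.snapshot('volume_rendered.png')
+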
+There are also advanced objects for particular use cases:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
+   ~yt.visualization.volume_rendering.camera.FisheyeCamera
+   ~yt.visualization.volume_rendering.camera.MosaicCamera
+   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
+   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
+   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
+   ~yt.visualization.volume_rendering.camera.StereoPairCamera
+
+Streamlining
+^^^^^^^^^^^^
+
+See also :ref:`streamlines`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.streamlines.Streamlines
+
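+A short sketch, seeding one hundred random points near the center of a
+hypothetical dataset:
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.mods import *
+   from yt.visualization.api import Streamlines
+
+   pf = load("DD0043/data0043")
+   c = np.array([0.5, 0.5, 0.5])
+   pos = c + (np.random.random((100, 3)) - 0.5) * 0.2
+   streamlines = Streamlines(pf, pos, 'x-velocity', 'y-velocity',
+                             'z-velocity', length=1.0)
+   streamlines.integrate_through_volume()
+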
+Image Writing
+^^^^^^^^^^^^^
+
+These functions are all used for fast writing of images directly to disk,
+without calling matplotlib.  This can be very useful for volume rendering, or
+for high-cadence outputs where colorbars are unnecessary.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.image_writer.multi_image_composite
+   ~yt.visualization.image_writer.write_bitmap
+   ~yt.visualization.image_writer.write_projection
+   ~yt.visualization.image_writer.write_fits
+   ~yt.visualization.image_writer.write_image
+   ~yt.visualization.image_writer.map_to_colors
+   ~yt.visualization.image_writer.strip_colormap_data
+   ~yt.visualization.image_writer.splat_points
+   ~yt.visualization.image_writer.annotate_image
+   ~yt.visualization.image_writer.scale_image
+
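+For instance, any 2D array can be written straight to a PNG; here a synthetic
+buffer stands in for a fixed resolution buffer field:
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.visualization.image_writer import write_image
+
+   buff = np.random.random((512, 512))
+   write_image(buff, "sample.png", cmap_name="algae")
+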
+We also provide a module that is well suited to generating EPS figures,
+particularly those with complicated layouts.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.eps_writer.DualEPS
+   ~yt.visualization.eps_writer.single_plot
+   ~yt.visualization.eps_writer.multiplot
+   ~yt.visualization.eps_writer.multiplot_yt
+   ~yt.visualization.eps_writer.return_cmap
+
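+A minimal sketch (this module relies on the external PyX package; the dataset
+path is illustrative):
+
+.. code-block:: python
+
+   from yt.mods import *
+   import yt.visualization.eps_writer as eps
+
+   pf = load("DD0043/data0043")
+   slc = SlicePlot(pf, 'x', 'Density')
+   eps_fig = eps.single_plot(slc)
+   eps_fig.save_fig('density_slice', format='eps')
+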
+.. _image-panner-api:
+
+Derived Quantities
+------------------
+
+See :ref:`derived-quantities`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.derived_quantities._AngularMomentumVector
+   ~yt.data_objects.derived_quantities._BaryonSpinParameter
+   ~yt.data_objects.derived_quantities._BulkVelocity
+   ~yt.data_objects.derived_quantities._CenterOfMass
+   ~yt.data_objects.derived_quantities._Extrema
+   ~yt.data_objects.derived_quantities._IsBound
+   ~yt.data_objects.derived_quantities._MaxLocation
+   ~yt.data_objects.derived_quantities._ParticleSpinParameter
+   ~yt.data_objects.derived_quantities._TotalMass
+   ~yt.data_objects.derived_quantities._TotalQuantity
+   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
+
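+These underscored classes are not instantiated directly; they are registered
+under each data object's ``quantities`` interface.  An illustrative use, with
+a hypothetical dataset:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("DD0043/data0043")
+   sp = pf.h.sphere([0.5, 0.5, 0.5], (100.0, 'kpc'))
+   baryon_mass, particle_mass = sp.quantities["TotalQuantity"](
+       ["CellMassMsun", "ParticleMassMsun"])
+   L = sp.quantities["AngularMomentumVector"]()
+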
+.. _callback-api:
+
+Callback List
+-------------
+
+
+See also :ref:`callbacks`.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_modifications.ArrowCallback
+   ~yt.visualization.plot_modifications.ClumpContourCallback
+   ~yt.visualization.plot_modifications.ContourCallback
+   ~yt.visualization.plot_modifications.CoordAxesCallback
+   ~yt.visualization.plot_modifications.CuttingQuiverCallback
+   ~yt.visualization.plot_modifications.GridBoundaryCallback
+   ~yt.visualization.plot_modifications.HopCircleCallback
+   ~yt.visualization.plot_modifications.HopParticleCallback
+   ~yt.visualization.plot_modifications.LabelCallback
+   ~yt.visualization.plot_modifications.LinePlotCallback
+   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
+   ~yt.visualization.plot_modifications.ParticleCallback
+   ~yt.visualization.plot_modifications.PointAnnotateCallback
+   ~yt.visualization.plot_modifications.QuiverCallback
+   ~yt.visualization.plot_modifications.SphereCallback
+   ~yt.visualization.plot_modifications.TextLabelCallback
+   ~yt.visualization.plot_modifications.TitleCallback
+   ~yt.visualization.plot_modifications.UnitBoundaryCallback
+   ~yt.visualization.plot_modifications.VelocityCallback
+
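+Callbacks are most commonly reached through the ``annotate_*`` methods of a
+plot window; a brief example with a hypothetical dataset:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("DD0043/data0043")
+   slc = SlicePlot(pf, 'z', 'Density')
+   slc.annotate_grids()       # GridBoundaryCallback
+   slc.annotate_velocity()    # VelocityCallback
+   slc.save('annotated_slice.png')
+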
+Function List
+-------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.convenience.load
+   ~yt.funcs.deprecate
+   ~yt.funcs.ensure_list
+   ~yt.funcs.get_pbar
+   ~yt.funcs.humanize_time
+   ~yt.funcs.insert_ipython
+   ~yt.funcs.iterable
+   ~yt.funcs.just_one
+   ~yt.funcs.only_on_root
+   ~yt.funcs.paste_traceback
+   ~yt.funcs.pdb_run
+   ~yt.funcs.print_tb
+   ~yt.funcs.rootonly
+   ~yt.funcs.time_execution
+   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
+
+Miscellaneous Types
+-------------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.config.YTConfigParser
+   ~yt.utilities.parameter_file_storage.ParameterFileStore
+   ~yt.data_objects.data_containers.FakeGridForParticles
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
+
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
+
+
+Testing Infrastructure
+----------------------
+
+The first set of functions is provided by NumPy.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_array_equal
+   ~yt.testing.assert_almost_equal
+   ~yt.testing.assert_approx_equal
+   ~yt.testing.assert_array_almost_equal
+   ~yt.testing.assert_equal
+   ~yt.testing.assert_array_less
+   ~yt.testing.assert_string_equal
+   ~yt.testing.assert_array_almost_equal_nulp
+   ~yt.testing.assert_allclose
+   ~yt.testing.assert_raises
+
+These are yt-provided functions:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_rel_equal
+   ~yt.testing.amrspace
+   ~yt.testing.fake_random_pf
+   ~yt.testing.expand_keywords
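+
+For instance, ``fake_random_pf`` makes it possible to exercise analysis code
+without any data on disk; a sketch of a typical unit test:
+
+.. code-block:: python
+
+   from yt.testing import fake_random_pf, assert_rel_equal
+
+   def test_total_quantity():
+       pf = fake_random_pf(16, fields=("Density",))
+       dd = pf.h.all_data()
+       total = dd.quantities["TotalQuantity"](["CellMassMsun"])[0]
+       direct = dd["CellMassMsun"].sum()
+       assert_rel_equal(total, direct, 12)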

diff -r 3824fefa586bafd5392f7e5e19a806935453d565 -r e8f3afe8c5f8059a2b1a3819f9bb93282b6340ad source/reference/index.rst
--- a/source/reference/index.rst
+++ b/source/reference/index.rst
@@ -8,6 +8,7 @@
    :maxdepth: 2
 
    code_support
+   api/api
    configuration
    field_list
    changelog


https://bitbucket.org/yt_analysis/yt-doc/commits/f793c3d47fb0/
Changeset:   f793c3d47fb0
User:        chummels
Date:        2013-10-29 23:04:57
Summary:     Fixing cookbook header for consistency.
Affected #:  1 file

diff -r e8f3afe8c5f8059a2b1a3819f9bb93282b6340ad -r f793c3d47fb0a916a13955988c5ee3a26ff3ef3f source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -1,7 +1,7 @@
 .. _cookbook:
 
-The yt Cookbook
-===============
+The Cookbook
+============
 
 yt provides a great deal of functionality to the user, but sometimes it can 
 be a bit complex.  This section of the documentation lays out example recipes 


https://bitbucket.org/yt_analysis/yt-doc/commits/3efef6a81d99/
Changeset:   3efef6a81d99
User:        chummels
Date:        2013-10-29 23:46:44
Summary:     Updating field list to include all of the front ends.  Also updating auto-generator function to include all front ends.
Affected #:  2 files

diff -r f793c3d47fb0a916a13955988c5ee3a26ff3ef3f -r 3efef6a81d9915681af5f23f01908e1c782ce25e helper_scripts/show_fields.py
--- a/helper_scripts/show_fields.py
+++ b/helper_scripts/show_fields.py
@@ -17,6 +17,17 @@
 everywhere, "Enzo" fields in Enzo datasets, "Orion" fields in Orion datasets,
 and so on.
 
+Try using ``pf.h.field_list`` and ``pf.h.derived_field_list`` to view the
+native and derived fields available for your dataset, respectively.  For
+example, to display the native fields in alphabetical order:
+
+.. notebook-cell::
+
+  from yt.mods import *
+  pf = load("Enzo_64/DD0043/data0043")
+  for i in sorted(pf.h.field_list):
+    print i
+
 .. note:: Universal fields will be overridden by a code-specific field.
 
 .. rubric:: Table of Contents
@@ -95,7 +106,37 @@
 print
 print_all_fields(FLASHFieldInfo)
 
-print "Nyx-Specific Field List"
+print "Athena-Specific Field List"
 print "--------------------------"
 print
+print_all_fields(AthenaFieldInfo)
+
+print "Nyx-Specific Field List"
+print "-----------------------"
+print
 print_all_fields(NyxFieldInfo)
+
+print "Castro-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(CastroFieldInfo)
+
+print "Chombo-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(ChomboFieldInfo)
+
+print "Pluto-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(PlutoFieldInfo)
+
+print "Grid-Data-Format-Specific Field List"
+print "------------------------------------"
+print
+print_all_fields(GDFFieldInfo)
+
+print "Generic-Format (Stream) Field List"
+print "----------------------------------"
+print
+print_all_fields(StreamFieldInfo)

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/c254ebdc8abd/
Changeset:   c254ebdc8abd
User:        ngoldbaum
Date:        2013-10-29 23:14:47
Summary:     Bail if exceptions are raised during notebook evaluation.
Affected #:  2 files

diff -r ac966ace33129721fc2c583f506bb855d967d0b3 -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 extensions/notebook_sphinxext.py
--- a/extensions/notebook_sphinxext.py
+++ b/extensions/notebook_sphinxext.py
@@ -51,7 +51,11 @@
         f.write(script_text.encode('utf8'))
         f.close()
 
-        evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval)
+        try:
+            evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval)
+        except:
+            # bail
+            return []
 
         # Create link to notebook and script files
         link_rst = "(" + \

diff -r ac966ace33129721fc2c583f506bb855d967d0b3 -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 extensions/notebookcell_sphinxext.py
--- a/extensions/notebookcell_sphinxext.py
+++ b/extensions/notebookcell_sphinxext.py
@@ -32,7 +32,11 @@
 
         convert_to_ipynb('temp.py', 'temp.ipynb')
 
-        evaluated_text = evaluate_notebook('temp.ipynb')
+        try:
+            evaluated_text = evaluate_notebook('temp.ipynb')
+        except:
+            # bail
+            return []
 
         # create notebook node
         attributes = {'format': 'html', 'source': 'nb_path'}


https://bitbucket.org/yt_analysis/yt-doc/commits/e2e16b897dc8/
Changeset:   e2e16b897dc8
User:        chummels
Date:        2013-10-29 23:47:05
Summary:     Merging.
Affected #:  10 files

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r e2e16b897dc81e5c4de765393c937ba7f641c8ec helper_scripts/show_fields.py
--- a/helper_scripts/show_fields.py
+++ b/helper_scripts/show_fields.py
@@ -17,6 +17,17 @@
 everywhere, "Enzo" fields in Enzo datasets, "Orion" fields in Orion datasets,
 and so on.
 
+Try using ``pf.h.field_list`` and ``pf.h.derived_field_list`` to view the
+native and derived fields available for your dataset, respectively.  For
+example, to display the native fields in alphabetical order:
+
+.. notebook-cell::
+
+  from yt.mods import *
+  pf = load("Enzo_64/DD0043/data0043")
+  for i in sorted(pf.h.field_list):
+    print i
+
 .. note:: Universal fields will be overridden by a code-specific field.
 
 .. rubric:: Table of Contents
@@ -95,7 +106,37 @@
 print
 print_all_fields(FLASHFieldInfo)
 
-print "Nyx-Specific Field List"
+print "Athena-Specific Field List"
 print "--------------------------"
 print
+print_all_fields(AthenaFieldInfo)
+
+print "Nyx-Specific Field List"
+print "-----------------------"
+print
 print_all_fields(NyxFieldInfo)
+
+print "Castro-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(CastroFieldInfo)
+
+print "Chombo-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(ChomboFieldInfo)
+
+print "Pluto-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(PlutoFieldInfo)
+
+print "Grid-Data-Format-Specific Field List"
+print "------------------------------------"
+print
+print_all_fields(GDFFieldInfo)
+
+print "Generic-Format (Stream) Field List"
+print "----------------------------------"
+print
+print_all_fields(StreamFieldInfo)

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r e2e16b897dc81e5c4de765393c937ba7f641c8ec source/api/api.rst
--- a/source/api/api.rst
+++ /dev/null
@@ -1,563 +0,0 @@
-API Reference
-=============
-
-Plots and the Plotting Interface
---------------------------------
-
-SlicePlot and ProjectionPlot
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_window.SlicePlot
-   ~yt.visualization.plot_window.OffAxisSlicePlot
-   ~yt.visualization.plot_window.ProjectionPlot
-   ~yt.visualization.plot_window.OffAxisProjectionPlot
-
-PlotCollection
-^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_collection.PlotCollection
-   ~yt.visualization.plot_collection.PlotCollectionInteractive
-   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
-   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
-   ~yt.visualization.base_plot_types.get_multi_plot
-
-Data Sources
-------------
-
-.. _physical-object-api:
-
-Physical Objects
-^^^^^^^^^^^^^^^^
-
-These are the objects that act as physical selections of data, describing a
-region in space.  These are not typically addressed directly; see
-:ref:`available-objects` for more information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.data_containers.AMRCoveringGridBase
-   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
-   ~yt.data_objects.data_containers.AMRCylinderBase
-   ~yt.data_objects.data_containers.AMRGridCollectionBase
-   ~yt.data_objects.data_containers.AMRRayBase
-   ~yt.data_objects.data_containers.AMROrthoRayBase
-   ~yt.data_objects.data_containers.AMRStreamlineBase
-   ~yt.data_objects.data_containers.AMRProjBase
-   ~yt.data_objects.data_containers.AMRRegionBase
-   ~yt.data_objects.data_containers.AMRSliceBase
-   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
-   ~yt.data_objects.data_containers.AMRSphereBase
-   ~yt.data_objects.data_containers.AMRSurfaceBase
-
-Time Series Objects
-^^^^^^^^^^^^^^^^^^^
-
-These are objects that either contain and represent or operate on series of
-datasets.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.time_series.TimeSeriesData
-   ~yt.data_objects.time_series.TimeSeriesDataObject
-   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
-   ~yt.data_objects.time_series.AnalysisTaskProxy
-
-Frontends
----------
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.grid_patch.AMRGridPatch
-   ~yt.data_objects.hierarchy.AMRHierarchy
-   ~yt.data_objects.static_output.StaticOutput
-
-Enzo
-^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.enzo.data_structures.EnzoGrid
-   ~yt.frontends.enzo.data_structures.EnzoHierarchy
-   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
-
-Orion
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.orion.data_structures.OrionGrid
-   ~yt.frontends.orion.data_structures.OrionHierarchy
-   ~yt.frontends.orion.data_structures.OrionStaticOutput
-
-FLASH
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.flash.data_structures.FLASHGrid
-   ~yt.frontends.flash.data_structures.FLASHHierarchy
-   ~yt.frontends.flash.data_structures.FLASHStaticOutput
-
-Chombo
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.chombo.data_structures.ChomboGrid
-   ~yt.frontends.chombo.data_structures.ChomboHierarchy
-   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
-
-RAMSES
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.ramses.data_structures.RAMSESGrid
-   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
-   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
-
-Derived Datatypes
------------------
-
-Profiles and Histograms
-^^^^^^^^^^^^^^^^^^^^^^^
-
-These types are used to sum data up and either return that sum or return an
-average.  Typically they are more easily used through the
-`yt.visualization.plot_collection` interface.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.profiles.BinnedProfile1D
-   ~yt.data_objects.profiles.BinnedProfile2D
-   ~yt.data_objects.profiles.BinnedProfile3D
-
-Halo Finding and Particle Functions
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Halo finding can be executed using these types.  Here we list the main halo
-finders as well as a few other supplemental objects.
-
-.. rubric:: Halo Finders
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
-   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
-
-You can also operate on the Halo and HaloList objects themselves:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.Halo
-   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
-
-There are also functions for loading halos from disk:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
-
-We have several methods that work to create merger trees:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
-
-You can use halo catalogs generated externally as well:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
-
-Halo Profiling
-^^^^^^^^^^^^^^
-
-yt provides a comprehensive halo profiler that can filter, center, and analyze
-halos en masse.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
-
-
-Two Point Functions
-^^^^^^^^^^^^^^^^^^^
-
-These functions are designed to create correlations or other results of
-operations acting on two spatially-distinct points in a data source.  See also
-:ref:`two_point_functions`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
-   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
-
-Field Types
------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.field_info_container.DerivedField
-   ~yt.data_objects.field_info_container.FieldInfoContainer
-   ~yt.data_objects.field_info_container.ValidateDataField
-   ~yt.data_objects.field_info_container.ValidateGridType
-   ~yt.data_objects.field_info_container.ValidateParameter
-   ~yt.data_objects.field_info_container.ValidateProperty
-   ~yt.data_objects.field_info_container.ValidateSpatial
-
-Image Handling
---------------
-
-For volume renderings and fixed resolution buffers the image object returned is
-an ``ImageArray`` object, which has useful functions for image saving and 
-writing to bitmaps.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.image_array.ImageArray
-   ~yt.data_objects.image_array.ImageArray.write_png
-   ~yt.data_objects.image_array.ImageArray.write_hdf5
-
-Extension Types
----------------
-
-Coordinate Transformations
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
-   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
-
-Cosmology, Star Particle Analysis, and Simulated Observations
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-For the generation of stellar SEDs.  (See also :ref:`star_analysis`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
-
-Light cone generation and simulation analysis.  (See also
-:ref:`light-cone-generator`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
-   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
-
-Absorption and X-ray spectra and spectral lines:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
-
-Absorption spectra fitting:
-
-.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
-
-Sunrise exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
-
-RADMC-3D exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
-
-Radial Column Density
-^^^^^^^^^^^^^^^^^^^^^
-
-If you'd like to calculate the column density out to a given point, from a
-specified center, yt can provide that information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
-
-Volume Rendering
-^^^^^^^^^^^^^^^^
-
-See also :ref:`volume_rendering`.
-
-Here are the primary entry points:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.Camera
-   ~yt.visualization.volume_rendering.camera.off_axis_projection
-   ~yt.visualization.volume_rendering.camera.allsky_projection
-
-These objects set up the way the image looks:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
-
-There are also advanced objects for particular use cases:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
-   ~yt.visualization.volume_rendering.camera.FisheyeCamera
-   ~yt.visualization.volume_rendering.camera.MosaicCamera
-   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
-   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
-   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
-   ~yt.visualization.volume_rendering.camera.StereoPairCamera
-
-Streamlining
-^^^^^^^^^^^^
-
-See also :ref:`streamlines`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.streamlines.Streamlines
-
-Image Writing
-^^^^^^^^^^^^^
-
-These functions are all used for fast writing of images directly to disk,
-without calling matplotlib.  This can be very useful for high-cadence outputs
-where colorbars are unnecessary or for volume rendering.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.image_writer.multi_image_composite
-   ~yt.visualization.image_writer.write_bitmap
-   ~yt.visualization.image_writer.write_projection
-   ~yt.visualization.image_writer.write_fits
-   ~yt.visualization.image_writer.write_image
-   ~yt.visualization.image_writer.map_to_colors
-   ~yt.visualization.image_writer.strip_colormap_data
-   ~yt.visualization.image_writer.splat_points
-   ~yt.visualization.image_writer.annotate_image
-   ~yt.visualization.image_writer.scale_image
-
-We also provide a module that is very good for generating EPS figures,
-particularly with complicated layouts.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.eps_writer.DualEPS
-   ~yt.visualization.eps_writer.single_plot
-   ~yt.visualization.eps_writer.multiplot
-   ~yt.visualization.eps_writer.multiplot_yt
-   ~yt.visualization.eps_writer.return_cmap
-
-.. _image-panner-api:
-
-Derived Quantities
-------------------
-
-See :ref:`derived-quantities`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.derived_quantities._AngularMomentumVector
-   ~yt.data_objects.derived_quantities._BaryonSpinParameter
-   ~yt.data_objects.derived_quantities._BulkVelocity
-   ~yt.data_objects.derived_quantities._CenterOfMass
-   ~yt.data_objects.derived_quantities._Extrema
-   ~yt.data_objects.derived_quantities._IsBound
-   ~yt.data_objects.derived_quantities._MaxLocation
-   ~yt.data_objects.derived_quantities._ParticleSpinParameter
-   ~yt.data_objects.derived_quantities._TotalMass
-   ~yt.data_objects.derived_quantities._TotalQuantity
-   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
-
-.. _callback-api:
-
-Callback List
--------------
-
-
-See also :ref:`callbacks`.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_modifications.ArrowCallback
-   ~yt.visualization.plot_modifications.ClumpContourCallback
-   ~yt.visualization.plot_modifications.ContourCallback
-   ~yt.visualization.plot_modifications.CoordAxesCallback
-   ~yt.visualization.plot_modifications.CuttingQuiverCallback
-   ~yt.visualization.plot_modifications.GridBoundaryCallback
-   ~yt.visualization.plot_modifications.HopCircleCallback
-   ~yt.visualization.plot_modifications.HopParticleCallback
-   ~yt.visualization.plot_modifications.LabelCallback
-   ~yt.visualization.plot_modifications.LinePlotCallback
-   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
-   ~yt.visualization.plot_modifications.ParticleCallback
-   ~yt.visualization.plot_modifications.PointAnnotateCallback
-   ~yt.visualization.plot_modifications.QuiverCallback
-   ~yt.visualization.plot_modifications.SphereCallback
-   ~yt.visualization.plot_modifications.TextLabelCallback
-   ~yt.visualization.plot_modifications.TitleCallback
-   ~yt.visualization.plot_modifications.UnitBoundaryCallback
-   ~yt.visualization.plot_modifications.VelocityCallback
-
-Function List
--------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.convenience.load
-   ~yt.funcs.deprecate
-   ~yt.funcs.ensure_list
-   ~yt.funcs.get_pbar
-   ~yt.funcs.humanize_time
-   ~yt.funcs.insert_ipython
-   ~yt.funcs.iterable
-   ~yt.funcs.just_one
-   ~yt.funcs.only_on_root
-   ~yt.funcs.paste_traceback
-   ~yt.funcs.pdb_run
-   ~yt.funcs.print_tb
-   ~yt.funcs.rootonly
-   ~yt.funcs.time_execution
-   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
-
-Miscellaneous Types
--------------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.config.YTConfigParser
-   ~yt.utilities.parameter_file_storage.ParameterFileStore
-   ~yt.data_objects.data_containers.FakeGridForParticles
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
-
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
-
-
-Testing Infrastructure
-----------------------
-
-The first set of functions is provided by NumPy.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_array_equal
-   ~yt.testing.assert_almost_equal
-   ~yt.testing.assert_approx_equal
-   ~yt.testing.assert_array_almost_equal
-   ~yt.testing.assert_equal
-   ~yt.testing.assert_array_less
-   ~yt.testing.assert_string_equal
-   ~yt.testing.assert_array_almost_equal_nulp
-   ~yt.testing.assert_allclose
-   ~yt.testing.assert_raises
-
-These are yt-provided functions:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_rel_equal
-   ~yt.testing.amrspace
-   ~yt.testing.fake_random_pf
-   ~yt.testing.expand_keywords

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r e2e16b897dc81e5c4de765393c937ba7f641c8ec source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -256,4 +256,4 @@
                        }
 
 if not on_rtd:
-    autosummary_generate = glob.glob("api/api.rst")
+    autosummary_generate = glob.glob("reference/api/api.rst")

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r e2e16b897dc81e5c4de765393c937ba7f641c8ec source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -1,7 +1,7 @@
 .. _cookbook:
 
-The yt Cookbook
-===============
+The Cookbook
+============
 
 yt provides a great deal of functionality to the user, but sometimes it can 
 be a bit complex.  This section of the documentation lays out example recipes 

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r e2e16b897dc81e5c4de765393c937ba7f641c8ec source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -3,38 +3,129 @@
 
 yt is a community-developed analysis and visualization toolkit for
 examining datasets in a variety of scientific disciplines.  yt is developed 
-in Python under the open-source model.  yt currently supports several 
-astrophysical simulation code formats, as well support for :ref:`loading-numpy-array`
-for unsupported data formats.  Fully-supported codes 
-include: `Enzo <http://enzo-project.org/>`_, 
-`Orion <http://flash.uchicago.edu/~rfisher/orion/>`_,
-`Nyx <https://ccse.lbl.gov/Research/NYX/index.html>`_, 
-`FLASH <http://flash.uchicago.edu/website/home/>`_, 
-`Piernik <http://arxiv.org/abs/0901.0104>`_;
-and partially-supported codes include: 
-`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_,
-`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_,
-`Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
-`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_.
-
-yt uses a three-pronged approach to interacting with data:
-
- * Visualize Data - Generate plots, images, and movies for better understanding your datasets
- * Analyze Data - Use additional analysis routines to derive real-world results from your data
- * Examine Data - Directly access raw data with helper functions for making this task easier
+in Python under the open-source model.  As of version 2.6, yt supports
+several astrophysical simulation code formats, as well as
+:ref:`loading-numpy-array` for unsupported data formats.  :ref:`code-support`
+is included for: `Enzo <http://enzo-project.org/>`_, `Orion
+<http://flash.uchicago.edu/~rfisher/orion/>`_, `Nyx
+<https://ccse.lbl.gov/Research/NYX/index.html>`_, `FLASH
+<http://flash.uchicago.edu/website/home/>`_, `Piernik
+<http://arxiv.org/abs/0901.0104>`_, `Athena
+<https://trac.princeton.edu/Athena/>`_, `Chombo <http://chombo.lbl.gov>`_,
+`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_, `Maestro
+<https://ccse.lbl.gov/Research/MAESTRO/>`_, and `Pluto
+<http://plutocode.ph.unito.it/>`_.  (Development of additional codes, including
+particle codes and octree codes, is taking place in yt 3.0.)
 
 Documentation
 =============
 
+.. raw:: html
+
+   <table class="contentstable" align="center">
+
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="installing.html">Installation</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Getting and Installing yt</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="bootcamp/index.html">yt Bootcamp</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Demonstrations of what yt can do</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="cookbook/index.html">The Cookbook</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Example recipes for how to accomplish a variety of tasks</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="visualizing/index.html">Visualizing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Make plots, projections, volume renderings, movies, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="analyzing/index.html">Analyzing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Use analysis tools to extract results from your data</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="examining/index.html">Examining Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Load data and directly access raw values for low-level analysis</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="developing/index.html">Developing in yt</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Catering yt to work for your exact use case</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="reference/index.html">Reference Materials</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Lists of fields, quantities, classes, functions, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="help/index.html">Getting help</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">What to do if you run into problems</p>
+       </td>
+     </tr>
+
+   </table>
+
 .. toctree::
-   :maxdepth: 1
+   :hidden:
 
    installing
-   yt Bootcamp: A Worked Introduction <bootcamp/index>
-   help/index
+   yt Bootcamp <bootcamp/index>
    cookbook/index
    visualizing/index
    analyzing/index
    examining/index
    developing/index
    reference/index
+   help/index

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r e2e16b897dc81e5c4de765393c937ba7f641c8ec source/reference/api/api.rst
--- /dev/null
+++ b/source/reference/api/api.rst
@@ -0,0 +1,563 @@
+API Reference
+=============
+
+Plots and the Plotting Interface
+--------------------------------
+
+SlicePlot and ProjectionPlot
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_window.SlicePlot
+   ~yt.visualization.plot_window.OffAxisSlicePlot
+   ~yt.visualization.plot_window.ProjectionPlot
+   ~yt.visualization.plot_window.OffAxisProjectionPlot
+
+PlotCollection
+^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_collection.PlotCollection
+   ~yt.visualization.plot_collection.PlotCollectionInteractive
+   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
+   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
+   ~yt.visualization.base_plot_types.get_multi_plot
+
+Data Sources
+------------
+
+.. _physical-object-api:
+
+Physical Objects
+^^^^^^^^^^^^^^^^
+
+These are the objects that act as physical selections of data, describing a
+region in space.  These are not typically addressed directly; see
+:ref:`available-objects` for more information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.data_containers.AMRCoveringGridBase
+   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
+   ~yt.data_objects.data_containers.AMRCylinderBase
+   ~yt.data_objects.data_containers.AMRGridCollectionBase
+   ~yt.data_objects.data_containers.AMRRayBase
+   ~yt.data_objects.data_containers.AMROrthoRayBase
+   ~yt.data_objects.data_containers.AMRStreamlineBase
+   ~yt.data_objects.data_containers.AMRProjBase
+   ~yt.data_objects.data_containers.AMRRegionBase
+   ~yt.data_objects.data_containers.AMRSliceBase
+   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
+   ~yt.data_objects.data_containers.AMRSphereBase
+   ~yt.data_objects.data_containers.AMRSurfaceBase
+
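+For instance, spheres and regions are created through the hierarchy object of
+a (here hypothetical) dataset:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("DD0043/data0043")
+   sp = pf.h.sphere([0.5, 0.5, 0.5], (100.0, 'kpc'))   # AMRSphereBase
+   reg = pf.h.region([0.5] * 3, [0.4] * 3, [0.6] * 3)  # AMRRegionBase
+   print sp["Density"].min(), sp["Density"].max()
+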
+Time Series Objects
+^^^^^^^^^^^^^^^^^^^
+
+These objects either contain and represent series of datasets, or operate on
+them.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.time_series.TimeSeriesData
+   ~yt.data_objects.time_series.TimeSeriesDataObject
+   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
+   ~yt.data_objects.time_series.AnalysisTaskProxy
+
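+A brief sketch, assuming a set of Enzo outputs matched by a glob pattern:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   ts = TimeSeriesData.from_filenames("DD????/data????")
+   for pf in ts:
+       print pf.current_time
+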
+Frontends
+---------
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.grid_patch.AMRGridPatch
+   ~yt.data_objects.hierarchy.AMRHierarchy
+   ~yt.data_objects.static_output.StaticOutput
+
+Enzo
+^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.enzo.data_structures.EnzoGrid
+   ~yt.frontends.enzo.data_structures.EnzoHierarchy
+   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
+
+Orion
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.orion.data_structures.OrionGrid
+   ~yt.frontends.orion.data_structures.OrionHierarchy
+   ~yt.frontends.orion.data_structures.OrionStaticOutput
+
+FLASH
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.flash.data_structures.FLASHGrid
+   ~yt.frontends.flash.data_structures.FLASHHierarchy
+   ~yt.frontends.flash.data_structures.FLASHStaticOutput
+
+Chombo
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.chombo.data_structures.ChomboGrid
+   ~yt.frontends.chombo.data_structures.ChomboHierarchy
+   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
+
+RAMSES
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.ramses.data_structures.RAMSESGrid
+   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
+   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
+
+Derived Datatypes
+-----------------
+
+Profiles and Histograms
+^^^^^^^^^^^^^^^^^^^^^^^
+
+These types are used to sum data up and either return that sum or return an
+average.  Typically they are more easily used through the
+`yt.visualization.plot_collection` interface.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.profiles.BinnedProfile1D
+   ~yt.data_objects.profiles.BinnedProfile2D
+   ~yt.data_objects.profiles.BinnedProfile3D
+
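+For example, a mass-weighted temperature profile binned by density (the
+dataset path and bin bounds are illustrative):
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("DD0043/data0043")
+   dd = pf.h.all_data()
+   prof = BinnedProfile1D(dd, 64, "Density", 1e-30, 1e-24, log_space=True)
+   prof.add_fields("Temperature", weight="CellMassMsun")
+   print prof["Temperature"]
+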
+Halo Finding and Particle Functions
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Halo finding can be executed using these types.  Here we list the main halo
+finders as well as a few other supplemental objects.
+
+.. rubric:: Halo Finders
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
+   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
+
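+In the simplest case, running HOP takes a single call (dataset path
+illustrative):
+
+.. code-block:: python
+
+   from yt.mods import *
+   from yt.analysis_modules.halo_finding.api import HaloFinder
+
+   pf = load("DD0043/data0043")
+   halos = HaloFinder(pf)   # HOP by default
+   halos.write_out("HopAnalysis.out")
+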
+You can also operate on the Halo and HaloList objects themselves:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.Halo
+   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
+
+There are also functions for loading halos from disk:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
+
+We have several methods that work to create merger trees:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
+
+You can use halo catalogs generated externally as well:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
+
+Halo Profiling
+^^^^^^^^^^^^^^
+
+yt provides a comprehensive halo profiler that can filter, center, and analyze
+halos en masse.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
+
+
+Two Point Functions
+^^^^^^^^^^^^^^^^^^^
+
+These functions are designed to create correlations or other results of
+operations acting on two spatially-distinct points in a data source.  See also
+:ref:`two_point_functions`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
+   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
+
+Field Types
+-----------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.field_info_container.DerivedField
+   ~yt.data_objects.field_info_container.FieldInfoContainer
+   ~yt.data_objects.field_info_container.ValidateDataField
+   ~yt.data_objects.field_info_container.ValidateGridType
+   ~yt.data_objects.field_info_container.ValidateParameter
+   ~yt.data_objects.field_info_container.ValidateProperty
+   ~yt.data_objects.field_info_container.ValidateSpatial
+
+Image Handling
+--------------
+
+For volume renderings and fixed resolution buffers, the returned image object
+is an ``ImageArray``, which provides convenience methods for saving images and
+writing bitmaps.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.image_array.ImageArray
+   ~yt.data_objects.image_array.ImageArray.write_png
+   ~yt.data_objects.image_array.ImageArray.write_hdf5
+
+Extension Types
+---------------
+
+Coordinate Transformations
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
+   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
+
+Cosmology, Star Particle Analysis, and Simulated Observations
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+These classes compute star formation rates and generate stellar SEDs.
+(See also :ref:`star_analysis`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
+
+Light cone generation and simulation analysis.  (See also
+:ref:`light-cone-generator`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
+   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
+
+Absorption and X-ray spectra and spectral lines:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
+
+Absorption spectra fitting:
+
+.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
+
+Sunrise exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
+
+RADMC-3D exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
+
+Radial Column Density
+^^^^^^^^^^^^^^^^^^^^^
+
+If you'd like to calculate the column density from a specified center out to
+a given point, yt can provide that information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
+
+Volume Rendering
+^^^^^^^^^^^^^^^^
+
+See also :ref:`volume_rendering`.
+
+Here are the primary entry points:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.Camera
+   ~yt.visualization.volume_rendering.camera.off_axis_projection
+   ~yt.visualization.volume_rendering.camera.allsky_projection
+
+These objects set up the way the image looks:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
+
+There are also advanced objects for particular use cases:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
+   ~yt.visualization.volume_rendering.camera.FisheyeCamera
+   ~yt.visualization.volume_rendering.camera.MosaicCamera
+   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
+   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
+   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
+   ~yt.visualization.volume_rendering.camera.StereoPairCamera
+
+Streamlining
+^^^^^^^^^^^^
+
+See also :ref:`streamlines`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.streamlines.Streamlines
+
+Image Writing
+^^^^^^^^^^^^^
+
+These functions are all used for fast writing of images directly to disk,
+without calling matplotlib.  This can be very useful for volume rendering, or
+for high-cadence outputs where colorbars are unnecessary.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.image_writer.multi_image_composite
+   ~yt.visualization.image_writer.write_bitmap
+   ~yt.visualization.image_writer.write_projection
+   ~yt.visualization.image_writer.write_fits
+   ~yt.visualization.image_writer.write_image
+   ~yt.visualization.image_writer.map_to_colors
+   ~yt.visualization.image_writer.strip_colormap_data
+   ~yt.visualization.image_writer.splat_points
+   ~yt.visualization.image_writer.annotate_image
+   ~yt.visualization.image_writer.scale_image
+
+We also provide a module that is well suited to generating EPS figures,
+particularly those with complicated layouts.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.eps_writer.DualEPS
+   ~yt.visualization.eps_writer.single_plot
+   ~yt.visualization.eps_writer.multiplot
+   ~yt.visualization.eps_writer.multiplot_yt
+   ~yt.visualization.eps_writer.return_cmap
+
+.. _image-panner-api:
+
+Derived Quantities
+------------------
+
+See :ref:`derived-quantities`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.derived_quantities._AngularMomentumVector
+   ~yt.data_objects.derived_quantities._BaryonSpinParameter
+   ~yt.data_objects.derived_quantities._BulkVelocity
+   ~yt.data_objects.derived_quantities._CenterOfMass
+   ~yt.data_objects.derived_quantities._Extrema
+   ~yt.data_objects.derived_quantities._IsBound
+   ~yt.data_objects.derived_quantities._MaxLocation
+   ~yt.data_objects.derived_quantities._ParticleSpinParameter
+   ~yt.data_objects.derived_quantities._TotalMass
+   ~yt.data_objects.derived_quantities._TotalQuantity
+   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
+
+.. _callback-api:
+
+Callback List
+-------------
+
+
+See also :ref:`callbacks`.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_modifications.ArrowCallback
+   ~yt.visualization.plot_modifications.ClumpContourCallback
+   ~yt.visualization.plot_modifications.ContourCallback
+   ~yt.visualization.plot_modifications.CoordAxesCallback
+   ~yt.visualization.plot_modifications.CuttingQuiverCallback
+   ~yt.visualization.plot_modifications.GridBoundaryCallback
+   ~yt.visualization.plot_modifications.HopCircleCallback
+   ~yt.visualization.plot_modifications.HopParticleCallback
+   ~yt.visualization.plot_modifications.LabelCallback
+   ~yt.visualization.plot_modifications.LinePlotCallback
+   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
+   ~yt.visualization.plot_modifications.ParticleCallback
+   ~yt.visualization.plot_modifications.PointAnnotateCallback
+   ~yt.visualization.plot_modifications.QuiverCallback
+   ~yt.visualization.plot_modifications.SphereCallback
+   ~yt.visualization.plot_modifications.TextLabelCallback
+   ~yt.visualization.plot_modifications.TitleCallback
+   ~yt.visualization.plot_modifications.UnitBoundaryCallback
+   ~yt.visualization.plot_modifications.VelocityCallback
+
+Function List
+-------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.convenience.load
+   ~yt.funcs.deprecate
+   ~yt.funcs.ensure_list
+   ~yt.funcs.get_pbar
+   ~yt.funcs.humanize_time
+   ~yt.funcs.insert_ipython
+   ~yt.funcs.iterable
+   ~yt.funcs.just_one
+   ~yt.funcs.only_on_root
+   ~yt.funcs.paste_traceback
+   ~yt.funcs.pdb_run
+   ~yt.funcs.print_tb
+   ~yt.funcs.rootonly
+   ~yt.funcs.time_execution
+   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
+
+Miscellaneous Types
+-------------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.config.YTConfigParser
+   ~yt.utilities.parameter_file_storage.ParameterFileStore
+   ~yt.data_objects.data_containers.FakeGridForParticles
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
+
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
+
+
+Testing Infrastructure
+----------------------
+
+The first set of functions is provided by NumPy.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_array_equal
+   ~yt.testing.assert_almost_equal
+   ~yt.testing.assert_approx_equal
+   ~yt.testing.assert_array_almost_equal
+   ~yt.testing.assert_equal
+   ~yt.testing.assert_array_less
+   ~yt.testing.assert_string_equal
+   ~yt.testing.assert_array_almost_equal_nulp
+   ~yt.testing.assert_allclose
+   ~yt.testing.assert_raises
+
+These are yt-provided functions:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_rel_equal
+   ~yt.testing.amrspace
+   ~yt.testing.fake_random_pf
+   ~yt.testing.expand_keywords

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r e2e16b897dc81e5c4de765393c937ba7f641c8ec source/reference/changelog.rst
--- a/source/reference/changelog.rst
+++ b/source/reference/changelog.rst
@@ -15,6 +15,8 @@
  * David Collins
  * Brian Crosby
  * Andrew Cunningham
+ * Hilary Egan
+ * John Forbes
  * Nathan Goldbaum
  * Markus Haider
  * Cameron Hummels
@@ -24,17 +26,22 @@
  * Kacper Kowalik
  * Michael Kuhlen
  * Eve Lee
+ * Sam Leitner
  * Yuan Li
  * Chris Malone
  * Josh Moloney
  * Chris Moody
  * Andrew Myers
+ * Jill Naiman
+ * Kaylea Nelson
  * Jeff Oishi
  * Jean-Claude Passy
  * Mark Richardson
 * Thomas Robitaille
  * Anna Rosen
+ * Douglas Rudd
  * Anthony Scopatz
+ * Noel Scudder
  * Devin Silvia
  * Sam Skillman
  * Stephen Skory
@@ -45,9 +52,98 @@
  * Stephanie Tonnesen
  * Matthew Turk
  * Rick Wagner
+ * Andrew Wetzel
  * John Wise
  * John ZuHone
 
+Version 2.6
+-----------
+
+This is a scheduled release, bringing to a close the development in the 2.5
+series.  Below are the itemized, aggregate changes since version 2.5.
+
+Major changes:
+
+  * yt is now licensed under the 3-clause BSD license.
+  * HEALpix has been removed for the time being, as a result of licensing
+    incompatibility.
+  * The addition of a frontend for the Pluto code
+  * The addition of an OBJ exporter to enable transparent and multi-surface
+    exports of surfaces to Blender and Sketchfab
+  * New absorption spectrum analysis module with documentation
+  * Adding ability to draw lines with Grey Opacity in volume rendering
+  * Updated physical constants to reflect 2010 CODATA data
+  * Dependency updates (including IPython 1.0)
+  * Better notebook support for yt plots
+  * Considerably (10x+) faster kD-tree building for volume rendering
+  * yt can now export to RADMC3D
+  * Athena frontend now supports Static Mesh Refinement and units (
+    http://hub.yt-project.org/nb/7l1zua )
+  * Fix long-standing bug for plotting arrays with range of zero
+  * Adding option to have interpolation based on non-uniform bins in
+    interpolator code
+  * Upgrades to most of the dependencies in the install script
+  * ProjectionPlot now accepts a data_source keyword argument
+
+Minor or bugfix changes:
+
+  * Fix for volume rendering on the command line
+  * map_to_colormap will no longer return out-of-bounds errors
+  * Fixes for dds in covering grid calculations
+  * Library searching for build process is now more reliable
+  * Unit fix for "VorticityGrowthTimescale" field
+  * Pyflakes stylistic fixes
+  * Number density added to FLASH
+  * Many fixes for Athena frontend
+  * Radius and ParticleRadius now work for reduced-dimensionality datasets
+  * Source distributions now work again!
+  * Athena data now 64 bits everywhere
+  * Grids displayed on plots are now shaded to reflect the level of refinement
+  * show_colormaps() is a new function for displaying all known colormaps
+  * PhasePlotter by default now adds a colormap.
+  * System build fix for POSIX systems
+  * Fixing domain offsets for halo centers-of-mass
+  * Removing some Enzo-specific terminology in the Halo Mass Function
+  * Addition of coordinate vectors on volume render
+  * Pickling fix for extracted regions
+  * Addition of some tracer particle annotation functions
+  * Better error message for "yt" command
+  * Fix for radial vs poloidal fields
+  * Piernik 2D data handling fix
+  * Fixes for FLASH current redshift
+  * PlotWindows now have a set_font function and a new default font setting
+  * Colorbars less likely to extend off the edge of a PlotWindow
+  * Clumps overplotted on PlotWindows are now correctly contoured
+  * Many fixes to light ray and profiles for integrated cosmological analysis
+  * Improvements to OpenMP compilation
+  * Typo in value for km_per_pc (not used elsewhere in the code base) has been
+    fixed
+  * Enable parallel IPython notebook sessions (
+    http://hub.yt-project.org/nb/qgn19h )
+  * Small (~1e-6) change to particle_density deposition, enabling it to be
+    used by FLASH and other frontends
+  * Addition of is_root function for convenience in parallel analysis sessions
+  * Additions to Orion particle reader
+  * Fixing TotalMass for case when particles not present
+  * Fixing the density threshold for HOP and pHOP to match the merger tree
+  * Reason can now plot with latest plot window
+  * Issues with VelocityMagnitude and velocity aliases have been corrected in
+    the FLASH frontend
+  * Halo radii are calculated correctly for domains that do not start at 0,0,0.
+  * Halo mass function now works for non-Enzo frontends.
+  * Bug fixes for directory creation, typos in docstrings
+  * Speed improvements to ellipsoidal particle detection
+  * Updates to FLASH fields
+  * CASTRO frontend bug fixes
+  * Fisheye camera bug fixes
+  * Answer testing now includes plot window answer testing
+  * Athena data serialization
+  * load_uniform_grid can now decompose dims >= 1024.  (#537)
+  * Axis unit setting works correctly for unit names  (#534)
+  * ThermalEnergy is now calculated correctly for Enzo MHD simulations (#535)
+  * Radius fields had an asymmetry in periodicity calculation (#531)
+  * Boolean regions can now be pickled (#517)
+
 Version 2.5
 -----------
 

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r e2e16b897dc81e5c4de765393c937ba7f641c8ec source/reference/code_support.rst
--- /dev/null
+++ b/source/reference/code_support.rst
@@ -0,0 +1,53 @@
+
+.. _code-support:
+
+Code Support
+============
+
+Levels of Support for Various Codes
+-----------------------------------
+
+yt provides frontends to support several different simulation code formats 
+as inputs.  Below is a table showing the level of support provided for
+each code.
+
+|
+
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Capability           | Enzo | Orion | FLASH | Nyx  | Piernik | Athena | Castro | Maestro | Pluto | Chombo |
++======================+======+=======+=======+======+=========+========+========+=========+=======+========+
+| Fluid Quantities     |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Particles            |   Y  |   Y   |   Y   |  Y   |   N/A   |   N    |   Y    |   N     |   N   |    N   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Parameters           |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Units                |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Read on Demand       |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Load Raw Data        |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Part of test suite   |   Y  |   Y   |   Y   |  Y   |    N    |   N    |   Y    |   N     |   N   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Level of Support     | Full | Full  | Full  | Full |  Full   |  Full  |  Part  |  Part   | Part  |  Part  |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+
+|
+
+If you have a dataset from a code not yet supported, you can either 
+input your data using the :ref:`loading-numpy-array` format, or help us by 
+:ref:`creating_frontend` for this new format.
+
+Future Codes to Support
+-----------------------
+
+A major overhaul of the code was required in order to cleanly support 
+additional codes.  Development in the yt 3.x branch has begun and provides 
+support for codes such as 
+`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_, 
+`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_, and 
+`Gadget <http://www.mpa-garching.mpg.de/gadget/>`_.  Please switch to that 
+version of yt for the most up-to-date support for those codes.
+
+Additionally, in yt 3.0 the Boxlib formats have been unified and streamlined.

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/dac3a79b2dc4/
Changeset:   dac3a79b2dc4
User:        chummels
Date:        2013-10-30 03:20:16
Summary:     Stripping out Ramses and ART from how to load data.
Affected #:  1 file

diff -r e2e16b897dc81e5c4de765393c937ba7f641c8ec -r dac3a79b2dc452e3ad14ec0839f97baafd0b1dab source/examining/loading_data.rst
--- a/source/examining/loading_data.rst
+++ b/source/examining/loading_data.rst
@@ -11,40 +11,52 @@
 Generic Array Data
 ------------------
 
-Even if your data is not strictly related to fields commonly used in
-astrophysical codes or your code is not supported yet, you can still feed it to
-``yt`` to use its advanced visualization and analysis facilities. The only
-requirement is that your data can be represented as one or more uniform, three
-dimensional numpy arrays. Assuming that you have your data in ``arr``,
-the following code:
+If you have a data format that is unsupported by the existing code frontends,
+you can still read your data into ``yt``.  The only requirement is that your
+data can be represented as one or more uniform, three-dimensional numpy
+arrays.
+
+For example, let's say you have a 3D dataset of a density field in a cubical
+volume that is 3 Mpc on a side.  Assuming that you have managed to get your
+data into a numpy array called ``a``, you can read it in using the following
+code:
 
 .. code-block:: python
 
    from yt.frontends.stream.api import load_uniform_grid
 
-   data = dict(Density = arr)
+   data = dict(Density = a)
    bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
-   pf = load_uniform_grid(data, arr.shape, 3.08e24, bbox=bbox, nprocs=12)
+   pf = load_uniform_grid(data, a.shape, 3.08e24, bbox=bbox, nprocs=12)
 
-will create ``yt``-native parameter file ``pf`` that will treat your array as
-density field in cubic domain of 3 Mpc edge size (3 * 3.08e24 cm) and
-simultaneously divide the domain into 12 chunks, so that you can take advantage
-of the underlying parallelism. 
+This will generate a ``yt``-native parameter file ``pf``.  It will treat 
+your array as a density field in a cubic domain of 3 Mpc edge size (3 * 3.08e24 cm) 
+and simultaneously divide the domain into 12 chunks, so that you can take advantage
+of the underlying parallelism.  If you want to use a field name other 
+than ``Density``, you can set that here as well.  To disable parallelism, 
+simply omit the ``nprocs`` keyword.
+
+You can now use ``pf`` as though it were a normal parameter file, just as in 
+the many examples in the Cookbook.
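+
+For instance, a minimal sketch (assuming the ``pf`` created above) that
+slices through the middle of the fake density cube:
+
+.. code-block:: python
+
+   from yt.mods import SlicePlot
+
+   # Slice along the z-axis through the domain center and save an image.
+   slc = SlicePlot(pf, 'z', 'Density')
+   slc.save()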
 
 Particle fields are detected as one-dimensional fields. The number of
-particles is set by the ``number_of_particles`` key in
-``data``. Particle fields are then added as one-dimensional arrays in
-a similar manner as the three-dimensional grid fields:
+particles is set by the ``number_of_particles`` key in the ``data`` dictionary. 
+Particle fields are then added as one-dimensional arrays in a similar manner 
+as the three-dimensional grid fields.  So starting again with a 3D numpy array 
+``a`` that represents our Density field:
 
 .. code-block:: python
 
    from yt.frontends.stream.api import load_uniform_grid
 
-   data = dict(Density = dens, 
+   data = dict(Density = a, 
                number_of_particles = 1000000,
                particle_position_x = posx_arr, 
-	       particle_position_y = posy_arr,
-	       particle_position_z = posz_arr)
+	           particle_position_y = posy_arr,
+	           particle_position_z = posz_arr)
    bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
    pf = load_uniform_grid(data, a.shape, 3.08e24, bbox=bbox, nprocs=12)
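+
+In the sketch above, ``posx_arr``, ``posy_arr``, and ``posz_arr`` are assumed
+to already hold your particle positions; for a self-contained test, they
+could simply be drawn at random from the bounding box:
+
+.. code-block:: python
+
+   import numpy as np
+
+   # Illustrative stand-ins for real particle positions, drawn uniformly
+   # from the bounding box in code units.
+   posx_arr = np.random.uniform(low=-1.5, high=1.5, size=1000000)
+   posy_arr = np.random.uniform(low=-1.5, high=1.5, size=1000000)
+   posz_arr = np.random.uniform(low=-1.5, high=1.5, size=1000000)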
 
@@ -175,114 +187,6 @@
   positions will not be.
 * Domains may be visualized assuming periodicity.
 
-.. _loading-ramses-data:
-
-RAMSES Data
------------
-
-RAMSES data enjoys preliminary support and is cared for by Matthew Turk.  If
-you are interested in taking a development or stewardship role, please contact
-him.  To load a RAMSES dataset, you can use the ``load`` command provided by
-``yt.mods`` and supply to it the ``info*.txt`` filename.  For instance, if you
-were in a directory with the following files:
-
-.. code-block:: none
-
-   output_00007
-   output_00007/amr_00007.out00001
-   output_00007/grav_00007.out00001
-   output_00007/hydro_00007.out00001
-   output_00007/info_00007.txt
-   output_00007/part_00007.out00001
-
-You would feed it the filename ``output_00007/info_00007.txt``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("output_00007/info_00007.txt")
-
-.. rubric:: Caveats
-
-* Please be careful that the units are correctly set!  This may not be the
-  case for RAMSES data
-* Upon instantiation of the hierarchy, yt will attempt to regrid the entire
-  domain to ensure minimum-coverage from a set of grid patches.  (This is
-  described in the yt method paper.)  This is a time-consuming process and it
-  has not yet been written to be stored between calls.
-* Particles are not supported
-* Parallelism will not be terribly efficient for large datasets
-* There may be occasional segfaults on multi-domain data, which do not
-  reflect errors in the calculation
-
-If you are interested in helping with RAMSES support, we are eager to hear from
-you!
-
-.. _loading-art-data:
-
-ART Data
---------
-
-ART data enjoys preliminary support and is supported by Christopher Moody.
-Please contact the ``yt-dev`` mailing list if you are interested in using yt
-for ART data, or if you are interested in assisting with development of yt to
-work with ART data.
-
-At the moment, the ART octree is 'regridded' at each level to make the native
-octree look more like a mesh-based code. As a result, the initial outlay
-is about ~60 seconds to grid octs onto a mesh. This will be improved in 
-``yt-3.0``, where octs will be supported natively. 
-
-To load an ART dataset you can use the ``load`` command provided by 
-``yt.mods`` and passing the gas mesh file. It will search for and attempt 
-to find the complementary dark matter and stellar particle header and data 
-files. However, your simulations may not follow the same naming convention.
-
-So for example, a single snapshot might have a series of files looking like
-this:
-
-.. code-block:: none
-
-   10MpcBox_csf512_a0.300.d    #Gas mesh
-   PMcrda0.300.DAT             #Particle header
-   PMcrs0a0.300.DAT            #Particle data (positions,velocities)
-   stars_a0.300.dat            #Stellar data (metallicities, ages, etc.)
-
-The ART frontend tries to find the associated files matching the above, but
-if that fails you can specify ``file_particle_data``,``file_particle_data``,
-``file_star_data`` in addition to the specifying the gas mesh. You also have 
-the option of gridding particles, and assigning them onto the meshes.
-This process is in beta, and for the time being it's probably  best to leave
-``do_grid_particles=False`` as the default.
-
-To speed up the loading of an ART file, you have a few options. You can turn 
-off the particles entirely by setting ``discover_particles=False``. You can
-also only grid octs up to a certain level, ``limit_level=5``, which is useful
-when debugging by artificially creating a 'smaller' dataset to work with.
-
-Finally, when stellar ages are computed we 'spread' the ages evenly within a
-smoothing window. By default this is turned on and set to 10Myr. To turn this 
-off you can set ``spread=False``, and you can tweak the age smoothing window
-by specifying the window in seconds, ``spread=1.0e7*265*24*3600``. 
-
-.. code-block:: python
-    
-   from yt.mods import *
-
-   file = "/u/cmoody3/data/art_snapshots/SFG1/10MpcBox_csf512_a0.460.d"
-   pf = load(file,discover_particles=True,grid_particles=2,limit_level=3)
-   pf.h.print_stats()
-   dd=pf.h.all_data()
-   print np.sum(dd['particle_type']==0)
-
-In the above example code, the first line imports the standard yt functions,
-followed by defining the gas mesh file. It's loaded only through level 3, but
-grids particles on to meshes on level 2 and higher. Finally, we create a data
-container and ask it to gather the particle_type array. In this case ``type==0``
-is for the most highly-refined dark matter particle, and we print out how many
-high-resolution star particles we find in the simulation.  Typically, however,
-you shouldn't have to specify any keyword arguments to load in a dataset.
-
 .. loading-amr-data:
 
 Generic AMR Data


https://bitbucket.org/yt_analysis/yt-doc/commits/68e1842a5b10/
Changeset:   68e1842a5b10
User:        chummels
Date:        2013-10-30 04:30:40
Summary:     Adding update instructions to the installation instructions.
Affected #:  1 file

diff -r dac3a79b2dc452e3ad14ec0839f97baafd0b1dab -r 68e1842a5b103ec913b94372c79dc655d03028e6 source/installing.rst
--- a/source/installing.rst
+++ b/source/installing.rst
@@ -1,6 +1,9 @@
 Getting and Installing yt
 =========================
 
+Getting yt
+----------
+
 yt is a Python package (with some components written in C), using NumPy as a
 computation engine, Matplotlib for some visualization tasks and Mercurial for
 version control.  Because installation of all of these interlocking parts can 
@@ -16,10 +19,14 @@
 
   http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
 
-By default, it will install an array of items, but there are additional packages
-that can be downloaded and installed (e.g. SciPy, enzo, etc.). The script has 
-all of these options at the top of the file. You should be able to open it and 
-edit it without any knowledge of bash syntax.  To execute it, run:
+Installing yt
+-------------
+
+By default, the bash script will install an array of items, but there are 
+additional packages that can be downloaded and installed (e.g. SciPy, enzo, 
+etc.). The script has all of these options at the top of the file. You should 
+be able to open it and edit it without any knowledge of bash syntax.  
+To execute it, run:
 
 .. code-block:: bash
 
@@ -95,3 +102,21 @@
 If you get an error, follow the instructions it gives you to debug the problem.  
 Do not hesitate to :ref:`contact us <asking-for-help>` so we can help you 
 figure it out.
+
+Updating yt and its dependencies
+--------------------------------
+
+With many active developers, code development sometimes occurs at a furious 
+pace in yt.  To make sure you're using the latest version of the code, run
+this command at the command line:
+
+.. code-block:: bash
+
+  $ yt update
+
+Additionally, if you want to make sure you have the latest dependencies 
+associated with yt and update the codebase simultaneously, type this:
+
+.. code-block:: bash
+
+  $ yt update --all


https://bitbucket.org/yt_analysis/yt-doc/commits/c2e74d7656ee/
Changeset:   c2e74d7656ee
User:        chummels
Date:        2013-10-30 05:23:08
Summary:     Cleaning up introduction bootcamp
Affected #:  1 file

diff -r 68e1842a5b103ec913b94372c79dc655d03028e6 -r c2e74d7656ee7b147d9ca270de67263a86f13cb3 source/bootcamp/Introduction.ipynb
--- a/source/bootcamp/Introduction.ipynb
+++ b/source/bootcamp/Introduction.ipynb
@@ -15,52 +15,26 @@
       "\n",
       "In this brief tutorial, we'll go over how to load up data, analyze things, inspect your data, and make some visualizations.\n",
       "\n",
-      "But, before we begin, there are a few places to go if you run into trouble.\n",
-      "\n",
-      "**The yt homepage is at http://yt-project.org/**\n",
-      "\n",
-      "## Source of Help\n",
-      "\n",
-      "There are three places to check for help:\n",
-      "\n",
-      " * The documentation: http://yt-project.org/doc/\n",
-      " * The IRC Channel (`#yt` on `chat.freenode.net`, also at http://yt-project.org/irc.html)\n",
-      " * The `yt-users` mailing list, at http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org\n",
-      "\n",
-      "## Sources of Information\n",
-      "\n",
-      "The first place to go for information about any kind of development is BitBucket at https://bitbucket.org/yt_analysis/yt/ , which contains a bug tracker, the source code, and links to other useful places.\n",
-      "\n",
-      "You can find recipes in the documentation ( http://yt-project.org/doc/ ) under the \"Cookbook\" section.\n",
-      "\n",
-      "There is a portal with access to data and IPython notebooks at http://hub.yt-project.org/ .\n",
-      "\n",
-      "## How to Update yt\n",
-      "\n",
-      "If you ever run into a situation where you need to update your yt installation, simply type this on the command line:\n",
-      "\n",
-      "`yt update`\n",
-      "\n",
-      "This will automatically update it for you.\n",
+      "Our documentation page can provide information on a variety of the commands that are used here, both in narrative documentation as well as recipes for specific functionality in our cookbook.  The documentation exists at http://yt-project.org/doc/.  If you encounter problems, look for help here: http://yt-project.org/doc/help/index.html.\n",
       "\n",
       "## Acquiring the datasets for this tutorial\n",
       "\n",
-      "To access the datasets that are used in these bootcamp tutorials, you can either download them manually at http://yt-project.org/data/.\n",
+      "If you are executing these tutorials interactively, you need some sample datasets on which to run the code.  You can download these datasets at http://yt-project.org/data/.  The datasets necessary for each lesson are noted next to the corresponding tutorial.\n",
       "\n",
       "## What's Next?\n",
       "\n",
       "The Notebooks are meant to be explored in this order:\n",
       "\n",
       "1. Introduction\n",
-      "2. Data Inspection\n",
-      "3. Simple Visualization\n",
-      "4. Data Objects and Time Series\n",
-      "5. Derived Fields and Profiles\n",
-      "6. Volume Rendering"
+      "2. Data Inspection (IsolatedGalaxy dataset)\n",
+      "3. Simple Visualization (enzo_tiny_cosmology & Enzo_64 datasets)\n",
+      "4. Data Objects and Time Series (IsolatedGalaxy dataset)\n",
+      "5. Derived Fields and Profiles (IsolatedGalaxy dataset)\n",
+      "6. Volume Rendering (IsolatedGalaxy dataset)"
      ]
     }
    ],
    "metadata": {}
   }
  ]
-}
\ No newline at end of file
+}


https://bitbucket.org/yt_analysis/yt-doc/commits/1dec03d4ff07/
Changeset:   1dec03d4ff07
User:        ngoldbaum
Date:        2013-10-30 03:56:08
Summary:     Updating the callback docs.
Affected #:  4 files

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r 1dec03d4ff076d7546ad559894e2eaebda97d0f7 extensions/pythonscript_sphinxext.py
--- a/extensions/pythonscript_sphinxext.py
+++ b/extensions/pythonscript_sphinxext.py
@@ -2,7 +2,7 @@
 from subprocess import Popen,PIPE
 from docutils.parsers.rst import directives
 from docutils import nodes
-import os, glob, shutil,  uuid, re
+import os, glob, shutil, uuid, re, string
 
 class PythonScriptDirective(Directive):
     """Execute an inline python script and display images.
@@ -26,6 +26,9 @@
         dest_dir = os.path.abspath(os.path.join(setup.app.builder.outdir,
                                                 source_dir))
 
+        # working around a docutils/sphinx issue?
+        dest_dir = string.replace(dest_dir, 'internal padding after ', '')
+
         if not os.path.exists(dest_dir):
             os.makedirs(dest_dir) # no problem here for me, but just use built-ins
 
@@ -47,6 +50,7 @@
         for im in images:
             fns.append(str(uuid.uuid4()) + ".png")
             shutil.move(im, os.path.join(dest_dir, fns[-1]))
+            print im, os.path.join(dest_dir, fns[-1])
 
         os.remove("temp.py")
 
@@ -73,7 +77,7 @@
     "[a-fA-F0-9]{8}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{12}"
 
 def cleanup(app, exception):
-    """ Cleanup all png files with UUID filenames in the source """ 
+    """ Cleanup all png files with UUID filenames in the source """
     for root,dirnames,filenames in os.walk(app.srcdir):
         matches = re.findall(PATTERN, "\n".join(filenames))
         for match in matches:

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r 1dec03d4ff076d7546ad559894e2eaebda97d0f7 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -250,7 +250,7 @@
 
 # Example configuration for intersphinx: refer to the Python standard library.
 intersphinx_mapping = {'http://docs.python.org/': None,
-                       'http://ipython.org/ipython-doc/rel-1.10/html/': None,
+                       'http://ipython.org/ipython-doc/rel-1.10/': None,
                        'http://docs.scipy.org/doc/numpy/': None,
                        'http://matplotlib.sourceforge.net/': None,
                        }

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r 1dec03d4ff076d7546ad559894e2eaebda97d0f7 source/visualizing/_cb_docstrings.inc
--- a/source/visualizing/_cb_docstrings.inc
+++ b/source/visualizing/_cb_docstrings.inc
@@ -1,6 +1,4 @@
-
-
-.. function:: arrow(self, pos, code_size, plot_args=None):
+.. function:: annotate_arrow(self, pos, code_size, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ArrowCallback`.)
 
@@ -8,18 +6,47 @@
    *code_size* in code units.  *plot_args* is a dict fed to
    matplotlib with arrow properties.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'Density', width=(10,'kpc'), center='max')
+   slc.annotate_arrow((0.53, 0.53, 0.53), 1/pf['kpc'])
+   slc.save()
 
-.. function:: clumps(self, clumps, plot_args=None):
+-------------
+
+.. function:: annotate_clumps(self, clumps, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ClumpContourCallback`.)
 
    Take a list of *clumps* and plot them as a set of
    contours.
 
+.. python-script::
 
+   from yt.mods import *
+   from yt.analysis_modules.level_sets.api import *
 
-.. function:: contour(self, field, ncont=5, factor=4, take_log=False, clim=None, plot_args=None):
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   data_source = pf.h.disk([0.5, 0.5, 0.5], [0., 0., 1.],
+                           8./pf.units['kpc'], 1./pf.units['kpc'])
+
+   c_min = 10**na.floor(na.log10(data_source['Density']).min()  )
+   c_max = 10**na.floor(na.log10(data_source['Density']).max()+1)
+
+   function = 'self.data[\'Density\'].size > 20'
+   master_clump = Clump(data_source, None, 'Density', function=function)
+   find_clumps(master_clump, c_min, c_max, 2.0)
+   leaf_clumps = get_lowest_clumps(master_clump)
+
+   prj = ProjectionPlot(pf, 2, 'Density', center='c', width=(20,'kpc'))
+   prj.annotate_clumps(leaf_clumps)
+   prj.save('clumps')
+
+-------------
+
+.. function:: annotate_contour(self, field, ncont=5, factor=4, take_log=False, clim=None, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ContourCallback`.)
 
@@ -29,18 +56,17 @@
    how it is contoured and *clim* gives the (upper, lower)
    limits for contouring.
 
+.. python-script::
+   
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   s = SlicePlot(pf, "x", ["Density"], center="max")
+   s.annotate_contour("Temperature")
+   s.save()
 
+-------------
 
-.. function:: coord_axes(self, unit=None, coords=False):
-
-   (This is a proxy for :class:`~yt.visualization.plot_modifications.CoordAxesCallback`.)
-
-   Creates x and y axes for a VMPlot. In the future, it will
-   attempt to guess the proper units to use.
-
-
-
-.. function:: cquiver(self, field_x, field_y, factor):
+.. function:: annotate_cquiver(self, field_x, field_y, factor):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.CuttingQuiverCallback`.)
 
@@ -48,9 +74,18 @@
    *field_x* and *field_y*, skipping every *factor*
    datapoint in the discretization.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   s = OffAxisSlicePlot(pf, [1,1,0], ["Density"], center="c")
+   s.annotate_cquiver('CuttingPlaneVelocityX', 'CuttingPlaneVelocityY', 10)
+   s.zoom(1.5)
+   s.save()
 
-.. function:: grids(self, alpha=1.0, min_pix=1, annotate=False, periodic=True):
+-------------
+
+.. function:: annotate_grids(self, alpha=1.0, min_pix=1, annotate=False, periodic=True):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.GridBoundaryCallback`.)
 
@@ -59,18 +94,35 @@
    wide. *annotate* puts the grid id in the corner of the
    grid.  (Not so great in projections...)
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'Density', width=(10,'kpc'), center='max')
+   slc.annotate_grids()
+   slc.save()
 
-.. function:: hop_circles(self, hop_output, max_number=None, annotate=False, min_size=20, max_size=10000000, font_size=8, print_halo_size=False, print_halo_mass=False, width=None):
+-------------
+
+.. function:: annotate_hop_circles(self, hop_output, max_number=None, annotate=False, min_size=20, max_size=10000000, font_size=8, print_halo_size=False, print_halo_mass=False, width=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.HopCircleCallback`.)
 
    Accepts a :class:`yt.HopList` *hop_output* and plots up
    to *max_number* (None for unlimited) halos as circles.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   halos = HaloFinder(pf)
+   p = ProjectionPlot(pf, "z", "Density")
+   p.annotate_hop_circles(halos)
+   p.save()
 
-.. function:: hop_particles(self, hop_output, max_number, p_size=1.0, min_size=20, alpha=0.2):
+-------------
+
+.. function:: annotate_hop_particles(self, hop_output, max_number, p_size=1.0, min_size=20, alpha=0.2):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.HopParticleCallback`.)
 
@@ -80,34 +132,51 @@
    plotted with *p_size* pixels per particle;  *alpha*
    determines the opacity of each particle.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   halos = HaloFinder(pf)
+   p = ProjectionPlot(pf, "x", "Density", center='m', width=(10, 'Mpc'))
+   p.annotate_hop_particles(halos, max_number=100, p_size=5.0)
+   p.save()
 
-.. function:: image_line(self, p1, p2, data_coords=False, plot_args=None):
+-------------
+
+.. function:: annotate_image_line(self, p1, p2, data_coords=False, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ImageLineCallback`.)
 
-   Plot from *p1* to *p2* (image plane coordinates) with
+   Plot from *p1* to *p2* (normalized image plane coordinates) with
    *plot_args* fed into the plot.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   p.annotate_image_line((0.3, 0.4), (0.8, 0.9), plot_args={'linewidth':5})
+   p.save()
 
-.. function:: axis_label(self, label):
+-------------
 
-   (This is a proxy for :class:`~yt.visualization.plot_modifications.LabelCallback`.)
-
-   This adds a label to the plot.
-
-
-
-.. function:: line(self, x, y, plot_args=None):
+.. function:: annotate_line(self, x, y, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.LinePlotCallback`.)
 
-   Over plot *x* and *y* with *plot_args* fed into the plot.
+   Over plot *x* and *y* (in code units) with *plot_args* fed into the plot.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   p.annotate_line([-6, -4, -2, 0, 2, 4, 6], [3.6, 1.6, 0.4, 0, 0.4, 1.6, 3.6], plot_args={'linewidth':5})
+   p.save()
 
-.. function:: magnetic_field(self, factor=16, scale=None, scale_units=None, normalize=False):
+-------------
+
+.. function:: annotate_magnetic_field(self, factor=16, scale=None, scale_units=None, normalize=False):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.MagFieldCallback`.)
 
@@ -120,19 +189,35 @@
    features to be more clearly seen for fields with
    substantial variation in field strength.
 
+.. code-block:: python
 
+   from yt.mods import *
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   p = ProjectionPlot(pf, 'z', 'Density', center='c', width=(20, 'kpc'))
+   p.annotate_magnetic_field()
+   p.save()
 
-.. function:: marker(self, pos, marker='x', plot_args=None):
+-------------
+
+.. function:: annotate_marker(self, pos, marker='x', plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.MarkerAnnotateCallback`.)
 
-   Adds text *marker* at *pos* in code-arguments.
+   Adds text *marker* at *pos* in code coordinates.
    *plot_args* is a dict that will be forwarded to the plot
    command.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   s = SlicePlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   s.annotate_marker([0.53, 0.53, 0.53], plot_args={'s':10000})
+   s.save()   
 
-.. function:: particles(self, width, p_size=1.0, col='k', marker='o', stride=1.0, ptype=None, stars_only=False, dm_only=False, minimum_mass=None):
+-------------
+
+.. function:: annotate_particles(self, width, p_size=1.0, col='k', marker='o', stride=1.0, ptype=None, stars_only=False, dm_only=False, minimum_mass=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ParticleCallback`.)
 
@@ -145,9 +230,17 @@
    given mass, calculated via ParticleMassMsun, to be
    plotted.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   p = ProjectionPlot(pf, "x", "Density", center='m', width=(10, 'Mpc'))
+   p.annotate_particles(10/pf['Mpc'])
+   p.save()
 
-.. function:: point(self, pos, text, text_args=None):
+-------------
+
+.. function:: annotate_point(self, pos, text, text_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.PointAnnotateCallback`.)
 
@@ -155,9 +248,17 @@
    code-space. *text_args* is a dict fed to the text
    placement code.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   p.annotate_point([0.53, 0.526, 0.53], "What's going on here?", text_args={'size':'xx-large', 'color':'w'})
+   p.save()
 
-.. function:: quiver(self, field_x, field_y, factor, scale=None, scale_units=None, normalize=False):
+-------------
+
+.. function:: annotate_quiver(self, field_x, field_y, factor, scale=None, scale_units=None, normalize=False):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.QuiverCallback`.)
 
@@ -167,9 +268,18 @@
    length unit using *scale_units*  (see
    matplotlib.axes.Axes.quiver for more info)
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center=[0.53, 0.53, 0.53], 
+                      weight_field='Density', width=(20, 'kpc'))
+   p.annotate_quiver('x-velocity', 'y-velocity', 16)
+   p.save()
 
-.. function:: sphere(self, center, radius, circle_args=None, text=None, text_args=None):
+-------------
+
+.. function:: annotate_sphere(self, center, radius, circle_args=None, text=None, text_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.SphereCallback`.)
 
@@ -177,9 +287,17 @@
    *radius* in code units will be created, with optional
    *circle_args*, *text*, and *text_args*.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center=[0.53, 0.53, 0.53], width=(20, 'kpc'))
+   p.annotate_sphere([0.53, 0.53, 0.53], 2/pf['kpc'], {'fill':True})
+   p.save()
 
-.. function:: streamlines(self, field_x, field_y, factor=6.0, nx=16, ny=16, xstart=(0, 1), ystart=(0, 1), nsample=256, start_at_xedge=False, start_at_yedge=False, plot_args=None):
+-------------
+
+.. function:: annotate_streamlines(self, field_x, field_y, factor=6.0, nx=16, ny=16, xstart=(0, 1), ystart=(0, 1), nsample=256, start_at_xedge=False, start_at_yedge=False, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.StreamlineCallback`.)
 
@@ -191,9 +309,17 @@
    use *start_at_yedge*.  A line with the qmean vector
    magnitude will cover 1.0/*factor* of the image.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   s = SlicePlot(pf, 'z', 'Density', center=[0.53, 0.53, 0.53], width=(20, 'kpc'))
+   s.annotate_streamlines('x-velocity', 'y-velocity')
+   s.save()
 
-.. function:: text(self, pos, text, data_coords=False, text_args=None):
+-------------
+
+.. function:: annotate_text(self, pos, text, data_coords=False, text_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.TextLabelCallback`.)
 
@@ -202,27 +328,33 @@
    is True, position will be in code units instead of image
    coordinates.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   s = SlicePlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   s.annotate_text((0.53, 0.53), 'Sample text', {'size':'xx-large', 'color':'w'})
+   s.save()
 
-.. function:: title(self, title='Plot'):
+-------------
+
+.. function:: annotate_title(self, title='Plot'):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.TitleCallback`.)
 
    Accepts a *title* and adds it to the plot
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center=[0.53, 0.53, 0.53], width=(20, 'kpc'))
+   p.annotate_title('Density plot')
+   p.save()
 
-.. function:: units(self, unit='au', factor=4, text_annotate=True, text_which=-2):
+-------------
 
-   (This is a proxy for :class:`~yt.visualization.plot_modifications.UnitBoundaryCallback`.)
-
-   Add on a plot indicating where *factor*s of *unit* are
-   shown. Optionally *text_annotate* on the
-   *text_which*-indexed box on display.
-
-
-
-.. function:: velocity(self, factor=16, scale=None, scale_units=None, normalize=False):
+.. function:: annotate_velocity(self, factor=16, scale=None, scale_units=None, normalize=False):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.VelocityCallback`.)
 
@@ -236,12 +368,10 @@
    substantial variation in field strength (normalize is not
    implemented and thus ignored for Cutting Planes).
 
+.. python-script::
 
-
-.. function:: voboz_circle(self, voboz_output, max_number=None, annotate=False, min_size=20, font_size=8, print_halo_size=False):
-
-   (This is a proxy for :class:`~yt.visualization.plot_modifications.VobozCircleCallback`.)
-
-   x.__init__(...) initializes x; see help(type(x)) for
-   signature
-
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = SlicePlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   p.annotate_velocity()
+   p.save()

diff -r c254ebdc8abdfd8ea2ff571fe96809f7303412a2 -r 1dec03d4ff076d7546ad559894e2eaebda97d0f7 source/visualizing/callbacks.rst
--- a/source/visualizing/callbacks.rst
+++ b/source/visualizing/callbacks.rst
@@ -17,8 +17,9 @@
 
 Callbacks can be applied to plots created with
 :class:`~yt.visualization.plot_window.SlicePlot`,
-:class:`~yt.visualization.plot_window.ProjectionPlot`, or
-:class:`~yt.visualization.plot_window.OffAxisSlicePlot`,  by calling
+:class:`~yt.visualization.plot_window.ProjectionPlot`,
+:class:`~yt.visualization.plot_window.OffAxisSlicePlot`, or
+:class:`~yt.visualization.plot_window.OffAxisProjectionPlot` by calling
 one of the ``annotate_`` methods that hang off of the plot object.
 The ``annotate_`` methods are dynamically generated based on the list
 of available callbacks.  For example:
@@ -32,41 +33,6 @@
 callbacks listed below are available via similar ``annotate_``
 functions.
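+
+As an illustration, here is a minimal sketch of one such method (assuming the
+IsolatedGalaxy sample dataset used elsewhere in these docs):
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   slc = SlicePlot(pf, 'z', 'Density', width=(20, 'kpc'))
+   slc.annotate_grids()
+   slc.save()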
 
-
-PlotCollection Plots
-~~~~~~~~~~~~~~~~~~~~
-
-For :class:`~yt.visualization.plot_collection.PlotCollection` plots,
-the callbacks can be accessed through a registry attached to every
-plot object.  When you add a plot to a
-:class:`~yt.visualization.plot_collection.PlotCollection`, you get back
-that affiliated plot object.  By accessing ``modify`` on that plot
-object, you have access to the available callbacks.  For instance,
-
-.. code-block:: python
-
-   p = PlotCollection.add_slice("Density", 0)
-   p.modify["velocity"]()
-
-would add the :func:`velocity` callback to the plot object.  When you save the
-plot, the list of callbacks will be iterated over, and the velocity callback
-will be handed the current state of the plot.  It will then be able to
-dynamically modify the plot before saving -- in this case, adding on velocity
-vectors atop the image.  You can also access the plot objects inside the
-PlotCollection directly:
-
-.. code-block:: python
-
-   pc.add_slice("Density", 0)
-   pc.plots[-1].modify["grids"]()
-
-Note that if you are plotting interactively, the PlotCollection will need to
-have ``redraw`` called on it.
-
-.. note:: You can access ``plot`` objects after creation through the ``plots``
-   list on the ``PlotCollection``.
-
-
 Available Callbacks
 -------------------
 


https://bitbucket.org/yt_analysis/yt-doc/commits/468981f26501/
Changeset:   468981f26501
User:        ngoldbaum
Date:        2013-10-30 04:13:44
Summary:     Small adjustment for annotate_text recipe.
Affected #:  1 file

diff -r 1dec03d4ff076d7546ad559894e2eaebda97d0f7 -r 468981f26501ae4bf9d708d5caf374e73980e45a source/visualizing/_cb_docstrings.inc
--- a/source/visualizing/_cb_docstrings.inc
+++ b/source/visualizing/_cb_docstrings.inc
@@ -333,7 +333,7 @@
    from yt.mods import *
    pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
    s = SlicePlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
-   s.annotate_text((0.53, 0.53), 'Sample text', {'size':'xx-large', 'color':'w'})
+   s.annotate_text((0.53, 0.53), 'Sample text', text_args={'size':'xx-large', 'color':'w'})
    s.save()
 
 -------------


https://bitbucket.org/yt_analysis/yt-doc/commits/9bd2eefa6364/
Changeset:   9bd2eefa6364
User:        ngoldbaum
Date:        2013-10-30 04:26:04
Summary:     Adding Cameron's suggestions.
Affected #:  1 file

diff -r 468981f26501ae4bf9d708d5caf374e73980e45a -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b source/visualizing/callbacks.rst
--- a/source/visualizing/callbacks.rst
+++ b/source/visualizing/callbacks.rst
@@ -6,9 +6,6 @@
 Adding callbacks to plots
 -------------------------
 
-Plot window Plots
-~~~~~~~~~~~~~~~~~
-
 Because the plots in ``yt`` are considered to be "volatile" -- existing
 independent of the canvas on which they are plotted -- before they are saved,
 you can have a set of "callbacks" run that modify them before saving to disk.
@@ -36,8 +33,7 @@
 Available Callbacks
 -------------------
 
-These are the callbacks available through the ``modify[]`` mechanism.  The
-underlying functions are documented (largely identical to this) in
+The underlying functions are documented (in a form largely identical to this) in
 :ref:`callback-api`.
 
 .. include:: _cb_docstrings.inc


https://bitbucket.org/yt_analysis/yt-doc/commits/45b305e19ab8/
Changeset:   45b305e19ab8
User:        chummels
Date:        2013-10-30 05:23:35
Summary:     Merging.
Affected #:  13 files

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 helper_scripts/show_fields.py
--- a/helper_scripts/show_fields.py
+++ b/helper_scripts/show_fields.py
@@ -17,6 +17,17 @@
 everywhere, "Enzo" fields in Enzo datasets, "Orion" fields in Orion datasets,
 and so on.
 
+Try using ``pf.h.field_list`` and ``pf.h.derived_field_list`` to view the
+native and derived fields available for your dataset, respectively.  For
+example, to display the native fields in alphabetical order:
+
+.. notebook-cell::
+
+  from yt.mods import *
+  pf = load("Enzo_64/DD0043/data0043")
+  for i in sorted(pf.h.field_list):
+    print i
+
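+Similarly, a sketch of the same loop over the derived fields (same sample
+dataset):
+
+.. notebook-cell::
+
+  from yt.mods import *
+  pf = load("Enzo_64/DD0043/data0043")
+  for i in sorted(pf.h.derived_field_list):
+    print i
+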
 .. note:: Universal fields will be overridden by a code-specific field.
 
 .. rubric:: Table of Contents
@@ -95,7 +106,37 @@
 print
 print_all_fields(FLASHFieldInfo)
 
-print "Nyx-Specific Field List"
+print "Athena-Specific Field List"
 print "--------------------------"
 print
+print_all_fields(AthenaFieldInfo)
+
+print "Nyx-Specific Field List"
+print "-----------------------"
+print
 print_all_fields(NyxFieldInfo)
+
+print "Castro-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(CastroFieldInfo)
+
+print "Chombo-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(ChomboFieldInfo)
+
+print "Pluto-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(PlutoFieldInfo)
+
+print "Grid-Data-Format-Specific Field List"
+print "------------------------------------"
+print
+print_all_fields(GDFFieldInfo)
+
+print "Generic-Format (Stream) Field List"
+print "----------------------------------"
+print
+print_all_fields(StreamFieldInfo)

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/api/api.rst
--- a/source/api/api.rst
+++ /dev/null
@@ -1,563 +0,0 @@
-API Reference
-=============
-
-Plots and the Plotting Interface
---------------------------------
-
-SlicePlot and ProjectionPlot
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_window.SlicePlot
-   ~yt.visualization.plot_window.OffAxisSlicePlot
-   ~yt.visualization.plot_window.ProjectionPlot
-   ~yt.visualization.plot_window.OffAxisProjectionPlot
-
-PlotCollection
-^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_collection.PlotCollection
-   ~yt.visualization.plot_collection.PlotCollectionInteractive
-   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
-   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
-   ~yt.visualization.base_plot_types.get_multi_plot
-
-Data Sources
-------------
-
-.. _physical-object-api:
-
-Physical Objects
-^^^^^^^^^^^^^^^^
-
-These are the objects that act as physical selections of data, describing a
-region in space.  These are not typically addressed directly; see
-:ref:`available-objects` for more information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.data_containers.AMRCoveringGridBase
-   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
-   ~yt.data_objects.data_containers.AMRCylinderBase
-   ~yt.data_objects.data_containers.AMRGridCollectionBase
-   ~yt.data_objects.data_containers.AMRRayBase
-   ~yt.data_objects.data_containers.AMROrthoRayBase
-   ~yt.data_objects.data_containers.AMRStreamlineBase
-   ~yt.data_objects.data_containers.AMRProjBase
-   ~yt.data_objects.data_containers.AMRRegionBase
-   ~yt.data_objects.data_containers.AMRSliceBase
-   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
-   ~yt.data_objects.data_containers.AMRSphereBase
-   ~yt.data_objects.data_containers.AMRSurfaceBase
-
-Time Series Objects
-^^^^^^^^^^^^^^^^^^^
-
-These are objects that either contain and represent or operate on series of
-datasets.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.time_series.TimeSeriesData
-   ~yt.data_objects.time_series.TimeSeriesDataObject
-   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
-   ~yt.data_objects.time_series.AnalysisTaskProxy
-
-Frontends
----------
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.grid_patch.AMRGridPatch
-   ~yt.data_objects.hierarchy.AMRHierarchy
-   ~yt.data_objects.static_output.StaticOutput
-
-Enzo
-^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.enzo.data_structures.EnzoGrid
-   ~yt.frontends.enzo.data_structures.EnzoHierarchy
-   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
-
-Orion
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.orion.data_structures.OrionGrid
-   ~yt.frontends.orion.data_structures.OrionHierarchy
-   ~yt.frontends.orion.data_structures.OrionStaticOutput
-
-FLASH
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.flash.data_structures.FLASHGrid
-   ~yt.frontends.flash.data_structures.FLASHHierarchy
-   ~yt.frontends.flash.data_structures.FLASHStaticOutput
-
-Chombo
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.chombo.data_structures.ChomboGrid
-   ~yt.frontends.chombo.data_structures.ChomboHierarchy
-   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
-
-RAMSES
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.ramses.data_structures.RAMSESGrid
-   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
-   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
-
-Derived Datatypes
------------------
-
-Profiles and Histograms
-^^^^^^^^^^^^^^^^^^^^^^^
-
-These types are used to sum data up and either return that sum or return an
-average.  Typically they are more easily used through the
-`yt.visualization.plot_collection` interface.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.profiles.BinnedProfile1D
-   ~yt.data_objects.profiles.BinnedProfile2D
-   ~yt.data_objects.profiles.BinnedProfile3D
-
-Halo Finding and Particle Functions
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Halo finding can be executed using these types.  Here we list the main halo
-finders as well as a few other supplemental objects.
-
-.. rubric:: Halo Finders
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
-   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
-
-You can also operate on the Halo and HAloList objects themselves:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.Halo
-   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
-
-There are also functions for loading halos from disk:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
-
-We have several methods that work to create merger trees:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
-
-You can use Halo catalogs generatedl externally as well:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
-
-Halo Profiling
-^^^^^^^^^^^^^^
-
-yt provides a comprehensive halo profiler that can filter, center, and analyze
-halos en masse.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
-
-
-Two Point Functions
-^^^^^^^^^^^^^^^^^^^
-
-These functions are designed to create correlations or other results of
-operations acting on two spatially-distinct points in a data source.  See also
-:ref:`two_point_functions`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
-   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
-
-Field Types
------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.field_info_container.DerivedField
-   ~yt.data_objects.field_info_container.FieldInfoContainer
-   ~yt.data_objects.field_info_container.ValidateDataField
-   ~yt.data_objects.field_info_container.ValidateGridType
-   ~yt.data_objects.field_info_container.ValidateParameter
-   ~yt.data_objects.field_info_container.ValidateProperty
-   ~yt.data_objects.field_info_container.ValidateSpatial
-
-Image Handling
---------------
-
-For volume renderings and fixed resolution buffers the image object returned is
-an ``ImageArray`` object, which has useful functions for image saving and 
-writing to bitmaps.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.image_array.ImageArray
-   ~yt.data_objects.image_array.ImageArray.write_png
-   ~yt.data_objects.image_array.ImageArray.write_hdf5
-
-Extension Types
----------------
-
-Coordinate Transformations
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
-   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
-
-Cosmology, Star Particle Analysis, and Simulated Observations
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-For the generation of stellar SEDs.  (See also :ref:`star_analysis`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
-
-Light cone generation and simulation analysis.  (See also
-:ref:`light-cone-generator`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
-   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
-
-Absorption and X-ray spectra and spectral lines:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
-
-Absorption spectra fitting:
-
-.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
-
-Sunrise exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
-
-RADMC-3D exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
-
-Radial Column Density
-^^^^^^^^^^^^^^^^^^^^^
-
-If you'd like to calculate the column density out to a given point, from a
-specified center, yt can provide that information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
-
-Volume Rendering
-^^^^^^^^^^^^^^^^
-
-See also :ref:`volume_rendering`.
-
-Here are the primary entry points:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.Camera
-   ~yt.visualization.volume_rendering.camera.off_axis_projection
-   ~yt.visualization.volume_rendering.camera.allsky_projection
-
-These objects set up the way the image looks:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
-
-There are also advanced objects for particular use cases:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
-   ~yt.visualization.volume_rendering.camera.FisheyeCamera
-   ~yt.visualization.volume_rendering.camera.MosaicCamera
-   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
-   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
-   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
-   ~yt.visualization.volume_rendering.camera.StereoPairCamera
-
-Streamlining
-^^^^^^^^^^^^
-
-See also :ref:`streamlines`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.streamlines.Streamlines
-
-Image Writing
-^^^^^^^^^^^^^
-
-These functions are all used for fast writing of images directly to disk,
-without calling matplotlib.  This can be very useful for high-cadence outputs
-where colorbars are unnecessary or for volume rendering.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.image_writer.multi_image_composite
-   ~yt.visualization.image_writer.write_bitmap
-   ~yt.visualization.image_writer.write_projection
-   ~yt.visualization.image_writer.write_fits
-   ~yt.visualization.image_writer.write_image
-   ~yt.visualization.image_writer.map_to_colors
-   ~yt.visualization.image_writer.strip_colormap_data
-   ~yt.visualization.image_writer.splat_points
-   ~yt.visualization.image_writer.annotate_image
-   ~yt.visualization.image_writer.scale_image
-
-We also provide a module that is very good for generating EPS figures,
-particularly with complicated layouts.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.eps_writer.DualEPS
-   ~yt.visualization.eps_writer.single_plot
-   ~yt.visualization.eps_writer.multiplot
-   ~yt.visualization.eps_writer.multiplot_yt
-   ~yt.visualization.eps_writer.return_cmap
-
-.. _image-panner-api:
-
-Derived Quantities
-------------------
-
-See :ref:`derived-quantities`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.derived_quantities._AngularMomentumVector
-   ~yt.data_objects.derived_quantities._BaryonSpinParameter
-   ~yt.data_objects.derived_quantities._BulkVelocity
-   ~yt.data_objects.derived_quantities._CenterOfMass
-   ~yt.data_objects.derived_quantities._Extrema
-   ~yt.data_objects.derived_quantities._IsBound
-   ~yt.data_objects.derived_quantities._MaxLocation
-   ~yt.data_objects.derived_quantities._ParticleSpinParameter
-   ~yt.data_objects.derived_quantities._TotalMass
-   ~yt.data_objects.derived_quantities._TotalQuantity
-   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
-
-.. _callback-api:
-
-Callback List
--------------
-
-
-See also :ref:`callbacks`.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_modifications.ArrowCallback
-   ~yt.visualization.plot_modifications.ClumpContourCallback
-   ~yt.visualization.plot_modifications.ContourCallback
-   ~yt.visualization.plot_modifications.CoordAxesCallback
-   ~yt.visualization.plot_modifications.CuttingQuiverCallback
-   ~yt.visualization.plot_modifications.GridBoundaryCallback
-   ~yt.visualization.plot_modifications.HopCircleCallback
-   ~yt.visualization.plot_modifications.HopParticleCallback
-   ~yt.visualization.plot_modifications.LabelCallback
-   ~yt.visualization.plot_modifications.LinePlotCallback
-   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
-   ~yt.visualization.plot_modifications.ParticleCallback
-   ~yt.visualization.plot_modifications.PointAnnotateCallback
-   ~yt.visualization.plot_modifications.QuiverCallback
-   ~yt.visualization.plot_modifications.SphereCallback
-   ~yt.visualization.plot_modifications.TextLabelCallback
-   ~yt.visualization.plot_modifications.TitleCallback
-   ~yt.visualization.plot_modifications.UnitBoundaryCallback
-   ~yt.visualization.plot_modifications.VelocityCallback
-
-Function List
--------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.convenience.load
-   ~yt.funcs.deprecate
-   ~yt.funcs.ensure_list
-   ~yt.funcs.get_pbar
-   ~yt.funcs.humanize_time
-   ~yt.funcs.insert_ipython
-   ~yt.funcs.iterable
-   ~yt.funcs.just_one
-   ~yt.funcs.only_on_root
-   ~yt.funcs.paste_traceback
-   ~yt.funcs.pdb_run
-   ~yt.funcs.print_tb
-   ~yt.funcs.rootonly
-   ~yt.funcs.time_execution
-   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
-
-Miscellaneous Types
--------------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.config.YTConfigParser
-   ~yt.utilities.parameter_file_storage.ParameterFileStore
-   ~yt.data_objects.data_containers.FakeGridForParticles
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
-
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
-
-
-Testing Infrastructure
-----------------------
-
-The first set of functions are all provided by NumPy.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_array_equal
-   ~yt.testing.assert_almost_equal
-   ~yt.testing.assert_approx_equal
-   ~yt.testing.assert_array_almost_equal
-   ~yt.testing.assert_equal
-   ~yt.testing.assert_array_less
-   ~yt.testing.assert_string_equal
-   ~yt.testing.assert_array_almost_equal_nulp
-   ~yt.testing.assert_allclose
-   ~yt.testing.assert_raises
-
-These are yt-provided functions:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_rel_equal
-   ~yt.testing.amrspace
-   ~yt.testing.fake_random_pf
-   ~yt.testing.expand_keywords

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/bootcamp/Introduction.ipynb
--- a/source/bootcamp/Introduction.ipynb
+++ b/source/bootcamp/Introduction.ipynb
@@ -15,52 +15,26 @@
       "\n",
       "In this brief tutorial, we'll go over how to load up data, analyze things, inspect your data, and make some visualizations.\n",
       "\n",
-      "But, before we begin, there are a few places to go if you run into trouble.\n",
-      "\n",
-      "**The yt homepage is at http://yt-project.org/**\n",
-      "\n",
-      "## Source of Help\n",
-      "\n",
-      "There are three places to check for help:\n",
-      "\n",
-      " * The documentation: http://yt-project.org/doc/\n",
-      " * The IRC Channel (`#yt` on `chat.freenode.net`, also at http://yt-project.org/irc.html)\n",
-      " * The `yt-users` mailing list, at http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org\n",
-      "\n",
-      "## Sources of Information\n",
-      "\n",
-      "The first place to go for information about any kind of development is BitBucket at https://bitbucket.org/yt_analysis/yt/ , which contains a bug tracker, the source code, and links to other useful places.\n",
-      "\n",
-      "You can find recipes in the documentation ( http://yt-project.org/doc/ ) under the \"Cookbook\" section.\n",
-      "\n",
-      "There is a portal with access to data and IPython notebooks at http://hub.yt-project.org/ .\n",
-      "\n",
-      "## How to Update yt\n",
-      "\n",
-      "If you ever run into a situation where you need to update your yt installation, simply type this on the command line:\n",
-      "\n",
-      "`yt update`\n",
-      "\n",
-      "This will automatically update it for you.\n",
+      "Our documentation page can provide information on a variety of the commands that are used here, both in narrative documentation as well as recipes for specific functionality in our cookbook.  The documentation exists at http://yt-project.org/doc/.  If you encounter problems, look for help here: http://yt-project.org/doc/help/index.html.\n",
       "\n",
       "## Acquiring the datasets for this tutorial\n",
       "\n",
-      "To access the datasets that are used in these bootcamp tutorials, you can either download them manually at http://yt-project.org/data/.\n",
+      "If you are executing these tutorials interactively, you need some sample datasets on which to run the code.  You can download these datasets at http://yt-project.org/data/.  The datasets necessary for each lesson are noted next to the corresponding tutorial.\n",
       "\n",
       "## What's Next?\n",
       "\n",
       "The Notebooks are meant to be explored in this order:\n",
       "\n",
       "1. Introduction\n",
-      "2. Data Inspection\n",
-      "3. Simple Visualization\n",
-      "4. Data Objects and Time Series\n",
-      "5. Derived Fields and Profiles\n",
-      "6. Volume Rendering"
+      "2. Data Inspection (IsolatedGalaxy dataset)\n",
+      "3. Simple Visualization (enzo_tiny_cosmology & Enzo_64 datasets)\n",
+      "4. Data Objects and Time Series (IsolatedGalaxy dataset)\n",
+      "5. Derived Fields and Profiles (IsolatedGalaxy dataset)\n",
+      "6. Volume Rendering (IsolatedGalaxy dataset)"
      ]
     }
    ],
    "metadata": {}
   }
  ]
-}
\ No newline at end of file
+}

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -256,4 +256,4 @@
                        }
 
 if not on_rtd:
-    autosummary_generate = glob.glob("api/api.rst")
+    autosummary_generate = glob.glob("reference/api/api.rst")

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -1,7 +1,7 @@
 .. _cookbook:
 
-The yt Cookbook
-===============
+The Cookbook
+============
 
 yt provides a great deal of functionality to the user, but sometimes it can 
 be a bit complex.  This section of the documentation lays out example recipes 

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/examining/loading_data.rst
--- a/source/examining/loading_data.rst
+++ b/source/examining/loading_data.rst
@@ -11,40 +11,52 @@
 Generic Array Data
 ------------------
 
-Even if your data is not strictly related to fields commonly used in
-astrophysical codes or your code is not supported yet, you can still feed it to
-``yt`` to use its advanced visualization and analysis facilities. The only
-requirement is that your data can be represented as one or more uniform, three
-dimensional numpy arrays. Assuming that you have your data in ``arr``,
-the following code:
+If you have a data format which is unsupported by the existing code frontends,
+you can still read your data into ``yt``.  The only requirement is that your 
+data can be represented as one or more uniform, three
+dimensional numpy arrays.  
+
+For example, let's say you have a 3D dataset of a density field in a cubical 
+volume which is 3 Mpc on a side.  Assuming that you have managed to get your 
+data into a numpy array called ``a``, you can read it in using the following 
+code:
 
 .. code-block:: python
 
    import numpy as np
    from yt.frontends.stream.api import load_uniform_grid
 
-   data = dict(Density = arr)
+   data = dict(Density = a)
    bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
-   pf = load_uniform_grid(data, arr.shape, 3.08e24, bbox=bbox, nprocs=12)
+   pf = load_uniform_grid(data, a.shape, 3.08e24, bbox=bbox, nprocs=12)
 
-will create ``yt``-native parameter file ``pf`` that will treat your array as
-density field in cubic domain of 3 Mpc edge size (3 * 3.08e24 cm) and
-simultaneously divide the domain into 12 chunks, so that you can take advantage
-of the underlying parallelism. 
+This will generate a ``yt``-native parameter file, ``pf``.  It will treat 
+your array as a density field in a cubic domain 3 Mpc on a side (3 * 3.08e24 cm) 
+and simultaneously divide the domain into 12 chunks, so that you can take 
+advantage of the underlying parallelism.  If you want to use a field name 
+other than ``Density``, feel free to set that here as well.  To disable 
+parallelism, simply omit ``nprocs``.
+
+You can now use ``pf`` as though it were a normal parameter file like the 
+many examples in the Cookbook.
 
 Particle fields are detected as one-dimensional fields. The number of
-particles is set by the ``number_of_particles`` key in
-``data``. Particle fields are then added as one-dimensional arrays in
-a similar manner as the three-dimensional grid fields:
+particles is set by the ``number_of_particles`` key in the ``data`` dictionary. 
+Particle fields are then added as one-dimensional arrays in a manner similar 
+to the three-dimensional grid fields.  So, starting again with a 3D numpy 
+array ``a`` that represents our ``Density`` field:
 
 .. code-block:: python
 
    import numpy as np
    from yt.frontends.stream.api import load_uniform_grid
 
-   data = dict(Density = dens, 
+   data = dict(Density = a, 
                number_of_particles = 1000000,
                particle_position_x = posx_arr, 
-	       particle_position_y = posy_arr,
-	       particle_position_z = posz_arr)
+               particle_position_y = posy_arr,
+               particle_position_z = posz_arr)
    bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
    pf = load_uniform_grid(data, a.shape, 3.08e24, bbox=bbox, nprocs=12)
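+
+For illustration, the particle position arrays used above could be filled
+with hypothetical random positions spanning the domain:
+
+.. code-block:: python
+
+   import numpy as np
+
+   n_particles = 1000000
+   # uniform random positions across the bounding box
+   posx_arr = np.random.uniform(low=-1.5, high=1.5, size=n_particles)
+   posy_arr = np.random.uniform(low=-1.5, high=1.5, size=n_particles)
+   posz_arr = np.random.uniform(low=-1.5, high=1.5, size=n_particles)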
 
@@ -175,114 +187,6 @@
   positions will not be.
 * Domains may be visualized assuming periodicity.
 
-.. _loading-ramses-data:
-
-RAMSES Data
------------
-
-RAMSES data enjoys preliminary support and is cared for by Matthew Turk.  If
-you are interested in taking a development or stewardship role, please contact
-him.  To load a RAMSES dataset, you can use the ``load`` command provided by
-``yt.mods`` and supply to it the ``info*.txt`` filename.  For instance, if you
-were in a directory with the following files:
-
-.. code-block:: none
-
-   output_00007
-   output_00007/amr_00007.out00001
-   output_00007/grav_00007.out00001
-   output_00007/hydro_00007.out00001
-   output_00007/info_00007.txt
-   output_00007/part_00007.out00001
-
-You would feed it the filename ``output_00007/info_00007.txt``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("output_00007/info_00007.txt")
-
-.. rubric:: Caveats
-
-* Please be careful that the units are correctly set!  This may not be the
-  case for RAMSES data
-* Upon instantiation of the hierarchy, yt will attempt to regrid the entire
-  domain to ensure minimum-coverage from a set of grid patches.  (This is
-  described in the yt method paper.)  This is a time-consuming process and it
-  has not yet been written to be stored between calls.
-* Particles are not supported
-* Parallelism will not be terribly efficient for large datasets
-* There may be occasional segfaults on multi-domain data, which do not
-  reflect errors in the calculation
-
-If you are interested in helping with RAMSES support, we are eager to hear from
-you!
-
-.. _loading-art-data:
-
-ART Data
---------
-
-ART data enjoys preliminary support and is supported by Christopher Moody.
-Please contact the ``yt-dev`` mailing list if you are interested in using yt
-for ART data, or if you are interested in assisting with development of yt to
-work with ART data.
-
-At the moment, the ART octree is 'regridded' at each level to make the native
-octree look more like a mesh-based code. As a result, the initial outlay
-is about ~60 seconds to grid octs onto a mesh. This will be improved in 
-``yt-3.0``, where octs will be supported natively. 
-
-To load an ART dataset you can use the ``load`` command provided by 
-``yt.mods`` and passing the gas mesh file. It will search for and attempt 
-to find the complementary dark matter and stellar particle header and data 
-files. However, your simulations may not follow the same naming convention.
-
-So for example, a single snapshot might have a series of files looking like
-this:
-
-.. code-block:: none
-
-   10MpcBox_csf512_a0.300.d    #Gas mesh
-   PMcrda0.300.DAT             #Particle header
-   PMcrs0a0.300.DAT            #Particle data (positions,velocities)
-   stars_a0.300.dat            #Stellar data (metallicities, ages, etc.)
-
-The ART frontend tries to find the associated files matching the above, but
-if that fails you can specify ``file_particle_data``,``file_particle_data``,
-``file_star_data`` in addition to the specifying the gas mesh. You also have 
-the option of gridding particles, and assigning them onto the meshes.
-This process is in beta, and for the time being it's probably  best to leave
-``do_grid_particles=False`` as the default.
-
-To speed up the loading of an ART file, you have a few options. You can turn 
-off the particles entirely by setting ``discover_particles=False``. You can
-also only grid octs up to a certain level, ``limit_level=5``, which is useful
-when debugging by artificially creating a 'smaller' dataset to work with.
-
-Finally, when stellar ages are computed we 'spread' the ages evenly within a
-smoothing window. By default this is turned on and set to 10Myr. To turn this 
-off you can set ``spread=False``, and you can tweak the age smoothing window
-by specifying the window in seconds, ``spread=1.0e7*265*24*3600``. 
-
-.. code-block:: python
-    
-   from yt.mods import *
-
-   file = "/u/cmoody3/data/art_snapshots/SFG1/10MpcBox_csf512_a0.460.d"
-   pf = load(file,discover_particles=True,grid_particles=2,limit_level=3)
-   pf.h.print_stats()
-   dd=pf.h.all_data()
-   print np.sum(dd['particle_type']==0)
-
-In the above example code, the first line imports the standard yt functions,
-followed by defining the gas mesh file. It's loaded only through level 3, but
-grids particles on to meshes on level 2 and higher. Finally, we create a data
-container and ask it to gather the particle_type array. In this case ``type==0``
-is for the most highly-refined dark matter particle, and we print out how many
-high-resolution star particles we find in the simulation.  Typically, however,
-you shouldn't have to specify any keyword arguments to load in a dataset.
-
 .. loading-amr-data:
 
 Generic AMR Data

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -3,38 +3,129 @@
 
 yt is a community-developed analysis and visualization toolkit for
 examining datasets in a variety of scientific disciplines.  yt is developed 
-in Python under the open-source model.  yt currently supports several 
-astrophysical simulation code formats, as well support for :ref:`loading-numpy-array`
-for unsupported data formats.  Fully-supported codes 
-include: `Enzo <http://enzo-project.org/>`_, 
-`Orion <http://flash.uchicago.edu/~rfisher/orion/>`_,
-`Nyx <https://ccse.lbl.gov/Research/NYX/index.html>`_, 
-`FLASH <http://flash.uchicago.edu/website/home/>`_, 
-`Piernik <http://arxiv.org/abs/0901.0104>`_;
-and partially-supported codes include: 
-`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_,
-`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_,
-`Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
-`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_.
-
-yt uses a three-pronged approach to interacting with data:
-
- * Visualize Data - Generate plots, images, and movies for better understanding your datasets
- * Analyze Data - Use additional analysis routines to derive real-world results from your data
- * Examine Data - Directly access raw data with helper functions for making this task easier
+in Python under the open-source model.  In version 2.6, yt supports
+several astrophysical simulation code formats, as well as
+:ref:`loading-numpy-array` for unsupported data formats.  :ref:`code-support`
+is included for: `Enzo <http://enzo-project.org/>`_, `Orion
+<http://flash.uchicago.edu/~rfisher/orion/>`_, `Nyx
+<https://ccse.lbl.gov/Research/NYX/index.html>`_, `FLASH
+<http://flash.uchicago.edu/website/home/>`_, `Piernik
+<http://arxiv.org/abs/0901.0104>`_, `Athena
+<https://trac.princeton.edu/Athena/>`_, `Chombo <http://chombo.lbl.gov>`_,
+`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_, `Maestro
+<https://ccse.lbl.gov/Research/MAESTRO/>`_, and `Pluto
+<http://plutocode.ph.unito.it/>`_.  (Development of additional codes, including
+particle codes and octree codes, is taking place in yt 3.0.)
 
 Documentation
 =============
 
+.. raw:: html
+
+   <table class="contentstable" align="center">
+
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="installing.html">Installation</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Getting and Installing yt</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="bootcamp/index.html">yt Bootcamp</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Demonstrations of what yt can do</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="cookbook/index.html">The Cookbook</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Example recipes for how to accomplish a variety of tasks</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="visualizing/index.html">Visualizing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Make plots, projections, volume renderings, movies, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="analyzing/index.html">Analyzing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Use analysis tools to extract results from your data</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="examining/index.html">Examining Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Load data and directly access raw values for low-level analysis</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="developing/index.html">Developing in yt</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Catering yt to work for your exact use case</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="reference/index.html">Reference Materials</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Lists of fields, quantities, classes, functions, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="help/index.html">Getting help</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">What to do if you run into problems</p>
+       </td>
+     </tr>
+
+   </table>
+
 .. toctree::
-   :maxdepth: 1
+   :hidden:
 
    installing
-   yt Bootcamp: A Worked Introduction <bootcamp/index>
-   help/index
+   yt Bootcamp <bootcamp/index>
    cookbook/index
    visualizing/index
    analyzing/index
    examining/index
    developing/index
    reference/index
+   help/index

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/installing.rst
--- a/source/installing.rst
+++ b/source/installing.rst
@@ -1,6 +1,9 @@
 Getting and Installing yt
 =========================
 
+Getting yt
+----------
+
 yt is a Python package (with some components written in C), using NumPy as a
 computation engine, Matplotlib for some visualization tasks and Mercurial for
 version control.  Because installation of all of these interlocking parts can 
@@ -16,10 +19,14 @@
 
   http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
 
-By default, it will install an array of items, but there are additional packages
-that can be downloaded and installed (e.g. SciPy, enzo, etc.). The script has 
-all of these options at the top of the file. You should be able to open it and 
-edit it without any knowledge of bash syntax.  To execute it, run:
+Installing yt
+-------------
+
+By default, the bash script will install an array of items, but there are 
+additional packages that can be downloaded and installed (e.g. SciPy, enzo, 
+etc.). The script has all of these options at the top of the file. You should 
+be able to open it and edit it without any knowledge of bash syntax.  
+To execute it, run:
 
 .. code-block:: bash
 
@@ -95,3 +102,21 @@
 If you get an error, follow the instructions it gives you to debug the problem.  
 Do not hesitate to :ref:`contact us <asking-for-help>` so we can help you 
 figure it out.
+
+Updating yt and its dependencies
+--------------------------------
+
+With many active developers, code development sometimes occurs at a furious 
+pace in yt.  To make sure you're using the latest version of the code, run
+this command at a command-line:
+
+.. code-block:: bash
+
+  $ yt update
+
+Additionally, if you want to make sure you have the latest dependencies 
+associated with yt and update the codebase simultaneously, type this:
+
+.. code-block:: bash
+
+  $ yt update --all

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/reference/api/api.rst
--- /dev/null
+++ b/source/reference/api/api.rst
@@ -0,0 +1,563 @@
+API Reference
+=============
+
+Plots and the Plotting Interface
+--------------------------------
+
+SlicePlot and ProjectionPlot
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_window.SlicePlot
+   ~yt.visualization.plot_window.OffAxisSlicePlot
+   ~yt.visualization.plot_window.ProjectionPlot
+   ~yt.visualization.plot_window.OffAxisProjectionPlot
+
+PlotCollection
+^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_collection.PlotCollection
+   ~yt.visualization.plot_collection.PlotCollectionInteractive
+   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
+   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
+   ~yt.visualization.base_plot_types.get_multi_plot
+
+Data Sources
+------------
+
+.. _physical-object-api:
+
+Physical Objects
+^^^^^^^^^^^^^^^^
+
+These are the objects that act as physical selections of data, describing a
+region in space.  These are not typically addressed directly; see
+:ref:`available-objects` for more information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.data_containers.AMRCoveringGridBase
+   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
+   ~yt.data_objects.data_containers.AMRCylinderBase
+   ~yt.data_objects.data_containers.AMRGridCollectionBase
+   ~yt.data_objects.data_containers.AMRRayBase
+   ~yt.data_objects.data_containers.AMROrthoRayBase
+   ~yt.data_objects.data_containers.AMRStreamlineBase
+   ~yt.data_objects.data_containers.AMRProjBase
+   ~yt.data_objects.data_containers.AMRRegionBase
+   ~yt.data_objects.data_containers.AMRSliceBase
+   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
+   ~yt.data_objects.data_containers.AMRSphereBase
+   ~yt.data_objects.data_containers.AMRSurfaceBase
+
+Time Series Objects
+^^^^^^^^^^^^^^^^^^^
+
+These objects either contain and represent, or operate on, series of
+datasets.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.time_series.TimeSeriesData
+   ~yt.data_objects.time_series.TimeSeriesDataObject
+   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
+   ~yt.data_objects.time_series.AnalysisTaskProxy
+
+Frontends
+---------
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.grid_patch.AMRGridPatch
+   ~yt.data_objects.hierarchy.AMRHierarchy
+   ~yt.data_objects.static_output.StaticOutput
+
+Enzo
+^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.enzo.data_structures.EnzoGrid
+   ~yt.frontends.enzo.data_structures.EnzoHierarchy
+   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
+
+Orion
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.orion.data_structures.OrionGrid
+   ~yt.frontends.orion.data_structures.OrionHierarchy
+   ~yt.frontends.orion.data_structures.OrionStaticOutput
+
+FLASH
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.flash.data_structures.FLASHGrid
+   ~yt.frontends.flash.data_structures.FLASHHierarchy
+   ~yt.frontends.flash.data_structures.FLASHStaticOutput
+
+Chombo
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.chombo.data_structures.ChomboGrid
+   ~yt.frontends.chombo.data_structures.ChomboHierarchy
+   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
+
+RAMSES
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.ramses.data_structures.RAMSESGrid
+   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
+   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
+
+Derived Datatypes
+-----------------
+
+Profiles and Histograms
+^^^^^^^^^^^^^^^^^^^^^^^
+
+These types are used to sum data up and either return that sum or return an
+average.  Typically they are more easily used through the
+``yt.visualization.plot_collection`` interface.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.profiles.BinnedProfile1D
+   ~yt.data_objects.profiles.BinnedProfile2D
+   ~yt.data_objects.profiles.BinnedProfile3D
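+
+For example, a one-dimensional profile of temperature binned by density
+might be built like this (a sketch; the dataset name and density bounds
+are hypothetical):
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("RedshiftOutput0005")
+   dd = pf.h.all_data()
+   prof = BinnedProfile1D(dd, 64, "Density", 1e-30, 1e-24)
+   prof.add_fields("Temperature", weight="CellMassMsun")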
+
+Halo Finding and Particle Functions
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Halo finding can be executed using these types.  Here we list the main halo
+finders as well as a few other supplemental objects.
+
+.. rubric:: Halo Finders
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
+   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
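+
+A minimal sketch of running HOP over a dataset (the dataset name here is
+hypothetical; ``threshold`` is the usual overdensity parameter):
+
+.. code-block:: python
+
+   from yt.mods import *
+   from yt.analysis_modules.halo_finding.api import *
+
+   pf = load("RedshiftOutput0005")
+   halos = HOPHaloFinder(pf, threshold=160)
+   halos.write_out("hop_halos.out")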
+
+You can also operate on the ``Halo`` and ``HaloList`` objects themselves:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.Halo
+   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
+
+There are also functions for loading halos from disk:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
+
+We have several methods for creating merger trees:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
+
+You can use halo catalogs generated externally as well:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
+
+Halo Profiling
+^^^^^^^^^^^^^^
+
+yt provides a comprehensive halo profiler that can filter, center, and analyze
+halos en masse.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
+
+
+Two Point Functions
+^^^^^^^^^^^^^^^^^^^
+
+These functions are designed to create correlations or other results of
+operations acting on two spatially-distinct points in a data source.  See also
+:ref:`two_point_functions`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
+   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
+
+Field Types
+-----------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.field_info_container.DerivedField
+   ~yt.data_objects.field_info_container.FieldInfoContainer
+   ~yt.data_objects.field_info_container.ValidateDataField
+   ~yt.data_objects.field_info_container.ValidateGridType
+   ~yt.data_objects.field_info_container.ValidateParameter
+   ~yt.data_objects.field_info_container.ValidateProperty
+   ~yt.data_objects.field_info_container.ValidateSpatial
+
+Image Handling
+--------------
+
+For volume renderings and fixed resolution buffers the image object returned is
+an ``ImageArray`` object, which has useful functions for image saving and 
+writing to bitmaps.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.image_array.ImageArray
+   ~yt.data_objects.image_array.ImageArray.write_png
+   ~yt.data_objects.image_array.ImageArray.write_hdf5
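+
+For instance, a minimal sketch with a synthetic RGBA buffer (the array
+contents and file names here are arbitrary):
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.data_objects.image_array import ImageArray
+
+   im = ImageArray(np.random.random((256, 256, 4)))
+   im.write_png("example.png")
+   im.write_hdf5("example.h5")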
+
+Extension Types
+---------------
+
+Coordinate Transformations
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
+   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
+
+Cosmology, Star Particle Analysis, and Simulated Observations
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+For the generation of stellar SEDs.  (See also :ref:`star_analysis`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
+
+Light cone generation and simulation analysis.  (See also
+:ref:`light-cone-generator`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
+   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
+
+Absorption and X-ray spectra and spectral lines:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
+
+Absorption spectra fitting:
+
+.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
+
+Sunrise exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
+
+RADMC-3D exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
+
+Radial Column Density
+^^^^^^^^^^^^^^^^^^^^^
+
+If you'd like to calculate the column density out to a given point, from a
+specified center, yt can provide that information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
+
+Volume Rendering
+^^^^^^^^^^^^^^^^
+
+See also :ref:`volume_rendering`.
+
+Here are the primary entry points:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.Camera
+   ~yt.visualization.volume_rendering.camera.off_axis_projection
+   ~yt.visualization.volume_rendering.camera.allsky_projection
+
+These objects set up the way the image looks:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
+
+There are also advanced objects for particular use cases:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
+   ~yt.visualization.volume_rendering.camera.FisheyeCamera
+   ~yt.visualization.volume_rendering.camera.MosaicCamera
+   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
+   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
+   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
+   ~yt.visualization.volume_rendering.camera.StereoPairCamera
+
+Streamlining
+^^^^^^^^^^^^
+
+See also :ref:`streamlines`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.streamlines.Streamlines
+
+Image Writing
+^^^^^^^^^^^^^
+
+These functions are all used for fast writing of images directly to disk,
+without calling matplotlib.  This can be very useful for high-cadence outputs
+where colorbars are unnecessary or for volume rendering.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.image_writer.multi_image_composite
+   ~yt.visualization.image_writer.write_bitmap
+   ~yt.visualization.image_writer.write_projection
+   ~yt.visualization.image_writer.write_fits
+   ~yt.visualization.image_writer.write_image
+   ~yt.visualization.image_writer.map_to_colors
+   ~yt.visualization.image_writer.strip_colormap_data
+   ~yt.visualization.image_writer.splat_points
+   ~yt.visualization.image_writer.annotate_image
+   ~yt.visualization.image_writer.scale_image
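+
+As a minimal sketch, ``write_bitmap`` can dump a raw image buffer straight
+to a PNG (the buffer here is a hypothetical 512x512 RGBA array):
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.visualization.image_writer import write_bitmap
+
+   buff = np.random.random((512, 512, 4))
+   write_bitmap(buff, "quick_look.png")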
+
+We also provide a module that is very good for generating EPS figures,
+particularly with complicated layouts.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.eps_writer.DualEPS
+   ~yt.visualization.eps_writer.single_plot
+   ~yt.visualization.eps_writer.multiplot
+   ~yt.visualization.eps_writer.multiplot_yt
+   ~yt.visualization.eps_writer.return_cmap
+
+.. _image-panner-api:
+
+Derived Quantities
+------------------
+
+See :ref:`derived-quantities`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.derived_quantities._AngularMomentumVector
+   ~yt.data_objects.derived_quantities._BaryonSpinParameter
+   ~yt.data_objects.derived_quantities._BulkVelocity
+   ~yt.data_objects.derived_quantities._CenterOfMass
+   ~yt.data_objects.derived_quantities._Extrema
+   ~yt.data_objects.derived_quantities._IsBound
+   ~yt.data_objects.derived_quantities._MaxLocation
+   ~yt.data_objects.derived_quantities._ParticleSpinParameter
+   ~yt.data_objects.derived_quantities._TotalMass
+   ~yt.data_objects.derived_quantities._TotalQuantity
+   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
+
+.. _callback-api:
+
+Callback List
+-------------
+
+
+See also :ref:`callbacks`.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_modifications.ArrowCallback
+   ~yt.visualization.plot_modifications.ClumpContourCallback
+   ~yt.visualization.plot_modifications.ContourCallback
+   ~yt.visualization.plot_modifications.CoordAxesCallback
+   ~yt.visualization.plot_modifications.CuttingQuiverCallback
+   ~yt.visualization.plot_modifications.GridBoundaryCallback
+   ~yt.visualization.plot_modifications.HopCircleCallback
+   ~yt.visualization.plot_modifications.HopParticleCallback
+   ~yt.visualization.plot_modifications.LabelCallback
+   ~yt.visualization.plot_modifications.LinePlotCallback
+   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
+   ~yt.visualization.plot_modifications.ParticleCallback
+   ~yt.visualization.plot_modifications.PointAnnotateCallback
+   ~yt.visualization.plot_modifications.QuiverCallback
+   ~yt.visualization.plot_modifications.SphereCallback
+   ~yt.visualization.plot_modifications.TextLabelCallback
+   ~yt.visualization.plot_modifications.TitleCallback
+   ~yt.visualization.plot_modifications.UnitBoundaryCallback
+   ~yt.visualization.plot_modifications.VelocityCallback
+
+Function List
+-------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.convenience.load
+   ~yt.funcs.deprecate
+   ~yt.funcs.ensure_list
+   ~yt.funcs.get_pbar
+   ~yt.funcs.humanize_time
+   ~yt.funcs.insert_ipython
+   ~yt.funcs.iterable
+   ~yt.funcs.just_one
+   ~yt.funcs.only_on_root
+   ~yt.funcs.paste_traceback
+   ~yt.funcs.pdb_run
+   ~yt.funcs.print_tb
+   ~yt.funcs.rootonly
+   ~yt.funcs.time_execution
+   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
+
+Miscellaneous Types
+-------------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.config.YTConfigParser
+   ~yt.utilities.parameter_file_storage.ParameterFileStore
+   ~yt.data_objects.data_containers.FakeGridForParticles
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
+
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
+
+
+Testing Infrastructure
+----------------------
+
+The first set of functions is provided directly by NumPy.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_array_equal
+   ~yt.testing.assert_almost_equal
+   ~yt.testing.assert_approx_equal
+   ~yt.testing.assert_array_almost_equal
+   ~yt.testing.assert_equal
+   ~yt.testing.assert_array_less
+   ~yt.testing.assert_string_equal
+   ~yt.testing.assert_array_almost_equal_nulp
+   ~yt.testing.assert_allclose
+   ~yt.testing.assert_raises
+
+These are yt-provided functions:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_rel_equal
+   ~yt.testing.amrspace
+   ~yt.testing.fake_random_pf
+   ~yt.testing.expand_keywords
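+
+For example, a sketch combining the two (``fake_random_pf`` builds a small
+in-memory dataset that is handy for unit tests):
+
+.. code-block:: python
+
+   from yt.testing import fake_random_pf, assert_rel_equal
+
+   pf = fake_random_pf(16)  # 16^3 cells of random "Density" values
+   dd = pf.h.all_data()
+   # check relative agreement to 5 significant digits
+   assert_rel_equal(dd["Density"], dd["Density"] * (1.0 + 1e-8), 5)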

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/reference/changelog.rst
--- a/source/reference/changelog.rst
+++ b/source/reference/changelog.rst
@@ -15,6 +15,8 @@
  * David Collins
  * Brian Crosby
  * Andrew Cunningham
+ * Hilary Egan
+ * John Forbes
  * Nathan Goldbaum
  * Markus Haider
  * Cameron Hummels
@@ -24,17 +26,22 @@
  * Kacper Kowalik
  * Michael Kuhlen
  * Eve Lee
+ * Sam Leitner
  * Yuan Li
  * Chris Malone
  * Josh Moloney
  * Chris Moody
  * Andrew Myers
+ * Jill Naiman
+ * Kaylea Nelson
  * Jeff Oishi
  * Jean-Claude Passy
  * Mark Richardson
 * Thomas Robitaille
  * Anna Rosen
+ * Douglas Rudd
  * Anthony Scopatz
+ * Noel Scudder
  * Devin Silvia
  * Sam Skillman
  * Stephen Skory
@@ -45,9 +52,98 @@
  * Stephanie Tonnesen
  * Matthew Turk
  * Rick Wagner
+ * Andrew Wetzel
  * John Wise
  * John ZuHone
 
+Version 2.6
+-----------
+
+This is a scheduled release, bringing to a close the development in the 2.5
+series.  Below are the itemized, aggregate changes since version 2.5.
+
+Major changes:
+
+  * yt is now licensed under the 3-clause BSD license.
+  * HEALpix has been removed for the time being, as a result of licensing
+    incompatibility.
+  * The addition of a frontend for the Pluto code
+  * The addition of an OBJ exporter to enable transparent and multi-surface
+    exports of surfaces to Blender and Sketchfab
+  * New absorption spectrum analysis module with documentation
+  * Adding ability to draw lines with Grey Opacity in volume rendering
+  * Updated physical constants to reflect 2010 CODATA data
+  * Dependency updates (including IPython 1.0)
+  * Better notebook support for yt plots
+  * Considerably (10x+) faster kD-tree building for volume rendering
+  * yt can now export to RADMC3D
+  * Athena frontend now supports Static Mesh Refinement and units (
+    http://hub.yt-project.org/nb/7l1zua )
+  * Fix long-standing bug for plotting arrays with range of zero
+  * Adding option to have interpolation based on non-uniform bins in
+    interpolator code
+  * Upgrades to most of the dependencies in the install script
+  * ProjectionPlot now accepts a data_source keyword argument
+
+Minor or bugfix changes:
+
+  * Fix for volume rendering on the command line
+  * map_to_colormap will no longer return out-of-bounds errors
+  * Fixes for dds in covering grid calculations
+  * Library searching for build process is now more reliable
+  * Unit fix for "VorticityGrowthTimescale" field
+  * Pyflakes stylistic fixes
+  * Number density added to FLASH
+  * Many fixes for Athena frontend
+  * Radius and ParticleRadius now work for reduced-dimensionality datasets
+  * Source distributions now work again!
+  * Athena data now 64 bits everywhere
+  * Grid displays on plots are now shaded to reflect the level of refinement
+  * show_colormaps() is a new function for displaying all known colormaps
+  * PhasePlotter by default now adds a colormap.
+  * System build fix for POSIX systems
+  * Fixing domain offsets for halo centers-of-mass
+  * Removing some Enzo-specific terminology in the Halo Mass Function
+  * Addition of coordinate vectors on volume render
+  * Pickling fix for extracted regions
+  * Addition of some tracer particle annotation functions
+  * Better error message for "yt" command
+  * Fix for radial vs poloidal fields
+  * Piernik 2D data handling fix
+  * Fixes for FLASH current redshift
+  * PlotWindows now have a set_font function and a new default font setting
+  * Colorbars less likely to extend off the edge of a PlotWindow
+  * Clumps overplotted on PlotWindows are now correctly contoured
+  * Many fixes to light ray and profiles for integrated cosmological analysis
+  * Improvements to OpenMP compilation
+  * Typo in value for km_per_pc (not used elsewhere in the code base) has been
+    fixed
+  * Enable parallel IPython notebook sessions (
+    http://hub.yt-project.org/nb/qgn19h )
+  * Change (~1e-6) to particle_density deposition, enabling it to be used by
+    FLASH and other frontends
+  * Addition of is_root function for convenience in parallel analysis sessions
+  * Additions to Orion particle reader
+  * Fixing TotalMass for case when particles not present
+  * Fixing the density threshold for HOP and pHOP to match the merger tree
+  * Reason can now plot with latest plot window
+  * Issues with VelocityMagnitude and aliases with velo have been corrected in
+    the FLASH frontend
+  * Halo radii are calculated correctly for domains that do not start at 0,0,0.
+  * Halo mass function now works for non-Enzo frontends.
+  * Bug fixes for directory creation, typos in docstrings
+  * Speed improvements to ellipsoidal particle detection
+  * Updates to FLASH fields
+  * CASTRO frontend bug fixes
+  * Fisheye camera bug fixes
+  * Answer testing now includes plot window answer testing
+  * Athena data serialization
+  * load_uniform_grid can now decompose dims >= 1024.  (#537)
+  * Axis unit setting works correctly for unit names  (#534)
+  * ThermalEnergy is now calculated correctly for Enzo MHD simulations (#535)
+  * Radius fields had an asymmetry in periodicity calculation (#531)
+  * Boolean regions can now be pickled (#517)
+
 Version 2.5
 -----------
 

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 45b305e19ab8133b7086c3a1df0df559dd524353 source/reference/code_support.rst
--- /dev/null
+++ b/source/reference/code_support.rst
@@ -0,0 +1,53 @@
+
+.. _code-support:
+
+Code Support
+============
+
+Levels of Support for Various Codes
+-----------------------------------
+
+yt provides frontends to support several different simulation code formats 
+as inputs.  Below is a list showing what level of support is provided for
+each code.
+
+|
+
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Capability           | Enzo | Orion | FLASH | Nyx  | Piernik | Athena | Castro | Maestro | Pluto | Chombo |
++======================+======+=======+=======+======+=========+========+========+=========+=======+========+
+| Fluid Quantities     |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Particles            |   Y  |   Y   |   Y   |  Y   |   N/A   |   N    |   Y    |   N     |   N   |    N   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Parameters           |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Units                |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Read on Demand       |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Load Raw Data        |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Part of test suite   |   Y  |   Y   |   Y   |  Y   |    N    |   N    |   Y    |   N     |   N   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Level of Support     | Full | Full  | Full  | Full |  Full   |  Full  |  Part  |  Part   | Part  |  Part  |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+
+|
+
+If you have a dataset from a code not yet supported, you can either 
+input your data using the :ref:`loading-numpy-array` format, or help us by 
+:ref:`creating_frontend` for this new format.
+
+Future Codes to Support
+-----------------------
+
+A major overhaul of the code was required in order to cleanly support 
+additional codes.  Development in the yt 3.x branch has begun and provides 
+support for codes like: 
+`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_, 
+`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_, and 
+`Gadget <http://www.mpa-garching.mpg.de/gadget/>`_.  Please switch to that 
+version of yt for the most up-to-date support for those codes.
+
+Additionally, in yt 3.0 the Boxlib formats have been unified and streamlined.

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/11fea34c6f5c/
Changeset:   11fea34c6f5c
User:        jzuhone
Date:        2013-10-28 19:58:52
Summary:     Sunyaev-Zeldovich analysis docs
Affected #:  2 files

diff -r a059cbb57ddf355a8e311bc1816c0728351deee4 -r 11fea34c6f5ce7477f92d31b920613d1d345562b source/analysis_modules/index.rst
--- a/source/analysis_modules/index.rst
+++ b/source/analysis_modules/index.rst
@@ -27,3 +27,5 @@
    ellipsoid_analysis
    xray_emission_fields
    radmc3d_export
+   sunyaev_zeldovich
+

diff -r a059cbb57ddf355a8e311bc1816c0728351deee4 -r 11fea34c6f5ce7477f92d31b920613d1d345562b source/analysis_modules/sunyaev_zeldovich.rst
--- /dev/null
+++ b/source/analysis_modules/sunyaev_zeldovich.rst
@@ -0,0 +1,143 @@
+.. _sunyaev_zeldovich:
+
+Mock Observations of the Sunyaev-Zeldovich Effect
+=================================================
+
+.. sectionauthor:: John ZuHone <jzuhone at gmail.com>
+
+The change in the CMB intensity due to Compton scattering of CMB
+photons off of thermal electrons in galaxy clusters, otherwise known as the
+Sunyaev-Zeldovich (S-Z) effect, can to a reasonable approximation be
+represented by a projection of the pressure field of a cluster. However, the
+*full* S-Z signal is a combination of thermal and kinetic
+contributions, and for large frequencies and high temperatures
+relativistic effects are important. For computing the full S-Z signal
+incorporating all of these effects, Jens Chluba has written a library:
+SZpack (`Chluba et al 2012 <http://adsabs.harvard.edu/abs/2012MNRAS.426..510C>`_). 
+
+The ``sunyaev_zeldovich`` analysis module in ``yt`` makes it possible
+to make projections of the full S-Z signal given the properties of the
+thermal gas in the simulation using SZpack. SZpack has several different
+options for computing the S-Z signal, from full
+integrations to very good approximations.  Since a full or even a
+partial integration of the signal for each cell in the projection
+would be prohibitively expensive, we use the method outlined in
+`Chluba et al 2013 <http://adsabs.harvard.edu/abs/2013MNRAS.430.3054C>`_ to expand the
+total S-Z signal in terms of moments of the projected optical depth
+:math:`\tau`, projected electron temperature :math:`T_e`, and
+velocities :math:`\beta_{c,\parallel}` and :math:`\beta_{c,\perp}` (their equation 18):
+
+.. math::
+  S(\tau, T_e, \beta_{c,\parallel}, \beta_{c,\perp}) \approx
+  S_{\rm iso}^{(0)} + S_{\rm iso}^{(2)}\omega^{(1)} +
+  C_{\rm iso}^{(1)}\sigma^{(1)} + D_{\rm iso}^{(2)}\kappa^{(1)} +
+  E_{\rm iso}^{(2)}\beta_{c,\perp,{\rm SZ}}^2 + \ldots
+
+``yt`` makes projections of the various moments needed for the
+calculation, and then the resulting projected fields are used to
+compute the S-Z signal. In our implementation, the expansion is carried
+out to first-order terms in :math:`T_e` and zeroth-order terms in
+:math:`\beta_{c,\parallel}` by default, but terms up to second order in
+both quantities can be optionally included.
+ 
+Installing SZpack
+-------------------------------
+
+SZpack can be downloaded `here
+<http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html>`_. Make
+sure you install a version later than v1.1.1. For computing the S-Z
+integrals, SZpack requires the `GNU Scientific Library <http://www.gnu.org/software/gsl/>`_. For compiling
+the Python module, you need to have a recent version of `swig <http://www.swig.org>`_
+installed. After running ``make`` in the top-level SZpack directory,
+you'll need to run ``make`` again in the ``python`` subdirectory, which is
+the location of the ``SZpack`` module. You may have to add this location
+to the ``PYTHONPATH`` environment variable.
+
+Creating S-Z Projections
+-------------------------------
+
+Once you have ``SZpack`` installed, making S-Z projections from ``yt``
+datasets is fairly straightforward:
+
+.. code-block:: python
+
+  from yt.mods import *
+  from yt.analysis_modules.api import SZProjection
+
+  pf = load("fiducial_1to10_b0_hdf5_plt_cnt_0115.gz")
+  freqs = [90., 180., 240.]
+  szprj = SZProjection(pf, freqs)
+
+``freqs`` is a list or array of frequencies in GHz at which the signal
+is to be computed. The ``SZProjection`` constructor also accepts the
+optional keywords **mue** (the mean molecular weight used to compute the
+electron number density; 1.143 is the default) and **high_order** (set
+to ``True`` to compute terms in the S-Z signal expansion up to
+second-order in :math:`T_e` and :math:`\beta`).
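+
+For example, to override the defaults (a sketch reusing the dataset and
+frequencies from above):
+
+.. code-block:: python
+
+  # Include second-order terms and set the mean molecular weight explicitly
+  szprj = SZProjection(pf, freqs, mue=1.143, high_order=True)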
+
+Once you have created the ``SZProjection`` object, you can use it to
+make on-axis and off-axis projections:
+
+.. code-block:: python
+
+  # An on-axis projection along the z-axis with width 7 Mpc
+  szprj.on_axis("z", width=(7.0, "mpc"))
+  # An off-axis projection along a normal vector centered at the
+  # maximum gas density with a width of 6000 kpc
+  L = np.array([0.1,-0.1,0.3])
+  szprj.off_axis(L, center="max", width=(6000., "kpc"))
+
+Currently, only one projection can be held in memory at a time. These methods
+create images of the projected S-Z signal at each requested frequency,
+which can be accessed from the projection object like a dictionary (e.g.,
+``szprj["90_GHz"]``). Projections of other quantities may also be
+accessed; to see which fields are available, call ``szprj.keys()``. The methods also accept standard ``yt``
+projection keywords such as **center**, **width**, and **source**. The image buffer size can be controlled by setting **nx**.
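+
+For instance, after making one of the projections above:
+
+.. code-block:: python
+
+  print szprj.keys()         # names of all available projected fields
+  image90 = szprj["90_GHz"]  # image of the projected S-Z signal at 90 GHz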
+
+Writing out the S-Z Projections
+-------------------------------
+
+You may want to turn the S-Z images into figures suitable for
+inclusion in a paper, or save them to disk for later use. There are a
+few methods included for this purpose. For PNG figures with a colorbar
+and axes, use ``write_png``:
+
+.. code-block:: python
+
+  szprj.write_png("SZbullet")
+
+which would result in the following images of the S-Z signal for our previous on-axis
+example:
+
+.. image:: _images/SZbullet_90_GHz.png
+   :width: 500
+
+.. image:: _images/SZbullet_180_GHz.png
+   :width: 500
+
+.. image:: _images/SZbullet_240_GHz.png
+   :width: 500
+
+along with projections of the optical depth and the mass-weighted
+temperature:
+
+.. image:: _images/SZbullet_Tau.png
+   :width: 500
+
+.. image:: _images/SZbullet_TeSZ.png
+   :width: 500
+
+For simple output of the image data to disk, call ``write_hdf5``:
+
+.. code-block:: python
+
+  szprj.write_hdf5("SZbullet.h5")
+
+Finally, for output to FITS files, which can be opened or analyzed
+using other programs (such as ds9), call ``write_fits``:
+
+.. code-block:: python
+
+  szprj.write_fits("SZbullet", clobber=True)
+
+which would write all of the projections to a single FITS file named ``SZbullet.fits``,
+including coordinate information in kpc. The optional keyword
+**clobber** allows a previous file of the same name to be overwritten.
+
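+As a quick check of the output (a sketch, assuming you have pyfits
+installed), you can list the contents of the resulting file:
+
+.. code-block:: python
+
+  import pyfits
+  f = pyfits.open("SZbullet.fits")
+  f.info()  # print a summary of the file's contents
+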
+.. note:: To write out a FITS file, you must have either the `pyfits <http://www.stsci.edu/resources/software_hardware/pyfits>`_ or the `AstroPy <http://www.astropy.org>`_ module installed.


https://bitbucket.org/yt_analysis/yt-doc/commits/eea7a0e14c5f/
Changeset:   eea7a0e14c5f
User:        jzuhone
Date:        2013-10-29 15:33:24
Summary:     Merged yt_analysis/yt-doc into default
Affected #:  18 files

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 extensions/notebook_sphinxext.py
--- /dev/null
+++ b/extensions/notebook_sphinxext.py
@@ -0,0 +1,154 @@
+import os, shutil, string, glob
+from sphinx.util.compat import Directive
+from docutils import nodes
+from docutils.parsers.rst import directives
+from IPython.nbconvert import html, python
+from runipy.notebook_runner import NotebookRunner
+from jinja2 import FileSystemLoader
+
+class NotebookDirective(Directive):
+    """Insert an evaluated notebook into a document
+
+    This uses runipy and nbconvert to transform a path to an unevaluated notebook
+    into html suitable for embedding in a Sphinx document.
+    """
+    required_arguments = 1
+    optional_arguments = 0
+
+    def run(self):
+        # check if raw html is supported
+        if not self.state.document.settings.raw_enabled:
+            raise self.warning('"%s" directive disabled.' % self.name)
+
+        # get path to notebook
+        source_dir = os.path.dirname(
+            os.path.abspath(self.state.document.current_source))
+        nb_basename = os.path.basename(self.arguments[0])
+        rst_file = self.state_machine.document.attributes['source']
+        rst_dir = os.path.abspath(os.path.dirname(rst_file))
+        nb_abs_path = os.path.join(rst_dir, nb_basename)
+
+        # Move files around.
+        rel_dir = os.path.relpath(rst_dir, setup.confdir)
+        dest_dir = os.path.join(setup.app.builder.outdir, rel_dir)
+        dest_path = os.path.join(dest_dir, nb_basename)
+
+        if not os.path.exists(dest_dir):
+            os.makedirs(dest_dir)
+
+        # Copy unevaluated script
+        try:
+            shutil.copyfile(nb_abs_path, dest_path)
+        except IOError:
+            raise RuntimeError("Unable to copy notebook to build destination.")
+
+        dest_path_eval = string.replace(dest_path, '.ipynb', '_evaluated.ipynb')
+        dest_path_script = string.replace(dest_path, '.ipynb', '.py')
+
+        # Create the Python script version
+        unevaluated_text = nb_to_html(nb_abs_path)
+        script_text = nb_to_python(nb_abs_path)
+        f = open(dest_path_script, 'w')
+        f.write(script_text.encode('utf8'))
+        f.close()
+
+        # Create evaluated version and save it to the dest path.
+        # Always use --pylab so figures appear inline
+        # perhaps this is questionable?
+        nb_runner = NotebookRunner(nb_in=nb_abs_path, pylab=True)
+        nb_runner.run_notebook()
+        nb_runner.save_notebook(dest_path_eval)
+        evaluated_text = nb_to_html(dest_path_eval)
+
+        # Create link to notebook and script files
+        link_rst = "(" + \
+                   formatted_link(dest_path) + "; " + \
+                   formatted_link(dest_path_eval) + "; " + \
+                   formatted_link(dest_path_script) + \
+                   ")"
+
+        self.state_machine.insert_input([link_rst], rst_file)
+
+        # create notebook node
+        attributes = {'format': 'html', 'source': 'nb_path'}
+        nb_node = nodes.raw('', evaluated_text, **attributes)
+        (nb_node.source, nb_node.line) = \
+            self.state_machine.get_source_and_line(self.lineno)
+
+        # add dependency
+        self.state.document.settings.record_dependencies.add(nb_abs_path)
+
+        # clean up png files left behind by notebooks.
+        png_files = glob.glob("*.png")
+        for file in png_files:
+            os.remove(file)
+
+        return [nb_node]
+
+class notebook_node(nodes.raw):
+    pass
+
+def nb_to_python(nb_path):
+    """convert notebook to python script"""
+    exporter = python.PythonExporter()
+    output, resources = exporter.from_filename(nb_path)
+    return output
+
+def nb_to_html(nb_path):
+    """convert notebook to html"""
+    exporter = html.HTMLExporter(template_file='full')
+    output, resources = exporter.from_filename(nb_path)
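+    # Separate the exported page's <head> styles from its <body> markup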
+    header = output.split('<head>', 1)[1].split('</head>',1)[0]
+    body = output.split('<body>', 1)[1].split('</body>',1)[0]
+
+    # Scope the notebook's CSS so it does not leak into the surrounding
+    # Sphinx page (cf. http://imgur.com/eR9bMRH)
+    header = header.replace('<style', '<style scoped="scoped"')
+    header = header.replace('body{background-color:#ffffff;}\n', '')
+    header = header.replace('body{background-color:white;position:absolute;'
+                            'left:0px;right:0px;top:0px;bottom:0px;'
+                            'overflow:visible;}\n', '')
+    header = header.replace('body{margin:0;'
+                            'font-family:"Helvetica Neue",Helvetica,Arial,'
+                            'sans-serif;font-size:13px;line-height:20px;'
+                            'color:#000000;background-color:#ffffff;}', '')
+    header = header.replace('\na{color:#0088cc;text-decoration:none;}', '')
+    header = header.replace(
+        'a:focus{color:#005580;text-decoration:underline;}', '')
+    header = header.replace(
+        '\nh1,h2,h3,h4,h5,h6{margin:10px 0;font-family:inherit;font-weight:bold;'
+        'line-height:20px;color:inherit;text-rendering:optimizelegibility;}'
+        'h1 small,h2 small,h3 small,h4 small,h5 small,'
+        'h6 small{font-weight:normal;line-height:1;color:#999999;}'
+        '\nh1,h2,h3{line-height:40px;}\nh1{font-size:35.75px;}'
+        '\nh2{font-size:29.25px;}\nh3{font-size:22.75px;}'
+        '\nh4{font-size:16.25px;}\nh5{font-size:13px;}'
+        '\nh6{font-size:11.049999999999999px;}\nh1 small{font-size:22.75px;}'
+        '\nh2 small{font-size:16.25px;}\nh3 small{font-size:13px;}'
+        '\nh4 small{font-size:13px;}', '')
+    header = header.replace('background-color:#ffffff;', '', 1)
+
+    # concatenate raw html lines
+    lines = ['<div class="ipynotebook">']
+    lines.append(header)
+    lines.append(body)
+    lines.append('</div>')
+    return '\n'.join(lines)
+
+def formatted_link(path):
+    return "`%s <%s>`__" % (os.path.basename(path), path)
+
+def visit_notebook_node(self, node):
+    self.visit_raw(node)
+
+def depart_notebook_node(self, node):
+    self.depart_raw(node)
+
+def setup(app):
+    setup.app = app
+    setup.config = app.config
+    setup.confdir = app.confdir
+
+    app.add_node(notebook_node,
+                 html=(visit_notebook_node, depart_notebook_node))
+
+    app.add_directive('notebook', NotebookDirective)

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/analysis_modules/quick_start_fitting.rst
--- a/source/analysis_modules/quick_start_fitting.rst
+++ b/source/analysis_modules/quick_start_fitting.rst
@@ -3,6 +3,7 @@
 Fitting an Absorption Spectrum
 ==============================
 .. sectionauthor:: Hilary Egan <hilary.egan at colorado.edu>
+
 This tool can be used to fit absorption spectra, particularly those
 generated using the (``AbsorptionSpectrum``) tool. For more details
 on its uses and implementation please see (`Egan et al. (2013)

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/api/api.rst
--- a/source/api/api.rst
+++ b/source/api/api.rst
@@ -25,7 +25,7 @@
    ~yt.visualization.plot_collection.PlotCollectionInteractive
    ~yt.visualization.fixed_resolution.FixedResolutionBuffer
    ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
-   ~yt.visualization.plot_collection.get_multi_plot
+   ~yt.visualization.base_plot_types.get_multi_plot
 
 Data Sources
 ------------
@@ -321,9 +321,7 @@
 
 Absorption spectra fitting:
 
-.. autosummary::
-    :toctree: generated/
-    ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
+.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
 
 Sunrise exporting:
 
@@ -525,14 +523,16 @@
    :toctree: generated/
 
    ~yt.config.YTConfigParser
-   ~yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
-   ~yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
    ~yt.utilities.parameter_file_storage.ParameterFileStore
    ~yt.data_objects.data_containers.FakeGridForParticles
    ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
    ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
    ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
 
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
+
+
 Testing Infrastructure
 ----------------------
 

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/Data_Inspection.ipynb
--- /dev/null
+++ b/source/bootcamp/Data_Inspection.ipynb
@@ -0,0 +1,396 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Starting Out and Loading Data\n",
+      "\n",
+      "We're going to get started by loading up yt.  This next command brings all of the libraries into memory and sets up our environment.  Note that in most scripts, you will want to import from ``yt.mods`` rather than ``yt.imods``.  But using ``yt.imods`` gets you some nice stuff for the IPython notebook, which we'll use below."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now that we've loaded yt, we can load up some data.  Let's load the `IsolatedGalaxy` dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Fields and Facts\n",
+      "\n",
+      "When you call the `load` function, yt tries to do very little -- this is designed to be a fast operation, just setting up some information about the simulation.  Now, the first time you access the \"hierarchy\" (shorthand is `.h`) it will read and load the mesh and then determine where data is placed in the physical domain and on disk.  Once it knows that, yt can tell you some statistics about the simulation:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.print_stats()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also tell you the fields it found on disk:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.field_list"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "And, all of the fields it thinks it knows how to generate:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf.h.derived_field_list"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also transparently generate fields.  However, we encourage you to examine exactly what yt is doing when it generates those fields.  To see, you can ask for the source of a given field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.field_info[\"VorticityX\"].get_source()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt stores information about the domain of the simulation:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.domain_width"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt can also convert this into various units:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.domain_width * pf[\"kpc\"]\n",
+      "print pf.domain_width * pf[\"au\"]\n",
+      "print pf.domain_width * pf[\"miles\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Mesh Structure\n",
+      "\n",
+      "If you're using a simulation type that has grids (for instance, here we're using an Enzo simulation) you can examine the structure of the mesh.  For the most part, you probably won't have to use this unless you're debugging a simulation or examining in detail what is going on."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.grid_left_edge"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "But, you may have to access information about individual grid objects!  Each grid object mediates accessing data from the disk and has a number of attributes that tell you about it.  The hierarchy (`pf.h` here) has an attribute `grids` which is all of the grid objects."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.grids[0]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g = pf.h.grids[0]\n",
+      "print g"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Grids have dimensions, extents, level, and even a list of Child grids."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.ActiveDimensions"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.LeftEdge, g.RightEdge"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.Level"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g.Children"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Advanced Grid Inspection\n",
+      "\n",
+      "If we want to examine grids only at a given level, we can!  Not only that, but we can load data and take a look at various fields.\n",
+      "\n",
+      "*This section can be skipped!*"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "gs = pf.h.select_grids(pf.h.max_level)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "g2 = gs[0]\n",
+      "print g2\n",
+      "print g2.Parent\n",
+      "print g2.get_global_startindex()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print g2[\"Density\"][:,:,0]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print (g2.Parent.child_mask == 0).sum() * 8\n",
+      "print g2.ActiveDimensions.prod()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for f in pf.h.field_list:\n",
+      "    fv = g[f]\n",
+      "    if fv.size == 0: continue\n",
+      "    print f, fv.min(), fv.max()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for f in sorted(pf.h.field_list):\n",
+      "    fv = g[f]\n",
+      "    if fv.size == 0: continue\n",
+      "    print f, fv.min(), fv.max()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Examining Data in Regions\n",
+      "\n",
+      "yt provides data object selectors.  In subsequent notebooks we'll examine these in more detail, but we can select a sphere of data and perform a number of operations on it.  yt makes it easy to operate on fluid fields in an object in *bulk*, but you can also examine individual field values.\n",
+      "\n",
+      "This creates a sphere selector positioned at the most dense point in the simulation that has a radius of 10 kpc."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp = pf.h.sphere(\"max\", (10, 'kpc'))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can calculate a bunch of bulk quantities.  Here's that list, but there's a list in the docs, too!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp.quantities.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Let's look at the total mass.  This is how you call a given quantity.  yt calls these \"Derived Quantities\".  We'll talk about a few in a later notebook."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sp.quantities[\"TotalMass\"]()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/Data_Objects_and_Time_Series.ipynb
--- /dev/null
+++ b/source/bootcamp/Data_Objects_and_Time_Series.ipynb
@@ -0,0 +1,361 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Data Objects and Time Series Data\n",
+      "\n",
+      "Just like before, we will load up yt."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Time Series Data\n",
+      "\n",
+      "Unlike before, instead of loading a single dataset, this time we'll load a bunch which we'll examine in sequence.  This command creates a `TimeSeriesData` object, which can be iterated over (including in parallel, which is outside the scope of this bootcamp) and analyzed.  There are some other helpful operations it can provide, but we'll stick to the basics here.\n",
+      "\n",
+      "Note that you can specify either a list of filenames, or a glob (i.e., asterisk) pattern in this."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "ts = TimeSeriesData.from_filenames(\"enzo_tiny_cosmology/*/*.hierarchy\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Example 1: Simple Time Series\n",
+      "\n",
+      "As a simple example of how we can use this functionality, let's find the min and max of the density as a function of time in this simulation.  To do this we use the construction `for pf in ts` where `pf` means \"Parameter File\" and `ts` is the \"Time Series\" we just loaded up.  For each parameter file, we'll create an object (`dd`) that covers the entire domain.  (`all_data` is a shorthand function for this.)  We'll then call the Derived Quantity `Extrema`, and append the min and max to our extrema outputs."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "rho_ex = []\n",
+      "times = []\n",
+      "for pf in ts:\n",
+      "    dd = pf.h.all_data()\n",
+      "    rho_ex.append(dd.quantities[\"Extrema\"](\"Density\")[0])\n",
+      "    times.append(pf.current_time * pf[\"years\"])\n",
+      "rho_ex = np.array(rho_ex)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we plot the minimum and the maximum:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.semilogy(times, rho_ex[:,0], '-xk')\n",
+      "pylab.semilogy(times, rho_ex[:,1], '-xr')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Example 2: Advanced Time Series\n",
+      "\n",
+      "Let's do something a bit different.  Let's calculate the total mass inside halos and outside halos.\n",
+      "\n",
+      "This actually touches a lot of different pieces of machinery in yt.  For every parameter file, we will run the halo finder HOP.  Then, we calculate the total mass in the domain.  Then, for each halo, we calculate the sum of the baryon mass in that halo.  We'll keep running tallies of these two things."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "mass = []\n",
+      "zs = []\n",
+      "for pf in ts:\n",
+      "    halos = HaloFinder(pf)\n",
+      "    dd = pf.h.all_data()\n",
+      "    total_mass = dd.quantities[\"TotalQuantity\"](\"CellMassMsun\")[0]\n",
+      "    total_in_baryons = 0.0\n",
+      "    for halo in halos:\n",
+      "        sp = halo.get_sphere()\n",
+      "        total_in_baryons += sp.quantities[\"TotalQuantity\"](\"CellMassMsun\")[0]\n",
+      "    mass.append(total_in_baryons/total_mass)\n",
+      "    zs.append(pf.current_redshift)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now let's plot them!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.loglog(zs, mass, '-xb')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Data Objects\n",
+      "\n",
+      "Time series data have many applications, but most of them rely on examining the underlying data in some way.  Below, we'll see how to use and manipulate data objects.\n",
+      "\n",
+      "### Ray Queries\n",
+      "\n",
+      "yt provides the ability to examine rays, or lines, through the domain.  Note that these are not periodic, unlike most other data objects.  We create a ray object and can then examine quantities of it.  Rays have the special fields `t` and `dts`, which correspond to the time the ray enters a given cell and the distance it travels through that cell.\n",
+      "\n",
+      "To create a ray, we specify the start and end points."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "ray = pf.h.ray([0.1, 0.2, 0.3], [0.9, 0.8, 0.7])\n",
+      "pylab.semilogy(ray[\"t\"], ray[\"Density\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"dts\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"t\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print ray[\"x\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Slice Queries\n",
+      "\n",
+      "While slices are often used for visualization, they can be useful for other operations as well.  yt regards slices as multi-resolution objects.  They are an array of cells that are not all the same size; it only returns the cells at the highest resolution that it intersects.  (This is true for all yt data objects.)  Slices and projections have the special fields `px`, `py`, `pdx` and `pdy`, which correspond to the coordinates and half-widths in the pixel plane."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "v, c = pf.h.find_max(\"Density\")\n",
+      "sl = pf.h.slice(0, c[0])\n",
+      "print sl[\"x\"], sl[\"z\"], sl[\"pdx\"]\n",
+      "print sl[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to do something interesting with a Slice, we can turn it into a `FixedResolutionBuffer`.  This object can be queried and will return a 2D array of values."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "frb = sl.to_frb((50.0, 'kpc'), 1024)\n",
+      "print frb[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt provides a few functions for writing arrays to disk, particularly in image form.  Here we'll write out the log of Density, and then use IPython to display it back here.  Note that for the most part, you will probably want to use a `PlotWindow` for this, but in the case that it is useful you can directly manipulate the data."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "write_image(np.log10(frb[\"Density\"]), \"temp.png\")\n",
+      "from IPython.core.display import Image\n",
+      "Image(filename = \"temp.png\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Off-Axis Slices\n",
+      "\n",
+      "yt provides not only slices, but off-axis slices that are sometimes called \"cutting planes.\"  These are specified by (in order) a normal vector and a center.  Here we've set the normal vector to `[0.2, 0.3, 0.5]` and the center to be the point of maximum density.\n",
+      "\n",
+      "We can then turn these directly into plot windows using `to_pw`.  Note that the `to_pw` and `to_frb` methods are available on slices, off-axis slices, and projections, and can be used on any of them."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cp = pf.h.cutting([0.2, 0.3, 0.5], \"max\")\n",
+      "pw = cp.to_pw(fields = [\"Density\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once we have our plot window from our cutting plane, we can show it here."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pw.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can, as noted above, do the same with our slice:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pws = sl.to_pw(fields=[\"Density\"])\n",
+      "pws.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "### Covering Grids\n",
+      "\n",
+      "If we want to access a 3D array of data that spans multiple resolutions in our simulation, we can use a covering grid.  This will return a 3D array of data, drawing from up to the resolution level specified when creating the data.  For example, if you create a covering grid that spans two child grids of a single parent grid, it will fill those zones covered by a zone of a child grid with the data from that child grid.  Where it is covered only by the parent grid, the cells from the parent grid will be duplicated (appropriately) to fill the covering grid.\n",
+      "\n",
+      "There are two different types of covering grids: unsmoothed and smoothed.  Smoothed grids will be filled through a cascading interpolation process; they will be filled at level 0, interpolated to level 1, filled at level 1, interpolated to level 2, filled at level 2, etc.  This will help to reduce edge effects.  Unsmoothed covering grids will not be interpolated, but rather values will be duplicated multiple times.\n",
+      "\n",
+      "Here we create an unsmoothed covering grid at level 2, with the left edge at `[0.0, 0.0, 0.0]` and with dimensions equal to those that would cover the entire domain at level 2.  We can then ask for the Density field, which will be a 3D array."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cg = pf.h.covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "print cg[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this example, we do exactly the same thing: except we ask for a *smoothed* covering grid, which will reduce edge effects."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "scg = pf.h.smoothed_covering_grid(2, [0.0, 0.0, 0.0], pf.domain_dimensions * 2**2)\n",
+      "print scg[\"Density\"].shape"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/Derived_Fields_and_Profiles.ipynb
--- /dev/null
+++ b/source/bootcamp/Derived_Fields_and_Profiles.ipynb
@@ -0,0 +1,316 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Derived Fields and Profiles\n",
+      "\n",
+      "One of the most powerful features in yt is the ability to create derived fields that act and look exactly like fields that exist on disk.  This means that they will be generated on demand and can be used anywhere a field that exists on disk would be used.  Additionally, you can create them by just writing python functions."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [],
+     "prompt_number": 1
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Derived Fields\n",
+      "\n",
+      "This is an example of the simplest possible way to create a derived field.  All derived fields are defined by a function and some metadata; that metadata can include units, LaTeX-friendly names, conversion factors, and so on.  Fields can be defined in the way in the next cell.  What this does is create a function which accepts two arguments and then provide the units for that field.  In this case, our field is `Dinosaurs` and our units are `Trex/s`.  The function itself can access any fields that are in the simulation, and it does so by requesting data from the object called `data`."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "@derived_field(units = \"Trex/s\")\n",
+      "def Dinosaurs(field, data):\n",
+      "    return data[\"Density\"]**(2.0/3.0) * data[\"VelocityMagnitude\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [],
+     "prompt_number": 2
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "One important thing to note is that derived fields must be defined *before* any datasets are loaded.  Let's load up our data and take a look at some quantities."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "dd = pf.h.all_data()\n",
+      "print dd.quantities.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [
+      {
+       "output_type": "stream",
+       "stream": "stdout",
+       "text": [
+        "['MinLocation', 'StarAngularMomentumVector', 'WeightedVariance', 'TotalMass', 'AngularMomentumVector', 'TotalQuantity', 'IsBound', 'WeightedAverageQuantity', 'CenterOfMass', 'BulkVelocity', 'ParticleSpinParameter', 'Action', 'Extrema', 'MaxLocation', 'BaryonSpinParameter']\n"
+       ]
+      }
+     ],
+     "prompt_number": 4
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "One interesting question is, what are the minimum and maximum values of dinosaur production rates in our isolated galaxy?  We can do that by examining the `Extrema` quantity -- the exact same way that we would for Density, Temperature, and so on."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print dd.quantities[\"Extrema\"](\"Dinosaurs\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": [
+      {
+       "output_type": "stream",
+       "stream": "stdout",
+       "text": [
+        "[(2.2146366774504352e-20, 9.1573883828992124e-09)]\n"
+       ]
+      }
+     ],
+     "prompt_number": 5
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can do the same for the average quantities as well."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print dd.quantities[\"WeightedAverageQuantity\"](\"Dinosaurs\", weight=\"Temperature\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## A Few Other Quantities\n",
+      "\n",
+      "We can ask other quantities of our data, as well.  For instance, this sequence of operations will find the most dense point, center a sphere on it, calculate the bulk velocity of that sphere, calculate the baryonic angular momentum vector, and then the density extrema.  All of this is done in a memory conservative way: if you have an absolutely enormous dataset, yt will split that dataset into pieces, apply intermediate reductions and then a final reduction to calculate your quantity."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp = pf.h.sphere(\"max\", (10.0, 'kpc'))\n",
+      "bv = sp.quantities[\"BulkVelocity\"]()\n",
+      "L = sp.quantities[\"AngularMomentumVector\"]()\n",
+      "(rho_min, rho_max), = sp.quantities[\"Extrema\"](\"Density\")\n",
+      "print bv, L, rho_min, rho_max"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Profiles\n",
+      "\n",
+      "yt provides the ability to bin in 1, 2 and 3 dimensions.  This means discretizing in one or more dimensions of phase space (density, temperature, etc) and then calculating either the total value of a field in each bin or the average value of a field in each bin.\n",
+      "\n",
+      "We do this using the objects `BinnedProfile1D`, `BinnedProfile2D`, and `BinnedProfile3D`.  The first two are the most common since they are the easiest to visualize.\n",
+      "\n",
+      "This first set of commands manually creates a `BinnedProfile1D` from the sphere we created earlier, binned in 32 bins according to density between `rho_min` and `rho_max`, and then takes the Density-weighted average of the fields `Temperature` and (previously-defined) `Dinosaurs`.  We then plot it in a loglog plot."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof = BinnedProfile1D(sp, 32, \"Density\", rho_min, rho_max)\n",
+      "prof.add_fields([\"Temperature\", \"Dinosaurs\"], weight=\"Density\")\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"Temperature\"], \"-x\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we plot the `Dinosaurs` field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pylab.loglog(prof[\"Density\"], prof[\"Dinosaurs\"], '-x')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to see the total mass in every bin, we add the `CellMassMsun` field with no weight.  Specifying `weight=None` will simply take the total value in every bin and add that up."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof.add_fields([\"CellMassMsun\"], weight=None)\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"CellMassMsun\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can also specify accumulation, which sums all the bins, from left to right.  Note that for 2D and 3D profiles, this needs to be a tuple of length 2 or 3."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prof.add_fields([\"CellMassMsun\"], weight=None, accumulation=True)\n",
+      "pylab.loglog(prof[\"Density\"], prof[\"CellMassMsun\"])"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Advanced Derived Fields\n",
+      "\n",
+      "*This section can be skipped!*\n",
+      "\n",
+      "You can also define fields that require extra zones.  This is useful, for instance, if you want to take the average, or apply a stencil.  yt provides fields like `DivV` that do this internally.  This example is a very busy example of how to do it.  You need to specify the validator `ValidateSpatial` with the number of extra zones *on each side* of the grid that you need, and then inside your function you need to return a field *with those zones stripped off*.  So by necessity, the arrays returned by `data[something]` will have larger spatial extent than what should be returned by the function itself.  If you specify that you need 0 extra zones, this will also work and will simply supply a `grid` object for the field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "@derived_field(name = \"AveragedTemperature\",\n",
+      "               validators = [ValidateSpatial(1)],\n",
+      "               units = r\"K\")\n",
+      "def _AveragedTemperature(field, data):\n",
+      "    nx, ny, nz = data[\"Temperature\"].shape\n",
+      "    new_field = na.zeros((nx-2,ny-2,nz-2), dtype='float64')\n",
+      "    weight_field = na.zeros((nx-2,ny-2,nz-2), dtype='float64')\n",
+      "    i_i, j_i, k_i = na.mgrid[0:3,0:3,0:3]\n",
+      "    for i,j,k in zip(i_i.ravel(),j_i.ravel(),k_i.ravel()):\n",
+      "        sl = [slice(i,nx-(2-i)),slice(j,ny-(2-j)),slice(k,nz-(2-k))]\n",
+      "        new_field += data[\"Temperature\"][sl] * data[\"CellMass\"][sl]\n",
+      "        weight_field += data[\"CellMass\"][sl]\n",
+      "    # Now some fancy footwork\n",
+      "    new_field2 = na.zeros((nx,ny,nz))\n",
+      "    new_field2[1:-1,1:-1,1:-1] = new_field/weight_field\n",
+      "    return new_field2"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now, once again, we can access `AveragedTemperature` just like any other field.  Note that because it requires ghost zones, this will be a much slower process!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")\n",
+      "dd = pf.h.all_data()\n",
+      "(tmin, tmax), (atmin, atmax) = dd.quantities[\"Extrema\"]([\"Temperature\", \"AveragedTemperature\"])\n",
+      "print tmin, tmax, atmin, atmax\n",
+      "print tmin / atmin, tmax / atmax"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "## Field Parameters\n",
+      "\n",
+      "Field parameters are a method of passing information to derived fields.  For instance, you might pass in information about a vector you want to use as a basis for a coordinate transformation.  yt often uses things like `bulk_velocity` to identify velocities that should be subtracted off.  Here we show how that works:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "sp_small = pf.h.sphere(\"max\", (1.0, 'kpc'))\n",
+      "bv = sp_small.quantities[\"BulkVelocity\"]()\n",
+      "\n",
+      "sp = pf.h.sphere(\"max\", (0.1, 'mpc'))\n",
+      "rv1 = sp.quantities[\"Extrema\"](\"RadialVelocity\")\n",
+      "\n",
+      "sp.clear_data()\n",
+      "sp.set_field_parameter(\"bulk_velocity\", bv)\n",
+      "rv2 = sp.quantities[\"Extrema\"](\"RadialVelocity\")\n",
+      "\n",
+      "print bv\n",
+      "print rv1\n",
+      "print rv2"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/Introduction.ipynb
--- /dev/null
+++ b/source/bootcamp/Introduction.ipynb
@@ -0,0 +1,66 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Welcome to the yt bootcamp!\n",
+      "\n",
+      "In this brief tutorial, we'll go over how to load up data, analyze things, inspect your data, and make some visualizations.\n",
+      "\n",
+      "But, before we begin, there are a few places to go if you run into trouble.\n",
+      "\n",
+      "**The yt homepage is at http://yt-project.org/**\n",
+      "\n",
+      "## Source of Help\n",
+      "\n",
+      "There are three places to check for help:\n",
+      "\n",
+      " * The documentation: http://yt-project.org/doc/\n",
+      " * The IRC Channel (`#yt` on `chat.freenode.net`, also at http://yt-project.org/irc.html)\n",
+      " * The `yt-users` mailing list, at http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org\n",
+      "\n",
+      "## Sources of Information\n",
+      "\n",
+      "The first place to go for information about any kind of development is BitBucket at https://bitbucket.org/yt_analysis/yt/ , which contains a bug tracker, the source code, and links to other useful places.\n",
+      "\n",
+      "You can find recipes in the documentation ( http://yt-project.org/doc/ ) under the \"Cookbook\" section.\n",
+      "\n",
+      "There is a portal with access to data and IPython notebooks at http://hub.yt-project.org/ .\n",
+      "\n",
+      "## How to Update yt\n",
+      "\n",
+      "If you ever run into a situation where you need to update your yt installation, simply type this on the command line:\n",
+      "\n",
+      "`yt update`\n",
+      "\n",
+      "This will automatically update it for you.\n",
+      "\n",
+      "## Acquiring the datasets for this tutorial\n",
+      "\n",
+      "To access the datasets that are used in these bootcamp tutorials, you can either download them manually at http://yt-project.org/data/.\n",
+      "\n",
+      "## What's Next?\n",
+      "\n",
+      "The Notebooks are meant to be explored in this order:\n",
+      "\n",
+      "1. Introduction\n",
+      "2. Data Inspection\n",
+      "3. Simple Visualization\n",
+      "4. Data Objects and Time Series\n",
+      "5. Derived Fields and Profiles\n",
+      "6. Volume Rendering"
+     ]
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/Simple_Visualization.ipynb
--- /dev/null
+++ b/source/bootcamp/Simple_Visualization.ipynb
@@ -0,0 +1,274 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# Simple Visualizations of Data\n",
+      "\n",
+      "Just like in our first notebook, we have to load yt and then some data."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For this notebook, we'll load up a cosmology dataset."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "print \"Redshift =\", pf.current_redshift"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In the terms that yt uses, a projection is a line integral through the domain.  This can either be unweighted (in which case a column density is returned) or weighted, in which case an average value is returned.  Projections are, like all other data objects in yt, full-fledged data objects that churn through data and present that to you.  However, we also provide a simple method of creating Projections and plotting them in a single step.  This is called a Plot Window, here specifically known as a `ProjectionPlot`.  One thing to note is that in yt, we project all the way through the entire domain at a single time.  This means that the first call to projecting can be somewhat time consuming, but panning, zooming and plotting are all quite fast.\n",
+      "\n",
+      "yt is designed to make it easy to make nice plots and straightforward to modify those plots directly.  The cookbook in the documentation includes detailed examples of this."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p = ProjectionPlot(pf, \"y\", \"Density\")\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The `show` command simply sends the plot to the IPython notebook.  You can also call `p.save()` which will save the plot to the file system.  This function accepts an argument, which will be pre-prended to the filename and can be used to name it based on the width or to supply a location.\n",
+      "\n",
+      "Now we'll zoom and pan a bit."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(2.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.pan_rel((0.1, 0.0))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(10.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.pan_rel((-0.25, -0.5))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.zoom(0.1)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we specify multiple fields, each time we call `show` we get multiple plots back.  Same for `save`!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p = ProjectionPlot(pf, \"z\", [\"Density\", \"Temperature\"], weight_field=\"Density\")\n",
+      "p.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can adjust the colormap on a field-by-field basis."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "p.set_cmap(\"Temperature\", \"hot\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "And, we can re-center the plot on different locations.  One possible use of this would be to make a single `ProjectionPlot` which you move around to look at different regions in your simulation, saving at each one."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "v, c = pf.h.find_max(\"Density\")\n",
+      "p.set_center((c[0], c[1]))\n",
+      "p.zoom(10)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Okay, let's load up a bigger simulation (from `Enzo_64` this time) and make a slice plot."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load(\"Enzo_64/DD0043/data0043\")\n",
+      "s = SlicePlot(pf, \"z\", [\"Density\", \"VelocityMagnitude\"], center=\"max\")\n",
+      "s.set_cmap(\"VelocityMagnitude\", \"kamae\")\n",
+      "s.zoom(10.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can adjust the logging of various fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.set_log(\"VelocityMagnitude\", True)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "yt provides many different annotations for your plots.  You can see all of these in the documentation, or if you type `s.annotate_` and press tab, a list will show up here.  We'll annotate with velocity arrows."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.annotate_velocity()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Contours can also be overlaid:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s = SlicePlot(pf, \"x\", [\"Density\"], center=\"max\")\n",
+      "s.annotate_contour(\"Temperature\")\n",
+      "s.zoom(2.5)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally, we can save out to the file system."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "s.save()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/Volume_Rendering.ipynb
--- /dev/null
+++ b/source/bootcamp/Volume_Rendering.ipynb
@@ -0,0 +1,95 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "# A Brief Demo of Volume Rendering\n",
+      "\n",
+      "This shows a small amount of volume rendering.  Really, just enough to get your feet wet!"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "pf = load(\"IsolatedGalaxy/galaxy0030/galaxy0030\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To create a volume rendering, we need a camera and a transfer function.  We'll use the `ColorTransferFunction`, which accepts (in log space) the minimum and maximum bounds of our transfer function.  This means behavior for data outside these values is undefined.\n",
+      "\n",
+      "We then add on \"layers\" like an onion.  This function can accept a width (here specified) in data units, and also a color map.  Here we add on four layers.\n",
+      "\n",
+      "Finally, we create a camera.  The focal point is `[0.5, 0.5, 0.5]`, the width is 20 kpc (including front-to-back integration) and we specify a transfer function.  Once we've done that, we call `show` to actually cast our rays and display them inline."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "tf = ColorTransferFunction((-28, -24))\n",
+      "tf.add_layers(4, w=0.01)\n",
+      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 20.0/pf['kpc'], 512, tf)\n",
+      "cam.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to apply a clipping, we can specify the `clip_ratio`.  This will clip the upper bounds to this value times the `std()` of the image array."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "cam.show(clip_ratio=4)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
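+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "If we want to keep an image on disk rather than just display it inline, `snapshot` casts the rays and writes the result out (a sketch; the filename is arbitrary):"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# Render and save the image, applying the same clipping as above\n",
+      "im = cam.snapshot(\"volume_render.png\", clip_ratio=4.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },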
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "There are several other options we can specify.  Note that here we have turned on the use of ghost zones, shortened the data interval for the transfer function, and widened our gaussian layers."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "tf = ColorTransferFunction((-28, -25))\n",
+      "tf.add_layers(4, w=0.03)\n",
+      "cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 20.0/pf['kpc'], 512, tf, no_ghost=False)\n",
+      "cam.show(clip_ratio=4.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/data_inspection.rst
--- /dev/null
+++ b/source/bootcamp/data_inspection.rst
@@ -0,0 +1,4 @@
+Data Inspection
+---------------
+
+.. notebook:: Data_Inspection.ipynb

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/data_objects_and_time_series.rst
--- /dev/null
+++ b/source/bootcamp/data_objects_and_time_series.rst
@@ -0,0 +1,4 @@
+Data Objects and Time Series
+----------------------------
+
+.. notebook:: Data_Objects_and_Time_Series.ipynb

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/derived_fields_and_profiles.rst
--- /dev/null
+++ b/source/bootcamp/derived_fields_and_profiles.rst
@@ -0,0 +1,4 @@
+Derived Fields and Profiles
+---------------------------
+
+.. notebook:: Derived_Fields_and_Profiles.ipynb

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/index.rst
--- /dev/null
+++ b/source/bootcamp/index.rst
@@ -0,0 +1,49 @@
+yt Bootcamp
+===========
+
+We have been developing a sequence of materials, run in the IPython notebook,
+that walk through how to look at data and how to operate on it.  These are
+not meant to be detailed walkthroughs, but simply short introductions.  Their
+purpose is to let you explore, interactively, some common operations that can
+be done on data with yt!
+
+To get started with the bootcamp, you need to download the repository and start
+the IPython notebook.  If you have Mercurial installed, the easiest way to get
+the repository is to run:
+
+.. code-block:: bash
+
+   hg clone https://bitbucket.org/yt_analysis/yt-doc
+
+If you don't have Mercurial, you can download the repository as a tarball from
+`here <https://bitbucket.org/yt_analysis/yt-doc/get/tip.tar.bz2>`_.
+
+Now you can start the IPython notebook and begin:
+
+.. code-block:: bash
+
+   cd yt-doc/source/bootcamp
+   yt notebook
+
+This command will give you information about the notebook server and how to
+access it.  Once you have connected, choose "Introduction" from the list of
+notebooks; it provides an overview of the bootcamp and information about how
+to download the sample data.
+
+.. warning:: The pre-filled notebooks are *far* less fun than running them
+             yourself!  Check out the repository and give it a try.
+
+Here are the notebooks, which have been filled in for inspection:
+
+.. toctree::
+   :maxdepth: 1
+
+   introduction
+   data_inspection
+   simple_visualization
+   data_objects_and_time_series
+   derived_fields_and_profiles
+   volume_rendering
+
+Let us know if you would like to contribute other example notebooks, or have
+any suggestions for how these can be improved.

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/introduction.rst
--- /dev/null
+++ b/source/bootcamp/introduction.rst
@@ -0,0 +1,4 @@
+Introduction
+------------
+
+.. notebook:: Introduction.ipynb

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/simple_visualization.rst
--- /dev/null
+++ b/source/bootcamp/simple_visualization.rst
@@ -0,0 +1,4 @@
+Simple Visualization
+--------------------
+
+.. notebook:: Simple_Visualization.ipynb

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/bootcamp/volume_rendering.rst
--- /dev/null
+++ b/source/bootcamp/volume_rendering.rst
@@ -0,0 +1,4 @@
+Volume Rendering
+----------------
+
+.. notebook:: Volume_Rendering.ipynb

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -30,7 +30,7 @@
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx',
               'sphinx.ext.pngmath', 'sphinx.ext.viewcode',
               'sphinx.ext.autosummary', 'numpydocmod', 'youtube',
-              'yt_cookbook', 'yt_colormaps']
+              'yt_cookbook', 'yt_colormaps', 'notebook_sphinxext']
 
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['_templates']

diff -r 11fea34c6f5ce7477f92d31b920613d1d345562b -r eea7a0e14c5fadc1436e347873d296d16cc83705 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -113,7 +113,7 @@
 
    welcome/index
    orientation/index
-   bootcamp
+   bootcamp/index
    workshop
    help/index
    interacting/index


https://bitbucket.org/yt_analysis/yt-doc/commits/1917a259f9ed/
Changeset:   1917a259f9ed
User:        jzuhone
Date:        2013-10-29 16:41:06
Summary:     Notebook-ified version of the S-Z docs.
Affected #:  2 files

diff -r eea7a0e14c5fadc1436e347873d296d16cc83705 -r 1917a259f9ed14de1932872283341d5ede77532a source/analysis_modules/SZ_projections.ipynb
--- /dev/null
+++ b/source/analysis_modules/SZ_projections.ipynb
@@ -0,0 +1,232 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "heading",
+     "level": 1,
+     "metadata": {},
+     "source": [
+      "Mock Observations of the Sunyaev-Zeldovich Effect"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The change in the CMB intensity due to Compton scattering of CMB\n",
+      "photons off of thermal electrons in galaxy clusters, otherwise known as the\n",
+      "Sunyaev-Zeldovich (S-Z) effect, can to a reasonable approximation be represented by a\n",
+      "projection of the pressure field of a cluster. However, the *full* S-Z signal is a combination of thermal and kinetic\n",
+      "contributions, and for large frequencies and high temperatures\n",
+      "relativistic effects are important. For computing the full S-Z signal\n",
+      "incorporating all of these effects, Jens Chluba has written a library:\n",
+      "SZpack ([Chluba et al 2012](http://adsabs.harvard.edu/abs/2012MNRAS.426..510C)). \n",
+      "\n",
+      "The `sunyaev_zeldovich` analysis module in `yt` makes it possible\n",
+      "to make projections of the full S-Z signal given the properties of the\n",
+      "thermal gas in the simulation using SZpack. SZpack has several different options for computing the S-Z signal, from full\n",
+      "integrations to very good approximations.  Since a full or even a\n",
+      "partial integration of the signal for each cell in the projection\n",
+      "would be prohibitively expensive, we use the method outlined in\n",
+      "[Chluba et al 2013](http://adsabs.harvard.edu/abs/2013MNRAS.430.3054C) to expand the\n",
+      "total S-Z signal in terms of moments of the projected optical depth $\\tau$, projected electron temperature $T_e$, and\n",
+      "velocities $\\beta_{c,\\parallel}$ and $\\beta_{c,\\perp}$ (their equation 18):"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "$$S(\\tau, T_{e},\\beta_{c,\\parallel},\\beta_{\\rm c,\\perp}) \\approx S_{\\rm iso}^{(0)} + S_{\\rm iso}^{(2)}\\omega^{(1)} + C_{\\rm iso}^{(1)}\\sigma^{(1)} + D_{\\rm iso}^{(2)}\\kappa^{(1)} + E_{\\rm iso}^{(2)}\\beta_{\\rm c,\\perp,SZ}^2 +~...$$\n"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`yt` makes projections of the various moments needed for the\n",
+      "calculation, and then the resulting projected fields are used to\n",
+      "compute the S-Z signal. In our implementation, the expansion is carried out to first-order\n",
+      "terms in $T_e$ and zeroth-order terms in $\\beta_{c,\\parallel}$ by default, but terms up to second-order in can be optionally\n",
+      "included. "
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Installing SZpack"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "SZpack can be downloaded [here](http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html). Make\n",
+      "sure you install a version later than v1.1.1. For computing the S-Z\n",
+      "integrals, SZpack requires the [GNU Scientific Library](http://www.gnu.org/software/gsl/). For compiling\n",
+      "the Python module, you need to have a recent version of [swig](http://www.swig.org>) installed. After running `make` in the top-level SZpack directory, you'll need to run it in the `python` subdirectory, which is the\n",
+      "location of the `SZpack` module. You may have to include this location in the `PYTHONPATH` environment variable.\n"
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Creating S-Z Projections"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once you have SZpack installed, making S-Z projections from ``yt``\n",
+      "datasets is fairly straightforward:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "from yt.analysis_modules.api import SZProjection\n",
+      "\n",
+      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"/enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "\n",
+      "freqs = [90.,180.,240.]\n",
+      "szprj = SZProjection(pf, freqs)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`freqs` is a list or array of frequencies in GHz at which the signal\n",
+      "is to be computed. The `SZProjection` constructor also accepts the\n",
+      "optional keywords, **mue** (mean molecular weight for computing the\n",
+      "electron number density, 1.143 is the default) and **high_order** (set\n",
+      "to True to compute terms in the S-Z signal expansion up to\n",
+      "second-order in $T_{e,SZ}$ and $\\beta$). "
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once you have created the `SZProjection` object, you can use it to\n",
+      "make on-axis and off-axis projections:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# An on-axis projection along the z-axis with width 10 Mpc, centered on the gas density maximum\n",
+      "szprj.on_axis(\"z\", center=\"max\", width=(10.0, \"mpc\"), nx=400)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To make an off-axis projection, `szprj.off_axis` is called in the same way, except that the first argument is a three-component normal vector. \n",
+      "\n",
+      "Currently, only one projection can be in memory at once. These methods\n",
+      "create images of the projected S-Z signal at each requested frequency,\n",
+      "which can be accessed dict-like from the projection object (e.g.,\n",
+      "`szprj[\"90_GHz\"]`). Projections of other quantities may also be\n",
+      "accessed; to see what fields are available call `szprj.keys()`. The methods also accept standard ``yt``\n",
+      "keywords for projections such as **center**, **width**, and **source**. The image buffer size can be controlled by setting **nx**.  \n"
+     ]
+    },
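+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For example (adapted from an earlier version of this document; the normal vector here is arbitrary):"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# An off-axis projection along an arbitrary normal vector, centered at the\n",
+      "# maximum gas density, with a width of 6000 kpc\n",
+      "L = np.array([0.1, -0.1, 0.3])\n",
+      "szprj.off_axis(L, center=\"max\", width=(6000., \"kpc\"))\n",
+      "print szprj.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },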
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Writing out the S-Z Projections"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "You may want to output the S-Z images to figures suitable for\n",
+      "inclusion in a paper, or save them to disk for later use. There are a\n",
+      "few methods included for this purpose. For PNG figures with a colorbar\n",
+      "and axes, use `write_png`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj.write_png(\"SZ_example\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For simple output of the image data to disk, call `write_hdf5`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj.write_hdf5(\"SZ_example.h5\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally, for output to FITS files which can be opened or analyzed\n",
+      "using other programs (such as ds9), call `export_fits`."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj.write_fits(\"SZ_example.fits\", clobber=True)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "which would write all of the projections to a single FITS file named `\"SZexample.fits\"`,\n",
+      "including coordinate information in kpc. The optional keyword\n",
+      "**clobber** allows a previous file to be overwritten. \n"
+     ]
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r eea7a0e14c5fadc1436e347873d296d16cc83705 -r 1917a259f9ed14de1932872283341d5ede77532a source/analysis_modules/sunyaev_zeldovich.rst
--- a/source/analysis_modules/sunyaev_zeldovich.rst
+++ b/source/analysis_modules/sunyaev_zeldovich.rst
@@ -1,143 +1,4 @@
-.. _sunyaev_zeldovich:
+Mock Observations of the Sunyaev-Zeldovich Effect
+-------------------------------------------------
 
-Mock Observations of the Sunyaev-Zeldovich Effect
-=================================
-.. sectionauthor:: John ZuHone <jzuhone at gmail.com>
-
-The change in the CMB intensity due to Compton scattering of CMB
-photons off of thermal electrons in galaxy clusters, otherwise known as the
-Sunyaev-Zeldovich (S-Z) effect, can to a reasonable approximation be represented by a
-projection of the pressure field of a cluster. However, the `full` S-Z signal is a combination of thermal and kinetic
-contributions, and for large frequencies and high temperatures
-relativistic effects are important. For computing the full S-Z signal
-incorporating all of these effects, Jens Chluba has written a library:
-SZpack (`Chluba et al 2012 <http://adsabs.harvard.edu/abs/2012MNRAS.426..510C>`_). 
-
-The ``sunyaev_zeldovich`` analysis module in ``yt`` makes it possible
-to make projections of the full S-Z signal given the properties of the
-thermal gas in the simulation using SZpack. SZpack has several different options for computing the S-Z signal, from full
-integrations to very good approximations.  Since a full or even a
-partial integration of the signal for each cell in the projection
-would be prohibitively expensive, we use the method outlined in
-`Chluba et al 2013 <http://adsabs.harvard.edu/abs/2013MNRAS.430.3054C>`_ to expand the
-total S-Z signal in terms of moments of the projected optical depth
-:math:`\tau`, projected electron temperature :math:`T_e`, and
-velocities :math:`\beta_{c,\parallel})` and :math:`\beta_{c,\perp})` (their equation 18):
-
-.. math::
-  S(\tau, T_{e},\beta_{c,\parallel},\beta_{\rm
-  c,\perp}) \approx S_{\rm iso}^{(0)} +
-  S_{\rm iso}^{(2)}\omega^{(1)} + C_{\rm iso}^{(1)}\sigma^{(1)} +
-  D_{\rm iso}^{(2)}\kappa^{(1)} + E_{\rm iso}^{(2)}\beta_{\rm
-  c,\perp,SZ}^2 + ...
-
-`yt` makes projections of the various moments needed for the
-calculation, and then the resulting projected fields are used to
-compute the S-Z signal. In our implementation, the expansion is carried out to first-order
-terms in :math:`T_e` and zeroth-order terms in
-:math:`\beta_{c,\parallel}` by default, but terms up to second-order in can be optionally
-included. 
- 
-Installing SZpack
--------------------------------
-
-SZpack can be downloaded `here
-<http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html>`_. Make
-sure you install a version later than v1.1.1. For computing the S-Z
-integrals, SZpack requires the `GNU Scientific Library <http://www.gnu.org/software/gsl/>`_. For compiling
-the Python module, you need to have a recent version of `swig <http://www.swig.org>`_
-installed. After running ``make`` in the top-level SZpack directory,
-you'll need to run it in the ``python`` subdirectory, which is the
-location of the ``SZpack`` module. You may have to include this location in the ``PYTHONPATH`` environment variable.
-
-Creating S-Z Projections
--------------------------------
-
-Once you have ``SZpack`` installed, making S-Z projections from ``yt``
-datasets is fairly straightforward:
-
-.. code-block:: python
-
-  from yt.analysis_modules.api import SZProjection
-
-  pf = load("fiducial_1to10_b0_hdf5_plt_cnt_0115.gz")
-  freqs = [90., 180., 240.]
-  szprj = SZProjection(pf, freqs)
-
-``freqs`` is a list or array of frequencies in GHz at which the signal
-is to be computed. The ``SZProjection`` constructor also accepts the
-optional keywords, **mue** (mean molecular weight for computing the
-electron number density, 1.143 is the default) and **high_order** (set
-to True to compute terms in the S-Z signal expansion up to
-second-order in T_eSZ and \beta). 
-
-Once you have created the ``SZProjection`` object, you can use it to
-make on-axis and off-axis projections:
-
-.. code-block:: python
-
-  # An on-axis projection along the z-axis with width 7 Mpc
-  szprj.on_axis("z", width=(7.0, "mpc"))
-  # An off-axis projection along a normal vector centered at the
-  # maximum gas density with a width of 6000 kpc
-  L = np.array([0.1,-0.1,0.3])
-  szprj.off_axis(L, center="max", width=(6000., "kpc"))
-
-Currently, only one projection can be in memory at once. These methods
-create images of the projected S-Z signal at each requested frequency,
-which can be accessed dict-like from the projection object (e.g.,
-``szprj["90_GHz"]``). Projections of other quantities may also be
-accessed; to see what fields are available call ``szprj.keys()``. The methods also accept standard ``yt``
-keywords for projections such as **center**, **width**, and **source**. The image buffer size can be controlled by setting **nx**.  
-
-Writing out the S-Z Projections
--------------------------------
-
-You may want to output the S-Z images to figures suitable for
-inclusion in a paper, or save them to disk for later use. There are a
-few methods included for this purpose. For PNG figures with a colorbar
-and axes, use ``write_png``:
-
-.. code-block:: python
-
-  szprj.write_png("SZbullet")
-
-which would result in the following images of the S-Z signal for our previous on-axis
-example:
-
-.. image:: _images/SZbullet_90_GHz.png
-   :width: 500
-
-.. image:: _images/SZbullet_180_GHz.png
-   :width: 500
-
-.. image:: _images/SZbullet_240_GHz.png
-   :width: 500
-
-along with projections of the optical depth and the mass-weighted
-temperature:
-
-.. image:: _images/SZbullet_Tau.png
-   :width: 500
-
-.. image:: _images/SZbullet_TeSZ.png
-   :width: 500
-
-For simple output of the image data to disk, call ``write_hdf5``:
-
-.. code-block:: python
-
-  szprj.write_hdf5("SZbullet.h5")
-
-Finally, for output to FITS files which can be opened or analyzed
-using other programs (such as ds9), call ``export_fits``.
-
-.. code-block:: python
-
-  szprj.write_fits("SZbullet", clobber=True)
-
-which would write all of the projections to a single FITS file named ``"SZbullet.fits"``,
-including coordinate information in kpc. The optional keyword
-**clobber** allows a previous file to be overwritten. 
-
-.. note:: To write out a FITS file, you must install the `pyfits <http://www.stsci.edu/resources/software_hardware/pyfits>`_ or the `AstroPy <http://www.astropy.org>`_ module.
+.. notebook:: SZ_projections.ipynb


https://bitbucket.org/yt_analysis/yt-doc/commits/2227c183b825/
Changeset:   2227c183b825
User:        jzuhone
Date:        2013-10-29 20:34:32
Summary:     S-Z projection in notebook form.
Affected #:  3 files

diff -r bd4a2fec8113bcc75582643466466d7a0ede91bd -r 2227c183b825d79705299accdbd87be1b636d82b source/analyzing/analysis_modules/SZ_projections.ipynb
--- /dev/null
+++ b/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -0,0 +1,224 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The change in the CMB intensity due to Compton scattering of CMB\n",
+      "photons off of thermal electrons in galaxy clusters, otherwise known as the\n",
+      "Sunyaev-Zeldovich (S-Z) effect, can to a reasonable approximation be represented by a\n",
+      "projection of the pressure field of a cluster. However, the *full* S-Z signal is a combination of thermal and kinetic\n",
+      "contributions, and for large frequencies and high temperatures\n",
+      "relativistic effects are important. For computing the full S-Z signal\n",
+      "incorporating all of these effects, Jens Chluba has written a library:\n",
+      "SZpack ([Chluba et al 2012](http://adsabs.harvard.edu/abs/2012MNRAS.426..510C)). \n",
+      "\n",
+      "The `sunyaev_zeldovich` analysis module in `yt` makes it possible\n",
+      "to make projections of the full S-Z signal given the properties of the\n",
+      "thermal gas in the simulation using SZpack. SZpack has several different options for computing the S-Z signal, from full\n",
+      "integrations to very good approximations.  Since a full or even a\n",
+      "partial integration of the signal for each cell in the projection\n",
+      "would be prohibitively expensive, we use the method outlined in\n",
+      "[Chluba et al 2013](http://adsabs.harvard.edu/abs/2013MNRAS.430.3054C) to expand the\n",
+      "total S-Z signal in terms of moments of the projected optical depth $\\tau$, projected electron temperature $T_e$, and\n",
+      "velocities $\\beta_{c,\\parallel}$ and $\\beta_{c,\\perp}$ (their equation 18):"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "$$S(\\tau, T_{e},\\beta_{c,\\parallel},\\beta_{\\rm c,\\perp}) \\approx S_{\\rm iso}^{(0)} + S_{\\rm iso}^{(2)}\\omega^{(1)} + C_{\\rm iso}^{(1)}\\sigma^{(1)} + D_{\\rm iso}^{(2)}\\kappa^{(1)} + E_{\\rm iso}^{(2)}\\beta_{\\rm c,\\perp,SZ}^2 +~...$$\n"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`yt` makes projections of the various moments needed for the\n",
+      "calculation, and then the resulting projected fields are used to\n",
+      "compute the S-Z signal. In our implementation, the expansion is carried out to first-order\n",
+      "terms in $T_e$ and zeroth-order terms in $\\beta_{c,\\parallel}$ by default, but terms up to second-order in can be optionally\n",
+      "included. "
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Installing SZpack"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "SZpack can be downloaded [here](http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html). Make\n",
+      "sure you install a version later than v1.1.1. For computing the S-Z\n",
+      "integrals, SZpack requires the [GNU Scientific Library](http://www.gnu.org/software/gsl/). For compiling\n",
+      "the Python module, you need to have a recent version of [swig](http://www.swig.org>) installed. After running `make` in the top-level SZpack directory, you'll need to run it in the `python` subdirectory, which is the\n",
+      "location of the `SZpack` module. You may have to include this location in the `PYTHONPATH` environment variable.\n"
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Creating S-Z Projections"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once you have SZpack installed, making S-Z projections from ``yt``\n",
+      "datasets is fairly straightforward:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "from yt.analysis_modules.api import SZProjection\n",
+      "\n",
+      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"/enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "\n",
+      "freqs = [90.,180.,240.]\n",
+      "szprj = SZProjection(pf, freqs)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`freqs` is a list or array of frequencies in GHz at which the signal\n",
+      "is to be computed. The `SZProjection` constructor also accepts the\n",
+      "optional keywords, **mue** (mean molecular weight for computing the\n",
+      "electron number density, 1.143 is the default) and **high_order** (set\n",
+      "to True to compute terms in the S-Z signal expansion up to\n",
+      "second-order in $T_{e,SZ}$ and $\\beta$). "
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once you have created the `SZProjection` object, you can use it to\n",
+      "make on-axis and off-axis projections:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# An on-axis projection along the z-axis with width 10 Mpc, centered on the gas density maximum\n",
+      "szprj.on_axis(\"z\", center=\"max\", width=(10.0, \"mpc\"), nx=400)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To make an off-axis projection, `szprj.off_axis` is called in the same way, except that the first argument is a three-component normal vector. \n",
+      "\n",
+      "Currently, only one projection can be in memory at once. These methods\n",
+      "create images of the projected S-Z signal at each requested frequency,\n",
+      "which can be accessed dict-like from the projection object (e.g.,\n",
+      "`szprj[\"90_GHz\"]`). Projections of other quantities may also be\n",
+      "accessed; to see what fields are available call `szprj.keys()`. The methods also accept standard ``yt``\n",
+      "keywords for projections such as **center**, **width**, and **source**. The image buffer size can be controlled by setting **nx**.  \n"
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Writing out the S-Z Projections"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "You may want to output the S-Z images to figures suitable for\n",
+      "inclusion in a paper, or save them to disk for later use. There are a\n",
+      "few methods included for this purpose. For PNG figures with a colorbar\n",
+      "and axes, use `write_png`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj.write_png(\"SZ_example\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For simple output of the image data to disk, call `write_hdf5`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj.write_hdf5(\"SZ_example.h5\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
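+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "As a quick check (a sketch, assuming you have `h5py` installed), the saved file can be read back and its datasets listed:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import h5py\n",
+      "# List the datasets stored in the file\n",
+      "f = h5py.File(\"SZ_example.h5\", \"r\")\n",
+      "print list(f)\n",
+      "f.close()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },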
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally, for output to FITS files which can be opened or analyzed\n",
+      "using other programs (such as ds9), call `export_fits`."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj.write_fits(\"SZ_example.fits\", clobber=True)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "which would write all of the projections to a single FITS file named `\"SZexample.fits\"`,\n",
+      "including coordinate information in kpc. The optional keyword\n",
+      "**clobber** allows a previous file to be overwritten. \n"
+     ]
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r bd4a2fec8113bcc75582643466466d7a0ede91bd -r 2227c183b825d79705299accdbd87be1b636d82b source/analyzing/analysis_modules/index.rst
--- a/source/analyzing/analysis_modules/index.rst
+++ b/source/analyzing/analysis_modules/index.rst
@@ -28,6 +28,7 @@
    ellipsoid_analysis
    xray_emission_fields
    radmc3d_export
+   sunyaev_zeldovich
 
 General Analysis Modules
 ------------------------
@@ -37,3 +38,4 @@
 
    two_point_functions
    clump_finding
+

diff -r bd4a2fec8113bcc75582643466466d7a0ede91bd -r 2227c183b825d79705299accdbd87be1b636d82b source/analyzing/analysis_modules/sunyaev_zeldovich.rst
--- /dev/null
+++ b/source/analyzing/analysis_modules/sunyaev_zeldovich.rst
@@ -0,0 +1,4 @@
+Mock Observations of the Sunyaev-Zeldovich Effect
+-------------------------------------------------
+
+.. notebook:: SZ_projections.ipynb


https://bitbucket.org/yt_analysis/yt-doc/commits/9264483fcfe2/
Changeset:   9264483fcfe2
User:        jzuhone
Date:        2013-10-29 20:43:13
Summary:     Getting rid of the extra line
Affected #:  1 file

diff -r 2227c183b825d79705299accdbd87be1b636d82b -r 9264483fcfe2ff294532d381006f7eb6f7d4c140 source/analyzing/analysis_modules/index.rst
--- a/source/analyzing/analysis_modules/index.rst
+++ b/source/analyzing/analysis_modules/index.rst
@@ -38,4 +38,3 @@
 
    two_point_functions
    clump_finding
-


https://bitbucket.org/yt_analysis/yt-doc/commits/1015a8a45a11/
Changeset:   1015a8a45a11
User:        jzuhone
Date:        2013-10-29 21:03:41
Summary:     Removed an extraneous phrase
Affected #:  1 file

diff -r 9264483fcfe2ff294532d381006f7eb6f7d4c140 -r 1015a8a45a11e9a483755a8a943758fc174e04b4 source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -212,7 +212,7 @@
      "cell_type": "markdown",
      "metadata": {},
      "source": [
-      "which would write all of the projections to a single FITS file named `\"SZexample.fits\"`,\n",
+      "which would write all of the projections to a single FITS file,\n",
       "including coordinate information in kpc. The optional keyword\n",
       "**clobber** allows a previous file to be overwritten. \n"
      ]


https://bitbucket.org/yt_analysis/yt-doc/commits/ff7fa3bcc62b/
Changeset:   ff7fa3bcc62b
User:        chummels
Date:        2013-10-30 05:26:15
Summary:     Merged in jzuhone/yt-doc_chummels (pull request #1)

S-Z analysis doc and notebook
Affected #:  3 files

diff -r 45b305e19ab8133b7086c3a1df0df559dd524353 -r ff7fa3bcc62b1983de717477db367adf36604d32 source/analyzing/analysis_modules/SZ_projections.ipynb
--- /dev/null
+++ b/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -0,0 +1,224 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The change in the CMB intensity due to Compton scattering of CMB\n",
+      "photons off of thermal electrons in galaxy clusters, otherwise known as the\n",
+      "Sunyaev-Zeldovich (S-Z) effect, can to a reasonable approximation be represented by a\n",
+      "projection of the pressure field of a cluster. However, the *full* S-Z signal is a combination of thermal and kinetic\n",
+      "contributions, and for large frequencies and high temperatures\n",
+      "relativistic effects are important. For computing the full S-Z signal\n",
+      "incorporating all of these effects, Jens Chluba has written a library:\n",
+      "SZpack ([Chluba et al 2012](http://adsabs.harvard.edu/abs/2012MNRAS.426..510C)). \n",
+      "\n",
+      "The `sunyaev_zeldovich` analysis module in `yt` makes it possible\n",
+      "to make projections of the full S-Z signal given the properties of the\n",
+      "thermal gas in the simulation using SZpack. SZpack has several different options for computing the S-Z signal, from full\n",
+      "integrations to very good approximations.  Since a full or even a\n",
+      "partial integration of the signal for each cell in the projection\n",
+      "would be prohibitively expensive, we use the method outlined in\n",
+      "[Chluba et al 2013](http://adsabs.harvard.edu/abs/2013MNRAS.430.3054C) to expand the\n",
+      "total S-Z signal in terms of moments of the projected optical depth $\\tau$, projected electron temperature $T_e$, and\n",
+      "velocities $\\beta_{c,\\parallel}$ and $\\beta_{c,\\perp}$ (their equation 18):"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "$$S(\\tau, T_{e},\\beta_{c,\\parallel},\\beta_{\\rm c,\\perp}) \\approx S_{\\rm iso}^{(0)} + S_{\\rm iso}^{(2)}\\omega^{(1)} + C_{\\rm iso}^{(1)}\\sigma^{(1)} + D_{\\rm iso}^{(2)}\\kappa^{(1)} + E_{\\rm iso}^{(2)}\\beta_{\\rm c,\\perp,SZ}^2 +~...$$\n"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`yt` makes projections of the various moments needed for the\n",
+      "calculation, and then the resulting projected fields are used to\n",
+      "compute the S-Z signal. In our implementation, the expansion is carried out to first-order\n",
+      "terms in $T_e$ and zeroth-order terms in $\\beta_{c,\\parallel}$ by default, but terms up to second-order in can be optionally\n",
+      "included. "
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Installing SZpack"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "SZpack can be downloaded [here](http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html). Make\n",
+      "sure you install a version later than v1.1.1. For computing the S-Z\n",
+      "integrals, SZpack requires the [GNU Scientific Library](http://www.gnu.org/software/gsl/). For compiling\n",
+      "the Python module, you need to have a recent version of [swig](http://www.swig.org>) installed. After running `make` in the top-level SZpack directory, you'll need to run it in the `python` subdirectory, which is the\n",
+      "location of the `SZpack` module. You may have to include this location in the `PYTHONPATH` environment variable.\n"
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Creating S-Z Projections"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once you have SZpack installed, making S-Z projections from ``yt``\n",
+      "datasets is fairly straightforward:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "from yt.analysis_modules.api import SZProjection\n",
+      "\n",
+      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"/enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "\n",
+      "freqs = [90.,180.,240.]\n",
+      "szprj = SZProjection(pf, freqs)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`freqs` is a list or array of frequencies in GHz at which the signal\n",
+      "is to be computed. The `SZProjection` constructor also accepts the\n",
+      "optional keywords, **mue** (mean molecular weight for computing the\n",
+      "electron number density, 1.143 is the default) and **high_order** (set\n",
+      "to True to compute terms in the S-Z signal expansion up to\n",
+      "second-order in $T_{e,SZ}$ and $\\beta$). "
+     ]
+    },
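+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For instance, a sketch with both keywords made explicit (`mue` is just the default value restated):"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj = SZProjection(pf, freqs, mue=1.143, high_order=True)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },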
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Once you have created the `SZProjection` object, you can use it to\n",
+      "make on-axis and off-axis projections:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "# An on-axis projection along the z-axis with width 10 Mpc, centered on the gas density maximum\n",
+      "szprj.on_axis(\"z\", center=\"max\", width=(10.0, \"mpc\"), nx=400)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To make an off-axis projection, `szprj.off_axis` is called in the same way, except that the first argument is a three-component normal vector. \n",
+      "\n",
+      "Currently, only one projection can be in memory at once. These methods\n",
+      "create images of the projected S-Z signal at each requested frequency,\n",
+      "which can be accessed dict-like from the projection object (e.g.,\n",
+      "`szprj[\"90_GHz\"]`). Projections of other quantities may also be\n",
+      "accessed; to see what fields are available call `szprj.keys()`. The methods also accept standard ``yt``\n",
+      "keywords for projections such as **center**, **width**, and **source**. The image buffer size can be controlled by setting **nx**.  \n"
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Writing out the S-Z Projections"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "You may want to output the S-Z images to figures suitable for\n",
+      "inclusion in a paper, or save them to disk for later use. There are a\n",
+      "few methods included for this purpose. For PNG figures with a colorbar\n",
+      "and axes, use `write_png`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj.write_png(\"SZ_example\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "For simple output of the image data to disk, call `write_hdf5`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj.write_hdf5(\"SZ_example.h5\")"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally, for output to FITS files which can be opened or analyzed\n",
+      "using other programs (such as ds9), call `export_fits`."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "szprj.write_fits(\"SZ_example.fits\", clobber=True)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "which would write all of the projections to a single FITS file,\n",
+      "including coordinate information in kpc. The optional keyword\n",
+      "**clobber** allows a previous file to be overwritten. \n"
+     ]
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 45b305e19ab8133b7086c3a1df0df559dd524353 -r ff7fa3bcc62b1983de717477db367adf36604d32 source/analyzing/analysis_modules/index.rst
--- a/source/analyzing/analysis_modules/index.rst
+++ b/source/analyzing/analysis_modules/index.rst
@@ -28,6 +28,7 @@
    ellipsoid_analysis
    xray_emission_fields
    radmc3d_export
+   sunyaev_zeldovich
 
 General Analysis Modules
 ------------------------

diff -r 45b305e19ab8133b7086c3a1df0df559dd524353 -r ff7fa3bcc62b1983de717477db367adf36604d32 source/analyzing/analysis_modules/sunyaev_zeldovich.rst
--- /dev/null
+++ b/source/analyzing/analysis_modules/sunyaev_zeldovich.rst
@@ -0,0 +1,4 @@
+Mock Observations of the Sunyaev-Zeldovich Effect
+-------------------------------------------------
+
+.. notebook:: SZ_projections.ipynb


https://bitbucket.org/yt_analysis/yt-doc/commits/1bf90b5dd8d8/
Changeset:   1bf90b5dd8d8
User:        ngoldbaum
Date:        2013-10-30 06:04:17
Summary:     Using the new athena dataset for the MHD callback.
Affected #:  1 file

diff -r 9bd2eefa636425e41b4cae73e20dd8a87d7a772b -r 1bf90b5dd8d842123e5c5161d37e8b231af05028 source/visualizing/_cb_docstrings.inc
--- a/source/visualizing/_cb_docstrings.inc
+++ b/source/visualizing/_cb_docstrings.inc
@@ -189,11 +189,13 @@
    features to be more clearly seen for fields with
    substantial variation in field strength.
 
-.. code-block:: python
+.. python-script::
 
    from yt.mods import *
-   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
-   p = ProjectionPlot(pf, 'z', 'Density', center='c', width=(20, 'kpc'))
+   pf = load("MHDSloshing/virgo_low_res.0054.vtk",
+             parameters={"TimeUnits":3.1557e13, "LengthUnits":3.0856e24,
+                         "DensityUnits":6.770424595218825e-27})
+   p = ProjectionPlot(pf, 'z', 'Density', center='c', width=(300, 'kpc'))
    p.annotate_magnetic_field()
    p.save()
 


https://bitbucket.org/yt_analysis/yt-doc/commits/500eed20fe41/
Changeset:   500eed20fe41
User:        chummels
Date:        2013-10-30 06:15:44
Summary:     Merging.
Affected #:  1 file

diff -r ff7fa3bcc62b1983de717477db367adf36604d32 -r 500eed20fe413592cf3c081a292ce623c885a0ef source/visualizing/_cb_docstrings.inc
--- a/source/visualizing/_cb_docstrings.inc
+++ b/source/visualizing/_cb_docstrings.inc
@@ -189,11 +189,13 @@
    features to be more clearly seen for fields with
    substantial variation in field strength.
 
-.. code-block:: python
+.. python-script::
 
    from yt.mods import *
-   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
-   p = ProjectionPlot(pf, 'z', 'Density', center='c', width=(20, 'kpc'))
+   pf = load("MHDSloshing/virgo_low_res.0054.vtk",
+             parameters={"TimeUnits":3.1557e13, "LengthUnits":3.0856e24,
+                         "DensityUnits":6.770424595218825e-27})
+   p = ProjectionPlot(pf, 'z', 'Density', center='c', width=(300, 'kpc'))
    p.annotate_magnetic_field()
    p.save()
 


https://bitbucket.org/yt_analysis/yt-doc/commits/a5de37401519/
Changeset:   a5de37401519
User:        chummels
Date:        2013-10-30 07:03:11
Summary:     Updating install instructions to cover removal.
Affected #:  1 file

diff -r 500eed20fe413592cf3c081a292ce623c885a0ef -r a5de37401519993756246741a29f36374f472987 source/installing.rst
--- a/source/installing.rst
+++ b/source/installing.rst
@@ -1,6 +1,10 @@
+.. _getting-and-installing-yt:
+
 Getting and Installing yt
 =========================
 
+.. _getting-yt:
+
 Getting yt
 ----------
 
@@ -10,15 +14,22 @@
 be time-consuming, yt provides an installation script which downloads and builds
 a fully-isolated Python + Numpy + Matplotlib + HDF5 + Mercurial installation.  
 yt supports Linux and OSX deployment, with the possibility of deployment on 
-other Unix-like systems (XSEDE resources, clusters, etc.).  Windows is not
+other Unix-like systems (XSEDE resources, clusters, etc.).  Windows is not 
 supported.
 
+Since the install is fully-isolated, if you get tired of having yt on your
+system, you can just delete its directory, and yt and all of its
+dependencies will be removed cleanly, with no scattered files left behind.
+
 To get the installation script, download it from:
 
 .. code-block:: bash
 
   http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
 
+.. _installing-yt:
+
 Installing yt
 -------------
 
@@ -44,6 +55,8 @@
 potentially figure out what went wrong.  If you have problems, though, do not 
 hesitate to :ref:`contact us <asking-for-help>` for assistance.
 
+.. _activating-yt:
+
 Activating Your Installation
 ----------------------------
 
@@ -86,6 +99,8 @@
 If you choose this installation method, you do not need to run the activation
 script as it is unnecessary.
 
+.. _testing-installation:
+
 Testing Your Installation
 -------------------------
 
@@ -103,6 +118,8 @@
 Do not hesitate to :ref:`contact us <asking-for-help>` so we can help you 
 figure it out.
 
+.. _updating-yt:
+
 Updating yt and its dependencies
 --------------------------------
 
@@ -120,3 +137,15 @@
 .. code-block:: bash
 
   $ yt update --all
+
+.. _removing-yt:
+
+Removing yt and its dependencies
+--------------------------------
+
+Because yt and its dependencies are installed in an isolated directory when
+you use the script installer, you can easily remove yt and all of its 
+dependencies cleanly.  Simply remove the install directory and its 
+subdirectories and you're done.  If you *really* had problems with the
+code, this is a last resort for solving them: remove the directory and then
+fully :ref:`re-install <installing-yt>` from the install script.


https://bitbucket.org/yt_analysis/yt-doc/commits/2e0f9b675643/
Changeset:   2e0f9b675643
User:        chummels
Date:        2013-10-30 07:48:45
Summary:     Updating help file.
Affected #:  1 file

diff -r a5de37401519993756246741a29f36374f472987 -r 2e0f9b6756437c1249ac2c22b83e153208d5fe3a source/help/index.rst
--- a/source/help/index.rst
+++ b/source/help/index.rst
@@ -1,16 +1,40 @@
 .. _asking-for-help:
 
-How to Get Help
-===============
+What to do if you run into problems
+===================================
 
-If you run into problems with ``yt``, you should feel **encouraged** to ask for
-help -- whether this comes in the form of reporting a bug or emailing the
-mailing list.  If something doesn't work for you, it's in everyone's best
-interests to make sure that it gets fixed.
+If you run into problems with ``yt``, there are a number of steps to follow
+to come to a solution.  The first handful of options are things you can do 
+on your own, but if those don't yield results, we have provided a number of 
+ways to connect with our community of users and developers to solve the 
+problem together.
+
+To summarize, here are the steps in order:
+
 + #. Don't panic and don't give up
+ #. Update to the latest version
+ #. Search the yt documentation and mailing list archives
+ #. Look at the yt source
+ #. Isolate & document your problem 
+ #. Go on IRC and ask a question
+ #. Ask the mailing list
+ #. Submit a bug report
+
+.. _dont-panic:
+
+Don't panic and don't give up
+-----------------------------
+
+This may seem silly, but it's effective.  While yt is a robust code with
+lots of functionality, like all actively-developed codes it sometimes has
+bugs.  Chances are good that your problem has a quick fix, either because
+someone encountered it before and fixed it, because the documentation is
+out of date, or because some other simple solution exists.  Don't give up!
+We want to help you succeed!
 
 .. _update-the-code:
 
-Try Updating yt
+Try updating yt
 ---------------
 
 Sometimes the pace of development is pretty fast on yt, particularly in the
@@ -31,8 +55,8 @@
 
 .. _search-the-documentation:
 
-Search the Documentation
-------------------------
+Search the documentation and mailing lists
+------------------------------------------
 
 The documentation has a lot of the answers to everyday problems.  This doesn't 
 mean you have to read all of the docs top-to-bottom, but you should at least 
@@ -40,11 +64,6 @@
 on the search field to the right of this window and enter your text.  Another 
 good place to look for answers in the documentation is our :ref:`faq` page.
 
-.. _mailing-list:
-
-Search/Ask the Mailing List
----------------------------
-
 OK, so there was no obvious solution to your problem in the documentation.  
 It is possible that someone else experienced the problem before you did, and
 wrote to the mailing list about it.  You can easily check the mailing list 
@@ -63,7 +82,79 @@
    </form><script type="text/javascript" src="http://www.google.com/cse/brand?form=cse-search-box&lang=en"></script>
 
-If you didn't find any hint of a solution in the archive, then feel free to 
+.. _look-at-the-source:
+
+Look at the source code
+-----------------------
+
+We've done our best to make the source clean, and it is easily searchable from
+your computer.  Go to the ``$YT_DEST/src/yt-hg/yt`` directory, where all the
+code lives.  You can then search for the class, function, or keyword which is
+giving you problems with ``grep -r *``, which will recursively search
+throughout the code base.  (For a much faster and cleaner experience, we
+recommend ``grin`` instead of ``grep -r *``.  To install ``grin`` with Python,
+just type ``pip install grin``.)
+
+So let's say that pesky ``SlicePlot`` is giving you problems still, and you 
+want to look at the source to figure out what is going on.
+
+.. code-block:: bash
+
+  $ cd $YT_DEST/src/yt-hg/yt
+  $ grep -r SlicePlot *         (or $ grin SlicePlot)
+  
+   data_objects/analyzer_objects.py:class SlicePlotDataset(AnalysisTask):
+   data_objects/analyzer_objects.py:        from yt.visualization.api import SlicePlot
+   data_objects/analyzer_objects.py:        self.SlicePlot = SlicePlot
+   data_objects/analyzer_objects.py:        slc = self.SlicePlot(pf, self.axis, self.field, center = self.center)
+   ...
+
+You can now follow up on this and open up the files that have references to
+``SlicePlot`` (particularly the one that defines ``SlicePlot``) and inspect
+their contents for problems or clarification.
+
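+If you would rather locate a definition from within Python, the standard
+library's ``inspect`` module can point you at the file that defines a class
+(a small sketch, using the same ``SlicePlot`` example):
+
+.. code-block:: python
+
+  import inspect
+  from yt.visualization.api import SlicePlot
+
+  # Print the file that defines SlicePlot and the line it starts on
+  print inspect.getsourcefile(SlicePlot)
+  print inspect.getsourcelines(SlicePlot)[1]
+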
+.. _isolate_and_document:
+
+Isolate and document your problem
+---------------------------------
+
+As you gear up to take your question to the rest of the community, try to distill
+your problem down to the smallest number of steps needed to reproduce it in a
+script.  This can help you (and us) to identify the basic problem.  Follow
+these steps:
+
+ * Identify what it is that went wrong, and how you knew it went wrong.
+ * Put your script, errors, and outputs online:
+
+   * ``$ yt pastebin script.py`` - pastes script.py online
+   * ``$ python script.py --paste`` - pastes errors online
+   * ``$ yt upload_image image.png`` - pastes image online
+
+ * Identify which version of the code you're using:
+
+   * ``$ yt instinfo`` - provides version information, including changeset hash
+
+It may be that through the mere process of doing this, you end up solving 
+the problem!
+
+.. _irc:
+
+IRC
+---
+
+If you want a fast, interactive experience, you could try jumping into our IRC 
+channel to get your questions answered in a chatroom style environment.  You 
+don't even need to have any special IRC client in order to join.  We are the
+#yt channel on irc.freenode.net, but you can also connect using your web 
+browser by going to http://yt-project.org/irc.html .  There are usually 2-8 
+members of the user base and development team online, so you'll probably get 
+your answers quickly.  Remember to bring the information from the 
+:ref:`last step <isolate_and_document>`.
+
+.. _mailing-list:
+
+Ask the mailing list
+--------------------
+
+If you still haven't found a solution, feel free to
 write to the mailing list regarding your problems.  There are two mailing lists,
 `yt-users <http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_ and
 `yt-dev <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_.  The
@@ -71,71 +162,43 @@
 the latter has more chatter about the way the code is developed and discussions
 of changes and feature improvements.
 
-If you email ``yt-users`` asking for help, there are several things you must
-provide, or else we won't be able to do much:
-
-#. What it is that went wrong, and how you knew it went wrong.
-#. A traceback if appropriate -- see :ref:`error-reporting` for some help with
-   that.
-#. If possible, the smallest number of steps that can reproduce the problem. 
-   If you're demonstrating the bug with code, you may find the :ref:`pastebin` 
-   useful.If you've got an image output that demonstrates your problem, you may 
-   find the :ref:`upload-image` function useful.
-#. Which version of the code you are using (i.e. the output of ``yt instinfo``).
+If you email ``yt-users`` asking for help, remember to include the information
+about your problem you identified in :ref:`this step <isolate_and_document>`.
 
 When you email the list, providing this information can help the developers
 understand what you did, how it went wrong, and any potential fixes or similar
 problems they have seen in the past.  Without this context, it can be very
 difficult to help out!
 
-.. _irc:
-
-IRC
----
-
-If you want a more interactive experience, you could try jumping into our IRC 
-channel to get your questions answered in a chatroom style environment.  You 
-don't even need to have any special IRC client in order to join.  We are the
-#yt channel on irc.freenode.net, but you can also connect using your web 
-browser by going to http://yt-project.org/irc.html .  There are usually 2-8 members of the user base and development team online, so you'll probably get your
-answers quickly.
-
 .. _reporting-a-bug:
 
-How To Report A Bug
+How to report a bug
 -------------------
 
 If you have gone through all of the above steps, and you're still encountering 
-problems, then you have found a bug.  The first step, when reporting a bug, 
-is to identify the smallest piece of code that reproduces the bug.
+problems, then you have found a bug.  
 To submit a bug report, you can either directly create one through the
 BitBucket `web interface <http://hg.yt-project.org/yt/issues/new>`_,
 or you can use the command line ``yt bugreport`` to interactively create one.
 Alternatively, email the ``yt-users`` mailing list and we will construct a new
-ticket in your stead.
+ticket in your stead.  Remember to include the information
+about your problem you identified in :ref:`this step <isolate_and_document>`.
 
 
 Installation Issues
 -------------------
 
-If you are having installation issues, you should *definitely* email the
-``yt-users`` email list.  You should provide information about the host, the
-version of the code you are using, and the output of ``yt_install.log`` from
-your installation.  We are very interested in making sure that ``yt`` installs
-everywhere!
-
-Vanilla Usage Issues
---------------------
-
-If you're running ``yt`` without having made any modifications to the code
-base, please provide as much of your script as you are able to.  Submitting
-both the script and the traceback to the pastebin (as described in :ref:`pastebin`)
-is usually sufficient to reproduce the error.
+If you are having installation issues and nothing from the 
+:ref:`installing page <getting-and-installing-yt>` seems to work, you should
+*definitely* email the ``yt-users`` email list.  You should provide information 
+about the host, the version of the code you are using, and the output of 
+``yt_install.log`` from your installation.  We are very interested in making 
+sure that ``yt`` installs everywhere!
 
 Customization and Scripting Issues
 ----------------------------------
 
 If you have customized ``yt`` in some way, or created your own plugins file (as
 described in :ref:`plugin-file`) then it may be necessary to supply both your
-patches to the source and the plugin file, if you are utilizing something
-defined in that file.
+patches to the source and the plugin file, and perhaps even the datafile on
+which you're running.


https://bitbucket.org/yt_analysis/yt-doc/commits/fcf46f3fe76d/
Changeset:   fcf46f3fe76d
User:        ngoldbaum
Date:        2013-10-30 07:19:04
Summary:     Fixing the SZ notebook.
Affected #:  1 file

diff -r 500eed20fe413592cf3c081a292ce623c885a0ef -r fcf46f3fe76d3283d84b343a894cb6b26c3b1e0a source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -91,7 +91,7 @@
       "from yt.imods import *\n",
       "from yt.analysis_modules.api import SZProjection\n",
       "\n",
-      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"/enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "pf = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
       "\n",
       "freqs = [90.,180.,240.]\n",
       "szprj = SZProjection(pf, freqs)"
@@ -167,7 +167,32 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "szprj.write_png(\"SZ_example\")"
+      "szprj.write_png(\"SZ_example\")\n",
+      "\n",
+      "import glob\n",
+      "from IPython.display import Image\n",
+      "fns = glob.glob(\"SZ_example*.png\")\n",
+      "print fns"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "Image('SZ_example_TeSz.png')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "Image('SZ_example_240_GHz.png')"
      ],
      "language": "python",
      "metadata": {},


https://bitbucket.org/yt_analysis/yt-doc/commits/a5d3baf4fd32/
Changeset:   a5d3baf4fd32
User:        chummels
Date:        2013-10-30 07:49:06
Summary:     Merging
Affected #:  1 file

diff -r 2e0f9b6756437c1249ac2c22b83e153208d5fe3a -r a5d3baf4fd32c9be292166c548ca9b4ec74daea8 source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -91,7 +91,7 @@
       "from yt.imods import *\n",
       "from yt.analysis_modules.api import SZProjection\n",
       "\n",
-      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"/enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "pf = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
       "\n",
       "freqs = [90.,180.,240.]\n",
       "szprj = SZProjection(pf, freqs)"
@@ -167,7 +167,32 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "szprj.write_png(\"SZ_example\")"
+      "szprj.write_png(\"SZ_example\")\n",
+      "\n",
+      "import glob\n",
+      "from IPython.display import Image\n",
+      "fns = glob.glob(\"SZ_example*.png\")\n",
+      "print fns"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "Image('SZ_example_TeSz.png')"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "Image('SZ_example_240_GHz.png')"
      ],
      "language": "python",
      "metadata": {},


https://bitbucket.org/yt_analysis/yt-doc/commits/8f4277fe9727/
Changeset:   8f4277fe9727
User:        chummels
Date:        2013-10-30 08:02:34
Summary:     Minor change on front page.
Affected #:  1 file

diff -r a5d3baf4fd32c9be292166c548ca9b4ec74daea8 -r 8f4277fe972722114277583a8daf655c21262cc6 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -128,4 +128,4 @@
    examining/index
    developing/index
    reference/index
-   help/index
+   Getting Help <help/index>


https://bitbucket.org/yt_analysis/yt-doc/commits/cda84dc2fb80/
Changeset:   cda84dc2fb80
User:        chummels
Date:        2013-10-30 08:17:45
Summary:     Fixing some rendering issues and typos in help docs.
Affected #:  1 file

diff -r 8f4277fe972722114277583a8daf655c21262cc6 -r cda84dc2fb803f5043fa2576cf4814645746c378 source/help/index.rst
--- a/source/help/index.rst
+++ b/source/help/index.rst
@@ -111,7 +111,7 @@
    ...
 
 You can now followup on this and open up the files that have references to 
-``SlicePlot`` (particularly the one that definese SlicePlot) and inspect its
+``SlicePlot`` (particularly the one that defines SlicePlot) and inspect their
 contents for problems or clarification.
 
 .. _isolate_and_document:
@@ -126,10 +126,13 @@
 
  * Identify what it is that went wrong, and how you knew it went wrong.
  * Put your script, errors, and outputs online:
+
    * ``$ yt pastebin script.py`` - pastes script.py online
    * ``$ python script.py --paste`` - pastes errors online
    * ``$ yt upload_image image.png`` - pastes image online
+
  * Identify which version of the code you’re using. 
+
    * ``$ yt instinfo`` - provides version information, including changeset hash
 
 It may be that through the mere process of doing this, you end up solving 


https://bitbucket.org/yt_analysis/yt-doc/commits/616b3fc41bfa/
Changeset:   616b3fc41bfa
User:        chummels
Date:        2013-10-30 08:18:56
Summary:     Fixing a bug in the reference in the install.
Affected #:  1 file

diff -r cda84dc2fb803f5043fa2576cf4814645746c378 -r 616b3fc41bfa475aa77b10649dd1476f6e7a3529 source/installing.rst
--- a/source/installing.rst
+++ b/source/installing.rst
@@ -148,4 +148,4 @@
 dependencies cleanly.  Simply remove the install directory and its 
 subdirectories and you're done.  If you *really* had problems with the
code, this is a last line of defense: remove everything and then fully
-:ref:`re-install <_installing-yt>` from the install script again.
+:ref:`re-install <installing-yt>` from the install script again.


https://bitbucket.org/yt_analysis/yt-doc/commits/c0d53b0f11e3/
Changeset:   c0d53b0f11e3
User:        chummels
Date:        2013-10-30 09:13:52
Summary:     Updating headers on front page.
Affected #:  1 file

diff -r 616b3fc41bfa475aa77b10649dd1476f6e7a3529 -r c0d53b0f11e3740b109bfbabf21d1e137d0ebd88 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -1,5 +1,5 @@
-What is yt?
-===========
+yt Documentation
+================
 
 yt is a community-developed analysis and visualization toolkit for
 examining datasets in a variety of scientific disciplines.  yt is developed 
@@ -17,8 +17,8 @@
 <http://plutocode.ph.unito.it/>`_.  (Development of additional codes, including
 particle codes and octree codes, is taking place in yt 3.0.)
 
-Documentation
-=============
+Table of Contents
+-----------------
 
 .. raw:: html
 


https://bitbucket.org/yt_analysis/yt-doc/commits/0e0a73bd3895/
Changeset:   0e0a73bd3895
User:        chummels
Date:        2013-10-30 09:21:26
Summary:     Added John's SZ notebook to the cookbook.
Affected #:  1 file

diff -r c0d53b0f11e3740b109bfbabf21d1e137d0ebd88 -r 0e0a73bd3895d5df9c53d782084df9b349d0bd38 source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -39,6 +39,7 @@
 Example Notebooks
 -----------------
 .. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
 
    notebook_tutorial
+   ../analyzing/analysis_modules/sunyaev_zeldovich


https://bitbucket.org/yt_analysis/yt-doc/commits/680aa17f9cea/
Changeset:   680aa17f9cea
User:        ngoldbaum
Date:        2013-10-30 08:39:35
Summary:     Reverting some incorrect changes to the SZ projection docs

(sorry John!)
Affected #:  1 file

diff -r fcf46f3fe76d3283d84b343a894cb6b26c3b1e0a -r 680aa17f9cea966f0faf963c3e81c9f49eab6d3e source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -167,32 +167,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "szprj.write_png(\"SZ_example\")\n",
-      "\n",
-      "import glob\n",
-      "from IPython.display import Image\n",
-      "fns = glob.glob(\"SZ_example*.png\")\n",
-      "print fns"
-     ],
-     "language": "python",
-     "metadata": {},
-     "outputs": []
-    },
-    {
-     "cell_type": "code",
-     "collapsed": false,
-     "input": [
-      "Image('SZ_example_TeSz.png')"
-     ],
-     "language": "python",
-     "metadata": {},
-     "outputs": []
-    },
-    {
-     "cell_type": "code",
-     "collapsed": false,
-     "input": [
-      "Image('SZ_example_240_GHz.png')"
+      "szprj.write_png(\"SZ_example\")"
      ],
      "language": "python",
      "metadata": {},


https://bitbucket.org/yt_analysis/yt-doc/commits/830058811163/
Changeset:   830058811163
User:        chummels
Date:        2013-10-30 09:21:59
Summary:     Merging.
Affected #:  1 file

diff -r 0e0a73bd3895d5df9c53d782084df9b349d0bd38 -r 830058811163d05ae489c2571861f094dae691a9 source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -167,32 +167,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "szprj.write_png(\"SZ_example\")\n",
-      "\n",
-      "import glob\n",
-      "from IPython.display import Image\n",
-      "fns = glob.glob(\"SZ_example*.png\")\n",
-      "print fns"
-     ],
-     "language": "python",
-     "metadata": {},
-     "outputs": []
-    },
-    {
-     "cell_type": "code",
-     "collapsed": false,
-     "input": [
-      "Image('SZ_example_TeSz.png')"
-     ],
-     "language": "python",
-     "metadata": {},
-     "outputs": []
-    },
-    {
-     "cell_type": "code",
-     "collapsed": false,
-     "input": [
-      "Image('SZ_example_240_GHz.png')"
+      "szprj.write_png(\"SZ_example\")"
      ],
      "language": "python",
      "metadata": {},


https://bitbucket.org/yt_analysis/yt-doc/commits/fde11e6718f2/
Changeset:   fde11e6718f2
User:        MatthewTurk
Date:        2013-10-30 14:44:23
Summary:     Changing from GPL to BSD in the license text.
Affected #:  1 file

diff -r 830058811163d05ae489c2571861f094dae691a9 -r fde11e6718f280d02dfe796b79dab97d714e0065 source/developing/developing.rst
--- a/source/developing/developing.rst
+++ b/source/developing/developing.rst
@@ -66,11 +66,13 @@
 Licensing
 +++++++++
 
-All contributed code must be GPL-compatible; we ask that you consider licensing
-under the GPL version 3, but we will consider submissions of code that are
-BSD-like licensed as well.  If you'd rather not license in this manner, but
-still want to contribute, just drop me a line and I'll put a link on the main
-wiki page to wherever you like!
+yt has, with the 2.6 release, been `relicensed
+<http://blog.yt-project.org/post/Relicensing.html>`_ under the BSD 3-clause
+license.  Previous versions were released under the GPLv3.
+
+All contributed code must be BSD-compatible.  If you'd rather not license in
+this manner, but still want to contribute, please consider creating an external
+package, which we'll happily link to.
 
 Requirements for Code Submission
 ++++++++++++++++++++++++++++++++


https://bitbucket.org/yt_analysis/yt-doc/commits/d034c66e6cf4/
Changeset:   d034c66e6cf4
User:        MatthewTurk
Date:        2013-10-30 14:55:08
Summary:     Adding Sketchfab from the blog.
Affected #:  3 files

diff -r fde11e6718f280d02dfe796b79dab97d714e0065 -r d034c66e6cf465a96a23fef6f7725c437229e0b6 source/visualizing/_images/surfaces_blender.png
Binary file source/visualizing/_images/surfaces_blender.png has changed

diff -r fde11e6718f280d02dfe796b79dab97d714e0065 -r d034c66e6cf465a96a23fef6f7725c437229e0b6 source/visualizing/index.rst
--- a/source/visualizing/index.rst
+++ b/source/visualizing/index.rst
@@ -8,5 +8,6 @@
    callbacks
    manual_plotting
    volume_rendering
+   sketchfab
    streamlines
    colormaps/index

diff -r fde11e6718f280d02dfe796b79dab97d714e0065 -r d034c66e6cf465a96a23fef6f7725c437229e0b6 source/visualizing/sketchfab.rst
--- /dev/null
+++ b/source/visualizing/sketchfab.rst
@@ -0,0 +1,301 @@
+3D Surfaces and Sketchfab
+=========================
+
+.. sectionauthor:: Jill Naiman and Matthew Turk
+
+Surfaces
+--------
+
+For a while now, yt has had the ability to extract isosurfaces from volumetric
+data using a `marching cubes <http://en.wikipedia.org/wiki/Marching_cubes>`_
+algorithm.  The surfaces could be exported in `OBJ format
+<http://en.wikipedia.org/wiki/Wavefront_.obj_file>`_, values could be sampled
+at the center of each face of the surface, and flux of a given field could be
+calculated over the surface.  This means you could, for instance, extract an
+isocontour in density and calculate the mass flux over that isocontour.  It
+also means you could export a surface from yt and view it in something like
+`Blender <http://www.blender.org/>`_, `MeshLab
+<http://meshlab.sourceforge.net/>`_, or even on your Android or iOS device in
+`MeshPad <http://www.meshpad.org/>`_ or `MeshLab Android
+<https://play.google.com/store/apps/details?id=it.isticnr.meshlab&hl=en>`_.
+One important caveat with marching cubes is that with adaptive mesh refinement
+data, you *will* see cracks across refinement boundaries unless a
+"crack-fixing" step is applied to match up these boundaries.  yt does not
+perform such an operation, and so there will be seams visible in 3D views of
+your isosurfaces.
+
+These capabilities were implemented as methods on data objects --
+``extract_isocontours`` and ``calculate_isocontour_flux`` -- which returned
+just numbers or values.
+However, recently, I've created a new object called ``AMRSurface`` that makes
+this process much easier.  You can create one of these objects by specifying a
+source data object and a field over which to identify a surface at a given
+value.  For example:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("/data/workshop2012/IsolatedGalaxy/galaxy0030/galaxy0030")
+   sphere = pf.h.sphere("max", (1.0, "mpc"))
+   surface = pf.h.surface(sphere, "Density", 1e-27)
+
+This object, ``surface``, can now be queried for values on the surface.  For
+instance:
+
+.. code-block:: python
+
+   print surface["Temperature"].min(), surface["Temperature"].max()
+
+will return the values 11850.7476943 and 13641.0663899.  These values are
+interpolated to the face centers of every triangle that constitutes a portion
+of the surface.  Note that reading a new field requires re-calculating the
+entire surface, so it's not the fastest operation.  You can get the vertices
+of the triangles by looking at the property ``.vertices``.
+
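+For instance, continuing the example above -- a minimal sketch that just
+inspects the vertex array:
+
+.. code-block:: python
+
+   verts = surface.vertices
+   print verts.shape
+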
+Exporting to a File
+-------------------
+
+If you want to export this to a `PLY file
+<http://en.wikipedia.org/wiki/PLY_(file_format)>`_ you can call the routine
+``export_ply``, which will write to a file and optionally sample a field at
+every face or vertex, outputting a color value to the file as well.  This file
+can then be viewed in MeshLab, Blender, or on the website `Sketchfab.com
+<http://sketchfab.com>`_.  But if you want to view it on Sketchfab, there's an
+even easier way!
+
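+Before jumping to that, here is roughly what an ``export_ply`` call looks
+like.  This is a sketch: the ``color_field`` keyword is an assumption, based
+on the ``export_obj`` calling sequence shown later in this document.
+
+.. code-block:: python
+
+   # write the surface, colored by Temperature, to a PLY file
+   surface.export_ply("galaxy_surface.ply", color_field="Temperature")
+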
+Exporting to Sketchfab
+----------------------
+
+`Sketchfab <http://sketchfab.com>`_ is a website that uses WebGL, a relatively
+new technology for displaying 3D graphics in any browser.  It's very fast and
+typically requires no plugins.  Plus, it means that you can share data with
+anyone and they can view it immersively without having to download the data or
+any software packages!  Sketchfab provides a free tier for up to 10 models, and
+these models can be embedded in websites.
+
+There are lots of reasons to want to export to Sketchfab.  For instance, if
+you're looking at a galaxy formation simulation and you publish a paper, you
+can include a link to the model in that paper (or in the arXiv listing) so that
+people can explore and see what the data looks like.  You can also embed a
+model in a website with other supplemental data, or you can use Sketchfab to
+discuss morphological properties of a dataset with collaborators.  It's also
+just plain cool.
+
+The ``AMRSurface`` object includes a method to upload directly to Sketchfab,
+but it requires that you get an API key first.  You can get this API key by
+creating an account and then going to your "dashboard," where it will be listed
+on the right hand side.  Once you've obtained it, put it into your
+``~/.yt/config`` file under the heading ``[yt]`` as the variable
+``sketchfab_api_key``.  If you don't want to do this, you can also supply it as
+an argument to the function ``export_sketchfab``.
+
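+That is, the relevant section of your ``~/.yt/config`` would look like::
+
+   [yt]
+   sketchfab_api_key = <your API key here>
+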
+Now you can run a script like this:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("redshift0058")
+   dd = pf.h.sphere("max", (200, "kpc"))
+   rho = 5e-27
+
+   bounds = [(dd.center[i] - 100.0/pf['kpc'],
+              dd.center[i] + 100.0/pf['kpc']) for i in range(3)]
+
+   surf = pf.h.surface(dd, "Density", rho)
+
+   upload_id = surf.export_sketchfab(
+       title = "RD0058 - 5e-27",
+       description = "Extraction of Density (colored by Temperature) at 5e-27 " \
+                   + "g/cc from a galaxy formation simulation by Ryan Joung.",
+       color_field = "Temperature",
+       color_map = "hot",
+       color_log = True,
+       bounds = bounds
+   )
+
+and yt will extract a surface, convert to a format that Sketchfab.com
+understands (PLY, in a zip file) and then upload it using your API key.  For
+this demo, I've used data kindly provided by Ryan Joung from a simulation of
+galaxy formation.  Here's what my newly-uploaded model looks like, using the
+embed code from Sketchfab:
+
+.. raw:: html
+
+   <iframe frameborder="0" height="480" width="854" allowFullScreen
+   webkitallowfullscreen="true" mozallowfullscreen="true"
+   src="http://skfb.ly/l4jh2edcba?autostart=0&transparent=0&autospin=0&controls=1&watermark=1"></iframe>
+
+As a note, Sketchfab has a maximum model size of 50MB for the free account.
+50MB is pretty hefty, though, so it shouldn't be a problem for most needs.
+We're working on a way to optionally upload links to the Sketchfab models on
+the `yt Hub <https://hub.yt-project.org/>`_, but for now, if you want to share
+a cool model we'd love to see it!
+
+OBJ and MTL Files
+-----------------
+
+If the ability to maneuver around an isosurface of your 3D simulation in
+`Sketchfab <http://sketchfab.com>`_ cost you half a day of work (let's be
+honest, 2 days), prepare to be even less productive.  With a new  `OBJ file
+<http://en.wikipedia.org/wiki/Wavefront_.obj_file>`_ exporter, you can now
+upload multiple surfaces of different transparencies in the same file.
+The following code snippet produces two files which contain the vertex info
+(surfaces.obj) and color/transparency info (surfaces.mtl) for a 3D
+galaxy simulation:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("/data/workshop2012/IsolatedGalaxy/galaxy0030/galaxy0030")
+   rho = [2e-27, 1e-27]
+   trans = [1.0, 0.5]
+   filename = './surfaces'
+
+   sphere = pf.h.sphere("max", (1.0, "mpc"))
+   for i,r in enumerate(rho):
+       surf = pf.h.surface(sphere, 'Density', r)
+       surf.export_obj(filename, transparency = trans[i], color_field='Temperature', plot_index = i)
+
+The calling sequence is fairly similar to the ``export_ply`` function
+`previously used <http://blog.yt-project.org/post/3DSurfacesAndSketchFab.html>`_
+to export 3D surfaces.  However, one can now specify a transparency for each
+surface of interest, and each surface is enumerated in the OBJ files with ``plot_index``.
+This means one could potentially add surfaces to a previously
+created file by setting ``plot_index`` to the number of previously written
+surfaces.
+
+One tricky thing: the header of the OBJ file points to the MTL file (with
+the header command ``mtllib``).  This means if you move one or both of the files
+you may have to change the header to reflect their new directory location.
+
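+For reference, the pointer is a single line near the top of ``surfaces.obj``
+(a sketch; the name matches whatever MTL file was written alongside it)::
+
+   mtllib surfaces.mtl
+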
+A Few More Options
+------------------
+
+There are a few extra inputs for formatting the surface files you may want to use.
+
+(1) Setting ``dist_fac`` will divide all the vertex coordinates by this factor.
+By default, the vertices are scaled by the physical bounds of your sphere.
+
+(2) Setting ``color_field_max`` and/or ``color_field_min`` will scale the colors
+of all surfaces between this min and max.  By default, the colors of each
+surface are scaled to its own min and max values.
+
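+Putting these together -- a sketch of the export loop from above with
+illustrative values (the numbers here are placeholders, not recommendations):
+
+.. code-block:: python
+
+   for i, r in enumerate(rho):
+       surf = pf.h.surface(sphere, 'Density', r)
+       surf.export_obj(filename, transparency = trans[i],
+                       color_field = 'Temperature',
+                       dist_fac = 1.0,            # divide vertex coordinates by this
+                       color_field_max = 1.0e4,   # shared color scale across surfaces
+                       color_field_min = 1.0e2,
+                       plot_index = i)
+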
+Uploading to SketchFab
+----------------------
+
+To upload to `Sketchfab <http://sketchfab.com>`_ one only needs to zip the
+OBJ and MTL files together, and then upload via your dashboard prompts in
+the usual way.  For example, the above script produces:
+
+.. raw:: html
+
+   <iframe frameborder="0" height="480" width="854" allowFullScreen
+   webkitallowfullscreen="true" mozallowfullscreen="true"
+   src="http://skfb.ly/5k4j2fdcb?autostart=0&transparent=0&autospin=0&controls=1&watermark=1">
+   </iframe>
+
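+As for the zipping step itself, it is a one-liner on most systems; a sketch
+using the filenames produced by the script above:
+
+.. code-block:: bash
+
+   $ zip surfaces.zip surfaces.obj surfaces.mtl
+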
+Importing to MeshLab and Blender
+--------------------------------
+
+The new OBJ formatting will produce multi-colored surfaces in both
+`MeshLab <http://meshlab.sourceforge.net/>`_ and `Blender <http://www.blender.org/>`_,
+a feature not possible with the
+`previous PLY exporter <http://blog.yt-project.org/post/3DSurfacesAndSketchFab.html>`_.
+To see colors in MeshLab go to the "Render" tab and
+select "Color -> Per Face".  Note in both MeshLab and Blender, unlike Sketchfab, you can't see
+transparencies until you render.
+
+...One More Option
+------------------
+
+If you've started poking around the actual code instead of skipping off to
+lose a few days running around your own simulations, you may have noticed
+there are a few more options than those listed above; specifically, a few
+related to something called "Emissivity."  This allows you to output one more
+type of variable on your surfaces.  For example:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("/data/workshop2012/IsolatedGalaxy/galaxy0030/galaxy0030")
+   rho = [2e-27, 1e-27]
+   trans = [1.0, 0.5]
+   filename = './surfaces'
+
+   def _Emissivity(field, data):
+       return (data['Density']*data['Density']*np.sqrt(data['Temperature']))
+   add_field("Emissivity", function=_Emissivity, units=r"\rm{g K}/\rm{cm}^{6}")
+
+   sphere = pf.h.sphere("max", (1.0, "mpc"))
+   for i,r in enumerate(rho):
+       surf = pf.h.surface(sphere, 'Density', r)
+       surf.export_obj(filename, transparency = trans[i],
+                       color_field='Temperature', emit_field = 'Emissivity',
+                       plot_index = i)
+
+will output the same OBJ and MTL as in our previous example, but it will scale
+an emissivity parameter by our new field.  Technically, this makes our outputs
+not really OBJ files at all, but a new sort of hybrid file; however, we needn't
+worry too much about that for now.
+
+This parameter is useful if you want to upload your files in Blender and have the
+embedded rendering engine do some approximate ray-tracing on your transparencies
+and emissivities.  This requires some slight modifications to the OBJ importer
+scripts in Blender.  For example, on a Mac, you would modify the file
+"/Applications/Blender/blender.app/Contents/MacOS/2.65/scripts/addons/io_scene_obj/import_obj.py",
+in the function "create_materials" with:
+
+.. code-block:: python
+
+   # ...
+
+                    elif line_lower.startswith(b'tr'):  # translucency
+                        context_material.translucency = float_func(line_split[1])
+                    elif line_lower.startswith(b'tf'):
+                        # rgb, filter color, blender has no support for this.
+                        pass
+                    elif line_lower.startswith(b'em'): # MODIFY: ADD THIS LINE
+                        context_material.emit = float_func(line_split[1]) # MODIFY: THIS LINE TOO
+                    elif line_lower.startswith(b'illum'):
+                        illum = int(line_split[1])
+
+   # ...
+
+To use this in Blender, you might create a
+`Blender script <http://cgcookie.com/blender/2011/08/26/introduction-to-scripting-with-python-in-blender/>`_
+like the following:
+
+.. code-block:: python
+
+   import bpy
+   from math import *
+
+   bpy.ops.import_scene.obj(filepath='./surfaces.obj') # will use new importer
+
+   # set up lighting = indirect
+   bpy.data.worlds['World'].light_settings.use_indirect_light = True
+   bpy.data.worlds['World'].horizon_color = [0.0, 0.0, 0.0] # background = black
+   # have to use approximate, not ray tracing for emitting objects ...
+   #   ... for now...
+   bpy.data.worlds['World'].light_settings.gather_method = 'APPROXIMATE'
+   bpy.data.worlds['World'].light_settings.indirect_factor=20. # turn up all emiss
+
+   # set up camera to be on -x axis, facing toward your object
+   scene = bpy.data.scenes["Scene"]
+   scene.camera.location = [-0.12, 0.0, 0.0] # location
+   scene.camera.rotation_euler = [radians(90.), 0.0, radians(-90.)] # face to (0,0,0)
+
+   # render
+   scene.render.filepath ='/Users/jillnaiman/surfaces_blender' # needs full path
+   bpy.ops.render.render(write_still=True)
+
+The above bit of code would produce an image like so:
+
+.. image:: _images/surfaces_blender.png
+
+Note that the hottest stuff is brightly shining, while the cool stuff is less so
+(making the inner isodensity contour barely visible from the outside of the surfaces).
+
+If the Blender image caught your fancy, you'll be happy to know there is a greater
+integration of Blender and yt in the works, so stay tuned!


https://bitbucket.org/yt_analysis/yt-doc/commits/69470f236db4/
Changeset:   69470f236db4
User:        MatthewTurk
Date:        2013-10-30 15:12:47
Summary:     Adding frontends we support to API docs
Affected #:  1 file

diff -r d034c66e6cf465a96a23fef6f7725c437229e0b6 -r 69470f236db424621cf2ea3fd07b980d50f88602 source/reference/api/api.rst
--- a/source/reference/api/api.rst
+++ b/source/reference/api/api.rst
@@ -124,16 +124,71 @@
    ~yt.frontends.chombo.data_structures.ChomboHierarchy
    ~yt.frontends.chombo.data_structures.ChomboStaticOutput
 
-RAMSES
+Castro
 ^^^^^^
 
 
 .. autosummary::
    :toctree: generated/
 
-   ~yt.frontends.ramses.data_structures.RAMSESGrid
-   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
-   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
+   ~yt.frontends.castro.data_structures.CastroGrid
+   ~yt.frontends.castro.data_structures.CastroHierarchy
+   ~yt.frontends.castro.data_structures.CastroStaticOutput
+
+Pluto
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.pluto.data_structures.PlutoGrid
+   ~yt.frontends.pluto.data_structures.PlutoHierarchy
+   ~yt.frontends.pluto.data_structures.PlutoStaticOutput
+
+Stream
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.stream.data_structures.StreamGrid
+   ~yt.frontends.stream.data_structures.StreamHierarchy
+   ~yt.frontends.stream.data_structures.StreamStaticOutput
+
+Nyx
+^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.nyx.data_structures.NyxGrid
+   ~yt.frontends.nyx.data_structures.NyxHierarchy
+   ~yt.frontends.nyx.data_structures.NyxStaticOutput
+
+Athena
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.athena.data_structures.AthenaGrid
+   ~yt.frontends.athena.data_structures.AthenaHierarchy
+   ~yt.frontends.athena.data_structures.AthenaStaticOutput
+
+GDF
+^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.gdf.data_structures.GDFGrid
+   ~yt.frontends.gdf.data_structures.GDFHierarchy
+   ~yt.frontends.gdf.data_structures.GDFStaticOutput
 
 Derived Datatypes
 -----------------


https://bitbucket.org/yt_analysis/yt-doc/commits/a4f8002a77bf/
Changeset:   a4f8002a77bf
User:        MatthewTurk
Date:        2013-10-30 15:20:56
Summary:     Seems like creating_derived_fields should be under analysis, not developing.
Affected #:  3 files

diff -r 69470f236db424621cf2ea3fd07b980d50f88602 -r a4f8002a77bf17c6f7bc987f652c427aebf496bb source/analyzing/creating_derived_fields.rst
--- /dev/null
+++ b/source/analyzing/creating_derived_fields.rst
@@ -0,0 +1,313 @@
+.. _creating-derived-fields:
+
+Creating Derived Fields
+=======================
+
+One of the more powerful means of extending yt is through the usage of derived
+fields.  These are fields that describe a value at each cell in a simulation.
+
+Defining a New Field
+--------------------
+
+So once a new field has been conceived of, the best way to create it is to
+construct a function that performs an array operation -- operating on a 
+collection of data, neutral to its size, shape, and type.  (All fields should
+be provided as 64-bit floats.)
+
+A simple example of this is the pressure field, which demonstrates the ease of
+this approach.
+
+.. code-block:: python
+
+   def _Pressure(field, data):
+       return (data.pf["Gamma"] - 1.0) * \
+              data["Density"] * data["ThermalEnergy"]
+
+Note that we do a couple different things here.  We access the "Gamma"
+parameter from the parameter file, we access the "Density" field and we access
+the "ThermalEnergy" field.  "ThermalEnergy" is, in fact, another derived field!
+("ThermalEnergy" deals with the distinction in storage of energy between dual
+energy formalism and non-DEF.)  We don't do any loops, we don't do any
+type-checking, we can simply multiply the three items together.
+
+Once we've defined our function, we need to notify yt that the field is
+available.  The :func:`add_field` function is the means of doing this; it has a
+number of fairly specific parameters that can be passed in, but here we'll only
+look at the most basic ones needed for a simple scalar baryon field.
+
+.. code-block:: python
+
+   add_field("Pressure", function=_Pressure, units=r"\rm{dyne}/\rm{cm}^{2}")
+
+We feed it the name of the field, the name of the function, and the
+units.  Note that the units parameter is a "raw" string, with some
+LaTeX-style formatting -- Matplotlib actually has a MathText rendering
+engine, so if you include LaTeX it will be rendered appropriately.
+
+.. One very important thing to note about the call to ``add_field`` is
+.. that it **does not** need to specify the function name **if** the
+.. function is the name of the field prefixed with an underscore.  If it
+.. is not -- and it won't be for fields in different units (such as
+.. "CellMassMsun") -- then you need to specify it with the argument
+.. ``function``.
+
+We suggest that you name the function that creates a derived field
+with the intended field name prefixed by a single underscore, as in
+the ``_Pressure`` example above.
+
+If you find yourself using the same custom-defined fields over and over, you
+should put them in your plugins file as described in :ref:`plugin-file`.
+
+Conversion Factors
+~~~~~~~~~~~~~~~~~~
+
+When creating a derived field, yt does not by default do unit
+conversion.  All of the fields fed into the field are pre-supposed to
+be in CGS.  If the field does not need any constants applied after
+that, you are done. If it does, you should define a second function
+that applies the proper multiple in order to return the desired units
+and use the argument ``convert_function`` to ``add_field`` to point to
+it.  
+
+The argument that you pass to ``convert_function`` will be dependent on 
+what fields are input into your derived field, and in what form they
+are passed from their native format.  For Enzo fields, nearly all the
+native on-disk fields are in CGS units already (except for ``dx``, ``dy``,
+and ``dz`` fields), so you typically only need to convert for 
+off-standard fields taking into account where those fields are 
+used in the final output derived field.  For other codes, it can vary.
+
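+As a concrete (hypothetical) sketch, here is a field built from the code-unit
+``dx``, ``dy``, and ``dz`` fields, which therefore needs a conversion back to
+CGS:
+
+.. code-block:: python
+
+   def _CellVolumeCGS(field, data):
+       # dx, dy, and dz are in code units, so this is a code-unit volume
+       return data["dx"] * data["dy"] * data["dz"]
+   def _convertCellVolumeCGS(data):
+       # one code length in cm, cubed, converts the volume to cm^3
+       return data.convert("cm")**3.0
+   add_field("CellVolumeCGS", function=_CellVolumeCGS,
+             convert_function=_convertCellVolumeCGS,
+             units=r"\rm{cm}^{3}")
+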
+You can check to see the units associated with any field in a dataset
+from any code by using the ``_units`` attribute.  Here is an example 
+with one of our sample FLASH datasets available publicly at 
+http://yt-project.org/data :
+
+.. code-block:: python
+
+   >>> from yt.mods import *
+   >>> pf = load("GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100")
+   >>> pf.h.field_list
+   ['dens', 'temp', 'pres', 'gpot', 'divb', 'velx', 'vely', 'velz', 'magx', 'magy', 'magz', 'magp']
+   >>> pf.field_info['dens']._units
+   '\\rm{g}/\\rm{cm}^{3}'
+   >>> pf.field_info['temp']._units
+   '\\rm{K}'
+   >>> pf.field_info['velx']._units
+   '\\rm{cm}/\\rm{s}'
+
+Thus if you were using any of these fields as input to your derived field, you 
+wouldn't have to worry about unit conversion because they're already in CGS.
+
+Some More Complicated Examples
+------------------------------
+
+But what if we want to do some more fancy stuff?  Here's an example of getting
+parameters from the data object and using those to define the field;
+specifically, here we obtain the ``center`` and ``height_vector`` parameters
+and use those to define an angle of declination of a point with respect to a
+disk.
+
+.. code-block:: python
+
+   def _DiskAngle(field, data):
+       # We make both r_vec and h_vec into unit vectors
+       center = data.get_field_parameter("center")
+       r_vec = np.array([data["x"] - center[0],
+                         data["y"] - center[1],
+                         data["z"] - center[2]])
+       r_vec = r_vec/np.sqrt((r_vec**2.0).sum(axis=0))
+       h_vec = np.array(data.get_field_parameter("height_vector"))
+       dp = r_vec[0,:] * h_vec[0] \
+          + r_vec[1,:] * h_vec[1] \
+          + r_vec[2,:] * h_vec[2]
+       return np.arccos(dp)
+   add_field("DiskAngle", take_log=False,
+             validators=[ValidateParameter("height_vector"),
+                         ValidateParameter("center")],
+             display_field=False)
+
+Note that we have added a few parameters below the main function; we specify
+that we do not wish to display this field as logged, that we require both
+``height_vector`` and ``center`` to be present in a given data object we wish
+to calculate this for, and we say that it should not be displayed in a
+drop-down box of fields to display.  This is done through the parameter
+*validators*, which accepts a list of :class:`FieldValidator` objects.  These
+objects define the way in which the field is generated, and when it is able to
+be created.  In this case, we mandate that parameters *center* and
+*height_vector* are set before creating the field.  These are set via 
+:meth:`~yt.data_objects.data_containers.set_field_parameter`, which can 
+be called on any object that has fields.
+
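+For example, a minimal sketch of setting those parameters on a sphere before
+asking for the field:
+
+.. code-block:: python
+
+   sp = pf.h.sphere("max", (10.0, "kpc"))
+   sp.set_field_parameter("center", sp.center)
+   sp.set_field_parameter("height_vector", np.array([0.0, 0.0, 1.0]))
+   print sp["DiskAngle"]
+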
+We can also define vector fields.
+
+.. code-block:: python
+
+   def _SpecificAngularMomentum(field, data):
+       if data.has_field_parameter("bulk_velocity"):
+           bv = data.get_field_parameter("bulk_velocity")
+       else: bv = np.zeros(3, dtype='float64')
+       xv = data["x-velocity"] - bv[0]
+       yv = data["y-velocity"] - bv[1]
+       zv = data["z-velocity"] - bv[2]
+       center = data.get_field_parameter('center')
+       coords = np.array([data['x'],data['y'],data['z']], dtype='float64')
+       new_shape = tuple([3] + [1]*(len(coords.shape)-1))
+       r_vec = coords - np.reshape(center,new_shape)
+       v_vec = np.array([xv,yv,zv], dtype='float64')
+       return np.cross(r_vec, v_vec, axis=0)
+   def _convertSpecificAngularMomentum(data):
+       return data.convert("cm")
+   add_field("SpecificAngularMomentum",
+             convert_function=_convertSpecificAngularMomentum, vector_field=True,
+             units=r"\rm{cm}^2/\rm{s}", validators=[ValidateParameter('center')])
+
+Here we define the SpecificAngularMomentum field, optionally taking a
+``bulk_velocity``, and returning a vector field that needs conversion by the
+function ``_convertSpecificAngularMomentum``.
+
+It is also possible to define fields that depend on spatial derivatives of 
+other fields.  Calculating the derivative for a single grid cell requires 
+information about neighboring grid cells.  Therefore, properly calculating 
+a derivative for a cell on the edge of the grid will require cell values from 
+neighboring grids.  Below is an example of a field that is the divergence of the 
+velocity.
+
+.. code-block:: python
+
+    def _DivV(field, data):
+        # We need to set up stencils
+        if data.pf["HydroMethod"] == 2:
+            sl_left = slice(None,-2,None)
+            sl_right = slice(1,-1,None)
+            div_fac = 1.0
+        else:
+            sl_left = slice(None,-2,None)
+            sl_right = slice(2,None,None)
+            div_fac = 2.0
+        ds = div_fac * data['dx'].flat[0]
+        f  = data["x-velocity"][sl_right,1:-1,1:-1]/ds
+        f -= data["x-velocity"][sl_left ,1:-1,1:-1]/ds
+        if data.pf.dimensionality > 1:
+            ds = div_fac * data['dy'].flat[0]
+            f += data["y-velocity"][1:-1,sl_right,1:-1]/ds
+            f -= data["y-velocity"][1:-1,sl_left ,1:-1]/ds
+        if data.pf.dimensionality > 2:
+            ds = div_fac * data['dz'].flat[0]
+            f += data["z-velocity"][1:-1,1:-1,sl_right]/ds
+            f -= data["z-velocity"][1:-1,1:-1,sl_left ]/ds
+        new_field = np.zeros(data["x-velocity"].shape, dtype='float64')
+        new_field[1:-1,1:-1,1:-1] = f
+        return new_field
+    def _convertDivV(data):
+        return data.convert("cm")**-1.0
+    add_field("DivV", function=_DivV,
+               validators=[ValidateSpatial(ghost_zones=1,
+	                   fields=["x-velocity","y-velocity","z-velocity"])],
+              units=r"\rm{s}^{-1}", take_log=False,
+              convert_function=_convertDivV)
+
+Note that *slice* is simply a native Python object used for taking slices of 
+arrays or lists.  Another :class:`FieldValidator` object, ``ValidateSpatial`` 
+is given in the list of *validators* in the call to ``add_field`` with 
+*ghost_zones* = 1, specifying that the original grid be padded with one additional 
+cell from the neighboring grids on all sides.  The *fields* keyword simply 
+mandates that the listed fields be present.  With one ghost zone added to all sides 
+of the grid, the data fields (data["x-velocity"], data["y-velocity"], and 
+data["z-velocity"]) will have a shape of (NX+2, NY+2, NZ+2) inside of this function, 
+where the original grid has dimension (NX, NY, NZ).  However, when the final field 
+data is returned, the ghost zones will be removed and the shape will again be 
+(NX, NY, NZ).
+
+.. _derived-field-options:
+
+Saving Derived Fields
+---------------------
+
+Complex fields can be time-consuming to generate, especially on large datasets. To mitigate this, yt provides a mechanism for saving fields to a backup file using the Grid Data Format. The next time you start yt, it will check this file and your field will be treated as native if present. 
+
+The code below creates a new derived field called "Entr" and saves it to disk:
+
+.. code-block:: python
+
+    from yt.mods import *
+    from yt.utilities.grid_data_format import writer
+
+    def _Entropy(field, data) :
+        return data["Temperature"]*data["Density"]**(-2./3.)
+    add_field("Entr", function=_Entropy)
+
+    pf = load('GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100')
+    writer.save_field(pf, "Entr")
+
+This creates a "_backup.gdf" file next to your datadump. If you load up the dataset again:
+
+.. code-block:: python
+
+    from yt.mods import *
+
+    pf = load('GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100')
+    data = pf.h.all_data()
+    print data["Entr"]
+
+you can work with the field exactly as before, without having to recompute it.
+
+Field Options
+-------------
+
+The arguments to :func:`add_field` are passed on to the constructor of
+:class:`DerivedField`.  :func:`add_field` takes care of finding the arguments
+`function` and `convert_function` if it can, however.  There are a number of
+options available, but the only mandatory ones are ``name`` and possibly
+``function``.
+
+   ``name``
+     This is the name of the field -- how you refer to it.  For instance,
+     ``Pressure`` or ``H2I_Fraction``.
+   ``function``
+     This is a function handle that defines the field.
+   ``convert_function``
+     This is the function that converts the field to CGS.  All inputs to this
+     function are mandated to already *be* in CGS.
+   ``units``
+     This is a mathtext (LaTeX-like) string that describes the units.
+   ``projected_units``
+     This is a mathtext (LaTeX-like) string that describes the units if the
+     field has been projected without a weighting.
+   ``display_name``
+     This is a name used in the plots, for instance ``"Divergence of
+     Velocity"``.  If not supplied, the ``name`` value is used.
+   ``take_log``
+     This is *True* or *False* and describes whether the field should be logged
+     when plotted.
+   ``particle_type``
+     Is this field a *particle* field?
+   ``validators``
+     (*Advanced*) This is a list of :class:`FieldValidator` objects, for instance to mandate
+     spatial data.
+   ``vector_field``
+     (*Advanced*) Is this field more than one value per cell?
+   ``display_field``
+     (*Advanced*) Should this field appear in the dropdown box in Reason?
+   ``not_in_all``
+     (*Advanced*) If this is *True*, the field may not be in all the grids.
+   ``projection_conversion``
+     (*Advanced*) Which unit should we multiply by in a projection?
+
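+As an illustration, here is the ``Pressure`` field from earlier with a few of
+the optional arguments spelled out (a sketch; the option values are just
+examples):
+
+.. code-block:: python
+
+   add_field("Pressure", function=_Pressure,
+             units=r"\rm{dyne}/\rm{cm}^{2}",
+             display_name="Gas Pressure",
+             take_log=True)
+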
+How Do Units Work?
+------------------
+
+Everything is done under the assumption that all of the native Enzo fields that
+yt knows about are converted to cgs before being handed to any processing
+routines.
+
+Which Enzo Fields Does yt Know About?
+-------------------------------------
+
+* Density
+* Temperature
+* Gas Energy
+* Total Energy
+* [xyz]-velocity
+* Species fields: HI, HII, Electron, HeI, HeII, HeIII, HM, H2I, H2II, DI, DII, HDI
+* Particle mass, velocity, 
+

diff -r 69470f236db424621cf2ea3fd07b980d50f88602 -r a4f8002a77bf17c6f7bc987f652c427aebf496bb source/developing/creating_derived_fields.rst
--- a/source/developing/creating_derived_fields.rst
+++ /dev/null
@@ -1,313 +0,0 @@
-.. _creating-derived-fields:
-
-Creating Derived Fields
-=======================
-
-One of the more powerful means of extending yt is through the usage of derived
-fields.  These are fields that describe a value at each cell in a simulation.
-
-Defining a New Field
---------------------
-
-So once a new field has been conceived of, the best way to create it is to
-construct a function that performs an array operation -- operating on a 
-collection of data, neutral to its size, shape, and type.  (All fields should
-be provided as 64-bit floats.)
-
-A simple example of this is the pressure field, which demonstrates the ease of
-this approach.
-
-.. code-block:: python
-
-   def _Pressure(field, data):
-       return (data.pf["Gamma"] - 1.0) * \
-              data["Density"] * data["ThermalEnergy"]
-
-Note that we do a couple different things here.  We access the "Gamma"
-parameter from the parameter file, we access the "Density" field and we access
-the "ThermalEnergy" field.  "ThermalEnergy" is, in fact, another derived field!
-("ThermalEnergy" deals with the distinction in storage of energy between dual
-energy formalism and non-DEF.)  We don't do any loops, we don't do any
-type-checking, we can simply multiply the three items together.
-
-Once we've defined our function, we need to notify yt that the field is
-available.  The :func:`add_field` function is the means of doing this; it has a
-number of fairly specific parameters that can be passed in, but here we'll only
-look at the most basic ones needed for a simple scalar baryon field.
-
-.. code-block:: python
-
-   add_field("Pressure", function=_Pressure, units=r"\rm{dyne}/\rm{cm}^{2}")
-
-We feed it the name of the field, the name of the function, and the
-units.  Note that the units parameter is a "raw" string, with some
-LaTeX-style formatting -- Matplotlib actually has a MathText rendering
-engine, so if you include LaTeX it will be rendered appropriately.
-
-.. One very important thing to note about the call to ``add_field`` is
-.. that it **does not** need to specify the function name **if** the
-.. function is the name of the field prefixed with an underscore.  If it
-.. is not -- and it won't be for fields in different units (such as
-.. "CellMassMsun") -- then you need to specify it with the argument
-.. ``function``.
-
-We suggest that you name the function that creates a derived field
-with the intended field name prefixed by a single underscore, as in
-the ``_Pressure`` example above.
-
-If you find yourself using the same custom-defined fields over and over, you
-should put them in your plugins file as described in :ref:`plugin-file`.
-
-Conversion Factors
-~~~~~~~~~~~~~~~~~~
-
-When creating a derived field, yt does not by default do unit
-conversion.  All of the fields fed into the field are pre-supposed to
-be in CGS.  If the field does not need any constants applied after
-that, you are done. If it does, you should define a second function
-that applies the proper multiple in order to return the desired units
-and use the argument ``convert_function`` to ``add_field`` to point to
-it.  
-
-The argument that you pass to ``convert_function`` will be dependent on 
-what fields are input into your derived field, and in what form they
-are passed from their native format.  For enzo fields, nearly all the
-native on-disk fields are in CGS units already (except for ``dx``, ``dy``,
-and ``dz`` fields), so you typically only need to convert for 
-off-standard fields taking into account where those fields are 
-used in the final output derived field.  For other codes, it can vary.
-
-You can check to see the units associated with any field in a dataset
-from any code by using the ``_units`` attribute.  Here is an example 
-with one of our sample FLASH datasets available publicly at 
-http://yt-project.org/data :
-
-.. code-block:: python
-
-   >>> from yt.mods import *
-   >>> pf = load("GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100")
-   >>> pf.h.field_list
-   ['dens', 'temp', 'pres', 'gpot', 'divb', 'velx', 'vely', 'velz', 'magx', 'magy', 'magz', 'magp']
-   >>> pf.field_info['dens']._units
-   '\\rm{g}/\\rm{cm}^{3}'
-   >>> pf.field_info['temp']._units
-   '\\rm{K}'
-   >>> pf.field_info['velx']._units
-   '\\rm{cm}/\\rm{s}'
-
-Thus if you were using any of these fields as input to your derived field, you 
-wouldn't have to worry about unit conversion because they're already in CGS.
-
-Some More Complicated Examples
-------------------------------
-
-But what if we want to do some more fancy stuff?  Here's an example of getting
-parameters from the data object and using those to define the field;
-specifically, here we obtain the ``center`` and ``height_vector`` parameters
-and use those to define an angle of declination of a point with respect to a
-disk.
-
-.. code-block:: python
-
-   def _DiskAngle(field, data):
-       # We make both r_vec and h_vec into unit vectors
-       center = data.get_field_parameter("center")
-       r_vec = np.array([data["x"] - center[0],
-                         data["y"] - center[1],
-                         data["z"] - center[2]])
-       r_vec = r_vec/np.sqrt((r_vec**2.0).sum(axis=0))
-       h_vec = np.array(data.get_field_parameter("height_vector"))
-       dp = r_vec[0,:] * h_vec[0] \
-          + r_vec[1,:] * h_vec[1] \
-          + r_vec[2,:] * h_vec[2]
-       return np.arccos(dp)
-   add_field("DiskAngle", take_log=False,
-             validators=[ValidateParameter("height_vector"),
-                         ValidateParameter("center")],
-             display_field=False)
-
-Note that we have added a few parameters below the main function; we specify
-that we do not wish to display this field as logged, that we require both
-``height_vector`` and ``center`` to be present in a given data object we wish
-to calculate this for, and we say that it should not be displayed in a
-drop-down box of fields to display.  This is done through the parameter
-*validators*, which accepts a list of :class:`FieldValidator` objects.  These
-objects define the way in which the field is generated, and when it is able to
-be created.  In this case, we mandate that parameters *center* and
-*height_vector* are set before creating the field.  These are set via 
-:meth:`~yt.data_objects.data_containers.set_field_parameter`, which can 
-be called on any object that has fields.
-
-We can also define vector fields.
-
-.. code-block:: python
-
-   def _SpecificAngularMomentum(field, data):
-       if data.has_field_parameter("bulk_velocity"):
-           bv = data.get_field_parameter("bulk_velocity")
-       else: bv = np.zeros(3, dtype='float64')
-       xv = data["x-velocity"] - bv[0]
-       yv = data["y-velocity"] - bv[1]
-       zv = data["z-velocity"] - bv[2]
-       center = data.get_field_parameter('center')
-       coords = np.array([data['x'],data['y'],data['z']], dtype='float64')
-       new_shape = tuple([3] + [1]*(len(coords.shape)-1))
-       r_vec = coords - np.reshape(center,new_shape)
-       v_vec = np.array([xv,yv,zv], dtype='float64')
-       return np.cross(r_vec, v_vec, axis=0)
-   def _convertSpecificAngularMomentum(data):
-       return data.convert("cm")
-   add_field("SpecificAngularMomentum",
-             convert_function=_convertSpecificAngularMomentum, vector_field=True,
-             units=r"\rm{cm}^2/\rm{s}", validators=[ValidateParameter('center')])
-
-Here we define the SpecificAngularMomentum field, optionally taking a
-``bulk_velocity``, and returning a vector field that needs conversion by the
-function ``_convertSpecificAngularMomentum``.
-
-It is also possible to define fields that depend on spatial derivatives of 
-other fields.  Calculating the derivative for a single grid cell requires 
-information about neighboring grid cells.  Therefore, properly calculating 
-a derivative for a cell on the edge of the grid will require cell values from 
-neighboring grids.  Below is an example of a field that is the divergence of the 
-velocity.
-
-.. code-block:: python
-
-    def _DivV(field, data):
-        # We need to set up stencils
-        if data.pf["HydroMethod"] == 2:
-            sl_left = slice(None,-2,None)
-            sl_right = slice(1,-1,None)
-            div_fac = 1.0
-        else:
-            sl_left = slice(None,-2,None)
-            sl_right = slice(2,None,None)
-            div_fac = 2.0
-        ds = div_fac * data['dx'].flat[0]
-        f  = data["x-velocity"][sl_right,1:-1,1:-1]/ds
-        f -= data["x-velocity"][sl_left ,1:-1,1:-1]/ds
-        if data.pf.dimensionality > 1:
-            ds = div_fac * data['dy'].flat[0]
-            f += data["y-velocity"][1:-1,sl_right,1:-1]/ds
-            f -= data["y-velocity"][1:-1,sl_left ,1:-1]/ds
-        if data.pf.dimensionality > 2:
-            ds = div_fac * data['dz'].flat[0]
-            f += data["z-velocity"][1:-1,1:-1,sl_right]/ds
-            f -= data["z-velocity"][1:-1,1:-1,sl_left ]/ds
-        new_field = np.zeros(data["x-velocity"].shape, dtype='float64')
-        new_field[1:-1,1:-1,1:-1] = f
-        return new_field
-    def _convertDivV(data):
-        return data.convert("cm")**-1.0
-    add_field("DivV", function=_DivV,
-               validators=[ValidateSpatial(ghost_zones=1,
-	                   fields=["x-velocity","y-velocity","z-velocity"])],
-              units=r"\rm{s}^{-1}", take_log=False,
-              convert_function=_convertDivV)
-
-Note that *slice* is simply a native Python object used for taking slices of 
-arrays or lists.  Another :class:`FieldValidator` object, ``ValidateSpatial`` 
-is given in the list of *validators* in the call to ``add_field`` with 
-*ghost_zones* = 1, specifying that the original grid be padded with one additional 
-cell from the neighboring grids on all sides.  The *fields* keyword simply 
-mandates that the listed fields be present.  With one ghost zone added to all sides 
-of the grid, the data fields (data["x-velocity"], data["y-velocity"], and 
-data["z-velocity"]) will have a shape of (NX+2, NY+2, NZ+2) inside of this function, 
-where the original grid has dimension (NX, NY, NZ).  However, when the final field 
-data is returned, the ghost zones will be removed and the shape will again be 
-(NX, NY, NZ).
-
-.. _derived-field-options:
-
-Saving Derived Fields
----------------------
-
-Complex fields can be time-consuming to generate, especially on large datasets. To mitigate this, yt provides a mechanism for saving fields to a backup file using the Grid Data Format. The next time you start yt, it will check this file and your field will be treated as native if present. 
-
-The code below creates a new derived field called "Entr" and saves it to disk:
-
-.. code-block:: python
-
-    from yt.mods import *
-    from yt.utilities.grid_data_format import writer
-
-    def _Entropy(field, data) :
-        return data["Temperature"]*data["Density"]**(-2./3.)
-    add_field("Entr", function=_Entropy)
-
-    pf = load('GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100')
-    writer.save_field(pf, "Entr")
-
-This creates a "_backup.gdf" file next to your datadump. If you load up the dataset again:
-
-.. code-block:: python
-
-    from yt.mods import *
-
-    pf = load('GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100')
-    data = pf.h.all_data()
-    print data["Entr"]
-
-you can work with the field exactly as before, without having to recompute it.
-
-Field Options
--------------
-
-The arguments to :func:`add_field` are passed on to the constructor of
-:class:`DerivedField`.  :func:`add_field` takes care of finding the arguments
-`function` and `convert_function` if it can, however.  There are a number of
-options available, but the only mandatory ones are ``name`` and possibly
-``function``.
-
-   ``name``
-     This is the name of the field -- how you refer to it.  For instance,
-     ``Pressure`` or ``H2I_Fraction``.
-   ``function``
-     This is a function handle that defines the field
-   ``convert_function``
-     This is the function that converts the field to CGS.  All inputs to this
-     function are mandated to already *be* in CGS.
-   ``units``
-     This is a mathtext (LaTeX-like) string that describes the units.
-   ``projected_units``
-     This is a mathtext (LaTeX-like) string that describes the units if the
-     field has been projected without a weighting.
-   ``display_name``
-     This is a name used in the plots, for instance ``"Divergence of
-     Velocity"``.  If not supplied, the ``name`` value is used.
-   ``take_log``
-     This is *True* or *False* and describes whether the field should be logged
-     when plotted.
-   ``particle_type``
-     Is this field a *particle* field?
-   ``validators``
-     (*Advanced*) This is a list of :class:`FieldValidator` objects, for instance to mandate
-     spatial data.
-   ``vector_field``
-     (*Advanced*) Is this field more than one value per cell?
-   ``display_field``
-     (*Advanced*) Should this field appear in the dropdown box in Reason?
-   ``not_in_all``
-     (*Advanced*) If this is *True*, the field may not be in all the grids.
-   ``projection_conversion``
-     (*Advanced*) Which unit should we multiply by in a projection?
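
As promised above, a short example that exercises a few of these options; the
``DensitySquared`` field is made up purely for illustration:

.. code-block:: python

    from yt.mods import *

    def _DensitySquared(field, data):
        return data["Density"]**2

    add_field("DensitySquared", function=_DensitySquared,
              units=r"\rm{g}^{2}/\rm{cm}^{6}",
              display_name="Density Squared",
              take_log=True)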
-
-How Do Units Work?
-------------------
-
-Everything is done under the assumption that all of the native Enzo fields that
-yt knows about are converted to cgs before being handed to any processing
-routines.
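
As a sketch of what this looks like in practice (the field name is our own;
``ThermalEnergy`` here is a specific energy, following Enzo conventions):

.. code-block:: python

    from yt.mods import *

    def _ThermalEnergyDensity(field, data):
        # both inputs are already CGS by the time this function sees them
        return data["Density"] * data["ThermalEnergy"]

    def _convertThermalEnergyDensity(data):
        # the inputs were already CGS, so there is nothing left to convert
        return 1.0

    add_field("ThermalEnergyDensity", function=_ThermalEnergyDensity,
              convert_function=_convertThermalEnergyDensity,
              units=r"\rm{erg}/\rm{cm}^{3}")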
-
-Which Enzo Fields Does yt Know About?
--------------------------------------
-
-* Density
-* Temperature
-* Gas Energy
-* Total Energy
-* [xyz]-velocity
-* Species fields: HI, HII, Electron, HeI, HeII, HeIII, HM, H2I, H2II, DI, DII, HDI
-* Particle mass, velocity, 
-

diff -r 69470f236db424621cf2ea3fd07b980d50f88602 -r a4f8002a77bf17c6f7bc987f652c427aebf496bb source/developing/index.rst
--- a/source/developing/index.rst
+++ b/source/developing/index.rst
@@ -21,6 +21,5 @@
    testing
    debugdrive
    creating_datatypes
-   creating_derived_fields
    creating_derived_quantities
    creating_frontend


https://bitbucket.org/yt_analysis/yt-doc/commits/bd16c81de056/
Changeset:   bd16c81de056
User:        MatthewTurk
Date:        2013-10-30 15:30:12
Summary:     Adding FAQ to top level, a reference to plugin-file, and fixing conf.py
Affected #:  3 files

diff -r a4f8002a77bf17c6f7bc987f652c427aebf496bb -r bd16c81de056bb7ccf2c80558f2406f5b7138dfc source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -152,7 +152,7 @@
 # Add any paths that contain custom static files (such as style sheets) here,
 # relative to this directory. They are copied after the builtin static files,
 # so a file named "default.css" will overwrite the builtin "default.css".
-html_static_path = ['_static', 'advanced/_static']
+html_static_path = ['_static', 'analyzing/_static']
 
 # If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
 # using the given strftime format.

diff -r a4f8002a77bf17c6f7bc987f652c427aebf496bb -r bd16c81de056bb7ccf2c80558f2406f5b7138dfc source/faq/index.rst
--- a/source/faq/index.rst
+++ b/source/faq/index.rst
@@ -174,6 +174,8 @@
 
   $ python setup.py install
 
+.. _plugin-file:
+
 What is the "Plugin File"?
 --------------------------
 

diff -r a4f8002a77bf17c6f7bc987f652c427aebf496bb -r bd16c81de056bb7ccf2c80558f2406f5b7138dfc source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -114,6 +114,16 @@
          <p class="linkdescr">What to do if you run into problems</p></td></tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="help/faq.html">FAQ</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Frequently Asked Questions</p>
+       </td>
+     </tr></table>
 
@@ -129,3 +139,4 @@
    developing/index
    reference/index
    Getting Help <help/index>
+   FAQ <faq/index>


https://bitbucket.org/yt_analysis/yt-doc/commits/cb06b1e0790e/
Changeset:   cb06b1e0790e
User:        ngoldbaum
Date:        2013-10-31 00:05:51
Summary:     Adding documentation on how to build the docs.
Affected #:  5 files

diff -r bd16c81de056bb7ccf2c80558f2406f5b7138dfc -r cb06b1e0790e12b1e94fbfd99ba8f347ff8dd1ac .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -2,7 +2,7 @@
 *.pyc
 .*.swp
 build/*
-source/api/generated/*
+source/reference/api/generated/*
 _temp/*
 **/.DS_Store
 RD0005-mine/*

diff -r bd16c81de056bb7ccf2c80558f2406f5b7138dfc -r cb06b1e0790e12b1e94fbfd99ba8f347ff8dd1ac extensions/notebook_sphinxext.py
--- a/extensions/notebook_sphinxext.py
+++ b/extensions/notebook_sphinxext.py
@@ -77,6 +77,8 @@
 
         # clean up png files left behind by notebooks.
         png_files = glob.glob("*.png")
+        fits_files = glob.glob("*.fits")
+        h5_files = glob.glob("*.h5")
         for file in png_files:
             os.remove(file)
 

diff -r bd16c81de056bb7ccf2c80558f2406f5b7138dfc -r cb06b1e0790e12b1e94fbfd99ba8f347ff8dd1ac extensions/pythonscript_sphinxext.py
--- a/extensions/pythonscript_sphinxext.py
+++ b/extensions/pythonscript_sphinxext.py
@@ -2,7 +2,7 @@
 from subprocess import Popen,PIPE
 from docutils.parsers.rst import directives
 from docutils import nodes
-import os, glob, shutil, uuid, re, string
+import os, glob, shutil, hashlib, re, string
 
 class PythonScriptDirective(Directive):
     """Execute an inline python script and display images.
@@ -18,13 +18,7 @@
 
     def run(self):
         # Construct paths
-        rst_file = self.state_machine.document.attributes['source']
-        rst_dir = os.path.abspath(os.path.dirname(rst_file))
-        source_dir = os.path.dirname(
-            os.path.abspath(self.state.document.current_source))
-        rel_dir = os.path.relpath(rst_dir, setup.confdir)
-        dest_dir = os.path.abspath(os.path.join(setup.app.builder.outdir,
-                                                source_dir))
+        dest_dir = setup.app.builder.outdir + "/_images"
 
         # working around a docutils/sphinx issue?
         dest_dir = string.replace(dest_dir, 'internal padding after ', '')
@@ -48,7 +42,8 @@
         images = sorted(glob.glob("*.png"))
         fns = []
         for im in images:
-            fns.append(str(uuid.uuid4()) + ".png")
+            hash = hashlib.sha256(open(im, 'rb').read()).hexdigest()
+            fns.append(hash + ".png")
             shutil.move(im, os.path.join(dest_dir, fns[-1]))
             print im, os.path.join(dest_dir, fns[-1])
 
@@ -72,12 +67,9 @@
 
     app.connect('build-finished', cleanup)
 
-# http://stackoverflow.com/questions/136505/searching-for-uuids-in-text-with-regex
-PATTERN = \
-    "[a-fA-F0-9]{8}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{12}"
-
+PATTERN = "[A-Fa-f0-9]{64}"
 def cleanup(app, exception):
-    """ Cleanup all png files with UUID filenames in the source """
+    """ Cleanup all png files with sha256 filenames in the source """
     for root,dirnames,filenames in os.walk(app.srcdir):
         matches = re.findall(PATTERN, "\n".join(filenames))
         for match in matches:

diff -r bd16c81de056bb7ccf2c80558f2406f5b7138dfc -r cb06b1e0790e12b1e94fbfd99ba8f347ff8dd1ac source/developing/building_the_docs.rst
--- /dev/null
+++ b/source/developing/building_the_docs.rst
@@ -0,0 +1,95 @@
+.. _docs_build:
+
+=================
+Building the Docs
+=================
+
+The yt documentation makes heavy use of the sphinx documentation automation
+suite.  Sphinx, written in python, was originally created for the documentation
+of the python project and has many nice capabilities for managing the
+documentation of python code.
+
+While much of the yt documentation is static text, we make heavy use of
+cross-referencing with API documentation that is automatically generated at
+build time by sphinx.  We also use sphinx to run code snippets and embed
+resulting images and example data.
+
+yt Sphinx extensions
+--------------------
+
+The documentation makes heavy use of custom sphinx extensions to transform
+recipes, notebooks, and inline code snippets into python scripts, IPython_
+notebooks, or notebook cells that are executed when the docs are built.
+
+To do this, we use IPython's nbconvert module to transform notebooks into
+HTML.  To simplify versioning of the notebook JSON format, we store notebooks in
+an unevaluated state.  To generate evaluated notebooks, which could include
+arbitrary output (text, images, HTML), we make use of runipy_, which provides
+facilities to script notebook evaluation.
+
+.. _runipy: https://github.com/paulgb/runipy
+.. _IPython: http://ipython.org/
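
A rough sketch of what scripted evaluation with runipy_ looks like (the exact
API may vary between runipy and IPython versions):

.. code-block:: python

   from runipy.notebook_runner import NotebookRunner
   from IPython.nbformat.current import read, write

   # read an unevaluated notebook, run every cell, and write the
   # evaluated result back out
   notebook = read(open("example.ipynb"), "json")
   runner = NotebookRunner(notebook)
   runner.run_notebook()
   write(runner.nb, open("example_evaluated.ipynb", "w"), "json")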
+
+Dependencies
+------------
+
+To build the docs, you will need yt, IPython, runipy, and all supplementary yt
+analysis modules installed. The following dependencies were used to generate the
+yt documentation during the release of yt 2.6 in late 2013.
+
+- Sphinx 1.1.3
+- IPython 1.1
+- runipy_ (git hash f74458c2877)
+- pandoc 1.11.1
+- Rockstar halo finder 0.99.6
+- SZpack_ 1.1.1
+
+.. _SZpack: http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html
+
+You will also need the full yt suite of `yt test data
+<http://yt-project.org/data/>`_, including the larger datasets that are not used
+in the answer tests.
+
+Building the docs
+-----------------
+
+First, you will need to ensure that your testing configuration is properly
+configured and that all of the yt test data is in the testing directory.  See
+:ref:`run_answer_testing` for more details on how to set up the testing
+configuration.
+
+Next, clone the yt-doc repository, navigate to the root of the repository, and
+do :code:`make html`.
+
+.. code-block:: bash
+
+   hg clone https://bitbucket.org/yt_analysis/yt-doc ./yt-doc
+   cd yt-doc
+   make html
+
+If all of the dependencies are installed and all of the test data is in the
+testing directory, this should churn away for a while and eventually generate a
+docs build.  This process is lengthy but shouldn't last more than an hour.  We
+suggest setting :code:`suppressStreamLogging = True` in your yt configuration
+(See :ref:`configuration-file`) to suppress large amounts of debug output from
+yt.
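
The same setting can also be applied from within a script; something along
these lines should work with the yt 2.x configuration interface:

.. code-block:: python

   # set this before yt.mods is imported so the logger picks it up
   from yt.config import ytcfg
   ytcfg["yt", "suppressStreamLogging"] = "True"

   from yt.mods import *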
+
+To clean the docs build, use :code:`make clean`.  By default, :code:`make clean`
+will not delete the autogenerated API docs, so use :code:`make fullclean` to
+delete those as well.
+
+
+Quick docs builds
+-----------------
+
+Clearly, building the complete docs is something of an undertaking.  If you are
+adding new static content, building the complete docs is probably
+overkill.  To skip some of the lengthier operations, you can do the following
+from the bash prompt:
+
+.. code-block:: bash
+
+   export READTHEDOCS=True
+
+This variable is set for automated builds on the free ReadTheDocs service but
+can be used by anyone to force a quick, minimal build.

diff -r bd16c81de056bb7ccf2c80558f2406f5b7138dfc -r cb06b1e0790e12b1e94fbfd99ba8f347ff8dd1ac source/developing/index.rst
--- a/source/developing/index.rst
+++ b/source/developing/index.rst
@@ -23,3 +23,4 @@
    creating_datatypes
    creating_derived_quantities
    creating_frontend
+   building_the_docs


https://bitbucket.org/yt_analysis/yt-doc/commits/73a02306e5dd/
Changeset:   73a02306e5dd
User:        ngoldbaum
Date:        2013-10-31 02:20:26
Summary:     Reworking how images work in python-script, this caches better.
Affected #:  1 file

diff -r cb06b1e0790e12b1e94fbfd99ba8f347ff8dd1ac -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 extensions/pythonscript_sphinxext.py
--- a/extensions/pythonscript_sphinxext.py
+++ b/extensions/pythonscript_sphinxext.py
@@ -2,7 +2,7 @@
 from subprocess import Popen,PIPE
 from docutils.parsers.rst import directives
 from docutils import nodes
-import os, glob, shutil, hashlib, re, string
+import os, glob, base64
 
 class PythonScriptDirective(Directive):
     """Execute an inline python script and display images.
@@ -17,15 +17,6 @@
     has_content = True
 
     def run(self):
-        # Construct paths
-        dest_dir = setup.app.builder.outdir + "/_images"
-
-        # working around a docutils/sphinx issue?
-        dest_dir = string.replace(dest_dir, 'internal padding after ', '')
-
-        if not os.path.exists(dest_dir):
-            os.makedirs(dest_dir) # no problem here for me, but just use built-ins
-
         # Construct script from cell content
         content = "\n".join(self.content)
         with open("temp.py", "w") as f:
@@ -41,12 +32,11 @@
 
         images = sorted(glob.glob("*.png"))
         fns = []
+        text = ''
         for im in images:
-            hash = hashlib.sha256(open(im, 'rb').read()).hexdigest()
-            fns.append(hash + ".png")
-            shutil.move(im, os.path.join(dest_dir, fns[-1]))
-            print im, os.path.join(dest_dir, fns[-1])
-
+            text += get_image_tag(im)
+            os.remove(im)
+            
         os.remove("temp.py")
 
         code = content
@@ -54,10 +44,10 @@
         literal = nodes.literal_block(code,code)
         literal['language'] = 'python'
 
-        images = []
-        for fn in fns:
-            images.append(nodes.image(uri="./"+fn, width="600px"))
-        return [literal] + images
+        attributes = {'format': 'html'}
+        img_node = nodes.raw('', text, **attributes)
+        
+        return [literal, img_node]
 
 def setup(app):
     app.add_directive('python-script', PythonScriptDirective)
@@ -65,12 +55,7 @@
     setup.config = app.config
     setup.confdir = app.confdir
 
-    app.connect('build-finished', cleanup)
-
-PATTERN = "[A-Fa-f0-9]{64}"
-def cleanup(app, exception):
-    """ Cleanup all png files with sha256 filenames in the source """
-    for root,dirnames,filenames in os.walk(app.srcdir):
-        matches = re.findall(PATTERN, "\n".join(filenames))
-        for match in matches:
-            os.remove(os.path.join(root, match+".png"))
+def get_image_tag(filename):
+    with open(filename, "rb") as image_file:
+        encoded_string = base64.b64encode(image_file.read())
+        return '<img src="data:image/png;base64,%s" width="600"><br>' % encoded_string


https://bitbucket.org/yt_analysis/yt-doc/commits/2f145817efd0/
Changeset:   2f145817efd0
User:        ngoldbaum
Date:        2013-10-31 02:58:47
Summary:     Removing the vestigial orientation and advanced sections.
Affected #:  12 files

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/advanced/index.rst
--- a/source/advanced/index.rst
+++ /dev/null
@@ -1,21 +0,0 @@
-.. _advanced:
-
-Advanced yt Usage
-=================
-
-yt has been designed to be flexible, with several entry points.
-
-.. toctree::
-   :maxdepth: 2
-
-   installing
-   plugin_file
-   parallel_computation
-   creating_derived_quantities
-   creating_datatypes
-   debugdrive
-   external_analysis
-   developing
-   testing
-   creating_frontend
-   reason_architecture

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/advanced/installing.rst
--- a/source/advanced/installing.rst
+++ /dev/null
@@ -1,233 +0,0 @@
-.. _installing-yt:
-
-Installing yt
-=============
-
-.. _automated-installation:
-
-Automated Installation
-----------------------
-
-The recommended method for installing yt is to create an isolated environment
-using the installation script.  The yt homepage will always contain a link to
-the most up-to-date version, but you should be able to obtain it by executing:
-
-.. code-block:: bash
-
-   $ wget http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
-   $ bash install_script.sh
-
-at a command prompt.  This script will download the **stable** version of
-``yt``, along with all of its affiliated dependencies.  It will tell you at the
-end various variables you need to set in order to ensure that ``yt`` works
-correctly.  If you run into *any* problems with the installation script, that
-is considered a bug we will fix, and we encourage you to write to `yt-users
-<http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_.
-
-.. _manual-installation:
-
-Manual Installation
--------------------
-
-If you choose to install ``yt`` yourself, you will have to ensure that the
-correct dependencies have been met.  A few are optional, and one is necessary
-if you wish to install the latest development version of ``yt``, but here is a list
-of the various necessary items to build ``yt``.
-
-Installation of the various libraries is a bit beyond the scope of this
-document; if you run into any problems, your best course of action is to
-consult with the documentation for the individual projects.
-
-.. _dependencies:
-
-Required Libraries
-++++++++++++++++++
-
-This is a list of libraries installed by the installation script.  The version
-numbers are those used by the installation script -- ``yt`` may work with lower
-versions or higher versions, but these are known to work.
-
- * Python-2.7.3, but not (yet) 3.0 or higher
- * NumPy-1.6.1 (at least 1.4.1)
- * HDF5-1.8.9 or higher (at least 1.8.7)
- * h5py-2.1.0 (2.0 fixes a major bug)
- * Matplotlib-1.2.0 or higher
- * Mercurial-2.5.1 or higher (anything higher than 1.5 works)
- * Cython-0.17.1 or higher (at least 0.15.1)
-
-Optional Libraries
-++++++++++++++++++
-
-These libraries are all optional, but they are highly recommended.
-
- * Forthon-0.8.10 or higher (for halo finding and correlation functions)
- * libpng-1.5.12 or higher (for raw image outputting)
- * FreeType-2.4.4 or higher (for text annotation on raw images)
- * IPython-0.13.1 (0.10 will also work)
- * PyX-0.11.1
- * zeromq-2.2.0 (needed for IPython notebook)
- * pyzmq-2.2.11 (needed for IPython notebook)
- * tornado-2.2  (needed for IPython notebook)
- * sympy-0.7.2 
- * nose-1.2.1
-
-If you are attempting to install manually, and you are not installing into a
-fully-isolated location, you should probably use your system's package
-management system as much as possible.
-
-Once you have successfully installed the dependencies, you should clone the
-primary ``yt`` repository.  
-
-You can clone the repository with this mercurial command:
-
-.. code-block:: bash
-
-   $ hg clone http://hg.yt-project.org/yt ./yt-hg
-   $ cd yt-hg
-   $ hg up -C stable
-
-This will create a directory called ``yt-hg`` that contains the entire version
-control history of ``yt``.  If you would rather use the branch ``yt``, which is
-the current development version, issue the command ``hg up -C yt``.
-
-To compile ``yt``, you will have to specify the location to the HDF5 libraries,
-and optionally the libpng and freetype libraries.  To do so, put the "prefix"
-for the installation location into the files ``hdf5.cfg`` and (optionally)
-``png.cfg`` and ``freetype.cfg``.  For instance, if you installed into
-``/opt/hdf5/`` you would put ``/opt/hdf5/`` into ``hdf5.cfg``.  Once you have
-specified the location to these libraries, you can execute the command:
-
-.. code-block:: bash
-
-   $ python2.7 setup.py install
-
-from the ``yt-hg`` directory.  Alternately, you can replace ``install`` with
-``develop`` if you anticipate making any modifications to the code; ``develop``
-simply means that the source will be read from that directory, whereas
-``install`` will copy it into the main Python package directory.
-
-That should install ``yt`` the library as well as the commands ``iyt`` and
-``yt``.  Good luck!
-
-Package Management System Installation
---------------------------------------
-
-While the installation script provides a complete stack of utilities,
-integration into your existing operating system can often be desirable.
-
-Ubuntu PPAs
-+++++++++++
-
-Mike Kuhlen has kindly provided PPAs for Ubuntu. If you're running Ubuntu, you
-can install these easily:
-
-.. code-block:: bash
-
-   $ sudo add-apt-repository ppa:kuhlen
-   $ sudo apt-get update
-   $ sudo apt-get install yt
-
-If you'd like a development branch of yt, you can change yt for yt-devel to get
-the most recently packaged development branch.
-
-MacPorts
-++++++++
-
-Thomas Robitaille has kindly provided a `MacPorts <http://www.macports.org/>`_
-installation of yt, as part of his `MacPorts for Python Astronomers
-<http://astrofrog.github.com/macports-python/>`_.  To activate, simply type:
-
-Thanks very much, Thomas!
-
-
-.. _community-installations:
-
-Community Installations
------------------------
-
-Recently, yt has been added as a module on several supercomputers.  We hope to
-increase this list through partnership with supercomputer centers.  You should
-be able to load an appropriate yt module on these systems:
-
- * NICS Kraken
- * NICS Nautilus
-
-.. _updating-yt:
-
-Updating yt
-===========
-
-.. _automated-update:
-
-Automated Update
-----------------
-
-The recommended method for updating yt is to run the update tool at a command 
-prompt:
-
-.. code-block:: bash
-
-   $ yt update
-
-This script will identify which repository you're using (stable, development, 
-etc.), connect to the yt-project.org server, download any recent changesets 
-for your version and then recompile any new code that needs 
-it (e.g. cython, rebuild).  This same behavior is achieved manually by running:
-
-.. code-block:: bash
-
-   $ cd $YT_DEST/src/yt-hg 
-   $ hg pull
-   $ python setup.py develop
-
-Note that this automated update will fail if you have made modifications to
-the yt code base that you have not yet committed.  If this occurs, identify
-your modifications using 'hg status', and then commit them using 'hg commit',
-in order to bring the repository back to a state where you can automatically
-update the code as above.  On the other hand, if you want to wipe out your
-uncommitted changes and just update to the latest version, you can type: 
-
-.. code-block:: bash
-
-   $ cd $YT_DEST/src/yt-hg 
-   $ hg pull
-   $ hg up -C      # N.B. This will wipe your uncommitted changes! 
-   $ python setup.py develop
-
-If you run into *any* problems with the update utility, it should be considered
-a bug, and we would love to hear about it so we can fix it.  Please inform us 
-through the bugsubmit utility or through the yt-users mailing list.
-
-Updating yt's Dependencies
---------------------------
-
-If you used the install script to originally install yt, updating the various 
-libraries and modules yt depends on can be done by running:
-
-.. code-block:: bash
-
-   $ yt update --all
-
-For custom installs, you will need to update the dependencies by hand.
-
-Switching Between Branches in yt
-================================
-
-.. _switching-versions:
-
-If you are running the stable version of the code, and you want to switch 
-to using the development version of the code (or vice versa), you can merely
-follow a few steps (without reinstalling all of the source again):
-
-.. code-block:: bash
-
-   $ cd $YT_DEST/src/yt-hg 
-   $ hg pull
-   <commit all changes or they will be lost>
-   $ hg up -C <branch>     # N.B. This will wipe your uncommitted changes! 
-   $ python setup.py develop
-
-If you want to switch to using the development version of the code, use: 
-"yt" as <branch>, whereas if you want to switch to using the stable version
-of the code, use: "stable" as <branch>.

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/interacting/command-line.rst
--- a/source/interacting/command-line.rst
+++ b/source/interacting/command-line.rst
@@ -3,31 +3,18 @@
 Command-line Functions
 ----------------------
 
-The developers of yt realize that there is a lot more to analyzing code 
-than just making pretty pictures.  That is why we included several easy-to-call
-functions that could be executed from a command-line prompt for sharing code 
-and images with others, using our GUI Reason, manipulating your data 
-google-maps style, updating yt's codebase and more.  To get a quick list of 
-what is available, just type:
+The :code:`yt` command-line tool allows you to access some of yt's basic
+functionality without opening a python interpreter.  The tool is a collection of
+subcommands.  These can quickly make plots of slices and projections through a
+dataset, update yt's codebase, print basic statistics about a dataset, launch
+an IPython notebook session, and more.  To get a quick list of what is
+available, just type:
 
 .. code-block:: bash
 
    yt -h
 
-This yields all of the subcommands.  To execute any such function, 
-simply run:
-
-.. code-block:: bash
-
-   yt <subcommand>
-
-Finally, to identify the options associated with any of these subcommands, run:
-
-.. code-block:: bash
-
-   yt <subcommand> -h
-
-Let's go through each subcommand.
+This will print the list of available subcommands:
 
 .. code-block:: bash
 
@@ -60,6 +47,69 @@
     upload_image        Upload an image to imgur.com. Must be PNG.
 
 
+To execute any such function, simply run:
+
+.. code-block:: bash
+
+   yt <subcommand>
+
+Finally, to identify the options associated with any of these subcommands, run:
+
+.. code-block:: bash
+
+   yt <subcommand> -h
+
+Plotting from the command line
+------------------------------
+
+First, we'll discuss plotting from the command line, then we will give a brief
+summary of the functionality provided by each command line subcommand. This
+example uses the :code:`DD0010/moving7_0010` dataset distributed in the yt
+mercurial repository.
+
+First let's see what our options are for plotting:
+
+.. code-block:: bash
+
+  $ yt plot --help
+
+There are many!  We can choose whether we want a slice (default) or a
+projection (``-p``), the field, the colormap, the center of the image, the
+width and unit of width of the image, the limits, the weighting field for
+projections, and on and on.  By default the plotting command will execute the
+same thing along all three axes, so keep that in mind if it takes three times
+as long as you'd like!  The center of a slice defaults to the center of
+the domain, so let's just give that a shot and see what it looks like:
+
+.. code-block:: bash
+
+  $ yt plot DD0010/moving7_0010
+
+Well, that looks pretty bad!  What has happened here is that the center of the
+domain only has some minor shifts in density, so the plot is essentially
+incomprehensible.  Let's try it again, but instead of slicing, let's project.
+This is a line integral through the domain, and for the density field this
+becomes a column density:
+
+.. code-block:: bash
+
+  $ yt plot -p DD0010/moving7_0010
+
+Now that looks much better!  Note that all three axes' projections appear
+nearly indistinguishable, because of how the two spheres are located in the
+domain.  We could center our domain on one of the spheres and take a slice, as
+well.  Now let's see what the domain looks like with grids overlaid, using the
+``--show-grids`` option:
+
+.. code-block:: bash
+
+  $ yt plot --show-grids -p DD0010/moving7_0010
+
+We can now see all the grids in the field of view.
+
+Command-line subcommand summary
+-------------------------------
+
 help
 ++++
 

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/orientation/first_steps.rst
--- a/source/orientation/first_steps.rst
+++ /dev/null
@@ -1,235 +0,0 @@
-.. _first_steps:
-
-First Steps with yt
--------------------
-
-Starting Up yt
-++++++++++++++
-
-"Starting up" yt is a bit of a misnomer.  What we're going to do is actually
-start up Python, and from there, we'll load up yt as a library, like we did for
-NumPy earlier.  yt provides a primary entry point (yt.mods), through which we
-are attempting to ensure backwards compatibility for quite some time to come.
-
-At your shell prompt, type:
-
-.. code-block:: bash
-
-   $ cd $YT_DEST/src/yt-hg/tests
-   $ python
-
-Once Python has started up, we will load yt with this command::
-
-   >>> from yt.mods import *
-
-Now all of the primary functions and data objects that yt provides will be
-available to you.  We now use the ``load`` function, which accepts a filename
-and then attempts to guess the file type.  If it's able to figure out what kind
-of simulation output it is, it will parse the parameter file to determine a
-number of components, and then return to you an object.  The convention in the
-yt documentation is to call this object ``pf`` but you can call it whatever you
-like.  We'll load up the dataset that comes with yt::
-
-   >>> pf = load("DD0010/moving7_0010")
-
-The parameter file itself has some useful information in it, but it's also
-designed to be quite lightweight.  This comes in handy with very large
-simulations, where you may wish to know the current time of the simulation, or
-some other parameter available in the parameter file, but you don't necessarily
-want yt to parse the full hierarchy of grids and create hundreds, if not
-hundreds of thousands, of objects.  For instance, though, you can get some basic
-information out::
-
-   >>> print pf.domain_dimensions
-
-This shows you the dimensions of the domain, in terms of the coarsest mesh that
-covers the domain.  (In Enzo this is called the 'root grid.')
-
-To access information about the fluid quantities in the simulation, we rely on
-the "hierarchy" object.  This hierarchy object contains the entire geometry of
-the simulation: all of the grid patches, their parentage relationships, and the
-fluid states of those grids.  When you first access the hierarchy object, yt
-will construct in memory a representation of that geometry, determine the
-available (on-disk and yt-calculated) fluid quantities, and then create a
-number of objects in memory to represent them.  We will now
-ask yt for the hierarchy object from our sample dataset::
-
-   >>> pf.hierarchy
-
-You can use a shorthand for ``hierarchy``, as well::
-
-   >>> pf.h
-
-The first thing we can do is ask yt for some statistics about our simulation,
-namely, how many grid patches we have at each level, how many cells, and a
-little bit about the smallest grid resolution element.  This is through the
-``print_stats`` function that the hierarchy provides::
-
-   >>> pf.h.print_stats()
-
-As you can see, this simulation has only a handful of grids, but it does have 7
-levels of refinement past the coarsest level.  The hierarchy object has a
-number of other properties that describe the field state of the gas.  For
-instance, we can find out which fields exist on disk for a given simulation
-output::
-
-   >>> print pf.h.field_list
-
-Additionally, yt will attempt to guess which fields it can generate from the
-simulation, so-called "derived fields."  This process occurs when the data is
-loaded, so it may not be a complete listing.::
-
-   >>> print pf.h.derived_field_list
-
-Finally, the hierarchy function that I personally use the most is the one that
-finds the maximum value of a given field in a simulation.  This
-function returns both the value and the location of the maximum::
-
-   >>> value, location = pf.h.find_max("Density")
-
-.. _grid_inspection:
-
-Grid Inspection
-+++++++++++++++
-
-This section is optional, and can be skipped.
-
-Before we move on to generalized data selection, it's worthwhile to examine
-individual grid patches.  This can be useful for debugging as well as for
-detailed inspection.  While there are several ways to select grids on a given
-level, or within a given region, we'll simply look at grids selected just by
-their index.  The hierarchy object possesses an array of grids::
-
-   >>> print pf.h.grids
-
-This grid array can be indexed, and we will choose to examine the first grid in
-that array::
-
-   >>> my_grid = pf.h.grids[0]
-
-Each GridPatch object has a number of attributes we can examine.  To begin
-with, it knows its dimensions and its position within the simulation domain::
-
-   >>> print my_grid.Level
-   >>> print my_grid.ActiveDimensions
-   >>> print my_grid.LeftEdge
-   >>> print my_grid.RightEdge
-   >>> print my_grid.dds
-
-It also has information about its "parent" grid -- which in some simulation
-codes will in fact be a list of parents, within which it resides -- and about
-any higher-resolution grids that are contained in whole or in part within it.::
-
-   >>> print my_grid.Parent
-   >>> print my_grid.Children
-
-Each element in these arrays is another grid object -- each of which possesses
-the same set of attributes as the ``my_grid`` object.  For instance::
-
-   >>> print my_grid.Children[0].LeftEdge
-   >>> print my_grid.Children[0].RightEdge
-
-Grid objects also respect the yt data access idiom.  We can request an array of
-the Density within a grid object.::
-
-   >>> print my_grid["Density"]
-
-The grid object also possesses information about which of its zones have been
-refined and are available at a finer resolution, within the ``child_mask``
-attribute.  This attribute is an array of 0's and 1's, which is set to 0
-everywhere that the grid has been further refined by one of the elements of
-the ``Children`` list.  You can, for instance, examine the refinement fraction
-of a grid using NumPy operations to multiply the dimensions and to sum the
-child mask::
-
-   >>> print my_grid.child_mask.sum() / float(my_grid.ActiveDimensions.prod())
-
-As you can see, only a small fraction of our grid has been refined!
-
-Data Containers and Data Selection
-++++++++++++++++++++++++++++++++++
-
-The hierarchy object, acting as the primary interface to the geometry of the
-simulation, is also the provider of all the yt-provided operators for cutting
-or selecting data.  When yt was first conceived, it was designed to be a very
-simple way to make slices and projections.  As time went on, it became clear
-that some degree of subselection of data was important, if not mandatory, and 
-so objects that selected data based on geometric bounds or fluid quantity
-values were added.
-
-The most straightforward object we can create is a sphere.  We'll center this
-sphere at the most dense point, which we found above, and we'll give it a
-radius of 10 kiloparsecs.::
-
-   >>> my_sphere = pf.h.sphere(location, 10.0/pf['kpc'])
-
-This function can accept a few more arguments, but this covers the essentials.
-We supply it a center and a radius.  The radius is specified in the units
-native to the simulation domain; this is not terribly useful, so we have used a
-unit conversion to convert from 10 kiloparsecs into the native simulation
-units.  You can do this with a number of different units (all of which are
-actually listed in the ``print_stats`` output) and the opposite (multiplication
-by the conversion factor) works for conversion from code units back to a
-physical unit.  
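
To make the idiom concrete (the variable names here are our own)::

   >>> radius_in_code_units = 10.0 / pf['kpc']   # kpc -> code units
   >>> radius_in_kpc = radius_in_code_units * pf['kpc']  # and back again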
-
-There are a number of different objects that can be created, all of which are
-described in the documentation.  These include spheres, rectangular prisms,
-rays, slices, arbitrarily-aligned cylinders, and several others.  It is
-relatively simple to add a new type of data container, which has also been
-detailed in the documentation.
-
-We now have an object, ``my_sphere``, which functions as a data container.  We
-can access the data in the way described above::
-
-   >>> my_sphere["Density"]
-
-This will read all the data off disk (or generate it, if it's a derived field)
-and print out a representation of it.  It's then stored in the data object and
-can be accessed again without having to read it from disk.
-
-However, most operations in yt are designed so that the entire contents of the
-sphere do not have to be in memory.  For instance, to calculate the center of
-mass, one could imagine doing something like this::
-
-   >>> M_i = my_sphere["CellMassMsun"]
-   >>> M = M_i.sum()
-   >>> com_x = (my_sphere["x"] * M_i).sum()/M
-   >>> com_y = (my_sphere["y"] * M_i).sum()/M
-   >>> com_z = (my_sphere["z"] * M_i).sum()/M
-
-But for this to work, all of the arrays listed would have to be held in memory,
-even though the algorithm operates on each element individually.  Clearly, the
-mechanism described above simply won't work for very large data objects!  It
-works for our sphere because we have only a couple thousand points, but if
-we're looking at a galaxy cluster size halo in a high-resolution dataset, for
-instance, this may simply run the computer out of memory.
-
-To get past this, yt provides a mechanism for conducting operations on data
-containers that removes the need for both manual memory management and manual
-parallelism.  This functionality is rolled into the broad term "derived
-quantities" (search for this in the documentation for more information) but it
-can really be thought of as any operation that can be decomposed into a
-pre-calculation step and a reduction step.  For the center of mass, the first
-step is to calculate the values of ``M_i`` and ``M_i * x`` (as well as ``y``
-and ``z``) for each grid patch, then to sum all of these and conduct the
-division to get the overall center of mass.
-
-yt provides a number of pre-defined derived quantities, but you can also write
-your own.  For now, let's just take a look at a few of them.  For starters,
-there's the center of mass quantity.  We access these quantities from the
-``quantities`` object that hangs off every data container, like so::
-
-   >>> my_sphere.quantities["CenterOfMass"]()
-
-What this does is to access the derived quantity center of mass, and then call
-it.  I know it looks a little funny, with the empty parenthesis after the
-closing bracket, but this is necessary -- and while "CenterOfMass" doesn't take
-any arguments, some of the derived quantities do.  For instance, yt also
-provides a derived quantity for finding the extrema of a set of fields::
-
-   >>> print my_sphere.quantities["Extrema"](["x-velocity", "Density"])
-
-All of these operations, by default, will operate in a memory conservative
-fashion.  Additionally, because they work on a grid-by-grid basis, they can be
-transparently parallelized.  Discussing parallelism is outside the scope of
-this document, but it's discussed at some length in the yt documentation.

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/orientation/how_yt_thinks.rst
--- a/source/orientation/how_yt_thinks.rst
+++ /dev/null
@@ -1,57 +0,0 @@
-.. _how-yt-thinks-about-data:
-
-How yt Thinks About Data
-------------------------
-
-In this section, we'll briefly introduce the ideas behind how yt operates on
-and thinks about data internally.  For a more comprehensive discussion, see the
-`yt method paper <http://adsabs.harvard.edu/abs/2011ApJS..192....9T>`_,
-particularly Sections 2.1 and 2.2.
-
-When using yt, there are two main mechanisms for interacting with data: the
-first is to create simple plots, where yt makes a number of assumptions about
-how you want to process the data; the second is a more complicated mode where you
-exert greater control over the selection, transformation and visualization of
-data.  This could involve choosing a subset of the simulation domain, or
-providing new ways of processing data fields written by the simulation code, or
-even manually visualizing data yourself without using any of the built-in yt
-mechanisms for visualization.  Furthermore, yt has been designed such that, if
-at all possible, all units are returned in cgs.
-
-Before we get into too much of that, it's worthwhile to mention that Python
-provides facilities for data access from objects in simple ways.  yt makes
-extensive use of this, and in fact every “data container” that yt provides
-allows you to treat it like a dictionary (as described above), which will lead
-to it either reading the appropriate data from the disk or generating that data
-in memory.  This sounds a bit complicated, but what it essentially means is
-that you can perform this operation::
-
-   >>> my_data_container["Density"]
-
-and all of the "Density" data contained within that data container will be read
-from disk, converted to CGS if necessary, and returned to you.
-
-What this kind of access leads to, particularly since the data is only read and
-assembled in memory when it is requested, is an idea that data objects are
-really just conduits.  Data flows through them from the disk to a processing
-step, and they are neutral to what that processing step is.
-
-Furthermore, yt is neutral to the type of field requested.  Simulation codes
-often only store the fluid quantities that cannot be regenerated: density,
-velocity, internal energy, etc.  However, from these quantities, other fluid
-quantities can be constructed.  Perhaps the simplest example would be that of a
-mass-fraction.  A simulation code such as Enzo may store the density of
-molecular hydrogen, but often during post-processing the more interesting
-quantity is the molecular hydrogen fraction; this can be generated trivially by
-dividing the molecular hydrogen density by the total fluid density.  yt is
-designed so that you can define functions that describe how to create a field,
-and then that field becomes "native" -- it will be accessible in the same
-manner as any field stored by the simulation.  yt also comes with a number of
-these fields pre-defined.
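
As a sketch of the idea (the field names follow Enzo conventions and may
differ for other codes)::

   >>> def _H2Fraction(field, data):
   ...     return data["H2I_Density"] / data["Density"]
   >>> add_field("H2Fraction", function=_H2Fraction, take_log=False)
   >>> my_data_container["H2Fraction"]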
-
-Some (admittedly larger and more complex) data visualization and analysis
-packages encourage or even require quite a lot of pipeline construction, using
-words like Filter and Process and so on.  With yt we've tried to avoid a whole
-lot of overhead in terms of creating pipes, dealing with zones, dealing with
-interpolation, all of that -- and simply stuck to creating fields simply,
-selecting data simply, and then processing it and saving out plots.

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/orientation/index.rst
--- a/source/orientation/index.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-.. _orientation:
-
-Quickstart Guide
-================
-
-This quickstart guide to using yt begins by showing you how to install yt and
-its dependencies, shows you how to make some simple plots, and then moves to
-a brief explanation of how to use python and how yt's framework fits into that.
-Finally, it addresses how to ask questions of your data using yt.
-
-But, before getting too far in, here are a few resources that may come in
-handy along the way.  (These go from most-specific to least-specific!)
-
-    * `yt-users mailing list <http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_
-    * `Numpy docs <http://docs.numpy.org/>`_
-    * `Matplotlib docs <http://matplotlib.sf.net>`_
-    * `Python quickstart <http://docs.python.org/tutorial/>`_
-    * `Learn Python the Hard Way <http://learnpythonthehardway.org/index>`_
-    * `Byte of Python <http://www.swaroopch.com/notes/Python>`_
-    * `Dive Into Python <http://diveintopython.org>`_
-
-If you encounter any problems here, or anywhere in yt, please visit the 
-:ref:`asking-for-help` page to figure out a solution.
-
-.. toctree::
-   :maxdepth: 2
-
-   installing
-   simple_data_inspection
-   python_introduction
-   how_yt_thinks
-   first_steps
-   making_plots
-   where_to_go

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/orientation/making_plots.rst
--- a/source/orientation/making_plots.rst
+++ /dev/null
@@ -1,210 +0,0 @@
-Making Plots
-------------
-
-Slices
-^^^^^^
-
-Examining data by hand and looking at individual quantities one at a time can be
-interesting and productive, but yt also provides a set of visualization tools
-that you can use. We'll start by showing you how to make visualizations of
-slices and projections through your data.  We will then move on to demonstrate
-how to make analysis plots, including phase diagrams and profiles.
-
-The quickest way to plot a slice of a field through your data is to use
-:class:`~yt.visualization.plot_window.SlicePlot`.  Say we want to visualize a
-slice through the Density field along the z-axis centered on the center of the
-simulation box in a simulation dataset we've opened and stored in the parameter
-file object ``pf``.  This can be accomplished with the following command:
-
-.. code-block:: python
-
-   >>> slc = SlicePlot(pf, 'z', 'Density')
-   >>> slc.save()
-
-These two commands will create a slice object and store it in a variable we've
-called ``slc``.  We then call the ``save()`` function that is associated with
-the slice object.  This automatically saves the plot in png image format with an
-automatically generated filename.  If you don't want the slice object to stick
-around, you can accomplish the same thing in one line:
-
-.. code-block:: python
-   
-   >>> SlicePlot(pf, 'z', 'Density').save()
-
-It's nice to keep the slice object around if you want to modify the plot.  By
-default, the plot width will be set to the size of the simulation box.  To zoom
-in by a factor of ten, you can call the zoom function attached to the slice
-object:
-
-.. code-block:: python
-
-   >>> slc = SlicePlot(pf, 'z', 'Density')
-   >>> slc.zoom(10)
-   >>> slc.save('zoom')
-
-This will save a new plot to disk with a different filename -- prepended with
-'zoom' instead of the name of the parameter file. If you want to set the width
-manually, you can do that as well. For example, the following sequence of
-commands will create a slice, set the width of the plot to 10 kiloparsecs, and
-save it to disk.
-
-.. code-block:: python
-
-   >>> slc = SlicePlot(pf, 'z', 'Density')
-   >>> slc.set_width((10,'kpc'))
-   >>> slc.save('10kpc')
-
-The SlicePlot also optionally accepts the coordinate to center the plot on and
-the width of the plot:
-
-.. code-block:: python
-
-   >>> SlicePlot(pf, 'z', 'Density', center=[0.2, 0.3, 0.8], 
-   ...           width = (10,'kpc')).save()
-
-The center must be given in code units.  Optionally, you can supply 'c' or 'm'
-for the center.  These two choices will center the plot on the center of the
-simulation box and the coordinate of the maximum density cell, respectively.
-
-One can also use the SlicePlot to make annotated plots.  The following commands
-will create a slice, annotate it by marking the grid boundaries, and save the
-plot to disk:
-
-.. code-block:: python
-
-   >>> slc = SlicePlot(pf, 'z', 'Density')
-   >>> slc.annotate_grids()
-   >>> slc.save()
-
-There are a number of annotations available.  The full list is available in
-:ref:`callbacks`.
-
-Projections
-^^^^^^^^^^^^
-
-It can be limiting to only look at slices through 3D data.  In most cases, doing
-so discards the vast majority of the data.  For this reason, yt provides a
-simple interface for generating plots of projections through your data.  The
-interface for making projection plots,
-:class:`~yt.visualization.plot_window.ProjectionPlot` is very similar to
-``SlicePlot``, described above.  To create and save a plot of the projection of
-the density field through the z-axis of a dataset, centered on the center of the
-simulation box, do the following:
-
-.. code-block:: python
-
-   >>> ProjectionPlot(pf, 'z', 'Density').save()
-
-A ``ProjectionPlot`` can be created and modified with exactly the same keyword
-arguments as a ``SlicePlot``. For example, one can also adjust the width of
-the plot, either after creating the projection plot:
-
-.. code-block:: python
-
-   >>> prj = ProjectionPlot(pf, 'z', 'Density')
-   >>> prj.set_width((10,'kpc'))
-
-or while creating the projection in the first place:
-
-.. code-block:: python
-
-   >>> ProjectionPlot(pf, 'z', 'Density', width=(10,'kpc'))
-
-In addition, one can optionally supply a maximum level to project to; this is
-very useful for large datasets where projections can be costly:
-
-.. code-block:: python
-
-   >>> ProjectionPlot(pf, 'z', 'Density', max_level=10)
-
-as well as a field to weight the projection by.  The following example creates a
-map of the density-weighted mean temperature, projected along the z-axis:
-
-.. code-block:: python
-
-   >>> ProjectionPlot(pf, 'z', 'Temperature', weight_field='Density')
-
-PlotCollection
-^^^^^^^^^^^^^^
-
-To create profiles, yt supplies the ``PlotCollection``, an object designed to
-enable you to make several related plots all at once.  Originally, the idea was
-that yt would get used to make multiple plots of different fields, along
-different axes, all centered at the same point.  This has somewhat faded with
-time, but it still functions as a convenient way to set up a bunch of plots with
-only one or two commands.
-
-A plot collection is really defined by two things: the simulation output it
-will make plots from, and the "center" of the plot collection.  By default, the
-center is the place where all phase plots are centered, although
-there is some leeway on this.  We start by creating our plot collection.  The
-plot collection takes two arguments: the first is the parameter file (``pf``)
-we associate the plot collection with, and the second is our center.  Note that
-if you don't specify a center, yt will search the simulation for the most dense
-point and center the plot collection there.
-
-The yt convention is to call plot collection objects ``pc``, which we do here::
-
-   >>> pc = PlotCollection(pf, [0.5, 0.5, 0.5])
-
-We've chosen to center at [0.5, 0.5, 0.5], which for this simulation is the
-center of the domain.  We can now add a number of different types of
-visualizations to this plot collection, but we'll only look at a few.  
-
-Phase Plots
-^^^^^^^^^^^
-
-Phase plots are pretty cool -- they take all the data inside a data container
-and they bin it with respect to two variables.  You can then have it calculate
-the average, or the sum, of some other quantity as a function of those two
-variables.
-
-This allows, for instance, the calculation of the average Temperature as a
-function of Density and velocity.  Or, it allows the distribution of all the
-mass as a function of Density and Temperature.  I have used phase plots to good
-effect to show the variation of chemical quantities as a function of spatial and
-angular distribution, for instance.  There are several ways to create a phase
-plot; we'll actually show the most flexible method, which uses data containers.
-There's a convenience function that takes a center and a radius and makes one
-from a sphere, too.
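
That convenience function looks roughly like this (check
``help(pc.add_phase_sphere)`` for its exact signature)::

   >>> pc.add_phase_sphere(10.0, 'kpc', ["Density", "Temperature",
   ...                                   "VelocityMagnitude"])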
-
-Earlier we created the data object ``my_sphere``.  We'll use that object here::
-
-   >>> pc.add_phase_object(my_sphere, ["Density", "Temperature",
-   ...                                 "VelocityMagnitude"])
-
-This will calculate the average (weighted by cell mass, by default) velocity
-magnitude at every point in the sphere and plot it as a function of the local
-density and temperature.  This function has a number of options, which can be
-seen by calling ``help`` on it::
-
-   >>> help(pc.add_phase_object)
-
-As you can see, you can specify the number of bins, the boundaries for the
-histogram, and so on and so forth.  Of particular note is that we can also have
-it calculate the mass-distribution in a data object as a function of two
-variables, but to do that we need to tell yt specifically not to take the
-average.  We do that like this::
-
-   >>> pc.add_phase_object(my_sphere, ["Density", "Temperature",
-   ...                                 "CellMassMsun"], weight=None)
-
-When the weight is specified to be empty, it will only take the sum of all the
-values in a bin, rather than the average of those values.
-
-So now we've added a couple of phase plots to our plot collection.  We can now operate on
-them in bulk -- although you will probably find it's much more useful to
-operate on image plots in bulk than on phase plots.  Our first task is to save
-them all out::
-
-   >>> pc.save("first_images")
-
-The way the ``save`` function works is that it accepts a prefix, and then every
-plot is saved out with that prefix.  Each plot's name is calculated from a
-combination of the type of plot and the fields in that plot.  For plots where
-many duplicates can be included, a counter is included too -- for instance,
-phase and profile plots.
-
-All of these commands can be run from a script -- which, in fact, is the
-approach I would personally encourage.  It will make it easier to produce plots
-repeatedly, without having to worry about a great deal of manual tweaking.

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/orientation/python_introduction.rst
--- a/source/orientation/python_introduction.rst
+++ /dev/null
@@ -1,743 +0,0 @@
-A Brief Introduction to Python
-------------------------------
-
-All scripts that use yt are really Python scripts that use yt as a library.
-The great thing about Python is that the standard set of libraries that come
-with it are very extensive -- Python comes with everything you need to write
-and run mail servers and web servers, create Logo-style turtle graphics, do
-arbitrary precision math, interact with the operating system, and many other
-things.  In addition to that, efforts by the scientific community to improve
-Python for computational science have created libraries for fast array
-computation, GPGPU operations, distributed computing, and visualization.
-
-So when you use yt through the scripting interface, you get for free the
-ability to interlink it with any of the other libraries available for Python.
-In the past, this has been used to create new types of visualization using
-OpenGL, data management using Google Docs, and even a simulation that sends an
-SMS when it has new data to report on.
-
-But, this also means learning a little bit of Python!  This next section
-presents a short tutorial of how to start up and think about Python, and then
-moves on to how to use Python with yt.
-
-Starting Python
-+++++++++++++++
-
-Python has two different modes of execution: interactive execution, and
-scripted execution.  We'll start with interactive execution and then move on to
-how to write and use scripts.
-
-Before we get started, we should briefly touch upon the commands ``help`` and
-``dir``.  These two commands provide a level of introspection:
-``help(something)`` will return the internal documentation on ``something``,
-including how it can be used and all of the possible "methods" that can be
-called on it.  ``dir()`` will return the available commands and objects that
-can be directly called, and ``dir(something)`` will return information about
-all the commands that ``something`` provides.  This probably sounds a bit
-opaque, but it will become clearer with time -- it's also probably helpful to
-call ``help`` on any or all of the objects we create during this orientation.
-
-To start up Python, at your prompt simply type:
-
-.. code-block:: bash
-
-  $ python
-
-This will open up Python and give you a simple prompt of three greater-than
-signs.  Let's inaugurate the occasion appropriately -- type this::
-
-   >>> print "Hello, world."
-
-As you can see, this printed out the string "Hello, world." just as we
-expected.  Now let's try a more advanced string, one with a number in it.  For
-this we'll use the percent (``%``) operator, which is the manner by which
-values are fed into a formatted string.  We'll print pi, but only with three
-digits of accuracy.::
-
-   >>> print "Pi is precisely %0.2f" % (3.1415926)
-
-This took the number we fed it (3.1415926) and printed it out as a floating
-point number with two decimal places.  Now let's try something a bit different
--- let's print out both the name of the number and its value.::
-
-   >>> print "%s is precisely %0.2f" % ("pi", 3.1415926)
-
-As you can see, we used ``%s`` to say that the string should print a value as a
-string (the supplied value does not have to be a string -- ``"pi"`` could be
-replaced with, for instance, another number!) and then supplied the string to
-print, as well.
-
-And there you have it -- the very basics of starting up Python, and some very
-simple mechanisms for printing values out.  Now let's explore a few types of
-data that Python can store and manipulate.
-
-Data Types
-++++++++++
-
-Python provides a number of datatypes, but the main ones that we'll concern
-ourselves with at first are lists, tuples, strings, numbers, and dictionaries.
-Most of these can be instantiated in a couple different ways, and we'll look at
-a few of them.  Some of these objects can be modified in place, which is called
-being mutable, and some are immutable and cannot be modified in place.  We'll
-talk below about what that means.
-
-Perhaps most importantly, though, is an idea about how Python works in terms of
-names and bindings -- this is called the "object model."  When you create an
-object, that is independent from binding it to a name -- think of this like
-pointers in C.  This also operates a bit differently for mutable and immutable
-types.  We'll talk a bit more about this later, but it's handy to initially
-think of things in terms of references.  (This is also, not coincidentally, how
-Python thinks of things internally as well!)  When you create an object,
-initially it has no references to it -- there's nothing that points to it.
-When you bind it to a name, then you are making a reference, and so its
-"reference count" is now 1.  When something else makes a reference to it, the
-reference count goes up to 2.  When the reference count returns to 0, the
-object is deleted -- but not before.  This concept of reference counting comes
-up from time to time, but it's not going to be a focus of this orientation.
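
If you're curious, you can peek at reference counts directly.  (Note that
``sys.getrefcount`` reports one extra reference, held briefly by the function
call itself.)::

   >>> import sys
   >>> a = []
   >>> print sys.getrefcount(a)
   >>> b = a
   >>> print sys.getrefcount(a)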
-
-The two easiest datatypes are simply strings and numbers.  We can make a string
-very easily::
-
-   >>> my_string = "Hello there"
-   >>> print my_string
-
-We can also take a look at each individual part of a string.  We'll use the
-'slicing' notation for this.  As a brief note, slicing is 0-indexed, so that
-element 0 corresponds to the first element.  If we wanted to see the third
-element of our string::
-
-   >>> print my_string[2]
-
-We can also take the third through the fifth elements::
-
-   >>> print my_string[2:5]
-
-But note that if you try to change an element directly, Python will object and
-won't let you -- that's because strings are immutable.  (The ``+=`` operator
-still works -- ``my_string += '1'`` is fine -- because it builds an entirely
-new string and rebinds the name, rather than modifying the old one in place.)
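-
-To see this for yourself, try assigning to a single character -- Python will
-raise a ``TypeError``::
-
-   >>> my_string[2] = "x"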
-
-To create a number, we do something similar::
-
-   >>> a = 10
-   >>> print a
-
-This works for floating points as well.  Now we can do math on these numbers::
-
-   >>> print a**2
-   >>> print a + 5
-   >>> print a + 5.1
-   >>> print a / 2.0
-
-In Python 2, dividing one integer by another performs truncating integer
-division (this behavior changes in Python 3), so it's always safest to ensure
-that either the numerator or the denominator is a floating point number.
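-
-To see the pitfall in action, compare truncating division against true
-division::
-
-   >>> print 7 / 2
-   >>> print 7 / 2.0
-
-The first prints 3, while the second prints 3.5.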
-
-Now that we have a couple primitive datatypes, we can move on to sequences --
-lists and tuples.  These two objects are very similar, in that they are
-collections of arbitrary data types.  We'll only look at collections of strings
-and numbers for now, but these can be filled with arbitrary datatypes
-(including objects that yt provides, like spheres, parameter files, grids, and
-so on.)  The easiest way to create a list is to simply construct one::
-
-   >>> my_list = []
-
-At this point, you can find out how long it is, you can append elements, and
-you can access them at will::
-
-   >>> my_list.append(1)
-   >>> my_list.append(my_string)
-   >>> print my_list[0]
-   >>> print my_list[-1]
-   >>> print len(my_list)
-
-You can also create a list already containing an initial set of elements::
-
-   >>> my_list = [1, 2, 3, "four"]
-   >>> my_list[2] = "three!!"
-
-Lists are very powerful objects, which we'll talk about a bit below when
-discussing how iteration works in Python.
-
-A tuple is like a list, in that it's a sequence of objects, and it can be
-sliced and examined piece by piece.  But unlike a list, it's immutable:
-whatever a tuple contains at instantiation is what it contains for the rest of
-its existence.  Creating a tuple is just like creating a list, except that you
-use parentheses instead of brackets::
-
-   >>> my_tuple = (1, "a", 62.6)
-
-Tuples show up very commonly when handling arguments to Python functions and
-when dealing with multiple return values from a function.  They can also be
-unpacked::
-
-   >>> v1, v2, v3 = my_tuple
-
-will assign 1, "a", and 62.6 to v1, v2, and v3, respectively.
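-
-Returning multiple values from a function works the same way.  As a quick
-sketch (``bounds`` is a made-up helper, just for illustration)::
-
-   >>> def bounds(seq):  # made-up helper, just for illustration
-   ...     return min(seq), max(seq)
-   ...
-   >>> lo, hi = bounds([4, 1, 9])
-   >>> print lo, hi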
-
-Mutables vs Immutables and Is Versus Equals
-+++++++++++++++++++++++++++++++++++++++++++
-
-This section is not a "must read" -- it's more of an exploration of how
-Python's objects work.  At some point this is something you may want to be
-familiar with, but it's not strictly necessary on your first pass.
-
-Python provides the operator ``is`` as well as the comparison operator ``==``.
-The operator ``is`` determines whether two objects are in fact the same object,
-whereas the operator ``==`` determines if they are equal, according to some
-arbitrarily defined equality operation.  Think of this like comparing the
-serial numbers on two pictures of a dollar bill (the ``is`` operator) versus
-comparing the values of two pieces of currency (the ``==`` operator).
-
-This digs in to the idea of how the Python object model works, so let's test
-some things out.  For instance, let's take a look at comparing two floating
-point numbers::
-
-   >>> a = 10.1
-   >>> b = 10.1
-   >>> print a == b
-   >>> print a is b
-
-The first one returned True, but the second one returned False.  Even though
-both numbers are equal, they point to different locations in memory.  Now let's
-try assigning things a bit differently::
-
-   >>> b = a
-   >>> print a is b
-
-This time it's true -- they point to the same part of memory.  Try incrementing
-one and seeing what happens.  Now let's try this with a string::
-
-   >>> a = "Hi there"
-   >>> b = a
-   >>> print a is b
-
-Okay, so our intuition here works the same way, and it returns True.  But what
-happens if we modify the string?::
-
-   >>> a += "!"
-   >>> print a
-   >>> print b
-   >>> print a is b
-
-As you can see, now not only does a contain the value "Hi there!", but it also
-is a different value than what b contains, and it also points to a different
-region in memory.  That's because strings are immutable -- the act of adding on
-"!" actually creates an entirely new string and assigns that entirely new
-string to the variable a, leaving the string pointed to by b untouched.  
-
-With lists, which are mutable, we have a bit more liberty with how we modify
-the items and how that modifies the object and its pointers.  A list is really
-just a pointer to a collection; the list object itself does not have any
-special knowledge of what constitutes that list.  So when we initialize a and
-b::
-
-   >>> a = [1, 5, 1094.154]
-   >>> b = a
-
-We end up with two pointers to the same set of objects.  (We can also have a
-list inside a list, which adds another fun layer.)  Now when we modify a, it
-shows up in b::
-
-   >>> a.append("hat wobble")
-   >>> print b[-1]
-
-This also works with the concatenation operator::
-
-   >>> a += ["beta sequences"]
-   >>> print a[-1], b[-1]
-
-But we can force a break in this by slicing the list when we initialize::
-
-   >>> a = [1, 2, 3, 4]
-   >>> b = a[:]
-   >>> a.append(5)
-   >>> print b[-1], a[-1]
-
-Here they are different, because we have sliced the list when initializing b.
-
-The coolest datatype available in Python, however, is the dictionary.  This is
-a mapping object of key:value pairs, where one value is used to look up another
-value.  We can instantiate a dictionary in a variety of ways, but for now we'll
-only look at one of the simplest mechanisms for doing so::
-
-   >>> my_dict = {}
-   >>> my_dict["A"] = 1.0
-   >>> my_dict["B"] = 154.014
-   >>> my_dict[14001] = "This number is great"
-   >>> print my_dict["A"]
-
-As you can see, one value can be used to look up another.  Almost all datatypes
-(with a few notable exceptions, but for the most part these are quite uncommon)
-can be used as a key, and you can use any object as a value.
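-
-The "notable exceptions" are mutable containers -- a list, for example,
-cannot serve as a key, and trying to use one raises a ``TypeError``::
-
-   >>> my_dict[[1, 2]] = "this will fail"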
-
-We won't spend too much time discussing dictionaries explicitly, but I will
-leave you with a word on their efficiency: Python's dictionary implementation
-is heavily hand-tuned for speed, and it's very common to use dictionaries
-holding hundreds of thousands or even millions of elements and to expect
-lookups to remain fast.
-
-Looping
-+++++++
-
-Looping in Python is both different from and more powerful than looping in
-lower-level languages.  Rather than looping based exclusively on conditionals
-(which is possible in Python), the fundamental mode of looping is iterating
-over objects.  In C, one might construct a loop where some counter variable is
-initialized, and at each iteration of the loop it is incremented and compared
-against a reference value; when the counter variable reaches the reference
-variable, the loop is terminated.
-
-In Python, on the other hand, to accomplish iteration through a set of
-sequential integers, one actually constructs a sequence of those integers, and
-iterates over that sequence.  For more discussion of this, and some very, very
-powerful ways of accomplishing this iteration process, look through the Python
-documentation for the words 'iterable' and 'generator.'
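-
-As a tiny taste of what a generator looks like (we'll meet ``xrange``, a
-related idea, below), here's one that produces squares on demand rather than
-building a list up front::
-
-   >>> def squares(n):  # a toy generator, just for illustration
-   ...     for i in xrange(n):
-   ...         yield i * i
-   ...
-   >>> for val in squares(5):
-   ...     print val
-   ...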
-
-To see this in action, let's first take a look at the built-in function
-``range``. ::
-
-   >>> print range(10)
-
-As you can see, what the function ``range`` returns is a list of integers,
-starting at zero, that is as long as the argument to the ``range`` function.
-In practice, this means that calling ``range(N)`` returns ``0, 1, 2, ... N-1``
-in a list.  So now we can execute a for loop, but first, an important
-interlude:
-
-Control blocks in Python are delimited by white space.
-
-This means that, unlike in C with its brackets, you indicate an isolated
-control block for conditionals, function declarations, loops and other things
-with an indentation.  When that control block ends, you dedent the text.  In
-yt, we use four spaces -- I recommend you do the same -- which can be inserted
-by a text editor in place of tab characters.
-
-Let's try this out with a for loop.  First type ``for i in range(10):`` and
-press enter.  This will change the prompt to be three periods, instead of three
-greater-than signs, and you will be expected to indent (four spaces work well).
-Then type "print i", press enter, and then instead of indenting again, press
-enter again.  The entire entry should look like this::
-
-   >>> for i in range(10):
-   ...     print i
-   ...
-
-As you can see, it prints out each integer in turn.  So far this feels a lot
-like C.  (It won't, if you start using iterables in place of sequences -- for
-instance, ``xrange`` operates just like range, except instead of returning an
-already-created list, it returns the promise of a sequence, whose elements
-aren't created until they are requested.)  Let's try it with our earlier list::
-
-   >>> my_sequence = ["a", "b", 4, 110.4]
-   >>> for i in my_sequence:
-   ...     print i
-   ...
-
-This time it prints out every item in the sequence.
-
-A common idiom is to figure out which index the loop is
-at.  The first time this is written, it usually goes something like this::
-
-   >>> index = 0
-   >>> my_sequence = ["a", "b", 4, 110.4]
-   >>> for i in my_sequence:
-   ...     print "%s = %s" % (index, i)
-   ...     index += 1
-   ...
-
-This does what you would expect: it prints out the index we're at, then the
-value of that index in the list.  But there's an easier way to do this, less
-prone to error -- and a bit cleaner!  You can use the ``enumerate`` function to
-accomplish this::
-
-   >>> my_sequence = ["a", "b", 4, 110.4]
-   >>> for index, val in enumerate(my_sequence):
-   ...     print "%s = %s" % (index, val)
-   ...
-
-This does the exact same thing, but we didn't have to keep track of the counter
-variable ourselves.  You can use the function ``reversed`` to reverse a
-sequence in a similar fashion.  Try this out::
-
-   >>> my_sequence = range(10)
-   >>> for val in reversed(my_sequence):
-   ...     print val
-   ...
-
-We can even combine the two!::
-
-   >>> my_sequence = range(10)
-   >>> for index, val in enumerate(reversed(my_sequence)):
-   ...     print "%s = %s" % (index, val)
-   ...
-
-The most fun of all the built-in functions that operate on iterables, however,
-is the ``zip`` function.  This function will combine two sequences (but only up
-to the shorter of the two -- so if one has 16 elements and the other 1000, the
-zipped sequence will only have 16) and produce a single sequence of paired
-elements drawn from both.
-
-As an example, let's say you have two sequences of values, and you want to
-produce a single combined sequence from them.::
-
-   >>> seq1 = ["Hello", "What's up", "I'm fine"]
-   >>> seq2 = ["!", "?", "."]
-   >>> seq3 = []
-   >>> for v1, v2 in zip(seq1, seq2):
-   ...     seq3.append(v1 + v2)
-   ...
-   >>> print seq3
-
-As you can see, this is much easier than constructing index values by hand and
-then drawing from the two sequences using those index values.  I should note
-that while this is great in some instances, for numeric operations, NumPy
-arrays (discussed below) will invariably be faster.
-
-Conditionals
-++++++++++++
-
-Conditionals, like loops, are delimited by indentation.  They follow a
-relatively simple structure, with an "if" statement, followed by the
-conditional itself, and then a block of indented text to be executed in the
-event of the success of that conditional.  For subsequent conditionals, the
-word "elif" is used, and for the default, the word "else" is used.
-
-As a brief aside, the case/switch statement in Python is typically executed
-using an if/elif/else block; this can be done using more complicated
-dictionary-type statements with functions, but that typically only adds
-unnecessary complexity.
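-
-For the curious, such a dictionary dispatch might look like the following
-sketch (the handler functions are invented purely for illustration) -- though
-a plain if/elif/else block is usually clearer::
-
-   >>> def say_hi():  # invented handler, just for illustration
-   ...     print "hi"
-   ...
-   >>> def say_bye():
-   ...     print "bye"
-   ...
-   >>> actions = {"greet": say_hi, "part": say_bye}
-   >>> actions["greet"]()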
-
-For a simple example of how to do an if/else statement, we'll return to the
-idea of iterating over a loop of numbers.  We'll use the ``%`` operator, which
-is a binary modulus operation: it divides the first number by the second and
-then returns the remainder.  Our first pass will examine the remainders from
-dividing by 2, and print out all the even numbers.  (There are of course easier
-ways of determining which numbers are multiples of 2 -- particularly using
-NumPy, as we'll do below.)::
-
-   >>> for val in range(100):
-   ...     if val % 2 == 0:
-   ...         print "%s is a multiple of 2" % (val)
-   ...
-
-Now we'll add on an ``else`` statement, so that we print out all the odd
-numbers as well, with the caveat that they are not multiples of 2.::
-
-   >>> for val in range(100):
-   ...     if val % 2 == 0:
-   ...         print "%s is a multiple of 2" % (val)
-   ...     else:
-   ...         print "%s is not a multiple of 2" % (val)
-   ...
-
-Let's extend this to check the remainders of division with both 2 and 3, and
-determine which numbers are multiples of 2, 3, or neither.  We'll do this for
-all numbers between 0 and 99.::
-
-   >>> for val in range(100):
-   ...     if val % 2 == 0:
-   ...         print "%s is a multiple of 2" % (val)
-   ...     elif val % 3 == 0:
-   ...         print "%s is a multiple of 3" % (val):
-   ...     else:
-   ...         print "%s is not a multiple of 2 or 3" % (val)
-   ...
-
-This should print out which numbers are multiples of 2 or 3 -- but note that
-we're not catching all the multiples of 6, which are multiples of both 2 and 3.
-To do that, we have a couple options, but we can start with just changing the
-first if statement to encompass both, using the ``and`` operator::
-
-   >>> for val in range(100):
-   ...     if val % 2 == 0 and val % 3 == 0:
-   ...         print "%s is a multiple of 6" % (val)
-   ...     elif val % 2 == 0:
-   ...         print "%s is a multiple of 2" % (val)
-   ...     elif val % 3 == 0:
-   ...         print "%s is a multiple of 3" % (val):
-   ...     else:
-   ...         print "%s is not a multiple of 2 or 3" % (val)
-   ...
-
-In addition to the ``and`` operator, the ``or`` and ``not`` operators work in
-the expected manner.  There are also the built-in functions ``any`` and
-``all``, which operate on sequences of conditionals, but those are perhaps
-better saved for later.
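-
-If you'd like a quick preview anyway, they do just what their names suggest::
-
-   >>> vals = [2, 4, 6, 7]
-   >>> print all(val % 2 == 0 for val in vals)
-   >>> print any(val % 3 == 0 for val in vals)
-
-The first prints False, since 7 is odd; the second prints True, since 6 is a
-multiple of 3.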
-
-Array Operations
-++++++++++++++++
-
-In general, iteration over sequences carries with it some substantial overhead:
-each value is selected, bound to a local name, and then its type is determined
-when it is acted upon.  This is, regrettably, the price of the generality that
-Python brings with it.  While this overhead is minimal for operations acting on
-a handful of values, if you have a million floating point elements in a
-sequence and you want to simply add 1.2 to all of them, or multiply them by
-2.5, or exponentiate them, this carries with it a substantial performance hit.
-
-To accommodate this, the NumPy library has been created to provide very fast
-operations on arrays of numerical elements.  When you create a NumPy array, you
-are creating a shaped array of (potentially) sequential locations in memory
-which can be operated on at the C-level, rather than at the interpreted Python
-level.  While NumPy arrays can act like Python sequences -- and can thus be
-iterated over, modified in place, and sliced -- they can also be addressed
-as a monolithic block.  All of the fluid and particle quantities used
-in yt will be expressed as NumPy arrays, allowing for both efficient
-computation and a minimal memory footprint.
-
-For instance, the following operation will not work in standard Python::
-
-   >>> vals = range(10)
-   >>> vals *= 2.0
-
-(Note that multiplying vals by the integer 2 will not do what you think: rather
-than multiplying each value by 2.0, it will simply double the length of the
-sequence!)
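-
-You can verify that surprise directly::
-
-   >>> vals = range(5)
-   >>> print vals * 2
-
-which prints the ten-element list ``[0, 1, 2, 3, 4, 0, 1, 2, 3, 4]``.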
-
-To get started with array operations, let's first import the NumPy library.
-This is the first time we've seen an import in this orientation, so we'll
-dwell for a moment on what this means.  When a library is imported, it is read
-from disk, the functions are loaded into memory, and they are made available
-to the user.  So when we execute::
-
-   >>> import numpy
-
-The ``numpy`` module is loaded, and then can be accessed::
-
-   >>> numpy.arange(10)
-
-This calls the ``arange`` function that belongs to the ``numpy`` module's
-"namespace."  We'll use the term namespace to refer to the variables,
-functions, and submodules that belong to a given conceptual region.  We can
-also extend our current namespace with the contents of the ``numpy`` module, so
-that we don't have to prefix all of our calls to ``numpy`` functions with
-``numpy.``, but we will not do so here, so as to preserve the distinction
-between the built-in Python functions and the NumPy-provided functions.
-
-To get started, let's perform the NumPy version of getting a sequence of
-numbers from 0 to 99::
-
-   >>> my_array = numpy.arange(100)
-   >>> print my_array
-   >>> print my_array * 2.0
-   >>> print my_array * 2
-
-As you can see, each of these operations does exactly what we think it ought
-to.  And, in fact, so does this one::
-
-   >>> my_array *= 2.0
-
-So far we've only examined what happens if we operate on a single array of
-a given shape -- specifically, if we have an array that is N elements long, but
-only one dimensional.  NumPy arrays are, for the most part, defined by their
-data, their shape, and their data type.  We can examine both the shape (which
-includes dimensionality) and the size (strictly the total number of elements)
-in an array by looking at a couple properties of the array::
-
-   >>> print my_array.size
-   >>> print my_array.shape
-
-Note that size must be the product of the components of the shape.  In this
-case, both are 100.  We can obtain a new array of a different shape by calling
-the ``reshape`` method on an array::
-
-   >>> print my_array.reshape((10, 10))
-
-In this case, we have not modified ``my_array`` but instead created a new array
-containing the same elements, but with a different dimensionality and shape.
-You can modify an array's shape in place, as well, but that should be done with
-care and the explanation of how that works and its caveats can come a bit
-later.
-
-There are a few other important characteristics of arrays, and ways to create
-them.  We can see what kind of datatype an array is by examining its ``dtype``
-attribute::
-
-   >>> print my_array.dtype
-
-This can be changed by calling ``astype`` with another datatype.  Datatypes
-include, but are not limited to, ``int32``, ``int64``, ``float32``,
-``float64``.::
-
-   >>> float_array = my_array.astype("float64")
-
-Arrays can also be operated on together, in lieu of something like an iteration
-using the ``zip`` function.  To show this, we'll use the
-``numpy.random.random`` function to generate a random set of values of length
-100, and then we'll multiply our original array against those random values.::
-
-   >>> rand_array = numpy.random.random(100)
-   >>> print rand_array * my_array
-
-There are a number of functions you can call on arrays, as well.  For
-instance::
-
-   >>> print rand_array.sum()
-   >>> print rand_array.mean()
-   >>> print rand_array.min()
-   >>> print rand_array.max()
-
-Indexing in NumPy is very fun, and also provides some advanced functionality
-for selecting values.  You can slice and dice arrays::
-
-   >>> print my_array[50:60]
-   >>> print my_array[::2]
-   >>> print my_array[:-10]
-
-But NumPy also provides the ability to construct boolean arrays, which are the
-result of conditionals.  For example, let's say that you wanted to generate a
-random set of values, and select only those less than 0.2::
-
-   >>> rand_array = numpy.random.random(100)
-   >>> print rand_array < 0.2
-
-What is returned is an array of booleans.  Boolean arrays can be used as
-indices -- what this means is that you can construct an index array and then
-use that to select only those values where that index array is true.  In this
-example we also use the ``numpy.all`` and ``numpy.any`` functions, which do
-exactly what you might think -- they evaluate a statement and see if all
-elements satisfy it, and if any individual element satisfies it,
-respectively.::
-
-   >>> ind_array = rand_array < 0.2
-   >>> print rand_array[ind_array]
-   >>> print numpy.all(rand_array[ind_array] < 0.2)
-
-You can even skip the creation of the variable ``ind_array`` completely, and
-instead just coalesce the statements into a single statement::
-
-   >>> print numpy.all(rand_array[rand_array < 0.2] < 0.2)
-   >>> print numpy.any(rand_array[rand_array < 0.2] > 0.2)
-
-You might look at these and wonder why this is useful -- we've already selected
-those elements that are less than 0.2, so why do we want to re-evaluate it?
-But the interesting component to this is that a conditional applied to one
-array can be used to index another array.  For instance::
-
-   >>> print my_array[rand_array < 0.2]
-
-Here we've identified those elements in our random number array that are less
-than 0.2, and printed the corresponding elements from our original sequential
-array of integers.  This is actually a great way of selecting a random sample
-of a dataset -- in this case we get back approximately 20% of the dataset
-``my_array``, selected at random.
-
-To create arrays from nothing, several options are available.  The command
-``numpy.array`` will create an array from any arbitrary sequence::
-
-   >>> my_sequence = [1.0, 510.42, 1789532.01482]
-   >>> my_array = numpy.array(my_sequence)
-
-Additionally, arrays full of ones and zeros can be created.  The default
-datatype is float64, so for the integer variants we request ``int64``
-explicitly::
-
-   >>> my_integer_ones = numpy.ones(100, dtype="int64")
-   >>> my_float_ones = numpy.ones(100, dtype="float64")
-   >>> my_integer_zeros = numpy.zeros(100, dtype="int64")
-   >>> my_float_zeros = numpy.zeros(100, dtype="float64")
-
-The function ``numpy.concatenate`` is also useful, but outside the scope of
-this orientation.
-
-The NumPy documentation has a number of more advanced mechanisms for combining
-arrays; the documentation for "broadcasting" in particular is very useful, and
-covers mechanisms for combining arrays of different shapes and sizes, which can
-be tricky but also extremely powerful.  We won't discuss the idea of
-broadcasting here, simply because I don't know that I could do it justice!  The
-NumPy Docs have a great `section on broadcasting
-<http://docs.scipy.org/doc/numpy/user/basics.broadcasting.html>`_.
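-
-Just to give a one-line taste of it: multiplying a column-shaped array
-against a row-shaped one broadcasts them out into a full two-dimensional
-array::
-
-   >>> col = numpy.arange(3).reshape((3, 1))
-   >>> row = numpy.arange(4)
-   >>> print (col * row).shape
-
-which prints ``(3, 4)``.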
-
-Scripted Usage
-++++++++++++++
-
-We've now explored Python interactively.  However, for long-running analysis
-tasks, or for analysis meant to be run on a compute cluster non-interactively,
-we will want to use its scripting interface.  Let's start by quitting out
-of the interpreter.  If you have not already done so, you can quit by pressing
-"Ctrl-D", which will free all memory used by Python and return you to your
-shell's command prompt.
-
-At this point, open up a text editor and edit a file called
-``my_first_script.py``.  Python scripts typically end in the extension ``.py``.
-We'll start our scripting tests by doing some timing of array operations versus
-sequence operations.  Into this file, type this text::
-
-   import numpy
-   import time
-
-   my_array = numpy.arange(1000000, dtype="float64")
-
-   t1 = time.time()
-   my_array_squared = my_array**2.0
-   t2 = time.time()
-
-   print "It took me %0.3e seconds to square the array using NumPy" % (t2-t1)
-
-   t1 = time.time()
-   my_sequence_squared = []
-   for i in range(1000000):
-       my_sequence_squared.append(i**2.0)
-   t2 = time.time()
-
-   print "It took me %0.3e seconds to square the sequence without NumPy" % (t2-t1)
-
-Now save this file, and return to the command prompt.  We can execute it by
-supplying it to Python:
-
-.. code-block:: bash
-
-   $ python my_first_script.py
-
-It should run, display two pieces of information, and terminate, leaving you
-back at the command prompt.  On my laptop, the array operation is approximately
-42 times faster than the sequence operation!  Of course, depending on the
-operation conducted, this number can go up quite substantially.
-
-If you want to run a Python script and then be given a Python interpreter
-prompt, you can call the ``python`` command with the option ``-i``:
-
-.. code-block:: bash
-
-   $ python -i my_first_script.py
-
-Python will execute the script and when it has reached the end it will give you
-a command prompt.  At this point, all of the variables you have set up and
-created will be available to you -- so you can, for instance, print out the
-contents of ``my_array_squared``::
-
-   >>> print my_array_squared
-
-The scripting interface for Python is quite powerful, and by combining it with
-interactive execution, you can, for instance, set up variables and functions
-for interactive exploration of data.
-
-Functions and Objects
-+++++++++++++++++++++
-
-Functions and objects are the easiest way to perform very complex,
-powerful actions in Python.  For the most part we will not discuss them; in
-fact, the standard Python tutorial that comes with the Python documentation is
-a very good explanation of how to create and use objects and functions, and
-attempting to replicate it here would simply be futile.
-
-yt provides many objects and functions for your use, and it is through their
-usage and combination that you will be able to create plots, manipulate data,
-and visualize your data.
-
-And with that, we conclude our brief introduction to Python.  I recommend
-checking out the standard Python tutorial or browsing some of the NumPy
-documentation.  If you're looking for a book to buy, the only book I've
-personally ever been completely satisfied with has been David Beazley's
-*Python Essential Reference*, which covers both the language and the standard
-library, but I've also heard good things about many of the others, including
-those by Alex Martelli and Wesley Chun.
-
-We'll now move on to talking more about how to use yt, both from a scripting
-perspective and interactively.
-
-Python and Related References
-+++++++++++++++++++++++++++++
-    * `Python quickstart <http://docs.python.org/tutorial/>`_
-    * `Learn Python the Hard Way <http://learnpythonthehardway.org/index>`_
-    * `Byte of Python <http://www.swaroopch.com/notes/Python>`_
-    * `Dive Into Python <http://diveintopython.org>`_
-    * `Numpy docs <http://docs.numpy.org/>`_
-    * `Matplotlib docs <http://matplotlib.sf.net>`_

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/orientation/simple_data_inspection.rst
--- a/source/orientation/simple_data_inspection.rst
+++ /dev/null
@@ -1,82 +0,0 @@
-Extremely Simple Data Inspection
---------------------------------
-
-yt comes with a command line tool for some very simple data inspection.  It can
-give you some basic statistics about your simulation, it can make slices and
-projections of your data, it can make zoomins or animations from multiple
-datasets, it can run the Hop halo finder on your data, and it can perform
-simple baryon analysis of halos found in a simulation.  This is only the tip of
-the iceberg for what yt provides, but we'll use it here as a method of rapid
-inspection.
-
-For the purposes of just starting out, we'll make some slices.  For this
-example, we'll use the very simple dataset that comes with yt,
-``DD0010/moving7_0010``.  As a brief explanation, this dataset is an extremely
-low-resolution Enzo dataset, with two "spheres" that have begun collapsing.  It
-has several levels of refinement, but it's also quite low resolution.  I ask
-kindly that you forgive how ugly the images we're going to create will be.  If
-you'd rather, you can use some data of your own instead.
-
-Let's first take a look at what's available with the yt command.  To do this,
-just run:
-
-.. code-block:: bash
-
-  $ yt
-
-You'll see that what comes out are a number of subcommands.  Let's start by
-taking a quick look at our data, using the ``stats`` command.:
-
-.. code-block:: bash
-
-  $ yt stats $YT_DEST/src/yt-hg/tests/DD0010/moving7_0010
-
-This loads the data, prints out a little bit about the level structure, and
-then searches the finest couple levels for the maximum density.  It's very
-simple and brief, but for some types of simulations this can give a good
-overview of the state of the simulation.
-
-Let's now try plotting the simulation.  First let's see what our options are
-for plotting:
-
-.. code-block:: bash
-
-  $ yt plot --help
-
-There are many!  We can choose whether we want a slice (default) or a
-projection (``-p``), the field, the colormap, the center of the image, the
-width and unit of width of the image, the limits, the weighting field for
-projections, and on and on.  By default the plotting command will execute the
-same thing along all three axes, so keep that in mind if it takes three times
-as long as you'd expect!  The center of a slice defaults to the center of
-the domain, so let's just give that a shot and see what it looks like:
-
-.. code-block:: bash
-
-  $ yt plot $YT_DEST/src/yt-hg/tests/DD0010/moving7_0010
-
-Well, that looks pretty bad!  What has happened here is that the center of the
-domain only has some minor shifts in density, so the plot is essentially
-incomprehensible.  Let's try it again, but instead of slicing, let's project.
-This is a line integral through the domain, and for the density field this
-becomes a column density.:
-
-.. code-block:: bash
-
-  $ yt plot -p $YT_DEST/src/yt-hg/tests/DD0010/moving7_0010
-
-Now that looks much better!  Note that all three axes' projections appear
-nearly indistinguishable, because of how the two spheres are located in the
-domain.  We could center our domain on one of the spheres and take a slice, as
-well.  Now let's see what the domain looks like with grids overlaid, using the
-``--show-grids`` option.:
-
-.. code-block:: bash
-
-  $ yt plot --show-grids -p $YT_DEST/src/yt-hg/tests/DD0010/moving7_0010
-
-We can now see all the grids in the field of view.
-
-Feel free to explore other options for plotting, but for now, we're going to
-move to more advanced topics, including how to initialize a yt scripting
-session and how to write scripts for analysis with yt.

diff -r 73a02306e5dda2e0a3bfa0a9bed8abb1ce3e7844 -r 2f145817efd090a5f01699fde1bbc7039c231a56 source/orientation/where_to_go.rst
--- a/source/orientation/where_to_go.rst
+++ /dev/null
@@ -1,17 +0,0 @@
-Where To Go From Here
----------------------
-
-At this point, you've seen some basics of Python, how to load up data in yt,
-and also made some plots and some quantitative analysis.
-
-From here, you can start with simply exploring your data.  You can define your
-own derived fields, if you so choose.  You can try your hand at volume
-rendering your data -- not just for visualization, but to do more complicated
-things like averaging over off-axis lines of sight.  You could even create
-your own data containers, if you need to select data via a different mechanism.
-
-For ideas, see the yt documentation -- the cookbook has a whole bunch of
-scripts that can help you get going.  Feel free to email the users' list with
-any questions or problems you run into along the way.
-
-Have fun!

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/918bb31b8eb6/
Changeset:   918bb31b8eb6
User:        ngoldbaum
Date:        2013-10-31 03:16:30
Summary:     Removing the interacting section.  Moving things around that were still useful.
Affected #:  21 files

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -140,3 +140,4 @@
    reference/index
    Getting Help <help/index>
    FAQ <faq/index>
+   sharing_data_hub

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/_images/mapserver.png
Binary file source/interacting/_images/mapserver.png has changed

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/_images/rs1_welcome.png
Binary file source/interacting/_images/rs1_welcome.png has changed

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/_images/rs2_printstats.png
Binary file source/interacting/_images/rs2_printstats.png has changed

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/_images/rs3_proj.png
Binary file source/interacting/_images/rs3_proj.png has changed

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/_images/rs4_widget.png
Binary file source/interacting/_images/rs4_widget.png has changed

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/_images/rs5_menu.png
Binary file source/interacting/_images/rs5_menu.png has changed

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/command-line.rst
--- a/source/interacting/command-line.rst
+++ /dev/null
@@ -1,285 +0,0 @@
-.. _command-line:
-
-Command-line Functions
-----------------------
-
-The :code:`yt` command-line tool allows you to access some of yt's basic
-functionality without opening a Python interpreter.  The tool is a collection
-of subcommands.  These can quickly make plots of slices and projections
-through a dataset, update yt's codebase, print basic statistics about a
-dataset, launch an IPython notebook session, and more.  To get a quick list of
-what is
-available, just type:
-
-.. code-block:: bash
-
-   yt -h
-
-This will print the list of available subcommands,
-
-.. code-block:: bash
-
-    help                Print help message
-    bootstrap_dev       Bootstrap a yt development environment
-    bugreport           Report a bug in yt
-    hop                 Run HOP on one or more datasets
-    hub_register        Register a user on the Hub: http://hub.yt-project.org/
-    hub_submit          Submit a mercurial repository to the yt Hub
-                        (http://hub.yt-project.org/), creating a BitBucket
-                        repo in the process if necessary.
-    instinfo            Get some information about the yt installation
-    load                Load a single dataset into an IPython instance
-    mapserver           Serve a plot in a GMaps-style interface
-    pastebin            Post a script to an anonymous pastebin
-    pastebin_grab       Print an online pastebin to STDOUT for local use.
-    upload_notebook     Upload an IPython notebook to hub.yt-project.org.
-    plot                Create a set of images
-    render              Create a simple volume rendering
-    rpdb                Connect to a currently running (on localhost) rpdb
-                        session. Commands run with --rpdb will trigger an rpdb
-                        session with any uncaught exceptions.
-    notebook            Run the IPython Notebook
-    serve               Run the Web GUI Reason
-    reason              Run the Web GUI Reason
-    stats               Print stats and max/min value of a given field (if
-                        requested), for one or more datasets (default field is
-                        Density)
-    update              Update the yt installation to the most recent version
-    upload_image        Upload an image to imgur.com. Must be PNG.
-
-
-To execute any such function, simply run:
-
-.. code-block:: bash
-
-   yt <subcommand>
-
-Finally, to identify the options associated with any of these subcommands, run:
-
-.. code-block:: bash
-
-   yt <subcommand> -h
-
-Plotting from the command line
-------------------------------
-
-First, we'll discuss plotting from the command line, then we will give a brief
-summary of the functionality provided by each command line subcommand. This
-example uses the :code:`DD0010/moving7_0010` dataset distributed in the yt
-mercurial repository.
-
-First let's see what our options are for plotting:
-
-.. code-block:: bash
-
-  $ yt plot --help
-
-There are many!  We can choose whether we want a slice (default) or a
-projection (``-p``), the field, the colormap, the center of the image, the
-width and unit of width of the image, the limits, the weighting field for
-projections, and on and on.  By default the plotting command will execute the
-same thing along all three axes, so keep that in mind if it takes three times
-as long as you'd expect!  The center of a slice defaults to the center of
-the domain, so let's just give that a shot and see what it looks like:
-
-.. code-block:: bash
-
-  $ yt plot DD0010/moving7_0010
-
-Well, that looks pretty bad!  What has happened here is that the center of the
-domain only has some minor shifts in density, so the plot is essentially
-incomprehensible.  Let's try it again, but instead of slicing, let's project.
-This is a line integral through the domain, and for the density field this
-becomes a column density.:
-
-.. code-block:: bash
-
-  $ yt plot -p DD0010/moving7_0010
-
-Now that looks much better!  Note that all three axes' projections appear
-nearly indistinguishable, because of how the two spheres are located in the
-domain.  We could center our domain on one of the spheres and take a slice, as
-well.  Now let's see what the domain looks like with grids overlaid, using the
-``--show-grids`` option.:
-
-.. code-block:: bash
-
-  $ yt plot --show-grids -p DD0010/moving7_0010
-
-We can now see all the grids in the field of view.
-
-Command-line subcommand summary
--------------------------------
-
-help
-++++
-
-Help lists all of the various command-line options in yt.
-
-bootstrap_dev
-+++++++++++++
-
-After you have installed yt and you want to do some development, there may 
-be a few more steps to complete.  This subcommand automates building a 
-development environment for you by setting up your hg preferences correctly,
-creating/linking to a bitbucket account for hosting and sharing your code, 
-and setting up a pasteboard for your code snippets.  A full description of 
-how this works can be found in :ref:`bootstrap-dev`.
-
-bugreport         
-+++++++++
-
-Encountering a bug in your own code can be a big hassle, but it can be 
-exponentially worse to find it in someone else's.  That's why we tried to 
-make it as easy as possible for users to report bugs they find in yt.  
-After you go through the necessary channels to make sure you're not just
-making a mistake (see :ref:`asking-for-help`), you can submit bug 
-reports using this nice utility.
-
-hop               
-+++
-
-This lets you run the HOP algorithm as a halo-finder on one or more 
-datasets.  It nominally reproduces the behavior of enzohop from the 
-enzo suite.  There are several flags you can use in order to specify
-your threshold, input names, output names, and whether you want to use 
-dark matter or all particles.  To view these flags run help with the 
-hop subcommand.
-
-hub_register and hub_submit
-+++++++++++++++++++++++++++
-
-We in the yt camp believe firmly in the ideals of open-source coding.  To
-further those ends, we have made a location for people to share their 
-nifty and useful codes with other scientists who might be able to use 
-them: the `yt hub <http://hub.yt-project.org/>`_.  Did you make a cool 
-code for generating a movie from your simulation outputs?  Submit it to 
-the hub.  Did you create a Perl script that automates something and saves
-you some time while on a supercomputer?  Submit it to the hub.  And
-using the ``hub_submit`` command, you can do this really easily.  If you
-create a mercurial repository for the code you want to submit, just
-run the ``hub_submit`` command from within its directory structure, and we'll
-take care of the rest, by putting it on bitbucket and finally submitting 
-it to the hub to share with the rest of the yt community.  Check out 
-what people have already put up on the
-`yt hub <http://hub.yt-project.org/>`_, and see :ref:`share-your-scripts` 
-for more details about sharing your work on the hub.
-
-instinfo
-++++++++
-
-This gives very similar behavior to the update command, in that it 
-will automatically update your yt version to the latest in whichever
-repository you're in (stable, development, etc.).  It can also provide 
-you with the hash of the version you're using.
-
-load
-++++
-
-This will start the iyt interactive environment with your specified 
-dataset already loaded.  See :ref:`interactive-prompt` for more details.
-
-mapserver
-+++++++++
-
-Ever wanted to interact with your data using the 
-`google maps <http://maps.google.com/>`_ interface?  Now you can by using the
-yt mapserver.  See :ref:`mapserver` for more details.
-
-pastebin and pastebin_grab
-++++++++++++++++++++++++++
-
-The `pastebin <http://paste.yt-project.org/>`_ is an online location where 
-you can anonymously post code snippets and error messages to share with 
-other users in a quick, informal way.  It is often useful for debugging 
-code or co-developing.  By running the ``pastebin`` subcommand with a 
-text file, you send the contents of that file to an anonymous pastebin; 
-
-.. code-block:: bash
-
-   yt pastebin my_script.py
-
-By running the ``pastebin_grab`` subcommand with a pastebin number 
-(e.g. 1768), it will grab the contents of that pastebin 
-(e.g. the website http://paste.yt-project.org/show/1768 ) and send it to 
-STDOUT for local use.  For more details see the :ref:`pastebin` section.
-
-.. code-block:: bash
-
-   yt pastebin_grab 1768
-
-plot
-++++
-
-This command generates one or many simple plots for a single dataset.  
-By specifying the axis, center, width, etc. (run ``yt help plot`` for 
-details), you can create slices and projections easily at the 
-command-line.
-
-upload_notebook
-+++++++++++++++
-
-This command will accept the filename of a ``.ipynb`` file (generated from an
-IPython notebook session) and upload it to the `yt hub
-<http://hub.yt-project.org/>`_ where others will be able to view it, and
-download it.  This is an easy method for recording a sequence of commands,
-their output, narrative information, and then sharing that with others.  These
-notebooks will be viewable online, and the appropriate URLs will be returned on
-the command line.
-
-reason and serve
-++++++++++++++++
-
-The ``reason`` and ``serve`` subcommands have identical functionality in that
-they both initiate the Web GUI Reason. See :ref:`reason`.
-
-render
-++++++
-
-This command generates a volume rendering for a single dataset.  By specifying
-the center, width, number of pixels, number and thickness of contours, etc.
-(run ``yt help render`` for details),  you can create high-quality volume
-renderings at the command-line before moving on to more involved volume
-rendering scripts.
-
-rpdb
-++++
-
-Connect to a currently running (on localhost) rpdb session.  Commands run
-with ``--rpdb`` will trigger an rpdb session with any uncaught exceptions.
-
-notebook
-++++++++
-
-Launches an IPython notebook server and prints out instructions on how to open
-an ssh tunnel to connect to the notebook server with a web browser.  This is
-most useful when you want to run an IPython notebook using CPUs on a remote
-host.
-
-stats
-+++++
-
-This subcommand provides you with some basic statistics on a given dataset.
-It provides you with the number of grids and cells in each level, the time
-of the dataset, the resolution, and the maximum density in a variety of units.
-It is tantamount to calling ``print_stats()`` from within yt.
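-
-For example, pointing it at the sample dataset used above:
-
-.. code-block:: bash
-
-   yt stats DD0010/moving7_0010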
-
-update
-++++++
-
-This subcommand updates the yt installation to the most recent version for
-your repository (e.g. stable, 2.0, development, etc.).  Adding the ``--all`` 
-flag will update the dependencies as well. See 
-:ref:`automated-update` for more details.
-
-.. _upload-image:
-
-upload_image
-++++++++++++
-
-Images are often worth a thousand words, so when you're trying to 
-share a piece of code that generates an image, or you're trying to 
-debug image-generation scripts, it can be useful to send your
-co-authors a link to the image.  This subcommand makes such sharing 
-a breeze.  By specifying the image to share, ``upload_image`` automatically
-uploads it anonymously to the website `imgur.com <http://imgur.com/>`_ and
-provides you with a link to share with your collaborators.  Note that the
-image *must* be in the PNG format in order to use this function.
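-
-For example (``my_image.png`` here stands in for whatever PNG you'd like to
-share):
-
-.. code-block:: bash
-
-   yt upload_image my_image.png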

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/index.rst
--- a/source/interacting/index.rst
+++ /dev/null
@@ -1,19 +0,0 @@
-.. _interacting_with_yt:
-
-Ways of Interacting with yt
-===========================
-
-There are several entry points to yt, and each carries its own strengths and
-weaknesses.  We cover a few of those here.
-
-.. toctree::
-   :maxdepth: 2
-
-   scripts
-   interactive_prompt
-   ipython_notebook
-   command-line
-   reason
-   mapserver
-   sharing_data_hub
-   paraview

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/interactive_prompt.rst
--- a/source/interacting/interactive_prompt.rst
+++ /dev/null
@@ -1,50 +0,0 @@
-.. _interactive-prompt:
-
-Interactive Prompt
-------------------
-
-The interactive prompt offers a number of excellent opportunities for
-exploration of data.  While there are challenges for repeatability, and some
-operations are harder to run in parallel, interactive prompts
-can be exceptionally useful for debugging, exploring, and tweaking plots.
-
-There are several different ways to get an interactive prompt for yt, the
-easiest of which is simply to type:
-
-.. code-block:: bash
-
-   pyyt
-
-This will start up the python interpreter.  You can now execute yt commands as
-normal (once you've imported yt!) and examine your data.  There are two other
-handy commands, however, which put you into the IPython interactive shell.
-
-.. warning:: IPython has changed their API substantially in recent versions.
-   yt does not support IPython versions newer than 0.10.
-
-You can start up an empty shell, with a handful of useful yt utilities (such as
-tab-completion and pre-imported modules) by executing:
-
-.. code-block:: bash
-
-   iyt
-
-The other option, which is shorthand for "iyt plus dataset loading," is to use
-the command-line tool (see :ref:`command-line`) with the ``load`` subcommand
-and to specify a parameter file.  For instance:
-
-.. code-block:: bash
-
-   yt load cosmoSim_coolhdf5_chk_0026
-
-or
-
-.. code-block:: bash
-
-   yt load DD0030/DD0030
-
-This will spawn ``iyt``, but the parameter file given on the command line will
-already be in the namespace as ``pf``.  With interactive mode, you can use the
-``pylab`` module to interactively plot, and there is also the object
-``PlotCollectionInteractive`` which can handle setting up draw mode and
-updating plots interactively.

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/ipython_notebook.rst
--- a/source/interacting/ipython_notebook.rst
+++ /dev/null
@@ -1,33 +0,0 @@
-IPython Notebook
-================
-
-Starting with 2.4, yt ships with several functions and helpers to display
-information in the IPython web notebook.
-
-.. note:: The dependencies necessary for the IPython notebook (0MQ, Py0MQ,
-          Tornado and IPython)
-          come with the yt install script.  However, you should read in detail
-          the `IPython documentation
-          <http://ipython.org/ipython-doc/stable/interactive/htmlnotebook.html>`__
-          for how to use it.
-
-A sample notebook, demonstrating some of the functionality of both yt 2.4 and
-the IPython notebook (as exposed through yt) can be found at
-http://yt-project.org/files/yt24.ipynb .
-
-There are two main things that yt exposes to the IPython notebook: displaying
-PlotWindow objects and displaying Volume Renderings.  Both of these are exposed
-through the ``show`` method.  For instance:
-
-.. code-block:: python
-
-   slc = SlicePlot(pf, "x", "Density")
-   slc.show()
-
-Or with a volume rendering, call ``show`` on the camera:
-
-.. code-block:: python
-
-   cam = pf.h.camera([0.5, 0.5, 0.5], [0.2, 0.3, 0.4], 0.10, 1024, tf)
-   cam.show()
-
-In both of these cases, an image will appear in the cell output.

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/mapserver.rst
--- a/source/interacting/mapserver.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-.. _mapserver:
-
-Mapserver
----------
-
-The mapserver is a new, experimental feature.  It's based on `Leaflet
-<http://leaflet.cloudmade.com/>`_, a library written to create zoomable,
-map-tile interfaces.  (Similar to Google Maps.)  yt provides everything you
-need to start up a web server that will interactively re-pixelize an adaptive
-image.  This means you can explore your datasets in a fully pan-n-zoom
-interface.
-
-To start up the mapserver, you can use the command ``yt`` (see
-:ref:`command-line`) with the ``mapserver`` subcommand.  It takes several of
-the same options and arguments as the ``plot`` subcommand.  For instance:
-
-.. code-block:: bash
-
-   yt mapserver DD0050/DD0050
-
-That will take a slice along the x axis at the center of the domain.  The
-field, projection, weight and axis can all be specified on the command line.
-
-When you do this, it will spawn a micro-webserver on your localhost, and output
-the URL to connect to on standard output.  You can connect to it (or create an
-SSH tunnel to connect to it) and explore your data.  Double-clicking zooms, and
-dragging drags.
-
-.. image:: _images/mapserver.png
-   :scale: 50%
-
-This is also functional on touch-capable devices such as Android Tablets and
-iPads/iPhones.  In future versions, we hope to add halo-overlays and
-markers-of-interest to this.

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/paraview.rst
--- a/source/interacting/paraview.rst
+++ /dev/null
@@ -1,32 +0,0 @@
-.. _paraview:
-
-ParaView
---------
-
-.. note:: As of 2.2 the ParaView-yt interoperability is still a work in
-   progress.  Much can be done, but the setup still requires a bit of work.
-
-ParaView support for yt is still preliminary, but is possible.  Future versions
-of ParaView will include the necessary components.  For now, however, to enable
-yt as a plugin in ParaView you will have to build from source and use the
-branches ``AMR-Refactor`` in both VTK and ParaView.  When building, you must
-also link against the Python with which yt was installed.
-
-Finally, to enable the yt ParaView plugin, you must also install the yt plugin,
-available in the `yt-paraview
-<https://gitorious.org/yt-paraview/paraview-plugins>`_ git repository.  (You
-should be able to use ``pip install hg-git`` to install hg-git, which enables
-checking out git repos via mercurial.)
-
-Jorge Poco has created a YouTube video of `ParaView using yt as a plugin
-<http://www.youtube.com/watch?v=cOv4Ob2q1fM>`_:
-
-.. youtube:: cOv4Ob2q1fM
-   width: 600
-   height: 400
-
-
-For more information, there are also two blog posts about this:
-
- * http://blog.yt-project.org/a-movie-of-yt-in-paraview
- * http://blog.yt-project.org/paraview-and-yt

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/reason.rst
--- a/source/interacting/reason.rst
+++ /dev/null
@@ -1,246 +0,0 @@
-.. _reason:
-
-The GUI Reason
---------------
-
-.. warning:: Current versions of Reason may not work quite as expected with
-             Firefox.  They have all been tested under Chrome, and if you run
-             into any bugs with either, please `report them
-             <https://bitbucket.org/yt_analysis/yt/issues/new>`_!
-
-Demo
-++++
-
-Cameron created a short screencast of what Reason is, how it works, and how you
-can use it.  It's best viewed in full-screen, so click the little X in the
-bottom right.
-
-.. raw:: html
-
-   <iframe src="http://player.vimeo.com/video/28506477" width="640"
-        height="320" frameborder="0"></iframe>
-
-
-What is Reason?
-+++++++++++++++
-
-Reason is a web-based GUI for yt.  It's still currently in beta, but we
-are working very hard to improve it and ensure it's productive.  It's designed
-to act as a very simple web-notebook -- it's not a replacement for something
-like the much more complex IPython web notebook, or the SAGE notebook, but
-rather a means of exploring simulation data easily, safely, and without
-requiring the overhead of additional dependencies.
-
-Everything you need to run reason comes right with the yt install script -- if
-you have installed yt another way, you may need to separately obtain the ExtJS
-packages.
-
-The idea behind reason is to spawn a web server on a shared resource system,
-and connect to that web server from your desktop, tunnelling over SSH.  Reason
-is not designed to be run over unencrypted wires; you should either be running
-fully locally or through an SSH tunnel.  Reasonable steps have been taken to
-ensure that your connections are safe: each connection occurs only on a random
-port (which is potentially identifiable on a shared user system) and with a
-UUID prefixed into each URL (which should be difficult if not impossible to
-identify without root authority).
-
-Starting Reason
-+++++++++++++++
-
-Reason can be started very easily from the command line:
-
-.. code-block:: bash
-
-   $ yt serve
-
-If you are running on your local machine, you can also execute:
-
-.. code-block:: bash
-
-   $ yt serve -o
-
-to open up a local web browser window.  If you want Reason to search for
-(currently only Enzo) parameter files under your current directory, you can
-execute:
-
-.. code-block:: bash
-
-   $ yt serve -o -f
-
-yt will print out something like:
-
-.. code-block:: bash
-
-   =============================================================================
-   =============================================================================
-   Greetings, and welcome to Reason!
-   Your private token is c2dcd1dc-d40f-11e0-8f6b-bc305ba67797 .
-   DO NOT SHARE THIS TOKEN.
-
-   Please direct your browser to:
-
-        http://localhost:51707/c2dcd1dc-d40f-11e0-8f6b-bc305ba67797/
-
-   =============================================================================
-
-   If you are currently ssh'd into a remote machine, you should be able
-   to create a new SSH tunnel by typing or copy/pasting this text
-   verbatim, while waiting to see the 'ssh>' prompt after the first line.
-
-   ~C
-   -L51707:localhost:51707
-
-   and then pointing a web browser on your local machine to the above URL.
-
-   =============================================================================
-   =============================================================================
-
-If you are on a remote machine, you will need to execute the one additional
-step that yt mentions in order to be able to connect.  Press ~C (that's tilde,
-then C) which should bring up a prompt that looks like ``ssh>`` .  At that
-prompt, type what you are told to, which will open up a port over which you can
-tunnel to talk to the server:
-
-.. code-block:: bash
-
-   ssh>-L51707:localhost:51707
-
-Now you can open the URL printed out.
-
-.. _within-reason:
-
-What is Within Reason?
-++++++++++++++++++++++
-
-Once you start up reason, for the first time, you will see something like:
-
-.. image:: _images/rs1_welcome.png
-   :target: _images/rs1_welcome.png
-   :scale: 50%
-
-This is the basic layout.  There are three primary components:
-
-  * (top, left) *Object List*: The list of parameter files and objects.  Every
-    time you load a parameter file or create a (persistent) data object, it will
-    appear here.
-  * (top, right) *Interaction Area*: this is where the notebook and any
-    widgets will appear.
-  * (bottom) *Log Area*: The messages normally spit out to the log will be put
-    here.
-
-The main mechanism of interacting with reason is through the notebook.  You can
-type commands in.  When you either click the down-arrow on the right or press
-Shift-Enter, these commands will be sent and executed on the far side.  This
-should be thought of more like a series of mini-scripts, rather than individual
-lines: you can send multiple lines of input, including for loops, conditionals,
-and so on, very easily.  If you want to access any object, you can drag it from
-the object list, as demonstrated here, where I have dragged in the parameter file
-and called ``print_stats`` on its hierarchy:
-
-.. image:: _images/rs2_printstats.png
-   :target: _images/rs2_printstats.png
-   :scale: 50%
-
-Any command can be executed here, and the output will appear in an output cell
-below.  The output cells have two sets of arrows on them.  The leftmost (blue)
-arrow will upload the contents of that cell to the yt `pastebin
-<http://paste.yt-project.org/>`_.  The rightmost (green) set of double arrows
-will put the contents of that cell up top, in the execution zone -- this is
-useful if you are iterating on a command.
-
-You can also right-click on a parameter file to create slices and projections
-and to view grid information.  For instance, you can choose to project the
-dataset.  Progress bars have been added, so you should be able to view
-progress as normal:
-
-.. image:: _images/rs3_proj.png
-   :target: _images/rs3_proj.png
-   :scale: 50%
-
-Once the projection is complete, a new tab for the plot widget should open.
-This will include the image, a colorbar, a metadata window, and a couple
-buttons to press:
-
-.. image:: _images/rs4_widget.png
-   :target: _images/rs4_widget.png
-   :scale: 50%
-
-You can ctrl-click on the image (this is broken in Firefox, but works in
-Chrome!  We're working to fix it!) to re-center on a given location.  The
-scroll bar at the bottom controls zooming, and you can dynamically change the
-field that is displayed.  There are zoom controls as well as panning controls,
-and the option to upload the image to `imgur.com <http://imgur.com/>`_, a
-simple image sharing service on the web.
-
-You can also click the button "Pannable Map" to open up a Google Maps-style
-interface, using the same underlying code as described in :ref:`mapserver`.
-
-Once you have created a data object, for instance by creating a sphere or a
-region inside the scripting window, you can right click on that object to
-extract isocontours.  The resultant widget, based on `PhiloGL
-<http://senchalabs.github.com/philogl/>`_, will be colored with the field you
-sampled and will be shaped like the extracted isocontour at your specified
-value.
-
-What Special Things Can Reason Do?
-++++++++++++++++++++++++++++++++++
-
-There are several special commands built into Reason that make a few common
-operations easy.
-
- * ``load_script(filename)`` will load a script from the file system and insert
-   it into the currently-active input area.
- * ``deliver_image(filename)`` can accept a filename (PNG-only), a string of
-   binary PNG data, or a file-like object full of PNG data.  This data will
-   then be delivered to the browser, in the next active cell.  This is the
-   mechanism by which most image display occurs in Reason.
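-
-For instance, a quick sketch that combines the two commands (the filenames
-here are hypothetical):
-
-.. code-block:: python
-
-   # pull an existing script into the currently-active input area
-   load_script("make_plots.py")
-   # push a PNG from disk into the next output cell
-   deliver_image("my_image.png")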
-
-Plot collections have been instrumented to work with Reason.  This means that
-if you create a plot collection like normal, as soon as you run ``pc.save()``
-the images that are saved out will be displayed in the active cell output.
-
-Pylab has also been modified to work with Reason, and Reason imports it before
-it starts up.  You can run any normal pylab command:
-
-.. code-block:: python
-
-   pylab.plot([1, 2, 3, 4, 5], [10, 210, 503, 1, 42.1])
-
-and the output should appear in the active cell.
-
-You may notice that there is a menu above the object list:
-
-.. image:: _images/rs5_menu.png
-   :scale: 50%
-   :target: _images/rs5_menu.png
-
-There are a number of options here:
-
- * ``Open`` and ``Open Directory`` -- currently disabled while we implement
-   this functionality.
- * ``Save Script`` -- Save the concatenated set of executed cells to a file on
-   the server.  This script should be directly executable, including all widget
-   interactions.
- * ``Download Script`` -- Download the concatenated set of executed cells to a
-   file on your local machine.
- * ``Pastebin Script`` -- Upload the concatenated set of executed cells to the
-   `yt pastebin <http://paste.yt-project.org/>`_.
- * ``Help`` -- A quick little help file.
- * ``yt Chat`` -- Open up the Web portal to IRC, for live chatting with other
-   yt users and developers.
-
-Please feel free to share with us your thoughts and experiences -- good and
-bad! -- with Reason.
-
-I Want To Add A Widget!
-+++++++++++++++++++++++
-
-Adding a new widget is pretty straightforward, but you might need some guidance
-along the way.  It might be a good idea to stop by the yt-dev mailing list (see
-:ref:`getting-involved`) but you can also explore how widgets are made in the
-directory ``yt/gui/reason/html/js/`` and take a look at the specific python
-code in ``yt/gui/reason/extdirect_repl.py``.
-
-But seriously, if you have the desire to play with or update or extend or
-prettify Reason, we'd be really excited to work with you.

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/scripts.rst
--- a/source/interacting/scripts.rst
+++ /dev/null
@@ -1,38 +0,0 @@
-.. _scripting-yt:
-
-Scripts
--------
-
-The mechanism by which most people interact with yt is by writing,
-then executing, a script.  This is covered somewhat in the :ref:`orientation`,
-but here we describe it again.  There are several advantages to scripts as
-opposed to interactivity:
-
- * Repeatability of experiments
- * Easier MPI-parallelism
- * Version control of changes to a script
- * Simpler declaration and usage of subroutines and loops
-
-Running scripts is pretty easy.  It's a three-step process.
-
- #. Edit script in a text editor (vim, emacs, textmate -- as long as it's not
-    pico or edlin!)
- #. Run script, invoking it with either the python version installed with yt or
-    the alias ``pyyt``.
- #. Edit, and repeat!
-
-To encourage easy submission to the `yt Hub <http://hub.yt-project.org/>`_, we
-suggest you place your scripts in an isolated subdirectory and name each one
-individually.  For instance:
-
-.. code-block:: bash
-
-   mkdir turbulence_paper
-   cd turbulence_paper
-   vim calculate_power_spectra.py
-   pyyt calculate_power_spectra.py
-
-You will have to reference the datasets you want to analyze with either
-relative or absolute paths, but when you have completed your work, you can use
-the ``hub_submit`` command (see :ref:`command-line`) to create a repository (if
-necessary) and submit it to the `yt Hub <http://hub.yt-project.org/>`_.

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/interacting/sharing_data_hub.rst
--- a/source/interacting/sharing_data_hub.rst
+++ /dev/null
@@ -1,108 +0,0 @@
-Sharing Data on the yt Hub
-==========================
-
-The yt data hub is a mechanism by which images, data objects and projects can
-be shared with other people.  For instance, one can upload projections and
-browse them with an interface similar to Google Maps.
-
-.. note:: All items posted on the hub are public!
-
-Over time, more widgets will be added, and more datatypes will be supported
-for upload.  If you are interested in adding more ways of sharing data, please
-email the developers' list.  We would like to add support for 3D widgets such
-as isocontours as well as interactive binning and rebinning of data from yt
-data objects, to be displayed as phase plots and profiles.
-
-Registering a User
-------------------
-
-Because of problems with spammers, registering a user can only be done from the
-yt command line.  Once you have registered a user, you can log on to the
-website and obtain an API key.
-
-To register a user:
-
-.. code-block:: bash
-
-   $ yt hub_register
-
-This will walk you through the process of registering.  You will need to supply
-a name, a username, a password and an email address.  Once you have gotten that
-out of the way, you can go to http://hub.yt-project.org/login and log in with
-your new password.  You can then receive your API key by clicking on your
-username in the upper left.
-
-After you have gotten your API key, place it in your ``~/.yt/config`` file:
-
-.. code-block:: none
-
-   [yt]
-   hub_api_key = 3fd8de56c2884c13a2de4dd51a80974b
-
-Replace ``3fd8de56c2884c13a2de4dd51a80974b`` with your API key.  At this point,
-you're ready to go!
-
-What Can Be Uploaded
---------------------
-
-Currently, the yt hub can accept these types of data:
-
- * Projects and script repositories: these will be displayed with an optional
-   image, a description, and a link to the source repository.
- * Projections and Slices: these will be displayed in a maps-like interface,
-   for interactive panning and zooming
- * Plot collections: these will be displayed as a list of images
-
-How to Upload Data
-------------------
-
-Uploading data takes place inside scripts.  For the most part, it is relatively
-simple to do: you construct the object you would like to share, and then you
-upload it.
-
-Uploading Projects
-~~~~~~~~~~~~~~~~~~
-
-For information on how to share a project or a set of scripts, see
-:ref:`share-your-scripts`.
-
-Uploading Projections and Slices
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Projections and slices both have a ``hub_upload`` method.  Here is an example
-of uploading a projection:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
-   proj = pf.h.proj(0, "Density", weight="Density")
-   proj.hub_upload()
-
-Here is an example of uploading a slice:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("JHK-DD0030/galaxy0030")
-   sl = pf.h.slice(0, 0.5, fields=["Density"])
-   sl.hub_upload()
-
-Uploading Plot Collections
-~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Plot collections can be uploaded and viewed as a selection of images.  To
-upload a plot collection, call ``hub_upload`` on the plot collection.
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("DD0252/DD0252")
-   pc = PlotCollection(pf, 'c')
-   pc.add_projection("Density", 0)
-   pc.add_slice("Temperature", 1)
-   pc.add_profile_sphere(0.2, 'unitary', ["Density", "Temperature"])
-   dd = pf.h.all_data()
-   pc.add_phase_object(dd, ["Density", "Temperature", "CellMassMsun"],
-                       weight=None)
-   pc.hub_upload()

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/reference/command-line.rst
--- /dev/null
+++ b/source/reference/command-line.rst
@@ -0,0 +1,320 @@
+.. _command-line:
+
+.. _interactive-prompt:
+
+Interactive Prompt
+------------------
+
+The interactive prompt offers a number of excellent opportunities for
+exploration of data.  While there are challenges for repeatability, and some
+operations will be more challenging to operate in parallel, interactive prompts
+can be exceptionally useful for debugging, exploring, and tweaking plots.
+
+You can start up an empty shell, with a handful of useful yt utilities (such as
+tab-completion and pre-imported modules) by executing:
+
+.. code-block:: bash
+
+   iyt
+
+The other option, which is shorthand for "iyt plus dataset loading," is to use
+the command-line tool (see :ref:`command-line`) with the ``load`` subcommand
+and to specify a parameter file.  For instance:
+
+.. code-block:: bash
+
+   yt load cosmoSim_coolhdf5_chk_0026
+
+or
+
+.. code-block:: bash
+
+   yt load DD0030/DD0030
+
+This will spawn ``iyt``, but the parameter file given on the command line will
+already be in the namespace as ``pf``.  With interactive mode, you can use the
+``pylab`` module to interactively plot.
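+
+For example, once ``yt load DD0030/DD0030`` has dropped you at the prompt, a
+short session might look like the following (the commands and field name are
+illustrative, not prescriptive):
+
+.. code-block:: python
+
+   # pf is already in the namespace; print basic statistics
+   pf.h.print_stats()
+   # cut a one-dimensional ray along the x axis through the domain center
+   ray = pf.h.ortho_ray(0, (0.5, 0.5))
+   # plot it with pylab (available in the interactive namespace)
+   pylab.semilogy(ray['x'], ray['Density'])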
+
+Command-line Functions
+----------------------
+
+The :code:`yt` command-line tool allows you to access some of yt's basic
+functionality without opening a python interpreter.  The tool is a collection of
+subcommands.  These can quickly make plots of slices and projections through a
+dataset, update yt's codebase, print basic statistics about a dataset, launch
+an IPython notebook session, and more.  To get a quick list of what is
+available, just type:
+
+.. code-block:: bash
+
+   yt -h
+
+This will print the list of available subcommands,
+
+.. code-block:: bash
+
+    help                Print help message
+    bootstrap_dev       Bootstrap a yt development environment
+    bugreport           Report a bug in yt
+    hop                 Run HOP on one or more datasets
+    hub_register        Register a user on the Hub: http://hub.yt-project.org/
+    hub_submit          Submit a mercurial repository to the yt Hub
+                        (http://hub.yt-project.org/), creating a BitBucket
+                        repo in the process if necessary.
+    instinfo            Get some information about the yt installation
+    load                Load a single dataset into an IPython instance
+    mapserver           Serve a plot in a GMaps-style interface
+    pastebin            Post a script to an anonymous pastebin
+    pastebin_grab       Print an online pastebin to STDOUT for local use.
+    upload_notebook     Upload an IPython notebook to hub.yt-project.org.
+    plot                Create a set of images
+    render              Create a simple volume rendering
+    rpdb                Connect to a currently running (on localhost) rpdb
+                        session. Commands run with --rpdb will trigger an rpdb
+                        session with any uncaught exceptions.
+    notebook            Run the IPython Notebook
+    serve               Run the Web GUI Reason
+    reason              Run the Web GUI Reason
+    stats               Print stats and max/min value of a given field (if
+                        requested), for one or more datasets (default field is
+                        Density)
+    update              Update the yt installation to the most recent version
+    upload_image        Upload an image to imgur.com. Must be PNG.
+
+
+To execute any such function, simply run:
+
+.. code-block:: bash
+
+   yt <subcommand>
+
+Finally, to identify the options associated with any of these subcommands, run:
+
+.. code-block:: bash
+
+   yt <subcommand> -h
+
+Plotting from the command line
+------------------------------
+
+First, we'll discuss plotting from the command line, then we will give a brief
+summary of the functionality provided by each command line subcommand. This
+example uses the :code:`DD0010/moving7_0010` dataset distributed in the yt
+mercurial repository.
+
+First let's see what our options are for plotting:
+
+.. code-block:: bash
+
+  $ yt plot --help
+
+There are many!  We can choose whether we want a slice (default) or a
+projection (``-p``), the field, the colormap, the center of the image, the
+width and unit of width of the image, the limits, the weighting field for
+projections, and on and on.  By default the plotting command will perform the
+same operation along all three axes, so keep that in mind if it takes three times
+as long as you'd like!  The center of a slice defaults to the center of
+the domain, so let's just give that a shot and see what it looks like:
+
+.. code-block:: bash
+
+  $ yt plot DD0010/moving7_0010
+
+Well, that looks pretty bad!  What has happened here is that the center of the
+domain only has some minor shifts in density, so the plot is essentially
+incomprehensible.  Let's try it again, but instead of slicing, let's project.
+This is a line integral through the domain, and for the density field this
+becomes a column density:
+
+.. code-block:: bash
+
+  $ yt plot -p DD0010/moving7_0010
+
+Now that looks much better!  Note that all three axes' projections appear
+nearly indistinguishable, because of how the two spheres are located in the
+domain.  We could center our domain on one of the spheres and take a slice, as
+well.  Now let's see what the domain looks like with grids overlaid, using the
+``--show-grids`` option:
+
+.. code-block:: bash
+
+  $ yt plot --show-grids -p DD0010/moving7_0010
+
+We can now see all the grids in the field of view.
+
+Command-line subcommand summary
+-------------------------------
+
+help
+++++
+
+Help lists all of the various command-line options in yt.
+
+bootstrap_dev
++++++++++++++
+
+After you have installed yt and you want to do some development, there may 
+be a few more steps to complete.  This subcommand automates building a 
+development environment for you by setting up your hg preferences correctly,
+creating/linking to a bitbucket account for hosting and sharing your code, 
+and setting up a pasteboard for your code snippets.  A full description of 
+how this works can be found in :ref:`bootstrap-dev`.
+
+bugreport         
++++++++++
+
+Encountering a bug in your own code can be a big hassle, but it can be 
+exponentially worse to find it in someone else's.  That's why we tried to 
+make it as easy as possible for users to report bugs they find in yt.  
+After you go through the necessary channels to make sure you're not just
+making a mistake (see :ref:`asking-for-help`), you can submit bug 
+reports using this nice utility.
+
+hop               
++++
+
+This lets you run the HOP algorithm as a halo-finder on one or more 
+datasets.  It nominally reproduces the behavior of enzohop from the 
+enzo suite.  There are several flags you can use in order to specify
+your threshold, input names, output names, and whether you want to use 
+dark matter or all particles.  To view these flags run help with the 
+hop subcommand.
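+
+To see those flags:
+
+.. code-block:: bash
+
+   yt hop -h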
+
+hub_register and hub_submit
++++++++++++++++++++++++++++
+
+We in the yt camp believe firmly in the ideals of open-source coding.  To
+further those ends, we have made a location for people to share their 
+nifty and useful codes with other scientists who might be able to use 
+them: the `yt hub <http://hub.yt-project.org/>`_.  Did you make a cool 
+code for generating a movie from your simulation outputs?  Submit it to 
+the hub.  Did you create a perl script that automates something and saves
+you some time while on a supercomputer?  Submit it to the hub.  And
+using the ``hub_submit`` command, you can do this really easily.  If you
+create a mercurial repository for the code you want to submit, just
+run the ``hub_submit`` command from within its directory structure, and we'll
+take care of the rest, by putting it on bitbucket and finally submitting 
+it to the hub to share with the rest of the yt community.  Check out 
+what people have already put up on the
+`yt hub <http://hub.yt-project.org/>`_, and see :ref:`share-your-scripts` 
+for more details about sharing your work on the hub.
+
+instinfo
+++++++++
+
+This gives very similar behavior to the update command, in that it 
+will automatically update your yt version to the latest in whichever
+repository you're in (stable, development, etc.).  It can also provide 
+you with the hash of the version you're using.
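+
+For example, to report on your current installation:
+
+.. code-block:: bash
+
+   yt instinfo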
+
+load
+++++
+
+This will start the iyt interactive environment with your specified 
+dataset already loaded.  See :ref:`interactive-prompt` for more details.
+
+mapserver
++++++++++
+
+Ever wanted to interact with your data using the 
+`google maps <http://maps.google.com/>`_ interface?  Now you can by using the
+yt mapserver.  See :ref:`mapserver` for more details.
+
+pastebin and pastebin_grab
+++++++++++++++++++++++++++
+
+The `pastebin <http://paste.yt-project.org/>`_ is an online location where 
+you can anonymously post code snippets and error messages to share with 
+other users in a quick, informal way.  It is often useful for debugging 
+code or co-developing.  By running the ``pastebin`` subcommand with a 
+text file, you send the contents of that file to an anonymous pastebin:
+
+.. code-block:: bash
+
+   yt pastebin my_script.py
+
+Running the ``pastebin_grab`` subcommand with a pastebin number
+(e.g. 1768) grabs the contents of that pastebin
+(e.g. the website http://paste.yt-project.org/show/1768) and sends it to
+STDOUT for local use.  For more details see the :ref:`pastebin` section.
+
+.. code-block:: bash
+
+   yt pastebin_grab 1768
+
+plot
+++++
+
+This command generates one or many simple plots for a single dataset.  
+By specifying the axis, center, width, etc. (run ``yt help plot`` for 
+details), you can create slices and projections easily at the 
+command-line.
+
+upload_notebook
++++++++++++++++
+
+This command will accept the filename of a ``.ipynb`` file (generated from an
+IPython notebook session) and upload it to the `yt hub
+<http://hub.yt-project.org/>`_, where others will be able to view and
+download it.  This is an easy method for recording a sequence of commands,
+their output, and narrative information, and then sharing it with others.  These
+notebooks will be viewable online, and the appropriate URLs will be returned on
+the command line.
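+
+A minimal invocation (the notebook filename here is hypothetical):
+
+.. code-block:: bash
+
+   yt upload_notebook my_analysis.ipynb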
+
+reason and serve
+++++++++++++++++
+
+The ``reason`` and ``serve`` subcommands have identical functionality in that
+they both initiate the Web GUI Reason. See :ref:`reason`.
+
+render
+++++++
+
+This command generates a volume rendering for a single dataset.  By specifying
+the center, width, number of pixels, number and thickness of contours, etc.
+(run ``yt help render`` for details),  you can create high-quality volume
+renderings at the command-line before moving on to more involved volume
+rendering scripts.
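+
+A minimal sketch, assuming the default rendering options are acceptable and
+reusing the sample dataset from the plotting section above:
+
+.. code-block:: bash
+
+   yt render DD0010/moving7_0010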
+
+rpdb
+++++
+
+Connect to a currently running (on localhost) rpdb session.
+
+notebook
+++++++++
+
+Launches an IPython notebook server and prints out instructions on how to open
+an ssh tunnel to connect to the notebook server with a web browser.  This is
+most useful when you want to run an IPython notebook using CPUs on a remote
+host.
+
+stats
++++++
+
+This subcommand provides you with some basic statistics on a given dataset.
+It reports the number of grids and cells on each level, the time
+of the dataset, the resolution, and the maximum density in a variety of units.
+It is equivalent to calling ``print_stats()`` on the dataset from within yt.
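+
+For example, reusing the sample dataset from the plotting section above:
+
+.. code-block:: bash
+
+   yt stats DD0010/moving7_0010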
+
+update
+++++++
+
+This subcommand updates the yt installation to the most recent version for
+your repository (e.g. stable, 2.0, development, etc.).  Adding the ``--all`` 
+flag will update the dependencies as well. See 
+:ref:`automated-update` for more details.
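+
+For example, to update both yt and its dependencies:
+
+.. code-block:: bash
+
+   yt update --all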
+
+.. _upload-image:
+
+upload_image
+++++++++++++
+
+Images are often worth a thousand words, so when you're trying to 
+share a piece of code that generates an image, or you're trying to 
+debug image-generation scripts, it can be useful to send your
+co-authors a link to the image.  This subcommand makes such sharing 
+a breeze.  By specifying the image to share, ``upload_image`` automatically
+uploads it anonymously to the website `imgur.com <http://imgur.com/>`_ and
+provides you with a link to share with your collaborators.  Note that the
+image *must* be in the PNG format in order to use this function.
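+
+For example (the filename is hypothetical, and must point at a PNG):
+
+.. code-block:: bash
+
+   yt upload_image my_slice.png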

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/reference/index.rst
--- a/source/reference/index.rst
+++ b/source/reference/index.rst
@@ -8,7 +8,9 @@
    :maxdepth: 2
 
    code_support
+   command-line
    api/api
+   python_introduction
    configuration
    field_list
    changelog

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/sharing_data_hub.rst
--- /dev/null
+++ b/source/sharing_data_hub.rst
@@ -0,0 +1,108 @@
+What is the yt Hub?
+===================
+
+The yt data hub is a mechanism by which images, data objects and projects can
+be shared with other people.  For instance, one can upload projections and
+browse them with an interface similar to Google Maps.
+
+.. note:: All items posted on the hub are public!
+
+Over time, more widgets will be added, and more datatypes will be supported
+for upload.  If you are interested in adding more ways of sharing data, please
+email the developers' list.  We would like to add support for 3D widgets such
+as isocontours as well as interactive binning and rebinning of data from yt
+data objects, to be displayed as phase plots and profiles.
+
+Registering a User
+------------------
+
+Because of problems with spammers, registering a user can only be done from the
+yt command line.  Once you have registered a user, you can log on to the
+website and obtain an API key.
+
+To register a user:
+
+.. code-block:: bash
+
+   $ yt hub_register
+
+This will walk you through the process of registering.  You will need to supply
+a name, a username, a password and an email address.  Once you have gotten that
+out of the way, you can go to http://hub.yt-project.org/login and log in with
+your new password.  You can then receive your API key by clicking on your
+username in the upper left.
+
+After you have gotten your API key, place it in your ``~/.yt/config`` file:
+
+.. code-block:: none
+
+   [yt]
+   hub_api_key = 3fd8de56c2884c13a2de4dd51a80974b
+
+Replace ``3fd8de56c2884c13a2de4dd51a80974b`` with your API key.  At this point,
+you're ready to go!
+
+What Can Be Uploaded
+--------------------
+
+Currently, the yt hub can accept these types of data:
+
+ * Projects and script repositories: these will be displayed with an optional
+   image, a description, and a link to the source repository.
+ * Projections and Slices: these will be displayed in a maps-like interface,
+   for interactive panning and zooming
+ * Plot collections: these will be displayed as a list of images
+
+How to Upload Data
+------------------
+
+Uploading data takes place inside scripts.  For the most part, it is relatively
+simple to do: you construct the object you would like to share, and then you
+upload it.
+
+Uploading Projects
+~~~~~~~~~~~~~~~~~~
+
+For information on how to share a project or a set of scripts, see
+:ref:`share-your-scripts`.
+
+Uploading Projections and Slices
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Projections and slices both have a ``hub_upload`` method.  Here is an example
+of uploading a projection:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   proj = pf.h.proj(0, "Density", weight="Density")
+   proj.hub_upload()
+
+Here is an example of uploading a slice:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("JHK-DD0030/galaxy0030")
+   sl = pf.h.slice(0, 0.5, fields=["Density"])
+   sl.hub_upload()
+
+Uploading Plot Collections
+~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Plot collections can be uploaded and viewed as a selection of images.  To
+upload a plot collection, call ``hub_upload`` on the plot collection.
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("DD0252/DD0252")
+   pc = PlotCollection(pf, 'c')
+   pc.add_projection("Density", 0)
+   pc.add_slice("Temperature", 1)
+   pc.add_profile_sphere(0.2, 'unitary', ["Density", "Temperature"])
+   dd = pf.h.all_data()
+   pc.add_phase_object(dd, ["Density", "Temperature", "CellMassMsun"],
+                       weight=None)
+   pc.hub_upload()

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/visualizing/index.rst
--- a/source/visualizing/index.rst
+++ b/source/visualizing/index.rst
@@ -9,5 +9,6 @@
    manual_plotting
    volume_rendering
    sketchfab
+   mapserver
    streamlines
    colormaps/index

diff -r 2f145817efd090a5f01699fde1bbc7039c231a56 -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 source/visualizing/mapserver.rst
--- /dev/null
+++ b/source/visualizing/mapserver.rst
@@ -0,0 +1,34 @@
+.. _mapserver:
+
+Mapserver
+---------
+
+The mapserver is a new, experimental feature.  It's based on `Leaflet
+<http://leaflet.cloudmade.com/>`_, a library written to create zoomable,
+map-tile interfaces.  (Similar to Google Maps.)  yt provides everything you
+need to start up a web server that will interactively re-pixelize an adaptive
+image.  This means you can explore your datasets in a fully pan-n-zoom
+interface.
+
+To start up the mapserver, you can use the command ``yt`` (see
+:ref:`command-line`) with the ``mapserver`` subcommand.  It takes several of
+the same options and arguments as the ``plot`` subcommand.  For instance:
+
+.. code-block:: bash
+
+   yt mapserver DD0050/DD0050
+
+That will take a slice along the x axis at the center of the domain.  The
+field, projection, weight and axis can all be specified on the command line.
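+
+For instance, to serve a projection instead of a slice (assuming the ``-p``
+flag works as it does for the ``plot`` subcommand):
+
+.. code-block:: bash
+
+   yt mapserver -p DD0050/DD0050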
+
+When you do this, it will spawn a micro-webserver on your localhost, and print
+the URL to connect to on standard output.  You can connect to it (or create an
+SSH tunnel to connect to it) and explore your data.  Double-clicking zooms, and
+dragging drags.
+
+.. image:: _images/mapserver.png
+   :scale: 50%
+
+This is also functional on touch-capable devices such as Android Tablets and
+iPads/iPhones.  In future versions, we hope to add halo-overlays and
+markers-of-interest to this.


https://bitbucket.org/yt_analysis/yt-doc/commits/c4beea87a17b/
Changeset:   c4beea87a17b
User:        MatthewTurk
Date:        2013-10-31 19:35:58
Summary:     Fixing autosummary
Affected #:  1 file

diff -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 -r c4beea87a17b1942abf9890df01395cb56838464 source/reference/api/api.rst
--- a/source/reference/api/api.rst
+++ b/source/reference/api/api.rst
@@ -376,7 +376,9 @@
 
 Absorption spectra fitting:
 
-.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
+.. autosummary::
+
+   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
 
 Sunrise exporting:
 


https://bitbucket.org/yt_analysis/yt-doc/commits/2be8f4fb078f/
Changeset:   2be8f4fb078f
User:        ngoldbaum
Date:        2013-10-31 05:45:48
Summary:     Adding documentation for plot modifying functions.
Affected #:  1 file

diff -r 918bb31b8eb63a2be3e7e3684159822f87fa1d66 -r 2be8f4fb078f37cc4a0ca1f4aedc3a61e6adeef0 source/visualizing/plots.rst
--- a/source/visualizing/plots.rst
+++ b/source/visualizing/plots.rst
@@ -38,7 +38,7 @@
 is requested of it -- for instance, when the width or field is changed
 -- this high-resolution data is then pixelized and placed in a buffer
 of fixed size. This is accomplished behind the scenes using
-:class:`yt.visualization.fixed_resolution.FixedResolutionBuffer``
+:class:`~yt.visualization.fixed_resolution.FixedResolutionBuffer`
 ``PlotWindow`` expose the underlying matplotlib ``figure`` and
 ``axes`` objects, making it easy to customize your plots and 
 add new annotations.
@@ -283,6 +283,167 @@
 
 __ :class:`~yt.visualization.plot_window.OffAxisProjectionPlot`
 
+Plot Customization
+------------------
+
+You can customize each of the four plot types above in identical ways.  We'll go
+over each of the customization methods below.  For each of the examples below we
+will modify the following plot.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.save()
+
+Panning and zooming
+~~~~~~~~~~~~~~~~~~~
+
+There are three methods to dynamically pan around the data.  
+
+:class:`~yt.visualization.plot_window.SlicePlot.pan` accepts x and y deltas in code
+units.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.pan((2/pf['kpc'],2/pf['kpc']))
+   slc.save()
+
+:class:`~yt.visualization.plot_window.SlicePlot.pan_rel` accepts deltas in units relative
+to the field of view of the plot.  
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.pan_rel((0.1, -0.1))
+   slc.save()
+
+:class:`~yt.visualization.plot_window.SlicePlot.zoom` accepts a factor to zoom in by.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.zoom(2)
+   slc.save()
+
+Set axes units
+~~~~~~~~~~~~~~
+
+:class:`~yt.visualization.plot_window.SlicePlot.set_axes_unit` allows the customization of
+the axes unit labels.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.set_axes_unit('Mpc')
+   slc.save()
+
+Set the plot center
+~~~~~~~~~~~~~~~~~~~
+
+The :class:`~yt.visualization.plot_window.SlicePlot.set_center` function accepts a new
+center for the plot, in code units.  New centers must be two-element tuples.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.set_center((0.53, 0.53))
+   slc.save()
+
+Fonts
+~~~~~
+
+:class:`~yt.visualization.plot_window.SlicePlot.set_font` allows font customization.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.set_font({'family': 'sans-serif', 'style': 'italic','weight': 'bold', 'size': 24})
+   slc.save()
+
+Colormaps
+~~~~~~~~~
+
+Each of these functions accepts two arguments.  In all cases the first argument
+is a field name.  This makes it possible to use different custom colormaps for
+different fields tracked by the plot object.
+
+To change the colormap for the plot, call the
+:class:`~yt.visualization.plot_window.SlicePlot.set_cmap` function.  Use any of the
+colormaps listed in the :ref:`colormaps` section.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.set_cmap('VorticitySquared', 'RdBu_r')
+   slc.save()
+
+The :class:`~yt.visualization.plot_window.SlicePlot.set_log` function accepts a field name
+and a boolean.  If the boolean is :code:`True`, the colormap for the field will
+be log scaled.  If it is :code:`False`, the colormap will be linear.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.set_log('VorticitySquared', False)
+   slc.save()
+
+Lastly, the :class:`~yt.visualization.plot_window.SlicePlot.set_zlim` function makes it
+possible to set a custom colormap range.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.set_zlim('VorticitySquared', 1e-30, 1e-25)
+   slc.save()
+
+Set the size of the plot
+~~~~~~~~~~~~~~~~~~~~~~~~
+
+To set the size of the plot, use the
+:class:`~yt.visualization.plot_window.SlicePlot.set_window_size` function.  The argument
+is the size of the longest edge of the plot in inches.  View the full resolution
+image to see the difference more clearly.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.set_window_size(10)
+   slc.save()
+
+To change the resolution of the image, call the
+:class:`~yt.visualization.plot_window.SlicePlot.set_buff_size` function.
+
+.. python-script::
+
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'VorticitySquared', width=(10,'kpc'), center='max')
+   slc.set_buff_size(1600)
+   slc.save()
+
 Quantitative Analysis and Visualization
 ---------------------------------------
 


https://bitbucket.org/yt_analysis/yt-doc/commits/d78bbec871ed/
Changeset:   d78bbec871ed
User:        ngoldbaum
Date:        2013-10-31 05:54:25
Summary:     Making the code blocks in the manual plotting docs python-script blocks.
Affected #:  1 file

diff -r 2be8f4fb078f37cc4a0ca1f4aedc3a61e6adeef0 -r d78bbec871ed46319586f2f58edfa82c667bef75 source/visualizing/manual_plotting.rst
--- a/source/visualizing/manual_plotting.rst
+++ b/source/visualizing/manual_plotting.rst
@@ -32,16 +32,16 @@
 generate an FRB is to use the ``.to_frb(width, resolution, center=None)`` method
 of any two-dimensional data object:
 
-.. code-block:: python
+.. python-script::
    
    import pylab as P
    from yt.mods import *
-   pf = load("RedshiftOutput0005")
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
 
    c = pf.h.find_max('Density')[1]
    proj = pf.h.proj(0, 'Density')
 
-   width = 1.5/pf['mpc'] # we want a 1.5 mpc view
+   width = 10/pf['kpc'] # we want a 10 kpc view
    res = [1000, 1000] # create an image with 1000x1000 pixels
    frb = proj.to_frb(width, res, center=c)
 
@@ -66,20 +66,20 @@
 
 This is perhaps the simplest thing to do. ``yt`` provides a number of one dimensional objects, and these return a 1-D numpy array of their contents with direct dictionary access. As a simple example, take a :class:`~yt.data_objects.data_containers.AMROrthoRayBase` object, which can be created from a hierarchy by calling ``pf.h.ortho_ray(axis, center)``. 
 
-.. code-block:: python
+.. python-script::
 
    from yt.mods import *
    import pylab as P
-   pf = load("RedshiftOutput0005")
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
    c = pf.h.find_max("Density")[1]
    ax = 0 # take a line cut along the x axis
    ray = pf.h.ortho_ray(ax, (c[1], c[2])) # cutting through the y0,z0 such that we hit the max density
 
    P.subplot(211)
-   P.plot(ray['x'], ray['Density'])
+   P.semilogy(ray['x'], ray['Density'])
    P.ylabel('Density')
    P.subplot(212)
-   P.plot(ray['x'], ray['Temperature'])
+   P.semilogy(ray['x'], ray['Temperature'])
    P.xlabel('x')
    P.ylabel('Temperature')
 
@@ -99,21 +99,20 @@
 :class:`yt.visualization.profile_plotter.PhasePlotter` object, giving it a data
 source and three fields: the x-axis field, the y-axis field, and the z field (that is, the color of the cells). 
 
-.. code-block:: python
+.. python-script::
    
    from yt.mods import *
    import yt.visualization.profile_plotter as pp
    import pylab as P
    
-   pf = load("RedshiftOutput0005")
+   pf = load("Enzo_64/DD0043/data0043")
    c = pf.h.find_max("Density")[1]
-   radius = 1.5/pf['mpc']
+   radius = 10/pf['mpc']
    sph = pf.h.sphere(c,radius)
    
    phase = pp.PhasePlotter(sph,'Density', 'Temperature','CellMassMsun')
    
    fig, ax = phase.plot.to_mpl()
-   # sorry this is convoluted!
    from yt.visualization._mpl_imports import FigureCanvasAgg
    
    canvas = FigureCanvasAgg(fig)


https://bitbucket.org/yt_analysis/yt-doc/commits/a231d26a2cc8/
Changeset:   a231d26a2cc8
User:        ngoldbaum
Date:        2013-11-01 22:24:46
Summary:     Adding new example notebooks.
Affected #:  9 files

diff -r d78bbec871ed46319586f2f58edfa82c667bef75 -r a231d26a2cc8a0f00e110bcd7069446f77570a9d source/cookbook/custom_colorbar_tickmarks.ipynb
--- /dev/null
+++ b/source/cookbook/custom_colorbar_tickmarks.ipynb
@@ -0,0 +1,90 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "%matplotlib inline\n",
+      "from yt.mods import *"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load('IsolatedGalaxy/galaxy0030/galaxy0030')\n",
+      "slc = SlicePlot(pf, 'x', 'Density')\n",
+      "slc"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`PlotWindow` plots are containers for plots, keyed to field names.  Below, we get a copy of the plot for the `Density` field."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "plot = slc.plots['Density']"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The plot has a few attributes that point to underlying `matplotlib` plot primites.  For example, the `colorbar` object corresponds to the `cb` attribute of the plot."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "colorbar = plot.cb"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To set custom tickmarks, simply call the `matplotlib` [`set_ticks`](http://matplotlib.org/api/colorbar_api.html#matplotlib.colorbar.ColorbarBase.set_ticks) and [`set_ticklabels`](http://matplotlib.org/api/colorbar_api.html#matplotlib.colorbar.ColorbarBase.set_ticklabels) functions."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "colorbar.set_ticks([1e-28])\n",
+      "colorbar.set_ticklabels(['$10^{-28}$'])\n",
+      "slc"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r d78bbec871ed46319586f2f58edfa82c667bef75 -r a231d26a2cc8a0f00e110bcd7069446f77570a9d source/cookbook/custom_colorbar_tickmarks.rst
--- /dev/null
+++ b/source/cookbook/custom_colorbar_tickmarks.rst
@@ -0,0 +1,4 @@
+Custom Colorbar Tickmarks
+--------------------------
+
+.. notebook:: custom_colorbar_tickmarks.ipynb

diff -r d78bbec871ed46319586f2f58edfa82c667bef75 -r a231d26a2cc8a0f00e110bcd7069446f77570a9d source/cookbook/embedded_javascript_animation.ipynb
--- /dev/null
+++ b/source/cookbook/embedded_javascript_animation.ipynb
@@ -0,0 +1,70 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "This example shows how to embed an animation produced by `matplotlib` into an IPython notebook.  This example makes use of `matplotlib`'s [animation toolkit](http://matplotlib.org/api/animation_api.html) to transform individual frames into a final rendered movie.  \n",
+      "\n",
+      "Additionally, this uses Jake VanderPlas' [`JSAnimation`](https://github.com/jakevdp/JSAnimation) library to embed the movie as a javascript widget, directly in the notebook.  This does not use `ffmpeg` to stitch the frames together and thus does not require `ffmpeg`.  However, you must have `JSAnimation` installed.\n",
+      "\n",
+      "To do so, clone to git repostiory and run `python setup.py install` in the root of the repository."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "from JSAnimation import IPython_display\n",
+      "from matplotlib import animation"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Here we set up the animation.  We use `yt` to load the data and create each frame and use matplotlib to stitch the frames together.  Note that we customize the plot a bit by calling the `set_zlim` function.  Customizations only need to be applied to the first frame - they will carry through to the rest.\n",
+      "\n",
+      "This may take a while to run, be patient."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import matplotlib.pyplot as plt\n",
+      "from matplotlib.backends.backend_agg import FigureCanvasAgg\n",
+      "\n",
+      "prj = ProjectionPlot(load('Enzo_64/DD0000/data0000'), 0, 'Density', weight_field='Density',width=(180,'mpccm'))\n",
+      "prj.set_zlim('Density',1e-32,1e-26)\n",
+      "fig = prj.plots['Density'].figure\n",
+      "fig.canvas = FigureCanvasAgg(fig)\n",
+      "\n",
+      "# animation function.  This is called sequentially\n",
+      "def animate(i):\n",
+      "    pf = load('Enzo_64/DD%04i/data%04i' % (i,i))\n",
+      "    prj._switch_pf(pf)\n",
+      "\n",
+      "# call the animator.  blit=True means only re-draw the parts that have changed.\n",
+      "animation.FuncAnimation(fig, animate, frames=44, interval=200, blit=False)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r d78bbec871ed46319586f2f58edfa82c667bef75 -r a231d26a2cc8a0f00e110bcd7069446f77570a9d source/cookbook/embedded_javascript_animation.rst
--- /dev/null
+++ b/source/cookbook/embedded_javascript_animation.rst
@@ -0,0 +1,4 @@
+Making a javascript animation widget using JSAnimation
+------------------------------------------------------
+
+.. notebook:: embedded_javascript_animation.ipynb

diff -r d78bbec871ed46319586f2f58edfa82c667bef75 -r a231d26a2cc8a0f00e110bcd7069446f77570a9d source/cookbook/embedded_webm_animation.ipynb
--- /dev/null
+++ b/source/cookbook/embedded_webm_animation.ipynb
@@ -0,0 +1,122 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "This example shows how to embed an animation produced by `matplotlib` into an IPython notebook.  This example makes use of `matplotlib`'s [animation toolkit](http://matplotlib.org/api/animation_api.html) to transform individual frames into a final rendered movie.  \n",
+      "\n",
+      "Matplotlib uses [`ffmpeg`](http://www.ffmpeg.org/) to generate the movie, so you must install `ffmpeg` for this example to work correctly.  Usually the best way to install `ffmpeg` is using your system's package manager."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "from matplotlib import animation"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "First, we need to construct a function that will embed the video produced by ffmpeg directly into the notebook document. This makes use of the [HTML5 video tag](http://www.w3schools.com/html/html5_video.asp) and the WebM video format.  WebM is supported by Chrome, Firefox, and Opera, but not Safari and Internet Explorer.  If you have trouble viewing the video you may need to use a different video format.  Since this uses `libvpx` to construct the frames, you will need to ensure that ffmpeg has been compiled with `libvpx` support."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from tempfile import NamedTemporaryFile\n",
+      "\n",
+      "VIDEO_TAG = \"\"\"<video controls>\n",
+      " <source src=\"data:video/x-webm;base64,{0}\" type=\"video/webm\">\n",
+      " Your browser does not support the video tag.\n",
+      "</video>\"\"\"\n",
+      "\n",
+      "def anim_to_html(anim):\n",
+      "    if not hasattr(anim, '_encoded_video'):\n",
+      "        with NamedTemporaryFile(suffix='.webm') as f:\n",
+      "            anim.save(f.name, fps=6, extra_args=['-vcodec', 'libvpx'])\n",
+      "            video = open(f.name, \"rb\").read()\n",
+      "        anim._encoded_video = video.encode(\"base64\")\n",
+      "    \n",
+      "    return VIDEO_TAG.format(anim._encoded_video)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Next, we define a function to actually display the video inline in the notebook."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from IPython.display import HTML\n",
+      "\n",
+      "def display_animation(anim):\n",
+      "    plt.close(anim._fig)\n",
+      "    return HTML(anim_to_html(anim))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Finally, we set up the animation itsself.  We use `yt` to load the data and create each frame and use matplotlib to stitch the frames together.  Note that we customize the plot a bit by calling the `set_zlim` function.  Customizations only need to be applied to the first frame - they will carry through to the rest.\n",
+      "\n",
+      "This may take a while to run, be patient."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import matplotlib.pyplot as plt\n",
+      "from matplotlib.backends.backend_agg import FigureCanvasAgg\n",
+      "\n",
+      "prj = ProjectionPlot(load('Enzo_64/DD0000/data0000'), 0, 'Density', weight_field='Density',width=(180,'mpccm'))\n",
+      "prj.set_zlim('Density',1e-32,1e-26)\n",
+      "fig = prj.plots['Density'].figure\n",
+      "fig.canvas = FigureCanvasAgg(fig)\n",
+      "\n",
+      "# animation function.  This is called sequentially\n",
+      "def animate(i):\n",
+      "    pf = load('Enzo_64/DD%04i/data%04i' % (i,i))\n",
+      "    prj._switch_pf(pf)\n",
+      "\n",
+      "# call the animator.  blit=True means only re-draw the parts that have changed.\n",
+      "anim = animation.FuncAnimation(fig, animate, frames=44, interval=200, blit=False)\n",
+      "\n",
+      "# call our new function to display the animation\n",
+      "display_animation(anim)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r d78bbec871ed46319586f2f58edfa82c667bef75 -r a231d26a2cc8a0f00e110bcd7069446f77570a9d source/cookbook/embedded_webm_animation.rst
--- /dev/null
+++ b/source/cookbook/embedded_webm_animation.rst
@@ -0,0 +1,4 @@
+Making animations using matplotlib and ffmpeg
+---------------------------------------------
+
+.. notebook:: embedded_webm_animation.ipynb

diff -r d78bbec871ed46319586f2f58edfa82c667bef75 -r a231d26a2cc8a0f00e110bcd7069446f77570a9d source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -42,4 +42,8 @@
    :maxdepth: 1
 
    notebook_tutorial
+   custom_colorbar_tickmarks
+   embedded_javascript_animation
+   embedded_webm_animation
    ../analyzing/analysis_modules/sunyaev_zeldovich
+   

diff -r d78bbec871ed46319586f2f58edfa82c667bef75 -r a231d26a2cc8a0f00e110bcd7069446f77570a9d source/cookbook/notebook_tutorial.rst
--- a/source/cookbook/notebook_tutorial.rst
+++ b/source/cookbook/notebook_tutorial.rst
@@ -1,4 +1,33 @@
 Notebook Tutorial
 -----------------
 
-This is a placeholder for the badass notebook tutorial that Nathan is going to write.
+The IPython notebook is a powerful system for literate coding - a style of
+writing code that embeds input, output, and explanatory text into one document.
+
+yt has deep integration with the IPython notebook, explained in-depth in the
+other example notebooks and the rest of the yt documentation.  This page is here
+to give a brief introduction to the notebook itself.
+
+To start the notebook, enter the following command at the bash command line:
+
+.. code-block:: bash
+
+   $ ipython notebook
+
+Depending on your default web browser and system setup this will open a web
+browser and direct you to the notebook dashboard.  If it does not, you might
+need to connect to the notebook manually.  See the `IPython documentation
+<http://ipython.org/ipython-doc/stable/interactive/notebook.html#starting-the-notebook-server>`_
+for more details.
+
+For the notebook tutorial, we rely on example notebooks that are part of the
+IPython documentation.  We link to static nbviewer renderings of the 'evaluated'
+versions of these example notebooks.  If you would like to run them locally on
+your own computer, simply download the notebook by clicking the 'Download
+Notebook' link in the top right corner of each page.
+
+1. `Running Code in the IPython Notebook <http://nbviewer.ipython.org/url/github.com/ipython/ipython/raw/master/examples/notebooks/Part%201%20-%20Running%20Code.ipynb>`_
+2. `Basic Output <http://nbviewer.ipython.org/url/github.com/ipython/ipython/raw/master/examples/notebooks/Part%202%20-%20Basic%20Output.ipynb>`_
+3. `Plotting with matplotlib <http://nbviewer.ipython.org/url/github.com/ipython/ipython/raw/master/examples/notebooks/Part%203%20-%20Plotting%20with%20Matplotlib.ipynb>`_
+4. `Markdown Cells <http://nbviewer.ipython.org/url/github.com/ipython/ipython/raw/master/examples/notebooks/Part%204%20-%20Markdown%20Cells.ipynb>`_
+5. `IPython's rich display system <http://nbviewer.ipython.org/url/github.com/ipython/ipython/raw/master/examples/notebooks/Part%205%20-%20Rich%20Display%20System.ipynb>`_

diff -r d78bbec871ed46319586f2f58edfa82c667bef75 -r a231d26a2cc8a0f00e110bcd7069446f77570a9d source/developing/building_the_docs.rst
--- a/source/developing/building_the_docs.rst
+++ b/source/developing/building_the_docs.rst
@@ -43,6 +43,8 @@
 - pandoc 1.11.1
 - Rockstar halo finder 0.99.6
 - SZpack_ 1.1.1
+- ffmpeg 1.2.4 (compiled with libvpx support)
+- JSAnimation (git hash 1b95cb3a3a)
 
 .. _SZpack: http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html
 


https://bitbucket.org/yt_analysis/yt-doc/commits/1ae32f6f8858/
Changeset:   1ae32f6f8858
User:        ngoldbaum
Date:        2013-11-02 08:37:50
Summary:     Fixing some bad links.
Affected #:  1 file

diff -r a231d26a2cc8a0f00e110bcd7069446f77570a9d -r 1ae32f6f885825a029f90d0d0bab45a7304b99e7 source/installing.rst
--- a/source/installing.rst
+++ b/source/installing.rst
@@ -53,7 +53,7 @@
 the entire installation process, so it is usually quite cumbersome.  By looking 
 at the last few hundred lines (i.e. ``tail -500 yt_install.log``), you can 
 potentially figure out what went wrong.  If you have problems, though, do not 
-hesitate to :ref:`contact us asking-for-help` for assistance.
+hesitate to :ref:`contact us <asking-for-help>` for assistance.
 
 .. _activating-yt:
 
@@ -115,7 +115,7 @@
 yt, which means you have successfully installed yt.  Congratulations!  
 
 If you get an error, follow the instructions it gives you to debug the problem.  
-Do not hesitate to :ref:`contact us asking-for-help` so we can help you 
+Do not hesitate to :ref:`contact us <asking-for-help>` so we can help you 
 figure it out.
 
 .. _updating-yt:


https://bitbucket.org/yt_analysis/yt-doc/commits/4e31555d10c9/
Changeset:   4e31555d10c9
User:        ngoldbaum
Date:        2013-11-02 08:40:18
Summary:     Merging.
Affected #:  1 file



https://bitbucket.org/yt_analysis/yt-doc/commits/29f4c3fe570e/
Changeset:   29f4c3fe570e
User:        ngoldbaum
Date:        2013-11-02 08:47:43
Summary:     Fixing an error I introduced in the API docs.
Affected #:  1 file

diff -r 4e31555d10c92f00acee24f7bbf63f59c915a8f6 -r 29f4c3fe570e3429d165974e4f14657b84e7f5de source/reference/api/api.rst
--- a/source/reference/api/api.rst
+++ b/source/reference/api/api.rst
@@ -376,7 +376,10 @@
 
 Absorption spectra fitting:
 
-.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
+.. autosummary:: 
+   :toctree: generated/
+
+   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
 
 Sunrise exporting:
 
@@ -583,9 +586,8 @@
    ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
    ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
    ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
-
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
+   ~yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
+   ~yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
 
 
 Testing Infrastructure


https://bitbucket.org/yt_analysis/yt-doc/commits/e288cc0ef750/
Changeset:   e288cc0ef750
User:        jzuhone
Date:        2013-10-30 02:06:20
Summary:     Merged chummels/yt-doc into default
Affected #:  12 files

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 extensions/notebook_sphinxext.py
--- a/extensions/notebook_sphinxext.py
+++ b/extensions/notebook_sphinxext.py
@@ -51,7 +51,11 @@
         f.write(script_text.encode('utf8'))
         f.close()
 
-        evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval)
+        try:
+            evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval)
+        except:
+            # bail
+            return []
 
         # Create link to notebook and script files
         link_rst = "(" + \

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 extensions/notebookcell_sphinxext.py
--- a/extensions/notebookcell_sphinxext.py
+++ b/extensions/notebookcell_sphinxext.py
@@ -32,7 +32,11 @@
 
         convert_to_ipynb('temp.py', 'temp.ipynb')
 
-        evaluated_text = evaluate_notebook('temp.ipynb')
+        try:
+            evaluated_text = evaluate_notebook('temp.ipynb')
+        except:
+            # bail
+            return []
 
         # create notebook node
         attributes = {'format': 'html', 'source': 'nb_path'}
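
For reference, both hunks above apply the same defensive pattern: notebook
evaluation is wrapped in a try/except, and on failure the directive returns
an empty node list so that one broken notebook cannot abort the whole Sphinx
build.  A minimal standalone sketch of the idea (``evaluate_notebook`` and
``nb_path`` stand in for the extensions' own helpers and are assumptions
here, not the actual extension API):

    from docutils import nodes

    def run_notebook_directive(evaluate_notebook, nb_path):
        """Evaluate a notebook, degrading to 'no output' on any failure."""
        try:
            evaluated_text = evaluate_notebook(nb_path)
        except Exception:
            # bail: an empty node list makes the directive a no-op
            # instead of failing the documentation build
            return []
        return [nodes.raw('', evaluated_text, format='html')]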

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 helper_scripts/show_fields.py
--- a/helper_scripts/show_fields.py
+++ b/helper_scripts/show_fields.py
@@ -17,6 +17,17 @@
 everywhere, "Enzo" fields in Enzo datasets, "Orion" fields in Orion datasets,
 and so on.
 
+Try using ``pf.h.field_list`` and ``pf.h.derived_field_list`` to view the
+native and derived fields available for your dataset, respectively.  For
+example, to display the native fields in alphabetical order:
+
+.. notebook-cell::
+
+  from yt.mods import *
+  pf = load("Enzo_64/DD0043/data0043")
+  for i in sorted(pf.h.field_list):
+    print i
+
 .. note:: Universal fields will be overridden by a code-specific field.
 
 .. rubric:: Table of Contents
@@ -95,7 +106,37 @@
 print
 print_all_fields(FLASHFieldInfo)
 
-print "Nyx-Specific Field List"
+print "Athena-Specific Field List"
 print "--------------------------"
 print
+print_all_fields(AthenaFieldInfo)
+
+print "Nyx-Specific Field List"
+print "-----------------------"
+print
 print_all_fields(NyxFieldInfo)
+
+print "Castro-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(CastroFieldInfo)
+
+print "Chombo-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(ChomboFieldInfo)
+
+print "Pluto-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(PlutoFieldInfo)
+
+print "Grid-Data-Format-Specific Field List"
+print "------------------------------------"
+print
+print_all_fields(GDFFieldInfo)
+
+print "Generic-Format (Stream) Field List"
+print "----------------------------------"
+print
+print_all_fields(StreamFieldInfo)
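
The notebook-cell added above only lists the native fields; a parallel
sketch for the derived fields, under the same assumptions (the
``Enzo_64/DD0043/data0043`` sample dataset and the 2.x ``pf.h`` interface),
would be:

    from yt.mods import *
    pf = load("Enzo_64/DD0043/data0043")
    # derived_field_list holds the fields yt can compute on top of the
    # native, on-disk fields reported by field_list
    for field in sorted(pf.h.derived_field_list):
        print field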

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 source/api/api.rst
--- a/source/api/api.rst
+++ /dev/null
@@ -1,563 +0,0 @@
-API Reference
-=============
-
-Plots and the Plotting Interface
---------------------------------
-
-SlicePlot and ProjectionPlot
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_window.SlicePlot
-   ~yt.visualization.plot_window.OffAxisSlicePlot
-   ~yt.visualization.plot_window.ProjectionPlot
-   ~yt.visualization.plot_window.OffAxisProjectionPlot
-
-PlotCollection
-^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_collection.PlotCollection
-   ~yt.visualization.plot_collection.PlotCollectionInteractive
-   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
-   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
-   ~yt.visualization.base_plot_types.get_multi_plot
-
-Data Sources
-------------
-
-.. _physical-object-api:
-
-Physical Objects
-^^^^^^^^^^^^^^^^
-
-These are the objects that act as physical selections of data, describing a
-region in space.  These are not typically addressed directly; see
-:ref:`available-objects` for more information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.data_containers.AMRCoveringGridBase
-   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
-   ~yt.data_objects.data_containers.AMRCylinderBase
-   ~yt.data_objects.data_containers.AMRGridCollectionBase
-   ~yt.data_objects.data_containers.AMRRayBase
-   ~yt.data_objects.data_containers.AMROrthoRayBase
-   ~yt.data_objects.data_containers.AMRStreamlineBase
-   ~yt.data_objects.data_containers.AMRProjBase
-   ~yt.data_objects.data_containers.AMRRegionBase
-   ~yt.data_objects.data_containers.AMRSliceBase
-   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
-   ~yt.data_objects.data_containers.AMRSphereBase
-   ~yt.data_objects.data_containers.AMRSurfaceBase
-
-Time Series Objects
-^^^^^^^^^^^^^^^^^^^
-
-These are objects that either contain and represent or operate on series of
-datasets.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.time_series.TimeSeriesData
-   ~yt.data_objects.time_series.TimeSeriesDataObject
-   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
-   ~yt.data_objects.time_series.AnalysisTaskProxy
-
-Frontends
----------
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.grid_patch.AMRGridPatch
-   ~yt.data_objects.hierarchy.AMRHierarchy
-   ~yt.data_objects.static_output.StaticOutput
-
-Enzo
-^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.enzo.data_structures.EnzoGrid
-   ~yt.frontends.enzo.data_structures.EnzoHierarchy
-   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
-
-Orion
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.orion.data_structures.OrionGrid
-   ~yt.frontends.orion.data_structures.OrionHierarchy
-   ~yt.frontends.orion.data_structures.OrionStaticOutput
-
-FLASH
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.flash.data_structures.FLASHGrid
-   ~yt.frontends.flash.data_structures.FLASHHierarchy
-   ~yt.frontends.flash.data_structures.FLASHStaticOutput
-
-Chombo
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.chombo.data_structures.ChomboGrid
-   ~yt.frontends.chombo.data_structures.ChomboHierarchy
-   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
-
-RAMSES
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.ramses.data_structures.RAMSESGrid
-   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
-   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
-
-Derived Datatypes
------------------
-
-Profiles and Histograms
-^^^^^^^^^^^^^^^^^^^^^^^
-
-These types are used to sum data up and either return that sum or return an
-average.  Typically they are more easily used through the
-`yt.visualization.plot_collection` interface.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.profiles.BinnedProfile1D
-   ~yt.data_objects.profiles.BinnedProfile2D
-   ~yt.data_objects.profiles.BinnedProfile3D
-
-Halo Finding and Particle Functions
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Halo finding can be executed using these types.  Here we list the main halo
-finders as well as a few other supplemental objects.
-
-.. rubric:: Halo Finders
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
-   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
-
-You can also operate on the Halo and HaloList objects themselves:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.Halo
-   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
-
-There are also functions for loading halos from disk:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
-
-We have several methods that work to create merger trees:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
-
-You can use halo catalogs generated externally as well:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
-
-Halo Profiling
-^^^^^^^^^^^^^^
-
-yt provides a comprehensive halo profiler that can filter, center, and analyze
-halos en masse.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
-
-
-Two Point Functions
-^^^^^^^^^^^^^^^^^^^
-
-These functions are designed to create correlations or other results of
-operations acting on two spatially-distinct points in a data source.  See also
-:ref:`two_point_functions`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
-   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
-
-Field Types
------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.field_info_container.DerivedField
-   ~yt.data_objects.field_info_container.FieldInfoContainer
-   ~yt.data_objects.field_info_container.ValidateDataField
-   ~yt.data_objects.field_info_container.ValidateGridType
-   ~yt.data_objects.field_info_container.ValidateParameter
-   ~yt.data_objects.field_info_container.ValidateProperty
-   ~yt.data_objects.field_info_container.ValidateSpatial
-
-Image Handling
---------------
-
-For volume renderings and fixed resolution buffers the image object returned is
-an ``ImageArray`` object, which has useful functions for image saving and 
-writing to bitmaps.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.image_array.ImageArray
-   ~yt.data_objects.image_array.ImageArray.write_png
-   ~yt.data_objects.image_array.ImageArray.write_hdf5
-
-Extension Types
----------------
-
-Coordinate Transformations
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
-   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
-
-Cosmology, Star Particle Analysis, and Simulated Observations
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-For the generation of stellar SEDs.  (See also :ref:`star_analysis`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
-
-Light cone generation and simulation analysis.  (See also
-:ref:`light-cone-generator`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
-   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
-
-Absorption and X-ray spectra and spectral lines:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
-
-Absorption spectra fitting:
-
-.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
-
-Sunrise exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
-
-RADMC-3D exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
-
-Radial Column Density
-^^^^^^^^^^^^^^^^^^^^^
-
-If you'd like to calculate the column density out to a given point, from a
-specified center, yt can provide that information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
-
-Volume Rendering
-^^^^^^^^^^^^^^^^
-
-See also :ref:`volume_rendering`.
-
-Here are the primary entry points:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.Camera
-   ~yt.visualization.volume_rendering.camera.off_axis_projection
-   ~yt.visualization.volume_rendering.camera.allsky_projection
-
-These objects set up the way the image looks:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
-
-There are also advanced objects for particular use cases:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
-   ~yt.visualization.volume_rendering.camera.FisheyeCamera
-   ~yt.visualization.volume_rendering.camera.MosaicCamera
-   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
-   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
-   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
-   ~yt.visualization.volume_rendering.camera.StereoPairCamera
-
-Streamlining
-^^^^^^^^^^^^
-
-See also :ref:`streamlines`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.streamlines.Streamlines
-
-Image Writing
-^^^^^^^^^^^^^
-
-These functions are all used for fast writing of images directly to disk,
-without calling matplotlib.  This can be very useful for high-cadence outputs
-where colorbars are unnecessary or for volume rendering.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.image_writer.multi_image_composite
-   ~yt.visualization.image_writer.write_bitmap
-   ~yt.visualization.image_writer.write_projection
-   ~yt.visualization.image_writer.write_fits
-   ~yt.visualization.image_writer.write_image
-   ~yt.visualization.image_writer.map_to_colors
-   ~yt.visualization.image_writer.strip_colormap_data
-   ~yt.visualization.image_writer.splat_points
-   ~yt.visualization.image_writer.annotate_image
-   ~yt.visualization.image_writer.scale_image
-
-We also provide a module that is very good for generating EPS figures,
-particularly with complicated layouts.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.eps_writer.DualEPS
-   ~yt.visualization.eps_writer.single_plot
-   ~yt.visualization.eps_writer.multiplot
-   ~yt.visualization.eps_writer.multiplot_yt
-   ~yt.visualization.eps_writer.return_cmap
-
-.. _image-panner-api:
-
-Derived Quantities
-------------------
-
-See :ref:`derived-quantities`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.derived_quantities._AngularMomentumVector
-   ~yt.data_objects.derived_quantities._BaryonSpinParameter
-   ~yt.data_objects.derived_quantities._BulkVelocity
-   ~yt.data_objects.derived_quantities._CenterOfMass
-   ~yt.data_objects.derived_quantities._Extrema
-   ~yt.data_objects.derived_quantities._IsBound
-   ~yt.data_objects.derived_quantities._MaxLocation
-   ~yt.data_objects.derived_quantities._ParticleSpinParameter
-   ~yt.data_objects.derived_quantities._TotalMass
-   ~yt.data_objects.derived_quantities._TotalQuantity
-   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
-
-.. _callback-api:
-
-Callback List
--------------
-
-
-See also :ref:`callbacks`.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_modifications.ArrowCallback
-   ~yt.visualization.plot_modifications.ClumpContourCallback
-   ~yt.visualization.plot_modifications.ContourCallback
-   ~yt.visualization.plot_modifications.CoordAxesCallback
-   ~yt.visualization.plot_modifications.CuttingQuiverCallback
-   ~yt.visualization.plot_modifications.GridBoundaryCallback
-   ~yt.visualization.plot_modifications.HopCircleCallback
-   ~yt.visualization.plot_modifications.HopParticleCallback
-   ~yt.visualization.plot_modifications.LabelCallback
-   ~yt.visualization.plot_modifications.LinePlotCallback
-   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
-   ~yt.visualization.plot_modifications.ParticleCallback
-   ~yt.visualization.plot_modifications.PointAnnotateCallback
-   ~yt.visualization.plot_modifications.QuiverCallback
-   ~yt.visualization.plot_modifications.SphereCallback
-   ~yt.visualization.plot_modifications.TextLabelCallback
-   ~yt.visualization.plot_modifications.TitleCallback
-   ~yt.visualization.plot_modifications.UnitBoundaryCallback
-   ~yt.visualization.plot_modifications.VelocityCallback
-
-Function List
--------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.convenience.load
-   ~yt.funcs.deprecate
-   ~yt.funcs.ensure_list
-   ~yt.funcs.get_pbar
-   ~yt.funcs.humanize_time
-   ~yt.funcs.insert_ipython
-   ~yt.funcs.iterable
-   ~yt.funcs.just_one
-   ~yt.funcs.only_on_root
-   ~yt.funcs.paste_traceback
-   ~yt.funcs.pdb_run
-   ~yt.funcs.print_tb
-   ~yt.funcs.rootonly
-   ~yt.funcs.time_execution
-   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
-
-Miscellaneous Types
--------------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.config.YTConfigParser
-   ~yt.utilities.parameter_file_storage.ParameterFileStore
-   ~yt.data_objects.data_containers.FakeGridForParticles
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
-
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
-
-
-Testing Infrastructure
-----------------------
-
-The first set of functions is provided entirely by NumPy.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_array_equal
-   ~yt.testing.assert_almost_equal
-   ~yt.testing.assert_approx_equal
-   ~yt.testing.assert_array_almost_equal
-   ~yt.testing.assert_equal
-   ~yt.testing.assert_array_less
-   ~yt.testing.assert_string_equal
-   ~yt.testing.assert_array_almost_equal_nulp
-   ~yt.testing.assert_allclose
-   ~yt.testing.assert_raises
-
-These are yt-provided functions:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_rel_equal
-   ~yt.testing.amrspace
-   ~yt.testing.fake_random_pf
-   ~yt.testing.expand_keywords

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -256,4 +256,4 @@
                        }
 
 if not on_rtd:
-    autosummary_generate = glob.glob("api/api.rst")
+    autosummary_generate = glob.glob("reference/api/api.rst")
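
For context, ``on_rtd`` in this hunk is the usual Read the Docs guard; a
sketch of the surrounding ``conf.py`` logic, assuming the standard
environment-variable check (the check itself is not shown in this diff):

    import os
    import glob

    # True when the docs are being built on readthedocs.org
    on_rtd = os.environ.get('READTHEDOCS', None) == 'True'

    if not on_rtd:
        # pre-generate autosummary stubs only for local builds, now
        # pointing at the relocated API reference
        autosummary_generate = glob.glob("reference/api/api.rst")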

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -1,7 +1,7 @@
 .. _cookbook:
 
-The yt Cookbook
-===============
+The Cookbook
+============
 
 yt provides a great deal of functionality to the user, but sometimes it can 
 be a bit complex.  This section of the documentation lays out example recipes 

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -3,38 +3,129 @@
 
 yt is a community-developed analysis and visualization toolkit for
 examining datasets in a variety of scientific disciplines.  yt is developed 
-in Python under the open-source model.  yt currently supports several 
-astrophysical simulation code formats, as well support for :ref:`loading-numpy-array`
-for unsupported data formats.  Fully-supported codes 
-include: `Enzo <http://enzo-project.org/>`_, 
-`Orion <http://flash.uchicago.edu/~rfisher/orion/>`_,
-`Nyx <https://ccse.lbl.gov/Research/NYX/index.html>`_, 
-`FLASH <http://flash.uchicago.edu/website/home/>`_, 
-`Piernik <http://arxiv.org/abs/0901.0104>`_;
-and partially-supported codes include: 
-`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_,
-`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_,
-`Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
-`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_.
-
-yt uses a three-pronged approach to interacting with data:
-
- * Visualize Data - Generate plots, images, and movies for better understanding your datasets
- * Analyze Data - Use additional analysis routines to derive real-world results from your data
- * Examine Data - Directly access raw data with helper functions for making this task easier
+in Python under the open-source model.  As of version 2.6, yt supports
+several astrophysical simulation code formats, as well as support for
+:ref:`loading-numpy-array` for unsupported data formats.  :ref:`code-support`
+is included for: `Enzo <http://enzo-project.org/>`_, `Orion
+<http://flash.uchicago.edu/~rfisher/orion/>`_, `Nyx
+<https://ccse.lbl.gov/Research/NYX/index.html>`_, `FLASH
+<http://flash.uchicago.edu/website/home/>`_, `Piernik
+<http://arxiv.org/abs/0901.0104>`_, `Athena
+<https://trac.princeton.edu/Athena/>`_, `Chombo <http://chombo.lbl.gov>`_,
+`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_, `Maestro
+<https://ccse.lbl.gov/Research/MAESTRO/>`_, and `Pluto
+<http://plutocode.ph.unito.it/>`_.  (Support for additional codes, including
+particle codes and octree codes, is under development in yt 3.0.)
 
 Documentation
 =============
 
+.. raw:: html
+
+   <table class="contentstable" align="center">
+
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="installing.html">Installation</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Getting and Installing yt</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="bootcamp/index.html">yt Bootcamp</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Demonstrations of what yt can do</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="cookbook/index.html">The Cookbook</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Example recipes for how to accomplish a variety of tasks</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="visualizing/index.html">Visualizing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Make plots, projections, volume renderings, movies, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="analyzing/index.html">Analyzing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Use analysis tools to extract results from your data</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="examining/index.html">Examining Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Load data and directly access raw values for low-level analysis</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="developing/index.html">Developing in yt</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Catering yt to work for your exact use case</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="reference/index.html">Reference Materials</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Lists of fields, quantities, classes, functions, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="help/index.html">Getting help</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">What to do if you run into problems</p>
+       </td>
+     </tr>
+
+   </table>
+
 .. toctree::
-   :maxdepth: 1
+   :hidden:
 
    installing
-   yt Bootcamp: A Worked Introduction <bootcamp/index>
-   help/index
+   yt Bootcamp <bootcamp/index>
    cookbook/index
    visualizing/index
    analyzing/index
    examining/index
    developing/index
    reference/index
+   help/index

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 source/reference/api/api.rst
--- /dev/null
+++ b/source/reference/api/api.rst
@@ -0,0 +1,563 @@
+API Reference
+=============
+
+Plots and the Plotting Interface
+--------------------------------
+
+SlicePlot and ProjectionPlot
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_window.SlicePlot
+   ~yt.visualization.plot_window.OffAxisSlicePlot
+   ~yt.visualization.plot_window.ProjectionPlot
+   ~yt.visualization.plot_window.OffAxisProjectionPlot
+
+PlotCollection
+^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_collection.PlotCollection
+   ~yt.visualization.plot_collection.PlotCollectionInteractive
+   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
+   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
+   ~yt.visualization.base_plot_types.get_multi_plot
+
+Data Sources
+------------
+
+.. _physical-object-api:
+
+Physical Objects
+^^^^^^^^^^^^^^^^
+
+These are the objects that act as physical selections of data, describing a
+region in space.  These are not typically addressed directly; see
+:ref:`available-objects` for more information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.data_containers.AMRCoveringGridBase
+   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
+   ~yt.data_objects.data_containers.AMRCylinderBase
+   ~yt.data_objects.data_containers.AMRGridCollectionBase
+   ~yt.data_objects.data_containers.AMRRayBase
+   ~yt.data_objects.data_containers.AMROrthoRayBase
+   ~yt.data_objects.data_containers.AMRStreamlineBase
+   ~yt.data_objects.data_containers.AMRProjBase
+   ~yt.data_objects.data_containers.AMRRegionBase
+   ~yt.data_objects.data_containers.AMRSliceBase
+   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
+   ~yt.data_objects.data_containers.AMRSphereBase
+   ~yt.data_objects.data_containers.AMRSurfaceBase
+
+Time Series Objects
+^^^^^^^^^^^^^^^^^^^
+
+These are objects that either contain and represent or operate on series of
+datasets.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.time_series.TimeSeriesData
+   ~yt.data_objects.time_series.TimeSeriesDataObject
+   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
+   ~yt.data_objects.time_series.AnalysisTaskProxy
+
+Frontends
+---------
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.grid_patch.AMRGridPatch
+   ~yt.data_objects.hierarchy.AMRHierarchy
+   ~yt.data_objects.static_output.StaticOutput
+
+Enzo
+^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.enzo.data_structures.EnzoGrid
+   ~yt.frontends.enzo.data_structures.EnzoHierarchy
+   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
+
+Orion
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.orion.data_structures.OrionGrid
+   ~yt.frontends.orion.data_structures.OrionHierarchy
+   ~yt.frontends.orion.data_structures.OrionStaticOutput
+
+FLASH
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.flash.data_structures.FLASHGrid
+   ~yt.frontends.flash.data_structures.FLASHHierarchy
+   ~yt.frontends.flash.data_structures.FLASHStaticOutput
+
+Chombo
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.chombo.data_structures.ChomboGrid
+   ~yt.frontends.chombo.data_structures.ChomboHierarchy
+   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
+
+RAMSES
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.ramses.data_structures.RAMSESGrid
+   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
+   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
+
+Derived Datatypes
+-----------------
+
+Profiles and Histograms
+^^^^^^^^^^^^^^^^^^^^^^^
+
+These types are used to sum data up and either return that sum or return an
+average.  Typically they are more easily used through the
+`yt.visualization.plot_collection` interface.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.profiles.BinnedProfile1D
+   ~yt.data_objects.profiles.BinnedProfile2D
+   ~yt.data_objects.profiles.BinnedProfile3D
+
+Halo Finding and Particle Functions
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Halo finding can be executed using these types.  Here we list the main halo
+finders as well as a few other supplemental objects.
+
+.. rubric:: Halo Finders
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
+   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
+
+You can also operate on the Halo and HaloList objects themselves:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.Halo
+   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
+
+There are also functions for loading halos from disk:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
+
+We have several methods that work to create merger trees:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
+
+You can use halo catalogs generated externally as well:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
+
+Halo Profiling
+^^^^^^^^^^^^^^
+
+yt provides a comprehensive halo profiler that can filter, center, and analyze
+halos en masse.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
+
+
+Two Point Functions
+^^^^^^^^^^^^^^^^^^^
+
+These functions are designed to create correlations or other results of
+operations acting on two spatially-distinct points in a data source.  See also
+:ref:`two_point_functions`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
+   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
+
+Field Types
+-----------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.field_info_container.DerivedField
+   ~yt.data_objects.field_info_container.FieldInfoContainer
+   ~yt.data_objects.field_info_container.ValidateDataField
+   ~yt.data_objects.field_info_container.ValidateGridType
+   ~yt.data_objects.field_info_container.ValidateParameter
+   ~yt.data_objects.field_info_container.ValidateProperty
+   ~yt.data_objects.field_info_container.ValidateSpatial
+
+Image Handling
+--------------
+
+For volume renderings and fixed resolution buffers the image object returned is
+an ``ImageArray`` object, which has useful functions for image saving and 
+writing to bitmaps.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.image_array.ImageArray
+   ~yt.data_objects.image_array.ImageArray.write_png
+   ~yt.data_objects.image_array.ImageArray.write_hdf5
+
+Extension Types
+---------------
+
+Coordinate Transformations
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
+   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
+
+Cosmology, Star Particle Analysis, and Simulated Observations
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+For the generation of stellar SEDs.  (See also :ref:`star_analysis`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
+
+Light cone generation and simulation analysis.  (See also
+:ref:`light-cone-generator`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
+   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
+
+Absorption and X-ray spectra and spectral lines:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
+
+Absorption spectra fitting:
+
+.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
+
+Sunrise exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
+
+RADMC-3D exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
+
+Radial Column Density
+^^^^^^^^^^^^^^^^^^^^^
+
+If you'd like to calculate the column density out to a given point, from a
+specified center, yt can provide that information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
+
+Volume Rendering
+^^^^^^^^^^^^^^^^
+
+See also :ref:`volume_rendering`.
+
+Here are the primary entry points:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.Camera
+   ~yt.visualization.volume_rendering.camera.off_axis_projection
+   ~yt.visualization.volume_rendering.camera.allsky_projection
+
+These objects set up the way the image looks:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
+
+There are also advanced objects for particular use cases:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
+   ~yt.visualization.volume_rendering.camera.FisheyeCamera
+   ~yt.visualization.volume_rendering.camera.MosaicCamera
+   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
+   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
+   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
+   ~yt.visualization.volume_rendering.camera.StereoPairCamera
+
+Streamlining
+^^^^^^^^^^^^
+
+See also :ref:`streamlines`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.streamlines.Streamlines
+
+Image Writing
+^^^^^^^^^^^^^
+
+These functions are all used for fast writing of images directly to disk,
+without calling matplotlib.  This can be very useful for high-cadence outputs
+where colorbars are unnecessary or for volume rendering.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.image_writer.multi_image_composite
+   ~yt.visualization.image_writer.write_bitmap
+   ~yt.visualization.image_writer.write_projection
+   ~yt.visualization.image_writer.write_fits
+   ~yt.visualization.image_writer.write_image
+   ~yt.visualization.image_writer.map_to_colors
+   ~yt.visualization.image_writer.strip_colormap_data
+   ~yt.visualization.image_writer.splat_points
+   ~yt.visualization.image_writer.annotate_image
+   ~yt.visualization.image_writer.scale_image
+
+We also provide a module that is very good for generating EPS figures,
+particularly with complicated layouts.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.eps_writer.DualEPS
+   ~yt.visualization.eps_writer.single_plot
+   ~yt.visualization.eps_writer.multiplot
+   ~yt.visualization.eps_writer.multiplot_yt
+   ~yt.visualization.eps_writer.return_cmap
+
+.. _image-panner-api:
+
+Derived Quantities
+------------------
+
+See :ref:`derived-quantities`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.derived_quantities._AngularMomentumVector
+   ~yt.data_objects.derived_quantities._BaryonSpinParameter
+   ~yt.data_objects.derived_quantities._BulkVelocity
+   ~yt.data_objects.derived_quantities._CenterOfMass
+   ~yt.data_objects.derived_quantities._Extrema
+   ~yt.data_objects.derived_quantities._IsBound
+   ~yt.data_objects.derived_quantities._MaxLocation
+   ~yt.data_objects.derived_quantities._ParticleSpinParameter
+   ~yt.data_objects.derived_quantities._TotalMass
+   ~yt.data_objects.derived_quantities._TotalQuantity
+   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
+
+.. _callback-api:
+
+Callback List
+-------------
+
+
+See also :ref:`callbacks`.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_modifications.ArrowCallback
+   ~yt.visualization.plot_modifications.ClumpContourCallback
+   ~yt.visualization.plot_modifications.ContourCallback
+   ~yt.visualization.plot_modifications.CoordAxesCallback
+   ~yt.visualization.plot_modifications.CuttingQuiverCallback
+   ~yt.visualization.plot_modifications.GridBoundaryCallback
+   ~yt.visualization.plot_modifications.HopCircleCallback
+   ~yt.visualization.plot_modifications.HopParticleCallback
+   ~yt.visualization.plot_modifications.LabelCallback
+   ~yt.visualization.plot_modifications.LinePlotCallback
+   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
+   ~yt.visualization.plot_modifications.ParticleCallback
+   ~yt.visualization.plot_modifications.PointAnnotateCallback
+   ~yt.visualization.plot_modifications.QuiverCallback
+   ~yt.visualization.plot_modifications.SphereCallback
+   ~yt.visualization.plot_modifications.TextLabelCallback
+   ~yt.visualization.plot_modifications.TitleCallback
+   ~yt.visualization.plot_modifications.UnitBoundaryCallback
+   ~yt.visualization.plot_modifications.VelocityCallback
+
+Function List
+-------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.convenience.load
+   ~yt.funcs.deprecate
+   ~yt.funcs.ensure_list
+   ~yt.funcs.get_pbar
+   ~yt.funcs.humanize_time
+   ~yt.funcs.insert_ipython
+   ~yt.funcs.iterable
+   ~yt.funcs.just_one
+   ~yt.funcs.only_on_root
+   ~yt.funcs.paste_traceback
+   ~yt.funcs.pdb_run
+   ~yt.funcs.print_tb
+   ~yt.funcs.rootonly
+   ~yt.funcs.time_execution
+   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
+
+Miscellaneous Types
+-------------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.config.YTConfigParser
+   ~yt.utilities.parameter_file_storage.ParameterFileStore
+   ~yt.data_objects.data_containers.FakeGridForParticles
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
+
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
+
+
+Testing Infrastructure
+----------------------
+
+The first set of functions is provided entirely by NumPy.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_array_equal
+   ~yt.testing.assert_almost_equal
+   ~yt.testing.assert_approx_equal
+   ~yt.testing.assert_array_almost_equal
+   ~yt.testing.assert_equal
+   ~yt.testing.assert_array_less
+   ~yt.testing.assert_string_equal
+   ~yt.testing.assert_array_almost_equal_nulp
+   ~yt.testing.assert_allclose
+   ~yt.testing.assert_raises
+
+These are yt-provided functions:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_rel_equal
+   ~yt.testing.amrspace
+   ~yt.testing.fake_random_pf
+   ~yt.testing.expand_keywords

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 source/reference/changelog.rst
--- a/source/reference/changelog.rst
+++ b/source/reference/changelog.rst
@@ -15,6 +15,8 @@
  * David Collins
  * Brian Crosby
  * Andrew Cunningham
+ * Hilary Egan
+ * John Forbes
  * Nathan Goldbaum
  * Markus Haider
  * Cameron Hummels
@@ -24,17 +26,22 @@
  * Kacper Kowalik
  * Michael Kuhlen
  * Eve Lee
+ * Sam Leitner
  * Yuan Li
  * Chris Malone
  * Josh Moloney
  * Chris Moody
  * Andrew Myers
+ * Jill Naiman
+ * Kaylea Nelson
  * Jeff Oishi
  * Jean-Claude Passy
  * Mark Richardson
 * Thomas Robitaille
  * Anna Rosen
+ * Douglas Rudd
  * Anthony Scopatz
+ * Noel Scudder
  * Devin Silvia
  * Sam Skillman
  * Stephen Skory
@@ -45,9 +52,98 @@
  * Stephanie Tonnesen
  * Matthew Turk
  * Rick Wagner
+ * Andrew Wetzel
  * John Wise
  * John ZuHone
 
+Version 2.6
+-----------
+
+This is a scheduled release, bringing to a close the development in the 2.5
+series.  Below are the itemized, aggregate changes since version 2.5.
+
+Major changes:
+
+  * yt is now licensed under the 3-clause BSD license.
+  * HEALpix has been removed for the time being, as a result of licensing
+    incompatibility.
+  * The addition of a frontend for the Pluto code
+  * The addition of an OBJ exporter to enable transparent and multi-surface
+    exports of surfaces to Blender and Sketchfab
+  * New absorption spectrum analysis module with documentation
+  * Added the ability to draw lines with grey opacity in volume rendering
+  * Updated physical constants to reflect 2010 CODATA data
+  * Dependency updates (including IPython 1.0)
+  * Better notebook support for yt plots
+  * Considerably (10x+) faster kD-tree building for volume rendering
+  * yt can now export to RADMC3D
+  * Athena frontend now supports Static Mesh Refinement and units (
+    http://hub.yt-project.org/nb/7l1zua )
+  * Fixed a long-standing bug when plotting arrays with a range of zero
+  * Added an option for interpolation based on non-uniform bins in the
+    interpolator code
+  * Upgrades to most of the dependencies in the install script
+  * ProjectionPlot now accepts a data_source keyword argument
+
+Minor or bugfix changes:
+
+  * Fix for volume rendering on the command line
+  * map_to_colormap will no longer return out-of-bounds errors
+  * Fixes for dds in covering grid calculations
+  * Library searching for build process is now more reliable
+  * Unit fix for "VorticityGrowthTimescale" field
+  * Pyflakes stylistic fixes
+  * Number density added to FLASH
+  * Many fixes for Athena frontend
+  * Radius and ParticleRadius now work for reduced-dimensionality datasets
+  * Source distributions now work again!
+  * Athena data now 64 bits everywhere
+  * Grid displays on plots are now shaded to reflect the level of refinement
+  * show_colormaps() is a new function for displaying all known colormaps
+  * PhasePlotter now adds a colormap by default
+  * System build fix for POSIX systems
+  * Fixing domain offsets for halo centers-of-mass
+  * Removing some Enzo-specific terminology in the Halo Mass Function
+  * Addition of coordinate vectors on volume render
+  * Pickling fix for extracted regions
+  * Addition of some tracer particle annotation functions
+  * Better error message for "yt" command
+  * Fix for radial vs poloidal fields
+  * Piernik 2D data handling fix
+  * Fixes for FLASH current redshift
+  * PlotWindows now have a set_font function and a new default font setting
+  * Colorbars less likely to extend off the edge of a PlotWindow
+  * Clumps overplotted on PlotWindows are now correctly contoured
+  * Many fixes to light ray and profiles for integrated cosmological analysis
+  * Improvements to OpenMP compilation
+  * Typo in value for km_per_pc (not used elsewhere in the code base) has been
+    fixed
+  * Enable parallel IPython notebook sessions (
+    http://hub.yt-project.org/nb/qgn19h )
+  * Change (~1e-6) to particle_density deposition, enabling it to be used by
+    FLASH and other frontends
+  * Addition of is_root function for convenience in parallel analysis sessions
+  * Additions to Orion particle reader
+  * Fixing TotalMass for case when particles not present
+  * Fixing the density threshold for HOP and pHOP to match the merger tree
+  * Reason can now plot with the latest plot window
+  * Issues with VelocityMagnitude and aliases with velo have been corrected in
+    the FLASH frontend
+  * Halo radii are calculated correctly for domains that do not start at 0,0,0.
+  * Halo mass function now works for non-Enzo frontends.
+  * Bug fixes for directory creation, typos in docstrings
+  * Speed improvements to ellipsoidal particle detection
+  * Updates to FLASH fields
+  * CASTRO frontend bug fixes
+  * Fisheye camera bug fixes
+  * Answer testing now includes plot window answer testing
+  * Athena data serialization
+  * load_uniform_grid can now decompose dims >= 1024.  (#537)
+  * Axis unit setting works correctly for unit names  (#534)
+  * ThermalEnergy is now calculated correctly for Enzo MHD simulations (#535)
+  * Radius fields had an asymmetry in periodicity calculation (#531)
+  * Boolean regions can now be pickled (#517)
+
 Version 2.5
 -----------
 

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r e288cc0ef750b36b9bef823fe3777474c8aae641 source/reference/code_support.rst
--- /dev/null
+++ b/source/reference/code_support.rst
@@ -0,0 +1,53 @@
+
+.. _code-support:
+
+Code Support
+============
+
+Levels of Support for Various Codes
+-----------------------------------
+
+yt provides frontends to support several different simulation code formats 
+as inputs.  Below is a list showing what level of support is provided for
+each code.
+
+|
+
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Capability           | Enzo | Orion | FLASH | Nyx  | Piernik | Athena | Castro | Maestro | Pluto | Chombo |
++======================+======+=======+=======+======+=========+========+========+=========+=======+========+
+| Fluid Quantities     |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Particles            |   Y  |   Y   |   Y   |  Y   |   N/A   |   N    |   Y    |   N     |   N   |    N   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Parameters           |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Units                |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Read on Demand       |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Load Raw Data        |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Part of test suite   |   Y  |   Y   |   Y   |  Y   |    N    |   N    |   Y    |   N     |   N   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Level of Support     | Full | Full  | Full  | Full |  Full   |  Full  |  Part  |  Part   | Part  |  Part  |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+
+|
+
+If you have a dataset from a code not yet supported, you can either 
+input your data using the :ref:`loading-numpy-array` format, or help us by 
+:ref:`creating_frontend` for this new format.
+
+Future Codes to Support
+-----------------------
+
+A major overhaul of the code was required in order to cleanly support 
+additional codes.  Development in the yt 3.x branch has begun and provides 
+support for codes like: 
+`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_, 
+`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_, and 
+`Gadget <http://www.mpa-garching.mpg.de/gadget/>`_.  Please switch to that 
+version of yt for the most up-to-date support for those codes.
+
+Additionally, in yt 3.0 the Boxlib formats have been unified and streamlined.

This diff is so big that we needed to truncate the remainder.
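
The :ref:`loading-numpy-array` path mentioned in the code_support.rst hunk
above is demonstrated at length in the notebook introduced by the next
changeset.  In brief, a sketch using the 2.x ``yt.mods`` namespace (the
field name, bounding box, and unit factor are arbitrary here):

    import numpy as np
    from yt.mods import load_uniform_grid

    # Wrap an in-memory array as a yt dataset: a 64^3 random "Density"
    # field in a 3 Mpc box (1 Mpc = 3.0856e24 cm).
    data = dict(Density=np.random.random((64, 64, 64)))
    bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
    pf = load_uniform_grid(data, data["Density"].shape, 3.0856e24, bbox=bbox)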

https://bitbucket.org/yt_analysis/yt-doc/commits/42d8995683f6/
Changeset:   42d8995683f6
User:        jzuhone
Date:        2013-10-30 06:46:22
Summary:     Splitting off loading generic/in-memory array data into a new section and notebook-ifying it.
Affected #:  5 files

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r 42d8995683f6efcea72131f6f03d6279b538be24 source/examining/Loading_Generic_Array_Data.ipynb
--- /dev/null
+++ b/source/examining/Loading_Generic_Array_Data.ipynb
@@ -0,0 +1,485 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Even if your data is not strictly related to fields commonly used in\n",
+      "astrophysical codes or your code is not supported yet, you can still feed it to\n",
+      "`yt` to use its advanced visualization and analysis facilities. The only\n",
+      "requirement is that your data can be represented as three-dimensional NumPy arrays with a consistent grid structure. What follows are some common examples of loading in generic array data that you may find useful. "
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Generic Unigrid Data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The simplest case is that of a single grid of data spanning the domain, with one or more fields. The data could be generated from a variety of sources; we'll just give three common examples:"
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "Data generated \"on-the-fly\""
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The most common example is that of data that is generated in memory from the currently running script or notebook. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "from yt.utilities.physical_constants import cm_per_kpc, cm_per_mpc"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this example, we'll just create a 3-D array of random floating-point data using NumPy:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "arr = np.random.random(size=(64,64,64))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To load this data into `yt`, we need to assign it a field name, in this case \"Density\", and place it into a dictionary. Then, we call `load_uniform_grid`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "data = dict(Density = arr)\n",
+      "bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])\n",
+      "pf = load_uniform_grid(data, arr.shape, cm_per_mpc, bbox=bbox, nprocs=64)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`load_uniform_grid` takes the following arguments and optional keywords:\n",
+      "\n",
+      "* `data` : This is a dict of numpy arrays, where the keys are the field names.\n",
+      "* `domain_dimensions` : The domain dimensions of the unigrid\n",
+      "* `sim_unit_to_cm` : Conversion factor from simulation units to centimeters\n",
+      "* `bbox` : Size of computational domain in units sim_unit_to_cm\n",
+      "* `nprocs` : If greater than 1, will create this number of subarrays out of data\n",
+      "* `sim_time` : The simulation time in seconds\n",
+      "* `periodicity` : A tuple of booleans that determines whether the data will be treated as periodic along each axis\n",
+      "\n",
+      "This example creates a `yt`-native parameter file `pf` that will treat your array as a\n",
+      "density field in cubic domain of 3 Mpc edge size (3 * 3.0856e24 cm) and\n",
+      "simultaneously divide the domain into `nprocs` = 64 chunks, so that you can take advantage\n",
+      "of the underlying parallelism. "
+     ]
+    },
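+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "As a quick sanity check (a minimal sketch, relying on the hierarchy's standard `num_grids` attribute), we can confirm that the domain really was decomposed into 64 chunks:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.num_grids # should print 64, one grid per chunk"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },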
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The resulting `pf` functions exactly like a parameter file from any other dataset--it can be sliced, and we can show the grid boundaries:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "slc = SlicePlot(pf, 2, [\"Density\"])\n",
+      "slc.set_cmap(\"Density\", \"Blues\")\n",
+      "slc.annotate_grids(cmap=None)\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Particle fields are detected as one-dimensional fields. The number of\n",
+      "particles is set by the `number_of_particles` key in\n",
+      "`data`. Particle fields are then added as one-dimensional arrays in\n",
+      "a similar manner as the three-dimensional grid fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "posx_arr = np.random.uniform(low=-1.5, high=1.5, size=10000)\n",
+      "posy_arr = np.random.uniform(low=-1.5, high=1.5, size=10000)\n",
+      "posz_arr = np.random.uniform(low=-1.5, high=1.5, size=10000)\n",
+      "data = dict(Density = np.random.random(size=(64,64,64)), \n",
+      "            number_of_particles = 10000,\n",
+      "            particle_position_x = posx_arr, \n",
+      "\t        particle_position_y = posy_arr,\n",
+      "\t        particle_position_z = posz_arr)\n",
+      "bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])\n",
+      "pf = load_uniform_grid(data, data[\"Density\"].shape, cm_per_mpc, bbox=bbox, nprocs=4)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this example only the particle position fields have been assigned. `number_of_particles` must be the same size as the particle\n",
+      "arrays. If no particle arrays are supplied then `number_of_particles` is assumed to be zero. Take a slice, and overlay particle positions:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "slc = SlicePlot(pf, \"z\", [\"Density\"])\n",
+      "slc.set_cmap(\"Density\", \"Blues\")\n",
+      "slc.annotate_particles(0.25, p_size=12.0, col=\"Red\")\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
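+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Before moving on, we can verify that all 10000 particles were distributed among the subgrids (a sketch, assuming the standard `NumberOfParticles` attribute on yt's grid objects):"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print sum(g.NumberOfParticles for g in pf.h.grids) # should print 10000"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },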
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "HDF5 data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "HDF5 is a convenient format to store data. If you have unigrid data stored in an HDF5 file, it is possible to load it into memory and then use `load_uniform_grid` to get it into `yt`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import h5py\n",
+      "f = h5py.File(os.environ[\"YT_DATA_DIR\"]+\"/turb_vels.h5\", \"r\") # Read-only access to the file"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The HDF5 file handle's keys correspond to the datasets stored in the file:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print f.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can iterate over the items in the file handle to get the data into a dictionary, which we will then load:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "data = {k:v for k,v in f.items()}\n",
+      "bbox = np.array([[-0.5, 0.5], [-0.5, 0.5], [-0.5, 0.5]])\n",
+      "pf = load_uniform_grid(data, data[\"Density\"].shape, 250.*cm_per_kpc, bbox=bbox, nprocs=8, periodicity=(False,False,False))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
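+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can check which fields were picked up from the file, using the standard `field_list` attribute:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print pf.h.field_list # the fields found in the HDF5 file"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },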
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this case, the data came from a simulation which was 250 kpc on a side. An example projection of two fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prj = ProjectionPlot(pf, \"z\", [\"z-velocity\",\"Temperature\"], weight_field=\"Density\")\n",
+      "prj.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "FITS image data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The FITS file format is a common astronomical format for 2-D images, but it can store three-dimensional data as well. The [AstroPy](http://www.astropy.org) project has modules for FITS reading and writing, which were incorporated from the [PyFITS](http://www.stsci.edu/institute/software_hardware/pyfits) library."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import astropy.io.fits as pyfits\n",
+      "# Or, just import pyfits if that's what you have installed"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Using `pyfits` we can open a FITS file. If we call `info()` on the file handle, we can figure out some information about the file's contents. The file in this example has a primary HDU (header-data-unit) with no data, and three HDUs with 3-D data. In this case, the data consists of three velocity fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "f = pyfits.open(os.environ[\"YT_DATA_DIR\"]+\"/velocity_field_20.fits.gz\")\n",
+      "f.info()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can put it into a dictionary in the same way as before, but we slice the file handle `f` so that we don't use the `PrimaryHDU`. `hdu.name` is the field name and `hdu.data` is the actual data. We can check that we got the correct fields. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "data = {hdu.name.lower():hdu.data for hdu in f[1:]}\n",
+      "print data.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we load the data into `yt`. This particular file doesn't have any coordinate information, but let's assume that the box size is a Mpc. Since these are velocity fields, we can overlay velocity vectors on slices, just as if we had loaded in data from a supported code. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load_uniform_grid(data, data[\"x-velocity\"].shape, cm_per_mpc)\n",
+      "slc = SlicePlot(pf, \"x\", [\"x-velocity\",\"y-velocity\",\"z-velocity\"])\n",
+      "slc.annotate_velocity()\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Generic AMR Data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In a similar fashion to unigrid data, data gridded into rectangular patches at varying levels of resolution may also be loaded into `yt`. In this case, a list of grid dictionaries should be provided, with the requisite information about each grid's properties. This example sets up two grids: a top-level grid (`level == 0`) covering the entire domain and a subgrid at `level == 1`. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "grid_data = [\n",
+      "    dict(left_edge = [0.0, 0.0, 0.0],\n",
+      "         right_edge = [1.0, 1.0, 1.],\n",
+      "         level = 0,\n",
+      "         dimensions = [32, 32, 32]), \n",
+      "    dict(left_edge = [0.25, 0.25, 0.25],\n",
+      "         right_edge = [0.75, 0.75, 0.75],\n",
+      "         level = 1,\n",
+      "         dimensions = [32, 32, 32])\n",
+      "   ]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We'll just fill each grid with random density data, with a scaling with the grid refinement level."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for g in grid_data: g[\"Density\"] = np.random.random(g[\"dimensions\"]) * 2**g[\"level\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Particle fields are supported by adding 1-dimensional arrays to each `grid` and\n",
+      "setting the `number_of_particles` key in each `grid`'s dict. If a grid has no particles, set `number_of_particles = 0`, but the particle fields still have to be defined since they are defined elsewhere; set them to empty NumPy arrays:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "grid_data[0][\"number_of_particles\"] = 0 # Set no particles in the top-level grid\n",
+      "grid_data[0][\"particle_position_x\"] = np.array([]) # No particles, so set empty arrays\n",
+      "grid_data[0][\"particle_position_y\"] = np.array([]) \n",
+      "grid_data[0][\"particle_position_z\"] = np.array([])\n",
+      "grid_data[1][\"number_of_particles\"] = 1000\n",
+      "grid_data[1][\"particle_position_x\"] = np.random.uniform(low=0.25, high=0.75, size=1000)\n",
+      "grid_data[1][\"particle_position_y\"] = np.random.uniform(low=0.25, high=0.75, size=1000)\n",
+      "grid_data[1][\"particle_position_z\"] = np.random.uniform(low=0.25, high=0.75, size=1000)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Then, call `load_amr_grids`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load_amr_grids(grid_data, [32, 32, 32], 1.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`load_amr_grids` also takes the same keywords `bbox` and `sim_time` as `load_uniform_grid`. Let's take a slice:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "slc = SlicePlot(pf, \"z\", [\"Density\"])\n",
+      "slc.annotate_particles(0.25, p_size=15.0, col=\"Pink\")\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Caveats for Loading Generic Data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "* Units will be incorrect unless the data has already been converted to cgs.\n",
+      "* Particles may be difficult to integrate.\n",
+      "* Data must already reside in memory before loading it in to `yt`, whether it is generated at runtime or loaded from disk. \n",
+      "* Some functions may behave oddly, and parallelism will be disappointing or non-existent in most cases.\n",
+      "* No consistency checks are performed on the hierarchy\n",
+      "* Consistency between particle positions and grids is not checked; `load_amr_grids` assumes that particle positions associated with one grid are not bounded within another grid at a higher level, so this must be ensured by the user prior to loading the grid data. "
+     ]
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r 42d8995683f6efcea72131f6f03d6279b538be24 source/examining/generic_array_data.rst
--- /dev/null
+++ b/source/examining/generic_array_data.rst
@@ -0,0 +1,5 @@
+
+Loading Generic Array Data
+==========================
+
+.. notebook:: Loading_Generic_Array_Data.ipynb

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r 42d8995683f6efcea72131f6f03d6279b538be24 source/examining/index.rst
--- a/source/examining/index.rst
+++ b/source/examining/index.rst
@@ -1,10 +1,11 @@
 Examining Data
 ==============
 
-How to examine a dataset on disk.
+How to examine datasets.
 
 .. toctree::
    :maxdepth: 2
 
-   loading_data
+   supported_frontends_data
+   generic_array_data
    low_level_inspection

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r 42d8995683f6efcea72131f6f03d6279b538be24 source/examining/loading_data.rst
--- a/source/examining/loading_data.rst
+++ /dev/null
@@ -1,336 +0,0 @@
-.. _loading-data:
-
-Loading Data
-============
-
-This section contains information on how to load data into ``yt``, as well as
-some important caveats about different data formats.
-
-.. _loading-numpy-array:
-
-Generic Array Data
-------------------
-
-Even if your data is not strictly related to fields commonly used in
-astrophysical codes or your code is not supported yet, you can still feed it to
-``yt`` to use its advanced visualization and analysis facilities. The only
-requirement is that your data can be represented as one or more uniform, three
-dimensional numpy arrays. Assuming that you have your data in ``arr``,
-the following code:
-
-.. code-block:: python
-
-   from yt.frontends.stream.api import load_uniform_grid
-
-   data = dict(Density = arr)
-   bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [1.5, 1.5]])
-   pf = load_uniform_grid(data, arr.shape, 3.08e24, bbox=bbox, nprocs=12)
-
-will create ``yt``-native parameter file ``pf`` that will treat your array as
-density field in cubic domain of 3 Mpc edge size (3 * 3.08e24 cm) and
-simultaneously divide the domain into 12 chunks, so that you can take advantage
-of the underlying parallelism. 
-
-Particle fields are detected as one-dimensional fields. The number of
-particles is set by the ``number_of_particles`` key in
-``data``. Particle fields are then added as one-dimensional arrays in
-a similar manner as the three-dimensional grid fields:
-
-.. code-block:: python
-
-   from yt.frontends.stream.api import load_uniform_grid
-
-   data = dict(Density = dens, 
-               number_of_particles = 1000000,
-               particle_position_x = posx_arr, 
-	       particle_position_y = posy_arr,
-	       particle_position_z = posz_arr)
-   bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [1.5, 1.5]])
-   pf = load_uniform_grid(data, arr.shape, 3.08e24, bbox=bbox, nprocs=12)
-
-where in this exampe the particle position fields have been assigned. ``number_of_particles`` must be the same size as the particle
-arrays. If no particle arrays are supplied then ``number_of_particles`` is assumed to be zero. 
-
-.. rubric:: Caveats
-
-* Units will be incorrect unless the data has already been converted to cgs.
-* Particles may be difficult to integrate.
-* Data must already reside in memory.
-
-.. _loading-enzo-data:
-
-Enzo Data
----------
-
-Enzo data is fully supported and cared for by Matthew Turk.  To load an Enzo
-dataset, you can use the ``load`` command provided by ``yt.mods`` and supply to
-it the parameter file name.  This would be the name of the output file, and it
-contains no extension.  For instance, if you have the following files:
-
-.. code-block:: none
-
-   DD0010/
-   DD0010/data0010
-   DD0010/data0010.hierarchy
-   DD0010/data0010.cpu0000
-   DD0010/data0010.cpu0001
-   DD0010/data0010.cpu0002
-   DD0010/data0010.cpu0003
-
-You would feed the ``load`` command the filename ``DD0010/data0010`` as
-mentioned.
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("DD0010/data0010")
-
-.. rubric:: Caveats
-
-* There are no major caveats for Enzo usage
-* Units should be correct, if you utilize standard unit-setting routines.  yt
-  will notify you if it cannot determine the units, although this
-  notification will be passive.
-* 2D and 1D data are supported, but the extraneous dimensions are set to be
-  of length 1.0
-
-.. _loading-orion-data:
-
-Orion Data
-----------
-
-Orion data is fully supported and cared for by Jeff Oishi.  This method should
-also work for CASTRO and MAESTRO data, which are cared for by Matthew Turk and
-Chris Malone, respectively.  To load an Orion dataset, you can use the ``load``
-command provided by ``yt.mods`` and supply to it the directory file name.
-**You must also have the ``inputs`` file in the base directory.**  For
-instance, if you were in a directory with the following files:
-
-.. code-block:: none
-
-   inputs
-   pltgmlcs5600/
-   pltgmlcs5600/Header
-   pltgmlcs5600/Level_0
-   pltgmlcs5600/Level_0/Cell_H
-   pltgmlcs5600/Level_1
-   pltgmlcs5600/Level_1/Cell_H
-   pltgmlcs5600/Level_2
-   pltgmlcs5600/Level_2/Cell_H
-   pltgmlcs5600/Level_3
-   pltgmlcs5600/Level_3/Cell_H
-   pltgmlcs5600/Level_4
-   pltgmlcs5600/Level_4/Cell_H
-
-You would feed it the filename ``pltgmlcs5600``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("pltgmlcs5600")
-
-.. rubric:: Caveats
-
-* There are no major caveats for Orion usage
-* Star particles are not supported at the current time
-
-.. _loading-flash-data:
-
-FLASH Data
-----------
-
-FLASH HDF5 data is *mostly* supported and cared for by John ZuHone.  To load a
-FLASH dataset, you can use the ``load`` command provided by ``yt.mods`` and
-supply to it the file name of a plot file or checkpoint file, but particle
-files are not currently directly loadable by themselves, due to the
-fact that they typically lack grid information. For instance, if you were in a directory with
-the following files:
-
-.. code-block:: none
-
-   cosmoSim_coolhdf5_chk_0026
-
-You would feed it the filename ``cosmoSim_coolhdf5_chk_0026``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("cosmoSim_coolhdf5_chk_0026")
-
-If you have a FLASH particle file that was created at the same time as
-a plotfile or checkpoint file (therefore having particle data
-consistent with the grid structure of the latter), its data may be loaded with the
-``particle_filename`` optional argument:
-
-.. code-block:: python
-
-    from yt.mods import *
-    pf = load("radio_halo_1kpc_hdf5_plt_cnt_0100", particle_filename="radio_halo_1kpc_hdf5_part_0100")
-
-.. rubric:: Caveats
-
-* Please be careful that the units are correctly utilized; yt assumes cgs
-* Velocities and length units will be scaled to comoving coordinates if yt is
-  able to discern you are examining a cosmology simulation; particle and grid
-  positions will not be.
-* Domains may be visualized assuming periodicity.
-
-.. _loading-ramses-data:
-
-RAMSES Data
------------
-
-RAMSES data enjoys preliminary support and is cared for by Matthew Turk.  If
-you are interested in taking a development or stewardship role, please contact
-him.  To load a RAMSES dataset, you can use the ``load`` command provided by
-``yt.mods`` and supply to it the ``info*.txt`` filename.  For instance, if you
-were in a directory with the following files:
-
-.. code-block:: none
-
-   output_00007
-   output_00007/amr_00007.out00001
-   output_00007/grav_00007.out00001
-   output_00007/hydro_00007.out00001
-   output_00007/info_00007.txt
-   output_00007/part_00007.out00001
-
-You would feed it the filename ``output_00007/info_00007.txt``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("output_00007/info_00007.txt")
-
-.. rubric:: Caveats
-
-* Please be careful that the units are correctly set!  This may not be the
-  case for RAMSES data
-* Upon instantiation of the hierarchy, yt will attempt to regrid the entire
-  domain to ensure minimum-coverage from a set of grid patches.  (This is
-  described in the yt method paper.)  This is a time-consuming process and it
-  has not yet been written to be stored between calls.
-* Particles are not supported
-* Parallelism will not be terribly efficient for large datasets
-* There may be occasional segfaults on multi-domain data, which do not
-  reflect errors in the calculation
-
-If you are interested in helping with RAMSES support, we are eager to hear from
-you!
-
-.. _loading-art-data:
-
-ART Data
---------
-
-ART data enjoys preliminary support and is supported by Christopher Moody.
-Please contact the ``yt-dev`` mailing list if you are interested in using yt
-for ART data, or if you are interested in assisting with development of yt to
-work with ART data.
-
-At the moment, the ART octree is 'regridded' at each level to make the native
-octree look more like a mesh-based code. As a result, the initial outlay
-is about ~60 seconds to grid octs onto a mesh. This will be improved in 
-``yt-3.0``, where octs will be supported natively. 
-
-To load an ART dataset you can use the ``load`` command provided by 
-``yt.mods`` and passing the gas mesh file. It will search for and attempt 
-to find the complementary dark matter and stellar particle header and data 
-files. However, your simulations may not follow the same naming convention.
-
-So for example, a single snapshot might have a series of files looking like
-this:
-
-.. code-block:: none
-
-   10MpcBox_csf512_a0.300.d    #Gas mesh
-   PMcrda0.300.DAT             #Particle header
-   PMcrs0a0.300.DAT            #Particle data (positions,velocities)
-   stars_a0.300.dat            #Stellar data (metallicities, ages, etc.)
-
-The ART frontend tries to find the associated files matching the above, but
-if that fails you can specify ``file_particle_data``,``file_particle_data``,
-``file_star_data`` in addition to the specifying the gas mesh. You also have 
-the option of gridding particles, and assigning them onto the meshes.
-This process is in beta, and for the time being it's probably  best to leave
-``do_grid_particles=False`` as the default.
-
-To speed up the loading of an ART file, you have a few options. You can turn 
-off the particles entirely by setting ``discover_particles=False``. You can
-also only grid octs up to a certain level, ``limit_level=5``, which is useful
-when debugging by artificially creating a 'smaller' dataset to work with.
-
-Finally, when stellar ages are computed we 'spread' the ages evenly within a
-smoothing window. By default this is turned on and set to 10Myr. To turn this 
-off you can set ``spread=False``, and you can tweak the age smoothing window
-by specifying the window in seconds, ``spread=1.0e7*265*24*3600``. 
-
-.. code-block:: python
-    
-   from yt.mods import *
-
-   file = "/u/cmoody3/data/art_snapshots/SFG1/10MpcBox_csf512_a0.460.d"
-   pf = load(file,discover_particles=True,grid_particles=2,limit_level=3)
-   pf.h.print_stats()
-   dd=pf.h.all_data()
-   print np.sum(dd['particle_type']==0)
-
-In the above example code, the first line imports the standard yt functions,
-followed by defining the gas mesh file. It's loaded only through level 3, but
-grids particles on to meshes on level 2 and higher. Finally, we create a data
-container and ask it to gather the particle_type array. In this case ``type==0``
-is for the most highly-refined dark matter particle, and we print out how many
-high-resolution star particles we find in the simulation.  Typically, however,
-you shouldn't have to specify any keyword arguments to load in a dataset.
-
-.. loading-amr-data:
-
-Generic AMR Data
-----------------
-
-It is possible to create native ``yt`` parameter file from Python's dictionary
-that describes set of rectangular patches of data of possibly varying
-resolution. 
-
-.. code-block:: python
-
-   from yt.frontends.stream.api import load_amr_grids
-
-   grid_data = [
-       dict(left_edge = [0.0, 0.0, 0.0],
-            right_edge = [1.0, 1.0, 1.],
-            level = 0,
-            dimensions = [32, 32, 32],
-            number_of_particles = 0)
-       dict(left_edge = [0.25, 0.25, 0.25],
-            right_edge = [0.75, 0.75, 0.75],
-            level = 1,
-            dimensions = [32, 32, 32],
-            number_of_particles = 0)
-   ]
-  
-   for g in grid_data:
-       g["Density"] = np.random.random(g["dimensions"]) * 2**g["level"]
-  
-   pf = load_amr_grids(grid_data, [32, 32, 32], 1.0)
-
-Particle fields are supported by adding 1-dimensional arrays and
-setting the ``number_of_particles`` key to each ``grid``'s dict:
-
-.. code-block:: python
-
-    for g in grid_data:
-        g["number_of_particles"] = 100000
-        g["particle_position_x"] = np.random.random((g["number_of_particles"]))
-
-.. rubric:: Caveats
-
-* Units will be incorrect unless the data has already been converted to cgs.
-* Some functions may behave oddly, and parallelism will be disappointing or
-  non-existent in most cases.
-* No consistency checks are performed on the hierarchy
-* Data must already reside in memory.
-* Consistency between particle positions and grids is not checked;
-  ``load_amr_grids`` assumes that particle positions associated with one grid are
-  not bounded within another grid at a higher level, so this must be
-  ensured by the user prior to loading the grid data. 

diff -r 1015a8a45a11e9a483755a8a943758fc174e04b4 -r 42d8995683f6efcea72131f6f03d6279b538be24 source/examining/supported_frontends_data.rst
--- /dev/null
+++ b/source/examining/supported_frontends_data.rst
@@ -0,0 +1,296 @@
+.. _loading-data-from-supported-codes:
+
+Loading Data from Supported Codes
+=================================
+
+This section contains information on how to load data into ``yt`` from
+supported codes, as well as some important caveats about different data formats.
+
+.. _loading-enzo-data:
+
+Enzo Data
+---------
+
+Enzo data is fully supported and cared for by Matthew Turk.  To load an Enzo
+dataset, you can use the ``load`` command provided by ``yt.mods`` and supply to
+it the parameter file name.  This would be the name of the output file, and it
+contains no extension.  For instance, if you have the following files:
+
+.. code-block:: none
+
+   DD0010/
+   DD0010/data0010
+   DD0010/data0010.hierarchy
+   DD0010/data0010.cpu0000
+   DD0010/data0010.cpu0001
+   DD0010/data0010.cpu0002
+   DD0010/data0010.cpu0003
+
+You would feed the ``load`` command the filename ``DD0010/data0010`` as
+mentioned.
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("DD0010/data0010")
+
+.. rubric:: Caveats
+
+* There are no major caveats for Enzo usage
+* Units should be correct if you utilize standard unit-setting routines.  yt
+  will notify you if it cannot determine the units, although this
+  notification will be passive.
+* 2D and 1D data are supported, but the extraneous dimensions are set to be
+  of length 1.0
+
+.. _loading-orion-data:
+
+Orion Data
+----------
+
+Orion data is fully supported and cared for by Jeff Oishi.  This method should
+also work for CASTRO and MAESTRO data, which are cared for by Matthew Turk and
+Chris Malone, respectively.  To load an Orion dataset, you can use the ``load``
+command provided by ``yt.mods`` and supply to it the directory name.
+**You must also have the ``inputs`` file in the base directory.**  For
+instance, if you were in a directory with the following files:
+
+.. code-block:: none
+
+   inputs
+   pltgmlcs5600/
+   pltgmlcs5600/Header
+   pltgmlcs5600/Level_0
+   pltgmlcs5600/Level_0/Cell_H
+   pltgmlcs5600/Level_1
+   pltgmlcs5600/Level_1/Cell_H
+   pltgmlcs5600/Level_2
+   pltgmlcs5600/Level_2/Cell_H
+   pltgmlcs5600/Level_3
+   pltgmlcs5600/Level_3/Cell_H
+   pltgmlcs5600/Level_4
+   pltgmlcs5600/Level_4/Cell_H
+
+You would feed it the filename ``pltgmlcs5600``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("pltgmlcs5600")
+
+.. rubric:: Caveats
+
+* There are no major caveats for Orion usage
+* Star particles are not supported at the current time
+
+.. _loading-flash-data:
+
+FLASH Data
+----------
+
+FLASH HDF5 data is *mostly* supported and cared for by John ZuHone.  To load a
+FLASH dataset, you can use the ``load`` command provided by ``yt.mods`` and
+supply to it the file name of a plot file or checkpoint file.  Particle
+files are not currently directly loadable by themselves, because they
+typically lack grid information.  For instance, if you were in a directory with
+the following files:
+
+.. code-block:: none
+
+   cosmoSim_coolhdf5_chk_0026
+
+You would feed it the filename ``cosmoSim_coolhdf5_chk_0026``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("cosmoSim_coolhdf5_chk_0026")
+
+If you have a FLASH particle file that was created at the same time as
+a plotfile or checkpoint file (therefore having particle data
+consistent with the grid structure of the latter), its data may be loaded with the
+``particle_filename`` optional argument:
+
+.. code-block:: python
+
+    from yt.mods import *
+    pf = load("radio_halo_1kpc_hdf5_plt_cnt_0100", particle_filename="radio_halo_1kpc_hdf5_part_0100")
+
+.. rubric:: Caveats
+
+* Please be careful that the units are correctly utilized; yt assumes cgs
+* Velocities and length units will be scaled to comoving coordinates if yt is
+  able to discern you are examining a cosmology simulation; particle and grid
+  positions will not be.
+* Domains may be visualized assuming periodicity.
+
+.. _loading-ramses-data:
+
+RAMSES Data
+-----------
+
+RAMSES data enjoys preliminary support and is cared for by Matthew Turk.  If
+you are interested in taking a development or stewardship role, please contact
+him.  To load a RAMSES dataset, you can use the ``load`` command provided by
+``yt.mods`` and supply to it the ``info*.txt`` filename.  For instance, if you
+were in a directory with the following files:
+
+.. code-block:: none
+
+   output_00007
+   output_00007/amr_00007.out00001
+   output_00007/grav_00007.out00001
+   output_00007/hydro_00007.out00001
+   output_00007/info_00007.txt
+   output_00007/part_00007.out00001
+
+You would feed it the filename ``output_00007/info_00007.txt``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("output_00007/info_00007.txt")
+
+.. rubric:: Caveats
+
+* Please be careful that the units are correctly set!  This may not be the
+  case for RAMSES data
+* Upon instantiation of the hierarchy, yt will attempt to regrid the entire
+  domain to ensure minimum-coverage from a set of grid patches.  (This is
+  described in the yt method paper.)  This is a time-consuming process and it
+  has not yet been written to be stored between calls.
+* Particles are not supported
+* Parallelism will not be terribly efficient for large datasets
+* There may be occasional segfaults on multi-domain data, which do not
+  reflect errors in the calculation
+
+If you are interested in helping with RAMSES support, we are eager to hear from
+you!
+
+.. _loading-art-data:
+
+ART Data
+--------
+
+ART data enjoys preliminary support and is supported by Christopher Moody.
+Please contact the ``yt-dev`` mailing list if you are interested in using yt
+for ART data, or if you are interested in assisting with development of yt to
+work with ART data.
+
+At the moment, the ART octree is 'regridded' at each level to make the native
+octree look more like a mesh-based code. As a result, the initial outlay
+is roughly 60 seconds to grid octs onto a mesh. This will be improved in 
+``yt-3.0``, where octs will be supported natively. 
+
+To load an ART dataset you can use the ``load`` command provided by 
+``yt.mods``, passing it the gas mesh file. yt will attempt to find the 
+complementary dark matter and stellar particle header and data files.
+However, your simulations may not follow the same naming convention.
+
+So for example, a single snapshot might have a series of files looking like
+this:
+
+.. code-block:: none
+
+   10MpcBox_csf512_a0.300.d    #Gas mesh
+   PMcrda0.300.DAT             #Particle header
+   PMcrs0a0.300.DAT            #Particle data (positions,velocities)
+   stars_a0.300.dat            #Stellar data (metallicities, ages, etc.)
+
+The ART frontend tries to find the associated files matching the above, but
+if that fails you can specify ``file_particle_header``, ``file_particle_data``,
+and ``file_star_data`` in addition to specifying the gas mesh. You also have 
+the option of gridding particles and assigning them onto the meshes.
+This process is in beta, and for the time being it's probably best to leave
+``do_grid_particles=False`` as the default.
+
+To speed up the loading of an ART file, you have a few options. You can turn 
+off the particles entirely by setting ``discover_particles=False``. You can
+also only grid octs up to a certain level, ``limit_level=5``, which is useful
+when debugging by artificially creating a 'smaller' dataset to work with.
+
+Finally, when stellar ages are computed we 'spread' the ages evenly within a
+smoothing window. By default this is turned on and set to 10 Myr. To turn this 
+off you can set ``spread=False``, and you can tweak the age smoothing window
+by specifying the window in seconds, ``spread=1.0e7*365*24*3600``. 
+
+.. code-block:: python
+    
+   from yt.mods import *
+
+   file = "/u/cmoody3/data/art_snapshots/SFG1/10MpcBox_csf512_a0.460.d"
+   pf = load(file, discover_particles=True, grid_particles=2, limit_level=3)
+   pf.h.print_stats()
+   dd = pf.h.all_data()
+   print np.sum(dd['particle_type']==0)
+
+In the above example code, the first line imports the standard yt functions,
+followed by defining the gas mesh file. The dataset is loaded only through
+level 3, and particles are gridded onto meshes at level 2 and higher.
+Finally, we create a data container and ask it to gather the
+``particle_type`` array. In this case ``type==0`` corresponds to the most
+highly-refined dark matter particles, and we print out how many of these
+high-resolution particles we find in the simulation.  Typically, however,
+you shouldn't have to specify any keyword arguments to load in a dataset.
+
+Athena Data
+-----------
+
+Athena 4.x VTK data is *mostly* supported and cared for by John
+ZuHone. Both uniform grid and SMR datasets are supported. 
+
+Loading Athena datasets is slightly different depending on whether
+your dataset came from a serial or a parallel run. If the data came
+from a serial run or you have joined the VTK files together using the
+Athena tool ``join_vtk``, you can load the data like this:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("kh.0010.vtk")
+
+The filename corresponds to the file on SMR level 0. If there are
+multiple levels, the corresponding files will be picked up automatically,
+provided they are laid out in ``lev*`` subdirectories under the directory
+where the base file is located.
+
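+For example, a serial two-level SMR dataset might be laid out like this
+(the level-1 file name here is hypothetical; the exact names are
+determined by your Athena run):
+
+.. code-block:: none
+
+   kh.0010.vtk
+   lev1/kh-lev1.0010.vtk
+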
+For parallel datasets, yt assumes that they are laid out in
+directories named ``id*``, one for each processor number, each with
+``lev*`` subdirectories for additional refinement levels. To load this
+data, call ``load`` with the base file in the ``id0`` directory:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("id0/kh.0010.vtk")
+
+which will pick up all of the files in the different ``id*`` directories for
+the entire dataset. 
+
+yt works in cgs ("Gaussian") units, but Athena data is not
+normally stored in these units. If you would like to convert data to
+cgs units, you may supply conversions for length, time, and density to ``load``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("id0/cluster_merger.0250.vtk", 
+          parameters={"LengthUnits":3.0856e24,
+                               "TimeUnits":3.1557e13,"DensityUnits":1.67e-24)
+
+This means that the yt fields (e.g. ``Density``, ``x-velocity``,
+``Bx``) will be in cgs units, but the Athena fields (e.g.,
+``density``, ``velocity_x``, ``cell_centered_B_x``) will be in code
+units. 
+
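+As a minimal sketch of this distinction (assuming the ``cluster_merger``
+dataset loaded above), compare the cgs-based and code-unit density fields
+on a data container:
+
+.. code-block:: python
+
+   dd = pf.h.all_data()
+   print dd["Density"][0], dd["density"][0] # cgs value vs. code-units value
+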
+.. rubric:: Caveats
+
+* yt primarily works with primitive variables. If the Athena
+  dataset contains conserved variables, the yt primitive fields
+  will be generated from the conserved variables on disk. 
+* Domains may be visualized assuming periodicity.
+* Particle list data is currently unsupported.
+* In some parallel Athena datasets, it is possible for a grid from one
+  refinement level to overlap with more than one grid on the parent
+  level. This may result in unpredictable behavior for some analysis
+  or visualization tasks. 
+


https://bitbucket.org/yt_analysis/yt-doc/commits/924870bf3906/
Changeset:   924870bf3906
User:        jzuhone
Date:        2013-10-30 06:51:05
Summary:     Fixed a few issues with the notebook.
Affected #:  1 file

diff -r 42d8995683f6efcea72131f6f03d6279b538be24 -r 924870bf39063f488091434379b912fb17536879 source/examining/Loading_Generic_Array_Data.ipynb
--- a/source/examining/Loading_Generic_Array_Data.ipynb
+++ b/source/examining/Loading_Generic_Array_Data.ipynb
@@ -204,7 +204,7 @@
      "collapsed": false,
      "input": [
       "import h5py\n",
-      "f = h5py.File(os.environ[\"YT_DATA_DIR\"]+\"/turb_vels.h5\", \"r\") # Read-only access to the file"
+      "f = h5py.File(os.environ[\"YT_DATA_DIR\"]+\"/UnigridData/turb_vels.h5\", \"r\") # Read-only access to the file"
      ],
      "language": "python",
      "metadata": {},
@@ -301,7 +301,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
-      "f = pyfits.open(os.environ[\"YT_DATA_DIR\"]+\"/velocity_field_20.fits.gz\")\n",
+      "f = pyfits.open(os.environ[\"YT_DATA_DIR\"]+\"/UnigridData/velocity_field_20.fits.gz\")\n",
       "f.info()"
      ],
      "language": "python",
@@ -463,7 +463,7 @@
      "level": 2,
      "metadata": {},
      "source": [
-      "Caveats for Loading Generic Data"
+      "Caveats for Loading Generic Array Data"
      ]
     },
     {


https://bitbucket.org/yt_analysis/yt-doc/commits/4c5ea52653be/
Changeset:   4c5ea52653be
User:        jzuhone
Date:        2013-10-30 06:51:31
Summary:     Merging
Affected #:  12 files

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 extensions/notebook_sphinxext.py
--- a/extensions/notebook_sphinxext.py
+++ b/extensions/notebook_sphinxext.py
@@ -51,7 +51,11 @@
         f.write(script_text.encode('utf8'))
         f.close()
 
-        evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval)
+        try:
+            evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval)
+        except:
+            # bail: skip notebooks that fail to evaluate rather than aborting the build
+            return []
 
         # Create link to notebook and script files
         link_rst = "(" + \

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 extensions/notebookcell_sphinxext.py
--- a/extensions/notebookcell_sphinxext.py
+++ b/extensions/notebookcell_sphinxext.py
@@ -32,7 +32,11 @@
 
         convert_to_ipynb('temp.py', 'temp.ipynb')
 
-        evaluated_text = evaluate_notebook('temp.ipynb')
+        try:
+            evaluated_text = evaluate_notebook('temp.ipynb')
+        except:
+            # bail: skip cells that fail to evaluate rather than aborting the build
+            return []
 
         # create notebook node
         attributes = {'format': 'html', 'source': 'nb_path'}

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 helper_scripts/show_fields.py
--- a/helper_scripts/show_fields.py
+++ b/helper_scripts/show_fields.py
@@ -17,6 +17,17 @@
 everywhere, "Enzo" fields in Enzo datasets, "Orion" fields in Orion datasets,
 and so on.
 
+Try using ``pf.h.field_list`` and ``pf.h.derived_field_list`` to view the
+native and derived fields available for your dataset, respectively. For
+example, to display the native fields in alphabetical order:
+
+.. notebook-cell::
+
+  from yt.mods import *
+  pf = load("Enzo_64/DD0043/data0043")
+  for i in sorted(pf.h.field_list):
+    print i
+
 .. note:: Universal fields will be overridden by a code-specific field.
 
 .. rubric:: Table of Contents
@@ -95,7 +106,37 @@
 print
 print_all_fields(FLASHFieldInfo)
 
-print "Nyx-Specific Field List"
+print "Athena-Specific Field List"
 print "--------------------------"
 print
+print_all_fields(AthenaFieldInfo)
+
+print "Nyx-Specific Field List"
+print "-----------------------"
+print
 print_all_fields(NyxFieldInfo)
+
+print "Castro-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(CastroFieldInfo)
+
+print "Chombo-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(ChomboFieldInfo)
+
+print "Pluto-Specific Field List"
+print "--------------------------"
+print
+print_all_fields(PlutoFieldInfo)
+
+print "Grid-Data-Format-Specific Field List"
+print "------------------------------------"
+print
+print_all_fields(GDFFieldInfo)
+
+print "Generic-Format (Stream) Field List"
+print "----------------------------------"
+print
+print_all_fields(StreamFieldInfo)

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 source/api/api.rst
--- a/source/api/api.rst
+++ /dev/null
@@ -1,563 +0,0 @@
-API Reference
-=============
-
-Plots and the Plotting Interface
---------------------------------
-
-SlicePlot and ProjectionPlot
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_window.SlicePlot
-   ~yt.visualization.plot_window.OffAxisSlicePlot
-   ~yt.visualization.plot_window.ProjectionPlot
-   ~yt.visualization.plot_window.OffAxisProjectionPlot
-
-PlotCollection
-^^^^^^^^^^^^^^
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_collection.PlotCollection
-   ~yt.visualization.plot_collection.PlotCollectionInteractive
-   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
-   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
-   ~yt.visualization.base_plot_types.get_multi_plot
-
-Data Sources
-------------
-
-.. _physical-object-api:
-
-Physical Objects
-^^^^^^^^^^^^^^^^
-
-These are the objects that act as physical selections of data, describing a
-region in space.  These are not typically addressed directly; see
-:ref:`available-objects` for more information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.data_containers.AMRCoveringGridBase
-   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
-   ~yt.data_objects.data_containers.AMRCylinderBase
-   ~yt.data_objects.data_containers.AMRGridCollectionBase
-   ~yt.data_objects.data_containers.AMRRayBase
-   ~yt.data_objects.data_containers.AMROrthoRayBase
-   ~yt.data_objects.data_containers.AMRStreamlineBase
-   ~yt.data_objects.data_containers.AMRProjBase
-   ~yt.data_objects.data_containers.AMRRegionBase
-   ~yt.data_objects.data_containers.AMRSliceBase
-   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
-   ~yt.data_objects.data_containers.AMRSphereBase
-   ~yt.data_objects.data_containers.AMRSurfaceBase
-
-Time Series Objects
-^^^^^^^^^^^^^^^^^^^
-
-These are objects that either contain and represent or operate on series of
-datasets.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.time_series.TimeSeriesData
-   ~yt.data_objects.time_series.TimeSeriesDataObject
-   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
-   ~yt.data_objects.time_series.AnalysisTaskProxy
-
-Frontends
----------
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.grid_patch.AMRGridPatch
-   ~yt.data_objects.hierarchy.AMRHierarchy
-   ~yt.data_objects.static_output.StaticOutput
-
-Enzo
-^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.enzo.data_structures.EnzoGrid
-   ~yt.frontends.enzo.data_structures.EnzoHierarchy
-   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
-
-Orion
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.orion.data_structures.OrionGrid
-   ~yt.frontends.orion.data_structures.OrionHierarchy
-   ~yt.frontends.orion.data_structures.OrionStaticOutput
-
-FLASH
-^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.flash.data_structures.FLASHGrid
-   ~yt.frontends.flash.data_structures.FLASHHierarchy
-   ~yt.frontends.flash.data_structures.FLASHStaticOutput
-
-Chombo
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.chombo.data_structures.ChomboGrid
-   ~yt.frontends.chombo.data_structures.ChomboHierarchy
-   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
-
-RAMSES
-^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.frontends.ramses.data_structures.RAMSESGrid
-   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
-   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
-
-Derived Datatypes
------------------
-
-Profiles and Histograms
-^^^^^^^^^^^^^^^^^^^^^^^
-
-These types are used to sum data up and either return that sum or return an
-average.  Typically they are more easily used through the
-`yt.visualization.plot_collection` interface.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.profiles.BinnedProfile1D
-   ~yt.data_objects.profiles.BinnedProfile2D
-   ~yt.data_objects.profiles.BinnedProfile3D
-
-Halo Finding and Particle Functions
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Halo finding can be executed using these types.  Here we list the main halo
-finders as well as a few other supplemental objects.
-
-.. rubric:: Halo Finders
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
-   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
-
-You can also operate on the Halo and HAloList objects themselves:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.Halo
-   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
-   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
-   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
-
-There are also functions for loading halos from disk:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
-   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
-
-We have several methods that work to create merger trees:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
-   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
-
-You can use Halo catalogs generatedl externally as well:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
-   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
-
-Halo Profiling
-^^^^^^^^^^^^^^
-
-yt provides a comprehensive halo profiler that can filter, center, and analyze
-halos en masse.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
-   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
-
-
-Two Point Functions
-^^^^^^^^^^^^^^^^^^^
-
-These functions are designed to create correlations or other results of
-operations acting on two spatially-distinct points in a data source.  See also
-:ref:`two_point_functions`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
-   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
-
-Field Types
------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.field_info_container.DerivedField
-   ~yt.data_objects.field_info_container.FieldInfoContainer
-   ~yt.data_objects.field_info_container.ValidateDataField
-   ~yt.data_objects.field_info_container.ValidateGridType
-   ~yt.data_objects.field_info_container.ValidateParameter
-   ~yt.data_objects.field_info_container.ValidateProperty
-   ~yt.data_objects.field_info_container.ValidateSpatial
-
-Image Handling
---------------
-
-For volume renderings and fixed resolution buffers the image object returned is
-an ``ImageArray`` object, which has useful functions for image saving and 
-writing to bitmaps.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.image_array.ImageArray
-   ~yt.data_objects.image_array.ImageArray.write_png
-   ~yt.data_objects.image_array.ImageArray.write_hdf5
-
-Extension Types
----------------
-
-Coordinate Transformations
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
-   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
-
-Cosmology, Star Particle Analysis, and Simulated Observations
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-For the generation of stellar SEDs.  (See also :ref:`star_analysis`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
-   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
-
-Light cone generation and simulation analysis.  (See also
-:ref:`light-cone-generator`.)
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
-   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
-
-Absorption and X-ray spectra and spectral lines:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
-   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
-
-Absorption spectra fitting:
-
-.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
-
-Sunrise exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
-   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
-
-RADMC-3D exporting:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
-   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
-
-Radial Column Density
-^^^^^^^^^^^^^^^^^^^^^
-
-If you'd like to calculate the column density out to a given point, from a
-specified center, yt can provide that information.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
-
-Volume Rendering
-^^^^^^^^^^^^^^^^
-
-See also :ref:`volume_rendering`.
-
-Here are the primary entry points:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.Camera
-   ~yt.visualization.volume_rendering.camera.off_axis_projection
-   ~yt.visualization.volume_rendering.camera.allsky_projection
-
-These objects set up the way the image looks:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
-   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
-
-There are also advanced objects for particular use cases:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
-   ~yt.visualization.volume_rendering.camera.FisheyeCamera
-   ~yt.visualization.volume_rendering.camera.MosaicCamera
-   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
-   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
-   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
-   ~yt.visualization.volume_rendering.camera.StereoPairCamera
-
-Streamlining
-^^^^^^^^^^^^
-
-See also :ref:`streamlines`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.streamlines.Streamlines
-
-Image Writing
-^^^^^^^^^^^^^
-
-These functions are all used for fast writing of images directly to disk,
-without calling matplotlib.  This can be very useful for high-cadence outputs
-where colorbars are unnecessary or for volume rendering.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.image_writer.multi_image_composite
-   ~yt.visualization.image_writer.write_bitmap
-   ~yt.visualization.image_writer.write_projection
-   ~yt.visualization.image_writer.write_fits
-   ~yt.visualization.image_writer.write_image
-   ~yt.visualization.image_writer.map_to_colors
-   ~yt.visualization.image_writer.strip_colormap_data
-   ~yt.visualization.image_writer.splat_points
-   ~yt.visualization.image_writer.annotate_image
-   ~yt.visualization.image_writer.scale_image
-
-We also provide a module that is very good for generating EPS figures,
-particularly with complicated layouts.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.eps_writer.DualEPS
-   ~yt.visualization.eps_writer.single_plot
-   ~yt.visualization.eps_writer.multiplot
-   ~yt.visualization.eps_writer.multiplot_yt
-   ~yt.visualization.eps_writer.return_cmap
-
-.. _image-panner-api:
-
-Derived Quantities
-------------------
-
-See :ref:`derived-quantities`.
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.data_objects.derived_quantities._AngularMomentumVector
-   ~yt.data_objects.derived_quantities._BaryonSpinParameter
-   ~yt.data_objects.derived_quantities._BulkVelocity
-   ~yt.data_objects.derived_quantities._CenterOfMass
-   ~yt.data_objects.derived_quantities._Extrema
-   ~yt.data_objects.derived_quantities._IsBound
-   ~yt.data_objects.derived_quantities._MaxLocation
-   ~yt.data_objects.derived_quantities._ParticleSpinParameter
-   ~yt.data_objects.derived_quantities._TotalMass
-   ~yt.data_objects.derived_quantities._TotalQuantity
-   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
-
-.. _callback-api:
-
-Callback List
--------------
-
-
-See also :ref:`callbacks`.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.visualization.plot_modifications.ArrowCallback
-   ~yt.visualization.plot_modifications.ClumpContourCallback
-   ~yt.visualization.plot_modifications.ContourCallback
-   ~yt.visualization.plot_modifications.CoordAxesCallback
-   ~yt.visualization.plot_modifications.CuttingQuiverCallback
-   ~yt.visualization.plot_modifications.GridBoundaryCallback
-   ~yt.visualization.plot_modifications.HopCircleCallback
-   ~yt.visualization.plot_modifications.HopParticleCallback
-   ~yt.visualization.plot_modifications.LabelCallback
-   ~yt.visualization.plot_modifications.LinePlotCallback
-   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
-   ~yt.visualization.plot_modifications.ParticleCallback
-   ~yt.visualization.plot_modifications.PointAnnotateCallback
-   ~yt.visualization.plot_modifications.QuiverCallback
-   ~yt.visualization.plot_modifications.SphereCallback
-   ~yt.visualization.plot_modifications.TextLabelCallback
-   ~yt.visualization.plot_modifications.TitleCallback
-   ~yt.visualization.plot_modifications.UnitBoundaryCallback
-   ~yt.visualization.plot_modifications.VelocityCallback
-
-Function List
--------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.convenience.load
-   ~yt.funcs.deprecate
-   ~yt.funcs.ensure_list
-   ~yt.funcs.get_pbar
-   ~yt.funcs.humanize_time
-   ~yt.funcs.insert_ipython
-   ~yt.funcs.iterable
-   ~yt.funcs.just_one
-   ~yt.funcs.only_on_root
-   ~yt.funcs.paste_traceback
-   ~yt.funcs.pdb_run
-   ~yt.funcs.print_tb
-   ~yt.funcs.rootonly
-   ~yt.funcs.time_execution
-   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
-
-Miscellaneous Types
--------------------
-
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.config.YTConfigParser
-   ~yt.utilities.parameter_file_storage.ParameterFileStore
-   ~yt.data_objects.data_containers.FakeGridForParticles
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
-   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
-
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
-.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
-
-
-Testing Infrastructure
-----------------------
-
-The first set of functions are all provided by NumPy.
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_array_equal
-   ~yt.testing.assert_almost_equal
-   ~yt.testing.assert_approx_equal
-   ~yt.testing.assert_array_almost_equal
-   ~yt.testing.assert_equal
-   ~yt.testing.assert_array_less
-   ~yt.testing.assert_string_equal
-   ~yt.testing.assert_array_almost_equal_nulp
-   ~yt.testing.assert_allclose
-   ~yt.testing.assert_raises
-
-These are yt-provided functions:
-
-.. autosummary::
-   :toctree: generated/
-
-   ~yt.testing.assert_rel_equal
-   ~yt.testing.amrspace
-   ~yt.testing.fake_random_pf
-   ~yt.testing.expand_keywords

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -256,4 +256,4 @@
                        }
 
 if not on_rtd:
-    autosummary_generate = glob.glob("api/api.rst")
+    autosummary_generate = glob.glob("reference/api/api.rst")

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -1,7 +1,7 @@
 .. _cookbook:
 
-The yt Cookbook
-===============
+The Cookbook
+============
 
 yt provides a great deal of functionality to the user, but sometimes it can 
 be a bit complex.  This section of the documentation lays out example recipes 

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -3,38 +3,129 @@
 
 yt is a community-developed analysis and visualization toolkit for
 examining datasets in a variety of scientific disciplines.  yt is developed 
-in Python under the open-source model.  yt currently supports several 
-astrophysical simulation code formats, as well support for :ref:`loading-numpy-array`
-for unsupported data formats.  Fully-supported codes 
-include: `Enzo <http://enzo-project.org/>`_, 
-`Orion <http://flash.uchicago.edu/~rfisher/orion/>`_,
-`Nyx <https://ccse.lbl.gov/Research/NYX/index.html>`_, 
-`FLASH <http://flash.uchicago.edu/website/home/>`_, 
-`Piernik <http://arxiv.org/abs/0901.0104>`_;
-and partially-supported codes include: 
-`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_,
-`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_,
-`Maestro <https://ccse.lbl.gov/Research/MAESTRO/>`_,
-`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_.
-
-yt uses a three-pronged approach to interacting with data:
-
- * Visualize Data - Generate plots, images, and movies for better understanding your datasets
- * Analyze Data - Use additional analysis routines to derive real-world results from your data
- * Examine Data - Directly access raw data with helper functions for making this task easier
+in Python under the open-source model.  As of version 2.6, yt supports
+several astrophysical simulation code formats, as well as
+:ref:`loading-numpy-array` for unsupported data formats.  :ref:`code-support`
+is included for: `Enzo <http://enzo-project.org/>`_, `Orion
+<http://flash.uchicago.edu/~rfisher/orion/>`_, `Nyx
+<https://ccse.lbl.gov/Research/NYX/index.html>`_, `FLASH
+<http://flash.uchicago.edu/website/home/>`_, `Piernik
+<http://arxiv.org/abs/0901.0104>`_, `Athena
+<https://trac.princeton.edu/Athena/>`_, `Chombo <http://chombo.lbl.gov>`_,
+`Castro <https://ccse.lbl.gov/Research/CASTRO/>`_, `Maestro
+<https://ccse.lbl.gov/Research/MAESTRO/>`_, and `Pluto
+<http://plutocode.ph.unito.it/>`_.  (Development of additional codes, including
+particle codes and octree codes, is taking place in yt 3.0.)
 
 Documentation
 =============
 
+.. raw:: html
+
+   <table class="contentstable" align="center">
+
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="installing.html">Installation</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Getting and Installing yt</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="bootcamp/index.html">yt Bootcamp</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Demonstrations of what yt can do</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="cookbook/index.html">The Cookbook</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Example recipes for how to accomplish a variety of tasks</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="visualizing/index.html">Visualizing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Make plots, projections, volume renderings, movies, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="analyzing/index.html">Analyzing Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Use analysis tools to extract results from your data</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="examining/index.html">Examining Data</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Load data and directly access raw values for low-level analysis</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="developing/index.html">Developing in yt</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Catering yt to work for your exact use case</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="reference/index.html">Reference Materials</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Lists of fields, quantities, classes, functions, and more</p>
+       </td>
+     </tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="help/index.html">Getting help</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">What to do if you run into problems</p>
+       </td>
+     </tr>
+
+   </table>
+
 .. toctree::
-   :maxdepth: 1
+   :hidden:
 
    installing
-   yt Bootcamp: A Worked Introduction <bootcamp/index>
-   help/index
+   yt Bootcamp <bootcamp/index>
    cookbook/index
    visualizing/index
    analyzing/index
    examining/index
    developing/index
    reference/index
+   help/index

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 source/reference/api/api.rst
--- /dev/null
+++ b/source/reference/api/api.rst
@@ -0,0 +1,563 @@
+API Reference
+=============
+
+Plots and the Plotting Interface
+--------------------------------
+
+SlicePlot and ProjectionPlot
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_window.SlicePlot
+   ~yt.visualization.plot_window.OffAxisSlicePlot
+   ~yt.visualization.plot_window.ProjectionPlot
+   ~yt.visualization.plot_window.OffAxisProjectionPlot
+
+PlotCollection
+^^^^^^^^^^^^^^
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_collection.PlotCollection
+   ~yt.visualization.plot_collection.PlotCollectionInteractive
+   ~yt.visualization.fixed_resolution.FixedResolutionBuffer
+   ~yt.visualization.fixed_resolution.ObliqueFixedResolutionBuffer
+   ~yt.visualization.base_plot_types.get_multi_plot
+
+Data Sources
+------------
+
+.. _physical-object-api:
+
+Physical Objects
+^^^^^^^^^^^^^^^^
+
+These are the objects that act as physical selections of data, describing a
+region in space.  These are not typically addressed directly; see
+:ref:`available-objects` for more information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.data_containers.AMRCoveringGridBase
+   ~yt.data_objects.data_containers.AMRCuttingPlaneBase
+   ~yt.data_objects.data_containers.AMRCylinderBase
+   ~yt.data_objects.data_containers.AMRGridCollectionBase
+   ~yt.data_objects.data_containers.AMRRayBase
+   ~yt.data_objects.data_containers.AMROrthoRayBase
+   ~yt.data_objects.data_containers.AMRStreamlineBase
+   ~yt.data_objects.data_containers.AMRProjBase
+   ~yt.data_objects.data_containers.AMRRegionBase
+   ~yt.data_objects.data_containers.AMRSliceBase
+   ~yt.data_objects.data_containers.AMRSmoothedCoveringGridBase
+   ~yt.data_objects.data_containers.AMRSphereBase
+   ~yt.data_objects.data_containers.AMRSurfaceBase
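+
+As a minimal, hedged sketch (the dataset path and radius are illustrative),
+one of these is typically created through the hierarchy rather than
+instantiated directly:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   # a sphere of radius 0.1 in code units, centered on the domain center;
+   # this returns an AMRSphereBase under the hood
+   sp = pf.h.sphere([0.5, 0.5, 0.5], 0.1)
+   print sp["Density"].max()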
+
+Time Series Objects
+^^^^^^^^^^^^^^^^^^^
+
+These are objects that contain, represent, or operate on series of
+datasets.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.time_series.TimeSeriesData
+   ~yt.data_objects.time_series.TimeSeriesDataObject
+   ~yt.data_objects.time_series.TimeSeriesQuantitiesContainer
+   ~yt.data_objects.time_series.AnalysisTaskProxy
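+
+As a hedged sketch (the glob pattern is hypothetical and depends on how your
+outputs are named), a time series is typically built from a set of filenames
+and then iterated over:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   # collect a series of Enzo outputs into a single object
+   ts = TimeSeriesData.from_filenames("DD????/DD????")
+   for pf in ts:
+       print pf.current_time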
+
+Frontends
+---------
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.grid_patch.AMRGridPatch
+   ~yt.data_objects.hierarchy.AMRHierarchy
+   ~yt.data_objects.static_output.StaticOutput
+
+Enzo
+^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.enzo.data_structures.EnzoGrid
+   ~yt.frontends.enzo.data_structures.EnzoHierarchy
+   ~yt.frontends.enzo.data_structures.EnzoStaticOutput
+
+Orion
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.orion.data_structures.OrionGrid
+   ~yt.frontends.orion.data_structures.OrionHierarchy
+   ~yt.frontends.orion.data_structures.OrionStaticOutput
+
+FLASH
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.flash.data_structures.FLASHGrid
+   ~yt.frontends.flash.data_structures.FLASHHierarchy
+   ~yt.frontends.flash.data_structures.FLASHStaticOutput
+
+Chombo
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.chombo.data_structures.ChomboGrid
+   ~yt.frontends.chombo.data_structures.ChomboHierarchy
+   ~yt.frontends.chombo.data_structures.ChomboStaticOutput
+
+RAMSES
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.ramses.data_structures.RAMSESGrid
+   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
+   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
+
+Derived Datatypes
+-----------------
+
+Profiles and Histograms
+^^^^^^^^^^^^^^^^^^^^^^^
+
+These types are used to sum data up and either return that sum or return an
+average.  Typically they are more easily used through the
+``yt.visualization.plot_collection`` interface.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.profiles.BinnedProfile1D
+   ~yt.data_objects.profiles.BinnedProfile2D
+   ~yt.data_objects.profiles.BinnedProfile3D
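+
+A minimal sketch of direct usage (the bin bounds below are assumed values,
+not recommendations):
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   dd = pf.h.all_data()
+   # 64 logarithmic Density bins between the two bounds
+   prof = BinnedProfile1D(dd, 64, "Density", 1e-30, 1e-24)
+   prof.add_fields("Temperature", weight="CellMassMsun")
+   print prof["Temperature"]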
+
+Halo Finding and Particle Functions
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Halo finding can be executed using these types.  Here we list the main halo
+finders as well as a few other supplemental objects.
+
+.. rubric:: Halo Finders
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloFinder
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHF
+   ~yt.analysis_modules.halo_finding.rockstar.api.RockstarHaloFinder
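+
+A short, hedged example of running the default HOP-based finder (the dataset
+path is illustrative):
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("Enzo_64/DD0043/data0043")
+   halos = HaloFinder(pf)              # HOP-based finder from yt.mods
+   halos.write_out("HopAnalysis.out")  # one halo per line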
+
+You can also operate on the ``Halo`` and ``HaloList`` objects themselves:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.Halo
+   ~yt.analysis_modules.halo_finding.halo_objects.HaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHalo
+   ~yt.analysis_modules.halo_finding.halo_objects.RockstarHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.HOPHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.FOFHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadedHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.TextHaloList
+   ~yt.analysis_modules.halo_finding.halo_objects.parallelHOPHaloList
+
+There are also functions for loading halos from disk:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadTextHaloes
+   ~yt.analysis_modules.halo_finding.halo_objects.LoadRockstarHalos
+
+We have several methods that work to create merger trees:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTree
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeConnect
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeDotOutput
+   ~yt.analysis_modules.halo_merger_tree.merger_tree.MergerTreeTextOutput
+
+You can use halo catalogs generated externally as well:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.HaloCatalog
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.EnzoFOFMergerTree
+   ~yt.analysis_modules.halo_merger_tree.enzofof_merger_tree.plot_halo_evolution
+
+Halo Profiling
+^^^^^^^^^^^^^^
+
+yt provides a comprehensive halo profiler that can filter, center, and analyze
+halos en masse.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.HaloProfiler
+   ~yt.analysis_modules.halo_profiler.multi_halo_profiler.VirialFilter
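+
+A hedged sketch of typical usage (the dataset path and profiled fields are
+illustrative):
+
+.. code-block:: python
+
+   from yt.mods import *
+   from yt.analysis_modules.halo_profiler.api import *
+
+   hp = HaloProfiler("enzo_tiny_cosmology/DD0046/DD0046")
+   # an accumulating CellVolume profile plus a mass-weighted temperature
+   hp.add_profile("CellVolume", weight_field=None, accumulation=True)
+   hp.add_profile("Temperature", weight_field="CellMassMsun",
+                  accumulation=False)
+   hp.make_profiles(filename="VirialQuantities.h5")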
+
+
+Two Point Functions
+^^^^^^^^^^^^^^^^^^^
+
+These functions are designed to create correlations or other results of
+operations acting on two spatially-distinct points in a data source.  See also
+:ref:`two_point_functions`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.two_point_functions.two_point_functions.TwoPointFunctions
+   ~yt.analysis_modules.two_point_functions.two_point_functions.FcnSet
+
+Field Types
+-----------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.field_info_container.DerivedField
+   ~yt.data_objects.field_info_container.FieldInfoContainer
+   ~yt.data_objects.field_info_container.ValidateDataField
+   ~yt.data_objects.field_info_container.ValidateGridType
+   ~yt.data_objects.field_info_container.ValidateParameter
+   ~yt.data_objects.field_info_container.ValidateProperty
+   ~yt.data_objects.field_info_container.ValidateSpatial
+
+Image Handling
+--------------
+
+For volume renderings and fixed resolution buffers, the image object returned
+is an ``ImageArray``, which has useful functions for saving images and
+writing to bitmaps.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.image_array.ImageArray
+   ~yt.data_objects.image_array.ImageArray.write_png
+   ~yt.data_objects.image_array.ImageArray.write_hdf5
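+
+A minimal sketch of wrapping an array and writing it out (the buffer here is
+random data, purely for illustration):
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.data_objects.image_array import ImageArray
+
+   # wrap a random 256x256 RGBA buffer and save it in two formats
+   im = ImageArray(np.random.random((256, 256, 4)))
+   im.write_png("random.png")
+   im.write_hdf5("random.h5")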
+
+Extension Types
+---------------
+
+Coordinate Transformations
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.coordinate_transformation.transforms.arbitrary_regrid
+   ~yt.analysis_modules.coordinate_transformation.transforms.spherical_regrid
+
+Cosmology, Star Particle Analysis, and Simulated Observations
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+For the generation of stellar SEDs.  (See also :ref:`star_analysis`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.StarFormationRate
+   ~yt.analysis_modules.star_analysis.sfr_spectrum.SpectrumBuilder
+
+Light cone generation and simulation analysis.  (See also
+:ref:`light-cone-generator`.)
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.cosmological_observation.light_cone.light_cone.LightCone
+   ~yt.analysis_modules.cosmological_observation.light_ray.light_ray.LightRay
+
+Absorption and X-ray spectra and spectral lines:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.absorption_spectrum.absorption_spectrum.AbsorptionSpectrum
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.EmissivityIntegrator
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_emissivity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_luminosity_field
+   ~yt.analysis_modules.spectral_integrator.spectral_frequency_integrator.add_xray_photon_emissivity_field
+
+Absorption spectra fitting:
+
+.. autofunction:: yt.analysis_modules.absorption_spectrum.absorption_spectrum_fit.generate_total_fit
+
+Sunrise exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise
+   ~yt.analysis_modules.sunrise_export.sunrise_exporter.export_to_sunrise_from_halolist
+
+RADMC-3D exporting:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DLayer
+   ~yt.analysis_modules.radmc3d_export.RadMC3DInterface.RadMC3DWriter
+
+Radial Column Density
+^^^^^^^^^^^^^^^^^^^^^
+
+If you'd like to calculate the column density out to a given point, from a
+specified center, yt can provide that information.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.analysis_modules.radial_column_density.radial_column_density.RadialColumnDensity
+
+Volume Rendering
+^^^^^^^^^^^^^^^^
+
+See also :ref:`volume_rendering`.
+
+Here are the primary entry points:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.Camera
+   ~yt.visualization.volume_rendering.camera.off_axis_projection
+   ~yt.visualization.volume_rendering.camera.allsky_projection
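+
+A hedged sketch of a basic rendering (the transfer function bounds are
+assumed log10 Density limits for this particular dataset):
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   tf = ColorTransferFunction((-30, -22))
+   tf.add_layers(5)  # five evenly spaced Gaussian layers
+   cam = pf.h.camera([0.5, 0.5, 0.5], [1.0, 1.0, 1.0], 0.3, 256, tf)
+   im = cam.snapshot("rendering.png")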
+
+These objects set up the way the image looks:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.transfer_functions.ColorTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.MultiVariateTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.PlanckTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.ProjectionTransferFunction
+   ~yt.visualization.volume_rendering.transfer_functions.TransferFunction
+
+There are also advanced objects for particular use cases:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.volume_rendering.camera.MosaicFisheyeCamera
+   ~yt.visualization.volume_rendering.camera.FisheyeCamera
+   ~yt.visualization.volume_rendering.camera.MosaicCamera
+   ~yt.visualization.volume_rendering.camera.plot_allsky_healpix
+   ~yt.visualization.volume_rendering.camera.PerspectiveCamera
+   ~yt.utilities.amr_kdtree.amr_kdtree.AMRKDTree
+   ~yt.visualization.volume_rendering.camera.StereoPairCamera
+
+Streamlining
+^^^^^^^^^^^^
+
+See also :ref:`streamlines`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.streamlines.Streamlines
+
+Image Writing
+^^^^^^^^^^^^^
+
+These functions are all used for fast writing of images directly to disk,
+without calling matplotlib.  This can be very useful for high-cadence outputs
+where colorbars are unnecessary or for volume rendering.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.image_writer.multi_image_composite
+   ~yt.visualization.image_writer.write_bitmap
+   ~yt.visualization.image_writer.write_projection
+   ~yt.visualization.image_writer.write_fits
+   ~yt.visualization.image_writer.write_image
+   ~yt.visualization.image_writer.map_to_colors
+   ~yt.visualization.image_writer.strip_colormap_data
+   ~yt.visualization.image_writer.splat_points
+   ~yt.visualization.image_writer.annotate_image
+   ~yt.visualization.image_writer.scale_image
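+
+A hedged sketch of writing a slice to disk without matplotlib (bounds and
+buffer size are illustrative):
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.mods import *
+
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   sl = pf.h.slice(2, 0.5, fields=["Density"])
+   frb = FixedResolutionBuffer(sl, (0.3, 0.7, 0.3, 0.7), (512, 512))
+   # write the log of the density buffer directly as a PNG
+   write_image(np.log10(frb["Density"]), "density_slice.png")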
+
+We also provide a module well suited to generating EPS figures,
+particularly those with complicated layouts.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.eps_writer.DualEPS
+   ~yt.visualization.eps_writer.single_plot
+   ~yt.visualization.eps_writer.multiplot
+   ~yt.visualization.eps_writer.multiplot_yt
+   ~yt.visualization.eps_writer.return_cmap
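+
+A short, hedged example of wrapping an existing plot in an EPS figure:
+
+.. code-block:: python
+
+   from yt.mods import *
+   import yt.visualization.eps_writer as eps
+
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   slc = SlicePlot(pf, "z", "Density")
+   fig = eps.single_plot(slc)
+   fig.save_fig("density_slice", format="eps")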
+
+.. _image-panner-api:
+
+Derived Quantities
+------------------
+
+See :ref:`derived-quantities`.
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.data_objects.derived_quantities._AngularMomentumVector
+   ~yt.data_objects.derived_quantities._BaryonSpinParameter
+   ~yt.data_objects.derived_quantities._BulkVelocity
+   ~yt.data_objects.derived_quantities._CenterOfMass
+   ~yt.data_objects.derived_quantities._Extrema
+   ~yt.data_objects.derived_quantities._IsBound
+   ~yt.data_objects.derived_quantities._MaxLocation
+   ~yt.data_objects.derived_quantities._ParticleSpinParameter
+   ~yt.data_objects.derived_quantities._TotalMass
+   ~yt.data_objects.derived_quantities._TotalQuantity
+   ~yt.data_objects.derived_quantities._WeightedAverageQuantity
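+
+These classes are not used directly; the names drop the leading underscore
+and are reached through the ``quantities`` interface on any data object.  A
+minimal sketch:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   sp = pf.h.sphere([0.5, 0.5, 0.5], 0.1)
+   print sp.quantities["TotalMass"]()
+   print sp.quantities["Extrema"]("Density")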
+
+.. _callback-api:
+
+Callback List
+-------------
+
+
+See also :ref:`callbacks`.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.visualization.plot_modifications.ArrowCallback
+   ~yt.visualization.plot_modifications.ClumpContourCallback
+   ~yt.visualization.plot_modifications.ContourCallback
+   ~yt.visualization.plot_modifications.CoordAxesCallback
+   ~yt.visualization.plot_modifications.CuttingQuiverCallback
+   ~yt.visualization.plot_modifications.GridBoundaryCallback
+   ~yt.visualization.plot_modifications.HopCircleCallback
+   ~yt.visualization.plot_modifications.HopParticleCallback
+   ~yt.visualization.plot_modifications.LabelCallback
+   ~yt.visualization.plot_modifications.LinePlotCallback
+   ~yt.visualization.plot_modifications.MarkerAnnotateCallback
+   ~yt.visualization.plot_modifications.ParticleCallback
+   ~yt.visualization.plot_modifications.PointAnnotateCallback
+   ~yt.visualization.plot_modifications.QuiverCallback
+   ~yt.visualization.plot_modifications.SphereCallback
+   ~yt.visualization.plot_modifications.TextLabelCallback
+   ~yt.visualization.plot_modifications.TitleCallback
+   ~yt.visualization.plot_modifications.UnitBoundaryCallback
+   ~yt.visualization.plot_modifications.VelocityCallback
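+
+On a ``PlotWindow`` plot, these callbacks are applied through the
+corresponding ``annotate_*`` methods.  A brief, hedged example:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   p = SlicePlot(pf, "z", "Density")
+   p.annotate_grids()     # GridBoundaryCallback
+   p.annotate_velocity()  # VelocityCallback
+   p.save()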
+
+Function List
+-------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.convenience.load
+   ~yt.funcs.deprecate
+   ~yt.funcs.ensure_list
+   ~yt.funcs.get_pbar
+   ~yt.funcs.humanize_time
+   ~yt.funcs.insert_ipython
+   ~yt.funcs.iterable
+   ~yt.funcs.just_one
+   ~yt.funcs.only_on_root
+   ~yt.funcs.paste_traceback
+   ~yt.funcs.pdb_run
+   ~yt.funcs.print_tb
+   ~yt.funcs.rootonly
+   ~yt.funcs.time_execution
+   ~yt.analysis_modules.level_sets.contour_finder.identify_contours
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_blocking_call
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_splitter
+
+Miscellaneous Types
+-------------------
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.config.YTConfigParser
+   ~yt.utilities.parameter_file_storage.ParameterFileStore
+   ~yt.data_objects.data_containers.FakeGridForParticles
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ObjectIterator
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelAnalysisInterface
+   ~yt.utilities.parallel_tools.parallel_analysis_interface.ParallelObjectIterator
+
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ConstructedRootGrid
+.. autoclass:: yt.analysis_modules.hierarchy_subset.hierarchy_subset.ExtractedHierarchy
+
+
+Testing Infrastructure
+----------------------
+
+The functions in this first set are all provided by NumPy.
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_array_equal
+   ~yt.testing.assert_almost_equal
+   ~yt.testing.assert_approx_equal
+   ~yt.testing.assert_array_almost_equal
+   ~yt.testing.assert_equal
+   ~yt.testing.assert_array_less
+   ~yt.testing.assert_string_equal
+   ~yt.testing.assert_array_almost_equal_nulp
+   ~yt.testing.assert_allclose
+   ~yt.testing.assert_raises
+
+These are yt-provided functions:
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.testing.assert_rel_equal
+   ~yt.testing.amrspace
+   ~yt.testing.fake_random_pf
+   ~yt.testing.expand_keywords
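+
+A small, hedged example combining the two (``fake_random_pf`` builds an
+in-memory dataset, so no data files are needed):
+
+.. code-block:: python
+
+   from yt.testing import fake_random_pf, assert_rel_equal
+
+   pf = fake_random_pf(16)  # a 16^3 random-Density dataset
+   dd = pf.h.all_data()
+   # the mean must equal sum/size to ten significant digits
+   assert_rel_equal(dd["Density"].sum() / dd["Density"].size,
+                    dd["Density"].mean(), 10)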

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 source/reference/changelog.rst
--- a/source/reference/changelog.rst
+++ b/source/reference/changelog.rst
@@ -15,6 +15,8 @@
  * David Collins
  * Brian Crosby
  * Andrew Cunningham
+ * Hilary Egan
+ * John Forbes
  * Nathan Goldbaum
  * Markus Haider
  * Cameron Hummels
@@ -24,17 +26,22 @@
  * Kacper Kowalik
  * Michael Kuhlen
  * Eve Lee
+ * Sam Leitner
  * Yuan Li
  * Chris Malone
  * Josh Moloney
  * Chris Moody
  * Andrew Myers
+ * Jill Naiman
+ * Kaylea Nelson
  * Jeff Oishi
  * Jean-Claude Passy
  * Mark Richardson
 * Thomas Robitaille
  * Anna Rosen
+ * Douglas Rudd
  * Anthony Scopatz
+ * Noel Scudder
  * Devin Silvia
  * Sam Skillman
  * Stephen Skory
@@ -45,9 +52,98 @@
  * Stephanie Tonnesen
  * Matthew Turk
  * Rick Wagner
+ * Andrew Wetzel
  * John Wise
  * John ZuHone
 
+Version 2.6
+-----------
+
+This is a scheduled release, bringing to a close the development in the 2.5
+series.  Below are the itemized, aggregate changes since version 2.5.
+
+Major changes:
+
+  * yt is now licensed under the 3-clause BSD license.
+  * HEALpix has been removed for the time being, as a result of licensing
+    incompatibility.
+  * The addition of a frontend for the Pluto code
+  * The addition of an OBJ exporter to enable transparent and multi-surface
+    exports of surfaces to Blender and Sketchfab
+  * New absorption spectrum analysis module with documentation
+  * Adding ability to draw lines with Grey Opacity in volume rendering
+  * Updated physical constants to reflect 2010 CODATA data
+  * Dependency updates (including IPython 1.0)
+  * Better notebook support for yt plots
+  * Considerably (10x+) faster kD-tree building for volume rendering
+  * yt can now export to RADMC3D
+  * Athena frontend now supports Static Mesh Refinement and units (
+    http://hub.yt-project.org/nb/7l1zua )
+  * Fix long-standing bug for plotting arrays with range of zero
+  * Adding option to have interpolation based on non-uniform bins in
+    interpolator code
+  * Upgrades to most of the dependencies in the install script
+  * ProjectionPlot now accepts a data_source keyword argument
+
+Minor or bugfix changes:
+
+  * Fix for volume rendering on the command line
+  * map_to_colormap will no longer return out-of-bounds errors
+  * Fixes for dds in covering grid calculations
+  * Library searching for build process is now more reliable
+  * Unit fix for "VorticityGrowthTimescale" field
+  * Pyflakes stylistic fixes
+  * Number density added to FLASH
+  * Many fixes for Athena frontend
+  * Radius and ParticleRadius now work for reduced-dimensionality datasets
+  * Source distributions now work again!
+  * Athena data now 64 bits everywhere
+  * Grids displays on plots are now shaded to reflect the level of refinement
+  * show_colormaps() is a new function for displaying all known colormaps
+  * PhasePlotter by default now adds a colormap.
+  * System build fix for POSIX systems
+  * Fixing domain offsets for halo centers-of-mass
+  * Removing some Enzo-specific terminology in the Halo Mass Function
+  * Addition of coordinate vectors on volume render
+  * Pickling fix for extracted regions
+  * Addition of some tracer particle annotation functions
+  * Better error message for "yt" command
+  * Fix for radial vs poloidal fields
+  * Piernik 2D data handling fix
+  * Fixes for FLASH current redshift
+  * PlotWindows now have a set_font function and a new default font setting
+  * Colorbars less likely to extend off the edge of a PlotWindow
+  * Clumps overplotted on PlotWindows are now correctly contoured
+  * Many fixes to light ray and profiles for integrated cosmological analysis
+  * Improvements to OpenMP compilation
+  * Typo in value for km_per_pc (not used elsewhere in the code base) has been
+    fixed
+  * Enable parallel IPython notebook sessions (
+    http://hub.yt-project.org/nb/qgn19h )
+  * Change (~1e-6) to particle_density deposition, enabling it to be used by
+    FLASH and other frontends
+  * Addition of is_root function for convenience in parallel analysis sessions
+  * Additions to Orion particle reader
+  * Fixing TotalMass for case when particles not present
+  * Fixing the density threshold for HOP and pHOP to match the merger tree
+  * Reason can now plot with latest plot window
+  * Issues with VelocityMagnitude and aliases with velo have been corrected in
+    the FLASH frontend
+  * Halo radii are calculated correctly for domains that do not start at 0,0,0.
+  * Halo mass function now works for non-Enzo frontends.
+  * Bug fixes for directory creation, typos in docstrings
+  * Speed improvements to ellipsoidal particle detection
+  * Updates to FLASH fields
+  * CASTRO frontend bug fixes
+  * Fisheye camera bug fixes
+  * Answer testing now includes plot window answer testing
+  * Athena data serialization
+  * load_uniform_grid can now decompose dims >= 1024.  (#537)
+  * Axis unit setting works correctly for unit names  (#534)
+  * ThermalEnergy is now calculated correctly for Enzo MHD simulations (#535)
+  * Radius fields had an asymmetry in periodicity calculation (#531)
+  * Boolean regions can now be pickled (#517)
+
 Version 2.5
 -----------
 

diff -r 924870bf39063f488091434379b912fb17536879 -r 4c5ea52653bea409733f3a47a011c35677d06a84 source/reference/code_support.rst
--- /dev/null
+++ b/source/reference/code_support.rst
@@ -0,0 +1,53 @@
+
+.. _code-support:
+
+Code Support
+============
+
+Levels of Support for Various Codes
+-----------------------------------
+
+yt provides frontends to support several different simulation code formats 
+as inputs.  Below is a list showing what level of support is provided for
+each code.
+
+|
+
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Capability           | Enzo | Orion | FLASH | Nyx  | Piernik | Athena | Castro | Maestro | Pluto | Chombo |
++======================+======+=======+=======+======+=========+========+========+=========+=======+========+
+| Fluid Quantities     |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Particles            |   Y  |   Y   |   Y   |  Y   |   N/A   |   N    |   Y    |   N     |   N   |    N   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Parameters           |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Units                |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Read on Demand       |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Load Raw Data        |   Y  |   Y   |   Y   |  Y   |    Y    |   Y    |   Y    |   Y     |   Y   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Part of test suite   |   Y  |   Y   |   Y   |  Y   |    N    |   N    |   Y    |   N     |   N   |    Y   |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+| Level of Support     | Full | Full  | Full  | Full |  Full   |  Full  |  Part  |  Part   | Part  |  Part  |
++----------------------+------+-------+-------+------+---------+--------+--------+---------+-------+--------+
+
+|
+
+If you have a dataset from a code not yet supported, you can either 
+input your data using the :ref:`loading-numpy-array` format, or help us by 
+:ref:`creating_frontend` for this new format.
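+
+A hedged sketch of the NumPy-array route (the unit scale of 3.08e24 cm per
+code unit, roughly 1 Mpc, is an assumption for illustration):
+
+.. code-block:: python
+
+   import numpy as np
+   from yt.mods import *
+
+   # treat a random 64^3 array as a "Density" field on a unigrid
+   arr = np.random.random(size=(64, 64, 64))
+   data = dict(Density=arr)
+   pf = load_uniform_grid(data, arr.shape, 3.08e24)
+   SlicePlot(pf, "z", "Density").save()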
+
+Future Codes to Support
+-----------------------
+
+A major overhaul of the code was required to cleanly support additional
+codes.  Development in the yt 3.x branch has begun and provides support
+for codes such as
+`RAMSES <http://irfu.cea.fr/Phocea/Vie_des_labos/Ast/ast_sstechnique.php?id_ast=904>`_, 
+`ART (NMSU) <http://adsabs.harvard.edu/abs/1997ApJS..111...73K>`_, and 
+`Gadget <http://www.mpa-garching.mpg.de/gadget/>`_.  Please switch to that 
+version of yt for the most up-to-date support for those codes.
+
+Additionally, in yt 3.0 the Boxlib formats have been unified and streamlined.

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/b2f5d3d06e3f/
Changeset:   b2f5d3d06e3f
User:        jzuhone
Date:        2013-10-31 02:53:15
Summary:     Merging
Affected #:  21 files

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 extensions/pythonscript_sphinxext.py
--- a/extensions/pythonscript_sphinxext.py
+++ b/extensions/pythonscript_sphinxext.py
@@ -2,7 +2,7 @@
 from subprocess import Popen,PIPE
 from docutils.parsers.rst import directives
 from docutils import nodes
-import os, glob, shutil,  uuid, re
+import os, glob, shutil, uuid, re, string
 
 class PythonScriptDirective(Directive):
     """Execute an inline python script and display images.
@@ -26,6 +26,9 @@
         dest_dir = os.path.abspath(os.path.join(setup.app.builder.outdir,
                                                 source_dir))
 
+        # working around a docutils/sphinx issue?
+        dest_dir = string.replace(dest_dir, 'internal padding after ', '')
+
         if not os.path.exists(dest_dir):
             os.makedirs(dest_dir) # no problem here for me, but just use built-ins
 
@@ -47,6 +50,7 @@
         for im in images:
             fns.append(str(uuid.uuid4()) + ".png")
             shutil.move(im, os.path.join(dest_dir, fns[-1]))
+            print im, os.path.join(dest_dir, fns[-1])
 
         os.remove("temp.py")
 
@@ -73,7 +77,7 @@
     "[a-fA-F0-9]{8}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{12}"
 
 def cleanup(app, exception):
-    """ Cleanup all png files with UUID filenames in the source """ 
+    """ Cleanup all png files with UUID filenames in the source """
     for root,dirnames,filenames in os.walk(app.srcdir):
         matches = re.findall(PATTERN, "\n".join(filenames))
         for match in matches:

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -91,7 +91,7 @@
       "from yt.imods import *\n",
       "from yt.analysis_modules.api import SZProjection\n",
       "\n",
-      "pf = load(os.environ[\"YT_DATA_DIR\"]+\"/enzo_tiny_cosmology/DD0046/DD0046\")\n",
+      "pf = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
       "\n",
       "freqs = [90.,180.,240.]\n",
       "szprj = SZProjection(pf, freqs)"

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/analyzing/creating_derived_fields.rst
--- /dev/null
+++ b/source/analyzing/creating_derived_fields.rst
@@ -0,0 +1,313 @@
+.. _creating-derived-fields:
+
+Creating Derived Fields
+=======================
+
+One of the more powerful means of extending yt is through the usage of derived
+fields.  These are fields that describe a value at each cell in a simulation.
+
+Defining a New Field
+--------------------
+
+Once a new field has been conceived of, the best way to create it is to
+construct a function that performs an array operation -- operating on a 
+collection of data, neutral to its size, shape, and type.  (All fields should
+be provided as 64-bit floats.)
+
+A simple example of this is the pressure field, which demonstrates the ease of
+this approach.
+
+.. code-block:: python
+
+   def _Pressure(field, data):
+       return (data.pf["Gamma"] - 1.0) * \
+              data["Density"] * data["ThermalEnergy"]
+
+Note that we do a couple different things here.  We access the "Gamma"
+parameter from the parameter file, we access the "Density" field and we access
+the "ThermalEnergy" field.  "ThermalEnergy" is, in fact, another derived field!
+("ThermalEnergy" deals with the distinction in storage of energy between dual
+energy formalism and non-DEF.)  We don't do any loops, we don't do any
+type-checking, we can simply multiply the three items together.
+
+Once we've defined our function, we need to notify yt that the field is
+available.  The :func:`add_field` function is the means of doing this; it has a
+number of fairly specific parameters that can be passed in, but here we'll only
+look at the most basic ones needed for a simple scalar baryon field.
+
+.. code-block:: python
+
+   add_field("Pressure", function=_Pressure, units=r"\rm{dyne}/\rm{cm}^{2}")
+
+We feed it the name of the field, the name of the function, and the
+units.  Note that the units parameter is a "raw" string, with some
+LaTeX-style formatting -- Matplotlib actually has a MathText rendering
+engine, so if you include LaTeX it will be rendered appropriately.
+
+.. One very important thing to note about the call to ``add_field`` is
+.. that it **does not** need to specify the function name **if** the
+.. function is the name of the field prefixed with an underscore.  If it
+.. is not -- and it won't be for fields in different units (such as
+.. "CellMassMsun") -- then you need to specify it with the argument
+.. ``function``.
+
+We suggest that you name the function that creates a derived field
+with the intended field name prefixed by a single underscore, as in
+the ``_Pressure`` example above.
+
+If you find yourself using the same custom-defined fields over and over, you
+should put them in your plugins file as described in :ref:`plugin-file`.
+
+Conversion Factors
+~~~~~~~~~~~~~~~~~~
+
+When creating a derived field, yt does not by default do unit
+conversion.  All of the fields fed into the field are pre-supposed to
+be in CGS.  If the field does not need any constants applied after
+that, you are done. If it does, you should define a second function
+that applies the proper multiple in order to return the desired units
+and use the argument ``convert_function`` to ``add_field`` to point to
+it.  
+
+The conversion you supply via ``convert_function`` depends on which fields
+are input into your derived field, and in what form they are stored
+natively.  For Enzo data, nearly all the native on-disk fields are already
+in CGS units (except for the ``dx``, ``dy``, and ``dz`` fields), so you
+typically only need to convert non-standard fields, taking into account
+where those fields enter the final derived field.  For other codes, it can
+vary.
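+
+As a hedged illustration (the field name is hypothetical; yt already ships a
+``CellVolume`` field), a conversion function for a volume built from the
+code-unit ``dx``, ``dy``, and ``dz`` fields might look like:
+
+.. code-block:: python
+
+   def _CellVolumeCGS(field, data):
+       # dx, dy, dz are in code units, so this is a code-unit volume
+       return data["dx"] * data["dy"] * data["dz"]
+   def _convertCellVolumeCGS(data):
+       # one code-unit length in cm, cubed to convert a volume
+       return data.convert("cm")**3.0
+   add_field("CellVolumeCGS", function=_CellVolumeCGS,
+             convert_function=_convertCellVolumeCGS,
+             units=r"\rm{cm}^{3}")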
+
+You can check to see the units associated with any field in a dataset
+from any code by using the ``_units`` attribute.  Here is an example 
+with one of our sample FLASH datasets available publicly at 
+http://yt-project.org/data :
+
+.. code-block:: python
+
+   >>> from yt.mods import *
+   >>> pf = load("GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100")
+   >>> pf.h.field_list
+   ['dens', 'temp', 'pres', 'gpot', 'divb', 'velx', 'vely', 'velz', 'magx', 'magy', 'magz', 'magp']
+   >>> pf.field_info['dens']._units
+   '\\rm{g}/\\rm{cm}^{3}'
+   >>> pf.field_info['temp']._units
+   '\\rm{K}'
+   >>> pf.field_info['velx']._units
+   '\\rm{cm}/\\rm{s}'
+
+Thus if you were using any of these fields as input to your derived field, you 
+wouldn't have to worry about unit conversion because they're already in CGS.
+
+Some More Complicated Examples
+------------------------------
+
+But what if we want to do some more fancy stuff?  Here's an example of getting
+parameters from the data object and using those to define the field;
+specifically, here we obtain the ``center`` and ``height_vector`` parameters
+and use those to define an angle of declination of a point with respect to a
+disk.
+
+.. code-block:: python
+
+   def _DiskAngle(field, data):
+       # We make both r_vec and h_vec into unit vectors
+       center = data.get_field_parameter("center")
+       r_vec = np.array([data["x"] - center[0],
+                         data["y"] - center[1],
+                         data["z"] - center[2]])
+       r_vec = r_vec/np.sqrt((r_vec**2.0).sum(axis=0))
+       h_vec = np.array(data.get_field_parameter("height_vector"))
+       dp = r_vec[0,:] * h_vec[0] \
+          + r_vec[1,:] * h_vec[1] \
+          + r_vec[2,:] * h_vec[2]
+       return np.arccos(dp)
+   add_field("DiskAngle", take_log=False,
+             validators=[ValidateParameter("height_vector"),
+                         ValidateParameter("center")],
+             display_field=False)
+
+Note that we have added a few parameters below the main function; we specify
+that we do not wish to display this field as logged, that we require both
+``height_vector`` and ``center`` to be present in a given data object we wish
+to calculate this for, and we say that it should not be displayed in a
+drop-down box of fields to display.  This is done through the parameter
+*validators*, which accepts a list of :class:`FieldValidator` objects.  These
+objects define the way in which the field is generated, and when it is able to
+be created.  In this case, we mandate that parameters *center* and
+*height_vector* are set before creating the field.  These are set via 
+:meth:`~yt.data_objects.data_containers.set_field_parameter`, which can 
+be called on any object that has fields.
+
+We can also define vector fields.
+
+.. code-block:: python
+
+   def _SpecificAngularMomentum(field, data):
+       if data.has_field_parameter("bulk_velocity"):
+           bv = data.get_field_parameter("bulk_velocity")
+       else: bv = np.zeros(3, dtype='float64')
+       xv = data["x-velocity"] - bv[0]
+       yv = data["y-velocity"] - bv[1]
+       zv = data["z-velocity"] - bv[2]
+       center = data.get_field_parameter('center')
+       coords = np.array([data['x'],data['y'],data['z']], dtype='float64')
+       new_shape = tuple([3] + [1]*(len(coords.shape)-1))
+       r_vec = coords - np.reshape(center,new_shape)
+       v_vec = np.array([xv,yv,zv], dtype='float64')
+       return np.cross(r_vec, v_vec, axis=0)
+   def _convertSpecificAngularMomentum(data):
+       return data.convert("cm")
+   add_field("SpecificAngularMomentum",
+             convert_function=_convertSpecificAngularMomentum, vector_field=True,
+             units=r"\rm{cm}^2/\rm{s}", validators=[ValidateParameter('center')])
+
+Here we define the SpecificAngularMomentum field, optionally taking a
+``bulk_velocity``, and returning a vector field that needs conversion by the
+function ``_convertSpecificAngularMomentum``.
+
+It is also possible to define fields that depend on spatial derivatives of 
+other fields.  Calculating the derivative for a single grid cell requires 
+information about neighboring grid cells.  Therefore, properly calculating 
+a derivative for a cell on the edge of the grid will require cell values from 
+neighboring grids.  Below is an example of a field that is the divergence of the 
+velocity.
+
+.. code-block:: python
+
+    def _DivV(field, data):
+        # We need to set up stencils
+        if data.pf["HydroMethod"] == 2:
+            sl_left = slice(None,-2,None)
+            sl_right = slice(1,-1,None)
+            div_fac = 1.0
+        else:
+            sl_left = slice(None,-2,None)
+            sl_right = slice(2,None,None)
+            div_fac = 2.0
+        ds = div_fac * data['dx'].flat[0]
+        f  = data["x-velocity"][sl_right,1:-1,1:-1]/ds
+        f -= data["x-velocity"][sl_left ,1:-1,1:-1]/ds
+        if data.pf.dimensionality > 1:
+            ds = div_fac * data['dy'].flat[0]
+            f += data["y-velocity"][1:-1,sl_right,1:-1]/ds
+            f -= data["y-velocity"][1:-1,sl_left ,1:-1]/ds
+        if data.pf.dimensionality > 2:
+            ds = div_fac * data['dz'].flat[0]
+            f += data["z-velocity"][1:-1,1:-1,sl_right]/ds
+            f -= data["z-velocity"][1:-1,1:-1,sl_left ]/ds
+        new_field = np.zeros(data["x-velocity"].shape, dtype='float64')
+        new_field[1:-1,1:-1,1:-1] = f
+        return new_field
+    def _convertDivV(data):
+        return data.convert("cm")**-1.0
+    add_field("DivV", function=_DivV,
+               validators=[ValidateSpatial(ghost_zones=1,
+	                   fields=["x-velocity","y-velocity","z-velocity"])],
+              units=r"\rm{s}^{-1}", take_log=False,
+              convert_function=_convertDivV)
+
+Note that *slice* is simply a native Python object used for taking slices of 
+arrays or lists.  Another :class:`FieldValidator` object, ``ValidateSpatial`` 
+is given in the list of *validators* in the call to ``add_field`` with 
+*ghost_zones* = 1, specifying that the original grid be padded with one additional 
+cell from the neighboring grids on all sides.  The *fields* keyword simply 
+mandates that the listed fields be present.  With one ghost zone added to all sides 
+of the grid, the data fields (data["x-velocity"], data["y-velocity"], and 
+data["z-velocity"]) will have a shape of (NX+2, NY+2, NZ+2) inside of this function, 
+where the original grid has dimension (NX, NY, NZ).  However, when the final field 
+data is returned, the ghost zones will be removed and the shape will again be 
+(NX, NY, NZ).
+
+.. _derived-field-options:
+
+Saving Derived Fields
+---------------------
+
+Complex fields can be time-consuming to generate, especially on large
+datasets.  To mitigate this, yt provides a mechanism for saving fields to a
+backup file using the Grid Data Format.  The next time you start yt, it will
+check this file, and your field will be treated as native if present.
+
+The code below creates a new derived field called "Entr" and saves it to disk:
+
+.. code-block:: python
+
+    from yt.mods import *
+    from yt.utilities.grid_data_format import writer
+
+    def _Entropy(field, data) :
+        return data["Temperature"]*data["Density"]**(-2./3.)
+    add_field("Entr", function=_Entropy)
+
+    pf = load('GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100')
+    writer.save_field(pf, "Entr")
+
+This creates a "_backup.gdf" file next to your datadump. If you load up the dataset again:
+
+.. code-block:: python
+
+    from yt.mods import *
+
+    pf = load('GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100')
+    data = pf.h.all_data()
+    print data["Entr"]
+
+you can work with the field exactly as before, without having to recompute it.
+
+Field Options
+-------------
+
+The arguments to :func:`add_field` are passed on to the constructor of
+:class:`DerivedField`.  If it can, however, :func:`add_field` takes care of
+finding the ``function`` and ``convert_function`` arguments itself.  There
+are a number of options available, but the only mandatory ones are ``name``
+and possibly ``function``.
+
+   ``name``
+     This is the name of the field -- how you refer to it.  For instance,
+     ``Pressure`` or ``H2I_Fraction``.
+   ``function``
+     This is a function handle that defines the field.
+   ``convert_function``
+     This is the function that converts the field to CGS.  All inputs to this
+     function are mandated to already *be* in CGS.
+   ``units``
+     This is a mathtext (LaTeX-like) string that describes the units.
+   ``projected_units``
+     This is a mathtext (LaTeX-like) string that describes the units if the
+     field has been projected without a weighting.
+   ``display_name``
+     This is a name used in the plots, for instance ``"Divergence of
+     Velocity"``.  If not supplied, the ``name`` value is used.
+   ``take_log``
+     This is *True* or *False* and describes whether the field should be logged
+     when plotted.
+   ``particle_type``
+     Is this field a *particle* field?
+   ``validators``
+     (*Advanced*) This is a list of :class:`FieldValidator` objects, for instance to mandate
+     spatial data.
+   ``vector_field``
+     (*Advanced*) Is this field more than one value per cell?
+   ``display_field``
+     (*Advanced*) Should this field appear in the dropdown box in Reason?
+   ``not_in_all``
+     (*Advanced*) If this is *True*, the field may not be in all the grids.
+   ``projection_conversion``
+     (*Advanced*) Which unit should we multiply by in a projection?
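+
+A hedged example combining several of these options (``H2I_Density`` is
+assumed to exist, as it does for Enzo runs with chemistry enabled):
+
+.. code-block:: python
+
+   def _H2Fraction(field, data):
+       return data["H2I_Density"] / data["Density"]
+   add_field("H2Fraction", function=_H2Fraction,
+             display_name=r"\rm{H}_{2}\ \rm{Fraction}",
+             take_log=True, units=r"")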
+
+How Do Units Work?
+------------------
+
+Everything is done under the assumption that all of the native Enzo fields that
+yt knows about are converted to cgs before being handed to any processing
+routines.
+
+Which Enzo Fields Does yt Know About?
+-------------------------------------
+
+* Density
+* Temperature
+* Gas Energy
+* Total Energy
+* [xyz]-velocity
+* Species fields: HI, HII, Electron, HeI, HeII, HeIII, HM, H2I, H2II, DI, DII, HDI
+* Particle mass, velocity, 
+

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/bootcamp/Introduction.ipynb
--- a/source/bootcamp/Introduction.ipynb
+++ b/source/bootcamp/Introduction.ipynb
@@ -15,52 +15,26 @@
       "\n",
       "In this brief tutorial, we'll go over how to load up data, analyze things, inspect your data, and make some visualizations.\n",
       "\n",
-      "But, before we begin, there are a few places to go if you run into trouble.\n",
-      "\n",
-      "**The yt homepage is at http://yt-project.org/**\n",
-      "\n",
-      "## Source of Help\n",
-      "\n",
-      "There are three places to check for help:\n",
-      "\n",
-      " * The documentation: http://yt-project.org/doc/\n",
-      " * The IRC Channel (`#yt` on `chat.freenode.net`, also at http://yt-project.org/irc.html)\n",
-      " * The `yt-users` mailing list, at http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org\n",
-      "\n",
-      "## Sources of Information\n",
-      "\n",
-      "The first place to go for information about any kind of development is BitBucket at https://bitbucket.org/yt_analysis/yt/ , which contains a bug tracker, the source code, and links to other useful places.\n",
-      "\n",
-      "You can find recipes in the documentation ( http://yt-project.org/doc/ ) under the \"Cookbook\" section.\n",
-      "\n",
-      "There is a portal with access to data and IPython notebooks at http://hub.yt-project.org/ .\n",
-      "\n",
-      "## How to Update yt\n",
-      "\n",
-      "If you ever run into a situation where you need to update your yt installation, simply type this on the command line:\n",
-      "\n",
-      "`yt update`\n",
-      "\n",
-      "This will automatically update it for you.\n",
+      "Our documentation page can provide information on a variety of the commands that are used here, both in narrative documentation as well as recipes for specific functionality in our cookbook.  The documentation exists at http://yt-project.org/doc/.  If you encounter problems, look for help here: http://yt-project.org/doc/help/index.html.\n",
       "\n",
       "## Acquiring the datasets for this tutorial\n",
       "\n",
-      "To access the datasets that are used in these bootcamp tutorials, you can either download them manually at http://yt-project.org/data/.\n",
+      "If you are executing these tutorials interactively, you need some sample datasets on which to run the code.  You can download these datasets at http://yt-project.org/data/.  The datasets necessary for each lesson are noted next to the corresponding tutorial.\n",
       "\n",
       "## What's Next?\n",
       "\n",
       "The Notebooks are meant to be explored in this order:\n",
       "\n",
       "1. Introduction\n",
-      "2. Data Inspection\n",
-      "3. Simple Visualization\n",
-      "4. Data Objects and Time Series\n",
-      "5. Derived Fields and Profiles\n",
-      "6. Volume Rendering"
+      "2. Data Inspection (IsolatedGalaxy dataset)\n",
+      "3. Simple Visualization (enzo_tiny_cosmology & Enzo_64 datasets)\n",
+      "4. Data Objects and Time Series (IsolatedGalaxy dataset)\n",
+      "5. Derived Fields and Profiles (IsolatedGalaxy dataset)\n",
+      "6. Volume Rendering (IsolatedGalaxy dataset)"
      ]
     }
    ],
    "metadata": {}
   }
  ]
-}
\ No newline at end of file
+}

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -152,7 +152,7 @@
 # Add any paths that contain custom static files (such as style sheets) here,
 # relative to this directory. They are copied after the builtin static files,
 # so a file named "default.css" will overwrite the builtin "default.css".
-html_static_path = ['_static', 'advanced/_static']
+html_static_path = ['_static', 'analyzing/_static']
 
 # If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
 # using the given strftime format.
@@ -250,7 +250,7 @@
 
 # Example configuration for intersphinx: refer to the Python standard library.
 intersphinx_mapping = {'http://docs.python.org/': None,
-                       'http://ipython.org/ipython-doc/rel-1.10/html/': None,
+                       'http://ipython.org/ipython-doc/rel-1.10/': None,
                        'http://docs.scipy.org/doc/numpy/': None,
                        'http://matplotlib.sourceforge.net/': None,
                        }

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -39,6 +39,7 @@
 Example Notebooks
 -----------------
 .. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
 
    notebook_tutorial
+   ../analyzing/analysis_modules/sunyaev_zeldovich

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/developing/creating_derived_fields.rst
--- a/source/developing/creating_derived_fields.rst
+++ /dev/null
@@ -1,313 +0,0 @@
-.. _creating-derived-fields:
-
-Creating Derived Fields
-=======================
-
-One of the more powerful means of extending yt is through the usage of derived
-fields.  These are fields that describe a value at each cell in a simulation.
-
-Defining a New Field
---------------------
-
-So once a new field has been conceived of, the best way to create it is to
-construct a function that performs an array operation -- operating on a 
-collection of data, neutral to its size, shape, and type.  (All fields should
-be provided as 64-bit floats.)
-
-A simple example of this is the pressure field, which demonstrates the ease of
-this approach.
-
-.. code-block:: python
-
-   def _Pressure(field, data):
-       return (data.pf["Gamma"] - 1.0) * \
-              data["Density"] * data["ThermalEnergy"]
-
-Note that we do a couple different things here.  We access the "Gamma"
-parameter from the parameter file, we access the "Density" field and we access
-the "ThermalEnergy" field.  "ThermalEnergy" is, in fact, another derived field!
-("ThermalEnergy" deals with the distinction in storage of energy between dual
-energy formalism and non-DEF.)  We don't do any loops, we don't do any
-type-checking, we can simply multiply the three items together.
-
-Once we've defined our function, we need to notify yt that the field is
-available.  The :func:`add_field` function is the means of doing this; it has a
-number of fairly specific parameters that can be passed in, but here we'll only
-look at the most basic ones needed for a simple scalar baryon field.
-
-.. code-block:: python
-
-   add_field("Pressure", function=_Pressure, units=r"\rm{dyne}/\rm{cm}^{2}")
-
-We feed it the name of the field, the name of the function, and the
-units.  Note that the units parameter is a "raw" string, with some
-LaTeX-style formatting -- Matplotlib actually has a MathText rendering
-engine, so if you include LaTeX it will be rendered appropriately.
-
-.. One very important thing to note about the call to ``add_field`` is
-.. that it **does not** need to specify the function name **if** the
-.. function is the name of the field prefixed with an underscore.  If it
-.. is not -- and it won't be for fields in different units (such as
-.. "CellMassMsun") -- then you need to specify it with the argument
-.. ``function``.
-
-We suggest that you name the function that creates a derived field
-with the intended field name prefixed by a single underscore, as in
-the ``_Pressure`` example above.
-
-If you find yourself using the same custom-defined fields over and over, you
-should put them in your plugins file as described in :ref:`plugin-file`.
-
-Conversion Factors
-~~~~~~~~~~~~~~~~~~
-
-When creating a derived field, yt does not by default do unit
-conversion.  All of the fields fed into the field are pre-supposed to
-be in CGS.  If the field does not need any constants applied after
-that, you are done. If it does, you should define a second function
-that applies the proper multiple in order to return the desired units
-and use the argument ``convert_function`` to ``add_field`` to point to
-it.  
-
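-As a minimal sketch (the ``CellVolumeCGS`` name and both function names here
-are purely illustrative), a cell-volume field built from the code-unit ``dx``,
-``dy``, and ``dz`` fields might be converted to CGS like so:
-
-.. code-block:: python
-
-   def _CellVolume(field, data):
-       # dx, dy, dz are stored in code units, so this is a code-unit volume
-       return data["dx"] * data["dy"] * data["dz"]
-
-   def _convertCellVolume(data):
-       # one code length in centimeters, cubed
-       return data.convert("cm")**3.0
-
-   add_field("CellVolumeCGS", function=_CellVolume,
-             convert_function=_convertCellVolume,
-             units=r"\rm{cm}^{3}")
-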
-The argument that you pass to ``convert_function`` will be dependent on 
-what fields are input into your derived field, and in what form they
-are passed from their native format.  For enzo fields, nearly all the
-native on-disk fields are in CGS units already (except for ``dx``, ``dy``,
-and ``dz`` fields), so you typically only need to convert for 
-off-standard fields taking into account where those fields are 
-used in the final output derived field.  For other codes, it can vary.
-
-You can check to see the units associated with any field in a dataset
-from any code by using the ``_units`` attribute.  Here is an example 
-with one of our sample FLASH datasets available publicly at 
-http://yt-project.org/data :
-
-.. code-block:: python
-
-   >>> from yt.mods import *
-   >>> pf = load("GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100")
-   >>> pf.h.field_list
-   ['dens', 'temp', 'pres', 'gpot', 'divb', 'velx', 'vely', 'velz', 'magx', 'magy', 'magz', 'magp']
-   >>> pf.field_info['dens']._units
-   '\\rm{g}/\\rm{cm}^{3}'
-   >>> pf.field_info['temp']._units
-   '\\rm{K}'
-   >>> pf.field_info['velx']._units
-   '\\rm{cm}/\\rm{s}'
-
-Thus if you were using any of these fields as input to your derived field, you 
-wouldn't have to worry about unit conversion because they're already in CGS.
-
-Some More Complicated Examples
-------------------------------
-
-But what if we want to do some more fancy stuff?  Here's an example of getting
-parameters from the data object and using those to define the field;
-specifically, here we obtain the ``center`` and ``height_vector`` parameters
-and use those to define an angle of declination of a point with respect to a
-disk.
-
-.. code-block:: python
-
-   def _DiskAngle(field, data):
-       # We make both r_vec and h_vec into unit vectors
-       center = data.get_field_parameter("center")
-       r_vec = np.array([data["x"] - center[0],
-                         data["y"] - center[1],
-                         data["z"] - center[2]])
-       r_vec = r_vec/np.sqrt((r_vec**2.0).sum(axis=0))
-       h_vec = np.array(data.get_field_parameter("height_vector"))
-       dp = r_vec[0,:] * h_vec[0] \
-          + r_vec[1,:] * h_vec[1] \
-          + r_vec[2,:] * h_vec[2]
-       return np.arccos(dp)
-   add_field("DiskAngle", take_log=False,
-             validators=[ValidateParameter("height_vector"),
-                         ValidateParameter("center")],
-             display_field=False)
-
-Note that we have added a few parameters below the main function; we specify
-that we do not wish to display this field as logged, that we require both
-``height_vector`` and ``center`` to be present in a given data object we wish
-to calculate this for, and we say that it should not be displayed in a
-drop-down box of fields to display.  This is done through the parameter
-*validators*, which accepts a list of :class:`FieldValidator` objects.  These
-objects define the way in which the field is generated, and when it is able to
-be created.  In this case, we mandate that parameters *center* and
-*height_vector* are set before creating the field.  These are set via 
-:meth:`~yt.data_objects.data_containers.set_field_parameter`, which can 
-be called on any object that has fields.
-
-We can also define vector fields.
-
-.. code-block:: python
-
-   def _SpecificAngularMomentum(field, data):
-       if data.has_field_parameter("bulk_velocity"):
-           bv = data.get_field_parameter("bulk_velocity")
-       else: bv = np.zeros(3, dtype='float64')
-       xv = data["x-velocity"] - bv[0]
-       yv = data["y-velocity"] - bv[1]
-       zv = data["z-velocity"] - bv[2]
-       center = data.get_field_parameter('center')
-       coords = np.array([data['x'],data['y'],data['z']], dtype='float64')
-       new_shape = tuple([3] + [1]*(len(coords.shape)-1))
-       r_vec = coords - np.reshape(center,new_shape)
-       v_vec = np.array([xv,yv,zv], dtype='float64')
-       return np.cross(r_vec, v_vec, axis=0)
-   def _convertSpecificAngularMomentum(data):
-       return data.convert("cm")
-   add_field("SpecificAngularMomentum",
-             convert_function=_convertSpecificAngularMomentum, vector_field=True,
-             units=r"\rm{cm}^2/\rm{s}", validators=[ValidateParameter('center')])
-
-Here we define the SpecificAngularMomentum field, optionally taking a
-``bulk_velocity``, and returning a vector field that needs conversion by the
-function ``_convertSpecificAngularMomentum``.
-
-It is also possible to define fields that depend on spatial derivatives of 
-other fields.  Calculating the derivative for a single grid cell requires 
-information about neighboring grid cells.  Therefore, properly calculating 
-a derivative for a cell on the edge of the grid will require cell values from 
-neighboring grids.  Below is an example of a field that is the divergence of the 
-velocity.
-
-.. code-block:: python
-
-    def _DivV(field, data):
-        # We need to set up stencils
-        if data.pf["HydroMethod"] == 2:
-            sl_left = slice(None,-2,None)
-            sl_right = slice(1,-1,None)
-            div_fac = 1.0
-        else:
-            sl_left = slice(None,-2,None)
-            sl_right = slice(2,None,None)
-            div_fac = 2.0
-        ds = div_fac * data['dx'].flat[0]
-        f  = data["x-velocity"][sl_right,1:-1,1:-1]/ds
-        f -= data["x-velocity"][sl_left ,1:-1,1:-1]/ds
-        if data.pf.dimensionality > 1:
-            ds = div_fac * data['dy'].flat[0]
-            f += data["y-velocity"][1:-1,sl_right,1:-1]/ds
-            f -= data["y-velocity"][1:-1,sl_left ,1:-1]/ds
-        if data.pf.dimensionality > 2:
-            ds = div_fac * data['dz'].flat[0]
-            f += data["z-velocity"][1:-1,1:-1,sl_right]/ds
-            f -= data["z-velocity"][1:-1,1:-1,sl_left ]/ds
-        new_field = np.zeros(data["x-velocity"].shape, dtype='float64')
-        new_field[1:-1,1:-1,1:-1] = f
-        return new_field
-    def _convertDivV(data):
-        return data.convert("cm")**-1.0
-    add_field("DivV", function=_DivV,
-               validators=[ValidateSpatial(ghost_zones=1,
-	                   fields=["x-velocity","y-velocity","z-velocity"])],
-              units=r"\rm{s}^{-1}", take_log=False,
-              convert_function=_convertDivV)
-
-Note that *slice* is simply a native Python object used for taking slices of 
-arrays or lists.  Another :class:`FieldValidator` object, ``ValidateSpatial`` 
-is given in the list of *validators* in the call to ``add_field`` with 
-*ghost_zones* = 1, specifying that the original grid be padded with one additional 
-cell from the neighboring grids on all sides.  The *fields* keyword simply 
-mandates that the listed fields be present.  With one ghost zone added to all sides 
-of the grid, the data fields (data["x-velocity"], data["y-velocity"], and 
-data["z-velocity"]) will have a shape of (NX+2, NY+2, NZ+2) inside of this function, 
-where the original grid has dimension (NX, NY, NZ).  However, when the final field 
-data is returned, the ghost zones will be removed and the shape will again be 
-(NX, NY, NZ).
-
-.. _derived-field-options:
-
-Saving Derived Fields
----------------------
-
-Complex fields can be time-consuming to generate, especially on large datasets. To mitigate this, yt provides a mechanism for saving fields to a backup file using the Grid Data Format. The next time you start yt, it will check this file and your field will be treated as native if present. 
-
-The code below creates a new derived field called "Entr" and saves it to disk:
-
-.. code-block:: python
-
-    from yt.mods import *
-    from yt.utilities.grid_data_format import writer
-
-    def _Entropy(field, data) :
-        return data["Temperature"]*data["Density"]**(-2./3.)
-    add_field("Entr", function=_Entropy)
-
-    pf = load('GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100')
-    writer.save_field(pf, "Entr")
-
-This creates a "_backup.gdf" file next to your datadump. If you load up the dataset again:
-
-.. code-block:: python
-
-    from yt.mods import *
-
-    pf = load('GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0100')
-    data = pf.h.all_data()
-    print data["Entr"]
-
-you can work with the field exactly as before, without having to recompute it.
-
-Field Options
--------------
-
-The arguments to :func:`add_field` are passed on to the constructor of
-:class:`DerivedField`.  :func:`add_field` takes care of finding the arguments
-`function` and `convert_function` if it can, however.  There are a number of
-options available, but the only mandatory ones are ``name`` and possibly
-``function``.
-
-   ``name``
-     This is the name of the field -- how you refer to it.  For instance,
-     ``Pressure`` or ``H2I_Fraction``.
-   ``function``
-     This is a function handle that defines the field
-   ``convert_function``
-     This is the function that converts the field to CGS.  All inputs to this
-     function are mandated to already *be* in CGS.
-   ``units``
-     This is a mathtext (LaTeX-like) string that describes the units.
-   ``projected_units``
-     This is a mathtext (LaTeX-like) string that describes the units if the
-     field has been projected without a weighting.
-   ``display_name``
-     This is a name used in the plots, for instance ``"Divergence of
-     Velocity"``.  If not supplied, the ``name`` value is used.
-   ``take_log``
-     This is *True* or *False* and describes whether the field should be logged
-     when plotted.
-   ``particle_type``
-     Is this field a *particle* field?
-   ``validators``
-     (*Advanced*) This is a list of :class:`FieldValidator` objects, for instance to mandate
-     spatial data.
-   ``vector_field``
-     (*Advanced*) Is this field more than one value per cell?
-   ``display_field``
-     (*Advanced*) Should this field appear in the dropdown box in Reason?
-   ``not_in_all``
-     (*Advanced*) If this is *True*, the field may not be in all the grids.
-   ``projection_conversion``
-     (*Advanced*) Which unit should we multiply by in a projection?
-
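-As an illustrative sketch using only the options documented above, a fuller
-call might look like:
-
-.. code-block:: python
-
-   add_field("Pressure", function=_Pressure, take_log=True,
-             display_name="Gas Pressure",
-             units=r"\rm{dyne}/\rm{cm}^{2}")
-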
-How Do Units Work?
-------------------
-
-Everything is done under the assumption that all of the native Enzo fields that
-yt knows about are converted to cgs before being handed to any processing
-routines.
-
-Which Enzo Fields Does yt Know About?
--------------------------------------
-
-* Density
-* Temperature
-* Gas Energy
-* Total Energy
-* [xyz]-velocity
-* Species fields: HI, HII, Electron, HeI, HeII, HeIII, HM, H2I, H2II, DI, DII, HDI
-* Particle mass, velocity, 
-

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/developing/developing.rst
--- a/source/developing/developing.rst
+++ b/source/developing/developing.rst
@@ -66,11 +66,13 @@
 Licensing
 +++++++++
 
-All contributed code must be GPL-compatible; we ask that you consider licensing
-under the GPL version 3, but we will consider submissions of code that are
-BSD-like licensed as well.  If you'd rather not license in this manner, but
-still want to contribute, just drop me a line and I'll put a link on the main
-wiki page to wherever you like!
+yt has, with the 2.6 release, been `relicensed
+<http://blog.yt-project.org/post/Relicensing.html>`_ under the BSD 3-clause
+license.  Previously versions were released under the GPLv3.
+
+All contributed code must be BSD-compatible.  If you'd rather not license in
+this manner, but still want to contribute, please consider creating an external
+package, which we'll happily link to.
 
 Requirements for Code Submission
 ++++++++++++++++++++++++++++++++

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/developing/index.rst
--- a/source/developing/index.rst
+++ b/source/developing/index.rst
@@ -21,6 +21,5 @@
    testing
    debugdrive
    creating_datatypes
-   creating_derived_fields
    creating_derived_quantities
    creating_frontend

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/examining/supported_frontends_data.rst
--- a/source/examining/supported_frontends_data.rst
+++ b/source/examining/supported_frontends_data.rst
@@ -4,7 +4,8 @@
 ============
 
 This section contains information on how to load data into ``yt`` from
-supported codes, as well as some important caveats about different data formats.
+supported codes, as well as some important caveats about different
+data formats.
 
 .. _loading-enzo-data:
 
@@ -124,6 +125,7 @@
   positions will not be.
 * Domains may be visualized assuming periodicity.
 
+<<<<<<< local
 .. _loading-ramses-data:
 
 RAMSES Data
@@ -234,6 +236,9 @@
 
 Athena Data
 ----------
+=======
+.. loading-amr-data:
+>>>>>>> other
 
 Athena 4.x VTK data is *mostly* supported and cared for by John
 ZuHone. Both uniform grid and SMR datasets are supported. 

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/faq/index.rst
--- a/source/faq/index.rst
+++ b/source/faq/index.rst
@@ -174,6 +174,8 @@
 
   $ python setup.py install
 
+.. _plugin-file:
+
 What is the "Plugin File"?
 --------------------------
 

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/help/index.rst
--- a/source/help/index.rst
+++ b/source/help/index.rst
@@ -1,16 +1,40 @@
 .. _asking-for-help:
 
-How to Get Help
-===============
+What to do if you run into problems
+===================================
 
-If you run into problems with ``yt``, you should feel **encouraged** to ask for
-help -- whether this comes in the form of reporting a bug or emailing the
-mailing list.  If something doesn't work for you, it's in everyone's best
-interests to make sure that it gets fixed.
+If you run into problems with ``yt``, there are a number of steps to follow
+to come to a solution.  The first handful of options are things you can do 
+on your own, but if those don't yield results, we have provided a number of 
+ways to connect with our community of users and developers to solve the 
+problem together.
+
+To summarize, here are the steps in order:
+
 #. Don't panic and don't give up
+ #. Update to the latest version
+ #. Search the yt documentation and mailing list archives
+ #. Look at the yt source
+ #. Isolate & document your problem 
+ #. Go on IRC and ask a question
+ #. Ask the mailing list
+ #. Submit a bug report
+
+.. _dont-panic:
+
+Don't panic and don't give up
+-----------------------------
+
+This may seem silly, but it's effective.  While yt is a robust code with
+lots of functionality, like all actively-developed codes it sometimes has
+bugs.  Chances are good that your problem has a quick fix, either because
+someone encountered it before and fixed it, because the documentation is
+out of date, or because there is some other simple solution.  Don't give
+up!  We want to help you succeed!
 
 .. _update-the-code:
 
-Try Updating yt
+Try updating yt
 ---------------
 
 Sometimes the pace of development is pretty fast on yt, particularly in the
@@ -31,8 +55,8 @@
 
 .. _search-the-documentation:
 
-Search the Documentation
-------------------------
+Search the documentation and mailing lists
+------------------------------------------
 
 The documentation has a lot of the answers to everyday problems.  This doesn't 
 mean you have to read all of the docs top-to-bottom, but you should at least 
@@ -40,11 +64,6 @@
 on the search field to the right of this window and enter your text.  Another 
 good place to look for answers in the documentation is our :ref:`faq` page.
 
-.. _mailing-list:
-
-Search/Ask the Mailing List
----------------------------
-
 OK, so there was no obvious solution to your problem in the documentation.  
 It is possible that someone else experienced the problem before you did, and
 wrote to the mailing list about it.  You can easily check the mailing list 
@@ -63,7 +82,82 @@
    </form><script type="text/javascript" src="http://www.google.com/cse/brand?form=cse-search-box&lang=en"></script>
 
-If you didn't find any hint of a solution in the archive, then feel free to 
+.. _look-at-the-source:
+
+Look at the source code
+-----------------------
+
+We've done our best to make the source clean, and it is easily searchable from 
+your computer.  Go inside your yt install directory by going to the 
+``$YT_DEST/src/yt-hg/yt`` directory where all the code lives.  You can then search 
+for the class, function, or keyword which is giving you problems with 
+``grep -r *``, which will recursively search throughout the code base.  (For a 
+much faster and cleaner experience, we recommend ``grin`` instead of 
+``grep -r *``.  To install ``grin`` with python, just type ``pip install 
+grin``.)  
+
+So let's say that pesky ``SlicePlot`` is giving you problems still, and you 
+want to look at the source to figure out what is going on.
+
+.. code-block:: bash
+
+  $ cd $YT_DEST/src/yt-hg/yt
+  $ grep -r SlicePlot *         (or $ grin SlicePlot)
+  
+   data_objects/analyzer_objects.py:class SlicePlotDataset(AnalysisTask):
+   data_objects/analyzer_objects.py:        from yt.visualization.api import SlicePlot
+   data_objects/analyzer_objects.py:        self.SlicePlot = SlicePlot
+   data_objects/analyzer_objects.py:        slc = self.SlicePlot(pf, self.axis, self.field, center = self.center)
+   ...
+
+You can now follow up on this and open the files that contain references to
+``SlicePlot`` (particularly the one that defines it) and inspect their
+contents for problems or clarification.
+
+.. _isolate_and_document:
+
+Isolate and document your problem
+---------------------------------
+
+As you gear up to take your question to the rest of the community, try to
+distill your problem down to the smallest number of steps needed to reproduce
+it in a script.  This can help you (and us) identify the basic problem.  Follow
+these steps:
+
+ * Identify what it is that went wrong, and how you knew it went wrong.
+ * Put your script, errors, and outputs online:
+
+   * ``$ yt pastebin script.py`` - pastes script.py online
+   * ``$ python script.py --paste`` - pastes errors online
+   * ``$ yt upload_image image.png`` - pastes image online
+
 * Identify which version of the code you're using.
+
+   * ``$ yt instinfo`` - provides version information, including changeset hash
+
+It may be that through the mere process of doing this, you end up solving 
+the problem!
+
+.. _irc:
+
+IRC
+---
+
+If you want a fast, interactive experience, you could try jumping into our IRC 
+channel to get your questions answered in a chatroom style environment.  You 
+don't even need to have any special IRC client in order to join.  We are the
+#yt channel on irc.freenode.net, but you can also connect using your web 
+browser by going to http://yt-project.org/irc.html .  There are usually 2-8 
+members of the user base and development team online, so you'll probably get 
+your answers quickly.  Remember to bring the information from the 
+:ref:`last step <isolate_and_document>`.
+
+.. _mailing-list:
+
+Ask the mailing list
+--------------------
+
+If you still haven't found a solution, feel free to 
 write to the mailing list regarding your problems.  There are two mailing lists,
 `yt-users <http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_ and
 `yt-dev <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_.  The
@@ -71,71 +165,43 @@
 the latter has more chatter about the way the code is developed and discussions
 of changes and feature improvements.
 
-If you email ``yt-users`` asking for help, there are several things you must
-provide, or else we won't be able to do much:
-
-#. What it is that went wrong, and how you knew it went wrong.
-#. A traceback if appropriate -- see :ref:`error-reporting` for some help with
-   that.
-#. If possible, the smallest number of steps that can reproduce the problem. 
-   If you're demonstrating the bug with code, you may find the :ref:`pastebin` 
-   useful.If you've got an image output that demonstrates your problem, you may 
-   find the :ref:`upload-image` function useful.
-#. Which version of the code you are using (i.e. the output of ``yt instinfo``).
+If you email ``yt-users`` asking for help, remember to include the information
+about your problem you identified in :ref:`this step <isolate_and_document>`.
 
 When you email the list, providing this information can help the developers
 understand what you did, how it went wrong, and any potential fixes or similar
 problems they have seen in the past.  Without this context, it can be very
 difficult to help out!
 
-.. _irc:
-
-IRC
----
-
-If you want a more interactive experience, you could try jumping into our IRC 
-channel to get your questions answered in a chatroom style environment.  You 
-don't even need to have any special IRC client in order to join.  We are the
-#yt channel on irc.freenode.net, but you can also connect using your web 
-browser by going to http://yt-project.org/irc.html .  There are usually 2-8 members of the user base and development team online, so you'll probably get your
-answers quickly.
-
 .. _reporting-a-bug:
 
-How To Report A Bug
+How to report a bug
 -------------------
 
 If you have gone through all of the above steps, and you're still encountering 
-problems, then you have found a bug.  The first step, when reporting a bug, 
-is to identify the smallest piece of code that reproduces the bug.
+problems, then you have found a bug.  
 To submit a bug report, you can either directly create one through the
 BitBucket `web interface <http://hg.yt-project.org/yt/issues/new>`_,
 or you can use the command line ``yt bugreport`` to interactively create one.
 Alternatively, email the ``yt-users`` mailing list and we will construct a new
-ticket in your stead.
+ticket in your stead.  Remember to include the information
+about your problem you identified in :ref:`this step <isolate_and_document>`.
 
 
 Installation Issues
 -------------------
 
-If you are having installation issues, you should *definitely* email the
-``yt-users`` email list.  You should provide information about the host, the
-version of the code you are using, and the output of ``yt_install.log`` from
-your installation.  We are very interested in making sure that ``yt`` installs
-everywhere!
-
-Vanilla Usage Issues
---------------------
-
-If you're running ``yt`` without having made any modifications to the code
-base, please provide as much of your script as you are able to.  Submitting
-both the script and the traceback to the pastebin (as described in :ref:`pastebin`)
-is usually sufficient to reproduce the error.
+If you are having installation issues and nothing from the 
+:ref:`installing page <getting-and-installing-yt>` seems to work, you should 
+*definitely* email the ``yt-users`` email list.  You should provide information 
+about the host, the version of the code you are using, and the output of 
+``yt_install.log`` from your installation.  We are very interested in making 
+sure that ``yt`` installs everywhere!
 
 Customization and Scripting Issues
 ----------------------------------
 
 If you have customized ``yt`` in some way, or created your own plugins file (as
-described in :ref:`plugin-file`) then it may be necessary to supply both your
-patches to the source and the plugin file, if you are utilizing something
-defined in that file.
+described in :ref:`plugin-file`) then it may be necessary to supply your
+patches to the source, the plugin file, and perhaps even the datafile on which
+you're running.

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -1,5 +1,5 @@
-What is yt?
-===========
+yt Documentation
+================
 
 yt is a community-developed analysis and visualization toolkit for
 examining datasets in a variety of scientific disciplines.  yt is developed 
@@ -17,8 +17,8 @@
 <http://plutocode.ph.unito.it/>`_.  (Development of additional codes, including
 particle codes and octree codes, is taking place in yt 3.0.)
 
-Documentation
-=============
+Table of Contents
+-----------------
 
 .. raw:: html
 
@@ -114,6 +114,16 @@
          <p class="linkdescr">What to do if you run into problems</p></td></tr>
+     <tr valign="top">
+       <td width="25%">
+         <p>
+           <a href="help/faq.html">FAQ</a>
+         </p>
+       </td>
+       <td width="75%">
+         <p class="linkdescr">Frequently Asked Questions</p>
+       </td>
+     </tr></table>
 
@@ -128,4 +138,5 @@
    examining/index
    developing/index
    reference/index
-   help/index
+   Getting Help <help/index>
+   FAQ <faq/index>

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/installing.rst
--- a/source/installing.rst
+++ b/source/installing.rst
@@ -1,25 +1,43 @@
+.. _getting-and-installing-yt:
+
 Getting and Installing yt
 =========================
 
+.. _getting-yt:
+
+Getting yt
+----------
+
 yt is a Python package (with some components written in C), using NumPy as a
 computation engine, Matplotlib for some visualization tasks and Mercurial for
 version control.  Because installation of all of these interlocking parts can 
 be time-consuming, yt provides an installation script which downloads and builds
 a fully-isolated Python + Numpy + Matplotlib + HDF5 + Mercurial installation.  
 yt supports Linux and OSX deployment, with the possibility of deployment on 
-other Unix-like systems (XSEDE resources, clusters, etc.).  Windows is not
+other Unix-like systems (XSEDE resources, clusters, etc.).  Windows is not 
 supported.
 
+Since the install is fully-isolated, if you get tired of having yt on your
+system, you can simply delete its directory, and yt and all of its
+dependencies will be removed with it, leaving no files scattered throughout
+your system.
+
 To get the installation script, download it from:
 
 .. code-block:: bash
 
   http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
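+
+For instance, assuming ``wget`` is available on your system:
+
+.. code-block:: bash
+
+  $ wget http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh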
 
-By default, it will install an array of items, but there are additional packages
-that can be downloaded and installed (e.g. SciPy, enzo, etc.). The script has 
-all of these options at the top of the file. You should be able to open it and 
-edit it without any knowledge of bash syntax.  To execute it, run:
+.. _installing-yt:
+
+Installing yt
+-------------
+
+By default, the bash script will install an array of items, but there are 
+additional packages that can be downloaded and installed (e.g. SciPy, enzo, 
+etc.). The script has all of these options at the top of the file. You should 
+be able to open it and edit it without any knowledge of bash syntax.  
+To execute it, run:
 
 .. code-block:: bash
 
@@ -37,6 +55,8 @@
 potentially figure out what went wrong.  If you have problems, though, do not 
 hesitate to :ref:`contact us <asking-for-help>` for assistance.
 
+.. _activating-yt:
+
 Activating Your Installation
 ----------------------------
 
@@ -79,6 +99,8 @@
 If you choose this installation method, you do not need to run the activation
 script as it is unnecessary.
 
+.. _testing-installation:
+
 Testing Your Installation
 -------------------------
 
@@ -95,3 +117,35 @@
 If you get an error, follow the instructions it gives you to debug the problem.  
 Do not hesitate to :ref:`contact us asking-for-help` so we can help you 
 figure it out.
+
+.. _updating-yt:
+
+Updating yt and its dependencies
+--------------------------------
+
+With many active developers, code development sometimes occurs at a furious 
+pace in yt.  To make sure you're using the latest version of the code, run
+this command at a command-line:
+
+.. code-block:: bash
+
+  $ yt update
+
+Additionally, if you want to make sure you have the latest dependencies 
+associated with yt and update the codebase simultaneously, type this:
+
+.. code-block:: bash
+
+  $ yt update --all
+
+.. _removing-yt:
+
+Removing yt and its dependencies
+--------------------------------
+
+Because yt and its dependencies are installed in an isolated directory when
+you use the script installer, you can easily remove yt and all of its 
+dependencies cleanly.  Simply remove the install directory and its 
+subdirectories and you're done.  If you *really* had problems with the
+code, this is a last line of defense: remove everything and then fully
+:ref:`re-install <installing-yt>` from the install script.
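+
+For example, assuming the install script put everything under ``$YT_DEST``:
+
+.. code-block:: bash
+
+  $ rm -rf $YT_DEST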

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/reference/api/api.rst
--- a/source/reference/api/api.rst
+++ b/source/reference/api/api.rst
@@ -124,16 +124,71 @@
    ~yt.frontends.chombo.data_structures.ChomboHierarchy
    ~yt.frontends.chombo.data_structures.ChomboStaticOutput
 
-RAMSES
+Castro
 ^^^^^^
 
 
 .. autosummary::
    :toctree: generated/
 
-   ~yt.frontends.ramses.data_structures.RAMSESGrid
-   ~yt.frontends.ramses.data_structures.RAMSESHierarchy
-   ~yt.frontends.ramses.data_structures.RAMSESStaticOutput
+   ~yt.frontends.castro.data_structures.CastroGrid
+   ~yt.frontends.castro.data_structures.CastroHierarchy
+   ~yt.frontends.castro.data_structures.CastroStaticOutput
+
+Pluto
+^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.pluto.data_structures.PlutoGrid
+   ~yt.frontends.pluto.data_structures.PlutoHierarchy
+   ~yt.frontends.pluto.data_structures.PlutoStaticOutput
+
+Stream
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.stream.data_structures.StreamGrid
+   ~yt.frontends.stream.data_structures.StreamHierarchy
+   ~yt.frontends.stream.data_structures.StreamStaticOutput
+
+Nyx
+^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.nyx.data_structures.NyxGrid
+   ~yt.frontends.nyx.data_structures.NyxHierarchy
+   ~yt.frontends.nyx.data_structures.NyxStaticOutput
+
+Athena
+^^^^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.athena.data_structures.AthenaGrid
+   ~yt.frontends.athena.data_structures.AthenaHierarchy
+   ~yt.frontends.athena.data_structures.AthenaStaticOutput
+
+GDF
+^^^
+
+
+.. autosummary::
+   :toctree: generated/
+
+   ~yt.frontends.gdf.data_structures.GDFGrid
+   ~yt.frontends.gdf.data_structures.GDFHierarchy
+   ~yt.frontends.gdf.data_structures.GDFStaticOutput
 
 Derived Datatypes
 -----------------

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/visualizing/_cb_docstrings.inc
--- a/source/visualizing/_cb_docstrings.inc
+++ b/source/visualizing/_cb_docstrings.inc
@@ -1,6 +1,4 @@
-
-
-.. function:: arrow(self, pos, code_size, plot_args=None):
+.. function:: annotate_arrow(self, pos, code_size, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ArrowCallback`.)
 
@@ -8,18 +6,47 @@
    *code_size* in code units.  *plot_args* is a dict fed to
    matplotlib with arrow properties.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'Density', width=(10,'kpc'), center='max')
+   slc.annotate_arrow((0.53, 0.53, 0.53), 1/pf['kpc'])
+   slc.save()
 
-.. function:: clumps(self, clumps, plot_args=None):
+-------------
+
+.. function:: annotate_clumps(self, clumps, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ClumpContourCallback`.)
 
    Take a list of *clumps* and plot them as a set of
    contours.
 
+.. python-script::
 
+   from yt.mods import *
+   from yt.analysis_modules.level_sets.api import *
 
-.. function:: contour(self, field, ncont=5, factor=4, take_log=False, clim=None, plot_args=None):
+   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   data_source = pf.h.disk([0.5, 0.5, 0.5], [0., 0., 1.],
+                           8./pf.units['kpc'], 1./pf.units['kpc'])
+
+   c_min = 10**na.floor(na.log10(data_source['Density']).min()  )
+   c_max = 10**na.floor(na.log10(data_source['Density']).max()+1)
+
+   function = 'self.data[\'Density\'].size > 20'
+   master_clump = Clump(data_source, None, 'Density', function=function)
+   find_clumps(master_clump, c_min, c_max, 2.0)
+   leaf_clumps = get_lowest_clumps(master_clump)
+
+   prj = ProjectionPlot(pf, 2, 'Density', center='c', width=(20,'kpc'))
+   prj.annotate_clumps(leaf_clumps)
+   prj.save('clumps')
+
+-------------
+
+.. function:: annotate_contour(self, field, ncont=5, factor=4, take_log=False, clim=None, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ContourCallback`.)
 
@@ -29,18 +56,17 @@
    how it is contoured and *clim* gives the (upper, lower)
    limits for contouring.
 
+.. python-script::
+   
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   s = SlicePlot(pf, "x", ["Density"], center="max")
+   s.annotate_contour("Temperature")
+   s.save()
 
+-------------
 
-.. function:: coord_axes(self, unit=None, coords=False):
-
-   (This is a proxy for :class:`~yt.visualization.plot_modifications.CoordAxesCallback`.)
-
-   Creates x and y axes for a VMPlot. In the future, it will
-   attempt to guess the proper units to use.
-
-
-
-.. function:: cquiver(self, field_x, field_y, factor):
+.. function:: annotate_cquiver(self, field_x, field_y, factor):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.CuttingQuiverCallback`.)
 
@@ -48,9 +74,18 @@
    *field_x* and *field_y*, skipping every *factor*
    datapoint in the discretization.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   s = OffAxisSlicePlot(pf, [1,1,0], ["Density"], center="c")
+   s.annotate_cquiver('CuttingPlaneVelocityX', 'CuttingPlaneVelocityY', 10)
+   s.zoom(1.5)
+   s.save()
 
-.. function:: grids(self, alpha=1.0, min_pix=1, annotate=False, periodic=True):
+-------------
+
+.. function:: annotate_grids(self, alpha=1.0, min_pix=1, annotate=False, periodic=True):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.GridBoundaryCallback`.)
 
@@ -59,18 +94,35 @@
    wide. *annotate* puts the grid id in the corner of the
    grid.  (Not so great in projections...)
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   slc = SlicePlot(pf, 'z', 'Density', width=(10,'kpc'), center='max')
+   slc.annotate_grids()
+   slc.save()
 
-.. function:: hop_circles(self, hop_output, max_number=None, annotate=False, min_size=20, max_size=10000000, font_size=8, print_halo_size=False, print_halo_mass=False, width=None):
+-------------
+
+.. function:: annotate_hop_circles(self, hop_output, max_number=None, annotate=False, min_size=20, max_size=10000000, font_size=8, print_halo_size=False, print_halo_mass=False, width=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.HopCircleCallback`.)
 
    Accepts a :class:`yt.HopList` *hop_output* and plots up
    to *max_number* (None for unlimited) halos as circles.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   halos = HaloFinder(pf)
+   p = ProjectionPlot(pf, "z", "Density")
+   p.annotate_hop_circles(halos)
+   p.save()
 
-.. function:: hop_particles(self, hop_output, max_number, p_size=1.0, min_size=20, alpha=0.2):
+-------------
+
+.. function:: annotate_hop_particles(self, hop_output, max_number, p_size=1.0, min_size=20, alpha=0.2):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.HopParticleCallback`.)
 
@@ -80,34 +132,51 @@
    plotted with *p_size* pixels per particle;  *alpha*
    determines the opacity of each particle.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   halos = HaloFinder(pf)
+   p = ProjectionPlot(pf, "x", "Density", center='m', width=(10, 'Mpc'))
+   p.annotate_hop_particles(halos, max_number=100, p_size=5.0)
+   p.save()
 
-.. function:: image_line(self, p1, p2, data_coords=False, plot_args=None):
+-------------
+
+.. function:: annotate_image_line(self, p1, p2, data_coords=False, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ImageLineCallback`.)
 
-   Plot from *p1* to *p2* (image plane coordinates) with
+   Plot from *p1* to *p2* (normalized image plane coordinates) with
    *plot_args* fed into the plot.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   p.annotate_image_line((0.3, 0.4), (0.8, 0.9), plot_args={'linewidth':5})
+   p.save()
 
-.. function:: axis_label(self, label):
+-------------
 
-   (This is a proxy for :class:`~yt.visualization.plot_modifications.LabelCallback`.)
-
-   This adds a label to the plot.
-
-
-
-.. function:: line(self, x, y, plot_args=None):
+.. function:: annotate_line(self, x, y, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.LinePlotCallback`.)
 
-   Over plot *x* and *y* with *plot_args* fed into the plot.
+   Over plot *x* and *y* (in code units) with *plot_args* fed into the plot.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   p.annotate_line([-6, -4, -2, 0, 2, 4, 6], [3.6, 1.6, 0.4, 0, 0.4, 1.6, 3.6], plot_args={'linewidth':5})
+   p.save()
 
-.. function:: magnetic_field(self, factor=16, scale=None, scale_units=None, normalize=False):
+-------------
+
+.. function:: annotate_magnetic_field(self, factor=16, scale=None, scale_units=None, normalize=False):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.MagFieldCallback`.)
 
@@ -120,19 +189,37 @@
    features to be more clearly seen for fields with
    substantial variation in field strength.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("MHDSloshing/virgo_low_res.0054.vtk",
+             parameters={"TimeUnits":3.1557e13, "LengthUnits":3.0856e24,
+                         "DensityUnits":6.770424595218825e-27})
+   p = ProjectionPlot(pf, 'z', 'Density', center='c', width=(300, 'kpc'))
+   p.annotate_magnetic_field()
+   p.save()
 
-.. function:: marker(self, pos, marker='x', plot_args=None):
+-------------
+
+.. function:: annotate_marker(self, pos, marker='x', plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.MarkerAnnotateCallback`.)
 
-   Adds text *marker* at *pos* in code-arguments.
+   Adds text *marker* at *pos* in code coordinates.
    *plot_args* is a dict that will be forwarded to the plot
    command.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   s = SlicePlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   s.annotate_marker([0.53, 0.53, 0.53], plot_args={'s':10000})
+   s.save()   
 
-.. function:: particles(self, width, p_size=1.0, col='k', marker='o', stride=1.0, ptype=None, stars_only=False, dm_only=False, minimum_mass=None):
+-------------
+
+.. function:: annotate_particles(self, width, p_size=1.0, col='k', marker='o', stride=1.0, ptype=None, stars_only=False, dm_only=False, minimum_mass=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.ParticleCallback`.)
 
@@ -145,9 +232,17 @@
    given mass, calculated via ParticleMassMsun, to be
    plotted.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("Enzo_64/DD0043/data0043")
+   p = ProjectionPlot(pf, "x", "Density", center='m', width=(10, 'Mpc'))
+   p.annotate_particles(10/pf['Mpc'])
+   p.save()
 
-.. function:: point(self, pos, text, text_args=None):
+-------------
+
+.. function:: annotate_point(self, pos, text, text_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.PointAnnotateCallback`.)
 
@@ -155,9 +250,17 @@
    code-space. *text_args* is a dict fed to the text
    placement code.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   p.annotate_point([0.53, 0.526, 0.53], "What's going on here?", text_args={'size':'xx-large', 'color':'w'})
+   p.save()
 
-.. function:: quiver(self, field_x, field_y, factor, scale=None, scale_units=None, normalize=False):
+-------------
+
+.. function:: annotate_quiver(self, field_x, field_y, factor, scale=None, scale_units=None, normalize=False):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.QuiverCallback`.)
 
@@ -167,9 +270,18 @@
    length unit using *scale_units*  (see
    matplotlib.axes.Axes.quiver for more info)
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center=[0.53, 0.53, 0.53], 
+                      weight_field='Density', width=(20, 'kpc'))
+   p.annotate_quiver('x-velocity', 'y-velocity', 16)
+   p.save()
 
-.. function:: sphere(self, center, radius, circle_args=None, text=None, text_args=None):
+-------------
+
+.. function:: annotate_sphere(self, center, radius, circle_args=None, text=None, text_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.SphereCallback`.)
 
@@ -177,9 +289,17 @@
    *radius* in code units will be created, with optional
    *circle_args*, *text*, and *text_args*.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center=[0.53, 0.53, 0.53], width=(20, 'kpc'))
+   p.annotate_sphere([0.53, 0.53, 0.53], 2/pf['kpc'], {'fill':True})
+   p.save()
 
-.. function:: streamlines(self, field_x, field_y, factor=6.0, nx=16, ny=16, xstart=(0, 1), ystart=(0, 1), nsample=256, start_at_xedge=False, start_at_yedge=False, plot_args=None):
+-------------
+
+.. function:: annotate_streamlines(self, field_x, field_y, factor=6.0, nx=16, ny=16, xstart=(0, 1), ystart=(0, 1), nsample=256, start_at_xedge=False, start_at_yedge=False, plot_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.StreamlineCallback`.)
 
@@ -191,9 +311,17 @@
    use *start_at_yedge*.  A line with the qmean vector
    magnitude will cover 1.0/*factor* of the image.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   s = SlicePlot(pf, 'z', 'Density', center=[0.53, 0.53, 0.53], width=(20, 'kpc'))
+   s.annotate_streamlines('x-velocity', 'y-velocity')
+   s.save()
 
-.. function:: text(self, pos, text, data_coords=False, text_args=None):
+-------------
+
+.. function:: annotate_text(self, pos, text, data_coords=False, text_args=None):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.TextLabelCallback`.)
 
@@ -202,27 +330,33 @@
    is True, position will be in code units instead of image
    coordinates.
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   s = SlicePlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   s.annotate_text((0.53, 0.53), 'Sample text', text_args={'size':'xx-large', 'color':'w'})
+   s.save()
 
-.. function:: title(self, title='Plot'):
+-------------
+
+.. function:: annotate_title(self, title='Plot'):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.TitleCallback`.)
 
    Accepts a *title* and adds it to the plot
 
+.. python-script::
 
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = ProjectionPlot(pf, 'z', 'Density', center=[0.53, 0.53, 0.53], width=(20, 'kpc'))
+   p.annotate_title('Density plot')
+   p.save()
 
-.. function:: units(self, unit='au', factor=4, text_annotate=True, text_which=-2):
+-------------
 
-   (This is a proxy for :class:`~yt.visualization.plot_modifications.UnitBoundaryCallback`.)
-
-   Add on a plot indicating where *factor*s of *unit* are
-   shown. Optionally *text_annotate* on the
-   *text_which*-indexed box on display.
-
-
-
-.. function:: velocity(self, factor=16, scale=None, scale_units=None, normalize=False):
+.. function:: annotate_velocity(self, factor=16, scale=None, scale_units=None, normalize=False):
 
    (This is a proxy for :class:`~yt.visualization.plot_modifications.VelocityCallback`.)
 
@@ -236,12 +370,10 @@
    substantial variation in field strength (normalize is not
    implemented and thus ignored for Cutting Planes).
 
+.. python-script::
 
-
-.. function:: voboz_circle(self, voboz_output, max_number=None, annotate=False, min_size=20, font_size=8, print_halo_size=False):
-
-   (This is a proxy for :class:`~yt.visualization.plot_modifications.VobozCircleCallback`.)
-
-   x.__init__(...) initializes x; see help(type(x)) for
-   signature
-
+   from yt.mods import *
+   pf = load("HiresIsolatedGalaxy/DD0044/DD0044")
+   p = SlicePlot(pf, 'z', 'Density', center='m', width=(10, 'kpc'))
+   p.annotate_velocity()
+   p.save()

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/visualizing/_images/surfaces_blender.png
Binary file source/visualizing/_images/surfaces_blender.png has changed

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/visualizing/callbacks.rst
--- a/source/visualizing/callbacks.rst
+++ b/source/visualizing/callbacks.rst
@@ -6,9 +6,6 @@
 Adding callbacks to plots
 -------------------------
 
-Plot window Plots
-~~~~~~~~~~~~~~~~~
-
 Because the plots in ``yt`` are considered to be "volatile" -- existing
 independent of the canvas on which they are plotted -- before they are saved,
 you can have a set of "callbacks" run that modify them before saving to disk.
@@ -17,8 +14,9 @@
 
 Callbacks can be applied to plots created with
 :class:`~yt.visualization.plot_window.SlicePlot`,
-:class:`~yt.visualization.plot_window.ProjectionPlot`, or
-:class:`~yt.visualization.plot_window.OffAxisSlicePlot`,  by calling
+:class:`~yt.visualization.plot_window.ProjectionPlot`,
+:class:`~yt.visualization.plot_window.OffAxisSlicePlot`, or
+:class:`~yt.visualization.plot_window.OffAxisProjectionPlot` by calling
 one of the ``annotate_`` methods that hang off of the plot object.
 The ``annotate_`` methods are dynamically generated based on the list
 of available callbacks.  For example:
@@ -32,46 +30,10 @@
 callbacks listed below are available via similar ``annotate_``
 functions.
 
-
-PlotCollection Plots
-~~~~~~~~~~~~~~~~~~~~
-
-For :class:`~yt.visualization.plot_collection.PlotCollection` plots,
-the callbacks can be accessed through a registry attached to every
-plot object.  When you add a plot to a
-:class:`~yt.visualization.plot_collection.PlotCollection`, you get back
-that affiliated plot object.  By accessing ``modify`` on that plot
-object, you have access to the available callbacks.  For instance,
-
-.. code-block:: python
-
-   p = PlotCollection.add_slice("Density", 0)
-   p.modify["velocity"]()
-
-would add the :func:`velocity` callback to the plot object.  When you save the
-plot, the list of callbacks will be iterated over, and the velocity callback
-will be handed the current state of the plot.  It will then be able to
-dynamically modify the plot before saving -- in this case, adding on velocity
-vectors atop the image.  You can also access the plot objects inside the
-PlotCollection directly:
-
-.. code-block:: python
-
-   pc.add_slice("Density", 0)
-   pc.plots[-1].modify["grids"]()
-
-Note that if you are plotting interactively, the PlotCollection will need to
-have ``redraw`` called on it.
-
-.. note:: You can access ``plot`` objects after creation through the ``plots``
-   list on the ``PlotCollection``.
-
-
 Available Callbacks
 -------------------
 
-These are the callbacks available through the ``modify[]`` mechanism.  The
-underlying functions are documented (largely identical to this) in
+The underlying functions are documented (in largely identical form) in
 :ref:`callback-api`.
 
 .. include:: _cb_docstrings.inc

diff -r 4c5ea52653bea409733f3a47a011c35677d06a84 -r b2f5d3d06e3f799e3818f3a276ec06848750a899 source/visualizing/index.rst
--- a/source/visualizing/index.rst
+++ b/source/visualizing/index.rst
@@ -8,5 +8,6 @@
    callbacks
    manual_plotting
    volume_rendering
+   sketchfab
    streamlines
    colormaps/index

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/935aea735b47/
Changeset:   935aea735b47
User:        jzuhone
Date:        2013-10-31 02:55:10
Summary:     Merging
Affected #:  5 files

diff -r b2f5d3d06e3f799e3818f3a276ec06848750a899 -r 935aea735b47c303f9b8f879841166abc5910480 .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -2,7 +2,7 @@
 *.pyc
 .*.swp
 build/*
-source/api/generated/*
+source/reference/api/generated/*
 _temp/*
 **/.DS_Store
 RD0005-mine/*

diff -r b2f5d3d06e3f799e3818f3a276ec06848750a899 -r 935aea735b47c303f9b8f879841166abc5910480 extensions/notebook_sphinxext.py
--- a/extensions/notebook_sphinxext.py
+++ b/extensions/notebook_sphinxext.py
@@ -77,6 +77,8 @@
 
-        # clean up png files left behind by notebooks.
+        # clean up png, fits, and h5 files left behind by notebooks.
         png_files = glob.glob("*.png")
+        fits_files = glob.glob("*.fits")
+        h5_files = glob.glob("*.h5")
-        for file in png_files:
+        for file in png_files + fits_files + h5_files:
             os.remove(file)
 

diff -r b2f5d3d06e3f799e3818f3a276ec06848750a899 -r 935aea735b47c303f9b8f879841166abc5910480 extensions/pythonscript_sphinxext.py
--- a/extensions/pythonscript_sphinxext.py
+++ b/extensions/pythonscript_sphinxext.py
@@ -2,7 +2,7 @@
 from subprocess import Popen,PIPE
 from docutils.parsers.rst import directives
 from docutils import nodes
-import os, glob, shutil, uuid, re, string
+import os, glob, base64
 
 class PythonScriptDirective(Directive):
     """Execute an inline python script and display images.
@@ -17,21 +17,6 @@
     has_content = True
 
     def run(self):
-        # Constuct paths
-        rst_file = self.state_machine.document.attributes['source']
-        rst_dir = os.path.abspath(os.path.dirname(rst_file))
-        source_dir = os.path.dirname(
-            os.path.abspath(self.state.document.current_source))
-        rel_dir = os.path.relpath(rst_dir, setup.confdir)
-        dest_dir = os.path.abspath(os.path.join(setup.app.builder.outdir,
-                                                source_dir))
-
-        # working around a docutils/sphinx issue?
-        dest_dir = string.replace(dest_dir, 'internal padding after ', '')
-
-        if not os.path.exists(dest_dir):
-            os.makedirs(dest_dir) # no problem here for me, but just use built-ins
-
         # Construct script from cell content
         content = "\n".join(self.content)
         with open("temp.py", "w") as f:
@@ -47,11 +32,11 @@
 
         images = sorted(glob.glob("*.png"))
         fns = []
+        text = ''
         for im in images:
-            fns.append(str(uuid.uuid4()) + ".png")
-            shutil.move(im, os.path.join(dest_dir, fns[-1]))
-            print im, os.path.join(dest_dir, fns[-1])
-
+            text += get_image_tag(im)
+            os.remove(im)
+            
         os.remove("temp.py")
 
         code = content
@@ -59,10 +44,10 @@
         literal = nodes.literal_block(code,code)
         literal['language'] = 'python'
 
-        images = []
-        for fn in fns:
-            images.append(nodes.image(uri="./"+fn, width="600px"))
-        return [literal] + images
+        attributes = {'format': 'html'}
+        img_node = nodes.raw('', text, **attributes)
+        
+        return [literal, img_node]
 
 def setup(app):
     app.add_directive('python-script', PythonScriptDirective)
@@ -70,15 +55,7 @@
     setup.config = app.config
     setup.confdir = app.confdir
 
-    app.connect('build-finished', cleanup)
-
-# http://stackoverflow.com/questions/136505/searching-for-uuids-in-text-with-regex
-PATTERN = \
-    "[a-fA-F0-9]{8}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}-[a-fA-F0-9]{12}"
-
-def cleanup(app, exception):
-    """ Cleanup all png files with UUID filenames in the source """
-    for root,dirnames,filenames in os.walk(app.srcdir):
-        matches = re.findall(PATTERN, "\n".join(filenames))
-        for match in matches:
-            os.remove(os.path.join(root, match+".png"))
+def get_image_tag(filename):
+    with open(filename, "rb") as image_file:
+        encoded_string = base64.b64encode(image_file.read())
+        return '<img src="data:image/png;base64,%s" width="600"><br>' % encoded_string

diff -r b2f5d3d06e3f799e3818f3a276ec06848750a899 -r 935aea735b47c303f9b8f879841166abc5910480 source/developing/building_the_docs.rst
--- /dev/null
+++ b/source/developing/building_the_docs.rst
@@ -0,0 +1,95 @@
+.. _docs_build:
+
+=================
+Building the Docs
+=================
+
+The yt documentation makes heavy use of the sphinx documentation automation
+suite.  Sphinx, written in Python, was originally created to document the
+Python project itself, and it has many nice capabilities for managing the
+documentation of Python code.
+
+While much of the yt documentation is static text, we make heavy use of
+cross-referencing with API documentation that is automatically generated at
+build time by sphinx.  We also use sphinx to run code snippets and embed
+resulting images and example data.
+
+yt Sphinx extensions
+--------------------
+
+The documentation makes heavy use of custom Sphinx extensions to transform
+recipes, notebooks, and inline code snippets into Python scripts, IPython_
+notebooks, or notebook cells that are executed when the docs are built.
+
+To do this, we use IPython's nbconvert module to transform notebooks into
+HTML.  To simplify versioning of the notebook JSON format, we store notebooks in
+an unevaluated state.  To generate evaluated notebooks, which could include
+arbitrary output (text, images, HTML), we make use of runipy_, which provides
+facilities to script notebook evaluation.
+
+.. _runipy: https://github.com/paulgb/runipy
+.. _IPython: http://ipython.org/
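+
+As a rough sketch, evaluating a notebook from the command line with runipy
+(assuming a notebook named ``my_notebook.ipynb``) looks like:
+
+.. code-block:: bash
+
+   # run every cell and save the evaluated copy to a new file
+   runipy my_notebook.ipynb my_notebook_evaluated.ipynb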
+
+Dependencies
+------------
+
+To build the docs, you will need yt, IPython, runipy, and all supplementary yt
+analysis modules installed. The following dependencies were used to generate the
+yt documentation during the release of yt 2.6 in late 2013.
+
+- Sphinx 1.1.3
+- IPython 1.1
+- runipy_ (git hash f74458c2877)
+- pandoc 1.11.1
+- Rockstar halo finder 0.99.6
+- SZpack_ 1.1.1
+
+.. _SZpack: http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html
+
+You will also need the full suite of `yt test data
+<http://yt-project.org/data/>`_, including the larger datasets that are not used
+in the answer tests.
+
+Building the docs
+-----------------
+
+First, you will need to ensure that your testing configuration is properly
+configured and that all of the yt test data is in the testing directory.  See
+:ref:`run_answer_testing` for more details on how to set up the testing
+configuration.
+
+Next, clone the yt-doc repository, navigate to the root of the repository, and
+do :code:`make html`.
+
+.. code-block:: bash
+
+   hg clone https://bitbucket.org/yt_analysis/yt-doc ./yt-doc
+   cd yt-doc
+   make html
+
+If all of the dependencies are installed and all of the test data is in the
+testing directory, this should churn away for a while and eventually generate a
+docs build.  This process is lengthy but shouldn't last more than an hour.  We
+suggest setting :code:`suppressStreamLogging = True` in your yt configuration
+(See :ref:`configuration-file`) to suppress large amounts of debug output from
+yt.
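+
+For reference, that setting lives in the ``[yt]`` section of your yt
+configuration file (``~/.yt/config``):
+
+.. code-block:: none
+
+   [yt]
+   suppressStreamLogging = True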
+
+To clean the docs build, use :code:`make clean`.  By default, :code:`make clean`
+will not delete the autogenerated API docs, so use :code:`make fullclean` to
+delete those as well.
+
+
+Quick docs builds
+-----------------
+
+Clearly, building the complete docs is something of an undertaking.  If you
+are only adding new static content, building the complete docs is probably
+overkill.  To skip some of the lengthier operations, you can do the following
+from the bash prompt:
+
+.. code-block:: bash
+
+   export READTHEDOCS=True
+
+This variable is set for automated builds on the free ReadTheDocs service but
+can be used by anyone to force a quick, minimal build.
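+
+The variable can also be supplied inline for a single build:
+
+.. code-block:: bash
+
+   READTHEDOCS=True make html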

diff -r b2f5d3d06e3f799e3818f3a276ec06848750a899 -r 935aea735b47c303f9b8f879841166abc5910480 source/developing/index.rst
--- a/source/developing/index.rst
+++ b/source/developing/index.rst
@@ -23,3 +23,4 @@
    creating_datatypes
    creating_derived_quantities
    creating_frontend
+   building_the_docs


https://bitbucket.org/yt_analysis/yt-doc/commits/64fe0d129a4d/
Changeset:   64fe0d129a4d
User:        jzuhone
Date:        2013-10-31 12:50:52
Summary:     Merged chummels/yt-doc into default
Affected #:  32 files

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/advanced/index.rst
--- a/source/advanced/index.rst
+++ /dev/null
@@ -1,21 +0,0 @@
-.. _advanced:
-
-Advanced yt Usage
-=================
-
-yt has been designed to be flexible, with several entry points.
-
-.. toctree::
-   :maxdepth: 2
-
-   installing
-   plugin_file
-   parallel_computation
-   creating_derived_quantities
-   creating_datatypes
-   debugdrive
-   external_analysis
-   developing
-   testing
-   creating_frontend
-   reason_architecture

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/advanced/installing.rst
--- a/source/advanced/installing.rst
+++ /dev/null
@@ -1,233 +0,0 @@
-.. _installing-yt:
-
-Installing yt
-=============
-
-.. _automated-installation:
-
-Automated Installation
-----------------------
-
-The recommended method for installing yt is to install an isolated environment,
-using the installation script.  The front yt homepage will always contain a
-link to the most up to date version, but you should be able to obtain it from a
-command prompt by executing:
-
-.. code-block:: bash
-
-   $ wget http://hg.yt-project.org/yt/raw/stable/doc/install_script.sh
-   $ bash install_script.sh
-
-This script will download the **stable** version of ``yt``, along with all of
-its affiliated dependencies.  At the end, it will tell you which variables you
-need to set in order to ensure that ``yt`` works
-correctly.  If you run into *any* problems with the installation script, that
-is considered a bug we will fix, and we encourage you to write to `yt-users
-<http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_.
-
-.. _manual-installation:
-
-Manual Installation
--------------------
-
-If you choose to install ``yt`` yourself, you will have to ensure that the
-correct dependencies have been met.  A few are optional, and one is necessary
-if you wish to install the latest development version of ``yt``, but here is a list
-of the various necessary items to build ``yt``.
-
-Installation of the various libraries is a bit beyond the scope of this
-document; if you run into any problems, your best course of action is to
-consult with the documentation for the individual projects.
-
-.. _dependencies:
-
-Required Libraries
-++++++++++++++++++
-
-This is a list of libraries installed by the installation script.  The version
-numbers are those used by the installation script -- ``yt`` may work with lower
-versions or higher versions, but these are known to work.
-
- * Python-2.7.3, but not (yet) 3.0 or higher
- * NumPy-1.6.1 (at least 1.4.1)
- * HDF5-1.8.9 or higher (at least 1.8.7)
- * h5py-2.1.0 (2.0 fixes a major bug)
- * Matplotlib-1.2.0 or higher
- * Mercurial-2.5.1 or higher (anything higher than 1.5 works)
- * Cython-0.17.1 or higher (at least 0.15.1)
-
-Optional Libraries
-++++++++++++++++++
-
-These libraries are all optional, but they are highly recommended.
-
- * Forthon-0.8.10 or higher (for halo finding and correlation functions)
- * libpng-1.5.12 or higher (for raw image outputting)
- * FreeType-2.4.4 or higher (for text annotation on raw images)
- * IPython-0.13.1 (0.10 will also work)
- * PyX-0.11.1
- * zeromq-2.2.0 (needed for IPython notebook)
- * pyzmq-2.2.11 (needed for IPython notebook)
- * tornado-2.2  (needed for IPython notebook)
- * sympy-0.7.2 
- * nose-1.2.1
-
-If you are attempting to install manually, and you are not installing into a
-fully-isolated location, you should probably use your system's package
-management system as much as possible.
-
-Once you have successfully installed the dependencies, you should clone the
-primary ``yt`` repository.  
-
-You can clone the repository with this mercurial command:
-
-.. code-block:: bash
-
-   $ hg clone http://hg.yt-project.org/yt ./yt-hg
-   $ cd yt-hg
-   $ hg up -C stable
-
-This will create a directory called ``yt-hg`` that contains the entire version
-control history of ``yt``.  If you would rather use the branch ``yt``, which is
-the current development version, issue the command ``hg up -C yt`` .
-
-To compile ``yt``, you will have to specify the location to the HDF5 libraries,
-and optionally the libpng and freetype libraries.  To do so, put the "prefix"
-for the installation location into the files ``hdf5.cfg`` and (optionally)
-``png.cfg`` and ``freetype.cfg``.  For instance, if you installed into
-``/opt/hdf5/`` you would put ``/opt/hdf5/`` into ``hdf5.cfg``.  Once you have
-specified the location to these libraries, you can execute the command:
-
-.. code-block:: bash
-
-   $ python2.7 setup.py install
-
-from the ``yt-hg`` directory.  Alternately, you can replace ``install`` with
-``develop`` if you anticipate making any modifications to the code; ``develop``
-simply means that the source will be read from that directory, whereas
-``install`` will copy it into the main Python package directory.
-
-That should install ``yt`` the library as well as the commands ``iyt`` and
-``yt``.  Good luck!
-
-Package Management System Installation
---------------------------------------
-
-While the installation script provides a complete stack of utilities,
-integration into your existing operating system can often be desirable.
-
-Ubuntu PPAs
-+++++++++++
-
-Mike Kuhlen has kindly provided PPAs for Ubuntu. If you're running Ubuntu, you
-can install these easily:
-
-.. code-block:: bash
-
-   $ sudo add-apt-repository ppa:kuhlen
-   $ sudo apt-get update
-   $ sudo apt-get install yt
-
-If you'd like a development branch of yt, you can replace ``yt`` with
-``yt-devel`` to get the most recently packaged development branch.
-
-MacPorts
-++++++++
-
-Thomas Robitaille has kindly provided a `MacPorts <http://www.macports.org/>`_
-installation of yt, as part of his `MacPorts for Python Astronomers
-<http://astrofrog.github.com/macports-python/>`_.  See that page for the
-``port`` commands to run.
-
-Thanks very much, Thomas!
-
-
-.. _community-installations:
-
-Community Installations
------------------------
-
-Recently, yt has been added as a module on several supercomputers.  We hope to
-increase this list through partnership with supercomputer centers.  You should
-be able to load an appropriate yt module on these systems:
-
- * NICS Kraken
- * NICS Nautilus
-
-.. _updating-yt:
-
-Updating yt
-===========
-
-.. _automated-update:
-
-Automated Update
-----------------
-
-The recommended method for updating yt is to run the update tool at a command 
-prompt:
-
-.. code-block:: bash
-
-   $ yt update
-
-This script will identify which repository you're using (stable, development, 
-etc.), connect to the yt-project.org server, download any recent changesets 
-for your version, and then recompile any new code that needs
-it (e.g. rebuilding Cython extensions).  This same behavior can be achieved
-manually by running:
-
-.. code-block:: bash
-
-   $ cd $YT_DEST/src/yt-hg 
-   $ hg pull
-   $ python setup.py develop
-
-Note that this automated update will fail if you have made modifications to
-the yt code base that you have not yet committed.  If this occurs, identify
-your modifications using ``hg status``, and then commit them using ``hg commit``,
-in order to bring the repository back to a state where you can automatically
-update the code as above.  On the other hand, if you want to wipe out your
-uncommitted changes and just update to the latest version, you can type: 
-
-.. code-block:: bash
-
-   $ cd $YT_DEST/src/yt-hg 
-   $ hg pull
-   $ hg up -C      # N.B. This will wipe your uncommitted changes! 
-   $ python setup.py develop
-
-If you run into *any* problems with the update utility, it should be considered
-a bug, and we would love to hear about it so we can fix it.  Please inform us 
-through the bugsubmit utility or through the yt-users mailing list.
-
-Updating yt's Dependencies
---------------------------
-
-If you used the install script to originally install yt, updating the various 
-libraries and modules yt depends on can be done by running:
-
-.. code-block:: bash
-
-   $ yt update --all
-
-For custom installs, you will need to update the dependencies by hand.
-
-Switching Between Branches in yt
-================================
-
-.. _switching-versions:
-
-If you are running the stable version of the code, and you want to switch 
-to using the development version of the code (or vice versa), you can merely
-follow a few steps (without reinstalling all of the source again):
-
-.. code-block:: bash
-
-   $ cd $YT_DEST/src/yt-hg 
-   $ hg pull
-   <commit all changes or they will be lost>
-   $ hg up -C <branch>     # N.B. This will wipe your uncommitted changes! 
-   $ python setup.py develop
-
-If you want to switch to using the development version of the code, use: 
-"yt" as <branch>, whereas if you want to switch to using the stable version
-of the code, use: "stable" as <branch>.

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/index.rst
--- a/source/index.rst
+++ b/source/index.rst
@@ -140,3 +140,4 @@
    reference/index
    Getting Help <help/index>
    FAQ <faq/index>
+   sharing_data_hub

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/_images/mapserver.png
Binary file source/interacting/_images/mapserver.png has changed

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/_images/rs1_welcome.png
Binary file source/interacting/_images/rs1_welcome.png has changed

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/_images/rs2_printstats.png
Binary file source/interacting/_images/rs2_printstats.png has changed

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/_images/rs3_proj.png
Binary file source/interacting/_images/rs3_proj.png has changed

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/_images/rs4_widget.png
Binary file source/interacting/_images/rs4_widget.png has changed

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/_images/rs5_menu.png
Binary file source/interacting/_images/rs5_menu.png has changed

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/command-line.rst
--- a/source/interacting/command-line.rst
+++ /dev/null
@@ -1,235 +0,0 @@
-.. _command-line:
-
-Command-line Functions
-----------------------
-
-The developers of yt realize that there is a lot more to analyzing data
-than just making pretty pictures.  That is why we included several easy-to-call
-functions that can be executed from a command-line prompt for sharing code
-and images with others, using our GUI Reason, manipulating your data
-Google Maps-style, updating yt's codebase and more.  To get a quick list of
-what is available, just type:
-
-.. code-block:: bash
-
-   yt -h
-
-This yields all of the subcommands.  To execute any such function, 
-simply run:
-
-.. code-block:: bash
-
-   yt <subcommand>
-
-Finally, to identify the options associated with any of these subcommands, run:
-
-.. code-block:: bash
-
-   yt <subcommand> -h
-
-Let's go through each subcommand.
-
-.. code-block:: bash
-
-    help                Print help message
-    bootstrap_dev       Bootstrap a yt development environment
-    bugreport           Report a bug in yt
-    hop                 Run HOP on one or more datasets
-    hub_register        Register a user on the Hub: http://hub.yt-project.org/
-    hub_submit          Submit a mercurial repository to the yt Hub
-                        (http://hub.yt-project.org/), creating a BitBucket
-                        repo in the process if necessary.
-    instinfo            Get some information about the yt installation
-    load                Load a single dataset into an IPython instance
-    mapserver           Serve a plot in a GMaps-style interface
-    pastebin            Post a script to an anonymous pastebin
-    pastebin_grab       Print an online pastebin to STDOUT for local use.
-    upload_notebook     Upload an IPython notebook to hub.yt-project.org.
-    plot                Create a set of images
-    render              Create a simple volume rendering
-    rpdb                Connect to a currently running (on localhost) rpdb
-                        session. Commands run with --rpdb will trigger an rpdb
-                        session with any uncaught exceptions.
-    notebook            Run the IPython Notebook
-    serve               Run the Web GUI Reason
-    reason              Run the Web GUI Reason
-    stats               Print stats and max/min value of a given field (if
-                        requested), for one or more datasets (default field is
-                        Density)
-    update              Update the yt installation to the most recent version
-    upload_image        Upload an image to imgur.com. Must be PNG.
-
-
-help
-++++
-
-Help lists all of the various command-line options in yt.
-
-bootstrap_dev
-+++++++++++++
-
-After you have installed yt and you want to do some development, there may 
-be a few more steps to complete.  This subcommand automates building a 
-development environment for you by setting up your hg preferences correctly,
-creating/linking to a BitBucket account for hosting and sharing your code,
-and setting up a pastebin for your code snippets.  A full description of
-how this works can be found in :ref:`bootstrap-dev`.
-
-bugreport         
-+++++++++
-
-Encountering a bug in your own code can be a big hassle, but it can be 
-exponentially worse to find it in someone else's.  That's why we tried to 
-make it as easy as possible for users to report bugs they find in yt.  
-After you go through the necessary channels to make sure you're not just
-making a mistake (see :ref:`asking-for-help`), you can submit bug 
-reports using this nice utility.
-
-hop               
-+++
-
-This lets you run the HOP algorithm as a halo-finder on one or more 
-datasets.  It nominally reproduces the behavior of enzohop from the 
-enzo suite.  There are several flags you can use in order to specify
-your threshold, input names, output names, and whether you want to use 
-dark matter or all particles.  To view these flags run help with the 
-hop subcommand.
-
-hub_register and hub_submit
-+++++++++++++++++++++++++++
-
-We in the yt camp believe firmly in the ideals of open-source coding.  To
-further those ends, we have made a location for people to share their 
-nifty and useful codes with other scientists who might be able to use 
-them: the `yt hub <http://hub.yt-project.org/>`_.  Did you make a cool 
-code for generating a movie from your simulation outputs?  Submit it to 
-the hub.  Did you create a perl script that automates something and saves 
-you some time while on a supercomputer.  Submit it to the hub.  And 
-using the hubsubmit command, you can do this really easily.  If you 
-create a mercurial repository for the code you want to submit, just 
-run the hubsubmit command from within its directory structure, and we'll 
-take care of the rest, by putting it on bitbucket and finally submitting 
-it to the hub to share with the rest of the yt community.  Check out 
-what people have already put up on the
-`yt hub <http://hub.yt-project.org/>`_, and see :ref:`share-your-scripts` 
-for more details about sharing your work on the hub.
-
-instinfo
-++++++++
-
-This gives very similar behavior to the update command, in that it 
-will automatically update your yt version to the latest in whichever
-repository you're in (stable, development, etc.).  It can also provide 
-you with the hash of the version you're using.
-
-load
-++++
-
-This will start the iyt interactive environment with your specified 
-dataset already loaded.  See :ref:`interactive-prompt` for more details.
-
-mapserver
-+++++++++
-
-Ever wanted to interact with your data using the 
-`google maps <http://maps.google.com/>`_ interface?  Now you can by using the
-yt mapserver.  See :ref:`mapserver` for more details.
-
-pastebin and pastebin_grab
-++++++++++++++++++++++++++
-
-The `pastebin <http://paste.yt-project.org/>`_ is an online location where 
-you can anonymously post code snippets and error messages to share with 
-other users in a quick, informal way.  It is often useful for debugging 
-code or co-developing.  By running the ``pastebin`` subcommand with a 
-text file, you send the contents of that file to an anonymous pastebin; 
-
-.. code-block:: bash
-
-   yt pastebin my_script.py
-
-By running the ``pastebin_grab`` subcommand with a pastebin number 
-(e.g. 1768), it will grab the contents of that pastebin 
-(e.g. the website http://paste.yt-project.org/show/1768 ) and send it to 
-STDOUT for local use.  For more details see the :ref:`pastebin` section.
-
-.. code-block:: bash
-
-   yt pastebin_grab 1768
-
-plot
-++++
-
-This command generates one or many simple plots for a single dataset.  
-By specifying the axis, center, width, etc. (run ``yt help plot`` for 
-details), you can create slices and projections easily at the 
-command-line.
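-
-For instance, a minimal invocation on a sample dataset, relying on the
-default options (run ``yt help plot`` for the full flag list), might look
-like:
-
-.. code-block:: bash
-
-   yt plot DD0010/moving7_0010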
-
-upload_notebook
-+++++++++++++++
-
-This command will accept the filename of a ``.ipynb`` file (generated from an
-IPython notebook session) and upload it to the `yt hub
-<http://hub.yt-project.org/>`_ where others will be able to view and
-download it.  This is an easy method for recording a sequence of commands,
-their output, narrative information, and then sharing that with others.  These
-notebooks will be viewable online, and the appropriate URLs will be returned on
-the command line.
-
-reason and serve
-++++++++++++++++
-
-The ``reason`` and ``serve`` subcommands have identical functionality in that
-they both initiate the Web GUI Reason. See :ref:`reason`.
-
-render
-++++++
-
-This command generates a volume rendering for a single dataset.  By specifying
-the center, width, number of pixels, number and thickness of contours, etc.
-(run ``yt help render`` for details),  you can create high-quality volume
-renderings at the command-line before moving on to more involved volume
-rendering scripts.
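-
-As a sketch, a default rendering of a sample dataset (see ``yt help render``
-for the available options) might be produced with:
-
-.. code-block:: bash
-
-   yt render DD0010/moving7_0010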
-
-rpdb
-++++
-
-Connect to a currently running (on localhost) rpdb session.
-
-notebook
-++++++++
-
-Launches an IPython notebook server and prints out instructions on how to open
-an ssh tunnel to connect to the notebook server with a web browser.  This is
-most useful when you want to run an IPython notebook using CPUs on a remote
-host.
-
-stats
-+++++
-
-This subcommand provides you with some basic statistics on a given dataset.
-It provides you with the number of grids and cells in each level, the time
-of the dataset, the resolution, and the maximum density in a variety of units.
-It is tantamount to calling ``print_stats()`` from within yt.
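-
-For example, on a sample dataset:
-
-.. code-block:: bash
-
-   yt stats DD0010/moving7_0010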
-
-update
-++++++
-
-This subcommand updates the yt installation to the most recent version for
-your repository (e.g. stable, 2.0, development, etc.).  Adding the ``--all`` 
-flag will update the dependencies as well. See 
-:ref:`automated-update` for more details.
-
-.. _upload-image:
-
-upload_image
-++++++++++++
-
-Images are often worth a thousand words, so when you're trying to 
-share a piece of code that generates an image, or you're trying to 
-debug image-generation scripts, it can be useful to send your
-co-authors a link to the image.  This subcommand makes such sharing 
-a breeze.  By specifying the image to share, ``upload_image`` automatically
-uploads it anonymously to the website `imgur.com <http://imgur.com/>`_ and
-provides you with a link to share with your collaborators.  Note that the
-image *must* be in the PNG format in order to use this function.
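-
-For example, assuming a plot saved as ``my_slice.png``:
-
-.. code-block:: bash
-
-   yt upload_image my_slice.png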

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/index.rst
--- a/source/interacting/index.rst
+++ /dev/null
@@ -1,19 +0,0 @@
-.. _interacting_with_yt:
-
-Ways of Interacting with yt
-===========================
-
-There are several entry points to yt, and each carries its own strengths and
-weaknesses.  We cover a few of those here.
-
-.. toctree::
-   :maxdepth: 2
-
-   scripts
-   interactive_prompt
-   ipython_notebook
-   command-line
-   reason
-   mapserver
-   sharing_data_hub
-   paraview

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/interactive_prompt.rst
--- a/source/interacting/interactive_prompt.rst
+++ /dev/null
@@ -1,50 +0,0 @@
-.. _interactive-prompt:
-
-Interactive Prompt
-------------------
-
-The interactive prompt offers a number of excellent opportunities for
-exploration of data.  While there are challenges for repeatability, and some
-operations will be more challenging to operate in parallel, interactive prompts
-can be exceptionally useful for debugging, exploring, and tweaking plots.
-
-There are several different ways to get an interactive prompt for yt, the
-easiest of which is simply to type:
-
-.. code-block:: bash
-
-   pyyt
-
-This will start up the python interpreter.  You can now execute yt commands as
-normal (once you've imported yt!) and examine your data.  There are two other
-handy commands, however, which put you into the IPython interactive shell.
-
-.. warning:: IPython has changed their API substantially in recent versions.
-   yt does not support IPython versions newer than 0.10.
-
-You can start up an empty shell, with a handful of useful yt utilities (such as
-tab-completion and pre-imported modules) by executing:
-
-.. code-block:: bash
-
-   iyt
-
-The other option, which is shorthand for "iyt plus dataset loading", is to use
-the command-line tool (see :ref:`command-line`) with the ``load`` subcommand
-and to specify a parameter file.  For instance:
-
-.. code-block:: bash
-
-   yt load cosmoSim_coolhdf5_chk_0026
-
-or
-
-.. code-block:: bash
-
-   yt load DD0030/DD0030
-
-This will spawn ``iyt``, but the parameter file given on the command line will
-already be in the namespace as ``pf``.  With interactive mode, you can use the
-``pylab`` module to interactively plot, and there is also the object
-``PlotCollectionInteractive`` which can handle setting up draw mode and
-updating plots interactively.

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/ipython_notebook.rst
--- a/source/interacting/ipython_notebook.rst
+++ /dev/null
@@ -1,33 +0,0 @@
-IPython Notebook
-================
-
-Starting with 2.4, yt ships with several functions and helpers to display
-information in the IPython web notebook.
-
-.. note:: The necessary IPython dependencies (0MQ, Py0MQ, Tornado and IPython)
-          come with the yt install script.  However, you should read in detail
-          the `IPython documentation
-          <http://ipython.org/ipython-doc/stable/interactive/htmlnotebook.html>`__
-          for how to use it.
-
-A sample notebook, demonstrating some of the functionality of both yt 2.4 and
-the IPython notebook (as exposed through yt) can be found at
-http://yt-project.org/files/yt24.ipynb .
-
-There are two main things that yt exposes to the IPython notebook: displaying
-PlotWindow objects and displaying Volume Renderings.  Both of these are exposed
-through the ``show`` method.  For instance:
-
-.. code-block:: python
-
-   slc = SlicePlot(pf, "x", "Density")
-   slc.show()
-
-Or with a volume rendering, call ``show`` on the camera:
-
-.. code-block:: python
-
-   cam = pf.h.camera([0.5, 0.5, 0.5], [0.2, 0.3, 0.4], 0.10, 1024, tf)
-   cam.show()
-
-In both of these cases, an image will appear in the cell output.

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/mapserver.rst
--- a/source/interacting/mapserver.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-.. _mapserver:
-
-Mapserver
----------
-
-The mapserver is a new, experimental feature.  It's based on `Leaflet
-<http://leaflet.cloudmade.com/>`_, a library written to create zoomable,
-map-tile interfaces, similar to Google Maps.  yt provides everything you
-need to start up a web server that will interactively re-pixelize an adaptive
-image.  This means you can explore your datasets in a fully pan-n-zoom
-interface.
-
-To start up the mapserver, you can use the command ``yt`` (see
-:ref:`command-line`) with the ``mapserver`` subcommand.  It takes several of
-the same options and arguments as the ``plot`` subcommand.  For instance:
-
-.. code-block:: bash
-
-   yt mapserver DD0050/DD0050
-
-That will take a slice along the x axis at the center of the domain.  The
-field, projection, weight and axis can all be specified on the command line.
-
-When you do this, it will spawn a micro-webserver on your localhost and print
-the URL to connect to on standard output.  You can connect to it (or create an
-SSH tunnel to connect to it) and explore your data.  Double-clicking zooms, and
-dragging drags.
-
-.. image:: _images/mapserver.png
-   :scale: 50%
-
-This is also functional on touch-capable devices such as Android Tablets and
-iPads/iPhones.  In future versions, we hope to add halo-overlays and
-markers-of-interest to this.

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/paraview.rst
--- a/source/interacting/paraview.rst
+++ /dev/null
@@ -1,32 +0,0 @@
-.. _paraview:
-
-ParaView
---------
-
-.. note:: As of 2.2 the ParaView-yt interoperability is still a work in
-   progress.  Much can be done, but the setup still requires a bit of work.
-
-ParaView support for yt is still preliminary, but is possible.  Future versions
-of ParaView will include the necessary components.  For now, however, to enable
-yt as a plugin in ParaView you will have to build from source and use the
-``AMR-Refactor`` branch in both VTK and ParaView.  When building, you must
-also link against the Python with which yt was installed.
-
-Finally, to enable the yt ParaView plugin, you must also install the yt plugin,
-available in the `yt-paraview
-<https://gitorious.org/yt-paraview/paraview-plugins>`_ git repository.  (You
-should be able to use ``pip install hg-git`` to install hg-git, which enables
-checking out git repos via mercurial.)
-
-Jorge Poco has created a YouTube video of `ParaView using yt as a plugin
-<http://www.youtube.com/watch?v=cOv4Ob2q1fM>`_:
-
-.. youtube:: cOv4Ob2q1fM
-   width: 600
-   height: 400
-
-
-For more information, there are also two blog posts about this:
-
- * http://blog.yt-project.org/a-movie-of-yt-in-paraview
- * http://blog.yt-project.org/paraview-and-yt

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/reason.rst
--- a/source/interacting/reason.rst
+++ /dev/null
@@ -1,246 +0,0 @@
-.. _reason:
-
-The GUI Reason
---------------
-
-.. warning:: Current versions of Reason may not work quite as expected with
-             Firefox.  They have all been tested under Chrome, and if you run
-             into any bugs with either, please `report them
-             <https://bitbucket.org/yt_analysis/yt/issues/new>`_!
-
-Demo
-++++
-
-Cameron created a short screencast of what Reason is, how it works, and how you
-can use it.  It's best viewed in full-screen, so click the little X in the
-bottom right.
-
-.. raw:: html
-
-   <iframe src="http://player.vimeo.com/video/28506477" width="640"
-        height="320" frameborder="0"></iframe>
-
-
-What is Reason?
-+++++++++++++++
-
-Reason is a web-based GUI for yt.  It's still in beta, but we
-are working very hard to improve it and make it productive.  It's designed
-to act as a very simple web-notebook -- it's not a replacement for something
-like the much more complex IPython web notebook, or the SAGE notebook, but
-rather a means of exploring simulation data easily, safely, and without
-requiring the overhead of additional dependencies.
-
-Everything you need to run reason comes right with the yt install script -- if
-you have installed yt another way, you may need to separately obtain the ExtJS
-packages.
-
-The idea behind reason is to spawn a web server on a shared resource system,
-and connect to that web server from your desktop, tunnelling over SSH.  Reason
-is not designed to be run over unencrypted wires; you should either be running
-fully locally or through an SSH tunnel.  Reasonable steps have been taken to
-ensure that your connections are safe: each connection occurs only on a random
-port (which is potentially identifiable on a shared user system) and with a
-UUID prefixed into each URL (which should be difficult if not impossible to
-identify without root authority).
-
-Starting Reason
-+++++++++++++++
-
-Reason can be started very easily from the command line:
-
-.. code-block:: bash
-
-   $ yt serve
-
-If you are running on your local machine, you can also execute:
-
-.. code-block:: bash
-
-   $ yt serve -o
-
-to open up a local web browser window.  If you want Reason to search for
-(currently only Enzo) parameter files under your current directory, you can
-execute:
-
-.. code-block:: bash
-
-   $ yt serve -o -f
-
-yt will print out something like:
-
-.. code-block:: bash
-
-   =============================================================================
-   =============================================================================
-   Greetings, and welcome to Reason!
-   Your private token is c2dcd1dc-d40f-11e0-8f6b-bc305ba67797 .
-   DO NOT SHARE THIS TOKEN.
-
-   Please direct your browser to:
-
-        http://localhost:51707/c2dcd1dc-d40f-11e0-8f6b-bc305ba67797/
-
-   =============================================================================
-
-   If you are currently ssh'd into a remote machine, you should be able
-   to create a new SSH tunnel by typing or copy/pasting this text
-   verbatim, while waiting to see the 'ssh>' prompt after the first line.
-
-   ~C
-   -L51707:localhost:51707
-
-   and then pointing a web browser on your local machine to the above URL.
-
-   =============================================================================
-   =============================================================================
-
-If you are on a remote machine, you will need to execute the one additional
-step that yt mentions in order to be able to connect.  Press ~C (that's tilde,
-then C) which should bring up a prompt that looks like ``ssh>`` .  At that
-prompt, type what you are told to, which will open up a port over which you can
-tunnel to talk to the server:
-
-.. code-block:: bash
-
-   ssh>-L51707:localhost:51707
-
-Now you can open the URL printed out.
-
-.. _within-reason:
-
-What is Within Reason?
-++++++++++++++++++++++
-
-Once you start up Reason for the first time, you will see something like:
-
-.. image:: _images/rs1_welcome.png
-   :target: _images/rs1_welcome.png
-   :scale: 50%
-
-This is the basic layout.  There are three primary components:
-
-  * (top, left) *Object List*: The list of parameter files and objects.  Every
-    time you load a parameter file or create a (persistent) data object, it will
-    appear here.
-  * (top, right) *Interaction Area*: this is where the notebook and any
-    widgets will appear.
-  * (bottom) *Log Area*: The messages normally spit out to the log will be put
-    here.
-
-The main mechanism of interacting with Reason is through the notebook.  You can
-type commands in.  When you either click the down-arrow on the right or press
-Shift-Enter, these commands will be sent and executed on the far side.  This
-should be thought of more like a series of mini-scripts, rather than individual
-lines: you can send multiple lines of input, including for loops, conditionals,
-and so on, very easily.  If you want to access any object, you can drag it from
-the object list, as demonstrated here, where I have dragged in the parameter file
-and called ``print_stats`` on its hierarchy:
-
-.. image:: _images/rs2_printstats.png
-   :target: _images/rs2_printstats.png
-   :scale: 50%
-
-Any command can be executed here, and the output will appear in an output cell
-below.  The output cells have two sets of arrows on them.  The leftmost (blue)
-arrow will upload the contents of that cell to the yt `pastebin
-<http://paste.yt-project.org/>`_.  The rightmost (green) set of double arrows
-will put the contents of that cell up top, in the execution zone -- this is
-useful if you are iterating on a command.
-
-You can also right-click on a parameter file to create slices and projections,
-and to view grid information.  If you choose to
-project the dataset, progress bars have been added, so you should be
-able to view progress as normal:
-
-.. image:: _images/rs3_proj.png
-   :target: _images/rs3_proj.png
-   :scale: 50%
-
-Once the projecting is done, a new tab for the plot widget should be opened.
-This will include the image, a colorbar, a metadata window, and a couple
-buttons to press:
-
-.. image:: _images/rs4_widget.png
-   :target: _images/rs4_widget.png
-   :scale: 50%
-
-You can ctrl-click on the image (this is broken in Firefox, but works in
-Chrome!  We're working to fix it!) to re-center on a given location.  The
-scroll bar at the bottom controls zooming, and you can dynamically change the
-field that is displayed.  There are zoom controls as well as panning controls,
-and the option to upload the image to `imgur.com <http://imgur.com/>`_, a
-simple image sharing service on the web.
-
-You can also click the button "Pannable Map" to open up a Google Maps-style
-interface, using the same underlying code as described in :ref:`mapserver`.
-
-Once you have created a data object, for instance by creating a sphere or a
-region inside the scripting window, you can right click on that object to
-extract isocontours.  The resultant widget, based on `PhiloGL
-<http://senchalabs.github.com/philogl/>`_, will be colored with the field you
-sampled and will be shaped like the extracted isocontour at your specified
-value.
-
-What Special Things Can Reason Do?
-++++++++++++++++++++++++++++++++++
-
-There are several special commands that Reason builds in, that make a few
-common operations easy.
-
- * ``load_script(filename)`` will load a script from the file system and insert
-   it into the currently-active input area.
- * ``deliver_image(filename)`` can accept a filename (PNG-only), a string of
-   binary PNG data, or a file-like object full of PNG data.  This data will
-   then be delivered to the browser, in the next active cell.  This is the
-   mechanism by which most image display occurs in Reason (see the sketch
-   below).
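-
-For instance, a PNG already on disk (here a hypothetical
-``my_projection.png``) can be pushed into the next cell like so:
-
-.. code-block:: python
-
-   # deliver_image is provided in Reason's execution namespace
-   deliver_image("my_projection.png")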
-
-Plot collections have been instrumented to work with Reason.  This means that
-if you create a plot collection like normal, as soon as you run ``pc.save()``
-the images that are saved out will be displayed in the active cell output.
-
-Pylab has also been modified to work with Reason, and Reason imports it before
-it starts up.  You can run any normal pylab command:
-
-.. code-block:: python
-
-   pylab.plot([1, 2, 3, 4, 5], [10, 210, 503, 1, 42.1])
-
-and the output should appear in the active cell.
-
-You may notice that there is a menu above the object list:
-
-.. image:: _images/rs5_menu.png
-   :scale: 50%
-   :target: _images/rs5_menu.png
-
-There are a number of options here:
-
- * ``Open`` and ``Open Directory`` -- currently disabled while we implement
-   this functionality.
- * ``Save Script`` -- Save the concatenated set of executed cells to a file on
-   the server.  This script should be directly executable, including all widget
-   interactions.
- * ``Download Script`` -- Download the concatenated set of executed cells to a
-   file on your local machine.
- * ``Pastebin Script`` -- Upload the concatenated set of executed cells to the
-   `yt pastebin <http://paste.yt-project.org/>`_.
- * ``Help`` -- A quick little help file.
- * ``yt Chat`` -- Open up the Web portal to IRC, for live chatting with other
-   yt users and developers.
-
-Please feel free to share with us your thoughts and experiences -- good and
-bad! -- with Reason.
-
-I Want To Add A Widget!
-+++++++++++++++++++++++
-
-Adding a new widget is pretty straightforward, but you might need some guidance
-along the way.  It might be a good idea to stop by the yt-dev mailing list (see
-:ref:`getting-involved`) but you can also explore how widgets are made in the
-directory ``yt/gui/reason/html/js/`` and take a look at the specific python
-code in ``yt/gui/reason/extdirect_repl.py``.
-
-But seriously, if you have the desire to play with or update or extend or
-prettify reason, we'd be really excited to work with you.

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/scripts.rst
--- a/source/interacting/scripts.rst
+++ /dev/null
@@ -1,38 +0,0 @@
-.. _scripting-yt:
-
-Scripts
--------
-
-The mechanism by which most people primarily interact with yt is by writing,
-then executing, a script.  This is covered somewhat in the :ref:`orientation`,
-but here we describe it again.  There are several advantages to scripts as
-opposed to interactivity:
-
- * Repeatability of experiments
- * Easier MPI-parallelism
- * Versioned control of changes to a script
- * Simpler declaration and usage of subroutines and loops
-
-Running scripts is pretty easy.  It's a three-step process.
-
- #. Edit script in a text editor (vim, emacs, textmate -- as long as it's not
-    pico or edlin!)
- #. Run script, invoking it with either the python version installed with yt or
-    the alias ``pyyt``.
- #. Edit, and repeat!
-
-To encourage easy submission to the `yt Hub <http://hub.yt-project.org/>`_, we
-suggest you place your scripts in an isolated subdirectory and name each one
-individually.  For instance:
-
-.. code-block:: bash
-
-   mkdir turbulence_paper
-   cd turbulence_paper
-   vim calculate_power_spectra.py
-   pyyt calculate_power_spectra.py
-
-You will have to reference the datasets you want to analyze with either
-relative or absolute paths, but when you have completed your work, you can use
-the ``hub_submit`` command (see :ref:`command-line`) to (if necessary) create a
-repository and submit to the `yt Hub <http://hub.yt-project.org/>`_.

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/interacting/sharing_data_hub.rst
--- a/source/interacting/sharing_data_hub.rst
+++ /dev/null
@@ -1,108 +0,0 @@
-Sharing Data on the yt Hub
-==========================
-
-The yt data hub is a mechanism by which images, data objects and projects can
-be shared with other people.  For instance, one can upload projections and
-browse them with an interface similar to Google Maps.
-
-.. note:: All items posted on the hub are public!
-
-Over time, more widgets will be added, and more datatypes will be able to be
-uploaded.  If you are interested in adding more ways of sharing data, please
-email the developers' list.  We would like to add support for 3D widgets such
-as isocontours as well as interactive binning and rebinning of data from yt
-data objects, to be displayed as phase plots and profiles.
-
-Registering a User
-------------------
-
-Because of problems with spammers, registering a user can only be done from the
-yt command line.  Once you have registered a user, you can log on to the
-website and obtain an API key.
-
-To register a user:
-
-.. code-block:: bash
-
-   $ yt hub_register
-
-This will walk you through the process of registering.  You will need to supply
-a name, a username, a password and an email address.  Once you have gotten that
-out of the way, you can go to http://hub.yt-project.org/login and log in with
-your new password.  You can then receive your API key by clicking on your
-username in the upper left.
-
-After you have gotten your API key, place it in your ``~/.yt/config`` file:
-
-.. code-block:: none
-
-   [yt]
-   hub_api_key = 3fd8de56c2884c13a2de4dd51a80974b
-
-Replace ``3fd8de56c2884c13a2de4dd51a80974b`` with your API key.  At this point,
-you're ready to go!
-
-What Can Be Uploaded
---------------------
-
-Currently, the yt hub can accept these types of data:
-
- * Projects and script repositories: these will be displayed with an optional
-   image, a description, and a link to the source repository.
- * Projections and Slices: these will be displayed in a maps-like interface,
-   for interactive panning and zooming
- * Plot collections: these will be displayed as a list of images
-
-How to Upload Data
-------------------
-
-Uploading data takes place inside scripts.  For the most part, it is relatively
-simple to do: you construct the object you would like to share, and then you
-upload it.
-
-Uploading Projects
-~~~~~~~~~~~~~~~~~~
-
-For information on how to share a project or a set of scripts, see
-:ref:`share-your-scripts`.
-
-Uploading Projections and Slices
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Projections and slices both have a ``hub_upload`` method.  Here is an example
-of uploading a projection:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
-   proj = pf.h.proj(0, "Density", weight="Density")
-   proj.hub_upload()
-
-Here is an example of uploading a slice:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("JHK-DD0030/galaxy0030")
-   sl = pf.h.slice(0, 0.5, fields=["Density"])
-   sl.hub_upload()
-
-Uploading Plot Collections
-~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Plot collections can be uploaded and viewed as a selection of images.  To
-upload a plot collection, call ``hub_upload`` on the plot collection.
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("DD0252/DD0252")
-   pc = PlotCollection(pf, 'c')
-   pc.add_projection("Density", 0)
-   pc.add_slice("Temperature", 1)
-   pc.add_profile_sphere(0.2, 'unitary', ["Density", "Temperature"])
-   dd = pf.h.all_data()
-   pc.add_phase_object(dd, ["Density", "Temperature", "CellMassMsun"],
-                       weight=None)
-   pc.hub_upload()

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/orientation/first_steps.rst
--- a/source/orientation/first_steps.rst
+++ /dev/null
@@ -1,235 +0,0 @@
-.. _first_steps:
-
-First Steps with yt
--------------------
-
-Starting Up yt
-++++++++++++++
-
-"Starting up" yt is a bit of a misnomer.  What we're going to do is actually
-start up Python, and from there, we'll load up yt as a library, like we did for
-NumPy earlier.  yt provides a primary entry point (yt.mods), through which we
-are attempting to ensure backwards compatibility for quite some time to come.
-
-At your shell prompt, type:
-
-.. code-block:: bash
-
-   $ cd $YT_DEST/src/yt-hg/tests
-   $ python
-
-Once Python has started up, we will load yt with this command::
-
-   >>> from yt.mods import *
-
-Now all of the primary functions and data objects that yt provides will be
-available to you.  We now use the ``load`` function, which accepts a filename
-and then attempts to guess the file type.  If it's able to figure out what kind
-of simulation output it is, it will parse the parameter file to determine a
-number of components, and then return to you an object.  The convention in the
-yt documentation is to call this object ``pf`` but you can call it whatever you
-like.  We'll load up the dataset that comes with yt::
-
-   >>> pf = load("DD0010/moving7_0010")
-
-The parameter file itself has some useful information in it, but it's also
-designed to be quite lightweight.  This comes in handy with very large
-simulations, where you may wish to know the current time of the simulation, or
-some other parameter available in the parameter file, but you don't necessarily
-want yt to parse the full hierarchy of grids and create hundreds, if not
-hundreds of thousands, of objects.  For instance, you can get some basic
-information out::
-
-   >>> print pf.domain_dimensions
-
-This shows you the dimensions of the domain, in terms of the coarsest mesh that
-covers the domain.  (In Enzo this is called the 'root grid.')
-
-To access information about the fluid quantities in the simulation, we rely on
-the "hierarchy" object.  This hierarchy object contains the entire geometry of
-the simulation: all of the grid patches, their parentage relationships, and the
-fluid states of those grids.  When you first access the hierarchy object, yt
-will construct in memory a representation of that geometry, determine the
-available (on-disk and yt-calculated) fluid quantities, and then create a
-number of objects in memory to represent them.  We will now
-ask yt for the hierarchy object from our sample dataset::
-
-   >>> pf.hierarchy
-
-You can use a shorthand for ``hierarchy``, as well::
-
-   >>> pf.h
-
-The first thing we can do is ask yt for some statistics about our simulation,
-namely, how many grid patches and cells we have at each level, and a
-little bit about the smallest grid resolution element.  This is through the
-``print_stats`` function that the hierarchy provides::
-
-   >>> pf.h.print_stats()
-
-As you can see, this simulation has only a handful of grids, but it does have 7
-levels of refinement past the coarsest level.  The hierarchy object has a
-number of other properties that describe the field state of the gas.  For
-instance, we can find out which fields exist on disk for a given simulation
-output::
-
-   >>> print pf.h.field_list
-
-Additionally, yt will attempt to guess which fields it can generate from the
-simulation, so-called "derived fields."  This process occurs when the data is
-loaded, so it may not be a complete listing.::
-
-   >>> print pf.h.derived_field_list
-
-Finally, the hierarchy function that I personally use the most is
-the ability to find the maximum value of a given field in a simulation.  This
-function returns both the value and the location of the maximum::
-
-   >>> value, location = pf.h.find_max("Density")
-
-.. _grid_inspection:
-
-Grid Inspection
-+++++++++++++++
-
-This section is optional, and can be skipped.
-
-Before we move on to generalized data selection, it's worthwhile to examine
-individual grid patches.  This can be useful for debugging as well as for
-detailed inspection.  While there are several ways to select grids on a given
-level, or within a given region, we'll simply look at grids selected just by
-their index.  The hierarchy object possesses an array of grids::
-
-   >>> print pf.h.grids
-
-This grid array can be indexed, and we will choose to examine the first grid in
-that array::
-
-   >>> my_grid = pf.h.grids[0]
-
-Each GridPatch object has a number of attributes we can examine.  To begin
-with, it knows its dimensions and its position within the simulation domain::
-
-   >>> print my_grid.Level
-   >>> print my_grid.ActiveDimensions
-   >>> print my_grid.LeftEdge
-   >>> print my_grid.RightEdge
-   >>> print my_grid.dds
-
-It also has information about its "parent" grid -- which in some simulation
-codes will in fact be a list of parents, within which it resides -- and about
-any higher-resolution grids that are contained in whole or in part within it.::
-
-   >>> print my_grid.Parent
-   >>> print my_grid.Children
-
-Each element in these arrays is another grid object -- each of which possesses
-the same set of attributes as the ``my_grid`` object.  For instance::
-
-   >>> print my_grid.Children[0].LeftEdge
-   >>> print my_grid.Children[0].RightEdge
-
-Grid objects also respect the yt data access idiom.  We can request an array of
-the Density within a grid object.::
-
-   >>> print my_grid["Density"]
-
-The grid object also possesses information about which of its zones have been
-refined and are available at a finer resolution, within the ``child_mask``
-attribute.  This attribute is an array of 0's and 1's, which is set to 0
-everywhere that the grid has been further refined by one of the elements of
-the ``Children`` list.  You can, for instance, examine the refinement fraction
-of a grid using NumPy operations to multiply the dimensions and to sum the
-child mask::
-
-   >>> print my_grid.child_mask.sum() / float(my_grid.ActiveDimensions.prod())
-
-As you can see, only a small fraction of our grid has been refined!
-
-Data Containers and Data Selection
-++++++++++++++++++++++++++++++++++
-
-The hierarchy object, acting as the primary interface to the geometry of the
-simulation, is also the provider of all the yt-provided operators for cutting
-or selecting data.  When yt was first conceived, it was designed to be a very
-simple way to make slices and projections.  As time went on, it became clear
-that some degree of subselection of data was important, if not mandatory, and 
-so objects that selected data based on geometric bounds or fluid quantity
-values were added.
-
-The most straightforward object we can create is a sphere.  We'll center this
-sphere at the most dense point, which we found above, and we'll give it a
-radius of 10 kiloparsecs.::
-
-   >>> my_sphere = pf.h.sphere(location, 10.0/pf['kpc'])
-
-This function can accept a few more arguments, but this covers the essentials.
-We supply it a center and a radius.  The radius is specified in the units
-native to the simulation domain; this is not terribly useful, so we have used a
-unit conversion to convert from 10 kiloparsecs into the native simulation
-units.  You can do this with a number of different units (all of which are
-actually listed in the ``print_stats`` output) and the opposite (multiplication
-by the conversion factor) works for conversion from code units back to a
-physical unit.  
-
-There are a number of different objects that can be created, all of which are
-described in the documentation.  These include spheres, rectangular prisms,
-rays, slices, arbitrarily-aligned cylinders, and several others.  It is
-relatively simple to add a new type of data container, which has also been
-detailed in the documentation.
-
-We now have an object, ``my_sphere``, which functions as a data container.  We
-can access the data in the way described above::
-
-   >>> my_sphere["Density"]
-
-This will read all the data off disk (or generate it, if it's a derived field)
-and print out a representation of it.  It's then stored in the data object and
-can be accessed again without having to read it from disk.
-
-However, most operations in yt are designed so that the entire contents of the
-sphere do not have to be in memory.  For instance, to calculate the center of
-mass, one could imagine doing something like this::
-
-   >>> M_i = my_sphere["CellMassMsun"]
-   >>> M = M_i.sum()
-   >>> com_x = (my_sphere["x"] * M_i).sum()/M
-   >>> com_y = (my_sphere["y"] * M_i).sum()/M
-   >>> com_z = (my_sphere["z"] * M_i).sum()/M
-
-But for this to work, all of the arrays listed would have to be held in memory,
-even though the algorithm operates on each element individually.  Clearly, the
-mechanism described above simply won't work for very large data objects!  It
-works for our sphere because we have only a couple thousand points, but if
-we're looking at a galaxy cluster size halo in a high-resolution dataset, for
-instance, this may simply run the computer out of memory.
-
-To get past this, yt provides a mechanism for conducting operations on data
-containers that removes the need for both manual memory management and manual
-parallelism.  This functionality is rolled into the broad term "derived
-quantities" (search for this in the documentation for more information) but it
-can really be thought of as any operation that can be decomposed into a
-pre-calculation step and a reduction step.  For the center of mass, the first
-step is to calculate the values of ``M_i`` and ``M_i * x`` (as well as ``y``
-and ``z``) for each grid patch, then to sum all of these and conduct the
-division to get the overall center of mass.
-
-yt provides a number of pre-defined derived quantities, but you can also write
-your own.  For now, let's just take a look at a few of them.  For starters,
-there's the center of mass quantity.  We access these quantities from the
-``quantities`` object that hangs off every data container, like so::
-
-   >>> my_sphere.quantities["CenterOfMass"]()
-
-What this does is to access the derived quantity center of mass, and then call
-it.  I know it looks a little funny, with the empty parenthesis after the
-closing bracket, but this is necessary -- and while "CenterOfMass" doesn't take
-any arguments, some of the derived quantities do.  For instance, yt also
-provides a derived quantity for finding the extrema of a set of fields::
-
-   >>> print my_sphere.quantities["Extrema"](["x-velocity", "Density"])
-
-All of these operations, by default, will operate in a memory conservative
-fashion.  Additionally, because they work on a grid-by-grid basis, they can be
-transparently parallelized.  Discussing parallelism is outside the scope of
-this document, but it's discussed at some length in the yt documentation.

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/orientation/how_yt_thinks.rst
--- a/source/orientation/how_yt_thinks.rst
+++ /dev/null
@@ -1,57 +0,0 @@
-.. _how-yt-thinks-about-data:
-
-How yt Thinks About Data
-------------------------
-
-In this section, we'll briefly introduce the ideas behind how yt operates on
-and thinks about data internally.  For a more comprehensive discussion, see the
-`yt method paper <http://adsabs.harvard.edu/abs/2011ApJS..192....9T>`_,
-particularly Sections 2.1 and 2.2.
-
-When using yt, there are two main mechanisms for interacting with data: the
-first is to create simple plots, where yt makes a number of assumptions about
-how you want to process the data, and the more complicated version where you
-exert greater control over the selection, transformation and visualization of
-data.  This could involve choosing a subset of the simulation domain, or
-providing new ways of processing data fields written by the simulation code, or
-even manually visualizing data yourself without using any of the built-in yt
-mechanisms for visualization.  Furthermore, yt has been designed such that, if
-at all possible, all units are returned in cgs.
-
-Before we get into too much of that, it's worthwhile to mention that Python
-provides facilities for data access from objects in simple ways.  yt makes
-extensive use of this, and in fact every "data container" that yt provides
-allows you to treat it like a dictionary (as described above), which will lead
-to it either reading the appropriate data from the disk or generating that data
-in memory.  This sounds a bit complicated, but what it essentially means is
-that you can perform this operation::
-
-   >>> my_data_container["Density"]
-
-and all of the "Density" data contained within that data container will be read
-from disk, converted to CGS if necessary, and returned to you.
-
-What this kind of access leads to, particularly since the data is only read and
-assembled in memory when it is requested, is an idea that data objects are
-really just conduits.  Data flows through them from the disk to a processing
-step, and they are neutral to what that processing step is.
-
-Furthermore, yt is neutral to the type of field requested.  Simulation codes
-often only store the fluid quantities that cannot be regenerated: density,
-velocity, internal energy, etc.  However, from these quantities, other fluid
-quantities can be constructed.  Perhaps the simplest example would be that of a
-mass-fraction.  A simulation code such as Enzo may store the density of
-molecular hydrogen, but often during post-processing the more interesting
-quantity is the molecular hydrogen fraction; this can be generated trivially by
-dividing the molecular hydrogen density by the total fluid density.  yt is
-designed so that you can define functions that describe how to create a field,
-and then that field becomes "native" -- it will be accessible in the same
-manner as any field stored by the simulation.  yt also comes with a number of
-these fields pre-defined.
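-
-As a minimal sketch of what such a definition can look like (the field names
-here are Enzo-style, and the ``add_field`` call follows the yt 2.x
-interface)::
-
-   from yt.mods import *
-
-   def _H2Fraction(field, data):
-       # Molecular hydrogen density divided by the total density
-       return data["H2I_Density"] / data["Density"]
-
-   add_field("H2Fraction", function=_H2Fraction)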
-
-Some (admittedly larger and more complex) data visualization and analysis
-packages encourage or even require quite a lot of pipeline construction, using
-words like Filter and Process and so on.  With yt we've tried to avoid that
-kind of overhead -- creating pipes, dealing with zones, dealing with
-interpolation -- and instead stuck to simply creating fields, selecting data,
-and then processing it and saving out plots.

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/orientation/index.rst
--- a/source/orientation/index.rst
+++ /dev/null
@@ -1,34 +0,0 @@
-.. _orientation:
-
-Quickstart Guide
-================
-
-This quickstart guide to using yt begins by showing you how to install yt and
-its dependencies, shows you how to make some simple plots, and then moves to
-a brief explanation of how to use python and how yt's framework fits into that.
-Finally, it addresses how to ask questions of your data using yt.
-
-But, before getting too far in, here are a few resources that may come in
-handy along the way.  (These go from most-specific to least-specific!)
-
-    * `yt-users mailing list <http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_
-    * `Numpy docs <http://docs.numpy.org/>`_
-    * `Matplotlib docs <http://matplotlib.sf.net>`_
-    * `Python quickstart <http://docs.python.org/tutorial/>`_
-    * `Learn Python the Hard Way <http://learnpythonthehardway.org/index>`_
-    * `Byte of Python <http://www.swaroopch.com/notes/Python>`_
-    * `Dive Into Python <http://diveintopython.org>`_
-
-If you encounter any problems here, or anywhere in yt, please visit the 
-:ref:`asking-for-help` page to figure out a solution.
-
-.. toctree::
-   :maxdepth: 2
-
-   installing
-   simple_data_inspection
-   python_introduction
-   how_yt_thinks
-   first_steps
-   making_plots
-   where_to_go

diff -r 935aea735b47c303f9b8f879841166abc5910480 -r 64fe0d129a4d1f06feb5cb1b8a20aee03cca9794 source/orientation/making_plots.rst
--- a/source/orientation/making_plots.rst
+++ /dev/null
@@ -1,210 +0,0 @@
-Making Plots
-------------
-
-Slices
-^^^^^^
-
-Examining data by hand and looking at individual quantities one at a time can be
-interesting and productive, but yt also provides a set of visualization tools
-that you can use. We'll start by showing you how to make visualizations of
-slices and projections through your data.  We will then move on to demonstrate
-how to make analysis plots, including phase diagrams and profiles.
-
-The quickest way to plot a slice of a field through your data is to use
-:class:`~yt.visualization.plot_window.SlicePlot`.  Say we want to visualize a
-slice through the Density field along the z-axis centered on the center of the
-simulation box in a simulation dataset we've opened and stored in the parameter
-file object ``pf``.  This can be accomplished with the following command:
-
-.. code-block:: python
-
-   >>> slc = SlicePlot(pf, 'z', 'Density')
-   >>> slc.save()
-
-These two commands will create a slice object and store it in a variable we've
-called ``slc``.  We then call the ``save()`` function that is associated with
-the slice object.  This automatically saves the plot in png image format with an
-automatically generated filename.  If you don't want the slice object to stick
-around, you can accomplish the same thing in one line:
-
-.. code-block:: python
-   
-   >>> SlicePlot(pf, 'z', 'Density').save()
-
-It's nice to keep the slice object around if you want to modify the plot.  By
-default, the plot width will be set to the size of the simulation box.  To zoom
-in by a factor of ten, you can call the zoom function attached to the slice
-object:
-
-.. code-block:: python
-
-   >>> slc = SlicePlot(pf, 'z', 'Density')
-   >>> slc.zoom(10)
-   >>> slc.save('zoom')
-
-This will save a new plot to disk with a different filename -- prepended with
-'zoom' instead of the name of the parameter file. If you want to set the width
-manually, you can do that as well. For example, the following sequence of
-commands will create a slice, set the width of the plot to 10 kiloparsecs, and
-save it to disk.
-
-.. code-block:: python
-
-   >>> slc = SlicePlot(pf, 'z', 'Density')
-   >>> slc.set_width((10,'kpc'))
-   >>> slc.save('10kpc')
-
-The SlicePlot also optionally accepts the coordinate to center the plot on and
-the width of the plot:
-
-.. code-block:: python
-
-   >>> SlicePlot(pf, 'z', 'Density', center=[0.2, 0.3, 0.8], 
-   ...           width = (10,'kpc')).save()
-
-The center must be given in code units.  Optionally, you can supply 'c' or 'm'
-for the center.  These two choices will center the plot on the center of the
-simulation box and the coordinate of the maximum density cell, respectively.
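-
-For instance, combining the two, this one-liner centers a 10 kpc wide slice
-on the maximum density cell:
-
-.. code-block:: python
-
-   >>> SlicePlot(pf, 'z', 'Density', center='m', width=(10,'kpc')).save()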
-
-One can also use the SlicePlot to make annotated plots.  The following commands
-will create a slice, annotate it by marking the grid boundaries, and save the
-plot to disk:
-
-.. code-block:: python
-
-   >>> slc = SlicePlot(pf, 'z', 'Density')
-   >>> slc.annotate_grids()
-   >>> slc.save()
-
-There are a number of annotations available.  The full list is available in
-:ref:`callbacks`.
-
-Projections
-^^^^^^^^^^^^
-
-It can be limiting to only look at slices through 3D data.  In most cases, doing
-so discards the vast majority of the data.  For this reason, yt provides a
-simple interface for generating plots of projections through your data.  The
-interface for making projection plots,
-:class:`~yt.visualization.plot_window.ProjectionPlot` is very similar to
-``SlicePlot``, described above.  To create and save a plot of the projection of
-the density field through the z-axis of a dataset, centered on the center of the
-simulation box, do the following:
-
-.. code-block:: python
-
-   >>> ProjectionPlot(pf, 'z', 'Density').save()
-
-A ``ProjectionPlot`` can be created and modified with exactly the same keyword
-arguments as a ``SlicePlot``. For example, one can also adjust the width of
-the plot, either after creating the projection plot:
-
-.. code-block:: python
-
-   >>> prj = ProjectionPlot(pf, 'z', 'Density')
-   >>> prj.set_width((10,'kpc'))
-
-or while creating the projection in the first place:
-
-.. code-block:: python
-
-   >>> ProjectionPlot(pf, 'z', 'Density', width=(10,'kpc'))
-
-In addition, one can optionally supply a maximum level to project to; this is
-very useful for large datasets where projections can be costly:
-
-.. code-block:: python
-
-   >>> ProjectionPlot(pf, 'z', 'Density', max_level=10)
-
-as well as a field to weight the projection by.  The following example creates a
-map of the density-weighted mean temperature, projected along the z-axis:
-
-.. code-block:: python
-
-   >>> ProjectionPlot(pf, 'z', 'Temperature', weight_field='Density')
-
-PlotCollection
-^^^^^^^^^^^^^^
-
-To create profiles, yt supplies the ``PlotCollection``, an object designed to
-enable you to make several related plots all at once.  Originally, the idea was
-that yt would be used to make multiple plots of different fields, along
-different axes, all centered at the same point.  This has somewhat faded with
-time, but it still functions as a convenient way to set up a bunch of plots with
-only one or two commands.
-
-A plot collection is really defined by two things: the simulation output it
-will make plots from, and the "center" of the plot collection.  By default, the
-center is the place where all phase plots are centered, although
-there is some leeway on this.  We start by creating our plot collection.  The
-plot collection takes two arguments: the first is the parameter file (``pf``)
-we associate the plot collection with, and the second is our center.  Note that
-if you don't specify a center, yt will search the simulation for the most dense
-point and center the plot collection there.
-
-The yt convention is to call plot collection objects ``pc``, which we do here::
-
-   >>> pc = PlotCollection(pf, [0.5, 0.5, 0.5])
-
-We've chosen to center at [0.5, 0.5, 0.5], which for this simulation is the
-center of the domain.  We can now add a number of different types of
-visualizations to this plot collection, but we'll only look at a few.  
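-
-For instance (a sketch using the ``PlotCollection`` interface described here;
-run ``help(pc)`` to see the available ``add_*`` methods and their exact
-signatures), adding density slices along two axes looks like::
-
-   >>> pc.add_slice("Density", 0)    # slice along the x-axis
-   >>> pc.add_slice("Density", 2)    # slice along the z-axis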
-
-Phase Plots
-^^^^^^^^^^^
-
-Phase plots are pretty cool -- they take all the data inside a data container
-and they bin it with respect to two variables.  You can then have it calculate
-the average, or the sum, of some other quantity as a function of those two
-variables.
-
-This allows, for instance, the calculation of the average Temperature as a
-function of Density and velocity.  Or, it allows you to examine the distribution
-of all the mass as a function of Density and Temperature.  I have used phase plots to good
-effect to show the variation of chemical quantities as a function of spatial and
-angular distribution, for instance.  There are several ways to create a phase
-plot; we'll actually show the most flexible method, which uses data containers.
-There's a convenience function that takes a center and a radius and makes one
-from a sphere, too.
-
-Earlier we created the data object ``my_sphere``.  We'll use that object here::
-
-   >>> pc.add_phase_object(my_sphere, ["Density", "Temperature",
-   ...                                 "VelocityMagnitude"])
-
-This will calculate the average (weighted by cell mass, by default) velocity
-magnitude at every point in the sphere and plot it as a function of the local
-density and temperature.  This function has a number of options, which can be
-seen by calling ``help`` on it::
-
-   >>> help(pc.add_phase_object)
-
-As you can see, you can specify the number of bins, the boundaries for the
-histogram, and so on and so forth.  Of particular note is that we can also have
-it calculate the mass-distribution in a data object as a function of two
-variables, but to do that we need to tell yt specifically not to take the
-average.  We do that like this::
-
-   >>> pc.add_phase_object(my_sphere, ["Density", "Temperature",
-   ...                                 "CellMassMsun"], weight=None)
-
-When the weight is set to ``None``, it will only take the sum of all the
-values in a bin, rather than the average of those values.
-
-So now we've added four plots to our plot collection.  We can now operate on
-them in bulk -- although you will probably find it's much more useful to
-operate on image plots in bulk than on phase plots.  Our first task is to save
-them all out::
-
-   >>> pc.save("first_images")
-
-The way the ``save`` function works is that it accepts a prefix, and then every
-plot is saved out with that prefix.  Each plot's name is calculated from a
-combination of the type of plot and the fields in that plot.  For plots where
-many duplicates can be included, a counter is included too -- for instance,
-phase and profile plots.
-
-All of these commands can be run from a script -- which, in fact, is the
-approach I would personally encourage.  It will make it easier to produce plots
-repeatedly, without having to worry about a great deal of manual tweaking.
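-
-As a sketch of what such a script might look like (the dataset name is the
-Enzo example used elsewhere in this documentation, and the sphere radius is
-illustrative)::
-
-   from yt.mods import *
-
-   pf = load("DD0010/data0010")
-   my_sphere = pf.h.sphere([0.5, 0.5, 0.5], 0.1)   # radius in code units
-   pc = PlotCollection(pf, [0.5, 0.5, 0.5])
-   pc.add_phase_object(my_sphere, ["Density", "Temperature",
-                                   "VelocityMagnitude"])
-   pc.save("first_images")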

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt-doc/commits/6948242c1296/
Changeset:   6948242c1296
User:        ngoldbaum
Date:        2013-11-02 20:20:11
Summary:     Merged in jzuhone/yt-doc_chummels (pull request #2)

Re-organization of Loading Data docs
Affected #:  5 files

diff -r 29f4c3fe570e3429d165974e4f14657b84e7f5de -r 6948242c129603cdcf7bdf55eb33b48f94315fbf source/examining/Loading_Generic_Array_Data.ipynb
--- /dev/null
+++ b/source/examining/Loading_Generic_Array_Data.ipynb
@@ -0,0 +1,485 @@
+{
+ "metadata": {
+  "name": ""
+ },
+ "nbformat": 3,
+ "nbformat_minor": 0,
+ "worksheets": [
+  {
+   "cells": [
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Even if your data is not strictly related to fields commonly used in\n",
+      "astrophysical codes or your code is not supported yet, you can still feed it to\n",
+      "`yt` to use its advanced visualization and analysis facilities. The only\n",
+      "requirement is that your data can be represented as three-dimensional NumPy arrays with a consistent grid structure. What follows are some common examples of loading in generic array data that you may find useful. "
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Generic Unigrid Data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The simplest case is that of a single grid of data spanning the domain, with one or more fields. The data could be generated from a variety of sources; we'll just give three common examples:"
+     ]
+    },
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "Data generated \"on-the-fly\""
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The most common example is that of data that is generated in memory from the currently running script or notebook. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "from yt.imods import *\n",
+      "from yt.utilities.physical_constants import cm_per_kpc, cm_per_mpc"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this example, we'll just create a 3-D array of random floating-point data using NumPy:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "arr = np.random.random(size=(64,64,64))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "To load this data into `yt`, we need to assign it a field name, in this case \"Density\", and place it into a dictionary. Then, we call `load_uniform_grid`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "data = dict(Density = arr)\n",
+      "bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])\n",
+      "pf = load_uniform_grid(data, arr.shape, cm_per_mpc, bbox=bbox, nprocs=64)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`load_uniform_grid` takes the following arguments and optional keywords:\n",
+      "\n",
+      "* `data` : This is a dict of numpy arrays, where the keys are the field names.\n",
+      "* `domain_dimensions` : The domain dimensions of the unigrid\n",
+      "* `sim_unit_to_cm` : Conversion factor from simulation units to centimeters\n",
+      "* `bbox` : Size of computational domain in units sim_unit_to_cm\n",
+      "* `nprocs` : If greater than 1, will create this number of subarrays out of data\n",
+      "* `sim_time` : The simulation time in seconds\n",
+      "* `periodicity` : A tuple of booleans that determines whether the data will be treated as periodic along each axis\n",
+      "\n",
+      "This example creates a `yt`-native parameter file `pf` that will treat your array as a\n",
+      "density field in cubic domain of 3 Mpc edge size (3 * 3.0856e24 cm) and\n",
+      "simultaneously divide the domain into `nprocs` = 64 chunks, so that you can take advantage\n",
+      "of the underlying parallelism. "
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The resulting `pf` functions exactly like a parameter file from any other dataset--it can be sliced, and we can show the grid boundaries:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "slc = SlicePlot(pf, 2, [\"Density\"])\n",
+      "slc.set_cmap(\"Density\", \"Blues\")\n",
+      "slc.annotate_grids(cmap=None)\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Particle fields are detected as one-dimensional fields. The number of\n",
+      "particles is set by the `number_of_particles` key in\n",
+      "`data`. Particle fields are then added as one-dimensional arrays in\n",
+      "a similar manner as the three-dimensional grid fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "posx_arr = np.random.uniform(low=-1.5, high=1.5, size=10000)\n",
+      "posy_arr = np.random.uniform(low=-1.5, high=1.5, size=10000)\n",
+      "posz_arr = np.random.uniform(low=-1.5, high=1.5, size=10000)\n",
+      "data = dict(Density = np.random.random(size=(64,64,64)), \n",
+      "            number_of_particles = 10000,\n",
+      "            particle_position_x = posx_arr, \n",
+      "\t        particle_position_y = posy_arr,\n",
+      "\t        particle_position_z = posz_arr)\n",
+      "bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])\n",
+      "pf = load_uniform_grid(data, data[\"Density\"].shape, cm_per_mpc, bbox=bbox, nprocs=4)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this example only the particle position fields have been assigned. `number_of_particles` must be the same size as the particle\n",
+      "arrays. If no particle arrays are supplied then `number_of_particles` is assumed to be zero. Take a slice, and overlay particle positions:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "slc = SlicePlot(pf, \"z\", [\"Density\"])\n",
+      "slc.set_cmap(\"Density\", \"Blues\")\n",
+      "slc.annotate_particles(0.25, p_size=12.0, col=\"Red\")\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "HDF5 data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "HDF5 is a convenient format to store data. If you have unigrid data stored in an HDF5 file, it is possible to load it into memory and then use `load_uniform_grid` to get it into `yt`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import h5py\n",
+      "f = h5py.File(os.environ[\"YT_DATA_DIR\"]+\"/UnigridData/turb_vels.h5\", \"r\") # Read-only access to the file"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The HDF5 file handle's keys correspond to the datasets stored in the file:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "print f.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can iterate over the items in the file handle to get the data into a dictionary, which we will then load:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "data = {k:v for k,v in f.items()}\n",
+      "bbox = np.array([[-0.5, 0.5], [-0.5, 0.5], [-0.5, 0.5]])\n",
+      "pf = load_uniform_grid(data, data[\"Density\"].shape, 250.*cm_per_kpc, bbox=bbox, nprocs=8, periodicity=(False,False,False))"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In this case, the data came from a simulation which was 250 kpc on a side. An example projection of two fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "prj = ProjectionPlot(pf, \"z\", [\"z-velocity\",\"Temperature\"], weight_field=\"Density\")\n",
+      "prj.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 3,
+     "metadata": {},
+     "source": [
+      "FITS image data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "The FITS file format is a common astronomical format for 2-D images, but it can store three-dimensional data as well. The [AstroPy](http://www.astropy.org) project has modules for FITS reading and writing, which were incorporated from the [PyFITS](http://www.stsci.edu/institute/software_hardware/pyfits) library."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "import astropy.io.fits as pyfits\n",
+      "# Or, just import pyfits if that's what you have installed"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Using `pyfits` we can open a FITS file. If we call `info()` on the file handle, we can figure out some information about the file's contents. The file in this example has a primary HDU (header-data-unit) with no data, and three HDUs with 3-D data. In this case, the data consists of three velocity fields:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "f = pyfits.open(os.environ[\"YT_DATA_DIR\"]+\"/UnigridData/velocity_field_20.fits.gz\")\n",
+      "f.info()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We can put it into a dictionary in the same way as before, but we slice the file handle `f` so that we don't use the `PrimaryHDU`. `hdu.name` is the field name and `hdu.data` is the actual data. We can check that we got the correct fields. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "data = {hdu.name.lower():hdu.data for hdu in f[1:]}\n",
+      "print data.keys()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Now we load the data into `yt`. This particular file doesn't have any coordinate information, but let's assume that the box size is a Mpc. Since these are velocity fields, we can overlay velocity vectors on slices, just as if we had loaded in data from a supported code. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load_uniform_grid(data, data[\"x-velocity\"].shape, cm_per_mpc)\n",
+      "slc = SlicePlot(pf, \"x\", [\"x-velocity\",\"y-velocity\",\"z-velocity\"])\n",
+      "slc.annotate_velocity()\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Generic AMR Data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "In a similar fashion to unigrid data, data gridded into rectangular patches at varying levels of resolution may also be loaded into `yt`. In this case, a list of grid dictionaries should be provided, with the requisite information about each grid's properties. This example sets up two grids: a top-level grid (`level == 0`) covering the entire domain and a subgrid at `level == 1`. "
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "grid_data = [\n",
+      "    dict(left_edge = [0.0, 0.0, 0.0],\n",
+      "         right_edge = [1.0, 1.0, 1.],\n",
+      "         level = 0,\n",
+      "         dimensions = [32, 32, 32]), \n",
+      "    dict(left_edge = [0.25, 0.25, 0.25],\n",
+      "         right_edge = [0.75, 0.75, 0.75],\n",
+      "         level = 1,\n",
+      "         dimensions = [32, 32, 32])\n",
+      "   ]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "We'll just fill each grid with random density data, with a scaling with the grid refinement level."
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "for g in grid_data: g[\"Density\"] = np.random.random(g[\"dimensions\"]) * 2**g[\"level\"]"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Particle fields are supported by adding 1-dimensional arrays to each `grid` and\n",
+      "setting the `number_of_particles` key in each `grid`'s dict. If a grid has no particles, set `number_of_particles = 0`, but the particle fields still have to be defined since they are defined elsewhere; set them to empty NumPy arrays:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "grid_data[0][\"number_of_particles\"] = 0 # Set no particles in the top-level grid\n",
+      "grid_data[0][\"particle_position_x\"] = np.array([]) # No particles, so set empty arrays\n",
+      "grid_data[0][\"particle_position_y\"] = np.array([]) \n",
+      "grid_data[0][\"particle_position_z\"] = np.array([])\n",
+      "grid_data[1][\"number_of_particles\"] = 1000\n",
+      "grid_data[1][\"particle_position_x\"] = np.random.uniform(low=0.25, high=0.75, size=1000)\n",
+      "grid_data[1][\"particle_position_y\"] = np.random.uniform(low=0.25, high=0.75, size=1000)\n",
+      "grid_data[1][\"particle_position_z\"] = np.random.uniform(low=0.25, high=0.75, size=1000)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "Then, call `load_amr_grids`:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "pf = load_amr_grids(grid_data, [32, 32, 32], 1.0)"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "`load_amr_grids` also takes the same keywords `bbox` and `sim_time` as `load_uniform_grid`. Let's take a slice:"
+     ]
+    },
+    {
+     "cell_type": "code",
+     "collapsed": false,
+     "input": [
+      "slc = SlicePlot(pf, \"z\", [\"Density\"])\n",
+      "slc.annotate_particles(0.25, p_size=15.0, col=\"Pink\")\n",
+      "slc.show()"
+     ],
+     "language": "python",
+     "metadata": {},
+     "outputs": []
+    },
+    {
+     "cell_type": "heading",
+     "level": 2,
+     "metadata": {},
+     "source": [
+      "Caveats for Loading Generic Array Data"
+     ]
+    },
+    {
+     "cell_type": "markdown",
+     "metadata": {},
+     "source": [
+      "* Units will be incorrect unless the data has already been converted to cgs.\n",
+      "* Particles may be difficult to integrate.\n",
+      "* Data must already reside in memory before loading it in to `yt`, whether it is generated at runtime or loaded from disk. \n",
+      "* Some functions may behave oddly, and parallelism will be disappointing or non-existent in most cases.\n",
+      "* No consistency checks are performed on the hierarchy\n",
+      "* Consistency between particle positions and grids is not checked; `load_amr_grids` assumes that particle positions associated with one grid are not bounded within another grid at a higher level, so this must be ensured by the user prior to loading the grid data. "
+     ]
+    }
+   ],
+   "metadata": {}
+  }
+ ]
+}
\ No newline at end of file

diff -r 29f4c3fe570e3429d165974e4f14657b84e7f5de -r 6948242c129603cdcf7bdf55eb33b48f94315fbf source/examining/generic_array_data.rst
--- /dev/null
+++ b/source/examining/generic_array_data.rst
@@ -0,0 +1,5 @@
+
+Loading Generic Array Data
+==========================
+
+.. notebook:: Loading_Generic_Array_Data.ipynb

diff -r 29f4c3fe570e3429d165974e4f14657b84e7f5de -r 6948242c129603cdcf7bdf55eb33b48f94315fbf source/examining/index.rst
--- a/source/examining/index.rst
+++ b/source/examining/index.rst
@@ -1,10 +1,11 @@
 Examining Data
 ==============
 
-How to examine a dataset on disk.
+How to examine datasets.
 
 .. toctree::
    :maxdepth: 2
 
-   loading_data
+   supported_frontends_data
+   generic_array_data
    low_level_inspection

diff -r 29f4c3fe570e3429d165974e4f14657b84e7f5de -r 6948242c129603cdcf7bdf55eb33b48f94315fbf source/examining/loading_data.rst
--- a/source/examining/loading_data.rst
+++ /dev/null
@@ -1,240 +0,0 @@
-.. _loading-data:
-
-Loading Data
-============
-
-This section contains information on how to load data into ``yt``, as well as
-some important caveats about different data formats.
-
-.. _loading-numpy-array:
-
-Generic Array Data
-------------------
-
-If you have a data format which is unsupported by the existing code frontends,
-you can still read your data into ``yt``.  The only requirement is that your 
-data can be represented as one or more uniform, three
-dimensional numpy arrays.  
-
-For example, let's say you have a 3D dataset of a density field in a cubical 
-volume which is 3 Mpc on a side.
-
-
-Assuming that you have managed to get your data
-into a numpy array called ``a``, you can now read it in using the following
-code:
-
-.. code-block:: python
-
-   from yt.frontends.stream.api import load_uniform_grid
-
-   data = dict(Density = a)
-   bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
-   pf = load_uniform_grid(data, a.shape, 3.08e24, bbox=bbox, nprocs=12)
-
-This will generate a ``yt``-native parameter file as ``pf``.  It will treat 
-your array as a density field in a cubic domain of 3 Mpc edge size (3 * 3.08e24 cm)
-and simultaneously divide the domain into 12 chunks, so that you can take advantage
-of the underlying parallelism.  If you want to use a field name other
-than ``Density``, feel free to do that here as well.  To disable parallelism,
-just don't set ``nprocs``.
-
-You can now use ``pf`` as though it were a normal parameter file like the 
-many examples in the Cookbook.
-
-Particle fields are detected as one-dimensional fields. The number of
-particles is set by the ``number_of_particles`` key in the ``data`` dictionary. 
-Particle fields are then added as one-dimensional arrays in a similar manner 
-as the three-dimensional grid fields.  So starting again with a 3D numpy array 
-``a`` that represents our Density field
-
-.. code-block:: python
-
-   from yt.frontends.stream.api import load_uniform_grid
-
-   data = dict(Density = a, 
-               number_of_particles = 1000000,
-               particle_position_x = posx_arr,
-               particle_position_y = posy_arr,
-               particle_position_z = posz_arr)
-   bbox = np.array([[-1.5, 1.5], [-1.5, 1.5], [-1.5, 1.5]])
-   pf = load_uniform_grid(data, a.shape, 3.08e24, bbox=bbox, nprocs=12)
-
-where in this example the particle position fields have been assigned.
-``number_of_particles`` must be the same size as the particle
-arrays. If no particle arrays are supplied then ``number_of_particles`` is assumed to be zero. 
-
-.. rubric:: Caveats
-
-* Units will be incorrect unless the data has already been converted to cgs.
-* Particles may be difficult to integrate.
-* Data must already reside in memory.
-
-.. _loading-enzo-data:
-
-Enzo Data
----------
-
-Enzo data is fully supported and cared for by Matthew Turk.  To load an Enzo
-dataset, you can use the ``load`` command provided by ``yt.mods`` and supply to
-it the parameter file name.  This would be the name of the output file, and it
-contains no extension.  For instance, if you have the following files:
-
-.. code-block:: none
-
-   DD0010/
-   DD0010/data0010
-   DD0010/data0010.hierarchy
-   DD0010/data0010.cpu0000
-   DD0010/data0010.cpu0001
-   DD0010/data0010.cpu0002
-   DD0010/data0010.cpu0003
-
-You would feed the ``load`` command the filename ``DD0010/data0010`` as
-mentioned.
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("DD0010/data0010")
-
-.. rubric:: Caveats
-
-* There are no major caveats for Enzo usage
-* Units should be correct, if you utilize standard unit-setting routines.  yt
-  will notify you if it cannot determine the units, although this
-  notification will be passive.
-* 2D and 1D data are supported, but the extraneous dimensions are set to be
-  of length 1.0
-
-.. _loading-orion-data:
-
-Orion Data
-----------
-
-Orion data is fully supported and cared for by Jeff Oishi.  This method should
-also work for CASTRO and MAESTRO data, which are cared for by Matthew Turk and
-Chris Malone, respectively.  To load an Orion dataset, you can use the ``load``
-command provided by ``yt.mods`` and supply to it the directory file name.
-**You must also have the ``inputs`` file in the base directory.**  For
-instance, if you were in a directory with the following files:
-
-.. code-block:: none
-
-   inputs
-   pltgmlcs5600/
-   pltgmlcs5600/Header
-   pltgmlcs5600/Level_0
-   pltgmlcs5600/Level_0/Cell_H
-   pltgmlcs5600/Level_1
-   pltgmlcs5600/Level_1/Cell_H
-   pltgmlcs5600/Level_2
-   pltgmlcs5600/Level_2/Cell_H
-   pltgmlcs5600/Level_3
-   pltgmlcs5600/Level_3/Cell_H
-   pltgmlcs5600/Level_4
-   pltgmlcs5600/Level_4/Cell_H
-
-You would feed it the filename ``pltgmlcs5600``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("pltgmlcs5600")
-
-.. rubric:: Caveats
-
-* There are no major caveats for Orion usage
-* Star particles are not supported at the current time
-
-.. _loading-flash-data:
-
-FLASH Data
-----------
-
-FLASH HDF5 data is *mostly* supported and cared for by John ZuHone.  To load a
-FLASH dataset, you can use the ``load`` command provided by ``yt.mods`` and
-supply to it the file name of a plot file or checkpoint file, but particle
-files are not currently directly loadable by themselves, due to the
-fact that they typically lack grid information. For instance, if you were in a directory with
-the following files:
-
-.. code-block:: none
-
-   cosmoSim_coolhdf5_chk_0026
-
-You would feed it the filename ``cosmoSim_coolhdf5_chk_0026``:
-
-.. code-block:: python
-
-   from yt.mods import *
-   pf = load("cosmoSim_coolhdf5_chk_0026")
-
-If you have a FLASH particle file that was created at the same time as
-a plotfile or checkpoint file (therefore having particle data
-consistent with the grid structure of the latter), its data may be loaded with the
-``particle_filename`` optional argument:
-
-.. code-block:: python
-
-    from yt.mods import *
-    pf = load("radio_halo_1kpc_hdf5_plt_cnt_0100", particle_filename="radio_halo_1kpc_hdf5_part_0100")
-
-.. rubric:: Caveats
-
-* Please be careful that the units are correctly utilized; yt assumes cgs
-* Velocities and length units will be scaled to comoving coordinates if yt is
-  able to discern you are examining a cosmology simulation; particle and grid
-  positions will not be.
-* Domains may be visualized assuming periodicity.
-
-.. loading-amr-data:
-
-Generic AMR Data
-----------------
-
-It is possible to create a native ``yt`` parameter file from a list of Python
-dictionaries, each describing a rectangular patch of data, at possibly varying
-resolution.
-
-.. code-block:: python
-
-   from yt.frontends.stream.api import load_amr_grids
-
-   grid_data = [
-       dict(left_edge = [0.0, 0.0, 0.0],
-            right_edge = [1.0, 1.0, 1.],
-            level = 0,
-            dimensions = [32, 32, 32],
-            number_of_particles = 0),
-       dict(left_edge = [0.25, 0.25, 0.25],
-            right_edge = [0.75, 0.75, 0.75],
-            level = 1,
-            dimensions = [32, 32, 32],
-            number_of_particles = 0)
-   ]
-  
-   for g in grid_data:
-       g["Density"] = np.random.random(g["dimensions"]) * 2**g["level"]
-  
-   pf = load_amr_grids(grid_data, [32, 32, 32], 1.0)
-
-Particle fields are supported by adding 1-dimensional arrays and
-setting the ``number_of_particles`` key in each ``grid``'s dict:
-
-.. code-block:: python
-
-    for g in grid_data:
-        g["number_of_particles"] = 100000
-        g["particle_position_x"] = np.random.random((g["number_of_particles"]))
-
-.. rubric:: Caveats
-
-* Units will be incorrect unless the data has already been converted to cgs.
-* Some functions may behave oddly, and parallelism will be disappointing or
-  non-existent in most cases.
-* No consistency checks are performed on the hierarchy
-* Data must already reside in memory.
-* Consistency between particle positions and grids is not checked;
-  ``load_amr_grids`` assumes that particle positions associated with one grid are
-  not bounded within another grid at a higher level, so this must be
-  ensured by the user prior to loading the grid data. 

diff -r 29f4c3fe570e3429d165974e4f14657b84e7f5de -r 6948242c129603cdcf7bdf55eb33b48f94315fbf source/examining/supported_frontends_data.rst
--- /dev/null
+++ b/source/examining/supported_frontends_data.rst
@@ -0,0 +1,301 @@
+.. _loading-data-from-supported-codes:
+
+Loading Data from Supported Codes
+=================================
+
+This section contains information on how to load data into ``yt`` from
+supported codes, as well as some important caveats about different
+data formats.
+
+.. _loading-enzo-data:
+
+Enzo Data
+---------
+
+Enzo data is fully supported and cared for by Matthew Turk.  To load an Enzo
+dataset, you can use the ``load`` command provided by ``yt.mods`` and supply to
+it the parameter file name.  This would be the name of the output file, and it
+contains no extension.  For instance, if you have the following files:
+
+.. code-block:: none
+
+   DD0010/
+   DD0010/data0010
+   DD0010/data0010.hierarchy
+   DD0010/data0010.cpu0000
+   DD0010/data0010.cpu0001
+   DD0010/data0010.cpu0002
+   DD0010/data0010.cpu0003
+
+You would feed the ``load`` command the filename ``DD0010/data0010`` as
+mentioned.
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("DD0010/data0010")
+
+.. rubric:: Caveats
+
+* There are no major caveats for Enzo usage
+* Units should be correct, if you utilize standard unit-setting routines.  yt
+  will notify you if it cannot determine the units, although this
+  notification will be passive.
+* 2D and 1D data are supported, but the extraneous dimensions are set to be
+  of length 1.0
+
+.. _loading-orion-data:
+
+Orion Data
+----------
+
+Orion data is fully supported and cared for by Jeff Oishi.  This method should
+also work for CASTRO and MAESTRO data, which are cared for by Matthew Turk and
+Chris Malone, respectively.  To load an Orion dataset, you can use the ``load``
+command provided by ``yt.mods`` and supply to it the plotfile directory name.
+**You must also have the ``inputs`` file in the base directory.**  For
+instance, if you were in a directory with the following files:
+
+.. code-block:: none
+
+   inputs
+   pltgmlcs5600/
+   pltgmlcs5600/Header
+   pltgmlcs5600/Level_0
+   pltgmlcs5600/Level_0/Cell_H
+   pltgmlcs5600/Level_1
+   pltgmlcs5600/Level_1/Cell_H
+   pltgmlcs5600/Level_2
+   pltgmlcs5600/Level_2/Cell_H
+   pltgmlcs5600/Level_3
+   pltgmlcs5600/Level_3/Cell_H
+   pltgmlcs5600/Level_4
+   pltgmlcs5600/Level_4/Cell_H
+
+You would feed it the filename ``pltgmlcs5600``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("pltgmlcs5600")
+
+.. rubric:: Caveats
+
+* There are no major caveats for Orion usage
+* Star particles are not supported at the current time
+
+.. _loading-flash-data:
+
+FLASH Data
+----------
+
+FLASH HDF5 data is *mostly* supported and cared for by John ZuHone.  To load a
+FLASH dataset, you can use the ``load`` command provided by ``yt.mods`` and
+supply to it the file name of a plot file or checkpoint file, but particle
+files are not currently directly loadable by themselves, due to the
+fact that they typically lack grid information. For instance, if you were in a directory with
+the following files:
+
+.. code-block:: none
+
+   cosmoSim_coolhdf5_chk_0026
+
+You would feed it the filename ``cosmoSim_coolhdf5_chk_0026``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("cosmoSim_coolhdf5_chk_0026")
+
+If you have a FLASH particle file that was created at the same time as
+a plotfile or checkpoint file (therefore having particle data
+consistent with the grid structure of the latter), its data may be loaded with the
+``particle_filename`` optional argument:
+
+.. code-block:: python
+
+    from yt.mods import *
+    pf = load("radio_halo_1kpc_hdf5_plt_cnt_0100", particle_filename="radio_halo_1kpc_hdf5_part_0100")
+
+.. rubric:: Caveats
+
+* Please be careful that the units are correctly utilized; yt assumes cgs
+* Velocities and length units will be scaled to comoving coordinates if yt is
+  able to discern you are examining a cosmology simulation; particle and grid
+  positions will not be.
+* Domains may be visualized assuming periodicity.
+
+<<<<<<< local
+.. _loading-ramses-data:
+
+RAMSES Data
+-----------
+
+RAMSES data enjoys preliminary support and is cared for by Matthew Turk.  If
+you are interested in taking a development or stewardship role, please contact
+him.  To load a RAMSES dataset, you can use the ``load`` command provided by
+``yt.mods`` and supply to it the ``info*.txt`` filename.  For instance, if you
+were in a directory with the following files:
+
+.. code-block:: none
+
+   output_00007
+   output_00007/amr_00007.out00001
+   output_00007/grav_00007.out00001
+   output_00007/hydro_00007.out00001
+   output_00007/info_00007.txt
+   output_00007/part_00007.out00001
+
+You would feed it the filename ``output_00007/info_00007.txt``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("output_00007/info_00007.txt")
+
+.. rubric:: Caveats
+
+* Please be careful that the units are correctly set!  This may not be the
+  case for RAMSES data
+* Upon instantiation of the hierarchy, yt will attempt to regrid the entire
+  domain to ensure minimum-coverage from a set of grid patches.  (This is
+  described in the yt method paper.)  This is a time-consuming process and it
+  has not yet been written to be stored between calls.
+* Particles are not supported
+* Parallelism will not be terribly efficient for large datasets
+* There may be occasional segfaults on multi-domain data, which do not
+  reflect errors in the calculation
+
+If you are interested in helping with RAMSES support, we are eager to hear from
+you!
+
+.. _loading-art-data:
+
+ART Data
+--------
+
+ART data enjoys preliminary support and is supported by Christopher Moody.
+Please contact the ``yt-dev`` mailing list if you are interested in using yt
+for ART data, or if you are interested in assisting with development of yt to
+work with ART data.
+
+At the moment, the ART octree is 'regridded' at each level to make the native
+octree look more like a mesh-based code. As a result, the initial outlay
+is about 60 seconds to grid octs onto a mesh. This will be improved in
+``yt-3.0``, where octs will be supported natively. 
+
+To load an ART dataset you can use the ``load`` command provided by 
+``yt.mods`` and pass it the gas mesh file. It will search for and attempt
+to find the complementary dark matter and stellar particle header and data 
+files. However, your simulations may not follow the same naming convention.
+
+So for example, a single snapshot might have a series of files looking like
+this:
+
+.. code-block:: none
+
+   10MpcBox_csf512_a0.300.d    #Gas mesh
+   PMcrda0.300.DAT             #Particle header
+   PMcrs0a0.300.DAT            #Particle data (positions,velocities)
+   stars_a0.300.dat            #Stellar data (metallicities, ages, etc.)
+
+The ART frontend tries to find the associated files matching the above, but
+if that fails you can specify ``file_particle_header``, ``file_particle_data``,
+and ``file_star_data`` in addition to specifying the gas mesh. You also have
+the option of gridding particles and assigning them onto the meshes.
+This process is in beta, and for the time being it's probably best to leave
+``do_grid_particles=False`` as the default.
+
+To speed up the loading of an ART file, you have a few options. You can turn 
+off the particles entirely by setting ``discover_particles=False``. You can
+also only grid octs up to a certain level, ``limit_level=5``, which is useful
+when debugging by artificially creating a 'smaller' dataset to work with.
+
+Finally, when stellar ages are computed we 'spread' the ages evenly within a
+smoothing window. By default this is turned on and set to 10Myr. To turn this 
+off you can set ``spread=False``, and you can tweak the age smoothing window
+by specifying the window in seconds, e.g. ``spread=1.0e7*365*24*3600`` for 10 Myr.
+
+.. code-block:: python
+    
+   from yt.mods import *
+
+   file = "/u/cmoody3/data/art_snapshots/SFG1/10MpcBox_csf512_a0.460.d"
+   pf = load(file,discover_particles=True,grid_particles=2,limit_level=3)
+   pf.h.print_stats()
+   dd=pf.h.all_data()
+   print np.sum(dd['particle_type']==0)
+
+In the above example code, the first line imports the standard yt functions,
+followed by defining the gas mesh file. It's loaded only through level 3, but
+grids particles on to meshes on level 2 and higher. Finally, we create a data
+container and ask it to gather the particle_type array. In this case ``type==0``
+is for the most highly-refined dark matter particle, and we print out how many
+high-resolution star particles we find in the simulation.  Typically, however,
+you shouldn't have to specify any keyword arguments to load in a dataset.
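+
+If you do need the age-smoothing options described above, a minimal sketch
+(using the ``spread`` keyword as described in this section; exact behavior
+may vary between versions) would be:
+
+.. code-block:: python
+
+   from yt.mods import *
+
+   # Disable the spreading of stellar ages entirely
+   pf = load("10MpcBox_csf512_a0.300.d", spread=False)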
+
+Athena Data
+-----------
+
+Athena 4.x VTK data is *mostly* supported and cared for by John
+ZuHone. Both uniform grid and SMR datasets are supported. 
+
+Loading Athena datasets is slightly different depending on whether
+your dataset came from a serial or a parallel run. If the data came
+from a serial run or you have joined the VTK files together using the
+Athena tool ``join_vtk``, you can load the data like this:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("kh.0010.vtk")
+
+The filename corresponds to the file on SMR level 0, whereas if there
+are multiple levels the corresponding files will be picked up
+automatically, assuming they are laid out in ``lev*`` subdirectories
+under the directory where the base file is located.
+
+For parallel datasets, yt assumes that they are laid out in
+directories named ``id*``, one for each processor number, each with
+``lev*`` subdirectories for additional refinement levels. To load this
+data, call ``load`` with the base file in the ``id0`` directory:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("id0/kh.0010.vtk")
+
+which will pick up all of the files in the different ``id*`` directories for
+the entire dataset. 
+
+yt works in cgs ("Gaussian") units, but Athena data is not
+normally stored in these units. If you would like to convert data to
+cgs units, you may supply conversions for length, time, and density to ``load``:
+
+.. code-block:: python
+
+   from yt.mods import *
+   pf = load("id0/cluster_merger.0250.vtk", 
+          parameters={"LengthUnits":3.0856e24,
+                               "TimeUnits":3.1557e13,"DensityUnits":1.67e-24)
+
+This means that the yt fields (e.g. ``Density``, ``x-velocity``,
+``Bx``) will be in cgs units, but the Athena fields (e.g.,
+``density``, ``velocity_x``, ``cell_centered_B_x``) will be in code
+units. 
+
+.. rubric:: Caveats
+
+* yt primarily works with primitive variables. If the Athena
+  dataset contains conservative variables, the yt primitive fields will be generated from the
+  conserved variables on disk. 
+* Domains may be visualized assuming periodicity.
+* Particle list data is currently unsupported.
+* In some parallel Athena datasets, it is possible for a grid from one
+  refinement level to overlap with more than one grid on the parent
+  level. This may result in unpredictable behavior for some analysis
+  or visualization tasks. 
+

Repository URL: https://bitbucket.org/yt_analysis/yt-doc/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.



More information about the yt-svn mailing list