[yt-svn] commit/yt: 20 new changesets

commits-noreply at bitbucket.org
Sat Aug 2 05:08:10 PDT 2014


20 new commits in yt:

https://bitbucket.org/yt_analysis/yt/commits/0b9c2308457a/
Changeset:   0b9c2308457a
Branch:      yt-3.0
User:        chummels
Date:        2014-07-31 16:53:26
Summary:     Correcting small docs bugs.
Affected #:  3 files

diff -r 63103665c6dc4656107d2f5d98c15efd43f6ad57 -r 0b9c2308457af528da61ffe5b96f6488fc1d8ddc doc/source/analyzing/analysis_modules/halo_analysis.rst
--- a/doc/source/analyzing/analysis_modules/halo_analysis.rst
+++ b/doc/source/analyzing/analysis_modules/halo_analysis.rst
@@ -7,7 +7,7 @@
 and using the halo mass function.
 
 .. toctree::
-   :maxdepth: 1
+   :maxdepth: 2
 
    halo_transition
    halo_catalogs

diff -r 63103665c6dc4656107d2f5d98c15efd43f6ad57 -r 0b9c2308457af528da61ffe5b96f6488fc1d8ddc doc/source/analyzing/analysis_modules/halo_catalogs.rst
--- a/doc/source/analyzing/analysis_modules/halo_catalogs.rst
+++ b/doc/source/analyzing/analysis_modules/halo_catalogs.rst
@@ -241,4 +241,4 @@
 ----------------------------------------
 
 For a full example of how to use these methods together see 
-:ref:`halo_analysis_example`.
+:ref:`halo-analysis-example`.

diff -r 63103665c6dc4656107d2f5d98c15efd43f6ad57 -r 0b9c2308457af528da61ffe5b96f6488fc1d8ddc doc/source/analyzing/analysis_modules/synthetic_observation.rst
--- a/doc/source/analyzing/analysis_modules/synthetic_observation.rst
+++ b/doc/source/analyzing/analysis_modules/synthetic_observation.rst
@@ -5,13 +5,12 @@
 from simulation data.
 
 .. toctree::
-   :maxdepth: 1
+   :maxdepth: 2
 
    light_cone_generator
    light_ray_generator
    planning_cosmology_simulations
    absorption_spectrum
-   fitting_procedure
    star_analysis
    xray_emission_fields
    sunyaev_zeldovich


https://bitbucket.org/yt_analysis/yt/commits/a91dcdb2be21/
Changeset:   a91dcdb2be21
Branch:      yt-3.0
User:        chummels
Date:        2014-07-31 16:53:45
Summary:     Merging.
Affected #:  4 files

diff -r 0b9c2308457af528da61ffe5b96f6488fc1d8ddc -r a91dcdb2be21f7e4b58f09fcf8a150d830da3807 doc/source/_static/custom.css
--- a/doc/source/_static/custom.css
+++ b/doc/source/_static/custom.css
@@ -7,6 +7,13 @@
     margin-left: 30px;
 }
 
+/*
+
+Collapse the navbar when its width is less than 1200 pixels.  This may need to
+be adjusted if the navbar menu changes.
+
+*/
+
 @media (max-width: 1200px) {
     .navbar-header {
         float: none;
@@ -34,15 +41,47 @@
     }
 }
 
+/* 
+
+Sphinx code literals conflict with the notebook code tag, so we special-case
+literals that are inside text.
+
+*/
+
 p code {
     color:  #d14;    
     white-space: nowrap;
 }
 
-tbody td.label {
-    color: #000000;
+/*
+
+Nicer, controllable formatting for tables that have multi-line headers.
+
+*/
+
+th.head {
+    white-space: pre;
 }
 
+/*
+
+labels have a crappy default color that is almost invisible in our doc theme so
+we use a darker color.
+
+*/
+
+.label {
+    color: #333333;
+}
+
+/*
+
+Hack to prevent internal link targets being positioned behind the navbar.
+
+See: https://github.com/twbs/bootstrap/issues/1768
+
+*/
+
 *[id]:before { 
   display: block; 
   content: " "; 

diff -r 0b9c2308457af528da61ffe5b96f6488fc1d8ddc -r a91dcdb2be21f7e4b58f09fcf8a150d830da3807 doc/source/analyzing/analysis_modules/halo_catalogs.rst
--- a/doc/source/analyzing/analysis_modules/halo_catalogs.rst
+++ b/doc/source/analyzing/analysis_modules/halo_catalogs.rst
@@ -9,8 +9,8 @@
 In yt 3.0, operations relating to the analysis of halos (halo finding,
 merger tree creation, and individual halo analysis) are all brought 
 together into a single framework. This framework is substantially
-different from the limited framework included in yt-2.x and is only 
-backwards compatible in that output from old halo finders may be loaded.
+different from the halo analysis machinery available in yt-2.x and is 
+entirely backward incompatible.  
 For a direct translation of various halo analysis tasks using yt-2.x
 to yt-3.0 please see :ref:`halo-transition`.
 

diff -r 0b9c2308457af528da61ffe5b96f6488fc1d8ddc -r a91dcdb2be21f7e4b58f09fcf8a150d830da3807 doc/source/reference/command-line.rst
--- a/doc/source/reference/command-line.rst
+++ b/doc/source/reference/command-line.rst
@@ -67,6 +67,7 @@
     version             Get some information about the yt installation (this
                         is an alias for instinfo).
     load                Load a single dataset into an IPython instance
+    mapserver           Serve a plot in a GMaps-style interface
     pastebin            Post a script to an anonymous pastebin
     pastebin_grab       Print an online pastebin to STDOUT for local use.
     upload_notebook     Upload an IPython notebook to hub.yt-project.org.
@@ -192,6 +193,13 @@
 This will start the iyt interactive environment with your specified 
 dataset already loaded.  See :ref:`interactive-prompt` for more details.
 
+mapserver
++++++++++
+
+Ever wanted to interact with your data using the 
+`google maps <http://maps.google.com/>`_ interface?  Now you can by using the
+yt mapserver.  See :ref:`mapserver` for more details.
+
 pastebin and pastebin_grab
 ++++++++++++++++++++++++++
 

diff -r 0b9c2308457af528da61ffe5b96f6488fc1d8ddc -r a91dcdb2be21f7e4b58f09fcf8a150d830da3807 yt/utilities/command_line.py
--- a/yt/utilities/command_line.py
+++ b/yt/utilities/command_line.py
@@ -908,6 +908,46 @@
             sys.path.insert(0,'')
             IPython.embed(config=cfg,user_ns=local_ns)
 
+class YTMapserverCmd(YTCommand):
+    args = ("proj", "field", "weight",
+            dict(short="-a", longname="--axis", action="store", type=int,
+                 dest="axis", default=0, help="Axis (4 for all three)"),
+            dict(short ="-o", longname="--host", action="store", type=str,
+                   dest="host", default=None, help="IP Address to bind on"),
+            "ds",
+            )
+
+    name = "mapserver"
+    description = \
+        """
+        Serve a plot in a GMaps-style interface
+
+        """
+
+    def __call__(self, args):
+        ds = args.ds
+        if args.axis == 4:
+            print "Doesn't work with multiple axes!"
+            return
+        if args.projection:
+            p = ProjectionPlot(ds, args.axis, args.field, weight_field=args.weight)
+        else:
+            p = SlicePlot(ds, args.axis, args.field)
+        from yt.gui.reason.pannable_map import PannableMapServer
+        mapper = PannableMapServer(p.data_source, args.field)
+        import yt.extern.bottle as bottle
+        bottle.debug(True)
+        if args.host is not None:
+            colonpl = args.host.find(":")
+            if colonpl >= 0:
+                port = int(args.host.split(":")[-1])
+                args.host = args.host[:colonpl]
+            else:
+                port = 8080
+            bottle.run(server='rocket', host=args.host, port=port)
+        else:
+            bottle.run(server='rocket')
+
 
 class YTPastebinCmd(YTCommand):
     name = "pastebin"

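For context on the new subcommand, here is a minimal, hypothetical sketch of what ``yt mapserver`` does under the hood, simply mirroring the body of ``YTMapserverCmd`` above (the dataset path and field name are placeholders, not part of the changeset):

.. code-block:: python

   import yt
   from yt.visualization.plot_window import ProjectionPlot
   from yt.gui.reason.pannable_map import PannableMapServer
   import yt.extern.bottle as bottle

   ds = yt.load("my_data")  # placeholder: any dataset on disk
   # Build the projection that will be served in the GMaps-style interface.
   p = ProjectionPlot(ds, 0, "density", weight_field=None)
   # As in the command-line handler above: create the map server, then let
   # bottle serve it (the rocket server defaults to port 8080).
   mapper = PannableMapServer(p.data_source, "density")
   bottle.debug(True)
   bottle.run(server="rocket")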

https://bitbucket.org/yt_analysis/yt/commits/11162ffcd0ab/
Changeset:   11162ffcd0ab
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 06:35:46
Summary:     Merging.
Affected #:  4 files

diff -r a91dcdb2be21f7e4b58f09fcf8a150d830da3807 -r 11162ffcd0ab4822043fa06ea29598c4d1e04fb3 doc/source/_static/custom.css
--- a/doc/source/_static/custom.css
+++ b/doc/source/_static/custom.css
@@ -51,6 +51,9 @@
 p code {
     color:  #d14;    
     white-space: nowrap;
+    font-size: 90%;
+    background-color: #f9f2f4;
+    font-family: Menlo, Monaco, Consolas, 'Courier New', monospace;
 }
 
 /*
@@ -88,4 +91,4 @@
   margin-top: -45px; 
   height: 45px; 
   visibility: hidden; 
-}
\ No newline at end of file
+}

diff -r a91dcdb2be21f7e4b58f09fcf8a150d830da3807 -r 11162ffcd0ab4822043fa06ea29598c4d1e04fb3 doc/source/analyzing/fields.rst
--- a/doc/source/analyzing/fields.rst
+++ b/doc/source/analyzing/fields.rst
@@ -102,6 +102,48 @@
 for fields which are mesh-dependent, specifically particle masses in some
 cosmology codes.)
 
+.. _field_parameters:
+
+Field Parameters
+++++++++++++++++
+
+Certain fields require external information in order to be calculated.  For 
+example, the radius field has to be defined based on some point of reference 
+and the radial velocity field needs to know the bulk velocity of the data object 
+so that it can be subtracted.  This information is passed into a field function 
+by setting field parameters, which are user-specified data that can be associated 
+with a data object.  The 
+:meth:`~yt.data_objects.data_containers.YTDataContainer.set_field_parameter` 
+and 
+:meth:`~yt.data_objects.data_containers.YTDataContainer.get_field_parameter` 
+functions are 
+used to set and retrieve field parameter values for a given data object.  In the 
+cases above, the field parameters are ``center`` and ``bulk_velocity`` respectively -- 
+the two most commonly used field parameters.
+
+.. code-block:: python
+
+   ds = yt.load("my_data")
+   ad = ds.all_data()
+
+   ad.set_field_parameter("wickets", 13)
+
+   print ad.get_field_parameter("wickets")
+
+If a field parameter is not set, ``get_field_parameter`` will return None.  
+Within a field function, these can then be retrieved and used in the same way.
+
+.. code-block:: python
+
+   def _wicket_density(field, data):
+       n_wickets = data.get_field_parameter("wickets")
+       if n_wickets is None:
+           # use a default if unset
+           n_wickets = 88
+       return data["gas", "density"] * n_wickets
+
+For a practical application of this, see :ref:`cookbook-radial-velocity`.
+
 Field types known to yt
 +++++++++++++++++++++++
 
@@ -206,6 +248,44 @@
 that tracks the position and velocity (respectively) in code units.
 
 
+.. _deposited-particle-fields:
+
+Deposited Particle Fields
++++++++++++++++++++++++++
+
+In order to turn particle (discrete) fields into fields that are deposited in
+some regular, space-filling way (even if that space is empty, it is defined
+everywhere) yt provides mechanisms for depositing particles onto a mesh.  These
+are in the special field-type space ``deposit``, and are typically of the form
+``("deposit", "particletype_depositiontype")`` where ``depositiontype`` is the
+mechanism by which the field is deposited, and ``particletype`` is the particle
+type of the particles being deposited.  If you are attempting to examine the
+cloud-in-cell (``cic``) deposition of the ``all`` particle type, you would
+access the field ``("deposit", "all_cic")``.
+
+yt defines a few particular types of deposition internally, and creating new
+ones can be done by modifying the files ``yt/geometry/particle_deposit.pyx``
+and ``yt/fields/particle_fields.py``, although that is an advanced topic
+somewhat outside the scope of this section.  The default deposition types
+available are:
+
+ * ``count`` - this field counts the total number of particles of a given type
+   in a given mesh zone.  Note that because, in general, the mesh for particle
+   datasets is defined by the number of particles in a region, this may not be
+   the most useful metric.  This may be made more useful by depositing particle
+   data onto an :ref:`arbitrary-grid`.
+ * ``density`` - this field takes the total sum of ``particle_mass`` in a given
+   mesh field and divides by the volume.
+ * ``mass`` - this field takes the total sum of ``particle_mass`` in each mesh
+   zone.
+ * ``cic`` - this field performs cloud-in-cell interpolation (see `Section 2.2
+   <http://ta.twi.tudelft.nl/dv/users/Lemmens/MThesis.TTH/chapter4.html>`_ for more
+   information) of the density of particles in a given mesh zone.
+ * ``smoothed`` - this is a special deposition type.  See discussion below for
+   more information, in :ref:`sph-fields`.
+
+.. _sph-fields:
+
 SPH Fields
 ++++++++++
 
@@ -213,3 +293,31 @@
 a field for the smoothing length ``h``, which is roughly equivalent to 
 ``(m/\rho)^{1/3}``, where ``m`` and ``rho`` are the particle mass and density 
 respectively.  This can be useful for doing neighbour finding.
+
+As a note, SPH fields are special cases of the "deposited" particle fields.
+They contain an additional piece of information about what is being examined,
+and any fields that are recognized as being identical to intrinsic yt fields
+will be aliased.  For example, in a Gadget dataset, the smoothed density of
+``Gas`` particles will be aliased to the mesh field ``("gas", "density")`` so
+that operations conducted on the mesh field ``density`` (which are frequent
+occurrences) will operate on the smoothed gas density from the SPH particles.
+
+The special deposition types based on smoothing (``smoothed``) are defined in
+the file ``yt/geometry/particle_smooth.pyx``, and they require non-local
+operations defined on a variable number of neighbors.  The default smoothing
+type utilizes a cubic spline kernel and uses 64 nearest neighbors, providing a
+volume-normalized smoothing.  Other types are possible, and yt provides
+functionality for many different types of non-local correlation between
+particles.  (For instance, a friends-of-friends grouper has been built on this
+same infrastructure.)
+
+Every particle field on a smoothed particle type is the source for a smoothed
+field; this is not always useful, but it errs on the side of extra fields,
+rather than too few fields.  (For instance, it may be unlikely that the
+smoothed angular momentum field will be useful.)  The naming scheme is an
+extension of the scheme described in :ref:`deposited-particle-fields`, and is
+defined as such: ``("deposit", "particletype_smoothed_fieldname")``, where 
+``fieldname`` is the name of the field being smoothed.  For example, smoothed
+``Temperature`` of the ``Gas`` particle type would be ``("deposit",
+"Gas_smoothed_Temperature")``, which in most cases would be aliased to the
+field ``("gas", "temperature")`` for convenience.

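Taken together, the sections added above describe a handful of field access patterns; a short, hypothetical illustration follows (the dataset path, particle type, and which fields actually exist depend entirely on your data):

.. code-block:: python

   import yt

   ds = yt.load("my_data")  # placeholder path
   ad = ds.all_data()

   # Field parameters: user-specified data attached to a data object.
   ad.set_field_parameter("wickets", 13)
   print ad.get_field_parameter("wickets")   # 13; returns None if never set

   # Deposited particle fields: ("deposit", "particletype_depositiontype").
   print ad["deposit", "all_cic"]     # cloud-in-cell deposition of all particles
   print ad["deposit", "all_count"]   # particle counts per mesh zone

   # Smoothed SPH fields: ("deposit", "particletype_smoothed_fieldname"),
   # usually aliased to a plain mesh field such as ("gas", "temperature").
   print ad["deposit", "Gas_smoothed_Temperature"]
   print ad["gas", "temperature"]
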
diff -r a91dcdb2be21f7e4b58f09fcf8a150d830da3807 -r 11162ffcd0ab4822043fa06ea29598c4d1e04fb3 doc/source/cookbook/calculating_information.rst
--- a/doc/source/cookbook/calculating_information.rst
+++ b/doc/source/cookbook/calculating_information.rst
@@ -32,12 +32,16 @@
 
 .. yt_cookbook:: global_phase_plots.py
 
+.. _cookbook-radial-velocity:
+
 Radial Velocity Profile
 ~~~~~~~~~~~~~~~~~~~~~~~
 
 This recipe demonstrates how to subtract off a bulk velocity on a sphere before
 calculating the radial velocity within that sphere.
-See :ref:`how-to-make-1d-profiles` for more information.
+See :ref:`how-to-make-1d-profiles` for more information on creating profiles and 
+:ref:`field_parameters` for an explanation of how the bulk velocity is provided 
+to the radial velocity field function.
 
 .. yt_cookbook:: rad_velocity.py 
 
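A hedged sketch of the pattern this recipe and the new cross-references describe, namely computing a bulk velocity and handing it to the radial velocity field as a field parameter (the dataset path and sphere size are placeholders; the authoritative version is ``rad_velocity.py`` itself):

.. code-block:: python

   import yt

   ds = yt.load("my_data")               # placeholder path
   sp = ds.sphere("max", (10.0, "kpc"))  # sphere around the density peak

   # Subtract the sphere's bulk motion when computing radial velocities by
   # passing it in as the "bulk_velocity" field parameter.
   bulk = sp.quantities.bulk_velocity()
   sp.set_field_parameter("bulk_velocity", bulk)
   print sp["gas", "radial_velocity"]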

diff -r a91dcdb2be21f7e4b58f09fcf8a150d830da3807 -r 11162ffcd0ab4822043fa06ea29598c4d1e04fb3 doc/source/developing/intro.rst
--- a/doc/source/developing/intro.rst
+++ b/doc/source/developing/intro.rst
@@ -42,6 +42,9 @@
 Share Your Scripts
 ------------------
 
+.. warning:: The yt Hub is currently offline due to some hosting problems.  We
+             hope to have it back up online soon.
+
 The next easiest way to get involved with yt is to participate in the `yt Hub
 <http://hub.yt-project.org/>`_.  This is a place where scripts, paper
 repositories, documents and so on can be submitted to share with the broader


https://bitbucket.org/yt_analysis/yt/commits/cbde7854f606/
Changeset:   cbde7854f606
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 06:46:34
Summary:     Updating help file.
Affected #:  1 file

diff -r 11162ffcd0ab4822043fa06ea29598c4d1e04fb3 -r cbde7854f606bd74935fb06c4f6a17b227248e57 doc/source/help/index.rst
--- a/doc/source/help/index.rst
+++ b/doc/source/help/index.rst
@@ -1,6 +1,6 @@
 .. _asking-for-help:
 
-What to do if you run into problems
+What to Do If You Run into Problems
 ===================================
 
 If you run into problems with yt, there are a number of steps to follow
@@ -11,18 +11,18 @@
 
 To summarize, here are the steps in order:
 
- #. Don’t panic and don’t give up
- #. Update to the latest version
- #. Search the yt documentation and mailing list archives
- #. Look at the yt source
- #. Isolate & document your problem 
- #. Go on IRC and ask a question
- #. Ask the mailing list
- #. Submit a bug report
+#. Don’t panic and don’t give up
+#. Update to the latest version
+#. Search the yt documentation and mailing list archives
+#. Look at the yt source
+#. Isolate & document your problem 
+#. Go on IRC and ask a question
+#. Ask the mailing list
+#. Submit a bug report
 
 .. _dont-panic:
 
-Don't panic and don't give up
+Don't Panic and Don't Give up
 -----------------------------
 
 This may seem silly, but it's effective.  While yt is a robust code with
@@ -34,7 +34,7 @@
 
 .. _update-the-code:
 
-Try updating yt
+Try Updating yt
 ---------------
 
 Sometimes the pace of development is pretty fast on yt, particularly in the
@@ -55,7 +55,7 @@
 
 .. _search-the-documentation:
 
-Search the documentation and mailing lists
+Search the Documentation and Mailing Lists
 ------------------------------------------
 
 The documentation has a lot of the answers to everyday problems.  This doesn't 
@@ -84,7 +84,7 @@
 
 .. _look-at-the-source:
 
-Look at the source code
+Look at the Source Code
 -----------------------
 
 We've done our best to make the source clean, and it is easily searchable from 
@@ -125,7 +125,7 @@
 
 .. _isolate_and_document:
 
-Isolate and document your problem
+Isolate and Document Your Problem
 ---------------------------------
 
 As you gear up to take your question to the rest of the community, try to distill
@@ -133,15 +133,15 @@
 script.  This can help you (and us) to identify the basic problem.  Follow
 these steps:
 
- * Identify what it is that went wrong, and how you knew it went wrong.
- * Put your script, errors, and outputs online:
+* Identify what it is that went wrong, and how you knew it went wrong.
+* Put your script, errors, and outputs online:
 
-   * ``$ yt pastebin script.py`` - pastes script.py online
-   * ``$ yt upload_image image.png`` - pastes image online
+  * ``$ yt pastebin script.py`` - pastes script.py online
+  * ``$ yt upload_image image.png`` - pastes image online
 
- * Identify which version of the code you’re using. 
+* Identify which version of the code you’re using. 
 
-   * ``$ yt version`` - provides version information, including changeset hash
+  * ``$ yt version`` - provides version information, including changeset hash
 
 It may be that through the mere process of doing this, you end up solving 
 the problem!
@@ -162,7 +162,7 @@
 
 .. _mailing-list:
 
-Ask the mailing list
+Ask the Mailing List
 --------------------
 
 If you still haven't yet found a solution, feel free to 
@@ -183,7 +183,7 @@
 
 .. _reporting-a-bug:
 
-How To report A bug
+How to Report a Bug
 -------------------
 
 If you have gone through all of the above steps, and you're still encountering 


https://bitbucket.org/yt_analysis/yt/commits/721f1e7edc2e/
Changeset:   721f1e7edc2e
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 07:14:11
Summary:     Updating formatting in docs.
Affected #:  4 files

diff -r cbde7854f606bd74935fb06c4f6a17b227248e57 -r 721f1e7edc2edaee65903072ef759e1c26819a9c doc/source/developing/developing.rst
--- a/doc/source/developing/developing.rst
+++ b/doc/source/developing/developing.rst
@@ -23,28 +23,28 @@
 you're new to Mercurial, these three resources are pretty great for learning
 the ins and outs:
 
-   * http://hginit.com/
-   * http://hgbook.red-bean.com/read/
-   * http://mercurial.selenic.com/
+* `http://hginit.com/`_
+* `http://hgbook.red-bean.com/read/`_
+* `http://mercurial.selenic.com/`_
 
 The commands that are essential for using mercurial include:
 
-   * ``hg commit`` which commits changes in the working directory to the
-     repository, creating a new "changeset object."
-   * ``hg add`` which adds a new file to be tracked by mercurial.  This does
-     not change the working directory.
-   * ``hg pull`` which pulls (from an optional path specifier) changeset
-     objects from a remote source.  The working directory is not modified.
-   * ``hg push`` which sends (to an optional path specifier) changeset objects
-     to a remote source.  The working directory is not modified.
-   * ``hg log`` which shows a log of all changeset objects in the current
-     repository.  Use ``-g`` to show a graph of changeset objects and their
-     relationship.
-   * ``hg update`` which (with an optional "revision" specifier) updates the
-     state of the working directory to match a changeset object in the
-     repository.
-   * ``hg merge`` which combines two changesets to make a union of their lines
-     of development.  This updates the working directory.
+* ``hg commit`` which commits changes in the working directory to the
+  repository, creating a new "changeset object."
+* ``hg add`` which adds a new file to be tracked by mercurial.  This does
+  not change the working directory.
+* ``hg pull`` which pulls (from an optional path specifier) changeset
+  objects from a remote source.  The working directory is not modified.
+* ``hg push`` which sends (to an optional path specifier) changeset objects
+  to a remote source.  The working directory is not modified.
+* ``hg log`` which shows a log of all changeset objects in the current
+  repository.  Use ``-g`` to show a graph of changeset objects and their
+  relationship.
+* ``hg update`` which (with an optional "revision" specifier) updates the
+  state of the working directory to match a changeset object in the
+  repository.
+* ``hg merge`` which combines two changesets to make a union of their lines
+  of development.  This updates the working directory.
 
 Keep in touch, and happy hacking!  We also provide `doc/coding_styleguide.txt`
 and an example of a fiducial docstring in `doc/docstring_example.txt`.  Please
@@ -87,26 +87,26 @@
 <https://ytep.readthedocs.org/en/latest/YTEPs/YTEP-0008.html>`_ for more
 detail.)
 
-  * New Features
+* New Features
 
-    * New unit tests (possibly new answer tests) (See :ref:`testing`)
-    * Docstrings in the source code for the public API
-    * Addition of new feature to the narrative documentation (See :ref:`writing_documentation`)
-    * Addition of cookbook recipe (See :ref:`writing_documentation`) 
-    * Issue created on issue tracker, to ensure this is added to the changelog
+  * New unit tests (possibly new answer tests) (See :ref:`testing`)
+  * Docstrings in the source code for the public API
+  * Addition of new feature to the narrative documentation (See :ref:`writing_documentation`)
+  * Addition of cookbook recipe (See :ref:`writing_documentation`) 
+  * Issue created on issue tracker, to ensure this is added to the changelog
 
-  * Extension or Breakage of API in Existing Features
+* Extension or Breakage of API in Existing Features
 
-    * Update existing narrative docs and docstrings (See :ref:`writing_documentation`) 
-    * Update existing cookbook recipes (See :ref:`writing_documentation`) 
-    * Modify of create new unit tests (See :ref:`testing`)
-    * Issue created on issue tracker, to ensure this is added to the changelog
+  * Update existing narrative docs and docstrings (See :ref:`writing_documentation`) 
+  * Update existing cookbook recipes (See :ref:`writing_documentation`) 
+  * Modify of create new unit tests (See :ref:`testing`)
+  * Issue created on issue tracker, to ensure this is added to the changelog
 
-  * Bug fixes
+* Bug fixes
 
-    * Unit test is encouraged, to ensure breakage does not happen again in the
-      future. (See :ref:`testing`)
-    * Issue created on issue tracker, to ensure this is added to the changelog
+  * Unit test is encouraged, to ensure breakage does not happen again in the
+    future. (See :ref:`testing`)
+  * Issue created on issue tracker, to ensure this is added to the changelog
 
 When submitting, you will be asked to make sure that your changes meet all of
 these requirements.  They are pretty easy to meet, and we're also happy to help
@@ -122,22 +122,22 @@
 walk you through any troubles you might have.  Here are some suggestions
 for using mercurial with yt:
 
-  * Named branches are to be avoided.  Try using bookmarks (``hg bookmark``) to
-    track work.  (`More <http://mercurial.selenic.com/wiki/Bookmarks>`_)
-  * Make sure you set a username in your ``~/.hgrc`` before you commit any
-    changes!  All of the tutorials above will describe how to do this as one of
-    the very first steps.
-  * When contributing changes, you might be asked to make a handful of
-    modifications to your source code.  We'll work through how to do this with
-    you, and try to make it as painless as possible.
-  * Please avoid deleting your yt forks, as that eliminates the code review
-    process from BitBucket's website.
-  * In all likelihood, you only need one fork.  To keep it in sync, you can
-    sync from the website.  (See Bitbucket's `Blog Post
-    <http://blog.bitbucket.org/2013/02/04/syncing-and-merging-come-to-bitbucket/>`_
-    about this.)
-  * If you run into any troubles, stop by IRC (see :ref:`irc`) or the mailing
-    list.
+* Named branches are to be avoided.  Try using bookmarks (``hg bookmark``) to
+  track work.  (`More <http://mercurial.selenic.com/wiki/Bookmarks>`_)
+* Make sure you set a username in your ``~/.hgrc`` before you commit any
+  changes!  All of the tutorials above will describe how to do this as one of
+  the very first steps.
+* When contributing changes, you might be asked to make a handful of
+  modifications to your source code.  We'll work through how to do this with
+  you, and try to make it as painless as possible.
+* Please avoid deleting your yt forks, as that eliminates the code review
+  process from BitBucket's website.
+* In all likelihood, you only need one fork.  To keep it in sync, you can
+  sync from the website.  (See Bitbucket's `Blog Post
+  <http://blog.bitbucket.org/2013/02/04/syncing-and-merging-come-to-bitbucket/>`_
+  about this.)
+* If you run into any troubles, stop by IRC (see :ref:`irc`) or the mailing
+  list.
 
 .. _building-yt:
 
@@ -192,53 +192,53 @@
 
 The simplest way to submit changes to yt is to do the following:
 
-  * Build yt from the mercurial repository
-  * Navigate to the root of the yt repository 
-  * Make some changes and commit them
-  * Fork the `yt repository on BitBucket <https://bitbucket.org/yt_analysis/yt>`_
-  * Push the changesets to your fork
-  * Issue a pull request.
+* Build yt from the mercurial repository
+* Navigate to the root of the yt repository 
+* Make some changes and commit them
+* Fork the `yt repository on BitBucket <https://bitbucket.org/yt_analysis/yt>`_
+* Push the changesets to your fork
+* Issue a pull request.
 
 Here's a more detailed flowchart of how to submit changes.
 
-  #. If you have used the installation script, the source code for yt can be
-     found in ``$YT_DEST/src/yt-hg``.  Alternatively see
-     :ref:`source-installation` for instructions on how to build yt from the
-     mercurial repository. (Below, in :ref:`reading-source`, we describe how to
-     find items of interest.)  
-  #. Edit the source file you are interested in and
-     test your changes.  (See :ref:`testing` for more information.)
-  #. Fork yt on BitBucket.  (This step only has to be done once.)  You can do
-     this at: https://bitbucket.org/yt_analysis/yt/fork .  Call this repository
-     yt.
-  #. Commit these changes, using ``hg commit``.  This can take an argument
-     which is a series of filenames, if you have some changes you do not want
-     to commit.
-  #. If your changes include new functionality or cover an untested area of the
-     code, add a test.  (See :ref:`testing` for more information.)  Commit
-     these changes as well.
-  #. Push your changes to your new fork using the command::
+#. If you have used the installation script, the source code for yt can be
+   found in ``$YT_DEST/src/yt-hg``.  Alternatively see
+   :ref:`source-installation` for instructions on how to build yt from the
+   mercurial repository. (Below, in :ref:`reading-source`, we describe how to
+   find items of interest.)  
+#. Edit the source file you are interested in and
+   test your changes.  (See :ref:`testing` for more information.)
+#. Fork yt on BitBucket.  (This step only has to be done once.)  You can do
+   this at: https://bitbucket.org/yt_analysis/yt/fork .  Call this repository
+   yt.
+#. Commit these changes, using ``hg commit``.  This can take an argument
+   which is a series of filenames, if you have some changes you do not want
+   to commit.
+#. If your changes include new functionality or cover an untested area of the
+   code, add a test.  (See :ref:`testing` for more information.)  Commit
+   these changes as well.
+#. Push your changes to your new fork using the command::
 
-        hg push -r . https://bitbucket.org/YourUsername/yt/
+      hg push -r . https://bitbucket.org/YourUsername/yt/
  
-     If you end up doing considerable development, you can set an alias in the
-     file ``.hg/hgrc`` to point to this path.
-  #. Issue a pull request at
-     https://bitbucket.org/YourUsername/yt/pull-request/new
+   If you end up doing considerable development, you can set an alias in the
+   file ``.hg/hgrc`` to point to this path.
+#. Issue a pull request at
+   https://bitbucket.org/YourUsername/yt/pull-request/new
 
 During the course of your pull request you may be asked to make changes.  These
 changes may be related to style issues, correctness issues, or even requesting
 tests.  The process for responding to pull request code review is relatively
 straightforward.
 
-  #. Make requested changes, or leave a comment indicating why you don't think
-     they should be made.
-  #. Commit those changes to your local repository.
-  #. Push the changes to your fork::
+#. Make requested changes, or leave a comment indicating why you don't think
+   they should be made.
+#. Commit those changes to your local repository.
+#. Push the changes to your fork::
 
-        hg push https://bitbucket.org/YourUsername/yt/
+      hg push https://bitbucket.org/YourUsername/yt/
 
-  #. Your pull request will be automatically updated.
+#. Your pull request will be automatically updated.
 
 .. _writing_documentation:
 
@@ -261,14 +261,14 @@
 the yt mercurial repository).  It is organized hierarchically into the main
 categories of:
 
- * Visualizing
- * Analyzing
- * Examining
- * Cookbook
- * Bootcamp
- * Developing
- * Reference
- * Help
+* Visualizing
+* Analyzing
+* Examining
+* Cookbook
+* Bootcamp
+* Developing
+* Reference
+* Help
 
 You will have to figure out where your new/modified doc fits into this, but 
 browsing through the pre-built documentation is a good way to sort that out.
@@ -309,7 +309,7 @@
 --------------------------------------
 
 yt is hosted on BitBucket, and you can see all of the yt repositories at
-http://hg.yt-project.org/ .  With the yt installation script you should have a
+`http://hg.yt-project.org/`_ .  With the yt installation script you should have a
 copy of Mercurial for checking out pieces of code.  Make sure you have followed
 the steps above for bootstrapping your development (to assure you have a
 bitbucket account, etc.)
@@ -318,7 +318,7 @@
 main yt repository on bitbucket.  A fork is simply an exact copy of the main
 repository (along with its history) that you will now own and can make
 modifications as you please.  You can create a personal fork by visiting the yt
-bitbucket webpage at https://bitbucket.org/yt_analysis/yt/ .  After logging in,
+bitbucket webpage at `https://bitbucket.org/yt_analysis/yt/`_ .  After logging in,
 you should see an option near the top right labeled "fork".  Click this option,
 and then click the fork repository button on the subsequent page.  You now have
 a forked copy of the yt repository for your own personal modification.
@@ -380,54 +380,54 @@
 code is contained in the yt subdirectory.  This directory its self contains
 the following subdirectories:
 
-   ``frontends``
-      This is where interfaces to codes are created.  Within each subdirectory of
-      yt/frontends/ there must exist the following files, even if empty:
+``frontends``
+   This is where interfaces to codes are created.  Within each subdirectory of
+   yt/frontends/ there must exist the following files, even if empty:
 
-      * ``data_structures.py``, where subclasses of AMRGridPatch, Dataset
-        and AMRHierarchy are defined.
-      * ``io.py``, where a subclass of IOHandler is defined.
-      * ``fields.py``, where fields we expect to find in datasets are defined
-      * ``misc.py``, where any miscellaneous functions or classes are defined.
-      * ``definitions.py``, where any definitions specific to the frontend are
-        defined.  (i.e., header formats, etc.)
+   * ``data_structures.py``, where subclasses of AMRGridPatch, Dataset
+     and AMRHierarchy are defined.
+   * ``io.py``, where a subclass of IOHandler is defined.
+   * ``fields.py``, where fields we expect to find in datasets are defined
+   * ``misc.py``, where any miscellaneous functions or classes are defined.
+   * ``definitions.py``, where any definitions specific to the frontend are
+     defined.  (i.e., header formats, etc.)
 
-   ``fields``
-      This is where all of the derived fields that ship with yt are defined.
+``fields``
+   This is where all of the derived fields that ship with yt are defined.
 
-   ``geometry`` 
-      This is where geometric helpler routines are defined. Handlers
-      for grid and oct data, as well as helpers for coordinate transformations
-      can be found here.
+``geometry`` 
+   This is where geometric helpler routines are defined. Handlers
+   for grid and oct data, as well as helpers for coordinate transformations
+   can be found here.
 
-   ``visualization``
-      This is where all visualization modules are stored.  This includes plot
-      collections, the volume rendering interface, and pixelization frontends.
+``visualization``
+   This is where all visualization modules are stored.  This includes plot
+   collections, the volume rendering interface, and pixelization frontends.
 
-   ``data_objects``
-      All objects that handle data, processed or unprocessed, not explicitly
-      defined as visualization are located in here.  This includes the base
-      classes for data regions, covering grids, time series, and so on.  This
-      also includes derived fields and derived quantities.
+``data_objects``
+   All objects that handle data, processed or unprocessed, not explicitly
+   defined as visualization are located in here.  This includes the base
+   classes for data regions, covering grids, time series, and so on.  This
+   also includes derived fields and derived quantities.
 
-   ``analysis_modules``
-      This is where all mechanisms for processing data live.  This includes
-      things like clump finding, halo profiling, halo finding, and so on.  This
-      is something of a catchall, but it serves as a level of greater
-      abstraction that simply data selection and modification.
+``analysis_modules``
+   This is where all mechanisms for processing data live.  This includes
+   things like clump finding, halo profiling, halo finding, and so on.  This
+   is something of a catchall, but it serves as a level of greater
+   abstraction that simply data selection and modification.
 
-   ``gui``
-      This is where all GUI components go.  Typically this will be some small
-      tool used for one or two things, which contains a launching mechanism on
-      the command line.
+``gui``
+   This is where all GUI components go.  Typically this will be some small
+   tool used for one or two things, which contains a launching mechanism on
+   the command line.
 
-   ``utilities``
-      All broadly useful code that doesn't clearly fit in one of the other
-      categories goes here.
+``utilities``
+   All broadly useful code that doesn't clearly fit in one of the other
+   categories goes here.
 
-   ``extern`` 
-      Bundled external modules (i.e. code that was not written by one of
-      the yt authors but that yt depends on) lives here.
+``extern`` 
+   Bundled external modules (i.e. code that was not written by one of
+   the yt authors but that yt depends on) lives here.
 
 
 If you're looking for a specific file or function in the yt source code, use
@@ -457,72 +457,72 @@
 General Guidelines
 ++++++++++++++++++
 
- * In general, follow `PEP-8 <http://www.python.org/dev/peps/pep-0008/>`_ guidelines.
- * Classes are ConjoinedCapitals, methods and functions are
-   ``lowercase_with_underscores.``
- * Use 4 spaces, not tabs, to represent indentation.
- * Line widths should not be more than 80 characters.
- * Do not use nested classes unless you have a very good reason to, such as
-   requiring a namespace or class-definition modification.  Classes should live
-   at the top level.  ``__metaclass__`` is exempt from this.
- * Do not use unnecessary parentheses in conditionals.  ``if((something) and
-   (something_else))`` should be rewritten as ``if something and
-   something_else``.  Python is more forgiving than C.
- * Avoid copying memory when possible. For example, don't do ``a =
-   a.reshape(3,4)`` when ``a.shape = (3,4)`` will do, and ``a = a * 3`` should be
-   ``np.multiply(a, 3, a)``.
- * In general, avoid all double-underscore method names: ``__something`` is
-   usually unnecessary.
- * Doc strings should describe input, output, behavior, and any state changes
-   that occur on an object.  See the file `doc/docstring_example.txt` for a
-   fiducial example of a docstring.
+* In general, follow `PEP-8 <http://www.python.org/dev/peps/pep-0008/>`_ guidelines.
+* Classes are ConjoinedCapitals, methods and functions are
+  ``lowercase_with_underscores.``
+* Use 4 spaces, not tabs, to represent indentation.
+* Line widths should not be more than 80 characters.
+* Do not use nested classes unless you have a very good reason to, such as
+  requiring a namespace or class-definition modification.  Classes should live
+  at the top level.  ``__metaclass__`` is exempt from this.
+* Do not use unnecessary parentheses in conditionals.  ``if((something) and
+  (something_else))`` should be rewritten as ``if something and
+  something_else``.  Python is more forgiving than C.
+* Avoid copying memory when possible. For example, don't do ``a =
+  a.reshape(3,4)`` when ``a.shape = (3,4)`` will do, and ``a = a * 3`` should be
+  ``np.multiply(a, 3, a)``.
+* In general, avoid all double-underscore method names: ``__something`` is
+  usually unnecessary.
+* Doc strings should describe input, output, behavior, and any state changes
+  that occur on an object.  See the file `doc/docstring_example.txt` for a
+  fiducial example of a docstring.
 
 API Guide
 +++++++++
 
- * Do not import "*" from anything other than ``yt.funcs``.
- * Internally, only import from source files directly; instead of: ``from
-   yt.visualization.api import SlicePlot`` do
-   ``from yt.visualization.plot_window import SlicePlot``.
- * Numpy is to be imported as ``np``.
- * Do not use too many keyword arguments.  If you have a lot of keyword
-   arguments, then you are doing too much in ``__init__`` and not enough via
-   parameter setting.
- * In function arguments, place spaces before commas.  ``def something(a,b,c)``
-   should be ``def something(a, b, c)``.
- * Don't create a new class to replicate the functionality of an old class --
-   replace the old class.  Too many options makes for a confusing user
-   experience.
- * Parameter files external to yt are a last resort.
- * The usage of the ``**kwargs`` construction should be avoided.  If they
-   cannot be avoided, they must be explained, even if they are only to be
-   passed on to a nested function.
- * Constructor APIs should be kept as *simple* as possible.
- * Variable names should be short but descriptive.
- * No global variables!
+* Do not import "*" from anything other than ``yt.funcs``.
+* Internally, only import from source files directly; instead of: ``from
+  yt.visualization.api import SlicePlot`` do
+  ``from yt.visualization.plot_window import SlicePlot``.
+* Numpy is to be imported as ``np``.
+* Do not use too many keyword arguments.  If you have a lot of keyword
+  arguments, then you are doing too much in ``__init__`` and not enough via
+  parameter setting.
+* In function arguments, place spaces before commas.  ``def something(a,b,c)``
+  should be ``def something(a, b, c)``.
+* Don't create a new class to replicate the functionality of an old class --
+  replace the old class.  Too many options makes for a confusing user
+  experience.
+* Parameter files external to yt are a last resort.
+* The usage of the ``**kwargs`` construction should be avoided.  If they
+  cannot be avoided, they must be explained, even if they are only to be
+  passed on to a nested function.
+* Constructor APIs should be kept as *simple* as possible.
+* Variable names should be short but descriptive.
+* No global variables!
 
 Variable Names and Enzo-isms
 ++++++++++++++++++++++++++++
 
- * Avoid Enzo-isms.  This includes but is not limited to:
+* Avoid Enzo-isms.  This includes but is not limited to:
 
-   + Hard-coding parameter names that are the same as those in Enzo.  The
-     following translation table should be of some help.  Note that the
-     parameters are now properties on a Dataset subclass: you access them
-     like ``ds.refine_by`` .
+  + Hard-coding parameter names that are the same as those in Enzo.  The
+    following translation table should be of some help.  Note that the
+    parameters are now properties on a Dataset subclass: you access them
+    like ``ds.refine_by`` .
 
-     - ``RefineBy `` => `` refine_by``
-     - ``TopGridRank `` => `` dimensionality``
-     - ``TopGridDimensions `` => `` domain_dimensions``
-     - ``InitialTime `` => `` current_time``
-     - ``DomainLeftEdge `` => `` domain_left_edge``
-     - ``DomainRightEdge `` => `` domain_right_edge``
-     - ``CurrentTimeIdentifier `` => `` unique_identifier``
-     - ``CosmologyCurrentRedshift `` => `` current_redshift``
-     - ``ComovingCoordinates `` => `` cosmological_simulation``
-     - ``CosmologyOmegaMatterNow `` => `` omega_matter``
-     - ``CosmologyOmegaLambdaNow `` => `` omega_lambda``
-     - ``CosmologyHubbleConstantNow `` => `` hubble_constant``
+    - ``RefineBy `` => `` refine_by``
+    - ``TopGridRank `` => `` dimensionality``
+    - ``TopGridDimensions `` => `` domain_dimensions``
+    - ``InitialTime `` => `` current_time``
+    - ``DomainLeftEdge `` => `` domain_left_edge``
+    - ``DomainRightEdge `` => `` domain_right_edge``
+    - ``CurrentTimeIdentifier `` => `` unique_identifier``
+    - ``CosmologyCurrentRedshift `` => `` current_redshift``
+    - ``ComovingCoordinates `` => `` cosmological_simulation``
+    - ``CosmologyOmegaMatterNow `` => `` omega_matter``
+    - ``CosmologyOmegaLambdaNow `` => `` omega_lambda``
+    - ``CosmologyHubbleConstantNow `` => `` hubble_constant``
 
-   + Do not assume that the domain runs from 0 to 1.  This is not true
-     everywhere.
+  + Do not assume that the domain runs from 0 to 1.  This is not true
+    for many codes and datasets.

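Since the translation table above now reads as properties on a ``Dataset`` subclass, a quick, hypothetical illustration of the preferred spellings (the dataset path is a placeholder, and the cosmology attributes only exist for cosmological datasets):

.. code-block:: python

   import yt

   ds = yt.load("my_data")      # placeholder path

   # Use the Dataset properties rather than the Enzo parameter names:
   print ds.refine_by           # not RefineBy
   print ds.dimensionality     # not TopGridRank
   print ds.domain_dimensions  # not TopGridDimensions
   print ds.current_time       # not InitialTime
   print ds.domain_left_edge, ds.domain_right_edge
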
diff -r cbde7854f606bd74935fb06c4f6a17b227248e57 -r 721f1e7edc2edaee65903072ef759e1c26819a9c doc/source/developing/intro.rst
--- a/doc/source/developing/intro.rst
+++ b/doc/source/developing/intro.rst
@@ -15,9 +15,9 @@
 
 There are four main communication channels for yt:
 
- * We also have an IRC channel, on ``irc.freenode.net`` in ``#yt``, which can be a
-   bit less on-topic than the mailing lists.  You can connect through our web
-   gateway without any special client, at http://yt-project.org/irc.html .
+ * We have an IRC channel, on ``irc.freenode.net`` in ``#yt``.
+   You can connect through our web
+   gateway without any special client, at `http://yt-project.org/irc.html`_.
    *IRC is the first stop for conversation!*
  * `yt-users <http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_
    is a relatively high-traffic mailing list where people are encouraged to ask
@@ -60,8 +60,8 @@
 to have more examples that show complex or advanced behavior -- and if you have
 used such scripts to write a paper, that too would be an amazing contribution.
 
-Documentation and Screencasts
------------------------------
+Documentation 
+-------------
 
 The yt documentation -- which you are reading right now -- is constantly being
 updated, and it is a task we would very much appreciate assistance with.
@@ -75,18 +75,6 @@
 issue a pull request through the website for your new fork, and we can comment
 back and forth and eventually accept your changes.
 
-One of the more interesting ways we are attempting to do lately is to add
-screencasts to the documentation -- these are recordings of people executing
-sessions in a terminal or in a web browser, showing off functionality and
-describing how to do various things.  These provide a more dynamic and
-engaging way of demonstrating functionality and teaching methods.
-
-One easy place to record screencasts is with `Screencast-O-Matic
-<http://www.screencast-o-matic.com/>`_ but there are many to choose from.  Once
-you have recorded it, let us know and be sure to add it to the
-`yt Vimeo group <http://vimeo.com/groups/ytgallery>`_.  We'll then link to it
-from the documentation!
-
 Gallery Images and Videos
 -------------------------
 
@@ -96,9 +84,9 @@
 email it to us and we'll add it to the `Gallery
 <http://yt-project.org/gallery.html>`_.
 
-We're eager to show off the images you make with yt, so please feel free to
-drop `us <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_ a
-line and let us know if you've got something great!
+We're eager to show off the images and movies you make with yt, so please feel 
+free to drop `us <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_ 
+a line and let us know if you've got something great!
 
 Technical Contributions
 -----------------------

diff -r cbde7854f606bd74935fb06c4f6a17b227248e57 -r 721f1e7edc2edaee65903072ef759e1c26819a9c doc/source/examining/low_level_inspection.rst
--- a/doc/source/examining/low_level_inspection.rst
+++ b/doc/source/examining/low_level_inspection.rst
@@ -230,6 +230,5 @@
 directly as a fixed resolution array.  This provides a means for bypassing the 
 yt method for generating plots, and allows the user the freedom to use 
 whatever interface they wish for displaying and saving their image data.  
-The object for doing this is the aptly titled Fixed Resolution Buffer, and 
-there is a full explanation for how to use it 
-:ref:`here <fixed-resolution-buffers>`.
+You can use the :class:`~yt.visualization.fixed_resolution.FixedResolutionBuffer`
+to accomplish this as described in :ref:`fixed-resolution-buffers`.

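A rough sketch of what the rewritten paragraph points to, building a FixedResolutionBuffer by hand from a slice (a sketch only; the dataset path and field name are placeholders, and the full story is in :ref:`fixed-resolution-buffers`):

.. code-block:: python

   import yt
   from yt.visualization.fixed_resolution import FixedResolutionBuffer

   ds = yt.load("my_data")                 # placeholder path
   slc = ds.slice(2, ds.domain_center[2])  # slice along z through the domain center
   # bounds are (xmin, xmax, ymin, ymax) in the slice plane; buff_size is in pixels.
   bounds = (ds.domain_left_edge[0], ds.domain_right_edge[0],
             ds.domain_left_edge[1], ds.domain_right_edge[1])
   frb = FixedResolutionBuffer(slc, bounds, (800, 800))
   density_image = frb["density"]          # a fixed resolution 800x800 array
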
diff -r cbde7854f606bd74935fb06c4f6a17b227248e57 -r 721f1e7edc2edaee65903072ef759e1c26819a9c doc/source/reference/api/api.rst
--- a/doc/source/reference/api/api.rst
+++ b/doc/source/reference/api/api.rst
@@ -86,6 +86,7 @@
    ~yt.data_objects.selection_data_containers.YTSphereBase
    ~yt.data_objects.selection_data_containers.YTEllipsoidBase
    ~yt.data_objects.selection_data_containers.YTCutRegionBase
+   ~yt.data_objects.grid_patch.AMRGridPatch
 
 Construction Objects
 ++++++++++++++++++++


https://bitbucket.org/yt_analysis/yt/commits/63f0e82d0cd7/
Changeset:   63f0e82d0cd7
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 07:34:19
Summary:     Updating the developer docs.
Affected #:  4 files

diff -r 721f1e7edc2edaee65903072ef759e1c26819a9c -r 63f0e82d0cd7d8caa69170e64fc0ad5ef2b11521 doc/source/developing/building_the_docs.rst
--- a/doc/source/developing/building_the_docs.rst
+++ b/doc/source/developing/building_the_docs.rst
@@ -1,8 +1,77 @@
+.. _documentation
+
+Documentation
+=============
+
+.. _writing_documentation:
+
+How to Write Documentation
+--------------------------
+
+Writing documentation is one of the most important but often overlooked tasks
+for increasing yt's impact in the community.  It is the way in which the
+world will understand how to use our code, so it needs to be done concisely
+and understandably.  Typically, when a developer submits some piece of code
+with new functionality, she should also include documentation on how to use
+that functionality (as per :ref:`requirements-for-code-submission`).
+Depending on the nature of the code addition, this could be a new narrative
+docs section describing how the new code works and how to use it, it could
+include a recipe in the cookbook section, or it could simply be adding a note
+in the relevant docs text somewhere.
+
+The documentation exists in the main mercurial code repository for yt in the
+``doc`` directory (i.e. ``$YT_HG/doc/source`` where ``$YT_HG`` is the path of
+the yt mercurial repository).  It is organized hierarchically into the main
+categories of:
+
+* Visualizing
+* Analyzing
+* Examining
+* Cookbook
+* Bootcamp
+* Developing
+* Reference
+* Help
+
+You will have to figure out where your new/modified doc fits into this, but
+browsing through the pre-built documentation is a good way to sort that out.
+
+All the source for the documentation is written in
+`Sphinx <http://sphinx-doc.org/>`_, which uses ReST for markup.  ReST is very
+straightforward to markup in a text editor, and if you are new to it, we
+recommend just using other .rst files in the existing yt documentation as
+templates or checking out the
+`ReST reference documentation <http://sphinx-doc.org/rest.html>`_.
+
+New cookbook recipes (see :ref:`cookbook`) are very helpful for the community
+as they provide simple annotated recipes on how to use specific functionality.
+To add one, create a concise python script which demonstrates some
+functionality and pare it down to its minimum.  Add some comment lines to
+describe what it is that you're doing along the way.  Place this ``.py`` file
+in the ``source/cookbook/`` directory, and then link to it explicitly in one
+of the relevant ``.rst`` files in that directory (e.g. ``complex_plots.rst``,
+``cosmological_analysis.rst``, etc.), and add some description of what the script
+actually does.  We recommend that you use one of the
+`sample data sets <http://yt-project.org/data>`_ in your recipe.  When the full
+docs are built, each of the cookbook recipes are executed dynamically on
+a system which has access to all of the sample datasets.  Any output images
+generated by your script will then be attached inline in the built documentation
+directly following your script.
+
+After you have made your modifications to the docs, you will want to make sure
+that they render the way you expect them to render.  For more information on
+this, see the section on :ref:`docs_build`.  Unless you're contributing cookbook
+recipes or notebooks which require a dynamical build, you can probably get
+away with just doing a 'quick' docs build.
+
+When you have completed your documentation additions, commit your changes
+to your repository and make a pull request in the same way you would contribute
+a change to the codebase, as described in the section on :ref:`sharing-changes`.
+
 .. _docs_build:
 
-==========================
 Building the Documentation
-==========================
+--------------------------
 
 The yt documentation makes heavy use of the sphinx documentation automation
 suite.  Sphinx, written in python, was originally created for the documentation
@@ -14,8 +83,8 @@
 build time by sphinx.  We also use sphinx to run code snippets (e.g. the 
 cookbook and the notebooks) and embed resulting images and example data.
 
-Quick versus full documentation builds
---------------------------------------
+Quick versus Full Documentation Builds
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 Building the entire set of yt documentation is a laborious task, since you 
 need to have a large number of packages in order to successfully execute
@@ -31,8 +100,8 @@
 to follow the instructions for building the ``full`` docs, so that you can
 dynamically execute and render the cookbook recipes, the notebooks, etc.
 
-Building the docs (quick)
--------------------------
+Building the Docs (Quick)
+^^^^^^^^^^^^^^^^^^^^^^^^^
 
 You will need to have the yt repository available on your computer, which
 is done by default if you have yt installed.  In addition, you need a 
@@ -62,8 +131,8 @@
 ``$YT_HG/doc/build/html`` directory.  You can now go there and open
 up ``index.html`` or whatever file you wish in your web browser.
 
-Building the docs (full)
-------------------------
+Building the Docs (Full)
+^^^^^^^^^^^^^^^^^^^^^^^^
 
 As alluded to earlier, building the full documentation is a bit more involved
 than simply building the static documentation.  
@@ -85,15 +154,15 @@
 supplementary yt analysis modules installed. The following dependencies were 
 used to generate the yt documentation during the release of yt 2.6 in late 2013.
 
-- Sphinx_ 1.1.3
-- IPython_ 1.1
-- runipy_ (git hash f74458c2877)
-- pandoc_ 1.11.1
-- Rockstar halo finder 0.99.6
-- SZpack_ 1.1.1
-- ffmpeg_ 1.2.4 (compiled with libvpx support)
-- JSAnimation_ (git hash 1b95cb3a3a)
-- Astropy_ 0.2.5
+* Sphinx_ 1.1.3
+* IPython_ 1.1
+* runipy_ (git hash f74458c2877)
+* pandoc_ 1.11.1
+* Rockstar halo finder 0.99.6
+* SZpack_ 1.1.1
+* ffmpeg_ 1.2.4 (compiled with libvpx support)
+* JSAnimation_ (git hash 1b95cb3a3a)
+* Astropy_ 0.2.5
 
 .. _SZpack: http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html
 .. _Astropy: http://astropy.org/
@@ -130,8 +199,8 @@
 will not delete the autogenerated API docs, so use :code:`make fullclean` to
 delete those as well.
 
-Building the docs (hybrid)
---------------------------
+Building the Docs (Hybrid)
+^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 It's also possible to create a custom sphinx build that builds a restricted set
 of notebooks or scripts.  This can be accomplished by editing the Sphinx

diff -r 721f1e7edc2edaee65903072ef759e1c26819a9c -r 63f0e82d0cd7d8caa69170e64fc0ad5ef2b11521 doc/source/developing/debugdrive.rst
--- a/doc/source/developing/debugdrive.rst
+++ b/doc/source/developing/debugdrive.rst
@@ -1,9 +1,9 @@
 .. _debug-drive:
 
-Debugging and Driving YT
-========================
+Debugging yt
+============
 
-There are several different convenience functions that allow you to control YT
+There are several different convenience functions that allow you to control yt
 in perhaps unexpected and unorthodox manners.  These will allow you to conduct
 in-depth debugging of processes that may be running in parallel on multiple
 processors, as well as providing a mechanism of signalling to yt that you need
@@ -18,53 +18,12 @@
 single, unified interactive prompt.  This enables and facilitates parallel
 analysis without sacrificing interactivity and flexibility.
 
-.. _pastebin:
+Use the Python Debugger
+-----------------------
 
-The Pastebin
-------------
-
-A pastebin is a website where you can easily copy source code and error
-messages to share with yt developers or your collaborators. At
-http://paste.yt-project.org/ a pastebin is available for placing scripts.  With
-yt the script ``yt_lodgeit.py`` is distributed and wrapped with 
-the ``pastebin`` and ``pastebin_grab`` commands, which allow for commandline 
-uploading and downloading of pasted snippets.  To upload a script you
-would supply it to the command:
-
-.. code-block:: bash
-
-   $ yt pastebin some_script.py
-
-The URL will be returned.  If you'd like it to be marked 'private' and not show
-up in the list of pasted snippets, supply the argument ``--private``.  All
-snippets are given either numbers or hashes.  To download a pasted snippet, you
-would use the ``pastebin_grab`` option:
-
-.. code-block:: bash
-
-   $ yt pastebin_grab 1768
-
-The snippet will be output to the window, so output redirection can be used to
-store it in a file.
-
-.. _error-reporting:
-
-Error Reporting with the Pastebin
-+++++++++++++++++++++++++++++++++
-
-If you are having troubles with yt, you can have it paste the error report
-to the pastebin by running your problematic script with the ``--paste`` option:
-
-.. code-block:: bash
-
-   $ python2.7 some_problematic_script.py --paste
-
-The ``--paste`` option has to come after the name of the script.  When the
-script dies and prints its error, it will also submit that error to the
-pastebin and return a URL for the error.  When reporting your bug, include this
-URL and then the problem can be debugged more easily.
-
-For more information on asking for help, see `asking-for-help`.
+yt is almost entirely composed of Python code, so it makes sense to use
+the Python debugger as your first stop when trying to debug it:
+https://docs.python.org/2/library/pdb.html
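
For example, one quick approach is to drop ``pdb.set_trace()`` into a script
just before the statement you want to inspect (a minimal sketch, assuming one
of the sample datasets; the dataset path and the quantity computed are
arbitrary):

.. code-block:: python

   import pdb
   import yt

   ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
   sp = ds.sphere("c", (10.0, "kpc"))

   # Drop into an interactive debugger here; from the pdb prompt you can
   # inspect sp, step through the next lines, or continue.
   pdb.set_trace()

   print sp.quantities.total_mass()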
 
 Signaling yt to Do Something
 ----------------------------

diff -r 721f1e7edc2edaee65903072ef759e1c26819a9c -r 63f0e82d0cd7d8caa69170e64fc0ad5ef2b11521 doc/source/developing/developing.rst
--- a/doc/source/developing/developing.rst
+++ b/doc/source/developing/developing.rst
@@ -240,71 +240,6 @@
 
 #. Your pull request will be automatically updated.
 
-.. _writing_documentation:
-
-How to Write Documentation
---------------------------
-
-Writing documentation is one of the most important but often overlooked tasks
-for increasing yt's impact in the community.  It is the way in which the 
-world will understand how to use our code, so it needs to be done concisely
-and understandably.  Typically, when a developer submits some piece of code 
-with new functionality, she should also include documentation on how to use 
-that functionality (as per :ref:`requirements-for-code-submission`).  
-Depending on the nature of the code addition, this could be a new narrative 
-docs section describing how the new code works and how to use it, it could 
-include a recipe in the cookbook section, or it could simply be adding a note 
-in the relevant docs text somewhere.
-
-The documentation exists in the main mercurial code repository for yt in the
-``doc`` directory (i.e. ``$YT_HG/doc/source`` where ``$YT_HG`` is the path of
-the yt mercurial repository).  It is organized hierarchically into the main
-categories of:
-
-* Visualizing
-* Analyzing
-* Examining
-* Cookbook
-* Bootcamp
-* Developing
-* Reference
-* Help
-
-You will have to figure out where your new/modified doc fits into this, but 
-browsing through the pre-built documentation is a good way to sort that out.
-
-All the source for the documentation is written in 
-`Sphinx <http://sphinx-doc.org/>`_, which uses ReST for markup.  ReST is very
-straightforward to markup in a text editor, and if you are new to it, we
-recommend just using other .rst files in the existing yt documentation as 
-templates or checking out the 
-`ReST reference documentation <http://sphinx-doc.org/rest.html>`_.
-
-New cookbook recipes (see :ref:`cookbook`) are very helpful for the community 
-as they provide simple annotated recipes on how to use specific functionality.  
-To add one, create a concise python script which demonstrates some 
-functionality and pare it down to its minimum.  Add some comment lines to 
-describe what it is that you're doing along the way.  Place this ``.py`` file 
-in the ``source/cookbook/`` directory, and then link to it explicitly in one 
-of the relevant ``.rst`` files in that directory (e.g. ``complex_plots.rst``, 
-``cosmological_analysis.rst``, etc.), and add some description of what the script 
-actually does.  We recommend that you use one of the 
-`sample data sets <http://yt-project.org/data>`_ in your recipe.  When the full
-docs are built, each of the cookbook recipes are executed dynamically on 
-a system which has access to all of the sample datasets.  Any output images 
-generated by your script will then be attached inline in the built documentation 
-directly following your script.
-
-After you have made your modifications to the docs, you will want to make sure
-that they render the way you expect them to render.  For more information on
-this, see the section on :ref:`docs_build`.  Unless you're contributing cookbook
-recipes or notebooks which require a dynamical build, you can probably get 
-away with just doing a 'quick' docs build.
-
-When you have completed your documentation additions, commit your changes 
-to your repository and make a pull request in the same way you would contribute 
-a change to the codebase, as described in the section on :ref:`sharing-changes`.
-
 How To Get The Source Code For Editing
 --------------------------------------
 

diff -r 721f1e7edc2edaee65903072ef759e1c26819a9c -r 63f0e82d0cd7d8caa69170e64fc0ad5ef2b11521 doc/source/developing/testing.rst
--- a/doc/source/developing/testing.rst
+++ b/doc/source/developing/testing.rst
@@ -1,6 +1,5 @@
 .. _testing:
 
-=======
 Testing
 =======
 
@@ -46,8 +45,8 @@
 
 .. code-block:: python
 
-   >>> import yt
-   >>> yt.run_nose()
+   import yt
+   yt.run_nose()
 
 If you are developing new functionality, it is sometimes more convenient to use
 the Nose command line interface, ``nosetests``. You can run the unit tests
@@ -79,36 +78,36 @@
 document, as in some cases they belong to other packages.  However, a few come
 in handy:
 
- * :func:`yt.testing.fake_random_ds` provides the ability to create a random
-   dataset, with several fields and divided into several different
-   grids, that can be operated on.
- * :func:`yt.testing.assert_equal` can operate on arrays.
- * :func:`yt.testing.assert_almost_equal` can operate on arrays and accepts a
-   relative allowable difference.
- * :func:`yt.testing.amrspace` provides the ability to create AMR grid
-   structures.
- * :func:`~yt.testing.expand_keywords` provides the ability to iterate over
-   many values for keywords.
+* :func:`~yt.testing.fake_random_ds` provides the ability to create a random
+  dataset, with several fields and divided into several different
+  grids, that can be operated on.
+* :func:`~yt.testing.assert_equal` can operate on arrays.
+* :func:`~yt.testing.assert_almost_equal` can operate on arrays and accepts a
+  relative allowable difference.
+* :func:`~yt.testing.amrspace` provides the ability to create AMR grid
+  structures.
+* :func:`~yt.testing.expand_keywords` provides the ability to iterate over
+  many values for keywords.
 
 To create new unit tests:
 
- #. Create a new ``tests/`` directory next to the file containing the
-    functionality you want to test.  Be sure to add this new directory as a
-    subpackage in the setup.py script located in the directory you're adding a
-    new ``tests/`` folder to.  This ensures that the tests will be deployed in
-    yt source and binary distributions.
- #. Inside that directory, create a new python file prefixed with ``test_`` and
-    including the name of the functionality.
- #. Inside that file, create one or more routines prefixed with ``test_`` that
-    accept no arguments.  These should ``yield`` a set of values of the form
-    ``function``, ``arguments``.  For example ``yield assert_equal, 1.0, 1.0``
-    would evaluate that 1.0 equaled 1.0.
- #. Use ``fake_random_ds`` to test on datasets, and be sure to test for
-    several combinations of ``nproc``, so that domain decomposition can be
-    tested as well.
- #. Test multiple combinations of options by using the
-    :func:`~yt.testing.expand_keywords` function, which will enable much
-    easier iteration over options.
+#. Create a new ``tests/`` directory next to the file containing the
+   functionality you want to test.  Be sure to add this new directory as a
+   subpackage in the setup.py script located in the directory you're adding a
+   new ``tests/`` folder to.  This ensures that the tests will be deployed in
+   yt source and binary distributions.
+#. Inside that directory, create a new python file prefixed with ``test_`` and
+   including the name of the functionality.
+#. Inside that file, create one or more routines prefixed with ``test_`` that
+   accept no arguments.  These should ``yield`` a set of values of the form
+   ``function``, ``arguments``.  For example ``yield assert_equal, 1.0, 1.0``
+   would evaluate that 1.0 equaled 1.0.
+#. Use ``fake_random_ds`` to test on datasets, and be sure to test for
+   several combinations of ``nproc``, so that domain decomposition can be
+   tested as well.
+#. Test multiple combinations of options by using the
+   :func:`~yt.testing.expand_keywords` function, which will enable much
+   easier iteration over options.
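
A minimal sketch of such a test file (the test name and the quantity checked
here are arbitrary) might look like:

.. code-block:: python

   from yt.testing import fake_random_ds, assert_equal

   def test_cell_count():
       # Check the total cell count for several domain decompositions so that
       # chunking across multiple grids gets exercised as well.
       for nprocs in [1, 2, 4, 8]:
           ds = fake_random_ds(16, nprocs=nprocs)
           ad = ds.all_data()
           yield assert_equal, ad["ones"].size, 16**3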
 
 For an example of how to write unit tests, look at the file
 ``yt/data_objects/tests/test_covering_grid.py``, which covers a great deal of
@@ -134,16 +133,16 @@
 The very first step is to make a directory and copy over the data against which
 you want to test.  Currently, we test:
 
- * ``DD0010/moving7_0010`` (available in ``tests/`` in the yt distribution)
- * ``IsolatedGalaxy/galaxy0030/galaxy0030``
- * ``WindTunnel/windtunnel_4lev_hdf5_plt_cnt_0030``
- * ``GasSloshingLowRes/sloshing_low_res_hdf5_plt_cnt_0300``
- * ``TurbBoxLowRes/data.0005.3d.hdf5``
- * ``GaussianCloud/data.0077.3d.hdf5``
- * ``RadAdvect/plt00000``
- * ``RadTube/plt00500``
+* ``DD0010/moving7_0010`` (available in ``tests/`` in the yt distribution)
+* ``IsolatedGalaxy/galaxy0030/galaxy0030``
+* ``WindTunnel/windtunnel_4lev_hdf5_plt_cnt_0030``
+* ``GasSloshingLowRes/sloshing_low_res_hdf5_plt_cnt_0300``
+* ``TurbBoxLowRes/data.0005.3d.hdf5``
+* ``GaussianCloud/data.0077.3d.hdf5``
+* ``RadAdvect/plt00000``
+* ``RadTube/plt00500``
 
-These datasets are available at http://yt-project.org/data/.
+These datasets are available at http://yt-project.org/data/.
 
 Next, modify the file ``~/.yt/config`` to include a section ``[yt]``
 with the parameter ``test_data_dir``.  Set this to point to the
@@ -162,8 +161,8 @@
 
 .. code-block:: python
 
-   >>> import yt
-   >>> yt.run_nose(run_answer_tests=True)
+   import yt
+   yt.run_nose(run_answer_tests=True)
 
 If you have installed yt using ``python setup.py develop`` you can also
 optionally invoke nose using the ``nosetests`` command line interface:
@@ -183,8 +182,8 @@
 
 .. code-block:: python
 
-   >>> import yt
-   >>> yt.run_nose(run_answer_tests=True, answer_big_data=True)
+   import yt
+   yt.run_nose(run_answer_tests=True, answer_big_data=True)
 
 or, in the base directory of the yt mercurial repository:
 
@@ -231,24 +230,24 @@
 
 To write a new test:
 
- * Subclass ``AnswerTestingTest``
- * Add the attributes ``_type_name`` (a string) and ``_attrs``
-   (a tuple of strings, one for each attribute that defines the test --
-   see how this is done for projections, for instance)
- * Implement the two routines ``run`` and ``compare``  The first
-   should return a result and the second should compare a result to an old
-   result.  Neither should yield, but instead actually return.  If you need
-   additional arguments to the test, implement an ``__init__`` routine.
- * Keep in mind that *everything* returned from ``run`` will be stored.  So if
-   you are going to return a huge amount of data, please ensure that the test
-   only gets run for small data.  If you want a fast way to measure something as
-   being similar or different, either an md5 hash (see the grid values test) or
-   a sum and std of an array act as good proxies.  If you must store a large
-   amount of data for some reason, try serializing the data to a string
-   (e.g. using ``numpy.ndarray.dumps``), and then compressing the data stream
-   using ``zlib.compress``.
- * Typically for derived values, we compare to 10 or 12 decimal places.
-   For exact values, we compare exactly.
+* Subclass ``AnswerTestingTest``
+* Add the attributes ``_type_name`` (a string) and ``_attrs``
+  (a tuple of strings, one for each attribute that defines the test --
+  see how this is done for projections, for instance)
+* Implement the two routines ``run`` and ``compare``.  The first
+  should return a result and the second should compare a result to an old
+  result.  Neither should yield, but instead actually return.  If you need
+  additional arguments to the test, implement an ``__init__`` routine.
+* Keep in mind that *everything* returned from ``run`` will be stored.  So if
+  you are going to return a huge amount of data, please ensure that the test
+  only gets run for small data.  If you want a fast way to measure something as
+  being similar or different, either an md5 hash (see the grid values test) or
+  a sum and std of an array act as good proxies.  If you must store a large
+  amount of data for some reason, try serializing the data to a string
+  (e.g. using ``numpy.ndarray.dumps``), and then compressing the data stream
+  using ``zlib.compress``.
+* Typically for derived values, we compare to 10 or 12 decimal places.
+  For exact values, we compare exactly.
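
As a rough sketch only (the class name and the quantity compared are arbitrary,
and the import path is the one used by the answer testing framework at the time
of writing), a new test might look something like:

.. code-block:: python

   from yt.utilities.answer_testing.framework import AnswerTestingTest
   from yt.testing import assert_almost_equal

   class FieldSumTest(AnswerTestingTest):
       _type_name = "FieldSum"
       _attrs = ("field",)

       def __init__(self, ds_fn, field):
           super(FieldSumTest, self).__init__(ds_fn)
           self.field = field

       def run(self):
           # Return a small proxy for the field rather than the full array.
           ad = self.ds.all_data()
           return ad[self.field].sum()

       def compare(self, new_result, old_result):
           assert_almost_equal(new_result, old_result, 10)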
 
 How to Add Data to the Testing Suite
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -257,26 +256,26 @@
 The Enzo example in ``yt/frontends/enzo/tests/test_outputs.py`` is
 considered canonical.  Do these things:
 
- * Create a new directory, ``tests`` inside the frontend's directory.
+* Create a new directory, ``tests`` inside the frontend's directory.
 
- * Create a new file, ``test_outputs.py`` in the frontend's ``tests``
-   directory.
+* Create a new file, ``test_outputs.py`` in the frontend's ``tests``
+  directory.
 
- * Create a new routine that operates similarly to the routines you can see
-   in Enzo's outputs.
+* Create a new routine that operates similarly to the routines you can see
+  in Enzo's outputs.
 
-   * This routine should test a number of different fields and data objects.
+  * This routine should test a number of different fields and data objects.
 
-   * The test routine itself should be decorated with
-     ``@requires_ds(file_name)``  This decorate can accept the argument
-     ``big_data`` for if this data is too big to run all the time.
+  * The test routine itself should be decorated with
+    ``@requires_ds(file_name)``.  This decorator can accept the argument
+    ``big_data`` for use when the data is too big to run all the time.
 
-   * There are ``small_patch_amr`` and ``big_patch_amr`` routines that
-     you can yield from to execute a bunch of standard tests.  This is where
-     you should start, and then yield additional tests that stress the
-     outputs in whatever ways are necessary to ensure functionality.
+  * There are ``small_patch_amr`` and ``big_patch_amr`` routines that
+    you can yield from to execute a bunch of standard tests.  This is where
+    you should start, and then yield additional tests that stress the
+    outputs in whatever ways are necessary to ensure functionality.
 
-   * **All tests should be yielded!**
+  * **All tests should be yielded!**
 
 If you are adding to a frontend that has a few tests already, skip the first
 two steps.
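
Putting these pieces together for a hypothetical frontend, a ``test_outputs.py``
might look roughly like this (the frontend name, dataset path, and field list
are all placeholders):

.. code-block:: python

   from yt.testing import assert_equal
   from yt.utilities.answer_testing.framework import \
       requires_ds, small_patch_amr, data_dir_load

   _fields = ("temperature", "density", "velocity_magnitude")

   # Placeholder path to a small sample output for this frontend.
   my_output = "MyCodeGalaxy/output_0030"

   @requires_ds(my_output)
   def test_my_output():
       ds = data_dir_load(my_output)
       yield assert_equal, str(ds), "output_0030"
       # small_patch_amr yields a set of standard projection and field tests.
       for test in small_patch_amr(my_output, _fields):
           yield test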


https://bitbucket.org/yt_analysis/yt/commits/d4713a7202ed/
Changeset:   d4713a7202ed
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 07:46:31
Summary:     Small docs fixes.
Affected #:  4 files

diff -r 63f0e82d0cd7d8caa69170e64fc0ad5ef2b11521 -r d4713a7202edad92a4512b00cc5db86b341475d2 doc/source/analyzing/analysis_modules/particle_trajectories.rst
--- a/doc/source/analyzing/analysis_modules/particle_trajectories.rst
+++ b/doc/source/analyzing/analysis_modules/particle_trajectories.rst
@@ -1,3 +1,5 @@
+.. _particle-trajectories:
+
 Particle Trajectories
 ---------------------
 

diff -r 63f0e82d0cd7d8caa69170e64fc0ad5ef2b11521 -r d4713a7202edad92a4512b00cc5db86b341475d2 doc/source/analyzing/external_analysis.rst
--- a/doc/source/analyzing/external_analysis.rst
+++ b/doc/source/analyzing/external_analysis.rst
@@ -15,10 +15,10 @@
 
 .. code-block:: python
 
-   from yt.mods import *
+   import yt
    import radtrans
 
-   ds = load("DD0010/DD0010")
+   ds = yt.load("DD0010/DD0010")
    rt_grids = []
 
    for grid in ds.index.grids:
@@ -36,13 +36,13 @@
 
 .. code-block:: python
 
-   from yt.mods import *
+   import yt
    import pop_synthesis
 
-   ds = load("DD0010/DD0010")
-   dd = ds.all_data()
-   star_masses = dd["StarMassMsun"]
-   star_metals = dd["StarMetals"]
+   ds = yt.load("DD0010/DD0010")
+   ad = ds.all_data()
+   star_masses = ad["StarMassMsun"]
+   star_metals = ad["StarMetals"]
 
    pop_synthesis.CalculateSED(star_masses, star_metals)
 
@@ -94,11 +94,11 @@
 There are several components to this analysis routine which we will have to
 wrap.
 
-   #. We have to wrap the creation of an instance of ``ParticleCollection``.
-   #. We have to transform a set of NumPy arrays into pointers to doubles.
-   #. We have to create a set of doubles into which ``calculate_axes`` will be
-      placing the values of the axes it calculates.
-   #. We have to turn the return values back into Python objects.
+#. We have to wrap the creation of an instance of ``ParticleCollection``.
+#. We have to transform a set of NumPy arrays into pointers to doubles.
+#. We have to create a set of doubles into which ``calculate_axes`` will be
+   placing the values of the axes it calculates.
+#. We have to turn the return values back into Python objects.
 
 Each of these steps can be handled in turn, and we'll be doing it using Cython
 as our interface code.

diff -r 63f0e82d0cd7d8caa69170e64fc0ad5ef2b11521 -r d4713a7202edad92a4512b00cc5db86b341475d2 doc/source/developing/building_the_docs.rst
--- a/doc/source/developing/building_the_docs.rst
+++ b/doc/source/developing/building_the_docs.rst
@@ -1,4 +1,4 @@
-.. _documentation
+.. _documentation:
 
 Documentation
 =============

diff -r 63f0e82d0cd7d8caa69170e64fc0ad5ef2b11521 -r d4713a7202edad92a4512b00cc5db86b341475d2 doc/source/visualizing/streamlines.rst
--- a/doc/source/visualizing/streamlines.rst
+++ b/doc/source/visualizing/streamlines.rst
@@ -8,7 +8,8 @@
 velocity flow or magnetic field lines, they can be defined to follow
 any three-dimensional vector field.  Once an initial condition and
 total length of the streamline are specified, the streamline is
-uniquely defined.    
+uniquely defined.  Relatedly, yt also has the ability to follow 
+:ref:`particle-trajectories`.
 
 Method
 ------


https://bitbucket.org/yt_analysis/yt/commits/dc17173d2e4f/
Changeset:   dc17173d2e4f
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 07:46:44
Summary:     Merging.
Affected #:  2 files

diff -r d4713a7202edad92a4512b00cc5db86b341475d2 -r dc17173d2e4f62dab549ae7e84202aeaae53affb doc/source/cookbook/halo_profiler.py
--- a/doc/source/cookbook/halo_profiler.py
+++ b/doc/source/cookbook/halo_profiler.py
@@ -1,13 +1,13 @@
-from yt.mods import *
-from yt.analysis_modules.halo_analysis.api import *
+import yt
+from yt.analysis_modules.halo_analysis.api import HaloCatalog
 
 # Load the data set with the full simulation information
 # and rockstar halos
-data_pf = load('Enzo_64/RD0006/RedshiftOutput0006')
-halos_pf = load('rockstar_halos/halos_0.0.bin')
+data_ds = yt.load('Enzo_64/RD0006/RedshiftOutput0006')
+halos_ds = yt.load('rockstar_halos/halos_0.0.bin')
 
 # Instantiate a catalog using those two paramter files
-hc = HaloCatalog(data_pf=data_pf, halos_pf=halos_pf)
+hc = HaloCatalog(data_ds=data_ds, halos_ds=halos_ds)
 
 # Filter out less massive halos
 hc.add_filter("quantity_value", "particle_mass", ">", 1e14, "Msun")

diff -r d4713a7202edad92a4512b00cc5db86b341475d2 -r dc17173d2e4f62dab549ae7e84202aeaae53affb doc/source/yt3differences.rst
--- a/doc/source/yt3differences.rst
+++ b/doc/source/yt3differences.rst
@@ -16,9 +16,10 @@
 
 * We have reworked yt's import system so that most commonly-used yt functions
   and classes live in the top-level yt namespace. That means you can now
-  import yt with ``import yt``, load a dataset with ``ds = yt.load``
+  import yt with ``import yt``, load a dataset with ``ds = yt.load(filename)``
   and create a plot with ``yt.SlicePlot``.  See :ref:`api-reference` for a full
-  API listing.
+  API listing.  You can still import using ``from yt.mods import *`` to get a
+  pylab-like experience.
 * Fields and metadata for data objects and datasets now have units.  The unit
   system keeps you from making weird things like ``ergs`` + ``g`` and can
   handle things like ``g`` + ``kg`` or ``kg*m/s**2 == Newton``.  See
@@ -29,33 +30,36 @@
   Axis names are now at the *end* of field names, not the beginning.
   ``x-velocity`` is now ``velocity_x``.  For a full list of all of the fields, 
   see :ref:`field-list`.
-* Fields can be accessed by a name, but are named internally as ``(fluid_type,
-  fluid_name)``.  See :ref:`fields`.
-* Mesh fields on-disk will be in code units, and will be named ``(code_name,
-  FieldName)``. See :ref:`fields`.
+* Fields can be accessed by a name, but are named internally as ``(field_type,
+  field_name)``.  See :ref:`fields`.
+* Mesh fields that exist on-disk in an output file can be read in using whatever
+  name is used by the output file.  On-disk fields are always returned in code
+  units.  The full field name will be ``(code_name, field_name)``. See
+  :ref:`fields`.
 * Particle fields on-disk will also be in code units, and will be named
   ``(particle_type, FieldName)``.  If there is only one particle type in the
-  output file, the particle type for all particles will be ``io``. See 
+  output file, all particles will use ``io`` as the particle type. See 
   :ref:`fields`.
-* Previously, yt would capture command line arguments when being imported.
-  This no longer happens.  As a side effect, it is no longer necessary to
-  specify ``--parallel`` at the command line when running a parallel 
-  computation. Use ``yt.enable_parallelism()`` instead.  See 
+* The objects we used to refer to as "parameter files" we now refer to as a
+  dataset.  Instead of ``pf``, we now suggest you use ``ds`` to refer to an
+  object returned by ``yt.load``.
+* You can now create data objects without referring to hierarchy: instead of
+  ``pf.h.all_data()``, you can now say ``ds.all_data()``.
+* The hierarchy is still there, but it is now called the index: ``ds.index``.
+* Command line arguments are only parsed when yt is imported using ``from
+  yt.mods import *``. Since command line arguments are not parsed when using
+  ``import yt``, it is no longer necessary to specify ``--parallel`` at the
+  command line when running a parallel computation. Use
+  ``yt.enable_parallelism()`` in your script instead.  See
   :ref:`parallel-computation` for more details.
-* Any derived quantities that *always* returned lists (like ``Extrema``,
-  which would return a list even if you only ask for one field) now only
-  returns a single result if you only ask for one field.  Results for particle
-  and mesh fields will be returned separately.  See :ref:`derived-quantities`
-  for more information.
-* Derived quantities can now be accessed via a function that hangs off of the
-  ``quantities`` atribute of data objects. Instead of
-  ``dd.quantities['TotalMass']``, you can now use
-  ``dd.quantities.total_mass()`` to do the same thing. All derived quantities
-  can be accessed via a function that hangs off of the `quantities` attribute
-  of data objects. See :ref:`derived-quantities`.
-* You can't get the ``grids`` attribute of data objects.  To get this
+* Derived quantities have been reworked.  You can now do
+  ``dd.quantities.total_mass()`` instead of ``dd.quantities['TotalMass']()``.
+* The ``grids`` attribute of data objects no longer exists.  To get this
   information, you have to use spatial chunking and then access them.  See
-  :ref:`here grid-chunking` for an example.
+  :ref:`here <grid-chunking>` for an example.  For datasets that use grid
+  hierarchies, you can also access the grids for the entire dataset via
+  ``ds.index.grids``.  This attribute is not defined for particle or octree
+  datasets.
 
 Cool New Things
 ---------------
@@ -125,6 +129,20 @@
 Preliminary support for non-cartesian coordinates has been added.  We expect
 this to be considerably solidified and expanded in yt 3.1.
 
+Reworked import system
+^^^^^^^^^^^^^^^^^^^^^^
+
+It's now possible to import all yt functionality using ``import yt``. Rather
+than using ``from yt.mods import *``, we suggest using ``import yt`` in new
+scripts.  Most commonly used yt functionality is attached to the ``yt`` module.
+Load a dataset with ``yt.load()``, create a phase plot using ``yt.PhasePlot``,
+and much more.  See :ref:`the api docs <api-reference>` to learn more about
+what's in the ``yt`` namespace, or just use tab completion in IPython:
+``yt.<tab>``.
+
+It's still possible to use ``from yt.mods import *`` to create an interactive
+pylab-like experience.  Importing yt this way has several side effects, most
+notably that command line argument parsing and other startup tasks will run.
+
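For instance, a complete minimal script in the new style (using one of the
sample datasets) would be:

.. code-block:: python

   import yt

   ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
   slc = yt.SlicePlot(ds, "z", ("gas", "density"))
   slc.save()
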
 API Changes
 -----------
 
@@ -183,6 +201,22 @@
 data objects are now attached to the ``dataset`` object.  Before, you
 would say ``pf.h.sphere()``, now you can say ``ds.sphere()``.
 
+New derived quantities interface
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Derived quantities can now be accessed via functions that hang off of the
+``quantities`` attribute of data objects. Instead of
+``dd.quantities['TotalMass']()``, you can now use ``dd.quantities.total_mass()``
+to do the same thing. All derived quantities can be accessed through the
+``quantities`` attribute in this way.
+
+Any derived quantities that *always* returned lists (like ``Extrema``, which
+would return a list even if you only asked for one field) now return a
+single result if you only ask for one field.  Results for particle and mesh
+fields will also be returned separately.  See :ref:`derived-quantities` for more
+information.
+
+
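A short example of the new interface (using one of the sample datasets):

.. code-block:: python

   import yt

   ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
   ad = ds.all_data()

   # New-style access: methods hanging off of the quantities attribute.
   print ad.quantities.total_mass()

   # Asking for a single field returns a single result rather than a list.
   print ad.quantities.extrema(("gas", "density"))
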
 Field Info
 ^^^^^^^^^^
 


https://bitbucket.org/yt_analysis/yt/commits/d85abe5cf2bb/
Changeset:   d85abe5cf2bb
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 08:27:07
Summary:     Providing instructions on how to update to the new version of yt, as well as switch between old and new versions.
Affected #:  3 files

diff -r dc17173d2e4f62dab549ae7e84202aeaae53affb -r d85abe5cf2bbd20832488a7e0f159eefd8aa2e84 doc/source/help/index.rst
--- a/doc/source/help/index.rst
+++ b/doc/source/help/index.rst
@@ -53,6 +53,32 @@
 
   $ yt update --all
 
+.. _update-errors:
+
+Update Errors
+^^^^^^^^^^^^^
+
+If for some reason the ``update`` command fails with errors, or any attempt at
+loading yt either from the command line or from within Python also fails, it
+may simply mean you need to rebuild the yt source (some of the C code in yt
+needs to be rebuilt after major changes).  You can do this by navigating to
+the root of the yt mercurial repository.  If you installed with the all-in-one
+installer script, this is the ``yt-<machine>/src/yt-hg`` directory.  Then
+execute this command:
+
+.. code-block:: bash
+
+  $ python setup.py install --user --prefix=
+
+Now try running yt again with:
+
+.. code-block:: bash
+
+  $ yt --help
+
+If you continue to see errors, you should try contacting us via IRC or email,
+but you may have to reinstall yt (see :ref:`getting-and-installing-yt`).
+
 .. _search-the-documentation:
 
 Search the Documentation and Mailing Lists
@@ -195,7 +221,6 @@
 ticket in your stead.  Remember to include the information
 about your problem you identified in :ref:`this step <isolate_and_document>`.
 
-
 Installation Issues
 -------------------
 

diff -r dc17173d2e4f62dab549ae7e84202aeaae53affb -r d85abe5cf2bbd20832488a7e0f159eefd8aa2e84 doc/source/installing.rst
--- a/doc/source/installing.rst
+++ b/doc/source/installing.rst
@@ -13,9 +13,9 @@
 
 * If you do not have root access on your computer, are not comfortable managing
   python packages, or are working on a supercomputer or cluster computer, you
-  will probably want to use the bash installation script.  This builds python,
-  numpy, matplotlib, and yt from source to set up an isolated scientific python
-  environment inside of a single folder in your home directory. See
+  will probably want to use the bash all-in-one installation script.  This builds 
+  python, numpy, matplotlib, and yt from source to set up an isolated scientific 
+  python environment inside of a single folder in your home directory. See
   :ref:`install-script` for more details.
 
 * If you use the `Anaconda <https://store.continuum.io/cshop/anaconda/>`_ python
@@ -261,36 +261,6 @@
 package install path.  If you do not have write access for this location, you
 might need to use ``sudo``.
 
-Switching to yt 2.x
-^^^^^^^^^^^^^^^^^^^
-
-With the release of version 3.0 of yt, development of the legacy yt 2.x series
-has been relegated to bugfixes.  That said, we will continue supporting the 2.x
-series for the forseeable future.  This makes it easy to use scripts written
-for older versions of yt without substantially updating them to support the
-new field naming or unit systems in yt version 3.
-
-Currently, the yt-2.x codebase is contained in a named branch in the yt
-mercurial repository.  First, remove any extant installations of yt on your
-system:
-
-.. code-block:: bash
-
-  pip uninstall yt
-
-To switch to yt-2.x, you will need to clone the mercurial repository as
-described in :ref:`source-installation`.  Next, you will need to navigate to the
-mercurial repository, update to the `yt-2.x` branch, and recompile:
-
-.. code-block:: bash
-
-  cd yt
-  hg update yt-2.x
-  python setup.py develop --user --prefix=
-
-You can check which version of yt you have installed by invoking ``yt version``
-at the command line.
-
 Keeping yt Updated via Mercurial
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -331,3 +301,49 @@
 
 If you like, this might be a good time to run the test suite, see :ref:`testing`
 for more details.
+
+.. _switching-between-yt-versions:
+
+Switching between yt-2.x and yt-3.x
+-----------------------------------
+
+With the release of version 3.0 of yt, development of the legacy yt 2.x series
+has been relegated to bugfixes.  That said, we will continue supporting the 2.x
+series for the forseeable future.  This makes it easy to use scripts written
+for older versions of yt without substantially updating them to support the
+new field naming or unit systems in yt version 3.
+
+Currently, the yt-2.x codebase is contained in a named branch in the yt
+mercurial repository.  If you have installed python via ``pip``, remove 
+any extant installations of yt on your system and clone the source mercurial 
+repository of yt as described in :ref:`source-installation`.
+If you installed using the all-in-one installation script, then you can skip 
+the uninstall and cloning because you already have the source repository.
+
+.. code-block:: bash
+
+  pip uninstall yt
+
+Now, to switch between versions, you need to navigate to the root of
+the mercurial yt repository (if you used the all-in-one installation script, 
+that is the directory named ``yt-machine/src/yt-hg``).  Use mercurial to
+update the the appropriate version and recompile.  For switching to yt-2.x,
+you would do this:
+
+.. code-block:: bash
+
+  cd <yt-repo-root-dir>
+  hg update yt-2.x
+  python setup.py develop --user --prefix=
+
+For updating to yt-3.0 you would do this (the ``yt`` branch is the current
+stable version of the code):
+
+.. code-block:: bash
+
+  cd <yt-repo-root-dir>
+  hg update yt
+  python setup.py develop --user --prefix=
+
+You can check which version of yt you have installed by invoking ``yt version``
+at the command line.

diff -r dc17173d2e4f62dab549ae7e84202aeaae53affb -r d85abe5cf2bbd20832488a7e0f159eefd8aa2e84 doc/source/yt3differences.rst
--- a/doc/source/yt3differences.rst
+++ b/doc/source/yt3differences.rst
@@ -9,6 +9,30 @@
 minimize disruption to existing scripts, but necessarily things will be
 different in some ways.
 
+Updating to yt 3.0 from Old Versions
+------------------------------------
+
+First off, you need to update your version of yt to yt 3.0.  If you're
+installing yt for the first time, please visit :ref:`getting-and-installing-yt`.
+If you already have a version of yt installed, you should just need one
+command:
+
+.. code-block:: bash
+
+    $ yt update --all
+
+This will update yt to the most recent version as well as download the latest
+dependencies and rebuild the source base.  This may take a few minutes.  To test
+to make sure yt is running, try:
+
+.. code-block:: bash
+
+    $ yt --help
+
+If you receive no errors, then you are ready to go.  If you have
+an error, then consult :ref:`update-errors` for solutions.  We also
+provide instructions for :ref:`switching-between-yt-versions`.
+
 Cheat Sheet
 -----------
 


https://bitbucket.org/yt_analysis/yt/commits/b4b523d413ff/
Changeset:   b4b523d413ff
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 08:30:34
Summary:     Removing dead link.
Affected #:  1 file

diff -r d85abe5cf2bbd20832488a7e0f159eefd8aa2e84 -r b4b523d413ff0b21087bbbd6c696c1a88f57d5bf doc/source/reference/command-line.rst
--- a/doc/source/reference/command-line.rst
+++ b/doc/source/reference/command-line.rst
@@ -216,7 +216,7 @@
 By running the ``pastebin_grab`` subcommand with a pastebin number 
 (e.g. 1768), it will grab the contents of that pastebin 
 (e.g. the website http://paste.yt-project.org/show/1768 ) and send it to 
-STDOUT for local use.  For more details see the :ref:`pastebin` section.
+STDOUT for local use.  
 
 .. code-block:: bash
 


https://bitbucket.org/yt_analysis/yt/commits/a82dcee66aa9/
Changeset:   a82dcee66aa9
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 19:11:07
Summary:     Fixing a bug in "yt version" command, which never included the "+" at the end of the changeset to denote a modified repository.
Affected #:  1 file

diff -r b4b523d413ff0b21087bbbd6c696c1a88f57d5bf -r a82dcee66aa9919e7e9ba1966dd968bad1fddc3d yt/utilities/command_line.py
--- a/yt/utilities/command_line.py
+++ b/yt/utilities/command_line.py
@@ -388,7 +388,7 @@
     yt_provider = pkg_resources.get_provider("yt")
     path = os.path.dirname(yt_provider.module_path)
     if not os.path.isdir(os.path.join(path, ".hg")): return None
-    version = _get_hg_version(path)[:12]
+    version = _get_hg_version(path)
     return version
 
 # This code snippet is modified from Georg Brandl


https://bitbucket.org/yt_analysis/yt/commits/6447859d6b61/
Changeset:   6447859d6b61
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 19:21:10
Summary:     Re-adding pastebin to debug docs.
Affected #:  2 files

diff -r a82dcee66aa9919e7e9ba1966dd968bad1fddc3d -r 6447859d6b613b7971f0b1ce72931abb14de06d8 doc/source/developing/debugdrive.rst
--- a/doc/source/developing/debugdrive.rst
+++ b/doc/source/developing/debugdrive.rst
@@ -18,6 +18,35 @@
 single, unified interactive prompt.  This enables and facilitates parallel
 analysis without sacrificing interactivity and flexibility.
 
+.. _pastebin:
+
+Pastebin
+--------
+
+A pastebin is a website where you can easily copy source code and error
+messages to share with yt developers or your collaborators. At
+http://paste.yt-project.org/ a pastebin is available for placing scripts.  With
+yt the script ``yt_lodgeit.py`` is distributed and wrapped with 
+the ``pastebin`` and ``pastebin_grab`` commands, which allow for commandline 
+uploading and downloading of pasted snippets.  To upload a script you
+would supply it to the command:
+
+.. code-block:: bash
+
+   $ yt pastebin some_script.py
+
+The URL will be returned.  If you'd like it to be marked 'private' and not show
+up in the list of pasted snippets, supply the argument ``--private``.  All
+snippets are given either numbers or hashes.  To download a pasted snippet, you
+would use the ``pastebin_grab`` option:
+
+.. code-block:: bash
+
+   $ yt pastebin_grab 1768
+
+The snippet will be output to the window, so output redirection can be used to
+store it in a file.
+
 Use the Python Debugger
 -----------------------
 

diff -r a82dcee66aa9919e7e9ba1966dd968bad1fddc3d -r 6447859d6b613b7971f0b1ce72931abb14de06d8 doc/source/reference/command-line.rst
--- a/doc/source/reference/command-line.rst
+++ b/doc/source/reference/command-line.rst
@@ -216,7 +216,7 @@
 By running the ``pastebin_grab`` subcommand with a pastebin number 
 (e.g. 1768), it will grab the contents of that pastebin 
 (e.g. the website http://paste.yt-project.org/show/1768 ) and send it to 
-STDOUT for local use.  
+STDOUT for local use.  See :ref:`pastebin` for more information.
 
 .. code-block:: bash
 


https://bitbucket.org/yt_analysis/yt/commits/705bebfc2133/
Changeset:   705bebfc2133
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 19:52:52
Summary:     Updating the installation instructions for getting up to 3.0 from old versions and switching between versions.
Affected #:  3 files

diff -r 6447859d6b613b7971f0b1ce72931abb14de06d8 -r 705bebfc2133f8e4be315fdcb1dee8ef181745da doc/source/help/index.rst
--- a/doc/source/help/index.rst
+++ b/doc/source/help/index.rst
@@ -68,7 +68,7 @@
 
 .. code-block:: bash
 
-  $ python setup.py install --user --prefix=
+  $ python setup.py develop
 
 Now try running yt again with:
 

diff -r 6447859d6b613b7971f0b1ce72931abb14de06d8 -r 705bebfc2133f8e4be315fdcb1dee8ef181745da doc/source/installing.rst
--- a/doc/source/installing.rst
+++ b/doc/source/installing.rst
@@ -297,7 +297,7 @@
 
 If you get an error, follow the instructions it gives you to debug the problem.
 Do not hesitate to :ref:`contact us <asking-for-help>` so we can help you
-figure it out.
+figure it out.  There is also information at :ref:`update-errors`.
 
 If you like, this might be a good time to run the test suite, see :ref:`testing`
 for more details.
@@ -314,36 +314,58 @@
 new field naming or unit systems in yt version 3.
 
 Currently, the yt-2.x codebase is contained in a named branch in the yt
-mercurial repository.  If you have installed python via ``pip``, remove 
+mercurial repository.  Thus, depending on the method you used to install
+yt, there are different instructions for switching versions.
+
+If You Installed yt Using the Installer Script
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+You already have the mercurial repository, so you simply need to switch
+which version you're using.  Navigate to the root of the yt mercurial
+repository, update to the desired version, and rebuild the source (some of the
+C code requires a compilation step for big changes like this):
+
+.. code-block:: bash
+
+  cd yt-<machine>/src/yt-hg
+  hg update <desired-version>
+  python setup.py develop
+
+Valid versions to jump to are:
+
+* ``yt`` -- The latest *dev* changes in yt-3.x (can be unstable)
+* ``stable`` -- The latest stable release of yt-3.x
+* ``yt-2.x`` -- The latest stable release of yt-2.x
+    
+You can check which version of yt you have installed by invoking ``yt version``
+at the command line.  If you encounter problems, see :ref:`update-errors`.
+
+If You Installed yt from Source or Using pip
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+If you have installed python via ``pip``, remove 
 any extant installations of yt on your system and clone the source mercurial 
 repository of yt as described in :ref:`source-installation`.
-If you installed using the all-in-one installation script, then you can skip 
-the uninstall and cloning because you already have the source repository.
 
 .. code-block:: bash
 
   pip uninstall yt
 
 Now, to switch between versions, you need to navigate to the root of
-the mercurial yt repository (if you used the all-in-one installation script, 
-that is the directory named ``yt-machine/src/yt-hg``).  Use mercurial to
-update the the appropriate version and recompile.  For switching to yt-2.x,
-you would do this:
+the mercurial yt repository. Use mercurial to
+update to the appropriate version and recompile.  
 
 .. code-block:: bash
 
   cd <yt-repo-root-dir>
-  hg update yt-2.x
-  python setup.py develop --user --prefix=
+  hg update <desired-version>
+  python setup.py install --user --prefix=
 
-For updating to yt-3.0 you would do this (the ``yt`` branch is the current
-stable version of the code):
+Valid versions to jump to are:
 
-.. code-block:: bash
-
-  cd <yt-repo-root-dir>
-  hg update yt
-  python setup.py develop --user --prefix=
-
+* ``yt`` -- The latest *dev* changes in yt-3.x (can be unstable)
+* ``stable`` -- The latest stable release of yt-3.x
+* ``yt-2.x`` -- The latest stable release of yt-2.x
+    
 You can check which version of yt you have installed by invoking ``yt version``
-at the command line.
+at the command line.  If you encounter problems, see :ref:`update-errors`.

diff -r 6447859d6b613b7971f0b1ce72931abb14de06d8 -r 705bebfc2133f8e4be315fdcb1dee8ef181745da doc/source/yt3differences.rst
--- a/doc/source/yt3differences.rst
+++ b/doc/source/yt3differences.rst
@@ -21,8 +21,9 @@
 
     $ yt update --all
 
-This will update yt to the most recent version as well as download the latest
-dependencies and rebuild the source base.  This may take a few minutes.  To test
+This will update yt to the most recent version and rebuild the source base.  
+If you installed using the installer script, it will ensure you have all of the
+latest dependencies as well.  This step may take a few minutes.  To test
 to make sure yt is running, try:
 
 .. code-block:: bash


https://bitbucket.org/yt_analysis/yt/commits/c7a0777bffcb/
Changeset:   c7a0777bffcb
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 19:53:30
Summary:     Merging.
Affected #:  4 files

diff -r 705bebfc2133f8e4be315fdcb1dee8ef181745da -r c7a0777bffcb6f5222a2cf60aa5afcf49d5c6029 doc/helper_scripts/show_fields.py
--- a/doc/helper_scripts/show_fields.py
+++ b/doc/helper_scripts/show_fields.py
@@ -1,10 +1,55 @@
 import inspect
 from yt.mods import *
+from yt.testing import *
+import numpy as np
+from yt.utilities.cosmology import \
+     Cosmology
+from yt.utilities.definitions import \
+    mpc_conversion, sec_conversion
+from yt.frontends.stream.fields import \
+    StreamFieldInfo
+from yt.fields.derived_field import NullFunc
+from yt.units.yt_array import YTArray, Unit
 
+fields, units = [], []
 
-def islambda(f):
-    return inspect.isfunction(f) and \
-           f.__name__ == (lambda: True).__name__
+for fname, (code_units, aliases, dn) in StreamFieldInfo.known_other_fields:
+    fields.append(("gas", fname))
+    units.append(code_units)
+base_ds = fake_random_ds(4, fields = fields, units = units)
+base_ds.index
+base_ds.cosmological_simulation = 1
+base_ds.cosmology = Cosmology()
+from yt.config import ytcfg
+ytcfg["yt","__withintesting"] = "True"
+np.seterr(all = 'ignore')
+
+def _strip_ftype(field):
+    if not isinstance(field, tuple):
+        return field
+    elif field[0] == "all":
+        return field
+    return field[1]
+
+np.random.seed(int(0x4d3d3d3))
+units = [base_ds._get_field_info(*f).units for f in fields]
+fields = [_strip_ftype(f) for f in fields]
+ds = fake_random_ds(16, fields = fields, units = units)
+ds.parameters["HydroMethod"] = "streaming"
+ds.parameters["EOSType"] = 1.0
+ds.parameters["EOSSoundSpeed"] = 1.0
+ds.conversion_factors["Time"] = 1.0
+ds.conversion_factors.update( dict((f, 1.0) for f in fields) )
+ds.gamma = 5.0/3.0
+ds.current_redshift = 0.0001
+ds.cosmological_simulation = 1
+ds.hubble_constant = 0.7
+ds.omega_matter = 0.27
+ds.omega_lambda = 0.73
+ds.cosmology = Cosmology(hubble_constant=ds.hubble_constant,
+                         omega_matter=ds.omega_matter,
+                         omega_lambda=ds.omega_lambda,
+                         unit_registry=ds.unit_registry)
 
 header = r"""
 .. _field-list:
@@ -12,10 +57,10 @@
 Field List
 ==========
 
-This is a list of all fields available in ``yt``.  It has been organized by the
-type of code that each field is supported by.  "Universal" fields are available
-everywhere, "Enzo" fields in Enzo datasets, "Orion" fields in Orion datasets,
-and so on.
+This is a list of many of the fields available in ``yt``.  We have attempted to
+include most of the fields that are accessible through the plugin system;
+however, it is possible to generate many more permutations, particularly through
+vector operations.
 
 Try using the ``ds.field_list`` and ``ds.derived_field_list`` to view the
 native and derived fields available for your dataset respectively. For example
@@ -28,41 +73,28 @@
   for i in sorted(ds.field_list):
     print i
 
-.. note:: Universal fields will be overridden by a code-specific field.
-
-.. rubric:: Table of Contents
-
-.. contents::
-   :depth: 2
-   :local:
-   :backlinks: none
 """
 
 print header
 
 seen = []
 
-
 def print_all_fields(fl):
     for fn in sorted(fl):
         df = fl[fn]
         f = df._function
-        cv = df._convert_function
-        if [f, cv] in seen:
-            continue
-        seen.append([f, cv])
-        print "%s" % (df.name)
-        print "+" * len(df.name)
+        s = "%s" % (df.name,)
+        print s
+        print "+" * len(s)
         print
-        if len(df._units) > 0:
-            print "   * Units: :math:`%s`" % (df._units)
-        if len(df._projected_units) > 0:
-            print "   * Projected Units: :math:`%s`" % (df._projected_units)
+        if len(df.units) > 0:
+            u = Unit(df.units, registry = ds.unit_registry)
+            print "   * Units: :math:`%s`" % (u.latex_representation())
         print "   * Particle Type: %s" % (df.particle_type)
         print
         print "**Field Source**"
         print
-        if islambda(f):
+        if f == NullFunc:
             print "No source available."
             print
             continue
@@ -72,66 +104,6 @@
             for line in inspect.getsource(f).split("\n"):
                 print "  " + line
             print
-        print "**Convert Function Source**"
-        print
-        if islambda(cv):
-            print "No source available."
-            print
-            continue
-        else:
-            print ".. code-block:: python"
-            print
-            for line in inspect.getsource(cv).split("\n"):
-                print "  " + line
-            print
 
-
-print "Universal Field List"
-print "--------------------"
-print
-print_all_fields(FieldInfo)
-
-print "Enzo-Specific Field List"
-print "------------------------"
-print
-print_all_fields(EnzoFieldInfo)
-
-print "Orion-Specific Field List"
-print "-------------------------"
-print
-print_all_fields(OrionFieldInfo)
-
-print "FLASH-Specific Field List"
-print "-------------------------"
-print
-print_all_fields(FLASHFieldInfo)
-
-print "Athena-Specific Field List"
-print "--------------------------"
-print
-print_all_fields(AthenaFieldInfo)
-
-print "Nyx-Specific Field List"
-print "-----------------------"
-print
-print_all_fields(NyxFieldInfo)
-
-print "Chombo-Specific Field List"
-print "--------------------------"
-print
-print_all_fields(ChomboFieldInfo)
-
-print "Pluto-Specific Field List"
-print "--------------------------"
-print
-print_all_fields(PlutoFieldInfo)
-
-print "Grid-Data-Format-Specific Field List"
-print "------------------------------------"
-print
-print_all_fields(GDFFieldInfo)
-
-print "Generic-Format (Stream) Field List"
-print "----------------------------------"
-print
-print_all_fields(StreamFieldInfo)
+ds.index
+print_all_fields(ds.field_info)

diff -r 705bebfc2133f8e4be315fdcb1dee8ef181745da -r c7a0777bffcb6f5222a2cf60aa5afcf49d5c6029 doc/source/developing/creating_datatypes.rst
--- a/doc/source/developing/creating_datatypes.rst
+++ b/doc/source/developing/creating_datatypes.rst
@@ -6,36 +6,45 @@
 The three-dimensional datatypes in yt follow a fairly simple protocol.  The
 basic principle is that if you want to define a region in space, that region
 must be identifiable from some sort of cut applied against the cells --
-typically, in yt, this is done by examining the geometry.  (The
-:class:`yt.data_objects.data_containers.ExtractedRegionBase` type is a notable
-exception to this, as it is defined as a subset of an existing data object.)
+typically, in yt, this is done by examining the geometry.  
 
-In principle, you can define any number of 3D data objects, as long as the
-following methods are implemented to protocol specifications.
+Creating a new data object requires modifications to two different files, one
+of which is in Python and the other in Cython.  First, a subclass of
+:class:`~yt.data_objects.data_containers.YTDataContainer` must be defined;
+typically you actually want to subclass one of:
+:class:`~yt.data_objects.data_containers.YTSelectionContainer0D`
+:class:`~yt.data_objects.data_containers.YTSelectionContainer1D`
+:class:`~yt.data_objects.data_containers.YTSelectionContainer2D`
+:class:`~yt.data_objects.data_containers.YTSelectionContainer3D`.  
+The following attributes must be defined:
 
-.. function:: __init__(self, args, kwargs)
+ * ``_type_name`` - this is the short name by which the object type will be
+   known.  Remember this for later, as we will have to use it when defining
+   the underlying selector.
+ * ``_con_args`` - this is the set of arguments passed to the object, and their
+   names as attributes on the data object.
+ * ``_container_fields`` - any fields that are generated by the object, rather
+   than by another derived field in yt.
 
-   This function can accept any number of arguments but must eventually call
-   AMR3DData.__init__.  It is used to set up the various parameters that
-   define the object.
+The rest of the object can be defined in Cython, in the file
+``yt/geometry/selection_routines.pyx``.  You must define a subclass of
+``SelectorObject``, which will require implementation of the following methods:
 
-.. function:: _get_list_of_grids(self)
+ * ``fill_mask`` - this takes a grid object and fills a mask of which zones
+   should be included.  It must take into account the child mask of the grid.
+ * ``select_cell`` - this routine accepts a position and a width, and returns
+   either zero or one for whether or not that cell is included in the selector.
+ * ``select_sphere`` - this routine returns zero or one depending on whether a
+   sphere (point and radius) is included in the selector.
+ * ``select_point`` - this identifies whether or not a point is included in the
+   selector.  It should be identical to selecting a cell or a sphere with
+   zero extent.
+ * ``select_bbox`` - this returns whether or not a bounding box (i.e., grid) is
+   included in the selector.
+ * ``_hash_vals`` - this must return some combination of parameters that
+   semi-uniquely identifies the selector.
 
-   This function must set the property _grids to be a list of the grids
-   that should be considered to be a part of the data object.  Each of these
-   will be partly or completely contained within the object.
-
-.. function:: _is_fully_enclosed(self, grid)
-
-   This function returns true if the entire grid is part of the data object
-   and false if it is only partly enclosed.
-
-.. function:: _get_cut_mask(self, grid)
-
-   This function returns a boolean mask in the shape of the grid.  All of the
-   cells set to 'True' will be included in the data object and all of those set
-   to 'False' will be excluded.  Typically this is done via some logical
-   operation.
-
-For a good example of how to do this, see the
-:class:`yt.data_objects.data_containers.AMRCylinderBase` source code.
+Once the object has been defined, it must then be aliased within
+``selection_routines.pyx`` as ``typename_selector``.  For instance,
+``ray_selector`` or ``sphere_selector`` for ``_type_name`` values of ``ray``
+and ``sphere``, respectively.
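
As a rough Python-side sketch only (the class, type name, and arguments here
are invented, a matching ``shell_selector`` would still have to be written in
Cython as described above, and the exact base class signature may differ
between yt versions):

.. code-block:: python

   from yt.data_objects.data_containers import YTSelectionContainer3D

   class YTShell(YTSelectionContainer3D):
       """A hypothetical shell between two radii around a center."""
       _type_name = "shell"
       _con_args = ("center", "inner_radius", "outer_radius")
       _container_fields = ()

       def __init__(self, center, inner_radius, outer_radius,
                    ds=None, field_parameters=None):
           super(YTShell, self).__init__(center, ds, field_parameters)
           self.inner_radius = inner_radius
           self.outer_radius = outer_radius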

diff -r 705bebfc2133f8e4be315fdcb1dee8ef181745da -r c7a0777bffcb6f5222a2cf60aa5afcf49d5c6029 doc/source/reference/api/api.rst
--- a/doc/source/reference/api/api.rst
+++ b/doc/source/reference/api/api.rst
@@ -62,6 +62,7 @@
    :toctree: generated/
 
    ~yt.data_objects.data_containers.YTSelectionContainer
+   ~yt.data_objects.data_containers.YTSelectionContainer0D
    ~yt.data_objects.data_containers.YTSelectionContainer1D
    ~yt.data_objects.data_containers.YTSelectionContainer2D
    ~yt.data_objects.data_containers.YTSelectionContainer3D

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt/commits/c7343efa5e6a/
Changeset:   c7343efa5e6a
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 20:22:09
Summary:     Updating fields docs and pointing to field list.
Affected #:  1 file

diff -r c7a0777bffcb6f5222a2cf60aa5afcf49d5c6029 -r c7343efa5e6a14c8ce3da4dd79e0667c784c8d25 doc/source/analyzing/fields.rst
--- a/doc/source/analyzing/fields.rst
+++ b/doc/source/analyzing/fields.rst
@@ -3,6 +3,8 @@
 Fields in yt
 ============
 
+Fields are spatially-dependent quantities associated with a parent dataset.
+Examples of fields are gas density, gas temperature, particle mass, etc.
 The fundamental way to query data in yt is to access a field, either in its raw
 form (by examining a data container) or a processed form (derived quantities,
 projections, and so on).  "Field" is something of a loaded word, as it can
@@ -62,7 +64,7 @@
 step.
 
 How are fields implemented?
-+++++++++++++++++++++++++++
+---------------------------
 
 There are two classes of fields in yt.  The first are those fields that exist
 external to yt, which are immutable and can be queried -- most commonly, these
@@ -102,10 +104,123 @@
 for fields which are mesh-dependent, specifically particle masses in some
 cosmology codes.)
 
+Field types known to yt
+-----------------------
+
+yt knows of a few different field types:
+
+* frontend-name -- Mesh or fluid fields that exist on-disk default to having
+  the name of the frontend as their type name (e.g., ``enzo``, ``flash``,
+  ``pyne`` and so on).  The units of these types are whatever units are
+  designated by the source frontend when it writes the data.
+* ``index`` -- This field type refers to characteristics of the mesh, whether
+  that mesh is defined by the simulation or internally by an octree indexing
+  of particle data.  A few handy fields are ``x``, ``y``, ``z``, ``theta``,
+  ``phi``, ``radius``, ``dx``, ``dy``, ``dz`` and so on.  Default units
+  are in CGS.
+* ``gas`` -- This is the usual default for simulation frontends for fluid
+  types.  These fields are typically aliased to the frontend-specific mesh
+  fields for grid-based codes or to the deposit fields for particle-based
+  codes.  Default units are in CGS.
+* particle type -- These are particle fields that exist on-disk as written 
+  by individual frontends.  If the frontend designates names for these particles
+  (i.e. particle type) those names are the field types. 
+  Additionally, any particle unions or filters will be accessible as field
+  types.  Examples of particle types are ``Stars``, ``DM``, ``io``, etc.  
+  Like the front-end specific mesh or fluid fields, the units of these fields
+  are whatever was designated by the source frontend when written to disk.
+* ``io`` -- If a data frontend does not have a set of multiple particle types, 
+  this is the default for all particles.
+* ``all`` -- This is a special particle field type that represents a
+  concatenation of all particle field types using :ref:`particle-unions`.
+* ``deposit`` -- This field type refers to the deposition of particles
+  (discrete data) onto a mesh, typically to compute smoothing kernels, local
+  density estimates, counts, and the like.  See :ref:`deposited-particle-fields` 
+  for more information.
+
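For instance, with an Enzo dataset the same physical information can be reached
through several of these field types (a short sketch using one of the sample
datasets):

.. code-block:: python

   import yt

   ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
   ad = ds.all_data()

   print ad["enzo", "Density"]        # on-disk field, in code units
   print ad["gas", "density"]         # aliased fluid field, in CGS
   print ad["index", "dx"]            # mesh characteristic, in CGS
   print ad["all", "particle_mass"]   # concatenation of all particle types
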
+Field Plugins
+-------------
+
+Derived fields are organized via plugins.  Inside yt are a number of field
+plugins, which take information about fields in a dataset and then construct
+derived fields on top of them.  This allows them to take into account
+variations in naming system, units, data representations, and most importantly,
+allows only the fields that are relevant to be added.  This system will be
+expanded in future versions to enable much deeper semantic awareness of the
+data types being analyzed by yt.
+
+The field plugin system works in this order:
+
+ * Available, inherent fields are identified by yt
+ * The list of enabled field plugins is iterated over.  Each is called, and new
+   derived fields are added as relevant.
+ * Any fields which are not available, or which throw errors, are discarded.
+ * Remaining fields are added to the list of derived fields available for a
+   dataset
+ * Dependencies for every derived field are identified, to enable data
+   preloading
+
+Field plugins can be loaded dynamically, although at present this is not
+particularly useful.  Plans for extending field plugins to dynamically load, to
+enable simple definition of common types (gradient, divergence, etc), and to
+more verbosely describe available fields, have been put in place for future
+versions.
+
+The field plugins currently available include:
+
+ * Angular momentum fields for particles and fluids
+ * Astrophysical fields, such as those related to cosmology
+ * Vector fields for fluid fields, such as gradients and divergences
+ * Particle vector fields
+ * Magnetic field-related fields
+ * Species fields, such as for chemistry species (yt can recognize the entire
+   periodic table in field names and construct ionization fields as need be)
+
+What fields are available?
+--------------------------
+
+We provide a full list of fields that yt recognizes by default at 
+:ref:`field-list`.  If you want to create additional custom derived fields, 
+see :ref:`creating-derived-fields`.
+
+The full list of fields available for a dataset can be found as 
+the attribute ``field_list`` for native, on-disk fields and ``derived_field_list``
+for derived fields (``derived_field_list`` is a superset of ``field_list``).
+You can view these lists by examining a dataset like this:
+
+.. code-block:: python
+
+   ds = yt.load("my_data")
+   print ds.field_list
+   print ds.derived_field_list
+
+By using the ``ds.field_info`` attribute, one can access information about a
+given field, like its default units or the source code that defines it.
+
+.. code-block:: python
+
+   ds = yt.load("my_data")
+   ds.index
+   print ds.field_info["gas", "pressure"].get_units()
+   print ds.field_info["gas", "pressure"].get_source()
+
+Particle Fields
+---------------
+
+Naturally, particle fields contain properties of particles rather than
+grid cells.  By examining a particle field in detail, you can see that
+each element of the field array represents a single particle, whereas in mesh
+fields each element represents a single mesh cell.  This means that, for the
+most part, operations such as filters cannot act on particle fields and mesh
+fields simultaneously (see :ref:`filtering-data`).
+However, many of the particle fields have corresponding mesh fields that
+can be populated by "depositing" the particle values onto a yt grid as 
+described below.
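+
+As a short illustration of that difference (a hedged sketch, assuming the
+dataset contains both fluid and particle data), the two kinds of fields have
+different lengths:
+
+.. code-block:: python
+
+   import yt
+
+   ds = yt.load("my_data")
+   ad = ds.all_data()
+   print ad["gas", "density"].shape        # one element per mesh cell
+   print ad["all", "particle_mass"].shape  # one element per particle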
+
 .. _field_parameters:
 
 Field Parameters
-++++++++++++++++
+----------------
 
 Certain fields require external information in order to be calculated.  For 
 example, the radius field has to be defined based on some point of reference 
@@ -144,114 +259,16 @@
 
 For a practical application of this, see :ref:`cookbook-radial-velocity`.
 
-Field types known to yt
-+++++++++++++++++++++++
-
-yt knows of a few different field types, by default.
-
- * ``index`` - this field type refers to characteristics of the mesh, whether
-   that mesh is defined by the simulation or internally by an octree indexing
-   of particle data.  A few handy fields are ``x``, ``y``, ``z``, ``theta``,
-   ``phi``, ``radius``, ``dx``, ``dy``, ``dz`` and so on.
- * ``gas`` - this is the usual default for simulation frontends for fluid
-   types.
- * ``all`` - this is a special particle field type that represents a
-   concatenation of all particle field types.
- * ``deposit`` - this field type refers to the deposition of particles
-   (discrete data) onto a mesh, typically to compute smoothing kernels, local
-   density estimates, counts, and the like.
- * ``io`` - if a data frontend does not have a set of particle types, this will
-   be the default for particle types.
- * frontend-name - mesh or fluid fields that exist on-disk default to having
-   the name of the frontend as their type name. (i.e., ``enzo``, ``flash``,
-   ``pyne`` and so on.)
- * particle type - if the particle types in the file are affiliated with names
-   (rather than just ``io``) they will be available as field types.
-   Additionally, any particle unions or filters will be accessible as field
-   types.
-
-Field Plugins
-+++++++++++++
-
-Derived fields are organized via plugins.  Inside yt are a number of field
-plugins, which take information about fields in a dataset and then construct
-derived fields on top of them.  This allows them to take into account
-variations in naming system, units, data representations, and most importantly,
-allows only the fields that are relevant to be added.  This system will be
-expanded in future versions to enable much deeper semantic awareness of the
-data types being analyzed by yt.
-
-The field plugin system works in this order:
-
- * Available, inherent fields are identified by yt
- * The list of enabled field plugins is iterated over.  Each is called, and new
-   derived fields are added as relevant.
- * Any fields which are not available, or which throw errors, are discarded.
- * Remaining fields are added to the list of derived fields available for a
-   dataset
- * Dependencies for every derived field are identified, to enable data
-   preloading
-
-Field plugins can be loaded dynamically, although at present this is not
-particularly useful.  Plans for extending field plugins to dynamically load, to
-enable simple definition of common types (gradient, divergence, etc), and to
-more verbosely describe available fields, have been put in place for future
-versions.
-
-The field plugins currently available include:
-
- * Angular momentum fields for particles and fluids
- * Astrophysical fields, such as those related to cosmology
- * Vector fields for fluid fields, such as gradients and divergences
- * Particle vector fields
- * Magnetic field-related fields
- * Species fields, such as for chemistry species (yt can recognize the entire
-   periodic table in field names and construct ionization fields as need be)
-
-What fields are available?
-++++++++++++++++++++++++++
-
-.. include reference here once it's done
-
-The full list of fields available for a dataset can be found as 
-the attribute ``field_list`` for native, on-disk fields and ``derived_field_list``
-for derived fields (``derived_field_list`` is a superset of ``field_list``).
-You can view these lists by examining a dataset like this:
-
-.. code-block:: python
-
-   ds = yt.load("my_data")
-   print ds.field_list
-   print ds.derived_field_list
-
-By using the ``field_info()`` class, one can access information about a given
-field, like its default units or the source code for it.  
-
-.. code-block:: python
-
-   ds = yt.load("my_data")
-   ds.index
-   print ds.field_info["gas", "pressure"].get_units()
-   print ds.field_info["gas", "pressure"].get_source()
-
-Particle Fields
----------------
-
-Naturally, particle fields contain properties of particles rather than
-grid cells.  Many of these fields have corresponding grid fields that
-can be populated by "depositing" the particle values onto a yt grid.
-
 General Particle Fields
-+++++++++++++++++++++++
+-----------------------
 
 Every particle will contain both a ``particle_position`` and ``particle_velocity``
 that track the position and velocity (respectively) in code units.
 
-
 .. _deposited-particle-fields:
 
 Deposited Particle Fields
-+++++++++++++++++++++++++
+-------------------------
 
 In order to turn particle (discrete) fields into fields that are deposited in
 some regular, space-filling way (even if that space is empty, it is defined
@@ -269,25 +286,25 @@
 somewhat outside the scope of this section.  The default deposition types
 available are:
 
- * ``count`` - this field counts the total number of particles of a given type
-   in a given mesh zone.  Note that because, in general, the mesh for particle
-   datasets is defined by the number of particles in a region, this may not be
-   the most useful metric.  This may be made more useful by depositing particle
-   data onto an :ref:`arbitrary-grid`.
- * ``density`` - this field takes the total sum of ``particle_mass`` in a given
-   mesh field and divides by the volume.
- * ``mass`` - this field takes the total sum of ``particle_mass`` in each mesh
-   zone.
- * ``cic`` - this field performs cloud-in-cell interpolation (see `Section 2.2
-   <http://ta.twi.tudelft.nl/dv/users/Lemmens/MThesis.TTH/chapter4.html>`_ for more
-   information) of the density of particles in a given mesh zone.
- * ``smoothed`` - this is a special deposition type.  See discussion below for
-   more information, in :ref:`sph-fields`.
+* ``count`` - this field counts the total number of particles of a given type
+  in a given mesh zone.  Note that because, in general, the mesh for particle
+  datasets is defined by the number of particles in a region, this may not be
+  the most useful metric.  This may be made more useful by depositing particle
+  data onto an :ref:`arbitrary-grid`.
+* ``density`` - this field takes the total sum of ``particle_mass`` in a given
+  mesh zone and divides by the zone volume.
+* ``mass`` - this field takes the total sum of ``particle_mass`` in each mesh
+  zone.
+* ``cic`` - this field performs cloud-in-cell interpolation (see `Section 2.2
+  <http://ta.twi.tudelft.nl/dv/users/Lemmens/MThesis.TTH/chapter4.html>`_ for more
+  information) of the density of particles in a given mesh zone.
+* ``smoothed`` - this is a special deposition type.  See the discussion in
+  :ref:`sph-fields` below for more information.
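+
+As a hedged sketch of how these are accessed (deposited field names combine a
+particle type with a deposition method, e.g. ``io_density``; substitute the
+particle types your dataset actually provides):
+
+.. code-block:: python
+
+   import yt
+
+   ds = yt.load("my_data")
+   ad = ds.all_data()
+   print ad["deposit", "io_count"]    # particles per mesh zone
+   print ad["deposit", "io_density"]  # deposited mass divided by zone volume
+   print ad["deposit", "io_cic"]      # cloud-in-cell deposited density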
 
 .. _sph-fields:
 
 SPH Fields
-++++++++++
+----------
 
 For gas particles from SPH simulations, each particle will typically carry
 a field for the smoothing length ``h``, which is roughly equivalent to 


https://bitbucket.org/yt_analysis/yt/commits/d7e9122ac5dd/
Changeset:   d7e9122ac5dd
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 20:33:03
Summary:     Updating field_list.rst builder script with references.
Affected #:  2 files

diff -r c7343efa5e6a14c8ce3da4dd79e0667c784c8d25 -r d7e9122ac5dd2952f261ceedd54415f8d05cae9f doc/helper_scripts/show_fields.py
--- a/doc/helper_scripts/show_fields.py
+++ b/doc/helper_scripts/show_fields.py
@@ -57,10 +57,11 @@
 Field List
 ==========
 
-This is a list of many of the fields available in ``yt``.  We have attempted to
+This is a list of many of the fields available in yt.  We have attempted to
 include most of the fields that are accessible through the plugin system,
 however it is possible to generate many more permutations, particularly through
-vector operations.
+vector operations.  For more information about the fields framework,
+see :ref:`fields`.
 
 Try using the ``ds.field_list`` and ``ds.derived_field_list`` to view the
 native and derived fields available for your dataset respectively. For example
@@ -73,6 +74,8 @@
   for i in sorted(ds.field_list):
     print i
 
+To figure out what all of the field types here mean, see
+:ref:`known-field-types`.
 """
 
 print header

diff -r c7343efa5e6a14c8ce3da4dd79e0667c784c8d25 -r d7e9122ac5dd2952f261ceedd54415f8d05cae9f doc/source/analyzing/fields.rst
--- a/doc/source/analyzing/fields.rst
+++ b/doc/source/analyzing/fields.rst
@@ -92,17 +92,19 @@
 For more information, see :ref:`creating-derived-fields`.
 
 There is a third, borderline class of field in yt, as well.  This is the
-"alias" type, where a field on disk (for example, ``Density``) is aliased into
-an internal yt-name (for example, ``density``).  The aliasing process allows
-universally-defined derived fields to take advantage of internal names, and it
-also provides an easy way to address what units something should be returned
-in.  If an aliased field is requested (and aliased fields will always be
-lowercase, with underscores separating words) it will be returned in CGS units
-(future versions will enable global defaults to be set for MKS and other unit
-systems), whereas if the underlying field is requested, it will not undergo any
-unit conversions from its natural units.  (This rule is occasionally violated
-for fields which are mesh-dependent, specifically particle masses in some
-cosmology codes.)
+"alias" type, where a field on disk (for example, (frontend, ``Density``)) is 
+aliased into an internal yt-name (for example, (``gas``, ``density``)).  The 
+aliasing process allows universally-defined derived fields to take advantage of 
+internal names, and it also provides an easy way to address what units something 
+should be returned in.  If an aliased field is requested (and aliased fields 
+will always be lowercase, with underscores separating words) it will be returned 
+in CGS units (future versions will enable global defaults to be set for MKS and 
+other unit systems), whereas if the underlying field is requested, it will not 
+undergo any unit conversions from its natural units.  (This rule is occasionally 
+violated for fields which are mesh-dependent, specifically particle masses in 
+some cosmology codes.)
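+
+As a brief, hedged sketch of the difference (using an Enzo dataset as an
+example; other frontends use their own on-disk field names):
+
+.. code-block:: python
+
+   import yt
+
+   ds = yt.load("my_data")
+   ad = ds.all_data()
+   print ad["gas", "density"]    # aliased name, returned in CGS units
+   print ad["enzo", "Density"]   # on-disk name, returned in its native units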
+
+.. _known-field-types:
 
 Field types known to yt
 -----------------------


https://bitbucket.org/yt_analysis/yt/commits/aa07c9020891/
Changeset:   aa07c9020891
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 20:41:44
Summary:     Updating yt3 differences with cross-refs.
Affected #:  2 files

diff -r d7e9122ac5dd2952f261ceedd54415f8d05cae9f -r aa07c9020891387767f4a1f8c424f5e6464ad5c9 doc/source/index.rst
--- a/doc/source/index.rst
+++ b/doc/source/index.rst
@@ -28,7 +28,7 @@
          </p></td><td width="75%">
-         <p class="linkdescr">Getting and Installing yt</p>
+         <p class="linkdescr">Getting, Installing, and Updating yt</p></td></tr><tr valign="top">

diff -r d7e9122ac5dd2952f261ceedd54415f8d05cae9f -r aa07c9020891387767f4a1f8c424f5e6464ad5c9 doc/source/yt3differences.rst
--- a/doc/source/yt3differences.rst
+++ b/doc/source/yt3differences.rst
@@ -154,7 +154,7 @@
 Preliminary support for non-cartesian coordinates has been added.  We expect
 this to be considerably solidified and expanded in yt 3.1.
 
-Reworked import system
+Reworked Import System
 ^^^^^^^^^^^^^^^^^^^^^^
 
 It's now possible to import all yt functionality using ``import yt``. Rather
@@ -201,15 +201,15 @@
    ds = yt.load("MyData")
    ds.setup_deprecated_fields()
 
-This sets up aliases from the old names to the new.  See :ref:`fields` for
-more information.
+This sets up aliases from the old names to the new.  See :ref:`fields` and
+:ref:`field-list` for more information.
 
 Units of Fields
 ^^^^^^^^^^^^^^^
 
 Fields now are all subclasses of NumPy arrays, the ``YTArray``, which carries
 along with it units.  This means that if you want to manipulate fields, you
-have to modify them in a unitful way.
+have to modify them in a unitful way.  See :ref:`units`.
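+
+A short, hedged example of what "unitful" manipulation looks like (any dataset
+with a gas density field will do):
+
+.. code-block:: python
+
+   import yt
+
+   ds = yt.load("MyData")
+   ad = ds.all_data()
+   dens = ad["gas", "density"]
+   # Conversions and arithmetic carry units along automatically.
+   print dens.in_units("Msun/kpc**3")
+   print dens + ds.quan(1.0e-30, "g/cm**3")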
 
 Parameter Files are Now Datasets
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -269,7 +269,8 @@
 ^^^^^^^^^^^^^^^^
 
 All data objects now accept an explicit list of ``field_parameters`` rather
-than accepting ``kwargs`` and supplying them to field parameters.
+than accepting ``kwargs`` and supplying them to field parameters.  See
+:ref:`field_parameters`.
 
 Object Renaming
 ^^^^^^^^^^^^^^^
@@ -278,7 +279,8 @@
 removing ``AMR`` from the prefix or replacing it with ``YT``.  All names of
 objects remain the same for the purposes of selecting data and creating them;
 i.e., ``sphere`` objects are still called ``sphere`` - you can create one
-via ``ds.sphere``.  For a detailed description and index see :ref:`available-objects`.
+via ``ds.sphere``.  For a detailed description and index see 
+:ref:`available-objects`.
 
 Boolean Regions
 ^^^^^^^^^^^^^^^
@@ -304,3 +306,9 @@
 
 This will "spatially" chunk the ``obj`` object and print out all the grids
 included.
+
+Halo Catalogs
+^^^^^^^^^^^^^
+
+The ``Halo Profiler`` infrastructure has been fundamentally rewritten and its
+functionality is now provided by the ``Halo Catalog`` framework.  See
+:ref:`halo-analysis`.


https://bitbucket.org/yt_analysis/yt/commits/f00a3ec9a751/
Changeset:   f00a3ec9a751
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 20:58:47
Summary:     Updating.
Affected #:  3 files

diff -r aa07c9020891387767f4a1f8c424f5e6464ad5c9 -r f00a3ec9a751d0d4982d19745faff9a3f1914415 doc/source/index.rst
--- a/doc/source/index.rst
+++ b/doc/source/index.rst
@@ -78,7 +78,7 @@
          </p></td><td width="75%">
-         <p class="linkdescr">Use analysis tools to extract results from your data</p>
+         <p class="linkdescr">Use analysis tools to extract results from your data</p></td></tr><tr valign="top">

diff -r aa07c9020891387767f4a1f8c424f5e6464ad5c9 -r f00a3ec9a751d0d4982d19745faff9a3f1914415 doc/source/installing.rst
--- a/doc/source/installing.rst
+++ b/doc/source/installing.rst
@@ -338,7 +338,7 @@
 * ``yt-2.x`` -- The latest stable release of yt-2.x
     
 You can check which version of yt you have installed by invoking ``yt version``
-at the command line.  If encounter problems, see :ref:`update-errors`.
+at the command line.  If you encounter problems, see :ref:`update-errors`.
 
 If You Installed yt Using from Source or Using pip
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -350,6 +350,7 @@
 .. code-block:: bash
 
   pip uninstall yt
+  hg clone https://bitbucket.org/yt_analysis/yt
 
 Now, to switch between versions, you need to navigate to the root of
 the mercurial yt repository. Use mercurial to
@@ -357,7 +358,7 @@
 
 .. code-block:: bash
 
-  cd <yt-repo-root-dir>
+  cd yt
   hg update <desired-version>
   python setup.py install --user --prefix=
 
@@ -368,4 +369,4 @@
 * ``yt-2.x`` -- The latest stable release of yt-2.x
     
 You can check which version of yt you have installed by invoking ``yt version``
-at the command line.  If encounter problems, see :ref:`update-errors`.
+at the command line.  If you encounter problems, see :ref:`update-errors`.

diff -r aa07c9020891387767f4a1f8c424f5e6464ad5c9 -r f00a3ec9a751d0d4982d19745faff9a3f1914415 yt/utilities/command_line.py
--- a/yt/utilities/command_line.py
+++ b/yt/utilities/command_line.py
@@ -388,7 +388,7 @@
     yt_provider = pkg_resources.get_provider("yt")
     path = os.path.dirname(yt_provider.module_path)
     if not os.path.isdir(os.path.join(path, ".hg")): return None
-    version = _get_hg_version(path)
+    version = _get_hg_version(path)[:12]
     return version
 
 # This code snippet is modified from Georg Brandl


https://bitbucket.org/yt_analysis/yt/commits/801f654e9ef9/
Changeset:   801f654e9ef9
Branch:      yt-3.0
User:        chummels
Date:        2014-08-01 21:08:59
Summary:     Making it so yt version returns full hg changeid including + for a modified repo.
Affected #:  1 file

diff -r f00a3ec9a751d0d4982d19745faff9a3f1914415 -r 801f654e9ef9c394de4552c598049a26d8c8ec33 yt/utilities/command_line.py
--- a/yt/utilities/command_line.py
+++ b/yt/utilities/command_line.py
@@ -388,7 +388,7 @@
     yt_provider = pkg_resources.get_provider("yt")
     path = os.path.dirname(yt_provider.module_path)
     if not os.path.isdir(os.path.join(path, ".hg")): return None
-    version = _get_hg_version(path)[:12]
+    version = _get_hg_version(path)
     return version
 
 # This code snippet is modified from Georg Brandl


https://bitbucket.org/yt_analysis/yt/commits/2ad686b9cb30/
Changeset:   2ad686b9cb30
Branch:      yt-3.0
User:        chummels
Date:        2014-08-02 14:07:59
Summary:     Merged in chummels/yt/yt-3.0 (pull request #1119)

Docs updates
Affected #:  22 files

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/helper_scripts/show_fields.py
--- a/doc/helper_scripts/show_fields.py
+++ b/doc/helper_scripts/show_fields.py
@@ -57,10 +57,11 @@
 Field List
 ==========
 
-This is a list of many of the fields available in ``yt``.  We have attempted to
+This is a list of many of the fields available in yt.  We have attempted to
 include most of the fields that are accessible through the plugin system,
 however it is possible to generate many more permutations, particularly through
-vector operations.
+vector operations.  For more information about the fields framework,
+see :ref:`fields`.
 
 Try using the ``ds.field_list`` and ``ds.derived_field_list`` to view the
 native and derived fields available for your dataset respectively. For example
@@ -73,6 +74,8 @@
   for i in sorted(ds.field_list):
     print i
 
+To figure out what all of the field types here mean, see
+:ref:`known-field-types`.
 """
 
 print header

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/analyzing/analysis_modules/halo_analysis.rst
--- a/doc/source/analyzing/analysis_modules/halo_analysis.rst
+++ b/doc/source/analyzing/analysis_modules/halo_analysis.rst
@@ -7,7 +7,7 @@
 and using the halo mass function.
 
 .. toctree::
-   :maxdepth: 1
+   :maxdepth: 2
 
    halo_transition
    halo_catalogs

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/analyzing/analysis_modules/halo_catalogs.rst
--- a/doc/source/analyzing/analysis_modules/halo_catalogs.rst
+++ b/doc/source/analyzing/analysis_modules/halo_catalogs.rst
@@ -241,4 +241,4 @@
 ----------------------------------------
 
 For a full example of how to use these methods together see 
-:ref:`halo_analysis_example`.
+:ref:`halo-analysis-example`.

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/analyzing/analysis_modules/particle_trajectories.rst
--- a/doc/source/analyzing/analysis_modules/particle_trajectories.rst
+++ b/doc/source/analyzing/analysis_modules/particle_trajectories.rst
@@ -1,3 +1,5 @@
+.. _particle-trajectories:
+
 Particle Trajectories
 ---------------------
 

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/analyzing/analysis_modules/synthetic_observation.rst
--- a/doc/source/analyzing/analysis_modules/synthetic_observation.rst
+++ b/doc/source/analyzing/analysis_modules/synthetic_observation.rst
@@ -5,13 +5,12 @@
 from simulation data.
 
 .. toctree::
-   :maxdepth: 1
+   :maxdepth: 2
 
    light_cone_generator
    light_ray_generator
    planning_cosmology_simulations
    absorption_spectrum
-   fitting_procedure
    star_analysis
    xray_emission_fields
    sunyaev_zeldovich

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/analyzing/external_analysis.rst
--- a/doc/source/analyzing/external_analysis.rst
+++ b/doc/source/analyzing/external_analysis.rst
@@ -15,10 +15,10 @@
 
 .. code-block:: python
 
-   from yt.mods import *
+   import yt
    import radtrans
 
-   ds = load("DD0010/DD0010")
+   ds = yt.load("DD0010/DD0010")
    rt_grids = []
 
    for grid in ds.index.grids:
@@ -36,13 +36,13 @@
 
 .. code-block:: python
 
-   from yt.mods import *
+   import yt
    import pop_synthesis
 
-   ds = load("DD0010/DD0010")
-   dd = ds.all_data()
-   star_masses = dd["StarMassMsun"]
-   star_metals = dd["StarMetals"]
+   ds = yt.load("DD0010/DD0010")
+   ad = ds.all_data()
+   star_masses = ad["StarMassMsun"]
+   star_metals = ad["StarMetals"]
 
    pop_synthesis.CalculateSED(star_masses, star_metals)
 
@@ -94,11 +94,11 @@
 There are several components to this analysis routine which we will have to
 wrap.
 
-   #. We have to wrap the creation of an instance of ``ParticleCollection``.
-   #. We have to transform a set of NumPy arrays into pointers to doubles.
-   #. We have to create a set of doubles into which ``calculate_axes`` will be
-      placing the values of the axes it calculates.
-   #. We have to turn the return values back into Python objects.
+#. We have to wrap the creation of an instance of ``ParticleCollection``.
+#. We have to transform a set of NumPy arrays into pointers to doubles.
+#. We have to create a set of doubles into which ``calculate_axes`` will be
+   placing the values of the axes it calculates.
+#. We have to turn the return values back into Python objects.
 
 Each of these steps can be handled in turn, and we'll be doing it using Cython
 as our interface code.

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/analyzing/fields.rst
--- a/doc/source/analyzing/fields.rst
+++ b/doc/source/analyzing/fields.rst
@@ -3,6 +3,8 @@
 Fields in yt
 ============
 
+Fields are spatially-dependent quantities associated with a parent dataset.
+Examples of fields are gas density, gas temperature, particle mass, etc.
 The fundamental way to query data in yt is to access a field, either in its raw
 form (by examining a data container) or a processed form (derived quantities,
 projections, and so on).  "Field" is something of a loaded word, as it can
@@ -62,7 +64,7 @@
 step.
 
 How are fields implemented?
-+++++++++++++++++++++++++++
+---------------------------
 
 There are two classes of fields in yt.  The first are those fields that exist
 external to yt, which are immutable and can be queried -- most commonly, these
@@ -90,22 +92,137 @@
 For more information, see :ref:`creating-derived-fields`.
 
 There is a third, borderline class of field in yt, as well.  This is the
-"alias" type, where a field on disk (for example, ``Density``) is aliased into
-an internal yt-name (for example, ``density``).  The aliasing process allows
-universally-defined derived fields to take advantage of internal names, and it
-also provides an easy way to address what units something should be returned
-in.  If an aliased field is requested (and aliased fields will always be
-lowercase, with underscores separating words) it will be returned in CGS units
-(future versions will enable global defaults to be set for MKS and other unit
-systems), whereas if the underlying field is requested, it will not undergo any
-unit conversions from its natural units.  (This rule is occasionally violated
-for fields which are mesh-dependent, specifically particle masses in some
-cosmology codes.)
+"alias" type, where a field on disk (for example, (frontend, ``Density``)) is 
+aliased into an internal yt-name (for example, (``gas``, ``density``)).  The 
+aliasing process allows universally-defined derived fields to take advantage of 
+internal names, and it also provides an easy way to address what units something 
+should be returned in.  If an aliased field is requested (and aliased fields 
+will always be lowercase, with underscores separating words) it will be returned 
+in CGS units (future versions will enable global defaults to be set for MKS and 
+other unit systems), whereas if the underlying field is requested, it will not 
+undergo any unit conversions from its natural units.  (This rule is occasionally 
+violated for fields which are mesh-dependent, specifically particle masses in 
+some cosmology codes.)
+
+.. _known-field-types:
+
+Field types known to yt
+-----------------------
+
+yt knows of a few different field types:
+
+* frontend-name -- Mesh or fluid fields that exist on-disk default to having
+  the name of the frontend as their type name (e.g., ``enzo``, ``flash``,
+  ``pyne`` and so on).  The units of these types are whatever units are
+  designated by the source frontend when it writes the data.
+* ``index`` -- This field type refers to characteristics of the mesh, whether
+  that mesh is defined by the simulation or internally by an octree indexing
+  of particle data.  A few handy fields are ``x``, ``y``, ``z``, ``theta``,
+  ``phi``, ``radius``, ``dx``, ``dy``, ``dz`` and so on.  Default units
+  are in CGS.
+* ``gas`` -- This is the usual default for simulation frontends for fluid
+  types.  These fields are typically aliased to the frontend-specific mesh
+  fields for grid-based codes or to the deposit fields for particle-based
+  codes.  Default units are in CGS.
+* particle type -- These are particle fields that exist on-disk as written
+  by individual frontends.  If the frontend designates names for its particle
+  types, those names are used as the field types.
+  Additionally, any particle unions or filters will be accessible as field
+  types.  Examples of particle types are ``Stars``, ``DM``, ``io``, etc.
+  Like the frontend-specific mesh or fluid fields, the units of these fields
+  are whatever was designated by the source frontend when written to disk.
+* ``io`` -- If a data frontend does not define multiple particle types,
+  this is the default field type for all particles.
+* ``all`` -- This is a special particle field type that represents a
+  concatenation of all particle field types using :ref:`particle-unions`.
+* ``deposit`` -- This field type refers to the deposition of particles
+  (discrete data) onto a mesh, typically to compute smoothing kernels, local
+  density estimates, counts, and the like.  See :ref:`deposited-particle-fields` 
+  for more information.
+
+Field Plugins
+-------------
+
+Derived fields are organized via plugins.  Inside yt are a number of field
+plugins, which take information about fields in a dataset and then construct
+derived fields on top of them.  This allows them to take into account
+variations in naming system, units, data representations, and most importantly,
+allows only the fields that are relevant to be added.  This system will be
+expanded in future versions to enable much deeper semantic awareness of the
+data types being analyzed by yt.
+
+The field plugin system works in this order:
+
+ * Available, inherent fields are identified by yt
+ * The list of enabled field plugins is iterated over.  Each is called, and new
+   derived fields are added as relevant.
+ * Any fields which are not available, or which throw errors, are discarded.
+ * Remaining fields are added to the list of derived fields available for a
+   dataset
+ * Dependencies for every derived field are identified, to enable data
+   preloading
+
+Field plugins can be loaded dynamically, although at present this is not
+particularly useful.  Plans for extending field plugins to dynamically load, to
+enable simple definition of common types (gradient, divergence, etc), and to
+more verbosely describe available fields, have been put in place for future
+versions.
+
+The field plugins currently available include:
+
+ * Angular momentum fields for particles and fluids
+ * Astrophysical fields, such as those related to cosmology
+ * Vector fields for fluid fields, such as gradients and divergences
+ * Particle vector fields
+ * Magnetic field-related fields
+ * Species fields, such as for chemistry species (yt can recognize the entire
+   periodic table in field names and construct ionization fields as need be)
+
+What fields are available?
+--------------------------
+
+We provide a full list of fields that yt recognizes by default at 
+:ref:`field-list`.  If you want to create additional custom derived fields, 
+see :ref:`creating-derived-fields`.
+
+The full list of fields available for a dataset can be found as 
+the attribute ``field_list`` for native, on-disk fields and ``derived_field_list``
+for derived fields (``derived_field_list`` is a superset of ``field_list``).
+You can view these lists by examining a dataset like this:
+
+.. code-block:: python
+
+   ds = yt.load("my_data")
+   print ds.field_list
+   print ds.derived_field_list
+
+By using the ``ds.field_info`` attribute, one can access information about a
+given field, like its default units or the source code that defines it.
+
+.. code-block:: python
+
+   ds = yt.load("my_data")
+   ds.index
+   print ds.field_info["gas", "pressure"].get_units()
+   print ds.field_info["gas", "pressure"].get_source()
+
+Particle Fields
+---------------
+
+Naturally, particle fields contain properties of particles rather than
+grid cells.  By examining a particle field in detail, you can see that
+each element of the field array represents a single particle, whereas in mesh
+fields each element represents a single mesh cell.  This means that, for the
+most part, operations such as filters cannot act on particle fields and mesh
+fields simultaneously (see :ref:`filtering-data`).
+However, many of the particle fields have corresponding mesh fields that
+can be populated by "depositing" the particle values onto a yt grid as 
+described below.
 
 .. _field_parameters:
 
 Field Parameters
-++++++++++++++++
+----------------
 
 Certain fields require external information in order to be calculated.  For 
 example, the radius field has to be defined based on some point of reference 
@@ -144,114 +261,16 @@
 
 For a practical application of this, see :ref:`cookbook-radial-velocity`.
 
-Field types known to yt
-+++++++++++++++++++++++
-
-yt knows of a few different field types, by default.
-
- * ``index`` - this field type refers to characteristics of the mesh, whether
-   that mesh is defined by the simulation or internally by an octree indexing
-   of particle data.  A few handy fields are ``x``, ``y``, ``z``, ``theta``,
-   ``phi``, ``radius``, ``dx``, ``dy``, ``dz`` and so on.
- * ``gas`` - this is the usual default for simulation frontends for fluid
-   types.
- * ``all`` - this is a special particle field type that represents a
-   concatenation of all particle field types.
- * ``deposit`` - this field type refers to the deposition of particles
-   (discrete data) onto a mesh, typically to compute smoothing kernels, local
-   density estimates, counts, and the like.
- * ``io`` - if a data frontend does not have a set of particle types, this will
-   be the default for particle types.
- * frontend-name - mesh or fluid fields that exist on-disk default to having
-   the name of the frontend as their type name. (i.e., ``enzo``, ``flash``,
-   ``pyne`` and so on.)
- * particle type - if the particle types in the file are affiliated with names
-   (rather than just ``io``) they will be available as field types.
-   Additionally, any particle unions or filters will be accessible as field
-   types.
-
-Field Plugins
-+++++++++++++
-
-Derived fields are organized via plugins.  Inside yt are a number of field
-plugins, which take information about fields in a dataset and then construct
-derived fields on top of them.  This allows them to take into account
-variations in naming system, units, data representations, and most importantly,
-allows only the fields that are relevant to be added.  This system will be
-expanded in future versions to enable much deeper semantic awareness of the
-data types being analyzed by yt.
-
-The field plugin system works in this order:
-
- * Available, inherent fields are identified by yt
- * The list of enabled field plugins is iterated over.  Each is called, and new
-   derived fields are added as relevant.
- * Any fields which are not available, or which throw errors, are discarded.
- * Remaining fields are added to the list of derived fields available for a
-   dataset
- * Dependencies for every derived field are identified, to enable data
-   preloading
-
-Field plugins can be loaded dynamically, although at present this is not
-particularly useful.  Plans for extending field plugins to dynamically load, to
-enable simple definition of common types (gradient, divergence, etc), and to
-more verbosely describe available fields, have been put in place for future
-versions.
-
-The field plugins currently available include:
-
- * Angular momentum fields for particles and fluids
- * Astrophysical fields, such as those related to cosmology
- * Vector fields for fluid fields, such as gradients and divergences
- * Particle vector fields
- * Magnetic field-related fields
- * Species fields, such as for chemistry species (yt can recognize the entire
-   periodic table in field names and construct ionization fields as need be)
-
-What fields are available?
-++++++++++++++++++++++++++
-
-.. include reference here once it's done
-
-The full list of fields available for a dataset can be found as 
-the attribute ``field_list`` for native, on-disk fields and ``derived_field_list``
-for derived fields (``derived_field_list`` is a superset of ``field_list``).
-You can view these lists by examining a dataset like this:
-
-.. code-block:: python
-
-   ds = yt.load("my_data")
-   print ds.field_list
-   print ds.derived_field_list
-
-By using the ``field_info()`` class, one can access information about a given
-field, like its default units or the source code for it.  
-
-.. code-block:: python
-
-   ds = yt.load("my_data")
-   ds.index
-   print ds.field_info["gas", "pressure"].get_units()
-   print ds.field_info["gas", "pressure"].get_source()
-
-Particle Fields
----------------
-
-Naturally, particle fields contain properties of particles rather than
-grid cells.  Many of these fields have corresponding grid fields that
-can be populated by "depositing" the particle values onto a yt grid.
-
 General Particle Fields
-+++++++++++++++++++++++
+-----------------------
 
 Every particle will contain both a ``particle_position`` and ``particle_velocity``
 that track the position and velocity (respectively) in code units.
 
-
 .. _deposited-particle-fields:
 
 Deposited Particle Fields
-+++++++++++++++++++++++++
+-------------------------
 
 In order to turn particle (discrete) fields into fields that are deposited in
 some regular, space-filling way (even if that space is empty, it is defined
@@ -269,25 +288,25 @@
 somewhat outside the scope of this section.  The default deposition types
 available are:
 
- * ``count`` - this field counts the total number of particles of a given type
-   in a given mesh zone.  Note that because, in general, the mesh for particle
-   datasets is defined by the number of particles in a region, this may not be
-   the most useful metric.  This may be made more useful by depositing particle
-   data onto an :ref:`arbitrary-grid`.
- * ``density`` - this field takes the total sum of ``particle_mass`` in a given
-   mesh field and divides by the volume.
- * ``mass`` - this field takes the total sum of ``particle_mass`` in each mesh
-   zone.
- * ``cic`` - this field performs cloud-in-cell interpolation (see `Section 2.2
-   <http://ta.twi.tudelft.nl/dv/users/Lemmens/MThesis.TTH/chapter4.html>`_ for more
-   information) of the density of particles in a given mesh zone.
- * ``smoothed`` - this is a special deposition type.  See discussion below for
-   more information, in :ref:`sph-fields`.
+* ``count`` - this field counts the total number of particles of a given type
+  in a given mesh zone.  Note that because, in general, the mesh for particle
+  datasets is defined by the number of particles in a region, this may not be
+  the most useful metric.  This may be made more useful by depositing particle
+  data onto an :ref:`arbitrary-grid`.
+* ``density`` - this field takes the total sum of ``particle_mass`` in a given
+  mesh zone and divides by the zone volume.
+* ``mass`` - this field takes the total sum of ``particle_mass`` in each mesh
+  zone.
+* ``cic`` - this field performs cloud-in-cell interpolation (see `Section 2.2
+  <http://ta.twi.tudelft.nl/dv/users/Lemmens/MThesis.TTH/chapter4.html>`_ for more
+  information) of the density of particles in a given mesh zone.
+* ``smoothed`` - this is a special deposition type.  See the discussion in
+  :ref:`sph-fields` below for more information.
 
 .. _sph-fields:
 
 SPH Fields
-++++++++++
+----------
 
 For gas particles from SPH simulations, each particle will typically carry
 a field for the smoothing length ``h``, which is roughly equivalent to 

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/developing/building_the_docs.rst
--- a/doc/source/developing/building_the_docs.rst
+++ b/doc/source/developing/building_the_docs.rst
@@ -1,8 +1,77 @@
+.. _documentation:
+
+Documentation
+=============
+
+.. _writing_documentation:
+
+How to Write Documentation
+--------------------------
+
+Writing documentation is one of the most important but often overlooked tasks
+for increasing yt's impact in the community.  It is the way in which the
+world will understand how to use our code, so it needs to be done concisely
+and understandably.  Typically, when a developer submits some piece of code
+with new functionality, she should also include documentation on how to use
+that functionality (as per :ref:`requirements-for-code-submission`).
+Depending on the nature of the code addition, this could be a new narrative
+docs section describing how the new code works and how to use it, it could
+include a recipe in the cookbook section, or it could simply be adding a note
+in the relevant docs text somewhere.
+
+The documentation exists in the main mercurial code repository for yt in the
+``doc`` directory (i.e. ``$YT_HG/doc/source`` where ``$YT_HG`` is the path of
+the yt mercurial repository).  It is organized hierarchically into the main
+categories of:
+
+* Visualizing
+* Analyzing
+* Examining
+* Cookbook
+* Bootcamp
+* Developing
+* Reference
+* Help
+
+You will have to figure out where your new/modified doc fits into this, but
+browsing through the pre-built documentation is a good way to sort that out.
+
+All the source for the documentation is written in
+`Sphinx <http://sphinx-doc.org/>`_, which uses ReST for markup.  ReST is very
+straightforward to markup in a text editor, and if you are new to it, we
+recommend just using other .rst files in the existing yt documentation as
+templates or checking out the
+`ReST reference documentation <http://sphinx-doc.org/rest.html>`_.
+
+New cookbook recipes (see :ref:`cookbook`) are very helpful for the community
+as they provide simple annotated recipes on how to use specific functionality.
+To add one, create a concise python script which demonstrates some
+functionality and pare it down to its minimum.  Add some comment lines to
+describe what it is that you're doing along the way.  Place this ``.py`` file
+in the ``source/cookbook/`` directory, and then link to it explicitly in one
+of the relevant ``.rst`` files in that directory (e.g. ``complex_plots.rst``,
+``cosmological_analysis.rst``, etc.), and add some description of what the script
+actually does.  We recommend that you use one of the
+`sample data sets <http://yt-project.org/data>`_ in your recipe.  When the full
+docs are built, each of the cookbook recipes is executed dynamically on
+a system which has access to all of the sample datasets.  Any output images
+generated by your script will then be attached inline in the built documentation
+directly following your script.
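+
+A minimal sketch of what such a recipe might look like (the dataset path below
+refers to one of the sample data sets; any other sample data set works just as
+well):
+
+.. code-block:: python
+
+   import yt
+
+   # Load one of the publicly available sample datasets.
+   ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
+
+   # Slice through the center of the domain and save the image to disk.
+   slc = yt.SlicePlot(ds, "z", ("gas", "density"))
+   slc.save()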
+
+After you have made your modifications to the docs, you will want to make sure
+that they render the way you expect them to render.  For more information on
+this, see the section on :ref:`docs_build`.  Unless you're contributing cookbook
+recipes or notebooks which require a dynamical build, you can probably get
+away with just doing a 'quick' docs build.
+
+When you have completed your documentation additions, commit your changes
+to your repository and make a pull request in the same way you would contribute
+a change to the codebase, as described in the section on :ref:`sharing-changes`.
+
 .. _docs_build:
 
-==========================
 Building the Documentation
-==========================
+--------------------------
 
 The yt documentation makes heavy use of the sphinx documentation automation
 suite.  Sphinx, written in python, was originally created for the documentation
@@ -14,8 +83,8 @@
 build time by sphinx.  We also use sphinx to run code snippets (e.g. the 
 cookbook and the notebooks) and embed resulting images and example data.
 
-Quick versus full documentation builds
---------------------------------------
+Quick versus Full Documentation Builds
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 Building the entire set of yt documentation is a laborious task, since you 
 need to have a large number of packages in order to successfully execute
@@ -31,8 +100,8 @@
 to follow the instructions for building the ``full`` docs, so that you can
 dynamically execute and render the cookbook recipes, the notebooks, etc.
 
-Building the docs (quick)
--------------------------
+Building the Docs (Quick)
+^^^^^^^^^^^^^^^^^^^^^^^^^
 
 You will need to have the yt repository available on your computer, which
 is done by default if you have yt installed.  In addition, you need a 
@@ -62,8 +131,8 @@
 ``$YT_HG/doc/build/html`` directory.  You can now go there and open
 up ``index.html`` or whatever file you wish in your web browser.
 
-Building the docs (full)
-------------------------
+Building the Docs (Full)
+^^^^^^^^^^^^^^^^^^^^^^^^
 
 As alluded to earlier, building the full documentation is a bit more involved
 than simply building the static documentation.  
@@ -85,15 +154,15 @@
 supplementary yt analysis modules installed. The following dependencies were 
 used to generate the yt documentation during the release of yt 2.6 in late 2013.
 
-- Sphinx_ 1.1.3
-- IPython_ 1.1
-- runipy_ (git hash f74458c2877)
-- pandoc_ 1.11.1
-- Rockstar halo finder 0.99.6
-- SZpack_ 1.1.1
-- ffmpeg_ 1.2.4 (compiled with libvpx support)
-- JSAnimation_ (git hash 1b95cb3a3a)
-- Astropy_ 0.2.5
+* Sphinx_ 1.1.3
+* IPython_ 1.1
+* runipy_ (git hash f74458c2877)
+* pandoc_ 1.11.1
+* Rockstar halo finder 0.99.6
+* SZpack_ 1.1.1
+* ffmpeg_ 1.2.4 (compiled with libvpx support)
+* JSAnimation_ (git hash 1b95cb3a3a)
+* Astropy_ 0.2.5
 
 .. _SZpack: http://www.cita.utoronto.ca/~jchluba/Science_Jens/SZpack/SZpack.html
 .. _Astropy: http://astropy.org/
@@ -130,8 +199,8 @@
 will not delete the autogenerated API docs, so use :code:`make fullclean` to
 delete those as well.
 
-Building the docs (hybrid)
---------------------------
+Building the Docs (Hybrid)
+^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 It's also possible to create a custom sphinx build that builds a restricted set
 of notebooks or scripts.  This can be accomplished by editing the Sphinx

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/developing/debugdrive.rst
--- a/doc/source/developing/debugdrive.rst
+++ b/doc/source/developing/debugdrive.rst
@@ -1,9 +1,9 @@
 .. _debug-drive:
 
-Debugging and Driving YT
-========================
+Debugging yt
+============
 
-There are several different convenience functions that allow you to control YT
+There are several different convenience functions that allow you to control yt
 in perhaps unexpected and unorthodox manners.  These will allow you to conduct
 in-depth debugging of processes that may be running in parallel on multiple
 processors, as well as providing a mechanism of signalling to yt that you need
@@ -20,8 +20,8 @@
 
 .. _pastebin:
 
-The Pastebin
-------------
+Pastebin
+--------
 
 A pastebin is a website where you can easily copy source code and error
 messages to share with yt developers or your collaborators. At
@@ -47,24 +47,12 @@
 The snippet will be output to the window, so output redirection can be used to
 store it in a file.
 
-.. _error-reporting:
+Use the Python Debugger
+-----------------------
 
-Error Reporting with the Pastebin
-+++++++++++++++++++++++++++++++++
-
-If you are having troubles with yt, you can have it paste the error report
-to the pastebin by running your problematic script with the ``--paste`` option:
-
-.. code-block:: bash
-
-   $ python2.7 some_problematic_script.py --paste
-
-The ``--paste`` option has to come after the name of the script.  When the
-script dies and prints its error, it will also submit that error to the
-pastebin and return a URL for the error.  When reporting your bug, include this
-URL and then the problem can be debugged more easily.
-
-For more information on asking for help, see `asking-for-help`.
+yt is almost entirely composed of python code, so it makes sense to use
+the python debugger as your first stop in trying to debug it:
+https://docs.python.org/2/library/pdb.html
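+
+A hedged example of the simplest pattern: drop a breakpoint into the code you
+are investigating and run your script as usual:
+
+.. code-block:: python
+
+   import pdb
+
+   # Execution stops here and drops you into an interactive debugger prompt.
+   pdb.set_trace()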
 
 Signaling yt to Do Something
 ----------------------------

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/developing/developing.rst
--- a/doc/source/developing/developing.rst
+++ b/doc/source/developing/developing.rst
@@ -23,28 +23,28 @@
 you're new to Mercurial, these three resources are pretty great for learning
 the ins and outs:
 
-   * http://hginit.com/
-   * http://hgbook.red-bean.com/read/
-   * http://mercurial.selenic.com/
+* http://hginit.com/
+* http://hgbook.red-bean.com/read/
+* http://mercurial.selenic.com/
 
 The commands that are essential for using mercurial include:
 
-   * ``hg commit`` which commits changes in the working directory to the
-     repository, creating a new "changeset object."
-   * ``hg add`` which adds a new file to be tracked by mercurial.  This does
-     not change the working directory.
-   * ``hg pull`` which pulls (from an optional path specifier) changeset
-     objects from a remote source.  The working directory is not modified.
-   * ``hg push`` which sends (to an optional path specifier) changeset objects
-     to a remote source.  The working directory is not modified.
-   * ``hg log`` which shows a log of all changeset objects in the current
-     repository.  Use ``-g`` to show a graph of changeset objects and their
-     relationship.
-   * ``hg update`` which (with an optional "revision" specifier) updates the
-     state of the working directory to match a changeset object in the
-     repository.
-   * ``hg merge`` which combines two changesets to make a union of their lines
-     of development.  This updates the working directory.
+* ``hg commit`` which commits changes in the working directory to the
+  repository, creating a new "changeset object."
+* ``hg add`` which adds a new file to be tracked by mercurial.  This does
+  not change the working directory.
+* ``hg pull`` which pulls (from an optional path specifier) changeset
+  objects from a remote source.  The working directory is not modified.
+* ``hg push`` which sends (to an optional path specifier) changeset objects
+  to a remote source.  The working directory is not modified.
+* ``hg log`` which shows a log of all changeset objects in the current
+  repository.  Use ``-g`` to show a graph of changeset objects and their
+  relationship.
+* ``hg update`` which (with an optional "revision" specifier) updates the
+  state of the working directory to match a changeset object in the
+  repository.
+* ``hg merge`` which combines two changesets to make a union of their lines
+  of development.  This updates the working directory.
 
 Keep in touch, and happy hacking!  We also provide `doc/coding_styleguide.txt`
 and an example of a fiducial docstring in `doc/docstring_example.txt`.  Please
@@ -87,26 +87,26 @@
 <https://ytep.readthedocs.org/en/latest/YTEPs/YTEP-0008.html>`_ for more
 detail.)
 
-  * New Features
+* New Features
 
-    * New unit tests (possibly new answer tests) (See :ref:`testing`)
-    * Docstrings in the source code for the public API
-    * Addition of new feature to the narrative documentation (See :ref:`writing_documentation`)
-    * Addition of cookbook recipe (See :ref:`writing_documentation`) 
-    * Issue created on issue tracker, to ensure this is added to the changelog
+  * New unit tests (possibly new answer tests) (See :ref:`testing`)
+  * Docstrings in the source code for the public API
+  * Addition of new feature to the narrative documentation (See :ref:`writing_documentation`)
+  * Addition of cookbook recipe (See :ref:`writing_documentation`) 
+  * Issue created on issue tracker, to ensure this is added to the changelog
 
-  * Extension or Breakage of API in Existing Features
+* Extension or Breakage of API in Existing Features
 
-    * Update existing narrative docs and docstrings (See :ref:`writing_documentation`) 
-    * Update existing cookbook recipes (See :ref:`writing_documentation`) 
-    * Modify of create new unit tests (See :ref:`testing`)
-    * Issue created on issue tracker, to ensure this is added to the changelog
+  * Update existing narrative docs and docstrings (See :ref:`writing_documentation`) 
+  * Update existing cookbook recipes (See :ref:`writing_documentation`) 
+  * Modify or create new unit tests (See :ref:`testing`)
+  * Issue created on issue tracker, to ensure this is added to the changelog
 
-  * Bug fixes
+* Bug fixes
 
-    * Unit test is encouraged, to ensure breakage does not happen again in the
-      future. (See :ref:`testing`)
-    * Issue created on issue tracker, to ensure this is added to the changelog
+  * Unit test is encouraged, to ensure breakage does not happen again in the
+    future. (See :ref:`testing`)
+  * Issue created on issue tracker, to ensure this is added to the changelog
 
 When submitting, you will be asked to make sure that your changes meet all of
 these requirements.  They are pretty easy to meet, and we're also happy to help
@@ -122,22 +122,22 @@
 walk you through any troubles you might have.  Here are some suggestions
 for using mercurial with yt:
 
-  * Named branches are to be avoided.  Try using bookmarks (``hg bookmark``) to
-    track work.  (`More <http://mercurial.selenic.com/wiki/Bookmarks>`_)
-  * Make sure you set a username in your ``~/.hgrc`` before you commit any
-    changes!  All of the tutorials above will describe how to do this as one of
-    the very first steps.
-  * When contributing changes, you might be asked to make a handful of
-    modifications to your source code.  We'll work through how to do this with
-    you, and try to make it as painless as possible.
-  * Please avoid deleting your yt forks, as that eliminates the code review
-    process from BitBucket's website.
-  * In all likelihood, you only need one fork.  To keep it in sync, you can
-    sync from the website.  (See Bitbucket's `Blog Post
-    <http://blog.bitbucket.org/2013/02/04/syncing-and-merging-come-to-bitbucket/>`_
-    about this.)
-  * If you run into any troubles, stop by IRC (see :ref:`irc`) or the mailing
-    list.
+* Named branches are to be avoided.  Try using bookmarks (``hg bookmark``) to
+  track work.  (`More <http://mercurial.selenic.com/wiki/Bookmarks>`_)
+* Make sure you set a username in your ``~/.hgrc`` before you commit any
+  changes!  All of the tutorials above will describe how to do this as one of
+  the very first steps.
+* When contributing changes, you might be asked to make a handful of
+  modifications to your source code.  We'll work through how to do this with
+  you, and try to make it as painless as possible.
+* Please avoid deleting your yt forks, as that eliminates the code review
+  process from BitBucket's website.
+* In all likelihood, you only need one fork.  To keep it in sync, you can
+  sync from the website.  (See Bitbucket's `Blog Post
+  <http://blog.bitbucket.org/2013/02/04/syncing-and-merging-come-to-bitbucket/>`_
+  about this.)
+* If you run into any troubles, stop by IRC (see :ref:`irc`) or the mailing
+  list.
 
 .. _building-yt:
 
@@ -192,124 +192,59 @@
 
 The simplest way to submit changes to yt is to do the following:
 
-  * Build yt from the mercurial repository
-  * Navigate to the root of the yt repository 
-  * Make some changes and commit them
-  * Fork the `yt repository on BitBucket <https://bitbucket.org/yt_analysis/yt>`_
-  * Push the changesets to your fork
-  * Issue a pull request.
+* Build yt from the mercurial repository
+* Navigate to the root of the yt repository 
+* Make some changes and commit them
+* Fork the `yt repository on BitBucket <https://bitbucket.org/yt_analysis/yt>`_
+* Push the changesets to your fork
+* Issue a pull request.
 
 Here's a more detailed flowchart of how to submit changes.
 
-  #. If you have used the installation script, the source code for yt can be
-     found in ``$YT_DEST/src/yt-hg``.  Alternatively see
-     :ref:`source-installation` for instructions on how to build yt from the
-     mercurial repository. (Below, in :ref:`reading-source`, we describe how to
-     find items of interest.)  
-  #. Edit the source file you are interested in and
-     test your changes.  (See :ref:`testing` for more information.)
-  #. Fork yt on BitBucket.  (This step only has to be done once.)  You can do
-     this at: https://bitbucket.org/yt_analysis/yt/fork .  Call this repository
-     yt.
-  #. Commit these changes, using ``hg commit``.  This can take an argument
-     which is a series of filenames, if you have some changes you do not want
-     to commit.
-  #. If your changes include new functionality or cover an untested area of the
-     code, add a test.  (See :ref:`testing` for more information.)  Commit
-     these changes as well.
-  #. Push your changes to your new fork using the command::
+#. If you have used the installation script, the source code for yt can be
+   found in ``$YT_DEST/src/yt-hg``.  Alternatively see
+   :ref:`source-installation` for instructions on how to build yt from the
+   mercurial repository. (Below, in :ref:`reading-source`, we describe how to
+   find items of interest.)  
+#. Edit the source file you are interested in and
+   test your changes.  (See :ref:`testing` for more information.)
+#. Fork yt on BitBucket.  (This step only has to be done once.)  You can do
+   this at: https://bitbucket.org/yt_analysis/yt/fork .  Call this repository
+   yt.
+#. Commit these changes, using ``hg commit``.  This can take an argument
+   which is a series of filenames, if you have some changes you do not want
+   to commit.
+#. If your changes include new functionality or cover an untested area of the
+   code, add a test.  (See :ref:`testing` for more information.)  Commit
+   these changes as well.
+#. Push your changes to your new fork using the command::
 
-        hg push -r . https://bitbucket.org/YourUsername/yt/
+      hg push -r . https://bitbucket.org/YourUsername/yt/
  
-     If you end up doing considerable development, you can set an alias in the
-     file ``.hg/hgrc`` to point to this path.
-  #. Issue a pull request at
-     https://bitbucket.org/YourUsername/yt/pull-request/new
+   If you end up doing considerable development, you can set an alias in the
+   file ``.hg/hgrc`` to point to this path.
+#. Issue a pull request at
+   https://bitbucket.org/YourUsername/yt/pull-request/new
 
 During the course of your pull request you may be asked to make changes.  These
 changes may be related to style issues, correctness issues, or even requesting
 tests.  The process for responding to pull request code review is relatively
 straightforward.
 
-  #. Make requested changes, or leave a comment indicating why you don't think
-     they should be made.
-  #. Commit those changes to your local repository.
-  #. Push the changes to your fork::
+#. Make requested changes, or leave a comment indicating why you don't think
+   they should be made.
+#. Commit those changes to your local repository.
+#. Push the changes to your fork::
 
-        hg push https://bitbucket.org/YourUsername/yt/
+      hg push https://bitbucket.org/YourUsername/yt/
 
-  #. Your pull request will be automatically updated.
-
-.. _writing_documentation:
-
-How to Write Documentation
---------------------------
-
-Writing documentation is one of the most important but often overlooked tasks
-for increasing yt's impact in the community.  It is the way in which the 
-world will understand how to use our code, so it needs to be done concisely
-and understandably.  Typically, when a developer submits some piece of code 
-with new functionality, she should also include documentation on how to use 
-that functionality (as per :ref:`requirements-for-code-submission`).  
-Depending on the nature of the code addition, this could be a new narrative 
-docs section describing how the new code works and how to use it, it could 
-include a recipe in the cookbook section, or it could simply be adding a note 
-in the relevant docs text somewhere.
-
-The documentation exists in the main mercurial code repository for yt in the
-``doc`` directory (i.e. ``$YT_HG/doc/source`` where ``$YT_HG`` is the path of
-the yt mercurial repository).  It is organized hierarchically into the main
-categories of:
-
- * Visualizing
- * Analyzing
- * Examining
- * Cookbook
- * Bootcamp
- * Developing
- * Reference
- * Help
-
-You will have to figure out where your new/modified doc fits into this, but 
-browsing through the pre-built documentation is a good way to sort that out.
-
-All the source for the documentation is written in 
-`Sphinx <http://sphinx-doc.org/>`_, which uses ReST for markup.  ReST is very
-straightforward to markup in a text editor, and if you are new to it, we
-recommend just using other .rst files in the existing yt documentation as 
-templates or checking out the 
-`ReST reference documentation <http://sphinx-doc.org/rest.html>`_.
-
-New cookbook recipes (see :ref:`cookbook`) are very helpful for the community 
-as they provide simple annotated recipes on how to use specific functionality.  
-To add one, create a concise python script which demonstrates some 
-functionality and pare it down to its minimum.  Add some comment lines to 
-describe what it is that you're doing along the way.  Place this ``.py`` file 
-in the ``source/cookbook/`` directory, and then link to it explicitly in one 
-of the relevant ``.rst`` files in that directory (e.g. ``complex_plots.rst``, 
-``cosmological_analysis.rst``, etc.), and add some description of what the script 
-actually does.  We recommend that you use one of the 
-`sample data sets <http://yt-project.org/data>`_ in your recipe.  When the full
-docs are built, each of the cookbook recipes are executed dynamically on 
-a system which has access to all of the sample datasets.  Any output images 
-generated by your script will then be attached inline in the built documentation 
-directly following your script.
-
-After you have made your modifications to the docs, you will want to make sure
-that they render the way you expect them to render.  For more information on
-this, see the section on :ref:`docs_build`.  Unless you're contributing cookbook
-recipes or notebooks which require a dynamical build, you can probably get 
-away with just doing a 'quick' docs build.
-
-When you have completed your documentation additions, commit your changes 
-to your repository and make a pull request in the same way you would contribute 
-a change to the codebase, as described in the section on :ref:`sharing-changes`.
+#. Your pull request will be automatically updated.
 
 How To Get The Source Code For Editing
 --------------------------------------
 
 yt is hosted on BitBucket, and you can see all of the yt repositories at
-http://hg.yt-project.org/ .  With the yt installation script you should have a
+`http://hg.yt-project.org/ <http://hg.yt-project.org/>`_ .  With the yt installation script you should have a
 copy of Mercurial for checking out pieces of code.  Make sure you have followed
 the steps above for bootstrapping your development (to ensure you have a
 bitbucket account, etc.)
@@ -318,7 +253,7 @@
 main yt repository on bitbucket.  A fork is simply an exact copy of the main
 repository (along with its history) that you will now own and can make
 modifications as you please.  You can create a personal fork by visiting the yt
-bitbucket webpage at https://bitbucket.org/yt_analysis/yt/ .  After logging in,
+bitbucket webpage at `https://bitbucket.org/yt_analysis/yt/ <https://bitbucket.org/yt_analysis/yt/>`_ .  After logging in,
 you should see an option near the top right labeled "fork".  Click this option,
 and then click the fork repository button on the subsequent page.  You now have
 a forked copy of the yt repository for your own personal modification.
@@ -380,54 +315,54 @@
 code is contained in the yt subdirectory.  This directory itself contains
 the following subdirectories:
 
-   ``frontends``
-      This is where interfaces to codes are created.  Within each subdirectory of
-      yt/frontends/ there must exist the following files, even if empty:
+``frontends``
+   This is where interfaces to codes are created.  Within each subdirectory of
+   yt/frontends/ there must exist the following files, even if empty:
 
-      * ``data_structures.py``, where subclasses of AMRGridPatch, Dataset
-        and AMRHierarchy are defined.
-      * ``io.py``, where a subclass of IOHandler is defined.
-      * ``fields.py``, where fields we expect to find in datasets are defined
-      * ``misc.py``, where any miscellaneous functions or classes are defined.
-      * ``definitions.py``, where any definitions specific to the frontend are
-        defined.  (i.e., header formats, etc.)
+   * ``data_structures.py``, where subclasses of AMRGridPatch, Dataset
+     and AMRHierarchy are defined.
+   * ``io.py``, where a subclass of IOHandler is defined.
+   * ``fields.py``, where fields we expect to find in datasets are defined
+   * ``misc.py``, where any miscellaneous functions or classes are defined.
+   * ``definitions.py``, where any definitions specific to the frontend are
+     defined.  (i.e., header formats, etc.)
 
-   ``fields``
-      This is where all of the derived fields that ship with yt are defined.
+``fields``
+   This is where all of the derived fields that ship with yt are defined.
 
-   ``geometry`` 
-      This is where geometric helpler routines are defined. Handlers
-      for grid and oct data, as well as helpers for coordinate transformations
-      can be found here.
+``geometry`` 
+   This is where geometric helper routines are defined. Handlers
+   for grid and oct data, as well as helpers for coordinate transformations
+   can be found here.
 
-   ``visualization``
-      This is where all visualization modules are stored.  This includes plot
-      collections, the volume rendering interface, and pixelization frontends.
+``visualization``
+   This is where all visualization modules are stored.  This includes plot
+   collections, the volume rendering interface, and pixelization frontends.
 
-   ``data_objects``
-      All objects that handle data, processed or unprocessed, not explicitly
-      defined as visualization are located in here.  This includes the base
-      classes for data regions, covering grids, time series, and so on.  This
-      also includes derived fields and derived quantities.
+``data_objects``
+   All objects that handle data, processed or unprocessed, not explicitly
+   defined as visualization are located in here.  This includes the base
+   classes for data regions, covering grids, time series, and so on.  This
+   also includes derived fields and derived quantities.
 
-   ``analysis_modules``
-      This is where all mechanisms for processing data live.  This includes
-      things like clump finding, halo profiling, halo finding, and so on.  This
-      is something of a catchall, but it serves as a level of greater
-      abstraction that simply data selection and modification.
+``analysis_modules``
+   This is where all mechanisms for processing data live.  This includes
+   things like clump finding, halo profiling, halo finding, and so on.  This
+   is something of a catchall, but it serves as a level of greater
+   abstraction than simply data selection and modification.
 
-   ``gui``
-      This is where all GUI components go.  Typically this will be some small
-      tool used for one or two things, which contains a launching mechanism on
-      the command line.
+``gui``
+   This is where all GUI components go.  Typically this will be some small
+   tool used for one or two things, which contains a launching mechanism on
+   the command line.
 
-   ``utilities``
-      All broadly useful code that doesn't clearly fit in one of the other
-      categories goes here.
+``utilities``
+   All broadly useful code that doesn't clearly fit in one of the other
+   categories goes here.
 
-   ``extern`` 
-      Bundled external modules (i.e. code that was not written by one of
-      the yt authors but that yt depends on) lives here.
+``extern`` 
+   Bundled external modules (i.e. code that was not written by one of
+   the yt authors but that yt depends on) live here.
 
 
 If you're looking for a specific file or function in the yt source code, use
@@ -457,72 +392,72 @@
 General Guidelines
 ++++++++++++++++++
 
- * In general, follow `PEP-8 <http://www.python.org/dev/peps/pep-0008/>`_ guidelines.
- * Classes are ConjoinedCapitals, methods and functions are
-   ``lowercase_with_underscores.``
- * Use 4 spaces, not tabs, to represent indentation.
- * Line widths should not be more than 80 characters.
- * Do not use nested classes unless you have a very good reason to, such as
-   requiring a namespace or class-definition modification.  Classes should live
-   at the top level.  ``__metaclass__`` is exempt from this.
- * Do not use unnecessary parentheses in conditionals.  ``if((something) and
-   (something_else))`` should be rewritten as ``if something and
-   something_else``.  Python is more forgiving than C.
- * Avoid copying memory when possible. For example, don't do ``a =
-   a.reshape(3,4)`` when ``a.shape = (3,4)`` will do, and ``a = a * 3`` should be
-   ``np.multiply(a, 3, a)``.
- * In general, avoid all double-underscore method names: ``__something`` is
-   usually unnecessary.
- * Doc strings should describe input, output, behavior, and any state changes
-   that occur on an object.  See the file `doc/docstring_example.txt` for a
-   fiducial example of a docstring.
+* In general, follow `PEP-8 <http://www.python.org/dev/peps/pep-0008/>`_ guidelines.
+* Classes are ConjoinedCapitals, methods and functions are
+  ``lowercase_with_underscores``.
+* Use 4 spaces, not tabs, to represent indentation.
+* Line widths should not be more than 80 characters.
+* Do not use nested classes unless you have a very good reason to, such as
+  requiring a namespace or class-definition modification.  Classes should live
+  at the top level.  ``__metaclass__`` is exempt from this.
+* Do not use unnecessary parentheses in conditionals.  ``if((something) and
+  (something_else))`` should be rewritten as ``if something and
+  something_else``.  Python is more forgiving than C.
+* Avoid copying memory when possible. For example, don't do ``a =
+  a.reshape(3,4)`` when ``a.shape = (3,4)`` will do, and ``a = a * 3`` should be
+  ``np.multiply(a, 3, a)``.  (See the sketch after this list.)
+* In general, avoid all double-underscore method names: ``__something`` is
+  usually unnecessary.
+* Doc strings should describe input, output, behavior, and any state changes
+  that occur on an object.  See the file ``doc/docstring_example.txt`` for a
+  fiducial example of a docstring.
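+
+As a minimal sketch of the in-place idioms mentioned in the list above (plain
+NumPy, nothing yt-specific):
+
+.. code-block:: python
+
+   import numpy as np
+
+   a = np.arange(12, dtype="float64")
+
+   # reshape in place rather than allocating a new array
+   a.shape = (3, 4)
+
+   # multiply in place rather than rebinding the name to a new array
+   np.multiply(a, 3, a)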
 
 API Guide
 +++++++++
 
- * Do not import "*" from anything other than ``yt.funcs``.
- * Internally, only import from source files directly; instead of: ``from
-   yt.visualization.api import SlicePlot`` do
-   ``from yt.visualization.plot_window import SlicePlot``.
- * Numpy is to be imported as ``np``.
- * Do not use too many keyword arguments.  If you have a lot of keyword
-   arguments, then you are doing too much in ``__init__`` and not enough via
-   parameter setting.
- * In function arguments, place spaces before commas.  ``def something(a,b,c)``
-   should be ``def something(a, b, c)``.
- * Don't create a new class to replicate the functionality of an old class --
-   replace the old class.  Too many options makes for a confusing user
-   experience.
- * Parameter files external to yt are a last resort.
- * The usage of the ``**kwargs`` construction should be avoided.  If they
-   cannot be avoided, they must be explained, even if they are only to be
-   passed on to a nested function.
- * Constructor APIs should be kept as *simple* as possible.
- * Variable names should be short but descriptive.
- * No global variables!
+* Do not import "*" from anything other than ``yt.funcs``.
+* Internally, only import from source files directly; instead of: ``from
+  yt.visualization.api import SlicePlot`` do
+  ``from yt.visualization.plot_window import SlicePlot``.
+* Numpy is to be imported as ``np``.
+* Do not use too many keyword arguments.  If you have a lot of keyword
+  arguments, then you are doing too much in ``__init__`` and not enough via
+  parameter setting.
+* In function arguments, place spaces after commas.  ``def something(a,b,c)``
+  should be ``def something(a, b, c)``.
+* Don't create a new class to replicate the functionality of an old class --
+  replace the old class.  Too many options make for a confusing user
+  experience.
+* Parameter files external to yt are a last resort.
+* The usage of the ``**kwargs`` construction should be avoided.  If they
+  cannot be avoided, they must be explained, even if they are only to be
+  passed on to a nested function.
+* Constructor APIs should be kept as *simple* as possible.
+* Variable names should be short but descriptive.
+* No global variables!
 
 Variable Names and Enzo-isms
 ++++++++++++++++++++++++++++
 
- * Avoid Enzo-isms.  This includes but is not limited to:
+* Avoid Enzo-isms.  This includes but is not limited to:
 
-   + Hard-coding parameter names that are the same as those in Enzo.  The
-     following translation table should be of some help.  Note that the
-     parameters are now properties on a Dataset subclass: you access them
-     like ``ds.refine_by`` .
+  + Hard-coding parameter names that are the same as those in Enzo.  The
+    following translation table should be of some help.  Note that the
+    parameters are now properties on a Dataset subclass: you access them
+    like ``ds.refine_by`` (see the sketch at the end of this section).
 
-     - ``RefineBy `` => `` refine_by``
-     - ``TopGridRank `` => `` dimensionality``
-     - ``TopGridDimensions `` => `` domain_dimensions``
-     - ``InitialTime `` => `` current_time``
-     - ``DomainLeftEdge `` => `` domain_left_edge``
-     - ``DomainRightEdge `` => `` domain_right_edge``
-     - ``CurrentTimeIdentifier `` => `` unique_identifier``
-     - ``CosmologyCurrentRedshift `` => `` current_redshift``
-     - ``ComovingCoordinates `` => `` cosmological_simulation``
-     - ``CosmologyOmegaMatterNow `` => `` omega_matter``
-     - ``CosmologyOmegaLambdaNow `` => `` omega_lambda``
-     - ``CosmologyHubbleConstantNow `` => `` hubble_constant``
+    - ``RefineBy`` => ``refine_by``
+    - ``TopGridRank`` => ``dimensionality``
+    - ``TopGridDimensions`` => ``domain_dimensions``
+    - ``InitialTime`` => ``current_time``
+    - ``DomainLeftEdge`` => ``domain_left_edge``
+    - ``DomainRightEdge`` => ``domain_right_edge``
+    - ``CurrentTimeIdentifier`` => ``unique_identifier``
+    - ``CosmologyCurrentRedshift`` => ``current_redshift``
+    - ``ComovingCoordinates`` => ``cosmological_simulation``
+    - ``CosmologyOmegaMatterNow`` => ``omega_matter``
+    - ``CosmologyOmegaLambdaNow`` => ``omega_lambda``
+    - ``CosmologyHubbleConstantNow`` => ``hubble_constant``
 
-   + Do not assume that the domain runs from 0 to 1.  This is not true
-     everywhere.
+  + Do not assume that the domain runs from 0 to 1.  This is not true
+    for many codes and datasets.
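+
+As a rough sketch of the "properties on a Dataset subclass" point above (the
+dataset named here is just one of the sample datasets and is only
+illustrative):
+
+.. code-block:: python
+
+   import yt
+
+   ds = yt.load("DD0010/moving7_0010")
+   # Enzo-style parameters are now plain attributes on the Dataset
+   print(ds.refine_by, ds.domain_dimensions, ds.current_time)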

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/developing/intro.rst
--- a/doc/source/developing/intro.rst
+++ b/doc/source/developing/intro.rst
@@ -15,9 +15,9 @@
 
 There are four main communication channels for yt:
 
- * We also have an IRC channel, on ``irc.freenode.net`` in ``#yt``, which can be a
-   bit less on-topic than the mailing lists.  You can connect through our web
-   gateway without any special client, at http://yt-project.org/irc.html .
+ * We have an IRC channel, on ``irc.freenode.net`` in ``#yt``.
+   You can connect through our web
+   gateway without any special client, at `http://yt-project.org/irc.html <http://yt-project.org/irc.html>`_.
    *IRC is the first stop for conversation!*
  * `yt-users <http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org>`_
    is a relatively high-traffic mailing list where people are encouraged to ask
@@ -60,8 +60,8 @@
 to have more examples that show complex or advanced behavior -- and if you have
 used such scripts to write a paper, that too would be an amazing contribution.
 
-Documentation and Screencasts
------------------------------
+Documentation 
+-------------
 
 The yt documentation -- which you are reading right now -- is constantly being
 updated, and it is a task we would very much appreciate assistance with.
@@ -75,18 +75,6 @@
 issue a pull request through the website for your new fork, and we can comment
 back and forth and eventually accept your changes.
 
-One of the more interesting ways we are attempting to do lately is to add
-screencasts to the documentation -- these are recordings of people executing
-sessions in a terminal or in a web browser, showing off functionality and
-describing how to do various things.  These provide a more dynamic and
-engaging way of demonstrating functionality and teaching methods.
-
-One easy place to record screencasts is with `Screencast-O-Matic
-<http://www.screencast-o-matic.com/>`_ but there are many to choose from.  Once
-you have recorded it, let us know and be sure to add it to the
-`yt Vimeo group <http://vimeo.com/groups/ytgallery>`_.  We'll then link to it
-from the documentation!
-
 Gallery Images and Videos
 -------------------------
 
@@ -96,9 +84,9 @@
 email it to us and we'll add it to the `Gallery
 <http://yt-project.org/gallery.html>`_.
 
-We're eager to show off the images you make with yt, so please feel free to
-drop `us <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_ a
-line and let us know if you've got something great!
+We're eager to show off the images and movies you make with yt, so please feel 
+free to drop `us <http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org>`_ 
+a line and let us know if you've got something great!
 
 Technical Contributions
 -----------------------

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/developing/testing.rst
--- a/doc/source/developing/testing.rst
+++ b/doc/source/developing/testing.rst
@@ -1,6 +1,5 @@
 .. _testing:
 
-=======
 Testing
 =======
 
@@ -46,8 +45,8 @@
 
 .. code-block:: python
 
-   >>> import yt
-   >>> yt.run_nose()
+   import yt
+   yt.run_nose()
 
 If you are developing new functionality, it is sometimes more convenient to use
 the Nose command line interface, ``nosetests``. You can run the unit tests
@@ -79,36 +78,36 @@
 document, as in some cases they belong to other packages.  However, a few come
 in handy:
 
- * :func:`yt.testing.fake_random_ds` provides the ability to create a random
-   dataset, with several fields and divided into several different
-   grids, that can be operated on.
- * :func:`yt.testing.assert_equal` can operate on arrays.
- * :func:`yt.testing.assert_almost_equal` can operate on arrays and accepts a
-   relative allowable difference.
- * :func:`yt.testing.amrspace` provides the ability to create AMR grid
-   structures.
- * :func:`~yt.testing.expand_keywords` provides the ability to iterate over
-   many values for keywords.
+* :func:`~yt.testing.fake_random_ds` provides the ability to create a random
+  dataset, with several fields and divided into several different
+  grids, that can be operated on.
+* :func:`~yt.testing.assert_equal` can operate on arrays.
+* :func:`~yt.testing.assert_almost_equal` can operate on arrays and accepts a
+  relative allowable difference.
+* :func:`~yt.testing.amrspace` provides the ability to create AMR grid
+  structures.
+* :func:`~yt.testing.expand_keywords` provides the ability to iterate over
+  many values for keywords.
 
 To create new unit tests:
 
- #. Create a new ``tests/`` directory next to the file containing the
-    functionality you want to test.  Be sure to add this new directory as a
-    subpackage in the setup.py script located in the directory you're adding a
-    new ``tests/`` folder to.  This ensures that the tests will be deployed in
-    yt source and binary distributions.
- #. Inside that directory, create a new python file prefixed with ``test_`` and
-    including the name of the functionality.
- #. Inside that file, create one or more routines prefixed with ``test_`` that
-    accept no arguments.  These should ``yield`` a set of values of the form
-    ``function``, ``arguments``.  For example ``yield assert_equal, 1.0, 1.0``
-    would evaluate that 1.0 equaled 1.0.
- #. Use ``fake_random_ds`` to test on datasets, and be sure to test for
-    several combinations of ``nproc``, so that domain decomposition can be
-    tested as well.
- #. Test multiple combinations of options by using the
-    :func:`~yt.testing.expand_keywords` function, which will enable much
-    easier iteration over options.
+#. Create a new ``tests/`` directory next to the file containing the
+   functionality you want to test.  Be sure to add this new directory as a
+   subpackage in the setup.py script located in the directory you're adding a
+   new ``tests/`` folder to.  This ensures that the tests will be deployed in
+   yt source and binary distributions.
+#. Inside that directory, create a new python file prefixed with ``test_`` and
+   including the name of the functionality.
+#. Inside that file, create one or more routines prefixed with ``test_`` that
+   accept no arguments.  These should ``yield`` a set of values of the form
+   ``function``, ``arguments``.  For example ``yield assert_equal, 1.0, 1.0``
+   would evaluate that 1.0 equaled 1.0.
+#. Use ``fake_random_ds`` to test on datasets, and be sure to test for
+   several combinations of ``nproc``, so that domain decomposition can be
+   tested as well.
+#. Test multiple combinations of options by using the
+   :func:`~yt.testing.expand_keywords` function, which will enable much
+   easier iteration over options.
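+
+Putting these steps together, a minimal sketch of such a test file might look
+like the following (the module and test names are hypothetical):
+
+.. code-block:: python
+
+   from yt.testing import fake_random_ds, assert_equal
+
+   def test_ones_field():
+       # the "ones" field should be exactly 1.0 everywhere, for any
+       # domain decomposition
+       for nprocs in [1, 2, 4]:
+           ds = fake_random_ds(16, nprocs=nprocs)
+           ad = ds.all_data()
+           yield assert_equal, ad["ones"].min(), 1.0
+           yield assert_equal, ad["ones"].max(), 1.0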
 
 For an example of how to write unit tests, look at the file
 ``yt/data_objects/tests/test_covering_grid.py``, which covers a great deal of
@@ -134,16 +133,16 @@
 The very first step is to make a directory and copy over the data against which
 you want to test.  Currently, we test:
 
- * ``DD0010/moving7_0010`` (available in ``tests/`` in the yt distribution)
- * ``IsolatedGalaxy/galaxy0030/galaxy0030``
- * ``WindTunnel/windtunnel_4lev_hdf5_plt_cnt_0030``
- * ``GasSloshingLowRes/sloshing_low_res_hdf5_plt_cnt_0300``
- * ``TurbBoxLowRes/data.0005.3d.hdf5``
- * ``GaussianCloud/data.0077.3d.hdf5``
- * ``RadAdvect/plt00000``
- * ``RadTube/plt00500``
+* ``DD0010/moving7_0010`` (available in ``tests/`` in the yt distribution)
+* ``IsolatedGalaxy/galaxy0030/galaxy0030``
+* ``WindTunnel/windtunnel_4lev_hdf5_plt_cnt_0030``
+* ``GasSloshingLowRes/sloshing_low_res_hdf5_plt_cnt_0300``
+* ``TurbBoxLowRes/data.0005.3d.hdf5``
+* ``GaussianCloud/data.0077.3d.hdf5``
+* ``RadAdvect/plt00000``
+* ``RadTube/plt00500``
 
-These datasets are available at http://yt-project.org/data/.
+These datasets are available at `http://yt-project.org/data/ <http://yt-project.org/data/>`_.
 
 Next, modify the file ``~/.yt/config`` to include a section ``[yt]``
 with the parameter ``test_data_dir``.  Set this to point to the
@@ -162,8 +161,8 @@
 
 .. code-block:: python
 
-   >>> import yt
-   >>> yt.run_nose(run_answer_tests=True)
+   import yt
+   yt.run_nose(run_answer_tests=True)
 
 If you have installed yt using ``python setup.py develop`` you can also
 optionally invoke nose using the ``nosetests`` command line interface:
@@ -183,8 +182,8 @@
 
 .. code-block:: python
 
-   >>> import yt
-   >>> yt.run_nose(run_answer_tests=True, answer_big_data=True)
+   import yt
+   yt.run_nose(run_answer_tests=True, answer_big_data=True)
 
 or, in the base directory of the yt mercurial repository:
 
@@ -231,24 +230,24 @@
 
 To write a new test:
 
- * Subclass ``AnswerTestingTest``
- * Add the attributes ``_type_name`` (a string) and ``_attrs``
-   (a tuple of strings, one for each attribute that defines the test --
-   see how this is done for projections, for instance)
- * Implement the two routines ``run`` and ``compare``  The first
-   should return a result and the second should compare a result to an old
-   result.  Neither should yield, but instead actually return.  If you need
-   additional arguments to the test, implement an ``__init__`` routine.
- * Keep in mind that *everything* returned from ``run`` will be stored.  So if
-   you are going to return a huge amount of data, please ensure that the test
-   only gets run for small data.  If you want a fast way to measure something as
-   being similar or different, either an md5 hash (see the grid values test) or
-   a sum and std of an array act as good proxies.  If you must store a large
-   amount of data for some reason, try serializing the data to a string
-   (e.g. using ``numpy.ndarray.dumps``), and then compressing the data stream
-   using ``zlib.compress``.
- * Typically for derived values, we compare to 10 or 12 decimal places.
-   For exact values, we compare exactly.
+* Subclass ``AnswerTestingTest``
+* Add the attributes ``_type_name`` (a string) and ``_attrs``
+  (a tuple of strings, one for each attribute that defines the test --
+  see how this is done for projections, for instance)
+* Implement the two routines ``run`` and ``compare``.  The first
+  should return a result and the second should compare a result to an old
+  result.  Neither should yield, but instead actually return.  If you need
+  additional arguments to the test, implement an ``__init__`` routine.
+* Keep in mind that *everything* returned from ``run`` will be stored.  So if
+  you are going to return a huge amount of data, please ensure that the test
+  only gets run for small data.  If you want a fast way to measure something as
+  being similar or different, either an md5 hash (see the grid values test) or
+  a sum and std of an array act as good proxies.  If you must store a large
+  amount of data for some reason, try serializing the data to a string
+  (e.g. using ``numpy.ndarray.dumps``), and then compressing the data stream
+  using ``zlib.compress``.
+* Typically for derived values, we compare to 10 or 12 decimal places.
+  For exact values, we compare exactly.
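+
+A minimal sketch of such a test class follows (the class and field names are
+hypothetical, and we assume the base class loads ``ds_fn`` into ``self.ds``):
+
+.. code-block:: python
+
+   from yt.utilities.answer_testing.framework import AnswerTestingTest
+   from yt.testing import assert_almost_equal
+
+   class FieldSumTest(AnswerTestingTest):
+       _type_name = "FieldSum"
+       _attrs = ("field", )
+
+       def __init__(self, ds_fn, field):
+           super(FieldSumTest, self).__init__(ds_fn)
+           self.field = field
+
+       def run(self):
+           # return a small scalar so the stored answer stays small
+           return self.ds.all_data()[self.field].sum()
+
+       def compare(self, new_result, old_result):
+           assert_almost_equal(new_result, old_result, 10)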
 
 How to Add Data to the Testing Suite
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -257,26 +256,26 @@
 The Enzo example in ``yt/frontends/enzo/tests/test_outputs.py`` is
 considered canonical.  Do these things:
 
- * Create a new directory, ``tests`` inside the frontend's directory.
+* Create a new directory, ``tests`` inside the frontend's directory.
 
- * Create a new file, ``test_outputs.py`` in the frontend's ``tests``
-   directory.
+* Create a new file, ``test_outputs.py`` in the frontend's ``tests``
+  directory.
 
- * Create a new routine that operates similarly to the routines you can see
-   in Enzo's outputs.
+* Create a new routine that operates similarly to the routines you can see
+  in Enzo's outputs.
 
-   * This routine should test a number of different fields and data objects.
+  * This routine should test a number of different fields and data objects.
 
-   * The test routine itself should be decorated with
-     ``@requires_ds(file_name)``  This decorate can accept the argument
-     ``big_data`` for if this data is too big to run all the time.
+  * The test routine itself should be decorated with
+    ``@requires_ds(file_name)``.  This decorator can accept the argument
+    ``big_data`` if the data is too big to run all the time.
 
-   * There are ``small_patch_amr`` and ``big_patch_amr`` routines that
-     you can yield from to execute a bunch of standard tests.  This is where
-     you should start, and then yield additional tests that stress the
-     outputs in whatever ways are necessary to ensure functionality.
+  * There are ``small_patch_amr`` and ``big_patch_amr`` routines that
+    you can yield from to execute a bunch of standard tests.  This is where
+    you should start, and then yield additional tests that stress the
+    outputs in whatever ways are necessary to ensure functionality.
 
-   * **All tests should be yielded!**
+  * **All tests should be yielded!**
 
 If you are adding to a frontend that has a few tests already, skip the first
 two steps.
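+
+As a rough sketch modeled on the Enzo example (the dataset and field names
+here are only illustrative):
+
+.. code-block:: python
+
+   from yt.utilities.answer_testing.framework import \
+       requires_ds, data_dir_load, small_patch_amr
+
+   m7 = "DD0010/moving7_0010"
+
+   @requires_ds(m7)
+   def test_moving7():
+       ds = data_dir_load(m7)
+       fields = ("temperature", "density", "velocity_magnitude")
+       # yield the standard battery of small-data patch AMR tests
+       for test in small_patch_amr(m7, fields):
+           yield test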

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/examining/low_level_inspection.rst
--- a/doc/source/examining/low_level_inspection.rst
+++ b/doc/source/examining/low_level_inspection.rst
@@ -230,6 +230,5 @@
 directly as a fixed resolution array.  This provides a means for bypassing the 
 yt method for generating plots, and allows the user the freedom to use 
 whatever interface they wish for displaying and saving their image data.  
-The object for doing this is the aptly titled Fixed Resolution Buffer, and 
-there is a full explanation for how to use it 
-:ref:`here <fixed-resolution-buffers>`.
+You can use the :class:`~yt.visualization.fixed_resolution.FixedResolutionBuffer`
+to accomplish this as described in :ref:`fixed-resolution-buffers`.
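+
+A minimal sketch of this workflow (assuming one of the sample datasets):
+
+.. code-block:: python
+
+   import yt
+
+   ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   slc = ds.slice("z", 0.5)
+   # a 512x512 buffer covering 1 Mpc on a side
+   frb = slc.to_frb((1.0, "Mpc"), 512)
+   image = frb["density"]   # a plain 2D array you can plot or save yourself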

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/help/index.rst
--- a/doc/source/help/index.rst
+++ b/doc/source/help/index.rst
@@ -1,6 +1,6 @@
 .. _asking-for-help:
 
-What to do if you run into problems
+What to Do If You Run into Problems
 ===================================
 
 If you run into problems with yt, there are a number of steps to follow
@@ -11,18 +11,18 @@
 
 To summarize, here are the steps in order:
 
- #. Don’t panic and don’t give up
- #. Update to the latest version
- #. Search the yt documentation and mailing list archives
- #. Look at the yt source
- #. Isolate & document your problem 
- #. Go on IRC and ask a question
- #. Ask the mailing list
- #. Submit a bug report
+#. Don’t panic and don’t give up
+#. Update to the latest version
+#. Search the yt documentation and mailing list archives
+#. Look at the yt source
+#. Isolate & document your problem 
+#. Go on IRC and ask a question
+#. Ask the mailing list
+#. Submit a bug report
 
 .. _dont-panic:
 
-Don't panic and don't give up
+Don't Panic and Don't Give Up
 -----------------------------
 
 This may seem silly, but it's effective.  While yt is a robust code with
@@ -34,7 +34,7 @@
 
 .. _update-the-code:
 
-Try updating yt
+Try Updating yt
 ---------------
 
 Sometimes the pace of development is pretty fast on yt, particularly in the
@@ -53,9 +53,35 @@
 
   $ yt update --all
 
+.. _update-errors:
+
+Update Errors
+^^^^^^^^^^^^^
+
+If the ``update`` command fails with errors, or if yt fails to load either
+from the command line or from within python, you may simply need to rebuild
+the yt source (some of the C code in yt must be rebuilt after major changes).
+To do this, navigate to the root of the yt mercurial repository.  If you
+installed with the all-in-one installer script, this is the
+``yt-<machine>/src/yt-hg`` directory.  Then execute this command:
+
+.. code-block:: bash
+
+  $ python setup.py develop
+
+Now try running yt again with:
+
+.. code-block:: bash
+
+  $ yt --help
+
+If you continue to see errors, try contacting us via IRC or email, but you
+may have to reinstall yt (see :ref:`getting-and-installing-yt`).
+
 .. _search-the-documentation:
 
-Search the documentation and mailing lists
+Search the Documentation and Mailing Lists
 ------------------------------------------
 
 The documentation has a lot of the answers to everyday problems.  This doesn't 
@@ -84,7 +110,7 @@
 
 .. _look-at-the-source:
 
-Look at the source code
+Look at the Source Code
 -----------------------
 
 We've done our best to make the source clean, and it is easily searchable from 
@@ -125,7 +151,7 @@
 
 .. _isolate_and_document:
 
-Isolate and document your problem
+Isolate and Document Your Problem
 ---------------------------------
 
 As you gear up to take your question to the rest of the community, try to distill
@@ -133,15 +159,15 @@
 script.  This can help you (and us) to identify the basic problem.  Follow
 these steps:
 
- * Identify what it is that went wrong, and how you knew it went wrong.
- * Put your script, errors, and outputs online:
+* Identify what it is that went wrong, and how you knew it went wrong.
+* Put your script, errors, and outputs online:
 
-   * ``$ yt pastebin script.py`` - pastes script.py online
-   * ``$ yt upload_image image.png`` - pastes image online
+  * ``$ yt pastebin script.py`` - pastes script.py online
+  * ``$ yt upload_image image.png`` - pastes image online
 
- * Identify which version of the code you’re using. 
+* Identify which version of the code you’re using. 
 
-   * ``$ yt version`` - provides version information, including changeset hash
+  * ``$ yt version`` - provides version information, including changeset hash
 
 It may be that through the mere process of doing this, you end up solving 
 the problem!
@@ -162,7 +188,7 @@
 
 .. _mailing-list:
 
-Ask the mailing list
+Ask the Mailing List
 --------------------
 
 If you still haven't found a solution, feel free to 
@@ -183,7 +209,7 @@
 
 .. _reporting-a-bug:
 
-How To report A bug
+How to Report a Bug
 -------------------
 
 If you have gone through all of the above steps, and you're still encountering 
@@ -195,7 +221,6 @@
 ticket in your stead.  Remember to include the information
 about your problem you identified in :ref:`this step <isolate_and_document>`.
 
-
 Installation Issues
 -------------------
 

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/index.rst
--- a/doc/source/index.rst
+++ b/doc/source/index.rst
@@ -28,7 +28,7 @@
          </p></td><td width="75%">
-         <p class="linkdescr">Getting and Installing yt</p>
+         <p class="linkdescr">Getting, Installing, and Updating yt</p></td></tr><tr valign="top">
@@ -78,7 +78,7 @@
          </p></td><td width="75%">
-         <p class="linkdescr">Use analysis tools to extract results from your data</p>
+         <p class="linkdescr">Use analysis tools to extract results from your data</p></td></tr><tr valign="top">

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/installing.rst
--- a/doc/source/installing.rst
+++ b/doc/source/installing.rst
@@ -13,9 +13,9 @@
 
 * If you do not have root access on your computer, are not comfortable managing
   python packages, or are working on a supercomputer or cluster computer, you
-  will probably want to use the bash installation script.  This builds python,
-  numpy, matplotlib, and yt from source to set up an isolated scientific python
-  environment inside of a single folder in your home directory. See
+  will probably want to use the bash all-in-one installation script.  This builds 
+  python, numpy, matplotlib, and yt from source to set up an isolated scientific 
+  python environment inside of a single folder in your home directory. See
   :ref:`install-script` for more details.
 
 * If you use the `Anaconda <https://store.continuum.io/cshop/anaconda/>`_ python
@@ -261,36 +261,6 @@
 package install path.  If you do not have write access for this location, you
 might need to use ``sudo``.
 
-Switching to yt 2.x
-^^^^^^^^^^^^^^^^^^^
-
-With the release of version 3.0 of yt, development of the legacy yt 2.x series
-has been relegated to bugfixes.  That said, we will continue supporting the 2.x
-series for the forseeable future.  This makes it easy to use scripts written
-for older versions of yt without substantially updating them to support the
-new field naming or unit systems in yt version 3.
-
-Currently, the yt-2.x codebase is contained in a named branch in the yt
-mercurial repository.  First, remove any extant installations of yt on your
-system:
-
-.. code-block:: bash
-
-  pip uninstall yt
-
-To switch to yt-2.x, you will need to clone the mercurial repository as
-described in :ref:`source-installation`.  Next, you will need to navigate to the
-mercurial repository, update to the `yt-2.x` branch, and recompile:
-
-.. code-block:: bash
-
-  cd yt
-  hg update yt-2.x
-  python setup.py develop --user --prefix=
-
-You can check which version of yt you have installed by invoking ``yt version``
-at the command line.
-
 Keeping yt Updated via Mercurial
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -327,7 +297,76 @@
 
 If you get an error, follow the instructions it gives you to debug the problem.
 Do not hesitate to :ref:`contact us <asking-for-help>` so we can help you
-figure it out.
+figure it out.  There is also information at :ref:`update-errors`.
 
 If you like, this might be a good time to run the test suite, see :ref:`testing`
 for more details.
+
+.. _switching-between-yt-versions:
+
+Switching between yt-2.x and yt-3.x
+-----------------------------------
+
+With the release of version 3.0 of yt, development of the legacy yt 2.x series
+has been relegated to bugfixes.  That said, we will continue supporting the 2.x
+series for the foreseeable future.  This makes it easy to use scripts written
+for older versions of yt without substantially updating them to support the
+new field naming or unit systems in yt version 3.
+
+Currently, the yt-2.x codebase is contained in a named branch in the yt
+mercurial repository.  Thus, depending on the method you used to install
+yt, there are different instructions for switching versions.
+
+If You Installed yt Using the Installer Script
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+You already have the mercurial repository, so you simply need to switch
+which version you're using.  Navigate to the root of the yt mercurial
+repository, update to the desired version, and rebuild the source (some of the
+C code requires a compilation step for big changes like this):
+
+.. code-block:: bash
+
+  cd yt-<machine>/src/yt-hg
+  hg update <desired-version>
+  python setup.py develop
+
+Valid versions to jump to are:
+
+* ``yt`` -- The latest *dev* changes in yt-3.x (can be unstable)
+* ``stable`` -- The latest stable release of yt-3.x
+* ``yt-2.x`` -- The latest stable release of yt-2.x
+    
+You can check which version of yt you have installed by invoking ``yt version``
+at the command line.  If you encounter problems, see :ref:`update-errors`.
+
+If You Installed yt from Source or Using pip
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+If you have installed yt from source or via ``pip``, remove 
+any extant installations of yt on your system and clone the source mercurial 
+repository of yt as described in :ref:`source-installation`.
+
+.. code-block:: bash
+
+  pip uninstall yt
+  hg clone https://bitbucket.org/yt_analysis/yt
+
+Now, to switch between versions, you need to navigate to the root of
+the mercurial yt repository. Use mercurial to
+update to the appropriate version and recompile.  
+
+.. code-block:: bash
+
+  cd yt
+  hg update <desired-version>
+  python setup.py install --user --prefix=
+
+Valid versions to jump to are:
+
+* ``yt`` -- The latest *dev* changes in yt-3.x (can be unstable)
+* ``stable`` -- The latest stable release of yt-3.x
+* ``yt-2.x`` -- The latest stable release of yt-2.x
+    
+You can check which version of yt you have installed by invoking ``yt version``
+at the command line.  If you encounter problems, see :ref:`update-errors`.

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/reference/api/api.rst
--- a/doc/source/reference/api/api.rst
+++ b/doc/source/reference/api/api.rst
@@ -87,6 +87,7 @@
    ~yt.data_objects.selection_data_containers.YTSphereBase
    ~yt.data_objects.selection_data_containers.YTEllipsoidBase
    ~yt.data_objects.selection_data_containers.YTCutRegionBase
+   ~yt.data_objects.grid_patch.AMRGridPatch
 
 Construction Objects
 ++++++++++++++++++++

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/reference/command-line.rst
--- a/doc/source/reference/command-line.rst
+++ b/doc/source/reference/command-line.rst
@@ -216,7 +216,7 @@
 By running the ``pastebin_grab`` subcommand with a pastebin number 
 (e.g. 1768), it will grab the contents of that pastebin 
 (e.g. the website http://paste.yt-project.org/show/1768 ) and send it to 
-STDOUT for local use.  For more details see the :ref:`pastebin` section.
+STDOUT for local use.  See :ref:`pastebin` for more information.
 
 .. code-block:: bash
 

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/visualizing/streamlines.rst
--- a/doc/source/visualizing/streamlines.rst
+++ b/doc/source/visualizing/streamlines.rst
@@ -8,7 +8,8 @@
 velocity flow or magnetic field lines, they can be defined to follow
 any three-dimensional vector field.  Once an initial condition and
 total length of the streamline are specified, the streamline is
-uniquely defined.    
+uniquely defined.  Relatedly, yt also has the ability to follow 
+:ref:`particle-trajectories`.
 
 Method
 ------

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 doc/source/yt3differences.rst
--- a/doc/source/yt3differences.rst
+++ b/doc/source/yt3differences.rst
@@ -9,6 +9,31 @@
 minimize disruption to existing scripts, but necessarily things will be
 different in some ways.
 
+Updating to yt 3.0 from Old Versions
+------------------------------------
+
+First off, you need to update your version of yt to yt 3.0.  If you're
+installing yt for the first time, please visit :ref:`getting-and-installing-yt`.
+If you already have a version of yt installed, you should just need one
+command:
+
+.. code-block:: bash
+
+    $ yt update --all
+
+This will update yt to the most recent version and rebuild the source base.  
+If you installed using the installer script, it will ensure you have all of the
+latest dependencies as well.  This step may take a few minutes.  To test
+that yt is running properly, try:
+
+.. code-block:: bash
+
+    $ yt --help
+
+If you receive no errors, then you are ready to go.  If you have
+an error, then consult :ref:`update-errors` for solutions.  We also
+provide instructions for :ref:`switching-between-yt-versions`.
+
 Cheat Sheet
 -----------
 
@@ -129,7 +154,7 @@
 Preliminary support for non-cartesian coordinates has been added.  We expect
 this to be considerably solidified and expanded in yt 3.1.
 
-Reworked import system
+Reworked Import System
 ^^^^^^^^^^^^^^^^^^^^^^
 
 It's now possible to import all yt functionality using ``import yt``. Rather
@@ -176,15 +201,15 @@
    ds = yt.load("MyData")
    ds.setup_deprecated_fields()
 
-This sets up aliases from the old names to the new.  See :ref:`fields` for
-more information.
+This sets up aliases from the old names to the new.  See :ref:`fields` and
+:ref:`field-list` for more information.
 
 Units of Fields
 ^^^^^^^^^^^^^^^
 
 Fields now are all subclasses of NumPy arrays, the ``YTArray``, which carries
 along with it units.  This means that if you want to manipulate fields, you
-have to modify them in a unitful way.
+have to modify them in a unitful way.  See :ref:`units`.
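+
+As a brief sketch of working with unitful fields (assuming one of the sample
+datasets):
+
+.. code-block:: python
+
+   import yt
+
+   ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   dens = ds.all_data()["density"]          # a YTArray carrying units
+   print(dens.max().in_units("Msun/kpc**3"))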
 
 Parameter Files are Now Datasets
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -244,7 +269,8 @@
 ^^^^^^^^^^^^^^^^
 
 All data objects now accept an explicit list of ``field_parameters`` rather
-than accepting ``kwargs`` and supplying them to field parameters.
+than accepting ``kwargs`` and supplying them to field parameters.  See 
+:ref:`field-parameters`.
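+
+A brief sketch of passing field parameters explicitly (the dataset name and
+parameter value here are only illustrative):
+
+.. code-block:: python
+
+   import yt
+
+   ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   bv = ds.arr([100.0, 0.0, 0.0], "km/s")
+   sp = ds.sphere("c", (10, "kpc"), field_parameters={"bulk_velocity": bv})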
 
 Object Renaming
 ^^^^^^^^^^^^^^^
@@ -253,7 +279,8 @@
 removing ``AMR`` from the prefix or replacing it with ``YT``.  All names of
 objects remain the same for the purposes of selecting data and creating them;
 i.e., ``sphere`` objects are still called ``sphere`` - you can create one
-via ``ds.sphere``.  For a detailed description and index see :ref:`available-objects`.
+via ``ds.sphere``.  For a detailed description and index see 
+:ref:`available-objects`.
 
 Boolean Regions
 ^^^^^^^^^^^^^^^
@@ -279,3 +306,9 @@
 
 This will "spatially" chunk the ``obj`` object and print out all the grids
 included.
+
+Halo Catalogs
+^^^^^^^^^^^^^
+
+The ``Halo Profiler`` infrastructure has been fundamentally rewritten; its
+functionality now lives in the ``Halo Catalog`` framework.  See
+:ref:`halo-analysis`.
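+
+A minimal sketch of the new framework (the dataset name and finder choice are
+only illustrative):
+
+.. code-block:: python
+
+   import yt
+   from yt.analysis_modules.halo_analysis.api import HaloCatalog
+
+   data_ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
+   hc = HaloCatalog(data_ds=data_ds, finder_method="hop")
+   hc.create()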

diff -r 9701602a49cf2aa538b53570bb9038bc1ef441e4 -r 2ad686b9cb30292efbadf4d0f0cf3d18961537f2 yt/utilities/command_line.py
--- a/yt/utilities/command_line.py
+++ b/yt/utilities/command_line.py
@@ -388,7 +388,7 @@
     yt_provider = pkg_resources.get_provider("yt")
     path = os.path.dirname(yt_provider.module_path)
     if not os.path.isdir(os.path.join(path, ".hg")): return None
-    version = _get_hg_version(path)[:12]
+    version = _get_hg_version(path)
     return version
 
 # This code snippet is modified from Georg Brandl

Repository URL: https://bitbucket.org/yt_analysis/yt/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.


More information about the yt-svn mailing list