[yt-svn] commit/yt: 13 new changesets

commits-noreply at bitbucket.org
Mon Aug 4 05:45:39 PDT 2014


13 new commits in yt:

https://bitbucket.org/yt_analysis/yt/commits/bec313f5b326/
Changeset:   bec313f5b326
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 02:12:54
Summary:     Minor docs corrections.
Affected #:  2 files

diff -r 4a8c93735cdf58c42d1f60dbd3fb323e227ab982 -r bec313f5b326893ec7fccba65ac3a092434fc5e5 doc/source/index.rst
--- a/doc/source/index.rst
+++ b/doc/source/index.rst
@@ -28,7 +28,7 @@
          </p></td><td width="75%">
-         <p class="linkdescr">Getting, Installing, and Updating yt</p>
+         <p class="linkdescr">Getting, installing, and updating yt</p></td></tr><tr valign="top">

diff -r 4a8c93735cdf58c42d1f60dbd3fb323e227ab982 -r bec313f5b326893ec7fccba65ac3a092434fc5e5 doc/source/installing.rst
--- a/doc/source/installing.rst
+++ b/doc/source/installing.rst
@@ -341,7 +341,7 @@
 at the command line.  If you encounter problems, see :ref:`update-errors`.
 
 If You Installed yt Using from Source or Using pip
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+++++++++++++++++++++++++++++++++++++++++++++++++++
 
 If you have installed python via ``pip``, remove 
 any extant installations of yt on your system and clone the source mercurial 


https://bitbucket.org/yt_analysis/yt/commits/c92f37f8a49b/
Changeset:   c92f37f8a49b
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 02:13:48
Summary:     Correcting a unit bug in spherical position fields.
Affected #:  1 file

diff -r bec313f5b326893ec7fccba65ac3a092434fc5e5 -r c92f37f8a49b03846a4423dba6dabba6ae4267fb yt/fields/particle_fields.py
--- a/yt/fields/particle_fields.py
+++ b/yt/fields/particle_fields.py
@@ -351,7 +351,7 @@
 
     registry.add_field((ptype, "particle_radius_spherical"),
               function=_particle_radius_spherical,
-              particle_type=True, units="cm/s",
+              particle_type=True, units="cm",
               validators=[ValidateParameter("normal"), 
                           ValidateParameter("center")])
 
@@ -369,7 +369,7 @@
 
     registry.add_field((ptype, "particle_theta_spherical"),
               function=_particle_theta_spherical,
-              particle_type=True, units="cm/s",
+              particle_type=True, units="cm",
               validators=[ValidateParameter("normal"), 
                           ValidateParameter("center")])
 
@@ -387,7 +387,7 @@
 
     registry.add_field((ptype, "particle_phi_spherical"),
               function=_particle_phi_spherical,
-              particle_type=True, units="cm/s",
+              particle_type=True, units="cm",
               validators=[ValidateParameter("normal"), 
                           ValidateParameter("center")])
 


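A quick way to sanity-check the unit fix above (a sketch only; the sample dataset path and the availability of these fields for the "all" particle union are assumptions, not part of the changeset):

    import yt

    ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
    ad = ds.all_data()

    # The spherical position fields require "normal" and "center" field
    # parameters, per the validators registered in particle_fields.py.
    ad.set_field_parameter("center", ds.domain_center)
    ad.set_field_parameter("normal", [0.0, 0.0, 1.0])

    r_sph = ad[("all", "particle_radius_spherical")]
    print(r_sph.units)   # expected to report "cm" after this fix, not "cm/s"
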
https://bitbucket.org/yt_analysis/yt/commits/65216c3eb198/
Changeset:   65216c3eb198
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 02:15:15
Summary:     Adding to the description of how fields are aliased.
Affected #:  2 files

diff -r c92f37f8a49b03846a4423dba6dabba6ae4267fb -r 65216c3eb198e0f01e7f656c6acd2fc9295c2396 doc/source/analyzing/fields.rst
--- a/doc/source/analyzing/fields.rst
+++ b/doc/source/analyzing/fields.rst
@@ -20,7 +20,8 @@
 for datasets containing multiple different types of fluid fields, mesh fields,
 particles (with overlapping or disjoint lists of fields).  To enable accessing
 these fields in a meaningful, simple way, the mechanism for accessing them has
-changed to take an optional *field type* in addition to the *field name*.
+changed to take an optional *field type* in addition to the *field name* of
+the form ('*field type*', '*field name*').
 
 As an example, we may be in a situation where have multiple types of particles
 which possess the ``particle_position`` field.  In the case where a data
@@ -99,17 +100,18 @@
 should be returned in.  If an aliased field is requested (and aliased fields 
 will always be lowercase, with underscores separating words) it will be returned 
 in CGS units (future versions will enable global defaults to be set for MKS and 
-other unit systems), whereas if the underlying field is requested, it will not 
-undergo any unit conversions from its natural units.  (This rule is occasionally 
-violated for fields which are mesh-dependent, specifically particle masses in 
-some cosmology codes.)
+other unit systems), whereas if the frontend-specific field is requested, it 
+will not undergo any unit conversions from its natural units.  (This rule is 
+occasionally violated for fields which are mesh-dependent, specifically particle 
+masses in some cosmology codes.)
 
-.. _known_field_types:
+.. _known-field-types:
 
 Field types known to yt
 -----------------------
 
-yt knows of a few different field types:
+Recall that fields are formally accessed in two parts: ('*field type*', 
+'*field name*').  Here we describe the different field types you will encounter:
 
 * frontend-name -- Mesh or fluid fields that exist on-disk default to having
   the name of the frontend as their type name (e.g., ``enzo``, ``flash``,
@@ -140,6 +142,14 @@
   density estimates, counts, and the like.  See :ref:`deposited-particle-fields` 
   for more information.
 
+While it is best to explicitly access fields by their full names
+(i.e. ('*field type*', '*field name*')), yt provides an abbreviated 
+interface for accessing common fields (i.e. '*field name*').  In the abbreviated
+case, yt will assume you want the last *field type* accessed.  If you
+haven't previously accessed a *field type*, it will default to *field type* = 
+``'all'`` in the case of particle fields and *field type* = ``'gas'`` in the 
+case of mesh fields.
+
 Field Plugins
 -------------
 

diff -r c92f37f8a49b03846a4423dba6dabba6ae4267fb -r 65216c3eb198e0f01e7f656c6acd2fc9295c2396 doc/source/reference/api/api.rst
--- a/doc/source/reference/api/api.rst
+++ b/doc/source/reference/api/api.rst
@@ -694,6 +694,7 @@
 
    ~yt.convenience.load
    ~yt.data_objects.static_output.Dataset.all_data
+   ~yt.data_objects.static_output.Dataset.box
    ~yt.funcs.deprecate
    ~yt.funcs.ensure_list
    ~yt.funcs.get_pbar
@@ -714,6 +715,8 @@
    ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
    ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
    ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
+   ~yt.data_objects.data_containers.YTDataContainer.get_field_parameter
+   ~yt.data_objects.data_containers.YTDataContainer.set_field_parameter
 
 Math Utilities
 --------------


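To make the ('field type', 'field name') convention described in the docs change above concrete, here is a short illustrative sketch (the dataset path is an assumption, and the frontend-specific example assumes an Enzo dataset):

    import yt

    ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
    ad = ds.all_data()

    # Explicit, two-part access: ('field type', 'field name').
    rho = ad[("gas", "density")]        # aliased field, returned in CGS (g/cm**3)
    rho_raw = ad[("enzo", "Density")]   # frontend-specific field, natural (code) units

    # Abbreviated access: a bare field name.  Per the docs above, yt infers
    # the field type ('gas' for mesh fields, 'all' for particle fields when
    # no type has been accessed previously).
    rho_short = ad["density"]
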
https://bitbucket.org/yt_analysis/yt/commits/42f3d20b5b66/
Changeset:   42f3d20b5b66
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 02:16:24
Summary:     Adding to field_list docs (and helper script) to fix formatting, add a TOC, and make sure the units are correct for universal fields.
Affected #:  2 files

diff -r 65216c3eb198e0f01e7f656c6acd2fc9295c2396 -r 42f3d20b5b66c55a1d6a0d9b804b736e2eab209c doc/helper_scripts/show_fields.py
--- a/doc/helper_scripts/show_fields.py
+++ b/doc/helper_scripts/show_fields.py
@@ -67,17 +67,20 @@
 Field List
 ==========
 
-This is a list of many of the fields available in ``yt``.  We have attempted to
-include most of the fields that are accessible through the plugin system, as well as
-the fields that are known by the frontends, however it is possible to generate many more
-permutations, particularly through vector operations. For more information about the fields
-framework, see :ref:`fields`.
+This is a list of many of the fields available in yt.  We have attempted to
+include most of the fields that are accessible through the plugin system, as 
+well as the fields that are known by the frontends, however it is possible to 
+generate many more permutations, particularly through vector operations. For 
+more information about the fields framework, see :ref:`fields`.
 
-Some fields are recognized by specific frontends only. These are typically fields like density
-and temperature that have their own names and units in the different frontend datasets. Often,
-these fields are aliased to their ``yt``-named counterpart fields. For example, in the ``FLASH``
-frontend, the ``dens`` field is aliased to the ``yt`` field ``density``, ``velx`` is aliased to
-``velocity_x``, and so on. In what follows, if a field is aliased it will be noted.
+Some fields are recognized by specific frontends only. These are typically 
+fields like density and temperature that have their own names and units in 
+the different frontend datasets. Often, these fields are aliased to their 
+yt-named counterpart fields (typically 'gas' fieldtypes). For example, in 
+the ``FLASH`` frontend, the ``dens`` field (i.e. ``(flash, dens)``) is aliased 
+to the gas field density (i.e. ``(gas, density)``), similarly ``(flash, velx)`` 
+is aliased to ``(gas, velocity_x)``, and so on. In what follows, if a field 
+is aliased it will be noted.
 
 Try using the ``ds.field_list`` and ``ds.derived_field_list`` to view the
 native and derived fields available for your dataset respectively. For example
@@ -93,19 +96,27 @@
 To figure out out what all of the field types here mean, see
 :ref:`known-field-types`.
 
+.. rubric:: Table of Contents
+
+.. contents::
+   :depth: 2
+   :local:
+   :backlinks: none
+
 .. _yt_fields:
 
-Fields Generated by ``yt``
-++++++++++++++++++++++++++
-
+Universal Fields
+----------------
 """
 
 print header
 
 seen = []
 
-def fix_units(units):
+def fix_units(units, in_cgs=False):
     unit_object = Unit(units, registry=ds.unit_registry)
+    if in_cgs:
+        unit_object = unit_object.get_cgs_equivalent()
     latex = unit_object.latex_representation()
     return latex.replace('\/','~')
 
@@ -115,10 +126,17 @@
         f = df._function
         s = "%s" % (df.name,)
         print s
-        print "-" * len(s)
+        print "^" * len(s)
         print
         if len(df.units) > 0:
-            print "   * Units: :math:`%s`" % fix_units(df.units)
+            # Most universal fields are in CGS except for these special fields
+            if df.name[1] in ['particle_position', 'particle_position_x', \
+                         'particle_position_y', 'particle_position_z', \
+                         'entropy', 'kT', 'metallicity', 'dx', 'dy', 'dz',\
+                         'cell_volume', 'x', 'y', 'z']:
+                print "   * Units: :math:`%s`" % fix_units(df.units)
+            else:
+                print "   * Units: :math:`%s`" % fix_units(df.units, in_cgs=True)
         print "   * Particle Type: %s" % (df.particle_type)
         print
         print "**Field Source**"
@@ -145,7 +163,7 @@
         ftype = "'"+ftype+"'"
     s = "(%s, '%s')" % (ftype, name)
     print s
-    print "-" * len(s)
+    print "^" * len(s)
     print
     if len(units) > 0:
         print "   * Units: :math:`\mathrm{%s}`" % fix_units(units)
@@ -182,7 +200,7 @@
             print ".. _%s_specific_fields:\n" % dset_name.replace("Dataset", "")
             h = "%s-Specific Fields" % dset_name.replace("Dataset", "")
             print h
-            print "+" * len(h) + "\n"
+            print "-" * len(h) + "\n"
             for field in known_other_fields:
                 print_frontend_field(frontend, field, False)
             for field in known_particle_fields:

This diff is so big that we needed to truncate the remainder.

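The in_cgs branch added to fix_units above leans on the unit object's CGS conversion; a minimal standalone sketch of that behavior (the import path is taken from yt 3.0 and is an assumption here, not part of the changeset):

    from yt.units.unit_object import Unit

    u = Unit("Msun/kpc**3")
    u_cgs = u.get_cgs_equivalent()       # the CGS equivalent, e.g. g/cm**3
    print(u_cgs.latex_representation())  # the LaTeX form written into the field list
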
https://bitbucket.org/yt_analysis/yt/commits/8e77a775cede/
Changeset:   8e77a775cede
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 02:16:42
Summary:     Fixing a linking problem with two analysis modules.
Affected #:  2 files

diff -r 42f3d20b5b66c55a1d6a0d9b804b736e2eab209c -r 8e77a775cede60fc242be6ed2ada61e19661c547 doc/source/analyzing/analysis_modules/fitting_procedure.rst
--- a/doc/source/analyzing/analysis_modules/fitting_procedure.rst
+++ /dev/null
@@ -1,138 +0,0 @@
-.. _fitting_procedure:
-
-Procedure for Generating Fits
-=============================
-.. sectionauthor:: Hilary Egan <hilary.egan at colorado.edu>
-
-To generate a fit for a spectrum :py:func:`generate_total_fit()` is called.
-This function controls the identification of line complexes, the fit
-of a series of absorption lines for each appropriate species, checks of
-those fits, and returns the results of the fits.
-
-
-Finding Line Complexes
-----------------------
-Line complexes are found using the :py:func:`find_complexes` function. The
-process by which line complexes are found involves walking through
-the array of flux in order from minimum to maximum wavelength, and finding
-series of spatially contiguous cells whose flux is less than some limit.
-These regions are then checked in terms of an additional flux limit and size.
-The bounds of all the passing regions are then listed and returned. Those
-bounds that cover an exceptionally large region of wavelength space will be
-broken up if a suitable cut point is found. This method is only appropriate
-for noiseless spectra.
-
-The optional parameter **complexLim** (default = 0.999), controls the limit
-that triggers the identification of a spatially contiguous region of flux
-that could be a line complex. This number should be very close to 1 but not
-exactly equal. It should also be at least an order of magnitude closer to 1
-than the later discussed **fitLim** parameter, because a line complex where
-the flux of the trough is very close to the flux of the edge can be incredibly
-unstable when optimizing.
-
-The **fitLim** parameter controls what is the maximum flux that the trough
-of the region can have and still be considered a line complex. This 
-effectively controls the sensitivity to very low column absorbers. Default
-value is **fitLim** = 0.99. If a region is identified where the flux of the trough
-is greater than this value, the region is simply ignored.
-
-The **minLength** parameter controls the minimum number of array elements 
-that an identified region must have. This value must be greater than or
-equal to 3 as there are a minimum of 3 free parameters that must be fit.
-Default is **minLength** = 3.
-
-The **maxLength** parameter controls the maximum number of array elements
-that an identified region can have before it is split into separate regions.
-Default is **maxLength** = 1000. This should be adjusted based on the 
-resolution of the spectrum to remain appropriate. The value correspond
-to a wavelength of roughly 50 angstroms. 
-
-The **splitLim** parameter controls how exceptionally large regions are split.
-When such a region is identified by having more array elements than
-**maxLength**, the point of maximum flux (or minimum absorption) in the 
-middle two quartiles is identified. If that point has a flux greater than
-or equal to **splitLim**, then two separate complexes are created: one from
-the lower wavelength edge to the minimum absorption point and the other from
-the minimum absorption point to the higher wavelength edge. The default
-value is **splitLim** =.99, but it should not drastically affect results, so
-long as the value is reasonably close to 1.
-
-
-Fitting a Line Complex
-----------------------
-
-After a complex is identified, it is fitted by iteratively adding and 
-optimizing a set of Voigt Profiles for a particular species until the
-region is considered successfully fit. The optimizing is accomplished
-using scipy's least squares optimizer. This requires an initial estimate
-of the parameters to be fit (column density, b-value, redshift) for each
-line.
-
-Each time a line is added, the guess of the parameters is based on
-the difference between the line complex and the fit so far. For the first line
-this just means the initial guess is based solely on the flux of the line
-complex. The column density is given by the initial column density given
-in the species parameters dictionary. If the line is saturated (some portion
-of the flux with a value less than .1) than the larger initial column density
-guess is chosen. If the flux is relatively high (all values >.9) than the
-smaller initial guess is given. These values are chosen to make optimization
-faster and more stable by being closer to the actual value, but the final
-results of fitting should not depend on them as they merely provide a
-starting point. 
-
-After the parameters for a line are optimized for the first time, the 
-optimized parameters are then used for the initial guess on subsequent 
-iterations with more lines. 
-
-The complex is considered successfully fit when the sum of the squares of 
-the difference between the flux generated from the fit and the desired flux
-profile is less than **errBound**. **errBound** is related to the optional
-parameter to :py:func:`generate_total_fit()`, **maxAvgError** by the number
-of array elements in the region such that **errBound** = number of elements *
-**maxAvgError**.
-
-There are several other conditions under which the cycle of adding and 
-optimizing lines will halt. If the error of the optimized fit from adding
-a line is an order of magnitude worse than the error of the fit without
-that line, then it is assumed that the fitting has become unstable and 
-the latest line is removed. Lines are also prevented from being added if
-the total number of lines is greater than the number of elements in the flux
-array being fit divided by 3. This is because there must not be more free
-parameters in a fit than the number of points to constrain them. 
-
-
-Checking Fit Results
---------------------
-
-After an acceptable fit for a region is determined, there are several steps
-the algorithm must go through to validate the fits. 
-
-First, the parameters must be in a reasonable range. This is a check to make 
-sure that the optimization did not become unstable and generate a fit that
-diverges wildly outside the region where the fit was performed. This way, even
-if particular complex cannot be fit, the rest of the spectrum fitting still
-behaves as expected. The range of acceptability for each parameter is given
-in the species parameter dictionary. These are merely broad limits that will
-prevent numerical instability rather than physical limits.
-
-In cases where a single species generates multiple lines (as in the OVI 
-doublet), the fits are then checked for higher wavelength lines. Originally
-the fits are generated only considering the lowest wavelength fit to a region.
-This is because we perform the fitting of complexes in order from the lowest
-wavelength to the highest, so any contribution to a complex being fit must
-come from the lower wavelength as the higher wavelength contributions would
-already have been subtracted out after fitting the lower wavelength. 
-
-Saturated Lyman Alpha Fitting Tools
------------------------------------
-
-In cases where a large or saturated line (there exists a point in the complex
-where the flux is less than .1) fails to be fit properly at first pass, a
-more robust set of fitting tools is used to try and remedy the situation.
-The basic approach is to simply try a much wider range of initial parameter
-guesses in order to find the true optimization minimum, rather than getting
-stuck in a local minimum. A set of hard coded initial parameter guesses
-for Lyman alpha lines is given by the function :py:func:`get_test_lines`. 
-Also included in these parameter guesses is an an initial guess of a high
-column cool line overlapping a lower column warm line, indictive of a 
-broad Lyman alpha (BLA) absorber.

diff -r 42f3d20b5b66c55a1d6a0d9b804b736e2eab209c -r 8e77a775cede60fc242be6ed2ada61e19661c547 doc/source/analyzing/analysis_modules/index.rst
--- a/doc/source/analyzing/analysis_modules/index.rst
+++ b/doc/source/analyzing/analysis_modules/index.rst
@@ -17,4 +17,4 @@
    two_point_functions
    clump_finding
    particle_trajectories
-   ellipsoidal_analysis
+   ellipsoid_analysis


https://bitbucket.org/yt_analysis/yt/commits/ad15c4350876/
Changeset:   ad15c4350876
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 02:22:33
Summary:     Merging.
Affected #:  4 files

diff -r 8e77a775cede60fc242be6ed2ada61e19661c547 -r ad15c4350876d8b60588b21495e312136d2ab30f doc/source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -1,7 +1,7 @@
 {
  "metadata": {
   "name": "",
-  "signature": "sha256:e4db171b795d155870280ddbe8986f55f9a94ffb10783abf9d4cc2de3ec24894"
+  "signature": "sha256:2cc168b2c1737c67647aa29892c0213e7a58233fa53c809f9cd975a4306e9bc8"
  },
  "nbformat": 3,
  "nbformat_minor": 0,
@@ -89,6 +89,7 @@
      "cell_type": "code",
      "collapsed": false,
      "input": [
+      "%matplotlib inline\n",
       "import yt\n",
       "from yt.analysis_modules.sunyaev_zeldovich.api import SZProjection\n",
       "\n",
@@ -222,4 +223,4 @@
    "metadata": {}
   }
  ]
-}
+}
\ No newline at end of file

diff -r 8e77a775cede60fc242be6ed2ada61e19661c547 -r ad15c4350876d8b60588b21495e312136d2ab30f doc/source/visualizing/volume_rendering.rst
--- a/doc/source/visualizing/volume_rendering.rst
+++ b/doc/source/visualizing/volume_rendering.rst
@@ -60,7 +60,7 @@
    # Set up the camera parameters: center, looking direction, width, resolution
    c = (ds.domain_right_edge + ds.domain_left_edge)/2.0
    L = np.array([1.0, 1.0, 1.0])
-   W = ds.quan(0.3, 'unitary)
+   W = ds.quan(0.3, 'unitary')
    N = 256 
 
    # Create a camera object

diff -r 8e77a775cede60fc242be6ed2ada61e19661c547 -r ad15c4350876d8b60588b21495e312136d2ab30f yt/frontends/art/data_structures.py
--- a/yt/frontends/art/data_structures.py
+++ b/yt/frontends/art/data_structures.py
@@ -39,6 +39,8 @@
     io_registry
 from yt.utilities.lib.misc_utilities import \
     get_box_grids_level
+from yt.data_objects.particle_unions import \
+    ParticleUnion
 
 from yt.frontends.art.definitions import *
 import yt.utilities.fortran_utils as fpu
@@ -104,17 +106,7 @@
         self.particle_field_list = [f for f in particle_fields]
         self.field_list = [("art", f) for f in fluid_fields]
         # now generate all of the possible particle fields
-        if "wspecies" in self.dataset.parameters.keys():
-            wspecies = self.dataset.parameters['wspecies']
-            nspecies = len(wspecies)
-            self.dataset.particle_types = ["darkmatter", "stars"]
-            for specie in range(nspecies):
-                self.dataset.particle_types.append("specie%i" % specie)
-            self.dataset.particle_types_raw = tuple(
-                self.dataset.particle_types)
-        else:
-            self.dataset.particle_types = []
-        for ptype in self.dataset.particle_types:
+        for ptype in self.dataset.particle_types_raw:
             for pfield in self.particle_field_list:
                 pfn = (ptype, pfield)
                 self.field_list.append(pfn)
@@ -313,6 +305,8 @@
             self.root_level = root_level
             mylog.info("Using root level of %02i", self.root_level)
         # read the particle header
+        self.particle_types = []
+        self.particle_types_raw = ()
         if not self.skip_particles and self._file_particle_header:
             with open(self._file_particle_header, "rb") as fh:
                 particle_header_vals = fpu.read_attrs(
@@ -323,6 +317,10 @@
                 lspecies = np.fromfile(fh, dtype='>i', count=10)
             self.parameters['wspecies'] = wspecies[:n]
             self.parameters['lspecies'] = lspecies[:n]
+            for specie in range(n):
+                self.particle_types.append("specie%i" % specie)
+            self.particle_types_raw = tuple(
+                self.particle_types)
             ls_nonzero = np.diff(lspecies)[:n-1]
             ls_nonzero = np.append(lspecies[0], ls_nonzero)
             self.star_type = len(ls_nonzero)
@@ -360,6 +358,16 @@
         self.gamma = self.parameters["gamma"]
         mylog.info("Max level is %02i", self.max_level)
 
+    def create_field_info(self):
+        super(ARTDataset, self).create_field_info()
+        if "wspecies" in self.parameters:
+            # We create dark_matter and stars unions.
+            ptr = self.particle_types_raw
+            pu = ParticleUnion("darkmatter", list(ptr[:-1]))
+            self.add_particle_union(pu)
+            pu = ParticleUnion("stars", list(ptr[-1:]))
+            self.add_particle_union(pu)
+
     @classmethod
     def _is_valid(self, *args, **kwargs):
         """

diff -r 8e77a775cede60fc242be6ed2ada61e19661c547 -r ad15c4350876d8b60588b21495e312136d2ab30f yt/frontends/art/io.py
--- a/yt/frontends/art/io.py
+++ b/yt/frontends/art/io.py
@@ -39,6 +39,11 @@
         self.cache = {}
         self.masks = {}
         super(IOHandlerART, self).__init__(*args, **kwargs)
+        self.ws = self.ds.parameters["wspecies"]
+        self.ls = self.ds.parameters["lspecies"]
+        self.file_particle = self.ds._file_particle_data
+        self.file_stars = self.ds._file_particle_stars
+        self.Nrow = self.ds.parameters["Nrow"]
 
     def _read_fluid_selection(self, chunks, selector, fields, size):
         # Chunks in this case will have affiliated domain subset objects
@@ -70,8 +75,6 @@
         if key in self.masks.keys() and self.caching:
             return self.masks[key]
         ds = self.ds
-        ptmax = self.ws[-1]
-        pbool, idxa, idxb = _determine_field_size(ds, ftype, self.ls, ptmax)
         pstr = 'particle_position_%s'
         x,y,z = [self._get_field((ftype, pstr % ax)) for ax in 'xyz']
         mask = selector.select_points(x, y, z, 0.0)
@@ -81,6 +84,26 @@
         else:
             return mask
 
+    def _read_particle_coords(self, chunks, ptf):
+        for chunk in chunks:
+            for ptype, field_list in sorted(ptf.items()):
+                x = self._get_field((ptype, "particle_position_x"))
+                y = self._get_field((ptype, "particle_position_y"))
+                z = self._get_field((ptype, "particle_position_z"))
+                yield ptype, (x, y, z)
+
+    def _read_particle_fields(self, chunks, ptf, selector):
+        for chunk in chunks:
+            for ptype, field_list in sorted(ptf.items()):
+                x = self._get_field((ptype, "particle_position_x"))
+                y = self._get_field((ptype, "particle_position_y"))
+                z = self._get_field((ptype, "particle_position_z"))
+                mask = selector.select_points(x, y, z, 0.0)
+                if mask is None: continue
+                for field in field_list:
+                    data = self._get_field((ptype, field))
+                    yield (ptype, field), data[mask]
+
     def _get_field(self,  field):
         if field in self.cache.keys() and self.caching:
             mylog.debug("Cached %s", str(field))
@@ -139,6 +162,13 @@
             temp[-nstars:] = data
             tr[field] = temp
             del data
+        # We check again, after it's been filled
+        if fname == "particle_mass":
+            # We now divide by NGrid in order to make this match up.  Note that
+            # this means that even when requested in *code units*, we are
+            # giving them as modified by the ng value.  This only works for
+            # dark_matter -- stars are regular matter.
+            tr[field] /= self.ds.domain_dimensions.prod()
         if tr == {}:
             tr = dict((f, np.array([])) for f in fields)
         if self.caching:
@@ -147,35 +177,15 @@
         else:
             return tr[field]
 
-    def _read_particle_selection(self, chunks, selector, fields):
-        chunk = chunks.next()
-        self.ds = chunk.objs[0].domain.ds
-        self.ws = self.ds.parameters["wspecies"]
-        self.ls = self.ds.parameters["lspecies"]
-        self.file_particle = self.ds._file_particle_data
-        self.file_stars = self.ds._file_particle_stars
-        self.Nrow = self.ds.parameters["Nrow"]
-        data = {f:np.array([]) for f in fields}
-        for f in fields:
-            ftype, fname = f
-            mask = self._get_mask(selector, ftype)
-            arr = self._get_field(f)[mask].astype('f8')
-            data[f] = np.concatenate((arr, data[f]))
-        return data
-
-def _determine_field_size(ds, field, lspecies, ptmax):
+def _determine_field_size(pf, field, lspecies, ptmax):
     pbool = np.zeros(len(lspecies), dtype="bool")
     idxas = np.concatenate(([0, ], lspecies[:-1]))
     idxbs = lspecies
     if "specie" in field:
         index = int(field.replace("specie", ""))
         pbool[index] = True
-    elif field == "stars":
-        pbool[-1] = True
-    elif field == "darkmatter":
-        pbool[0:-1] = True
     else:
-        pbool[:] = True
+        raise RuntimeError
     idxa, idxb = idxas[pbool][0], idxbs[pbool][-1]
     return pbool, idxa, idxb
 


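For reference, the new create_field_info above builds the "darkmatter" and "stars" unions with ParticleUnion; a minimal sketch of that pattern, with hypothetical species names standing in for particle_types_raw:

    from yt.data_objects.particle_unions import ParticleUnion

    # Hypothetical stand-in for ds.particle_types_raw on an ART dataset.
    ptr = ("specie0", "specie1", "specie2", "specie3")

    # All but the last species are dark matter; the last species is stars,
    # mirroring create_field_info() in the changeset above.
    dm = ParticleUnion("darkmatter", list(ptr[:-1]))
    stars = ParticleUnion("stars", list(ptr[-1:]))

    # A loaded ART dataset (ds) would then register them:
    # ds.add_particle_union(dm)
    # ds.add_particle_union(stars)
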
https://bitbucket.org/yt_analysis/yt/commits/15b7375c1ee1/
Changeset:   15b7375c1ee1
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 02:54:58
Summary:     Adding notes to indicate that the "creating derived quantities" and "creating new frontends" sections are out of date.
Affected #:  2 files

diff -r ad15c4350876d8b60588b21495e312136d2ab30f -r 15b7375c1ee17b78e2eeea0cd39dab224351e1a0 doc/source/developing/creating_derived_quantities.rst
--- a/doc/source/developing/creating_derived_quantities.rst
+++ b/doc/source/developing/creating_derived_quantities.rst
@@ -3,6 +3,10 @@
 Creating Derived Quantities
 ---------------------------
 
+.. warning:: This section is not yet updated to work with yt 3.0.  If you
+             have a question about making a custom derived quantity, please
+             contact the mailing list.
+
 The basic idea is that you need to be able to operate both on a set of data,
 and a set of sets of data.  (If this is not possible, the quantity needs to be
 added with the ``force_unlazy`` option.)

diff -r ad15c4350876d8b60588b21495e312136d2ab30f -r 15b7375c1ee17b78e2eeea0cd39dab224351e1a0 doc/source/developing/creating_frontend.rst
--- a/doc/source/developing/creating_frontend.rst
+++ b/doc/source/developing/creating_frontend.rst
@@ -3,12 +3,9 @@
 Creating A New Code Frontend
 ============================
 
-.. note::
-
-   The material in this section will be significantly revised with the release
-   of yt 3.0.  If you would like to write a new frontend, consider sending a
-   message to the mailing list so you can find out the latest about frontends in
-   yt 3.0.
+.. warning: This section is not yet updated to work with yt 3.0.  If you
+            have a question about making a custom derived quantity, please
+            contact the mailing list.
 
 yt is designed to support analysis and visualization of data from multiple
 different simulation codes, although it has so far been most successfully


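For readers hitting the warning above: while the "creating derived quantities" docs are out of date, using the built-in derived quantities in yt 3.0 looks roughly like the following (a hedged sketch, not part of this changeset; the dataset path is an assumption):

    import yt

    ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")
    ad = ds.all_data()

    # Built-in derived quantities hang off the .quantities attribute of a
    # data container.
    rho_min, rho_max = ad.quantities.extrema("density")
    total_mass = ad.quantities.total_quantity("cell_mass")
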
https://bitbucket.org/yt_analysis/yt/commits/6dd68366bb97/
Changeset:   6dd68366bb97
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 16:45:20
Summary:     Merging.
Affected #:  1 file

diff -r 15b7375c1ee17b78e2eeea0cd39dab224351e1a0 -r 6dd68366bb97da7ede9e79958a6269eee281c63a doc/source/cookbook/particle_filter.py
--- a/doc/source/cookbook/particle_filter.py
+++ b/doc/source/cookbook/particle_filter.py
@@ -2,24 +2,33 @@
 import numpy as np
 from yt.data_objects.particle_filters import add_particle_filter
 
+
 # Define filter functions for our particle filters based on stellar age.
+# In this dataset particles in the initial conditions are given creation
+# times arbitrarily far into the future, so stars with negative ages belong
+# in the old stars filter.
 def stars_10Myr(pfilter, data):
-    filter = (data.pf.current_time - data["Stars", "creation_time"]).in_units('Myr') <= 10
+    age = data.ds.current_time - data["Stars", "creation_time"]
+    filter = np.logical_and(age >= 0, age.in_units('Myr') < 10)
     return filter
 
 def stars_100Myr(pfilter, data):
-    filter = (((data.pf.current_time - data["Stars", "creation_time"]).in_units('Myr') <= 100) & \
-              ((data.pf.current_time - data["Stars", "creation_time"]).in_units('Myr') >= 10))
+    age = (data.ds.current_time - data["Stars", "creation_time"]).in_units('Myr')
+    filter = np.logical_and(age >= 10, age < 100)
     return filter
 
 def stars_old(pfilter, data):
-    filter = (data.pf.current_time - data["Stars", "creation_time"]).in_units('Myr') > 100
+    age = data.ds.current_time - data["Stars", "creation_time"]
+    filter = np.logical_or(age < 0, age.in_units('Myr') >= 100)
     return filter
 
 # Create the particle filters
-add_particle_filter("stars_young", function=stars_10Myr, filtered_type='Stars', requires=["creation_time"])
-add_particle_filter("stars_medium", function=stars_100Myr, filtered_type='Stars', requires=["creation_time"])
-add_particle_filter("stars_old", function=stars_old, filtered_type='Stars', requires=["creation_time"])
+add_particle_filter("stars_young", function=stars_10Myr, filtered_type='Stars',
+                    requires=["creation_time"])
+add_particle_filter("stars_medium", function=stars_100Myr, filtered_type='Stars',
+                    requires=["creation_time"])
+add_particle_filter("stars_old", function=stars_old, filtered_type='Stars',
+                    requires=["creation_time"])
 
 # Load a dataset and apply the particle filters
 filename = "TipsyGalaxy/galaxy.00300"
@@ -39,5 +48,8 @@
 print "Mass of old stars = %g Msun" % mass_old
 
 # Generate 4 projections: gas density, young stars, medium stars, old stars
-prj = yt.ProjectionPlot(ds, 'z', [('gas', 'density'), ('deposit', 'stars_young_cic'), ('deposit', 'stars_medium_cic'), ('deposit', 'stars_old_cic')], center="max", width=(100, 'kpc'))
+fields = [('gas', 'density'), ('deposit', 'stars_young_cic'),
+          ('deposit', 'stars_medium_cic'), ('deposit', 'stars_old_cic')]
+
+prj = yt.ProjectionPlot(ds, 'z', fields, center="max", width=(100, 'kpc'))
 prj.save()


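After the filters in the recipe above are registered with add_particle_filter, they still have to be activated on the dataset before the filtered and deposited fields exist; a brief sketch of that step (filter and field names follow the recipe, but this exact usage is an assumption rather than part of the changeset):

    import yt

    ds = yt.load("TipsyGalaxy/galaxy.00300")
    for f in ["stars_young", "stars_medium", "stars_old"]:
        ds.add_particle_filter(f)

    ad = ds.all_data()
    mass_young = ad["stars_young", "particle_mass"].in_units("Msun").sum()
    print("Mass of young stars = %g Msun" % mass_young)
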
https://bitbucket.org/yt_analysis/yt/commits/712a40da9fbd/
Changeset:   712a40da9fbd
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 17:05:19
Summary:     Making css modification that Nathan suggested to make TOCs work correctly in docs.
Affected #:  1 file

diff -r 6dd68366bb97da7ede9e79958a6269eee281c63a -r 712a40da9fbd6c69a9e1bc654a74a7e6b296ec30 doc/source/_static/custom.css
--- a/doc/source/_static/custom.css
+++ b/doc/source/_static/custom.css
@@ -85,7 +85,7 @@
 
 */
 
-*[id]:before { 
+*[id]:before :not(p) {
   display: block; 
   content: " "; 
   margin-top: -45px; 


https://bitbucket.org/yt_analysis/yt/commits/6b7bb8e30655/
Changeset:   6b7bb8e30655
Branch:      yt-3.0
User:        chummels
Date:        2014-08-03 17:05:48
Summary:     Modifying show_fields.py script to generate local TOCs for each set of fields.
Affected #:  2 files

diff -r 712a40da9fbd6c69a9e1bc654a74a7e6b296ec30 -r 6b7bb8e3065573d66935663112353db1222c15c0 doc/helper_scripts/show_fields.py
--- a/doc/helper_scripts/show_fields.py
+++ b/doc/helper_scripts/show_fields.py
@@ -96,17 +96,20 @@
 To figure out out what all of the field types here mean, see
 :ref:`known-field-types`.
 
-.. rubric:: Table of Contents
-
-.. contents::
-   :depth: 2
+.. contents:: Table of Contents
+   :depth: 1
    :local:
    :backlinks: none
 
-.. _yt_fields:
+.. _yt-fields:
 
 Universal Fields
 ----------------
+
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 """
 
 print header
@@ -201,6 +204,10 @@
             h = "%s-Specific Fields" % dset_name.replace("Dataset", "")
             print h
             print "-" * len(h) + "\n"
+            print ".. contents:: "
+            print "   :depth: 1"
+            print "   :local:"
+            print "   :backlinks: none"
             for field in known_other_fields:
                 print_frontend_field(frontend, field, False)
             for field in known_particle_fields:

diff -r 712a40da9fbd6c69a9e1bc654a74a7e6b296ec30 -r 6b7bb8e3065573d66935663112353db1222c15c0 doc/source/reference/field_list.rst
--- a/doc/source/reference/field_list.rst
+++ b/doc/source/reference/field_list.rst
@@ -33,18 +33,21 @@
 To figure out out what all of the field types here mean, see
 :ref:`known-field-types`.
 
-.. rubric:: Table of Contents
-
-.. contents::
-   :depth: 2
+.. contents:: Table of Contents
+   :depth: 1
    :local:
    :backlinks: none
 
-.. _yt_fields:
+.. _yt-fields:
 
 Universal Fields
 ----------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
+
 ('all', 'mesh_id')
 ^^^^^^^^^^^^^^^^^^
 
@@ -3377,6 +3380,10 @@
 ART-Specific Fields
 -------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 ('art', 'Density')
 ^^^^^^^^^^^^^^^^^^
 
@@ -3537,6 +3544,10 @@
 ARTIO-Specific Fields
 ---------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 ('artio', 'HVAR_GAS_DENSITY')
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -3709,6 +3720,10 @@
 Athena-Specific Fields
 ----------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 ('athena', 'density')
 ^^^^^^^^^^^^^^^^^^^^^
 
@@ -3742,6 +3757,10 @@
 Boxlib-Specific Fields
 ----------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 ('boxlib', 'density')
 ^^^^^^^^^^^^^^^^^^^^^
 
@@ -3910,6 +3929,10 @@
 Enzo-Specific Fields
 --------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 ('enzo', 'Cooling_Time')
 ^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -4183,6 +4206,10 @@
 FLASH-Specific Fields
 ---------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 ('flash', 'velx')
 ^^^^^^^^^^^^^^^^^
 
@@ -4477,6 +4504,10 @@
 GDF-Specific Fields
 -------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 ('gdf', 'density')
 ^^^^^^^^^^^^^^^^^^
 
@@ -4552,6 +4583,10 @@
 HaloCatalog-Specific Fields
 ---------------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 (particle_type, 'particle_identifier')
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -4610,6 +4645,10 @@
 OWLSSubfind-Specific Fields
 ---------------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 (particle_type, 'CenterOfMass_0')
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -4774,6 +4813,10 @@
 Rockstar-Specific Fields
 ------------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 (particle_type, 'particle_identifier')
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -5089,6 +5132,10 @@
 RAMSES-Specific Fields
 ----------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 ('ramses', 'Density')
 ^^^^^^^^^^^^^^^^^^^^^
 
@@ -5199,6 +5246,10 @@
 Gadget-Specific Fields
 ----------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 (particle_type, 'Mass')
 ^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -5377,6 +5428,10 @@
 GadgetHDF5-Specific Fields
 --------------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 (particle_type, 'Mass')
 ^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -5555,6 +5610,10 @@
 OWLS-Specific Fields
 --------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 (particle_type, 'Mass')
 ^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -5733,6 +5792,10 @@
 Tipsy-Specific Fields
 ---------------------
 
+.. contents:: 
+   :depth: 1
+   :local:
+   :backlinks: none
 (particle_type, 'Mass')
 ^^^^^^^^^^^^^^^^^^^^^^^
 


https://bitbucket.org/yt_analysis/yt/commits/3ed2103820c4/
Changeset:   3ed2103820c4
Branch:      yt-3.0
User:        chummels
Date:        2014-08-04 14:08:21
Summary:     Moving long table of contents of individual fields to the end of the field_docs list as an "index of fields"
Affected #:  2 files

diff -r 6b7bb8e3065573d66935663112353db1222c15c0 -r 3ed2103820c48e4fd90fe021e19ef572184a1e59 doc/helper_scripts/show_fields.py
--- a/doc/helper_scripts/show_fields.py
+++ b/doc/helper_scripts/show_fields.py
@@ -105,13 +105,18 @@
 
 Universal Fields
 ----------------
+"""
+
+footer = """
+
+Index of Fields
+---------------
 
 .. contents:: 
-   :depth: 1
-   :local:
+   :depth: 3
    :backlinks: none
+
 """
-
 print header
 
 seen = []
@@ -204,10 +209,6 @@
             h = "%s-Specific Fields" % dset_name.replace("Dataset", "")
             print h
             print "-" * len(h) + "\n"
-            print ".. contents:: "
-            print "   :depth: 1"
-            print "   :local:"
-            print "   :backlinks: none"
             for field in known_other_fields:
                 print_frontend_field(frontend, field, False)
             for field in known_particle_fields:
@@ -215,3 +216,5 @@
                     print_frontend_field("particle_type", field, True)
                 else:
                     print_frontend_field("io", field, True)
+
+print footer

diff -r 6b7bb8e3065573d66935663112353db1222c15c0 -r 3ed2103820c48e4fd90fe021e19ef572184a1e59 doc/source/reference/field_list.rst
--- a/doc/source/reference/field_list.rst
+++ b/doc/source/reference/field_list.rst
@@ -43,11 +43,6 @@
 Universal Fields
 ----------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
-
 ('all', 'mesh_id')
 ^^^^^^^^^^^^^^^^^^
 
@@ -3380,10 +3375,6 @@
 ART-Specific Fields
 -------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 ('art', 'Density')
 ^^^^^^^^^^^^^^^^^^
 
@@ -3544,10 +3535,6 @@
 ARTIO-Specific Fields
 ---------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 ('artio', 'HVAR_GAS_DENSITY')
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -3720,10 +3707,6 @@
 Athena-Specific Fields
 ----------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 ('athena', 'density')
 ^^^^^^^^^^^^^^^^^^^^^
 
@@ -3757,10 +3740,6 @@
 Boxlib-Specific Fields
 ----------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 ('boxlib', 'density')
 ^^^^^^^^^^^^^^^^^^^^^
 
@@ -3929,10 +3908,6 @@
 Enzo-Specific Fields
 --------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 ('enzo', 'Cooling_Time')
 ^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -4206,10 +4181,6 @@
 FLASH-Specific Fields
 ---------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 ('flash', 'velx')
 ^^^^^^^^^^^^^^^^^
 
@@ -4504,10 +4475,6 @@
 GDF-Specific Fields
 -------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 ('gdf', 'density')
 ^^^^^^^^^^^^^^^^^^
 
@@ -4583,10 +4550,6 @@
 HaloCatalog-Specific Fields
 ---------------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 (particle_type, 'particle_identifier')
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -4645,10 +4608,6 @@
 OWLSSubfind-Specific Fields
 ---------------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 (particle_type, 'CenterOfMass_0')
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -4813,10 +4772,6 @@
 Rockstar-Specific Fields
 ------------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 (particle_type, 'particle_identifier')
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -5132,10 +5087,6 @@
 RAMSES-Specific Fields
 ----------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 ('ramses', 'Density')
 ^^^^^^^^^^^^^^^^^^^^^
 
@@ -5246,10 +5197,6 @@
 Gadget-Specific Fields
 ----------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 (particle_type, 'Mass')
 ^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -5428,10 +5375,6 @@
 GadgetHDF5-Specific Fields
 --------------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 (particle_type, 'Mass')
 ^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -5610,10 +5553,6 @@
 OWLS-Specific Fields
 --------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 (particle_type, 'Mass')
 ^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -5792,10 +5731,6 @@
 Tipsy-Specific Fields
 ---------------------
 
-.. contents:: 
-   :depth: 1
-   :local:
-   :backlinks: none
 (particle_type, 'Mass')
 ^^^^^^^^^^^^^^^^^^^^^^^
 
@@ -6095,3 +6030,13 @@
    * Aliased to: ``HeI``
    * Particle Type: True
 
+
+
+Index of Fields
+---------------
+
+.. contents:: 
+   :depth: 3
+   :backlinks: none
+
+


https://bitbucket.org/yt_analysis/yt/commits/1247055ff5db/
Changeset:   1247055ff5db
Branch:      yt-3.0
User:        MatthewTurk
Date:        2014-08-04 14:36:26
Summary:     Merging
Affected #:  12 files

diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/helper_scripts/show_fields.py
--- a/doc/helper_scripts/show_fields.py
+++ b/doc/helper_scripts/show_fields.py
@@ -67,17 +67,20 @@
 Field List
 ==========
 
-This is a list of many of the fields available in ``yt``.  We have attempted to
-include most of the fields that are accessible through the plugin system, as well as
-the fields that are known by the frontends, however it is possible to generate many more
-permutations, particularly through vector operations. For more information about the fields
-framework, see :ref:`fields`.
+This is a list of many of the fields available in yt.  We have attempted to
+include most of the fields that are accessible through the plugin system, as 
+well as the fields that are known by the frontends, however it is possible to 
+generate many more permutations, particularly through vector operations. For 
+more information about the fields framework, see :ref:`fields`.
 
-Some fields are recognized by specific frontends only. These are typically fields like density
-and temperature that have their own names and units in the different frontend datasets. Often,
-these fields are aliased to their ``yt``-named counterpart fields. For example, in the ``FLASH``
-frontend, the ``dens`` field is aliased to the ``yt`` field ``density``, ``velx`` is aliased to
-``velocity_x``, and so on. In what follows, if a field is aliased it will be noted.
+Some fields are recognized by specific frontends only. These are typically 
+fields like density and temperature that have their own names and units in 
+the different frontend datasets. Often, these fields are aliased to their 
+yt-named counterpart fields (typically 'gas' fieldtypes). For example, in 
+the ``FLASH`` frontend, the ``dens`` field (i.e. ``(flash, dens)``) is aliased 
+to the gas field density (i.e. ``(gas, density)``), similarly ``(flash, velx)`` 
+is aliased to ``(gas, velocity_x)``, and so on. In what follows, if a field 
+is aliased it will be noted.
 
 Try using the ``ds.field_list`` and ``ds.derived_field_list`` to view the
 native and derived fields available for your dataset respectively. For example
@@ -93,19 +96,35 @@
 To figure out out what all of the field types here mean, see
 :ref:`known-field-types`.
 
-.. _yt_fields:
+.. contents:: Table of Contents
+   :depth: 1
+   :local:
+   :backlinks: none
 
-Fields Generated by ``yt``
-++++++++++++++++++++++++++
+.. _yt-fields:
+
+Universal Fields
+----------------
+"""
+
+footer = """
+
+Index of Fields
+---------------
+
+.. contents:: 
+   :depth: 3
+   :backlinks: none
 
 """
-
 print header
 
 seen = []
 
-def fix_units(units):
+def fix_units(units, in_cgs=False):
     unit_object = Unit(units, registry=ds.unit_registry)
+    if in_cgs:
+        unit_object = unit_object.get_cgs_equivalent()
     latex = unit_object.latex_representation()
     return latex.replace('\/','~')
 
@@ -115,10 +134,17 @@
         f = df._function
         s = "%s" % (df.name,)
         print s
-        print "-" * len(s)
+        print "^" * len(s)
         print
         if len(df.units) > 0:
-            print "   * Units: :math:`%s`" % fix_units(df.units)
+            # Most universal fields are in CGS except for these special fields
+            if df.name[1] in ['particle_position', 'particle_position_x', \
+                         'particle_position_y', 'particle_position_z', \
+                         'entropy', 'kT', 'metallicity', 'dx', 'dy', 'dz',\
+                         'cell_volume', 'x', 'y', 'z']:
+                print "   * Units: :math:`%s`" % fix_units(df.units)
+            else:
+                print "   * Units: :math:`%s`" % fix_units(df.units, in_cgs=True)
         print "   * Particle Type: %s" % (df.particle_type)
         print
         print "**Field Source**"
@@ -145,7 +171,7 @@
         ftype = "'"+ftype+"'"
     s = "(%s, '%s')" % (ftype, name)
     print s
-    print "-" * len(s)
+    print "^" * len(s)
     print
     if len(units) > 0:
         print "   * Units: :math:`\mathrm{%s}`" % fix_units(units)
@@ -182,7 +208,7 @@
             print ".. _%s_specific_fields:\n" % dset_name.replace("Dataset", "")
             h = "%s-Specific Fields" % dset_name.replace("Dataset", "")
             print h
-            print "+" * len(h) + "\n"
+            print "-" * len(h) + "\n"
             for field in known_other_fields:
                 print_frontend_field(frontend, field, False)
             for field in known_particle_fields:
@@ -190,3 +216,5 @@
                     print_frontend_field("particle_type", field, True)
                 else:
                     print_frontend_field("io", field, True)
+
+print footer

diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/source/_static/custom.css
--- a/doc/source/_static/custom.css
+++ b/doc/source/_static/custom.css
@@ -85,7 +85,7 @@
 
 */
 
-*[id]:before { 
+*[id]:before :not(p) {
   display: block; 
   content: " "; 
   margin-top: -45px; 

diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/source/analyzing/analysis_modules/fitting_procedure.rst
--- a/doc/source/analyzing/analysis_modules/fitting_procedure.rst
+++ /dev/null
@@ -1,138 +0,0 @@
-.. _fitting_procedure:
-
-Procedure for Generating Fits
-=============================
-.. sectionauthor:: Hilary Egan <hilary.egan at colorado.edu>
-
-To generate a fit for a spectrum :py:func:`generate_total_fit()` is called.
-This function controls the identification of line complexes, the fit
-of a series of absorption lines for each appropriate species, checks of
-those fits, and returns the results of the fits.
-
-
-Finding Line Complexes
-----------------------
-Line complexes are found using the :py:func:`find_complexes` function. The
-process by which line complexes are found involves walking through
-the array of flux in order from minimum to maximum wavelength, and finding
-series of spatially contiguous cells whose flux is less than some limit.
-These regions are then checked in terms of an additional flux limit and size.
-The bounds of all the passing regions are then listed and returned. Those
-bounds that cover an exceptionally large region of wavelength space will be
-broken up if a suitable cut point is found. This method is only appropriate
-for noiseless spectra.
-
-The optional parameter **complexLim** (default = 0.999), controls the limit
-that triggers the identification of a spatially contiguous region of flux
-that could be a line complex. This number should be very close to 1 but not
-exactly equal. It should also be at least an order of magnitude closer to 1
-than the later discussed **fitLim** parameter, because a line complex where
-the flux of the trough is very close to the flux of the edge can be incredibly
-unstable when optimizing.
-
-The **fitLim** parameter controls what is the maximum flux that the trough
-of the region can have and still be considered a line complex. This 
-effectively controls the sensitivity to very low column absorbers. Default
-value is **fitLim** = 0.99. If a region is identified where the flux of the trough
-is greater than this value, the region is simply ignored.
-
-The **minLength** parameter controls the minimum number of array elements 
-that an identified region must have. This value must be greater than or
-equal to 3 as there are a minimum of 3 free parameters that must be fit.
-Default is **minLength** = 3.
-
-The **maxLength** parameter controls the maximum number of array elements
-that an identified region can have before it is split into separate regions.
-Default is **maxLength** = 1000. This should be adjusted based on the 
-resolution of the spectrum to remain appropriate. The value correspond
-to a wavelength of roughly 50 angstroms. 
-
-The **splitLim** parameter controls how exceptionally large regions are split.
-When such a region is identified by having more array elements than
-**maxLength**, the point of maximum flux (or minimum absorption) in the 
-middle two quartiles is identified. If that point has a flux greater than
-or equal to **splitLim**, then two separate complexes are created: one from
-the lower wavelength edge to the minimum absorption point and the other from
-the minimum absorption point to the higher wavelength edge. The default
-value is **splitLim** =.99, but it should not drastically affect results, so
-long as the value is reasonably close to 1.
-
-
-Fitting a Line Complex
-----------------------
-
-After a complex is identified, it is fitted by iteratively adding and 
-optimizing a set of Voigt Profiles for a particular species until the
-region is considered successfully fit. The optimizing is accomplished
-using scipy's least squares optimizer. This requires an initial estimate
-of the parameters to be fit (column density, b-value, redshift) for each
-line.
-
-Each time a line is added, the guess of the parameters is based on
-the difference between the line complex and the fit so far. For the first line
-this just means the initial guess is based solely on the flux of the line
-complex. The column density is given by the initial column density given
-in the species parameters dictionary. If the line is saturated (some portion
-of the flux with a value less than .1) than the larger initial column density
-guess is chosen. If the flux is relatively high (all values >.9) than the
-smaller initial guess is given. These values are chosen to make optimization
-faster and more stable by being closer to the actual value, but the final
-results of fitting should not depend on them as they merely provide a
-starting point. 
-
-After the parameters for a line are optimized for the first time, the 
-optimized parameters are then used for the initial guess on subsequent 
-iterations with more lines. 
-
-The complex is considered successfully fit when the sum of the squares of 
-the difference between the flux generated from the fit and the desired flux
-profile is less than **errBound**. **errBound** is related to the optional
-parameter to :py:func:`generate_total_fit()`, **maxAvgError** by the number
-of array elements in the region such that **errBound** = number of elements *
-**maxAvgError**.
-
-There are several other conditions under which the cycle of adding and 
-optimizing lines will halt. If the error of the optimized fit from adding
-a line is an order of magnitude worse than the error of the fit without
-that line, then it is assumed that the fitting has become unstable and 
-the latest line is removed. Lines are also prevented from being added if
-the total number of lines is greater than the number of elements in the flux
-array being fit divided by 3. This is because there must not be more free
-parameters in a fit than the number of points to constrain them. 
-
-
-Checking Fit Results
---------------------
-
-After an acceptable fit for a region is determined, there are several steps
-the algorithm must go through to validate the fits. 
-
-First, the parameters must be in a reasonable range. This is a check to make 
-sure that the optimization did not become unstable and generate a fit that
-diverges wildly outside the region where the fit was performed. This way, even
-if particular complex cannot be fit, the rest of the spectrum fitting still
-behaves as expected. The range of acceptability for each parameter is given
-in the species parameter dictionary. These are merely broad limits that will
-prevent numerical instability rather than physical limits.
-
-In cases where a single species generates multiple lines (as in the OVI 
-doublet), the fits are then checked for higher wavelength lines. Originally
-the fits are generated only considering the lowest wavelength fit to a region.
-This is because we perform the fitting of complexes in order from the lowest
-wavelength to the highest, so any contribution to a complex being fit must
-come from the lower wavelength as the higher wavelength contributions would
-already have been subtracted out after fitting the lower wavelength. 
-
-Saturated Lyman Alpha Fitting Tools
------------------------------------
-
-In cases where a large or saturated line (there exists a point in the complex
-where the flux is less than 0.1) fails to be fit properly on the first pass, a
-more robust set of fitting tools is used to try to remedy the situation.
-The basic approach is simply to try a much wider range of initial parameter
-guesses in order to find the true optimization minimum, rather than getting
-stuck in a local minimum. A set of hard-coded initial parameter guesses
-for Lyman alpha lines is given by the function :py:func:`get_test_lines`.
-Also included in these parameter guesses is an initial guess of a high-column
-cool line overlapping a lower-column warm line, indicative of a
-broad Lyman alpha (BLA) absorber.

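A hedged sketch of this retry strategy, looping over a list of hard-coded
starting points and keeping whichever converges best; the ``residuals``
callable, the guess layout, and all numerical values below are illustrative
stand-ins rather than the actual contents of :py:func:`get_test_lines`:

.. code-block:: python

    import numpy as np
    from scipy.optimize import leastsq   # the SciPy least-squares optimizer

    def refit_saturated_line(residuals, wavelength, flux, test_guesses):
        """Try each hard-coded starting point and keep the best-converged fit.

        ``residuals(params, wavelength, flux)`` is a stand-in for the model
        residual function used by the fitter; ``test_guesses`` plays the role
        of the list of initial guesses.
        """
        best_params, best_error = None, np.inf
        for guess in test_guesses:
            params, _ = leastsq(residuals, guess, args=(wavelength, flux))
            error = np.sum(residuals(params, wavelength, flux) ** 2)
            if error < best_error:
                best_params, best_error = params, error
        return best_params, best_error

    # One guess can pair a high-column cool (narrow) component with a
    # lower-column warm (broad) component, the BLA-like case noted above.
    # Layout (N in cm^-2, b in km/s, z) and numbers are purely illustrative:
    bla_guess = np.array([1.0e18, 20.0, 0.0, 1.0e14, 80.0, 0.0])
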
diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/source/analyzing/analysis_modules/index.rst
--- a/doc/source/analyzing/analysis_modules/index.rst
+++ b/doc/source/analyzing/analysis_modules/index.rst
@@ -17,4 +17,4 @@
    two_point_functions
    clump_finding
    particle_trajectories
-   ellipsoidal_analysis
+   ellipsoid_analysis

diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/source/analyzing/fields.rst
--- a/doc/source/analyzing/fields.rst
+++ b/doc/source/analyzing/fields.rst
@@ -20,7 +20,8 @@
 for datasets containing multiple different types of fluid fields, mesh fields,
 particles (with overlapping or disjoint lists of fields).  To enable accessing
 these fields in a meaningful, simple way, the mechanism for accessing them has
-changed to take an optional *field type* in addition to the *field name*.
+changed to take an optional *field type* in addition to the *field name* of
+the form ('*field type*', '*field name*').
 
 As an example, we may be in a situation where have multiple types of particles
 which possess the ``particle_position`` field.  In the case where a data
@@ -99,17 +100,18 @@
 should be returned in.  If an aliased field is requested (and aliased fields 
 will always be lowercase, with underscores separating words) it will be returned 
 in CGS units (future versions will enable global defaults to be set for MKS and 
-other unit systems), whereas if the underlying field is requested, it will not 
-undergo any unit conversions from its natural units.  (This rule is occasionally 
-violated for fields which are mesh-dependent, specifically particle masses in 
-some cosmology codes.)
+other unit systems), whereas if the frontend-specific field is requested, it 
+will not undergo any unit conversions from its natural units.  (This rule is 
+occasionally violated for fields which are mesh-dependent, specifically particle 
+masses in some cosmology codes.)
 
-.. _known_field_types:
+.. _known-field-types:
 
 Field types known to yt
 -----------------------
 
-yt knows of a few different field types:
+Recall that fields are formally accessed in two parts: ('*field type*', 
+'*field name*').  Here we describe the different field types you will encounter:
 
 * frontend-name -- Mesh or fluid fields that exist on-disk default to having
   the name of the frontend as their type name (e.g., ``enzo``, ``flash``,
@@ -140,6 +142,14 @@
   density estimates, counts, and the like.  See :ref:`deposited-particle-fields` 
   for more information.
 
+While it is best to access fields explicitly by their full names
+(i.e. ('*field type*', '*field name*')), yt provides an abbreviated 
+interface for accessing common fields (i.e. '*field name*').  In the abbreviated
+case, yt will assume you want the last *field type* accessed.  If you
+haven't previously accessed a *field type*, it will default to *field type* = 
+``'all'`` in the case of particle fields and *field type* = ``'gas'`` in the 
+case of mesh fields.
+
 Field Plugins
 -------------
 

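The access convention added in the fields.rst hunk above can be summarized
with a short, hypothetical session (the dataset path is a placeholder):

.. code-block:: python

    import yt

    ds = yt.load("my_dataset")                 # placeholder dataset path
    ad = ds.all_data()

    # Explicit access with the full ('field type', 'field name') tuple:
    density   = ad[("gas", "density")]
    positions = ad[("all", "particle_position_x")]

    # Abbreviated access: yt infers the field type ('gas' for mesh fields,
    # 'all' for particle fields, or the last field type you used explicitly).
    density_again = ad["density"]
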
diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/source/developing/creating_derived_quantities.rst
--- a/doc/source/developing/creating_derived_quantities.rst
+++ b/doc/source/developing/creating_derived_quantities.rst
@@ -3,6 +3,10 @@
 Creating Derived Quantities
 ---------------------------
 
+.. warning:: This section is not yet updated to work with yt 3.0.  If you
+             have a question about making a custom derived quantity, please
+             contact the mailing list.
+
 The basic idea is that you need to be able to operate both on a set of data,
 and a set of sets of data.  (If this is not possible, the quantity needs to be
 added with the ``force_unlazy`` option.)

diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/source/developing/creating_frontend.rst
--- a/doc/source/developing/creating_frontend.rst
+++ b/doc/source/developing/creating_frontend.rst
@@ -3,12 +3,9 @@
 Creating A New Code Frontend
 ============================
 
-.. note::
-
-   The material in this section will be significantly revised with the release
-   of yt 3.0.  If you would like to write a new frontend, consider sending a
-   message to the mailing list so you can find out the latest about frontends in
-   yt 3.0.
+.. warning:: This section is not yet updated to work with yt 3.0.  If you
+             have a question about writing a new code frontend, please
+             contact the mailing list.
 
 yt is designed to support analysis and visualization of data from multiple
 different simulation codes, although it has so far been most successfully

diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/source/index.rst
--- a/doc/source/index.rst
+++ b/doc/source/index.rst
@@ -28,7 +28,7 @@
          </p></td><td width="75%">
-         <p class="linkdescr">Getting, Installing, and Updating yt</p>
+         <p class="linkdescr">Getting, installing, and updating yt</p></td></tr><tr valign="top">

diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/source/installing.rst
--- a/doc/source/installing.rst
+++ b/doc/source/installing.rst
@@ -341,7 +341,7 @@
 at the command line.  If you encounter problems, see :ref:`update-errors`.
 
 If You Installed yt Using from Source or Using pip
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+++++++++++++++++++++++++++++++++++++++++++++++++++
 
 If you have installed python via ``pip``, remove 
 any extant installations of yt on your system and clone the source mercurial 

diff -r af0201dadeebed1e66e3470e718a225470140b95 -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e doc/source/reference/api/api.rst
--- a/doc/source/reference/api/api.rst
+++ b/doc/source/reference/api/api.rst
@@ -694,6 +694,7 @@
 
    ~yt.convenience.load
    ~yt.data_objects.static_output.Dataset.all_data
+   ~yt.data_objects.static_output.Dataset.box
    ~yt.funcs.deprecate
    ~yt.funcs.ensure_list
    ~yt.funcs.get_pbar
@@ -714,6 +715,8 @@
    ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_passthrough
    ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_root_only
    ~yt.utilities.parallel_tools.parallel_analysis_interface.parallel_simple_proxy
+   ~yt.data_objects.data_containers.YTDataContainer.get_field_parameter
+   ~yt.data_objects.data_containers.YTDataContainer.set_field_parameter
 
 Math Utilities
 --------------

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt/commits/49d4aca8c7e3/
Changeset:   49d4aca8c7e3
Branch:      yt-3.0
User:        MatthewTurk
Date:        2014-08-04 14:44:12
Summary:     Updating version to 3.0
Affected #:  2 files

diff -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e -r 49d4aca8c7e3f8a83057956b39a5840785a0daa5 doc/source/conf.py
--- a/doc/source/conf.py
+++ b/doc/source/conf.py
@@ -69,7 +69,7 @@
 # The short X.Y version.
 version = '3.0'
 # The full version, including alpha/beta/rc tags.
-release = '3.0alpha'
+release = '3.0'
 
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.

diff -r 1247055ff5db3152e4a499f7cdbaa243b9d6e10e -r 49d4aca8c7e3f8a83057956b39a5840785a0daa5 setup.py
--- a/setup.py
+++ b/setup.py
@@ -118,7 +118,7 @@
 # End snippet
 ######
 
-VERSION = "3.0dev"
+VERSION = "3.0"
 
 if os.path.exists('MANIFEST'):
     os.remove('MANIFEST')

Repository URL: https://bitbucket.org/yt_analysis/yt/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.


