[yt-svn] commit/yt-doc: 2 new changesets

Bitbucket commits-noreply at bitbucket.org
Thu Jul 26 11:48:59 PDT 2012


2 new commits in yt-doc:


https://bitbucket.org/yt_analysis/yt-doc/changeset/397e499177e0/
changeset:   397e499177e0
user:        ngoldbaum
date:        2012-07-26 20:48:30
summary:     Fixing some typos.
affected #:  1 file

diff -r 0d86e414a8bfbd869d802919c6bd14f81bdcb881 -r 397e499177e0090e658634a9c48ef48b95dd5731 source/orientation/making_plots.rst
--- a/source/orientation/making_plots.rst
+++ b/source/orientation/making_plots.rst
@@ -10,7 +10,7 @@
 slices and projections through your data.  We will then move on to demonstrate
 how to make analysis plots, including phase diagrams and profiles.
 
-The quickest way to plot a slice of a field through you data is to use
+The quickest way to plot a slice of a field through your data is to use
 :class:`~yt.visualization.plot_window.SlicePlot`.  Say we want to visualize a
 slice through the Density field along the z-axis centered on the center of the
 simulation box in a simulation dataset we've opened and stored in the parameter
@@ -59,10 +59,11 @@
 
 .. code-block:: python
 
-   >>> SlicePlot(pf, 'z', 'Density', center=[0.2, 0.3, 0.8], width = (10,'kpc)).save()
+   >>> SlicePlot(pf, 'z', 'Density', center=[0.2, 0.3, 0.8], 
+   ...           width = (10,'kpc')).save()
 
 The center must be given in code units.  Optionally, you can supply 'c' or 'm'
-for the center.  These two options will center the plot on the center of the
+for the center.  These two choices will center the plot on the center of the
 simulation box and the coordinate of the maximum density cell, respectively.
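
 For instance, assuming ``pf`` is a dataset that has already been loaded (as in
 the examples above), a quick sketch that centers on the maximum density cell:

 .. code-block:: python

    >>> SlicePlot(pf, 'z', 'Density', center='m', width=(10, 'kpc')).save()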
 
 One can also use the SlicePlot to make annotated plots.  The following commands
@@ -75,15 +76,15 @@
    >>> SlicePlot.annotate_grids()
    >>> SlicePlot.save()
 
-There are a number of annotations available.  The rest of the annotations are
-described in :ref:`callbacks`.
+There are a number of annotations available.  The full list is available in
+:ref:`callbacks`.
 
 Projections
 ^^^^^^^^^^^
 
 It can be limiting to only look at slices through 3D data.  In most cases, doing
 so discards the vast majority of the data.  For this reason, yt provides a
-simple interface for generating plot of projections through your data.  The
+simple interface for generating plots of projections through your data.  The
 interface for making projection plots,
 :class:`~yt.visualization.plot_window.ProjectionPlot` is very similar to
 ``SlicePlot``, described above.  To create and save a plot of the projection of


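 The projection interface parallels the slice interface shown above.  As a
 sketch, assuming the same ``pf`` and that ``ProjectionPlot`` takes the same
 leading arguments as ``SlicePlot``, per the description in the text:

 .. code-block:: python

    >>> ProjectionPlot(pf, 'z', 'Density').save()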

https://bitbucket.org/yt_analysis/yt-doc/changeset/d7f484a89e72/
changeset:   d7f484a89e72
user:        ngoldbaum
date:        2012-07-26 20:48:51
summary:     Merging.
affected #:  7 files

diff -r 397e499177e0090e658634a9c48ef48b95dd5731 -r d7f484a89e7267f74107541eb6cbc3c435748075 source/analysis_modules/analyzing_an_entire_simulation.rst
--- a/source/analysis_modules/analyzing_an_entire_simulation.rst
+++ /dev/null
@@ -1,151 +0,0 @@
-.. _analyzing-an-entire-simulation:
-
-Analyzing an Entire Simulation
-==============================
-.. sectionauthor:: Britton Smith <britton.smith at colorado.edu>
-
-The EnzoSimulation class provides a simple framework for performing the same 
-analysis on multiple datasets in a single simulation.  At its most basic, an 
-EnzoSimulation object gives you access to a time-ordered list of datasets over 
-the time or redshift interval of your choosing.  It also includes more 
-sophisticated machinery for stitching together cosmological datasets to create 
-a continuous volume spanning a given redshift interval.  This is the engine that 
-powers the light cone generator (see :ref:`light-cone-generator`).  See 
-:ref:`cookbook-simulation_halo_profiler` for an example of using the EnzoSimulation 
-class to run the HaloProfiler on multiple datasets within a simulation.
-
-EnzoSimulation Options
-----------------------
-
-The only argument required to instantiate an EnzoSimulation is the path to the 
-parameter file used to run the simulation:
-
-.. code-block:: python
-
-  import yt.analysis_modules.simulation_handler.api as ES
-  es = ES.EnzoSimulation("my_simulation.par")
-
-The EnzoSimulation object will then read through the simulation parameter file 
-to figure out what datasets are available and where they are located.  Comment 
-characters are respected, so commented-out lines will be ignored.  If no time 
-and/or redshift interval is specified using the keyword arguments listed below, 
-the EnzoSimulation object will create a time-ordered list of all datasets.
-
-.. note:: For cosmological simulations, the interval of interest can be
-   specified with a combination of time and redshift keywords.
-
-The additional keyword options are:
-
- * **initial_time** (*float*): the initial time in code units for the
-   dataset list.  Default: None.
-
- * **final_time** (*float*): the final time in code units for the dataset
-   list.  Default: None.
-
- * **initial_redshift** (*float*): the initial (highest) redshift for the
-   dataset list.  Only for cosmological simulations.  Default: None.
-
- * **final_redshift** (*float*): the final (lowest) redshift for the dataset
-   list.  Only for cosmological simulations.  Default: None.
-
- * **links** (*bool*): if True, each entry in the dataset list will
-   contain entries, *previous* and *next*, that point to the previous and next
-   entries on the dataset list.  Default: False.
-
- * **enzo_parameters** (*dict*): a dictionary specify additional
-   parameters to be retrieved from the parameter file.  The format should be the
-   name of the parameter as the key and the variable type as the value.  For
-   example, {'CosmologyComovingBoxSize':float}.  All parameter values will be
-   stored in the dictionary attribute, *enzoParameters*.  Default: None.
-
- * **get_time_outputs** (*bool*): if False, the time datasets, specified
-   in Enzo with the *dtDataDump*, will not be added to the dataset list.  Default:
-   True.
-
- * **get_redshift_outputs** (*bool*): if False, the redshift datasets will
-   not be added to the dataset list.  Default: True.
-
-.. warning:: The EnzoSimulation object will use the *GlobalDir* Enzo parameter
-   to determine the absolute path to the data, so make sure this is set correctly
-   if the data has been moved.  If this parameter is not present in the parameter
-   file, the code will look for the data in the current directory.
-
-The Dataset List
-----------------
-
-The primary attribute of an EnzoSimulation object is the dataset list, 
-*allOutputs*.  Each list item is a dictionary, containing the time, redshift 
-(if cosmological), and filename of the dataset.
-
-.. code-block:: python
-
-  >>> es.allOutputs[0]
-  {'filename': '/Users/britton/EnzoRuns/cool_core_unreasonable/RD0000/RD0000',
-   'time': 0.81631644849936602, 'redshift': 99.0}
-
-Now, analyzing each dataset is easy:
-
-.. code-block:: python
-
-  for output in es.allOutputs:
-      # load up a dataset
-      pf = load(output['filename'])
-      # do something!
-
-Cosmology Splices
------------------
-
-For cosmological simulations, the physical width of the simulation box 
-corresponds to some :math:`\Delta z`, which varies with redshift.  Using this 
-logic, one can stitch together a series of datasets to create a continuous 
-volume or length element from one redshift to another.  The 
-:meth:`create_cosmology_splice` method will return such a list:
-
-.. code-block:: python
-
-  cosmo = es.create_cosmology_splice(minimal=True, deltaz_min=0.0, initial_redshift=1.0, final_redshift=0.0)
-
-The returned list is of the same format as the *allOutputs* attribute.  The 
-keyword arguments are:
-
- * **minimal** (*bool*): if True, the minimum number of datasets is used
-   to connect the initial and final redshift.  If false, the list will contain as
-   many entries as possible within the redshift interval.  Default: True.
-
- * **deltaz_min** (*float*): specifies the minimum :math:`\Delta z` between
-   consecutive datasets in the returned list.  Default: 0.0.
-
- * **initial_redshift** (*float*): the initial (highest) redshift in the
-   cosmology splice list.  If none given, the highest redshift dataset present
-   will be used.  Default: None.
-
- * **final_redshift** (*float*): the final (lowest) redshift in the
-   cosmology splice list.  If none given, the lowest redshift dataset present will
-   be used.  Default: None.
-
-The most well known application of this function is the
-:ref:`light-cone-generator`.
-
-Planning a Cosmological Simulation
-----------------------------------
-
-If you want to run a cosmological simulation that will have just enough data outputs 
-to create a cosmology splice, the :meth:`imagine_minimal_splice` method will calculate 
-a list of redshifts outputs that will minimally connect a redshift interval.
-
-.. code-block:: python
-
-  initial_redshift = 0.4
-  final_redshift = 0.0 
-  outputs = es.imagine_minimal_splice(initial_redshift, final_redshift, filename='outputs.out')
-
-This function will return a list of dictionaries with "redshift" and "deltazMax" 
-entries.  The keyword arguments are:
-
- * **decimals** (*int*): The decimal place to which the output redshift will be rounded.  If the decimal place in question is nonzero, the redshift will be rounded up to ensure continuity of the splice.  Default: 3.
-
- * **filename** (*str*): If provided, a file will be written with the redshift outputs in the form in which they should be given in the enzo parameter file.  Default: None.
-
- * **redshift_output_string** (*str*): The parameter accompanying the redshift outputs in the enzo parameter file.  Default: "CosmologyOutputRedshift".
-
- * **start_index** (*int*): The index of the first redshift output.  Default: 0.


diff -r 397e499177e0090e658634a9c48ef48b95dd5731 -r d7f484a89e7267f74107541eb6cbc3c435748075 source/analysis_modules/halo_profiling.rst
--- a/source/analysis_modules/halo_profiling.rst
+++ b/source/analysis_modules/halo_profiling.rst
@@ -5,30 +5,26 @@
 .. sectionauthor:: Britton Smith <brittonsmith at gmail.com>,
    Stephen Skory <s at skory.us>
 
-The halo profiler provides a means of performing analysis on multiple points in a dataset at 
-once.  This is primarily intended for use with cosmological simulations, in which  
-gravitationally bound structures composed of dark matter and gas, called halos, form and 
-become the hosts for galaxies and galaxy clusters.
+The ``HaloProfiler`` provides a means of performing analysis on multiple halos 
+in a parallel-safe way.
 
-The halo profiler performs two primary functions: radial profiles and projections.  
-The halo profiler can be run in parallel, with `mpi4py
-<http://code.google.com/p/mpi4py/>`_ installed, by running 
-your script inside an mpirun call with the --parallel flag at the end.
+The halo profiler performs three primary functions: radial profiles, 
+projections, and custom analysis.  See the cookbook for a recipe demonstrating 
+all of these features.
 
 Configuring the Halo Profiler
 -----------------------------
 
-A sample script to run the halo profiler can be found in :ref:`cookbook-run_halo_profiler`.  
-In order to run the halo profiler on a dataset, a halo profiler object must be instantiated 
-with the path to the dataset as the only argument:
+The only argument required to create a ``HaloProfiler`` object is the path 
+to the dataset.
 
 .. code-block:: python
 
   from yt.analysis_modules.halo_profiler.api import *
-  hp = HaloProfiler("DD0242/DD0242")
+  hp = HaloProfiler("enzo_tiny_cosmology/DD0046/DD0046")
 
-Most of the halo profiler's options are configured with keyword arguments given at 
-instantiation.  These options are:
+Most of the halo profiler's options are configured with additional keyword 
+arguments:
 
  * **output_dir** (*str*): if specified, all output will be put into this path
    instead of in the dataset directories.  Default: None.
@@ -98,22 +94,23 @@
   calculated (used for calculation of radial and tangential velocities).  Valid
    options are:
    - ["bulk", "halo"] (Default): the velocity provided in the halo list
-   - ["bulk", "sphere"]: the bulk velocity of the sphere centered on the halo center.
+   - ["bulk", "sphere"]: the bulk velocity of the sphere centered on the halo 
+     center.
    - ["max", field]: the velocity of the cell that is the location of the maximum of the field specified.
 
  * **filter_quantities** (*list*): quantities from the original halo list
    file to be written out in the filtered list file.  Default: ['id','center'].
 
- * **use_critical_density** (*bool*): if True, the definition of overdensity for 
-   virial quantities is calculated with respect to the critical density.  If False, 
-   overdensity is with respect to mean matter density, which is lower by a factor 
-   of Omega_M.  Default: False.
+ * **use_critical_density** (*bool*): if True, the definition of overdensity 
+   for virial quantities is calculated with respect to the critical 
+   density.  If False, overdensity is with respect to mean matter density, 
+   which is lower by a factor of Omega_M.  Default: False.
 
 Profiles
 --------
 
-Once the halo profiler object has been instantiated, fields can be added for profiling with 
-the :meth:`add_profile` method:
+Once the halo profiler object has been instantiated, fields can be added for 
+profiling with the :meth:`add_profile` method:
 
 .. code-block:: python
 
@@ -121,13 +118,35 @@
   hp.add_profile('TotalMassMsun', weight_field=None, accumulation=True)
   hp.add_profile('Density', weight_field=None, accumulation=False)
   hp.add_profile('Temperature', weight_field='CellMassMsun', accumulation=False)
-  hp.make_profiles(njobs=-1)
+  hp.make_profiles(njobs=-1, prefilters=["halo['mass'] > 1e13"],
+                   filename='VirialQuantities.h5')
 
 The :meth:`make_profiles` method will begin the profiling.  Use the
 **njobs** keyword to control the number of jobs over which the
 profiling is divided.  Setting to -1 results in a single processor per
 halo.  Setting to 1 results in all available processors working on the
-same halo.
+same halo.  The prefilters keyword tells the profiler to skip all halos with 
+masses (as loaded from the halo finder) less than a given amount.  See below 
+for more information.  Additional keyword arguments are:
+
+ * **filename** (*str*): If set, a file will be written with all of the 
+   filtered halos and the quantities returned by the filter functions.
+   Default: None.
+
+ * **prefilters** (*list*): A single dataset can contain thousands or tens of 
+   thousands of halos. Significant time can be saved by not profiling halos
+   that are certain not to pass any filter functions in place.  Simple filters 
+   based on quantities provided in the initial halo list can be used to filter 
+   out unwanted halos using this parameter.  Default: None.
+
+ * **njobs** (*int*): The number of jobs over which to split the profiling.  
+   Set to -1 so that each halo is done by a single processor.  Default: -1.
+
+ * **dynamic** (*bool*): If True, distribute halos using a task queue.  If 
+   False, distribute halos evenly over all jobs.  Default: False.
+
+ * **profile_format** (*str*): The file format for the radial profiles, 
+   'ascii' or 'hdf5'.  Default: 'ascii'.
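
 Putting several of these keywords together (a sketch; the dynamic task queue
 and hdf5 output are illustrative choices, not requirements):

 .. code-block:: python

   # distribute halos through a task queue, one processor per halo,
   # and write the radial profiles as hdf5
   hp.make_profiles(njobs=-1, dynamic=True, profile_format='hdf5',
                    filename='VirialQuantities.h5')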
 
 .. image:: _images/profiles.png
    :width: 500
@@ -145,7 +164,7 @@
   hp.add_projection('Temperature', weight_field='Density')
   hp.add_projection('Metallicity', weight_field='Density')
   hp.make_projections(axes=[0, 1, 2], save_cube=True, save_images=True, 
-                                    halo_list="filtered", njobs=-1)
+                      halo_list="filtered", njobs=-1)
 
 If **save_cube** is set to True, the projection data
 will be written to a set of hdf5 files 
@@ -158,7 +177,26 @@
 discussion of filtering halos.  Use the **njobs** keyword to control
 the number of jobs over which the profiling is divided.  Setting to -1
 results in a single processor per halo.  Setting to 1 results in all
-available processors working on the same halo.
+available processors working on the same halo.  The keyword arguments are:
+
+ * **axes** (*list*): A list of the axes to project along, using the usual 
+   0,1,2 convention. Default=[0,1,2].
+
+ * **halo_list** (*str*) {'filtered', 'all'}: Which set of halos to make 
+   projections of, either ones passed by the halo filters (if enabled/added), or 
+   all halos.  Default='filtered'.
+
+ * **save_images** (*bool*): Whether or not to save images of the projections. 
+   Default=False.
+
+ * **save_cube** (*bool*): Whether or not to save the HDF5 files of the halo 
+   projections.  Default=True.
+
+ * **njobs** (*int*): The number of jobs over which to split the projections.  
+   Set to -1 so that each halo is done by a single processor.  Default: -1.
+
+ * **dynamic** (*bool*): If True, distribute halos using a task queue.  If 
+   False, distribute halos evenly over all jobs.  Default: False.
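
 As a combined sketch of these options (the values are illustrative):

 .. code-block:: python

   # project all halos along each axis, keeping images but not the hdf5 cubes
   hp.make_projections(axes=[0, 1, 2], halo_list='all',
                       save_cube=False, save_images=True, njobs=-1)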
 
 .. image:: _images/projections.png
    :width: 500
@@ -228,8 +266,8 @@
 
   hp.make_profiles(filename="FilteredQuantities.out")
 
-If the **filename** keyword is set, a file will be written with all of the filtered halos 
-and the quantities returned by the filter functions.
+If the **filename** keyword is set, a file will be written with all of the 
+filtered halos and the quantities returned by the filter functions.
 
 .. note:: If the profiles have already been run, the halo profiler will read
    in the previously created output files instead of re-running the profiles.
@@ -284,8 +322,10 @@
 
 .. code-block:: python
 
-   hp = HaloProfiler("data0092", recenter="Max_Dark_Matter_Density")
+   hp = HaloProfiler("enzo_tiny_cosmology/DD0046/DD0046", 
+                     recenter="Max_Dark_Matter_Density")
 
+Additional options are:
 
   * *Min_Dark_Matter_Density* - Recenter on the point of minimum dark matter
     density in the halo.
@@ -338,7 +378,7 @@
        ma, mini, mx, my, mz, mg = sphere.quantities['MinLocation']('Temperature')
        return [mx,my,mz]
    
-   hp = HaloProfiler("data0092", recenter=find_min_temp)
+   hp = HaloProfiler("enzo_tiny_cosmology/DD0046/DD0046", recenter=find_min_temp)
 
 It is possible to make more complicated functions. This example below extends
 the example above to include a distance control that prevents the center from
@@ -362,20 +402,8 @@
        if d > 5.: return [-1, -1, -1]
        return [mx,my,mz]
    
-   hp = HaloProfiler("data0092", recenter=find_min_temp_dist)
-
-.. warning::
-
-   If the halo profiler is run in parallel, and a recentering function is used
-   that is user-defined, two flags need to be set in the ``quantities`` call
-   as in the example below. The built-in recentering functions have
-   these flags set already.
-   
-   .. code-block:: python
-      
-      ma, mini, mx, my, mz, mg = sphere.quantities['MinLocation']('Temperature',
-        lazy_reader=True, preload=False)
-
+   hp = HaloProfiler("enzo_tiny_cosmology/DD0046/DD0046", 
+                     recenter=find_min_temp_dist)
 
 Custom Halo Analysis
 --------------------
@@ -410,7 +438,8 @@
 .. code-block:: python
 
     hp.analyze_halo_sphere(halo_2D_profile, halo_list='filtered',
-        analysis_output_dir='2D_profiles', njobs=-1)
+                           analysis_output_dir='2D_profiles', 
+                           njobs=-1, dynamic=False)
 
 Just like with the :meth:`make_projections` function, the keyword,
 **halo_list**, can be used to select between the full list of halos


diff -r 397e499177e0090e658634a9c48ef48b95dd5731 -r d7f484a89e7267f74107541eb6cbc3c435748075 source/analysis_modules/index.rst
--- a/source/analysis_modules/index.rst
+++ b/source/analysis_modules/index.rst
@@ -8,11 +8,11 @@
    :maxdepth: 2
 
    running_halofinder
-   analyzing_an_entire_simulation
    hmf_howto
    halo_profiling
    light_cone_generator
    light_ray_generator
+   planning_cosmology_simulations
    absorption_spectrum
    star_analysis
    halo_mass_function


diff -r 397e499177e0090e658634a9c48ef48b95dd5731 -r d7f484a89e7267f74107541eb6cbc3c435748075 source/analysis_modules/light_cone_generator.rst
--- a/source/analysis_modules/light_cone_generator.rst
+++ b/source/analysis_modules/light_cone_generator.rst
@@ -4,71 +4,82 @@
 ====================
 .. sectionauthor:: Britton Smith <brittonsmith at gmail.com>
 
-Light cones are projections made by stacking multiple datasets together to continuously span a 
-given redshift interval.  The width of individual projection slices is adjusted such that each slice 
-has the same angular size.  Each projection slice is randomly shifted and projected along a random 
-axis to ensure that the same structures are not sampled multiple times.  Since deeper images sample 
-earlier epochs of the simulation, light cones represent the closest thing to synthetic imaging 
-observations.
-
-As with most things yt, the light cone functionality can be run in parallel with 
-`mpi4py <http://code.google.com/p/mpi4py/>`_ installed, by running your script inside an mpirun call 
-with the --parallel flag at the end.
+Light cones are projections made by stacking multiple datasets together to 
+continuously span a given redshift interval.  The width of individual 
+projection slices is adjusted such that each slice has the same angular size.  
+Each projection slice is randomly shifted and projected along a random axis to 
+ensure that the same structures are not sampled multiple times.  Since deeper 
+images sample earlier epochs of the simulation, light cones represent the 
+closest thing to synthetic imaging observations.
 
 .. image:: _images/LightCone_full_small.png
    :width: 500
 
-A light cone projection of the thermal Sunyaev-Zeldovich Y parameter from z = 0 to 0.4 with a 
-450x450 arcminute field of view using 9 individual slices.  The panels shows the contributions from 
-the 9 individual slices with the final light cone image shown in the bottom, right.
+A light cone projection of the thermal Sunyaev-Zeldovich Y parameter from 
+z = 0 to 0.4 with a 450x450 arcminute field of view using 9 individual 
+slices.  The panels show the contributions from the 9 individual slices with 
+the final light cone image shown in the bottom right.
 
 Configuring the Light Cone Generator
 ------------------------------------
 
-A recipe for creating a simple light cone projection can be found in :ref:`cookbook-make_light_cone`.  
-Light cone projections are made from objects of the LightCone class.  The only required argument for 
-instantiation is the parameter file used to run the simulation, although a few keyword arguments are 
-technically required for anything interesting to happen:
+A recipe for creating a simple light cone projection can be found in the 
+cookbook.  The required arguments to instantiate a ``LightCone`` object are 
+the path to the simulation parameter file, the simulation type, the nearest 
+redshift, and the furthest redshift of the light cone.
 
 .. code-block:: python
 
-  import yt.analysis_modules.lightcone.api as LC
+  from yt.analysis_modules.api import LightCone
 
-  lc = LC.LightCone("128Mpc256grid_SFFB.param", initial_redshift=0.4, final_redshift=0.0, 
-                    observer_redshift=0.0, field_of_view_in_arcminutes=450.0, 
-                    image_resolution_in_arcseconds=60.0)
+  lc = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo',
+                 'Enzo', 0., 0.1)
 
-The complete list of keyword arguments for instantiation is given below:
+The additional keyword arguments are:
 
- * **initial_redshift** (*float*): the initial (highest) redshift for the light cone.  Default: 1.0.
+ * **field_of_view_in_arcminutes** (*float*): The field of view of the image 
+   in units of arcminutes.  Default: 600.0.
 
- * **final_redshift** (*float*): the final (lowest) redshift for the light cone.  Default: 0.0.
+ * **image_resolution_in_arcseconds** (*float*): The size of each image pixel 
+   in units of arcseconds.  Default: 60.0.
 
- * **observer_redshift** (*float*): the redshift of the observer.  Default: 0.0.
+ * **use_minimum_datasets** (*bool*):  If True, the minimum number of datasets 
+   is used to connect the initial and final redshift.  If false, the light 
+   cone solution will contain as many entries as possible within the redshift 
+   interval.  Default: True.
 
- * **field_of_view_in_arcminutes** (*float*): the field of view of the image in units of arcminutes.  Default: 600.0.
+ * **deltaz_min** (*float*): Specifies the minimum Delta-z between 
+   consecutive datasets in the returned list.  Default: 0.0.
 
- * **image_resolution_in_arcseconds** (*float*): the size of each image pixel in units of arcseconds.  Default: 60.0.
+ * **minimum_coherent_box_fraction** (*float*): Used with use_minimum_datasets 
+   set to False, this parameter specifies the fraction of the total box size 
+   to be traversed before rerandomizing the projection axis and center.  This 
+   was invented to allow light cones with thin slices to sample coherent large 
+   scale structure, but in practice does not work so well.  Try setting this 
+   parameter to 1 and see what happens.  Default: 0.0.
 
- * **use_minimum_datasets** (*bool*): if True, the minimum number of datasets is used to connect the initial and final redshift.  If false, the light cone solution will contain as many entries as possible within the redshift interval.  Default: True.
+ * **time_data** (*bool*): Whether or not to include time outputs when 
+   gathering datasets for time series.  Default: True.
 
- * **deltaz_min** (*float*): specifies the minimum :math:`\Delta z` between consecutive datasets in the returned list.  Default: 0.0.
+ * **redshift_data** (*bool*): Whether or not to include redshift outputs when 
+   gathering datasets for time series.  Default: True.
 
- * **minimum_coherent_box_fraction** (*float*): used with **use_minimum_datasets** set to False, this parameter specifies the fraction of the total box size to be traversed before rerandomizing the projection axis and center.  This was invented to allow light cones with thin slices to sample coherent large scale structure, but in practice does not work so well.  Try setting this parameter to 1 and see what happens.  Default: 0.0.
+ * **set_parameters** (*dict*): Dictionary of parameters to attach to 
+   pf.parameters.  Default: None.
 
- * **output_dir** (*str*): the directory in which images and data files will be written.  Default: 'LC'.
+ * **output_dir** (*string*): The directory in which images and data files
+    will be written.  Default: 'LC'.
 
- * **output_prefix** (*str*): the prefix of all images and data files.  Default: 'LightCone'.
+ * **output_prefix** (*string*): The prefix of all images and data files.
+   Default: 'LightCone'.
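
 Combining a few of these keywords, a fuller instantiation might look like the
 sketch below (the field of view and resolution values are illustrative):

 .. code-block:: python

   # a 450 arcminute field of view sampled with 60 arcsecond pixels
   lc = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo', 0., 0.1,
                  field_of_view_in_arcminutes=450.0,
                  image_resolution_in_arcseconds=60.0,
                  use_minimum_datasets=True,
                  output_dir='LC', output_prefix='LightCone')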
 
 Creating Light Cone Solutions
 -----------------------------
 
-A light cone solution consists of a list of datasets and the width, depth, center, and axis of the 
-projection to be made for that slice.  The LightCone class is a subclass of EnzoSimulation 
-(see :ref:`analyzing-an-entire-simulation`).  As such, the initial selection of the list of datasets 
-to be used in a light cone solution is done with the :meth:`EnzoSimulation.create_cosmology_splice`.  
-The :meth:`LightCone.calculate_light_cone_solution` is used to calculated the random shifting and 
-projection axis:
+A light cone solution consists of a list of datasets and the width, depth, 
+center, and axis of the projection to be made for that slice.  The 
+:meth:`LightCone.calculate_light_cone_solution` function is used to 
+calculate the random shifting and projection axis:
 
 .. code-block:: python
 
@@ -76,9 +87,12 @@
 
 The keyword arguments are:
 
- * **seed** (*int*): the seed for the random number generator.  Any light cone solution can be reproduced by giving the same random seed.  Default: None (each solution will be distinct).
+ * **seed** (*int*): the seed for the random number generator.  Any light cone 
+   solution can be reproduced by giving the same random seed.  Default: None 
+   (each solution will be distinct).
 
- * **filename** (*str*): if given, a text file detailing the solution will be written out.  Default: None.
+ * **filename** (*str*): if given, a text file detailing the solution will be 
+   written out.  Default: None.
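
 For example, a minimal sketch of the call (the seed value is illustrative):

 .. code-block:: python

   # the same seed always reproduces the same solution
   lc.calculate_light_cone_solution(seed=123456789,
                                    filename='lightcone.dat')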
 
 If a new solution for the same LightCone object is desired, the 
 :meth:`rerandomize_light_cone_solution` method should be called in place of 
@@ -87,77 +101,110 @@
 .. code-block:: python
 
   new_seed = 987654321
-  lc.rerandomize_light_cone_solution(new_seed, Recycle=True, filename='new_lightcone.dat')
+  lc.rerandomize_light_cone_solution(new_seed, recycle=True, 
+                                     filename='new_lightcone.dat')
 
-If debugging is on, the LightCone object will calculate and output the fraction of the light cone 
-volume in common with the original solution.  The keyword arguments are:
+Additional keyword arguments are:
 
- * **recycle** (*bool*): if True, the new solution will have the same shift in the line of sight as the original solution.  Since the projections of each slice are serialized and stored for the entire width of the box (even if the width used is left than the total box), the projection data can be deserialized instead of being remade from scratch.  This can greatly speed up the creation of a large number of light cone projections.  Default: True.
+ * **recycle** (*bool*): if True, the new solution will have the same shift in 
+   the line of sight as the original solution.  Since the projections of each 
+   slice are serialized and stored for the entire width of the box (even if 
+   the width used is less than the total box), the projection data can be 
+   deserialized instead of being remade from scratch.  This can greatly speed 
+   up the creation of a large number of light cone projections.  Default: True.
 
- * **filename** (*str*): if given, a text file detailing the solution will be written out.  Default: None.
+ * **filename** (*str*): if given, a text file detailing the solution will be 
+   written out.  Default: None.
 
-If :meth:`rerandomize_light_cone_solution` is used, the LightCone object will keep a copy of the 
-original solution that can be returned to at any time by calling :meth:`restore_master_solution`:
+If :meth:`rerandomize_light_cone_solution` is used, the LightCone object will 
+keep a copy of the original solution that can be returned to at any time by 
+calling :meth:`restore_master_solution`:
 
 .. code-block:: python
 
   lc.restore_master_solution()
 
-.. note:: All light cone solutions made with the above method will still use the same list of datasets.  Only the shifting and projection axis will be different.
+.. note:: All light cone solutions made with the above method will still use 
+   the same list of datasets.  Only the shifting and projection axis will be 
+   different.
 
 Making a Light Cone Projection
 ------------------------------
 
-With the light cone solution set, projections can be made of any available field:
+With the light cone solution set, projections can be made of any available 
+field:
 
 .. code-block:: python
 
   field = 'Density'
-  pc = lc.project_light_cone(field , weight_field=None, save_stack=True, save_slice_images=True)
+  lc.project_light_cone(field, weight_field=None, 
+                        save_stack=True, 
+                        save_slice_images=True)
 
-The return value of :meth:`project_light_cone` is the PlotCollection containing the image of the final 
-light cone image.  This allows the user further customization of the final image.  The keyword 
-arguments of :meth:`project_light_cone` are:
+Additional keyword arguments:
 
- * **weight_field** (*str*): the weight field of the projection.  This has the same meaning as in standard projections.  Default: None.
+ * **weight_field** (*str*): the weight field of the projection.  This has the 
+   same meaning as in standard projections.  Default: None.
 
- * **apply_halo_mask** (*bool*): if True, a boolean mask is apply to the light cone projection.  See below for a description of halo masks.  Default: False.
+ * **apply_halo_mask** (*bool*): if True, a boolean mask is applied to the light 
+   cone projection.  See below for a description of halo masks.  Default: False.
 
- * **node** (*str*): a prefix to be prepended to the node name under which the projection data is serialized.  Default: None.
+ * **node** (*str*): a prefix to be prepended to the node name under which the 
+   projection data is serialized.  Default: None.
 
- * **save_stack** (*bool*): if True, the unflatted light cone data including each individual slice is written to an hdf5 file.  Default: True.
+ * **save_stack** (*bool*): if True, the unflattened light cone data including 
+   each individual slice is written to an hdf5 file.  Default: True.
 
- * **save_slice_images** (*bool*): save images for each individual projection slice.  Default: False.
+ * **save_final_image** (*bool*): if True, save an image of the final light 
+   cone projection.  Default: True.
 
- * **flatten_stack** (*bool*): if True, the light cone stack is continually flattened each time a slice is added in order to save memory.  This is generally not necessary.  Default: False.
+ * **save_slice_images** (*bool*): save images for each individual projection 
+   slice.  Default: False.
 
- * **photon_field** (*bool*): if True, the projection data for each slice is decremented by 4 :math:`\pi` R :superscript:`2`, where R is the luminosity distance between the observer and the slice redshift.  Default: False.
+ * **flatten_stack** (*bool*): if True, the light cone stack is continually 
+   flattened each time a slice is added in order to save memory.  This is 
+   generally not necessary.  Default: False.
 
-.. note:: Additional keywords appropriate for a call to :meth:`PlotCollection.add_projection` can also be given to :meth:`project_light_cone`.
+ * **photon_field** (*bool*): if True, the projection data for each slice is 
+   decremented by 4 pi R :superscript:`2`, where R is the luminosity 
+   distance between the observer and the slice redshift.  Default: False.
+
+ * **njobs** (*int*): The number of parallel jobs over which the light cone 
+   projection will be split.  Choose -1 for one processor per individual
+   projection and 1 to have all processors work together on each projection.
+   Default: 1.
+
+ * **dynamic** (*bool*): If True, use dynamic load balancing to create the 
+   projections.  Default: False.
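
 A sketch exercising the parallel keywords (the values are illustrative):

 .. code-block:: python

   # one processor per projection slice, no per-slice images
   lc.project_light_cone('Density', weight_field=None,
                         save_slice_images=False, njobs=-1, dynamic=False)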
 
 Sampling Unique Light Cone Volumes
 ----------------------------------
 
-When making a large number of light cones, particularly for statistical analysis, it is important 
-to have a handle on the amount of sampled volume in common from one projection to another.  Any 
-statistics may untrustworthy if a set of light cones have too much volume in common, even if they 
-may all be entirely different in appearance.  LightCone objects have the ability to calculate the 
-volume in common between two solutions with the same dataset list.  The :meth:`find_unique_solutions` 
-and :meth:`project_unique_light_cones` functions can be used to create a set of light cone solutions 
-that have some maximum volume in common and create light cone projections for those solutions.  If 
-specified, the code will attempt to use recycled solutions that can use the same serialized projection 
-objects that have already been created.  This can greatly increase the speed of making multiple light 
-cone projections.  See :ref:`cookbook-unique_light_cones` for an example of doing this.
+When making a large number of light cones, particularly for statistical 
+analysis, it is important to have a handle on the amount of sampled volume in 
+common from one projection to another.  Any statistics may be untrustworthy if a 
+set of light cones have too much volume in common, even if they may all be 
+entirely different in appearance.  LightCone objects have the ability to 
+calculate the volume in common between two solutions with the same dataset 
+list.  The :meth:`find_unique_solutions` and 
+:meth:`project_unique_light_cones` functions can be used to create a set of 
+light cone solutions that have some maximum volume in common and create light 
+cone projections for those solutions.  If specified, the code will attempt to 
+use recycled solutions that can use the same serialized projection objects 
+that have already been created.  This can greatly increase the speed of making 
+multiple light cone projections.  See the cookbook for an example of doing this.
 
 Making Light Cones with a Halo Mask
 -----------------------------------
 
-The situation may arise where it is necessary or desirable to know the location of halos within the 
-light cone volume, and specifically their location in the final image.  This can be useful for 
-developing algorithms to find galaxies or clusters in image data.  The light cone generator does this 
-by running the HaloProfiler (see :ref:`halo_profiling`) on each of the datasets used in the light cone 
-and shifting them accordingly with the light cone solution.  The ability also exists to create a 
-boolean mask with the dimensions of the final light cone image that can be used to mask out the 
-halos in the image.  It is left as an exercise to the reader to find a use for this functionality.  
-This process is somewhat complicated, but not terribly.  See :ref:`cookbook-light_cone_halo_mask` for 
-an example of how to do this.
+The situation may arise where it is necessary or desirable to know the 
+location of halos within the light cone volume, and specifically their 
+location in the final image.  This can be useful for developing algorithms to 
+find galaxies or clusters in image data.  The light cone generator does this 
+by running the HaloProfiler (see :ref:`halo_profiling`) on each of the 
+datasets used in the light cone and shifting them accordingly with the light 
+cone solution.  The ability also exists to create a boolean mask with the 
+dimensions of the final light cone image that can be used to mask out the 
+halos in the image.  It is left as an exercise to the reader to find a use for 
+this functionality.  This process is somewhat complicated, but not terribly.  
+See the recipe in the cookbook for an example of this functionality.
\ No newline at end of file


diff -r 397e499177e0090e658634a9c48ef48b95dd5731 -r d7f484a89e7267f74107541eb6cbc3c435748075 source/analysis_modules/light_ray_generator.rst
--- a/source/analysis_modules/light_ray_generator.rst
+++ b/source/analysis_modules/light_ray_generator.rst
@@ -4,118 +4,129 @@
 ====================
 .. sectionauthor:: Britton Smith <brittonsmith at gmail.com>
 
-Light rays are similar to light cones (:ref:`light-cone-generator`) in the way they stack mulitple 
-datasets together to span a redshift interval.  Unlike light cones, which which stack radomly 
-oriented projections from each dataset to create synthetic images, light rays use infinitesimally 
-thin pencil beams to simulate QSO sight lines.
-
-.. note:: The light ray generator can be run in parallel, but should be done so without the --parallel flag.  This is because the light ray tool is not yet parallelized in a way that complies with yt's parallel mode.  The light ray tool works in parallel by giving each processor one dataset from the stack.  Therefore, it is useless to use more processors than datasets.
+Light rays are similar to light cones (:ref:`light-cone-generator`) in how  
+they stack multiple datasets together to span a redshift interval.  Unlike 
+light cones, which stack randomly oriented projections from each 
+dataset to create synthetic images, light rays use thin pencil beams to 
+simulate QSO sight lines.
 
 .. image:: _images/lightray.png
 
-A ray segment records the information of all grid cells intersected by the ray as well as the path 
-length, dl, of the ray through the cell.  Column densities can be calculated by multiplying 
-physical densities by the path length.
+A ray segment records the information of all grid cells intersected by the ray 
+as well as the path length, dl, of the ray through the cell.  Column densities 
+can be calculated by multiplying physical densities by the path length.
 
 Configuring the Light Ray Generator
 -----------------------------------
-
-An advanced recipe for creating a light ray can be found in :ref:`cookbook-make_light_ray`.  
-The only required arguments for instantiation are the parameter file used to run the simulation, 
-the initial redshift, and the final redshift.
+  
+The arguments required to instantiate a ``LightRay`` object are the same as 
+those required for a ``LightCone`` object: the simulation parameter file, the 
+simulation type, the nearest redshift, and the furthest redshift.
 
 .. code-block:: python
 
-  from yt.analysis_modules.light_ray.api import *
+  from yt.analysis_modules.api import LightRay
+  lr = LightRay("enzo_tiny_cosmology/32Mpc_32.enzo",
+                'Enzo', 0.0, 0.1)
 
-  lr = LightRay('my_simulation.par', 0.0, 0.1)
+Additional keyword arguments are:
 
-The light ray tool is a subclass of EnzoSimulation (see :ref:`analyzing-an-entire-simulation`).  
-As such, the additional keyword arguments are very simiar to the ones for 
-:meth:`EnzoSimulation.create_cosmology_splice`.  The complete list is given below:
+ * **use_minimum_datasets** (*bool*): If True, the minimum number of datasets 
+   is used to connect the initial and final redshift.  If false, the light 
+   ray solution will contain as many entries as possible within the redshift
+   interval.  Default: True.
 
- * **deltaz_min** (*float*): minimum delta z between consecutive datasets.  Default: 0.0.
+ * **deltaz_min** (*float*):  Specifies the minimum Delta-z between consecutive
+   datasets in the returned list.  Default: 0.0.
 
- * **use_minimum_datasets** (*bool*): if True, the minimum number of datasets is used to connect the initial and final redshift.  If false, the light ray solution will contain as many entries as possible within the redshift interval.  Default: True.
+ * **minimum_coherent_box_fraction** (*float*): Used with use_minimum_datasets 
+   set to False, this parameter specifies the fraction of the total box size 
+   to be traversed before rerandomizing the projection axis and center.  This
+   was invented to allow light rays with thin slices to sample coherent large 
+   scale structure, but in practice does not work so well.  Try setting this 
+   parameter to 1 and see what happens.  Default: 0.0.
 
- * **minimum_coherent_box_fraction** (*float*): used with use_minimum_datasets set to False, this parameter specifies the fraction of the total box size to be traversed before rerandomizing the projection axis and center.  This was invented to allow light cones with thin slices to sample coherent large scale structure, but in practice does not work so well.  It is not very clear what this will do to a light ray.  Default: 0.0.
+ * **time_data** (*bool*): Whether or not to include time outputs when gathering
+   datasets for time series.  Default: True.
+
+ * **redshift_data** (*bool*): Whether or not to include redshift outputs when 
+   gathering datasets for time series.  Default: True.
+
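 A sketch of an instantiation using some of these keywords (skipping the time
 outputs is an illustrative choice):

 .. code-block:: python

   # build the light ray from redshift outputs only
   lr = LightRay('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo', 0.0, 0.1,
                 use_minimum_datasets=True, deltaz_min=0.0,
                 time_data=False)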
 
 Making Light Ray Data
 ---------------------
 
-Once the LightRay object has been instantiated, the :meth:`make_light_ray` will trace out the 
-rays in each dataset and collect information for all the fields requested.  The output file 
-will be an hdf5 file containing all the cell field values for all the cells that were intersected 
-by the ray.  A single LightRay object can be used over and over to make multiple randomizations, 
-simply by changing the value of the random seed with the **seed** keyword.
+Once the LightRay object has been instantiated, the :meth:`make_light_ray` 
+method will trace out the rays in each dataset and collect information for all the 
+fields requested.  The output file will be an hdf5 file containing all the 
+cell field values for all the cells that were intersected by the ray.  A 
+single LightRay object can be used over and over to make multiple 
+randomizations, simply by changing the value of the random seed with the 
+**seed** keyword.
 
 .. code-block:: python
 
   lr.make_light_ray(seed=8675309,
-                    solution_filename='lightraysolution.txt',
-                    data_filename=data_filename,
                     fields=['Temperature', 'Density'],
                     get_los_velocity=True)
 
 The keyword arguments are:
 
- * **seed** (*int*): seed for the random number generator.  Default: None.
- * **fields** (*list*): a list of fields for which to get data.  Default: None.
- * **solution_filename** (*string*): path to a text file where the trajectories of each subray is written out.  Default: None.
- * **data_filename** (*string*): path to output file for ray data.  Default: None.
- * **get_nearest_galaxy** (*bool*): if True, the HaloProfiler will be used to calculate the distance and mass of the nearest halo for each point in the ray.  This option requires additional information to be included.  See below for an example.  Default: False.
- * **get_los_velocity** (*bool*): if True, the line of sight velocity is calculated for each point in the ray.  Default: False.
+ * **seed** (*int*): Seed for the random number generator.  Default: None.
+
+ * **fields** (*list*): A list of fields for which to get data.  Default: None.
+
+ * **solution_filename** (*string*): Path to a text file where the 
+   trajectories of each subray are written out.  Default: None.
+
+ * **data_filename** (*string*): Path to output file for ray data.  
+   Default: None.
+
+ * **get_los_velocity** (*bool*): If True, the line of sight velocity is 
+   calculated for each point in the ray.  Default: False.
+
+ * **get_nearest_halo** (*bool*): If True, the HaloProfiler will be used to 
+   calculate the distance and mass of the nearest halo for each point in the
+   ray.  This option requires additional information to be included.  See 
+   the cookbook for an example.  Default: False.
+
+ * **nearest_halo_fields** (*list*): A list of fields to be calculated for the 
+   halos nearest to every lixel in the ray.  Default: None.
+
+ * **halo_profiler_parameters** (*dict*): A dictionary of parameters to be 
+   passed to the HaloProfiler to create the appropriate data used to get 
+   properties for the nearest halos.  Default: None.
+
+ * **njobs** (*int*): The number of parallel jobs over which the slices for the
+   halo mask will be split.  Choose -1 for one processor per individual slice 
+   and 1 to have all processors work together on each projection.  Default: 1.
+
+ * **dynamic** (*bool*): If True, use dynamic load balancing to create the 
+   projections.  Default: False.
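
 For instance, to also write out the ray data and the subray trajectories, a
 sketch using the filenames from the keyword descriptions above:

 .. code-block:: python

   lr.make_light_ray(seed=8675309,
                     solution_filename='lightraysolution.txt',
                     data_filename='lightray.h5',
                     fields=['Temperature', 'Density'],
                     get_los_velocity=True)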
 
 Getting The Nearest Galaxies
 ----------------------------
 
-With the **get_los_velocity** keyword set to True for :meth:`make_light_ray`, the light ray 
-tool will use the HaloProfiler to calculate the distance and mass of the nearest halo to that 
-pixel.  In order to do this, three additional keyword arguments must be supplied to tell the 
-HaloProfiler what to do:
+With **get_nearest_halo** set to True, the light ray tool will use the 
+HaloProfiler to calculate the distance and mass of the halo nearest to each 
+pixel in the ray.  In order to do this, a dictionary 
+called halo_profiler_parameters is used to pass instructions to the 
+HaloProfiler.  This dictionary has three additional keywords:
 
- * **halo_profiler_kwargs** (*dict*): a dictionary of standard HaloProfiler keyword arguments and values to be given to the HaloProfiler.
- * **halo_profiler_actions** (*list*): a list of actions to be performed by the HaloProfiler.  Each item in the list should be a dictionary with the following entries: "function", "args", and "kwargs", for the function to be performed, the arguments supplied to that function, and the keyword arguments.
- * **halo_list** (*string*): 'all' to use the full halo list, or 'filtered' to use the filtered halo list created after calling make_profiles.
+ * **halo_profiler_kwargs** (*dict*): A dictionary of standard HaloProfiler 
+   keyword arguments and values to be given to the HaloProfiler.
 
-In the example below, we ask the HaloProfiler to perform two tasks.  The first is to add the 
-halo filter for virialized halos above 10 :superscript:`14` solar masses.  The second is to 
-make the radial profiles.  Finally, the **halo_list** keyword signifies that we want to use the 
-filtered halo list created after profiling.
+ * **halo_profiler_actions** (*list*): A list of actions to be performed by 
+   the HaloProfiler.  Each item in the list should be a dictionary with the 
+   following entries: "function", "args", and "kwargs", for the function to 
+   be performed, the arguments supplied to that function, and the keyword 
+   arguments.
 
-.. code-block:: python
+ * **halo_list** (*string*): 'all' to use the full halo list, or 'filtered' 
+   to use the filtered halo list created after calling make_profiles.
 
-  halo_profiler_kwargs = {'halo_list_format': {'id':0, 'center':[4, 5, 6]},
-                                               'TotalMassMsun':1},
-                          'halo_list_file': 'HopAnalysis.out'}
-
-  halo_profiler_actions = [{'function': add_halo_filter,
-                            'args': VirialFilter,
-                            'kwargs': {'overdensity_field': 'ActualOverdensity',
-                                       'virial_overdensity': 200,
-                                       'virial_filters': [['TotalMassMsun','>=','1e14']],
-                                       'virial_quantities': ['TotalMassMsun','RadiusMpc']}},
-                           {'function': make_profiles,
-                            'args': None,
-                            'kwargs': {'filename': 'VirializedHalos.out'}}]
-
-  halo_list = 'filtered'
-
-  halo_mass_field = 'TotalMassMsun_200'
-
-  lr.make_light_ray(seed=8675309,
-                    solution_filename='lightraysolution.txt',
-                    data_filename='lightray.h5',
-                    fields=['Temperature', 'Density'],
-                    get_nearest_galaxy=True, 
-                    halo_profiler_kwargs=halo_profiler_kwargs,
-                    halo_profiler_actions=halo_profiler_actions, 
-                    halo_list=halo_list,
-                    halo_mass_field=halo_mass_field,
-                    get_los_velocity=True)
-
+See the recipe in the cookbook for an example.
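
 For orientation, a minimal sketch of how these pieces might fit together (the
 filter setup and field list are illustrative, adapted from an earlier version
 of this example):

 .. code-block:: python

   # make_profiles comes from the halo profiler api star import
   halo_profiler_parameters = dict(
       halo_profiler_kwargs={'halo_list_file': 'HopAnalysis.out'},
       halo_profiler_actions=[{'function': make_profiles,
                               'args': None,
                               'kwargs': {'filename': 'VirializedHalos.out'}}],
       halo_list='filtered')

   lr.make_light_ray(seed=8675309,
                     fields=['Temperature', 'Density'],
                     get_nearest_halo=True,
                     halo_profiler_parameters=halo_profiler_parameters,
                     get_los_velocity=True)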
 
 What Can I do with this?
 ------------------------
 
-Try :ref:`absorption_spectrum`!
\ No newline at end of file
+Try :ref:`absorption_spectrum`.
\ No newline at end of file


diff -r 397e499177e0090e658634a9c48ef48b95dd5731 -r d7f484a89e7267f74107541eb6cbc3c435748075 source/analysis_modules/planning_cosmology_simulations.rst
--- /dev/null
+++ b/source/analysis_modules/planning_cosmology_simulations.rst
@@ -0,0 +1,28 @@
+.. _planning-cosmology-simulations:
+
+Planning Simulations to use LightCones or LightRays
+===================================================
+
+If you want to run a cosmological simulation that will have just enough data 
+outputs to create a cosmology splice, the :meth:`plan_cosmology_splice` 
+function will calculate a list of redshift outputs that will minimally 
+connect a redshift interval.
+
+.. code-block:: python
+
+  from yt.analysis_modules.api import CosmologySplice
+  my_splice = CosmologySplice('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo')
+  my_splice.plan_cosmology_splice(0.0, 0.1, filename='redshifts.out')
+
+This will write out a file, formatted for the simulation type, with a list of 
+redshift dumps.  The keyword arguments are:
+
+ * **decimals** (*int*): The decimal place to which the output redshift will 
+   be rounded.  If the decimal place in question is nonzero, the redshift will 
+   be rounded up to ensure continuity of the splice.  Default: 3.
+
+ * **filename** (*str*): If provided, a file will be written with the redshift 
+   outputs in the form in which they should be given in the enzo parameter 
+   file.  Default: None.
+
+ * **start_index** (*int*): The index of the first redshift output.  Default: 0.


diff -r 397e499177e0090e658634a9c48ef48b95dd5731 -r d7f484a89e7267f74107541eb6cbc3c435748075 source/analyzing/time_series_analysis.rst
--- a/source/analyzing/time_series_analysis.rst
+++ b/source/analyzing/time_series_analysis.rst
@@ -6,9 +6,8 @@
 Often, one wants to analyze a continuous set of outputs from a simulation in a
 uniform manner.  A simple example would be to calculate the peak density in a
 set of outputs that were written out.  The problem with time series analysis in
-yt is general an issue of verbosity and clunkiness. Typically, unless using the
-:class:`~yt.analysis_modules.simulation_handling.EnzoSimulation` class (which
-is only available as of right now for Enzo) one sets up a loop:
+yt is generally an issue of verbosity and clunkiness. Typically, one sets up a 
+loop:
 
 .. code-block:: python
 
@@ -124,3 +123,98 @@
 This allows you to create your own analysis tasks that will be then available
 to time series data objects.  Since ``TimeSeriesData`` objects iterate over
 filenames in parallel by default, this allows for transparent parallelization. 
+
+Analyzing an Entire Simulation
+------------------------------
+
+The parameter file used to run a simulation contains all the information 
+necessary to know what datasets should be available.  The ``simulation`` 
+convenience function allows one to create a ``TimeSeriesData`` object of all 
+or a subset of all data created by a single simulation.
+
+.. note:: Currently only implemented for Enzo.  Other simulation types coming 
+   soon.
+
+To instantiate, give the parameter file and the simulation type.
+
+.. code-block:: python
+
+  from yt.mods import *
+  my_sim = simulation('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo')
+
+Then, create a ``TimeSeriesData`` object with the :meth:`get_time_series` 
+function.  With no additional keywords, the time series will include every 
+dataset.
+
+.. code-block:: python
+
+  my_sim.get_time_series()
+
+After this, time series analysis can be done normally.
+
+.. code-block:: python
+
+  for pf in my_sim.piter():
+      all_data = pf.h.all_data()
+      print all_data.quantities['Extrema']('Density')
+ 
+Additional keywords can be given to :meth:`get_time_series` to select a subset
+of the total data:
+
+ * **time_data** (*bool*): Whether or not to include time outputs when 
+   gathering datasets for time series.  Default: True.
+
+ * **redshift_data** (*bool*): Whether or not to include redshift outputs 
+   when gathering datasets for time series.  Default: True.
+
+ * **initial_time** (*float*): The earliest time for outputs to be included.  
+   If None, the initial time of the simulation is used.  This can be used in 
+   combination with either final_time or final_redshift.  Default: None.
+
+ * **final_time** (*float*): The latest time for outputs to be included.  If 
+   None, the final time of the simulation is used.  This can be used in 
+   combination with either initial_time or initial_redshift.  Default: None.
+
+ * **times** (*list*): A list of times for which outputs will be found.
+   Default: None.
+
+ * **time_units** (*str*): The time units used for requesting outputs by time.
+   Default: '1' (code units).
+
+ * **initial_redshift** (*float*): The earliest redshift for outputs to be 
+   included.  If None, the initial redshift of the simulation is used.  This
+   can be used in combination with either final_time or final_redshift.
+   Default: None.
+
+ * **final_redshift** (*float*): The latest redshift for outputs to be included.  
+   If None, the final redshift of the simulation is used.  This can be used 
+   in combination with either initial_time or initial_redshift.  
+   Default: None.
+
+ * **redshifts** (*list*): A list of redshifts for which outputs will be found.
+   Default: None.
+
+ * **initial_cycle** (*float*): The earliest cycle for outputs to be 
+   included.  If None, the initial cycle of the simulation is used.  This can
+   only be used with final_cycle.  Default: None.
+
+ * **final_cycle** (*float*): The latest cycle for outputs to be included.  
+   If None, the final cycle of the simulation is used.  This can only be used 
+   in combination with initial_cycle.  Default: None.
+
+ * **tolerance** (*float*):  Used in combination with "times" or "redshifts" 
+   keywords, this is the tolerance within which outputs are accepted given 
+   the requested times or redshifts.  If None, the nearest output is always 
+   taken.  Default: None.
+
+ * **find_outputs** (*bool*): If True, subdirectories within the GlobalDir 
+   directory are searched one by one for datasets.  Time and redshift 
+   information are gathered by temporarily instantiating each dataset.  This 
+   can be used when simulation data was created in a non-standard way, making 
+   it difficult to guess the corresponding time and redshift information.
+   Default: False.
+
+ * **parallel** (*bool*/*int*): If True, the generated TimeSeriesData will 
+   divide the work such that a single processor works on each dataset.  If an
+   integer is supplied, the work will be divided into that number of jobs.
+   Default: True.
\ No newline at end of file
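
 For example, a sketch that restricts the series to redshift outputs within a
 redshift range (the values are illustrative):

 .. code-block:: python

   my_sim.get_time_series(initial_redshift=1.0, final_redshift=0.0,
                          time_data=False)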

Repository URL: https://bitbucket.org/yt_analysis/yt-doc/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.


