[yt-svn] commit/yt-doc: 30 new changesets

Bitbucket commits-noreply at bitbucket.org
Thu Jul 26 13:34:02 PDT 2012


30 new commits in yt-doc:


https://bitbucket.org/yt_analysis/yt-doc/changeset/f092fb08e113/
changeset:   f092fb08e113
user:        MatthewTurk
date:        2012-07-25 02:34:49
summary:     Adding a yt_cookbook directive to grab all the output from cookbook recipes.
affected #:  5 files

diff -r 05087e70403d73a4f71ee34278c1d4418d9f5da9 -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 extensions/yt_cookbook.py
--- /dev/null
+++ b/extensions/yt_cookbook.py
@@ -0,0 +1,32 @@
+# This extension is quite simple:
+#  1. It accepts a script name
+#  2. This script is added to the document in a literalinclude
+#  3. Any _static images found will be added
+
+from sphinx.util.compat import Directive
+from docutils.parsers.rst import directives
+import os, glob
+
+def setup(app):
+    app.add_directive('yt_cookbook', CookbookScript)
+
+class CookbookScript(Directive):
+    required_arguments = 1
+    optional_arguments = 0
+
+    def run(self):
+        script_fn = directives.path(self.arguments[0])
+        script_name = os.path.basename(self.arguments[0]).split(".")[0]
+        rst_file = self.state_machine.document.attributes['source']
+        rst_dir = os.path.abspath(os.path.dirname(rst_file))
+        im_path = os.path.join(rst_dir, "_static")
+        images = sorted(glob.glob(os.path.join(im_path, "%s__*.png" % script_name)))
+        lines = [".. literalinclude:: %s" % self.arguments[0], "\n", "\n"]
+        for im in images:
+            im_name = os.path.join("_static", os.path.basename(im))
+            lines.append(".. image:: %s" % im_name)
+            lines.append("   :width: 250")
+            lines.append("\n")
+        self.state_machine.insert_input(lines, rst_file)
+        print "\n".join(lines)
+        return []
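For context, the new directive effectively rewrites `.. yt_cookbook:: recipe.py` into a `literalinclude` followed by one `image` directive per matching `recipe__*.png` in `_static`. A minimal standalone sketch of that line-building logic follows (the script and image names are hypothetical, and blank lines stand in for the `"\n"` entries the directive appends):

```python
import os

def cookbook_lines(script_arg, image_paths):
    # Mimic the reST lines CookbookScript inserts: a literalinclude
    # for the recipe script, then a 250px-wide image directive for
    # each matching PNG found in _static.
    lines = [".. literalinclude:: %s" % script_arg, "", ""]
    for im in sorted(image_paths):
        im_name = os.path.join("_static", os.path.basename(im))
        lines.append(".. image:: %s" % im_name)
        lines.append("   :width: 250")
        lines.append("")
    return lines

# Hypothetical recipe with two output images.
lines = cookbook_lines("simple_slice.py",
                       ["_static/simple_slice__Slice_x_Density.png",
                        "_static/simple_slice__Slice_y_Density.png"])
print("\n".join(lines))
```

This mirrors why the directive returns an empty node list: it feeds the generated reST back through `insert_input` rather than building doctree nodes itself.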


diff -r 05087e70403d73a4f71ee34278c1d4418d9f5da9 -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 helper_scripts/run_recipes.sh
--- a/helper_scripts/run_recipes.sh
+++ b/helper_scripts/run_recipes.sh
@@ -10,7 +10,7 @@
     python2.7 ${ROOT}/${s} || exit
     for o in *.png *.txt
     do
-        mv -v ${o} ${ROOT}/source/cookbook/_static/${sb%%.py}_${o}
+        mv -v ${o} ${ROOT}/source/cookbook/_static/${sb%%.py}__${o}
     done
     touch ${sb}.done
 done
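The one-character change above (`_` to `__`) matters because the new directive globs for `<script>__*.png`: the double underscore unambiguously separates the recipe name from the output filename. A quick sketch of the shell parameter expansion involved (filenames are hypothetical):

```shell
# ${sb%%.py} strips the .py suffix from the script filename;
# the double underscore then joins it to the recipe's output name.
sb="simple_slice.py"
o="Slice_x_Density.png"
echo "${sb%%.py}__${o}"
# prints simple_slice__Slice_x_Density.png
```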


diff -r 05087e70403d73a4f71ee34278c1d4418d9f5da9 -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -27,7 +27,8 @@
 # coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
 extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx',
               'sphinx.ext.pngmath', 'sphinx.ext.viewcode',
-              'sphinx.ext.autosummary', 'numpydocmod', 'youtube']
+              'sphinx.ext.autosummary', 'numpydocmod', 'youtube',
+              'yt_cookbook']
 
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['_templates']


diff -r 05087e70403d73a4f71ee34278c1d4418d9f5da9 -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 source/cookbook/calculating_information.rst
--- a/source/cookbook/calculating_information.rst
+++ b/source/cookbook/calculating_information.rst
@@ -2,6 +2,9 @@
 -------------------------------
 
 .. literalinclude:: average_value.py
+
+
+
 .. literalinclude:: sum_mass_in_sphere.py
 .. literalinclude:: global_phase_plots.py
 


diff -r 05087e70403d73a4f71ee34278c1d4418d9f5da9 -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 source/cookbook/simple_plots.rst
--- a/source/cookbook/simple_plots.rst
+++ b/source/cookbook/simple_plots.rst
@@ -7,16 +7,7 @@
 This script shows the simplest way to make a slice from the scripting
 interface.
 
-.. literalinclude:: simple_slice.py
-
-.. image:: _static/simple_slice_sloshing_nomag2_hdf5_plt_cnt_0150_Slice_x_Density.png
-   :width: 250
-
-.. image:: _static/simple_slice_sloshing_nomag2_hdf5_plt_cnt_0150_Slice_y_Density.png
-   :width: 250
-
-.. image:: _static/simple_slice_sloshing_nomag2_hdf5_plt_cnt_0150_Slice_z_Density.png
-   :width: 250
+.. yt_cookbook:: simple_slice.py
 
 Simple Probability Distribution Functions
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -25,10 +16,7 @@
 another.  This shows how to see the distribution of mass in a simulation, with
 respect to the total mass in the simulation.
 
-.. literalinclude:: simple_pdf.py
-
-.. image:: _static/simple_pdf_fiducial_1to3_b0.273d_hdf5_plt_cnt_0175_Profile2D_0_Density_Temperature_CellMassMsun.png
-   :width: 250
+.. yt_cookbook:: simple_pdf.py
 
 Simple Phase Plots
 ~~~~~~~~~~~~~~~~~~
@@ -37,10 +25,7 @@
 two-dimensional histograms, where the value is either the weighted-average or
 the total accumulation in a cell.
 
-.. literalinclude:: simple_phase.py
-
-.. image:: _static/simple_phase_galaxy0030_Profile2D_0_Density_Temperature_CellMassMsun.png
-   :width: 250
+.. yt_cookbook:: simple_phase.py
 
 Simple 1D Histograms
 ~~~~~~~~~~~~~~~~~~~~
@@ -49,26 +34,14 @@
 the total accumulation (when weight is set to ``None``) or the average (when a
 weight is supplied.)
 
-.. literalinclude:: simple_profile.py
-
-.. image:: _static/simple_profile_galaxy0030_Profile1D_0_Density_Temperature.png
-   :width: 250
+.. yt_cookbook:: simple_profile.py
 
 Simple Projections
 ~~~~~~~~~~~~~~~~~~
 
 This is the simplest way to make a projection through a dataset.
 
-.. literalinclude:: simple_projection.py
-
-.. image:: _static/simple_projection_fiducial_1to3_b0.273d_hdf5_plt_cnt_0175_Projection_x_Density_Density.png
-   :width: 250
-
-.. image:: _static/simple_projection_fiducial_1to3_b0.273d_hdf5_plt_cnt_0175_Projection_y_Density_Density.png
-   :width: 250
-
-.. image:: _static/simple_projection_fiducial_1to3_b0.273d_hdf5_plt_cnt_0175_Projection_z_Density_Density.png
-   :width: 250
+.. yt_cookbook:: simple_projection.py
 
 Simple Radial Profiles
 ~~~~~~~~~~~~~~~~~~~~~~
@@ -76,10 +49,7 @@
 This shows how to make a profile of a quantity with respect to the radius, in
 this case the radius in Mpc.
 
-.. literalinclude:: simple_radial_profile.py
-
-.. image:: _static/simple_radial_profile_galaxy0030_Profile1D_0_RadiusMpc_Density.png
-   :width: 250
+.. yt_cookbook:: simple_radial_profile.py
 
 Simple Volume Rendering
 ~~~~~~~~~~~~~~~~~~~~~~~
@@ -87,10 +57,7 @@
 Here we see how to make a very simple volume rendering, where each option is
 considered in turn.
 
-.. literalinclude:: simple_volume_rendering.py
-
-.. image:: _static/simple_volume_rendering_data0043_volume_rendered.png
-   :width: 250
+.. yt_cookbook:: simple_volume_rendering.py
 
 Off-Axis Slicing
 ~~~~~~~~~~~~~~~~
@@ -98,5 +65,4 @@
 A cutting plane allows you to slice at some angle that isn't aligned with the
 axes.
 
-.. literalinclude:: aligned_cutting_plane.py
-
+.. yt_cookbook:: aligned_cutting_plane.py



https://bitbucket.org/yt_analysis/yt-doc/changeset/b49b8250b4d1/
changeset:   b49b8250b4d1
user:        brittonsmith
date:        2012-07-25 01:55:27
summary:     Removing obsolete recipes.
affected #:  2 files

diff -r 05087e70403d73a4f71ee34278c1d4418d9f5da9 -r b49b8250b4d12f4c70ee39ee2ef76330652056ea source/cookbook/make_light_cone.py
--- a/source/cookbook/make_light_cone.py
+++ /dev/null
@@ -1,26 +0,0 @@
-"""
-The following recipe will make a light cone projection (see :ref:`light-cone-generator`) 
-of a single quantity over the redshift interval 0 to 0.4.
-"""
-from yt.mods import *
-from yt.analysis_modules.light_cone.api import *
-
-# All of the light cone parameters are given as keyword arguments at instantiation.
-lc = LightCone("128Mpc256grid_SFFB.param", initial_redshift=0.4, 
-               final_redshift=0.0, observer_redshift=0.0,
-               field_of_view_in_arcminutes=450.0, 
-               image_resolution_in_arcseconds=60.0,
-               use_minimum_datasets=True, deltaz_min=0.0, 
-               minimum_coherent_box_fraction=0.0,
-               output_dir='LC', output_prefix='LightCone')
-
-# Calculate a light cone solution and write out a text file with the details 
-# of the solution.
-lc.calculate_light_cone_solution(seed=123456789, filename='lightcone.dat')
-
-# This will be the field to be projected.
-field = 'SZY'
-
-# Make the light cone projection, save individual images of each slice 
-# and of the projection as well as an hdf5 file with the full data cube.
-lc.project_light_cone(field ,save_stack=True, save_slice_images=True)


diff -r 05087e70403d73a4f71ee34278c1d4418d9f5da9 -r b49b8250b4d12f4c70ee39ee2ef76330652056ea source/cookbook/make_light_ray.py
--- a/source/cookbook/make_light_ray.py
+++ /dev/null
@@ -1,59 +0,0 @@
-"""
-This is a recipe to make a light ray through a simulation.
-"""
-import os
-import sys
-
-from yt.mods import *
-from yt.analysis_modules.halo_profiler.api import *
-from yt.analysis_modules.light_ray.api import *
-
-# Get the simulation parameter file from the command line.
-par_file = sys.argv[1]
-
-# Instantiate a ray object from z = 0 to z = 0.1 using the 
-# minimum number of datasets.
-lr = LightRay(par_file, 0.0, 0.1, use_minimum_datasets=True)
-
-# The next four variables are used when get_nearest_galaxy is set to True.
-# This option will calculate the distance and mass of the halo nearest to 
-# each element of the ray.
-# The light ray tool accomplishes this by using the HaloProfiler.
-# Here we are providing the LightRay with instructions to give the HaloProfiler.
-# This is a dictionary of standard halo profiler keyword arguments and values.
-halo_profiler_kwargs = {'halo_list_format': {'id':0, 'center':[4, 5, 6], 
-                                             'TotalMassMsun':1},
-                        'halo_list_file': 'HopAnalysis.out'}
-# This is a list of actions we want the HaloProfiler to perform.
-# Note that each list item is a dictionary with the following three 
-# entries: 'function', 'args', and 'kwargs'.
-# These are the function to be called, the arguments to that function, and 
-# any keyword arguments.
-halo_profiler_actions = [{'function': make_profiles,
-                          'args': None,
-                          'kwargs': {'filename': 'VirializedHalos.out'}},
-                         {'function': add_halo_filter,
-                          'args': VirialFilter,
-                          'kwargs': {'overdensity_field': 'ActualOverdensity',
-                                     'virial_overdensity': 200,
-                                     'virial_filters': [['TotalMassMsun','>=','1e14']],
-                                     'virial_quantities': ['TotalMassMsun','RadiusMpc']}}]
-# This option can only be 'all' or 'filtered' and tells the HaloProfiler to 
-# use either the full halo list or the filtered list made after calling make_profiles.
-halo_list = 'filtered'
-
-# This is the name of the field from the halo list that represents the halo mass.
-halo_mass_field = 'TotalMassMsun_200'
-
-# Make the ray and get the Density and Temperature fields, the nearest galaxy information, and 
-# the line of sight velocity.
-lr.make_light_ray(seed=8675309, 
-                  solution_filename='lightraysolution.txt',
-                  data_filename='lightray.h5',
-                  fields=['Temperature', 'Density'],
-                  get_nearest_galaxy=True, 
-                  halo_profiler_kwargs=halo_profiler_kwargs,
-                  halo_profiler_actions=halo_profiler_actions, 
-                  halo_list=halo_list,
-                  halo_mass_field=halo_mass_field,
-                  get_los_velocity=True)



https://bitbucket.org/yt_analysis/yt-doc/changeset/8d259a26f151/
changeset:   8d259a26f151
user:        brittonsmith
date:        2012-07-25 01:58:25
summary:     Adding new light cone recipes.
affected #:  3 files

diff -r b49b8250b4d12f4c70ee39ee2ef76330652056ea -r 8d259a26f1519a2225597c5b6daad7d7b78c18ae source/cookbook/light_cone_projection.py
--- /dev/null
+++ b/source/cookbook/light_cone_projection.py
@@ -0,0 +1,24 @@
+from yt.mods import *
+from yt.analysis_modules.api import LightCone
+
+# Create a LightCone object extending from z = 0 to z = 0.1
+# with a 600 arcminute field of view and a resolution of
+# 60 arcseconds.
+# We have already set up the redshift dumps to be
+# used for this, so we will not use any of the time
+# data dumps.
+cts = LightCone('32Mpc_32.enzo', 'Enzo', 0., 0.1,
+                observer_redshift=0.0,
+                field_of_view_in_arcminutes=600.0,
+                image_resolution_in_arcseconds=60.0,
+                time_data=False)
+
+# Calculate a randomization of the solution.
+cts.calculate_light_cone_solution(seed=123456789)
+
+# Make a light cone projection of the SZ Y parameter.
+# Set njobs to -1 to have one core work on each projection
+# in parallel.
+cts.project_light_cone('SZY',
+                       save_slice_images=True,
+                       njobs=-1)


diff -r b49b8250b4d12f4c70ee39ee2ef76330652056ea -r 8d259a26f1519a2225597c5b6daad7d7b78c18ae source/cookbook/light_cone_with_halo_mask.py
--- /dev/null
+++ b/source/cookbook/light_cone_with_halo_mask.py
@@ -0,0 +1,69 @@
+from yt.mods import *
+
+from yt.analysis_modules.api import LightCone
+from yt.analysis_modules.halo_profiler.api import *
+
+# Instantiate a light cone object as usual.
+lc = LightCone("32Mpc_32.enzo", 'Enzo', 0, 0.1,
+               observer_redshift=0.0,
+               field_of_view_in_arcminutes=600.0,
+               image_resolution_in_arcseconds=60.0,
+               time_data=False,
+               output_dir='LC_HM', output_prefix='LightCone')
+
+# Calculate the light cone solution.
+lc.calculate_light_cone_solution(seed=123456789, filename='lightcone.dat')
+
+
+# Configure the HaloProfiler.
+# These are keyword arguments given when creating a
+# HaloProfiler object.
+halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out'}
+
+# Create a list of actions for the HaloProfiler to take.
+halo_profiler_actions = []
+
+# Each item in the list is a dictionary containing three things:
+# 1. 'function' - the function to be called.
+# 2. 'args' - a list of arguments given with the function.
+# 3. 'kwargs' - a dictionary of keyword arguments.
+
+# Add a virial filter.
+halo_profiler_actions.append({'function': HaloProfiler.add_halo_filter,
+                              'args': [VirialFilter],
+                              'kwargs': {'must_be_virialized':False,
+                                         'overdensity_field':'ActualOverdensity',
+                                         'virial_overdensity':100,
+                                         'virial_filters':[['TotalMassMsun','>','1e5']],
+                                         'virial_quantities':['TotalMassMsun','RadiusMpc']}})
+
+# Add a call to make the profiles.
+halo_profiler_actions.append({'function': HaloProfiler.make_profiles,
+                              'kwargs': {'filename': "VirializedHalos.out"}})
+
+# Specify the desired halo list is the filtered list.
+# If 'all' is given instead, the full list will be used.
+halo_list = 'filtered'
+
+# Put them all into one dictionary.
+halo_profiler_parameters=dict(halo_profiler_kwargs=halo_profiler_kwargs,
+                              halo_profiler_actions=halo_profiler_actions,
+                              halo_list=halo_list)
+
+# Get the halo list for the active solution of this light cone using
+# the HaloProfiler settings set up above.
+# Write the boolean map to an hdf5 file called 'halo_mask.h5'.
+# Write a text file detailing the location, redshift, radius, and mass
+# of each halo in light cone projection.
+lc.get_halo_mask(mask_file='halo_mask.h5', map_file='halo_map.out',
+                 cube_file='halo_cube.h5',
+                 virial_overdensity=100,
+                 halo_profiler_parameters=halo_profiler_parameters,
+                 njobs=1, dynamic=False)
+
+# Choose the field to be projected.
+field = 'SZY'
+
+# Make the light cone projection and apply the halo mask.
+pc = lc.project_light_cone(field, save_stack=True, save_slice_images=True,
+                           apply_halo_mask=True)


diff -r b49b8250b4d12f4c70ee39ee2ef76330652056ea -r 8d259a26f1519a2225597c5b6daad7d7b78c18ae source/cookbook/unique_light_cone_projections.py
--- /dev/null
+++ b/source/cookbook/unique_light_cone_projections.py
@@ -0,0 +1,28 @@
+from yt.mods import *
+from yt.analysis_modules.cosmological_observation.light_cone.api import *
+
+# Instantiate a light cone.
+lc = LightCone("32Mpc_32.enzo", 'Enzo', 0, 0.1,
+               observer_redshift=0.0,
+               field_of_view_in_arcminutes=120.0,
+               image_resolution_in_arcseconds=60.0,
+               use_minimum_datasets=True,
+               time_data=False,
+               output_dir='LC_U', output_prefix='LightCone')
+
+# Try to find 10 solutions that have at most 10% volume in
+# common and give up after 50 consecutive failed attempts.
+# The recycle=True setting tells the code to first attempt
+# to use solutions with the same projection axes as other
+# solutions.  This will save time when making the projection.
+find_unique_solutions(lc, max_overlap=0.10, failures=50,
+                      seed=123456789, recycle=True,
+                      solutions=10, filename='unique.dat')
+
+field = 'SZY'
+
+# Make light cone projections with each of the random seeds
+# found above.  All output files will be written with unique
+# names based on the random seed numbers.
+project_unique_light_cones(lc, 'unique.dat', field,
+                           save_slice_images=True)



https://bitbucket.org/yt_analysis/yt-doc/changeset/832ec021add2/
changeset:   832ec021add2
user:        brittonsmith
date:        2012-07-25 02:06:49
summary:     Adding LightRay recipe.
affected #:  1 file

diff -r 8d259a26f1519a2225597c5b6daad7d7b78c18ae -r 832ec021add2bdefdbe1f2898a943031963b8151 source/cookbook/make_light_ray.py
--- /dev/null
+++ b/source/cookbook/make_light_ray.py
@@ -0,0 +1,62 @@
+import os
+import sys
+
+from yt.mods import *
+
+from yt.analysis_modules.halo_profiler.api import *
+from yt.analysis_modules.cosmological_observation.light_ray.api import \
+     LightRay
+
+# Create a LightRay object extending from z = 0 to z = 0.1
+# and use only the redshift dumps.
+lr = LightRay("32Mpc_32.enzo", 'Enzo', 0.0, 0.1,
+              use_minimum_datasets=True,
+              time_data=False)
+
+# Configure the HaloProfiler.
+# These are keyword arguments given when creating a
+# HaloProfiler object.
+halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out'}
+
+# Create a list of actions for the HaloProfiler to take.
+halo_profiler_actions = []
+
+# Each item in the list is a dictionary containing three things:
+# 1. 'function' - the function to be called.
+# 2. 'args' - a list of arguments given with the function.
+# 3. 'kwargs' - a dictionary of keyword arguments.
+
+# Add a virial filter.
+halo_profiler_actions.append({'function': HaloProfiler.add_halo_filter,
+                              'args': [VirialFilter],
+                              'kwargs': {'must_be_virialized':False,
+                                         'overdensity_field':'ActualOverdensity',
+                                         'virial_overdensity':100,
+                                         'virial_filters':[['TotalMassMsun','>','1e5']],
+                                         'virial_quantities':['TotalMassMsun','RadiusMpc']}})
+
+# Add a call to make the profiles.
+halo_profiler_actions.append({'function': HaloProfiler.make_profiles,
+                              'kwargs': {'filename': "VirializedHalos.out"}})
+
+# Specify the desired halo list is the filtered list.
+# If 'all' is given instead, the full list will be used.
+halo_list = 'filtered'
+
+# Put them all into one dictionary.
+halo_profiler_parameters=dict(halo_profiler_kwargs=halo_profiler_kwargs,
+                              halo_profiler_actions=halo_profiler_actions,
+                              halo_list=halo_list)
+
+# Make a light ray, and set njobs to -1 to use one core
+# per dataset.
+lr.make_light_ray(seed=123456789,
+                  solution_filename='lightraysolution.txt',
+                  data_filename='lightray.h5',
+                  fields=['Temperature', 'Density'],
+                  get_nearest_halo=True,
+                  nearest_halo_fields=['TotalMassMsun_100',
+                                       'RadiusMpc_100'],
+                  halo_profiler_parameters=halo_profiler_parameters,
+                  get_los_velocity=True,
+                  njobs=-1)



https://bitbucket.org/yt_analysis/yt-doc/changeset/5ab067ed9083/
changeset:   5ab067ed9083
user:        MatthewTurk
date:        2012-07-25 02:35:31
summary:     Merging from Britton
affected #:  5 files

diff -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 -r 5ab067ed9083387d5e7883b27824c9a80e4e8ca2 source/cookbook/light_cone_projection.py
--- /dev/null
+++ b/source/cookbook/light_cone_projection.py
@@ -0,0 +1,24 @@
+from yt.mods import *
+from yt.analysis_modules.api import LightCone
+
+# Create a LightCone object extending from z = 0 to z = 0.1
+# with a 600 arcminute field of view and a resolution of
+# 60 arcseconds.
+# We have already set up the redshift dumps to be
+# used for this, so we will not use any of the time
+# data dumps.
+cts = LightCone('32Mpc_32.enzo', 'Enzo', 0., 0.1,
+                observer_redshift=0.0,
+                field_of_view_in_arcminutes=600.0,
+                image_resolution_in_arcseconds=60.0,
+                time_data=False)
+
+# Calculate a randomization of the solution.
+cts.calculate_light_cone_solution(seed=123456789)
+
+# Make a light cone projection of the SZ Y parameter.
+# Set njobs to -1 to have one core work on each projection
+# in parallel.
+cts.project_light_cone('SZY',
+                       save_slice_images=True,
+                       njobs=-1)


diff -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 -r 5ab067ed9083387d5e7883b27824c9a80e4e8ca2 source/cookbook/light_cone_with_halo_mask.py
--- /dev/null
+++ b/source/cookbook/light_cone_with_halo_mask.py
@@ -0,0 +1,69 @@
+from yt.mods import *
+
+from yt.analysis_modules.api import LightCone
+from yt.analysis_modules.halo_profiler.api import *
+
+# Instantiate a light cone object as usual.
+lc = LightCone("32Mpc_32.enzo", 'Enzo', 0, 0.1,
+               observer_redshift=0.0,
+               field_of_view_in_arcminutes=600.0,
+               image_resolution_in_arcseconds=60.0,
+               time_data=False,
+               output_dir='LC_HM', output_prefix='LightCone')
+
+# Calculate the light cone solution.
+lc.calculate_light_cone_solution(seed=123456789, filename='lightcone.dat')
+
+
+# Configure the HaloProfiler.
+# These are keyword arguments given when creating a
+# HaloProfiler object.
+halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out'}
+
+# Create a list of actions for the HaloProfiler to take.
+halo_profiler_actions = []
+
+# Each item in the list is a dictionary containing three things:
+# 1. 'function' - the function to be called.
+# 2. 'args' - a list of arguments given with the function.
+# 3. 'kwargs' - a dictionary of keyword arguments.
+
+# Add a virial filter.
+halo_profiler_actions.append({'function': HaloProfiler.add_halo_filter,
+                              'args': [VirialFilter],
+                              'kwargs': {'must_be_virialized':False,
+                                         'overdensity_field':'ActualOverdensity',
+                                         'virial_overdensity':100,
+                                         'virial_filters':[['TotalMassMsun','>','1e5']],
+                                         'virial_quantities':['TotalMassMsun','RadiusMpc']}})
+
+# Add a call to make the profiles.
+halo_profiler_actions.append({'function': HaloProfiler.make_profiles,
+                              'kwargs': {'filename': "VirializedHalos.out"}})
+
+# Specify the desired halo list is the filtered list.
+# If 'all' is given instead, the full list will be used.
+halo_list = 'filtered'
+
+# Put them all into one dictionary.
+halo_profiler_parameters=dict(halo_profiler_kwargs=halo_profiler_kwargs,
+                              halo_profiler_actions=halo_profiler_actions,
+                              halo_list=halo_list)
+
+# Get the halo list for the active solution of this light cone using
+# the HaloProfiler settings set up above.
+# Write the boolean map to an hdf5 file called 'halo_mask.h5'.
+# Write a text file detailing the location, redshift, radius, and mass
+# of each halo in light cone projection.
+lc.get_halo_mask(mask_file='halo_mask.h5', map_file='halo_map.out',
+                 cube_file='halo_cube.h5',
+                 virial_overdensity=100,
+                 halo_profiler_parameters=halo_profiler_parameters,
+                 njobs=1, dynamic=False)
+
+# Choose the field to be projected.
+field = 'SZY'
+
+# Make the light cone projection and apply the halo mask.
+pc = lc.project_light_cone(field, save_stack=True, save_slice_images=True,
+                           apply_halo_mask=True)


diff -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 -r 5ab067ed9083387d5e7883b27824c9a80e4e8ca2 source/cookbook/make_light_cone.py
--- a/source/cookbook/make_light_cone.py
+++ /dev/null
@@ -1,26 +0,0 @@
-"""
-The following recipe will make a light cone projection (see :ref:`light-cone-generator`) 
-of a single quantity over the redshift interval 0 to 0.4.
-"""
-from yt.mods import *
-from yt.analysis_modules.light_cone.api import *
-
-# All of the light cone parameters are given as keyword arguments at instantiation.
-lc = LightCone("128Mpc256grid_SFFB.param", initial_redshift=0.4, 
-               final_redshift=0.0, observer_redshift=0.0,
-               field_of_view_in_arcminutes=450.0, 
-               image_resolution_in_arcseconds=60.0,
-               use_minimum_datasets=True, deltaz_min=0.0, 
-               minimum_coherent_box_fraction=0.0,
-               output_dir='LC', output_prefix='LightCone')
-
-# Calculate a light cone solution and write out a text file with the details 
-# of the solution.
-lc.calculate_light_cone_solution(seed=123456789, filename='lightcone.dat')
-
-# This will be the field to be projected.
-field = 'SZY'
-
-# Make the light cone projection, save individual images of each slice 
-# and of the projection as well as an hdf5 file with the full data cube.
-lc.project_light_cone(field ,save_stack=True, save_slice_images=True)


diff -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 -r 5ab067ed9083387d5e7883b27824c9a80e4e8ca2 source/cookbook/make_light_ray.py
--- a/source/cookbook/make_light_ray.py
+++ b/source/cookbook/make_light_ray.py
@@ -1,59 +1,62 @@
-"""
-This is a recipe to make a light ray through a simulation.
-"""
 import os
 import sys
 
 from yt.mods import *
+
 from yt.analysis_modules.halo_profiler.api import *
-from yt.analysis_modules.light_ray.api import *
+from yt.analysis_modules.cosmological_observation.light_ray.api import \
+     LightRay
 
-# Get the simulation parameter file from the command line.
-par_file = sys.argv[1]
+# Create a LightRay object extending from z = 0 to z = 0.1
+# and use only the redshift dumps.
+lr = LightRay("32Mpc_32.enzo", 'Enzo', 0.0, 0.1,
+              use_minimum_datasets=True,
+              time_data=False)
 
-# Instantiate a ray object from z = 0 to z = 0.1 using the 
-# minimum number of datasets.
-lr = LightRay(par_file, 0.0, 0.1, use_minimum_datasets=True)
+# Configure the HaloProfiler.
+# These are keyword arguments given when creating a
+# HaloProfiler object.
+halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out'}
 
-# The next four variables are used when get_nearest_galaxy is set to True.
-# This option will calculate the distance and mass of the halo nearest to 
-# each element of the ray.
-# The light ray tool accomplishes this by using the HaloProfiler.
-# Here we are providing the LightRay with instructions to give the HaloProfiler.
-# This is a dictionary of standard halo profiler keyword arguments and values.
-halo_profiler_kwargs = {'halo_list_format': {'id':0, 'center':[4, 5, 6], 
-                                             'TotalMassMsun':1},
-                        'halo_list_file': 'HopAnalysis.out'}
-# This is a list of actions we want the HaloProfiler to perform.
-# Note that each list item is a dictionary with the following three 
-# entries: 'function', 'args', and 'kwargs'.
-# These are the function to be called, the arguments to that function, and 
-# any keyword arguments.
-halo_profiler_actions = [{'function': make_profiles,
-                          'args': None,
-                          'kwargs': {'filename': 'VirializedHalos.out'}},
-                         {'function': add_halo_filter,
-                          'args': VirialFilter,
-                          'kwargs': {'overdensity_field': 'ActualOverdensity',
-                                     'virial_overdensity': 200,
-                                     'virial_filters': [['TotalMassMsun','>=','1e14']],
-                                     'virial_quantities': ['TotalMassMsun','RadiusMpc']}}]
-# This option can only be 'all' or 'filtered' and tells the HaloProfiler to 
-# use either the full halo list or the filtered list made after calling make_profiles.
+# Create a list of actions for the HaloProfiler to take.
+halo_profiler_actions = []
+
+# Each item in the list is a dictionary containing three things:
+# 1. 'function' - the function to be called.
+# 2. 'args' - a list of arguments given with the function.
+# 3. 'kwargs' - a dictionary of keyword arguments.
+
+# Add a virial filter.
+halo_profiler_actions.append({'function': HaloProfiler.add_halo_filter,
+                              'args': [VirialFilter],
+                              'kwargs': {'must_be_virialized':False,
+                                         'overdensity_field':'ActualOverdensity',
+                                         'virial_overdensity':100,
+                                         'virial_filters':[['TotalMassMsun','>','1e5']],
+                                         'virial_quantities':['TotalMassMsun','RadiusMpc']}})
+
+# Add a call to make the profiles.
+halo_profiler_actions.append({'function': HaloProfiler.make_profiles,
+                              'kwargs': {'filename': "VirializedHalos.out"}})
+
+# Specify the desired halo list is the filtered list.
+# If 'all' is given instead, the full list will be used.
 halo_list = 'filtered'
 
-# This is the name of the field from the halo list that represents the halo mass.
-halo_mass_field = 'TotalMassMsun_200'
+# Put them all into one dictionary.
+halo_profiler_parameters=dict(halo_profiler_kwargs=halo_profiler_kwargs,
+                              halo_profiler_actions=halo_profiler_actions,
+                              halo_list=halo_list)
 
-# Make the ray and get the Density and Temperature fields, the nearest galaxy information, and 
-# the line of sight velocity.
-lr.make_light_ray(seed=8675309, 
+# Make a light ray, and set njobs to -1 to use one core
+# per dataset.
+lr.make_light_ray(seed=123456789,
                   solution_filename='lightraysolution.txt',
                   data_filename='lightray.h5',
                   fields=['Temperature', 'Density'],
-                  get_nearest_galaxy=True, 
-                  halo_profiler_kwargs=halo_profiler_kwargs,
-                  halo_profiler_actions=halo_profiler_actions, 
-                  halo_list=halo_list,
-                  halo_mass_field=halo_mass_field,
-                  get_los_velocity=True)
+                  get_nearest_halo=True,
+                  nearest_halo_fields=['TotalMassMsun_100',
+                                       'RadiusMpc_100'],
+                  halo_profiler_parameters=halo_profiler_parameters,
+                  get_los_velocity=True,
+                  njobs=-1)


diff -r f092fb08e113674a6c00dbc51e64cd7ae15238f5 -r 5ab067ed9083387d5e7883b27824c9a80e4e8ca2 source/cookbook/unique_light_cone_projections.py
--- /dev/null
+++ b/source/cookbook/unique_light_cone_projections.py
@@ -0,0 +1,28 @@
+from yt.mods import *
+from yt.analysis_modules.cosmological_observation.light_cone.api import *
+
+# Instantiate a light cone.
+lc = LightCone("32Mpc_32.enzo", 'Enzo', 0, 0.1,
+               observer_redshift=0.0,
+               field_of_view_in_arcminutes=120.0,
+               image_resolution_in_arcseconds=60.0,
+               use_minimum_datasets=True,
+               time_data=False,
+               output_dir='LC_U', output_prefix='LightCone')
+
+# Try to find 10 solutions that have at most 10% volume in
+# common and give up after 50 consecutive failed attempts.
+# The recycle=True setting tells the code to first attempt
+# to use solutions with the same projection axes as other
+# solutions.  This will save time when making the projection.
+find_unique_solutions(lc, max_overlap=0.10, failures=50,
+                      seed=123456789, recycle=True,
+                      solutions=10, filename='unique.dat')
+
+field = 'SZY'
+
+# Make light cone projections with each of the random seeds
+# found above.  All output files will be written with unique
+# names based on the random seed numbers.
+project_unique_light_cones(lc, 'unique.dat', field,
+                           save_slice_images=True)
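The search above is rejection sampling: a new solution is accepted only if it overlaps every already-accepted solution by at most `max_overlap`, and the search gives up after `failures` consecutive rejections. A minimal sketch under those assumptions, with `overlap` as a stand-in for the real volume-overlap computation:

```python
import random

def find_unique(n_solutions, max_overlap, max_failures, overlap, seed=None):
    """Keep drawing random candidates; accept one only if its overlap
    with every accepted solution is at most max_overlap, and stop after
    max_failures consecutive rejections.  Not yt's implementation."""
    rng = random.Random(seed)
    accepted = []
    consecutive_failures = 0
    while len(accepted) < n_solutions and consecutive_failures < max_failures:
        candidate = rng.randrange(10**9)
        if all(overlap(candidate, s) <= max_overlap for s in accepted):
            accepted.append(candidate)
            consecutive_failures = 0
        else:
            consecutive_failures += 1
    return accepted
```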



https://bitbucket.org/yt_analysis/yt-doc/changeset/72164354abe6/
changeset:   72164354abe6
user:        MatthewTurk
date:        2012-07-25 02:43:41
summary:     A few changes so that these can run from within the top-level data directory.
affected #:  3 files

diff -r 5ab067ed9083387d5e7883b27824c9a80e4e8ca2 -r 72164354abe62228be1f5fb10af888e57b851b10 source/cookbook/light_cone_projection.py
--- a/source/cookbook/light_cone_projection.py
+++ b/source/cookbook/light_cone_projection.py
@@ -7,7 +7,8 @@
 # We have already set up the redshift dumps to be
 # used for this, so we will not use any of the time
 # data dumps.
-cts = LightCone('32Mpc_32.enzo', 'Enzo', 0., 0.1,
+cts = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo',
+                'Enzo', 0., 0.1,
                 observer_redshift=0.0,
                 field_of_view_in_arcminutes=600.0,
                 image_resolution_in_arcseconds=60.0,


diff -r 5ab067ed9083387d5e7883b27824c9a80e4e8ca2 -r 72164354abe62228be1f5fb10af888e57b851b10 source/cookbook/light_cone_with_halo_mask.py
--- a/source/cookbook/light_cone_with_halo_mask.py
+++ b/source/cookbook/light_cone_with_halo_mask.py
@@ -4,7 +4,8 @@
 from yt.analysis_modules.halo_profiler.api import *
 
 # Instantiate a light cone object as usual.
-lc = LightCone("32Mpc_32.enzo", 'Enzo', 0, 0.1,
+lc = LightCone("enzo_tiny_cosmology/32Mpc_32.enzo",
+               'Enzo', 0, 0.1,
                observer_redshift=0.0,
                field_of_view_in_arcminutes=600.0,
                image_resolution_in_arcseconds=60.0,


diff -r 5ab067ed9083387d5e7883b27824c9a80e4e8ca2 -r 72164354abe62228be1f5fb10af888e57b851b10 source/cookbook/unique_light_cone_projections.py
--- a/source/cookbook/unique_light_cone_projections.py
+++ b/source/cookbook/unique_light_cone_projections.py
@@ -2,7 +2,7 @@
 from yt.analysis_modules.cosmological_observation.light_cone.api import *
 
 # Instantiate a light cone.
-lc = LightCone("32Mpc_32.enzo", 'Enzo', 0, 0.1,
+lc = LightCone("enzo_tiny_cosmology/32Mpc_32.enzo", 'Enzo', 0, 0.1,
                observer_redshift=0.0,
                field_of_view_in_arcminutes=120.0,
                image_resolution_in_arcseconds=60.0,



https://bitbucket.org/yt_analysis/yt-doc/changeset/2b16faade0e4/
changeset:   2b16faade0e4
user:        MatthewTurk
date:        2012-07-25 02:50:09
summary:     Adding a script to check for missing recipes and adding some supplemental data
to be found and moved from _temp.
affected #:  5 files

diff -r 72164354abe62228be1f5fb10af888e57b851b10 -r 2b16faade0e4ca124f37d3bdede859a3b82aef8d helper_scripts/run_recipes.sh
--- a/helper_scripts/run_recipes.sh
+++ b/helper_scripts/run_recipes.sh
@@ -8,7 +8,7 @@
     [ -e ${sb}.done ] && continue
     echo ${sb}
     python2.7 ${ROOT}/${s} || exit
-    for o in *.png *.txt
+    for o in *.png *.txt *.h5 *.dat
     do
         mv -v ${o} ${ROOT}/source/cookbook/_static/${sb%%.py}__${o}
     done
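The `${sb%%.py}__${o}` renaming above produces exactly the `scriptname__output` pattern that the yt_cookbook directive later globs for (`%s__*.png`). The same convention, sketched in Python (`prefixed_name` is a hypothetical helper, not part of the build scripts):

```python
import os

def prefixed_name(script, output):
    """Prefix an output file with the recipe's basename (extension
    stripped) plus '__', matching run_recipes.sh's mv step."""
    base = os.path.basename(script)
    if base.endswith(".py"):
        base = base[:-len(".py")]
    return "%s__%s" % (base, output)
```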


diff -r 72164354abe62228be1f5fb10af888e57b851b10 -r 2b16faade0e4ca124f37d3bdede859a3b82aef8d source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -241,4 +241,4 @@
                        'http://matplotlib.sourceforge.net/': None,
                        }
 
-autosummary_generate = glob.glob("api/api.rst")
+#autosummary_generate = glob.glob("api/api.rst")


diff -r 72164354abe62228be1f5fb10af888e57b851b10 -r 2b16faade0e4ca124f37d3bdede859a3b82aef8d source/cookbook/count.sh
--- /dev/null
+++ b/source/cookbook/count.sh
@@ -0,0 +1,8 @@
+for fn in *.py
+do
+    COUNT=`cat *.rst | grep --count ${fn}`
+    if [ $COUNT -lt 1 ]
+    then
+        echo ${fn} missing!
+    fi
+done
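The count.sh logic above, flagging any recipe that no .rst file mentions, can be sketched equivalently in Python (illustrative only; like the grep in the shell version, this is a plain substring check):

```python
def missing_recipes(recipe_names, rst_texts):
    """Return recipe filenames not mentioned in any .rst source."""
    combined = "\n".join(rst_texts)
    return [fn for fn in recipe_names if fn not in combined]
```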


diff -r 72164354abe62228be1f5fb10af888e57b851b10 -r 2b16faade0e4ca124f37d3bdede859a3b82aef8d source/cookbook/make_light_ray.py
--- a/source/cookbook/make_light_ray.py
+++ b/source/cookbook/make_light_ray.py
@@ -9,7 +9,8 @@
 
 # Create a LightRay object extending from z = 0 to z = 0.1
 # and use only the redshift dumps.
-lr = LightRay("32Mpc_32.enzo", 'Enzo', 0.0, 0.1,
+lr = LightRay("enzo_tiny_cosmology/32Mpc_32.enzo",
+              'Enzo', 0.0, 0.1,
               use_minimum_datasets=True,
               time_data=False)
 


diff -r 72164354abe62228be1f5fb10af888e57b851b10 -r 2b16faade0e4ca124f37d3bdede859a3b82aef8d source/cookbook/simple_plots.rst
--- a/source/cookbook/simple_plots.rst
+++ b/source/cookbook/simple_plots.rst
@@ -51,6 +51,35 @@
 
 .. yt_cookbook:: simple_radial_profile.py
 
+Making Plots of Multiple Fields Simultaneously
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+By adding multiple fields to a single
+:class:`~yt.visualization.plot_window.SlicePlot` or
+:class:`~yt.visualization.plot_window.ProjectionPlot`, the overhead of
+creating the data object can be shared between fields, improving performance.
+This recipe shows how to add multiple fields to a single plot.
+
+.. yt_cookbook:: simple_slice_with_multiple_fields.py 
+
+Accessing and Modifying Plots Directly
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+While the Plot Window and its affiliated :ref:`plot-modifications` cover
+most normal use cases, sometimes more direct access to the underlying
+Matplotlib engine is necessary.  This recipe shows how to modify the plot
+window's :class:`matplotlib.axes.Axes` object directly.
+
+.. yt_cookbook:: simple_slice_matplotlib_example.py 
+
+Off-Axis Slicing
+~~~~~~~~~~~~~~~~
+
+A cutting plane allows you to slice at some angle that isn't aligned with the
+axes.
+
+.. yt_cookbook:: aligned_cutting_plane.py
+
 Simple Volume Rendering
 ~~~~~~~~~~~~~~~~~~~~~~~
 
@@ -59,10 +88,3 @@
 
 .. yt_cookbook:: simple_volume_rendering.py
 
-Off-Axis Slicing
-~~~~~~~~~~~~~~~~
-
-A cutting plane allows you to slice at some angle that isn't aligned with the
-axes.
-
-.. yt_cookbook:: aligned_cutting_plane.py



https://bitbucket.org/yt_analysis/yt-doc/changeset/1fe9ff38aa51/
changeset:   1fe9ff38aa51
user:        MatthewTurk
date:        2012-07-25 04:12:48
summary:     Refining the yt_cookbook directive to include source and links to the full-size
images, thanks to help from the Matplotlib plot_directive.
affected #:  2 files

diff -r 2b16faade0e4ca124f37d3bdede859a3b82aef8d -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f extensions/yt_cookbook.py
--- a/extensions/yt_cookbook.py
+++ b/extensions/yt_cookbook.py
@@ -5,28 +5,51 @@
 
 from sphinx.util.compat import Directive
 from docutils.parsers.rst import directives
-import os, glob
+import os, glob, shutil
+
+# Some of this magic comes from the matplotlib plot_directive.
 
 def setup(app):
     app.add_directive('yt_cookbook', CookbookScript)
+    setup.app = app
+    setup.config = app.config
+    setup.confdir = app.confdir
 
 class CookbookScript(Directive):
     required_arguments = 1
     optional_arguments = 0
 
     def run(self):
-        script_fn = directives.path(self.arguments[0])
-        script_name = os.path.basename(self.arguments[0]).split(".")[0]
         rst_file = self.state_machine.document.attributes['source']
         rst_dir = os.path.abspath(os.path.dirname(rst_file))
+        script_fn = directives.path(self.arguments[0])
+        script_bn = os.path.basename(script_fn)
+        script_name = os.path.basename(self.arguments[0]).split(".")[0]
+
+        # This magic is from matplotlib
+        dest_dir = os.path.abspath(os.path.join(setup.app.builder.outdir,
+                                                os.path.dirname(script_fn)))
+        if not os.path.exists(dest_dir):
+            os.makedirs(dest_dir) # create the output directory if needed
+
+        rel_dir = os.path.relpath(rst_dir, setup.confdir)
+        shutil.copyfile(os.path.join(rst_dir, script_fn),
+                        os.path.join(dest_dir, rel_dir, script_bn))
+
         im_path = os.path.join(rst_dir, "_static")
         images = sorted(glob.glob(os.path.join(im_path, "%s__*.png" % script_name)))
-        lines = [".. literalinclude:: %s" % self.arguments[0], "\n", "\n"]
+        lines = []
+        lines.append("(`%s <%s>`__)" % (script_bn, script_fn))
+        lines.append("\n")
+        lines.append("\n")
+        lines.append(".. literalinclude:: %s" % self.arguments[0])
+        lines.append("\n")
+        lines.append("\n")
         for im in images:
             im_name = os.path.join("_static", os.path.basename(im))
             lines.append(".. image:: %s" % im_name)
-            lines.append("   :width: 250")
+            lines.append("   :width: 400")
+            lines.append("   :target: ../_images/%s" % os.path.basename(im))
             lines.append("\n")
         self.state_machine.insert_input(lines, rst_file)
-        print "\n".join(lines)
         return []
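The refined directive emits a download link, a literalinclude, and one image directive per generated figure, each linking to the full-size image. A sketch of that line-building step, factored into a standalone function for clarity (`cookbook_lines` is not part of the extension itself):

```python
import os

def cookbook_lines(script_fn, images):
    """Build the reST lines the yt_cookbook directive inserts:
    a source link, a literalinclude, then the recipe's images."""
    script_bn = os.path.basename(script_fn)
    lines = ["(`%s <%s>`__)" % (script_bn, script_fn), "\n", "\n",
             ".. literalinclude:: %s" % script_fn, "\n", "\n"]
    for im in images:
        im_name = os.path.join("_static", os.path.basename(im))
        lines.append(".. image:: %s" % im_name)
        lines.append("   :width: 400")
        lines.append("   :target: ../_images/%s" % os.path.basename(im))
        lines.append("\n")
    return lines
```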


diff -r 2b16faade0e4ca124f37d3bdede859a3b82aef8d -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f source/configuration.rst
--- a/source/configuration.rst
+++ b/source/configuration.rst
@@ -58,14 +58,14 @@
 argument.  As an example, to lower the log level (thus making it more verbose)
 you can specify:
 
-.. code-block::
+.. code-block:: bash
 
    $ python2.7 my_script.py --config loglevel=1
 
 Any configuration option specific to yt can be specified in this manner.  One
 common configuration option would be to disable serialization:
 
-.. code-block::
+.. code-block:: bash
 
    $ python2.7 my_script.py --config serialize=False
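A sketch of how `--config key=value` pairs like these could be folded into a configuration dict (illustrative only; not yt's actual command-line parser):

```python
def parse_config_overrides(argv):
    """Collect every '--config key=value' pair from a command line
    into a dict of string overrides.  Hypothetical helper."""
    overrides = {}
    args = iter(argv)
    for arg in args:
        if arg == "--config":
            # split the following token on the first '='
            key, _, value = next(args).partition("=")
            overrides[key] = value
    return overrides
```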
 



https://bitbucket.org/yt_analysis/yt-doc/changeset/d8b69cdb77a6/
changeset:   d8b69cdb77a6
user:        MatthewTurk
date:        2012-07-25 05:07:32
summary:     Changing the remaining existing literalincludes over to yt_cookbook directives.
Added all the remaining recipes from the cookbook.
affected #:  16 files

diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/calculating_information.rst
--- a/source/cookbook/calculating_information.rst
+++ b/source/cookbook/calculating_information.rst
@@ -1,10 +1,7 @@
 Calculating Dataset Information
 -------------------------------
 
-.. literalinclude:: average_value.py
-
-
-
-.. literalinclude:: sum_mass_in_sphere.py
-.. literalinclude:: global_phase_plots.py
-
+.. yt_cookbook:: average_value.py
+.. yt_cookbook:: sum_mass_in_sphere.py
+.. yt_cookbook:: global_phase_plots.py
+.. yt_cookbook:: rad_velocity.py 


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/camera_movement.py
--- /dev/null
+++ b/source/cookbook/camera_movement.py
@@ -0,0 +1,53 @@
+"""
+Title: Camera Motion in Volume Rendering
+Description: This recipe shows how to use the movement functions hanging off of
+             the Camera object.
+
+             Additionally, for the purposes of the recipe, we have simplified
+             the image considerably.
+Outputs: []
+"""
+from yt.mods import * # set up our namespace
+   
+# Follow the simple_volume_rendering cookbook for the first part of this.
+fn = "RedshiftOutput0005" # parameter file to load
+pf = load(fn) # load data
+dd = pf.h.all_data()
+mi, ma = dd.quantities["Extrema"]("Density")[0]
+
+# Set up transfer function
+tf = ColorTransferFunction((na.log10(mi), na.log10(ma)))
+tf.add_layers(6, w=0.05)
+
+# Set up camera parameters
+c = [0.5, 0.5, 0.5] # Center
+L = [1, 1, 1] # Normal Vector
+W = 1.0 # Width
+Nvec = 512 # Pixels on a side
+
+# Specify a north vector, which helps with rotations.
+north_vector = [0.,0.,1.]
+
+# Find the maximum density location, store it in max_c
+v,max_c = pf.h.find_max('Density')
+
+# Initialize the Camera
+cam = pf.h.camera(c, L, W, (Nvec,Nvec), tf, north_vector=north_vector)
+frame = 0
+
+# Do a rotation over 30 frames
+for i, snapshot in enumerate(cam.rotation(na.pi, 30)):
+    write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
+    frame += 1
+
+# Move to the maximum density location over 10 frames
+for i, snapshot in enumerate(cam.move_to(max_c, 10)):
+    write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
+    frame += 1
+
+# Zoom in by a factor of 10 over 10 frames
+for i, snapshot in enumerate(cam.zoomin(10.0, 10)):
+    write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
+    frame += 1
+
+
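The single running `frame` counter above keeps filenames globally ordered across the three camera moves (rotation, move_to, zoomin). The pattern, sketched on its own:

```python
def frame_filenames(counts, template="camera_movement_%04i.png"):
    """One counter spans all moves, so frame numbers never restart.
    counts is e.g. [30, 10, 10] frames per camera move."""
    frame = 0
    names = []
    for n in counts:
        for _ in range(n):
            names.append(template % frame)
            frame += 1
    return names
```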


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/complex_plots.rst
--- a/source/cookbook/complex_plots.rst
+++ b/source/cookbook/complex_plots.rst
@@ -1,10 +1,11 @@
 A Few Complex Plots
 -------------------
 
-.. literalinclude:: offaxis_projection.py
-.. literalinclude:: multi_width_image.py
-.. literalinclude:: overplot_particles.py
-.. literalinclude:: thin_slice_projection.py
-.. literalinclude:: velocity_vectors_on_slice.py
-.. literalinclude:: contours_on_slice.py
-
+.. yt_cookbook:: offaxis_projection.py
+.. yt_cookbook:: multi_width_image.py
+.. yt_cookbook:: multi_plot_slice_and_proj.py 
+.. yt_cookbook:: overplot_particles.py
+.. yt_cookbook:: thin_slice_projection.py
+.. yt_cookbook:: velocity_vectors_on_slice.py
+.. yt_cookbook:: contours_on_slice.py
+.. yt_cookbook:: radial_profile_styles.py 


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/constructing_data_objects.rst
--- a/source/cookbook/constructing_data_objects.rst
+++ b/source/cookbook/constructing_data_objects.rst
@@ -1,6 +1,6 @@
 Constructing Data Objects
 -------------------------
 
-.. literalinclude:: find_clumps.py
-.. literalinclude:: boolean_data_objects.py
+.. yt_cookbook:: find_clumps.py
+.. yt_cookbook:: boolean_data_objects.py
 


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/cosmological_analysis.rst
--- a/source/cookbook/cosmological_analysis.rst
+++ b/source/cookbook/cosmological_analysis.rst
@@ -1,9 +1,10 @@
 Cosmological Analysis
 ---------------------
 
-.. literalinclude:: make_light_cone.py
-.. literalinclude:: make_light_ray.py
-.. literalinclude:: halo_finding.py
-.. literalinclude:: halo_particle_plotting.py
-.. literalinclude:: halo_plotting.py
-
+.. yt_cookbook:: halo_finding.py
+.. yt_cookbook:: halo_plotting.py
+.. yt_cookbook:: halo_particle_plotting.py
+.. yt_cookbook:: light_cone_projection.py 
+.. yt_cookbook:: light_cone_with_halo_mask.py 
+.. yt_cookbook:: unique_light_cone_projections.py 
+.. yt_cookbook:: make_light_ray.py 


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/count.sh
--- a/source/cookbook/count.sh
+++ b/source/cookbook/count.sh
@@ -3,6 +3,6 @@
     COUNT=`cat *.rst | grep --count ${fn}`
     if [ $COUNT -lt 1 ]
     then
-        echo ${fn} missing!
+        echo ${fn}
     fi
 done


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/extract_fixed_resolution_data.py
--- /dev/null
+++ b/source/cookbook/extract_fixed_resolution_data.py
@@ -0,0 +1,38 @@
+"""
+Title: Extract Fixed Resolution Data
+Description: This is a recipe to show how to open a dataset and extract it to a
+             file at a fixed resolution with no interpolation or smoothing.
+             Additionally, this recipe shows how to insert a dataset into an
+             external HDF5 file using h5py.
+Outputs: [my_data.h5]
+"""
+from yt.mods import *
+
+# For this example we will use h5py to write to our output file.
+import h5py
+
+fn = "RedshiftOutput0005" # parameter file to load
+pf = load(fn) # load data
+
+# This is the resolution we will extract at
+DIMS = 128
+
+# Now, we construct an object that describes the data region and structure we
+# want
+cube = pf.h.covering_grid(2, # The level we are willing to extract to; higher
+                             # levels than this will not contribute to the data!
+                          left_edge=[0.0, 0.0, 0.0], 
+                          # How many dimensions along each axis
+                          dims=[DIMS,DIMS,DIMS],
+                          # And any fields to preload (this is optional!)
+                          fields=["Density"])
+
+# Now we open our output file using h5py
+# Note that we open with 'w' which will overwrite existing files!
+f = h5py.File("my_data.h5", "w") 
+
+# We create a dataset at the root node, calling it density...
+f.create_dataset("/density", data=cube["Density"])
+
+# We close our file
+f.close()


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/halo_mass_info.py
--- /dev/null
+++ b/source/cookbook/halo_mass_info.py
@@ -0,0 +1,34 @@
+"""
+Title: Halo Mass Info
+Description: This recipe finds halos and then prints out information about
+             them.  Note that this recipe will take advantage of multiple CPUs
+             if executed with mpirun and supplied the --parallel command line
+             argument.  
+Outputs: [RedshiftOutput0006_halo_info.txt]
+"""
+from yt.mods import *
+
+fn = "Enzo_64/RD0006/RedshiftOutput0006" # parameter file to load
+pf = load(fn) # load data
+
+# First we run our halo finder to identify all the halos in the dataset.  This
+# can take arguments, but the default are pretty sane.
+halos = HaloFinder(pf)
+
+f = open("%s_halo_info.txt" % pf, "w")
+
+# Now, for every halo, we get the baryon data and examine it.
+for halo in halos:
+    # The halo has a property called 'get_sphere' that obtains a sphere
+    # centered on the point of maximum density (or the center of mass, if that
+    # argument is supplied) and with the radius the maximum particle radius of
+    # that halo.
+    sphere = halo.get_sphere()
+    # We use the quantities[] method to get the total mass in baryons and in
+    # particles.
+    baryon_mass, particle_mass = sphere.quantities["TotalQuantity"](
+            ["CellMassMsun", "ParticleMassMsun"], lazy_reader=True)
+    # Now we print out this information, along with the ID.
+    f.write("Total mass in HOP group %s is %0.5e (gas = %0.5e / particles = %0.5e)\n" % \
+            (halo.id, baryon_mass + particle_mass, baryon_mass, particle_mass))
+f.close()


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/multi_plot.py
--- /dev/null
+++ b/source/cookbook/multi_plot.py
@@ -0,0 +1,54 @@
+"""
+Title: Simple Multi Plot
+Description: This is a simple recipe to show how to open a dataset and then
+             plot a slice through it, centered at its most dense point.
+Outputs: []
+"""
+from yt.mods import * # set up our namespace
+import matplotlib.colorbar as cb
+
+fn = "GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0150" # parameter file to load
+orient = 'horizontal'
+
+pf = load(fn) # load data
+
+# There's a lot in here:
+#   From this we get a containing figure, a list-of-lists of axes into which we
+#   can place plots, and some axes into which we'll put colorbars.
+# We feed it:
+#   Number of plots on the x-axis, number of plots on the y-axis, and how we
+#   want our colorbars oriented.  (This governs where they will go, too.)
+#   bw is the base-width in inches; 4 is about right for most cases.
+fig, axes, colorbars = get_multi_plot( 2, 1, colorbar=orient, bw = 4)
+
+# We'll use a plot collection, just for convenience's sake
+pc = PlotCollection(pf, "c")
+
+# Now we add a slice and set the colormap of that slice, but note that we're
+# feeding it an axes -- the zeroth row, the zeroth column, and telling the plot
+# "Don't make a colorbar."  We'll make one ourselves.
+p = pc.add_slice("Density", "x", figure = fig, axes = axes[0][0], use_colorbar=False)
+p.set_cmap("bds_highcontrast") # this is our colormap
+
+# We do this again, but this time we take the 1-index column.
+p = pc.add_slice("Temperature", "x", figure=fig, axes=axes[0][1], use_colorbar=False)
+p.set_cmap("hot") # a different colormap
+
+pc.set_width(5.0, 'mpc') # change width of both plots
+
+# Each 'p' is a plot -- this is the Density plot and the Temperature plot.
+# Each 'cax' is a colorbar-container, into which we'll put a colorbar.
+# zip pairs the plots with their colorbar containers.
+for p, cax in zip(pc.plots, colorbars):
+    # Now we make a colorbar, using the 'image' attribute of the plot.
+    # 'image' is usually not accessed; we're making a special exception here,
+    # though.  'image' will tell the colorbar what the limits of the data are.
+    cbar = cb.Colorbar(cax, p.image, orientation=orient)
+    # Now, we have to do a tiny bit of magic -- we tell the plot what its
+    # colorbar is, and then we tell the plot to set the label of that colorbar.
+    p.colorbar = cbar
+    p._autoset_label()
+
+# And now we're done!  Note that we're calling a method of the figure, not the
+# PlotCollection.
+fig.savefig("%s" % pf)


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/multi_plot_3x2.py
--- /dev/null
+++ b/source/cookbook/multi_plot_3x2.py
@@ -0,0 +1,58 @@
+"""
+Title: Basic 3x2 Multi Plot
+Description: This is a simple recipe to show how to open a dataset and then
+             plot a slice through it, centered at its most dense point.
+Outputs: [RedshiftOutput0006_3x2.png]
+"""
+from yt.mods import * # set up our namespace
+import matplotlib.colorbar as cb
+
+fn = "Enzo_64/RD0006/RedshiftOutput0006" # parameter file to load
+orient = 'horizontal'
+
+pf = load(fn) # load data
+
+# There's a lot in here:
+#   From this we get a containing figure, a list-of-lists of axes into which we
+#   can place plots, and some axes into which we'll put colorbars.
+# We feed it:
+#   Number of plots on the x-axis, number of plots on the y-axis, and how we
+#   want our colorbars oriented.  (This governs where they will go, too.)
+#   bw is the base-width in inches; 4 is about right for most cases.
+fig, axes, colorbars = get_multi_plot( 2, 3, colorbar=orient, bw = 4)
+
+# We'll use a plot collection, just for convenience's sake
+pc = PlotCollection(pf, "c")
+
+# Now we follow the method of "multi_plot.py" but we're going to iterate
+# over the columns, which will become axes of slicing.
+for ax in range(3):
+    p = pc.add_slice("Density", ax, figure = fig, axes = axes[ax][0],
+                     use_colorbar=False)
+    p.set_cmap("bds_highcontrast") # this is our colormap
+    p.set_zlim(5e-32, 1e-29)
+    # We do this again, but this time we take the 1-index column.
+    p = pc.add_slice("Temperature", ax, figure=fig, axes=axes[ax][1],
+                     use_colorbar=False)
+    p.set_zlim(1e3, 3e4) # Set this so it's the same for all.
+    p.set_cmap("hot") # a different colormap
+
+pc.set_width(5.0, 'mpc') # change width of both plots
+
+# Each 'p' is a plot -- this is the Density plot and the Temperature plot.
+# Each 'cax' is a colorbar-container, into which we'll put a colorbar.
+# zip pairs the plots with their colorbar containers.  Note that it cuts off after the
+# shortest iterator is exhausted, in this case pc.plots.
+for p, cax in zip(pc.plots, colorbars):
+    # Now we make a colorbar, using the 'image' attribute of the plot.
+    # 'image' is usually not accessed; we're making a special exception here,
+    # though.  'image' will tell the colorbar what the limits of the data are.
+    cbar = cb.Colorbar(cax, p.image, orientation=orient)
+    # Now, we have to do a tiny bit of magic -- we tell the plot what its
+    # colorbar is, and then we tell the plot to set the label of that colorbar.
+    p.colorbar = cbar
+    p._autoset_label()
+
+# And now we're done!  Note that we're calling a method of the figure, not the
+# PlotCollection.
+fig.savefig("%s_3x2" % pf)


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/multi_plot_3x2_FRB.py
--- /dev/null
+++ b/source/cookbook/multi_plot_3x2_FRB.py
@@ -0,0 +1,72 @@
+"""
+Title: Advanced 3x2 Multi Plot
+Description: This produces a series of slices of multiple fields with different
+             color maps and zlimits, and makes use of the
+             FixedResolutionBuffer. While this is more complex than the
+             equivalent plot collection-based solution, it allows for a *lot*
+             more flexibility. Every part of the script uses matplotlib
+             commands, allowing its full power to be exercised.
+Outputs: [RedshiftOutput0006_3x2.png]
+"""
+from yt.mods import * # set up our namespace
+import matplotlib.colorbar as cb
+from matplotlib.colors import LogNorm
+
+fn = "Enzo_64/RD0006/RedshiftOutput0006" # parameter file to load
+
+
+pf = load(fn) # load data
+
+# set up our Fixed Resolution Buffer parameters: a width, resolution, and center
+width = (7.0, "mpc")
+res = [1000, 1000]
+c = [0.5, 0.5, 0.5] # center used for the slices and the FRB
+#  get_multi_plot returns a containing figure, a list-of-lists of axes
+#   into which we can place plots, and some axes that we'll put
+#   colorbars.  
+
+#  it accepts: # of x-axis plots, # of y-axis plots, and how the
+#  colorbars are oriented (this also determines where they go: below
+#  in the case of 'horizontal', on the right in the case of
+#  'vertical'), bw is the base-width in inches (4 is about right for
+#  most cases)
+
+orient = 'horizontal'
+fig, axes, colorbars = get_multi_plot( 2, 3, colorbar=orient, bw = 6)
+
+# Now we follow the method of "multi_plot.py" but we're going to iterate
+# over the columns, which will become axes of slicing.
+plots = []
+for ax in range(3):
+    sli = pf.h.slice(ax, c[ax])
+    frb = sli.to_frb(width, res, center=c, periodic=True)
+    den_axis = axes[ax][0]
+    temp_axis = axes[ax][1]
+
+    # here, we turn off the axes labels and ticks, but you could
+    # customize further.
+    for axis in (den_axis, temp_axis):
+        axis.xaxis.set_visible(False)
+        axis.yaxis.set_visible(False)
+
+    plots.append(den_axis.imshow(frb['Density'], norm=LogNorm()))
+    plots[-1].set_clim((5e-32, 1e-29))
+    plots[-1].set_cmap("bds_highcontrast")
+
+    plots.append(temp_axis.imshow(frb['Temperature'], norm=LogNorm()))
+    plots[-1].set_clim((1e3, 3e4))
+    plots[-1].set_cmap("hot")
+    
+# Each 'cax' is a colorbar-container, into which we'll put a colorbar.
+# zip creates triples from corresponding elements of the three lists.
+# Note that it cuts off after the shortest iterator is exhausted,
+# in this case, titles.
+titles=[r'$\mathrm{Density}\ (\mathrm{g\ cm^{-3}})$', r'$\mathrm{Temperature}\ (\mathrm{K})$']
+for p, cax, t in zip(plots, colorbars,titles):
+    # Now we make a colorbar, using the 'image' we stored in plots
+    # above. note this is what is *returned* by the imshow method of
+    # the plots.
+    cbar = fig.colorbar(p, cax=cax, orientation=orient)
+    cbar.set_label(t)
+
+# And now we're done!  
+fig.savefig("%s_3x2.png" % pf)


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/offaxis_projection_colorbar.py
--- /dev/null
+++ b/source/cookbook/offaxis_projection_colorbar.py
@@ -0,0 +1,46 @@
+"""
+Title: Off-axis Projection
+Description: This recipe shows how to generate a colorbar with a projection of
+             a dataset from an arbitrary projection angle (so you are not
+             confined to the x, y, and z axes).  Please note that this same
+             write_projection function will work with a volume rendering to
+             generate a colorbar in the same fashion.  
+Outputs: [ offaxis_projection_colorbar.png ]
+"""
+from yt.mods import * # set up our namespace
+
+fn = "IsolatedGalaxy/galaxy0030/galaxy0030" # parameter file to load
+
+pf = load(fn) # load data
+
+# Now we need a center of our volume to render.  Here we'll just use
+# 0.5,0.5,0.5, because volume renderings are not periodic.
+c = [0.5, 0.5, 0.5]
+
+# Our image plane will be normal to some vector.  For things like collapsing
+# objects, you could set it the way you would a cutting plane -- but for this
+# dataset, we'll just choose an off-axis value at random.  This gets normalized
+# automatically.
+L = [0.5, 0.2, 0.7]
+
+# Our "width" is the width of the image plane as well as the depth -- so we set
+# it to be 0.8 so we get almost the whole domain.  Note that corners may be
+# visible in the output image!
+W = 0.8
+
+# Now we decide how big an image we want.  512x512 should be sufficient.
+N = 512
+
+# Now we call the off_axis_projection function, which handles the rest.
+# Note that we set no_ghost equal to False, so that we *do* include ghost
+# zones in our data.  This takes longer to calculate, but the results look
+# much cleaner than when you ignore the ghost zones.
+# Also note that we set the field which we want to project as "Density", but
+# really we could use any arbitrary field like "Temperature", "Metallicity"
+# or whatever.
+image = off_axis_projection(pf, c, L, W, N, "Density", no_ghost=False)
+
+# Image is now an NxN array representing the intensities of the various pixels.
+# And now, we call our direct image saver.  We save the log of the result.
+write_projection(image, "offaxis_projection_colorbar.png" % pf, 
+                 colorbar_label="Column Density (cm$^{-2}$)", title="Title")


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/run_halo_profiler.py
--- /dev/null
+++ b/source/cookbook/run_halo_profiler.py
@@ -0,0 +1,37 @@
+"""
+This is a recipe for making radial profiles and projections of all of the halos 
+within a cosmological simulation.  See :ref:`halo_profiling` for full documentation 
+of the HaloProfiler.
+"""
+from yt.mods import *
+
+# Instantiate HaloProfiler for this dataset.
+hp = amods.halo_profiler.HaloProfiler("DD0242/DD0242")
+
+# Add a filter to remove halos that have no profile points with overdensity 
+# above 200, and with virial masses less than 1e14 solar masses.
+# Also, return the virial mass and radius to be written out to a file.
+hp.add_halo_filter(amods.halo_profiler.VirialFilter,must_be_virialized=True,
+                   overdensity_field='ActualOverdensity',
+                   virial_overdensity=200,
+                   virial_filters=[['TotalMassMsun','>=','1e14']],
+                   virial_quantities=['TotalMassMsun','RadiusMpc'])
+
+# Add profile fields.
+hp.add_profile('CellVolume',weight_field=None,accumulation=True)
+hp.add_profile('TotalMassMsun',weight_field=None,accumulation=True)
+hp.add_profile('Density',weight_field='CellMassMsun',accumulation=False)
+hp.add_profile('Temperature',weight_field='CellMassMsun',accumulation=False)
+
+# Make profiles and output filtered halo list to FilteredQuantities.out.
+hp.make_profiles(filename="FilteredQuantities.out")
+
+# Add projection fields.
+hp.add_projection('Density',weight_field=None)
+hp.add_projection('Temperature',weight_field='Density')
+hp.add_projection('Metallicity',weight_field='Density')
+
+# Make projections for all three axes using the filtered halo list and 
+# save data to hdf5 files.
+hp.make_projections(save_cube=True,save_images=True,
+                    halo_list='filtered',axes=[0,1,2])


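[Aside: the virial-filtering step in the recipe above trims the halo list down to
massive, virialized objects before any profiling happens.  The core idea can be
sketched without yt at all -- the halo dicts, field names, and the 1e14 threshold
below simply mirror the recipe and are not the HaloProfiler API:]

```python
def virial_filter(halos, field="TotalMassMsun", threshold=1e14):
    """Keep only halos whose virial mass meets the threshold,
    mimicking the VirialFilter criterion used in the recipe."""
    return [h for h in halos if h[field] >= threshold]

# Hypothetical halo catalog entries, for illustration only.
halos = [
    {"id": 0, "TotalMassMsun": 5.0e14, "RadiusMpc": 2.1},
    {"id": 1, "TotalMassMsun": 3.0e13, "RadiusMpc": 0.9},
    {"id": 2, "TotalMassMsun": 1.2e14, "RadiusMpc": 1.4},
]
filtered = virial_filter(halos)
print([h["id"] for h in filtered])  # -> [0, 2]
```

[Only halos 0 and 2 clear the 1e14 solar mass cut; halo 1 is dropped, just as
sub-threshold halos are excluded from FilteredQuantities.out.]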
diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/simple_plots.rst
--- a/source/cookbook/simple_plots.rst
+++ b/source/cookbook/simple_plots.rst
@@ -1,6 +1,10 @@
 Making Simple Plots
 -------------------
 
+One of the easiest ways to interact with yt is by creating simple
+visualizations of your data.  Below we show how to do this, as well as how to
+extend these plots to be ready for publication.
+
 Simple Slices
 ~~~~~~~~~~~~~
 


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/simulation_halo_profiler.py
--- /dev/null
+++ b/source/cookbook/simulation_halo_profiler.py
@@ -0,0 +1,40 @@
+"""
+The following recipe will run the HaloProfiler (see :ref:`halo_profiling`) on
+all the datasets in one simulation between z = 10 and 0.
+"""
+from yt.mods import *
+
+es = amods.simulation_handler.EnzoSimulation(
+        "simulation_parameter_file", initial_redshift=10, final_redshift=0)
+
+# Loop over all datasets in the requested time interval.
+for output in es.allOutputs:
+
+    # Instantiate HaloProfiler for this dataset.
+    hp = amods.halo_profiler.HaloProfiler(output['filename'])
+    
+    # Add a virialization filter.
+    hp.add_halo_filter(amods.halo_profiler.VirialFilter,must_be_virialized=True,
+                       overdensity_field='ActualOverdensity',
+                       virial_overdensity=200,
+                       virial_filters=[['TotalMassMsun','>=','1e14']],
+                       virial_quantities=['TotalMassMsun','RadiusMpc'])
+    
+    # Add profile fields.
+    hp.add_profile('CellVolume',weight_field=None,accumulation=True)
+    hp.add_profile('TotalMassMsun',weight_field=None,accumulation=True)
+    hp.add_profile('Density',weight_field="CellMassMsun",accumulation=False)
+    hp.add_profile('Temperature',weight_field='CellMassMsun',accumulation=False)
+    # Make profiles and output filtered halo list to FilteredQuantities.out.
+    hp.make_profiles(filename="FilteredQuantities.out")
+    
+    # Add projection fields.
+    hp.add_projection('Density',weight_field=None)
+    hp.add_projection('Temperature',weight_field='Density')
+    hp.add_projection('Metallicity',weight_field='Density')
+    # Make projections for all three axes using the filtered halo list and 
+    # save data to hdf5 files.
+    hp.make_projections(save_cube=True,save_images=True,
+                        halo_list='filtered',axes=[0,1,2])
+    
+    del hp


diff -r 1fe9ff38aa51017e297271d4a8b7b63117cd120f -r d8b69cdb77a66263d0552de015a4c704868af5ae source/cookbook/zoomin_frames.py
--- /dev/null
+++ b/source/cookbook/zoomin_frames.py
@@ -0,0 +1,41 @@
+"""
+Title: Zooming Movie
+Description: This is a recipe that takes a slice through the most dense point,
+             then creates a bunch of frames as it zooms in.  It's important to
+             note that this particular recipe is provided to show how to be
+             more flexible and add annotations and the like -- the base system,
+             of a zoomin, is provided by the "yt zoomin" command on the command
+             line.
+Outputs: [frame_00000.png, frame_00001.png, frame_00002.png,
+         frame_00003.png, frame_00004.png]
+"""
+from yt.mods import * # set up our namespace
+
+fn = "RedshiftOutput0005" # parameter file to load
+n_frames = 5  # This is the number of frames to make -- below, you can see how
+              # this is used.
+min_dx = 40   # This is the minimum size in smallest_dx of our last frame.
+              # Usually it should be set to something like 400, but for THIS
+              # dataset, we actually don't have that great a resolution.
+
+pf = load(fn) # load data
+frame_template = "frame_%05i" # Template for frame filenames
+
+pc = PlotCollection(pf, "c")
+p = pc.add_slice("Density", "z") # Add our slice, along z
+p.modify["contour"]("Temperature") # We'll contour in temperature -- this kind
+                                    # of modification can't be done on the command
+                                    # line, so that's why we have the recipe!
+
+# What we do now is a bit fun.  "enumerate" returns a tuple for every item --
+# the index of the item, and the item itself.  This saves us having to write
+# something like "i = 0" and then inside the loop "i += 1" on every loop.  The
+# argument to enumerate is the 'logspace' function, which takes a minimum and a
+# maximum and the number of items to generate.  It returns 10^power of each
+# item it generates.
+for i,v in enumerate(na.logspace(
+            0, na.log10(pf.h.get_smallest_dx()*min_dx), n_frames)):
+    # We set our width as necessary for this frame ...
+    pc.set_width(v,'1')
+    # ... and we save!
+    pc.save(frame_template % (i))



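[Aside: the enumerate/logspace pattern driving the frame widths above is worth
isolating.  A minimal pure-Python stand-in for ``na.logspace`` (written out here
so the sketch runs without NumPy; the real recipe uses the NumPy function) shows
how each loop iteration receives both a frame index and a width:]

```python
def logspace(start, stop, num):
    # Like numpy.logspace: num values 10**x, with x evenly
    # spaced between start and stop (inclusive).
    if num == 1:
        return [10.0 ** start]
    step = (stop - start) / (num - 1)
    return [10.0 ** (start + i * step) for i in range(num)]

frame_template = "frame_%05i"
# Zoom from the full domain (10**0 = 1) down to a width of 10**-3,
# exactly the shape of the loop in zoomin_frames.py.
for i, width in enumerate(logspace(0, -3, 5)):
    print(frame_template % i + "  width = %0.4f" % width)
```

[The widths fall geometrically -- 1.0, 0.1778, 0.0316, 0.0056, 0.0010 -- which is
what makes the zoom feel smooth across many orders of magnitude.]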
https://bitbucket.org/yt_analysis/yt-doc/changeset/f48b7599fe6f/
changeset:   f48b7599fe6f
user:        MatthewTurk
date:        2012-07-25 05:14:46
summary:     Adding the remaining .py files to various .rst files in cookbook includes.
affected #:  3 files

diff -r d8b69cdb77a66263d0552de015a4c704868af5ae -r f48b7599fe6fabb19e6694fbc46321d060b0ff64 source/cookbook/complex_plots.rst
--- a/source/cookbook/complex_plots.rst
+++ b/source/cookbook/complex_plots.rst
@@ -4,8 +4,14 @@
 .. yt_cookbook:: offaxis_projection.py
 .. yt_cookbook:: multi_width_image.py
 .. yt_cookbook:: multi_plot_slice_and_proj.py 
+.. yt_cookbook:: multi_plot_3x2_FRB.py
+.. yt_cookbook:: multi_plot_3x2.py
+.. yt_cookbook:: multi_plot.py
 .. yt_cookbook:: overplot_particles.py
 .. yt_cookbook:: thin_slice_projection.py
 .. yt_cookbook:: velocity_vectors_on_slice.py
 .. yt_cookbook:: contours_on_slice.py
 .. yt_cookbook:: radial_profile_styles.py 
+.. yt_cookbook:: offaxis_projection_colorbar.py
+.. yt_cookbook:: camera_movement.py
+.. yt_cookbook:: zoomin_frames.py


diff -r d8b69cdb77a66263d0552de015a4c704868af5ae -r f48b7599fe6fabb19e6694fbc46321d060b0ff64 source/cookbook/constructing_data_objects.rst
--- a/source/cookbook/constructing_data_objects.rst
+++ b/source/cookbook/constructing_data_objects.rst
@@ -3,4 +3,4 @@
 
 .. yt_cookbook:: find_clumps.py
 .. yt_cookbook:: boolean_data_objects.py
-
+.. yt_cookbook:: extract_fixed_resolution_data.py


diff -r d8b69cdb77a66263d0552de015a4c704868af5ae -r f48b7599fe6fabb19e6694fbc46321d060b0ff64 source/cookbook/cosmological_analysis.rst
--- a/source/cookbook/cosmological_analysis.rst
+++ b/source/cookbook/cosmological_analysis.rst
@@ -8,3 +8,6 @@
 .. yt_cookbook:: light_cone_with_halo_mask.py 
 .. yt_cookbook:: unique_light_cone_projections.py 
 .. yt_cookbook:: make_light_ray.py 
+.. yt_cookbook:: halo_mass_info.py
+.. yt_cookbook:: run_halo_profiler.py
+.. yt_cookbook:: simulation_halo_profiler.py



https://bitbucket.org/yt_analysis/yt-doc/changeset/cc98f4f9819a/
changeset:   cc98f4f9819a
user:        MatthewTurk
date:        2012-07-25 14:44:35
summary:     Adding recipe notes for advanced.
affected #:  1 file

diff -r f48b7599fe6fabb19e6694fbc46321d060b0ff64 -r cc98f4f9819ae7f73984a0ada0ea6ebd21ea4b1a source/cookbook/free_free_field.py
--- a/source/cookbook/free_free_field.py
+++ b/source/cookbook/free_free_field.py
@@ -1,14 +1,3 @@
-"""
-This is a rather complicated example of how to use field parameters to alter a field.
-The field in question is the monochromatic free-free emission from a hot, optically thin plasma (such as in galaxy clusters) in erg/s/cm^3/keV, or optionally in photons/s/cm^3/keV. (see Rybicki and Lightman 1979). The following field parameters may be specified:
-Z = the mean charge number of the ions in the plasma
-mue = the mean molecular weight for the electrons
-mui = the mean molecular weight for the ions
-Ephoton = the photon energy in keV
-photom_emissivity = logical flag determining whether we use energy or photon emission
-
-In order to get the total luminosity for a data object, we need to add a new derived quantity that multiplies the emission by the cell volumes. We do that here as well. 
-"""
 from yt.mods import *
 from yt.utilities.physical_constants import mp # Need to grab the proton mass from the 
                                                # constants database



https://bitbucket.org/yt_analysis/yt-doc/changeset/ef08a6f40f5a/
changeset:   ef08a6f40f5a
user:        MatthewTurk
date:        2012-07-25 14:55:40
summary:     Adding notes to the calculating_information section.
affected #:  2 files

diff -r cc98f4f9819ae7f73984a0ada0ea6ebd21ea4b1a -r ef08a6f40f5ac0f983704882fba4ffe3d187b55f source/cookbook/calculating_information.rst
--- a/source/cookbook/calculating_information.rst
+++ b/source/cookbook/calculating_information.rst
@@ -1,7 +1,38 @@
 Calculating Dataset Information
 -------------------------------
 
+These recipes demonstrate methods of calculating quantities in a simulation,
+either for later visualization or for understanding properties of fluids and
+particles in the simulation.
+
+Average Field Value
+~~~~~~~~~~~~~~~~~~~
+
+This recipe is a very simple method of calculating the global average of a
+given field, as weighted by another field.
+
 .. yt_cookbook:: average_value.py
+
+Mass Enclosed in a Sphere
+~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This recipe constructs a sphere and then sums the total mass in particles and
+fluids in the sphere.
+
 .. yt_cookbook:: sum_mass_in_sphere.py
+
+Global Phase Plot
+~~~~~~~~~~~~~~~~~
+
+This is a simple recipe to show how to open a dataset and then plot a couple
+global phase diagrams, save them, and quit.
+
 .. yt_cookbook:: global_phase_plots.py
+
+Radial Velocity Profile
+~~~~~~~~~~~~~~~~~~~~~~~
+
+This recipe demonstrates how to subtract off a bulk velocity on a sphere before
+calculating the radial velocity within that sphere.
+
 .. yt_cookbook:: rad_velocity.py 


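[Aside: the "Average Field Value" recipe above computes a weighted global
average, and the arithmetic it relies on is simply sum(field * weight) /
sum(weight).  A minimal sketch, with plain lists standing in for yt data
containers (this is not the yt quantities API, just the underlying math):]

```python
def weighted_average(values, weights):
    """Average of `values` weighted by `weights` --
    e.g. a density field weighted by cell mass."""
    total_w = sum(weights)
    if total_w == 0:
        raise ValueError("weights sum to zero")
    return sum(v * w for v, w in zip(values, weights)) / total_w

# Made-up cell values, for illustration only.
density = [1.0, 2.0, 4.0]
cell_mass = [1.0, 1.0, 2.0]
print(weighted_average(density, cell_mass))  # -> 2.75
```

[Here the heavy third cell pulls the average up: (1 + 2 + 8) / 4 = 2.75.]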
diff -r cc98f4f9819ae7f73984a0ada0ea6ebd21ea4b1a -r ef08a6f40f5ac0f983704882fba4ffe3d187b55f source/cookbook/global_phase_plots.py
--- a/source/cookbook/global_phase_plots.py
+++ b/source/cookbook/global_phase_plots.py
@@ -1,9 +1,3 @@
-"""
-This is a simple recipe to show how to open a dataset and then plot a couple
-phase diagrams, save them, and quit.  Note that this recipe will take advantage
-of multiple CPUs if executed with mpirun and supplied the --parallel command
-line argument.  For more information, see :ref:`methods-profiles`.
-"""
 from yt.mods import * # set up our namespace
 
 fn = "IsolatedGalaxy/galaxy0030/galaxy0030" # parameter file to load
@@ -12,13 +6,7 @@
 dd = pf.h.all_data() # This is an object that describes the entire box
 pc = PlotCollection(pf) # defaults to center at most dense point
 
-# We plot the average x-velocity (mass-weighted) in our object as a function of
+# We plot the average VelocityMagnitude (mass-weighted) in our object as a function of
 # Density and Temperature
-plot=pc.add_phase_object(dd, ["Density","Temperature","x-velocity"])
-
-# We now plot the average value of x-velocity as a function of temperature
-plot=pc.add_profile_object(dd, ["Temperature", "x-velocity"])
-
-# Finally, the velocity magnitude as a function of density
-plot=pc.add_profile_object(dd, ["Density", "VelocityMagnitude"])
+plot=pc.add_phase_object(dd, ["Density","Temperature","VelocityMagnitude"])
 pc.save() # save all plots



https://bitbucket.org/yt_analysis/yt-doc/changeset/7976c21bf98b/
changeset:   7976c21bf98b
user:        MatthewTurk
date:        2012-07-25 15:08:54
summary:     Adding a worked example of the ionization cube.
affected #:  2 files

diff -r ef08a6f40f5ac0f983704882fba4ffe3d187b55f -r 7976c21bf98bd5f0c9d981ce82f8c39d70e65500 source/advanced/ionization_cube.py
--- /dev/null
+++ b/source/advanced/ionization_cube.py
@@ -0,0 +1,39 @@
+from yt.mods import *
+from yt.utilities.parallel_tools.parallel_analysis_interface \
+    import communication_system
+import h5py, glob, time
+
+@derived_field(name = "IonizedHydrogen",
+               units = r"\frac{\rho_{HII}}{\rho_{H}}")
+def IonizedHydrogen(field, data):
+    return data["HII_Density"]/(data["HI_Density"]+data["HII_Density"])
+
+filenames = glob.glob("SED800/DD*/*.hierarchy")
+filenames.sort()
+ts = TimeSeriesData.from_filenames(filenames, parallel = 8)
+
+ionized_z = na.zeros(ts[0].domain_dimensions, dtype="float32")
+
+t1 = time.time()
+for pf in ts.piter():
+    z = pf.current_redshift
+    for g in parallel_objects(pf.h.grids, njobs = 16):
+        i1, j1, k1 = g.get_global_startindex() # Index into our domain
+        i2, j2, k2 = g.get_global_startindex() + g.ActiveDimensions
+        # Look for the newly ionized gas
+        newly_ion = ((g["IonizedHydrogen"] > 0.999)
+                   & (ionized_z[i1:i2,j1:j2,k1:k2] < z))
+        ionized_z[i1:i2,j1:j2,k1:k2][newly_ion] = z
+        g.clear_data()
+
+print "Iteration completed  %0.3e" % (time.time()-t1)
+comm = communication_system.communicators[-1]
+for i in range(ionized_z.shape[0]):
+    ionized_z[i,:,:] = comm.mpi_allreduce(ionized_z[i,:,:], op="max")
+    print "Slab % 3i has maximum z of %0.3e" % (i, ionized_z[i,:,:].max())
+t2 = time.time()
+print "Completed.  %0.3e" % (t2-t1)
+
+if comm.rank == 0:
+    f = h5py.File("IonizationCube.h5", "w")
+    f.create_dataset("/z", data=ionized_z)


diff -r ef08a6f40f5ac0f983704882fba4ffe3d187b55f -r 7976c21bf98bd5f0c9d981ce82f8c39d70e65500 source/advanced/parallel_computation.rst
--- a/source/advanced/parallel_computation.rst
+++ b/source/advanced/parallel_computation.rst
@@ -463,5 +463,29 @@
     and can give valuable feedback about the
     resources the task requires.
     
-    
-  
+An Advanced Worked Example
+--------------------------
+
+Below is a script used to calculate the redshift of first 99.9% ionization in a
+simulation.  This script was designed to analyze a set of 100 outputs on
+Gordon, running on 128 processors.  This script goes through five phases:
+
+ #. Define a new derived field, which calculates the fraction of ionized
+    hydrogen as a function only of the total hydrogen density.
+ #. Load a time series up, specifying ``parallel = 8``.  This means that it
+    will decompose into 8 jobs.  So if we ran on 128 processors, we would have
+    16 processors assigned to each output in the time series.
+ #. Create a big cube that will hold our results for this set of processors.
+    Note that this covers only the outputs considered by this processor,
+    and this cube will not necessarily be filled in every cell.
+ #. For each output, distribute the grids to each of the sixteen processors
+    working on that output.  Each of these takes the max of the ionized
+    redshift in their zone versus the accumulation cube.
+ #. Iterate over slabs and find the maximum redshift in each slab of our
+    accumulation cube.
+
+At the end, the root processor (of the global calculation) writes out an
+ionization cube that contains the redshift of first reionization for each zone
+across all outputs.
+
+.. literalinclude:: ionization_cube.py



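[Aside: the reduction step in the worked example -- the ``mpi_allreduce`` with
``op="max"`` -- combines each processor's partially filled accumulation cube by
taking an elementwise maximum.  Without MPI, the combine rule itself looks like
this (nested lists stand in for the 3D array, and the per-rank cubes are made
up for illustration):]

```python
def elementwise_max(cube_a, cube_b):
    # Combine two partially filled accumulation slabs: each cell keeps
    # the largest redshift seen by any processor (0.0 means "never ionized").
    return [[max(a, b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(cube_a, cube_b)]

rank0 = [[0.0, 6.2], [5.1, 0.0]]
rank1 = [[7.3, 0.0], [0.0, 4.8]]
combined = elementwise_max(rank0, rank1)
print(combined)  # -> [[7.3, 6.2], [5.1, 4.8]]
```

[Because each rank only fills the cells of the grids it owned, the max-reduce
stitches the disjoint pieces into one complete cube on every rank.]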
https://bitbucket.org/yt_analysis/yt-doc/changeset/44e275713ae3/
changeset:   44e275713ae3
user:        MatthewTurk
date:        2012-07-25 15:12:16
summary:     Adding descriptions for the data object construction recipes.
affected #:  5 files

diff -r 7976c21bf98bd5f0c9d981ce82f8c39d70e65500 -r 44e275713ae3e44583aaed685a7c1853db096ccd source/cookbook/boolean_data_objects.py
--- a/source/cookbook/boolean_data_objects.py
+++ b/source/cookbook/boolean_data_objects.py
@@ -1,9 +1,3 @@
-"""
-Below shows the creation of a number of boolean data objects, which
-are built upon previously-defined data objects. The boolean
-data ojbects can be used like any other, except for a few cases.
-Please see :ref:`boolean_data_objects` for more information.
-"""
 from yt.mods import * # set up our namespace
 
 pf = load("Enzo_64/DD0043/data0043") # load data


diff -r 7976c21bf98bd5f0c9d981ce82f8c39d70e65500 -r 44e275713ae3e44583aaed685a7c1853db096ccd source/cookbook/complex_plots.rst
--- a/source/cookbook/complex_plots.rst
+++ b/source/cookbook/complex_plots.rst
@@ -1,12 +1,12 @@
 A Few Complex Plots
 -------------------
 
-.. yt_cookbook:: offaxis_projection.py
 .. yt_cookbook:: multi_width_image.py
 .. yt_cookbook:: multi_plot_slice_and_proj.py 
 .. yt_cookbook:: multi_plot_3x2_FRB.py
 .. yt_cookbook:: multi_plot_3x2.py
 .. yt_cookbook:: multi_plot.py
+.. yt_cookbook:: offaxis_projection.py
 .. yt_cookbook:: overplot_particles.py
 .. yt_cookbook:: thin_slice_projection.py
 .. yt_cookbook:: velocity_vectors_on_slice.py


diff -r 7976c21bf98bd5f0c9d981ce82f8c39d70e65500 -r 44e275713ae3e44583aaed685a7c1853db096ccd source/cookbook/constructing_data_objects.rst
--- a/source/cookbook/constructing_data_objects.rst
+++ b/source/cookbook/constructing_data_objects.rst
@@ -1,6 +1,35 @@
 Constructing Data Objects
 -------------------------
 
+These recipes demonstrate a few uncommon methods of constructing data objects
+from a simulation.
+
+Identifying Clumps
+~~~~~~~~~~~~~~~~~~
+
+This is a recipe to show how to find topologically connected sets of cells
+inside a dataset.  It returns these clumps and they can be inspected or
+visualized as would any other data object.  More detail on this method can be
+found in `astro-ph/0806.1653`.  For more information, see
+:ref:`methods-contours`.
+
 .. yt_cookbook:: find_clumps.py
+
+Boolean Data Objects
+~~~~~~~~~~~~~~~~~~~~
+
+Below shows the creation of a number of boolean data objects, which are built
+upon previously-defined data objects. The boolean data objects can be used like
+any other, except for a few cases.  Please see :ref:`boolean_data_objects` for
+more information.
+
 .. yt_cookbook:: boolean_data_objects.py
+
+Extracting Fixed Resolution Data
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This is a recipe to show how to open a dataset and extract it to a file at a
+fixed resolution with no interpolation or smoothing.  Additionally, this recipe
+shows how to insert a dataset into an external HDF5 file using h5py.
+
 .. yt_cookbook:: extract_fixed_resolution_data.py


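[Aside: "Identifying Clumps" above looks for topologically connected sets of
cells above a threshold.  The connectivity idea itself is a flood fill; here is
a small sketch on a 2D grid (find_clumps.py operates on 3D AMR data through
yt's contouring machinery, which this does not reproduce):]

```python
def find_clumps_2d(grid, threshold):
    """Label 4-connected components of cells with value >= threshold.
    Returns a list of clumps, each a list of (row, col) cells."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    clumps = []
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen or grid[r][c] < threshold:
                continue
            # Depth-first flood fill from this unvisited seed cell.
            stack, clump = [(r, c)], []
            seen.add((r, c))
            while stack:
                cr, cc = stack.pop()
                clump.append((cr, cc))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = cr + dr, cc + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and (nr, nc) not in seen
                            and grid[nr][nc] >= threshold):
                        seen.add((nr, nc))
                        stack.append((nr, nc))
            clumps.append(clump)
    return clumps

# Toy density grid: one 3-cell clump and one isolated cell above 5.
density = [[1, 5, 5, 0],
           [0, 5, 0, 0],
           [0, 0, 0, 6]]
print(len(find_clumps_2d(density, 5)))  # -> 2
```

[Each returned clump can then be inspected like any other collection of cells,
which is the same spirit in which yt's clumps behave like data objects.]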
diff -r 7976c21bf98bd5f0c9d981ce82f8c39d70e65500 -r 44e275713ae3e44583aaed685a7c1853db096ccd source/cookbook/extract_fixed_resolution_data.py
--- a/source/cookbook/extract_fixed_resolution_data.py
+++ b/source/cookbook/extract_fixed_resolution_data.py
@@ -1,11 +1,3 @@
-"""
-Title: Extract Fixed Resolution Data
-Description: This is a recipe to show how to open a dataset and extract it to a
-             file at a fixed resolution with no interpolation or smoothing.
-             Additionally, this recipe shows how to insert a dataset into an
-             external HDF5 file using h5py.
-Outputs: [my_data.h5]
-"""
 from yt.mods import *
 
 # For this example we will use h5py to write to our output file.


diff -r 7976c21bf98bd5f0c9d981ce82f8c39d70e65500 -r 44e275713ae3e44583aaed685a7c1853db096ccd source/cookbook/find_clumps.py
--- a/source/cookbook/find_clumps.py
+++ b/source/cookbook/find_clumps.py
@@ -1,10 +1,3 @@
-"""
-This is a recipe to show how to find topologicall connected sets of cells
-inside a dataset.  It returns these clumps and they can be inspected or
-visualized as would any other data object.  More detail on this method can be
-found in astro-ph/0806.1653.  For more information, see
-:ref:`methods-contours`.
-"""
 from yt.mods import * # set up our namespace
 
 fn = "IsolatedGalaxy/galaxy0030/galaxy0030" # parameter file to load



https://bitbucket.org/yt_analysis/yt-doc/changeset/1066492f7861/
changeset:   1066492f7861
user:        MatthewTurk
date:        2012-07-25 15:37:37
summary:     Adding descriptions for complex_plots.
affected #:  9 files

diff -r 44e275713ae3e44583aaed685a7c1853db096ccd -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 source/cookbook/camera_movement.py
--- a/source/cookbook/camera_movement.py
+++ b/source/cookbook/camera_movement.py
@@ -1,12 +1,3 @@
-"""
-Title: Camera Motion in Volume Rendering
-Description: This recipe shows how to use the movement functions hanging off of
-             the Camera object.
-
-             Additionally, for the purposes of the recipe, we have simplified
-             the image considerably.
-Outputs: []
-"""
 from yt.mods import * # set up our namespace
    
 # Follow the simple_volume_rendering cookbook for the first part of this.


diff -r 44e275713ae3e44583aaed685a7c1853db096ccd -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 source/cookbook/complex_plots.rst
--- a/source/cookbook/complex_plots.rst
+++ b/source/cookbook/complex_plots.rst
@@ -1,17 +1,124 @@
 A Few Complex Plots
 -------------------
 
+The built-in plotting functionality covers the very simple use cases that are
+most common.  These scripts will demonstrate how to construct more complex
+plots or publication-quality plots.  In many cases these show how to make
+multi-panel plots.
+
+Simple Multi-Plot
+~~~~~~~~~~~~~~~~~
+This is a simple recipe to show how to open a dataset and then plot a slice
+through it, centered at its most dense point.
+
+.. yt_cookbook:: multi_plot.py
+
+Multi-Width Image
+~~~~~~~~~~~~~~~~~
+
+This is a simple recipe to show how to open a dataset and then plot slices
+through it at varying widths.
+
 .. yt_cookbook:: multi_width_image.py
+
+Multi-Plot Slice and Projections
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This shows how to combine multiple slices and projections into a single image,
+with detailed control over colorbars, titles and color limits.
+
 .. yt_cookbook:: multi_plot_slice_and_proj.py 
+
+Simple Multi-Plot Multi-Panel
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This demonstrates how to construct a very simple multi-panel image.
+
+.. yt_cookbook:: multi_plot_3x2.py
+
+Advanced Multi-Plot Multi-Panel
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This produces a series of slices of multiple fields with different color maps
+and zlimits, and makes use of the FixedResolutionBuffer. While this is more
+complex than the equivalent plot collection-based solution, it allows for a
+*lot* more flexibility. Every part of the script uses matplotlib commands,
+allowing its full power to be exercised.
+
 .. yt_cookbook:: multi_plot_3x2_FRB.py
-.. yt_cookbook:: multi_plot_3x2.py
-.. yt_cookbook:: multi_plot.py
+
+Projecting Off-Axis
+~~~~~~~~~~~~~~~~~~~
+
+This recipe demonstrates how to take an image-plane line integral along an
+arbitrary axis in a simulation.
+
 .. yt_cookbook:: offaxis_projection.py
+
+Projecting Off-Axis with a Colorbar
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This recipe shows how to generate a colorbar with a projection of a dataset
+from an arbitrary projection angle (so you are not confined to the x, y, and z
+axes).  Please note that this same write_projection function will work with a
+volume rendering to generate a colorbar in the same fashion.  
+
+.. yt_cookbook:: offaxis_projection_colorbar.py
+
+Thin-Slice Projections
+~~~~~~~~~~~~~~~~~~~~~~
+
+This recipe is an example of how to project through only a given data object,
+in this case a thin region, and then display the result.
+
+.. yt_cookbook:: thin_slice_projection.py
+
+Plotting Particles Over Fluids
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This recipe demonstrates how to overplot particles on top of a fluid image.
+
 .. yt_cookbook:: overplot_particles.py
-.. yt_cookbook:: thin_slice_projection.py
+
+
+Overplotting Velocity Vectors
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This recipe demonstrates how to plot velocity vectors on top of a slice.
+
 .. yt_cookbook:: velocity_vectors_on_slice.py
+
+Overplotting Contours
+~~~~~~~~~~~~~~~~~~~~~
+
+This is a simple recipe to show how to open a dataset, plot a slice through it,
+and add contours of another quantity on top.
+
 .. yt_cookbook:: contours_on_slice.py
+
+Styling Radial Profile Plots
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This recipe demonstrates a method of calculating radial profiles for several
+quantities, styling them and saving out the resultant plot.
+
 .. yt_cookbook:: radial_profile_styles.py 
-.. yt_cookbook:: offaxis_projection_colorbar.py
+
+Moving a Volume Rendering Camera
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This recipe shows how to move a camera through the simulation domain, taking
+multiple volume rendering snapshots along the way.
+
 .. yt_cookbook:: camera_movement.py
+
+Zooming into an Image
+~~~~~~~~~~~~~~~~~~~~~
+
+This recipe takes a slice through the most dense point, then creates a number
+of frames as it zooms in.  Note that this particular recipe is provided to
+show how to be more flexible -- adding annotations and the like -- as the
+basic zoomin functionality is provided by the "yt zoomin" command on the
+command line.
+
 .. yt_cookbook:: zoomin_frames.py


diff -r 44e275713ae3e44583aaed685a7c1853db096ccd -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 source/cookbook/contours_on_slice.py
--- a/source/cookbook/contours_on_slice.py
+++ b/source/cookbook/contours_on_slice.py
@@ -1,7 +1,3 @@
-"""
-This is a simple recipe to show how to open a dataset, plot a slice
-through it, and add contours of another quantity on top.
-"""
 from yt.mods import * # set up our namespace
 
 pf = load("GasSloshing/sloshing_nomag2_hdf5_plt_cnt_0150") # load data


diff -r 44e275713ae3e44583aaed685a7c1853db096ccd -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 source/cookbook/multi_plot.py
--- a/source/cookbook/multi_plot.py
+++ b/source/cookbook/multi_plot.py
@@ -1,9 +1,3 @@
-"""
-Title: Simple Multi Plot
-Description: This is a simple recipe to show how to open a dataset and then
-             plot a slice through it, centered at its most dense point.
-Outputs: []
-"""
 from yt.mods import * # set up our namespace
 import matplotlib.colorbar as cb
 


diff -r 44e275713ae3e44583aaed685a7c1853db096ccd -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 source/cookbook/multi_plot_3x2.py
--- a/source/cookbook/multi_plot_3x2.py
+++ b/source/cookbook/multi_plot_3x2.py
@@ -1,9 +1,3 @@
-"""
-Title: Basic 3x2 Multi Plot
-Description: This is a simple recipe to show how to open a dataset and then
-             plot a slice through it, centered at its most dense point.
-Outputs: [RedshiftOutput0006_3x2.png]
-"""
 from yt.mods import * # set up our namespace
 import matplotlib.colorbar as cb
 


diff -r 44e275713ae3e44583aaed685a7c1853db096ccd -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 source/cookbook/multi_plot_3x2_FRB.py
--- a/source/cookbook/multi_plot_3x2_FRB.py
+++ b/source/cookbook/multi_plot_3x2_FRB.py
@@ -1,13 +1,3 @@
-"""
-Title: Advanced 3x2 Multi Plot
-Description: This produces a series of slices of multiple fields with different
-             color maps and zlimits, and makes use of the
-             FixedResolutionBuffer. While this is more complex than the
-             equivalent plot collection-based solution, it allows for a *lot*
-             more flexibility. Every part of the script uses matplotlib
-             commands, allowing its full power to be exercised.
-Outputs: [RedshiftOutput0006_3x2.png]
-"""
 from yt.mods import * # set up our namespace
 import matplotlib.colorbar as cb
 from matplotlib.colors import LogNorm


diff -r 44e275713ae3e44583aaed685a7c1853db096ccd -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 source/cookbook/multi_plot_slice_and_proj.py
--- a/source/cookbook/multi_plot_slice_and_proj.py
+++ b/source/cookbook/multi_plot_slice_and_proj.py
@@ -1,9 +1,3 @@
-"""
-Title: Basic 2x3 Multi Plot
-Description: This is a simple recipe to show how to open a dataset and then
-             plot a slice through it, centered at its most dense point.
-Outputs: 
-"""
 from yt.mods import * # set up our namespace
 import matplotlib.colorbar as cb
 from matplotlib.colors import LogNorm


diff -r 44e275713ae3e44583aaed685a7c1853db096ccd -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 source/cookbook/offaxis_projection_colorbar.py
--- a/source/cookbook/offaxis_projection_colorbar.py
+++ b/source/cookbook/offaxis_projection_colorbar.py
@@ -1,12 +1,3 @@
-"""
-Title: Off-axis Projection
-Description: This recipe shows how to generate a colorbar with a projection of
-             a dataset from an arbitrary projection angle (so you are not
-             confined to the x, y, and z axes).  Please note that this same
-             write_projection function will work with a volume rendering to
-             generate a colorbar in the same fashion.  
-Outputs: [ offaxis_projection_colorbar.png ]
-"""
 from yt.mods import * # set up our namespace
 
 fn = "IsolatedGalaxy/galaxy0030/galaxy0030" # parameter file to load


diff -r 44e275713ae3e44583aaed685a7c1853db096ccd -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 source/cookbook/zoomin_frames.py
--- a/source/cookbook/zoomin_frames.py
+++ b/source/cookbook/zoomin_frames.py
@@ -1,14 +1,3 @@
-"""
-Title: Zooming Movie
-Description: This is a recipe that takes a slice through the most dense point,
-             then creates a bunch of frames as it zooms in.  It's important to
-             note that this particular recipe is provided to show how to be
-             more flexible and add annotations and the like -- the base system,
-             of a zoomin, is provided by the "yt zoomin" command on the command
-             line.
-Outputs: [frame_00000.png, frame_00001.png, frame_00002.png,
-         frame_00003.png, frame_00004.png]
-"""
 from yt.mods import * # set up our namespace
 
 fn = "RedshiftOutput0005" # parameter file to load



https://bitbucket.org/yt_analysis/yt-doc/changeset/8578b1f0d58b/
changeset:   8578b1f0d58b
user:        MatthewTurk
date:        2012-07-25 18:27:41
summary:     Fixing zoomin_frames to use SlicePlot.
affected #:  1 file

diff -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 -r 8578b1f0d58b6b0c2d6f2cf765c6fb0b392f1c7d source/cookbook/zoomin_frames.py
--- a/source/cookbook/zoomin_frames.py
+++ b/source/cookbook/zoomin_frames.py
@@ -1,6 +1,6 @@
 from yt.mods import * # set up our namespace
 
-fn = "RedshiftOutput0005" # parameter file to load
+fn = "IsolatedGalaxy/galaxy0030/galaxy0030" # parameter file to load
 n_frames = 5  # This is the number of frames to make -- below, you can see how
               # this is used.
 min_dx = 40   # This is the minimum size in smallest_dx of our last frame.
@@ -10,11 +10,10 @@
 pf = load(fn) # load data
 frame_template = "frame_%05i" # Template for frame filenames
 
-pc = PlotCollection(pf, "c")
-p = pc.add_slice("Density", "z") # Add our slice, along z
-p.modify["contour"]("Temperature") # We'll contour in temperature -- this kind
-                                    # of modification can't be done on the command
-                                    # line, so that's why we have the recipe!
+p = SlicePlot(pf, "Density", "z") # Add our slice, along z
+p.annotate_contours("Temperature") # We'll contour in temperature -- this kind
+                                   # of modification can't be done on the command
+                                   # line, so that's why we have the recipe!
 
 # What we do now is a bit fun.  "enumerate" returns a tuple for every item --
 # the index of the item, and the item itself.  This saves us having to write
@@ -25,6 +24,6 @@
 for i,v in enumerate(na.logspace(
             0, na.log10(pf.h.get_smallest_dx()*min_dx), n_frames)):
     # We set our width as necessary for this frame ...
-    pc.set_width(v,'1')
+    p.set_width(v, '1')
     # ... and we save!
-    pc.save(frame_template % (i))
+    p.save(frame_template % (i))



https://bitbucket.org/yt_analysis/yt-doc/changeset/5b6609d5eb99/
changeset:   5b6609d5eb99
user:        MatthewTurk
date:        2012-07-25 19:07:05
summary:     Adding note about downloading data and contributing recipes.
affected #:  1 file

diff -r 8578b1f0d58b6b0c2d6f2cf765c6fb0b392f1c7d -r 5b6609d5eb9921e3a696c6c444c3d25aa174d77a source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -9,16 +9,15 @@
 how to do some fairly common tasks -- which can lead to combining these, with
 other Python code, into more complicated and advanced tasks.
 
-All of the data used here was used in the `2012 yt Workshop
-<http://yt-project.org/workshop2012/>`_ and can be found at the workshop home
-page.  This includes FLASH and Enzo datasets, both of which are in evidence
-here.
+All of the data used here is freely available from http://yt-project.org/data/
+where you will find links to download individual datasets.
 
 If you want to take a look at more complex recipes, or submit your own,
 check out the `yt Hub <http://hub.yt-project.org>`_.
 
-.. note::
-   To view at full size, right click on an image and choose "Open in New Tab."
+.. note:: To contribute your own recipes, please 
+   `fork <http://bitbucket.org/yt_analysis/yt-doc/fork>`_
+   the documentation repository!
 
 .. toctree::
    :maxdepth: 2



https://bitbucket.org/yt_analysis/yt-doc/changeset/f8b0b9a8517f/
changeset:   f8b0b9a8517f
user:        MatthewTurk
date:        2012-07-25 19:25:29
summary:     Adding time series recipe from John ZuHone.
affected #:  2 files

diff -r 5b6609d5eb9921e3a696c6c444c3d25aa174d77a -r f8b0b9a8517f6280874e07edadd87e23b6e6ec2b source/cookbook/calculating_information.rst
--- a/source/cookbook/calculating_information.rst
+++ b/source/cookbook/calculating_information.rst
@@ -36,3 +36,12 @@
 calculating the radial velocity within that sphere.
 
 .. yt_cookbook:: rad_velocity.py 
+
+Time Series Analysis
+~~~~~~~~~~~~~~~~~~~~
+
+This recipe shows how to calculate a number of quantities on a set of parameter
+files.  Note that it is parallel-aware; if you only want to run in serial,
+iterating with ``for pf in ts:`` works identically.
+
+.. yt_cookbook:: time_series.py


diff -r 5b6609d5eb9921e3a696c6c444c3d25aa174d77a -r f8b0b9a8517f6280874e07edadd87e23b6e6ec2b source/cookbook/time_series.py
--- /dev/null
+++ b/source/cookbook/time_series.py
@@ -0,0 +1,41 @@
+from yt.mods import *
+import glob
+import matplotlib.pyplot as plt
+
+keV = 1.16044e7
+mue = 1.0/0.875
+m_p = 1.673e-24
+mtt = -2./3.
+
+# Glob for a list of filenames, then sort them
+fns = glob.glob("GasSloshingLowRes/sloshing_low_res_hdf5_plt_cnt_0[0-6][0-9]0")
+fns.sort()
+
+# Construct the time series object
+
+ts = TimeSeriesData.from_filenames(fns)
+
+storage = {}
+
+# We use the piter() method here so that this can be run in parallel.
+# Alternately, you could just iterate "for pf in ts:" and directly append to
+# times and entrs.
+for sto, pf in ts.piter(storage=storage):
+    sphere = pf.h.sphere("c", (100., "kpc"))
+    temp = sphere["Temperature"]/keV
+    dens = sphere["Density"]/(m_p*mue)
+    mgas = sphere["CellMass"]
+    entr = (temp*(dens**mtt)*mgas).sum()/mgas.sum() 
+    sto.result = (pf.current_time, entr)
+
+times = []
+entrs = []
+for k in storage:
+    t, e = storage[k]
+    times.append(t)
+    entrs.append(e)
+
+plt.semilogy(times, entrs, 'x-')
+plt.xlabel("Time")
+plt.ylabel("Entropy")
+plt.savefig("time_versus_entropy.png")
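The mass-weighted entropy proxy computed in the loop above, S = sum(T * rho**(-2/3) * m) / sum(m), can be checked with plain Python lists. A hedged sketch with made-up numbers, not the recipe's dataset:

```python
# Mass-weighted entropy proxy, the same combination the recipe computes
# for each sphere: S = sum(T * rho**mtt * m) / sum(m), with mtt = -2/3.
mtt = -2.0 / 3.0

temp = [1.0, 2.0]   # temperatures (arbitrary units)
dens = [1.0, 8.0]   # densities
mgas = [1.0, 1.0]   # cell masses

weighted = sum(t * (d ** mtt) * m for t, d, m in zip(temp, dens, mgas))
entr = weighted / sum(mgas)
print(entr)
```

With these numbers the second cell contributes 2 * 8**(-2/3) = 0.5, giving S = 0.75.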



https://bitbucket.org/yt_analysis/yt-doc/changeset/b57d3621ff70/
changeset:   b57d3621ff70
user:        MatthewTurk
date:        2012-07-25 20:18:25
summary:     Adding a contributor list to the changelog, inserting my draft of the
2.4 changes, updating a few things to remove PlotCollection, and removing a now
hilariously-outdated IPython section.
affected #:  4 files

diff -r f8b0b9a8517f6280874e07edadd87e23b6e6ec2b -r b57d3621ff70841dcc0c0f706fbfb3688e793931 source/advanced/debugdrive.rst
--- a/source/advanced/debugdrive.rst
+++ b/source/advanced/debugdrive.rst
@@ -125,69 +125,3 @@
 For security reasons, this will only work on local processes; to connect on a
 cluster, you will have to execute the command ``yt rpdb`` on the node on which
 that process was launched.
-
-.. _interactive-parallel:
-
-Interactive Parallel Processing with IPython
---------------------------------------------
-
-IPython is a powerful mechanism not only for interactive usage, but also for
-task delegation and parallel analysis driving.  Using the 
-`IPython Parallel Multi Engine <http://ipython.scipy.org/doc/manual/html/parallel/parallel_multiengine.html>`_
-interface, you can launch multiple 'engines' for computation which can then be
-driven by ``yt``.  However, to do so, you will have to ensure that the IPython
-dependencies for parallel computation are met -- this requires the installation
-of a few components.
-
-  * `PyOpenSSL <http://pyopenssl.sourceforge.net/>`_
-  * `Twisted <http://twistedmatrix.com/trac/>`_
-  * `Foolscap <http://foolscap.lothar.com/trac>`_
-
-Both Twisted and Foolscap can be installed using ``easy_install`` but PyOpenSSL
-requires manual installation.  Of course, ``yt`` itself requires `mpi4py
-<http://code.google.com/p/mpi4py/>`_ to be installed as well, which is
-described in :ref:`parallel-computation`.
-
-The entire section in the IPython manual on
-`parallel computation <http://ipython.scipy.org/doc/manual/html/parallel/index.html>`_
-is essential reading, but for a quick start, you need to launch the engines:
-
-.. code-block:: bash
-
-   $ ipcontroller
-   $ mpirun -np 4 ipengine
-
-This will launch the controller, which interfaces with the new computation
-engines launched afterward.  Note that you can launch an arbitrary number of
-compute processes.  Now, launch IPython:
-
-.. code-block:: bash
-
-   $ ipython
-
-and execute the commands:
-
-.. code-block:: python
-
-   ipcontroller mpirun -np 4 ipengine
-   mec = client.MultiEngineClient()
-   mec.activate()
-
-You have now obtained an object, ``mec``, which is able to interact with and
-control all of the launched engines.  Any command prefixed with the string
-``%px`` will now be issued on all processors.  Any action that would be
-executed in parallel in ``yt`` will be executed in parallel here.  For
-instance,
-
-.. code-block:: python
-
-   %px from yt.mods import *
-   %px pf = load("data0050")
-   %px pc = PlotCollection(pf)
-   %px pc.add_projection("Density", 0)
-
-This will load up the name space, the parameter file, and project through
-``data0050`` in parallel utilizing all of our processors.  IPython can also
-execute commands on a limited subset of hosts, and it can also turn on
-auto-execution, to send all of your commands to all of the compute engines,
-using the ``%autopx`` directive.


diff -r f8b0b9a8517f6280874e07edadd87e23b6e6ec2b -r b57d3621ff70841dcc0c0f706fbfb3688e793931 source/advanced/parallel_computation.rst
--- a/source/advanced/parallel_computation.rst
+++ b/source/advanced/parallel_computation.rst
@@ -52,9 +52,8 @@
    pf = load("RD0035/RedshiftOutput0035")
    v, c = pf.h.find_max("Density")
    print v, c
-   pc = PlotCollection(pf, center = [0.5, 0.5, 0.5])
-   pc.add_projection("Density", 0)
-   pc.save()
+   p = ProjectionPlot(pf, "x", "Density")
+   p.save()
 
 Will execute the finding of the maximum density and the projection in parallel
 if launched in parallel.  To do so, at the command line you would execute
@@ -181,9 +180,8 @@
        sto.result_id = fn
        sto.result = dd.quantities["Extrema"]("Density")
        # Makes and saves a plot of the gas density.
-       pc = PlotCollection(pf, [0.5, 0.5, 0.5])
-       pc.add_projection("Density", 0)
-       pc.save()
+       p = ProjectionPlot(pf, "x", "Density")
+       p.save()
    # At this point, as the loop exits, the local copies of my_storage are
    # combined such that all tasks now have an identical and full version of
    # my_storage. Until this point, each task is unaware of what the other
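The storage-dict bookkeeping described in the comments above can be mimicked in serial with an ordinary dictionary. A minimal sketch of the pattern only, with a stand-in result function rather than yt's ``Extrema`` quantity or real parallelism:

```python
# Serial stand-in for the parallel storage pattern: each "task" fills in
# one (result_id -> result) entry, and after the loop the combined
# dictionary holds every task's result.
class Storage:
    def __init__(self):
        self.result_id = None
        self.result = None

def piter(items, storage):
    for fn in items:
        sto = Storage()
        yield sto, fn
        # Record whatever the loop body stored on this iteration's object.
        key = sto.result_id if sto.result_id is not None else fn
        storage[key] = sto.result

my_storage = {}
for sto, fn in piter(["DD0001", "DD0002"], my_storage):
    sto.result_id = fn
    sto.result = len(fn)  # placeholder for a real derived quantity

print(sorted(my_storage.items()))
```

In real parallel runs the combination step happens across MPI tasks; the generator here only imitates the interface.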


diff -r f8b0b9a8517f6280874e07edadd87e23b6e6ec2b -r b57d3621ff70841dcc0c0f706fbfb3688e793931 source/changelog.rst
--- a/source/changelog.rst
+++ b/source/changelog.rst
@@ -3,8 +3,116 @@
 ChangeLog
 =========
 
+
 This is a non-comprehensive log of changes to the code.
 
+Contributors
+------------
+
+Here are all of the contributors to the code base, in alphabetical order.
+
+ * Tom Abel
+ * David Collins
+ * Andrew Cunningham
+ * Nathan Goldbaum
+ * Cameron Hummels
+ * Ji-hoon Kim
+ * Steffen Klemer
+ * Kacper Kowalik
+ * Michael Kuhlen
+ * Eve Lee
+ * Chris Malone
+ * Chris Moody
+ * Andrew Myers
+ * Jeff Oishi
+ * Jean-Claude Passy
+ * Thomas Robitaille
+ * Anna Rosen
+ * Anthony Scopatz
+ * Devin Silvia
+ * Sam Skillman
+ * Stephen Skory
+ * Britton Smith
+ * Geoffrey So
+ * Casey Stark
+ * Elizabeth Tasker
+ * Matthew Turk
+ * Rick Wagner
+ * John Wise
+ * John ZuHone
+
+Version 2.4
+-----------
+
+The 2.4 release was particularly large, encompassing nearly a thousand
+changesets and a number of new features.
+
+Most Visible Improvements
+~~~~~~~~~~~~~~~~~~~~~~~~~
+
+ * Threaded volume renderer, completely refactored from the ground up for
+   speed and parallelism.
+ * The Plot Window (see :ref:`simple-inspection`) is now fully functional!  No
+   more PlotCollections, and full, easy access to Matplotlib axes objects.
+ * Many improvements to Time Series analysis:
+    * EnzoSimulation now integrates with TimeSeries analysis!
+    * Auto-parallelization of analysis and parallel iteration
+    * Memory usage when iterating over parameter files reduced substantially
+ * Many improvements to Reason, the yt GUI
+    * Addition of "yt reason" as a startup command
+    * Keyboard shortcuts in projection & slice mode: z, Z, x, X for zooms,
+      hjkl, HJKL for motion
+    * Drag to move in projection & slice mode
+    * Contours and vector fields in projection & slice mode
+    * Color map selection in projection & slice mode
+    * 3D Scene
+ * Integration with the all new yt Hub ( http://hub.yt-project.org/ ): upload
+   variable resolution projections, slices, project information, vertices and
+   plot collections right from the yt command line!
+
+Other Changes
+~~~~~~~~~~~~~
+
+ * :class:`~yt.visualization.plot_window.ProjectionPlot` and 
+   :class:`~yt.visualization.plot_window.SlicePlot` supplant the functionality
+   of PlotCollection.
+ * Camera path creation from keyframes and splines
+ * Ellipsoidal data containers and ellipsoidal parameter calculation for halos
+ * PyX and ZeroMQ now available in the install script
+ * Consolidation of unit handling
+ * HDF5 updated to 1.8.7, Mercurial updated to 2.2, IPython updated to 0.12
+ * Preview of integration with Rockstar halo finder
+ * Improvements to merger tree speed and memory usage
+ * Sunrise exporter now compatible with Sunrise 4.0
+ * Particle trajectory calculator now available!
+ * Speed and parallel scalability improvements in projections, profiles and HOP
+ * New Vorticity-related fields
+ * Vast improvements to the ART frontend
+ * Many improvements to the FLASH frontend, including full parameter reads,
+   speedups, and support for more corner cases of FLASH 2, 2.5 and 3 data.
+ * Integration of the Grid Data Format frontend, and a converter for Athena
+   data to this format.
+ * Improvements to command line parsing
+ * Parallel import improvements on parallel filesystems
+   (``from yt.pmods import *``)
+ * proj_style keyword for projections, for Maximum Intensity Projections
+   (``proj_style = "mip"``)
+ * Fisheye rendering for planetarium rendering
+ * Profiles now provide \*_std fields for standard deviation of values
+ * Generalized Orientation class, providing 6DOF motion control
+ * parallel_objects iteration now more robust, provides optional barrier.
+   (Also now being used as underlying iteration mechanism in many internal
+   routines.)
+ * Dynamic load balancing in parallel_objects iteration.
+ * Parallel-aware objects can now be pickled.
+ * Many new colormaps included
+ * Numerous improvements to the PyX-based eps_writer module
+ * FixedResolutionBuffer to FITS export.
+ * Generic image to FITS export.
+ * Multi-level parallelism for extremely large cameras in volume rendering
+ * Light cone and light ray updates to fit with current best practices for
+   parallelism
+
 Version 2.3 
 -----------
 


diff -r f8b0b9a8517f6280874e07edadd87e23b6e6ec2b -r b57d3621ff70841dcc0c0f706fbfb3688e793931 source/visualizing/plots.rst
--- a/source/visualizing/plots.rst
+++ b/source/visualizing/plots.rst
@@ -18,6 +18,8 @@
 collection of plots to be set up, rendered, and saved simultaneously.
 Below we will summarize how to use both plotting interfaces.
 
+.. _simple-inspection:
+
 Simple Data Inspection
 ----------------------
 



https://bitbucket.org/yt_analysis/yt-doc/changeset/b2c9b3615bb7/
changeset:   b2c9b3615bb7
user:        brittonsmith
date:        2012-07-25 17:32:26
summary:     Fixing recipe to keep output files contained.
affected #:  1 file

diff -r 1066492f7861b00c9f3e696da7a9c555ee6d7322 -r b2c9b3615bb7b9ef88bf5a88847e833ef3fd2398 source/cookbook/light_cone_with_halo_mask.py
--- a/source/cookbook/light_cone_with_halo_mask.py
+++ b/source/cookbook/light_cone_with_halo_mask.py
@@ -4,7 +4,7 @@
 from yt.analysis_modules.halo_profiler.api import *
 
 # Instantiate a light cone object as usual.
-lc = LightCone("enzo_tiny_cosmology/32Mpc_32.enzo",
+lc = LightCone("32Mpc_32.enzo",
                'Enzo', 0, 0.1,
                observer_redshift=0.0,
                field_of_view_in_arcminutes=600.0,
@@ -19,7 +19,8 @@
 # Configure the HaloProfiler.
 # These are keyword arguments given when creating a
 # HaloProfiler object.
-halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out'}
+halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out',
+                        'output_dir': 'halo_analysis'}
 
 # Create a list of actions for the HaloProfiler to take.
 halo_profiler_actions = []
@@ -56,8 +57,9 @@
 # Write the boolean map to an hdf5 file called 'halo_mask.h5'.
 # Write a text file detailing the location, redshift, radius, and mass
 # of each halo in light cone projection.
-lc.get_halo_mask(mask_file='halo_mask.h5', map_file='halo_map.out',
-                 cube_file='halo_cube.h5',
+lc.get_halo_mask(mask_file='LC_HM/halo_mask.h5',
+                 map_file='LC_HM/halo_map.out',
+                 cube_file='LC_HM/halo_cube.h5',
                  virial_overdensity=100,
                  halo_profiler_parameters=halo_profiler_parameters,
                  njobs=1, dynamic=False)
@@ -66,5 +68,6 @@
 field = 'SZY'
 
 # Make the light cone projection and apply the halo mask.
-pc = lc.project_light_cone(field, save_stack=True, save_slice_images=True,
+pc = lc.project_light_cone(field, save_stack=False,
+                           save_slice_images=True,
                            apply_halo_mask=True)



https://bitbucket.org/yt_analysis/yt-doc/changeset/5d2b2e025b0b/
changeset:   5d2b2e025b0b
user:        brittonsmith
date:        2012-07-25 17:38:25
summary:     Fixing directories for recipes.
affected #:  2 files

diff -r b2c9b3615bb7b9ef88bf5a88847e833ef3fd2398 -r 5d2b2e025b0bfdb44e1ac1e496ce07219fa7d0fa source/cookbook/light_cone_projection.py
--- a/source/cookbook/light_cone_projection.py
+++ b/source/cookbook/light_cone_projection.py
@@ -7,19 +7,21 @@
 # We have already set up the redshift dumps to be
 # used for this, so we will not use any of the time
 # data dumps.
-cts = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo',
-                'Enzo', 0., 0.1,
-                observer_redshift=0.0,
-                field_of_view_in_arcminutes=600.0,
-                image_resolution_in_arcseconds=60.0,
-                time_data=False)
+lc = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo',
+               'Enzo', 0., 0.1,
+               observer_redshift=0.0,
+               field_of_view_in_arcminutes=600.0,
+               image_resolution_in_arcseconds=60.0,
+               time_data=False)
 
 # Calculate a randomization of the solution.
-cts.calculate_light_cone_solution(seed=123456789)
+lc.calculate_light_cone_solution(seed=123456789)
 
-# Make a light cone projection of the SZ Y parameter.
+# Choose the field to be projected.
+field = 'SZY'
+
 # Set njobs to -1 to have one core work on each projection
 # in parallel.
-cts.project_light_cone('SZY',
-                       save_slice_images=True,
-                       njobs=-1)
+lc.project_light_cone(field, save_stack=False,
+                      save_slice_images=True,
+                      njobs=-1)


diff -r b2c9b3615bb7b9ef88bf5a88847e833ef3fd2398 -r 5d2b2e025b0bfdb44e1ac1e496ce07219fa7d0fa source/cookbook/light_cone_with_halo_mask.py
--- a/source/cookbook/light_cone_with_halo_mask.py
+++ b/source/cookbook/light_cone_with_halo_mask.py
@@ -4,7 +4,7 @@
 from yt.analysis_modules.halo_profiler.api import *
 
 # Instantiate a light cone object as usual.
-lc = LightCone("32Mpc_32.enzo",
+lc = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo',
                'Enzo', 0, 0.1,
                observer_redshift=0.0,
                field_of_view_in_arcminutes=600.0,



https://bitbucket.org/yt_analysis/yt-doc/changeset/e9150e9fb04d/
changeset:   e9150e9fb04d
user:        brittonsmith
date:        2012-07-25 17:48:45
summary:     Finalizing light cone recipes to produce only images.
affected #:  3 files

diff -r 5d2b2e025b0bfdb44e1ac1e496ce07219fa7d0fa -r e9150e9fb04d2faba549fad47aad605eaab5944e source/cookbook/light_cone_projection.py
--- a/source/cookbook/light_cone_projection.py
+++ b/source/cookbook/light_cone_projection.py
@@ -21,7 +21,9 @@
 field = 'SZY'
 
 # Set njobs to -1 to have one core work on each projection
-# in parallel.
+# in parallel.  Set save_slice_images to True to see an
+# image for each individual slice.
 lc.project_light_cone(field, save_stack=False,
-                      save_slice_images=True,
+                      save_final_image=True,
+                      save_slice_images=False,
                       njobs=-1)


diff -r 5d2b2e025b0bfdb44e1ac1e496ce07219fa7d0fa -r e9150e9fb04d2faba549fad47aad605eaab5944e source/cookbook/light_cone_with_halo_mask.py
--- a/source/cookbook/light_cone_with_halo_mask.py
+++ b/source/cookbook/light_cone_with_halo_mask.py
@@ -13,7 +13,8 @@
                output_dir='LC_HM', output_prefix='LightCone')
 
 # Calculate the light cone solution.
-lc.calculate_light_cone_solution(seed=123456789, filename='lightcone.dat')
+lc.calculate_light_cone_solution(seed=123456789,
+                                 filename='LC_HM/lightcone.dat')
 
 
 # Configure the HaloProfiler.
@@ -69,5 +70,6 @@
 
 # Make the light cone projection and apply the halo mask.
 pc = lc.project_light_cone(field, save_stack=False,
-                           save_slice_images=True,
+                           save_final_image=True,
+                           save_slice_images=False,
                            apply_halo_mask=True)


diff -r 5d2b2e025b0bfdb44e1ac1e496ce07219fa7d0fa -r e9150e9fb04d2faba549fad47aad605eaab5944e source/cookbook/unique_light_cone_projections.py
--- a/source/cookbook/unique_light_cone_projections.py
+++ b/source/cookbook/unique_light_cone_projections.py
@@ -17,12 +17,15 @@
 # solutions.  This will save time when making the projection.
 find_unique_solutions(lc, max_overlap=0.10, failures=50,
                       seed=123456789, recycle=True,
-                      solutions=10, filename='unique.dat')
+                      solutions=10, filename='LC_U/unique.dat')
 
+# Choose the field to be projected.
 field = 'SZY'
 
 # Make light cone projections with each of the random seeds
 # found above.  All output files will be written with unique
 # names based on the random seed numbers.
-project_unique_light_cones(lc, 'unique.dat', field,
-                           save_slice_images=True)
+project_unique_light_cones(lc, 'LC_U/unique.dat', field,
+                           save_stack=False,
+                           save_final_image=True,
+                           save_slice_images=False)



https://bitbucket.org/yt_analysis/yt-doc/changeset/a72c64e42b32/
changeset:   a72c64e42b32
user:        brittonsmith
date:        2012-07-25 20:16:59
summary:     Added new halo profiler recipe and removed simulation halo profiler
recipe.  Added recipe demonstrating use of simulation time series.
Filled out cosmological analysis descriptions.
affected #:  6 files

diff -r e9150e9fb04d2faba549fad47aad605eaab5944e -r a72c64e42b3206c3d51a0465fae4070fbf8cc5c3 source/cookbook/cosmological_analysis.rst
--- a/source/cookbook/cosmological_analysis.rst
+++ b/source/cookbook/cosmological_analysis.rst
@@ -1,13 +1,69 @@
 Cosmological Analysis
 ---------------------
 
+These scripts demonstrate some basic and more advanced analysis that can be 
+performed on cosmological simulations.
+
+Simple Halo Finding
+~~~~~~~~~~~~~~~~~~~
+This script shows how to create a halo catalog for a single dataset.
+
 .. yt_cookbook:: halo_finding.py
+
+Plotting Halos
+~~~~~~~~~~~~~~
+This is a mechanism for plotting circles representing identified particle halos
+on an image.
+
 .. yt_cookbook:: halo_plotting.py
+
+Plotting Halo Particles
+~~~~~~~~~~~~~~~~~~~~~~~
+This is a simple mechanism for overplotting the particles belonging only to
+halos.
+
 .. yt_cookbook:: halo_particle_plotting.py
-.. yt_cookbook:: light_cone_projection.py 
+
+Halo Information
+~~~~~~~~~~~~~~~~
+This recipe finds halos and then prints out information about them.
+
+.. yt_cookbook:: halo_mass_info.py
+
+Halo Profiling and Custom Analysis
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+This script demonstrates the three primary uses of the halo profiler: 
+1) radial profiles and filtering; 2) projections; and 3) custom halo 
+analysis.
+
+.. yt_cookbook:: halo_profiler.py
+
+Light Cone Projection
+~~~~~~~~~~~~~~~~~~~~~
+This script creates a light cone projection, a synthetic observation 
+that stacks together projections from multiple datasets to extend over 
+a given redshift interval.
+
+.. yt_cookbook:: light_cone_projection.py
+
+Light Cone with Halo Mask
+~~~~~~~~~~~~~~~~~~~~~~~~~
+This script combines the light cone generator with the halo profiler to 
+make a light cone projection with all of the halos cut out of the image.
+
 .. yt_cookbook:: light_cone_with_halo_mask.py 
+
+Making Unique Light Cones
+~~~~~~~~~~~~~~~~~~~~~~~~~
+This script demonstrates how to make a series of light cone projections
+that only have a maximum amount of volume in common.
+
 .. yt_cookbook:: unique_light_cone_projections.py 
+
+Making Light Rays
+~~~~~~~~~~~~~~~~~
+This script demonstrates how to make a synthetic quasar sight line and 
+uses the halo profiler to record information about halos close to the 
+line of sight.
+
 .. yt_cookbook:: make_light_ray.py 
-.. yt_cookbook:: halo_mass_info.py
-.. yt_cookbook:: run_halo_profiler.py
-.. yt_cookbook:: simulation_halo_profiler.py


diff -r e9150e9fb04d2faba549fad47aad605eaab5944e -r a72c64e42b3206c3d51a0465fae4070fbf8cc5c3 source/cookbook/halo_profiler.py
--- /dev/null
+++ b/source/cookbook/halo_profiler.py
@@ -0,0 +1,48 @@
+from yt.mods import *
+
+from yt.analysis_modules.halo_profiler.api import *
+
+# Define a custom function to be called on all halos.
+# The first argument is a dictionary containing the
+# halo id, center, etc.
+# The second argument is the sphere centered on the halo.
+def get_density_extrema(halo, sphere):
+    my_extrema = sphere.quantities['Extrema']('Density')
+    mylog.info('Halo %d has density extrema: %s',
+               halo['id'], my_extrema)
+
+
+# Instantiate HaloProfiler for this dataset.
+hp = HaloProfiler('enzo_tiny_cosmology/DD0046/DD0046',
+                  output_dir='halo_analysis')
+
+# Add a filter to remove halos that have no profile points with overdensity
+# above 200, and with virial masses less than 1e10 solar masses.
+# Also, return the virial mass and radius to be written out to a file.
+hp.add_halo_filter(amods.halo_profiler.VirialFilter, must_be_virialized=True,
+                   overdensity_field='ActualOverdensity',
+                   virial_overdensity=200,
+                   virial_filters=[['TotalMassMsun', '>=', '1e10']],
+                   virial_quantities=['TotalMassMsun', 'RadiusMpc'])
+
+# Add profile fields.
+hp.add_profile('CellVolume', weight_field=None, accumulation=True)
+hp.add_profile('TotalMassMsun', weight_field=None, accumulation=True)
+hp.add_profile('Density', weight_field='CellMassMsun', accumulation=False)
+hp.add_profile('Temperature', weight_field='CellMassMsun', accumulation=False)
+
+# Make profiles and output filtered halo list to FilteredQuantities.h5.
+hp.make_profiles(filename="FilteredQuantities.h5",
+                 profile_format='hdf5', njobs=-1)
+
+# Add projection fields.
+hp.add_projection('Density', weight_field=None)
+hp.add_projection('Temperature', weight_field='Density')
+hp.add_projection('Metallicity', weight_field='Density')
+
+# Make projections just along the x axis using the filtered halo list.
+hp.make_projections(save_cube=False, save_images=True,
+                    halo_list='filtered', axes=[0], njobs=-1)
+
+# Run our custom analysis function on all halos in the filtered list.
+hp.analyze_halo_spheres(get_density_extrema, njobs=-1)


diff -r e9150e9fb04d2faba549fad47aad605eaab5944e -r a72c64e42b3206c3d51a0465fae4070fbf8cc5c3 source/cookbook/make_light_ray.py
--- a/source/cookbook/make_light_ray.py
+++ b/source/cookbook/make_light_ray.py
@@ -7,6 +7,8 @@
 from yt.analysis_modules.cosmological_observation.light_ray.api import \
      LightRay
 
+os.mkdir('LR')
+     
 # Create a LightRay object extending from z = 0 to z = 0.1
 # and use only the redshift dumps.
 lr = LightRay("enzo_tiny_cosmology/32Mpc_32.enzo",
@@ -17,7 +19,8 @@
 # Configure the HaloProfiler.
 # These are keyword arguments given when creating a
 # HaloProfiler object.
-halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out'}
+halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out',
+                        'output_dir' : 'halo_analysis'}
 
 # Create a list of actions for the HaloProfiler to take.
 halo_profiler_actions = []
@@ -52,8 +55,8 @@
 # Make a light ray, and set njobs to -1 to use one core
 # per dataset.
 lr.make_light_ray(seed=123456789,
-                  solution_filename='lightraysolution.txt',
-                  data_filename='lightray.h5',
+                  solution_filename='LR/lightraysolution.txt',
+                  data_filename='LR/lightray.h5',
                   fields=['Temperature', 'Density'],
                   get_nearest_halo=True,
                   nearest_halo_fields=['TotalMassMsun_100',


diff -r e9150e9fb04d2faba549fad47aad605eaab5944e -r a72c64e42b3206c3d51a0465fae4070fbf8cc5c3 source/cookbook/run_halo_profiler.py
--- a/source/cookbook/run_halo_profiler.py
+++ /dev/null
@@ -1,37 +0,0 @@
-"""
-This is a recipe for making radial profiles and projections of all of the halos 
-within a cosmological simulation.  See :ref:`halo_profiling` for full documentation 
-of the HaloProfiler.
-"""
-from yt.mods import *
-
-# Instantiate HaloProfiler for this dataset.
-hp = amods.halo_profiler.HaloProfiler("DD0242/DD0242")
-
-# Add a filter to remove halos that have no profile points with overdensity 
-# above 200, and with virial masses less than 1e14 solar masses.
-# Also, return the virial mass and radius to be written out to a file.
-hp.add_halo_filter(amods.halo_profiler.VirialFilter,must_be_virialized=True,
-                   overdensity_field='ActualOverdensity',
-                   virial_overdensity=200,
-                   virial_filters=[['TotalMassMsun','>=','1e14']],
-                   virial_quantities=['TotalMassMsun','RadiusMpc'])
-
-# Add profile fields.
-hp.add_profile('CellVolume',weight_field=None,accumulation=True)
-hp.add_profile('TotalMassMsun',weight_field=None,accumulation=True)
-hp.add_profile('Density',weight_field='CellMassMsun',accumulation=False)
-hp.add_profile('Temperature',weight_field='CellMassMsun',accumulation=False)
-
-# Make profiles and output filtered halo list to FilteredQuantities.out.
-hp.make_profiles(filename="FilteredQuantities.out")
-
-# Add projection fields.
-hp.add_projection('Density',weight_field=None)
-hp.add_projection('Temperature',weight_field='Density')
-hp.add_projection('Metallicity',weight_field='Density')
-
-# Make projections for all three axes using the filtered halo list and 
-# save data to hdf5 files.
-hp.make_projections(save_cube=True,save_images=True,
-                    halo_list='filtered',axes=[0,1,2])


diff -r e9150e9fb04d2faba549fad47aad605eaab5944e -r a72c64e42b3206c3d51a0465fae4070fbf8cc5c3 source/cookbook/simulation_analysis.py
--- /dev/null
+++ b/source/cookbook/simulation_analysis.py
@@ -0,0 +1,20 @@
+from yt.mods import *
+
+# Instantiate a time series object for an Enzo simulation.
+my_sim = simulation('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo')
+
+# Get a time series for all data made by the simulation.
+my_sim.get_time_series()
+
+# Calculate and store extrema for all datasets.
+all_storage = {}
+for my_storage, pf in my_sim.piter(storage=all_storage):
+    all_data = pf.h.all_data()
+    my_extrema = all_data.quantities['Extrema']('Density')
+
+    # Save to storage so we can get at it afterward.
+    my_storage.result = my_extrema
+
+# Print out all the values we calculated.
+for my_result in all_storage.values():
+    print my_result


diff -r e9150e9fb04d2faba549fad47aad605eaab5944e -r a72c64e42b3206c3d51a0465fae4070fbf8cc5c3 source/cookbook/simulation_halo_profiler.py
--- a/source/cookbook/simulation_halo_profiler.py
+++ /dev/null
@@ -1,40 +0,0 @@
-"""
-The following recipe will run the HaloProfiler (see :ref:`halo_profiling`) on
-all the datasets in one simulation between z = 10 and 0.
-"""
-from yt.mods import *
-
-es = amods.simulation_handler.EnzoSimulation(
-        "simulation_parameter_file", initial_redshift=10, final_redshift=0)
-
-# Loop over all dataset in the requested time interval.
-for output in es.allOutputs:
-
-    # Instantiate HaloProfiler for this dataset.
-    hp = amods.halo_profiler.HaloProfiler(output['filename'])
-    
-    # Add a virialization filter.
-    hp.add_halo_filter(amods.halo_profiler.VirialFilter,must_be_virialized=True,
-                       overdensity_field='ActualOverdensity',
-                       virial_overdensity=200,
-                       virial_filters=[['TotalMassMsun','>=','1e14']],
-                       virial_quantities=['TotalMassMsun','RadiusMpc'])
-    
-    # Add profile fields.
-    hp.add_profile('CellVolume',weight_field=None,accumulation=True)
-    hp.add_profile('TotalMassMsun',weight_field=None,accumulation=True)
-    hp.add_profile('Density',weight_field="CellMassMsun",accumulation=False)
-    hp.add_profile('Temperature',weight_field='CellMassMsun',accumulation=False)
-    # Make profiles and output filtered halo list to FilteredQuantities.out.
-    hp.make_profiles(filename="FilteredQuantities.out")
-    
-    # Add projection fields.
-    hp.add_projection('Density',weight_field=None)
-    hp.add_projection('Temperature',weight_field='Density')
-    hp.add_projection('Metallicity',weight_field='Density')
-    # Make projections for all three axes using the filtered halo list and 
-    # save data to hdf5 files.
-    hp.make_projections(save_cube=True,save_images=True,
-                        halo_list='filtered',axes=[0,1,2])
-    
-    del hp



https://bitbucket.org/yt_analysis/yt-doc/changeset/1bc04ba24910/
changeset:   1bc04ba24910
user:        MatthewTurk
date:        2012-07-25 20:24:16
summary:     Merged in brittonsmith/yt-doc-worf (pull request #1)
affected #:  9 files

diff -r b57d3621ff70841dcc0c0f706fbfb3688e793931 -r 1bc04ba2491023fd6b35289738e58623faed9375 source/cookbook/cosmological_analysis.rst
--- a/source/cookbook/cosmological_analysis.rst
+++ b/source/cookbook/cosmological_analysis.rst
@@ -1,13 +1,69 @@
 Cosmological Analysis
 ---------------------
 
+These scripts demonstrate some basic and more advanced analysis that can be 
+performed on cosmological simulations.
+
+Simple Halo Finding
+~~~~~~~~~~~~~~~~~~~
+This script shows how to create a halo catalog for a single dataset.
+
 .. yt_cookbook:: halo_finding.py
+
+Plotting Halos
+~~~~~~~~~~~~~~
+This is a mechanism for plotting circles representing identified particle halos
+on an image.
+
 .. yt_cookbook:: halo_plotting.py
+
+Plotting Halo Particles
+~~~~~~~~~~~~~~~~~~~~~~~
+This is a simple mechanism for overplotting the particles belonging only to
+halos.
+
 .. yt_cookbook:: halo_particle_plotting.py
-.. yt_cookbook:: light_cone_projection.py 
+
+Halo Information
+~~~~~~~~~~~~~~~~
+This recipe finds halos and then prints out information about them.
+
+.. yt_cookbook:: halo_mass_info.py
+
+Halo Profiling and Custom Analysis
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+This script demonstrates the three primary uses of the halo profiler: 
+1) radial profiles and filtering; 2) projections; and 3) custom halo 
+analysis.
+
+.. yt_cookbook:: halo_profiler.py
+
+Light Cone Projection
+~~~~~~~~~~~~~~~~~~~~~
+This script creates a light cone projection, a synthetic observation 
+that stacks together projections from multiple datasets to extend over 
+a given redshift interval.
+
+.. yt_cookbook:: light_cone_projection.py
+
+Light Cone with Halo Mask
+~~~~~~~~~~~~~~~~~~~~~~~~~
+This script combines the light cone generator with the halo profiler to 
+make a light cone projection with all of the halos cut out of the image.
+
 .. yt_cookbook:: light_cone_with_halo_mask.py 
+
+Making Unique Light Cones
+~~~~~~~~~~~~~~~~~~~~~~~~~
+This script demonstrates how to make a series of light cone projections
+that share no more than a specified maximum volume in common.
+
 .. yt_cookbook:: unique_light_cone_projections.py 
+
+Making Light Rays
+~~~~~~~~~~~~~~~~~
+This script demonstrates how to make a synthetic quasar sight line and 
+use the halo profiler to record information about halos close to the 
+line of sight.
+
 .. yt_cookbook:: make_light_ray.py 
-.. yt_cookbook:: halo_mass_info.py
-.. yt_cookbook:: run_halo_profiler.py
-.. yt_cookbook:: simulation_halo_profiler.py
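The light cone description above can be illustrated with a minimal, hypothetical sketch of the stacking step: each redshift slice contributes a 2-D projection, and the final image is their element-wise sum. Plain nested lists stand in for the projections yt would produce; none of this is yt API.

```python
# Hypothetical stand-in for the stacking step of a light cone
# projection: sum equally-sized 2-D "slices" element-wise.
def stack_slices(slices):
    rows, cols = len(slices[0]), len(slices[0][0])
    result = [[0.0] * cols for _ in range(rows)]
    for sl in slices:
        for i in range(rows):
            for j in range(cols):
                result[i][j] += sl[i][j]
    return result

# Two toy 2x2 "projections" from different redshift slices.
stacked = stack_slices([[[1, 2], [3, 4]],
                        [[10, 20], [30, 40]]])
```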


diff -r b57d3621ff70841dcc0c0f706fbfb3688e793931 -r 1bc04ba2491023fd6b35289738e58623faed9375 source/cookbook/halo_profiler.py
--- /dev/null
+++ b/source/cookbook/halo_profiler.py
@@ -0,0 +1,48 @@
+from yt.mods import *
+
+from yt.analysis_modules.halo_profiler.api import *
+
+# Define a custom function to be called on all halos.
+# The first argument is a dictionary containing the
+# halo id, center, etc.
+# The second argument is the sphere centered on the halo.
+def get_density_extrema(halo, sphere):
+    my_extrema = sphere.quantities['Extrema']('Density')
+    mylog.info('Halo %d has density extrema: %s',
+               halo['id'], my_extrema)
+
+
+# Instantiate HaloProfiler for this dataset.
+hp = HaloProfiler('enzo_tiny_cosmology/DD0046/DD0046',
+                  output_dir='halo_analysis')
+
+# Add a filter to remove halos that have no profile points with overdensity
+# above 200, and with virial masses less than 1e10 solar masses.
+# Also, return the virial mass and radius to be written out to a file.
+hp.add_halo_filter(amods.halo_profiler.VirialFilter, must_be_virialized=True,
+                   overdensity_field='ActualOverdensity',
+                   virial_overdensity=200,
+                   virial_filters=[['TotalMassMsun', '>=', '1e10']],
+                   virial_quantities=['TotalMassMsun', 'RadiusMpc'])
+
+# Add profile fields.
+hp.add_profile('CellVolume', weight_field=None, accumulation=True)
+hp.add_profile('TotalMassMsun', weight_field=None, accumulation=True)
+hp.add_profile('Density', weight_field='CellMassMsun', accumulation=False)
+hp.add_profile('Temperature', weight_field='CellMassMsun', accumulation=False)
+
+# Make profiles and output filtered halo list to FilteredQuantities.h5.
+hp.make_profiles(filename="FilteredQuantities.h5",
+                 profile_format='hdf5', njobs=-1)
+
+# Add projection fields.
+hp.add_projection('Density', weight_field=None)
+hp.add_projection('Temperature', weight_field='Density')
+hp.add_projection('Metallicity', weight_field='Density')
+
+# Make projections just along the x axis using the filtered halo list.
+hp.make_projections(save_cube=False, save_images=True,
+                    halo_list='filtered', axes=[0], njobs=-1)
+
+# Run our custom analysis function on all halos in the filtered list.
+hp.analyze_halo_spheres(get_density_extrema, njobs=-1)
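The VirialFilter call above keeps only halos above a mass threshold. As a rough, self-contained sketch of that filtering pattern (plain dicts standing in for yt halo objects; the real filter also checks the overdensity profile):

```python
def mass_filter(halos, min_mass=1e10):
    # Keep halos whose total mass meets the threshold, loosely
    # mimicking virial_filters=[['TotalMassMsun', '>=', '1e10']].
    return [h for h in halos if h["TotalMassMsun"] >= min_mass]

# A toy two-halo catalog; only the second halo passes the cut.
catalog = [{"id": 0, "TotalMassMsun": 5.0e9},
           {"id": 1, "TotalMassMsun": 3.0e10}]
filtered = mass_filter(catalog)
```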


diff -r b57d3621ff70841dcc0c0f706fbfb3688e793931 -r 1bc04ba2491023fd6b35289738e58623faed9375 source/cookbook/light_cone_projection.py
--- a/source/cookbook/light_cone_projection.py
+++ b/source/cookbook/light_cone_projection.py
@@ -7,19 +7,23 @@
 # We have already set up the redshift dumps to be
 # used for this, so we will not use any of the time
 # data dumps.
-cts = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo',
-                'Enzo', 0., 0.1,
-                observer_redshift=0.0,
-                field_of_view_in_arcminutes=600.0,
-                image_resolution_in_arcseconds=60.0,
-                time_data=False)
+lc = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo',
+               'Enzo', 0., 0.1,
+               observer_redshift=0.0,
+               field_of_view_in_arcminutes=600.0,
+               image_resolution_in_arcseconds=60.0,
+               time_data=False)
 
 # Calculate a randomization of the solution.
-cts.calculate_light_cone_solution(seed=123456789)
+lc.calculate_light_cone_solution(seed=123456789)
 
-# Make a light cone projection of the SZ Y parameter.
+# Choose the field to be projected.
+field = 'SZY'
+
 # Set njobs to -1 to have one core work on each projection
-# in parallel.
-cts.project_light_cone('SZY',
-                       save_slice_images=True,
-                       njobs=-1)
+# in parallel.  Set save_slice_images to True to see an
+# image for each individual slice.
+lc.project_light_cone(field, save_stack=False,
+                      save_final_image=True,
+                      save_slice_images=False,
+                      njobs=-1)


diff -r b57d3621ff70841dcc0c0f706fbfb3688e793931 -r 1bc04ba2491023fd6b35289738e58623faed9375 source/cookbook/light_cone_with_halo_mask.py
--- a/source/cookbook/light_cone_with_halo_mask.py
+++ b/source/cookbook/light_cone_with_halo_mask.py
@@ -4,7 +4,7 @@
 from yt.analysis_modules.halo_profiler.api import *
 
 # Instantiate a light cone object as usual.
-lc = LightCone("enzo_tiny_cosmology/32Mpc_32.enzo",
+lc = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo',
                'Enzo', 0, 0.1,
                observer_redshift=0.0,
                field_of_view_in_arcminutes=600.0,
@@ -13,13 +13,15 @@
                output_dir='LC_HM', output_prefix='LightCone')
 
 # Calculate the light cone solution.
-lc.calculate_light_cone_solution(seed=123456789, filename='lightcone.dat')
+lc.calculate_light_cone_solution(seed=123456789,
+                                 filename='LC_HM/lightcone.dat')
 
 
 # Configure the HaloProfiler.
 # These are keyword arguments given when creating a
 # HaloProfiler object.
-halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out'}
+halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out',
+                        'output_dir': 'halo_analysis'}
 
 # Create a list of actions for the HaloProfiler to take.
 halo_profiler_actions = []
@@ -56,8 +58,9 @@
 # Write the boolean map to an hdf5 file called 'halo_mask.h5'.
 # Write a text file detailing the location, redshift, radius, and mass
 # of each halo in light cone projection.
-lc.get_halo_mask(mask_file='halo_mask.h5', map_file='halo_map.out',
-                 cube_file='halo_cube.h5',
+lc.get_halo_mask(mask_file='LC_HM/halo_mask.h5',
+                 map_file='LC_HM/halo_map.out',
+                 cube_file='LC_HM/halo_cube.h5',
                  virial_overdensity=100,
                  halo_profiler_parameters=halo_profiler_parameters,
                  njobs=1, dynamic=False)
@@ -66,5 +69,7 @@
 field = 'SZY'
 
 # Make the light cone projection and apply the halo mask.
-pc = lc.project_light_cone(field, save_stack=True, save_slice_images=True,
+pc = lc.project_light_cone(field, save_stack=False,
+                           save_final_image=True,
+                           save_slice_images=False,
                            apply_halo_mask=True)


diff -r b57d3621ff70841dcc0c0f706fbfb3688e793931 -r 1bc04ba2491023fd6b35289738e58623faed9375 source/cookbook/make_light_ray.py
--- a/source/cookbook/make_light_ray.py
+++ b/source/cookbook/make_light_ray.py
@@ -7,6 +7,8 @@
 from yt.analysis_modules.cosmological_observation.light_ray.api import \
      LightRay
 
+os.mkdir('LR')
+     
 # Create a LightRay object extending from z = 0 to z = 0.1
 # and use only the redshift dumps.
 lr = LightRay("enzo_tiny_cosmology/32Mpc_32.enzo",
@@ -17,7 +19,8 @@
 # Configure the HaloProfiler.
 # These are keyword arguments given when creating a
 # HaloProfiler object.
-halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out'}
+halo_profiler_kwargs = {'halo_list_file': 'HopAnalysis.out',
+                        'output_dir' : 'halo_analysis'}
 
 # Create a list of actions for the HaloProfiler to take.
 halo_profiler_actions = []
@@ -52,8 +55,8 @@
 # Make a light ray, and set njobs to -1 to use one core
 # per dataset.
 lr.make_light_ray(seed=123456789,
-                  solution_filename='lightraysolution.txt',
-                  data_filename='lightray.h5',
+                  solution_filename='LR/lightraysolution.txt',
+                  data_filename='LR/lightray.h5',
                   fields=['Temperature', 'Density'],
                   get_nearest_halo=True,
                   nearest_halo_fields=['TotalMassMsun_100',


diff -r b57d3621ff70841dcc0c0f706fbfb3688e793931 -r 1bc04ba2491023fd6b35289738e58623faed9375 source/cookbook/run_halo_profiler.py
--- a/source/cookbook/run_halo_profiler.py
+++ /dev/null
@@ -1,37 +0,0 @@
-"""
-This is a recipe for making radial profiles and projections of all of the halos 
-within a cosmological simulation.  See :ref:`halo_profiling` for full documentation 
-of the HaloProfiler.
-"""
-from yt.mods import *
-
-# Instantiate HaloProfiler for this dataset.
-hp = amods.halo_profiler.HaloProfiler("DD0242/DD0242")
-
-# Add a filter to remove halos that have no profile points with overdensity 
-# above 200, and with virial masses less than 1e14 solar masses.
-# Also, return the virial mass and radius to be written out to a file.
-hp.add_halo_filter(amods.halo_profiler.VirialFilter,must_be_virialized=True,
-                   overdensity_field='ActualOverdensity',
-                   virial_overdensity=200,
-                   virial_filters=[['TotalMassMsun','>=','1e14']],
-                   virial_quantities=['TotalMassMsun','RadiusMpc'])
-
-# Add profile fields.
-hp.add_profile('CellVolume',weight_field=None,accumulation=True)
-hp.add_profile('TotalMassMsun',weight_field=None,accumulation=True)
-hp.add_profile('Density',weight_field='CellMassMsun',accumulation=False)
-hp.add_profile('Temperature',weight_field='CellMassMsun',accumulation=False)
-
-# Make profiles and output filtered halo list to FilteredQuantities.out.
-hp.make_profiles(filename="FilteredQuantities.out")
-
-# Add projection fields.
-hp.add_projection('Density',weight_field=None)
-hp.add_projection('Temperature',weight_field='Density')
-hp.add_projection('Metallicity',weight_field='Density')
-
-# Make projections for all three axes using the filtered halo list and 
-# save data to hdf5 files.
-hp.make_projections(save_cube=True,save_images=True,
-                    halo_list='filtered',axes=[0,1,2])


diff -r b57d3621ff70841dcc0c0f706fbfb3688e793931 -r 1bc04ba2491023fd6b35289738e58623faed9375 source/cookbook/simulation_analysis.py
--- /dev/null
+++ b/source/cookbook/simulation_analysis.py
@@ -0,0 +1,20 @@
+from yt.mods import *
+
+# Instantiate a simulation object for an Enzo simulation.
+my_sim = simulation('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo')
+
+# Get a time series for all data made by the simulation.
+my_sim.get_time_series()
+
+# Calculate and store extrema for all datasets.
+all_storage = {}
+for my_storage, pf in my_sim.piter(storage=all_storage):
+    all_data = pf.h.all_data()
+    my_extrema = all_data.quantities['Extrema']('Density')
+
+    # Save to storage so we can get at it afterward.
+    my_storage.result = my_extrema
+
+# Print out all the values we calculated.
+for my_result in all_storage.values():
+    print my_result
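The piter(storage=...) idiom above has a simple serial analogue: each iteration deposits its result into a shared dict, and results are collected afterward in a deterministic order. A stdlib-only sketch (the min/max pair stands in for the 'Extrema' quantity):

```python
all_storage = {}
# Each "dataset" computes its extrema and saves them to storage.
for name, data in [("DD0040", [0.5, 9.0]), ("DD0030", [1.0, 5.0])]:
    all_storage[name] = (min(data), max(data))

# Collect results afterward, sorted by dataset name.
ordered = [all_storage[k] for k in sorted(all_storage)]
```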


diff -r b57d3621ff70841dcc0c0f706fbfb3688e793931 -r 1bc04ba2491023fd6b35289738e58623faed9375 source/cookbook/simulation_halo_profiler.py
--- a/source/cookbook/simulation_halo_profiler.py
+++ /dev/null
@@ -1,40 +0,0 @@
-"""
-The following recipe will run the HaloProfiler (see :ref:`halo_profiling`) on
-all the datasets in one simulation between z = 10 and 0.
-"""
-from yt.mods import *
-
-es = amods.simulation_handler.EnzoSimulation(
-        "simulation_parameter_file", initial_redshift=10, final_redshift=0)
-
-# Loop over all dataset in the requested time interval.
-for output in es.allOutputs:
-
-    # Instantiate HaloProfiler for this dataset.
-    hp = amods.halo_profiler.HaloProfiler(output['filename'])
-    
-    # Add a virialization filter.
-    hp.add_halo_filter(amods.halo_profiler.VirialFilter,must_be_virialized=True,
-                       overdensity_field='ActualOverdensity',
-                       virial_overdensity=200,
-                       virial_filters=[['TotalMassMsun','>=','1e14']],
-                       virial_quantities=['TotalMassMsun','RadiusMpc'])
-    
-    # Add profile fields.
-    hp.add_profile('CellVolume',weight_field=None,accumulation=True)
-    hp.add_profile('TotalMassMsun',weight_field=None,accumulation=True)
-    hp.add_profile('Density',weight_field="CellMassMsun",accumulation=False)
-    hp.add_profile('Temperature',weight_field='CellMassMsun',accumulation=False)
-    # Make profiles and output filtered halo list to FilteredQuantities.out.
-    hp.make_profiles(filename="FilteredQuantities.out")
-    
-    # Add projection fields.
-    hp.add_projection('Density',weight_field=None)
-    hp.add_projection('Temperature',weight_field='Density')
-    hp.add_projection('Metallicity',weight_field='Density')
-    # Make projections for all three axes using the filtered halo list and 
-    # save data to hdf5 files.
-    hp.make_projections(save_cube=True,save_images=True,
-                        halo_list='filtered',axes=[0,1,2])
-    
-    del hp


diff -r b57d3621ff70841dcc0c0f706fbfb3688e793931 -r 1bc04ba2491023fd6b35289738e58623faed9375 source/cookbook/unique_light_cone_projections.py
--- a/source/cookbook/unique_light_cone_projections.py
+++ b/source/cookbook/unique_light_cone_projections.py
@@ -17,12 +17,15 @@
 # solutions.  This will save time when making the projection.
 find_unique_solutions(lc, max_overlap=0.10, failures=50,
                       seed=123456789, recycle=True,
-                      solutions=10, filename='unique.dat')
+                      solutions=10, filename='LC_U/unique.dat')
 
+# Choose the field to be projected.
 field = 'SZY'
 
 # Make light cone projections with each of the random seeds
 # found above.  All output files will be written with unique
 # names based on the random seed numbers.
-project_unique_light_cones(lc, 'unique.dat', field,
-                           save_slice_images=True)
+project_unique_light_cones(lc, 'LC_U/unique.dat', field,
+                           save_stack=False,
+                           save_final_image=True,
+                           save_slice_images=False)



https://bitbucket.org/yt_analysis/yt-doc/changeset/3f5902e12fb5/
changeset:   3f5902e12fb5
user:        MatthewTurk
date:        2012-07-25 20:26:32
summary:     Adding simulation_analysis to the cookbook
affected #:  1 file

diff -r 1bc04ba2491023fd6b35289738e58623faed9375 -r 3f5902e12fb54cfae44b9acb56522db387ea7540 source/cookbook/calculating_information.rst
--- a/source/cookbook/calculating_information.rst
+++ b/source/cookbook/calculating_information.rst
@@ -37,6 +37,15 @@
 
 .. yt_cookbook:: rad_velocity.py 
 
+Simulation Analysis
+~~~~~~~~~~~~~~~~~~~
+
+This uses :class:`~yt.data_objects.time_series.SimulationTimeSeries` to
+calculate the extrema of a series of outputs, whose names it guesses in
+advance.  This will run in parallel and take advantage of multiple MPI tasks.
+
+.. yt_cookbook:: simulation_analysis.py
+
 Time Series Analysis
 ~~~~~~~~~~~~~~~~~~~~
 



https://bitbucket.org/yt_analysis/yt-doc/changeset/85d160e29fff/
changeset:   85d160e29fff
user:        MatthewTurk
date:        2012-07-25 21:42:26
summary:     Fixing a number of recipes.  Volume rendering recipes still have issues.
affected #:  9 files

diff -r 3f5902e12fb54cfae44b9acb56522db387ea7540 -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d helper_scripts/run_recipes.sh
--- a/helper_scripts/run_recipes.sh
+++ b/helper_scripts/run_recipes.sh
@@ -7,7 +7,7 @@
     echo ${sb}.done
     [ -e ${sb}.done ] && continue
     echo ${sb}
-    python2.7 ${ROOT}/${s} || exit
+    python2.7 ${ROOT}/${s} --config serialize=false || exit
     for o in *.png *.txt *.h5 *.dat
     do
         mv -v ${o} ${ROOT}/source/cookbook/_static/${sb%%.py}__${o}
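The rename above stores each recipe's outputs under a `recipe__output` name so the yt_cookbook directive can glob for them later. The same transformation in Python (a sketch of the shell's `${sb%%.py}__${o}`):

```python
import os

def prefixed_name(script, output):
    # Strip the .py suffix like ${sb%%.py}, then join with "__".
    base = os.path.basename(script)
    if base.endswith(".py"):
        base = base[:-len(".py")]
    return "%s__%s" % (base, output)
```

For example, `prefixed_name("halo_finding.py", "projection.png")` gives the `_static` filename `halo_finding__projection.png`.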


diff -r 3f5902e12fb54cfae44b9acb56522db387ea7540 -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d source/cookbook/camera_movement.py
--- a/source/cookbook/camera_movement.py
+++ b/source/cookbook/camera_movement.py
@@ -1,19 +1,17 @@
 from yt.mods import * # set up our namespace
    
 # Follow the simple_volume_rendering cookbook for the first part of this.
-fn = "RedshiftOutput0005" # parameter file to load
-pf = load(fn) # load data
-dd = pf.h.all_data()
-mi, ma = dd.quantities["Extrema"]("Density")[0]
+pf = load("IsolatedGalaxy/galaxy0030/galaxy0030") # load data
 
 # Set up transfer function
-tf = ColorTransferFunction((na.log10(mi), na.log10(ma)))
-tf.add_layers(6, w=0.05)
+tf = ColorTransferFunction((-30.0, -24.0))
+tf.plot("hi2.png")
+tf.add_layers(10, w=0.01)
 
 # Set up camera paramters
 c = [0.5, 0.5, 0.5] # Center
 L = [1, 1, 1] # Normal Vector
-W = 1.0 # Width
+W = 0.5 # Width
 Nvec = 512 # Pixels on a side
 
 # Specify a north vector, which helps with rotations.
@@ -24,20 +22,21 @@
 
 # Initialize the Camera
 cam = pf.h.camera(c, L, W, (Nvec,Nvec), tf, north_vector=north_vector)
+cam.snapshot("hi.png", 4.0)
 frame = 0
 
 # Do a rotation over 30 frames
-for i, snapshot in enumerate(cam.rotation(na.pi, 30)):
+for i, snapshot in enumerate(cam.rotation(na.pi, 5, clip_ratio=4.0)):
     write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
     frame += 1
 
 # Move to the maximum density location over 10 frames
-for i, snapshot in enumerate(cam.move_to(max_c, 10)):
+for i, snapshot in enumerate(cam.move_to(max_c, 5, clip_ratio=4.0)):
     write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
     frame += 1
 
 # Zoom in by a factor of 10 over 10 frames
-for i, snapshot in enumerate(cam.zoomin(10.0, 10)):
+for i, snapshot in enumerate(cam.zoomin(10.0, 5, clip_ratio=4.0)):
     write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
     frame += 1
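The loops above share one running frame counter across all three camera moves; the `%04i` conversion zero-pads the counter so the frame files sort lexicographically in numeric order. A quick check of that naming scheme:

```python
# Zero-padded frame names, as used by the camera movement recipe.
frame_template = "camera_movement_%04i.png"
names = [frame_template % frame for frame in range(12)]
# Padding keeps lexicographic order equal to numeric order.
```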
 


diff -r 3f5902e12fb54cfae44b9acb56522db387ea7540 -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d source/cookbook/extract_fixed_resolution_data.py
--- a/source/cookbook/extract_fixed_resolution_data.py
+++ b/source/cookbook/extract_fixed_resolution_data.py
@@ -3,20 +3,18 @@
 # For this example we will use h5py to write to our output file.
 import h5py
 
-fn = "RedshiftOutput0005" # parameter file to load
-pf = load(fn) # load data
+pf = load("IsolatedGalaxy/galaxy0030/galaxy0030")
 
-# This is the resolution we will extract at
-DIMS = 128
+level = 2
+dims = pf.domain_dimensions * pf.refine_by**level
 
 # Now, we construct an object that describes the data region and structure we
 # want
 cube = pf.h.covering_grid(2, # The level we are willing to extract to; higher
                              # levels than this will not contribute to the data!
                           left_edge=[0.0, 0.0, 0.0], 
-                          # How many dimensions along each axis
-                          dims=[DIMS,DIMS,DIMS],
+                          dims=dims,
+                          # And any fields to preload (this is optional!)
                           fields=["Density"])
 
 # Now we open our output file using h5py


diff -r 3f5902e12fb54cfae44b9acb56522db387ea7540 -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d source/cookbook/halo_profiler.py
--- a/source/cookbook/halo_profiler.py
+++ b/source/cookbook/halo_profiler.py
@@ -14,7 +14,7 @@
 
 # Instantiate HaloProfiler for this dataset.
 hp = HaloProfiler('enzo_tiny_cosmology/DD0046/DD0046',
-                  output_dir='halo_analysis')
+                  output_dir='.')
 
 # Add a filter to remove halos that have no profile points with overdensity
 # above 200, and with virial masses less than 1e10 solar masses.


diff -r 3f5902e12fb54cfae44b9acb56522db387ea7540 -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d source/cookbook/multi_plot.py
--- a/source/cookbook/multi_plot.py
+++ b/source/cookbook/multi_plot.py
@@ -28,7 +28,7 @@
 p = pc.add_slice("Temperature", "x", figure=fig, axes=axes[0][1], use_colorbar=False)
 p.set_cmap("hot") # a different colormap
 
-pc.set_width(5.0, 'mpc') # change width of both plots
+pc.set_width(1.0, 'mpc') # change width of both plots
 
 # Each 'p' is a plot -- this is the Density plot and the Temperature plot.
 # Each 'cax' is a colorbar-container, into which we'll put a colorbar.


diff -r 3f5902e12fb54cfae44b9acb56522db387ea7540 -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d source/cookbook/multi_plot_3x2.py
--- a/source/cookbook/multi_plot_3x2.py
+++ b/source/cookbook/multi_plot_3x2.py
@@ -31,8 +31,6 @@
     p.set_zlim(1e3, 3e4) # Set this so it's the same for all.
     p.set_cmap("hot") # a different colormap
 
-pc.set_width(5.0, 'mpc') # change width of both plots
-
 # Each 'p' is a plot -- this is the Density plot and the Temperature plot.
 # Each 'cax' is a colorbar-container, into which we'll put a colorbar.
 # zip means, give these two me together.  Note that it cuts off after the


diff -r 3f5902e12fb54cfae44b9acb56522db387ea7540 -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d source/cookbook/multi_plot_3x2_FRB.py
--- a/source/cookbook/multi_plot_3x2_FRB.py
+++ b/source/cookbook/multi_plot_3x2_FRB.py
@@ -6,9 +6,10 @@
 
 
 pf = load(fn) # load data
+v, c = pf.h.find_max("Density")
 
 # set up our Fixed Resolution Buffer parameters: a width, resolution, and center
-width = (7.0, "mpc")
+width = (1.0, 'unitary')
 res = [1000, 1000]
 #  get_multi_plot returns a containing figure, a list-of-lists of axes
 #   into which we can place plots, and some axes that we'll put
@@ -28,7 +29,7 @@
 plots = []
 for ax in range(3):
     sli = pf.h.slice(ax, c[ax])
-    frb = sli.to_frb(width, res, center=c, periodic=True)
+    frb = sli.to_frb(width, res, center=c)
     den_axis = axes[ax][0]
     temp_axis = axes[ax][1]
 


diff -r 3f5902e12fb54cfae44b9acb56522db387ea7540 -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d source/cookbook/multi_plot_slice_and_proj.py
--- a/source/cookbook/multi_plot_slice_and_proj.py
+++ b/source/cookbook/multi_plot_slice_and_proj.py
@@ -16,7 +16,7 @@
 #   bw is the base-width in inches, but 4 is about right for most cases.
 fig, axes, colorbars = get_multi_plot(3, 2, colorbar=orient, bw = 4)
 
-slc = pf.h.slice(2, 0.0, fields=["Density","Temperature","z-velocity"], 
+slc = pf.h.slice(2, 0.0, fields=["Density","Temperature","VelocityMagnitude"], 
                  center=pf.domain_center)
 proj = pf.h.proj(2, "Density", weight_field="Density", center=pf.domain_center)
 
@@ -40,8 +40,8 @@
          dens_axes[1].imshow(proj_frb["Density"], origin='lower', norm=LogNorm()),
          temp_axes[0].imshow(slc_frb["Temperature"], origin='lower'),    
          temp_axes[1].imshow(proj_frb["Temperature"], origin='lower'),
-         vels_axes[0].imshow(slc_frb["z-velocity"], origin='lower'),
-         vels_axes[1].imshow(proj_frb["z-velocity"], origin='lower')]
+         vels_axes[0].imshow(slc_frb["VelocityMagnitude"], origin='lower'),
+         vels_axes[1].imshow(proj_frb["VelocityMagnitude"], origin='lower')]
          
 plots[0].set_clim((1.0e-27,1.0e-25))
 plots[0].set_cmap("bds_highcontrast")
@@ -58,7 +58,7 @@
 
 titles=[r'$\mathrm{Density}\ (\mathrm{g\ cm^{-3}})$', 
         r'$\mathrm{Temperature}\ (\mathrm{K})$',
-        r'$\mathrm{z-velocity}\ (\mathrm{cm\ s^{-1}})$']
+        r'$\mathrm{VelocityMagnitude}\ (\mathrm{cm\ s^{-1}})$']
 
 for p, cax, t in zip(plots[0:6:2], colorbars, titles):
     cbar = fig.colorbar(p, cax=cax, orientation=orient)


diff -r 3f5902e12fb54cfae44b9acb56522db387ea7540 -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d source/cookbook/zoomin_frames.py
--- a/source/cookbook/zoomin_frames.py
+++ b/source/cookbook/zoomin_frames.py
@@ -10,8 +10,8 @@
 pf = load(fn) # load data
 frame_template = "frame_%05i" # Template for frame filenames
 
-p = SlicePlot(pf, "Density", "z") # Add our slice, along z
-p.annotate_contours("Temperature") # We'll contour in temperature -- this kind
+p = SlicePlot(pf, "z", "Density") # Add our slice, along z
+p.annotate_contour("Temperature") # We'll contour in temperature -- this kind
                                    # of modification can't be done on the command
                                    # line, so that's why we have the recipe!
 



https://bitbucket.org/yt_analysis/yt-doc/changeset/112d4ae5e2f7/
changeset:   112d4ae5e2f7
user:        MatthewTurk
date:        2012-07-25 21:59:41
summary:     Bug fix in light ray for multiple runs of the same recipe.
affected #:  1 file

diff -r 85d160e29fff3f32a63276bee443f1f4eb28fc9d -r 112d4ae5e2f7235a3270c772c679c1a5553c8b7a source/cookbook/make_light_ray.py
--- a/source/cookbook/make_light_ray.py
+++ b/source/cookbook/make_light_ray.py
@@ -7,7 +7,7 @@
 from yt.analysis_modules.cosmological_observation.light_ray.api import \
      LightRay
 
-os.mkdir('LR')
+if not os.path.isdir("LR"): os.mkdir('LR')
      
 # Create a LightRay object extending from z = 0 to z = 0.1
 # and use only the redshift dumps.
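Creating the output directory unconditionally with os.mkdir fails the second time a recipe runs; the guarded form in this bugfix is idempotent. A self-contained sketch (using a temporary directory rather than the recipe's 'LR'):

```python
import os
import tempfile

out = os.path.join(tempfile.mkdtemp(), "LR")
for _ in range(2):  # safe to run repeatedly
    if not os.path.isdir(out):
        os.mkdir(out)
```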



https://bitbucket.org/yt_analysis/yt-doc/changeset/e716e295b688/
changeset:   e716e295b688
user:        MatthewTurk
date:        2012-07-26 17:41:48
summary:     Changing back a few things in the camera recipe.
affected #:  1 file

diff -r 112d4ae5e2f7235a3270c772c679c1a5553c8b7a -r e716e295b688ddf613c6991020bc37c0ff61f1e5 source/cookbook/camera_movement.py
--- a/source/cookbook/camera_movement.py
+++ b/source/cookbook/camera_movement.py
@@ -2,16 +2,17 @@
    
 # Follow the simple_volume_rendering cookbook for the first part of this.
 pf = load("IsolatedGalaxy/galaxy0030/galaxy0030") # load data
+dd = pf.h.all_data()
+mi, ma = dd.quantities["Extrema"]("Density")[0]
 
 # Set up transfer function
-tf = ColorTransferFunction((-30.0, -24.0))
-tf.plot("hi2.png")
-tf.add_layers(10, w=0.01)
+tf = ColorTransferFunction((na.log10(mi), na.log10(ma)))
+tf.add_layers(6, w=0.05)
 
 # Set up camera paramters
 c = [0.5, 0.5, 0.5] # Center
 L = [1, 1, 1] # Normal Vector
-W = 0.5 # Width
+W = 1.0 # Width
 Nvec = 512 # Pixels on a side
 
 # Specify a north vector, which helps with rotations.
@@ -22,22 +23,21 @@
 
 # Initialize the Camera
 cam = pf.h.camera(c, L, W, (Nvec,Nvec), tf, north_vector=north_vector)
-cam.snapshot("hi.png", 4.0)
 frame = 0
 
-# Do a rotation over 30 frames
-for i, snapshot in enumerate(cam.rotation(na.pi, 5, clip_ratio=4.0)):
+# Do a rotation over 5 frames
+for i, snapshot in enumerate(cam.rotation(na.pi, 5, clip_ratio = 8.0)):
+    write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
+    frame += 1
+
+# Move to the maximum density location over 5 frames
+for i, snapshot in enumerate(cam.move_to(max_c, 5, clip_ratio = 8.0)):
+    write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
+    frame += 1
+
+# Zoom in by a factor of 10 over 5 frames
+for i, snapshot in enumerate(cam.zoomin(10.0, 5, clip_ratio = 8.0)):
     write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
     frame += 1
 
-# Move to the maximum density location over 10 frames
-for i, snapshot in enumerate(cam.move_to(max_c, 5, clip_ratio=4.0)):
-    write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
-    frame += 1
 
-# Zoom in by a factor of 10 over 10 frames
-for i, snapshot in enumerate(cam.zoomin(10.0, 5, clip_ratio=4.0)):
-    write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
-    frame += 1
-
-



https://bitbucket.org/yt_analysis/yt-doc/changeset/56e69a18c6a7/
changeset:   56e69a18c6a7
user:        MatthewTurk
date:        2012-07-26 22:33:12
summary:     Adding content to time series analysis, changing a few things in a couple recipes, turning back on API generation, and adding data files to the cookbook extension.
affected #:  8 files

diff -r e716e295b688ddf613c6991020bc37c0ff61f1e5 -r 56e69a18c6a7f212740969c5a550050017effd77 extensions/yt_cookbook.py
--- a/extensions/yt_cookbook.py
+++ b/extensions/yt_cookbook.py
@@ -15,6 +15,8 @@
     setup.config = app.config
     setup.confdir = app.confdir
 
+data_patterns = ["*.h5", "*.out", "*.dat"]
+
 class CookbookScript(Directive):
     required_arguments = 1
     optional_arguments = 0
@@ -33,8 +35,10 @@
             os.makedirs(dest_dir) # no problem here for me, but just use built-ins
 
         rel_dir = os.path.relpath(rst_dir, setup.confdir)
+        place = os.path.join(dest_dir, rel_dir)
+        if not os.path.isdir(place): os.makedirs(place)
         shutil.copyfile(os.path.join(rst_dir, script_fn),
-                        os.path.join(dest_dir, rel_dir, script_bn))
+                        os.path.join(place, script_bn))
 
         im_path = os.path.join(rst_dir, "_static")
         images = sorted(glob.glob(os.path.join(im_path, "%s__*.png" % script_name)))
@@ -51,5 +55,15 @@
             lines.append("   :width: 400")
             lines.append("   :target: ../_images/%s" % os.path.basename(im))
             lines.append("\n")
+        lines.append("\n")
+        for ext in data_patterns:
+            data_files = sorted(glob.glob(os.path.join(
+                im_path, "%s__%s" % (script_name, ext))))
+            for df in data_files:
+                df_bn = os.path.basename(df)
+                shutil.copyfile(os.path.join(rst_dir, df),
+                                os.path.join(dest_dir, rel_dir, df_bn))
+                lines.append(" * Data: `%s <%s>`__" % (df_bn, df))
+            lines.append("\n")
         self.state_machine.insert_input(lines, rst_file)
         return []
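The directive globs for a recipe's data files using the `script__output` convention produced by run_recipes.sh. Since each entry of data_patterns already carries its wildcard (e.g. "*.h5"), joining it directly to the script name yields the intended pattern; fnmatch can sanity-check this (a sketch, not the extension's code):

```python
import fnmatch

data_patterns = ["*.h5", "*.out", "*.dat"]

def data_glob(script_name, ext_pattern):
    # ext_pattern already carries the wildcard, e.g. "*.h5".
    return "%s__%s" % (script_name, ext_pattern)

matched = fnmatch.fnmatch("halo_profiler__FilteredQuantities.h5",
                          data_glob("halo_profiler", data_patterns[0]))
```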


diff -r e716e295b688ddf613c6991020bc37c0ff61f1e5 -r 56e69a18c6a7f212740969c5a550050017effd77 source/advanced/ionization_cube.py
--- a/source/advanced/ionization_cube.py
+++ b/source/advanced/ionization_cube.py
@@ -8,9 +8,7 @@
 def IonizedHydrogen(field, data):
     return data["HII_Density"]/(data["HI_Density"]+data["HII_Density"])
 
-filenames = glob.glob("SED800/DD*/*.hierarchy")
-filenames.sort()
-ts = TimeSeriesData.from_filenames(filenames, parallel = 8)
+ts = TimeSeriesData.from_filenames("SED800/DD*/*.hierarchy", parallel = 8)
 
 ionized_z = na.zeros(ts[0].domain_dimensions, dtype="float32")
 


diff -r e716e295b688ddf613c6991020bc37c0ff61f1e5 -r 56e69a18c6a7f212740969c5a550050017effd77 source/advanced/parallel_computation.rst
--- a/source/advanced/parallel_computation.rst
+++ b/source/advanced/parallel_computation.rst
@@ -128,6 +128,8 @@
 saved to a list. The list is then split up and each MPI task performs parts of
 it independently.
 
+.. _parallelizing-your-analysis:
+
 Parallelizing Your Analysis
 ---------------------------
 
@@ -195,6 +197,8 @@
 This example above can be modified to loop over anything that can be saved to
 a Python list: halos, data files, arrays, and more.
 
+.. _parallel-time-series-analysis:
+
 Parallel Time Series Analysis
 -----------------------------
 
@@ -208,9 +212,7 @@
 .. code-block:: python
 
    from yt.mods import *
-   all_files = glob.glob("DD*/output_*")
-   all_files.sort()
-   ts = TimeSeries.from_filenames(all_files, Parallel = True)
+   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel=True)
    sphere = ts.sphere("max", (1.0, "pc"))
    L_vecs = sphere.quantities["AngularMomentumVector"]()
 
@@ -225,19 +227,17 @@
 .. code-block:: python
 
    from yt.mods import *
-   all_files = glob.glob("DD*/output_*")
-   all_files.sort()
-   ts = TimeSeries.from_filenames(all_files, Parallel = True)
+   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel=True)
    my_storage = {}
    for sto,pf in ts.piter(storage=my_storage):
-	sphere = pf.h.sphere("max", (1.0, "pc"))
-	L_vec = sphere.quantities["AngularMomentumVector"]()
-	sto.result_id = pf.parameter_filename
-	sto.result = L_vec
+       sphere = pf.h.sphere("max", (1.0, "pc"))
+       L_vec = sphere.quantities["AngularMomentumVector"]()
+       sto.result_id = pf.parameter_filename
+       sto.result = L_vec
 
    L_vecs = []
    for fn, L_vec in sorted(my_storage.items()):
-	L_vecs.append(L_vec)
+       L_vecs.append(L_vec)
 
 
 You can also request a fixed number of processors to calculate each
@@ -247,9 +247,7 @@
 .. code-block:: python
 
    from yt.mods import *
-   all_files = glob.glob("DD*/output_*")
-   all_files.sort()
-   ts = TimeSeries.from_filenames(all_files, Parallel = 4)
+   ts = TimeSeriesData.from_filenames("DD*/output_*", parallel = 4)
    sphere = ts.sphere("max", (1.0, "pc"))
    L_vecs = sphere.quantities["AngularMomentumVector"]()
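[Editor's note] The ``piter(storage=...)`` pattern in the hunks above files one result per dataset under an id, then collects the results in sorted order after the loop. A serial stand-in sketch (``FakeStorage`` and ``fake_piter`` are invented; yt's real ``piter`` additionally distributes datasets across MPI tasks):

```python
class FakeStorage(object):
    """Stand-in for the ``sto`` object that ``piter`` yields."""
    def __init__(self):
        self.result_id = None
        self.result = None

def fake_piter(filenames, storage):
    # Mimic ts.piter(storage=...): yield a storage slot per dataset,
    # then file the result under result_id once the loop body has run.
    for fn in filenames:
        sto = FakeStorage()
        yield sto, fn
        storage[sto.result_id] = sto.result

my_storage = {}
for sto, fn in fake_piter(["DD0002/output_0002", "DD0001/output_0001"], my_storage):
    sto.result_id = fn
    sto.result = fn.split("/")[0]  # stand-in for an expensive per-dataset quantity

# Collect results in a deterministic (sorted) order, as in the docs.
L_vecs = [v for _, v in sorted(my_storage.items())]
```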
 


diff -r e716e295b688ddf613c6991020bc37c0ff61f1e5 -r 56e69a18c6a7f212740969c5a550050017effd77 source/analyzing/time_series_analysis.rst
--- a/source/analyzing/time_series_analysis.rst
+++ b/source/analyzing/time_series_analysis.rst
@@ -48,9 +48,47 @@
 
 This will create a new time series, populated with the output files ``DD0030``
 and ``DD0040``.  This object, here called ``ts``, can now be analyzed in bulk.
+Alternatively, you can specify a pattern that is supplied to :mod:`glob`, and
+those filenames will be sorted and returned.  Here is an example:
+
+.. code-block:: python
+
+   from yt.mods import *
+   ts = TimeSeriesData.from_filenames("*/*.hierarchy")
+
+Analyzing Each Dataset In Sequence
+----------------------------------
+
+The :class:`~yt.data_objects.time_series.TimeSeriesData` object has two primary
+methods of iteration.  The first is a simple loop in which each dataset
+object is returned in turn:
+
+.. code-block:: python
+
+   from yt.mods import *
+   ts = TimeSeriesData.from_filenames("*/*.hierarchy")
+   for pf in ts:
+       print pf.current_time
+
+This can also operate in parallel, using
+:meth:`~yt.data_objects.time_series.TimeSeriesData.piter`.  For more examples,
+see:
+
+ * :ref:`parallel-time-series-analysis`
+ * The cookbook recipe for :ref:`time-series-analysis`
+ * :class:`~yt.data_objects.time_series.TimeSeriesData`
+
+Prepared Time Series Analysis
+-----------------------------
+
+A few handy functions for treating time series data as a uniform, single object
+are also available.
+
+.. warning:: The future of these functions is uncertain: they may be removed 
+   in a later release!
 
 Simple Analysis Tasks
----------------------
+~~~~~~~~~~~~~~~~~~~~~
 
 The available tasks that come built-in can be seen by looking at the output of
 ``ts.tasks.keys()``.  For instance, one of the simplest ones is the
@@ -60,9 +98,7 @@
 .. code-block:: python
 
    from yt.mods import *
-   all_files = glob.glob("*/*.hierarchy")
-   all_files.sort()
-   ts = TimeSeries.from_filenames(all_files)
+   ts = TimeSeriesData.from_filenames("*/*.hierarchy")
    max_rho = ts.tasks["MaximumValue"]("Density")
 
 When we call the task, the time series object executes the task on each
@@ -72,7 +108,7 @@
 list of analysis tasks.
 
 Analysis Tasks Applied to Objects
----------------------------------
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 Just as some tasks can be applied to datasets as a whole, one can also apply
 the creation of objects to datasets.  This means that you are able to construct
@@ -87,9 +123,7 @@
 .. code-block:: python
 
    from yt.mods import *
-   all_files = glob.glob("*/*.hierarchy")
-   all_files.sort()
-   ts = TimeSeries.from_filenames(all_files)
+   ts = TimeSeriesData.from_filenames("*/*.hierarchy")
    sphere = ts.sphere("max", (1.0, "pc"))
    L_vecs = sphere.quantities["AngularMomentumVector"]()
 
@@ -104,7 +138,7 @@
 the same manner as "sphere" was used above.
 
 Creating Analysis Tasks
------------------------
+~~~~~~~~~~~~~~~~~~~~~~~
 
 If you wanted to look at the mass in star particles as a function of time, you
 would write a function that accepts params and pf and then decorate it with


diff -r e716e295b688ddf613c6991020bc37c0ff61f1e5 -r 56e69a18c6a7f212740969c5a550050017effd77 source/conf.py
--- a/source/conf.py
+++ b/source/conf.py
@@ -241,4 +241,4 @@
                        'http://matplotlib.sourceforge.net/': None,
                        }
 
-#autosummary_generate = glob.glob("api/api.rst")
+autosummary_generate = glob.glob("api/api.rst")


diff -r e716e295b688ddf613c6991020bc37c0ff61f1e5 -r 56e69a18c6a7f212740969c5a550050017effd77 source/cookbook/calculating_information.rst
--- a/source/cookbook/calculating_information.rst
+++ b/source/cookbook/calculating_information.rst
@@ -46,6 +46,8 @@
 
 .. yt_cookbook:: simulation_analysis.py
 
+.. _time-series-analysis:
+
 Time Series Analysis
 ~~~~~~~~~~~~~~~~~~~~
 


diff -r e716e295b688ddf613c6991020bc37c0ff61f1e5 -r 56e69a18c6a7f212740969c5a550050017effd77 source/cookbook/camera_movement.py
--- a/source/cookbook/camera_movement.py
+++ b/source/cookbook/camera_movement.py
@@ -27,12 +27,12 @@
 
 # Do a rotation over 5 frames
 for i, snapshot in enumerate(cam.rotation(na.pi, 5, clip_ratio = 8.0)):
-    write_bitmap(snapshot, 'camera_movement_%04.png' % frame)
+    write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
     frame += 1
 
 # Move to the maximum density location over 5 frames
 for i, snapshot in enumerate(cam.move_to(max_c, 5, clip_ratio = 8.0)):
-    write_bitmap(snapshot, 'camera_movement_%04.png' % frame)
+    write_bitmap(snapshot, 'camera_movement_%04i.png' % frame)
     frame += 1
 
 # Zoom in by a factor of 10 over 5 frames
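[Editor's note] The fix above matters because ``'%04.png' % frame`` is not a valid format string (there is no conversion character after the width, so it raises ``ValueError``), while ``%04i`` zero-pads the frame counter so the written files sort in order:

```python
# '%04i' means: integer conversion, zero-padded to width 4 -- so frame
# numbers sort correctly as filenames.
frames = ['camera_movement_%04i.png' % n for n in (0, 7, 12)]
```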


diff -r e716e295b688ddf613c6991020bc37c0ff61f1e5 -r 56e69a18c6a7f212740969c5a550050017effd77 source/cookbook/index.rst
--- a/source/cookbook/index.rst
+++ b/source/cookbook/index.rst
@@ -27,4 +27,4 @@
    complex_plots
    cosmological_analysis
    constructing_data_objects
-
+   advanced



https://bitbucket.org/yt_analysis/yt-doc/changeset/1c7708950891/
changeset:   1c7708950891
user:        MatthewTurk
date:        2012-07-26 22:33:18
summary:     Merging
affected #:  8 files

diff -r 56e69a18c6a7f212740969c5a550050017effd77 -r 1c770895089118dc74bf771378db81d15bd2157e source/analysis_modules/analyzing_an_entire_simulation.rst
--- a/source/analysis_modules/analyzing_an_entire_simulation.rst
+++ /dev/null
@@ -1,151 +0,0 @@
-.. _analyzing-an-entire-simulation:
-
-Analyzing an Entire Simulation
-==============================
-.. sectionauthor:: Britton Smith <britton.smith at colorado.edu>
-
-The EnzoSimulation class provides a simple framework for performing the same 
-analysis on multiple datasets in a single simulation.  At its most basic, an 
-EnzoSimulation object gives you access to a time-ordered list of datasets over 
-the time or redshift interval of your choosing.  It also includes more 
-sophisticated machinery for stitching together cosmological datasets to create 
-a continuous volume spanning a given redshift interval.  This is the engine that 
-powers the light cone generator (see :ref:`light-cone-generator`).  See 
-:ref:`cookbook-simulation_halo_profiler` for an example of using the EnzoSimulation 
-class to run the HaloProfiler on multiple datasets within a simulation.
-
-EnzoSimulation Options
-----------------------
-
-The only argument required to instantiate an EnzoSimulation is the path to the 
-parameter file used to run the simulation:
-
-.. code-block:: python
-
-  import yt.analysis_modules.simulation_handler.api as ES
-  es = ES.EnzoSimulation("my_simulation.par")
-
-The EnzoSimulation object will then read through the simulation parameter file 
-to figure out what datasets are available and where they are located.  Comment 
-characters are respected, so commented-out lines will be ignored.  If no time 
-and/or redshift interval is specified using the keyword arguments listed below, 
-the EnzoSimulation object will create a time-ordered list of all datasets.
-
-.. note:: For cosmological simulations, the interval of interest can be
-   specified with a combination of time and redshift keywords.
-
-The additional keyword options are:
-
- * **initial_time** (*float*): the initial time in code units for the
-   dataset list.  Default: None.
-
- * **final_time** (*float*): the final time in code units for the dataset
-   list.  Default: None.
-
- * **initial_redshift** (*float*): the initial (highest) redshift for the
-   dataset list.  Only for cosmological simulations.  Default: None.
-
- * **final_redshift** (*float*): the final (lowest) redshift for the dataset
-   list.  Only for cosmological simulations.  Default: None.
-
- * **links** (*bool*): if True, each entry in the dataset list will
-   contain entries, *previous* and *next*, that point to the previous and next
-   entries on the dataset list.  Default: False.
-
- * **enzo_parameters** (*dict*): a dictionary specify additional
-   parameters to be retrieved from the parameter file.  The format should be the
-   name of the parameter as the key and the variable type as the value.  For
-   example, {'CosmologyComovingBoxSize':float}.  All parameter values will be
-   stored in the dictionary attribute, *enzoParameters*.  Default: None.
-
- * **get_time_outputs** (*bool*): if False, the time datasets, specified
-   in Enzo with the *dtDataDump*, will not be added to the dataset list.  Default:
-   True.
-
- * **get_redshift_outputs** (*bool*): if False, the redshift datasets will
-   not be added to the dataset list.  Default: True.
-
-.. warning:: The EnzoSimulation object will use the *GlobalDir* Enzo parameter
-   to determine the absolute path to the data, so make sure this is set correctly
-   if the data has been moved.  If this parameter is not present in the parameter
-   file, the code will look for the data in the current directory.
-
-The Dataset List
-----------------
-
-The primary attribute of an EnzoSimulation object is the dataset list, 
-*allOutputs*.  Each list item is a dictionary, containing the time, redshift 
-(if cosmological), and filename of the dataset.
-
-.. code-block:: python
-
-  >>> es.allOutputs[0]
-  {'filename': '/Users/britton/EnzoRuns/cool_core_unreasonable/RD0000/RD0000',
-   'time': 0.81631644849936602, 'redshift': 99.0}
-
-Now, analyzing each dataset is easy:
-
-.. code-block:: python
-
-  for output in es.allOutputs:
-      # load up a dataset
-      pf = load(output['filename'])
-      # do something!
-
-Cosmology Splices
------------------
-
-For cosmological simulations, the physical width of the simulation box 
-corresponds to some :math:`\Delta z`, which varies with redshift.  Using this 
-logic, one can stitch together a series of datasets to create a continuous 
-volume or length element from one redshift to another.  The 
-:meth:`create_cosmology_splice` method will return such a list:
-
-.. code-block:: python
-
-  cosmo = es.create_cosmology_splice(minimal=True, deltaz_min=0.0, initial_redshift=1.0, final_redshift=0.0)
-
-The returned list is of the same format as the *allOutputs* attribute.  The 
-keyword arguments are:
-
- * **minimal** (*bool*): if True, the minimum number of datasets is used
-   to connect the initial and final redshift.  If false, the list will contain as
-   many entries as possible within the redshift interval.  Default: True.
-
- * **deltaz_min** (*float*): specifies the minimum :math:`\Delta z` between
-   consecutive datasets in the returned list.  Default: 0.0.
-
- * **initial_redshift** (*float*): the initial (highest) redshift in the
-   cosmology splice list.  If none given, the highest redshift dataset present
-   will be used.  Default: None.
-
- * **final_redshift** (*float*): the final (lowest) redshift in the
-   cosmology splice list.  If none given, the lowest redshift dataset present will
-   be used.  Default: None.
-
-The most well known application of this function is the
-:ref:`light-cone-generator`.
-
-Planning a Cosmological Simulation
-----------------------------------
-
-If you want to run a cosmological simulation that will have just enough data outputs 
-to create a cosmology splice, the :meth:`imagine_minimal_splice` method will calculate 
-a list of redshifts outputs that will minimally connect a redshift interval.
-
-.. code-block:: python
-
-  initial_redshift = 0.4
-  final_redshift = 0.0 
-  outputs = es.imagine_minimal_splice(initial_redshift, final_redshift, filename='outputs.out')
-
-This function will return a list of dictionaries with "redshift" and "deltazMax" 
-entries.  The keyword arguments are:
-
- * **decimals** (*int*): The decimal place to which the output redshift will be rounded.  If the decimal place in question is nonzero, the redshift will be rounded up to ensure continuity of the splice.  Default: 3.
-
- * **filename** (*str*): If provided, a file will be written with the redshift outputs in the form in which they should be given in the enzo parameter file.  Default: None.
-
- * **redshift_output_string** (*str*): The parameter accompanying the redshift outputs in the enzo parameter file.  Default: "CosmologyOutputRedshift".
-
- * **start_index** (*int*): The index of the first redshift output.  Default: 0.


diff -r 56e69a18c6a7f212740969c5a550050017effd77 -r 1c770895089118dc74bf771378db81d15bd2157e source/analysis_modules/halo_profiling.rst
--- a/source/analysis_modules/halo_profiling.rst
+++ b/source/analysis_modules/halo_profiling.rst
@@ -5,30 +5,26 @@
 .. sectionauthor:: Britton Smith <brittonsmith at gmail.com>,
    Stephen Skory <s at skory.us>
 
-The halo profiler provides a means of performing analysis on multiple points in a dataset at 
-once.  This is primarily intended for use with cosmological simulations, in which  
-gravitationally bound structures composed of dark matter and gas, called halos, form and 
-become the hosts for galaxies and galaxy clusters.
+The ``HaloProfiler`` provides a means of performing analysis on multiple halos 
+in a parallel-safe way.
 
-The halo profiler performs two primary functions: radial profiles and projections.  
-The halo profiler can be run in parallel, with `mpi4py
-<http://code.google.com/p/mpi4py/>`_ installed, by running 
-your script inside an mpirun call with the --parallel flag at the end.
+The halo profiler performs three primary functions: radial profiles, 
+projections, and custom analysis.  See the cookbook for a recipe demonstrating 
+all of these features.
 
 Configuring the Halo Profiler
 -----------------------------
 
-A sample script to run the halo profiler can be found in :ref:`cookbook-run_halo_profiler`.  
-In order to run the halo profiler on a dataset, a halo profiler object must be instantiated 
-with the path to the dataset as the only argument:
+The only argument required to create a ``HaloProfiler`` object is the path 
+to the dataset.
 
 .. code-block:: python
 
   from yt.analysis_modules.halo_profiler.api import *
-  hp = HaloProfiler("DD0242/DD0242")
+  hp = HaloProfiler("enzo_tiny_cosmology/DD0046/DD0046")
 
-Most of the halo profiler's options are configured with keyword arguments given at 
-instantiation.  These options are:
+Most of the halo profiler's options are configured with additional keyword 
+arguments:
 
  * **output_dir** (*str*): if specified, all output will be put into this path
    instead of in the dataset directories.  Default: None.
@@ -98,22 +94,23 @@
    calculated (used for calculation of radial and tangential velocities).  Valid
    options are:
    - ["bulk", "halo"] (Default): the velocity provided in the halo list
-   - ["bulk", "sphere"]: the bulk velocity of the sphere centered on the halo center.
+   - ["bulk", "sphere"]: the bulk velocity of the sphere centered on the halo 
+center.
    - ["max", field]: the velocity of the cell that is the location of the maximum of the field specified.
 
  * **filter_quantities** (*list*): quantities from the original halo list
    file to be written out in the filtered list file.  Default: ['id','center'].
 
- * **use_critical_density** (*bool*): if True, the definition of overdensity for 
-   virial quantities is calculated with respect to the critical density.  If False, 
-   overdensity is with respect to mean matter density, which is lower by a factor 
-   of Omega_M.  Default: False.
+ * **use_critical_density** (*bool*): if True, the definition of overdensity 
+   for virial quantities is calculated with respect to the critical 
+   density.  If False, overdensity is with respect to mean matter density, 
+   which is lower by a factor of Omega_M.  Default: False.
 
 Profiles
 --------
 
-Once the halo profiler object has been instantiated, fields can be added for profiling with 
-the :meth:`add_profile` method:
+Once the halo profiler object has been instantiated, fields can be added for 
+profiling with the :meth:`add_profile` method:
 
 .. code-block:: python
 
@@ -121,13 +118,35 @@
   hp.add_profile('TotalMassMsun', weight_field=None, accumulation=True)
   hp.add_profile('Density', weight_field=None, accumulation=False)
   hp.add_profile('Temperature', weight_field='CellMassMsun', accumulation=False)
-  hp.make_profiles(njobs=-1)
+  hp.make_profiles(njobs=-1, prefilters=["halo['mass'] > 1e13"],
+                   filename='VirialQuantities.h5')
 
 The :meth:`make_profiles` method will begin the profiling.  Use the
 **njobs** keyword to control the number of jobs over which the
 profiling is divided.  Setting to -1 results in a single processor per
 halo.  Setting to 1 results in all available processors working on the
-same halo.
+same halo.  The prefilters keyword tells the profiler to skip all halos with 
+masses (as loaded from the halo finder) less than a given amount.  See below 
+for more information.  Additional keyword arguments are:
+
+ * **filename** (*str*): If set, a file will be written with all of the 
+   filtered halos and the quantities returned by the filter functions.
+   Default: None.
+
+ * **prefilters** (*list*): A single dataset can contain thousands or tens of 
+   thousands of halos. Significant time can be saved by not profiling halos
+   that are certain to not pass any filter functions in place.  Simple filters 
+   based on quantities provided in the initial halo list can be used to filter 
+   out unwanted halos using this parameter.  Default: None.
+
+ * **njobs** (*int*): The number of jobs over which to split the profiling.  
+   Set to -1 so that each halo is done by a single processor.  Default: -1.
+
+ * **dynamic** (*bool*): If True, distribute halos using a task queue.  If 
+   False, distribute halos evenly over all jobs.  Default: False.
+
+ * **profile_format** (*str*): The file format for the radial profiles, 
+   'ascii' or 'hdf5'.  Default: 'ascii'.
 
 .. image:: _images/profiles.png
    :width: 500
@@ -145,7 +164,7 @@
   hp.add_projection('Temperature', weight_field='Density')
   hp.add_projection('Metallicity', weight_field='Density')
   hp.make_projections(axes=[0, 1, 2], save_cube=True, save_images=True, 
-                                    halo_list="filtered", njobs=-1)
+                      halo_list="filtered", njobs=-1)
 
 If **save_cube** is set to True, the projection data
 will be written to a set of hdf5 files 
@@ -158,7 +177,26 @@
 discussion of filtering halos.  Use the **njobs** keyword to control
 the number of jobs over which the profiling is divided.  Setting to -1
 results in a single processor per halo.  Setting to 1 results in all
-available processors working on the same halo.
+available processors working on the same halo.  The keyword arguments are:
+
+ * **axes** (*list*): A list of the axes to project along, using the usual 
+   0,1,2 convention. Default=[0,1,2].
+
+ * **halo_list** (*str*) {'filtered', 'all'}: Which set of halos to make 
+   profiles of, either ones passed by the halo filters (if enabled/added), or 
+   all halos.  Default='filtered'.
+
+ * **save_images** (*bool*): Whether or not to save images of the projections. 
+   Default=False.
+
+ * **save_cube** (*bool*): Whether or not to save the HDF5 files of the halo 
+   projections.  Default=True.
+
+ * **njobs** (*int*): The number of jobs over which to split the projections.  
+   Set to -1 so that each halo is done by a single processor.  Default: -1.
+
+ * **dynamic** (*bool*): If True, distribute halos using a task queue.  If 
+   False, distribute halos evenly over all jobs.  Default: False.
 
 .. image:: _images/projections.png
    :width: 500
@@ -228,8 +266,8 @@
 
   hp.make_profiles(filename="FilteredQuantities.out")
 
-If the **filename** keyword is set, a file will be written with all of the filtered halos 
-and the quantities returned by the filter functions.
+If the **filename** keyword is set, a file will be written with all of the 
+filtered halos and the quantities returned by the filter functions.
 
 .. note:: If the profiles have already been run, the halo profiler will read
    in the previously created output files instead of re-running the profiles.
@@ -284,8 +322,10 @@
 
 .. code-block:: python
 
-   hp = HaloProfiler("data0092", recenter="Max_Dark_Matter_Density")
+   hp = HaloProfiler("enzo_tiny_cosmology/DD0046/DD0046", 
+                     recenter="Max_Dark_Matter_Density")
 
+Additional options are:
 
   * *Min_Dark_Matter_Density* - Recenter on the point of minimum dark matter
     density in the halo.
@@ -338,7 +378,7 @@
        ma, mini, mx, my, mz, mg = sphere.quantities['MinLocation']('Temperature')
        return [mx,my,mz]
    
-   hp = HaloProfiler("data0092", recenter=find_min_temp)
+   hp = HaloProfiler("enzo_tiny_cosmology/DD0046/DD0046", recenter=find_min_temp)
 
 It is possible to make more complicated functions. This example below extends
 the example above to include a distance control that prevents the center from
@@ -362,20 +402,8 @@
        if d > 5.: return [-1, -1, -1]
        return [mx,my,mz]
    
-   hp = HaloProfiler("data0092", recenter=find_min_temp_dist)
-
-.. warning::
-
-   If the halo profiler is run in parallel, and a recentering function is used
-   that is user-defined, two flags need to be set in the ``quantities`` call
-   as in the example below. The built-in recentering functions have
-   these flags set already.
-   
-   .. code-block:: python
-      
-      ma, mini, mx, my, mz, mg = sphere.quantities['MinLocation']('Temperature',
-        lazy_reader=True, preload=False)
-
+   hp = HaloProfiler("enzo_tiny_cosmology/DD0046/DD0046", 
+                     recenter=find_min_temp_dist)
 
 Custom Halo Analysis
 --------------------
@@ -410,7 +438,8 @@
 .. code-block:: python
 
     hp.analyze_halo_sphere(halo_2D_profile, halo_list='filtered',
-        analysis_output_dir='2D_profiles', njobs=-1)
+                           analysis_output_dir='2D_profiles', 
+                           njobs=-1, dynamic=False)
 
 Just like with the :meth:`make_projections` function, the keyword,
 **halo_list**, can be used to select between the full list of halos
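[Editor's note] The string prefilters added in the hunks above (e.g. ``"halo['mass'] > 1e13"``) are expressions evaluated per halo before any expensive profiling runs. A toy sketch of that filtering (the halo data is invented, and yt's actual evaluation machinery may differ):

```python
halos = [{"id": 0, "mass": 5e12}, {"id": 1, "mass": 2e13}, {"id": 2, "mass": 8e13}]

def apply_prefilters(halos, prefilters):
    # Keep only halos for which every prefilter expression evaluates true,
    # so expensive profiling is skipped for the rest.
    kept = []
    for halo in halos:
        if all(eval(expr, {"halo": halo}) for expr in prefilters):
            kept.append(halo)
    return kept

selected = apply_prefilters(halos, ["halo['mass'] > 1e13"])
```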


diff -r 56e69a18c6a7f212740969c5a550050017effd77 -r 1c770895089118dc74bf771378db81d15bd2157e source/analysis_modules/index.rst
--- a/source/analysis_modules/index.rst
+++ b/source/analysis_modules/index.rst
@@ -8,11 +8,11 @@
    :maxdepth: 2
 
    running_halofinder
-   analyzing_an_entire_simulation
    hmf_howto
    halo_profiling
    light_cone_generator
    light_ray_generator
+   planning_cosmology_simulations
    absorption_spectrum
    star_analysis
    halo_mass_function


diff -r 56e69a18c6a7f212740969c5a550050017effd77 -r 1c770895089118dc74bf771378db81d15bd2157e source/analysis_modules/light_cone_generator.rst
--- a/source/analysis_modules/light_cone_generator.rst
+++ b/source/analysis_modules/light_cone_generator.rst
@@ -4,71 +4,82 @@
 ====================
 .. sectionauthor:: Britton Smith <brittonsmith at gmail.com>
 
-Light cones are projections made by stacking multiple datasets together to continuously span a 
-given redshift interval.  The width of individual projection slices is adjusted such that each slice 
-has the same angular size.  Each projection slice is randomly shifted and projected along a random 
-axis to ensure that the same structures are not sampled multiple times.  Since deeper images sample 
-earlier epochs of the simulation, light cones represent the closest thing to synthetic imaging 
-observations.
-
-As with most things yt, the light cone functionality can be run in parallel with 
-`mpi4py <http://code.google.com/p/mpi4py/>`_ installed, by running your script inside an mpirun call 
-with the --parallel flag at the end.
+Light cones are projections made by stacking multiple datasets together to 
+continuously span a given redshift interval.  The width of individual 
+projection slices is adjusted such that each slice has the same angular size.  
+Each projection slice is randomly shifted and projected along a random axis to 
+ensure that the same structures are not sampled multiple times.  Since deeper 
+images sample earlier epochs of the simulation, light cones represent the 
+closest thing to synthetic imaging observations.
 
 .. image:: _images/LightCone_full_small.png
    :width: 500
 
-A light cone projection of the thermal Sunyaev-Zeldovich Y parameter from z = 0 to 0.4 with a 
-450x450 arcminute field of view using 9 individual slices.  The panels shows the contributions from 
-the 9 individual slices with the final light cone image shown in the bottom, right.
+A light cone projection of the thermal Sunyaev-Zeldovich Y parameter from 
+z = 0 to 0.4 with a 450x450 arcminute field of view using 9 individual 
+slices.  The panels show the contributions from the 9 individual slices, with 
+the final light cone image shown at the bottom right.
 
 Configuring the Light Cone Generator
 ------------------------------------
 
-A recipe for creating a simple light cone projection can be found in :ref:`cookbook-make_light_cone`.  
-Light cone projections are made from objects of the LightCone class.  The only required argument for 
-instantiation is the parameter file used to run the simulation, although a few keyword arguments are 
-technically required for anything interesting to happen:
+A recipe for creating a simple light cone projection can be found in the 
+cookbook.  The required arguments to instantiate a ``LightCone`` object are 
+the path to the simulation parameter file, the simulation type, the nearest 
+redshift, and the furthest redshift of the light cone.
 
 .. code-block:: python
 
-  import yt.analysis_modules.lightcone.api as LC
+  from yt.analysis_modules.api import LightCone
 
-  lc = LC.LightCone("128Mpc256grid_SFFB.param", initial_redshift=0.4, final_redshift=0.0, 
-                    observer_redshift=0.0, field_of_view_in_arcminutes=450.0, 
-                    image_resolution_in_arcseconds=60.0)
+  lc = LightCone('enzo_tiny_cosmology/32Mpc_32.enzo',
+                 'Enzo', 0., 0.1)
 
-The complete list of keyword arguments for instantiation is given below:
+The additional keyword arguments are:
 
- * **initial_redshift** (*float*): the initial (highest) redshift for the light cone.  Default: 1.0.
+ * **field_of_view_in_arcminutes** (*float*): The field of view of the image 
+   in units of arcminutes.  Default: 600.0.
 
- * **final_redshift** (*float*): the final (lowest) redshift for the light cone.  Default: 0.0.
+ * **image_resolution_in_arcseconds** (*float*): The size of each image pixel 
+   in units of arcseconds.  Default: 60.0.
 
- * **observer_redshift** (*float*): the redshift of the observer.  Default: 0.0.
+ * **use_minimum_datasets** (*bool*):  If True, the minimum number of datasets 
+   is used to connect the initial and final redshift.  If false, the light 
+   cone solution will contain as many entries as possible within the redshift 
+   interval.  Default: True.
 
- * **field_of_view_in_arcminutes** (*float*): the field of view of the image in units of arcminutes.  Default: 600.0.
+ * **deltaz_min** (*float*): Specifies the minimum Delta-z between 
+   consecutive datasets in the returned list.  Default: 0.0.
 
- * **image_resolution_in_arcseconds** (*float*): the size of each image pixel in units of arcseconds.  Default: 60.0.
+ * **minimum_coherent_box_fraction** (*float*): Used with use_minimum_datasets 
+   set to False, this parameter specifies the fraction of the total box size 
+   to be traversed before rerandomizing the projection axis and center.  This 
+   was invented to allow light cones with thin slices to sample coherent large 
+   scale structure, but in practice does not work so well.  Try setting this 
+   parameter to 1 and see what happens.  Default: 0.0.
 
- * **use_minimum_datasets** (*bool*): if True, the minimum number of datasets is used to connect the initial and final redshift.  If false, the light cone solution will contain as many entries as possible within the redshift interval.  Default: True.
+ * **time_data** (*bool*): Whether or not to include time outputs when 
+   gathering datasets for time series.  Default: True.
 
- * **deltaz_min** (*float*): specifies the minimum :math:`\Delta z` between consecutive datasets in the returned list.  Default: 0.0.
+ * **redshift_data** (*bool*): Whether or not to include redshift outputs when 
+   gathering datasets for time series.  Default: True.
 
- * **minimum_coherent_box_fraction** (*float*): used with **use_minimum_datasets** set to False, this parameter specifies the fraction of the total box size to be traversed before rerandomizing the projection axis and center.  This was invented to allow light cones with thin slices to sample coherent large scale structure, but in practice does not work so well.  Try setting this parameter to 1 and see what happens.  Default: 0.0.
+ * **set_parameters** (*dict*): Dictionary of parameters to attach to 
+   pf.parameters.  Default: None.
 
- * **output_dir** (*str*): the directory in which images and data files will be written.  Default: 'LC'.
+ * **output_dir** (*str*): The directory in which images and data files
+   will be written.  Default: 'LC'.
 
- * **output_prefix** (*str*): the prefix of all images and data files.  Default: 'LightCone'.
+ * **output_prefix** (*str*): The prefix of all images and data files.
+   Default: 'LightCone'.
 
 Creating Light Cone Solutions
 -----------------------------
 
-A light cone solution consists of a list of datasets and the width, depth, center, and axis of the 
-projection to be made for that slice.  The LightCone class is a subclass of EnzoSimulation 
-(see :ref:`analyzing-an-entire-simulation`).  As such, the initial selection of the list of datasets 
-to be used in a light cone solution is done with the :meth:`EnzoSimulation.create_cosmology_splice`.  
-The :meth:`LightCone.calculate_light_cone_solution` is used to calculated the random shifting and 
-projection axis:
+A light cone solution consists of a list of datasets and the width, depth, 
+center, and axis of the projection to be made for that slice.  The 
+:meth:`LightCone.calculate_light_cone_solution` function is used to 
+calculate the random shifting and projection axis:
 
 .. code-block:: python
 
@@ -76,9 +87,12 @@
 
 The keyword arguments are:
 
- * **seed** (*int*): the seed for the random number generator.  Any light cone solution can be reproduced by giving the same random seed.  Default: None (each solution will be distinct).
+ * **seed** (*int*): the seed for the random number generator.  Any light cone 
+   solution can be reproduced by giving the same random seed.  Default: None 
+   (each solution will be distinct).
 
- * **filename** (*str*): if given, a text file detailing the solution will be written out.  Default: None.
+ * **filename** (*str*): if given, a text file detailing the solution will be 
+   written out.  Default: None.
 
 If a new solution for the same LightCone object is desired, the 
 :meth:`rerandomize_light_cone_solution` method should be called in place of 
@@ -87,77 +101,110 @@
 .. code-block:: python
 
   new_seed = 987654321
-  lc.rerandomize_light_cone_solution(new_seed, Recycle=True, filename='new_lightcone.dat')
+  lc.rerandomize_light_cone_solution(new_seed, recycle=True, 
+                                     filename='new_lightcone.dat')
 
-If debugging is on, the LightCone object will calculate and output the fraction of the light cone 
-volume in common with the original solution.  The keyword arguments are:
+Additional keyword arguments are:
 
- * **recycle** (*bool*): if True, the new solution will have the same shift in the line of sight as the original solution.  Since the projections of each slice are serialized and stored for the entire width of the box (even if the width used is left than the total box), the projection data can be deserialized instead of being remade from scratch.  This can greatly speed up the creation of a large number of light cone projections.  Default: True.
+ * **recycle** (*bool*): if True, the new solution will have the same shift in 
+   the line of sight as the original solution.  Since the projections of each 
+   slice are serialized and stored for the entire width of the box (even if 
+   the width used is less than the total box), the projection data can be 
+   deserialized instead of being remade from scratch.  This can greatly speed 
+   up the creation of a large number of light cone projections.  Default: True.
 
- * **filename** (*str*): if given, a text file detailing the solution will be written out.  Default: None.
+ * **filename** (*str*): if given, a text file detailing the solution will be 
+   written out.  Default: None.
 
-If :meth:`rerandomize_light_cone_solution` is used, the LightCone object will keep a copy of the 
-original solution that can be returned to at any time by calling :meth:`restore_master_solution`:
+If :meth:`rerandomize_light_cone_solution` is used, the LightCone object will 
+keep a copy of the original solution that can be returned to at any time by 
+calling :meth:`restore_master_solution`:
 
 .. code-block:: python
 
   lc.restore_master_solution()
 
-.. note:: All light cone solutions made with the above method will still use the same list of datasets.  Only the shifting and projection axis will be different.
+.. note:: All light cone solutions made with the above method will still use 
+   the same list of datasets.  Only the shifting and projection axis will be 
+   different.
 
 Making a Light Cone Projection
 ------------------------------
 
-With the light cone solution set, projections can be made of any available field:
+With the light cone solution set, projections can be made of any available 
+field:
 
 .. code-block:: python
 
   field = 'Density'
-  pc = lc.project_light_cone(field , weight_field=None, save_stack=True, save_slice_images=True)
+  lc.project_light_cone(field, weight_field=None, 
+                        save_stack=True, 
+                        save_slice_images=True)
 
-The return value of :meth:`project_light_cone` is the PlotCollection containing the image of the final 
-light cone image.  This allows the user further customization of the final image.  The keyword 
-arguments of :meth:`project_light_cone` are:
+Additional keyword arguments:
 
- * **weight_field** (*str*): the weight field of the projection.  This has the same meaning as in standard projections.  Default: None.
+ * **weight_field** (*str*): the weight field of the projection.  This has the 
+   same meaning as in standard projections.  Default: None.
 
- * **apply_halo_mask** (*bool*): if True, a boolean mask is apply to the light cone projection.  See below for a description of halo masks.  Default: False.
+ * **apply_halo_mask** (*bool*): if True, a boolean mask is applied to the light 
+   cone projection.  See below for a description of halo masks.  Default: False.
 
- * **node** (*str*): a prefix to be prepended to the node name under which the projection data is serialized.  Default: None.
+ * **node** (*str*): a prefix to be prepended to the node name under which the 
+   projection data is serialized.  Default: None.
 
- * **save_stack** (*bool*): if True, the unflatted light cone data including each individual slice is written to an hdf5 file.  Default: True.
+ * **save_stack** (*bool*): if True, the unflattened light cone data including 
+   each individual slice is written to an hdf5 file.  Default: True.
 
- * **save_slice_images** (*bool*): save images for each individual projection slice.  Default: False.
+ * **save_final_image** (*bool*): if True, save an image of the final light 
+   cone projection.  Default: True.
 
- * **flatten_stack** (*bool*): if True, the light cone stack is continually flattened each time a slice is added in order to save memory.  This is generally not necessary.  Default: False.
+ * **save_slice_images** (*bool*): save images for each individual projection 
+   slice.  Default: False.
 
- * **photon_field** (*bool*): if True, the projection data for each slice is decremented by 4 :math:`\pi` R :superscript:`2`, where R is the luminosity distance between the observer and the slice redshift.  Default: False.
+ * **flatten_stack** (*bool*): if True, the light cone stack is continually 
+   flattened each time a slice is added in order to save memory.  This is 
+   generally not necessary.  Default: False.
 
-.. note:: Additional keywords appropriate for a call to :meth:`PlotCollection.add_projection` can also be given to :meth:`project_light_cone`.
+ * **photon_field** (*bool*): if True, the projection data for each slice is 
+   decremented by 4 pi R\ :superscript:`2`, where R is the luminosity 
+   distance between the observer and the slice redshift.  Default: False.
+
+ * **njobs** (*int*): The number of parallel jobs over which the light cone 
+   projection will be split.  Choose -1 for one processor per individual
+   projection and 1 to have all processors work together on each projection.
+   Default: 1.
+
+ * **dynamic** (*bool*): If True, use dynamic load balancing to create the 
+   projections.  Default: False.
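The geometric dilution applied by **photon_field** (dividing each slice by 4 pi R^2) is just inverse-square arithmetic; a minimal sketch with made-up numbers:

```python
import math

# Sketch of the photon_field dilution: a slice's projected emission
# is reduced by the factor 4 * pi * R**2, where R is the luminosity
# distance between the observer and the slice redshift.
# Both numbers below are hypothetical, for illustration only.
R = 3.086e27             # luminosity distance in cm (~1 Gpc)
slice_emission = 4.0e54  # photons/s emitted in the slice

observed_flux = slice_emission / (4.0 * math.pi * R**2)
print(observed_flux)  # photons/s/cm^2 seen by the observer
```

Slices at higher redshift sit at larger R and so contribute proportionally fewer photons to the stacked image.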
 
 Sampling Unique Light Cone Volumes
 ----------------------------------
 
-When making a large number of light cones, particularly for statistical analysis, it is important 
-to have a handle on the amount of sampled volume in common from one projection to another.  Any 
-statistics may untrustworthy if a set of light cones have too much volume in common, even if they 
-may all be entirely different in appearance.  LightCone objects have the ability to calculate the 
-volume in common between two solutions with the same dataset list.  The :meth:`find_unique_solutions` 
-and :meth:`project_unique_light_cones` functions can be used to create a set of light cone solutions 
-that have some maximum volume in common and create light cone projections for those solutions.  If 
-specified, the code will attempt to use recycled solutions that can use the same serialized projection 
-objects that have already been created.  This can greatly increase the speed of making multiple light 
-cone projections.  See :ref:`cookbook-unique_light_cones` for an example of doing this.
+When making a large number of light cones, particularly for statistical 
+analysis, it is important to have a handle on the amount of sampled volume in 
+common from one projection to another.  Any statistics may be untrustworthy if a 
+set of light cones have too much volume in common, even if they may all be 
+entirely different in appearance.  LightCone objects have the ability to 
+calculate the volume in common between two solutions with the same dataset 
+list.  The :meth:`find_unique_solutions` and 
+:meth:`project_unique_light_cones` functions can be used to create a set of 
+light cone solutions that have some maximum volume in common and create light 
+cone projections for those solutions.  If specified, the code will attempt to 
+use recycled solutions that can use the same serialized projection objects 
+that have already been created.  This can greatly increase the speed of making 
+multiple light cone projections.  See the cookbook for an example of doing this.
 
 Making Light Cones with a Halo Mask
 -----------------------------------
 
-The situation may arise where it is necessary or desirable to know the location of halos within the 
-light cone volume, and specifically their location in the final image.  This can be useful for 
-developing algorithms to find galaxies or clusters in image data.  The light cone generator does this 
-by running the HaloProfiler (see :ref:`halo_profiling`) on each of the datasets used in the light cone 
-and shifting them accordingly with the light cone solution.  The ability also exists to create a 
-boolean mask with the dimensions of the final light cone image that can be used to mask out the 
-halos in the image.  It is left as an exercise to the reader to find a use for this functionality.  
-This process is somewhat complicated, but not terribly.  See :ref:`cookbook-light_cone_halo_mask` for 
-an example of how to do this.
+The situation may arise where it is necessary or desirable to know the 
+location of halos within the light cone volume, and specifically their 
+location in the final image.  This can be useful for developing algorithms to 
+find galaxies or clusters in image data.  The light cone generator does this 
+by running the HaloProfiler (see :ref:`halo_profiling`) on each of the 
+datasets used in the light cone and shifting them accordingly with the light 
+cone solution.  The ability also exists to create a boolean mask with the 
+dimensions of the final light cone image that can be used to mask out the 
+halos in the image.  It is left as an exercise to the reader to find a use for 
+this functionality.  This process is somewhat complicated, but not terribly so.  
+See the recipe in the cookbook for an example of this functionality.
\ No newline at end of file


diff -r 56e69a18c6a7f212740969c5a550050017effd77 -r 1c770895089118dc74bf771378db81d15bd2157e source/analysis_modules/light_ray_generator.rst
--- a/source/analysis_modules/light_ray_generator.rst
+++ b/source/analysis_modules/light_ray_generator.rst
@@ -4,118 +4,129 @@
 ====================
 .. sectionauthor:: Britton Smith <brittonsmith at gmail.com>
 
-Light rays are similar to light cones (:ref:`light-cone-generator`) in the way they stack mulitple 
-datasets together to span a redshift interval.  Unlike light cones, which which stack radomly 
-oriented projections from each dataset to create synthetic images, light rays use infinitesimally 
-thin pencil beams to simulate QSO sight lines.
-
-.. note:: The light ray generator can be run in parallel, but should be done so without the --parallel flag.  This is because the light ray tool is not yet parallelized in a way that complies with yt's parallel mode.  The light ray tool works in parallel by giving each processor one dataset from the stack.  Therefore, it is useless to use more processors than datasets.
+Light rays are similar to light cones (:ref:`light-cone-generator`) in how 
+they stack multiple datasets together to span a redshift interval.  Unlike 
+light cones, which stack randomly oriented projections from each 
+dataset to create synthetic images, light rays use thin pencil beams to 
+simulate QSO sight lines.
 
 .. image:: _images/lightray.png
 
-A ray segment records the information of all grid cells intersected by the ray as well as the path 
-length, dl, of the ray through the cell.  Column densities can be calculated by multiplying 
-physical densities by the path length.
+A ray segment records the information of all grid cells intersected by the ray 
+as well as the path length, dl, of the ray through the cell.  Column densities 
+can be calculated by multiplying physical densities by the path length.
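That multiplication can be sketched in plain Python; the densities and path lengths below are illustrative values, not output from an actual ray:

```python
# Sketch: column density along a ray.  Each intersected cell
# contributes n * dl, where n is the physical number density in the
# cell and dl is the path length of the ray through that cell.
densities = [1.0e-3, 5.0e-4, 2.0e-3]     # cm^-3, per intersected cell
path_lengths = [3.0e21, 1.5e21, 2.0e21]  # cm, ray path through each cell

# Per-cell column densities and the total along the ray.
cell_columns = [n * dl for n, dl in zip(densities, path_lengths)]
total_column = sum(cell_columns)

print(total_column)  # total column density in cm^-2
```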
 
 Configuring the Light Ray Generator
 -----------------------------------
-
-An advanced recipe for creating a light ray can be found in :ref:`cookbook-make_light_ray`.  
-The only required arguments for instantiation are the parameter file used to run the simulation, 
-the initial redshift, and the final redshift.
+  
+The arguments required to instantiate a ``LightRay`` object are the same as 
+those required for a ``LightCone`` object: the simulation parameter file, the 
+simulation type, the nearest redshift, and the furthest redshift.
 
 .. code-block:: python
 
-  from yt.analysis_modules.light_ray.api import *
+  from yt.analysis_modules.api import LightRay
+  lr = LightRay("enzo_tiny_cosmology/32Mpc_32.enzo",
+                'Enzo', 0.0, 0.1)
 
-  lr = LightRay('my_simulation.par', 0.0, 0.1)
+Additional keyword arguments are:
 
-The light ray tool is a subclass of EnzoSimulation (see :ref:`analyzing-an-entire-simulation`).  
-As such, the additional keyword arguments are very simiar to the ones for 
-:meth:`EnzoSimulation.create_cosmology_splice`.  The complete list is given below:
+ * **use_minimum_datasets** (*bool*): If True, the minimum number of datasets 
+   is used to connect the initial and final redshift.  If false, the light 
+   ray solution will contain as many entries as possible within the redshift
+   interval.  Default: True.
 
- * **deltaz_min** (*float*): minimum delta z between consecutive datasets.  Default: 0.0.
+ * **deltaz_min** (*float*):  Specifies the minimum Delta-z between consecutive
+   datasets in the returned list.  Default: 0.0.
 
- * **use_minimum_datasets** (*bool*): if True, the minimum number of datasets is used to connect the initial and final redshift.  If false, the light ray solution will contain as many entries as possible within the redshift interval.  Default: True.
+ * **minimum_coherent_box_fraction** (*float*): Used with use_minimum_datasets 
+   set to False, this parameter specifies the fraction of the total box size 
+   to be traversed before rerandomizing the projection axis and center.  This
+   was invented to allow light rays with thin slices to sample coherent large 
+   scale structure, but in practice does not work so well.  Try setting this 
+   parameter to 1 and see what happens.  Default: 0.0.
 
- * **minimum_coherent_box_fraction** (*float*): used with use_minimum_datasets set to False, this parameter specifies the fraction of the total box size to be traversed before rerandomizing the projection axis and center.  This was invented to allow light cones with thin slices to sample coherent large scale structure, but in practice does not work so well.  It is not very clear what this will do to a light ray.  Default: 0.0.
+ * **time_data** (*bool*): Whether or not to include time outputs when gathering
+   datasets for time series.  Default: True.
+
+ * **redshift_data** (*bool*): Whether or not to include redshift outputs when 
+   gathering datasets for time series.  Default: True.
+
 
 Making Light Ray Data
 ---------------------
 
-Once the LightRay object has been instantiated, the :meth:`make_light_ray` will trace out the 
-rays in each dataset and collect information for all the fields requested.  The output file 
-will be an hdf5 file containing all the cell field values for all the cells that were intersected 
-by the ray.  A single LightRay object can be used over and over to make multiple randomizations, 
-simply by changing the value of the random seed with the **seed** keyword.
+Once the LightRay object has been instantiated, the :meth:`make_light_ray` 
+will trace out the rays in each dataset and collect information for all the 
+fields requested.  The output file will be an hdf5 file containing all the 
+cell field values for all the cells that were intersected by the ray.  A 
+single LightRay object can be used over and over to make multiple 
+randomizations, simply by changing the value of the random seed with the 
+**seed** keyword.
 
 .. code-block:: python
 
   lr.make_light_ray(seed=8675309,
-                    solution_filename='lightraysolution.txt',
-                    data_filename=data_filename,
                     fields=['Temperature', 'Density'],
                     get_los_velocity=True)
 
 The keyword arguments are:
 
- * **seed** (*int*): seed for the random number generator.  Default: None.
- * **fields** (*list*): a list of fields for which to get data.  Default: None.
- * **solution_filename** (*string*): path to a text file where the trajectories of each subray is written out.  Default: None.
- * **data_filename** (*string*): path to output file for ray data.  Default: None.
- * **get_nearest_galaxy** (*bool*): if True, the HaloProfiler will be used to calculate the distance and mass of the nearest halo for each point in the ray.  This option requires additional information to be included.  See below for an example.  Default: False.
- * **get_los_velocity** (*bool*): if True, the line of sight velocity is calculated for each point in the ray.  Default: False.
+ * **seed** (*int*): Seed for the random number generator.  Default: None.
+
+ * **fields** (*list*): A list of fields for which to get data.  Default: None.
+
+ * **solution_filename** (*string*): Path to a text file where the 
+   trajectories of each subray are written out.  Default: None.
+
+ * **data_filename** (*string*): Path to output file for ray data.  
+   Default: None.
+
+ * **get_los_velocity** (*bool*): If True, the line of sight velocity is 
+   calculated for each point in the ray.  Default: False.
+
+ * **get_nearest_halo** (*bool*): If True, the HaloProfiler will be used to 
+   calculate the distance and mass of the nearest halo for each point in the
+   ray.  This option requires additional information to be included.  See 
+   the cookbook for an example.  Default: False.
+
+ * **nearest_halo_fields** (*list*): A list of fields to be calculated for the 
+   halos nearest to every lixel in the ray.  Default: None.
+
+ * **halo_profiler_parameters** (*dict*): A dictionary of parameters to be 
+   passed to the HaloProfiler to create the appropriate data used to get 
+   properties for the nearest halos.  Default: None.
+
+ * **njobs** (*int*): The number of parallel jobs over which the slices for the
+   halo mask will be split.  Choose -1 for one processor per individual slice 
+   and 1 to have all processors work together on each projection.  Default: 1.
+
+ * **dynamic** (*bool*): If True, use dynamic load balancing to create the 
+   projections.  Default: False.
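The **njobs** convention (-1 for one processor per slice, 1 for all processors cooperating on each one) amounts to a simple split of the work.  The sketch below is purely illustrative; the real parallelism lives in yt's parallel tools:

```python
def split_work(n_slices, njobs, n_procs):
    """Illustrative mapping of slices to job groups.

    njobs == -1: one processor per slice (as many jobs as slices).
    njobs ==  1: all processors cooperate on every slice.
    otherwise:   slices are divided round-robin among njobs groups.
    """
    if njobs == -1:
        njobs = n_slices
    groups = [[] for _ in range(njobs)]
    for i in range(n_slices):
        groups[i % njobs].append(i)
    procs_per_job = max(1, n_procs // njobs)
    return groups, procs_per_job

groups, procs = split_work(n_slices=8, njobs=4, n_procs=8)
print(groups)  # [[0, 4], [1, 5], [2, 6], [3, 7]]
print(procs)   # 2 processors per job group
```

With more jobs there are fewer processors per job, which is why using more processors than slices buys nothing when njobs is -1.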
 
 Getting The Nearest Galaxies
 ----------------------------
 
-With the **get_los_velocity** keyword set to True for :meth:`make_light_ray`, the light ray 
-tool will use the HaloProfiler to calculate the distance and mass of the nearest halo to that 
-pixel.  In order to do this, three additional keyword arguments must be supplied to tell the 
-HaloProfiler what to do:
+With the **get_nearest_halo** keyword set to True, the light ray tool will use 
+the HaloProfiler to calculate the distance and mass of the nearest halo to 
+each point in the ray.  In order to do this, a dictionary called 
+halo_profiler_parameters is used to pass instructions to the HaloProfiler.  
+This dictionary has three keywords:
 
- * **halo_profiler_kwargs** (*dict*): a dictionary of standard HaloProfiler keyword arguments and values to be given to the HaloProfiler.
- * **halo_profiler_actions** (*list*): a list of actions to be performed by the HaloProfiler.  Each item in the list should be a dictionary with the following entries: "function", "args", and "kwargs", for the function to be performed, the arguments supplied to that function, and the keyword arguments.
- * **halo_list** (*string*): 'all' to use the full halo list, or 'filtered' to use the filtered halo list created after calling make_profiles.
+ * **halo_profiler_kwargs** (*dict*): A dictionary of standard HaloProfiler 
+   keyword arguments and values to be given to the HaloProfiler.
 
-In the example below, we ask the HaloProfiler to perform two tasks.  The first is to add the 
-halo filter for virialized halos above 10 :superscript:`14` solar masses.  The second is to 
-make the radial profiles.  Finally, the **halo_list** keyword signifies that we want to use the 
-filtered halo list created after profiling.
+ * **halo_profiler_actions** (*list*): A list of actions to be performed by 
+   the HaloProfiler.  Each item in the list should be a dictionary with the 
+   following entries: "function", "args", and "kwargs", for the function to 
+   be performed, the arguments supplied to that function, and the keyword 
+   arguments.
 
-.. code-block:: python
+ * **halo_list** (*string*): 'all' to use the full halo list, or 'filtered' 
+   to use the filtered halo list created after calling make_profiles.
 
-  halo_profiler_kwargs = {'halo_list_format': {'id':0, 'center':[4, 5, 6]},
-                                               'TotalMassMsun':1},
-                          'halo_list_file': 'HopAnalysis.out'}
-
-  halo_profiler_actions = [{'function': add_halo_filter,
-                            'args': VirialFilter,
-                            'kwargs': {'overdensity_field': 'ActualOverdensity',
-                                       'virial_overdensity': 200,
-                                       'virial_filters': [['TotalMassMsun','>=','1e14']],
-                                       'virial_quantities': ['TotalMassMsun','RadiusMpc']}},
-                           {'function': make_profiles,
-                            'args': None,
-                            'kwargs': {'filename': 'VirializedHalos.out'}}]
-
-  halo_list = 'filtered'
-
-  halo_mass_field = 'TotalMassMsun_200'
-
-  lr.make_light_ray(seed=8675309,
-                    solution_filename='lightraysolution.txt',
-                    data_filename='lightray.h5',
-                    fields=['Temperature', 'Density'],
-                    get_nearest_galaxy=True, 
-                    halo_profiler_kwargs=halo_profiler_kwargs,
-                    halo_profiler_actions=halo_profiler_actions, 
-                    halo_list=halo_list,
-                    halo_mass_field=halo_mass_field,
-                    get_los_velocity=True)
-
+See the recipe in the cookbook for an example.
 
 What Can I do with this?
 ------------------------
 
-Try :ref:`absorption_spectrum`!
\ No newline at end of file
+Try :ref:`absorption_spectrum`.
\ No newline at end of file


diff -r 56e69a18c6a7f212740969c5a550050017effd77 -r 1c770895089118dc74bf771378db81d15bd2157e source/analysis_modules/planning_cosmology_simulations.rst
--- /dev/null
+++ b/source/analysis_modules/planning_cosmology_simulations.rst
@@ -0,0 +1,28 @@
+.. _planning-cosmology-simulations:
+
+Planning Simulations to use LightCones or LightRays
+===================================================
+
+If you want to run a cosmological simulation that will have just enough data 
+outputs to create a cosmology splice, the :meth:`plan_cosmology_splice` 
+function will calculate a list of redshifts outputs that will minimally 
+connect a redshift interval.
+
+.. code-block:: python
+
+  from yt.analysis_modules.api import CosmologySplice
+  my_splice = CosmologySplice('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo')
+  my_splice.plan_cosmology_splice(0.0, 0.1, filename='redshifts.out')
+
+This will write out a file, formatted for the simulation type, with a list of 
+redshift dumps.  The keyword arguments are:
+
+ * **decimals** (*int*): The decimal place to which the output redshift will 
+   be rounded.  If the decimal place in question is nonzero, the redshift will 
+   be rounded up to ensure continuity of the splice.  Default: 3.
+
+ * **filename** (*str*): If provided, a file will be written with the redshift 
+   outputs in the form in which they should be given in the enzo parameter 
+   file.  Default: None.
+
+ * **start_index** (*int*): The index of the first redshift output.  Default: 0.
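The **decimals** rounding rule (round the redshift up at the requested decimal place so consecutive outputs still overlap) can be sketched as follows; this is a guess at the arithmetic, not the actual plan_cosmology_splice code:

```python
import math

def round_up_redshift(z, decimals=3):
    # Round z up at the given decimal place so that consecutive
    # outputs are guaranteed to overlap (continuity of the splice).
    factor = 10 ** decimals
    return math.ceil(z * factor) / factor

print(round_up_redshift(0.123451))   # 0.124
print(round_up_redshift(0.05, 1))    # 0.1
```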


diff -r 56e69a18c6a7f212740969c5a550050017effd77 -r 1c770895089118dc74bf771378db81d15bd2157e source/analyzing/time_series_analysis.rst
--- a/source/analyzing/time_series_analysis.rst
+++ b/source/analyzing/time_series_analysis.rst
@@ -6,9 +6,8 @@
 Often, one wants to analyze a continuous set of outputs from a simulation in a
 uniform manner.  A simple example would be to calculate the peak density in a
 set of outputs that were written out.  The problem with time series analysis in
-yt is general an issue of verbosity and clunkiness. Typically, unless using the
-:class:`~yt.analysis_modules.simulation_handling.EnzoSimulation` class (which
-is only available as of right now for Enzo) one sets up a loop:
+yt is in general an issue of verbosity and clunkiness.  Typically, one sets up a 
+loop:
 
 .. code-block:: python
 
@@ -158,3 +157,98 @@
 This allows you to create your own analysis tasks that will be then available
 to time series data objects.  Since ``TimeSeriesData`` objects iterate over
 filenames in parallel by default, this allows for transparent parallelization. 
+
+Analyzing an Entire Simulation
+------------------------------
+
+The parameter file used to run a simulation contains all the information 
+necessary to know what datasets should be available.  The ``simulation`` 
+convenience function allows one to create a ``TimeSeriesData`` object of all 
+or a subset of all data created by a single simulation.
+
+.. note:: Currently only implemented for Enzo.  Other simulation types coming 
+   soon.
+
+To instantiate, give the parameter file and the simulation type.
+
+.. code-block:: python
+
+  from yt.mods import *
+  my_sim = simulation('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo')
+
+Then, create a ``TimeSeriesData`` object with the :meth:`get_time_series` 
+function.  With no additional keywords, the time series will include every 
+dataset.
+
+.. code-block:: python
+
+  my_sim.get_time_series()
+
+After this, time series analysis can be done normally.
+
+.. code-block:: python
+
+  for pf in my_sim.piter():
+      all_data = pf.h.all_data()
+      print all_data.quantities['Extrema']('Density')
+ 
+Additional keywords can be given to :meth:`get_time_series` to select a subset
+of the total data:
+
+ * **time_data** (*bool*): Whether or not to include time outputs when 
+   gathering datasets for time series.  Default: True.
+
+ * **redshift_data** (*bool*): Whether or not to include redshift outputs 
+   when gathering datasets for time series.  Default: True.
+
+ * **initial_time** (*float*): The earliest time for outputs to be included.  
+   If None, the initial time of the simulation is used.  This can be used in 
+   combination with either final_time or final_redshift.  Default: None.
+
+ * **final_time** (*float*): The latest time for outputs to be included.  If 
+   None, the final time of the simulation is used.  This can be used in 
+   combination with either initial_time or initial_redshift.  Default: None.
+
+ * **times** (*list*): A list of times for which outputs will be found.
+   Default: None.
+
+ * **time_units** (*str*): The time units used for requesting outputs by time.
+   Default: '1' (code units).
+
+ * **initial_redshift** (*float*): The earliest redshift for outputs to be 
+   included.  If None, the initial redshift of the simulation is used.  This
+   can be used in combination with either final_time or final_redshift.
+   Default: None.
+
+ * **final_redshift** (*float*): The latest redshift for outputs to be included.  
+   If None, the final redshift of the simulation is used.  This can be used 
+   in combination with either initial_time or initial_redshift.  
+   Default: None.
+
+ * **redshifts** (*list*): A list of redshifts for which outputs will be found.
+   Default: None.
+
+ * **initial_cycle** (*float*): The earliest cycle for outputs to be 
+   included.  If None, the initial cycle of the simulation is used.  This can
+   only be used with final_cycle.  Default: None.
+
+ * **final_cycle** (*float*): The latest cycle for outputs to be included.  
+   If None, the final cycle of the simulation is used.  This can only be used 
+   in combination with initial_cycle.  Default: None.
+
+ * **tolerance** (*float*):  Used in combination with "times" or "redshifts" 
+   keywords, this is the tolerance within which outputs are accepted given 
+   the requested times or redshifts.  If None, the nearest output is always 
+   taken.  Default: None.
+
+ * **find_outputs** (*bool*): If True, subdirectories within the GlobalDir 
+   directory are searched one by one for datasets.  Time and redshift 
+   information are gathered by temporarily instantiating each dataset.  This 
+   can be used when simulation data was created in a non-standard way, making 
+   it difficult to guess the corresponding time and redshift information.
+   Default: False.
+
+ * **parallel** (*bool*/*int*): If True, the generated TimeSeriesData will 
+   divide the work such that a single processor works on each dataset.  If an
+   integer is supplied, the work will be divided into that number of jobs.
+   Default: True.
\ No newline at end of file
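The behavior described for the **times**, **redshifts**, and **tolerance** keywords can be sketched as a nearest-match selection; this is a hypothetical stand-in, not the actual :meth:`get_time_series` implementation:

```python
def select_outputs(available, requested, tolerance=None):
    """Illustrative version of time-based output selection.

    available: list of output times present in the simulation.
    requested: times the user asked for.
    tolerance: maximum |available - requested| to accept; None means
               the nearest output is always taken.
    """
    selected = []
    for t in requested:
        nearest = min(available, key=lambda a: abs(a - t))
        if tolerance is None or abs(nearest - t) <= tolerance:
            selected.append(nearest)
    return selected

outputs = [0.0, 10.0, 20.0, 30.0]
print(select_outputs(outputs, [9.0, 21.0]))                  # [10.0, 20.0]
print(select_outputs(outputs, [9.0, 16.0], tolerance=2.0))   # [10.0]
```

With a tolerance set, requested times that fall too far from any output are simply dropped rather than matched to a distant dataset.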


diff -r 56e69a18c6a7f212740969c5a550050017effd77 -r 1c770895089118dc74bf771378db81d15bd2157e source/orientation/making_plots.rst
--- a/source/orientation/making_plots.rst
+++ b/source/orientation/making_plots.rst
@@ -1,18 +1,142 @@
 Making Plots
 ------------
 
-Examining data by hand and looking at individual quantities one at a time can
-be interesting and productive, but yt also provides a set of visualization
-tools that you can use.  One of the fundamental implementations of this is the
-``PlotCollection``, an object designed to enable you to make several related
-plots all at once.  Originally, the idea was that yt would get used to make
-multiple plots of different fields, along different axes, all centered at the
-same point.  This has somewhat faded with time, but it still functions as a
-convenient way to set up a bunch of plots with only one or two commands.
+Slices
+^^^^^^
+
+Examining data by hand and looking at individual quantities one at a time can be
+interesting and productive, but yt also provides a set of visualization tools
+that you can use. We'll start by showing you how to make visualizations of
+slices and projections through your data.  We will then move on to demonstrate
+how to make analysis plots, including phase diagrams and profiles.
+
+The quickest way to plot a slice of a field through your data is to use
+:class:`~yt.visualization.plot_window.SlicePlot`.  Say we want to visualize a
+slice through the Density field along the z-axis centered on the center of the
+simulation box in a simulation dataset we've opened and stored in the parameter
+file object ``pf``.  This can be accomplished with the following command:
+
+.. code-block:: python
+
+   >>> slc = SlicePlot(pf, 'z', 'Density')
+   >>> slc.save()
+
+These two commands will create a slice object and store it in a variable we've
+called ``slc``.  We then call the ``save()`` function that is associated with
+the slice object.  This automatically saves the plot in png image format with an
+automatically generated filename.  If you don't want the slice object to stick
+around, you can accomplish the same thing in one line:
+
+.. code-block:: python
+   
+   >>> SlicePlot(pf, 'z', 'Density').save()
+
+It's nice to keep the slice object around if you want to modify the plot.  By
+default, the plot width will be set to the size of the simulation box.  To zoom
+in by a factor of ten, you can call the zoom function attached to the slice
+object:
+
+.. code-block:: python
+
+   >>> slc = SlicePlot(pf, 'z', 'Density')
+   >>> slc.zoom(10)
+   >>> slc.save('zoom')
+
+This will save a new plot to disk with a different filename - prepended with
+'zoom' instead of the name of the parameter file. If you want to set the width 
+manually, you can do that as well. For example, the following sequence of
+commands will create a slice, set the width of the plot to 10 kiloparsecs, and
+save it to disk.
+
+.. code-block:: python
+
+   >>> slc = SlicePlot(pf, 'z', 'Density')
+   >>> slc.set_width((10,'kpc'))
+   >>> slc.save('10kpc')
+
+The SlicePlot also optionally accepts the coordinate to center the plot on and
+the width of the plot:
+
+.. code-block:: python
+
+   >>> SlicePlot(pf, 'z', 'Density', center=[0.2, 0.3, 0.8],
+   ...           width=(10, 'kpc')).save()
+
+The center must be given in code units.  Optionally, you can supply 'c' or 'm'
+for the center.  These two choices will center the plot on the center of the
+simulation box and the coordinate of the maximum density cell, respectively.
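The dispatch on the ``center`` argument described above can be sketched as a
small helper.  The function and argument names here are hypothetical,
chosen for illustration rather than taken from yt's internal API:

```python
# Hypothetical sketch of resolving a center argument such as 'c', 'm',
# or an explicit coordinate; not yt's actual internal code.
def resolve_center(center, domain_center, max_density_location):
    if center == 'c':
        # 'c': center of the simulation box
        return domain_center
    if center == 'm':
        # 'm': location of the maximum density cell
        return max_density_location
    # Otherwise assume an explicit coordinate in code units.
    return list(center)

print(resolve_center('c', [0.5, 0.5, 0.5], [0.1, 0.2, 0.3]))
```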
+
+One can also use the SlicePlot to make annotated plots.  The following commands
+will create a slice, annotate it by marking the grid boundaries, and save the
+plot to disk:
+
+.. code-block:: python
+
+   >>> slc = SlicePlot(pf, 'z', 'Density')
+   >>> slc.annotate_grids()
+   >>> slc.save()
+
+There are a number of annotations available.  The full list is available in
+:ref:`callbacks`.
+
+Projections
+^^^^^^^^^^^
+
+It can be limiting to only look at slices through 3D data.  In most cases, doing
+so discards the vast majority of the data.  For this reason, yt provides a
+simple interface for generating plots of projections through your data.  The
+interface for making projection plots,
+:class:`~yt.visualization.plot_window.ProjectionPlot` is very similar to
+``SlicePlot``, described above.  To create and save a plot of the projection of
+the density field through the z-axis of a dataset, centered on the center of the
+simulation box, do the following:
+
+.. code-block:: python
+
+   >>> ProjectionPlot(pf, 'z', 'Density').save()
+
+A ``ProjectionPlot`` can be created and modified with exactly the same keyword
+arguments as a ``SlicePlot``.  For example, one can also adjust the width of
+the plot, either after creating the projection plot:
+
+.. code-block:: python
+
+   >>> prj = ProjectionPlot(pf, 'z', 'Density')
+   >>> prj.set_width((10,'kpc'))
+
+or while creating the projection in the first place:
+
+.. code-block:: python
+
+   >>> ProjectionPlot(pf, 'z', 'Density', width=(10,'kpc'))
+
+In addition, one can optionally supply a maximum level to project to; this is
+very useful for large datasets where projections can be costly:
+
+.. code-block:: python
+
+   >>> ProjectionPlot(pf, 'z', 'Density', max_level=10)
+
+as well as a field to weight the projection by.  The following example creates a
+map of the density-weighted mean temperature, projected along the z-axis:
+
+.. code-block:: python
+
+   >>> ProjectionPlot(pf, 'z', 'Temperature', weight_field='Density')
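The arithmetic behind a weighted projection is just a weighted mean along the
line of sight: sum(weight * field) / sum(weight) for each column of cells.
The sketch below demonstrates that on a toy column of values; it is
illustrative only, since yt carries this out on the full AMR hierarchy:

```python
# Sketch of the weighted-projection arithmetic: along a line of sight,
# the weighted mean of a field is sum(w * f) / sum(w).  Toy data only.
def weighted_mean_along_axis(field_column, weight_column):
    num = sum(f * w for f, w in zip(field_column, weight_column))
    den = sum(weight_column)
    return num / den

# A toy column of cells: temperatures weighted by density.
temperature = [1.0e4, 2.0e4, 4.0e4]
density = [1.0, 2.0, 1.0]
print(weighted_mean_along_axis(temperature, density))  # 22500.0
```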
+
+PlotCollection
+^^^^^^^^^^^^^^
+
+To create profiles, yt supplies the ``PlotCollection``, an object designed to
+enable you to make several related plots all at once.  Originally, the idea was
+that yt would be used to make multiple plots of different fields, along
+different axes, all centered at the same point.  That usage has somewhat faded
+with time, but the ``PlotCollection`` still functions as a convenient way to
+set up a number of plots with only one or two commands.
 
 A plot collection is really defined by two things: the simulation output it
 will make plots from, and the "center" of the plot collection.  By default, the
-center is the place where all slices and phase plots are centered, although
+center is the place where all phase plots are centered, although
 there is some leeway on this.  We start by creating our plot collection.  The
 plot collection takes two arguments: the first is the parameter file (``pf``)
 we associate the plot collection with, and the second is our center.  Note that
@@ -25,32 +149,15 @@
 
 We've chosen to center at [0.5, 0.5, 0.5], which for this simulation is the
 center of the domain.  We can now add a number of different types of
-visualizations to this plot, but we'll only look at a few.  The first is a
-projection, which we talked about earlier.  yt regards axes as integers: 0 for
-x, 1 for y, 2 for z.  So we add a projection of Density along the x-axis with
-this command:
+visualizations to this plot collection, but we'll only look at a few.  
 
-   >>> p = pc.add_projection("Density", 0)
+Phase Plots
+^^^^^^^^^^^
 
-yt will then create the projection and hang the resultant plot onto the
-``PlotCollection``.  It also returns the plot object when it's done, which we
-then assign to the variable ``p``.  Plot objects are mostly useful if you want
-to do advanced things to the plot -- like overplotting grids, contours,
-vectors, or modifying the underlying visualization in some non-trivial way.
-Many modifications can be applied in simpler ways, but some complex
-modifications require the plot object (``p`` in this case) itself.
-
-We'll now add a slice of "x-velocity" to the plot collection.  This should let
-us see how our spheres are moving.  (But, since we centered in a relatively
-uninteresting section of the data, it won't look like too much!)
-
-   >>> pc.add_slice("x-velocity", 0)
-
-We don't retain the plot object, in this case, but we could!  Now, for our
-final trick, we'll create a phase plot.  Phase plots are pretty cool -- they
-take all the data inside a data container and they bin it with respect to two
-variables.  You can then have it calculate the average, or the sum, of some
-other quantity as a function of those two variables.
+Phase plots are pretty cool -- they take all the data inside a data container
+and they bin it with respect to two variables.  You can then have it calculate
+the average, or the sum, of some other quantity as a function of those two
+variables.
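The binning just described can be sketched in a few lines: bucket every cell
by two variables and average a third in each 2D bin.  This is an
illustrative stand-in with made-up helper names, not the machinery yt uses
internally:

```python
# Minimal sketch of phase-plot binning: bucket cells by two variables
# and average a third within each 2D bin.  Toy code, not yt internals.
from collections import defaultdict

def phase_bin(x_vals, y_vals, z_vals, x_edges, y_edges):
    """Return {(i, j): mean of z over cells falling in bin (i, j)}."""
    def find_bin(v, edges):
        for i in range(len(edges) - 1):
            if edges[i] <= v < edges[i + 1]:
                return i
        return None  # value falls outside the binning range

    sums, counts = defaultdict(float), defaultdict(int)
    for x, y, z in zip(x_vals, y_vals, z_vals):
        i, j = find_bin(x, x_edges), find_bin(y, y_edges)
        if i is None or j is None:
            continue
        sums[(i, j)] += z
        counts[(i, j)] += 1
    return {k: sums[k] / counts[k] for k in sums}

# e.g. average "Temperature" binned by "Density" and a velocity component
result = phase_bin([0.1, 0.2, 0.8], [1.0, 1.5, 3.0],
                   [1.0e4, 2.0e4, 8.0e4],
                   [0.0, 0.5, 1.0], [0.0, 2.0, 4.0])
print(result)
```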
 
 This allows, for instance, the calculation of the average Temperature as a
 function of Density and velocity.  Or, it allows the distribution of all the
@@ -96,24 +203,7 @@
 plot is saved out with that prefix.  Each plot's name is calculated from a
 combination of the type of plot and the fields in that plot.  For plots where
 many duplicates can be included, a counter is included too -- for instance,
-phase and profile plots.  Note that the field of view is not included in the
-filename, which means that you will have to include that yourself.
-
-The default field of view for slices and projections is the entire domain.  We
-can change that by calling ``set_width`` and specifying a value and a unit.
-Most astrophysical units are available, but two special units are also
-available: "unitary" and "1".  "unitary" units are scaled to the domain width;
-so 0.5 in "unitary" would occupy half the domain.  "1" means in the native
-units of the simulation code.  (For enzo, "unitary" and "1" are usually but not
-always the same.)  We'll set the width to half the domain size and save again::
-
-   >>> pc.set_width(0.5, 'unitary')
-   >>> pc.save("second_images")
-
-Note that the phase plots didn't change at all -- only the image plots.  This
-is by design, to keep the code from grinding too much on the disk.  yt assumes
-that when you create a phase plot, you already know what you are aiming to do,
-and it doesn't second guess that.
+phase and profile plots.
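The naming scheme described above (prefix plus plot type plus fields, with a
counter for repeatable plot types) could be sketched as follows.  The exact
format yt emits may differ; this is only a plausible reconstruction:

```python
# Hypothetical sketch of the plot-naming scheme: prefix + plot type +
# field names, plus a counter for plot types that can repeat.  The
# precise format yt uses may differ from this.
def plot_filename(prefix, plot_type, fields, counter=None):
    parts = [prefix, plot_type] + list(fields)
    if counter is not None:
        parts.append("%02d" % counter)
    return "_".join(parts) + ".png"

print(plot_filename("first_images", "Profile2D",
                    ["Density", "Temperature"], counter=0))
```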
 
 All of these commands can be run from a script -- which, in fact, is the way
 that I would personally encourage.  It will make it easier to produce plots

Repository URL: https://bitbucket.org/yt_analysis/yt-doc/
