[Yt-svn] commit/yt: 12 new changesets

Bitbucket commits-noreply at bitbucket.org
Fri Sep 2 07:20:12 PDT 2011


12 new changesets in yt:

http://bitbucket.org/yt_analysis/yt/changeset/f02afd88c0eb/
changeset:   f02afd88c0eb
branch:      stable
user:        MatthewTurk
date:        2011-08-26 02:07:13
summary:     Merging from unstable branch
affected #:  347 files (742.4 KB)
Diff too large to display.

http://bitbucket.org/yt_analysis/yt/changeset/d6b41e08f836/
changeset:   d6b41e08f836
branch:      stable
user:        MatthewTurk
date:        2011-08-26 04:03:22
summary:     Removing expired item from import
affected #:  1 file (30 bytes)

--- a/yt/analysis_modules/api.py	Thu Aug 25 20:07:13 2011 -0400
+++ b/yt/analysis_modules/api.py	Thu Aug 25 22:03:22 2011 -0400
@@ -79,7 +79,6 @@
     ExtractedParameterFile
 
 from .level_sets.api import \
-    GridConsiderationQueue, \
     coalesce_join_tree, \
     identify_contours, \
     Clump, \


http://bitbucket.org/yt_analysis/yt/changeset/2083398de53d/
changeset:   2083398de53d
branch:      stable
user:        MatthewTurk
date:        2011-08-26 04:03:55
summary:     Fixing title underline
affected #:  1 file (1 byte)

--- a/yt/data_objects/object_finding_mixin.py	Thu Aug 25 22:03:22 2011 -0400
+++ b/yt/data_objects/object_finding_mixin.py	Thu Aug 25 22:03:55 2011 -0400
@@ -115,7 +115,7 @@
         as the input *fields*.
         
         Parameters
-        ---------
+        ----------
         fields : string or list of strings
             The field(s) that will be returned.
         

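For context: reST/numpydoc requires a section underline to be at least as long as the section title, otherwise Sphinx reports "Title underline too short" when the docs build, which is what the one-character fix above addresses. A minimal sketch of the corrected form (the function name here is illustrative, not the one in object_finding_mixin.py):

    def find_values(fields):
        """Return values for the given field(s) at a location.

        Parameters
        ----------
        fields : string or list of strings
            The field(s) that will be returned.
        """
        pass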

http://bitbucket.org/yt_analysis/yt/changeset/3b82eda9bdd9/
changeset:   3b82eda9bdd9
branch:      stable
user:        MatthewTurk
date:        2011-08-26 04:36:52
summary:     Several fixes to docstrings to get the docs to build correctly.
affected #:  4 files (521 bytes)

--- a/yt/analysis_modules/light_cone/light_cone.py	Thu Aug 25 22:03:55 2011 -0400
+++ b/yt/analysis_modules/light_cone/light_cone.py	Thu Aug 25 22:36:52 2011 -0400
@@ -122,12 +122,20 @@
                                                                 deltaz_min=self.deltaz_min)
 
     def calculate_light_cone_solution(self, seed=None, filename=None):
-        """
-        Create list of projections to be added together to make the light cone.
-        :param seed (int): the seed for the random number generator.  Any light cone solution 
-               can be reproduced by giving the same random seed.  Default: None (each solution 
-               will be distinct).
-        :param filename (str): if given, a text file detailing the solution will be written out.  Default: None.
+        r"""Create list of projections to be added together to make the light cone.
+
+        Several sentences providing an extended description. Refer to
+        variables using back-ticks, e.g. `var`.
+
+        Parameters
+        ----------
+        seed : int
+            The seed for the random number generator.  Any light cone solution
+            can be reproduced by giving the same random seed.  Default: None
+            (each solution will be distinct).
+        filename : string
+            If given, a text file detailing the solution will be written out.
+            Default: None.
         """
 
         # Don't use box coherence with maximum projection depths.
@@ -209,13 +217,17 @@
             self._save_light_cone_solution(filename=filename)
 
     def get_halo_mask(self, mask_file=None, map_file=None, **kwargs):
+        r"""Gets a halo mask from a file or makes a new one.
+
+        Parameters
+        ----------
+        mask_file : string, optional
+            An HDF5 file to which to output the halo mask
+        map_file : string, optional
+            A text file to which to output the halo map (locations in the
+            images of the halos
+
         """
-        Gets a halo mask from a file or makes a new one.
-        :param mask_file (str): specify an hdf5 file to output the halo mask.
-        :param map_file (str): specify a text file to output the halo map 
-               (locations in image of halos).
-        """
-
         # Get halo map if map_file given.
         if map_file is not None and not os.path.exists(map_file):
             light_cone_halo_map(self, map_file=map_file, **kwargs)
@@ -240,22 +252,34 @@
     def project_light_cone(self, field, weight_field=None, apply_halo_mask=False, node=None,
                            save_stack=True, save_slice_images=False, cmap_name='algae', 
                            flatten_stack=False, photon_field=False):
-        """
-        Create projections for light cone, then add them together.
-        :param weight_field (str): the weight field of the projection.  This has the same meaning as in standard 
-               projections.  Default: None.
-        :param apply_halo_mask (bool): if True, a boolean mask is apply to the light cone projection.  See below for a 
-               description of halo masks.  Default: False.
-        :param node (str): a prefix to be prepended to the node name under which the projection data is serialized.  
-               Default: None.
-        :param save_stack (bool): if True, the unflatted light cone data including each individual slice is written to 
-               an hdf5 file.  Default: True.
-        :param save_slice_images (bool): save images for each individual projection slice.  Default: False.
-        :param cmap_name (str): color map for images.  Default: 'algae'.
-        :param flatten_stack (bool): if True, the light cone stack is continually flattened each time a slice is added 
-               in order to save memory.  This is generally not necessary.  Default: False.
-        :param photon_field (bool): if True, the projection data for each slice is decremented by 4 Pi R^2`, where R 
-               is the luminosity distance between the observer and the slice redshift.  Default: False.
+        r"""Create projections for light cone, then add them together.
+
+        Parameters
+        ----------
+        weight_field : str
+            the weight field of the projection.  This has the same meaning as
+            in standard projections.  Default: None.
+        apply_halo_mask : bool
+            if True, a boolean mask is apply to the light cone projection.  See
+            below for a description of halo masks.  Default: False.
+        node : string
+            a prefix to be prepended to the node name under which the
+            projection data is serialized.  Default: None.
+        save_stack : bool
+            if True, the unflatted light cone data including each individual
+            slice is written to an hdf5 file.  Default: True.
+        save_slice_images : bool
+            save images for each individual projection slice.  Default: False.
+        cmap_name : string
+            color map for images.  Default: 'algae'.
+        flatten_stack : bool
+            if True, the light cone stack is continually flattened each time a
+            slice is added in order to save memory.  This is generally not
+            necessary.  Default: False.
+        photon_field : bool
+            if True, the projection data for each slice is decremented by 4 Pi
+            R^2`, where R is the luminosity distance between the observer and
+            the slice redshift.  Default: False.
         """
 
         # Clear projection stack.


--- a/yt/data_objects/derived_quantities.py	Thu Aug 25 22:03:55 2011 -0400
+++ b/yt/data_objects/derived_quantities.py	Thu Aug 25 22:36:52 2011 -0400
@@ -173,9 +173,13 @@
     This function returns the location of the center
     of mass. By default, it computes of the *non-particle* data in the object. 
 
-    :param use_cells: if True, will include the cell mass (default: True)
-    :param use_particles: if True, will include the particles in the 
-    object (default: False)
+    Parameters
+    ----------
+
+    use_cells : bool
+        If True, will include the cell mass (default: True)
+    use_particles : bool
+        if True, will include the particles in the object (default: False)
     """
     x = y = z = den = 0
     if use_cells: 


--- a/yt/frontends/orion/data_structures.py	Thu Aug 25 22:03:55 2011 -0400
+++ b/yt/frontends/orion/data_structures.py	Thu Aug 25 22:36:52 2011 -0400
@@ -444,16 +444,12 @@
     def __init__(self, plotname, paramFilename=None, fparamFilename=None,
                  data_style='orion_native', paranoia=False,
                  storage_filename = None):
-        """need to override for Orion file structure.
-
-        the paramfile is usually called "inputs"
+        """
+        The paramfile is usually called "inputs"
         and there may be a fortran inputs file usually called "probin"
         plotname here will be a directory name
-        as per BoxLib, data_style will be one of
-         * Native
-         * IEEE (not implemented in yt)
-         * ASCII (not implemented in yt)
-
+        as per BoxLib, data_style will be Native (implemented here), IEEE (not
+        yet implemented) or ASCII (not yet implemented.)
         """
         self.storage_filename = storage_filename
         self.paranoid_read = paranoia


--- a/yt/funcs.py	Thu Aug 25 22:03:55 2011 -0400
+++ b/yt/funcs.py	Thu Aug 25 22:36:52 2011 -0400
@@ -137,17 +137,9 @@
     return resident * pagesize / (1024 * 1024) # return in megs
 
 def time_execution(func):
-    """
+    r"""
     Decorator for seeing how long a given function takes, depending on whether
     or not the global 'yt.timefunctions' config parameter is set.
-
-    This can be used like so:
-
-    .. code-block:: python
-
-       @time_execution
-    def some_longrunning_function(...):
-
     """
     @wraps(func)
     def wrapper(*arg, **kw):

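The yt/funcs.py hunk above drops the usage example from the time_execution docstring. As a generic, self-contained sketch of the pattern (not yt's implementation; the config flag is simplified to a module-level boolean):

    import time
    from functools import wraps

    TIME_FUNCTIONS = True  # stands in for the 'yt.timefunctions' config parameter

    def time_execution(func):
        """Report how long *func* takes when timing is enabled."""
        @wraps(func)
        def wrapper(*args, **kwargs):
            t0 = time.time()
            result = func(*args, **kwargs)
            if TIME_FUNCTIONS:
                print("%s took %0.3f s" % (func.__name__, time.time() - t0))
            return result
        return wrapper

    @time_execution
    def some_longrunning_function(n):
        return sum(range(n))

    some_longrunning_function(10**6)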

http://bitbucket.org/yt_analysis/yt/changeset/707c5428298c/
changeset:   707c5428298c
branch:      stable
user:        caseywstark
date:        2011-06-19 23:30:17
summary:     Just readability stuff. pep 8 and 257
affected #:  1 file (96 bytes)

--- a/yt/data_objects/field_info_container.py	Tue May 24 15:05:49 2011 -0600
+++ b/yt/data_objects/field_info_container.py	Sun Jun 19 14:30:17 2011 -0700
@@ -23,50 +23,59 @@
 
   You should have received a copy of the GNU General Public License
   along with this program.  If not, see <http://www.gnu.org/licenses/>.
+
 """
 
 import types
-import numpy as na
 import inspect
 import copy
 import itertools
 
+import numpy as na
+
 from yt.funcs import *
 
 class FieldInfoContainer(object): # We are all Borg.
     """
     This is a generic field container.  It contains a list of potential derived
-    fields, all of which know how to act on a data object and return a value.  This
-    object handles converting units as well as validating the availability of a
-    given field.
+    fields, all of which know how to act on a data object and return a value.
+    This object handles converting units as well as validating the availability
+    of a given field.
+
     """
     _shared_state = {}
     _universal_field_list = {}
+
     def __new__(cls, *args, **kwargs):
         self = object.__new__(cls, *args, **kwargs)
         self.__dict__ = cls._shared_state
         return self
+
     def __getitem__(self, key):
         if key in self._universal_field_list:
             return self._universal_field_list[key]
         raise KeyError
+
     def keys(self):
-        """
-        Return all the field names this object knows about.
-        """
+        """ Return all the field names this object knows about. """
         return self._universal_field_list.keys()
+
     def __iter__(self):
         return self._universal_field_list.iterkeys()
+
     def __setitem__(self, key, val):
         self._universal_field_list[key] = val
+
     def has_key(self, key):
         return key in self._universal_field_list
+
     def add_field(self, name, function = None, **kwargs):
         """
         Add a new field, along with supplemental metadata, to the list of
         available fields.  This respects a number of arguments, all of which
         are passed on to the constructor for
         :class:`~yt.data_objects.api.DerivedField`.
+
         """
         if function == None:
             def create_function(function):
@@ -74,6 +83,7 @@
                 return function
             return create_function
         self[name] = DerivedField(name, function, **kwargs)
+
 FieldInfo = FieldInfoContainer()
 add_field = FieldInfo.add_field
 
@@ -89,14 +99,18 @@
 class CodeFieldInfoContainer(FieldInfoContainer):
     def __setitem__(self, key, val):
         self._field_list[key] = val
+
     def __iter__(self):
         return itertools.chain(self._field_list.iterkeys(),
-                        self._universal_field_list.iterkeys())
+                               self._universal_field_list.iterkeys())
+
     def keys(self):
         return set(self._field_list.keys() + self._universal_field_list.keys())
+
     def has_key(self, key):
         return key in self._universal_field_list \
             or key in self._field_list
+
     def __getitem__(self, key):
         if key in self._field_list:
             return self._field_list[key]
@@ -111,6 +125,7 @@
     def __init__(self, ghost_zones = 0, fields=None):
         self.ghost_zones = ghost_zones
         self.fields = fields
+
     def __str__(self):
         return "(%s, %s)" % (self.ghost_zones, self.fields)
 
@@ -121,18 +136,21 @@
 class NeedsDataField(ValidationException):
     def __init__(self, missing_fields):
         self.missing_fields = missing_fields
+
     def __str__(self):
         return "(%s)" % (self.missing_fields)
 
 class NeedsProperty(ValidationException):
     def __init__(self, missing_properties):
         self.missing_properties = missing_properties
+
     def __str__(self):
         return "(%s)" % (self.missing_properties)
 
 class NeedsParameter(ValidationException):
     def __init__(self, missing_parameters):
         self.missing_parameters = missing_parameters
+
     def __str__(self):
         return "(%s)" % (self.missing_parameters)
 
@@ -141,21 +159,25 @@
     NumberOfParticles = 1
     _read_exception = None
     _id_offset = 0
+
     def __init__(self, nd = 16, pf = None, flat = False):
         self.nd = nd
         self.flat = flat
         self._spatial = not flat
-        self.ActiveDimensions = [nd,nd,nd]
-        self.LeftEdge = [0.0,0.0,0.0]
-        self.RightEdge = [1.0,1.0,1.0]
+        self.ActiveDimensions = [nd, nd, nd]
+        self.LeftEdge = [0.0, 0.0, 0.0]
+        self.RightEdge = [1.0, 1.0, 1.0]
         self.dds = na.ones(3, "float64")
         self['dx'] = self['dy'] = self['dz'] = na.array([1.0])
+
         class fake_parameter_file(defaultdict):
             pass
-        if pf is None:
+
+        if pf is None:  # setup defaults
             pf = fake_parameter_file(lambda: 1)
             pf.current_redshift = pf.omega_lambda = pf.omega_matter = \
                 pf.hubble_constant = pf.cosmological_simulation = 0.0
+
         self.pf = pf
         class fake_hierarchy(object):
             class fake_io(object):
@@ -165,36 +187,45 @@
             io = fake_io()
             def get_smallest_dx(self):
                 return 1.0
+
         self.hierarchy = fake_hierarchy()
         self.requested = []
         self.requested_parameters = []
         if not self.flat:
             defaultdict.__init__(self,
-                lambda: na.ones((nd,nd,nd), dtype='float64')
-                + 1e-4*na.random.random((nd,nd,nd)))
+                lambda: na.ones((nd, nd, nd), dtype='float64')
+                + 1e-4*na.random.random((nd, nd, nd)))
         else:
             defaultdict.__init__(self, 
-                lambda: na.ones((nd*nd*nd), dtype='float64')
-                + 1e-4*na.random.random((nd*nd*nd)))
+                lambda: na.ones((nd * nd * nd), dtype='float64')
+                + 1e-4*na.random.random((nd * nd * nd)))
+
     def __missing__(self, item):
         if FieldInfo.has_key(item) and \
             FieldInfo[item]._function.func_name != '<lambda>':
+
             try:
                 vv = FieldInfo[item](self)
             except NeedsGridType as exc:
                 ngz = exc.ghost_zones
-                nfd = FieldDetector(self.nd+ngz*2)
+                nfd = FieldDetector(self.nd + ngz * 2)
                 nfd._num_ghost_zones = ngz
                 vv = FieldInfo[item](nfd)
-                if ngz > 0: vv = vv[ngz:-ngz,ngz:-ngz,ngz:-ngz]
+
+                if ngz > 0: vv = vv[ngz:-ngz, ngz:-ngz, ngz:-ngz]
+
                 for i in nfd.requested:
                     if i not in self.requested: self.requested.append(i)
+
                 for i in nfd.requested_parameters:
-                    if i not in self.requested_parameters: self.requested_parameters.append(i)
+                    if i not in self.requested_parameters:
+                        self.requested_parameters.append(i)
+
             if vv is not None:
                 if not self.flat: self[item] = vv
                 else: self[item] = vv.ravel()
                 return self[item]
+
         self.requested.append(item)
         return defaultdict.__missing__(self, item)
 
@@ -208,12 +239,15 @@
 
     def get_field_parameter(self, param):
         self.requested_parameters.append(param)
-        if param in ['bulk_velocity','center','height_vector']:
+
+        if param in ['bulk_velocity', 'center', 'height_vector']:
             return na.random.random(3)*1e-2
         else:
             return 0.0
+
     _num_ghost_zones = 0
     id = 1
+
     def has_field_parameter(self, param): return True
     def convert(self, item): return 1
 
@@ -244,23 +278,29 @@
         :param display_name: a name used in the plots
         :param projection_conversion: which unit should we multiply by in a
                                       projection?
+
         """
         self.name = name
         self._function = function
+
         if validators:
             self.validators = ensure_list(validators)
         else:
             self.validators = []
+
         self.take_log = take_log
         self._units = units
         self._projected_units = projected_units
+
         if not convert_function:
             convert_function = lambda a: 1.0
         self._convert_function = convert_function
         self._particle_convert_function = particle_convert_function
+
         self.particle_type = particle_type
         self.vector_field = vector_field
         self.projection_conversion = projection_conversion
+
         self.display_field = display_field
         self.display_name = display_name
         self.not_in_all = not_in_all
@@ -269,6 +309,7 @@
         """
         This raises an exception of the appropriate type if the set of
         validation mechanisms are not met, and otherwise returns True.
+
         """
         for validator in self.validators:
             validator(data)
@@ -278,6 +319,7 @@
     def get_dependencies(self, *args, **kwargs):
         """
         This returns a list of names of fields that this field depends on.
+
         """
         e = FieldDetector(*args, **kwargs)
         if self._function.func_name == '<lambda>':
@@ -287,47 +329,50 @@
         return e
 
     def get_units(self):
-        """
-        Return a string describing the units.
-        """
+        """ Return a string describing the units. """
         return self._units
 
     def get_projected_units(self):
         """
         Return a string describing the units if the field has been projected.
+
         """
         return self._projected_units
 
     def __call__(self, data):
-        """
-        Return the value of the field in a given *data* object.
-        """
+        """ Return the value of the field in a given *data* object. """
         ii = self.check_available(data)
         original_fields = data.keys() # Copy
         dd = self._function(self, data)
         dd *= self._convert_function(data)
+
         for field_name in data.keys():
             if field_name not in original_fields:
                 del data[field_name]
+
         return dd
 
     def get_source(self):
         """
         Return a string containing the source of the function (if possible.)
+
         """
         return inspect.getsource(self._function)
 
     def get_label(self, projected=False):
         """
         Return a data label for the given field, inluding units.
+
         """
         name = self.name
         if self.display_name is not None: name = self.display_name
         data_label = r"$\rm{%s}" % name
+
         if projected: units = self.get_projected_units()
         else: units = self.get_units()
         if units != "": data_label += r"\/\/ (%s)" % (units)
         data_label += r"$"
+
         return data_label
 
     def particle_convert(self, data):
@@ -342,9 +387,11 @@
     def __init__(self, parameters):
         """
         This validator ensures that the parameter file has a given parameter.
+
         """
         FieldValidator.__init__(self)
         self.parameters = ensure_list(parameters)
+
     def __call__(self, data):
         doesnt_have = []
         for p in self.parameters:
@@ -357,11 +404,13 @@
 class ValidateDataField(FieldValidator):
     def __init__(self, field):
         """
-        This validator ensures that the output file has a given data field stored
-        in it.
+        This validator ensures that the output file has a given data field
+        stored in it.
+
         """
         FieldValidator.__init__(self)
         self.fields = ensure_list(field)
+
     def __call__(self, data):
         doesnt_have = []
         if isinstance(data, FieldDetector): return True
@@ -370,15 +419,19 @@
                 doesnt_have.append(f)
         if len(doesnt_have) > 0:
             raise NeedsDataField(doesnt_have)
+
         return True
 
 class ValidateProperty(FieldValidator):
     def __init__(self, prop):
         """
-        This validator ensures that the data object has a given python attribute.
+        This validator ensures that the data object has a given python
+        attribute.
+
         """
         FieldValidator.__init__(self)
         self.prop = ensure_list(prop)
+
     def __call__(self, data):
         doesnt_have = []
         for p in self.prop:
@@ -386,6 +439,7 @@
                 doesnt_have.append(p)
         if len(doesnt_have) > 0:
             raise NeedsProperty(doesnt_have)
+
         return True
 
 class ValidateSpatial(FieldValidator):
@@ -393,13 +447,15 @@
         """
         This validator ensures that the data handed to the field is of spatial
         nature -- that is to say, 3-D.
+
         """
         FieldValidator.__init__(self)
         self.ghost_zones = ghost_zones
         self.fields = fields
+
     def __call__(self, data):
-        # When we say spatial information, we really mean
-        # that it has a three-dimensional data structure
+        # When we say spatial information, we really mean that it has a
+        # three-dimensional data structure
         #if isinstance(data, FieldDetector): return True
         if not data._spatial:
             raise NeedsGridType(self.ghost_zones,self.fields)
@@ -412,8 +468,10 @@
         """
         This validator ensures that the data handed to the field is an actual
         grid patch, not a covering grid of any kind.
+
         """
         FieldValidator.__init__(self)
+
     def __call__(self, data):
         # We need to make sure that it's an actual AMR grid
         if isinstance(data, FieldDetector): return True

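FieldInfoContainer's "We are all Borg" comment refers to the shared-state idiom in __new__ above: every instance points its __dict__ at one class-level dictionary, so fields registered through any instance are visible to all of them. A minimal, generic illustration of that idiom (not yt code):

    class Borg(object):
        """All instances share a single __dict__, so attribute writes are global."""
        _shared_state = {}

        def __new__(cls, *args, **kwargs):
            self = object.__new__(cls)
            self.__dict__ = cls._shared_state
            return self

    a = Borg()
    b = Borg()
    a.registry = {"Density": None}
    assert b.registry is a.registry  # both instances see the same registry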

http://bitbucket.org/yt_analysis/yt/changeset/1a609f2c06da/
changeset:   1a609f2c06da
branch:      stable
user:        MatthewTurk
date:        2011-08-26 16:23:42
summary:     Merging in Casey's changes for readability.
affected #:  1 file (78 bytes)

--- a/yt/data_objects/field_info_container.py	Thu Aug 25 22:36:52 2011 -0400
+++ b/yt/data_objects/field_info_container.py	Fri Aug 26 10:23:42 2011 -0400
@@ -26,19 +26,20 @@
 """
 
 import types
-import numpy as na
 import inspect
 import copy
 import itertools
 
+import numpy as na
+
 from yt.funcs import *
 
 class FieldInfoContainer(object): # We are all Borg.
     """
     This is a generic field container.  It contains a list of potential derived
-    fields, all of which know how to act on a data object and return a value.  This
-    object handles converting units as well as validating the availability of a
-    given field.
+    fields, all of which know how to act on a data object and return a value.
+    This object handles converting units as well as validating the availability
+    of a given field.
     """
     _shared_state = {}
     _universal_field_list = {}
@@ -51,22 +52,25 @@
             return self._universal_field_list[key]
         raise KeyError
     def keys(self):
-        """
-        Return all the field names this object knows about.
-        """
+        """ Return all the field names this object knows about. """
         return self._universal_field_list.keys()
+
     def __iter__(self):
         return self._universal_field_list.iterkeys()
+
     def __setitem__(self, key, val):
         self._universal_field_list[key] = val
+
     def has_key(self, key):
         return key in self._universal_field_list
+
     def add_field(self, name, function = None, **kwargs):
         """
         Add a new field, along with supplemental metadata, to the list of
         available fields.  This respects a number of arguments, all of which
         are passed on to the constructor for
         :class:`~yt.data_objects.api.DerivedField`.
+
         """
         if function == None:
             def create_function(function):
@@ -74,6 +78,7 @@
                 return function
             return create_function
         self[name] = DerivedField(name, function, **kwargs)
+
 FieldInfo = FieldInfoContainer()
 add_field = FieldInfo.add_field
 
@@ -89,14 +94,18 @@
 class CodeFieldInfoContainer(FieldInfoContainer):
     def __setitem__(self, key, val):
         self._field_list[key] = val
+
     def __iter__(self):
         return itertools.chain(self._field_list.iterkeys(),
-                        self._universal_field_list.iterkeys())
+                               self._universal_field_list.iterkeys())
+
     def keys(self):
         return set(self._field_list.keys() + self._universal_field_list.keys())
+
     def has_key(self, key):
         return key in self._universal_field_list \
             or key in self._field_list
+
     def __getitem__(self, key):
         if key in self._field_list:
             return self._field_list[key]
@@ -111,6 +120,7 @@
     def __init__(self, ghost_zones = 0, fields=None):
         self.ghost_zones = ghost_zones
         self.fields = fields
+
     def __str__(self):
         return "(%s, %s)" % (self.ghost_zones, self.fields)
 
@@ -121,18 +131,21 @@
 class NeedsDataField(ValidationException):
     def __init__(self, missing_fields):
         self.missing_fields = missing_fields
+
     def __str__(self):
         return "(%s)" % (self.missing_fields)
 
 class NeedsProperty(ValidationException):
     def __init__(self, missing_properties):
         self.missing_properties = missing_properties
+
     def __str__(self):
         return "(%s)" % (self.missing_properties)
 
 class NeedsParameter(ValidationException):
     def __init__(self, missing_parameters):
         self.missing_parameters = missing_parameters
+
     def __str__(self):
         return "(%s)" % (self.missing_parameters)
 
@@ -141,18 +154,19 @@
     NumberOfParticles = 1
     _read_exception = None
     _id_offset = 0
+
     def __init__(self, nd = 16, pf = None, flat = False):
         self.nd = nd
         self.flat = flat
         self._spatial = not flat
-        self.ActiveDimensions = [nd,nd,nd]
-        self.LeftEdge = [0.0,0.0,0.0]
-        self.RightEdge = [1.0,1.0,1.0]
+        self.ActiveDimensions = [nd, nd, nd]
+        self.LeftEdge = [0.0, 0.0, 0.0]
+        self.RightEdge = [1.0, 1.0, 1.0]
         self.dds = na.ones(3, "float64")
         self['dx'] = self['dy'] = self['dz'] = na.array([1.0])
         class fake_parameter_file(defaultdict):
             pass
-        if pf is None:
+        if pf is None:  # setup defaults
             pf = fake_parameter_file(lambda: 1)
             pf.current_redshift = pf.omega_lambda = pf.omega_matter = \
                 pf.hubble_constant = pf.cosmological_simulation = 0.0
@@ -168,17 +182,18 @@
             io = fake_io()
             def get_smallest_dx(self):
                 return 1.0
+
         self.hierarchy = fake_hierarchy()
         self.requested = []
         self.requested_parameters = []
         if not self.flat:
             defaultdict.__init__(self,
-                lambda: na.ones((nd,nd,nd), dtype='float64')
-                + 1e-4*na.random.random((nd,nd,nd)))
+                lambda: na.ones((nd, nd, nd), dtype='float64')
+                + 1e-4*na.random.random((nd, nd, nd)))
         else:
             defaultdict.__init__(self, 
-                lambda: na.ones((nd*nd*nd), dtype='float64')
-                + 1e-4*na.random.random((nd*nd*nd)))
+                lambda: na.ones((nd * nd * nd), dtype='float64')
+                + 1e-4*na.random.random((nd * nd * nd)))
     def __missing__(self, item):
         FI = getattr(self.pf, "field_info", FieldInfo)
         if FI.has_key(item) and \
@@ -190,15 +205,20 @@
                 nfd = FieldDetector(self.nd+ngz*2)
                 nfd._num_ghost_zones = ngz
                 vv = FI[item](nfd)
-                if ngz > 0: vv = vv[ngz:-ngz,ngz:-ngz,ngz:-ngz]
+                if ngz > 0: vv = vv[ngz:-ngz, ngz:-ngz, ngz:-ngz]
+
                 for i in nfd.requested:
                     if i not in self.requested: self.requested.append(i)
+
                 for i in nfd.requested_parameters:
-                    if i not in self.requested_parameters: self.requested_parameters.append(i)
+                    if i not in self.requested_parameters:
+                        self.requested_parameters.append(i)
+
             if vv is not None:
                 if not self.flat: self[item] = vv
                 else: self[item] = vv.ravel()
                 return self[item]
+
         self.requested.append(item)
         return defaultdict.__missing__(self, item)
 
@@ -249,23 +269,29 @@
         :param display_name: a name used in the plots
         :param projection_conversion: which unit should we multiply by in a
                                       projection?
+
         """
         self.name = name
         self._function = function
+
         if validators:
             self.validators = ensure_list(validators)
         else:
             self.validators = []
+
         self.take_log = take_log
         self._units = units
         self._projected_units = projected_units
+
         if not convert_function:
             convert_function = lambda a: 1.0
         self._convert_function = convert_function
         self._particle_convert_function = particle_convert_function
+
         self.particle_type = particle_type
         self.vector_field = vector_field
         self.projection_conversion = projection_conversion
+
         self.display_field = display_field
         self.display_name = display_name
         self.not_in_all = not_in_all
@@ -274,6 +300,7 @@
         """
         This raises an exception of the appropriate type if the set of
         validation mechanisms are not met, and otherwise returns True.
+
         """
         for validator in self.validators:
             validator(data)
@@ -283,6 +310,7 @@
     def get_dependencies(self, *args, **kwargs):
         """
         This returns a list of names of fields that this field depends on.
+
         """
         e = FieldDetector(*args, **kwargs)
         if self._function.func_name == '<lambda>':
@@ -292,47 +320,50 @@
         return e
 
     def get_units(self):
-        """
-        Return a string describing the units.
-        """
+        """ Return a string describing the units.  """
         return self._units
 
     def get_projected_units(self):
         """
         Return a string describing the units if the field has been projected.
+
         """
         return self._projected_units
 
     def __call__(self, data):
-        """
-        Return the value of the field in a given *data* object.
-        """
+        """ Return the value of the field in a given *data* object.  """
         ii = self.check_available(data)
         original_fields = data.keys() # Copy
         dd = self._function(self, data)
         dd *= self._convert_function(data)
+
         for field_name in data.keys():
             if field_name not in original_fields:
                 del data[field_name]
+
         return dd
 
     def get_source(self):
         """
         Return a string containing the source of the function (if possible.)
+
         """
         return inspect.getsource(self._function)
 
     def get_label(self, projected=False):
         """
         Return a data label for the given field, inluding units.
+
         """
         name = self.name
         if self.display_name is not None: name = self.display_name
         data_label = r"$\rm{%s}" % name
+
         if projected: units = self.get_projected_units()
         else: units = self.get_units()
         if units != "": data_label += r"\/\/ (%s)" % (units)
         data_label += r"$"
+
         return data_label
 
     def particle_convert(self, data):
@@ -347,9 +378,11 @@
     def __init__(self, parameters):
         """
         This validator ensures that the parameter file has a given parameter.
+
         """
         FieldValidator.__init__(self)
         self.parameters = ensure_list(parameters)
+
     def __call__(self, data):
         doesnt_have = []
         for p in self.parameters:
@@ -362,11 +395,13 @@
 class ValidateDataField(FieldValidator):
     def __init__(self, field):
         """
-        This validator ensures that the output file has a given data field stored
-        in it.
+        This validator ensures that the output file has a given data field
+        stored in it.
+
         """
         FieldValidator.__init__(self)
         self.fields = ensure_list(field)
+
     def __call__(self, data):
         doesnt_have = []
         if isinstance(data, FieldDetector): return True
@@ -375,15 +410,19 @@
                 doesnt_have.append(f)
         if len(doesnt_have) > 0:
             raise NeedsDataField(doesnt_have)
+
         return True
 
 class ValidateProperty(FieldValidator):
     def __init__(self, prop):
         """
-        This validator ensures that the data object has a given python attribute.
+        This validator ensures that the data object has a given python
+        attribute.
+
         """
         FieldValidator.__init__(self)
         self.prop = ensure_list(prop)
+
     def __call__(self, data):
         doesnt_have = []
         for p in self.prop:
@@ -391,6 +430,7 @@
                 doesnt_have.append(p)
         if len(doesnt_have) > 0:
             raise NeedsProperty(doesnt_have)
+
         return True
 
 class ValidateSpatial(FieldValidator):
@@ -398,13 +438,15 @@
         """
         This validator ensures that the data handed to the field is of spatial
         nature -- that is to say, 3-D.
+
         """
         FieldValidator.__init__(self)
         self.ghost_zones = ghost_zones
         self.fields = fields
+
     def __call__(self, data):
-        # When we say spatial information, we really mean
-        # that it has a three-dimensional data structure
+        # When we say spatial information, we really mean that it has a
+        # three-dimensional data structure
         #if isinstance(data, FieldDetector): return True
         if not data._spatial:
             raise NeedsGridType(self.ghost_zones,self.fields)
@@ -417,8 +459,10 @@
         """
         This validator ensures that the data handed to the field is an actual
         grid patch, not a covering grid of any kind.
+
         """
         FieldValidator.__init__(self)
+
     def __call__(self, data):
         # We need to make sure that it's an actual AMR grid
         if isinstance(data, FieldDetector): return True


http://bitbucket.org/yt_analysis/yt/changeset/6d1198309781/
changeset:   6d1198309781
branch:      stable
user:        caseywstark
date:        2011-08-26 21:26:16
summary:     Fixing dumb path bug (my bad)
affected #:  1 file (226 bytes)

--- a/yt/frontends/nyx/data_structures.py	Fri Aug 26 10:23:42 2011 -0400
+++ b/yt/frontends/nyx/data_structures.py	Fri Aug 26 12:26:16 2011 -0700
@@ -548,11 +548,17 @@
         """
         self.storage_filename = storage_filename
         self.parameter_filename = param_filename
-        self.parameter_file_path = os.path.abspath(self.parameter_filename)
         self.fparameter_filename = fparam_filename
-        self.fparameter_file_path = os.path.abspath(self.fparameter_filename)
+
         self.path = os.path.abspath(plotname)  # data folder
 
+        # silly inputs and probin file thing (this is on the Nyx todo list)
+        self.parameter_file_path = os.path.join(os.path.dirname(self.path),
+                                                self.parameter_filename)
+
+        self.fparameter_file_path = os.path.join(os.path.dirname(self.path),
+                                                 self.fparameter_filename)
+
         self.fparameters = {}
 
         # @todo: quick fix...

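The bug fixed here is that os.path.abspath resolves a bare filename against the current working directory, not against the data directory. A standalone demonstration of the difference (paths are hypothetical):

    import os

    plotname = "/data/runs/plt00010"   # hypothetical Nyx plot directory
    param_filename = "inputs"          # usually sits next to the plot directory

    # Broken: resolves against wherever yt happens to be run from.
    broken = os.path.abspath(param_filename)

    # Fixed (as in the commit): resolve relative to the parent of the data folder.
    path = os.path.abspath(plotname)
    fixed = os.path.join(os.path.dirname(path), param_filename)

    print(broken)  # e.g. /home/user/inputs, depending on the CWD
    print(fixed)   # /data/runs/inputs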

http://bitbucket.org/yt_analysis/yt/changeset/3e5dff87b558/
changeset:   3e5dff87b558
branch:      stable
user:        samskillman
date:        2011-08-26 21:56:51
summary:     Adding scale and scale_units to the magnetic_field callback, changing to keyword arguments in QuiverCallback
affected #:  1 file (173 bytes)

--- a/yt/visualization/plot_modifications.py	Fri Aug 26 12:26:16 2011 -0700
+++ b/yt/visualization/plot_modifications.py	Fri Aug 26 13:56:51 2011 -0600
@@ -81,18 +81,20 @@
         else:
             xv = "%s-velocity" % (x_names[plot.data.axis])
             yv = "%s-velocity" % (y_names[plot.data.axis])
-            qcb = QuiverCallback(xv, yv, self.factor, self.scale, self.scale_units)
+            qcb = QuiverCallback(xv, yv, self.factor, scale=self.scale, scale_units=self.scale_units)
         return qcb(plot)
 
 class MagFieldCallback(PlotCallback):
     _type_name = "magnetic_field"
-    def __init__(self, factor=16):
+    def __init__(self, factor=16, scale=None, scale_units=None):
         """
         Adds a 'quiver' plot of magnetic field to the plot, skipping all but
         every *factor* datapoint
         """
         PlotCallback.__init__(self)
         self.factor = factor
+        self.scale  = scale
+        self.scale_units = scale_units
 
     def __call__(self, plot):
         # Instantiation of these is cheap
@@ -101,12 +103,12 @@
         else:
             xv = "B%s" % (x_names[plot.data.axis])
             yv = "B%s" % (y_names[plot.data.axis])
-            qcb = QuiverCallback(xv, yv, self.factor)
+            qcb = QuiverCallback(xv, yv, self.factor, scale=self.scale, scale_units=self.scale_units)
         return qcb(plot)
 
 class QuiverCallback(PlotCallback):
     _type_name = "quiver"
-    def __init__(self, field_x, field_y, factor, scale, scale_units):
+    def __init__(self, field_x, field_y, factor, scale=None, scale_units=None):
         """
         Adds a 'quiver' plot to any plot, using the *field_x* and *field_y*
         from the associated data, skipping every *factor* datapoints

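With these keyword arguments exposed, the magnetic_field annotation can pass Matplotlib's quiver scaling options through to QuiverCallback. A hedged usage sketch against the yt 2.x PlotCollection interface (the dataset path and parameter values are illustrative):

    from yt.mods import *

    pf = load("RD0005/RedshiftOutput0005")      # hypothetical dataset
    pc = PlotCollection(pf, center=[0.5, 0.5, 0.5])
    p = pc.add_slice("Density", 0)

    # scale and scale_units are forwarded to matplotlib's quiver.
    p.modify["magnetic_field"](factor=16, scale=40.0, scale_units="width")
    pc.save("slice_with_B_field")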

http://bitbucket.org/yt_analysis/yt/changeset/154492039276/
changeset:   154492039276
branch:      stable
user:        MatthewTurk
date:        2011-08-26 22:02:30
summary:     Adding distribute_setup.py explicitly.
affected #:  1 file (28 bytes)

--- a/MANIFEST.in	Fri Aug 26 13:56:51 2011 -0600
+++ b/MANIFEST.in	Fri Aug 26 16:02:30 2011 -0400
@@ -1,2 +1,3 @@
+include distribute_setup.py
 recursive-include yt/gui/reason/html/ *.html *.png *.ico *.js
 recursive-include yt/ *.pyx *.pxd *.hh *.h README* 


http://bitbucket.org/yt_analysis/yt/changeset/3e5654bc09d4/
changeset:   3e5654bc09d4
branch:      stable
user:        MatthewTurk
date:        2011-08-29 01:08:37
summary:     Updating to distribute 0.6.21.
affected #:  1 file (0 bytes)

--- a/distribute_setup.py	Fri Aug 26 16:02:30 2011 -0400
+++ b/distribute_setup.py	Sun Aug 28 19:08:37 2011 -0400
@@ -46,7 +46,7 @@
             args = [quote(arg) for arg in args]
         return os.spawnl(os.P_WAIT, sys.executable, *args) == 0
 
-DEFAULT_VERSION = "0.6.14"
+DEFAULT_VERSION = "0.6.21"
 DEFAULT_URL = "http://pypi.python.org/packages/source/d/distribute/"
 SETUPTOOLS_FAKED_VERSION = "0.6c11"
 


http://bitbucket.org/yt_analysis/yt/changeset/41bd8aacfbc8/
changeset:   41bd8aacfbc8
branch:      stable
user:        MatthewTurk
date:        2011-09-01 22:07:42
summary:     Fixing the rocket import and updating the yt-project location for the pastebin.
affected #:  2 files (16 bytes)

--- a/scripts/yt_lodgeit.py	Sun Aug 28 19:08:37 2011 -0400
+++ b/scripts/yt_lodgeit.py	Thu Sep 01 16:07:42 2011 -0400
@@ -4,8 +4,8 @@
     LodgeIt!
     ~~~~~~~~
 
-    A script that pastes stuff into the enzotools pastebin on
-    paste.enztools.org.
+    A script that pastes stuff into the yt-project pastebin on
+    paste.yt-project.org.
 
     Modified (very, very slightly) from the original script by the authors
     below.


--- a/yt/gui/reason/bottle_mods.py	Sun Aug 28 19:08:37 2011 -0400
+++ b/yt/gui/reason/bottle_mods.py	Thu Sep 01 16:07:42 2011 -0400
@@ -99,7 +99,7 @@
 class YTRocketServer(ServerAdapter):
     server_info = {} # Hack to get back at instance vars
     def run(self, handler):
-        from rocket import Rocket
+        from yt.utilities.rocket import Rocket
         server = Rocket((self.host, self.port), 'wsgi', { 'wsgi_app' : handler })
         self.server_info[id(self)] = server
         server.start()


http://bitbucket.org/yt_analysis/yt/changeset/5a056f22d893/
changeset:   5a056f22d893
branch:      stable
user:        MatthewTurk
date:        2011-09-02 16:19:43
summary:     Added tag yt-2.2 for changeset 41bd8aacfbc8
affected #:  1 file (48 bytes)

--- a/.hgtags	Thu Sep 01 16:07:42 2011 -0400
+++ b/.hgtags	Fri Sep 02 10:19:43 2011 -0400
@@ -5155,3 +5155,4 @@
 ca6e536c15a60070e6988fd472dc771a1897e170 yt-2.0
 882c41eed5dd4a3cdcbb567bcb79b833e46b1f42 yt-2.0.1
 a2b3521b1590c25029ca0bc602ad6cb7ae7b8ba2 yt-2.1
+41bd8aacfbc81fa66d7a3f2cd2880f10c3e237a4 yt-2.2

Repository URL: https://bitbucket.org/yt_analysis/yt/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.


