[Yt-svn] yt: 4 new changesets

hg at spacepope.org
Sat Feb 19 10:59:33 PST 2011


hg Repository: yt
details:   yt/rev/2c647f9560cb
changeset: 3751:2c647f9560cb
user:      Matthew Turk <matthewturk at gmail.com>
date:      Fri Feb 11 14:09:11 2011 -0500
description:
First couple classes toward adaptive ray casting.  It's at this point in the
implementation that I realize I have to re-order much of the ray casting, as
well as calculate the exit_t for rays from bricks to calculate refinement
criteria.
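The refinement criterion described above (using each ray's exit_t from a brick) can be sketched roughly as follows. This is a simplified illustration in the spirit of adaptive ray tracing, not the actual implementation: the helper name `refine_rays`, the flat array layout, and the 4-way child split are all assumptions.

```python
import numpy as np

def refine_rays(origins, directions, widths, exit_ts, dx, max_ratio=1.0):
    """Hypothetical sketch: split any ray whose beam footprint at the point
    it exits a brick (parameterized by exit_t) exceeds the local cell width
    dx.  Each flagged ray becomes four child rays with half the beam width;
    a real tracer would also offset the child origins/directions."""
    footprint = widths * exit_ts          # beam width grows linearly with path length
    flag = footprint > max_ratio * dx     # refinement criterion at brick exit
    keep = ~flag
    # Children inherit origin and direction unchanged in this sketch.
    child_o = np.repeat(origins[flag], 4, axis=0)
    child_d = np.repeat(directions[flag], 4, axis=0)
    child_w = np.repeat(widths[flag] * 0.5, 4)
    new_o = np.concatenate([origins[keep], child_o], axis=0)
    new_d = np.concatenate([directions[keep], child_d], axis=0)
    new_w = np.concatenate([widths[keep], child_w])
    return new_o, new_d, new_w
```

This also illustrates why the traversal order matters: exit_t is only known once a ray has been marched through a brick, so refinement has to happen between bricks rather than up front.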

hg Repository: yt
details:   yt/rev/77e64bb3e1d4
changeset: 3752:77e64bb3e1d4
user:      Matthew Turk <matthewturk at gmail.com>
date:      Sat Feb 19 12:48:18 2011 -0500
description:
More work on adaptive ray tracing.  Now merging with Sam's updates to kD-tree
to test the traversal.
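The kD-tree traversal being tested can be sketched, in heavily simplified form, as a front-to-back walk: at each split plane, descend first into the child on the viewer's side so the data bricks composite in depth order. The node structure and names below are hypothetical; the real amr_kdtree nodes carry far more state (grid data, boundaries, parallel decomposition).

```python
class Node:
    """Minimal kD-tree node for illustration only."""
    def __init__(self, axis=None, split=None, left=None, right=None, brick=None):
        self.axis, self.split = axis, split
        self.left, self.right = left, right
        self.brick = brick  # leaves hold a data brick

def traverse(node, viewpoint, visit):
    # Front-to-back traversal: descend first into the child containing the
    # viewpoint, so bricks are visited near-to-far along the line of sight.
    if node.brick is not None:
        visit(node.brick)
        return
    near, far = node.left, node.right
    if viewpoint[node.axis] > node.split:
        near, far = far, near
    traverse(near, viewpoint, visit)
    traverse(far, viewpoint, visit)
```

For example, with a single split at x = 0.5, a viewpoint at x = 0.9 visits the right child's brick before the left one.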

hg Repository: yt
details:   yt/rev/abc9b02c309f
changeset: 3753:abc9b02c309f
user:      Matthew Turk <matthewturk at gmail.com>
date:      Sat Feb 19 12:48:31 2011 -0500
description:
Merging

hg Repository: yt
details:   yt/rev/c6df2b59fa19
changeset: 3754:c6df2b59fa19
user:      Matthew Turk <matthewturk at gmail.com>
date:      Sat Feb 19 13:50:20 2011 -0500
description:
More work on the adaptive ray tracing.  Added a camera and fixed some bugs.

diffstat:

 CREDITS                                                    |     8 +-
 README                                                     |    10 +-
 doc/README                                                 |     4 +-
 doc/how_to_develop_yt.txt                                  |     7 +
 doc/install_script.sh                                      |    14 +
 scripts/eyt                                                |   152 -
 yt/analysis_modules/halo_merger_tree/merger_tree.py        |     2 +-
 yt/analysis_modules/level_sets/contour_finder.py           |    28 +-
 yt/astro_objects/api.py                                    |    34 +
 yt/astro_objects/astrophysical_object.py                   |    90 +
 yt/astro_objects/clumped_region.py                         |    39 +
 yt/astro_objects/setup.py                                  |    12 +
 yt/astro_objects/simulation_volume.py                      |    32 +
 yt/data_objects/derived_quantities.py                      |    20 +-
 yt/data_objects/universal_fields.py                        |    54 +
 yt/frontends/enzo/data_structures.py                       |    11 +
 yt/frontends/enzo/io.py                                    |     4 +-
 yt/setup.py                                                |     1 +
 yt/utilities/_amr_utils/ContourFinding.pyx                 |    17 +
 yt/utilities/_amr_utils/VolumeIntegrator.pyx               |   151 +-
 yt/utilities/amr_kdtree/amr_kdtree.py                      |  1128 +++++------
 yt/utilities/parallel_tools/parallel_analysis_interface.py |     6 +-
 yt/visualization/volume_rendering/camera.py                |    86 +-
 yt/visualization/volume_rendering/grid_partitioner.py      |     2 +-
 24 files changed, 1098 insertions(+), 814 deletions(-)

diffs (truncated from 2433 to 300 lines):

diff -r a763d0bcb2df -r c6df2b59fa19 CREDITS
--- a/CREDITS	Thu Feb 10 10:09:48 2011 -0500
+++ b/CREDITS	Sat Feb 19 13:50:20 2011 -0500
@@ -17,7 +17,11 @@
 Enthought, the cmdln.py module by Trent Mick, and the progressbar module by
 Nilton Volpato.  The PasteBin interface code (as well as the PasteBin itself)
 was written by the Pocoo collective (pocoo.org).  The RamsesRead++ library was
-developed by Oliver Hahn.  Large parts of this code were guided by discussions
-with Tom Abel, Ralf Kaehler, Mike Norman and Greg Bryan.
+developed by Oliver Hahn.  yt also includes a slightly-modified version of
+libconfig (http://www.hyperrealm.com/libconfig/) and an unmodified version of
+several routines from HEALpix (http://healpix.jpl.nasa.gov/).
+
+Large parts of development of yt were guided by discussions with Tom Abel, Ralf
+Kaehler, Mike Norman and Greg Bryan.
 
 Thanks to everyone for all your contributions!
diff -r a763d0bcb2df -r c6df2b59fa19 README
--- a/README	Thu Feb 10 10:09:48 2011 -0500
+++ b/README	Sat Feb 19 13:50:20 2011 -0500
@@ -1,11 +1,13 @@
 Hi there!  You've just downloaded yt, an analysis tool for 3D Enzo adaptive
-mesh refinement datasets.  It's written in python and based on the NumPy and
-Matplotlib components.
+mesh refinement datasets.  It's written in python and heavily leverages both
+NumPy and Matplotlib for fast arrays and visualization, respectively.
 
-Full documentation is available online at http://yt.enzotools.org/ .
+Full documentation and a user community can be found at
+http://yt.enzotools.org/ .
 
 If you have used Python before, and are comfortable with installing packages,
-you should find the setup.py script fairly straightforward.
+you should find the setup.py script fairly straightforward: simply execute
+"python setup.py install".
 
 If you would rather a more automated installation, you can use the script
 doc/install_script.sh .  You will have to set the destination directory, and
diff -r a763d0bcb2df -r c6df2b59fa19 doc/README
--- a/doc/README	Thu Feb 10 10:09:48 2011 -0500
+++ b/doc/README	Sat Feb 19 13:50:20 2011 -0500
@@ -15,7 +15,7 @@
 
 You can also download a copy of the documentation and unzip it right here:
 
-wget http://yt.enzotools.org/doc/docs_html.zip
-unzip docs_html.zip
+wget http://yt.enzotools.org/doc/download.zip
+unzip download.zip
 
 Then open index.html with your favorite web browser, and be off!
diff -r a763d0bcb2df -r c6df2b59fa19 doc/how_to_develop_yt.txt
--- a/doc/how_to_develop_yt.txt	Thu Feb 10 10:09:48 2011 -0500
+++ b/doc/how_to_develop_yt.txt	Sat Feb 19 13:50:20 2011 -0500
@@ -74,6 +74,13 @@
       classes for data regions, covering grids, time series, and so on.  This
       also includes derived fields and derived quantities.
 
+   astro_objects
+      This is where all objects that represent astrophysical objects should
+      live -- for instance, galaxies, halos, disks, clusters and so on.  These
+      can be expressive, provide astrophysical analysis functionality and will
+      in general be more user-modifiable, user-tweakable, and much less of a
+      black box that data_objects.
+
    analysis_modules
       This is where all mechanisms for processing data live.  This includes
       things like clump finding, halo profiling, halo finding, and so on.  This
diff -r a763d0bcb2df -r c6df2b59fa19 doc/install_script.sh
--- a/doc/install_script.sh	Thu Feb 10 10:09:48 2011 -0500
+++ b/doc/install_script.sh	Sat Feb 19 13:50:20 2011 -0500
@@ -72,6 +72,7 @@
 {
     MYHOST=`hostname -s`  # just give the short one, not FQDN
     MYHOSTLONG=`hostname` # FQDN, for Ranger
+    MYOS=`uname -s`       # A guess at the OS
     if [ "${MYHOST##kraken}" != "${MYHOST}" ]
     then
         echo "Looks like you're on Kraken."
@@ -134,6 +135,19 @@
         echo "   $ module load gcc"
         echo
     fi
+    if [ "${MYOS##Darwin}" != "${MYOS}" ]
+    then
+        echo "Looks like you're running on Mac OSX."
+        echo
+        echo "NOTE: You may have problems if you are running OSX 10.6 (Snow"
+        echo "Leopard) or newer.  If you do, please set the following"
+        echo "environment variables, remove any broken installation tree, and"
+        echo "re-run this script verbatim."
+        echo
+        echo "$ export CC=gcc-4.2"
+        echo "$ export CXX=g++-4.2"
+        echo
+    fi
 }
 
 
diff -r a763d0bcb2df -r c6df2b59fa19 scripts/eyt
--- a/scripts/eyt	Thu Feb 10 10:09:48 2011 -0500
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,152 +0,0 @@
-#!python
-
-from yt.mods import *
-import os
-namespace = locals().copy()
-
-doc = """\
-
-Welcome to Enzo-embedded yt!
-
-The different processors are accessible via the 'mec' variable.  To get grid
-data, try using the get_grid_field function.  When done, be sure to kill the
-processes with 'mec.kill()'!
-
-Information about the mec variable, an instance of MultiEngineClient, can be
-found in the IPython documentation:
-
-http://ipython.scipy.org/doc/manual/html/parallel/parallel_multiengine.html
-
-You can use the '%px' command to issue commands on all the engines
-simultaneously.
-
-"""
-
-import IPython.Shell
-
-if "DISPLAY" in os.environ:
-    try:
-        ip_shell = IPython.Shell.IPShellMatplotlibWX(user_ns=namespace)
-    except ImportError:
-        ip_shell = IPython.Shell.IPShellMatplotlib(user_ns=namespace)
-else:
-    ip_shell = IPython.Shell.IPShellMatplotlib(user_ns=namespace)
-
-ip = ip_shell.IP.getapi()
-
-import os   
-import glob
-import itertools
-
-ip = ip_shell.IP.getapi()
-ip.ex("from yt.mods import *")
-from IPython.kernel import client
-
-class YTClient(object):
-    mec = None
-
-    def __init__(self):
-        self.refresh()
-
-    def eval(self, varname, targets = None):
-        """
-        This function pulls anything from the remote host, but it will overwrite
-        any variable named __tmp.  This is to get around nested variables and
-        properties on the remote host.
-        """
-        self.mec.execute("__tmp = %s" % varname, targets=targets)
-        result = self.mec.pull("__tmp", targets=targets)
-        return result
-
-    def get_grid_field(self, grid_index, field_name, raw=False):
-        """
-        Return the numpy array representing a piece of field information.
-        Note that *grid_index* is the actual index into the array, which is ID-1.
-
-        If *raw* is set to True, then only raw original fields from the hierarchy
-        are returned.  This will include ghost zones, and derived fields are
-        inaccessible.
-        """
-        proc = int(self.enzo.hierarchy_information["GridProcs"][grid_index])
-        if not raw: # go through yt
-            result = self.eval("pf.h.grids[%s]['%s']" % (
-                        grid_index, field_name), [proc])[0]
-        else: # go through enzo module
-            result = self.eval("enzo.grid_data[%s + 1]['%s']" % (
-                        grid_index, field_name), [proc])[0].swapaxes(0,2)
-        return result
-
-    def refresh(self):
-        if self.mec is not None: self.mec.kill()
-        self.mec = client.MultiEngineClient()
-        self.mec.activate()
-        # there are some blocks in hierarchy instantiation, so
-        # we pre-instantiate
-        self.mec.execute("pf.h") 
-        self.enzo = enzo_module_proxy(self)
-        self.pf = EnzoStaticOutputProxy(ytc=self)
-        ip.to_user_ns(dict(
-            mec=self.mec, ytc=self, pf = self.pf))
-
-class enzo_module_proxy(object):
-    def __init__(self, ytc):
-        self.hierarchy_information = ytc.eval("enzo.hierarchy_information", [0])[0]
-        self.conversion_factors = ytc.eval("enzo.conversion_factors", [0])[0]
-        self.yt_parameter_file = ytc.eval("enzo.yt_parameter_file", [0])[0]
-
-from yt.lagos import EnzoStaticOutputInMemory, EnzoHierarchyInMemory
-from yt.lagos.HierarchyType import _data_style_funcs
-from yt.lagos.DataReadingFuncs import BaseDataQueue
-
-class EnzoHierarchyProxy(EnzoHierarchyInMemory):
-    _data_style = 'proxy'
-    def _setup_field_lists(self):
-        self.field_list = self.parameter_file.ytc.eval("pf.h.field_list", [0])[0]
-
-    def _obtain_enzo(self):
-        return self.parameter_file.ytc.enzo
-
-class EnzoStaticOutputProxy(EnzoStaticOutputInMemory):
-    _data_style = 'proxy'
-    _hierarchy_class = EnzoHierarchyProxy
-
-    def __init__(self, *args, **kwargs):
-        self.ytc = kwargs.pop("ytc")
-        EnzoStaticOutputInMemory.__init__(self, *args, **kwargs)
-
-    def _obtain_enzo(self):
-        return self.ytc.enzo
-
-def _read_proxy_slice(self, grid, field, axis, coord):
-    data = ytc.get_grid_field(grid.id - 1, field, raw=True)
-    sl = [slice(3,-3), slice(3,-3), slice(3,-3)]
-    sl[axis] = slice(coord + 3, coord + 4)
-    sl = tuple(reversed(sl))
-    return data[sl].swapaxes(0,2)
-
-class DataQueueProxy(BaseDataQueue):
-    def __init__(self, ghost_zones = 3):
-        self.my_slice = (slice(ghost_zones, -ghost_zones),
-                         slice(ghost_zones, -ghost_zones),
-                         slice(ghost_zones, -ghost_zones))
-        BaseDataQueue.__init__(self)
-
-    def _read_set(self, grid, field):
-        data = ytc.get_grid_field(grid.id - 1, field, raw=True)
-        return data[self.my_slice]
-
-    def modify(self, field):
-        return field.swapaxes(0,2)
-
-def proxy_exception(*args, **kwargs):
-    return KeyError
-
-# things like compare buffers over time
-
-_data_style_funcs['proxy'] = \
-    (None, None, None, _read_proxy_slice, proxy_exception, DataQueueProxy)
-
-ytc = YTClient()
-mec = ytc.mec
-
-ip_shell.mainloop(sys_exit=1,banner=doc)
diff -r a763d0bcb2df -r c6df2b59fa19 yt/analysis_modules/halo_merger_tree/merger_tree.py
--- a/yt/analysis_modules/halo_merger_tree/merger_tree.py	Thu Feb 10 10:09:48 2011 -0500
+++ b/yt/analysis_modules/halo_merger_tree/merger_tree.py	Sat Feb 19 13:50:20 2011 -0500
@@ -216,7 +216,7 @@
         for cycle, file in enumerate(self.restart_files):
             gc.collect()
             pf = load(file)
-            self.period = self.pf.domain_right_edge - self.pf.domain_left_edge
+            self.period = pf.domain_right_edge - pf.domain_left_edge
             # If the halos are already found, skip this data step, unless
             # refresh is True.
             dir = os.path.dirname(file)
diff -r a763d0bcb2df -r c6df2b59fa19 yt/analysis_modules/level_sets/contour_finder.py
--- a/yt/analysis_modules/level_sets/contour_finder.py	Thu Feb 10 10:09:48 2011 -0500
+++ b/yt/analysis_modules/level_sets/contour_finder.py	Sat Feb 19 13:50:20 2011 -0500
@@ -256,7 +256,11 @@
                     s1.update(joins.pop(k2))
                     s1.update([k2])
                     updated += 1
-    return joins
+    tr = []
+    for k in joins.keys():
+        v = joins.pop(k)
+        tr.append((k, na.array(list(v), dtype="int64")))
+    return tr
 
 def identify_contours(data_source, field, min_val, max_val,
                           cached_fields=None):
@@ -300,15 +304,23 @@
     sort_new = na.array(list(set(tree)), dtype='int64')
     mylog.info("Coalescing %s joins", sort_new.shape[0])
     joins = coalesce_join_tree(sort_new)
+    #joins = [(i, na.array(list(j), dtype="int64")) for i, j in sorted(joins.items())]
     pbar = get_pbar("Joining ", len(joins))
     # This process could and should be done faster
-    for i, new in enumerate(sorted(joins.keys())):
-        pbar.update(i)
-        old_set = joins[new]
-        for old in old_set:
-            if old == new: continue
-            i1 = (data_source["tempContours"] == old)
-            data_source["tempContours"][i1] = new
+    print "Joining..."
+    t1 = time.time()
+    ff = data_source["tempContours"].astype("int64")
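The contour_finder.py changes above rework coalesce_join_tree to return a list of (new_id, old_ids) pairs with the old ids packed into int64 arrays, and the join loop is being converted to operate on an int64 copy (`ff`) of tempContours. The diff is truncated before the new loop body, so the following is only a guess at the vectorized replacement for the old per-contour boolean masks; `apply_joins` is a hypothetical helper, not code from the commit.

```python
import numpy as np

def apply_joins(temp_contours, joins):
    # joins: list of (new_id, old_ids_array) pairs, the shape produced by
    # the reworked coalesce_join_tree.  One vectorized membership test per
    # join replaces one boolean mask per old contour id.
    ff = temp_contours.astype("int64")
    for new, olds in joins:
        ff[np.in1d(ff, olds)] = new
    return ff
```

Relabeling with one `np.in1d` call per join scales with the number of joins rather than the total number of merged contour ids, which is presumably the speedup the "could and should be done faster" comment is after.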


