[yt-svn] commit/yt: MatthewTurk: Merged in atmyers/yt (pull request #1778)

commits-noreply at bitbucket.org commits-noreply at bitbucket.org
Tue Dec 15 16:56:56 PST 2015


1 new commit in yt:

https://bitbucket.org/yt_analysis/yt/commits/a065f0f3bd81/
Changeset:   a065f0f3bd81
Branch:      yt
User:        MatthewTurk
Date:        2015-12-16 00:56:24+00:00
Summary:     Merged in atmyers/yt (pull request #1778)

More unstructured mesh work
Affected #:  76 files

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -57,6 +57,11 @@
 yt/utilities/lib/marching_cubes.c
 yt/utilities/lib/png_writer.h
 yt/utilities/lib/write_array.c
+yt/utilities/lib/element_mappings.c
+yt/utilities/lib/mesh_construction.cpp
+yt/utilities/lib/mesh_samplers.cpp
+yt/utilities/lib/mesh_traversal.cpp
+yt/utilities/lib/mesh_intersection.cpp
 syntax: glob
 *.pyc
 .*.swp

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 doc/source/examining/loading_data.rst
--- a/doc/source/examining/loading_data.rst
+++ b/doc/source/examining/loading_data.rst
@@ -327,6 +327,100 @@
 
 .. _loading-fits-data:
 
+Exodus II Data
+--------------
+
+Exodus II is a file format for Finite Element datasets that is used by the MOOSE
+framework for file I/O. Support for this format (and for unstructured mesh data in
+general) is a new feature as of yt 3.3, so while we aim to fully support it, there
+may still be some bugs at present. Currently, yt can visualize first-order
+mesh types only (4-node quads, 8-node hexes, 3-node triangles, and 4-node tetrahedra).
+Development of higher-order visualization capability is a work in progress.
+
+To load an Exodus II dataset, you can use the ``yt.load`` command on the Exodus II
+file:
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010", step=0)
+
+Because Exodus II datasets can have multiple steps (which can correspond to time steps,
+Picard iterations, non-linear solve iterations, etc.), you can also specify a ``step``
+argument when you load an Exodus II dataset. This defines the index at which to look
+when you read data from the file.
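+
+For example, to load the final step in the file, you can pass in a negative
+index, just as when indexing a Python list:
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010", step=-1)  # the last step in the file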
+
+You can access the connectivity information directly by doing:
+
+.. code-block:: python
+    
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010", step=0)
+   print(ds.index.meshes[0].connectivity_coords)
+   print(ds.index.meshes[0].connectivity_indices)
+   print(ds.index.meshes[1].connectivity_coords)
+   print(ds.index.meshes[1].connectivity_indices)
+
+This particular dataset has two meshes in it, both of which are made of 8-node hexes.
+yt uses a field name convention to access these different meshes in plots and data
+objects. To see all the fields found in a particular dataset, you can do:
+
+.. code-block:: python
+    
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010", step=0)
+   print(ds.field_list)
+
+This will give you a list of field names like ``('connect1', 'diffused')`` and 
+``('connect2', 'convected')``. Here, fields labelled with ``'connect1'`` correspond to the
+first mesh, and those with ``'connect2'`` to the second, and so on. To grab the value
+of the ``'convected'`` variable at all the nodes in the first mesh, for example, you
+would do:
+
+.. code-block:: python
+    
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010", step=0)
+   ad = ds.all_data()  # geometric selection, this just grabs everything
+   print(ad['connect1', 'convected'])
+
+In this dataset, ``('connect1', 'convected')`` is a nodal field, meaning that the field values
+are defined at the vertices of the elements. If we examine the shape of the returned array:
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010", step=0)
+   ad = ds.all_data()
+   print(ad['connect1', 'convected'].shape)
+
+we see that this mesh has 12480 8-node hexahedral elements, and that we get 8 field values
+for each element. To get the vertex positions at which these field values are defined, we
+can do, for instance:
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010", step=0)
+   ad = ds.all_data()
+   print(ad['connect1', 'vertex_x'])
+
+If we instead look at an element-centered field, like ``('connect1', 'conv_indicator')``,
+we get:
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010", step=0)
+   ad = ds.all_data()
+   print(ad['connect1', 'conv_indicator'].shape)
+
+we instead get only one field value per element.
+
+For information about visualizing unstructured mesh data, including Exodus II datasets, 
+please see :ref:`unstructured-mesh-slices` and :ref:`unstructured_mesh_rendering`. 
+
+
 FITS Data
 ---------
 
@@ -1035,8 +1129,8 @@
 
 In addition to the above grid types, you can also load data stored on
 unstructured meshes. This type of mesh is used, for example, in many
-finite element calculations. Currently, hexahedral, tetrahedral, and
-wedge-shaped mesh element are supported.
+finite element calculations. Currently, hexahedral and tetrahedral
+mesh elements are supported.
 
 To load an unstructured mesh, you need to specify the following. First,
 you need to have a coordinates array, which should be an (L, 3) array

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 doc/source/reference/changelog.rst
--- a/doc/source/reference/changelog.rst
+++ b/doc/source/reference/changelog.rst
@@ -22,7 +22,7 @@
 * Late-stage beta support for Python 3 - unit tests and answer tests pass for 
   all the major frontends under python 3.4, and yt should now be mostly if not 
   fully usable.  Because many of the yt developers are still on Python 2 at 
-  this point, this should be considered a “late stage beta” as there may be 
+  this point, this should be considered a "late stage beta" as there may be 
   remaining issues yet to be identified or worked out.
 * Now supporting Gadget Friend-of-Friends/Subfind catalogs - see here to learn 
   how to load halo catalogs as regular yt datasets.

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 doc/source/visualizing/plots.rst
--- a/doc/source/visualizing/plots.rst
+++ b/doc/source/visualizing/plots.rst
@@ -355,6 +355,78 @@
 keyword arguments, as described in
 :class:`~yt.visualization.plot_window.OffAxisProjectionPlot`
 
+.. _unstructured-mesh-slices:
+
+Unstructured Mesh Slices
+------------------------
+
+Unstructured Mesh datasets can be sliced using the same syntax as above.
+Here is an example script using a publicly available MOOSE dataset:
+
+.. python-script::
+
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010")
+   sl = yt.SlicePlot(ds, 'x', ('connect1', 'diffused'))
+   sl.zoom(0.75)
+   sl.save()
+
+Here, we plot the ``'diffused'`` variable, using a slice normal to the ``'x'`` direction, 
+through the mesh labelled by ``'connect1'``. By default, the slice goes through the
+center of the domain. We have also zoomed out a bit to get a better view of the 
+resulting structure. To instead plot the ``'convected'`` variable, using a slice normal
+to the ``'z'`` direction through the mesh labelled by ``'connect2'``, we do:
+
+.. python-script::
+
+   import yt
+   ds = yt.load("MOOSE_sample_data/out.e-s010")
+   sl = yt.SlicePlot(ds, 'z', ('connect2', 'convected'))
+   sl.zoom(0.75)
+   sl.save()
+
+These slices are made by sampling the finite element solution at the points corresponding
+to each pixel of the image. The ``'convected'`` and ``'diffused'`` variables are node-centered,
+so this interpolation is performed by converting the sample point to the reference coordinate
+system of the element and evaluating the appropriate shape functions.
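+
+For a first-order, 8-node hex element, these are the standard trilinear shape
+functions. As a rough illustration of the sampling step (a sketch only; yt's
+actual implementation lives in compiled Cython extensions):
+
+.. code-block:: python
+
+   import numpy as np
+
+   # reference-element vertex signs for an 8-node hex, in (r, s, t) coordinates
+   HEX8_SIGNS = np.array([[-1, -1, -1], [1, -1, -1], [1, 1, -1], [-1, 1, -1],
+                          [-1, -1, 1], [1, -1, 1], [1, 1, 1], [-1, 1, 1]])
+
+   def sample_hex8(node_values, r, s, t):
+       # each weight is 1/8 * (1 +/- r) * (1 +/- s) * (1 +/- t); the weights sum to one
+       weights = 0.125 * ((1 + HEX8_SIGNS[:, 0] * r) *
+                          (1 + HEX8_SIGNS[:, 1] * s) *
+                          (1 + HEX8_SIGNS[:, 2] * t))
+       return np.dot(weights, node_values)
+
+You can also plot element-centered fields: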
+
+.. python-script::
+
+   import yt
+   ds = yt.load('MOOSE_sample_data/out.e-s010')
+   sl = yt.SlicePlot(ds, 'y', ('connect1', 'conv_indicator'))
+   sl.zoom(0.75)
+   sl.save()
+
+We can also annotate the mesh lines, as follows:
+
+.. python-script::
+
+   import yt
+   ds = yt.load('MOOSE_sample_data/out.e-s010')
+   sl = yt.SlicePlot(ds, 'z', ('connect1', 'diffused'))
+   sl.annotate_mesh_lines(thresh=0.1)
+   sl.zoom(0.75)
+   sl.save()
+
+This annotation is performed by marking the pixels where the mapped coordinate is close
+to the element boundary. What counts as 'close' (in the mapped coordinate system) is 
+determined by the ``thresh`` parameter, which can be varied to make the lines thicker or
+thinner.
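+
+For example, passing a larger value of ``thresh`` should produce thicker lines
+(a sketch; the value here is arbitrary):
+
+.. python-script::
+
+   import yt
+   ds = yt.load('MOOSE_sample_data/out.e-s010')
+   sl = yt.SlicePlot(ds, 'z', ('connect1', 'diffused'))
+   sl.annotate_mesh_lines(thresh=0.2)  # larger than the value used above
+   sl.zoom(0.75)
+   sl.save()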
+
+Finally, slices can also be used to examine 2D unstructured mesh datasets, but the
+slices must be taken to be normal to the ``'z'`` axis, or you'll get an error. Here is
+an example using another MOOSE dataset:
+
+.. python-script::
+
+   import yt
+   ds = yt.load('MOOSE_sample_data/out.e')
+   sl = yt.SlicePlot(ds, 2, ('connect1', 'nodal_aux'))
+   sl.save()
+
+
 Plot Customization: Recentering, Resizing, Colormaps, and More
 --------------------------------------------------------------
 

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 doc/source/visualizing/unstructured_mesh_rendering.rst
--- a/doc/source/visualizing/unstructured_mesh_rendering.rst
+++ b/doc/source/visualizing/unstructured_mesh_rendering.rst
@@ -3,19 +3,16 @@
 Unstructured Mesh Rendering
 ===========================
 
+Installation
+^^^^^^^^^^^^
+
 Beginning with version 3.3, yt has the ability to volume render unstructured
-meshes from, for example, finite element calculations. In order to use this
-capability, a few additional dependencies are required beyond those you get
-when you run the install script. First, `embree <https://embree.github.io>`_
+mesh data, such as that created by finite element calculations. 
+In order to use this capability, a few additional dependencies are required 
+beyond those you get when you run the install script. First, `embree <https://embree.github.io>`_
 (a fast software ray-tracing library from Intel) must be installed, either
 by compiling from source or by using one of the pre-built binaries available
-at Embree's `downloads <https://embree.github.io/downloads.html>`_ page. Once
-Embree is installed, you must also create a symlink next to the library. For
-example, if the libraries were installed at /usr/local/lib/, you must do
-
-.. code-block:: bash
-
-    sudo ln -s /usr/local/lib/libembree.2.6.1.dylib /usr/local/lib/libembree.so
+at Embree's `downloads <https://embree.github.io/downloads.html>`_ page. 
 
 Second, the python bindings for embree (called 
 `pyembree <https://github.com/scopatz/pyembree>`_) must also be installed. To
@@ -25,23 +22,39 @@
 
     git clone https://github.com/scopatz/pyembree
 
-To install, navigate to the root directory and run the setup script:
+To install, navigate to the root directory and run the setup script.
+If Embree was installed to some location that is not in your path by default,
+you will need to pass in CFLAGS and LDFLAGS to the setup.py script. For example,
+the Mac OS X package installer puts the installation at /opt/local/ instead of 
+/usr/local. To account for this, you would do:
 
 .. code-block:: bash
 
-    python setup.py develop
+    CFLAGS='-I/opt/local/include' LDFLAGS='-L/opt/local/lib' python setup.py install
 
-If Embree was installed to some location that is not in your path by default,
-you will need to pass in CFLAGS and LDFLAGS to the setup.py script. For example,
-the Mac OS package installer puts the installation at /opt/local/ instead of 
-usr/local. To account for this, you would do:
+Once embree and pyembree are installed, you must rebuild yt from source in order to use
+the unstructured mesh rendering capability. Once again, if embree is installed in a 
+location that is not part of your default search path, you must tell yt where to find it.
+There are a number of ways to do this. One way is to again manually pass in the flags
+when running the setup script in the yt-hg directory:
 
 .. code-block:: bash
 
     CFLAGS='-I/opt/local/include' LDFLAGS='-L/opt/local/lib' python setup.py develop
 
-You must also use these flags when building any part of yt that links against
-pyembree.
+You can also set the EMBREE_DIR environment variable to '/opt/local', in which case
+you could just run 
+
+.. code-block:: bash
+   
+   python setup.py develop
+
+as usual. Finally, if you create a file called embree.cfg in the yt-hg directory with
+the location of the embree installation, the setup script will find this and use it, 
+provided EMBREE_DIR is not set. We recommend one of the latter two methods, especially
+if you plan on re-compiling the cython extensions regularly. Note that none of this is
+necessary if you installed embree into a location that is in your default path, such
+as /usr/local.
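+
+For example (assuming, as above, an embree installation under /opt/local):
+
+.. code-block:: bash
+
+   # option 1: set the environment variable
+   export EMBREE_DIR=/opt/local
+
+   # option 2: record the location in an embree.cfg file in the yt-hg directory
+   echo /opt/local > embree.cfg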
 
 Once the pre-requisites are installed, unstructured mesh data can be rendered
 much like any other dataset. In particular, a new type of 
@@ -55,120 +68,293 @@
 :class:`~yt.visualization.volume_rendering.render_source.RenderSource` is called,
 a set of rays are cast at the source. Each time a ray strikes the source mesh,
 the data is sampled at the intersection point and the resulting value gets 
-saved into an image.
+saved into an image. See below for examples.
 
-See below for examples. First, here is an example of rendering a hexahedral mesh.
+Examples
+^^^^^^^^
+
+First, here is an example of rendering an 8-node hexahedral MOOSE dataset.
 
 .. python-script::
 
    import yt
-   import pylab as plt
-   from yt.visualization.volume_rendering.render_source import MeshSource
-   from yt.visualization.volume_rendering.camera import Camera
-   from yt.utilities.exodusII_reader import get_data
+   from yt.visualization.volume_rendering.api import MeshSource, Camera
+   import yt.utilities.png_writer as pw
 
-   # load the data
-   coords, connectivity, data = get_data("MOOSE_sample_data/out.e-s010")
-   mesh_id = 0
-   field_name = ('gas', 'diffused')
-   ds = yt.load_unstructured_mesh(data[mesh_id], connectivity[mesh_id], coords[mesh_id])
+   ds = yt.load("MOOSE_sample_data/out.e-s010")
 
-   # create the RenderSource
-   ms = MeshSource(ds, field_name)
+   ms = MeshSource(ds, ('connect1', 'diffused'))
 
-   # set up camera
+   # set up the camera
    cam = Camera(ds)
-   camera_position = ds.arr([-3.0, 3.0, -3.0], 'code_length')
-   north_vector = ds.arr([0.0, 1.0, 0.0], 'dimensionless')
-   cam.resolution = (800, 800)
-   cam.set_position(camera_position, north_vector)
+   cam.focus = ds.arr([0.0, 0.0, 0.0], 'code_length')  # point we're looking at
 
-   # make the image
-   im = ms.render(cam)
+   cam_pos = ds.arr([-3.0, 3.0, -3.0], 'code_length')  # the camera location
+   north_vector = ds.arr([0.0, -1.0, 0.0], 'dimensionless')  # down is the new up
+   cam.set_position(cam_pos, north_vector)
 
-   # plot and save
-   plt.imshow(im, cmap='Eos A', origin='lower', vmin=0, vmax=2.0)
-   plt.gca().axes.get_xaxis().set_visible(False)
-   plt.gca().axes.get_yaxis().set_visible(False)
-   cb = plt.colorbar()
-   cb.set_label(field_name[1])
-   plt.savefig('hex_mesh_render.png')
+   im = ms.render(cam, cmap='Eos A', color_bounds=(0.0, 2.0))
+   pw.write_png(im, 'hex_mesh_render.png')
 
-Next, here is an example of rendering a dataset with tetrahedral mesh elements.
+You can also overplot the mesh boundaries:
 
 .. python-script::
 
    import yt
-   import pylab as plt
-   from yt.visualization.volume_rendering.render_source import MeshSource
-   from yt.visualization.volume_rendering.camera import Camera
-   from yt.utilities.exodusII_reader import get_data
+   from yt.visualization.volume_rendering.api import MeshSource, Camera
+   import yt.utilities.png_writer as pw
 
-   # load the data
+   ds = yt.load("MOOSE_sample_data/out.e-s010")
+
+   ms = MeshSource(ds, ('connect1', 'diffused'))
+
+   # set up the camera
+   cam = Camera(ds)
+   cam.focus = ds.arr([0.0, 0.0, 0.0], 'code_length')  # point we're looking at
+
+   cam_pos = ds.arr([-3.0, 3.0, -3.0], 'code_length')  # the camera location
+   north_vector = ds.arr([0.0, -1.0, 0.0], 'dimensionless')  # down is the new up
+   cam.set_position(cam_pos, north_vector)
+   cam.resolution = (800, 800)
+
+   ms.render(cam, cmap='Eos A', color_bounds=(0.0, 2.0))
+   im = ms.annotate_mesh_lines()
+   pw.write_png(im, 'hex_render_with_mesh.png')
+
+As with slices, you can visualize different meshes and different fields. For example,
+here is a script similar to the above that plots the "diffused" variable 
+using the mesh labelled by "connect2":
+
+.. python-script::
+
+   import yt
+   from yt.visualization.volume_rendering.api import MeshSource, Camera
+   import yt.utilities.png_writer as pw
+
+   ds = yt.load("MOOSE_sample_data/out.e-s010")
+
+   ms = MeshSource(ds, ('connect2', 'diffused'))
+
+   # set up the camera
+   cam = Camera(ds)
+   cam.focus = ds.arr([0.0, 0.0, 0.0], 'code_length')  # point we're looking at
+
+   cam_pos = ds.arr([-3.0, 3.0, -3.0], 'code_length')  # the camera location
+   north_vector = ds.arr([0.0, -1.0, 0.0], 'dimensionless')  # down is the new up
+   cam.set_position(cam_pos, north_vector)
+
+   im = ms.render(cam, cmap='Eos A', color_bounds=(0.0, 2.0))
+   pw.write_png(im, 'hex_mesh_render.png')
+
+Next, here is an example of rendering a dataset with tetrahedral mesh elements.
+Note that in this dataset, there are multiple "steps" per file, so we specify
+that we want to look at the last one.
+
+.. python-script::
+
+   import yt
+   from yt.visualization.volume_rendering.api import MeshSource, Camera
+   import yt.utilities.png_writer as pw
+
    filename = "MOOSE_sample_data/high_order_elems_tet4_refine_out.e"
-   coords, connectivity, data = get_data(filename)
-   mesh_id = 0
-   field_name = ('gas', 'u')
-   ds = yt.load_unstructured_mesh(data[mesh_id], connectivity[mesh_id], coords[mesh_id])
+   ds = yt.load(filename, step=-1)  # we look at the last time frame
 
-   # create the RenderSource
-   ms = MeshSource(ds, field_name)
+   ms = MeshSource(ds, ('connect1', 'u'))
 
-   # set up camera
+   # set up the camera
    cam = Camera(ds)
    camera_position = ds.arr([3.0, 3.0, 3.0], 'code_length')
    cam.set_width(ds.arr([2.0, 2.0, 2.0], 'code_length'))
    north_vector = ds.arr([0.0, 1.0, 0.0], 'dimensionless')
+   cam.set_position(camera_position, north_vector)
+
+   im = ms.render(cam, cmap='Eos A', color_bounds=(0.0, 1.0))
+   pw.write_png(im, 'tetra_render.png')
+
+Here is another example, this time plotting the temperature field from a 20-node hex 
+MOOSE dataset:
+
+.. python-script::
+
+   import yt
+   from yt.visualization.volume_rendering.api import MeshSource, Camera
+   import yt.utilities.png_writer as pw
+
+   ds = yt.load("MOOSE_sample_data/mps_out.e", step=-1)  # we load the last time frame
+
+   ms = MeshSource(ds, ('connect2', 'temp'))
+
+   # set up the camera
+   cam = Camera(ds)
+   camera_position = ds.arr([-1.0, 1.0, -0.5], 'code_length')
+   north_vector = ds.arr([0.0, 1.0, 1.0], 'dimensionless')
+   cam.width = ds.arr([0.04, 0.04, 0.04], 'code_length')
    cam.resolution = (800, 800)
    cam.set_position(camera_position, north_vector)
 
-   # make the image
-   im = ms.render(cam)
+   im = ms.render(cam, cmap='hot', color_bounds=(500.0, 1700.0))
+   im = ms.annotate_mesh_lines()
+   pw.write_png(im, 'hex20_render.png')
 
-   # plot and save
-   plt.imshow(im, cmap='Eos A', origin='lower', vmin=0.0, vmax=1.0)
-   plt.gca().axes.get_xaxis().set_visible(False)
-   plt.gca().axes.get_yaxis().set_visible(False)
-   cb = plt.colorbar()
-   cb.set_label(field_name[1])
-   plt.savefig('tet_mesh_render.png')
+As with other volume renderings in yt, you can swap out different lenses. Here is 
+an example that uses a "perspective" lens, for which the rays diverge from the 
+camera position according to some opening angle:
 
-Finally, here is a script that creates frames of a movie. It calls the rotate()
-method 300 times, saving a new image to the disk each time.
+.. python-script::
+
+   import yt
+   from yt.visualization.volume_rendering.api import MeshSource, Camera
+   import yt.utilities.png_writer as pw
+
+   ds = yt.load("MOOSE_sample_data/out.e-s010")
+
+   ms = MeshSource(ds, ('connect2', 'diffused'))
+
+   # set up the camera
+   cam = Camera(ds, lens_type='perspective')
+   cam.focus = ds.arr([0.0, 0.0, 0.0], 'code_length')  # point we're looking at
+
+   cam_pos = ds.arr([-4.5, 4.5, -4.5], 'code_length')  # the camera location
+   north_vector = ds.arr([0.0, -1.0, 0.0], 'dimensionless')  # down is the new up
+   cam.set_position(cam_pos, north_vector)
+
+   im = ms.render(cam, cmap='Eos A', color_bounds=(0.0, 2.0))
+   im = ms.annotate_mesh_lines()
+   pw.write_png(im, 'hex_mesh_render_perspective.png')
+
+You can also create scenes that have multiple meshes. The ray-tracing infrastructure
+will keep track of the depth information for each source separately, and composite
+the final image accordingly. In the next example, we show how to render a scene 
+with two meshes in it:
+
+.. python-script::
+
+    import yt
+    from yt.visualization.volume_rendering.api import MeshSource, Camera, Scene
+    import yt.utilities.png_writer as pw
+
+    ds = yt.load("MOOSE_sample_data/out.e-s010")
+
+    # this time we create an empty scene and add sources to it one-by-one
+    sc = Scene()
+
+    cam = Camera(ds)
+    cam.focus = ds.arr([0.0, 0.0, 0.0], 'code_length')
+    cam.set_position(ds.arr([-3.0, 3.0, -3.0], 'code_length'),
+                     ds.arr([0.0, 1.0, 0.0], 'dimensionless'))
+    cam.set_width(ds.arr([8.0, 8.0, 8.0], 'code_length'))
+    cam.resolution = (800, 800)
+
+    sc.camera = cam
+
+    # create two distinct MeshSources from 'connect1' and 'connect2'
+    ms1 = MeshSource(ds, ('connect1', 'diffused'))
+    ms2 = MeshSource(ds, ('connect2', 'diffused'))
+
+    sc.add_source(ms1)
+    sc.add_source(ms2)
+
+    im = sc.render()
+
+    pw.write_png(im, 'composite_render.png')
+
+
+Making Movies
+^^^^^^^^^^^^^
+
+Here are a couple of example scripts that show how to create image frames that 
+can later be stitched together into a movie. In the first example, we look at a 
+single dataset at a fixed time, but we move the camera around to get a different
+vantage point. We call the rotate() method once per frame, saving a new image to 
+disk each time.
 
 .. code-block:: python
 
    import yt
-   import pylab as plt
-   from yt.visualization.volume_rendering.render_source import MeshSource
-   from yt.visualization.volume_rendering.camera import Camera
-   from yt.utilities.exodusII_reader import get_data
+   from yt.visualization.volume_rendering.api import MeshSource, Camera
+   import numpy as np  # needed for np.pi in the rotation loop below
+   import yt.utilities.png_writer as pw
 
-   # load dataset
-   coords, connectivity, data = get_data("MOOSE_sample_data/out.e-s010")
-   mesh_id = 0
-   field_name = ('gas', 'diffused')
-   ds = yt.load_unstructured_mesh(data[mesh_id], connectivity[mesh_id], coords[mesh_id])
+   ds = yt.load("MOOSE_sample_data/out.e-s010")
 
-   # create the RenderSource
-   ms = MeshSource(ds, field_name)
+   ms = MeshSource(ds, ('connect1', 'diffused'))
 
-   # set up camera
+   # set up the camera
    cam = Camera(ds)
-   camera_position = ds.arr([-3.0, 3.0, -3.0], 'code_length')
-   north_vector = ds.arr([0.0, 1.0, 0.0], 'dimensionless')
-   cam.set_position(camera_position, north_vector)
+   cam.focus = ds.arr([0.0, 0.0, 0.0], 'code_length')  # point we're looking at
+
+   cam_pos = ds.arr([-3.0, 3.0, -3.0], 'code_length')  # the camera location
+   north_vector = ds.arr([0.0, -1.0, 0.0], 'dimensionless')  # down is the new up
+   cam.set_position(cam_pos, north_vector)
+   cam.resolution = (800, 800)
    cam.steady_north = True
 
    # make movie frames
    num_frames = 301
    for i in range(num_frames):
        cam.rotate(2.0*np.pi/num_frames)
-       im = ms.render(cam)
-       plt.imshow(im, cmap='Eos A', origin='lower',vmin=0.0, vmax=2.0)
-       plt.gca().axes.get_xaxis().set_visible(False)
-       plt.gca().axes.get_yaxis().set_visible(False)
-       cb = plt.colorbar()
-       cb.set_label('diffused')
-       plt.savefig('movie_frames/surface_render_%.4d.png' % i)
-       plt.clf()
+       im = ms.render(cam, cmap='Eos A', color_bounds=(0.0, 2.0))
+       pw.write_png(im, 'movie_frames/surface_render_%.4d.png' % i)
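+
+The saved frames can then be stitched together with an external tool. As an
+illustration, here is a sketch using ffmpeg (not part of yt; assumed to be
+installed separately):
+
+.. code-block:: bash
+
+   ffmpeg -framerate 30 -i movie_frames/surface_render_%04d.png movie.mp4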
+
+Finally, this example demonstrates how to loop over the time steps in a single
+file with a fixed camera position:
+
+.. code-block:: python
+
+    import yt
+    from yt.visualization.volume_rendering.api import MeshSource, Camera
+    import pylab as plt
+
+    NUM_STEPS = 127
+    CMAP = 'hot'
+    VMIN = 300.0
+    VMAX = 2000.0
+
+    for step in range(NUM_STEPS):
+
+        ds = yt.load("MOOSE_sample_data/mps_out.e", step=step)
+
+        time = ds._get_current_time()
+
+        # the field name is a tuple of strings. The first string
+        # specifies which mesh will be plotted, the second string
+        # specifies the name of the field.
+        field_name = ('connect2', 'temp')
+
+        # this initializes the render source
+        ms = MeshSource(ds, field_name)
+
+        # set up the camera here. these values were arrived at by
+        # calling pitch, yaw, and roll in the notebook until I
+        # got the angle I wanted.
+        cam = Camera(ds)
+        camera_position = ds.arr([0.1, 0.0, 0.1], 'code_length')
+        cam.focus = ds.domain_center
+        north_vector = ds.arr([0.3032476, 0.71782557, -0.62671153], 'dimensionless')
+        cam.width = ds.arr([0.04, 0.04, 0.04], 'code_length')
+        cam.resolution = (800, 800)
+        cam.set_position(camera_position, north_vector)
+
+        # actually make the image here
+        im = ms.render(cam, cmap=CMAP, color_bounds=(VMIN, VMAX))
+
+        # Plot the result using matplotlib and save.
+        # Note that we are setting the upper and lower
+        # bounds of the colorbar to be the same for all
+        # frames of the image.
+
+        # must clear the image between frames
+        plt.clf()
+        fig = plt.gcf()
+        ax = plt.gca()
+        ax.imshow(im, interpolation='nearest', origin='lower')
+
+        # Add the colorbar using a fake (not shown) image.
+        p = ax.imshow(ms.data, visible=False, cmap=CMAP, vmin=VMIN, vmax=VMAX)
+        cb = fig.colorbar(p)
+        cb.set_label(field_name[1])
+
+        ax.text(25, 750, 'time = %.2e' % time, color='k')
+        ax.axes.get_xaxis().set_visible(False)
+        ax.axes.get_yaxis().set_visible(False)
+
+        plt.savefig('movie_frames/test_%.3d' % step)

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 requirements.txt
--- a/requirements.txt
+++ b/requirements.txt
@@ -4,3 +4,4 @@
 h5py==2.5.0 
 nose==1.3.6 
 sympy==0.7.6 
+

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/data_objects/data_containers.py
--- a/yt/data_objects/data_containers.py
+++ b/yt/data_objects/data_containers.py
@@ -588,7 +588,7 @@
                         extra_attrs=extra_attrs)
 
         return filename
-        
+
     def to_glue(self, fields, label="yt", data_collection=None):
         """
         Takes specific *fields* in the container and exports them to

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/data_objects/unstructured_mesh.py
--- a/yt/data_objects/unstructured_mesh.py
+++ b/yt/data_objects/unstructured_mesh.py
@@ -59,6 +59,9 @@
         self._current_fluid_type = self.ds.default_fluid_type
 
     def _check_consistency(self):
+        if self.connectivity_indices.shape[1] != self._connectivity_length:
+            raise RuntimeError("Connectivity array does not match the expected element size.")
+
         for gi in range(self.connectivity_indices.shape[0]):
             ind = self.connectivity_indices[gi, :] - self._index_offset
             coords = self.connectivity_coords[ind, :]
@@ -136,7 +139,7 @@
         mask = self._get_selector_mask(selector)
         count = self.count(selector)
         if count == 0: return 0
-        dest[offset:offset+count] = source[mask,...]
+        dest[offset:offset+count] = source[mask, ...]
         return count
 
     def count(self, selector):
@@ -167,11 +170,12 @@
 
     def select_fcoords_vertex(self, dobj = None):
         mask = self._get_selector_mask(dobj.selector)
-        if mask is None: return np.empty((0,self._connectivity_length,3), dtype='float64')
+        if mask is None: return np.empty((0, self._connectivity_length, 3), dtype='float64')
         vertices = self.connectivity_coords[
                 self.connectivity_indices - 1]
         return vertices[mask, :, :]
 
+
 class SemiStructuredMesh(UnstructuredMesh):
     _connectivity_length = 8
     _type_name = 'semi_structured_mesh'

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/api.py
--- a/yt/frontends/api.py
+++ b/yt/frontends/api.py
@@ -24,6 +24,7 @@
     'chombo',
     'eagle',
     'enzo',
+    'exodus_ii',
     'fits',
     'flash',
     'gadget',

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/exodus_ii/__init__.py
--- /dev/null
+++ b/yt/frontends/exodus_ii/__init__.py
@@ -0,0 +1,14 @@
+"""
+API for yt.frontends.exodus_ii
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/exodus_ii/api.py
--- /dev/null
+++ b/yt/frontends/exodus_ii/api.py
@@ -0,0 +1,27 @@
+"""
+API for yt.frontends.exodus_ii
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+from .data_structures import \
+    ExodusIIUnstructuredMesh, \
+    ExodusIIUnstructuredIndex, \
+    ExodusIIDataset
+
+from .fields import \
+    ExodusIIFieldInfo
+
+from .io import \
+    IOHandlerExodusII
+
+from . import tests

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/exodus_ii/data_structures.py
--- /dev/null
+++ b/yt/frontends/exodus_ii/data_structures.py
@@ -0,0 +1,262 @@
+"""
+Exodus II data structures
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+import numpy as np
+
+from yt.geometry.unstructured_mesh_handler import \
+    UnstructuredIndex
+from yt.data_objects.unstructured_mesh import \
+    UnstructuredMesh
+from yt.data_objects.static_output import \
+    Dataset
+from .io import \
+    NetCDF4FileHandler
+from yt.utilities.logger import ytLogger as mylog
+from .fields import \
+    ExodusIIFieldInfo
+from .util import \
+    load_info_records, sanitize_string
+
+
+class ExodusIIUnstructuredMesh(UnstructuredMesh):
+    _index_offset = 1
+
+    def __init__(self, *args, **kwargs):
+        super(ExodusIIUnstructuredMesh, self).__init__(*args, **kwargs)
+
+
+class ExodusIIUnstructuredIndex(UnstructuredIndex):
+    def __init__(self, ds, dataset_type = 'exodus_ii'):
+        super(ExodusIIUnstructuredIndex, self).__init__(ds, dataset_type)
+
+    def _initialize_mesh(self):
+        coords = self.ds._read_coordinates()
+        self.meshes = [ExodusIIUnstructuredMesh(
+            mesh_id, self.index_filename, conn_ind, coords, self)
+                       for mesh_id, conn_ind in
+                       enumerate(self.ds._read_connectivity())]
+
+    def _detect_output_fields(self):
+        elem_names = self.dataset.parameters['elem_names']
+        node_names = self.dataset.parameters['nod_names']
+        fnames = elem_names + node_names
+        self.field_list = []
+        for i in range(1, len(self.meshes)+1):
+            self.field_list += [('connect%d' % i, fname) for fname in fnames]
+
+
+class ExodusIIDataset(Dataset):
+    _index_class = ExodusIIUnstructuredIndex
+    _field_info_class = ExodusIIFieldInfo
+
+    def __init__(self,
+                 filename,
+                 step=0,
+                 dataset_type='exodus_ii',
+                 storage_filename=None,
+                 units_override=None):
+
+        self.parameter_filename = filename
+        self.fluid_types += self._get_fluid_types()
+        self.step = step
+        super(ExodusIIDataset, self).__init__(filename, dataset_type,
+                                              units_override=units_override)
+        self.index_filename = filename
+        self.storage_filename = storage_filename
+
+    def _set_code_unit_attributes(self):
+        # This is where quantities are created that represent the various
+        # on-disk units.  These are the currently available quantities which
+        # should be set, along with examples of how to set them to standard
+        # values.
+        #
+        self.length_unit = self.quan(1.0, "cm")
+        self.mass_unit = self.quan(1.0, "g")
+        self.time_unit = self.quan(1.0, "s")
+        #
+        # These can also be set:
+        # self.velocity_unit = self.quan(1.0, "cm/s")
+        # self.magnetic_unit = self.quan(1.0, "gauss")
+
+    def _parse_parameter_file(self):
+        self._handle = NetCDF4FileHandler(self.parameter_filename)
+        self._vars = self._handle.dataset.variables
+        self._read_glo_var()
+        self.dimensionality = self._vars['coor_names'].shape[0]
+        self.parameters['info_records'] = self._load_info_records()
+        self.unique_identifier = self._get_unique_identifier()
+        self.num_steps = len(self._vars['time_whole'])
+        self.current_time = self._get_current_time()
+        self.parameters['num_meshes'] = self._vars['eb_status'].shape[0]
+        self.parameters['elem_names'] = self._get_elem_names()
+        self.parameters['nod_names'] = self._get_nod_names()
+        self.domain_left_edge = self._load_domain_edge(0)
+        self.domain_right_edge = self._load_domain_edge(1)
+
+        # set up pseudo-3D for low-dimensional datasets here
+        if self.dimensionality == 2:
+            self.domain_left_edge = np.append(self.domain_left_edge, 0.0)
+            self.domain_right_edge = np.append(self.domain_right_edge, 1.0)
+
+        self.periodicity = (False, False, False)
+
+        # These attributes don't really make sense for unstructured
+        # mesh data, but yt warns if they are not present, so we set
+        # them to dummy values here.
+        self.domain_dimensions = np.ones(3, "int32")
+        self.cosmological_simulation = 0
+        self.current_redshift = 0
+        self.omega_lambda = 0
+        self.omega_matter = 0
+        self.hubble_constant = 0
+        self.refine_by = 0
+
+    def _get_fluid_types(self):
+        handle = NetCDF4FileHandler(self.parameter_filename).dataset
+        fluid_types = ()
+        i = 1
+        while True:
+            ftype = 'connect%d' % i
+            if ftype in handle.variables:
+                fluid_types += (ftype,)
+                i += 1
+            else:
+                break
+        return fluid_types
+
+    def _read_glo_var(self):
+        """
+        Adds each global variable to the dict of parameters
+
+        """
+        names = self._get_glo_names()
+        if not names:
+            return
+        values = self._vars['vals_glo_var'][:].transpose()
+        for name, value in zip(names, values):
+            self.parameters[name] = value
+
+    def _load_info_records(self):
+        """
+        Returns parsed version of the info_records.
+        """
+        try:
+            return load_info_records(self._vars['info_records'])
+        except (KeyError, TypeError):
+            mylog.warning("No info_records found")
+            return []
+
+    def _get_unique_identifier(self):
+        return self.parameter_filename.__hash__()
+
+    def _get_current_time(self):
+        try:
+            return self._vars['time_whole'][self.step]
+        except IndexError:
+            raise RuntimeError("Invalid step number, max is %d" \
+                               % (self.num_steps - 1))            
+        except (KeyError, TypeError):
+            return 0.0
+
+    def _get_glo_names(self):
+        """
+
+        Returns the names of the global vars, if available.
+
+        """
+
+        if "name_glo_var" not in self._vars:
+            mylog.warning("name_glo_var not found")
+            return []
+        else:
+            return [sanitize_string(v.tostring()) for v in
+                    self._vars["name_glo_var"]]
+            
+    def _get_elem_names(self):
+        """
+
+        Returns the names of the element vars, if available.
+
+        """
+
+        if "name_elem_var" not in self._vars:
+            mylog.warning("name_elem_var not found")
+            return []
+        else:
+            return [sanitize_string(v.tostring()) for v in
+                    self._vars["name_elem_var"]]
+
+    def _get_nod_names(self):
+        """
+
+        Returns the names of the node vars, if available
+
+        """
+
+        if "name_nod_var" not in self._vars:
+            mylog.warning("name_nod_var not found")
+            return []
+        else:
+            return [sanitize_string(v.tostring()) for v in
+                    self._vars["name_nod_var"]]
+
+    def _read_coordinates(self):
+        """
+
+        Loads the coordinates for the mesh
+
+        """
+        
+        coord_axes = 'xyz'[:self.dimensionality]
+
+        mylog.info("Loading coordinates")
+        if "coord" not in self._vars:
+            return np.array([self._vars["coord%s" % ax][:]
+                             for ax in coord_axes]).transpose().copy()
+        else:
+            return np.array([coord for coord in
+                             self._vars["coord"][:]]).transpose().copy()
+
+    def _read_connectivity(self):
+        """
+        Loads the connectivity data for the mesh
+        """
+        mylog.info("Loading connectivity")
+        connectivity = []
+        for i in range(self.parameters['num_meshes']):
+            connectivity.append(self._vars["connect%d" % (i+1)][:].astype("i8"))
+        return connectivity
+
+    def _load_domain_edge(self, domain_idx):
+        """
+        Loads the boundaries for the domain edge
+
+        Parameters:
+        - domain_idx: 0 corresponds to the left edge, 1 corresponds to the right edge
+        """
+        if domain_idx == 0:
+            return self._read_coordinates().min(axis=0)
+        if domain_idx == 1:
+            return self._read_coordinates().max(axis=0)
+
+    @classmethod
+    def _is_valid(self, *args, **kwargs):
+        try:
+            from netCDF4 import Dataset
+            filename = args[0]
+            with Dataset(filename) as f:
+                f.variables['connect1']
+            return True
+        except Exception:
+            return False

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/exodus_ii/definitions.py
--- /dev/null
+++ b/yt/frontends/exodus_ii/definitions.py
@@ -0,0 +1,1 @@
+# This file is often empty.  It can hold definitions related to a frontend.

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/exodus_ii/fields.py
--- /dev/null
+++ b/yt/frontends/exodus_ii/fields.py
@@ -0,0 +1,45 @@
+"""
+ExodusII-specific fields
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+from yt.fields.field_info_container import \
+    FieldInfoContainer
+
+# We need to specify which fields we might have in our dataset.  The field info
+# container subclass here will define which fields it knows about.  There are
+# optionally methods on it that get called which can be subclassed.
+
+class ExodusIIFieldInfo(FieldInfoContainer):
+    known_other_fields = (
+        # Each entry here is of the form
+        # ( "name", ("units", ["fields", "to", "alias"], # "display_name")),
+    )
+
+    known_particle_fields = (
+        # Identical form to above
+        # ( "name", ("units", ["fields", "to", "alias"], # "display_name")),
+    )
+
+    def __init__(self, ds, field_list):
+        super(ExodusIIFieldInfo, self).__init__(ds, field_list)
+        # If you want, you can check self.field_list
+
+    def setup_fluid_fields(self):
+        # Here we do anything that might need info about the dataset.
+        # You can use self.alias, self.add_output_field and self.add_field .
+        pass
+
+    def setup_particle_fields(self, ptype):
+        # This will get called for every particle type.
+        pass

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/exodus_ii/io.py
--- /dev/null
+++ b/yt/frontends/exodus_ii/io.py
@@ -0,0 +1,84 @@
+"""
+ExodusII-specific IO functions
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+import numpy as np
+from yt.utilities.io_handler import \
+    BaseIOHandler
+from yt.utilities.file_handler import \
+    NetCDF4FileHandler
+
+
+class IOHandlerExodusII(BaseIOHandler):
+    _particle_reader = False
+    _dataset_type = "exodus_ii"
+    _INDEX_OFFSET = 1
+
+    def __init__(self, ds):
+        self.filename = ds.index_filename
+        exodus_ii_handler = NetCDF4FileHandler(self.filename)
+        self.handler = exodus_ii_handler.dataset
+        super(IOHandlerExodusII, self).__init__(ds)
+        self.node_fields = ds._get_nod_names()
+        self.elem_fields = ds._get_elem_names()
+
+    def _read_particle_coords(self, chunks, ptf):
+        pass
+
+    def _read_particle_fields(self, chunks, ptf, selector):
+        pass
+
+    def _read_fluid_selection(self, chunks, selector, fields, size):
+        # This needs to allocate a set of arrays inside a dictionary, where the
+        # keys are the (ftype, fname) tuples and the values are arrays that
+        # have been masked using whatever selector method is appropriate.  The
+        # dict gets returned at the end and it should be flat, with selected
+        # data.  Note that if you're reading grid data, you might need to
+        # special-case a grid selector object.
+        chunks = list(chunks)
+        rv = {}
+        for field in fields:
+            ftype, fname = field
+            ci = self.handler.variables[ftype][:] - self._INDEX_OFFSET
+            num_elem = ci.shape[0]
+            if fname in self.node_fields:
+                nodes_per_element = ci.shape[1]
+                rv[field] = np.empty((num_elem, nodes_per_element), dtype="float64")
+            elif fname in self.elem_fields:
+                rv[field] = np.empty(num_elem, dtype="float64")
+        for field in fields:
+            ind = 0
+            ftype, fname = field
+            mesh_id = int(ftype[-1])
+            chunk = chunks[mesh_id - 1]
+            ci = self.handler.variables[ftype][:] - self._INDEX_OFFSET
+            if fname in self.node_fields:
+                field_ind = self.node_fields.index(fname)
+                fdata = self.handler.variables['vals_nod_var%d' % (field_ind + 1)]
+                data = fdata[self.ds.step][ci]
+                for g in chunk.objs:
+                    ind += g.select(selector, data, rv[field], ind)  # caches
+            if fname in self.elem_fields:
+                field_ind = self.elem_fields.index(fname)
+                fdata = self.handler.variables['vals_elem_var%deb%s' %
+                                               (field_ind + 1, mesh_id)][:]
+                data = fdata[self.ds.step, :]
+                for g in chunk.objs:
+                    ind += g.select(selector, data, rv[field], ind)  # caches
+        return rv
+
+    def _read_chunk_data(self, chunk, fields):
+        # This reads the data from a single chunk, and is only used for
+        # caching.
+        pass

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/exodus_ii/setup.py
--- /dev/null
+++ b/yt/frontends/exodus_ii/setup.py
@@ -0,0 +1,9 @@
+#!/usr/bin/env python
+
+
+def configuration(parent_package='', top_path=None):
+    from numpy.distutils.misc_util import Configuration
+    config = Configuration('exodus_ii', parent_package, top_path)
+    config.make_config_py()  # installs __config__.py
+    #config.make_svn_version_py()
+    return config

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/exodus_ii/tests/test_outputs.py
--- /dev/null
+++ b/yt/frontends/exodus_ii/tests/test_outputs.py
@@ -0,0 +1,52 @@
+"""
+Exodus II frontend tests
+
+
+
+"""
+
+#-----------------------------------------------------------------------------
+# Copyright (c) 2013, yt Development Team.
+#
+# Distributed under the terms of the Modified BSD License.
+#
+# The full license is in the file COPYING.txt, distributed with this software.
+#-----------------------------------------------------------------------------
+
+from yt.testing import \
+    assert_equal, \
+    assert_array_equal, \
+    requires_file
+from yt.utilities.answer_testing.framework import \
+    data_dir_load
+
+out = "ExodusII/out.e"
+
+
+@requires_file(out)
+def test_out():
+    ds = data_dir_load(out)
+    yield assert_equal, str(ds), "out.e"
+    yield assert_equal, ds.dimensionality, 3
+    yield assert_equal, ds.unique_identifier, 5081193338833632556
+    yield assert_equal, ds.current_time, 0.0
+    yield assert_array_equal, ds.parameters['nod_names'], ['convected', 'diffused']
+    yield assert_equal, ds.parameters['num_meshes'], 2
+
+out_s002 = "ExodusII/out.e-s002"
+
+
+@requires_file(out_s002)
+def test_out002():
+    ds = data_dir_load(out_s002)
+    yield assert_equal, str(ds), "out.e-s002"
+    yield assert_equal, ds.dimensionality, 3
+    yield assert_equal, ds.current_time, 2.0
+
+gold = "ExodusII/gold.e"
+
+
+@requires_file(gold)
+def test_gold():
+    ds = data_dir_load(gold)
+    yield assert_equal, str(ds), "gold.e"

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/exodus_ii/util.py
--- /dev/null
+++ b/yt/frontends/exodus_ii/util.py
@@ -0,0 +1,53 @@
+import string
+from itertools import takewhile
+from collections import OrderedDict
+import re
+
+def sanitize_string(s):
+    # keep only the leading run of printable characters, dropping any binary padding
+    return "".join(_ for _ in takewhile(lambda a: a in string.printable, s))
+
+def load_info_records(info_records):
+    info_records_parsed = [sanitize_string(line_chars) for line_chars in info_records]
+    return group_by_sections(info_records_parsed)
+
+def group_by_sections(info_records):
+    # 1. Split by top groupings
+    top_levels = get_top_levels(info_records)
+    # 2. Determine if in section by index number
+    grouped = OrderedDict()
+    for tidx, top_level in enumerate(top_levels):
+        grouped[top_level[1]] = []
+
+        try:
+            next_idx = top_levels[tidx + 1][0]
+        except IndexError:
+            next_idx = len(info_records) - 1
+
+        for idx in range(top_level[0], next_idx):       
+            if idx == top_level[0]:
+                continue
+
+            grouped[top_level[1]].append(info_records[idx])
+
+    
+    if 'Version Info' in grouped.keys():
+        version_info = OrderedDict()
+        for line in grouped['Version Info']:
+            split_line = line.split(":")
+            key = split_line[0]
+            val = ":".join(split_line[1:]).lstrip().rstrip()
+            if key != '':
+                version_info[key] = val
+        grouped['Version Info'] = version_info
+    
+    return grouped
+
+def get_top_levels(info_records):
+    top_levels = []
+    for idx, line in enumerate(info_records):
+        pattern = re.compile(r"###[a-zA-Z\s]+")
+        if pattern.match(line):
+            clean_line = re.sub(r'[^\w\s]', '', line).lstrip().rstrip()
+            top_levels.append([idx, clean_line])
+    
+    return top_levels

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/setup.py
--- a/yt/frontends/setup.py
+++ b/yt/frontends/setup.py
@@ -11,6 +11,7 @@
     config.add_subpackage("athena")
     config.add_subpackage("boxlib")
     config.add_subpackage("chombo")
+    config.add_subpackage("exodus_ii")
     config.add_subpackage("eagle")
     config.add_subpackage("enzo")
     config.add_subpackage("fits")
@@ -37,6 +38,7 @@
     config.add_subpackage("chombo/tests")
     config.add_subpackage("eagle/tests")
     config.add_subpackage("enzo/tests")
+    config.add_subpackage("exodus_ii/tests")
     config.add_subpackage("fits/tests")
     config.add_subpackage("flash/tests")
     config.add_subpackage("gadget/tests")
@@ -47,6 +49,7 @@
     config.add_subpackage("ramses/tests")
     config.add_subpackage("rockstar/tests")
     config.add_subpackage("stream/tests")
+    config.add_subpackage("stream/sample_data")
     config.add_subpackage("tipsy/tests")
     config.add_subpackage("ytdata/tests")
     return config

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/stream/api.py
--- a/yt/frontends/stream/api.py
+++ b/yt/frontends/stream/api.py
@@ -34,3 +34,5 @@
       IOHandlerStream
 
 from . import tests
+
+from . import sample_data

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/stream/data_structures.py
--- a/yt/frontends/stream/data_structures.py
+++ b/yt/frontends/stream/data_structures.py
@@ -139,7 +139,7 @@
         self.io = io
         self.particle_types = particle_types
         self.periodicity = periodicity
-            
+
     def get_fields(self):
         return self.fields.all_fields
 
@@ -149,7 +149,7 @@
             return self.particle_types[field]
         else :
             return False
-        
+
 class StreamHierarchy(GridIndex):
 
     grid = StreamGrid
@@ -1517,9 +1517,9 @@
 
     field_units, data = unitify_data(data)
     sfh = StreamDictFieldHandler()
-    
+
     particle_types = set_particle_types(data)
-    
+
     sfh.update({0:data})
     grid_left_edges = domain_left_edge
     grid_right_edges = domain_right_edge
@@ -1603,12 +1603,13 @@
     _field_info_class = StreamFieldInfo
     _dataset_type = "stream_unstructured"
 
-def load_unstructured_mesh(data, connectivity, coordinates,
-                         length_unit = None, bbox=None, sim_time=0.0,
-                         mass_unit = None, time_unit = None,
-                         velocity_unit = None, magnetic_unit = None,
-                         periodicity=(False, False, False),
-                         geometry = "cartesian"):
+
+def load_unstructured_mesh(connectivity, coordinates, node_data=None,
+                           elem_data=None, length_unit=None, bbox=None,
+                           sim_time=0.0, mass_unit=None, time_unit=None,
+                           velocity_unit=None, magnetic_unit=None,
+                           periodicity=(False, False, False),
+                           geometry = "cartesian"):
     r"""Load an unstructured mesh of data into yt as a
     :class:`~yt.frontends.stream.data_structures.StreamHandler`.
 
@@ -1622,10 +1623,6 @@
 
     Parameters
     ----------
-    data : dict or list of dicts
-        This is a list of dicts of numpy arrays, where each element in the list
-        is a different mesh, and where the keys of dicts are the field names.
-        If a dict is supplied, this will be assumed to be the only mesh.
     connectivity : list of array_like or array_like
         This is the connectivity array for the meshes; this should either be a
         list where each element in the list is a numpy array or a single numpy
@@ -1635,6 +1632,18 @@
     coordinates : array_like
         This should be of size (L,3) where L is the number of vertices
         indicated in the connectivity matrix.
+    node_data : dict or list of dicts
+        This is a list of dicts of numpy arrays, where each element in the list
+        is a different mesh, and where the keys of dicts are the field names.
+        If a dict is supplied, this will be assumed to be the only mesh. These
+        data fields are assumed to be node-centered, i.e. there must be one
+        value for every node in the mesh.
+    elem_data : dict or list of dicts
+        This is a list of dicts of numpy arrays, where each element in the list
+        is a different mesh, and where the keys of dicts are the field names.
+        If a dict is supplied, this will be assumed to be the only mesh. These
+        data fields are assumed to be element-centered, i.e. there must be only
+        one value for every element in the mesh.
     bbox : array_like (xdim:zdim, LE:RE), optional
         Size of computational domain in units of the length unit.
     sim_time : float, optional
@@ -1662,8 +1671,33 @@
 
     domain_dimensions = np.ones(3, "int32") * 2
     nprocs = 1
+
+    if elem_data is None and node_data is None:
+        raise RuntimeError("No data supplied in load_unstructured_mesh.")
+
+    if isinstance(connectivity, list):
+        num_meshes = len(connectivity)
+    else:
+        num_meshes = 1
+    connectivity = ensure_list(connectivity)
+
+    if elem_data is None:
+        elem_data = [{} for i in range(num_meshes)]
+    elem_data = ensure_list(elem_data)
+
+    if node_data is None:
+        node_data = [{} for i in range(num_meshes)]
+    node_data = ensure_list(node_data)
+
+    data = [{} for i in range(num_meshes)]
+    for elem_dict, data_dict in zip(elem_data, data):
+        for field, values in elem_dict.items():
+            data_dict[field] = values
+    for node_dict, data_dict in zip(node_data, data):
+        for field, values in node_dict.items():
+            data_dict[field] = values
     data = ensure_list(data)
-    connectivity = ensure_list(connectivity)
+
     if bbox is None:
         bbox = np.array([[coordinates[:,i].min() - 0.1 * abs(coordinates[:,i].min()),
                           coordinates[:,i].max() + 0.1 * abs(coordinates[:,i].max())]
@@ -1733,5 +1767,12 @@
 
     sds = StreamUnstructuredMeshDataset(handler, geometry = geometry)
 
+    fluid_types = ()
+    for i in range(1, num_meshes + 1):
+        fluid_types += ('connect%d' % i,)
+    sds.fluid_types = fluid_types
+
+    sds._node_fields = node_data[0].keys()
+    sds._elem_fields = elem_data[0].keys()
+
     return sds
-

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/frontends/stream/io.py
--- a/yt/frontends/stream/io.py
+++ b/yt/frontends/stream/io.py
@@ -255,7 +255,6 @@
 
 class IOHandlerStreamUnstructured(BaseIOHandler):
     _dataset_type = "stream_unstructured"
-    _node_types = ("diffused", "convected", "u", "temp")
 
     def __init__(self, ds):
         self.fields = ds.stream_handler.fields
@@ -267,9 +266,8 @@
         mesh_id = chunk.objs[0].mesh_id
         rv = {}
         for field in fields:
-            field_name = field[1]
-            nodes_per_element = self.fields[mesh_id][field].shape[1]
-            if field_name in self._node_types:
+            if field in self.ds._node_fields:
+                nodes_per_element = self.fields[mesh_id][field].shape[1]
                 rv[field] = np.empty((size, nodes_per_element), dtype="float64")
             else:
                 rv[field] = np.empty(size, dtype="float64")

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/geometry/coordinates/cartesian_coordinates.py
--- a/yt/geometry/coordinates/cartesian_coordinates.py
+++ b/yt/geometry/coordinates/cartesian_coordinates.py
@@ -21,6 +21,10 @@
     _get_vert_fields, \
     cartesian_to_cylindrical, \
     cylindrical_to_cartesian
+from yt.funcs import mylog
+from yt.utilities.lib.pixelization_routines import \
+    pixelize_element_mesh
+from yt.data_objects.unstructured_mesh import SemiStructuredMesh
 import yt.visualization._MPL as _MPL
 
 
@@ -59,7 +63,54 @@
 
     def pixelize(self, dimension, data_source, field, bounds, size,
                  antialias = True, periodic = True):
-        if dimension < 3:
+        index = data_source.ds.index
+        if (hasattr(index, 'meshes') and
+           not isinstance(index.meshes[0], SemiStructuredMesh)):
+            ftype, fname = field
+            mesh_id = int(ftype[-1]) - 1
+            mesh = index.meshes[mesh_id]
+            coords = mesh.connectivity_coords
+            indices = mesh.connectivity_indices
+            offset = mesh._index_offset
+            ad = data_source.ds.all_data()
+            field_data = ad[field]
+            buff_size = size[0:dimension] + (1,) + size[dimension:]
+
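+            # insert the slice coordinate twice along the normal axis so
+            # the 2D bounds become a degenerate 3D box for the sampler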
+            c = np.float64(data_source.center[dimension].d)
+            bounds.insert(2*dimension, c)
+            bounds.insert(2*dimension, c)
+            bounds = np.reshape(bounds, (3, 2))
+
+            # if this is an element field, promote to 2D here
+            if len(field_data.shape) == 1:
+                field_data = np.expand_dims(field_data, 1)
+            # if this is a higher-order element, we demote to 1st order
+            # here, for now.
+            elif field_data.shape[1] == 27 or field_data.shape[1] == 20:
+                # hexahedral
+                mylog.warning("High order elements not yet supported, " +
+                              "dropping to 1st order.")
+                field_data = field_data[:, 0:8]
+                indices = indices[:, 0:8]
+            elif field_data.shape[1] == 10:
+                # tetrahedral
+                mylog.warning("High order elements not yet supported, " +
+                              "dropping to 1st order.")
+                field_data = field_data[:,0:4]
+                indices = indices[:, 0:4]
+
+            img = pixelize_element_mesh(coords,
+                                        indices,
+                                        buff_size, field_data, bounds,
+                                        index_offset=offset)
+
+            # re-order the array and squeeze out the dummy dim
+            ax = data_source.axis
+            xax = self.x_axis[ax]
+            yax = self.y_axis[ax]
+            return np.squeeze(np.transpose(img, (yax, xax, ax)))
+
+        elif dimension < 3:
             return self._ortho_pixelize(data_source, field, bounds, size,
                                         antialias, dimension, periodic)
         else:

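Since the new branch of ``pixelize`` dispatches on the ``connect%d`` field
type, slices through unstructured meshes work through the usual plotting
machinery; a hedged sketch using the test helper added to ``yt/testing.py``
below:

.. code-block:: python

   import yt
   from yt.testing import fake_hexahedral_ds

   ds = fake_hexahedral_ds()
   # slicing along z exercises the unstructured-mesh branch of pixelize
   sl = yt.SlicePlot(ds, 'z', ('connect1', 'test'))
   sl.save()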
diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/geometry/geometry_handler.py
--- a/yt/geometry/geometry_handler.py
+++ b/yt/geometry/geometry_handler.py
@@ -409,6 +409,7 @@
             ind += c.shape[0]
         return ci
 
+
 class ChunkDataCache(object):
     def __init__(self, base_iter, preload_fields, geometry_handler,
                  max_length = 256):

diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/geometry/selection_routines.pyx
--- a/yt/geometry/selection_routines.pyx
+++ b/yt/geometry/selection_routines.pyx
@@ -397,6 +397,7 @@
         cdef np.ndarray[np.uint8_t, ndim=1] mask
         cdef int i, j, k, selected
         cdef int npoints, nv = mesh._connectivity_length
+        cdef int ndim = mesh.connectivity_coords.shape[1]
         cdef int total = 0
         cdef int offset = mesh._index_offset
         coords = _ensure_code(mesh.connectivity_coords)
@@ -405,11 +406,11 @@
         mask = np.zeros(npoints, dtype='uint8')
         for i in range(npoints):
             selected = 0
-            for k in range(3):
+            for k in range(ndim):
                 le[k] = 1e60
                 re[k] = -1e60
             for j in range(nv):
-                for k in range(3):
+                for k in range(ndim):
                     pos = coords[indices[i, j] - offset, k]
                     le[k] = fmin(pos, le[k])
                     re[k] = fmax(pos, re[k])

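The generalized loop above just builds an axis-aligned bounding box per
element in however many dimensions the mesh has; an equivalent vectorized
sketch (function name hypothetical):

.. code-block:: python

   import numpy as np

   def element_bounding_boxes(coords, indices, offset=0):
       # coords: (num_nodes, ndim); indices: (num_elements, nodes_per_elem)
       verts = coords[indices - offset]   # (num_elements, nodes_per_elem, ndim)
       le = verts.min(axis=1)             # per-element left edges
       re = verts.max(axis=1)             # per-element right edges
       return le, re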
diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/testing.py
--- a/yt/testing.py
+++ b/yt/testing.py
@@ -266,6 +266,50 @@
     return ds
 
 
+def fake_tetrahedral_ds():
+    from yt.frontends.stream.api import load_unstructured_mesh
+    from yt.frontends.stream.sample_data.tetrahedral_mesh import \
+        _connectivity, _coordinates
+
+    # the distance from the origin
+    node_data = {}
+    dist = np.sum(_coordinates**2, 1)
+    node_data[('connect1', 'test')] = dist[_connectivity]
+
+    # each element gets a random number
+    elem_data = {}
+    elem_data[('connect1', 'elem')] = np.random.rand(_connectivity.shape[0])
+
+    ds = load_unstructured_mesh(_connectivity,
+                                _coordinates,
+                                node_data=node_data,
+                                elem_data=elem_data)
+    return ds
+
+
+def fake_hexahedral_ds():
+    from yt.frontends.stream.api import load_unstructured_mesh
+    from yt.frontends.stream.sample_data.hexahedral_mesh import \
+        _connectivity, _coordinates
+
+    _connectivity -= 1  # this mesh has an offset of 1
+
+    # the distance from the origin
+    node_data = {}
+    dist = np.sum(_coordinates**2, 1)
+    node_data[('connect1', 'test')] = dist[_connectivity]
+
+    # each element gets a random number
+    elem_data = {}
+    elem_data[('connect1', 'elem')] = np.random.rand(_connectivity.shape[0])
+
+    ds = load_unstructured_mesh(_connectivity,
+                                _coordinates,
+                                node_data=node_data,
+                                elem_data=elem_data)
+    return ds
+
+
 def expand_keywords(keywords, full=False):
     """
     expand_keywords is a means for testing all possible keyword

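As a quick check, the new ``fake_*`` helpers can be used to poke at both
field flavors directly (a hedged usage sketch):

.. code-block:: python

   from yt.testing import fake_tetrahedral_ds

   ds = fake_tetrahedral_ds()
   ad = ds.all_data()
   print(ad['connect1', 'test'].shape)   # node-centered: (num_elem, 4)
   print(ad['connect1', 'elem'].shape)   # element-centered: (num_elem,)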
diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/utilities/exodusII_reader.py
--- a/yt/utilities/exodusII_reader.py
+++ b/yt/utilities/exodusII_reader.py
@@ -20,8 +20,8 @@
     # Is this correct?
     etypes = fvars["eb_status"][:]
     nelem = etypes.shape[0]
-#    varnames = [sanitize_string(v.tostring()) for v in
-#                fvars["name_elem_var"][:]]
+    varnames = [sanitize_string(v.tostring()) for v in
+                fvars["name_elem_var"][:]]
     nodnames = [sanitize_string(v.tostring()) for v in
                 fvars["name_nod_var"][:]]
     coord = np.array([fvars["coord%s" % ax][:]
@@ -34,9 +34,9 @@
         ci = connects[-1]
         coords.append(coord)  # Same for all
         vals = {}
-#        for j, v in enumerate(varnames):
-#            values = fvars["vals_elem_var%seb%s" % (j+1, i+1)][:]
-#            vals['gas', v] = values.astype("f8")[-1, :]
+        for j, v in enumerate(varnames):
+            values = fvars["vals_elem_var%seb%s" % (j+1, i+1)][:]
+            vals['gas', v] = values.astype("f8")[-1, :]
         for j, v in enumerate(nodnames):
             # We want just for this set of nodes all the node variables
             # Use (ci - 1) to get these values

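The ``vals_elem_var%seb%s`` pattern read above is the standard Exodus II
netCDF naming, 1-indexed by variable number and element block; a hedged
sketch of inspecting a file directly (filename hypothetical):

.. code-block:: python

   from netCDF4 import Dataset

   f = Dataset("out.e")   # hypothetical Exodus II file
   # names of the element-centered variables
   print(f.variables["name_elem_var"][:])
   # values of element variable 1 on element block 1, all time steps
   print(f.variables["vals_elem_var1eb1"][:].shape)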
diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/utilities/file_handler.py
--- a/yt/utilities/file_handler.py
+++ b/yt/utilities/file_handler.py
@@ -1,5 +1,5 @@
 """
-A wrapper class for h5py file objects.
+Wrapper classes for h5py and netCDF4 file objects.
 
 
 
@@ -74,3 +74,9 @@
 
     def close(self):
         pass
+
+class NetCDF4FileHandler(object):
+    def __init__(self, filename):
+        from netCDF4 import Dataset
+        ds = Dataset(filename)
+        self.dataset = ds

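A hedged usage note for the new handler (attribute layout as defined above):

.. code-block:: python

   from yt.utilities.file_handler import NetCDF4FileHandler

   handler = NetCDF4FileHandler("out.e")   # hypothetical Exodus II file
   print(handler.dataset.variables.keys())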
diff -r e8f6316dad05ec9850bc3e994e163bd484cadb69 -r a065f0f3bd81c1906c2d5cc389fbfa07878ca4b3 yt/utilities/lib/element_mappings.pxd
--- a/yt/utilities/lib/element_mappings.pxd
+++ b/yt/utilities/lib/element_mappings.pxd
@@ -10,6 +10,7 @@
     # to get counted as "inside". This is in the
     # mapped coordinates of the element.
     cdef np.float64_t inclusion_tol
+    cdef int num_mapped_coords
 
     cdef void map_real_to_unit(self,
                                double* mapped_x, 
@@ -29,6 +30,31 @@
 
     cdef int check_inside(self, double* mapped_coord) nogil
 
+    cdef int check_near_edge(self,
+                             double* mapped_coord,
+                             double tolerance,
+                             int direction) nogil
+
+
+cdef class P1Sampler2D(ElementSampler):
+
+    cdef void map_real_to_unit(self,
+                               double* mapped_x, 
+                               double* vertices,
+                               double* physical_x) nogil
+
+
+    cdef double sample_at_unit_point(self,
+                                     double* coord,
+                                     double* vals) nogil
+
+    cdef int check_inside(self, double* mapped_coord) nogil
+
+    cdef int check_near_edge(self,
+                             double* mapped_coord,
+                             double tolerance,
+                             int direction) nogil
+
 
 cdef class P1Sampler3D(ElementSampler):
 
@@ -44,6 +70,12 @@
 
     cdef int check_inside(self, double* mapped_coord) nogil
 
+    cdef int check_near_edge(self,
+                             double* mapped_coord,
+                             double tolerance,
+                             int direction) nogil
+
+
 # This typedef defines a function pointer that defines the system
 # of equations that will be solved by the NonlinearSolveSamplers.
 # 
@@ -63,7 +95,7 @@
                            double* phys_x) nogil
 
 # This typedef defines a function pointer that defines the Jacobian
-# matrix used by the NonlinearSolveSamplers. Subclasses needed to 
+# matrix used by the NonlinearSolveSampler3D. Subclasses need to
 # define a Jacobian function in this form.
 # 
 # inputs:
@@ -77,20 +109,41 @@
 #     scol     - the second column of the jacobian
 #     tcol     - the third column of the jacobian
 #
-ctypedef void (*jac_type)(double* rcol, 
-                          double* scol, 
-                          double* tcol, 
-                          double* x, 
-                          double* vertices, 
-                          double* phys_x) nogil
+ctypedef void (*jac_type3D)(double* rcol, 
+                            double* scol, 
+                            double* tcol, 
+                            double* x, 
+                            double* vertices, 
+                            double* phys_x) nogil
 
-cdef class NonlinearSolveSampler(ElementSampler):
+
+# This typedef defines a function pointer that defines the Jacobian
+# matrix used by the NonlinearSolveSampler2D. Subclasses need to
+# define a Jacobian function in this form.
+# 
+# inputs:
+#     x        - pointer to the mapped coordinate
+#     vertices - pointer to the element vertices
+#     phys_x   - pointer to the physical coordinate
+#
+# outputs:
+#
+#     A        - A flattened array storing the Jacobian matrix
+#                The order of this array is [J11, J12, J21, J22]
+#
+ctypedef void (*jac_type2D)(double* A,
+                            double* x,
+                            double* vertices,
+                            double* phys_x) nogil
+
+
+cdef class NonlinearSolveSampler3D(ElementSampler):
 
     cdef int dim
     cdef int max_iter
     cdef np.float64_t tolerance
     cdef func_type func 
-    cdef jac_type jac
+    cdef jac_type3D jac
 
     cdef void map_real_to_unit(self,
                                double* mapped_x, 
@@ -98,7 +151,7 @@
                                double* physical_x) nogil
     
 
-cdef class Q1Sampler3D(NonlinearSolveSampler):
+cdef class Q1Sampler3D(NonlinearSolveSampler3D):
 
     cdef void map_real_to_unit(self,
                                double* mapped_x, 
@@ -111,3 +164,41 @@
                                      double* vals) nogil
 
     cdef int check_inside(self, double* mapped_coord) nogil
+
+    cdef int check_near_edge(self,
+                             double* mapped_coord,
+                             double tolerance,
+                             int direction) nogil
+
+cdef class NonlinearSolveSampler2D(ElementSampler):
+
+    cdef int dim
+    cdef int max_iter
+    cdef np.float64_t tolerance
+    cdef func_type func 
+    cdef jac_type2D jac
+
+    cdef void map_real_to_unit(self,
+                               double* mapped_x, 
+                               double* vertices,
+                               double* physical_x) nogil
+    
+
+cdef class Q1Sampler2D(NonlinearSolveSampler2D):
+
+    cdef void map_real_to_unit(self,
+                               double* mapped_x, 
+                               double* vertices,
+                               double* physical_x) nogil
+
+
+    cdef double sample_at_unit_point(self,
+                                     double* coord,
+                                     double* vals) nogil
+
+    cdef int check_inside(self, double* mapped_coord) nogil
+
+    cdef int check_near_edge(self,
+                             double* mapped_coord,
+                             double tolerance,
+                             int direction) nogil

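For intuition, here is a hedged Python sketch of the inverse mapping the
``NonlinearSolveSampler2D`` declarations set up: Newton iteration on a
bilinear (Q1) quad, with the Jacobian built in the flattened
``[J11, J12, J21, J22]`` order described above. The shape functions and the
counterclockwise vertex ordering are assumptions:

.. code-block:: python

   import numpy as np

   def q1_map(rs, verts):
       # bilinear shape functions on the reference square [-1, 1]^2
       r, s = rs
       N = 0.25 * np.array([(1 - r)*(1 - s), (1 + r)*(1 - s),
                            (1 + r)*(1 + s), (1 - r)*(1 + s)])
       return N @ verts                      # physical (x, y)

   def q1_jac(rs, verts):
       # Jacobian of the map; flattened order is [J11, J12, J21, J22]
       r, s = rs
       dN_dr = 0.25 * np.array([-(1 - s),  (1 - s), (1 + s), -(1 + s)])
       dN_ds = 0.25 * np.array([-(1 - r), -(1 + r), (1 + r),  (1 - r)])
       Jr, Js = dN_dr @ verts, dN_ds @ verts
       return np.array([Jr[0], Js[0], Jr[1], Js[1]]).reshape(2, 2)

   def map_real_to_unit(phys_x, verts, tol=1e-10, max_iter=10):
       rs = np.zeros(2)                      # start at the element center
       for _ in range(max_iter):
           f = q1_map(rs, verts) - phys_x
           if np.linalg.norm(f) < tol:
               break
           rs -= np.linalg.solve(q1_jac(rs, verts), f)
       return rs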
This diff is so big that we needed to truncate the remainder.

Repository URL: https://bitbucket.org/yt_analysis/yt/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.

