[yt-svn] commit/yt: 32 new changesets

commits-noreply at bitbucket.org
Tue Oct 17 09:19:40 PDT 2017


32 new commits in yt:

https://bitbucket.org/yt_analysis/yt/commits/82acafef02ab/
Changeset:   82acafef02ab
User:        Corentin Cadiou
Date:        2017-10-06 08:55:28+00:00
Summary:     add scale to command line
Affected #:  1 file

diff -r 0231a9c04ac1101ef070028c7af3f71684643597 -r 82acafef02ab8a6efc541ca31f94dde478563a59 yt/utilities/command_line.py
--- a/yt/utilities/command_line.py
+++ b/yt/utilities/command_line.py
@@ -882,7 +882,10 @@
             dict(short="-fu", longname="--field-unit",
                  action="store", type=str,
                  dest="field_unit", default=None,
-                 help="Desired field units"))
+                 help="Desired field units"),
+            dict(longname='--scale',
+                 action='store_true',
+                 help="Annotate the plot with the scale"))
 
     name = "plot"
 
@@ -932,6 +935,8 @@
                 plt.annotate_grids()
             if args.time:
                 plt.annotate_timestamp()
+            if args.scale:
+                plt.annotate_scale()
 
             if args.field_unit:
                 plt.set_unit(args.field, args.field_unit)

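The diff above declares the new option as a plain dict that yt's command-line machinery later forwards to argparse. A minimal stdlib sketch of that forwarding (the translation loop here is illustrative, not yt's actual code; the option dict is copied from the diff):

```python
import argparse

# Option declared exactly as in the diff above.
opts = [
    dict(longname='--scale',
         action='store_true',
         help="Annotate the plot with the scale"),
]

parser = argparse.ArgumentParser()
for o in opts:
    # Collect whichever of short/longname the dict provides,
    # then pass the remaining keys straight to add_argument.
    names = [n for n in (o.pop('short', None), o.pop('longname', None)) if n]
    parser.add_argument(*names, **o)

args = parser.parse_args(['--scale'])
```

With `action='store_true'` the flag needs no value: passing `--scale` sets `args.scale` to `True`, which is what the `if args.scale:` branch in the diff tests.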

https://bitbucket.org/yt_analysis/yt/commits/fdfc41d145db/
Changeset:   fdfc41d145db
User:        Corentin Cadiou
Date:        2017-10-16 22:17:10+00:00
Summary:     add doc
Affected #:  1 file

diff -r 82acafef02ab8a6efc541ca31f94dde478563a59 -r fdfc41d145db4614c9e004df4e6a63fea9af6e85 doc/source/reference/command-line.rst
--- a/doc/source/reference/command-line.rst
+++ b/doc/source/reference/command-line.rst
@@ -114,7 +114,12 @@
 
   $ yt plot --show-grids -p DD0010/moving7_0010
 
-We can now see all the grids in the field of view.
+We can now see all the grids in the field of view. If you want to annotate your plot with a scale bar, you can use the ``--scale`` option.:
+
+.. code-block:: bash
+
+  $ yt plot --scale -p DD0010/moving7_0010
+
 
 Command-line subcommand summary
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


https://bitbucket.org/yt_analysis/yt/commits/c2e7ef1d3245/
Changeset:   c2e7ef1d3245
User:        Corentin Cadiou
Date:        2017-10-16 22:20:13+00:00
Summary:     --scale -> --show-scale-bar
Affected #:  2 files

diff -r fdfc41d145db4614c9e004df4e6a63fea9af6e85 -r c2e7ef1d3245d79e239c8d109876eda730576668 doc/source/reference/command-line.rst
--- a/doc/source/reference/command-line.rst
+++ b/doc/source/reference/command-line.rst
@@ -114,11 +114,13 @@
 
   $ yt plot --show-grids -p DD0010/moving7_0010
 
-We can now see all the grids in the field of view. If you want to annotate your plot with a scale bar, you can use the ``--scale`` option.:
+We can now see all the grids in the field of view. If you want to
+annotate your plot with a scale bar, you can use the
+``--scale-scale-bar`` option:
 
 .. code-block:: bash
 
-  $ yt plot --scale -p DD0010/moving7_0010
+  $ yt plot --show-scale-bar -p DD0010/moving7_0010
 
 
 Command-line subcommand summary

diff -r fdfc41d145db4614c9e004df4e6a63fea9af6e85 -r c2e7ef1d3245d79e239c8d109876eda730576668 yt/utilities/command_line.py
--- a/yt/utilities/command_line.py
+++ b/yt/utilities/command_line.py
@@ -883,7 +883,7 @@
                  action="store", type=str,
                  dest="field_unit", default=None,
                  help="Desired field units"),
-            dict(longname='--scale',
+            dict(longname='--show-scale-bar',
                  action='store_true',
                  help="Annotate the plot with the scale"))
 


https://bitbucket.org/yt_analysis/yt/commits/a4059053d0ae/
Changeset:   a4059053d0ae
User:        Corentin Cadiou
Date:        2017-10-16 22:20:26+00:00
Summary:     remove 2 useless '.'
Affected #:  1 file

diff -r c2e7ef1d3245d79e239c8d109876eda730576668 -r a4059053d0aeae749173ee997288894206a153d9 doc/source/reference/command-line.rst
--- a/doc/source/reference/command-line.rst
+++ b/doc/source/reference/command-line.rst
@@ -98,7 +98,7 @@
 domain only has some minor shifts in density, so the plot is essentially
 incomprehensible.  Let's try it again, but instead of slicing, let's project.
 This is a line integral through the domain, and for the density field this
-becomes a column density.:
+becomes a column density:
 
 .. code-block:: bash
 
@@ -108,7 +108,7 @@
 nearly indistinguishable, because of how the two spheres are located in the
 domain.  We could center our domain on one of the spheres and take a slice, as
 well.  Now let's see what the domain looks like with grids overlaid, using the
-``--show-grids`` option.:
+``--show-grids`` option:
 
 .. code-block:: bash
 


https://bitbucket.org/yt_analysis/yt/commits/bd57556f5866/
Changeset:   bd57556f5866
User:        Corentin Cadiou
Date:        2017-10-17 12:15:21+00:00
Summary:     fix mistake
Affected #:  1 file

diff -r a4059053d0aeae749173ee997288894206a153d9 -r bd57556f58669958027f11183c7ca05c19e5abcc yt/utilities/command_line.py
--- a/yt/utilities/command_line.py
+++ b/yt/utilities/command_line.py
@@ -935,7 +935,7 @@
                 plt.annotate_grids()
             if args.time:
                 plt.annotate_timestamp()
-            if args.scale:
+            if args.show_scale_bar:
                 plt.annotate_scale()
 
             if args.field_unit:

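This fix is needed because argparse derives the attribute name from the long option by replacing dashes with underscores, so after the rename to `--show-scale-bar` the parsed value lives on `args.show_scale_bar`, not `args.scale`. A minimal sketch of that behavior (plain argparse, not yt's wrapper):

```python
import argparse

parser = argparse.ArgumentParser()
# Dashes in the option name become underscores in the Namespace attribute.
parser.add_argument('--show-scale-bar', action='store_true',
                    help="Annotate the plot with the scale")

args = parser.parse_args(['--show-scale-bar'])
# The old attribute name no longer exists; reading args.scale would raise
# AttributeError, which is exactly the mistake this commit corrects.
```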

https://bitbucket.org/yt_analysis/yt/commits/6d1a664e3553/
Changeset:   6d1a664e3553
User:        ngoldbaum
Date:        2017-10-17 16:02:32+00:00
Summary:     typo fix
Affected #:  1 file

diff -r bd57556f58669958027f11183c7ca05c19e5abcc -r 6d1a664e3553109c66338365eb1a6d5693ff2221 doc/source/reference/command-line.rst
--- a/doc/source/reference/command-line.rst
+++ b/doc/source/reference/command-line.rst
@@ -116,7 +116,7 @@
 
 We can now see all the grids in the field of view. If you want to
 annotate your plot with a scale bar, you can use the
-``--scale-scale-bar`` option:
+``--show-scale-bar`` option:
 
 .. code-block:: bash
 


https://bitbucket.org/yt_analysis/yt/commits/e64509889f88/
Changeset:   e64509889f88
User:        ngoldbaum
Date:        2017-10-17 16:02:44+00:00
Summary:     Merge branch 'add-scale-in-command-line'
Affected #:  2 files

diff -r d66dc7438508dbf225eaa337cd83dab8eda662d8 -r e64509889f88dabc7f11328d95b9aef816700464 doc/source/reference/command-line.rst
--- a/doc/source/reference/command-line.rst
+++ b/doc/source/reference/command-line.rst
@@ -98,7 +98,7 @@
 domain only has some minor shifts in density, so the plot is essentially
 incomprehensible.  Let's try it again, but instead of slicing, let's project.
 This is a line integral through the domain, and for the density field this
-becomes a column density.:
+becomes a column density:
 
 .. code-block:: bash
 
@@ -108,13 +108,20 @@
 nearly indistinguishable, because of how the two spheres are located in the
 domain.  We could center our domain on one of the spheres and take a slice, as
 well.  Now let's see what the domain looks like with grids overlaid, using the
-``--show-grids`` option.:
+``--show-grids`` option:
 
 .. code-block:: bash
 
   $ yt plot --show-grids -p DD0010/moving7_0010
 
-We can now see all the grids in the field of view.
+We can now see all the grids in the field of view. If you want to
+annotate your plot with a scale bar, you can use the
+``--show-scale-bar`` option:
+
+.. code-block:: bash
+
+  $ yt plot --show-scale-bar -p DD0010/moving7_0010
+
 
 Command-line subcommand summary
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

diff -r d66dc7438508dbf225eaa337cd83dab8eda662d8 -r e64509889f88dabc7f11328d95b9aef816700464 yt/utilities/command_line.py
--- a/yt/utilities/command_line.py
+++ b/yt/utilities/command_line.py
@@ -882,7 +882,10 @@
             dict(short="-fu", longname="--field-unit",
                  action="store", type=str,
                  dest="field_unit", default=None,
-                 help="Desired field units"))
+                 help="Desired field units"),
+            dict(longname='--show-scale-bar',
+                 action='store_true',
+                 help="Annotate the plot with the scale"))
 
     name = "plot"
 
@@ -932,6 +935,8 @@
                 plt.annotate_grids()
             if args.time:
                 plt.annotate_timestamp()
+            if args.show_scale_bar:
+                plt.annotate_scale()
 
             if args.field_unit:
                 plt.set_unit(args.field, args.field_unit)


https://bitbucket.org/yt_analysis/yt/commits/6909b0d65d71/
Changeset:   6909b0d65d71
User:        jzuhone
Date:        2017-08-11 19:40:21+00:00
Summary:     Begin refactor of FITSDataset
Affected #:  2 files

diff -r d66dc7438508dbf225eaa337cd83dab8eda662d8 -r 6909b0d65d719a5d7914169165d56ccabcb524e7 yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -40,7 +40,8 @@
     FITSFileHandler
 from yt.utilities.io_handler import \
     io_registry
-from .fields import FITSFieldInfo
+from .fields import FITSFieldInfo, \
+    WCSFITSFieldInfo
 from yt.utilities.decompose import \
     decompose_array, get_psize
 from yt.units.unit_lookup_table import \
@@ -48,7 +49,8 @@
     prefixable_units, \
     unit_prefixes
 from yt.units import dimensions
-from yt.utilities.on_demand_imports import _astropy, NotAModule
+from yt.utilities.on_demand_imports import \
+    _astropy, NotAModule
 
 
 lon_prefixes = ["X","RA","GLON","LINEAR"]
@@ -310,8 +312,6 @@
                  nprocs=None,
                  storage_filename=None,
                  nan_mask=None,
-                 spectral_factor=1.0,
-                 z_axis_decomp=False,
                  suppress_astropy_warnings=True,
                  parameters=None,
                  units_override=None,
@@ -322,9 +322,6 @@
         parameters["nprocs"] = nprocs
         self.specified_parameters = parameters
 
-        self.z_axis_decomp = z_axis_decomp
-        self.spectral_factor = spectral_factor
-
         if suppress_astropy_warnings:
             warnings.filterwarnings('ignore', module="astropy", append=True)
         auxiliary_files = ensure_list(auxiliary_files)
@@ -360,65 +357,6 @@
                                              ignore_blank=True)
                 self._handle._fits_files.append(f)
 
-        if len(self._handle) > 1 and self._handle[1].name == "EVENTS":
-            self.events_data = True
-            self.first_image = 1
-            self.primary_header = self._handle[self.first_image].header
-            self.naxis = 2
-            self.wcs = _astropy.pywcs.WCS(naxis=2)
-            self.events_info = {}
-            for k,v in self.primary_header.items():
-                if k.startswith("TTYP"):
-                    if v.lower() in ["x","y"]:
-                        num = k.strip("TTYPE")
-                        self.events_info[v.lower()] = (self.primary_header["TLMIN"+num],
-                                                       self.primary_header["TLMAX"+num],
-                                                       self.primary_header["TCTYP"+num],
-                                                       self.primary_header["TCRVL"+num],
-                                                       self.primary_header["TCDLT"+num],
-                                                       self.primary_header["TCRPX"+num])
-                    elif v.lower() in ["energy","time"]:
-                        num = k.strip("TTYPE")
-                        unit = self.primary_header["TUNIT"+num].lower()
-                        if unit.endswith("ev"): unit = unit.replace("ev","eV")
-                        self.events_info[v.lower()] = unit
-            self.axis_names = [self.events_info[ax][2] for ax in ["x","y"]]
-            self.reblock = 1
-            if "reblock" in self.specified_parameters:
-                self.reblock = self.specified_parameters["reblock"]
-            self.wcs.wcs.cdelt = [self.events_info["x"][4]*self.reblock,
-                                  self.events_info["y"][4]*self.reblock]
-            self.wcs.wcs.crpix = [(self.events_info["x"][5]-0.5)/self.reblock+0.5,
-                                  (self.events_info["y"][5]-0.5)/self.reblock+0.5]
-            self.wcs.wcs.ctype = [self.events_info["x"][2],self.events_info["y"][2]]
-            self.wcs.wcs.cunit = ["deg","deg"]
-            self.wcs.wcs.crval = [self.events_info["x"][3],self.events_info["y"][3]]
-            self.dims = [(self.events_info["x"][1]-self.events_info["x"][0])/self.reblock,
-                         (self.events_info["y"][1]-self.events_info["y"][0])/self.reblock]
-        else:
-            self.events_data = False
-            # Sometimes the primary hdu doesn't have an image
-            if len(self._handle) > 1 and self._handle[0].header["naxis"] == 0:
-                self.first_image = 1
-            else:
-                self.first_image = 0
-            self.primary_header = self._handle[self.first_image].header
-            self.naxis = self.primary_header["naxis"]
-            self.axis_names = [self.primary_header.get("ctype%d" % (i+1),"LINEAR")
-                               for i in range(self.naxis)]
-            self.dims = [self.primary_header["naxis%d" % (i+1)]
-                         for i in range(self.naxis)]
-            wcs = _astropy.pywcs.WCS(header=self.primary_header)
-            if self.naxis == 4:
-                self.wcs = _astropy.pywcs.WCS(naxis=3)
-                self.wcs.wcs.crpix = wcs.wcs.crpix[:3]
-                self.wcs.wcs.cdelt = wcs.wcs.cdelt[:3]
-                self.wcs.wcs.crval = wcs.wcs.crval[:3]
-                self.wcs.wcs.cunit = [str(unit) for unit in wcs.wcs.cunit][:3]
-                self.wcs.wcs.ctype = [type for type in wcs.wcs.ctype][:3]
-            else:
-                self.wcs = wcs
-
         self.refine_by = 2
 
         Dataset.__init__(self, fn, dataset_type, units_override=units_override,
@@ -459,11 +397,13 @@
             beam_size = self.quan(beam_size[0], beam_size[1]).in_cgs().value
         else:
             beam_size = 1.0
-        self.unit_registry.add("beam",beam_size,dimensions=dimensions.solid_angle)
+        self.unit_registry.add("beam", beam_size, dimensions=dimensions.solid_angle)
         if self.spec_cube:
             units = self.wcs_2d.wcs.cunit[0]
-            if units == "deg": units = "degree"
-            if units == "rad": units = "radian"
+            if units == "deg":
+                units = "degree"
+            if units == "rad":
+                units = "radian"
             pixel_area = np.prod(np.abs(self.wcs_2d.wcs.cdelt))
             pixel_area = self.quan(pixel_area, "%s**2" % (units)).in_cgs()
             pixel_dims = pixel_area.units.dimensions
@@ -471,6 +411,10 @@
 
     def _parse_parameter_file(self):
 
+        self._determine_structure()
+        self._determine_wcs()
+        self._determine_wcs2d()
+
         if self.parameter_filename.startswith("InMemory"):
             self.unique_identifier = time.time()
         else:
@@ -484,7 +428,8 @@
 
         # Sometimes a FITS file has a 4D datacube, in which case
         # we take the 4th axis and assume it consists of different fields.
-        if self.dimensionality == 4: self.dimensionality = 3
+        if self.dimensionality == 4: 
+            self.dimensionality = 3
 
         self.domain_dimensions = np.array(self.dims)[:self.dimensionality]
         if self.dimensionality == 2:
@@ -514,24 +459,10 @@
         self.current_redshift = self.omega_lambda = self.omega_matter = \
             self.hubble_constant = self.cosmological_simulation = 0.0
 
-        if self.dimensionality == 2 and self.z_axis_decomp:
-            mylog.warning("You asked to decompose along the z-axis, but this is a 2D dataset. " +
-                          "Ignoring.")
-            self.z_axis_decomp = False
-
-        if self.events_data: self.specified_parameters["nprocs"] = 1
-
-        # If nprocs is None, do some automatic decomposition of the domain
-        if self.specified_parameters["nprocs"] is None:
-            if self.z_axis_decomp:
-                nprocs = np.around(self.domain_dimensions[2]/8).astype("int")
-            else:
-                nprocs = np.around(np.prod(self.domain_dimensions)/32**self.dimensionality).astype("int")
-            self.parameters["nprocs"] = max(min(nprocs, 512), 1)
-        else:
-            self.parameters["nprocs"] = self.specified_parameters["nprocs"]
+        self._determine_nprocs()
 
         # Check to see if this data is in some kind of (Lat,Lon,Vel) format
+        """
         self.spec_cube = False
         self.wcs_2d = None
         x = 0
@@ -547,17 +478,56 @@
                 self.lon_name = "X"
             else:
                 self._setup_spec_cube()
+        """
 
         # Now we can set up some of our parameters for convenience.
-        #self.parameters['wcs'] = dict(self.wcs.to_header())
         for k, v in self.primary_header.items():
             self.parameters[k] = v
         # Remove potential default keys
         self.parameters.pop('', None)
 
+    def _determine_nprocs(self):
+        # If nprocs is None, do some automatic decomposition of the domain
+        if self.specified_parameters["nprocs"] is None:
+            nprocs = np.around(np.prod(self.domain_dimensions)/32**self.dimensionality).astype("int")
+            self.parameters["nprocs"] = max(min(nprocs, 512), 1)
+        else:
+            self.parameters["nprocs"] = self.specified_parameters["nprocs"]
+
+    def _determine_structure(self):
+        # Sometimes the primary hdu doesn't have an image
+        if len(self._handle) > 1 and self._handle[0].header["naxis"] == 0:
+            self.first_image = 1
+        else:
+            self.first_image = 0
+        self.primary_header = self._handle[self.first_image].header
+        self.naxis = self.primary_header["naxis"]
+        self.axis_names = [self.primary_header.get("ctype%d" % (i+1),"LINEAR")
+                           for i in range(self.naxis)]
+        self.dims = [self.primary_header["naxis%d" % (i+1)]
+                     for i in range(self.naxis)]
+
+    def _determine_wcs(self):
+        wcs = _astropy.pywcs.WCS(header=self.primary_header)
+        if self.naxis == 4:
+            self.wcs = _astropy.pywcs.WCS(naxis=3)
+            self.wcs.wcs.crpix = wcs.wcs.crpix[:3]
+            self.wcs.wcs.cdelt = wcs.wcs.cdelt[:3]
+            self.wcs.wcs.crval = wcs.wcs.crval[:3]
+            self.wcs.wcs.cunit = [str(unit) for unit in wcs.wcs.cunit][:3]
+            self.wcs.wcs.ctype = [type for type in wcs.wcs.ctype][:3]
+        else:
+            self.wcs = wcs
+
+    def _determine_wcs2d(self):
+        self.wcs_2d = self.wcs
+        self.lat_axis = 1
+        self.lon_axis = 0
+        self.lat_name = "Y"
+        self.lon_name = "X"
+
     def _setup_spec_cube(self):
 
-        self.spec_cube = True
         self.geometry = "spectral_cube"
 
         end = min(self.dimensionality+1,4)
@@ -630,16 +600,6 @@
             self.spec_name = "z"
             self.spec_unit = "code_length"
 
-    def spec2pixel(self, spec_value):
-        sv = self.arr(spec_value).in_units(self.spec_unit)
-        return self.arr((sv.v-self._z0)/self._dz+self._p0,
-                        "code_length")
-
-    def pixel2spec(self, pixel_value):
-        pv = self.arr(pixel_value, "code_length")
-        return self.arr((pv.v-self._p0)*self._dz+self._z0,
-                        self.spec_unit)
-
     @classmethod
     def _is_valid(cls, *args, **kwargs):
         ext = args[0].rsplit(".", 1)[-1]
@@ -676,3 +636,113 @@
 
     def close(self):
         self._handle.close()
+
+class SkyDataFITSDataset(FITSDataset):
+    _field_info_class = WCSFITSFieldInfo
+
+    def _parse_parameter_file(self):
+        super(SkyDataFITSDataset, self)._parse_parameter_file()
+        
+class SpectralCubeFITSDataset(SkyDataFITSDataset):
+    def __init__(self, filename,
+                 auxiliary_files=[],
+                 nprocs=None,
+                 storage_filename=None,
+                 nan_mask=None,
+                 spectral_factor=1.0,
+                 z_axis_decomp=False,
+                 suppress_astropy_warnings=True,
+                 parameters=None,
+                 units_override=None,
+                 unit_system="cgs"):
+        super(SpectralCubeFITSDataset, self).__init__(filename, nprocs=nprocs,
+            auxiliary_files=auxiliary_files, storage_filename=storage_filename,
+            suppress_astropy_warnings=suppress_astropy_warnings, nan_mask=nan_mask,
+            parameters=parameters, units_override=units_override,
+            unit_system=unit_system)
+
+        self.spectral_factor = spectral_factor
+        self.z_axis_decomp = z_axis_decomp
+
+    def _determine_nprocs(self):
+        # If nprocs is None, do some automatic decomposition of the domain
+        if self.specified_parameters["nprocs"] is None:
+            if self.z_axis_decomp:
+                nprocs = np.around(self.domain_dimensions[2] / 8).astype("int")
+            else:
+                nprocs = np.around(np.prod(self.domain_dimensions) / 32 ** self.dimensionality).astype("int")
+            self.parameters["nprocs"] = max(min(nprocs, 512), 1)
+        else:
+            self.parameters["nprocs"] = self.specified_parameters["nprocs"]
+
+    def spec2pixel(self, spec_value):
+        sv = self.arr(spec_value).in_units(self.spec_unit)
+        return self.arr((sv.v-self._z0)/self._dz+self._p0,
+                        "code_length")
+
+    def pixel2spec(self, pixel_value):
+        pv = self.arr(pixel_value, "code_length")
+        return self.arr((pv.v-self._p0)*self._dz+self._z0,
+                        self.spec_unit)
+
+class EventsFITSHierarchy(FITSHierarchy):
+
+    def _parse_index(self):
+        super(EventsFITSHierarchy, self)._parse_index()
+        try:
+            self.grid_particle_count[:] = self.dataset.primary_header["naxis2"]
+        except KeyError:
+            self.grid_particle_count[:] = 0.0
+        self._particle_indices = np.zeros(self.num_grids + 1, dtype='int64')
+        self._particle_indices[1] = self.grid_particle_count.squeeze()
+
+class EventsFITSDataset(SkyDataFITSDataset):
+    _index_class = EventsFITSHierarchy
+    def __init__(self, filename,
+                 nprocs=None,
+                 storage_filename=None,
+                 suppress_astropy_warnings=True,
+                 parameters=None,
+                 units_override=None,
+                 unit_system="cgs"):
+        super(EventsFITSDataset, self).__init__(filename, nprocs=nprocs,
+            storage_filename=storage_filename, parameters=parameters,
+            suppress_astropy_warnings=suppress_astropy_warnings,
+            units_override=units_override, unit_system=unit_system)
+
+    def _determine_structure(self):
+        self.first_image = 1
+        self.primary_header = self._handle[self.first_image].header
+        self.naxis = 2
+
+    def _determine_wcs(self):
+        self.wcs = _astropy.pywcs.WCS(naxis=2)
+        self.events_info = {}
+        for k, v in self.primary_header.items():
+            if k.startswith("TTYP"):
+                if v.lower() in ["x", "y"]:
+                    num = k.strip("TTYPE")
+                    self.events_info[v.lower()] = (self.primary_header["TLMIN" + num],
+                                                   self.primary_header["TLMAX" + num],
+                                                   self.primary_header["TCTYP" + num],
+                                                   self.primary_header["TCRVL" + num],
+                                                   self.primary_header["TCDLT" + num],
+                                                   self.primary_header["TCRPX" + num])
+                elif v.lower() in ["energy", "time"]:
+                    num = k.strip("TTYPE")
+                    unit = self.primary_header["TUNIT" + num].lower()
+                    if unit.endswith("ev"): unit = unit.replace("ev", "eV")
+                    self.events_info[v.lower()] = unit
+        self.axis_names = [self.events_info[ax][2] for ax in ["x", "y"]]
+        self.reblock = 1
+        if "reblock" in self.specified_parameters:
+            self.reblock = self.specified_parameters["reblock"]
+        self.wcs.wcs.cdelt = [self.events_info["x"][4] * self.reblock,
+                              self.events_info["y"][4] * self.reblock]
+        self.wcs.wcs.crpix = [(self.events_info["x"][5] - 0.5) / self.reblock + 0.5,
+                              (self.events_info["y"][5] - 0.5) / self.reblock + 0.5]
+        self.wcs.wcs.ctype = [self.events_info["x"][2], self.events_info["y"][2]]
+        self.wcs.wcs.cunit = ["deg", "deg"]
+        self.wcs.wcs.crval = [self.events_info["x"][3], self.events_info["y"][3]]
+        self.dims = [(self.events_info["x"][1] - self.events_info["x"][0]) / self.reblock,
+                     (self.events_info["y"][1] - self.events_info["y"][0]) / self.reblock]

diff -r d66dc7438508dbf225eaa337cd83dab8eda662d8 -r 6909b0d65d719a5d7914169165d56ccabcb524e7 yt/frontends/fits/fields.py
--- a/yt/frontends/fits/fields.py
+++ b/yt/frontends/fits/fields.py
@@ -19,9 +19,15 @@
     def __init__(self, ds, field_list, slice_info=None):
         super(FITSFieldInfo, self).__init__(ds, field_list, slice_info=slice_info)
         for field in ds.field_list:
-            if field[0] == "fits": self[field].take_log = False
+            if field[0] == "fits": 
+                self[field].take_log = False
+
+class WCSFITSFieldInfo(FITSFieldInfo):
 
-    def _setup_spec_cube_fields(self):
+    def setup_fluid_fields(self):
+        def _pixel(field, data):
+            return data.ds.arr(data["ones"], "pixel")
+        self.add_field(("fits", "pixel"), sampling_type="cell", function=_pixel, units="pixel")
 
         def _get_2d_wcs(data, axis):
             w_coords = data.ds.wcs_2d.wcs_pix2world(data["x"], data["y"], 1)
@@ -33,11 +39,14 @@
             return _world_f
 
         for (i, axis), name in zip(enumerate([self.ds.lon_axis, self.ds.lat_axis]),
-                                             [self.ds.lon_name, self.ds.lat_name]):
+                                   [self.ds.lon_name, self.ds.lat_name]):
             unit = str(self.ds.wcs_2d.wcs.cunit[i])
-            if unit.lower() == "deg": unit = "degree"
-            if unit.lower() == "rad": unit = "radian"
-            self.add_field(("fits",name), sampling_type="cell",  function=world_f(axis, unit), units=unit)
+            if unit.lower() == "deg":
+                unit = "degree"
+            if unit.lower() == "rad":
+                unit = "radian"
+            self.add_field(("fits",name), sampling_type="cell", 
+                           function=world_f(axis, unit), units=unit)
 
         if self.ds.dimensionality == 3:
             def _spec(field, data):
@@ -46,12 +55,3 @@
                 return data.ds.arr(sp, data.ds.spec_unit)
             self.add_field(("fits","spectral"), sampling_type="cell",  function=_spec,
                            units=self.ds.spec_unit, display_name=self.ds.spec_name)
-
-    def setup_fluid_fields(self):
-
-        if self.ds.spec_cube:
-            def _pixel(field, data):
-                return data.ds.arr(data["ones"], "pixel")
-            self.add_field(("fits","pixel"), sampling_type="cell",  function=_pixel, units="pixel")
-            self._setup_spec_cube_fields()
-            return

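The refactor above follows a template-method shape: the monolithic `__init__`/`_parse_parameter_file` logic is split into hook methods (`_determine_structure`, `_determine_wcs`, ...) that subclasses such as `EventsFITSDataset` override, instead of branching on `self.events_data`. A minimal sketch of the pattern with hypothetical stand-in classes (not yt's actual API):

```python
class BaseDataset:
    """Drives setup through overridable hooks, mirroring _parse_parameter_file."""
    def parse(self):
        self._determine_structure()
        self._determine_wcs()

    def _determine_structure(self):
        # Generic FITS case: the image usually sits in the primary HDU.
        self.first_image = 0

    def _determine_wcs(self):
        self.naxis = 3


class EventsDataset(BaseDataset):
    """Event files get their own hooks rather than if/else branches."""
    def _determine_structure(self):
        # Event tables live in the second HDU, as in the diff's version.
        self.first_image = 1

    def _determine_wcs(self):
        # Events carry a 2D sky WCS built from the TTYPE/TCRVL keywords.
        self.naxis = 2


ds = EventsDataset()
ds.parse()
```

The payoff, visible in the diff, is that the `events_data` flag and its large `if/else` block disappear: each dataset class states its own structure and WCS rules directly.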

https://bitbucket.org/yt_analysis/yt/commits/5524405379ad/
Changeset:   5524405379ad
User:        jzuhone
Date:        2017-08-11 22:32:46+00:00
Summary:     More refactoring of FITSDataset
Affected #:  2 files

diff -r 6909b0d65d719a5d7914169165d56ccabcb524e7 -r 5524405379ad8d103e49abbcdaf2b3dce3d8650d yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -20,7 +20,6 @@
 import weakref
 import warnings
 
-
 from collections import defaultdict
 
 from yt.config import ytcfg
@@ -44,6 +43,7 @@
     WCSFITSFieldInfo
 from yt.utilities.decompose import \
     decompose_array, get_psize
+from yt.funcs import issue_deprecation_warning
 from yt.units.unit_lookup_table import \
     default_unit_symbol_lut, \
     prefixable_units, \
@@ -252,14 +252,6 @@
             self.grid_right_edge[0,:] = ds.domain_right_edge
             self.grid_dimensions[0] = ds.domain_dimensions
 
-        if ds.events_data:
-            try:
-                self.grid_particle_count[:] = ds.primary_header["naxis2"]
-            except KeyError:
-                self.grid_particle_count[:] = 0.0
-            self._particle_indices = np.zeros(self.num_grids + 1, dtype='int64')
-            self._particle_indices[1] = self.grid_particle_count.squeeze()
-
         self.grid_levels.flat[:] = 0
         self.grids = np.empty(self.num_grids, dtype='object')
         for i in range(self.num_grids):
@@ -300,6 +292,28 @@
             yield YTDataChunk(dobj, "io", gs, self._count_selection(dobj, gs),
                               cache = cache)
 
+def check_fits_valid(args):
+    ext = args[0].rsplit(".", 1)[-1]
+    if ext.upper() in ("GZ", "FZ"):
+        # We don't know for sure that there will be > 1
+        ext = args[0].rsplit(".", 1)[0].rsplit(".", 1)[-1]
+    if ext.upper() not in ("FITS", "FTS"):
+        return False
+    elif isinstance(_astropy.pyfits, NotAModule):
+        raise RuntimeError("This appears to be a FITS file, but AstroPy is not installed.")
+    try:
+        with warnings.catch_warnings():
+            warnings.filterwarnings('ignore', category=UserWarning, append=True)
+            fileh = _astropy.pyfits.open(args[0])
+        valid = fileh[0].header["naxis"] >= 2
+        if len(fileh) > 1:
+            valid = fileh[1].header["naxis"] >= 2 or valid
+        fileh.close()
+        return valid
+    except:
+        pass
+    return False
+
 class FITSDataset(Dataset):
     _index_class = FITSHierarchy
     _field_info_class = FITSFieldInfo
@@ -413,7 +427,7 @@
 
         self._determine_structure()
         self._determine_wcs()
-        self._determine_wcs2d()
+        self._determine_axes()
 
         if self.parameter_filename.startswith("InMemory"):
             self.unique_identifier = time.time()
@@ -463,8 +477,6 @@
 
         # Check to see if this data is in some kind of (Lat,Lon,Vel) format
         """
-        self.spec_cube = False
-        self.wcs_2d = None
         x = 0
         for p in lon_prefixes+lat_prefixes+list(spec_names.keys()):
             y = np_char.startswith(self.axis_names[:self.dimensionality], p)
@@ -519,16 +531,37 @@
         else:
             self.wcs = wcs
 
-    def _determine_wcs2d(self):
-        self.wcs_2d = self.wcs
+    def _determine_axes(self):
         self.lat_axis = 1
         self.lon_axis = 0
         self.lat_name = "Y"
         self.lon_name = "X"
 
-    def _setup_spec_cube(self):
+    @classmethod
+    def _is_valid(cls, *args, **kwargs):
+        return check_fits_valid(args)
 
-        self.geometry = "spectral_cube"
+    @classmethod
+    def _guess_candidates(cls, base, directories, files):
+        candidates = []
+        for fn, fnl in ((_, _.lower()) for _ in files):
+            if fnl.endswith(".fits") or \
+               fnl.endswith(".fits.gz") or \
+               fnl.endswith(".fits.fz"):
+                candidates.append(fn)
+        # FITS files don't preclude subdirectories
+        return candidates, True
+
+    def close(self):
+        self._handle.close()
+
+class SkyDataFITSDataset(FITSDataset):
+    _field_info_class = WCSFITSFieldInfo
+
+    def _parse_parameter_file(self):
+        super(SkyDataFITSDataset, self)._parse_parameter_file()
+
+        self.geometry = "sky_coord"
 
         end = min(self.dimensionality+1,4)
         if self.events_data:
@@ -558,91 +591,26 @@
             self.lat_name = "Y"
             self.lon_name = "X"
 
-        if self.wcs.naxis > 2:
-
-            self.spec_axis = np.zeros((end-1), dtype="bool")
-            for p in spec_names.keys():
-                self.spec_axis += np_char.startswith(ctypes, p)
-            self.spec_axis = np.where(self.spec_axis)[0][0]
-            self.spec_name = spec_names[ctypes[self.spec_axis].split("-")[0][0]]
-
-            self.wcs_2d = _astropy.pywcs.WCS(naxis=2)
-            self.wcs_2d.wcs.crpix = self.wcs.wcs.crpix[[self.lon_axis, self.lat_axis]]
-            self.wcs_2d.wcs.cdelt = self.wcs.wcs.cdelt[[self.lon_axis, self.lat_axis]]
-            self.wcs_2d.wcs.crval = self.wcs.wcs.crval[[self.lon_axis, self.lat_axis]]
-            self.wcs_2d.wcs.cunit = [str(self.wcs.wcs.cunit[self.lon_axis]),
-                                     str(self.wcs.wcs.cunit[self.lat_axis])]
-            self.wcs_2d.wcs.ctype = [self.wcs.wcs.ctype[self.lon_axis],
-                                     self.wcs.wcs.ctype[self.lat_axis]]
-
-            self._p0 = self.wcs.wcs.crpix[self.spec_axis]
-            self._dz = self.wcs.wcs.cdelt[self.spec_axis]
-            self._z0 = self.wcs.wcs.crval[self.spec_axis]
-            self.spec_unit = str(self.wcs.wcs.cunit[self.spec_axis])
-
-            if self.spectral_factor == "auto":
-                self.spectral_factor = float(max(self.domain_dimensions[[self.lon_axis,
-                                                                         self.lat_axis]]))
-                self.spectral_factor /= self.domain_dimensions[self.spec_axis]
-                mylog.info("Setting the spectral factor to %f" % (self.spectral_factor))
-            Dz = self.domain_right_edge[self.spec_axis]-self.domain_left_edge[self.spec_axis]
-            dre = self.domain_right_edge
-            dre[self.spec_axis] = (self.domain_left_edge[self.spec_axis] +
-                                   self.spectral_factor*Dz)
-            self.domain_right_edge = dre
-            self._dz /= self.spectral_factor
-            self._p0 = (self._p0-0.5)*self.spectral_factor + 0.5
-
-        else:
-
-            self.wcs_2d = self.wcs
-            self.spec_axis = 2
-            self.spec_name = "z"
-            self.spec_unit = "code_length"
-
     @classmethod
     def _is_valid(cls, *args, **kwargs):
-        ext = args[0].rsplit(".", 1)[-1]
-        if ext.upper() in ("GZ", "FZ"):
-            # We don't know for sure that there will be > 1
-            ext = args[0].rsplit(".", 1)[0].rsplit(".", 1)[-1]
-        if ext.upper() not in ("FITS", "FTS"):
-            return False
-        elif isinstance(_astropy.pyfits, NotAModule):
-            raise RuntimeError("This appears to be a FITS file, but AstroPy is not installed.")
-        try:
+        if check_fits_valid(args):
             with warnings.catch_warnings():
                 warnings.filterwarnings('ignore', category=UserWarning, append=True)
                 fileh = _astropy.pyfits.open(args[0])
-            valid = fileh[0].header["naxis"] >= 2
-            if len(fileh) > 1:
-                valid = fileh[1].header["naxis"] >= 2 or valid
-            fileh.close()
-            return valid
-        except:
-            pass
-        return False
+            try:
+            self.primary_header = self._handle[self.first_image].header
+            self.naxis = self.primary_header["naxis"]
+            self.axis_names = [self.primary_header.get("ctype%d" % (i + 1), "LINEAR")
+                               for i in range(self.naxis)]
 
-    @classmethod
-    def _guess_candidates(cls, base, directories, files):
-        candidates = []
-        for fn, fnl in ((_, _.lower()) for _ in files):
-            if fnl.endswith(".fits") or \
-               fnl.endswith(".fits.gz") or \
-               fnl.endswith(".fits.fz"):
-                candidates.append(fn)
-        # FITS files don't preclude subdirectories
-        return candidates, True
+            x = 0
+            for p in lon_prefixes + lat_prefixes + list(spec_names.keys()):
+                y = np_char.startswith(self.axis_names[:self.dimensionality], p)
+                x += np.any(y)
 
-    def close(self):
-        self._handle.close()
+            fileh.close()
 
-class SkyDataFITSDataset(FITSDataset):
-    _field_info_class = WCSFITSFieldInfo
 
-    def _parse_parameter_file(self):
-        super(SkyDataFITSDataset, self)._parse_parameter_file()
-        
 class SpectralCubeFITSDataset(SkyDataFITSDataset):
     def __init__(self, filename,
                  auxiliary_files=[],
@@ -664,6 +632,47 @@
         self.spectral_factor = spectral_factor
         self.z_axis_decomp = z_axis_decomp
 
+    def _parse_parameter_file(self):
+        super(SpectralCubeFITSDataset, self)._parse_parameter_file()
+
+        self.geometry = "spectral_cube"
+
+        end = min(self.dimensionality+1,4)
+        ctypes = np.array([self.primary_header["CTYPE%d" % (i)] for i in range(1,end)])
+
+        self.spec_axis = np.zeros((end-1), dtype="bool")
+        for p in spec_names.keys():
+            self.spec_axis += np_char.startswith(ctypes, p)
+        self.spec_axis = np.where(self.spec_axis)[0][0]
+        self.spec_name = spec_names[ctypes[self.spec_axis].split("-")[0][0]]
+
+        self.wcs_2d = _astropy.pywcs.WCS(naxis=2)
+        self.wcs_2d.wcs.crpix = self.wcs.wcs.crpix[[self.lon_axis, self.lat_axis]]
+        self.wcs_2d.wcs.cdelt = self.wcs.wcs.cdelt[[self.lon_axis, self.lat_axis]]
+        self.wcs_2d.wcs.crval = self.wcs.wcs.crval[[self.lon_axis, self.lat_axis]]
+        self.wcs_2d.wcs.cunit = [str(self.wcs.wcs.cunit[self.lon_axis]),
+                                 str(self.wcs.wcs.cunit[self.lat_axis])]
+        self.wcs_2d.wcs.ctype = [self.wcs.wcs.ctype[self.lon_axis],
+                                 self.wcs.wcs.ctype[self.lat_axis]]
+
+        self._p0 = self.wcs.wcs.crpix[self.spec_axis]
+        self._dz = self.wcs.wcs.cdelt[self.spec_axis]
+        self._z0 = self.wcs.wcs.crval[self.spec_axis]
+        self.spec_unit = str(self.wcs.wcs.cunit[self.spec_axis])
+
+        if self.spectral_factor == "auto":
+            self.spectral_factor = float(max(self.domain_dimensions[[self.lon_axis,
+                                                                     self.lat_axis]]))
+            self.spectral_factor /= self.domain_dimensions[self.spec_axis]
+            mylog.info("Setting the spectral factor to %f" % (self.spectral_factor))
+        Dz = self.domain_right_edge[self.spec_axis]-self.domain_left_edge[self.spec_axis]
+        dre = self.domain_right_edge
+        dre[self.spec_axis] = (self.domain_left_edge[self.spec_axis] +
+                               self.spectral_factor*Dz)
+        self.domain_right_edge = dre
+        self._dz /= self.spectral_factor
+        self._p0 = (self._p0-0.5)*self.spectral_factor + 0.5
+
     def _determine_nprocs(self):
         # If nprocs is None, do some automatic decomposition of the domain
         if self.specified_parameters["nprocs"] is None:
@@ -702,6 +711,7 @@
                  nprocs=None,
                  storage_filename=None,
                  suppress_astropy_warnings=True,
+                 reblock=1,
                  parameters=None,
                  units_override=None,
                  unit_system="cgs"):
@@ -709,6 +719,7 @@
             storage_filename=storage_filename, parameters=parameters,
             suppress_astropy_warnings=suppress_astropy_warnings,
             units_override=units_override, unit_system=unit_system)
+        self.reblock = reblock
 
     def _determine_structure(self):
         self.first_image = 1
@@ -734,8 +745,10 @@
                     if unit.endswith("ev"): unit = unit.replace("ev", "eV")
                     self.events_info[v.lower()] = unit
         self.axis_names = [self.events_info[ax][2] for ax in ["x", "y"]]
-        self.reblock = 1
         if "reblock" in self.specified_parameters:
+            issue_deprecation_warning("'reblock' is now a keyword argument that "
+                                      "can be passed to 'yt.load'. This behavior "
+                                      "is deprecated.")
             self.reblock = self.specified_parameters["reblock"]
         self.wcs.wcs.cdelt = [self.events_info["x"][4] * self.reblock,
                               self.events_info["y"][4] * self.reblock]
@@ -746,3 +759,17 @@
         self.wcs.wcs.crval = [self.events_info["x"][3], self.events_info["y"][3]]
         self.dims = [(self.events_info["x"][1] - self.events_info["x"][0]) / self.reblock,
                      (self.events_info["y"][1] - self.events_info["y"][0]) / self.reblock]
+
+    @classmethod
+    def _is_valid(cls, *args, **kwargs):
+        if check_fits_valid(args):
+            try:
+                with warnings.catch_warnings():
+                    warnings.filterwarnings('ignore', category=UserWarning, append=True)
+                    fileh = _astropy.pyfits.open(args[0])
+                valid = fileh[1].name == "EVENTS"
+                fileh.close()
+                return valid
+            except:
+                pass
+        return False
\ No newline at end of file
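
The `reblock` change above follows a common migration pattern: promote a `parameters`-dict key to a real keyword argument while keeping the old spelling working behind a deprecation warning. A minimal, self-contained sketch of that pattern (the function name and message text are illustrative, not yt's actual API):

```python
import warnings

def load_events(reblock=1, parameters=None):
    """Sketch: 'reblock' is now a keyword argument; the old
    parameters-dict spelling still works but warns."""
    parameters = parameters or {}
    if "reblock" in parameters:
        warnings.warn("'reblock' is now a keyword argument that can be "
                      "passed directly; passing it via 'parameters' is "
                      "deprecated.", DeprecationWarning, stacklevel=2)
        reblock = parameters["reblock"]
    return reblock

# Exercise both spellings.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    old_style = load_events(parameters={"reblock": 2})
deprecated_seen = any(issubclass(w.category, DeprecationWarning)
                      for w in caught)
new_style = load_events(reblock=4)
```

The old spelling keeps working (returns 2) but is flagged, so downstream scripts get one release cycle to migrate.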

diff -r 6909b0d65d719a5d7914169165d56ccabcb524e7 -r 5524405379ad8d103e49abbcdaf2b3dce3d8650d yt/frontends/fits/fields.py
--- a/yt/frontends/fits/fields.py
+++ b/yt/frontends/fits/fields.py
@@ -25,12 +25,17 @@
 class WCSFITSFieldInfo(FITSFieldInfo):
 
     def setup_fluid_fields(self):
+        if hasattr(self.ds, "wcs_2d"):
+            wcs_2d = self.ds.wcs_2d
+        else:
+            wcs_2d = self.ds.wcs
+
         def _pixel(field, data):
             return data.ds.arr(data["ones"], "pixel")
         self.add_field(("fits", "pixel"), sampling_type="cell", function=_pixel, units="pixel")
 
         def _get_2d_wcs(data, axis):
-            w_coords = data.ds.wcs_2d.wcs_pix2world(data["x"], data["y"], 1)
+            w_coords = wcs_2d.wcs_pix2world(data["x"], data["y"], 1)
             return w_coords[axis]
 
         def world_f(axis, unit):
@@ -40,7 +45,7 @@
 
         for (i, axis), name in zip(enumerate([self.ds.lon_axis, self.ds.lat_axis]),
                                    [self.ds.lon_name, self.ds.lat_name]):
-            unit = str(self.ds.wcs_2d.wcs.cunit[i])
+            unit = str(wcs_2d.wcs.cunit[i])
             if unit.lower() == "deg":
                 unit = "degree"
             if unit.lower() == "rad":


https://bitbucket.org/yt_analysis/yt/commits/9603b64a0101/
Changeset:   9603b64a0101
User:        jzuhone
Date:        2017-08-13 18:21:54+00:00
Summary:     FITS refactoring
Affected #:  2 files

diff -r 5524405379ad8d103e49abbcdaf2b3dce3d8650d -r 9603b64a01011b03fe353b2b5796a2b355fda7ba yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -412,16 +412,6 @@
         else:
             beam_size = 1.0
         self.unit_registry.add("beam", beam_size, dimensions=dimensions.solid_angle)
-        if self.spec_cube:
-            units = self.wcs_2d.wcs.cunit[0]
-            if units == "deg":
-                units = "degree"
-            if units == "rad":
-                units = "radian"
-            pixel_area = np.prod(np.abs(self.wcs_2d.wcs.cdelt))
-            pixel_area = self.quan(pixel_area, "%s**2" % (units)).in_cgs()
-            pixel_dims = pixel_area.units.dimensions
-            self.unit_registry.add("pixel",float(pixel_area.value),dimensions=pixel_dims)
 
     def _parse_parameter_file(self):
 
@@ -591,6 +581,19 @@
             self.lat_name = "Y"
             self.lon_name = "X"
 
+    def _set_code_unit_attributes(self):
+        super(SkyDataFITSDataset, self)._set_code_unit_attributes()
+        wcs_2d = getattr(self, "wcs_2d", self.wcs)
+        units = wcs_2d.wcs.cunit[0]
+        if units == "deg":
+            units = "degree"
+        if units == "rad":
+            units = "radian"
+        pixel_area = np.prod(np.abs(wcs_2d.wcs.cdelt))
+        pixel_area = self.quan(pixel_area, "%s**2" % (units)).in_cgs()
+        pixel_dims = pixel_area.units.dimensions
+        self.unit_registry.add("pixel", float(pixel_area.value), dimensions=pixel_dims)
+
     @classmethod
     def _is_valid(cls, *args, **kwargs):
         if check_fits_valid(args):

diff -r 5524405379ad8d103e49abbcdaf2b3dce3d8650d -r 9603b64a01011b03fe353b2b5796a2b355fda7ba yt/frontends/fits/fields.py
--- a/yt/frontends/fits/fields.py
+++ b/yt/frontends/fits/fields.py
@@ -25,10 +25,7 @@
 class WCSFITSFieldInfo(FITSFieldInfo):
 
     def setup_fluid_fields(self):
-        if hasattr(self.ds, "wcs_2d"):
-            wcs_2d = self.ds.wcs_2d
-        else:
-            wcs_2d = self.ds.wcs
+        wcs_2d = getattr(self.ds, "wcs_2d", self.ds.wcs)
 
         def _pixel(field, data):
             return data.ds.arr(data["ones"], "pixel")
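
The refactoring above replaces an explicit `hasattr`/`else` branch with the `getattr`-with-default idiom; both forms pick `wcs_2d` when the dataset defines it and fall back to `wcs` otherwise. A toy illustration (the classes are stand-ins, not yt types):

```python
class PlainDataset:
    wcs = "full-wcs"          # every dataset has a WCS

class SkyDataset(PlainDataset):
    wcs_2d = "celestial-wcs"  # only some datasets define a 2D WCS

def pick_wcs(ds):
    # Equivalent to: ds.wcs_2d if hasattr(ds, "wcs_2d") else ds.wcs
    return getattr(ds, "wcs_2d", ds.wcs)

plain_choice = pick_wcs(PlainDataset())
sky_choice = pick_wcs(SkyDataset())
```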


https://bitbucket.org/yt_analysis/yt/commits/aeceef20a267/
Changeset:   aeceef20a267
User:        jzuhone
Date:        2017-08-14 15:01:24+00:00
Summary:     Do different domain decomps for spectral cubes vs everything else
Affected #:  1 file

diff -r 9603b64a01011b03fe353b2b5796a2b355fda7ba -r aeceef20a26718c50daa36912989b87b724cf814 yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -202,7 +202,7 @@
                         self._axis_map[fname] = k
                         self._file_map[fname] = fits_file
                         self._ext_map[fname] = j
-                        self._scale_map[fname] = [0.0,1.0]
+                        self._scale_map[fname] = [0.0, 1.0]
                         if "bzero" in hdu.header:
                             self._scale_map[fname][0] = hdu.header["bzero"]
                         if "bscale" in hdu.header:
@@ -226,27 +226,7 @@
 
         # If nprocs > 1, decompose the domain into virtual grids
         if self.num_grids > 1:
-            if self.ds.z_axis_decomp:
-                dz = ds.quan(1.0, "code_length")*ds.spectral_factor
-                self.grid_dimensions[:,2] = np.around(float(ds.domain_dimensions[2])/
-                                                            self.num_grids).astype("int")
-                self.grid_dimensions[-1,2] += (ds.domain_dimensions[2] % self.num_grids)
-                self.grid_left_edge[0,2] = ds.domain_left_edge[2]
-                self.grid_left_edge[1:,2] = ds.domain_left_edge[2] + \
-                                            np.cumsum(self.grid_dimensions[:-1,2])*dz
-                self.grid_right_edge[:,2] = self.grid_left_edge[:,2]+self.grid_dimensions[:,2]*dz
-                self.grid_left_edge[:,:2] = ds.domain_left_edge[:2]
-                self.grid_right_edge[:,:2] = ds.domain_right_edge[:2]
-                self.grid_dimensions[:,:2] = ds.domain_dimensions[:2]
-            else:
-                bbox = np.array([[le,re] for le, re in zip(ds.domain_left_edge,
-                                                           ds.domain_right_edge)])
-                dims = np.array(ds.domain_dimensions)
-                psize = get_psize(dims, self.num_grids)
-                gle, gre, shapes, slices = decompose_array(dims, psize, bbox)
-                self.grid_left_edge = self.ds.arr(gle, "code_length")
-                self.grid_right_edge = self.ds.arr(gre, "code_length")
-                self.grid_dimensions = np.array([shape for shape in shapes], dtype="int32")
+            self._domain_decomp()
         else:
             self.grid_left_edge[0,:] = ds.domain_left_edge
             self.grid_right_edge[0,:] = ds.domain_right_edge
@@ -257,6 +237,16 @@
         for i in range(self.num_grids):
             self.grids[i] = self.grid(i, self, self.grid_levels[i,0])
 
+    def _domain_decomp(self):
+        bbox = np.array([[le, re] for le, re in zip(self.ds.domain_left_edge,
+                                                    self.ds.domain_right_edge)])
+        dims = np.array(self.ds.domain_dimensions)
+        psize = get_psize(dims, self.num_grids)
+        gle, gre, shapes, slices = decompose_array(dims, psize, bbox)
+        self.grid_left_edge = self.ds.arr(gle, "code_length")
+        self.grid_right_edge = self.ds.arr(gre, "code_length")
+        self.grid_dimensions = np.array([shape for shape in shapes], dtype="int32")
+
     def _populate_grid_objects(self):
         for i in range(self.num_grids):
             self.grids[i]._prepare_grid()
@@ -610,8 +600,20 @@
             for p in lon_prefixes + lat_prefixes + list(spec_names.keys()):
                 y = np_char.startswith(self.axis_names[:self.dimensionality], p)
                 x += np.any(y)
+class SpectralCubeFITSHierarchy(FITSHierarchy):
 
-            fileh.close()
+    def _domain_decomp(self):
+        dz = self.ds.quan(1.0, "code_length") * self.ds.spectral_factor
+        self.grid_dimensions[:, 2] = np.around(float(self.ds.domain_dimensions[2]) /
+                                               self.num_grids).astype("int")
+        self.grid_dimensions[-1, 2] += (self.ds.domain_dimensions[2] % self.num_grids)
+        self.grid_left_edge[0, 2] = self.ds.domain_left_edge[2]
+        self.grid_left_edge[1:, 2] = self.ds.domain_left_edge[2] + \
+                                     np.cumsum(self.grid_dimensions[:-1, 2]) * dz
+        self.grid_right_edge[:, 2] = self.grid_left_edge[:, 2] + self.grid_dimensions[:, 2] * dz
+        self.grid_left_edge[:, :2] = self.ds.domain_left_edge[:2]
+        self.grid_right_edge[:, :2] = self.ds.domain_right_edge[:2]
+        self.grid_dimensions[:, :2] = self.ds.domain_dimensions[:2]
 
 
 class SpectralCubeFITSDataset(SkyDataFITSDataset):
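
The decomposition split above is a textbook template-method refactor: the base hierarchy now calls `self._domain_decomp()`, and `SpectralCubeFITSHierarchy` overrides it to slice only along the spectral axis. A simplified sketch of the dispatch (class names and return values are illustrative, not yt's real grid machinery):

```python
class Hierarchy:
    def parse_index(self, num_grids):
        # Base class decides *whether* to decompose; subclasses decide *how*.
        if num_grids > 1:
            return self._domain_decomp(num_grids)
        return ["whole-domain"]

    def _domain_decomp(self, num_grids):
        # Default: decompose the full 3D domain into blocks.
        return ["block-%d" % i for i in range(num_grids)]

class SpectralCubeHierarchy(Hierarchy):
    def _domain_decomp(self, num_grids):
        # Spectral cubes are sliced only along the spectral (z) axis.
        return ["z-slab-%d" % i for i in range(num_grids)]

generic = Hierarchy().parse_index(3)
cube = SpectralCubeHierarchy().parse_index(3)
single = Hierarchy().parse_index(1)
```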


https://bitbucket.org/yt_analysis/yt/commits/51c3bd902fe9/
Changeset:   51c3bd902fe9
User:        jzuhone
Date:        2017-08-14 15:01:49+00:00
Summary:     Handle different cases of _is_valid, but do so without code duplication
Affected #:  1 file

diff -r aeceef20a26718c50daa36912989b87b724cf814 -r 51c3bd902fe9f6c02be7081e90729cd4fddae59f yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -52,7 +52,6 @@
 from yt.utilities.on_demand_imports import \
     _astropy, NotAModule
 
-
 lon_prefixes = ["X","RA","GLON","LINEAR"]
 lat_prefixes = ["Y","DEC","GLAT","LINEAR"]
 delimiters = ["*", "/", "-", "^", "(", ")"]
@@ -64,6 +63,10 @@
               "E":"Energy",
               "W":"Wavelength"}
 
+space_prefixes = list(set(lon_prefixes + lat_prefixes))
+sky_prefixes = list(set(space_prefixes).difference_update({"X", "Y", "LINEAR"}))
+spec_prefixes = list(spec_names.keys())
+
 field_from_unit = {"Jy":"intensity",
                    "K":"temperature"}
 
@@ -282,27 +285,36 @@
             yield YTDataChunk(dobj, "io", gs, self._count_selection(dobj, gs),
                               cache = cache)
 
+def find_primary_header(fileh):
+    # Sometimes the primary hdu doesn't have an image
+    if len(fileh) > 1 and fileh[0].header["naxis"] == 0:
+        first_image = 1
+    else:
+        first_image = 0
+    header = fileh[first_image].header
+    return header, fileh
+
 def check_fits_valid(args):
     ext = args[0].rsplit(".", 1)[-1]
     if ext.upper() in ("GZ", "FZ"):
         # We don't know for sure that there will be > 1
         ext = args[0].rsplit(".", 1)[0].rsplit(".", 1)[-1]
     if ext.upper() not in ("FITS", "FTS"):
-        return False
+        return None
     elif isinstance(_astropy.pyfits, NotAModule):
         raise RuntimeError("This appears to be a FITS file, but AstroPy is not installed.")
     try:
         with warnings.catch_warnings():
             warnings.filterwarnings('ignore', category=UserWarning, append=True)
             fileh = _astropy.pyfits.open(args[0])
-        valid = fileh[0].header["naxis"] >= 2
-        if len(fileh) > 1:
-            valid = fileh[1].header["naxis"] >= 2 or valid
-        fileh.close()
-        return valid
+        header, _ = find_primary_header(fileh)
+        if header["naxis"] >= 2:
+            return fileh
+        else:
+            fileh.close()
     except:
         pass
-    return False
+    return None
 
 class FITSDataset(Dataset):
     _index_class = FITSHierarchy
@@ -487,12 +499,7 @@
             self.parameters["nprocs"] = self.specified_parameters["nprocs"]
 
     def _determine_structure(self):
-        # Sometimes the primary hdu doesn't have an image
-        if len(self._handle) > 1 and self._handle[0].header["naxis"] == 0:
-            self.first_image = 1
-        else:
-            self.first_image = 0
-        self.primary_header = self._handle[self.first_image].header
+        self.primary_header, self.first_image = find_primary_header(self._handle)
         self.naxis = self.primary_header["naxis"]
         self.axis_names = [self.primary_header.get("ctype%d" % (i+1),"LINEAR")
                            for i in range(self.naxis)]
@@ -519,7 +526,12 @@
 
     @classmethod
     def _is_valid(cls, *args, **kwargs):
-        return check_fits_valid(args)
+        fileh = check_fits_valid(args)
+        if fileh is None:
+            return False
+        else:
+            fileh.close()
+            return True
 
     @classmethod
     def _guess_candidates(cls, base, directories, files):
@@ -535,6 +547,13 @@
     def close(self):
         self._handle.close()
 
+def find_axes(axis_names, prefixes):
+    x = 0
+    for p in prefixes:
+        y = np_char.startswith(axis_names, p)
+        x += np.any(y)
+    return x
+
 class SkyDataFITSDataset(FITSDataset):
     _field_info_class = WCSFITSFieldInfo
 
@@ -586,20 +605,21 @@
 
     @classmethod
     def _is_valid(cls, *args, **kwargs):
-        if check_fits_valid(args):
-            with warnings.catch_warnings():
-                warnings.filterwarnings('ignore', category=UserWarning, append=True)
-                fileh = _astropy.pyfits.open(args[0])
+        fileh = check_fits_valid(args)
+        if fileh is not None:
             try:
-            self.primary_header = self._handle[self.first_image].header
-            self.naxis = self.primary_header["naxis"]
-            self.axis_names = [self.primary_header.get("ctype%d" % (i + 1), "LINEAR")
-                               for i in range(self.naxis)]
+                header, _ = find_primary_header(fileh)
+                if header["naxis"] > 2:
+                    return False
+                axis_names = [header.get("ctype%d" % (i+1), "")
+                              for i in range(header["naxis"])]
+                x = find_axes(axis_names, sky_prefixes)
+                fileh.close()
+                return x == 2
+            except:
+                pass
+        return False
 
-            x = 0
-            for p in lon_prefixes + lat_prefixes + list(spec_names.keys()):
-                y = np_char.startswith(self.axis_names[:self.dimensionality], p)
-                x += np.any(y)
 class SpectralCubeFITSHierarchy(FITSHierarchy):
 
     def _domain_decomp(self):
@@ -617,13 +637,13 @@
 
 
 class SpectralCubeFITSDataset(SkyDataFITSDataset):
+    _index_class = SpectralCubeFITSHierarchy
     def __init__(self, filename,
                  auxiliary_files=[],
                  nprocs=None,
                  storage_filename=None,
                  nan_mask=None,
                  spectral_factor=1.0,
-                 z_axis_decomp=False,
                  suppress_astropy_warnings=True,
                  parameters=None,
                  units_override=None,
@@ -635,7 +655,6 @@
             unit_system=unit_system)
 
         self.spectral_factor = spectral_factor
-        self.z_axis_decomp = z_axis_decomp
 
     def _parse_parameter_file(self):
         super(SpectralCubeFITSDataset, self)._parse_parameter_file()
@@ -699,6 +718,24 @@
         return self.arr((pv.v-self._p0)*self._dz+self._z0,
                         self.spec_unit)
 
+    @classmethod
+    def _is_valid(cls, *args, **kwargs):
+        fileh = check_fits_valid(args)
+        if fileh is not None:
+            try:
+                header, _ = find_primary_header(fileh)
+                naxis = min(header["naxis"], 3)
+                if naxis < 3:
+                    return False
+                axis_names = [header.get("ctype%d" % (i+1), "LINEAR")
+                              for i in range(naxis)]
+                x = find_axes(axis_names[:naxis], space_prefixes+spec_prefixes)
+                fileh.close()
+                return x == 3
+            except:
+                pass
+        return False
+
 class EventsFITSHierarchy(FITSHierarchy):
 
     def _parse_index(self):
@@ -767,11 +804,9 @@
 
     @classmethod
     def _is_valid(cls, *args, **kwargs):
-        if check_fits_valid(args):
+        fileh = check_fits_valid(args)
+        if fileh is not None:
             try:
-                with warnings.catch_warnings():
-                    warnings.filterwarnings('ignore', category=UserWarning, append=True)
-                    fileh = _astropy.pyfits.open(args[0])
                 valid = fileh[1].name == "EVENTS"
                 fileh.close()
                 return valid
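
The deduplication in this changeset hinges on changing `check_fits_valid` from returning a bool to returning either an open file handle or `None`, so each subclass's `_is_valid` reuses the shared open-and-check step and only adds its own test. A minimal sketch of that contract (the file format and names here are invented for illustration):

```python
def check_valid(path):
    """Return an open handle if `path` passes the cheap checks, else None.
    The caller owns the handle and must close it."""
    if not path.endswith(".dat"):
        return None
    try:
        return open(path, "rb")
    except OSError:
        return None

def is_valid(path):
    # Base-class check: the shared validator succeeding is enough.
    handle = check_valid(path)
    if handle is None:
        return False
    handle.close()
    return True

missing = is_valid("no-such-file.dat")  # open() fails -> None -> False
wrong_ext = is_valid("whatever.txt")    # suffix check fails -> False
```

A subclass would inspect the returned handle further (as the EVENTS check does) before closing it, instead of reopening the file.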


https://bitbucket.org/yt_analysis/yt/commits/5b030fe2d2bf/
Changeset:   5b030fe2d2bf
User:        jzuhone
Date:        2017-08-14 15:33:30+00:00
Summary:     Can't make a list from a None
Affected #:  1 file

diff -r 51c3bd902fe9f6c02be7081e90729cd4fddae59f -r 5b030fe2d2bf5ea467fd213d3a70fc70c7b94a17 yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -64,7 +64,9 @@
               "W":"Wavelength"}
 
 space_prefixes = list(set(lon_prefixes + lat_prefixes))
-sky_prefixes = list(set(space_prefixes).difference_update({"X", "Y", "LINEAR"}))
+sky_prefixes = set(space_prefixes)
+sky_prefixes.difference_update({"X", "Y", "LINEAR"})
+sky_prefixes = list(sky_prefixes)
 spec_prefixes = list(spec_names.keys())
 
 field_from_unit = {"Jy":"intensity",
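
The "Can't make a list from a None" fix is worth spelling out: Python's mutating set methods such as `difference_update` operate in place and return `None`, so the earlier one-liner `list(set(...).difference_update(...))` raises `TypeError`. A small demonstration with the same prefix data:

```python
space_prefixes = {"X", "RA", "GLON", "Y", "DEC", "GLAT", "LINEAR"}

# difference_update mutates the set in place and returns None...
mutated = space_prefixes.copy()
result = mutated.difference_update({"X", "Y", "LINEAR"})

# ...so the working version mutates first, then converts to a list.
sky = set(space_prefixes)
sky.difference_update({"X", "Y", "LINEAR"})
sky_prefixes = sorted(sky)
```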


https://bitbucket.org/yt_analysis/yt/commits/0015f88b88c5/
Changeset:   0015f88b88c5
User:        jzuhone
Date:        2017-08-14 15:33:46+00:00
Summary:     Bugfix
Affected #:  1 file

diff -r 5b030fe2d2bf5ea467fd213d3a70fc70c7b94a17 -r 0015f88b88c5121c45d01e6cd1349ab073422983 yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -615,8 +615,7 @@
                     return False
                 axis_names = [header.get("ctype%d" % (i+1), "")
                               for i in range(header["naxis"])]
-                x = find_axes(axis_names, sky_prefixes)
-                fileh.close()
+                x = find_axes(axis_names, sky_prefixes+spec_prefixes)
                 return x == 2
             except:
                 pass
@@ -732,7 +731,6 @@
                 axis_names = [header.get("ctype%d" % (i+1), "LINEAR")
                               for i in range(naxis)]
                 x = find_axes(axis_names[:naxis], space_prefixes+spec_prefixes)
-                fileh.close()
                 return x == 3
             except:
                 pass
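
This bugfix adjusts which prefix lists feed `find_axes`, the helper that counts how many prefixes match at least one axis name via numpy's vectorized string `startswith`. A runnable sketch with illustrative FITS `CTYPE` values:

```python
import numpy as np

def find_axes(axis_names, prefixes):
    # Count prefixes that match at least one axis name
    # (mirrors the helper in data_structures.py).
    count = 0
    for p in prefixes:
        count += np.any(np.char.startswith(axis_names, p))
    return int(count)

sky_prefixes = ["RA", "DEC", "GLON", "GLAT"]
spec_prefixes = ["V", "F", "E", "W"]

# A 2D sky image: both celestial prefixes match.
n_sky = find_axes(["RA---TAN", "DEC--TAN"], sky_prefixes + spec_prefixes)

# A 3D spectral cube: two celestial axes plus one spectral ("V") axis.
n_cube = find_axes(["RA---TAN", "DEC--TAN", "VELO-LSR"],
                   sky_prefixes + spec_prefixes)
```

Including the spectral prefixes in the 2D check (as this commit does) is what lets a sky image be accepted while a 3-axis cube falls through to `SpectralCubeFITSDataset`.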


https://bitbucket.org/yt_analysis/yt/commits/bbf2fe8816d1/
Changeset:   bbf2fe8816d1
User:        jzuhone
Date:        2017-08-14 15:39:22+00:00
Summary:     Bugfix, minor formatting nits
Affected #:  1 file

diff -r 0015f88b88c5121c45d01e6cd1349ab073422983 -r bbf2fe8816d1b001c51fc898bf81343a291b245d yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -58,10 +58,10 @@
 delimiters += [str(i) for i in range(10)]
 regex_pattern = '|'.join(re.escape(_) for _ in delimiters)
 
-spec_names = {"V":"Velocity",
-              "F":"Frequency",
-              "E":"Energy",
-              "W":"Wavelength"}
+spec_names = {"V": "Velocity",
+              "F": "Frequency",
+              "E": "Energy",
+              "W": "Wavelength"}
 
 space_prefixes = list(set(lon_prefixes + lat_prefixes))
 sky_prefixes = set(space_prefixes)
@@ -75,8 +75,8 @@
 class FITSGrid(AMRGridPatch):
     _id_offset = 0
     def __init__(self, id, index, level):
-        AMRGridPatch.__init__(self, id, filename = index.index_filename,
-                              index = index)
+        AMRGridPatch.__init__(self, id, filename=index.index_filename,
+                              index=index)
         self.Parent = None
         self.Children = []
         self.Level = 0
@@ -294,7 +294,7 @@
     else:
         first_image = 0
     header = fileh[first_image].header
-    return header, fileh
+    return header, first_image
 
 def check_fits_valid(args):
     ext = args[0].rsplit(".", 1)[-1]
@@ -469,23 +469,6 @@
 
         self._determine_nprocs()
 
-        # Check to see if this data is in some kind of (Lat,Lon,Vel) format
-        """
-        x = 0
-        for p in lon_prefixes+lat_prefixes+list(spec_names.keys()):
-            y = np_char.startswith(self.axis_names[:self.dimensionality], p)
-            x += np.any(y)
-        if x == self.dimensionality:
-            if self.axis_names == ['LINEAR','LINEAR']:
-                self.wcs_2d = self.wcs
-                self.lat_axis = 1
-                self.lon_axis = 0
-                self.lat_name = "Y"
-                self.lon_name = "X"
-            else:
-                self._setup_spec_cube()
-        """
-
         # Now we can set up some of our parameters for convenience.
         for k, v in self.primary_header.items():
             self.parameters[k] = v


https://bitbucket.org/yt_analysis/yt/commits/129691aaa77d/
Changeset:   129691aaa77d
User:        jzuhone
Date:        2017-10-03 02:26:50+00:00
Summary:     Specialize _detect_output_fields for events fits datasets
Affected #:  1 file

diff -r bbf2fe8816d1b001c51fc898bf81343a291b245d -r 129691aaa77df1d24dc62c0457c6e163a9cbd99e yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -145,17 +145,6 @@
     def _detect_output_fields(self):
         ds = self.dataset
         self.field_list = []
-        if ds.events_data:
-            for k,v in ds.events_info.items():
-                fname = "event_"+k
-                mylog.info("Adding field %s to the list of fields." % (fname))
-                self.field_list.append(("io",fname))
-                if k in ["x","y"]:
-                    field_unit = "code_length"
-                else:
-                    field_unit = v
-                self.dataset.field_units[("io",fname)] = field_unit
-            return
         self._axis_map = {}
         self._file_map = {}
         self._ext_map = {}
@@ -721,6 +710,20 @@
 
 class EventsFITSHierarchy(FITSHierarchy):
 
+    def _detect_output_fields(self):
+        ds = self.dataset
+        self.field_list = []
+        for k,v in ds.events_info.items():
+            fname = "event_"+k
+            mylog.info("Adding field %s to the list of fields." % (fname))
+            self.field_list.append(("io",fname))
+            if k in ["x","y"]:
+                field_unit = "code_length"
+            else:
+                field_unit = v
+            self.dataset.field_units[("io",fname)] = field_unit
+        return
+
     def _parse_index(self):
         super(EventsFITSHierarchy, self)._parse_index()
         try:


https://bitbucket.org/yt_analysis/yt/commits/70cf198dd662/
Changeset:   70cf198dd662
User:        jzuhone
Date:        2017-10-03 02:28:12+00:00
Summary:     Fix miscellaneous bugs and make sure that we don't have multiple dataset types validating the same file
Affected #:  1 file

diff -r 129691aaa77df1d24dc62c0457c6e163a9cbd99e -r 70cf198dd662b8ee29c23cc2bd8734d3118781ea yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -409,7 +409,6 @@
     def _parse_parameter_file(self):
 
         self._determine_structure()
-        self._determine_wcs()
         self._determine_axes()
 
         if self.parameter_filename.startswith("InMemory"):
@@ -428,6 +427,8 @@
         if self.dimensionality == 4: 
             self.dimensionality = 3
 
+        self._determine_wcs()
+
         self.domain_dimensions = np.array(self.dims)[:self.dimensionality]
         if self.dimensionality == 2:
             self.domain_dimensions = np.append(self.domain_dimensions,
@@ -504,8 +505,10 @@
         if fileh is None:
             return False
         else:
+            # Check for FITS event files and reject them
+            valid = len(fileh) > 1 and fileh[1].name == "EVENTS"
             fileh.close()
-            return True
+            return valid
 
     @classmethod
     def _guess_candidates(cls, base, directories, files):
@@ -531,32 +534,33 @@
 class SkyDataFITSDataset(FITSDataset):
     _field_info_class = WCSFITSFieldInfo
 
+    def _determine_wcs(self):
+        super(SkyDataFITSDataset, self)._determine_wcs()
+        end = min(self.dimensionality+1, 4)
+        self.ctypes = np.array([self.primary_header["CTYPE%d" % (i)]
+                                for i in range(1, end)])
+
     def _parse_parameter_file(self):
         super(SkyDataFITSDataset, self)._parse_parameter_file()
 
-        self.geometry = "sky_coord"
+        end = min(self.dimensionality + 1, 4)
 
-        end = min(self.dimensionality+1,4)
-        if self.events_data:
-            ctypes = self.axis_names
-        else:
-            ctypes = np.array([self.primary_header["CTYPE%d" % (i)]
-                               for i in range(1,end)])
+        self.geometry = "spectral_cube"
 
-        log_str = "Detected these axes: "+"%s "*len(ctypes)
-        mylog.info(log_str % tuple([ctype for ctype in ctypes]))
+        log_str = "Detected these axes: "+"%s "*len(self.ctypes)
+        mylog.info(log_str % tuple([ctype for ctype in self.ctypes]))
 
         self.lat_axis = np.zeros((end-1), dtype="bool")
         for p in lat_prefixes:
-            self.lat_axis += np_char.startswith(ctypes, p)
+            self.lat_axis += np_char.startswith(self.ctypes, p)
         self.lat_axis = np.where(self.lat_axis)[0][0]
-        self.lat_name = ctypes[self.lat_axis].split("-")[0].lower()
+        self.lat_name = self.ctypes[self.lat_axis].split("-")[0].lower()
 
         self.lon_axis = np.zeros((end-1), dtype="bool")
         for p in lon_prefixes:
-            self.lon_axis += np_char.startswith(ctypes, p)
+            self.lon_axis += np_char.startswith(self.ctypes, p)
         self.lon_axis = np.where(self.lon_axis)[0][0]
-        self.lon_name = ctypes[self.lon_axis].split("-")[0].lower()
+        self.lon_name = self.ctypes[self.lon_axis].split("-")[0].lower()
 
         if self.lat_axis == self.lon_axis and self.lat_name == self.lon_name:
             self.lat_axis = 1
@@ -564,6 +568,10 @@
             self.lat_name = "Y"
             self.lon_name = "X"
 
+        self.spec_axis = 2
+        self.spec_name = "z"
+        self.spec_unit = ""
+
     def _set_code_unit_attributes(self):
         super(SkyDataFITSDataset, self)._set_code_unit_attributes()
         wcs_2d = getattr(self, "wcs_2d", self.wcs)
@@ -621,27 +629,25 @@
                  parameters=None,
                  units_override=None,
                  unit_system="cgs"):
+        self.spectral_factor = spectral_factor
         super(SpectralCubeFITSDataset, self).__init__(filename, nprocs=nprocs,
             auxiliary_files=auxiliary_files, storage_filename=storage_filename,
             suppress_astropy_warnings=suppress_astropy_warnings, nan_mask=nan_mask,
             parameters=parameters, units_override=units_override,
             unit_system=unit_system)
 
-        self.spectral_factor = spectral_factor
-
     def _parse_parameter_file(self):
         super(SpectralCubeFITSDataset, self)._parse_parameter_file()
 
         self.geometry = "spectral_cube"
 
         end = min(self.dimensionality+1,4)
-        ctypes = np.array([self.primary_header["CTYPE%d" % (i)] for i in range(1,end)])
 
-        self.spec_axis = np.zeros((end-1), dtype="bool")
+        self.spec_axis = np.zeros(end-1, dtype="bool")
         for p in spec_names.keys():
-            self.spec_axis += np_char.startswith(ctypes, p)
+            self.spec_axis += np_char.startswith(self.ctypes, p)
         self.spec_axis = np.where(self.spec_axis)[0][0]
-        self.spec_name = spec_names[ctypes[self.spec_axis].split("-")[0][0]]
+        self.spec_name = spec_names[self.ctypes[self.spec_axis].split("-")[0][0]]
 
         self.wcs_2d = _astropy.pywcs.WCS(naxis=2)
         self.wcs_2d.wcs.crpix = self.wcs.wcs.crpix[[self.lon_axis, self.lat_axis]]
@@ -673,10 +679,7 @@
     def _determine_nprocs(self):
         # If nprocs is None, do some automatic decomposition of the domain
         if self.specified_parameters["nprocs"] is None:
-            if self.z_axis_decomp:
-                nprocs = np.around(self.domain_dimensions[2] / 8).astype("int")
-            else:
-                nprocs = np.around(np.prod(self.domain_dimensions) / 32 ** self.dimensionality).astype("int")
+            nprocs = np.around(self.domain_dimensions[2] / 8).astype("int")
             self.parameters["nprocs"] = max(min(nprocs, 512), 1)
         else:
             self.parameters["nprocs"] = self.specified_parameters["nprocs"]
@@ -736,18 +739,17 @@
 class EventsFITSDataset(SkyDataFITSDataset):
     _index_class = EventsFITSHierarchy
     def __init__(self, filename,
-                 nprocs=None,
                  storage_filename=None,
                  suppress_astropy_warnings=True,
                  reblock=1,
                  parameters=None,
                  units_override=None,
                  unit_system="cgs"):
-        super(EventsFITSDataset, self).__init__(filename, nprocs=nprocs,
+        self.reblock = reblock
+        super(EventsFITSDataset, self).__init__(filename, nprocs=1,
             storage_filename=storage_filename, parameters=parameters,
             suppress_astropy_warnings=suppress_astropy_warnings,
             units_override=units_override, unit_system=unit_system)
-        self.reblock = reblock
 
     def _determine_structure(self):
         self.first_image = 1
@@ -787,6 +789,7 @@
         self.wcs.wcs.crval = [self.events_info["x"][3], self.events_info["y"][3]]
         self.dims = [(self.events_info["x"][1] - self.events_info["x"][0]) / self.reblock,
                      (self.events_info["y"][1] - self.events_info["y"][0]) / self.reblock]
+        self.ctypes = self.axis_names
 
     @classmethod
     def _is_valid(cls, *args, **kwargs):


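The simplified `_determine_nprocs` in the changeset above always decomposes along the third (spectral) axis: roughly one grid per eight planes, clamped to [1, 512]. A minimal standalone sketch of that heuristic (function name and the example dimensions are illustrative):

```python
import numpy as np

def determine_nprocs(domain_dimensions, specified_nprocs=None):
    # Mirror of the simplified heuristic: about one grid per 8 planes
    # along the spectral (z) axis, clamped to the range [1, 512].
    if specified_nprocs is None:
        nprocs = int(np.around(domain_dimensions[2] / 8))
        return max(min(nprocs, 512), 1)
    return specified_nprocs

print(determine_nprocs([512, 512, 1024]))  # 1024 / 8 -> 128 grids
```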
https://bitbucket.org/yt_analysis/yt/commits/df36ecdb7896/
Changeset:   df36ecdb7896
User:        jzuhone
Date:        2017-10-03 02:47:11+00:00
Summary:     Simplify the is_valid's
Affected #:  1 file

diff -r 70cf198dd662b8ee29c23cc2bd8734d3118781ea -r df36ecdb7896a28a72fe3c94007c44f195b9686c yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -307,6 +307,21 @@
         pass
     return None
 
+def check_sky_coords(args, ndim):
+    fileh = check_fits_valid(args)
+    if fileh is not None:
+        try:
+            header, _ = find_primary_header(fileh)
+            if header["naxis"] > ndim:
+                return False
+            axis_names = [header.get("ctype%d" % (i + 1), "")
+                          for i in range(header["naxis"])]
+            x = find_axes(axis_names, sky_prefixes + spec_prefixes)
+            return x == ndim
+        except:
+            pass
+    return False
+
 class FITSDataset(Dataset):
     _index_class = FITSHierarchy
     _field_info_class = FITSFieldInfo
@@ -505,10 +520,8 @@
         if fileh is None:
             return False
         else:
-            # Check for FITS event files and reject them
-            valid = len(fileh) > 1 and fileh[1].name == "EVENTS"
             fileh.close()
-            return valid
+            return True
 
     @classmethod
     def _guess_candidates(cls, base, directories, files):
@@ -587,19 +600,7 @@
 
     @classmethod
     def _is_valid(cls, *args, **kwargs):
-        fileh = check_fits_valid(args)
-        if fileh is not None:
-            try:
-                header, _ = find_primary_header(fileh)
-                if header["naxis"] > 2:
-                    return False
-                axis_names = [header.get("ctype%d" % (i+1), "")
-                              for i in range(header["naxis"])]
-                x = find_axes(axis_names, sky_prefixes+spec_prefixes)
-                return x == 2
-            except:
-                pass
-        return False
+        return check_sky_coords(args, 2)
 
 class SpectralCubeFITSHierarchy(FITSHierarchy):
 
@@ -696,20 +697,7 @@
 
     @classmethod
     def _is_valid(cls, *args, **kwargs):
-        fileh = check_fits_valid(args)
-        if fileh is not None:
-            try:
-                header, _ = find_primary_header(fileh)
-                naxis = min(header["naxis"], 3)
-                if naxis < 3:
-                    return False
-                axis_names = [header.get("ctype%d" % (i+1), "LINEAR")
-                              for i in range(naxis)]
-                x = find_axes(axis_names[:naxis], space_prefixes+spec_prefixes)
-                return x == 3
-            except:
-                pass
-        return False
+        return check_sky_coords(args, 3)
 
 class EventsFITSHierarchy(FITSHierarchy):
 


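The consolidated `check_sky_coords` helper works by counting how many ``CTYPEx`` axis names start with a known sky or spectral prefix. A rough standalone version of that prefix-matching core (the prefix lists here are short illustrative stand-ins, not yt's full ``sky_prefixes``/``spec_prefixes``):

```python
import numpy as np

# Illustrative prefix lists; yt's actual sets are longer.
sky_prefixes = ["RA", "DEC", "GLON", "GLAT"]
spec_prefixes = ["FREQ", "VELO", "ENER", "WAV"]

def count_matching_axes(axis_names, prefixes):
    # An axis counts once if its CTYPE starts with any known prefix.
    matched = np.zeros(len(axis_names), dtype=bool)
    for p in prefixes:
        matched |= np.char.startswith(np.array(axis_names), p)
    return int(matched.sum())

ctypes = ["RA---TAN", "DEC--TAN", "VELO-LSR"]
print(count_matching_axes(ctypes, sky_prefixes + spec_prefixes))  # 3
```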
https://bitbucket.org/yt_analysis/yt/commits/caad976a31fd/
Changeset:   caad976a31fd
User:        jzuhone
Date:        2017-10-03 03:06:05+00:00
Summary:     Get all of the _is_valid's to work together, finally
Affected #:  1 file

diff -r df36ecdb7896a28a72fe3c94007c44f195b9686c -r caad976a31fd0ecb028c5f1813977552dd391b9a yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -311,13 +311,18 @@
     fileh = check_fits_valid(args)
     if fileh is not None:
         try:
-            header, _ = find_primary_header(fileh)
-            if header["naxis"] > ndim:
-                return False
-            axis_names = [header.get("ctype%d" % (i + 1), "")
-                          for i in range(header["naxis"])]
-            x = find_axes(axis_names, sky_prefixes + spec_prefixes)
-            return x == ndim
+            if fileh[1].name == "EVENTS" and ndim == 2:
+                fileh.close()
+                return True
+            else:
+                header, _ = find_primary_header(fileh)
+                if header["naxis"] < ndim:
+                    return False
+                axis_names = [header.get("ctype%d" % (i + 1), "")
+                              for i in range(header["naxis"])]
+                x = find_axes(axis_names, sky_prefixes + spec_prefixes)
+                fileh.close()
+                return x >= ndim
         except:
             pass
     return False


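The fix above makes the 2D sky-coordinate check accept EVENTS tables first and only then fall back to header inspection. A toy sketch of that precedence, with the FITS file modeled as a list of ``(name, header)`` pairs (purely illustrative; the real code walks an astropy HDU list and uses yt's full prefix sets):

```python
def check_sky_coords(hdus, ndim):
    # hdus: list of (name, header) tuples standing in for an HDU list.
    # An event table is valid 2D sky data regardless of image headers.
    if len(hdus) > 1 and hdus[1][0] == "EVENTS" and ndim == 2:
        return True
    header = hdus[0][1]
    if header["naxis"] < ndim:
        return False
    axes = [header.get("ctype%d" % (i + 1), "") for i in range(header["naxis"])]
    matched = sum(any(a.startswith(p) for p in ("RA", "DEC", "FREQ", "VELO"))
                  for a in axes)
    return matched >= ndim

events = [("PRIMARY", {"naxis": 0}), ("EVENTS", {})]
cube = [("PRIMARY", {"naxis": 3, "ctype1": "RA---TAN",
                     "ctype2": "DEC--TAN", "ctype3": "VELO-LSR"})]
print(check_sky_coords(events, 2), check_sky_coords(cube, 3))  # True True
```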
https://bitbucket.org/yt_analysis/yt/commits/1f423ce4da0e/
Changeset:   1f423ce4da0e
User:        jzuhone
Date:        2017-10-14 01:42:12+00:00
Summary:     Update doc formatting
Affected #:  1 file

diff -r caad976a31fd0ecb028c5f1813977552dd391b9a -r 1f423ce4da0e129303523254ff6a927e369a653b doc/source/examining/loading_data.rst
--- a/doc/source/examining/loading_data.rst
+++ b/doc/source/examining/loading_data.rst
@@ -684,8 +684,8 @@
 * fits.gz
 * fts.gz
 
-yt can read two kinds of FITS files: FITS image files and FITS binary table files containing
-positions, times, and energies of X-ray events.
+yt can currently read two kinds of FITS files: FITS image files and FITS 
+binary table files containing positions, times, and energies of X-ray events.
 
 Though a FITS image is composed of a single array in the FITS file,
 upon being loaded into yt it is automatically decomposed into grids:
@@ -760,14 +760,47 @@
 If your data is of the first case, yt will determine the length units based
 on the information in the header. If your data is of the second or third
 cases, no length units will be assigned, but the world coordinate information
-about the axes will be stored in separate fields. If your data is of the fourth
-type, the coordinates of the first three axes will be determined according to
-cases 1-3.
+about the axes will be stored in separate fields. If your data is of the 
+fourth type, the coordinates of the first three axes will be determined 
+according to cases 1-3.
 
 .. note::
 
-  Linear length-based coordinates (Case 1 above) are only supported if all dimensions
-  have the same value for ``CUNITx``. WCS coordinates are only supported for Cases 2-4.
+  Linear length-based coordinates (Case 1 above) are only supported if all 
+  dimensions have the same value for ``CUNITx``. WCS coordinates are only 
+  supported for Cases 2-4.
+
+FITS Data Decomposition
+^^^^^^^^^^^^^^^^^^^^^^^
+
+Though a FITS image is composed of a single array in the FITS file,
+upon being loaded into yt it is automatically decomposed into grids:
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("m33_hi.fits")
+   ds.print_stats()
+
+.. parsed-literal::
+
+   level  # grids         # cells     # cells^3
+   ----------------------------------------------
+     0        512        981940800        994
+   ----------------------------------------------
+              512        981940800
+
+For 3D spectral-cube data, the decomposition into grids will be done along the
+spectral axis since this will speed up many common operations for this 
+particular type of dataset. 
+
+yt will generate its own domain decomposition, but the number of grids can be
+set manually by passing the ``nprocs`` parameter to the ``load`` call:
+
+.. code-block:: python
+
+   ds = load("m33_hi.fits", nprocs=64)
+
 
 Fields in FITS Datasets
 ^^^^^^^^^^^^^^^^^^^^^^^
@@ -797,10 +830,11 @@
 will be generated from the pixel coordinates in the file using the WCS
 transformations provided by AstroPy.
 
-X-ray event data will be loaded as particle fields in yt, but a grid will be constructed from the
-WCS information in the FITS header. There is a helper function, ``setup_counts_fields``,
-which may be used to make deposited image fields from the event data for different energy bands
-(for an example see :ref:`xray_fits`).
+X-ray event data will be loaded as particle fields in yt, but a grid will be 
+constructed from the WCS information in the FITS header. There is a helper 
+function, ``setup_counts_fields``, which may be used to make deposited image 
+fields from the event data for different energy bands (for an example see 
+:ref:`xray_fits`).
 
 .. note::
 
@@ -814,8 +848,8 @@
 Additional Options
 ^^^^^^^^^^^^^^^^^^
 
-The following are additional options that may be passed to the ``load`` command when analyzing
-FITS data:
+The following are additional options that may be passed to the ``load`` command 
+when analyzing FITS data:
 
 ``nan_mask``
 """"""""""""
@@ -840,12 +874,6 @@
 files, many of which you may want to ignore. If you want to see these
 warnings, set ``suppress_astropy_warnings = False``.
 
-``z_axis_decomp``
-"""""""""""""""""
-
-For some applications, decomposing 3D FITS data into grids that span the x-y plane with short
-strides along the z-axis may result in a significant improvement in I/O speed. To enable this feature, set ``z_axis_decomp=True``.
-
 ``spectral_factor``
 """""""""""""""""""
 
@@ -860,8 +888,10 @@
 Miscellaneous Tools for Use with FITS Data
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-A number of tools have been prepared for use with FITS data that enhance yt's visualization and
-analysis capabilities for this particular type of data. These are included in the ``yt.frontends.fits.misc`` module, and can be imported like so:
+A number of tools have been prepared for use with FITS data that enhance yt's 
+visualization and analysis capabilities for this particular type of data. These 
+are included in the ``yt.frontends.fits.misc`` module, and can be imported like 
+so:
 
 .. code-block:: python
 
@@ -870,7 +900,8 @@
 ``setup_counts_fields``
 """""""""""""""""""""""
 
-This function can be used to create image fields from X-ray counts data in different energy bands:
+This function can be used to create image fields from X-ray counts data in 
+different energy bands:
 
 .. code-block:: python
 
@@ -880,13 +911,13 @@
 which would make two fields, ``"counts_0.1-2.0"`` and ``"counts_2.0-5.0"``,
 and add them to the field registry for the dataset ``ds``.
 
-
 ``ds9_region``
 """"""""""""""
 
-This function takes a `ds9 <http://ds9.si.edu/site/Home.html>`_ region and creates a "cut region"
-data container from it, that can be used to select the cells in the FITS dataset that fall within
-the region. To use this functionality, the `pyregion <https://github.com/astropy/pyregion/>`_
+This function takes a `ds9 <http://ds9.si.edu/site/Home.html>`_ region and 
+creates a "cut region" data container from it, which can be used to select 
+the cells in the FITS dataset that fall within the region. To use this 
+functionality, the `pyregion <https://github.com/astropy/pyregion/>`_
 package must be installed.
 
 .. code-block:: python
@@ -899,8 +930,9 @@
 ``PlotWindowWCS``
 """""""""""""""""
 
-This class takes a on-axis ``SlicePlot`` or ``ProjectionPlot`` of FITS data and adds celestial
-coordinates to the plot axes. To use it, a version of AstroPy >= 1.3 must be installed.
+This class takes an on-axis ``SlicePlot`` or ``ProjectionPlot`` of FITS 
+data and adds celestial coordinates to the plot axes. To use it, a 
+version of AstroPy >= 1.3 must be installed.
 
 .. code-block:: python
 
@@ -908,21 +940,23 @@
   wcs_slc.show() # for the IPython notebook
   wcs_slc.save()
 
-``WCSAxes`` is still in an experimental state, but as its functionality improves it will be
-utilized more here.
+``WCSAxes`` is still in an experimental state, but as its functionality 
+improves it will be utilized more here.
 
 ``create_spectral_slabs``
 """""""""""""""""""""""""
 
 .. note::
 
-  The following functionality requires the `spectral-cube <http://spectral-cube.readthedocs.org>`_
-  library to be installed.
-
-If you have a spectral intensity dataset of some sort, and would like to extract emission in
-particular slabs along the spectral axis of a certain width, ``create_spectral_slabs`` can be
-used to generate a dataset with these slabs as different fields. In this example, we use it
-to extract individual lines from an intensity cube:
+  The following functionality requires the 
+  `spectral-cube <http://spectral-cube.readthedocs.org>`_ library to be 
+  installed.
+
+If you have a spectral intensity dataset of some sort, and would like to
+extract emission in particular slabs along the spectral axis of a certain 
+width, ``create_spectral_slabs`` can be used to generate a dataset with 
+these slabs as different fields. In this example, we use it to extract 
+individual lines from an intensity cube:
 
 .. code-block:: python
 
@@ -934,11 +968,13 @@
                                     slab_centers, slab_width,
                                     nan_mask=0.0)
 
-All keyword arguments to `create_spectral_slabs` are passed on to `load` when creating the dataset
-(see :ref:`additional_fits_options` above). In the returned dataset, the different slabs will be
-different fields, with the field names taken from the keys in ``slab_centers``. The WCS coordinates
-on the spectral axis are reset so that the center of the domain along this axis is zero, and the
-left and right edges of the domain along this axis are :math:`\pm` ``0.5*slab_width``.
+All keyword arguments to `create_spectral_slabs` are passed on to `load` when 
+creating the dataset (see :ref:`additional_fits_options` above). In the 
+returned dataset, the different slabs will be different fields, with the field 
+names taken from the keys in ``slab_centers``. The WCS coordinates on the 
+spectral axis are reset so that the center of the domain along this axis is 
+zero, and the left and right edges of the domain along this axis are 
+:math:`\pm` ``0.5*slab_width``.
 
 Examples of Using FITS Data
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^


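The `_set_code_unit_attributes` change in this series computes the angular pixel area directly from the 2D WCS increments and normalizes FITS unit names to yt's spellings. The arithmetic itself is simple; a standalone sketch (function name and the 1.5 arcsec pixel scale are made up for illustration):

```python
import numpy as np

def pixel_area_in(cdelt, cunit):
    # Pixel solid angle is the product of the absolute per-axis WCS
    # increments, e.g. |CDELT1| * |CDELT2|; normalize FITS unit names
    # ("deg", "rad") to the spellings yt's unit registry expects.
    units = {"deg": "degree", "rad": "radian"}.get(cunit, cunit)
    return float(np.prod(np.abs(np.asarray(cdelt)))), "%s**2" % units

# A made-up 1.5 arcsec pixel scale, expressed in degrees:
area, unit = pixel_area_in([-1.5 / 3600.0, 1.5 / 3600.0], "deg")
print(unit)  # degree**2
```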
https://bitbucket.org/yt_analysis/yt/commits/5f89cd84185a/
Changeset:   5f89cd84185a
User:        jzuhone
Date:        2017-10-03 03:20:51+00:00
Summary:     Update API
Affected #:  1 file

diff -r 1f423ce4da0e129303523254ff6a927e369a653b -r 5f89cd84185ae841a101818560aab054d1685680 yt/frontends/fits/api.py
--- a/yt/frontends/fits/api.py
+++ b/yt/frontends/fits/api.py
@@ -13,7 +13,11 @@
 from .data_structures import \
       FITSGrid, \
       FITSHierarchy, \
-      FITSDataset
+      FITSDataset, \
+      SpectralCubeFITSHierarchy, \
+      SpectralCubeFITSDataset, \
+      SkyDataFITSDataset, \
+      EventsFITSDataset
 
 from .fields import \
       FITSFieldInfo


https://bitbucket.org/yt_analysis/yt/commits/19a2f9ecd4a9/
Changeset:   19a2f9ecd4a9
User:        jzuhone
Date:        2017-10-03 03:21:02+00:00
Summary:     Fix classes in tests
Affected #:  1 file

diff -r 5f89cd84185ae841a101818560aab054d1685680 -r 19a2f9ecd4a94b448a9416c4c3a1c907161a0332 yt/frontends/fits/tests/test_outputs.py
--- a/yt/frontends/fits/tests/test_outputs.py
+++ b/yt/frontends/fits/tests/test_outputs.py
@@ -21,14 +21,15 @@
     requires_ds, \
     small_patch_amr, \
     data_dir_load
-from ..data_structures import FITSDataset
+from ..data_structures import FITSDataset, \
+    SpectralCubeFITSDataset
 
 _fields_grs = ("temperature",)
 
 grs = "radio_fits/grs-50-cube.fits"
 @requires_ds(grs)
 def test_grs():
-    ds = data_dir_load(grs, cls=FITSDataset, kwargs={"nan_mask":0.0})
+    ds = data_dir_load(grs, cls=SpectralCubeFITSDataset, kwargs={"nan_mask":0.0})
     assert_equal(str(ds), "grs-50-cube.fits")
     for test in small_patch_amr(ds, _fields_grs, input_center="c", input_weight="ones"):
         test_grs.__name__ = test.description
@@ -51,4 +52,4 @@
 
 @requires_file(grs)
 def test_FITSDataset():
-    assert isinstance(data_dir_load(grs), FITSDataset)
+    assert isinstance(data_dir_load(grs), SpectralCubeFITSDataset)


https://bitbucket.org/yt_analysis/yt/commits/502c65f02225/
Changeset:   502c65f02225
User:        jzuhone
Date:        2017-10-03 12:26:25+00:00
Summary:     Fix flake8
Affected #:  1 file

diff -r 19a2f9ecd4a94b448a9416c4c3a1c907161a0332 -r 502c65f02225f56381acf8c1b3755df2f4f4f29c yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -143,7 +143,6 @@
             return True
 
     def _detect_output_fields(self):
-        ds = self.dataset
         self.field_list = []
         self._axis_map = {}
         self._file_map = {}


https://bitbucket.org/yt_analysis/yt/commits/deed9d6e61a6/
Changeset:   deed9d6e61a6
User:        jzuhone
Date:        2017-10-03 12:30:21+00:00
Summary:     Bugfix
Affected #:  1 file

diff -r 502c65f02225f56381acf8c1b3755df2f4f4f29c -r deed9d6e61a62a820d5a3fe821644b523dd619c9 yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -310,7 +310,8 @@
     fileh = check_fits_valid(args)
     if fileh is not None:
         try:
-            if fileh[1].name == "EVENTS" and ndim == 2:
+            if (len(fileh) > 1 and
+                fileh[1].name == "EVENTS" and ndim == 2):
                 fileh.close()
                 return True
             else:


https://bitbucket.org/yt_analysis/yt/commits/450edf1f829d/
Changeset:   450edf1f829d
User:        jzuhone
Date:        2017-10-03 13:06:49+00:00
Summary:     Bug fixes
Affected #:  1 file

diff -r deed9d6e61a62a820d5a3fe821644b523dd619c9 -r 450edf1f829d989a01612e5693f3ac318510b6c8 yt/frontends/fits/misc.py
--- a/yt/frontends/fits/misc.py
+++ b/yt/frontends/fits/misc.py
@@ -162,16 +162,17 @@
     >>> print circle_region.quantities.extrema("flux")
     """
     import pyregion
+    from yt.frontends.fits.api import EventsFITSDataset
     if os.path.exists(reg):
         r = pyregion.open(reg)
     else:
         r = pyregion.parse(reg)
     reg_name = reg
-    filter = r.get_filter(header=ds.wcs_2d.to_header())
+    filter = r.get_filter(header=ds.wcs.to_header())
     nx = ds.domain_dimensions[ds.lon_axis]
     ny = ds.domain_dimensions[ds.lat_axis]
-    mask = filter.mask((ny,nx)).transpose()
-    if ds.events_data:
+    mask = filter.mask((ny, nx)).transpose()
+    if isinstance(ds, EventsFITSDataset):
         prefix = "event_"
     else:
         prefix = ""


https://bitbucket.org/yt_analysis/yt/commits/dba413ae504c/
Changeset:   dba413ae504c
User:        jzuhone
Date:        2017-10-03 13:34:44+00:00
Summary:     Update radio docs
Affected #:  1 file

diff -r 450edf1f829d989a01612e5693f3ac318510b6c8 -r dba413ae504c659a1f81ed1c385c989712e1d432 doc/source/cookbook/fits_radio_cubes.ipynb
--- a/doc/source/cookbook/fits_radio_cubes.ipynb
+++ b/doc/source/cookbook/fits_radio_cubes.ipynb
@@ -43,7 +43,7 @@
    },
    "outputs": [],
    "source": [
-    "ds = yt.load(\"radio_fits/m33_hi.fits\", nan_mask=0.0, z_axis_decomp=True)"
+    "ds = yt.load(\"radio_fits/m33_hi.fits\", nan_mask=0.0)"
    ]
   },
   {
@@ -185,7 +185,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "We can also make a projection of all the emission along the line of sight. Since we're not doing an integration along a path length, we needed to specify `method = \"sum\"`:"
+    "We can also make a projection of all the emission along the line of sight:"
    ]
   },
   {


https://bitbucket.org/yt_analysis/yt/commits/a3e16aecc3c4/
Changeset:   a3e16aecc3c4
User:        jzuhone
Date:        2017-10-03 13:35:01+00:00
Summary:     Make sure we have a wcs_2d attribute
Affected #:  2 files

diff -r dba413ae504c659a1f81ed1c385c989712e1d432 -r a3e16aecc3c4f13f06e23ee6d5c5e91652bc1672 yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -557,6 +557,7 @@
         end = min(self.dimensionality+1, 4)
         self.ctypes = np.array([self.primary_header["CTYPE%d" % (i)]
                                 for i in range(1, end)])
+        self.wcs_2d = self.wcs
 
     def _parse_parameter_file(self):
         super(SkyDataFITSDataset, self)._parse_parameter_file()
@@ -592,13 +593,12 @@
 
     def _set_code_unit_attributes(self):
         super(SkyDataFITSDataset, self)._set_code_unit_attributes()
-        wcs_2d = getattr(self, "wcs_2d", self.wcs)
-        units = wcs_2d.wcs.cunit[0]
+        units = self.wcs_2d.wcs.cunit[0]
         if units == "deg":
             units = "degree"
         if units == "rad":
             units = "radian"
-        pixel_area = np.prod(np.abs(wcs_2d.wcs.cdelt))
+        pixel_area = np.prod(np.abs(self.wcs_2d.wcs.cdelt))
         pixel_area = self.quan(pixel_area, "%s**2" % (units)).in_cgs()
         pixel_dims = pixel_area.units.dimensions
         self.unit_registry.add("pixel", float(pixel_area.value), dimensions=pixel_dims)
@@ -783,6 +783,7 @@
         self.dims = [(self.events_info["x"][1] - self.events_info["x"][0]) / self.reblock,
                      (self.events_info["y"][1] - self.events_info["y"][0]) / self.reblock]
         self.ctypes = self.axis_names
+        self.wcs_2d = self.wcs
 
     @classmethod
     def _is_valid(cls, *args, **kwargs):

diff -r dba413ae504c659a1f81ed1c385c989712e1d432 -r a3e16aecc3c4f13f06e23ee6d5c5e91652bc1672 yt/frontends/fits/misc.py
--- a/yt/frontends/fits/misc.py
+++ b/yt/frontends/fits/misc.py
@@ -168,7 +168,7 @@
     else:
         r = pyregion.parse(reg)
     reg_name = reg
-    filter = r.get_filter(header=ds.wcs.to_header())
+    filter = r.get_filter(header=ds.wcs_2d.to_header())
     nx = ds.domain_dimensions[ds.lon_axis]
     ny = ds.domain_dimensions[ds.lat_axis]
     mask = filter.mask((ny, nx)).transpose()


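The `ds9_region` fix above builds its boolean selection mask in image order, ``(ny, nx)``, and transposes it so the first index is x, matching yt's convention. A small numpy sketch of that index-order convention (the circular region is an arbitrary example standing in for pyregion's ``filter.mask``):

```python
import numpy as np

ny, nx = 4, 6
yy, xx = np.indices((ny, nx))      # image order: first axis is y (rows)
mask_img = (xx - 2) ** 2 + (yy - 1) ** 2 <= 1  # arbitrary circular region
mask_xy = mask_img.transpose()     # yt convention: first index is x

print(mask_img.shape, mask_xy.shape)  # (4, 6) (6, 4)
```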
https://bitbucket.org/yt_analysis/yt/commits/b3331c459b70/
Changeset:   b3331c459b70
User:        jzuhone
Date:        2017-10-03 13:41:09+00:00
Summary:     Update FITS tests
Affected #:  2 files

diff -r a3e16aecc3c4f13f06e23ee6d5c5e91652bc1672 -r b3331c459b7001f94117bf5f40e2c0a1b65a9f91 tests/tests.yaml
--- a/tests/tests.yaml
+++ b/tests/tests.yaml
@@ -17,7 +17,7 @@
   local_enzo_p_001:
     - yt/frontends/enzo_p/tests/test_outputs.py
 
-  local_fits_001:
+  local_fits_002:
     - yt/frontends/fits/tests/test_outputs.py
 
   local_flash_008:

diff -r a3e16aecc3c4f13f06e23ee6d5c5e91652bc1672 -r b3331c459b7001f94117bf5f40e2c0a1b65a9f91 yt/frontends/fits/tests/test_outputs.py
--- a/yt/frontends/fits/tests/test_outputs.py
+++ b/yt/frontends/fits/tests/test_outputs.py
@@ -22,7 +22,9 @@
     small_patch_amr, \
     data_dir_load
 from ..data_structures import FITSDataset, \
-    SpectralCubeFITSDataset
+    SpectralCubeFITSDataset, \
+    SkyDataFITSDataset, \
+    EventsFITSDataset
 
 _fields_grs = ("temperature",)
 
@@ -46,10 +48,50 @@
         test_velocity_field.__name__ = test.description
         yield test
 
+acis = "xray_fits/acisf05356N003_evt2.fits.gz"
+
+_fields_acis = ("counts_0.1-2.0", "counts_2.0-5.0")
+
+@requires_ds(acis)
+def test_acis():
+    from yt.frontends.fits.misc import setup_counts_fields
+    ds = data_dir_load(acis, cls=EventsFITSDataset)
+    ebounds = [(0.1, 2.0), (2.0, 5.0)]
+    setup_counts_fields(ds, ebounds)
+    assert_equal(str(ds), "acisf05356N003_evt2.fits.gz")
+    for test in small_patch_amr(ds, _fields_acis, input_center="c", input_weight="ones"):
+        test_acis.__name__ = test.description
+        yield test
+
+A2052 = "xray_fits/A2052_merged_0.3-2_match-core_tmap_bgecorr.fits"
+
+_fields_A2052 = ("flux",)
+
+@requires_ds(A2052)
+def test_A2052():
+    ds = data_dir_load(A2052, cls=SkyDataFITSDataset)
+    assert_equal(str(ds), "A2052_merged_0.3-2_match-core_tmap_bgecorr.fits")
+    for test in small_patch_amr(ds, _fields_A2052, input_center="c", input_weight="ones"):
+        test_A2052.__name__ = test.description
+        yield test
+
 @requires_file(vf)
 def test_units_override():
     units_override_check(vf)
 
-@requires_file(grs)
+@requires_file(vf)
 def test_FITSDataset():
+    assert isinstance(data_dir_load(vf), FITSDataset)
+
+@requires_file(grs)
+def test_SpectralCubeFITSDataset():
     assert isinstance(data_dir_load(grs), SpectralCubeFITSDataset)
+
+@requires_file(acis)
+def test_EventsFITSDataset():
+    assert isinstance(data_dir_load(acis), EventsFITSDataset)
+
+@requires_file(A2052)
+def test_SkyDataFITSDataset():
+    assert isinstance(data_dir_load(A2052), SkyDataFITSDataset)
+


https://bitbucket.org/yt_analysis/yt/commits/607c7d685023/
Changeset:   607c7d685023
User:        jzuhone
Date:        2017-10-03 13:47:43+00:00
Summary:     restore deprecated z_axis_decomp argument
Affected #:  1 file

diff -r b3331c459b7001f94117bf5f40e2c0a1b65a9f91 -r 607c7d685023c2c738cb6f7148809f1b301469ae yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -634,8 +634,13 @@
                  suppress_astropy_warnings=True,
                  parameters=None,
                  units_override=None,
-                 unit_system="cgs"):
+                 unit_system="cgs",
+                 z_axis_decomp=None):
         self.spectral_factor = spectral_factor
+        if z_axis_decomp is not None:
+            issue_deprecation_warning("The 'z_axis_decomp' argument is deprecated, "
+                                      "as this decomposition is now performed for "
+                                      "spectral-cube FITS datasets automatically.")
         super(SpectralCubeFITSDataset, self).__init__(filename, nprocs=nprocs,
             auxiliary_files=auxiliary_files, storage_filename=storage_filename,
             suppress_astropy_warnings=suppress_astropy_warnings, nan_mask=nan_mask,


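The restored ``z_axis_decomp`` keyword above is accepted only so existing scripts keep working; it does nothing except emit a warning. A generic sketch of that deprecation pattern using only the standard library (the function name and message are illustrative, not yt's exact wording):

```python
import warnings

def load_cube(filename, z_axis_decomp=None):
    # Accept the old keyword purely for backward compatibility;
    # behavior no longer depends on it, so only warn.
    if z_axis_decomp is not None:
        warnings.warn("'z_axis_decomp' is deprecated; spectral-cube FITS "
                      "datasets are now decomposed along the spectral axis "
                      "automatically.", DeprecationWarning)
    return filename  # stand-in for the real dataset object

ds = load_cube("cube.fits", z_axis_decomp=True)  # emits a DeprecationWarning
```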
https://bitbucket.org/yt_analysis/yt/commits/dcf46e025fe3/
Changeset:   dcf46e025fe3
User:        jzuhone
Date:        2017-10-16 20:39:03+00:00
Summary:     Add extra backticks
Affected #:  1 file

diff -r 607c7d685023c2c738cb6f7148809f1b301469ae -r dcf46e025fe359ec39a10d511a44e7d28547056a doc/source/examining/loading_data.rst
--- a/doc/source/examining/loading_data.rst
+++ b/doc/source/examining/loading_data.rst
@@ -968,7 +968,7 @@
                                     slab_centers, slab_width,
                                     nan_mask=0.0)
 
-All keyword arguments to `create_spectral_slabs` are passed on to `load` when 
+All keyword arguments to ``create_spectral_slabs`` are passed on to ``load`` when 
 creating the dataset (see :ref:`additional_fits_options` above). In the 
 returned dataset, the different slabs will be different fields, with the field 
 names taken from the keys in ``slab_centers``. The WCS coordinates on the 


https://bitbucket.org/yt_analysis/yt/commits/c61501837308/
Changeset:   c61501837308
User:        jzuhone
Date:        2017-10-16 20:42:38+00:00
Summary:     Code cleanup as suggested by Britton
Affected #:  1 file

diff -r dcf46e025fe359ec39a10d511a44e7d28547056a -r c61501837308724e4e8c4148e8ab02fd2a9e91f3 yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -231,14 +231,14 @@
             self.grids[i] = self.grid(i, self, self.grid_levels[i,0])
 
     def _domain_decomp(self):
-        bbox = np.array([[le, re] for le, re in zip(self.ds.domain_left_edge,
-                                                    self.ds.domain_right_edge)])
-        dims = np.array(self.ds.domain_dimensions)
+        bbox = np.array([self.ds.domain_left_edge,
+                         self.ds.domain_right_edge]).transpose()
+        dims = self.ds.domain_dimensions
         psize = get_psize(dims, self.num_grids)
         gle, gre, shapes, slices = decompose_array(dims, psize, bbox)
         self.grid_left_edge = self.ds.arr(gle, "code_length")
         self.grid_right_edge = self.ds.arr(gre, "code_length")
-        self.grid_dimensions = np.array([shape for shape in shapes], dtype="int32")
+        self.grid_dimensions = np.array(shapes, dtype="int32")
 
     def _populate_grid_objects(self):
         for i in range(self.num_grids):
@@ -770,7 +770,8 @@
                 elif v.lower() in ["energy", "time"]:
                     num = k.strip("TTYPE")
                     unit = self.primary_header["TUNIT" + num].lower()
-                    if unit.endswith("ev"): unit = unit.replace("ev", "eV")
+                    if unit.endswith("ev"):
+                        unit = unit.replace("ev", "eV")
                     self.events_info[v.lower()] = unit
         self.axis_names = [self.events_info[ax][2] for ax in ["x", "y"]]
         if "reblock" in self.specified_parameters:


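The `_domain_decomp` cleanup above replaces a pairwise list comprehension with a stack-and-transpose, which produces the identical `(ndim, 2)` bounding-box array. A small demonstration with made-up edge values:

```python
import numpy as np

le = np.array([0.0, 0.0, 0.0])   # hypothetical domain left edge
re = np.array([1.0, 2.0, 3.0])   # hypothetical domain right edge

# Before the cleanup: build [[le_i, re_i], ...] element by element.
bbox_old = np.array([[l, r] for l, r in zip(le, re)])

# After the cleanup: stack the two edge vectors and transpose.
bbox_new = np.array([le, re]).transpose()

assert np.array_equal(bbox_old, bbox_new)
```

The same reasoning applies to `np.array(shapes, ...)` replacing `np.array([shape for shape in shapes], ...)`: `np.array` already iterates its argument, so the comprehension was a no-op.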
https://bitbucket.org/yt_analysis/yt/commits/89cf5553a469/
Changeset:   89cf5553a469
User:        ngoldbaum
Date:        2017-10-17 16:09:19+00:00
Summary:     Merge pull request #1576 from jzuhone/fits_refactor

Refactor of the FITS frontend
Affected #:  8 files

diff -r e64509889f88dabc7f11328d95b9aef816700464 -r 89cf5553a4699f5583704674b5bb578b9a1766b4 doc/source/cookbook/fits_radio_cubes.ipynb
--- a/doc/source/cookbook/fits_radio_cubes.ipynb
+++ b/doc/source/cookbook/fits_radio_cubes.ipynb
@@ -43,7 +43,7 @@
    },
    "outputs": [],
    "source": [
-    "ds = yt.load(\"radio_fits/m33_hi.fits\", nan_mask=0.0, z_axis_decomp=True)"
+    "ds = yt.load(\"radio_fits/m33_hi.fits\", nan_mask=0.0)"
    ]
   },
   {
@@ -185,7 +185,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "We can also make a projection of all the emission along the line of sight. Since we're not doing an integration along a path length, we needed to specify `method = \"sum\"`:"
+    "We can also make a projection of all the emission along the line of sight:"
    ]
   },
   {

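The notebook change above drops `z_axis_decomp=True` but keeps `nan_mask=0.0`. In its simplest scalar form, that option just substitutes a fixed value for NaNs in the image array before yt's field machinery sees the data; a rough sketch of the idea (the function name here is illustrative, not yt's internal API):

```python
import numpy as np

def apply_nan_mask(data, nan_mask=None):
    """Illustrative sketch of the scalar ``nan_mask`` load option:
    replace NaNs in a FITS image array with a fixed value."""
    if nan_mask is None:
        return data
    out = data.copy()
    out[np.isnan(out)] = nan_mask
    return out

spectrum = np.array([1.0, np.nan, 3.0])
masked = apply_nan_mask(spectrum, nan_mask=0.0)
```

Radio cubes routinely blank regions outside the primary beam with NaN, which would otherwise poison projections and statistics.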
diff -r e64509889f88dabc7f11328d95b9aef816700464 -r 89cf5553a4699f5583704674b5bb578b9a1766b4 doc/source/examining/loading_data.rst
--- a/doc/source/examining/loading_data.rst
+++ b/doc/source/examining/loading_data.rst
@@ -684,8 +684,8 @@
 * fits.gz
 * fts.gz
 
-yt can read two kinds of FITS files: FITS image files and FITS binary table files containing
-positions, times, and energies of X-ray events.
+yt can currently read two kinds of FITS files: FITS image files and FITS 
+binary table files containing positions, times, and energies of X-ray events.
 
 Though a FITS image is composed of a single array in the FITS file,
 upon being loaded into yt it is automatically decomposed into grids:
@@ -760,14 +760,47 @@
 If your data is of the first case, yt will determine the length units based
 on the information in the header. If your data is of the second or third
 cases, no length units will be assigned, but the world coordinate information
-about the axes will be stored in separate fields. If your data is of the fourth
-type, the coordinates of the first three axes will be determined according to
-cases 1-3.
+about the axes will be stored in separate fields. If your data is of the 
+fourth type, the coordinates of the first three axes will be determined 
+according to cases 1-3.
 
 .. note::
 
-  Linear length-based coordinates (Case 1 above) are only supported if all dimensions
-  have the same value for ``CUNITx``. WCS coordinates are only supported for Cases 2-4.
+  Linear length-based coordinates (Case 1 above) are only supported if all 
+  dimensions have the same value for ``CUNITx``. WCS coordinates are only 
+  supported for Cases 2-4.
+
+FITS Data Decomposition
+^^^^^^^^^^^^^^^^^^^^^^^
+
+Though a FITS image is composed of a single array in the FITS file,
+upon being loaded into yt it is automatically decomposed into grids:
+
+.. code-block:: python
+
+   import yt
+   ds = yt.load("m33_hi.fits")
+   ds.print_stats()
+
+.. parsed-literal::
+
+   level  # grids         # cells     # cells^3
+   ----------------------------------------------
+     0	     512	  981940800       994
+   ----------------------------------------------
+             512	  981940800
+
+For 3D spectral-cube data, the decomposition into grids will be done along the
+spectral axis since this will speed up many common operations for this 
+particular type of dataset. 
+
+yt will generate its own domain decomposition, but the number of grids can be
+set manually by passing the ``nprocs`` parameter to the ``load`` call:
+
+.. code-block:: python
+
+   ds = load("m33_hi.fits", nprocs=64)
+
 
 Fields in FITS Datasets
 ^^^^^^^^^^^^^^^^^^^^^^^
@@ -797,10 +830,11 @@
 will be generated from the pixel coordinates in the file using the WCS
 transformations provided by AstroPy.
 
-X-ray event data will be loaded as particle fields in yt, but a grid will be constructed from the
-WCS information in the FITS header. There is a helper function, ``setup_counts_fields``,
-which may be used to make deposited image fields from the event data for different energy bands
-(for an example see :ref:`xray_fits`).
+X-ray event data will be loaded as particle fields in yt, but a grid will be 
+constructed from the WCS information in the FITS header. There is a helper 
+function, ``setup_counts_fields``, which may be used to make deposited image 
+fields from the event data for different energy bands (for an example see 
+:ref:`xray_fits`).
 
 .. note::
 
@@ -814,8 +848,8 @@
 Additional Options
 ^^^^^^^^^^^^^^^^^^
 
-The following are additional options that may be passed to the ``load`` command when analyzing
-FITS data:
+The following are additional options that may be passed to the ``load`` command 
+when analyzing FITS data:
 
 ``nan_mask``
 """"""""""""
@@ -840,12 +874,6 @@
 files, many of which you may want to ignore. If you want to see these
 warnings, set ``suppress_astropy_warnings = False``.
 
-``z_axis_decomp``
-"""""""""""""""""
-
-For some applications, decomposing 3D FITS data into grids that span the x-y plane with short
-strides along the z-axis may result in a significant improvement in I/O speed. To enable this feature, set ``z_axis_decomp=True``.
-
 ``spectral_factor``
 """""""""""""""""""
 
@@ -860,8 +888,10 @@
 Miscellaneous Tools for Use with FITS Data
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-A number of tools have been prepared for use with FITS data that enhance yt's visualization and
-analysis capabilities for this particular type of data. These are included in the ``yt.frontends.fits.misc`` module, and can be imported like so:
+A number of tools have been prepared for use with FITS data that enhance yt's 
+visualization and analysis capabilities for this particular type of data. These 
+are included in the ``yt.frontends.fits.misc`` module, and can be imported like 
+so:
 
 .. code-block:: python
 
@@ -870,7 +900,8 @@
 ``setup_counts_fields``
 """""""""""""""""""""""
 
-This function can be used to create image fields from X-ray counts data in different energy bands:
+This function can be used to create image fields from X-ray counts data in 
+different energy bands:
 
 .. code-block:: python
 
@@ -880,13 +911,13 @@
 which would make two fields, ``"counts_0.1-2.0"`` and ``"counts_2.0-5.0"``,
 and add them to the field registry for the dataset ``ds``.
 
-
 ``ds9_region``
 """"""""""""""
 
-This function takes a `ds9 <http://ds9.si.edu/site/Home.html>`_ region and creates a "cut region"
-data container from it, that can be used to select the cells in the FITS dataset that fall within
-the region. To use this functionality, the `pyregion <https://github.com/astropy/pyregion/>`_
+This function takes a `ds9 <http://ds9.si.edu/site/Home.html>`_ region and 
+creates a "cut region" data container from it, that can be used to select 
+the cells in the FITS dataset that fall within the region. To use this 
+functionality, the `pyregion <https://github.com/astropy/pyregion/>`_
 package must be installed.
 
 .. code-block:: python
@@ -899,8 +930,9 @@
 ``PlotWindowWCS``
 """""""""""""""""
 
-This class takes a on-axis ``SlicePlot`` or ``ProjectionPlot`` of FITS data and adds celestial
-coordinates to the plot axes. To use it, a version of AstroPy >= 1.3 must be installed.
+This class takes a on-axis ``SlicePlot`` or ``ProjectionPlot`` of FITS 
+data and adds celestial coordinates to the plot axes. To use it, a 
+version of AstroPy >= 1.3 must be installed.
 
 .. code-block:: python
 
@@ -908,21 +940,23 @@
   wcs_slc.show() # for the IPython notebook
   wcs_slc.save()
 
-``WCSAxes`` is still in an experimental state, but as its functionality improves it will be
-utilized more here.
+``WCSAxes`` is still in an experimental state, but as its functionality 
+improves it will be utilized more here.
 
 ``create_spectral_slabs``
 """""""""""""""""""""""""
 
 .. note::
 
-  The following functionality requires the `spectral-cube <http://spectral-cube.readthedocs.org>`_
-  library to be installed.
-
-If you have a spectral intensity dataset of some sort, and would like to extract emission in
-particular slabs along the spectral axis of a certain width, ``create_spectral_slabs`` can be
-used to generate a dataset with these slabs as different fields. In this example, we use it
-to extract individual lines from an intensity cube:
+  The following functionality requires the 
+  `spectral-cube <http://spectral-cube.readthedocs.org>`_ library to be 
+  installed.
+
+If you have a spectral intensity dataset of some sort, and would like to
+extract emission in particular slabs along the spectral axis of a certain 
+width, ``create_spectral_slabs`` can be used to generate a dataset with 
+these slabs as different fields. In this example, we use it to extract 
+individual lines from an intensity cube:
 
 .. code-block:: python
 
@@ -934,11 +968,13 @@
                                     slab_centers, slab_width,
                                     nan_mask=0.0)
 
-All keyword arguments to `create_spectral_slabs` are passed on to `load` when creating the dataset
-(see :ref:`additional_fits_options` above). In the returned dataset, the different slabs will be
-different fields, with the field names taken from the keys in ``slab_centers``. The WCS coordinates
-on the spectral axis are reset so that the center of the domain along this axis is zero, and the
-left and right edges of the domain along this axis are :math:`\pm` ``0.5*slab_width``.
+All keyword arguments to ``create_spectral_slabs`` are passed on to ``load`` when 
+creating the dataset (see :ref:`additional_fits_options` above). In the 
+returned dataset, the different slabs will be different fields, with the field 
+names taken from the keys in ``slab_centers``. The WCS coordinates on the 
+spectral axis are reset so that the center of the domain along this axis is 
+zero, and the left and right edges of the domain along this axis are 
+:math:`\pm` ``0.5*slab_width``.
 
 Examples of Using FITS Data
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^

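The new "FITS Data Decomposition" docs above say yt picks a grid count automatically when `nprocs` is not given. The refactored `_determine_nprocs` (visible later in this merge) targets roughly 32^ndim cells per virtual grid, clamped to between 1 and 512 grids; a standalone sketch of that heuristic:

```python
import numpy as np

def automatic_nprocs(domain_dimensions):
    """Sketch of the automatic grid-count heuristic used by the FITS
    frontend (mirrors the refactored ``_determine_nprocs``): aim for
    about 32**ndim cells per virtual grid, clamped to [1, 512]."""
    dims = np.asarray(domain_dimensions)
    ndim = len(dims)
    nprocs = int(np.around(np.prod(dims) / 32**ndim))
    return max(min(nprocs, 512), 1)

# A 512^3 cube yields (512/32)^3 = 4096, clamped down to the 512-grid cap,
# matching the "512 grids" shown in the print_stats output above.
```

Passing `nprocs=64` explicitly to `load`, as the docs show, simply bypasses this heuristic.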
diff -r e64509889f88dabc7f11328d95b9aef816700464 -r 89cf5553a4699f5583704674b5bb578b9a1766b4 tests/tests.yaml
--- a/tests/tests.yaml
+++ b/tests/tests.yaml
@@ -17,7 +17,7 @@
   local_enzo_p_001:
     - yt/frontends/enzo_p/tests/test_outputs.py
 
-  local_fits_001:
+  local_fits_002:
     - yt/frontends/fits/tests/test_outputs.py
 
   local_flash_008:

diff -r e64509889f88dabc7f11328d95b9aef816700464 -r 89cf5553a4699f5583704674b5bb578b9a1766b4 yt/frontends/fits/api.py
--- a/yt/frontends/fits/api.py
+++ b/yt/frontends/fits/api.py
@@ -13,7 +13,11 @@
 from .data_structures import \
       FITSGrid, \
       FITSHierarchy, \
-      FITSDataset
+      FITSDataset, \
+      SpectralCubeFITSHierarchy, \
+      SpectralCubeFITSDataset, \
+      SkyDataFITSDataset, \
+      EventsFITSDataset
 
 from .fields import \
       FITSFieldInfo

diff -r e64509889f88dabc7f11328d95b9aef816700464 -r 89cf5553a4699f5583704674b5bb578b9a1766b4 yt/frontends/fits/data_structures.py
--- a/yt/frontends/fits/data_structures.py
+++ b/yt/frontends/fits/data_structures.py
@@ -20,7 +20,6 @@
 import weakref
 import warnings
 
-
 from collections import defaultdict
 
 from yt.config import ytcfg
@@ -40,16 +39,18 @@
     FITSFileHandler
 from yt.utilities.io_handler import \
     io_registry
-from .fields import FITSFieldInfo
+from .fields import FITSFieldInfo, \
+    WCSFITSFieldInfo
 from yt.utilities.decompose import \
     decompose_array, get_psize
+from yt.funcs import issue_deprecation_warning
 from yt.units.unit_lookup_table import \
     default_unit_symbol_lut, \
     prefixable_units, \
     unit_prefixes
 from yt.units import dimensions
-from yt.utilities.on_demand_imports import _astropy, NotAModule
-
+from yt.utilities.on_demand_imports import \
+    _astropy, NotAModule
 
 lon_prefixes = ["X","RA","GLON","LINEAR"]
 lat_prefixes = ["Y","DEC","GLAT","LINEAR"]
@@ -57,10 +58,16 @@
 delimiters += [str(i) for i in range(10)]
 regex_pattern = '|'.join(re.escape(_) for _ in delimiters)
 
-spec_names = {"V":"Velocity",
-              "F":"Frequency",
-              "E":"Energy",
-              "W":"Wavelength"}
+spec_names = {"V": "Velocity",
+              "F": "Frequency",
+              "E": "Energy",
+              "W": "Wavelength"}
+
+space_prefixes = list(set(lon_prefixes + lat_prefixes))
+sky_prefixes = set(space_prefixes)
+sky_prefixes.difference_update({"X", "Y", "LINEAR"})
+sky_prefixes = list(sky_prefixes)
+spec_prefixes = list(spec_names.keys())
 
 field_from_unit = {"Jy":"intensity",
                    "K":"temperature"}
@@ -68,8 +75,8 @@
 class FITSGrid(AMRGridPatch):
     _id_offset = 0
     def __init__(self, id, index, level):
-        AMRGridPatch.__init__(self, id, filename = index.index_filename,
-                              index = index)
+        AMRGridPatch.__init__(self, id, filename=index.index_filename,
+                              index=index)
         self.Parent = None
         self.Children = []
         self.Level = 0
@@ -136,19 +143,7 @@
             return True
 
     def _detect_output_fields(self):
-        ds = self.dataset
         self.field_list = []
-        if ds.events_data:
-            for k,v in ds.events_info.items():
-                fname = "event_"+k
-                mylog.info("Adding field %s to the list of fields." % (fname))
-                self.field_list.append(("io",fname))
-                if k in ["x","y"]:
-                    field_unit = "code_length"
-                else:
-                    field_unit = v
-                self.dataset.field_units[("io",fname)] = field_unit
-            return
         self._axis_map = {}
         self._file_map = {}
         self._ext_map = {}
@@ -200,7 +195,7 @@
                         self._axis_map[fname] = k
                         self._file_map[fname] = fits_file
                         self._ext_map[fname] = j
-                        self._scale_map[fname] = [0.0,1.0]
+                        self._scale_map[fname] = [0.0, 1.0]
                         if "bzero" in hdu.header:
                             self._scale_map[fname][0] = hdu.header["bzero"]
                         if "bscale" in hdu.header:
@@ -224,45 +219,27 @@
 
         # If nprocs > 1, decompose the domain into virtual grids
         if self.num_grids > 1:
-            if self.ds.z_axis_decomp:
-                dz = ds.quan(1.0, "code_length")*ds.spectral_factor
-                self.grid_dimensions[:,2] = np.around(float(ds.domain_dimensions[2])/
-                                                            self.num_grids).astype("int")
-                self.grid_dimensions[-1,2] += (ds.domain_dimensions[2] % self.num_grids)
-                self.grid_left_edge[0,2] = ds.domain_left_edge[2]
-                self.grid_left_edge[1:,2] = ds.domain_left_edge[2] + \
-                                            np.cumsum(self.grid_dimensions[:-1,2])*dz
-                self.grid_right_edge[:,2] = self.grid_left_edge[:,2]+self.grid_dimensions[:,2]*dz
-                self.grid_left_edge[:,:2] = ds.domain_left_edge[:2]
-                self.grid_right_edge[:,:2] = ds.domain_right_edge[:2]
-                self.grid_dimensions[:,:2] = ds.domain_dimensions[:2]
-            else:
-                bbox = np.array([[le,re] for le, re in zip(ds.domain_left_edge,
-                                                           ds.domain_right_edge)])
-                dims = np.array(ds.domain_dimensions)
-                psize = get_psize(dims, self.num_grids)
-                gle, gre, shapes, slices = decompose_array(dims, psize, bbox)
-                self.grid_left_edge = self.ds.arr(gle, "code_length")
-                self.grid_right_edge = self.ds.arr(gre, "code_length")
-                self.grid_dimensions = np.array([shape for shape in shapes], dtype="int32")
+            self._domain_decomp()
         else:
             self.grid_left_edge[0,:] = ds.domain_left_edge
             self.grid_right_edge[0,:] = ds.domain_right_edge
             self.grid_dimensions[0] = ds.domain_dimensions
 
-        if ds.events_data:
-            try:
-                self.grid_particle_count[:] = ds.primary_header["naxis2"]
-            except KeyError:
-                self.grid_particle_count[:] = 0.0
-            self._particle_indices = np.zeros(self.num_grids + 1, dtype='int64')
-            self._particle_indices[1] = self.grid_particle_count.squeeze()
-
         self.grid_levels.flat[:] = 0
         self.grids = np.empty(self.num_grids, dtype='object')
         for i in range(self.num_grids):
             self.grids[i] = self.grid(i, self, self.grid_levels[i,0])
 
+    def _domain_decomp(self):
+        bbox = np.array([self.ds.domain_left_edge,
+                         self.ds.domain_right_edge]).transpose()
+        dims = self.ds.domain_dimensions
+        psize = get_psize(dims, self.num_grids)
+        gle, gre, shapes, slices = decompose_array(dims, psize, bbox)
+        self.grid_left_edge = self.ds.arr(gle, "code_length")
+        self.grid_right_edge = self.ds.arr(gre, "code_length")
+        self.grid_dimensions = np.array(shapes, dtype="int32")
+
     def _populate_grid_objects(self):
         for i in range(self.num_grids):
             self.grids[i]._prepare_grid()
@@ -298,6 +275,58 @@
             yield YTDataChunk(dobj, "io", gs, self._count_selection(dobj, gs),
                               cache = cache)
 
+def find_primary_header(fileh):
+    # Sometimes the primary hdu doesn't have an image
+    if len(fileh) > 1 and fileh[0].header["naxis"] == 0:
+        first_image = 1
+    else:
+        first_image = 0
+    header = fileh[first_image].header
+    return header, first_image
+
+def check_fits_valid(args):
+    ext = args[0].rsplit(".", 1)[-1]
+    if ext.upper() in ("GZ", "FZ"):
+        # We don't know for sure that there will be > 1
+        ext = args[0].rsplit(".", 1)[0].rsplit(".", 1)[-1]
+    if ext.upper() not in ("FITS", "FTS"):
+        return None
+    elif isinstance(_astropy.pyfits, NotAModule):
+        raise RuntimeError("This appears to be a FITS file, but AstroPy is not installed.")
+    try:
+        with warnings.catch_warnings():
+            warnings.filterwarnings('ignore', category=UserWarning, append=True)
+            fileh = _astropy.pyfits.open(args[0])
+        header, _ = find_primary_header(fileh)
+        if header["naxis"] >= 2:
+            return fileh
+        else:
+            fileh.close()
+    except:
+        pass
+    return None
+
+def check_sky_coords(args, ndim):
+    fileh = check_fits_valid(args)
+    if fileh is not None:
+        try:
+            if (len(fileh) > 1 and
+                fileh[1].name == "EVENTS" and ndim == 2):
+                fileh.close()
+                return True
+            else:
+                header, _ = find_primary_header(fileh)
+                if header["naxis"] < ndim:
+                    return False
+                axis_names = [header.get("ctype%d" % (i + 1), "")
+                              for i in range(header["naxis"])]
+                x = find_axes(axis_names, sky_prefixes + spec_prefixes)
+                fileh.close()
+                return x >= ndim
+        except:
+            pass
+    return False
+
 class FITSDataset(Dataset):
     _index_class = FITSHierarchy
     _field_info_class = FITSFieldInfo
@@ -310,8 +339,6 @@
                  nprocs=None,
                  storage_filename=None,
                  nan_mask=None,
-                 spectral_factor=1.0,
-                 z_axis_decomp=False,
                  suppress_astropy_warnings=True,
                  parameters=None,
                  units_override=None,
@@ -322,9 +349,6 @@
         parameters["nprocs"] = nprocs
         self.specified_parameters = parameters
 
-        self.z_axis_decomp = z_axis_decomp
-        self.spectral_factor = spectral_factor
-
         if suppress_astropy_warnings:
             warnings.filterwarnings('ignore', module="astropy", append=True)
         auxiliary_files = ensure_list(auxiliary_files)
@@ -360,65 +384,6 @@
                                              ignore_blank=True)
                 self._handle._fits_files.append(f)
 
-        if len(self._handle) > 1 and self._handle[1].name == "EVENTS":
-            self.events_data = True
-            self.first_image = 1
-            self.primary_header = self._handle[self.first_image].header
-            self.naxis = 2
-            self.wcs = _astropy.pywcs.WCS(naxis=2)
-            self.events_info = {}
-            for k,v in self.primary_header.items():
-                if k.startswith("TTYP"):
-                    if v.lower() in ["x","y"]:
-                        num = k.strip("TTYPE")
-                        self.events_info[v.lower()] = (self.primary_header["TLMIN"+num],
-                                                       self.primary_header["TLMAX"+num],
-                                                       self.primary_header["TCTYP"+num],
-                                                       self.primary_header["TCRVL"+num],
-                                                       self.primary_header["TCDLT"+num],
-                                                       self.primary_header["TCRPX"+num])
-                    elif v.lower() in ["energy","time"]:
-                        num = k.strip("TTYPE")
-                        unit = self.primary_header["TUNIT"+num].lower()
-                        if unit.endswith("ev"): unit = unit.replace("ev","eV")
-                        self.events_info[v.lower()] = unit
-            self.axis_names = [self.events_info[ax][2] for ax in ["x","y"]]
-            self.reblock = 1
-            if "reblock" in self.specified_parameters:
-                self.reblock = self.specified_parameters["reblock"]
-            self.wcs.wcs.cdelt = [self.events_info["x"][4]*self.reblock,
-                                  self.events_info["y"][4]*self.reblock]
-            self.wcs.wcs.crpix = [(self.events_info["x"][5]-0.5)/self.reblock+0.5,
-                                  (self.events_info["y"][5]-0.5)/self.reblock+0.5]
-            self.wcs.wcs.ctype = [self.events_info["x"][2],self.events_info["y"][2]]
-            self.wcs.wcs.cunit = ["deg","deg"]
-            self.wcs.wcs.crval = [self.events_info["x"][3],self.events_info["y"][3]]
-            self.dims = [(self.events_info["x"][1]-self.events_info["x"][0])/self.reblock,
-                         (self.events_info["y"][1]-self.events_info["y"][0])/self.reblock]
-        else:
-            self.events_data = False
-            # Sometimes the primary hdu doesn't have an image
-            if len(self._handle) > 1 and self._handle[0].header["naxis"] == 0:
-                self.first_image = 1
-            else:
-                self.first_image = 0
-            self.primary_header = self._handle[self.first_image].header
-            self.naxis = self.primary_header["naxis"]
-            self.axis_names = [self.primary_header.get("ctype%d" % (i+1),"LINEAR")
-                               for i in range(self.naxis)]
-            self.dims = [self.primary_header["naxis%d" % (i+1)]
-                         for i in range(self.naxis)]
-            wcs = _astropy.pywcs.WCS(header=self.primary_header)
-            if self.naxis == 4:
-                self.wcs = _astropy.pywcs.WCS(naxis=3)
-                self.wcs.wcs.crpix = wcs.wcs.crpix[:3]
-                self.wcs.wcs.cdelt = wcs.wcs.cdelt[:3]
-                self.wcs.wcs.crval = wcs.wcs.crval[:3]
-                self.wcs.wcs.cunit = [str(unit) for unit in wcs.wcs.cunit][:3]
-                self.wcs.wcs.ctype = [type for type in wcs.wcs.ctype][:3]
-            else:
-                self.wcs = wcs
-
         self.refine_by = 2
 
         Dataset.__init__(self, fn, dataset_type, units_override=units_override,
@@ -459,18 +424,13 @@
             beam_size = self.quan(beam_size[0], beam_size[1]).in_cgs().value
         else:
             beam_size = 1.0
-        self.unit_registry.add("beam",beam_size,dimensions=dimensions.solid_angle)
-        if self.spec_cube:
-            units = self.wcs_2d.wcs.cunit[0]
-            if units == "deg": units = "degree"
-            if units == "rad": units = "radian"
-            pixel_area = np.prod(np.abs(self.wcs_2d.wcs.cdelt))
-            pixel_area = self.quan(pixel_area, "%s**2" % (units)).in_cgs()
-            pixel_dims = pixel_area.units.dimensions
-            self.unit_registry.add("pixel",float(pixel_area.value),dimensions=pixel_dims)
+        self.unit_registry.add("beam", beam_size, dimensions=dimensions.solid_angle)
 
     def _parse_parameter_file(self):
 
+        self._determine_structure()
+        self._determine_axes()
+
         if self.parameter_filename.startswith("InMemory"):
             self.unique_identifier = time.time()
         else:
@@ -484,7 +444,10 @@
 
         # Sometimes a FITS file has a 4D datacube, in which case
         # we take the 4th axis and assume it consists of different fields.
-        if self.dimensionality == 4: self.dimensionality = 3
+        if self.dimensionality == 4: 
+            self.dimensionality = 3
+
+        self._determine_wcs()
 
         self.domain_dimensions = np.array(self.dims)[:self.dimensionality]
         if self.dimensionality == 2:
@@ -514,154 +477,56 @@
         self.current_redshift = self.omega_lambda = self.omega_matter = \
             self.hubble_constant = self.cosmological_simulation = 0.0
 
-        if self.dimensionality == 2 and self.z_axis_decomp:
-            mylog.warning("You asked to decompose along the z-axis, but this is a 2D dataset. " +
-                          "Ignoring.")
-            self.z_axis_decomp = False
-
-        if self.events_data: self.specified_parameters["nprocs"] = 1
-
-        # If nprocs is None, do some automatic decomposition of the domain
-        if self.specified_parameters["nprocs"] is None:
-            if self.z_axis_decomp:
-                nprocs = np.around(self.domain_dimensions[2]/8).astype("int")
-            else:
-                nprocs = np.around(np.prod(self.domain_dimensions)/32**self.dimensionality).astype("int")
-            self.parameters["nprocs"] = max(min(nprocs, 512), 1)
-        else:
-            self.parameters["nprocs"] = self.specified_parameters["nprocs"]
-
-        # Check to see if this data is in some kind of (Lat,Lon,Vel) format
-        self.spec_cube = False
-        self.wcs_2d = None
-        x = 0
-        for p in lon_prefixes+lat_prefixes+list(spec_names.keys()):
-            y = np_char.startswith(self.axis_names[:self.dimensionality], p)
-            x += np.any(y)
-        if x == self.dimensionality:
-            if self.axis_names == ['LINEAR','LINEAR']:
-                self.wcs_2d = self.wcs
-                self.lat_axis = 1
-                self.lon_axis = 0
-                self.lat_name = "Y"
-                self.lon_name = "X"
-            else:
-                self._setup_spec_cube()
+        self._determine_nprocs()
 
         # Now we can set up some of our parameters for convenience.
-        #self.parameters['wcs'] = dict(self.wcs.to_header())
         for k, v in self.primary_header.items():
             self.parameters[k] = v
         # Remove potential default keys
         self.parameters.pop('', None)
 
-    def _setup_spec_cube(self):
-
-        self.spec_cube = True
-        self.geometry = "spectral_cube"
-
-        end = min(self.dimensionality+1,4)
-        if self.events_data:
-            ctypes = self.axis_names
+    def _determine_nprocs(self):
+        # If nprocs is None, do some automatic decomposition of the domain
+        if self.specified_parameters["nprocs"] is None:
+            nprocs = np.around(np.prod(self.domain_dimensions)/32**self.dimensionality).astype("int")
+            self.parameters["nprocs"] = max(min(nprocs, 512), 1)
         else:
-            ctypes = np.array([self.primary_header["CTYPE%d" % (i)]
-                               for i in range(1,end)])
-
-        log_str = "Detected these axes: "+"%s "*len(ctypes)
-        mylog.info(log_str % tuple([ctype for ctype in ctypes]))
+            self.parameters["nprocs"] = self.specified_parameters["nprocs"]
 
-        self.lat_axis = np.zeros((end-1), dtype="bool")
-        for p in lat_prefixes:
-            self.lat_axis += np_char.startswith(ctypes, p)
-        self.lat_axis = np.where(self.lat_axis)[0][0]
-        self.lat_name = ctypes[self.lat_axis].split("-")[0].lower()
-
-        self.lon_axis = np.zeros((end-1), dtype="bool")
-        for p in lon_prefixes:
-            self.lon_axis += np_char.startswith(ctypes, p)
-        self.lon_axis = np.where(self.lon_axis)[0][0]
-        self.lon_name = ctypes[self.lon_axis].split("-")[0].lower()
-
-        if self.lat_axis == self.lon_axis and self.lat_name == self.lon_name:
-            self.lat_axis = 1
-            self.lon_axis = 0
-            self.lat_name = "Y"
-            self.lon_name = "X"
-
-        if self.wcs.naxis > 2:
-
-            self.spec_axis = np.zeros((end-1), dtype="bool")
-            for p in spec_names.keys():
-                self.spec_axis += np_char.startswith(ctypes, p)
-            self.spec_axis = np.where(self.spec_axis)[0][0]
-            self.spec_name = spec_names[ctypes[self.spec_axis].split("-")[0][0]]
+    def _determine_structure(self):
+        self.primary_header, self.first_image = find_primary_header(self._handle)
+        self.naxis = self.primary_header["naxis"]
+        self.axis_names = [self.primary_header.get("ctype%d" % (i+1),"LINEAR")
+                           for i in range(self.naxis)]
+        self.dims = [self.primary_header["naxis%d" % (i+1)]
+                     for i in range(self.naxis)]
 
-            self.wcs_2d = _astropy.pywcs.WCS(naxis=2)
-            self.wcs_2d.wcs.crpix = self.wcs.wcs.crpix[[self.lon_axis, self.lat_axis]]
-            self.wcs_2d.wcs.cdelt = self.wcs.wcs.cdelt[[self.lon_axis, self.lat_axis]]
-            self.wcs_2d.wcs.crval = self.wcs.wcs.crval[[self.lon_axis, self.lat_axis]]
-            self.wcs_2d.wcs.cunit = [str(self.wcs.wcs.cunit[self.lon_axis]),
-                                     str(self.wcs.wcs.cunit[self.lat_axis])]
-            self.wcs_2d.wcs.ctype = [self.wcs.wcs.ctype[self.lon_axis],
-                                     self.wcs.wcs.ctype[self.lat_axis]]
-
-            self._p0 = self.wcs.wcs.crpix[self.spec_axis]
-            self._dz = self.wcs.wcs.cdelt[self.spec_axis]
-            self._z0 = self.wcs.wcs.crval[self.spec_axis]
-            self.spec_unit = str(self.wcs.wcs.cunit[self.spec_axis])
+    def _determine_wcs(self):
+        wcs = _astropy.pywcs.WCS(header=self.primary_header)
+        if self.naxis == 4:
+            self.wcs = _astropy.pywcs.WCS(naxis=3)
+            self.wcs.wcs.crpix = wcs.wcs.crpix[:3]
+            self.wcs.wcs.cdelt = wcs.wcs.cdelt[:3]
+            self.wcs.wcs.crval = wcs.wcs.crval[:3]
+            self.wcs.wcs.cunit = [str(unit) for unit in wcs.wcs.cunit][:3]
+            self.wcs.wcs.ctype = [type for type in wcs.wcs.ctype][:3]
+        else:
+            self.wcs = wcs
 
-            if self.spectral_factor == "auto":
-                self.spectral_factor = float(max(self.domain_dimensions[[self.lon_axis,
-                                                                         self.lat_axis]]))
-                self.spectral_factor /= self.domain_dimensions[self.spec_axis]
-                mylog.info("Setting the spectral factor to %f" % (self.spectral_factor))
-            Dz = self.domain_right_edge[self.spec_axis]-self.domain_left_edge[self.spec_axis]
-            dre = self.domain_right_edge
-            dre[self.spec_axis] = (self.domain_left_edge[self.spec_axis] +
-                                   self.spectral_factor*Dz)
-            self.domain_right_edge = dre
-            self._dz /= self.spectral_factor
-            self._p0 = (self._p0-0.5)*self.spectral_factor + 0.5
-
-        else:
-
-            self.wcs_2d = self.wcs
-            self.spec_axis = 2
-            self.spec_name = "z"
-            self.spec_unit = "code_length"
-
-    def spec2pixel(self, spec_value):
-        sv = self.arr(spec_value).in_units(self.spec_unit)
-        return self.arr((sv.v-self._z0)/self._dz+self._p0,
-                        "code_length")
-
-    def pixel2spec(self, pixel_value):
-        pv = self.arr(pixel_value, "code_length")
-        return self.arr((pv.v-self._p0)*self._dz+self._z0,
-                        self.spec_unit)
+    def _determine_axes(self):
+        self.lat_axis = 1
+        self.lon_axis = 0
+        self.lat_name = "Y"
+        self.lon_name = "X"
 
     @classmethod
     def _is_valid(cls, *args, **kwargs):
-        ext = args[0].rsplit(".", 1)[-1]
-        if ext.upper() in ("GZ", "FZ"):
-            # We don't know for sure that there will be > 1
-            ext = args[0].rsplit(".", 1)[0].rsplit(".", 1)[-1]
-        if ext.upper() not in ("FITS", "FTS"):
+        fileh = check_fits_valid(args)
+        if fileh is None:
             return False
-        elif isinstance(_astropy.pyfits, NotAModule):
-            raise RuntimeError("This appears to be a FITS file, but AstroPy is not installed.")
-        try:
-            with warnings.catch_warnings():
-                warnings.filterwarnings('ignore', category=UserWarning, append=True)
-                fileh = _astropy.pyfits.open(args[0])
-            valid = fileh[0].header["naxis"] >= 2
-            if len(fileh) > 1:
-                valid = fileh[1].header["naxis"] >= 2 or valid
+        else:
             fileh.close()
-            return valid
-        except:
-            pass
-        return False
+            return True
 
     @classmethod
     def _guess_candidates(cls, base, directories, files):
@@ -676,3 +541,264 @@
 
     def close(self):
         self._handle.close()
+
+def find_axes(axis_names, prefixes):
+    x = 0
+    for p in prefixes:
+        y = np_char.startswith(axis_names, p)
+        x += np.any(y)
+    return x
+
+class SkyDataFITSDataset(FITSDataset):
+    _field_info_class = WCSFITSFieldInfo
+
+    def _determine_wcs(self):
+        super(SkyDataFITSDataset, self)._determine_wcs()
+        end = min(self.dimensionality+1, 4)
+        self.ctypes = np.array([self.primary_header["CTYPE%d" % (i)]
+                                for i in range(1, end)])
+        self.wcs_2d = self.wcs
+
+    def _parse_parameter_file(self):
+        super(SkyDataFITSDataset, self)._parse_parameter_file()
+
+        end = min(self.dimensionality + 1, 4)
+
+        self.geometry = "spectral_cube"
+
+        log_str = "Detected these axes: "+"%s "*len(self.ctypes)
+        mylog.info(log_str % tuple([ctype for ctype in self.ctypes]))
+
+        self.lat_axis = np.zeros((end-1), dtype="bool")
+        for p in lat_prefixes:
+            self.lat_axis += np_char.startswith(self.ctypes, p)
+        self.lat_axis = np.where(self.lat_axis)[0][0]
+        self.lat_name = self.ctypes[self.lat_axis].split("-")[0].lower()
+
+        self.lon_axis = np.zeros((end-1), dtype="bool")
+        for p in lon_prefixes:
+            self.lon_axis += np_char.startswith(self.ctypes, p)
+        self.lon_axis = np.where(self.lon_axis)[0][0]
+        self.lon_name = self.ctypes[self.lon_axis].split("-")[0].lower()
+
+        if self.lat_axis == self.lon_axis and self.lat_name == self.lon_name:
+            self.lat_axis = 1
+            self.lon_axis = 0
+            self.lat_name = "Y"
+            self.lon_name = "X"
+
+        self.spec_axis = 2
+        self.spec_name = "z"
+        self.spec_unit = ""
+
+    def _set_code_unit_attributes(self):
+        super(SkyDataFITSDataset, self)._set_code_unit_attributes()
+        units = self.wcs_2d.wcs.cunit[0]
+        if units == "deg":
+            units = "degree"
+        if units == "rad":
+            units = "radian"
+        pixel_area = np.prod(np.abs(self.wcs_2d.wcs.cdelt))
+        pixel_area = self.quan(pixel_area, "%s**2" % (units)).in_cgs()
+        pixel_dims = pixel_area.units.dimensions
+        self.unit_registry.add("pixel", float(pixel_area.value), dimensions=pixel_dims)
+
+    @classmethod
+    def _is_valid(cls, *args, **kwargs):
+        return check_sky_coords(args, 2)
+
+class SpectralCubeFITSHierarchy(FITSHierarchy):
+
+    def _domain_decomp(self):
+        dz = self.ds.quan(1.0, "code_length") * self.ds.spectral_factor
+        self.grid_dimensions[:, 2] = np.around(float(self.ds.domain_dimensions[2]) /
+                                               self.num_grids).astype("int")
+        self.grid_dimensions[-1, 2] += (self.ds.domain_dimensions[2] % self.num_grids)
+        self.grid_left_edge[0, 2] = self.ds.domain_left_edge[2]
+        self.grid_left_edge[1:, 2] = self.ds.domain_left_edge[2] + \
+                                     np.cumsum(self.grid_dimensions[:-1, 2]) * dz
+        self.grid_right_edge[:, 2] = self.grid_left_edge[:, 2] + self.grid_dimensions[:, 2] * dz
+        self.grid_left_edge[:, :2] = self.ds.domain_left_edge[:2]
+        self.grid_right_edge[:, :2] = self.ds.domain_right_edge[:2]
+        self.grid_dimensions[:, :2] = self.ds.domain_dimensions[:2]
+
+
+class SpectralCubeFITSDataset(SkyDataFITSDataset):
+    _index_class = SpectralCubeFITSHierarchy
+    def __init__(self, filename,
+                 auxiliary_files=[],
+                 nprocs=None,
+                 storage_filename=None,
+                 nan_mask=None,
+                 spectral_factor=1.0,
+                 suppress_astropy_warnings=True,
+                 parameters=None,
+                 units_override=None,
+                 unit_system="cgs",
+                 z_axis_decomp=None):
+        self.spectral_factor = spectral_factor
+        if z_axis_decomp is not None:
+            issue_deprecation_warning("The 'z_axis_decomp' argument is deprecated, "
+                                      "as this decomposition is now performed for "
+                                      "spectral-cube FITS datasets automatically.")
+        super(SpectralCubeFITSDataset, self).__init__(filename, nprocs=nprocs,
+            auxiliary_files=auxiliary_files, storage_filename=storage_filename,
+            suppress_astropy_warnings=suppress_astropy_warnings, nan_mask=nan_mask,
+            parameters=parameters, units_override=units_override,
+            unit_system=unit_system)
+
+    def _parse_parameter_file(self):
+        super(SpectralCubeFITSDataset, self)._parse_parameter_file()
+
+        self.geometry = "spectral_cube"
+
+        end = min(self.dimensionality+1,4)
+
+        self.spec_axis = np.zeros(end-1, dtype="bool")
+        for p in spec_names.keys():
+            self.spec_axis += np_char.startswith(self.ctypes, p)
+        self.spec_axis = np.where(self.spec_axis)[0][0]
+        self.spec_name = spec_names[self.ctypes[self.spec_axis].split("-")[0][0]]
+
+        self.wcs_2d = _astropy.pywcs.WCS(naxis=2)
+        self.wcs_2d.wcs.crpix = self.wcs.wcs.crpix[[self.lon_axis, self.lat_axis]]
+        self.wcs_2d.wcs.cdelt = self.wcs.wcs.cdelt[[self.lon_axis, self.lat_axis]]
+        self.wcs_2d.wcs.crval = self.wcs.wcs.crval[[self.lon_axis, self.lat_axis]]
+        self.wcs_2d.wcs.cunit = [str(self.wcs.wcs.cunit[self.lon_axis]),
+                                 str(self.wcs.wcs.cunit[self.lat_axis])]
+        self.wcs_2d.wcs.ctype = [self.wcs.wcs.ctype[self.lon_axis],
+                                 self.wcs.wcs.ctype[self.lat_axis]]
+
+        self._p0 = self.wcs.wcs.crpix[self.spec_axis]
+        self._dz = self.wcs.wcs.cdelt[self.spec_axis]
+        self._z0 = self.wcs.wcs.crval[self.spec_axis]
+        self.spec_unit = str(self.wcs.wcs.cunit[self.spec_axis])
+
+        if self.spectral_factor == "auto":
+            self.spectral_factor = float(max(self.domain_dimensions[[self.lon_axis,
+                                                                     self.lat_axis]]))
+            self.spectral_factor /= self.domain_dimensions[self.spec_axis]
+            mylog.info("Setting the spectral factor to %f" % (self.spectral_factor))
+        Dz = self.domain_right_edge[self.spec_axis]-self.domain_left_edge[self.spec_axis]
+        dre = self.domain_right_edge
+        dre[self.spec_axis] = (self.domain_left_edge[self.spec_axis] +
+                               self.spectral_factor*Dz)
+        self.domain_right_edge = dre
+        self._dz /= self.spectral_factor
+        self._p0 = (self._p0-0.5)*self.spectral_factor + 0.5
+
+    def _determine_nprocs(self):
+        # If nprocs is None, do some automatic decomposition of the domain
+        if self.specified_parameters["nprocs"] is None:
+            nprocs = np.around(self.domain_dimensions[2] / 8).astype("int")
+            self.parameters["nprocs"] = max(min(nprocs, 512), 1)
+        else:
+            self.parameters["nprocs"] = self.specified_parameters["nprocs"]
+
+    def spec2pixel(self, spec_value):
+        sv = self.arr(spec_value).in_units(self.spec_unit)
+        return self.arr((sv.v-self._z0)/self._dz+self._p0,
+                        "code_length")
+
+    def pixel2spec(self, pixel_value):
+        pv = self.arr(pixel_value, "code_length")
+        return self.arr((pv.v-self._p0)*self._dz+self._z0,
+                        self.spec_unit)
+
+    @classmethod
+    def _is_valid(cls, *args, **kwargs):
+        return check_sky_coords(args, 3)
+
+class EventsFITSHierarchy(FITSHierarchy):
+
+    def _detect_output_fields(self):
+        ds = self.dataset
+        self.field_list = []
+        for k,v in ds.events_info.items():
+            fname = "event_"+k
+            mylog.info("Adding field %s to the list of fields." % (fname))
+            self.field_list.append(("io",fname))
+            if k in ["x","y"]:
+                field_unit = "code_length"
+            else:
+                field_unit = v
+            self.dataset.field_units[("io",fname)] = field_unit
+        return
+
+    def _parse_index(self):
+        super(EventsFITSHierarchy, self)._parse_index()
+        try:
+            self.grid_particle_count[:] = self.dataset.primary_header["naxis2"]
+        except KeyError:
+            self.grid_particle_count[:] = 0.0
+        self._particle_indices = np.zeros(self.num_grids + 1, dtype='int64')
+        self._particle_indices[1] = self.grid_particle_count.squeeze()
+
+class EventsFITSDataset(SkyDataFITSDataset):
+    _index_class = EventsFITSHierarchy
+    def __init__(self, filename,
+                 storage_filename=None,
+                 suppress_astropy_warnings=True,
+                 reblock=1,
+                 parameters=None,
+                 units_override=None,
+                 unit_system="cgs"):
+        self.reblock = reblock
+        super(EventsFITSDataset, self).__init__(filename, nprocs=1,
+            storage_filename=storage_filename, parameters=parameters,
+            suppress_astropy_warnings=suppress_astropy_warnings,
+            units_override=units_override, unit_system=unit_system)
+
+    def _determine_structure(self):
+        self.first_image = 1
+        self.primary_header = self._handle[self.first_image].header
+        self.naxis = 2
+
+    def _determine_wcs(self):
+        self.wcs = _astropy.pywcs.WCS(naxis=2)
+        self.events_info = {}
+        for k, v in self.primary_header.items():
+            if k.startswith("TTYP"):
+                if v.lower() in ["x", "y"]:
+                    num = k.strip("TTYPE")
+                    self.events_info[v.lower()] = (self.primary_header["TLMIN" + num],
+                                                   self.primary_header["TLMAX" + num],
+                                                   self.primary_header["TCTYP" + num],
+                                                   self.primary_header["TCRVL" + num],
+                                                   self.primary_header["TCDLT" + num],
+                                                   self.primary_header["TCRPX" + num])
+                elif v.lower() in ["energy", "time"]:
+                    num = k.strip("TTYPE")
+                    unit = self.primary_header["TUNIT" + num].lower()
+                    if unit.endswith("ev"):
+                        unit = unit.replace("ev", "eV")
+                    self.events_info[v.lower()] = unit
+        self.axis_names = [self.events_info[ax][2] for ax in ["x", "y"]]
+        if "reblock" in self.specified_parameters:
+            issue_deprecation_warning("'reblock' is now a keyword argument that "
+                                      "can be passed to 'yt.load'. This behavior "
+                                      "is deprecated.")
+            self.reblock = self.specified_parameters["reblock"]
+        self.wcs.wcs.cdelt = [self.events_info["x"][4] * self.reblock,
+                              self.events_info["y"][4] * self.reblock]
+        self.wcs.wcs.crpix = [(self.events_info["x"][5] - 0.5) / self.reblock + 0.5,
+                              (self.events_info["y"][5] - 0.5) / self.reblock + 0.5]
+        self.wcs.wcs.ctype = [self.events_info["x"][2], self.events_info["y"][2]]
+        self.wcs.wcs.cunit = ["deg", "deg"]
+        self.wcs.wcs.crval = [self.events_info["x"][3], self.events_info["y"][3]]
+        self.dims = [(self.events_info["x"][1] - self.events_info["x"][0]) / self.reblock,
+                     (self.events_info["y"][1] - self.events_info["y"][0]) / self.reblock]
+        self.ctypes = self.axis_names
+        self.wcs_2d = self.wcs
+
+    @classmethod
+    def _is_valid(cls, *args, **kwargs):
+        fileh = check_fits_valid(args)
+        if fileh is not None:
+            try:
+                valid = fileh[1].name == "EVENTS"
+                fileh.close()
+                return valid
+            except:
+                pass
+        return False
\ No newline at end of file
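
The `spec2pixel`/`pixel2spec` pair moved into `SpectralCubeFITSDataset` implements a plain linear WCS mapping along the spectral axis. Stripped of yt's unit machinery, the arithmetic is just the following (a sketch; `p0`, `dz`, `z0` correspond to the `crpix`, `cdelt`, `crval` values read from the header, and the default values here are illustrative only):

```python
# Sketch of the linear spectral-axis mapping behind
# SpectralCubeFITSDataset.spec2pixel / pixel2spec above.
# p0 (crpix), dz (cdelt) and z0 (crval) come from the FITS WCS;
# yt's unit handling and the spectral_factor rescaling are omitted.

def spec2pixel(spec, p0=1.0, dz=0.5, z0=100.0):
    """World (spectral) coordinate -> pixel coordinate."""
    return (spec - z0) / dz + p0

def pixel2spec(pix, p0=1.0, dz=0.5, z0=100.0):
    """Pixel coordinate -> world (spectral) coordinate."""
    return (pix - p0) * dz + z0
```

The two functions are exact inverses, which is what makes it safe for the hierarchy to decompose the domain along the (possibly rescaled) spectral axis.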

diff -r e64509889f88dabc7f11328d95b9aef816700464 -r 89cf5553a4699f5583704674b5bb578b9a1766b4 yt/frontends/fits/fields.py
--- a/yt/frontends/fits/fields.py
+++ b/yt/frontends/fits/fields.py
@@ -19,12 +19,20 @@
     def __init__(self, ds, field_list, slice_info=None):
         super(FITSFieldInfo, self).__init__(ds, field_list, slice_info=slice_info)
         for field in ds.field_list:
-            if field[0] == "fits": self[field].take_log = False
+            if field[0] == "fits": 
+                self[field].take_log = False
+
+class WCSFITSFieldInfo(FITSFieldInfo):
 
-    def _setup_spec_cube_fields(self):
+    def setup_fluid_fields(self):
+        wcs_2d = getattr(self.ds, "wcs_2d", self.ds.wcs)
+
+        def _pixel(field, data):
+            return data.ds.arr(data["ones"], "pixel")
+        self.add_field(("fits", "pixel"), sampling_type="cell", function=_pixel, units="pixel")
 
         def _get_2d_wcs(data, axis):
-            w_coords = data.ds.wcs_2d.wcs_pix2world(data["x"], data["y"], 1)
+            w_coords = wcs_2d.wcs_pix2world(data["x"], data["y"], 1)
             return w_coords[axis]
 
         def world_f(axis, unit):
@@ -33,11 +41,14 @@
             return _world_f
 
         for (i, axis), name in zip(enumerate([self.ds.lon_axis, self.ds.lat_axis]),
-                                             [self.ds.lon_name, self.ds.lat_name]):
-            unit = str(self.ds.wcs_2d.wcs.cunit[i])
-            if unit.lower() == "deg": unit = "degree"
-            if unit.lower() == "rad": unit = "radian"
-            self.add_field(("fits",name), sampling_type="cell",  function=world_f(axis, unit), units=unit)
+                                   [self.ds.lon_name, self.ds.lat_name]):
+            unit = str(wcs_2d.wcs.cunit[i])
+            if unit.lower() == "deg":
+                unit = "degree"
+            if unit.lower() == "rad":
+                unit = "radian"
+            self.add_field(("fits",name), sampling_type="cell", 
+                           function=world_f(axis, unit), units=unit)
 
         if self.ds.dimensionality == 3:
             def _spec(field, data):
@@ -46,12 +57,3 @@
                 return data.ds.arr(sp, data.ds.spec_unit)
             self.add_field(("fits","spectral"), sampling_type="cell",  function=_spec,
                            units=self.ds.spec_unit, display_name=self.ds.spec_name)
-
-    def setup_fluid_fields(self):
-
-        if self.ds.spec_cube:
-            def _pixel(field, data):
-                return data.ds.arr(data["ones"], "pixel")
-            self.add_field(("fits","pixel"), sampling_type="cell",  function=_pixel, units="pixel")
-            self._setup_spec_cube_fields()
-            return
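
Both `_set_code_unit_attributes` and the reworked `setup_fluid_fields` normalize the FITS angular unit strings to the names yt's unit registry uses. The mapping applied in the diff above, in isolation:

```python
# Sketch of the angular-unit normalization in the diff above
# ("deg" -> "degree", "rad" -> "radian"); other units pass through.

def normalize_angular_unit(unit):
    unit = str(unit)
    if unit.lower() == "deg":
        return "degree"
    if unit.lower() == "rad":
        return "radian"
    return unit
```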

diff -r e64509889f88dabc7f11328d95b9aef816700464 -r 89cf5553a4699f5583704674b5bb578b9a1766b4 yt/frontends/fits/misc.py
--- a/yt/frontends/fits/misc.py
+++ b/yt/frontends/fits/misc.py
@@ -162,6 +162,7 @@
     >>> print circle_region.quantities.extrema("flux")
     """
     import pyregion
+    from yt.frontends.fits.api import EventsFITSDataset
     if os.path.exists(reg):
         r = pyregion.open(reg)
     else:
@@ -170,8 +171,8 @@
     filter = r.get_filter(header=ds.wcs_2d.to_header())
     nx = ds.domain_dimensions[ds.lon_axis]
     ny = ds.domain_dimensions[ds.lat_axis]
-    mask = filter.mask((ny,nx)).transpose()
-    if ds.events_data:
+    mask = filter.mask((ny, nx)).transpose()
+    if isinstance(ds, EventsFITSDataset):
         prefix = "event_"
     else:
         prefix = ""

diff -r e64509889f88dabc7f11328d95b9aef816700464 -r 89cf5553a4699f5583704674b5bb578b9a1766b4 yt/frontends/fits/tests/test_outputs.py
--- a/yt/frontends/fits/tests/test_outputs.py
+++ b/yt/frontends/fits/tests/test_outputs.py
@@ -21,14 +21,17 @@
     requires_ds, \
     small_patch_amr, \
     data_dir_load
-from ..data_structures import FITSDataset
+from ..data_structures import FITSDataset, \
+    SpectralCubeFITSDataset, \
+    SkyDataFITSDataset, \
+    EventsFITSDataset
 
 _fields_grs = ("temperature",)
 
 grs = "radio_fits/grs-50-cube.fits"
 @requires_ds(grs)
 def test_grs():
-    ds = data_dir_load(grs, cls=FITSDataset, kwargs={"nan_mask":0.0})
+    ds = data_dir_load(grs, cls=SpectralCubeFITSDataset, kwargs={"nan_mask":0.0})
     assert_equal(str(ds), "grs-50-cube.fits")
     for test in small_patch_amr(ds, _fields_grs, input_center="c", input_weight="ones"):
         test_grs.__name__ = test.description
@@ -45,10 +48,50 @@
         test_velocity_field.__name__ = test.description
         yield test
 
+acis = "xray_fits/acisf05356N003_evt2.fits.gz"
+
+_fields_acis = ("counts_0.1-2.0", "counts_2.0-5.0")
+
+@requires_ds(acis)
+def test_acis():
+    from yt.frontends.fits.misc import setup_counts_fields
+    ds = data_dir_load(acis, cls=EventsFITSDataset)
+    ebounds = [(0.1, 2.0), (2.0, 5.0)]
+    setup_counts_fields(ds, ebounds)
+    assert_equal(str(ds), "acisf05356N003_evt2.fits.gz")
+    for test in small_patch_amr(ds, _fields_acis, input_center="c", input_weight="ones"):
+        test_acis.__name__ = test.description
+        yield test
+
+A2052 = "xray_fits/A2052_merged_0.3-2_match-core_tmap_bgecorr.fits"
+
+_fields_A2052 = ("flux",)
+
+@requires_ds(A2052)
+def test_A2052():
+    ds = data_dir_load(A2052, cls=SkyDataFITSDataset)
+    assert_equal(str(ds), "A2052_merged_0.3-2_match-core_tmap_bgecorr.fits")
+    for test in small_patch_amr(ds, _fields_A2052, input_center="c", input_weight="ones"):
+        test_A2052.__name__ = test.description
+        yield test
+
 @requires_file(vf)
 def test_units_override():
     units_override_check(vf)
 
-@requires_file(grs)
+@requires_file(vf)
 def test_FITSDataset():
-    assert isinstance(data_dir_load(grs), FITSDataset)
+    assert isinstance(data_dir_load(vf), FITSDataset)
+
+@requires_file(grs)
+def test_SpectralCubeFITSDataset():
+    assert isinstance(data_dir_load(grs), SpectralCubeFITSDataset)
+
+@requires_file(acis)
+def test_EventsFITSDataset():
+    assert isinstance(data_dir_load(acis), EventsFITSDataset)
+
+@requires_file(A2052)
+def test_SkyDataFITSDataset():
+    assert isinstance(data_dir_load(A2052), SkyDataFITSDataset)
+
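
For reference, the filename screening that the old inline `_is_valid` performed, and which `check_fits_valid` now factors out for all four dataset classes, amounted to stripping a compression suffix and testing the FITS extension. A sketch reproducing the deleted logic (header validation omitted):

```python
# Sketch of the extension check from the removed FITSDataset._is_valid body
# (now delegated to check_fits_valid); the open-and-inspect-header step
# that follows it is not shown here.

def looks_like_fits(filename):
    ext = filename.rsplit(".", 1)[-1]
    if ext.upper() in ("GZ", "FZ"):
        # Compressed file: inspect the extension before the compression suffix.
        ext = filename.rsplit(".", 1)[0].rsplit(".", 1)[-1]
    return ext.upper() in ("FITS", "FTS")
```

Centralizing this lets `EventsFITSDataset._is_valid` layer its own `fileh[1].name == "EVENTS"` check on top without repeating the filename handling.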

Repository URL: https://bitbucket.org/yt_analysis/yt/
