[yt-svn] commit/yt: 2 new changesets
commits-noreply at bitbucket.org
Tue Jul 22 07:33:05 PDT 2014
2 new commits in yt:
https://bitbucket.org/yt_analysis/yt/commits/d25f73b9fab9/
Changeset: d25f73b9fab9
Branch: yt-3.0
User: mzingale
Date: 2014-07-22 04:33:15
Summary: the docstring said draw_domain() returned nothing -- it actually returns
a new image
Affected #: 1 file
diff -r 52b3ddbb6a53030a51290fba45acf8deab74910f -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e yt/visualization/volume_rendering/camera.py
--- a/yt/visualization/volume_rendering/camera.py
+++ b/yt/visualization/volume_rendering/camera.py
@@ -420,13 +420,14 @@
Returns
-------
- None
+ nim: Numpy ndarray
+ A new image with the domain lines drawn
Examples
--------
>>> im = cam.snapshot()
- >>> cam.draw_domain(im)
- >>> write_bitmap(im, 'render_with_domain_boundary.png')
+ >>> nim = cam.draw_domain(im)
+ >>> write_bitmap(nim, 'render_with_domain_boundary.png')
"""
# Must normalize the image
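For reference, the corrected usage reads as follows; this is a minimal sketch assuming `cam` is an already-configured Camera instance:

    from yt.visualization.image_writer import write_bitmap

    im = cam.snapshot()
    nim = cam.draw_domain(im)   # returns a new image; `im` itself is left unchanged
    write_bitmap(nim, 'render_with_domain_boundary.png')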
https://bitbucket.org/yt_analysis/yt/commits/e816afd0f063/
Changeset: e816afd0f063
Branch: yt-3.0
User: mzingale
Date: 2014-07-22 04:33:57
Summary: merge
Affected #: 53 files
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/cheatsheet.tex
--- a/doc/cheatsheet.tex
+++ b/doc/cheatsheet.tex
@@ -3,7 +3,7 @@
\usepackage{calc}
\usepackage{ifthen}
\usepackage[landscape]{geometry}
-\usepackage[colorlinks = true, linkcolor=blue, citecolor=blue, urlcolor=blue]{hyperref}
+\usepackage[hyphens]{url}
% To make this come out properly in landscape mode, do one of the following
% 1.
@@ -101,9 +101,13 @@
Documentation \url{http://yt-project.org/doc/index.html}.
Need help? Start here \url{http://yt-project.org/doc/help/} and then
try the IRC chat room \url{http://yt-project.org/irc.html},
-or the mailing list \url{http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org}.
-{\bf Installing yt:} The easiest way to install yt is to use the installation script
-found on the yt homepage or the docs linked above.
+or the mailing list \url{http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org}. \\
+
+\subsection{Installing yt} The easiest way to install yt is to use the
+installation script found on the yt homepage or the docs linked above. If you
+already have Python set up with \texttt{numpy}, \texttt{scipy},
+\texttt{matplotlib}, \texttt{h5py}, and \texttt{cython}, you can also use
+\texttt{pip install yt}.
\subsection{Command Line yt}
yt, and its convenience functions, are launched from a command line prompt.
@@ -118,9 +122,8 @@
\texttt{yt stats} {\it dataset} \textemdash\ Print stats of a dataset. \\
\texttt{yt update} \textemdash\ Update yt to most recent version.\\
\texttt{yt update --all} \textemdash\ Update yt and dependencies to most recent version. \\
-\texttt{yt instinfo} \textemdash\ yt installation information. \\
+\texttt{yt version} \textemdash\ yt installation information. \\
\texttt{yt notebook} \textemdash\ Run the IPython notebook server. \\
-\texttt{yt serve} ({\it dataset}) \textemdash\ Run yt-specific web GUI ({\it dataset} is optional).\\
\texttt{yt upload\_image} {\it image.png} \textemdash\ Upload PNG image to imgur.com. \\
\texttt{yt upload\_notebook} {\it notebook.nb} \textemdash\ Upload IPython notebook to hub.yt-project.org.\\
\texttt{yt plot} {\it dataset} \textemdash\ Create a set of images.\\
@@ -132,16 +135,8 @@
paste.yt-project.org. \\
\texttt{yt pastebin\_grab} {\it identifier} \textemdash\ Print content of pastebin to
STDOUT. \\
- \texttt{yt hub\_register} \textemdash\ Register with
-hub.yt-project.org. \\
-\texttt{yt hub\_submit} \textemdash\ Submit hg repo to
-hub.yt-project.org. \\
-\texttt{yt bootstrap\_dev} \textemdash\ Bootstrap a yt
-development environment. \\
\texttt{yt bugreport} \textemdash\ Report a yt bug. \\
\texttt{yt hop} {\it dataset} \textemdash\ Run hop on a dataset. \\
-\texttt{yt rpdb} \textemdash\ Connect to running rpd
- session.
\subsection{yt Imports}
In order to use yt, Python must load the relevant yt modules into memory.
@@ -149,37 +144,40 @@
used as part of a script.
\newlength{\MyLen}
\settowidth{\MyLen}{\texttt{letterpaper}/\texttt{a4paper} \ }
-\texttt{from yt.mods import \textasteriskcentered} \textemdash\
-Load base yt modules. \\
+\texttt{import yt} \textemdash\
+Load yt. \\
\texttt{from yt.config import ytcfg} \textemdash\
Used to set yt configuration options.
- If used, must be called before importing any other module.\\
-\texttt{from yt.analysis\_modules.api import \textasteriskcentered} \textemdash\
-Load all yt analysis modules. \\
+If used, must be called before importing any other module.\\
\texttt{from yt.analysis\_modules.\emph{halo\_finding}.api import \textasteriskcentered} \textemdash\
Load halo finding modules. Other modules
are loaded in a similar way by swapping the
{\em emphasized} text.
See the \textbf{Analysis Modules} section for a listing and short descriptions of each.
-\subsection{Numpy Arrays}
-Simulation data in yt is returned in Numpy arrays. The Numpy package provides a wealth of built-in
-functions that operate on Numpy arrays. Here is a very brief list of some useful ones.
-Please see \url{http://docs.scipy.org/doc/numpy/reference/} for the full
-numpy documentation.\\
-\settowidth{\MyLen}{\texttt{multicol} }
+\subsection{YTArray}
+Simulation data in yt is returned as a YTArray. YTArray is a numpy array that
+has unit data attached to it and can automatically handle unit conversions and
+detect unit errors. Just like a numpy array, YTArray provides a wealth of
+built-in functions to calculate properties of the data in the array. Here is a
+very brief list of some useful ones.
+\settowidth{\MyLen}{\texttt{multicol} }\\
+\texttt{v = a.in\_cgs()} \textemdash\ Return the array in CGS units \\
+\texttt{v = a.in\_units('Msun/pc**3')} \textemdash\ Return the array in solar masses per cubic parsec \\
\texttt{v = a.max(), a.min()} \textemdash\ Return maximum, minimum of \texttt{a}. \\
-\texttt{index = a.argmax(), a.argmin()} \textemdash\ Return index of max,
+\texttt{index = a.argmax(), a.argmin()} \textemdash\ Return index of max,
min value of \texttt{a}.\\
\texttt{v = a[}{\it index}\texttt{]} \textemdash\ Select a single value from \texttt{a} at location {\it index}.\\
-\texttt{b = a[}{\it i:j}\texttt{]} \textemdash\ Select the slice of values from \texttt{a} between
+\texttt{b = a[}{\it i:j}\texttt{]} \textemdash\ Select the slice of values from
+\texttt{a} between
locations {\it i} to {\it j-1} saved to a new Numpy array \texttt{b} with length {\it j-i}. \\
-\texttt{sel = (a > const)} \textemdash\ Create a new boolean Numpy array \texttt{sel}, of the same shape as \texttt{a},
+\texttt{sel = (a > const)} \textemdash\ Create a new boolean Numpy array
+\texttt{sel}, of the same shape as \texttt{a},
that marks which values of \texttt{a > const}. Other operators (e.g. \textless, !=, \%) work as well.\\
-\texttt{b = a[sel]} \textemdash\ Create a new Numpy array \texttt{b} made up of elements from \texttt{a} that correspond to elements of \texttt{sel}
+\texttt{b = a[sel]} \textemdash\ Create a new Numpy array \texttt{b} made up of
+elements from \texttt{a} that correspond to elements of \texttt{sel}
that are {\it True}. In the above example \texttt{b} would be all elements of \texttt{a} that are greater than \texttt{const}.\\
-\texttt{a.dump({\it filename.dat})} \textemdash\ Save \texttt{a} to the binary file {\it filename.dat}.\\
-\texttt{a = np.load({\it filename.dat})} \textemdash\ Load the contents of {\it filename.dat} into \texttt{a}.
+\texttt{a.write\_hdf5({\it filename.h5})} \textemdash\ Save \texttt{a} to the hdf5 file {\it filename.h5}.\\
\subsection{IPython Tips}
\settowidth{\MyLen}{\texttt{multicol} }
@@ -196,6 +194,7 @@
\texttt{\%hist} \textemdash\ Print recent command history.\\
\texttt{\%quickref} \textemdash\ Print IPython quick reference.\\
\texttt{\%pdb} \textemdash\ Automatically enter the Python debugger at an exception.\\
+\texttt{\%debug} \textemdash\ Drop into a debugger at the location of the last unhandled exception. \\
\texttt{\%time, \%timeit} \textemdash\ Find running time of expressions for benchmarking.\\
\texttt{\%lsmagic} \textemdash\ List all available IPython magics. Hint: \texttt{?} works with magics.\\
@@ -208,10 +207,10 @@
After that, simulation data is generally accessed in yt using {\it Data Containers} which are Python objects
that define a region of simulation space from which data should be selected.
\settowidth{\MyLen}{\texttt{multicol} }
-\texttt{ds = load(}{\it dataset}\texttt{)} \textemdash\ Reference a single snapshot.\\
+\texttt{ds = yt.load(}{\it dataset}\texttt{)} \textemdash\ Reference a single snapshot.\\
\texttt{dd = ds.all\_data()} \textemdash\ Select the entire volume.\\
-\texttt{a = dd[}{\it field\_name}\texttt{]} \textemdash\ Saves the contents of {\it field} into the
-numpy array \texttt{a}. Similarly for other data containers.\\
+\texttt{a = dd[}{\it field\_name}\texttt{]} \textemdash\ Copies the contents of {\it field} into the
+YTArray \texttt{a}. Similarly for other data containers.\\
\texttt{ds.field\_list} \textemdash\ A list of available fields in the snapshot. \\
\texttt{ds.derived\_field\_list} \textemdash\ A list of available derived fields
in the snapshot. \\
@@ -231,45 +230,29 @@
direction set by {\it normal}, with total length
2$\times${\it height} and with radius {\it radius}. \\
- \texttt{bl = ds.boolean({\it constructor})} \textemdash\ Create a boolean data
- container. {\it constructor} is a list of pre-defined non-boolean
- data containers with nested boolean logic using the
- ``AND'', ``NOT'', or ``OR'' operators. E.g. {\it constructor=}
- {\it [sp, ``NOT'', (di, ``OR'', re)]} gives a volume defined
- by {\it sp} minus the patches covered by {\it di} and {\it re}.\\
-
\texttt{ds.save\_object(sp, {\it ``sp\_for\_later''})} \textemdash\ Save an object (\texttt{sp}) for later use.\\
\texttt{sp = ds.load\_object({\it ``sp\_for\_later''})} \textemdash\ Recover a saved object.\\
-\subsection{Defining New Fields \& Quantities}
-\texttt{yt} expects on-disk fields, fields generated on-demand and in-memory. Quantities reduce a field (e.g. "Density") defined over an object (e.g. "sphere") to get a single value (e.g. "Mass"). \\
-\texttt{def \_MetalMassMsun({\it field},{\it data})}\\
-\texttt{\hspace{4 mm} return data["Metallicity"]*data["CellMassMsun"]}\\
-\texttt{add\_field("MetalMassMsun",function=\_MetalMassMsun)}\\
-Define a new quantity; note the first function operates on grids and data objects and the second on the results of the first. \\
-\texttt{def \_TotalMass(data): }\\
-\texttt{\hspace{4 mm} baryon\_mass = data["CellMassMsun"].sum()}\\
-\texttt{\hspace{4 mm} particle\_mass = data["ParticleMassMsun"].sum()}\\
-\texttt{\hspace{4 mm} return baryon\_mass, particle\_mass}\\
-\texttt{def \_combTotalMass(data, baryon\_mass, particle\_mass):}\\
-\texttt{\hspace{4 mm} return baryon\_mass.sum() + particle\_mass.sum()}\\
-\texttt{add\_quantity("TotalMass", function=\_TotalMass,}\\
-\texttt{\hspace{4 mm} combine\_function=\_combTotalMass, n\_ret = 2)}\\
-
-
+\subsection{Defining New Fields}
+\texttt{yt} handles on-disk fields, fields generated on-demand, and in-memory fields.
+Fields can either be created before a dataset is loaded using \texttt{add\_field}:
+\texttt{def \_metal\_mass({\it field},{\it data})}\\
+\texttt{\hspace{4 mm} return data["metallicity"]*data["cell\_mass"]}\\
+\texttt{add\_field("metal\_mass", units='g', function=\_metal\_mass)}\\
+Or added to an existing dataset using \texttt{ds.add\_field}:
+\texttt{ds.add\_field("metal\_mass", units='g', function=\_metal\_mass)}\\
\subsection{Slices and Projections}
\settowidth{\MyLen}{\texttt{multicol} }
-\texttt{slc = SlicePlot(ds, {\it axis}, {\it field}, {\it center=}, {\it width=}, {\it weight\_field=}, {\it additional parameters})} \textemdash\ Make a slice plot
-perpendicular to {\it axis} of {\it field} weighted by {\it weight\_field} at (code-units) {\it center} with
-{\it width} in code units or a (value, unit) tuple. Hint: try {\it SlicePlot?} in IPython to see additional parameters.\\
+\texttt{slc = yt.SlicePlot(ds, {\it axis or normal vector}, {\it field}, {\it center=}, {\it width=}, {\it weight\_field=}, {\it additional parameters})} \textemdash\ Make a slice plot
+perpendicular to {\it axis} (specified via 'x', 'y', or 'z') or a normal vector for an off-axis slice of {\it field} weighted by {\it weight\_field} at (code-units) {\it center} with
+{\it width} in code units or a (value, unit) tuple. Hint: try {\it yt.SlicePlot?} in IPython to see additional parameters.\\
\texttt{slc.save({\it file\_prefix})} \textemdash\ Save the slice to a png with name prefix {\it file\_prefix}.
\texttt{.save()} works similarly for the commands below.\\
-\texttt{prj = ProjectionPlot(ds, {\it axis}, {\it field}, {\it addit. params})} \textemdash\ Make a projection. \\
-\texttt{prj = OffAxisSlicePlot(ds, {\it normal}, {\it fields}, {\it center=}, {\it width=}, {\it depth=},{\it north\_vector=},{\it weight\_field=})} \textemdash Make an off-axis slice. Note this takes an array of fields. \\
-\texttt{prj = OffAxisProjectionPlot(ds, {\it normal}, {\it fields}, {\it center=}, {\it width=}, {\it depth=},{\it north\_vector=},{\it weight\_field=})} \textemdash Make an off axis projection. Note this takes an array of fields. \\
+\texttt{prj = yt.ProjectionPlot(ds, {\it axis}, {\it field}, {\it addit. params})} \textemdash\ Make a projection. \\
+\texttt{prj = yt.OffAxisProjectionPlot(ds, {\it normal}, {\it fields}, {\it center=}, {\it width=}, {\it depth=},{\it north\_vector=},{\it weight\_field=})} \textemdash Make an off axis projection. Note this takes an array of fields. \\
\subsection{Plot Annotations}
\settowidth{\MyLen}{\texttt{multicol} }
@@ -299,51 +282,37 @@
The \texttt{my\_plugins.py} file \textemdash\ Add functions, derived fields, constants, or other commonly-used Python code to yt.
-
-
\subsection{Analysis Modules}
\settowidth{\MyLen}{\texttt{multicol}}
The import name for each module is listed at the end of each description (see \textbf{yt Imports}).
\texttt{Absorption Spectrum} \textemdash\ (\texttt{absorption\_spectrum}). \\
\texttt{Clump Finder} \textemdash\ Find clumps defined by density thresholds (\texttt{level\_sets}). \\
-\texttt{Coordinate Transformation} \textemdash\ (\texttt{coordinate\_transformation}). \\
\texttt{Halo Finding} \textemdash\ Locate halos of dark matter particles (\texttt{halo\_finding}). \\
-\texttt{Halo Mass Function} \textemdash\ Find halo mass functions from data and from theory (\texttt{halo\_mass\_function}). \\
-\texttt{Halo Profiling} \textemdash\ Profile and project multiple halos (\texttt{halo\_profiler}). \\
-\texttt{Halo Merger Tree} \textemdash\ Create a database of halo mergers (\texttt{halo\_merger\_tree}). \\
\texttt{Light Cone Generator} \textemdash\ Stitch datasets together to perform analysis over cosmological volumes. \\
\texttt{Light Ray Generator} \textemdash\ Analyze the path of light rays.\\
-\texttt{Radial Column Density} \textemdash\ Calculate column densities around a point (\texttt{radial\_column\_density}). \\
\texttt{Rockstar Halo Finding} \textemdash\ Locate halos of dark matter using the Rockstar halo finder (\texttt{halo\_finding.rockstar}). \\
\texttt{Star Particle Analysis} \textemdash\ Analyze star formation history and assemble spectra (\texttt{star\_analysis}). \\
\texttt{Sunrise Exporter} \textemdash\ Export data to the sunrise visualization format (\texttt{sunrise\_export}). \\
-\texttt{Two Point Functions} \textemdash\ Two point correlations (\texttt{two\_point\_functions}). \\
\subsection{Parallel Analysis}
-\settowidth{\MyLen}{\texttt{multicol}}
-Nearly all of yt is parallelized using MPI.
-The {\it mpi4py} package must be installed for parallelism in yt.
-To install {\it pip install mpi4py} on the command line usually works.
+\settowidth{\MyLen}{\texttt{multicol}}
+Nearly all of yt is parallelized using
+MPI. The {\it mpi4py} package must be installed for parallelism in yt. To
+install it, running {\it pip install mpi4py} on the command line usually works.
Execute python in parallel similar to this:\\
-{\it mpirun -n 12 python script.py --parallel}\\
-This command may differ for each system on which you use yt;
-please consult the system documentation for details on how to run parallel applications.
+{\it mpirun -n 12 python script.py}\\
+The file \texttt{script.py} must call \texttt{yt.enable\_parallelism()} to
+turn on yt's parallelism. If this doesn't happen, all cores will execute the
+same serial yt script. This command may differ for each system on which you use
+yt; please consult the system documentation for details on how to run parallel
+applications.
-\texttt{from yt.pmods import *} \textemdash\ Load yt faster when in parallel.
-This replaces the usual \texttt{from yt.mods import *}.\\
\texttt{parallel\_objects()} \textemdash\ A way to parallelize analysis over objects
(such as halos or clumps).\\
-\subsection{Pre-Installed Versions}
-\settowidth{\MyLen}{\texttt{multicol}}
-yt is pre-installed on several supercomputer systems.
-
-\textbf{NICS Kraken} \textemdash\ {\it module load yt} \\
-
-
\subsection{Mercurial}
\settowidth{\MyLen}{\texttt{multicol}}
Please see \url{http://mercurial.selenic.com/} for the full Mercurial documentation.
@@ -365,8 +334,7 @@
\subsection{FAQ}
\settowidth{\MyLen}{\texttt{multicol}}
-\texttt{ds.field\_info[`field'].take\_log = False} \textemdash\ When plotting \texttt{field}, do not take log.
-Must enter \texttt{ds.index} before this command. \\
+\texttt{slc.set\_log('field', False)} \textemdash\ When plotting \texttt{field}, use linear scaling instead of log scaling.
%\rule{0.3\linewidth}{0.25pt}
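The new YTArray entries above translate to code along these lines; a sketch, using the IsolatedGalaxy sample dataset as an illustrative input and assuming a 'density' field:

    import yt

    ds = yt.load('IsolatedGalaxy/galaxy0030/galaxy0030')
    a = ds.all_data()['density']    # a YTArray: a numpy array with units attached

    print a.in_cgs()                # the array in CGS units
    print a.in_units('Msun/pc**3')  # the array in solar masses per cubic parsec
    print a.max(), a.min()          # numpy-style reductions work as before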
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/_dq_docstrings.inc
--- a/doc/source/analyzing/_dq_docstrings.inc
+++ b/doc/source/analyzing/_dq_docstrings.inc
@@ -1,43 +1,20 @@
-.. function:: Action(action, combine_action, filter=None):
+.. function:: angular_momentum_vector()
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._Action`.)
- This function evals the string given by the action arg and uses
- the function thrown with the combine_action to combine the values.
- A filter can be thrown to be evaled to short-circuit the calculation
- if some criterion is not met.
- :param action: a string containing the desired action to be evaled.
- :param combine_action: the function used to combine the answers when done lazily.
- :param filter: a string to be evaled to serve as a data filter.
-
-
-
-.. function:: AngularMomentumVector():
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._AngularMomentumVector`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.AngularMomentumVector`.)
This function returns the mass-weighted average angular momentum vector.
+.. function:: bulk_velocity():
-.. function:: BaryonSpinParameter():
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._BaryonSpinParameter`.)
- This function returns the spin parameter for the baryons, but it uses
- the particles in calculating enclosed mass.
-
-
-
-.. function:: BulkVelocity():
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._BulkVelocity`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.BulkVelocity`.)
This function returns the mass-weighted average velocity in the object.
+.. function:: center_of_mass(use_cells=True, use_particles=False):
-.. function:: CenterOfMass(use_cells=True, use_particles=False):
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._CenterOfMass`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.CenterOfMass`.)
This function returns the location of the center
of mass. By default, it is computed from the *non-particle* data in the object.
@@ -51,112 +28,64 @@
-.. function:: Extrema(fields, non_zero=False, filter=None):
+.. function:: extrema(fields, non_zero=False, filter=None):
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._Extrema`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.Extrema`.)
This function returns the extrema of a set of fields
:param fields: A field name, or a list of field names
:param filter: a string to be evaled to serve as a data filter.
+.. function:: max_location(field):
-.. function:: IsBound(truncate=True, include_thermal_energy=False, treecode=True, opening_angle=1.0, periodic_test=False, include_particles=True):
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._IsBound`.)
- This returns whether or not the object is gravitationally bound. If this
- returns a value greater than one, it is bound, and otherwise not.
-
- Parameters
- ----------
- truncate : Bool
- Should the calculation stop once the ratio of
- gravitational:kinetic is 1.0?
- include_thermal_energy : Bool
- Should we add the energy from ThermalEnergy
- on to the kinetic energy to calculate
- binding energy?
- treecode : Bool
- Whether or not to use the treecode.
- opening_angle : Float
- The maximal angle a remote node may subtend in order
- for the treecode method of mass conglomeration may be
- used to calculate the potential between masses.
- periodic_test : Bool
- Used for testing the periodic adjustment machinery
- of this derived quantity.
- include_particles : Bool
- Should we add the mass contribution of particles
- to calculate binding energy?
-
- Examples
- --------
- >>> sp.quantities["IsBound"](truncate=False,
- ... include_thermal_energy=True, treecode=False, opening_angle=2.0)
- 0.32493
-
-
-
-.. function:: MaxLocation(field):
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._MaxLocation`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.max_location`.)
This function returns the location of the maximum of a set
of fields.
+.. function:: min_location(field):
-.. function:: MinLocation(field):
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._MinLocation`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.MinLocation`.)
This function returns the location of the minimum of a set
of fields.
-.. function:: ParticleSpinParameter():
+.. function:: spin_parameter(use_gas=True, use_particles=True):
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._ParticleSpinParameter`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.SpinParameter`.)
This function returns the spin parameter for the baryons, but it uses
the particles in calculating enclosed mass.
+.. function:: total_mass():
-.. function:: StarAngularMomentumVector():
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._StarAngularMomentumVector`.)
- This function returns the mass-weighted average angular momentum vector
- for stars.
-
-
-
-.. function:: TotalMass():
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._TotalMass`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.TotalMass`.)
This function takes no arguments and returns the sum of cell masses and
particle masses in the object.
+.. function:: total_quantity(fields):
-.. function:: TotalQuantity(fields):
-
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._TotalQuantity`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.TotalQuantity`.)
This function sums up a given field over the entire region
:param fields: The fields to sum up
-.. function:: WeightedAverageQuantity(field, weight):
+.. function:: weighted_average_quantity(field, weight):
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._WeightedAverageQuantity`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.WeightedAverageQuantity`.)
This function returns an averaged quantity.
:param field: The field to average
:param weight: The field to weight by
-.. function:: WeightedVariance(field, weight):
+.. function:: weighted_variance(field, weight):
- (This is a proxy for :func:`~yt.data_objects.derived_quantities._WeightedVariance`.)
+ (This is a proxy for :func:`~yt.data_objects.derived_quantities.WeightedVariance`.)
This function returns the variance of a field.
:param field: The target field
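In code, the renamed derived-quantity interface looks like this; a sketch, with an illustrative dataset name:

    import yt

    ds = yt.load('Enzo_64/DD0043/data0043')
    ad = ds.all_data()

    # method-style calls replace the old dictionary-style lookup,
    # e.g. ad.quantities["Extrema"]("density") becomes:
    mi, ma = ad.quantities.extrema('density')
    L = ad.quantities.angular_momentum_vector()
    m = ad.quantities.total_quantity('cell_mass')
    print mi, ma, L, m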
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/analysis_modules/Halo_Analysis.ipynb
--- a/doc/source/analyzing/analysis_modules/Halo_Analysis.ipynb
+++ b/doc/source/analyzing/analysis_modules/Halo_Analysis.ipynb
@@ -1,6 +1,7 @@
{
"metadata": {
- "name": ""
+ "name": "",
+ "signature": "sha256:e792ad188f59161aa3ff4cdbb32cad75142b2e6b4062dfa1d8c12b3172fcf4e9"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -34,7 +35,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "from yt.mods import *\n",
+ "import yt\n",
"from yt.analysis_modules.halo_analysis.api import *\n",
"import tempfile\n",
"import shutil\n",
@@ -44,7 +45,7 @@
"tmpdir = tempfile.mkdtemp()\n",
"\n",
"# Load the data set with the full simulation information\n",
- "data_ds = load('Enzo_64/RD0006/RedshiftOutput0006')"
+ "data_ds = yt.load('Enzo_64/RD0006/RedshiftOutput0006')"
],
"language": "python",
"metadata": {},
@@ -62,7 +63,7 @@
"collapsed": false,
"input": [
"# Load the rockstar data files\n",
- "halos_ds = load('rockstar_halos/halos_0.0.bin')"
+ "halos_ds = yt.load('rockstar_halos/halos_0.0.bin')"
],
"language": "python",
"metadata": {},
@@ -407,4 +408,4 @@
"metadata": {}
}
]
-}
+}
\ No newline at end of file
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/analysis_modules/PPVCube.ipynb
--- a/doc/source/analyzing/analysis_modules/PPVCube.ipynb
+++ b/doc/source/analyzing/analysis_modules/PPVCube.ipynb
@@ -1,7 +1,7 @@
{
"metadata": {
"name": "",
- "signature": "sha256:ba8b6a53571695ae1d0c236ad43875823746e979a329a9d35ab0a8b899cebbba"
+ "signature": "sha256:56a8d72735e3cc428ff04b241d4b2ce6f653019818c6fc7a4148840d99030c85"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -19,8 +19,9 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "%matplotlib inline\n",
- "from yt.mods import *\n",
+ "import yt\n",
+ "import numpy as np\n",
+ "\n",
"from yt.analysis_modules.ppv_cube.api import PPVCube"
],
"language": "python",
@@ -122,7 +123,7 @@
"data[\"velocity_y\"] = (vely, \"km/s\")\n",
"data[\"velocity_z\"] = (np.zeros((nx,ny,nz)), \"km/s\") # zero velocity in the z-direction\n",
"bbox = np.array([[-0.5,0.5],[-0.5,0.5],[-0.5,0.5]]) # bbox of width 1 on a side with center (0,0,0)\n",
- "ds = load_uniform_grid(data, (nx,ny,nz), length_unit=(2*R,\"kpc\"), nprocs=1, bbox=bbox)"
+ "ds = yt.load_uniform_grid(data, (nx,ny,nz), length_unit=(2*R,\"kpc\"), nprocs=1, bbox=bbox)"
],
"language": "python",
"metadata": {},
@@ -139,7 +140,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "slc = SlicePlot(ds, \"z\", [\"density\",\"velocity_x\",\"velocity_y\",\"velocity_magnitude\"])"
+ "slc = yt.SlicePlot(ds, \"z\", [\"density\",\"velocity_x\",\"velocity_y\",\"velocity_magnitude\"])"
],
"language": "python",
"metadata": {},
@@ -222,7 +223,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "ds = load(\"cube.fits\")"
+ "ds = yt.load(\"cube.fits\")"
],
"language": "python",
"metadata": {},
@@ -233,7 +234,7 @@
"collapsed": false,
"input": [
"# Specifying no center gives us the center slice\n",
- "slc = SlicePlot(ds, \"z\", [\"density\"])\n",
+ "slc = yt.SlicePlot(ds, \"z\", [\"density\"])\n",
"slc.show()"
],
"language": "python",
@@ -248,7 +249,7 @@
"# Picking different velocities for the slices\n",
"new_center = ds.domain_center\n",
"new_center[2] = ds.spec2pixel(-1.0*u.km/u.s)\n",
- "slc = SlicePlot(ds, \"z\", [\"density\"], center=new_center)\n",
+ "slc = yt.SlicePlot(ds, \"z\", [\"density\"], center=new_center)\n",
"slc.show()"
],
"language": "python",
@@ -260,7 +261,7 @@
"collapsed": false,
"input": [
"new_center[2] = ds.spec2pixel(0.7*u.km/u.s)\n",
- "slc = SlicePlot(ds, \"z\", [\"density\"], center=new_center)\n",
+ "slc = yt.SlicePlot(ds, \"z\", [\"density\"], center=new_center)\n",
"slc.show()"
],
"language": "python",
@@ -272,7 +273,7 @@
"collapsed": false,
"input": [
"new_center[2] = ds.spec2pixel(-0.3*u.km/u.s)\n",
- "slc = SlicePlot(ds, \"z\", [\"density\"], center=new_center)\n",
+ "slc = yt.SlicePlot(ds, \"z\", [\"density\"], center=new_center)\n",
"slc.show()"
],
"language": "python",
@@ -290,7 +291,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "prj = ProjectionPlot(ds, \"z\", [\"density\"], proj_style=\"sum\")\n",
+ "prj = yt.ProjectionPlot(ds, \"z\", [\"density\"], proj_style=\"sum\")\n",
"prj.set_log(\"density\", True)\n",
"prj.set_zlim(\"density\", 1.0e-3, 0.2)\n",
"prj.show()"
@@ -303,4 +304,4 @@
"metadata": {}
}
]
-}
+}
\ No newline at end of file
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/analysis_modules/Particle_Trajectories.ipynb
--- a/doc/source/analyzing/analysis_modules/Particle_Trajectories.ipynb
+++ b/doc/source/analyzing/analysis_modules/Particle_Trajectories.ipynb
@@ -1,7 +1,7 @@
{
"metadata": {
"name": "",
- "signature": "sha256:e4b5ea69687eb79452c16385b3a6f795b4572518dfa7f9d8a8125bd75b5fea85"
+ "signature": "sha256:5ab80c6b33a115cb88c36fde8659434d14a852dd43b0b419f2bb0c04acf66278"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -20,7 +20,7 @@
"collapsed": false,
"input": [
"%matplotlib inline\n",
- "from yt.mods import *\n",
+ "import yt\n",
"import glob\n",
"from yt.analysis_modules.particle_trajectories.api import ParticleTrajectories\n",
"from yt.config import ytcfg\n",
@@ -77,7 +77,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "ds = load(my_fns[0])\n",
+ "ds = yt.load(my_fns[0])\n",
"dd = ds.all_data()\n",
"indices = dd[\"particle_index\"].astype(\"int\")\n",
"print indices"
@@ -205,8 +205,8 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "ds = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
- "slc = SlicePlot(ds, \"x\", [\"density\",\"dark_matter_density\"], center=\"max\", width=(3.0, \"Mpc\"))\n",
+ "ds = yt.load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
+ "slc = yt.SlicePlot(ds, \"x\", [\"density\",\"dark_matter_density\"], center=\"max\", width=(3.0, \"Mpc\"))\n",
"slc.show()"
],
"language": "python",
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/analysis_modules/SZ_projections.ipynb
--- a/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
+++ b/doc/source/analyzing/analysis_modules/SZ_projections.ipynb
@@ -1,7 +1,7 @@
{
"metadata": {
"name": "",
- "signature": "sha256:4745a15abb6512547b50280b92c22567f89255189fd968ca706ef7c39d48024f"
+ "signature": "sha256:e4db171b795d155870280ddbe8986f55f9a94ffb10783abf9d4cc2de3ec24894"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -89,11 +89,10 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "%matplotlib inline\n",
- "from yt.mods import *\n",
+ "import yt\n",
"from yt.analysis_modules.sunyaev_zeldovich.api import SZProjection\n",
"\n",
- "ds = load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
+ "ds = yt.load(\"enzo_tiny_cosmology/DD0046/DD0046\")\n",
"\n",
"freqs = [90.,180.,240.]\n",
"szprj = SZProjection(ds, freqs)"
@@ -218,14 +217,6 @@
"including coordinate information in kpc. The optional keyword\n",
"`clobber` allows a previous file to be overwritten. \n"
]
- },
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [],
- "language": "python",
- "metadata": {},
- "outputs": []
}
],
"metadata": {}
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/analysis_modules/clump_finding.rst
--- a/doc/source/analyzing/analysis_modules/clump_finding.rst
+++ b/doc/source/analyzing/analysis_modules/clump_finding.rst
@@ -87,7 +87,7 @@
ds = load("DD0000")
sp = ds.sphere([0.5, 0.5, 0.5], radius=0.1)
- ratio = sp.quantities["IsBound"](truncate=False, include_thermal_energy=True,
+ ratio = sp.quantities.is_bound(truncate=False, include_thermal_energy=True,
treecode=True, opening_angle=2.0)
This example will accomplish the same as the above, but will use the full
@@ -100,7 +100,7 @@
ds = load("DD0000")
sp = ds.sphere([0.5, 0.5, 0.5], radius=0.1)
- ratio = sp.quantities["IsBound"](truncate=False, include_thermal_energy=True,
+ ratio = sp.quantities.is_bound(truncate=False, include_thermal_energy=True,
treecode=False)
Here the treecode method is specified for clump finding (this is default).
@@ -109,7 +109,7 @@
.. code-block:: python
- function_name = 'self.data.quantities["IsBound"](truncate=True, \
+ function_name = 'self.data.quantities.is_bound(truncate=True, \
include_thermal_energy=True, treecode=True, opening_angle=2.0) > 1.0'
master_clump = amods.level_sets.Clump(data_source, None, field,
function=function_name)
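Assembled into a runnable form, the updated call is (a sketch; the bare `load` in the snippet above is written as `yt.load` here, matching the import convention used throughout the rest of this merge):

    import yt

    ds = yt.load("DD0000")
    sp = ds.sphere([0.5, 0.5, 0.5], radius=0.1)

    # a value greater than one means the sphere is gravitationally bound
    ratio = sp.quantities.is_bound(truncate=False, include_thermal_energy=True,
                                   treecode=True, opening_angle=2.0)
    print ratio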
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/analysis_modules/index.rst
--- a/doc/source/analyzing/analysis_modules/index.rst
+++ b/doc/source/analyzing/analysis_modules/index.rst
@@ -2,10 +2,9 @@
================
These are "canned" analysis modules that can operate on datasets, performing a
-sequence of operations that result in a final result.
-
-Astrophysics Analysis Modules
------------------------------
+sequence of operations that produce a final result. This functionality
+interoperates with yt, but one needs to import the functions associated
+with each specific analysis module into Python before using them.
.. toctree::
:maxdepth: 2
@@ -13,13 +12,6 @@
halo_analysis
synthetic_observation
exporting
-
-General Analysis Modules
-------------------------
-
-.. toctree::
- :maxdepth: 1
-
two_point_functions
clump_finding
particle_trajectories
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/objects.rst
--- a/doc/source/analyzing/objects.rst
+++ b/doc/source/analyzing/objects.rst
@@ -31,35 +31,15 @@
derived fields. If it finds nothing there, it then defaults to examining the
global set of derived fields.
-To add a field to the list of fields that you know should exist in a particular
-frontend, call the function ``add_frontend_field`` where you replace
-``frontend`` with the name of the frontend. Below is an example for adding
-``Cooling_Time`` to Enzo:
+To add a derived field, which need not exist on disk, use
+the standard construction:
.. code-block:: python
- add_enzo_field("Cooling_Time", units=r"\rm{s}",
- function=NullFunc,
- validators=ValidateDataField("Cooling_Time"))
+ add_field("specific_thermal_energy", function=_specific_thermal_energy,
+ units="ergs/g")
-Note that we used the ``NullFunc`` function here. To add a derived field,
-which is not expected to necessarily exist on disk, use the standard
-construction:
-
-.. code-block:: python
-
- add_field("thermal_energy", function=_ThermalEnergy,
- units=r"\rm{ergs}/\rm{g}")
-
-To add a translation from one field to another, use the ``TranslationFunc`` as
-the function for reading the field. For instance, this code appears in the Nyx
-frontend:
-
-.. code-block:: python
-
- add_field("density", function=TranslationFunc("density"), take_log=True,
- units=r"\rm{g} / \rm{cm}^3",
- projected_units =r"\rm{g} / \rm{cm}^2")
+where ``_specific_thermal_energy`` is a Python function that defines the field.
.. _accessing-fields:
@@ -105,7 +85,7 @@
.. code-block:: python
- ds = load("my_data")
+ ds = yt.load("my_data")
print ds.field_list
print ds.derived_field_list
@@ -115,7 +95,7 @@
.. code-block:: python
- ds = load("my_data")
+ ds = yt.load("my_data")
print ds.field_info["pressure"].get_units()
This is a fast way to examine the units of a given field, and additionally you
@@ -141,8 +121,8 @@
.. code-block:: python
- from yt.mods import *
- ds = load("RedshiftOutput0005")
+ import yt
+ ds = yt.load("RedshiftOutput0005")
reg = ds.region([0.5, 0.5, 0.5], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0])
.. include:: _obj_docstrings.inc
@@ -199,7 +179,7 @@
ds = load("my_data")
dd = ds.all_data()
- dd.quantities["AngularMomentumVector"]()
+ dd.quantities.angular_momentum_vector()
The following quantities are available via the ``quantities`` interface.
@@ -246,8 +226,8 @@
.. notebook-cell::
- from yt.mods import *
- ds = load("enzo_tiny_cosmology/DD0046/DD0046")
+ import yt
+ ds = yt.load("enzo_tiny_cosmology/DD0046/DD0046")
ad = ds.all_data()
total_mass = ad.quantities.total_quantity('cell_mass')
# now select only gas with 1e5 K < T < 1e7 K.
@@ -268,12 +248,12 @@
.. python-script::
- from yt.mods import *
- ds = load("enzo_tiny_cosmology/DD0046/DD0046")
+ import yt
+ ds = yt.load("enzo_tiny_cosmology/DD0046/DD0046")
ad = ds.all_data()
new_region = ad.cut_region(['obj["density"] > 1e-29'])
- plot = ProjectionPlot(ds, "x", "density", weight_field="density",
- data_source=new_region)
+ plot = yt.ProjectionPlot(ds, "x", "density", weight_field="density",
+ data_source=new_region)
plot.save()
.. _extracting-connected-sets:
@@ -311,10 +291,6 @@
Extracting Isocontour Information
---------------------------------
-.. versionadded:: 2.3
-
-.. warning::
- This is still beta!
``yt`` contains an implementation of the `Marching Cubes
<http://en.wikipedia.org/wiki/Marching_cubes>`_ algorithm, which can operate on
@@ -378,8 +354,8 @@
.. code-block:: python
- from yt.mods import *
- ds = load("my_data")
+ import yt
+ ds = yt.load("my_data")
sp = ds.sphere([0.5, 0.5, 0.5], 10.0/ds['kpc'])
ds.save_object(sp, "sphere_to_analyze_later")
@@ -390,9 +366,9 @@
.. code-block:: python
- from yt.mods import *
+ import yt
- ds = load("my_data")
+ ds = yt.load("my_data")
sphere_to_analyze = ds.load_object("sphere_to_analyze_later")
Additionally, if we want to store the object independent of the ``.yt`` file,
@@ -400,9 +376,9 @@
.. code-block:: python
- from yt.mods import *
+ import yt
- ds = load("my_data")
+ ds = yt.load("my_data")
sp = ds.sphere([0.5, 0.5, 0.5], 10.0/ds['kpc'])
sp.save_object("my_sphere", "my_storage_file.cpkl")
@@ -416,10 +392,10 @@
.. code-block:: python
- from yt.mods import *
+ import yt
import shelve
- ds = load("my_data") # not necessary if storeparameterfiles is on
+ ds = yt.load("my_data") # not necessary if storeparameterfiles is on
obj_file = shelve.open("my_storage_file.cpkl")
ds, obj = obj_file["my_sphere"]
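The ``add_field`` construction above, assembled into a self-contained sketch; the field definition is the metal_mass example from the cheatsheet in this merge, and it assumes the dataset provides metallicity and cell_mass fields:

    import yt

    def _metal_mass(field, data):
        # derived field: metal mass per cell
        return data["metallicity"] * data["cell_mass"]

    yt.add_field("metal_mass", units="g", function=_metal_mass)

    ds = yt.load("my_data")
    print ds.all_data()["metal_mass"]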
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/time_series_analysis.rst
--- a/doc/source/analyzing/time_series_analysis.rst
+++ b/doc/source/analyzing/time_series_analysis.rst
@@ -33,27 +33,23 @@
creating your own, and these operators can be applied either to datasets on the
whole or to subregions of individual datasets.
-The simplest mechanism for creating a ``DatasetSeries`` object is to use the
-class method
-:meth:`~yt.data_objects.time_series.DatasetSeries.from_filenames`. This
-method accepts a list of strings that can be supplied to ``load``. For
-example:
+The simplest mechanism for creating a ``DatasetSeries`` object is to pass a glob
+pattern to the ``yt.load`` function.
.. code-block:: python
- from yt.mods import *
- filenames = ["DD0030/output_0030", "DD0040/output_0040"]
- ts = DatasetSeries.from_filenames(filenames)
+ import yt
+ ts = yt.load("DD????/DD????")
-This will create a new time series, populated with the output files ``DD0030``
-and ``DD0040``. This object, here called ``ts``, can now be analyzed in bulk.
-Alternately, you can specify a pattern that is supplied to :mod:`glob`, and
-those filenames will be sorted and returned. Here is an example:
+This will create a new time series, populated with all datasets that match the
+pattern "DD" followed by four digits. This object, here called ``ts``, can now
+be analyzed in bulk. Alternatively, you can pass an explicit list of
+filenames directly to the ``DatasetSeries`` initializer:
.. code-block:: python
- from yt.mods import *
- ts = DatasetSeries.from_filenames("*/*.index")
+ import yt
+   ts = yt.DatasetSeries(["DD0030/DD0030", "DD0040/DD0040"])
Analyzing Each Dataset In Sequence
----------------------------------
@@ -64,8 +60,8 @@
.. code-block:: python
- from yt.mods import *
- ts = DatasetSeries.from_filenames("*/*.index")
+ import yt
+ ts = yt.load("*/*.index")
for ds in ts:
print ds.current_time
@@ -77,87 +73,6 @@
* The cookbook recipe for :ref:`cookbook-time-series-analysis`
* :class:`~yt.data_objects.time_series.DatasetSeries`
-Prepared Time Series Analysis
------------------------------
-
-A few handy functions for treating time series data as a uniform, single object
-are also available.
-
-.. warning:: The future of these functions is uncertain: they may be removed in
- the future!
-
-Simple Analysis Tasks
-~~~~~~~~~~~~~~~~~~~~~
-
-The available tasks that come built-in can be seen by looking at the output of
-``ts.tasks.keys()``. For instance, one of the simplest ones is the
-``MaxValue`` task. We can execute this task by calling it with the field whose
-maximum value we want to evaluate:
-
-.. code-block:: python
-
- from yt.mods import *
- ts = TimeSeries.from_filenames("*/*.index")
- max_rho = ts.tasks["MaximumValue"]("density")
-
-When we call the task, the time series object executes the task on each
-component dataset. The results are then returned to the user. More
-complex, multi-task evaluations can be conducted by using the
-:meth:`~yt.data_objects.time_series.DatasetSeries.eval` call, which accepts a
-list of analysis tasks.
-
-Analysis Tasks Applied to Objects
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Just as some tasks can be applied to datasets as a whole, one can also apply
-the creation of objects to datasets. This means that you are able to construct
-a generalized "sphere" operator that will be created inside all datasets, which
-you can then calculate derived quantities (see :ref:`derived-quantities`) from.
-
-For instance, imagine that you wanted to create a sphere that is centered on
-the most dense point in the simulation and that is 1 pc in radius, and then
-calculate the angular momentum vector on this sphere. You could do that with
-this script:
-
-.. code-block:: python
-
- from yt.mods import *
- ts = TimeSeries.from_filenames("*/*.index")
- sphere = ts.sphere("max", (1.0, "pc"))
- L_vecs = sphere.quantities["AngularMomentumVector"]()
-
-Note that we have specified the units differently than usual -- the time series
-objects allow units as a tuple, so that in cases where units may change over
-the course of several outputs they are correctly set at all times. This script
-simply sets up the time series object, creates a sphere, and then runs
-quantities on it. It is designed to look very similar to the code that would
-conduct this analysis on a single output.
-
-All of the objects listed in :ref:`available-objects` are made available in
-the same manner as "sphere" was used above.
-
-Creating Analysis Tasks
-~~~~~~~~~~~~~~~~~~~~~~~
-
-If you wanted to look at the mass in star particles as a function of time, you
-would write a function that accepts params and ds and then decorate it with
-analysis_task. Here we have done so:
-
-.. code-block:: python
-
- @analysis_task(('particle_type',))
- def MassInParticleType(params, ds):
- dd = ds.all_data()
- ptype = (dd["particle_type"] == params.particle_type)
- return (ptype.sum(), dd["ParticleMassMsun"][ptype].sum())
-
- ms = ts.tasks["MassInParticleType"](4)
- print ms
-
-This allows you to create your own analysis tasks that will be then available
-to time series data objects. Since ``DatasetSeries`` objects iterate over
-filenames in parallel by default, this allows for transparent parallelization.
-
.. _analyzing-an-entire-simulation:
Analyzing an Entire Simulation
@@ -175,9 +90,9 @@
.. code-block:: python
- from yt.mods import *
- my_sim = simulation('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo',
- find_outputs=False)
+ import yt
+ my_sim = yt.simulation('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo',
+ find_outputs=False)
Then, create a ``DatasetSeries`` object with the :meth:`get_time_series`
function. With no additional keywords, the time series will include every
@@ -198,7 +113,7 @@
    for ds in my_sim.piter():
all_data = ds.all_data()
- print all_data.quantities['Extrema']('density')
+ print all_data.quantities.extrema('density')
Additional keywords can be given to :meth:`get_time_series` to select a subset
of the total data:
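Assembling the hunks above into one sketch of the whole-simulation workflow (the parameter file name is the one used in these docs; ``get_time_series`` is called with no keywords here, which selects every dataset):

    import yt

    my_sim = yt.simulation('enzo_tiny_cosmology/32Mpc_32.enzo', 'Enzo',
                           find_outputs=False)
    my_sim.get_time_series()   # no keywords: include every dataset

    for ds in my_sim.piter():
        all_data = ds.all_data()
        print all_data.quantities.extrema('density')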
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/units/2)_Data_Selection_and_fields.ipynb
--- a/doc/source/analyzing/units/2)_Data_Selection_and_fields.ipynb
+++ b/doc/source/analyzing/units/2)_Data_Selection_and_fields.ipynb
@@ -1,7 +1,7 @@
{
"metadata": {
"name": "",
- "signature": "sha256:882b31591c60bfe6ad4cb0f8842953d2e94fb8a12ce742be831a65642eea72c9"
+ "signature": "sha256:2faff88abc93fe2bc9d91467db786a8b69ec3ece6783a7055942ecc7c47a0817"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -34,8 +34,8 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "from yt.mods import *\n",
- "ds = load('IsolatedGalaxy/galaxy0030/galaxy0030')\n",
+ "import yt\n",
+ "ds = yt.load('IsolatedGalaxy/galaxy0030/galaxy0030')\n",
" \n",
"dd = ds.all_data()\n",
"maxval, maxloc = ds.find_max('density')\n",
@@ -324,6 +324,8 @@
"collapsed": false,
"input": [
"from astropy import units as u\n",
+ "from yt import YTQuantity, YTArray\n",
+ "\n",
"x = 42.0 * u.meter\n",
"y = YTQuantity.from_astropy(x) "
],
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/units/3)_Comoving_units_and_code_units.ipynb
--- a/doc/source/analyzing/units/3)_Comoving_units_and_code_units.ipynb
+++ b/doc/source/analyzing/units/3)_Comoving_units_and_code_units.ipynb
@@ -1,7 +1,7 @@
{
"metadata": {
"name": "",
- "signature": "sha256:242d7005d45a82744713bfe6389e49d47f39b524d1e7fcbf5ceb2e65dc473e68"
+ "signature": "sha256:8ba193cc3867e2185133bbf3952bd5834e6c63993208635c71cf55fa6f27b491"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -34,8 +34,8 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "from yt.mods import *\n",
- "ds = load('Enzo_64/DD0043/data0043')"
+ "import yt\n",
+ "ds = yt.load('Enzo_64/DD0043/data0043')"
],
"language": "python",
"metadata": {},
@@ -208,7 +208,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "slc = SlicePlot(ds, 0, 'density', width=(128, 'Mpccm/h'))\n",
+ "slc = yt.SlicePlot(ds, 0, 'density', width=(128, 'Mpccm/h'))\n",
"slc.set_figure_size(6)"
],
"language": "python",
@@ -234,6 +234,8 @@
"cell_type": "code",
"collapsed": false,
"input": [
+ "from yt import YTQuantity\n",
+ "\n",
"a = YTQuantity(3, 'cm')\n",
"\n",
"print a.units.registry.keys()"
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/units/4)_Comparing_units_from_different_datasets.ipynb
--- a/doc/source/analyzing/units/4)_Comparing_units_from_different_datasets.ipynb
+++ b/doc/source/analyzing/units/4)_Comparing_units_from_different_datasets.ipynb
@@ -1,7 +1,7 @@
{
"metadata": {
"name": "",
- "signature": "sha256:448380e74a746d19dc1eecfe222c0e798a87a4ac285e4f50e2598316086c5ee8"
+ "signature": "sha256:273a23e3a20b277a9e5ea7117b48cf19013c331d0893e6e9d21896e97f59aceb"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -22,9 +22,9 @@
"collapsed": false,
"input": [
"# A high redshift output from z ~ 8\n",
- "from yt.mods import *\n",
+ "import yt\n",
"\n",
- "ds1 = load('Enzo_64/DD0002/data0002')\n",
+ "ds1 = yt.load('Enzo_64/DD0002/data0002')\n",
"print \"z = %s\" % ds1.current_redshift\n",
"print \"Internal length units = %s\" % ds1.length_unit\n",
"print \"Internal length units in cgs = %s\" % ds1.length_unit.in_cgs()"
@@ -38,7 +38,7 @@
"collapsed": false,
"input": [
"# A low redshift output from z ~ 0\n",
- "ds2 = load('Enzo_64/DD0043/data0043')\n",
+ "ds2 = yt.load('Enzo_64/DD0043/data0043')\n",
"print \"z = %s\" % ds2.current_redshift\n",
"print \"Internal length units = %s\" % ds2.length_unit\n",
"print \"Internal length units in cgs = %s\" % ds2.length_unit.in_cgs()"
@@ -94,9 +94,10 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "from yt.mods import *\n",
+ "import yt\n",
+ "yt.enable_parallelism()\n",
"\n",
- "ts = DatasetSeries.from_filenames(\"Enzo_64/DD????/data????\")\n",
+ "ts = yt.load(\"Enzo_64/DD????/data????\")\n",
"\n",
"storage = {}\n",
"\n",
@@ -104,7 +105,7 @@
" sto.result_id = ds.current_time\n",
" sto.result = ds.length_unit\n",
"\n",
- "if is_root():\n",
+ "if yt.is_root():\n",
" for t in sorted(storage.keys()):\n",
" print t.in_units('Gyr'), storage[t].in_units('Mpc')"
],
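The storage-dict pattern from this notebook, written out in full; a sketch, since the piter call itself falls outside the excerpt and is assumed here to be ``ts.piter(storage=storage)``:

    import yt
    yt.enable_parallelism()

    ts = yt.load("Enzo_64/DD????/data????")
    storage = {}

    # each processor handles a subset of the datasets; results are gathered
    # into `storage`, keyed by whatever is assigned to result_id
    for sto, ds in ts.piter(storage=storage):
        sto.result_id = ds.current_time
        sto.result = ds.length_unit

    if yt.is_root():
        for t in sorted(storage.keys()):
            print t.in_units('Gyr'), storage[t].in_units('Mpc')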
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/analyzing/units/5)_Units_and_plotting.ipynb
--- a/doc/source/analyzing/units/5)_Units_and_plotting.ipynb
+++ b/doc/source/analyzing/units/5)_Units_and_plotting.ipynb
@@ -1,7 +1,7 @@
{
"metadata": {
"name": "",
- "signature": "sha256:981baca6958c75f0d84bbc24be7d2b75af5957d36aa3eb4ba725d9e47a85f80d"
+ "signature": "sha256:3deac8455c3bbd85e3cefc0f8905be509fba0050f67f69a7faed0505b4d8dbad"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -28,9 +28,9 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "from yt.mods import *\n",
- "ds = load('IsolatedGalaxy/galaxy0030/galaxy0030')\n",
- "slc = SlicePlot(ds, 2, 'density', center=[0.5, 0.5, 0.5], width=(15, 'kpc'))\n",
+ "import yt\n",
+ "ds = yt.load('IsolatedGalaxy/galaxy0030/galaxy0030')\n",
+ "slc = yt.SlicePlot(ds, 2, 'density', center=[0.5, 0.5, 0.5], width=(15, 'kpc'))\n",
"slc.set_figure_size(6)"
],
"language": "python",
@@ -107,7 +107,7 @@
"collapsed": false,
"input": [
"dd = ds.all_data()\n",
- "plot = ProfilePlot(dd, 'density', 'temperature', weight_field='cell_mass')\n",
+ "plot = yt.ProfilePlot(dd, 'density', 'temperature', weight_field='cell_mass')\n",
"plot.show()"
],
"language": "python",
@@ -142,7 +142,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "plot = PhasePlot(dd, 'density', 'temperature', 'cell_mass')\n",
+ "plot = yt.PhasePlot(dd, 'density', 'temperature', 'cell_mass')\n",
"plot.set_figure_size(6)"
],
"language": "python",
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/bootcamp/5)_Derived_Fields_and_Profiles.ipynb
--- a/doc/source/bootcamp/5)_Derived_Fields_and_Profiles.ipynb
+++ b/doc/source/bootcamp/5)_Derived_Fields_and_Profiles.ipynb
@@ -230,14 +230,14 @@
"collapsed": false,
"input": [
"sp_small = ds.sphere(\"max\", (50.0, 'kpc'))\n",
- "bv = sp_small.quantities[\"BulkVelocity\"]()\n",
+ "bv = sp_small.quantities.bulk_velocity()\n",
"\n",
"sp = ds.sphere(\"max\", (0.1, 'Mpc'))\n",
- "rv1 = sp.quantities[\"Extrema\"](\"radial_velocity\")\n",
+ "rv1 = sp.quantities.extrema(\"radial_velocity\")\n",
"\n",
"sp.clear_data()\n",
"sp.set_field_parameter(\"bulk_velocity\", bv)\n",
- "rv2 = sp.quantities[\"Extrema\"](\"radial_velocity\")\n",
+ "rv2 = sp.quantities.extrema(\"radial_velocity\")\n",
"\n",
"print bv\n",
"print rv1\n",
@@ -251,4 +251,4 @@
"metadata": {}
}
]
-}
\ No newline at end of file
+}
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/cookbook/amrkdtree_downsampling.py
--- a/doc/source/cookbook/amrkdtree_downsampling.py
+++ b/doc/source/cookbook/amrkdtree_downsampling.py
@@ -31,9 +31,9 @@
# This rendering is okay, but lets say I'd like to improve it, and I don't want
# to spend the time rendering the high resolution data. What we can do is
# generate a low resolution version of the AMRKDTree and pass that in to the
-# camera. We do this by specifying a maximum refinement level of 3.
+# camera. We do this by specifying a maximum refinement level of 6.
-kd_low_res = AMRKDTree(ds, max_level=3)
+kd_low_res = AMRKDTree(ds, max_level=6)
print kd_low_res.count_volume()
print kd_low_res.count_cells()
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/cookbook/amrkdtree_to_uniformgrid.py
--- a/doc/source/cookbook/amrkdtree_to_uniformgrid.py
+++ /dev/null
@@ -1,33 +0,0 @@
-import numpy as np
-import yt
-
-#This is an example of how to map an amr data set
-#to a uniform grid. In this case the highest
-#level of refinement is mapped into a 1024x1024x1024 cube
-
-#first the amr data is loaded
-ds = yt.load("~/pfs/galaxy/new_tests/feedback_8bz/DD0021/DD0021")
-
-#next we get the maxium refinement level
-lmax = ds.parameters['MaximumRefinementLevel']
-
-#calculate the center of the domain
-domain_center = (ds.domain_right_edge - ds.domain_left_edge)/2
-
-#determine the cellsize in the highest refinement level
-cell_size = ds.domain_width/(ds.domain_dimensions*2**lmax)
-
-#calculate the left edge of the new grid
-left_edge = domain_center - 512*cell_size
-
-#the number of cells per side of the new grid
-ncells = 1024
-
-#ask yt for the specified covering grid
-cgrid = ds.covering_grid(lmax, left_edge, np.array([ncells,]*3))
-
-#get a map of the density into the new grid
-density_map = cgrid["density"].astype(dtype="float32")
-
-#save the file as a numpy array for convenient future processing
-np.save("/pfs/goldbaum/galaxy/new_tests/feedback_8bz/gas_density_DD0021_log_densities.npy", density_map)
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/cookbook/constructing_data_objects.rst
--- a/doc/source/cookbook/constructing_data_objects.rst
+++ b/doc/source/cookbook/constructing_data_objects.rst
@@ -25,6 +25,8 @@
.. yt_cookbook:: find_clumps.py
+.. _extract_frb:
+
Extracting Fixed Resolution Data
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/cookbook/cosmological_analysis.rst
--- a/doc/source/cookbook/cosmological_analysis.rst
+++ b/doc/source/cookbook/cosmological_analysis.rst
@@ -21,14 +21,6 @@
.. yt_cookbook:: halo_profiler.py
-Halo Tracking Across Timesteps
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-This script demonstrates tracking a halo across multiple timesteps
-in a TimeSeries object, as well as some handy functions for looking
-at the properties of that halo over time.
-
-.. yt_cookbook:: halo_merger_tree.py
-
.. _cookbook-light_cone:
Light Cone Projection
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/cookbook/custom_colorbar_tickmarks.ipynb
--- a/doc/source/cookbook/custom_colorbar_tickmarks.ipynb
+++ b/doc/source/cookbook/custom_colorbar_tickmarks.ipynb
@@ -1,6 +1,7 @@
{
"metadata": {
- "name": ""
+ "name": "",
+ "signature": "sha256:e8fd07931e339dc67b9d84b0fbc6abc84d3957d885544c24da7aa550f9427a1f"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -11,8 +12,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "%matplotlib inline\n",
- "from yt.mods import *"
+ "import yt"
],
"language": "python",
"metadata": {},
@@ -22,8 +22,8 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "ds = load('IsolatedGalaxy/galaxy0030/galaxy0030')\n",
- "slc = SlicePlot(ds, 'x', 'density')\n",
+ "ds = yt.load('IsolatedGalaxy/galaxy0030/galaxy0030')\n",
+ "slc = yt.SlicePlot(ds, 'x', 'density')\n",
"slc"
],
"language": "python",
@@ -87,4 +87,4 @@
"metadata": {}
}
]
-}
+}
\ No newline at end of file
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/cookbook/embedded_javascript_animation.ipynb
--- a/doc/source/cookbook/embedded_javascript_animation.ipynb
+++ b/doc/source/cookbook/embedded_javascript_animation.ipynb
@@ -1,7 +1,7 @@
{
"metadata": {
"name": "",
- "signature": "sha256:4f7d409d15ecc538096d15212923312e2cb4a911ebf5a9cf7edc9bd63a8335e9"
+ "signature": "sha256:bed79f0227742715a8753a98f2ad54175767a7c9ded19b14976ee6c8ff255f04"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -23,7 +23,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "from yt.mods import *\n",
+ "import yt\n",
"from JSAnimation import IPython_display\n",
"from matplotlib import animation"
],
@@ -47,14 +47,14 @@
"import matplotlib.pyplot as plt\n",
"from matplotlib.backends.backend_agg import FigureCanvasAgg\n",
"\n",
- "prj = ProjectionPlot(load('Enzo_64/DD0000/data0000'), 0, 'density', weight_field='density',width=(180,'Mpccm'))\n",
+ "prj = yt.ProjectionPlot(yt.load('Enzo_64/DD0000/data0000'), 0, 'density', weight_field='density',width=(180,'Mpccm'))\n",
"prj.set_figure_size(5)\n",
"prj.set_zlim('density',1e-32,1e-26)\n",
"fig = prj.plots['density'].figure\n",
"\n",
"# animation function. This is called sequentially\n",
"def animate(i):\n",
- " ds = load('Enzo_64/DD%04i/data%04i' % (i,i))\n",
+ " ds = yt.load('Enzo_64/DD%04i/data%04i' % (i,i))\n",
" prj._switch_ds(ds)\n",
"\n",
"# call the animator. blit=True means only re-draw the parts that have changed.\n",
@@ -68,4 +68,4 @@
"metadata": {}
}
]
-}
+}
\ No newline at end of file
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/cookbook/embedded_webm_animation.ipynb
--- a/doc/source/cookbook/embedded_webm_animation.ipynb
+++ b/doc/source/cookbook/embedded_webm_animation.ipynb
@@ -1,7 +1,7 @@
{
"metadata": {
"name": "",
- "signature": "sha256:0090176ae6299b2310bf613404cbfbb42a54e19a03d1469d1429a01170a63aa0"
+ "signature": "sha256:b400f12ff9e27ff6a3ddd13f2f8fc3f88bd857fa6083fad6808f00d771312db7"
},
"nbformat": 3,
"nbformat_minor": 0,
@@ -21,7 +21,7 @@
"cell_type": "code",
"collapsed": false,
"input": [
- "from yt.mods import *\n",
+ "import yt\n",
"from matplotlib import animation"
],
"language": "python",
@@ -96,13 +96,13 @@
"import matplotlib.pyplot as plt\n",
"from matplotlib.backends.backend_agg import FigureCanvasAgg\n",
"\n",
- "prj = ProjectionPlot(load('Enzo_64/DD0000/data0000'), 0, 'density', weight_field='density',width=(180,'Mpccm'))\n",
+ "prj = yt.ProjectionPlot(yt.load('Enzo_64/DD0000/data0000'), 0, 'density', weight_field='density',width=(180,'Mpccm'))\n",
"prj.set_zlim('density',1e-32,1e-26)\n",
"fig = prj.plots['density'].figure\n",
"\n",
"# animation function. This is called sequentially\n",
"def animate(i):\n",
- " ds = load('Enzo_64/DD%04i/data%04i' % (i,i))\n",
+ " ds = yt.load('Enzo_64/DD%04i/data%04i' % (i,i))\n",
" prj._switch_ds(ds)\n",
"\n",
"# call the animator. blit=True means only re-draw the parts that have changed.\n",
@@ -119,4 +119,4 @@
"metadata": {}
}
]
-}
+}
\ No newline at end of file
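
Both animation notebooks follow the same pattern: build one ProjectionPlot, then let matplotlib's FuncAnimation swap successive datasets into it through the private _switch_ds() hook. A minimal sketch of that loop under the notebooks' own assumptions (Enzo_64 sample data; the frame count below is an arbitrary stand-in):

    import yt
    from matplotlib import animation

    prj = yt.ProjectionPlot(yt.load('Enzo_64/DD0000/data0000'), 0, 'density',
                            weight_field='density', width=(180, 'Mpccm'))
    prj.set_zlim('density', 1e-32, 1e-26)
    fig = prj.plots['density'].figure

    # called once per frame: load output i and swap it into the existing plot
    def animate(i):
        prj._switch_ds(yt.load('Enzo_64/DD%04i/data%04i' % (i, i)))

    # blit=True means only re-draw the parts that have changed
    anim = animation.FuncAnimation(fig, animate, frames=10, blit=True)
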
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/cookbook/ffmpeg_volume_rendering.py
--- a/doc/source/cookbook/ffmpeg_volume_rendering.py
+++ /dev/null
@@ -1,99 +0,0 @@
-#This is an example of how to make videos of
-#uniform grid data using Theia and ffmpeg
-
-#The Scene object to hold the ray caster and view camera
-from yt.visualization.volume_rendering.theia.scene import TheiaScene
-
-#GPU based raycasting algorithm to use
-from yt.visualization.volume_rendering.theia.algorithms.front_to_back import FrontToBackRaycaster
-
-#These will be used to define how to color the data
-from yt.visualization.volume_rendering.transfer_functions import ColorTransferFunction
-from yt.visualization.color_maps import *
-
-#This will be used to launch ffmpeg
-import subprocess as sp
-
-#Of course we need numpy for math magic
-import numpy as np
-
-#Opacity scaling function
-def scale_func(v, mi, ma):
- return np.minimum(1.0, (v-mi)/(ma-mi) + 0.0)
-
-#load the uniform grid from a numpy array file
-bolshoi = "/home/bogert/log_densities_1024.npy"
-density_grid = np.load(bolshoi)
-
-#Set the TheiaScene to use the density_grid and
-#setup the raycaster for a resulting 1080p image
-ts = TheiaScene(volume = density_grid, raycaster = FrontToBackRaycaster(size = (1920,1080) ))
-
-#the min and max values in the data to color
-mi, ma = 0.0, 3.6
-
-#setup colortransferfunction
-bins = 5000
-tf = ColorTransferFunction( (mi, ma), bins)
-tf.map_to_colormap(0.5, ma, colormap="spring", scale_func = scale_func)
-
-#pass the transfer function to the ray caster
-ts.source.raycaster.set_transfer(tf)
-
-#Initial configuration for start of video
-#set initial opacity and brightness values
-#then zoom into the center of the data 30%
-ts.source.raycaster.set_opacity(0.03)
-ts.source.raycaster.set_brightness(2.3)
-ts.camera.zoom(30.0)
-
-#path to ffmpeg executable
-FFMPEG_BIN = "/usr/local/bin/ffmpeg"
-
-pipe = sp.Popen([ FFMPEG_BIN,
- '-y', # (optional) overwrite the output file if it already exists
- #This must be set to rawvideo because the image is an array
- '-f', 'rawvideo',
- #The codec must also be rawvideo, since the frames arrive as raw bytes
- '-vcodec','rawvideo',
- #The size of the image array and resulting video
- '-s', '1920x1080',
- #This must be rgba to match array format (uint32)
- '-pix_fmt', 'rgba',
- #frame rate of video
- '-r', '29.97',
- #Indicate that the input to ffmpeg comes from a pipe
- '-i', '-',
- # Tells FFMPEG not to expect any audio
- '-an',
- #Setup video encoder
- #Use any encoder you like that is available from ffmpeg
- '-vcodec', 'libx264', '-preset', 'ultrafast', '-qp', '0',
- '-pix_fmt', 'yuv420p',
- #Name of the output
- 'bolshoiplanck2.mkv' ],
- stdin=sp.PIPE,stdout=sp.PIPE)
-
-
-#Now we loop and produce 500 frames
-for k in range(500):
- #update the scene resulting in a new image
- ts.update()
-
- #get the image array from the ray caster
- array = ts.source.get_results()
-
- #send the image array to ffmpeg
- array.tofile(pipe.stdin)
-
- #rotate the scene by 0.01 rads in x,y & z
- ts.camera.rotateX(0.01)
- ts.camera.rotateZ(0.01)
- ts.camera.rotateY(0.01)
-
- #zoom in 0.01% for a total of a 5% zoom
- ts.camera.zoom(0.01)
-
-
-#Close the pipe to ffmpeg
-pipe.terminate()
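
The removed recipe is tied to the Theia ray caster, but the ffmpeg technique it demonstrated, streaming raw RGBA frames through a pipe, stands on its own. A minimal sketch under stated assumptions (ffmpeg on the PATH, random noise standing in for a real renderer); note that closing stdin and waiting, rather than the recipe's pipe.terminate(), lets ffmpeg flush and finish encoding:

    import subprocess as sp
    import numpy as np

    width, height, nframes = 640, 480, 100  # arbitrary sizes for the sketch

    pipe = sp.Popen(['ffmpeg', '-y',
                     # raw frames from a pipe: format, codec, geometry, pixel layout
                     '-f', 'rawvideo', '-vcodec', 'rawvideo',
                     '-s', '%dx%d' % (width, height),
                     '-pix_fmt', 'rgba', '-r', '29.97',
                     '-i', '-', '-an',
                     # lossless x264 encode of the piped frames
                     '-vcodec', 'libx264', '-preset', 'ultrafast', '-qp', '0',
                     '-pix_fmt', 'yuv420p', 'out.mkv'],
                    stdin=sp.PIPE)

    for k in range(nframes):
        # stand-in renderer: one RGBA frame of uint8 noise
        frame = (np.random.rand(height, width, 4) * 255).astype(np.uint8)
        frame.tofile(pipe.stdin)

    # close the input so ffmpeg can flush and finish, then wait for it
    pipe.stdin.close()
    pipe.wait()
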
diff -r d25f73b9fab97c90cb94cb9f4286ebe6a9bd426e -r e816afd0f0637f5a4eb1136a01309ad4875a97ce doc/source/cookbook/halo_merger_tree.py
--- a/doc/source/cookbook/halo_merger_tree.py
+++ /dev/null
@@ -1,75 +0,0 @@
-### THIS RECIPE IS CURRENTLY BROKEN IN YT-3.0
-### DO NOT TRUST THIS RECIPE UNTIL THIS LINE IS REMOVED
-
-# This script demonstrates some of the halo merger tracking infrastructure,
-# for tracking halos across multiple datadumps in a time series.
-# Ultimately, it outputs an HDF5 file with the important quantities for the
-# top 20 most massive halos in the datadump, and it plots their mass
-# accretion histories as a series of .png images.
-
-# Currently this has only been tested with enzo outputs, but we are looking
-# to generalize this with other codes imminently.
-
-from yt.mods import *
-from yt.analysis_modules.halo_finding.api import *
-from yt.analysis_modules.halo_merger_tree.api import *
-
-# Makes a TimeSeries object from all of whatever files you have
-ts = DatasetSeries.from_filenames("enzo_tiny_cosmology/DD????/DD????")
-
-# For each datadump in our timeseries, run the friends of friends
-# halo finder on it (this has only been tested with FOF currently).
-# Output the information about the halos and the particles comprising each
-# to disk. These files will all be in the FOF subdirectory.
-# This also works with an external FOF program run outside of yt,
-# in which case skip this step and do that yourself.
-
-# ------------------------------------------------------------
-# DEPENDING ON THE SIZE OF YOUR FILES, THIS CAN BE A LONG STEP
-# but because we're writing them out to disk, you only have to do this once.
-# ------------------------------------------------------------
-for ds in ts:
- halo_list = FOFHaloFinder(ds)
- i = int(ds.basename[2:])
- halo_list.write_out("FOF/groups_%05i.txt" % i)
- halo_list.write_particle_lists("FOF/particles_%05i" % i)
-
-# Create a merger tree object. This object is a tuple, where the
-# first part is a dictionary showing the correlation between file
-# output_number and redshift. The second part is a dictionary
-# correlating each halo with its parent halos from the previous timestep
-# (along with number and fraction of particles taken from that parent)
-
-# ------------------------------------------------------------
-# DEPENDING ON THE SIZE OF YOUR FILES, THIS CAN BE A LONG STEP
-# but because we're writing them out to disk, you only have to do this once.
-# ------------------------------------------------------------
-
-# by default, this saves the merger tree to disk in a CPickle:
-# FOF/merger_tree.cpkl so you can use it later.
-# Note that there are a bunch of filters you can place on this, so
-# that you're only building a merger tree for a subset of redshifts
-# or data outputs.
-mt = EnzoFOFMergerTree(external_FOF=False)
-
-# If you want to just use your already generated merger_tree,
-# uncomment the next line.
-# mt = EnzoFOFMergerTree(external_FOF=False, load_saved=True)
-
-# For each of the top 20 most massive halos from the final timestep
-# build its merger history. You can then print this tree to screen,
-# or more usefully, save the lineage of that tree to disk for use
-# later. save_halo_evolution() follows the largest progenitor
-# which contributes the most to the subsequent children.
-for i in range(20):
- mt.build_tree(i)
- mt.print_tree()
- mt.save_halo_evolution('halos.h5')
-
-# For each of the top 20 most massive halos from the final timestep
-# plot its evolution of two quantities. The default is to look at
-# timestep vs mass, but you can look at center_of_mass phase-space
-# coordinates, fraction from progenitor, mass of halo, and more.
-# These are spit out to .png files in the FOF directory.
-for i in range(20):
- plot_halo_evolution('halos.h5', i)
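
The recipe above is removed because its yt-2.x halo merger-tree modules are broken in yt-3.0, but the time-series iteration it was built on remains available. A minimal sketch, assuming the enzo_tiny_cosmology sample data; total_mass() is just an illustrative quantity:

    import yt

    # build a series from every matching output and iterate in time order
    ts = yt.DatasetSeries("enzo_tiny_cosmology/DD????/DD????")
    for ds in ts:
        ad = ds.all_data()
        print ds.basename, ad.quantities.total_mass()
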
This diff is so big that we needed to truncate the remainder.
Repository URL: https://bitbucket.org/yt_analysis/yt/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.