[yt-svn] commit/yt: 4 new changesets

commits-noreply at bitbucket.org
Mon Jul 21 11:42:21 PDT 2014


4 new commits in yt:

https://bitbucket.org/yt_analysis/yt/commits/c5d46078107e/
Changeset:   c5d46078107e
Branch:      yt-3.0
User:        chummels
Date:        2014-07-21 01:19:07
Summary:     Removing halo_merger_tree recipe, since that functionality has not yet been ported to 3.0
Affected #:  2 files

diff -r 43ba03d9098851c7018b796410a73ae60e376d3d -r c5d46078107e219ed9747537fa5e762e626eced8 doc/source/cookbook/cosmological_analysis.rst
--- a/doc/source/cookbook/cosmological_analysis.rst
+++ b/doc/source/cookbook/cosmological_analysis.rst
@@ -21,14 +21,6 @@
 
 .. yt_cookbook:: halo_profiler.py
 
-Halo Tracking Across Timesteps
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-This script demonstrates tracking a halo across multiple timesteps
-in a TimeSeries object, as well as some handy functions for looking
-at the properties of that halo over time.
-
-.. yt_cookbook:: halo_merger_tree.py
-
 .. _cookbook-light_cone:
 
 Light Cone Projection

diff -r 43ba03d9098851c7018b796410a73ae60e376d3d -r c5d46078107e219ed9747537fa5e762e626eced8 doc/source/cookbook/halo_merger_tree.py
--- a/doc/source/cookbook/halo_merger_tree.py
+++ /dev/null
@@ -1,75 +0,0 @@
-### THIS RECIPE IS CURRENTLY BROKEN IN YT-3.0
-### DO NOT TRUST THIS RECIPE UNTIL THIS LINE IS REMOVED
-
-# This script demonstrates some of the halo merger tracking infrastructure,
-# for tracking halos across multiple datadumps in a time series.
-# Ultimately, it outputs an HDF5 file with the important quantities for the
-# top 20 most massive halos in the datadump, and it plots out their mass
-# accretion histories to a series of plots.
-
-# Currently this has only been tested with enzo outputs, but we are looking
-# to generalize this with other codes imminently.
-
-from yt.mods import *
-from yt.analysis_modules.halo_finding.api import *
-from yt.analysis_modules.halo_merger_tree.api import *
-
-# Makes a TimeSeries object from all of whatever files you have
-ts = DatasetSeries.from_filenames("enzo_tiny_cosmology/DD????/DD????")
-
-# For each datadump in our timeseries, run the friends of friends
-# halo finder on it (this has only been tested with FOF currently).
-# Output the information about the halos and the particles comprising each
-# to disk.  These files will all be in the FOF subdirectory.
-# This also works with an external FOF program run outside of yt,
-# in which case skip this step and do that yourself.
-
-# ------------------------------------------------------------
-# DEPENDING ON THE SIZE OF YOUR FILES, THIS CAN BE A LONG STEP 
-# but because we're writing them out to disk, you only have to do this once.
-# ------------------------------------------------------------
-for ds in ts:
-    halo_list = FOFHaloFinder(ds)
-    i = int(ds.basename[2:])
-    halo_list.write_out("FOF/groups_%05i.txt" % i)
-    halo_list.write_particle_lists("FOF/particles_%05i" % i)
-
-# Create a merger tree object.  This object is a tuple, where the
-# first part is a dictionary showing the correlation between file 
-# output_number and redshift.  The second part is a dictionary
-# correlating each halo with it's parent halos from the previous timestep
-# (along with number and fraction of particles taken from that parent)
-
-# ------------------------------------------------------------
-# DEPENDING ON THE SIZE OF YOUR FILES, THIS CAN BE A LONG STEP 
-# but because we're writing them out to disk, you only have to do this once.
-# ------------------------------------------------------------
-
-# by default, this saves the merger tree to disk in a CPickle:
-# FOF/merger_tree.cpkl so you can use it later.
-# Note that there are a bunch of filters you can place on this, so 
-# that you're only building a merger tree for a subset of redshifts
-# or data outputs.
-mt = EnzoFOFMergerTree(external_FOF=False)
-
-# If you want to just use your already generated merger_tree, 
-# uncomment the next line.
-# mt = EnzoFOFMergerTree(external_FOF=False, load_saved=True)
-
-# For each of the top 20 most massive halos from the final timestep
-# build its merger history.  You can then print this tree to screen, 
-# or more usefully, save the lineage of that tree to disk for use 
-# later.  save_halo_evolution() follows the largest progenitor
-# which contributes the most to the subsequent children.
-for i in range(20):
-    mt.build_tree(i)
-    mt.print_tree()
-    mt.save_halo_evolution('halos.h5')
-
-# For each of the top 20 most massive halos from the final timestep
-# plot its evolution of two quantities.  The default is to look at 
-# timestep vs mass, but you can look at center_of_mass phase-space 
-# coordinates, fraction from progenitor, mass of halo, and more.
-# These are spit out to .png files in the FOF directory.
-for i in range(20):
-    plot_halo_evolution('halos.h5', i)
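
The deleted recipe's filename handling can be exercised without yt: it derives each dump's output number by slicing the basename (`ds.basename[2:]`) and formats the FOF output paths with `%05i`. A minimal standalone sketch of that parsing — the `output_number` helper name is hypothetical, not from the recipe; the slicing and format strings are taken directly from it:

```python
# Standalone sketch of the path handling used in the deleted recipe.
# The helper name output_number is hypothetical; the [2:] slice and the
# "%05i" zero-padded formatting come directly from the recipe above.

def output_number(basename):
    """Parse the integer output number from an Enzo dump basename like 'DD0042'."""
    return int(basename[2:])

for name in ("DD0000", "DD0042"):
    i = output_number(name)
    print("FOF/groups_%05i.txt" % i)   # e.g. FOF/groups_00042.txt
    print("FOF/particles_%05i" % i)    # e.g. FOF/particles_00042
```

This mirrors how the recipe mapped each datadump to its per-snapshot halo catalog and particle-list files in the FOF subdirectory.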


https://bitbucket.org/yt_analysis/yt/commits/fed5fe18106e/
Changeset:   fed5fe18106e
Branch:      yt-3.0
User:        chummels
Date:        2014-07-21 01:23:54
Summary:     Adding note to yt Hub page indicating that it is currently offline.
Affected #:  1 file

diff -r c5d46078107e219ed9747537fa5e762e626eced8 -r fed5fe18106e6e2e4831abe169cfa13ea26cd339 doc/source/reference/sharing_data.rst
--- a/doc/source/reference/sharing_data.rst
+++ b/doc/source/reference/sharing_data.rst
@@ -1,6 +1,9 @@
 What is the yt Hub?
 ===================
 
+.. warning:: The yt Hub is currently offline due to some hosting problems.  We
+             hope to have it back up online soon.
+
 The yt data hub is a mechanism by which images, data objects and projects can be
 shared with other people.  For instance, one can upload projections and browse
 them with an interface similar to Google Maps or upload notebooks and view them


https://bitbucket.org/yt_analysis/yt/commits/f33ebcd3c243/
Changeset:   f33ebcd3c243
Branch:      yt-3.0
User:        chummels
Date:        2014-07-21 03:36:11
Summary:     Making analysis modules only 1 level deep in rst file so one can find them from the "analyze" page.
Affected #:  1 file

diff -r fed5fe18106e6e2e4831abe169cfa13ea26cd339 -r f33ebcd3c243764284f6cdfb30721e8ac673cf4e doc/source/analyzing/analysis_modules/index.rst
--- a/doc/source/analyzing/analysis_modules/index.rst
+++ b/doc/source/analyzing/analysis_modules/index.rst
@@ -2,10 +2,9 @@
 ================
 
 These are "canned" analysis modules that can operate on datasets, performing a
-sequence of operations that result in a final result.
-
-Astrophysics Analysis Modules
------------------------------
+sequence of operations that result in a final result.  This functionality 
+interoperates with yt, but one needs to import the functions associated
+with each specific analysis module into python before using them.
 
 .. toctree::
    :maxdepth: 2
@@ -13,13 +12,6 @@
    halo_analysis
    synthetic_observation
    exporting
-
-General Analysis Modules
-------------------------
-
-.. toctree::
-   :maxdepth: 1
-
    two_point_functions
    clump_finding
    particle_trajectories


https://bitbucket.org/yt_analysis/yt/commits/e47aaa3b9769/
Changeset:   e47aaa3b9769
Branch:      yt-3.0
User:        chummels
Date:        2014-07-21 20:42:14
Summary:     Merged in chummels/yt/yt-3.0 (pull request #1045)

A few misc docs changes.
Affected #:  4 files

diff -r 52b3ddbb6a53030a51290fba45acf8deab74910f -r e47aaa3b97697882c6653fadc2619f92d0e1bc36 doc/source/analyzing/analysis_modules/index.rst
--- a/doc/source/analyzing/analysis_modules/index.rst
+++ b/doc/source/analyzing/analysis_modules/index.rst
@@ -2,10 +2,9 @@
 ================
 
 These are "canned" analysis modules that can operate on datasets, performing a
-sequence of operations that result in a final result.
-
-Astrophysics Analysis Modules
------------------------------
+sequence of operations that result in a final result.  This functionality 
+interoperates with yt, but one needs to import the functions associated
+with each specific analysis module into python before using them.
 
 .. toctree::
    :maxdepth: 2
@@ -13,13 +12,6 @@
    halo_analysis
    synthetic_observation
    exporting
-
-General Analysis Modules
-------------------------
-
-.. toctree::
-   :maxdepth: 1
-
    two_point_functions
    clump_finding
    particle_trajectories

diff -r 52b3ddbb6a53030a51290fba45acf8deab74910f -r e47aaa3b97697882c6653fadc2619f92d0e1bc36 doc/source/cookbook/cosmological_analysis.rst
--- a/doc/source/cookbook/cosmological_analysis.rst
+++ b/doc/source/cookbook/cosmological_analysis.rst
@@ -21,14 +21,6 @@
 
 .. yt_cookbook:: halo_profiler.py
 
-Halo Tracking Across Timesteps
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-This script demonstrates tracking a halo across multiple timesteps
-in a TimeSeries object, as well as some handy functions for looking
-at the properties of that halo over time.
-
-.. yt_cookbook:: halo_merger_tree.py
-
 .. _cookbook-light_cone:
 
 Light Cone Projection

diff -r 52b3ddbb6a53030a51290fba45acf8deab74910f -r e47aaa3b97697882c6653fadc2619f92d0e1bc36 doc/source/cookbook/halo_merger_tree.py
--- a/doc/source/cookbook/halo_merger_tree.py
+++ /dev/null
@@ -1,75 +0,0 @@
-### THIS RECIPE IS CURRENTLY BROKEN IN YT-3.0
-### DO NOT TRUST THIS RECIPE UNTIL THIS LINE IS REMOVED
-
-# This script demonstrates some of the halo merger tracking infrastructure,
-# for tracking halos across multiple datadumps in a time series.
-# Ultimately, it outputs an HDF5 file with the important quantities for the
-# top 20 most massive halos in the datadump, and it plots out their mass
-# accretion histories to a series of plots.
-
-# Currently this has only been tested with enzo outputs, but we are looking
-# to generalize this with other codes imminently.
-
-from yt.mods import *
-from yt.analysis_modules.halo_finding.api import *
-from yt.analysis_modules.halo_merger_tree.api import *
-
-# Makes a TimeSeries object from all of whatever files you have
-ts = DatasetSeries.from_filenames("enzo_tiny_cosmology/DD????/DD????")
-
-# For each datadump in our timeseries, run the friends of friends
-# halo finder on it (this has only been tested with FOF currently).
-# Output the information about the halos and the particles comprising each
-# to disk.  These files will all be in the FOF subdirectory.
-# This also works with an external FOF program run outside of yt,
-# in which case skip this step and do that yourself.
-
-# ------------------------------------------------------------
-# DEPENDING ON THE SIZE OF YOUR FILES, THIS CAN BE A LONG STEP 
-# but because we're writing them out to disk, you only have to do this once.
-# ------------------------------------------------------------
-for ds in ts:
-    halo_list = FOFHaloFinder(ds)
-    i = int(ds.basename[2:])
-    halo_list.write_out("FOF/groups_%05i.txt" % i)
-    halo_list.write_particle_lists("FOF/particles_%05i" % i)
-
-# Create a merger tree object.  This object is a tuple, where the
-# first part is a dictionary showing the correlation between file 
-# output_number and redshift.  The second part is a dictionary
-# correlating each halo with it's parent halos from the previous timestep
-# (along with number and fraction of particles taken from that parent)
-
-# ------------------------------------------------------------
-# DEPENDING ON THE SIZE OF YOUR FILES, THIS CAN BE A LONG STEP 
-# but because we're writing them out to disk, you only have to do this once.
-# ------------------------------------------------------------
-
-# by default, this saves the merger tree to disk in a CPickle:
-# FOF/merger_tree.cpkl so you can use it later.
-# Note that there are a bunch of filters you can place on this, so 
-# that you're only building a merger tree for a subset of redshifts
-# or data outputs.
-mt = EnzoFOFMergerTree(external_FOF=False)
-
-# If you want to just use your already generated merger_tree, 
-# uncomment the next line.
-# mt = EnzoFOFMergerTree(external_FOF=False, load_saved=True)
-
-# For each of the top 20 most massive halos from the final timestep
-# build its merger history.  You can then print this tree to screen, 
-# or more usefully, save the lineage of that tree to disk for use 
-# later.  save_halo_evolution() follows the largest progenitor
-# which contributes the most to the subsequent children.
-for i in range(20):
-    mt.build_tree(i)
-    mt.print_tree()
-    mt.save_halo_evolution('halos.h5')
-
-# For each of the top 20 most massive halos from the final timestep
-# plot its evolution of two quantities.  The default is to look at 
-# timestep vs mass, but you can look at center_of_mass phase-space 
-# coordinates, fraction from progenitor, mass of halo, and more.
-# These are spit out to .png files in the FOF directory.
-for i in range(20):
-    plot_halo_evolution('halos.h5', i)

diff -r 52b3ddbb6a53030a51290fba45acf8deab74910f -r e47aaa3b97697882c6653fadc2619f92d0e1bc36 doc/source/reference/sharing_data.rst
--- a/doc/source/reference/sharing_data.rst
+++ b/doc/source/reference/sharing_data.rst
@@ -1,6 +1,9 @@
 What is the yt Hub?
 ===================
 
+.. warning:: The yt Hub is currently offline due to some hosting problems.  We
+             hope to have it back up online soon.
+
 The yt data hub is a mechanism by which images, data objects and projects can be
 shared with other people.  For instance, one can upload projections and browse
 them with an interface similar to Google Maps or upload notebooks and view them

Repository URL: https://bitbucket.org/yt_analysis/yt/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.


