[Yt-svn] commit/yt-doc: 4 new changesets

Bitbucket commits-noreply at bitbucket.org
Thu Mar 10 13:04:25 PST 2011


4 new changesets in yt-doc:

http://bitbucket.org/yt_analysis/yt-doc/changeset/c0ef3ed82539/
changeset:   r39:c0ef3ed82539
user:        MatthewTurk
date:        2011-03-06 19:11:56
summary:     Moving analysis up a line
affected #:  1 file (0 bytes)

--- a/source/advanced/index.rst	Tue Mar 08 10:43:54 2011 -0700
+++ b/source/advanced/index.rst	Sun Mar 06 13:11:56 2011 -0500
@@ -14,5 +14,5 @@
    creating_derived_quantities
    creating_datatypes
    debugdrive
+   external_analysis
    developing
-   external_analysis


http://bitbucket.org/yt_analysis/yt-doc/changeset/87208efcd702/
changeset:   r40:87208efcd702
user:        MatthewTurk
date:        2011-03-09 20:53:03
summary:     Outline of simulated observations
affected #:  1 file (2.2 KB)

--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/source/visualizing/simulated_observations.rst	Wed Mar 09 14:53:03 2011 -0500
@@ -0,0 +1,64 @@
+Generating Simulated Observations
+=================================
+
+yt has several facilities for generating simulated observations.  Each
+comes with caveats, and none should be expected to produce a completely
+finished product.  You should investigate each option carefully and
+determine which, if any, will deliver the type of observation you are
+interested in.
+
+
+
+X-ray Observations
+++++++++++++++++++
+
+Under the assumption of optically thin gas, projections can be made
+using emissivity to generate simulated observations.  yt includes a
+method for handling output from CLOUDY in the ROCO (Smith et al 2008)
+format, and generating integrated emissivity over given energy ranges.
+
+Caveats: The ROCO format for input requires some non-trivial handling
+of CLOUDY output.
+
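The band integration described above can be sketched in plain Python. This is an illustrative trapezoidal integration over a made-up emissivity table; ``integrate_band`` and all numbers below are invented for illustration and are not the yt/ROCO interface or CLOUDY output:

```python
# Toy sketch: integrate a tabulated emissivity over an energy band
# using the trapezoidal rule.  The energies and emissivities are
# invented for illustration; they are NOT CLOUDY/ROCO data.

def integrate_band(energies, emissivities, e_min, e_max):
    """Trapezoidal integral of emissivity over the band [e_min, e_max]."""
    total = 0.0
    for i in range(len(energies) - 1):
        lo, hi = energies[i], energies[i + 1]
        if hi <= e_min or lo >= e_max:
            continue  # bin entirely outside the requested band
        # Clip the bin to the requested band.
        a, b = max(lo, e_min), min(hi, e_max)

        def interp(e):
            # Linear interpolation of emissivity within the bin.
            t = (e - lo) / (hi - lo)
            return emissivities[i] * (1 - t) + emissivities[i + 1] * t

        total += 0.5 * (interp(a) + interp(b)) * (b - a)
    return total

energies = [0.5, 1.0, 2.0, 4.0]      # keV (made up)
emissivities = [4.0, 2.0, 1.0, 0.5]  # arbitrary units (made up)
print(integrate_band(energies, emissivities, 1.0, 2.0))  # 1.5
```

The clipping step is what lets arbitrary energy ranges cut across the tabulated bins.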
+SED Generation and Deposition
++++++++++++++++++++++++++++++
+
+Using BC03 models for stellar population synthesis, star particles in
+a given calculation can be assigned an integrated flux for a specific
+bandpass.  These fluxes can then be combined using either projections
+or volume rendering.  This can use CIC interpolation to deposit a
+total flux into each cell (which should be flux-conserving, modulo a
+multiplicative factor not currently included) which is then either
+projected or volume rendered.
+
+Caveats: The deposition method currently produces results that are far
+too washed out and murky.  The multiplicative factor is not yet set
+correctly in all cases.
+
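The flux-conserving property of CIC deposition mentioned above can be illustrated with a minimal one-dimensional sketch. ``cic_deposit`` and the particle positions and fluxes below are invented for illustration; this is not the yt implementation:

```python
import math

# Minimal 1D cloud-in-cell (CIC) deposit: each particle's flux is split
# between the two nearest cell centers with linear weights, so the total
# flux on the grid equals the total particle flux (flux-conserving,
# modulo any overall multiplicative factor applied afterwards).

def cic_deposit(positions, fluxes, n_cells):
    """Deposit point fluxes onto a unit-length 1D grid of n_cells cells."""
    grid = [0.0] * n_cells
    dx = 1.0 / n_cells
    for x, f in zip(positions, fluxes):
        s = x / dx - 0.5          # position in units of cell centers
        i = int(math.floor(s))    # cell center at or left of the particle
        w = s - i                 # fractional distance to the right center
        if 0 <= i < n_cells:
            grid[i] += f * (1.0 - w)
        if 0 <= i + 1 < n_cells:
            grid[i + 1] += f * w
    return grid

grid = cic_deposit([0.30, 0.71], [1.0, 2.0], 10)
print("total flux: %.6f" % sum(grid))  # total flux: 3.000000
```

For particles away from the domain edges, the two weights always sum to one, which is why the deposit conserves total flux.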
+Thermal Gas Emission
+++++++++++++++++++++
+
+Applying a black body spectrum to the thermal content of the gas, we
+can volume render the domain and apply absorption based on broad
+arguments of scattering.  One could theoretically include star
+particles as point sources in this, using recent changes to the volume
+renderer.
+
+Caveats: Scattering that results in re-emission, such as H-alpha
+emission, is completely neglected.  Scattering that merely attenuates
+the emission is set in an ad hoc fashion.  Emission from point
+sources, if included at all, is handled in a non-conservative fashion.
+
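A stand-alone sketch of the blackbody-plus-attenuation idea: the Planck function with an exp(-tau) factor standing in for the ad hoc scattering described above. ``planck_nu`` and ``attenuated`` are illustrative helpers, and the temperature, frequency, and optical depth are made up; this is not the yt volume renderer:

```python
import math

# Planck spectral radiance B_nu(T) in SI units, attenuated by a simple
# exp(-tau) factor standing in for ad hoc scattering; re-emission is
# neglected, matching the caveat above.
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_nu(nu, temp):
    """Blackbody spectral radiance at frequency nu [Hz], temperature temp [K]."""
    return (2.0 * H * nu**3 / C**2) / math.expm1(H * nu / (KB * temp))

def attenuated(nu, temp, tau):
    """Emission attenuated by optical depth tau (re-emission neglected)."""
    return planck_nu(nu, temp) * math.exp(-tau)

# A cell of 1e6 K gas seen through an optical depth of 0.5:
nu = 1.0e15  # Hz
factor = attenuated(nu, 1.0e6, 0.5) / planck_nu(nu, 1.0e6)
print("attenuation factor: %.4f" % factor)  # attenuation factor: 0.6065
```

Stacking such attenuated contributions cell by cell along a ray is the essence of what the volume renderer does here.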
+Export to Sunrise
++++++++++++++++++
+
+Data can be exported to Sunrise for simulated observation generation.
+
+Caveats: This process is poorly documented.
+
+SZ Compton y and SZ Kinetic Maps
+++++++++++++++++++++++++++++++++
+
+Future Directions
++++++++++++++++++
+
+* ALMA maps
+* 21cm observations
+* Applying PSFs


http://bitbucket.org/yt_analysis/yt-doc/changeset/f8250821e6cf/
changeset:   r41:f8250821e6cf
user:        MatthewTurk
date:        2011-03-10 22:04:14
summary:     Fixing error in object docs
affected #:  1 file (6 bytes)

--- a/source/analyzing/objects.rst	Wed Mar 09 14:53:03 2011 -0500
+++ b/source/analyzing/objects.rst	Thu Mar 10 13:04:14 2011 -0800
@@ -48,7 +48,7 @@
 .. code-block:: python
 
    pf = load("my_data")
-   print pf.field_info["Pressure"].units
+   print pf.field_info["Pressure"].get_units()
 
 This is a fast way to examine the units of a given field, and additionally you
 can use :meth:`yt.lagos.DerivedField.get_source` to get the source code:


http://bitbucket.org/yt_analysis/yt-doc/changeset/895c43ee9421/
changeset:   r42:895c43ee9421
user:        MatthewTurk
date:        2011-03-10 22:04:20
summary:     Merging
affected #:  0 files (0 bytes)

--- a/source/analysis_modules/merger_tree.rst	Thu Mar 10 13:04:14 2011 -0800
+++ b/source/analysis_modules/merger_tree.rst	Thu Mar 10 13:04:20 2011 -0800
@@ -15,7 +15,7 @@
 General Overview
 ----------------
 
-The first requirement is a set of sequential Enzo datasets.
+The first requirement is a set of sequential datasets.
 The detail of the merger tree is increased as the difference in
 time between snapshots is reduced, at the cost of higher computational effort
 for the tree itself and disk usage for the snapshots.
@@ -110,7 +110,9 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
+  from yt.analysis_modules.halo_finding.api import *
 
   files = []
   start = 100
@@ -134,7 +136,9 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
+  from yt.analysis_modules.halo_finding.api import *
   import yt.analysis_modules.simulation_handler.api as ES
   
   es = ES.EnzoSimulation('/path/to/snapshots/simulation.par')
@@ -226,6 +230,13 @@
   halos.write_particle_lists('MergerHalos')
   halos.write_particle_lists_txt('MergerHalos')
 
+There is a convenience function that will call the three functions above
+at one time:
+
+.. code-block:: python
+
+  halos.dump('MergerHalos')
+
 Please see the documents on halo finding for more information on what these
 commands do (:ref:`halo_finding`).
 
@@ -291,6 +302,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
 
   mtc = MergerTreeConnect(database='halos.db')
@@ -306,6 +318,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
 
   mtc = MergerTreeConnect(database='halos.db')
@@ -336,6 +349,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
 
   mtc = MergerTreeConnect(database='halos.db')
@@ -370,6 +384,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
   
   mtc = MergerTreeConnect(database='halos.db')
@@ -395,6 +410,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
   
   mtc = MergerTreeConnect(database='halos.db')
@@ -417,6 +433,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
   
   mtc = MergerTreeConnect(database='halos.db')
@@ -599,6 +616,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
   
   mtc = MergerTreeConnect(database='halos.db')
@@ -612,6 +630,19 @@
 ``SnapHaloID``=0. ``results`` will contain the desired ``GlobalHaloID``
 as a one-tuple in a list.
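The "one-tuple in a list" shape of ``query`` results can be reproduced with Python's built-in ``sqlite3`` module. The in-memory table below is a minimal stand-in, not the real halo database schema:

```python
import sqlite3

# fetchall() returns a list of row tuples, so a single-column aggregate
# comes back as [(value,)] -- the same shape mtc.query() produces.
# The table and values here are a stand-in, not the halo database schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Halos (GlobalHaloID INTEGER, SnapHaloID INTEGER)")
conn.executemany("INSERT INTO Halos VALUES (?, ?)",
                 [(10, 0), (42, 0), (7, 1)])
results = conn.execute(
    "SELECT max(GlobalHaloID) FROM Halos WHERE SnapHaloID=0;").fetchall()
print(results)  # [(42,)]
my_halo = results[0][0]  # unpack the one-tuple in a list
print(my_halo)  # 42
```

This is why the examples index into the result with ``results[0][0]``.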
 
+Alternatively, one of the convenience functions can be used, which may
+be simpler:
+
+.. code-block:: python
+
+  from yt.mods import *
+  from yt.analysis_modules.halo_merger_tree.api import *
+
+  mtc = MergerTreeConnect(database='halos.db')
+
+  thisHalo = mtc.get_GlobalHaloID(0, 0.0)
+
+``thisHalo`` will be an integer giving the ``GlobalHaloID`` for the most
+massive halo (``SnapHaloID`` = 0) at z = 0.0.
+
 To output the merger tree for the five largest halos in the last snapshot,
 it may be simplest to find the ``SnapCurrentTimeIdentifier`` for that
 snapshot.
@@ -622,6 +653,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
   
   mtc = MergerTreeConnect(database='halos.db')
@@ -640,6 +672,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
   
   MergerTreeDotOutput(halos=[0,1,2,3,4], database='halos.db',
@@ -655,6 +688,7 @@
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
   
   MergerTreeDotOutput(halos=[24212,5822,19822,10423,51324], database='halos.db',
@@ -664,6 +698,18 @@
 parent and child halos for which at least 70% of the parent halo's mass goes
 to the child. The default is 0.2.
 
+In this slightly modified example below, if ``dot`` is installed in the
+``PATH``, an image file will be created without an intermediate text file:
+
+.. code-block:: python
+
+  from yt.mods import *
+  from yt.analysis_modules.halo_merger_tree.api import *
+  
+  MergerTreeDotOutput(halos=[24212,5822,19822,10423,51324], database='halos.db',
+      dotfile='MergerTree.png', link_min=0.7)
+
+
 Plain-Text Output
 ~~~~~~~~~~~~~~~~~
 
@@ -679,12 +725,15 @@
 -----------------------
 
 Here is an example of how to create a merger tree for the most massive halo
-in the final snapshot from start to finish.
+in the final snapshot from start to finish, and output the Graphviz
+visualization as a PDF file.
 This will work in serial and in parallel.
 
 .. code-block:: python
 
+  from yt.mods import *
   from yt.analysis_modules.halo_merger_tree.api import *
+  from yt.analysis_modules.halo_finding.api import *
 
   # Pick our snapshots to use.
   files = []
@@ -700,13 +749,11 @@
   
   # Get the GlobalHaloID for the halo.
   mtc = MergerTreeConnect(database=my_database)
-  line = "SELECT max(GlobalHaloID) FROM Halos WHERE SnapHaloID=0;"
-  results = mtc.query(line)
-  my_halo = results[0][0] # one-tuple in a list
+  my_halo = mtc.get_GlobalHaloID(0, 0.0)
   
-  # Output the Graphviz file.
+  # Output the tree as a PDF file.
   MergerTreeDotOutput(halos=my_halo, database=my_database, link_min=0.5,
-      dotfile='MergerTree.gv')
+      dotfile='MergerTree.pdf')

Repository URL: https://bitbucket.org/yt_analysis/yt-doc/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.