[yt-svn] commit/yt-doc: 3 new changesets

Bitbucket commits-noreply at bitbucket.org
Fri Aug 17 10:39:32 PDT 2012


3 new commits in yt-doc:


https://bitbucket.org/yt_analysis/yt-doc/changeset/5a74167667c7/
changeset:   5a74167667c7
user:        Christopher Moody
date:        2012-08-16 01:55:23
summary:     Adding documentation for ART loading
affected #:  1 file

diff -r f2bbe043376d6588f97dc1662e411155d09180fe -r 5a74167667c7f9d5a6aeb5da973938a5216c3247 source/analyzing/loading_data.rst
--- a/source/analyzing/loading_data.rst
+++ b/source/analyzing/loading_data.rst
@@ -162,7 +162,71 @@
 ART Data
 --------
 
-ART data enjoys very preliminary support and is supported by Christopher Moody.
+ART data enjoys preliminary support and is supported by Christopher Moody.
 Please contact the ``yt-dev`` mailing list if you are interested in using yt
 for ART data, or if you are interested in assisting with development of yt to
 work with ART data.
+
+At the moment, the ART octree is 'regridded' at each level to make the native
+octree look more like a mesh-based code. As a result, the initial outlay
+is roughly 60 seconds to grid octs onto a mesh. This will be improved in 
+``yt-3.0``, where octs will be supported natively. 
+
+To load an ART dataset you can use the ``load`` command provided by 
+``yt.mods``, passing in the gas mesh file. It will search for and attempt 
+to find the complementary dark matter and stellar particle header and data 
+files. However, your simulations may not follow the same naming convention,
+but you can specify the individual files.
+
+So for example, a single snapshot might have a series of files looking like
+this:
+
+.. code-block:: none
+
+    10MpcBox_csf512_a0.300.d    #Gas mesh
+    PMcrda0.300.DAT             #Particle header
+    PMcrs0a0.300.DAT            #Particle data (positions,velocities)
+    stars_a0.300.dat            #Stellar data (metallicities, ages, etc.)
+
+The ART frontend tries to find the associated files matching the above, but
+if that fails you can specify ``file_particle_header``, ``file_particle_data``,
+``file_star_data`` in addition to specifying the gas mesh. You also have 
+the option of gridding these particles, and assigning them onto the meshes.
+This process is in beta, and for the time being it's best to leave
+``do_grid_particles=False`` as the default.
+
+To speed up the loading of an ART file, you have a few options. You can turn 
+off the particles entirely by setting ``discover_particles=False``. You can
+also grid octs only up to a certain level, e.g. ``limit_level=5``, which is
+useful when debugging, since it artificially creates a smaller dataset to work with.
+
+Finally, when stellar ages are computed we 'spread' the ages evenly within a
+smoothing window. By default this is turned on and set to 10 Myr. To turn this 
+off you can set ``spread=False``, and you can tweak the age smoothing window
+by specifying the window in seconds, e.g. ``spread=1.0e7*365*24*3600``. 
+
+.. code-block:: python
+    
+    from yt.mods import *
+
+    file = "/u/cmoody3/data/art_snapshots/SFG1/10MpcBox_csf512_a0.460.d"
+    pf = load(file, discover_particles=True, grid_particles=2, limit_level=3)
+    pf.h.print_stats()
+    dd = pf.h.all_data()
+    print na.sum(dd['particle_type'] == 0)
+
+In the above example code, the first line imports the standard yt functions,
+followed by defining the gas mesh file, and load it only through level 3,
+but gridding particles through on to meshes on level 2 and higher. Finally, 
+we create a data container and ask it to gather the particle_type array. In 
+this case ``type==0`` selects the most highly refined dark matter particles, and
+we print out how many of these high-resolution particles the simulation contains.
+Typically, however, you shouldn't have to specify any keyword arguments to load
+in a dataset.
+
+
+
+
+
+
+



https://bitbucket.org/yt_analysis/yt-doc/changeset/b34d4682f051/
changeset:   b34d4682f051
user:        Christopher Moody
date:        2012-08-16 03:06:16
summary:     Typos in ART docs
affected #:  1 file

diff -r 5a74167667c7f9d5a6aeb5da973938a5216c3247 -r b34d4682f051bb24f240ce37b8aed83ab3f3de4a source/analyzing/loading_data.rst
--- a/source/analyzing/loading_data.rst
+++ b/source/analyzing/loading_data.rst
@@ -175,8 +175,7 @@
 To load an ART dataset you can use the ``load`` command provided by 
 ``yt.mods``, passing in the gas mesh file. It will search for and attempt 
 to find the complementary dark matter and stellar particle header and data 
-files. However, your simulations may not follow the same naming convention,
-but you can specify the individual files.
+files. However, your simulations may not follow the same naming convention.
 
 So for example, a single snapshot might have a series of files looking like
 this:
@@ -191,8 +190,8 @@
 The ART frontend tries to find the associated files matching the above, but
 if that fails you can specify ``file_particle_header``, ``file_particle_data``,
 ``file_star_data`` in addition to specifying the gas mesh. You also have 
-the option of gridding these particles, and assigning them onto the meshes.
-This process is in beta, and for the time being it's best to leave
+the option of gridding particles, and assigning them onto the meshes.
+This process is in beta, and for the time being it's probably best to leave
 ``do_grid_particles=False`` as the default.
 
 To speed up the loading of an ART file, you have a few options. You can turn 
@@ -216,8 +215,8 @@
     print na.sum(dd['particle_type'] == 0)
 
 In the above example code, the first line imports the standard yt functions,
-followed by defining the gas mesh file, and load it only through level 3,
-but gridding particles through on to meshes on level 2 and higher. Finally, 
+followed by defining the gas mesh file. It's loaded only through level 3,
+but particles are gridded onto meshes at level 2 and higher. Finally, 
 we create a data container and ask it to gather the particle_type array. In 
 this case ``type==0`` selects the most highly refined dark matter particles, and 
 we print out how many of these high-resolution particles the simulation contains.



https://bitbucket.org/yt_analysis/yt-doc/changeset/3924ac089f0e/
changeset:   3924ac089f0e
user:        MatthewTurk
date:        2012-08-17 19:39:31
summary:     Merged in juxtaposicion/yt-doc (pull request #51)
affected #:  1 file

diff -r eb3c3f1f75520f7a3bdcb0920b694254234f1175 -r 3924ac089f0eab3401ba4a8fa38a3e23c64f91bd source/analyzing/loading_data.rst
--- a/source/analyzing/loading_data.rst
+++ b/source/analyzing/loading_data.rst
@@ -162,7 +162,70 @@
 ART Data
 --------
 
-ART data enjoys very preliminary support and is supported by Christopher Moody.
+ART data enjoys preliminary support and is supported by Christopher Moody.
 Please contact the ``yt-dev`` mailing list if you are interested in using yt
 for ART data, or if you are interested in assisting with development of yt to
 work with ART data.
+
+At the moment, the ART octree is 'regridded' at each level to make the native
+octree look more like a mesh-based code. As a result, the initial outlay
+is roughly 60 seconds to grid octs onto a mesh. This will be improved in 
+``yt-3.0``, where octs will be supported natively. 
+
+To load an ART dataset you can use the ``load`` command provided by 
+``yt.mods``, passing in the gas mesh file. It will search for and attempt 
+to find the complementary dark matter and stellar particle header and data 
+files. However, your simulations may not follow the same naming convention.
+
+So for example, a single snapshot might have a series of files looking like
+this:
+
+.. code-block:: none
+
+    10MpcBox_csf512_a0.300.d    #Gas mesh
+    PMcrda0.300.DAT             #Particle header
+    PMcrs0a0.300.DAT            #Particle data (positions,velocities)
+    stars_a0.300.dat            #Stellar data (metallicities, ages, etc.)
+
+The ART frontend tries to find the associated files matching the above, but
+if that fails you can specify ``file_particle_header``, ``file_particle_data``,
+``file_star_data`` in addition to specifying the gas mesh. You also have 
+the option of gridding particles, and assigning them onto the meshes.
+This process is in beta, and for the time being it's probably best to leave
+``do_grid_particles=False`` as the default.
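+
+As a sketch, an explicit load of the files listed above might look like the
+following (the keyword names follow the description above, and the paths are
+hypothetical; yours will differ):
+
+.. code-block:: python
+
+    from yt.mods import *
+
+    pf = load("10MpcBox_csf512_a0.300.d",
+              file_particle_header="PMcrda0.300.DAT",
+              file_particle_data="PMcrs0a0.300.DAT",
+              file_star_data="stars_a0.300.dat")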
+
+To speed up the loading of an ART file, you have a few options. You can turn 
+off the particles entirely by setting ``discover_particles=False``. You can
+also grid octs only up to a certain level, e.g. ``limit_level=5``, which is
+useful when debugging, since it artificially creates a smaller dataset to work with.
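+
+For instance, a quick particle-free load for debugging might look like this
+(a sketch using the keywords described above; the path is hypothetical):
+
+.. code-block:: python
+
+    from yt.mods import *
+
+    # Skip particle discovery and only grid octs through level 5
+    pf = load("10MpcBox_csf512_a0.300.d",
+              discover_particles=False, limit_level=5)
+    pf.h.print_stats()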
+
+Finally, when stellar ages are computed we 'spread' the ages evenly within a
+smoothing window. By default this is turned on and set to 10 Myr. To turn this 
+off you can set ``spread=False``, and you can tweak the age smoothing window
+by specifying the window in seconds, e.g. ``spread=1.0e7*365*24*3600``. 
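+
+The default window, 10 Myr expressed in seconds, works out as follows
+(assuming 365-day years; the dataset path is hypothetical):
+
+.. code-block:: python
+
+    # 10 Myr in seconds: 1.0e7 years * 365 days * 24 hours * 3600 seconds
+    window = 1.0e7 * 365 * 24 * 3600    # 3.1536e14 seconds
+    pf = load("10MpcBox_csf512_a0.300.d", spread=window)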
+
+.. code-block:: python
+    
+    from yt.mods import *
+
+    file = "/u/cmoody3/data/art_snapshots/SFG1/10MpcBox_csf512_a0.460.d"
+    pf = load(file, discover_particles=True, grid_particles=2, limit_level=3)
+    pf.h.print_stats()
+    dd = pf.h.all_data()
+    print na.sum(dd['particle_type'] == 0)
+
+In the above example code, the first line imports the standard yt functions,
+followed by defining the gas mesh file. It's loaded only through level 3,
+but particles are gridded onto meshes at level 2 and higher. Finally, 
+we create a data container and ask it to gather the particle_type array. In 
+this case ``type==0`` selects the most highly refined dark matter particles, and
+we print out how many of these high-resolution particles the simulation contains.
+Typically, however, you shouldn't have to specify any keyword arguments to load
+in a dataset.
+
+
+
+
+
+
+

Repository URL: https://bitbucket.org/yt_analysis/yt-doc/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.


