[yt-svn] commit/yt: 2 new changesets

commits-noreply at bitbucket.org commits-noreply at bitbucket.org
Mon Aug 4 05:51:16 PDT 2014


2 new commits in yt:

https://bitbucket.org/yt_analysis/yt/commits/73a9f7491572/
Changeset:   73a9f7491572
Branch:      stable
User:        MatthewTurk
Date:        2014-08-04 14:50:38
Summary:     Merging from yt
Affected #:  884 files

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 .hgchurn
--- a/.hgchurn
+++ b/.hgchurn
@@ -13,8 +13,13 @@
 drudd = drudd at uchicago.edu
 awetzel = andrew.wetzel at yale.edu
 David Collins (dcollins4096 at gmail.com) = dcollins4096 at gmail.com
+dcollins4096 = dcollins4096 at gmail.com
 dcollins at physics.ucsd.edu = dcollins4096 at gmail.com
 tabel = tabel at slac.stanford.edu
 sername=kayleanelson = kaylea.nelson at yale.edu
 kayleanelson = kaylea.nelson at yale.edu
 jcforbes at ucsc.edu = jforbes at ucolick.org
+ngoldbau at ucsc.edu = goldbaum at ucolick.org
+biondo at wisc.edu = Biondo at wisc.edu
+samgeen at googlemail.com = samgeen at gmail.com
+fbogert = fbogert at ucsc.edu
\ No newline at end of file

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -6,12 +6,23 @@
 png.cfg
 rockstar.cfg
 yt_updater.log
+yt/frontends/artio/_artio_caller.c
+yt/analysis_modules/halo_finding/rockstar/rockstar_groupies.c
 yt/analysis_modules/halo_finding/rockstar/rockstar_interface.c
 yt/frontends/ramses/_ramses_reader.cpp
+yt/frontends/sph/smoothing_kernel.c
+yt/geometry/fake_octree.c
+yt/geometry/oct_container.c
+yt/geometry/oct_visitors.c
+yt/geometry/particle_deposit.c
+yt/geometry/particle_oct_container.c
+yt/geometry/particle_smooth.c
+yt/geometry/selection_routines.c
 yt/utilities/amr_utils.c
 yt/utilities/kdtree/forthonf2c.h
 yt/utilities/libconfig_wrapper.c
 yt/utilities/spatial/ckdtree.c
+yt/utilities/lib/alt_ray_tracers.c
 yt/utilities/lib/amr_kdtools.c
 yt/utilities/lib/CICDeposit.c
 yt/utilities/lib/ContourFinding.c
@@ -20,14 +31,18 @@
 yt/utilities/lib/fortran_reader.c
 yt/utilities/lib/freetype_writer.c
 yt/utilities/lib/geometry_utils.c
+yt/utilities/lib/image_utilities.c
 yt/utilities/lib/Interpolators.c
 yt/utilities/lib/kdtree.c
+yt/utilities/lib/mesh_utilities.c
 yt/utilities/lib/misc_utilities.c
 yt/utilities/lib/Octree.c
+yt/utilities/lib/origami.c
 yt/utilities/lib/png_writer.c
 yt/utilities/lib/PointsInVolume.c
 yt/utilities/lib/QuadTree.c
 yt/utilities/lib/RayIntegrators.c
+yt/utilities/lib/ragged_arrays.c
 yt/utilities/lib/VolumeIntegrator.c
 yt/utilities/lib/grid_traversal.c
 yt/utilities/lib/GridTree.c
@@ -38,4 +53,9 @@
 *.pyc
 .*.swp
 *.so
+.idea/*
 tests/results/*
+doc/build/*
+doc/source/reference/api/generated/*
+doc/_temp/*
+doc/source/bootcamp/.ipynb_checkpoints/

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 .hgtags
--- a/.hgtags
+++ b/.hgtags
@@ -5156,6 +5156,9 @@
 0000000000000000000000000000000000000000 mpi-opaque
 f15825659f5af3ce64aaad30062aff3603cbfb66 hop callback
 0000000000000000000000000000000000000000 hop callback
+a71dffe4bc813fdadc506ccad9efb632e23dc843 yt-3.0a1
+954d1ffcbf04c3d1b394c2ea05324d903a9a07cf yt-3.0a2
+f4853999c2b5b852006d6628719c882cddf966df yt-3.0a3
 079e456c38a87676472a458210077e2be325dc85 last_gplv3
 ca6e536c15a60070e6988fd472dc771a1897e170 yt-2.0
 882c41eed5dd4a3cdcbb567bcb79b833e46b1f42 yt-2.0.1
@@ -5173,3 +5176,4 @@
 d43ff9d8e20f2d2b8f31f4189141d2521deb341b yt-2.6.1
 f1e22ef9f3a225f818c43262e6ce9644e05ffa21 yt-2.6.2
 816186f16396a16853810ac9ebcde5057d8d5b1a yt-2.6.3
+f327552a6ede406b82711fb800ebcd5fe692d1cb yt-3.0a4

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 CITATION
--- a/CITATION
+++ b/CITATION
@@ -29,3 +29,28 @@
    adsurl = {http://adsabs.harvard.edu/abs/2011ApJS..192....9T},
   adsnote = {Provided by the SAO/NASA Astrophysics Data System}
 }
+
+Work using yt may also make use of other functionality.  If you use ORIGAMI, we ask
+that you please cite the ORIGAMI paper:
+
+ at ARTICLE{2012ApJ...754..126F,
+   author = {{Falck}, B.~L. and {Neyrinck}, M.~C. and {Szalay}, A.~S.},
+    title = "{ORIGAMI: Delineating Halos Using Phase-space Folds}",
+  journal = {\apj},
+archivePrefix = "arXiv",
+   eprint = {1201.2353},
+ primaryClass = "astro-ph.CO",
+ keywords = {dark matter, galaxies: halos, large-scale structure of universe, methods: numerical},
+     year = 2012,
+    month = aug,
+   volume = 754,
+      eid = {126},
+    pages = {126},
+      doi = {10.1088/0004-637X/754/2/126},
+   adsurl = {http://adsabs.harvard.edu/abs/2012ApJ...754..126F},
+  adsnote = {Provided by the SAO/NASA Astrophysics Data System}
+}
+
+The main homepage for ORIGAMI can be found here:
+
+http://icg.port.ac.uk/~falckb/origami.html

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 CREDITS
--- a/CREDITS
+++ b/CREDITS
@@ -2,15 +2,23 @@
 
 Contributors:   
                 Tom Abel (tabel at stanford.edu)
-                David Collins (dcollins at physics.ucsd.edu)
+                Gabriel Altay (gabriel.altay at gmail.com)
+                Kenza Arraki (karraki at gmail.com)
+                Elliott Biondo (biondo at wisc.edu)
+                Alex Bogert (fbogert at ucsc.edu)
+                Pengfei Chen (madcpf at gmail.com)
+                David Collins (dcollins4096 at gmail.com)
                 Brian Crosby (crosby.bd at gmail.com)
                 Andrew Cunningham (ajcunn at gmail.com)
+                Miguel de Val-Borro (miguel.deval at gmail.com)
                 Hilary Egan (hilaryye at gmail.com)
                John Forbes (jforbes at ucolick.org)
+                Sam Geen (samgeen at gmail.com)
                 Nathan Goldbaum (goldbaum at ucolick.org)
                 Markus Haider (markus.haider at uibk.ac.at)
                 Cameron Hummels (chummels at gmail.com)
                 Christian Karch (chiffre at posteo.de)
+                Ben W. Keller (kellerbw at mcmaster.ca)
                 Ji-hoon Kim (me at jihoonkim.org)
                 Steffen Klemer (sklemer at phys.uni-goettingen.de)
                 Kacper Kowalik (xarthisius.kk at gmail.com)
@@ -21,18 +29,23 @@
                 Chris Malone (chris.m.malone at gmail.com)
                 Josh Maloney (joshua.moloney at colorado.edu)
                 Chris Moody (cemoody at ucsc.edu)
+                Stuart Mumford (stuart at mumford.me.uk)
                 Andrew Myers (atmyers at astro.berkeley.edu)
                 Jill Naiman (jnaiman at ucolick.org)
+                Desika Narayanan (dnarayan at haverford.edu)
                 Kaylea Nelson (kaylea.nelson at yale.edu)
                 Jeff Oishi (jsoishi at gmail.com)
+                Brian O'Shea (bwoshea at gmail.com)
                 Jean-Claude Passy (jcpassy at uvic.ca)
+                John Regan (john.regan at helsinki.fi)
                 Mark Richardson (Mark.L.Richardson at asu.edu)
                 Thomas Robitaille (thomas.robitaille at gmail.com)
                 Anna Rosen (rosen at ucolick.org)
                 Douglas Rudd (drudd at uchicago.edu)
                 Anthony Scopatz (scopatz at gmail.com)
                 Noel Scudder (noel.scudder at stonybrook.edu)
-                Devin Silvia (devin.silvia at colorado.edu)
+                Pat Shriwise (shriwise at wisc.edu)
+                Devin Silvia (devin.silvia at gmail.com)
                 Sam Skillman (samskillman at gmail.com)
                 Stephen Skory (s at skory.us)
                 Britton Smith (brittonsmith at gmail.com)
@@ -42,8 +55,10 @@
                 Stephanie Tonnesen (stonnes at gmail.com)
                 Matthew Turk (matthewturk at gmail.com)
                 Rich Wagner (rwagner at physics.ucsd.edu)
+                Michael S. Warren (mswarren at gmail.com)
                 Andrew Wetzel (andrew.wetzel at yale.edu)
                 John Wise (jwise at physics.gatech.edu)
+                Michael Zingale (michael.zingale at stonybrook.edu)
                 John ZuHone (jzuhone at gmail.com)
 
 Several items included in the yt/extern directory were written by other

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 MANIFEST.in
--- a/MANIFEST.in
+++ b/MANIFEST.in
@@ -1,7 +1,12 @@
-include distribute_setup.py README* CREDITS COPYING.txt CITATION nose.cfg
-recursive-include yt/gui/reason/html *.html *.png *.ico *.js
-recursive-include yt *.pyx *.pxd *.h README* *.glsl *.cu
-recursive-include yt/utilities/kdtree *.f90 *.v Makefile LICENSE
+include distribute_setup.py README* CREDITS COPYING.txt CITATION
+recursive-include yt/gui/reason/html *.html *.png *.ico *.js *.gif *.css
+recursive-include yt *.py *.pyx *.pxd *.h README* *.txt LICENSE*
+recursive-include doc *.rst *.txt *.py *.ipynb *.png *.jpg *.css *.inc *.html
+recursive-include doc *.h *.c *.sh *.svgz *.pdf *.svg *.pyx
+include doc/README doc/activate doc/activate.csh doc/cheatsheet.tex
+include doc/extensions/README doc/Makefile
+prune doc/source/reference/api/generated
+prune doc/build/
 recursive-include yt/analysis_modules/halo_finding/rockstar *.py *.pyx
 prune yt/frontends/_skeleton
 prune tests

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/Makefile
--- /dev/null
+++ b/doc/Makefile
@@ -0,0 +1,140 @@
+# Makefile for Sphinx documentation
+#
+
+# You can set these variables from the command line.
+SPHINXOPTS    =
+SPHINXBUILD   = sphinx-build
+PAPER         =
+BUILDDIR      = build
+
+# Internal variables.
+PAPEROPT_a4     = -D latex_paper_size=a4
+PAPEROPT_letter = -D latex_paper_size=letter
+ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
+
+.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest
+
+help:
+	@echo "Please use \`make <target>' where <target> is one of"
+	@echo "  html        to make standalone HTML files"
+	@echo "  dirhtml     to make HTML files named index.html in directories"
+	@echo "  singlehtml  to make a single large HTML file"
+	@echo "  pickle      to make pickle files"
+	@echo "  json        to make JSON files"
+	@echo "  htmlhelp    to make HTML files and a HTML help project"
+	@echo "  qthelp      to make HTML files and a qthelp project"
+	@echo "  devhelp     to make HTML files and a Devhelp project"
+	@echo "  epub        to make an epub"
+	@echo "  latex       to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
+	@echo "  latexpdf    to make LaTeX files and run them through pdflatex"
+	@echo "  text        to make text files"
+	@echo "  man         to make manual pages"
+	@echo "  changes     to make an overview of all changed/added/deprecated items"
+	@echo "  linkcheck   to check all external links for integrity"
+	@echo "  doctest     to run all doctests embedded in the documentation (if enabled)"
+	@echo "  clean 	     to remove the build directory"
+	@echo "  fullclean   to remove the build directory and autogenerated api docs"
+	@echo "  recipeclean to remove files produced by running the cookbook scripts"
+
+clean:
+	-rm -rf $(BUILDDIR)/*
+
+fullclean:
+	-rm -rf $(BUILDDIR)/*
+	-rm -rf source/reference/api/generated
+
+recipeclean:
+	-rm -rf _temp/*.done source/cookbook/_static/*
+
+html:
+	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
+	@echo
+	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
+
+dirhtml:
+	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
+	@echo
+	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
+
+singlehtml:
+	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
+	@echo
+	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
+
+pickle:
+	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
+	@echo
+	@echo "Build finished; now you can process the pickle files."
+
+json:
+	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
+	@echo
+	@echo "Build finished; now you can process the JSON files."
+
+htmlhelp:
+	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
+	@echo
+	@echo "Build finished; now you can run HTML Help Workshop with the" \
+	      ".hhp project file in $(BUILDDIR)/htmlhelp."
+
+qthelp:
+	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
+	@echo
+	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
+	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
+	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/yt.qhcp"
+	@echo "To view the help file:"
+	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/yt.qhc"
+
+devhelp:
+	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
+	@echo
+	@echo "Build finished."
+	@echo "To view the help file:"
+	@echo "# mkdir -p $$HOME/.local/share/devhelp/yt"
+	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/yt"
+	@echo "# devhelp"
+
+epub:
+	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
+	@echo
+	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
+
+latex:
+	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
+	@echo
+	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
+	@echo "Run \`make' in that directory to run these through (pdf)latex" \
+	      "(use \`make latexpdf' here to do that automatically)."
+
+latexpdf:
+	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
+	@echo "Running LaTeX files through pdflatex..."
+	make -C $(BUILDDIR)/latex all-pdf
+	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
+
+text:
+	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
+	@echo
+	@echo "Build finished. The text files are in $(BUILDDIR)/text."
+
+man:
+	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
+	@echo
+	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
+
+changes:
+	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
+	@echo
+	@echo "The overview file is in $(BUILDDIR)/changes."
+
+linkcheck:
+	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
+	@echo
+	@echo "Link check complete; look for any errors in the above output " \
+	      "or in $(BUILDDIR)/linkcheck/output.txt."
+
+doctest:
+	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
+	@echo "Testing of doctests in the sources finished, look at the " \
+	      "results in $(BUILDDIR)/doctest/output.txt."

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/README
--- a/doc/README
+++ b/doc/README
@@ -1,14 +1,10 @@
-This directory contains the compiled yt documentation.  Development of the
-documentation happens in a mercurial repository, yt-doc, available at:
-
-http://hg.yt-project.org/yt-doc/
-
-It's written to be used with Sphinx, a tool designed for writing Python
-documentation.  Sphinx is available at this URL:
+This directory contains the uncompiled yt documentation.  It's written to be
+used with Sphinx, a tool designed for writing Python documentation.  Sphinx is
+available at this URL:
 
 http://sphinx.pocoo.org/
 
-All of the pre-built HTML files, accessible with any web browser, are available
-in the build/ directory, as well as at:
+Because the documentation requires a number of dependencies, we provide
+pre-built versions online, accessible here:
 
-http://yt-project.org/doc/index.html
+http://yt-project.org/docs/dev-3.0/

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/cheatsheet.tex
--- /dev/null
+++ b/doc/cheatsheet.tex
@@ -0,0 +1,347 @@
+\documentclass[10pt,landscape]{article}
+\usepackage{multicol}
+\usepackage{calc}
+\usepackage{ifthen}
+\usepackage[landscape]{geometry}
+\usepackage[hyphens]{url}
+
+% To make this come out properly in landscape mode, do one of the following
+% 1.
+%  pdflatex latexsheet.tex
+%
+% 2.
+%  latex latexsheet.tex
+%  dvips -P pdf  -t landscape latexsheet.dvi
+%  ps2pdf latexsheet.ps
+
+
+% If you're reading this, be prepared for confusion.  Making this was
+% a learning experience for me, and it shows.  Much of the placement
+% was hacked in; if you make it better, let me know...
+
+
+% 2008-04
+% Changed page margin code to use the geometry package. Also added code for
+% conditional page margins, depending on paper size. Thanks to Uwe Ziegenhagen
+% for the suggestions.
+
+% 2006-08
+% Made changes based on suggestions from Gene Cooperman. <gene at ccs.neu.edu>
+
+% 2012-11 - Stephen Skory
+% Converted the latex cheat sheet to a yt cheat sheet, taken from
+% http://www.stdout.org/~winston/latex/
+
+
+% This sets page margins to .5 inch if using letter paper, and to 1cm
+% if using A4 paper. (This probably isn't strictly necessary.)
+% If using another size paper, use default 1cm margins.
+\ifthenelse{\lengthtest { \paperwidth = 11in}}
+	{ \geometry{top=.5in,left=.5in,right=.5in,bottom=0.85in} }
+	{\ifthenelse{ \lengthtest{ \paperwidth = 297mm}}
+		{\geometry{top=1cm,left=1cm,right=1cm,bottom=1cm} }
+		{\geometry{top=1cm,left=1cm,right=1cm,bottom=1cm} }
+	}
+
+% Turn off header and footer
+\pagestyle{empty}
+ 
+
+% Redefine section commands to use less space
+\makeatletter
+\renewcommand{\section}{\@startsection{section}{1}{0mm}%
+                                {-1ex plus -.5ex minus -.2ex}%
+                                {0.5ex plus .2ex}%x
+                                {\normalfont\large\bfseries}}
+\renewcommand{\subsection}{\@startsection{subsection}{2}{0mm}%
+                                {-1explus -.5ex minus -.2ex}%
+                                {0.5ex plus .2ex}%
+                                {\normalfont\normalsize\bfseries}}
+\renewcommand{\subsubsection}{\@startsection{subsubsection}{3}{0mm}%
+                                {-1ex plus -.5ex minus -.2ex}%
+                                {1ex plus .2ex}%
+                                {\normalfont\small\bfseries}}
+\makeatother
+
+% Define BibTeX command
+\def\BibTeX{{\rm B\kern-.05em{\sc i\kern-.025em b}\kern-.08em
+    T\kern-.1667em\lower.7ex\hbox{E}\kern-.125emX}}
+
+% Don't print section numbers
+\setcounter{secnumdepth}{0}
+
+
+\setlength{\parindent}{0pt}
+\setlength{\parskip}{0pt plus 0.5ex}
+
+
+% -----------------------------------------------------------------------
+
+\begin{document}
+
+\raggedright
+\fontsize{3mm}{3mm}\selectfont
+\begin{multicols}{3}
+
+
+% multicol parameters
+% These lengths are set only within the two main columns
+%\setlength{\columnseprule}{0.25pt}
+\setlength{\premulticols}{1pt}
+\setlength{\postmulticols}{1pt}
+\setlength{\multicolsep}{1pt}
+\setlength{\columnsep}{2pt}
+
+\begin{center}
+     \Large{\textbf{yt Cheat Sheet}} \\
+\end{center}
+
+\subsection{General Info}
+For everything yt, please see \url{http://yt-project.org}.
+Documentation: \url{http://yt-project.org/doc/index.html}.
+Need help? Start here \url{http://yt-project.org/doc/help/} and then
+try the IRC chat room \url{http://yt-project.org/irc.html},
+or the mailing list \url{http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org}. \\
+
+\subsection{Installing yt} The easiest way to install yt is to use the
+installation script found on the yt homepage or the docs linked above.  If you
+already have python set up with \texttt{numpy}, \texttt{scipy},
+\texttt{matplotlib}, \texttt{h5py}, and \texttt{cython}, you can also use
+\texttt{pip install yt}.
+
+\subsection{Command Line yt}
+yt and its convenience functions are launched from a command-line prompt.
+Many commands have flags to control behavior.
+Commands can be followed by
+{\bf {-}{-}help} (e.g. {\bf yt render {-}{-}help}) for detailed help for that command
+including a list of the available flags.
+
+\texttt{iyt}\textemdash\ Load yt and IPython. \\
+\texttt{yt load} {\it dataset}   \textemdash\ Load a single dataset.  \\
+\texttt{yt help} \textemdash\ Print yt help information. \\
+\texttt{yt stats} {\it dataset} \textemdash\ Print stats of a dataset. \\
+\texttt{yt update} \textemdash\ Update yt to most recent version.\\
+\texttt{yt update --all} \textemdash\ Update yt and dependencies to most recent version. \\
+\texttt{yt version} \textemdash\ yt installation information. \\
+\texttt{yt notebook} \textemdash\ Run the IPython notebook server. \\
+\texttt{yt upload\_image} {\it image.png} \textemdash\ Upload PNG image to imgur.com. \\
+\texttt{yt upload\_notebook} {\it notebook.nb} \textemdash\ Upload IPython notebook to hub.yt-project.org.\\
+\texttt{yt plot} {\it dataset} \textemdash\ Create a set of images.\\
+\texttt{yt render} {\it dataset} \textemdash\ Create a simple
+ volume rendering. \\
+\texttt{yt mapserver} {\it dataset} \textemdash\ View a plot/projection in a Gmaps-like
+ interface. \\
+\texttt{yt pastebin} {\it text.out} \textemdash\ Post text to the pastebin at
+ paste.yt-project.org. \\ 
+\texttt{yt pastebin\_grab} {\it identifier} \textemdash\ Print content of pastebin to
+ STDOUT. \\
+\texttt{yt bugreport} \textemdash\ Report a yt bug. \\
+\texttt{yt hop} {\it dataset} \textemdash\  Run hop on a dataset. \\
+
+\subsection{yt Imports}
+In order to use yt, Python must load the relevant yt modules into memory.
+The import commands are entered in the Python/IPython shell or
+used as part of a script.
+\newlength{\MyLen}
+\settowidth{\MyLen}{\texttt{letterpaper}/\texttt{a4paper} \ }
+\texttt{import yt}  \textemdash\ 
+Load yt. \\
+\texttt{from yt.config import ytcfg}  \textemdash\ 
+Used to set yt configuration options.
+If used, must be called before importing any other module.\\
+\texttt{from yt.analysis\_modules.\emph{halo\_finding}.api import \textasteriskcentered}  \textemdash\ 
+Load halo finding modules. Other modules
+are loaded in a similar way by swapping the 
+{\em emphasized} text.
+See the \textbf{Analysis Modules} section for a listing and short descriptions of each.
+
+\subsection{YTArray}
+Simulation data in yt is returned as a YTArray.  YTArray is a numpy array that
+has unit data attached to it and can automatically handle unit conversions and
+detect unit errors. Just like a numpy array, YTArray provides a wealth of
+built-in functions to calculate properties of the data in the array. Here is a
+very brief list of some useful ones.
+\settowidth{\MyLen}{\texttt{multicol} }\\
+\texttt{v = a.in\_cgs()} \textemdash\ Return the array in CGS units \\
+\texttt{v = a.in\_units('Msun/pc**3')} \textemdash\ Return the array in solar masses per cubic parsec \\ 
+\texttt{v = a.max(), a.min()} \textemdash\ Return maximum, minimum of \texttt{a}. \\
+\texttt{index = a.argmax(), a.argmin()} \textemdash\ Return index of max,
+min value of \texttt{a}.\\
+\texttt{v = a[}{\it index}\texttt{]} \textemdash\ Select a single value from \texttt{a} at location {\it index}.\\
+\texttt{b = a[}{\it i:j}\texttt{]} \textemdash\ Select the slice of values from
+\texttt{a} between
+locations {\it i} to {\it j-1} saved to a new Numpy array \texttt{b} with length {\it j-i}. \\
+\texttt{sel = (a > const)} \textemdash\ Create a new boolean Numpy array
+\texttt{sel}, of the same shape as \texttt{a},
+that marks which values of \texttt{a > const}. Other operators (e.g. \textless, !=, \%) work as well.\\
+\texttt{b = a[sel]} \textemdash\ Create a new Numpy array \texttt{b} made up of
+elements from \texttt{a} that correspond to elements of \texttt{sel}
+that are {\it True}. In the above example \texttt{b} would be all elements of \texttt{a} that are greater than \texttt{const}.\\
+\texttt{a.write\_hdf5({\it filename.h5})} \textemdash\ Save \texttt{a} to the hdf5 file {\it filename.h5}.\\
+
+\subsection{IPython Tips}
+\settowidth{\MyLen}{\texttt{multicol} }
+These tips work if IPython has been loaded, typically either by invoking
+\texttt{iyt} or \texttt{yt load} on the command line, or using the IPython notebook (\texttt{yt notebook}).
+\texttt{Tab complete} \textemdash\ IPython will attempt to auto-complete a
+variable or function name when the \texttt{Tab} key is pressed, e.g. {\it HaloFi}\textendash\texttt{Tab} would auto-complete
+to {\it HaloFinder}. This also works with imports, e.g. {\it from numpy.random.}\textendash\texttt{Tab}
+would give you a list of random functions (note the trailing period before hitting \texttt{Tab}).\\
+\texttt{?, ??} \textemdash\ Appending one or two question marks at the end of any object gives you
+detailed information about it, e.g. {\it variable\_name}?.\\
+Below a few IPython ``magics'' are listed, which are IPython-specific shortcut commands.\\
+\texttt{\%paste} \textemdash\ Paste content from the system clipboard into the IPython shell.\\
+\texttt{\%hist} \textemdash\ Print recent command history.\\
+\texttt{\%quickref} \textemdash\ Print IPython quick reference.\\
+\texttt{\%pdb} \textemdash\ Automatically enter the Python debugger at an exception.\\
+\texttt{\%debug} \textemdash\ Drop into a debugger at the location of the last unhandled exception. \\
+\texttt{\%time, \%timeit} \textemdash\ Find running time of expressions for benchmarking.\\
+\texttt{\%lsmagic} \textemdash\ List all available IPython magics. Hint: \texttt{?} works with magics.\\
+
+
+Please see \url{http://ipython.org/documentation.html} for the full
+IPython documentation.
+
+\subsection{Load and Access Data}
+The first step in using yt is to reference a simulation snapshot.
+After that, simulation data is generally accessed in yt using {\it Data Containers}, which are Python objects
+that define a region of simulation space from which data should be selected.
+\settowidth{\MyLen}{\texttt{multicol} }
+\texttt{ds = yt.load(}{\it dataset}\texttt{)} \textemdash\   Reference a single snapshot.\\
+\texttt{dd = ds.all\_data()} \textemdash\ Select the entire volume.\\
+\texttt{a = dd[}{\it field\_name}\texttt{]} \textemdash\ Copies the contents of {\it field} into the
+YTArray \texttt{a}. Similarly for other data containers.\\
+\texttt{ds.field\_list} \textemdash\ A list of available fields in the snapshot. \\
+\texttt{ds.derived\_field\_list} \textemdash\ A list of available derived fields
+in the snapshot. \\
+\texttt{val, loc = ds.find\_max("Density")} \textemdash\ Find the \texttt{val}ue of
+the maximum of the field \texttt{Density} and its \texttt{loc}ation. \\
+\texttt{sp = ds.sphere(}{\it cen}\texttt{,}{\it radius}\texttt{)} \textemdash\   Create a spherical data 
+container. {\it cen} may be a coordinate, or ``max'' which 
+centers on the max density point. {\it radius} may be a float in 
+code units or a tuple of ({\it length, unit}).\\
+
+\texttt{re = ds.region({\it cen}, {\it left edge}, {\it right edge})} \textemdash\ Create a
+rectilinear data container. {\it cen} is required but not used.
+{\it left} and {\it right edge} are coordinate values that define the region.
+
+\texttt{di = ds.disk({\it cen}, {\it normal}, {\it radius}, {\it height})} \textemdash\ 
+Create a cylindrical data container centered at {\it cen} along the 
+direction set by {\it normal}, with total length
+ 2$\times${\it height} and with radius {\it radius}. \\
+ 
+\texttt{ds.save\_object(sp, {\it ``sp\_for\_later''})} \textemdash\ Save an object (\texttt{sp}) for later use.\\
+\texttt{sp = ds.load\_object({\it ``sp\_for\_later''})} \textemdash\ Recover a saved object.\\
+
+
+\subsection{Defining New Fields}
+\texttt{yt} supports on-disk fields, fields generated on demand, and in-memory fields.
+Fields can either be created before a dataset is loaded using \texttt{add\_field}:
+\texttt{def \_metal\_mass({\it field},{\it data})}\\
+\texttt{\hspace{4 mm} return data["metallicity"]*data["cell\_mass"]}\\
+\texttt{add\_field("metal\_mass", units='g', function=\_metal\_mass)}\\
+Or added to an existing dataset using \texttt{ds.add\_field}:
+\texttt{ds.add\_field("metal\_mass", units='g', function=\_metal\_mass)}\\
+
+\subsection{Slices and Projections}
+\settowidth{\MyLen}{\texttt{multicol} }
+\texttt{slc = yt.SlicePlot(ds, {\it axis or normal vector}, {\it field}, {\it center=}, {\it width=}, {\it weight\_field=}, {\it additional parameters})} \textemdash\ Make a slice plot
+perpendicular to {\it axis} (specified via 'x', 'y', or 'z') or a normal vector for an off-axis slice of {\it field} weighted by {\it weight\_field} at (code-units) {\it center} with 
+{\it width} in code units or a (value, unit) tuple. Hint: try {\it yt.SlicePlot?} in IPython to see additional parameters.\\
+\texttt{slc.save({\it file\_prefix})} \textemdash\ Save the slice to a png with name prefix {\it file\_prefix}.
+\texttt{.save()} works similarly for the commands below.\\
+
+\texttt{prj = yt.ProjectionPlot(ds, {\it axis}, {\it field}, {\it addit. params})} \textemdash\ Make a projection. \\
+\texttt{prj = yt.OffAxisProjectionPlot(ds, {\it normal}, {\it fields}, {\it center=}, {\it width=}, {\it depth=},{\it north\_vector=},{\it weight\_field=})} \textemdash\ Make an off-axis projection. Note that this takes an array of fields. \\
+
+\subsection{Plot Annotations}
+\settowidth{\MyLen}{\texttt{multicol} }
+Plot callbacks are functions itemized in a registry that is attached to every plot object. They can be accessed and then called like \texttt{prj.annotate\_velocity(factor=16, normalize=False)}. Most callbacks also accept a {\it plot\_args} dict that is fed to the matplotlib annotator. \\
+\texttt{velocity({\it factor=},{\it scale=},{\it scale\_units=}, {\it normalize=})} \textemdash\ Uses field "x-velocity" to draw quivers\\
+\texttt{magnetic\_field({\it factor=},{\it scale=},{\it scale\_units=}, {\it normalize=})} \textemdash\ Uses field "Bx" to draw quivers\\
+\texttt{quiver({\it field\_x},{\it field\_y},{\it factor=},{\it scale=},{\it scale\_units=}, {\it normalize=})} \\
+\texttt{contour({\it field=},{\it ncont=},{\it factor=},{\it clim=},{\it take\_log=}, {\it additional parameters})} \textemdash\ Plot {\it ncont} contours of {\it field}, optionally using {\it take\_log}, upper and lower {\it c}ontour {\it lim}its, and {\it factor} points in the interpolation.\\
+\texttt{grids({\it alpha=}, {\it draw\_ids=}, {\it periodic=}, {\it min\_level=}, {\it max\_level=})} \textemdash\ Add grid boundaries. \\
+\texttt{streamlines({\it field\_x},{\it field\_y},{\it factor=},{\it density=})}\\
+\texttt{clumps({\it clumplist})} \textemdash\ Generate {\it clumplist} using the clump finder and plot. \\
+\texttt{arrow({\it pos}, {\it code\_size})} Add an arrow at a {\it pos}ition. \\
+\texttt{point({\it pos}, {\it text})} \textemdash\ Add text at a {\it pos}ition. \\
+\texttt{marker({\it pos}, {\it marker=})} \textemdash\ Add a matplotlib-defined marker at a {\it pos}ition. \\
+\texttt{sphere({\it center}, {\it radius}, {\it text=})} \textemdash\ Draw a circle and append {\it text}.\\
+\texttt{hop\_circles({\it hop\_output}, {\it max\_number=}, {\it annotate=}, {\it min\_size=}, {\it max\_size=}, {\it font\_size=}, {\it print\_halo\_size=}, {\it fixed\_radius=}, {\it min\_mass=}, {\it print\_halo\_mass=}, {\it width=})} \textemdash\ Draw a halo, printing its ID and mass, clipping halos depending on the number of particles ({\it size}), and optionally fixing the drawn circle radius to be constant for all halos.\\
+\texttt{hop\_particles({\it hop\_output},{\it max\_number=},{\it p\_size=},\\
+{\it min\_size},{\it alpha=})} \textemdash\ Draw particle positions for member halos with a certain number of pixels per particle.\\
+\texttt{particles({\it width},{\it p\_size=},{\it col=}, {\it marker=}, {\it stride=}, {\it ptype=}, {\it stars\_only=}, {\it dm\_only=}, {\it minimum\_mass=}, {\it alpha=})}  \textemdash\  Draw particles of {\it p\_size} pixels in a slab of {\it width} with {\it col}or using a matplotlib {\it marker} plotting only every {\it stride} number of particles.\\
+\texttt{title({\it text})}\\
+
+\subsection{The $\sim$/.yt/ Directory}
+\settowidth{\MyLen}{\texttt{multicol} }
+yt will automatically check for configuration files in a special directory (\texttt{\$HOME/.yt/}) in the user's home directory.
+
+The \texttt{config} file \textemdash\ Settings that control runtime behavior. \\
+The \texttt{my\_plugins.py} file \textemdash\ Add functions, derived fields, constants, or other commonly-used Python code to yt.
+
+
+\subsection{Analysis Modules}
+\settowidth{\MyLen}{\texttt{multicol}}
+The import name for each module is listed at the end of each description (see \textbf{yt Imports}).
+
+\texttt{Absorption Spectrum} \textemdash\ Create synthetic absorption spectra (\texttt{absorption\_spectrum}). \\
+\texttt{Clump Finder} \textemdash\ Find clumps defined by density thresholds (\texttt{level\_sets}). \\
+\texttt{Halo Finding} \textemdash\ Locate halos of dark matter particles (\texttt{halo\_finding}). \\
+\texttt{Light Cone Generator} \textemdash\ Stitch datasets together to perform analysis over cosmological volumes. \\
+\texttt{Light Ray Generator} \textemdash\ Analyze the path of light rays.\\
+\texttt{Rockstar Halo Finding} \textemdash\ Locate halos of dark matter using the Rockstar halo finder (\texttt{halo\_finding.rockstar}). \\
+\texttt{Star Particle Analysis} \textemdash\ Analyze star formation history and assemble spectra (\texttt{star\_analysis}). \\
+\texttt{Sunrise Exporter} \textemdash\ Export data to the sunrise visualization format (\texttt{sunrise\_export}). \\
+
+
+\subsection{Parallel Analysis}
+\settowidth{\MyLen}{\texttt{multicol}} 
+Nearly all of yt is parallelized using
+MPI.  The {\it mpi4py} package must be installed for parallelism in yt;
+running {\it pip install mpi4py} on the command line is usually sufficient.
+Execute Python in parallel with a command similar to:\\
+{\it mpirun -n 12 python script.py}\\
+The file \texttt{script.py} must call \texttt{yt.enable\_parallelism()} to
+turn on yt's parallelism.  Without this call, all cores will execute the
+same serial yt script.  The launch command may differ on each system on
+which you use yt; consult the system documentation for details on how to
+run parallel applications.
+
+\texttt{parallel\_objects()} \textemdash\ A way to parallelize analysis over objects
+(such as halos or clumps).\\
+
+
+\subsection{Mercurial}
+\settowidth{\MyLen}{\texttt{multicol}}
+Please see \url{http://mercurial.selenic.com/} for the full Mercurial documentation.
+
+\texttt{hg clone https://bitbucket.org/yt\_analysis/yt} \textemdash\ Clone a copy of yt. \\
+\texttt{hg status} \textemdash\ Files changed in working directory.\\
+\texttt{hg diff} \textemdash\ Print diff of all changed files in working directory. \\
+\texttt{hg diff -r{\it RevX} -r{\it RevY}} \textemdash\ Print diff of all changes between revision {\it RevX} and {\it RevY}.\\
+\texttt{hg log} \textemdash\ History of changes.\\
+\texttt{hg cat -r{\it RevX file}} \textemdash\ Print the contents of {\it file} from revision {\it RevX}.\\
+\texttt{hg heads} \textemdash\ Print all the current heads. \\
+\texttt{hg revert -r{\it RevX file}} \textemdash\ Revert {\it file} to revision {\it RevX}. On-disk changed version is
+moved to {\it file.orig}. \\
+\texttt{hg commit} \textemdash\ Commit changes to repository. \\
+\texttt{hg push} \textemdash\ Push changes to default remote repository. \\
+\texttt{hg pull} \textemdash\ Pull changes from default remote repository. \\
+\texttt{hg serve} \textemdash\ Launch a webserver on the local machine to examine the repository in a web browser. \\
+
+\subsection{FAQ}
+\settowidth{\MyLen}{\texttt{multicol}}
+
+\texttt{slc.set\_log('field', False)} \textemdash\ When plotting \texttt{field}, use linear scaling instead of log scaling.
+
+
+%\rule{0.3\linewidth}{0.25pt}
+%\scriptsize
+
+% Can put some final stuff here like copyright etc...
+
+\end{multicols}
+
+\end{document}

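[Editor's note] The Parallel Analysis subsection in the cheat sheet above describes distributing analysis over objects with \texttt{parallel\_objects()}. As a rough, hedged sketch of the underlying idea (round-robin assignment of work items to MPI ranks), here is a self-contained stand-in that needs neither yt nor mpi4py; `fake_parallel_objects` is hypothetical and is not yt's actual implementation:

```python
# Hypothetical sketch of parallel_objects-style round-robin distribution:
# each of `size` ranks processes the items whose index matches its rank.
# This is an illustration only, not yt's real API.

def fake_parallel_objects(items, rank, size):
    """Yield the subset of `items` that rank `rank` of `size` would process."""
    for i, item in enumerate(items):
        if i % size == rank:
            yield item

if __name__ == "__main__":
    halos = list(range(10))
    # Simulate a 3-rank job: every item is handled by exactly one rank.
    assignments = {r: list(fake_parallel_objects(halos, r, 3)) for r in range(3)}
    print(assignments)
```

In a real yt script the loop body would instead run per-halo or per-clump analysis, and MPI (via mpi4py) supplies the rank and size.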
diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/coding_styleguide.txt
--- a/doc/coding_styleguide.txt
+++ b/doc/coding_styleguide.txt
@@ -34,11 +34,11 @@
  * Do not import "*" from anything other than "yt.funcs".
  * Internally, only import from source files directly -- instead of:
 
-   from yt.visualization.api import PlotCollection
+   from yt.visualization.api import ProjectionPlot
 
    do:
 
-   from yt.visualization.plot_collection import PlotCollection
+   from yt.visualization.plot_window import ProjectionPlot
 
  * Numpy is to be imported as "np", after a long time of using "na".
  * Do not use too many keyword arguments.  If you have a lot of keyword
@@ -49,7 +49,7 @@
  * Don't create a new class to replicate the functionality of an old class --
    replace the old class.  Too many options makes for a confusing user
    experience.
- * Parameter files are a last resort.
+ * Parameter files external to yt are a last resort.
  * The usage of the **kwargs construction should be avoided.  If they cannot
    be avoided, they must be explained, even if they are only to be passed on to
    a nested function.
@@ -60,8 +60,8 @@
  * Avoid Enzo-isms.  This includes but is not limited to:
    * Hard-coding parameter names that are the same as those in Enzo.  The
      following translation table should be of some help.  Note that the
-     parameters are now properties on a StaticOutput subclass: you access them
-     like pf.refine_by .
+     parameters are now properties on a Dataset subclass: you access them
+     like ds.refine_by .
      * RefineBy => refine_by
      * TopGridRank => dimensionality
      * TopGridDimensions => domain_dimensions

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/docstring_example.txt
--- a/doc/docstring_example.txt
+++ b/doc/docstring_example.txt
@@ -73,7 +73,7 @@
     Examples
     --------
     These are written in doctest format, and should illustrate how to
-    use the function.  Use the variables 'pf' for the parameter file, 'pc' for
+    use the function.  Use the variables 'ds' for the dataset, 'pc' for
     a plot collection, 'c' for a center, and 'L' for a vector. 
 
     >>> a=[1,2,3]

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/docstring_idioms.txt
--- a/doc/docstring_idioms.txt
+++ b/doc/docstring_idioms.txt
@@ -19,7 +19,7 @@
 useful variable names that correspond to specific instances that the user is
 presupposed to have created.
 
-   * `pf`: a parameter file, loaded successfully
+   * `ds`: a dataset, loaded successfully
    * `sp`: a sphere
    * `c`: a 3-component "center"
    * `L`: a 3-component vector that corresponds to either angular momentum or a
@@ -43,7 +43,7 @@
 To indicate the return type of a given object, you can reference it using this
 construction:
 
-    This function returns a :class:`PlotCollection`.
+    This function returns a :class:`ProjectionPlot`.
 
 To reference a function, you can use:
 
@@ -51,4 +51,4 @@
 
 To reference a method, you can use:
 
-    To add a projection, use :meth:`PlotCollection.add_projection`.
+    To add a projection, use :meth:`ProjectionPlot.set_width`.

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/extensions/README
--- /dev/null
+++ b/doc/extensions/README
@@ -0,0 +1,4 @@
+This includes a version of the Numpy Documentation extension that has been
+slightly modified to emit extra TOC tree items.
+
+-- Matt Turk, March 25, 2011

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/extensions/notebook_sphinxext.py
--- /dev/null
+++ b/doc/extensions/notebook_sphinxext.py
@@ -0,0 +1,188 @@
+import os, shutil, string, glob, re
+from sphinx.util.compat import Directive
+from docutils import nodes
+from docutils.parsers.rst import directives
+from IPython.nbconvert import html, python
+from IPython.nbformat.current import read, write
+from runipy.notebook_runner import NotebookRunner, NotebookError
+
+class NotebookDirective(Directive):
+    """Insert an evaluated notebook into a document
+
+    This uses runipy and nbconvert to transform a path to an unevaluated notebook
+    into html suitable for embedding in a Sphinx document.
+    """
+    required_arguments = 1
+    optional_arguments = 1
+    option_spec = {'skip_exceptions' : directives.flag}
+    final_argument_whitespace = True
+
+    def run(self): # check if there are spaces in the notebook name
+        nb_path = self.arguments[0]
+        if ' ' in nb_path: raise ValueError(
+            "Due to issues with docutils stripping spaces from links, white "
+            "space is not allowed in notebook filenames '{0}'".format(nb_path))
+        # check if raw html is supported
+        if not self.state.document.settings.raw_enabled:
+            raise self.warning('"%s" directive disabled.' % self.name)
+
+        # get path to notebook
+        source_dir = os.path.dirname(
+            os.path.abspath(self.state.document.current_source))
+        nb_filename = self.arguments[0]
+        nb_basename = os.path.basename(nb_filename)
+        rst_file = self.state_machine.document.attributes['source']
+        rst_dir = os.path.abspath(os.path.dirname(rst_file))
+        nb_abs_path = os.path.abspath(os.path.join(rst_dir, nb_filename))
+
+        # Move files around.
+        rel_dir = os.path.relpath(rst_dir, setup.confdir)
+        rel_path = os.path.join(rel_dir, nb_basename)
+        dest_dir = os.path.join(setup.app.builder.outdir, rel_dir)
+        dest_path = os.path.join(dest_dir, nb_basename)
+
+        if not os.path.exists(dest_dir):
+            os.makedirs(dest_dir)
+
+        # Copy unevaluated script
+        try:
+            shutil.copyfile(nb_abs_path, dest_path)
+        except IOError:
+            raise RuntimeError("Unable to copy notebook to build destination.")
+
+        dest_path_eval = string.replace(dest_path, '.ipynb', '_evaluated.ipynb')
+        dest_path_script = string.replace(dest_path, '.ipynb', '.py')
+        rel_path_eval = string.replace(nb_basename, '.ipynb', '_evaluated.ipynb')
+        rel_path_script = string.replace(nb_basename, '.ipynb', '.py')
+
+        # Create python script version
+        unevaluated_text = nb_to_html(nb_abs_path)
+        script_text = nb_to_python(nb_abs_path)
+        f = open(dest_path_script, 'w')
+        f.write(script_text.encode('utf8'))
+        f.close()
+
+        skip_exceptions = 'skip_exceptions' in self.options
+
+        evaluated_text = evaluate_notebook(nb_abs_path, dest_path_eval,
+                                           skip_exceptions=skip_exceptions)
+
+        # Create link to notebook and script files
+        link_rst = "(" + \
+                   formatted_link(nb_basename) + "; " + \
+                   formatted_link(rel_path_eval) + "; " + \
+                   formatted_link(rel_path_script) + \
+                   ")"
+
+        self.state_machine.insert_input([link_rst], rst_file)
+
+        # create notebook node
+        attributes = {'format': 'html', 'source': 'nb_path'}
+        nb_node = notebook_node('', evaluated_text, **attributes)
+        (nb_node.source, nb_node.line) = \
+            self.state_machine.get_source_and_line(self.lineno)
+
+        # add dependency
+        self.state.document.settings.record_dependencies.add(nb_abs_path)
+
+        # clean up png files left behind by notebooks.
+        png_files = glob.glob("*.png")
+        fits_files = glob.glob("*.fits")
+        h5_files = glob.glob("*.h5")
+        for file in png_files:
+            os.remove(file)
+
+        return [nb_node]
+
+
+class notebook_node(nodes.raw):
+    pass
+
+def nb_to_python(nb_path):
+    """convert notebook to python script"""
+    exporter = python.PythonExporter()
+    output, resources = exporter.from_filename(nb_path)
+    return output
+
+def nb_to_html(nb_path):
+    """convert notebook to html"""
+    exporter = html.HTMLExporter(template_file='full')
+    output, resources = exporter.from_filename(nb_path)
+    header = output.split('<head>', 1)[1].split('</head>',1)[0]
+    body = output.split('<body>', 1)[1].split('</body>',1)[0]
+
+    # http://imgur.com/eR9bMRH
+    header = header.replace('<style', '<style scoped="scoped"')
+    header = header.replace('body {\n  overflow: visible;\n  padding: 8px;\n}\n', '')
+    header = header.replace("code,pre{", "code{")
+
+    # Filter out styles that conflict with the sphinx theme.
+    filter_strings = [
+        'navbar',
+        'body{',
+        'alert{',
+        'uneditable-input{',
+        'collapse{',
+    ]
+    filter_strings.extend(['h%s{' % (i+1) for i in range(6)])
+
+    line_begin_strings = [
+        'pre{',
+        'p{margin'
+        ]
+
+    header_lines = filter(
+        lambda x: not any([s in x for s in filter_strings]), header.split('\n'))
+    header_lines = filter(
+        lambda x: not any([x.startswith(s) for s in line_begin_strings]), header_lines)
+
+    header = '\n'.join(header_lines)
+
+    # concatenate raw html lines
+    lines = ['<div class="ipynotebook">']
+    lines.append(header)
+    lines.append(body)
+    lines.append('</div>')
+    return '\n'.join(lines)
+
+def evaluate_notebook(nb_path, dest_path=None, skip_exceptions=False):
+    # Create evaluated version and save it to the dest path.
+    # Always use --pylab so figures appear inline
+    # perhaps this is questionable?
+    notebook = read(open(nb_path), 'json')
+    nb_runner = NotebookRunner(notebook, pylab=False)
+    try:
+        nb_runner.run_notebook(skip_exceptions=skip_exceptions)
+    except NotebookError as e:
+        print ''
+        print e
+        # Return the traceback, filtering out ANSI color codes.
+        # http://stackoverflow.com/questions/13506033/filtering-out-ansi-escape-sequences
+        return 'Notebook conversion failed with the following traceback: \n%s' % \
+            re.sub(r'\\033[\[\]]([0-9]{1,2}([;@][0-9]{0,2})*)*[mKP]?', '', str(e))
+    if dest_path is None:
+        dest_path = 'temp_evaluated.ipynb'
+    write(nb_runner.nb, open(dest_path, 'w'), 'json')
+    ret = nb_to_html(dest_path)
+    if dest_path == 'temp_evaluated.ipynb':
+        os.remove(dest_path)
+    return ret
+
+def formatted_link(path):
+    return "`%s <%s>`__" % (os.path.basename(path), path)
+
+def visit_notebook_node(self, node):
+    self.visit_raw(node)
+
+def depart_notebook_node(self, node):
+    self.depart_raw(node)
+
+def setup(app):
+    setup.app = app
+    setup.config = app.config
+    setup.confdir = app.confdir
+
+    app.add_node(notebook_node,
+                 html=(visit_notebook_node, depart_notebook_node))
+
+    app.add_directive('notebook', NotebookDirective)

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/extensions/notebookcell_sphinxext.py
--- /dev/null
+++ b/doc/extensions/notebookcell_sphinxext.py
@@ -0,0 +1,67 @@
+import os, shutil, string, glob, io
+from sphinx.util.compat import Directive
+from docutils.parsers.rst import directives
+from IPython.nbconvert import html, python
+from IPython.nbformat import current
+from runipy.notebook_runner import NotebookRunner
+from jinja2 import FileSystemLoader
+from notebook_sphinxext import \
+    notebook_node, nb_to_html, nb_to_python, \
+    visit_notebook_node, depart_notebook_node, \
+    evaluate_notebook
+
+class NotebookCellDirective(Directive):
+    """Insert an evaluated notebook cell into a document
+
+    This uses runipy and nbconvert to transform an inline python
+    script into html suitable for embedding in a Sphinx document.
+    """
+    required_arguments = 0
+    optional_arguments = 1
+    has_content = True
+    option_spec = {'skip_exceptions' : directives.flag}
+
+    def run(self):
+        # check if raw html is supported
+        if not self.state.document.settings.raw_enabled:
+            raise self.warning('"%s" directive disabled.' % self.name)
+
+        # Construct notebook from cell content
+        content = "\n".join(self.content)
+        with open("temp.py", "w") as f:
+            f.write(content)
+
+        convert_to_ipynb('temp.py', 'temp.ipynb')
+
+        skip_exceptions = 'skip_exceptions' in self.options
+
+        evaluated_text = evaluate_notebook('temp.ipynb', skip_exceptions=skip_exceptions)
+
+        # create notebook node
+        attributes = {'format': 'html', 'source': 'nb_path'}
+        nb_node = notebook_node('', evaluated_text, **attributes)
+        (nb_node.source, nb_node.line) = \
+            self.state_machine.get_source_and_line(self.lineno)
+
+        # clean up
+        files = glob.glob("*.png") + ['temp.py', 'temp.ipynb']
+        for file in files:
+            os.remove(file)
+
+        return [nb_node]
+
+def setup(app):
+    setup.app = app
+    setup.config = app.config
+    setup.confdir = app.confdir
+
+    app.add_node(notebook_node,
+                 html=(visit_notebook_node, depart_notebook_node))
+
+    app.add_directive('notebook-cell', NotebookCellDirective)
+
+def convert_to_ipynb(py_file, ipynb_file):
+    with io.open(py_file, 'r', encoding='utf-8') as f:
+        notebook = current.reads(f.read(), format='py')
+    with io.open(ipynb_file, 'w', encoding='utf-8') as f:
+        current.write(notebook, f, format='ipynb')
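[Editor's note] The companion `notebook-cell` directive registered above takes its cell body from the directive content rather than a file, so a sketch of its usage in an .rst page would be:

```rst
.. notebook-cell::
   :skip_exceptions:

   import numpy as np
   print(np.arange(4))
```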

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/extensions/numpydocmod/__init__.py
--- /dev/null
+++ b/doc/extensions/numpydocmod/__init__.py
@@ -0,0 +1,1 @@
+from numpydoc import setup

diff -r 0b29fa48fa3a25d0ed2e1cc28d8f946f878d18ac -r 73a9f749157260c8949f05c07715305aafa06408 doc/extensions/numpydocmod/comment_eater.py
--- /dev/null
+++ b/doc/extensions/numpydocmod/comment_eater.py
@@ -0,0 +1,158 @@
+from cStringIO import StringIO
+import compiler
+import inspect
+import textwrap
+import tokenize
+
+from compiler_unparse import unparse
+
+
+class Comment(object):
+    """ A comment block.
+    """
+    is_comment = True
+    def __init__(self, start_lineno, end_lineno, text):
+        # int : The first line number in the block. 1-indexed.
+        self.start_lineno = start_lineno
+        # int : The last line number. Inclusive!
+        self.end_lineno = end_lineno
+        # str : The text block including '#' character but not any leading spaces.
+        self.text = text
+
+    def add(self, string, start, end, line):
+        """ Add a new comment line.
+        """
+        self.start_lineno = min(self.start_lineno, start[0])
+        self.end_lineno = max(self.end_lineno, end[0])
+        self.text += string
+
+    def __repr__(self):
+        return '%s(%r, %r, %r)' % (self.__class__.__name__, self.start_lineno,
+            self.end_lineno, self.text)
+
+
+class NonComment(object):
+    """ A non-comment block of code.
+    """
+    is_comment = False
+    def __init__(self, start_lineno, end_lineno):
+        self.start_lineno = start_lineno
+        self.end_lineno = end_lineno
+
+    def add(self, string, start, end, line):
+        """ Add lines to the block.
+        """
+        if string.strip():
+            # Only add if not entirely whitespace.
+            self.start_lineno = min(self.start_lineno, start[0])
+            self.end_lineno = max(self.end_lineno, end[0])
+
+    def __repr__(self):
+        return '%s(%r, %r)' % (self.__class__.__name__, self.start_lineno,
+            self.end_lineno)
+
+
+class CommentBlocker(object):
+    """ Pull out contiguous comment blocks.
+    """
+    def __init__(self):
+        # Start with a dummy.
+        self.current_block = NonComment(0, 0)
+
+        # All of the blocks seen so far.
+        self.blocks = []
+
+        # The index mapping lines of code to their associated comment blocks.
+        self.index = {}
+
+    def process_file(self, file):
+        """ Process a file object.
+        """
+        for token in tokenize.generate_tokens(file.next):
+            self.process_token(*token)
+        self.make_index()
+
+    def process_token(self, kind, string, start, end, line):
+        """ Process a single token.
+        """
+        if self.current_block.is_comment:
+            if kind == tokenize.COMMENT:
+                self.current_block.add(string, start, end, line)
+            else:
+                self.new_noncomment(start[0], end[0])
+        else:
+            if kind == tokenize.COMMENT:
+                self.new_comment(string, start, end, line)
+            else:
+                self.current_block.add(string, start, end, line)
+
+    def new_noncomment(self, start_lineno, end_lineno):
+        """ We are transitioning from a noncomment to a comment.
+        """
+        block = NonComment(start_lineno, end_lineno)
+        self.blocks.append(block)
+        self.current_block = block
+
+    def new_comment(self, string, start, end, line):
+        """ Possibly add a new comment.
+        
+        Only adds a new comment if this comment is the only thing on the line.
+        Otherwise, it extends the noncomment block.
+        """
+        prefix = line[:start[1]]
+        if prefix.strip():
+            # Oops! Trailing comment, not a comment block.
+            self.current_block.add(string, start, end, line)
+        else:
+            # A comment block.
+            block = Comment(start[0], end[0], string)
+            self.blocks.append(block)
+            self.current_block = block
+
+    def make_index(self):
+        """ Make the index mapping lines of actual code to their associated
+        prefix comments.
+        """
+        for prev, block in zip(self.blocks[:-1], self.blocks[1:]):
+            if not block.is_comment:
+                self.index[block.start_lineno] = prev
+
+    def search_for_comment(self, lineno, default=None):
+        """ Find the comment block just before the given line number.
+
+        Returns None (or the specified default) if there is no such block.
+        """
+        if not self.index:
+            self.make_index()
+        block = self.index.get(lineno, None)
+        text = getattr(block, 'text', default)
+        return text
+
+
+def strip_comment_marker(text):
+    """ Strip # markers at the front of a block of comment text.
+    """
+    lines = []
+    for line in text.splitlines():
+        lines.append(line.lstrip('#'))
+    text = textwrap.dedent('\n'.join(lines))
+    return text
+
+
+def get_class_traits(klass):
+    """ Yield all of the documentation for trait definitions on a class object.
+    """
+    # FIXME: gracefully handle errors here or in the caller?
+    source = inspect.getsource(klass)
+    cb = CommentBlocker()
+    cb.process_file(StringIO(source))
+    mod_ast = compiler.parse(source)
+    class_ast = mod_ast.node.nodes[0]
+    for node in class_ast.code.nodes:
+        # FIXME: handle other kinds of assignments?
+        if isinstance(node, compiler.ast.Assign):
+            name = node.nodes[0].name
+            rhs = unparse(node.expr).strip()
+            doc = strip_comment_marker(cb.search_for_comment(node.lineno, default=''))
+            yield name, rhs, doc
+

This diff is so big that we needed to truncate the remainder.

https://bitbucket.org/yt_analysis/yt/commits/cf867eab0515/
Changeset:   cf867eab0515
Branch:      stable
User:        MatthewTurk
Date:        2014-08-04 14:50:44
Summary:     Added tag yt-3.0.0 for changeset 73a9f7491572
Affected #:  1 file

diff -r 73a9f749157260c8949f05c07715305aafa06408 -r cf867eab0515a5ceec0f1de67c1dfbe09d766f91 .hgtags
--- a/.hgtags
+++ b/.hgtags
@@ -5177,3 +5177,4 @@
 f1e22ef9f3a225f818c43262e6ce9644e05ffa21 yt-2.6.2
 816186f16396a16853810ac9ebcde5057d8d5b1a yt-2.6.3
 f327552a6ede406b82711fb800ebcd5fe692d1cb yt-3.0a4
+73a9f749157260c8949f05c07715305aafa06408 yt-3.0.0

Repository URL: https://bitbucket.org/yt_analysis/yt/

--

This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
