[yt-svn] commit/yt: 6 new changesets
commits-noreply at bitbucket.org
Wed Mar 9 09:13:16 PST 2016
6 new commits in yt:
https://bitbucket.org/yt_analysis/yt/commits/fc2acf29bc15/
Changeset: fc2acf29bc15
Branch: yt
User: rthompson
Date: 2016-03-08 21:22:29+00:00
Summary: adding get_hash() function to yt/funcs.py which returns a hash for a file.
Affected #: 1 file
diff -r 030372da4a869adba281c0c8e301df4416d66c59 -r fc2acf29bc15861ce667de8a4f442808602da655 yt/funcs.py
--- a/yt/funcs.py
+++ b/yt/funcs.py
@@ -860,3 +860,52 @@
return 'unitary'
else:
return u
+
+def get_hash(infile, algorithm='md5'):
+ """Generate file hash without reading in the entire file at once.
+ From: http://pythoncentral.io/hashing-files-with-python/
+
+ Parameters
+ ----------
+ infile : str
+ File of interest (including the path).
+ algorithm : str (optional)
+ Hash algorithm of choice. Defaults to 'md5'.
+
+ Returns
+ -------
+ hash : str
+ The hash of the file.
+
+ Examples
+ --------
+ import yt.funcs as funcs
+ funcs.get_hash('/path/to/test.png')
+ > 'd38da04859093d430fa4084fd605de60'
+
+ """
+ import hashlib
+ BLOCKSIZE = 65536
+
+ try:
+ hasher = getattr(hashlib, algorithm)()
+ except:
+ raise NotImplementedError("'%s' not available! Available algorithms: %s" %
+ (algorithm, hashlib.algorithms))
+
+ filesize = os.path.getsize(infile)
+ iterations = int(float(filesize)/float(BLOCKSIZE))
+
+ pbar = get_pbar('Generating %s hash' % algorithm, iterations)
+
+ iter = 0
+ with open(infile,'rb') as f:
+ buf = f.read(BLOCKSIZE)
+ while len(buf) > 0:
+ hasher.update(buf)
+ buf = f.read(BLOCKSIZE)
+ iter += 1
+ pbar.update(iter)
+ pbar.finish()
+
+ return hasher.hexdigest()
https://bitbucket.org/yt_analysis/yt/commits/80558f3e822b/
Changeset: 80558f3e822b
Branch: yt
User: rthompson
Date: 2016-03-08 21:24:51+00:00
Summary: bringing this up to date with main.
Affected #: 9 files
diff -r fc2acf29bc15861ce667de8a4f442808602da655 -r 80558f3e822b8e5fea9438b35de842697b21939b CONTRIBUTING.rst
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -843,7 +843,7 @@
be avoided, they must be explained, even if they are only to be passed on to
a nested function.
-.. _docstrings
+.. _docstrings:
Docstrings
----------
diff -r fc2acf29bc15861ce667de8a4f442808602da655 -r 80558f3e822b8e5fea9438b35de842697b21939b doc/source/analyzing/objects.rst
--- a/doc/source/analyzing/objects.rst
+++ b/doc/source/analyzing/objects.rst
@@ -246,6 +246,8 @@
| A plane normal to a specified vector and intersecting a particular
coordinate.
+.. _region-reference:
+
3D Objects
""""""""""
@@ -256,8 +258,6 @@
creating a Region covering the entire dataset domain. It is effectively
``ds.region(ds.domain_center, ds.domain_left_edge, ds.domain_right_edge)``.
-.. _region-reference:
-
**Box Region**
| Class :class:`~yt.data_objects.selection_data_containers.YTRegion`
| Usage: ``region(center, left_edge, right_edge, fields=None, ds=None, field_parameters=None, data_source=None)``
diff -r fc2acf29bc15861ce667de8a4f442808602da655 -r 80558f3e822b8e5fea9438b35de842697b21939b doc/source/index.rst
--- a/doc/source/index.rst
+++ b/doc/source/index.rst
@@ -175,6 +175,7 @@
.. toctree::
:hidden:
+ intro/index
installing
yt Quickstart <quickstart/index>
yt3differences
diff -r fc2acf29bc15861ce667de8a4f442808602da655 -r 80558f3e822b8e5fea9438b35de842697b21939b doc/source/installing.rst
--- a/doc/source/installing.rst
+++ b/doc/source/installing.rst
@@ -19,7 +19,7 @@
* If you do not have root access on your computer, are not comfortable managing
python packages, or are working on a supercomputer or cluster computer, you
will probably want to use the bash all-in-one installation script. This builds
- python, numpy, matplotlib, and yt from source to set up an isolated scientific
+ Python, NumPy, Matplotlib, and yt from source to set up an isolated scientific
python environment inside of a single folder in your home directory. See
:ref:`install-script` for more details.
@@ -35,9 +35,9 @@
up python using a source-based package manager like `Homebrew
<http://brew.sh>`_ or `MacPorts <http://www.macports.org/>`_ this choice will
let you install yt using the python installed by the package manager. Similarly
- for python environments set up via linux package managers so long as you
+ for python environments set up via Linux package managers so long as you
have the the necessary compilers installed (e.g. the ``build-essentials``
- package on debian and ubuntu).
+ package on Debian and Ubuntu).
.. note::
See `Parallel Computation
@@ -199,13 +199,12 @@
If you do not want to install the full anaconda python distribution, you can
install a bare-bones Python installation using miniconda. To install miniconda,
-visit http://repo.continuum.io/miniconda/ and download a recent version of the
-``Miniconda-x.y.z`` script (corresponding to Python 2.7) for your platform and
-system architecture. Next, run the script, e.g.:
+visit http://repo.continuum.io/miniconda/ and download ``Miniconda-latest-...``
+script for your platform and system architecture. Next, run the script, e.g.:
.. code-block:: bash
- bash Miniconda-3.3.0-Linux-x86_64.sh
+ bash Miniconda-latest-Linux-x86_64.sh
For both the Anaconda and Miniconda installations, make sure that the Anaconda
``bin`` directory is in your path, and then issue:
@@ -214,7 +213,28 @@
conda install yt
-which will install yt along with all of its dependencies.
+which will install stable branch of yt along with all of its dependencies.
+
+If you would like to install latest development version of yt, you can download
+it from our custom anaconda channel:
+
+.. code-block:: bash
+
+ conda install -c http://use.yt/with_conda/ yt
+
+New packages for development branch are built after every pull request is
+merged. In order to make sure you are running latest version, it's recommended
+to update frequently:
+
+.. code-block:: bash
+
+ conda update -c http://use.yt/with_conda/ yt
+
+Location of our channel can be added to ``.condarc`` to avoid retyping it during
+each *conda* invocation. Please refer to `Conda Manual
+<http://conda.pydata.org/docs/config.html#channel-locations-channels>`_ for
+detailed instructions.
+
Obtaining Source Code
^^^^^^^^^^^^^^^^^^^^^
@@ -252,7 +272,7 @@
git clone https://github.com/conda/conda-recipes
-Then navigate to the repository root and invoke `conda build`:
+Then navigate to the repository root and invoke ``conda build``:
.. code-block:: bash
@@ -290,7 +310,7 @@
.. code-block:: bash
- $ pip install numpy matplotlib cython cython h5py nose sympy
+ $ pip install numpy matplotlib cython h5py nose sympy
If you're using IPython notebooks, you can install its dependencies
with ``pip`` as well:
@@ -366,7 +386,7 @@
yt update
This will detect that you have installed yt from the mercurial repository, pull
-any changes from bitbucket, and then recompile yt if necessary.
+any changes from Bitbucket, and then recompile yt if necessary.
.. _testing-installation:
diff -r fc2acf29bc15861ce667de8a4f442808602da655 -r 80558f3e822b8e5fea9438b35de842697b21939b doc/source/intro/index.rst
--- a/doc/source/intro/index.rst
+++ b/doc/source/intro/index.rst
@@ -49,7 +49,7 @@
the :ref:`units system <units>` works to tag every individual field and
quantity with a physical unit (e.g. cm, AU, kpc, Mpc, etc.), and it describes
ways of analyzing multiple chronological data outputs from the same underlying
-dataset known as :ref:`time series <time-series-analysis`. Lastly, it includes
+dataset known as :ref:`time series <time-series-analysis>`. Lastly, it includes
information on how to enable yt to operate :ref:`in parallel over multiple
processors simultaneously <parallel-computation>`.
diff -r fc2acf29bc15861ce667de8a4f442808602da655 -r 80558f3e822b8e5fea9438b35de842697b21939b doc/source/reference/index.rst
--- a/doc/source/reference/index.rst
+++ b/doc/source/reference/index.rst
@@ -14,5 +14,6 @@
command-line
api/api
configuration
+ python_introduction
field_list
changelog
diff -r fc2acf29bc15861ce667de8a4f442808602da655 -r 80558f3e822b8e5fea9438b35de842697b21939b doc/source/visualizing/index.rst
--- a/doc/source/visualizing/index.rst
+++ b/doc/source/visualizing/index.rst
@@ -16,7 +16,6 @@
manual_plotting
volume_rendering
unstructured_mesh_rendering
- hardware_volume_rendering
sketchfab
mapserver
streamlines
diff -r fc2acf29bc15861ce667de8a4f442808602da655 -r 80558f3e822b8e5fea9438b35de842697b21939b doc/source/visualizing/volume_rendering.rst
--- a/doc/source/visualizing/volume_rendering.rst
+++ b/doc/source/visualizing/volume_rendering.rst
@@ -236,12 +236,13 @@
The :class:`~yt.visualization.volume_rendering.camera.Camera` object
is what it sounds like, a camera within the Scene. It possesses the
quantities:
- * :meth:`~yt.visualization.volume_rendering.camera.Camera.position` - the position of the camera in scene-space
- * :meth:`~yt.visualization.volume_rendering.camera.Camera.width` - the width of the plane the camera can see
- * :meth:`~yt.visualization.volume_rendering.camera.Camera.focus` - the point in space the camera is looking at
- * :meth:`~yt.visualization.volume_rendering.camera.Camera.resolution` - the image resolution
- * ``north_vector`` - a vector defining the "up" direction in an image
- * :ref:`lens <lenses>` - an object controlling how rays traverse the Scene
+
+* :meth:`~yt.visualization.volume_rendering.camera.Camera.position` - the position of the camera in scene-space
+* :meth:`~yt.visualization.volume_rendering.camera.Camera.width` - the width of the plane the camera can see
+* :meth:`~yt.visualization.volume_rendering.camera.Camera.focus` - the point in space the camera is looking at
+* :meth:`~yt.visualization.volume_rendering.camera.Camera.resolution` - the image resolution
+* ``north_vector`` - a vector defining the "up" direction in an image
+* :ref:`lens <lenses>` - an object controlling how rays traverse the Scene
.. _camera_movement:
@@ -482,7 +483,7 @@
their combination, are described below.
MPI Parallelization
-+++++++++++++++++++
+^^^^^^^^^^^^^^^^^^^
Currently the volume renderer is parallelized using MPI to decompose the volume
by attempting to split up the
@@ -516,7 +517,7 @@
For more information about enabling parallelism, see :ref:`parallel-computation`.
OpenMP Parallelization
-++++++++++++++++++++++
+^^^^^^^^^^^^^^^^^^^^^^
The volume rendering also parallelized using the OpenMP interface in Cython.
While the MPI parallelization is done using domain decomposition, the OpenMP
@@ -532,7 +533,7 @@
by default by modifying the environment variable OMP_NUM_THREADS.
Running in Hybrid MPI + OpenMP
-++++++++++++++++++++++++++++++
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The two methods for volume rendering parallelization can be used together to
leverage large supercomputing resources. When choosing how to balance the
diff -r fc2acf29bc15861ce667de8a4f442808602da655 -r 80558f3e822b8e5fea9438b35de842697b21939b yt/visualization/volume_rendering/render_source.py
--- a/yt/visualization/volume_rendering/render_source.py
+++ b/yt/visualization/volume_rendering/render_source.py
@@ -202,7 +202,8 @@
"""Set the source's fields to render
Parameters
- ---------
+ ----------
+
fields: field name or list of field names
The field or fields to render
no_ghost: boolean
https://bitbucket.org/yt_analysis/yt/commits/e9f244d2b2c3/
Changeset: e9f244d2b2c3
Branch: yt
User: rthompson
Date: 2016-03-08 21:28:51+00:00
Summary: making BLOCKSIZE an optional argument. Improving doc formatting.
Affected #: 1 file
diff -r 80558f3e822b8e5fea9438b35de842697b21939b -r e9f244d2b2c37c37de9b8e1ad598e9e415e7b6b5 yt/funcs.py
--- a/yt/funcs.py
+++ b/yt/funcs.py
@@ -861,7 +861,7 @@
else:
return u
-def get_hash(infile, algorithm='md5'):
+def get_hash(infile, algorithm='md5', BLOCKSIZE=65536):
"""Generate file hash without reading in the entire file at once.
From: http://pythoncentral.io/hashing-files-with-python/
@@ -871,6 +871,8 @@
File of interest (including the path).
algorithm : str (optional)
Hash algorithm of choice. Defaults to 'md5'.
+ BLOCKSIZE : int (optional)
+ How much data in bytes to read in at once.
Returns
-------
@@ -879,13 +881,12 @@
Examples
--------
- import yt.funcs as funcs
- funcs.get_hash('/path/to/test.png')
+ >>> import yt.funcs as funcs
+ >>> funcs.get_hash('/path/to/test.png')
> 'd38da04859093d430fa4084fd605de60'
"""
import hashlib
- BLOCKSIZE = 65536
try:
hasher = getattr(hashlib, algorithm)()
https://bitbucket.org/yt_analysis/yt/commits/94e4541576e2/
Changeset: 94e4541576e2
Branch: yt
User: rthompson
Date: 2016-03-08 21:30:38+00:00
Summary: adding licensing note.
Affected #: 1 file
diff -r e9f244d2b2c37c37de9b8e1ad598e9e415e7b6b5 -r 94e4541576e20dc0184041b3b6e7b452301706e5 yt/funcs.py
--- a/yt/funcs.py
+++ b/yt/funcs.py
@@ -863,7 +863,9 @@
def get_hash(infile, algorithm='md5', BLOCKSIZE=65536):
"""Generate file hash without reading in the entire file at once.
- From: http://pythoncentral.io/hashing-files-with-python/
+
+ Original code licensed under MIT. Source:
+ http://pythoncentral.io/hashing-files-with-python/
Parameters
----------
https://bitbucket.org/yt_analysis/yt/commits/1aef10a270f3/
Changeset: 1aef10a270f3
Branch: yt
User: rthompson
Date: 2016-03-08 21:32:23+00:00
Summary: fixing example docstring.
Affected #: 1 file
diff -r 94e4541576e20dc0184041b3b6e7b452301706e5 -r 1aef10a270f3649fa3563e8d1ac9f120c9e2d27e yt/funcs.py
--- a/yt/funcs.py
+++ b/yt/funcs.py
@@ -885,7 +885,7 @@
--------
>>> import yt.funcs as funcs
>>> funcs.get_hash('/path/to/test.png')
- > 'd38da04859093d430fa4084fd605de60'
+ 'd38da04859093d430fa4084fd605de60'
"""
import hashlib
https://bitbucket.org/yt_analysis/yt/commits/a79baee298d9/
Changeset: a79baee298d9
Branch: yt
User: ngoldbaum
Date: 2016-03-09 17:13:04+00:00
Summary: Merged in rthompson/sphgr_yt (pull request #2034)
adding get_hash() function to yt/funcs.py which returns a hash for a file.
Affected #: 1 file
diff -r fe826d7cd6ba606461cf25fc50404f1d3836dbf9 -r a79baee298d9cc1a4346de3878d0d0ec6d27442c yt/funcs.py
--- a/yt/funcs.py
+++ b/yt/funcs.py
@@ -860,3 +860,55 @@
return 'unitary'
else:
return u
+
+def get_hash(infile, algorithm='md5', BLOCKSIZE=65536):
+ """Generate file hash without reading in the entire file at once.
+
+ Original code licensed under MIT. Source:
+ http://pythoncentral.io/hashing-files-with-python/
+
+ Parameters
+ ----------
+ infile : str
+ File of interest (including the path).
+ algorithm : str (optional)
+ Hash algorithm of choice. Defaults to 'md5'.
+ BLOCKSIZE : int (optional)
+ How much data in bytes to read in at once.
+
+ Returns
+ -------
+ hash : str
+ The hash of the file.
+
+ Examples
+ --------
+ >>> import yt.funcs as funcs
+ >>> funcs.get_hash('/path/to/test.png')
+ 'd38da04859093d430fa4084fd605de60'
+
+ """
+ import hashlib
+
+ try:
+ hasher = getattr(hashlib, algorithm)()
+ except:
+ raise NotImplementedError("'%s' not available! Available algorithms: %s" %
+ (algorithm, hashlib.algorithms))
+
+ filesize = os.path.getsize(infile)
+ iterations = int(float(filesize)/float(BLOCKSIZE))
+
+ pbar = get_pbar('Generating %s hash' % algorithm, iterations)
+
+ iter = 0
+ with open(infile,'rb') as f:
+ buf = f.read(BLOCKSIZE)
+ while len(buf) > 0:
+ hasher.update(buf)
+ buf = f.read(BLOCKSIZE)
+ iter += 1
+ pbar.update(iter)
+ pbar.finish()
+
+ return hasher.hexdigest()
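As an aside on the merged helper above: it looks up `hashlib.algorithms`, which only exists on Python 2.7 (Python 3 replaced it with `hashlib.algorithms_available`), and it catches the failure with a bare `except:`. A minimal standalone sketch of the same chunked-hashing technique, written to run on both Python versions and with yt's `get_pbar` progress bar omitted for brevity, might look like this (the name `chunked_hash` is illustrative, not part of yt):

```python
import hashlib


def chunked_hash(path, algorithm='md5', blocksize=65536):
    """Hash a file in fixed-size blocks so large files are never
    read into memory all at once."""
    try:
        # hashlib.new() accepts an algorithm name directly and raises
        # ValueError for unknown algorithms on both Python 2 and 3.
        hasher = hashlib.new(algorithm)
    except ValueError:
        raise NotImplementedError(
            "'%s' not available! Available algorithms: %s"
            % (algorithm, sorted(hashlib.algorithms_available)))
    with open(path, 'rb') as f:
        # iter() with a b'' sentinel keeps calling f.read(blocksize)
        # until the file is exhausted, replacing the manual while loop.
        for buf in iter(lambda: f.read(blocksize), b''):
            hasher.update(buf)
    return hasher.hexdigest()
```

The two-argument `iter()` form also sidesteps the original loop's `iter` counter, which shadows the builtin of the same name.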
Repository URL: https://bitbucket.org/yt_analysis/yt/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
More information about the yt-svn
mailing list