
DOC: Attempt to find versioneer version when building docs #894

Merged 6 commits on Mar 6, 2020
Changes from all commits
3 changes: 1 addition & 2 deletions .travis.yml
@@ -121,8 +121,7 @@ script:
flake8 nibabel
elif [ "${CHECK_TYPE}" == "doc" ]; then
cd doc
make html;
make doctest;
make html && make doctest
elif [ "${CHECK_TYPE}" == "test" ]; then
# Change into an innocuous directory and find tests from installation
mkdir for_testing
6 changes: 3 additions & 3 deletions Changelog
@@ -79,7 +79,7 @@ Enhancements

Bug fixes
---------
* Sliced ``Tractogram``s no longer ``apply_affine`` to the original
* Sliced ``Tractogram``\s no longer ``apply_affine`` to the original
``Tractogram``'s streamlines. (pr/811) (MC, reviewed by Serge Koudoro,
Philippe Poulin, CM, MB)
* Change strings with invalid escapes to raw strings (pr/827) (EL, reviewed
@@ -98,7 +98,7 @@ Maintenance
API changes and deprecations
----------------------------
* Fully remove deprecated ``checkwarns`` and ``minc`` modules. (pr/852) (CM)
* The ``keep_file_open`` argument to file load operations and ``ArrayProxy``s
* The ``keep_file_open`` argument to file load operations and ``ArrayProxy``\s
no longer acccepts the value ``"auto"``, raising a ``ValueError``. (pr/852)
(CM)
* Deprecate ``ArraySequence.data`` in favor of ``ArraySequence.get_data()``,
@@ -420,7 +420,7 @@ New features
* Support for MRtrix TCK streamlines file format (pr/486) (MC, reviewed by
MB, Arnaud Bore, J-Donald Tournier, Jean-Christophe Houde)
* Added ``get_fdata()`` as default method to retrieve scaled floating point
data from ``DataobjImage``s (pr/551) (MB, reviewed by CM, SG)
data from ``DataobjImage``\s (pr/551) (MB, reviewed by CM, SG)

Enhancements
------------
15 changes: 4 additions & 11 deletions doc/source/conf.py
@@ -21,10 +21,8 @@

import sys
import os
try:
from configparser import ConfigParser
except ImportError:
from ConfigParser import ConfigParser # PY2
from runpy import run_path
from configparser import ConfigParser

# Check for external Sphinx extensions we depend on
try:
@@ -51,9 +49,7 @@
# -- General configuration ----------------------------------------------------

# We load the nibabel release info into a dict by explicit execution
rel = {}
with open(os.path.join('..', '..', 'nibabel', 'info.py'), 'r') as fobj:
exec(fobj.read(), rel)
rel = run_path(os.path.join('..', '..', 'nibabel', 'info.py'))

# Write long description from info
with open('_long_description.inc', 'wt') as fobj:
@@ -62,10 +58,7 @@
# Load metadata from setup.cfg
config = ConfigParser()
config.read(os.path.join('..', '..', 'setup.cfg'))
try:
metadata = config['metadata']
except AttributeError:
metadata = dict(config.items('metadata')) # PY2
metadata = config['metadata']

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
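
Note on the conf.py change: runpy.run_path executes a Python file and returns its module-level namespace as a plain dict, which is what the removed exec-into-a-dict pattern did by hand. A minimal, self-contained sketch (demo_info.py and its contents are invented for illustration, not part of nibabel):

    from runpy import run_path

    # Write a tiny stand-in for nibabel/info.py.
    with open('demo_info.py', 'w') as fobj:
        fobj.write("version = '1.2.3'\nlong_description = 'demo package'\n")

    # run_path() executes the file and hands back its top-level names,
    # replacing the old pattern:  rel = {}; exec(open(path).read(), rel)
    rel = run_path('demo_info.py')
    print(rel['version'], rel['long_description'])  # -> 1.2.3 demo package
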
27 changes: 21 additions & 6 deletions doc/tools/build_modref_templates.py
@@ -5,6 +5,7 @@
# stdlib imports
import sys
import re
import os
from os.path import join as pjoin

# local imports
@@ -48,12 +49,25 @@ def abort(error):

installed_version = V(module.__version__)

info_file = pjoin('..', package, 'info.py')
info_lines = open(info_file).readlines()
source_version = '.'.join([v.split('=')[1].strip(" '\n.")
for v in info_lines if re.match(
'^_version_(major|minor|micro|extra)', v
)])
version_file = pjoin('..', package, '_version.py')
source_version = None
if os.path.exists(version_file):
# Versioneer
from runpy import run_path
try:
source_version = run_path(version_file)['get_versions']()['version']
except (FileNotFoundError, KeyError):
pass
if source_version == '0+unknown':
source_version = None
if source_version is None:
# Legacy fall-back
info_file = pjoin('..', package, 'info.py')
info_lines = open(info_file).readlines()
source_version = '.'.join([v.split('=')[1].strip(" '\n.")
for v in info_lines if re.match(
'^_version_(major|minor|micro|extra)', v
)])
print('***', source_version)

if source_version != installed_version:
@@ -68,6 +82,7 @@ def abort(error):
r'.*test.*$',
r'\.info.*$',
r'\.pkg_info.*$',
r'\.py3k.*$',
]
docwriter.write_api_docs(outdir)
docwriter.write_index(outdir, 'index', relative_to=outdir)
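
The version-detection flow added above can also be read as a standalone helper; here is a sketch of the same logic (the function name find_source_version and the package_dir parameter are illustrative, not part of the PR):

    import os
    import re
    from os.path import join as pjoin
    from runpy import run_path

    def find_source_version(package_dir):
        """Best-effort source version, preferring versioneer's _version.py."""
        version_file = pjoin(package_dir, '_version.py')
        if os.path.exists(version_file):
            try:
                # Versioneer generates _version.py with a get_versions() function
                # returning a dict that includes a 'version' entry.
                version = run_path(version_file)['get_versions']()['version']
            except (FileNotFoundError, KeyError):
                version = None
            # '0+unknown' is versioneer's placeholder when no version can be found.
            if version not in (None, '0+unknown'):
                return version
        # Legacy fall-back: reassemble the version from info.py fields.
        info_lines = open(pjoin(package_dir, 'info.py')).readlines()
        return '.'.join(v.split('=')[1].strip(" '\n.")
                        for v in info_lines
                        if re.match(r'^_version_(major|minor|micro|extra)', v))
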
2 changes: 1 addition & 1 deletion nibabel/affines.py
@@ -306,7 +306,7 @@ def obliquity(affine):
This implementation is inspired by `AFNI's implementation
<https://github.com/afni/afni/blob/b6a9f7a21c1f3231ff09efbd861f8975ad48e525/src/thd_coords.c#L660-L698>`_.
For further details about *obliquity*, check `AFNI's documentation
<https://sscc.nimh.nih.gov/sscc/dglen/Obliquity>_.
<https://sscc.nimh.nih.gov/sscc/dglen/Obliquity>`_.

Parameters
----------
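
A quick usage sketch for the obliquity docstring touched above (the affine is a made-up axis-aligned example, for which the obliquity of every axis is zero):

    import numpy as np
    from nibabel.affines import obliquity

    # Axis-aligned 2 mm isotropic affine: no obliquity on any axis.
    print(obliquity(np.diag([2., 2., 2., 1.])))  # -> [0. 0. 0.]
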
5 changes: 3 additions & 2 deletions nibabel/brikhead.py
@@ -250,7 +250,7 @@ def __init__(self, file_like, header, mmap=True, keep_file_open=None):
a new file handle is created every time the image is accessed.
If ``file_like`` refers to an open file handle, this setting has no
effect. The default value (``None``) will result in the value of
``nibabel.arrayproxy.KEEP_FILE_OPEN_DEFAULT` being used.
``nibabel.arrayproxy.KEEP_FILE_OPEN_DEFAULT`` being used.
"""
super(AFNIArrayProxy, self).__init__(file_like,
header,
@@ -533,7 +533,7 @@ def from_file_map(klass, file_map, mmap=True, keep_file_open=None):
a new file handle is created every time the image is accessed.
If ``file_like`` refers to an open file handle, this setting has no
effect. The default value (``None``) will result in the value of
``nibabel.arrayproxy.KEEP_FILE_OPEN_DEFAULT` being used.
``nibabel.arrayproxy.KEEP_FILE_OPEN_DEFAULT`` being used.
"""
with file_map['header'].get_prepare_fileobj('rt') as hdr_fobj:
hdr = klass.header_class.from_fileobj(hdr_fobj)
@@ -553,6 +553,7 @@ def filespec_to_file_map(klass, filespec):
afni.nimh.nih.gov/pub/dist/doc/program_help/README.compression.html.
Thus, if you have AFNI files my_image.HEAD and my_image.BRIK.gz and you
want to load the AFNI BRIK / HEAD pair, you can specify:

* The HEAD filename - e.g., my_image.HEAD
* The BRIK filename w/o compressed extension - e.g., my_image.BRIK
* The full BRIK filename - e.g., my_image.BRIK.gz
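
As a usage note for the filespec list above, each spelling is meant to resolve to the same HEAD/BRIK pair on disk; a sketch assuming AFNI files named my_image.HEAD and my_image.BRIK.gz exist in the working directory (filenames are hypothetical):

    import nibabel as nib

    # Any of these should load the same AFNI image.
    img = nib.load('my_image.HEAD')       # the HEAD filename
    img = nib.load('my_image.BRIK')       # BRIK filename without the .gz extension
    img = nib.load('my_image.BRIK.gz')    # the full compressed BRIK filename
    print(img.shape)
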
2 changes: 1 addition & 1 deletion nibabel/cifti2/__init__.py
@@ -26,4 +26,4 @@
Cifti2TransformationMatrixVoxelIndicesIJKtoXYZ,
Cifti2Vertices, Cifti2Volume, CIFTI_BRAIN_STRUCTURES,
Cifti2HeaderError, CIFTI_MODEL_TYPES, load, save)
from .cifti2_axes import (Axis, BrainModelAxis, ParcelsAxis, SeriesAxis, LabelAxis, ScalarAxis)
from .cifti2_axes import (Axis, BrainModelAxis, ParcelsAxis, SeriesAxis, LabelAxis, ScalarAxis)
6 changes: 3 additions & 3 deletions nibabel/cifti2/cifti2.py
@@ -172,7 +172,7 @@ def _to_xml_element(self):


class Cifti2LabelTable(xml.XmlSerializable, MutableMapping):
""" CIFTI-2 label table: a sequence of ``Cifti2Label``s
""" CIFTI-2 label table: a sequence of ``Cifti2Label``\s

* Description - Used by NamedMap when IndicesMapToDataType is
"CIFTI_INDEX_TYPE_LABELS" in order to associate names and display colors
@@ -927,8 +927,8 @@ class Cifti2MatrixIndicesMap(xml.XmlSerializable, MutableSequence):
* Text Content: [NA]
* Parent Element - Matrix

Attribute
---------
Attributes
----------
applies_to_matrix_dimension : list of ints
Dimensions of this matrix that follow this mapping
indices_map_to_data_type : str one of CIFTI_MAP_TYPES
4 changes: 2 additions & 2 deletions nibabel/cifti2/cifti2_axes.py
@@ -23,7 +23,7 @@
(except for SeriesAxis objects, which have to remain monotonically increasing or decreasing).

Creating new CIFTI-2 axes
-----------------------
-------------------------
New Axis objects can be constructed by providing a description for what is contained
in each row/column of the described tensor. For each Axis sub-class this descriptor is:

@@ -250,7 +250,7 @@ def __init__(self, name, voxel=None, vertex=None, affine=None,
factory methods:

- :py:meth:`~BrainModelAxis.from_mask`: creates surface or volumetric BrainModelAxis axis
from respectively 1D or 3D masks
from respectively 1D or 3D masks
- :py:meth:`~BrainModelAxis.from_surface`: creates a surface BrainModelAxis axis

The resulting BrainModelAxis axes can be concatenated by adding them together.
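
To make the two factory routes above concrete, a small sketch (the mask, affine and vertex indices are invented; signatures follow nibabel.cifti2.cifti2_axes as of this PR, so treat this as illustrative):

    import numpy as np
    from nibabel.cifti2.cifti2_axes import BrainModelAxis

    # Volumetric axis from a 3D boolean mask plus the voxel-to-mm affine.
    mask = np.zeros((4, 4, 4), dtype=bool)
    mask[1:3, 1:3, 1:3] = True
    vol_axis = BrainModelAxis.from_mask(mask, name='CIFTI_STRUCTURE_THALAMUS_LEFT',
                                        affine=np.eye(4))

    # Surface axis from vertex indices on a surface with a known vertex count.
    surf_axis = BrainModelAxis.from_surface(np.arange(100), nvertex=32492,
                                            name='CIFTI_STRUCTURE_CORTEX_LEFT')

    # As noted above, BrainModelAxis objects concatenate by addition.
    combined = surf_axis + vol_axis
    print(len(combined))  # -> 108
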
15 changes: 9 additions & 6 deletions nibabel/gifti/gifti.py
@@ -207,18 +207,21 @@ class GiftiCoordSystem(xml.XmlSerializable):
Attributes
----------
dataspace : int
From the spec: "Contains the stereotaxic space of a DataArray's data
From the spec: Contains the stereotaxic space of a DataArray's data
prior to application of the transformation matrix. The stereotaxic
space should be one of:
NIFTI_XFORM_UNKNOWN
NIFTI_XFORM_SCANNER_ANAT
NIFTI_XFORM_ALIGNED_ANAT
NIFTI_XFORM_TALAIRACH
NIFTI_XFORM_MNI_152"

- NIFTI_XFORM_UNKNOWN
- NIFTI_XFORM_SCANNER_ANAT
- NIFTI_XFORM_ALIGNED_ANAT
- NIFTI_XFORM_TALAIRACH
- NIFTI_XFORM_MNI_152

xformspace : int
Spec: "Contains the stereotaxic space of a DataArray's data after
application of the transformation matrix. See the DataSpace element for
a list of stereotaxic spaces."

xform : array-like shape (4, 4)
Affine transformation matrix
"""
24 changes: 12 additions & 12 deletions nibabel/nifti1.py
@@ -1775,18 +1775,18 @@ def __init__(self, dataobj, affine, header=None,
self._affine2header()
# Copy docstring
__init__.__doc__ = analyze.AnalyzeImage.__init__.__doc__ + '''
Notes
-----

If both a `header` and an `affine` are specified, and the `affine` does
not match the affine that is in the `header`, the `affine` will be used,
but the ``sform_code`` and ``qform_code`` fields in the header will be
re-initialised to their default values. This is performed on the basis
that, if you are changing the affine, you are likely to be changing the
space to which the affine is pointing. The :meth:`set_sform` and
:meth:`set_qform` methods can be used to update the codes after an image
has been created - see those methods, and the :ref:`manual
<default-sform-qform-codes>` for more details. '''
Notes
-----

If both a `header` and an `affine` are specified, and the `affine` does
not match the affine that is in the `header`, the `affine` will be used,
but the ``sform_code`` and ``qform_code`` fields in the header will be
re-initialised to their default values. This is performed on the basis
that, if you are changing the affine, you are likely to be changing the
space to which the affine is pointing. The :meth:`set_sform` and
:meth:`set_qform` methods can be used to update the codes after an image
has been created - see those methods, and the :ref:`manual
<default-sform-qform-codes>` for more details. '''

def update_header(self):
''' Harmonize header with image data and affine
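
The sform/qform note in this docstring is easier to see in code; a short sketch (the arrays and codes are invented for illustration):

    import numpy as np
    import nibabel as nib

    data = np.zeros((4, 4, 4), dtype=np.float32)
    hdr = nib.Nifti1Header()
    hdr.set_qform(np.eye(4), code='scanner')

    # Passing an affine that disagrees with the header resets the sform/qform
    # codes in the new image's header to their defaults.
    img = nib.Nifti1Image(data, np.diag([2., 2., 2., 1.]), header=hdr)
    print(int(img.header['qform_code']), int(img.header['sform_code']))

    # The codes can then be set explicitly, as the docstring recommends.
    img.set_qform(img.affine, code='scanner')
    img.set_sform(img.affine, code='aligned')
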
10 changes: 4 additions & 6 deletions nibabel/streamlines/tck.py
@@ -30,20 +30,18 @@ class TckFile(TractogramFile):
-----
MRtrix (so its file format: TCK) considers streamlines coordinates
to be in world space (RAS+ and mm space). MRtrix refers to that space
as the "real" or "scanner" space [1]_.
as the "real" or "scanner" space [#]_.

Moreover, when streamlines are mapped back to voxel space [2]_, a
Moreover, when streamlines are mapped back to voxel space [#]_, a
streamline point located at an integer coordinate (i,j,k) is considered
to be at the center of the corresponding voxel. This is in contrast with
TRK's internal convention where it would have referred to a corner.

NiBabel's streamlines internal representation follows the same
convention as MRtrix.

References
----------
[1] http://www.nitrc.org/pipermail/mrtrix-discussion/2014-January/000859.html
[2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
.. [#] http://www.nitrc.org/pipermail/mrtrix-discussion/2014-January/000859.html
.. [#] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
"""
# Constants
MAGIC_NUMBER = "mrtrix tracks"
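
A minimal round-trip through the TCK format, matching the conventions described above (the toy streamline and the example.tck filename are invented; its coordinates are already in RAS+ mm, so affine_to_rasmm is identity):

    import numpy as np
    import nibabel as nib

    streamlines = [np.array([[0., 0., 0.],
                             [1., 2., 3.],
                             [2., 4., 6.]], dtype=np.float32)]
    t = nib.streamlines.Tractogram(streamlines, affine_to_rasmm=np.eye(4))

    # The TCK format is picked from the .tck extension.
    nib.streamlines.save(t, 'example.tck')
    tck = nib.streamlines.load('example.tck')
    print(tck.streamlines[0])
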
16 changes: 8 additions & 8 deletions nibabel/streamlines/tractogram.py
@@ -263,9 +263,9 @@ class Tractogram(object):
choice as long as you provide the correct `affine_to_rasmm` matrix, at
construction time. When applied to streamlines coordinates, that
transformation matrix should bring the streamlines back to world space
(RAS+ and mm space) [1]_.
(RAS+ and mm space) [#]_.

Moreover, when streamlines are mapped back to voxel space [2]_, a
Moreover, when streamlines are mapped back to voxel space [#]_, a
streamline point located at an integer coordinate (i,j,k) is considered
to be at the center of the corresponding voxel. This is in contrast with
other conventions where it might have referred to a corner.
@@ -292,8 +292,8 @@

References
----------
[1] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces
[2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
.. [#] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces
.. [#] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
"""
def __init__(self, streamlines=None,
data_per_streamline=None,
@@ -515,9 +515,9 @@ class LazyTractogram(Tractogram):
choice as long as you provide the correct `affine_to_rasmm` matrix, at
construction time. When applied to streamlines coordinates, that
transformation matrix should bring the streamlines back to world space
(RAS+ and mm space) [1]_.
(RAS+ and mm space) [#]_.

Moreover, when streamlines are mapped back to voxel space [2]_, a
Moreover, when streamlines are mapped back to voxel space [#]_, a
streamline point located at an integer coordinate (i,j,k) is considered
to be at the center of the corresponding voxel. This is in contrast with
other conventions where it might have referred to a corner.
@@ -553,8 +553,8 @@ class LazyTractogram(Tractogram):

References
----------
[1] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces
[2] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
.. [#] http://nipy.org/nibabel/coordinate_systems.html#naming-reference-spaces
.. [#] http://nipy.org/nibabel/coordinate_systems.html#voxel-coordinates-are-in-voxel-space
"""
def __init__(self, streamlines=None,
data_per_streamline=None,
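
To illustrate the affine_to_rasmm convention these docstrings describe, a minimal sketch (the 2 mm isotropic grid and the toy streamline are invented):

    import numpy as np
    from nibabel.streamlines import Tractogram

    # One streamline whose points are voxel coordinates on a 2 mm isotropic grid;
    # integer (i, j, k) points sit at voxel centers.
    streamline_vox = np.array([[0., 0., 0.],
                               [1., 1., 1.],
                               [2., 2., 2.]])

    # Affine taking those voxel coordinates to world space (RAS+ and mm).
    vox_to_rasmm = np.diag([2., 2., 2., 1.])

    t = Tractogram([streamline_vox], affine_to_rasmm=vox_to_rasmm)

    # to_world() applies the affine so the streamlines come back in RAS+ mm.
    print(t.to_world().streamlines[0])  # points at 0, 2 and 4 mm along each axis
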