
[MRG] Correct a few small doc issues following #280 #281

Merged
merged 8 commits on Mar 4, 2020
11 changes: 11 additions & 0 deletions doc/conf.py
Original file line number Diff line number Diff line change
@@ -69,3 +69,14 @@
# Switch to old behavior with html4, for a good display of references,
# as described in https://github.com/sphinx-doc/sphinx/issues/6705
html4_writer = True


# Temporary work-around for spacing problem between parameter and parameter
# type in the doc, see https://github.com/numpy/numpydoc/issues/215. The bug
# has been fixed in sphinx (https://github.com/sphinx-doc/sphinx/pull/5976) but
# through a change to sphinx's basic.css, which rtd_theme does not use.
# In an ideal world, this would get fixed in this PR:
# https://github.com/readthedocs/sphinx_rtd_theme/pull/747/files
def setup(app):
app.add_javascript('js/copybutton.js')
app.add_stylesheet("basic.css")
4 changes: 2 additions & 2 deletions doc/weakly_supervised.rst
@@ -483,7 +483,7 @@ is the off-diagonal L1 norm.
L1-penalized log-determinant regularization <https://icml.cc/Conferences/2009/papers/46.pdf>`_.
ICML 2009.

.. [2] Adapted from https://gist.github.com/kcarnold/5439945
.. [2] Code adapted from https://gist.github.com/kcarnold/5439945

.. _rca:

@@ -794,6 +794,6 @@ by default, :math:`D_{ld}(\mathbf{\cdot, \cdot})` is the LogDet divergence:
`Metric Learning from Relative Comparisons by Minimizing Squared
Residual <http://www.cs.ucla.edu/~weiwang/paper/ICDM12.pdf>`_. ICDM 2012

.. [2] Adapted from https://gist.github.com/kcarnold/5439917
.. [2] Code adapted from https://gist.github.com/kcarnold/5439917
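Both hunks above reference the LogDet divergence used by these weakly-supervised learners; a small numpy sketch of the standard formula may help (illustrative only, `logdet_divergence` is a hypothetical helper, not metric-learn's code):

```python
import numpy as np

def logdet_divergence(A, B):
    # D_ld(A, B) = tr(A B^-1) - log det(A B^-1) - n, which is zero iff A == B
    # (for symmetric positive-definite A and B).
    n = A.shape[0]
    P = A @ np.linalg.inv(B)
    sign, logdet = np.linalg.slogdet(P)
    return np.trace(P) - logdet - n
```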


14 changes: 7 additions & 7 deletions metric_learn/base_metric.py
@@ -39,7 +39,7 @@ def score_pairs(self, pairs):

Returns
-------
scores: `numpy.ndarray` of shape=(n_pairs,)
scores : `numpy.ndarray` of shape=(n_pairs,)
The score of every pair.

See Also
@@ -69,27 +69,27 @@ def _prepare_inputs(self, X, y=None, type_of_inputs='classic',

Parameters
----------
input: array-like
X : array-like
The input data array to check.

y : array-like
The input labels array to check.

type_of_inputs: `str` {'classic', 'tuples'}
type_of_inputs : `str` {'classic', 'tuples'}
The type of inputs to check. If 'classic', the input should be
a 2D array-like of points or a 1D array like of indicators of points. If
'tuples', the input should be a 3D array-like of tuples or a 2D
array-like of indicators of tuples.

**kwargs: dict
**kwargs : dict
Arguments to pass to check_input.

Returns
-------
X : `numpy.ndarray`
The checked input data array.

y: `numpy.ndarray` (optional)
y : `numpy.ndarray` (optional)
The checked input labels array.
"""
self._check_preprocessor()
@@ -203,7 +203,7 @@ def score_pairs(self, pairs):

Returns
-------
scores: `numpy.ndarray` of shape=(n_pairs,)
scores : `numpy.ndarray` of shape=(n_pairs,)
The learned Mahalanobis distance for every pair.

See Also
@@ -271,7 +271,7 @@ def metric_fun(u, v, squared=False):

Returns
-------
distance: float
distance : float
The distance between u and v according to the new metric.
"""
u = validate_vector(u)
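The `score_pairs` docstrings touched above describe the returned scores as the learned Mahalanobis distance for every pair. A minimal numpy sketch of that computation (a hypothetical helper for illustration, not the library's API):

```python
import numpy as np

def score_pairs(pairs, M):
    # pairs: array of shape (n_pairs, 2, n_features); M: the learned
    # symmetric PSD Mahalanobis matrix. For each pair (u, v) the score is
    # d_M(u, v) = sqrt((u - v)^T M (u - v)).
    diffs = pairs[:, 0] - pairs[:, 1]               # (n_pairs, n_features)
    return np.sqrt(np.einsum('ij,jk,ik->i', diffs, M, diffs))
```

With `M` the identity this reduces to the plain Euclidean distance between the two points of each pair.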
29 changes: 16 additions & 13 deletions metric_learn/constraints.py
@@ -19,12 +19,12 @@ class Constraints(object):
Parameters
----------
partial_labels : `numpy.ndarray` of ints, shape=(n_samples,)
Array of labels, with -1 indicating unknown label.
Array of labels, with -1 indicating unknown label.

Attributes
----------
partial_labels : `numpy.ndarray` of ints, shape=(n_samples,)
Array of labels, with -1 indicating unknown label.
Array of labels, with -1 indicating unknown label.
"""

def __init__(self, partial_labels):
@@ -45,26 +45,29 @@ def positive_negative_pairs(self, num_constraints, same_length=False,

Parameters
----------
num_constraints : int
Number of positive and negative constraints to generate.
same_length : bool, optional (default=False)
If True, forces the number of positive and negative pairs to be
equal by ignoring some pairs from the larger set.
random_state : int or numpy.RandomState or None, optional (default=None)
A pseudo random number generator object or a seed for it if int.
num_constraints : int
Number of positive and negative constraints to generate.

same_length : bool, optional (default=False)
If True, forces the number of positive and negative pairs to be
equal by ignoring some pairs from the larger set.

random_state : int or numpy.RandomState or None, optional (default=None)
A pseudo random number generator object or a seed for it if int.

Returns
-------
a : array-like, shape=(n_constraints,)
1D array of indicators for the left elements of positive pairs.
1D array of indicators for the left elements of positive pairs.

b : array-like, shape=(n_constraints,)
1D array of indicators for the right elements of positive pairs.
1D array of indicators for the right elements of positive pairs.

c : array-like, shape=(n_constraints,)
1D array of indicators for the left elements of negative pairs.
1D array of indicators for the left elements of negative pairs.

d : array-like, shape=(n_constraints,)
1D array of indicators for the right elements of negative pairs.
1D array of indicators for the right elements of negative pairs.
"""
random_state = check_random_state(random_state)
a, b = self._pairs(num_constraints, same_label=True,
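The `positive_negative_pairs` docstring reworked above can be sketched as follows (an illustrative standalone function under the documented contract, not metric-learn's implementation):

```python
import numpy as np

def positive_negative_pairs(partial_labels, num_constraints, random_state=0):
    # Sample indices of positive pairs (a, b): same known label, and
    # negative pairs (c, d): different known labels. -1 marks an unknown
    # label and is never sampled.
    rng = np.random.RandomState(random_state)
    known = np.flatnonzero(partial_labels != -1)
    a, b, c, d = [], [], [], []
    while len(a) < num_constraints or len(c) < num_constraints:
        i, j = rng.choice(known, 2, replace=False)
        same = partial_labels[i] == partial_labels[j]
        if same and len(a) < num_constraints:
            a.append(i); b.append(j)
        elif not same and len(c) < num_constraints:
            c.append(i); d.append(j)
    return np.array(a), np.array(b), np.array(c), np.array(d)
```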
9 changes: 6 additions & 3 deletions metric_learn/itml.py
@@ -211,9 +211,9 @@ class ITML(_BaseITML, _PairsClassifierMixin):

References
----------
.. [1] `Information-theoretic Metric Learning
.. [1] Jason V. Davis, et al. `Information-theoretic Metric Learning
<http://www.prateekjain.org/publications/all_papers\
/DavisKJSD07_ICML.pdf>`_ Jason V. Davis, et al.
/DavisKJSD07_ICML.pdf>`_. ICML 2007.
"""

def fit(self, pairs, y, bounds=None, calibration_params=None):
@@ -229,8 +229,10 @@ def fit(self, pairs, y, bounds=None, calibration_params=None):
3D Array of pairs with each row corresponding to two points,
or 2D array of indices of pairs if the metric learner uses a
preprocessor.

y: array-like, of shape (n_constraints,)
Labels of constraints. Should be -1 for dissimilar pair, 1 for similar.

bounds : array-like of two numbers
Bounds on similarity, aside slack variables, s.t.
``d(a, b) < bounds_[0]`` for all given pairs of similar points ``a``
@@ -239,6 +241,7 @@
If not provided at initialization, bounds_[0] and bounds_[1] will be
set to the 5th and 95th percentile of the pairwise distances among all
points present in the input `pairs`.

calibration_params : `dict` or `None`
Dictionary of parameters to give to `calibrate_threshold` for the
threshold calibration step done at the end of `fit`. If `None` is
@@ -280,7 +283,7 @@ class ITML_Supervised(_BaseITML, TransformerMixin):
`num_labeled` was deprecated in version 0.5.0 and will
be removed in 0.6.0.

num_constraints: int, optional (default=None)
num_constraints : int, optional (default=None)
Number of constraints to generate. If None, default to `20 *
num_classes**2`.

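The `bounds` default documented in the ITML hunk above (5th and 95th percentiles of the pairwise distances among all input points) can be sketched like this (assuming Euclidean distances over the unique points in `pairs`; not ITML's exact code):

```python
import numpy as np

def default_bounds(pairs):
    # pairs: array of shape (n_pairs, 2, n_features). Gather every point,
    # compute all pairwise Euclidean distances, and return the 5th and 95th
    # percentiles as (bounds_[0], bounds_[1]).
    X = np.unique(np.vstack([pairs[:, 0], pairs[:, 1]]), axis=0)
    diffs = X[:, None, :] - X[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    upper = dists[np.triu_indices_from(dists, k=1)]
    return np.percentile(upper, 5), np.percentile(upper, 95)
```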
29 changes: 18 additions & 11 deletions metric_learn/lfda.py
@@ -39,10 +39,16 @@ class LFDA(MahalanobisMixin, TransformerMixin):
defaults to min(7, n_features - 1).

embedding_type : str, optional (default: 'weighted')
Type of metric in the embedding space
'weighted' - weighted eigenvectors
'orthonormalized' - orthonormalized
'plain' - raw eigenvectors
Type of metric in the embedding space.

'weighted'
weighted eigenvectors

'orthonormalized'
orthonormalized

'plain'
raw eigenvectors

preprocessor : array-like, shape=(n_samples, n_features) or callable
The preprocessor to call to get tuples from indices. If array-like,
@@ -67,13 +73,14 @@ class LFDA(MahalanobisMixin, TransformerMixin):

References
------------------
.. [1] `Dimensionality Reduction of Multimodal Labeled Data by Local Fisher
Discriminant Analysis <http://www.ms.k.u-tokyo.ac.jp/2007/LFDA.pdf>`_
Masashi Sugiyama.

.. [2] `Local Fisher Discriminant Analysis on Beer Style Clustering
<https://gastrograph.com/resources/whitepapers/local-fisher\
-discriminant-analysis-on-beer-style-clustering.html#>`_ Yuan Tang.
.. [1] Masashi Sugiyama. `Dimensionality Reduction of Multimodal Labeled
Data by Local Fisher Discriminant Analysis
<http://www.ms.k.u-tokyo.ac.jp/2007/LFDA.pdf>`_. JMLR 2007.

.. [2] Yuan Tang. `Local Fisher Discriminant Analysis on Beer Style
Clustering
<https://gastrograph.com/resources/whitepapers/local-fisher\
-discriminant-analysis-on-beer-style-clustering.html#>`_.
'''

def __init__(self, n_components=None, num_dims='deprecated',
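The three `embedding_type` options reworked in the LFDA hunk above can be sketched as post-processing of the solver's eigenvectors (a hedged illustration with a hypothetical `postprocess` helper; the assumption that 'weighted' scales each eigenvector by the square root of its eigenvalue follows the LFDA paper, and is not a claim about metric-learn's exact code):

```python
import numpy as np

def postprocess(vecs, vals, embedding_type='weighted'):
    # vecs: eigenvectors as columns; vals: matching eigenvalues.
    if embedding_type == 'weighted':
        return vecs * np.sqrt(vals)          # scale columns by sqrt(eigenvalue)
    if embedding_type == 'orthonormalized':
        return np.linalg.qr(vecs)[0]         # orthonormalize the columns
    return vecs                              # 'plain': raw eigenvectors
```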
9 changes: 5 additions & 4 deletions metric_learn/lmnn.py
@@ -137,10 +137,11 @@ class LMNN(MahalanobisMixin, TransformerMixin):

References
----------
.. [1] `Distance Metric Learning for Large Margin Nearest Neighbor
Classification <http://papers.nips.cc/paper/2795-distance-metric\
-learning-for-large-margin-nearest-neighbor-classification>`_
Kilian Q. Weinberger, John Blitzer, Lawrence K. Saul
.. [1] K. Q. Weinberger, J. Blitzer, L. K. Saul. `Distance Metric
Learning for Large Margin Nearest Neighbor Classification
<http://papers.nips.cc/paper/2795-distance-metric\
-learning-for-large-margin-nearest-neighbor-classification>`_. NIPS
2005.
"""

def __init__(self, init=None, k=3, min_iter=50, max_iter=1000,
2 changes: 1 addition & 1 deletion metric_learn/lsml.py
@@ -208,7 +208,7 @@ class LSML(_BaseLSML, _QuadrupletsClassifierMixin):
Squared Residual
<http://www.cs.ucla.edu/~weiwang/paper/ICDM12.pdf>`_. ICDM 2012.

.. [2] Adapted from https://gist.github.com/kcarnold/5439917
.. [2] Code adapted from https://gist.github.com/kcarnold/5439917

See Also
--------
12 changes: 6 additions & 6 deletions metric_learn/mlkr.py
@@ -73,15 +73,15 @@ class MLKR(MahalanobisMixin, TransformerMixin):
:meth:`fit` and n_features_a must be less than or equal to that.
If ``n_components`` is not None, n_features_a must match it.

A0: Not used.
A0 : Not used.
.. deprecated:: 0.5.0
`A0` was deprecated in version 0.5.0 and will
be removed in 0.6.0. Use 'init' instead.

tol: float, optional (default=None)
tol : float, optional (default=None)
Convergence tolerance for the optimization.

max_iter: int, optional (default=1000)
max_iter : int, optional (default=1000)
Cap on number of conjugate gradient iterations.

verbose : bool, optional (default=False)
@@ -118,9 +118,9 @@ class MLKR(MahalanobisMixin, TransformerMixin):

References
----------
.. [1] `Information-theoretic Metric Learning
<http://machinelearning.wustl.edu/\
mlpapers/paper_files/icml2007_DavisKJSD07.pdf>`_ Jason V. Davis, et al.
.. [1] K.Q. Weinberger and G. Tesauro. `Metric Learning for Kernel
Regression <http://proceedings.mlr.press/v2/weinberger07a\
/weinberger07a.pdf>`_. AISTATS 2007.
"""

def __init__(self, n_components=None, num_dims='deprecated', init=None,
17 changes: 5 additions & 12 deletions metric_learn/mmc.py
@@ -383,10 +383,6 @@ class MMC(_BaseMMC, _PairsClassifierMixin):
An SPD matrix of shape (n_features, n_features), that will
be used as such to initialize the metric.

preprocessor : array-like, shape=(n_samples, n_features) or callable
The preprocessor to call to get tuples from indices. If array-like,
tuples will be gotten like this: X[indices].

A0 : Not used.
.. deprecated:: 0.5.0
`A0` was deprecated in version 0.5.0 and will
@@ -442,10 +438,11 @@ class MMC(_BaseMMC, _PairsClassifierMixin):

References
----------
.. [1] `Distance metric learning with application to clustering with
side-information <http://papers.nips.cc/paper/2164-distance-metric-\
learning-with-application-to-clustering-with-side-information.pdf>`_
Xing, Jordan, Russell, Ng.
.. [1] Xing, Jordan, Russell, Ng. `Distance metric learning with application
to clustering with side-information
<http://papers.nips.cc/paper/2164-distance-metric-\
learning-with-application-to-clustering-with-side-information.pdf>`_.
NIPS 2002.

See Also
--------
@@ -538,10 +535,6 @@ class MMC_Supervised(_BaseMMC, TransformerMixin):
A numpy array of shape (n_features, n_features), that will
be used as such to initialize the metric.

preprocessor : array-like, shape=(n_samples, n_features) or callable
The preprocessor to call to get tuples from indices. If array-like,
tuples will be gotten like this: X[indices].

A0 : Not used.
.. deprecated:: 0.5.0
`A0` was deprecated in version 0.5.0 and will
2 changes: 1 addition & 1 deletion metric_learn/nca.py
@@ -123,7 +123,7 @@ class NCA(MahalanobisMixin, TransformerMixin):
.. [1] J. Goldberger, G. Hinton, S. Roweis, R. Salakhutdinov. `Neighbourhood
Components Analysis
<http://www.cs.nyu.edu/~roweis/papers/ncanips.pdf>`_.
Advances in Neural Information Processing Systems. 17, 513-520, 2005.
NIPS 2005.

.. [2] Wikipedia entry on `Neighborhood Components Analysis
<https://en.wikipedia.org/wiki/Neighbourhood_components_analysis>`_
8 changes: 4 additions & 4 deletions metric_learn/rca.py
@@ -72,10 +72,10 @@ class RCA(MahalanobisMixin, TransformerMixin):

References
------------------
.. [1] `Adjustment learning and relevant component analysis
<http://citeseerx.ist.\
psu.edu/viewdoc/download?doi=10.1.1.19.2871&rep=rep1&type=pdf>`_ Noam
Shental, et al.
.. [1] Noam Shental, et al. `Adjustment learning and relevant component
analysis <http://citeseerx.ist.\
psu.edu/viewdoc/download?doi=10.1.1.19.2871&rep=rep1&type=pdf>`_.
ECCV 2002.


Attributes
12 changes: 5 additions & 7 deletions metric_learn/sdml.py
@@ -211,14 +211,12 @@ class SDML(_BaseSDML, _PairsClassifierMixin):

References
----------
.. [1] Qi et al. `An efficient sparse metric learning in high-dimensional
space via L1-penalized log-determinant regularization
<http://www.machinelearning.org/archive/icml2009/papers/46.pdf>`_.
ICML 2009.

.. [1] Qi et al.
An efficient sparse metric learning in high-dimensional space via
L1-penalized log-determinant regularization. ICML 2009.
http://lms.comp.nus.edu.sg/sites/default/files/publication\
-attachments/icml09-guojun.pdf

.. [2] Adapted from https://gist.github.com/kcarnold/5439945
.. [2] Code adapted from https://gist.github.com/kcarnold/5439945
"""

def fit(self, pairs, y, calibration_params=None):