
Commit 4e0c444

[MRG][DOC] Fixes almost all warnings in the docs (#338)
* Update API names, stop using deprecated html4
* Fix a lot of warnings; add Methods doctree
* Solve more warnings
* Fix docs dependencies
* New style for Example Code and References
* Add all methods to all classes in docstrings, in alphabetical order
* Add MetricTransformer and MahalanobisMixin to auto-docs
* Delete unused vars in docs; use simple quotes
* Fix indentation
* Use GitHub CI instead of the old Travis CI
* Reference lists are now numbered
* Remove Example Code body almost everywhere
* Remove Methods directive; keep warnings
* Deprecated directive is now red, as in sklearn
1 parent a797635 commit 4e0c444

16 files changed: +175 −117 lines

doc/_static/css/styles.css

Lines changed: 36 additions & 0 deletions
@@ -0,0 +1,36 @@
+.hatnote {
+  border-color: #e1e4e5;
+  border-style: solid;
+  border-width: 1px;
+  font-size: x-small;
+  font-style: italic;
+  margin-left: auto;
+  margin-right: auto;
+  margin-bottom: 24px;
+  padding: 12px;
+}
+.hatnote-gray {
+  background-color: #f5f5f5
+}
+.hatnote li {
+  list-style-type: square;
+  margin-left: 12px !important;
+}
+.hatnote ul {
+  list-style-type: square;
+  margin-left: 0px !important;
+  margin-bottom: 0px !important;
+}
+.deprecated {
+  color: #b94a48;
+  background-color: #F3E5E5;
+  border-color: #eed3d7;
+  margin-top: 0.5rem;
+  padding: 0.5rem;
+  border-radius: 0.5rem;
+  margin-bottom: 0.5rem;
+}
+
+.deprecated p {
+  margin-bottom: 0 !important;
+}

doc/conf.py

Lines changed: 2 additions & 9 deletions
@@ -38,9 +38,6 @@
 html_static_path = ['_static']
 htmlhelp_basename = 'metric-learndoc'

-# Option to only need single backticks to refer to symbols
-default_role = 'any'
-
 # Option to hide doctests comments in the documentation (like # doctest:
 # +NORMALIZE_WHITESPACE for instance)
 trim_doctest_flags = True
@@ -67,10 +64,6 @@
 # generate autosummary even if no references
 autosummary_generate = True

-# Switch to old behavior with html4, for a good display of references,
-# as described in https://github.com/sphinx-doc/sphinx/issues/6705
-html4_writer = True
-

 # Temporary work-around for spacing problem between parameter and parameter
 # type in the doc, see https://github.com/numpy/numpydoc/issues/215. The bug
@@ -79,8 +72,8 @@
 # In an ideal world, this would get fixed in this PR:
 # https://github.com/readthedocs/sphinx_rtd_theme/pull/747/files
 def setup(app):
-    app.add_javascript('js/copybutton.js')
-    app.add_stylesheet("basic.css")
+    app.add_js_file('js/copybutton.js')
+    app.add_css_file('css/styles.css')


 # Remove matplotlib agg warnings from generated doc when using plt.show
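The setup() change above tracks a Sphinx API rename: app.add_javascript and app.add_stylesheet were deprecated in Sphinx 1.8 in favor of add_js_file and add_css_file, with paths resolved relative to html_static_path. A minimal conf.py sketch of the new calls (the comments are mine; the paths match this repo's _static layout):

```python
# Sketch of a Sphinx conf.py using the post-1.8 application API.
# Files listed here are looked up under html_static_path.
html_static_path = ['_static']

def setup(app):
    app.add_js_file('js/copybutton.js')   # replaces app.add_javascript()
    app.add_css_file('css/styles.css')    # replaces app.add_stylesheet()
```

Registering styles.css here is what makes the new .hatnote and .deprecated rules take effect in the built HTML.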

doc/index.rst

Lines changed: 3 additions & 3 deletions
@@ -1,6 +1,6 @@
 metric-learn: Metric Learning in Python
 =======================================
-|Travis-CI Build Status| |License| |PyPI version| |Code coverage|
+|GitHub Actions Build Status| |License| |PyPI version| |Code coverage|

 `metric-learn <https://github.com/scikit-learn-contrib/metric-learn>`_
 contains efficient Python implementations of several popular supervised and
@@ -57,8 +57,8 @@ Documentation outline

 :ref:`genindex` | :ref:`search`

-.. |Travis-CI Build Status| image:: https://api.travis-ci.org/scikit-learn-contrib/metric-learn.svg?branch=master
-   :target: https://travis-ci.org/scikit-learn-contrib/metric-learn
+.. |GitHub Actions Build Status| image:: https://github.com/scikit-learn-contrib/metric-learn/workflows/CI/badge.svg
+   :target: https://github.com/scikit-learn-contrib/metric-learn/actions?query=event%3Apush+branch%3Amaster
 .. |PyPI version| image:: https://badge.fury.io/py/metric-learn.svg
    :target: http://badge.fury.io/py/metric-learn
 .. |License| image:: http://img.shields.io/:license-mit-blue.svg?style=flat

doc/metric_learn.rst

Lines changed: 2 additions & 0 deletions
@@ -13,6 +13,8 @@ Base Classes

    metric_learn.Constraints
    metric_learn.base_metric.BaseMetricLearner
+   metric_learn.base_metric.MetricTransformer
+   metric_learn.base_metric.MahalanobisMixin
    metric_learn.base_metric._PairsClassifierMixin
    metric_learn.base_metric._TripletsClassifierMixin
    metric_learn.base_metric._QuadrupletsClassifierMixin

doc/supervised.rst

Lines changed: 29 additions & 27 deletions
@@ -152,7 +152,7 @@ neighbors (with same labels) of :math:`\mathbf{x}_{i}`, :math:`y_{ij}=0`
 indicates :math:`\mathbf{x}_{i}, \mathbf{x}_{j}` belong to different classes,
 :math:`[\cdot]_+=\max(0, \cdot)` is the Hinge loss.

-.. topic:: Example Code:
+.. rubric:: Example Code

 ::

@@ -167,15 +167,15 @@ indicates :math:`\mathbf{x}_{i}, \mathbf{x}_{j}` belong to different classes,
     lmnn = LMNN(k=5, learn_rate=1e-6)
     lmnn.fit(X, Y, verbose=False)

-.. topic:: References:
+.. rubric:: References

-.. [1] Weinberger et al. `Distance Metric Learning for Large Margin
-       Nearest Neighbor Classification
-       <http://jmlr.csail.mit.edu/papers/volume10/weinberger09a/weinberger09a.pdf>`_.
-       JMLR 2009

-.. [2] `Wikipedia entry on Large Margin Nearest Neighbor <https://en.wikipedia.org/wiki/Large_margin_nearest_neighbor>`_
-
+.. container:: hatnote hatnote-gray
+
+   [1]. Weinberger et al. `Distance Metric Learning for Large Margin Nearest Neighbor Classification <http://jmlr.csail.mit.edu/papers/volume10/weinberger09a/weinberger09a.pdf>`_. JMLR 2009.
+
+   [2]. `Wikipedia entry on Large Margin Nearest Neighbor <https://en.wikipedia.org/wiki/Large_margin_nearest_neighbor>`_.


 .. _nca:

@@ -216,7 +216,7 @@ the sum of probability of being correctly classified:

    \mathbf{L} = \text{argmax}\sum_i p_i

-.. topic:: Example Code:
+.. rubric:: Example Code

 ::

@@ -231,13 +231,14 @@ the sum of probability of being correctly classified:
     nca = NCA(max_iter=1000)
     nca.fit(X, Y)

-.. topic:: References:
+.. rubric:: References
+
+
+.. container:: hatnote hatnote-gray

-.. [1] Goldberger et al.
-       `Neighbourhood Components Analysis <https://papers.nips.cc/paper/2566-neighbourhood-components-analysis.pdf>`_.
-       NIPS 2005
+   [1]. Goldberger et al. `Neighbourhood Components Analysis <https://papers.nips.cc/paper/2566-neighbourhood-components-analysis.pdf>`_. NIPS 2005.

-.. [2] `Wikipedia entry on Neighborhood Components Analysis <https://en.wikipedia.org/wiki/Neighbourhood_components_analysis>`_
+   [2]. `Wikipedia entry on Neighborhood Components Analysis <https://en.wikipedia.org/wiki/Neighbourhood_components_analysis>`_.


 .. _lfda:
@@ -289,7 +290,7 @@ nearby data pairs in the same class are made close and the data pairs in
 different classes are separated from each other; far apart data pairs in the
 same class are not imposed to be close.

-.. topic:: Example Code:
+.. rubric:: Example Code

 ::

@@ -309,15 +310,14 @@ same class are not imposed to be close.

 To work around this, fit instances of this class to data once, then keep the instance around to do transformations.

-.. topic:: References:
+.. rubric:: References

-.. [1] Sugiyama. `Dimensionality Reduction of Multimodal Labeled Data by Local
-       Fisher Discriminant Analysis <http://www.jmlr.org/papers/volume8/sugiyama07b/sugiyama07b.pdf>`_.
-       JMLR 2007

-.. [2] Tang. `Local Fisher Discriminant Analysis on Beer Style Clustering
-       <https://gastrograph.com/resources/whitepapers/local-fisher
-       -discriminant-analysis-on-beer-style-clustering.html#>`_.
+.. container:: hatnote hatnote-gray
+
+   [1]. Sugiyama. `Dimensionality Reduction of Multimodal Labeled Data by Local Fisher Discriminant Analysis <http://www.jmlr.org/papers/volume8/sugiyama07b/sugiyama07b.pdf>`_. JMLR 2007.
+
+   [2]. Tang. `Local Fisher Discriminant Analysis on Beer Style Clustering <https://gastrograph.com/resources/whitepapers/local-fisher-discriminant-analysis-on-beer-style-clustering.html#>`_.

 .. _mlkr:

@@ -363,7 +363,7 @@ calculating a weighted average of all the training samples:

    \hat{y}_i = \frac{\sum_{j\neq i}y_jk_{ij}}{\sum_{j\neq i}k_{ij}}

-.. topic:: Example Code:
+.. rubric:: Example Code

 ::

@@ -377,10 +377,12 @@ calculating a weighted average of all the training samples:
     mlkr = MLKR()
     mlkr.fit(X, Y)

-.. topic:: References:
+.. rubric:: References
+
+
+.. container:: hatnote hatnote-gray

-.. [1] Weinberger et al. `Metric Learning for Kernel Regression <http://proceedings.mlr.
-       press/v2/weinberger07a/weinberger07a.pdf>`_. AISTATS 2007
+   [1]. Weinberger et al. `Metric Learning for Kernel Regression <http://proceedings.mlr.press/v2/weinberger07a/weinberger07a.pdf>`_. AISTATS 2007.


 .. _supervised_version:
@@ -417,7 +419,7 @@ quadruplets, where for each quadruplet the two first points are from the same
 class, and the two last points are from a different class (so indeed the two
 last points should be less similar than the two first points).

-.. topic:: Example Code:
+.. rubric:: Example Code

 ::
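The diffs above replace Sphinx footnote targets (``.. [1]``) with plain numbered lines inside a ``hatnote`` container, which is why the commit message says reference lists are now numbered. Purely as an illustration of that numbering scheme (this helper is hypothetical and not part of the repo):

```python
def format_references(refs):
    """Format citation strings in the docs' new numbered style:
    '[1]. ...', '[2]. ...', with a blank line between entries."""
    return "\n\n".join(f"[{i}]. {ref}" for i, ref in enumerate(refs, 1))

print(format_references([
    "Weinberger et al. Distance Metric Learning for Large Margin "
    "Nearest Neighbor Classification. JMLR 2009.",
    "Wikipedia entry on Large Margin Nearest Neighbor.",
]))
```

Since the numbers are plain text rather than footnote labels, Sphinx no longer emits "unreferenced citation" warnings for them.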
doc/unsupervised.rst

Lines changed: 6 additions & 3 deletions
@@ -20,7 +20,7 @@ It can be used for ZCA whitening of the data (see the Wikipedia page of
 `whitening transformation <https://en.wikipedia.org/wiki/\
 Whitening_transformation>`_).

-.. topic:: Example Code:
+.. rubric:: Example Code

 ::

@@ -32,6 +32,9 @@ Whitening_transformation>`_).
     cov = Covariance().fit(iris)
     x = cov.transform(iris)

-.. topic:: References:
+.. rubric:: References

-.. [1] On the Generalized Distance in Statistics, P.C.Mahalanobis, 1936
+
+.. container:: hatnote hatnote-gray
+
+   [1]. On the Generalized Distance in Statistics, P.C.Mahalanobis, 1936.

0 commit comments
