Changes in documentation. Rephrasing, fixed examples, standardized notation, etc. (#274)
* Multiple changes to the documentation. Rephrasing, fixed examples and standardized notation, and others.
* Forgot to change one A to L
* Replaced broken modindex link for module list
* fixed compliance with flake8
* Fixed typos, misplaced example, etc
* No new bullet and rectification
* remove modules index link
* add "respectively"
* fix rca examples
* fix rca examples again
README.rst: 1 addition & 0 deletions
@@ -26,6 +26,7 @@ metric-learn contains efficient Python implementations of several popular superv
 - For SDML, using skggm will allow the algorithm to solve problematic cases
   (install from commit `a0ed406 <https://github.com/skggm/skggm/commit/a0ed406586c4364ea3297a658f415e13b5cbdaf8>`_).
+  ``pip install 'git+https://github.com/skggm/skggm.git@a0ed406586c4364ea3297a658f415e13b5cbdaf8'`` to install the required version of skggm from GitHub.
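For context on what this install line buys the user: with skggm available, SDML can fall back on a graphical-lasso solver for ill-conditioned problems. A minimal sketch of supervised SDML usage follows; the ``num_constraints`` argument name is an assumption tied to older metric-learn releases (later versions renamed some constructor arguments), so treat this as illustrative rather than the package's documented example.

```python
from sklearn.datasets import load_iris
from metric_learn import SDML_Supervised

X, y = load_iris(return_X_y=True)

# SDML_Supervised samples similar/dissimilar pairs from the labels and
# learns a sparse Mahalanobis metric; when skggm is installed, its
# graphical-lasso solver can handle cases where the default solver fails.
sdml = SDML_Supervised(num_constraints=200)
sdml.fit(X, y)

X_sdml = sdml.transform(X)  # data mapped into the learned metric space
```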
doc/getting_started.rst: 2 additions & 1 deletion
@@ -10,7 +10,7 @@ Run ``pip install metric-learn`` to download and install from PyPI.
 Alternately, download the source repository and run:

 - ``python setup.py install`` for default installation.
-- ``python setup.py test`` to run all tests.
+- ``pytest test`` to run all tests.

 **Dependencies**
@@ -21,6 +21,7 @@ Alternately, download the source repository and run:
 - For SDML, using skggm will allow the algorithm to solve problematic cases
   (install from commit `a0ed406 <https://github.com/skggm/skggm/commit/a0ed406586c4364ea3297a658f415e13b5cbdaf8>`_).
+  ``pip install 'git+https://github.com/skggm/skggm.git@a0ed406586c4364ea3297a658f415e13b5cbdaf8'`` to install the required version of skggm from GitHub.
-where :math:`\mathbf{x}_i` is an data point, :math:`\mathbf{x}_j` is one
-of its knearest neighbors sharing the same label, and :math:`\mathbf{x}_l`
+where :math:`\mathbf{x}_i` is a data point, :math:`\mathbf{x}_j` is one
+of its k-nearest neighbors sharing the same label, and :math:`\mathbf{x}_l`
 are all the other instances within that region with different labels,
 :math:`\eta_{ij}, y_{ij} \in\{0, 1\}` are both the indicators,
-:math:`\eta_{ij}` represents :math:`\mathbf{x}_{j}` is the knearest
-neighbors(with same labels) of :math:`\mathbf{x}_{i}`, :math:`y_{ij}=0`
-indicates :math:`\mathbf{x}_{i}, \mathbf{x}_{j}` belong to different class,
+:math:`\eta_{ij}` indicates that :math:`\mathbf{x}_{j}` is among the k-nearest
+neighbors (with the same label) of :math:`\mathbf{x}_{i}`, :math:`y_{ij}=0`
+indicates :math:`\mathbf{x}_{i}, \mathbf{x}_{j}` belong to different classes,
 :math:`[\cdot]_+=\max(0, \cdot)` is the Hinge loss.
.. topic:: Example Code:
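The body of this example block is collapsed in the diff view. For orientation, here is a sketch of what LMNN example code in these docs typically looks like; the ``k`` and ``learn_rate`` argument names are assumptions tied to older metric-learn releases, not a verbatim copy of the file.

```python
from sklearn.datasets import load_iris
from metric_learn import LMNN

X, y = load_iris(return_X_y=True)

# Pull each point's k same-label neighbors closer while pushing
# differently-labeled "impostors" out of that neighborhood by a margin,
# i.e. minimize the hinge-loss objective described above.
lmnn = LMNN(k=5, learn_rate=1e-6)
lmnn.fit(X, y)

X_lmnn = lmnn.transform(X)  # data in the learned metric space
```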
@@ -235,7 +235,7 @@ the sum of probability of being correctly classified:
 Local Fisher Discriminant Analysis (:py:class:`LFDA <metric_learn.LFDA>`)

-`LFDA` is a linear supervised dimensionality reduction method. It is
+`LFDA` is a linear supervised dimensionality reduction method which effectively combines the ideas of `Linear Discriminant Analysis <https://en.wikipedia.org/wiki/Linear_discriminant_analysis>`_ and Locality-Preserving Projection. It is
 particularly useful when dealing with multi-modality, where one or more classes
 consist of separate clusters in input space. The core optimization problem of
 LFDA is solved as a generalized eigenvalue problem.
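To make the description above concrete, a hedged usage sketch of LFDA as a supervised dimensionality reducer; it assumes a constructor taking ``n_components`` and a neighborhood size ``k`` (these names have shifted between metric-learn releases).

```python
from sklearn.datasets import load_iris
from metric_learn import LFDA

X, y = load_iris(return_X_y=True)

# Project to 2 dimensions. The local affinity weighting keeps each
# cluster of a multi-modal class compact instead of forcing all
# clusters of a class into a single blob, unlike plain LDA.
lfda = LFDA(n_components=2, k=3)
lfda.fit(X, y)

X_2d = lfda.transform(X)
```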
 here :math:`\mathbf{A}_{i,j}` is the :math:`(i,j)`-th entry of the affinity
-matrix :math:`\mathbf{A}`, which can be calculated with local scaling methods.
+matrix :math:`\mathbf{A}`, which can be calculated with local scaling methods; `n` and `n_l` are the total number of points and the number of points per cluster `l`, respectively.

 Then the learning problem becomes deriving the LFDA transformation matrix
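The "local scaling" mentioned in the changed line is usually the Zelnik-Manor and Perona heuristic: each point gets a bandwidth equal to its distance to its k-th nearest neighbor. A short NumPy illustration of that heuristic follows; this is a sketch of the general technique, not metric-learn's internal code.

```python
import numpy as np
from scipy.spatial.distance import cdist

def local_scaling_affinity(X, k=7):
    """Affinity A_ij = exp(-d(x_i, x_j)^2 / (sigma_i * sigma_j)),
    with sigma_i the distance from x_i to its k-th nearest neighbor."""
    D = cdist(X, X)  # pairwise Euclidean distances
    # Sort each row; column 0 is the point itself (distance 0),
    # so column k is the distance to the k-th nearest neighbor.
    sigma = np.sort(D, axis=1)[:, k]
    return np.exp(-(D ** 2) / np.outer(sigma, sigma))
```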