
Commit 8027751

Add terms to the glossary (#5014)
* Added changes to glossary.md
* Made changes to formula for Bayes theorem
* Fix syntax and conflicts

Co-authored-by: Oriol (ZBook) <[email protected]>
1 parent 557bc79 commit 8027751

File tree

1 file changed (+21, -0 lines)


docs/source/glossary.md

Lines changed: 21 additions & 0 deletions
@@ -95,4 +95,25 @@ GLM
Hierarchical Ordinary Differential Equation
  Calculations of {term}`Ordinary Differential Equation`s at the individual, group, or other levels.

[Generalized Poisson Distribution](https://doi.org/10.2307/1267389)
  A generalization of the {term}`Poisson distribution` with two parameters, X1 and X2, obtained as a limiting form of the generalized negative binomial distribution. The variance of the distribution is greater than, equal to, or smaller than the mean according to whether X2 is positive, zero, or negative. For the formula and more detail, see the link in the title.

[Bayes' theorem](https://en.wikipedia.org/wiki/Bayes%27_theorem)
  Describes the probability of an event based on prior knowledge of conditions that might be related to the event. For example, if the risk of developing health problems is known to increase with age, Bayes' theorem allows the risk to an individual of a known age to be assessed more accurately (by conditioning on their age) than by simply assuming that the individual is typical of the population as a whole.

  Formula:

  $$
  P(A|B) = \frac{P(B|A) P(A)}{P(B)}
  $$

  where $A$ and $B$ are events and $P(B) \neq 0$.
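The formula above can be illustrated with a quick numerical sketch. The events and probabilities below (a diagnostic test scenario) are hypothetical, chosen only for illustration:

```python
# Hypothetical example of Bayes' theorem: A = "has condition", B = "test positive".
p_a = 0.01               # prior P(A)
p_b_given_a = 0.95       # P(B|A), test sensitivity
p_b_given_not_a = 0.05   # P(B|not A), false positive rate

# Law of total probability gives P(B).
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))
```

Even with a fairly accurate test, the posterior here is only about 0.161, far above the 0.01 prior but far below the test's sensitivity, which is the kind of adjustment conditioning provides.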

[Markov Chain](https://en.wikipedia.org/wiki/Markov_chain)
  A Markov chain, or Markov process, is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
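A minimal sketch of that property: the next state is drawn using only the current state. The two states and transition probabilities below are hypothetical, not from the glossary:

```python
import random

random.seed(0)

# Hypothetical two-state chain; each row lists (next state, probability).
transition = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Draw the next state; it depends only on the current state (Markov property)."""
    r, cum = random.random(), 0.0
    for nxt, p in transition[state]:
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

chain = ["sunny"]
for _ in range(10):
    chain.append(step(chain[-1]))
print(chain)
```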

[Markov Chain Monte Carlo](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo)
[MCMC](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo)
  Markov chain Monte Carlo (MCMC) methods comprise a class of algorithms for sampling from a probability distribution. By constructing a {term}`Markov Chain` that has the desired distribution as its equilibrium distribution, one can obtain a sample from the desired distribution by recording states from the chain. Various algorithms exist for constructing such chains, including the Metropolis–Hastings algorithm.
:::::
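The Metropolis–Hastings algorithm mentioned in the MCMC entry can be sketched in a few lines. The standard-normal target, step size, and sample count here are assumptions for illustration, not anything prescribed by the glossary:

```python
import math
import random

random.seed(1)

def log_target(x):
    # Unnormalized log-density of a standard normal (assumed target).
    return -0.5 * x * x

def metropolis_hastings(n_samples, step_size=1.0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    x, samples = 0.0, []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step_size)
        # Accept with probability min(1, target(proposal) / target(x)).
        if math.log(random.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)  # recording states yields a sample from the target
    return samples

samples = metropolis_hastings(5000)
print(sum(samples) / len(samples))  # near the target mean of 0
```

Because the recorded states are correlated draws from the chain's equilibrium distribution, their empirical mean approaches the target mean as the chain runs longer.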
