Commit bba5916

Merge branch 'master' into smc_ml
2 parents: 05dddb1 + d4f850a

39 files changed (+668, -285 lines)

README.rst

Lines changed: 2 additions & 2 deletions
@@ -51,8 +51,8 @@ Learn Bayesian statistics with a book together with PyMC3:
 
 - `Probabilistic Programming and Bayesian Methods for Hackers <https://github.com/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers>`__: Fantastic book with many applied code examples.
 - `PyMC3 port of the book "Doing Bayesian Data Analysis" by John Kruschke <https://github.com/aloctavodia/Doing_bayesian_data_analysis>`__ as well as the `second edition <https://github.com/JWarmenhoven/DBDA-python>`__: Principled introduction to Bayesian data analysis.
-- `PyMC3 port of the book "Statistical Rethinking A Bayesian Course with Examples in R and Stan" by Richard McElreath <https://github.com/aloctavodia/Statistical-Rethinking-with-Python-and-PyMC3>`__
-- `PyMC3 port of the book "Bayesian Cognitive Modeling" by Michael Lee and EJ Wagenmakers <https://github.com/junpenglao/Bayesian-Cognitive-Modeling-in-Pymc3>`__: Focused on using Bayesian statistics in cognitive modeling.
+- `PyMC3 port of the book "Statistical Rethinking A Bayesian Course with Examples in R and Stan" by Richard McElreath <https://github.com/pymc-devs/resources/tree/master/Rethinking>`__
+- `PyMC3 port of the book "Bayesian Cognitive Modeling" by Michael Lee and EJ Wagenmakers <https://github.com/pymc-devs/resources/tree/master/BCM>`__: Focused on using Bayesian statistics in cognitive modeling.
 - `Bayesian Analysis with Python by Osvaldo Martin <https://www.packtpub.com/big-data-and-business-intelligence/bayesian-analysis-python>`__ (and `errata <https://github.com/aloctavodia/BAP>`__): Great introductory book.
 
 PyMC3 talks

RELEASE-NOTES.md

Lines changed: 1 addition & 0 deletions
@@ -6,6 +6,7 @@
 - Renamed `sample_ppc()` and `sample_ppc_w()` to `sample_posterior_predictive()` and `sample_posterior_predictive_w()`, respectively.
 - Refactor SMC and properly compute marginal likelihood (#3124)
 
+
 ## PyMC 3.5 (July 21 2018)
 
 ### New features
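
To make the rename concrete, here is a minimal sketch of the new call in use. The toy model, data, and variable names below are illustrative assumptions, not code from this commit; `sd=` is the keyword spelling used by PyMC3 releases of this era.

    import numpy as np
    import pymc3 as pm

    # Toy data, for illustration only.
    y_obs = np.random.normal(loc=1.0, scale=2.0, size=100)

    with pm.Model() as model:
        mu = pm.Normal("mu", mu=0.0, sd=10.0)
        sigma = pm.HalfNormal("sigma", sd=5.0)
        y = pm.Normal("y", mu=mu, sd=sigma, observed=y_obs)
        trace = pm.sample(1000, tune=1000)

        # Formerly pm.sample_ppc(trace, samples=500); same call, new name.
        ppc = pm.sample_posterior_predictive(trace, samples=500)

    # ppc["y"] has shape (500, 100): one simulated dataset per kept draw.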

build_and_deploy_docs.sh

Lines changed: 3 additions & 0 deletions
@@ -1,5 +1,8 @@
 #!/bin/sh
 
+latesttag=$(git describe --tags `git rev-list --tags --max-count=1`)
+echo checking out ${latesttag}
+git checkout ${latesttag}
 pushd docs/source
 make html
 ghp-import -c docs.pymc.io -n -p _build/html/

docs/source/advanced_theano.rst

Lines changed: 2 additions & 2 deletions
@@ -60,9 +60,9 @@ variable for our observations::
     # fit the model
     trace = pm.sample()
 
-    # Switch out the observations and use `sample_ppc` to predict
+    # Switch out the observations and use `sample_posterior_predictive` to predict
     x_shared.set_value([-1, 0, 1.])
-    post_pred = pm.sample_ppc(trace, samples=500)
+    post_pred = pm.sample_posterior_predictive(trace, samples=500)
 
 However, due to the way we handle shapes at the moment, it is
 not possible to change the shape of a shared variable if that would
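
For readers following this doc change, below is a self-contained sketch of the shared-variable prediction pattern the snippet above is part of. The logistic-regression toy model and the names `x`, `y`, `x_shared` are reconstructions for illustration, not necessarily the exact code on that docs page.

    import numpy as np
    import theano
    import pymc3 as pm

    # Toy data: one predictor, binary outcome.
    x = np.random.randn(100)
    y = x > 0

    # Wrap the predictor in a theano shared variable so it can be swapped later.
    x_shared = theano.shared(x)

    with pm.Model() as model:
        coeff = pm.Normal("coeff", mu=0.0, sd=1.0)
        logistic = pm.math.sigmoid(coeff * x_shared)
        obs = pm.Bernoulli("obs", p=logistic, observed=y)

        # fit the model
        trace = pm.sample()

        # Switch out the observations and use `sample_posterior_predictive` to predict
        x_shared.set_value([-1, 0, 1.])
        post_pred = pm.sample_posterior_predictive(trace, samples=500)

    # post_pred["obs"] holds posterior predictive draws evaluated at the new x values.

Posterior predictive draws pick up the new predictor values; as the surrounding prose cautions, shape changes that would alter the shape of a model variable are not supported.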

docs/source/examples.rst

Lines changed: 1 addition & 0 deletions
@@ -18,6 +18,7 @@ Howto
 notebooks/PyMC3_tips_and_heuristic.ipynb
 notebooks/LKJ.ipynb
 notebooks/live_sample_plots.ipynb
+notebooks/sampling_compound_step.ipynb
 
 Applied
 =======

docs/source/intro.rst

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ Introduction
 Purpose
 =======
 
-PyMC3 is a probabilistic programming module for Python that allows users to fit Bayesian models using a variety of numerical methods, most notably Markov chain Monte Carlo (MCMC) and variational inference (VI). Its flexibility and extensibility make it applicable to a large suite of problems. Along with core model specification and fitting functionality, PyMC3 includes functionality for summarizing output and for model diagnostics.
+PyMC3 is a probabilistic programming package for Python that allows users to fit Bayesian models using a variety of numerical methods, most notably Markov chain Monte Carlo (MCMC) and variational inference (VI). Its flexibility and extensibility make it applicable to a large suite of problems. Along with core model specification and fitting functionality, PyMC3 includes functionality for summarizing output and for model diagnostics.
 
 
 
docs/source/notebooks/Bayes_factor.ipynb

Lines changed: 2 additions & 2 deletions
@@ -437,8 +437,8 @@
 ],
 "source": [
 "_, ax = plt.subplots()\n",
-"ppc_0 = pm.sample_ppc(traces[0], 1000, models[0], size=(len(y), 20))\n",
-"ppc_1 = pm.sample_ppc(traces[1], 1000, models[1], size=(len(y), 20))\n",
+"ppc_0 = pm.sample_posterior_predictive(traces[0], 1000, models[0], size=(len(y), 20))\n",
+"ppc_1 = pm.sample_posterior_predictive(traces[1], 1000, models[1], size=(len(y), 20))\n",
 "for m_0, m_1 in zip(ppc_0['yl'].T, ppc_1['yl'].T):\n",
 " pm.kdeplot(np.mean(m_0, 0), ax=ax, color='C0', label='model 0')\n",
 " pm.kdeplot(np.mean(m_1, 0), ax=ax, color='C1', label='model 1')\n",

docs/source/notebooks/Euler-Maruyama and SDEs.ipynb renamed to docs/source/notebooks/Euler-Maruyama_and_SDEs.ipynb

Lines changed: 2 additions & 2 deletions
@@ -431,7 +431,7 @@
 ],
 "source": [
 "# generate trace from posterior\n",
-"ppc_trace = pm.sample_ppc(trace, model=model)\n",
+"ppc_trace = pm.sample_posterior_predictive(trace, model=model)\n",
 "\n",
 "# plot with data\n",
 "figure(figsize=(10, 3))\n",
@@ -799,7 +799,7 @@
 ],
 "source": [
 "# generate trace from posterior\n",
-"ppc_trace = pm.sample_ppc(trace, model=model)\n",
+"ppc_trace = pm.sample_posterior_predictive(trace, model=model)\n",
 "\n",
 "# plot with data\n",
 "figure(figsize=(10, 3))\n",

docs/source/notebooks/GLM-robust-with-outlier-detection.ipynb

Lines changed: 112 additions & 196 deletions
Large diffs are not rendered by default.

docs/source/notebooks/GP-Latent.ipynb

Lines changed: 3 additions & 3 deletions
@@ -301,7 +301,7 @@
 "\n",
 "### Using `.conditional`\n",
 "\n",
-"Next, we extend the model by adding the conditional distribution so we can predict at new $x$ locations. Lets see how the extrapolation looks out to higher $x$. To do this, we extend our `model` with the `conditional` distribution of the GP. Then, we can sample from it using the `trace` and the `sample_ppc` function. This is similar to how Stan uses its `generated quantities {...}` blocks. We could have included `gp.conditional` in the model *before* we did the NUTS sampling, but it is more efficient to separate these steps."
+"Next, we extend the model by adding the conditional distribution so we can predict at new $x$ locations. Lets see how the extrapolation looks out to higher $x$. To do this, we extend our `model` with the `conditional` distribution of the GP. Then, we can sample from it using the `trace` and the `sample_posterior_predictive` function. This is similar to how Stan uses its `generated quantities {...}` blocks. We could have included `gp.conditional` in the model *before* we did the NUTS sampling, but it is more efficient to separate these steps."
 ]
 },
 {
@@ -334,7 +334,7 @@
 "\n",
 "# Sample from the GP conditional distribution\n",
 "with model:\n",
-" pred_samples = pm.sample_ppc(trace, vars=[f_pred], samples=1000)"
+" pred_samples = pm.sample_posterior_predictive(trace, vars=[f_pred], samples=1000)"
 ]
 },
 {
@@ -543,7 +543,7 @@
 " f_pred = gp.conditional(\"f_pred\", X_new)\n",
 "\n",
 "with model:\n",
-" pred_samples = pm.sample_ppc(trace, vars=[f_pred], samples=1000)"
+" pred_samples = pm.sample_posterior_predictive(trace, vars=[f_pred], samples=1000)"
 ]
 },
 {
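
For context on the notebook cells changed above, here is a compact sketch of the latent-GP prediction workflow they belong to. The toy data, kernel choice, and priors are assumptions for illustration rather than the notebook's exact code; only the `gp.conditional` / `sample_posterior_predictive` pattern is taken from the diff.

    import numpy as np
    import pymc3 as pm

    # Toy 1-D regression data.
    X = np.linspace(0, 10, 50)[:, None]
    y = np.sin(X).ravel() + 0.3 * np.random.randn(50)

    with pm.Model() as model:
        ell = pm.Gamma("ell", alpha=2.0, beta=1.0)
        eta = pm.HalfNormal("eta", sd=2.0)
        cov = eta ** 2 * pm.gp.cov.ExpQuad(1, ell)

        gp = pm.gp.Latent(cov_func=cov)
        f = gp.prior("f", X=X)

        sigma = pm.HalfNormal("sigma", sd=1.0)
        y_ = pm.Normal("y", mu=f, sd=sigma, observed=y)

        trace = pm.sample(500, tune=500, chains=2)

    # Extend the model with the GP conditional at new inputs, then sample from it.
    X_new = np.linspace(10, 15, 30)[:, None]
    with model:
        f_pred = gp.conditional("f_pred", X_new)
        pred_samples = pm.sample_posterior_predictive(trace, vars=[f_pred], samples=1000)

    # pred_samples["f_pred"] has shape (1000, 30): latent function draws at X_new.

As the edited markdown cell notes, adding `gp.conditional` after NUTS sampling and drawing from it with `sample_posterior_predictive` plays the role of Stan's `generated quantities {...}` block.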
