diff --git a/_quarto.yml b/_quarto.yml
index bbc4c44f1..52d1d2927 100644
--- a/_quarto.yml
+++ b/_quarto.yml
@@ -50,15 +50,14 @@ website:
     - text: documentation
      collapse-level: 1
      contents:
-        - section: "Documentation"
+        - section: "For Users"
          # href: tutorials/index.qmd, This page will be added later so keep this line commented
          contents:
-            - section: "Using Turing - Modelling Syntax and Interface"
+            - section: "Using the Turing library"
              collapse-level: 1
              contents:
                - tutorials/docs-00-getting-started/index.qmd
-                - text: "Quick Start"
-                  href: tutorials/docs-14-using-turing-quick-start/index.qmd
+                - tutorials/00-introduction/index.qmd
                - tutorials/docs-12-using-turing-guide/index.qmd
                - text: "Mode Estimation"
                  href: tutorials/docs-17-mode-estimation/index.qmd
@@ -70,9 +69,8 @@ website:
                - text: "External Samplers"
                  href: tutorials/docs-16-using-turing-external-samplers/index.qmd
 
-            - section: "Using Turing - Tutorials"
+            - section: "Examples of Turing Models"
              contents:
-                - tutorials/00-introduction/index.qmd
                - text: Gaussian Mixture Models
                  href: tutorials/01-gaussian-mixture-model/index.qmd
                - tutorials/02-logistic-regression/index.qmd
@@ -97,13 +95,15 @@ website:
                - text: "Gaussian Process Latent Variable Models"
                  href: tutorials/12-gplvm/index.qmd
 
-        - section: "Developers: Contributing"
+        - section: "For Developers"
+          contents:
+            - section: "Contributing"
          collapse-level: 1
          contents:
            - text: "How to Contribute"
              href: tutorials/docs-01-contributing-guide/index.qmd
 
-        - section: "Developers: PPL"
+        - section: "How Turing Works"
          collapse-level: 1
          contents:
            - tutorials/docs-05-for-developers-compiler/index.qmd
diff --git a/tutorials/docs-00-getting-started/index.qmd b/tutorials/docs-00-getting-started/index.qmd
index 17197f977..e8e69a904 100644
--- a/tutorials/docs-00-getting-started/index.qmd
+++ b/tutorials/docs-00-getting-started/index.qmd
@@ -16,96 +16,65 @@ Pkg.instantiate();
 
 To use Turing, you need to install Julia first and then install Turing.
 
-### Install Julia
+You will need to install Julia 1.7 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).
 
-You will need to install Julia 1.3 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).
-
-### Install Turing.jl
-
-Turing is an officially registered Julia package, so you can install a stable version of Turing by running the following in the Julia REPL:
+Turing is officially registered in the [Julia General package registry](https://github.com/JuliaRegistries/General), which means that you can install a stable version of Turing by running the following in the Julia REPL:
 
 ```{julia}
+#| eval: false
 #| output: false
 using Pkg
 Pkg.add("Turing")
 ```
 
-You can check if all tests pass by running `Pkg.test("Turing")` (it might take a long time)
-
-### Example
+### Example usage
 
-Here's a simple example showing Turing in action.
-
-First, we can load the Turing and StatsPlots modules
+First, we load the Turing and StatsPlots modules.
+The latter is required for visualising the results.
 
 ```{julia}
 using Turing
 using StatsPlots
 ```
 
-Then, we define a simple Normal model with unknown mean and variance
+We then specify our model, a simple Gaussian model with unknown mean and variance.
+In mathematical notation, the model is defined as follows:
+
+$$\begin{aligned}
+s^2 &\sim \text{InverseGamma}(2, 3) \\
+m &\sim \mathcal{N}(0, \sqrt{s^2}) \\
+x, y &\sim \mathcal{N}(m, \sqrt{s^2})
+\end{aligned}$$
+
+This translates directly into the following Turing model.
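+Note that Julia's `Normal(μ, σ)` is parameterised by the standard deviation σ rather than the variance, which is why the model below writes `sqrt(s²)`.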
+Here, both `x` and `y` are observed values, and should therefore be passed as function parameters.
+`m` and `s²` are the parameters to be inferred.
 
 ```{julia}
 @model function gdemo(x, y)
     s² ~ InverseGamma(2, 3)
     m ~ Normal(0, sqrt(s²))
     x ~ Normal(m, sqrt(s²))
-    return y ~ Normal(m, sqrt(s²))
+    y ~ Normal(m, sqrt(s²))
 end
 ```
 
-Then we can run a sampler to collect results. In this case, it is a Hamiltonian Monte Carlo sampler
-
-```{julia}
-chn = sample(gdemo(1.5, 2), NUTS(), 1000, progress=false)
-```
-
-We can plot the results
+Suppose we observe `x = 1.5` and `y = 2`, and want to infer the mean and variance.
+We can pass these data as arguments to the `gdemo` function, and run a sampler to collect the results.
+Here, we collect 1000 samples using the No U-Turn Sampler (NUTS) algorithm.
 
 ```{julia}
-plot(chn)
+chain = sample(gdemo(1.5, 2), NUTS(), 1000, progress=false)
 ```
 
-In this case, because we use the normal-inverse gamma distribution as a conjugate prior, we can compute its updated mean as follows:
+We can plot the results:
 
 ```{julia}
-s² = InverseGamma(2, 3)
-m = Normal(0, 1)
-data = [1.5, 2]
-x_bar = mean(data)
-N = length(data)
-
-mean_exp = (m.σ * m.μ + N * x_bar) / (m.σ + N)
-```
-
-We can also compute the updated variance
-
-```{julia}
-updated_alpha = shape(s²) + (N / 2)
-updated_beta =
-    scale(s²) +
-    (1 / 2) * sum((data[n] - x_bar)^2 for n in 1:N) +
-    (N * m.σ) / (N + m.σ) * ((x_bar)^2) / 2
-variance_exp = updated_beta / (updated_alpha - 1)
+plot(chain)
 ```
 
-Finally, we can check if these expectations align with our HMC approximations from earlier. We can compute samples from a normal-inverse gamma following the equations given [here](https://en.wikipedia.org/wiki/Normal-inverse-gamma_distribution#Generating_normal-inverse-gamma_random_variates).
-
-```{julia}
-function sample_posterior(alpha, beta, mean, lambda, iterations)
-    samples = []
-    for i in 1:iterations
-        sample_variance = rand(InverseGamma(alpha, beta), 1)
-        sample_x = rand(Normal(mean, sqrt(sample_variance[1]) / lambda), 1)
-        samples = append!(samples, sample_x)
-    end
-    return samples
-end
-
-analytical_samples = sample_posterior(updated_alpha, updated_beta, mean_exp, 2, 1000);
-```
+We can also obtain summary statistics by indexing the chain:
 
 ```{julia}
-density(analytical_samples; label="Posterior (Analytical)")
-density!(chn[:m]; label="Posterior (HMC)")
+mean(chain[:m]), mean(chain[:s²])
 ```
diff --git a/tutorials/docs-12-using-turing-guide/index.qmd b/tutorials/docs-12-using-turing-guide/index.qmd
index b66b4d29d..56d59a144 100755
--- a/tutorials/docs-12-using-turing-guide/index.qmd
+++ b/tutorials/docs-12-using-turing-guide/index.qmd
@@ -1,5 +1,5 @@
 ---
-title: Guide
+title: "Turing's Core Functionality"
 engine: julia
 ---
 
@@ -10,6 +10,8 @@ using Pkg;
 Pkg.instantiate();
 ```
 
+This article provides an overview of the core functionality in Turing.jl, which is likely to be used across a wide range of models.
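+
+To make the pattern concrete before diving in, here is a minimal sketch of the workflow that recurs throughout this guide (a hypothetical toy model, assuming only that Turing is installed): define a model with `@model`, condition it on data, and pass it to `sample`.
+
+```{julia}
+#| eval: false
+using Turing
+
+# A toy model: standard-normal prior on the mean, unit-variance likelihood.
+@model function demo(x)
+    μ ~ Normal(0, 1)     # prior on the mean
+    x ~ Normal(μ, 1)     # likelihood for the observed value
+end
+
+# Condition on an observation and draw 1000 posterior samples with NUTS.
+chain = sample(demo(0.5), NUTS(), 1000)
+```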
+
 
 ## Basics
 
 ### Introduction
 
diff --git a/tutorials/docs-14-using-turing-quick-start/index.qmd b/tutorials/docs-14-using-turing-quick-start/index.qmd
deleted file mode 100755
index 67d848752..000000000
--- a/tutorials/docs-14-using-turing-quick-start/index.qmd
+++ /dev/null
@@ -1,74 +0,0 @@
----
-pagetitle: Quick Start
-title: Probabilistic Programming in Thirty Seconds
-engine: julia
----
-
-```{julia}
-#| echo: false
-#| output: false
-using Pkg;
-Pkg.instantiate();
-```
-
-If you are already well-versed in probabilistic programming and want to take a quick look at how Turing's syntax works or otherwise just want a model to start with, we have provided a complete Bayesian coin-flipping model below.
-
-This example can be run wherever you have Julia installed (see [Getting Started](../docs-00-getting-started/index.qmd), but you will need to install the packages `Turing` and `StatsPlots` if you have not done so already.
-
-This is an excerpt from a more formal example which can be found [here](../00-introduction/index.qmd).
-
-## Import Libraries
-```{julia}
-# Import libraries.
-using Turing, StatsPlots, Random
-```
-
-```{julia}
-# Set the true probability of heads in a coin.
-p_true = 0.5
-
-# Iterate from having seen 0 observations to 100 observations.
-Ns = 0:100
-```
-
-```{julia}
-# Draw data from a Bernoulli distribution, i.e. draw heads or tails.
-Random.seed!(12)
-data = rand(Bernoulli(p_true), last(Ns))
-```
-
-
-## Declare Turing Model
-```{julia}
-# Declare our Turing model.
-@model function coinflip(y)
-    # Our prior belief about the probability of heads in a coin.
-    p ~ Beta(1, 1)
-
-    # The number of observations.
-    N = length(y)
-    for n in 1:N
-        # Heads or tails of a coin are drawn from a Bernoulli distribution.
-        y[n] ~ Bernoulli(p)
-    end
-end
-```
-
-
-## Setting HMC Sampler
-```{julia}
-# Settings of the Hamiltonian Monte Carlo (HMC) sampler.
-iterations = 1000
-ϵ = 0.05
-τ = 10
-
-# Start sampling.
-chain = sample(coinflip(data), HMC(ϵ, τ), iterations, progress=false)
-```
-
-
-## Plot a summary
-```{julia}
-# Plot a summary of the sampling process for the parameter p, i.e. the probability of heads in a coin.
-histogram(chain[:p])
-```
\ No newline at end of file