- Bump the minimum version of Julia.
- Explain the significance of parameters to the `@model` function.
- Explain how to obtain the actual samples in the chain.
- Expand on the maths involved in the analytical posterior calculation (otherwise, to someone who hasn't seen it before, the formulas look too magic).

Changed file: `tutorials/docs-00-getting-started/index.qmd` (86 additions, 33 deletions)

@@ -16,96 +16,149 @@ Pkg.instantiate();
 To use Turing, you need to install Julia first and then install Turing.

-### Install Julia
+You will need to install Julia 1.7 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).

-You will need to install Julia 1.3 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).
-
-### Install Turing.jl
-
-Turing is an officially registered Julia package, so you can install a stable version of Turing by running the following in the Julia REPL:
+Turing is officially registered in the [Julia General package registry](https://github.com/JuliaRegistries/General), which means that you can install a stable version of Turing by running the following in the Julia REPL:
 ```{julia}
 #| output: false
 using Pkg
 Pkg.add("Turing")
 ```

-You can check if all tests pass by running `Pkg.test("Turing")` (it might take a long time)
+### Example usage

-### Example
+Here is a simple example showing Turing in action.

-Here's a simple example showing Turing in action.
-
-First, we can load the Turing and StatsPlots modules
+First, we load the Turing and StatsPlots modules.
+The latter is required for visualising the results.

 ```{julia}
 using Turing
 using StatsPlots
 ```

-Then, we define a simple Normal model with unknown mean and variance
+Then, we define a simple Normal model with unknown mean and variance.
+Here, both `x` and `y` are observed values (since they are function parameters), whereas `m` and `s²` are the parameters to be inferred.

 ```{julia}
 @model function gdemo(x, y)
     s² ~ InverseGamma(2, 3)
     m ~ Normal(0, sqrt(s²))
     x ~ Normal(m, sqrt(s²))
-    return y ~ Normal(m, sqrt(s²))
+    y ~ Normal(m, sqrt(s²))
 end
 ```

-Then we can run a sampler to collect results. In this case, it is a Hamiltonian Monte Carlo sampler
+Suppose we have some data `x` and `y` that we want to infer the mean and variance for.
+We can pass these data as arguments to the `gdemo` function, and run a sampler to collect the results.
+In this specific example, we collect 1000 samples using the No U-Turn Sampler (NUTS) algorithm, which is a Hamiltonian Monte Carlo method.
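
The sampling call itself sits in a collapsed part of this diff. As a rough, hypothetical sketch of the step described above (the observed data values `1.5` and `2` and the default NUTS settings are assumptions, not taken from the diff):

```julia
# Illustrative sketch only: the observed data (1.5, 2) and the sampler
# configuration are assumed, not taken from the diff.
using Turing

chain = sample(gdemo(1.5, 2), NUTS(), 1000)
```

The returned `chain` object is what stores the collected samples for `m` and `s²`.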
+The four updated parameters are given respectively (see for example [Wikipedia](https://en.wikipedia.org/wiki/Normal-gamma_distribution#Posterior_distribution_of_the_parameters)) by
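
The formula lines themselves fall in a collapsed part of the diff. For reference, assuming the standard parameterisation from the linked Wikipedia page (prior parameters $\mu_0$, $\lambda_0$, $\alpha_0$, $\beta_0$ and $N$ observations with sample mean $\bar{x}$), the updates are

$$
\mu' = \frac{\lambda_0 \mu_0 + N \bar{x}}{\lambda_0 + N}, \qquad
\lambda' = \lambda_0 + N, \qquad
\alpha' = \alpha_0 + \frac{N}{2}, \qquad
\beta' = \beta_0 + \frac{1}{2} \sum_{n=1}^{N} (x_n - \bar{x})^2 + \frac{\lambda_0 N (\bar{x} - \mu_0)^2}{2 (\lambda_0 + N)}.
$$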
+We can use these to analytically calculate, for example, the expectation values of the mean and variance parameters in the posterior distribution: $E[m] = \mu'$, and $E[s^2] = \beta'/(\alpha' - 1)$.
 ```{julia}
-updated_alpha = shape(s²) + (N / 2)
-updated_beta =
-    scale(s²) +
-    (1 / 2) * sum((data[n] - x_bar)^2 for n in 1:N) +
-    (N * m.σ) / (N + m.σ) * ((x_bar)^2) / 2
-variance_exp = updated_beta / (updated_alpha - 1)
+mean_exp = µ´
+variance_exp = β´ / (α´ - 1)
+
+(mean_exp, variance_exp)
 ```
-Finally, we can check if these expectations align with our HMC approximations from earlier. We can compute samples from a normal-inverse gamma following the equations given [here](https://en.wikipedia.org/wiki/Normal-inverse-gamma_distribution#Generating_normal-inverse-gamma_random_variates).
+Alternatively, we can compare the two distributions visually (one parameter at a time).
+To do so, we will directly sample from the analytical posterior distribution using the procedure described in [Wikipedia](https://en.wikipedia.org/wiki/Normal-inverse-gamma_distribution#Generating_normal-inverse-gamma_random_variates).
 ```{julia}
-function sample_posterior(alpha, beta, mean, lambda, iterations)
+function sample_posterior_analytic(µ´, λ´, α´, β´, iterations)
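
The diff is truncated at this function signature. As a hypothetical sketch only, a body following the generation procedure on the linked Wikipedia page could look like the one below; the use of Distributions.jl and every line of the body are assumptions, not taken from the diff:

```julia
# Hypothetical sketch, not the tutorial's actual implementation.
# Draws `iterations` samples of (m, s²) from a normal-inverse-gamma
# distribution with parameters µ´, λ´, α´, β´: first draw the variance
# from its marginal, then the mean conditional on that variance.
using Distributions

function sample_posterior_analytic(µ´, λ´, α´, β´, iterations)
    samples = []
    for i in 1:iterations
        s² = rand(InverseGamma(α´, β´))      # variance ~ InverseGamma(α´, β´)
        m = rand(Normal(µ´, sqrt(s² / λ´)))  # mean | variance ~ Normal(µ´, s²/λ´)
        push!(samples, (m, s²))
    end
    return samples
end
```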