---
title: External Likelihoods
engine: julia
---

```{julia}
#| echo: false
#| output: false
using Pkg;
Pkg.instantiate();
```

Sometimes a model's likelihood is not expressed directly as a distribution over observed data, but is instead computed by an external algorithm.
A common example is **state-space models**, where a filtering algorithm (e.g. a Kalman filter or a particle filter) marginalises out the latent states and returns the marginal log-likelihood of the observations given the model parameters.

In this setting Turing only needs to sample the model parameters; the likelihood contribution is injected into the model with the [`@addlogprob!`]({{< meta usage-modifying-logprob >}}) macro.

## Minimal example

The function below stands in for an external filtering algorithm — for instance one provided by [`SSMProblems.jl`](https://github.com/TuringLang/SSMProblems.jl) or [`GeneralisedFilters.jl`](https://github.com/TuringLang/GeneralisedFilters.jl).
Here we simply compute the log-likelihood of a Gaussian with unit variance, which is sufficient to demonstrate the integration pattern.

```{julia}
using Turing

# Mock filter — computes the Gaussian log-likelihood (constant terms
# omitted as they do not affect MCMC).
function run_external_filter(data, θ)
    return -0.5 * sum((data .- θ) .^ 2)
end

@model function external_likelihood_demo(data)
    θ ~ Normal(0, 1)
    @addlogprob! run_external_filter(data, θ)
end
```

We can now sample from this model in the usual way:

```{julia}
data = randn(100)
model = external_likelihood_demo(data)
chain = sample(model, NUTS(), 100)
```

Because the mock filter returns a Gaussian log-likelihood with unit variance, the posterior for `θ` should concentrate near the sample mean of `data`, shrunk slightly towards zero by the `Normal(0, 1)` prior.
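
This is easy to verify, because the model is conjugate: a `Normal(0, 1)` prior combined with the mock filter's Gaussian log-likelihood gives a closed-form Gaussian posterior. The sketch below (self-contained, so it draws its own `data`) computes that posterior:

```julia
using Statistics

# Conjugate-posterior check for the demo model: prior N(0, 1) combined
# with a likelihood proportional to exp(-0.5 * sum((data .- θ) .^ 2))
# yields the posterior N(sum(data) / (n + 1), 1 / (n + 1)).
data = randn(100)
n = length(data)
post_mean = sum(data) / (n + 1)
post_var = 1 / (n + 1)

println("sample mean:    ", mean(data))
println("posterior mean: ", post_mean)
println("posterior var:  ", post_var)
```

Comparing `mean(chain[:θ])` against this `post_mean` is a quick sanity check that the external likelihood was wired in correctly.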

## When to use this pattern

Use `@addlogprob!` whenever the likelihood of your observations is computed by code that lives outside Turing's `~` syntax. Typical cases include:

- **State-space filtering** — packages such as `SSMProblems.jl` and `GeneralisedFilters.jl` evaluate the marginal likelihood via Kalman or particle filters.
- **Hidden Markov Models** — the [HMM tutorial]({{< meta hidden-markov-model >}}#efficient-inference-with-the-forward-algorithm) shows the same pattern using `HiddenMarkovModels.jl` and `logdensityof`.
- **Any domain-specific likelihood** — whenever you have a function that returns a log-probability, you can plug it in with `@addlogprob!`.
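
To make the state-space case more concrete, the sketch below hand-writes a Kalman filter for a scalar local-level model and plugs its marginal log-likelihood into a Turing model. The filter, the model, and the parameter names (`q`, `r`) are illustrative assumptions for this page, not an API from the packages above:

```julia
using Turing

# Kalman filter for a scalar local-level model:
#   x_t = x_{t-1} + w_t,  w_t ~ N(0, q)   (latent state)
#   y_t = x_t + v_t,      v_t ~ N(0, r)   (observation)
# Returns the marginal log-likelihood log p(y | q, r) with the latent
# states integrated out.
function kalman_loglik(y, q, r; m0=0.0, P0=1.0)
    m, P = m0, P0
    ll = 0.0
    for yt in y
        P += q                                      # predict: state variance grows
        S = P + r                                   # innovation variance
        ll += -0.5 * (log(2π * S) + (yt - m)^2 / S) # predictive log-density of yt
        K = P / S                                   # Kalman gain
        m += K * (yt - m)                           # update: filtered mean
        P *= (1 - K)                                # update: filtered variance
    end
    return ll
end

@model function local_level(y)
    q ~ truncated(Normal(0, 1); lower=0)  # state noise variance
    r ~ truncated(Normal(0, 1); lower=0)  # observation noise variance
    @addlogprob! kalman_loglik(y, q, r)
end
```

Sampling then works exactly as in the minimal example, e.g. `sample(local_level(randn(50)), NUTS(), 100)`.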

For more details on the macro itself, see [Modifying the Log Probability]({{< meta usage-modifying-logprob >}}).