
Bayesian inference with Turing.jl

Compatibility with the Distributions.jl API allows extensive interaction with the broader Julia ecosystem. One notable application is Bayesian inference of model parameters (including copula parameters) with Turing.jl.

Suppose we have a model composed of a given copula and given marginals, all with parameters to be fitted. We can then use Turing's `@addlogprob!` macro to add the model's loglikelihood to the log density targeted by the sampler, and draw posterior samples of the parameters as follows:

```julia
using Copulas
using Distributions
using Random
using Turing
using StatsPlots

Random.seed!(123)
true_θ = 7
true_θ₁ = 1
true_θ₂ = 3
true_θ₃ = 2
D = SklarDist(ClaytonCopula(3,true_θ), (Exponential(true_θ₁), Pareto(true_θ₂), Exponential(true_θ₃)))
draws = rand(D, 2000)

@model function copula(X)
    # Priors
    θ  ~ TruncatedNormal(1.0, 1.0, -1/3, Inf)
    θ₁ ~ TruncatedNormal(1.0, 1.0, 0, Inf)
    θ₂ ~ TruncatedNormal(1.0, 1.0, 0, Inf)
    θ₃ ~ TruncatedNormal(1.0, 1.0, 0, Inf)

    # Build the parametric model
    C = ClaytonCopula(3,θ)
    X₁ = Exponential(θ₁)
    X₂ = Pareto(θ₂)
    X₃ = Exponential(θ₃)
    D = SklarDist(C, (X₁, X₂, X₃))

    # Compute the final loglikelihood
    Turing.@addlogprob! loglikelihood(D, X)
end

sampler = NUTS() # MH() works too
chain = sample(copula(draws), sampler, MCMCThreads(), 100, 4)
```
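Keep in mind that `MCMCThreads()` only runs the four chains in parallel when Julia was started with several threads (for example `julia --threads 4`); otherwise the chains are sampled one after the other. A plain serial call, sketched below, works just as well:

```julia
# Serial alternative: a single chain with more iterations.
chain_serial = sample(copula(draws), NUTS(), 1_000)
```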

Note that we truncated the θ parameter at -1/3 rather than at 0, since the ClaytonCopula can handle negative dependence structures; the short check below illustrates this. Only 100 steps were run here to keep the example fast; you can increase this number as needed.
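As a quick illustrative check (not part of the original example), a trivariate ClaytonCopula with a mildly negative parameter can be constructed and sampled directly:

```julia
# θ = -0.2 lies inside the admissible negative-dependence range used above,
# so both construction and sampling succeed.
C_neg = ClaytonCopula(3, -0.2)
u_neg = rand(C_neg, 5)   # 3×5 matrix of samples on the unit hypercube
```

Running the sampler above then produces the following summary of the chain: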

Summary Statistics:

| parameters | true value | mean   | std    | mcse   | ess_bulk | ess_tail | rhat   | ess_per_sec |
|:----------:|-----------:|-------:|-------:|-------:|---------:|---------:|-------:|------------:|
| θ          | 7.0        | 6.9319 | 0.1150 | 0.0067 | 291.8858 | 267.1353 | 1.0061 | 0.7238      |
| θ₁         | 1.0        | 0.9954 | 0.0209 | 0.0019 | 116.6941 |  94.3070 | 1.0347 | 0.2894      |
| θ₂         | 3.0        | 2.9839 | 0.0639 | 0.0062 | 108.9185 | 105.5284 | 1.0390 | 0.2701      |
| θ₃         | 2.0        | 2.0055 | 0.0418 | 0.0039 | 114.7324 | 109.5396 | 1.0328 | 0.2845      |

Quantiles:

| parameters | 2.5%   | 25.0%  | 50.0%  | 75.0%  | 97.5%  |
|:----------:|-------:|-------:|-------:|-------:|-------:|
| θ          | 6.7286 | 6.8438 | 6.9330 | 7.0150 | 7.1436 |
| θ₁         | 0.9555 | 0.9818 | 0.9953 | 1.0093 | 1.0386 |
| θ₂         | 2.8606 | 2.9426 | 2.9859 | 3.0196 | 3.1186 |
| θ₃         | 1.9254 | 1.9758 | 2.0056 | 2.0336 | 2.0923 |
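The same summaries (apart from the true-value column) can also be retrieved programmatically; a minimal sketch, assuming the MCMCChains interface that Turing re-exports:

```julia
# Posterior summaries straight from the chain object.
stats = summarystats(chain)   # mean, std, mcse, ess and rhat for each parameter
qs    = quantile(chain)       # 2.5%, 25%, 50%, 75% and 97.5% posterior quantiles
mean(chain[:θ])               # posterior mean of the copula parameter alone
```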

Finally, `plot(chain)` produces trace and density plots for each of the four parameters.
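If needed, the figure can be saved to disk; a minimal sketch (the file name is illustrative):

```julia
# plot(chain) uses the MCMCChains plotting recipe loaded via StatsPlots:
# one trace panel and one density panel per parameter.
plt = plot(chain)
savefig(plt, "clayton_turing_chain.png")
```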

Similar approaches can be used to fit many other dependence structures in a Bayesian setting. The example above showcases parametric estimation of a sampling model, but Bayesian regression with an error structure given by a parametric copula is just as easy to implement.
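For instance, here is a minimal sketch of a bivariate regression whose two error terms are linked by a Clayton copula. The model name, priors, and data layout (x a vector of covariates, Y a 2×n response matrix) are illustrative choices, not part of the package API:

```julia
@model function copula_regression(x, Y)
    # Regression coefficients and error scales for the two response components
    α₁ ~ Normal(0, 10)
    β₁ ~ Normal(0, 10)
    σ₁ ~ truncated(Normal(0, 1), 0, Inf)
    α₂ ~ Normal(0, 10)
    β₂ ~ Normal(0, 10)
    σ₂ ~ truncated(Normal(0, 1), 0, Inf)
    θ  ~ TruncatedNormal(1.0, 1.0, 0, Inf)   # dependence between the two errors

    C = ClaytonCopula(2, θ)
    for i in eachindex(x)
        # Joint distribution of the two responses at covariate x[i]:
        # Gaussian margins around the regression lines, coupled by the copula.
        Dᵢ = SklarDist(C, (Normal(α₁ + β₁ * x[i], σ₁), Normal(α₂ + β₂ * x[i], σ₂)))
        Turing.@addlogprob! logpdf(Dᵢ, Y[:, i])
    end
end
```

Sampling then proceeds exactly as before, e.g. `sample(copula_regression(x, Y), NUTS(), 1_000)`.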

This was run on the following environment:

```julia
julia> versioninfo()
Julia Version 1.10.0
Commit 3120989f39 (2023-12-25 18:01 UTC)
[...]

(env) pkg> status
  [ae264745] Copulas v0.1.20
  [31c24e10] Distributions v0.25.107
  [f3b207a7] StatsPlots v0.15.6
  [fce5fe82] Turing v0.30.3
  [9a3f8284] Random
```