
covariance matrix cannot be estimated using Wishart with NUTS, HMC or HMCDA samplers #1861

Closed
DDoyle1066 opened this issue Jul 20, 2022 · 2 comments


DDoyle1066 commented Jul 20, 2022

I ran into an issue when trying to estimate a covariance matrix using an inverse Wishart or a Wishart distribution (which I believe is the appropriate prior). Below is a minimal example:

using Turing
using Random, LinearAlgebra, Statistics  # MersenneTwister, Hermitian, cov
true_cov = [0.7 0.5; 0.5 0.8]
true_mu = [-0.3, 0.2]
true_dist = MvNormal(true_mu, true_cov)
rng = MersenneTwister(20220720)
data = rand(rng, true_dist, 10000)
cov(data')

@model function norm_est(data)
  D, N = size(data)
  covar ~ Wishart(3, true_cov)
  mu ~ filldist(Normal(0, 1), D)
  data ~ filldist(MvNormal(mu, Hermitian(covar)), N) # tried to force it to be hermitian which was a different error but still got positive definite issue
end

est = norm_est(data)
chain_that_does_not_work = sample(est, NUTS(), 500)
chain_that_works = sample(est, IS(), 500)

The error I get says that the matrix I am using is not positive definite, which I believe is some sort of tolerance error. As far as I can tell it only affects the NUTS, HMC and HMCDA samplers, because of a gradient calculation. I am reluctant to use the other samplers because they did not give accurate results, for this example at least. The full error is below:

PosDefException: matrix is not positive definite; Cholesky factorization failed.
Stacktrace:
[1] checkpositivedefinite
@ /julia-1.7.3/share/julia/stdlib/v1.7/LinearAlgebra/src/factorization.jl:18 [inlined]
[2] #cholesky!#138
@ /julia-1.7.3/share/julia/stdlib/v1.7/LinearAlgebra/src/cholesky.jl:266 [inlined]
[3] cholesky!(A::Matrix{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, ::Val{false}; check::Bool)
@ LinearAlgebra /julia-1.7.3/share/julia/stdlib/v1.7/LinearAlgebra/src/cholesky.jl:298
[4] #cholesky#143
@ /julia-1.7.3/share/julia/stdlib/v1.7/LinearAlgebra/src/cholesky.jl:394 [inlined]
[5] cholesky (repeats 2 times)
@ /julia-1.7.3/share/julia/stdlib/v1.7/LinearAlgebra/src/cholesky.jl:394 [inlined]
[6] pd_logpdf_with_trans(d::Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}, X::Matrix{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, transform::Bool)
@ Bijectors ~/.julia/packages/Bijectors/U0SqN/src/Bijectors.jl:229
[7] logpdf_with_trans
@ ~/.julia/packages/Bijectors/U0SqN/src/Bijectors.jl:132 [inlined]
[8] assume(dist::Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}, vn::AbstractPPL.VarName{:covar, Setfield.IdentityLens}, vi::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, Vector{Set{DynamicPPL.Selector}}}}}, ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, Vector{Base.RefValue{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}}})
@ DynamicPPL ~/.julia/packages/DynamicPPL/R7VK9/src/context_implementations.jl:198
[9] assume
@ ~/.julia/packages/Turing/S4Y4B/src/inference/hmc.jl:462 [inlined]
[10] tilde_assume
@ ~/.julia/packages/DynamicPPL/R7VK9/src/context_implementations.jl:49 [inlined]
[11] tilde_assume
@ ~/.julia/packages/DynamicPPL/R7VK9/src/context_implementations.jl:46 [inlined]
[12] tilde_assume
@ ~/.julia/packages/DynamicPPL/R7VK9/src/context_implementations.jl:31 [inlined]
[13] tilde_assume!!(context::DynamicPPL.SamplingContext{DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, DynamicPPL.DefaultContext, Random._GLOBAL_RNG}, right::Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}, vn::AbstractPPL.VarName{:covar, Setfield.IdentityLens}, vi::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, Vector{Set{DynamicPPL.Selector}}}}}, ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, Vector{Base.RefValue{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}}})
@ DynamicPPL ~/.julia/packages/DynamicPPL/R7VK9/src/context_implementations.jl:117
[14] norm_est(model::DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, varinfo::DynamicPPL.ThreadSafeVarInfo{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, Vector{Set{DynamicPPL.Selector}}}}}, ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}, Vector{Base.RefValue{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}}}, context::DynamicPPL.SamplingContext{DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, DynamicPPL.DefaultContext, Random._GLOBAL_RNG}, data::Matrix{Float64})
@ Main /app/ESG_Enhancements.jl:58
[15] macro expansion
@ ~/.julia/packages/DynamicPPL/R7VK9/src/model.jl:493 [inlined]
[16] _evaluate!!
@ ~/.julia/packages/DynamicPPL/R7VK9/src/model.jl:476 [inlined]
[17] evaluate_threadsafe!!
@ ~/.julia/packages/DynamicPPL/R7VK9/src/model.jl:467 [inlined]
[18] evaluate!!
@ ~/.julia/packages/DynamicPPL/R7VK9/src/model.jl:402 [inlined]
[19] evaluate!!
@ ~/.julia/packages/DynamicPPL/R7VK9/src/model.jl:415 [inlined]
[20] evaluate!!
@ ~/.julia/packages/DynamicPPL/R7VK9/src/model.jl:423 [inlined]
[21] (::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, DynamicPPL.DefaultContext})(θ::Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}})
@ Turing ~/.julia/packages/Turing/S4Y4B/src/Turing.jl:37
[22] vector_mode_dual_eval!
@ ~/.julia/packages/ForwardDiff/wAaVJ/src/apiutils.jl:37 [inlined]
[23] vector_mode_gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, DynamicPPL.DefaultContext}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}})
@ ForwardDiff ~/.julia/packages/ForwardDiff/wAaVJ/src/gradient.jl:113
[24] gradient!
@ ~/.julia/packages/ForwardDiff/wAaVJ/src/gradient.jl:37 [inlined]
[25] gradient!(result::DiffResults.MutableDiffResult{1, Float64, Tuple{Vector{Float64}}}, f::Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, DynamicPPL.DefaultContext}, x::Vector{Float64}, cfg::ForwardDiff.GradientConfig{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Turing.TuringTag, Float64}, Float64, 6}}})
@ ForwardDiff ~/.julia/packages/ForwardDiff/wAaVJ/src/gradient.jl:35
[26] gradient_logp(ad::Turing.Essential.ForwardDiffAD{0, true}, θ::Vector{Float64}, vi::DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, model::DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, sampler::DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, context::DynamicPPL.DefaultContext)
@ Turing.Essential ~/.julia/packages/Turing/S4Y4B/src/essential/ad.jl:130
[27] gradient_logp (repeats 2 times)
@ ~/.julia/packages/Turing/S4Y4B/src/essential/ad.jl:88 [inlined]
[28] ∂logπ∂θ
@ ~/.julia/packages/Turing/S4Y4B/src/inference/hmc.jl:433 [inlined]
[29] ∂H∂θ
@ ~/.julia/packages/AdvancedHMC/51xgc/src/hamiltonian.jl:31 [inlined]
[30] macro expansion
@ ~/.julia/packages/UnPack/EkESO/src/UnPack.jl:100 [inlined]
[31] step(lf::AdvancedHMC.Leapfrog{Float64}, h::AdvancedHMC.Hamiltonian{AdvancedHMC.UnitEuclideanMetric{Float64, Tuple{Int64}}, Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, DynamicPPL.DefaultContext}, Turing.Inference.var"#∂logπ∂θ#53"{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}}}, z::AdvancedHMC.PhasePoint{Vector{Float64}, AdvancedHMC.DualValue{Float64, Vector{Float64}}}, n_steps::Int64; fwd::Bool, full_trajectory::Val{false})
@ AdvancedHMC ~/.julia/packages/AdvancedHMC/51xgc/src/integrator.jl:88
[32] step (repeats 2 times)
@ ~/.julia/packages/AdvancedHMC/51xgc/src/integrator.jl:66 [inlined]
[33] A
@ ~/.julia/packages/AdvancedHMC/51xgc/src/trajectory.jl:692 [inlined]
[34] find_good_stepsize(rng::Random._GLOBAL_RNG, h::AdvancedHMC.Hamiltonian{AdvancedHMC.UnitEuclideanMetric{Float64, Tuple{Int64}}, Turing.LogDensityFunction{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, DynamicPPL.DefaultContext}, Turing.Inference.var"#∂logπ∂θ#53"{DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}, DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}}}, θ::Vector{Float64}; max_n_iters::Int64)
@ AdvancedHMC ~/.julia/packages/AdvancedHMC/51xgc/src/trajectory.jl:714
[35] #find_good_stepsize#19
@ ~/.julia/packages/AdvancedHMC/51xgc/src/trajectory.jl:770 [inlined]
[36] find_good_stepsize
@ ~/.julia/packages/AdvancedHMC/51xgc/src/trajectory.jl:770 [inlined]
[37] initialstep(rng::Random._GLOBAL_RNG, model::DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, spl::DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, vi::DynamicPPL.TypedVarInfo{NamedTuple{(:covar, :mu), Tuple{DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:covar, Setfield.IdentityLens}, Int64}, Vector{Wishart{Float64, PDMats.PDMat{Float64, Matrix{Float64}}, Int64}}, Vector{AbstractPPL.VarName{:covar, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}, DynamicPPL.Metadata{Dict{AbstractPPL.VarName{:mu, Setfield.IdentityLens}, Int64}, Vector{DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}}, Vector{AbstractPPL.VarName{:mu, Setfield.IdentityLens}}, Vector{Float64}, Vector{Set{DynamicPPL.Selector}}}}}, Float64}; init_params::Nothing, nadapts::Int64, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ Turing.Inference ~/.julia/packages/Turing/S4Y4B/src/inference/hmc.jl:187
[38] step(rng::Random._GLOBAL_RNG, model::DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, spl::DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}; resume_from::Nothing, init_params::Nothing, kwargs::Base.Pairs{Symbol, Int64, Tuple{Symbol}, NamedTuple{(:nadapts,), Tuple{Int64}}})
@ DynamicPPL ~/.julia/packages/DynamicPPL/R7VK9/src/sampler.jl:104
[39] macro expansion
@ ~/.julia/packages/AbstractMCMC/fnRmh/src/sample.jl:120 [inlined]
[40] macro expansion
@ ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:328 [inlined]
[41] macro expansion
@ ~/.julia/packages/AbstractMCMC/fnRmh/src/logging.jl:9 [inlined]
[42] mcmcsample(rng::Random._GLOBAL_RNG, model::DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, sampler::DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, N::Int64; progress::Bool, progressname::String, callback::Nothing, discard_initial::Int64, thinning::Int64, chain_type::Type, kwargs::Base.Pairs{Symbol, Int64, Tuple{Symbol}, NamedTuple{(:nadapts,), Tuple{Int64}}})
@ AbstractMCMC ~/.julia/packages/AbstractMCMC/fnRmh/src/sample.jl:111
[43] sample(rng::Random._GLOBAL_RNG, model::DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, sampler::DynamicPPL.Sampler{HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}}, N::Int64; chain_type::Type, resume_from::Nothing, progress::Bool, nadapts::Int64, discard_adapt::Bool, discard_initial::Int64, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ Turing.Inference ~/.julia/packages/Turing/S4Y4B/src/inference/hmc.jl:133
[44] sample
@ ~/.julia/packages/Turing/S4Y4B/src/inference/hmc.jl:116 [inlined]
[45] #sample#2
@ ~/.julia/packages/Turing/S4Y4B/src/inference/Inference.jl:145 [inlined]
[46] sample
@ ~/.julia/packages/Turing/S4Y4B/src/inference/Inference.jl:145 [inlined]
[47] #sample#1
@ ~/.julia/packages/Turing/S4Y4B/src/inference/Inference.jl:135 [inlined]
[48] sample(model::DynamicPPL.Model{typeof(norm_est), (:data,), (), (), Tuple{Matrix{Float64}}, Tuple{}, DynamicPPL.DefaultContext}, alg::HMCDA{Turing.Essential.ForwardDiffAD{0}, (), AdvancedHMC.UnitEuclideanMetric}, N::Int64)
@ Turing.Inference ~/.julia/packages/Turing/S4Y4B/src/inference/Inference.jl:135
[49] top-level scope
@ /app/ESG_Enhancements.jl:64
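The tolerance problem described above can be reproduced outside Turing. The following is my own standalone sketch (not code from Turing or Bijectors): a matrix that is positive definite in exact arithmetic can fail `cholesky` after a tiny floating-point perturbation, and symmetrizing plus clipping eigenvalues repairs it.

```julia
# Standalone illustration of the tolerance issue: a tiny asymmetric
# perturbation can make Cholesky factorization fail, and projecting the
# matrix back onto the positive-definite cone recovers a valid factor.
using LinearAlgebra, Random

rng = MersenneTwister(1)
A = [0.7 0.5; 0.5 0.8]                 # true_cov from the example above
B = A .+ 1e-12 .* randn(rng, 2, 2)     # tiny asymmetric perturbation
# cholesky(B) can now throw PosDefException because B is no longer Hermitian.
# Symmetrize, then clip eigenvalues away from zero:
S = Symmetric((B + B') / 2)
vals, vecs = eigen(S)
S_pd = Symmetric(vecs * Diagonal(max.(vals, 1e-10)) * vecs')
F = cholesky(S_pd)                     # succeeds
```

This is only an illustration of the failure mode; inside the sampler the perturbed matrix is produced by the leapfrog/AD machinery rather than by hand.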

@LLTeixeira

Unfortunately I am having the same issue on v0.24, but using the inverse Wishart distribution for the covariance matrix. It only happens with the NUTS, HMC and HMCDA samplers, and with Gibbs whenever it is composed with one of those.

using Turing, LinearAlgebra

@model function MultiNorm_Cov_Model(X, ::Type{T}=Float64) where T
    rows = size(X, 1)
    vars = size(X, 2)

    μ = Vector{T}(undef, vars)
    for i in 1:vars
        μ[i] ~ Normal(0, 50)
    end

    Σ ~ InverseWishart(vars + 1, Matrix(1.0 * I(vars)))

    for i in 1:rows
        X[i, :] ~ MvNormal(μ, Symmetric(Σ))
    end
end

Using the `Symmetric` wrapper doesn't solve the issue, and I still get:

"PosDefException: matrix is not Hermitian; Cholesky factorization failed."
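A workaround often suggested for this class of error (a sketch only, with a hypothetical model name, not verified against the Turing versions in this thread) is to avoid putting the prior on the covariance matrix directly: sample marginal scales and an LKJ correlation matrix, then assemble the covariance, which is positive definite by construction.

```julia
using Turing, LinearAlgebra

# Sketch of a common reparameterization: scales × correlation × scales
# yields a covariance matrix that is positive definite by construction,
# so the Cholesky factorization inside the sampler cannot fail.
@model function mvnormal_lkj(X)
    D, N = size(X)                                     # X is D × N
    σ ~ filldist(truncated(Normal(0, 1), 0, Inf), D)   # marginal std devs
    Ω ~ LKJ(D, 1.0)                                    # correlation prior
    Σ = Symmetric(Diagonal(σ) * Ω * Diagonal(σ))       # covariance
    μ ~ filldist(Normal(0, 1), D)
    X ~ filldist(MvNormal(μ, Σ), N)
end
```

Whether this sidesteps the gradient-time failure depends on the bijector used for `LKJ`, so treat it as a direction to try rather than a confirmed fix.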

@yebai
Member

yebai commented Jun 6, 2024

Duplicate of #2188 (comment)

@yebai yebai closed this as completed Jun 6, 2024