Hi all, this refers to LaplacesDemon.R.

The Automated Factor Slice Sampler has an initial adaptive stage of A iterations, during which the proposal covariance matrix is periodically updated from the scatter matrix of the observed samples. The period of these updates is given by decomp.freq. No further updates are made after A iterations.
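For context, the schedule works roughly as follows (a rough, self-contained sketch of the logic described above, with made-up values; theta, history, and the random-walk step are stand-ins, not the package's actual code):

# Illustrative sketch of the adaptive schedule, not the actual LaplacesDemon.R code.
LIV <- 3; Iterations <- 500; A <- 200; decomp.freq <- 50
history <- matrix(NA_real_, nrow = Iterations, ncol = LIV)
theta <- rnorm(LIV)
for (iter in 1:Iterations) {
  theta <- theta + rnorm(LIV, sd = 0.1)            # stand-in for the factor slice-sampling step
  history[iter, ] <- theta
  if (iter <= A && iter %% decomp.freq == 0) {
    S <- cov(history[1:iter, , drop = FALSE])      # scatter matrix of the samples so far
    factors <- eigen(S, symmetric = TRUE)$vectors  # refreshed factor (slice) directions
  }
}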
In the present version, this update period is chosen as (line 1794)

decomp.freq <- max(LIV * floor(Iterations / Thinning / 100), 10)

where LIV is the number of parameters (the dimension of the sampled space), Iterations is the total number of requested MCMC iterations (before thinning), and Thinning is the thinning interval.
It seems strange to me that the update period is calculated from the total number of iterations requested, rather than from the number of adaptive ones, A. Consider this case: the user wants a very large number of samples (Iterations/Thinning large), but chooses a much shorter adaptive time A, known to be sufficient. In that case the update period given by the formula above would be very large, so that too few updates, or none at all, occur during adaptation. In fact no update ever happens if A < max(LIV * floor(Iterations / Thinning / 100), 10).
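To make the problem concrete, here is a quick numeric check with made-up (but plausible) settings:

# Illustrative values only: a long run with a short adaptive stage.
LIV <- 20; Iterations <- 1e6; Thinning <- 10; A <- 2000
decomp.freq <- max(LIV * floor(Iterations / Thinning / 100), 10)
decomp.freq       # 20000
decomp.freq > A   # TRUE: the covariance is never updated during the adaptive stage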
The update period should probably be calculated from the number of parameters alone. But it should at least depend on the adaptive time A rather than on the full sampling time, for the reason explained above. I propose to change line 1794 into either

decomp.freq <- max(LIV * floor(A / Thinning / 10), 10)

or even just

decomp.freq <- max(LIV, 10)

or

decomp.freq <- max(2*LIV, 10)

so that at least as many iterations as parameters pass between updates. Similar changes should be made for the blockwise computation on lines 1963–1964.
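For comparison, with the same illustrative numbers as above, the proposed alternatives give update periods that fit comfortably within the adaptive stage:

# Same illustrative values as above: LIV <- 20; Thinning <- 10; A <- 2000
max(LIV * floor(A / Thinning / 10), 10)   # 400, i.e. about 5 updates within A
max(LIV, 10)                              # 20
max(2*LIV, 10)                            # 40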
I checked Tibbits et al. (2014), mentioned in the manual, but they don't seem to give any specific recommendation.
Any take on this?