Adaptive Metropolis-within-Gibbs algorithm with univariate Gaussian random walk proposals for the log-logistic model
Usage
MCMC_LLOG(
N,
thin,
burn,
Time,
Cens,
X,
Q = 10,
beta0 = NULL,
sigma20 = NULL,
prior = 2,
set = TRUE,
eps_l = 0.5,
eps_r = 0.5,
N.AKS = 3
)
Arguments
- N
Total number of iterations. Must be a multiple of thin.
- thin
Thinning period.
- burn
Burn-in period. Must be a multiple of thin.
- Time
Vector containing the survival times.
- Cens
Censoring indicator (1: observed, 0: right-censored).
- X
Design matrix with dimensions \(n \times k\), where \(n\) is the number of observations and \(k\) is the number of covariates (including the intercept).
- Q
Update period for the \(\lambda_{i}\)'s.
- beta0
Starting values for \(\beta\). If not provided, they will be randomly generated from a normal distribution (see the sketch after this list for supplying them explicitly).
- sigma20
Starting value for \(\sigma^2\). If not provided, it will be randomly generated from a gamma distribution.
- prior
Indicator of prior (1: Jeffreys, 2: Type I Ind. Jeffreys, 3: Ind. Jeffreys).
- set
Indicator for the use of set observations (TRUE: set observations, FALSE: point observations). Set observations are strongly recommended over point observations, which are problematic for Bayesian inference because continuous sampling models assign zero probability to any single point.
- eps_l
Lower imprecision \((\epsilon_l)\) for set observations (default value: 0.5).
- eps_r
Upper imprecision \((\epsilon_r)\) for set observations (default value: 0.5).
- N.AKS
Maximum number of terms of the Kolmogorov-Smirnov density used for the rejection sampling when updating the mixing parameters (default value: 3).
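As a minimal sketch (an illustration only, not run by the package), the starting values beta0 and sigma20 can be supplied explicitly rather than sampled at random; the zero vector and sigma20 = 1 below are arbitrary illustrative choices, and the remaining arguments are left at their documented defaults.
library(BASSLINE)
X <- cancer[, 3:11]                 # design matrix, as in the Examples section
k <- ncol(X)                        # number of covariates k
LLOG2 <- MCMC_LLOG(N = 1000, thin = 20, burn = 40,
                   Time = cancer[, 1], Cens = cancer[, 2], X = X,
                   beta0 = rep(0, k),   # user-chosen starting values for beta
                   sigma20 = 1,         # user-chosen starting value for sigma^2
                   prior = 2, set = TRUE, eps_l = 0.5, eps_r = 0.5, N.AKS = 3)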
Value
A matrix with \(N / thin + 1\) rows. The columns are the MCMC chains for \(\beta\) (\(k\) columns), \(\sigma^2\) (1 column), \(\theta\) (1 column, if appropriate), \(\lambda\) (\(n\) columns, not provided for the log-normal model), \(\log(t)\) (\(n\) columns, simulated via data augmentation) and the logarithm of the adaptive variances (the number varies among models). The latter allows the user to evaluate whether the adaptive variances have stabilized.
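A hedged sketch of how the returned matrix can be sliced, assuming the column order described above (the \(\beta\) columns first, followed immediately by \(\sigma^2\)); LLOG is the matrix produced in the Examples section below.
k <- 9                                # number of covariates in the example below
beta.chain   <- LLOG[, 1:k]           # MCMC draws of beta
sigma2.chain <- LLOG[, k + 1]         # MCMC draws of sigma^2
colMeans(beta.chain)                  # posterior means of beta
# Trace of one log adaptive variance, to check that it has stabilized
plot(LLOG[, ncol(LLOG)], type = "l",
     xlab = "Stored iteration", ylab = "log adaptive variance")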
Examples
library(BASSLINE)
# Please note: N=1000 is not enough to reach convergence.
# This is only an illustration. Run longer chains for more accurate
# estimates.
LLOG <- MCMC_LLOG(N = 1000, thin = 20, burn = 40, Time = cancer[, 1],
Cens = cancer[, 2], X = cancer[, 3:11])
#> Sampling initial betas from a Normal(0, 1) distribution
#> Initial beta 1 : 1.29
#> Initial beta 2 : 1.44
#> Initial beta 3 : -1.13
#> Initial beta 4 : 0.62
#> Initial beta 5 : 0.75
#> Initial beta 6 : -0.62
#> Initial beta 7 : 0.44
#> Initial beta 8 : 0.99
#> Initial beta 9 : 0.22
#>
#> Sampling initial sigma^2 from a Gamma(2, 2) distribution
#> Initial sigma^2 : 0.55
#>
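A possible follow-up (a sketch assuming the coda package is installed; the column indices match the example above, with the first 9 columns holding \(\beta\) and column 10 holding \(\sigma^2\)):
library(coda)
draws <- mcmc(LLOG[, 1:10])    # beta (columns 1-9) and sigma^2 (column 10)
summary(draws)                 # posterior means, quantiles and Monte Carlo errors
effectiveSize(draws)           # effective sample size per parameter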
