Title: Reversible-Jump MCMC Using Post-Processing
Description: Performs reversible-jump Markov chain Monte Carlo (Green, 1995) <doi:10.2307/2337340>, specifically the restriction introduced by Barker & Link (2013) <doi:10.1080/00031305.2013.791644>. By utilising a 'universal parameter' space, RJMCMC is treated as a Gibbs sampling problem. Previously-calculated posterior distributions are used to quickly estimate posterior model probabilities. Jacobian matrices are found using automatic differentiation. For a detailed description of the package, see Gelling, Schofield & Barker (2019) <doi:10.1111/anzs.12263>.
Authors: Nick Gelling [aut, cre], Matthew R. Schofield [aut], Richard J. Barker [aut]
Maintainer: Nick Gelling <[email protected]>
License: GPL-3
Version: 0.4.5
Built: 2025-01-26 04:24:56 UTC
Source: https://github.com/cran/rjmcmc
A wrapper function to the functionality of the madness package. Converts a given x to a madness object and then applies func to it, returning the result and the Jacobian for the transformation func. adiff is used by the rjmcmcpost function.
adiff(func, x, ...)
func | The function to be differentiated (usually user-defined).
x | The values at which to evaluate the function.
... | Further arguments to be passed to func.
A numeric vector or matrix containing the result of the function evaluation func(x, ...). The "gradient" attribute of this object contains the Jacobian matrix of the transformation func.
Pav, S. E. (2016) Madness: Automatic Differentiation of Multivariate Operations.
x2x3 = function(x){ return(c(x^2, x^3)) }
y = adiff(x2x3, c(5,6))
attr(y, "gradient")
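As a check on this example, the Jacobian can be written out by hand. At x = (5, 6) the function returns (x1^2, x2^2, x1^3, x2^3), so, assuming the usual convention of one row per output and one column per input, the "gradient" attribute should match:

## Hand-computed Jacobian of (x1^2, x2^2, x1^3, x2^3) at x = (5, 6)
expected = rbind(c(10,   0),    # d(x1^2)/dx = (2*5,  0)
                 c( 0,  12),    # d(x2^2)/dx = (0,  2*6)
                 c(75,   0),    # d(x1^3)/dx = (3*25, 0)
                 c( 0, 108))    # d(x2^3)/dx = (0, 3*36)
all.equal(unname(attr(y, "gradient")), expected)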
Performs Bayesian multimodel inference, estimating Bayes factors and posterior model probabilities for N candidate models. Unlike rjmcmcpost, this function uses a default bijection scheme based on approximating each posterior by a multivariate normal distribution. The result is reminiscent of the algorithm of Carlin & Chib (1995) with a multivariate normal pseudo-prior. Transformation Jacobians are computed using automatic differentiation so do not need to be specified.
defaultpost(posterior, likelihood, param.prior, model.prior,
  chainlength = 10000, TM.thin = chainlength/10, progress = TRUE,
  save.all = TRUE)
posterior | A list of N matrices containing the posterior distributions under each model. Generally this will be obtained from MCMC output. Note that each parameter should be real-valued, so some parameters may need to be transformed, using logarithms for example.
likelihood | A list of N functions specifying the log-likelihood functions for the data under each model.
param.prior | A list of N functions specifying the log-prior distributions for each model-specific parameter vector.
model.prior | A numeric vector of the prior model probabilities. This argument need not sum to one, as it is automatically normalised.
chainlength | The number of iterations for which to run the Markov chain.
TM.thin | How regularly to calculate transition matrices as the chain progresses.
progress | A logical determining whether a progress bar is drawn.
save.all | A logical determining whether to save the value of the universal parameter at each iteration, as well as the corresponding likelihoods, priors and posteriors. If FALSE, the densities and psidraws elements of the output are omitted.
Returns an object of class rj (see rjmethods). If save.all=TRUE, the output has named elements result, densities, psidraws, progress and meta. If save.all=FALSE, the densities and psidraws elements are omitted. result contains useful point estimates, progress contains snapshots of these estimates over time, and meta contains information about the function call.
Carlin, B. P. and Chib, S. (1995) Bayesian Model Choice via Markov Chain Monte Carlo Methods. Journal of the Royal Statistical Society, Series B, 57(3), 473-484.
Barker, R. J. and Link, W. A. (2013) Bayesian multimodel inference by RJMCMC: A Gibbs sampling approach. The American Statistician, 67(3), 150-156.
## Comparing two binomial models -- see Barker & Link (2013) for further details.
y=c(8,16); sumy=sum(y)
n=c(20,30); sumn=sum(n)
L1=function(p){if((all(p>=0))&&(all(p<=1))) sum(dbinom(y,n,p,log=TRUE)) else -Inf}
L2=function(p){if((p[1]>=0)&&(p[1]<=1)) sum(dbinom(y,n,p[1],log=TRUE)) else -Inf}
p.prior1=function(p){sum(dbeta(p,1,1,log=TRUE))}
p.prior2=function(p){dbeta(p[1],1,1,log=TRUE)+dbeta(p[2],17,15,log=TRUE)}
draw1=matrix(rbeta(2000,y+1,n-y+1), 1000, 2, byrow=TRUE)  ## full conditional posterior
draw2=matrix(c(rbeta(1000,sumy+1,sumn-sumy+1),rbeta(1000,17,15)), 1000, 2)

out=defaultpost(posterior=list(draw1,draw2), likelihood=list(L1,L2),
                param.prior=list(p.prior1,p.prior2), model.prior=c(1,1),
                chainlength=1000)
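Because defaultpost approximates each posterior by a multivariate normal, the documentation above notes that parameters should be real-valued. The example passes probability-scale draws directly, but a transform such as the logit can be applied first. A minimal sketch for model 1 (the .real names are illustrative; qlogis/plogis are the base-R logit and inverse logit, and the transformed prior must absorb the log-Jacobian of the back-transform):

## Logit-transform the posterior draws so every parameter is real-valued
## before calling defaultpost.
draw1.real = qlogis(draw1)                    # logit of each posterior draw
L1.real    = function(z) L1(plogis(z))        # likelihood via the back-transform
p.prior1.real = function(z){                  # prior on the logit scale:
  p = plogis(z)
  p.prior1(p) + sum(log(p) + log(1 - p))      # + log|dp/dz|, since dp/dz = p(1-p)
}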
A utility function which accepts a matrix of MCMC output and creates a function which samples from the posterior distribution for the parameters of the model.
getsampler(modelfit, sampler.name = "post.draw", order = "default",
  envir = .GlobalEnv)
modelfit | A matrix of output from a previously-run MCMC algorithm, with one column per variable and one row per iteration.
sampler.name | A string giving the desired name for the function to be defined.
order | A numeric vector of indices specifying the desired parameters to extract from modelfit, in the desired order.
envir | The environment in which to define the sampling function. Defaults to the global environment.
A function is defined in envir which randomly samples from the posterior distribution for the parameters. Note that this function does not take any arguments. A function generated in this way is suitable for passing to the rjmcmcpost function.
Plummer, M. (2003) JAGS: A program for analysis of Bayesian graphical models using Gibbs sampling. Proceedings of the 3rd international workshop on distributed statistical computing (Vol. 124, p. 125).
# Generate synthetic 'MCMC output' for a model with 3 parameters. There is
# one column per parameter, and 1000 iterations.
matrix_output = matrix(c(runif(1000,0,1), rnorm(1000,5,2), rgamma(1000,2,2)), 1000, 3)
getsampler(modelfit=matrix_output, sampler.name="posterior1")
set.seed(100)
posterior1()

## Alternatively
posterior1b = getsampler(modelfit=matrix_output)
# This successfully defines a function named posterior1b, but also defines an
# identical function corresponding to the value of sampler.name, i.e. the
# default "post.draw" in this case.
set.seed(100)
posterior1b()
set.seed(100)
posterior1()
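Conceptually, the sampler that getsampler defines behaves like drawing a uniformly random row from the output matrix. The following is only a sketch of that behaviour, not the package internals:

## A hand-rolled equivalent of the generated sampler: return one random
## row (i.e. one joint posterior draw) from the MCMC output matrix.
post.draw.sketch = function(){
  matrix_output[sample(nrow(matrix_output), 1), ]
}
post.draw.sketch()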
Performs reversible-jump MCMC (Green, 1995), specifically the reformulation introduced by Barker & Link (2013). Using a 'universal parameter' space, RJMCMC is treated as Gibbs sampling, making it simpler to implement. The required Jacobian matrices are calculated automatically, utilising the madness package.
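In outline, each model k is equipped with a bijection psi = g_k(theta_k, u_k) between the fixed-dimension universal parameter psi and the model-specific parameters theta_k padded out with auxiliary variables u_k. The sampler alternates between drawing psi given the current model and drawing the model given psi. As a sketch of the key identity (notation loosely follows Barker & Link, 2013), the model's full conditional is

$$
\Pr(M_k \mid \psi, y) \;\propto\; f(y \mid \theta_k, M_k)\, p(\theta_k \mid M_k)\, p(u_k \mid M_k)\,
\left|\det \frac{\partial g_k^{-1}(\psi)}{\partial \psi}\right| \Pr(M_k),
\qquad (\theta_k, u_k) = g_k^{-1}(\psi).
$$

The Jacobian determinant in this expression is the quantity the package obtains by automatic differentiation.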
rjmcmcpost
defaultpost
adiff
getsampler
Green, P. J. (1995) Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4), 711-732.
Barker, R. J. and Link, W. A. (2013) Bayesian multimodel inference by RJMCMC: A Gibbs sampling approach. The American Statistician, 67(3), 150-156.
Performs Bayesian multimodel inference, estimating Bayes factors and posterior model probabilities for N candidate models. Using the 'universal parameter' restriction in Barker & Link (2013), RJMCMC is treated as a Gibbs sampling problem, where the algorithm alternates between updating the model and the model-specific parameters. Transformation Jacobians are computed using automatic differentiation so do not need to be specified.
rjmcmcpost(post.draw, g, ginv, likelihood, param.prior, model.prior,
  chainlength = 10000, TM.thin = chainlength/10, save.all = FALSE,
  progress = TRUE)
post.draw | A list of N functions that randomly draw from the posterior distribution under each model. Generally these functions sample from the output of a model fitted using MCMC.
g | A list of N functions specifying the bijections from the universal parameter psi to each model-specific parameter set.
ginv | A list of N functions specifying the bijections from each model-specific parameter set back to psi; these are the inverses of the corresponding functions in g.
likelihood | A list of N functions specifying the log-likelihood functions for the data under each model.
param.prior | A list of N functions specifying the log-prior distributions for each model-specific parameter vector.
model.prior | A numeric vector of the prior model probabilities. This argument need not sum to one, as it is automatically normalised.
chainlength | The number of iterations for which to run the Markov chain.
TM.thin | How regularly to calculate transition matrices as the chain progresses.
save.all | A logical determining whether to save the value of the universal parameter at each iteration, as well as the corresponding likelihoods, priors and posteriors. If FALSE, the densities and psidraws elements of the output are omitted.
progress | A logical determining whether a progress bar is drawn.
Returns an object of class rj (see rjmethods). If save.all=TRUE, the output has named elements result, densities, psidraws, progress and meta. If save.all=FALSE, the densities and psidraws elements are omitted. result contains useful point estimates, progress contains snapshots of these estimates over time, and meta contains information about the function call.
Barker, R. J. and Link, W. A. (2013) Bayesian multimodel inference by RJMCMC: A Gibbs sampling approach. The American Statistician, 67(3), 150-156.
## Comparing two binomial models -- see Barker & Link (2013) for further details.
y=c(8,16); sumy=sum(y)
n=c(20,30); sumn=sum(n)
L1=function(p){if((all(p>=0))&&(all(p<=1))) sum(dbinom(y,n,p,log=TRUE)) else -Inf}
L2=function(p){if((p[1]>=0)&&(p[1]<=1)) sum(dbinom(y,n,p[1],log=TRUE)) else -Inf}
g1=function(psi){p=psi}
g2=function(psi){w=n[1]/sum(n); p=c(w*psi[1]+(1-w)*psi[2], psi[2])}
ginv1=function(p){p}
ginv2=function(p){c(sum(n)/n[1]*p[1]-n[2]/n[1]*p[2], p[2])}
p.prior1=function(p){sum(dbeta(p,1,1,log=TRUE))}
p.prior2=function(p){dbeta(p[1],1,1,log=TRUE)+dbeta(p[2],17,15,log=TRUE)}
draw1=function(){rbeta(2,y+1,n-y+1)}
draw2=function(){c(rbeta(1,sumy+1,sumn-sumy+1), rbeta(1,17,15))}

out=rjmcmcpost(post.draw=list(draw1,draw2), g=list(g1,g2), ginv=list(ginv1,ginv2),
               likelihood=list(L1,L2), param.prior=list(p.prior1,p.prior2),
               model.prior=c(0.5,0.5), chainlength=1500)
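Since each g and ginv pair must be mutual inverses, it can be worth checking them numerically before running the chain. A minimal check using the bijections defined above (the test value of psi is arbitrary):

## g2 followed by ginv2 should recover psi exactly (up to floating point).
psi = c(0.3, 0.7)
all.equal(ginv2(g2(psi)), psi)   # TRUE if the pair is consistent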
An object of class rj is returned from the functions rjmcmcpost or defaultpost. The following methods can be applied to an object of this class. See Details for more information.
## S3 method for class 'rj'
print(x, ...)

## S3 method for class 'rj'
plot(x, legend = TRUE, col = "maroon4", ylim = c(0, 1), lwd = 2,
  lty = c(1, 1, 1), ...)

## S3 method for class 'rj'
summary(object, quantiles = c(0.025, 0.25, 0.5, 0.75, 0.975), ...)
x, object | An object of class rj.
... | Further arguments to the generic method.
legend, col, ylim, lwd, lty | Some useful graphical arguments to the generic plot method.
quantiles | The desired quantiles of the posterior densities, for the summary method.
The print method prints the point estimates obtained from the algorithm, including the transition matrix, posterior model probabilities and Bayes factors.

The plot method plots how the estimates of the posterior probabilities changed as the algorithm progressed, illustrating convergence.

The summary method returns quantiles of the posterior densities for each model (as well as likelihoods and prior densities). The point estimates as in print are also returned. Note that this requires save.all to be TRUE in the rjmcmcpost call.
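As a usage sketch, the fitted object from the rjmcmcpost example above can be passed to each method (note that summary() requires save.all=TRUE in the rjmcmcpost call, whereas the example leaves it at its default FALSE):

print(out)                                  # transition matrix, model probabilities, Bayes factors
plot(out, col="navy")                       # convergence of the posterior probability estimates
summary(out, quantiles=c(0.05, 0.5, 0.95))  # density quantiles; needs save.all=TRUE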