This vignette demonstrates the different wolverine SCR models presented in “A flexible and efficient Bayesian implementation of point process models for spatial capture-recapture data” (Zhang et al., submitted) using NIMBLE (de Valpine et al. 2017; NIMBLE Development Team 2020) and the nimbleSCR package (Bischof et al. 2020).
Here, we define the first nimble model. It uses i) the Bernoulli point process for modelling the distribution of individual wolverine activity centers, ii) a Poisson point process for modelling individual detections, and iii) a semi-complete data-likelihood approach (King et al. 2016) to estimate population size.
modelCode1 <- nimbleCode({
##------ SPATIAL PROCESS
## Intercept and slope for the log-linear model for habitat selection intensity
habCoeffInt ~ dnorm(0, sd = 10)
habCoeffSlope ~ dnorm(0, sd = 10)
## Habitat intensity for each habitat window
habIntensity[1:numHabWindows] <- exp(habCoeffInt + habCoeffSlope * habCovs[1:numHabWindows])
sumHabIntensity <- sum(habIntensity[1:numHabWindows])
logHabIntensity[1:numHabWindows] <- log(habIntensity[1:numHabWindows])
logSumHabIntensity <- log(sumHabIntensity)
## Activity centres of the observed individuals: a Bernoulli point process
for(i in 1:numIdDetected){
sxy[i,1:2] ~ dbernppAC(
lowerCoords = habLoCoords[1:numHabWindows, 1:2],
upperCoords = habUpCoords[1:numHabWindows, 1:2],
logIntensities = logHabIntensity[1:numHabWindows],
logSumIntensity = logSumHabIntensity,
habitatGrid = habitatGrid[1:y.max,1:x.max],
numGridRows = y.max,
numGridCols = x.max)
}#i
##----- DEMOGRAPHIC PROCESS
## Number of individuals in the population
N ~ dpois(sumHabIntensity)
## Number of detected individuals
nDetectedIndiv ~ dbin(probDetection, N)
##----- DETECTION PROCESS
## Scale for the multivariate normal detection function
sigma ~ dunif(0,10)
## Intercept and slope parameters for the log-linear model for detection intensity
for(c in 1:numCounties){
detCoeffInt[c] ~ dnorm(0, sd = 10)
}#c
for(cc in 1:numDetCovs){
detCoeffSlope[cc] ~ dnorm(0, sd = 10)
}#cc
## Baseline detection intensity for each detection window
for(j in 1:numDetWindows){
detIntensity[j] <- exp( detCoeffInt[detCounties[j]] +
detCoeffSlope[1] * detCovs[j,1] +
detCoeffSlope[2] * detCovs[j,2] +
detCoeffSlope[3] * detCovs[j,3])
}#j
## Detections of the observed individuals conditional on their activity centers
for(i in 1:numIdDetected) {
y[i,1:(maxDetections+1),1:3] ~ dpoisppDetection_normal(
lowerCoords = detLoCoords[1:numDetWindows,1:2],
upperCoords = detUpCoords[1:numDetWindows,1:2],
s = sxy[i,1:2],
sd = sigma,
baseIntensities = detIntensity[1:numDetWindows],
numMaxPoints = maxDetections,
numWindows = numDetWindows,
indicator = 1)
}#i
## The probability that an individual in the population is detected at least once
## i.e. one minus the void probability over all detection windows
probDetection <- 1 - marginalVoidProbNumIntegration(
quadNodes = quadNodes[1:maxNumNodes,1:2,1:numHabWindows],
quadWeights = quadWeights[1:numHabWindows],
numNodes = numNodes[1:numHabWindows],
lowerCoords = detLoCoords[1:numDetWindows,1:2],
upperCoords = detUpCoords[1:numDetWindows,1:2],
sd = sigma,
baseIntensities = detIntensity[1:numDetWindows],
habIntensities = habIntensity[1:numHabWindows],
sumHabIntensity = sumHabIntensity,
numObsWindows = numDetWindows,
numHabWindows = numHabWindows
)
## Normalizing constant of the semi-complete data likelihood:
## the conditional likelihood of the detected individuals is divided by probDetection^numIdDetected
logDetProb <- log(probDetection)
normData ~ dnormalizer(logNormConstant = -numIdDetected*logDetProb)
})
Next, we can define a second nimble model based on the “usual” formulation of Bayesian SCR models. In this model, individual activity center locations are modeled using a two-step process in which i) a categorical distribution is used to sample the habitat window in which the activity center is located, and ii) the center of the selected habitat window is used as the activity center location. The probability vector of the categorical process is proportional to the habitat intensity of the point process used in the first formulation. Individual detections are not modeled continuously in space; instead, they are aggregated at discrete detector locations, and the number of detections at each detector is modeled using a Poisson distribution. Finally, we use the data augmentation approach (Royle and Dorazio 2012) instead of the semi-complete data-likelihood approach to derive population size.
modelCode2 <- nimbleCode({
##------ SPATIAL PROCESS
habCoeffSlope ~ dnorm(0, sd = 10)
## Habitat intensity for each habitat window
habIntensity[1:numHabWindows] <- exp(habCoeffSlope * habCovs[1:numHabWindows])
for(i in 1:M){
sID[i] ~ dcat(habIntensity[1:numHabWindows])
}#i
##----- DEMOGRAPHIC PROCESS
psi ~ dunif(0,1)
for(i in 1:M){
z[i] ~ dbern(psi)
}#i
## Number of individuals in the population
N <- sum(z[1:M])
##----- DETECTION PROCESS
## Scale for the multivariate normal detection function
sigma ~ dunif(0,10)
## Intercept and slope parameters for the log-linear model for detection intensity
for(c in 1:numCounties){
detCoeffInt[c] ~ dnorm(0, sd = 10)
}#c
for(cc in 1:numDetCovs){
detCoeffSlope[cc] ~ dnorm(0, sd = 10)
}#cc
## Baseline detection intensity for each detection window
for(j in 1:numDetWindows){
lambdaTraps[j] <- exp( detCoeffInt[detCounties[j]] +
detCoeffSlope[1] * detCovs[j,1] +
detCoeffSlope[2] * detCovs[j,2] +
detCoeffSlope[3] * detCovs[j,3])
}#j
## Detections of the individuals (detected and augmented) conditional on their activity centers
for(i in 1:M) {
y[i,1:lengthYCombined] ~ dpoisLocal_normal(
lambdaTraps = lambdaTraps[1:numDetWindows],
trapCoords = detCoords[1:numDetWindows,1:2],
s = habCoords[sID[i],1:2],
sigma = sigma,
localTrapsIndices = localTrapsIndices[1:numHabWindows,1:numDetWindows],
localTrapsNum = localTrapsNum[1:numHabWindows],
habitatGrid = habitatGrid[1:y.max,1:x.max],
lengthYCombined = lengthYCombined,
indicator = z[i])
}#i
})
For comparison purposes, we define a third nimble model, which is a hybrid of the two previous formulations. It uses i) the Bernoulli point process for modelling the distribution of wolverine activity centers and ii) a Poisson point process to model individual detections in continuous space, but iii) data augmentation instead of the semi-complete data-likelihood approach to estimate population size. This formulation is expected to be both a better representation of the detection process (and thus more accurate than model 2) and more efficient than model 1, which carries the cost of the semi-complete data-likelihood calculation.
modelCode3 <- nimbleCode({
##------ SPATIAL PROCESS
habCoeffSlope ~ dnorm(0, sd = 10)
## Habitat intensity for each habitat window
habIntensity[1:numHabWindows] <- exp(habCoeffSlope * habCovs[1:numHabWindows])
sumHabIntensity <- sum(habIntensity[1:numHabWindows])
logHabIntensity[1:numHabWindows] <- log(habIntensity[1:numHabWindows])
logSumHabIntensity <- log(sumHabIntensity)
## Activity centres of the individuals: a Bernoulli point process
for(i in 1:M){
sxy[i,1:2] ~ dbernppAC(
lowerCoords = habLoCoords[1:numHabWindows,1:2],
upperCoords = habUpCoords[1:numHabWindows,1:2],
logIntensities = logHabIntensity[1:numHabWindows],
logSumIntensity = logSumHabIntensity,
habitatGrid = habitatGrid[1:y.max,1:x.max],
numGridRows = y.max,
numGridCols = x.max)
}#i
##----- DEMOGRAPHIC PROCESS
psi ~ dunif(0,1)
for(i in 1:M){
z[i] ~ dbern(psi)
}#i
## Number of individuals in the population
N <- sum(z[1:M])
##----- DETECTION PROCESS
## Scale for the half-normal detection function
sigma ~ dunif(0,10)
## Intercept and slope parameters for the log-linear model for detection intensity
for(c in 1:numCounties){
detCoeffInt[c] ~ dnorm(0, sd = 10)
}#c
for(cc in 1:numDetCovs){
detCoeffSlope[cc] ~ dnorm(0, sd = 10)
}#cc
## Baseline detection intensity for each detection window
for(j in 1:numDetWindows){
detIntensity[j] <- exp( detCoeffInt[detCounties[j]] +
detCoeffSlope[1] * detCovs[j,1] +
detCoeffSlope[2] * detCovs[j,2] +
detCoeffSlope[3] * detCovs[j,3])
}#j
## Detections of the individuals (detected and augmented) conditional on their activity centers
for(i in 1:M){
y[i,1:(maxDetections+1),1:3] ~ dpoisppDetection_normal(
lowerCoords = detLoCoords[1:numDetWindows,1:2],
upperCoords = detUpCoords[1:numDetWindows,1:2],
s = sxy[i,1:2],
sd = sigma,
baseIntensities = detIntensity[1:numDetWindows],
numMaxPoints = maxDetections,
numWindows = numDetWindows,
indicator = z[i])
}#i
})
We can also define a fourth nimble model, which uses the categorical distribution to model the distribution of wolverine activity centers, a Poisson point process to model individual detections in continuous space, and data augmentation to estimate population size. This formulation is expected to be slower than model 3 because of the categorical process.
modelCode4 <- nimbleCode({
##------ SPATIAL PROCESS
habCoeffSlope ~ dnorm(0, sd = 10)
## Habitat intensity for each habitat window
habIntensity[1:numHabWindows] <- exp(habCoeffSlope * habCovs[1:numHabWindows])
for(i in 1:M){
sID[i] ~ dcat(habIntensity[1:numHabWindows])
}#i
##----- DEMOGRAPHIC PROCESS
psi ~ dunif(0,1)
for(i in 1:M){
z[i] ~ dbern(psi)
}#i
## Number of individuals in the population
N <- sum(z[1:M])
##----- DETECTION PROCESS
## Scale for the half-normal detection function
sigma ~ dunif(0,10)
## Intercept and slope parameters for the log-linear model for detection intensity
for(c in 1:numCounties){
detCoeffInt[c] ~ dnorm(0, sd = 10)
}#c
for(cc in 1:numDetCovs){
detCoeffSlope[cc] ~ dnorm(0, sd = 10)
}#cc
## Baseline detection intensity for each detection window
for(j in 1:numDetWindows){
detIntensity[j] <- exp( detCoeffInt[detCounties[j]] +
detCoeffSlope[1] * detCovs[j,1] +
detCoeffSlope[2] * detCovs[j,2] +
detCoeffSlope[3] * detCovs[j,3])
}#j
## Detections of the individuals (detected and augmented) conditional on their activity centers
for(i in 1:M){
y[i,1:(maxDetections+1),1:3] ~ dpoisppDetection_normal(
lowerCoords = detLoCoords[1:numDetWindows,1:2],
upperCoords = detUpCoords[1:numDetWindows,1:2],
s = habCoords[sID[i],1:2],
sd = sigma,
baseIntensities = detIntensity[1:numDetWindows],
numMaxPoints = maxDetections,
numWindows = numDetWindows,
indicator = z[i])
}#i
})
We load the wolverine example data, which is available directly via the nimbleSCR package.
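As a minimal sketch (assuming the example data object is named wolverine_input, as in the package's wolverine example; the actual object name may differ), the data can be loaded with:
library(nimble)
library(nimbleSCR)
## Load the wolverine example data shipped with nimbleSCR (object name assumed)
data("wolverine_input")
## Inspect the top-level structure of the input list
str(wolverine_input, max.level = 1)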
Now, we can create the nimble model objects, using the model structures defined above, as well as the constants, data, and initial values.
Rmodel1 <- nimbleModel(modelCode1, constants1, data1, inits1)
Rmodel2 <- nimbleModel(modelCode2, constants2, data2, inits2)
Rmodel3 <- nimbleModel(modelCode3, constants3, data3, inits3)
Rmodel4 <- nimbleModel(modelCode4, constants4, data4, inits4)
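Optionally, we can check that each model is fully initialized by computing its joint log-probability, which should be finite (shown here for model 1 only; this check is not part of the original workflow):
## Sanity check: the joint log-probability should not be NA or -Inf
Rmodel1$calculate()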
We configure an MCMC algorithm for each model object (Rmodel1 to Rmodel4) and assign MCMC monitors to the parameters we want to track (e.g. \(N\), \(\sigma\), …).
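The monitor vectors params1 to params4 are not defined in the code shown here; a plausible definition, based on the parameters reported in the posterior summaries below, would be:
## Parameters to monitor (model 1 also tracks the habitat intercept)
params1 <- c("N", "habCoeffInt", "habCoeffSlope", "detCoeffInt", "detCoeffSlope", "sigma")
params2 <- params3 <- params4 <- c("N", "psi", "habCoeffSlope", "detCoeffInt", "detCoeffSlope", "sigma")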
conf1 <- configureMCMC( Rmodel1,
monitors = params1,
print = FALSE)
conf2 <- configureMCMC( Rmodel2,
monitors = params2,
print = FALSE)
conf3 <- configureMCMC( Rmodel3,
monitors = params3,
print = FALSE)
conf4 <- configureMCMC( Rmodel4,
monitors = params4,
print = FALSE)
Rmcmc1 <- buildMCMC(conf1)
Rmcmc2 <- buildMCMC(conf2)
Rmcmc3 <- buildMCMC(conf3)
Rmcmc4 <- buildMCMC(conf4)
Finally, we compile the models and MCMC objects and run each compiled MCMC for four chains of 11,000 iterations (including a burn-in of 1,000 iterations).
Cmodel1 <- compileNimble(Rmodel1)
Cmcmc1 <- compileNimble(Rmcmc1, project = Rmodel1)
MCMC_runtime1 <- system.time(
MCMC_samples1 <- runMCMC(Cmcmc1,
niter = 11000,
nburnin = 1000,
nchains = 4))
Cmodel2 <- compileNimble(Rmodel2)
Cmcmc2 <- compileNimble(Rmcmc2, project = Rmodel2)
MCMC_runtime2 <- system.time(
MCMC_samples2 <- runMCMC(Cmcmc2,
niter = 11000,
nburnin = 1000,
nchains = 4))
Cmodel3 <- compileNimble(Rmodel3)
Cmcmc3 <- compileNimble(Rmcmc3, project = Rmodel3)
MCMC_runtime3 <- system.time(
MCMC_samples3 <- runMCMC(Cmcmc3,
niter = 11000,
nburnin = 1000,
nchains = 4))
Cmodel4 <- compileNimble(Rmodel4)
Cmcmc4 <- compileNimble(Rmcmc4, project = Rmodel4)
MCMC_runtime4 <- system.time(
MCMC_samples4 <- runMCMC(Cmcmc4,
niter = 11000,
nburnin = 1000,
nchains = 4))
MCMC_samples1_summary <- samplesSummary(do.call(rbind,MCMC_samples1))
MCMC_samples2_summary <- samplesSummary(do.call(rbind,MCMC_samples2))
MCMC_samples3_summary <- samplesSummary(do.call(rbind,MCMC_samples3))
MCMC_samples4_summary <- samplesSummary(do.call(rbind,MCMC_samples4))
## effectiveSize() is provided by the coda package
library(coda)
MCMC_samples1_ess <- effectiveSize(MCMC_samples1)
MCMC_samples2_ess <- effectiveSize(MCMC_samples2)
MCMC_samples3_ess <- effectiveSize(MCMC_samples3)
MCMC_samples4_ess <- effectiveSize(MCMC_samples4)
We can then look at the summary of posterior distributions for the different models (models 1 to 4, shown in order below):
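The tables below can be reproduced by printing the four summary matrices, rounded for readability (a sketch; the exact display code is not shown in this vignette):
round(MCMC_samples1_summary, 2)
round(MCMC_samples2_summary, 2)
round(MCMC_samples3_summary, 2)
round(MCMC_samples4_summary, 2)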
## Mean Median St.Dev. 95%CI_low 95%CI_upp
## N 137.33 137.00 12.27 115.00 163.00
## detCoeffInt[1] 3.45 3.47 0.84 1.75 5.01
## detCoeffInt[2] 3.69 3.70 0.21 3.28 4.10
## detCoeffInt[3] 3.99 3.99 0.14 3.71 4.26
## detCoeffSlope[1] 0.26 0.26 0.09 0.08 0.43
## detCoeffSlope[2] -0.13 -0.13 0.09 -0.32 0.05
## detCoeffSlope[3] 0.15 0.14 0.10 -0.06 0.34
## habCoeffInt 0.97 0.97 0.15 0.66 1.25
## habCoeffSlope 0.64 0.64 0.11 0.41 0.85
## sigma 0.09 0.09 0.00 0.08 0.09
## Mean Median St.Dev. 95%CI_low 95%CI_upp
## N 108.97 109.00 8.57 94.00 127.00
## detCoeffInt[1] -2.01 -1.95 0.65 -3.42 -0.92
## detCoeffInt[2] -0.93 -0.93 0.23 -1.38 -0.50
## detCoeffInt[3] -0.82 -0.81 0.12 -1.07 -0.58
## detCoeffSlope[1] 0.41 0.41 0.06 0.29 0.53
## detCoeffSlope[2] -0.07 -0.07 0.09 -0.25 0.10
## detCoeffSlope[3] 0.03 0.03 0.09 -0.15 0.22
## habCoeffSlope 0.57 0.57 0.12 0.33 0.79
## psi 0.36 0.36 0.04 0.29 0.45
## sigma 0.35 0.34 0.01 0.32 0.38
## Mean Median St.Dev. 95%CI_low 95%CI_upp
## N 138.91 138.00 12.72 116.00 166.00
## detCoeffInt[1] 3.43 3.44 0.84 1.69 5.04
## detCoeffInt[2] 3.68 3.69 0.21 3.24 4.09
## detCoeffInt[3] 3.98 3.98 0.14 3.70 4.26
## detCoeffSlope[1] 0.26 0.26 0.09 0.09 0.43
## detCoeffSlope[2] -0.13 -0.13 0.09 -0.32 0.05
## detCoeffSlope[3] 0.14 0.14 0.10 -0.06 0.34
## habCoeffSlope 0.64 0.64 0.11 0.43 0.86
## psi 0.46 0.46 0.05 0.37 0.57
## sigma 0.09 0.09 0.00 0.08 0.09
## Mean Median St.Dev. 95%CI_low 95%CI_upp
## N 98.34 98.00 6.86 86.00 113.00
## detCoeffInt[1] 0.43 0.47 0.65 -0.99 1.55
## detCoeffInt[2] 1.33 1.34 0.21 0.93 1.73
## detCoeffInt[3] 1.47 1.48 0.12 1.24 1.70
## detCoeffSlope[1] 0.42 0.42 0.06 0.30 0.54
## detCoeffSlope[2] 0.01 0.02 0.09 -0.16 0.18
## detCoeffSlope[3] 0.00 0.00 0.09 -0.18 0.18
## habCoeffSlope 15.14 15.51 5.09 3.81 24.78
## psi 0.33 0.33 0.04 0.26 0.40
## sigma 0.33 0.33 0.01 0.31 0.36
Next, we can check the posterior effective sample size (ESS), resulting from our 40,000 posterior samples, for the population size (\(N\)) and scale (\(\sigma\)) estimates (again shown for models 1 to 4 in order):
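The values below correspond to subsetting each ESS vector by parameter name; a sketch of how this output can be reproduced:
round(MCMC_samples1_ess[c("N", "sigma")], 2)
round(MCMC_samples2_ess[c("N", "sigma")], 2)
round(MCMC_samples3_ess[c("N", "sigma")], 2)
round(MCMC_samples4_ess[c("N", "sigma")], 2)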
## N sigma
## 5646.59 1775.95
## N sigma
## 4472.89 2169.46
## N sigma
## 2122.96 1552.12
## N sigma
## 4476.06 2465.72
We can also calculate the MCMC efficiency, i.e. the rate at which effectively independent posterior samples are generated per second of MCMC runtime:
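This is simply the ESS divided by the total runtime in seconds; a minimal sketch for \(N\):
## MCMC efficiency for N: effective sample size per second of runtime
MCMC_samples1_ess["N"] / MCMC_runtime1["elapsed"]
MCMC_samples2_ess["N"] / MCMC_runtime2["elapsed"]
MCMC_samples3_ess["N"] / MCMC_runtime3["elapsed"]
MCMC_samples4_ess["N"] / MCMC_runtime4["elapsed"]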
## N
## 0.2310625
## N
## 0.9795007
## N
## 0.5121119
## N
## 0.2694068
We can see that models 2 and 4 seem to overestimate individual space use (\(\sigma\)) and underestimate \(N\), as a consequence of the coarse resolution used for the habitat and detectors, and of the aggregation of activity centers and detections to the centers of the habitat and detector windows, respectively.
We can also see that the semi-complete data-likelihood approach is much slower than data augmentation for comparable models and results (runtime: 6.8 hrs for model 1 versus 1.2 hrs for model 3).
However, when looking at efficiency, model 2 seems to perform best, thanks to the faster detection model using discrete detector locations (efficiency: 0.9795007 ESS/sec), but at the cost of poor population size estimates.
Overall, model 3, which combines the detection point process with data augmentation, seems to be the best compromise, with accurate population size estimates and relatively good efficiency (0.5121119 ESS/sec).
Bischof, Richard, Daniel Turek, Cyril Milleret, Torbjørn Ergon, Pierre Dupont, and Perry de Valpine. 2020. nimbleSCR: Spatial Capture-Recapture (SCR) Methods Using ’NIMBLE’.
de Valpine, Perry, Daniel Turek, Christopher J. Paciorek, Clifford Anderson-Bergman, Duncan Temple Lang, and Rastislav Bodik. 2017. “Programming with Models: Writing Statistical Algorithms for General Model Structures with NIMBLE.” Journal of Computational and Graphical Statistics 26 (2): 403–13.
King, R., B. T. McClintock, D. Kidney, and D. Borchers. 2016. “Capture–Recapture Abundance Estimation Using a Semi-Complete Data Likelihood Approach.” The Annals of Applied Statistics 10 (1): 264–85.
NIMBLE Development Team. 2020. “NIMBLE: MCMC, Particle Filtering, and Programmable Hierarchical Modeling.” https://doi.org/10.5281/zenodo.1211190.
Royle, J Andrew, and Robert M Dorazio. 2012. “Parameter-Expanded Data Augmentation for Bayesian Analysis of Capture–Recapture Models.” Journal of Ornithology 152 (2): 521–37.