DHARMa: residual diagnostics for hierarchical (multi-level/mixed) regression models

Florian Hartig, Theoretical Ecology, University of Regensburg

2019-11-18

Abstract

The ‘DHARMa’ package uses a simulation-based approach to create readily interpretable scaled (quantile) residuals for fitted (generalized) linear mixed models. Currently supported are linear and generalized linear (mixed) models from ‘lme4’ (classes ‘lmerMod’, ‘glmerMod’), ‘glmmTMB’ and ‘spaMM’, generalized additive models (‘gam’ from ‘mgcv’), ‘glm’ (including ‘negbin’ from ‘MASS’, but excluding quasi-distributions) and ‘lm’ model classes. Moreover, externally created simulations, e.g. posterior predictive simulations from Bayesian software such as ‘JAGS’, ‘STAN’, or ‘BUGS’ can be processed as well. The resulting residuals are standardized to values between 0 and 1 and can be interpreted as intuitively as residuals from a linear regression. The package also provides a number of plot and test functions for typical model misspecification problems, such as over/underdispersion, zero-inflation, and residual spatial and temporal autocorrelation.

Motivation

Residual interpretation for generalized linear mixed models (GLMMs) is often problematic. As an example, here are two Poisson GLMMs, one that is lacking a quadratic effect, and one that fits the data perfectly. I show three standard residual diagnostics for each. Which is the misspecified model?

Just for completeness - it was the first one. But don’t get too excited if you got it right. Either you were lucky, or you noted that the first model seems a bit overdispersed (range of the Pearson residuals). But even when noting that, would you have added a quadratic effect, instead of adding an overdispersion correction? The point here is that misspecifications in GL(M)Ms cannot reliably be diagnosed with standard residual plots, and GLMMs are thus often not as thoroughly checked as LMs.

One reason why GL(M)M residuals are harder to interpret is that the expected distribution of the data changes with the fitted values. Reweighting with the expected variance, as done in Pearson residuals, or using deviance residuals, helps a bit, but does not lead to visually homogeneous residuals even if the model is correctly specified. As a result, standard residual plots, when interpreted in the same way as for linear models, seem to show all kinds of problems, such as non-normality or heteroscedasticity, even if the model is correctly specified. Questions on the R mailing lists and forums show that practitioners are regularly confused about whether such patterns in GL(M)M residuals are a problem or not.

But even experienced statistical analysts currently have few options to diagnose misspecification problems in GLMMs. In my experience, the current standard practice is to eyeball the residual plots for major misspecifications, potentially have a look at the random effect distribution, and then run a test for overdispersion, which is usually positive, after which the model is modified towards an overdispersed / zero-inflated distribution. This approach, however, has a number of problems, notably:

DHARMa aims at solving these problems by creating readily interpretable residuals for generalized linear (mixed) models that are standardized to values between 0 and 1, and that can be interpreted as intuitively as residuals for the linear model. This is achieved by a simulation-based approach, similar to the Bayesian p-value or the parametric bootstrap, that transforms the residuals to a standardized scale. The basic steps are:

  1. Simulate new data from the fitted model for each observation.

  2. For each observation, calculate the empirical cumulative distribution function of the simulated observations, which describes the possible values (and their probabilities) at the predictor combination of the observed value, assuming the fitted model is correct.

  3. The residual is then defined as the value of the empirical cumulative distribution function at the value of the observed data, so a residual of 0 means that all simulated values are larger than the observed value, and a residual of 0.5 means half of the simulated values are larger than the observed value.

These steps are visualized in the following figure.
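The idea can also be sketched in a few lines of base R for a single observation. This is purely illustrative: rpois() stands in for "simulate from the fitted model", and the lambda and observed values are made up.

```r
# Purely illustrative sketch of the residual definition for one observation
simulated <- rpois(250, lambda = 3)     # step 1: 250 simulated values (stand-in for model simulations)
observed  <- 5                          # the observed value
residual  <- ecdf(simulated)(observed)  # steps 2-3: empirical CDF evaluated at the observation
# (for integer responses, DHARMa additionally randomizes; see the section on integer treatment below)
```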

The key motivation behind this definition is that we have a clear expectation of how these residuals should be distributed. If the model is correctly specified, then the observed data should look as if they were created from the fitted model. Hence, for a correctly specified model, all values of the cumulative distribution should appear with equal probability. That means we expect the distribution of the residuals to be flat, regardless of the model structure (Poisson, binomial, random effects and so on).

I am currently preparing a more exact statistical justification for the approach in an accompanying paper, but if you must provide a reference in the meantime I would suggest citing

p.s.: DHARMa stands for “Diagnostics for HierArchical Regression Models” – which, strictly speaking, would make DHARM. But in German, Darm means intestines; plus, the meaning of DHARMa in Hinduism makes the current abbreviation so much more suitable for a package that tests whether your model is in harmony with your data:

From Wikipedia, 28/08/16: In Hinduism, dharma signifies behaviours that are considered to be in accord with rta, the order that makes life and universe possible, and includes duties, rights, laws, conduct, virtues and ‘‘right way of living’’.

Workflow in DHARMa

Installing, loading and citing the package

If you haven’t installed the package yet, either run
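For example, the standard CRAN installation:

```r
install.packages("DHARMa")
```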

Or follow the instructions on https://github.com/florianhartig/DHARMa to install a development version.

Loading and citation
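```r
library(DHARMa)
citation("DHARMa")
```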

## 
## To cite package 'DHARMa' in publications use:
## 
##   Florian Hartig (2019). DHARMa: Residual Diagnostics for
##   Hierarchical (Multi-Level / Mixed) Regression Models. R package
##   version 0.2.5. http://florianhartig.github.io/DHARMa/
## 
## A BibTeX entry for LaTeX users is
## 
##   @Manual{,
##     title = {DHARMa: Residual Diagnostics for Hierarchical (Multi-Level / Mixed) Regression Models},
##     author = {Florian Hartig},
##     year = {2019},
##     note = {R package version 0.2.5},
##     url = {http://florianhartig.github.io/DHARMa/},
##   }

Calculating scaled residuals

The scaled (quantile) residuals are calculated with the simulateResiduals() function. The default number of simulations is 250, which proved to be a reasonable compromise between computation time and precision, but if high precision is desired, n should be raised to at least 1000.
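A minimal sketch, assuming fittedModel holds a previously fitted supported model (e.g. a glmer fit):

```r
# simulate scaled residuals from the fitted model
simulationOutput <- simulateResiduals(fittedModel = fittedModel, n = 250)
```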

What the function does is a) create n new synthetic datasets by simulating from the fitted model, b) calculate the cumulative distribution of simulated values for each observed value, and c) return the quantile value that corresponds to the observed value.

For example, a scaled residual value of 0.5 means that half of the simulated data are higher than the observed value, and half of them lower. A value of 0.99 would mean that nearly all simulated data are lower than the observed value. The minimum/maximum values for the residuals are 0 and 1.

The calculated residuals are stored in simulationOutput$scaledResiduals.

As discussed above, for a correctly specified model we would expect a uniform (flat) distribution of the scaled residuals.

Note: the expected uniform distribution is the only difference to the linear regression that one has to keep in mind when interpreting DHARMa residuals. If you cannot get used to this and you must have residuals that behave exactly like a linear regression, you can access a normal transformation of the residuals via
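Since the scaled residuals are uniform under a correct model, the transformation amounts to applying qnorm to them; a minimal sketch:

```r
# map uniform scaled residuals to the normal scale
normResiduals <- qnorm(residuals(simulationOutput))
```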

These normal residuals will behave exactly like the residuals of a linear regression. However, for reasons of a) numeric stability with a low number of simulations and b) my conviction that it is much easier to visually detect deviations from uniformity than from normality, I would STRONGLY advise against using this transformation.

Plotting the scaled residuals

We can get a visual impression of these properties with the plot.DHARMa() function
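```r
plot(simulationOutput)
```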

which creates a qq-plot to detect overall deviations from the expected distribution (with added tests for uniformity and outliers), and a plot of the residuals against the predicted value.

In the plot res ~ predicted, simulation outliers are highlighted as red stars. By simulation outliers, I mean data points that are outside the range of simulated values. Because no data were simulated in that range, we actually don’t know “how much” these values deviate from the model expectation, so the term “outlier” should be taken with a grain of salt. Note also that the probability of an outlier depends on the number of simulations (in fact, it is 1/(nSim + 1) for each side), so whether the existence of outliers is a reason for concern also depends on the number of simulations.

To provide a visual aid for detecting deviations from uniformity in the y-direction, the plot function calculates an (optional) quantile regression, which compares the empirical 0.25, 0.5 and 0.75 quantiles in the y-direction (red solid lines) with the theoretical 0.25, 0.5 and 0.75 quantiles (dashed black lines). Asymptotically (i.e. for lots of data / residuals), if the model is correct, the theoretical and empirical quantiles should be identical (i.e. dashed and solid lines should match).

In practice, however, there will be only a finite and often small number of residuals. If the model is correct, these residuals are drawn from the theoretical (uniform) distribution, but because of the limited sample size, the empirical quantiles of these residuals will never perfectly match the theoretical quantiles. It’s the same as in a normal linear regression - even if the model is entirely correct, the qq-plot (or any other residual plot) for a few data points will never perfectly match the theoretical quantiles.

Thus, for a limited amount of data, the question one has to ask is whether the deviation of the empirical (red) from the expected (dashed) distribution is strong enough that one can reject the null hypothesis that the residuals are drawn from a uniform distribution. To answer this question, DHARMa has various tests implemented (see later). Unfortunately, there is not yet a dedicated test for trends in the red quantile lines, so at the moment it’s up to the user to make the call whether a deviation in the residual pattern is still acceptable, i.e. could appear due to random variation.

If you want to plot the residuals against other predictors (highly recommended), you can use the function
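For example (the argument order is an assumption for this DHARMa version; check ?plotResiduals):

```r
# residuals against an arbitrary predictor vector
plotResiduals(testData$Environment1, simulationOutput$scaledResiduals)
```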

which does the same quantile plot as the main plotting function.

Formal goodness-of-fit tests on the scaled residuals

To support the visual inspection of the residuals, the DHARMa package provides a number of specialized goodness-of-fit tests on the simulated residuals (among them testUniformity, testOutliers, testDispersion, testZeroInflation, and testGeneric) that basically do what their names say. See the help of the functions and further comments below for a more detailed description. The wrapper function testResiduals calculates the first three tests, including their graphical outputs
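```r
testResiduals(simulationOutput)
```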

Simulation options

There are a few important technical details regarding how the simulations are performed, in particular regarding the treatment of random effects and integer responses. It is strongly recommended to read the help of ?simulateResiduals

Refit

  • if refit = F (default), new data is simulated from the fitted model, and residuals are calculated by comparing the observed data to the new data

  • if refit = T, a parametric bootstrap is performed, meaning that the model is refit to the new data, and residuals are created by comparing observed residuals against refitted residuals

The second option is much, much slower, and also seemed to have lower power in some tests I ran. **It is therefore not recommended for standard residual diagnostics!** I only recommend using it if you know what you are doing and have particular reasons, for example if you suspect that the tested model is biased. A bias could, for example, arise in small data situations, or when estimating models with shrinkage estimators that include a purposeful bias, such as ridge/lasso, random effects or the splines in GAMs. My idea was then that simulated data would not fit the observations, but that residuals for model fits on simulated data would show the same patterns/bias as model fits on the observed data.

Note also that refit = T can sometimes run into numerical problems, if the fitted model does not converge on the newly simulated data.
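The two options side by side, as a sketch (again assuming fittedModel holds a supported model object):

```r
simulationOutput  <- simulateResiduals(fittedModel)             # refit = F (default): simulate only
simulationOutput2 <- simulateResiduals(fittedModel, refit = T)  # parametric bootstrap: refit per simulation (slow!)
```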

Random effect simulations

The second option is the treatment of the stochastic hierarchy. In a hierarchical model, several layers of stochasticity are placed on top of each other. Specifically, in a GLMM, we have a lower level stochastic process (random effect), whose result enters into a higher level (e.g. Poisson distribution). For other hierarchical models, such as state-space models, similar considerations apply, but the hierarchy can be more complex. When simulating, we have to decide if we want to re-simulate all stochastic levels, or only a subset of those. For example, in a GLMM, it is common to only simulate the last stochastic level (e.g. Poisson) conditional on the fitted random effects, meaning that the random effects are set on the fitted values.

For controlling how many levels should be re-simulated, the simulateResiduals function allows passing parameters on to the simulate function of the fitted model object. Please refer to the help of the different simulate functions (e.g. ?simulate.merMod) for details. For merMod (lme4) model objects, the relevant parameters are “use.u” and “re.form”, as, e.g., in
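A sketch for an lme4 model, conditioning the simulations on the fitted random effects (the use.u argument is passed through to simulate.merMod):

```r
# use.u = T: random effects are set to their fitted values, only the
# observation-level stochastic process is re-simulated
simulationOutput <- simulateResiduals(fittedModel, n = 250, use.u = T)
```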

If the model is correctly specified and the fitting procedure is unbiased (disclaimer: GLMM estimators are not always unbiased), the simulated residuals should be flat regardless of how many hierarchical levels we re-simulate. The most thorough procedure would therefore be to test all possible options. If testing only one option, I would recommend re-simulating all levels, because this essentially tests the model structure as a whole. This is the default setting in the DHARMa package. A potential drawback is that re-simulating the random effects creates more variability, which may reduce power for detecting problems in the upper-level stochastic processes.

Integer treatment / randomization

A third option is the treatment of integer responses. The background of this option is that, for integer-valued variables, some additional steps are necessary to make sure that the residual distribution becomes flat (essentially, we have to smooth out the integer nature of the data). The idea is explained in

  • Dunn, P. K., and Smyth, G. K. (1996). Randomized quantile residuals. Journal of Computational and Graphical Statistics 5, 236-244.

The simulateResiduals function will automatically check if the distribution is integer-valued, and apply randomization if that is the case. I see no reason why one would not want to randomize for an integer-valued response, so this parameter should usually not be changed.

Calculating residuals per group

In many situations, it can be useful to look at residuals per group, e.g. to see how much the model over / underpredicts per plot, year or subject. To do this, use the recalculateResiduals() function, together with a grouping variable
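For example, grouping by a factor in the data (here testData$group, the grouping variable used in the examples above):

```r
# aggregate the residuals per level of the grouping variable
simulationOutput2 <- recalculateResiduals(simulationOutput, group = testData$group)
```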

you can keep using the simulation output as before. Note, however, that items such as simulationOutput$scaledResiduals now have as many entries as you have groups, so if you perform plots by hand, you have to aggregate predictors in the same way. For the latter purpose, recalculateResiduals adds a function aggregateByGroup to the output.

Reproducibility notes, random seed and random state

As DHARMa uses simulations to calculate the residuals, a naive implementation of the algorithm would mean that residuals would look slightly different each time a DHARMa calculation is executed. This might both be confusing and bear the danger that a user would run the simulation several times and take the result that looks better (which would amount to multiple testing / p-hacking).

By default, DHARMa therefore fixes the random seed to the same value every time a simulation is run, and afterwards restores the random state to the old value. This means that you will get exactly the same residual plot each time. If you want to avoid this behavior, for example for simulation experiments on DHARMa, use seed = NULL -> no seed set, but random state will be restored, or seed = F -> no seed set, and random state will not be restored. Whether or not you fix the seed, the setting for the random seed and the random state are stored in

If you want to reproduce simulations for such a run, set the variable .Random.seed by hand, and simulate with seed = NULL.

Moreover (general advice), to ensure reproducibility, it’s advisable to add a set.seed() at the beginning, and a sessionInfo() at the end of your script. The latter will list the versions of R and all loaded packages.

Interpreting residuals and recognizing misspecification problems

In all plots / tests that were shown so far, the model was correctly specified, resulting in “perfect” residual plots. In this section, we discuss how to recognize and interpret model misspecifications in the scaled residuals. Note, however, that

  1. The fact that none of the tests presented here shows a misspecification problem doesn’t prove that the model is correctly specified. There are likely a large number of structural problems that will not show a pattern in the standard residual plots.

  2. Conversely, while a clear pattern in the residuals indicates with good reliability that the observed data would not be likely to originate from the fitted model, it doesn’t necessarily indicate that the model results are not usable. There are many cases where it is common practice to work with “wrong” models. For example, random effect estimates (in particular in GLMMs) are often slightly biased, especially if the model is fit with MLE. For that reason, DHARMa will often show a slight pattern in the residuals even if the model is correctly specified, and tests for this can become significant for large sample sizes. Another example is data that is missing at random (MAR) (see here). It is known that this phenomenon does not create a bias in the fixed effect estimates, and it is therefore common practice to fit such data with mixed models. Nevertheless, DHARMa recognizes that the observed data look different than what would be expected from the model assumptions, and flags the model as problematic.

Important conclusion: DHARMa only flags a difference between the observed and expected data - the user has to decide whether this difference is actually a problem for the analysis!

Overdispersion / underdispersion

The most common concerns for GLMMs are overdispersion, underdispersion and zero-inflation.

Over/underdispersion refers to the phenomenon that residual variance is larger/smaller than expected under the fitted model. Over/underdispersion can appear for any distributional family with fixed variance, in particular for Poisson and binomial models.

A few general rules of thumb

An example of overdispersion

This is how overdispersion looks in the DHARMa residuals

Note that we get more residuals around 0 and 1, which means that more residuals are in the tails of the distribution than would be expected under the fitted model.

An example of underdispersion

This is an example of underdispersion

## Generalized linear mixed model fit by maximum likelihood (Laplace
##   Approximation) [glmerMod]
##  Family: poisson  ( log )
## Formula: observedResponse ~ Environment1 + (1 | group)
##    Data: testData
## 
##      AIC      BIC   logLik deviance df.resid 
##   1031.1   1043.8   -512.6   1025.1      497 
## 
## Scaled residuals: 
##      Min       1Q   Median       3Q      Max 
## -0.64083 -0.35390 -0.05813  0.22834  0.91703 
## 
## Random effects:
##  Groups Name        Variance Std.Dev.
##  group  (Intercept) 0        0       
## Number of obs: 500, groups:  group, 10
## 
## Fixed effects:
##              Estimate Std. Error z value Pr(>|z|)    
## (Intercept)  -0.13024    0.05831  -2.233   0.0255 *  
## Environment1  2.19567    0.08519  25.772   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Correlation of Fixed Effects:
##             (Intr)
## Environmnt1 -0.818
## convergence code: 0
## boundary (singular) fit: see ?isSingular

## 
##  One-sample Kolmogorov-Smirnov test
## 
## data:  simulationOutput$scaledResiduals
## D = 0.21859, p-value < 2.2e-16
## alternative hypothesis: two-sided

Here, we get too many residuals around 0.5, which means that we are not getting as many residuals in the tails of the distribution as expected under the fitted model.

Testing for over/underdispersion

Although, as discussed above, over/underdispersion will show up in the residuals, and it’s possible to detect it with the testUniformity function, simulations show that this test is less powerful than more targeted tests.

DHARMa therefore contains two overdispersion tests that compare the dispersion of simulated residuals to that of the observed residuals.

  1. A non-parametric test on the simulated residuals
  2. A non-parametric overdispersion test on the re-fitted residuals.

You can call these tests as follows:
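A sketch of the two calls, assuming simulationOutput was created with the default settings and simulationOutput2 with refit = T:

```r
testDispersion(simulationOutput)    # non-parametric test on the simulated residuals
testDispersion(simulationOutput2)   # non-parametric test on the refitted residuals
```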

## 
##  DHARMa nonparametric dispersion test via sd of residuals fitted
##  vs. simulated
## 
## data:  simulationOutput
## ratioObsSim = 0.24135, p-value < 2.2e-16
## alternative hypothesis: two.sided

## 
##  DHARMa nonparametric dispersion test via mean deviance residual
##  fitted vs. simulated-refitted
## 
## data:  simulationOutput2
## dispersion = 0.15184, p-value < 2.2e-16
## alternative hypothesis: two.sided

Note: previous versions of DHARMa (< 0.2.0) discouraged the simulated overdispersion test in favor of the refitted and parametric tests. I have since changed the test function, and simulations show that it is as powerful as the refitted or parametric test. Because of the generality and speed of this option, I see no good reason for either refitting or running parametric tests. Therefore

  1. My recommendation for testing dispersion is to simply use the standard dispersion test, based on the simulated residuals

  2. It’s not clear to me if the refitted test is better … but it’s available.

  3. In my simulations, parametric tests, such as AER::dispersiontest, didn’t provide higher power. Because of that, and because of the higher generality of the simulated tests, I no longer provide parametric tests in DHARMa. However, you can see various implementations of the parametric tests in the DHARMa GitHub repo under Code/DHARMaPerformance/Power.

Below an example from there, which compares four options to test for overdispersion (two ways to use DHARMa::testDispersion, AER::dispersiontest, and DHARMa::testUniformity) for a Poisson glm

Comparison of power from simulation studies


A word of warning that applies also to all other tests that follow: significance in hypothesis tests depends on at least two ingredients: the strength of the signal, and the number of data points. Hence, the p-value alone is not a good indicator of the extent to which your residuals deviate from assumptions. Specifically, if you have a lot of data points, residual diagnostics will nearly inevitably become significant, because having a perfectly fitting model is very unlikely. That, however, doesn’t necessarily mean that you need to change your model. The p-values confirm that there is a deviation from your null hypothesis. It is, however, at your discretion to decide whether this deviation is worth worrying about. If you see a dispersion parameter of 1.01, I would not worry, even if the test is significant. A significant value of 5, however, is clearly a reason to move to a model that accounts for overdispersion.

Zero-inflation / k-inflation or deficits

A common special case of overdispersion is zero-inflation, which is the situation when more zeros appear in the observation than expected under the fitted model. Zero-inflation requires special correction steps.

More generally, we can also have too few zeros, or too many or too few of any other value. We’ll discuss that at the end of this section.

Zero-inflation in the scaled residuals

In the normal DHARMa residual plots, zero-inflation will look pretty much like overdispersion

The reason is that the model will usually try to find a compromise between the zeros, and the other values, which will lead to excess variance in the residuals.

Test for zero-inflation

DHARMa has a special test for zero-inflation, which compares the distribution of expected zeros in the data against the observed zeros
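```r
testZeroInflation(simulationOutput)
```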

## 
##  DHARMa zero-inflation test via comparison to expected zeros with
##  simulation under H0 = fitted model
## 
## data:  simulationOutput
## ratioObsSim = 2.1744, p-value < 2.2e-16
## alternative hypothesis: two.sided

This test is likely better suited for detecting zero-inflation than the standard plot, but note that overdispersion will also lead to excess zeros, so seeing too many zeros alone is not a reliable diagnostic for moving towards a zero-inflated model. A reliable differentiation between overdispersion and zero-inflation will usually only be possible when directly comparing alternative models, e.g. through residual comparison / model selection of a model with / without zero-inflation, or by simply fitting a model with zero-inflation and looking at the parameter estimate for the zero-inflation.

A good option is the R package glmmTMB, which is also supported by DHARMa. We can use this to fit
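A sketch of such a fit, using the model formula shown in the output below (the zero-inflation part is specified via glmmTMB's ziformula argument):

```r
library(glmmTMB)
# Poisson GLMM with a constant (intercept-only) zero-inflation term
fittedModel <- glmmTMB(observedResponse ~ Environment1 + I(Environment1^2) + (1 | group),
                       ziformula = ~1, family = poisson, data = testData)
summary(fittedModel)
```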

##  Family: poisson  ( log )
## Formula:          
## observedResponse ~ Environment1 + I(Environment1^2) + (1 | group)
## Zero inflation:                    ~1
## Data: testData
## 
##      AIC      BIC   logLik deviance df.resid 
##   1288.7   1309.7   -639.3   1278.7      495 
## 
## Random effects:
## 
## Conditional model:
##  Groups Name        Variance Std.Dev. 
##  group  (Intercept) 8.05e-10 2.837e-05
## Number of obs: 500, groups:  group, 10
## 
## Conditional model:
##                   Estimate Std. Error z value Pr(>|z|)    
## (Intercept)        2.00497    0.04648   43.13   <2e-16 ***
## Environment1       1.08342    0.10810   10.02   <2e-16 ***
## I(Environment1^2) -2.92000    0.19383  -15.06   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Zero-inflation model:
##             Estimate Std. Error z value Pr(>|z|)   
## (Intercept)   0.2990     0.1036   2.885  0.00392 **
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Testing generic summary statistics, e.g. for k-inflation or deficits

To test for generic excess / deficits of particular values, we have the function testGeneric, which compares the values of a generic, user-provided summary statistics

Choose one of alternative = c(“greater”, “two.sided”, “less”) to test for inflation / deficit or both. Default is “greater” = inflation.
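For example, a hypothetical summary statistic counting the number of 1s in the response (countOnes is an illustrative name, not part of DHARMa):

```r
countOnes <- function(x) sum(x == 1)  # hypothetical user-provided summary statistic
testGeneric(simulationOutput, summary = countOnes, alternative = "greater")
```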

## 
##  DHARMa generic simulation test
## 
## data:  simulationOutput
## ratioObsSim = 0.94807, p-value = 0.632
## alternative hypothesis: greater

Heteroscedasticity

So far, most of the things that we have tested could also have been detected with parametric tests. Here, we come to the first issue that is difficult to detect with current tests, and that is usually neglected.

Heteroscedasticity means that there is a systematic dependency of the dispersion / variance on another variable in the model. It is not sufficiently appreciated that binomial or Poisson models can also show heteroscedasticity. Basically, it means that the level of over/underdispersion depends on another parameter. Here is an example where we create such data

## 
##  One-sample Kolmogorov-Smirnov test
## 
## data:  simulationOutput$scaledResiduals
## D = 0.35226, p-value < 2.2e-16
## alternative hypothesis: two-sided

Adding a simple overdispersion correction will try to find a compromise between the different levels of dispersion in the model. The qq plot looks better now, but there is still a pattern in the residuals

## 
##  One-sample Kolmogorov-Smirnov test
## 
## data:  simulationOutput$scaledResiduals
## D = 0.043426, p-value = 0.3024
## alternative hypothesis: two-sided

To remove this pattern, you would need to make the dispersion parameter dependent on a predictor (e.g. in JAGS), or apply a transformation on the data.

Missing predictors or quadratic effects

A second test that is typically run for LMs, but not for GL(M)Ms is to plot residuals against the predictors in the model (or potentially predictors that were not in the model) to detect possible misspecifications. Doing this is highly recommended. For that purpose, you can retrieve the residuals via
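```r
res <- residuals(simulationOutput)  # scaled residuals, values between 0 and 1
```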

Note again that the residual values are scaled between 0 and 1. If you plot the residuals against predictors, space or time, the resulting plots should not only show no systematic dependency of those residuals on the covariates, but they should also again be flat for each fixed situation. That means that if you have, for example, a categorical predictor: treatment / control, the distribution of residuals for each predictor alone should be flat as well.

Here is an example with a missing quadratic effect in the model and two predictors

## 
##  One-sample Kolmogorov-Smirnov test
## 
## data:  simulationOutput$scaledResiduals
## D = 0.08939, p-value = 0.08183
## alternative hypothesis: two-sided

It is difficult to see that there is a problem at all in the general plot, but it becomes clear if we plot against the environment

## 134 -0.425176987 0.491107834
## 135 -0.814892724 0.140037097
## 136  0.533892605 0.402318208
## 137  0.474530497 0.883750754
## 138  0.269547141 0.586598575
## 139  0.534733912 0.825997635
## 140  0.068640707 0.839316836
## 141  0.756596346 0.038540718
## 142  0.948212874 0.013457768
## 143  0.446644397 0.018967935
## 144  0.460157504 0.110502935
## 145 -0.034254333 0.884248246
## 146 -0.749983209 0.488143955
## 147  0.871744363 0.448787945
## 148 -0.091386907 0.970490572
## 149 -0.338151921 0.845936031
## 150  0.233224936 0.677152767
## 151 -0.067905995 0.787584149
## 152 -0.966973396 0.074882449
## 153 -0.258827719 0.196467759
## 154 -0.860205528 0.438918920
## 155  0.544073089 0.853619572
## 156  0.342508148 0.863597751
## 157 -0.292428020 0.444139941
## 158  0.947292638 0.380351708
## 159  0.485465252 0.911184857
## 160 -0.587300418 0.092133975
## 161 -0.848640000 0.163773220
## 162  0.328789959 0.252466564
## 163 -0.564113139 0.626770792
## 164 -0.547689855 0.122929286
## 165  0.230785024 0.373994776
## 166  0.821503535 0.029764962
## 167  0.153748982 0.347773372
## 168 -0.705929290 0.666969894
## 169 -0.515511521 0.260167313
## 170 -0.716362564 0.025617914
## 171  0.835940945 0.316612547
## 172 -0.288877787 0.262105699
## 173  0.587247032 0.241342852
## 174  0.334023700 0.729777150
## 175  0.068688138 0.443275270
## 176  0.191618680 0.208991317
## 177  0.232102375 0.269723946
## 178  0.878821753 0.126177743
## 179 -0.210926833 0.069342404
## 180  0.379429946 0.452748390
## 181  0.776516399 0.042911803
## 182  0.120790177 0.746972229
## 183  0.289059819 0.778272554
## 184  0.417963855 0.981339646
## 185 -0.954577521 0.136016617
## 186 -0.105281018 0.675931629
## 187  0.025339697 0.856857941
## 188 -0.631002742 0.383261692
## 189 -0.039543609 0.797554300
## 190  0.425877935 0.684435701
## 191 -0.690948494 0.471383259
## 192  0.041338589 0.771804448
## 193 -0.287996831 0.638400402
## 194  0.700300386 0.196091824
## 195  0.913831946 0.733691549
## 196 -0.158851998 0.979472053
## 197  0.988766413 0.437581153
## 198 -0.788387035 0.728792183
## 199  0.724136366 0.193698013
## 200 -0.140255381 0.740351995

## [printed data frame of predictor ('pred') and scaled residual ('res') values, 200 rows omitted]

Temporal autocorrelation

A special case of plotting residuals against predictors is the plot against time and space, which should always be performed if those variables are present in the model. Let’s create some temporally autocorrelated data

Test and plot for temporal autocorrelation

The function testTemporalAutocorrelation performs a Durbin-Watson test from the package lmtest on the uniform residuals to test for temporal autocorrelation, and additionally plots the residuals against time.

The function also offers the option to run the test against randomized time (the H0). The point of this option is to allow simulation studies that check whether the test has correct error rates in the situation at hand, i.e. is not oversensitive (too high a sensitivity has occasionally been reported for the Durbin-Watson test).
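As a minimal sketch, assuming a DHARMa residual object simulationOutput created with simulateResiduals() and a data frame testData whose time column holds the time variable (both names are assumptions for illustration), the test could be called as:

```r
# hypothetical sketch: 'simulationOutput' comes from simulateResiduals(fittedModel),
# 'testData$time' holds the time variable of the data
testTemporalAutocorrelation(simulationOutput, time = testData$time)
```

See ?testTemporalAutocorrelation for the exact arguments, including the option to test against randomized time.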

## 
##  Durbin-Watson test
## 
## data:  simulationOutput$scaledResiduals ~ 1
## DW = 1.4652, p-value = 0.00691
## alternative hypothesis: true autocorrelation is not 0

## 
##  Durbin-Watson test
## 
## data:  simulationOutput$scaledResiduals ~ 1
## DW = 1.8943, p-value = 0.5936
## alternative hypothesis: true autocorrelation is not 0

Note the general caveats about the DW test mentioned in the help of testTemporalAutocorrelation(). In general, as for spatial autocorrelation, it is difficult to specify a single test, because temporal and spatial autocorrelation can appear in many flavors: short-scale and long-scale, homogeneous or not, and so on. The pre-defined functions in DHARMa are a starting point, but they are not something you should rely on blindly.

Spatial autocorrelation

Here is an example with spatially autocorrelated data

Test and plot for spatial autocorrelation

The spatial autocorrelation test performs the Moran.I test from the package ape and plots the residuals against space.

An additional test against randomized space (H0) can be performed, for the same reasons as explained above.
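A minimal sketch of the call, assuming spatial coordinates are stored in columns x and y of a data frame testData (names are assumptions for illustration):

```r
# hypothetical sketch: 'simulationOutput' from simulateResiduals(fittedModel),
# 'testData$x' and 'testData$y' hold the spatial coordinates of the observations
testSpatialAutocorrelation(simulationOutput, x = testData$x, y = testData$y)
```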

## 
##  DHARMa Moran's I test for spatial autocorrelation
## 
## data:  simulationOutput
## observed = 0.078302, expected = -0.010101, sd = 0.016529, p-value
## = 8.881e-08
## alternative hypothesis: Spatial autocorrelation

## 
##  DHARMa Moran's I test for spatial autocorrelation
## 
## data:  simulationOutput
## observed = -0.0031074, expected = -0.0101010, sd = 0.0168619,
## p-value = 0.6783
## alternative hypothesis: Spatial autocorrelation

The usual caveats for Moran.I apply, in particular that it may miss non-local and heterogeneous (non-stationary) spatial autocorrelation. The former should be better detectable visually in the spatial plot, or via regressions on the pattern.

Supported packages

lm and glm

lm and glm and MASS::glm.nb are fully supported.

lme4

lme4 model classes are fully supported.

mgcv

mgcv is partly supported. Non-standard distributions are not supported, because mgcv doesn’t implement a simulate function for those.

glmmTMB

glmmTMB is partly supported. See the package documentation for current limitations of the glmmTMB support.

spaMM

spaMM has been supported by DHARMa since version 0.2.1.

Other packages

See my general comments about adding new R packages to DHARMa

As noted there, if you want to use DHARMa for a specific case, you can write a custom simulate function for the model you are working with. This will usually involve calling the predict function and then adding random noise from the assumed distribution, plus potentially drawing new values for the random effects or other hierarchical levels.

As an example, for a Poisson GLM, a simulate function could be programmed as in the following example, which also shows how the results are read into DHARMa and plotted (see also the following section)
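A minimal sketch of such a simulate function for a Poisson GLM; the object names fittedModel and observedResponse are assumptions, and DHARMa is assumed to be loaded:

```r
# hypothetical sketch: 'fittedModel' is a fitted Poisson glm,
# 'observedResponse' its observed response vector
nSim <- 250
pred <- predict(fittedModel, type = "response")

# simulate new Poisson data around the fitted means; matrix is filled
# column-wise, so each column is one full simulated data set
simulatedResponse <- matrix(rpois(length(pred) * nSim, lambda = pred),
                            ncol = nSim)

# read the external simulations into DHARMa and plot
DHARMaRes <- createDHARMa(simulatedResponse = simulatedResponse,
                          observedResponse = observedResponse,
                          fittedPredictedResponse = pred,
                          integerResponse = TRUE)
plot(DHARMaRes)
```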

Importing external simulations (e.g. from Bayesian software or unsupported packages)

As mentioned earlier, the quantile residuals defined in DHARMa are the frequentist equivalent of the so-called “Bayesian p-values”, i.e. residuals created from posterior predictive simulations in a Bayesian analysis.

To make the plots and tests in DHARMa also available for Bayesian analysis, DHARMa provides the option to convert externally created posterior predictive simulations into a DHARMa object
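As a sketch, assuming posterior predictive simulations have been exported as an nObs x nSim matrix (the object names posteriorPredictiveSims and observedValues are hypothetical):

```r
# hypothetical sketch: 'posteriorPredictiveSims' is an nObs x nSim matrix of
# posterior predictive simulations exported from JAGS / STAN / BUGS,
# 'observedValues' is the observed response
res <- createDHARMa(simulatedResponse = posteriorPredictiveSims,
                    observedResponse = observedValues,
                    fittedPredictedResponse = apply(posteriorPredictiveSims, 1, median),
                    integerResponse = TRUE)
plot(res)
```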

What is provided as fittedPredictedResponse is up to the user, but median posterior predictions seem most sensible to me. After the conversion, all DHARMa plots can be used. Note, however, that Bayesian p-values are not identical to DHARMa residuals, because in a Bayesian analysis the parameters are varied as well.

Important: as DHARMa doesn’t know the distribution of the fitted model, it is vital to specify the integerResponse option by hand (see above / ?simulateResiduals for details).

Case studies and examples

Note: more real-world examples are available on the DHARMa GitHub repository here

Budworm example (count-proportion n/k binomial)

This example comes from Jochen Fründ. Measured is the number of parasitized observations, with population density as a covariate

Let’s fit the data with a regular binomial n/k glm
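A sketch of the fit, with hypothetical column names (the actual names in the budworm data may differ):

```r
# hypothetical sketch: 'parasitized' and 'notParasitized' are the success /
# failure counts, 'density' the population density covariate; DHARMa loaded
mod1 <- glm(cbind(parasitized, notParasitized) ~ density,
            data = budworm, family = binomial)
simulationOutput <- simulateResiduals(fittedModel = mod1)
plot(simulationOutput)
```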

The residuals look clearly overdispersed. We can confirm that with the omnibus test

## 
##  One-sample Kolmogorov-Smirnov test
## 
## data:  simulationOutput$scaledResiduals
## D = 0.36382, p-value = 0.005911
## alternative hypothesis: two-sided

Or with the more powerful overdispersion test

## 
##  DHARMa nonparametric dispersion test via sd of residuals fitted
##  vs. simulated
## 
## data:  simulationOutput
## ratioObsSim = 1.1516, p-value < 2.2e-16
## alternative hypothesis: two.sided

OK, so let’s add overdispersion through an individual-level random effect
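An observation-level random effect can be added via lme4; a sketch with the same hypothetical column names as above:

```r
# hypothetical sketch: one random-effect level per observation absorbs the
# extra variance (observation-level random effect, OLRE)
library(lme4)
budworm$obsID <- 1:nrow(budworm)
mod2 <- glmer(cbind(parasitized, notParasitized) ~ density + (1 | obsID),
              data = budworm, family = binomial)
plot(simulateResiduals(fittedModel = mod2))
```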

The overdispersion looks better, but you can see that the residuals look a bit irregular.

Likely, the reason is the steep increase at the beginning that can be seen in the raw data plot. One would probably need another transformation or a nonlinear function to fully remove this pattern.

Owl example (count data)

The next example uses the fairly well-known Owl dataset, which is provided in glmmTMB (see ?Owls for more information about the data).

The following shows a sequence of models, all checked with DHARMa. The example is discussed in a talk at ISEC 2018, see slides here.
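The first model in the sequence can be reconstructed from the summary output; a sketch of the call, assuming glmmTMB and the Owls data are loaded:

```r
# reconstruction of the fitting call corresponding to the nbinom1 summary shown
# in this section; checking the fit with DHARMa afterwards
library(glmmTMB)
m1 <- glmmTMB(SiblingNegotiation ~ FoodTreatment * SexParent +
                offset(log(BroodSize)) + (1 | Nest),
              data = Owls, family = nbinom1)
summary(m1)
plot(simulateResiduals(m1))
```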

##  Family: nbinom1  ( log )
## Formula:          
## SiblingNegotiation ~ FoodTreatment * SexParent + offset(log(BroodSize)) +  
##     (1 | Nest)
## Data: Owls
## 
##      AIC      BIC   logLik deviance df.resid 
##   3400.8   3427.2  -1694.4   3388.8      593 
## 
## Random effects:
## 
## Conditional model:
##  Groups Name        Variance Std.Dev.
##  Nest   (Intercept) 0.1265   0.3556  
## Number of obs: 599, groups:  Nest, 27
## 
## Overdispersion parameter for nbinom1 family (): 7.05 
## 
## Conditional model:
##                                     Estimate Std. Error z value Pr(>|z|)
## (Intercept)                          0.67674    0.11340   5.968 2.41e-09
## FoodTreatmentSatiated               -0.87038    0.13964  -6.233 4.58e-10
## SexParentMale                        0.04469    0.10712   0.417    0.677
## FoodTreatmentSatiated:SexParentMale  0.12173    0.17520   0.695    0.487
##                                        
## (Intercept)                         ***
## FoodTreatmentSatiated               ***
## SexParentMale                          
## FoodTreatmentSatiated:SexParentMale    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

## [printed data frame of food treatment group ('pred') and scaled residuals ('res'), 577 rows omitted]
## 578 Deprived 0.787787035
## 579 Deprived 0.615378709
## 580 Satiated 0.049919830
## 581 Satiated 0.642730919
## 582 Satiated 0.273383824
## 583 Satiated 0.200267288
## 584 Deprived 0.349184423
## 585 Deprived 0.245533441
## 586 Deprived 0.310227000
## 587 Deprived 0.435006378
## 588 Satiated 0.395034643
## 589 Satiated 0.812743912
## 590 Satiated 0.787827708
## 591 Satiated 0.858844680
## 592 Deprived 0.530079336
## 593 Deprived 0.800592540
## 594 Deprived 0.283650616
## 595 Deprived 0.302769573
## 596 Deprived 0.260184642
## 597 Deprived 0.659640496
## 598 Deprived 0.400238733
## 599 Satiated 0.000000000

## 
##  DHARMa nonparametric dispersion test via sd of residuals fitted
##  vs. simulated
## 
## data:  simulationOutput
## ratioObsSim = 0.742, p-value < 2.2e-16
## alternative hypothesis: two.sided

## 
##  DHARMa zero-inflation test via comparison to expected zeros with
##  simulation under H0 = fitted model
## 
## data:  simulationOutput
## ratioObsSim = 1.2488, p-value = 0.064
## alternative hypothesis: two.sided
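The two test outputs above are produced by DHARMa's test functions applied to a simulated-residuals object. A minimal sketch (assuming `fittedModel` is the zero-inflated glmmTMB model whose summary follows):

```r
library(DHARMa)

# Create scaled (quantile) residuals by simulating from the fitted model
simulationOutput <- simulateResiduals(fittedModel = fittedModel, n = 250)

# Nonparametric dispersion test: compares the sd of the observed residuals
# against the sd of residuals simulated under the fitted model
testDispersion(simulationOutput)

# Zero-inflation test: compares observed zeros to the zeros expected
# under H0 = fitted model
testZeroInflation(simulationOutput)
```

A ratioObsSim well below 1 in the dispersion test (as above) indicates that the observed data are underdispersed relative to the model's simulations.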
##  Family: nbinom1  ( log )
## Formula:          
## SiblingNegotiation ~ FoodTreatment * SexParent + offset(log(BroodSize)) +  
##     (1 | Nest)
## Zero inflation:                      ~FoodTreatment + SexParent
## Data: Owls
## 
##      AIC      BIC   logLik deviance df.resid 
##   3361.0   3400.6  -1671.5   3343.0      590 
## 
## Random effects:
## 
## Conditional model:
##  Groups Name        Variance Std.Dev.
##  Nest   (Intercept) 0.07114  0.2667  
## Number of obs: 599, groups:  Nest, 27
## 
## Overdispersion parameter for nbinom1 family (): 4.07 
## 
## Conditional model:
##                                     Estimate Std. Error z value Pr(>|z|)
## (Intercept)                          0.79147    0.09841   8.042 8.82e-16
## FoodTreatmentSatiated               -0.42028    0.14476  -2.903  0.00369
## SexParentMale                       -0.06593    0.09886  -0.667  0.50481
## FoodTreatmentSatiated:SexParentMale  0.11693    0.16948   0.690  0.49022
##                                        
## (Intercept)                         ***
## FoodTreatmentSatiated               ** 
## SexParentMale                          
## FoodTreatmentSatiated:SexParentMale    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Zero-inflation model:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)            -2.0325     0.3084  -6.590 4.40e-11 ***
## FoodTreatmentSatiated   1.5427     0.2998   5.146 2.66e-07 ***
## SexParentMale          -0.4902     0.2740  -1.789   0.0736 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
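The summary above corresponds to a glmmTMB fit along the following lines, reconstructed from the printed formula (a sketch; the exact call in the original source may differ):

```r
library(glmmTMB)

# Zero-inflated nbinom1 model for the Owls data (shipped with glmmTMB),
# matching the conditional and zero-inflation formulas printed above
fittedModel <- glmmTMB(
  SiblingNegotiation ~ FoodTreatment * SexParent +
    offset(log(BroodSize)) + (1 | Nest),
  ziformula = ~ FoodTreatment + SexParent,
  family = nbinom1,
  data = Owls
)
summary(fittedModel)
```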

##         pred         res
## 1   Deprived 0.291420934
## 2   Satiated 0.151218367
## 3   Deprived 0.175068016
##  ... (remaining 596 rows of the pred/res printout omitted)

## 
##  DHARMa nonparametric dispersion test via sd of residuals fitted
##  vs. simulated
## 
## data:  simulationOutput
## ratioObsSim = 0.84909, p-value = 0.024
## alternative hypothesis: two.sided

## 
##  DHARMa zero-inflation test via comparison to expected zeros with
##  simulation under H0 = fitted model
## 
## data:  simulationOutput
## ratioObsSim = 1.0389, p-value = 0.616
## alternative hypothesis: two.sided
##  Family: nbinom1  ( log )
## Formula:          
## SiblingNegotiation ~ FoodTreatment * SexParent + offset(log(BroodSize)) +  
##     (1 | Nest)
## Zero inflation:                      ~FoodTreatment + SexParent
## Dispersion:                          ~FoodTreatment
## Data: Owls
## 
##      AIC      BIC   logLik deviance df.resid 
##   3353.0   3397.0  -1666.5   3333.0      589 
## 
## Random effects:
## 
## Conditional model:
##  Groups Name        Variance Std.Dev.
##  Nest   (Intercept) 0.08695  0.2949  
## Number of obs: 599, groups:  Nest, 27
## 
## Conditional model:
##                                     Estimate Std. Error z value Pr(>|z|)
## (Intercept)                          0.79825    0.09511   8.393  < 2e-16
## FoodTreatmentSatiated               -0.47113    0.16647  -2.830  0.00465
## SexParentMale                       -0.08524    0.09024  -0.945  0.34484
## FoodTreatmentSatiated:SexParentMale  0.12765    0.18960   0.673  0.50079
##                                        
## (Intercept)                         ***
## FoodTreatmentSatiated               ** 
## SexParentMale                          
## FoodTreatmentSatiated:SexParentMale    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Zero-inflation model:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)            -1.8392     0.2912  -6.317 2.67e-10 ***
## FoodTreatmentSatiated   1.0184     0.4131   2.465   0.0137 *  
## SexParentMale          -0.5722     0.3319  -1.724   0.0847 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Dispersion model:
##                       Estimate Std. Error z value Pr(>|z|)    
## (Intercept)             1.1061     0.1460   7.578  3.5e-14 ***
## FoodTreatmentSatiated   0.8267     0.2714   3.046  0.00232 ** 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
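The `Dispersion: ~FoodTreatment` line in the summary above corresponds to extending the previous model with a dispersion submodel via `dispformula` (a sketch under the same assumptions as before):

```r
library(glmmTMB)

# Same zero-inflated nbinom1 model as before, now additionally modeling
# the dispersion as a function of FoodTreatment
fittedModel <- glmmTMB(
  SiblingNegotiation ~ FoodTreatment * SexParent +
    offset(log(BroodSize)) + (1 | Nest),
  ziformula = ~ FoodTreatment + SexParent,
  dispformula = ~ FoodTreatment,
  family = nbinom1,
  data = Owls
)
```

With the dispersion submodel included, the residual tests above move much closer to their expectations (ratioObsSim = 0.85 instead of 0.74, and a non-significant zero-inflation test).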

Binomial 0/1 data

There are a lot of rumors about what can and cannot be checked with binomial 0/1 data. Let’s consider a clearly misspecified binomial model:

One rumor that is true: unlike with k/n or count data, such a misspecification will not show up as overdispersion.

However, the misfit becomes clearly visible if you plot the residuals against the predicted values:
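A minimal sketch of such a situation (an assumed setup for illustration, not the vignette's exact code): a 0/1 binomial GLM that omits a quadratic term, with DHARMa residuals plotted against the predictor.

```r
library(DHARMa)

set.seed(123)
n <- 500
x <- runif(n, -1, 1)
p <- plogis(2 * x^2 - 1)       # true relationship is quadratic
y <- rbinom(n, size = 1, prob = p)

# Misspecified model: only a linear effect of x
fittedModel <- glm(y ~ x, family = binomial)

res <- simulateResiduals(fittedModel)
# Plot scaled residuals against the predictor; depending on the DHARMa
# version, the quantile regression lines in this plot reveal the misfit
plotResiduals(res, form = x)
```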

##              pred         res
## 1    0.6455242676 0.455828971
## 2   -0.8634708272 0.113206561
## 3   -0.6437908188 0.377663567
##  ... (remaining rows of the pred/res printout omitted)
## 299  0.3488319376 0.614820458
## 300  0.2592362603 0.067603837
## 301  0.2365881167 0.971128604
## 302 -0.2695430461 0.932994517
## 303  0.8233310962 0.891012520
## 304 -0.0290525104 0.337424487
## 305 -0.9970019702 0.131004502
## 306  0.8283548760 0.582343335
## 307 -0.6803442915 0.299186127
## 308 -0.3365118536 0.672863407
## 309  0.5128823011 0.970024740
## 310  0.3546337332 0.633079706
## 311  0.9772755383 0.646605487
## 312 -0.1412347099 0.474686485
## 313 -0.5968847875 0.057105596
## 314  0.1003852421 0.490345819
## 315  0.8757053050 0.726717566
## 316 -0.8841708214 0.103610740
## 317  0.4538287483 0.447603959
## 318 -0.0763973761 0.557796205
## 319  0.8315512058 0.660643811
## 320 -0.0387900174 0.670534152
## 321 -0.1738258419 0.334559061
## 322 -0.2501723892 0.304400519
## 323 -0.9459872246 0.317307447
## 324  0.8324981527 0.741271114
## 325  0.7772721183 0.581321542
## 326 -0.2129363981 0.312599317
## 327  0.3855148419 0.854325799
## 328  0.8544808137 0.928307057
## 329  0.7685386832 0.553545650
## 330  0.8027487369 0.806213875
## 331 -0.2064202693 0.095185659
## 332  0.7018460920 0.767235310
## 333 -0.1494214227 0.073058323
## 334 -0.7197828447 0.031966176
## 335  0.7214639378 0.715966332
## 336  0.9429451618 0.390005805
## 337 -0.6566206268 0.422870248
## 338 -0.2898704787 0.308595771
## 339  0.9009870570 0.982668916
## 340  0.0419881735 0.150573054
## 341  0.2040642397 0.000000000
## 342 -0.3710212489 0.327074187
## 343  0.9950782596 0.611838404
## 344 -0.2877450553 0.213387605
## 345  0.8086697948 0.256368628
## 346 -0.1796246124 0.235152554
## 347 -0.9739065934 0.367622900
## 348 -0.9013128630 0.221830646
## 349  0.1478842776 0.177057437
## 350  0.8969291463 0.491797225
## 351 -0.0688442118 0.260630758
## 352 -0.4535448160 0.402640952
## 353  0.0083827293 0.098067453
## 354  0.6529187537 0.875000433
## 355 -0.0237580943 0.016789653
## 356  0.7846361892 0.671327513
## 357  0.4597278018 0.259499517
## 358  0.4364669095 0.808078975
## 359 -0.1161067053 0.282795836
## 360  0.9424859262 0.643895848
## 361  0.1841332731 0.361161785
## 362 -0.8021212909 0.243075825
## 363 -0.2413893566 0.144082575
## 364 -0.3757357253 0.311547086
## 365 -0.9375773254 0.270851562
## 366 -0.2514445214 0.031877138
## 367 -0.3447429161 0.366651477
## 368 -0.1367153917 0.245869374
## 369 -0.4430400985 0.215689494
## 370  0.1304284395 0.053467704
## 371  0.2690206445 0.514151225
## 372  0.0340441554 0.030421706
## 373 -0.7696465794 0.098526949
## 374 -0.2192046265 0.276569893
## 375 -0.2477389551 0.208093090
## 376 -0.7946000337 0.081244580
## 377 -0.4791901824 0.339263422
## 378  0.2075434192 0.133491073
## 379  0.2198478170 0.305902516
## 380 -0.5016394174 0.056490609
## 381 -0.6459366805 0.050374424
## 382 -0.7725382908 0.393750768
## 383  0.5011488106 0.622412041
## 384  0.1207978493 0.899369902
## 385  0.2646914748 0.544632697
## 386 -0.1701018931 0.507859971
## 387 -0.9353955272 0.764396280
## 388  0.3238700344 0.934703326
## 389  0.6346072191 0.632994202
## 390 -0.7170120669 0.259994381
## 391 -0.0281893061 0.770691627
## 392  0.3263039743 0.555213290
## 393  0.6098396508 0.883560988
## 394 -0.4612765866 0.101010147
## 395  0.6025492894 0.683485194
## 396  0.2908759499 0.858146921
## 397  0.8128983602 0.655973790
## 398  0.8013101672 0.511248193
## 399 -0.2374115093 0.000000000
## 400  0.4559888639 0.673931218
## 401 -0.5730695720 0.468932425
## 402  0.3795503741 0.286054775
## 403 -0.9872198035 0.213686705
## 404  0.7464193250 0.641209545
## 405  0.7731138198 0.692567502
## 406 -0.0574731571 0.751086393
## 407 -0.6624595150 0.217386433
## 408 -0.4799540946 0.104202945
## 409 -0.7177845738 0.355694627
## 410 -0.7219509673 0.163069077
## 411 -0.6376303569 0.305054051
## 412  0.1032523992 0.392561098
## 413 -0.2935051317 0.093578513
## 414  0.7624154608 0.420844626
## 415  0.5023445995 0.103551669
## 416 -0.9705314916 0.401303656
## 417  0.7905192296 0.875797209
## 418  0.7154801097 0.516868607
## 419 -0.5886369636 0.397408900
## 420  0.6760459682 0.399730993
## 421  0.9609807087 0.765685509
## 422  0.3964235727 0.619589842
## 423  0.4521315643 0.772223022
## 424  0.3502948531 0.557150801
## 425 -0.1488703447 0.971091233
## 426  0.5439703907 0.768215102
## 427  0.4106427897 0.591547398
## 428  0.8203508537 0.430871132
## 429  0.2246386679 0.449141379
## 430 -0.0345456712 0.982264141
## 431 -0.0317177116 0.881044834
## 432  0.0930346064 0.574319097
## 433 -0.4726542835 0.747654108
## 434 -0.9009059714 0.305609112
## 435  0.2342062197 0.571942275
## 436  0.9710454894 0.490990307
## 437 -0.7853083168 0.815600977
## 438 -0.2928920877 0.431420106
## 439 -0.0563388695 0.505022840
## 440 -0.6677194554 0.189374307
## 441  0.8893409027 0.745829287
## 442 -0.1061508898 0.824302839
## 443 -0.3822832382 0.228433541
## 444  0.7000630293 0.646981372
## 445 -0.4654685813 0.285280890
## 446  0.8287252388 0.968519849
## 447 -0.5722691515 0.107561682
## 448  0.1763187130 0.515851499
## 449  0.6183981560 0.608597886
## 450  0.1062716623 0.519864538
## 451 -0.5548874494 0.364258457
## 452 -0.3000295893 0.978616949
## 453  0.6698308527 0.890754825
## 454 -0.1987980814 0.026693815
## 455 -0.0776811144 0.768211373
## 456 -0.8830897068 0.075458751
## 457  0.5404790267 0.846916762
## 458 -0.2550089774 0.160229013
## 459 -0.7418670086 0.065006870
## 460  0.5165311606 0.584238723
## 461 -0.4922575983 0.275289699
## 462  0.7202370432 0.637564927
## 463 -0.7199831526 0.012756611
## 464 -0.3564302414 0.876451124
## 465 -0.9687955654 0.110036364
## 466 -0.3069710433 0.027930074
## 467 -0.3718943195 0.360756770
## 468 -0.7425230541 0.286275248
## 469 -0.0383140356 0.204906877
## 470  0.6264637443 0.384932148
## 471  0.4123244798 0.604953341
## 472  0.3097740975 0.915195830
## 473 -0.4863727810 0.055451025
## 474 -0.1655715876 0.688086970
## 475 -0.8488741489 0.024929108
## 476  0.2671352546 0.986955586
## 477  0.8208167669 0.938072808
## 478 -0.0471392893 0.964854208
## 479  0.3825218687 0.458067214
## 480 -0.6700512734 0.348538009
## 481  0.3444600673 0.741506761
## 482  0.1078079278 0.457807402
## 483 -0.0804121252 0.211325453
## 484  0.4078871310 0.683114972
## 485  0.4917473714 0.806974351
## 486 -0.1045578416 0.397746288
## 487 -0.2729354743 0.167649881
## 488 -0.0617587869 0.801013546
## 489 -0.2534796232 0.298534463
## 490  0.2040501004 0.880295230
## 491  0.5666730134 0.683356723
## 492 -0.6125771133 0.114677083
## 493 -0.7535629007 0.291774810
## 494  0.3944535223 0.328064004
## 495  0.5756658008 0.919594094
## 496  0.1090662307 0.631499053
## 497 -0.0344261145 0.593455313
## 498  0.7037284449 0.405684058
## 499  0.9910177384 0.787470544
## 500 -0.2396854507 0.280576584

Moreover, if you recalculate the DHARMa residuals per group, the overdispersion created by the random-effect variance becomes visible (left: without grouping; right: with grouping).

## 
##  DHARMa nonparametric dispersion test via sd of residuals fitted
##  vs. simulated
## 
## data:  simulationOutput
## ratioObsSim = 1.0013, p-value = 1
## alternative hypothesis: two.sided

## 
##  DHARMa nonparametric dispersion test via sd of residuals fitted
##  vs. simulated
## 
## data:  simulationOutput
## ratioObsSim = 2.5084, p-value < 2.2e-16
## alternative hypothesis: two.sided
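The comparison above can be reproduced along the following lines. This is a sketch: the model object `fittedModel` and the grouping variable `group` are assumptions standing in for whatever mixed model and random-effect grouping factor are used in the example.

```r
library(DHARMa)

# Simulate scaled residuals from the fitted model
# (fittedModel is assumed to be, e.g., a Poisson glmer with a random intercept)
simulationOutput <- simulateResiduals(fittedModel = fittedModel)

# Dispersion test on the raw, per-observation residuals:
# the random effect absorbs the extra variance, so no
# overdispersion is detected (ratioObsSim close to 1)
testDispersion(simulationOutput)

# Recalculate residuals aggregated per group (group is assumed to be
# the factor used for the random effect), then test again:
# the dispersion created by the random-effect variance now shows up
simulationOutput2 <- recalculateResiduals(simulationOutput, group = group)
testDispersion(simulationOutput2)
```

`recalculateResiduals` aggregates observed and simulated values within each level of `group` before recomputing the quantile residuals, which is why variance contributed by the random effect becomes detectable at that level.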