Because the number of correlation parameters grows quadratically with the number of random slopes, adding just one additional slope (whether for a main effect or an interaction) can greatly increase the number of free parameters in the model. We can omit the correlations from the model by using || in lme4, by splitting elements into separate (x + ... | grp) terms in lme4 or MixedModels.jl, or by using zerocorr in MixedModels.jl.
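To make the quadratic growth concrete: k correlated random-effects terms require k variances plus k(k-1)/2 correlations. A small sketch of the count (the helper function here is ours, for illustration; it is not part of lme4 or MixedModels.jl):

```python
def n_covariance_params(k):
    """Free (co)variance parameters for k fully correlated
    random-effects terms: k variances + k*(k-1)/2 correlations."""
    return k + k * (k - 1) // 2

# intercept + 1 slope -> 3 parameters; intercept + 5 slopes -> 21
for k in (2, 3, 6):
    print(k, n_covariance_params(k))
# With the correlations omitted (|| or zerocorr), only the
# k variances remain, so the count grows linearly instead.
```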
Philosophically, this is a bit like omitting higher-order interactions from the fixed effects: it changes the bias-variance tradeoff. In practice, however, the tradeoff is often worthwhile, although forcing the correlations to zero makes shrinkage less efficient. John Kruschke has a nice worked example on his blog.
For our sleepstudy example, we can see that there is very little impact, because there is almost no correlation between the random intercept and random slope.
using CairoMakie
using MixedModels
using MixedModelsMakie

sleepstudy = MixedModels.dataset(:sleepstudy)

# REML=false by default in Julia
m2 = fit(MixedModel, @formula(reaction ~ 1 + days + (1 + days | subj)), sleepstudy)
                 Est.      SE      z       p  σ_subj
(Intercept)  251.4051  6.6323  37.91  <1e-99  23.7805
days          10.4673  1.5022   6.97  <1e-11   5.7168
Residual                                      25.5918
We can see this with a shrinkage plot, which shows the by-group (here: by-subject) offsets from the grand mean for each random effect. The red dots correspond to the estimates you would get from classical linear regression within each subject, while the blue dots correspond to the shrunken “estimates” (technically predictions) you get for each subject from the mixed model.
shrinkageplot(m2)
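The pull visible in the shrinkage plot follows the classical partial-pooling logic: each subject's offset from the grand mean is scaled toward zero, with more shrinkage when the between-subject variance is small relative to the residual variance per observation. A minimal sketch of that weighting for a random intercept (the function and all the numbers below are hypothetical, chosen for illustration; they are not taken from the sleepstudy fit):

```python
def shrunken_offset(subject_mean, grand_mean, n_obs, var_between, var_resid):
    """BLUP-style shrinkage for a random intercept: the subject's raw
    offset is scaled by lambda = var_b / (var_b + var_e / n)."""
    lam = var_between / (var_between + var_resid / n_obs)
    return lam * (subject_mean - grand_mean)

# Hypothetical subject 30 ms above the grand mean, 10 observations,
# between-subject variance 400, residual variance 600:
print(round(shrunken_offset(280.0, 250.0, 10, 400.0, 600.0), 2))
# the raw 30 ms offset is shrunk toward 0 (here to about 26 ms)
```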
MixedModels.PCA(m2)[:subj]
Principal components based on correlation matrix
(Intercept) 1.0 .
days 0.08 1.0
Normalized cumulative variances:
[0.5407, 1.0]
Component loadings
PC1 PC2
(Intercept) -0.71 -0.71
days -0.71 0.71
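The PCA numbers can be reproduced by hand: a 2×2 correlation matrix [[1, r], [r, 1]] has eigenvalues 1 + r and 1 − r, with eigenvectors (±1/√2, ±1/√2), which is where the ±0.71 loadings come from. A quick check with the displayed (rounded) r = 0.08, which gives 0.54, close to the reported 0.5407:

```python
r = 0.08  # rounded intercept/slope correlation from the fit above
eigenvalues = [1 + r, 1 - r]  # eigenvalues of [[1, r], [r, 1]]
total = sum(eigenvalues)
cumulative = [round(eigenvalues[0] / total, 4), 1.0]
print(cumulative)  # close to the "Normalized cumulative variances" line
loading = round(1 / 2 ** 0.5, 2)
print(loading)     # magnitude of each component loading, 0.71
```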
m2_zerocorr = fit(MixedModel, @formula(reaction ~ 1 + days + zerocorr(1 + days | subj)), sleepstudy)
                 Est.      SE      z       p  σ_subj
(Intercept)  251.4051  6.7077  37.48  <1e-99  24.1714
days          10.4673  1.5193   6.89  <1e-11   5.7994
Residual                                      25.5561
shrinkageplot(m2_zerocorr)
If we consider a more complex model, then the change can be much more dramatic:
objective: 29652.62866326455
The effective dimensionality can be seen in the way that the random effects collapse onto lines (i.e., a 1-D object) within the majority of the panels (each representing a 2-D plane).
When we force the correlations to be zero, we can no longer get diagonal lines: we can only get horizontal or vertical lines within each panel. Diagonal lines correspond to nonzero correlations between two variance components.
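The collapse onto lines has a direct linear-algebra reading: a perfect ±1 correlation makes the corresponding 2×2 correlation block [[1, r], [r, 1]] rank 1 (eigenvalues 1 + r and 1 − r, so one of them is 0), and all the variation lives in a single principal component. A small sketch contrasting a weak correlation with the degenerate boundary case (the r = 1.0 value is hypothetical, chosen to show the collapse):

```python
def normalized_cumulative_variances(r):
    """Normalized cumulative variances of the 2x2 correlation
    matrix [[1, r], [r, 1]], whose eigenvalues are 1+r and 1-r."""
    eigs = sorted([1 + r, 1 - r], reverse=True)
    total = sum(eigs)
    acc, out = 0.0, []
    for e in eigs:
        acc += e
        out.append(round(acc / total, 4))
    return out

print(normalized_cumulative_variances(0.08))  # genuinely 2-D spread
print(normalized_cumulative_variances(1.0))   # [1.0, 1.0]: effectively 1-D
```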