Interactions and Moderation

A brief and evolving tutorial on interactions and moderation

Author

Andy Grogan-Kaylor

Published

December 1, 2023

1 Introduction

I’ve heard somewhere (I unfortunately forget exactly where) that a statistical model is nothing more than a principled statement about the data that can be tested against the data. I think of moderation in that context.

Moderation is a kind of “statistical story” about different groups in the data.

As we build up to the idea of moderation, we are telling various “stories” about the data, and testing whether they are accurate.

2 Setup

2.1 The Equation That Generates The Data

Important

In practice, we usually don’t know this equation. We are trying to estimate it.

\[y_i = \beta_0 + \beta_1 x_i + \beta_2 \text{group}_i + \beta_3 (x_i \times \text{group}_i) + e_i\]

Numerically…

\[y_i = 10 + 1 x_i + 30 \, \text{group}_i - 0.25 (x_i \times \text{group}_i) + e_i\]

2.2 Simulated Data
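
The rendered document does not show the simulation code, so here is a minimal sketch of how data like these could be generated in R. The sample size of 1,000 matches the regression output below; the distributions of x, group, and the error term are assumptions, as are the object names (d, x, group, e, y).

```r
library(tidyverse)

set.seed(1234)   # arbitrary seed, for reproducibility

N <- 1000   # number of observations, matching the output below

d <- tibble(
  x = rnorm(N, 100, 10),       # continuous predictor (assumed distribution)
  group = rbinom(N, 1, 0.5),   # binary group indicator, 0 or 1 (assumed)
  e = rnorm(N, 0, 1),          # random error (assumed standard normal)
  y = 10 + 1 * x + 30 * group - 0.25 * (x * group) + e   # the generating equation
)
```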

2.3 Graph The Data
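
A possible way to graph the simulated data, assuming the illustrative data frame d sketched above, is to color the points by group:

```r
library(ggplot2)   # already loaded with tidyverse, but shown for completeness

# scatterplot of y against x, with points colored by group
ggplot(d, aes(x = x, y = y, color = factor(group))) +
  geom_point(alpha = 0.5) +
  labs(title = "Simulated Data",
       color = "group")
```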

3 First Model: x Only

3.1 The Story

Our first story is that the regression coefficients are the same for everyone in the data set, regardless of the group that they are in. Everyone has the same intercept \(\beta_0\) and the same regression slope \(\beta_1\).

y Is A Function Of …

y is a function of an intercept, x, and some error.

3.2 The Equation

\[y_i = \beta_0 + \beta_1 x_i + e_i\]

3.3 The Regression
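
Output in the format shown below could be produced by something like the following, assuming the illustrative data frame d from Section 2.2; pander is one package that formats regression tables this way.

```r
library(pander)

# first model: y as a function of x only
fit1 <- lm(y ~ x, data = d)

pander(summary(fit1))
```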

              Estimate   Std. Error   t value   Pr(>|t|)
(Intercept)   24.57      0.9564       25.69     3.75e-112
x             0.8803     0.009542     92.26     0

Fitting linear model: y ~ x

Observations   Residual Std. Error   \(R^2\)   Adjusted \(R^2\)
1000           3.008                 0.8951    0.8949

3.4 Graph The Regression
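
One way to draw the single fitted line that this story implies, again assuming the illustrative data frame d:

```r
# one regression line for everyone, ignoring group
ggplot(d, aes(x = x, y = y)) +
  geom_point(aes(color = factor(group)), alpha = 0.5) +
  geom_smooth(method = "lm", se = FALSE, color = "black") +
  labs(title = "Same Intercept, Same Slope",
       color = "group")
```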

4 Second Model: x and group

4.1 The Story

Our second story is that one group has a different intercept (\(\beta_0\)) from the other group, but both groups have the same regression slope (\(\beta_1\)).

y Is A Function Of …

y is a function of an intercept, x, group membership, and some error.

4.2 The Equation

\[y_i = \beta_0 + \beta_1 x_i + \beta_2 \text{group}_i + e_i\]

4.3 The Regression
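
Assuming the illustrative data frame d, the second model simply adds group to the regression formula:

```r
# second model: y as a function of x and group
fit2 <- lm(y ~ x + group, data = d)

pander(summary(fit2))
```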

              Estimate   Std. Error   t value   Pr(>|t|)
(Intercept)   21.92      0.5248       41.78     2.673e-221
x             0.8808     0.005207     169.2     0
group         5.039      0.1039       48.52     1.029e-264

Fitting linear model: y ~ x + group

Observations   Residual Std. Error   \(R^2\)   Adjusted \(R^2\)
1000           1.641                 0.9688    0.9687

4.4 Graph The Regression
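
Because this model has no interaction term, the fitted lines for the two groups are parallel. One way to draw them, assuming fit2 and the illustrative data frame d, is to plot the model's predicted values:

```r
# parallel fitted lines: different intercepts, same slope
d$yhat2 <- predict(fit2)   # predicted values from the second model

ggplot(d, aes(x = x, color = factor(group))) +
  geom_point(aes(y = y), alpha = 0.5) +
  geom_line(aes(y = yhat2)) +
  labs(title = "Different Intercepts, Same Slope",
       color = "group")
```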

5 Third Model: x and group and interaction of x and group

5.1 The Story

Our third and final story (and this is where interactions and moderation come in) is that each group has its own regression intercept, and each group now also has its own slope. One group has a slope of \(\beta_1\). The other group has a slope of \(\beta_1 + \beta_3\), where \(\beta_3\) indicates how the regression slope for the second group differs from the regression slope for the first group.

y Is A Function Of …

y is a function of an intercept, x, group membership, group membership multiplied by x, and some error.

5.2 The Equation

\[y_i = \beta_0 + \beta_1 x_i + \beta_2 \text{group}_i + \beta_3 (x_i \times \text{group}_i) + e_i\]

5.3 The Regression
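
Assuming the illustrative data frame d, the interaction can be requested with * in the regression formula:

```r
# third model: y as a function of x, group, and their interaction
fit3 <- lm(y ~ x * group, data = d)   # equivalent to y ~ x + group + x:group

pander(summary(fit3))
```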

              Estimate   Std. Error   t value   Pr(>|t|)
(Intercept)   9.648      0.4567       21.13     3.783e-82
x             1.004      0.004554     220.5     0
group         30.54      0.6579       46.42     2.44e-251
x:group       -0.2557    0.006563     -38.96    2.015e-202

Fitting linear model: y ~ x * group

Observations   Residual Std. Error   \(R^2\)   Adjusted \(R^2\)
1000           1.034                 0.9876    0.9876

5.4 Graph The Regression
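
With the interaction in the model, each group gets its own fitted line. A simple way to draw them, assuming the illustrative data frame d, is to let ggplot fit a line within each group:

```r
# separate fitted lines: each group has its own intercept and slope
ggplot(d, aes(x = x, y = y, color = factor(group))) +
  geom_point(alpha = 0.5) +
  geom_smooth(method = "lm", se = FALSE) +
  labs(title = "Different Intercepts, Different Slopes",
       color = "group")
```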

6 Summary of the Coefficients for Each Group

Group   Intercept               Slope
0       \(\beta_0\)             \(\beta_1\)
1       \(\beta_0 + \beta_2\)   \(\beta_1 + \beta_3\)

If we consider a group membership variable that has values 0 or 1, then effectively, the intercept for group 0 is \(\beta_0\) and the slope for group 0 is \(\beta_1\). For group 1, the intercept is \(\beta_0 + \beta_2\) while the slope is \(\beta_1 + \beta_3\).

\(\beta_2\) is thus the difference in intercepts between the two groups.

\(\beta_3\) is the difference in slopes between the two groups.
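
For example, plugging in the estimates from the third model, the intercept and slope for group 1 are approximately:

\[\text{intercept}_{\text{group} = 1} = \beta_0 + \beta_2 \approx 9.648 + 30.54 \approx 40.19\]

\[\text{slope}_{\text{group} = 1} = \beta_1 + \beta_3 \approx 1.004 - 0.2557 \approx 0.75\]

These are close to the values of \(10 + 30 = 40\) and \(1 - 0.25 = 0.75\) implied by the equation that generated the data in Section 2.1.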