Since it’s not available on CRAN yet, you have to download and install the package directly from GitHub:
devtools::install_github("aschersleben/cointReg", build_vignettes = TRUE)
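This requires the devtools package; if it is not installed yet, you can get it from CRAN first:
install.packages("devtools")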
Load the package:
library("cointReg")
Generate a regression variable x and a dependent variable y. The fastest and easiest way to plot both time series is matplot(...).
set.seed(42)
x <- cumsum(rnorm(200, mean = 0, sd = 0.1)) + 10
y <- x + rnorm(200, sd = 0.4) + 2
matplot(1:200, cbind(y, x), type = "l", main = "Cointegration Model")
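matplot(...) doesn't draw a legend by itself. If you want to label the two series, a base-R legend() call is enough (colors and line types 1 and 2 match matplot's defaults for two series):
legend("topleft", legend = c("y", "x"), col = 1:2, lty = 1:2, bty = "n")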
Now you can estimate the model parameters with the FM-OLS method and include an intercept in the model via the deter variable:
deter <- rep(1, 200)
test <- cointRegFM(x = x, y = y, deter = deter)
Print the results:
print(test)
##
## ### FM-OLS model ###
##
## Model: y ~ deter + x
##
## Parameters: Kernel = "ba" // Bandwidth = 1.40497 ("Andrews")
##
## Coefficients:
## Estimate Std.Err t value Pr(|t|>0)
## deter 2.182468 0.576897 3.7831 0.0002051 ***
## x.coint 0.982432 0.057845 16.9839 < 2.2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
You can see that both the intercept and the regression variable are significant.
You can also plot the residuals:
plot(test, main = "Residuals of the Cointegration Model")
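If x and y are really cointegrated, the residuals of the fitted model should be stationary. As an informal sanity check you can run a Phillips-Perron unit-root test from base R on them. The sketch below assumes the fitted object stores its residuals under test$residuals (check names(test) if the component is named differently); note that residual-based cointegration tests formally require different critical values than a plain unit-root test.
res <- as.numeric(test$residuals)  # assumed component name -- see note above
PP.test(res)  # null hypothesis: unit root; a small p-value suggests stationary residuals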
The package also handles multiple regressors and additional deterministic terms such as a linear trend. Generate three integrated regressors and a dependent variable:
set.seed(1909)
x1 <- cumsum(rnorm(100, mean = 0.05, sd = 0.1))
x2 <- cumsum(rnorm(100, sd = 0.1)) + 1
x3 <- cumsum(rnorm(100, sd = 0.2)) + 2
x <- cbind(x1, x2, x3)
y <- x1 + x2 + x3 + rnorm(100, sd = 0.2) + 1
matplot(1:100, cbind(y, x), type = "l", main = "Cointegration Model")
This time, include both a level (intercept) and a linear trend in the deterministic part:
deter <- cbind(level = 1, trend = 1:100)
test <- cointRegFM(x, y, deter, kernel = "ba", bandwidth = "and")
print(test)
##
## ### FM-OLS model ###
##
## Model: y ~ deter + x
##
## Parameters: Kernel = "ba" // Bandwidth = 1.940012 ("Andrews")
##
## Coefficients:
## Estimate Std.Err t value Pr(|t|>0)
## level 1.1451921 0.1657569 6.9089 5.538e-10 ***
## trend -0.0065659 0.0077114 -0.8514 0.3967
## x1 1.0944603 0.1159726 9.4372 2.637e-15 ***
## x2 0.8679426 0.0857958 10.1164 < 2.2e-16 ***
## x3 0.9902175 0.0626775 15.7986 < 2.2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
plot(test, main = "Residuals of the Cointegration Model")
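The trend coefficient is not significant here (p ≈ 0.40), so you could also drop it and refit the model with only a level term. A minimal sketch that reuses the call pattern from above:
# same FM-OLS fit, but with an intercept only in the deterministic part
test2 <- cointRegFM(x, y, deter = rep(1, 100), kernel = "ba", bandwidth = "and")
print(test2)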
Finally, here is why you should use modified OLS methods instead of a plain OLS model to estimate the parameters of a cointegrating regression. Regressing one random walk on a completely independent one with lm() yields a seemingly significant relationship:
set.seed(26)
x <- cumsum(rnorm(200))
y <- cumsum(rnorm(200))
summary(lm(y ~ x))
##
## Call:
## lm(formula = y ~ x)
##
## Residuals:
## Min 1Q Median 3Q Max
## -10.7889 -3.3236 0.6175 2.8696 8.9689
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 8.7016 0.3899 22.32 < 2e-16 ***
## x -0.3811 0.0590 -6.46 7.94e-10 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 4.196 on 198 degrees of freedom
## Multiple R-squared: 0.1741, Adjusted R-squared: 0.1699
## F-statistic: 41.73 on 1 and 198 DF, p-value: 7.943e-10
The independent variable x seems to be highly significant.
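This is the classic spurious regression problem: regressing one random walk on an independent one produces "significant" t-statistics far more often than the nominal 5% level. A small base-R simulation sketch illustrates the effect (the number of replications, the sample size and the seed are arbitrary choices; x.sim and y.sim are used so the x and y from above stay untouched):
set.seed(123)
n <- 200
p.values <- replicate(500, {
  x.sim <- cumsum(rnorm(n))
  y.sim <- cumsum(rnorm(n))
  summary(lm(y.sim ~ x.sim))$coefficients["x.sim", "Pr(>|t|)"]
})
# share of replications in which the non-existent effect of x looks significant
mean(p.values < 0.05)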
And now have a look at the results of an FM-OLS regression:
cointRegFM(x = x, y = y, deter = rep(1, 200))
##
## ### FM-OLS model ###
##
## Model: y ~ rep(1, 200) + x
##
## Parameters: Kernel = "ba" // Bandwidth = 51.01288 ("Andrews")
##
## Coefficients:
## Estimate Std.Err t value Pr(|t|>0)
## deter 8.74037 1.50333 5.8140 2.412e-08 ***
## x.coint -0.27930 0.22696 -1.2306 0.2199
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
So the x variable doesn't have a significant influence on y, which makes sense because the two series were generated independently.
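If you additionally want a formal test for cointegration, one option is the Phillips-Ouliaris test from the tseries package (available on CRAN); it tests the null hypothesis of no cointegration. A sketch, assuming tseries is installed:
library("tseries")
# the first column of the input matrix is regressed on the remaining columns;
# the null hypothesis is "no cointegration"
po.test(cbind(y, x))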