This learner supports prediction using grouped time-series modeling via the hts package. Fitting is done with hts and prediction is performed via forecast.gts.
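
For context, the sketch below shows the underlying hts-package workflow that this learner wraps; the grouping matrix and forecasting settings are illustrative choices, not defaults of the learner.

library(hts)
# four bottom-level series as a multivariate time series
bts <- ts(5 + matrix(sort(rnorm(200)), ncol = 4, nrow = 50))
# two grouping factors, each with two levels, cross-classifying the series
groups <- rbind(c(1, 1, 2, 2), c(1, 2, 1, 2))
abc_gts <- gts(bts, groups = groups) # build the grouped time series
abc_fc <- forecast(abc_gts, h = 12, fmethod = "ets") # dispatches to forecast.gts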

Format

R6Class object.

Value

Learner object with methods for training and prediction. See Lrnr_base for documentation on learners.

Parameters

method

Method for distributing forecasts within hierarchy. See details of forecast.gts.

weights

Weights used for "optimal combination" method: weights="ols" uses an unweighted combination (as described in Hyndman et al 2011); weights="wls" uses weights based on forecast variances (as described in Hyndman et al 2015); weights="mint" uses a full covariance estimate to determine the weights (as described in Hyndman et al 2016); weights="nseries" uses weights based on the number of series aggregated at each node.

fmethod

Forecasting method to use for each series.

algorithms

An algorithm to be used for computing the combination forecasts (when method=="comb"). The combination forecasts are based on an ill-conditioned regression model. "lu" indicates LU decomposition is used; "cg" indicates a conjugate gradient method; "chol" corresponds to a Cholesky decomposition; "recursive" indicates the recursive hierarchical algorithm of Hyndman et al (2015); "slm" uses sparse linear regression. Note that algorithms = "recursive" and algorithms = "slm" cannot be used if weights="mint".

covariance

Type of the covariance matrix to be used with weights="mint": either a shrinkage estimator ("shr") with shrinkage towards the diagonal; or a sample covariance matrix ("sam").

keep.fitted

If TRUE, keep fitted values at the bottom level.

keep.resid

If TRUE, keep residuals at the bottom level.

positive

If TRUE, forecasts are forced to be strictly positive (by setting lambda=0).

lambda

Box-Cox transformation parameter.

level

Level used for "middle-out" method (only used when method = "mo").

parallel

If TRUE, use the parallel package to allow parallel processing.

num.cores

If parallel = TRUE, the number of cores to use.
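
These parameters are passed through to the underlying hts/forecast.gts calls when the learner is trained and used for prediction. A minimal sketch of constructing the learner with a few of them set explicitly (the values are illustrative only):

gts_lrnr <- Lrnr_gts$new(
  method = "comb", # optimal combination of forecasts
  weights = "wls", # weight by forecast variances
  fmethod = "ets", # exponential smoothing for each series
  keep.fitted = TRUE
)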

Examples

# Example adapted from the hts package manual
# The grouping structure cross-classifies the 4 bottom-level series by two
# grouping factors, each with two levels (see the grps matrix below).
library(hts)
#> Loading required package: forecast

set.seed(3274)
abc <- as.data.table(5 + matrix(sort(rnorm(200)), ncol = 4, nrow = 50))
setnames(abc, paste("Series", 1:ncol(abc), sep = "_"))
abc[, time := .I]
#>     Series_1 Series_2 Series_3 Series_4 time
#>  1: 2.607712 4.453525 5.059647 5.674443    1
#>  2: 2.770780 4.468715 5.081054 5.686761    2
#>  3: 2.782223 4.482786 5.102684 5.707513    3
#>  4: 2.787655 4.483375 5.115142 5.739496    4
#>  5: 2.932073 4.489276 5.136250 5.748407    5
#>  6: 3.064115 4.527663 5.141674 5.789793    6
#>  7: 3.123294 4.549361 5.144565 5.794407    7
#>  8: 3.249828 4.549487 5.158862 5.799323    8
#>  9: 3.370915 4.558805 5.161763 5.799465    9
#> 10: 3.379109 4.560854 5.198417 5.838728   10
#> 11: 3.383535 4.562303 5.199713 5.861504   11
#> 12: 3.420078 4.562310 5.212496 5.865641   12
#> 13: 3.425742 4.571433 5.215093 5.869555   13
#> 14: 3.495619 4.589284 5.216852 5.909086   14
#> 15: 3.497682 4.594580 5.217866 5.947755   15
#> 16: 3.502086 4.617152 5.228778 5.963670   16
#> 17: 3.507000 4.620320 5.240832 5.994254   17
#> 18: 3.539549 4.629534 5.257752 6.012084   18
#> 19: 3.558871 4.638255 5.304387 6.021723   19
#> 20: 3.654411 4.656876 5.304651 6.073125   20
#> 21: 3.691043 4.657565 5.306511 6.082792   21
#> 22: 3.708930 4.662122 5.367066 6.118491   22
#> 23: 3.710579 4.665984 5.381140 6.133808   23
#> 24: 3.719483 4.696833 5.388976 6.144857   24
#> 25: 3.734375 4.698856 5.390283 6.166144   25
#> 26: 3.743854 4.704998 5.397072 6.198817   26
#> 27: 3.881388 4.714980 5.458463 6.228678   27
#> 28: 3.900640 4.735690 5.460558 6.248769   28
#> 29: 3.908360 4.739398 5.464411 6.251769   29
#> 30: 3.952202 4.808120 5.468979 6.289440   30
#> 31: 3.970051 4.826650 5.496115 6.302676   31
#> 32: 4.055186 4.831577 5.513275 6.379657   32
#> 33: 4.096847 4.867235 5.519347 6.418440   33
#> 34: 4.108913 4.870614 5.523854 6.418850   34
#> 35: 4.109187 4.886651 5.525574 6.431287   35
#> 36: 4.138922 4.888021 5.546059 6.502432   36
#> 37: 4.145324 4.897311 5.548523 6.558697   37
#> 38: 4.149141 4.898081 5.556770 6.559662   38
#> 39: 4.179869 4.901912 5.563149 6.581637   39
#> 40: 4.250431 4.905990 5.566242 6.612510   40
#> 41: 4.306163 4.920350 5.576517 6.632175   41
#> 42: 4.315984 4.931794 5.576623 6.674475   42
#> 43: 4.343608 4.935432 5.588910 6.728559   43
#> 44: 4.347261 4.945405 5.622140 6.796547   44
#> 45: 4.357179 4.965257 5.657682 6.822929   45
#> 46: 4.359392 4.982516 5.659091 6.955815   46
#> 47: 4.368390 4.990881 5.659724 7.041288   47
#> 48: 4.394952 5.026946 5.662659 7.406428   48
#> 49: 4.398161 5.031637 5.669859 7.551534   49
#> 50: 4.412912 5.043044 5.671561 7.819792   50
#>     Series_1 Series_2 Series_3 Series_4 time
grps <- rbind(c(1, 1, 2, 2), c(1, 2, 1, 2))
horizon <- 12
suppressWarnings(abc_long <- melt(abc, id = "time", variable.name = "series"))

# create sl3 task (no outcome for hierarchical/grouped series)
# create sl3 tasks: the series value is the outcome, indexed by time and series id
node_list <- list(outcome = "value", time = "time", id = "series")
train_task <- sl3_Task$new(data = abc_long, nodes = node_list)
test_data <- expand.grid(time = 51:55, series = unique(abc_long$series))
test_data <- as.data.table(test_data)[, value := 0]
test_task <- sl3_Task$new(data = test_data, nodes = node_list)

gts_learner <- Lrnr_gts$new()
gts_learner_fit <- gts_learner$train(train_task)
gts_learner_preds <- gts_learner_fit$predict(test_task)
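
# Assuming the predictions come back as a numeric vector aligned with the rows
# of test_data (illustrative post-processing, not part of the learner's API),
# they can be inspected alongside the test grid:
test_data[, pred := gts_learner_preds]
head(test_data)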