R/Lrnr_lstm_keras.R
Lrnr_lstm_keras.Rd
This learner supports the long short-term memory (LSTM) recurrent neural
network algorithm, implemented via the keras package. Note that all
preprocessing, such as differencing and adjustment for seasonal effects,
should be addressed before using this learner. Desired lags of the time
series should be added as predictors before using the learner.
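To illustrate the note above, here is a minimal sketch of adding lagged outcome values as predictor columns before constructing an sl3 task. It assumes the data.table package; the table dat and the lag column names cnt_lag1 and cnt_lag2 are illustrative, not part of this learner's API.

library(data.table)

# construct a toy series and add its first two lags as predictor columns
# (dat and the lag column names are illustrative)
dat <- data.table(cnt = as.numeric(1:100))
dat[, cnt_lag1 := shift(cnt, n = 1, type = "lag")]
dat[, cnt_lag2 := shift(cnt, n = 2, type = "lag")]
# drop the initial rows where lags are missing
dat <- na.omit(dat)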
A learner object inheriting from Lrnr_base with methods for training and
prediction. For a full list of learner functionality, see the complete
documentation of Lrnr_base.
batch_size
: Number of samples per gradient update.
units
: Positive integer, dimensionality of the output space.
dropout
: Float between 0 and 1. Fraction of the input units to
drop.
recurrent_dropout
: Float between 0 and 1. Fraction of the units
to drop for the linear transformation of the recurrent state.
activation
: Activation function to use. If you pass NULL, no
activation is applied (i.e., "linear" activation: a(x) = x).
recurrent_activation
: Activation function to use for the
recurrent step.
recurrent_out
: Activation function to use for the output step.
epochs
: Number of epochs to train the model.
lr
: Learning rate.
layers
: Number of LSTM layers; only 1 or 2 are supported.
callbacks
: List of callbacks, i.e., functions to be applied at given
stages of the training procedure. The default callback,
callback_early_stopping, stops training if the validation loss does not
improve for patience consecutive epochs.
...
: Other parameters passed to keras (see the instantiation sketch after
this list).
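As referenced in the arguments above, here is a hedged sketch of instantiating the learner with non-default settings. The argument names come from this page; the specific values are illustrative, not recommendations.

library(sl3)

# illustrative settings only; the learner's defaults may be preferable
lstm_lrnr <- Lrnr_lstm_keras$new(
  batch_size = 32,
  units = 64,
  dropout = 0.2,
  recurrent_dropout = 0.2,
  epochs = 100,
  lr = 0.001,
  layers = 2,
  callbacks = list(keras::callback_early_stopping(patience = 10))
)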
Other Learners:
Custom_chain, Lrnr_HarmonicReg, Lrnr_arima, Lrnr_bartMachine, Lrnr_base,
Lrnr_bayesglm, Lrnr_bilstm, Lrnr_caret, Lrnr_cv_selector, Lrnr_cv,
Lrnr_dbarts, Lrnr_define_interactions, Lrnr_density_discretize,
Lrnr_density_hse, Lrnr_density_semiparametric, Lrnr_earth, Lrnr_expSmooth,
Lrnr_gam, Lrnr_ga, Lrnr_gbm, Lrnr_glm_fast, Lrnr_glm_semiparametric,
Lrnr_glmnet, Lrnr_glmtree, Lrnr_glm, Lrnr_grfcate, Lrnr_grf,
Lrnr_gru_keras, Lrnr_gts, Lrnr_h2o_grid, Lrnr_hal9001, Lrnr_haldensify,
Lrnr_hts, Lrnr_independent_binomial, Lrnr_lightgbm, Lrnr_mean,
Lrnr_multiple_ts, Lrnr_multivariate, Lrnr_nnet, Lrnr_nnls, Lrnr_optim,
Lrnr_pca, Lrnr_pkg_SuperLearner, Lrnr_polspline, Lrnr_pooled_hazards,
Lrnr_randomForest, Lrnr_ranger, Lrnr_revere_task, Lrnr_rpart, Lrnr_rugarch,
Lrnr_screener_augment, Lrnr_screener_coefs, Lrnr_screener_correlation,
Lrnr_screener_importance, Lrnr_sl, Lrnr_solnp_density, Lrnr_solnp,
Lrnr_stratified, Lrnr_subset_covariates, Lrnr_svm, Lrnr_tsDyn,
Lrnr_ts_weights, Lrnr_xgboost, Pipeline, Stack, define_h2o_X(),
undocumented_learner
if (FALSE) {
library(origami)
library(sl3)
data(bsds)
# make folds appropriate for time-series cross-validation
folds <- make_folds(bsds,
  fold_fun = folds_rolling_window, window_size = 500,
  validation_size = 100, gap = 0, batch = 50
)
# build task by passing in external folds structure
task <- sl3_Task$new(
  data = bsds,
  folds = folds,
  covariates = c("weekday", "temp"),
  outcome = "cnt"
)
# create tasks for training and validation (simplified example)
train_task <- training(task, fold = task$folds[[1]])
valid_task <- validation(task, fold = task$folds[[1]])
# instantiate learner, then fit and predict (simplified example)
lstm_lrnr <- Lrnr_lstm_keras$new(batch_size = 1, epochs = 200)
lstm_fit <- lstm_lrnr$train(train_task)
lstm_preds <- lstm_fit$predict(valid_task)
}
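Continuing the example above, a hedged follow-up: assuming the fit ran and lstm_preds is a numeric vector, the predictions can be compared against the observed validation outcomes (task$Y is the outcome accessor of an sl3 task).

# mean squared error of the LSTM predictions on the validation fold
mse <- mean((lstm_preds - valid_task$Y)^2)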