Defining A Task

Define a Machine Learning Task

Specify Variable Type

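A task pairs a dataset with the roles its columns play (covariates, outcome, and so on). As a minimal sketch of defining a task — assuming the `sl3_Task` class exported by sl3, with an illustrative toy dataset:

```r
library(sl3)

# Toy dataset: two covariates and a continuous outcome
data <- data.frame(
  x1 = rnorm(100),
  x2 = rnorm(100),
  y = rnorm(100)
)

# Define a machine learning task by assigning column roles
task <- sl3_Task$new(
  data = data,
  covariates = c("x1", "x2"),
  outcome = "y"
)
```

The task object, not the raw data frame, is what learners are trained on.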
Finding Learners

List sl3 Learners

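The learner catalog below can also be queried programmatically. A sketch, assuming `sl3_list_learners()`, which accepts a character vector of properties to filter on:

```r
library(sl3)

# All learners known to sl3
sl3_list_learners()

# Only learners that handle binary outcomes
sl3_list_learners("binomial")
```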
sl3 Learners

Harmonic Regression

Univariate ARIMA Models

bartMachine: Bayesian Additive Regression Trees (BART)

Base Class for all sl3 Learners

Bayesian Generalized Linear Models

Bidirectional Long Short-Term Memory Recurrent Neural Network (LSTM)

Bound Predictions

Caret (Classification and Regression) Training

Fit/Predict a Learner with Cross-Validation

Cross-Validated Selector

Discrete Bayesian Additive Regression Tree sampler

Define interaction terms

Density from Classification

Density Estimation With Mean Model and Homoscedastic Errors

Density Estimation With Mean Model and Homoscedastic Errors

Earth: Multivariate Adaptive Regression Splines

Exponential Smoothing State Space Model

Nonlinear Optimization via Genetic Algorithm (GA)

GAM: Generalized Additive Models

GBM: Generalized Boosted Regression Models

Generalized Linear Models

Computationally Efficient Generalized Linear Model (GLM) Fitting

Semiparametric Generalized Linear Models

GLMs with Elastic Net Regularization

Generalized Linear Model Trees

Generalized Random Forests Learner

Generalized Random Forests for Conditional Average Treatment Effects

Recurrent Neural Network with Gated Recurrent Unit (GRU) with Keras

Grouped Time-Series Forecasting

h2o Model Definition

Grid Search Models with h2o

Scalable Highly Adaptive Lasso (HAL)

Conditional Density Estimation with the Highly Adaptive LASSO

Hierarchical Time-Series Forecasting

Classification from Binomial Regression

LightGBM: Light Gradient Boosting Machine

Long Short-Term Memory Recurrent Neural Network (LSTM) with Keras

Fitting Intercept Models

Stratify univariable time-series learners by time-series

Multivariate Learner

Feed-Forward Neural Networks and Multinomial Log-Linear Models

Non-negative Linear Least Squares

Optimize Metalearner according to Loss Function using optim

Principal Component Analysis and Regression

Polyspline: multivariate adaptive polynomial spline regression (polymars) and polychotomous regression and multiple classification (polyclass)

Classification from Pooled Hazards

Random Forests

Ranger: Fast(er) Random Forests

Learner that chains into a revere task

Learner for Recursive Partitioning and Regression Trees

Univariate GARCH Models

Augmented Covariate Screener

Coefficient Magnitude Screener

Correlation Screening Procedures

Variable Importance Screener

The Super Learner Algorithm

Nonlinear Optimization via Augmented Lagrange

Nonlinear Optimization via Augmented Lagrange

Stratify learner fits by a single variable

Learner with Covariate Subsetting

Support Vector Machines

Nonlinear Time Series Analysis

Time-specific weighting of prediction losses

xgboost: eXtreme Gradient Boosting

Use SuperLearner Wrappers, Screeners, and Methods in sl3

Composing Learners

Pipeline (chain) of learners

Learner Stacking

Customize chaining for a learner

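Learners compose in two ways: a Stack fits several learners side by side on the same task, while a Pipeline chains one learner's output into the next. A sketch, assuming the `make_learner`, `Stack`, and `Pipeline` interfaces exported by sl3 (the screener class name is illustrative):

```r
library(sl3)

# Individual learners
lrnr_glm <- make_learner(Lrnr_glm)
lrnr_mean <- make_learner(Lrnr_mean)

# Side by side: both learners fit the same task
stack <- make_learner(Stack, lrnr_glm, lrnr_mean)

# Sequential: a screener selects covariates, then the GLM fits on them
screener <- make_learner(Lrnr_screener_correlation)
pipeline <- make_learner(Pipeline, screener, lrnr_glm)
```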
Loss functions

Loss Function Definitions

Risk Estimation

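Loss functions plug into risk estimation for fitted learners. A sketch combining the super learner with squared-error loss — assuming `Lrnr_sl`, `loss_squared_error`, and a `cv_risk` method on super learner fits, as in recent sl3 versions:

```r
library(sl3)

data <- data.frame(x1 = rnorm(200), x2 = rnorm(200), y = rnorm(200))
task <- sl3_Task$new(data = data, covariates = c("x1", "x2"), outcome = "y")

# A small super learner over two candidate learners
sl <- make_learner(Lrnr_sl,
  learners = list(make_learner(Lrnr_glm), make_learner(Lrnr_mean))
)
fit <- sl$train(task)

# Cross-validated risk of each candidate under squared-error loss
risks <- fit$cv_risk(loss_squared_error)
```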
Metalearner functions

Combine predictions from multiple learners

Helpful for Defining Learners

Generate a file containing a template

Get all arguments of parent call (both specified and defaults) as list

Call with filtered argument list

Estimate object size using serialization

dim that works for vectors too

Learner helpers

Sample Datasets

Subset of growth data from the collaborative perinatal project (CPP)

Subset of growth data from the collaborative perinatal project (CPP)

Bicycle sharing time series dataset

Simulated data with continuous exposure

Miscellaneous

Querying/setting a single

Index

Container Class for data.table Shared Between Tasks

Subset Tasks for CV. These functions use origami folds to subset tasks; they are used by Lrnr_cv (and therefore by other learners that rely on Lrnr_cv). So that nested CV works properly, the subsetted task objects currently do not carry fold structures of their own, and instead generate them from defaults when nested CV is requested. |
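In practice this subsetting happens behind the scenes when a learner is wrapped in `Lrnr_cv`. A sketch, assuming `Lrnr_cv` and a task built with `sl3_Task$new` (data and column names are illustrative):

```r
library(sl3)

data <- data.frame(x = rnorm(100), y = rnorm(100))
task <- sl3_Task$new(data = data, covariates = "x", outcome = "y")

# Cross-validate a GLM: Lrnr_cv subsets the task by origami folds,
# fitting on training splits and predicting on validation splits
cv_glm <- make_learner(Lrnr_cv, make_learner(Lrnr_glm))
cv_fit <- cv_glm$train(task)
preds <- cv_fit$predict()
```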

Cross-validated Risk Estimation

Cross-validated Super Learner

Helper functions to debug sl3 Learners

Automatically Defined Metalearner

Convert Factors to indicators

Importance: extract variable importance measures

Variable Importance Plot

Inverse CDF Sampling

Make a stack of sl3 learners

Pack multidimensional predictions into a vector (and unpack again)

Generate A Pooled Hazards Task from a Failure Time (or Categorical) Task

Predict Class from Predicted Probabilities

Plot predicted and true values for diagnostic purposes

Process Data

Factory risk function for ROCR performance measures with binary outcomes

Revere (SplitSpecific) Task

Make folds work on subset of data

Undocumented Learner