LazyDeepMTS

See also: https://techtonique.github.io/nnetsauce/

Usage
LazyDeepMTS(
  verbose = 0,
  ignore_warnings = TRUE,
  custom_metric = NULL,
  predictions = FALSE,
  random_state = 42L,
  estimators = "all",
  preprocess = FALSE,
  show_progress = TRUE,
  n_layers = 3L,
  ...
)

Arguments

verbose: monitor progress (0, the default, is off; 1 is on)
ignore_warnings: print a trace when a model's fitting fails
custom_metric: a custom evaluation metric (default is NULL)
predictions: return predictions (default is FALSE)
random_state: reproducibility seed
estimators: regressors to be adjusted (default is 'all')
preprocess: preprocess input covariates (default is FALSE)
show_progress: display progress while models are being fitted (default is TRUE)
n_layers: number of layers for the deep model
...: additional parameters to be passed to nnetsauce::CustomRegressor
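The arguments above can be combined at construction time. The following is a minimal sketch, not taken from the package documentation: the custom metric's (y_true, y_pred) signature and the two-layer setting are assumptions made for illustration.

# hypothetical custom metric; assumed to take (y_true, y_pred) and return one number
mae <- function(y_true, y_pred) mean(abs(y_true - y_pred))

obj_custom <- LazyDeepMTS(
  verbose = 1,          # print progress information
  n_layers = 2L,        # shallower deep model than the default of 3 layers
  random_state = 123L,  # reproducibility seed
  custom_metric = mae   # also rank models with the custom metric
)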
Value

a list-like object on which you can call $fit (see the examples below)

Examples
library(nnetsauce)

set.seed(123)
# simulate a 3-variable multivariate series (100 observations)
X <- matrix(rnorm(300), 100, 3)
# sample 80% of the rows as training indices
(index_train <- base::sample.int(n = nrow(X),
                                 size = floor(0.8 * nrow(X)),
                                 replace = FALSE))
#> [1] 60 62 80 9 11 43 30 61 74 55 93 34 57 38 84 63 46 78 70 18 2 72 75 21 88
#> [26] 71 79 59 37 91 6 40 82 28 32 49 35 67 47 20 1 5 96 99 90 98 48 51 10 53
#> [51] 65 7 87 44 36 23 89 24 4 29 45 58 33 3 54 94 76 83 66 8 26 56 14 25 13
#> [76] 50 31 92 22 16
X_train <- data.frame(X[index_train, ])   # training set
X_test <- data.frame(X[-index_train, ])   # held-out test set
obj <- LazyDeepMTS()                      # default settings: 3 layers, all estimators
res <- obj$fit(X_train, X_test)           # fit and score every available model
print(res[[1]])
#> [1] 0.9576618 0.9707051 0.9709615 0.9748545 0.9791164 0.9907240 0.9907240
#> [8] 0.9907240 0.9907240 0.9966559 1.0095353 1.0110766 1.0142847 1.0348505
#> [15] 1.0648919 1.0672458 1.0735442 1.2412343 1.3172612 1.3286419 1.3619977
#> [22] 1.3765455 1.4272372 1.5753650 1.6729886 1.8409357 1.8473388 1.8900647
#> [29] 1.8900647
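A further sketch, not part of the original example: the predictions argument documented above can be switched on to also retrieve each model's forecasts. The exact structure of the returned object is not described here, so the sketch only inspects it with str(); show_progress = FALSE is an arbitrary choice.

# assumes X_train and X_test from the example above are still in scope
obj_pred <- LazyDeepMTS(predictions = TRUE, show_progress = FALSE)
res_pred <- obj_pred$fit(X_train, X_test)
str(res_pred, max.level = 1)  # inspect what the fitted call returns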