
xgboost_impl

Wrapper for parsnip::xgb_train


Description

Wrapper for parsnip::xgb_train

Usage

xgboost_impl(
  x,
  y,
  max_depth = 6,
  nrounds = 15,
  eta = 0.3,
  colsample_bytree = 1,
  min_child_weight = 1,
  gamma = 0,
  subsample = 1,
  validation = 0,
  early_stop = NULL,
  ...
)

Arguments

x

A data frame or matrix of predictors

y

A vector (factor or numeric) or matrix (numeric) of outcome data.

max_depth

An integer for the maximum depth of the tree.

nrounds

An integer for the number of boosting iterations.

eta

A numeric value between zero and one to control the learning rate.

colsample_bytree

Subsampling proportion of columns.

min_child_weight

A numeric value for the minimum sum of instance weights needed in a child to continue to split.

gamma

A number for the minimum loss reduction required to make a further partition on a leaf node of the tree.

subsample

Subsampling proportion of rows.

validation

A positive number. If on [0, 1), the value is the random proportion of the data in x and y that is used for performance assessment and potential early stopping. If 1 or greater, it is the number of training set samples used for these purposes.

early_stop

An integer or NULL. If not NULL, it is the number of training iterations without improvement before stopping. If validation is used, performance is based on the validation set; otherwise the training set is used.

...

Other options to pass to xgboost::xgb.train().
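

Examples

A minimal usage sketch, not taken from the package documentation. It assumes xgboost_impl() can be reached through the modeltime namespace (modeltime:::) and uses the built-in mtcars data purely as placeholder predictors and outcome; since the function wraps parsnip::xgb_train, any numeric matrix x and matching numeric or factor y should work.

library(modeltime)

# Illustrative predictors and numeric outcome (regression)
x <- as.matrix(mtcars[, c("cyl", "disp", "hp", "wt")])
y <- mtcars$mpg

fit <- modeltime:::xgboost_impl(
  x = x,
  y = y,
  max_depth  = 4,     # shallower trees than the default of 6
  nrounds    = 50,    # more boosting iterations than the default of 15
  eta        = 0.1,   # slower learning rate than the default of 0.3
  validation = 0.2,   # hold out 20% of rows for performance assessment
  early_stop = 5      # stop after 5 iterations without improvement
)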


modeltime

The Tidymodels Extension for Time Series Modeling

Version: 0.5.1
License: MIT + file LICENSE
Authors: Matt Dancho [aut, cre], Business Science [cph]
