
lgb.importance

Compute feature importance in a model


Description

Creates a data.table of feature importances in a model.

Usage

lgb.importance(model, percentage = TRUE)

Arguments

model

object of class lgb.Booster.

percentage

whether to show importance as a relative percentage (each measure scaled to sum to 1) rather than as raw values.

Value

For a tree model, a data.table with the following columns:

  • Feature: Feature names in the model.

  • Gain: The total gain of this feature's splits.

  • Cover: The number of observations related to this feature.

  • Frequency: The number of times this feature is used in splits across the trees.

Examples

data(agaricus.train, package = "lightgbm")
train <- agaricus.train
dtrain <- lgb.Dataset(train$data, label = train$label)

params <- list(
  objective = "binary"
  , learning_rate = 0.1
  , max_depth = -1L
  , min_data_in_leaf = 1L
  , min_sum_hessian_in_leaf = 1.0
)
model <- lgb.train(
    params = params
    , data = dtrain
    , nrounds = 5L
)

tree_imp1 <- lgb.importance(model, percentage = TRUE)
tree_imp2 <- lgb.importance(model, percentage = FALSE)
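As a quick sanity check on the relationship between the two settings, the percentage output should match the raw values normalized by their column total. This sketch continues the example above and assumes the importance tables are non-empty and returned in the same row order:

```r
# With percentage = TRUE each importance column is scaled to sum to 1,
# so normalizing the raw Gain values should reproduce the percentage output.
all.equal(
    tree_imp1$Gain
    , tree_imp2$Gain / sum(tree_imp2$Gain)
)
```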

lightgbm

Light Gradient Boosting Machine

v3.2.1
MIT + file LICENSE
Authors
Guolin Ke [aut, cre], Damien Soukhavong [aut], James Lamb [aut], Qi Meng [aut], Thomas Finley [aut], Taifeng Wang [aut], Wei Chen [aut], Weidong Ma [aut], Qiwei Ye [aut], Tie-Yan Liu [aut], Yachen Yan [ctb], Microsoft Corporation [cph], Dropbox, Inc. [cph], Jay Loden [cph], Dave Daeschler [cph], Giampaolo Rodola [cph], Alberto Ferreira [ctb], Daniel Lemire [ctb], Victor Zverovich [cph], IBM Corporation [ctb]
Initial release
2021-04-12
