Log Loss/Binary Cross Entropy
Calculates the weighted log loss (binary cross entropy), which equals half of the unit Bernoulli deviance. The smaller, the better.
Usage

logLoss(actual, predicted, w = NULL, ...)
Arguments

actual
Observed values (0 or 1).

predicted
Predicted probabilities, strictly larger than 0 and smaller than 1.

w
Optional case weights.

...
Further arguments passed to …
Value

A numeric vector of length one.
Examples

logLoss(c(0, 0, 1, 1), c(0.1, 0.1, 0.9, 0.8))
logLoss(c(1, 0, 0, 1), c(0.1, 0.1, 0.9, 0.8))
logLoss(c(0, 0, 1, 1), c(0.1, 0.1, 0.9, 0.8), w = 1:4)
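To make the computation concrete, here is a minimal Python sketch of a weighted log loss with the same interface as above. This is an illustration of the documented formula (weighted mean of -[y log p + (1 - y) log(1 - p)]), not the package's actual implementation:

```python
import math

def log_loss(actual, predicted, w=None):
    # Weighted binary log loss: weighted mean of the per-observation
    # losses -(y * log(p) + (1 - y) * log(1 - p)).
    if w is None:
        w = [1.0] * len(actual)
    total = sum(
        wi * -(yi * math.log(pi) + (1 - yi) * math.log(1 - pi))
        for yi, pi, wi in zip(actual, predicted, w)
    )
    return total / sum(w)
```

For the first example above, `log_loss([0, 0, 1, 1], [0.1, 0.1, 0.9, 0.8])` gives roughly 0.1348; passing weights `[1, 2, 3, 4]` shifts the result toward the loss of the heavily weighted fourth observation.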