
nn_soft_margin_loss

Soft margin loss


Description

Creates a criterion that optimizes a two-class classification logistic loss between input tensor x and target tensor y (containing 1 or -1).

Usage

nn_soft_margin_loss(reduction = "mean")

Arguments

reduction

(string, optional): Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction is applied; 'mean': the sum of the output is divided by the number of elements in the output; 'sum': the output is summed. Note: size_average and reduce are in the process of being deprecated; in the meantime, specifying either of those two args will override reduction. Default: 'mean'
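The three reduction modes can be illustrated in base R. This is a hypothetical sketch mirroring the semantics of the reduction argument, not the torch implementation; `elementwise` and `reduce_loss` are illustrative helpers, not part of the package:

```r
# Per-element soft margin losses, before any reduction
elementwise <- function(x, y) log1p(exp(-y * x))

# Mirror the semantics of reduction = 'none' | 'mean' | 'sum'
reduce_loss <- function(losses, reduction = "mean") {
  switch(reduction,
         none = losses,        # no reduction: one loss per element
         mean = mean(losses),  # sum divided by the number of elements (default)
         sum  = sum(losses))   # total over all elements
}

l <- elementwise(c(0.5, -1.2), c(1, -1))
reduce_loss(l, "none")  # length-2 vector, same shape as the input
reduce_loss(l, "mean")  # single number
reduce_loss(l, "sum")   # single number
```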

Details

\mbox{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i] \cdot x[i]))}{\mbox{x.nelement}()}
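As a concrete check of the formula, the mean-reduced loss can be computed directly in base R. This is a stand-alone sketch of the formula; `soft_margin_loss` is a hypothetical helper, not a function exported by torch:

```r
# Soft margin loss from the formula above:
# loss(x, y) = sum_i log(1 + exp(-y[i] * x[i])) / x.nelement()
soft_margin_loss <- function(x, y) {
  # log1p(exp(z)) is a numerically safer way to write log(1 + exp(z))
  mean(log1p(exp(-y * x)))
}

x <- c(0.5, -1.2, 2.0)  # raw scores
y <- c(1, -1, 1)        # targets in {-1, 1}
soft_margin_loss(x, y)  # ≈ 0.2881
```

A confidently correct prediction (large x with matching sign of y) drives its per-element term toward zero, while a confidently wrong one grows roughly linearly in |x|.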

Shape

  • Input: (*), where * means any number of dimensions

  • Target: (*), same shape as the input

  • Output: scalar. If reduction is 'none', then same shape as the input
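Putting the pieces together, a minimal usage sketch (assuming the torch package is installed; the shapes are chosen purely for illustration):

```r
library(torch)

# Instantiate the criterion; "mean" is the default reduction
loss_fn <- nn_soft_margin_loss(reduction = "mean")

input  <- torch_randn(3, 5, requires_grad = TRUE)  # raw scores, any shape
target <- torch_sign(torch_randn(3, 5))            # entries in {-1, 1}

loss <- loss_fn(input, target)  # scalar tensor under "mean" reduction
loss$backward()                 # gradients flow back into input
```

With reduction = "none" the output instead has the same shape as the input.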


torch: Tensors and Neural Networks with 'GPU' Acceleration
Version: v0.3.0
License: MIT + file LICENSE
Authors: Daniel Falbel [aut, cre, cph], Javier Luraschi [aut], Dmitriy Selivanov [ctb], Athos Damiani [ctb], Christophe Regouby [ctb], Krzysztof Joachimiak [ctb], RStudio [cph]
Initial release
