
nn_softplus

Softplus module


Description

Applies the element-wise function:

\mbox{Softplus}(x) = \frac{1}{\beta} \log(1 + \exp(\beta x))

Usage

nn_softplus(beta = 1, threshold = 20)

Arguments

beta

the β value for the Softplus formulation. Default: 1

threshold

values above this revert to a linear function. Default: 20

Details

SoftPlus is a smooth approximation of the ReLU function and can be used to constrain the output of a model to always be positive. For numerical stability the implementation reverts to the linear function when input × β > threshold.
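
As a rough illustration, the rule described above can be written directly in base R. This is a minimal sketch of the documented behaviour, not torch's actual implementation, and the helper name softplus_manual is invented for this example:

softplus_manual <- function(x, beta = 1, threshold = 20) {
  # revert to the identity where beta * x exceeds the threshold,
  # otherwise apply the smooth (1 / beta) * log(1 + exp(beta * x)) form
  ifelse(beta * x > threshold,
         x,
         (1 / beta) * log1p(exp(beta * x)))
}

softplus_manual(c(-2, 0, 2, 50))  # large inputs pass through unchanged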

Shape

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Examples

if (torch_is_installed()) {
  m <- nn_softplus()    # module with default beta = 1, threshold = 20
  input <- torch_randn(2)
  output <- m(input)    # same shape as the input, all values positive
}
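
The constructor arguments can be varied in the same way; for instance, a hedged usage sketch assuming torch is installed:

if (torch_is_installed()) {
  # sharper transition (beta = 2) and an earlier linear cutoff (threshold = 10)
  m2 <- nn_softplus(beta = 2, threshold = 10)
  m2(torch_tensor(c(-1, 0, 1)))
}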

torch

Tensors and Neural Networks with 'GPU' Acceleration

v0.3.0
MIT + file LICENSE
Authors
Daniel Falbel [aut, cre, cph], Javier Luraschi [aut], Dmitriy Selivanov [ctb], Athos Damiani [ctb], Christophe Regouby [ctb], Krzysztof Joachimiak [ctb], RStudio [cph]
