
nn_leaky_relu

LeakyReLU module


Description

Applies the element-wise LeakyReLU function (see the formulas under Details).

Usage

nn_leaky_relu(negative_slope = 0.01, inplace = FALSE)

Arguments

negative_slope

Controls the angle of the negative slope. Default: 1e-2

inplace

Can optionally do the operation in-place. Default: FALSE
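
A minimal sketch (assuming torch is installed; the sample tensor is only illustrative) of how the two arguments behave: a larger negative_slope passes more of the negative input through, and inplace = TRUE overwrites the input tensor instead of allocating a new one.

if (torch_is_installed()) {
  x <- torch_tensor(c(-2, -0.5, 0, 1.5))
  # default slope: negative inputs are scaled by 0.01
  nn_leaky_relu()(x)
  # steeper slope: negative inputs are scaled by 0.3
  nn_leaky_relu(negative_slope = 0.3)(x)
  # in-place: after the call, x itself holds the activated values
  m <- nn_leaky_relu(inplace = TRUE)
  m(x)
  x
}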

Details

\mbox{LeakyReLU}(x) = \max(0, x) + \mbox{negative\_slope} \times \min(0, x)

or

\mbox{LeakyReLU}(x) = \left\{ \begin{array}{ll} x, & \mbox{ if } x \geq 0 \\ \mbox{negative\_slope} \times x, & \mbox{ otherwise } \end{array} \right.
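
As a quick sketch (assuming torch is installed), the formula can be checked by computing max(0, x) + negative_slope * min(0, x) directly with torch_clamp and comparing it with the module's output:

if (torch_is_installed()) {
  slope <- 0.01
  x <- torch_randn(5)
  # formula applied directly
  torch_clamp(x, min = 0) + slope * torch_clamp(x, max = 0)
  # same values from the module
  nn_leaky_relu(negative_slope = slope)(x)
}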

Shape

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input
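
A brief sketch (again assuming torch is installed) confirming that the output keeps the input's shape, here for a 2 x 3 tensor:

if (torch_is_installed()) {
  m <- nn_leaky_relu()
  x <- torch_randn(2, 3)
  dim(x)     # 2 3
  dim(m(x))  # 2 3, same shape as the input
}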

Examples

if (torch_is_installed()) {
  # LeakyReLU module with a negative slope of 0.1
  m <- nn_leaky_relu(0.1)
  input <- torch_randn(2)
  output <- m(input)
  output
}

