
nn_relu

ReLU module


Description

Applies the rectified linear unit function element-wise:

\mbox{ReLU}(x) = (x)^+ = \max(0, x)
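As a quick illustration of the formula (plain base R, independent of torch), ReLU simply clamps negative values to zero and leaves non-negative values unchanged:

```r
# Base R sketch of ReLU (illustration only; not part of the torch package)
relu <- function(x) pmax(0, x)

relu(c(-2, -0.5, 0, 1.5, 3))
# non-positive inputs map to 0; positive inputs pass through unchanged
```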

Usage

nn_relu(inplace = FALSE)

Arguments

inplace

if TRUE, performs the operation in-place, modifying the input tensor directly instead of allocating a new one. Default: FALSE

Shape

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Examples

if (torch_is_installed()) {
  # create a ReLU module and apply it to a random tensor
  m <- nn_relu()
  input <- torch_randn(2)
  m(input)
}
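The `inplace` argument can be sketched as follows (assuming torch is installed). With `inplace = TRUE`, the module overwrites the input tensor's storage rather than returning a fresh tensor, which can save memory in large models:

```r
# Sketch: in-place ReLU mutates its input tensor (requires torch)
if (torch_is_installed()) {
  library(torch)
  m <- nn_relu(inplace = TRUE)
  x <- torch_randn(2)
  m(x)
  # x itself now holds the rectified values; no negatives remain in x
}
```

Use `inplace = TRUE` with care: it discards the original values of `x`, which can interfere with autograd if `x` is needed for a backward pass.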

torch

Tensors and Neural Networks with 'GPU' Acceleration

v0.3.0
MIT + file LICENSE
Authors
Daniel Falbel [aut, cre, cph], Javier Luraschi [aut], Dmitriy Selivanov [ctb], Athos Damiani [ctb], Christophe Regouby [ctb], Krzysztof Joachimiak [ctb], RStudio [cph]
Initial release
