
nn_rrelu

RReLU module


Description

Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper:

Usage

nn_rrelu(lower = 1/8, upper = 1/3, inplace = FALSE)

Arguments

lower

lower bound of the uniform distribution. Default: \frac{1}{8}

upper

upper bound of the uniform distribution. Default: \frac{1}{3}

inplace

can optionally do the operation in-place. Default: FALSE

Details

Empirical Evaluation of Rectified Activations in Convolutional Network.

The function is defined as:

\mbox{RReLU}(x) = \left\{ \begin{array}{ll} x & \mbox{if } x \geq 0 \\ ax & \mbox{otherwise} \end{array} \right.

where a is randomly sampled from the uniform distribution \mathcal{U}(\mbox{lower}, \mbox{upper}). See: https://arxiv.org/pdf/1505.00853.pdf
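
The following is a minimal base R sketch of the definition above, for illustration only (it is not the torch implementation). The helper name rrelu_reference is hypothetical, and it assumes the slope a is drawn independently for each element, as in the referenced paper:

rrelu_reference <- function(x, lower = 1/8, upper = 1/3) {
  a <- runif(length(x), min = lower, max = upper)  # a ~ U(lower, upper), per element
  ifelse(x >= 0, x, a * x)                         # x if x >= 0, a * x otherwise
}

rrelu_reference(c(-2, -0.5, 0, 1.5))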

Shape

  • Input: (N, *) where * means any number of additional dimensions

  • Output: (N, *), same shape as the input
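
A quick sketch of this shape contract (assuming torch is installed; the dimensions chosen here are arbitrary):

if (torch_is_installed()) {
  library(torch)
  m <- nn_rrelu()
  x <- torch_randn(4, 3, 5)  # (N, *): batch of 4 with extra dims 3 x 5
  y <- m(x)
  y$shape                    # same shape as the input: 4 3 5
}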

Examples

if (torch_is_installed()) {
  m <- nn_rrelu(0.1, 0.3)  # lower = 0.1, upper = 0.3
  input <- torch_randn(2)  # random input tensor of length 2
  m(input)                 # negative entries are scaled by a randomly sampled slope
}

