RReLU module
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper Empirical Evaluation of Rectified Activations in Convolutional Network.
nn_rrelu(lower = 1/8, upper = 1/3, inplace = FALSE)
lower: lower bound of the uniform distribution. Default: \frac{1}{8}
upper: upper bound of the uniform distribution. Default: \frac{1}{3}
inplace: can optionally do the operation in-place. Default: FALSE
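A small sketch of the inplace flag, assuming it follows the usual torch convention of overwriting the input tensor rather than allocating a new one:

if (torch_is_installed()) {
  m <- nn_rrelu(inplace = TRUE)
  x <- torch_tensor(c(-1, 0, 1))
  m(x)  # negative entries are rescaled by the sampled slope
  x     # x itself now holds the result (overwritten in place)
}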
The function is defined as:
\mbox{RReLU}(x) = \left\{ \begin{array}{ll} x & \mbox{if } x \geq 0 \\ ax & \mbox{otherwise} \end{array} \right.
where a is randomly sampled from the uniform distribution \mathcal{U}(\mbox{lower}, \mbox{upper}). See: https://arxiv.org/pdf/1505.00853.pdf
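A minimal sketch of the sampling behavior, assuming the module mirrors its PyTorch counterpart, where training mode re-samples a on each call and evaluation mode uses the fixed slope (lower + upper) / 2:

if (torch_is_installed()) {
  m <- nn_rrelu(lower = 1/8, upper = 1/3)
  x <- torch_tensor(c(-1, -1, -1))
  m$train()  # training mode: a is re-sampled, so repeated calls differ
  m(x)
  m(x)
  m$eval()   # evaluation mode: a fixed slope is used, so calls agree
  m(x)
}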
Input: (N, *) where * means any number of additional dimensions
Output: (N, *), same shape as the input
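A quick check of the shape contract (tensor sizes here are just illustrative):

if (torch_is_installed()) {
  m <- nn_rrelu()
  input <- torch_randn(4, 3, 2)  # extra trailing dimensions are allowed
  output <- m(input)
  output$shape                   # 4 3 2: same shape as the input
}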
if (torch_is_installed()) {
  m <- nn_rrelu(0.1, 0.3)  # slope sampled from U(0.1, 0.3) for negative inputs
  input <- torch_randn(2)
  m(input)
}