SELU module
Applied element-wise, as:

\mbox{SELU}(x) = \mbox{scale} * (\max(0, x) + \min(0, \alpha * (\exp(x) - 1)))

with \alpha = 1.6732632423543772848170429916717 and \mbox{scale} = 1.0507009873554804934193349852946.

Usage:

nn_selu(inplace = FALSE)

Arguments:

inplace: (bool, optional) can optionally do the operation in-place. Default: FALSE
More details can be found in the paper Self-Normalizing Neural Networks (Klambauer et al., 2017, https://arxiv.org/abs/1706.02515).
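As a quick sanity check, the formula above can be evaluated directly with tensor operations and compared against the module output. This is a minimal sketch (variable names are illustrative, not part of the API):

if (torch_is_installed()) {
  library(torch)
  alpha <- 1.6732632423543772848170429916717
  scale <- 1.0507009873554804934193349852946
  x <- torch_randn(5)
  # scale * (max(0, x) + min(0, alpha * (exp(x) - 1)))
  manual <- scale * (torch_clamp(x, min = 0) +
    torch_clamp(alpha * (torch_exp(x) - 1), max = 0))
  m <- nn_selu()
  torch_allclose(manual, m(x))  # should return TRUE
}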
Shape:

Input: (N, *) where * means any number of additional dimensions
Output: (N, *), same shape as the input
Examples:

if (torch_is_installed()) {
  m <- nn_selu()           # create the SELU module
  input <- torch_randn(2)  # random input tensor with 2 elements
  output <- m(input)       # apply SELU element-wise
}
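A sketch of the in-place variant, following the description of the inplace argument above: with inplace = TRUE, the module writes the result into its input tensor instead of allocating a new one.

if (torch_is_installed()) {
  m <- nn_selu(inplace = TRUE)
  input <- torch_randn(2)
  output <- m(input)
  # input has been overwritten in place and now matches output
}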