
nn_softmax

Softmax module


Description

Applies the Softmax function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1. See the Details section for the definition of Softmax.

Usage

nn_softmax(dim)

Arguments

dim

(int): A dimension along which Softmax will be computed (so every slice along dim will sum to 1).
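The meaning of "every slice along dim sums to 1" can be sketched in base R without torch. The `softmax` helper below is a hypothetical stand-in, not a torch function; with a 2-by-3 matrix and torch's `dim = 1` (the first, 1-based dimension), each column is normalized independently:

```r
# Hypothetical base-R helper, not part of the torch package
softmax <- function(x) exp(x) / sum(exp(x))

m <- matrix(1:6, nrow = 2)   # a 2 x 3 matrix
p <- apply(m, 2, softmax)    # normalize along dimension 1, i.e. within each column
colSums(p)                   # every column (slice along dim 1) sums to 1
```

Choosing `dim = 2` instead would normalize within each row, so `rowSums` would be 1.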

Details

\mbox{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}

When the input Tensor is sparse, the unspecified values are treated as -Inf.
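The formula above can be transcribed directly in base R. The helper is hypothetical and only illustrates the definition; subtracting `max(x)` before exponentiating is a standard trick that leaves the result unchanged while avoiding overflow:

```r
# Direct transcription of Softmax(x_i) = exp(x_i) / sum_j exp(x_j);
# hypothetical helper, not part of the torch package
softmax <- function(x) {
  z <- exp(x - max(x))  # shift by max(x) for numerical stability; result is identical
  z / sum(z)
}

p <- softmax(c(1, 2, 3))
sum(p)    # sums to 1 (up to floating-point error)
range(p)  # all values lie in [0, 1]
```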

Value

A Tensor of the same dimension and shape as the input, with values in the range [0, 1].

Shape

  • Input: (*) where * means any number of additional dimensions

  • Output: (*), same shape as the input

Note

This module doesn't work directly with NLLLoss, which expects the log to be computed between the Softmax and itself. Use nn_log_softmax instead (it's faster and has better numerical properties).
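Why compute the log-softmax directly rather than `log()` of a softmax? A base-R sketch shows the numerical issue; both helpers below are hypothetical illustrations, not torch functions:

```r
# Naive approach: exp() overflows for large inputs, producing NaN
naive_log_softmax <- function(x) log(exp(x) / sum(exp(x)))

# Fused log-softmax via the log-sum-exp trick: stays finite
stable_log_softmax <- function(x) x - max(x) - log(sum(exp(x - max(x))))

x <- c(1000, 1001, 1002)
naive_log_softmax(x)   # NaN: exp(1000) overflows to Inf
stable_log_softmax(x)  # finite log-probabilities
```

This is the kind of fused computation nn_log_softmax performs, which is why it is preferred in front of a negative log-likelihood loss.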

Examples

if (torch_is_installed()) {
  # softmax over dimension 1: every slice along dim 1 sums to 1
  m <- nn_softmax(dim = 1)
  input <- torch_randn(2, 3)
  output <- m(input)
}

torch

Tensors and Neural Networks with 'GPU' Acceleration

v0.3.0
MIT + file LICENSE
Authors
Daniel Falbel [aut, cre, cph], Javier Luraschi [aut], Dmitriy Selivanov [ctb], Athos Damiani [ctb], Christophe Regouby [ctb], Krzysztof Joachimiak [ctb], RStudio [cph]
Initial release
