SoftmaxActivation
Internal class wrapping the TensorFlow operand tf.nn.softmax.
For each batch i and class j we have

softmax[i, j] = exp(logits[i, j]) / sum_k(exp(logits[i, k]))
See also
org.jetbrains.kotlinx.dl.api.core.activation.Activations.Softmax
for an explanation.
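The formula above can be sketched in plain Kotlin. This is a minimal, hypothetical illustration of the math only; the actual class delegates to TensorFlow's tf.nn.softmax op rather than computing values on the JVM.

```kotlin
import kotlin.math.exp

// Illustrative softmax over one row of logits (one batch entry).
// Subtracting the row maximum before exponentiating is a standard
// numerical-stability trick and does not change the result.
fun softmax(logits: DoubleArray): DoubleArray {
    val max = logits.max()
    val exps = logits.map { exp(it - max) }
    val sum = exps.sum()
    return exps.map { it / sum }.toDoubleArray()
}

fun main() {
    val probs = softmax(doubleArrayOf(1.0, 2.0, 3.0))
    println(probs.joinToString()) // entries in (0, 1), summing to 1
}
```

In the library itself you would not call this helper; you would pass Activations.Softmax when configuring a layer, and this internal class applies the corresponding TensorFlow operand.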
Constructors
SoftmaxActivation
fun SoftmaxActivation()