SoftmaxActivation

class SoftmaxActivation : Activation

Internal class wrapping the TensorFlow operand

tf.nn.softmax

For each batch i and class j we have

softmax[i, j] = exp(logits[i, j]) / sum_k(exp(logits[i, k]))
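
To make the formula concrete, here is a minimal, self-contained Kotlin sketch of softmax over a single batch row. The softmax helper and the max-subtraction stability trick are illustrative only and not part of this class:

import kotlin.math.exp

// Minimal sketch of the formula above for a single batch row i.
fun softmax(logits: DoubleArray): DoubleArray {
    val max = logits.maxOrNull()!!           // subtract the max for numerical stability
    val exps = logits.map { exp(it - max) }  // exp(logits[i, j])
    val sum = exps.sum()                     // sum_k(exp(logits[i, k]))
    return exps.map { it / sum }.toDoubleArray()
}

fun main() {
    println(softmax(doubleArrayOf(1.0, 2.0, 3.0)).joinToString())
    // prints approximately 0.0900, 0.2447, 0.6652 (sums to 1)
}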

Constructors

SoftmaxActivation
fun SoftmaxActivation()

Functions

apply
open override fun apply(tf: Ops, features: Operand<Float>): Operand<Float>

Applies the activation function to the input features to produce the output.

open fun apply(tf: Ops, features: Operand<Float>, name: String = ""): Operand<Float>

Applies the activation function to the input features to produce the output. The optional name parameter labels the resulting operation in the graph.
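
A hedged usage sketch follows. It assumes the TensorFlow Java API that the library builds on (Graph, Ops, tf.constant) and instantiates the class directly; in normal use the activation is selected through the public activation API rather than constructed by hand.

import org.tensorflow.Graph
import org.tensorflow.op.Ops

fun main() {
    Graph().use { graph ->
        val tf = Ops.create(graph)                                 // op builder for the graph
        val logits = tf.constant(arrayOf(floatArrayOf(1f, 2f, 3f)))
        val probabilities = SoftmaxActivation().apply(tf, logits)  // Operand<Float> node
        // When the graph is executed, each row of probabilities sums to 1.
    }
}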