Activation

interface Activation

Basic interface for all activation functions.

Functions

apply
abstract fun apply(tf: Ops, features: Operand<Float>): Operand<Float>
open fun apply(tf: Ops, features: Operand<Float>, name: String = ""): Operand<Float>

Applies the activation function to the input features to produce the output.
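
A custom activation can be defined by implementing this interface. The following is a minimal sketch, not part of the library: the class ScaledReluActivation and its scale parameter are illustrative, and it assumes the TensorFlow Java Ops calls tf.nn.relu, tf.math.mul, and tf.constant provided by the framework's TensorFlow dependency.

import org.tensorflow.Operand
import org.tensorflow.op.Ops

class ScaledReluActivation(private val scale: Float = 2.0f) : Activation {
    // Scale the input features by a constant factor, then apply ReLU.
    override fun apply(tf: Ops, features: Operand<Float>): Operand<Float> =
        tf.nn.relu(tf.math.mul(features, tf.constant(scale)))
}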

Inheritors

LinearActivation
SigmoidActivation
ReluActivation
Relu6Activation
TanhActivation
TanhShrinkActivation
EluActivation
SeluActivation
SoftmaxActivation
LogSoftmaxActivation
ExponentialActivation
SoftPlusActivation
SoftSignActivation
HardSigmoidActivation
SwishActivation
MishActivation
HardShrinkActivation
SoftShrinkActivation
LishtActivation
SnakeActivation
GeluActivation
SparsemaxActivation