Activations

enum Activations : Enum<Activations>

Neural network hyperparameter: the activation function of a node defines the output of that node given an input or a set of inputs.
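An activation is typically chosen per layer. Below is a minimal usage sketch, assuming the KotlinDL Sequential, Input, and Dense APIs:

import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.api.core.layer.core.Dense
import org.jetbrains.kotlinx.dl.api.core.layer.core.Input

// Each Dense layer takes an Activations entry as its
// activation hyperparameter.
val model = Sequential.of(
    Input(784),
    Dense(outputSize = 256, activation = Activations.Relu),
    Dense(outputSize = 10, activation = Activations.Softmax)
)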

Entries

Sparsemax

The Sparsemax activation function is similar to softmax but can output sparse probabilities.
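For intuition, here is an illustrative, self-contained Kotlin sketch of sparsemax (Martins and Astudillo, 2016) as a Euclidean projection onto the probability simplex; this sketches the math, not the library's implementation:

import kotlin.math.max

// Illustrative sparsemax: project the logits onto the probability
// simplex; entries below the threshold tau become exactly zero.
fun sparsemax(z: DoubleArray): DoubleArray {
    val sorted = z.sortedDescending()
    var cumSum = 0.0
    var tau = 0.0
    for (j in sorted.indices) {
        cumSum += sorted[j]
        val t = (cumSum - 1.0) / (j + 1)
        if (sorted[j] > t) tau = t
    }
    return DoubleArray(z.size) { max(z[it] - tau, 0.0) }
}

// Example: sparsemax of (1.5, 0.2, -0.3) is (1.0, 0.0, 0.0), a
// genuinely sparse distribution, unlike softmax.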

TanhShrink

TanhShrink activation function.

Snake

Snake activation function.

LiSHT

Non-Parametric Linearly Scaled Hyperbolic Tangent (LiSHT) activation function.

Gelu

Gaussian Error Linear Unit (GELU) activation function.

HardShrink

HardShrink activation function.

Mish

Mish activation function.

Swish

Swish activation function.

SoftShrink

Softshrink activation function.

HardSigmoid

Hard sigmoid activation function.

SoftSign

Softsign activation function.

SoftPlus

Softplus activation function.

Exponential

Exponential activation function.

LogSoftmax

Log softmax activation function.

Softmax

Softmax converts a real vector to a vector of categorical probabilities. The elements of the output vector are in range (0, 1) and sum to 1.
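As a quick check of that property, here is a self-contained Kotlin sketch of the softmax formula exp(x_i) / sum_j exp(x_j); illustrative only, not the library's implementation:

import kotlin.math.exp

// Plain softmax: exponentiate, then normalize. Subtracting the
// maximum first keeps exp() from overflowing on large logits.
fun softmax(logits: DoubleArray): DoubleArray {
    val m = logits.maxOrNull() ?: 0.0
    val exps = DoubleArray(logits.size) { exp(logits[it] - m) }
    val sum = exps.sum()
    return DoubleArray(logits.size) { exps[it] / sum }
}

// Example: softmax of (2.0, 1.0, 0.1) is roughly
// (0.659, 0.242, 0.099), which sums to 1.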

Selu

Scaled Exponential Linear Unit (SELU).

Elu

Exponential Linear Unit.

Relu6

Computes Rectified Linear 6: min(max(x, 0), 6).
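A one-line Kotlin sketch of that formula (illustrative only, not the library's kernel):

import kotlin.math.max
import kotlin.math.min

// ReLU6 clamps activations to the range [0, 6]; the cap helps
// fixed-point quantization in mobile networks.
fun relu6(x: Float): Float = min(max(x, 0f), 6f)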

Relu

Rectified linear unit (ReLU).

Tanh

Hyperbolic tangent activation function.

Sigmoid

Sigmoid activation function.

Linear

Linear unit. Returns the input unmodified (identity).

Types

Companion
object Companion

Properties

name
val name: String
ordinal
val ordinal: Int