Activations

enum Activations : Enum<Activations>

Neural network hyperparameter: the activation function of a node defines the output of that node given an input or set of inputs.
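For intuition, a minimal plain-Kotlin sketch (illustrative only, not this library's implementation) of what that means: the chosen activation is applied to the weighted sum of a node's inputs plus a bias, and an entry of this enum is typically supplied as a layer's activation hyperparameter.

import kotlin.math.max

// Illustrative node: weighted sum of inputs plus bias, passed through an activation.
// Here the activation is ReLU (the Relu entry below); swapping it changes the node's output.
fun relu(x: Double): Double = max(x, 0.0)

fun nodeOutput(inputs: DoubleArray, weights: DoubleArray, bias: Double): Double {
    var z = bias
    for (i in inputs.indices) z += inputs[i] * weights[i]
    return relu(z)
}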

Entries

Swish
Swish activation function.
HardSigmoid
Hard sigmoid activation function.
SoftSign
Softsign activation function.
SoftPlus
Softplus activation function.
Exponential
Exponential activation function.
LogSoftmax
Log softmax activation function.
Softmax
Softmax converts a real vector to a vector of categorical probabilities (see the sketch after this list).
Selu
Scaled Exponential Linear Unit (SELU).
Elu
Exponential Linear Unit.
Relu6
Computes Rectified Linear 6:
min(max(features, 0), 6)
Calls Relu6Activation under the hood.
Relu
Rectified linear unit (ReLU).
Tanh
Hyperbolic tangent activation function.
Sigmoid
Sigmoid activation function.
Linear
Linear activation function (identity).
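As a rough illustration of a few of the entries above, the sketch below uses the standard textbook formulas in plain Kotlin; it is not this library's exact implementation, and formulations such as HardSigmoid vary between frameworks. Softmax is the only one shown that operates on a whole vector rather than elementwise.

import kotlin.math.abs
import kotlin.math.exp
import kotlin.math.ln
import kotlin.math.max
import kotlin.math.min

// Elementwise activations (standard definitions, for illustration).
fun sigmoid(x: Double): Double = 1.0 / (1.0 + exp(-x))
fun swish(x: Double): Double = x * sigmoid(x)            // Swish: x * sigmoid(x)
fun softplus(x: Double): Double = ln(1.0 + exp(x))       // SoftPlus: ln(1 + e^x)
fun softsign(x: Double): Double = x / (1.0 + abs(x))     // SoftSign: x / (1 + |x|)
fun relu6(x: Double): Double = min(max(x, 0.0), 6.0)     // Relu6: min(max(x, 0), 6)

// Softmax maps a real vector to a probability vector: exponentiate each element
// and normalise so the outputs are non-negative and sum to 1.
fun softmax(xs: DoubleArray): DoubleArray {
    val m = xs.maxOrNull() ?: 0.0                 // subtract max for numerical stability
    val exps = DoubleArray(xs.size) { exp(xs[it] - m) }
    val sum = exps.sum()
    return DoubleArray(exps.size) { exps[it] / sum }
}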

Types

Companion
object Companion

Properties

name
val name: String
ordinal
val ordinal: Int