ReLU
class ReLU(maxValue: Float?, negativeSlope: Float, threshold: Float, name: String) : AbstractActivationLayer
Rectified Linear Unit activation function.
With default values, it returns element-wise max(x, 0).
Otherwise, it follows:
f(x) = maxValue, if x >= maxValue
f(x) = x, if threshold <= x < maxValue
f(x) = negativeSlope * (x - threshold), if x < threshold
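The same rule, written as a plain Kotlin function (a minimal sketch for illustration; relu here is a hypothetical helper, not part of the library API):

fun relu(
    x: Float,
    maxValue: Float? = null,
    negativeSlope: Float = 0.0f,
    threshold: Float = 0.0f,
): Float = when {
    // Clamp at the upper bound when one is set.
    maxValue != null && x >= maxValue -> maxValue
    // Pass values in [threshold, maxValue) through unchanged.
    x >= threshold -> x
    // Scale values below the threshold by the negative slope.
    else -> negativeSlope * (x - threshold)
}

With this helper's defaults (maxValue = null, negativeSlope = 0f, threshold = 0f), the result reduces to max(x, 0).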
Since 0.2
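A hedged usage sketch follows; the Sequential, Input, Dense, and Activations import paths reflect recent KotlinDL releases and may differ between versions:

import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.api.core.layer.activation.ReLU
import org.jetbrains.kotlinx.dl.api.core.layer.core.Dense
import org.jetbrains.kotlinx.dl.api.core.layer.core.Input

// A ReLU capped at 6 (the common "ReLU6" variant) with a small
// slope for values below the threshold.
val model = Sequential.of(
    Input(784),
    Dense(128, activation = Activations.Linear),
    ReLU(maxValue = 6.0f, negativeSlope = 0.01f, threshold = 0.0f),
    Dense(10)
)

Pairing a linear Dense layer with a standalone ReLU layer keeps the activation configurable (cap, slope, threshold) in a way the built-in Activations.Relu enum value does not expose.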
Constructors
Functions
Properties
hasActivation
inboundLayers
negativeSlope
outboundLayers
outputShape
parentModel