ThresholdedReLU
class ThresholdedReLU(theta: Float, name: String) : AbstractActivationLayer
Thresholded Rectified Linear Unit.
It follows:
f(x) = x, if x > theta
f(x) = 0, otherwise
Since 0.3
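The element-wise rule above can be sketched as a plain Kotlin function. Note that `thresholdedReLU` below is a hypothetical standalone helper written for illustration, not part of the KotlinDL API; the library applies the same rule inside the layer's forward pass.

```kotlin
// Hypothetical helper illustrating the thresholded ReLU rule:
// f(x) = x if x > theta, else 0. Not part of the KotlinDL API.
fun thresholdedReLU(x: FloatArray, theta: Float = 1.0f): FloatArray =
    FloatArray(x.size) { i -> if (x[i] > theta) x[i] else 0.0f }

fun main() {
    // Values at or below theta are zeroed; values above pass through unchanged.
    val out = thresholdedReLU(floatArrayOf(-1.0f, 0.5f, 1.5f), theta = 1.0f)
    println(out.toList()) // [0.0, 0.0, 1.5]
}
```

Unlike standard ReLU, which zeroes only negative inputs, the comparison is strict and against `theta`, so an input exactly equal to the threshold is also mapped to zero.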
Constructors
ThresholdedReLU
Creates a ThresholdedReLU object.
Functions
buildFromInboundLayers
Extend this function to define variables in the layer.
computeOutputShape
Computes the output shape based on the inputShape and the layer type.
computeOutputShapeFromInboundLayers
Computes the output shape based on the input shapes of the inbound layers.
Properties
hasActivation
inboundLayers
outboundLayers
outputShape
parentModel