Relu
Relu()
Rectified linear unit (ReLU).
With default values, this returns the standard ReLU activation: max(x, 0), the element-wise maximum of 0 and the input tensor.
Calls ReluActivation under the hood.
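The element-wise computation can be sketched in plain Kotlin. This is a minimal illustration of max(x, 0) over a FloatArray, not the library's tensor-based implementation; the `relu` helper here is hypothetical.

```kotlin
// Sketch of element-wise ReLU: each output element is max(x, 0).
// Uses a plain FloatArray for illustration, not the library's tensor type.
fun relu(input: FloatArray): FloatArray =
    FloatArray(input.size) { i -> maxOf(input[i], 0f) }

fun main() {
    val x = floatArrayOf(-2f, -0.5f, 0f, 1.5f, 3f)
    // Negative values are clamped to 0; non-negative values pass through unchanged.
    println(relu(x).joinToString())
}
```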