PReLU
class PReLU(alphaInitializer: Initializer, alphaRegularizer: Regularizer?, sharedAxes: IntArray?, name: String) : AbstractActivationLayer, TrainableLayer
Parametric Rectified Linear Unit.
It follows:

f(x) = alpha * x  if x < 0
f(x) = x          if x >= 0

where alpha is a learnable weight with the same shape as x (i.e. the input).

Since 0.3
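
For intuition, the definition above can be written as a tiny framework-free Kotlin sketch; the prelu function, the FloatArray representation, and the sample values are illustrative only and not part of this API:

// Elementwise PReLU with one alpha per element, i.e. alpha has the
// same shape as the input, as described above.
fun prelu(x: FloatArray, alpha: FloatArray): FloatArray =
    FloatArray(x.size) { i -> if (x[i] < 0f) alpha[i] * x[i] else x[i] }

// Example: prelu([-2.0, 3.0], [0.25, 0.25]) yields [-0.5, 3.0].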
Constructors
PReLU
fun PReLU(alphaInitializer: Initializer = Zeros(), alphaRegularizer: Regularizer? = null, sharedAxes: IntArray? = null, name: String = "")
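
A minimal usage sketch, assuming KotlinDL's Sequential API; the import paths, the surrounding Input/Dense layers, and the layer sizes are assumptions for illustration, not prescribed by this class:

import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.api.core.initializer.Constant
import org.jetbrains.kotlinx.dl.api.core.layer.activation.PReLU
import org.jetbrains.kotlinx.dl.api.core.layer.core.Dense
import org.jetbrains.kotlinx.dl.api.core.layer.core.Input

// PReLU placed after a linear Dense layer, so the learnable activation
// replaces a fixed ReLU. Constant(0.25f) is an assumed initializer
// matching the value used in the original PReLU paper.
val model = Sequential.of(
    Input(784),
    Dense(128, activation = Activations.Linear),
    PReLU(alphaInitializer = Constant(0.25f)),
    Dense(10, activation = Activations.Linear)
)

Leaving alphaInitializer at its Zeros() default means the layer starts out behaving exactly like ReLU (zero slope on negative inputs), which is often a reasonable starting point.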
Properties
alphaInitializer
alphaRegularizer
hasActivation
inboundLayers
isTrainable
outboundLayers
outputShape
paramCount
parentModel
sharedAxes
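
sharedAxes constrains which axes share a single alpha value. A hedged sketch, assuming this parameter mirrors Keras's shared_axes convention, where sharing alpha across the two spatial axes of a feature map leaves one learnable alpha per channel; the axis numbering here is an assumption:

import org.jetbrains.kotlinx.dl.api.core.layer.activation.PReLU

// One alpha per channel: alpha is shared across the (assumed) spatial
// axes 1 and 2 of a feature map, instead of one alpha per element.
val channelwisePReLU = PReLU(sharedAxes = intArrayOf(1, 2))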