ELU
Exponential Linear Unit activation function.
It follows:
f(x) = x, if x > 0
f(x) = alpha * (exp(x) - 1), if x <= 0
In contrast to ReLU, ELU has negative values, which push the mean of the activations closer to zero. Mean activations that are closer to zero enable faster learning, as they bring the gradient closer to the natural gradient.
Since 0.3
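As a reference for the formula above, here is a minimal Kotlin sketch of ELU as a scalar function. The function name and the example values are illustrative only and are not part of the KotlinDL API.

```kotlin
import kotlin.math.exp

// Scalar form of the ELU formula shown above.
// alpha scales the negative part; f(x) saturates towards -alpha as x decreases.
fun elu(x: Float, alpha: Float = 1.0f): Float =
    if (x > 0f) x else alpha * (exp(x) - 1f)

fun main() {
    println(elu(2.0f))   // 2.0       (positive inputs pass through unchanged)
    println(elu(0.0f))   // 0.0
    println(elu(-2.0f))  // ~ -0.8647 (approaches -alpha for large negative inputs)
}
```

When ELU is used as a layer in a model, the negative saturation value is controlled by the layer's alpha parameter, for example via a constructor call such as `ELU(alpha = 1.0f)` (the exact constructor signature may vary between KotlinDL versions).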
Constructors
Functions
Properties
hasActivation
inboundLayers
outboundLayers
outputShape
parentModel