BatchNorm
class BatchNorm(axis: List<Int>, momentum: Double, center: Boolean, epsilon: Double, scale: Boolean, gammaInitializer: Initializer, betaInitializer: Initializer, gammaRegularizer: Regularizer?, betaRegularizer: Regularizer?, movingMeanInitializer: Initializer, movingVarianceInitializer: Initializer, name: String) : Layer, NoGradients, ParametrizedLayer
NOTE: This layer is not trainable and does not update its weights; it is frozen by default.
Since
0.2
Constructors
BatchNorm
fun BatchNorm(axis: List<Int> = arrayListOf(3), momentum: Double = 0.99, center: Boolean = true, epsilon: Double = 0.001, scale: Boolean = true, gammaInitializer: Initializer = Ones(), betaInitializer: Initializer = Zeros(), gammaRegularizer: Regularizer? = null, betaRegularizer: Regularizer? = null, movingMeanInitializer: Initializer = Zeros(), movingVarianceInitializer: Initializer = Ones(), name: String = "")
Creates a BatchNorm object.
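The transformation a frozen BatchNorm applies at inference time can be sketched in plain Kotlin. This is an illustrative sketch, not the KotlinDL implementation; it normalizes one channel's values with the stored moving statistics, using the constructor's default gamma, beta, and epsilon values:

```kotlin
import kotlin.math.sqrt

// Sketch (not KotlinDL code): in inference mode a frozen BatchNorm computes,
// per channel c along the chosen axis:
//   y = gamma[c] * (x - movingMean[c]) / sqrt(movingVariance[c] + epsilon) + beta[c]
fun batchNormInference(
    x: DoubleArray,            // values of a single channel
    gamma: Double = 1.0,       // scale; gammaInitializer = Ones() by default
    beta: Double = 0.0,        // shift; betaInitializer = Zeros() by default
    movingMean: Double = 0.0,  // movingMeanInitializer = Zeros() by default
    movingVariance: Double = 1.0, // movingVarianceInitializer = Ones() by default
    epsilon: Double = 0.001
): DoubleArray {
    val invStd = 1.0 / sqrt(movingVariance + epsilon)
    return DoubleArray(x.size) { i -> gamma * (x[i] - movingMean) * invStd + beta }
}

fun main() {
    val out = batchNormInference(
        doubleArrayOf(1.0, 2.0, 3.0),
        movingMean = 2.0,
        movingVariance = 1.0
    )
    // The mean value maps to beta (0.0) and the result is symmetric around it.
    println(out[1] == 0.0 && out[0] == -out[2])
}
```

With the default initializers (gamma from Ones(), beta from Zeros()) the layer starts out close to an identity transform; meaningful moving statistics typically come from loaded pretrained weights, since the layer is frozen and does not update them.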
Functions
buildFromInboundLayers
Extend this function to define variables in the layer.
computeOutputShape
Computes the output shape based on the inputShape and the Layer type.
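Batch normalization rescales values per channel without changing tensor dimensions, so for this layer the output shape equals the input shape. A minimal sketch (the function name is hypothetical, not KotlinDL API):

```kotlin
// Hypothetical sketch: BatchNorm only normalizes values channel-wise,
// so its output shape is the input shape unchanged.
fun batchNormOutputShape(inputShape: LongArray): LongArray = inputShape.copyOf()

fun main() {
    val input = longArrayOf(-1, 28, 28, 3) // NHWC; -1 = unknown batch size
    val output = batchNormOutputShape(input)
    println(output.contentEquals(input))
}
```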
computeOutputShapeFromInboundLayers
Computes the output shape based on the input shapes of the inbound layers.
forward
Properties
betaInitializer
betaRegularizer
gammaInitializer
gammaRegularizer
hasActivation
inboundLayers
movingMeanInitializer
movingVarianceInitializer
outboundLayers
outputShape
paramCount
parentModel