BatchNorm

class BatchNorm(axis: List<Int>, momentum: Double, center: Boolean, epsilon: Double, scale: Boolean, gammaInitializer: Initializer, betaInitializer: Initializer, gammaRegularizer: Regularizer?, betaRegularizer: Regularizer?, movingMeanInitializer: Initializer, movingVarianceInitializer: Initializer, name: String) : Layer, NoGradients

NOTE: This layer is not trainable and does not update its weights; it is frozen by default.

Since

0.2

Constructors

BatchNorm
fun BatchNorm(axis: List<Int> = arrayListOf(3), momentum: Double = 0.99, center: Boolean = true, epsilon: Double = 0.001, scale: Boolean = true, gammaInitializer: Initializer = Ones(), betaInitializer: Initializer = Zeros(), gammaRegularizer: Regularizer? = null, betaRegularizer: Regularizer? = null, movingMeanInitializer: Initializer = Zeros(), movingVarianceInitializer: Initializer = Ones(), name: String = "")

Creates BatchNorm object.
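
A minimal construction sketch, not taken from this page: it assumes KotlinDL's usual package layout (org.jetbrains.kotlinx.dl.api.core.layer.normalization.BatchNorm and org.jetbrains.kotlinx.dl.api.core.initializer.Ones/Zeros), and the layer name "batch_norm_1" is purely illustrative. Parameter names and defaults match the constructor above.

import org.jetbrains.kotlinx.dl.api.core.initializer.Ones
import org.jetbrains.kotlinx.dl.api.core.initializer.Zeros
import org.jetbrains.kotlinx.dl.api.core.layer.normalization.BatchNorm

// Batch normalization over the last axis with all defaults.
val defaultBatchNorm = BatchNorm()

// The same layer with the defaults spelled out explicitly.
val customBatchNorm = BatchNorm(
    axis = listOf(3),
    momentum = 0.99,
    center = true,
    epsilon = 0.001,
    scale = true,
    gammaInitializer = Ones(),
    betaInitializer = Zeros(),
    gammaRegularizer = null,
    betaRegularizer = null,
    movingMeanInitializer = Zeros(),
    movingVarianceInitializer = Ones(),
    name = "batch_norm_1"
)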

Functions

build
open override fun build(tf: Ops, kGraph: KGraph, inputShape: Shape)

Extend this function to define variables in the layer.

buildFromInboundLayers
fun buildFromInboundLayers(tf: Ops, kGraph: KGraph)

Extend this function to define variables in the layer.

computeOutputShape
open override fun computeOutputShape(inputShape: Shape): Shape

Computes the output shape based on inputShape and the Layer type.

computeOutputShapeFromInboundLayers
open fun computeOutputShapeFromInboundLayers(): TensorShape

Computes the output shape based on the input shapes of the inbound layers.

forward
open override fun forward(tf: Ops, input: Operand<Float>, isTraining: Operand<Boolean>, numberOfLosses: Operand<Float>?): Operand<Float>

Builds the main layer input transformation with tf. Depends on the Layer type.

open fun forward(tf: Ops, input: List<Operand<Float>>, isTraining: Operand<Boolean>, numberOfLosses: Operand<Float>?): Operand<Float>

Builds the main layer input transformation with tf. Depends on the Layer type.

invoke
operator fun invoke(vararg layers: Layer): Layer

An important part of the functional API: it takes layers as input and saves them to the inboundLayers of the given layer.
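
A short wiring sketch for the invoke operator, assuming KotlinDL's Input layer (the import path and the input shape are assumptions made for illustration): calling the BatchNorm instance with another layer stores that layer in inboundLayers and returns the BatchNorm layer, so calls can be chained.

import org.jetbrains.kotlinx.dl.api.core.layer.core.Input
import org.jetbrains.kotlinx.dl.api.core.layer.normalization.BatchNorm

// Registers `input` as an inbound layer of the BatchNorm instance.
val input = Input(28, 28, 3)
val normalized = BatchNorm(name = "bn")(input)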

toString
open override fun toString(): String

Properties

axis
val axis: List<Int>

The axis or axes that should be normalized (typically the features axis). For channels-last image tensors, the default axis 3 corresponds to the channels dimension.

betaInitializer
val betaInitializer: Initializer

Initializer for the beta weight.

betaRegularizer
val betaRegularizer: Regularizer? = null

Optional regularizer for the beta weight.

betaShapeArray
val betaShapeArray: LongArray?

Returns the shape of beta variable weights.

center
val center: Boolean = true

If true, adds the beta offset to the normalized tensor. If false, beta is ignored.

epsilon
val epsilon: Double = 0.001

Small float added to the variance to avoid division by zero.
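
For reference, epsilon sits in the denominator of the batch-normalization transform. The helper below is a hypothetical per-element sketch of inference-mode normalization (it is not the layer's implementation); gamma and beta correspond to the scale and center parameters documented on this page.

import kotlin.math.sqrt

// y = gamma * (x - movingMean) / sqrt(movingVariance + epsilon) + beta
fun batchNormInference(
    x: FloatArray,
    movingMean: Float,
    movingVariance: Float,
    gamma: Float = 1.0f,    // applied only when scale = true
    beta: Float = 0.0f,     // added only when center = true
    epsilon: Float = 0.001f
): FloatArray = FloatArray(x.size) { i ->
    gamma * (x[i] - movingMean) / sqrt(movingVariance + epsilon) + beta
}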

gammaInitializer
val gammaInitializer: Initializer

Initializer for the gamma weight.

gammaRegularizer
val gammaRegularizer: Regularizer? = null

Optional regularizer for the gamma weight.

gammaShapeArray
val gammaShapeArray: LongArray?

Returns the shape of gamma variable weights.

hasActivation
open override val hasActivation: Boolean

Returns true if the layer has an internal activation function.

inboundLayers
var inboundLayers: MutableList<Layer>

Returns inbound layers.

isTrainable
var isTrainable: Boolean = true

True if the layer's weights can be changed during training. If false, the layer's weights are frozen and cannot be changed during training.

momentum
val momentum: Double = 0.99

Momentum for the moving average.
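
The moving mean and moving variance are conventionally maintained as exponential moving averages controlled by momentum; the helper below is a hypothetical sketch of that standard update rule, not a quote of this layer's internals.

// movingStatistic = momentum * movingStatistic + (1 - momentum) * batchStatistic
fun updateMovingStatistic(moving: Double, batch: Double, momentum: Double = 0.99): Double =
    momentum * moving + (1.0 - momentum) * batch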

movingMeanInitializer
val movingMeanInitializer: Initializer

Initializer for the moving mean.

movingMeanShapeArray
val movingMeanShapeArray: LongArray

Returns the shape of movingMean variable weights.

movingVarianceInitializer
val movingVarianceInitializer: Initializer

Initializer for the moving variance.

movingVarianceShapeArray
val movingVarianceShapeArray: LongArray

Returns the shape of movingVariance variable weights.

name
var name: String

Layer's name.

outboundLayers
var outboundLayers: MutableList<Layer>

Returns outbound layers.

outputShape
lateinit var outputShape: TensorShape

Output data tensor shape.

paramCount
open override val paramCount: Int

Returns the number of parameters in the layer.

parentModel
var parentModel: TrainableModel? = null

Model where this layer is used.

scale
val scale: Boolean = true

If true, multiplies the normalized tensor by gamma. If false, gamma is not used.

weights
open override var weights: Map<String, Array<*>>

Layer's weights.