Package org.jetbrains.kotlinx.dl.api.core.optimizer

Types

AdaDelta
class AdaDelta(learningRate: Float, rho: Float, epsilon: Float, clipGradient: ClipGradientAction) : Optimizer

Adadelta optimizer.
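
A minimal construction sketch, assuming the constructor signature above; the argument values are illustrative, not defaults documented on this page:

    import org.jetbrains.kotlinx.dl.api.core.optimizer.AdaDelta
    import org.jetbrains.kotlinx.dl.api.core.optimizer.NoClipGradient

    // rho is the decay rate of the moving averages of squared gradients;
    // epsilon guards against division by zero in the update rule.
    val adaDelta = AdaDelta(
        learningRate = 0.1f,
        rho = 0.95f,
        epsilon = 1e-8f,
        clipGradient = NoClipGradient()
    )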

AdaGrad
class AdaGrad(learningRate: Float, initialAccumulatorValue: Float, clipGradient: ClipGradientAction) : Optimizer

Adagrad optimizer.
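
A construction sketch with illustrative values:

    import org.jetbrains.kotlinx.dl.api.core.optimizer.AdaGrad
    import org.jetbrains.kotlinx.dl.api.core.optimizer.NoClipGradient

    // initialAccumulatorValue seeds the per-parameter sum of squared
    // gradients that AdaGrad divides by; it should be non-negative.
    val adaGrad = AdaGrad(
        learningRate = 0.1f,
        initialAccumulatorValue = 0.01f,
        clipGradient = NoClipGradient()
    )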

AdaGradDA
class AdaGradDA(learningRate: Float, initialAccumulatorValue: Float, l1Strength: Float, l2Strength: Float, clipGradient: ClipGradientAction) : Optimizer

Adagrad Dual Averaging algorithm for sparse linear models.
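
A construction sketch; the regularization strengths are illustrative:

    import org.jetbrains.kotlinx.dl.api.core.optimizer.AdaGradDA
    import org.jetbrains.kotlinx.dl.api.core.optimizer.NoClipGradient

    // l1Strength and l2Strength add L1/L2 regularization to the dual
    // averaging update; pass 0f to disable either term.
    val adaGradDA = AdaGradDA(
        learningRate = 0.1f,
        initialAccumulatorValue = 0.01f,
        l1Strength = 0.01f,
        l2Strength = 0.01f,
        clipGradient = NoClipGradient()
    )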

Adam
class Adam(learningRate: Float, beta1: Float, beta2: Float, epsilon: Float, useNesterov: Boolean, clipGradient: ClipGradientAction) : Optimizer

Adam optimizer.
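
A sketch of the typical use: passing the optimizer to a model's compile step. Sequential, Losses, and Metrics come from neighboring KotlinDL packages rather than this one, and the hyperparameter values below are the conventional Adam settings, not defaults documented on this page:

    import org.jetbrains.kotlinx.dl.api.core.Sequential
    import org.jetbrains.kotlinx.dl.api.core.loss.Losses
    import org.jetbrains.kotlinx.dl.api.core.metric.Metrics
    import org.jetbrains.kotlinx.dl.api.core.optimizer.Adam
    import org.jetbrains.kotlinx.dl.api.core.optimizer.NoClipGradient

    fun compileWithAdam(model: Sequential) {
        model.compile(
            // beta1 and beta2 are the exponential decay rates for the first
            // and second moment estimates; useNesterov = true switches the
            // update to Nesterov momentum (NAdam-style).
            optimizer = Adam(
                learningRate = 0.001f,
                beta1 = 0.9f,
                beta2 = 0.999f,
                epsilon = 1e-7f,
                useNesterov = false,
                clipGradient = NoClipGradient()
            ),
            loss = Losses.SOFT_MAX_CROSS_ENTROPY_WITH_LOGITS,
            metric = Metrics.ACCURACY
        )
    }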

Adamax
class Adamax(learningRate: Float, beta1: Float, beta2: Float, epsilon: Float, clipGradient: ClipGradientAction) : Optimizer

Adamax optimizer from Section 7 of the Adam paper.
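
A construction sketch with conventional values:

    import org.jetbrains.kotlinx.dl.api.core.optimizer.Adamax
    import org.jetbrains.kotlinx.dl.api.core.optimizer.NoClipGradient

    // Adamax replaces Adam's second-moment estimate with an
    // infinity-norm-based one, as described in the Adam paper.
    val adamax = Adamax(
        learningRate = 0.001f,
        beta1 = 0.9f,
        beta2 = 0.999f,
        epsilon = 1e-7f,
        clipGradient = NoClipGradient()
    )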

ClipGradientAction
abstract class ClipGradientAction

Base abstract class for strategies that clip gradient values at each optimizer step.

ClipGradientByNorm
class ClipGradientByNorm(clipNormValue: Float) : ClipGradientAction

Clips the gradient so that its norm does not exceed the pre-defined clipNormValue.

ClipGradientByValue
class ClipGradientByValue(clipValue: Float) : ClipGradientAction

Clips each gradient value to clipValue if it is greater than clipValue, and to -clipValue if it is less than -clipValue.
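
A sketch constructing both clipping strategies; either value can be passed as the clipGradient argument of any optimizer on this page (the thresholds are illustrative):

    import org.jetbrains.kotlinx.dl.api.core.optimizer.ClipGradientByNorm
    import org.jetbrains.kotlinx.dl.api.core.optimizer.ClipGradientByValue

    // per the description above, rescales gradients so their norm
    // stays within 5.0
    val byNorm = ClipGradientByNorm(clipNormValue = 5.0f)

    // clamps every gradient component into [-1.0, 1.0]
    val byValue = ClipGradientByValue(clipValue = 1.0f)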

Ftrl
class Ftrl(learningRate: Float, l1RegularizationStrength: Float, l2RegularizationStrength: Float, learningRatePower: Float, l2ShrinkageRegularizationStrength: Float, initialAccumulatorValue: Float, clipGradient: ClipGradientAction) : Optimizer

Optimizer that implements the FTRL (Follow The Regularized Leader) algorithm.
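
A construction sketch; the values follow the usual FTRL conventions and are not defaults documented on this page:

    import org.jetbrains.kotlinx.dl.api.core.optimizer.Ftrl
    import org.jetbrains.kotlinx.dl.api.core.optimizer.NoClipGradient

    // In the usual FTRL formulation, learningRatePower controls how the
    // per-coordinate learning rate decays (0f gives a fixed rate), and the
    // l2 shrinkage term penalizes the weights rather than the loss.
    val ftrl = Ftrl(
        learningRate = 0.001f,
        l1RegularizationStrength = 0.01f,
        l2RegularizationStrength = 0.01f,
        learningRatePower = -0.5f,
        l2ShrinkageRegularizationStrength = 0.0f,
        initialAccumulatorValue = 0.01f,
        clipGradient = NoClipGradient()
    )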

Momentum
class Momentum(learningRate: Float, momentum: Float, useNesterov: Boolean, clipGradient: ClipGradientAction) : Optimizer

Improved version of the SGD optimizer that adds momentum (optionally Nesterov momentum) to the parameter updates.
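
A construction sketch combining momentum with value clipping (values illustrative):

    import org.jetbrains.kotlinx.dl.api.core.optimizer.ClipGradientByValue
    import org.jetbrains.kotlinx.dl.api.core.optimizer.Momentum

    // useNesterov = true evaluates the gradient at the look-ahead position
    // (Nesterov accelerated gradient) instead of using plain momentum.
    val momentum = Momentum(
        learningRate = 0.01f,
        momentum = 0.9f,
        useNesterov = true,
        clipGradient = ClipGradientByValue(clipValue = 1.0f)
    )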

NoClipGradient
class NoClipGradient : ClipGradientAction

No gradient clipping: gradients pass through unchanged.

Optimizer
abstract class Optimizer(clipGradient: ClipGradientAction)

Base class for all optimizers.
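
Because every concrete optimizer on this page extends this class, training code can accept an Optimizer and stay agnostic of the concrete algorithm. A hypothetical helper sketching that pattern (the function and its selection logic are not part of the library):

    import org.jetbrains.kotlinx.dl.api.core.optimizer.Adam
    import org.jetbrains.kotlinx.dl.api.core.optimizer.NoClipGradient
    import org.jetbrains.kotlinx.dl.api.core.optimizer.Optimizer
    import org.jetbrains.kotlinx.dl.api.core.optimizer.SGD

    // hypothetical factory: maps a configuration string to an optimizer
    fun optimizerFor(name: String): Optimizer = when (name) {
        "adam" -> Adam(
            learningRate = 0.001f, beta1 = 0.9f, beta2 = 0.999f,
            epsilon = 1e-7f, useNesterov = false,
            clipGradient = NoClipGradient()
        )
        "sgd" -> SGD(clipGradient = NoClipGradient())
        else -> error("Unknown optimizer: $name")
    }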

RMSProp
class RMSProp(learningRate: Float, decay: Float, momentum: Float, epsilon: Float, centered: Boolean, clipGradient: ClipGradientAction) : Optimizer

RMSProp optimizer.
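
A construction sketch with illustrative values:

    import org.jetbrains.kotlinx.dl.api.core.optimizer.NoClipGradient
    import org.jetbrains.kotlinx.dl.api.core.optimizer.RMSProp

    // centered = true normalizes updates by an estimate of the gradient
    // variance instead of the raw second moment, at extra computational cost.
    val rmsProp = RMSProp(
        learningRate = 0.001f,
        decay = 0.9f,
        momentum = 0.0f,
        epsilon = 1e-10f,
        centered = false,
        clipGradient = NoClipGradient()
    )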

SGD
class SGD(clipGradient: ClipGradientAction) : Optimizer

Stochastic gradient descent optimizer.
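
A minimal sketch; per the signature above, only the clipping action is configurable through this constructor, so the learning rate is presumably supplied internally by the class:

    import org.jetbrains.kotlinx.dl.api.core.optimizer.NoClipGradient
    import org.jetbrains.kotlinx.dl.api.core.optimizer.SGD

    // plain stochastic gradient descent with clipping disabled
    val sgd = SGD(clipGradient = NoClipGradient())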