Package org.jetbrains.kotlinx.dl.api.core.layer.activation

Types

AbstractActivationLayer
abstract class AbstractActivationLayer(name: String) : Layer

Base class for all layer classes representing activation functions.

ELU
class ELU(alpha: Float, name: String) : AbstractActivationLayer

Exponential Linear Unit (ELU) activation function.
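Not part of the API above; a minimal standalone sketch of the element-wise math ELU applies, assuming the standard definition (identity for positive inputs, a scaled exponential below zero):

```kotlin
import kotlin.math.exp

// ELU: x for x > 0, alpha * (e^x - 1) otherwise.
// alpha matches the constructor parameter of the ELU layer above.
fun elu(x: Float, alpha: Float = 1.0f): Float =
    if (x > 0f) x else alpha * (exp(x) - 1f)

fun main() {
    println(elu(2.0f))   // positive inputs pass through unchanged
    println(elu(-1.0f))  // negative inputs saturate smoothly toward -alpha
}
```

Unlike plain ReLU, the negative branch is smooth and bounded below by -alpha, which keeps gradients nonzero for negative inputs.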

LeakyReLU
class LeakyReLU(alpha: Float, name: String) : AbstractActivationLayer

Leaky version of a Rectified Linear Unit.
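A standalone sketch of the leaky variant's math (not the layer implementation itself): instead of zeroing negative inputs, they are scaled by a small fixed slope, the alpha constructor parameter above.

```kotlin
// LeakyReLU: x for x >= 0, alpha * x otherwise.
fun leakyRelu(x: Float, alpha: Float = 0.3f): Float =
    if (x >= 0f) x else alpha * x

fun main() {
    println(leakyRelu(3.0f))        // 3.0
    println(leakyRelu(-2.0f, 0.1f)) // -0.2
}
```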

PReLU
class PReLU(alphaInitializer: Initializer, alphaRegularizer: Regularizer?, sharedAxes: IntArray?, name: String) : AbstractActivationLayer, TrainableLayer

Parametric Rectified Linear Unit.
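PReLU uses the same formula as LeakyReLU, but the negative slope is a trainable weight rather than a constant, created by alphaInitializer and optionally penalized by alphaRegularizer. A minimal sketch of the forward math, with a plain array standing in for the learned alpha weights:

```kotlin
// PReLU forward pass: x for x >= 0, alpha[i] * x otherwise,
// where alpha is a per-feature parameter the layer would learn during training.
fun prelu(x: FloatArray, alpha: FloatArray): FloatArray =
    FloatArray(x.size) { i -> if (x[i] >= 0f) x[i] else alpha[i] * x[i] }

fun main() {
    val out = prelu(floatArrayOf(-2.0f, 3.0f), floatArrayOf(0.5f, 0.5f))
    println(out.toList())  // negative entry scaled by its alpha, positive kept
}
```

The sharedAxes parameter of the real layer lets several axes (e.g. spatial dimensions) share one alpha value instead of learning one per element.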

ReLU
class ReLU(maxValue: Float?, negativeSlope: Float, threshold: Float, name: String) : AbstractActivationLayer

Rectified Linear Unit activation function.
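A standalone sketch of how the three constructor parameters interact, assuming the Keras-style generalized ReLU semantics (threshold below which the negative slope applies, and an optional upper clip):

```kotlin
// Generalized ReLU:
//   y = x                            for x >= threshold
//   y = negativeSlope * (x - threshold) otherwise
// then y is clipped to maxValue when maxValue is non-null.
fun relu(
    x: Float,
    maxValue: Float? = null,
    negativeSlope: Float = 0f,
    threshold: Float = 0f,
): Float {
    var y = if (x >= threshold) x else negativeSlope * (x - threshold)
    if (maxValue != null && y > maxValue) y = maxValue
    return y
}

fun main() {
    println(relu(5.0f, maxValue = 3.0f))       // clipped to the ceiling
    println(relu(-1.0f, negativeSlope = 0.5f)) // leaky below the threshold
}
```

With the defaults (no ceiling, zero slope, zero threshold) this reduces to the familiar max(0, x).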

Softmax
class Softmax(axis: List<Int>, name: String) : AbstractActivationLayer

Softmax activation layer.
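A standalone sketch of the softmax computation over one axis, using the standard max-subtraction trick for numerical stability (not the layer's actual implementation, which operates on tensors along the configured axis list):

```kotlin
import kotlin.math.exp

// Softmax: exponentiate shifted logits and normalize so the outputs sum to 1.
fun softmax(logits: FloatArray): FloatArray {
    val max = logits.maxOrNull() ?: 0f  // subtracting the max avoids overflow in exp
    val exps = FloatArray(logits.size) { exp(logits[it] - max) }
    val sum = exps.sum()
    return FloatArray(logits.size) { exps[it] / sum }
}

fun main() {
    println(softmax(floatArrayOf(1.0f, 1.0f)).toList())  // equal logits give equal probabilities
}
```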

ThresholdedReLU
class ThresholdedReLU(theta: Float, name: String) : AbstractActivationLayer

Thresholded Rectified Linear Unit.
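A standalone sketch of the thresholded variant's math: inputs at or below theta (the constructor parameter above) are zeroed, and everything else passes through unchanged.

```kotlin
// ThresholdedReLU: x for x > theta, 0 otherwise.
fun thresholdedRelu(x: Float, theta: Float = 1.0f): Float =
    if (x > theta) x else 0f

fun main() {
    println(thresholdedRelu(2.0f, theta = 1.0f))  // above theta: passes through
    println(thresholdedRelu(0.5f, theta = 1.0f))  // at or below theta: zeroed
}
```

Note the difference from ReLU's threshold parameter: here sub-threshold values are hard-zeroed rather than given a negative slope.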