ReLU

class ReLU(maxValue: Float?, negativeSlope: Float, threshold: Float, name: String) : AbstractActivationLayer

Rectified Linear Unit activation function.

With default values, it returns element-wise max(x, 0).

Otherwise, it follows:

f(x) = maxValue,                        if x >= maxValue
f(x) = x,                               if threshold <= x < maxValue
f(x) = negativeSlope * (x - threshold), if x < threshold
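The piecewise definition above can be sketched in plain Kotlin (a hypothetical scalar helper for illustration; the actual layer applies the same formula element-wise to tensors via TensorFlow ops):

```kotlin
// Hypothetical scalar sketch of the ReLU formula; parameter defaults
// mirror the constructor defaults, so relu(x) reduces to max(x, 0).
fun relu(
    x: Float,
    maxValue: Float? = null,
    negativeSlope: Float = 0.0f,
    threshold: Float = 0.0f
): Float = when {
    maxValue != null && x >= maxValue -> maxValue          // clip at maxValue
    x >= threshold -> x                                    // linear region
    else -> negativeSlope * (x - threshold)                // leaky below threshold
}

fun main() {
    println(relu(8f, maxValue = 6f)) // prints 6.0: clipped at maxValue
    println(relu(3f, maxValue = 6f)) // prints 3.0: identity in the linear region
}
```

With the defaults (`maxValue = null`, `negativeSlope = 0.0f`, `threshold = 0.0f`) the first branch never fires and the last branch yields zero, recovering element-wise max(x, 0).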

Since

0.2

Constructors

ReLU
fun ReLU(maxValue: Float? = null, negativeSlope: Float = 0.0f, threshold: Float = 0.0f, name: String = "")

Creates a ReLU object.

Functions

build
open override fun build(tf: Ops, kGraph: KGraph, inputShape: Shape)

Extend this function to define variables in the layer.

buildFromInboundLayers
fun buildFromInboundLayers(tf: Ops, kGraph: KGraph)

Extend this function to define variables in the layer.

computeOutputShape
open override fun computeOutputShape(inputShape: Shape): Shape

Computes the output shape based on inputShape and the Layer type.
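Since ReLU is applied element-wise, its output shape equals its input shape. A minimal sketch, using a hypothetical Shape stand-in rather than the TensorFlow Shape class:

```kotlin
// Hypothetical stand-in for org.tensorflow.Shape, for illustration only.
data class Shape(val dims: List<Long>)

// For an element-wise activation layer, the shape passes through unchanged.
fun computeOutputShape(inputShape: Shape): Shape = inputShape

fun main() {
    val input = Shape(listOf(-1L, 28L, 28L, 1L)) // -1 marks the unknown batch dimension
    println(computeOutputShape(input) == input)  // prints true
}
```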

computeOutputShapeFromInboundLayers
open fun computeOutputShapeFromInboundLayers(): TensorShape

Computes the output shape based on the input shapes of the inbound layers.

forward
open override fun forward(tf: Ops, input: Operand<Float>): Operand<Float>

Applies the activation function to the input to produce the output.

open override fun forward(tf: Ops, input: Operand<Float>, isTraining: Operand<Boolean>, numberOfLosses: Operand<Float>?): Operand<Float>

Builds the main layer input transformation with tf; depends on the Layer type.

open fun forward(tf: Ops, input: List<Operand<Float>>, isTraining: Operand<Boolean>, numberOfLosses: Operand<Float>?): Operand<Float>

Builds the main layer input transformation with tf; depends on the Layer type.

invoke
operator fun invoke(vararg layers: Layer): Layer

An important part of the functional API: it takes layers as input and saves them to the inboundLayers of the given layer.
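This wiring can be sketched with a simplified stand-in for the Layer class (hypothetical, not the KotlinDL type): calling a layer on other layers records them as its inbound layers, which is how the functional API threads the graph together.

```kotlin
// Simplified, hypothetical stand-in for Layer; the real class lives in KotlinDL.
open class Layer(val name: String) {
    var inboundLayers = mutableListOf<Layer>()

    // Functional-API style call: record the argument layers as inbound.
    operator fun invoke(vararg layers: Layer): Layer {
        inboundLayers.addAll(layers)
        return this
    }
}

fun main() {
    val input = Layer("input")
    val relu = Layer("relu")(input)             // wires input -> relu
    println(relu.inboundLayers.map { it.name }) // prints [input]
}
```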

toString
open override fun toString(): String

Properties

hasActivation
open override val hasActivation: Boolean

Returns true if the layer has an internal activation function.

inboundLayers
var inboundLayers: MutableList<Layer>

Returns inbound layers.

isTrainable
var isTrainable: Boolean = true

True if the layer's weights can be changed during training. If false, the layer's weights are frozen and cannot be changed during training.

maxValue
val maxValue: Float? = null

Maximum activation value. Should be >= 0.

name
var name: String

Layer name.
negativeSlope
val negativeSlope: Float = 0.0f

Negative slope coefficient. Should be >= 0.

outboundLayers
var outboundLayers: MutableList<Layer>

Returns outbound layers.

outputShape
lateinit var outputShape: TensorShape

Output data tensor shape.

paramCount
open override val paramCount: Int

Returns the number of neurons.

parentModel
var parentModel: TrainableModel? = null

Model where this layer is used.

threshold
val threshold: Float = 0.0f

Threshold value for thresholded activation.

weights
open override var weights: Map<String, Array<*>>

Layer's weights.