PReLU

class PReLU(alphaInitializer: Initializer, alphaRegularizer: Regularizer?, sharedAxes: IntArray?, name: String) : AbstractActivationLayer, TrainableLayer

Parametric Rectified Linear Unit.

It follows:

f(x) = alpha * x  if x < 0
f(x) = x          if x >= 0

where alpha is a learnable weight with the same shape as x (i.e. the input).

Since 0.3

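A minimal usage sketch (not part of the generated reference): PReLU is typically placed after a layer with a linear activation. Sequential, Input, Dense, and Activations below are assumed from the same library, and the package paths follow the 0.3-era layout, so they may differ between releases.

import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.api.core.layer.activation.PReLU
import org.jetbrains.kotlinx.dl.api.core.layer.core.Dense
import org.jetbrains.kotlinx.dl.api.core.layer.core.Input

// The Dense layers stay linear; PReLU learns alpha and applies
// f(x) = x for x >= 0 and f(x) = alpha * x for x < 0.
val model = Sequential.of(
    Input(784),
    Dense(outputSize = 64, activation = Activations.Linear),
    PReLU(),
    Dense(outputSize = 10, activation = Activations.Linear)
)
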
Constructors

PReLU
fun PReLU(alphaInitializer: Initializer = Zeros(), alphaRegularizer: Regularizer? = null, sharedAxes: IntArray? = null, name: String = "")
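A hedged construction sketch using non-default arguments: only the four constructor parameters come from the signature above; the Constant initializer and L2 regularizer are assumed to live in the library's initializer and regularizer packages.

import org.jetbrains.kotlinx.dl.api.core.initializer.Constant
import org.jetbrains.kotlinx.dl.api.core.layer.activation.PReLU
import org.jetbrains.kotlinx.dl.api.core.regularizer.L2

val prelu = PReLU(
    alphaInitializer = Constant(0.25f),  // start alpha at 0.25 instead of 0
    alphaRegularizer = L2(0.01f),        // assumed L2 regularizer, penalizes large alpha
    sharedAxes = intArrayOf(1, 2),       // share alpha across the spatial axes of NHWC input
    name = "prelu_1"
)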

Functions

build
open override fun build(tf: Ops, input: Operand<Float>, isTraining: Operand<Boolean>, numberOfLosses: Operand<Float>?): Operand<Float>
open fun build(tf: Ops, input: List<Operand<Float>>, isTraining: Operand<Boolean>, numberOfLosses: Operand<Float>?): Operand<Float>

Extend this function to define variables in the layer and to compute the layer output.

forward
open override fun forward(tf: Ops, input: Operand<Float>): Operand<Float>

Applies the activation function to the input to produce the output.

invoke
operator fun invoke(vararg layers: Layer): Layer

An important part of the functional API. It takes layers as input and saves them to the inboundLayers of the given layer.
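A hedged functional-API sketch: invoke registers the previous layer in this PReLU's inboundLayers and returns the PReLU layer itself, so it can be chained into the next layer. Input, Dense, and Activations are assumed from the same library; the shapes and names are illustrative only.

import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.api.core.layer.activation.PReLU
import org.jetbrains.kotlinx.dl.api.core.layer.core.Dense
import org.jetbrains.kotlinx.dl.api.core.layer.core.Input

val input = Input(784)
val hidden = Dense(outputSize = 64, activation = Activations.Linear)(input)
val activated = PReLU(name = "prelu_hidden")(hidden)  // wires `hidden` into inboundLayers
// `activated` is itself a Layer and can be passed on to further layers,
// e.g. Dense(outputSize = 10)(activated).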

toString
open override fun toString(): String

Properties

alphaInitializer
val alphaInitializer: Initializer

Initializer instance for the alpha weights.

alphaRegularizer
val alphaRegularizer: Regularizer? = null

Regularizer instance for the alpha weights.

hasActivation
open override val hasActivation: Boolean

Returns true if the layer has an internal activation function.

inboundLayers
var inboundLayers: MutableList<Layer>

Returns inbound layers.

isTrainable
open override var isTrainable: Boolean = true

True if the layer's weights can be changed during training. If false, the layer's weights are frozen and cannot be changed during training.

name
var name: String

Layer name. A new name is generated during model compilation when the provided name is empty.

outboundLayers
var outboundLayers: MutableList<Layer>

Returns outbound layers.

outputShape
lateinit var outputShape: TensorShape

Output data tensor shape.

paramCount
open val paramCount: Int

Number of parameters in this layer.

parentModel
var parentModel: GraphTrainableModel? = null

Model where this layer is used.

sharedAxes
val sharedAxes: IntArray? = null

The axes along which to share learnable parameters.
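For example (a hedged sketch mirroring Keras-style shared_axes semantics): with NHWC feature maps of shape (batch, 7, 7, 64), sharing alpha over the spatial axes 1 and 2 leaves one learnable alpha per channel, i.e. 64 parameters instead of 7 * 7 * 64 = 3136.

import org.jetbrains.kotlinx.dl.api.core.layer.activation.PReLU  // assumed package path

val channelwisePReLU = PReLU(sharedAxes = intArrayOf(1, 2))  // one alpha per channel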

variables
open override val variables: List<KVariable>

Variables used in this layer.