Swish

Swish activation function.

Transforms input 'x' according to the formula:

swish(x) = x * sigmoid(x)

It is a smooth, non-monotonic function that consistently matches or outperforms ReLU on deep networks. It is unbounded above and bounded below.

Calls SwishActivation under the hood.
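
A minimal Kotlin sketch of the formula itself, for illustration only. The standalone swish function below is an assumption for demonstration and is not the library's SwishActivation class or its API.

import kotlin.math.exp

// Illustrative sketch: swish(x) = x * sigmoid(x), where sigmoid(x) = 1 / (1 + exp(-x)).
fun swish(x: Double): Double = x * (1.0 / (1.0 + exp(-x)))

fun main() {
    // Swish is unbounded above and bounded below
    // (its minimum is roughly -0.278, reached near x ≈ -1.28).
    listOf(-5.0, -1.0, 0.0, 1.0, 5.0).forEach { x ->
        println("swish(%.1f) = %.4f".format(x, swish(x)))
    }
}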

Properties

name
val name: String
ordinal
val ordinal: Int