Mish

Mish activation function.

Transforms the input 'x' according to the formula:

mish(x) = x * tanh(softplus(x))

It is a smooth, non-monotonic function that consistently matches or outperforms ReLU and Swish on deep networks. It is unbounded above and bounded below, and it also smooths the loss landscape of the network.
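
For reference, a minimal standalone Kotlin sketch of the formula, where softplus(x) = ln(1 + e^x); the functions softplus and mish below are local helpers written for illustration, not part of the library API:

import kotlin.math.exp
import kotlin.math.ln1p
import kotlin.math.tanh

// Numerically stable softplus: ln(1 + e^x); for large x, ln(1 + e^x) ≈ x.
fun softplus(x: Double): Double = if (x > 20.0) x else ln1p(exp(x))

// mish(x) = x * tanh(softplus(x))
fun mish(x: Double): Double = x * tanh(softplus(x))

fun main() {
    // Bounded below (minimum ≈ -0.31), unbounded above, mish(0) = 0.
    listOf(-5.0, -1.0, 0.0, 1.0, 5.0).forEach { x ->
        println("mish(%+.1f) = %+.6f".format(x, mish(x)))
    }
}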

Calls MishActivation under the hood.
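
In practice the activation is selected by passing this enum entry to a layer. A minimal usage sketch, assuming the KotlinDL Sequential/Dense API; the layer sizes are illustrative only:

import org.jetbrains.kotlinx.dl.api.core.Sequential
import org.jetbrains.kotlinx.dl.api.core.activation.Activations
import org.jetbrains.kotlinx.dl.api.core.layer.core.Dense
import org.jetbrains.kotlinx.dl.api.core.layer.core.Input

// Hidden layer uses Mish; input/output dimensions are hypothetical.
val model = Sequential.of(
    Input(784),
    Dense(128, Activations.Mish),
    Dense(10, Activations.Linear)
)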

Properties

name
val name: String
ordinal
val ordinal: Int