Swish
Swish()
Swish activation function.
Transforms input 'x' according to the formula:
swish(x) = x * sigmoid(x)
It is a smooth, non-monotonic function that consistently matches or outperforms ReLU on deep networks. It is unbounded above and bounded below.
Calls SwishActivation under the hood.
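To illustrate the formula above, here is a minimal Kotlin sketch of the element-wise computation. This is a standalone illustration of swish(x) = x * sigmoid(x), not the library's SwishActivation implementation:

import kotlin.math.exp

// Element-wise Swish: x * sigmoid(x), shown here for a single Float value.
fun swish(x: Float): Float {
    val sigmoid = 1.0f / (1.0f + exp(-x))
    return x * sigmoid
}

fun main() {
    // Unbounded above, bounded below: large positive inputs pass through
    // almost unchanged, large negative inputs approach 0.
    println(swish(5.0f))   // ~4.97
    println(swish(0.0f))   // 0.0
    println(swish(-5.0f))  // ~-0.03
}

In a model, the activation is normally selected via the Activations.Swish entry on a layer rather than called directly (exact layer parameter names depend on the layer class used).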