flax.linen.activation.selu
- flax.linen.activation.selu(x)[source]
Scaled exponential linear unit activation.
Computes the element-wise function:
\[\begin{split}\mathrm{selu}(x) = \lambda \begin{cases} x, & x > 0\\ \alpha e^x - \alpha, & x \le 0 \end{cases}\end{split}\]
where \(\lambda = 1.0507009873554804934193349852946\) and \(\alpha = 1.6732632423543772848170429916717\).
For more information, see Self-Normalizing Neural Networks.
- Parameters:
  x (Union[Array, ndarray, bool_, number, bool, int, float, complex]) – input array
- Return type:
  Array
- Returns:
An array.