
Mojo module

activations

This module contains implementations of activation functions.

Functions

  • elu: Compute the Elu Op using the equation z if z >= 0 else alpha*(e^z - 1).
  • leaky_relu: Compute the Leaky ReLU using the equation max(0, x) + negative_slope * min(0, x).
  • relu: Compute the Relu Op using the equation max(0, x).
  • relu_n1: Compute the Relu N1 Op using the equation max(min(x, 1), -1).
  • sign: Compute the sign (0, 1) of the input value.
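
To make the equations above concrete, here is a minimal scalar sketch of three of the listed activations in Mojo. The Float64 signatures and the default values for alpha and negative_slope are illustrative assumptions, not the module's actual function signatures.

```mojo
from math import exp

fn relu(x: Float64) -> Float64:
    # relu: max(0, x)
    return x if x > 0 else 0.0

fn leaky_relu(x: Float64, negative_slope: Float64 = 0.01) -> Float64:
    # leaky_relu: max(0, x) + negative_slope * min(0, x)
    # (the default slope of 0.01 is an assumption for illustration)
    return x if x > 0 else negative_slope * x

fn elu(z: Float64, alpha: Float64 = 1.0) -> Float64:
    # elu: z if z >= 0 else alpha * (e^z - 1)
    return z if z >= 0 else alpha * (exp(z) - 1.0)

fn main():
    print(relu(-2.0))        # 0.0
    print(leaky_relu(-2.0))  # -0.02
    print(elu(-2.0))         # ≈ -0.8647, i.e. 1.0 * (e^-2 - 1)
```

For negative inputs the piecewise branches dominate: relu clamps to zero, leaky_relu scales by the slope, and elu saturates smoothly toward -alpha.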
