
ReLU

Rectified Linear Unit (ReLU) activation function, defined element-wise as ReLU(z) = max(0, z): negative inputs are mapped to zero and non-negative inputs pass through unchanged.
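
A minimal sketch of how this class might look, assuming NumPy arrays; the class and method names follow this page, but the body is illustrative rather than the library's actual implementation:

    import numpy as np

    class ReLU:
        """Rectified Linear Unit: ReLU(z) = max(0, z)."""

        def apply(self, z):
            # Element-wise max(0, z): negative entries become 0,
            # positive entries pass through unchanged.
            return np.maximum(0.0, z)

        def gradient(self, z):
            # Derivative is 1 where z > 0 and 0 elsewhere; the
            # subgradient at z == 0 is taken as 0 by convention.
            return (z > 0).astype(z.dtype)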

Methods

apply

Apply the activation function to a layer output z.

  • z: the layer output to which the activation is applied.
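
As an illustration, applying the sketch above to a small array (values chosen for the example):

    z = np.array([-2.0, 0.0, 3.0])
    ReLU().apply(z)  # array([0., 0., 3.])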

gradient

Return the gradient with respect to a layer output z.

  • z: the layer output at which the gradient is evaluated.
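
Again using the illustrative sketch, the gradient is 1 for strictly positive entries and 0 otherwise (ReLU is not differentiable at exactly 0, where 0 is used by convention):

    z = np.array([-2.0, 0.0, 3.0])
    ReLU().gradient(z)  # array([0., 0., 1.])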