ReLU
====

Rectified Linear Unit (ReLU) activation function.

Methods
-------

``apply(z)``
    Apply the activation function to a layer output ``z``.

``gradient(z)``
    Return the gradient with respect to a layer output ``z``.
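The sketch below illustrates how these two methods typically behave: ``apply`` computes the elementwise ``max(0, z)`` and ``gradient`` its derivative, which is 1 where ``z > 0`` and 0 elsewhere. This is a minimal sketch assuming NumPy array inputs; the method names ``apply`` and ``gradient`` come from the summary above, while the class layout and docstrings are illustrative assumptions, not the library's actual implementation.

.. code-block:: python

    import numpy as np

    class ReLU:
        """Rectified Linear Unit (ReLU) activation function (illustrative sketch)."""

        def apply(self, z):
            """Apply the activation function to a layer output z.

            Computes ReLU(z) = max(0, z) elementwise.
            """
            return np.maximum(0, z)

        def gradient(self, z):
            """Return the gradient with respect to a layer output z.

            The derivative is 1 where z > 0 and 0 elsewhere; the
            subgradient 0 is used at z = 0.
            """
            return (z > 0).astype(z.dtype)

A quick usage check under these assumptions:

.. code-block:: python

    act = ReLU()
    z = np.array([-2.0, 0.0, 3.0])
    act.apply(z)     # array([0., 0., 3.])
    act.gradient(z)  # array([0., 0., 1.])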