
Primary Definition

f(x) = \max(0, x)

Piecewise Definition

f(x) = \begin{cases} 0 & \text{if } x \leq 0 \\ x & \text{if } x > 0 \end{cases}

ReLU (Rectified Linear Unit) is a non-linear activation function that outputs its input directly when the input is positive and zero otherwise.
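
The piecewise definition translates directly into code. Below is a minimal hand-rolled sketch (the name relu_manual is just for illustration); in practice you would use the built-in PyTorch calls shown next.

import torch

def relu_manual(x):
    # Piecewise rule applied element-wise: keep positive entries, zero out the rest
    return torch.where(x > 0, x, torch.zeros_like(x))

relu_manual(torch.tensor([-2., -1., 0., 1., 2.]))  # tensor([0., 0., 0., 1., 2.])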

import torch

# Using PyTorch's built-in ReLU
x = torch.tensor([[-2.0, -1.0, 0.0, 1.0, 2.0]])
output = torch.relu(x)  # tensor([[0., 0., 0., 1., 2.]])

# Or the functional API
output = torch.nn.functional.relu(x)
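
In a model, ReLU is typically inserted as a layer between linear (or convolutional) layers. A minimal sketch, with illustrative layer sizes:

import torch
import torch.nn as nn

# Linear -> ReLU -> Linear; the 4/8/1 sizes are arbitrary examples
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 1),
)
y = model(torch.randn(2, 4))  # batch of 2 samples passes through the ReLU non-linearity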