Primary Definition
ReLU (Rectified Linear Unit) is a non-linear activation function that outputs the input directly if it is positive, and zero otherwise.

Piecewise Definition
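Written as a piecewise function (equivalently, a maximum with zero), the definition above is:

$$
\mathrm{ReLU}(x) = \begin{cases} x, & x > 0 \\ 0, & x \le 0 \end{cases} = \max(0, x)
$$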
import torch

# Using PyTorch's built-in ReLU on a tensor
x = torch.tensor([-2.0, -1.0, 0.0, 1.0, 2.0])
output = torch.relu(x)                    # tensor([0., 0., 0., 1., 2.])

# Or the functional API (equivalent result)
output = torch.nn.functional.relu(x)
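In practice, ReLU is usually applied as a layer inside a model rather than called directly on a tensor. A minimal sketch using torch.nn.ReLU between two linear layers (the layer sizes 4, 8, and 2 are arbitrary placeholders for illustration):

import torch
import torch.nn as nn

# ReLU as a layer inside a small feed-forward model
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),          # applies max(0, x) element-wise to the hidden activations
    nn.Linear(8, 2),
)

x = torch.randn(1, 4)   # one random input example
y = model(x)
print(y.shape)          # torch.Size([1, 2])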