Example of the ReLU (Rectified Linear Unit) Activation Function

The ReLU function is defined as:

f(x) = max(0, x)

This means:

  • If x is positive, the output equals x (it passes through unchanged).
  • If x is zero or negative, the output is 0.
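The rule above can be sketched in a few lines of Python (the function name `relu` is our choice here, not a standard library function):

```python
def relu(x):
    """Rectified Linear Unit: return x if positive, otherwise 0."""
    return max(0, x)

print(relu(-3))  # 0
print(relu(5))   # 5
```

Because it is just a comparison against zero, there is no exponent or division to compute, which is part of why ReLU is so cheap to evaluate.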

Real Number Examples

Input (x)    ReLU Output f(x)
   -3               0
   -1               0
    0               0
    2               2
    5               5

In this table:

  • Negative numbers become 0 🚫
  • Positive numbers pass through ✅

This makes ReLU very fast and useful for deep learning models! 🤖✨
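In practice, deep learning frameworks apply ReLU to whole arrays at once. A minimal vectorized sketch using NumPy (assuming NumPy is installed) reproduces the table above in one call:

```python
import numpy as np

def relu(x):
    # Elementwise max against 0: negatives become 0, positives pass through
    return np.maximum(0, x)

x = np.array([-3, -1, 0, 2, 5])
print(relu(x))  # [0 0 0 2 5]
```

`np.maximum` broadcasts the scalar 0 across the array, so the same one-line definition works for vectors, matrices, or full network activations.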