Example of ReLU Activation Function


ReLU (Rectified Linear Unit) Example

The ReLU function is defined as:

f(x) = max(0, x)

This means:

  • If x is positive, it stays the same.
  • If x is negative, it becomes 0.
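As a quick illustration, here is a minimal Python sketch of this piecewise definition (the name "relu" is just a placeholder for this example):

    def relu(x: float) -> float:
        """Rectified Linear Unit: pass positive inputs through, clip negatives to 0."""
        return max(0.0, x)

For example, relu(2.0) returns 2.0, while relu(-3.0) returns 0.0.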

Real Number Examples

  Input (x) | ReLU Output f(x)
  ----------+-----------------
         -3 | 0
         -1 | 0
          0 | 0
          2 | 2
          5 | 5

In this table:

  • Negative numbers become 0 🚫
  • Positive numbers pass through ✅
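
The table can be checked directly in code. This sketch applies the same max(0, x) rule to each input from the table, and also shows the vectorized NumPy equivalent, np.maximum:

    import numpy as np

    inputs = [-3, -1, 0, 2, 5]

    # Scalar rule, one value at a time (same as the relu sketch above).
    for x in inputs:
        print(x, "->", max(0, x))           # -3 -> 0, -1 -> 0, 0 -> 0, 2 -> 2, 5 -> 5

    # Vectorized equivalent for whole arrays.
    print(np.maximum(0, np.array(inputs)))  # [0 0 0 2 5]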

Because ReLU requires only a single comparison per value, it is very cheap to compute, and its output does not saturate for positive inputs, which helps gradients flow during training. These properties make it a popular default activation in deep learning models. 🤖✨
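
For context, here is a sketch of how ReLU typically appears inside a network: a single hypothetical dense layer in NumPy, where the weights, bias, and input are made up purely for illustration:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical dense layer: 4 inputs -> 3 units, followed by ReLU.
    W = rng.normal(size=(3, 4))  # made-up weight matrix (illustration only)
    b = np.zeros(3)              # bias vector
    x = rng.normal(size=4)       # made-up input vector

    z = W @ x + b                # linear part; may contain negative values
    a = np.maximum(0.0, z)       # ReLU: negatives clipped to 0

    print(z)
    print(a)

The activation itself is a single element-wise comparison, so it adds almost no cost on top of the matrix multiply, which is where the "fast" claim above comes from.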