Example of ReLU Activation Function
ReLU (Rectified Linear Unit) Example
The ReLU function is defined as:

f(x) = max(0, x)
This means:
- If x is positive, it stays the same.
- If x is negative, it becomes 0.
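
As a minimal sketch (not part of the original page), this definition translates directly into a one-line Python function:

```python
def relu(x: float) -> float:
    """Rectified Linear Unit: returns x if x > 0, otherwise 0."""
    return max(0.0, x)

# A few sample values
print(relu(-3))  # 0.0 (negative input is clipped to zero)
print(relu(2))   # 2   (positive input passes through unchanged)
```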
Real Number Examples
| Input (x) | ReLU Output f(x) |
|-----------|------------------|
| -3        | 0                |
| -1        | 0                |
| 0         | 0                |
| 2         | 2                |
| 5         | 5                |
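
As a quick check of the table (using NumPy, which is an assumption and not mentioned on the original page), the whole column can be computed with one vectorized call:

```python
import numpy as np

x = np.array([-3, -1, 0, 2, 5])
relu_out = np.maximum(0, x)  # element-wise max with zero
print(relu_out)  # [0 0 0 2 5] -- matches the table above
```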
In this table:
- Negative numbers become 0 🚫
- Positive numbers pass through ✅
Because evaluating ReLU only requires a single comparison with zero, it is very cheap to compute, which makes it a popular activation function in deep learning models! 🤖✨