User contributions for Thakshashila

A user with 264 edits. Account created on 11 April 2025.

12 June 2025

  • 11:50, 12 June 2025 (+2,653) Chemical Thermodynamics: Created page with "== Chemical Thermodynamics == '''Chemical Thermodynamics''' is the branch of thermodynamics that studies the interrelation of heat and work with chemical reactions or physical changes of state within chemical systems. It provides the framework to predict whether a reaction will occur spontaneously and to what extent it proceeds. === Basic Concepts === Chemical thermodynamics deals with the energy changes and equilibrium conditions in chemical reactions, focusing on va..." (current)
  • 11:49, 12 June 2025 (+2,526) Entropy: Created page with "== Entropy == '''Entropy''' (symbol <math>S</math>) is a fundamental thermodynamic property that measures the degree of disorder or randomness in a system. It quantifies the number of microscopic configurations that correspond to a thermodynamic system's macroscopic state. === Definition === Entropy is related to the number of possible microstates (<math>\Omega</math>) by the Boltzmann equation: <math> S = k_B \ln \Omega </math> where: * <math>S</math> = entropy..." (current)
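The Boltzmann relation quoted in this summary, S = k_B ln Ω, can be sketched in a few lines of Python (the function name is mine, not from the page):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact value in the 2019 SI)

def boltzmann_entropy(omega: float) -> float:
    """Entropy S = k_B * ln(Omega) for a system with Omega microstates."""
    return K_B * math.log(omega)

# Doubling the number of microstates adds exactly k_B * ln(2) of entropy,
# regardless of the starting value of Omega.
delta = boltzmann_entropy(2e20) - boltzmann_entropy(1e20)
```

Because only the ratio of microstate counts matters to the difference, `delta` equals k_B ln 2 no matter which Ω you start from.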
  • 11:49, 12 June 2025 (+2,990) Phase Equilibrium: Created page with "== Phase Equilibrium == '''Phase equilibrium''' refers to the condition where multiple phases of a substance coexist in equilibrium without any net change in their amounts over time. It occurs when the chemical potential of each component is the same in all coexisting phases, ensuring no driving force for phase change. === Basics === In a system involving different phases (solid, liquid, gas), phase equilibrium is established when the rates of phase transitions (such a..." (current)
  • 11:48, 12 June 2025 (+2,941) Thermodynamic Potential: Created page with "== Thermodynamic Potential == '''Thermodynamic potentials''' are scalar quantities used in thermodynamics to describe the equilibrium and spontaneous behavior of physical systems. They are functions of state variables such as temperature, pressure, volume, and entropy, and provide criteria for spontaneous processes and equilibrium under different constraints. === Overview === Thermodynamic potentials combine the system's internal energy with other thermodynamic paramet..." (current)
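The summary is truncated before any formulas appear, but assuming the standard textbook definitions (H = U + pV, F = U − TS, G = H − TS), the common potentials can be written out as a small sketch (function names are mine):

```python
def enthalpy(u: float, p: float, v: float) -> float:
    """H = U + pV: natural under constant pressure."""
    return u + p * v

def helmholtz(u: float, t: float, s: float) -> float:
    """F = U - TS: natural under constant temperature and volume."""
    return u - t * s

def gibbs(u: float, p: float, v: float, t: float, s: float) -> float:
    """G = H - TS = U + pV - TS: natural under constant T and p."""
    return enthalpy(u, p, v) - t * s
```

Each potential is minimized at equilibrium under its own set of held-fixed variables, which is why different potentials give the spontaneity criterion under different constraints.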
  • 11:47, 12 June 2025 (+2,466) Gibbs Free Energy: Created page with "== Gibbs Free Energy == '''Gibbs Free Energy''' (denoted as <math>G</math>) is a thermodynamic potential that measures the maximum reversible work a thermodynamic system can perform at constant temperature and pressure. It is an important concept in chemistry and physics, used to predict the spontaneity of chemical reactions and phase changes. === Definition === Gibbs Free Energy is defined as: <math>G = H - TS</math> where: * <math>G</math> = Gibbs free energy *..." (current)
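The spontaneity criterion implied by G = H − TS (a process is spontaneous at constant T and p when ΔG < 0) can be illustrated with a short sketch; the melting-of-ice numbers below are approximate textbook values, not from the page:

```python
def gibbs_free_energy_change(delta_h: float, temperature: float, delta_s: float) -> float:
    """Delta G = Delta H - T * Delta S.

    Energies in J/mol, temperature in K, entropy in J/(mol*K).
    Negative result => spontaneous at constant T and p.
    """
    return delta_h - temperature * delta_s

# Melting ice: Delta H ~ +6010 J/mol, Delta S ~ +22.0 J/(mol*K) (approximate).
dg_cold = gibbs_free_energy_change(6010.0, 263.15, 22.0)  # -10 degC
dg_warm = gibbs_free_energy_change(6010.0, 283.15, 22.0)  # +10 degC
```

Below 0 °C the TΔS term cannot overcome ΔH, so ΔG > 0 and ice does not melt; above 0 °C the sign flips and melting is spontaneous, matching everyday experience.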

11 June 2025

  • 11:44, 11 June 2025 (+3,251) Convolutional Neural Network: Created page with "== Convolutional Neural Networks (CNNs) == A '''Convolutional Neural Network (CNN)''' is a type of deep learning model specially designed for working with '''image data''' 📷. CNNs are widely used in computer vision tasks like image classification, object detection, and face recognition. === 🧠 Why CNNs for Images? === Images are large (millions of pixels), and fully connected neural networks don't scale well with size. CNNs solve this by using convolution operati..." (current)
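The convolution operation this summary refers to can be sketched in pure Python: a small "valid" 2D convolution (strictly, cross-correlation, as most deep-learning libraries implement it). This is an illustrative sketch, not the page's own code:

```python
def conv2d_valid(image, kernel):
    """Slide `kernel` over `image` (lists of lists), no padding, stride 1.

    Output size is (H - kh + 1) x (W - kw + 1); each output cell is the
    elementwise product-sum of the kernel with the patch under it.
    """
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(ow)
        ]
        for i in range(oh)
    ]

# A 3x3 input with a 2x2 averaging kernel yields a 2x2 feature map.
feature_map = conv2d_valid(
    [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]],
    [[0.25, 0.25],
     [0.25, 0.25]],
)
```

The same small kernel is reused at every position, which is why a CNN needs far fewer parameters than a fully connected layer over millions of pixels.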
  • 11:12, 11 June 2025 (+3,012) Backpropagation: Created page with "== Backpropagation == '''Backpropagation''' (short for "backward propagation of errors") is a fundamental algorithm used to train neural networks. It calculates how much each weight in the network contributed to the total error and updates them to reduce this error. === 🧠 Purpose === The main goal of backpropagation is to: * Minimize the '''loss function''' (error) 📉 * Improve model accuracy over time by adjusting weights 🔧 === 🔁 How It Works (Step-by-Ste..." (current)
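The forward-then-backward weight update this summary describes can be sketched for a single sigmoid neuron on one training pair (a minimal illustration with made-up numbers; real networks stack many layers and use a framework):

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

x, target = 1.5, 0.0       # one input and its desired output
w, b, lr = 0.8, 0.1, 0.5   # initial weight, bias, learning rate

for _ in range(200):
    # Forward pass: prediction and squared-error loss.
    z = w * x + b
    y = sigmoid(z)
    loss = 0.5 * (y - target) ** 2
    # Backward pass (chain rule): dL/dz = (y - t) * sigmoid'(z),
    # where sigmoid'(z) = y * (1 - y).
    dz = (y - target) * y * (1.0 - y)
    # Update each parameter against its gradient.
    w -= lr * dz * x   # dL/dw = dz * x
    b -= lr * dz       # dL/db = dz
```

After repeated passes the loss shrinks toward zero, which is exactly the "adjust weights to reduce error" loop the page outlines.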
  • 10:09, 11 June 2025 (0) Exploding Gradient Problem: →📎 See Also (current)
  • 10:09, 11 June 2025 (+2,974) Exploding Gradient Problem: Created page with "== Exploding Gradient Problem == The '''Exploding Gradient Problem''' is a common issue in training deep neural networks where the gradients grow too large during backpropagation. This leads to very large weight updates, making the model unstable or completely unusable. === 📈 What Are Gradients? === Gradients are computed during the backpropagation step of training. They help the model understand how to change its weights to reduce error. :<math> \text{Gradient} =..."
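A standard mitigation for exploding gradients, gradient norm clipping, can be sketched as follows (the truncated summary does not show it, but it is the usual companion to this topic; the helper is a standalone illustration, not a framework API):

```python
import math

def clip_by_norm(grads: list[float], max_norm: float) -> list[float]:
    """If the gradient vector's L2 norm exceeds max_norm, rescale it
    so its norm equals max_norm; otherwise return it unchanged.
    Rescaling preserves the gradient's direction."""
    norm = math.sqrt(sum(g * g for g in grads))
    if norm <= max_norm:
        return grads
    scale = max_norm / norm
    return [g * scale for g in grads]

# An exploding gradient of norm 500 is rescaled down to norm 5.
clipped = clip_by_norm([300.0, 400.0], max_norm=5.0)
```

Clipping bounds the size of any single weight update without changing its direction, which keeps training stable even when a few batches produce huge gradients.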
  • 10:06, 11 June 2025 (+2,942) Vanishing gradient problem: Created page with "== Vanishing Gradient Problem == The '''Vanishing Gradient Problem''' is a common issue encountered during the training of deep neural networks. It occurs when the gradients (used to update weights) become extremely small, effectively preventing the network from learning. === 🧠 What is a Gradient? === In neural networks, gradients are values calculated during '''backpropagation'''. They show how much the model's weights should change to reduce the loss (error). The..." (current)
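Why the gradients shrink can be shown with one line of arithmetic: the sigmoid's derivative never exceeds 0.25, and backpropagation multiplies one such factor per layer, so the gradient decays geometrically with depth (a worst-case illustration, not the page's own example):

```python
# The derivative of sigmoid(z) is sigmoid(z) * (1 - sigmoid(z)),
# which peaks at 0.25 when z = 0. In a 20-layer sigmoid network,
# backpropagation multiplies up to 20 such factors together.
grad = 1.0
for _ in range(20):
    grad *= 0.25  # one <= 0.25 factor contributed by each layer

# grad is now 0.25**20, about 9e-13: far too small to drive learning
# in the early layers.
```

This is why remedies such as ReLU activations (derivative 1 for positive inputs) and residual connections are used in deep networks.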
  • 09:06, 11 June 2025 (+519) Example of ReLU Activation Function: Created page with "== ReLU (Rectified Linear Unit) Example == The ReLU function is defined as: :<math>f(x) = \max(0, x)</math> This means: * If ''x'' is '''positive''', it stays the same. * If ''x'' is '''negative''', it becomes ''0''. === Real Number Examples === {| class="wikitable" ! Input (x) ! ReLU Output f(x) |- | -3 | 0 |- | -1 | 0 |- | 0 | 0 |- | 2 | 2 |- | 5 | 5 |} In this table: * Negative numbers become 0 🚫 * Positive numbers pass through ✅ This makes ReLU very fast..." (current)
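The definition f(x) = max(0, x) quoted in this summary is a one-liner in Python, reproducing the page's table of examples:

```python
def relu(x: float) -> float:
    """Rectified Linear Unit: f(x) = max(0, x)."""
    return max(0.0, x)

# The same inputs as the wikitable in the entry: -3, -1, 0, 2, 5.
outputs = [relu(x) for x in (-3, -1, 0, 2, 5)]
```

Negative inputs are zeroed out and non-negative inputs pass through unchanged; the simplicity of this comparison is what makes ReLU cheap to compute.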

10 June 2025
