MediaWiki API result
This is the HTML representation of the JSON format. HTML is good for debugging, but is unsuitable for application use.
Specify the format parameter to change the output format. To see the non-HTML representation of the JSON format, set format=json.
See the complete documentation, or the API help for more information.
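The same result can be requested in machine-readable form by setting format=json, as described above. A minimal sketch of building such a request (the example.org endpoint is a placeholder, and letype=create is an assumption based on the all-"create" events in this result, not a parameter confirmed by it):

```python
from urllib.parse import urlencode

# Placeholder endpoint; substitute the api.php URL of the actual wiki.
API_URL = "https://example.org/w/api.php"

params = {
    "action": "query",
    "list": "logevents",
    "letype": "create",   # assumption: filter to page-creation events, matching this result
    "format": "json",     # the non-HTML representation, per the note above
}

# Build the request URL; a real client would then fetch it with
# urllib.request.urlopen(url) or requests.get(API_URL, params=params).
url = API_URL + "?" + urlencode(params)
print(url)
# https://example.org/w/api.php?action=query&list=logevents&letype=create&format=json
```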
{
    "batchcomplete": "",
    "continue": {
        "lecontinue": "20250610063526|122",
        "continue": "-||"
    },
    "query": {
        "logevents": [
            {
                "logid": 132,
                "ns": 0,
                "title": "Chemical Thermodynamics",
                "pageid": 127,
                "logpage": 127,
                "revid": 265,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-12T11:50:21Z",
                "comment": "Created page with \"== Chemical Thermodynamics == '''Chemical Thermodynamics''' is the branch of thermodynamics that studies the interrelation of heat and work with chemical reactions or physical changes of state within chemical systems. It provides the framework to predict whether a reaction will occur spontaneously and to what extent it proceeds. === Basic Concepts === Chemical thermodynamics deals with the energy changes and equilibrium conditions in chemical reactions, focusing on va...\""
            },
            {
                "logid": 131,
                "ns": 0,
                "title": "Entropy",
                "pageid": 126,
                "logpage": 126,
                "revid": 264,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-12T11:49:49Z",
                "comment": "Created page with \"== Entropy == '''Entropy''' (symbol <math>S</math>) is a fundamental thermodynamic property that measures the degree of disorder or randomness in a system. It quantifies the number of microscopic configurations that correspond to a thermodynamic system's macroscopic state. === Definition === Entropy is related to the number of possible microstates (<math>\\Omega</math>) by the Boltzmann equation: <math> S = k_B \\ln \\Omega </math> where: * <math>S</math> = entropy...\""
            },
            {
                "logid": 130,
                "ns": 0,
                "title": "Phase Equilibrium",
                "pageid": 125,
                "logpage": 125,
                "revid": 263,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-12T11:49:19Z",
                "comment": "Created page with \"== Phase Equilibrium == '''Phase equilibrium''' refers to the condition where multiple phases of a substance coexist in equilibrium without any net change in their amounts over time. It occurs when the chemical potential of each component is the same in all coexisting phases, ensuring no driving force for phase change. === Basics === In a system involving different phases (solid, liquid, gas), phase equilibrium is established when the rates of phase transitions (such a...\""
            },
            {
                "logid": 129,
                "ns": 0,
                "title": "Thermodynamic Potential",
                "pageid": 124,
                "logpage": 124,
                "revid": 262,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-12T11:48:33Z",
                "comment": "Created page with \"== Thermodynamic Potential == '''Thermodynamic potentials''' are scalar quantities used in thermodynamics to describe the equilibrium and spontaneous behavior of physical systems. They are functions of state variables such as temperature, pressure, volume, and entropy, and provide criteria for spontaneous processes and equilibrium under different constraints. === Overview === Thermodynamic potentials combine the system's internal energy with other thermodynamic paramet...\""
            },
            {
                "logid": 128,
                "ns": 0,
                "title": "Gibbs Free Energy",
                "pageid": 123,
                "logpage": 123,
                "revid": 261,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-12T11:47:35Z",
                "comment": "Created page with \"== Gibbs Free Energy == '''Gibbs Free Energy''' (denoted as <math>G</math>) is a thermodynamic potential that measures the maximum reversible work a thermodynamic system can perform at constant temperature and pressure. It is an important concept in chemistry and physics, used to predict the spontaneity of chemical reactions and phase changes. === Definition === Gibbs Free Energy is defined as: <math>G = H - TS</math> where: * <math>G</math> = Gibbs free energy *...\""
            },
            {
                "logid": 127,
                "ns": 0,
                "title": "Convolutional Neural Network",
                "pageid": 122,
                "logpage": 122,
                "revid": 260,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-11T11:44:37Z",
                "comment": "Created page with \"== Convolutional Neural Networks (CNNs) == A '''Convolutional Neural Network (CNN)''' is a type of deep learning model specially designed for working with '''image data''' \ud83d\udcf7. CNNs are widely used in computer vision tasks like image classification, object detection, and face recognition. === \ud83e\udde0 Why CNNs for Images? === Images are large (millions of pixels), and fully connected neural networks don't scale well with size. CNNs solve this by using convolution operati...\""
            },
            {
                "logid": 126,
                "ns": 0,
                "title": "Backpropagation",
                "pageid": 121,
                "logpage": 121,
                "revid": 259,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-11T11:12:10Z",
                "comment": "Created page with \"== Backpropagation == '''Backpropagation''' (short for \"backward propagation of errors\") is a fundamental algorithm used to train neural networks. It calculates how much each weight in the network contributed to the total error and updates them to reduce this error. === \ud83e\udde0 Purpose === The main goal of backpropagation is to: * Minimize the '''loss function''' (error) \ud83d\udcc9 * Improve model accuracy over time by adjusting weights \ud83d\udd27 === \ud83d\udd01 How It Works (Step-by-Ste...\""
            },
            {
                "logid": 125,
                "ns": 0,
                "title": "Exploding Gradient Problem",
                "pageid": 120,
                "logpage": 120,
                "revid": 257,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-11T10:09:11Z",
                "comment": "Created page with \"== Exploding Gradient Problem == The '''Exploding Gradient Problem''' is a common issue in training deep neural networks where the gradients grow too large during backpropagation. This leads to very large weight updates, making the model unstable or completely unusable. === \ud83d\udcc8 What Are Gradients? === Gradients are computed during the backpropagation step of training. They help the model understand how to change its weights to reduce error. :<math> \\text{Gradient} =...\""
            },
            {
                "logid": 124,
                "ns": 0,
                "title": "Vanishing gradient problem",
                "pageid": 119,
                "logpage": 119,
                "revid": 256,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-11T10:06:54Z",
                "comment": "Created page with \"== Vanishing Gradient Problem == The '''Vanishing Gradient Problem''' is a common issue encountered during the training of deep neural networks. It occurs when the gradients (used to update weights) become extremely small, effectively preventing the network from learning. === \ud83e\udde0 What is a Gradient? === In neural networks, gradients are values calculated during '''backpropagation'''. They show how much the model's weights should change to reduce the loss (error). The...\""
            },
            {
                "logid": 123,
                "ns": 0,
                "title": "Example of ReLU Activation Function",
                "pageid": 118,
                "logpage": 118,
                "revid": 255,
                "params": {},
                "type": "create",
                "action": "create",
                "user": "Thakshashila",
                "timestamp": "2025-06-11T09:06:03Z",
                "comment": "Created page with \"== ReLU (Rectified Linear Unit) Example == The ReLU function is defined as: :<math>f(x) = \\max(0, x)</math> This means: * If ''x'' is '''positive''', it stays the same. * If ''x'' is '''negative''', it becomes ''0''. === Real Number Examples === {| class=\"wikitable\" ! Input (x) ! ReLU Output f(x) |- | -3 | 0 |- | -1 | 0 |- | 0 | 0 |- | 2 | 2 |- | 5 | 5 |} In this table: * Negative numbers become 0 \ud83d\udeab * Positive numbers pass through \u2705 This makes ReLU very fast...\""
            }
        ]
    }
}
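The continue object at the top of this result ("lecontinue": "20250610063526|122", "continue": "-||") indicates that more log events are available: a client should pass those key/value pairs back verbatim with the next request and repeat until the response contains no continue member. A minimal sketch of that loop, with fetch as a stand-in for the actual HTTP call (here it simulates two pages so the example is self-contained):

```python
def fetch(params):
    """Stand-in for an HTTP GET against api.php.

    A real client would do requests.get(API_URL, params=params).json().
    This stub simulates two result pages."""
    if "lecontinue" not in params:
        # First page: two events plus a continuation marker.
        return {
            "continue": {"lecontinue": "20250610063526|122", "continue": "-||"},
            "query": {"logevents": [{"logid": 132}, {"logid": 131}]},
        }
    # Final page: no "continue" member means the listing is complete.
    return {"query": {"logevents": [{"logid": 122}]}}

def all_logevents():
    params = {"action": "query", "list": "logevents", "format": "json"}
    events = []
    while True:
        result = fetch(params)
        events.extend(result["query"]["logevents"])
        if "continue" not in result:
            break
        # Merge the continuation values back into the next request verbatim.
        params.update(result["continue"])
    return events

events = all_logevents()
print(len(events))  # 3 events collected across the two simulated pages
```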