Main public logs
Combined display of all available logs of Qbase. You can narrow down the view by selecting a log type, the username (case-sensitive), or the affected page (also case-sensitive).
- 08:31, 28 March 2026 Thakshashila talk contribs created page Quantum Mechanics (Created page with "= Quantum Mechanics = '''Quantum Mechanics''' is a fundamental branch of physics that studies matter and energy at the smallest scales, typically atomic and subatomic levels. It departs from classical mechanics by introducing principles such as wave-particle duality, uncertainty, and quantum entanglement, which challenge our classical intuitions about reality. == History == The development of quantum mechanics began in the early 20th century, driven by the need to expl...")
- 11:50, 12 June 2025 Thakshashila talk contribs created page Chemical Thermodynamics (Created page with "== Chemical Thermodynamics == '''Chemical Thermodynamics''' is the branch of thermodynamics that studies the interrelation of heat and work with chemical reactions or physical changes of state within chemical systems. It provides the framework to predict whether a reaction will occur spontaneously and to what extent it proceeds. === Basic Concepts === Chemical thermodynamics deals with the energy changes and equilibrium conditions in chemical reactions, focusing on va...")
- 11:49, 12 June 2025 Thakshashila talk contribs created page Entropy (Created page with "== Entropy == '''Entropy''' (symbol <math>S</math>) is a fundamental thermodynamic property that measures the degree of disorder or randomness in a system. It quantifies the number of microscopic configurations that correspond to a thermodynamic system's macroscopic state. === Definition === Entropy is related to the number of possible microstates (<math>\Omega</math>) by the Boltzmann equation: <math> S = k_B \ln \Omega </math> where: * <math>S</math> = entropy...")
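The Boltzmann relation quoted in the Entropy entry, S = k_B ln Ω, can be checked numerically. A minimal Python sketch; the microstate count Ω = 2^N for N independent two-state particles is an illustrative assumption, not part of the entry:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega): entropy from the number of microstates Omega."""
    return K_B * math.log(omega)

# Illustrative assumption: N independent two-state particles have Omega = 2^N,
# so S = N * k_B * ln(2) and entropy scales linearly with system size.
n = 100
s = boltzmann_entropy(2 ** n)
```

A single microstate (Ω = 1) gives S = 0, matching the intuition that a perfectly ordered state has zero entropy.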
- 11:49, 12 June 2025 Thakshashila talk contribs created page Phase Equilibrium (Created page with "== Phase Equilibrium == '''Phase equilibrium''' refers to the condition where multiple phases of a substance coexist in equilibrium without any net change in their amounts over time. It occurs when the chemical potential of each component is the same in all coexisting phases, ensuring no driving force for phase change. === Basics === In a system involving different phases (solid, liquid, gas), phase equilibrium is established when the rates of phase transitions (such a...")
- 11:48, 12 June 2025 Thakshashila talk contribs created page Thermodynamic Potential (Created page with "== Thermodynamic Potential == '''Thermodynamic potentials''' are scalar quantities used in thermodynamics to describe the equilibrium and spontaneous behavior of physical systems. They are functions of state variables such as temperature, pressure, volume, and entropy, and provide criteria for spontaneous processes and equilibrium under different constraints. === Overview === Thermodynamic potentials combine the system's internal energy with other thermodynamic paramet...")
- 11:47, 12 June 2025 Thakshashila talk contribs created page Gibbs Free Energy (Created page with "== Gibbs Free Energy == '''Gibbs Free Energy''' (denoted as <math>G</math>) is a thermodynamic potential that measures the maximum reversible work a thermodynamic system can perform at constant temperature and pressure. It is an important concept in chemistry and physics, used to predict the spontaneity of chemical reactions and phase changes. === Definition === Gibbs Free Energy is defined as: <math>G = H - TS</math> where: * <math>G</math> = Gibbs free energy *...")
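The definition quoted in the Gibbs Free Energy entry, G = H - TS, and the spontaneity criterion ΔG < 0 can be sketched in a few lines. The numbers below are made up for illustration only:

```python
def gibbs_free_energy(h, t, s):
    """G = H - T*S, in consistent units (e.g. J/mol, K, J/(mol*K))."""
    return h - t * s

def is_spontaneous(delta_h, t, delta_s):
    """At constant T and P, a process is spontaneous when delta G < 0."""
    return gibbs_free_energy(delta_h, t, delta_s) < 0

# Illustrative (made-up) numbers: an endothermic change (delta H > 0) becomes
# spontaneous above T = delta_H / delta_S = 250 K, where the -T*S term wins.
delta_h = 50_000.0  # J/mol
delta_s = 200.0     # J/(mol*K)
```

Evaluating `is_spontaneous` below and above 250 K shows the crossover from non-spontaneous to spontaneous.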
- 11:44, 11 June 2025 Thakshashila talk contribs created page Convolutional Neural Network (Created page with "== Convolutional Neural Networks (CNNs) == A '''Convolutional Neural Network (CNN)''' is a type of deep learning model specially designed for working with '''image data''' 📷. CNNs are widely used in computer vision tasks like image classification, object detection, and face recognition. === 🧠 Why CNNs for Images? === Images are large (millions of pixels), and fully connected neural networks don't scale well with size. CNNs solve this by using convolution operati...")
- 11:12, 11 June 2025 Thakshashila talk contribs created page Backpropagation (Created page with "== Backpropagation == '''Backpropagation''' (short for "backward propagation of errors") is a fundamental algorithm used to train neural networks. It calculates how much each weight in the network contributed to the total error and updates them to reduce this error. === 🧠 Purpose === The main goal of backpropagation is to: * Minimize the '''loss function''' (error) 📉 * Improve model accuracy over time by adjusting weights 🔧 === 🔁 How It Works (Step-by-Ste...")
- 10:09, 11 June 2025 Thakshashila talk contribs created page Exploding Gradient Problem (Created page with "== Exploding Gradient Problem == The '''Exploding Gradient Problem''' is a common issue in training deep neural networks where the gradients grow too large during backpropagation. This leads to very large weight updates, making the model unstable or completely unusable. === 📈 What Are Gradients? === Gradients are computed during the backpropagation step of training. They help the model understand how to change its weights to reduce error. :<math> \text{Gradient} =...")
- 10:06, 11 June 2025 Thakshashila talk contribs created page Vanishing gradient problem (Created page with "== Vanishing Gradient Problem == The '''Vanishing Gradient Problem''' is a common issue encountered during the training of deep neural networks. It occurs when the gradients (used to update weights) become extremely small, effectively preventing the network from learning. === 🧠 What is a Gradient? === In neural networks, gradients are values calculated during '''backpropagation'''. They show how much the model's weights should change to reduce the loss (error). The...")
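The exploding- and vanishing-gradient entries above both come down to the same chain-rule effect: the gradient reaching early layers is roughly a product of one factor per layer. A minimal sketch (the per-layer factors 0.25 and 1.5 are illustrative assumptions; 0.25 is the maximum of the sigmoid derivative):

```python
def layer_gradient_product(per_layer_factor, depth):
    """Chain-rule sketch: the gradient reaching the first layer is roughly a
    product of one factor per layer, so it shrinks or grows geometrically."""
    return per_layer_factor ** depth

# Vanishing: the sigmoid derivative is at most 0.25, so 20 layers attenuate the
# signal by 0.25**20 (about 1e-12). Exploding: factors of 1.5 grow by 1.5**20.
vanishing = layer_gradient_product(0.25, 20)
exploding = layer_gradient_product(1.5, 20)
```

Either extreme makes training unstable: vanishing products stall the early layers, exploding ones produce huge weight updates.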
- 09:06, 11 June 2025 Thakshashila talk contribs created page Example of ReLU Activation Function (Created page with "== ReLU (Rectified Linear Unit) Example == The ReLU function is defined as: :<math>f(x) = \max(0, x)</math> This means: * If ''x'' is '''positive''', it stays the same. * If ''x'' is '''negative''', it becomes ''0''. === Real Number Examples === {| class="wikitable" ! Input (x) ! ReLU Output f(x) |- | -3 | 0 |- | -1 | 0 |- | 0 | 0 |- | 2 | 2 |- | 5 | 5 |} In this table: * Negative numbers become 0 🚫 * Positive numbers pass through ✅ This makes ReLU very fast...")
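The ReLU definition and example table quoted in this entry translate directly to one line of Python:

```python
def relu(x):
    """ReLU: f(x) = max(0, x). Negative inputs map to 0, positives pass through."""
    return max(0, x)

# Reproduce the table from the entry: inputs -3, -1, 0, 2, 5 map to 0, 0, 0, 2, 5.
outputs = [relu(x) for x in [-3, -1, 0, 2, 5]]
```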
- 06:35, 10 June 2025 Thakshashila talk contribs created page Gradient Descent (Created page with "= Gradient Descent = '''Gradient Descent''' is an optimization algorithm used in machine learning and deep learning to minimize the cost (loss) function by iteratively updating model parameters in the direction of steepest descent, i.e., the negative gradient. == What is Gradient Descent? == Gradient Descent helps find the best-fit parameters (like weights in a neural network or coefficients in regression) that minimize the error between predicted and actual values. I...")
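The update rule described in the Gradient Descent entry (step in the direction of the negative gradient) can be sketched in one dimension; the objective f(x) = (x - 3)^2 and the learning rate are illustrative choices, not from the entry:

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly step opposite the gradient: x <- x - lr * grad(x)."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)
    return x

# Minimize f(x) = (x - 3)^2; its gradient is f'(x) = 2*(x - 3), minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With this learning rate the error shrinks by a factor of 0.8 per step, so `x_min` converges to 3.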
- 06:34, 10 June 2025 Thakshashila talk contribs created page Normalization (Machine Learning) (Created page with "= Normalization (Machine Learning) = '''Normalization''' in machine learning is a data preprocessing technique used to scale input features so they fall within a similar range, typically between 0 and 1. This helps improve model performance, especially for algorithms sensitive to the scale of data. == Why Normalize Data? == Some machine learning algorithms (e.g., K-Nearest Neighbors, Gradient Descent-based models, Neural Networks) perform better when input features ar...")
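The [0, 1] scaling described in the Normalization entry is min-max normalization; a minimal sketch with made-up input values:

```python
def min_max_normalize(values):
    """Rescale values to [0, 1] via (x - min) / (max - min)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)  # constant feature: nothing to rescale
    return [(x - lo) / (hi - lo) for x in values]

scaled = min_max_normalize([10, 20, 30, 40, 50])
```

The smallest input always maps to 0 and the largest to 1, so features on very different scales become directly comparable.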
- 06:15, 10 June 2025 Thakshashila talk contribs created page Category:Artificial Intelligence (Created page with "= Artificial Intelligence = This category includes all pages related to Artificial Intelligence (AI), including machine learning, deep learning, neural networks, and other AI-related techniques and applications. == Related Categories == * Category:Machine Learning * Category:Data Science * Category:Computer Science")
- 06:11, 10 June 2025 Thakshashila talk contribs created page Dimensionality Reduction (Created page with "= Dimensionality Reduction = '''Dimensionality Reduction''' is a technique in machine learning and data analysis used to reduce the number of input variables (features) while preserving as much relevant information as possible. == Why Use Dimensionality Reduction? == High-dimensional data can lead to problems such as: * '''Overfitting:''' Too many features can cause the model to learn noise. * '''Increased Computation:''' More features = more time and resources....")
- 06:10, 10 June 2025 Thakshashila talk contribs created page Unsupervised Learning (Created page with "= Unsupervised Learning = '''Unsupervised Learning''' is a type of machine learning where the model learns patterns and structures from unlabeled data without predefined outputs. == What is Unsupervised Learning? == In unsupervised learning, the input data has no associated labels. The goal is to explore the data’s inherent structure, group similar data points, or reduce the data’s dimensionality. == Common Types of Unsupervised Learning == * '''Clustering:''' G...")
- 06:08, 10 June 2025 Thakshashila talk contribs created page Clustering (Created page with "= Clustering = '''Clustering''' is an unsupervised machine learning technique that groups data points into clusters such that points in the same cluster are more similar to each other than to those in other clusters. == What is Clustering? == Unlike supervised learning, clustering does not use labeled data. The goal is to find natural groupings or patterns within the data based on similarity or distance measures. == Types of Clustering == * '''Partitioning Methods:'...")
- 06:05, 10 June 2025 Thakshashila talk contribs created page Regression (Created page with "= Regression = '''Regression''' is a type of supervised learning used to predict a continuous output variable based on one or more input features. == What is Regression? == In regression tasks, the goal is to model the relationship between input variables (features) and a continuous target variable. The model learns to estimate the output value for new inputs. == Types of Regression == * '''Simple Linear Regression:''' Models the relationship between a single input...")
- 06:04, 10 June 2025 Thakshashila talk contribs created page Supervised Learning (Created page with "= Supervised Learning = '''Supervised Learning''' is a type of machine learning where the model learns to map input data to output labels using a labeled dataset. == What is Supervised Learning? == In supervised learning, each training example includes both the input features and the corresponding correct output (label). The goal is for the model to learn the relationship between inputs and outputs so it can predict the labels for new, unseen data. == Types of Superv...")
- 06:00, 10 June 2025 Thakshashila talk contribs created page Hyperparameter Tuning (Created page with "= Hyperparameter Tuning = '''Hyperparameter Tuning''' is the process of optimizing the hyperparameters of a machine learning model to improve its performance on a specific task. == What are Hyperparameters? == Hyperparameters are settings or configurations external to the model that control the learning process. They are not learned from the data but set before training. Examples of hyperparameters include: * Learning rate in neural networks * Number of trees in a...")
- 05:59, 10 June 2025 Thakshashila talk contribs created page Bias-Variance Tradeoff (Created page with "= Bias-Variance Tradeoff = '''Bias-Variance Tradeoff''' is a fundamental concept in machine learning that describes the balance between two sources of error that affect model performance: bias and variance. == What is Bias? == Bias refers to the error introduced by approximating a real-world problem, which may be complex, by a simpler model. A model with high bias pays little attention to the training data and oversimplifies the problem. * High bias can cause '''unde...")
- 05:58, 10 June 2025 Thakshashila talk contribs created page Regularization (Created page with "= Regularization = '''Regularization''' is a technique in machine learning used to prevent '''overfitting''' by adding extra constraints or penalties to a model during training. == Why Regularization is Important == Overfitting happens when a model learns noise and details from the training data, harming its ability to generalize on new data. Regularization discourages overly complex models by penalizing large or unnecessary model parameters. == Common Types of Regul...")
- 05:56, 10 June 2025 Thakshashila talk contribs created page Train-Test Split (Created page with "= Train-Test Split = '''Train-Test Split''' is a fundamental technique in machine learning used to evaluate the performance of a model by dividing the dataset into two parts: a training set and a testing set. == What is Train-Test Split? == The dataset is split into: * '''Training Set:''' Used to train the machine learning model. * '''Testing Set:''' Used to evaluate how well the trained model performs on unseen data. This helps measure the model’s ability to ge...")
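The train/test division described in this entry can be sketched with the standard library; the 80/20 split and fixed seed are illustrative defaults, not from the entry:

```python
import random

def train_test_split(data, test_fraction=0.2, seed=0):
    """Shuffle a copy of the data, then slice it into train and test parts."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = list(data)
    rng.shuffle(shuffled)
    n_train = len(shuffled) - int(len(shuffled) * test_fraction)
    return shuffled[:n_train], shuffled[n_train:]

train, test = train_test_split(range(10), test_fraction=0.2)
```

Shuffling before slicing matters: without it, any ordering in the dataset (e.g. by class) would leak into the split.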
- 05:53, 10 June 2025 Thakshashila talk contribs created page Underfitting (Created page with "= Underfitting = '''Underfitting''' occurs when a machine learning model is too simple to capture the underlying pattern in the data, resulting in poor performance on both training and unseen data. == What is Underfitting? == Underfitting means the model fails to learn enough from the training data. It shows high errors during training and testing because it cannot capture important trends. == Causes of Underfitting == * '''Model Too Simple:''' Using a linear model...")
- 05:51, 10 June 2025 Thakshashila talk contribs created page Overfitting (Created page with "= Overfitting = '''Overfitting''' is a common problem in machine learning where a model learns the training data too well, including its noise and outliers, resulting in poor performance on new, unseen data. == What is Overfitting? == When a model is overfitted, it captures not only the underlying pattern but also the random fluctuations or noise in the training dataset. This causes the model to perform excellently on training data but badly on test or real-world data...")
- 05:46, 10 June 2025 Thakshashila talk contribs created page Evaluation Metrics (Created page with "= Evaluation Metrics = '''Evaluation Metrics''' are quantitative measures used to assess the performance of machine learning models. Choosing the right metric is essential for understanding how well a model performs, especially in classification and regression problems. == Why Are Evaluation Metrics Important? == * Provide objective criteria to compare different models. * Help detect issues like overfitting or underfitting. * Guide model improvement and selection. * R...")
- 05:45, 10 June 2025 Thakshashila talk contribs created page Cost-Sensitive Learning (Created page with "= Cost-Sensitive Learning = '''Cost-Sensitive Learning''' is a machine learning approach that incorporates different costs for different types of classification errors, helping models make better decisions in situations where misclassification errors have unequal consequences. == Why Cost-Sensitive Learning? == In many real-world problems, different mistakes have different costs. For example: * In medical diagnosis, a false negative (missing a disease) may be more co...")
- 05:44, 10 June 2025 Thakshashila talk contribs created page Area Under Precision-Recall Curve (AUPRC) (Created page with "= Area Under Precision-Recall Curve (AUPRC) = The '''Area Under the Precision-Recall Curve''' ('''AUPRC''') is a single scalar value that summarizes the performance of a binary classification model by measuring the area under its Precision-Recall (PR) curve. == What is the Precision-Recall Curve? == The Precision-Recall Curve plots: * '''Precision''' (y-axis): the proportion of true positive predictions among all positive predictions. * '''Recall''' (x-axis): the pro...")
- 05:40, 10 June 2025 Thakshashila talk contribs created page Imbalanced Data (Created page with "= Imbalanced Data = '''Imbalanced Data''' refers to datasets where the classes are not represented equally. In classification problems, one class (usually the positive or minority class) has far fewer examples than the other class (negative or majority class). == Why is Imbalanced Data a Problem? == Machine learning models often assume that classes are balanced and try to maximize overall accuracy. When data is imbalanced, models tend to be biased toward the majority...")
- 05:36, 10 June 2025 Thakshashila talk contribs created page Cross Validation (Created page with "= Cross-Validation = '''Cross-Validation''' is a statistical method used to estimate the performance of machine learning models on unseen data. It helps ensure that the model generalizes well and reduces the risk of overfitting. == Why Cross-Validation? == When training a model, it is important to test how well it performs on data it has never seen before. Simply evaluating a model on the same data it was trained on can lead to overly optimistic results. Cross-validat...")
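The resampling idea in the Cross Validation entry is commonly realized as k-fold cross-validation; a minimal index-generating sketch (k-fold is an assumed concrete variant, since the entry's text is truncated):

```python
def k_fold_indices(n_samples, k):
    """Yield (train_indices, validation_indices) for each of k folds."""
    indices = list(range(n_samples))
    fold_size = n_samples // k
    for i in range(k):
        start = i * fold_size
        stop = (i + 1) * fold_size if i < k - 1 else n_samples  # last fold takes any remainder
        yield indices[:start] + indices[stop:], indices[start:stop]

folds = list(k_fold_indices(10, 5))
```

Every sample appears in exactly one validation fold, so each data point is used for evaluation exactly once across the k rounds.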
- 05:35, 10 June 2025 Thakshashila talk contribs created page Model Selection (Created page with "= Model Selection = '''Model Selection''' is the process of choosing the best machine learning model from a set of candidate models based on their performance on a given task. It is a critical step to ensure the selected model generalizes well to new, unseen data. == Why Model Selection is Important == Different algorithms and model configurations may perform differently depending on the dataset and problem. Selecting the right model helps: * Improve prediction accur...")
- 05:34, 10 June 2025 Thakshashila talk contribs created page Threshold Tuning (Created page with "= Threshold Tuning = '''Threshold Tuning''' is the process of selecting the best decision threshold in a classification model to optimize performance metrics such as Precision, Recall, F1 Score, or Accuracy. It is crucial in models that output '''probabilities''' rather than direct class labels. == Why Threshold Tuning Matters == Many classifiers (e.g., Logistic Regression, Neural Networks) output a probability score indicating how likely an instance b...")
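The process described in the Threshold Tuning entry (pick the decision threshold that optimizes a metric such as F1) can be sketched end to end; the scores, labels, and candidate thresholds below are made up for illustration:

```python
def apply_threshold(probabilities, threshold=0.5):
    """Turn probability scores into 0/1 labels at the given decision threshold."""
    return [1 if p >= threshold else 0 for p in probabilities]

def f1_score(y_true, y_pred):
    """F1 = 2*TP / (2*TP + FP + FN), computed from paired label lists."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

# Made-up scores and labels: sweep candidate thresholds and keep the best F1.
scores = [0.1, 0.4, 0.35, 0.8, 0.65, 0.2]
labels = [0, 0, 1, 1, 1, 0]
best_f1, best_t = max((f1_score(labels, apply_threshold(scores, t)), t)
                      for t in (0.3, 0.5, 0.7))
```

Here the lower threshold wins because it recovers all three positives at the cost of one false positive, a tradeoff F1 rewards.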
- 05:33, 10 June 2025 Thakshashila talk contribs created page AUC Score (Created page with "= AUC Score (Area Under the Curve) = The '''AUC Score''' refers to the '''Area Under the Curve''' and is a popular metric used to evaluate the performance of classification models, especially in binary classification tasks. Most commonly, AUC represents the area under the ROC Curve (Receiver Operating Characteristic Curve) or under the Precision-Recall Curve (PR Curve). == What is AUC? == AUC measures the ability of a model to distinguish between positive and...")