
Sensitivity Analysis

Sensitivity analysis is a mathematical and computational technique used to determine how variations in a model's input variables affect its output. In data science, finance, engineering, and decision-making, it identifies which inputs have the greatest effect on model outcomes, revealing how robust and reliable the model is under different conditions. By quantifying the extent to which changes in specific factors (parameters, assumptions, or initial conditions) influence results, the approach supports informed decisions and effective risk management.

Core Characteristics of Sensitivity Analysis

  1. Input-Output Relationship:
    • Sensitivity analysis studies the dependency between input variables (factors) and the model's output by varying each input and observing the resulting change in the output, thereby quantifying how “sensitive” the output is to each input variable.  
    • If a small change in an input significantly alters the output, the model is said to be highly sensitive to that input. Conversely, if the output changes minimally, the sensitivity is low, suggesting that this variable has less influence on the model’s outcome.
  2. Types of Sensitivity Analysis:
    • Local Sensitivity Analysis: Evaluates changes in output based on small, incremental variations in each input around a fixed point. This approach is particularly useful for linear models or when the input-output relationship is well-understood.  
    • Global Sensitivity Analysis: Considers the effect of varying each input across its entire range, allowing for non-linear and complex models. Global sensitivity analysis provides a more comprehensive view by examining interactions between multiple variables simultaneously.
  3. Mathematical Formulation:
    Sensitivity analysis often involves partial derivatives to measure the rate of change in output (y) with respect to changes in an input variable (x). For a function y = f(x), the sensitivity of y to x is:  
    Sensitivity = ∂y / ∂x      
    which represents the rate at which y changes as x changes.  
    If a model has multiple inputs, the partial derivative for each input can be calculated to identify individual sensitivities. For a function y = f(x1, x2, ..., xn), the sensitivity of y with respect to input xi is:  
    ∂y / ∂xi  
    Elasticity is another common metric in sensitivity analysis, often used in economics and finance to assess the proportional change in the output for a percentage change in an input. It is defined as:  
    Elasticity = (∂y / y) / (∂x / x)
  4. Variance-Based Sensitivity:
    In complex models where exact functional relationships are difficult to define, variance-based sensitivity analysis is used. This approach decomposes the variance of the model output into contributions from individual input factors and their interactions:    
    Var(y) = Σ Vi + Σ Vij + ...        
    where Vi is the variance contributed by input xi alone and higher-order terms such as Vij capture interaction effects between inputs; for purely additive models, the decomposition reduces to Var(y) = Σ Vi.
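The formulas above can be illustrated numerically. The sketch below approximates local sensitivities ∂y/∂xi with central finite differences and converts them to elasticities; the model function and evaluation point are hypothetical examples chosen so the exact answers are known (for y = x1² · e^(0.5·x2), the elasticities are exactly 2 and 0.5·x2):

```python
import math

def f(x1, x2):
    # Hypothetical model: y = x1^2 * exp(0.5 * x2)
    return x1 ** 2 * math.exp(0.5 * x2)

def partial(f, args, i, h=1e-6):
    """Central finite-difference approximation of the partial derivative
    of f with respect to its i-th argument at the point `args`."""
    up = list(args); up[i] += h
    dn = list(args); dn[i] -= h
    return (f(*up) - f(*dn)) / (2 * h)

x = (2.0, 1.0)
y = f(*x)
for i in range(2):
    s = partial(f, x, i)      # local sensitivity: dy/dxi
    e = s * x[i] / y          # elasticity: % change in y per % change in xi
    print(f"x{i + 1}: sensitivity = {s:.4f}, elasticity = {e:.4f}")
```

Finite differences are a common stand-in when the model has no closed-form derivative; the step size h trades truncation error against floating-point round-off.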

Techniques and Methods in Sensitivity Analysis

  1. One-at-a-Time (OAT):
    • The OAT method involves changing one input variable at a time while keeping others constant, allowing the modeler to observe the isolated effect of each input on the output.  
    • This method is straightforward but can be less informative in cases where inputs interact or exhibit non-linear relationships.
  2. Monte Carlo Simulation:
    • Monte Carlo simulations use random sampling across input variable distributions to analyze sensitivity. Multiple simulations generate a range of outputs, which helps in estimating the probabilistic impact of inputs on the output.  
    • The Monte Carlo approach is commonly applied in fields where probabilistic risk assessment is required, such as finance and engineering.
  3. Sobol’ Indices:
    • Sobol’ indices are a measure used in global sensitivity analysis, quantifying the contribution of each input variable to the output’s total variance. The first-order Sobol’ index captures the individual effect of an input, while higher-order Sobol’ indices represent interaction effects between inputs.
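The Monte Carlo and Sobol’ ideas above can be combined in one short sketch. The snippet estimates first-order Sobol’ indices with the common “pick-freeze” estimator, S_i = Cov(y_A, y_Bi) / Var(y), on a hypothetical linear model whose analytic indices are 16/21, 4/21, and 1/21:

```python
import random
from statistics import mean, variance

def model(x1, x2, x3):
    # Hypothetical linear model; variance shares are 16/21, 4/21, 1/21
    return 4.0 * x1 + 2.0 * x2 + 1.0 * x3

def first_order_sobol(model, n_vars, n_samples=20000, seed=0):
    """Monte Carlo 'pick-freeze' estimate of first-order Sobol' indices.
    Sample B_i reuses column i from sample A, so the covariance of the
    two output vectors isolates the variance contributed by input i."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_vars)] for _ in range(n_samples)]
    B = [[rng.random() for _ in range(n_vars)] for _ in range(n_samples)]
    y_A = [model(*row) for row in A]
    m, v = mean(y_A), variance(y_A)
    indices = []
    for i in range(n_vars):
        y_Bi = [model(*(b[:i] + [a[i]] + b[i + 1:])) for a, b in zip(A, B)]
        m_Bi = mean(y_Bi)
        cov = sum((ya - m) * (yb - m_Bi)
                  for ya, yb in zip(y_A, y_Bi)) / (n_samples - 1)
        indices.append(cov / v)
    return indices

S = first_order_sobol(model, 3)
print([round(s, 3) for s in S])
```

Because the model here is purely additive, the first-order indices sum to roughly 1; for models with interactions, the shortfall from 1 indicates higher-order effects.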

Sensitivity analysis is widely applied across domains to validate and refine models, especially when predictions influence strategic decisions. In data science, it is integral to model interpretation and validation, showing which features or parameters most influence predictive outcomes; in machine learning models, for instance, it can rank individual features by their effect on predictions.
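One common way to probe feature influence in a fitted model is permutation-style sensitivity: shuffle one feature column and measure how much prediction error grows. The sketch below is a minimal illustration with a hypothetical hand-coded predictor standing in for a trained model:

```python
import random

# Hypothetical toy dataset: the target depends strongly on feature 0
# and only weakly on feature 1
rng = random.Random(42)
X = [[rng.random(), rng.random()] for _ in range(500)]
y = [3.0 * a + 0.3 * b for a, b in X]

def predict(row):
    # Stand-in for a trained model (here, the true generating function)
    return 3.0 * row[0] + 0.3 * row[1]

def mse(X, y):
    return sum((predict(r) - t) ** 2 for r, t in zip(X, y)) / len(y)

def permutation_sensitivity(X, y, feature, seed=0):
    """Shuffle one feature column and return the increase in prediction
    error; a larger increase means the model is more sensitive to it."""
    shuffled = [row[:] for row in X]
    col = [row[feature] for row in shuffled]
    random.Random(seed).shuffle(col)
    for row, v in zip(shuffled, col):
        row[feature] = v
    return mse(shuffled, y) - mse(X, y)

imp0 = permutation_sensitivity(X, y, 0)
imp1 = permutation_sensitivity(X, y, 1)
print(f"feature 0: {imp0:.4f}, feature 1: {imp1:.4f}")
```

Shuffling breaks the link between one feature and the target while leaving its marginal distribution intact, which is why the error increase isolates that feature's influence.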

In financial modeling, sensitivity analysis is used to test how various assumptions impact projections of key performance indicators, such as profit margins, cash flow, or investment returns. Similarly, in Big Data applications, where datasets are vast and complex, sensitivity analysis helps reduce model complexity by identifying critical inputs, focusing computational resources on the most influential factors, and ensuring robust predictions.
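A one-at-a-time sweep makes the financial use case concrete. The sketch below bumps each assumption of a cash-flow projection by ±10% and records the resulting swing in NPV; the cash-flow model, base assumptions, and bump size are all hypothetical:

```python
def npv(rate, growth, base_cash_flow=100.0, years=5):
    """Hypothetical projection: cash flows grow at `growth`
    and are discounted at `rate`."""
    return sum(base_cash_flow * (1 + growth) ** t / (1 + rate) ** t
               for t in range(1, years + 1))

base = {"rate": 0.10, "growth": 0.03}

def oat_swing(name):
    """One-at-a-time: bump one assumption -10%/+10% (others held at
    their base values) and return the resulting NPV swing."""
    lo = dict(base); lo[name] *= 0.9
    hi = dict(base); hi[name] *= 1.1
    return npv(**hi) - npv(**lo)

for name in base:
    print(f"{name}: NPV swing = {oat_swing(name):+.2f}")
```

Ranking the swings by magnitude is the basis of the familiar tornado chart: here the discount rate moves NPV more than the growth rate does, so it would sit at the top.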

Sensitivity analysis enhances model reliability and interpretability by clarifying how changes in variables affect results, enabling more accurate, informed, and resilient decision-making.
