Sensitivity analysis is an approach that aims to identify the most influential parameters for a given output of a mathematical computer model. The key goal is to evaluate the effect of the uncertainty in each uncertain input variable on a particular model output. It is based on a probabilistic framework: the computer model is evaluated on a large number of samples of the model inputs, and the resulting model outputs are analyzed with a corresponding toolset. This typically helps in testing the robustness of a model's results in the presence of uncertainty. More importantly, it provides a better understanding of the relationships between the input and output variables of a model. Variance-based sensitivity measures are among the most widely used methods for global sensitivity analysis. A more general introduction is available here.
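The variance-based measures mentioned above can be illustrated with a small Monte Carlo sketch. The following Python example estimates first-order Sobol indices, i.e. the fraction of the output variance explained by each input alone, with a pick-freeze estimator. The Ishigami test function, the sample sizes, and the random seed are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Toy model: the Ishigami function, a common test case in global
# sensitivity analysis (chosen here for illustration only).
def model(x, a=7.0, b=0.1):
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

rng = np.random.default_rng(0)
n, d = 200_000, 3

# Two independent input ensembles, uniform on [-pi, pi]^3
A = rng.uniform(-np.pi, np.pi, size=(n, d))
B = rng.uniform(-np.pi, np.pi, size=(n, d))
fA, fB = model(A), model(B)
total_variance = np.var(np.concatenate([fA, fB]))

# First-order Sobol index S_i = V_i / Var(Y): the share of output
# variance explained by input i alone (pick-freeze estimator).
S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]            # vary only input i between the ensembles
    V_i = np.mean(fB * (model(ABi) - fA))
    S.append(V_i / total_variance)

print([round(s, 3) for s in S])    # x1 and x2 dominate, x3 contributes ~0
```

Note that the estimator needs many model evaluations per input dimension, which is exactly why sensitivity analysis of expensive simulations becomes a computational challenge.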
Sensitivity analysis studies are an important application of uncertainty quantification in numerical simulations, which are often performed on supercomputers. One of the challenges of sensitivity analysis is posed by large-scale numerical systems that simulate complex spatial and temporal evolutions driven by known physical laws. Such models evolve tens of variables in time on thousands of grid points, so ensembles on the order of O(1000) simulation runs are required to compute good statistics for a sensitivity analysis. In this context, global sensitivity analysis is an ensemble of methods that work with a probabilistic representation of the input parameters in order to evaluate their overall range of variation. More information can be obtained here.
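The ensemble workflow described above can be sketched in a few lines: sample the uncertain inputs, run the model once per ensemble member, and collect output statistics. The toy "simulation" below, an explicit Euler time stepping of exponential decay on a small 1-D grid with an uncertain decay rate, is a minimal stand-in assumed for illustration; the real models discussed here are far larger, but the workflow is the same.

```python
import numpy as np

# Toy "simulation": u'(t) = -k u on a 1-D grid, explicit Euler in time,
# with an uncertain decay rate k (an illustrative assumption).
def simulate(k, n_steps=100, dt=0.01, n_grid=50):
    u = np.ones(n_grid)            # initial condition on the grid
    for _ in range(n_steps):
        u = u - dt * k * u         # explicit Euler time step
    return u.mean()                # scalar quantity of interest

rng = np.random.default_rng(1)
n_members = 1000                   # O(1000) ensemble members, as above
k_samples = rng.normal(loc=2.0, scale=0.3, size=n_members)

# One model run per ensemble member, then output statistics
outputs = np.array([simulate(k) for k in k_samples])
print(f"mean = {outputs.mean():.3f}, std = {outputs.std():.3f}")
```

On a supercomputer, the loop over ensemble members would typically be distributed across nodes, since each run is independent of the others.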
Details on Sensitivity Analysis
The following video provides more information about this subject: