
Quantifying uncertainty is understanding complexity

by admin

It is perhaps instinctive: we tend to anchor ourselves to a single piece of data, to its numerical value, attributing great significance to such timely and (apparently) precise information. What the scientific community is telling us, with ever greater force, is something quite different: more than the specific value, what matters are the variability and reproducibility that characterise the number itself. In computation and statistics, as in everyday life, the reassuring neatness of a single numerical outcome gives way to a more complex reality, made up of many possible scenarios, each with a different probability of materialising.

Uncertainty is therefore an integral part of the data: the key to interpreting the present and predicting the future where deterministic analysis proves insufficient, especially in the most complex, cutting-edge applications.

Rozza: «This is how we foresee complex scenarios»

Observation, measurement and evaluation of information produce results whose value depends on understanding uncertainty, through data-driven mathematical models made applicable by supercomputers and high-performance computing. The most emblematic example is weather forecasting: many days in advance one can only give a range of meteorological scenarios, and as the lead time shrinks the possible options narrow. «Uncertainty quantification allows us to bring data closer to models, helping us understand increasingly complex phenomena and outline scenarios that were unthinkable until a few years ago», explains Gianluigi Rozza, professor of numerical analysis at SISSA in Trieste and one of the chairs of the Conference on Uncertainty Quantification of the Society for Industrial and Applied Mathematics (SIAM), whose sixth edition was hosted by the city of Trieste with the participation of over a thousand scientists from all over the world. A sign, in this case not at all uncertain, of the growing interest in this line of research and its concrete consequences. «Mastering these dynamics improves the reliability of results and makes it possible to establish more precisely the probability that a phenomenon will occur, weighing all the factors in play appropriately and providing exhaustive explanations», he adds. Uncertainty quantification thus becomes decisive as a tool for projecting into the future, well beyond the confines of the laboratory.
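
To make the forecasting idea concrete, here is a minimal sketch (in Python, assuming NumPy) of how an ensemble of slightly different initial conditions spreads out as the forecast horizon grows. The "model" is a toy chaotic map chosen purely for illustration, not anything resembling an actual meteorological model.

```python
import numpy as np

# Toy stand-in for a forecast model: a chaotic logistic map.
# Purely illustrative; real weather models are vastly more complex.
def step(x, r=3.9):
    return r * x * (1.0 - x)

rng = np.random.default_rng(0)
n_members = 500                                    # ensemble size
x = 0.4 + 0.001 * rng.standard_normal(n_members)   # slightly perturbed initial states

for day in range(1, 11):
    x = step(x)
    # The ensemble spread grows with the forecast horizon:
    # the further ahead we look, the wider the range of scenarios.
    print(f"day {day:2d}: mean = {x.mean():.3f}   spread = {x.std():.3f}")
```

Read in reverse, the same picture explains why forecasts for a given date narrow as that date approaches: shorter lead times leave less room for tiny initial differences to amplify.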


A rapidly developing scientific field

From atmospheric and ocean currents to the management of Industry 4.0 processes, by way of progress in medicine (think of drug development through numerical simulations, or the personalisation of treatments), data and their variability also guide the work of artificial intelligence systems. There is even talk of developing forecast estimates for earthquakes, although for risks of this kind the analysis requires not only solid quantification but also sound models of public communication, always balanced between the drift towards panic and the underestimation of danger.

From a technical point of view, working with uncertainty means that instead of classical mathematical curves – lines without thickness – one has bands of varying width, very narrow where there is little variability and wider elsewhere. Nor is it simply a matter of a range between a minimum and a maximum: it is about attributing the right weights to the probability distributions. «In recent years computational power has grown and costs have fallen, so it is possible to exploit simulation techniques that even just a few decades ago would have been visionary», explains Daniela Calvetti of Case Western Reserve University in Ohio, in the United States, co-organizer of the Trieste conference. «This scientific branch is now developing rapidly: in all sectors there is a growing awareness that a quantitative result has value only if it adequately takes into account all possible scenarios».
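
A minimal sketch of this distinction, using a hypothetical ensemble of simulated curves: a bare min-max envelope only bounds the outcomes, whereas quantile bands computed from the ensemble reflect how much probability each region of outcomes actually carries.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)

# Hypothetical ensemble: 1000 simulated response curves whose slope is
# uncertain (normally distributed), plus a little observation noise.
slopes = rng.normal(loc=2.0, scale=0.3, size=1000)
curves = slopes[:, None] * t[None, :] + 0.05 * rng.standard_normal((1000, t.size))

# A bare min-max envelope treats every scenario, however improbable,
# as equally important...
env_lo, env_hi = curves.min(axis=0), curves.max(axis=0)

# ...whereas quantile bands weight outcomes by how probable they are.
band_lo, band_med, band_hi = np.percentile(curves, [5, 50, 95], axis=0)

print("min-max envelope width at t = 1:", round(env_hi[-1] - env_lo[-1], 3))
print("5-95% band width at t = 1:     ", round(band_hi[-1] - band_lo[-1], 3))
```

The band is narrow where the ensemble members agree and wide where they diverge, which is exactly the "line with thickness" described above.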
