Portal:Complex Systems Digital Campus/E-Department on Geosciences and the environment

Introduction

The physical, chemical and biological environment of humans – from the local, community level to the global, planetary one – represents a rapidly increasing concern of the post-industrial era. Its study involves all the subsystems of the Earth system – the atmosphere, oceans, hydro- and cryosphere, as well as the solid Earth’s upper crust – along with their interactions with the biosphere and with human activities. We are therefore dealing with a highly complex, heterogeneous and multiscale system, and with an exceedingly interdisciplinary set of approaches to it. The system’s complexity is certainly comparable to that of systems studied in the life or cognitive sciences. The concepts and tools of complex-system theory seem particularly useful in attacking three major challenges. Firstly, the range of uncertainties still prevailing in future climate change projections has until now been attributed largely to difficulties in parameterizing subgrid-scale processes in general circulation models (GCMs) and in tuning semi-empirical parameters. Recent studies also point to fundamental difficulties associated with the structural instability of climate models and suggest applying the theory of random dynamical systems to help reduce the uncertainties. Secondly, the Earth system varies at all space and time scales and is thus out of and probably far from thermodynamic equilibrium. The methods of statistical physics are therefore of interest in modeling the system’s near-equilibrium behavior and then extending the results farther away from equilibrium. Finally, much of the interest in this area arises from concern about the socio-economic impact of extreme events. The study of their statistics and dynamics can lead to a deeper understanding and more reliable prediction of these events.
It therefore appears highly appropriate to include this major area of applications of complex-system theory into the concerns of this road map. The Earth system involves several subsystems – the atmosphere, oceans, hydro- and cryosphere, as well as the solid Earth’s upper crust – each of which in turn is highly heterogeneous and variable on all space and time scales. Moreover, this variability is affected by and in turn affects the ecosystems hosted by each subsystem, as well as humans, their economy, society and politics. We are thus dealing with a highly complex, heterogeneous and multiscale system, and so the scientific disciplines needed to better understand, monitor, predict and manage this system are diverse and numerous. They include various subsets of the physical and life sciences, mathematics and informatics, and of course the full set of the geo- and environmental sciences, from geology, geophysics and geochemistry to the atmospheric and oceanic sciences, hydrology, glaciology and soil science. Among the key interdisciplinary issues that arise in this major area are future climate change, change in the distribution of and interaction between species given past, present and future climate change, the way that the biogeochemical cycles of trace chemicals and nutrients interact with other changes in the system, and the connection between health issues and environmental change. On the methodological side, major objectives that would help to solve these issues include better prediction and reduction of uncertainties, better description and modeling of the transport and mixing of planetary fluids, understanding the net effect of weather on climate and the changes in weather as climate changes. Understanding the best uses of stochastic, deterministic or combined modeling in this highly complex setting is also essential. 
To deal at the same time with some of these key issues and attempt to achieve some of the associated major objectives, we propose to focus on the following three main challenges: (i) to understand the reasons for and reduce the uncertainties in future climate change projections; (ii) to study the out-of-equilibrium statistical physics of the Earth system, across all scales; and (iii) to investigate the statistics and dynamics of extreme events.

The range of uncertainties in future climate change projections was originally estimated in 1979 as an equilibrium response of global temperature of 1.5–4.5 K for a doubling of atmospheric CO2 concentration. After four IPCC assessment reports, the spread in projected end-of-century temperatures is still of several degrees for any given greenhouse gas scenario. This persistent difficulty in reducing uncertainties has, until recently, been attributed largely to difficulties in parameterizing subgrid-scale processes in general circulation models (GCMs) and in tuning their semi-empirical parameters. But recent studies also point to fundamental difficulties associated with the structural instability of climate models and suggest applying the theory of random dynamical systems to help reduce the uncertainties.

The Earth system varies at all space and time scales, from the microphysics of clouds to the general circulation of the atmosphere and oceans, from micro-organisms to planetary ecosystems, and from the decadal fluctuations of the magnetic field to continental drift. The entire system, as well as each of its subsystems, is forced and dissipative, and is thus out of thermodynamic equilibrium and probably far away from it. The methods of statistical physics therefore seem of interest for modeling the system’s near-equilibrium behavior and for trying to derive results that might then be extended to more realistic settings, farther away from equilibrium.
Finally, much of the interest in the geosciences and the environment arises from concern about the socio-economic impact of extreme events. The standard approach to such events rests on generalized extreme value (GEV) theory. Its assumptions, however, are rarely met in practice. It is therefore necessary to pursue more sophisticated statistical models and to ground them in a better understanding of the dynamics that gives rise to extreme events. Based on better statistical and dynamical models, we should be able to provide more reliable predictive schemes for extreme events and subject them to extensive testing across disciplines and data sets.

The geosciences have a long tradition of contributing to the study of nonlinear and complex systems. The work of E. N. Lorenz in the early 1960s provided a major paradigm of sensitive dependence on the initial state. His work and that of C. E. Leith have yielded deep insights into error propagation across scales of motion. Multiscale phenomena in the solid-earth and fluid-envelope context have helped refine the understanding of multifractality and its consequences for prediction across disciplines, even in the social and political sphere. We hope and trust that the work proposed here will prove equally inspiring and fruitful for the theory of complex systems and its applications in many other disciplines.

New challenges
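Before turning to the individual challenges, the GEV approach to extremes discussed above can be made concrete with a short sketch: fitting a GEV distribution to block (annual) maxima and reading off a return level. All numbers here are illustrative assumptions on synthetic data, not drawn from any real record; the block-maxima construction is the textbook setting in which the GEV assumptions hold.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Illustrative synthetic record: 50 "years" of 365 daily values each.
daily = rng.normal(loc=20.0, scale=5.0, size=(50, 365))
annual_maxima = daily.max(axis=1)  # block maxima, one per year

# Fit a GEV distribution to the block maxima (maximum likelihood).
shape, loc, scale = genextreme.fit(annual_maxima)

# 100-year return level: the value exceeded with probability 1/100 in any year.
return_level = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
print(f"100-year return level: {return_level:.1f}")
```

The classical GEV assumptions (independent, identically distributed values, long blocks) hold here by construction; as noted above, real environmental records rarely satisfy them, which is what motivates the more sophisticated statistical and dynamical models discussed in this section.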

1. Understanding and reducing uncertainties.

Charney et al. (National Academy of Sciences, 1979) were the first to attempt a consensus estimate of the equilibrium sensitivity of climate to changes in atmospheric CO2 concentration. The result was the now famous range of 1.5–4.5 K for the increase in global near-surface air temperature Ts given a doubling of CO2 concentration. Earth's climate, however, never was and probably never will be in equilibrium. In addition to estimates of equilibrium sensitivity, the four successive reports of the Intergovernmental Panel on Climate Change (IPCC: 1990, 1995, 2001, 2007) therefore focused on estimates of climate change over the 21st century, based on several scenarios of CO2 increase over this time interval. The general circulation model (GCM) results for the temperature increase over the coming 100 years have stubbornly resisted any narrowing of the range of estimates, with results for end-of-century Ts still ranging over several degrees Celsius for a fixed CO2 increase scenario. This difficulty in narrowing the range of estimates is clearly connected to the complexity of the climate system, the nonlinearity of the processes involved and the obstacles to a faithful representation of these processes and feedbacks in GCMs. One obvious source of errors is the difficulty of representing all the processes that fall below the spatial and temporal resolution of the model. This problem is especially evident for biochemical processes, where the microphysical and microbiological dynamics is coupled to the turbulent dynamics of the ocean and atmosphere and produces spatiotemporal variability at virtually any scale of observation.
One example is phytoplankton, whose fundamental role in absorbing CO2 is affected as much by the nutrient advection due to the large-scale circulation (basin scale, years) as by the presence of upwelling filaments (1–20 km, days), the ecological interaction with zooplankton (millimeters to meters, hours to days), or the turbulent and biological processes at the cell scale. The study of such biochemical phenomena requires the development of novel theoretical tools that are beyond the capability of individual disciplines but which, because of their characteristics, fall naturally into the framework of complex systems. Such studies should be able to:
1. deal simultaneously with the various spatial and temporal scales of transport and tracer dynamics;
2. integrate the descriptions of different disciplines, notably the transport and mixing properties from turbulence theory and the biological and/or chemical processes of the advected tracer;
3. provide results in a form that can be compared with ever-expanding observational datasets;
4. allow the formulation of computationally efficient parameterization schemes for circulation models.

A second source of errors lies in a fundamental difficulty related to the structural instability of climate models. It is well known that the space of all deterministic, differentiable dynamical systems (DDS) has a highly intricate structure: structurally stable systems are unfortunately not typical of all deterministic dynamics, as originally hoped (Smale, 1967). Indeed, what is modeled by DDS does not appear to be typically robust from a qualitative, topological point of view, even for small systems like the Lorenz (1963) model. This disappointing fact has led mathematicians to tackle the problem of robustness and genericity with the help of new stochastic approaches (Palis, 2005). On the other hand, work on developing and using GCMs over several decades has amply demonstrated that any addition or change in a model's "parametrizations" (i.e. in the representation of subgrid-scale processes in terms of the model's explicit, large-scale variables) may result in noticeable changes in the model solution's behavior.

The range-of-uncertainties issue, far from being a mere practical difficulty in "tuning" several model parameters, could thus be related to the inherent structural instability of climate models. A possible way of reducing this structural instability is the use of stochastic parametrizations, with the aim of smoothing the resulting dynamics through ensemble averaging. A key question is then to determine whether ad hoc stochastic parametrizations add some form of robustness to known deterministic climate models, and how they can reduce the range of uncertainties in future climate projections. Preliminary results indicate that noise has stabilizing effects that need to be investigated across a hierarchy of climate models, from the simplest to the most complex GCMs. Such an idea could be tested using theoretical concepts and numerical tools from the theory of random dynamical systems (RDS; L. Arnold, 1998). In this purely geometrical theory, noise is parametrized so as to treat stochastic processes as genuine flows living in an extended phase space called a "probability bundle". Random invariant sets such as random attractors can then be defined and rigorously compared, using the RDS concept of stochastic equivalence, thereby enabling us to consider the structural stochastic stability of these models.
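A minimal numerical sketch of this idea, well short of the full RDS machinery: integrate the Lorenz (1963) model with additive noise (Euler–Maruyama scheme) for an ensemble of nearby initial states, so that ensemble averages of the stochastically perturbed flow can be compared with individual deterministic trajectories. The noise amplitude, time step and ensemble size below are illustrative assumptions, not values from any published study.

```python
import numpy as np

def lorenz_rhs(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz (1963) system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def euler_maruyama(x0, noise_amp, dt=0.005, n_steps=4000, seed=0):
    """Integrate dX = f(X) dt + noise_amp dW with the Euler-Maruyama scheme."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        x = x + lorenz_rhs(x) * dt + noise_amp * np.sqrt(dt) * rng.normal(size=3)
        traj[i] = x
    return traj

# Ensemble of noisy trajectories from slightly perturbed initial states;
# the ensemble mean is much smoother than any single realization.
ensemble = np.array([
    euler_maruyama([1.0 + 0.01 * k, 1.0, 1.0], noise_amp=0.5, seed=k)
    for k in range(20)
])
ensemble_mean = ensemble.mean(axis=0)
```

Comparing statistics of such ensembles across noise amplitudes (including zero) is one crude way to probe whether the stochastic forcing smooths or stabilizes the dynamics; the rigorous comparison of the underlying random attractors is what the RDS framework provides.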

2. Out-of-equilibrium statistical physics of the Earth system

The Earth and its various components (hydrosphere, atmosphere, biosphere, lithosphere) are typical out-of-equilibrium systems: due to the intrinsically dissipative nature of their processes, without forcing they are bound to decay to rest. In the presence of permanent forcing, however, a steady-state regime can be established, in which forcing and dissipation equilibrate on average, allowing the maintenance of non-trivial steady states with large fluctuations covering a wide range of scales. The number of degrees of freedom involved in the corresponding dynamics is so large that a statistical-mechanics approach - allowing the emergence of globally relevant quantities to describe the system - would be welcome. Such a simplification would be especially welcome in the modeling of the fluid envelopes, where the capacity of present computers prohibits the full-scale numerical simulation of the (Navier-Stokes) equations describing them. Similar problems are ubiquitous in biology and the environmental sciences, when the governing equations are known at all. Another interesting outcome of a statistical approach would be an equivalent of the Fluctuation-Dissipation Theorem (FDT), offering a direct relation between the fluctuations and the response of the system to infinitesimal external forcing. Applied to the Earth system, such an approach could provide new estimates of the impact of climate perturbations through greenhouse gas emissions.

Various difficulties are associated with defining an out-of-equilibrium statistical mechanics of the Earth system, including:
- the definition of an entropy (possibly an infinite hierarchy of entropies) in heterogeneous systems;
- the identification of the relevant constraints;
- the non-extensivity of the statistical variables, due to correlations between the different components of the system (possibly addressed by introducing effective (fractional) dimensions).
On the physical side, several advances have been made recently in the description of turbulence, using tools borrowed from statistical mechanics for flows with symmetries. Variational principles of entropy production are also worth considering. Other advances have been made with regard to the equivalent of the FDT for physical systems far from equilibrium. Experimental tests on a glassy magnetic system have shown violations of the FDT through nonlinearities in the relation between fluctuation and response. General identities between fluctuation and dissipation have been derived theoretically only for time-symmetric systems; they have nevertheless been tested successfully in dissipative (non-time-symmetric) experimental systems such as electrical circuits or turbulent flows. It would be interesting to extend these results to the Earth system.
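As a toy illustration of such a fluctuation-response relation (far simpler than any Earth-system setting), one can check it numerically on an Ornstein-Uhlenbeck process dx = (-gamma*x + f) dt + sqrt(2D) dW, for which the static response d<x>/df = 1/gamma coincides with the unforced variance divided by D. All parameter values below are arbitrary choices for the sketch.

```python
import numpy as np

def simulate_ou(gamma, D, f, dt=0.01, n_steps=200_000, seed=0):
    """Euler-Maruyama integration of dx = (-gamma*x + f) dt + sqrt(2D) dW."""
    rng = np.random.default_rng(seed)
    x = 0.0
    xs = np.empty(n_steps)
    for i in range(n_steps):
        x += (-gamma * x + f) * dt + np.sqrt(2.0 * D * dt) * rng.normal()
        xs[i] = x
    return xs[n_steps // 10:]  # discard the initial transient

gamma, D, f = 1.0, 0.5, 0.5
unforced = simulate_ou(gamma, D, 0.0, seed=0)
forced = simulate_ou(gamma, D, f, seed=1)

# Fluctuation side: stationary variance / D, which should equal 1/gamma.
fdt_prediction = unforced.var() / D
# Response side: shift of the mean per unit forcing, also 1/gamma.
measured_response = (forced.mean() - unforced.mean()) / f
```

Here the two estimates agree up to sampling error, as the equilibrium FDT demands; the open question raised in the text is precisely how far such relations survive in strongly forced, dissipative, heterogeneous systems like the Earth's fluid envelopes.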