Thursday, June 26, 2014

Muddling Through with Models

A few observations on climate models from some folks on the inside....

"Model Structural Uncertainty: Are GCMs the Best Tools?" Folks here may recollect comments regarding model structure uncertainty in an earlier post
The policy-driven imperative of climate prediction has resulted in the accumulation of power and authority around GCMs [global coupled atmosphere-ocean models], based on the promise of using GCMs to set emissions reduction targets and for regional predictions of climate change. Complexity of model representation has become a central normative principle in evaluating climate models, good science and policy utility. However, not only are GCMs resource-intensive and intractable, they are also characterized by over-parameterization and inadequate attention to uncertainty. Apart from the divergence of climate model predictions from observations over the past two decades, which is raising questions as to whether GCMs are over-sensitive to CO2 forcing, the hope for useful regional predictions of climate change is unlikely to be realized based on the current path of model development. The advancement of climate science is arguably being slowed by the focus of resources on this one path of climate modeling.
-- Curry, Climate Etc.

the GCM is the numerical solution of a complex but purely deterministic set of nonlinear partial differential equations over a defined spatiotemporal grid, and no attempt is made to introduce any quantification of uncertainty into its construction. [emph. added]

Reductionism argues that deterministic approaches to science and positivist views of causation are the appropriate methodologies for exploring complex, multivariate systems. The difficulty is that a successful reductionist explanation need not imply the possibility of a successful constructionist approach, i.e., one where the behavior of a complex system can be deduced from the fundamental reductionist understanding. Rather, large, complex systems may be better understood, and perhaps only understood, in terms of observed, emergent behavior. The practical implication is that there exist system behaviors and structures that are not amenable to explanation or prediction by reductionist methodologies. [emph. added]
-- Stephan Harrison and David Stainforth, "Predicting Climate Change: Lessons from Reductionism, Emergence and the Past" (2009)
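The point about model construction can be made concrete with a toy that is nothing like a real GCM in scale or content. Here is a minimal sketch, assuming a 1-D viscous Burgers' equation stepped forward on a fixed grid (the equation, grid sizes, and parameter values are illustrative assumptions, not taken from any climate model): the integration is purely deterministic, and no quantification of uncertainty appears anywhere in its construction.

```python
import numpy as np

# Grid and parameter choices are illustrative assumptions only.
nx, nt = 200, 2000                      # spatial points, time steps
dx, dt, nu = 1.0 / nx, 1e-4, 0.01       # spacing, step size, viscosity

x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.sin(2 * np.pi * x)               # one exactly specified initial state

for _ in range(nt):
    # centered differences for the nonlinear advection and diffusion terms,
    # with periodic boundaries via np.roll
    dudx = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    d2udx2 = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    u = u + dt * (-u * dudx + nu * d2udx2)

# One set of inputs in, one trajectory out: nothing in the construction
# itself says anything about how uncertain that trajectory is.
print(u[:5])
```

Whatever uncertainty gets attached to such a run has to be bolted on afterward, by varying the inputs, which is the subject of the next remark.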
When climate modelers work to characterize uncertainties in their model, they focus on initial condition uncertainty and parametric (parameter and parameterization) uncertainty. 
[And this is why model results are always stated with greater certainty than is actually justified: the uncertainty in the parameters will always be less than the uncertainty in the actual predictions, because structural uncertainty never enters the accounting.]
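That bracketed point can be illustrated with a deliberately trivial toy (everything below -- the equations, the numbers, the noise levels -- is an illustrative assumption, not anything drawn from a climate model). The "truth" contains a forcing term that the "model" structurally lacks, so an ensemble that perturbs only the model's initial condition and parameter measures parametric spread while remaining blind to the structural error:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, nsteps = 0.01, 300                  # illustrative step size and horizon

def run(x0, a, forcing):
    """Euler-integrate dx/dt = -a*x + forcing(t) and return the final state."""
    x = x0
    for k in range(nsteps):
        x = x + dt * (-a * x + forcing(k * dt))
    return x

# "Truth": the real system has a forcing term.
truth = run(1.0, 1.0, lambda t: 0.5 * np.sin(t))

# "Model": structurally wrong -- the forcing term is simply absent.
# The only uncertainty quantified is the spread from perturbing the
# initial condition and the parameter a.
ensemble = np.array([
    run(1.0 + rng.normal(0, 0.05), 1.0 + rng.normal(0, 0.05), lambda t: 0.0)
    for _ in range(500)
])

print(f"quantified (parametric) spread: {ensemble.std():.3f}")
print(f"actual error against 'truth':   {abs(ensemble.mean() - truth):.3f}")
```

Run as written, the quantified spread comes out far smaller than the actual error against the "truth"; the structural omission never shows up in the ensemble, which is exactly the sense in which stated uncertainties understate the real ones.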

1 comment:

  1. EPA air quality models are used to predict downwind pollutant concentrations and have been used for regulatory purposes for many years. The results are typically within a factor of two or three of measured pollutant concentrations. So how in the heck can global temperatures be successfully predicted when GCMs have much less data to work with than the short-term (<1 year), short-distance (<25 mi) EPA regulatory models?

