2012

  1. [Annan and Hargreaves 2012] Abstract. We investigate the identifiability of the climate by limited proxy data. We test a data assimilation approach through perfect model pseudoproxy experiments, using a simple likelihood-based weighting based on the particle filtering process. Our experimental set-up enables us to create a massive 10 000-member ensemble at modest computational cost, thus enabling us to generate statistically robust results. We find that the method works well when data are sparse and imprecise, but in this case the reconstruction has a rather low accuracy as indicated by residual RMS errors. Conversely, when data are relatively plentiful and accurate, the estimate tracks the target closely, at least when considering the hemispheric mean. However, in this case, our prior ensemble size of 10 000 appears to be inadequate to correctly represent the true posterior, and the regional performance is poor. Using correlations to assess performance gives a more encouraging picture, with significant correlations ranging from about 0.3 when data are sparse to values over 0.7 when data are plentiful, but the residual RMS errors are substantial in all cases. Our results imply that caution is required in interpreting climate reconstructions, especially when considering the regional scale, as skill on this basis is markedly lower than on the large scale of hemispheric mean temperature.
    Paleoclimate assimilation, pseudoproxy
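    A minimal sketch of the likelihood-based particle weighting described above (my own illustration, not the authors' code; the Gaussian observation error and the toy values are assumptions):

```python
import math

def particle_weights(ensemble, obs, obs_error):
    """Gaussian likelihood weights for one particle-filter analysis step.

    ensemble  -- model-simulated proxy values, one per particle
    obs       -- the observed (pseudo)proxy value
    obs_error -- assumed observational error standard deviation
    """
    # Un-normalised Gaussian likelihood of each particle given the observation
    likes = [math.exp(-0.5 * ((x - obs) / obs_error) ** 2) for x in ensemble]
    total = sum(likes)
    # Normalise so the weights sum to one
    return [w / total for w in likes]

def weighted_mean(ensemble, weights):
    """Posterior (weighted) ensemble mean."""
    return sum(x * w for x, w in zip(ensemble, weights))
```

    With sparse, imprecise data (large `obs_error`) the weights stay near-uniform and the posterior barely moves from the prior; with precise data the weight concentrates on a few particles, which is one way a 10 000-member prior can become inadequate.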

  2. [Anchukaitis et al. 2012] To the Editor  In their Letter, Mann and colleagues1 claim to have identified a discrepancy between the degree of volcanic cooling in climate model simulations and the analogous cooling indicated in a tree-ring-based Northern Hemisphere temperature reconstruction2, and attribute it to a putative temporary cessation of tree growth at some sites near the temperature limit for growth. They argue that this growth cessation would lead to missing rings in cool years, thus resulting in underestimation of cooling in the tree-ring record. This suggestion implies that periods of volcanic cooling could result in widespread chronological errors in tree-ring-based temperature reconstructions1,3. Mann and colleagues base their conclusions solely on the evidence of a tree-ring-growth model.
    Tree ring response to volcanic forcing

  3. [Braconnot et al. 2012] There is large uncertainty about the magnitude of warming and how rainfall patterns will change in response to any given scenario of future changes in atmospheric composition and land use. The models used for future climate projections were developed and calibrated using climate observations from the past 40 years. The geologic record of environmental responses to climate changes provides a unique opportunity to test model performance outside this limited climate range. Evaluation of model simulations against palaeodata shows that models reproduce the direction and large-scale patterns of past changes in climate, but tend to underestimate the magnitude of regional changes. As part of the effort to reduce model-related uncertainty and produce more reliable estimates of twenty-first century climate, the Palaeoclimate Modelling Intercomparison Project is systematically applying palaeoevaluation techniques to simulations of the past run with the models used to make future projections. This evaluation will provide assessments of model performance, including whether a model is sufficiently sensitive to changes in atmospheric composition, as well as providing estimates of the strength of biosphere and other feedbacks that could amplify the model response to these changes and modify the characteristics of climate variability.
    Climate simulations, reconstructions, pmip3

  4. [Brands et al. 2012] In this study, we provide a worldwide overview on the expected sensitivity of downscaling studies to reanalysis choice. To this aim, we assess the similarity of middle-tropospheric variables (which are important for the development of both dynamical and statistical downscaling schemes) from ERA-40 and NCEP/NCAR reanalysis data on the daily timescale. For estimating distributional similarity, we use two comparable scores: the two-sample Kolmogorov-Smirnov statistic and the PDF-score. In addition, the similarity of the day-to-day sequences is evaluated with the Pearson correlation coefficient. As the most important result, the PDF-score is found to be inappropriate if the underlying data follow a mixed distribution. By providing global similarity maps for each variable under study, regions where reanalysis data should not be assumed to be perfect are detected. In contrast to geopotential and temperature, significant distributional dissimilarities for specific humidity are found in almost any region of the world. Moreover, for the latter these differences occur not only in the mean, but also in higher-order moments. However, when considering standardized anomalies, distributional and serial dissimilarities are negligible over most extratropical land areas. Since transformed reanalysis data are not appropriate for regional climate models (as opposed to statistical approaches), their results are expected to be more sensitive to reanalysis choice.
    Reanalysis, era40, ncep, similarity
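    The two-sample Kolmogorov-Smirnov statistic used above for distributional similarity can be sketched as follows (a toy implementation; for real work `scipy.stats.ks_2samp` would be the usual choice):

```python
import bisect

def ks_statistic(a, b):
    """Two-sample K-S statistic: the maximum absolute difference between
    the empirical CDFs of samples a and b (0 = identical empirical
    distributions, 1 = fully disjoint)."""
    a, b = sorted(a), sorted(b)

    def ecdf(sample, x):
        # Fraction of the sample less than or equal to x
        return bisect.bisect_right(sample, x) / len(sample)

    # The maximum ECDF difference is attained at an observed data point
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in set(a) | set(b))
```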

  5. [Brohan et al. 2012] Abstract. The current assessment that twentieth-century global temperature change is unusual in the context of the last thousand years relies on estimates of temperature changes from natural proxies (tree-rings, ice-cores, etc.) and climate model simulations. Confidence in such estimates is limited by difficulties in calibrating the proxies and systematic differences between proxy reconstructions and model simulations. As the difference between the estimates extends into the relatively recent period of the early nineteenth century it is possible to compare them with a reliable instrumental estimate of the temperature change over that period, provided that enough early thermometer observations, covering a wide enough expanse of the world, can be collected. One organisation which systematically made observations and collected the results was the English East India Company (EEIC), and their archives have been preserved in the British Library. Inspection of those archives revealed 900 log-books of EEIC ships containing daily instrumental measurements of temperature and pressure, and subjective estimates of wind speed and direction, from voyages across the Atlantic and Indian Oceans between 1789 and 1834. Those records have been extracted and digitised, providing 273 000 new weather records offering an unprecedentedly detailed view of the weather and climate of the late eighteenth and early nineteenth centuries. The new thermometer observations demonstrate that the large-scale temperature response to the Tambora eruption and the 1809 eruption was modest (perhaps 0.5 °C). This provides an out-of-sample validation for the proxy reconstructions, supporting their use for longer-term climate reconstructions. However, some of the climate model simulations in the CMIP5 ensemble show much larger volcanic effects than this; such simulations are unlikely to be accurate in this respect.
    Model response to volcanic events, Krakatoa

  6. [Christiansen 2012]
    In their comment Tingley and Li (2011) acknowledge both the importance of relating proxies to local temperatures and the importance of choosing the proxy as the dependent variable. However, they raise questions about the behavior of the regression model and claim that Christiansen (2011) did not consider the uncertainty of the LOC method. They also strongly advocate a hierarchical Bayesian approach for the estimation of uncertainty. While it is true that the chosen regression model (step ii above) is ill-behaved for small slopes, it is for the present purpose superior when proxies without correlations to the local temperatures have been eliminated (step i above). In Christiansen (2011) the uncertainty was not considered by propagation of errors. Instead, the skill of the reconstructions was evaluated in a series of ensemble pseudo-proxy experiments; a method that in this context is better than the suggested Bayesian approach, as it can also address the problem of biases introduced by model choices. Christiansen and Ljungqvist (2011) used the pseudo-proxy approach to provide confidence intervals for the reconstructions. We will deal with these points in more detail below.
    Climate reconstructions, last millennium

  7. [Christidis et al. 2012]
    ABSTRACT: Seasonal mean temperatures averaged over the European region have warmed at a rate of 0.35–0.52 K/decade since 1980. The last decade has seen record-breaking seasonal temperatures in Europe including the summer of 2003 and the spring, autumn, and winter of 2007. Previous studies have established that European summer warming since the early twentieth century can be attributed to the effects of human influence. The attribution analysis described here employs temperature data from observations and experiments with two climate models and uses optimal fingerprinting to partition the climate response between its anthropogenic and natural components. These responses are subsequently combined with estimates of unforced climate variability to construct distributions of the annual values of seasonal mean temperatures with and without the effect of human activity. We find that in all seasons, anthropogenic forcings have shifted the temperature distributions towards higher values. We compute the associated change in the likelihood of having seasons whose temperatures exceed a pre-specified threshold. We first set the threshold equal to the seasonal temperature observed in a particular year to assess the effect of anthropogenic influences in past seasons. We find that in the last decade (1999–2008) it is extremely likely (probability greater than 95%) that the probability has more than doubled under the influence of human activity in spring and autumn, while for summer it is extremely likely that the probability has at least quadrupled. One of the two models employed in the analysis indicates it is extremely likely the probability has more than doubled in winter too. We also compute the change in probability over a range of temperature thresholds which enables us to provide updates on the likely change in probability attributable to human influence as soon as observations become available. Such near-real time information could be very useful for adaptation planning.
    Detection attribution in European seasons
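    The threshold-exceedance probabilities at the heart of this kind of attribution can be illustrated with shifted normal distributions (toy numbers, not the paper's; the real analysis uses fingerprint-scaled responses plus modelled internal variability):

```python
import math

def exceedance_prob(threshold, mean, sd):
    """P(T > threshold) for a normal distribution, via the complementary
    error function: 1 - Phi((threshold - mean) / sd)."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# Hypothetical seasonal-mean temperature anomaly distributions:
# 'natural only' vs. 'all forcings' shifted warmer by 0.5 K (assumed values)
p_nat = exceedance_prob(1.0, mean=0.0, sd=0.5)
p_all = exceedance_prob(1.0, mean=0.5, sd=0.5)
risk_ratio = p_all / p_nat  # change in likelihood attributable to the shift
```

    Even a modest shift of one standard deviation multiplies the exceedance probability several-fold, which is the sense in which the paper reports "doubled" or "quadrupled" probabilities.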

  8. [Cook et al. 2012]
    ABSTRACT: We develop a summer temperature reconstruction for temperate East Asia based on a network of annual tree-ring chronologies covering the period 800–1989 C.E. The East Asia reconstruction is the regional average of 585 individual grid point summer temperature reconstructions produced using an ensemble version of point-by-point regression. Statistical calibration and validation tests indicate that the regional average possesses sufficient overall skill to allow it to be used to study the causes of temperature variability and change over the region. The reconstruction suggests a moderately warm early medieval epoch (ca. 850–1050 C.E.), followed by generally cooler ‘Little Ice Age’ conditions (ca. 1350–1880 C.E.) and 20th century warming up to the present time. Since 1990, average temperature has exceeded past warm epochs of comparable duration, but it is not statistically unprecedented. Superposed epoch analysis reveals a volcanic forcing signal in the East Asia summer temperature reconstruction, resulting in pulses of cooler summer conditions that may persist for several years. Substantial uncertainties remain, however, particularly at lower frequencies, thus requiring caution and scientific prudence in the interpretation of this record.
    Drought reconstructions last millennium
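    Superposed epoch analysis, as used above to isolate the volcanic signal, simply composites a series around a set of key dates. A minimal sketch under assumed inputs (a year-keyed dict, not the authors' data structures):

```python
def superposed_epoch(series, event_years, window=(-2, 5)):
    """Composite a yearly series around event years.

    series      -- dict mapping year -> value (e.g. summer temperature anomaly)
    event_years -- key dates (e.g. volcanic eruption years)
    window      -- inclusive range of lags relative to each event
    Returns a dict lag -> mean value across all events with data at that lag.
    """
    out = {}
    for lag in range(window[0], window[1] + 1):
        vals = [series[y + lag] for y in event_years if (y + lag) in series]
        out[lag] = sum(vals) / len(vals) if vals else None
    return out
```

    Averaging across many events suppresses unrelated variability, so coherent post-event cooling pulses stand out even in a noisy reconstruction.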

  9. [Crowley and Unterman 2012] This technical report describes details of developing a volcano forcing reconstruction (Crowley et al., 2008) for climate models that is based primarily on sulphate records in Antarctic and Greenland ice cores. The chronology of eruptions is considered accurate to within 1 yr for the interval AD 1104–2000 and 2 yr for AD 800–1103. The reconstruction involves: (1) calibration against satellite aerosol optical depth (AOD) estimates of the 1991 Pinatubo/Hudson eruptions; (2) partial validation against independent lunar estimates of AOD and global sulphate emissions; (3) partial assessment of uncertainties in AOD estimates; (4) assessment of possible tropical 'false positives' in ice core reconstructions due to the simultaneous occurrence of mid/high-latitude eruptions in each hemisphere; (5) identification of a new category of eruptions, termed 'unipolar' tropical eruptions, in which the eruption plume penetrates mainly to polar regions in only the hemisphere of its eruption; (6) use of different growth curves for high- and low-latitude eruptions; (7) specification of 2/3 power shortwave scaling for eruptions larger than the 1991 Pinatubo eruption; and (8) compensatory introduction of an estimate of effective particle size that affects the lifetime and scattering properties of stratospheric aerosols.

  10. [Dunn et al. 2012] Abstract. This paper describes the creation of HadISD: an automatically quality-controlled synoptic resolution dataset of temperature, dewpoint temperature, sea-level pressure, wind speed, wind direction and cloud cover from global weather stations for 1973–2011. The full dataset consists of over 6000 stations, with 3427 long-term stations deemed to have sufficient sampling and quality for climate applications requiring sub-daily resolution. As with other surface datasets, coverage is heavily skewed towards Northern Hemisphere mid-latitudes. The dataset is constructed from a large pre-existing ASCII flatfile data bank that represents over a decade of substantial effort at data retrieval, reformatting and provision. These raw data have had varying levels of quality control applied to them by individual data providers. The work proceeded in several steps: merging stations with multiple reporting identifiers; reformatting to netCDF; quality control; and then filtering to form a final dataset. Particular attention has been paid to maintaining true extreme values where possible within an automated, objective process. Detailed validation has been performed on a subset of global stations and also on UK data using known extreme events to help finalise the QC tests. Further validation was performed on a selection of extreme events world-wide (Hurricane Katrina in 2005, the cold snap in Alaska in 1989 and heat waves in SE Australia in 2009). Some very initial analyses are performed to illustrate some of the types of problems to which the final data could be applied. Although the filtering has removed the poorest station records, no attempt has been made to homogenise the data thus far, due to the complexity of retaining the true distribution of high-resolution data when applying adjustments. Hence non-climatic, time-varying errors may still exist in many of the individual station records and care is needed in inferring long-term trends from these data. This dataset will allow the study of high frequency variations of temperature, pressure and humidity on a global basis over the last four decades. Both individual extremes and the overall population of extreme events could be investigated in detail to allow for comparison with past and projected climate. A version-control system has been constructed for this dataset to allow for the clear documentation of any updates and corrections in the future.
    Quality control

  11. [Gao et al. 2012]
    Correction to [Gao et al. 2008]
    In the paper "Volcanic forcing of climate over the past 1500 years: An improved ice core-based index for climate models" by C. Gao et al. (Journal of Geophysical Research, 113, D23111, doi:10.1029/2008JD010239, 2008) an erroneous longitude was given for one of the ice cores (DML B33) in Table 1 as well as Figure 1. The correct coordinates should be 75.2°S, 6.5°E instead of 75.2°S, 6.5°W. The corrected Figure 1 and Table 1 are below.
    volcanic forcing

  12. [Goll et al. 2012]
    Terrestrial carbon (C) cycle models applied for climate projections simulate a strong increase in net primary productivity (NPP) due to elevated atmospheric CO2 concentration during the 21st century. These models usually neglect the limited availability of nitrogen (N) and phosphorus (P), nutrients that commonly limit plant growth and soil carbon turnover. To investigate how the projected C sequestration is altered when stoichiometric constraints on C cycling are considered, we incorporated a P cycle into the land surface model JSBACH (Jena Scheme for Biosphere–Atmosphere Coupling in Hamburg), which already includes representations of coupled C and N cycles. The model reveals a distinct geographic pattern of P and N limitation. Under the SRES (Special Report on Emissions Scenarios) A1B scenario, the accumulated land C uptake between 1860 and 2100 is 13 % (particularly at high latitudes) and 16 % (particularly at low latitudes) lower in simulations with N and P cycling, respectively, than in simulations without nutrient cycles. The combined effect of both nutrients reduces land C uptake by 25 % compared to simulations without N or P cycling. Nutrient limitation in general may be biased by the model simplicity, but the ranking of limitations is robust against the parameterization and the inflexibility of stoichiometry. After 2100, increased temperature and high CO2 concentration cause a shift from N to P limitation at high latitudes, while nutrient limitation in the tropics declines. The increase in P limitation at high latitudes is induced by a strong increase in NPP and the low P sorption capacity of soils, while a decline in tropical NPP due to high autotrophic respiration rates alleviates N and P limitations. The quantification of P limitation remains challenging. The poorly constrained processes of soil P sorption and biochemical mineralization are identified as the main uncertainties in the strength of P limitation. Even so, our findings indicate that global land C uptake in the 21st century is likely overestimated in models that neglect P and N limitations. In the long term, insufficient P availability might become an important constraint on C cycling at high latitudes. Accordingly, we argue that the P cycle must be included in global models used for C cycle projections.
    carbon model

  13. [Gómez-Navarro et al. 2012]
    Abstract. In this study we analyse the role of internal variability in regional climate simulations through a comparison of two regional paleoclimate simulations for the last millennium. They share the same external forcings and model configuration, differing only in the initial condition used to run the driving global model simulation. A comparison of these simulations allows us to study the role of internal variability in climate models at regional scales, and how it affects the long-term evolution of climate variables such as temperature and precipitation. The results indicate that, although temperature is homogeneously sensitive to the effect of external forcings, the evolution of precipitation is more strongly governed by random unpredictable internal dynamics. There are, however, some areas where the role of internal variability is lower than expected, allowing precipitation to respond to the external forcings. In this respect, we explore the underlying physical mechanisms responsible for it. This study identifies areas, depending on the season, in which a direct comparison between model simulations of precipitation and climate reconstructions would be meaningful, but also other areas where good agreement between them should not be expected even if both are perfect.
    Regional climate simulations last millennium.

  14. [Goosse et al. 2012a]
    The spatial pattern and potential dynamical origin of the Medieval Climate Anomaly (MCA, around 1000 AD) in Europe are assessed with two recent reconstructions and simulations constrained to follow those reconstructions by means of paleoclimate data assimilation. The simulations employ a climate model of intermediate complexity (LOVECLIM). The data assimilation technique is based on a particle filter using an ensemble of 96 simulations. The peak winter (and annual mean) warming during the MCA, in our analyses, is found to be strongest at high latitudes, associated with strengthened mid-latitude westerlies. Summer warmth, by contrast, is found to be greatest in southern Europe and the Mediterranean Sea, associated with reduced westerlies and strengthened southerly winds off North Africa. The results of our analysis thus underscore the complexity of the spatial and seasonal structure of the MCA in Europe.
    Climate reconstructions, last millennium, Data assimilation

  15. [Goosse et al. 2012b]
    Abstract Proxy reconstructions suggest that peak global temperature during the past warm interval known as the Medieval Climate Anomaly (MCA, roughly 950–1250 AD) has been exceeded only during the most recent decades. To better understand the origin of this warm period, we use model simulations constrained by data assimilation establishing the spatial pattern of temperature changes that is most consistent with forcing estimates, model physics and the empirical information contained in paleoclimate proxy records. These numerical experiments demonstrate that the reconstructed spatial temperature pattern of the MCA can be explained by a simple thermodynamical response of the climate system to relatively weak changes in radiative forcing combined with a modification of the atmospheric circulation, displaying some similarities with the positive phase of the so-called Arctic Oscillation, and with northward shifts in the position of the Gulf Stream and Kuroshio currents. The mechanisms underlying the MCA are thus quite different from anthropogenic mechanisms responsible for modern global warming.
    Climate reconstructions, last millennium, Data assimilation

  16. [Huber and Knutti 2011]
    The Earth's energy balance is key to understanding climate and climate variations that are caused by natural and anthropogenic changes in the atmospheric composition. Despite abundant observational evidence for changes in the energy balance over the past decades, the formal detection of climate warming and its attribution to human influence has so far relied mostly on the difference between spatio-temporal warming patterns of natural and anthropogenic origin. Here we present an alternative attribution method that relies on the principle of conservation of energy, without assumptions about spatial warming patterns. Based on a massive ensemble of simulations with an intermediate-complexity climate model we demonstrate that known changes in the global energy balance and in radiative forcing tightly constrain the magnitude of anthropogenic warming. We find that since the mid-twentieth century, greenhouse gases contributed 0.85 °C of warming (5–95% uncertainty: 0.6–1.1 °C), about half of which was offset by the cooling effects of aerosols, with a total observed change in global temperature of about 0.56 °C. The observed trends are extremely unlikely (<5%) to be caused by internal variability, even if current models were found to strongly underestimate it. Our method is complementary to optimal fingerprinting attribution and produces fully consistent results, thus suggesting an even higher confidence that human-induced causes dominate the observed warming.
    Attribution, 20th century, EBM

  17. [Lawrence et al. 2012]
    To assess the climate impacts of historical and projected land cover change in the Community Climate System Model, version 4 (CCSM4), new time series of transient Community Land Model, version 4 (CLM4) plant functional type (PFT) and wood harvest parameters have been developed. The new parameters capture the dynamics of the Coupled Model Intercomparison Project phase 5 (CMIP5) land cover change and wood harvest trajectories for the historical period from 1850 to 2005 and for the four representative concentration pathway (RCP) scenarios from 2006 to 2100. Analysis of the biogeochemical impacts of land cover change in CCSM4 reveals that the model produced a historical cumulative land use flux of 127.7 PgC from 1850 to 2005, which is in general agreement with other global estimates of 156 PgC for the same period. The biogeophysical impact of the transient land cover change parameters was a cooling of the near-surface atmosphere over land by −0.1°C, through increased surface albedo and reduced shortwave radiation absorption. When combined with other transient climate forcings, the higher albedo from land cover change was counteracted by decreasing snow albedo from black carbon deposition and high-latitude warming. The future CCSM4 RCP simulations showed that the CLM4 transient PFT parameters can be used to represent a wide range of land cover change scenarios. In the reforestation scenario of RCP 4.5, CCSM4 simulated a drawdown of 67.3 PgC from the atmosphere into the terrestrial ecosystem and product pools. By contrast the RCP 8.5 scenario with deforestation and high wood harvest resulted in the release of 30.3 PgC currently stored in the ecosystem.
    Biogeochemical, biogeophysical changes with LULC

  18. [Ljungqvist et al. 2012]
    We analyse the spatio-temporal patterns of temperature variability over Northern Hemisphere land areas, on centennial time-scales, for the last 12 centuries using an unprecedentedly large network of temperature-sensitive proxy records. Geographically widespread positive temperature anomalies are observed from the 9th to 11th centuries, similar in extent and magnitude to the 20th century mean. A dominance of widespread negative anomalies is observed from the 16th to 18th centuries. Though we find the amplitude and spatial extent of the 20th century warming is within the range of natural variability over the last 12 centuries, we also find that the rate of warming from the 19th to the 20th century is unprecedented in the context of the last 1200 yr. The positive Northern Hemisphere temperature change from the 19th to the 20th century is clearly the largest between any two consecutive centuries in the past 12 centuries. These results remain robust even after removing a significant number of proxies in various tests of robustness showing that the choice of proxies has no particular influence on the overall conclusions of this study.
    Climate reconstructions, last millennium

  19. [Lehner et al. 2012]
    The reconstruction of past atmospheric circulation is crucial for the understanding of natural climate change and its driving factors. A recent reconstruction suggests that, during Medieval times, the European region was dominated by a persistent positive phase of the North Atlantic Oscillation (NAO), followed by a shift to a more oscillatory behavior. We test this hypothesis and the concept underlying the reconstruction in a pseudo-proxy approach using instrumental records, reanalysis data sets and millennial simulations with four different climate models. While a shift from a more positive to a more negative phase of the NAO seems to be likely, the amplitude and persistence of the reconstructed positive phase cannot be reproduced by models. The analysis further reveals that proxy locations that were used in the reconstruction are not always sufficient to describe the NAO. This is reflected in a failure of the reconstruction to verify against instrumental records of the NAO in the 19th century. By adding complementary proxies, the robustness of an NAO reconstruction can be improved to the degree that it would withstand the tests presented here.
    Pseudoreality, testing NAO

  20. [Levitus et al. 2012]
    Abstract: We provide updated estimates of the change of ocean heat content and the thermosteric component of sea level change of the 0–700 and 0–2000 m layers of the World Ocean for 1955–2010. Our estimates are based on historical data not previously available, additional modern data, and bathythermograph data corrected for instrumental biases. We have also used Argo data corrected by the Argo DAC if available and used uncorrected Argo data if no corrections were available at the time we downloaded the Argo data. The heat content of the World Ocean for the 0–2000 m layer increased by 24.0 ± 1.9 × 10²² J (±2 S.E.) corresponding to a rate of 0.39 W m⁻² (per unit area of the World Ocean) and a volume mean warming of 0.09°C. This warming corresponds to a rate of 0.27 W m⁻² per unit area of earth's surface. The heat content of the World Ocean for the 0–700 m layer increased by 16.7 ± 1.6 × 10²² J corresponding to a rate of 0.27 W m⁻² (per unit area of the World Ocean) and a volume mean warming of 0.18°C. The World Ocean accounts for approximately 93% of the warming of the earth system that has occurred since 1955. The 700–2000 m ocean layer accounted for approximately one-third of the warming of the 0–2000 m layer of the World Ocean. The thermosteric component of the sea level trend was 0.54 ± 0.05 mm yr⁻¹ for the 0–2000 m layer and 0.41 ± 0.04 mm yr⁻¹ for the 0–700 m layer of the World Ocean for 1955–2010.
    Climate variability, OHC
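    As a consistency check on the quoted numbers, the heat-content change converts to the stated fluxes using round-number areas (my own back-of-envelope arithmetic; the areas are common approximations, not the paper's values):

```python
# Assumed round-number constants
SECONDS_PER_YEAR = 3.156e7   # s
OCEAN_AREA = 3.6e14          # m^2, approximate area of the World Ocean
EARTH_AREA = 5.1e14          # m^2, approximate surface area of the Earth

delta_q = 24.0e22            # J, 0-2000 m heat content change, 1955-2010
years = 2010 - 1955

# Mean heating rate per unit area over the 55-year period
rate_ocean = delta_q / (years * SECONDS_PER_YEAR) / OCEAN_AREA  # ~0.39 W/m^2
rate_earth = delta_q / (years * SECONDS_PER_YEAR) / EARTH_AREA  # ~0.27 W/m^2
```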

  21. [Loeb et al. 2012]
    Global climate change results from a small yet persistent imbalance between the amount of sunlight absorbed by Earth and the thermal radiation emitted back to space1. An apparent inconsistency has been diagnosed between interannual variations in the net radiation imbalance inferred from satellite measurements and the upper-ocean heating rate from in situ measurements, and this inconsistency has been interpreted as 'missing energy' in the system2. Here we present a revised analysis of net radiation at the top of the atmosphere from satellite data, and we estimate ocean heat content, based on three independent sources. We find that the difference between the heat balance at the top of the atmosphere and upper-ocean heat content change is not statistically significant when accounting for observational uncertainties in ocean measurements3, given transitions in instrumentation and sampling. Furthermore, variability in Earth's energy imbalance relating to El Niño-Southern Oscillation is found to be consistent within observational uncertainties among the satellite measurements, a reanalysis model simulation and one of the ocean heat content records. We combine satellite data with ocean measurements to depths of 1,800 m, and show that between January 2001 and December 2010, Earth has been steadily accumulating energy at a rate of 0.50 ± 0.43 W m⁻² (uncertainties at the 90% confidence level). We conclude that energy storage is continuing to increase in the sub-surface ocean.
    Climate reconstructions, last millennium

  22. [Mann et al. 2012b,Mann et al. 2012a]
    The largest eruption of a tropical volcano during the past millennium occurred in AD 1258/1259. Its estimated radiative forcing was several times larger than that of the 1991 Pinatubo eruption1. Radiative forcing of that magnitude is expected to result in a climate cooling of about 2 °C (refs 2–5). This effect, however, is largely absent from tree-ring reconstructions of temperature6–8, and is muted in reconstructions that employ a mix of tree-rings and other proxy data9,10. This discrepancy has called into question the climate impact of the eruption2,5,11. Here we use a tree-growth model driven by simulated temperature variations to show that the discrepancy between expected and reconstructed temperatures is probably an artefact caused by a reduced sensitivity to cooling in trees that grow near the treeline. This effect is compounded by the secondary effects of chronological errors due to missing growth rings and volcanically induced alterations of diffuse light. We support this conclusion with an assessment of synthetic proxy records created using the simulated temperature variations. Our findings suggest that the evidence from tree rings is consistent with a substantial climate impact2–5 of volcanic eruptions in past centuries that is greater than that estimated by tree-ring-based temperature reconstructions.
    Climate reconstructions, last millennium

  23. [Martínez-Peña et al. 2012b]
    Mushrooms in general, and Boletus edulis and Lactarius deliciosus in particular, are important non-wood forest products worldwide. Despite their economic and ecological importance, models that describe the influence of different factors on mushroom yield are few. These models would support multi-objective forest management and planning that takes into account mushroom production. This study aims at providing models for predicting the total yield of wild ectomycorrhizal mushrooms and, especially, of L. group deliciosus and B. edulis. Mushroom data were collected in 18 permanent plots in pure even-aged Pinus sylvestris stands during fifteen consecutive years. Variables describing weather conditions, stand structure and local site characteristics were used as predictors in the modeling process. Rainfall and temperature were significant predictors in all the fitted models. In addition, the total yield of ectomycorrhizal fungi was significantly affected by dominant height and stand age. The production of L. group deliciosus was influenced by dominant height and stand basal area. The equation fitted for B. edulis is, to our knowledge, the first model for this species. It shows that stand basal area is a strong factor influencing the yield. The equations presented in this study enable predictions of mushroom yield under different forest management schedules and climatic scenarios.
    Boletus edulis, Lactarius

  24. [Martínez-Peña et al. 2012a]
    Abstract With the aim of increasing knowledge of community structure, dynamics and production of ectomycorrhizal fungi, edible sporocarp yields were monitored between 1995 and 2004 in a Pinus sylvestris stand in the northeast zone of the Iberian Peninsula. A random sampling design was performed by stand age class according to the forest management plan: 0–15, 16–30, 31–50, 51–70 and over 71-years-old. Eighteen 150 m plots were established and sampled weekly every year from September to December. One hundred and nineteen taxa belonging to 51 genera were collected, 40 of which were edible and represented 74% of the total biomass. Boletus edulis, Lactarius deliciosus, Cantharellus cibarius and Tricholoma portentosum sporocarps, which are considered to be of high commercial value, represented 34% of the total production. B. edulis and L. deliciosus were the most remarkable and abundant species, and both were collected in more than 60% of the samplings. B. edulis fructified every year of the experiment; its mean production was 40 kg/ha per year and its maximum productivity was more than 94 kg/ha in 1998. The age class with the largest production of this taxon was the fourth (51–70 years), with 70 kg/ha. L. deliciosus only failed to fructify one autumn (2000); its mean production was almost 10 kg/ha and its maximum productivity was close to 30 kg/ha.
    Boletus edulis, Lactarius

  25. [Miller et al. 2012]
    Northern Hemisphere summer temperatures over the past 8000 years have been paced by the slow decrease in summer insolation resulting from the precession of the equinoxes. However, the causes of superposed century-scale cold summer anomalies, of which the Little Ice Age (LIA) is the most extreme, remain debated, largely because the natural forcings are either weak or, in the case of volcanism, short lived. Here we present precisely dated records of ice-cap growth from Arctic Canada and Iceland showing that LIA summer cold and ice growth began abruptly between 1275 and 1300 AD, followed by a substantial intensification in 1430–1455 AD. Intervals of sudden ice growth coincide with two of the most volcanically perturbed half centuries of the past millennium. A transient climate model simulation shows that explosive volcanism produces abrupt summer cooling at these times, and that cold summers can be maintained by sea-ice/ocean feedbacks long after volcanic aerosols are removed. Our results suggest that the onset of the LIA can be linked to an unusual 50-year-long episode with four large sulfur-rich explosive eruptions, each with global sulfate loading >60 Tg. The persistence of cold summers is best explained by consequent sea-ice/ocean feedbacks during a hemispheric summer insolation minimum; large changes in solar irradiance are not required.
    Climate reconstructions, LIA, MCA

  26. [Moberg 2012] Christiansen and Ljungqvist (2011, doi:10.1175/2011JCLI4145.1) have presented an extratropical NH temperature reconstruction using a method (LOC) that they claim preserves low-frequency variability, at the expense of exaggerated high-frequency variability. Using theoretical arguments and a pseudoproxy experiment, it is demonstrated here that the LOC method is not guaranteed to preserve variability at any frequency. Rather, LOC reconstructions will have more variance than true large-scale temperature averages at all frequencies. This variance inflation, however, can be negligible at those frequencies where the noise variance in individual proxies is small enough to be effectively cancelled when computing an average over the available proxies. Because the proxy noise variance at low frequencies cannot be directly estimated, and thus has to be regarded as unknown, it is safer to regard a reconstruction with the LOC method as providing an estimate of the upper bound of the large-scale low-frequency temperature variability rather than a correct estimate of this variance.
    Climate reconstructions, last millennium
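The inflation mechanism argued above, in which inverting local proxy-temperature regressions carries proxy noise into the large-scale mean where it does not fully cancel, can be illustrated numerically. A minimal sketch with synthetic data (all numbers invented; this mimics only the local inverse-regression step, not the full LOC procedure):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_years = 15, 2000

# Local temperatures: a common large-scale signal plus independent local deviations.
signal = rng.normal(0.0, 1.0, n_years)
local_t = signal + rng.normal(0.0, 0.5, (n_sites, n_years))

# Proxies respond linearly to *local* temperature, with additive noise.
proxies = 0.8 * local_t + rng.normal(0.0, 1.5, (n_sites, n_years))

# Fit the forward relation proxy = a*T + b at each site, then invert it to
# reconstruct local temperature; the noise term e/a is carried along.
recon = np.empty_like(local_t)
for i in range(n_sites):
    a, b = np.polyfit(local_t[i], proxies[i], 1)
    recon[i] = (proxies[i] - b) / a

true_mean = local_t.mean(axis=0)    # target: the true large-scale average
recon_mean = recon.mean(axis=0)     # reconstruction of that average
print(true_mean.var(), recon_mean.var())  # reconstruction variance is larger
```

Because the residual noise terms are independent across sites, their contribution to the averaged reconstruction shrinks only as 1/n and never vanishes, so the reconstructed mean is over-dispersed relative to the true mean, consistent with the upper-bound interpretation in the abstract.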

  27. [Morice et al. 2012] Recent developments in observational near-surface air temperature and sea-surface temperature analyses are combined to produce HadCRUT4, a new data set of global and regional temperature evolution from 1850 to the present. This includes the addition of newly digitized measurement data, both over land and sea, new sea-surface temperature bias adjustments and a more comprehensive error model for describing uncertainties in sea-surface temperature measurements. An ensemble approach has been adopted to better describe complex temporal and spatial interdependencies of measurement and bias uncertainties and to allow these correlated uncertainties to be taken into account in studies that are based upon HadCRUT4. Climate diagnostics computed from the gridded data set broadly agree with those of other global near-surface temperature analyses. Fitted linear trends in temperature anomalies are approximately 0.07 °C/decade from 1901 to 2010 and 0.17 °C/decade from 1979 to 2010 globally. Northern/southern hemispheric trends are 0.08/0.07 °C/decade over 1901 to 2010 and 0.24/0.10 °C/decade over 1979 to 2010. Linear trends in other prominent near-surface temperature analyses agree well with the range of trends computed from the HadCRUT4 ensemble members.
    Global temp uncertainties
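The decadal trends quoted above are ordinary least-squares slopes of anomaly series rescaled to degrees per decade; a minimal sketch on synthetic data (the series below is illustrative, not HadCRUT4 values):

```python
import numpy as np

def trend_per_decade(years, anomalies):
    """OLS slope of anomalies vs. year, expressed in degrees C per decade."""
    slope_per_year = np.polyfit(years, anomalies, 1)[0]
    return 10.0 * slope_per_year

years = np.arange(1979, 2011)                 # 1979-2010 inclusive
anomalies = 0.017 * (years - 1979) + 0.1      # synthetic series warming 0.17 C/decade
print(round(trend_per_decade(years, anomalies), 2))  # -> 0.17
```

In the HadCRUT4 ensemble the same fit is repeated for each ensemble member, and the spread of the resulting slopes expresses the trend uncertainty.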

  28. [Rahmstorf et al. 2012] We analyse global temperature and sea-level data for the past few decades and compare them to projections published in the third and fourth assessment reports of the Intergovernmental Panel on Climate Change (IPCC). The results show that global temperature continues to increase in good agreement with the best estimates of the IPCC, especially if we account for the effects of short-term variability due to the El Niño/Southern Oscillation, volcanic activity and solar variability. The rate of sea-level rise of the past few decades, on the other hand, is greater than projected by the IPCC models. This suggests that IPCC sea-level projections for the future may also be biased low.
    IPCC projections and observations, climate change

  29. [Rodrigo et al. 2012] Abstract. In this work, a reconstruction of climatic conditions in Andalusia (southern Iberian Peninsula) during the period 1701–1850, as well as an evaluation of its associated uncertainties, is presented. This period is interesting because it is characterized by a minimum in solar irradiance (Dalton Minimum, around 1800), as well as intense volcanic activity (for instance, the eruption of Tambora in 1815), at a time when any increase in atmospheric CO2 concentrations was of minor importance. The reconstruction is based on the analysis of a wide variety of documentary data. The reconstruction methodology is based on counting the number of extreme events in the past, and inferring the mean value and standard deviation using the assumption of a normal distribution for the seasonal means of climate variables. This reconstruction methodology is tested within the pseudoreality of a high-resolution paleoclimate simulation performed with the regional climate model MM5 coupled to the global model ECHO-G. The results show that the reconstructions are influenced by the reference period chosen and the threshold values used to define extreme values. This creates uncertainties which are assessed within the context of the climate simulation. An ensemble of reconstructions was obtained using two different reference periods (1885–1915 and 1960–1990) etc.
    Regional MM5 simulations of the last millennium and documentary data
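The counting approach described above can be made concrete: if the fractions of seasons exceeding an upper extreme threshold u and falling below a lower threshold l are known, and the seasonal means are assumed normal, two quantile equations determine the mean and standard deviation. A minimal sketch (threshold values and frequencies are invented for illustration, not taken from the paper):

```python
from statistics import NormalDist

def infer_normal_params(u, l, frac_above_u, frac_below_l):
    """Solve u = mu + sigma*z_u and l = mu + sigma*z_l for (mu, sigma)."""
    z_u = NormalDist().inv_cdf(1.0 - frac_above_u)  # standard-normal quantile of u
    z_l = NormalDist().inv_cdf(frac_below_l)        # standard-normal quantile of l
    sigma = (u - l) / (z_u - z_l)
    mu = u - sigma * z_u
    return mu, sigma

# 10% of seasons above 12 C and 10% below 6 C -> mean of 9.0 C by symmetry
mu, sigma = infer_normal_params(u=12.0, l=6.0, frac_above_u=0.10, frac_below_l=0.10)
print(round(mu, 2), round(sigma, 2))
```

The dependence of the result on the chosen thresholds and on the reference period used to define "extreme" is exactly the source of uncertainty the abstract assesses against the MM5 pseudoreality.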

  30. [Schmidt et al. 2012] Update of forcings for the PMIP3 experiments for the Last Millennium, regarding land use and solar activity.
    Forcing, pmip3

  31. [Smerdon 2012] Millennium-length, forced transient simulations with fully coupled general circulation models have become important new tools for addressing uncertainties in global and hemispheric temperature reconstructions targeting the Common Era (the last two millennia). These model simulations are used as test beds on which to evaluate the performance of paleoclimate reconstruction methods using controlled and systematic investigations known as pseudoproxy experiments (PPEs). Such experiments are motivated by the fact that any given real-world reconstruction is the product of multiple uncontrolled factors, making it difficult to isolate the impact of one factor in reconstruction assessments and comparisons. PPEs have established a common experimental framework that can be systematically altered and evaluated, and thus test reconstruction methods and their dependencies. Although the translation of PPE results into real-world implications must be done cautiously, their experimental design attributes allow researchers to test reconstruction techniques beyond what was previously possible with real-world data alone. This review summarizes the development of PPEs and their findings over the last decade. The state of the science and its implications for global and hemispheric temperature reconstructions is also reviewed, as well as near-term design improvements that will expand the utility of PPEs.
    Pseudo reality

  32. [Stahle et al. 2012] Abstract A new tree-ring reconstruction of the Palmer Drought Severity Index (PDSI) for Mesoamerica from AD 771 to 2008 identifies megadroughts more severe and sustained than any witnessed during the twentieth century. Correlation analyses indicate strong forcing of instrumental and reconstructed June PDSI over Mesoamerica from the El Niño/Southern Oscillation (ENSO). Spectral analyses of the 1,238-year reconstruction indicate significant concentrations of variance at ENSO, sub-decadal, bi-decadal, and multidecadal timescales. Instrumental and model-based analyses indicate that the Atlantic Multidecadal Oscillation is important to warm season climate variability over Mexico. Ocean-atmospheric variability in the Atlantic is not strongly correlated with the June PDSI reconstruction during the instrumental era, but may be responsible for the strong multidecadal variance detected in the reconstruction episodically over the past millennium. June drought indices in Mesoamerica are negatively correlated with gridded June PDSI over the United States from 1950 to 2005, based on both instrumental and reconstructed data. Interannual variability in this latitudinal moisture gradient is due in part to ENSO forcing, where warm events favor wet June PDSI conditions over the southern US and northern Mexico, but dryness over central and southern Mexico (Mesoamerica). Strong anti-phasing between multidecadal regimes of tree-ring reconstructed June PDSI over Mesoamerica and reconstructed summer (JJA) PDSI over the Southwest has also been detected episodically over the past millennium, including the 1950s–1960s when La Niña and warm Atlantic SSTs prevailed, and the 1980s–1990s when El Niño and cold Atlantic SSTs prevailed. Several Mesoamerican megadroughts are reconstructed when wetness prevailed over the Southwest, including the early tenth century Terminal Classic Drought, implicating El Niño and Atlantic SSTs in this intense and widespread drought that may have contributed to social changes in ancient Mexico.
    Drought, ENSO, last millennium, North America

  33. [Taylor et al. 2012] The fifth phase of the Coupled Model Intercomparison Project (CMIP5), now underway, promises to produce a freely available state-of-the-art multimodel dataset designed to advance our knowledge of climate variability and climate change.
    CMIP5

  34. [Tingley and Li 2012] In a recent paper, Bo Christiansen presents and discusses LOC, a methodology for reconstructing past climate that is based on local regressions between climate proxy time series and instrumental time series (Christiansen 2010, hereafter C2010). LOC respects two important scientific facts about proxy data which are often overlooked, namely that many proxies are likely influenced by strictly local temperature, and that, to reflect causality, the proxies must be written as functions of climate, not vice versa. There are, however, several weaknesses to the LOC method: uncertainty is not propagated through the multiple stages of the analysis, the effects of observational errors in the instrumental observations are not considered, and, as the proxies become uninformative of climate, the variance of a reconstruction produced by LOC becomes unbounded, a result which is clearly unphysical. These shortcomings can be overcome by interpreting the LOC method in the context of recently proposed Bayesian hierarchical reconstruction methods.
    Climate reconstructions, last millennium

  35. [Tingley et al. 2012] Reconstructing a climate process in both space and time from incomplete instrumental and climate proxy time series is a problem with clear societal relevance that poses both scientific and statistical challenges. These challenges, along with the interdisciplinary nature of the reconstruction problem, point to the need for greater cooperation between the earth science and statistics communities, a sentiment echoed in recent parliamentary reports. As a step in this direction, it is prudent to formalize what is meant by the paleoclimate reconstruction problem using the language and tools of modern statistics. This article considers the challenge of inferring, with uncertainties, a climate process through space and time from overlapping instrumental and climate-sensitive proxy time series that are assumed to be well dated, an assumption that is likely only reasonable for certain proxies over at most the last few millennia. Within a unifying, hierarchical space-time modeling framework for this problem, the modeling assumptions made by a number of published methods can be understood as special cases, and the distinction between modeling assumptions and analysis or inference choices becomes more transparent.
The key aims of this article are to 1) establish a unifying modeling and notational framework for the paleoclimate reconstruction problem that is transparent to both the climate science and statistics communities; 2) describe how currently favored methods fit within this framework; 3) outline and distinguish between scientific and statistical challenges; 4) indicate how recent advances in the statistical modeling of large space-time data sets, as well as advances in statistical computation, can be brought to bear upon the problem; 5) offer, in broad strokes, some suggestions for model construction and how to perform the required statistical inference; and 6) identify issues that are important to both the climate science and applied statistics communities, and encourage greater collaboration between the two.
    Climate reconstructions, last millennium, bayesian methods, review

  36. [Trenberth and Fasullo 2012] Abstract The state of knowledge and outstanding issues with respect to the global mean energy budget of planet Earth are described, along with the ability to track changes over time. Best estimates of the main energy components involved in radiative transfer and energy flows through the climate system do not satisfy physical constraints for conservation of energy without adjustments. The main issues relate to the downwelling longwave (LW) radiation and the hydrological cycle, and thus the surface evaporative cooling. It is argued that the discrepancy is about 18 W m−2, of which only about 4 W m−2 can be readily accommodated, and that the latter component is most likely astray in some calculations, including many models, although there is also scope for precipitation estimates to be revised. Beginning in 2000, the top-of-atmosphere radiation measurements provide stable estimates of the net global radiative imbalance changes over a decade, but after 2004 there is missing energy, as the observing system of the changes in ocean heat content, melting of land ice, and so on is unable to account for where it has gone. Based upon a number of climate model experiments for the twenty-first century in which there are stases in global surface temperature and upper ocean heat content in spite of an identifiable global energy imbalance, we infer that the main sink of the missing energy is likely the deep ocean below 275 m depth.
    Radiative balance

  37. [Trouet et al. 2012] Within the last millennium, the transition between the Medieval Climate Anomaly (MCA; ca. 1000–1300 CE) and the Little Ice Age (LIA; ca. 1400–1800 CE) has been recorded in a global array of climatic and oceanographic proxies. In this study, we review proxy evidence for two alternative hypotheses for the effects of this shift in the North Atlantic region. One hypothesis postulates that the MCA/LIA transition included a weakening of the Atlantic Meridional Overturning Circulation (AMOC) and a transition to more negative North Atlantic Oscillation (NAO) conditions, resulting in a strong cooling of the North Atlantic region. The alternative hypothesis proposes a MCA/LIA shift to an increased number of storms over the North Atlantic, linked to increased mid-latitude cyclogenesis and hence a pervasive positive NAO state. The two sets of proxy records, and thus the two competing hypotheses, are then reconciled based on available results from climate model simulations of the last millennium. While an increase in storm frequency implicates positive NAO, increased intensity would be consistent with negative NAO during the LIA. Such an increase in cyclone intensity could have resulted from the steepening of the meridional temperature gradient as the poles cooled more strongly than the tropics from the MCA into the LIA.
    NAO, North Atlantic, last millennium, two conflicting hypotheses

  38. [Vuille et al. 2012] Abstract. We review the history of the South American summer monsoon (SASM) over the past 2000 yr based on high-resolution stable isotope proxies from speleothems, ice cores and lake sediments. Our review is complemented by an analysis of an isotope-enabled atmospheric general circulation model (GCM) for the past 130 yr. Proxy records from the monsoon belt in the tropical Andes and SE Brazil show a very coherent behavior over the past 2 millennia with significant decadal to multidecadal variability superimposed on large excursions during three key periods: the Medieval Climate Anomaly (MCA), the Little Ice Age (LIA) and the current warm period (CWP). We interpret these three periods as times when the SASM's mean state was significantly weakened (MCA and CWP) and strengthened (LIA), respectively. During the LIA each of the proxy archives considered contains the most negative δ18O values recorded during the entire record length. On the other hand, the monsoon strength is currently rather weak in a 2000-yr historical perspective, rivaled only by the low intensity during the MCA. Our climatic interpretation of these archives is consistent with our isotope-based GCM analysis, which suggests that these sites are sensitive recorders of large-scale monsoon variations. We hypothesize that these centennial-scale climate anomalies were at least partially driven by temperature changes in the Northern Hemisphere and in particular over the North Atlantic, leading to a latitudinal displacement of the ITCZ and a change in monsoon intensity (amount of rainfall upstream over the Amazon Basin). This interpretation is supported by several independent records from different proxy archives and modeling studies. Although ENSO is the main forcing for δ18O variability over tropical South America on interannual time scales, our results suggest that its influence may be significantly modulated by North Atlantic climate variability on longer time scales. 
Finally, our analyses indicate that isotopic proxies, because of their ability to integrate climatic information on large spatial scales, could complement more traditional proxies such as tree rings or documentary evidence. Future climate reconstruction efforts could potentially benefit from including isotopic proxies as large-scale predictors in order to better constrain past changes in the atmospheric circulation.
    Drought, monsoon, ITCZ, last2k

  39. [Yiou et al. 2012] The variability of the extra-tropical atmospheric circulation and its potential dependence on external forcings have been debated topics in the climate modeling and observation communities. A recent reconstruction of the North Atlantic Oscillation Index has argued that the Medieval Warm Period yielded a persistent positive phase of this index, in contrast with an oscillating mode during the Little Ice Age. This paper tests whether this feature can be obtained in millennium simulations from three different climate models. We examine the daily atmospheric dynamics that drive the main modes of extra-tropical variability. We find that the transition from a Medieval Warm Period to a Little Ice Age in the North Atlantic does not imply changes in patterns or frequency of weather regimes, although the mean surface temperature change is significant. This implies that the interpretation of proxy records in terms of atmospheric variability should be revised in order to take into account the structure of daily meteorological patterns, and/or that climate models are too constrained to infer large changes of atmospheric variability.
    Test NAO reconstructions, pseudoproxy