Exploring Atmosphere–Ocean Coupling Using Principal Component and Redundancy Analysis

2010
Vol 23 (18)
pp. 4926-4943
Author(s):  
Faez Bakalian ◽  
Harold Ritchie ◽  
Keith Thompson ◽  
William Merryfield

Abstract. Principal component analysis (PCA), which is designed to identify internal modes of variability, has often been applied beyond its intended design to study coupled modes of variability in combined datasets, an approach also referred to as combined PCA. There are statistical techniques better suited for this purpose, such as singular value decomposition (SVD) and canonical correlation analysis (CCA). In this paper, a different technique is examined that has not often been applied in climate science: redundancy analysis (RA). Similar to multivariate regression, RA seeks to maximize the variance accounted for in one random vector that is linearly regressed against another random vector. RA can be used for forecasting and prediction studies of the climate system. This technique has the added advantage that the time-lagged redundancy index offers a robust method of identifying lead–lag relations among climate variables. In this study, combined PCA and RA of global sea surface temperatures (SSTs) and sea level pressures (SLPs) are carried out for the National Centers for Environmental Prediction (NCEP) reanalysis data and a simulation of the Canadian Centre for Climate Modelling and Analysis (CCCma) climate model. A simplified state-space model is also constructed to aid in the diagnosis and interpretation of the results. The relative advantages and disadvantages of combined PCA and RA are discussed. Overall, RA tends to provide a clearer and more consistent picture of the underlying physical processes than combined PCA.
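Operationally, RA can be carried out as a PCA of the fitted values from a multivariate regression of one field on the other. The following minimal NumPy sketch (ours, for illustration; not the authors' code) shows the idea:

```python
import numpy as np

def redundancy_analysis(X, Y, n_modes=3):
    """Minimal RA sketch: X is (time, p) predictors, Y is (time, q) responses."""
    Xc = X - X.mean(axis=0)                      # center both fields
    Yc = Y - Y.mean(axis=0)
    B, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)  # multivariate regression of Y on X
    Yhat = Xc @ B                                # fitted values of Y
    U, s, Vt = np.linalg.svd(Yhat, full_matrices=False)
    redundancy = s**2 / np.sum(Yc**2)            # fraction of Y variance per mode
    return Vt[:n_modes], redundancy[:n_modes]    # Y-patterns and redundancy index
```

A time-lagged version simply shifts X relative to Y before the regression, which is how lead–lag relations can be screened with the redundancy index.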

2013
Vol 23 (1)
pp. 57
Author(s):  
Nicholas M Weber ◽  
Andrea K Thomer ◽  
Gary Strand

Model intercomparison projects are a unique and highly specialized form of data-intensive collaboration in the earth sciences. Typically, a set of pre-determined boundary conditions (scenarios) is agreed upon by a community of model developers, who then test and simulate each of those scenarios with individual 'runs' of a climate model. Because both the human expertise and the computational power needed to produce an intercomparison project are exceptionally expensive, the data they produce are often archived for the broader climate science community to use in future research. Outside of high energy physics and astronomy sky surveys, climate modeling intercomparisons are one of the largest and most rapid methods of producing data in the natural sciences (Overpeck et al., 2010).

But, like any collaborative eScience project, the discovery and broad accessibility of these data depend on classifications and categorizations in the form of structured metadata, namely the Climate and Forecast (CF) metadata standard, which provides a controlled vocabulary to normalize the naming of a dataset's variables. Intriguingly, the CF standard's original publication notes, "…conventions have been developed only for things we know we need. Instead of trying to foresee the future, we have added features as required and will continue to do this" (Gregory, 2003). Yet, qualitatively we've observed that this is not the case; although the time period of intercomparison projects remains stable (2-3 years), the scale and complexity of models and their output continue to grow, and thus data creation and variable names consistently outpace the ratification of CF.
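For readers unfamiliar with CF, the convention normalizes variable naming through attributes such as `standard_name`. The toy sketch below (hypothetical data and dataset name; `sea_surface_temperature` is a genuine CF standard name) shows what that looks like with xarray:

```python
import numpy as np
import xarray as xr

# Hypothetical toy variable carrying CF-style metadata: the controlled
# vocabulary fixes `standard_name`, while `long_name` and `units`
# describe the data for humans and unit-aware tools.
da = xr.DataArray(
    np.zeros((2, 3)),
    dims=("lat", "lon"),
    attrs={
        "standard_name": "sea_surface_temperature",  # CF controlled vocabulary
        "long_name": "sea surface temperature",
        "units": "K",
    },
)
ds = xr.Dataset({"tos": da}, attrs={"Conventions": "CF-1.8"})
```

A variable whose quantity is missing from the controlled vocabulary cannot be given a `standard_name` until the standard is extended, which is exactly the lag the authors observe.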


2021
Author(s):  
Letizia Elia ◽  
Susanna Zerbini ◽  
Fabio Raicich

We investigated a large network of permanent GPS stations to identify and analyse common patterns in the series of the GPS height, environmental parameters, and climate indexes.

The study is confined to Europe, the Mediterranean, and the northeastern Atlantic area, where 114 GPS stations were selected from the Nevada Geodetic Laboratory (NGL) archive. The GPS time series were selected on the basis of the completeness and the length of the series.

In addition to the GPS height, the parameters analysed in this study are the atmospheric surface pressure (SP), the terrestrial water storage (TWS), and a few climate indexes, such as the Multivariate ENSO Index (MEI). Principal component analysis (PCA) is the methodology adopted to extract the main patterns of space/time variability of the parameters. Moreover, the coupled modes of space/time interannual variability between pairs of variables were investigated using singular value decomposition (SVD).

Over the study area, the main modes of variability in the time series of the GPS height, SP, and TWS were identified. For each parameter, the main modes of variability are the first four. In particular, the first mode explains about 30% of the variance for the GPS height and TWS and about 46% for SP. The relevant spatial patterns are coherent over the entire study area in all three cases.

The SVD analysis of coupled parameters, namely H-SP and H-TWS (H being the GPS height), shows that most of the common variability is explained by the first three modes, which account for almost 80% and 45% of the covariance, respectively.

Finally, we investigated the relation between the GPS height and a few climate indexes. Significant correlations, up to 50%, were found between the MEI and about half of the stations in the network.
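For reference, PCA of a space-time data matrix reduces to an SVD of its anomalies; a minimal sketch (ours, not the authors' code) is:

```python
import numpy as np

def eof_analysis(field, n_modes=4):
    """PCA/EOF sketch: `field` is a (time, station) data matrix."""
    anom = field - field.mean(axis=0)            # remove the time mean
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)               # variance explained per mode
    pcs = U[:, :n_modes] * s[:n_modes]           # principal component time series
    eofs = Vt[:n_modes]                          # spatial patterns
    return pcs, eofs, var_frac[:n_modes]
```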


2020
Vol 33 (17)
pp. 7591-7617
Author(s):  
Clara Orbe ◽  
Luke Van Roekel ◽  
Ángel F. Adames ◽  
Amin Dezfuli ◽  
John Fasullo ◽  
...  

Abstract. We compare the performance of several modes of variability across six U.S. climate modeling groups, with a focus on identifying robust improvements in recent models [including those participating in phase 6 of the Coupled Model Intercomparison Project (CMIP)] compared to previous versions. In particular, we examine the representation of the Madden–Julian oscillation (MJO), El Niño–Southern Oscillation (ENSO), the Pacific decadal oscillation (PDO), the quasi-biennial oscillation (QBO) in the tropical stratosphere, and the dominant modes of extratropical variability, including the southern annular mode (SAM), the northern annular mode (NAM) [and the closely related North Atlantic Oscillation (NAO)], and the Pacific–North American pattern (PNA). Where feasible, we explore the processes driving these improvements through the use of "intermediary" experiments that utilize model versions between CMIP3/5 and CMIP6 as well as targeted sensitivity experiments in which individual modeling parameters are altered. We find clear and systematic improvements in the MJO and QBO and in the teleconnection patterns associated with the PDO and ENSO. Some gains arise from better process representation, while others (e.g., the QBO) arise from higher resolution that allows for a greater range of interactions. Our results demonstrate that the incremental development processes in multiple climate model groups lead to more realistic simulations over time.


2013
Vol 31 (3)
pp. 413
Author(s):  
André Becker Nunes ◽  
Gilson Carlos Da Silva

ABSTRACT. The eastern region of Santa Catarina State (Brazil) has an important history of natural disasters due to extreme rainfall events. Floods and landslides are enhanced by local features such as orography and urbanization: the replacement of natural surface coverage causes more surface runoff and, hence, flooding. Thus, studies of this type of event, which directly influences life in the towns, take on increasing importance. This work makes a quantitative analysis of occurrences of extreme rainfall events in the eastern and northern regions of Santa Catarina State in the last 60 years, through individual analysis considering the history of floods in each selected town, as well as a projection through the end of the century based on regional climate modeling. A positive linear trend was observed in most of the towns studied, indicating a greater frequency of these events in recent decades, and the HadRM3P climate model shows a heterogeneous increase of events for all towns in the period from 2071 to 2100.

Keywords: floods, climate modeling, linear trend.
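The trend analysis amounts to fitting a straight line to annual event counts; a toy sketch (synthetic numbers, not the paper's data) is:

```python
import numpy as np

# Synthetic annual counts of extreme-rainfall events (made up), with a
# weak upward trend injected, to illustrate the linear-trend fit.
years = np.arange(1950, 2011)
counts = np.random.default_rng(0).poisson(2.0 + 0.02 * (years - 1950))

slope, intercept = np.polyfit(years, counts, deg=1)   # linear trend fit
print(f"trend: {slope:+.3f} events per year")
```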


Author(s):  
Weijia Qian ◽  
Howard H. Chang

Health impact assessments of future environmental exposures are routinely conducted to quantify population burdens associated with the changing climate. It is well recognized that simulations from climate models need to be bias-corrected against observations to estimate future exposures. Quantile mapping (QM) is a technique that has gained popularity in climate science because of its focus on bias-correcting the entire exposure distribution. Even though improved bias correction at the extreme tails of exposure may be particularly important for estimating health burdens, the application of QM in health impact projection has been limited. In this paper we describe and apply five QM methods to estimate excess emergency department (ED) visits due to projected changes in warm-season minimum temperature in Atlanta, USA. We utilized temperature projections from an ensemble of regional climate models in the North American Coordinated Regional Climate Downscaling Experiment (NA-CORDEX). Across QM methods, we estimated consistent increases in ED visits across the climate model ensemble under RCP 8.5 during the period 2050 to 2099. We found that QM methods can significantly reduce between-model variation in health impact projections (50–70% decreases in between-model standard deviation). In particular, the quantile delta mapping approach had the largest reduction and is also recommended because of its ability to preserve model-projected absolute temporal changes in quantiles.
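The core of empirical quantile mapping can be written in a few lines; the sketch below (ours, illustrative only, not one of the paper's five methods verbatim) maps each model value to the observed value at the same quantile:

```python
import numpy as np

def quantile_map(model_hist, obs, model_fut):
    """Empirical quantile mapping, a minimal sketch.

    Each future model value is assigned its quantile within the
    historical simulation, then replaced by the observed value
    at that same quantile.
    """
    q = np.searchsorted(np.sort(model_hist), model_fut) / len(model_hist)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs, q)
```

Quantile delta mapping, the approach recommended above, additionally adds the model-projected change at each quantile back onto the corrected value, which is why it preserves projected trends.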


2015
Vol 28 (3)
pp. 1016-1030
Author(s):  
Erik Swenson

Abstract. Various multivariate statistical methods exist for analyzing covariance and isolating linear relationships between datasets. The most popular linear methods are based on singular value decomposition (SVD) and include canonical correlation analysis (CCA), maximum covariance analysis (MCA), and redundancy analysis (RDA). In this study, continuum power CCA (CPCCA) is introduced as one extension of continuum power regression for isolating pairs of coupled patterns whose temporal variation maximizes the squared covariance between partially whitened variables. Similar to the whitening transformation, the partial whitening transformation acts to decorrelate individual variables but only to a partial degree with the added benefit of preconditioning sample covariance matrices prior to inversion, providing a more accurate estimate of the population covariance. CPCCA is a unified approach in the sense that the full range of solutions bridges CCA, MCA, RDA, and principal component regression (PCR). Recommended CPCCA solutions include a regularization for CCA, a variance bias correction for MCA, and a regularization for RDA. Applied to synthetic data samples, such solutions yield relatively higher skill in isolating known coupled modes embedded in noise. Provided with some crude prior expectation of the signal-to-noise ratio, the use of asymmetric CPCCA solutions may be justifiable and beneficial. An objective parameter choice is offered for regularization with CPCCA based on the covariance estimate of O. Ledoit and M. Wolf, and the results are quite robust. CPCCA is encouraged for a range of applications.
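To make the SVD-based family concrete, maximum covariance analysis reduces to an SVD of the cross-covariance matrix between the two fields; a minimal sketch (ours, not the paper's CPCCA implementation) is:

```python
import numpy as np

def mca(X, Y, n_modes=3):
    """MCA sketch: X is (time, p), Y is (time, q); time means removed below."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    C = Xc.T @ Yc / (Xc.shape[0] - 1)        # cross-covariance matrix
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    scf = s**2 / np.sum(s**2)                # squared covariance fraction
    return U[:, :n_modes], Vt[:n_modes].T, scf[:n_modes]
```

CCA and RDA differ only in how the fields are whitened before this SVD (both fully whitened for CCA, one field for RDA), and CPCCA spans that continuum with the degree of partial whitening as a parameter.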


2013
Vol 2 (4)
pp. 253
Author(s):  
Lenka Hudrlikova ◽  
Ludmila Petkovova

The aim of the paper is to provide a ranking of the Czech NUTS 3 regions based on sustainable development indicators. The original list of indicators was published by the Czech Statistical Office in 2008 and reviewed in 2010. In the analysis the same set of indicators with the latest data was used. The indicators in each pillar are merged by means of linear aggregation, with weights derived from principal component analysis. Because the three pillars of sustainable development (environmental, economic, and social) are assumed to be non-compensable, multiple-criteria decision analysis is applied at the pillar level in the final composite indicator. Both main approaches, Borda and Condorcet, were considered. Since the Borda approach leads to compensability of the indicators, the Condorcet approach was in the spotlight. Advanced rules and adjustments for the Condorcet approach were employed. Advantages and disadvantages of the methods are discussed. As a result, several final rankings exist, and the results are discussed in depth. Special attention is paid to the capital city Prague, border regions, and industrial regions. In addition, the correlation between the final ranking and other indicators is tested.
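The compensability issue can be seen in a toy comparison of the two aggregation rules (hypothetical ranks, not the paper's data):

```python
import numpy as np

# Toy pillar-level rank aggregation for three regions; names and
# rank values are made up purely for illustration.
ranks = np.array([
    [1, 3, 2],   # region A: rank per pillar (1 = best)
    [2, 1, 3],   # region B
    [3, 2, 1],   # region C
])

# Borda-style score: summing ranks lets a strong pillar compensate
# for a weak one, the compensability the paper avoids.
borda = ranks.sum(axis=1)

# Condorcet-style outranking: count on how many pillars region i
# beats region j; a majority of pillars, not their sum, decides.
n = ranks.shape[0]
wins = np.array([[np.sum(ranks[i] < ranks[j]) for j in range(n)]
                 for i in range(n)])
print(borda, wins, sep="\n")
```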


2021
Vol 14 (8)
pp. 4865-4890
Author(s):  
Peter Uhe ◽  
Daniel Mitchell ◽  
Paul D. Bates ◽  
Nans Addor ◽  
Jeff Neal ◽  
...  

Abstract. Riverine flood hazard is the consequence of meteorological drivers, primarily precipitation, of hydrological processes, and of the interaction of floodwaters with the floodplain landscape. Modeling this can be particularly challenging because of the multiple steps and differing spatial scales involved in the various processes. As the climate modeling community increases its focus on the risks associated with climate change, it is important to translate the meteorological drivers into relevant hazard estimates. This is especially important for the climate attribution and climate projection communities. Current climate change assessments of flood risk typically neglect key processes and, instead of explicitly modeling flood inundation, commonly use precipitation or river flow as proxies for flood hazard. This is due to the complexity and uncertainties of model cascades and the computational cost of flood inundation modeling. Here, we lay out a clear methodology for taking meteorological drivers, e.g., from observations or climate models, through to high-resolution (∼90 m) river flooding (fluvial) hazards. This framework is designed to be an accessible, computationally efficient tool using freely available data to enable greater uptake of this type of modeling. The meteorological inputs (precipitation and air temperature) are transformed through a series of modeling steps to yield, in turn, surface runoff, river flow, and flood inundation. We explore uncertainties at the different modeling steps. The flood inundation estimates can then be related to impacts felt at community and household levels to determine exposure and risks from flood events. The approach uses global data sets and thus can be applied anywhere in the world, but we use the Brahmaputra River in Bangladesh as a case study to demonstrate the necessary steps in our hazard framework. The framework is designed to be driven by meteorology from observational data sets or climate model output. In this study, only observations are used to drive the models, so climate changes are not assessed. However, by comparing current and future simulated climates, the framework can also be used to assess the impacts of climate change.
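Purely to fix ideas, the cascade can be caricatured as three composed functions; everything below is a made-up toy, not the paper's calibrated models:

```python
import numpy as np

def surface_runoff(precip_mm, temp_c, melt_factor=3.0):
    """Rain plus a crude, capped degree-day snowmelt term (mm per day)."""
    melt = np.minimum(np.maximum(temp_c, 0.0) * melt_factor, 10.0)
    return precip_mm + melt

def river_flow(runoff_mm, k=0.2, area_km2=1000.0):
    """Discrete linear-reservoir routing of daily runoff into flow (m^3/s)."""
    flow = np.zeros_like(runoff_mm, dtype=float)
    storage = 0.0
    for t, r in enumerate(runoff_mm):
        storage += r * 1e-3 * area_km2 * 1e6 / 86400.0  # mm/day -> m^3/s
        flow[t] = k * storage
        storage -= flow[t]
    return flow

def inundated(flow_m3s, bankfull=300.0):
    """Flag time steps where flow exceeds a bankfull threshold."""
    return flow_m3s > bankfull

precip = np.array([0.0, 20.0, 80.0, 40.0, 5.0])   # toy daily inputs
temp = np.array([2.0, 4.0, 6.0, 5.0, 3.0])
print(inundated(river_flow(surface_runoff(precip, temp))))
```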


2020
Vol 13 (5)
pp. 2355-2377
Author(s):  
Vijay S. Mahadevan ◽  
Iulian Grindeanu ◽  
Robert Jacob ◽  
Jason Sarich

Abstract. One of the fundamental factors contributing to the spatiotemporal inaccuracy in climate modeling is the mapping of solution field data between the different discretizations and numerical grids used in the coupled component models. The typical climate computational workflow involves evaluation and serialization of the remapping weights during the preprocessing step, which are then consumed by the coupled driver infrastructure during the simulation to compute field projections. Tools like the Earth System Modeling Framework (ESMF) (Hill et al., 2004) and TempestRemap (Ullrich et al., 2013) offer the capability to generate conservative remapping weights, while the Model Coupling Toolkit (MCT) (Larson et al., 2001), which is utilized in many production climate models, exposes functionality to make use of these operators to solve the coupled problem. However, such multistep processes present several hurdles in terms of the scientific workflow and impede research productivity. In order to overcome these limitations, we present a fully integrated infrastructure based on the Mesh Oriented datABase (MOAB) (Tautges et al., 2004; Mahadevan et al., 2015) library, which allows for a complete description of the numerical grids and solution data used in each submodel. Through a scalable advancing-front intersection algorithm, the supermesh of the source and target grids is computed, which is then used to assemble the high-order, conservative, and monotonicity-preserving remapping weights between discretization specifications. The Fortran-compatible interfaces in MOAB are utilized to directly link the submodels in the Energy Exascale Earth System Model (E3SM) to enable online remapping strategies in order to simplify the coupled workflow process. We demonstrate the superior computational efficiency of the remapping algorithms in comparison with other state-of-the-science tools and present strong scaling results on large-scale machines for computing remapping weights between the spectral element atmosphere and finite volume discretizations on the polygonal ocean grids.
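On the consumption side, applying precomputed remapping weights is a sparse matrix-vector product; the sketch below uses hand-built toy weights (each target cell exactly covering two equal-area source cells), not output of MOAB or TempestRemap:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Toy conservative weight matrix W mapping 4 source cells to 2 target
# cells; the weights and areas are constructed by hand so that the
# area-weighted integral of the field is preserved exactly.
rows = [0, 0, 1, 1]
cols = [0, 1, 2, 3]
vals = [0.5, 0.5, 0.5, 0.5]
W = csr_matrix((vals, (rows, cols)), shape=(2, 4))

src = np.array([280.0, 281.0, 282.5, 284.0])   # source-grid field
tgt = W @ src                                   # remapped target field

# First-order conservation check: area-weighted integrals agree.
src_area = np.ones(4)
tgt_area = np.full(2, 2.0)
assert np.isclose(src_area @ src, tgt_area @ tgt)
```

Generating such weights for arbitrary unstructured grids, via the supermesh intersection described above, is the expensive step that the paper moves online and parallelizes.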

