An adaptive two-stage analog/regression model for probabilistic prediction of small-scale precipitation in France

2018 ◽  
Vol 22 (1) ◽  
pp. 265-286 ◽  
Author(s):  
Jérémy Chardon ◽  
Benoit Hingray ◽  
Anne-Catherine Favre

Abstract. Statistical downscaling models (SDMs) are often used to produce local weather scenarios from large-scale atmospheric information. SDMs include transfer functions based on a statistical link, identified from observations, between local weather and a set of large-scale predictors. As the physical processes driving surface weather vary in time, the most relevant predictors and the regression link are likely to vary in time too. This is well known for precipitation, for instance, and the link is thus often estimated after some seasonal stratification of the data. In this study, we present a two-stage analog/regression model where the regression link is estimated from atmospheric analogs of the current prediction day. Atmospheric analogs are identified from fields of geopotential heights at 1000 and 500 hPa. For the regression stage, two generalized linear models are then used to model the probability of precipitation occurrence and the distribution of non-zero precipitation amounts, respectively. The two-stage model is evaluated for the probabilistic prediction of small-scale precipitation over France. It noticeably improves the skill of the prediction for both precipitation occurrence and amount. As the analog days vary from one prediction day to another, the atmospheric predictors selected in the regression stage and the values of the corresponding regression coefficients can vary from one prediction day to another. The model thus allows for day-to-day adaptive and tailored downscaling. It can also reveal specific predictors for peculiar and infrequent weather configurations.
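To make the two-stage idea concrete, here is a minimal Python sketch of the analog-then-GLM workflow, assuming synthetic stand-ins for the geopotential-height fields, predictors and precipitation archive. The abstract does not specify the GLM families, so the Binomial/Gamma choices below are common assumptions, and the per-day predictor selection of the actual model is not reproduced.

```python
# Minimal sketch of the two-stage analog/regression idea, assuming synthetic
# stand-ins for the geopotential fields, predictors and precipitation archive.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_days, n_grid = 2000, 50                      # archive length, grid points of the Z1000/Z500 fields
z_fields = rng.normal(size=(n_days, n_grid))   # stand-in geopotential anomaly fields
predictors = rng.normal(size=(n_days, 3))      # stand-in large-scale predictors
occurrence = rng.random(n_days) < 0.4          # wet/dry flag for each archive day
amounts = np.where(occurrence, rng.gamma(2.0, 3.0, n_days), 0.0)

def predict_day(target_field, target_predictors, n_analogs=200):
    """Stage 1: select analog days; stage 2: fit the two GLMs on those analogs only."""
    dist = np.linalg.norm(z_fields - target_field, axis=1)   # simple Euclidean analogy score
    analogs = np.argsort(dist)[:n_analogs]
    x_new = np.r_[1.0, target_predictors].reshape(1, -1)

    # Occurrence: logistic (Binomial) GLM fitted to the analog sample.
    X = sm.add_constant(predictors[analogs])
    p_wet = sm.GLM(occurrence[analogs].astype(float), X,
                   family=sm.families.Binomial()).fit().predict(x_new)[0]

    # Amounts: Gamma GLM with log link fitted to the wet analog days only.
    wet = analogs[occurrence[analogs]]
    Xw = sm.add_constant(predictors[wet])
    mean_amount = sm.GLM(amounts[wet], Xw,
                         family=sm.families.Gamma(sm.families.links.Log())).fit().predict(x_new)[0]
    return p_wet, mean_amount

print(predict_day(z_fields[0], predictors[0]))
```

Because the analog pool changes every day, the fitted coefficients (and, in the full model, the selected predictors) change from one prediction day to the next, which is the adaptive element the abstract describes.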

2017 ◽  
Author(s):  
Jérémy Chardon ◽  
Benoit Hingray ◽  
Anne-Catherine Favre

Abstract. Statistical downscaling methods (SDMs) are often used to produce local weather scenarios from large-scale atmospheric information. SDMs include transfer functions based on a statistical link, identified from observations, between local weather and a set of large-scale predictors. As the physical processes generating surface weather vary in time, the most relevant predictors and the regression link are likely to vary in time as well. This is well known for precipitation, for instance, and the link is thus often estimated after some seasonal stratification of the data. In this study, we present a hybrid model where the regression link is estimated from atmospheric analogs of the current prediction day. Atmospheric analogs are first identified from geopotential fields at 1000 and 500 hPa. For the regression stage, two generalized linear models are then used to model the probability of precipitation occurrence and the distribution of non-zero precipitation amounts, respectively. The hybrid model is evaluated for the probabilistic prediction of local precipitation over France. It noticeably improves the skill of the prediction for both precipitation occurrence and amount. As the analog days vary from one prediction day to another, the atmospheric predictors selected in the regression stage and the values of the corresponding regression coefficients vary from one prediction day to another. The hybrid approach thus allows for day-to-day adaptive and tailored downscaling. It can also reveal specific predictors for peculiar and infrequent weather configurations.


2020 ◽  
Author(s):  
Bramka Arga Jafino ◽  
Jan Kwakkel

Climate-related inequality can arise from the implementation of adaptation policies. As an example, the dike expansion policy protecting rice farmers in the Vietnam Mekong Delta backfires on small-scale farmers in the long run. The prevention of annual flooding reduces the supply of natural sediments, forcing farmers to apply more and more fertilizer to achieve the same yield. While large-scale farmers can afford this, small-scale farmers do not possess the required economies of scale and are eventually harmed. Together with climatic and socioeconomic uncertainties, the implementation of new policies can not only exacerbate existing inequalities but also induce new ones. Hence, distributional impacts on affected stakeholders should be assessed in climate change adaptation planning.

In this study, we propose a two-stage approach to assess the distributional impacts of policies in model-based support for adaptation planning. The first stage explores potential inequality patterns that may emerge from the combination of new policies and the realization of exogenous scenarios. This stage comprises four steps: (i) disaggregation of the model's performance indicators so that distributional impacts can be observed, (ii) large-scale simulation experimentation to account for deep uncertainties, (iii) clustering of simulation results to identify distinctive inequality patterns, and (iv) application of scenario discovery tools, in particular classification and regression trees, to identify the combinations of policies and uncertainties that lead to a specific inequality pattern.

In the second stage we assess which policies are morally preferable with respect to the inequality patterns they generate, rather than only descriptively exploring those patterns as in the first stage. To perform a normative evaluation of the distributional impacts, we operationalize five alternative principles of justice: improvement of total welfare (utilitarianism), prioritization of worse-off actors (prioritarianism), reduction of welfare differences across actors (in two derivations: absolute inequality and an envy measure), and improvement of the worst-off actor (Rawlsian difference principle). The operationalization of each principle forms a social welfare function with which the distributional impacts can be aggregated; a sketch follows the case-study description below.

To test this approach, we use an agricultural planning case study in the upper Vietnam Mekong Delta. Specifically, we assess the distributional impacts of alternative adaptation policies in the upper Vietnam Mekong Delta using an integrated assessment model. We consider six alternative policies as well as uncertainties related to upstream discharge, sediment supply, and land-use change. Through the first stage, we identify six potential inequality patterns among the 23 districts in the study area, as well as the combinations of policies and uncertainties that result in these patterns. From the second stage we obtain complete rankings of the alternative policies, based on their performance with respect to distributional impacts, under different realizations of the scenarios. The explorative stage allows policy-makers to identify potential actions to compensate worse-off actors, while the normative stage helps them rank alternative policies according to a preferred moral principle.
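As a rough illustration of the normative stage, the Python sketch below aggregates a hypothetical matrix of district-level welfare outcomes with the five justice principles named above. The functional forms (for example the concavity used for prioritarianism) are illustrative assumptions, not the authors' exact operationalizations.

```python
# Illustrative operationalization of the five principles as social welfare
# functions over district-level welfare; the functional forms (e.g. the concavity
# used for prioritarianism) are assumptions, not the authors' exact definitions.
import numpy as np

def utilitarian(w):               # total welfare
    return w.sum()

def prioritarian(w, gamma=0.5):   # concave transform weights worse-off actors more
    return np.sum(w ** gamma)

def absolute_inequality(w):       # negative mean absolute welfare difference (higher is better)
    return -np.abs(w[:, None] - w[None, :]).mean()

def envy(w):                      # negative total envy of each actor towards better-off actors
    return -np.maximum(w[None, :] - w[:, None], 0.0).sum()

def rawlsian(w):                  # welfare of the worst-off actor
    return w.min()

# Rank three hypothetical policies (rows) over three districts (columns).
welfare = np.array([[3.0, 2.5, 2.8],
                    [4.0, 1.0, 3.5],
                    [2.9, 2.9, 2.9]])
for f in (utilitarian, prioritarian, absolute_inequality, envy, rawlsian):
    ranking = np.argsort([-f(w) for w in welfare])   # best policy first
    print(f"{f.__name__:20s} ranking: {ranking.tolist()}")
```

Even on this toy matrix, the utilitarian and Rawlsian rankings disagree, which is precisely why the choice of moral principle matters when ranking adaptation policies.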


2018 ◽  
Vol 842 ◽  
pp. 146-162 ◽  
Author(s):  
Simon J. Illingworth ◽  
Jason P. Monty ◽  
Ivan Marusic

A dynamical systems approach is used to devise a linear estimation tool for channel flow at a friction Reynolds number of $Re_\tau = 1000$. The estimator uses time-resolved velocity measurements at a single wall-normal location to estimate the velocity field at other wall-normal locations (the data coming from direct numerical simulations). The estimation tool builds on the work of McKeon & Sharma (J. Fluid Mech., vol. 658, 2010, pp. 336–382) by using a Navier–Stokes-based linear model and treating any nonlinear terms as unknown forcings to an otherwise linear system. In this way nonlinearities are not ignored, but instead treated as an unknown model input. It is shown that, while the linear estimator qualitatively reproduces large-scale flow features, it tends to overpredict the amplitude of velocity fluctuations, particularly for structures that are long in the streamwise direction and thin in the spanwise direction. An alternative linear model is therefore formed in which a simple eddy viscosity is used to model the influence of the small-scale turbulent fluctuations on the large scales of interest. This modification improves the estimator performance significantly. Importantly, as well as improving the performance of the estimator, the linear model with eddy viscosity is also able to predict with reasonable accuracy the range of wavenumber pairs and the range of wall-normal heights over which the estimator will perform well.
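For readers unfamiliar with linear estimation from single-point measurements, the sketch below shows the generic idea with a block-averaged spectral (Wiener) transfer kernel applied to synthetic signals. It is not the Navier–Stokes resolvent-based model of the paper; all signals and parameters are invented for illustration, and the only point is that a linear kernel identified from training data can estimate one signal from measurements of another.

```python
# Generic single-input linear estimation sketch (spectral Wiener kernel), NOT the
# resolvent-based model of the paper; all signals below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
n, nb = 4096, 16
L = n // nb                                        # block length for spectral averaging

def make_pair(n, rng):
    m = rng.normal(size=n)                         # stand-in measured signal (e.g. at y_measure)
    t = np.convolve(m, np.ones(5) / 5, mode="same") + 0.1 * rng.normal(size=n)  # correlated target
    return m, t

m_train, t_train = make_pair(n, rng)
m_test, t_test = make_pair(n, rng)

# Kernel H(f) = <T M*> / <M M*> estimated from block-averaged spectra of the training data.
S_tm = np.zeros(L // 2 + 1, dtype=complex)
S_mm = np.zeros(L // 2 + 1)
for b in range(nb):
    Mb = np.fft.rfft(m_train[b * L:(b + 1) * L])
    Tb = np.fft.rfft(t_train[b * L:(b + 1) * L])
    S_tm += Tb * np.conj(Mb)
    S_mm += np.abs(Mb) ** 2
H = S_tm / (S_mm + 1e-12)

# Apply the kernel block-wise to unseen measurements and check the estimate.
estimate = np.concatenate([np.fft.irfft(H * np.fft.rfft(m_test[b * L:(b + 1) * L]), n=L)
                           for b in range(nb)])
print("correlation with truth:", round(np.corrcoef(estimate, t_test)[0, 1], 3))
```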


2014 ◽  
Vol 9 (1) ◽  
pp. 147-156 ◽  
Author(s):  
Mico Apostolov

Purpose – This paper is a case study of the Republic of Macedonia focusing on the development of governance and enterprise restructuring. Effective corporate governance and corporate control in the country, which shape enterprise restructuring, are essential in the analysis of market-driven restructuring through domestic financial institutions and markets. The data used in this article are analyzed with an econometric regression model that examines the interrelationships between governance and enterprise restructuring and the set of policies that influence governance patterns. Two basic hypotheses are examined: first, governance and enterprise restructuring depend on a set of policies, namely large-scale privatization, small-scale privatization, price liberalization, competition policy, the trade and foreign exchange system, banking reform and interest rate liberalization, securities markets and non-bank financial institutions, and overall infrastructure reform; and second, governance and enterprise restructuring improve over time as these policies take effect. The paper aims to discuss these issues. Design/methodology/approach – The data are analyzed with an econometric regression model that examines the interrelationships between governance and enterprise restructuring and the set of policies that influence governance patterns. Findings – There is still more to be done to bring these economies closer to the standards of developed ones. Considerable improvement of corporate governance is needed, along with institution-building to control agency problems, enforcement of already adopted regulation, and new enterprise restructuring policies within the existing policies of overall transition-economy restructuring. Originality/value – This paper contributes to research on the business aspects of the Macedonian economy, as there is a persistent lack of scientific papers dealing with the specific issues of corporate governance and enterprise restructuring.
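A hedged sketch of the kind of regression described is given below, with entirely hypothetical data: the EBRD-style 1-4.3 policy scores, the linear trend term and the coefficient values are assumptions, and the paper's actual specification, data and estimation details are not reproduced.

```python
# Hedged sketch of a governance-on-policies regression with hypothetical data;
# the variable names, scales and trend term are assumptions, not the paper's model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
years = 25
policies = rng.uniform(1.0, 4.3, size=(years, 8))      # EBRD-style scores for the eight policy areas
trend = np.arange(years, dtype=float).reshape(-1, 1)   # proxy for the "improves over time" hypothesis
governance = 0.3 * policies.mean(axis=1) + 0.05 * trend.ravel() + rng.normal(0, 0.1, years)

X = sm.add_constant(np.hstack([policies, trend]))
print(sm.OLS(governance, X).fit().summary())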


2020 ◽  
Author(s):  
Xiaoyong Yuan ◽  
Cixiang Chen ◽  
Renato Bassanezi ◽  
Feng Wu ◽  
Zheng Feng ◽  
...  

Huanglongbing (HLB) is a devastating citrus disease worldwide. A three-pronged approach to controlling HLB has been suggested, namely removal of HLB-symptomatic trees, psyllid control, and replacement with HLB-free trees. However, such a strategy did not lead to successful HLB control in many citrus-producing regions. We hypothesize that this is because of small-scale or incomplete implementation of the program and that, conversely, a comprehensive implementation of such a strategy at the regional level can successfully control HLB. Here we investigated the effects of region-wide comprehensive implementation of this scheme to control HLB in the Gannan region, China, with a total planted citrus acreage of over 110,000 ha, from 2013 to 2019. With the region-wide implementation of comprehensive HLB management, overall HLB incidence in Gannan decreased from 19.71% in 2014 to 3.86% in 2019. A partial implementation of such a program (without comprehensive inoculum removal) at the regional level in Brazil resulted in HLB incidence increasing from 1.89% in 2010 to 19.02% in 2019. A dynamic regression model analysis predicted that, with a region-wide comprehensive implementation of such a program, HLB incidence would be controlled to a level of less than 1%. Economic feasibility analyses showed that average net profits over a ten-year period were positive for groves that implemented the comprehensive strategy but negative for groves without such a program. Overall, the key for the three-pronged program to successfully control HLB is large-scale (region-wide) and comprehensive implementation. This study provides valuable information for controlling HLB and other endemic diseases worldwide.
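The sketch below illustrates what a simple dynamic (lagged-incidence) regression and forward projection could look like. Only the 2014 and 2019 incidence values come from the abstract; the intermediate years are interpolated purely for illustration, and the model form is an assumption rather than the study's.

```python
# Sketch of a lagged-incidence ("dynamic") regression and forward projection.
# Only the 2014 and 2019 values (19.71 % and 3.86 %) come from the abstract; the
# intermediate years are interpolated for illustration and carry no real meaning.
import numpy as np
import statsmodels.api as sm

incidence = np.array([19.71, 15.4, 11.7, 8.6, 6.0, 3.86])  # 2014-2019, mostly illustrative
X = sm.add_constant(incidence[:-1])          # predictor: last year's incidence
fit = sm.OLS(incidence[1:], X).fit()         # response: this year's incidence

level = incidence[-1]
for year in range(2020, 2025):               # project forward under the fitted relationship
    level = max(0.0, fit.predict([[1.0, level]])[0])
    print(year, round(level, 2))
```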


2005 ◽  
Vol 12 (6) ◽  
pp. 979-991 ◽  
Author(s):  
J. Miksovsky ◽  
A. Raidl

Abstract. We investigated the usability of the method of local linear models (LLM), the multilayer perceptron neural network (MLP NN) and the radial basis function neural network (RBF NN) for the construction of temporal and spatial transfer functions between different meteorological quantities, and compared the obtained results both mutually and to the results of multiple linear regression (MLR). The tested methods were applied to the short-term prediction of daily mean temperatures and to the downscaling of NCEP/NCAR reanalysis data, using series of daily mean, minimum and maximum temperatures from 25 European stations as predictands. None of the tested nonlinear methods was found to be distinctly superior to the others, but all nonlinear techniques proved better than linear regression in the majority of cases. We also discuss why the most frequently used nonlinear method, the MLP neural network, may not be the best choice for processing climatic time series: the LLM method or RBF NNs can offer comparable or slightly better performance and do not suffer from some of the practical disadvantages of MLPs. Aside from comparing the performance of the different methods, we paid attention to geographical and seasonal variations of the results. The forecasting results showed that the nonlinear character of the relations between climate variables is well apparent over most of Europe, in contrast to rather weak nonlinearity in the Mediterranean and North Africa. No clear large-scale geographical structure of nonlinearity was identified in the case of downscaling. Nonlinearity also seems to be noticeably stronger in winter than in summer at most locations, for both forecasting and downscaling.
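To illustrate the kind of comparison reported, here is a small sketch contrasting a linear baseline with an MLP on the same synthetic predictor-predictand pairs using scikit-learn. The data, network size and resulting skill scores are placeholders, not the study's setup, and the LLM and RBF variants are not reproduced.

```python
# Small sketch of the comparison idea: a linear baseline versus an MLP trained on
# the same synthetic predictor/predictand pairs; data and network size are placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(3000, 6))                                            # stand-in large-scale predictors
y = X[:, 0] + 0.5 * np.tanh(2.0 * X[:, 1]) + 0.1 * rng.normal(size=3000)  # mildly nonlinear predictand

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LinearRegression(),
              MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)):
    print(type(model).__name__, round(model.fit(X_tr, y_tr).score(X_te, y_te), 3))
```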


2006 ◽  
Vol 15 (02) ◽  
pp. 131-142 ◽  
Author(s):  
TING-YA HSIEH ◽  
MORRIS H. L. WANG ◽  
CHENG-WU CHEN ◽  
CHEN-YUAN CHEN ◽  
SHANG-EN YU ◽  
...  

The least squares method is generally used for curve-fitting problems. We propose here a fuzzy S-curve regression model to deal with the case in which the observed data are given as fuzzy numbers. The fuzzy regression curve, obtained for project control and for predicting the progress of large-scale or small-scale engineering projects, is smoothly connected by a Takagi-Sugeno (T-S) fuzzy model. This paper also proposes giving an upper bound and a lower bound, instead of a confidence interval, when the observed data are not obtained exactly. Based on the project cash flow and progress payment records of an example project taken from the Department of Rapid Transit Systems, Taipei City Government, the model is demonstrated and tentative conclusions concerning it are given. The S-curve equation developed here could be used in a variety of applications related to project control and the management of working capital for construction firms.
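As a point of reference for the crisp (non-fuzzy) case, the sketch below fits an ordinary logistic S-curve to synthetic cumulative-progress data by least squares. The fuzzy-number inputs and Takagi-Sugeno blending of the paper are not reproduced, and all parameters and data are illustrative.

```python
# Crisp (non-fuzzy) S-curve baseline: a logistic progress curve fitted to synthetic
# cumulative-progress data by least squares; parameters and data are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def s_curve(t, k, t0, top):
    """Logistic S-curve: cumulative percent complete as a function of time."""
    return top / (1.0 + np.exp(-k * (t - t0)))

t = np.linspace(0, 24, 25)                                     # project months
progress = s_curve(t, 0.4, 12.0, 100.0) + np.random.default_rng(4).normal(0, 2, t.size)

params, _ = curve_fit(s_curve, t, progress, p0=[0.3, 10.0, 100.0])
print("fitted k, t0, top:", np.round(params, 2))
```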


2020 ◽  
Vol 59 (8) ◽  
pp. 1333-1349
Author(s):  
S. C. Pryor ◽  
J. T. Schoof

Abstract. Climate science is increasingly using (i) ensembles of climate projections from multiple models derived under different assumptions and/or scenarios and (ii) process-oriented diagnostics of model fidelity. Efforts to assign differential credibility to projections and/or models are also rapidly advancing. A framework to quantify and depict the credibility of statistically downscaled model output is presented and demonstrated. The approach employs transfer functions in the form of robust and resilient generalized linear models applied to downscale daily minimum and maximum temperature anomalies at 10 locations, using predictors drawn from the ERA-Interim reanalysis and two global climate models (GCMs; GFDL-ESM2M and MPI-ESM-LR). The downscaled time series are used to derive several impact-relevant Climate Extreme (CLIMDEX) temperature indices that are assigned credibility based on (1) the reproduction of relevant large-scale predictors by the GCMs (i.e., the fraction of regression beta weights derived from predictors that are well reproduced) and (2) the degree of variance in the observations reproduced in the downscaled series following application of a new variance inflation technique. Credibility of the downscaled predictands varies across locations and between the two GCMs, and is generally higher for minimum temperature than for maximum temperature. The differential credibility assessment framework demonstrated here is easy to use and flexible. It can be applied as is to inform decision-makers about projection confidence and/or can be extended to include other components of the transfer functions, and/or used to weight members of a statistically downscaled ensemble.
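The following sketch shows one way the two credibility components described above could be computed, using hypothetical beta weights, predictor-skill flags and variances. It is an illustration of the idea only, not the authors' implementation.

```python
# Sketch of the two credibility components with hypothetical inputs: (1) the
# fraction of absolute beta weight carried by well-reproduced predictors and
# (2) the fraction of observed variance retained in the downscaled series.
import numpy as np

beta = np.array([0.8, -0.3, 0.5, 0.1])                 # standardized transfer-function weights
well_reproduced = np.array([True, True, False, True])  # is each predictor well reproduced by the GCM?
component1 = np.abs(beta[well_reproduced]).sum() / np.abs(beta).sum()

obs_var, downscaled_var = 9.0, 7.2                     # hypothetical variances (degC^2)
component2 = downscaled_var / obs_var

print(f"credibility components: {component1:.2f}, {component2:.2f}")
```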


2015 ◽  
Vol 2015 ◽  
pp. 1-8
Author(s):  
Yueyue Liu ◽  
Rui Zhang ◽  
Miaomiao Wang ◽  
Xiaoxi Zhu

This paper studies a production scheduling problem with deteriorating jobs, which frequently arises in contemporary manufacturing environments. The objective is to find a sequence of the jobs that minimizes the total weighted tardiness, an indicator of service quality. The problem is NP-hard, so the computational time required to solve it exactly grows exponentially with the number of jobs. To tackle large-scale instances efficiently, a two-stage method is presented in this paper. We partition the set of jobs into a few subsets with a neural network approach, thereby transforming the large-scale problem into a series of small-scale problems. We then employ an improved metaheuristic (called GTS), which combines a genetic algorithm with tabu search, to solve each subproblem. Finally, we integrate the sequences obtained for the subsets and produce the final complete solution by enumeration. A fair comparison between the two-stage method and GTS without decomposition shows that the solution quality of the two-stage method is much better for large-scale problems.
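For concreteness, the sketch below evaluates the objective only: total weighted tardiness under an assumed linear-deterioration model for processing times, with an earliest-due-date baseline sequence. The clustering and GA/tabu-search stages of the paper are not reproduced.

```python
# Objective only: total weighted tardiness of a sequence when processing times
# deteriorate linearly with their start time (an assumed deterioration model).
import numpy as np

rng = np.random.default_rng(5)
n = 8
base_p = rng.uniform(1, 5, n)       # base processing times
alpha = rng.uniform(0.05, 0.2, n)   # deterioration rates
due = rng.uniform(5, 40, n)         # due dates
weight = rng.uniform(1, 3, n)       # tardiness weights

def total_weighted_tardiness(seq):
    t, cost = 0.0, 0.0
    for j in seq:
        t += base_p[j] + alpha[j] * t            # actual processing time grows with the start time
        cost += weight[j] * max(0.0, t - due[j])
    return cost

edd = np.argsort(due)   # earliest-due-date baseline (the paper uses clustering + GA/tabu search)
print("EDD cost:", round(total_weighted_tardiness(edd), 2))
```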

