Increasing parameter certainty and data utility through multi-objective calibration of a spatially distributed temperature and solute model

2011 ◽  
Vol 15 (5) ◽  
pp. 1547-1561 ◽  
Author(s):  
C. Bandaragoda ◽  
B. T. Neilson

Abstract. To support the goal of distributed hydrologic and instream model predictions based on physical processes, we explore multi-dimensional parameterization determined by a broad set of observations. We present a systematic approach to using various data types at spatially distributed locations to decrease the parameter bounds sampled within calibration algorithms, which ultimately provides information regarding the extent of individual processes represented within the model structure. Through the use of a simulation matrix, parameter sets are first locally optimized by fitting the respective data at one or two locations; the best results are then selected to resolve which parameter sets perform best at all locations, or globally. This approach is illustrated using the Two-Zone Temperature and Solute (TZTS) model for a case study in the Virgin River, Utah, USA, where temperature and solute tracer data were collected at multiple locations and zones within the river that represent the fate and transport of both heat and solute through the study reach. The result was a narrowed parameter space and increased parameter certainty, an outcome that, based on our results, would not have been achieved had only single-objective algorithms been used. We also found that the global optimum is best defined by multiple spatially distributed local optima, which supports the hypothesis that there is a discrete and narrowly bounded parameter range representing the processes controlling the dominant hydrologic responses. Further, we illustrate that the optimization process itself can be used to determine which observed responses and locations are most useful for estimating the parameters that yield a global fit, thereby guiding future data collection efforts.
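The local-then-global selection that the simulation matrix performs can be sketched in a few lines. Everything below is illustrative: the "sites", the candidate count and the distance-based objective are stand-ins, not the TZTS study's actual configuration, where the objective would be model-versus-observation error for temperature or tracer data at each location:

```python
import random

def rmse(params, site):
    # Hypothetical local objective: distance of a parameter set from a
    # site-specific optimum. In the real study this would be the error
    # between simulated and observed responses at that location.
    return sum((p - t) ** 2 for p, t in zip(params, site)) ** 0.5

# Hypothetical sites, each defining its own local objective.
sites = [(0.2, 0.5), (0.3, 0.4), (0.25, 0.45)]

random.seed(0)
candidates = [(random.random(), random.random()) for _ in range(500)]

# Step 1: local optimization -- keep the best parameter sets per site.
local_best = {
    i: sorted(candidates, key=lambda c: rmse(c, s))[:50]
    for i, s in enumerate(sites)
}

# Step 2: global selection -- among the locally good sets, pick the one
# that also performs best across *all* sites (sum of local errors).
pool = {c for best in local_best.values() for c in best}
global_best = min(pool, key=lambda c: sum(rmse(c, s) for s in sites))
```

The pool of locally optimal sets is what narrows the sampled parameter space before the global comparison is made.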

2010 ◽  
Vol 7 (5) ◽  
pp. 8309-8345
Author(s):  
C. Bandaragoda ◽  
B. T. Neilson

Abstract. When prediction in space and time is the goal of distributed hydrologic and instream models, basing model structure and parameterization on physical processes is fundamental. In this paper, we present a systematic approach to using various data types at spatially distributed locations to decrease the parameter bounds sampled within calibration algorithms, which ultimately provides information regarding the extent of individual processes represented within the model structure. Through the use of a simulation matrix, parameter sets are first locally optimized by fitting the respective data at two locations; the best results are then selected to resolve which parameter sets perform best at all locations, or globally. This approach is illustrated using the Two-Zone Temperature and Solute (TZTS) model for a case study in the Virgin River, Utah, USA, where temperature and solute tracer data were collected at multiple locations and zones within the river that represent the fate and transport of both heat and solute through the study reach. We found improved model performance over the range of spatially distributed datasets relative to more common calibration approaches that use data at one location with multi-criteria objectives or at multiple locations with a single-criterion objective. We also found that the global optimum is best defined by multiple spatially distributed local optima, which supports the hypothesis that there is a discrete and narrowly bounded parameter range representing the processes controlling the dominant hydrologic responses. Further, we illustrate that the optimization process itself can be used to determine which observed responses and locations are most useful for estimating the parameters that yield a global fit, thereby guiding future data collection efforts.


2016 ◽  
Vol 50 (3) ◽  
pp. 109-113
Author(s):  
Michael G. Morley ◽  
Marlene A. Jeffries ◽  
Steven F. Mihály ◽  
Reyna Jenkyns ◽  
Ben R. Biffard

Abstract. Ocean Networks Canada (ONC) operates the NEPTUNE and VENUS cabled ocean observatories to collect continuous data on physical, chemical, biological, and geological ocean conditions over multiyear time periods. Researchers can download real-time and historical data from a large variety of instruments to study complex earth and ocean processes from their home laboratories. Ensuring that users receive the most accurate data is a high priority at ONC, requiring QAQC (quality assurance and quality control) procedures to be developed for a variety of data types (Abeysirigunawardena et al., 2015). Acquiring long-term time series of oceanographic data from remote locations on the seafloor presents significant challenges from a QAQC perspective. In order to identify and study important scientific events and trends, data consolidated from multiple deployments and instruments need to be self-consistent and free of biases due to changes in instrument configurations, calibrations, metadata, biofouling, or degradation in instrument performance. As a case study, this paper describes efforts at ONC to identify and correct systematic biases in ocean current directions measured by ADCPs (acoustic Doppler current profilers), as well as the lessons learned to improve future data quality.
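A constant compass-heading bias of the kind described can be removed by rotating the measured east/north velocity components. The function below is a minimal sketch under a simple assumption (a single constant bias angle); how the bias is actually estimated and applied in ONC's QAQC workflow is not detailed in the abstract:

```python
import math

def correct_heading_bias(u, v, bias_deg):
    """Rotate east (u) / north (v) velocity components to remove a
    constant compass-heading bias in degrees (clockwise-positive).

    Hypothetical correction: in practice the bias would be estimated,
    e.g. by comparison against a reference deployment or site survey.
    """
    theta = math.radians(bias_deg)
    u_corr = u * math.cos(theta) - v * math.sin(theta)
    v_corr = u * math.sin(theta) + v * math.cos(theta)
    return u_corr, v_corr
```

Applying the same rotation across an entire deployment's time series keeps consolidated records self-consistent between deployments.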


2021 ◽  
pp. 104742
Author(s):  
Noor Fadzilah Yusof ◽  
Tukimat Lihan ◽  
Wan Mohd Razi Idris ◽  
Zulfahmi Ali Rahman ◽  
Muzneena Ahmad Mustapha ◽  
...  

2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Gianluca Solazzo ◽  
Ylenia Maruccia ◽  
Gianluca Lorenzo ◽  
Valentina Ndou ◽  
Pasquale Del Vecchio ◽  
...  

Purpose. This paper aims to highlight how big social data (BSD) and analytics exploitation may help destination management organisations (DMOs) understand tourist behaviours and destination experiences and images. Gathering data from two different sources, Flickr and Twitter, textual and visual contents are used to perform different analytics tasks and generate insights on tourist behaviour and the affective aspects of the destination image.
Design/methodology/approach. This work adopts a multimodal approach to BSD and analytics, drawing on multiple BSD sources and applying different analytics techniques to heterogeneous data types, to obtain complementary results on the Salento region (Italy) case study.
Findings. Results show that the generated insights allow DMOs to acquire new knowledge: discovery of unknown clusters of points of interest, identification of trends and seasonal patterns of tourist demand, monitoring of topics and sentiment, and identification of attractive places. DMOs can exploit these insights to support decision-making for the management and development of the destination, the enhancement of destination attractiveness, the shaping of new marketing and communication strategies and the planning of tourist demand within the destination.
Originality/value. The originality of this work lies in the use of BSD and analytics techniques to give DMOs specific insights on a destination in a deep and wide fashion. Collected data are used with a multimodal analytic approach to build profiles of tourist characteristics, images, attitudes and preferred destination attributes, which represent for DMOs a unique means for problem-solving, decision-making, innovation and prediction.


2021 ◽  
Vol 12 (4) ◽  
pp. 98-116
Author(s):  
Noureddine Boukhari ◽  
Fatima Debbat ◽  
Nicolas Monmarché ◽  
Mohamed Slimane

Evolution strategies (ES) are a family of robust stochastic methods for global optimization and have proved more capable of avoiding local optima than many other optimization methods. Many researchers have investigated different versions of the original evolution strategy, with good results on a variety of optimization problems. However, the convergence rate of the algorithm toward the global optimum remains only asymptotic. To accelerate convergence, a hybrid approach is proposed that combines the nonlinear simplex method (Nelder-Mead) with an adaptive scheme to control when the local search is applied, and the authors demonstrate that this combination yields significantly better convergence. The proposed method has been tested on 15 complex benchmark functions, applied to the bi-objective portfolio optimization problem and compared with other state-of-the-art techniques. Experimental results show that this hybridization improves performance in terms of both solution quality and convergence speed.
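The general shape of such a hybrid can be sketched as an ES loop that periodically hands its incumbent to a Nelder-Mead refinement. This is a sketch under simplifying assumptions, not the paper's algorithm: it uses a (1+1)-ES with the 1/5-success rule, and a fixed local-search period in place of the paper's adaptive trigger:

```python
import random

def nelder_mead(f, x0, iters=60, step=0.1):
    """Minimal Nelder-Mead simplex search (reflection, expansion,
    contraction, shrink) used as the local-search component."""
    n = len(x0)
    simplex = [list(x0)] + [
        [x + (step if i == j else 0.0) for j, x in enumerate(x0)]
        for i in range(n)
    ]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
        refl = [c + (c - w) for c, w in zip(centroid, worst)]
        if f(refl) < f(best):
            exp = [c + 2.0 * (c - w) for c, w in zip(centroid, worst)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = [c + 0.5 * (w - c) for c, w in zip(centroid, worst)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:  # shrink every vertex toward the best one
                simplex = [best] + [
                    [b + 0.5 * (x - b) for b, x in zip(best, p)]
                    for p in simplex[1:]
                ]
    return min(simplex, key=f)

def hybrid_es(f, x0, gens=100, sigma=0.5, local_every=20):
    """(1+1)-ES with 1/5-success step-size control; every `local_every`
    generations the incumbent is refined by Nelder-Mead (the paper uses
    an adaptive trigger -- a fixed period is our simplification)."""
    parent, successes = list(x0), 0
    for g in range(1, gens + 1):
        child = [x + random.gauss(0.0, sigma) for x in parent]
        if f(child) < f(parent):
            parent, successes = child, successes + 1
        if g % 10 == 0:  # 1/5 success rule: adapt mutation strength
            sigma *= 1.5 if successes / 10 > 0.2 else 0.7
            successes = 0
        if g % local_every == 0:
            parent = nelder_mead(f, parent)
    return parent
```

On a smooth test function such as the sphere, the simplex refinement drives the residual error down far faster than the ES mutations alone.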


Author(s):  
Ravindra S Waghmare ◽  
Arun S Moharir

For complex reactions, the optimal reactor network can involve several reactors operating with various temperature profiles. The often-reported strategy of optimizing the parameters of a heuristically predetermined reactor system (the superstructure approach) falls short of obtaining the true solution because of the presence of multiple local optima. The attainable-set method gives the global optimum but requires an in-depth study of each reaction scheme. Here, one such study using phase-plane analysis (instead of convexity-based analysis) is reported for finding the globally optimal non-isothermal reactor network for the van de Vusse reaction (A -> B -> C, 2A -> D; the objective is to maximize the yield of B). Compared with the two-reactor networks proposed earlier, it is found that up to five reactors (a CSTR with or without bypass of feed, an isothermal PFR, a non-isothermal PFR, a CSTR and an isothermal PFR) may be required to obtain the highest yield of the desired intermediate. The proposed method involves only elementary calculus. The detailed solution algorithm is described using an analogy with highways. Three cases with the values of reaction constants reported in the literature have been solved.
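The kinetics being optimized can be made concrete by integrating the van de Vusse rate equations along a single isothermal PFR and tracking the intermediate B. The rate constants and feed concentration below are illustrative, not the literature cases the paper solves, and this single-reactor sketch says nothing about the multi-reactor network structure:

```python
def pfr_vandevusse(ca0=1.0, k1=1.0, k2=1.0, k3=1.0, tau=2.0, n=20000):
    """Euler-integrate van de Vusse kinetics (A -> B -> C, 2A -> D)
    along an isothermal PFR of residence time `tau` and return the
    maximum B concentration reached -- the yield a single reactor
    segment can contribute. Parameters are illustrative assumptions.

        dCa/dt = -k1*Ca - 2*k3*Ca^2
        dCb/dt =  k1*Ca - k2*Cb
    """
    dt = tau / n
    ca, cb, cb_max = ca0, 0.0, 0.0
    for _ in range(n):
        ra = -k1 * ca - 2.0 * k3 * ca * ca
        rb = k1 * ca - k2 * cb
        ca += dt * ra
        cb += dt * rb
        cb_max = max(cb_max, cb)
    return cb_max
```

With the side reaction switched off (k3 = 0) and k1 = k2, the profile reduces to the textbook series reaction, whose peak B concentration is ca0/e, which gives a quick sanity check on the integrator.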


2005 ◽  
Vol 15 (03) ◽  
pp. 337-352 ◽  
Author(s):  
THOMAS NITSCHE

Data distributions are an abstract notion for describing parallel programs by means of overlapping data structures. A generic data distribution layer serves as a basis for implementing specific data distributions over arbitrary algebraic data types and arrays, as well as generic skeletons. The communication operations necessary for exchanging overlapping data elements are derived automatically from the specification of the overlaps. This paper describes how the communication operations used internally by the generic skeletons are derived, especially for asynchronous and synchronous communication scheduling. As a case study, we discuss the iterative solution of PDEs and compare a hand-coded MPI version with a skeletal one based on overlapping data distributions.
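The overlap exchange that such a skeleton derives automatically corresponds to the classic ghost-cell pattern in hand-coded MPI programs. The serial sketch below simulates it for a 1D Jacobi PDE solve split into two subdomains; the grid size, iteration count and boundary values are illustrative assumptions, with plain list copies standing in for MPI sends and receives:

```python
def jacobi_two_domains(iters=2000):
    """Solve u'' = 0 on 17 grid points (u(0)=1, u(16)=0) with two
    subdomains that overlap and exchange one ghost cell per sweep --
    a serial stand-in for the halo exchange a data-distribution
    skeleton would derive from the overlap specification."""
    left = [0.0] * 10   # holds points 0..9; owns 0..8, point 9 is a ghost
    right = [0.0] * 10  # holds points 7..16; point 7 is a ghost, owns 8..16
    left[0], right[-1] = 1.0, 0.0  # Dirichlet boundary conditions
    for _ in range(iters):
        # "communication": copy the neighbour's value into the ghost cell
        left[9] = right[2]   # point 9 is owned by the right subdomain
        right[0] = left[7]   # point 7 is owned by the left subdomain
        # local Jacobi sweep on each subdomain's interior cells
        new_left, new_right = left[:], right[:]
        for i in range(1, 9):
            new_left[i] = 0.5 * (left[i - 1] + left[i + 1])
            new_right[i] = 0.5 * (right[i - 1] + right[i + 1])
        left, right = new_left, new_right
    # stitch the owned regions back together: points 0..8 + points 9..16
    return left[:9] + right[2:]
```

Because the ghost copies happen before each sweep, the split computation reproduces plain Jacobi on the whole domain, which for this problem converges to the linear profile between the boundary values.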

