Ship Emission Mitigation Strategies Choice Under Uncertainty

Energies ◽  
2020 ◽  
Vol 13 (9) ◽  
pp. 2213 ◽  
Author(s):  
Jun Yuan ◽  
Haowei Wang ◽  
Szu Hui Ng ◽  
Victor Nian

Various mitigation strategies have been proposed to reduce CO2 emissions from ships, which have become a major contributor to global emissions. The fuel consumption under different mitigation strategies can be evaluated from two data sources: real data from ship systems and simulated data from simulation models. In practice, the uncertainties in the obtained data may have non-negligible impacts on the evaluation of mitigation strategies. In this paper, a Gaussian process metamodel-based approach is proposed to evaluate ship fuel consumption under different mitigation strategies. The proposed method can not only incorporate different data sources but also account for the uncertainties in the data, yielding a more reliable evaluation. A cost-effectiveness analysis based on the fuel consumption prediction is then applied to rank the mitigation strategies under uncertainty. The accuracy and efficiency of the proposed method are illustrated in a chemical tanker case study, and the results indicate that it is critical to consider the uncertainty, as ignoring it can lead to suboptimal decisions. Here, trim optimisation is ranked as more effective than draft optimisation when the uncertainty is ignored, but the ranking reverses when the uncertainty in the estimations is fully accounted for.
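As an illustration of the metamodelling idea, the hedged Python sketch below pools many noisy simulated observations with a few more accurate real observations in a single Gaussian process, using observation-specific noise variances. The speed/draft inputs, noise levels and fuel surface are hypothetical and do not reproduce the authors' exact multi-source formulation.

```python
# Minimal sketch, assuming hypothetical speed/draft inputs and noise levels;
# it only illustrates how simulated and real data with different accuracies
# can be pooled in one Gaussian-process metamodel.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def true_fuel(speed, draft):
    # Hypothetical ground-truth fuel consumption surface (t/day).
    return 0.02 * speed**3 + 1.5 * draft

# Simulated data: cheap and plentiful, but noisier.
X_sim = rng.uniform([10, 6], [16, 10], size=(40, 2))
y_sim = true_fuel(*X_sim.T) + rng.normal(0.0, 2.0, 40)

# Real data: scarce but more accurate.
X_real = rng.uniform([10, 6], [16, 10], size=(8, 2))
y_real = true_fuel(*X_real.T) + rng.normal(0.0, 0.5, 8)

X = np.vstack([X_sim, X_real])
y = np.concatenate([y_sim, y_real])
# Observation-specific noise variances encode the different uncertainty levels.
alpha = np.concatenate([np.full(40, 2.0**2), np.full(8, 0.5**2)])

kernel = ConstantKernel(1.0) * RBF(length_scale=[2.0, 2.0])
gp = GaussianProcessRegressor(kernel=kernel, alpha=alpha, normalize_y=True)
gp.fit(X, y)

# Predictive mean and standard deviation would feed the cost-effectiveness ranking.
mu, sd = gp.predict(np.array([[13.0, 8.0]]), return_std=True)
print(f"predicted fuel: {mu[0]:.1f} ± {sd[0]:.1f} t/day")
```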

2019 ◽  
Vol 36 (06) ◽  
pp. 1940011
Author(s):  
Giulia Pedrielli ◽  
K. Selcuk Candan ◽  
Xilun Chen ◽  
Logan Mathesen ◽  
Alireza Inanalouganji ◽  
...  

Real-time decision making has attracted increasing interest as a means of efficiently operating complex systems. The main challenge in achieving real-time decision making is to develop next-generation optimization procedures that can work efficiently with: (i) real data coming from a large, complex dynamical system, and (ii) available simulation models that reproduce the system dynamics. While this paper focuses on a different problem from that addressed in the reinforcement learning (RL) literature, the methods proposed here can also serve as support in a sequential setting. The result of this work is the new Generalized Ordinal Learning Framework (GOLF), which utilizes simulated data, interpreting them as low-accuracy information to be intelligently collected offline and utilized online once the scenario is revealed to the user. GOLF supports real-time decision making on complex dynamical systems once a specific scenario is realized. We show preliminary results of the proposed techniques that motivate the authors to further pursue the presented ideas.


2021 ◽  
Author(s):  
Xiaopeng Li ◽  
Jianliang Huang ◽  
Martin Willberg ◽  
Roland Pail ◽  
Cornelis Slobbe ◽  
...  

The theories of downward continuation (DC) have been studied extensively for many decades, during which many different approaches were developed. In real applications, however, researchers often use just one method, often due to resource or time limitations, without a rigorous head-to-head comparison with other alternatives. Because different methods perform quite differently under various conditions, comparing results from different methods helps to identify potential problems when dramatic differences occur, and to confirm the correctness of the solutions when the results converge, which is extremely important for real applications such as building official national vertical datums. This paper provides exactly such a case study, recording the collective wisdom recently developed within the IAG's study group SC2.4.1. A total of six commonly used DC methods, namely SHA (NGS), LSC (DTU Space), Poisson and ADC (NRCan), RBF (TU Delft), and RLSC (TUM), are applied to both simulated data (combining two sampling strategies with three noise levels) and real data in a Colorado-area test bed. The data are downward continued both to surface points and to the reference ellipsoid. The surface points are directly evaluated against the observed gravity data on the topography. The ellipsoid points are then transformed into geoid heights according to NRCan's Stokes-Helmert scheme and eventually evaluated at the GNSS/levelling benchmarks. In this presentation, we summarize the work done and the results obtained by the aforementioned working group.


Risks ◽  
2018 ◽  
Vol 6 (3) ◽  
pp. 83 ◽  
Author(s):  
Michelle Xia

In this paper, we study the problem of misrepresentation under heavy-tailed regression models in the presence of both misrepresented and correctly measured risk factors. Misrepresentation is a type of fraud in which a policy applicant gives a false statement on a risk factor that determines the insurance premium. In the regression context, we introduce heavy-tailed misrepresentation models based on the lognormal, Weibull and Pareto distributions. The proposed models allow insurance modelers to identify risk characteristics associated with the misrepresentation risk by imposing a latent logit model on the prevalence of misrepresentation. We prove theoretical identifiability and implement the models using Bayesian Markov chain Monte Carlo techniques. Model performance is evaluated on both simulated data and real data from the Medical Expenditure Panel Survey. The simulation study confirms the consistency of the Bayesian estimators in large samples, whereas the case study demonstrates the necessity of the proposed models for real applications when the losses exhibit heavy-tailed features.
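The following hedged sketch generates data from a lognormal misrepresentation model of the kind described above: a latent logit model governs the probability that a true risk factor is reported as absent, and a naive fit on the observed factor understates its effect. All coefficients are hypothetical, and the Bayesian MCMC estimation itself is not shown.

```python
# Minimal data-generating sketch with hypothetical coefficients; it is not
# the paper's estimation procedure, only the misrepresentation structure.
import numpy as np

rng = np.random.default_rng(1)
n = 5000

x = rng.normal(size=n)                    # correctly measured risk factor
v_true = rng.binomial(1, 0.4, size=n)     # true (unobserved) risk status

# Latent logit model for the prevalence of misrepresentation among v_true == 1.
p_mis = 1.0 / (1.0 + np.exp(-(-1.0 + 0.8 * x)))
misrep = (v_true == 1) & (rng.uniform(size=n) < p_mis)
v_obs = np.where(misrep, 0, v_true)       # what the insurer observes

# Heavy-tailed (lognormal) losses depend on the *true* risk factor.
mu = 1.0 + 0.5 * x + 0.9 * v_true
loss = rng.lognormal(mean=mu, sigma=0.6)

# A naive log-linear fit on the observed factor understates its effect;
# the paper's Bayesian latent-variable models are designed to correct this.
A = np.column_stack([np.ones(n), x, v_obs])
beta_naive = np.linalg.lstsq(A, np.log(loss), rcond=None)[0]
print("naive effect of observed risk factor:", round(beta_naive[2], 3), "(true 0.9)")
```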


2020 ◽  
Vol 2020 ◽  
pp. 1-11
Author(s):  
Made Ayu Dwi Octavanny ◽  
I. Nyoman Budiantara ◽  
Heri Kuswanto ◽  
Dyah Putri Rahmawati

The existing literature in nonparametric regression has largely established models that apply a single estimator to all predictors. This study develops a mixed truncated spline and Fourier series model in nonparametric regression for longitudinal data. The mixed estimator is obtained by solving a two-stage estimation consisting of penalized weighted least squares (PWLS) and weighted least squares (WLS) optimization. To demonstrate the performance of the proposed method, both simulated and real data are provided. The results from the simulated data and the case study show consistent findings.
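A minimal sketch of the mixed basis is given below: a truncated power spline basis and a Fourier basis are concatenated and fitted by weighted least squares. The knots, frequencies and weights are hypothetical, and the paper's two-stage PWLS/WLS estimation for longitudinal data is not reproduced.

```python
# Minimal sketch, assuming hypothetical knots, frequencies and identity weights.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 200)
y = 2 * np.maximum(t - 0.5, 0) + 0.5 * np.sin(4 * np.pi * t) + rng.normal(0, 0.1, t.size)

def truncated_spline_basis(t, knots, degree=1):
    # Polynomial terms plus truncated power terms (t - k)_+^degree.
    cols = [t**d for d in range(degree + 1)]
    cols += [np.maximum(t - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)

def fourier_basis(t, n_freq):
    # Sine/cosine terms capturing the periodic part of the signal.
    cols = []
    for j in range(1, n_freq + 1):
        cols += [np.cos(2 * np.pi * j * t), np.sin(2 * np.pi * j * t)]
    return np.column_stack(cols)

B = np.hstack([truncated_spline_basis(t, knots=[0.25, 0.5, 0.75]),
               fourier_basis(t, n_freq=2)])
W = np.eye(t.size)                                 # longitudinal weights would go here
beta = np.linalg.solve(B.T @ W @ B, B.T @ W @ y)   # WLS normal equations
y_hat = B @ beta
print("residual std:", float(np.std(y - y_hat).round(3)))
```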


2012 ◽  
Vol 20 (3) ◽  
pp. 203-224 ◽  
Author(s):  
Shon R. Grabbe ◽  
Banavar Sridhar ◽  
Avijit Mukherjee ◽  
Alexander Morando

2020 ◽  
Author(s):  
George Karagiannakis

This paper deals with state-of-the-art risk and resilience calculations for industrial plants. Resilience is a top-priority issue on the agenda of societies due to climate change and the ever-present demands for human life safety and financial robustness. Industrial plants are highly complex systems containing a considerable amount of equipment such as steel storage tanks, pipe rack-piping systems, and other installations. Loss Of Containment (LOC) scenarios triggered by past earthquakes, due to failures of critical components, were followed by severe repercussions on the community, long recovery times and great economic losses. Hence, facility planners and emergency managers should be aware of possible seismic damage and should have recovery plans already established to maximize resilience and minimize losses. Seismic risk assessment is the first step of resilience calculations, as it establishes possible damage scenarios. For an accurate risk analysis, the vulnerability of the plant equipment must be assessed; this is feasible either through fragility databases in the literature that refer to customized equipment or through numerical calculations. Two different approaches to fragility assessment are discussed in this paper: (i) code-based fragility curves (FCs); and (ii) fragility curves based on numerical models. A carbon black process plant is used as a case study to display the influence of the various fragility curve realizations, taking their effects on risk and resilience calculations into account. Additionally, a new way of representing the total resilience of industrial installations is proposed. More precisely, all possible scenarios are endowed with their weighted recovery curves (according to their probability of occurrence) and summed together. The result is a concise graph that can help stakeholders identify critical plant equipment and make decisions on seismic mitigation strategies for plant safety and efficiency. Finally, possible mitigation strategies, such as structural health monitoring and metamaterial-based seismic shields, are addressed in order to show how future developments may enhance plant resilience. The work presented here is a highly condensed application of the research done during the XP-RESILIENCE project; more detailed information is available on the project website https://r.unitn.it/en/dicam/xp-resilience.
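The aggregation of weighted recovery curves can be sketched in a few lines of Python. The damage scenarios, occurrence probabilities, downtimes and linear restoration model below are purely illustrative, not results from the case study.

```python
# Minimal sketch of probability-weighted recovery curves summed into one
# resilience graph; scenario probabilities and downtimes are hypothetical.
import numpy as np

t = np.linspace(0, 365, 366)            # days after the earthquake

def recovery_curve(t, downtime):
    # Simple linear restoration of functionality over the given downtime.
    return np.clip(t / downtime, 0.0, 1.0)

scenarios = {                            # hypothetical damage / LOC scenarios
    "no damage":        (0.90, 1),
    "tank damage":      (0.07, 120),
    "pipe-rack damage": (0.03, 240),
}

total = np.zeros_like(t)
for name, (prob, downtime) in scenarios.items():
    total += prob * recovery_curve(t, downtime)

# The area under the weighted curve is a compact resilience indicator.
print("expected functionality at day 30:", round(float(np.interp(30, t, total)), 3))
```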


Tellus B ◽  
2011 ◽  
Vol 63 (3) ◽  
Author(s):  
R. J. Andres ◽  
J. S. Gregg ◽  
L. Losey ◽  
G. Marland ◽  
T. A. Boden

Metabolites ◽  
2021 ◽  
Vol 11 (4) ◽  
pp. 214
Author(s):  
Aneta Sawikowska ◽  
Anna Piasecka ◽  
Piotr Kachlicki ◽  
Paweł Krajewski

Peak overlapping is a common problem in chromatography, especially for complex biological mixtures such as mixtures of metabolites. Because different compounds with similar chromatographic properties co-elute, peak separation becomes challenging. In this paper, two computational methods for separating peaks, applied for the first time to large chromatographic datasets, are described, compared, and experimentally validated. The methods lead from raw observations to data that can serve as input for statistical analysis. First, in both methods, the data are normalized by sample mass, the baseline is removed, retention-time alignment is conducted, and peaks are detected. Then, in the first method, clustering is used to separate overlapping peaks, whereas in the second method, functional principal component analysis (FPCA) is applied for the same purpose. Simulated data and experimental results are used as examples to present and compare both methods. The real data were obtained in a study of metabolomic changes in barley (Hordeum vulgare) leaves under drought stress. The results suggest that both methods are suitable for separating overlapping peaks, with the additional advantage of FPCA being the possibility of assessing the variability of individual compounds present within the same peaks of different chromatograms.
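As a rough illustration of the FPCA route, the sketch below treats aligned chromatogram segments as discretized functions on a retention-time grid and applies ordinary PCA to them (a grid-based stand-in for FPCA). The signals are synthetic, and the paper's full preprocessing pipeline (normalization, baseline removal, alignment, detection) is assumed to have been applied already.

```python
# Minimal sketch: grid-based PCA of aligned overlapping-peak segments as a
# stand-in for FPCA; the two co-eluting compounds and noise are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
rt = np.linspace(0, 1, 120)              # retention-time grid within the region

def gaussian_peak(center, width, height):
    return height * np.exp(-0.5 * ((rt - center) / width) ** 2)

# 30 samples with two co-eluting compounds whose abundances vary independently.
signals = np.array([
    gaussian_peak(0.45, 0.05, rng.uniform(0.5, 2.0))
    + gaussian_peak(0.55, 0.05, rng.uniform(0.5, 2.0))
    + rng.normal(0, 0.02, rt.size)
    for _ in range(30)
])

pca = PCA(n_components=2)
scores = pca.fit_transform(signals)      # per-sample scores reflect compound-level variation
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
```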


2021 ◽  
Vol 10 (7) ◽  
pp. 435
Author(s):  
Yongbo Wang ◽  
Nanshan Zheng ◽  
Zhengfu Bian

Since pairwise registration is a necessary step for the seamless fusion of point clouds from neighboring stations, a closed-form solution to planar feature-based registration of LiDAR (Light Detection and Ranging) point clouds is proposed in this paper. Based on the Plücker-coordinate representation of linear features in three-dimensional space, a quad-tuple representation of planar features is introduced, which makes it possible to directly determine the difference between any two planar features. Dual quaternions are employed to represent the spatial transformation, and operations between dual quaternions and the quad-tuple representation of planar features are given, from which an error norm is constructed. Based on L2-norm minimization, detailed derivations of the proposed solution are explained step by step. Two experiments were designed in which both simulated data and real data were used to verify the correctness and feasibility of the proposed solution. With the simulated data, the calculated registration results were consistent with the pre-established parameters, which verifies the correctness of the presented solution. With the real data, the calculated registration results were consistent with the results calculated by iterative methods. Two conclusions can be drawn from the experiments: (1) the proposed solution does not require any initial estimates of the unknown parameters, which ensures the stability and robustness of the solution; (2) using dual quaternions to represent the spatial transformation greatly reduces the additional constraints in the estimation process.
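For readers who want a runnable baseline, the sketch below solves the same plane-to-plane registration problem with a simpler closed form than the paper's dual-quaternion solution: the rotation is estimated from corresponding plane normals via SVD (Kabsch), and the translation from the plane offsets by least squares. Planes are written as n·x = d with unit normals, and the correspondences are synthetic.

```python
# Minimal sketch of plane-feature registration (Kabsch + least squares),
# not the paper's dual-quaternion formulation; data are synthetic.
import numpy as np

rng = np.random.default_rng(4)

# Ground-truth transform used to generate synthetic plane correspondences.
angle = np.deg2rad(20.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
t_true = np.array([0.5, -1.0, 2.0])

n_src = rng.normal(size=(6, 3))
n_src /= np.linalg.norm(n_src, axis=1, keepdims=True)
d_src = rng.uniform(-5, 5, 6)

# Transformed planes: n' = R n,  d' = d + n'·t.
n_dst = n_src @ R_true.T
d_dst = d_src + n_dst @ t_true

# Rotation: Kabsch alignment of the corresponding normal vectors.
U, _, Vt = np.linalg.svd(n_dst.T @ n_src)
D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
R_est = U @ D @ Vt

# Translation: solve (R n_i)·t = d'_i - d_i in the least-squares sense.
A = n_src @ R_est.T
t_est = np.linalg.lstsq(A, d_dst - d_src, rcond=None)[0]

print("rotation error:", float(np.linalg.norm(R_est - R_true).round(6)))
print("translation error:", float(np.linalg.norm(t_est - t_true).round(6)))
```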


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Camilo Broc ◽  
Therese Truong ◽  
Benoit Liquet

Background The increasing number of genome-wide association studies (GWAS) has revealed several loci that are associated with multiple distinct phenotypes, suggesting the existence of pleiotropic effects. Highlighting these cross-phenotype genetic associations could help to identify and understand common biological mechanisms underlying certain diseases. Common approaches test the association between genetic variants and multiple traits at the SNP level. In this paper, we propose a novel gene- and pathway-level approach for the case where several independent GWAS on independent traits are available. The method is based on a generalization of the sparse group Partial Least Squares (sgPLS) that takes groups of variables into account, with a Lasso penalization that links all independent data sets. This method, called joint-sgPLS, is able to convincingly detect signal at both the variable level and the group level. Results Our method has the advantage of proposing a globally readable model while coping with the architecture of the data. It can outperform traditional methods and provides wider insight in terms of a priori information. We compared the performance of the proposed method to other benchmark methods on simulated data and give an example of application to real data, with the aim of highlighting common susceptibility variants to breast and thyroid cancers. Conclusion The joint-sgPLS shows interesting properties for detecting a signal. As an extension of PLS, the method is suited to data with a large number of variables. The chosen Lasso penalization copes with architectures of groups of variables and of observation sets. Furthermore, although the method has been applied to a genetic study, its formulation is adapted to any data with a high number of variables and a known a priori structure, in other application fields as well.
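The sparse-group penalization that sgPLS-type methods build on can be illustrated with a short sketch: a first PLS loading is shrunk with the sparse group lasso proximal operator, zeroing whole groups (e.g. genes or pathways) as well as individual variables within the kept groups. This is only an illustration of the penalization idea with hypothetical group sizes and tuning values, not the joint-sgPLS algorithm of the paper.

```python
# Minimal sketch of a sparse-group-thresholded PLS loading; groups, sample
# sizes and penalty levels are hypothetical illustrations only.
import numpy as np

rng = np.random.default_rng(5)
n, p, group_size = 200, 30, 10            # 3 groups of 10 hypothetical SNPs
groups = np.repeat(np.arange(3), group_size)

X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 1.0                            # only part of group 0 is active
y = X @ beta + rng.normal(0, 1.0, n)

def sparse_group_threshold(w, groups, lam_group, lam_within):
    # Lasso part: soft-threshold individual coefficients.
    w = np.sign(w) * np.maximum(np.abs(w) - lam_within, 0.0)
    # Group lasso part: shrink (or zero) each group as a whole.
    for g in np.unique(groups):
        idx = groups == g
        norm_g = np.linalg.norm(w[idx])
        scale = max(1.0 - lam_group / norm_g, 0.0) if norm_g > 0 else 0.0
        w[idx] *= scale
    return w

w = X.T @ y / n                           # first PLS loading direction
w = sparse_group_threshold(w, groups, lam_group=0.5, lam_within=0.15)
if np.linalg.norm(w) > 0:
    w /= np.linalg.norm(w)

print("groups kept:", sorted(set(groups[np.abs(w) > 0])))
```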

