deterministic methods
Recently Published Documents

TOTAL DOCUMENTS: 192 (FIVE YEARS: 45)
H-INDEX: 16 (FIVE YEARS: 2)

Author(s):  
N. A. Ulyanov ◽  
S. V. Yaskevich ◽  
P. A. Dergach ◽  
A. V. Yablokov

Manual processing of the large volumes of continuous data produced by local seismic networks takes a lot of time, so automatic algorithms are used to detect seismic events. Deterministic detection methods, which do an excellent job of detecting intensive earthquakes, face critical problems when detecting weak seismic events. Because they rely on energy calculations, they produce multiple detection errors: weak seismic events may go undetected, while high-amplitude noise may be mistakenly detected as an event. In our work, we propose a detection method capable of surpassing deterministic methods in detecting events on seismograms, successfully detecting a similar number of events or more with fewer false detections.
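The classical energy-based deterministic detector the abstract alludes to can be illustrated with an STA/LTA (short-term average over long-term average) trigger. This is a generic sketch, not the authors' baseline: the window lengths, threshold, and synthetic trace below are all illustrative choices.

```python
import numpy as np

def sta_lta(signal, sta_len, lta_len):
    """Classical STA/LTA ratio: short-term average energy divided by
    long-term average energy, computed sample by sample."""
    energy = signal ** 2
    ratio = np.zeros(len(energy))
    for i in range(lta_len, len(energy)):
        sta = energy[i - sta_len:i].mean()
        lta = energy[i - lta_len:i].mean()
        if lta > 0:
            ratio[i] = sta / lta
    return ratio

# Synthetic trace: Gaussian background noise with one strong burst.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 2000)
trace[1200:1260] += 8.0 * np.sin(np.linspace(0, 20 * np.pi, 60))

r = sta_lta(trace, sta_len=20, lta_len=200)
triggered = np.flatnonzero(r > 4.0)  # threshold is a tunable parameter
```

The strong burst is detected easily, but a weak event (say, amplitude comparable to the noise) would barely move the ratio, which is exactly the failure mode the abstract describes.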


2022 ◽  
Vol 165 ◽  
pp. 108633
Author(s):  
Leili Taghizadeh ◽  
Ahmadreza Zolfaghari ◽  
Mahdi Zangian ◽  
Javad Mokhtari ◽  
Yohannes Sardjono

2021 ◽  
Author(s):  
◽  
Melissa Welsh

<p>Acute rheumatic fever is a major cause of heart disease in many parts of the world. Though it is generally considered rare in developed countries, it remains a large issue in New Zealand. Of particular concern is the prevalence of acute rheumatic fever among Maori and Pacific Island peoples. In this thesis we develop a model to simulate acute rheumatic fever in a population. We discuss the use of both deterministic methods and stochastic processes. Demographics and statistics specific to New Zealand are then used to develop the model in a way that fits specifically to the situation in New Zealand. We also consider the introduction of treatment strategies for acute rheumatic fever and discuss how risk factors can be used to focus such strategies.</p>
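The deterministic-versus-stochastic modelling contrast the thesis discusses can be sketched with a minimal SIR compartment model: a deterministic version integrated with Euler steps next to a Gillespie-style stochastic version. The rates and population size here are invented for illustration, not the thesis's calibrated New Zealand values.

```python
import numpy as np

# Hypothetical parameters, not the thesis's calibrated values.
beta, gamma = 0.3, 0.1   # transmission and recovery rates (per day)
N = 1000                 # population size

def deterministic_sir(days, dt=0.01):
    """Deterministic SIR integrated with forward Euler steps."""
    s, i, r = N - 1.0, 1.0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / N * dt
        new_rec = gamma * i * dt
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r

def stochastic_sir(days, rng):
    """Gillespie-style stochastic SIR with exponential waiting times."""
    s, i, r, t = N - 1, 1, 0, 0.0
    while i > 0 and t < days:
        rate_inf = beta * s * i / N
        rate_rec = gamma * i
        total = rate_inf + rate_rec
        t += rng.exponential(1.0 / total)
        if rng.random() < rate_inf / total:
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
    return s, i, r

s_d, i_d, r_d = deterministic_sir(200)
runs = [stochastic_sir(200, np.random.default_rng(k)) for k in range(50)]
```

The deterministic run always produces the same epidemic, while stochastic runs vary; with a single initial case some stochastic runs die out before any outbreak, an effect the deterministic model cannot capture.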



2021 ◽  
Author(s):  
Greg Michael Nelson ◽  
Robert Barrie

Abstract

Objectives/Scope: Re-wheeling compressors to match late-life field conditions gives significant benefits in operational efficiency and carbon reduction. However, changing the compressor wheels and increasing shaft speeds also introduces risk to the rotor-dynamic stability of the system. API assessments use deterministic methods to assess the design change, but give less information about the key risks and how to control them. This paper outlines new methods for assessing rotor-dynamic risks to compressors during re-wheeling and their value over traditional methods.

Methods: New methods were developed to extend beyond the API requirements in order to assess and manage the rotor-dynamic risk as part of a peer review of a compressor-train re-wheel. A combination of sensitivity studies on key parameters and Self-Organizing Maps (SOMs, a machine-learning technique) was used to identify the factors that present the greatest risk to the re-wheeling, and a Monte Carlo analysis was used to quantify the change in risk of rotor-dynamic problems compared with the existing machine.

Results: The Monte Carlo analysis applied random distributions of factors to key input parameters, with the same factor samples applied to the existing and re-wheeled designs. It showed that although the re-wheeled design was nominally more stable than the existing design according to the API analysis, it actually presented a greater risk of instability: under uncertainty in the input parameters, the distribution of stability values had a higher mean but a greater spread than that of the existing machine. Since the existing machine is free from dynamics problems, the parameter combinations that rendered the existing machine unstable could be discounted, but the resulting subset of factors, when applied to the re-wheeled design, still gave some unstable cases. Therefore, the fact that the existing machine is free from dynamics problems does not in itself rule out problems following the re-wheel. SOMs were used to identify the components posing the greatest risk to the re-wheeled design, highlighting that low stiffness in two particular bearings along the high-speed shaft would pose the greatest risk to shaft stability; operators and OEMs can therefore pay close attention to these bearings to manage the risks as the re-wheel progresses.

Novel Information: This work shows that probabilistic and machine-learning techniques have significant value in managing risks during compressor re-wheeling, highlighting risks that would not be identified using standard deterministic methods and focusing attention on the aspects that are most important to manage.
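The Monte Carlo conditioning argument (higher nominal stability yet greater instability risk, even after discounting parameter combinations that would destabilise the existing machine) can be reproduced with a toy model. Everything numeric here is invented: the linear "stability margin" response, the nominal margins, and the sensitivity coefficients are illustrative stand-ins for a real rotor-dynamic model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Sampled multiplicative factors on two key bearing stiffnesses.
factors = rng.normal(1.0, 0.15, size=(n, 2))

def margin(f, nominal, sensitivity):
    """Hypothetical stability margin: positive = stable.
    Linear response to the stiffness factors (illustrative only)."""
    return nominal + sensitivity @ (f - 1.0).T

# Re-wheeled design: higher nominal margin but larger sensitivity,
# mirroring the paper's higher-mean / greater-spread finding.
existing = margin(factors, nominal=0.30, sensitivity=np.array([0.5, 0.5]))
rewheeled = margin(factors, nominal=0.40, sensitivity=np.array([1.5, 1.5]))

p_existing = (existing < 0).mean()
p_rewheeled = (rewheeled < 0).mean()

# Condition on the field evidence that the existing machine is stable:
# discard samples where the existing design would be unstable.
ok = existing >= 0
p_rewheeled_given_ok = (rewheeled[ok] < 0).mean()
```

In this toy setup the re-wheeled design is unstable in roughly ten percent of samples despite its higher nominal margin, and conditioning on a stable existing machine barely reduces that probability, which is the qualitative conclusion the abstract reports.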


2021 ◽  
Author(s):  
Avgoustinos Vouros ◽  
Stephen Langdell ◽  
Mike Croucher ◽  
Eleni Vasilaki

Abstract: K-Means is one of the most widely used algorithms for data clustering and the usual clustering method for benchmarking. Despite its wide application, it is well known to suffer from a series of disadvantages: it is only able to find local minima, and the positions of the initial cluster centres (centroids) can greatly affect the clustering solution. Over the years, many K-Means variations and initialisation techniques have been proposed, with different degrees of complexity. In this study we focus on common K-Means variations along with a range of deterministic and stochastic initialisation techniques. We show that, on average, more sophisticated initialisation techniques alleviate the need for complex clustering methods. Furthermore, deterministic methods perform better than stochastic methods. However, there is a trade-off: less sophisticated stochastic methods, executed multiple times, can result in better clustering. Factoring in execution time, deterministic methods can be competitive and result in a good clustering solution. These conclusions are obtained through extensive benchmarking using a range of synthetic model generators and real-world data sets.
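The deterministic-versus-restarted-stochastic initialisation trade-off the study benchmarks can be sketched with a plain Lloyd's-algorithm implementation. The particular deterministic rule below (evenly spaced data points) and the blob data are illustrative choices, not the initialisation techniques evaluated in the paper.

```python
import numpy as np

def kmeans(X, centroids, iters=100):
    """Lloyd's algorithm from given initial centroids; returns the
    final centroids and the within-cluster sum of squares (inertia)."""
    for _ in range(iters):
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        new = np.array([X[labels == k].mean(0) if (labels == k).any()
                        else centroids[k] for k in range(len(centroids))])
        if np.allclose(new, centroids):
            break
        centroids = new
    inertia = ((X - centroids[labels]) ** 2).sum()
    return centroids, inertia

rng = np.random.default_rng(1)
# Three well-separated Gaussian blobs.
X = np.vstack([rng.normal(c, 0.3, size=(100, 2))
               for c in [(0, 0), (5, 0), (0, 5)]])

# Deterministic initialisation: a single run from evenly spaced points.
det_init = X[np.linspace(0, len(X) - 1, 3, dtype=int)]
_, det_inertia = kmeans(X, det_init.copy())

# Stochastic initialisation: several cheap random restarts, keep the best.
sto_inertia = min(kmeans(X, X[rng.choice(len(X), 3, replace=False)].copy())[1]
                  for _ in range(10))
```

A single stochastic run can land in a poor local minimum (e.g. two centroids in one blob), which is why restarts are needed; the deterministic run is reproducible and needs no restarts, at the cost of depending on how well its rule suits the data.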


2021 ◽  
Vol 11 (8) ◽  
pp. 3699
Author(s):  
Rajeev Das ◽  
Azzedine Soulaimani

The parameters of the constitutive models used in the design of rockfill dams are associated with a high degree of uncertainty. This occurs because rockfill dams are composed of numerous zones, each with different soil materials, and it is not feasible to extract materials from such structures to accurately ascertain their behavior or their respective parameters. The general approach involves laboratory tests using small material samples or empirical data from the literature. However, such measures lack an accurate representation of the actual scenario, resulting in uncertainties that limit the suitability of the model in the design process. Inverse analysis provides an option to better understand dam behavior. This procedure uses real monitored data, such as deformations and stresses, collected from the dam structure via installed instruments. Fundamentally, it is a non-destructive approach that combines optimization methods and actual performance data to determine the parameter values by minimizing the differences between simulated and observed results. This paper considers data from an actual rockfill dam and proposes a surrogate-assisted non-deterministic framework for its inverse analysis. A suitable error/objective function that measures the differences between the actual and simulated displacement values is defined first. Non-deterministic algorithms are used as the optimization technique, as they can avoid local optima and are more robust than conventional deterministic methods. Three such approaches, the genetic algorithm, differential evolution, and particle swarm optimization, are evaluated to identify the best strategy for solving problems of this nature. A surrogate model in the form of a polynomial regression is studied and recommended in place of the actual numerical model of the dam to reduce computation cost. Finally, this paper presents the relevant dam parameters estimated by the analysis and provides insights into the performance of the three procedures in solving the inverse problem.
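The surrogate-assisted inverse-analysis pipeline (expensive simulator, polynomial-regression surrogate, non-deterministic optimizer minimizing a displacement misfit) can be sketched end to end. The two-parameter "FE model", its functional form, and the parameter bounds below are entirely hypothetical stand-ins; differential evolution is used here as one of the three optimizers the paper evaluates.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical stand-in for the expensive FE model of the dam:
# maps two constitutive parameters to a displacement at one gauge.
def fe_model(p):
    e_mod, phi = p
    return 0.05 * e_mod ** -0.5 + 0.002 * phi  # purely illustrative

true_params = np.array([2.0, 35.0])
observed = fe_model(true_params)  # plays the role of monitored data

# Build a quadratic polynomial-regression surrogate from a small
# design of experiments, then optimize on the cheap surrogate only.
rng = np.random.default_rng(0)
samples = rng.uniform([0.5, 20.0], [5.0, 50.0], size=(200, 2))
y = np.array([fe_model(s) for s in samples])

def features(p):
    e, f = np.atleast_2d(p).T
    return np.column_stack([np.ones_like(e), e, f, e * f, e ** 2, f ** 2])

coef, *_ = np.linalg.lstsq(features(samples), y, rcond=None)
surrogate = lambda p: features(p) @ coef

def misfit(p):
    """Squared error between surrogate prediction and monitored data."""
    return float((surrogate(p)[0] - observed) ** 2)

result = differential_evolution(misfit,
                                bounds=[(0.5, 5.0), (20.0, 50.0)],
                                seed=1)
```

With a single observation and two parameters the toy problem is under-determined, so the optimizer finds *a* parameter set reproducing the data rather than the unique true one; real inverse analyses mitigate this by fitting many instruments simultaneously, as the paper's error function does.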

