Investigating the Influence of Ethical and Epistemic Values on Decisions in the Watershed Modeling Process

Author(s):  
Autumn R. Deitrick ◽  
Sarah A. Torhan ◽  
Caitlin A. Grady
2013 ◽  
Vol 58 (3) ◽  
pp. 871-875
Author(s):  
A. Herberg

Abstract This article outlines a methodology for modeling the self-induced vibrations that occur during the machining of metal objects, i.e. when shaping casting patterns on CNC machining centers. The modeling process presented here is based on an algorithm that makes use of local-model fuzzy-neural networks. The algorithm draws on the advantages of fuzzy systems with Takagi-Sugeno-Kang (TSK) consequents and of neural networks with auxiliary modules that help optimize and shorten the time needed to identify the best possible network structure. Modeling the self-induced vibrations makes it possible to analyze how the vibrations arise. This in turn makes it possible to develop effective ways of eliminating them and, ultimately, to design a practical control system that would suppress the vibrations altogether.
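As a rough illustration of the local-model idea (not the paper's identified network), a first-order TSK system blends local linear models using fuzzy firing strengths. A minimal sketch with invented rule parameters:

```python
import math

# Two TSK rules over a scalar input x: IF x is near `center` (Gaussian
# membership) THEN y = a*x + b. The output is the firing-strength-
# weighted average of the local linear consequents.
RULES = [
    # (center, width, a, b)
    (0.0, 1.0, 2.0, 0.0),
    (5.0, 1.0, -1.0, 10.0),
]

def tsk(x):
    weights = [math.exp(-((x - c) / s) ** 2) for c, s, _, _ in RULES]
    consequents = [a * x + b for _, _, a, b in RULES]
    return sum(w * y for w, y in zip(weights, consequents)) / sum(weights)

# Near a rule's center, that rule's local model dominates the output.
print(round(tsk(0.0), 6))  # 0.0
```

In the article's setting, the auxiliary modules would tune the number of rules and their parameters; here both are fixed by hand for clarity.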


Fact Sheet ◽  
2012 ◽  
Author(s):  
Diana E. Pedraza ◽  
Darwin J. Ockerman

1971 ◽  
Vol 2 (3) ◽  
pp. 146-166 ◽  
Author(s):  
DAVID A. WOOLHISER

Physically-based, deterministic models are considered in this paper: physically-based in that the models have a theoretical structure based primarily on the laws of conservation of mass, energy, or momentum; deterministic in the sense that once the initial and boundary conditions and inputs are specified, the output is known with certainty. This type of model attempts to describe the structure of a particular hydrologic process and is therefore helpful in predicting what will happen when some change occurs in the system.
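A minimal example of such a model, assuming a single linear reservoir (a stand-in, not one of the paper's models): storage obeys the mass balance dS/dt = I - Q with the outflow law Q = kS, so a specified input series and initial condition determine the output exactly:

```python
# Linear-reservoir sketch of a physically-based, deterministic model.
# Conservation of mass: dS/dt = I(t) - Q(t), with outflow Q = k * S.

def simulate(inflow, k=0.2, s0=0.0, dt=1.0):
    s, outflow = s0, []
    for i in inflow:
        q = k * s
        s += (i - q) * dt     # explicit Euler step of the mass balance
        outflow.append(q)
    return outflow

# A pulse of inflow drains away at a rate proportional to storage.
print([round(q, 2) for q in simulate([10.0, 0.0, 0.0, 0.0])])
# [0.0, 2.0, 1.6, 1.28]
```

Rerunning with the same inputs always reproduces the same hydrograph, which is exactly the deterministic property the paper describes.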


2019 ◽  
Vol 952 (10) ◽  
pp. 2-9
Author(s):  
Yu.M. Neiman ◽  
L.S. Sugaipova ◽  
V.V. Popadyev

As is well known, spherical functions are traditionally used in geodesy for modeling the gravitational field of the Earth. The gravitational field, however, is not stationary in space or in time (though the latter is beyond the scope of this article) and can change quite strongly in various directions. By their nature, spherical functions do not fully capture the local features of the field. With this in mind, it is advisable to use spatially localized basis functions, and it is convenient to divide the region under consideration into segments with a nearly stationary field. The complexity of the field in each segment can be characterized by an anisotropic matrix resulting from a covariance analysis of the field. If the modeling is approached in this way, a problem of poor coherence of the local models at the segments' borders can arise. To solve this problem, this article proposes new basis functions that use the Mahalanobis metric instead of the usual Euclidean distance. The Mahalanobis metric, and the quadratic form generalizing it, enable us to take the structure of the field into account when determining the distance between points and to make the modeling process continuous.
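The distance swap at the heart of the proposal can be sketched as follows; the inverse-covariance matrix here is invented for illustration, whereas in the article it would come from the covariance analysis of the field on each segment:

```python
import math

def mahalanobis(x, y, inv_cov):
    """Mahalanobis distance sqrt((x-y)^T S^-1 (x-y)) between points
    x and y, given the inverse covariance matrix inv_cov."""
    d = [xi - yi for xi, yi in zip(x, y)]
    t = [sum(inv_cov[i][j] * d[j] for j in range(len(d)))
         for i in range(len(d))]                      # S^-1 (x - y)
    return math.sqrt(sum(di * ti for di, ti in zip(d, t)))

# Anisotropic field: variance along the first axis is 4x larger, so
# differences along that axis count for less than in Euclidean terms.
inv_cov = [[0.25, 0.0], [0.0, 1.0]]
p, q = (2.0, 0.0), (0.0, 0.0)
print(mahalanobis(p, q, inv_cov))  # 1.0 (Euclidean distance is 2.0)
```

With the identity matrix the function reduces to the ordinary Euclidean distance, which is what the anisotropic matrix replaces.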


Data & Policy ◽  
2021 ◽  
Vol 3 ◽  
Author(s):  
Harrison Wilde ◽  
Lucia L. Chen ◽  
Austin Nguyen ◽  
Zoe Kimpel ◽  
Joshua Sidgwick ◽  
...  

Abstract Rough sleeping is a chronic experience faced by some of the most disadvantaged people in modern society. This paper describes work carried out in partnership with Homeless Link (HL), a UK-based charity, in developing a data-driven approach to better connect people sleeping rough on the streets with outreach service providers. HL's platform has grown exponentially in recent years, leading to thousands of alerts per day during extreme weather events; this overwhelms the volunteer-based system they currently rely upon for the processing of alerts. In order to solve this problem, we propose a human-centered machine learning system to augment the volunteers' efforts by prioritizing alerts based on the likelihood of making a successful connection with a rough sleeper. This addresses capacity and resource limitations whilst allowing HL to quickly, effectively, and equitably process all of the alerts that they receive. Initial evaluation using historical data shows that our approach increases the rate at which rough sleepers are found following a referral by at least 15% based on labeled data, implying a greater overall increase when the alerts with unknown outcomes are considered, and suggesting the benefit of a trial taking place over a longer period to assess the models in practice. The discussion and modeling process are conducted with careful consideration of ethics, transparency, and explainability due to the sensitive nature of the data involved and the vulnerability of the people who are affected.
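Schematically, and purely illustratively (the features, weights, and model form below are invented, not HL's actual system), prioritization of this kind ranks alerts by a predicted probability of a successful connection:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical logistic scoring model: weights and features are
# placeholders standing in for a trained classifier.
WEIGHTS = {"has_exact_location": 1.5, "daytime": 0.7, "repeat_sighting": 1.1}
BIAS = -1.0

def score(alert):
    """Predicted probability of a successful connection for an alert."""
    z = BIAS + sum(WEIGHTS[f] * alert.get(f, 0) for f in WEIGHTS)
    return sigmoid(z)

alerts = [
    {"id": 1, "has_exact_location": 1, "daytime": 0, "repeat_sighting": 1},
    {"id": 2, "has_exact_location": 0, "daytime": 1, "repeat_sighting": 0},
    {"id": 3, "has_exact_location": 1, "daytime": 1, "repeat_sighting": 1},
]
# Volunteers work the queue from highest to lowest predicted probability.
queue = sorted(alerts, key=score, reverse=True)
print([a["id"] for a in queue])  # [3, 1, 2]
```

The human-centered aspect the paper emphasizes is that the score orders the volunteers' queue rather than discarding any alert: every alert is still processed, just sooner or later.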


2021 ◽  
Vol 11 (2) ◽  
Author(s):  
Yin Chung Au

Abstract This paper proposes an extended version of the interventionist account of causal inference in the practical context of biological mechanism research. It studies the details of how biological mechanism researchers assess the evidential legitimacy of experimental data, arguing why quantity and variety are two important criteria for this assessment. Because of the nature of biological mechanism research, the epistemic values of these two criteria result from two forms of independence: between the causation of data generation and the causation in question, and between different interventions (not techniques). The former independence ensures that the interventions in the causation in question are not affected by the causation responsible for data generation. The latter independence ensures the reliability of the final mechanisms not only in the empirical but also in the formal aspects. This paper first explores how researchers use quantity to check the effectiveness of interventions, in the course of which they also determine the validity of the difference-making revealed by the results of interventions. It then draws a distinction between experimental interventions and experimental techniques, so that the reliability of mechanisms, as supported by the variety of evidence, can be safely ensured in the probabilistic sense. The latter process is where researchers establish evidence of the mechanisms connecting the events of interest. Through case studies, this paper proposes to use 'intervention' as the fruitful connecting point between the literatures on evidence and on mechanisms.


2021 ◽  
Vol 13 (11) ◽  
pp. 6194
Author(s):  
Selma Tchoketch_Kebir ◽  
Nawal Cheggaga ◽  
Adrian Ilinca ◽  
Sabri Boulouma

This paper presents an efficient neural network-based method for fault diagnosis in photovoltaic arrays. The proposed method is built on three main steps: a data-feeding step, a fault-modeling step, and a decision step. The first step consists of feeding real meteorological and electrical data to the neural networks, namely solar irradiance, panel temperature, photovoltaic current, and photovoltaic voltage. The second step consists of modeling a healthy mode of operation and five additional faulty operational modes; the modeling process is carried out using two artificial neural networks. From this step, six classes are obtained, where each class corresponds to a predefined model, namely the faultless scenario and five faulty scenarios. The third step involves the diagnostic decision about the system's state: based on the results of the previous step, two probabilistic neural networks classify each generated data point into one of the six classes. The results obtained show that the developed method can effectively detect different types of faults and classify them. Moreover, the method maintains high performance even in the presence of noise, and it still provides a diagnosis when data are injected at a reduced real-time rate, which demonstrates its robustness.
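The decision step can be caricatured as choosing, among the six class models, the one whose prediction best matches the measured operating point. Everything below is an invented placeholder (crude linear models with made-up coefficients), not the paper's trained probabilistic networks:

```python
# Six operating modes: one healthy, five faulty. Each is represented by
# a stand-in model predicting (current, voltage) from irradiance G and
# panel temperature T; coefficients are illustrative only.
MODE_PARAMS = {
    "healthy":       (8.0, 36.0),
    "short_circuit": (8.0, 18.0),  # substring bypassed: low voltage
    "open_circuit":  (0.0, 40.0),  # no current flows
    "shading":       (4.0, 34.0),  # reduced current
    "line_line":     (6.5, 25.0),
    "degradation":   (7.0, 30.0),
}

def predict(mode, G, T):
    gain_i, gain_v = MODE_PARAMS[mode]
    # Current scales with irradiance; voltage drops with temperature.
    return gain_i * G / 1000.0, gain_v * (1 - 0.004 * (T - 25.0))

def diagnose(G, T, i_meas, v_meas):
    """Pick the mode whose predicted operating point is closest to the
    measurement (voltage residual rescaled to comparable units)."""
    def residual(mode):
        i_hat, v_hat = predict(mode, G, T)
        return (i_hat - i_meas) ** 2 + ((v_hat - v_meas) / 10.0) ** 2
    return min(MODE_PARAMS, key=residual)

print(diagnose(1000.0, 25.0, 8.1, 35.5))  # healthy
```

In the paper this nearest-model decision is made probabilistically by the two probabilistic neural networks, which is what gives the method its tolerance to noisy inputs.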

