Optimal Sensor Placement for Active Sensing

Author(s):  
Eric Flynn ◽  
Michael Todd

We present a novel approach to optimal actuator and sensor placement for active sensing-based structural health monitoring (SHM). Of particular interest is the optimization of actuator-sensor arrays that use Lamb wave propagation to detect damage in thin plate-like structures. Using a detection theory framework, we define the optimal configuration as the one minimizing the expected percentage of the structure exhibiting type I or type II error during the damage detection process. The detector incorporates a statistical model of the active sensing process that supports both pulse-echo and pitch-catch actuation schemes and accounts for line of sight and non-uniform damage probabilities. The optimization space was searched using a genetic algorithm with a time-varying mutation rate. We provide four example actuator/sensor placement scenarios together with the optimal solutions generated by the algorithm.
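The search strategy described above can be sketched in miniature. The following is not the authors' implementation: the site count, population size, and coverage objective are all illustrative assumptions, with candidate sensor sites laid out on a simple 1-D line and a placeholder objective standing in for the expected type I/II error coverage.

```python
import random

random.seed(0)

N_SITES = 20        # candidate mounting locations (assumed, 1-D layout)
N_SENSORS = 4       # transducers to place
POP, GENS = 30, 60  # population size and generation count (arbitrary)

def coverage_error(placement):
    """Placeholder objective: stand-in for the expected fraction of the
    structure showing type I/II error. Here: mean distance from each
    site to its nearest sensor."""
    return sum(min(abs(s - p) for p in placement) for s in range(N_SITES)) / N_SITES

def mutate(placement, rate):
    """Re-draw each gene (sensor site) with probability `rate`."""
    out = list(placement)
    for i in range(len(out)):
        if random.random() < rate:
            out[i] = random.randrange(N_SITES)
    return out

pop = [random.sample(range(N_SITES), N_SENSORS) for _ in range(POP)]
for gen in range(GENS):
    # time-varying mutation rate: anneal from broad exploration to fine-tuning
    rate = 0.5 * (1 - gen / GENS) + 0.02
    pop.sort(key=coverage_error)
    elite = pop[:POP // 2]                     # keep the better half
    pop = elite + [mutate(random.choice(elite), rate) for _ in range(POP - len(elite))]

best = min(pop, key=coverage_error)
print(sorted(best), round(coverage_error(best), 3))
```

The decaying rate mirrors the time-varying mutation schedule mentioned in the abstract: early generations explore widely, later ones refine the incumbent layouts.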

Cells ◽  
2021 ◽  
Vol 10 (7) ◽  
pp. 1615
Author(s):  
Zhongwei Zhang ◽  
Yosuke Kurashima

It is well known that mast cells (MCs) initiate type I allergic reactions and inflammation in rapid response to various stimulants, including, but not limited to, allergens, pathogen-associated molecular patterns (PAMPs), and damage-associated molecular patterns (DAMPs). MCs highly express receptors for these ligands and, upon activation, release proteases (e.g., tryptase, chymase), cytokines (e.g., TNF), and other granule components (e.g., histamine and serotonin), aggravating the allergic reaction and inflammation. On the other hand, accumulated evidence has revealed that MCs also possess immune-regulatory functions, suppressing chronic inflammation and allergic reactions on some occasions. IL-2 and IL-10 released from MCs inhibit excessive immune responses. Recently, it has been revealed that allergen immunotherapy shifts MCs from their allergic function to a regulatory function that suppresses allergic reactions. This evidence suggests that manipulating MC functions may offer a novel approach to the treatment of various MC-mediated diseases.


1996 ◽  
Vol 26 (2) ◽  
pp. 149-160 ◽  
Author(s):  
J. K. Belknap ◽  
S. R. Mitchell ◽  
L. A. O'Toole ◽  
M. L. Helms ◽  
J. C. Crabbe

2021 ◽  
Vol 9 (4) ◽  
pp. 65
Author(s):  
Daniela Rybárová ◽  
Helena Majdúchová ◽  
Peter Štetka ◽  
Darina Luščíková

The aim of this paper is to assess the reliability of alternative default prediction models under local conditions and to compare them with generally known, globally disseminated default prediction models such as Altman's Z-score, the Quick Test, the Creditworthiness Index, and Taffler's Model. The comparison was carried out on a sample of 90 companies operating in the Slovak Republic over a period of three years (2016, 2017, and 2018), with a narrower focus on three sectors: construction, retail, and tourism. The alternative default prediction models considered were the CH-index, G-index, Binkert's Model, HGN2 Model, M-model, Gulka's Model, Hurtošová's Model, and the Model of Delina and Packová. The reliability of these models was verified through statistical hypothesis testing, evaluating type I and type II error rates. According to the research results, the highest reliability and accuracy was achieved by a local alternative, the Model of Delina and Packová. The least reliable results among the models tested were produced by the most globally disseminated model, Altman's Z-score. Significant differences between sectors were also identified.
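The type I / type II evaluation used to rank such models can be sketched as follows. The labels below are made up for illustration, not the study's data; following the default-prediction convention, a type I error here means classifying a company that defaulted as healthy, and a type II error the reverse.

```python
def error_rates(truth, pred):
    """truth/pred: 1 = default, 0 = healthy.
    Returns (type I rate, type II rate):
    type I  = defaulting firms classified as healthy,
    type II = healthy firms classified as defaulting."""
    defaults = [(t, p) for t, p in zip(truth, pred) if t == 1]
    healthy  = [(t, p) for t, p in zip(truth, pred) if t == 0]
    type_i  = sum(1 for t, p in defaults if p == 0) / len(defaults)
    type_ii = sum(1 for t, p in healthy  if p == 1) / len(healthy)
    return type_i, type_ii

truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # illustrative ground truth
pred  = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0]   # one model's predictions
print(error_rates(truth, pred))  # prints (0.25, 0.16666666666666666)
```

Ranking models on both rates jointly matters because, as the bankruptcy abstract below also notes, the two errors typically carry very different costs.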


2005 ◽  
Vol 7 (1) ◽  
pp. 41 ◽  
Author(s):  
Mohamad Iwan

This research examines financial ratios that distinguish between bankrupt and non-bankrupt companies and uses those distinguishing ratios to build a one-year-prior-to-bankruptcy prediction model. It also estimates how many times more costly a type I error is than a type II error. The costs of type I and type II errors (costs of misclassification), together with the prior probabilities of bankruptcy and non-bankruptcy, are used to compute the ZETAc optimal cut-off score. The bankruptcy predictions obtained with the ZETAc optimal cut-off score are compared with those obtained using a cut-off score that considers neither the cost of classification errors nor prior probabilities, as stated by Hair et al. (1998); the latter is hereafter referred to as the Hair et al. optimum cutting score. Comparing the predictions of the two cut-off scores is intended to determine which is better, so that the resulting predictions are more conservative and minimize the expected costs arising from classification errors. This is the first research in Indonesia to incorporate type I and type II errors and the prior probabilities of bankruptcy and non-bankruptcy in computing the cut-off score used for bankruptcy prediction.
Earlier research assigned equal weights to type I and type II errors and to the prior probabilities of bankruptcy and non-bankruptcy, whereas this research gives greater weight to the type I error than to the type II error, and to the prior probability of non-bankruptcy than to that of bankruptcy. This research attained the following results: (1) a type I error is in fact 59.83 times more costly than a type II error; (2) 22 ratios distinguish between the bankrupt and non-bankrupt groups; (3) 2 financial ratios proved effective in predicting bankruptcy; (4) prediction using the ZETAc optimal cut-off score identifies more companies as filing for bankruptcy within one year than prediction using the Hair et al. optimum cutting score; (5) although prediction using the Hair et al. optimum cutting score is more accurate, prediction using the ZETAc optimal cut-off score proved able to minimize the costs incurred from classification errors.
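The cut-off computation referenced above follows Altman's ZETA framework, where the optimal cut-off is ln(q1·cI / (q2·cII)), with q1, q2 the prior probabilities of bankruptcy and non-bankruptcy and cI, cII the misclassification costs. In this sketch the priors are assumed values; only the 59.83 cost ratio is taken from the abstract.

```python
import math

# ZETA-style optimal cut-off: classify a firm as bankrupt when its
# discriminant score falls on the bankrupt side of this threshold.
q1, q2 = 0.05, 0.95      # assumed priors: bankruptcy / non-bankruptcy
c_I, c_II = 59.83, 1.0   # type I error 59.83x costlier than type II (from the study)

zeta_c = math.log((q1 * c_I) / (q2 * c_II))
print(round(zeta_c, 3))  # prints 1.147
```

With equal costs and equal priors the cut-off collapses to ln(1) = 0, which is why ignoring costs and priors, as the Hair et al. optimum cutting score does, yields a less conservative classifier.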


Author(s):  
Aniek Sies ◽  
Iven Van Mechelen

When multiple treatment alternatives are available for a certain psychological or medical problem, an important challenge is to find an optimal treatment regime, which specifies for each patient the most effective treatment alternative given his or her pattern of pretreatment characteristics. The focus of this paper is on tree-based treatment regimes, which link an optimal treatment alternative to each leaf of a tree; as such they provide an insightful representation of the decision structure underlying the regime. This paper compares the absolute and relative performance of four methods for estimating regimes of that sort (viz., Interaction Trees, Model-based Recursive Partitioning, an approach developed by Zhang et al., and Qualitative Interaction Trees) in an extensive simulation study. The evaluation criteria were, on the one hand, the expected outcome if the entire population would be subjected to the treatment regime resulting from each method under study and the proportion of clients assigned to the truly best treatment alternative, and, on the other hand, the Type I and Type II error probabilities of each method. The method of Zhang et al. was superior regarding the first two outcome measures and the Type II error probabilities, but performed worst in some conditions of the simulation study regarding Type I error probabilities.


Methodology ◽  
2010 ◽  
Vol 6 (4) ◽  
pp. 147-151 ◽  
Author(s):  
Emanuel Schmider ◽  
Matthias Ziegler ◽  
Erik Danay ◽  
Luzi Beyer ◽  
Markus Bühner

Empirical evidence for the robustness of the analysis of variance (ANOVA) against violation of the normality assumption is presented by means of Monte Carlo methods. High-quality samples from normally, rectangularly (uniformly), and exponentially distributed basic populations were created by drawing samples consisting of random numbers from the respective generators, checking their goodness of fit, and allowing only the best 10% to take part in the investigation. A one-way fixed-effects design with three groups of 25 values each was chosen. Effect sizes were implemented in the samples and varied over a broad range. Comparing the outcomes of the ANOVA calculations across the different types of distributions gives reason to regard the ANOVA as robust: both the empirical type I error α and the empirical type II error β remain constant under violation. Moreover, regression analysis identifies the factor “type of distribution” as not significant in explaining the ANOVA results.
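The core of such a Monte Carlo check can be sketched as follows. This is a simplified sketch, not the authors' code: the 5% critical value is estimated empirically from normal sampling rather than from F tables, the replication count is an arbitrary choice, and the goodness-of-fit screening step is omitted.

```python
import random

random.seed(1)
K, N = 3, 25  # three groups of 25 values each, as in the study

def f_stat(groups):
    """One-way fixed-effects ANOVA F statistic."""
    n = sum(len(g) for g in groups)
    means = [sum(g) / len(g) for g in groups]
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (K - 1)) / (ss_within / (n - K))

def simulate(draw, reps=2000):
    """Sorted null-hypothesis F statistics (all groups share one population)."""
    return sorted(f_stat([[draw() for _ in range(N)] for _ in range(K)])
                  for _ in range(reps))

# 5% critical value estimated empirically under normal sampling
f_normal = simulate(lambda: random.gauss(0, 1))
crit = f_normal[int(0.95 * len(f_normal))]

# empirical type I error under a strongly skewed (exponential) population
f_expo = simulate(lambda: random.expovariate(1))
alpha_hat = sum(f > crit for f in f_expo) / len(f_expo)
print(round(crit, 2), round(alpha_hat, 3))
```

If ANOVA is robust in this sense, the rejection rate under the exponential population should stay close to the nominal 5%, which is what the abstract reports for the empirical type I error α.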


Author(s):  
Sarfaraz Nawaz ◽  
Ajay Bansal ◽  
M. P. Sharma

A novel approach is proposed in this paper for the optimal placement of DG units in a reconfigured distribution system, with the aim of reducing real power losses while satisfying operating constraints. The proposed analytical method for optimal DG placement is developed from a new mathematical formulation. Type-I and type-II DG units are used. The results of the proposed technique are validated on the IEEE 69-bus distribution system. DG penetration levels in the range of 0–50% of total system load are considered. A novel index is also proposed that incorporates the level of DG penetration and the percentage reduction in real power losses. The results are promising when compared with recently proposed algorithms.
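The abstract does not give the form of the combined index, so the following is purely an assumed illustration of one plausible shape: loss reduction achieved per unit of DG penetration, so a placement that cuts more loss with less installed DG scores higher. The base-case loss (~225 kW) and total load (~3802 kW) are the widely cited IEEE 69-bus benchmark figures; the post-DG loss and DG size are illustrative.

```python
def placement_index(base_loss_kw, dg_loss_kw, dg_kw, load_kw):
    """Illustrative (assumed) index: fractional loss reduction divided
    by DG penetration level. Not the paper's actual index."""
    loss_reduction = (base_loss_kw - dg_loss_kw) / base_loss_kw  # fraction of losses removed
    penetration = dg_kw / load_kw                                # 0-0.5 range in the study
    return loss_reduction / penetration

# IEEE 69-bus benchmark: ~225 kW base-case loss, ~3802 kW total load;
# DG size and resulting loss below are illustrative values.
print(round(placement_index(225.0, 83.2, 1872.7, 3802.0), 3))
```

An index of this kind lets placements at different penetration levels be compared on a single scale, which is the stated purpose of the paper's proposed index.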

