threshold criterion
Recently Published Documents


TOTAL DOCUMENTS: 106 (FIVE YEARS: 28)

H-INDEX: 18 (FIVE YEARS: 2)

2021 ◽  
Vol 14 (1) ◽  
pp. 39
Author(s):  
Qian Zhang ◽  
Weibo Huo ◽  
Jifang Pei ◽  
Yongchao Zhang ◽  
Jianyu Yang ◽  
...  

The robust target detection ability of marine navigation radars is essential for safe shipping. However, time-varying river and sea surfaces induce changes in target scattering, known as fluctuating characteristics. Moreover, strongly fluctuating targets can disappear in some frames of the radar images, which is known as flickering characteristics. This phenomenon causes a severe decline in the performance of traditional detection methods. To address this issue, this paper proposes a dynamic programming multi-target joint detection method based on a biological memory model. First, a global detection operator discretizes the multi-target state into multiple single-target states. An updated memory-weight merit function then strengthens the inter-frame correlation of targets with flickering characteristics, and a progressive loop integral updates the target states to refine the candidate target set. Finally, a two-stage threshold criterion accurately detects targets at different amplitude levels. Simulation and experimental results validate that the proposed method greatly improves detection performance for multiple flickering targets at a low SCR of 3-8 dB.
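
To make the two-stage threshold idea concrete, here is a minimal Python sketch. It is not the paper's algorithm: the merit map, threshold values, and neighbourhood rule below are illustrative assumptions standing in for the accumulated merit function and amplitude levels described above.

```python
import numpy as np

def two_stage_detect(merit_map, high_thr, low_thr):
    """Illustrative two-stage thresholding: strong candidates pass the high
    threshold directly; weaker (flickering) candidates are kept only if they
    exceed the low threshold and neighbour a strong detection."""
    strong = merit_map >= high_thr
    weak = (merit_map >= low_thr) & ~strong

    # Mark cells adjacent (3x3 neighbourhood) to at least one strong cell.
    padded = np.pad(strong, 1)
    near_strong = np.zeros_like(strong)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            near_strong |= padded[1 + dr:padded.shape[0] - 1 + dr,
                                  1 + dc:padded.shape[1] - 1 + dc]
    return strong | (weak & near_strong)

rng = np.random.default_rng(0)
merit = rng.rayleigh(1.0, size=(64, 64))   # clutter-only background
merit[30, 40] += 6.0                       # strong, stable target return
merit[30, 41] += 2.5                       # weaker, flickering return
print(np.argwhere(two_stage_detect(merit, high_thr=5.0, low_thr=2.5)))
```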


2021 ◽  
Vol 54 (2D) ◽  
pp. 125-137
Author(s):  
Mustafa Adil Issa

Mechanical rock properties are essential for minimizing many well problems during drilling and production operations. While these properties are crucial in designing optimum mud weights during drilling operations, they are also necessary to reduce the sanding risk during production operations. This study was conducted on the Zubair sandstone reservoir, located in southern Iraq. Its primary purpose is to develop a set of empirical correlations for estimating the mechanical rock properties of sandstone reservoirs. The correlations are established using laboratory (static) measurements and well-logging (dynamic) data. The results support the evidence that porosity and sonic travel time are consistent indicators of mechanical rock properties. Four correlations were developed, for static Young's modulus, uniaxial compressive strength, internal friction angle, and static Poisson's ratio, each with high predictive capacity (coefficients of determination of 0.79, 0.91, 0.73, and 0.78, respectively). Compared with previous correlations, the local correlations developed here closely match the actual rock mechanical properties. Continuous profiles of borehole rock mechanical properties of the upper sand unit are then constructed to predict the sand production risk. The ratio of shear modulus to bulk compressibility (G/Cb), together with rock strength, is used as the threshold criterion to determine sanding risk. The results showed that sanding risk, or rock failure, occurs when the rock strength is less than 7250 psi (50 MPa) and the G/Cb ratio is less than 0.8×10¹² psi². This study presents a set of empirical correlations that offer a cost-effective option for applications related to reservoir geomechanics.
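
The reported sanding-risk cutoffs can be applied directly as a screening check. The sketch below encodes the two thresholds quoted in the abstract; the depth-profile values are synthetic placeholders, not the study's well-log data.

```python
import numpy as np

# Threshold values reported in the abstract; the profile below is synthetic
# illustration data, not the study's measurements.
STRENGTH_LIMIT_PSI = 7250.0   # ~50 MPa uniaxial compressive strength
G_OVER_CB_LIMIT = 0.8e12      # psi^2, shear modulus / bulk compressibility

def sanding_risk(strength_psi, shear_modulus_psi, bulk_compressibility_inv_psi):
    """Flag intervals where rock failure (sand production) is predicted."""
    g_over_cb = shear_modulus_psi / bulk_compressibility_inv_psi
    return (strength_psi < STRENGTH_LIMIT_PSI) & (g_over_cb < G_OVER_CB_LIMIT)

strength = np.array([6500.0, 8200.0, 7000.0])        # psi
shear_modulus = np.array([1.1e6, 1.5e6, 0.9e6])      # psi
bulk_compress = np.array([2.0e-6, 1.5e-6, 1.4e-6])   # 1/psi
print(sanding_risk(strength, shear_modulus, bulk_compress))
```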


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Yaser Gamil ◽  
Ismail Abd Rahman

Purpose: The purpose of this paper is to develop a structural relationship model of the causes and effects of poor communication and information exchange in construction projects using Smart-PLS.
Design/methodology/approach: The first step of this research was to identify the cause and effect factors of poor communication in construction projects from the extant literature. The data used to develop the model were collected through a questionnaire survey targeting construction practitioners in the Malaysian construction industry. A five-point Likert-type scale was used to rate the significance of the factors, which were classified under their relevant constructs/groups using exploratory factor analysis. A hypothesized model, in which each group of cause factors has a direct impact on the effect groups, was developed and then implemented in Smart-PLS. The hypotheses were tested using t-values and p-values, and the model's inner and outer components were assessed and met the threshold criterion. The model was further verified by engaging 14 construction experts to confirm its applicability in construction project settings.
Findings: The study developed a structural equation model that clarifies the relationships between the causes and effects of poor communication in construction projects and explains the degree of those relationships.
Originality/value: The published academic and non-academic literature includes many studies on communication, covering its definitions, importance, barriers to effective communication, and manifestations of poor communication. However, these studies address only the general issue of communication and lack an in-depth investigation of the causes and effects of poor communication in the construction industry. This study applies advanced structural modeling to those causes and effects; the questionnaire, the data, and the concluding results fill the identified research gap. The issue is also of broad interest because communication is considered one of the main knowledge areas in construction management.
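
As a small illustration of the factor-grouping step described above, the sketch below runs an exploratory factor analysis in Python on hypothetical five-point Likert responses. The item count, factor count, and data are invented for the example; the path modelling in the study itself was carried out in Smart-PLS.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical five-point Likert responses: 120 practitioners x 10 cause items.
rng = np.random.default_rng(42)
responses = rng.integers(1, 6, size=(120, 10)).astype(float)

# Group the cause items into latent constructs (3 factors here as an example;
# the paper derived its groups from the actual survey data).
fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(responses)

loadings = fa.components_.T                 # items x factors
assignment = np.abs(loadings).argmax(axis=1)
for item, factor in enumerate(assignment):
    print(f"cause item {item + 1} -> construct {factor + 1}")
```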


10.6036/10236 ◽  
2021 ◽  
Vol 96 (5) ◽  
pp. 484-491
Author(s):  
Dongmei Zhang ◽  
Jing Xu ◽  
Kunhua Chen

To promote the low-carbon transition of rural logistics, the embeddedness among low-carbon rural logistics potential (LRLP), low-carbon rural logistics strength (LRLS), and environmental regulations (ERs) was clarified using the analytical paradigm of impetus-basis-target. Based on this embeddedness, an evolutionary framework of the low-carbon rural logistics system was constructed to analyze the system's evolutionary process. A three-dimensional evolutionary equation of the system was then proposed on the basis of this framework, and the evolutionary threshold criterion of the system was obtained through stability analysis of the equation. The interaction mechanism within the evolutionary process of the low-carbon rural logistics system in China was verified through data simulation in MATLAB. The results show that in an environment with weaker ERs, simply relying on LRLS to promote the sustainable development of low-carbon rural logistics has little overall effect. In the long term, under the influence of different external control variables, LRLP plays a pivotal role in the evolution, and improving LRLP is conducive to advancing the sustainable development of low-carbon rural logistics. Keywords: environmental regulations; low-carbon rural logistics potential; low-carbon rural logistics strength; evolutionary framework; three-dimensional evolutionary equation
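
Since the abstract does not reproduce the three-dimensional evolutionary equation itself, the sketch below only illustrates the general workflow of integrating a coupled three-variable system and inspecting its long-run state. The equations, parameters, and initial values are hypothetical stand-ins, not the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative three-variable system (NOT the paper's equations): x = LRLP,
# y = LRLS, z = ER level, coupled through hypothetical logistic-style terms.
def system(t, state, a=0.6, b=0.4, c=0.3):
    x, y, z = state
    dx = a * x * (1 - x) + c * z * y * (1 - x)   # potential grows with ER and strength
    dy = b * y * (1 - y) - 0.2 * y * (1 - z)     # strength alone stalls under weak ER
    dz = 0.1 * z * (1 - z)                       # regulation tightens slowly over time
    return [dx, dy, dz]

sol = solve_ivp(system, t_span=(0, 100), y0=[0.1, 0.2, 0.15])
print("final state (LRLP, LRLS, ER):", np.round(sol.y[:, -1], 3))
```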


2021 ◽  
Author(s):  
Phillip M. Alday ◽  
Jeroen van Paridon

Traditionally, artifacts are handled in one of two ways in ERP studies: (1) rejection of affected segments or (2) correction via, e.g., ICA. Threshold-based rejection is problematic because of the arbitrariness of the chosen limits and of the particular threshold criterion (e.g., peak-to-peak, absolute, slope), resulting in large researcher degrees of freedom. Manual rejection may suffer from low inter-rater reliability and is often done without appropriate blinding. Additionally, rejections are typically applied to an entire trial, even if the ERP measure of interest is not impacted by the artifact in question (e.g., a motion artifact at the end of the trial). Moreover, fixed thresholds cannot distinguish between non-artifactual extreme values (i.e., those arising from brain activity, which carry some 'signal' along with 'noise') and truly artifactual values (e.g., those arising from muscle activity or the electrical environment, which are essentially pure 'noise'). These problems become particularly acute when analyzing EEG recorded under more naturalistic conditions, such as free dialogue in hyperscanning or virtual reality. By using modern, robust statistical methods, we can avoid setting arbitrary thresholds and allow the statistical model to extract the signal from the noise. To demonstrate this, we re-analyzed data from a multimodal virtual-reality N400 paradigm. We created two versions of the dataset, one using traditional threshold-based peak-to-peak artifact rejection (150 µV) and one without artifact rejection, and examined the mean voltage 250-350 ms after stimulus onset. We then analyzed the data with both robust and traditional techniques from both frequentist and Bayesian perspectives. The non-robust models yielded different effect estimates when fit to dirty data than when fit to cleaned data, as well as different estimates of the residual variation. The robust models, meanwhile, estimated similar effect sizes for the dirty and cleaned data, with only slightly different estimates of the residual variation. In other words, the robust model worked equally well with or without artifact rejection and did not require setting any arbitrary thresholds, whereas the standard, non-robust model was sensitive to the degree of data cleaning. This suggests that robust methods should become the standard in ERP analysis, regardless of the data-cleaning procedure.
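
The contrast between threshold-based rejection and robust estimation can be sketched in a few lines of Python. The data, effect size, and thresholds below are synthetic, and the threshold is applied to absolute amplitude rather than peak-to-peak, so this is only an illustration of the comparison, not the authors' analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Synthetic single-channel "mean voltage" per trial: a small condition effect
# plus Gaussian noise, with a handful of artifact-contaminated trials.
n = 200
condition = rng.integers(0, 2, n)                  # 0 = control, 1 = target
voltage = 2.0 * condition + rng.normal(0, 5, n)    # true effect = 2 µV
artifacts = rng.choice(n, size=10, replace=False)
voltage[artifacts] += rng.normal(0, 80, 10)        # artifactual extreme values

X = sm.add_constant(condition.astype(float))

# Traditional approach: reject trials exceeding an arbitrary amplitude threshold.
keep = np.abs(voltage) < 150.0
ols_clean = sm.OLS(voltage[keep], X[keep]).fit()

# Robust approach: fit all trials and let the M-estimator downweight outliers.
rlm_dirty = sm.RLM(voltage, X, M=sm.robust.norms.HuberT()).fit()

print("OLS on thresholded data, effect estimate:", round(ols_clean.params[1], 2))
print("Robust fit on uncleaned data, effect estimate:", round(rlm_dirty.params[1], 2))
```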


2021 ◽  
Author(s):  
Jingtao Zhao ◽  
Quanyou Chen ◽  
Zhidong Chen ◽  
Chaoyang Chen ◽  
Zhong Liu ◽  
...  

Positive-intrinsic-negative (PIN) limiters are widely used to protect sensitive components from their own leakage power and from adjacent high-power injection. As the core of a PIN limiter, the PIN diode can be burnt out by external microwave pulses. Here, numerical simulations with our self-designed device-circuit joint simulator were carried out to study the influence of the I-layer thickness and the anode diameter of the PIN diode on the maximum-temperature curve of the PIN limiter diode. The damage threshold criterion used in the numerical simulation was first established by comparing experimental results with simulation results. We then determined the impact of the device structure on the thermal burnout effect induced by microwave pulses in PIN limiter diodes.
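
The thermal-burnout question can be illustrated with a very reduced lumped model: a single thermal capacitance heated by a rectangular pulse and compared against a damage temperature. This is a toy sketch, not the authors' device-circuit joint simulator, and every parameter value below is an assumed placeholder.

```python
import numpy as np

# Minimal lumped thermal sketch: the diode junction is a single thermal
# capacitance C_TH with thermal resistance R_TH to ambient, heated by a
# rectangular microwave pulse. All values are illustrative placeholders.
R_TH = 150.0       # K/W, junction-to-ambient thermal resistance
C_TH = 2.0e-6      # J/K, lumped thermal capacitance of the I layer
T_AMBIENT = 300.0  # K
T_DAMAGE = 600.0   # K, assumed burnout (damage threshold) temperature

def junction_temperature(power_w, pulse_width_s, dt=1e-8):
    """Integrate dT/dt = (P - (T - T_amb)/R_th) / C_th over one pulse."""
    t = np.arange(0.0, pulse_width_s, dt)
    temp = np.empty_like(t)
    temp[0] = T_AMBIENT
    for i in range(1, len(t)):
        heat_loss = (temp[i - 1] - T_AMBIENT) / R_TH
        temp[i] = temp[i - 1] + dt * (power_w - heat_loss) / C_TH
    return temp

temp = junction_temperature(power_w=5.0, pulse_width_s=200e-6)
print("max temperature:", round(temp.max(), 1), "K,",
      "burnout:", temp.max() >= T_DAMAGE)
```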


2021 ◽  
Vol 11 (6) ◽  
pp. 692
Author(s):  
Stefania Solazzo ◽  
Nada Kojovic ◽  
François Robain ◽  
Marie Schaer

The presence of a restricted interest in written materials, including an early ability to name and recognize letters and numbers, is regularly reported in preschoolers with autism spectrum disorder (ASD). There is, however, scarce information on this early ability, akin to emerging hyperlexic traits, in preschoolers with ASD younger than 3 years. Here, we defined a measure of early naming and recognition of letters and numbers in 155 preschoolers with ASD using a sliding-window approach combined with a 90th-percentile threshold criterion, and subsequently compared the profiles of children with ASD with and without early hyperlexic traits. Using this measure, we found that 9% of children with ASD showed early hyperlexic traits. The early ability to name and recognize letters and numbers was associated with a higher level of restricted and repetitive behaviors yet more socially oriented behaviors at baseline, and with better expressive and written communication at baseline and one year later. This study contributes to a better definition of the profile of children with ASD who show an early ability with letters and numbers akin to emerging hyperlexic traits, a skill associated with promising social strengths and language abilities in this subgroup of children.
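
A sliding-window, 90th-percentile criterion of this kind can be sketched as follows; the window width, the peer-comparison rule, and the synthetic scores are assumptions for illustration, not the study's exact measure or data.

```python
import numpy as np

def flag_early_traits(ages_months, scores, window_months=6.0, pct=90):
    """Hypothetical re-creation of the criterion: compare each child's
    letter/number score against the 90th percentile of scores from peers
    within an age window centred on that child."""
    ages_months = np.asarray(ages_months, float)
    scores = np.asarray(scores, float)
    flags = np.zeros(len(scores), dtype=bool)
    for i, age in enumerate(ages_months):
        in_window = np.abs(ages_months - age) <= window_months / 2
        threshold = np.percentile(scores[in_window], pct)
        flags[i] = scores[i] > threshold
    return flags

rng = np.random.default_rng(3)
ages = rng.uniform(18, 36, 155)               # months
scores = rng.poisson(2, 155).astype(float)    # letters/numbers named
scores[:10] += 15                             # a few precocious readers
print(flag_early_traits(ages, scores).sum(), "children flagged")
```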


Sensors ◽  
2021 ◽  
Vol 21 (8) ◽  
pp. 2675
Author(s):  
Beanbonyka Rim ◽  
Sungjin Lee ◽  
Ahyoung Lee ◽  
Hyo-Wook Gil ◽  
Min Hong

Whole cardiac segmentation in chest CT images is important for identifying the functional abnormalities that occur in cardiovascular diseases, such as coronary artery disease (CAD). However, manual segmentation is time-consuming and labor-intensive, and labeling the ground truth requires extensive manual annotation of images by a radiologist. Given the difficulty of obtaining annotated data and the expertise required of annotators, an unsupervised approach is proposed. In this paper, we introduce a semantic whole-heart segmentation that combines K-Means clustering, as the threshold criterion of the mean-thresholding method, with a mathematical morphology method as a threshold-shifting enhancer. The experiment was conducted on 500 subjects in two cases: (1) 56 slices per volume containing full heart scans, and (2) 30 slices per volume covering roughly the upper half of the heart, before the liver appears. In both cases, the K-Means method achieved an average silhouette score of 0.4130. The experiment on 56 slices per volume achieved an overall accuracy (OA) and mean intersection over union (mIoU) of 34.90% and 41.26%, respectively, while the first 30 slices per volume achieved an OA and mIoU of 55.10% and 71.46%, respectively.
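
A minimal Python sketch of this kind of pipeline is shown below: K-Means on voxel intensities supplies the threshold, and binary morphology cleans the resulting mask. Taking the midpoint between the brightest cluster centres, the particular morphology operations, and the synthetic image are assumptions for illustration, not the paper's exact method.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy import ndimage

def kmeans_threshold_segment(slice_hu, n_clusters=2, opening_iters=2):
    """Cluster voxel intensities, derive a threshold from the cluster centres,
    then clean the thresholded mask with binary morphology."""
    intensities = slice_hu.reshape(-1, 1).astype(float)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(intensities)
    centers = np.sort(km.cluster_centers_.ravel())
    threshold = centers[-2:].mean()            # between the two brightest clusters
    mask = slice_hu >= threshold
    mask = ndimage.binary_opening(mask, iterations=opening_iters)
    mask = ndimage.binary_fill_holes(mask)
    return mask, threshold

rng = np.random.default_rng(5)
fake_ct = rng.normal(-200, 60, size=(128, 128))        # background tissue
fake_ct[40:90, 40:90] = rng.normal(60, 20, (50, 50))   # bright "cardiac" region
mask, thr = kmeans_threshold_segment(fake_ct)
print("threshold:", round(thr, 1), "HU, segmented voxels:", int(mask.sum()))
```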


2021 ◽  
Author(s):  
Friederike Fröb ◽  
Tatiana Ilyina

Long-term changes in ocean biogeochemistry projected under an evolving climate in the 21st century are superimposed with short-term extreme events. Of particular interest are compound events, in which such extremes occur successively or simultaneously, combining or amplifying the impact of multiple stressors on ocean ecosystems. The resilience of marine species to simultaneous exposure to extremely high temperature, low pH, and low oxygen concentration presumably depends on the magnitude and variability of the perturbation, which is likely to increase and intensify in response to rising global mean temperatures. However, changes in marine heat waves and in ocean acidification and deoxygenation extremes remain to be detected in order to quantify their combined impact. Here, we use the Grand Ensemble of the fully coupled Max Planck Institute Earth System Model (MPI-GE), which consists of 100 members forced by historical CO2 emissions and by emissions following the Representative Concentration Pathway 4.5 (RCP4.5). The daily frequency of the simulation output for sea surface temperature, hydrogen ion concentration, and oxygen concentration allows spatio-temporal changes in marine extreme events to be analyzed between 1850 and 2100. We assess the number, duration, and intensity of extreme states using a moving threshold criterion, and we aim to identify concurrent and consecutive driving mechanisms for such events in the surface ocean in order to evaluate potential risks for the marine ecosystem.
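
A moving-threshold detection of this kind can be sketched in a few lines of Python: a rolling 90th percentile of daily sea surface temperature stands in here for the paper's moving threshold criterion, and the time series is synthetic, not MPI-GE output.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
days = pd.date_range("1850-01-01", "1899-12-31", freq="D")
trend = np.linspace(0.0, 0.5, len(days))                      # slow warming
seasonal = 2.0 * np.sin(2 * np.pi * days.dayofyear / 365.25)
sst = pd.Series(15.0 + trend + seasonal + rng.normal(0, 0.8, len(days)), index=days)

# Moving threshold: rolling 90th percentile over an ~11-year window.
threshold = sst.rolling(window=365 * 11, center=True, min_periods=365).quantile(0.9)
exceed = sst > threshold

# Group consecutive exceedance days into events; report duration and intensity.
event_id = (exceed != exceed.shift()).cumsum()[exceed]
events = (sst - threshold)[exceed].groupby(event_id).agg(["count", "max"])
events.columns = ["duration_days", "max_intensity"]
events = events[events["duration_days"] >= 5]     # keep events lasting >= 5 days
print(len(events), "events; mean duration:",
      round(events["duration_days"].mean(), 1), "days; mean peak intensity:",
      round(events["max_intensity"].mean(), 2), "deg C above threshold")
```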

