Continuous Entropy Estimation with Different Unsupervised Discretization Methods

2013 ◽  
Vol 380-384 ◽  
pp. 1617-1620
Author(s):  
Jian Fang ◽  
Li Na Sui ◽  
Hong Yi Jian

In this paper, we compare and analyze the performance of nine unsupervised discretization methods, i.e., equal width, equal frequency, k-means clustering, ordinal, fixed frequency, non-disjoint, proportional, weight proportional, and mean value and standard deviation discretizations, in the framework of continuous entropy estimation, based on 15 probability density distributions, i.e., the Beta, Cauchy, Central Chi-Squared, Exponential, F, Gamma, Laplace, Logistic, Lognormal, Normal, Rayleigh, Student's-t, Triangular, Uniform, and Weibull distributions.
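As an illustration only (not code from the paper), the simplest of the methods compared, equal-width discretization, can be used for a plug-in estimate of continuous (differential) entropy: bin the samples, compute the discrete entropy, and add the log of the bin width. Bin count and sample size here are arbitrary choices.

```python
import numpy as np

def entropy_equal_width(samples, n_bins=32):
    """Plug-in estimate of differential entropy (in nats) via
    equal-width discretization of the samples."""
    counts, edges = np.histogram(samples, bins=n_bins)
    width = edges[1] - edges[0]
    p = counts[counts > 0] / counts.sum()
    # Discrete entropy of the binned data plus log(bin width)
    # approximates the differential entropy.
    return -np.sum(p * np.log(p)) + np.log(width)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 100_000)
est = entropy_equal_width(x)
true_h = 0.5 * np.log(2 * np.pi * np.e)  # differential entropy of N(0, 1)
print(est, true_h)  # the two values should be close
```

The estimate improves as the bin width shrinks and the sample count grows; the paper's comparison concerns exactly how such discretization choices affect the estimator across distributions.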

2020 ◽  
Vol 38 (29_suppl) ◽  
pp. 279-279
Author(s):  
Victoria Andreotti ◽  
Marika Cinausero ◽  
Silvio Ken Garattini ◽  
Lucia Bortot ◽  
Lorenza Palmero ◽  
...  

279 Background: In recent years, the introduction of immune checkpoint inhibitors (ICI) into clinical practice has translated into major changes in oncology workload. We conducted a study to estimate the shift in workload generated, within 1 year of first consultation, by any new metastatic cancer patient receiving ICI at the Oncology Department of the Academic Hospital of Udine, Italy. Methods: From our electronic accountability system we collected data on all new cases of metastatic cancer between 01.01.2017 and 31.12.2018 that led to at least a second clinical episode (treatment sessions, unplanned presentations, hospitalizations, re-evaluations, follow-up, and inpatient oncology consultations) during the following year. Patients (pts) were divided into those receiving ICI (anti-CTLA4/PD1/PDL1) and pts receiving other treatments. The mean number of clinical episodes per patient and its standard deviation were calculated, and the mean numbers in each group were compared using Student's t-test (significance p<0.05). Follow-up continued until 31.12.2019. Results: 969 pts were included: 115 were treated with ICI and 854 received other treatments. The first group generated a greater number of treatment sessions, re-evaluations, and unplanned presentations, with a statistically significant increase in workload. On the other hand, pts receiving other treatments generated a greater workload in terms of follow-up. Detailed data are reported in the Table. Conclusions: ICI have transformed the oncology landscape, leading to longer-lasting treatment periods with emerging toxicities. Estimating the workload generated by ICI is crucial for implementing more sustainable systems and for planning clinical activities. Mean number of clinical episodes in the first year of treatment with ICI for metastatic disease: the mean number per patient is given as mean value and standard deviation (SD), and the total number of clinical episodes is shown (N=). Data are reported for the ICI versus other-treatments group. [Table: see text]
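The group comparison described above is a standard two-sample t-test on mean episode counts. A minimal sketch follows, using Welch's unequal-variance variant and entirely hypothetical per-patient counts (the study's actual data are in its table); the Poisson means are invented for illustration.

```python
import numpy as np

def welch_t(a, b):
    """Two-sample t statistic (Welch's unequal-variance variant),
    one way to compare mean episode counts between two groups."""
    ma, mb = np.mean(a), np.mean(b)
    va, vb = np.var(a, ddof=1), np.var(b, ddof=1)
    return (ma - mb) / np.sqrt(va / len(a) + vb / len(b))

# Hypothetical per-patient counts of treatment sessions (not study data).
rng = np.random.default_rng(1)
ici = rng.poisson(12, 115)     # ICI group, n = 115
other = rng.poisson(9, 854)    # other-treatments group, n = 854
t = welch_t(ici, other)
print(t)  # |t| > ~1.96 suggests p < 0.05 at these sample sizes
```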


Cancers ◽  
2021 ◽  
Vol 13 (10) ◽  
pp. 2421
Author(s):  
Roberta Fusco ◽  
Vincenza Granata ◽  
Mauro Mattace Raso ◽  
Paolo Vallone ◽  
Alessandro Pasquale De Rosa ◽  
...  

Purpose. To combine blood oxygenation level dependent magnetic resonance imaging (BOLD-MRI), dynamic contrast enhanced MRI (DCE-MRI), and diffusion weighted MRI (DW-MRI) in the differentiation of benign and malignant breast lesions. Methods. Thirty-seven pathologically proven breast lesions (11 benign and 21 malignant) were included in this retrospective preliminary study. Pharmacokinetic parameters including Ktrans, kep, ve, and vp were extracted by DCE-MRI; BOLD parameters were estimated by the basal signal S0 and the relaxation rate R2*; and diffusion and perfusion parameters were derived by DW-MRI (pseudo-diffusion coefficient (Dp), perfusion fraction (fp), and tissue diffusivity (Dt)). Correlation coefficients, the Wilcoxon-Mann-Whitney U-test, and receiver operating characteristic (ROC) analysis were calculated, and the area under the ROC curve (AUC) was obtained. Moreover, pattern recognition approaches (linear discriminant analysis and decision tree) with a balancing technique and a leave-one-out cross-validation approach were considered. Results. R2* and D had a significant negative correlation (−0.57). The mean value, standard deviation, Skewness, and Kurtosis values of R2* did not differ significantly between benign and malignant lesions (p > 0.05), in line with the 'poor' diagnostic value at ROC analysis. Among DW-MRI-derived parameters, at univariate analysis the standard deviation of D and the Skewness and Kurtosis values of D* discriminated significantly between benign and malignant lesions; the best univariate result was obtained by the Skewness of D*, with an AUC of 82.9% (p-value = 0.02). Significant results were also obtained for the mean value of Ktrans; the mean value, standard deviation, and Skewness of kep; and the mean value, Skewness, and Kurtosis of ve; the best AUC among the DCE-MRI parameters was reached by the mean value of kep and was equal to 80.0%.
The best diagnostic performance in the discrimination of benign and malignant lesions was obtained at multivariate analysis considering the DCE-MRI parameters alone, with an AUC = 0.91 when the balancing technique was applied. Conclusions. Our results suggest that the combined use of DCE-MRI, DW-MRI, and/or BOLD-MRI does not provide a dramatic improvement compared to the use of DCE-MRI features alone in the classification of breast lesions. However, an interesting result was the negative correlation between R2* and D.
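The U-test and AUC reported above are closely linked: the AUC equals the Mann-Whitney U statistic normalized by the number of positive-negative pairs, i.e. the probability that a randomly chosen malignant case scores above a randomly chosen benign one. A minimal sketch with invented parameter values (not the study's data):

```python
import numpy as np

def auc_mann_whitney(scores_pos, scores_neg):
    """AUC from the Mann-Whitney U statistic: the fraction of
    positive/negative pairs where the positive case scores higher
    (ties count as 1/2)."""
    s_pos = np.asarray(scores_pos, dtype=float)[:, None]
    s_neg = np.asarray(scores_neg, dtype=float)[None, :]
    wins = (s_pos > s_neg).sum() + 0.5 * (s_pos == s_neg).sum()
    return wins / (s_pos.size * s_neg.size)

# Hypothetical per-lesion values of some parameter (e.g. a skewness
# feature); these numbers are invented for illustration.
malignant = [1.2, 0.8, 1.5, 0.9, 1.1]
benign = [0.4, 0.7, 0.6, 1.0]
print(auc_mann_whitney(malignant, benign))  # -> 0.9
```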


Author(s):  
Athanasios N. Papadimopoulos ◽  
Stamatios A. Amanatiadis ◽  
Nikolaos V. Kantartzis ◽  
Theodoros T. Zygiridis ◽  
Theodoros D. Tsiboukis

Purpose Important statistical variations are likely to appear in the propagation of surface plasmon polariton waves atop the surface of graphene sheets, degrading the expected performance of real-life THz applications. This paper aims to introduce an efficient numerical algorithm that can accurately and rapidly predict the influence of material-based uncertainties for diverse graphene configurations. Design/methodology/approach Initially, the surface conductivity of graphene is described in the far-infrared spectrum, and the effects of uncertainties in its main parameters, namely the chemical potential and the relaxation time, on the propagation properties of the surface waves are investigated, unveiling a considerable impact. Furthermore, the demanding two-dimensional material is numerically modeled as a surface boundary through a frequency-dependent finite-difference time-domain (FDTD) scheme, while a robust stochastic realization is accordingly developed. Findings The mean value and standard deviation of the propagating surface waves are extracted through a single-pass simulation, in contrast to the laborious Monte Carlo technique, proving the high efficiency accomplished. Moreover, numerical results, including graphene's surface current density and electric field distribution, indicate the notable precision, stability, and convergence of the new graphene-based stochastic time-domain method in terms of the mean value and the order of magnitude of the standard deviation. Originality/value The combined uncertainties of the main parameters in graphene layers are modeled through a high-performance stochastic numerical algorithm based on the finite-difference time-domain method. The significant accuracy of the numerical results, compared to the cumbersome Monte Carlo analysis, renders the featured technique a flexible computational tool able to enhance the design of graphene THz devices through uncertainty prediction.
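The Monte Carlo baseline that the single-pass stochastic FDTD method is compared against can be sketched in a toy form: draw the uncertain parameters (chemical potential, relaxation time) from assumed distributions, evaluate a derived quantity once per draw, and read off its mean and standard deviation. The quantity below is a crude Drude-weight-like surrogate, not the paper's FDTD model, and all numbers are assumed.

```python
import numpy as np

# Toy Monte Carlo uncertainty propagation (illustration only).
rng = np.random.default_rng(2)
N = 20_000
mu_c = rng.normal(0.20, 0.02, N)  # chemical potential in eV (assumed spread)
tau = rng.normal(1.0, 0.1, N)     # relaxation time in ps (assumed spread)
q = mu_c * tau                    # surrogate for a mu_c*tau-scaled response
print(q.mean(), q.std())          # statistics require all N evaluations
```

The point of the paper's method is that a single simulation delivers these two statistics directly, whereas the Monte Carlo loop needs one full simulation per sample.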


2018 ◽  
Vol 189 ◽  
pp. 04009
Author(s):  
Kun Liu ◽  
Shiping Wang ◽  
Linyuan He ◽  
Duyan Bi ◽  
Shan Gao

Aiming at the color distortion of the restored image in the sky region, we propose an image dehazing algorithm based on a double-priors constraint. First, we divide the hazy image into sky and non-sky regions. Then the color-lines prior and the dark channel prior are used to estimate the transmission of the sky and non-sky regions, respectively. After introducing the color-lines prior to correct sky regions restored by the dark channel prior, we obtain an accurate transmission map. Finally, the local mean value and standard deviation are used to refine the transmission and obtain the dehazed image. Experimental results show that the algorithm has clear advantages in recovering the sky region.
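The dark channel prior used for the non-sky regions has a simple core: for each pixel, take the minimum over the RGB channels and then over a local patch; haze-free patches have a near-zero dark channel, while haze lifts it. A naive sketch (patch size and test image are arbitrary; real implementations use an efficient minimum filter):

```python
import numpy as np

def dark_channel(img, patch=3):
    """Dark channel: per-pixel minimum over RGB, then a local minimum
    over a patch x patch neighborhood (naive loop implementation)."""
    mins = img.min(axis=2)           # channel-wise minimum
    h, w = mins.shape
    pad = patch // 2
    padded = np.pad(mins, pad, mode="edge")
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

img = np.full((8, 8, 3), 0.7)    # uniform "hazy" region
img[4, 4] = [0.0, 0.5, 0.9]      # one pixel with a dark channel of 0
dc = dark_channel(img)
print(dc[4, 4], dc[0, 0])  # -> 0.0 0.7
```

The transmission estimate then decreases where the dark channel is large, which is exactly where the prior fails in sky regions and the color-lines correction takes over.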


1982 ◽  
Vol 14 (7) ◽  
pp. 869-888 ◽  
Author(s):  
P F Lesse

This paper deals with a class of models which describe spatial interactions and are based on Jaynes's principle. The variables entering these models can be partitioned into four groups: (a) probability density distributions (for example, relative traffic flows), (b) expected values (average cost of travel), (c) their duals (Lagrange multipliers, traffic impedance coefficient), and (d) operators transforming probabilities into expected values. The paper presents several dual formulations replacing the problem of maximizing entropy in terms of the group of variables (a) by equivalent extremum problems involving groups (b)-(d). These problems form the basis of a phenomenological theory. The theory makes it possible to derive useful relationships between groups (b) and (c). Two topics are discussed: (1) practical application of the theory (with examples) and (2) the relationship between socioeconomic modelling and statistical mechanics.


Author(s):  
Gloria D’Alessandro ◽  
Stefania Palmieri ◽  
Alice Cola ◽  
Marta Barba ◽  
Stefano Manodoro ◽  
...  

Abstract Introduction and hypothesis There is still no consensus on definitions of detrusor underactivity; therefore, it is difficult to estimate its prevalence. The primary objective of the study was to evaluate the prevalence of detrusor underactivity in a cohort of patients with pelvic floor disorders according to different proposed urodynamic definitions. The secondary objectives were to estimate the association between detrusor underactivity and symptoms, anatomy, and urodynamic findings and to build predictive models. Methods Patients who underwent urodynamic evaluation for pelvic floor disorders between 2008 and 2016 were retrospectively analyzed. Detrusor underactivity was evaluated according to Schafer's detrusor factor, Abrams' bladder contractility index, and the Jeong cut-offs. The degree of concordance between each method was measured with Cohen's kappa, and differences were tested using Student's t test, the Wilcoxon test, and Pearson's chi-squared test. Results The prevalence of detrusor underactivity in a cohort of 2092 women was 33.7%, 37.0%, and 4.1% according to the three urodynamic definitions, respectively. Age, menopausal status, voiding/bulging symptoms, anterior and central prolapse, first desire to void, and positive postvoid residual were directly related to detrusor underactivity. Conversely, stress urinary incontinence, detrusor pressures during voiding, and maximum flow were inversely associated. Final models for detrusor underactivity showed poor accuracy for all considered definitions. Conclusions The prevalence of detrusor underactivity varies depending on the definition considered. Although several clinical variables emerged as independent predictors of detrusor underactivity, instrumental evaluation still plays a key role in the diagnosis.
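Cohen's kappa, used above to measure concordance between definitions, corrects observed agreement for the agreement expected by chance from the marginal rates. A minimal sketch with invented labels (1 = underactive under a given definition; not study data):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for agreement between two classifications,
    e.g. two detrusor-underactivity definitions on one cohort."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)  # observed agreement
    # Chance agreement from the product of the marginal rates.
    labels = np.unique(np.concatenate([a, b]))
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in labels)
    return (po - pe) / (1 - pe)

# Hypothetical labels under two definitions (illustration only).
d1 = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
d2 = [1, 0, 0, 0, 1, 0, 0, 1, 1, 0]
print(round(cohens_kappa(d1, d2), 3))  # -> 0.583
```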


2014 ◽  
Vol 496-500 ◽  
pp. 1643-1647
Author(s):  
Ying Feng Wu ◽  
Gang Yan Li

The IR-based large-scale volume localization system (LSVLS), built with reference to the MSCMS-II, can localize a mobile robot working in a large volume. Hundreds of cameras in the LSVLS must be connected to control stations (PCs) through a network. Synchronization of cameras mounted on different control stations is significant, because image acquisition of the target must be synchronous to ensure that the target is localized precisely. A software synchronization method is adopted to ensure camera synchronization. The mean value of the standard deviation of eight cameras mounted on two workstations is 12.53 ms, and the localization performance of the LSVLS is thus enhanced.


2011 ◽  
Vol 1 (4) ◽  
pp. 305-312 ◽  
Author(s):  
Y. Wang

Precise computation of the direct and indirect topographic effects of Helmert's 2nd method of condensation using the SRTM30 digital elevation model

The direct topographic effect (DTE) and indirect topographic effect (ITE) of Helmert's 2nd method of condensation are computed using the digital elevation model (DEM) SRTM30 at 30 arc-seconds globally. The computations assume a constant density of the topographic masses. Closed formulas are used in the inner zone of half a degree, and Nagy's formulas are used in the innermost column to treat the singularity of the integrals. To speed up the computations, the 1-dimensional fast Fourier transform (1D FFT) is applied in the outer-zone computations. The computation accuracy is limited to 0.1 mGal and 0.1 cm for the direct and indirect effect, respectively. The mean value and standard deviation of the DTE are -0.8 and ±7.6 mGal over land areas. The extreme value of -274.3 mGal is located at latitude -13.579° and longitude 289.496°, at a height of 1426 meters in the Andes Mountains. The ITE is negative everywhere and has its minimum of -235.9 cm at the peak of the Himalayas (8685 meters). The standard deviation and mean value over land areas are ±15.6 cm and -6.4 cm, respectively. Because the Stokes kernel does not contain the zero- and first-degree spherical harmonics, the mean value of the ITE cannot be compensated through the remove-restore procedure under the Stokes-Helmert scheme, and careful treatment of the mean value of the ITE is required.


Author(s):  
Zhenyu Liu ◽  
Shien Zhou ◽  
Chan Qiu ◽  
Jianrong Tan

The performance of mechanical products is closely related to their key feature errors, so it is essential to predict the final assembly variation through assembly variation analysis to ensure product performance. Rigid–flexible hybrid construction is a common type of mechanical product. Existing variation analysis methods, in which rigid and flexible parts are calculated separately, can hardly meet the requirements of these complicated products. Another approach is a linear superposition of rigid and flexible errors, which cannot reveal the quantitative relationship between product assembly variation and part manufacturing error. Therefore, an assembly variation analysis method for complicated products based on a rigid–flexible vector loop is proposed in this article. First, the shapes of part surfaces and sidelines are estimated according to the different tolerance types, and the probability density distributions of discrete feature points on each surface are calculated from the tolerance field size with statistical methods. Second, each flexible part surface is discretized into a set of multi-segment vectors to build the vector-loop model; each vector can be orthogonally decomposed into components representing position information and error size. Combining the multi-segment vector set of the flexible parts with the traditional rigid-part vectors, a uniform vector-loop model is constructed to represent rigid–flexible complicated products. The probability density distributions of the discrete feature points on the part surfaces are taken as inputs to calculate the assembly variation of the product's key features. Compared with existing methods, this method applies to the assembly variation prediction of complicated products that consist of both rigid and flexible parts. The impact of each rigid and flexible part's manufacturing error on the product assembly variation can be determined, which provides a foundation for part tolerance optimization design.
Finally, an assembly example of a phased array antenna verifies the effectiveness of the proposed method.
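In its simplest one-dimensional form, a vector loop closes when the signed part dimensions sum to the gap of interest, and the gap's distribution follows from the per-part tolerance distributions. A Monte Carlo sketch with hypothetical dimensions and tolerances (not the paper's antenna example, and ignoring flexibility):

```python
import numpy as np

# 1-D tolerance stack via Monte Carlo: the closing gap of a vector loop
# is the signed sum of part dimensions, each drawn from the distribution
# implied by its tolerance. All dimensions below are invented.
rng = np.random.default_rng(3)
N = 100_000
# (nominal length, std dev taken as tolerance/3, loop direction sign)
parts = [(50.0, 0.05 / 3, +1), (20.0, 0.03 / 3, +1), (70.3, 0.06 / 3, -1)]
gap = sum(s * rng.normal(nom, sd, N) for nom, sd, s in parts)
print(gap.mean(), gap.std())  # mean ~ -0.3; std combines in quadrature
```

The proposed method generalizes this idea: feature-point distributions derived from each tolerance type feed a unified rigid–flexible vector loop instead of a scalar chain.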


2018 ◽  
Author(s):  
Uwe Berger ◽  
Gerd Baumgarten ◽  
Jens Fiedler ◽  
Franz-Josef Lübken

Abstract. In this paper we present a new description of the statistical probability density distributions (pdfs) of polar mesospheric clouds (PMC) and noctilucent clouds (NLC). The analysis is based on observations of maximum backscatter, ice mass density, ice particle radius, and number density of ice particles measured by the ALOMAR RMR lidar for all NLC seasons from 2002 to 2016. From this data set we derive a new class of pdfs describing the statistics of PMC/NLC events, which differs from previous statistical methods that used an exponential distribution, commonly named the g-distribution. The new analysis successfully describes the probability statistics of the ALOMAR lidar data, and it turns out that the former g-function description is a special case of our new approach. In general, the new statistical function can be applied to many different PMC parameters, e.g. maximum backscatter, integrated backscatter, ice mass density, ice water content, ice particle radius, ice particle number density, or albedo measured by satellites. As a main advantage, the new method allows one to connect different observational PMC distributions from lidar and satellite data, and also to compare them with distributions from ice model studies. In particular, the statistical distributions of different ice parameters can be compared with each other on the basis of a common assessment, which facilitates, for example, trend analyses of PMC/NLC.
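The exponential baseline that the new pdf class generalizes is easy to fit: for a pdf g(x) = (1/b) exp(-x/b), the maximum-likelihood estimate of the scale b is the sample mean. A minimal sketch with synthetic values (not ALOMAR data):

```python
import numpy as np

def fit_exponential_scale(x):
    """ML fit of an exponential pdf g(x) = (1/b) exp(-x/b); the
    estimate of the scale b is simply the sample mean."""
    return np.mean(x)

# Synthetic 'maximum backscatter' values, arbitrary units (illustration).
rng = np.random.default_rng(4)
beta_max = rng.exponential(5.0, 50_000)
b = fit_exponential_scale(beta_max)
print(b)  # should be close to the true scale of 5.0
```

A more general pdf family, as proposed in the paper, would replace this single-parameter fit with one whose special case recovers the exponential g-distribution.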

