Generalised Informative Discrimination Measure and Its Properties with Applications

2021 ◽  
pp. 22-33
Author(s):  
Sharma D.K. ◽  
Sonali Saxena

Shannon interval entropy has been suggested as a convenient measure of uncertainty for doubly truncated random variables in reliability analysis. A new measure of discrepancy between two doubly truncated lifetime distributions has also recently been suggested. The present paper describes a generalised informative discrimination measure for lifetime distributions over a given time interval. Several properties of the new generalised measure are also investigated. Lung-cancer data are analysed using the survival function and the Kaplan–Meier estimator, implemented with Python libraries.
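The abstract mentions estimating the survival function with the Kaplan–Meier estimator via Python libraries (e.g. lifelines provides a ready-made `KaplanMeierFitter`). A dependency-free sketch of the estimator itself, on hypothetical follow-up data rather than the paper's lung-cancer dataset:

```python
# Kaplan-Meier product-limit estimate of the survival function S(t).
# Each observation is (time, event): event=1 for death, 0 for censoring.
def kaplan_meier(observations):
    # Sort by time; at each distinct event time t, S is multiplied by
    # (1 - d_t / n_t), where d_t = deaths at t, n_t = at risk just before t.
    data = sorted(observations)
    n_at_risk = len(data)
    surv = 1.0
    curve = []  # list of (time, S(t)) step values
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        removed = 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve

# Hypothetical follow-up times in months; event=0 marks censoring.
obs = [(3, 1), (5, 0), (7, 1), (11, 1), (14, 0), (20, 1)]
print(kaplan_meier(obs))
```

Censored subjects leave the risk set without contributing a factor, which is what distinguishes the product-limit estimate from the plain empirical distribution.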

2018 ◽  
Vol 36 (4) ◽  
pp. 860
Author(s):  
Vera Lucia Damasceno TOMAZELLA ◽  
Eder Ângelo MILANI ◽  
Teresa Cristina Martins DIAS

Survival models with frailty are used when some variables explaining the occurrence time of an event of interest are unavailable. This unavailability may be treated as a random effect related to covariates that are unobserved or cannot be measured, such as environmental or genetic factors. This paper focuses on the Gamma-Gompertz (G-G) model, one of a class of models that investigate the effects of unobservable heterogeneity. We assume that the baseline mortality rate in the G-G model follows the Gompertz model, in which mortality increases exponentially with age, that frailty is a fixed property of the individual, and that frailty follows a gamma distribution. The proposed methodology uses the Laplace transform to obtain the survival function unconditional on the individual frailty. Estimation is based on maximum likelihood methods, and the distribution is compared with its particular case. A simulation study examines the bias, mean squared errors, and coverage probabilities for various sample sizes and censored data. A real example with lung cancer data illustrates the applicability of the methodology, in which we compare the G-G model with the model without frailty via criteria that select the best-fitting model for the data.
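The Laplace-transform step the abstract describes yields a well-known closed form: with Gompertz baseline hazard h0(t) = a·exp(b·t), cumulative hazard H0(t) = (a/b)(exp(b·t) − 1), and gamma frailty with mean 1 and variance θ, the unconditional survival is S(t) = (1 + θ·H0(t))^(−1/θ). A minimal sketch of that result; the parameter names and values are illustrative, not the paper's estimates:

```python
import math

# Unconditional survival of the Gamma-Gompertz (G-G) frailty model,
# obtained by evaluating the gamma Laplace transform at the Gompertz
# cumulative hazard H0(t) = (a/b) * (exp(b*t) - 1).
def gg_survival(t, a, b, theta):
    # a, b: Gompertz baseline parameters; theta: frailty (gamma) variance.
    H0 = (a / b) * (math.exp(b * t) - 1.0)
    if theta == 0.0:  # no frailty: reduces to the plain Gompertz model
        return math.exp(-H0)
    return (1.0 + theta * H0) ** (-1.0 / theta)

# Illustrative comparison of the frailty model with its particular case.
for t in (0.0, 20.0, 40.0, 60.0):
    print(t, gg_survival(t, a=0.01, b=0.1, theta=0.5),
          gg_survival(t, a=0.01, b=0.1, theta=0.0))
```

As θ → 0 the frailty model collapses to the Gompertz model without frailty, which is the "particular case" comparison mentioned in the abstract.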


2020 ◽  
Vol 4 (5) ◽  
pp. 805-812
Author(s):  
Riska Chairunisa ◽  
Adiwijaya ◽  
Widi Astuti

Cancer is one of the deadliest diseases in the world, with a mortality rate of 57.3% in Asia in 2018. Early diagnosis is therefore needed to avoid an increase in cancer mortality. As machine learning develops, cancer gene-expression data from microarrays can be processed for early detection of cancer. The problem with microarray data, however, is the very large number of attributes, which makes dimensionality reduction necessary. To overcome this problem, this study used Discrete Wavelet Transform (DWT) dimensionality reduction with Classification and Regression Tree (CART) and Random Forest (RF) as classification methods. The purpose of using these two classification methods is to find out which one performs best when combined with DWT dimensionality reduction. This research uses five microarray datasets, namely Colon Tumor, Breast Cancer, Lung Cancer, Prostate Tumor, and Ovarian Cancer, from the Kent-Ridge Biomedical Dataset. The best accuracies obtained in this study were 76.92% for breast cancer with CART-DWT, 90.1% for colon tumors with RF-DWT, 100% for lung cancer with RF-DWT, 95.49% for prostate tumors with RF-DWT, and 100% for ovarian cancer with RF-DWT. From these results it can be concluded that RF-DWT is better than CART-DWT.
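The DWT reduction step can be illustrated with the simplest wavelet, the Haar transform (the abstract does not state which wavelet the authors used, so Haar is an assumption here; in practice a library such as PyWavelets would be used, and the reduced features would then be fed to CART or Random Forest classifiers):

```python
import math

# One level of the Haar discrete wavelet transform (DWT): each pair of
# adjacent feature values is replaced by a scaled average (approximation)
# and a scaled difference (detail). Keeping only the approximation halves
# the feature dimension; repeating the step keeps halving it.
def haar_dwt_step(signal):
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def reduce_features(sample, levels):
    # Apply `levels` DWT steps, keeping only approximation coefficients.
    for _ in range(levels):
        sample, _ = haar_dwt_step(sample)
    return sample

# A hypothetical 8-gene expression profile reduced to 2 features.
profile = [2.0, 2.0, 4.0, 4.0, 1.0, 3.0, 5.0, 5.0]
print(reduce_features(profile, levels=2))  # approximately [6.0, 7.0]
```

For microarray data with thousands of genes per sample, a few such levels shrink the attribute count dramatically while preserving the coarse shape of the expression profile, which is what makes the downstream CART/RF training tractable.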


2017 ◽  
Vol 920 (2) ◽  
pp. 57-60
Author(s):  
F.E. Guliyeva

A review of relevant work on remote sensing of forests shows that the known methods for remote estimation of forest cutting and growth do not allow calculation of an objective average value of the volume of forest cut over a fixed time period. The existing mathematical estimates are not monotonic and permit only a crude assessment of the scale of cutting, computed as the ratio of data at two fixed time points. This article investigates the extremal properties of these estimates for deforestation and reforestation models, and studies the extremal features of their integrated averaged values under constraints on the variables characterising the deforestation and reforestation processes. An integrated parameter is suggested that makes it possible to calculate the averaged value of the forest-cutting estimates over the whole fixed time period with a fixed step. It is shown mathematically that this estimate is monotonic with respect to the length of the time interval and makes it possible to evaluate the scale of forest cutting objectively.


Games ◽  
2021 ◽  
Vol 12 (1) ◽  
pp. 11
Author(s):  
Nikolai Grigorenko ◽  
Lilia Luk’yanova

A model of production funds acquisition, which includes two differential links of the zero order and two series-connected inertial links, is considered in a one-sector economy. The zero-order differential links correspond to the equations of the Ramsey model. These equations contain a scalar bounded control that determines the distribution of the available funds into two parts: investment and consumption. The two series-connected inertial links describe the dynamics of the change in the volume of actual production at the current production capacity. For this control system, the problem of maximizing the average consumption over a given time interval is posed. The properties of the optimal control are established analytically using the Pontryagin maximum principle. The cases are identified in which such control is bang-bang, as well as the cases in which, along with bang-bang (non-singular) portions, the control can contain a singular arc; concatenation of singular and non-singular portions is then carried out via chattering. A bang-bang suboptimal control is presented that is close to the optimal one according to the given quality criterion. A positional terminal control is proposed as a first approximation when a suboptimal control with a given deviation of the objective function from the optimal value is found numerically. The obtained results are confirmed by the corresponding numerical calculations.


2021 ◽  
pp. 0272989X2199895
Author(s):  
Adinda Mieras ◽  
Annemarie Becker-Commissaris ◽  
Hanna T. Klop ◽  
H. Roeline W. Pasman ◽  
Denise de Jong ◽  
...  

Background Previous studies have investigated patients’ treatment goals before starting treatment for metastatic lung cancer. Data on the evaluation of treatment goals are lacking. Aim To determine whether patients with metastatic lung cancer and their oncologists perceive the treatment goals they defined at the start of systemic treatment as achieved after treatment, and whether in hindsight they believe it was the right decision to start systemic therapy. Design and Participants A prospective multicenter study in 6 hospitals across the Netherlands between 2016 and 2018. Following systemic treatment, 146 patients with metastatic lung cancer and 23 oncologists completed a questionnaire on the achievement of their treatment goals and whether they had made the right treatment decision. Additional interviews with 15 patients and 5 oncologists were conducted. Results According to patients and oncologists, respectively, treatment goals were achieved in 30% and 37% of cases for ‘quality of life,’ 49% and 41% for ‘life prolongation,’ 26% and 44% for ‘decrease in tumor size,’ and 44% for ‘cure.’ Most patients and oncologists in hindsight felt they had made the right decision to start treatment, even if they had not achieved their goals (72% and 93%, respectively). This was related to the feeling that they had to do ‘something.’ Conclusions Before deciding on treatment, the treatment options, including their benefits and side effects, and the goals patients have should be discussed. It is key that these discussions include not only systemic treatment but also palliative care as an effective option for doing ‘something.’


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Xiaoqing Luo ◽  
Shunli Peng ◽  
Sijie Ding ◽  
Qin Zeng ◽  
Rong Wang ◽  
...  

Abstract Background Serum Deprivation Protein Response (SDPR) plays an important role in the formation of pulmonary alveoli. However, the functions and value of SDPR in lung cancer remain unknown. We explored the prognostic value, expression pattern, and biological function of SDPR in non-small cell lung cancer (NSCLC) and KRAS-mutant lung cancers. Methods SDPR expression was evaluated by quantitative real-time PCR (RT-qPCR), immunohistochemistry (IHC), and Western blot on human NSCLC cells, a lung adenocarcinoma tissue array, KRAS-mutant transgenic mice, and TCGA and GEO datasets. The prognostic value of SDPR was evaluated by Kaplan–Meier and Cox regression analysis. Bioinformatic implications of SDPR, including SDPR-associated transcription factors (TFs) and microRNAs, were predicted. In addition, correlations between SDPR, immune checkpoint molecules, and tumor infiltration models were illustrated. Results SDPR expression was downregulated in tumor cells and tissues. Low SDPR expression was an independent factor correlated with shorter overall survival of patients, both in lung cancer overall and in KRAS-mutant subgroups. A ceRNA network was constructed to clarify the regulatory and biological functions of SDPR. Negative correlations were found between SDPR and immune checkpoint molecules (PD-L1, TNFRSF18, TNFRSF9, and TDO2). Moreover, diverse immune infiltration patterns were observed in NSCLC with different SDPR expression and copy number variation (CNV) patterns. Conclusions This study elucidated the regulatory network of SDPR in KRAS-mutant NSCLC and illustrated correlations between low SDPR expression and a suppressed immune system, revealing a prognostic factor and potential target for the treatment of lung cancer, especially KRAS-mutant NSCLC.


2020 ◽  
Vol 8 (1) ◽  
pp. 70-91 ◽  
Author(s):  
Miguel Navascués ◽  
Elie Wolfe

Abstract The causal compatibility question asks whether a given causal structure graph (possibly involving latent variables) constitutes a genuinely plausible causal explanation for a given probability distribution over the graph’s observed categorical variables. Algorithms predicated on merely necessary constraints for causal compatibility typically suffer from false negatives, i.e. they admit incompatible distributions as apparently compatible with the given graph. In 10.1515/jci-2017-0020, one of us introduced the inflation technique for formulating useful relaxations of the causal compatibility problem in terms of linear programming. In this work, we develop a formal hierarchy of such causal compatibility relaxations. We prove that inflation is asymptotically tight, i.e., that the hierarchy converges to a zero-error test for causal compatibility. In this sense, the inflation technique fulfills a longstanding desideratum in the field of causal inference. We quantify the rate of convergence by showing that any distribution which passes the $n$th-order inflation test must be $O(n^{-1/2})$-close in Euclidean norm to some distribution genuinely compatible with the given causal structure. Furthermore, we show that for many causal structures, the (unrelaxed) causal compatibility problem is faithfully formulated already by either the first or second order inflation test.


2020 ◽  
Vol 1471 ◽  
pp. 012043
Author(s):  
Yessi Jusman ◽  
Zul Indra ◽  
Roni Salambue ◽  
Siti Nurul Aqmariah Mohd Kanafiah ◽  
Muhammad Ahdan Fawwaz Nurkholid

Author(s):  
MINNIE H. PATEL ◽  
H.-S. JACOB TSAO

An empirical cumulative lifetime distribution function is often required for selecting a lifetime distribution. When some test items are censored from testing before failure, this function needs to be estimated, often via discrete nonparametric maximum likelihood estimation (DN-MLE). In this approach, the empirical function is expressed as a discrete set of failure-probability estimates. Kaplan and Meier used this approach and obtained a product-limit estimate for the survivor function, expressed exclusively in terms of the hazard probabilities, together with the equivalent failure-probability estimates. They cleverly expressed the likelihood function as a product of terms, each of which involves only one hazard probability, for ease of derivation; the estimates for the failure probabilities, however, are complex functions of the hazard probabilities. Because there are no closed-form expressions for the failure probabilities, the estimates have been calculated numerically. More importantly, it has been difficult to study the behavior of the failure-probability estimates, e.g., their standard errors, particularly when the sample size is not very large. This paper first derives closed-form expressions for the failure probabilities. For the special case of no censoring, the DN-MLE estimates for the failure probabilities are in closed form and have an obvious, intuitive interpretation, whereas the Kaplan–Meier failure-probability estimates for cases involving censored data defy interpretation and intuition. This paper then develops a simple algorithm that not only produces these estimates but also provides a clear, intuitive justification for them. We prove that the algorithm indeed produces the DN-MLE estimates and demonstrate numerically their equivalence to the Kaplan–Meier-based estimates. We also provide an alternative algorithm.
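The product-limit relation underlying the abstract is simple to state: given discrete hazard probabilities h_j (probability of failing at time t_j given survival up to t_j), the survivor function is S_j = ∏_{i≤j}(1 − h_i) and the failure probability at t_j is p_j = h_j · ∏_{i<j}(1 − h_i). A short sketch of that conversion (the paper's closed-form DN-MLE expressions and algorithms are not reproduced here):

```python
# Convert discrete hazard probabilities h_j into failure probabilities
# p_j = h_j * prod_{i<j} (1 - h_i), the quantity whose behavior the
# paper studies; the running product is the survivor function S_{j-1}.
def failure_probs_from_hazards(hazards):
    surv = 1.0
    probs = []
    for h in hazards:
        probs.append(h * surv)   # p_j = h_j * S_{j-1}
        surv *= 1.0 - h
    return probs

# With no censoring, hazards d_j / n_j recover the empirical frequencies:
# 4 items failing one at a time give hazards 1/4, 1/3, 1/2, 1.
hazards = [1/4, 1/3, 1/2, 1.0]
print(failure_probs_from_hazards(hazards))  # approximately [0.25, 0.25, 0.25, 0.25]
```

The uncensored case illustrates the "obvious, intuitive interpretation" the abstract mentions: each observed failure receives equal mass 1/n. Under censoring the same product formula still applies, but the resulting p_j are no longer simple frequencies, which is the complexity the paper's closed-form expressions address.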

