Information Content of Diagnostic Tests in the Medical Literature

1990 ◽  
Vol 29 (01) ◽  
pp. 61-66 ◽  
Author(s):  
P. S. Heckerling

Abstract
Diagnostic tests provide information about the presence or absence of disease. However, even after application of diagnostic tests, significant uncertainty about the state of the patient often remains. This uncertainty can be quantified through the use of information theory. The “information” contained in diagnostic tests published in the medical literature of the years 1982 through 1986 was evaluated using Shannon information functions. Information content, averaged over all prior probabilities of disease, ranged from 0.002 bits to 0.720 bits of information; the tests therefore provided from 0.3% to 100% of the information needed for diagnostic certainty. Median average information was 0.395 bits, corresponding to only 55% of the information required for diagnostic certainty. Reclassifying test results into multiple mutually exclusive outcome categories allowed extraction of a median of 14% and a maximum of 109% more average information than that obtained using a dichotomous positive/negative classification. We conclude that the “information” provided by many of the tests published in the medical literature is insufficient to overcome diagnostic uncertainty. Information theory can quantify the uncertainty associated with diagnostic testing and suggest strategies for reducing this uncertainty.

Entropy ◽  
2020 ◽  
Vol 22 (1) ◽  
pp. 97 ◽  
Author(s):  
William A. Benish

The fundamental information theory functions of entropy, relative entropy, and mutual information are directly applicable to clinical diagnostic testing. This is a consequence of the fact that an individual’s disease state and diagnostic test result are random variables. In this paper, we review the application of information theory to the quantification of diagnostic uncertainty, diagnostic information, and diagnostic test performance. An advantage of information theory functions over more established test performance measures is that they can be used when multiple disease states are under consideration as well as when the diagnostic test can yield multiple or continuous results. Since more than one diagnostic test is often required to help determine a patient’s disease state, we also discuss the application of the theory to situations in which more than one diagnostic test is used. The total diagnostic information provided by two or more tests can be partitioned into meaningful components.
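The partition mentioned in the last sentence follows the chain rule for mutual information: I(D; T1, T2) = I(D; T1) + I(D; T2 | T1). A sketch of this decomposition, using a hypothetical prevalence and two hypothetical conditionally independent tests (the accuracies are illustrative, not drawn from the paper):

```python
import math
from collections import defaultdict

def mutual_information(joint):
    """I(X; Y) in bits from a dict mapping (x, y) -> probability."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Hypothetical example: 50% prevalence, two conditionally
# independent dichotomous tests with accuracies 80% and 70%.
prior, acc1, acc2 = 0.5, 0.8, 0.7
joint3 = {}
for d in (0, 1):
    pd = prior if d == 1 else 1 - prior
    for t1 in (0, 1):
        p1 = acc1 if t1 == d else 1 - acc1
        for t2 in (0, 1):
            p2 = acc2 if t2 == d else 1 - acc2
            joint3[(d, t1, t2)] = pd * p1 * p2

# I(D; T1, T2): information provided by both tests together.
i_both = mutual_information({(d, (t1, t2)): p
                             for (d, t1, t2), p in joint3.items()})

# I(D; T1): marginalize out the second test.
j1 = defaultdict(float)
for (d, t1, t2), p in joint3.items():
    j1[(d, t1)] += p
i_t1 = mutual_information(j1)

# Chain rule: the extra information contributed by the second test.
i_t2_given_t1 = i_both - i_t1
```

The decomposition makes the marginal value of an additional test explicit: `i_t2_given_t1` is the information the second test adds once the first test's result is already known.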


Author(s):  
Kumar Dharmarajan ◽  
Kelly M Strait ◽  
Tara Lagu ◽  
Shu-Xia Li ◽  
Harlan M Krumholz

Background: Patients hospitalized with heart failure (HF) are often treated for concomitant respiratory disease due to diagnostic uncertainty or coexisting conditions. Testing with natriuretic peptides, chest radiograph (CXR), and transthoracic echocardiogram (TTE) is common and may influence these treatment decisions. We examined hospital variation in diagnostic testing among inpatients with HF and how testing relates to additional treatment for coexisting respiratory conditions. Methods: From 2009-2010 Premier, Inc. hospitals, we identified hospitalizations with a principal discharge diagnosis of HF, patient age >18 years, a known admission source, a non-pediatric attending physician, receipt of HF treatment (loop diuretics, inotropes, or IV vasodilators), and a hospital stay >2 days. We excluded hospitalizations with present-on-admission codes for infections besides pneumonia or for inflammatory, allergic, or autoimmune conditions besides COPD. For hospital days 1-2, we calculated each hospital’s proportion of admissions receiving selected diagnostic tests (natriuretic peptides, CXR, TTE) and respiratory treatments (short-acting inhaled bronchodilators, antibiotics, high-dose steroids). Treatment categories were mutually exclusive. The proportion of admissions receiving diagnostic testing and respiratory treatments was calculated for each hospital, and summary statistics were reported across hospitals. Results: We identified 164,494 HF hospitalizations among 368 hospitals. Across hospitals, natriuretic peptide testing was done in 81% to 92% (IQR; median 87%) of HF admissions, CXR testing in 87% to 94% (IQR; median 91%), and TTE testing in 39% to 56% (IQR; median 48%). The median proportion of hospitalizations receiving diagnostic testing at the hospital level was similar among patients treated only for HF and those also treated with at least one respiratory therapy (respectively, 88% vs. 90% for natriuretic peptides, 90% vs. 93% for CXR, and 51% vs. 49% for TTE).
A detailed description of diagnostic testing among each treatment group at the hospital level is provided in the accompanying table. Conclusion: Among HF inpatients, hospital use of relatively inexpensive diagnostic tests, including natriuretic peptides and CXR, is frequent, with little inter-hospital variation. In contrast, more expensive testing with TTE is less common and more variable across hospitals. Although often ordered, natriuretic peptides, CXR, and TTE do not appear to influence physicians’ decisions to treat only for heart failure or also for potential coexisting respiratory conditions.


2021 ◽  
pp. bmjqs-2020-012576
Author(s):  
Joris L J M Müskens ◽  
Rudolf Bertijn Kool ◽  
Simone A van Dulmen ◽  
Gert P Westert

Background: Overuse of diagnostic testing substantially contributes to healthcare expenses and potentially exposes patients to unnecessary harm. Our objective was to systematically identify and examine studies that assessed the prevalence of diagnostic testing overuse across healthcare settings to estimate the overall prevalence of low-value diagnostic overtesting. Methods: PubMed, Web of Science and Embase were searched from inception until 18 February 2020 to identify articles published in the English language that examined the prevalence of diagnostic testing overuse using database data. Each of the assessments was categorised as using a patient-indication lens, a patient-population lens or a service lens. Results: 118 assessments of diagnostic testing overuse, extracted from 35 studies, were included in this study. Most included assessments used a patient-indication lens (n=67, 57%), followed by the service lens (n=27, 23%) and patient-population lens (n=24, 20%). Prevalence estimates of diagnostic testing overuse ranged from 0.09% to 97.5% (median prevalence of assessments using a patient-indication lens: 11.0%, patient-population lens: 2.0% and service lens: 30.7%). The majority of assessments (n=85) reported overuse of diagnostic testing to be below 25%. Overuse of diagnostic imaging tests was most often assessed (n=96). Among the 33 assessments reporting high levels of overuse (≥25%), preoperative testing (n=7) and imaging for uncomplicated low back pain (n=6) were most frequently examined. For assessments of similar diagnostic tests, major variation in the prevalence of overuse was observed. Differences in the definitions of low-value tests used, their operationalisation and assessment methods likely contributed to this observed variation. Conclusion: Our findings suggest that substantial overuse of diagnostic testing is present, with wide variation in overuse.
Preoperative testing and imaging for non-specific low back pain are the most frequently identified low-value diagnostic tests. Uniform definitions and assessments are required in order to obtain a more comprehensive understanding of the magnitude of diagnostic testing overuse.


Author(s):  
S.I. Agasieva ◽  
E.A. Smetanin ◽  
A.R. Vechkanov ◽  
A.V. Gubanov

Statement of the problem: protection against especially dangerous infectious diseases is among the most important public-health problems. The use of biosensors in clinical testing can significantly reduce the time needed to obtain analysis results, thereby speeding the start of treatment for patients. The purpose of this article is to present modern designs of biosensors based on gallium nitride, their possible applications, and their characteristics, and to consider their principles of operation. As a result, the designs of modern biosensors and current trends in their use are surveyed from the literature of recent years. The biosensors considered, through their principles of action, areas of application, and characteristics, can reduce the potential socio-economic damage from temporary disability of sick citizens by enabling rapid and timely anti-epidemic measures. Practical value: the proposed biosensors are of interest as disease-detection devices. The use of biosensors in clinical disease research has several potential advantages over other clinical analysis methods, including increased analysis speed and flexibility, multipurpose analysis capability, automation, reduced diagnostic testing costs, and the ability to integrate molecular diagnostic tests into local healthcare systems.


2017 ◽  
Vol 28 (7) ◽  
pp. 954-966 ◽  
Author(s):  
Colin Bannard ◽  
Marla Rosner ◽  
Danielle Matthews

Of all the things a person could say in a given situation, what determines what is worth saying? Greenfield’s principle of informativeness states that right from the onset of language, humans selectively comment on whatever they find unexpected. In this article, we quantify this tendency using information-theoretic measures and report on a study in which we tested the counterintuitive prediction that children will produce words that have a low frequency given the context, because these will be most informative. Using corpora of child-directed speech, we identified adjectives that varied in how informative (i.e., unexpected) they were given the noun they modified. In an initial experiment (N = 31) and in a replication (N = 13), 3-year-olds heard an experimenter use these adjectives to describe pictures. The children’s task was then to describe the pictures to another person. As the information content of the experimenter’s adjective increased, so did children’s tendency to comment on the feature that adjective had encoded. Furthermore, our analyses suggest that children balance informativeness with a competing drive to ease production.
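The "information content of an adjective given the noun" used above is the standard surprisal measure, -log2 P(adjective | noun), estimated from corpus counts. A minimal sketch with a toy made-up corpus (the words and counts are illustrative, not from the study's child-directed speech corpora):

```python
import math
from collections import Counter

def surprisal(adj, noun, pairs):
    """Information content -log2 P(adjective | noun), estimated
    from a list of observed (adjective, noun) pairs."""
    pair_counts = Counter(pairs)
    noun_counts = Counter(n for _, n in pairs)
    return -math.log2(pair_counts[(adj, noun)] / noun_counts[noun])

# Toy corpus: "big dog" is expected, "purple dog" is not.
corpus = [("big", "dog")] * 8 + [("purple", "dog")] * 2
```

Here `surprisal("purple", "dog", corpus)` exceeds `surprisal("big", "dog", corpus)`, capturing the prediction that an unexpected adjective carries more information and is therefore more worth commenting on.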


2021 ◽  
Author(s):  
Sindew M. Feleke ◽  
Emily N. Reichert ◽  
Hussein Mohammed ◽  
Bokretsion G. Brhane ◽  
Kalkidan Mekete ◽  
...  

Abstract
Malaria diagnostic testing in Africa is threatened by Plasmodium falciparum parasites lacking histidine-rich protein 2 (pfhrp2) and 3 (pfhrp3) genes. Among 12,572 subjects enrolled along Ethiopia’s borders with Eritrea, Sudan, and South Sudan and using multiple assays, we estimate HRP2-based rapid diagnostic tests would miss 9.7% (95% CI 8.5-11.1) of falciparum malaria cases due to pfhrp2 deletion. Established and novel genomic tools reveal distinct subtelomeric deletion patterns, well-established pfhrp3 deletions, and recent expansion of pfhrp2 deletion. Current diagnostic strategies need to be urgently reconsidered in Ethiopia, and expanded surveillance is needed throughout the Horn of Africa.


2021 ◽  
Vol 9 ◽  
Author(s):  
Ted Sichelman

Many scholars have employed the term “entropy” in the context of law and legal systems to roughly refer to the amount of “uncertainty” present in a given law, doctrine, or legal system. Just a few of these scholars have attempted to formulate a quantitative definition of legal entropy, and none have provided a precise formula usable across a variety of legal contexts. Here, relying upon Claude Shannon's definition of entropy in the context of information theory, I provide a quantitative formalization of entropy in delineating, interpreting, and applying the law. In addition to offering a precise quantification of uncertainty and the information content of the law, the approach offered here provides other benefits. For example, it offers a more comprehensive account of the uses and limits of “modularity” in the law—namely, using the terminology of Henry Smith, the use of legal “boundaries” (be they spatial or intangible) that “economize on information costs” by “hiding” classes of information “behind” those boundaries. In general, much of the “work” performed by the legal system is to reduce legal entropy by delineating, interpreting, and applying the law, a process that can in principle be quantified.
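Since the quantification rests on Shannon's definition, the core computation is small: legal entropy is H = -Σ p·log2 p over the probabilities of the competing interpretations. A minimal sketch with hypothetical numbers (the "statute readings" scenario is illustrative, not from the article):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical: four equally plausible readings of a statute carry
# 2 bits of legal entropy; an appellate ruling that eliminates two
# of them halves the uncertainty to 1 bit.
before = entropy([0.25, 0.25, 0.25, 0.25])
after = entropy([0.5, 0.5])
```

In this framing, the "work" of delineating, interpreting, and applying the law is the difference `before - after`: the bits of uncertainty the legal system has removed.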


1981 ◽  
Vol 16 (1) ◽  
pp. 77-81 ◽  
Author(s):  
Lawrence R. Bigongiari ◽  
David F. Preston ◽  
Larry Cook ◽  
Samuel J. Dwyer ◽  
Steve Fritz ◽  
...  
