Bayesian Information-Theoretic Calibration of Radiotherapy Sensitivity Parameters for Informing Effective Scanning Protocols in Cancer

2020 · Vol 9 (10) · pp. 3208
Author(s): Heyrim Cho, Allison L. Lewis, Kathleen M. Storey

With new advancements in technology, it is now possible to collect data for a variety of different metrics describing tumor growth, including tumor volume, composition, and vascularity, among others. For any proposed model of tumor growth and treatment, we observe large variability among individual patients’ parameter values, particularly those relating to treatment response; thus, exploiting the use of these various metrics for model calibration can be helpful to infer such patient-specific parameters both accurately and early, so that treatment protocols can be adjusted mid-course for maximum efficacy. However, taking measurements can be costly and invasive, limiting clinicians to a sparse collection schedule. As such, the determination of optimal times and metrics for which to collect data in order to best inform proper treatment protocols could be of great assistance to clinicians. In this investigation, we employ a Bayesian information-theoretic calibration protocol for experimental design in order to identify the optimal times at which to collect data for informing treatment parameters. Within this procedure, data collection times are chosen sequentially to maximize the reduction in parameter uncertainty with each added measurement, ensuring that a budget of n high-fidelity experimental measurements results in maximum information gain about the low-fidelity model parameter values. In addition to investigating the optimal temporal pattern for data collection, we also develop a framework for deciding which metrics should be utilized at each data collection point. We illustrate this framework with a variety of toy examples, each utilizing a radiotherapy treatment regimen. For each scenario, we analyze the dependence of the predictive power of the low-fidelity model upon the measurement budget.
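The sequential design loop described above can be sketched in a few lines: a grid-based Bayesian update for a hypothetical one-parameter exponential growth model, where the next measurement time is the candidate that maximizes the Monte Carlo estimate of expected entropy reduction. The model, parameter grid, noise level, and candidate schedule are all illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical low-fidelity model: exponential tumor growth V(t) = V0 * exp(r * t).
V0, sigma = 1.0, 0.5                      # assumed initial volume and noise SD
r_grid = np.linspace(0.05, 0.5, 200)      # discretized parameter space for r
prior = np.full(len(r_grid), 1.0) / len(r_grid)

def posterior(p, t, y):
    """Grid Bayes update after observing volume y at time t."""
    like = np.exp(-0.5 * ((y - V0 * np.exp(r_grid * t)) / sigma) ** 2)
    w = p * like
    return w / w.sum()

def entropy(p):
    q = p[p > 0]
    return -(q * np.log2(q)).sum()

def expected_info_gain(p, t, n_sim=300):
    """Monte Carlo estimate of expected entropy reduction from measuring at t."""
    gains = []
    for _ in range(n_sim):
        r = rng.choice(r_grid, p=p)                    # draw parameter from current belief
        y = V0 * np.exp(r * t) + rng.normal(0, sigma)  # simulate a measurement
        gains.append(entropy(p) - entropy(posterior(p, t, y)))
    return np.mean(gains)

# Greedily pick the next measurement time from a sparse candidate schedule.
candidates = [2, 5, 10, 15, 20]
best_t = max(candidates, key=lambda t: expected_info_gain(prior, t))
print("most informative first measurement time:", best_t)
```

In this toy setting late measurements are most informative, since the exponential amplifies differences in r while the noise stays fixed; the same greedy loop would then be repeated on the updated posterior for each subsequent measurement in the budget.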

2021 · Vol 4 (1)
Author(s): Michelle Przedborski, Munisha Smalley, Saravanan Thiyagarajan, Aaron Goldman, Mohammad Kohandel

Anti-PD-1 immunotherapy has recently shown tremendous success for the treatment of several aggressive cancers. However, variability and unpredictability in treatment outcome have been observed, and are thought to be driven by patient-specific biology and interactions of the patient’s immune system with the tumor. Here we develop an integrative systems biology and machine learning approach, built around clinical data, to predict patient response to anti-PD-1 immunotherapy and to improve the response rate. Using this approach, we determine biomarkers of patient response and identify potential mechanisms of drug resistance. We develop systems biology informed neural networks (SBINN) to calculate patient-specific kinetic parameter values and to predict clinical outcome. We show how transfer learning can be leveraged with simulated clinical data to significantly improve the response prediction accuracy of the SBINN. Further, we identify novel drug combinations and optimize the treatment protocol for triple combination therapy consisting of IL-6 inhibition, recombinant IL-12, and anti-PD-1 immunotherapy in order to maximize patient response. We also find unexpected differences in protein expression levels between response phenotypes which complement recent clinical findings. Our approach has the potential to aid in the development of targeted experiments for patient drug screening as well as identify novel therapeutic targets.
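The transfer-learning step, pretraining on abundant simulated data and fine-tuning on a small clinical cohort, can be illustrated with a toy stand-in: a two-parameter logistic model in place of the SBINN. All data, the response rule, and the learning rates below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, w, lr=0.1, steps=500):
    """Plain logistic-regression gradient descent (stand-in for the SBINN)."""
    for _ in range(steps):
        w -= lr * X.T @ (sigmoid(X @ w) - y) / len(y)
    return w

# Hypothetical setup: two "kinetic parameters" determine the response phenotype.
def make_data(n):
    X = rng.normal(size=(n, 2))
    y = (X @ np.array([2.0, -1.0]) + rng.normal(0, 0.5, n) > 0).astype(float)
    return X, y

X_sim, y_sim = make_data(2000)     # abundant simulated clinical data
X_real, y_real = make_data(30)     # scarce real clinical data
X_test, y_test = make_data(500)

# Scratch training on the small real cohort alone...
w_scratch = train(X_real, y_real, np.zeros(2))
# ...versus pretraining on simulated data, then fine-tuning on the real cohort.
w_pre = train(X_sim, y_sim, np.zeros(2))
w_tuned = train(X_real, y_real, w_pre.copy(), lr=0.01, steps=100)

acc = lambda w: ((sigmoid(X_test @ w) > 0.5) == y_test).mean()
print(f"scratch: {acc(w_scratch):.3f}  pretrained+fine-tuned: {acc(w_tuned):.3f}")
```

The pretrained weights start near a good solution, so the fine-tuning stage can use a small learning rate and few steps; the same pattern carries over to neural networks, where early layers are typically frozen or lightly updated.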


2021 · Vol 18 (2) · pp. 172988142199958
Author(s): Larkin Folsom, Masahiro Ono, Kyohei Otsu, Hyoshin Park

Mission-critical exploration of uncertain environments requires reliable and robust mechanisms for achieving information gain. Typical measures of information gain, such as Shannon entropy and KL divergence, are unable to distinguish between different bimodal probability distributions, or introduce bias toward one mode of a bimodal distribution. The use of a standard deviation (SD) metric reduces bias while retaining the ability to distinguish between higher- and lower-risk distributions. Areas of high SD can be safely explored through observation with an autonomous Mars Helicopter, allowing safer and faster path plans for ground-based rovers. First, this study presents a single-agent information-theoretic utility-based path-planning method for a highly correlated uncertain environment. Then, an information-theoretic two-stage multiagent rapidly exploring random tree (RRT) framework is presented, which guides the Mars Helicopter through regions of high SD to reduce uncertainty for the rover. In a Monte Carlo simulation, we compare our information-theoretic framework with a rover-only approach and a naive approach in which the helicopter scouts ahead of the rover along its planned path. Finally, the model is demonstrated in a case study on the Jezero region of Mars. Results show that the information-theoretic helicopter improves the travel time for the rover on average when compared with the rover alone or with the helicopter scouting ahead along the rover’s initially planned route.
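The claim that SD can separate risk where entropy cannot is easy to verify numerically. The two bimodal terrain-cost distributions below are illustrative, not taken from the study:

```python
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, float)
    return -(p * np.log2(p)).sum()

def sd(values, p):
    values, p = np.asarray(values, float), np.asarray(p, float)
    mean = (p * values).sum()
    return np.sqrt((p * (values - mean) ** 2).sum())

# Two bimodal cost distributions with identical Shannon entropy (1 bit each)
# but very different spread, i.e. risk.
narrow = ([1.0, 2.0], [0.5, 0.5])   # modes close together: lower-risk cell
wide   = ([0.0, 3.0], [0.5, 0.5])   # modes far apart: higher-risk cell

print(entropy_bits(narrow[1]), entropy_bits(wide[1]))  # 1.0 1.0 — entropy is blind
print(sd(*narrow), sd(*wide))                          # 0.5 1.5 — SD ranks the risk
```

Both distributions carry exactly one bit of entropy, so an entropy-driven planner treats the cells as equally uncertain, while the SD metric correctly flags the wide distribution as the riskier one worth scouting.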


Author(s): Anne-Sophie Schuurman, Anirudh Tomer, K. Martijn Akkerhuis, Ewout J. Hoorn, Jasper J. Brugts, ...

Background: High mortality and rehospitalization rates demonstrate that improving risk assessment in heart failure patients remains challenging. The individual temporal evolution of kidney biomarkers is associated with poor clinical outcome in these patients and hence may carry the potential to move towards a personalized screening approach.
Methods: In 263 chronic heart failure patients included in the prospective Bio-SHiFT cohort study, glomerular and tubular biomarker measurements were serially obtained according to a pre-scheduled, fixed trimonthly scheme. The primary endpoint (PE) comprised cardiac death, cardiac transplantation, left ventricular assist device implantation, or heart failure hospitalization. Personalized scheduling of glomerular and tubular biomarker measurements was compared to fixed scheduling in individual patients by means of a simulation study based on clinical characteristics of the Bio-SHiFT study. For this purpose, repeated biomarker measurements and the PE were jointly modeled. For personalized scheduling, using this fitted joint model, we determined the optimal time point of the next measurement based on the patient’s individual risk profile as estimated by the joint model and the maximum information gain on the patient’s prognosis. We compared the schedules’ capability of enabling timely intervention before the occurrence of the PE and the number of measurements needed.
Results: As compared to a pre-defined trimonthly scheduling approach, personalized scheduling of glomerular and tubular biomarker measurements showed similar performance with regard to prognostication, but required a median of 0.4–2.7 fewer measurements per year.
Conclusion: Personalized scheduling is expected to reduce the number of patient visits and healthcare costs. Thus, it may contribute to efficient monitoring of chronic heart failure patients and could provide novel opportunities for timely adaptation of treatment.
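As a toy illustration of risk-driven scheduling (not the fitted joint model of the study), one can place the next measurement at the time the predicted event probability reaches a tolerated threshold. The constant-hazard assumption and the numbers below are purely illustrative:

```python
import math

def next_visit_time(hazard_per_year, risk_tolerance=0.05):
    """Time until the predicted event probability reaches the tolerated risk,
    assuming (for illustration only) a constant hazard for the endpoint:
    1 - exp(-hazard * t) = risk_tolerance  =>  t = -ln(1 - risk) / hazard."""
    return -math.log(1.0 - risk_tolerance) / hazard_per_year

# A low-risk patient can safely wait longer than a high-risk patient.
print(f"low risk:  {next_visit_time(0.10):.2f} years")
print(f"high risk: {next_visit_time(0.80):.2f} years")
```

The qualitative behavior matches the abstract: stable patients are measured less often, so a personalized schedule can match fixed trimonthly prognostication with fewer visits.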


1983 · Vol 245 (5) · pp. R620-R623
Author(s): M. Berman, P. Van Eerdewegh

A measure is proposed for the information content of data with respect to models. A model, defined by a set of parameter values in a mathematical framework, is considered a point in a hyperspace. The proposed measure expresses the information content of experimental data as the contribution they make, in units of information bits, in defining a model to within a desired region of the hyperspace. This measure is then normalized to conventional statistical measures of uncertainty. It is shown how the measure can be used to estimate the information of newly planned experiments and help in decisions on data collection strategies.
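A minimal sketch of this bit-counting idea, under the simplifying assumption that parameter uncertainty is summarized by a standard deviation: each halving of the uncertainty region contributes one bit of information about the model's location in the hyperspace.

```python
import math

def information_bits(sd_before, sd_after):
    """Bits contributed by an experiment: each halving of a parameter's
    uncertainty counts as one bit (log2 of the SD ratio)."""
    return math.log2(sd_before / sd_after)

# An experiment that shrinks a rate constant's SD from 0.8 to 0.1
# (an eightfold reduction) supplies about 3 bits of information.
print(f"{information_bits(0.8, 0.1):.1f} bits")
```

Summing such contributions over parameters gives a single scalar with which planned experiments can be ranked before any data are collected, which is the decision-support use the abstract describes.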


2018 · Vol 20 · pp. 664-673
Author(s): Stelios Angeli, Kyrre E. Emblem, Paulina Due-Tonnessen, Triantafyllos Stylianopoulos

Author(s): Saurav Jindal, Poonam Saini

In recent years, data collection and data mining have emerged as fast-paced computational processes, as the amount of data from different sources has increased manifold. With the advent of such technologies, a major concern is the exposure of an individual's self-contained information. To address this concern, a dataset is anonymized before being released to the public for further usage. The chapter discusses various existing techniques of anonymization. Thereafter, a novel redaction technique is proposed for generalization that minimizes the overall cost (penalty) of the process, which is inversely proportional to the utility of the generated dataset. To validate the proposed work, the authors assume a pre-processed dataset and compare their algorithm with existing techniques. Lastly, the proposed technique is made scalable, further minimizing the generalization cost and improving the overall utility of information gain.
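A minimal sketch of generalization-based anonymization (a simple k-anonymity coarsening, not the chapter's redaction algorithm) shows how the cost/utility trade-off arises; the dataset and penalty definition are illustrative assumptions:

```python
# Widen age intervals until every released value is shared by at least k
# records; the penalty grows with interval width, so coarser generalization
# means lower utility of the published dataset.

def generalize_ages(ages, k, width=1):
    """Coarsen ages into intervals of increasing width until k-anonymous."""
    while True:
        buckets = {}
        for a in ages:
            lo = (a // width) * width
            buckets.setdefault((lo, lo + width - 1), []).append(a)
        if all(len(v) >= k for v in buckets.values()):
            # Penalty: total interval width released; lower is better utility.
            return sorted(buckets), len(ages) * (width - 1)
        width *= 2

ages = [21, 22, 23, 24, 35, 36, 38, 39]
groups, penalty = generalize_ages(ages, k=2)
print(groups, "penalty:", penalty)
```

Here every record ends up in an interval shared by at least two individuals, but only at the price of wide intervals; the chapter's contribution is precisely to minimize this kind of penalty while keeping the anonymity guarantee.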


2004 · Vol 21 (3) · pp. 331-336
Author(s): David H. Foster, Sérgio M.C. Nascimento, Kinjiro Amano

If surfaces in a scene are to be distinguished by their color, their neural representation at some level should ideally vary little with the color of the illumination. Four possible neural codes were considered: von-Kries-scaled cone responses from single points in a scene, spatial ratios of cone responses produced by light reflected from pairs of points, and these quantities obtained with sharpened (opponent-cone) responses. The effectiveness of these codes in identifying surfaces was quantified by information-theoretic measures. Data were drawn from a sample of 25 rural and urban scenes imaged with a hyperspectral camera, which provided estimates of surface reflectance at 10-nm intervals at each of 1344 × 1024 pixels for each scene. In computer simulations, scenes were illuminated separately by daylights of correlated color temperatures 4000 K, 6500 K, and 25,000 K. Points were sampled randomly in each scene and identified according to each of the codes. It was found that the maximum information preserved under illuminant changes varied with the code, but for a particular code it was remarkably stable across the different scenes. The standard deviation over the 25 scenes was, on average, approximately 1 bit, suggesting that the neural coding of surface color can be optimized independent of location for any particular range of illuminants.
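The invariance underlying the spatial-ratio code is easy to demonstrate: under a von Kries model, an illuminant change scales each cone channel by a common factor, which cancels in the ratio of responses between two points. The numbers below are illustrative, not drawn from the hyperspectral dataset:

```python
import numpy as np

# Illustrative cone "reflectance" factors for two surfaces, and per-cone
# illuminant gains (von Kries model: illumination multiplies each cone
# channel independently).
surface_a = np.array([0.30, 0.50, 0.20])     # L, M, S
surface_b = np.array([0.60, 0.25, 0.10])
illum_warm = np.array([1.2, 1.0, 0.7])       # hypothetical 4000 K gains
illum_cool = np.array([0.8, 1.0, 1.4])       # hypothetical 25,000 K gains

# Cone responses at each point: reflectance times illuminant gain.
ratio_warm = (surface_a * illum_warm) / (surface_b * illum_warm)
ratio_cool = (surface_a * illum_cool) / (surface_b * illum_cool)

# The spatial ratio between the two points is identical under both
# illuminants, even though the raw responses themselves differ.
print(ratio_warm, ratio_cool)
```

The raw single-point responses change with the illuminant, so identifying surfaces from them requires accurate von Kries scaling; the pairwise ratio sidesteps that estimate entirely, which is why it preserves more information under illuminant changes.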


2016 · Vol 113 (48) · pp. E7663-E7671
Author(s): Guillermo Lorenzo, Michael A. Scott, Kevin Tew, Thomas J. R. Hughes, Yongjie Jessica Zhang, ...

Recently, mathematical modeling and simulation of diseases and their treatments have enabled the prediction of clinical outcomes and the design of optimal therapies on a personalized (i.e., patient-specific) basis. This new trend in medical research has been termed “predictive medicine.” Prostate cancer (PCa) is a major health problem and an ideal candidate to explore tissue-scale, personalized modeling of cancer growth for two main reasons: First, it is a small organ, and, second, tumor growth can be estimated by measuring serum prostate-specific antigen (PSA, a PCa biomarker in blood), which may enable in vivo validation. In this paper, we present a simple continuous model that reproduces the growth patterns of PCa. We use the phase-field method to account for the transformation of healthy cells to cancer cells and use diffusion−reaction equations to compute nutrient consumption and PSA production. To accurately and efficiently compute tumor growth, our simulations leverage isogeometric analysis (IGA). Our model is shown to reproduce a known shape instability from a spheroidal pattern to fingered growth. Results of our computations indicate that such a shift is a tumor response to escape starvation, hypoxia, and, eventually, necrosis. Thus, branching enables the tumor to minimize the distance from inner cells to external nutrients, contributing to cancer survival and further development. We have also used our model to perform tissue-scale, personalized simulation of a PCa patient, based on prostatic anatomy extracted from computed tomography images. This simulation shows tumor progression similar to that seen in clinical practice.
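The nutrient dynamics can be caricatured in one dimension with an explicit finite-difference scheme (a deliberately crude stand-in for the paper's isogeometric solver; all parameter values are illustrative):

```python
import numpy as np

# Minimal 1D sketch of the nutrient equation:
#   d(nutrient)/dt = D * d2(nutrient)/dx2 - consumption * tumor * nutrient
# solved with explicit finite differences on a fixed tumor profile.
D, consumption = 1.0, 10.0
x = np.linspace(0.0, 1.0, 21)
dx, dt = x[1] - x[0], 0.001                # dt*D/dx^2 = 0.4 keeps the scheme stable
nutrient = np.ones_like(x)                 # well-nourished tissue initially
tumor = np.where(np.abs(x - 0.5) < 0.15, 1.0, 0.0)  # tumor in the center

for _ in range(1000):
    lap = (np.roll(nutrient, 1) - 2 * nutrient + np.roll(nutrient, -1)) / dx**2
    nutrient += dt * (D * lap - consumption * tumor * nutrient)
    nutrient[0] = nutrient[-1] = 1.0       # vasculature keeps the boundary supplied

# Nutrient is depleted inside the tumor relative to the boundary: this is the
# gradient that favors the branching, fingered morphology discussed above.
print(f"center: {nutrient[len(x) // 2]:.3f}  boundary: {nutrient[0]:.3f}")
```

Even this crude model shows why a compact spheroid starves its core: the interior nutrient level drops while the rim stays supplied, so shapes that shorten the distance from inner cells to the nutrient source are energetically favored.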


Author(s): Alessandro Satriano, Edward J. Vigmond, Elena S. Di Martino

When complex biological structures are modeled, one of the most critical issues is the assignment of geometrical, mechanical, and electrical properties to the meshed surfaces. Properties of interest are commonly obtained from diagnostic imaging, experimental tests, or anatomical observation. These parameters are usually lumped into individual values assigned to a specific region after subdividing the structure into sub-regions. This practice simplifies the problem by avoiding the cumbersome assignment of parameter values to each element. However, sub-regions may not adequately represent the smooth transition between regions, resulting in artificial discontinuities. In addition, some parameters, such as the organization of cardiomyocytes, which is the objective of our research, may be obtainable only through destructive tests or through sophisticated methods that can be performed on a limited number of samples. Alternatively, data obtained for one animal species could be applied to a different species. Furthermore, in a clinical environment, the need for fast turnaround of patient-specific models would benefit from the assignment of tissue properties in a semi-automatic manner.
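One simple semi-automatic alternative to lumped sub-region values is to interpolate sparse measurements smoothly onto the mesh nodes, for example by inverse-distance weighting. The sample sites and fiber-angle values below are hypothetical:

```python
import numpy as np

def idw_interpolate(sample_pts, sample_vals, query_pts, power=2.0):
    """Inverse-distance weighting: assign each mesh node a smoothly varying
    parameter value from a handful of measured sample sites, instead of one
    lumped value per sub-region."""
    d = np.linalg.norm(query_pts[:, None, :] - sample_pts[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)               # guard against a node on a sample site
    w = 1.0 / d**power
    return (w @ sample_vals) / w.sum(axis=1)

# Hypothetical fiber-angle measurements (degrees) at four sample sites.
sites = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
angles = np.array([-60.0, -20.0, 20.0, 60.0])

nodes = np.array([[0.5, 0.5], [0.1, 0.1]])  # two mesh nodes to populate
print(idw_interpolate(sites, angles, nodes))
```

The central node blends all four measurements equally, while the node near a sample site is dominated by its nearest measurement, so property values vary continuously across the mesh without the artificial discontinuities of per-region assignment.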

