Designing Robust N-of-1 Studies for Precision Medicine: Simulation Study and Design Recommendations (Preprint)

2018 ◽  
Author(s):  
Bethany Percha ◽  
Edward B Baskerville ◽  
Matthew Johnson ◽  
Joel T Dudley ◽  
Noah Zimmerman

BACKGROUND: Recent advances in molecular biology, sensors, and digital medicine have led to an explosion of products and services for high-resolution monitoring of individual health. The N-of-1 study has emerged as an important methodological tool for harnessing these new data sources, enabling researchers to compare the effectiveness of health interventions at the level of a single individual.
OBJECTIVE: N-of-1 studies are susceptible to several design flaws. We developed a model that generates realistic data for N-of-1 studies to enable researchers to optimize study designs in advance.
METHODS: Our stochastic time-series model simulates an N-of-1 study, incorporating all study-relevant effects, such as carryover and wash-in effects, as well as various sources of noise. The model can be used to produce realistic simulated data for a near-infinite number of N-of-1 study designs, treatment profiles, and patient characteristics.
RESULTS: Using simulation, we demonstrate how the number of treatment blocks, ordering of treatments within blocks, duration of each treatment, and sampling frequency affect our ability to detect true differences in treatment efficacy. We provide a set of recommendations for study designs on the basis of treatment, outcome, and instrument parameters, and make our simulation software publicly available for use by the precision medicine community.
CONCLUSIONS: Simulation can facilitate rapid optimization of N-of-1 study designs and increase the likelihood of study success while minimizing participant burden.
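As a rough illustration of the kind of generative model the abstract describes, the sketch below simulates an alternating-treatment N-of-1 outcome series with exponential wash-in/carryover dynamics and Gaussian observation noise. All function names and parameter values here are hypothetical; the authors' actual model and publicly released software should be consulted for the real parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_n_of_1(blocks, period_len, effects, tau_wash_in, sigma_noise):
    """Simulate one outcome series for an alternating-treatment N-of-1 study.

    blocks      -- sequence of treatment labels, one per period, e.g. ["A", "B"] * 2
    period_len  -- number of observations per treatment period
    effects     -- dict mapping treatment label to its steady-state effect
    tau_wash_in -- time constant (in samples) of exponential wash-in/carryover
    sigma_noise -- standard deviation of the observation noise
    """
    y = []
    level = 0.0  # current (lagged) treatment effect
    for label in blocks:
        target = effects[label]
        for _ in range(period_len):
            # the effect drifts toward the new treatment's level (wash-in),
            # while the previous treatment's effect decays away (carryover)
            level += (target - level) / tau_wash_in
            y.append(level + rng.normal(0.0, sigma_noise))
    return np.array(y)

# Two blocks of treatments A and B, 14 daily measurements per period
series = simulate_n_of_1(["A", "B", "A", "B"], period_len=14,
                         effects={"A": 0.0, "B": 2.0},
                         tau_wash_in=3.0, sigma_noise=1.0)
```

Varying `period_len`, the block ordering, or `tau_wash_in` in such a simulation is one way to explore how design choices affect the detectability of a true treatment difference.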

10.2196/12641 ◽  
2019 ◽  
Vol 21 (4) ◽  
pp. e12641 ◽  
Author(s):  
Bethany Percha ◽  
Edward B Baskerville ◽  
Matthew Johnson ◽  
Joel T Dudley ◽  
Noah Zimmerman



2018 ◽  
Author(s):  
Bethany Percha ◽  
Edward B. Baskerville ◽  
Matthew Johnson ◽  
Joel Dudley ◽  
Noah Zimmerman

Abstract: Recent advances in molecular biology, sensors, and digital medicine have led to an explosion of products and services for high-resolution monitoring of individual health. The N-of-1 study has emerged as an important methodological tool for harnessing these new data sources, enabling researchers to compare the effectiveness of health interventions at the level of a single individual. We have developed a stochastic time series model that simulates an N-of-1 study, facilitating rapid optimization of N-of-1 study designs and increasing the likelihood of study success while minimizing participant burden. Using simulation, we demonstrate how the number of treatment blocks, ordering of treatments within blocks, duration of each treatment, and sampling frequency affect our ability to detect true differences in treatment efficacy. We provide a set of recommendations for study designs based on treatment, outcome, and instrument parameters, and provide our simulation software as a supplement to the paper.


2001 ◽  
Vol 7 (1) ◽  
pp. 97-112 ◽  
Author(s):  
Yulia R. Gel ◽  
Vladimir N. Fomin

Usually the coefficients in a stochastic time series model are partially or entirely unknown when a realization of the time series is observed. Sometimes the unknown coefficients can be estimated from the realization with the required accuracy, which eventually allows the data handling of the stochastic time series to be optimized. Here it is shown that the recursive least-squares (LS) procedure provides strongly consistent estimates for a linear autoregressive (AR) equation of infinite order obtained from a minimal phase autoregressive moving average (ARMA) equation. The LS identification algorithm is supplemented by a Padé approximation used to estimate the unknown ARMA parameters.
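A minimal sketch of the recursive least-squares idea applied to a finite-order AR model may help orient readers; the paper's infinite-order AR setting and the Padé approximation step are not reproduced here, and all parameter values below are illustrative.

```python
import numpy as np

def fit_ar_rls(y, order, lam=1.0, delta=1000.0):
    """Estimate AR(order) coefficients by recursive least squares.

    y     -- observed realization of the time series
    lam   -- forgetting factor (1.0 reduces to ordinary least squares)
    delta -- initial covariance scale (large = uninformative prior)
    """
    theta = np.zeros(order)        # running coefficient estimates
    P = delta * np.eye(order)      # inverse-information (covariance) matrix
    for t in range(order, len(y)):
        phi = y[t - order:t][::-1]             # regressor: most recent values first
        k = P @ phi / (lam + phi @ P @ phi)    # gain vector
        err = y[t] - phi @ theta               # one-step prediction error
        theta = theta + k * err                # update estimates
        P = (P - np.outer(k, phi @ P)) / lam   # update covariance
    return theta

# Recover the coefficients of a known AR(2) process from one realization
rng = np.random.default_rng(1)
y = np.zeros(2000)
for t in range(2, 2000):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()
theta = fit_ar_rls(y, order=2)
```

Because the update is recursive, each new observation refines the estimates in constant time, which is the property that makes online identification of the model coefficients feasible.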


2018 ◽  
Vol 15 (3) ◽  
pp. 286-293 ◽  
Author(s):  
Jonathan C Hibbard ◽  
Jonathan S Friedstat ◽  
Sonia M Thomas ◽  
Renee E Edkins ◽  
C Scott Hultman ◽  
...  

Background/aims: Laser treatment of burn scars is considered by some providers to be standard of care, yet there is little evidence-based research on its true benefit. Several factors hinder evaluation of that benefit, including significant heterogeneity in patient response and possible delayed effects of laser treatment. Moreover, laser treatments are often provided sequentially using different types of equipment and settings, so there is effectively a large number of overall treatment options to compare. We propose a trial that copes with these issues and also attempts to exploit the heterogeneous response in order to estimate optimal treatment plans personalized to each individual patient. It will be the first large-scale randomized trial to compare the effectiveness of laser treatments for burn scars and, to our knowledge, the first example of the utility of a Sequential Multiple Assignment Randomized Trial (SMART) in plastic surgery.
Methods: We propose a SMART design to investigate the effect of various permutations of laser treatment on hypertrophic burn scars. We will compare and test hypotheses regarding laser treatment effects at the general population level. Simultaneously, we hope to use the data generated to discover beneficial personalized treatment plans tailored to individual patient characteristics.
Results: We show that the proposed trial has good power to detect a laser treatment effect at the overall population level, despite comparing a large number of treatment combinations. The trial will simultaneously provide high-quality data appropriate for estimating precision-medicine treatment rules. We detail the population-level comparisons of interest and the corresponding sample size calculations. We provide simulations to suggest the power of the trial to detect a laser effect and the possible benefits of personalizing laser treatment to individual characteristics.
Conclusion: We propose, to our knowledge, the first use of a SMART in surgery. The trial is rigorously designed so that it is reasonably straightforward to implement and powered to answer general questions of interest. It is also designed to provide data suitable for estimating beneficial precision-medicine treatment rules that depend both on individual patient characteristics and on ongoing, real-time patient response to treatment.
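The Monte Carlo power assessment mentioned in the abstract can be illustrated, in a much-simplified two-arm form, by a sketch like the one below. The actual SMART design compares sequences of treatments rather than two fixed arms, and every number here is hypothetical.

```python
import numpy as np

def simulated_power(n_per_arm, effect, sd, alpha_z=1.96, n_sims=2000, seed=0):
    """Monte Carlo estimate of power for a two-arm comparison of means.

    Simulates many trials; in each, rejects the null when the standardized
    difference in arm means exceeds the two-sided critical value (normal
    approximation). Returns the fraction of simulated trials that reject.
    """
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, sd, n_per_arm)       # control arm outcomes
        b = rng.normal(effect, sd, n_per_arm)    # treated arm outcomes
        se = np.sqrt(a.var(ddof=1) / n_per_arm + b.var(ddof=1) / n_per_arm)
        z = (b.mean() - a.mean()) / se
        if abs(z) > alpha_z:
            hits += 1
    return hits / n_sims

# e.g. 64 patients per arm, standardized effect size 0.5
power = simulated_power(n_per_arm=64, effect=0.5, sd=1.0)
```

Repeating such a calculation over a grid of sample sizes and effect sizes is the usual way simulation informs the sample size calculations described above.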


Author(s):  
Santo Banerjee ◽  
M K Hassan ◽  
Sayan Mukherjee ◽  
A Gowrisankar

2011 ◽  
pp. 130-153 ◽  
Author(s):  
Toshio Tsuji ◽  
Nan Bu ◽  
Osamu Fukuda

In the field of pattern recognition, probabilistic neural networks (PNNs) have proven to be an important class of classifier. For pattern recognition of EMG signals, the characteristics usually used are: (1) amplitude, (2) frequency, and (3) space. However, significant temporal characteristics exist in transient and non-stationary EMG signals, and these cannot be captured by traditional PNNs. In this article, a recurrent PNN, called the recurrent log-linearized Gaussian mixture network (R-LLGMN), is introduced for EMG pattern recognition, with emphasis on utilizing temporal characteristics. The structure of R-LLGMN is based on the algorithm of a hidden Markov model (HMM), a routinely used technique for modeling stochastic time series. Since R-LLGMN inherits advantages from both HMMs and neural computation, it is expected to have higher representation ability and show better performance when dealing with time series such as EMG signals. Experimental results show that R-LLGMN can achieve high discriminant accuracy in EMG pattern recognition.
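Since R-LLGMN is built on the HMM forward algorithm, a bare-bones version of that algorithm may help orient readers. The Gaussian-emission model and all parameters below are illustrative toy values, not the network or data described in the article.

```python
import numpy as np

def forward_loglik(obs, pi, A, means, var):
    """Log-likelihood of a 1-D observation sequence under a Gaussian-emission
    HMM, computed with the (rescaled) forward algorithm."""
    def emit(x):
        # Gaussian emission density evaluated for every hidden state at once
        return np.exp(-0.5 * (x - means) ** 2 / var) / np.sqrt(2 * np.pi * var)

    alpha = pi * emit(obs[0])
    c = alpha.sum()
    loglik = np.log(c)
    alpha = alpha / c
    for x in obs[1:]:
        alpha = (alpha @ A) * emit(x)  # propagate through transitions, weight by emission
        c = alpha.sum()
        loglik += np.log(c)
        alpha = alpha / c              # rescale to prevent numerical underflow
    return loglik

# Classify a sequence by which class model assigns it the higher likelihood
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1],
              [0.1, 0.9]])
obs = np.full(20, 2.0)  # a toy "high-amplitude" signal
ll_hi = forward_loglik(obs, pi, A, np.array([2.0, 2.0]), 1.0)  # high-mean model
ll_lo = forward_loglik(obs, pi, A, np.array([0.0, 0.0]), 1.0)  # low-mean model
```

Comparing per-class sequence likelihoods in this way is the temporal mechanism that a recurrent network like R-LLGMN exploits, in contrast to a traditional PNN that scores each sample independently.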

