Basic Time-to-Event Analyses of Online Educational Data

This chapter introduces basic time-to-event analysis (a variant of "survival analysis") for identifying time-based patterns in learning management system (LMS) data portal datasets, to support empirically grounded theorizing and interpretation. The approach addresses questions such as: How long does it usually take before a particular event occurs? What temporal patterns can be seen in the empirical data? What kinds of analysis and decision-making do those patterns support? The chapter demonstrates the process on several datasets, including assignment submissions and the time until they are graded, learner enrollments and subsequent updates to those enrollments, and group memberships and how long the groups persist.
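As a rough illustration of the kind of analysis the chapter describes (not code taken from the chapter itself), the sketch below estimates a Kaplan–Meier curve for "time from assignment submission to grading" with Python's lifelines package. The column names submit_to_grade_hours and graded are hypothetical placeholders for whatever an actual LMS export provides.

```python
# Minimal sketch: Kaplan-Meier estimate of time until a submission is graded.
# Column names and values are hypothetical, not from the chapter's datasets.
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical LMS export: hours from submission to grading; graded=0 means the
# assignment was still ungraded when the data were pulled (right-censored).
lms = pd.DataFrame({
    "submit_to_grade_hours": [5, 26, 48, 72, 14, 300, 9, 120],
    "graded":                [1,  1,  1,  1,  1,   0, 1,   1],
})

kmf = KaplanMeierFitter()
kmf.fit(lms["submit_to_grade_hours"], event_observed=lms["graded"])

print(kmf.median_survival_time_)   # typical time before a submission is graded
print(kmf.survival_function_)      # P(still ungraded) as a function of time
```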

2020, pp. 181-218. Author(s): Bendix Carstensen

This chapter describes survival analysis, which concerns data where the outcome is a length of time: the time from inclusion in the study (for example, at diagnosis of some disease) until death or some other event, hence the alternative term 'time-to-event analysis'. Two primary targets are normally addressed in survival analysis: survival probabilities and event rates. The chapter then presents the life table estimator and the Kaplan–Meier estimator of the survival function, and considers the Cox model and its relationship with Poisson models, as well as the Fine–Gray approach to competing risks.
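As a hedged illustration of one of the models the chapter covers, the sketch below fits a Cox proportional-hazards model with the Python lifelines package, using its bundled Rossi recidivism dataset purely as stand-in data. The chapter's own examples, its Poisson-model formulation, and the Fine–Gray competing-risks approach are not reproduced here.

```python
# Minimal sketch of a Cox proportional-hazards fit with lifelines,
# using the package's bundled Rossi dataset as stand-in data.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

rossi = load_rossi()                 # columns: week (time), arrest (event), covariates
cph = CoxPHFitter()
cph.fit(rossi, duration_col="week", event_col="arrest")
cph.print_summary()                  # hazard ratios and tests for each covariate

# The same event rates could alternatively be modelled by Poisson regression on
# time-split follow-up, which is the connection the chapter discusses.
```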


Plants, 2020, Vol. 9 (5), pp. 617. Author(s): Alessandro Romano, Piergiorgio Stevanato

Germination data are commonly analyzed with germination indexes or with traditional regression techniques that fit non-linear parametric functions to the cumulative germination curve. Because germination data differ in nature from many other biological data, these methods have limitations, especially when ungerminated seeds remain at the end of an experiment. A class of methods that can address these issues is the so-called "time-to-event analysis", better known in other scientific fields as "survival analysis" or "reliability analysis". The literature on applying these methods to germination data is relatively sparse, and existing reviews have covered only subsets of the possible approaches, either non-parametric and semi-parametric or parametric. The present study contributes to the assessment of these methods by applying all the main approaches to the same germination data from cohorts of sugar beet (Beta vulgaris L.) seeds. The results confirm that, although the different approaches have advantages and disadvantages, they generally provide a valuable tool for analyzing germination data and yield parameters whose usefulness depends on the purpose of the research.
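As a small sketch of the non-parametric and parametric approaches the study compares, the code below treats germination times as time-to-event data, with seeds that never germinated by the end of the trial entered as right-censored. The numbers are invented illustrative values, not the paper's data, and lifelines is only one of several packages that could be used.

```python
# Minimal sketch: seed germination as time-to-event data, with ungerminated
# seeds right-censored at the end of the trial. Values are made up.
import numpy as np
from lifelines import KaplanMeierFitter, WeibullFitter

days_to_germination = np.array([3, 4, 4, 5, 6, 7, 9, 14, 14, 14])  # trial ended at day 14
germinated          = np.array([1, 1, 1, 1, 1, 1, 1,  0,  0,  0])  # 0 = still ungerminated

# Non-parametric estimate of the germination-time distribution
kmf = KaplanMeierFitter().fit(days_to_germination, event_observed=germinated)
print(kmf.median_survival_time_)

# Parametric (Weibull) alternative, giving interpretable scale/shape parameters
wf = WeibullFitter().fit(days_to_germination, event_observed=germinated)
print(wf.lambda_, wf.rho_)
```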


Cephalalgia, 1999, Vol. 19 (6), pp. 552-556. Author(s): C Allen, K Jiang, W Malbecq, PJ Goadsby

Survival analysis, or, more generally, time-to-event analysis, is of interest when the data represent the time to a defined event. While well established in oncology, it has not been widely applied to migraine research, possibly because the data are usually collected intermittently, rather than continuously, and because of the awkwardness of interpreting treatment effect in survival terms. However, it represents an interesting approach for the analysis of time-to-headache relief, which addresses the clinically relevant question of who gets better sooner. The analysis uses data from all time-points to define the likelihood of headache relief following treatment throughout the entire assessment period. These data can then be used to quantify and test the difference between two therapies.
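The abstract notes that migraine data are collected intermittently rather than continuously; the simple sketch below ignores that interval censoring and only illustrates the basic comparison of time-to-headache-relief between two treatment arms with a log-rank test. All times and censoring flags are invented; patients without relief by the end of the assessment period are treated as right-censored.

```python
# Minimal sketch: log-rank comparison of time-to-headache-relief for two arms.
# Values are invented; 0 in the event flags means no relief by end of follow-up.
from lifelines.statistics import logrank_test

relief_hours_drug    = [0.5, 1.0, 1.0, 2.0, 2.0, 4.0]
relieved_drug        = [1,   1,   1,   1,   1,   0  ]
relief_hours_placebo = [1.0, 2.0, 2.0, 4.0, 4.0, 4.0]
relieved_placebo     = [1,   1,   0,   1,   0,   0  ]

result = logrank_test(relief_hours_drug, relief_hours_placebo,
                      event_observed_A=relieved_drug,
                      event_observed_B=relieved_placebo)
print(result.test_statistic, result.p_value)   # who gets better sooner?
```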


2011, Vol. 38 (5), pp. 431. Author(s): A. M. Wubs, E. Heuvelink, L. F. M. Marcelis, L. Hemerik

Time-to-event analysis, or survival analysis, is a method for analysing the timing of events and quantifying the effects of contributing factors. We apply this method to data on the timing of abortion of reproductive organs, which often depends on source and sink strength. We hypothesise that the effect of source and sink strength on abortion rate can be quantified with a statistical model obtained via survival analysis. Flower and fruit abortion in Capsicum annuum L., observed in temperature and planting density experiments, was analysed. Increasing the source strength as well as decreasing the sink strength decreased the abortion rate. The effect was non-linear; for example, source strengths above 6 g CH2O per plant per day did not decrease abortion rates further. The maximum abortion rate occurred around 100 degree-days after anthesis. Analyses in which sink strength was replaced with the number of fruits in a specified age category fitted the data equally well or better. We discuss the advantages and disadvantages of using survival analysis for this kind of data. The technique can also be applied to other crops that show reproductive organ abortion (e.g. soybean (Glycine max L.) and cucumber (Cucumis sativus L.)), as well as to other event types such as bud break or germination.
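One way to quantify covariate effects on an abortion rate is a Cox regression, sketched below with hypothetical column names and values; the paper's own model specification may differ (for example, it may use time-varying covariates or a parametric hazard).

```python
# Minimal sketch: Cox model for organ abortion with source/sink covariates.
# Column names and values are hypothetical, not the paper's data.
import pandas as pd
from lifelines import CoxPHFitter

organs = pd.DataFrame({
    "degree_days":     [80,  95, 100, 110, 120, 140, 150, 160],   # time since anthesis
    "aborted":         [1,   1,   1,   0,   1,   0,   1,   0],    # 0 = organ retained (censored)
    "source_strength": [3.0, 6.5, 4.5, 4.0, 3.5, 7.0, 6.0, 8.0],  # g CH2O per plant per day
    "sink_strength":   [5.0, 3.0, 5.5, 4.5, 6.0, 2.5, 3.0, 2.0],
})

cph = CoxPHFitter()
cph.fit(organs, duration_col="degree_days", event_col="aborted")
print(cph.summary[["coef", "exp(coef)"]])   # direction and size of source/sink effects
```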


2019, Vol. 15 (2), pp. 647-659. Author(s): Zahra Moeini Najafabadi, Mehdi Bijari, Mehdi Khashei

Purpose: This study aims to support investment decisions in stock markets using a forecasting-based Markowitz decision-making approach.
Design/methodology/approach: Rather than calculating the expected rate of return from the historical return distribution, the authors use time series prediction methods, including autoregressive, autoregressive moving average and artificial neural network models.
Findings: The results show that using time series prediction methods significantly improves investment decisions and the performance of the investments.
Originality/value: In contrast to previous studies, the modification of the Markowitz model begins with the expected rate of return: instead of deriving expected returns from the return distribution, time series prediction methods are used to forecast the future return of each asset, and these forecasts replace the expected returns in the Markowitz model. The overall performance of the approach, as well as that of each prediction method, is then evaluated on nine stock market indices.
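A minimal sketch of the idea follows: replace historical-mean expected returns with a one-step time-series forecast per asset, then form Markowitz weights. The AR(1) forecast and the unconstrained closed-form tangency weights are simplifications made for illustration; the paper also uses ARMA and neural-network forecasts, and its portfolio construction and evaluation on nine market indices are not reproduced here.

```python
# Minimal sketch: one-step AR(1) forecasts feeding a Markowitz-style portfolio.
# Data are simulated; the closed-form tangency weights ignore constraints.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.001, 0.02, size=(250, 3))   # fake daily returns, 3 assets

def ar1_forecast(x):
    """One-step-ahead AR(1) forecast fitted by least squares."""
    x_lag, x_now = x[:-1], x[1:]
    slope, intercept = np.polyfit(x_lag, x_now, 1)
    return slope * x[-1] + intercept

mu = np.array([ar1_forecast(returns[:, j]) for j in range(returns.shape[1])])
sigma = np.cov(returns, rowvar=False)

# Unconstrained tangency (maximum Sharpe, zero risk-free rate) weights
raw = np.linalg.solve(sigma, mu)
weights = raw / raw.sum()
print(weights)
```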


2021, Vol. 11 (1). Author(s): Bethany E. Higgins, Giovanni Montesano, Alison M. Binns, David P. Crabb

In age-related macular degeneration (AMD) research, dark adaptation has been found to be a promising functional measurement. In more severe cases of AMD, dark adaptation cannot always be recorded within the maximum time allowed for the test (~20–30 min). Such observations are recorded either as censored data points (capped at the maximum test time) or as an estimated recovery time extrapolated from the trend observed within the recording window. Dark adaptation data can therefore have unusual attributes that standard statistical techniques may not handle well. Here we show that time-to-event analysis is a more powerful method for analysing rod-intercept time data in measuring dark adaptation. For example, at 80% power (α = 0.05), sample sizes were estimated to be 20 and 61 with uncapped (uncensored) and capped (censored) data using a standard t-test; these values improved to 12 and 38 when using the proposed time-to-event analysis. Our method can accommodate both skewed data and censored data points and can significantly reduce sample sizes when planning studies in which this functional test is an outcome measure. The latter matters because more efficient trial and study designs mean that new treatments can be evaluated more quickly.
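The sketch below contrasts the two ways of treating capped rod-intercept times described above: a t-test on the capped values, which ignores censoring, versus a log-rank test that treats times exceeding the test limit as censored. All values are invented for illustration and do not reproduce the paper's data or its sample-size calculations.

```python
# Minimal sketch: capped rod-intercept times analysed naively (t-test) and as
# censored time-to-event data (log-rank test). Values are invented.
import numpy as np
from scipy import stats
from lifelines.statistics import logrank_test

rit_healthy = np.array([6.5, 8.0, 9.2, 10.1, 12.3, 14.0])    # minutes
rit_amd     = np.array([11.0, 15.5, 18.2, 20.0, 20.0, 20.0])  # 20.0 = test time limit

recovered_healthy = np.ones_like(rit_healthy, dtype=int)
recovered_amd     = (rit_amd < 20.0).astype(int)              # 0 = censored at the cap

# Naive approach: t-test on the capped values (censoring ignored)
print(stats.ttest_ind(rit_healthy, rit_amd))

# Time-to-event approach: censored observations handled explicitly
res = logrank_test(rit_healthy, rit_amd,
                   event_observed_A=recovered_healthy,
                   event_observed_B=recovered_amd)
print(res.p_value)
```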


2021, pp. medethics-2020-107134. Author(s): Thana Cristina de Campos-Rudinsky, Eduardo Undurraga

Although empirical evidence may provide a much-desired sense of certainty amidst a pandemic characterised by uncertainty, the vast gamut of available COVID-19 data, including misinformation, has instead increased confusion and distrust in authorities' decisions. One key lesson we have gradually learned from the COVID-19 pandemic is that the availability of empirical data and scientific evidence alone does not automatically lead to good decisions. Good decision-making in public health policy, this paper argues, does depend on the availability of reliable data and rigorous analyses, but it depends above all on sound ethical reasoning that ascribes value and normative judgement to empirical facts.

