total event
Recently Published Documents

Total documents: 22 (five years: 2)
H-index: 6 (five years: 0)

2021 ◽  
pp. 000276422110216
Author(s):  
Scott Althaus ◽  
Buddy Peyton ◽  
Dan Shalmon

Understanding how useful any particular set of event data might be for conflict research requires appropriate methods for assessing validity when ground truth data about the population of interest do not exist. We argue that a total error framework can provide better leverage on these critical questions than previous methods have been able to deliver. We first define a total event data error approach for identifying 19 types of error that can affect the validity of event data. We then address the challenge of applying a total error framework when authoritative ground truth about the actual distribution of relevant events is lacking. We argue that carefully constructed gold standard datasets can effectively benchmark validity problems even in the absence of ground truth data about event populations. To illustrate the limitations of conventional strategies for validating event data, we present a case study of Boko Haram activity in Nigeria over a 3-month offensive in 2015 that compares events generated by six prominent event extraction pipelines—ACLED, SCAD, ICEWS, GDELT, PETRARCH, and the Cline Center’s SPEED project. We conclude that conventional ways of assessing validity in event data using only published datasets offer little insight into potential sources of error or bias. Finally, we illustrate the benefits of validating event data using a total error approach by showing how the gold standard approach used to validate SPEED data offers a clear and robust method for detecting and evaluating the severity of temporal errors in event data.
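The temporal-error benchmarking described above can be illustrated with a minimal sketch: given a gold-standard set of event dates and the dates reported by an extraction pipeline, the signed per-event offset quantifies temporal error. The event IDs and dates below are invented for illustration; real validation would first require record linkage between the gold standard and each pipeline's output.

```python
from datetime import date

# Hypothetical gold-standard and pipeline-extracted events, keyed by a
# shared event ID (an illustrative assumption, not a real dataset).
gold = {"e1": date(2015, 1, 10), "e2": date(2015, 1, 14), "e3": date(2015, 2, 2)}
extracted = {"e1": date(2015, 1, 10), "e2": date(2015, 1, 16), "e3": date(2015, 2, 1)}

# Temporal error: signed offset in days between reported and true dates.
offsets = {k: (extracted[k] - gold[k]).days for k in gold if k in extracted}

# Two simple severity summaries: mean absolute offset, and the share of
# events dated exactly right.
mean_abs_error = sum(abs(d) for d in offsets.values()) / len(offsets)
share_exact = sum(1 for d in offsets.values() if d == 0) / len(offsets)
```

The same matched-pair structure extends to the other error types in a total error framework (e.g., geolocation offsets or actor misattribution), with only the per-event comparison function changing.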



2021 ◽  
Vol 251 ◽  
pp. 04023
Author(s):  
Gilbert Badaro ◽  
Ulf Behrens ◽  
Andrea Bocci ◽  
James Branson ◽  
Philipp Brummer ◽  
...  

The High Luminosity LHC (HL-LHC) will start operating in 2027 after the third Long Shutdown (LS3), and is designed to provide an ultimate instantaneous luminosity of 7.5 × 10³⁴ cm⁻² s⁻¹, at the price of extreme pileup of up to 200 interactions per crossing. The number of overlapping interactions in HL-LHC collisions, their density, and the resulting intense radiation environment warrant an almost complete upgrade of the CMS detector. The upgraded CMS detector will be read out by approximately fifty thousand high-speed front-end optical links at an unprecedented data rate of up to 80 Tb/s, for an average expected total event size of approximately 8–10 MB. Following the present established design, the CMS trigger and data acquisition system will continue to feature two trigger levels: a single synchronous hardware-based Level-1 Trigger (L1), consisting of custom electronic boards and operating on dedicated data streams, and a second level, the High Level Trigger (HLT), using software algorithms running asynchronously on standard processors and making use of the full detector data to select events for offline storage and analysis. The upgraded CMS data acquisition system will collect data fragments for Level-1 accepted events from the detector back-end modules at a rate of up to 750 kHz, aggregate fragments corresponding to individual Level-1 accepts into events, and distribute them to the HLT processors, where they will be filtered further. Events accepted by the HLT will be stored permanently at a rate of up to 7.5 kHz. This paper describes the baseline design of the DAQ and HLT systems for Phase-2 of CMS.
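The quoted rates determine the aggregate bandwidths by simple arithmetic: the Level-1 accept rate times the event size gives the event-builder input throughput, and the HLT output rate times the event size gives the bandwidth to permanent storage. A back-of-the-envelope check, taking the lower end (8 MB) of the expected event size:

```python
# Rates and sizes quoted in the abstract above.
l1_rate_hz = 750e3      # Level-1 accept rate: 750 kHz
event_size_mb = 8       # lower end of the 8-10 MB expected event size
hlt_rate_hz = 7.5e3     # HLT output rate to storage: 7.5 kHz

# Event-builder input: MB/s * 8 bits/byte, converted megabit -> terabit.
daq_throughput_tbps = l1_rate_hz * event_size_mb * 8 / 1e6

# Permanent-storage bandwidth, converted MB/s -> GB/s.
storage_gbs = hlt_rate_hz * event_size_mb / 1e3
```

This gives 48 Tb/s into the event builder (rising to 60 Tb/s at 10 MB events), consistent with the 80 Tb/s front-end link capacity quoted above, and 60 GB/s to storage.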



2020 ◽  
Author(s):  
Máté Krisztián Kardos ◽  
Péter Budai ◽  
Adrienne Clement ◽  
Marcell Knolmár

Besides agricultural land, settlement areas are among the primary sources of diffuse contamination of surface waters. Both organic and inorganic compounds originate from the wash-off of road and roof surfaces and industrial areas, as well as from illegal wastewater discharge.

In an 18-month measurement campaign, flow-triggered composite water samples were gathered using an automatic sampler, partly in small urban creeks draining settlement areas and partly from storm water channels, in 7 mid-sized to large towns (30,000 to 1,800,000 inhabitants) in Hungary. Besides the automatic samples, characteristic runoff events were manually grab-sampled, yielding time series of the contaminants. Both types of samples were analyzed for the total amount of nutrients (N and P), heavy metals (As, Cd, Cr, Cu, Hg, Ni, Pb, Sb, Zn) and 16 PAH forms.

In this contribution, the first results of the sample analyses are presented. The concentration of the measured contaminants is significantly higher during runoff events than in dry periods and can be linked to the extent of road and roof areas in the catchment. Flow-triggered composite water samples are efficient for estimating total event loads, which were calculated for the pilot catchment areas.
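The total event load from a flow-triggered composite sample follows from multiplying the composite concentration by the total runoff volume of the event. A minimal sketch with invented discharge and concentration values (not data from the campaign):

```python
# Illustrative discharge record at 10-minute intervals during one runoff
# event, and the analyzed concentration of the flow-triggered composite
# sample (e.g. total P). All numbers are assumptions for the sketch.
flow_m3s = [0.05, 0.40, 0.75, 0.30, 0.08]   # discharge, m3/s
interval_s = 600                             # 10 minutes per reading
composite_conc_mg_l = 0.85                   # composite concentration, mg/L

# Total runoff volume of the event.
event_volume_m3 = sum(q * interval_s for q in flow_m3s)

# 1 m3 = 1000 L and 1000 mg = 1 g, so (mg/L) * m3 comes out in grams.
event_load_g = composite_conc_mg_l * event_volume_m3
```

Because the sampler composites proportionally to flow, a single laboratory analysis per event suffices for the load estimate, which is what makes the method efficient for campaigns across many catchments.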



2020 ◽  
Author(s):  
Lorenzo Marchi ◽  
Massimo Arattano ◽  
Marco Cavalli ◽  
Federico Cazorzi ◽  
Stefano Crema ◽  
...  

Debris-flow research requires experimental data that are difficult to collect because of the intrinsic characteristics of these processes. Both post-event field observations and monitoring in instrumented channels are suitable for collecting debris-flow field data, albeit with different resolutions and purposes. Monitoring in instrumented channels enables the recording of data that cannot be gathered by means of post-event surveys in ungauged channels. Extending monitoring activities over multidecadal time intervals increases the significance of the collected data, because longer time series permit recognizing changes in debris-flow response as a consequence of changes in controlling factors, such as climate, land use, and the implementation of control works.

This paper presents debris-flow data recorded in the Moscardo Torrent (eastern Italian Alps) between 1990 and 2019. As far as we know, the Moscardo Torrent basin was the first catchment in Europe equipped with permanent instrumentation for debris-flow monitoring. The monitoring activities in the Moscardo Torrent began in 1989-1990 and are still ongoing, although with some gaps due to the implementation of control works in the instrumented channel (1998-2000) and the obsolescence of the instrumentation between 2007 and 2010.

Thirty debris flows were observed between 1990 and 2019; 26 of them were monitored by sensors installed on the channel (at two measuring stations for most events), while four debris flows were documented by means of post-event observations. Monitored data consist of debris-flow hydrographs, measured by means of ultrasonic sensors, and rainfall. Debris flows in the Moscardo Torrent occur from early June to the end of September, with higher frequency in the first part of summer.

This contribution presents data on triggering rainfall, flow velocity, peak discharge and volume for the monitored hydrographs. The relatively large number of debris-flow events recorded in the Moscardo Torrent has made it possible to recognize the main characteristics of the debris-flow hydrographs. We used the data on the duration and maximum depth of the debris-flow surges to define triangular hydrographs corresponding to different event severities. Simplified triangular hydrographs show the distinctive features of debris flows (short total event duration and very short time to peak) and can help define realistic inputs to debris-flow propagation models. A more detailed representation of hydrograph shape was achieved by averaging the recorded hydrographs of debris-flow surges. This analysis was performed on the debris flows recorded between 2002 and 2019: data for 12 surges were available at each of the two flow measuring stations. Dimensionless hydrographs were generated by normalizing the flow depth by its maximum value and the time by the total surge duration. Flow peaks were aligned to preserve the sharp shape that is a distinctive feature of debris-flow hydrographs. Finally, the ordinates were averaged, and mean debris-flow hydrographs were obtained.

Debris-flow data collected in the Moscardo Torrent could contribute to further analyses, including the comparison of triggering rainfall and flow variables with those recorded in other basins instrumented for debris-flow monitoring under different climate and geolithological conditions.
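The normalization procedure described above (flow depth scaled by its maximum, time by total surge duration, peaks aligned before averaging) can be sketched as follows. The surge shapes, the chosen dimensionless peak position of 0.2, and the grid size are illustrative assumptions, not the Moscardo records:

```python
import numpy as np

def dimensionless(depth, peak_frac=0.2, n=101):
    """Resample one surge onto n points of [0, 1], peak moved to peak_frac."""
    depth = np.asarray(depth, dtype=float)
    t = np.linspace(0.0, 1.0, len(depth))   # time / total surge duration
    h = depth / depth.max()                 # flow depth / maximum flow depth
    shift = peak_frac - t[np.argmax(h)]     # align every peak at peak_frac
    grid = np.linspace(0.0, 1.0, n)
    return np.interp(grid, t + shift, h, left=0.0, right=0.0)

# Two synthetic surges of different length and magnitude.
surges = [[0, 2, 5, 3, 2, 1, 0], [0, 4, 9, 6, 2, 0]]

# Averaging the aligned ordinates yields the mean dimensionless hydrograph;
# because the peaks coincide, the sharp peak is preserved at height 1.
mean_hydrograph = np.mean([dimensionless(s) for s in surges], axis=0)
```

Averaging without the peak alignment would smear the maxima of surges that peak at different relative times, flattening exactly the feature the authors set out to preserve.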



2019 ◽  
pp. 004912411988245
Author(s):  
Leila Demarest ◽  
Arnim Langer

While conflict event data sets are increasingly used in contemporary conflict research, important concerns persist regarding the quality of the collected data. Such concerns are not necessarily new. Yet, because the methodological debate and evidence on potential errors remain scattered across different subdisciplines of the social sciences, there is little consensus concerning proper reporting practices in codebooks, how best to deal with the different types of errors, and which types of errors should be prioritised. In this article, we introduce a new analytical framework—that is, the Total Event Error (TEE) framework—which aims to elucidate the methodological challenges and errors that may affect whether and how events are entered into conflict event data sets, drawing on different fields of study. Potential errors are diverse and may range from errors arising from the rationale of the media source (e.g., selection of certain types of events into the news) to errors occurring during the data collection process or the analysis phase. Based on the TEE framework, we propose a set of strategies to mitigate errors associated with the construction and use of conflict event data sets. We also identify a number of important avenues for future research concerning the methodology of creating conflict event data sets.



2019 ◽  
Vol 6 (2) ◽  
pp. 97-103 ◽  
Author(s):  
Heinz Drexel ◽  
Giuseppe M C Rosano ◽  
Basil S Lewis ◽  
Kurt Huber ◽  
Alexander Vonbank ◽  
...  

Abstract Randomized clinical trials (RCTs) are the gold standard for evaluating drugs in modern cardiovascular (CV) therapy. The cornerstone of RCTs is the recording of hard clinical endpoints instead of surrogates, and selecting an appropriate endpoint is important. Efficacy endpoints must be clinically relevant and can be ordered hierarchically. A particularly interesting innovation in endpoint acquisition is the total event paradigm, in which all events (first and recurrent) are counted rather than only the first event per patient.



2018 ◽  
Vol 12 (3) ◽  
pp. 273-278 ◽  
Author(s):  
B. Hedrick ◽  
F. K. Gettys ◽  
S. Richards ◽  
R. D. Muchow ◽  
C.-H. Jo ◽  
...  

Purpose The Ponseti method of treatment is the standard of care for idiopathic clubfoot. Following serial casting, percutaneous tendo-Achilles tenotomy (TAT) is performed to correct residual equinus. This procedure can be performed in either the outpatient clinic or the operating room. The purpose of this study was to evaluate the expense of this procedure by examining hospital charges in both settings.

Methods We retrospectively reviewed charts of 382 idiopathic clubfoot patients with a mean age of 2.4 months (0.6 to 26.6) treated with the Ponseti method at three institutions. Patients were divided into three groups depending on the setting for the TAT procedure: 140 patients in the outpatient clinic (CL), 219 in the operating room with discharge following the procedure (OR) and 23 in the operating room with admission to hospital for observation (OR+). Medical records were reviewed to analyze age, deformity, perioperative complications and specific time spent in each setting. Hospital charges for all three groups were standardized to one institution’s charge structure.

Results Charges among the three groups undergoing TAT (CL, OR, OR+) were found to be significantly different ($3840.60 versus $7962.30 versus $9110.00, respectively; p ≤ 0.001), and remained significant when separating unilateral and bilateral deformities (p < 0.001). There were nine total perioperative complications (six returns to the ER and three unexpected admissions to the hospital): five (2.3%) in the OR group, four (17.4%) in the OR+ group and none in the CL group. The OR+ group had a statistically higher rate of complications compared with the other two groups (p = 0.006). The total event time of the CL group was significantly shorter compared with the OR and OR+ groups (129.1, 171.7 and 1571.6 minutes respectively; p < 0.001).

Conclusion Hospital charges and total event time were significantly less when percutaneous TAT was performed in the outpatient clinic compared with the operating room. In addition, performing the procedure in clinic was associated with the lowest rate of complications.

Level of Evidence Therapeutic, Level III



2018 ◽  
Vol 11 (2) ◽  
pp. 327-337 ◽  
Author(s):  
Min-Ge Xie ◽  
John Kolassa ◽  
Dungang Liu ◽  
Regina Liu ◽  
Sifan Liu


2017 ◽  
Vol 365 ◽  
pp. 93-105 ◽  
Author(s):  
Bin Wen ◽  
Nickitas Georgas ◽  
Charles Dujardins ◽  
Anand Kumaraswamy ◽  
Alan Cohn


Stroke ◽  
2015 ◽  
Vol 46 (suppl_1) ◽  
Author(s):  
Yasumasa Yamamoto ◽  
Yoshinari Nagakane ◽  
Naoki Makita ◽  
Shinji Ashida

Purpose: The present study aimed to assess the rate of and predictors for early recurrence or worsening after transient ischemic attack (TIA) or minor ischemic stroke (MIS).

Methods: From 1806 consecutive patients with acute ischemic stroke, 474 patients with TIA or MIS were studied. MIS was defined by an NIH Stroke Scale (NIHSS) score ≤3. The primary outcome was total events, comprising new-onset stroke or TIA and early worsening in patients with MIS in the first 90 days. Worsening was defined as clinical deterioration by ≥2 points on the NIHSS. Patients were classified into 6 stroke subtypes, i.e., G1: Intracranial atherothrombotic (ATB) (n=53), G2: Extracranial ATB (34), G3: Cardioembolic (85), G4: Penetrating artery (PA) disease (200), G5: Coagulopathy (9), and G6: Other embolism, including paradoxical (22), aortogenic (31) and cryptogenic (44) embolism. Patients were also classified into 4 groups in terms of diffusion weighted image (DWI) pattern, i.e., D1: Single cortical and subcortical (n=53), D2: Multiple cortical and subcortical (108), D3: Penetrating artery territory (209) and D4: None (90).

Results: Penetrating artery disease was the most prevalent stroke subtype (42.1%). There were 83 total events, of which 65% were worsening. Higher NIHSS (≥2) at admission and positive DWI were significantly more frequent in the group with events than without (OR: 1.67 and 2.39, respectively). Most worsening occurred within 6 days. The incidence of total events/worsening in the different stroke subtypes was G1: 13 (24.5%)/8 (15.0%), G2: 8 (23.4)/4 (11.7), G3: 13 (15.2)/8 (9.4), G4: 41 (20.5)/32 (16.0), G5: 2 (22.2)/1 (11.1), and G6: 6 (6.4)/3 (3.2). Total events were higher in G1, G2 and G4, and worsening was more frequent in G1 (60% of total events) and G4 (80%). The incidence of total events/worsening in the different DWI patterns was D1: 8 (11.9%)/5 (7.4%), D2: 21 (19.4)/14 (12.9), D3: 46 (22.0)/36 (17.2), D4: 8 (8.8)/0 (0). Total events were higher in D2 and D3, of which worsening accounted for 66.6% in D2 and 78.2% in D3.

Conclusions: The majority of events after TIA/MIS were episodes of worsening, which were especially prevalent in patients with intracranial ATB or PA disease and in those with DWI lesions in multiple cortical regions or the PA territory. A strategy to halt progressive stroke should be tailored accordingly.


