event distribution
Recently Published Documents


TOTAL DOCUMENTS

57
(FIVE YEARS 20)

H-INDEX

10
(FIVE YEARS 3)

2022 ◽  
Author(s):  
P. Gubarev

Abstract. The authors analyze the current state of the problem under consideration. A definition of the "averaged failure flow parameter" is given, and the periods of the traction rolling stock life cycle are considered. The assumption that the event distribution laws are exponential is introduced, which makes it possible to obtain expressions for the main reliability indices in analytical form. The work of depot service locomotives in ensuring the required reliability and readiness of the rolling stock during normal operation is assessed, and the introduction of the term "readiness" into the modern practice of traction rolling stock reliability estimation is considered. The initial data for calculating the indices of locomotive uptime and readiness are presented. Calculated values of the readiness and no-failure indices of electric locomotives in operation are obtained, and the calculated values of the internal and technical availability coefficients are compared with the analogous indicators established by the technical specifications. Control procedures were performed to determine the compliance of each set of locomotives (EP1, 2ES4K) with the uptime requirements. Comparison of the calculated internal and technical availability factors for the EP1 and 2ES4K electric locomotives with the analogous values set by the specifications showed that the surveyed locomotives comply with the established availability requirements. The control procedures showed that the set of 2ES4K electric locomotives does not fully comply with the uptime requirement for the run in question, while the set of EP1 electric locomotives meets the reliability requirements, although with an error value higher than 20%.
To clarify both outcomes, it is necessary to increase the mileage interval of the locomotives and repeat the procedure for determining compliance with the uptime requirements. The method of assessing the uptime and readiness of locomotives during normal operation makes it possible to identify existing shortcomings in the operation of rolling stock and to formulate measures to improve the quality of rolling stock operation.
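Under the exponential assumption named in the abstract, the main availability indices have simple closed forms. The sketch below is a minimal illustration of the stationary availability coefficient and the averaged failure flow parameter; the function names and all numeric values are illustrative assumptions, not figures from the paper.

```python
# Sketch: steady-state availability under the exponential assumption.
# With exponentially distributed time-to-failure and time-to-restore,
# the stationary availability coefficient is
#   K = MTBF / (MTBF + MTTR).
# All numbers below are illustrative, not from the paper.

def availability(mtbf_h: float, mttr_h: float) -> float:
    """Stationary availability coefficient for an exponential model."""
    return mtbf_h / (mtbf_h + mttr_h)

def failure_flow_parameter(n_failures: int, total_run_km: float) -> float:
    """Averaged failure flow parameter: failures per unit of run."""
    return n_failures / total_run_km

# Illustrative values for a hypothetical locomotive set:
k = availability(mtbf_h=900.0, mttr_h=18.0)
omega = failure_flow_parameter(n_failures=42, total_run_km=2.1e6)
```

The compliance check described in the abstract then reduces to comparing such calculated coefficients against the thresholds fixed in the technical specifications.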


2021 ◽  
pp. 096228022110239
Author(s):  
Shaun R Seaman ◽  
Anne Presanis ◽  
Christopher Jackson

Time-to-event data are right-truncated if only individuals who have experienced the event by a certain time can be included in the sample. For example, we may be interested in estimating the distribution of time from onset of disease symptoms to death and only have data on individuals who have died. This may be the case, for example, at the beginning of an epidemic. Right truncation causes the distribution of times to event in the sample to be biased towards shorter times compared to the population distribution, and appropriate statistical methods should be used to account for this bias. This article is a review of such methods, particularly in the context of an infectious disease epidemic, like COVID-19. We consider methods for estimating the marginal time-to-event distribution, and compare their efficiencies. (Non-)identifiability of the distribution is an important issue with right-truncated data, particularly at the beginning of an epidemic, and this is discussed in detail. We also review methods for estimating the effects of covariates on the time to event. An illustration of the application of many of these methods is provided, using data on individuals who had died with coronavirus disease by 5 April 2020.
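The downward bias that the review describes, and its correction, can be sketched for the simplest case of exponentially distributed event times truncated at a fixed cutoff. The conditional maximum-likelihood step below (maximizing f(t; λ)/F(cutoff; λ) over the truncated sample) is a minimal illustration under that assumption, not one of the specific estimators reviewed in the article.

```python
import math
import random

random.seed(1)
rate = 0.2          # true hazard; true mean time to event = 5.0
cutoff = 6.0        # only events that occurred by this time are sampled

pop = [random.expovariate(rate) for _ in range(200_000)]
observed = [t for t in pop if t <= cutoff]      # right-truncated sample

naive_mean = sum(observed) / len(observed)      # biased towards shorter times

# Conditional MLE: maximize sum_i [log f(t_i; lam) - log F(cutoff; lam)].
# The score equation is solved by bisection (score is decreasing in lam).
def score(lam):
    n, s = len(observed), sum(observed)
    e = math.exp(-lam * cutoff)
    return n / lam - s - n * cutoff * e / (1.0 - e)

lo, hi = 1e-4, 5.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if score(mid) > 0:
        lo = mid
    else:
        hi = mid
lam_hat = 0.5 * (lo + hi)
corrected_mean = 1.0 / lam_hat   # close to the true mean of 5.0
```

The naive sample mean lands near 2.4 here, illustrating the bias towards shorter times; conditioning on the truncation event recovers the population mean.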


Author(s):  
T. Aronova ◽  
G. Aronov ◽  
T. Protasovitskaya ◽  
A. Aronova

An annual review of the seismicity in the territory of Belarus, based on the data of two analog stations (operated in the first half-year) and seventeen digital stations, is presented. A total of 80 events with Кd=4.6–8.4 were recorded, all of them confined to the southern part of the territory, including the Soligorsk mining area. A map of all the event epicenters for 2015 is given, and a table of the distribution of the seismic events by energy class and released seismic energy per month is presented. The maximum seismic energy release fell in August, and the maximum number of events was observed in July. The level of seismic energy released in 2015 is the same as in 2014 but 2.18 times lower than its long-term average for 1983–2014. The number of events in 2015 is 1.4 times greater than in 2014 and 1.86 times greater than the Nср value for the previous 32 years. The distribution of the earthquakes by depth interval showed that the earthquake foci are mostly located in the upper 20 km of the Earth's crust; the foci of 47 earthquakes, however, are located at depths above 10 km. The distribution of all the events in 2015 is presented in real time, and quiet seismic periods and periods of seismic activation were determined. The distribution of the seismic events by hourly interval revealed the periods of increased seismic event numbers, and the maximum and minimum values of N in the distribution of the seismic events by day of the week were determined.
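The monthly tallies of event counts and released energy described above can be sketched directly, assuming the usual convention that energy class K corresponds to released energy E = 10^K joules. The (month, K) pairs below are made up for illustration and are not from the catalogue.

```python
# Sketch: aggregating a seismic catalogue by month, assuming energy class K
# means released energy E = 10**K joules. Synthetic (month, K) pairs only.

catalogue = [(1, 5.1), (3, 6.0), (7, 4.8), (7, 5.5), (7, 6.2),
             (8, 8.4), (8, 5.0), (10, 6.6), (12, 4.6)]

events_per_month = {}
energy_per_month = {}
for month, k in catalogue:
    events_per_month[month] = events_per_month.get(month, 0) + 1
    energy_per_month[month] = energy_per_month.get(month, 0.0) + 10.0 ** k

busiest_month = max(events_per_month, key=events_per_month.get)
most_energetic_month = max(energy_per_month, key=energy_per_month.get)
```

Note how the two maxima need not coincide: a single high-class event dominates the energy sum, mirroring the paper's finding that the energy maximum (August) and count maximum (July) fell in different months.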


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Jaclyn M. Beca ◽  
Kelvin K. W. Chan ◽  
David M. J. Naimark ◽  
Petros Pechlivanoglou

Abstract Introduction Extrapolation of time-to-event data from clinical trials is commonly used in decision models for health technology assessment (HTA). The objective of this study was to assess performance of standard parametric survival analysis techniques for extrapolation of time-to-event data for a single event from clinical trials with limited data due to small samples or short follow-up. Methods Simulated populations with 50,000 individuals were generated with an exponential hazard rate for the event of interest. A scenario consisted of 5000 repetitions with six sample size groups (30–500 patients) artificially censored after every 10% of events observed. Goodness-of-fit statistics (AIC, BIC) were used to determine the best-fitting among standard parametric distributions (exponential, Weibull, log-normal, log-logistic, generalized gamma, Gompertz). Median survival, one-year survival probability, time horizon (1% survival time, or 99th percentile of survival distribution) and restricted mean survival time (RMST) were compared to population values to assess coverage and error (e.g., mean absolute percentage error). Results The true exponential distribution was correctly identified using goodness-of-fit according to BIC more frequently compared to AIC (average 92% vs 68%). Under-coverage and large errors were observed for all outcomes when distributions were specified by AIC and for time horizon and RMST with BIC. Errors in point estimates were strongly associated with sample size and completeness of follow-up. Small samples produced larger average error, even with complete follow-up, than large samples with short follow-up. Correctly specifying the event distribution reduced the magnitude of error in larger samples but not in smaller samples. Conclusions Limited clinical data from small samples, or short follow-up of large samples, produce large errors in estimates relevant to HTA regardless of whether the correct distribution is specified.
The associated uncertainty in estimated parameters may not capture the true population values. Decision models that base lifetime time horizon on the model’s extrapolated output are not likely to reliably estimate mean survival or its uncertainty. For data with an exponential event distribution, BIC more reliably identified the true distribution than AIC. These findings have important implications for health decision modelling and HTA of novel therapies seeking approval with limited evidence.
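For the exponential case studied here, the four outcome measures compared in the simulation have closed forms, and the AIC/BIC criteria used for model selection are one-line formulas. The sketch below is a minimal illustration; the rate and the log-likelihood values are assumed numbers, not results from the study.

```python
import math

# Outcome measures for an exponential survival distribution S(t) = exp(-rate*t).
# The rate is illustrative only.
rate = 0.5                                      # events per year

median_survival = math.log(2) / rate            # S(t) = 0.5
one_year_survival = math.exp(-rate * 1.0)       # S(1)
time_horizon = -math.log(0.01) / rate           # 1% survival time (99th pct)
mean_survival = 1.0 / rate

def rmst(tau):
    """Restricted mean survival time over [0, tau]."""
    return (1.0 - math.exp(-rate * tau)) / rate

# Goodness-of-fit criteria used to pick among candidate distributions:
def aic(loglik, n_params):
    return 2 * n_params - 2 * loglik

def bic(loglik, n_params, n_obs):
    return n_params * math.log(n_obs) - 2 * loglik
```

Since ln(n) exceeds 2 once n ≥ 8, BIC penalizes extra parameters more heavily than AIC for any realistic trial size, which is consistent with BIC more often selecting the true one-parameter exponential model.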


2021 ◽  
Vol 873 (1) ◽  
pp. 012066
Author(s):  
P A Subakti ◽  
M I Sulaiman ◽  
D Y Faimah ◽  
I Madrinovella ◽  
I Herawati ◽  
...  

Abstract The Seram Trough is located in the northern part of Indonesia and has a complex tectonic setting. The uniqueness of the region lies in its U-shaped subduction system. Several models have been proposed for this region, such as a single subduction system rotated by 90° or 180°, two subduction systems, and a single subduction with slab roll-back causing an extensional system. In this study, we invert for velocity and seismicity using double-difference tomography with the aim of better imaging the subsurface structure in the region. We use an earthquake catalogue from the Indonesian Agency of Meteorology, Climatology, and Geophysics, spanning January 2015 to December 2019 and recorded by 16 permanent stations. Earthquake relocations show a focused hypocenter distribution at shallow depth, and we interpret some of these shallow events as related to magmatic activity. The event distribution also displays a steep seismicity pattern that represents the dipping subduction slab. The inverted tomography models show a band of faster velocities dipping from north to south, suggesting a subducting slab. We also observe a possible tear in the slab in both the seismicity pattern and the tomogram model. A slower velocity perturbation is seen at shallow depth, which may be associated with magmatic activity and frequent shallow seismicity. Partial melting is also possible, as suggested by a low-velocity zone at a depth of 70 km next to the fast dipping velocity anomaly.
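The "double-difference" in double-difference tomography refers to differencing residuals between event pairs observed at a common station, which cancels much of the shared path and station error. A minimal sketch of that residual, with illustrative arrival times rather than catalogue values:

```python
# Sketch of the double-difference residual for an event pair (i, j)
# observed at the same station:
#   dr_ij = (t_i_obs - t_j_obs) - (t_i_calc - t_j_calc)
# Arrival times below (seconds) are illustrative only.

def double_difference(t_obs_i, t_obs_j, t_calc_i, t_calc_j):
    """Residual of the observed vs. predicted differential travel time."""
    return (t_obs_i - t_obs_j) - (t_calc_i - t_calc_j)

dr = double_difference(12.84, 12.31, 12.80, 12.35)   # seconds
```

Minimizing these pair residuals jointly over many events and stations is what sharpens the relative hypocenter locations and, in the tomographic variant, the velocity model between the events.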


2021 ◽  
Author(s):  
David Harris ◽  
Douglas Dodge

Under special circumstances, waveform observations of seismic events related by a common, spatially-distributed source process exhibit a geometric architecture that is a distorted image of event distribution in the source region. We describe a prescription for visualizing this signal space image and use a machine-learning algorithm, Isomap, and an algorithm due to Menke to invert collections of waveforms directly for relative event location. We illustrate concepts and methods with well-characterized induced seismicity at a coal mine in the U.S. state of Utah observed by two local seismic instruments, and with synthetics. We anticipate application of these methods to repetitive volcanic seismicity, icequakes and induced seismicity.

This is Lawrence Livermore National Laboratory contribution LLNL-ABS-818445.
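The idea of inverting pairwise signal-space distances for a relative geometry, up to translation and reflection, can be shown in a toy one-dimensional setting. This is not the authors' Isomap or Menke procedure, just a minimal analogue with a synthetic distance matrix: anchor one event at the origin, a second on the positive axis, and choose each remaining sign to honour the distances.

```python
# Toy sketch: recover relative 1-D event positions from pairwise distances
# alone (up to translation and reflection). The configuration is synthetic.

true_x = [0.0, 3.0, 5.0, -2.0, 8.5]
d = [[abs(a - b) for b in true_x] for a in true_x]   # "signal-space" distances

# Anchor event 0 at the origin and event 1 on the positive axis, then pick
# each remaining sign so the distance to event 1 is also honoured.
x = [0.0, d[0][1]]
for k in range(2, len(d)):
    plus, minus = d[0][k], -d[0][k]
    err_plus = abs(abs(plus - x[1]) - d[1][k])
    err_minus = abs(abs(minus - x[1]) - d[1][k])
    x.append(plus if err_plus < err_minus else minus)
```

With exact distances the recovered configuration reproduces the true one; with the distorted distances of real waveform correlation, methods such as Isomap are needed, and residual distortion remains, as the abstract notes.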


2021 ◽  
Vol 9 (1) ◽  
pp. T1-T7
Author(s):  
Dewei Li ◽  
Ruizhao Yang ◽  
Lingbin Meng ◽  
Wang Li

Many factors can impact the location data of microseismic events, including natural fractures, rock lithology, in situ stress, and hydraulic-fracturing parameters. The distribution of microseismic events generally tends toward highly brittle areas or areas rich in brittle minerals. Moreover, location data of microseismic events lack effective evaluation methods. Therefore, we have developed a method that uses lithologic information and prestack seismic data to explain the distribution of the microseismic events of well Tian Xing. We analyzed the brittleness of the target formation through well logs and core data. We inverted the Young's modulus and Poisson's ratio using simultaneous amplitude-variation-with-offset inversion of the prestack seismic data. We then computed the 3D brittleness index (BI) property volume by Grieser and Rickman's method. The microseismic event distribution and the BI map were then combined to show the internal relationship between the two results. The well logs and core analysis demonstrated that the target formation has high brittleness. Generally, areas with more natural fractures have a higher probability of inducing hydraulic fractures. However, the analysis results show that the BI also has an impact on the distribution of hydraulic fractures; the BI explains the distribution of almost all events to the northeast of the perforation. These observations also support the concept that microseismic events preferentially grow toward more brittle areas.
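The Rickman-style brittleness index combines Young's modulus and Poisson's ratio, each normalized over the volume's range, with low Poisson's ratio counting as brittle. The sketch below shows that commonly used form; the parameter values are illustrative assumptions, not numbers from the paper.

```python
# Sketch of a Rickman-style brittleness index (percent). Elastic moduli are
# normalised over the min/max of the property volume; lower Poisson's ratio
# means more brittle, hence the reversed normalisation for pr.
# All numeric inputs below are illustrative, not from the paper.

def brittleness_index(ym, pr, ym_min, ym_max, pr_min, pr_max):
    """Average of normalised Young's modulus (ym) and Poisson's ratio (pr)."""
    bi_ym = (ym - ym_min) / (ym_max - ym_min) * 100.0
    bi_pr = (pr - pr_max) / (pr_min - pr_max) * 100.0   # low pr => brittle
    return 0.5 * (bi_ym + bi_pr)

bi = brittleness_index(ym=45.0, pr=0.22, ym_min=20.0, ym_max=60.0,
                       pr_min=0.15, pr_max=0.35)
```

Applied voxel-by-voxel to the inverted Young's modulus and Poisson's ratio volumes, this yields the 3D BI property volume that is overlaid on the microseismic event distribution.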


2021 ◽  
Vol 251 ◽  
pp. 04001
Author(s):  
Rafał Dominik Krawczyk ◽  
Flavio Pisani ◽  
Tommaso Colombo ◽  
Markus Frank ◽  
Niko Neufeld

This paper evaluates the real-time distribution of data over Ethernet for the upgraded LHCb data acquisition cluster at CERN. The system commissioning ends in 2021, and its total estimated input throughput is 32 Terabits per second. After the events are assembled, they must be distributed to the filtering farm of the online trigger for further data selection. High throughput and very low transmission overhead are essential features of such a system. In this work, the RoCE (Remote Direct Memory Access over Converged Ethernet) high-throughput protocol and Ethernet flow-control algorithms were used to implement lossless event distribution. A custom benchmark was implemented to generate LHCb-like traffic; it was used to stress-test the selected Ethernet networks and to check resilience to uneven workload distribution. Performance tests were carried out on selected evaluation clusters using 100 Gb/s and 25 Gb/s links. Performance results and an overall evaluation of this Ethernet-based approach are discussed.
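A back-of-envelope check makes the scale concrete: 32 Tb/s of aggregate input has to be carried by fixed-rate links. The utilisation target below is an assumption for illustration, not a figure from the paper.

```python
import math

# Sketch: how many fixed-rate Ethernet links are needed to carry an
# aggregate of 32 Tb/s. The 80% utilisation ceiling is an assumed headroom
# target, not a parameter from the paper.

TOTAL_TBPS = 32.0

def links_needed(link_gbps, max_utilisation=0.8):
    """Minimum link count at a given line rate and utilisation ceiling."""
    return math.ceil(TOTAL_TBPS * 1000.0 / (link_gbps * max_utilisation))

n100 = links_needed(100.0)   # links at 100 Gb/s
n25 = links_needed(25.0)     # links at 25 Gb/s
```

Numbers of this magnitude are why per-transfer CPU overhead matters: with hundreds of links saturated simultaneously, kernel-bypass transports such as RoCE keep the per-byte cost low enough for lossless operation.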


Author(s):  
T. Aronova ◽  
G. Aronov ◽  
T. Protasovitskaya ◽  
V. Aronov

The review of annual seismicity in the territory of Belarus based on the data of two analog and seventeen digital stations is presented. A total of 57 events with Кd=4.6–8.8 were recorded, all of them located in the southern part of the territory, including the Soligorsk mining area. The maximum seismic energy was released in March, August, October and November, and the maximum number of earthquakes was observed from July to August and from October to November. The N(K) and ΣE functions in 2014 were compared with those for 1983–2013. The number of events in 2014 is 1.34 times greater than the average value for the previous 31 years. The level of seismic energy released in 2014 is 2.43 times higher than in 2013 but 2.05 times lower than its long-term average value. The distribution of earthquakes by depth interval showed that the earthquake foci are mostly located in the upper 20 km of the Earth's crust; the foci of 47 earthquakes, however, are located at depths below 10 km. The slope of the graph showing the recurrence of the events with representative energy classes Кd=6–8 in 2014 was calculated; its modulus γ=|0.48| is lower than the value γ=|0.5| in 2013. The distribution of all the events in 2014 is presented in real time. Quiet seismic periods and periods of seismic activation were determined. The distribution of the seismic events by hourly interval revealed the periods of daytime and nighttime increases in the seismic event number, and the maximum and minimum values of N in the distribution of the seismic events by day of the week were determined. The seismicity analysis has shown that the seismic activity level in 2014 was higher than in 2013, but lower than its long-term average value.
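The recurrence-graph slope γ mentioned above is the (negative) slope of log₁₀ N(K) against energy class K over the representative classes. A minimal least-squares sketch, with synthetic counts constructed to follow an exact power law of slope 0.5:

```python
import math

# Sketch: estimating the recurrence-graph slope gamma from
#   log10 N(K) = a - gamma * K
# by least squares over representative energy classes. The counts are
# synthetic, chosen to follow an exact power law with gamma = 0.5.

ks = [6.0, 7.0, 8.0]
counts = [100.0, 100.0 * 10 ** -0.5, 10.0]

logn = [math.log10(n) for n in counts]
kbar = sum(ks) / len(ks)
lbar = sum(logn) / len(logn)
slope = (sum((k - kbar) * (l - lbar) for k, l in zip(ks, logn))
         / sum((k - kbar) ** 2 for k in ks))
gamma = -slope
```

Restricting the fit to representative classes (here Кd=6–8) matters because smaller events are incompletely recorded and would bias the slope.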


Author(s):  
David B Harris ◽  
Douglas A Dodge

Summary Under favorable circumstances, seismic waveforms corresponding to an ensemble of events related by a common, spatially-distributed process collectively exhibit a regular, signal-space geometry. When events in the ensemble have a common, or nearly common, source mechanism, this geometry is a distorted image of the distribution of events in the source region. The signal-space image can be visualized using a relatively simple waveform alignment and projection operation. Ensemble waveform correlation measurements can be inverted to estimate the distribution of the events in the source region, up to an arbitrary rotation, reflection and scaling, with residual distortion. We demonstrate these concepts with synthetic waveforms and with observations of long-wall mining induced seismicity for which substantial ground truth information is available. Our experience with these data has implications for location, correlation detection and machine learning, and possible application to studies of repeating events in induced, volcanic and glacial seismicity. Our results place limits on the widely-held assumption that waveform correlation is a useful measure of event separation. We suggest that the constraints on event separation need to be evaluated in the context of a population of related events, whose waveforms sample the signal space image of the source region. A better indicator of event separation is the length of the shortest path in signal space along the image.
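The closing suggestion, that event separation is better measured by the shortest path through signal space along the image of the source region, is the geodesic idea underlying Isomap: keep only the reliable short-range waveform distances as graph edges, then measure longer separations along the graph. A minimal sketch with a made-up graph, using Dijkstra's algorithm:

```python
import heapq

# Sketch: shortest-path ("geodesic") separation through signal space.
# Nodes are events; edges connect waveform pairs whose direct correlation
# distance is considered reliable. The graph below is synthetic.

edges = {0: [(1, 1.0)],
         1: [(0, 1.0), (2, 1.2)],
         2: [(1, 1.2), (3, 0.9)],
         3: [(2, 0.9)]}

def shortest_path_length(start, goal):
    """Dijkstra's algorithm over the sparse signal-space graph."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in edges.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

geodesic = shortest_path_length(0, 3)   # path 0 -> 1 -> 2 -> 3
```

The direct correlation distance between events 0 and 3 may be uninformative (their waveforms barely correlate), yet the path length through intermediate events still orders separations sensibly, which is the paper's point.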

