occurrence times
Recently Published Documents

TOTAL DOCUMENTS: 70 (five years: 14)
H-INDEX: 14 (five years: 1)

Author(s):  
Anish Rai ◽  
Ajit Mahata ◽  
Md Nurujjaman ◽  
Om Prakash

During a unique crisis, a panic sell-off leads to a massive stock market crash that may continue for more than a day, termed the mainshock. The effect of a mainshock, in the form of aftershocks, can be felt throughout the recovery phase of the stock price. As the market remains under stress during recovery, any small perturbation leads to a relatively smaller aftershock. The duration of the recovery phase has been estimated using structural break analysis. We have carried out statistical analyses of the 1987 stock market crash, the 2008 financial crisis and the 2020 COVID-19 pandemic, considering the actual crash times of the mainshock and aftershocks. Earlier, such analyses were done using the absolute one-day return, which cannot capture a crash properly. The results show that the mainshocks and aftershocks in the stock market follow the Gutenberg–Richter (GR) power law. Further, from the GR law we obtained a higher exponent value $\beta$ for the COVID-19 crash than for the 2008 financial crisis. This implies that the recovery of stock prices during COVID-19 may be faster than after the 2008 financial crisis. The result is consistent with the present recovery of the market from the COVID-19 pandemic. The analysis shows that high-magnitude aftershocks are rare, and low-magnitude aftershocks are frequent, during the recovery phase. The analysis also shows that the inter-occurrence times of the aftershocks follow the generalized Pareto distribution, i.e. $P(\tau) \propto (1 + \lambda\tau)^{-\alpha}$, where $\lambda$ and $\alpha$ are constants and $\tau$ is the inter-occurrence time. This analysis may help investors restructure their portfolios during a market crash.
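
As a hedged illustration of the final claim, the sketch below fits a generalized Pareto distribution to a sample of inter-occurrence times with SciPy. The data are synthetic placeholders (a Lomax draw, itself a special case of the GPD); the paper's crash-detection step is not reproduced.

```python
# Minimal sketch: fit aftershock inter-occurrence times to a generalized
# Pareto distribution (GPD). The sample here is synthetic; detecting
# mainshocks/aftershocks from price data is not shown.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
inter_times = rng.pareto(2.5, size=500)  # hypothetical inter-occurrence times (days)

# Fit with the location fixed at zero, so the density has the
# (1 + xi * tau / beta)^(-1/xi - 1) form.
xi, loc, beta = stats.genpareto.fit(inter_times, floc=0.0)
print(f"shape xi = {xi:.3f}, scale beta = {beta:.3f}")

# Goodness of fit via a Kolmogorov-Smirnov test.
ks_stat, p_value = stats.kstest(inter_times, "genpareto", args=(xi, loc, beta))
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
```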


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Xueguang Yu ◽  
Xintian Liu ◽  
Xu Wang ◽  
Xiaolan Wang

Purpose: This study aims to propose an improved affine interval truncation algorithm to restrain interval extension for interval functions.

Design/methodology/approach: To reduce the number of occurrences of related variables in an interval function, a processing method for the interval operation sequence is proposed.

Findings: The interval variable is evenly divided into several subintervals based on a correlation analysis of the interval variables, and the interval function value is modified by the interval truncation method to restrain overestimation of the interval operation results.

Originality/value: Through several uncertain displacement response engineering examples, the effectiveness and applicability of the proposed algorithm are verified by comparison with the interval method and an optimization algorithm.
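
The overestimation the Findings address (the "dependency problem") can be shown in a few lines: a naive interval evaluation of f(x) = x - x^2 over [0, 1] yields [-1, 1] because x occurs twice, while subdividing x and taking the union of the sub-enclosures converges toward the true range [0, 0.25]. This is a minimal sketch of the general subdivision idea, not the authors' affine truncation algorithm.

```python
# Naive interval arithmetic for f(x) = x - x**2, plus subdivision.
def interval_sub(a, b):
    # [a] - [b] with outward-directed endpoints.
    return (a[0] - b[1], a[1] - b[0])

def interval_sq(a):
    # Tight square of an interval, handling intervals that contain 0.
    lo, hi = a
    cands = [lo * lo, hi * hi]
    sq_lo = 0.0 if lo <= 0.0 <= hi else min(cands)
    return (sq_lo, max(cands))

def f_enclosure(x):
    # Naive enclosure of f(x) = x - x^2; x occurs twice, so it overestimates.
    return interval_sub(x, interval_sq(x))

def subdivided_enclosure(x, n):
    # Split x into n equal subintervals and take the union of the enclosures.
    lo, hi = x
    step = (hi - lo) / n
    parts = [f_enclosure((lo + i * step, lo + (i + 1) * step)) for i in range(n)]
    return (min(p[0] for p in parts), max(p[1] for p in parts))

x = (0.0, 1.0)
print("naive:      ", f_enclosure(x))               # (-1.0, 1.0), far too wide
print("4 subints:  ", subdivided_enclosure(x, 4))   # already much tighter
print("64 subints: ", subdivided_enclosure(x, 64))  # approaches the true (0, 0.25)
```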


Author(s):  
Fernando C. Loio Pinto ◽  
Henrique Neiva ◽  
Célia Nunes ◽  
Mário C. Marques ◽  
António C. Sousa ◽  
...  

Fight analysis produces relevant technical–tactical information, but such knowledge is limited for hybrid full-contact combat sports. Therefore, this study aimed to characterize fight outcomes through the winners at the World Ultimate Full Contact (WUFC) Championships between 2008 and 2017. Methods: 170 bouts between senior male fighters (master class) from 38 countries were observed; all fight outcome methods, their occurrence times, the inherent skills and their forms of development were analyzed through frequencies, percentages, crosstabs and the chi-square test, considering a Fisher's exact test value of p < 0.05. The fight outcome methods were, in decreasing order of frequency: submission; decision and technical knockout (TKO); knockout (KO); and doctor stoppage. Only 19.4% of fights lasted the full regular time of 10 min (600 s), and 68.8% of fight outcomes occurred in the first 5 min (300 s). Chokes were used more often than joint locks, and were mostly executed as single actions. Head punches and kicks were the skills most responsible for KOs, developed mainly in combinations and counter-attacks, while TKOs always resulted from combination attacks, mostly by ground and pound. Ground fighting was the most effective; in stand-up fighting, combination attacks and counter-attacks were the most effective. It is important to develop the technical–tactical capacities and adaptable decision-making needed to perform over the full regular fight time.


2020 ◽  
Vol 65 (3) ◽  
pp. 1-8
Author(s):  
Sanghyun Shin ◽  
Abhishek Vaidya ◽  
Inseok Hwang

In recent years, the National Transportation Safety Board has highlighted the analysis of flight data as one of the most effective methods to improve the safety and efficiency of helicopter operations. Since cockpit audio data contain various sounds from engines, alarms, crew conversations, and other sources within a cockpit, analyzing cockpit audio data can help identify the causes of incidents and accidents. Among the various types of sounds in cockpit audio data, this paper focuses on cockpit alarm and engine sounds. This paper proposes cockpit audio analysis algorithms that can detect the types and occurrence times of alarm sounds during an abnormal flight and estimate engine-related flight parameters such as engine torque. This is achieved as follows: for alarm sound analysis, by finding the highest correlation between the short-time Fourier transform of the audio and a database of characteristic alarm features, combined with the Cumulative Sum Control Chart (CUSUM); and for engine sound analysis, by using data mining and statistical modeling techniques to identify the specific frequencies associated with engine operations. The proposed algorithms are successfully applied to simulated audio data generated by the X-Plane flight simulator and to real audio data recorded by GoPro cameras in Sikorsky S-76 helicopters, demonstrating the desired performance.
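
A hedged sketch of the alarm-detection half of such a pipeline is shown below: take the STFT of the audio, correlate each frame's spectrum against a known alarm template, and run a one-sided CUSUM on the correlation to flag the alarm's onset time. The synthetic audio, the Gaussian spectral template, and the drift/threshold values are all assumptions, not the paper's actual feature database or settings.

```python
# Sketch: alarm onset detection via STFT template correlation + CUSUM.
import numpy as np
from scipy.signal import stft

fs = 8000  # sample rate (Hz), assumed
t = np.arange(0, 10.0, 1.0 / fs)
audio = 0.1 * np.random.default_rng(0).standard_normal(t.size)
audio[4 * fs:6 * fs] += np.sin(2 * np.pi * 1000 * t[4 * fs:6 * fs])  # 1 kHz "alarm"

freqs, times, Z = stft(audio, fs=fs, nperseg=512)
spec = np.abs(Z)

# Hypothetical database entry: unit-energy spectral template centered at 1 kHz.
template = np.exp(-0.5 * ((freqs - 1000.0) / 30.0) ** 2)
template /= np.linalg.norm(template)

# Normalized correlation of each STFT frame against the template.
frame_norms = np.linalg.norm(spec, axis=0) + 1e-12
corr = (template @ spec) / frame_norms

# One-sided CUSUM on the correlation sequence.
drift, threshold = 0.3, 5.0  # tuning parameters, assumed
s, onset = 0.0, None
for k, c in enumerate(corr):
    s = max(0.0, s + c - drift)
    if s > threshold and onset is None:
        onset = times[k]
print(f"alarm onset detected near t = {onset:.2f} s")  # expected shortly after 4 s
```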


Author(s):  
Bing Zhang ◽  
Chun Shan ◽  
Munawar Hussain ◽  
Jiadong Ren ◽  
Guoyan Huang

Because a software network does not capture the order and number of function calls, it cannot reflect the real execution of the software; thus, detecting crucial functions (DCF) based on the software network alone is controversial. To address this issue, this paper proposes a novel approach to DCF from the viewpoint of dynamic software execution. First, it models the dynamic execution process as an execution sequence, taking functions as nodes and tracing the changes occurring on the call stack. Second, an algorithm for deleting repetitive patterns is designed to simplify the execution sequence and construct software sequence pattern sets. Third, a crucial function detection algorithm is presented to identify the distribution law of the numbers of patterns at different levels and to rank functions by their occurrence times, generating a decision-function-ranking-list (DFRL). Finally, the top-k discriminative functions in the DFRL are chosen as crucial functions, and a similarity index of the decision function sets is defined. Compared with the results of the Degree Centrality Ranking and Betweenness Centrality Ranking approaches, our approach increases the node coverage to 80%, combining the advantages of the two classic algorithms; experiments with different test cases on four open-source software projects show it to be effective and accurate. Monitoring and protecting crucial functions can help increase the efficiency of software testing, strengthen software reliability and reduce software costs.
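
The core ranking step can be illustrated in miniature: count how often each function occurs in a dynamic execution trace and take the top-k as candidate crucial functions. This is a simplified stand-in for the DFRL construction; the trace and function names are hypothetical, and the paper's repetitive-pattern deletion and multi-level pattern analysis are not reproduced.

```python
# Rank functions by occurrence times in an execution trace; take top-k.
from collections import Counter

# Hypothetical execution sequence recorded by tracing call-stack changes.
trace = ["main", "parse", "eval", "lookup", "eval", "lookup",
         "emit", "lookup", "eval", "emit", "main"]

def decision_function_ranking(trace, k):
    """Rank functions by occurrence count and return (ranking, top-k names)."""
    counts = Counter(trace)
    ranking = counts.most_common()  # DFRL analogue: functions sorted by count
    return ranking, [name for name, _ in ranking[:k]]

dfrl, crucial = decision_function_ranking(trace, k=3)
print(dfrl)     # [('eval', 3), ('lookup', 3), ('main', 2), ('emit', 2), ('parse', 1)]
print(crucial)  # top-3 candidate crucial functions
```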


2020 ◽  
Author(s):  
Isabel Serra ◽  
David Moriña ◽  
Pere Puig ◽  
Álvaro Corral

Intense geomagnetic storms can cause severe damage to electrical systems and communications. This work proposes a counting process with Weibull inter-occurrence times in order to estimate the probability of extreme geomagnetic events. It is found that the scale parameter of the inter-occurrence time distribution grows exponentially with the absolute value of the intensity threshold defining the storm, whereas the shape parameter remains roughly constant. The model is able to forecast the probability of occurrence of an event above a given intensity threshold; in particular, the probability that an extreme event of magnitude comparable to or larger than the well-known Carrington event of 1859 occurs within the next decade is estimated to be between 0.46% and 1.88% (with 95% confidence), a much lower value than those reported in the existing literature.
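
To make the forecasting step concrete, the sketch below computes the probability of at least one above-threshold event within the next T years from a Weibull inter-occurrence model, under the simplifying assumption that a renewal occurs at the present time. The shape and scale values are illustrative placeholders, not the paper's fitted parameters.

```python
# P(at least one event within T years) from Weibull inter-occurrence times,
# assuming a fresh inter-occurrence interval starts now.
import math

def prob_event_within(T, scale, shape):
    """Weibull CDF: P(next inter-occurrence time <= T)."""
    return 1.0 - math.exp(-((T / scale) ** shape))

# The abstract reports that the scale grows exponentially with the intensity
# threshold while the shape stays roughly constant; the numbers below are
# assumptions for a Carrington-level threshold.
shape = 0.9             # assumed, roughly constant across thresholds
scale_years = 1000.0    # assumed scale (years) at a Carrington-level threshold

p_decade = prob_event_within(10.0, scale_years, shape)
print(f"P(Carrington-level event in next decade) ~ {p_decade:.2%}")  # ~1.6% here
```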


Risks ◽  
2020 ◽  
Vol 8 (1) ◽  
pp. 30 ◽  
Author(s):  
Franck Adékambi ◽  
Essodina Takouda

This paper considers a risk model perturbed by a diffusion process, with a time delay in the arrival of the first two claims and with dependence between the claim amounts and the claim inter-occurrence times. Assuming that the arrival time of the first claim follows a generalized mixed equilibrium distribution, we derive the integro-differential equations of the Gerber–Shiu function and its defective renewal equations. For the situation where claim amounts follow an exponential distribution, we provide an explicit expression for the Gerber–Shiu function. Numerical examples are provided to illustrate the ruin probability.
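
For intuition about the kind of ruin probability being illustrated, here is a hedged Monte Carlo sketch for a diffusion-perturbed surplus process U(t) = u + c*t + sigma*W(t) - (claims up to t) with exponential claim sizes. The paper's delayed first two claims and the dependence between claim amounts and inter-occurrence times are not modeled, and all parameters are illustrative.

```python
# Monte Carlo finite-horizon ruin probability for a diffusion-perturbed
# risk process with Poisson claim arrivals and exponential claim sizes.
import numpy as np

rng = np.random.default_rng(1)

def ruin_probability(u, c, sigma, lam, mu, horizon, n_paths, dt=0.02):
    steps = int(horizon / dt)
    ruined = 0
    for _ in range(n_paths):
        level = u
        for _ in range(steps):
            # Euler step: premium income plus Brownian perturbation.
            level += c * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            if rng.random() < lam * dt:             # Poisson claim arrival
                level -= rng.exponential(1.0 / mu)  # exponential claim size
            if level < 0.0:
                ruined += 1
                break
    return ruined / n_paths

# u: initial surplus, c: premium rate, lam: claim rate, 1/mu: mean claim size.
print(ruin_probability(u=10.0, c=1.5, sigma=0.5, lam=1.0, mu=1.0,
                       horizon=50.0, n_paths=1000))
```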


2020 ◽  
Vol 99 ◽  
pp. 102068 ◽  
Author(s):  
Ioannis Manolopoulos ◽  
Kimon Kontovasilis ◽  
Ioannis Stavrakakis ◽  
Stelios C.A. Thomopoulos

2019 ◽  
Vol 14 (9) ◽  
pp. 1227-1235
Author(s):  
Tomohiro Ishizawa ◽  
◽  
Toru Danjo

The July 2018 heavy rain, which was actually a series of intermittent downpours rather than a short period of continuous heavy rainfall, triggered a large number of sediment disasters. This study was conducted to evaluate the triggers of those sediment disasters. An interview-based survey was conducted on the occurrence times of the sediment disasters caused by the heavy rain, and a rainfall analysis was completed using analyzed rainfall data from the Japan Meteorological Agency. These were followed by an analysis of the estimated occurrence times of the sediment disasters and the temporal changes in the rainfall indices determined through the rainfall analysis. An analysis of disasters at the 36 sites examined for this study showed that many occurred when the soil water index (SWI) during the study period (June 28, 2018, to the estimated occurrence time of a sediment disaster) was at its maximum. The analysis also indicated that slope failures tended to occur when hourly rainfall was relatively low and the SWI was high, whereas debris flows occurred when both the SWI and hourly rainfall were high. Examination of the data in light of the SWI alert level showed that, in cases where the SWI continued to increase after exceeding the alert level, 75% of the sediment disasters analyzed occurred within approximately 19 h.

