Optimal equidistant checkpointing of fault tolerant systems subject to correlated failure

Author(s):
Bentolhoda Jafary, Lance Fiondella, Ping-Chen Chang

Checkpointing is a technique for backing up work at periodic intervals so that, if computation fails, it can resume from the latest checkpoint rather than restart from the beginning. Performing checkpointing operations takes time, so the cost of checkpointing must be traded off against the time saved when computation resumes at a checkpoint. This article presents a method to model the impact of correlated failures on an application that performs a specified amount of computation and checkpoints at equidistant intervals during this computation. We develop a Markov model and superimpose a correlated life distribution. Two cases are considered: the first assumes that reaching a checkpoint resets the failure distribution, while the second allows the probability of failure to progress. We illustrate the approach through a series of examples. The results indicate that correlation can negatively impact checkpointing, necessitating more frequent checkpoints and increasing the total time required, but that the approach can still identify the optimal number of equidistant checkpoints despite this correlation.
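To make the tradeoff concrete, the sketch below evaluates the textbook independent-failure baseline rather than the correlated Markov model of the article: it assumes memoryless (exponential) failures at rate failure_rate, a fixed overhead checkpoint_cost per checkpoint, and rollback only to the most recent checkpoint, and it brute-forces the number of equidistant checkpoints that minimizes expected completion time. All names and parameter values are hypothetical.

```python
import math

def expected_completion_time(total_work, n_checkpoints, checkpoint_cost, failure_rate):
    """Expected time to finish total_work split into n equal segments, each
    followed by a checkpoint of length checkpoint_cost, when failures arrive
    as a Poisson process with rate failure_rate and a failure rolls back only
    to the most recent checkpoint (independent-failure baseline)."""
    segment = total_work / n_checkpoints + checkpoint_cost
    # Expected time to complete a segment that must be retried from its start
    # after every failure: E[T] = (exp(lambda * t) - 1) / lambda.
    per_segment = (math.exp(failure_rate * segment) - 1.0) / failure_rate
    return n_checkpoints * per_segment

def optimal_checkpoint_count(total_work, checkpoint_cost, failure_rate, n_max=200):
    """Brute-force the number of equidistant checkpoints minimizing expected time."""
    return min(range(1, n_max + 1),
               key=lambda n: expected_completion_time(total_work, n,
                                                      checkpoint_cost, failure_rate))

if __name__ == "__main__":
    # Hypothetical workload: 100 h of computation, 0.25 h per checkpoint,
    # one failure every 20 h on average.
    n_star = optimal_checkpoint_count(100.0, 0.25, 1.0 / 20.0)
    print(n_star, expected_completion_time(100.0, n_star, 0.25, 1.0 / 20.0))
```

Under correlated failures, as the article shows, the optimum shifts toward more frequent checkpointing than this independent baseline suggests.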

Author(s):
Anusha Krishna Murthy, Saikath Bhattacharya, Lance Fiondella

Most reliability models assume that components and systems experience a single failure mode. Many systems, however, such as hardware, are prone to more than one mode of failure. Past two-failure-mode research derives equations to maximize reliability or minimize cost by identifying the optimal number of components. However, many if not all of these equations are derived from models that make the simplifying assumption that components fail in a statistically independent manner. In this paper, models to assess the impact of correlation on two-failure-mode system reliability and cost are developed, and corresponding expressions for reliability- and cost-optimal designs are derived. Our illustrations demonstrate that, despite correlation, the approach identifies reliability- and cost-optimal designs.
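For orientation, the statistically independent baseline that the earlier work relies on has a simple closed form: for n components in series, each of which works, fails open with probability q_open, or fails short with probability q_short, the system reliability is (1 - q_open)^n - q_short^n, and the reliability-optimal n can be found by direct search. The sketch below implements only this independent case with hypothetical probabilities; it does not include the correlation model of the paper.

```python
def series_reliability(n, q_open, q_short):
    """Textbook two-failure-mode reliability of n independent components in
    series: the system fails open if any component fails open and fails short
    only if every component fails short, so R(n) = (1 - q_open)^n - q_short^n."""
    return (1.0 - q_open) ** n - q_short ** n

def optimal_series_size(q_open, q_short, n_max=50):
    """Number of series components that maximizes two-failure-mode reliability."""
    return max(range(1, n_max + 1),
               key=lambda n: series_reliability(n, q_open, q_short))

if __name__ == "__main__":
    # Hypothetical per-component failure probabilities.
    q_o, q_s = 0.05, 0.10
    n_star = optimal_series_size(q_o, q_s)
    print(n_star, series_reliability(n_star, q_o, q_s))
```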


Author(s):
M. Novokhatskyi, V. Targonya, T. Babinets, O. Gorodetskyi, ...

Aim. To assess the impact of the most common systems of basic tillage and biological methods of optimizing nutrition regimes on the realization of the grain-productivity potential of soybean in the Forest-Steppe of Ukraine. Methods. The research used general scientific (hypothesis, experiment, observation) and special (field experiment, morphological analysis) methods. Results. Analysis of the field experiments shows that the conservation tillage system, which produced 27.6 c/ha of grain, is preferable in terms of the biological yield of soybean. The other systems reduced the biological yield: to 26.4 c/ha with the traditional system, 25.3 c/ha with mulching, and 23.0 c/ha with mini-till. With Groundfix, the average biological yield of soybean grain increased to 25.6 c/ha at an application rate of 5 l/ha and to 28.2 c/ha at 10 l/ha, whereas the control variants (without the preparation) formed an average of 22.6 c/ha of grain, ranging across tillage systems from 21.0 c/ha (mini-till) to 25.8 c/ha (traditional). Applying Groundfix at 10 l/ha reduced the seed abortion rate from 11.0% (average of the variants without biofertilizer) to 8.0%, formed the optimal number of stem nodes with beans, increased the attachment height of the lower beans, and improved other indicators of the biological productivity of soybean. Conclusions. The conservation tillage system produced an average of 27.6 c/ha of soybean grain, the highest indicator among the basic tillage systems within the scheme of our research. The use of Groundfix changed this indicator: whereas the variants with the conservation system of basic tillage and no biological preparation (control) averaged 24.1 c/ha, applying Groundfix raised the biological yield to 29.4 c/ha, and at a dose of 10 l/ha the biological yield was 32.2 c/ha. Both Groundfix and the basic tillage system influenced the elements of the yield structure: plant density at harvest depended more on the tillage system than on Groundfix; applying Groundfix and increasing its dose within the scheme of our studies positively affected plant stand density and the attachment height of the lower beans, and reduced seed abortion.


Author(s):
J. R. Barnes, C. A. Haswell

Ariel's ambitious goal to survey a quarter of known exoplanets will transform our knowledge of planetary atmospheres. Masses measured directly with the radial velocity technique are essential for well-determined planetary bulk properties. Radial velocity masses will provide important checks of masses derived from atmospheric fits, or alternatively can be treated as a fixed input parameter to reduce possible degeneracies in atmospheric retrievals. We quantify the impact of stellar activity on planet mass recovery for the Ariel mission sample using Sun-like spot models scaled for active stars, combined with other noise sources. Planets with necessarily well-determined ephemerides will be selected for characterisation with Ariel. With this prior requirement, we simulate the derived planet mass precision as a function of the number of observations for a prospective sample of Ariel targets. We find that quadrature sampling can significantly reduce the time commitment required for follow-up RVs, and is most effective when the planetary RV signature is larger than the RV noise. For a typical radial velocity instrument operating on a 4 m-class telescope and achieving 1 m s⁻¹ precision, between ~17% and ~37% of the time commitment is spent on the 7% of planets with mass Mp < 10 M⊕. In many low-activity cases, the time required is limited by asteroseismic and photon noise. For low-mass or faint systems, we can recover masses with the same precision up to ~3 times more quickly with an instrumental precision of ~10 cm s⁻¹.
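As a rough guide to how the required number of RV epochs scales, the sketch below combines the standard Keplerian semi-amplitude formula with a simple white-noise scaling sigma_K ≈ sigma_RV · sqrt(2/N). It ignores stellar activity, asteroseismic noise, and sampling strategy, so it is only an order-of-magnitude illustration with hypothetical parameters, not the simulation framework of the paper.

```python
import math

G = 6.674e-11           # m^3 kg^-1 s^-2
M_SUN = 1.989e30        # kg
M_EARTH = 5.972e24      # kg
DAY = 86400.0           # s

def rv_semi_amplitude(m_planet_earth, m_star_sun, period_days, ecc=0.0, sin_i=1.0):
    """Keplerian radial-velocity semi-amplitude K in m/s."""
    mp = m_planet_earth * M_EARTH
    ms = m_star_sun * M_SUN
    p = period_days * DAY
    return ((2.0 * math.pi * G / p) ** (1.0 / 3.0)
            * mp * sin_i / (ms + mp) ** (2.0 / 3.0)
            / math.sqrt(1.0 - ecc ** 2))

def observations_for_mass_precision(m_planet_earth, m_star_sun, period_days,
                                    sigma_rv, target_frac):
    """Rough number of RV epochs needed so that sigma_K / K <= target_frac,
    using the white-noise scaling sigma_K ~ sigma_rv * sqrt(2 / N)."""
    k = rv_semi_amplitude(m_planet_earth, m_star_sun, period_days)
    return math.ceil(2.0 * (sigma_rv / (target_frac * k)) ** 2)

if __name__ == "__main__":
    # Hypothetical target: a 10 Earth-mass planet on a 10-day orbit of a
    # Sun-like star, observed at 1 m/s precision, aiming for a 20% mass.
    print(observations_for_mass_precision(10.0, 1.0, 10.0, sigma_rv=1.0,
                                          target_frac=0.2))
```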


Geosciences, 2021, Vol. 11 (2), pp. 99
Author(s):
Yueqi Gu, Orhun Aydin, Jacqueline Sosa

Post-earthquake relief zone planning is a multidisciplinary optimization problem that requires delineating zones so as to minimize the loss of life and property. In this study, we offer an end-to-end workflow to define relief zone suitability and equitable relief service zones for Los Angeles (LA) County. In particular, we address the impact of a tsunami, given LA's high spatial complexity: population clustered along the coastline and a complicated inland fault system. We design data-driven earthquake relief zones from a wide variety of inputs, including geological features, population, and public safety. Data-driven zones were generated by solving the p-median problem with the Teitz–Bart algorithm without any a priori knowledge of optimal relief zones. We define the metrics to determine the optimal number of relief zones as part of the proposed workflow. Finally, we measure the impacts of a tsunami in LA County by comparing data-driven relief zone maps for a case with a tsunami and a case without a tsunami. Our results show that the impact of the tsunami on the relief zones can extend up to 160 km inland from the study area.
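The p-median step can be pictured with a toy version of the Teitz–Bart vertex-substitution heuristic: starting from a random set of p facilities, keep swapping an open facility for a closed candidate whenever the swap lowers the total distance from demand points to their nearest facility. The sketch below uses a made-up one-dimensional instance, not the LA County data or GIS workflow of the study.

```python
import random

def total_cost(facilities, demands, dist):
    """Sum over demand points of the distance to the nearest open facility
    (the p-median objective)."""
    return sum(min(dist[d][f] for f in facilities) for d in demands)

def teitz_bart(candidates, demands, dist, p, seed=0):
    """Teitz-Bart vertex substitution: swap facilities in and out while the
    swap improves the p-median objective."""
    rng = random.Random(seed)
    current = set(rng.sample(candidates, p))
    improved = True
    while improved:
        improved = False
        for cand in candidates:
            if cand in current:
                continue
            for fac in list(current):
                trial = (current - {fac}) | {cand}
                if total_cost(trial, demands, dist) < total_cost(current, demands, dist):
                    current = trial
                    improved = True
                    break
    return current, total_cost(current, demands, dist)

if __name__ == "__main__":
    # Toy 1-D example: demand points and candidate sites on a line,
    # distance = absolute coordinate difference.
    demands = list(range(10))
    candidates = list(range(10))
    dist = {d: {c: abs(d - c) for c in candidates} for d in demands}
    print(teitz_bart(candidates, demands, dist, p=3))
```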


2003, Vol. 18 (2), pp. 190-200
Author(s):
JianHui Jiang, YingHua Min, ChengLian Peng

2014, Vol. 660, pp. 971-975
Author(s):
Mohd Norzaim bin Che Ani, Siti Aisyah Binti Abdul Hamid

Time study is an observation process concerned with determining the amount of time required to perform a unit of work, covering internal, external, and machine time elements. Time study was first used in manufacturing in Europe in the 1760s. It is a flexible lean manufacturing technique suitable for a wide range of situations. The time study approach enables reducing or minimizing the non-value-added activities in the process cycle time that contribute to bottleneck time. Improving process cycle time increases an organization's productivity and reduces cost. This paper focuses on time study at selected processes with bottleneck time and identifies the possible root causes contributing to the high time required to perform a unit of work.
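The core arithmetic of a time study is short enough to sketch: average the observed element times, adjust by the analyst's performance rating to obtain normal time, then add a percentage allowance to obtain standard time. The figures below are hypothetical, and this is only the textbook calculation, not the analysis carried out in the paper.

```python
def normal_time(observed_times, rating):
    """Average observed element time adjusted by the analyst's performance rating."""
    return sum(observed_times) / len(observed_times) * rating

def standard_time(observed_times, rating, allowance):
    """Textbook standard-time formula: normal time plus a percentage allowance
    for fatigue, delays, and personal needs."""
    return normal_time(observed_times, rating) * (1.0 + allowance)

if __name__ == "__main__":
    # Hypothetical cycle observations (minutes) for one work element,
    # a 110% performance rating, and a 15% allowance.
    print(standard_time([2.1, 2.3, 2.0, 2.2], rating=1.10, allowance=0.15))
```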

