Schedulability of probabilistic mixed-criticality systems

2021
Author(s): Stefan Draskovic, Rehan Ahmed, Pengcheng Huang, Lothar Thiele

Abstract. Mixed-criticality systems often need to fulfill safety standards that dictate different requirements for each criticality level, for example given in the ‘probability of failure per hour’ format. A recent trend suggests designing such systems by jointly scheduling tasks of different criticality levels on a shared platform. When this is done, the usual assumption is that tasks of lower criticality are degraded when a higher-criticality task needs more resources, for example when it overruns a bound on its execution time. However, how to quantify the impact of this degradation on the overall system is not well understood. Meanwhile, to improve schedulability and to avoid over-provisioning of resources due to overly pessimistic worst-case execution time estimates of higher-criticality tasks, a new paradigm has emerged in which tasks' execution times are modeled as random variables. In this paper, we analyze a system with probabilistic execution times and propose metrics that are inspired by safety standards. Among these metrics are the probability of deadline miss per hour, the expected time before degradation happens, and the duration of the degradation. We argue that these quantities provide a holistic view of the system's operation and schedulability.
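To make the first metric concrete, here is a minimal sketch of how a per-job deadline miss probability, derived from discrete execution-time distributions, could be converted into a per-hour figure. The distributions, the deadline, and the release rate are all hypothetical assumptions, and summing the two tasks' execution times is a crude stand-in for a real response-time analysis:

```python
import numpy as np

# Hypothetical per-job execution-time distributions (PMFs over ms):
# index i holds P(execution time == i ms); values sum to 1.
pmf_hi = np.array([0.0, 0.7, 0.2, 0.1])  # high-criticality task
pmf_lo = np.array([0.0, 0.5, 0.5])       # low-criticality task

# Response time of the low task under the high task's interference,
# crudely approximated here as the sum of both execution times.
pmf_total = np.convolve(pmf_hi, pmf_lo)

deadline_ms = 4
p_miss_per_job = pmf_total[deadline_ms + 1:].sum()  # P(response > deadline)

# With e.g. 1000 jobs released per hour and independence assumed between
# jobs, the per-hour miss probability follows from the complement rule.
jobs_per_hour = 1000
p_miss_per_hour = 1.0 - (1.0 - p_miss_per_job) ** jobs_per_hour
print(f"P(miss per job)  = {p_miss_per_job:.4e}")
print(f"P(miss per hour) = {p_miss_per_hour:.4e}")
```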

2020, Vol 245, pp. 05037
Author(s): Caterina Marcon, Oxana Smirnova, Servesh Muralidharan

Experimental observations and advanced computer simulations in High Energy Physics (HEP) paved the way for the recent discoveries at the Large Hadron Collider (LHC) at CERN. Currently, Monte Carlo simulations account for a very significant share of the computational resources of the Worldwide LHC Computing Grid (WLCG). The current growth in available computing performance will not be enough to fulfill the expected demand of the forthcoming High Luminosity run (HL-LHC); more efficient simulation codes are therefore required. This study evaluates the impact of different build methods on simulation execution time. The Geant4 toolkit, the standard simulation code for the LHC experiments, consists of a set of libraries which can be either dynamically or statically linked to the simulation executable; dynamic libraries are currently the preferred build method. In this work, three versions of the GCC compiler, namely 4.8.5, 6.2.0 and 8.2.0, have been used, and four optimization levels (Os, O1, O2 and O3) have been compared. Static builds exhibit a reduction in execution times of about 10% for all the GCC versions considered. Switching to a newer GCC version results in an average 30% improvement in execution time regardless of the build type. In particular, a static build with GCC 8.2.0 improves execution time by about 34% with respect to the default configuration (GCC 4.8.5, dynamic, O2). The different GCC optimization flags do not affect the execution times.
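As an illustration of the measurement side of such a study, the sketch below times the same simulation executable produced by different build configurations. The build names and paths are hypothetical placeholders; a serious comparison would also pin CPU frequency, repeat on an idle machine, and control for file-system caching:

```python
import statistics
import subprocess
import time

# Hypothetical paths to the same Geant4-based simulation built two ways;
# adjust these to your own build outputs.
BUILDS = {
    "dynamic-gcc485-O2": "./build-dyn/sim",
    "static-gcc820-O2": "./build-static/sim",
}

def time_build(executable: str, runs: int = 5) -> float:
    """Return the median wall-clock execution time over several runs."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run([executable], check=True, stdout=subprocess.DEVNULL)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

results = {name: time_build(path) for name, path in BUILDS.items()}
baseline = results["dynamic-gcc485-O2"]
for name, t in results.items():
    delta = 100.0 * (baseline - t) / baseline  # positive = faster than baseline
    print(f"{name}: {t:.2f}s ({delta:+.1f}% vs baseline)")
```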


2020, Vol 27 (2), pp. 218-233
Author(s): Mark G. Gonopolskiy, Alevtina B. Glonina

The paper presents an algorithm for worst-case response time (WCRT) estimation for multiprocessor systems with fixed-priority preemptive schedulers and interval uncertainty of task execution times. Each task has a unique priority within its processor, a period, an execution time interval [BCET, WCET], and possibly data dependencies on other tasks. If a decrease in the execution time of a task A can lead to an increase in the response time of another task B, then task A is called an anomalous task for task B. According to the chosen approach, estimating a task's WCRT takes two steps. The first is to construct the set of anomalous tasks for the given task using the proposed algorithm; the paper provides the algorithm and the proof of its correctness. The second is to find the WCRT estimate using a genetic algorithm. The proposed approach has been implemented as a program in Python 3. A set of experiments has been carried out to compare the proposed method, in terms of precision and speed, with two well-known WCRT estimation methods: a method that does not take interval uncertainty into account (assuming that the execution time of a given task is equal to its WCET) and the brute-force method. The results of the experiments show that, in contrast to the brute-force method, the proposed method is applicable to the analysis of real-scale computing systems, and that it achieves greater precision than the method that does not take interval uncertainty into account.
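For context, the WCET-only baseline the authors compare against is typically computed with the classic fixed-point response-time recurrence for fixed-priority preemptive scheduling. The sketch below implements that standard recurrence on a hypothetical task set; it is not the paper's genetic algorithm and deliberately ignores both data dependencies and interval uncertainty:

```python
import math

def wcrt(task_index, wcets, periods):
    """Classic fixed-point WCRT recurrence for fixed-priority preemptive
    scheduling on one processor: R = C_i + sum_j ceil(R / T_j) * C_j,
    where j ranges over higher-priority tasks (lower index here)."""
    c_i = wcets[task_index]
    r = c_i
    while True:
        interference = sum(math.ceil(r / periods[j]) * wcets[j]
                           for j in range(task_index))
        r_next = c_i + interference
        if r_next == r:
            return r           # fixed point reached: this is the WCRT
        if r_next > periods[task_index]:
            return None        # unschedulable (deadline = period missed)
        r = r_next

# Hypothetical task set, highest priority first: WCETs and periods in ms.
wcets, periods = [1, 2, 3], [4, 6, 12]
for i in range(len(wcets)):
    print(f"task {i}: WCRT = {wcrt(i, wcets, periods)}")
```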


Mathematics, 2020, Vol 8 (3), pp. 314
Author(s): Matteo Fusi, Fabio Mazzocchetti, Albert Farres, Leonidas Kosmidis, Ramon Canal, ...

Some high performance computing (HPC) applications exhibit increasing real-time requirements, which call for effective means to predict the distribution of their high execution times. This is a new challenge for HPC applications but a well-known problem for real-time embedded applications, where solutions already exist, although they target low-performance systems running single-threaded applications. In this paper, we show how some performance validation and measurement-based practices for real-time execution time prediction can be leveraged in the context of HPC applications on high-performance platforms, thus enabling reliable means to obtain real-time guarantees for those applications. In particular, the proposed methodology coordinates techniques that randomly explore the potential timing behavior of the application with Extreme Value Theory (EVT) to predict rare (and high) execution times and, eventually, derive probabilistic Worst-Case Execution Time (pWCET) curves. We demonstrate the effectiveness of this approach on an acoustic wave inversion application used for geophysical exploration.
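A minimal sketch of the EVT step, using a peaks-over-threshold fit: synthetic samples stand in for randomized timing measurements, and a Generalized Pareto Distribution fitted to the tail yields exceedance probabilities for candidate pWCET values. All numbers and modelling choices here are illustrative assumptions, not the paper's exact method:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Stand-in for randomized timing measurements of the application (seconds);
# in real use these would come from instrumented runs on the target platform.
samples = rng.gamma(shape=9.0, scale=0.01, size=10_000)

# Peaks-over-threshold: fit a Generalized Pareto Distribution to the
# exceedances above a high empirical quantile.
threshold = np.quantile(samples, 0.99)
excesses = samples[samples > threshold] - threshold
shape, loc, scale = genpareto.fit(excesses, floc=0.0)

def pwcet_exceedance(t: float) -> float:
    """P(execution time > t) for t above the threshold: the empirical
    probability of exceeding the threshold times the fitted GPD tail."""
    p_over_threshold = (samples > threshold).mean()
    return p_over_threshold * genpareto.sf(t - threshold, shape,
                                           loc=loc, scale=scale)

# Reading a few points off the resulting pWCET curve:
for t in [threshold * 1.1, threshold * 1.3, threshold * 1.5]:
    print(f"P(T > {t:.3f}s) ≈ {pwcet_exceedance(t):.2e}")
```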


2019, pp. 124-136
Author(s): Victor D. Gazman

The article considers prerequisites for the formation of a new paradigm in the energy sector. The factors that may affect the imminent change of leadership in energy generation are analyzed, and the variability of projects for creating and operating power stations is examined. The focus is on problematic aspects of the new generation, especially storage and supply of energy, and on achieving a system of parities that ensures balanced pricing across generation types. The author substantiates the principles for forming the system of parities that arises when comparing traditional and new generation. The article presents the results of an empirical analysis of 215 projects for the construction of renewable energy facilities, and determines the significance and direction of the impact of these factors on the growth in investment volumes of transactions. The author considers leasing an effective financial instrument for overcoming stereotypes about renewable energy and a promising direction for the accelerated implementation of investment projects.


Author(s): Gianluca Bardaro, Alessio Antonini, Enrico Motta

Abstract. Over the last two decades, several deployments of robots for in-house assistance of older adults have been trialled. However, these solutions are mostly prototypes and remain unused in real-life scenarios. In this work, we review the historical and current landscape of the field to try and understand why robots have yet to succeed as personal assistants in daily life. Our analysis focuses on two complementary aspects: the capabilities of the physical platform and the logic of the deployment. The former shows regularities in hardware configurations and functionalities, leading to the definition of a set of six application-level capabilities (exploration, identification, remote control, communication, manipulation, and digital situatedness). The latter focuses on the impact of robots on the daily life of users and categorises the deployment of robots for healthcare interventions using three types of services: support, mitigation, and response. Our investigation reveals that the value of healthcare interventions is limited by a stagnation of functionalities and a disconnection between the robotic platform and the design of the intervention. To address this issue, we propose a novel co-design toolkit, which uses an ecological framework for robot interventions in the healthcare domain. Our approach connects robot capabilities with known geriatric factors to create a holistic view encompassing both the physical platform and the logic of the deployment. As a case-study-based validation, we discuss the use of the toolkit in the pre-design of the robotic platform for a pilot intervention, part of the large-scale pilot of the EU H2020 GATEKEEPER project.


2020, Vol 51 (1), pp. 1-26
Author(s): Tobias Arnold, Sean Mueller, Adrian Vatter

Abstract. Over the past decades, decentralization has become the new paradigm for how states should organize power territorially. Carefully planned institutional re-designs are the most visible expression thereof. Yet the Great Recession of 2007–2009 pushed governments in the opposite direction, i.e., towards centralization, to better weather the fiscal drought. Given these contradictory developments, this article compares the effects of twenty-three separate state reforms with the impact of the Great Recession on fiscal centralization in twenty-nine countries over more than two decades. In the main, our analyses attribute a larger effect to design, i.e., pro-active policy making through reforms, than to reactive crisis management after a great shock. However, this difference only becomes apparent once we consider a state's institutional structure, that is, whether a political system is unitary or federal. Our findings thus highlight the need for a multidimensional approach to better understand the drivers of fiscal de/centralization.


Author(s): Stephen G. Wiedemann, Leo Biggs, Quan V. Nguyen, Simon J. Clarke, Kirsi Laitala, ...

Abstract
Purpose: Garment production and use generate substantial environmental impacts, and care and use are key determinants of cradle-to-grave impacts. The present study investigated the potential to reduce environmental impacts by applying best practices for garment care combined with increased garment use. A wool sweater is used as an example because wool garments have particular attributes that favour reduced environmental impacts in the use phase.
Methods: A cradle-to-grave life cycle assessment (LCA) was used to compare six plausible best- and worst-case practice scenarios for the use and care of a wool sweater, relative to current practices. These focussed on options available to consumers to reduce impacts, including reduced washing frequency, use of more efficient washing machines, reduced use of machine clothes dryers, garment reuse by multiple users, and an increased number of garment wears before disposal. A sixth scenario combined all options. Worst practices took the worst plausible alternative for each option investigated. Impacts were reported per wear in Western Europe for climate change, fossil energy demand, water stress and freshwater consumption.
Results and discussion: Washing less frequently reduced impacts by between 4 and 20%, while using more efficient washing machines at capacity reduced impacts by 1 to 6%, depending on the impact category. Reduced use of machine dryers reduced impacts by < 5% across all indicators. Reuse of garments by multiple users increased lifespan and reduced impacts by 25–28% across all indicators. Increasing wears from 109 to 400 per garment lifespan had the largest effect, decreasing impacts by 60% to 68% depending on the impact category. Best practice care, where garment use was maximised and care practices focussed on the minimum practical requirements, resulted in a ~75% reduction in impacts across all indicators. Unsurprisingly, worst-case scenarios increased impacts dramatically: using the garment once before disposal increased GHG impacts over 100 times.
Conclusions: Wool sweaters have potential for long life and low environmental impact in use, but there are substantial differences between the best, current and worst-case scenarios. Detailed information about garment care and lifespans is needed to understand and reduce environmental impacts. Opportunities exist for consumers to rapidly and dramatically reduce these impacts, and the fashion industry can facilitate this through garment design and marketing that promote and enable long wear life and minimal care.
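The headline result follows from simple amortization: the fixed production impact is spread over more wears while the per-wear care impact stays roughly constant. The figures below are invented for illustration only, not the study's LCA data:

```python
# Illustrative only: hypothetical impact figures, not the paper's results.
# Per-wear impact = (one-off production impact amortized over total wears)
#                 + per-wear care impact (washing, drying).
production_kgco2e = 20.0      # incurred once per garment (hypothetical)
care_kgco2e_per_wear = 0.05   # washing/drying share per wear (hypothetical)

def impact_per_wear(wears: int) -> float:
    return production_kgco2e / wears + care_kgco2e_per_wear

base, extended = impact_per_wear(109), impact_per_wear(400)
print(f"109 wears: {base:.3f} kgCO2e/wear")
print(f"400 wears: {extended:.3f} kgCO2e/wear "
      f"({100 * (base - extended) / base:.0f}% lower)")
```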


Author(s): Luis Fernando Arcaro, Karila Palma Silva, Romulo Silva de Oliveira, Luis Almeida

1988, Vol 11 (1), pp. 1-19
Author(s): Andrzej Rowicki

The purpose of the paper is to consider an algorithm for preemptive scheduling on two-processor systems with identical processors. Computations submitted to the system are composed of dependent tasks with arbitrary execution times, contain no loops, and have only one output. We assume that preemption times are completely unconstrained and that preemptions consume no time. Moreover, the algorithm determines the total execution time of the computation. It has been proved that this algorithm is optimal, that is, that the total execution time of the computation (the schedule length) is minimized.
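The abstract does not spell out the algorithm itself, so the sketch below shows only two standard ingredients of this setting: the task "level" (longest remaining path through the dependency DAG), which level-based two-processor schedulers such as the Muntz-Coffman algorithm prioritize by, and the classic lower bound on schedule length that any two-processor preemptive schedule must respect. The DAG is a hypothetical example:

```python
from functools import lru_cache

# Hypothetical DAG of dependent tasks: name -> (execution time, successors).
# The single sink "out" models the computation's single output.
tasks = {
    "a": (3.0, ["c"]),
    "b": (2.0, ["c", "d"]),
    "c": (4.0, ["out"]),
    "d": (1.0, ["out"]),
    "out": (2.0, []),
}

@lru_cache(maxsize=None)
def level(name: str) -> float:
    """Length of the longest path from this task to the end of the DAG
    (the task's 'level', used as the priority by level-based schedulers)."""
    time, succs = tasks[name]
    return time + max((level(s) for s in succs), default=0.0)

total_work = sum(t for t, _ in tasks.values())
critical_path = max(level(n) for n in tasks)

# No two-processor preemptive schedule can finish sooner than this:
lower_bound = max(total_work / 2.0, critical_path)
print(f"total work = {total_work}, critical path = {critical_path}")
print(f"schedule length lower bound = {lower_bound}")
```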

