GRIDCC: A Real-Time Grid Workflow System with QoS

2007 ◽  
Vol 15 (4) ◽  
pp. 213-234 ◽  
Author(s):  
A. Stephen McGough ◽  
Asif Akram ◽  
Li Guo ◽  
Marko Krznaric ◽  
Luke Dickens ◽  
...  

The overarching aim of Grid computing is to move computational resources from individual institutions, where they can only be used for in-house work, to a more open vision of vast online ubiquitous 'virtual computational' resources which support individuals and collaborative projects. A major step towards realizing this vision is the provision of instrumentation – such as telescopes, accelerators or electrical power stations – as Grid resources, and the tools to manage these resources online. The GRIDCC project attempts to satisfy these requirements by providing four co-dependent components: a flexible wrapper for publishing instruments as Grid resources; workflow support for the orchestration of multiple Grid resources in a timely manner; the machinery to make reservation agreements on Grid resources; and the facility to satisfy quality of service (QoS) requirements on elements within workflows. In this paper we detail the set of services developed as part of the GRIDCC project to provide the last three of these components. We provide a detailed architecture for these services along with results from load-testing experiments. These services are currently deployed as a test-bed at a number of institutions across Europe, and are poised to provide a 'virtual lab' to production-level applications.
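
The GRIDCC service interfaces are not described in the abstract, so the sketch below is purely illustrative: it matches a hypothetical advance reservation against a QoS deadline on a workflow element (the `Reservation` and `WorkflowTask` types are assumptions, not the project's API).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Reservation:
    """Hypothetical advance-reservation agreement on a Grid resource."""
    resource: str
    start: datetime
    end: datetime

@dataclass
class WorkflowTask:
    """Hypothetical workflow element with a QoS deadline requirement."""
    name: str
    estimated_runtime: timedelta
    deadline: datetime

def pick_reservation(task: WorkflowTask, reservations: list[Reservation]) -> Reservation | None:
    """Return the earliest reservation that lets the task finish before its deadline."""
    feasible = [
        r for r in reservations
        if r.start + task.estimated_runtime <= min(r.end, task.deadline)
    ]
    return min(feasible, key=lambda r: r.start, default=None)

# Usage: a telescope observation that must complete by 02:00.
task = WorkflowTask("observe", timedelta(minutes=30), datetime(2007, 5, 1, 2, 0))
slots = [Reservation("telescope-A", datetime(2007, 5, 1, 0, 0), datetime(2007, 5, 1, 1, 0))]
print(pick_reservation(task, slots))
```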

Author(s):  
Kenneth J. Turner ◽  
Paul Lambert ◽  
K. L. Tan ◽  
Vernon Gayle ◽  
Richard O. Sinnott ◽  
...  

Grid computing is named by analogy with the electrical power grid. Power stations are linked into a universal supply that delivers electricity on demand to consumers. Similarly, computational resources can be linked into a grid that delivers computing or data on demand to the user’s desktop. The origins of grid computing lie in networked computing, distributed computing, and parallel computing. Grid computing coordinates distributed resources that are not subject to central control, using standard protocols and interfaces to meet the required levels of service (Foster, 2002).


2006 ◽  
Vol 14 (3-4) ◽  
pp. 231-250 ◽  
Author(s):  
Ivona Brandic ◽  
Sabri Pllana ◽  
Siegfried Benkner

Many important scientific and engineering problems may be solved by combining multiple applications in the form of a Grid workflow. We consider it important for the wide acceptance of Grid technology that the user is able to express requirements on Quality of Service (QoS) at workflow specification time. However, most existing workflow languages lack constructs for QoS specification. In this paper we present an approach for high-level workflow specification that considers a comprehensive set of QoS requirements. Besides performance-related QoS, it includes economic, legal and security aspects. For instance, for security or legal reasons the user may express location affinity regarding the Grid resources on which certain workflow tasks may be executed. Our QoS-aware workflow system provides support for the whole workflow life cycle, from specification to execution. Workflows are specified graphically, in an intuitive manner, based on a standard visual modeling language. A set of QoS-aware service-oriented components is provided for workflow planning to support automatic constraint-based service negotiation and workflow optimization. To reduce the complexity of workflow planning, we introduce a QoS-aware workflow reduction technique. We illustrate our approach with a real-world workflow for maxillofacial surgery simulation.
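
The paper's workflow language is not reproduced in the abstract; the sketch below only illustrates how QoS constraints such as location affinity, price and runtime bounds might prune candidate services during planning (all class and field names are assumptions, not the authors' notation):

```python
from dataclasses import dataclass, field

@dataclass
class ServiceOffer:
    """A candidate Grid service discovered during workflow planning (illustrative)."""
    endpoint: str
    location: str              # e.g. country code of the hosting site
    price_per_run: float       # economic QoS
    expected_runtime_s: float  # performance QoS

@dataclass
class TaskQoS:
    """QoS constraints attached to one workflow task (illustrative)."""
    allowed_locations: set[str] = field(default_factory=set)  # legal/security affinity
    max_price: float = float("inf")
    max_runtime_s: float = float("inf")

def prune_candidates(offers: list[ServiceOffer], qos: TaskQoS) -> list[ServiceOffer]:
    """QoS-aware reduction step: discard offers that violate any hard constraint."""
    return [
        o for o in offers
        if (not qos.allowed_locations or o.location in qos.allowed_locations)
        and o.price_per_run <= qos.max_price
        and o.expected_runtime_s <= qos.max_runtime_s
    ]

# Example: a simulation task that must run on resources hosted in AT or DE.
offers = [
    ServiceOffer("https://site-a.example/sim", "AT", 12.0, 3600),
    ServiceOffer("https://site-b.example/sim", "US", 8.0, 2400),
]
print(prune_candidates(offers, TaskQoS(allowed_locations={"AT", "DE"}, max_price=20.0)))
```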


2011 ◽  
Vol 121-126 ◽  
pp. 1744-1748
Author(s):  
Xiang Yang Jin ◽  
Tie Feng Zhang ◽  
Li Li Zhao ◽  
He Teng Wang ◽  
Xiang Yi Guan

To determine the efficiency, load-bearing capacity and fatigue life of beveloid gears with intersecting axes, we design a mechanical gear test bed with closed power flow. To assess the quality of its structure and predict its overall performance, we establish a three-dimensional solid model of the various components based on the design parameters and adopt virtual prototyping simulation technology to conduct kinematics simulation on it, observing and verifying the interactive kinematic behaviour of each component. Moreover, the finite element method is used to carry out structural statics and dynamics analyses of some key components. The results indicate that the test bed can achieve the desired functionality, and that the static and dynamic performance of the key components satisfies the requirements.


2012 ◽  
Vol 19 (1) ◽  
pp. 39-48 ◽  
Author(s):  
Jarosław Zygarlicki ◽  
Janusz Mroczka

Variable-Frequency Prony Method in the Analysis of Electrical Power Quality

The article presents a new modification of the least-squares Prony method. The so-called variable-frequency Prony method can be a useful tool for estimating the parameters of sinusoidal components which, in the analyzed signal, are characterized by time-dependent frequencies. The authors propose using the presented method for testing the quality of electric energy, as it allows the observation of phenomena which, with traditional methods, are averaged within the analysis window. The proposed modification of the least-squares Prony method is based on the introduction and specific selection of a frequency matrix. This matrix represents the frequencies of the estimated components and their variability in time.
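
The abstract does not give the exact construction of the frequency matrix, so the following is only a rough sketch under that assumption: each column of a hypothetical frequency matrix holds the per-sample instantaneous frequency of one component, the corresponding sine/cosine basis is built from the cumulative phase, and a linear least-squares fit recovers amplitudes and phases.

```python
import numpy as np

def variable_frequency_fit(signal, freq_matrix, fs):
    """Least-squares amplitude/phase fit for components with known,
    possibly time-varying frequencies.

    signal      : (N,) real samples
    freq_matrix : (N, K) instantaneous frequency in Hz of each of K components
    fs          : sampling rate in Hz
    Returns (amplitudes, phases), each of shape (K,).
    """
    k = freq_matrix.shape[1]
    # Instantaneous phase = 2*pi * cumulative sum of frequency / fs.
    phase = 2 * np.pi * np.cumsum(freq_matrix, axis=0) / fs   # (N, K)
    # Design matrix with one cosine and one sine column per component.
    a = np.hstack([np.cos(phase), np.sin(phase)])             # (N, 2K)
    coef, *_ = np.linalg.lstsq(a, signal, rcond=None)
    c, s = coef[:k], coef[k:]
    return np.hypot(c, s), np.arctan2(-s, c)

# Synthetic test: a 2.3-amplitude component whose frequency drifts from 50 Hz to 51 Hz.
fs, n = 5000, 5000
f_track = np.linspace(50.0, 51.0, n)[:, None]                 # (N, 1) frequency matrix
x = 2.3 * np.cos(2 * np.pi * np.cumsum(f_track[:, 0]) / fs + 0.4)
amp, ph = variable_frequency_fit(x, f_track, fs)
print(amp, ph)   # approximately [2.3], [0.4]
```

With constant frequency tracks, this sketch reduces to the ordinary least-squares amplitude/phase estimation step of the Prony method.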


Author(s):  
O. Salor ◽  
B. Gultekin ◽  
S. Buhan ◽  
B. Boyrazoglu ◽  
T. Inan ◽  
...  

2017 ◽  
Vol 2017 ◽  
pp. 1-14 ◽  
Author(s):  
Vennila Ganesan ◽  
Manikandan MSK

Managing the performance of the Session Initiation Protocol (SIP) server under heavy load conditions is a critical task in a Voice over Internet Protocol (VoIP) network. In this paper, a two-tier model is proposed to address the security, load mitigation, and distribution issues of the SIP server. In the first tier, the proposed handler segregates and drops malicious traffic. The second tier provides uniform load distribution using the least session termination time (LSTT) algorithm. In addition, the mean session termination time is minimized by reducing the waiting time of the SIP messages. The efficiency of the LSTT algorithm is evaluated on an experimental test bed, both with and without the handler. The experimental results establish that the proposed two-tier model improves throughput and CPU utilization. It also reduces the response time and error rate while preserving the quality of multimedia session delivery. This two-tier model provides robust security, dynamic load distribution, appropriate server selection, and session synchronization.
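
The LSTT algorithm itself is not detailed in the abstract; the sketch below only illustrates the general idea, assuming the dispatcher keeps a running average of observed session termination times per backend SIP server and routes each new session to the server with the smallest estimate (all names are illustrative):

```python
from dataclasses import dataclass

@dataclass
class SipServer:
    """Backend SIP server as seen by a hypothetical LSTT-style dispatcher."""
    name: str
    avg_termination_s: float = 0.0   # running average of observed session termination time
    samples: int = 0

    def record(self, termination_s: float) -> None:
        """Update the running average after a session completes."""
        self.samples += 1
        self.avg_termination_s += (termination_s - self.avg_termination_s) / self.samples

def pick_server(servers: list[SipServer]) -> SipServer:
    """Route the next session to the server with the least estimated termination time."""
    return min(servers, key=lambda s: s.avg_termination_s)

# Example: two backends; the dispatcher favours the faster one.
a, b = SipServer("sip-a"), SipServer("sip-b")
for t in (2.1, 1.9, 2.0):
    a.record(t)
for t in (3.4, 3.1):
    b.record(t)
print(pick_server([a, b]).name)   # sip-a
```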


Author(s):  
Bilikis T. Folarin ◽  
Mohamed Abdallah ◽  
Temilola O. Oluseyi ◽  
Stuart Harrad ◽  
Kehinde O. Olayinka

The period from the first self-sustaining nuclear reaction to the present day spans three decades; within that time, large-scale generation of electrical power from nuclear energy has become acknowledged as economic, safe and environmentally acceptable. Within the U.K., 10% of electricity consumed is of nuclear origin. Some of the C.E.G.B. reactors have been in service for over 10 years. The operating experience that has been gained shows how the original design concepts have ultimately developed. Some of the difficulties encountered and the engineering solutions are presented. Operating experience feeds back into the design philosophy and safety requirements for future nuclear plant. In this way a foundation is provided for the further exploitation of what must become a major source of energy in the next decade.


Author(s):  
MARIO PIATTINI ◽  
MARCELA GENERO ◽  
LUIS JIMÉNEZ

It is generally accepted in the information system (IS) field that IS quality is highly dependent on decisions made early in the development life cycle. The construction of conceptual data models is often an important task of this early development. Therefore, improving the quality of conceptual data models will be a major step towards the quality improvement of IS development. Several quality frameworks for conceptual data models have been proposed, but most of them lack valid quantitative measures for evaluating the quality of conceptual data models in an objective way. In this article we define measures for the structural complexity (internal attribute) of entity relationship diagrams (ERD) and use them for predicting their maintainability (external attribute). We theoretically validate the proposed metrics following Briand et al.'s framework, with the goal of demonstrating the properties that characterise each metric. We also show how it is possible to predict each of the maintainability sub-characteristics using a prediction model generated with a novel method for induction of fuzzy rules.
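
The concrete metrics are not listed in the abstract; as an illustration of the kind of structural-complexity measures involved, the sketch below counts entities, attributes, relationships and many-to-many relationships in a toy ER diagram (the data model and metric names are assumptions, not the authors' definitions):

```python
from dataclasses import dataclass

@dataclass
class Entity:
    name: str
    attributes: list[str]

@dataclass
class Relationship:
    name: str
    left: str
    right: str
    cardinality: str   # e.g. "1:1", "1:N", "N:M"

def structural_complexity(entities: list[Entity], rels: list[Relationship]) -> dict[str, int]:
    """Simple structural-complexity counts for an entity-relationship diagram."""
    return {
        "NE": len(entities),                                   # number of entities
        "NA": sum(len(e.attributes) for e in entities),        # number of attributes
        "NR": len(rels),                                       # number of relationships
        "NM_NR": sum(r.cardinality == "N:M" for r in rels),    # many-to-many relationships
    }

# Example ERD: two entities and one many-to-many relationship.
erd_entities = [Entity("Student", ["id", "name"]), Entity("Course", ["code", "title"])]
erd_rels = [Relationship("enrols", "Student", "Course", "N:M")]
print(structural_complexity(erd_entities, erd_rels))
```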

