A Web-Services Accessible Turbulence Database and Application to A-Priori Testing of a Matrix Exponential Subgrid Model

Author(s):  
Charles Meneveau
2021 ◽  
Vol 15 (2) ◽  
pp. 1-25
Author(s):  
Amal Alhosban ◽  
Zaki Malik ◽  
Khayyam Hashmi ◽  
Brahim Medjahed ◽  
Hassan Al-Ababneh

Service-Oriented Architectures (SOA) enable the automatic creation of business applications from independently developed and deployed Web services. Because Web services are inherently unknown a priori, delivering reliable Web service compositions is a significant and challenging problem. Services involved in an SOA often do not operate under a single processing environment and need to communicate using different protocols over a network. Under such conditions, designing a fault management system that is both efficient and extensible is a challenging task. In this article, we propose SFSS, a self-healing framework for SOA fault management that predicts, identifies, and resolves faults in SOAs. In SFSS, we identify a set of high-level exception handling strategies based on the QoS performance of the component services and the preferences articulated by the service consumers. Multiple recovery plans are generated and evaluated according to the performance of the selected component services, and the best recovery plan is executed. We assess the overall user dependence (i.e., the degree to which a service is independent of other services) using the generated plan and the available invocation information of the component services. Experimental results indicate that the proposed technique enhances service selection quality by choosing the services with the highest scores, improves overall system performance, and demonstrates the applicability of SFSS, showing improved performance in comparison to similar approaches.
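As a rough illustration of the recovery-plan selection step described above, the sketch below scores candidate plans by a weighted combination of the QoS attributes of their component services and picks the highest-scoring plan. The attribute names, weights, and aggregation rule are assumptions for illustration only; the abstract does not specify the actual SFSS scoring function.

# Hypothetical sketch of QoS-based recovery-plan scoring (not the actual SFSS algorithm).
from dataclasses import dataclass
from typing import List

@dataclass
class ServiceQoS:
    availability: float   # 0..1, higher is better
    reliability: float    # 0..1, higher is better
    response_time: float  # seconds, lower is better

@dataclass
class RecoveryPlan:
    name: str
    services: List[ServiceQoS]

# Assumed consumer preferences: weights for each QoS attribute.
WEIGHTS = {"availability": 0.4, "reliability": 0.4, "response_time": 0.2}

def score(plan: RecoveryPlan, max_rt: float = 5.0) -> float:
    """Aggregate the QoS of a plan's component services into a single score."""
    avail = min(s.availability for s in plan.services)   # weakest link
    rel = 1.0
    for s in plan.services:
        rel *= s.reliability                              # chained reliability
    rt = sum(s.response_time for s in plan.services)      # sequential latency
    rt_norm = max(0.0, 1.0 - rt / max_rt)                 # normalize to 0..1
    return (WEIGHTS["availability"] * avail
            + WEIGHTS["reliability"] * rel
            + WEIGHTS["response_time"] * rt_norm)

def best_plan(plans: List[RecoveryPlan]) -> RecoveryPlan:
    """Return the recovery plan with the highest aggregate QoS score."""
    return max(plans, key=score)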


Author(s):  
Nacéra Bennacer ◽  
Guy Vidal-Naquet

This paper proposes an Ontology-driven and Community-based Web Services (OCWS) framework that aims to automate the discovery, composition and execution of web services. The purpose is to validate and execute a user's request built from the composition of a set of OCWS descriptions and a set of user constraints. The framework clearly separates the external OCWS descriptions from the internal implementations of e-services. It identifies three levels: the knowledge level, the community level and the e-services level, and uses different participant agents deployed in a distributed architecture. First, the reasoner agent uses a description logic extended with actions to reason about: (i) the consistency of the pre-conditions and post-conditions of the OCWS descriptions and the user constraints with the ontology semantics, and (ii) the consistency of the workflow matching assertions and the execution dependency graph. The execution plan model is then generated automatically and run by the composer agents using the dynamic execution plan algorithm (DEPA), according to the workflow matching and the established execution order. The community composer agents invoke the appropriate e-services and ensure that the non-functional constraints are satisfied. DEPA works dynamically, without a priori information about e-service states, and has interesting properties such as accounting for the non-determinism of e-services and reducing the search space.
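A minimal sketch of the kind of dependency-driven execution such a plan implies: each service is invoked as soon as all of the services it depends on have completed, following a topological order over the execution dependency graph. The graph representation and invocation interface below are assumptions for illustration, not the actual DEPA implementation.

# Hypothetical sketch: executing services in dependency order (illustrative, not DEPA itself).
from collections import deque
from typing import Callable, Dict, List

def execute_plan(deps: Dict[str, List[str]],
                 invoke: Callable[[str], None]) -> List[str]:
    """deps maps each service to the services it depends on.
    Invokes each service once all its dependencies have completed (Kahn's algorithm)."""
    remaining = {s: set(d) for s, d in deps.items()}
    dependents: Dict[str, List[str]] = {s: [] for s in deps}
    for s, ds in deps.items():
        for d in ds:
            dependents.setdefault(d, []).append(s)
    ready = deque(s for s, ds in remaining.items() if not ds)
    order: List[str] = []
    while ready:
        s = ready.popleft()
        invoke(s)
        order.append(s)
        for t in dependents.get(s, []):
            remaining[t].discard(s)
            if not remaining[t]:
                ready.append(t)
    return order

# Example: serviceC depends on A and B, which are independent of each other.
plan = {"serviceA": [], "serviceB": [], "serviceC": ["serviceA", "serviceB"]}
execute_plan(plan, invoke=lambda s: print("invoking", s))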


Author(s):  
Marco Alberti ◽  
Marco Gavanelli ◽  
Evelina Lamma ◽  
Federico Chesani ◽  
Paola Mello ◽  
...  
Keyword(s):  

Author(s):  
Ramesh Kumar ◽  
Lavanya V ◽  
Karthika R ◽  
Harshini B

We propose a two-step, context-based semantic approach to the problem of matching and ranking Web services for possible service composition. Semantic understanding of Web services may provide added value by identifying new possibilities for service composition. The semantic matching and ranking approach is unique in that it provides the Web service designer with an explicit numeric estimate of the extent to which a possible composition satisfies the request. The composition process consists of multiple services that can be executed in sequence or in parallel. Given a service request, a set of candidates (available services and service patterns) is generated dynamically, layer by layer, from the inputs to the outputs of the request. For each layer, the algorithm first traverses an a priori search space consisting of service patterns drawn from historical solutions, and then searches for available services in the repositories. Thus each layer contains all services and service patterns that can be executed with the set of outputs provided by previous layers. The search process terminates once all the outputs of the request are obtained.
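The layered, inputs-to-outputs expansion described above can be illustrated with a small sketch: starting from the request's inputs, each layer adds every service whose required inputs are already available, and the loop stops once the request's outputs are covered. The data model (services as input/output sets) is an assumption for illustration; the ranking step and the historical service patterns are omitted.

# Hypothetical sketch of layer-by-layer forward search for service composition.
from typing import Dict, List, Set

def layered_search(services: Dict[str, Dict[str, Set[str]]],
                   request_inputs: Set[str],
                   request_outputs: Set[str]) -> List[List[str]]:
    """services maps a name to {'in': required inputs, 'out': produced outputs}.
    Returns the layers of services added until the request outputs are covered."""
    available = set(request_inputs)
    used: Set[str] = set()
    layers: List[List[str]] = []
    while not request_outputs <= available:
        layer = [name for name, s in services.items()
                 if name not in used and s["in"] <= available]
        if not layer:
            raise ValueError("request cannot be satisfied with the given services")
        for name in layer:
            available |= services[name]["out"]
            used.add(name)
        layers.append(layer)
    return layers

# Example: two services chained to produce the requested 'itinerary'.
catalog = {
    "geocode": {"in": {"address"}, "out": {"coordinates"}},
    "route":   {"in": {"coordinates"}, "out": {"itinerary"}},
}
print(layered_search(catalog, {"address"}, {"itinerary"}))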


Author(s):  
Florin-Claudiu Pop ◽  
Marcel Cremene ◽  
Mircea Vaida ◽  
Michel Riveill ◽  
Jean-Yves Tigli ◽  
...  

The widespread use of Web services in the ubiquitous computing era and the impossibility of predicting all possible user needs a priori create the need for on-demand service composition. Natural language is one of the easiest ways for a user to express what they expect from a service. Two main problems must be solved to create a composite service that satisfies the user: a) retrieval of relevant services, and b) orchestration/composition of the selected services to fulfill the user request. We address the first problem by using semantic concepts associated with the services, and we define a conceptual distance to measure the similarity between the user request and a service configuration. Retrieved services are then composed based on aspect-oriented templates called Aspects of Assembly. We have tested our application in a pervasive computing environment called Ubiquarium, where our system composes a service according to a user request expressed as a sentence. The implementation is based on the WComp middleware, which lets us use both regular Web services and Web services for devices.
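One simple way to realize a conceptual distance of the kind mentioned above is to measure path lengths between concepts in an ontology and aggregate them over the concepts extracted from the request and from a service configuration. The sketch below does this with shortest paths in a concept graph; the graph, the concept-extraction step, and the aggregation rule are assumptions for illustration, not the measure defined in the paper.

# Hypothetical sketch of a conceptual distance between a request and a service configuration.
from collections import deque
from typing import Dict, List, Set

def shortest_path(graph: Dict[str, Set[str]], a: str, b: str) -> int:
    """Breadth-first shortest path length between two concepts (large value if disconnected)."""
    if a == b:
        return 0
    seen, frontier, depth = {a}, deque([a]), 0
    while frontier:
        depth += 1
        for _ in range(len(frontier)):
            for nxt in graph.get(frontier.popleft(), set()):
                if nxt == b:
                    return depth
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return 10**9

def conceptual_distance(graph: Dict[str, Set[str]],
                        request_concepts: List[str],
                        service_concepts: List[str]) -> float:
    """Average, over request concepts, of the distance to the closest service concept."""
    return sum(min(shortest_path(graph, r, s) for s in service_concepts)
               for r in request_concepts) / len(request_concepts)

# Tiny example ontology (adjacency sets): 'projector' and 'screen' are two hops apart.
onto = {"display": {"projector", "screen"}, "projector": {"display"}, "screen": {"display"}}
print(conceptual_distance(onto, ["projector"], ["screen"]))  # 2.0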


1999 ◽  
Vol 398 ◽  
pp. 321-346 ◽  
Author(s):  
JACOB A. LANGFORD ◽  
ROBERT D. MOSER

It is shown that there is an abstract subgrid model that is in all senses ideal. An LES using the ideal subgrid model will exactly reproduce all single-time, multi-point statistics, and at the same time will have minimum possible error in instantaneous dynamics. The ideal model is written as an average over the real turbulent fields whose large scales match the current LES field. But this conditional average cannot be computed directly. Rather, the ideal model is the target for approximation when developing practical models, though no new practical models are presented here. To construct such models, the conditional average can be formally approximated using stochastic estimation. These optimal formulations are presented, and it is shown that a relatively simple but general class of one-point estimates can be computed from two-point correlation data, and that the estimates retain some of the statistical properties of the ideal model.

To investigate the nature of these models, optimal formulations were applied to forced isotropic turbulence. A variety of optimal models of increasing complexity were computed. In all cases, it was found that the errors between the real and estimated subgrid force were nearly as large as the subgrid force itself. It is suggested that this may also be characteristic of the ideal model in isotropic turbulence. If this is the case, then it explains why subgrid models produce reasonable results in actual LES while performing poorly in a priori tests. Despite the large errors in the optimal models, one feature of the subgrid interaction that is exactly represented is the energy transfer to the subgrid scales by each wavenumber.
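For context, the conditional average referred to above can be stated compactly. The following is a schematic rendering of the ideal model and its linear stochastic estimate; the notation is assumed here, not quoted from the paper.

\[
  m_i^{\mathrm{ideal}}(\tilde{u}) \;=\; \bigl\langle\, M_i \;\big|\; \widetilde{u^{\mathrm{real}}} = \tilde{u} \,\bigr\rangle ,
\]
i.e. the average of the exact subgrid force \(M_i\) over all real turbulent fields whose filtered (large-scale) part equals the current LES field \(\tilde{u}\). A linear stochastic estimate approximates this conditional average with
\[
  m_i^{\mathrm{est}}(x) \;=\; \int K_{ij}(x,x')\,\tilde{u}_j(x')\,\mathrm{d}x' ,
  \qquad
  \bigl\langle \bigl(M_i - m_i^{\mathrm{est}}\bigr)\,\tilde{u}_k(x'') \bigr\rangle = 0 ,
\]
so the kernel \(K_{ij}\) is determined by the correlations \(\langle M_i\,\tilde{u}_k\rangle\) and \(\langle \tilde{u}_j\,\tilde{u}_k\rangle\), which is why one-point optimal estimates can be computed from two-point correlation data.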


2002 ◽  
Vol 59 (4) ◽  
pp. 861-876 ◽  
Author(s):  
Berengere Dubrulle ◽  
Jean-Philippe Laval ◽  
Peter P. Sullivan ◽  
Joseph Werne

2013 ◽  
Vol 7 (3) ◽  
pp. 53-67 ◽  
Author(s):  
Michele Tomaiuolo

In the context of Web services, access control presents some interesting challenges, especially when services are exposed to a global audience, with users accessing them from different systems and under different security settings. Trust Management represents a decentralized approach to access control that can be applied to such open environments. It is based on the peer-to-peer delegation of access rights among users, also across organizational boundaries, without assuming a priori the existence of trusted third parties in the system. This article presents dDelega, a Trust Management framework for SOAP-style and REST-style Web services, available as open source software and usable in different application scenarios. The framework allows users to create multiple levels of delegation of access rights for protected resources. It defines various certificates for binding names, permissions and oblivious attributes to users, adhering to relevant standards such as WS-Security, SAML and XACML.
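A delegation chain of the kind described above can be checked with a simple rule: each certificate must be issued by the holder of the previous one, and the granted permissions may only narrow along the chain. The sketch below illustrates that rule only; the certificate fields, the absence of signature verification, and the data model are assumptions for illustration and do not reflect the actual dDelega certificates.

# Hypothetical sketch of verifying a chain of delegation certificates.
from dataclasses import dataclass
from typing import List, Optional, Set

@dataclass
class DelegationCert:
    issuer: str            # identity granting the rights
    subject: str           # identity receiving the rights
    permissions: Set[str]  # rights granted by this certificate
    delegable: bool        # whether the subject may delegate further

def verify_chain(root_owner: str, chain: List[DelegationCert],
                 requested: Set[str]) -> bool:
    """Accept a request if the chain starts at the resource owner, each link is
    issued by the previous subject, rights only narrow, and delegation is allowed."""
    holder = root_owner
    granted: Optional[Set[str]] = None
    for i, cert in enumerate(chain):
        if cert.issuer != holder:
            return False                      # broken link in the chain
        if granted is not None and not cert.permissions <= granted:
            return False                      # rights must narrow, never widen
        if i < len(chain) - 1 and not cert.delegable:
            return False                      # further delegation not permitted
        granted = cert.permissions if granted is None else cert.permissions & granted
        holder = cert.subject
    return granted is not None and requested <= granted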


Author(s):  
D. E. Luzzi ◽  
L. D. Marks ◽  
M. I. Buckett

As the HREM becomes increasingly used for the study of dynamic localized phenomena, the development of techniques to recover the desired information from a real image is important. Often, the important features scatter only weakly in comparison with the matrix material, and are additionally masked by statistical and amorphous noise. The desired information usually involves accurate knowledge of the position and intensity of the contrast. To decipher this information from a complex image, cross-correlation (xcf) techniques can be utilized. Unlike other image-processing methods that rely on data massaging (e.g. high/low-pass filtering or Fourier filtering), the cross-correlation method is a rigorous data-reduction technique with no a priori assumptions. We have examined basic cross-correlation procedures using images of discrete Gaussian peaks and have developed an iterative procedure to greatly enhance the capabilities of these techniques when the contrast from the peaks overlaps.
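As a concrete illustration of the basic cross-correlation step described above, the sketch below correlates an image with a Gaussian template via FFT and reads off the position of the strongest match. It is a generic example under assumed array shapes, not the iterative procedure developed in the paper.

# Hypothetical sketch: locating a Gaussian feature by FFT-based cross-correlation.
import numpy as np

def gaussian_template(size: int, sigma: float) -> np.ndarray:
    """Zero-mean Gaussian template of shape (size, size)."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return g - g.mean()

def correlate_and_locate(image: np.ndarray, template: np.ndarray):
    """Cross-correlate image with a same-shaped template (via FFT) and return the
    pixel offset of the correlation peak relative to the template centre."""
    img = image - image.mean()
    xcf = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(template))).real
    peak = np.unravel_index(np.argmax(xcf), xcf.shape)
    return peak, xcf

# Example: a noisy 64x64 image containing one Gaussian peak centred at (40, 25).
rng = np.random.default_rng(0)
image = rng.normal(0, 0.1, (64, 64))
yy, xx = np.mgrid[:64, :64]
image += np.exp(-((yy - 40)**2 + (xx - 25)**2) / (2 * 2.0**2))
peak, _ = correlate_and_locate(image, gaussian_template(64, 2.0))
print("correlation peak at offset", peak)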


Author(s):  
H.S. von Harrach ◽  
D.E. Jesson ◽  
S.J. Pennycook

Phase contrast TEM has been the leading technique for high resolution imaging of materials for many years, whilst STEM has been the principal method for high-resolution microanalysis. However, it was demonstrated many years ago that low angle dark-field STEM imaging is a priori capable of almost 50% higher point resolution than coherent bright-field imaging (i.e. phase contrast TEM or STEM). This advantage was not exploited until Pennycook developed the high-angle annular dark-field (ADF) technique which can provide an incoherent image showing both high image resolution and atomic number contrast.

This paper describes the design and first results of a 300kV field-emission STEM (VG Microscopes HB603U) which has improved ADF STEM image resolution towards the 1 angstrom target. The instrument uses a cold field-emission gun, generating a 300 kV beam of up to 1 μA from an 11-stage accelerator. The beam is focussed on to the specimen by two condensers and a condenser-objective lens with a spherical aberration coefficient of 1.0 mm.

