Page Digest for large-scale Web services

Author(s):  
D. Rocco ◽  
D. Buttler ◽  
Ling Liu
Author(s):  
S. Blaser ◽  
J. Meyer ◽  
S. Nebiker ◽  
L. Fricker ◽  
D. Weber

Abstract. Advances in digitalization technologies are leading to rapid and massive changes in infrastructure management. New collaborative processes and workflows require detailed, accurate and up-to-date 3D geodata. Image-based web services with 3D measurement functionality, for example, transfer dangerous and costly inspection and measurement tasks from the field to the office workplace. In this contribution, we introduce an image-based backpack mobile mapping system and new georeferencing methods for capturing previously inaccessible outdoor locations. We carried out large-scale performance investigations at two different test sites, one in a city centre and one in a forest area, and compared the performance of direct, SLAM-based and image-based georeferencing under demanding real-world conditions. Both test sites include areas with restricted GNSS reception, poor illumination, and uniform or ambiguous geometry, which pose major challenges for reliable and accurate georeferencing. In our comparison of georeferencing methods, image-based georeferencing improved the median precision of coordinate measurements over direct georeferencing by a factor of 10–15, down to 3 mm. Image-based georeferencing also showed superior absolute accuracy, with results in the range of 4.3 cm to 13.2 cm. Our investigations show great potential for complementing 3D image-based geospatial web services of cities as well as for creating such web services for forest applications. In addition, such accurately georeferenced 3D imagery has enormous potential for future visual localization and augmented reality applications.


2014 ◽  
Vol 989-994 ◽  
pp. 4350-4354
Author(s):  
Chun Shan ◽  
Lu Xia Wu ◽  
Yang Yang ◽  
Jing Feng Xue

Communication between heterogeneous systems is difficult to achieve without affecting the existing systems in Enterprise Application Integration. This paper proposes dynamic web service publishing software based on CXF. The purpose of the software is twofold. First, it encapsulates specified data requested by other systems into dynamically published web services, unifying customization across heterogeneous systems. Second, its service-monitoring function allows users to obtain real-time information about the published web services. The proposed software has been deployed in a large-scale Chinese military enterprise. Practice in the enterprise shows that the software can simplify web service development, quickly build web services that encapsulate large amounts of data, and achieve information sharing within the enterprise environment.
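The paper itself gives no code, but as a minimal sketch of dynamic service publishing in the spirit it describes, the example below uses the standard JAX-WS Endpoint API, which Apache CXF implements. The DataExportService class, its operation and its address are hypothetical, not taken from the paper.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical service wrapping data that other systems request.
@WebService
public class DataExportService {

    // Return the requested dataset as XML; the payload shape is illustrative only.
    @WebMethod
    public String fetchDataset(String datasetId) {
        return "<dataset id='" + datasetId + "'>...</dataset>";
    }

    public static void main(String[] args) {
        // Publish the service at runtime; CXF (or the JDK's JAX-WS runtime)
        // generates the WSDL and SOAP plumbing automatically.
        Endpoint endpoint = Endpoint.publish(
                "http://localhost:9000/services/dataExport", new DataExportService());
        System.out.println("Published: " + endpoint.isPublished());
    }
}
```

For finer runtime control, CXF also offers a programmatic JaxWsServerFactoryBean, which is closer to how a dynamic publisher would create endpoints from configuration rather than from a fixed class.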


2013 ◽  
Vol 846-847 ◽  
pp. 1868-1872
Author(s):  
Shuai Gang

In recent years, the number and size of web services on the Internet have grown rapidly, and industry and academia have begun to study them. If a web service cannot be discovered among Internet resources, it becomes meaningless; large-scale management of web services is therefore a key problem in the study of Internet service resources. This paper studies large-scale distributed web services in network resources based on SOA architectural ideas. It also designs a unified management and organization system that takes ideological and political education as its content, and proposes an SN network resource service model for ideological and political education. With the continuing development and popularization of the Internet, the study of Internet resources for ideological and political education in this paper provides a theoretical reference for innovation in ideological and political education.


2011 ◽  
Vol 8 (2) ◽  
pp. 95-114 ◽  
Author(s):  
Paweł Sztromwasser ◽  
Kjell Petersen ◽  
Pál Puntervoll

Summary Biological databases and computational biology tools are provided by research groups around the world and made accessible on the Web. Combining these resources is common practice in bioinformatics, but integrating heterogeneous and often distributed tools and datasets can be challenging. To date, this challenge has commonly been addressed in a pragmatic way, by tedious and error-prone scripting. Recently, however, a more reliable technique has been identified and proposed as the platform to tie bioinformatics resources together, namely Web Services. In the last decade, Web Services have spread widely in bioinformatics and earned the title of recommended technology. However, in the era of high-throughput experimentation, a major concern regarding Web Services is their ability to handle large-scale data traffic. We propose a stream-like communication pattern for standard SOAP Web Services that enables efficient flow of large data traffic between a workflow orchestrator and Web Services. We evaluated the data-partitioning strategy by comparing it with typical communication patterns on an example pipeline for genomic sequence annotation. The results show that data partitioning lowers the resource demands of services and increases their throughput, which in turn makes it possible to execute in-silico experiments at genome scale using standard SOAP Web Services and workflows. As a proof of principle, we annotated an RNA-seq dataset using a plain BPEL workflow engine.
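As a minimal sketch of the data-partitioning idea, assuming a hypothetical AnnotationService SOAP port with an annotate operation (in practice a JAX-WS proxy generated from the service's WSDL), an orchestrator can split a large sequence set into fixed-size chunks and send them as separate calls, so neither side has to hold the full dataset in memory at once.

```java
import java.util.ArrayList;
import java.util.List;

public class PartitionedClient {

    // Hypothetical SOAP port; stands in for a generated JAX-WS client proxy.
    interface AnnotationService {
        List<String> annotate(List<String> sequences);
    }

    // Send the input in fixed-size partitions instead of one large message.
    static List<String> annotateInChunks(AnnotationService port,
                                         List<String> sequences, int chunkSize) {
        List<String> results = new ArrayList<>();
        for (int i = 0; i < sequences.size(); i += chunkSize) {
            List<String> chunk =
                    sequences.subList(i, Math.min(i + chunkSize, sequences.size()));
            // Each call carries only one partition, keeping SOAP message sizes
            // and the service's memory footprint bounded.
            results.addAll(port.annotate(chunk));
        }
        return results;
    }
}
```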


2007 ◽  
Author(s):  
Yongwang Zhao ◽  
Chunyang Hu ◽  
Yonggang Huang ◽  
Dianfu Ma

In Service Oriented Architecture (SOA), web services play an important role. Web services are web application components that can be published, found, and used on the Web; they also enable machine-to-machine communication over a network. Cloud computing and distributed computing have brought a large number of web services onto the WWW. Web service composition is the process of combining two or more web services to satisfy user requirements. The tremendous increase in the number of services and the complexity of user requirement specifications make web service composition a challenging task. Automated service composition is a technique in which web service composition is performed automatically with minimal or no human intervention. In this paper we propose an approach to web service composition for large-scale environments that takes QoS parameters into account. We use stacked autoencoders to learn features of web services, and a Recurrent Neural Network (RNN) uses the learned features to predict new compositions. Experimental results show the efficiency and scalability of the approach. Using deep learning in web service composition leads to a high success rate and low computational cost.
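The abstract gives no implementation details; the sketch below illustrates only the QoS side of the problem (not the autoencoder or RNN models themselves), aggregating per-service QoS values along a sequential composition in the usual way: response times and costs add up, reliabilities multiply. The QoS class and the utility weights are assumptions for illustration.

```java
import java.util.List;

public class QosAggregator {

    // Hypothetical per-service QoS attributes.
    static class QoS {
        final double responseTime;  // seconds
        final double cost;          // monetary units per invocation
        final double reliability;   // probability of success in [0, 1]

        QoS(double responseTime, double cost, double reliability) {
            this.responseTime = responseTime;
            this.cost = cost;
            this.reliability = reliability;
        }
    }

    // Aggregate QoS for services invoked in sequence:
    // times and costs are additive, reliability is multiplicative.
    static QoS aggregateSequential(List<QoS> services) {
        double time = 0, cost = 0, reliability = 1;
        for (QoS q : services) {
            time += q.responseTime;
            cost += q.cost;
            reliability *= q.reliability;
        }
        return new QoS(time, cost, reliability);
    }

    // Simple weighted utility that a composition search could maximize;
    // the weights are illustrative, not taken from the paper.
    static double utility(QoS q) {
        return 0.4 * (1.0 / (1.0 + q.responseTime))
             + 0.2 * (1.0 / (1.0 + q.cost))
             + 0.4 * q.reliability;
    }
}
```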


2021 ◽  
Author(s):  
Yang Yu

Web service composition has become a promising technique for building powerful enterprise applications by making use of distributed services with different functions. In the age of big data, more and more web services are created to deal with large amounts of data; these are called data-intensive services. Due to the explosion in the volume of data, providing efficient approaches to composing data-intensive services will become increasingly important in the field of service-oriented computing. Meanwhile, as numerous web services have emerged that offer identical or similar functionality on the Internet, web service composition is usually performed with end-to-end Quality of Service (QoS) properties, which describe the non-functional properties (e.g., response time, execution cost, reliability, etc.) of a web service. In addition, the execution of composite web services is typically coordinated by a centralized workflow engine. As a result, the centralized execution paradigm suffers from inefficient communication and a single point of failure, which is particularly problematic in the context of data-intensive processes. To that end, more decentralized and flexible execution paradigms are required for the execution of data-intensive applications.

From a computational point of view, the problems of QoS-aware data-intensive web service composition and execution can be characterised as complex, large-scale, constrained and multi-objective optimization problems. Therefore, genetic programming (GP) based solutions are presented in this thesis to address them. A series of simulation experiments demonstrates the performance of the proposed approaches, and the empirical observations are also described in this thesis.

Firstly, we propose a hybrid approach that integrates the local search procedure of tabu search into the global search process of GP to solve the problem of QoS-aware data-intensive web service composition. A mathematical model is developed to account for mass data transmission across different component services in a data-intensive service composition. The experimental results show that our proposed approach provides better performance than the standard GP approach and two traditional optimization methods.

Next, a many-objective evolutionary approach is proposed for tackling the QoS-aware data-intensive service composition problem with more than three competing quality objectives. In this approach, the original search space of the problem is reduced before a recently developed many-objective optimization algorithm, NSGA-III, is adopted to solve the many-objective optimization problem. The experimental results demonstrate the effectiveness of our approach, as well as its superiority over existing single-objective and multi-objective approaches.

Finally, a GP-based approach to partitioning a composite data-intensive service for decentralized execution is put forth in this thesis. Similar to the first problem, a mathematical model is developed for estimating the communication overhead inside a partition and across partitions. The data and control dependencies in the original composite web service are properly preserved in the deployment topology generated by our approach. Compared with two existing heuristic algorithms, the proposed approach exhibits better scalability and is more suitable for large-scale partitioning problems.
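The thesis develops its own cost model; the following is only a minimal sketch of the general shape of such a partitioning objective, with all names and the cost formula assumed rather than taken from the thesis: given an assignment of component services to partitions, data transferred between services in different partitions is weighted more heavily than data moved within a partition.

```java
public class PartitionCost {

    /**
     * Estimate communication overhead for a partitioned composite service.
     *
     * @param dataVolume   dataVolume[i][j] = data (e.g. MB) sent from service i to service j
     * @param partitionOf  partitionOf[i]   = partition (execution engine) hosting service i
     * @param remoteFactor multiplier applied to transfers that cross partitions
     *                     (assumed > 1; transfers within a partition get weight 1)
     */
    static double communicationCost(double[][] dataVolume, int[] partitionOf,
                                    double remoteFactor) {
        double cost = 0;
        for (int i = 0; i < dataVolume.length; i++) {
            for (int j = 0; j < dataVolume[i].length; j++) {
                if (dataVolume[i][j] == 0) continue;
                // Cross-partition transfers are the expensive ones a GP-based
                // partitioner would try to minimize.
                double weight = (partitionOf[i] == partitionOf[j]) ? 1.0 : remoteFactor;
                cost += weight * dataVolume[i][j];
            }
        }
        return cost;
    }
}
```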

