BF*: Web Services Discovery and Composition as Graph Search Problem

Author(s):  
Seog-Chan Oh ◽  
B. On ◽  
E.J. Larson ◽  
Dongwon Lee


2020 ◽  
Vol 6 (11) ◽  
pp. 112
Author(s):  
Faisal R. Al-Osaimi

This paper presents a unique approach for the dichotomy between useful and adverse variations of key-point descriptors, namely the identity and the expression variations in the descriptor (feature) space. The descriptor variations are learned from training examples. Based on the labels of the training data, equivalence relations among the descriptors are established. Both types of descriptor variation are represented by a graph embedded in the descriptor manifold. Invariant recognition is then conducted as a graph search problem. A heuristic graph search algorithm suitable for recognition under this setup was devised. The proposed approach was tested on the FRGC v2.0, Bosphorus, and 3D TEC datasets. It has been shown to enhance recognition performance under expression variations by considerable margins.


2014 ◽  
Vol 687-691 ◽  
pp. 1637-1640
Author(s):  
Liang Liang Hao

With the development of web service technology, a single web service cannot fulfill different users’ diverse requirements. Adding semantic information to the input-output messages of web services provides a way to implement web service composition automatically. After reviewing existing algorithms for web service composition, this article proposes a QoS-oriented web service composition algorithm based on graph search with semantic information.
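As an illustrative sketch of the idea (not the article's actual algorithm), input-output chaining of semantically annotated services can be treated as a search that repeatedly invokes the cheapest service whose inputs are already satisfied. All service names and the scalar QoS cost model below are hypothetical:

```python
def compose(services, available, goal):
    """Greedily chain services whose inputs are already satisfied,
    preferring lower QoS cost, until every goal concept is produced.
    `services` maps a name to (inputs, outputs, cost); all names and
    the cost model are illustrative assumptions."""
    have = set(available)
    plan = []
    progress = True
    while progress and not goal <= have:
        progress = False
        # Candidate services: invocable now, not yet used, adding something new.
        invocable = sorted(
            (cost, name) for name, (ins, outs, cost) in services.items()
            if name not in plan and ins <= have and not outs <= have
        )
        if invocable:
            _, name = invocable[0]        # cheapest first (stand-in for QoS ranking)
            have |= services[name][1]
            plan.append(name)
            progress = True
    return plan if goal <= have else None

services = {
    "geocode": ({"address"}, {"coords"}, 1.0),
    "weather": ({"coords"}, {"forecast"}, 2.0),
}
print(compose(services, {"address"}, {"forecast"}))  # ['geocode', 'weather']
```

A real composer would also rank by aggregated QoS along the whole chain rather than greedily per step; the sketch only shows the graph-search framing.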


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Bin Wang ◽  
Weiling Hu ◽  
Jiquan Liu ◽  
Jianmin Si ◽  
Huilong Duan

Gastroscopic examination is one of the most common methods for gastric disease diagnosis. In this paper, a multitarget tracking approach is proposed to assist endoscopists in identifying lesions under gastroscopy. This approach analyzes numerous preobserved gastroscopic images and constructs a gastroscopic image graph. In this way, the deformation registration between gastroscopic images is regarded as a graph search problem. During the procedure, the endoscopist marks suspicious lesions on the screen and the graph is utilized to locate and display the lesions in the appropriate frames based on the calculated registration model. Compared to traditional gastroscopic lesion surveillance methods (e.g., tattooing or probe-based optical biopsy), this approach is noninvasive and does not require additional instruments. In order to assess and quantify the performance, this approach was applied to stomach phantom data and in vivo data. The clinical experimental results demonstrated that the accuracy at the angularis, antrum, and stomach body was 6.3 ± 2.4 mm, 7.6 ± 3.1 mm, and 7.9 ± 1.6 mm, respectively. The mean accuracy was 7.31 mm, the average targeting time was 56 ms, and the P value was 0.032, which makes it an attractive candidate for clinical practice. Furthermore, this approach provides a significant reference for endoscopic target tracking of other soft tissue organs.


Author(s):  
Thiago Castanheira Retes de Sousa ◽  
Rafael Lima de Carvalho

Artificial Intelligence has long been used to design automated agents for playing games such as Chess, Go, Defense of the Ancients 2, the Snake Game, billiards, and many others. In this work, we present the development and performance evaluation of an automated bot that mimics a real-life player of the RPG game Tibia. The bot is built using a combination of AI techniques, such as the A* graph search algorithm, and computer vision tools, such as template matching. Using four algorithms to obtain the player's global position in the game, manage its health and mana, target monsters, and navigate the game world, we developed a fully automated Tibia bot driven by raw input images. We evaluated the agent's performance in three different scenarios, collecting and analyzing metrics such as XP Gain, Supplies Usage, and Balance. The simulation results show that the developed bot produces competitive results on in-game metrics when compared to human players.


Mathematics ◽  
2020 ◽  
Vol 8 (5) ◽  
pp. 833
Author(s):  
Veera Boonjing ◽  
Pisit Chanvarasuth

This paper formulates the problem of determining all reducts of an information system as a graph search problem. The search space is represented in the form of a rooted graph. The proposed algorithm uses a breadth-first search strategy to search for all reducts starting from the graph root. It expands nodes in breadth-first order and uses a pruning rule to decrease the search space. It is mathematically shown that the proposed algorithm is both time and space efficient.


Author(s):  
John Gekas ◽  
Maria Fasli

The Web services paradigm has enabled an increasing number of providers to host remotely accessible services. However, the true potential of such a distributed infrastructure can only be reached when such autonomous services can be combined together as parts of a workflow, in order to collectively achieve combined functionality. In this paper we present our work in the area of automatic workflow composition among Web services with semantically described functionality capabilities. For this purpose, we use a set of heuristics derived from the connectivity structure of the service repository in order to effectively guide the composition process. The methodologies presented in this paper have been inspired by research in areas such as graph network analysis, social network analysis, and bibliometrics. In addition, we present comparative experimental results in order to evaluate the presented techniques.
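One simple connectivity-derived heuristic in the spirit of network analysis (a hypothetical sketch, not the paper's actual heuristics) scores each service by how many other services in the repository can consume its outputs, so the composer can prefer well-connected services when expanding the search:

```python
def connectivity_scores(services):
    """Degree-based heuristic inspired by network analysis: score each
    service by how many other services can consume at least one of its
    outputs. The data model here is an illustrative assumption."""
    scores = {}
    for name, (ins, outs) in services.items():
        scores[name] = sum(
            1 for other, (other_ins, _) in services.items()
            if other != name and outs & other_ins
        )
    return scores

services = {
    "geocode": ({"address"}, {"coords"}),
    "weather": ({"coords"}, {"forecast"}),
    "mapper":  ({"coords"}, {"map_tile"}),
}
print(connectivity_scores(services))  # {'geocode': 2, 'weather': 0, 'mapper': 0}
```

This mirrors how citation counts rank papers in bibliometrics: a service whose outputs feed many other services is more likely to lie on a useful composition path.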


Author(s):  
Daniel S. F. Alves ◽  
E. Elael M. Soares ◽  
Guilherme C. Strachan ◽  
Guilherme P. S. Carvalho ◽  
Marco F. S. Xaud ◽  
...  

Many interesting and difficult practical problems need to be tackled in the areas of firefighting, biological and/or chemical decontamination, tactical and/or rescue searches, and Web spamming, among others. These problems, however, can be mapped onto the graph decontamination problem, also called the graph search problem. Once the target space is mapped onto a graph G(N,E), where N is the set of G's nodes and E the set of G's edges, one initially considers all nodes in N to be contaminated. When a guard, i.e., a decontaminating agent, is placed at a node i ∈ N, i becomes (clean and) guarded. If such a guard later leaves node i, i is guaranteed to remain clean only if all of its neighboring nodes are either clean or clean and guarded. The graph decontamination/search problem consists of determining a sequence of guard movements that decontaminates G using the minimum number of guards. This chapter presents a novel swarm robotics approach to firefighting a conflagration in a hypothetical apartment ground floor. The mechanism has been successfully simulated on the Webots platform, depicting a firefighting swarm of e-puck robots.


Graphs are keenly studied across numerous domains, as most of the applications we encounter in our daily lives can easily be given a graph-based representation, and the corresponding problems can then be studied as graph problems. In this chapter, the authors study the problem of robot motion planning as a graph search problem. The key steps involve representing the problem as a graph and solving it as a standard graph search problem. A number of graph search algorithms exist, each with its own advantages and disadvantages. In this chapter, the authors explain the concept, working methodology, and issues associated with some of these algorithms. The key algorithms under discussion include Breadth First Search, Depth First Search, the A* Algorithm, Multi Neuron Heuristic Search, Dijkstra's Algorithm, the D* Algorithm, etc. Experimental results for some of these algorithms are also discussed. The chapter further presents the advantages and disadvantages of graph-based motion planning.
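The graph-search framing of motion planning can be illustrated with A* on an occupancy grid, one of the algorithms the chapter discusses; the grid, 4-connectivity, and Manhattan heuristic below are illustrative choices, not the chapter's experimental setup:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = obstacle), using the
    Manhattan distance as an admissible heuristic. Returns the shortest
    path as a list of cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), start)]            # priority queue ordered by f = g + h
    came_from = {start: None}
    g_cost = {start: 0}
    while open_set:
        _, cell = heapq.heappop(open_set)
        if cell == goal:                      # reconstruct path via parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        g = g_cost[cell]
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]
                    and g + 1 < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = g + 1
                came_from[nxt] = cell
                heapq.heappush(open_set, (g + 1 + h(nxt), nxt))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

Swapping the heuristic for a constant zero turns the same search into Dijkstra's algorithm, which is the main trade-off the chapter's comparison of these algorithms explores.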

