A Probabilistic Analysis of a String Editing Problem and its Variations

1995 ◽  
Vol 4 (2) ◽  
pp. 143-166 ◽  
Author(s):  
Guy Louchard ◽  
Wojciech Szpankowski

We consider a string editing problem in a probabilistic framework. This problem is of considerable interest to many facets of science, most notably molecular biology and computer science. String editing transforms one string into another by performing a series of weighted edit operations of overall maximum (minimum) cost. The problem is equivalent to finding an optimal path in a weighted grid graph. In this paper we provide several results regarding the typical behaviour of such a path. In particular, we observe that the optimal path (i.e. edit distance) is almost surely (a.s.) equal to αn for large n, where α is a constant and n is the sum of the lengths of both strings. More importantly, we show that the edit distance is well concentrated around its average value. In the so-called independent model, in which all weights (in the associated grid graph) are statistically independent, we derive some bounds on the constant α. As a by-product of our results, we also present a precise estimate of the number of alignments between two strings. To prove these findings we use techniques of random walks, diffusion limiting processes, generating functions, and the method of bounded differences.
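The equivalence between string editing and a minimum-cost path in a weighted grid graph can be sketched with the standard dynamic program; the unit weights `w_ins`, `w_del`, `w_sub` below are illustrative defaults, not the paper's probabilistic weights.

```python
# Hypothetical sketch: string editing as a minimum-cost path in a grid
# graph, solved by dynamic programming. Weights are illustrative.

def edit_distance(a, b, w_ins=1, w_del=1, w_sub=1):
    """Minimum total cost to transform string a into string b."""
    m, n = len(a), len(b)
    # d[i][j] = cost of the optimal path from (0, 0) to (i, j) in the grid.
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * w_del
    for j in range(1, n + 1):
        d[0][j] = j * w_ins
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if a[i - 1] == b[j - 1] else w_sub
            d[i][j] = min(d[i - 1][j] + w_del,    # delete a[i-1]
                          d[i][j - 1] + w_ins,    # insert b[j-1]
                          d[i - 1][j - 1] + sub)  # match / substitute
    return d[m][n]

print(edit_distance("kitten", "sitting"))  # 3
```

Here n in the paper's αn result corresponds to `m + n` above, the sum of the two string lengths.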

2014 ◽  
Vol 25 (03) ◽  
pp. 307-329 ◽  
Author(s):  
YOSHIYUKI YAMAMOTO ◽  
KOUICHI HIRATA ◽  
TETSUJI KUBOYAMA

In this paper, we investigate the problem of computing structure-sensitive variations of the unordered tree edit distance. First, we focus on the variations tractable by algorithms that contain a network-algorithm submodule, either the minimum-cost maximum-flow algorithm or the maximum-weighted bipartite matching algorithm. Then, we show that the two network algorithms are interchangeable, and hence the time complexity of computing these variations can be reduced to O(nmd) time, where n is the number of nodes in one tree, m is the number of nodes in the other tree, and d is the minimum degree of the two given trees. Next, we show that the problem of computing the bottom-up distance is MAX SNP-hard. Note that the well-known linear-time algorithm for the bottom-up distance designed by Valiente (2001) computes only a bottom-up indel (insertion-deletion) distance, allowing no substitutions.
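The interchangeable submodule is, in essence, a minimum-cost assignment between the children of two nodes. A brute-force sketch over permutations (feasible only for tiny inputs, unlike the polynomial network algorithms the paper uses) illustrates what the submodule computes; the cost matrix is hypothetical.

```python
from itertools import permutations

# Illustrative sketch of the network-algorithm submodule: a minimum-cost
# bipartite matching between children of two nodes, solved by brute force
# for small inputs. Costs are hypothetical subtree-mapping costs.

def min_cost_matching(cost):
    """cost[i][j]: cost of mapping left child i to right child j.
    Assumes a square matrix (pad with dummy nodes otherwise)."""
    n = len(cost)
    return min(sum(cost[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

# Mapping 3 subtrees of one node to 3 subtrees of another.
cost = [[2, 5, 7],
        [4, 1, 9],
        [6, 8, 3]]
print(min_cost_matching(cost))  # 6  (pairs 0-0, 1-1, 2-2)
```

In the paper this step is solved with minimum-cost maximum-flow or maximum-weighted bipartite matching, which is what yields the O(nmd) bound.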


Author(s):  
Г. М. Хорошун

In this paper, diffraction and interference images obtained by numerical simulation and experiment are used to solve fundamental and applied problems of photonics. The images are structures with a special intensity distribution formed by the initial field and the optical system. To increase the speed of processing, an image data compression method, with further implementation in databases, is developed. The compression method for diffraction and interference images is based on intensity quantization. An algorithm for image quantization has been developed: target intensity values are determined, which set the quantization levels, together with data visualization techniques that determine the threshold values for these levels. The algorithm also includes image segmentation at the scale of the minimum topological object. The vicinity of a topological object is defined so that it has a visually registrable form and does not intersect other regions. The topological objects of the diffraction field are the intensity maxima, minima and zeros, while in the interference pattern the topological objects are the maxima, minima and the regions of band splitting. Important parameters are the average intensity of the whole image, which highlights its overall structure, and the average intensity of a local segment. Compressing the data of an 8-bit grayscale image showed that 2 bits of color depth are enough for an interference image and 3 bits are enough for a diffraction image. The quantization differences between diffraction images and interference patterns are shown, and the data compression ratios are calculated.
On the one hand, the obtained results and recommendations can be applied in various fields of medicine, biology and pharmacy that use laser technology; on the other hand, they can support the development of IT tools for identifying topological objects in a light field, optical image processing, and decision support in optical problems.
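The quantization step can be sketched as a uniform reduction of 8-bit gray values to a small number of levels; the uniform threshold placement below is an assumption for illustration, whereas the paper derives its levels from target intensity values of the topological objects.

```python
# Minimal sketch of intensity quantization: reducing an 8-bit grayscale
# image to b bits of color depth with uniform thresholds. The uniform
# level placement is illustrative, not the paper's derived thresholds.

def quantize(image, bits):
    """Map 8-bit gray values (0..255) onto 2**bits uniform levels."""
    levels = 1 << bits
    step = 256 // levels
    return [[min(v // step, levels - 1) for v in row] for row in image]

img = [[0, 40, 90], [130, 200, 255]]
print(quantize(img, 2))  # 4 levels: [[0, 0, 1], [2, 3, 3]]
```

With 2 bits (4 levels) each pixel needs a quarter of the original storage, giving the 4:1 compression ratio implied for interference images.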


2013 ◽  
Vol 448-453 ◽  
pp. 795-801
Author(s):  
Yue Ren Wang ◽  
Yan Li Song ◽  
Hong Di Mao

The value of waste-heat utilization in large general hospitals is a critical problem that cannot be ignored as society develops. Making use of this source means establishing a waste-heat utilization system, and the two priorities are properly determining the weights of the different waste-heat utilization points and determining the best path for waste-heat utilization. In this paper, based on the Analytic Hierarchy Process (AHP) and an information-entropy model of the waste-heat utilization point distribution, we find a way to determine the weights. At the same time, a Minimum Spanning Tree model based on Kruskal's algorithm provides a new method for finding the optimal path for using waste heat. Both models are applied to the heat study of a hospital, and we identify both a clear distribution of waste-heat utilization point weights and the minimum-cost best path, which paves the way for further construction of waste-heat utilization systems in large general hospitals and has further research and application value.
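The minimum-spanning-tree step can be sketched with Kruskal's algorithm; the edge weights below are hypothetical connection costs between waste-heat utilization points, not the paper's data.

```python
# Minimal sketch of Kruskal's algorithm finding the minimum-cost network
# connecting waste-heat utilization points. Edge weights are hypothetical.

def kruskal(n, edges):
    """edges: list of (weight, u, v); returns (total cost, chosen edges)."""
    parent = list(range(n))

    def find(x):                      # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, tree = 0, []
    for w, u, v in sorted(edges):     # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                  # joins two components: keep the edge
            parent[ru] = rv
            total += w
            tree.append((u, v))
    return total, tree

edges = [(4, 0, 1), (2, 0, 2), (5, 1, 2), (1, 1, 3), (3, 2, 3)]
print(kruskal(4, edges))  # (6, [(1, 3), (0, 2), (2, 3)])
```

Sorting the edges and greedily keeping those that connect new components yields the minimum-cost spanning network in O(E log E) time.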


2021 ◽  
Vol 11 (2) ◽  
pp. 633
Author(s):  
Guodong Yi ◽  
Chuanyuan Zhou ◽  
Yanpeng Cao ◽  
Hangjian Hu

Assembly path planning of complex products in virtual assembly is a necessary and complicated step, which will become long and inefficient if the assembly path of each part is completely planned in the assembly space. The coincidence or partial coincidence of the assembly paths of some parts provides an opportunity to solve this problem. A path planning algorithm based on prior path reuse (PPR algorithm) is proposed in this paper, which realizes rapid planning of an assembly path by reusing the planned paths. The core of the PPR algorithm is a dual-tree fusion strategy for path reuse, which is implemented by improving the rapidly exploring random tree star (RRT*) algorithm. The dual-tree fusion strategy is used to find the nearest prior node, the prior connection node, the nearest exploring node, and the exploring connection node and to connect the exploring tree to the prior tree after the exploring tree is extended to the prior space. Then, the optimal path selection strategy is used to calculate the costs of all planned paths and select the one with the minimum cost as the optimal path. The PPR algorithm is compared with the RRT* algorithm in path planning for one start node and multiple start nodes. The results show that the total time and the number of sampling points for assembly path planning of batch parts using the PPR algorithm are far less than those using the RRT* algorithm.
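The tree-extension primitive that both RRT* and the PPR dual-tree fusion build on can be sketched as follows: find the tree node nearest a sample, then steer one fixed step toward it. The samples, step size, and 2-D workspace are assumptions for illustration; obstacles, rewiring, and the prior tree are omitted.

```python
import math

# Illustrative sketch of the RRT-style extension step underlying the PPR
# algorithm's exploring tree: nearest-node search plus fixed-step steering.
# Samples are fixed (not random) so the result is deterministic.

def extend(tree, sample, step=1.0):
    """Add a new node one step from the nearest tree node toward sample."""
    near = min(tree, key=lambda p: math.dist(p, sample))
    d = math.dist(near, sample)
    if d <= step:
        new = sample                  # sample is within reach: connect it
    else:
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
    tree.append(new)
    return new

tree = [(0.0, 0.0)]
for s in [(3.0, 0.0), (3.0, 4.0)]:    # fixed samples instead of random ones
    extend(tree, s)
print(tree)
```

In the PPR algorithm, once such an exploring tree reaches the prior space, the dual-tree fusion connects it to the already-planned prior tree instead of continuing to grow from scratch.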


Blood ◽  
2018 ◽  
Vol 132 (Supplement 1) ◽  
pp. 5918-5918
Author(s):  
Mauricio Sarmiento ◽  
Nicole Beffermann ◽  
Ximena Solar ◽  
Jara Veronica ◽  
Soto Katherine ◽  
...  

Abstract Introduction. The use of DMSO cryopreservation of hematopoietic precursors for autologous or allogeneic transplantation is a procedure commonly used in many countries, but it is expensive and not exempt from risks, since serious adverse events have been described. Our group recently published a comparative study between two centers showing that avoiding DMSO cryopreservation favors better transplantation tolerance, with significantly shorter hospitalizations and fewer episodes of febrile neutropenia and mucositis. In this report we present the financial analysis of both modalities, CRYO and Non CRYO preserved transplants. Methodology. The database of the adult hematopoietic transplant program of our institution was retrospectively analyzed. Since the Non CRYO modality was initiated in our institution in 2015, we compared the 3 years before and after that date, assigning two groups (CRYO and Non CRYO). Cost data for mobilization, apheresis, hospitalization, freezing and blood banking were analyzed. Results. Between 2013 and 2018, 90 autologous hematopoietic transplants were performed, 41 CRYO and 49 Non CRYO. The average cost of mobilization therapy with filgrastim and the use of plerixafor did not differ between groups (p = 0.26). The number of apheresis procedures needed to achieve a satisfactory count was higher in CRYO (median 1.5 vs 1.2, p = 0.001), with a 55% higher cost on average. The cost of transplant hospitalization, including antibiotics, blood banking, analgesics and parenteral nutritional support, was lower in Non CRYO (minimum cost USD 8,000, maximum USD 160,000, average value USD 20,000) than in CRYO (minimum cost USD 11,000, maximum USD 200,000, average value USD 32,000) (p = 0.001). Conclusions. In addition to promoting better patient tolerance, the Non CRYO modality in our country has lower cost. This information is of particular relevance to the health systems of developing countries and can promote better access to transplant.
Disclosures No relevant conflicts of interest to declare.


Author(s):  
P. K. Jain ◽  
S. P. Manoochehri

Abstract This paper presents a network optimization approach to the generation of tool tip paths in three-dimensional space for robot manipulators working in the presence of obstacles. The developed algorithm relies on a graph structure enumeration of possible path segments between discrete points inside the workspace. An intelligent heuristic scheme is used to select a small search space over which the search for the optimal path is carried out. Paths which are optimal with respect to pre-determined objective functions based on robot kinematic and dynamic solutions are synthesized by applying Dijkstra’s minimum cost algorithm. Collision avoidance is checked for the tool tip as well as for the robot body. The methodology is robot independent and can be applied to any robot whose kinematics and dynamics can be solved. A computer program has been developed to implement this approach for three-axis manipulators. Results of the application of this scheme to some robots in this class are also presented.
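The search step can be sketched with a standard Dijkstra implementation; the graph and edge costs below are hypothetical, standing in for the paper's discretized workspace points and kinematic/dynamic objective costs on collision-checked segments.

```python
import heapq

# Minimal sketch of Dijkstra's minimum-cost algorithm on a directed graph
# of discretized workspace points. Nodes and edge costs are hypothetical.

def dijkstra(graph, start, goal):
    """graph: {node: [(neighbor, cost), ...]}; returns minimum path cost."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            return d
        if d > dist.get(u, float("inf")):
            continue                  # stale heap entry, skip it
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

graph = {"A": [("B", 2), ("C", 5)], "B": [("C", 1), ("D", 4)],
         "C": [("D", 1)], "D": []}
print(dijkstra(graph, "A", "D"))  # 4  (A -> B -> C -> D)
```

Because edge costs derived from energy or time objectives are non-negative, Dijkstra's greedy expansion is guaranteed to return the optimal path over the enumerated segments.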


2012 ◽  
Vol 236-237 ◽  
pp. 1101-1105
Author(s):  
Hong She Dang ◽  
Li Na Tian ◽  
Fang Zhang

To address the contradiction between salt-and-pepper noise attenuation and the preservation of image detail, a threshold-based maximum-minimum filtering algorithm is proposed in this paper. First, the gray values of the pixels in the filtering window are obtained. If the test pixel value is an extreme value of the filtering window, the pixel is a suspected noise point. The absolute differences between the suspected noise point value and the gray values of the pixels in its neighborhood are then sorted, and the pixel is judged to be a real noise point according to the average of the middle two values in the difference sequence. The original gray value of a noise point is estimated from the maximum-minimum relationship between the pixel values of the neighborhood; if the pixel is not a noise point, its value is kept. Simulation results show that the proposed algorithm greatly improves the signal-to-noise ratio and mean square error, retaining more image detail while reliably removing noise.
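The detection rule can be sketched on a single pixel and its neighborhood: only window extrema are suspects, and the average of the two middle sorted absolute differences decides whether the suspect is real noise. The threshold value below is a hypothetical parameter, not one from the paper.

```python
# Illustrative sketch of the described noise-detection rule. The
# threshold of 60 gray levels is a hypothetical tuning parameter.

def is_noise(center, neighbors, threshold=60):
    """True if the center pixel is judged a salt-and-pepper noise point."""
    window = neighbors + [center]
    if center != max(window) and center != min(window):
        return False                  # not a window extremum: keep pixel
    diffs = sorted(abs(center - v) for v in neighbors)
    mid = len(diffs) // 2
    avg_mid = (diffs[mid - 1] + diffs[mid]) / 2   # average of middle two
    return avg_mid > threshold

print(is_noise(255, [120, 118, 125, 130]))  # True: salt noise
print(is_noise(128, [120, 118, 125, 130]))  # False: ordinary pixel
```

Using the middle of the sorted difference sequence, rather than the largest difference, makes the test robust to a single outlier neighbor, which is how the algorithm preserves genuine edges and fine detail.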


1973 ◽  
Vol 30 (12) ◽  
pp. 2070-2076 ◽  
Author(s):  
B. A. Campbell

In the late sixties, Canada introduced legislation to provide for the limitation of fishing licenses in two specialized fisheries — the lobster fishery of the Atlantic coast and the salmon fishery of the Pacific coast. These regulations were introduced for economic, and not conservation, reasons. As a result of the limitation program, the number of licensed salmon vessels in British Columbia was reduced from 7548 in 1968 to 5890 in 1972. The program called for salmon vessels to be grouped into two categories based on their record of production in 1967 or 1968. Vessels with low production ("B" category) can continue in the salmon fishery for only 10 years. License fees for "A" category vessels have been increased sharply. New salmon vessels can only be licensed provided an equal tonnage of "A" category vessels is retired from the commercial fishery. The Government is using the money from licenses to purchase active salmon vessels from fishermen and retire them from the fishing industry. Indians have been given the special privilege, for social reasons, of obtaining salmon licenses for their vessels at minimum cost, but they still cannot license a new vessel unless an equivalent tonnage of vessels has been retired. The average value of individual salmon vessels remaining in the fleet has increased substantially, and the higher values are not justified in relation to the increased returns that can be achieved because of the smaller number of fishing vessels. The final phase of the plan is under review by an industry committee charged with making representation to the Government in 1973. In the Maritime Provinces the limitation program for lobster boats has not progressed as quickly as the salmon program in British Columbia. A base, however, has been established: vessels have been divided into "A" and "B" categories, but no action had been taken up to September 1972 to limit the life of "B" category lobster vessels.
Until this is done, or other means are taken to reduce the fleet, very little reduction in the number of boats is expected. These limitation programs, for what are essentially small-boat fisheries, are unique, since few attempts have been made in the world’s fishing communities to control the number of vessels for economic reasons. The progress of these programs will provide administrators and economists with criteria for future management.


2021 ◽  
Vol 2083 (3) ◽  
pp. 032011
Author(s):  
Dayun Ge

Abstract In the process of multimodal transport, the cost and time of transportation are particularly important. To avoid unreasonable container routing and unnecessary waste of transport capacity and cost, we must effectively integrate the advantages of the various transport modes, select the most suitable mode and the most reasonable transport path, and take minimum cost and time as the goal so that goods reach their destination smoothly. Optimizing the multimodal transport network therefore has great practical significance. This paper starts from the multimodal transport network under a single task, designs a solution method for the model combined with an ant colony algorithm, and gives an example. Finally, the model and algorithm design are shown to be reasonable using a MATLAB implementation of the algorithm.
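The single-task objective can be sketched as choosing one mode per leg to minimize a weighted sum of total cost and total time. The legs, modes, costs, times and weights below are all hypothetical, and plain enumeration stands in for the ant colony algorithm the paper actually uses on the full network.

```python
from itertools import product

# Hedged sketch of the single-task multimodal objective: pick one mode per
# leg minimizing w_cost * total_cost + w_time * total_time. All figures
# are hypothetical; the paper solves this with an ant colony algorithm.

legs = [  # per leg: {mode: (cost, time)}
    {"rail": (100, 10), "road": (150, 6)},
    {"rail": (80, 8), "sea": (40, 20)},
]

def best_plan(legs, w_cost=1.0, w_time=20.0):
    plans = product(*(leg.items() for leg in legs))
    def objective(plan):
        cost = sum(c for _, (c, t) in plan)
        time = sum(t for _, (c, t) in plan)
        return w_cost * cost + w_time * time
    return min(plans, key=objective)

print([mode for mode, _ in best_plan(legs)])  # ['road', 'rail']
```

With a high time weight the faster road/rail combination wins; setting `w_time=0` instead selects the cheapest rail/sea plan, showing how the weights trade cost against time.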


2018 ◽  
Vol 7 (4) ◽  
pp. 2218
Author(s):  
K Praveen Kumar Rao ◽  
Dr T. Senthil Murugan

TCP allows network-status information to be shared between different layers using a cross-layer mechanism. The cross-layer approach in wireless MANETs is used to improve TCP performance, and optimal path (route) selection is an important aspect of increasing the energy efficiency and lifetime of the network. A Trust-Aware Routing protocol is proposed for selecting an optimal route in MANETs. Direct and indirect trust values are used to estimate the trust value of every node in the network. The route cost is then estimated, and the path with the minimum cost value is chosen; data packets are transmitted on this optimal path. Packets may be dropped due to congestion and/or mobility, and may be reordered during transmission, so a reorder-identification mechanism is used to handle dropped and/or reordered packets. The simulation results of the proposed work are compared with existing work and show improvements in energy efficiency and network lifetime.
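The trust-and-cost idea can be sketched as follows: blend each node's direct trust with the average of indirect reports, then charge routes more for low-trust hops. The 0.6/0.4 weighting and the 1/trust cost model are hypothetical illustrations, not the protocol's actual formulas.

```python
# Illustrative sketch of combining direct and indirect trust into a
# per-node trust value and choosing the minimum-cost route. The weights
# and the cost model are hypothetical assumptions.

def node_trust(direct, indirect_reports, w_direct=0.6):
    """Blend a node's direct trust with the mean of indirect reports."""
    indirect = sum(indirect_reports) / len(indirect_reports)
    return w_direct * direct + (1 - w_direct) * indirect

def route_cost(route, trust):
    # Low-trust hops are expensive: cost of a hop is 1 / trust(node).
    return sum(1.0 / trust[n] for n in route)

trust = {"A": node_trust(0.9, [0.8, 0.7]),   # 0.84
         "B": node_trust(0.4, [0.5, 0.3]),   # 0.40
         "C": node_trust(0.8, [0.9, 0.7])}   # 0.80
routes = [["A", "B"], ["A", "C"]]
best = min(routes, key=lambda r: route_cost(r, trust))
print(best)  # ['A', 'C']
```

The route through the low-trust node B is penalized, so the protocol-style selection prefers the route through C even though both have the same hop count.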

