Pattern Recognition in Histo-Pathology: Basic Considerations

1982 ◽  
Vol 21 (01) ◽  
pp. 15-22 ◽  
Author(s):  
W. Schlegel ◽  
K. Kayser

A basic concept for the automatic diagnosis of histo-pathological specimens is presented. The algorithm is based on the tissue structures of the original organ. Low-power magnification was used to inspect the specimens. The form of the given tissue structures is measured, e.g. diameter, distance, shape factor, and number of neighbours. Graph theory is applied by using the centers of structures as vertices and the shortest connections between neighbours as edges. The algorithm leads to two independent sets of parameters which can be used for diagnostic procedures. First results with colon tissue show significant differences between normal tissue, benign growth, and malignant growth. Polyps form glands that are twice as wide as those of normal and carcinomatous tissue. Carcinomas can be separated by the minimal distance between the glands formed. First results of pattern recognition using graph theory are discussed.
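To make the graph construction concrete, here is a minimal Python sketch, assuming gland centres are given as 2D coordinates and using a Delaunay triangulation as one plausible way to define the "shortest connections of neighbours" (the paper's exact neighbourhood rule may differ):

```python
# Sketch: build a neighbourhood graph from structure centres and derive
# simple structural parameters (neighbour count, minimal neighbour distance).
# The Delaunay triangulation is an assumption, not the paper's stated rule.
import numpy as np
from scipy.spatial import Delaunay

def structure_graph(centres: np.ndarray):
    """centres: (n, 2) array of structure centres used as vertices."""
    tri = Delaunay(centres)
    edges = set()
    for simplex in tri.simplices:              # each simplex is a triangle
        for i in range(3):
            a, b = sorted((int(simplex[i]), int(simplex[(i + 1) % 3])))
            edges.add((a, b))
    return edges

def neighbour_stats(centres, edges):
    n = len(centres)
    degree = [0] * n
    dists = [[] for _ in range(n)]
    for a, b in edges:
        d = float(np.linalg.norm(centres[a] - centres[b]))
        degree[a] += 1; degree[b] += 1
        dists[a].append(d); dists[b].append(d)
    # per-vertex parameters: number of neighbours, minimal neighbour distance
    return [(degree[i], min(dists[i])) for i in range(n)]

centres = np.random.rand(50, 2) * 100          # toy stand-in for measured centres
stats = neighbour_stats(centres, structure_graph(centres))
```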

Author(s):  
Hong-Sen Yan ◽  
Feng-Ming Ou ◽  
Ming-Feng Tang

An algorithm is presented, based on graph theory, for enumerating all feasible serial and/or parallel combined mechanisms from a given rotary or translational power source and specific kinematic building blocks. Through labeled out-tree representations of the configurations of combined mechanisms, the enumeration procedure is developed by adapting the algorithm for the enumeration of trees. A rotary power source and four kinematic building blocks (a crank-rocker linkage, a rack-pinion, a double-slider mechanism, and a cam-follower mechanism) are chosen as the combination to illustrate the algorithm, and two examples are provided to validate it.
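As a rough illustration of the tree-enumeration idea, the following Python sketch generates labelled out-trees over the four building blocks, with the power source as the root; a chain of nodes models a serial combination and sibling branches a parallel one. It is a generic enumeration only and omits the kinematic feasibility checks the paper applies:

```python
# Sketch: enumerate labelled out-trees over the four building blocks.
# A tree is (label, forest); a forest is a sorted tuple of trees, so that
# unordered (parallel) branches are not double-counted. This is a generic
# enumeration and not the paper's feasibility-filtered procedure.
BLOCKS = ("crank-rocker", "rack-pinion", "double-slider", "cam-follower")

def trees(k):
    """All rooted labelled trees with exactly k block-nodes."""
    return {(label, f) for label in BLOCKS for f in forests(k - 1)}

def forests(m):
    """All unordered forests with m block-nodes in total."""
    if m == 0:
        return {()}
    out = set()
    for size in range(1, m + 1):               # size of one subtree
        for t in trees(size):
            for rest in forests(m - size):
                out.add(tuple(sorted((t,) + rest)))
    return out

# Configurations: a rotary power source driving one to three blocks.
configs = [("rotary-source", f) for m in range(1, 4) for f in forests(m)]
print(len(configs))
```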


Author(s):  
Bill Jackson ◽  
Tibor Jordán

In the network localization problem the goal is to determine the location of all nodes by using only partial information on the pairwise distances (and the known exact location of some nodes, called anchors). The network is said to be uniquely localizable if there is a unique set of locations consistent with the given data. Recent results from graph theory and combinatorial rigidity have made it possible to characterize uniquely localizable networks in two dimensions. Based on these developments, extensions, related optimization problems, algorithms, and constructions have also become tractable. This chapter gives a detailed survey of these new results from the graph theorist's viewpoint.
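For intuition, generic rigidity in the plane, one necessary ingredient of unique localizability (Jackson and Jordán's characterization additionally requires redundant rigidity and 3-connectivity), can be tested numerically via the rank of the rigidity matrix at a random placement. A minimal sketch, assuming the network is given as a vertex count and an edge list:

```python
# Sketch: numerical test for generic rigidity in 2D. For a graph on n >= 2
# vertices, generic rigidity corresponds to a rigidity matrix of rank 2n - 3,
# which a random placement attains with probability 1. This checks rigidity
# only, not the full unique-localizability conditions surveyed above.
import numpy as np

def is_generically_rigid_2d(n, edges, seed=0):
    rng = np.random.default_rng(seed)
    p = rng.random((n, 2))                     # random (generic) placement
    R = np.zeros((len(edges), 2 * n))
    for row, (i, j) in enumerate(edges):
        d = p[i] - p[j]
        R[row, 2*i:2*i+2] = d                  # one row per distance constraint
        R[row, 2*j:2*j+2] = -d
    return np.linalg.matrix_rank(R) == 2 * n - 3

# K4 (complete graph on 4 vertices) is rigid in the plane:
print(is_generically_rigid_2d(4, [(0,1),(0,2),(0,3),(1,2),(1,3),(2,3)]))
```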


2018 ◽  
Author(s):  
Johann-Mattis List

Sound correspondence patterns play a crucial role in linguistic reconstruction. Linguists use them to prove language relationships, to reconstruct proto-forms, and for classical phylogenetic reconstruction based on shared innovations. Cognate words which fail to conform with expected patterns can further point to various kinds of exceptions in sound change, such as analogy or assimilation of frequent words. Here we present an automatic method for the inference of sound correspondence patterns across multiple languages based on a network approach. The core idea is to represent all columns in aligned cognate sets as nodes in a network, with edges representing the degree of compatibility between the nodes. The task of inferring all compatible correspondence sets can then be handled as the well-known minimum clique cover problem in graph theory, which seeks to split the graph into the smallest number of cliques such that each node belongs to exactly one clique. The resulting partitions represent all correspondence patterns which can be inferred for a given dataset. By excluding those patterns which occur in only a few cognate sets, the core of regularly recurring sound correspondences can be inferred. Based on this idea, the paper presents a method for automatic correspondence pattern recognition, implemented as part of a Python library which supplements the paper. To illustrate the usefulness of the method, we show how the inferred patterns can be used to predict words that have not been observed before.
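A toy sketch of the clique-cover step: a clique cover of the compatibility graph is exactly a proper colouring of its complement, and since minimum clique cover is NP-hard, a greedy colouring yields a heuristic partition (the strategy in the author's actual library may differ). The graph below is a hypothetical compatibility graph over four alignment columns:

```python
# Sketch: partition alignment columns into correspondence patterns by
# covering the compatibility graph with cliques. Colour classes of the
# complement graph are independent sets there, i.e. cliques in the original.
import networkx as nx

def correspondence_partitions(compat_graph: nx.Graph):
    colouring = nx.greedy_color(nx.complement(compat_graph),
                                strategy="largest_first")
    cliques = {}
    for node, colour in colouring.items():
        cliques.setdefault(colour, []).append(node)
    return list(cliques.values())

# Hypothetical columns c1..c4; edges mark pairwise-compatible columns.
G = nx.Graph([("c1", "c2"), ("c1", "c3"), ("c3", "c4")])
print(correspondence_partitions(G))
```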


2020 ◽  
Vol 13 (44) ◽  
pp. 4483-4489
Author(s):  
C Beaula ◽  

Background/Objective: The coronavirus Covid-19 has affected almost all countries; millions of people have been infected and many deaths have been reported everywhere. The uncertainty and fear created by the pandemic can be exploited by hackers to steal data from both private and public systems. Hence, there is an urgent need to improve the security of these systems, which can be done only by building a strong cryptosystem. Many researchers have therefore started embedding different topics of mathematics, such as algebra and number theory, in cryptography to keep systems safe and secure. In this study, a cryptosystem using graph theory has been attempted to strengthen the security of the system. Method: A new graph, known as a double vertex graph, is constructed from the given graph. The edge labeling of this double vertex graph is used in encryption and decryption. Findings: A new cryptosystem using the amalgamation of a path, its double vertex graph, and edge labeling has been proposed. From the double vertex graph of a path, we give a method to find the original path. To hack such an encrypted key, knowledge of graph theory is required, which makes the system stronger. Applications: The one-word encryption method will be useful in every security system that needs a password for secure communication, storage, or authentication. Keywords: Double vertex graphs; path; adjacency matrix; encryption; cryptography
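For readers unfamiliar with the construction, the following sketch builds the double vertex graph of a path using the standard definition (two 2-element vertex subsets are adjacent when they share exactly one vertex and their differing vertices are adjacent in the original graph); the paper's specific edge-labeling and encryption steps are not reproduced here:

```python
# Sketch: double vertex graph of a path P_n. Vertices are 2-element subsets
# of V(P_n); {u,v} ~ {u,w} whenever v and w are adjacent in P_n. The
# encryption scheme built on this graph's edge labeling is not shown.
from itertools import combinations
import networkx as nx

def double_vertex_graph(G: nx.Graph) -> nx.Graph:
    D = nx.Graph()
    D.add_nodes_from(frozenset(p) for p in combinations(G.nodes, 2))
    for A, B in combinations(list(D.nodes), 2):
        common = A & B
        if len(common) == 1:                   # share exactly one vertex
            (u,) = A - common
            (v,) = B - common
            if G.has_edge(u, v):               # differing vertices adjacent
                D.add_edge(A, B)
    return D

P5 = nx.path_graph(5)                          # path on vertices 0..4
D = double_vertex_graph(P5)
print(D.number_of_nodes(), D.number_of_edges())  # 10 vertices for n = 5
```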


2005 ◽  
Vol 97 (1) ◽  
pp. 309-320 ◽  
Author(s):  
Martin E. Arendasy ◽  
Andreas Hergovich ◽  
Markus Sommer ◽  
Bettina Bognar

The study reports first results on the dimensionality and construct validity of a newly developed objective, video-based personality test which assesses the willingness to take risks in traffic situations. On the basis of the theory of risk homeostasis developed by Wilde, different traffic situations with varying amounts of objective danger were filmed. These mainly consisted of situations involving passing maneuvers and speed choice, or traffic situations at intersections. Each traffic situation describes an action which should be carried out. The videos of the traffic situations are presented twice. Before the first presentation, a short written explanation of the preceding traffic situation and a situation-contingent reaction is provided. The respondents are allowed to obtain an overview of the given situation during the first presentation of each traffic situation. During the second presentation the respondents are asked to indicate at which point the action contingent on the described situation would become too dangerous to carry out. Latencies for items were recorded as a measure of the magnitude of the person's subjectively accepted willingness to take risks in the sense of Wilde's risk homeostasis theory. In a study of 243 people of varying education and sex, the one-dimensionality of the test according to Scheiblechner's latency model was investigated. Analysis indicated that the new measure assesses a one-dimensional latent personality trait which can be interpreted as the subjectively accepted amount of risk (target risk value). First indicators of the construct validity of the test are given by a significant correlation with the construct-related secondary scale adventurousness of the Eysenck Personality Profiler and, at the same time, nonsignificant correlations with the two secondary scales extroversion and emotional stability, which are not linked to the construct.


Author(s):  
P. Viswanath ◽  
Narasimha M. Murty ◽  
Bhatnagar Shalabh

Parametric methods first choose the form of the model or hypothesis and then estimate the necessary parameters from the given dataset. The form, chosen based on experience or domain knowledge, often need not match what actually exists (Duda, Hart & Stork, 2000). Further, apart from being highly error-prone, such methods show poor adaptability to dynamically changing datasets. On the other hand, non-parametric pattern recognition methods are attractive because they do not derive any model but work with the given dataset directly. These methods are highly adaptive to dynamically changing datasets. Two widely used non-parametric pattern recognition methods are (a) nearest neighbor based classification and (b) Parzen-window based density estimation (Duda, Hart & Stork, 2000). Two major problems in applying non-parametric methods, especially with large and high-dimensional datasets, are (a) the high computational requirements and (b) the curse of dimensionality (Duda, Hart & Stork, 2000). Algorithmic improvements and approximate methods can solve the first problem, whereas feature selection (Guyon & Elisseeff, 2003), feature extraction (Terabe, Washio, Motoda, Katai & Sawaragi, 2002) and bootstrapping techniques (Efron, 1979; Hamamoto, Uchimura & Tomita, 1997) can tackle the second. We propose a novel and unified solution for these problems by deriving a compact and generalized abstraction of the data. By this term, we mean a compact representation of the given patterns from which one can retrieve not only the original patterns but also some artificial patterns. The compactness of the abstraction reduces the computational requirements, and its generalization reduces the curse-of-dimensionality effect. Pattern synthesis techniques accompanied by compact representations attempt to derive such compact and generalized abstractions of the data. These techniques are applied with (a) the nearest neighbor classifier (NNC), a popular non-parametric classifier used in many fields including data mining since its conception in the early fifties (Dasarathy, 2002), and (b) Parzen-window based density estimation, a well-known non-parametric density estimation method (Duda, Hart & Stork, 2000).
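For reference, here is a minimal NumPy sketch of the two non-parametric methods named above, 1-nearest-neighbour classification and Gaussian Parzen-window density estimation; the compact abstractions the chapter proposes would stand in for the raw training array:

```python
# Sketch: the two classical non-parametric methods. Both consult the stored
# training patterns directly instead of fitting a parametric model.
import numpy as np

def nn_classify(train, labels, query):
    """1-NN: return the label of the closest training pattern."""
    d = np.linalg.norm(train - query, axis=1)
    return labels[np.argmin(d)]

def parzen_density(train, query, h=1.0):
    """Gaussian Parzen-window density estimate at `query`, bandwidth h."""
    n, dim = train.shape
    d2 = np.sum((train - query) ** 2, axis=1)
    kernel = np.exp(-d2 / (2 * h**2)) / ((2 * np.pi) ** (dim / 2) * h**dim)
    return kernel.mean()

X = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
y = np.array([0, 0, 1])
print(nn_classify(X, y, np.array([0.9, 1.2])))       # -> 0
print(parzen_density(X, np.array([0.5, 0.5]), h=0.5))
```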


Author(s):  
Jörg Andreas Walter

For many tasks of exploratory data analysis, visualization plays an important role. It is key to the efficient integration of human expertise, drawing not only on the user's background knowledge, intuition, and creativity, but also on powerful human pattern recognition and processing capabilities. The design goals for optimal user interaction strongly depend on the given visualization task, but they certainly include easy and intuitive navigation with strong support for the user's orientation.


2017 ◽  
Vol 7 (2) ◽  
pp. 340-348 ◽  
Author(s):  
M. Domini ◽  
G. Langergraber ◽  
L. Rondi ◽  
S. Sorlini ◽  
S. Maswaga

The Sanitation Safety Planning methodology is implemented within a cooperation project in Iringa, Tanzania. The study presents the methodology and its adaptation and use in the given context, in order to assess risks and to support stakeholders in improving the current sanitation system and validating the design of an improved one. First results of applying the methodology, obtained in one of the four peri-urban wards of Iringa, demonstrated its efficacy and utility in prioritising risks and identifying cost-effective control measures. Risks were assessed using a semi-quantitative approach, and a simplified risk assessment matrix was developed for the case study. A sensitivity analysis was carried out to evaluate criteria for prioritising the control measures to be selected for the development of an achievable improvement plan.


2016 ◽  
Vol 32 ◽  
pp. 64-65
Author(s):  
S. Strolin ◽  
L. Strigari ◽  
V. Bruzzaniti ◽  
S. Ungania ◽  
R. Nigro ◽  
...  

2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Shu Gong ◽  
Haci Mehmet Baskonus ◽  
Wei Gao

The security of a network is closely related to the structure of the network graph: the denser the graph structure, the better the network can resist attacks. Toughness and isolated toughness are used to characterize the vulnerability of a network and have received attention from scholars in both mathematics and computer science. On this basis, considering the particularity of sun component structures, sun toughness was introduced in mathematics and applied to computer networks. From the perspective of modern graph theory, this paper presents sun toughness conditions for path factor uniform graphs and path factor critical avoidable graphs in the P≥2-factor and P≥3-factor settings. Furthermore, examples show that the given bounds are sharp.
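As background, classical toughness can be computed by brute force on small graphs, as in the sketch below; sun toughness replaces the component count with the number of sun components, which this sketch does not implement:

```python
# Sketch: brute-force classical toughness t(G) = min |S| / c(G - S), taken
# over vertex sets S whose removal disconnects G (exponential; small graphs
# only). Complete graphs have no such S, so t stays infinite by convention.
import networkx as nx
from itertools import combinations

def toughness(G: nx.Graph) -> float:
    nodes = list(G.nodes)
    best = float("inf")
    for k in range(1, len(nodes) - 1):
        for S in combinations(nodes, k):
            H = G.copy()
            H.remove_nodes_from(S)
            c = nx.number_connected_components(H)
            if c >= 2:                         # S is a vertex cut
                best = min(best, k / c)
    return best

print(toughness(nx.cycle_graph(6)))            # cycles have toughness 1.0
```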

