Preserving Minority Structures in Graph Sampling

2020 ◽
Author(s):  
Ying Zhao

Sampling is a widely used graph reduction technique to accelerate graph computations and simplify graph visualizations. By comprehensively analyzing the literature on graph sampling, we assume that existing algorithms cannot effectively preserve minority structures that are rare and small in a graph but are very important in graph analysis. In this work, we initially conduct a pilot user study to investigate representative minority structures that are most appealing to human viewers. We then perform an experimental study to evaluate the performance of existing graph sampling algorithms regarding minority structure preservation. Results confirm our assumption and suggest key points for designing a new graph sampling approach named mino-centric graph sampling (MCGS). In this approach, a triangle-based algorithm and a cut-point-based algorithm are proposed to efficiently identify minority structures. A set of importance assessment criteria are designed to guide the preservation of important minority structures. Three optimization objectives are introduced into a greedy strategy to balance the preservation between minority and majority structures and suppress the generation of new minority structures. A series of experiments and case studies are conducted to evaluate the effectiveness of the proposed MCGS.
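
The abstract names triangle-based and cut-point-based identification of minority structures. The sketch below is only a rough illustration of how such candidates (cut points and triangle members) could be enumerated with networkx; it is not the authors' MCGS implementation, and the function name and toy graph are hypothetical.

```python
import networkx as nx

def minority_structure_candidates(G):
    """Return cut points of G and nodes that participate in at least one triangle."""
    # Cut points (articulation points): removing one disconnects the graph, so the
    # small components hanging off them are easily lost by uniform sampling.
    cut_points = set(nx.articulation_points(G))
    # Nodes with a positive triangle count; small, isolated triangles on the
    # periphery are typical "rare and small" structures.
    triangle_nodes = {v for v, t in nx.triangles(G).items() if t > 0}
    return cut_points, triangle_nodes

if __name__ == "__main__":
    # Toy example: two K5 cliques joined through a single bridging node.
    G = nx.barbell_graph(5, 1)
    cut_points, triangle_nodes = minority_structure_candidates(G)
    print("cut points:", cut_points)
    print("triangle nodes:", triangle_nodes)
```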



Author(s):  
Xiaofang Lv ◽  
Da Yu ◽  
Wenqing Li ◽  
Bohui Shi ◽  
Jing Gong

Hydrate formation and blockage in long deepwater pipelines has long been a problem for offshore petroleum production. Consequently, understanding the process of hydrate blockage and its influencing factors is key to formulating reasonable flow assurance strategies. Two series of experiments were therefore conducted in a high-pressure hydrate flow loop newly constructed by the multiphase flow research group at China University of Petroleum (Beijing). One system consists of water and CO2, while the other includes water, diesel oil and natural gas. The relative time to hydrate blockage was studied by varying pressure and flow rate for both systems. The sizes of hydrate particles in the fluid during plugging were also investigated. The results indicate that each influencing factor exerts a similar effect on the relative time in the two systems, and that particle sizes in the fluid change significantly due to hydrate formation.


2010 ◽  
Vol 36 ◽  
pp. 162-166
Author(s):  
Rui Wang ◽  
Yuan Bao Leng ◽  
Chang Zheng Li

A sub-bottom profiler is a kind of underwater acoustic imaging equipment. It scans sub-bottom strata with acoustic signals and presents a cross-sectional image. The frequency range and transmitting power are the key factors in choosing a suitable profiler: in general, a higher frequency means higher resolution but a smaller imaging range, and the transmitting power also affects the imaging range. A sub-bottom profiler can show hydraulic and civil engineers what an embankment's foundation looks like, especially the distribution of enrockments. With this information, engineers can evaluate the safety of embankments and decide what measures are needed to keep them stable. A typical profiler, the X-Star, was used in a series of experiments carried out on the Yellow River, the famously sediment-laden and second-longest river of China.
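
As a rough, hedged illustration of the frequency trade-off described above (not taken from the paper), the snippet below estimates the acoustic wavelength and an approximate vertical resolution from the operating frequency, assuming a nominal sound speed of 1500 m/s in water and the common quarter-wavelength rule of thumb.

```python
SOUND_SPEED = 1500.0  # m/s, assumed nominal sound speed in water

def wavelength(frequency_hz):
    """Acoustic wavelength in metres for a given frequency."""
    return SOUND_SPEED / frequency_hz

def approx_vertical_resolution(frequency_hz):
    """Quarter-wavelength rule of thumb; real resolution also depends on pulse bandwidth."""
    return wavelength(frequency_hz) / 4.0

for f_khz in (2, 10, 100):
    f_hz = f_khz * 1_000
    print(f"{f_khz:>3} kHz: wavelength ~ {wavelength(f_hz) * 100:.1f} cm, "
          f"vertical resolution ~ {approx_vertical_resolution(f_hz) * 100:.1f} cm")
```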


2013 ◽  
Vol 416-417 ◽  
pp. 1738-1740
Author(s):  
Chun Hong Xu ◽  
Zhan Fu Jing

Mass concrete technology has been widely used in the construction field. In engineering construction, the quality and construction technology of mass concrete are directly related to the quality of the project. Thus, before mass concrete is placed, a series of experiments should be conducted, and the key points of the construction technology should be strictly followed during the construction process. In this paper, taking the application of mass concrete in a stadium project as an example, the tests carried out before construction and the technical measures applied during construction are analyzed.


Robotica ◽  
2014 ◽  
Vol 33 (4) ◽  
pp. 986-1002 ◽  
Author(s):  
Nutan Chen ◽  
Keng Peng Tee ◽  
Chee-Meng Chew

SUMMARY Teleoperated grasping requires the ability to follow the user's intended trajectory and to autonomously search for a suitable pre-grasp pose relative to the object of interest. Challenges include dealing with uncertainty due to teleoperator noise, human factors and calibration errors in the sensors. To address these challenges, an effective and robust algorithm is introduced to assist grasping during teleoperation. Without resorting to premature object contact or regrasping strategies, the algorithm enables the robot to perform online adjustments to reach a pre-grasp pose before final grasping. We use three infrared (IR) sensors mounted on the robot hand and design an algorithm that controls the robot hand to grasp objects using the sensors' readings together with the interface component. Finally, a series of experiments demonstrates that the system is robust when grasping a wide range of objects and tracking slow-moving objects. Empirical data from a five-subject user study allows us to tune the relative contributions of the IR sensors and the interface component so as to achieve a balance between grasp assistance and teleoperation.
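
As a simplified, hypothetical sketch of the kind of blending the abstract describes (not the authors' actual controller), the snippet below mixes a teleoperator velocity command with an IR-based correction toward a pre-grasp pose; the weight `alpha` stands in for the tuned relative contribution, and all names and gains are illustrative.

```python
import numpy as np

def blended_hand_velocity(user_velocity, ir_offsets, alpha=0.6, gain=2.0):
    """Weighted blend of the teleoperator command and an IR-based centring correction.

    user_velocity : (3,) Cartesian velocity commanded through the interface component
    ir_offsets    : (3,) signed range differences from the three IR sensors, used here
                    as a crude estimate of the offset from a centred pre-grasp pose
    alpha         : weight on the user command (1.0 = pure teleoperation)
    gain          : proportional gain on the sensor-based correction
    """
    user_velocity = np.asarray(user_velocity, dtype=float)
    correction = -gain * np.asarray(ir_offsets, dtype=float)  # steer toward the object
    return alpha * user_velocity + (1.0 - alpha) * correction

# Example: the user pushes forward while the IR readings suggest the object
# sits slightly to the left of the hand.
print(blended_hand_velocity([0.05, 0.0, 0.0], [0.0, 0.01, -0.005]))
```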


2016 ◽  
Vol 33 (4) ◽  
pp. 202-218 ◽  
Author(s):  
Marija Vištica ◽  
Ani Grubišic ◽  
Branko Žitko

Purpose – In order to initialize a student model in intelligent tutoring systems, some form of initial knowledge test should be given to the student. Since the authors cannot include all domain knowledge in that initial test, a domain knowledge subset should be selected. The paper aims to discuss this issue. Design/methodology/approach – In order to generate a knowledge sample that truly represents a certain domain knowledge, the authors can use sampling algorithms. In this paper, the authors present five sampling algorithms (Random Walk, Metropolis-Hastings Random Walk, Forest Fire, Snowball and the Represent algorithm) and investigate which structural properties of the domain knowledge sample are preserved after the sampling process is conducted. Findings – The samples obtained using these algorithms are compared in terms of their cumulative node degree distributions, clustering coefficients and shortest path lengths in the sampled graphs in order to find the best one. Originality/value – This approach is original, as the authors could not find any similar work that uses graph sampling methods for student modeling.
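
A minimal sketch of one of the listed samplers (simple Random Walk) and of the property comparison described in the Findings, assuming an undirected networkx graph; the generator used as a stand-in for a domain-knowledge graph is purely illustrative, not the authors' data.

```python
import random
import networkx as nx

def random_walk_sample(G, sample_size, seed=None):
    """Collect nodes visited by a simple random walk and return the induced subgraph."""
    rng = random.Random(seed)
    nodes = list(G.nodes)
    current = rng.choice(nodes)
    visited = {current}
    while len(visited) < sample_size:
        neighbors = list(G.neighbors(current))
        current = rng.choice(neighbors) if neighbors else rng.choice(nodes)  # restart on dead ends
        visited.add(current)
    return G.subgraph(visited).copy()

def compare_properties(G, S):
    """Clustering coefficient and shortest-path length (on the largest component) of G vs. S."""
    def stats(H):
        giant = H.subgraph(max(nx.connected_components(H), key=len))
        return {"avg_clustering": nx.average_clustering(H),
                "avg_shortest_path": nx.average_shortest_path_length(giant)}
    return {"original": stats(G), "sample": stats(S)}

# Synthetic stand-in for a domain-knowledge graph (illustrative only).
G = nx.powerlaw_cluster_graph(1000, 3, 0.3, seed=1)
S = random_walk_sample(G, 100, seed=1)
print(compare_properties(G, S))
```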


2018 ◽  
Vol 7 (3) ◽  
pp. 1286
Author(s):  
Nidha Khanam ◽  
Rupali Sunil Wagh

Graphs or networks are very commonly used to represent connected or linked data. With the penetration of the web into every sphere of life, networked relationships can easily be established through communication links, and network and graph analysis become obvious choices for data representation and analysis. There are also processes that can be analysed as networks not through the web but through the knowledge links available in their domains. In both cases, network analysis is challenged by the enormous size of the network in terms of nodes and links. Subgraph sampling can be employed effectively on large network structures to reduce the size of the data while preserving the original properties of the network. In this paper, the authors present a case study on applying a subgraph sampling approach to obtain a reduced case-citation network in the legal domain.
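
A hedged sketch of how such a reduction could look in practice is given below: snowball-style expansion over a synthetic directed citation graph, followed by a check of how well the in-degree (times-cited) distribution is preserved. The graph generator, sample sizes and KS-test comparison are illustrative assumptions, not the paper's actual pipeline.

```python
from collections import deque

import networkx as nx
from scipy.stats import ks_2samp

def snowball_sample(G, seed_node, max_nodes):
    """Breadth-first (snowball) expansion over citation links until max_nodes nodes are collected."""
    selected, queue = {seed_node}, deque([seed_node])
    while queue and len(selected) < max_nodes:
        node = queue.popleft()
        for nbr in list(G.successors(node)) + list(G.predecessors(node)):
            if nbr not in selected:
                selected.add(nbr)
                queue.append(nbr)
                if len(selected) >= max_nodes:
                    break
    return G.subgraph(selected).copy()

# Synthetic stand-in for a case-citation network (directed, heavy-tailed degrees).
G = nx.DiGraph(nx.scale_free_graph(5000, seed=7))   # collapse parallel edges
sample = snowball_sample(G, seed_node=0, max_nodes=500)

# Compare the in-degree ("times cited") distributions of the full and sampled networks.
stat, p_value = ks_2samp([d for _, d in G.in_degree()],
                         [d for _, d in sample.in_degree()])
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")
```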

