A Multi-Granularity Backbone Network Extraction Method Based on the Topology Potential

Complexity ◽  
2018 ◽  
Vol 2018 ◽  
pp. 1-8 ◽  
Author(s):  
Hanning Yuan ◽  
Yanni Han ◽  
Ning Cai ◽  
Wei An

Inspired by field theory in physics, in this paper we propose a novel backbone network compression algorithm based on topology potential. Taking both network connectivity and backbone compression precision into consideration, the method is flexible and efficient across varied network characteristics. Meanwhile, we define a metric named compression ratio to evaluate the performance of backbone networks, which provides an optimal extraction granularity based on the contributions of node degree and topology connectivity. We apply our method to the publicly available Internet AS network and Hep-th network, two public datasets widely used in complex network analysis. Furthermore, we compare the obtained results using the metrics of precision ratio and recall ratio. All these results show that our algorithm is superior to the compared methods. Moreover, we investigate the degree distribution and self-similarity of the extracted backbone. The compressed backbone network is shown to preserve many similarity properties of the original network, including its power-law exponent.
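The abstract does not give the authors' exact formulas, but topology potential is conventionally defined with a Gaussian decay over hop distance, phi(v) = sum over u of exp(-(d(v,u)/sigma)^2). The following is a minimal sketch of that idea, with the ranking threshold (keep_ratio) as an assumed stand-in for the paper's compression-ratio-driven granularity selection:

```python
# Hedged sketch: topology-potential scoring and top-k backbone extraction.
# sigma and keep_ratio are illustrative parameters, not the authors' values.
from collections import deque
import math

def shortest_path_lengths(adj, src):
    """BFS hop distances from src over an undirected adjacency dict."""
    dist = {src: 0}
    q = deque([src])
    while q:
        v = q.popleft()
        for u in adj[v]:
            if u not in dist:
                dist[u] = dist[v] + 1
                q.append(u)
    return dist

def topology_potential(adj, sigma=1.0):
    """Gaussian-decay potential: phi(v) = sum_u exp(-(d(v,u)/sigma)**2)."""
    phi = {}
    for v in adj:
        dist = shortest_path_lengths(adj, v)
        phi[v] = sum(math.exp(-(d / sigma) ** 2)
                     for u, d in dist.items() if u != v)
    return phi

def extract_backbone(adj, keep_ratio=0.5):
    """Keep the top fraction of nodes by potential (one possible granularity)."""
    phi = topology_potential(adj)
    ranked = sorted(phi, key=phi.get, reverse=True)
    kept = set(ranked[: max(1, int(len(ranked) * keep_ratio))])
    return {v: [u for u in adj[v] if u in kept] for v in kept}

adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0, 4], 4: [3]}
backbone = extract_backbone(adj, keep_ratio=0.6)
```

The hub node (0) and the bridge node (3) score highest and survive compression, which matches the intuition that backbone extraction should favor well-connected and structurally important nodes.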

2010 ◽  
Vol 40-41 ◽  
pp. 361-365
Author(s):  
Ya Qin Fan ◽  
Hao Fan ◽  
Chao Sun

This paper analyzes the features of IPSec and MPLS technologies and proposes a suitable MPLS VPN security solution. The scheme addresses problems that arise when a VPN uses an MPLS backbone network over public transport: layer-two information cannot be encrypted automatically, and connection errors can interrupt service and lead to information disclosure. To verify the proposed security scheme, a practical MPLS VPN is simplified and abstracted into a simulation model, which is then simulated using the OPNET software; the simulation results provide practical reference value for VPN practitioners.


Author(s):  
Taye Girma Debelee ◽  
Abrham Gebreselasie ◽  
Friedhelm Schwenker ◽  
Mohammadreza Amirian ◽  
Dereje Yohannes

In this paper, a modified adaptive K-means (MAKM) method is proposed to extract the region of interest (ROI) from local and public datasets. The local image datasets are collected from Bethezata General Hospital (BGH) and the public datasets are from the Mammographic Image Analysis Society (MIAS). The same image counts are used for both datasets: 112 abnormal and 208 normal. Two texture features (GLCM and Gabor) extracted from ROIs and one set of CNN-based features are considered in the experiment. CNN features are extracted using an Inception-V3 pre-trained model after simple preprocessing and cropping. The quality of the features is evaluated both individually and after fusing features with one another, and five classifiers (SVM, KNN, MLP, RF, and NB) are used to measure the descriptive power of the features using cross-validation. The proposed approach was first evaluated on the local dataset and then applied to the public dataset. The results of the classifiers are measured using accuracy, sensitivity, specificity, kappa, computation time, and AUC. The experimental analysis using GLCM features from the two datasets indicates that GLCM features from the BGH dataset outperformed those of the MIAS dataset with all five classifiers. However, Gabor features from the two datasets scored the best results with two classifiers (SVM and MLP). For BGH and MIAS, SVM scored accuracies of 99% and 97.46%, sensitivities of 99.48% and 96.26%, and specificities of 98.16% and 100%, respectively; MLP achieved accuracies of 97% and 87.64%, sensitivities of 97.40% and 96.65%, and specificities of 96.26% and 75.73%, respectively. The best relative performance was achieved by fusing Gabor and CNN-based features with the MLP classifier. However, KNN, MLP, RF, and NB achieved almost 100% performance on GLCM texture features, and SVM scored an accuracy of 96.88%, a sensitivity of 97.14%, and a specificity of 96.36%. Compared to the other classifiers, NB required the least computation time in all experiments.
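The feature-fusion step described above is, at its core, per-sample concatenation of feature vectors before classification. This minimal sketch illustrates that step with a toy 1-nearest-neighbour classifier standing in for the paper's SVM/KNN/MLP/RF/NB suite; the data and labels are invented for illustration:

```python
# Hedged sketch: feature fusion (concatenation) followed by a toy 1-NN classifier.
# Real experiments would use extracted GLCM/Gabor/CNN features and cross-validation.
import math

def fuse(features_a, features_b):
    """Concatenate two per-sample feature lists (e.g. Gabor + CNN features)."""
    return [a + b for a, b in zip(features_a, features_b)]

def knn_predict(train_X, train_y, x):
    """Predict the label of x by its single nearest training sample."""
    def dist(p, q):
        return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))
    best = min(range(len(train_X)), key=lambda i: dist(train_X[i], x))
    return train_y[best]

gabor = [[0.1, 0.2], [0.9, 0.8]]      # toy Gabor features, two samples
cnn = [[0.0], [1.0]]                  # toy CNN features, same samples
fused = fuse(gabor, cnn)              # -> 3-dimensional fused vectors
labels = ["normal", "abnormal"]
pred = knn_predict(fused, labels, [0.8, 0.9, 0.9])
```

Concatenation is the simplest fusion strategy; it preserves all information from both feature families and lets the downstream classifier weigh them, at the cost of higher dimensionality.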


2020 ◽  
Vol 10 (17) ◽  
pp. 5922 ◽  
Author(s):  
Yong Fang ◽  
Jian Gao ◽  
Zhonglin Liu ◽  
Cheng Huang

In the context of increasing cyber threats and attacks, monitoring and analyzing network security incidents in a timely and effective way is key to ensuring network infrastructure security. As one of the world’s most popular social media sites, Twitter hosts all kinds of user posts, from daily life to global news and political strategy. It can aggregate a large number of network security-related events promptly and provide a source of information flow about cyber threats. In this paper, for detecting cyber threat events on Twitter, we present a multi-task learning approach based on natural language processing and machine learning, combining an Iterated Dilated Convolutional Neural Network (IDCNN) with Bidirectional Long Short-Term Memory (BiLSTM) to establish a highly accurate network model. Furthermore, we assemble a threat-related Twitter database from public datasets to verify our model’s performance. The results show that the proposed model detects cyber threat events from tweets well and significantly outperforms several baselines.
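The defining operation in an IDCNN is the dilated convolution, which skips over inputs at a fixed stride so that stacked layers see an exponentially growing context window. This sketch shows the bare operation on a 1-D sequence in pure Python; real models apply it to learned token embeddings with learned kernels:

```python
# Hedged sketch: 1-D dilated convolution, the building block of an IDCNN.
# The sequence and kernel here are toy values, not learned parameters.
def dilated_conv1d(seq, kernel, dilation):
    """Valid (no-padding) 1-D convolution where kernel taps are spaced
    `dilation` positions apart, widening the receptive field."""
    k = len(kernel)
    span = (k - 1) * dilation + 1          # total input width each output sees
    return [sum(kernel[j] * seq[i + j * dilation] for j in range(k))
            for i in range(len(seq) - span + 1)]

out = dilated_conv1d([1, 2, 3, 4, 5], [1, 1], dilation=2)
```

With dilation 1 this reduces to an ordinary convolution; doubling the dilation at each stacked layer (1, 2, 4, ...) is what lets IDCNNs match an LSTM's long-range context at much lower sequential cost.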


2020 ◽  
Vol 12 (22) ◽  
pp. 3818
Author(s):  
YuAn Wang ◽  
Liang Chen ◽  
Peng Wei ◽  
XiangChen Lu

Based on the Manhattan-world hypothesis, we propose a tightly-coupled monocular visual-inertial odometry (VIO) system that combines structural features with point features and can run on a mobile phone in real time. The back-end optimization is based on the sliding window method to improve computing efficiency. As Manhattan-world structure is abundant in man-made environments, structural features can encode the orthogonality and parallelism concealed in buildings to eliminate accumulated rotation error. We define a structural feature as an orthogonal basis composed of three orthogonal vanishing points in the Manhattan world. Meanwhile, to extract structural features in real time on the mobile phone, we propose a fast structural feature extraction method based on the known vertical dominant direction. Our experiments on public datasets and a self-collected dataset show that our system is superior to most existing open-source systems, especially in situations where the images are texture-less, dark, and blurry.
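The structural feature above is an orthogonal basis of three vanishing-point directions. A minimal validity check for such a basis is pairwise orthogonality of the direction vectors, sketched here (the tolerance and vectors are illustrative, not from the paper):

```python
# Hedged sketch: checking that three vanishing-point direction vectors
# form an orthogonal basis, as a Manhattan-world structural feature requires.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_orthogonal_basis(v1, v2, v3, tol=1e-6):
    """True if all three direction vectors are pairwise orthogonal."""
    return all(abs(dot(a, b)) < tol
               for a, b in ((v1, v2), (v1, v3), (v2, v3)))

# The canonical Manhattan frame: three axis-aligned dominant directions.
ok = is_orthogonal_basis((1, 0, 0), (0, 1, 0), (0, 0, 1))
bad = is_orthogonal_basis((1, 0, 0), (1, 1, 0), (0, 0, 1))
```

In practice, once the vertical dominant direction is known (e.g. from the IMU gravity estimate), only the two horizontal directions remain to be searched, which is what makes the extraction fast enough for a phone.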


2015 ◽  
Vol 23 (3) ◽  
pp. 596-600 ◽  
Author(s):  
Taha A Kass-Hout ◽  
Zhiheng Xu ◽  
Matthew Mohebbi ◽  
Hans Nelsen ◽  
Adam Baker ◽  
...  

Objective The objective of openFDA is to facilitate access to and use of important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs). Materials and Methods Using cutting-edge technologies deployed on FDA’s new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps, and ideas; and infrastructure that can be adopted for other public health big data challenges. Results Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates the use of openFDA data to understand an apparent association of a drug with an adverse event. Conclusion With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products.
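The openFDA APIs are plain HTTPS endpoints that take `search` and `limit` query parameters. This sketch builds a query URL for the drug adverse-event endpoint; it only constructs the URL (an actual request needs network access, and the specific search field shown is one documented example, not the only option):

```python
# Hedged sketch: building an openFDA adverse-event query URL.
# Endpoint and parameters follow the public openFDA documentation.
from urllib.parse import urlencode

BASE = "https://api.fda.gov/drug/event.json"

def build_query(search, limit=1):
    """Return a query URL for the openFDA drug adverse-event API."""
    return BASE + "?" + urlencode({"search": search, "limit": limit})

url = build_query('patient.drug.medicinalproduct:"aspirin"', limit=5)
```

Fetching `url` with any HTTP client returns JSON containing a `meta` section (disclaimer, result counts) and a `results` array of adverse-event reports; the other three APIs mentioned in the abstract follow the same URL pattern with different endpoints.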


2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Hao Hu ◽  
Yuling Liu ◽  
Yingjie Yang ◽  
Hongqi Zhang ◽  
Yuchen Zhang

The attack graph (AG) is an abstraction technique that reveals the ways an attacker can leverage vulnerabilities in a given network to violate security policies. The analyses developed to extract security-relevant properties are referred to as AG-based security evaluations. In recent years, many evaluation approaches have been explored. However, they are generally limited by the attacker “monotonicity” assumption, a limitation that needs to be overcome. To address this issue, the stochastic mathematical model called the absorbing Markov chain (AMC) is applied over the AG to give some new insights, namely, the expected success probability of attack intention (EAIP) and the expected attack path length (EAPL). Our evaluations identify the preferred target hosts to mitigate and the vulnerability patching prioritization for intermediate hosts. Tests on the public datasets DARPA2000 and Defcon’s CTF23 both verify that our evaluations are effective and reliable.
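For an absorbing Markov chain, the standard machinery behind metrics like the expected attack path length is the fundamental matrix N = (I - Q)^-1, where Q holds transition probabilities among transient states; the row sums of N give the expected number of steps before absorption. A minimal stdlib-only sketch (the example Q matrix is invented, not from the DARPA2000 or CTF23 evaluations):

```python
# Hedged sketch: expected steps to absorption in an absorbing Markov chain,
# via the fundamental matrix N = (I - Q)^-1 computed with Gauss-Jordan.
def mat_inv(A):
    """Invert a small square matrix by Gauss-Jordan elimination."""
    n = len(A)
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]

def amc_expected_steps(Q):
    """Expected steps to absorption from each transient state: row sums of N."""
    n = len(Q)
    I_minus_Q = [[(1.0 if i == j else 0.0) - Q[i][j] for j in range(n)]
                 for i in range(n)]
    N = mat_inv(I_minus_Q)
    return [sum(row) for row in N]

# Toy chain: state 0 can loop (0.5) or move to state 1 (0.2);
# state 1 loops (0.5); remaining mass is absorbed each step.
t = amc_expected_steps([[0.5, 0.2], [0.0, 0.5]])
```

In the AG setting, transient states correspond to intermediate attack steps and absorbing states to attack goals (or detection), so these row sums are a natural basis for an expected-attack-path-length metric.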


2014 ◽  
Vol 41 (9) ◽  
pp. 788-799 ◽  
Author(s):  
Liliana Quintero-Cano ◽  
Mohamed Wahba ◽  
Tarek Sayed

A transit network, visualized as a graph, can be evaluated using indicators such as connectivity, coverage, directness, and complexity, among others, based on the relationships between network elements. This study focuses on the analysis of interconnected and operationally complicated bus networks, a shortcoming of existing approaches tailored to simpler, metro networks. A new procedure is proposed for drawing bus networks as graphs, by disaggregating them into sub-networks at the traffic analysis zone level. As well, improved network connectivity indicators are proposed which incorporate the influence of bus operational characteristics. The effect of bus route transfers is analyzed by introducing intermediate walking transfer edges. The contribution of this research will provide transit agencies with quantitative measures to analyze the network characteristics and the related operational attributes at a zonal sub-network level across the agency’s coverage area. The proposed methodology was demonstrated by applying it to the Greater Vancouver Regional District public transportation system.
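Classical graph-theoretic connectivity indicators of the kind this study builds on include the beta index (edges per node) and the gamma index (observed edges over the planar maximum); the abstract's improved indicators additionally weight in bus operational characteristics, which are not reproduced here. A minimal sketch of the classical baselines:

```python
# Hedged sketch: two classical graph connectivity indicators for a
# transit network; the paper's operationally-weighted variants differ.
def beta_index(num_nodes, num_edges):
    """Edges per node; values above 1 indicate cycles / redundancy."""
    return num_edges / num_nodes

def gamma_index(num_nodes, num_edges):
    """Observed edges over the planar-graph maximum 3(v - 2); 1.0 = maximal."""
    return num_edges / (3 * (num_nodes - 2))

# Toy zonal sub-network: 4 stops fully interconnected (6 edges).
b = beta_index(4, 6)
g = gamma_index(4, 6)
```

Computing these per traffic analysis zone, as the proposed disaggregation suggests, yields a connectivity profile across the agency's coverage area rather than a single network-wide number.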


2021 ◽  
Vol 33 (5) ◽  
pp. 83-104
Author(s):  
Aleksandr Igorevich Getman ◽  
Maxim Nikolaevich Goryunov ◽  
Andrey Georgievich Matskevich ◽  
Dmitry Aleksandrovich Rybolovlev

The paper discusses the issues of training models for detecting computer attacks using machine learning methods. The results of an analysis of publicly available training datasets and of tools for analyzing network traffic and extracting features of network sessions are presented in sequence. The drawbacks of existing tools, and possible errors in the datasets formed with their help, are noted. The authors conclude that it is necessary to collect one's own training data, given the lack of guarantees of public datasets' reliability and the limited applicability of pre-trained models in networks whose characteristics differ from those of the network in which the training traffic was collected. A practical approach to generating training data for computer attack detection models is proposed. The proposed solutions have been tested to evaluate both the quality of model training on the collected data and the quality of attack detection under real network infrastructure conditions.
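The session-feature extraction the paper surveys typically aggregates per-packet observations into per-session statistics. This sketch shows a few common aggregate features (duration, byte counts, mean packet size); the exact feature set used by the authors' tooling is not specified in the abstract:

```python
# Hedged sketch: aggregating packets of one network session into
# simple statistical features usable for attack-detection training.
def session_features(packets):
    """packets: list of (timestamp_seconds, num_bytes) for one session."""
    times = [t for t, _ in packets]
    sizes = [s for _, s in packets]
    return {
        "duration": max(times) - min(times),
        "total_bytes": sum(sizes),
        "packet_count": len(sizes),
        "mean_size": sum(sizes) / len(sizes),
    }

f = session_features([(0.0, 100), (1.0, 200), (3.0, 300)])
```

Collecting such features from one's own traffic, as the paper recommends, ensures the training distribution matches the network where the detector will actually run.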


2021 ◽  
Vol 7 (1) ◽  
pp. 122-131
Author(s):  
A. Spirkina

Over the past decade, alongside multiservice networks, blockchain technology has undergone significant development owing to its ability to organize safe, integral, and reliable exchange and storage of information. The technology's popularity raises the problem of the load its data transmission places on operators' networks. A key task is therefore to assess the effect of this technology on network characteristics, in order to predict traffic behavior, ensure the required service-quality indicators, and maintain the stability of public communication network elements while distributed ledger technology is operating. However, analyzing this influence through a full-scale experiment is labor-intensive and not always feasible; therefore, in this article, the authors propose approaches to structural-parametric modeling of these systems.

