Network Architecture of Large Scale Enterprises

2017 ◽  
Author(s):  
Masoud Karimi

In this paper we summarize the TCP/IP protocol suite and each layer of the model, and discuss how data communications and computer networks affect business functions. We examine Walmart as an example of a large business and discuss its logical and physical network connections. We also address security issues, which are among the biggest concerns for all large-scale organizations.
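As a quick reference for the layer discussion above, the four-layer TCP/IP model can be sketched as a simple mapping; the protocol examples here are illustrative, not exhaustive:

```python
# Minimal illustration of the TCP/IP protocol suite layers.
# Layer names follow the common four-layer model; the example
# protocols listed under each layer are illustrative only.
TCP_IP_LAYERS = {
    "Application": ["HTTP", "SMTP", "DNS"],
    "Transport": ["TCP", "UDP"],
    "Internet": ["IP", "ICMP"],
    "Link": ["Ethernet", "Wi-Fi"],
}

def layer_of(protocol: str) -> str:
    """Return the TCP/IP layer a protocol belongs to in this mapping."""
    for layer, protocols in TCP_IP_LAYERS.items():
        if protocol in protocols:
            return layer
    raise KeyError(protocol)
```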

2020 ◽  
Vol 2020 (10) ◽  
pp. 181-1-181-7
Author(s):  
Takahiro Kudo ◽  
Takanori Fujisawa ◽  
Takuro Yamaguchi ◽  
Masaaki Ikehara

Image deconvolution has recently become an important problem. It has two kinds of approaches: non-blind and blind. Non-blind deconvolution is a classic image-deblurring problem that assumes the point spread function (PSF) is known and spatially invariant. Recently, Convolutional Neural Networks (CNNs) have been used for non-blind deconvolution. Although CNNs can deal with complex changes in unknown images, some conventional CNN-based methods can only handle small PSFs and do not consider the large PSFs encountered in the real world. In this paper we propose a non-blind deconvolution framework based on a CNN that can remove large-scale ringing in a deblurred image. Our method has three key points. The first is that our network architecture preserves both large and small features in the image. The second is that the training dataset is created to preserve details. The third is that we extend the images to minimize the effect of large ringing at the image borders. In our experiments, we used three kinds of large PSFs and observed high-precision results from our method both quantitatively and qualitatively.
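The border-extension idea (the third key point) can be illustrated with a minimal replicate-padding sketch; this is a generic stand-in under my own assumptions, not the authors' exact extension scheme:

```python
def extend_image(img, pad):
    """Replicate-pad a 2D image (list of lists) by `pad` pixels per side.

    Extending the image beyond its borders before deconvolution is one
    common way to reduce ringing artifacts near the edges. This simple
    edge-replication is a stand-in for the paper's extension step.
    """
    h, w = len(img), len(img[0])
    rows = []
    for r in range(-pad, h + pad):
        # Clamp row/column indices into the valid range, so border
        # pixels are repeated outward.
        src = img[min(max(r, 0), h - 1)]
        row = [src[min(max(c, 0), w - 1)] for c in range(-pad, w + pad)]
        rows.append(row)
    return rows
```

After deconvolution, the padded margin would be cropped away, so any ringing concentrated there never reaches the final image.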


2021 ◽  
Vol 13 (9) ◽  
pp. 5108
Author(s):  
Navin Ranjan ◽  
Sovit Bhandari ◽  
Pervez Khan ◽  
Youn-Sik Hong ◽  
Hoon Kim

The transportation system, especially the road network, is the backbone of any modern economy. However, with rapid urbanization, the congestion level has surged drastically, directly affecting the quality of urban life, the environment, and the economy. In this paper, we propose (i) an inexpensive and efficient Traffic Congestion Pattern Analysis algorithm, based on image processing, which identifies the group of roads in a network that suffer from recurring congestion; and (ii) a deep neural network architecture, formed from a convolutional autoencoder, which learns both spatial and temporal relationships from a sequence of image data to predict the city-wide grid congestion index. Our experiments show that both algorithms are efficient: the pattern analysis relies only on basic arithmetic operations, while the prediction algorithm outperforms two other deep neural networks (Convolutional Recurrent Autoencoder and ConvLSTM) in large-scale traffic network prediction. A case study was conducted on a dataset from the city of Seoul.
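The arithmetic-only flavor of the pattern analysis can be sketched as follows: average each grid cell's congestion index over a sequence of snapshots and flag cells that stay above a threshold. This is a minimal illustration of threshold-based recurring-congestion detection, not the paper's actual algorithm:

```python
def recurring_congestion(snapshots, threshold):
    """Flag grid cells whose mean congestion over time exceeds `threshold`.

    `snapshots` is a list of 2D grids (lists of lists) of congestion
    indices. Only sums, divisions, and comparisons are used, mirroring
    the claim that pattern analysis needs only basic arithmetic.
    """
    h, w = len(snapshots[0]), len(snapshots[0][0])
    flags = [[False] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            mean = sum(s[i][j] for s in snapshots) / len(snapshots)
            flags[i][j] = mean > threshold
    return flags
```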


2020 ◽  
pp. 1-26
Author(s):  
Qinwen Hu ◽  
Muhammad Rizwan Asghar ◽  
Nevil Brownlee

HTTPS refers to an application-specific implementation that runs HyperText Transfer Protocol (HTTP) on top of Secure Socket Layer (SSL) or Transport Layer Security (TLS). HTTPS is used to provide encrypted communication and secure identification of web servers and clients, for purposes such as online banking and e-commerce. However, many HTTPS vulnerabilities have been disclosed in recent years. Although many studies have pointed out that these vulnerabilities can lead to serious consequences, domain administrators seem to ignore them. In this study, we evaluate the HTTPS security level of Alexa's top 1 million domains from two perspectives. First, we explore which popular sites are still affected by well-known security issues. Our results show that less than 0.1% of HTTPS-enabled servers in the measured domains are still vulnerable to known attacks, including Rivest Cipher 4 (RC4), Compression Ratio Info-Leak Mass Exploitation (CRIME), Padding Oracle On Downgraded Legacy Encryption (POODLE), Factoring RSA Export Keys (FREAK), Logjam, and Decrypting Rivest–Shamir–Adleman (RSA) using Obsolete and Weakened eNcryption (DROWN). Second, we assess the security level of the digital certificates used by each measured HTTPS domain. Our results highlight that fewer than 0.52% of domains use an expired certificate, 0.42% of HTTPS certificates contain mismatched hostnames, and 2.59% of HTTPS domains use a self-signed certificate. The domains we investigate cover the five RIR regions (ARIN, RIPE NCC, APNIC, LACNIC, and AFRINIC) and 61 different categories, such as online shopping, banking, educational, and government websites. Although our results show that the problem still exists, we find that changes take place when HTTPS vulnerabilities are discovered. Through this three-year study, we found that more attention has been paid to the use and configuration of HTTPS.
For example, more and more domains have enabled the HTTPS protocol to ensure a secure communication channel between users and websites. In our first measurement, we observed that many domains were still using the TLS 1.0, TLS 1.1, SSL 2.0, and SSL 3.0 protocols to support clients running outdated systems. As previous studies revealed the security risks of these protocols, in subsequent measurements we found that the majority of domains updated their TLS protocol in time. Our 2020 results suggest that most HTTPS domains use the TLS 1.2 protocol, but also show that some HTTPS domains remain vulnerable to known attacks. As academics and industry professionals continue to disclose attacks against HTTPS and recommend secure HTTPS configurations, we found that the number of vulnerable domains is gradually decreasing every year.
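On the client side, refusing the deprecated protocols the study flags (SSL 2.0/3.0, TLS 1.0/1.1) takes one setting with Python's standard `ssl` module; this is a generic configuration sketch, not part of the study's measurement tooling:

```python
import ssl

# Create a client context that refuses SSL 2.0/3.0 and TLS 1.0/1.1,
# so a handshake succeeds only with TLS 1.2 or later. A connection to
# a server offering only the older protocols fails instead of silently
# downgrading.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2
```

Servers can apply the equivalent restriction in their TLS termination configuration, which is what the year-over-year protocol upgrades described above amount to.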


2021 ◽  
Vol 51 (3) ◽  
pp. 9-16
Author(s):  
José Suárez-Varela ◽  
Miquel Ferriol-Galmés ◽  
Albert López ◽  
Paul Almasan ◽  
Guillermo Bernárdez ◽  
...  

During the last decade, Machine Learning (ML) has increasingly become a hot topic in the field of computer networks and is expected to be gradually adopted for a plethora of control, monitoring, and management tasks in real-world deployments. This creates the need for new generations of students, researchers, and practitioners with a solid background in ML applied to networks. In 2020, the International Telecommunication Union (ITU) organized the "ITU AI/ML in 5G Challenge", an open global competition that introduced a broad audience to some of the main current challenges in ML for networks. This large-scale initiative gathered 23 different challenges proposed by network operators, equipment manufacturers, and academia, and attracted a total of 1300+ participants from 60+ countries. This paper narrates our experience organizing one of the proposed challenges: the "Graph Neural Networking Challenge 2020". We describe the problem presented to participants, the tools and resources provided, some organizational aspects and participation statistics, an outline of the top-3 awarded solutions, and a summary of lessons learned along the way. As a result, this challenge leaves a curated set of educational resources openly available to anyone interested in the topic.


2021 ◽  
Vol 40 (3) ◽  
pp. 1-13
Author(s):  
Lumin Yang ◽  
Jiajie Zhuang ◽  
Hongbo Fu ◽  
Xiangzhi Wei ◽  
Kun Zhou ◽  
...  

We introduce SketchGNN, a convolutional graph neural network for semantic segmentation and labeling of freehand vector sketches. We treat an input stroke-based sketch as a graph, with nodes representing the points sampled along the input strokes and edges encoding the stroke-structure information. To predict per-node labels, SketchGNN uses graph convolution and a static-dynamic branching network architecture to extract features at three levels: point-level, stroke-level, and sketch-level. SketchGNN significantly improves on the accuracy of state-of-the-art methods for semantic sketch segmentation (by 11.2% in the pixel-based metric and 18.2% in the component-based metric on the large-scale, challenging SPG dataset) and has orders of magnitude fewer parameters than both image-based and sequence-based methods.
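The stroke-to-graph construction can be sketched as follows: nodes are the sampled points with global indices, and edges connect consecutive points within each stroke. This covers only the static stroke-structure edges; SketchGNN's dynamic branch builds additional edges from learned features, which is omitted here:

```python
def sketch_to_graph(strokes):
    """Build (nodes, edges) from a stroke-based sketch.

    Each stroke is a list of sampled (x, y) points. Nodes receive
    global indices in stroke order; edges link consecutive points
    within the same stroke, encoding the stroke structure.
    """
    nodes, edges = [], []
    for stroke in strokes:
        start = len(nodes)          # global index of this stroke's first point
        nodes.extend(stroke)
        edges.extend((start + i, start + i + 1)
                     for i in range(len(stroke) - 1))
    return nodes, edges
```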


Sensors ◽  
2021 ◽  
Vol 21 (8) ◽  
pp. 2852
Author(s):  
Parvathaneni Naga Srinivasu ◽  
Jalluri Gnana SivaSai ◽  
Muhammad Fazal Ijaz ◽  
Akash Kumar Bhoi ◽  
Wonjoon Kim ◽  
...  

Deep learning models are efficient at learning the features that assist in understanding complex patterns precisely. This study proposes a computerized process for classifying skin disease using deep-learning-based MobileNet V2 and Long Short-Term Memory (LSTM). The MobileNet V2 model proves efficient, with good accuracy, and can run on lightweight computational devices. The proposed model is also efficient at maintaining stateful information for precise predictions. A grey-level co-occurrence matrix is used for assessing the progress of diseased growth. The performance is compared against other state-of-the-art models, such as Fine-Tuned Neural Networks (FTNN), Convolutional Neural Networks (CNN), Very Deep Convolutional Networks for Large-Scale Image Recognition developed by the Visual Geometry Group (VGG), and a convolutional neural network architecture extended with a few changes. On the HAM10000 dataset, the proposed method outperforms the other methods with more than 85% accuracy. It recognizes the affected region much faster, with almost 2× fewer computations than the conventional MobileNet model, resulting in minimal computational effort. Furthermore, a mobile application is designed for instant and proper action; it helps patients and dermatologists identify the type of disease from an image of the affected region at the initial stage of the skin disease. These findings suggest that the proposed system can help general practitioners diagnose skin conditions efficiently and effectively, thereby reducing further complications and morbidity.
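The grey-level co-occurrence matrix mentioned above can be computed with a short sketch. This minimal version uses a horizontal offset of one pixel (a common default) and skips normalization and other angles; the study may use different parameters:

```python
def glcm(image, levels):
    """Grey-level co-occurrence matrix for a horizontal (0, 1) offset.

    `image` is a 2D list of integer grey levels in [0, levels).
    Entry m[i][j] counts how often level i appears immediately to the
    left of level j. Normalization and other offsets are omitted.
    """
    m = [[0] * levels for _ in range(levels)]
    for row in image:
        for a, b in zip(row, row[1:]):  # horizontally adjacent pairs
            m[a][b] += 1
    return m
```

Texture statistics derived from this matrix (contrast, homogeneity, energy) are what make it useful for tracking changes in a diseased region over time.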


2020 ◽  
Vol 34 (07) ◽  
pp. 11693-11700 ◽  
Author(s):  
Ao Luo ◽  
Fan Yang ◽  
Xin Li ◽  
Dong Nie ◽  
Zhicheng Jiao ◽  
...  

Crowd counting is an important yet challenging task due to large variations in scale and density. Recent investigations have shown that distilling the rich relations among multi-scale features, and exploiting useful information from the auxiliary task, i.e., localization, are vital for this task. Nevertheless, how to comprehensively leverage these relations within a unified network architecture remains a challenging problem. In this paper, we present a novel network structure called Hybrid Graph Neural Network (HyGnn), which addresses this problem by interweaving the multi-scale features for crowd density and its auxiliary task (localization) and performing joint reasoning over a graph. Specifically, HyGnn integrates a hybrid graph that jointly represents the task-specific feature maps of different scales as nodes, with two types of relations as edges: (i) multi-scale relations capturing feature dependencies across scales, and (ii) mutually beneficial relations building bridges for cooperation between counting and localization. Thus, through message passing, HyGnn can capture and distill richer relations between nodes to obtain more powerful representations, providing robust and accurate results. HyGnn performs remarkably well on four challenging datasets: ShanghaiTech Part A, ShanghaiTech Part B, UCF_CC_50, and UCF_QNRF, outperforming state-of-the-art algorithms by a large margin.
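The message-passing mechanism the abstract relies on can be illustrated generically: each node updates its value by aggregating over its neighbors. HyGnn's actual update rule is learned and operates on feature maps; this is only a toy mean-aggregation round:

```python
def message_pass(features, edges):
    """One round of mean-aggregation message passing.

    `features` maps node id -> scalar feature; `edges` is a list of
    undirected (u, v) pairs. Each node's new value is the mean of its
    own value and those of its neighbors, illustrating how information
    flows between graph nodes.
    """
    neighbors = {n: [] for n in features}
    for u, v in edges:
        neighbors[u].append(v)
        neighbors[v].append(u)
    return {
        n: (features[n] + sum(features[m] for m in neighbors[n]))
           / (1 + len(neighbors[n]))
        for n in features
    }
```

In HyGnn, repeated rounds like this let density nodes at one scale absorb evidence from other scales and from the localization nodes.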


2021 ◽  
Vol 2021 ◽  
pp. 1-18
Author(s):  
Muhammad Muzamil Aslam ◽  
Liping Du ◽  
Xiaoyan Zhang ◽  
Yueyun Chen ◽  
Zahoor Ahmed ◽  
...  

Recently, 5G deployment has begun globally, and capabilities such as ultra-reliability, massive connectivity, and very low latency are under continuous development. However, 5G will be insufficient to meet all the requirements of future technology in 2030 and beyond. Next-generation information and communication technology is attracting researchers, industry, and technical practitioners. Compared with 5G networks, sixth-generation (6G) CR networks are expected to introduce new use cases and performance metrics, such as worldwide coverage, cost efficiency, enhanced spectral and energy efficiency, improved intelligence, and safety. To meet such requirements, upcoming 6G CRNs will rely on novel enabling technologies. Innovative network architectures, transmission technologies, and air interfaces are of particular importance, including multiple access, waveform design, multi-antenna technologies, and channel coding schemes. (1) To satisfy the requirement of worldwide coverage, 6G CR communication networks will not be limited to terrestrial networks; they will need to be integrated with broadcast and satellite communication networks, thereby achieving a sea-integrated communication network. (2) The spectrum will be fully explored to further increase connection density and data rates in the sub-6 GHz, millimeter wave (mmWave), terahertz (THz), and optical frequency bands. (3) To handle the big datasets created by extremely heterogeneous CR communication networks, antenna proliferation, diverse communication scenarios, new service requirements, and wide bandwidths, 6G CRNs will enable an innovative range of intelligent applications with the help of big data and AI technologies. (4) Network security needs to be improved when deploying 6G technology in CR networks. 6G is intended to be a decentralized, intelligent, and distributed network.
In this article, we survey current developments and upcoming trends, covering the predicted applications, candidate technologies, and security issues for 6G CR network communication. We also discuss the key challenges anticipated in 6G.


Author(s):  
Mykola Ryzhkov ◽  
Anastasiia Siabro

Achievements in the spheres of automation and telecommunications are an essential component of the transformation of the international peace and security system. This article shows that the consequences of these changes are dual in character. On the one hand, new technologies are becoming an important component of society-modernization strategies in developing countries; on the other hand, they can be used for armament modernization or the creation of new means of confrontation in modern international relations. APR countries face the most pressing issues of information technology usage. The article examines the process of discussing the new challenges and threats to international security that emerge from the development and large-scale implementation of information and communication technologies. The positions of states regarding the adoption of a resolution on international information security were studied through the examples of Japan, India, and China. The article argues that information technologies have become an important component of the world security system. Technology usage may lead to steady international development as well as to an information arms race, which is why working out a common position on international information security issues is of crucial importance. It is within the framework of the UN that the states of the world are given an opportunity to express their visions of the problem of international information security and to work out common approaches to its solution. The article shows that states' positions have similar as well as different features. For instance, all states express concern regarding possible limitations on technology transfer aimed at establishing a more controlled international political environment, but their positions differ substantially on the mechanisms for providing information security.
Thus, Japan and India strive for a balanced system of international information security, one that includes preventive mechanisms against emerging threats in the information and science-and-technology spheres while guaranteeing continued scientific and technological development, a crucial component of development and modernization strategies in many countries of the world. China advanced a position of strong regulation of international information security issues and proposed framing corresponding rules for states' conduct in cyberspace.


2018 ◽  
Vol 7 (2.20) ◽  
pp. 254
Author(s):  
M Dhasaratham ◽  
R P. Singh

Numerous parties expect customers to share private information, such as electronic health records, for data analysis or mining, raising privacy concerns. Anonymizing data sets via generalization to satisfy certain privacy requirements, such as k-anonymity, is a widely used category of privacy-preserving techniques. At present, the scale of data in many cloud applications is increasing tremendously in line with the Big Data trend, making it a challenge for commonly used software tools to capture, manage, and process such large-scale data within a tolerable elapsed time. It is therefore a challenge for existing anonymization approaches to achieve privacy preservation on privacy-sensitive, large-scale data sets, due to their insufficient scalability. In this paper, we propose a scalable two-phase top-down specialization (TDS) approach to anonymize large-scale data sets using the MapReduce framework on the cloud. In both phases of our approach, we deliberately design a group of innovative MapReduce jobs to accomplish the specialization computation in a highly scalable way. Experimental evaluation results demonstrate that with our approach, the scalability and efficiency of TDS can be significantly improved over existing approaches.
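The generalization lattice that TDS traverses can be illustrated with a toy sketch: a taxonomy maps specific values to more general ones, and a k-anonymity check verifies that every quasi-identifier combination occurs at least k times. TDS itself works top-down from the most general values and runs as MapReduce jobs; this single-machine bottom-up example only shows the underlying generalization and check, with a hypothetical age taxonomy:

```python
from collections import Counter

def is_k_anonymous(records, k):
    """Check k-anonymity: every quasi-identifier tuple occurs >= k times."""
    counts = Counter(tuple(r) for r in records)
    return all(c >= k for c in counts.values())

def generalize(records, column, taxonomy):
    """Replace each value in `column` by its parent in a taxonomy dict.

    Values without a taxonomy entry are left unchanged.
    """
    return [
        [taxonomy.get(v, v) if i == column else v for i, v in enumerate(r)]
        for r in records
    ]

# Hypothetical example: exact ages generalized to a decade range.
records = [["31", "engineer"], ["32", "engineer"], ["33", "lawyer"]]
age_taxonomy = {"31": "30-39", "32": "30-39", "33": "30-39"}
generalized = generalize(records, 0, age_taxonomy)
```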

