Performance analysis of security framework for software defined network architectures

Author(s):  
K. A. Varun Kumar ◽  
D. Arivudainambi

Software-defined data centers (SDDC) and software-defined networking (SDN) are two emerging areas in the field of cloud data centers. SDN-based, centrally controlled services take a global view of the entire cloud infrastructure spanning SDDC and SDN, whereas Network Function Virtualization (NFV) is widely used to provide virtual networking between hosts and Internet Service Providers (ISPs). Application-as-a-Service offerings in NFV data centers cover a wide range of security services, including virtual firewalls, Intrusion Detection Systems (IDS), load balancing, and bandwidth allocation and management. In this paper, a novel security framework combining SDDC and SDN with NFV security features is proposed. The proposed framework consists of a virtual firewall and an efficient bandwidth manager that handle multiple heterogeneous application requests from different ISPs. Real-time data were collected from a week-long experiment, and a simulation-based proof of concept validates the proposed framework, which was deployed on real SDNs using Mininet and the POX controller.
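The abstract names a virtual firewall and a bandwidth manager but gives no internals. As a purely illustrative sketch of how such a framework might arbitrate heterogeneous ISP requests (all class and rule names here are hypothetical, not from the paper), one could pair a default-deny rule table with per-ISP bandwidth quotas:

```python
from dataclasses import dataclass

@dataclass
class Rule:
    src_prefix: str   # source IP prefix, e.g. "10.0.1."
    dport: int        # destination port
    allow: bool

class VirtualFirewall:
    """First-match rule table with a default-deny policy."""
    def __init__(self, rules):
        self.rules = rules

    def permits(self, src_ip, dport):
        for r in self.rules:
            if src_ip.startswith(r.src_prefix) and dport == r.dport:
                return r.allow
        return False  # nothing matched: deny

class BandwidthManager:
    """Grants requests until an ISP's capacity quota (Mbps) is exhausted."""
    def __init__(self, quotas_mbps):
        self.remaining = dict(quotas_mbps)

    def grant(self, isp, mbps):
        if self.remaining.get(isp, 0) >= mbps:
            self.remaining[isp] -= mbps
            return True
        return False
```

In a real SDN deployment these decisions would be pushed to switches as flow rules via the POX controller rather than evaluated per packet in Python.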

Author(s):  
Jānis Kampars ◽  
Krišjānis Pinka

For customers of cloud-computing platforms it is important to minimize the infrastructure footprint and associated costs while providing the levels of Quality of Service (QoS) and Quality of Experience (QoE) dictated by the Service Level Agreement (SLA). To assist with that, cloud service providers offer (1) horizontal resource scaling through provisioning and destruction of virtual machines and containers, and (2) vertical scaling through changing the capacity of individual cloud nodes. Existing scaling solutions mostly concentrate on low-level metrics like CPU load and memory consumption, which do not always correlate with the level of SLA conformity. Such technical measures should be preprocessed and viewed from a higher level of abstraction, and application-level metrics should also be considered when deciding whether to scale a cloud-based solution. Existing scaling platforms are mostly proprietary technologies owned by cloud service providers themselves or by third parties and offered as Software as a Service. Enterprise applications can span the infrastructures of multiple public and private clouds, dictating that an auto-scaling solution should not be isolated inside a single cloud infrastructure. The goal of this paper is to address these challenges by presenting the architecture of the Auto-scaling and Adjustment Platform for Cloud-based Systems (ASAPCS). It is based on open-source technologies and supports the integration of various low- and high-level performance metrics, providing higher levels of abstraction for the design of scaling algorithms. ASAPCS can be used with any cloud service provider and guarantees that moving from one cloud platform to another will not require a complete redesign of the scaling algorithm. ASAPCS itself is horizontally scalable and can process large amounts of real-time data, which is particularly important for applications developed in the microservices architectural style.
ASAPCS approaches the scaling problem in a nonstandard way by considering real-time adjustments of the application logic to be part of the scaling strategy when they can yield performance improvements.
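The abstract's core argument, that an SLA-level metric must be combined with low-level resource metrics, can be sketched as a single scaling decision function. This is an illustrative toy, not ASAPCS's actual algorithm; the thresholds and parameter names are our assumptions:

```python
def scale_decision(cpu_load, p95_latency_ms, sla_latency_ms, replicas,
                   min_replicas=1, max_replicas=20):
    """Combine a low-level metric (CPU load, 0..1) with an application-level
    SLA metric (95th-percentile latency) into one replica-count decision.
    Either signal breaching its threshold triggers scale-out; scale-in
    requires BOTH to show ample headroom, to avoid oscillation."""
    if p95_latency_ms > sla_latency_ms or cpu_load > 0.8:
        return min(replicas + 1, max_replicas)   # scale out
    if p95_latency_ms < 0.5 * sla_latency_ms and cpu_load < 0.3:
        return max(replicas - 1, min_replicas)   # scale in
    return replicas                              # hold steady
```

A CPU-only policy would miss the first branch's latency case entirely, which is exactly the gap the paper attributes to existing low-level-metric scalers.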


Author(s):  
Olexander Melnikov ◽  
Konstantin Petrov ◽  
Igor Kobzev ◽  
Viktor Kosenko ◽  
...  

The article considers the development and implementation of cloud services in the work of government agencies. A classification of criteria for choosing cloud service providers is offered, which can serve as a basis for decision making, and the basics of cloud-computing technology are analyzed. The COVID-19 pandemic has highlighted the benefits of cloud services for remote work, and government agencies at all levels need to move to cloud infrastructure. The article analyzes the prospects of cloud computing in Ukraine as the developing basis of e-governance, which is necessary for the rapid provision of quality services on a flexible, scalable and economical technological base. Moving electronic information interaction to the cloud makes it possible to reach a wide range of users at relatively low material cost. Automating processes and transferring them to the cloud environment speeds up service delivery and lets citizens obtain information with minimal delay. The article also lists the risks that exist in the transition to cloud services and the shortcomings that may arise in the course of using them.


Author(s):  
Nicole Gailey ◽  
Noman Rasool

Canada and the United States have vast energy resources, supported by thousands of kilometers (miles) of pipeline infrastructure built and maintained each year. Whether a pipeline runs through remote territory or passes through local city centers, keeping commodities flowing safely is a critical part of day-to-day operation. Real-time leak detection has become a critical capability that companies require in order to provide safe operations, protect the environment and comply with regulations. The function of a leak detection system is to identify and confirm a leak event in a timely and precise manner. Flow measurement devices are a critical input into many leak detection systems, and to ensure flow measurement accuracy, custody-transfer-grade liquid ultrasonic meters (as defined in API MPMS Chapter 5.8) can be utilized to provide superior accuracy, performance and diagnostics. This paper presents a sample of real-time data collected from a field install base of over 245 custody-transfer-grade liquid ultrasonic meters currently being utilized in pipeline leak detection applications. The data help to identify upstream instrumentation anomalies and illustrate how diagnostics within the liquid ultrasonic meters can further improve current real-time transient model (RTTM) leak detection and pipeline operational procedures. The paper discusses considerations addressed while evaluating the data and understanding the importance of accuracy in the metering equipment utilized. It also elaborates on the significant benefits of the ultrasonic meters' diagnostic capabilities and their importance in diagnosing other pipeline issues and uncertainties beyond measurement errors.


2021 ◽  
Author(s):  
Saif Al Arfi ◽  
Fatima AlSowaidi ◽  
Fernando Ruiz ◽  
Ibrahim Hamdy ◽  
Yousef Tobji ◽  
...  

Abstract To meet current oil and gas market challenges, the industry needs to optimize cost by safely drilling longer horizontal wells to maximize well productivity. Drilling challenges include the highly deviated trajectory that starts from the surface sections and wellhead, the high Dogleg Severity (DLS) profile with collision risks, and the thin complex geological structures, especially in new unconventional fields where numerous geological and geomechanical uncertainties are present. Mitigating those challenges requires reviewing existing drilling techniques and technologies. To compete in the current high-tech and automation era, the main challenges for directional drilling service providers are to reduce well time, place wells accurately, and improve reliability, reducing repair and maintenance costs and helping the customer reduce time and costs for the overall project. Offset-well analysis and risk assessments allowed the main challenges and problems during the directional drilling phases to be identified, highlighted and summarized. As a proposed solution, a new generation of intelligent, fully rotating, high-dogleg push-the-bit rotary steerable system (RSS) has been implemented in UAE onshore oil and gas fields to improve directional drilling control and performance. This implementation reduced Non-Productive Time (NPT) related to human error, as full automation capabilities were utilized. The new rotary steerable system has the highest mechanical specifications in the market, including self-diagnosis and self-prognosis through digital electronics and sophisticated algorithms that monitor equipment health in real time and allow the tool to be managed remotely. As a result, the new intelligent RSS was implemented in all possible complex wellbore conditions, such as wells with a high-DLS profile, drilling the vertical, curve, and lateral sections in a single trip with high mud weight and high solids content.
Automation cruise control made it possible to eliminate well-profile issues and maintain aggressive drilling parameters. Precise near-bit inclination and azimuth, at-bit gamma real-time data, and high-frequency toolface measurements were used in the landing intervals where precise positional control was required, enabling entry into the reservoir at the correct location and with the correct attitude, and helping the customer's Geology and Geophysics department place wells accurately while maintaining a high on-bottom ROP.


2016 ◽  
Vol 10 (4) ◽  
pp. 1-32 ◽  
Author(s):  
Abdelaziz Amara Korba ◽  
Mehdi Nafaa ◽  
Salim Ghanemi

In this paper, a cluster-based hybrid security framework for ad hoc networks, called HSFA, is proposed and evaluated. The proposed security framework combines both specification-based and anomaly-based detection techniques to efficiently detect and prevent a wide range of routing attacks. In the proposed hierarchical architecture, cluster nodes run a host specification-based intrusion detection system to detect specification-violation attacks such as fabrication and replay, while the cluster heads run an anomaly-based intrusion detection system to detect wormhole and rushing attacks. The specification-based detection approach relies on a set of automatically generated specifications, while the anomaly detection uses statistical techniques. The proposed security framework provides an adaptive response to attacks to prevent damage to the network. The security framework is evaluated by simulation in the presence of malicious nodes that can launch different attacks. Simulation results show that the proposed hybrid security framework performs significantly better than other existing mechanisms.
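The two detector roles in the hierarchy can be sketched in miniature. This is an illustrative guess at the division of labor, not HSFA's actual rules or statistics; packet fields, message types and the z-score test are our assumptions:

```python
import statistics

def spec_violation(packet, allowed_types=frozenset({"RREQ", "RREP", "RERR"})):
    """Specification check (run on cluster members): flag control packets
    whose type or hop count violates the routing-protocol specification."""
    return packet["type"] not in allowed_types or packet["hops"] < 0

def anomaly_score(rtt_history_ms, new_rtt_ms):
    """Anomaly check (run on cluster heads): z-score of a new round-trip
    time against recent history. A wormhole tunnel typically shortens the
    apparent RTT, producing a large deviation from the baseline."""
    mu = statistics.mean(rtt_history_ms)
    sd = statistics.pstdev(rtt_history_ms) or 1e-9  # avoid divide-by-zero
    return abs(new_rtt_ms - mu) / sd
```

A cluster head would raise an alert (and trigger the adaptive response) when the score exceeds some threshold, e.g. 3 standard deviations.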


2018 ◽  
Vol 173 ◽  
pp. 02029
Author(s):  
XU Jiahui ◽  
YU Hongyuan ◽  
WANG Gang ◽  
WANG Zi ◽  
BAI Jingjie ◽  
...  

The rapid development of mobile Internet technology and the wide spread of smart terminals have brought opportunities for transforming the power-grid business model. Compared with non-real-time information, the real-time operational data of the dispatch and control domain is easier to intercept and crack. To solve this problem, this paper presents a new mobile application security framework for the power-grid control field. It realizes secondary encryption using a mixed MD5+AES encryption algorithm combined with a timestamp during real-time data transmission, and it prevents unauthorized operations and brute-force attacks using token authentication and session technology. China EPRI safety test results show that applying the framework significantly improves the integrity, safety and reliability of real-time data in power-grid control.
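The abstract names the ingredients (an MD5 digest, AES, a timestamp) but not the exact construction. As an illustrative guess at the timestamp-bound digest layer only, with all function names hypothetical, one might bind the payload hash to a freshness window so replayed frames are rejected:

```python
import hashlib
import time

def make_payload_digest(data, ts=None):
    """Bind an MD5 digest of the payload to a timestamp so that a replayed
    frame (stale ts) or altered payload (wrong digest) is detectable.
    The second layer of the described scheme, AES encryption of payload
    plus digest, would need a crypto library such as pycryptodome and is
    omitted here to stay stdlib-only."""
    ts = ts if ts is not None else int(time.time())
    digest = hashlib.md5(data + str(ts).encode()).digest()
    return digest, ts

def verify(data, digest, ts, max_skew_s=30):
    """Reject stale timestamps (replay) and mismatched digests (tampering)."""
    fresh = abs(int(time.time()) - ts) <= max_skew_s
    return fresh and hashlib.md5(data + str(ts).encode()).digest() == digest
```

Note that MD5 alone is not collision-resistant; in this scheme it serves as an integrity tag inside an AES-encrypted envelope, which is presumably why the paper pairs the two.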


2020 ◽  
Vol 16 (1) ◽  
pp. 116-141
Author(s):  
Bertin Martens ◽  
Frank Mueller-Langer

Abstract Before the arrival of digital car data, car manufacturers had already partly foreclosed the maintenance market through franchising contracts with a network of exclusive official dealers. EU regulation endorsed this foreclosure but mandated access to maintenance data for independent service providers to keep competition in these markets. The arrival of digital car data upsets this balance because manufacturers can collect real-time maintenance data on their servers and send messages to drivers. These can be used to price discriminate and increase the market share of official dealers. There are at least four alternative technical gateways that could give independent service providers similar data access options. However, they suffer to varying degrees from data portability issues, switching costs, weak network effects, and insufficient economies of scale and scope in data analytics. Multisided third-party consumer media platforms appear better placed to overcome these economic hurdles, provided that an operational real-time data portability regime can be established.


2019 ◽  
Vol 11 (3) ◽  
pp. 69 ◽  
Author(s):  
Aris Leivadeas ◽  
George Kesidis ◽  
Mohamed Ibnkahla ◽  
Ioannis Lambadaris

Network Function Virtualization (NFV) has revolutionized the way network services are offered to end users. Individual network functions are decoupled from expensive, dedicated middleboxes and are now provided as software-based virtualized entities called Virtualized Network Functions (VNFs). NFV is often complemented by the Cloud Computing paradigm to provide networking functions to enterprise customers and end users remote from their premises. NFV together with Cloud Computing has also started to appear in Internet of Things (IoT) platforms as a means of providing networking functions for IoT traffic. The intermix of IoT, NFV, and Cloud technologies, however, is still in its infancy, creating a rich and open future research area. To this end, in this paper we propose a novel approach to facilitate the placement and deployment of service-chained VNFs in a network cloud infrastructure that can be extended with Mobile Edge Computing (MEC) infrastructure to accommodate mission-critical and delay-sensitive traffic. Our aim is to minimize the end-to-end communication delay while keeping the overall deployment cost to a minimum. Results reveal that the proposed approach can significantly reduce the delay experienced, while satisfying the Service Providers' goal of low deployment costs.
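The joint objective, minimize chain delay while capping deployment cost, can be illustrated with a deliberately naive greedy heuristic. The paper's actual placement method is not specified in the abstract; the data shapes and names below are our assumptions:

```python
def place_chain(chain, nodes, cost_budget):
    """Greedy sketch: for each VNF in the service chain, pick the node
    (edge/MEC or cloud) with the lowest added latency among those that
    still fit in the remaining cost budget.

    chain:  list of VNF names, in traversal order
    nodes:  dict name -> (latency_ms, cost) per hosted VNF
    Returns (placement, total_delay_ms, total_cost) or None if infeasible."""
    placement, total_cost, total_delay = {}, 0, 0
    for vnf in chain:
        feasible = (n for n, (lat, c) in nodes.items()
                    if total_cost + c <= cost_budget)
        best = min(feasible, key=lambda n: nodes[n][0], default=None)
        if best is None:
            return None  # budget exhausted before the chain is placed
        lat, c = nodes[best]
        placement[vnf] = best
        total_cost += c
        total_delay += lat
    return placement, total_delay, total_cost
```

A greedy pass like this ignores inter-node link delays and can be far from optimal; exact formulations of VNF placement are typically integer linear programs, which is why heuristics are an active research topic.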


2011 ◽  
Vol 418-420 ◽  
pp. 1988-1991
Author(s):  
Li Juan Zhao ◽  
Xiu Mei Lv ◽  
Wei Tong

This paper develops a roadheader vibration-characteristics test system based on the structural characteristics and working principle of the cantilevered roadheader. A piezoelectric acceleration sensor detects the vibration signal; after signal conditioning and A/D conversion, the signal is sent to a PC over a wireless link, where LabVIEW performs real-time data acquisition, time-frequency analysis and digital processing. Based on the system's test results, the roadheader's operating state can be effectively characterized, its vibration features identified, vibration sources located, and reasonable damping measures proposed, which helps keep the roadheader in its best running condition. The development of the system combines theory with simulation experiments, realizing real-time detection of roadheader vibration behavior, rational analysis of the vibration signals, and accurate processing of the results; it provides an important method for ensuring roadheader reliability.


2002 ◽  
Vol 36 (1) ◽  
pp. 29-38 ◽  
Author(s):  
Ray Berkelmans ◽  
Jim C. Hendee ◽  
Paul A. Marshall ◽  
Peter V. Ridd ◽  
Alan R. Orpin ◽  
...  

With recent technological advances and a reduction in the cost of automatic weather stations and data buoys, the potential exists for significant advancement in science and environmental management using near real-time, high-resolution data to predict biological and/or physical events. However, real-world examples of how this potential wealth of data has been used in environmental management are few and far between. We describe in detail two examples where near real-time data are being used for the benefit of science and management. These include the prediction of coral bleaching events using temperature, light and wind as primary predictor variables, and the management of a coastal development where dynamic discharge quality limits are maintained with the aid of wind data as a proxy for turbidity in receiving waters. We argue that the limiting factor for the use of near real-time environmental data in management is frequently not the availability of the data, but the lack of knowledge of the quantitative relationships between biological/physical processes or events and environmental variables. We advocate renewed research into this area and an integrated approach to the use of a wide range of data types to deal with management issues in an innovative, cost-effective manner.
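The bleaching-prediction example combines three predictor variables into an alert. A rule-of-thumb sketch of that combination follows; the thresholds and the scoring are entirely illustrative, not the relationships the authors derived:

```python
def bleaching_risk(sst_c, light_pct, wind_ms,
                   sst_threshold=30.0, light_threshold=80.0, wind_calm_ms=3.0):
    """Toy risk rule using the abstract's three predictors: high sea-surface
    temperature, high light, and calm wind (poor vertical mixing) each add
    one point. Thresholds are hypothetical placeholders, not from the paper."""
    score = 0
    if sst_c >= sst_threshold:
        score += 1
    if light_pct >= light_threshold:
        score += 1
    if wind_ms <= wind_calm_ms:
        score += 1
    return "high" if score == 3 else "moderate" if score == 2 else "low"
```

The authors' central point is precisely that such quantitative thresholds are the hard part: the sensors and telemetry exist, but validated relationships between environmental variables and biological events often do not.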

