The Architectural Dynamics of Encapsulated Botnet Detection (EDM)

Author(s):  
Maxwell Scale Uwadia Osagie ◽  
Amenze Joy Osagie

The botnet is one of the numerous attacks ravaging the networking environment. Its approach is brutal and dangerous to network infrastructure as well as to client systems. Since botnets first appeared, different design methods have been employed to counter their divergent approaches, yet the takeover of servers and client systems continues unabated. To address this, we first identify Mpack, ICEpack and Fiesta as enhanced IRC tools and analyse their role in data exchange using the OSI model. This analysis motivated the proposed high-level architecture, which represents both the structural mechanism and the defensive mechanism within a network server for controlling the botnet trend. Finally, the architecture was designed to respond proactively, scanning and synergizing the double data-verification modules in an encapsulated manner within the server system.

2009 ◽  
Author(s):  
K. A. McTaggart ◽  
R. G. Langlois

Replenishment at sea is essential for sustainment of naval operations away from home ports. This paper describes physics-based simulation of the transfer of solid payloads between two ships. For a given operational scenario, the simulation can determine whether events such as breakage of replenishment gear or immersion of payload in the ocean will occur. The simulation includes detailed modelling of the replenishment gear and ship motions. Distributed simulation using the High Level Architecture facilitates time management and data exchange among simulation components.
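The time management the simulation relies on can be illustrated with a toy lockstep coordinator: no component is granted an advance past the smallest time any component has requested, which is the conservative behaviour HLA time management provides. Everything below (`Federate`, `run_coordinator`, the ship and rig names) is an invented stand-in for illustration, not the HLA API:

```python
# Toy sketch of HLA-style conservative time advancement between two
# simulation components (a ship-motion model and a replenishment-gear
# model). Names and classes are illustrative, not the HLA interface.

class Federate:
    def __init__(self, name, step):
        self.name = name
        self.time = 0.0
        self.step = step          # requested internal step size
        self.inbox = {}           # attributes reflected from other federates

    def advance_to(self, t):
        # Integrate the internal model up to time t (stubbed here).
        self.time = t
        return {self.name + ".state": self.time}  # published attributes

def run_coordinator(federates, end_time):
    """Grant time advances in lockstep: no federate may pass the
    smallest requested time, then reflect all published updates."""
    t = 0.0
    while t < end_time:
        t = min(f.time + f.step for f in federates)   # next grant
        updates = {}
        for f in federates:
            updates.update(f.advance_to(t))
        for f in federates:                           # reflect updates
            f.inbox.update(updates)
    return t

ships = [Federate("ship_motion", 0.1), Federate("rig_model", 0.1)]
final = run_coordinator(ships, 1.0)
```

In a real federation the RTI performs the grants and the attribute reflection; the sketch only shows the lockstep loop that keeps the two models' clocks and exchanged data consistent.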


2015 ◽  
Vol 15 (5) ◽  
pp. 121-130
Author(s):  
Georgi Kirov

Abstract The study is dedicated to the High Level Architecture (HLA) standard for the software architecture of interoperable distributed simulations. The paper discusses the differences between object-oriented programming and HLA. It presents an extended simulation architecture that provides a mechanism for HLA data exchange through ordinary Object-Oriented (OO) objects, eliminating the complex network programming otherwise required for HLA distributed simulations. The paper also shows sample code that implements the architecture for an OO HLA/RTI simulation.
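The kind of OO wrapping described here can be sketched as follows: an attribute assignment on a shared object transparently publishes the update, so simulation code reads as plain object manipulation. `Bus`, `SharedObject`, and `Mirror` are invented stand-ins for RTI constructs, not HLA/RTI classes:

```python
# Illustrative sketch (not the HLA/RTI API): attribute writes on a
# shared object are forwarded to a data-exchange bus, hiding the
# publish/subscribe plumbing behind ordinary OO attribute access.

class Bus:
    """Stand-in for the RTI: fans attribute updates out to subscribers."""
    def __init__(self):
        self.subscribers = []
    def publish(self, obj_name, attr, value):
        for sub in self.subscribers:
            sub.reflect(obj_name, attr, value)

class SharedObject:
    _local = ("_bus", "_name")          # attributes kept off the bus
    def __init__(self, bus, name):
        object.__setattr__(self, "_bus", bus)
        object.__setattr__(self, "_name", name)
    def __setattr__(self, attr, value):
        object.__setattr__(self, attr, value)
        if attr not in self._local:
            self._bus.publish(self._name, attr, value)  # implicit publish

class Mirror:
    """Subscriber reflecting remote updates into a local dict."""
    def __init__(self, bus):
        self.state = {}
        bus.subscribers.append(self)
    def reflect(self, obj_name, attr, value):
        self.state[(obj_name, attr)] = value

bus = Bus()
mirror = Mirror(bus)
tank = SharedObject(bus, "Tank1")
tank.position = (10.0, 20.0)            # looks like plain OO code
```

The design point is that the simulation author writes `tank.position = ...` and never touches the bus directly, which is the separation the extended architecture aims at.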


2021 ◽  
Vol 7 ◽  
Author(s):  
Paul T. Grogan

Abstract This paper draws on perspectives from co-design as an integrative and collaborative design activity and co-simulation as a supporting information system to advance engineering design methods for problems of societal significance. Design and implementation of the Sustainable Infrastructure Planning Game provides a prototypical co-design artifact that leverages the High Level Architecture co-simulation standard. Three role players create a strategic infrastructure plan for the agricultural, water, and energy sectors to meet sustainability objectives for a growing and urbanizing population in a fictional desert nation. An observational study of 15 co-design sessions examines the underlying dynamics between actors and how co-simulation capabilities influence design outcomes. Results characterize the dependencies and conflicts between player roles based on the technical exchange of resource flows, identifying tension between the agriculture and water roles over water demands for irrigation. Analysis shows a correlation between data exchange, facilitated by synchronous co-simulation, and highly ranked achievement of joint sustainability outcomes. Conclusions reflect on the opportunities and challenges presented by co-simulation in co-design settings to address engineering systems problems.


2013 ◽  
Vol 462-463 ◽  
pp. 1140-1143
Author(s):  
Jian Bing Tang ◽  
Qi Gao Hu ◽  
Ya Bing Zha

The run-time infrastructure (RTI) is software developed according to the interface specification of the High Level Architecture (HLA); it offers a universal and comparatively independent service to simulation applications in an HLA simulation system. The RTI is the key underpinning software of the HLA simulation system, so its quality is very important and directly affects how the system runs. To guarantee the quality of RTI software, it should be tested and evaluated thoroughly, covering both functional and performance aspects: the former certifies the software's correctness, while the latter checks its efficiency. In this paper, three editions of RTI software are tested and evaluated. Both function and performance are tested, with the emphasis on performance. The results show that all three editions not only meet the software performance requirements but can also be used in generic real-time simulation. Moreover, in terms of performance, KD-RTI has several virtues, such as high data-exchange speed, low data loss, and low latency.
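The performance side of such an evaluation can be sketched as a simple harness that timestamps each message, counts arrivals, and averages latencies. The in-process `LoopbackTransport` below merely stands in for an RTI under test and drops messages deterministically to exercise the loss metric; the measurement logic is the point:

```python
# Hedged sketch of a data-exchange benchmark: send timestamped messages
# through a transport stub and record loss rate and average latency.
# The queue-based transport is a stand-in for a real RTI; names invented.
import time
from collections import deque

class LoopbackTransport:
    """Stand-in transport that drops every `drop_every`-th message."""
    def __init__(self, drop_every=0):
        self.queue = deque()
        self.drop_every = drop_every
        self.sent = 0
    def send(self, msg):
        self.sent += 1
        if self.drop_every and self.sent % self.drop_every == 0:
            return                      # simulated data loss
        self.queue.append(msg)
    def receive(self):
        return self.queue.popleft() if self.queue else None

def benchmark(transport, n_messages):
    latencies, received = [], 0
    for i in range(n_messages):
        transport.send((i, time.perf_counter()))
        msg = transport.receive()
        if msg is not None:
            received += 1
            latencies.append(time.perf_counter() - msg[1])
    loss_rate = 1.0 - received / n_messages
    avg_latency = sum(latencies) / len(latencies)
    return loss_rate, avg_latency

loss, latency = benchmark(LoopbackTransport(drop_every=10), 1000)
```

Against a real RTI the send and receive sides would run in separate federates with synchronized clocks, but the three figures of merit named in the abstract — exchange speed, loss, latency — are computed the same way.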


Information ◽  
2021 ◽  
Vol 12 (2) ◽  
pp. 71
Author(s):  
Simon Gorecki ◽  
Jalal Possik ◽  
Gregory Zacharewicz ◽  
Yves Ducq ◽  
Nicolas Perry

Nowadays, industries are implementing heterogeneous systems from different domains, backgrounds, and operating systems. Manufacturing systems are becoming more and more complex, forcing engineers to manage complexity in several aspects. Technical complexities bring interoperability, risk-management, and hazard issues that must be taken into consideration, from business model design to technical implementation. To resolve these complexities and the incompatibilities between heterogeneous components, several distributed and cosimulation standards and tools can be used for data exchange and interconnection. High-level architecture (HLA) and the functional mockup interface (FMI) are the main international standards used for distributed and cosimulation. HLA is mainly used in academic and defense domains, while FMI is mostly used in industry. In this article, we propose an HLA/FMI implementation with a connection to an external business process-modeling tool called Papyrus. Papyrus is configured as a master federate that orchestrates the subsimulations based on the above standards. The developed framework is integrated with external heterogeneous components through an FMI interface. It was developed with the aim of bringing interoperability to a system used in a power generation company.
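A fixed-step master algorithm of the kind FMI co-simulation standardizes, with the master in the orchestrating role this article assigns to Papyrus, can be sketched as follows. `FmuStub` and its trivial internal dynamics are invented for illustration and are not the real FMI API:

```python
# Sketch of an FMI-style co-simulation master: step each slave model
# over a fixed communication interval, then exchange outputs between
# steps. The slave interface below is a simplified stand-in for an FMU.

class FmuStub:
    """Minimal stand-in for an FMI co-simulation slave."""
    def __init__(self, gain):
        self.gain = gain
        self.u = 0.0                    # input variable
        self.y = 0.0                    # output variable
    def do_step(self, t, h):
        # Trivial invented dynamics: output follows gain * input + step.
        self.y = self.gain * self.u + h

def master_loop(source, sink, end_time, h):
    """Fixed-step master: step the slaves, then propagate source.y
    into sink.u at each communication point."""
    t = 0.0
    while t < end_time:
        source.do_step(t, h)
        sink.u = source.y               # data exchange between slaves
        sink.do_step(t, h)
        t += h
    return sink.y

result = master_loop(FmuStub(gain=1.0), FmuStub(gain=2.0), 1.0, 0.25)
```

A real master additionally handles FMU initialization, variable binding from model descriptions, and possible step rejection; the loop above shows only the orchestration pattern.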


Author(s):  
Lichao Xu ◽  
Szu-Yun Lin ◽  
Andrew W. Hlynka ◽  
Hao Lu ◽  
Vineet R. Kamat ◽  
...  

Abstract There has been a strong need for simulation environments capable of modeling the deep interdependencies between complex systems encountered during natural hazards, such as the interactions and coupled effects between civil infrastructure response, human behavior, and social policies, for improved community resilience. Coupling such complex components in an integrated simulation requires continuous data exchange between the different simulators running separate models throughout the simulation process. This can be implemented by means of distributed simulation platforms or data passing tools. To provide a systematic reference for simulation tool choice and to facilitate the development of compatible distributed simulators for studying deep interdependencies in the context of natural hazards, this article focuses on generic tools suitable for integrating simulators from different fields, rather than on platforms used mainly within specific fields. With this aim, the article provides a comprehensive review of the most commonly used generic distributed simulation platforms (Distributed Interactive Simulation (DIS), High Level Architecture (HLA), Test and Training Enabling Architecture (TENA), and Distributed Data Services (DDS)) and data passing tools (Robot Operating System (ROS) and Lightweight Communication and Marshalling (LCM)), and compares their advantages and disadvantages. Three specific limitations of existing platforms are identified from the perspective of natural hazard simulation. To mitigate these limitations, two platform design recommendations are provided, namely message exchange wrappers and hybrid communication, to improve data passing capabilities in existing solutions and to guide the design of a new domain-specific distributed simulation framework.
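The first recommendation, message exchange wrappers, amounts to adapters that translate between middleware-native message formats so simulators built on different tools can interoperate. A minimal sketch, with field names and the HLA-like/ROS-like pairing invented purely for illustration:

```python
# Sketch of a message exchange wrapper: translate a dict-style,
# HLA-like attribute update into a ROS-like JSON message and back.
# All formats and field names here are invented for illustration.
import json

def hla_to_ros_wrapper(hla_update):
    """Wrap an HLA-style attribute update as a ROS-like JSON message
    with a header (object id, timestamp) and a flat payload."""
    return json.dumps({
        "header": {"frame_id": hla_update["object"],
                   "stamp": hla_update["time"]},
        "data": hla_update["attributes"],
    })

def ros_to_hla_wrapper(ros_msg):
    """Inverse translation, recovering the HLA-style update."""
    msg = json.loads(ros_msg)
    return {"object": msg["header"]["frame_id"],
            "time": msg["header"]["stamp"],
            "attributes": msg["data"]}

update = {"object": "Bridge7", "time": 12.5,
          "attributes": {"damage_state": "minor"}}
roundtrip = ros_to_hla_wrapper(hla_to_ros_wrapper(update))
```

A lossless round trip, as here, is the property such a wrapper must preserve; in practice the wrapper also has to reconcile type systems and delivery semantics between the two middlewares.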


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Bo-yong Park ◽  
Seok-Jun Hong ◽  
Sofie L. Valk ◽  
Casey Paquola ◽  
Oualid Benkarim ◽  
...  

Abstract The pathophysiology of autism has been suggested to involve a combination of both macroscale connectome miswiring and microcircuit anomalies. Here, we combine connectome-wide manifold learning with biophysical simulation models to understand associations between global network perturbations and microcircuit dysfunctions in autism. We studied neuroimaging and phenotypic data in 47 individuals with autism and 37 typically developing controls obtained from the Autism Brain Imaging Data Exchange initiative. Our analysis establishes significant differences in structural connectome organization in individuals with autism relative to controls, with strong between-group effects in low-level somatosensory regions and moderate effects in high-level association cortices. Computational models reveal that the degree of macroscale anomalies is related to atypical increases of recurrent excitation/inhibition, as well as subcortical inputs into cortical microcircuits, especially in sensory and motor areas. Transcriptomic association analysis based on postmortem datasets identifies genes expressed in cortical and thalamic areas from childhood to young adulthood. Finally, supervised machine learning finds that the macroscale perturbations are associated with symptom severity scores on the Autism Diagnostic Observation Schedule. Together, our analyses suggest that atypical subcortico-cortical interactions are associated with both microcircuit and macroscale connectome differences in autism.


2021 ◽  
Vol 92 (3) ◽  
pp. 1854-1875 ◽  
Author(s):  
Klaus Stammler ◽  
Monika Bischoff ◽  
Andrea Brüstle ◽  
Lars Ceranna ◽  
Stefanie Donner ◽  
...  

Abstract Germany has a long history of seismic instrumentation. The first station sites were installed in regions with seismic activity. Later on, with an increasing need for seismic hazard assessment, seismological state services were established over the course of several decades, using heterogeneous technology. In parallel, scientific research and international cooperation projects triggered the establishment of institutional and nationwide networks and arrays that also focus on topics other than monitoring local or regional areas, such as recording global seismicity or verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty. At each of the observatories and data centers, an extensive analysis of the recordings is performed, providing high-level data products, for example earthquake catalogs, as a basis for supporting state or federal authorities, informing the public on topics related to seismology, and transferring information to international institutions. These data products are usually also accessible on the websites of the responsible organizations. The establishment of the European Integrated Data Archive (EIDA) led to a consolidation of existing waveform data exchange mechanisms and their definition as standards in Europe, along with a harmonization of the applied data quality assurance procedures. In Germany, the German Regional Seismic Network, as the national backbone network, and the state networks of Saxony, Saxony-Anhalt, Thuringia, and Bavaria spearheaded the national contributions to EIDA. The benefits of EIDA are attracting additional state and university networks, which are now about to join the EIDA community.

