Security event data collection and analysis in large corporate networks

Author(s):  
E V Chernova ◽  
P N Polezhaev ◽  
A E Shukhman ◽  
Yu A Ushakov ◽  
I P Bolodurina ◽  
...  

Every year computer networks become more complex, which directly affects the provision of a high level of information security. The commercial services, critical systems, and information resources prevailing in such networks are profitable targets for terrorists, cyber-spies, and criminals. The consequences range from the theft of strategic, highly valued intellectual property and direct financial losses to significant damage to brand and customer trust. Attackers have the advantage in complex computer networks, where it is easier for them to hide their tracks. The detection and identification of security incidents are the most important and difficult tasks. Security incidents must be detected as soon as possible and analyzed and responded to correctly, so as not to disrupt the operation of the enterprise computer network. The difficulty is that different event sources use different data formats and can duplicate events. In addition, some events do not indicate any problem on their own, but a sequence of them may indicate the presence of a security incident. All security event collection must be performed in real time, which calls for streaming data processing.
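The collection problems named above (different source formats, duplicated events, real-time constraints) can be illustrated with a small normalize-then-deduplicate stage. This is a minimal sketch of the general technique, not the authors' system; the field names and the eviction window are assumptions:

```python
import hashlib
import json
from collections import OrderedDict

def normalize(event: dict) -> dict:
    """Map a raw event from any source into one common schema.

    The field names ("ts", "host", "action") are illustrative
    assumptions, not a standard taxonomy.
    """
    return {
        "ts": event.get("timestamp") or event.get("time"),
        "host": event.get("host") or event.get("device"),
        "action": (event.get("action") or event.get("event") or "").lower(),
    }

class StreamDeduplicator:
    """Drop duplicates of the last `window` distinct events in a stream."""

    def __init__(self, window: int = 1000):
        self.window = window
        self.seen = OrderedDict()  # fingerprint -> None, in arrival order

    def accept(self, event: dict) -> bool:
        """Return True if the event is new, False if it is a duplicate."""
        norm = normalize(event)
        fp = hashlib.sha256(json.dumps(norm, sort_keys=True).encode()).hexdigest()
        if fp in self.seen:
            return False  # same event already seen, e.g. from another source
        self.seen[fp] = None
        if len(self.seen) > self.window:
            self.seen.popitem(last=False)  # evict the oldest fingerprint
        return True
```

Because fingerprints are computed over the normalized form, the same event reported by two sources in different formats collapses to one record.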

Author(s):  
S.N. John ◽  
A.A. Anoprienko ◽  
C.U. Ndujiuba

This chapter provides solutions for increasing the efficiency of data transfer in modern computer network applications and computing environments based on the TCP/IP protocol suite. Simulation modeling was used as the basic research method. A simulation model was developed for designing and analyzing computer networks based on the TCP/IP protocol suite, one that accurately captures the features of the protocol implementations and their impact on the efficiency of data transfer in local and corporate networks. A method for increasing the performance of computer networks was proposed, based on refining the data transfer modes of the TCP/IP protocols. This allows more efficient use of computer networks and network applications without additional expenditure on network infrastructure. In practice, the results obtained from this research enable a significant increase in data transfer efficiency in computer network environments, as demonstrated on the Donetsk National Technical University network.


2020 ◽  
Vol 13 (1) ◽  
pp. 300
Author(s):  
Juan Carlos Fandos-Roig ◽  
Javier Sánchez-García ◽  
Sandra Tena-Monferrer ◽  
Luis José Callarisa-Fiol

The main aim of this paper is to analyze the influence of service companies’ corporate social responsibility (CSR) actions on final customer’s loyalty. A theoretical model of loyalty formation based on CSR was proposed and a sample of 1125 final customers of financial services in Spain was studied. Structural equation models were used to verify the hypothesized relationships. Based on the CSR theory oriented to stakeholders, this work justifies the direct and positive relationship between the perception of CSR actions in the shopping experience and customer trust. We also verified a positive indirect influence on loyalty. The services industry was chosen to conduct this research due to its own particularities (intangibility, inseparability, heterogeneity and perishability). As it is impossible to evaluate a service before its consumption, a high level of trust in the supplier will be necessary to motivate the purchase decision. We conclude that CSR becomes a key strategic asset for determining trust and loyalty among consumers. As major findings, we have verified the special importance of CSR in the services market. CSR improves customer trust in the service provider. Thus, this paper has significant managerial implications. Through CSR strategies, both the perception of the customer’s purchasing experience and trust can be enhanced, resulting in more loyal customers. As a limitation, this research was carried out among financial services. Further research should test the model across different industries and countries in order to determine the generalizability and consistency of the findings of this study.


2013 ◽  
Vol 299 ◽  
pp. 130-134
Author(s):  
Li Wei ◽  
Da Zhi Deng

In recent years, China's investment in the construction of network management has been constantly increasing and information technology has improved continuously; nevertheless, a variety of network security incidents occur frequently due to vulnerabilities inherent in computer network systems, with a direct impact on national security and social and political stability. With the popularity of computers and the large-scale development of the Internet, network security has become an increasingly prominent theme. Reasonable safeguards against violations of resources and the regulation of Internet user behavior have become the public's expectations for the future Internet. This paper describes a stable method of obtaining a telnet user's account in the development of network management software based on the telnet protocol.
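The telnet protocol interleaves user data with IAC command and option-negotiation sequences, so any method that recovers an account name from a telnet session must first strip those control bytes. The sketch below is a hedged illustration of that parsing step, not the paper's actual implementation; the `login: ` prompt string is an assumption:

```python
IAC, SB, SE = 255, 250, 240  # telnet control bytes (RFC 854)

def strip_telnet_commands(raw: bytes) -> bytes:
    """Remove IAC command/option-negotiation sequences, keeping user data."""
    out = bytearray()
    i = 0
    while i < len(raw):
        b = raw[i]
        if b != IAC:
            out.append(b)
            i += 1
            continue
        if i + 1 >= len(raw):          # trailing lone IAC: ignore it
            break
        cmd = raw[i + 1]
        if cmd == IAC:                 # escaped 0xFF data byte
            out.append(IAC)
            i += 2
        elif cmd in (251, 252, 253, 254):  # WILL/WONT/DO/DONT + option byte
            i += 3
        elif cmd == SB:                # subnegotiation: skip until IAC SE
            i += 2
            while i + 1 < len(raw) and not (raw[i] == IAC and raw[i + 1] == SE):
                i += 1
            i += 2
        else:                          # two-byte command (NOP, AYT, ...)
            i += 2
    return bytes(out)

def extract_account(raw: bytes):
    """Pull the account name typed after the 'login: ' prompt, if present."""
    text = strip_telnet_commands(raw).decode("ascii", "replace")
    marker = "login: "
    pos = text.find(marker)
    if pos < 0:
        return None
    rest = text[pos + len(marker):]
    return rest.split("\r")[0].split("\n")[0]
```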


Author(s):  
Julie Roux ◽  
Katell Morin-Allory ◽  
Vincent Beroulle ◽  
Regis Leveugle ◽  
Lilian Bossuet ◽  
...  

2018 ◽  
Vol 7 (2) ◽  
pp. 61-67
Author(s):  
Iga Revva Princiss Jeinever

Computer networks are inherently unsafe when open to unrestricted access. Security gaps in a network can be found by malicious parties using a variety of techniques, and an open port carries a high risk of attack. Network administrators are therefore required to work harder to secure the computer networks they manage. One form of network security often used by administrators in server management is remote login via ports such as telnet and SSH. A port that is always open is a security hole that can be exploited to log into the server. Focusing on these problems, this study shows that random port knocking is an effective way to increase network security. With random port knocking, a port is opened only as needed; the port automatically changes after more than three failed login attempts, and the offending IP is automatically blocked so that access cannot continue. In this way attacks on the network can be avoided and the stability of network security further improved. The final result of this research shows that the applied method makes the server safer, because port randomization and IP blocking force unauthorized parties to work much harder to penetrate the firewall.
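The policy described above can be sketched as a small state machine: count failures per IP, and on reaching the threshold block the IP and rotate the knock port. This is a minimal sketch of the general idea; the port range, the three-failure threshold, and the in-memory state are illustrative assumptions, not details from the original study:

```python
import random

class RandomPortKnocker:
    """Track failed logins per IP; rotate the open port and block the IP
    after three consecutive failures (threshold is an assumption)."""

    MAX_FAILURES = 3

    def __init__(self, seed=None):
        self.rng = random.Random(seed)
        self.port = self.rng.randint(1024, 65535)  # current knock port
        self.failures = {}                         # ip -> consecutive failures
        self.blocked = set()                       # blocked IP addresses

    def _rotate_port(self):
        new = self.rng.randint(1024, 65535)
        while new == self.port:                    # guarantee the port changes
            new = self.rng.randint(1024, 65535)
        self.port = new

    def attempt_login(self, ip, ok):
        """Record a login attempt; return True only when access is granted."""
        if ip in self.blocked:
            return False
        if ok:
            self.failures.pop(ip, None)            # success resets the count
            return True
        self.failures[ip] = self.failures.get(ip, 0) + 1
        if self.failures[ip] >= self.MAX_FAILURES:
            self.blocked.add(ip)                   # block the offending IP
            self._rotate_port()                    # open a fresh random port
        return False
```

A real deployment would enforce the rotation in the firewall ruleset rather than in application state; the class only demonstrates the decision logic.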


2021 ◽  
Author(s):  
Celia J. Li

This thesis research has successfully completed two developments: an efficient Power-system Role-based Access Control (PRAC) and a secure Power-system Role-based kEy management (PREM). The PRAC significantly increases the security of computer networks for power systems and surmounts the challenges caused by the typical security and reliability concerns arising from the current technological and political changes faced by the electric power industry. The PREM is designed to support the efficient operation of the PRAC using one-way hash functions, exploiting their computational efficiency and the security of their irreversibility. PRAC and PREM are developed not only to handle a single local computer network domain but are also extended to support multiple computer network domains. A platform for the comprehensive assessment of PREM is established for the fast and economical evaluation of the key management developed in this thesis research.
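One-way hash functions of the kind PREM relies on are commonly used as hash chains: keys are derived by repeated hashing and disclosed in reverse order, so each disclosed key can be verified with a single hash while irreversibility prevents an attacker from deriving the next one. The sketch below illustrates this general technique only; SHA-256, the chain length, and the verifier interface are assumptions, not the thesis's actual PREM protocol:

```python
import hashlib

def H(data: bytes) -> bytes:
    """One-way hash function (SHA-256 chosen for illustration)."""
    return hashlib.sha256(data).digest()

def make_chain(seed: bytes, n: int) -> list:
    """Build k[0]=seed, k[i]=H(k[i-1]); k[n] serves as the public commitment."""
    chain = [seed]
    for _ in range(n):
        chain.append(H(chain[-1]))
    return chain

class Verifier:
    """Accepts keys released in reverse chain order, one cheap hash per check."""

    def __init__(self, commitment: bytes):
        self.last = commitment  # the publicly known chain tip

    def accept(self, key: bytes) -> bool:
        if H(key) == self.last:
            self.last = key  # irreversibility: forging the next key
            return True      # would require a hash preimage
        return False
```

The verifier stores only one digest, which is what makes the scheme computationally economical for constrained power-system devices.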


Author(s):  
A. Danladi ◽  
G. P. Vasira

Fractal dimension is mathematically defined in terms of the statistical complexity of network traffic, and its manifestation can significantly affect network performance. In this work, two models of corporate computer networks were developed using Optimized Network Engineering Tool (OPNET) technology. Raw packet generator (RPG) traffic was imposed on the corporate networks and modeled with H = 0.7 and D = 1.3 under the influence of a Pareto distribution. The autocorrelation function and the power law were used to confirm the presence of fractal traffic on the networks. The average Hurst index (H) for 50 and 100 workstations was estimated using the aggregated variance, absolute moment, periodogram, and R/S methods as 0.627 and 0.608, and the corresponding fractal dimensions (D) were obtained as 1.371 and 1.391, respectively. These results indicate that fractal traffic is present and that delay on the network is minimized.
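The aggregated variance method named above exploits the scaling law Var(X^(m)) ~ m^(2H-2) for the m-aggregated series: regressing the log-variance of block means against the log block size gives a slope of 2H-2, hence H = 1 + slope/2 (and D = 2 - H). A minimal sketch with arbitrarily chosen block sizes, not the authors' OPNET tooling:

```python
import math

def hurst_aggregate_variance(series, block_sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the Hurst index H via the aggregated variance method:
    Var(X^(m)) ~ m^(2H-2), so H = 1 + slope/2 on a log-log plot."""
    xs, ys = [], []
    for m in block_sizes:
        # split the series into non-overlapping blocks of length m
        means = [sum(series[i:i + m]) / m
                 for i in range(0, len(series) - m + 1, m)]
        mu = sum(means) / len(means)
        var = sum((v - mu) ** 2 for v in means) / len(means)
        xs.append(math.log(m))
        ys.append(math.log(var))
    # ordinary least-squares slope of log-variance against log block size
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 1 + slope / 2
```

For independent noise the block-mean variance falls as 1/m, giving a slope of -1 and H = 0.5; long-range-dependent traffic decays more slowly and yields H > 0.5.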


2020 ◽  
pp. 45-51
Author(s):  
Igor Butusov ◽  
Aleksandr Romanov

The purpose of the article is to support the processes of preventing information security incidents under conditions of high uncertainty. Method: methods of mathematical (theoretical) computer science and fuzzy set theory. Result: an information security incident, including a computer incident, is considered as a violation or termination of the functioning of an automated information system and (or) a violation of the information stored and processed in this system, including those caused by a computer attack. Information descriptions are presented in the form of structured data about the signs of computer attacks; structured data are finite sequences of strings of symbols in a formal language. The Damerau-Levenshtein edit distance is proposed as the metric for measuring the distance between strings of characters over a particular alphabet. The possibility of representing the semantics of information descriptions of attack features in the form of fuzzy sets is proved. Thresholds (degrees) of separation of fuzzy information descriptions are defined, and the influence of the semantic certainty of information descriptions of features (degrees of blurring of fuzzy information descriptions) on decisions about their identity (similarity) is evaluated. It is shown that the semantic component of information descriptions of the signs of computer attacks presupposes some semantic metric (for its measurement and interpretation) that is, as a rule, formally poorly defined, ambiguously interpreted, and characterized by fuzziness-type uncertainty, the presence of semantic information, and the impossibility of directly applying a probabilistic measure to determine the degree of similarity between input and stored information descriptions of signs. An approach is proposed to identify fuzzy information descriptions of computer attacks and to apply methods for separating the elements of the reference sets on which these information descriptions are defined. It is shown that the results of the procedure for identifying fuzzy information descriptions of computer attacks depend on the degree of separation of the reference sets and on the indicators of semantic uncertainty of these descriptions.


Author(s):  
Sudha Srinivasan ◽  
D. S. Chauhan ◽  
Rekha R.

Field programmable gate arrays (FPGAs) are finding an increasing number of applications in the high-integrity, safety-critical systems of the aerospace and defence industry. Although an FPGA design goes through various development processes, it is widely observed that critical errors surface in the final stages of development, impacting both schedule and cost. The risk of failure in complex embedded systems is mitigated by using the independent verification and validation (IV&V) technique. IV&V of FPGA-based designs is essential for evaluating the correctness, quality, and safety of airborne embedded systems throughout the development life cycle, and it provides early detection and identification of risk elements. The IV&V process and its planning need to be initiated early in the development life cycle. This chapter describes the IV&V methodology for FPGA-based designs during the development life cycle, along with the certification process.

