Data loss: Recently Published Documents

Total documents: 635 (five years: 251)
H-index: 24 (five years: 5)

Author(s): Andrey Makashov, Andrew Makhorin, Maxim Terentiev

A wireless sensor network (WSN) with a tree topology is considered, which performs measurements and transmits their results to the consumer. Under interference, the low power of the WSN nodes' transmitters makes the transmitted information vulnerable, leading to significant data loss. To reduce data loss during transmission, a noise-immune WSN model is proposed. Having detected the absence of a stable connection between a pair of nodes, such a WSN moves the interaction between those nodes to a radio channel free from interference. To this end, the model provides not only for forming the network and transferring application data, but also for checking communication availability via a keep-alive mechanism and for restoring the network with a possible channel change. A key feature of the proposed approach is the ability to restore network connectivity under interference of such power and duration that exchanging service messages on the channel selected for node interaction becomes impossible. To support the model, operating algorithms and data structures have been developed, and indicators have been formalized to assess the quality of the anti-jamming system's operation.
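
As a rough illustration of the keep-alive and channel-change mechanism described above, the sketch below shows one way a node might monitor a link and hop to a fallback channel. The class names, timeouts, channel list, and the Radio stub are illustrative assumptions, not the authors' implementation.

```python
import random

KEEP_ALIVE_PERIOD = 5.0      # seconds between keep-alive probes (assumed)
MAX_MISSED = 3               # missed replies before the link is declared lost
CHANNELS = [11, 15, 20, 25]  # candidate radio channels (assumed)

class Radio:
    """Stub transceiver; a real node would drive RF hardware here."""
    def __init__(self):
        self.channel = CHANNELS[0]

    def set_channel(self, ch: int) -> None:
        self.channel = ch

    def probe(self, peer: str) -> bool:
        # Placeholder for a keep-alive/ack exchange; jamming shows up
        # as a long run of failed probes.
        return random.random() > 0.3

class Link:
    """Monitors one parent-child link and hops channels when it fails."""
    def __init__(self, radio: Radio, peer: str):
        self.radio, self.peer = radio, peer
        self.channel_idx, self.missed = 0, 0

    def keep_alive_tick(self) -> None:
        if self.radio.probe(self.peer):
            self.missed = 0
            return
        self.missed += 1
        if self.missed >= MAX_MISSED:
            # Channel deemed jammed: both peers move to the next channel
            # in a pre-agreed sequence, since no service message can be
            # exchanged on the jammed channel itself.
            self.channel_idx = (self.channel_idx + 1) % len(CHANNELS)
            self.radio.set_channel(CHANNELS[self.channel_idx])
            self.missed = 0
```

Because the jammed channel cannot carry a coordination message, the fallback sequence must be agreed in advance, which is why the sketch cycles through a fixed channel list rather than negotiating a new channel.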


2022, Vol. 20 (8), pp. 3107
Author(s): O. V. Pachulia, R. A. Illarionov, E. S. Vashukova, N. A. Yurkina, M. G. Butenko, et al.

The main condition for effective sampling when creating a bioresource collection is quality management, which implies careful planning and the prediction of errors at all stages. The risk of sample and data loss is managed through correct logistics, well-designed algorithms, and standardized processes. This article describes the features of the logistic processes for creating a collection of biosamples from pregnant women.


2021, Vol. 2021, pp. 1-11
Author(s): Debabrata Singh, Jyotishree Bhanipati, Anil Kumar Biswal, Debabrata Samanta, Shubham Joshi, et al.

Wireless sensor networks (WSNs) have attracted much attention in recent years and are now considered one of the most popular technologies in the networking field. This popularity is driven largely by their adaptability: nodes run on energy-efficient batteries, a characteristic that has won WSNs a wide market worldwide. Transmission collision is one of the key causes of performance degradation in WSNs, resulting in excessive delay and packet loss. The collision range should be minimized to mitigate the risk of such packet collisions; statistics show that the collision area, which grows with transmission power, is significantly reduced by the technique presented here. The proposed approach reduces power consumption and data loss through proper packet routing and congestion detection. WSNs typically require high data reliability to preserve identification and responsiveness capacity while also improving data reliability, transmission, and redundancy. Retransmission is determined by the probability of packet arrival as well as the average energy consumption.
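
The closing remark ties retransmission to packet-arrival probability and average energy consumption. Under the common assumption of independent packet losses, the expected cost works out as in this small sketch; the function names and the geometric-loss model are assumptions, not taken from the paper.

```python
def expected_transmissions(p_arrival: float) -> float:
    """Expected number of transmissions until a packet is delivered,
    assuming independent losses (geometric distribution): E[T] = 1/p."""
    if not 0.0 < p_arrival <= 1.0:
        raise ValueError("arrival probability must be in (0, 1]")
    return 1.0 / p_arrival

def energy_per_delivered_packet(p_arrival: float, e_tx: float) -> float:
    """Average energy spent per successfully delivered packet, where
    e_tx is the energy cost of a single transmission attempt."""
    return expected_transmissions(p_arrival) * e_tx

# Example: 80% arrival probability, 50 uJ per attempt -> ~62.5 uJ
print(energy_per_delivered_packet(0.8, 50e-6))
```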


Author(s): KOSTIUK Yuliia, SHESTAK Yaroslav

Background. The transport layer is designed to deliver data without errors, loss, or duplication, in the order in which it was transmitted. It provides data transfer between two applications with the required level of reliability. Transport layer protocols that guarantee reliable data delivery establish a virtual connection before the data exchange and resend segments in case of loss or damage. The aim of the study was to determine the role of transport security protocols in computer networks. Materials and methods. To achieve this goal, the study used statistical analysis and a systematic approach. Results. TCP provides reliable message transmission by forming logical connections, allowing peers on the sending and receiving computers to exchange data in duplex mode. It can also seamlessly send a byte stream generated on one computer to any other computer connected to the network. In addition, TCP controls the connection load, whereas UDP controls nothing but the integrity of the received datagrams. Conclusion. The difference between TCP and UDP is the so-called "delivery guarantee". TCP requires a response from the client to whom a data packet is delivered, that is, a confirmation of delivery, and for this it needs a pre-established connection. TCP is therefore considered reliable, unlike UDP, which is called an "unreliable datagram protocol". TCP eliminates data loss, duplication, packet reordering, and delays; UDP allows all of these and does not need a connection to work. As a result, an application that transfers data over UDP must manage whatever it receives, even with losses.
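
The contrast between the two protocols can be seen directly in the standard Berkeley sockets API. The minimal Python sketch below is illustrative and not part of the study; host, port, and payload are placeholders.

```python
import socket

# TCP: connection-oriented, delivery-confirmed byte stream.
def tcp_send(host: str, port: int, payload: bytes) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.connect((host, port))   # three-way handshake before any data
        s.sendall(payload)        # lost segments are retransmitted by TCP

# UDP: connectionless datagrams, no delivery guarantee.
def udp_send(host: str, port: int, payload: bytes) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(payload, (host, port))  # fire-and-forget; the datagram may
                                         # be lost, duplicated, or reordered
```

Note that the TCP path cannot send a single byte until `connect` completes, while the UDP path transmits immediately, which is exactly the "delivery guarantee" trade-off the conclusion describes.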


Sensors, 2021, Vol. 21 (24), pp. 8407
Author(s): Fuad Al Abir, Md. Al Siam, Abu Sayeed, Md. Al Mehedi Hasan, Jungpil Shin

The act of writing letters or words in free space with body movements is known as air-writing. Air-writing recognition is a special case of gesture recognition in which gestures correspond to characters and digits written in the air. Air-writing, unlike general gestures, does not require the memorization of predefined gesture patterns; rather, it is sensitive to the subject and language of interest. Traditional air-writing requires an extra device containing sensor(s), whereas the wide adoption of smart bands eliminates the need for such a device, making air-writing recognition systems more flexible day by day. However, variability of signal duration is a key problem in developing an air-writing recognition model. Inconsistent signal duration is inevitable given the nature of the writing and data-recording process. To make the signals consistent in length, researchers have attempted various strategies, including padding and truncation, but these procedures result in significant data loss. Interpolation is a statistical technique that can be employed on time-series signals to ensure minimal data loss. In this paper, we extensively investigated different interpolation techniques on seven publicly available air-writing datasets and developed a method to recognize air-written characters using a 2D-CNN model. Under both user-dependent and user-independent evaluation, our method outperformed all state-of-the-art methods by a clear margin on all datasets.
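
As a concrete example of length normalization by interpolation, the sketch below resamples a variable-length multi-axis signal to a fixed length with linear interpolation via NumPy. The paper investigates several interpolation techniques; this particular function and the target length are illustrative assumptions.

```python
import numpy as np

def resample_signal(signal: np.ndarray, target_len: int) -> np.ndarray:
    """Resample a variable-length 1D time-series to a fixed length using
    linear interpolation, so no frames are padded or truncated away."""
    src = np.linspace(0.0, 1.0, num=len(signal))
    dst = np.linspace(0.0, 1.0, num=target_len)
    return np.interp(dst, src, signal)

def resample_multiaxis(sig: np.ndarray, target_len: int) -> np.ndarray:
    """Apply the 1D resampling to each axis of a (frames, axes) signal,
    e.g. x/y/z accelerometer channels from a smart band."""
    return np.stack([resample_signal(sig[:, i], target_len)
                     for i in range(sig.shape[1])], axis=1)

# Example: a 137-frame, 3-axis recording resampled to 200 frames
rec = np.random.randn(137, 3)
print(resample_multiaxis(rec, 200).shape)  # (200, 3)
```

Unlike zero-padding or truncation, every original frame contributes to the resampled output, which is the "minimal data loss" property the abstract attributes to interpolation.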


2021, Vol. 2021, pp. 1-7
Author(s): Shen Lvping

With the development of information and network technology, digital archive management systems have been widely used in archive management. Unlike traditional paper archives, which are inherently unique and strongly tamper-resistant, electronic archives are stored in centralized databases that face greater risks of network attack, data loss, or theft through malicious software, and are more easily forged or tampered with by internal managers or external attackers. The management of intangible cultural heritage archives is an important part of intangible cultural heritage protection. Because intangible heritage archives differ from traditional official archives, traditional archive management methods cannot be fully applied to them. Addressing the complexity, high dispersion, large quantity, and low quality of intangible cultural heritage files, this study combines the characteristics of blockchain technology, namely distributed ledgers, consensus mechanisms, and encryption algorithms, and proposes blockchain-based management of intangible cultural heritage files. Applying blockchain technology to the authenticity protection of electronic archives, and designing and developing an archive management system based on it, helps to solve a series of problems in the management of intangible cultural heritage archives.
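
To make the authenticity-protection idea concrete, here is a minimal hash-chain sketch: each archive record commits to the hash of its predecessor, so tampering with any earlier record invalidates every later link. This is only an illustration of the principle; the system described in the study additionally involves distributed ledgers and consensus, which are not shown.

```python
import hashlib
import json
import time

def record_hash(record: dict) -> str:
    """Deterministic SHA-256 hash of a record's canonical JSON form."""
    return hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()

class ArchiveLedger:
    """Append-only hash chain of archive entries (illustrative)."""
    def __init__(self):
        self.chain = [{"index": 0, "prev": "0" * 64,
                       "timestamp": 0.0, "document": "genesis"}]

    def append(self, document: str) -> dict:
        block = {"index": len(self.chain),
                 "prev": record_hash(self.chain[-1]),  # link to predecessor
                 "timestamp": time.time(),
                 "document": document}
        self.chain.append(block)
        return block

    def verify(self) -> bool:
        """Tampering with any earlier record breaks every later link."""
        return all(self.chain[i]["prev"] == record_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = ArchiveLedger()
ledger.append("folk-song recording #17, checksum of scanned metadata")
print(ledger.verify())  # True until any stored record is modified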


2021, pp. 19-26
Author(s): Yana Chumburidze, Tatiana Omelchenko

Data loss as a result of threats or natural disasters can lead not only to huge financial losses but also to damage to the company's reputation. The most effective way to protect data from loss is backup. The purpose of the study is to select the most appropriate method of data backup and to develop a software tool based on it. We discuss the main methods of data backup: full backup, incremental backup, differential backup, reverse incremental backup, and synthetic backup. We identified the following criteria for determining the most appropriate backup method: backup speed, restore speed, backup repository, reliability, network workload, and redundancy. A comparative analysis based on these criteria revealed that the most appropriate method of data backup is reverse incremental backup. A functional model, architecture, and interface of the software tool have been designed. The main purpose of the tool is to implement the reverse incremental backup method to prevent information loss. The goal is considered achieved when the backup data obtained from a reverse incremental backup matches the current state of the system. We conducted a series of experiments showing that the backup copy corresponds to the current state of the system.
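
For intuition, a minimal file-level sketch of reverse incremental backup follows: the newest backup is always a full mirror, and each run moves the outgoing versions of changed or deleted files into a dated reverse increment, from which earlier states can be rebuilt by stepping backwards from the full copy. Paths and helper names are assumptions, not the authors' tool.

```python
import shutil
import time
from pathlib import Path

def reverse_incremental_backup(source: Path, full: Path,
                               increments: Path) -> None:
    """Keep `full` as an up-to-date mirror of `source`, preserving the
    outgoing version of every changed or deleted file in a new reverse
    increment directory."""
    inc = increments / time.strftime("%Y%m%d-%H%M%S")
    src_files = {p.relative_to(source)
                 for p in source.rglob("*") if p.is_file()}
    full_files = {p.relative_to(full)
                  for p in full.rglob("*") if p.is_file()}

    for rel in full_files:
        src, dst = source / rel, full / rel
        # A file is "outgoing" if it was deleted or its content changed.
        if rel not in src_files or src.read_bytes() != dst.read_bytes():
            keep = inc / rel
            keep.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(dst), str(keep))  # old version -> increment

    # Bring the full mirror up to date with the current source state.
    for rel in src_files:
        dst = full / rel
        if not dst.exists():
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(source / rel, dst)
```

This layout explains the method's restore-speed advantage the comparison relies on: the most recent state is always available directly as the full mirror, with no increments to replay.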


Author(s): Arnold Mashud Abukari, Edem Kwedzo Bankas, Mohammed Muniru Iddrisu

In this research paper, a Redundant Residue Number System (n, k) code is introduced to enhance Cloud ERP data storage. The findings demonstrate the application of a Redundant Residue Number System (RRNS) to Cloud ERP data storage. The scheme contributes to addressing data-loss challenges during data transmission, and it reduces the probability of failing to access data compared with other existing systems. The proposed scheme adopts the concepts of homomorphic encryption and secret sharing while applying the Redundant Residue Number System to detect and correct errors. The moduli set used is {2^m, 2^m + 1, 2^(m+1) - 1, 2^(m+1) + 1, 2^(m+1) + k, 2^(2m) - k, 2^(2m) + 1}, where k is the number of information moduli. The information moduli are {2^m, 2^m + 1, 2^(m+1) - 1} and the redundant moduli are {2^(m+1) + 1, 2^(m+1) + k, 2^(2m) - k, 2^(2m) + 1}. Simulation results obtained with Python show that the scheme performs far better with respect to data loss and failure to access data, with improvements ranging from 41.2% for data loss to about 99% for data access, based on the (2, 4) and (2, 5) data-share combinations, respectively, in a (k, n) setting.
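
A small sketch of the underlying residue arithmetic may help: an integer is encoded as its residues modulo the moduli set and recovered via the Chinese Remainder Theorem. The choice m = 2 is an arbitrary illustration (it keeps the information moduli pairwise coprime), and the error detection and correction that the redundant residues enable is not shown here.

```python
from math import prod

# Moduli set from the abstract, instantiated with m = 2 and k = 3
# (k = number of information moduli); values: [4, 5, 7, 9, 11, 13, 17].
m, k = 2, 3
moduli = [2**m, 2**m + 1, 2**(m + 1) - 1,            # information moduli
          2**(m + 1) + 1, 2**(m + 1) + k,
          2**(2 * m) - k, 2**(2 * m) + 1]            # redundant moduli

def encode(x: int) -> list[int]:
    """Represent x by its residues modulo each modulus."""
    return [x % mi for mi in moduli]

def decode(residues: list[int], mods: list[int]) -> int:
    """Chinese Remainder Theorem reconstruction from a subset of
    pairwise-coprime moduli whose product exceeds x."""
    M = prod(mods)
    x = 0
    for r, mi in zip(residues, mods):
        Mi = M // mi
        x += r * Mi * pow(Mi, -1, mi)  # pow(..., -1, mi): modular inverse
    return x % M

x = 123                      # must be below 4 * 5 * 7 = 140 in this toy setup
res = encode(x)
print(decode(res[:3], moduli[:3]))  # 123, recovered from information moduli
```

The redundant residues carry no extra information about x; their role is to let a decoder detect, localize, and correct corrupted shares, which is what makes the code useful against data loss in transit.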

