data blocking
Recently Published Documents


TOTAL DOCUMENTS: 11 (five years: 1)

H-INDEX: 3 (five years: 0)

2021 ◽  
Vol 13 (03) ◽  
pp. 97-107
Author(s):  
Su Man Nam ◽  
Youn Kyoung Seo

In wireless sensor networks, sensor nodes are vulnerable to several attacks because they rely on wireless communication and have constrained energy. Adversaries exploit these weaknesses to capture nodes and mount false positive and false negative attacks, which cause false alarms at the base station and information loss in intermediate nodes. A context-aware architecture for a probabilistic voting-based filtering scheme (CAA-PVFS) identifies the compromised nodes that cause this damage; although the CAA detects compromised nodes immediately, the additional network it requires consumes unnecessary energy. In this paper, the proposed method configures geofencing around the compromised nodes and blocks the nodes responsible for false data injection. This removes the unnecessary energy cost of the additional network while maintaining security strength. Experimental results indicate that, compared with the existing method, the proposed method saves up to 17% energy while maintaining security strength against the two attacks.
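
The abstract does not include an implementation, but the core idea (flag compromised nodes, geofence the region around them, and discard reports originating inside the fence) can be sketched in a few lines of Python. The node positions, fence radius, and report format below are illustrative assumptions, not details from the paper:

import math

# Illustrative sketch: geofence compromised sensor nodes and drop their reports.
# Positions, the fence radius, and the report format are assumptions; the
# paper's actual CAA-PVFS protocol is more involved.

compromised = {(40.0, 22.0), (41.5, 23.0)}  # assumed compromised node positions
FENCE_RADIUS = 5.0                          # assumed geofence radius (meters)

def inside_geofence(pos):
    """Return True if a position falls inside any compromised node's geofence."""
    return any(math.dist(pos, c) <= FENCE_RADIUS for c in compromised)

def filter_reports(reports):
    """Drop reports originating inside a geofenced region, so injected false
    positives/negatives from captured nodes never reach the base station."""
    return [r for r in reports if not inside_geofence(r["pos"])]

reports = [
    {"node": 7,  "pos": (40.2, 22.1), "event": True},   # inside fence: blocked
    {"node": 12, "pos": (90.0, 10.0), "event": False},  # outside fence: kept
]
print(filter_reports(reports))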



2020 ◽  
Vol 39 (6) ◽  
pp. 8557-8564
Author(s):  
Mukta Jagdish ◽  
Amelec Viloria ◽  
Jesus Vargas ◽  
Omar Bonerge Pineda Lezama ◽  
David Ovallos-Gazabon

Cloud computing is regarded as the foundational architecture of the next generation of enterprise IT. Unlike traditional solutions, where IT services operate under full logical, personnel, and physical control, it moves application software and large databases to data centers whose security and management cannot be fully trusted. This shift raises many challenges for organizations and society that are still not well understood, and it remains one of the major open problems today. This research therefore focuses on secure data storage in the cloud, one of the key aspects of quality of service. To assure the correctness of user data in the cloud, a flexible and effective distributed technique with two salient features is examined: it uses homomorphic tokens with erasure-coded data for distributed verification, and through this it achieves both the localization of erroneous data and the guarantee of storage correctness. The technique also identifies misbehaving servers and supports efficient, secure dynamic operations on data blocks, such as append, delete, and update. Security and performance analysis shows that the proposed method is effective, resilient, and efficient against Byzantine failures, server colluding attacks, and malicious data modification attacks.
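
The paper's construction is algebraic (homomorphic tokens over erasure-coded vectors), but the challenge-response idea behind misbehaving-server localization can be illustrated with a much-simplified sketch. The plain SHA-256 token and the in-memory "servers" below are assumptions made purely for illustration:

import hashlib

# Much-simplified sketch of challenge-response storage verification.
# Real schemes use homomorphic tokens over erasure-coded data; plain
# hashes stand in here only to illustrate error localization.

def token(block: bytes) -> str:
    return hashlib.sha256(block).hexdigest()

# Client side: precompute a token for a sampled block on each server.
servers = {
    "s1": b"block-data-on-server-1",
    "s2": b"block-data-on-server-2",
}
precomputed = {name: token(data) for name, data in servers.items()}

# Later: server s2 silently corrupts its block (Byzantine behavior).
servers["s2"] = b"tampered-data"

# Challenge phase: re-read each server's block and compare tokens,
# which both detects the error and localizes the offending server.
misbehaving = [name for name, data in servers.items()
               if token(data) != precomputed[name]]
print("misbehaving servers:", misbehaving)   # -> ['s2']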



Symmetry ◽  
2019 ◽  
Vol 11 (4) ◽  
pp. 575
Author(s):  
Li ◽  
Dai ◽  
Wang

In banks, governments, and internet companies, the increasing demand for data in various information systems and the continuously shortening cycle of data collection and update mean that a database may suffer a variety of data quality issues. As data scales expand, methods such as pre-specifying business rules or introducing expert experience into the repair process are no longer applicable to information systems requiring rapid responses. We therefore divided data cleaning into supervised and unsupervised forms, according to whether the repair process involves human intervention, and put forward a new dimension suitable for unsupervised cleaning in this paper. For weak logic errors in unsupervised data cleaning, we proposed an attribute correlation-based (ACB) Framework under blocking and designed three different data blocking methods to reduce the time complexity and to test the impact of clustering accuracy on data cleaning. The experiments showed that the blocking methods could effectively reduce the repair time while maintaining repair validity. Moreover, we found that blocking methods with too high a clustering accuracy tended to put tuples with the same elements into one data block, which reduced the cleaning ability. In summary, the ACB-Framework with blocking reduces the time cost and needs neither the guidance of domain knowledge nor intervention in repair, so it can be applied in information systems requiring rapid responses, such as internet web pages, network servers, and sensor information acquisition.
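
As a rough illustration of why blocking cuts repair time: instead of checking every pair of tuples (quadratic in table size), tuples are first partitioned by a blocking key and repair checks run only within each block. The toy blocking key and records below are assumptions; the ACB-Framework derives its blocks from attribute correlations and clustering:

from collections import defaultdict
from itertools import combinations

# Illustrative blocking for data cleaning: compare tuples only within a
# block rather than across the whole table. The blocking key (first letter
# of the city) is a toy stand-in for clustering-based blocking.

records = [
    {"id": 1, "city": "Berlin"},
    {"id": 2, "city": "Berlim"},   # likely a typo of "Berlin"
    {"id": 3, "city": "Madrid"},
]

blocks = defaultdict(list)
for r in records:
    blocks[r["city"][0].upper()].append(r)   # assign each tuple to a block

pairs_checked = 0
for block in blocks.values():
    for a, b in combinations(block, 2):
        pairs_checked += 1                    # run the repair check here
print("pairs compared with blocking:", pairs_checked)                    # 1
print("pairs without blocking:", len(records) * (len(records) - 1) // 2)  # 3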





2014 ◽  
Vol 644-650 ◽  
pp. 1414-1417
Author(s):  
Gang Fu ◽  
Jian Hua Lin ◽  
Qian He

Analysis of parallel synchronous data transmission shows that the traditional software structure causes large transmission delays and blocking of low-speed data when radio frame data are transmitted in parallel; a time-multiplexed transmission mode is therefore proposed to resolve the delay dynamically. For the uncertainty in transmission and processing delay caused by the asynchronous mode used in the virtual radio monitoring system, a time synchronization method based on the data frame structure is proposed as a more generalized approach to achieving synchronization.
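
The abstract gives no implementation detail, but the time-multiplexing idea (interleave several low-speed parallel channels into fixed time slots of a single frame stream, so that no channel blocks the others) can be sketched as follows; the channel data and slot layout are assumptions for illustration:

# Illustrative time-division multiplexing: frame i carries slot i of every
# channel in a fixed order, so a slow channel cannot block the others.

channels = {
    "ch0": [10, 11, 12],
    "ch1": [20, 21, 22],
    "ch2": [30, 31, 32],
}

def tdm_frames(channels):
    """Yield frames; each frame holds one slot from every channel."""
    order = sorted(channels)
    for i in range(max(len(v) for v in channels.values())):
        yield [channels[ch][i] if i < len(channels[ch]) else None
               for ch in order]

for frame in tdm_frames(channels):
    print(frame)   # [10, 20, 30], then [11, 21, 31], then [12, 22, 32]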



2014 ◽  
Vol 7 (21) ◽  
pp. 4476-4489 ◽  
Author(s):  
Ijaz Ali Shoukat ◽  
Kamalrulnizam Abu Bakar ◽  
Subariah Ibrahim


2012 ◽  
Vol 51 (01) ◽  
pp. 66-73
Author(s):  
Nabil Al-Adani


2010 ◽  
Vol 18 (2) ◽  
pp. 107-123 ◽  
Author(s):  
Min Zhou ◽  
Onkar Sahni ◽  
Mark S. Shephard ◽  
Christopher D. Carothers ◽  
Kenneth E. Jansen

Effective use of the processor memory hierarchy is an important issue in high performance computing. In this work, a part-level mesh topological traversal algorithm is used to define a reordering of both mesh vertices and regions that increases the spatial locality of data and improves overall cache utilization during on-processor finite element calculations. Examples based on adaptively created unstructured meshes are considered to demonstrate the effectiveness of the procedure in cases where the load per processing core is varied but balanced (e.g., elements are equally distributed across cores for a given partition). In one example, the effect of the current adjacency-based data reordering is studied for different phases of an implicit analysis, including element-data blocking, element-level computations, sparse-matrix filling, and equation solution. These results are compared to a case where reordering is applied to mesh vertices only. The computations are performed on various supercomputers, including IBM Blue Gene (BG/L and BG/P), Cray XT (XT3 and XT5), and Sun Constellation Cluster. It is observed that reordering improves per-core performance by up to 24% on Blue Gene/L and up to 40% on Cray XT5. The CrayPat hardware performance tool is used to measure the number of cache misses at each level of the memory hierarchy. It is determined that the measured decrease in L1, L2, and L3 cache misses when data reordering is used closely accounts for the observed decrease in overall execution time.
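
A minimal sketch of the reordering idea, under assumptions not taken from the paper: traverse the mesh topology (breadth-first here, in place of the paper's part-level topological traversal) and relabel vertices in visit order, so that vertices referenced together by neighbouring elements end up adjacent in memory:

from collections import deque

# Toy mesh as a vertex adjacency list (an assumption for illustration).
adjacency = {
    0: [3, 5], 1: [4], 2: [5], 3: [0, 4],
    4: [1, 3], 5: [0, 2],
}

def bfs_reorder(adjacency, start=0):
    """Relabel vertices in breadth-first visit order so that topologically
    nearby vertices get nearby indices, improving spatial locality in the
    solution arrays indexed by vertex number."""
    old_to_new, queue, seen = {}, deque([start]), {start}
    while queue:
        v = queue.popleft()
        old_to_new[v] = len(old_to_new)
        for w in adjacency[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return old_to_new

print(bfs_reorder(adjacency))  # {0: 0, 3: 1, 5: 2, 4: 3, 2: 4, 1: 5}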



2002 ◽  
Vol 55 (4b) ◽  
pp. 311-329 ◽  
Author(s):  
Chris J. Mitchell ◽  
Peter F. Lovibond

Blocking was observed in two human Pavlovian conditioning studies in which colour cues signalled shock. Both forward (Experiment 1) and backward (Experiment 2) blocking were demonstrated, but only when prior verbal and written instructions suggested that if two signals of shock (A+ and B+) were presented together, a double shock would result (AB++). In this case, participants could assume that the outcome magnitude was additive. Participants given non-additivity instructions (A+ and B+ combined would result in the same outcome, a single shock) failed to show blocking. The modifications that associative models of learning and normative statistical accounts of causal induction would require to account for the impact of additivity instructions on the blocking effect are discussed. It is argued that the blocking shown in the present experiments resulted from the operation not of an error-correction learning rule, nor of a simple contingency detection mechanism, but of a more complex inferential process based on propositional knowledge. Consistent with the present data, blocking is a logical outcome of an A+/AB+ design only if participants can assume that outcomes will be additive.


