Concentrated Stopping Set Design for Coded Merkle Tree: Improving Security Against Data Availability Attacks in Blockchain Systems

Author(s): Debarnab Mitra, Lev Tauz, Lara Dolecek
2021

In blockchain systems, full nodes store the entire blockchain ledger and validate all transactions in the system by operating on the entire ledger. However, for better scalability and decentralization of the system, blockchains also run light nodes that only store a small portion of the ledger. In blockchain systems having a majority of malicious full nodes, light nodes are vulnerable to a data availability (DA) attack. In this attack, a malicious node makes the light nodes accept an invalid block by hiding the invalid portion of the block from the nodes in the system. Recently, a technique based on LDPC codes called Coded Merkle Tree (CMT) was proposed by Yu et al. that enables light nodes to detect a DA attack by randomly requesting/sampling portions of the block from the malicious node. However, light nodes fail to detect a DA attack with high probability if a malicious node hides a small stopping set of the LDPC code. To mitigate this problem, Yu et al. used well-studied techniques to design random LDPC codes with a high minimum stopping set size. Although effective, these codes are not necessarily optimal for this application. In this paper, we demonstrate that a suitable co-design of specialized LDPC codes and the light node sampling strategy can improve the probability of detecting DA attacks. We consider different adversary models based on their computational capability to find stopping sets in LDPC codes. For a weak adversary model, we devise a new LDPC code construction, termed the entropy-constrained PEG (EC-PEG) algorithm, which concentrates stopping sets onto a small group of variable nodes. We demonstrate that the EC-PEG algorithm, coupled with a greedy sampling strategy, improves the probability of detecting DA attacks. For stronger adversary models, we provide a co-design of a sampling strategy called linear-programming sampling (LP-sampling) and an LDPC code construction called the linear-programming-constrained PEG (LC-PEG) algorithm. The new co-design demonstrates a higher probability of detecting DA attacks than approaches proposed in earlier literature.
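To make concrete why a small hidden stopping set defeats uniform sampling, the sketch below (ours, not from the paper; the values of n and s are illustrative assumptions) computes the probability that a light node drawing s distinct samples from the n coded symbols of a CMT layer misses a hidden set of size mu, i.e., C(n - mu, s) / C(n, s):

```java
// A minimal sketch (not from the paper) of why a small hidden stopping
// set defeats uniform random sampling. A light node draws s distinct
// samples from n coded symbols; the attack escapes detection only if
// all s samples avoid the mu hidden symbols, which happens with
// probability C(n - mu, s) / C(n, s), computed here in product form.
public class DaSamplingSketch {

    static double missProbability(int n, int mu, int s) {
        double p = 1.0;
        for (int i = 0; i < s; i++) {
            // i-th draw avoids the hidden set given all earlier draws did
            p *= (double) (n - mu - i) / (n - i);
        }
        return p;
    }

    public static void main(String[] args) {
        int n = 256; // coded symbols in one CMT layer (illustrative value)
        int s = 30;  // samples drawn by a single light node (illustrative)
        for (int mu : new int[] {4, 8, 16, 32}) {
            System.out.printf("hidden stopping set of size %2d -> miss probability %.4f%n",
                    mu, missProbability(n, mu, s));
        }
    }
}
```

Under these assumed parameters, a stopping set of only 4 symbols escapes a single node's sampling with probability of roughly 0.61, which is why the paper co-designs the code (to control where stopping sets can occur) and the sampling strategy (to aim samples at those locations) rather than relying on uniform sampling alone.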


Author(s): Mingchao Yu, Saeid Sahraei, Songze Li, Salman Avestimehr, Sreeram Kannan, ...
2021

Author(s): Nur Amiratun Nazihah Roslan, Hairulnizam Mahdin, Shahreen Kasim

With the rise of social networking, there has been a surge of user-generated content all over the world. In an era where technological advancement can put us a step ahead of pathogens and the germination of disease, we can take advantage of that advancement to provide early precautionary measures. Twitter, in particular, is a social media platform that provides access to a huge volume of data. These data can be transformed into information applicable in many domains to help improve people's lives. In this paper, we gather the algorithms available inside WEKA's Meta classifier category and compare them to determine which suits the dengue fever dataset best. This research uses WEKA as the data mining tool for the analysis.
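As a concrete illustration, such a comparison can be scripted against the WEKA Java API instead of being run through the GUI. In the sketch below, the file name dengue.arff, the last-attribute class label, and the choice of Bagging and AdaBoostM1 as representative Meta classifiers are illustrative assumptions; the same loop extends to the other classifiers under weka.classifiers.meta:

```java
import java.util.Random;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.meta.AdaBoostM1;
import weka.classifiers.meta.Bagging;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

// Runs 10-fold cross-validation for two of WEKA's meta classifiers on a
// dengue dataset and reports the percentage of correctly classified
// instances for each.
public class MetaClassifierComparison {
    public static void main(String[] args) throws Exception {
        // "dengue.arff" is a placeholder path for the dengue fever dataset.
        Instances data = DataSource.read("dengue.arff");
        data.setClassIndex(data.numAttributes() - 1); // class label assumed last

        Classifier[] metas = { new Bagging(), new AdaBoostM1() };
        for (Classifier c : metas) {
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(c, data, 10, new Random(1));
            System.out.printf("%s: %.2f%% correct%n",
                    c.getClass().getSimpleName(), eval.pctCorrect());
        }
    }
}
```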


2006, Vol 27 (2), pp. 260-288
Author(s): Tanit Mendes, Janet Tulloch

2018, Vol 26 (2), pp. 48-48
Author(s): A Asrat, P Braconnot, E Book, C Chiesi, ...

2020, Vol 47 (1), pp. 55-74
Author(s): Ryan P. McDonough, Paul J. Miranti, Michael P. Schoderbek

ABSTRACT This paper examines the administrative and accounting reforms coordinated by Herman A. Metz around the turn of the 20th century in New York City. Reform efforts were motivated by deficiencies in administering New York City's finances, including a lack of internal control over monetary resources and operational activities, and opaque financial reports. The activities of Comptroller Metz, who collaborated with institutions such as the New York Bureau of Municipal Research, were paramount in initiating and implementing the administrative and accounting reforms in the city, which contributed to reform efforts across the country. Metz promoted the adoption of functional cost classifications for city departments, developed flowcharts for improved transaction processing, strengthened internal controls, and published the 1909 Manual of Accounting and Business Procedure of the City of New York, which laid the groundwork for transparent financial reports capable of providing vital information about the city's activities and subsidiary units. JEL Classifications: H72, M41, N91. Data Availability: Data are available from the public sources cited in the text.


2015, Vol 29 (3), pp. 551-575
Author(s): Colleen M. Boland, Scott N. Bronson, Chris E. Hogan

SYNOPSIS We examine whether regulations requiring accelerated filing deadlines and internal control reporting and testing affect financial statement reliability. Unlike prior research, we examine whether these regulatory changes are associated with an increase in the likelihood that misstatements originate in the period following the respective change. If the implementation of these rules causes a misstatement, then the misstatement would most likely occur in the period immediately following the rule change. We provide evidence that accelerated filers (AFs) experience an increase in the likelihood of an originating misstatement following the acceleration of filing deadlines from 90 to 75 days. Large accelerated filers (LAFs), however, do not experience a similar increase following this acceleration or the subsequent acceleration from 75 to 60 days. After the implementation of the SOX Section 404 internal control requirements, we find that the likelihood of an originating misstatement declined for AFs but not for LAFs. Taken together, the findings suggest that, although AFs experienced an initial decrease in financial statement reliability, this decrease was temporary. Data Availability: Data are publicly available from the sources identified in the text.

