Towards open data blockchain analytics: a Bitcoin perspective

2018
Vol 5 (8)
pp. 180298
Author(s):
D. McGinn
D. McIlwraith
Y. Guo

Bitcoin is the first implementation of a technology that has become known as a ‘public permissionless’ blockchain. Such systems allow public read/write access to an append-only blockchain database without the need for any mediating central authority. Instead, they guarantee access, security and protocol conformity through an elegant combination of cryptographic assurances and game theoretic economic incentives. Not until the advent of the Bitcoin blockchain has such a trusted, transparent, comprehensive and granular dataset of digital economic behaviours been available for public network analysis. In this article, by translating the cumbersome binary data structure of the Bitcoin blockchain into a high fidelity graph model, we demonstrate through various analyses the often overlooked social and econometric benefits of employing such a novel open data architecture. Specifically, we show: (i) how repeated patterns of transaction behaviours can be revealed to link user activity across the blockchain; (ii) how newly mined bitcoin can be associated to demonstrate individual accumulations of wealth; (iii) through application of the naïve quantity theory of money that Bitcoin's disinflationary properties can be revealed and measured; and (iv) how the user community can develop coordinated defences against repeated denial of service attacks on the network. Such public analyses of this open data are exemplary benefits unavailable to the closed data models of the ‘private permissioned’ distributed ledger architectures currently dominating enterprise-level blockchain development owing to existing issues of scalability, confidentiality and governance.
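
As an illustration of the kind of graph model described above, the sketch below builds a small directed transaction graph and follows newly mined coins forward through later spends. It is a minimal sketch only: the transaction records, addresses and values are invented for illustration, and networkx stands in for whatever graph store the authors actually used.

```python
# A minimal sketch, assuming a hypothetical list of already-decoded transactions,
# of representing Bitcoin spends as a directed graph (not the authors' actual pipeline).
import networkx as nx

# Hypothetical decoded transactions: each input references a previous output
# (txid, output index); each output pays an address a value in BTC.
transactions = [
    {"txid": "tx1", "inputs": [],           "outputs": [("addrA", 50.0)]},                 # coinbase (newly mined)
    {"txid": "tx2", "inputs": [("tx1", 0)], "outputs": [("addrB", 30.0), ("addrA", 20.0)]},
    {"txid": "tx3", "inputs": [("tx2", 0)], "outputs": [("addrC", 30.0)]},
]

G = nx.MultiDiGraph()
for tx in transactions:
    G.add_node(tx["txid"], outputs=tx["outputs"])
    for prev_txid, vout in tx["inputs"]:
        # Edge from the transaction that created the coin to the one spending it.
        G.add_edge(prev_txid, tx["txid"], vout=vout)

# Following coins mined in the coinbase forward is the style of traversal that
# supports the wealth-accumulation and behaviour-linking analyses described above.
print(sorted(nx.descendants(G, "tx1")))   # ['tx2', 'tx3']
```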

Author(s):  
Jerome P. Jarrett
William N. Dawes
P. John Clarkson

Aeroengines are designed using fractured processes. Complexity has driven the design of such machines to be subdivided by specialism, customer and function. While this approach has worked well in the past, with component efficiencies, current material performance and the possibilities presented by scaling existing designs for future needs becoming progressively exhausted, it is necessary to reverse this process of disintegration. Our research addresses this aim. The strategy we use has two symbiotic arms. The first is an open data architecture from which existing disparate design codes all derive their input and to which all send their output. The second is a dynamic design process management system known as “SignPosting”. Both the design codes and parameters are arranged into complementary multi-level hierarchies: fundamental to the successful implementation of our strategy is the robustness of the mechanisms we have developed to ensure consistency in this environment as the design develops over time. One of the key benefits of adopting a hierarchical structure is that it confers the ability not only to use mean-line, throughflow and fully 3D CFD techniques in the same environment, but also to cross specialism boundaries and insert mechanical, material, thermal, electrical and structural codes, enabling exploration of the design space for multi-disciplinary non-linear responses to design changes and their exploitation. We present results from trials of an early version of the system applied to the redesign of a generic civil aeroengine core compressor. SignPosting has allowed us to examine the hardness of design constraints across disciplines, which has shown that it is far more profitable not to strive for even higher aerodynamic performance, but rather to improve commercial performance by maintaining design- and part-speed pressure ratios, stability and efficiency while increasing rotor blade creep life by up to 70%.
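
The sketch below illustrates, under stated assumptions, the core idea of a shared design database with versioned parameters that codes of different fidelity read from and write to. It is not the SignPosting implementation; all parameter, level and code names are invented for illustration.

```python
# A minimal sketch of a shared, hierarchical design database from which design codes of
# different fidelity read their inputs and to which they write their outputs.
# Not the SignPosting system; all names and values are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Parameter:
    value: float
    level: str      # fidelity at which it was last set, e.g. "mean-line", "throughflow", "3D-CFD"
    version: int    # bumped on every write so dependent codes can detect stale inputs

@dataclass
class DesignDatabase:
    params: dict[str, Parameter] = field(default_factory=dict)

    def write(self, name: str, value: float, level: str) -> None:
        prev = self.params.get(name)
        self.params[name] = Parameter(value, level, prev.version + 1 if prev else 1)

    def read(self, name: str) -> Parameter:
        return self.params[name]

db = DesignDatabase()
db.write("stage_pressure_ratio", 1.45, level="mean-line")    # low-fidelity estimate
db.write("stage_pressure_ratio", 1.42, level="throughflow")  # refined by a higher-fidelity code
print(db.read("stage_pressure_ratio"))  # Parameter(value=1.42, level='throughflow', version=2)
```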


Author(s):  
Georgios Larkou
Julia Metochi
Georgios Chatzimilioudis
Demetrios Zeinalipour-Yazti

PLoS ONE
2021
Vol 16 (5)
pp. e0250887
Author(s):
Luke A. McGuinness
Athena L. Sheppard

Objective: To determine whether medRxiv data availability statements describe open or closed data—that is, whether the data used in the study is openly available without restriction—and to examine if this changes on publication based on journal data-sharing policy. Additionally, to examine whether data availability statements are sufficient to capture code availability declarations.

Design: Observational study, following a pre-registered protocol, of preprints posted on the medRxiv repository between 25th June 2019 and 1st May 2020 and their published counterparts.

Main outcome measures: Distribution of preprinted data availability statements across nine categories, determined by a prespecified classification system. Change in the percentage of data availability statements describing open data between the preprinted and published versions of the same record, stratified by journal sharing policy. Number of code availability declarations reported in the full-text preprint which were not captured in the corresponding data availability statement.

Results: 3938 medRxiv preprints with an applicable data availability statement were included in our sample, of which 911 (23.1%) were categorized as describing open data. 379 (9.6%) preprints were subsequently published, and of these published articles, only 155 contained an applicable data availability statement. Similar to the preprint stage, a minority (59; 38.1%) of these published data availability statements described open data. Of the 151 records eligible for the comparison between preprinted and published stages, 57 (37.7%) were published in journals which mandated open data sharing. Data availability statements more frequently described open data on publication when the journal mandated data sharing (open at preprint: 33.3%, open at publication: 61.4%) compared to when the journal did not mandate data sharing (open at preprint: 20.2%, open at publication: 22.3%).

Conclusion: Requiring that authors submit a data availability statement is a good first step, but is insufficient to ensure data availability. Strict editorial policies that mandate data sharing (where appropriate) as a condition of publication appear to be effective in making research data available. We would strongly encourage all journal editors to examine whether their data availability policies are sufficiently stringent and consistently enforced.
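
The headline proportions above can be reproduced directly from the counts reported in the abstract; the short sketch below does only that arithmetic and is not the study's analysis code.

```python
# Reproducing the headline percentages from the counts reported in the abstract.
# Simple arithmetic on published summary numbers, not the study's analysis code.
open_at_preprint, applicable_preprints = 911, 3938
open_at_publication, applicable_published = 59, 155

print(f"Open data at preprint stage:  {open_at_preprint / applicable_preprints:.1%}")    # 23.1%
print(f"Open data at published stage: {open_at_publication / applicable_published:.1%}")  # 38.1%
```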


2020
Vol 2020
pp. 1-11
Author(s):
Tao Li
Yuling Chen
Yanli Wang
Yilei Wang
Minghao Zhao
...  

Blockchain is an emerging technology that spans many fields, such as distributed systems and the Internet of Things (IoT). As is well known, blockchain is the underlying technology of Bitcoin, whose initial motivation derived from economic incentives. Consequently, many components of blockchain (e.g., the consensus mechanism) can be constructed from the viewpoint of game theory. In this paper, we highlight the combination of game theory and blockchain, including rational smart contracts, game-theoretic attacks, and rational mining strategies. Put differently, the rational parties involved in blockchain, who seek to maximize their utilities, choose their strategies according to economic incentives. We therefore focus on the influence of rational parties on these building blocks. More specifically, we investigate research progress on smart contracts, rational attacks, and consensus mechanisms, respectively. Finally, we present some future directions based on this brief survey of game theory and blockchain.
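
As a toy illustration of the rational-party view the survey discusses, the sketch below models a miner choosing between honest mining and a block-withholding attack by comparing expected rewards. The payoff model is an assumption made purely for illustration, not an analysis of any real protocol or of the paper's own results.

```python
# A minimal sketch of treating a miner as a rational player that picks the strategy
# with the higher expected reward. The payoff model is an illustrative assumption.

def expected_reward(strategy: str, hashrate: float, block_reward: float = 1.0) -> float:
    """Toy expected reward per block for an individual miner."""
    if strategy == "honest":
        # Honest mining earns in proportion to contributed hash rate.
        return hashrate * block_reward
    # Toy withholding attack: profitable only once the miner's hash-rate share is
    # large, mimicking the threshold behaviour of selfish-mining-style strategies.
    return (hashrate ** 2) * 2.0 * block_reward

for hashrate in (0.1, 0.3, 0.6):
    best = max(("honest", "withhold"), key=lambda s: expected_reward(s, hashrate))
    print(f"hashrate={hashrate:.1f}: best response is {best}")
# Small miners prefer honest mining; only a dominant miner prefers to deviate,
# which is the style of incentive argument the survey examines.
```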

