Highly Available DHTs: Keeping Data Consistency After Updates

Author(s):  
Predrag Knežević ◽  
Andreas Wombacher ◽  
Thomas Risse
Author(s):  
Haoyu Jiang ◽  
Kai Chen ◽  
Quanbo Ge ◽  
Jinqiang Xu ◽  
Yingying Fu ◽  
...  

2020 ◽  
Vol 17 (6) ◽  
pp. 692-725
Author(s):  
Peter Krüger Andersen

The revised Markets in Financial Instruments Directive and Regulation (the MiFID II regime; see Directive 2014/65/EU (MiFID II) and Regulation (EU) 600/2014 (MiFIR)) is one of the most comprehensive reforms of market structure and investor protection regimes the world has yet seen. The MiFID II regime will affect the European, and likely the global, market structure for years to come. Drawing on relevant perspectives from the revised best execution regime under MiFID II, this article suggests that it is time to reduce complexity. It argues that unless a sufficient degree of horizontal and vertical integration of the best execution regulation takes place, the policy objectives cannot be reached. Further, it argues that the significant data exercise that comes with the new rules serves end-investors only if a sufficient level of data consistency can be achieved. From this outset, the article emphasises the increased importance of data in today's EU financial regulation. The article includes relevant comparisons to the equivalent US rules on best execution.


Optik ◽  
2020 ◽  
Vol 202 ◽  
pp. 163603
Author(s):  
Shaojie Tang ◽  
Baolei Li ◽  
Zhiwei Qiao ◽  
Yining Zhu ◽  
Cong Guo ◽  
...  

2017 ◽  
Vol 26 (03) ◽  
pp. 1750002
Author(s):  
Fouad Hanna ◽  
Lionel Droz-Bartholet ◽  
Jean-Christophe Lapayre

The consensus problem has become a key issue in the field of collaborative telemedicine systems because of the need to guarantee the consistency of shared data. In this paper, we focus on the performance of consensus algorithms. First, we studied the most well-known algorithms in the literature. Experiments on these algorithms allowed us to propose a new algorithm that enhances the performance of consensus in different situations. In 2014, we presented our initial ideas for enhancing the performance of consensus algorithms, but the proposed solution gave very moderate results. The goal of this paper is to present a new, enhanced consensus algorithm, named after its authors Fouad, Lionel and J.-Christophe (FLC). This new algorithm is built on the architecture of the Mostefaoui-Raynal (MR) consensus algorithm and integrates new features and some known techniques in order to enhance the performance of consensus in situations where process crashes are present in the system. The results of our experiments, run on the simulation platform Neko, show that the FLC algorithm gives the best performance when using a multicast network model in two scenarios: a first scenario with neither process crashes nor wrong suspicions, and a second in which multiple simultaneous process crashes take place in the system.
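The rotating-coordinator round structure that MR-style algorithms build on can be sketched as a toy simulation. Everything below (the function name, the crash model, the single-round decision rule) is a simplification for illustration, not the FLC or MR algorithm itself: a crashed coordinator is simply skipped, standing in for the failure-detector suspicion that would trigger a new round.

```python
def rotating_coordinator_consensus(proposals, crashed, quorum=None):
    """Toy round-based consensus with a rotating coordinator.

    proposals: dict pid -> proposed value
    crashed:   set of pids that have crashed (illustrative crash model)
    Returns a dict pid -> decided value for the live processes.
    """
    pids = sorted(proposals)
    n = len(pids)
    quorum = quorum or n // 2 + 1  # majority of all processes

    for r in range(n):  # coordinator role rotates across rounds
        coord = pids[r % n]
        if coord in crashed:
            # Live processes would suspect the coordinator (failure
            # detector) and move on to the next round.
            continue
        value = proposals[coord]  # coordinator imposes its estimate
        live = [p for p in pids if p not in crashed]
        if len(live) >= quorum:
            # A majority acknowledged: every live process decides.
            return {p: value for p in live}
    return {}  # no decision (too many crashes)
```

With processes 1-3 and process 1 (the first coordinator) crashed, round 0 is skipped and process 2's proposal is decided by the two survivors, illustrating why a majority of correct processes is the usual requirement.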


Author(s):  
Suyud Widiono

A database server, or Database Management System (DBMS), that relates tables in a database is called a Relational Database Management System (RDBMS). A DBMS/RDBMS is a computer program that provides data services to computers or other programs. MariaDB is one such RDBMS database server (hereinafter referred to as a database server). The database server is in charge of managing and providing data, so the data must always be available, quickly presented, accurate, and safe: it cannot be damaged or lost. One way to achieve this is to install several database servers using the concept of replication in a multiple-server database system. Replication in a database server cluster is a method of installing several database server nodes that copy each other and distribute data from one node to another, then synchronize the data between nodes to maintain consistency. This study looks for the minimum number of database server nodes needed to provide accurate, fast, and safe data on a MariaDB cluster. The replication tests on the cluster show that with 3 (three) server nodes the nodes always synchronize and keep their data consistent, so 3 (three) nodes is the minimum cluster size with the MariaDB RDBMS.
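The synchronous multi-master replication described above corresponds to MariaDB Galera Cluster. As a minimal sketch, assuming a Galera setup, each node would carry a configuration fragment like the following; the cluster name and the three node addresses are placeholders, not values from the study.

```ini
# /etc/my.cnf.d/galera.cnf -- illustrative settings for one node
# of a hypothetical 3-node MariaDB Galera Cluster
[galera]
wsrep_on               = ON
wsrep_provider         = /usr/lib/galera/libgalera_smm.so
wsrep_cluster_name     = "demo_cluster"
# All three nodes listed so any node can rejoin after a restart
wsrep_cluster_address  = "gcomm://10.0.0.1,10.0.0.2,10.0.0.3"

# Galera requires row-based replication and InnoDB
binlog_format            = row
default_storage_engine   = InnoDB
innodb_autoinc_lock_mode = 2
```

Three nodes is also the smallest cluster in which a majority (quorum) survives the loss of one node, which is consistent with the study's finding that three is the practical minimum.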


Geophysics ◽  
2019 ◽  
Vol 84 (5) ◽  
pp. V281-V293 ◽  
Author(s):  
Qiang Zhao ◽  
Qizhen Du ◽  
Xufei Gong ◽  
Xiangyang Li ◽  
Liyun Fu ◽  
...  

Simultaneous source acquisition has attracted more and more attention from geophysicists because of its cost savings, but it also brings challenges that have never been addressed before. Deblending of simultaneous source data is usually considered an underdetermined inverse problem, which can be effectively solved with a least-squares (LS) iterative procedure balancing data consistency (an ℓ2-norm) against regularization (an ℓ1-norm or ℓ0-norm). However, when it comes to abnormal noise that follows a non-Gaussian distribution and possesses high-amplitude features (e.g., erratic noise, swell noise, and power line noise), the ℓ2-norm is a nonrobust statistic that can easily lead to suboptimal deblended results. Although abnormal noise can first be attenuated in the common source domain, it is still challenging to apply a coherency-based filter due to sparse receiver or crossline sampling, e.g., as commonly found in ocean bottom node (OBN) acquisition. To address this problem, we have developed a normalized shaping regularization to make the inversion-based deblending approach robust for the separation of blended data when abnormal noise exists. Its robustness comes from the normalized shaping operator defined by the confidence interval of the normal distribution, which reduces the abnormal risk to a normal level so as to satisfy the assumption of LS shaping regularization. As a special case, the proposed approach reverts to the classic LS shaping regularization once the normalized coefficient is large enough. Experimental results on synthetic and field data indicate that the proposed method can effectively restore the separated records from blended data at essentially the same convergence rate as LS shaping regularization in the abnormal-noise-free scenario, while obtaining better deblending performance and less energy leakage when abnormal noise exists.
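The core idea, clipping residual samples that fall outside a normal-distribution confidence interval before each LS shaping update, can be sketched in a few lines. This is a toy illustration under stated assumptions, not the authors' implementation: the function name, the MAD-based scale estimate, and the generic `forward`/`adjoint`/`shaper` operators are all assumptions for the sketch.

```python
import numpy as np

def normalized_shaping_deblend(d_blend, forward, adjoint, shaper,
                               n_iter=10, conf=3.0):
    """Toy robust shaping-regularized inversion (illustrative only).

    Residual samples outside conf * sigma (the "abnormal" noise) are
    clipped back into the normal range before each update, so
    high-amplitude erratic noise cannot dominate the LS step. With a
    very large conf, this reduces to plain LS shaping regularization.
    """
    m = np.zeros_like(adjoint(d_blend))
    for _ in range(n_iter):
        r = d_blend - forward(m)
        # Robust scale of the residual (median absolute deviation).
        sigma = 1.4826 * np.median(np.abs(r - np.median(r)))
        r = np.clip(r, -conf * sigma, conf * sigma)  # normalize outliers
        m = shaper(m + adjoint(r))  # shaping step (e.g. a sparsity promoter)
    return m
```

With identity operators and one erratic spike added to a smooth signal, the clipped update recovers the signal while largely rejecting the spike, which is the qualitative behavior the abstract claims for abnormal noise.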


2016 ◽  
Vol 9 (3) ◽  
pp. 1279-1301 ◽  
Author(s):  
Rosemary Munro ◽  
Rüdiger Lang ◽  
Dieter Klaes ◽  
Gabriele Poli ◽  
Christian Retscher ◽  
...  

Abstract. The Global Ozone Monitoring Experiment-2 (GOME-2) flies on the Metop series of satellites, the space component of the EUMETSAT Polar System. In this paper we will provide an overview of the instrument design, the on-ground calibration and characterization activities, in-flight calibration, and level 0 to 1 data processing. The current status of the level 1 data is presented and points of specific relevance to users are highlighted. Long-term level 1 data consistency is also discussed and plans for future work are outlined. The information contained in this paper summarizes a large number of technical reports and related documents containing information that is not currently available in the published literature. These reports and documents are however made available on the EUMETSAT web pages and readers requiring more details than can be provided in this overview paper will find appropriate references at relevant points in the text.

