data configuration
Recently Published Documents

TOTAL DOCUMENTS: 28 (last five years: 11)
H-INDEX: 5 (last five years: 1)

2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Jue Ma

To improve the performance of distributed blockchain systems, a novel and effective consensus algorithm is designed in this paper. It first constructs a more random additive constant from the generator matrix of an error-correcting code and uses the value of the hash entropy to prove that the constructed hash function meets the performance requirements of high throughput and fast consensus. In addition, a distributed consensus coordination service is used in the blockchain system to synchronize metadata and ensure the consistency of block data, configuration information, and transaction information. The experimental results show that the proposed strategy reduces the waste of computing resources, increases the block generation speed, and ensures the fairness of nodes participating in the competition, making it an effective solution for the stable operation of a blockchain system.
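The abstract does not specify which error-correcting code or hash construction is used, so the following sketch is purely illustrative: it derives a message-dependent additive constant from the generator matrix of a [7,4] Hamming code (an assumed choice) and mixes it into SHA-256. The names `additive_constant` and `randomized_hash` are hypothetical, not the paper's.

```python
import hashlib
import numpy as np

# Generator matrix of a [7,4] Hamming code (illustrative choice; the
# abstract does not say which error-correcting code the paper uses).
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
], dtype=np.uint8)

def additive_constant(message: bytes) -> bytes:
    """Derive a message-dependent constant by encoding the first four
    bits of the message with the generator matrix (mod 2)."""
    first = message[:1].ljust(1, b"\0")          # pad empty input
    bits = np.unpackbits(np.frombuffer(first, dtype=np.uint8))[:4]
    codeword = bits @ G % 2                      # 4 info bits -> 7-bit codeword
    return np.packbits(codeword).tobytes()

def randomized_hash(message: bytes) -> str:
    """Hash the message prefixed with the code-derived constant."""
    return hashlib.sha256(additive_constant(message) + message).hexdigest()
```

The extra constant changes with the message, so two inputs that differ only in their leading bits are additionally separated before hashing.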


Minerals ◽  
2021 ◽  
Vol 11 (11) ◽  
pp. 1281
Author(s):  
Jan von Harten ◽  
Miguel de la Varga ◽  
Michael Hillier ◽  
Florian Wellmann

Geological models are commonly used to represent geological structures in 3D space. A wide range of methods exists to create these models, with much recent scientific work focusing on implicit representation methods, which interpolate a three-dimensional field in which the relevant boundaries are isosurfaces. However, these methods have well-known problems with inhomogeneous data distributions: where regions with densely sampled data points exist, modeling artifacts are common. We present an approach that overcomes this deficiency by combining an implicit interpolation algorithm with a local smoothing approach. The approach is based on the concepts of the nugget effect and filtered kriging known from conventional geostatistics. It reduces the impact of regularly occurring modeling artifacts that result from data uncertainty and data configuration, and additionally aims to improve model robustness for scale-dependent, fit-for-purpose modeling. Local smoothing can be adjusted manually, inferred from quantified uncertainties associated with the input data, or derived automatically from the data configuration. The application to datasets with varying configuration and noise is presented for a geologic model of low complexity. The results show that the approach reduces artifacts, but may require a careful choice of parameter settings for very inhomogeneous data sets.
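As a rough illustration of the nugget effect and filtered kriging the authors build on, the sketch below performs 1-D kriging in which the nugget is added to the data-data covariance but not to the prediction covariance, so noise is smoothed rather than reproduced. The Gaussian covariance model, the parameter values, and the function names are assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_cov(h, sill=1.0, corr_range=2.0):
    """Gaussian covariance model as a function of lag distance h."""
    return sill * np.exp(-(h / corr_range) ** 2)

def filtered_kriging(x_obs, y_obs, x_new, nugget=0.1):
    """1-D simple-kriging-style interpolation.  The nugget appears only
    on the diagonal of the data-data covariance, so with nugget > 0 the
    predictor filters (smooths) the noise instead of honoring it."""
    n = len(x_obs)
    C = gaussian_cov(np.abs(x_obs[:, None] - x_obs[None, :]))
    C += nugget * np.eye(n)                       # nugget on diagonal only
    c0 = gaussian_cov(np.abs(x_new[:, None] - x_obs[None, :]))
    weights = np.linalg.solve(C, c0.T)            # kriging weights
    mu = y_obs.mean()
    return mu + (y_obs - mu) @ weights
```

With `nugget=0` the predictor interpolates the data exactly; increasing the nugget pulls predictions toward the mean, which is the smoothing behavior the abstract exploits against artifacts in densely sampled regions.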


Author(s):  
А. V. Shefer ◽  
E. N. Belykh

We present a clinical observation of the successful staged treatment of a patient with severe acute pancreatitis, based on the early diagnosis of pancreatic duct damage from CT data (the configuration of pancreatic necrosis) and a high amylase level in the fluid collection.


2021 ◽  
Vol 6 (1) ◽  
pp. 20
Author(s):  
Ilyas Ahmad Huqqani ◽  
Tay Lea Tien ◽  
Junita Mohamad-Saleh

Landslides are natural disasters that occur mostly in hilly areas. Landslide hazard mapping classifies prone areas to mitigate the risk of landslide hazards. This paper compares the spatial landslide prediction performance of an artificial neural network (ANN) model under different input data configurations, different numbers of hidden neurons, and two normalization techniques on a data set of Penang Island, Malaysia. The data set comprises twelve landslide-influencing factors, of which five take continuous values and the remaining seven categorical/discrete values. These factors are considered in three configurations, i.e., original (OR), frequency ratio (FR), and mixed-type (MT) data, each of which serves as input to train the ANN model separately. The number of hidden neurons in the hidden layer has a significant effect on the final output. In addition, the three data configurations are processed with two different normalization methods, i.e., mean-standard deviation (Mean-SD) and Min-Max. Landslide causative data often contain correlated information caused by overlapping input instances; therefore, principal component analysis (PCA) is used to eliminate this correlated information. The area under the receiver operating characteristic (ROC) curve, i.e., the AUC, is applied to validate the produced landslide hazard maps. The best AUC results for the Mean-SD and Min-Max schemes with PCA are 96.72% and 96.38%, respectively. The results show that Mean-SD with PCA on the MT data configuration yields the best validation accuracy and AUC and the lowest AIC at 100 hidden neurons. The MT data configuration with Mean-SD normalization and the PCA scheme is more robust and stable in training the MLP model for landslide prediction.
Keywords: Landslide; ANN; Hidden Neurons; Normalization; PCA; ROC; Hazard map
Copyright (c) 2021 Geosfera Indonesia and Department of Geography Education, University of Jember. This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
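A minimal sketch of the Mean-SD plus PCA preprocessing step described above, in plain NumPy. The function names are hypothetical and this is not the authors' pipeline; it only shows z-score normalisation followed by projection onto leading principal components to remove correlated information between factors.

```python
import numpy as np

def mean_sd_normalize(X):
    """Mean-SD (z-score) normalisation, one of the two schemes compared
    in the paper: zero mean, unit standard deviation per factor."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

def pca_transform(X, n_components):
    """Project onto the leading principal components (via SVD) so the
    retained scores are mutually uncorrelated."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

The decorrelated scores would then be fed to the ANN in place of the raw (correlated) landslide factors.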


2021 ◽  
Author(s):  
Jan von Harten ◽  
Florian Wellmann ◽  
Miguel de la Varga

Implicit methods have been the basis of many developments in 3-D structural geologic modeling. Typical input data for these types of models include surface points and orientations of geologic units, as well as the corresponding age relations (stratigraphic pile). In addition, the range of influence of input points needs to be defined, but it is difficult to infer a reasonable stationary estimate from data with highly variable configuration.

Often, this results in models that show artefacts due to data configuration, including oversimplified results (underfitting) in areas where data are missing, overcomplex results (overfitting) in areas of high data density, and geologically unreasonable surface shapes.

In this work we explore various methods to improve 3-D implicit geologic modeling by manipulating the data configuration using locally varying anisotropic kernels and kernel density estimation. In other words, the influence of input data in the interpolation is weighted based on directions and data density. Input parameters for these methods can be based on the original input data configuration, inferred from additional supportive data, or based on geologic expert knowledge. The proposed methods aim to increase model control while retaining the key advantages of implicit modeling.

Model improvements are shown using a set of typical geologic structures and regularly occurring artefacts. We compare the results to previously proposed methods that integrate anisotropies in traditional kriging applications and discuss the specific requirements for applicability in implicit structural geomodeling.
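The density-based weighting described above could look roughly like the following sketch, where input points in densely sampled regions are down-weighted by the inverse of a Gaussian kernel density estimate. The bandwidth, the 2-D setting, and the function names are assumptions for illustration, not the authors' method.

```python
import numpy as np

def kde_density(points, bandwidth=1.0):
    """Gaussian kernel density estimate at each input point (2-D),
    computed from pairwise squared distances."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return K.mean(axis=1)

def density_weights(points, bandwidth=1.0):
    """Down-weight points in densely sampled regions so that a tight
    cluster does not dominate the implicit interpolation."""
    w = 1.0 / kde_density(points, bandwidth)
    return w / w.sum()                 # normalize to sum to 1
```

A lone point far from a cluster receives a larger weight than any clustered point, which counteracts the overfitting-in-dense-regions artefact the abstract describes.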


2020 ◽  
Vol 1 (3) ◽  
pp. 98-105
Author(s):  
Vahid Kaviani J ◽  
Parvin Ahmadi Doval Amiri ◽  
Farsad Zamani Brujeni ◽  
Nima Akhlaghi

This paper reviews the types of data-modification attacks on computer systems and explores their vulnerabilities and mitigations. Altering information is a kind of cyber-attack in which intruders intercept, alter, steal, or erase critical data on PCs and applications, either through network exploits or by running malicious executable code on the victim's system. One of the most difficult and active areas in information security is protecting sensitive information and securing devices against any kind of threat. Recent advances in information technology reveal the huge budgets funded for and spent on developing and addressing security threats in order to mitigate them, across settings as varied as the military, business, science, and entertainment. Considering all these concerns, security issues almost always rank first among the most critical concerns of modern times. In fact, there is no ultimate security solution: although ongoing security analysis uncovers new vulnerabilities daily, attackers have ample motivation to spend billions of dollars finding vulnerabilities whose breach or exploitation lets them penetrate systems and networks and achieve their particular interests. In terms of modifying data and information, attacks from the old-fashioned to recent cyber ones share the same signature: either controlling data streams to breach system protections easily, or using non-control-data attack approaches. Both methods can severely damage applications that operate on decision-making data, user input data, configuration data, or user identity data. In this review paper, we set out the trends of vulnerabilities in applications of network protocols.


2020 ◽  
Vol 223 (1) ◽  
pp. 398-419
Author(s):  
Matthieu Plasman ◽  
Christel Tiberi ◽  
Cecilia Cadio ◽  
Anita Thea Saraswati ◽  
Gwendoline Pajot-Métivier ◽  
...  

SUMMARY The emergence of high-resolution satellite measurements of the gravitational field (GOCE mission) offers promising perspectives for the study of the Earth’s interior. These new data call for the development of innovative analysis and interpretation methods. Here we combine a forward prism computation with a Bayesian resolution approach to invert gravity gradient data. We apply and test our new method on a satellite data configuration, that is, 225 km altitude with a global and homogeneous geographic distribution. We first quantify the resolution of our method according to both data and parametrization characteristics. It appears that, for reasonable density contrast values (0.1 g cm−3), crustal structures have to be wider than ∼28 km to be detectable in the GOCE signal. Deeper bodies are only distinguishable at greater sizes (∼35 km at 50 km depth, ∼80 km at 300 km depth). We invert the six tensor components, of which five are independent. By carefully testing each of them and their different combinations, we highlight a trade-off between the recovery of the data and the sensitivity to inversion parameters. We discuss this characteristic particularly in terms of the geometry of the synthetic model tested (structure orientation, 3-D geometry, etc.). In terms of RMS value, each component is always better explained when inverted alone, but the result is strongly affected by the inversion parametrization (smoothing, variances, etc.). On the contrary, the simultaneous inversion of several components significantly improves the recovery of the global tensor, and depends more on the data than on the density variance or the smoothness control. Comparing gravity and gradient inversions, we highlight the superiority of the gravity gradient data in reproducing the structures, especially their vertical location. We successfully test our method on a realistic case of a complex subduction zone, for both gradient and gravity data.
While imaging small crustal structures requires a terrestrial gravity data set, the longest wavelengths of the slab are well recovered with both data sets. The precision and homogeneous coverage of the GOCE data, however, counterbalance the heterogeneous and often nonexistent coverage of terrestrial gravity data. This is particularly true in large areas that require a coherent assemblage of heterogeneous data sets, and in high-relief, vegetated, and offshore zones.
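The paper's forward model uses prisms; as a much simpler hedged sketch of a gravity gradient forward computation, the following replaces prisms with point masses and evaluates only the vertical gradient Tzz. The function name and all values are illustrative assumptions.

```python
import numpy as np

G_CONST = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def tzz_point_masses(obs, sources, masses):
    """Vertical gravity gradient Tzz = d(gz)/dz at observation points
    from point masses (a simplification of the prism forward model):
    Tzz = G * m * (3*dz^2 - r^2) / r^5."""
    obs = np.atleast_2d(obs)
    out = np.zeros(len(obs))
    for (sx, sy, sz), m in zip(sources, masses):
        dx = obs[:, 0] - sx
        dy = obs[:, 1] - sy
        dz = obs[:, 2] - sz
        r2 = dx**2 + dy**2 + dz**2
        out += G_CONST * m * (3.0 * dz**2 - r2) / r2**2.5
    return out
```

Directly above a buried mass the formula reduces to Tzz = 2*G*m/r^3, which is the quick sanity check used below; a prism code would integrate this kernel over the prism volume instead.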


2020 ◽  
Vol 245 ◽  
pp. 05026
Author(s):  
Chris Burr ◽  
Ben Couturier

GitLab’s Continuous Integration has proven to be an efficient tool for managing the lifecycle of experimental software. This has sparked interest in uses that exceed simple unit tests and therefore require more resources, such as production data configuration and physics data analysis. The default GitLab CI runner software is not appropriate for such tasks, and we show that it is possible to use the GitLab API and modern container orchestration technologies to build a custom CI runner that integrates with DIRAC, the middleware used by the LHCb experiment to run its jobs on the Worldwide LHC Computing Grid. This system allows for excellent utilisation of computing resources while also providing additional flexibility for defining jobs and providing authentication.
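A custom runner along these lines would poll GitLab's runner job-request endpoint. The sketch below only builds the request rather than sending it (no network call, no error handling); the instance URL is a placeholder and the payload is reduced to the runner token, so treat every detail here as an assumption rather than the authors' implementation.

```python
import json

# Placeholder GitLab instance; a real runner would read this from config.
GITLAB_URL = "https://gitlab.example.com"

def job_request(runner_token):
    """Build the HTTP POST a custom CI runner could send to ask GitLab
    for the next pending job (the endpoint used by gitlab-runner)."""
    return {
        "url": f"{GITLAB_URL}/api/v4/jobs/request",
        "body": json.dumps({"token": runner_token}),
        "headers": {"Content-Type": "application/json"},
    }
```

The custom runner described in the paper would then translate the returned job definition into a DIRAC job instead of executing it locally.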

