A software infrastructure for distributed data models in Cloud

Author(s):  
Bojan Jelacic ◽  
Srdan Vukmirovic ◽  
Sebastijan Stoja
F1000Research ◽  
2017 ◽  
Vol 6 ◽  
pp. 686 ◽  
Author(s):  
Ferdinando Villa ◽  
Stefano Balbi ◽  
Ioannis N. Athanasiadis ◽  
Caterina Caracciolo

Correct and reliable linkage of independently produced information is a requirement for sophisticated applications and processing workflows. These can ultimately help address the challenges posed by complex systems (such as socio-ecological systems), whose many components can only be described through independently developed data and model products. We discuss the first outcomes of an investigation into the conceptual and methodological aspects of semantic annotation of data and models, aimed at enabling a high standard of interoperability of information. The results, operationalized in the context of a long-term, active, large-scale project on ecosystem services assessment, include: a definition of interoperability based on semantics and scale; a conceptual foundation for the phenomenology underlying scientific observations, aimed at guiding the practice of semantic annotation in domain communities; and a dedicated language and software infrastructure that operationalizes the findings and allows practitioners to reap the benefits of data and model interoperability. The work presented is the first detailed description of almost a decade of work with communities active in socio-ecological system modeling. After defining the boundaries of possible interoperability based on the understanding of scale, we discuss examples of the practical use of the findings to obtain consistent, interoperable and machine-ready semantic specifications that can integrate semantics across diverse domains and disciplines.
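As a hedged illustration of what a machine-ready semantic specification might look like in practice, the sketch below pairs a data product with an observable concept and an explicit scale, and treats two products as interoperable only when their concepts match and their scales are compatible within a tolerance. All class names, concept strings and the tolerance rule are invented for this example; they are not taken from the project's actual annotation language.

```python
# Illustrative sketch only: semantic annotation = concept + scale,
# and interoperability is decided jointly by both (an assumption for this example).
from dataclasses import dataclass

@dataclass(frozen=True)
class Scale:
    spatial_resolution_m: float      # grid cell size in metres
    temporal_resolution_days: float  # time step in days

@dataclass(frozen=True)
class Annotation:
    concept: str                     # e.g. an observable such as "ecology:Evapotranspiration"
    scale: Scale

def interoperable(a: Annotation, b: Annotation, tolerance: float = 2.0) -> bool:
    """Same observable, and spatial scales within a tolerance factor of each other."""
    same_concept = a.concept == b.concept
    compatible_scale = (
        max(a.scale.spatial_resolution_m, b.scale.spatial_resolution_m)
        <= tolerance * min(a.scale.spatial_resolution_m, b.scale.spatial_resolution_m)
    )
    return same_concept and compatible_scale

model_output  = Annotation("ecology:Evapotranspiration", Scale(30.0, 1.0))
observed_data = Annotation("ecology:Evapotranspiration", Scale(50.0, 1.0))
print(interoperable(model_output, observed_data))   # True under the assumed tolerance
```

The point of the sketch is the shape of the check rather than the specific tolerance rule: interoperability is bounded jointly by semantics (the concept) and by scale, as the abstract describes.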


Author(s):  
Miguel Ángel Rodríguez ◽  
Alberto Fernández ◽  
Antonio Peregrín ◽  
Francisco Herrera

Author(s):  
Eduard Dadyan

The textbook examines and analyzes databases and DBMSs: data and computers, the database concept, DBMS architecture, infological, datalogical and physical data models, data types of logical data models, data representation using the Entity-Relationship model, Entity-Relationship diagrams, and data integrity. An overview of the notation used to build entity-relationship diagrams is given. Covered in detail are relational databases, operations on relational database tables, and the rules for deriving relational relationships from the entity-relationship model. The following topics are also presented in detail: fast data access tools, the SQL language, physical database organization, client-server architecture, distributed data processing and database server structure. The textbook also formulates the conceptual basis of the notion of "knowledge", setting out concepts and definitions of knowledge, knowledge bases, and models of knowledge representation. It presents the principles of building systems for data analysis (data warehouses) and the data models used in building data warehouses. The tutorial ends with a consideration of data protection issues.
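As a brief, hedged illustration of the entity-to-relation mapping rule the textbook describes (each entity becomes a table; a one-to-many relationship becomes a foreign key on the "many" side), the following Python/SQLite sketch uses invented table and column names; it is not taken from the textbook itself.

```python
# Minimal sketch of ER-to-relational mapping: two entities, one 1:N relationship.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE department (
        dept_id   INTEGER PRIMARY KEY,
        name      TEXT NOT NULL
    );
    CREATE TABLE employee (
        emp_id    INTEGER PRIMARY KEY,
        full_name TEXT NOT NULL,
        dept_id   INTEGER NOT NULL REFERENCES department(dept_id)  -- relationship as a foreign key
    );
""")
conn.executemany("INSERT INTO department VALUES (?, ?)",
                 [(1, "Research"), (2, "Sales")])
conn.executemany("INSERT INTO employee VALUES (?, ?, ?)",
                 [(10, "A. Ivanov", 1), (11, "B. Petrova", 2)])

# A join reconstructs the Entity-Relationship view from the two relational tables.
for row in conn.execute("""
        SELECT e.full_name, d.name
        FROM employee e JOIN department d ON e.dept_id = d.dept_id
        ORDER BY e.emp_id"""):
    print(row)
```

The join at the end recovers the Entity-Relationship view from the two normalized tables, which is the practical payoff of the mapping rule.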


2005 ◽  
Vol 4 (2) ◽  
pp. 393-400
Author(s):  
Pallavali Radha ◽  
G. Sireesha

A data distributor's task is to give sensitive data to a set of presumably trusted third-party agents. Due to data leakage, the data sent to these third parties can turn up in unauthorized places, such as the web or someone's private system. The distributor must then assess the likelihood that the data was leaked by one or more agents, as opposed to having been independently gathered by other means. Our new proposal on data allocation strategies improves the probability of identifying leakages. Security attacks typically result from unintended behaviors or invalid inputs, and because real-world programs must cope with a very large number of invalid inputs, security testing is labor-intensive; the most desirable thing is therefore to automate, or at least partially automate, the security-testing process. In this paper we present a Predicate/Transition nets approach for the automated generation of security tests from formal threat models, and we detect the leaking agents using allocation strategies that do not modify the original data. The guilty agent is the one who leaks the distributed data. To detect guilty agents more effectively, the idea is to distribute the data intelligently to agents based on sample data requests and explicit data requests. Fake-object implementation algorithms further improve the distributor's chance of detecting guilty agents.
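The following Python sketch is only a hedged illustration of the fake-object idea, not the paper's algorithm: every agent receives a sample of real records plus a fake record unique to that agent, and when a leaked set surfaces, the overlap with each agent's allocation (including its fake object) yields a simple guilt score. All names and the scoring rule are assumptions made for this example.

```python
# Toy guilt-detection sketch: unique fake objects act as per-agent watermarks.
import random

def allocate(records, agents):
    """Give every agent a sample of real records plus one unique, traceable fake record."""
    allocation = {}
    for i, agent in enumerate(agents):
        fake = f"FAKE-{i}"                               # watermark unique to this agent
        sample = random.sample(records, k=min(3, len(records)))
        allocation[agent] = set(sample) | {fake}
    return allocation

def guilt_scores(leaked, allocation):
    """Score each agent by the fraction of the leaked set drawn from its allocation."""
    return {agent: len(leaked & data) / len(leaked)
            for agent, data in allocation.items()}

records = [f"rec-{n}" for n in range(10)]
alloc = allocate(records, ["agent_A", "agent_B", "agent_C"])
leaked = set(alloc["agent_B"])                           # pretend agent_B leaked everything it had
scores = guilt_scores(leaked, alloc)
print(max(scores, key=scores.get))                       # agent_B stands out: its fake object appears
```

Because the fake object is handed to exactly one agent, its appearance in a leaked set is strong evidence against that agent even when the real records overlap across agents.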


Author(s):  
D. V. Gribanov

Introduction. This article is devoted to the legal regulation of digital asset turnover and the possibilities of using distributed computing and distributed data storage systems in the activities of public authorities and entities of public control. The author notes that some national and foreign scientists who study "blockchain" technology (distributed computing and distributed data storage systems) emphasize its usefulness in different activities. The data validation procedure for digital transactions and the legal regulation of the creation, issuance and turnover of digital assets need further attention.

Materials and methods. The research is based on general scientific methods (analysis, analogy, comparison) and particular methods of cognition of legal phenomena and processes (the method of interpretation of legal rules, the technical legal method, the formal legal method and the formal logical method).

Results of the study. The author's analysis found several advantages of the use of "blockchain" technology in the sphere of public control: a particular validation system; data that have once been entered into the distributed data storage system cannot be erased or forged; absolute transparency of the sequence of actions taken while exercising governing powers; and automatic repetition of recurring actions. The need for fivefold validation of the exercise of governing powers is substantiated. The author stresses that fivefold validation shall ensure complex control over the exercise of powers by civil society, the entities of public control and the Russian Federation as a federal state holding sovereignty over its territory. The author has also conducted a brief analysis of judicial decisions concerning digital transactions.

Discussion and conclusion. The use of a distributed data storage system makes control easier to exercise because it decreases the risks of forgery, replacement or deletion of data. The author suggests defining a digital transaction not only as actions with digital assets, but also as actions toward modifying and adding information about legal facts with the purpose of its establishment in distributed data storage systems. The author suggests using distributed data storage systems for independent validation of information about the activities of state authorities. In the author's opinion, application of "blockchain" technology may result not only in an increase in the efficiency of public control, but also in the creation of a new form of public control: automatic control. It is concluded that there is currently no legislative basis for the regulation of legal relations concerning distributed data storage.
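As a hedged technical aside, the tamper-evidence property the article relies on (data that have once been entered cannot be erased or forged unnoticed) can be shown with a minimal hash-chained log. The sketch below is an assumption-level toy in Python, not a description of any particular blockchain used by public authorities.

```python
# Toy append-only log: every entry commits to the hash of the previous one,
# so altering any record invalidates all later hashes held by independent validators.
import hashlib, json

def append(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev": prev_hash}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain):
    prev = "0" * 64
    for entry in chain:
        body = {"payload": entry["payload"], "prev": entry["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != digest:
            return False
        prev = entry["hash"]
    return True

log = []
append(log, "governing action #1 recorded")
append(log, "governing action #2 recorded")
print(verify(log))                      # True: the chain is consistent
log[0]["payload"] = "forged entry"
print(verify(log))                      # False: tampering is detected by re-validation
```

Independent validators re-running verify() over their own copies of the log is what makes unnoticed modification of earlier records impractical, which is the property underlying the automatic-control idea discussed above.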


Author(s):  
Sara Ferreira ◽  
Thiago RPM Rúbioa ◽  
João Jacob ◽  
Henrique Lopes Cardoso ◽  
Daniel Castro Silva ◽  
...  
