Developing a Common Global Framework for Marine Data Management

Author(s):  
Helen M. Glaves

The paradigm shift in marine research, moving from the traditional discipline-based methodology to a multidisciplinary, ecosystem-level approach, is being driven by changes both in the policies for the management and exploitation of the ocean and in the scientific method itself. The availability of large volumes of good-quality data is fundamental to this increasingly holistic approach to ocean research, but there are significant barriers to its re-use. The Ocean Data Interoperability Platform (ODIP) project has been funded in parallel by the European Commission, the National Science Foundation in the USA, and the Australian Government to promote the development of a common framework for marine data management that leverages the existing marine e-infrastructures created in response to the need for greater sharing of marine data at a regional level.


2020
Author(s):
Leda Pecci
Michele Fichaut
Dick Schaap

The pan-European SeaDataNet marine and ocean data infrastructure started in the early 2000s, by means of a European-funded project to create a framework for the management of large and diverse sets of data deriving from in situ measurements. It has been improved through successive European projects and represents the joint efforts of several marine institutes around the European and Mediterranean seas. The project currently improving the infrastructure is the SeaDataCloud Horizon 2020 project, which involves a network of 56 partners across 29 countries.

In line with its main objectives, the project designed and implemented actions that can spur a response at an international level, creating the basis to reinforce the pan-European SeaDataCloud community.

Information Technology (IT) has an important impact on how people work together. The SeaDataCloud project uses the following web communication tools:

- SeaDataNet website and Extranet;
- Partners' websites;
- Mailing lists;
- Electronic newsletters;
- Online educational materials;
- Videos and video tutorials;
- Twitter;
- Articles in e-journals.

Members of SeaDataCloud and of SeaDataNet I and II have had the opportunity for face-to-face meetings; the norm is to travel even for meetings of short duration. This investment in time and money allows direct contact between the partners of the projects, creating an opportunity for people across Europe to meet each other, to work together, and to speak openly.

The IMDIS (International Conference on Marine Data and Information Systems) conferences have been organized in the framework of the European-funded projects that have allowed the SeaDataNet infrastructure to be developed and upgraded. The meetings started in 2005 with the first conference organized in Brest (France), to share knowledge and best practices on marine data management. IMDIS is a unique platform with the following goals:

- Raise awareness of the SeaDataNet infrastructure, new developments, and standards;
- Share experiences in ocean data management;
- Enable synergies between data providers and data managers.

It has been a breeding ground for inspirational ideas: for example, the ODIP (Ocean Data Interoperability Platform) project, which led to its successor ODIP II, was conceived during one of the conferences. The challenge and objective of these projects was to find common interoperability solutions to problems in ocean data sharing, in collaboration with institutions from Europe, the USA, and Australia. The IMDIS series of conferences has thus represented an opportunity not only for knowledge exchange in ocean data management but has also led to significant results in terms of new synergies, making it possible to find new partners and projects.

The direct interactions during the meetings, as well as the online tools, have had a positive impact on reinforcing the development of a large SeaDataNet community across Europe and beyond.

The SeaDataCloud project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement Nº 730960.


Sensors
2019
Vol 19 (9)
pp. 1978
Author(s):
Argyro Mavrogiorgou
Athanasios Kiourtis
Konstantinos Perakis
Stamatios Pitsios
Dimosthenis Kyriazis

It is an undeniable fact that Internet of Things (IoT) technologies have become a milestone advancement in the digital healthcare domain, since the number of IoT medical devices has grown exponentially, and it is now anticipated that by 2020 there will be over 161 million of them connected worldwide. Therefore, in an era of continuous growth, IoT healthcare faces various challenges, such as the collection, the quality estimation, and the interpretation and harmonization of the data that derive from the existing huge numbers of heterogeneous IoT medical devices. Even though various approaches have been developed so far for solving each one of these challenges, none of them proposes a holistic approach for successfully achieving data interoperability between high-quality data deriving from heterogeneous devices. For that reason, this manuscript presents a mechanism for effectively addressing the intersection of these challenges. Through this mechanism, the datasets of the different devices are first collected and then cleaned. Subsequently, the cleaning results are used to capture the overall data quality level of each dataset, in combination with measurements of the availability and the reliability of the device that produced it. Consequently, only the high-quality data is kept and translated into a common format, ready for further use. The proposed mechanism is evaluated through a specific scenario, producing reliable results, achieving data interoperability of 100% accuracy, and data quality of more than 90% accuracy.
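
The abstract describes a pipeline of collection, cleaning, quality scoring (combined with device availability and reliability), filtering, and translation to a common format. The paper's actual algorithms are not given here, so the following is only a minimal Python sketch of that flow; the dataset structure, the quality-scoring weights, and the `to_common_format` mapping are all illustrative assumptions, not the authors' method.

```python
from dataclasses import dataclass

@dataclass
class DeviceDataset:
    device_id: str
    records: list[dict]      # raw readings from one IoT device
    availability: float      # fraction of time the device was reachable (0..1)
    reliability: float       # historical rate of valid readings (0..1)

def clean(ds: DeviceDataset) -> tuple[list[dict], float]:
    """Drop records with missing values; return kept records and the kept ratio."""
    kept = [r for r in ds.records if all(v is not None for v in r.values())]
    completeness = len(kept) / len(ds.records) if ds.records else 0.0
    return kept, completeness

def quality_score(completeness: float, ds: DeviceDataset) -> float:
    # Illustrative weighting of cleaning outcome, availability, and reliability;
    # the weights are assumptions, not taken from the paper.
    return 0.5 * completeness + 0.25 * ds.availability + 0.25 * ds.reliability

def to_common_format(record: dict, device_id: str) -> dict:
    # Hypothetical harmonization step: rename fields to a shared schema.
    return {"source": device_id,
            "timestamp": record.get("ts"),
            "value": record.get("val")}

def harmonize(datasets: list[DeviceDataset], threshold: float = 0.9) -> list[dict]:
    """Keep only datasets whose quality score passes the threshold."""
    output = []
    for ds in datasets:
        kept, completeness = clean(ds)
        if quality_score(completeness, ds) >= threshold:
            output.extend(to_common_format(r, ds.device_id) for r in kept)
    return output
```

The 0.9 threshold mirrors the "more than 90%" quality figure in the abstract, but how the paper actually combines the three measurements is not specified there.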


2017
Vol 4 (1)
pp. 25-31
Author(s):
Diana Effendi

The Information Product Approach (IP Approach) is an information management approach that can be used to manage product information and analyze data quality. An IP-Map can be used by organizations to facilitate the management of knowledge in collecting, storing, maintaining, and using data in an organized manner. The data management process for academic activities at X University has not yet used the IP approach: X University has paid no attention to the quality of its information management, concerning itself so far only with the system applications used to support the automation of data management in its academic activities. The IP-Map produced in this paper can be used as a basis for analyzing the quality of data and information. Through the IP-Map, X University is expected to learn which parts of the process need improvement in the quality of data and information management. Index terms: IP Approach, IP-Map, information quality, data quality.
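
An IP-Map models the "manufacture" of an information product as a directed graph of typed construct blocks (data sources, processing steps, storage, quality checks, consumers). The sketch below encodes a hypothetical fragment of an academic data flow in Python; it illustrates only the idea of the notation, since the paper's actual IP-Map for X University is not reproduced in the abstract.

```python
from dataclasses import dataclass, field

# A subset of the standard IP-Map construct block types.
BLOCK_TYPES = {"source", "process", "storage", "quality_check", "consumer"}

@dataclass
class Block:
    name: str
    kind: str  # one of BLOCK_TYPES

    def __post_init__(self):
        assert self.kind in BLOCK_TYPES, f"unknown block type: {self.kind}"

@dataclass
class IPMap:
    blocks: dict[str, Block] = field(default_factory=dict)
    edges: list[tuple[str, str]] = field(default_factory=list)  # data flows

    def add(self, name: str, kind: str) -> None:
        self.blocks[name] = Block(name, kind)

    def flow(self, src: str, dst: str) -> None:
        self.edges.append((src, dst))

# Hypothetical fragment of an academic-records information product.
ipmap = IPMap()
ipmap.add("student_forms", "source")
ipmap.add("enrolment_entry", "process")
ipmap.add("validate_records", "quality_check")
ipmap.add("academic_db", "storage")
ipmap.add("transcript_report", "consumer")
for a, b in [("student_forms", "enrolment_entry"),
             ("enrolment_entry", "validate_records"),
             ("validate_records", "academic_db"),
             ("academic_db", "transcript_report")]:
    ipmap.flow(a, b)
```

Walking such a graph makes it easy to ask the question the abstract raises: which blocks on the path to a given information product lack a quality check.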


2014
Vol 668-669
pp. 1374-1377
Author(s):
Wei Jun Wen

ETL refers to the process of data extraction, transformation, and loading, and is deemed a critical step in ensuring the quality, specification, and standardization of marine environmental data. Marine data, owing to their complexity, field diversity, and huge volume, remain decentralized, polyphyletic, and isomerous, with differing semantics, and are hence far from able to provide effective data sources for decision making. ETL enables the construction of a marine environmental data warehouse through the cleaning, transformation, integration, loading, and periodic updating of basic marine data. The paper presents research on rules for the cleaning, transformation, and integration of marine data, on the basis of which an original ETL system for a marine environmental data warehouse is designed and developed. The system further guarantees data quality and correctness for future analysis and decision-making based on marine environmental data.
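
The abstract describes rule-based cleaning, transformation, integration, and loading into a warehouse, but the actual rule set is not given. The following minimal Python sketch illustrates such an ETL flow under assumed rules (plausible range checks on temperature and salinity) and an assumed field-name mapping; none of it is taken from the paper.

```python
import sqlite3

# Illustrative cleaning rules; each rule returns True when a record should be kept.
CLEANING_RULES = [
    lambda r: r.get("temperature_c") is not None,
    lambda r: -2.0 <= r["temperature_c"] <= 40.0,       # plausible seawater range
    lambda r: 0.0 <= r.get("salinity_psu", 0.0) <= 42.0,
]

def extract(raw_rows):
    """Extract: map heterogeneous field names onto one schema (assumed mapping)."""
    aliases = {"temp": "temperature_c", "sal": "salinity_psu", "t": "obs_time"}
    for row in raw_rows:
        yield {aliases.get(k, k): v for k, v in row.items()}

def transform(rows):
    """Transform: keep only records that pass every cleaning rule."""
    for r in rows:
        if all(rule(r) for rule in CLEANING_RULES):
            yield r

def load(rows, conn):
    """Load: append cleaned records into the warehouse fact table."""
    conn.execute("""CREATE TABLE IF NOT EXISTS ocean_obs
                    (obs_time TEXT, temperature_c REAL, salinity_psu REAL)""")
    conn.executemany(
        "INSERT INTO ocean_obs VALUES (:obs_time, :temperature_c, :salinity_psu)",
        list(rows))
    conn.commit()

raw = [{"t": "2014-01-01T00:00Z", "temp": 12.3, "sal": 35.1},
       {"t": "2014-01-01T01:00Z", "temp": 99.0, "sal": 35.0}]  # outlier is dropped
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
```

Periodic updating, which the abstract also mentions, would amount to re-running this pipeline on new source extracts on a schedule.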


2017
Vol 47 (1)
pp. 46-55
Author(s):
S Aqif Mukhtar
Debbie A Smith
Maureen A Phillips
Maire C Kelly
Renate R Zilkens
...  

Background: The Sexual Assault Resource Center (SARC) in Perth, Western Australia provides free 24-hour medical, forensic, and counseling services to persons aged over 13 years following sexual assault. Objective: The aim of this research was to design a data management system that maintains accurate, high-quality information on all sexual assault cases referred to SARC, facilitating audit and peer-reviewed research. Methods: The work to develop the SARC Medical Services Clinical Information System (SARC-MSCIS) took place during 2007–2009 as a collaboration between SARC and Curtin University, Perth, Western Australia. Patient demographics, assault details (including injury documentation), and counseling sessions were identified as core data sections. A user authentication system was set up for data security, and data quality checks were incorporated to ensure high-quality data. Results: The SARC-MSCIS was developed, containing three core data sections with 427 data elements to capture patient data. Development of the SARC-MSCIS has resulted in a comprehensive capacity to support sexual assault research. Four additional projects are underway to explore both the public health and criminal justice considerations in responding to sexual violence. The data showed that 1,933 sexual assault episodes had occurred among 1,881 patients between January 1, 2009 and December 31, 2015. Sexual assault patients knew the assailant as a friend, carer, acquaintance, relative, partner, or ex-partner in 70% of cases, while the assailant was a stranger in 16% of cases. Conclusion: This project has resulted in the development of a high-quality data management system to maintain information for the medical and forensic services offered by SARC. The system has also proven to be a reliable resource enabling research in the area of sexual violence.
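
The abstract mentions built-in data quality checks but does not describe them. Purely as an illustration of the idea, a field-level check over a patient record might look like the sketch below; the field names and rules are hypothetical, not SARC-MSCIS's actual checks.

```python
from datetime import date

def check_record(record: dict) -> list[str]:
    """Return a list of data-quality violations for one patient record."""
    errors = []
    # Completeness: required fields must be present (hypothetical field names).
    for required in ("patient_id", "episode_date", "age_years"):
        if record.get(required) is None:
            errors.append(f"missing required field: {required}")
    # Validity: mirror the service's stated eligibility (persons aged over 13).
    if record.get("age_years") is not None and record["age_years"] <= 13:
        errors.append("service is restricted to persons aged over 13 years")
    # Plausibility: episode dates cannot lie in the future.
    ep = record.get("episode_date")
    if ep is not None and ep > date.today():
        errors.append("episode_date lies in the future")
    return errors

print(check_record({"patient_id": "P001",
                    "episode_date": date(2015, 6, 1),
                    "age_years": 25}))   # -> []
```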


2021
pp. 147807712110121
Author(s):
Adam Tamas Kovacs
Andras Micsik

This article discusses a BIM Quality Control Ecosystem based on Requirement Linked Data, intended to create a framework in which automated BIM compliance checking methods can be widely used. The meaning of requirements is analyzed in a building-project context as a basis for data flow analysis: what the main types of requirements are, how they are handled, and what sources they originate from. A literature review was conducted to identify current development directions in quality checking, alongside market research on existing, widely used solutions. From the conclusions of this research and modern data management theory, the principles of a holistic approach to quality checking in the Architecture, Engineering and Construction (AEC) industry are defined. A comparative analysis of current BIM compliance checking solutions was made according to these review principles. Based on current practice and ongoing research, a state-of-the-art BIM quality control ecosystem is proposed that is open, enables automation, promotes interoperability, and leaves the data-governing responsibility at the sources of the requirements. To facilitate the flow of requirement and quality data, a model is proposed for requirements as Linked Data, with an example of quality checking using the Shapes Constraint Language (SHACL). As a result, an opportunity is created for better-quality and cheaper BIM design methods to be implemented in the industry.
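
The article's own SHACL example is not reproduced in the abstract, so the following is only a generic illustration of the technique it names: a toy requirement ("every door must have a fire rating") expressed as a SHACL shape and validated with the pySHACL library. The ex: vocabulary is invented for this sketch.

```python
from rdflib import Graph
from pyshacl import validate  # pip install pyshacl

# Toy building-element data in Turtle.
data = """
@prefix ex: <http://example.org/bim#> .
ex:door1 a ex:Door ; ex:fireRating "EI30" .
ex:door2 a ex:Door .
"""

# A hypothetical requirement as a SHACL shape: every Door needs one fireRating.
shapes = """
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex: <http://example.org/bim#> .
ex:DoorShape a sh:NodeShape ;
    sh:targetClass ex:Door ;
    sh:property [
        sh:path ex:fireRating ;
        sh:minCount 1 ;
        sh:datatype xsd:string ;
    ] .
"""

data_g = Graph().parse(data=data, format="turtle")
shapes_g = Graph().parse(data=shapes, format="turtle")
conforms, _, report = validate(data_g, shacl_graph=shapes_g)
print(conforms)  # False: ex:door2 lacks a fire rating
print(report)
```

Keeping each requirement as a shape like this leaves the data-governing responsibility with the requirement's source, which is the design principle the article argues for.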


Author(s):  
Akriti Mishra ◽  
Kamini Mishra ◽  
Dipayan Bose ◽  
Abhijit Chakrabarti ◽  
Puspendu Kumar Das

The characterization of the nanoparticle protein corona has gained tremendous importance lately. The parameters that quantitatively establish a specific nanoparticle–protein interaction need to be measured accurately, since good-quality data is necessary...

