legacy data
Recently Published Documents

TOTAL DOCUMENTS: 284 (FIVE YEARS: 114)
H-INDEX: 17 (FIVE YEARS: 5)

Author(s):  
Daniel Stansbie

Is a big data analytical approach viable using archaeological artefact and ecofact data? In particular, is it possible to use Bowker and Star's (1999) concept of the 'boundary object' to manage the issues of data scale, complexity, diversity and variable information standards that arise when attempting this kind of research? This paper reviews the theoretical and methodological debates around archaeological big data as they bear on research into assemblages of artefacts and ecofacts, and presents a methodology for the construction and use of a large archaeological database of legacy artefact and ecofact data, created as part of the English Landscapes and Identities Project at the University of Oxford.
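The boundary-object idea in the abstract can be sketched in code: a minimal shared schema that heterogeneous legacy recording systems can all map into, while each retains its own richer local form. The field and column names below are purely illustrative assumptions, not those used by the EngLaId project.

```python
from dataclasses import dataclass

# Hypothetical minimal "boundary object" schema: the few fields that
# differently structured legacy datasets can all agree on.
@dataclass
class AssemblageRecord:
    site: str
    period: str      # broad chronological label
    category: str    # e.g. "pottery", "animal bone"
    count: int

def from_pottery_row(row: dict) -> AssemblageRecord:
    """Map a (hypothetical) pottery-specialist record to the shared schema."""
    return AssemblageRecord(
        site=row["site_code"],
        period=row["ceramic_phase"],
        category="pottery",
        count=int(row["sherd_count"]),
    )

def from_ecofact_row(row: dict) -> AssemblageRecord:
    """Map a (hypothetical) animal-bone record to the shared schema."""
    return AssemblageRecord(
        site=row["SITE"],
        period=row["broad_period"],
        category="animal bone",
        count=int(row["NISP"]),
    )

records = [
    from_pottery_row({"site_code": "ABC01", "ceramic_phase": "Roman", "sherd_count": "412"}),
    from_ecofact_row({"SITE": "ABC01", "broad_period": "Roman", "NISP": "88"}),
]
```

The point of the boundary object is that both specialist datasets stay intact; only the thin shared layer is standardised, which is what makes cross-assemblage queries at scale tractable.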


2021
Vol 5 (1)
pp. 70
Author(s):  
Vagia Makri
George Panagopoulos
Konstantinos Nikolaou
Spyridon Bellas
Nikos Pasadakis

It is evident that the increased focus on energy transition will increase the demand for gas, the transitional fuel on the way to the net-zero CO2 emission era. The West Katakolo field is the only oil and gas discovery in Western Greece and is operated by Energean. The three offshore West Katakolo wells have defined both the oil and the gas zones, while onshore exploration wells have penetrated biogenic gas-saturated Plio-Pleistocene sands. This study assesses the gas generation potential of the local Plio-Pleistocene and Triassic sources using thermal maturity modelling based on the available legacy data, with the data's limitations addressed by running several case scenarios. In conclusion, the study supports the generation of thermogenic gas from the Triassic sources and biogenic gas from the Plio-Pleistocene sources, demonstrating the value of maturity modelling in hydrocarbon exploration as applied to the Katakolo case: a potential gas source to facilitate the energy transition in Greece.
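The "case scenarios" strategy mentioned above — varying the uncertain inputs that legacy data leave unconstrained — can be illustrated with a toy calculation. This is only a linear-geotherm sketch; the depth, surface temperature, and gradient values are assumptions for illustration and have nothing to do with Energean's actual basin model.

```python
SURFACE_T_C = 15.0  # assumed mean surface temperature, deg C

def temperature_at_depth(depth_m: float, gradient_c_per_km: float) -> float:
    """Linear geotherm: T = T_surface + gradient * depth."""
    return SURFACE_T_C + gradient_c_per_km * depth_m / 1000.0

depth_m = 4000.0  # hypothetical depth of a Triassic source interval
# One scenario per assumed geothermal gradient, since the legacy
# data do not pin the gradient down to a single value.
for gradient in (20.0, 25.0, 30.0):
    t = temperature_at_depth(depth_m, gradient)
    print(f"gradient {gradient:.0f} C/km -> {t:.0f} C at {depth_m:.0f} m")
```

Running such scenario sweeps through a full maturity model (rather than this linear toy) is what lets the study bracket the thermogenic versus biogenic contributions despite the legacy-data gaps.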


2021
Author(s):  
Saif Ali Al Mesaabi
Guillaume Marie Cambois
James Cowell
David Arnold
Mohamed Fawzi Boukhanfra
...  

Abstract In 2017 ADNOC decided to cover the entire Abu Dhabi Emirate, onshore and offshore, with high-resolution, high-fold 3D seismic. Acquisition of the world's largest continuous seismic survey started in late 2018 and is around 77% complete at the time of writing. Data processing is well under way, and interpretation of the first delivered 3D cubes is ongoing. Now is an opportune time to review the status of this gigantic project and draw preliminary lessons. Comparison with legacy data shows a massive improvement in deep imaging, which was one of the main objectives of this survey. The basement can now be clearly interpreted, whereas on legacy data it is hardly visible, obscured by high-energy multiples. A thorough analysis demonstrated that increased offset is the main reason for this uplift. The large fold and the low-frequency sweep also help recover signal down to 3 Hz. This extends the bandwidth at the low end by one to two octaves compared with legacy data, which tremendously benefits structural interpretation and stratigraphic inversion.
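The "one to two octaves" claim is simple arithmetic: each halving of the lowest recovered frequency adds one octave. The abstract gives only the new 3 Hz limit, so the legacy low-frequency limits below (6 Hz and 12 Hz) are assumed values chosen to show how the octave count follows.

```python
import math

def octaves_gained(legacy_low_hz: float, new_low_hz: float) -> float:
    """Octaves added at the low end when the lowest recovered
    frequency drops from legacy_low_hz to new_low_hz."""
    return math.log2(legacy_low_hz / new_low_hz)

# New survey recovers signal down to 3 Hz; legacy limits are assumed.
print(octaves_gained(6.0, 3.0))   # halving once  -> 1 octave
print(octaves_gained(12.0, 3.0))  # halving twice -> 2 octaves
```

Low-end octaves matter disproportionately for stratigraphic inversion because the low frequencies constrain the absolute impedance trend that a band-limited legacy dataset cannot recover.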


Author(s):  
L. H. Hansen
R. van Son
A. Wieser
E. Kjems

Abstract. In this paper we address the issue of unreliable subsurface utility information. Data on subsurface utilities are often positionally inaccurate, out of date, and incomplete, leading to increased uncertainty, costs, and delays in underground-related projects. Despite opportunities for improvement, the quality of legacy data remains unaddressed. We tackle the legacy data issue by arguing for an approach to subsurface utility data reconciliation that relies on the integration of heterogeneous data sources. These sources can be collected at opportunities that occur throughout the life cycle of subsurface utilities and include as-built GIS records, GPR scans, and 3D scans of open excavations. By integrating legacy data with newly captured data, it is possible to verify, (re)classify and update the data and improve it for future use. To demonstrate the potential of an integration-driven data reconciliation approach, we present real-world use cases from Denmark and Singapore. From these cases we identified challenges to implementing the approach, including a lack of technological readiness, a lack of incentive to capture and share the data, increased cost, and data-sharing concerns. Future research should investigate in detail how the various data sources improve data quality, develop a data model that brings together all the data sources needed for integration, and establish a framework for governance and master data management so that roles and responsibilities can feasibly be enacted.
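The reconciliation step described above can be sketched as a comparison between a legacy as-built record and a newer field observation, keeping whichever source claims the lower positional uncertainty. The record shape, field names, and accuracy values are illustrative assumptions, not the authors' data model.

```python
from dataclasses import dataclass

# Hypothetical record shape; real utility registers use far richer models.
@dataclass
class UtilityRecord:
    pipe_id: str
    x: float
    y: float
    accuracy_m: float  # estimated positional uncertainty
    source: str        # e.g. "as-built GIS", "GPR", "open-excavation scan"

def reconcile(legacy: UtilityRecord, observed: UtilityRecord) -> UtilityRecord:
    """Keep the source with the lower positional uncertainty.
    A real reconciliation would also log provenance and flag conflicts."""
    if observed.pipe_id != legacy.pipe_id:
        raise ValueError("records describe different assets")
    return observed if observed.accuracy_m < legacy.accuracy_m else legacy

legacy = UtilityRecord("W-104", 5020.1, 883.4, accuracy_m=2.0, source="as-built GIS")
gpr    = UtilityRecord("W-104", 5021.6, 882.9, accuracy_m=0.3, source="GPR")
best = reconcile(legacy, gpr)
```

The design choice worth noting is that the legacy record is never discarded: the integration keeps all sources and promotes the best one, which is what enables later (re)classification when yet another capture opportunity arises.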


2021
Vol 126 (10)
Author(s):  
Shelby A. Jones
Eric Blinman
Lisa Tauxe
J. Royce Cox
Stacey Lengyel
...  

2021
pp. 000-000
Author(s):  
Raymond B. Huey
Donald B. Miles
Eric R. Pianka

2021
Author(s):  
Yinbin Miao
Aaron Oaks
Abdellatif Yacout
Christopher Matthews
Stephen Novascone

2021
Author(s):  
Hannes Ulrich
Paul Behrend
Joshua Wiedekopf
Cora Drenkhahn
Ann-Kristin Kock-Schoppenhauer
...  

With the steady increase in the connectivity of the healthcare system, new requirements and challenges are emerging. In addition to the seamless exchange of data between service providers at the national level, local legacy data must also meet the new requirements. For this purpose, the applications used must be tested securely and thoroughly. However, suitable and realistic test data are not always available. This study therefore deals with the creation of test data based on real electronic health record data from the Medical Information Mart for Intensive Care (MIMIC-IV) database. In addition to converting the data to the current FHIR R4 standard, a conversion to the core data sets of the German Medical Informatics Initiative was also presented and made available. The test data were generated to simulate a legacy data transfer, and four different FHIR servers were benchmarked for performance. This study is a first step toward comparable test scenarios built on shared datasets and promotes comparability among providers at the national level.
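The kind of conversion the study describes can be sketched as a mapping from a MIMIC-IV-style patients row onto a minimal FHIR R4 Patient resource. Only a few fields are shown; the mapping function and the sample row are illustrative assumptions, and the study's actual mapping to FHIR R4 and the MII core data sets is far more extensive.

```python
import json

def mimic_row_to_fhir_patient(row: dict) -> dict:
    """Map a MIMIC-IV-style patients row onto a minimal FHIR R4
    Patient resource (sketch; field selection is illustrative)."""
    return {
        "resourceType": "Patient",
        "id": str(row["subject_id"]),
        "gender": {"M": "male", "F": "female"}.get(row["gender"], "unknown"),
        # MIMIC date-shifts real timelines, so only a year is carried over.
        "birthDate": str(row["anchor_year"]),
    }

row = {"subject_id": 10000032, "gender": "F", "anchor_year": 2180}
resource = mimic_row_to_fhir_patient(row)
print(json.dumps(resource, indent=2))
```

Generating test resources this way keeps them realistic (derived from real EHR distributions) while the date-shifting already built into MIMIC-IV preserves de-identification, which is exactly what a simulated legacy data transfer needs.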

