Dealing with Data: A Case Study on Information and Data Management Literacy

2015 ◽  
pp. 3-12
Author(s):  
Melissa Haendel ◽  
Nicole Vasilevsky ◽  
Jacqueline Wirz
Author(s):  
Nilo Legowo ◽  
Gunawan Wang ◽  
Sabiq Adzhani Hammam ◽  
Wirianto ◽  
Ali Gunawan ◽  
...  

2007 ◽  
Vol 40 (4) ◽  
pp. 2070 ◽  
Author(s):  
S. Vassilopoulou ◽  
K. Chousianitis ◽  
V. Sakkas ◽  
B. Damiata ◽  
E. Lagios

The present study is concerned with the management of multi-thematic geo-data of Cephallonia Island related to crustal deformation. A large amount of heterogeneous data (vector, raster and ASCII files) covering geology, tectonics, topography, geomorphology and DGPS measurements was compiled. Crustal deformation was studied using a GPS network of 23 stations, which was installed and first measured in October 2001, re-measured in September 2003 following the Lefkas earthquake of August 2003 (Mw=6.2), and measured again in July 2006. Through spatial analysis, a large number of thematic and synthetic layers and maps were produced. In parallel, a GIS database was organized so that conclusions on specific questions could be extracted easily.
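The layer catalogue behind such a GIS database can be sketched as a small relational table. This is a minimal illustration only: the table layout, layer names and field names below are assumptions for the example, not the study's actual database schema.

```python
import sqlite3

# Hypothetical layer catalogue for heterogeneous geo-data; all names are
# illustrative, not taken from the study's actual GIS database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE layers (
        name   TEXT PRIMARY KEY,
        theme  TEXT,   -- geology, tectonics, topography, deformation, ...
        fmt    TEXT,   -- vector, raster, or ascii
        survey TEXT    -- measurement campaign the layer belongs to, if any
    )
""")
conn.executemany(
    "INSERT INTO layers VALUES (?, ?, ?, ?)",
    [
        ("faults",   "tectonics",   "vector", None),
        ("dem",      "topography",  "raster", None),
        ("gps_2001", "deformation", "ascii",  "Oct 2001"),
        ("gps_2003", "deformation", "ascii",  "Sep 2003"),
        ("gps_2006", "deformation", "ascii",  "Jul 2006"),
    ],
)

# Answer a specific question: which layers cover crustal deformation?
rows = conn.execute(
    "SELECT name FROM layers WHERE theme = 'deformation' ORDER BY name"
).fetchall()
print([r[0] for r in rows])  # → ['gps_2001', 'gps_2003', 'gps_2006']
```

A catalogue like this lets thematic queries ("all deformation layers", "all raster layers") drive map production instead of manual file browsing.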


2021 ◽  
Vol 16 (1) ◽  
pp. 11
Author(s):  
Klaus Rechert ◽  
Jurek Oberhauser ◽  
Rafael Gieschke

Software, and in particular source code, has become an important component of scientific publications and is therefore now a subject of research data management. Maintaining source code so that it remains a usable and valuable scientific contribution is a huge task, and not all code contributions can be actively maintained forever. Eventually, there will be a significant backlog of legacy source code. In this article we analyse the requirements for applying the concept of long-term reusability to source code. We use a simple case study to identify gaps, and we provide a technical infrastructure based on emulation to support automated builds of historic software from source code.
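One ingredient of such an infrastructure is mapping a legacy package's declared language and era to a preserved, emulated build environment. The sketch below illustrates the idea only; the metadata fields and environment image names are invented for this example and are not the authors' actual infrastructure.

```python
# Hypothetical registry of preserved build environments, keyed by the
# (language, era) metadata declared for a legacy source-code package.
LEGACY_ENVIRONMENTS = {
    ("c", "1995"):       "emulated-linux-1.2-gcc-2.7",
    ("c", "2005"):       "emulated-linux-2.6-gcc-3.4",
    ("fortran", "1990"): "emulated-sunos-4-f77",
}

def select_environment(language: str, era: str) -> str:
    """Map a package's declared language and era to an emulated
    build-environment image, or fail loudly if none is preserved."""
    key = (language.lower(), era)
    if key not in LEGACY_ENVIRONMENTS:
        raise LookupError(f"no preserved environment for {key}")
    return LEGACY_ENVIRONMENTS[key]

print(select_environment("C", "1995"))  # → emulated-linux-1.2-gcc-2.7
```

Failing loudly on an unknown combination is deliberate: a build that silently falls back to a modern toolchain would defeat the purpose of reproducing the historic build.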


2020 ◽  
Vol 2 (2) ◽  
pp. 47-61
Author(s):  
Daniel Adityatama ◽  
◽  
Rizky Mahardhika ◽  
Dorman Purba ◽  
Farhan Muhammad ◽  
...  

Drilling is one of the major cost components in geothermal exploration and development, and effective, cost-efficient drilling contributes significantly to the success of geothermal development. Key factors in reducing drilling costs are optimising operations, utilising manpower to its fullest potential, and benchmarking against other drilling activities to evaluate one's own performance objectively. This is possible only if information about previous drilling activities is stored, easily gathered, and analysed before plans for the next drilling campaign are made. The importance of drilling data analysis and drilling data management has been a subject of study and discussion since the 1980s, but it is still not common in geothermal drilling, especially in Indonesia. The purpose of this paper is to summarise, from various past studies, the definition and examples of drilling data management in more established industries such as oil and gas; to assess the advantages of a proper drilling database or data management system; and to show how the data can potentially be used to improve future drilling operations. A case study of converting legacy data from previous drilling campaigns in two geothermal fields in Java into a database is also discussed, demonstrating how legacy drilling data can be used to evaluate drilling performance.
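Once legacy drilling records are in a database, performance benchmarking reduces to computing metrics per well and comparing them. The sketch below shows this for a standard metric, rate of penetration (ROP); the record layout and well names are invented for the example, not the paper's actual schema or data.

```python
# Illustrative legacy drilling record; field names are assumptions for
# this example, not the paper's actual database schema.
from dataclasses import dataclass

@dataclass
class BitRun:
    well: str
    depth_in_m: float    # measured depth at start of the bit run
    depth_out_m: float   # measured depth at end of the bit run
    hours: float         # rotating hours for the run

def rate_of_penetration(run: BitRun) -> float:
    """Average ROP in metres per hour for one bit run."""
    return (run.depth_out_m - run.depth_in_m) / run.hours

runs = [
    BitRun("WJ-1", 0.0, 350.0, 50.0),
    BitRun("WJ-1", 350.0, 1200.0, 170.0),
    BitRun("WJ-2", 0.0, 400.0, 40.0),
]

# Benchmark wells against each other using the converted legacy data.
by_well: dict[str, list[float]] = {}
for run in runs:
    by_well.setdefault(run.well, []).append(rate_of_penetration(run))

for well, rops in sorted(by_well.items()):
    print(well, round(sum(rops) / len(rops), 1))
# → WJ-1 6.0
# → WJ-2 10.0
```

The same pattern extends to other key performance indicators (flat time, cost per metre, non-productive time) once the legacy records carry those fields.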


2016 ◽  
Author(s):  
Alfred Enyekwe ◽  
Osahon Urubusi ◽  
Raufu Yekini ◽  
Iorkam Azoom ◽  
Oloruntoba Isehunwa

ABSTRACT Significant emphasis on data quality is placed on real-time drilling data for the optimization of drilling operations, and on logging data for quality lithological and petrophysical description of a field. This is evidenced by the huge sums spent on real-time MWD/LWD tools, broadband services, wireline logging tools, etc. However, much more needs to be done to harness quality data for future workover and abandonment operations, where the data being relied on may have been entered decades earlier and where costs and time are critically linked to already known and certified information. In some cases, the data relied on has been migrated across different data management platforms, during which relevant data may have been lost, misinterpreted or misplaced. Another common cause of wrong data is improperly documented well intervention operations that were executed so quickly that there was no pressure to document them properly. This leads to confusion over simple issues such as the depth at which a plug was set, or what junk was left in the hole. The relative lack of emphasis on this type of data quality has led to high costs for workover and abandonment operations; in some cases, well control incidents and process safety incidents have arisen. This paper looks at over 20 workover operations carried out over a span of 10 years. The wells' original timelines of operation are analysed, the data management system is assessed, and the issues experienced during the workover operations are categorized. Bottlenecks in data management are defined, and the solutions currently being implemented to manage these problems are listed as recommended good practices.
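Catching the record-quality gaps described above (an unrecorded plug depth, unknown junk-in-hole status) lends itself to simple rule-based validation at data entry or migration time. The sketch below is a minimal illustration; the field names and rules are invented for the example, not the operator's actual system.

```python
# Hedged sketch of rule-based data-quality checks on legacy well
# intervention records; field names and rules are illustrative only.
def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality issues found in one record."""
    issues = []
    if not record.get("plug_depth_m"):
        issues.append("plug depth not recorded")
    if record.get("junk_in_hole") is None:
        issues.append("junk-in-hole status unknown")
    start, end = record.get("start_date"), record.get("end_date")
    if start and end and end < start:   # ISO dates compare as strings
        issues.append("end date precedes start date")
    return issues

record = {"well": "A-12", "plug_depth_m": None, "junk_in_hole": None,
          "start_date": "2014-03-02", "end_date": "2014-03-01"}
print(validate_record(record))
```

Running such checks before a workover is planned surfaces exactly the gaps that otherwise turn into on-site surprises.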


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Elizaveta Gavrikova ◽  
Irina Volkova ◽  
Yegor Burda

Purpose
The purpose of this paper is to design a framework for asset data management in power companies. The authors consider asset data management from a strategic perspective, linking operational-level data with corporate strategy and taking into account the organizational context and stakeholder expectations.
Design/methodology/approach
The authors conducted a multiple case study based on a literature review and three series of in-depth interviews with experts from three Russian electric power companies.
Findings
The main challenge in asset data management for electric power companies is the increasing amount and complexity of asset data, which is frequently incomplete or inaccurately collected, hard to translate into managerial language, and focused primarily on the operational level. Such a fragmented approach negatively affects strategic decision-making. The proposed framework introduces a holistic approach, provides context and accountability for decision-making, and attributes data flows, roles and responsibilities to different management levels.
Research limitations/implications
The limitations of the study lie in the exploratory nature of case study research and the limited generalizability of the observed cases. However, the authors used multiple sources of evidence to ensure validity and generalization of the results. This article is a first step toward further understanding of the issues of transformation in power companies and other asset-intensive businesses.
Originality/value
The novelty of the framework lies in its scope, focus and detailed treatment of asset data management in electric power companies.

