Challenges and Opportunities Related to Use of Models in Drilling Operations

2021
Author(s):
Maria Vatshaug Ottermo
Knut Steinar Bjørkevoll
Tor Onshus

Abstract Automation of managed pressure drilling (MPD) has great potential to improve the consistency, efficiency, and safety of operations, and is therefore pursued by many actors. While this development mitigates many risk elements, it also introduces new ones related to, for example, mathematical algorithms and remote access. This work is based on document reviews, interviews, and working sessions with the industry, and adds insight into how the risks associated with moving to higher levels of automation of MPD can be mitigated to a level where the benefits are significantly larger than the sum of the added risks. The work resulted in many recommendations for the industry, most of which relate to testing, verification, and validation of the models and data inputs, as well as to meaningful human control and to taking a holistic approach when introducing new models. Recommendations were also given to the Petroleum Safety Authority Norway, concerning missing or inadequate use of standards by the industry, lack of ICT knowledge, and the encouragement of increased experience sharing. Future work should address how to enable meaningful human control as models become more complex or rely to a larger extent on empirical data and artificial intelligence rather than on first principles. Human control is important in unexpected situations in which the system fails to act safely. There is also a need to address the ICT security issues that arise as remote operation becomes more common.

2020
Vol 17 (2-3)
Author(s):
Dagmar Waltemath
Martin Golebiewski
Michael L Blinov
Padraig Gleeson
Henning Hermjakob
...

Abstract This paper presents a report on outcomes of the 10th Computational Modeling in Biology Network (COMBINE) meeting that was held in Heidelberg, Germany, in July of 2019. The annual event brings together researchers, biocurators and software engineers to present recent results and discuss future work in the area of standards for systems and synthetic biology. The COMBINE initiative coordinates the development of various community standards and formats for computational models in the life sciences. Over the past 10 years, COMBINE has brought together standard communities that have further developed and harmonized their standards for better interoperability of models and data. COMBINE 2019 was co-located with a stakeholder workshop of the European EU-STANDS4PM initiative that aims at harmonized data and model standardization for in silico models in the field of personalized medicine, as well as with the FAIRDOM PALs meeting to discuss findable, accessible, interoperable and reusable (FAIR) data sharing. This report briefly describes the work discussed in invited and contributed talks as well as during breakout sessions. It also highlights recent advancements in data, model, and annotation standardization efforts. Finally, this report concludes with some challenges and opportunities that this community will face during the next 10 years.


Author(s):
Ryan Mullins
Deirdre Kelliher
Ben Nargi
Mike Keeney
Nathan Schurr

Recently, cyber reasoning systems demonstrated near-human performance characteristics when they autonomously identified, proved, and mitigated vulnerabilities in software during a competitive event. New research seeks to augment human vulnerability research teams with cyber reasoning system teammates in collaborative work environments. However, the literature lacks a concrete understanding of vulnerability research workflows and practices, limiting designers', engineers', and researchers' ability to successfully integrate these artificially intelligent entities into teams. This paper contributes a general workflow model of the vulnerability research process and identifies specific collaboration challenges and opportunities anchored in this model. Contributions were derived from a qualitative field study of the work habits, behaviors, and practices of human vulnerability research teams. These contributions will inform future work in the vulnerability research domain by establishing an empirically driven workflow model that can be adapted to the specific organizational and functional constraints placed on individuals and teams.


Author(s):
Neville Moray
Toshiyuki Inagaki
Makoto Itoh

Sheridan's "Levels of Automation" were explored in an experiment on fault management of a continuous process control task that included situation-adaptive automation. Levels of automation with more or less autonomy, and with different levels of advice to the operator, were compared under automatic diagnosis of varying reliability. The efficiency of process control and of fault management was examined under both human control and automated fault management, and the aspects of the task in which the human or the automation was the more efficient were identified. The results are related to earlier work by Lee, Moray, and Muir on trust and self-confidence in the allocation of function.


2019
Vol 17 (1)
pp. 133-141
Author(s):
Gregory P. Tapis
Kanu Priya

ABSTRACT Data analytics is receiving increasing emphasis in accounting programs. This emphasis has emerged from both practitioners and accrediting bodies. In April 2018, the Association to Advance Collegiate Schools of Business (AACSB) International released Standard A5, which calls for a more holistic approach to teaching and incorporating data analytics into accounting programs. Specifically, accounting programs are required to focus on students' agility and adaptability as they relate to changes and disruptions in technology. Such characteristics present challenges and opportunities for accounting educators when developing and assessing data analytics in accounting programs. In this paper, we propose using a combination of practitioner involvement and measurements from the psychology literature to create a continuous holistic approach to course assessment and improvement. Specifically, utilizing proxies for adaptability and agility, we propose a methodology for measuring changes in students' agility and adaptability throughout a semester.


2018
pp. 132-150
Author(s):
Taiseera Al Balushi
Saqib Ali
Osama Rehman

Initiatives carried out by companies, institutes, and governments to promote Information and Communication Technology (ICT) among the public have led to its penetration into every walk of life. ICT enhances the efficiency of various systems, such as the organisation and transfer of data. However, with the digital and remote-access features of ICT comes the motivation towards financial, political, and military gains by rivals. Security threats and vulnerabilities in existing ICT systems have resulted in cyber-attacks that are usually followed by substantial financial losses. This study discusses security in ICT from a business, economic, and government perspective. It attempts to convey the seriousness of the security issues and highlights the consequences of a security breach from an economic perspective. Based on the analysis performed, the factors behind these attacks are identified, along with recommendations for better preparation against them.


2016
pp. 1162-1190
Author(s):
Peter Sasvari
Zoltán Nagymate

The ICT sector has increasingly sought innovation capability in cloud computing applications in recent years. This chapter describes the economic potential of cloud computing and explores the characteristics of its usage among Hungarian enterprises. Although enterprises are aware of the basic concept of cloud computing, they have concerns about its application, mainly due to data security issues and a lack of education. Adoption of cloud computing services would mainly be facilitated by easier application procedures, and consultation would positively affect usage. According to microenterprises and corporations, faster information flow and remote access are the key benefits of cloud usage. For small-sized enterprises, the two main advantages are easier system recoverability and a higher level of mobility in case of a system breakdown. For medium-sized enterprises, remote access and greater data security were the key benefits of using cloud computing services in 2014.


2020
Vol 12 (1)
Author(s):
Syed Shah Sultan Mohiuddin Qadri
Mahmut Ali Gökçe
Erdinç Öner

Abstract Introduction Due to the menacing daily increase in the number of vehicles, abating road congestion has become a key challenge in recent years. To cope with prevailing traffic scenarios and to meet the ever-increasing traffic demand, the urban transportation system needs effective solution methodologies. Changes to urban infrastructure take years and sometimes are not even feasible. For this reason, traffic signal timing (TST) optimization is one of the fastest and most economical ways to curtail congestion at intersections and improve traffic flow in the urban network. Purpose Researchers have been working on a variety of approaches, along with the exploitation of technology, to improve TST. This article analyzes the literature published between January 2015 and January 2020 on computational intelligence (CI)-based simulation approaches and CI-based approaches for optimizing TST and traffic signal control (TSC) systems, and provides insights, research gaps, and possible directions for future work for researchers interested in the field. Methods Simulation tools have a prominent place in analyzing the complex dynamic behavior of traffic streams. Nowadays, microsimulation tools are frequently used in TST-related research. For this reason, a critical review of some of the widely used microsimulation packages is provided in this paper. Conclusion Our review also shows that approximately 77% of the papers included utilize a microsimulation tool in some form. Therefore, it seems useful to include a review, categorization, and comparison of the most commonly used microsimulation tools for future work. We conclude by providing insights into the future of research in these areas.


Systems
2020
Vol 8 (2)
pp. 10
Author(s):
Clifford D. Johnson
Michael E. Miller
Christina F. Rusnock
David R. Jacques

Levels of Automation (LOA) provide a method for describing the authority granted to automated system elements to make individual decisions. However, these levels are technology-centric and provide little insight into overall system operation. The current research discusses an alternate classification scheme, referred to as the Level of Human Control Abstraction (LHCA). LHCA is an operator-centric framework that classifies a system's state based on the required operator inputs. The framework consists of five levels, each requiring less granularity of human control: Direct, Augmented, Parametric, Goal-Oriented, and Mission-Capable. An analysis of several existing systems illustrates the presence of each of these levels of control, and many existing systems support system states that facilitate multiple LHCAs. It is suggested that as the granularity of human control is reduced, the required human attention and cognitive resources decrease. Thus, designing systems that permit the user to select among LHCAs during system control may facilitate human-machine teaming and improve the flexibility of the system.


Cells
2019
Vol 8 (11)
pp. 1305
Author(s):
Livia Roseti
Giovanna Desando
Carola Cavallo
Mauro Petretta
Brunella Grigolo

There has been considerable advancement over the last few years in the treatment of osteoarthritis, a common chronic disease and a major cause of disability in older adults. In this pathology, the entire joint is involved, and the regeneration of articular cartilage still remains one of the main challenges, particularly in an actively inflammatory environment. Recent strategies for osteoarthritis treatment are based on the use of different therapeutic solutions, such as cell and gene therapies and tissue engineering. In this review, we provide an overview of current regenerative strategies, highlighting their pros and cons and the challenges and opportunities, and we try to identify areas where future work should be focused in order to advance this field.


Polar Record
2016
Vol 52 (5)
pp. 518-534
Author(s):
Frigga Kruse

ABSTRACT The Arctic is commonly perceived as a pristine wilderness, yet more than four centuries of human industry have not left Svalbard untouched. This paper explores the historical dimension of human-induced ecosystem change using human presence as a proxy. Its aims are fourfold: to reconstruct and quantify historical human presence, to ascertain if human presence is a suitable indicator of long-term anthropogenic pressure, to deduce trends in anthropogenic pressure on five selected species of game animal, and to postulate trends in their subpopulation sizes. Published sources give rise to 57 datasets dealing with the annual voyages to Svalbard as well as the participants in them. All known archaeological sites are visualised in a distribution map. Despite the large amount of data, the quantification of historical human presence remains biased and partial. Only with the aid of a timeline of known milestones is it possible to make hypotheses about changes in anthropogenic pressure and animal subpopulations over time. The exercise is nonetheless a necessary and instructive one: it confirms that the erroneous view of Svalbard as a pristine ecosystem hinders timely historical-ecological research. Future work must aim at the systematic quantification of past human impact in a holistic approach to environmental conservation and restoration.

