complete automation
Recently Published Documents

TOTAL DOCUMENTS: 122 (FIVE YEARS: 33)
H-INDEX: 10 (FIVE YEARS: 2)

2021 ◽  
Author(s):  
Rong Wang ◽  
Muhammad Shafeeque ◽  
Haowen Yan ◽  
Lu Xiaoming

Abstract It is qualitatively evident that the greater the map-scale change, the greater the optimal distance threshold of the Douglas-Peucker algorithm used in polyline simplification. However, no specific quantitative relationship between the two has been established so far, causing uncertainties in the complete automation of the algorithm. To fill this gap, this paper constructs quantitative relationships based on spatial similarity theories of polylines. A quantitative spatial similarity relationship model was proposed and evaluated by setting up two groups of control experiments and taking <C, T> as coordinates. To realize automatic generalization of polylines, we verified whether these quantitative relationships could be fitted using the same function with the same coefficients. The experiments revealed that a unary quadratic function fits best, whether the polylines were derived from the same or different geographical feature areas. The results also show that it is unreasonable to simplify all polylines from different geographical feature areas with the same optimal distance threshold, whereas polylines from the same geographical feature area can be simplified with the same optimal distance threshold. The uncertainties were assessed by evaluating the automated generalization results from positional and geometric accuracy perspectives, using polylines from the same geographic feature areas. The proposed model is shown to maintain not only the geographical features but also the shape characteristics of polylines. Limiting these uncertainties would support fully automatic generalization of polylines and the construction of vector map geodatabases.
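The distance threshold discussed in the abstract is the epsilon parameter of the classic Douglas-Peucker simplification, which can be sketched as follows (a minimal textbook implementation, not the authors' code; the point format and helper names are illustrative):

```python
import math

def perpendicular_distance(p, a, b):
    # Distance from point p to the line through segment endpoints a and b.
    (x, y), (x1, y1), (x2, y2) = p, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, epsilon):
    # Keep the point farthest from the chord between the endpoints;
    # recurse on both halves if it exceeds the distance threshold epsilon,
    # otherwise replace the whole run with the chord.
    if len(points) < 3:
        return list(points)
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:index + 1], epsilon)
    right = douglas_peucker(points[index:], epsilon)
    return left[:-1] + right  # drop duplicated split point
```

The paper's contribution is precisely that the best epsilon for a given map-scale change is not obvious; this sketch only shows where that threshold enters the algorithm.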


2021 ◽  
Author(s):  
Fahd Siddiqui ◽  
Mohammadreza Kamyab ◽  
Michael Lowder

Abstract The economic success of unconventional reservoirs relies on driving down completion costs. Manually measuring the operational efficiency of a multi-well pad can be error-prone and time-prohibitive; complete automation of this analysis can provide effortless real-time insight to completion engineers. This study presents a real-time method for measuring the time spent on each completion activity, thereby enabling the identification of potential cost-reduction avenues. Two data acquisition boxes are utilized at the completion site to transmit both the fracturing and wireline data in real time to a cloud server. A data processing algorithm is described to determine the start and end of these two operations for each stage of every well on the pad. The method then determines other activity intervals (fracturing swap-over, wireline swap-over, and waiting on offset wells) based on the relationship between the fracturing and wireline segments of all the wells. The processed results can be viewed in real time on mobile devices or computers connected to the cloud. Viewing the full operational time log in real time helps engineers analyze the whole operation and determine key performance indicators (KPIs) such as the number of fractured stages per day, pumping percentage, and average fracturing and wireline swap-over durations for a given time period. In addition, the performance of the day and night crews can be evaluated. By plotting a comparison of KPIs for wireline and fracturing times, trends can be readily identified for improving operational efficiency, and practices from the best-performing stages can be adopted to reduce non-pumping times. This helps operators save time and money and run more efficient operations. As the number of wells increases, so does the complexity of manually generating the time log; the presented method handles multi-well fracturing and wireline operations without such difficulty, and in real time.
A case study is also presented, in which an operator in the US Permian basin used this method in real time to view and optimize zipper operations. Analysis indicated that the time spent on swap-over activities could be reduced, and the operator set a realistic goal of saving 10 minutes per swap-over interval. Within one pad, the goal was reached using this method, cutting 15 hours from the total pad time. The presented method provides an automated overview of fracturing operations, from which timely decisions can be made to reduce operational costs. Moreover, because the method is automated, it is not limited to single-well operations but can handle the multi-well pad completion designs that are commonplace in unconventionals.
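The interval-labeling step described above can be illustrated with a simplified sketch: given detected fracturing and wireline segments, each gap between consecutive segments is labeled by the operation that follows it. This is an assumed reconstruction of the idea, not the authors' algorithm, and it omits the waiting-on-offset classification:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    well: str
    kind: str      # "frac" or "wireline"
    start: float   # hours from pad start
    end: float

def gap_activities(segments):
    # Sort all detected segments on the pad chronologically; any gap
    # between consecutive segments is a swap-over, labeled by the
    # operation that follows it.
    ordered = sorted(segments, key=lambda s: s.start)
    gaps = []
    for prev, nxt in zip(ordered, ordered[1:]):
        if nxt.start > prev.end:
            label = "frac swap-over" if nxt.kind == "frac" else "wireline swap-over"
            gaps.append((prev.end, nxt.start, label))
    return gaps
```

Summing the labeled gap durations per day would yield the swap-over KPIs the abstract mentions.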


IARJSET ◽  
2021 ◽  
Vol 8 (11) ◽  
Author(s):  
Aileen Sonia Dhas.P ◽  
Aquiline Lydia .L ◽  
Sowmiya .A

Author(s):  
Abdullah AlAli ◽  
Fatai Anifowose

Abstract Seismic velocity modeling is a crucial step in seismic processing that enables the use of velocity information from both seismic data and wells to map the depth and thickness of subsurface layers interpreted from seismic images. The velocity can be obtained in the form of normal moveout (NMO) velocity or through an inversion (optimization) process such as full-waveform inversion (FWI). These methods have several limitations: NMO picking is enormously time-consuming because of heavy manual involvement, while FWI, as an optimization problem, incurs high computational cost and suffers from nonlinearity issues. Researchers have proposed various machine learning (ML) techniques, including unsupervised, supervised, and semi-supervised learning methods, to model the velocity more efficiently. These studies focus mostly on automating NMO velocity picking, improving convergence in FWI, and applying FWI with ML directly from the data. In the purview of the digital transformation roadmap of the petroleum industry, this paper presents a chronological review of these studies, appraises the progress made so far, and concludes with a set of recommendations to overcome the prevailing challenges through the implementation of more advanced ML methodologies. We hope this work will benefit experts, young professionals, and ML enthusiasts in pushing forward their research efforts toward complete automation of NMO velocity picking and further enhancing the performance of ML applications in the FWI framework.
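The NMO velocity the abstract refers to comes from the standard hyperbolic moveout relation, t(x) = sqrt(t0^2 + x^2 / v_nmo^2), which relates zero-offset time t0, source-receiver offset x, and the NMO velocity. A small sketch of the relation and the correction it implies (textbook formula, not code from the reviewed studies):

```python
import math

def nmo_traveltime(t0, offset, v_nmo):
    # Hyperbolic moveout: t(x) = sqrt(t0^2 + (x / v_nmo)^2),
    # with t0 in seconds, offset in meters, v_nmo in m/s.
    return math.sqrt(t0**2 + (offset / v_nmo)**2)

def nmo_correction(t0, offset, v_nmo):
    # The time shift applied to flatten a reflection event; velocity
    # picking searches for the v_nmo that best flattens the gather.
    return nmo_traveltime(t0, offset, v_nmo) - t0
```

Manual picking means scanning candidate v_nmo values for every event on every gather, which is the time sink the ML approaches in this review aim to automate.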


Author(s):  
Tummala Sri Ranga Sai Krishna

In recent years, Virtual Personal Assistants (VPAs) have handled queries and specific tasks posted by individual users on websites with great efficacy, using AI and natural language processing. VPA developers write functions to scrape query results from the Internet; the resulting data span copious formats, from a simple definition on Wikipedia to complex calculations or recommendations. However, VPAs designed for desktops do not work as extensively as those featured on smartphones: because desktop websites undergo continuous and frequent development, current assistants do not provide complete automation of them. A current desktop personal assistant can show you the top results for the query "Biryani" but cannot place an order on your behalf. In this study, we propose ARCHER, a Virtual Personal Assistant for desktop automation built on Selenium that uses specifications of the behavioral data of websites.
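The idea of driving a website from a behavior specification can be sketched as an intent-to-steps planner. This is purely illustrative (the abstract does not publish ARCHER's internals): the selectors and field names below are hypothetical, and in a Selenium-based assistant each step would map to WebDriver calls such as `find_element(...).send_keys(...)` or `.click()`:

```python
def plan_order(query, site_spec):
    # Translate a user query into ordered browser-automation steps,
    # using a recorded behaviour specification of the target website.
    # site_spec keys (url, search_box, ...) are assumed for illustration.
    steps = [("open", site_spec["url"])]
    steps.append(("type", site_spec["search_box"], query))
    steps.append(("click", site_spec["search_button"]))
    steps.append(("click", site_spec["first_result"]))
    steps.append(("click", site_spec["order_button"]))
    return steps
```

Keeping the site-specific selectors in a data file rather than in code is one way to cope with the "continuous and frequent development" of desktop websites that the abstract highlights.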


Author(s):  
Hasan Smajic ◽  
Toni Duspara

The COVID-19 pandemic confronts universities with great challenges in maintaining research and teaching activities with as little contact as possible. Lecturers currently have to migrate to Internet teaching; in most cases, e-learning and digital tools are used to continue courses online and replace classroom teaching. But current online approaches are limited to lectures and theoretical mathematical exercises. This paper shows how practical exercises can be carried out remotely via the Internet in a real technical environment. Experimental laboratory equipment for automation technology and mechatronics is always associated with high costs, owing to the many intelligent devices within an automation solution and the extensive engineering involved. Beyond the costs, the number of workstations usually does not match the number of students to be trained; in this case, the same exercises have to be repeated several times, which also leads to increased personnel costs. Remote laboratories are a very cost-effective solution to these problems. This paper describes how this goal can be achieved by implementing a WBT server (WBT - Web-Based Training Server) and a Java-based client-server architecture. The idea behind a remote-controlled laboratory is to use web technologies and the Internet as the communication infrastructure to perform the experimental part of the training with programmable automation devices. First, a detailed requirements profile for the laboratory was developed, covering primarily technical, didactic, and organizational requirements. In addition, the laboratory is intended to improve the education of the students through interactive, problem-oriented learning on real industrial automation components. The aim of the training is to learn suitable working methods for the design (engineering) of complete automation solutions, from simple to medium-complexity machines and plants.


2021 ◽  
Author(s):  
Felipe De Amorim ◽  
Rodolfo Adamshuk Silva ◽  
Lincoln Costa ◽  
Francisco Carlos Souza

The testing process consists of activities that demand effort, such as producing, executing, and validating test scenarios. Covering all test scenarios manually is unfeasible, since it is error-prone and labor-expensive; partial or complete automation therefore reduces costs and increases tests' effectiveness. The increasing availability of hardware resources provides opportunities to scale testing using parallel execution of test cases or suite blocks. Some tools perform parallel execution of tests, but their use requires complicated settings, and when combined with methodologies such as Behavior-Driven Development it may create an overhead for users. This paper presents the Multi-Threaded Testing (MTT) tool for parallel execution of test scenarios in the context of Behavior-Driven Development, which aims to reduce the computational time required to test Java projects. Furthermore, the paper reports an experimental study evaluating the MTT tool's performance on two different hardware configurations. Our results show that MTT reached a speedup of 4.59 with an efficiency of 46% using ten threads on an Intel Core i5-9300H CPU, and a speedup of 3.45 with an efficiency of 43% using eight threads on an Intel Core i7-7700HQ CPU.
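The scheduling idea behind parallel scenario execution can be sketched with a thread pool: independent scenarios are submitted concurrently and their results collected. MTT itself targets Java projects; this Python sketch (names assumed) only illustrates the pattern, where speedup is serial time divided by parallel time and efficiency is speedup divided by thread count:

```python
from concurrent.futures import ThreadPoolExecutor

def run_scenarios(scenarios, threads):
    # Submit each independent test scenario to the pool and gather
    # a {name: passed} map; Future.result() re-raises any scenario
    # failure, so errors are not silently lost.
    with ThreadPoolExecutor(max_workers=threads) as pool:
        futures = {name: pool.submit(fn) for name, fn in scenarios.items()}
        return {name: f.result() for name, f in futures.items()}
```

The efficiency figures in the abstract (46% at ten threads, 43% at eight) reflect the usual limits of this pattern: shared fixtures and I/O keep the speedup well below the thread count.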


Author(s):  
A. V. Gaboutchian ◽  
V. A. Knyaz ◽  
M. M. Novikov ◽  
S. V. Vazyliev ◽  
D. V. Korost ◽  
...  

Abstract. Studies of teeth represent a significant part of palaeoanthropological research. Over the past two decades these studies have developed significantly with the implementation of high-resolution imaging based on X-ray scanning techniques. Highly informative reconstructions based on image processing have provided an opportunity to study morphological layers and structures of teeth that are usually hidden under the outer layer of dental enamel. Micro-computed tomography of the studied teeth was therefore performed to obtain reconstructions of the enamel and dentin surfaces. The material comprises well-preserved teeth of an adolescent from Sunghir, a world-renowned Upper Palaeolithic archaeological site in Vladimir Oblast, Russian Federation. The characteristic feature of the studied teeth is their unusual, presumably archaic, morphology, which had previously been studied and described through measurements obtained with the automated digital odontometry method; however, that study referred to the enamel surface. In the current study these algorithms are applied to measure the surface of the dentin. As this is the first successful attempt at measuring dentin surface morphology, the process still has to be improved before complete automation is reached. Nevertheless, even the currently applied approaches allow comparison of enamel and dentin morphology through measurements.


2021 ◽  
Vol 11 (2) ◽  
pp. 7018-7022
Author(s):  
A. Hashmi

Drones are widely known for their mobility and ease of use and represent a significant technological breakthrough with various applications. In this paper, a novel inexpensive Search and Rescue (SAR) approach for indoor environments is presented. The usage of Bluetooth Low Energy (BLE) has been evaluated against other technologies, and a conceptual view of the complete setup is presented. Apart from the cost of the drone and the locator devices, the other hardware is relatively inexpensive, costing only a fraction of a US dollar. The system is believed to cover a wide area in a small time frame of a few minutes; for instance, a 3600 m2 surface area could be scanned in less than 5 minutes. The system is tested by attaching a BLE device to the payload to evaluate the presence of target beacons. Potential upgrades are also proposed, including design modifications for outdoor use and application to locating missing objects. This system could confidently replace search parties dealing with missing children in public places or venues, with minimal human interaction and the potential for complete automation.
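The coverage claim (3600 m2 in under 5 minutes) is consistent with a simple lawn-mower coverage estimate. The swath width and flight speed below are assumed values for illustration, not figures from the paper:

```python
def scan_time_minutes(area_m2, swath_m, speed_m_s):
    # Lawn-mower coverage estimate: total flight path length is the
    # area divided by the BLE detection swath width; time is path
    # length divided by speed (turn overhead ignored).
    path_m = area_m2 / swath_m
    return path_m / speed_m_s / 60.0
```

With an assumed 20 m BLE detection swath and a 3 m/s flight speed, 3600 m2 takes about one minute of flight, comfortably inside the 5-minute figure even with turn overhead.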


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2569
Author(s):  
Andrei Nicolae ◽  
Adrian Korodi ◽  
Ioan Silea

The Industrial Internet of Things and Industry 4.0 paradigms are steering the industrial landscape towards better-connected entities and superior interoperability and information exchange, which lays the basis for developing more intelligent solutions that are already starting to bring numerous benefits. The current research aligns with this course in an attempt to build an automated and autonomous software tool capable of reducing the energy consumption of a water treatment and distribution facility by optimizing the usage of water sources. Building on several previous studies, the present paper details both the complete automation of the optimization strategy inside a proactive historian application and the tests executed with the finished solution. Able to influence the monitored system directly in a non-invasive manner and to link all the sequences of the algorithm automatically, the solution is now ready for long-term operation without any external interference.

