initial model
Recently Published Documents


TOTAL DOCUMENTS: 389 (five years: 133)
H-INDEX: 24 (five years: 6)

2022, Vol. 9
Author(s): Zhonghan Liu, Yingcai Zheng, Hua-Wei Zhou

To better interpret subsurface structures and characterize reservoirs, geologists and engineers prefer a depth model that quantifies P-wave velocity together with additional rock physical parameters such as density, S-wave velocity, and anisotropy. Tradeoffs among the different parameters add challenges to the seismic inversion process. In this study, we propose and test a Direct Waveform Inversion (DWI) scheme to simultaneously invert for 1D layered velocity and density profiles, using reflection seismic waveforms recorded on the surface. The recorded data include primary reflections and interbed multiples. DWI is implemented in the time-space domain and is followed by a wavefield extrapolation that downward-continues the sources and receivers. By explicitly enforcing time-space causality of the wavefield, DWI recursively determines the subsurface seismic structure in a local, layer-by-layer fashion, from shallow to deep, resolving both the sharp interfaces and the properties of the layers. DWI differs from frequency-domain layer-stripping methods. Because it requires no global initial model, DWI also avoids common pitfalls of nonlinear optimization, such as the local minima and the dependence on an accurate starting model that afflict most waveform inversion schemes. Two numerical tests show the validity of the DWI scheme as a new strategy for multi-parameter seismic inversion.
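The recursive, shallow-to-deep flavor of the scheme can be illustrated with the classic 1D normal-incidence relation between reflection coefficients and acoustic impedances. This toy sketch is only an analogue of the layer-by-layer idea, not the authors' full DWI, which inverts waveforms, handles interbed multiples, and recovers velocity and density separately:

```python
def impedances_from_reflections(z_top, refl_coeffs):
    """Recover acoustic impedances layer by layer, shallow to deep, from
    normal-incidence reflection coefficients. Each step uses only the layer
    above, mirroring the recursive, causality-driven determination."""
    z = [z_top]
    for r in refl_coeffs:
        # R = (Z_below - Z_above) / (Z_below + Z_above)  =>  solve for Z_below
        z.append(z[-1] * (1.0 + r) / (1.0 - r))
    return z
```

Each deeper impedance depends only on quantities already determined above it, which is the sense in which such a recursion needs no global initial model.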


Author(s): Susanne Marx

Open Innovation (OI) research has covered various organizational forms along the dimensions of durability (permanent versus temporary organizing) and organizational scope (intra- versus inter-organizational). Inter-organizational forms, both temporary and permanent, are regarded mainly as modes of OI. However, these organizational forms can also act as initiators of OI activities that extend knowledge transfer beyond the borders of the inter-organizational consortium, which has hardly been researched. To address this gap, the research presented in this article develops an OI process for inter-organizational projects (IOPs) as initiators of OI. The initial model is developed through action research with an IOP of museums and educational institutions implementing a series of hackathons. The model's applicability is then evaluated for other IOPs by a survey, indicating its suitability for practitioners. Findings point to the importance of collaborative activities for aligning the OI initiative with both individual partners' and common project goals, while outbound activities are regarded as least important despite the time limitation of the project. The research is limited by its focus on the specific IOP environment of EU-funded projects and by the small scope of the survey.


2021, Vol. 119 (1), pp. 104
Author(s): Guomin Han, Hongbo Li, Yujin Liu, Jie Zhang, Ning Kong, ...

In tandem cold rolling, controlling the temperature of high-grade non-oriented silicon steel is difficult because of the steel's large deformation resistance and the preheating procedure before rolling. Calculating the total temperature rise in the rolling deformation zone is complicated by the combined influence of plastic deformation heat, friction heat, and contact heat loss. To calculate the total temperature rise precisely, initial calculation models are first established from the four classical cold rolling force formulas by theoretically analyzing the temperature rise due to deformation heat, the temperature rise due to friction heat, and the temperature drop due to contact heat loss. The model based on the improved Lian rolling force formula is then adopted, as its calculated values best match the measured temperatures. Finally, because the initial model involves complex formula calculations, a simplified model convenient for field technicians is proposed by nonlinear regression analysis of the initial model's results against the main rolling parameters.
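The final step, regressing the initial model's computed temperature rise against the main rolling parameters, can be sketched as a log-space least-squares fit. The power-law form and the single strain parameter here are illustrative assumptions; the abstract does not give the paper's actual regression form:

```python
import math

def fit_power_law(strain, temp_rise):
    """Fit temp_rise ~= a * strain**b by ordinary least squares in log space:
    log(dT) = log(a) + b*log(strain)."""
    xs = [math.log(e) for e in strain]
    ys = [math.log(t) for t in temp_rise]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))   # slope = exponent b
    a = math.exp(my - b * mx)                # intercept gives prefactor a
    return a, b
```

A real simplified model would regress against several parameters at once (reduction, speed, tension, friction), but the fitting principle is the same.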


2021
Author(s): Nathan C Hurley, Nihar Desai, Sanket Dhruva, Rohan Khera, Wade L Schulz, ...

Background: Bleeding is a common complication of percutaneous coronary intervention (PCI), leading to significant morbidity, mortality, and cost. Several risk models exist to predict post-PCI bleeding risk, but these models produce a single estimate anchored at a single point in time and do not update as new clinical information emerges, despite the dynamic nature of risk. Objective: We sought to develop models that update estimates of a patient's bleeding risk over time, enabling a dynamic estimate that incorporates evolving clinical information, and to demonstrate the improvement in predictive performance from incorporating this information. Methods: Using data from the National Cardiovascular Data Registry (NCDR) CathPCI, we trained 6 XGBoost tree-based machine learning models to estimate bleeding risk at key decision points: 1) choice of access site, 2) prescription of medication prior to PCI, and 3) choice of closure device. Results: We included 2,868,808 PCIs: 2,314,446 (80.7%) performed before 2014 for training and the remaining 554,362 (19.3%) for validation. Discrimination improved from an AUROC of 0.812 (95% confidence interval: 0.812-0.812) using only presentation variables to 0.845 (0.845-0.845) using all variables. Among 123,712 patients classified as low risk by the initial model, 14,441 were reclassified as moderate risk (1.4% experienced bleeds) and 723 as high risk (12.5% experienced bleeds). Among 160,165 patients classified as high risk by the initial model, 40 were reclassified to low risk (0% experienced bleeds) and 43,265 to moderate risk (2.5% experienced bleeds). Conclusion: Accounting for the time-varying nature of the data and capturing the association between treatment decisions and changes in risk provide up-to-date information that may guide individualized care throughout a hospitalization.
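The reclassification figures in the Results can be produced mechanically by cross-tabulating initial against updated risk categories and tallying observed bleeds. A minimal sketch, with the category labels as placeholder assumptions rather than the NCDR definitions:

```python
def reclassification(initial_cat, updated_cat, bled):
    """Cross-tabulate initial vs. updated risk categories and report, for
    each (initial, updated) cell, the patient count and observed bleed rate."""
    table = {}
    for i, u, y in zip(initial_cat, updated_cat, bled):
        n, events = table.get((i, u), (0, 0))
        table[(i, u)] = (n + 1, events + y)
    # convert raw event counts to rates
    return {cell: (n, events / n) for cell, (n, events) in table.items()}
```

Cells where the observed bleed rate diverges from the initial category (e.g. "low" patients reclassified "high" who bleed at 12.5%) are the evidence that the updated model adds information.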


2021
Author(s): Javier Eusebio Gomez, Marcelo Robles, Cristian Di Giuseppe, Federico Galliano, Jeronimo Centineo, ...

Abstract This paper presents the process and results of applying Data Physics to optimize production of a mature field in the Gulf of San Jorge Basin in Argentina. Data Physics is a novel technology that blends the reservoir physics (black oil) used in traditional numerical simulation with machine learning and advanced optimization techniques. Data Physics was described in detail in a prior paper (Sarma et al., SPE-185507-MS) as a physics-based modeling approach augmented by machine learning. In essence, historical production and injection data are assimilated using an Ensemble Kalman Filter (EnKF) to infer the petrophysical parameters and create a predictive model of the field. This model is then used with Evolutionary Algorithms (EA) to find the Pareto front for multiple optimization objectives such as production, injection, and NPV. Ultimately, the main objective of Data Physics is to enable closed-loop optimization. The technology was applied on a small section of a very large field in the Gulf of San Jorge comprising 134 wells, including 83 active producers and 27 active water injectors; up to 12 mandrels per well provide selective injection, while production is carried out in a commingled manner. Zonal production allocation is calculated using an in-house process based on swabbing tests and recovery factors and is used as input to the Data Physics application, while injection allocation is based on tracer logs performed in each injection well twice a year. This paper describes the modeling and optimization phases as well as the implementation in the field and the results obtained after performing two closed-loop optimization cycles. The initial model was developed between October and December 2018, and initial field implementation took place between January and March 2019. A second optimization cycle was executed in January 2020, and results were observed for several months.
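The assimilation step at the core of this workflow, an Ensemble Kalman Filter analysis, can be sketched for a single scalar parameter. This toy version assumes a generic observation operator and omits the black-oil physics entirely:

```python
import random
from statistics import mean

def enkf_update(ensemble, obs, obs_op, obs_std, rng):
    """One stochastic Ensemble Kalman Filter analysis step for a scalar
    parameter: nudge each member by the Kalman gain times its innovation,
    with covariances estimated from the ensemble itself."""
    pred = [obs_op(x) for x in ensemble]          # predicted observations
    n = len(ensemble)
    mx, my = mean(ensemble), mean(pred)
    cov_xy = sum((x - mx) * (y - my) for x, y in zip(ensemble, pred)) / (n - 1)
    var_y = sum((y - my) ** 2 for y in pred) / (n - 1) + obs_std ** 2
    gain = cov_xy / var_y                         # Kalman gain
    # each member sees an independently perturbed copy of the observation
    return [x + gain * (obs + rng.gauss(0.0, obs_std) - y)
            for x, y in zip(ensemble, pred)]
```

In the field application the "parameter" is a high-dimensional petrophysical description and the observation operator is the reservoir simulator, but the update rule has this same shape.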


2021, Vol. 2021, pp. 1-12
Author(s): Tianfang Xue, Haibin Yu

As deep reinforcement learning methods have made great progress in the visual navigation field, metalearning-based algorithms are gaining more attention, since they greatly improve the extensibility of moving agents. Under the metatraining mechanism, an initial model is typically trained as a metalearner on existing navigation tasks and comes to perform well in new scenes through relatively few recursive trials. However, if a metalearner is overtrained on the former tasks, it may hardly generalize to navigation in unfamiliar environments, as the initial model turns out to be biased towards the former ambient configuration. In order to train an impartial navigation model and enhance its generalization capability, we propose an Unbiased Model-Agnostic Metalearning (UMAML) algorithm for target-driven visual navigation. Inspired by entropy-based methods that maximize the uncertainty over output labels in classification tasks, we adopt inequality measures used in Economics as a concise metric for the loss deviation across unfamiliar tasks. By succinctly minimizing the inequality of task losses, an unbiased navigation model that does not over-perform in particular scene types can be learnt under the Model-Agnostic Metalearning mechanism. The exploring agent follows a more balanced update rule and is able to gather navigation experience across training environments. Several experiments have been conducted, and the results demonstrate that our approach outperforms other state-of-the-art metalearning navigation methods in generalization ability.
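One way to "minimize the inequality of task losses" is to penalize an Economics-style inequality index of the per-task losses. The Theil index and the penalty weighting below are illustrative assumptions, not necessarily the measure UMAML actually uses:

```python
import math

def theil_index(losses):
    """Theil inequality index over per-task losses; 0 means all tasks have
    equal loss, larger values mean a more biased metalearner."""
    m = sum(losses) / len(losses)
    return sum((l / m) * math.log(l / m) for l in losses) / len(losses)

def umaml_meta_loss(task_losses, lam=0.5):
    """Meta-objective: mean task loss plus an inequality penalty, so the
    initial model is discouraged from over-performing on particular scenes."""
    return sum(task_losses) / len(task_losses) + lam * theil_index(task_losses)
```

Two task-loss profiles with the same mean then rank differently: the balanced profile gives the smaller meta-loss, pushing the metalearner toward impartial initial weights.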


Author(s): Kevin McGuigan, Kieran Collins, Kevin McDaid

Analysis of 3926 shots from the 2019 Senior inter-county football championship aims to establish the impact of distance, angle, shot type, method, and pressure on shot success. Findings demonstrate that shots from free kicks contribute 20.5% of total attempts in Gaelic football, with a success rate of 75%, in contrast to a 50% success rate for shots from open play. Moreover, the range within which free-kick accuracy exceeds 57.6% extends to 40 m, while for open play that threshold is passed at a range of 28 m. There were almost twice as many right-foot shots (64.4%) as left-foot shots (32.4%), with right-foot attempts marginally more accurate. Shots under low pressure were most successful, while those under medium pressure were less successful than those under high pressure, despite being taken, on average, 7.5 m closer to the target. A logistic regression model exploring the impact of all variables on shot outcome demonstrates the significance of shot distance, angle, and pressure on the kicker, as well as of whether shots are taken with the hand or the foot. This research provides an important step in understanding the scale of the impact of a range of variables on shot success in Gaelic football, while simultaneously providing an initial model to predict shot outcome from these variables.
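A logistic regression of shot outcome on a predictor can be sketched with plain gradient descent. The data and scaling below are invented for illustration, with distance standing in for the paper's full set of variables (distance, angle, pressure, hand or foot):

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=3000):
    """Two-parameter logistic regression (intercept + one feature) fitted by
    batch gradient descent on the log-loss."""
    w0, w1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w0 + w1 * x)))  # predicted P(score)
            g0 += p - y                                  # log-loss gradients
            g1 += (p - y) * x
        w0 -= lr * g0 / n
        w1 -= lr * g1 / n
    return w0, w1
```

With distance as the feature, a negative fitted slope reproduces the qualitative finding that success falls off with range; the full model would simply add columns for angle, pressure, and shot method.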


2021, Vol. 930 (1), pp. 012040
Author(s): G A P Eryani, I M S Amerta, M W Jayantari

Abstract In water resource planning, information on water availability is needed, yet such data are still difficult to obtain. A rainfall-runoff simulation model can predict water availability in the Unda watershed, adding information about the watershed's water potential that can be used to prepare water resources management so that the existing potential is exploited sustainably. Based on the rainfall-runoff simulation results for the Unda watershed, after running and calibrating the initial model, the R2 value of 0.68 increased by 9.81% to 0.754; both the initial and the calibrated model show an efficient R2 value. The NASH value increased by 49.93% to 0.713, which meets the satisfactory criteria; the RMSE value of 1.135 decreased by 49.47% to 0.758; and the PBIAS value of 44.70%, classified as unsatisfactory, decreased from 80.24% to 24.80% upon calibration, which is classified as satisfactory. In general, the overall simulation results represent the watershed's hydrological processes quite well.
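The calibration metrics quoted above (NASH, i.e. Nash-Sutcliffe efficiency, RMSE, and PBIAS) have standard definitions that are straightforward to compute from paired observed and simulated series. A minimal sketch:

```python
import math
from statistics import mean

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
    no better than predicting the observed mean."""
    mo = mean(obs)
    return 1.0 - (sum((o - s) ** 2 for o, s in zip(obs, sim))
                  / sum((o - mo) ** 2 for o in obs))

def rmse(obs, sim):
    """Root-mean-square error, in the units of the observations."""
    return math.sqrt(mean((o - s) ** 2 for o, s in zip(obs, sim)))

def pbias(obs, sim):
    """Percent bias: positive values mean the simulation underestimates
    on average; values near 0 are best."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)
```

These are the quantities a calibration run drives toward NSE near 1, RMSE near 0, and |PBIAS| small.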


Author(s): Svitlana Kiyko

The article deals with the principles of compiling the “German-Ukrainian Dictionary of Terminology of Life Safety” and the peculiarities of selecting its lexical material, such as the principles of compliance with the goals and objectives of learning, frequency, the word-forming value of a term, associative value, subjectivity, and semantics. Compiling the dictionary involves many stages: analysis of existing dictionaries, research into the needs of the addressee, determination of the requirements and future characteristics of the dictionary, development of its macro- and microstructure, collection of lexical material, design of dictionary articles, selection of translation equivalents, ordering of the dictionary in accordance with the developed structure, editing, and checking the compliance of the resulting product with the set goals. The author offers highly productive and fast methods of compiling a dictionary with the help of the BootCaT corpus-generation program and the Morphy paradigm-synthesis program, which made it possible to single out 20,000 terms of the professional language of life safety in the shortest time and find their Ukrainian equivalents. The corpus of texts is generated with S. Sharoff's method, which searches for professional texts using randomly combined four basic terms. This ensures a homogeneous selection of thematically related texts from the Internet (manuals, reference books, scientific articles, newspaper reports, instructions, sights, abstracts and annotations of articles, etc.). The obtained texts are processed with the Morphy paradigm-synthesis program, which automatically assigns all possible grammatical categories to each word in a sentence, and the initial list of the terminological dictionary is compiled. The next task is to provide equivalent words in the target language that accurately convey the semantics of the register word.
Consequently, the user of the dictionary holds a linguistic model of the German professional language of life safety in its equivalent reproduction in the Ukrainian language. Such an initial model allows the user to perceive scientific texts adequately and thus successfully expand their scientific and conceptual apparatus. Key words: dictionary, term, German professional language of life safety, terminological system, lexicography, synthesis of paradigms.
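Sharoff-style query generation, randomly combining four basic terms per web search, can be sketched in a few lines. The seed terms below are hypothetical examples; in practice a tool like BootCaT takes such queries, fetches the hit pages, and cleans them into a corpus:

```python
import random

def make_queries(seed_terms, n_queries, terms_per_query=4, seed=0):
    """Build web-search queries from random combinations of basic domain
    terms, after S. Sharoff's corpus-bootstrapping method. Sampling without
    replacement keeps the four terms in each query distinct."""
    rng = random.Random(seed)
    return [" ".join(rng.sample(seed_terms, terms_per_query))
            for _ in range(n_queries)]
```

Because each query mixes several domain terms, the pages it retrieves tend to be thematically homogeneous professional texts rather than pages that merely mention one term in passing.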


2021
Author(s): Robson T. Paula, Décio G. Aguiar Neto, Davi Romero, Paulo T. Guerra

A chatbot is an artificial-intelligence-based system aimed at chatting with users, commonly used as a virtual assistant to help people or answer questions. Intent classification, which aims to identify what the user wants at a certain point in a dialogue, is an essential task for chatbots. However, for many domains little data is available to properly train such systems. In this work, we evaluate the performance of two methods of generating synthetic data for chatbots, one based on template questions and the other on neural text generation. We build four datasets that are used to train chatbot components on the intent classification task. We intend to simulate the task of migrating a search-based portal to an interactive, dialogue-based information service by using artificial datasets for initial model training. Our results show that template-based datasets are slightly superior to neural-generated ones in our application domain; however, neural generation presents good results and is a viable option when one has limited access to domain experts to hand-code text templates.
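The template-based generation method can be sketched as slot filling over hand-coded templates. The intents, templates, and slot values below are hypothetical, not the paper's data:

```python
import itertools

def expand_templates(templates, slots):
    """Generate (utterance, intent) training pairs by filling every
    combination of slot values into each hand-coded template."""
    examples = []
    for intent, template in templates:
        # only slots that actually appear in this template are expanded
        names = [n for n in slots if "{" + n + "}" in template]
        for values in itertools.product(*(slots[n] for n in names)):
            utterance = template.format(**dict(zip(names, values)))
            examples.append((utterance, intent))
    return examples
```

A handful of templates per intent can thus yield hundreds of labelled utterances for initial intent-classifier training, at the cost of requiring a domain expert to write the templates.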

