Machine-Learning-Based Online Transient Analysis via Iterative Computation of Generator Dynamics

Author(s):  
Jiaming Li ◽  
Meng Yue ◽  
Yue Zhao ◽  
Guang Lin
2021 ◽  
Author(s):  
Zheng-Kai Yang ◽  
Ming-Hsien Hsu ◽  
Chung Yuan Chang ◽  
Ya-Wen Ho ◽  
Po-Ning Liu ◽  
...  

Machine learning (ML) compact device models (CMs) have emerged as an alternative to physics-based CMs. ML CMs can find a mathematical model close to the device characteristics without much prior knowledge, which shortens model development time. Additionally, versatile capabilities such as process awareness, model merging, and fitting of new technologies promote the adoption of ML CMs. While ML CMs draw great attention in CAD, their convergence in SPICE has not been carefully studied. Here, different activation functions are used to create ML CMs, and the resulting circuit convergence is tested. We find that the inverse square root unit (ISRU) activation has the best convergence. In addition, including gate-to-source and gate-to-drain capacitances is found to benefit convergence in transient analysis. The circuit convergence rate is 100% for ISRU, sigmoid, and tanh when these capacitances are present. In DC sweep, ISRU significantly outperforms the other activation functions, achieving 81% convergence, and if quasi-static transient analysis is used in place of DC sweep, ISRU achieves 100% convergence. Due to its superior convergence, ISRU is the most promising activation for future ML CMs in SPICE.
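For reference, ISRU has the closed form f(x) = x / sqrt(1 + alpha * x^2), which is smooth and bounded, properties relevant to the Newton iterations used by SPICE. The sketch below is a minimal illustration of the three activations compared in the abstract and of a toy two-layer ML compact model; the network size, alpha = 1, the random weights, and the terminal-voltage inputs are assumptions for illustration, not the authors' actual model.

```python
# Minimal sketch of the activation functions compared in the abstract, plus a
# toy two-layer ML compact model mapping terminal voltages to drain current.
# The layer width, alpha = 1, and the random weights are illustrative
# assumptions, not the configuration used in the study.
import numpy as np

def isru(x, alpha=1.0):
    """Inverse square root unit: f(x) = x / sqrt(1 + alpha * x**2)."""
    return x / np.sqrt(1.0 + alpha * x * x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def ml_compact_model(vgs, vds, W1, b1, W2, b2, act=isru):
    """Toy two-layer ML CM: terminal voltages (Vgs, Vds) -> drain current."""
    h = act(np.stack([vgs, vds], axis=-1) @ W1 + b1)
    return h @ W2 + b2

# Example: evaluate the toy model on a DC sweep grid (random weights only to
# exercise the code path; a real CM would be trained to measured I-V data).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8,)), 0.0
vgs, vds = np.meshgrid(np.linspace(0.0, 1.0, 5), np.linspace(0.0, 1.0, 5))
ids = ml_compact_model(vgs, vds, W1, b1, W2, b2, act=isru)
```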


2021 ◽  
Author(s):  
Sukotrihadiyono Tejo ◽  
Yasutra Amega ◽  
Irawan Dedy

Abstract Perforation efficiency is an important aspect of gas wells, since it affects the near-wellbore pressure drop associated with turbulent flow. Perforation efficiency is correlated with non-Darcy skin, which can be distinguished by pressure transient analysis of an isochronal test (Swift et al., 1962), evaluated from multi-rate flow test data plot coefficients (Jones et al., 1976), or obtained from the type curve of a single build-up test following constant-rate production (Spivey et al., 2004). A simple single-rate pressure transient analysis, supported by parameters derived from historical multi-rate test data, has also been shown to differentiate skin damage from non-Darcy skin (Aminian et al., 2007). Unfortunately, these methods involve trade-offs between accuracy and analysis time. A quick analysis of perforation efficiency is often needed during well completion and workover activities to decide whether a re-perforation job is required. To overcome the limited time available for data acquisition and evaluation, an empirical relation between actual perforation length, skin damage, and the laminar and turbulence flow coefficients obtained from a short multi-rate test is needed to predict perforation efficiency. The empirical relation will be developed using machine learning. A simple gas reservoir model is built and run with variations in reservoir permeability, perforation interval length, near-wellbore permeability, and vertical anisotropy to generate a large set of hypothetical multi-rate test data. The resulting data set of laminar coefficient, turbulence coefficient, absolute open flow, skin damage, and perforation length will then be trained and tested to create the empirical relation using a supervised regression method, which will afterwards be applied to several actual field cases. This study will elaborate the development of the empirical relation between perforation efficiency and the parameters obtained from simple short-time multi-rate test data, the other factors that influence the empirical relation, and the conditions that limit its field application.
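The abstract does not name the specific regressor, so the sketch below only illustrates the supervised-regression step on the simulated multi-rate test data. The CSV file name, the column names, the choice of a random-forest regressor, and the use of effective perforation length as the target are all assumptions for illustration, not the study's actual workflow.

```python
# Minimal sketch of the supervised-regression step described above, assuming
# the simulated multi-rate test results are tabulated in a CSV with the
# hypothetical columns named below. The random-forest choice is an assumption;
# the study only states that a supervised regression method is used.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("simulated_multirate_tests.csv")  # hypothetical file name

features = ["laminar_coeff", "turbulence_coeff", "absolute_open_flow", "skin_damage"]
target = "effective_perforation_length"  # proxy for perforation efficiency (assumption)

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df[target], test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Held-out score on simulated cases; field cases would be evaluated separately.
print("R^2 on held-out simulated cases:", r2_score(y_test, model.predict(X_test)))
```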


2020 ◽  
Vol 43 ◽  
Author(s):  
Myrthe Faber

Abstract Gilead et al. state that abstraction supports mental travel, and that mental travel critically relies on abstraction. I propose an important addition to this theoretical framework, namely that mental travel might also support abstraction. Specifically, I argue that spontaneous mental travel (mind wandering), much like data augmentation in machine learning, provides variability in mental content and context necessary for abstraction.


2020 ◽  
Author(s):  
Mohammed J. Zaki ◽  
Wagner Meira, Jr

2020 ◽  
Author(s):  
Marc Peter Deisenroth ◽  
A. Aldo Faisal ◽  
Cheng Soon Ong

Author(s):  
Lorenza Saitta ◽  
Attilio Giordana ◽  
Antoine Cornuejols

Author(s):  
Shai Shalev-Shwartz ◽  
Shai Ben-David
