A hybrid approach for developing special purpose simulation tools

2006 ◽  
Vol 33 (12) ◽  
pp. 1505-1515 ◽  
Author(s):  
Y Mohamed ◽  
S M AbouRizk

The use of simulation techniques is an effective approach for modeling construction operations. Unfortunately, the high level of technical knowledge and development time required for building functional simulation models renders simulation modeling an impractical technology for many in the construction industry. Research in construction simulation tackles this conflict by providing modeling approaches that reduce the knowledge and time usually required for building simulation models of construction operations. Special purpose simulation (SPS) allows construction engineers with only minimal simulation knowledge to build practical simulation models. This paper presents a hybrid approach (HSPS) for effective and time-saving development of SPS tools. The approach utilizes visual, general purpose modeling elements to customize the simulation behaviors of new SPS elements, minimizing the programming effort required for developing these elements. This paper describes the theoretical background of the HSPS approach, its implementation, and a sample application subsequently created with it. It also presents the results of an experiment quantifying the savings in development time achieved using this approach.

Key words: automation, simulation models, computerized simulation, tunnel construction, construction management.
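
The authors' element definitions are not reproduced in the abstract, but the core idea, wrapping general purpose modeling primitives inside a domain-facing special purpose element, can be sketched. The sketch below uses SimPy's generic resource and delay primitives as stand-ins for the visual general purpose elements; the element name, durations, and process structure are illustrative assumptions, not the authors' design.

```python
# Minimal sketch of an SPS-style element (tunneling theme, per the keywords):
# domain inputs on the outside, generic simulation primitives on the inside.
import random
import simpy

class TunnelAdvanceElement:
    """Hypothetical SPS element: exposes domain inputs, hides event logic."""
    def __init__(self, env, tbm, excavate_hrs=(3, 5), line_hrs=(1, 2)):
        self.env = env
        self.tbm = tbm                    # general purpose resource element
        self.excavate_hrs = excavate_hrs  # assumed stochastic durations
        self.line_hrs = line_hrs

    def advance_one_section(self):
        # behavior customized by chaining general purpose elements:
        # seize resource -> stochastic delay -> stochastic delay -> release
        with self.tbm.request() as req:
            yield req
            yield self.env.timeout(random.uniform(*self.excavate_hrs))
            yield self.env.timeout(random.uniform(*self.line_hrs))

def project(env, element, sections=10):
    for _ in range(sections):
        yield env.process(element.advance_one_section())
    print(f"{sections} sections finished at t = {env.now:.1f} h")

env = simpy.Environment()
element = TunnelAdvanceElement(env, tbm=simpy.Resource(env, capacity=1))
env.process(project(env, element))
env.run()
```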

Author(s):  
Marta K. Isaeva

The paper is dedicated to the memory of K.A. Bagrinovsky, a well-known scientist, doctor of economic sciences, and professor. His research addressed theoretical problems of mathematical modeling of the functioning of the economy, and his works on operations research, decision-making methods, and simulation were recognized by the scientific community. At CEMI RAS, Bagrinovsky analyzed and modeled mechanisms of scientific and technological development for production systems at different levels of the economic hierarchy, in both centrally controlled and market-based settings. The paper reviews his investigations (2001–2015) into the analysis and simulation of different mechanisms of innovation activity, and discusses methods for developing complexes of simulation models. In a sense, simulation modeling is both a science and an art: the selection of the salient parameters for model construction, the simplifications adopted, the computer experiments, and decision making based on models of limited accuracy all rest on human judgment, on practical trial, intelligence, and intuition. K.A. Bagrinovsky made a considerable contribution to the development of this direction of economic and mathematical research. The principal objective was to show that the relationships between innovation policy and the technological structure, the scientific research sector, the introduction of progressive production, and the organizational structure can be captured by models. The character of these relationships can then be used to control the parameters of economic modernization. The construction of simulation models and experimental computational analysis were used to investigate different mechanisms of innovation-driven development, and the variants were evaluated at the modeling level by computer experiment.


1987 ◽  
Vol 14 (3) ◽  
pp. 134-140 ◽  
Author(s):  
K.A. Clarke

Practical classes in neurophysiology reinforce and complement the theoretical background in a number of ways, including demonstration of concepts, practice in planning and performance of experiments, and the production and maintenance of viable neural preparations. The balance of teaching objectives will depend upon the particular group of students involved. A technique is described which allows the embedding of real compound action potentials from one of the most basic introductory neurophysiology experiments (frog sciatic nerve) into interactive programs for student use. These retain all the elements of the "real experiment" in terms of appearance, presentation, experimental management, and measurement by the student. Laboratory reports by the students show that the experiments are carefully and enthusiastically performed and the material is well absorbed. Three groups of students derive most benefit from their use. First, students whose future careers will not involve animal experiments do not spend time developing dissecting skills they will not use, but more time fulfilling the other teaching objectives. Second, relatively inexperienced students who struggle to produce viable neural material and master complicated laboratory equipment are often left with little time or motivation to take accurate readings or ponder neurophysiological concepts. Third, students in institutions where neurophysiology is taught with difficulty because of the high cost of equipment and the lack of specific expertise may well have access to a low-cost general purpose microcomputer system.


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Bo-yong Park ◽  
Seok-Jun Hong ◽  
Sofie L. Valk ◽  
Casey Paquola ◽  
Oualid Benkarim ◽  
...  

Abstract The pathophysiology of autism has been suggested to involve a combination of both macroscale connectome miswiring and microcircuit anomalies. Here, we combine connectome-wide manifold learning with biophysical simulation models to understand associations between global network perturbations and microcircuit dysfunctions in autism. We studied neuroimaging and phenotypic data in 47 individuals with autism and 37 typically developing controls obtained from the Autism Brain Imaging Data Exchange initiative. Our analysis establishes significant differences in structural connectome organization in individuals with autism relative to controls, with strong between-group effects in low-level somatosensory regions and moderate effects in high-level association cortices. Computational models reveal that the degree of macroscale anomalies is related to atypical increases of recurrent excitation/inhibition, as well as subcortical inputs into cortical microcircuits, especially in sensory and motor areas. Transcriptomic association analysis based on postmortem datasets identifies genes expressed in cortical and thalamic areas from childhood to young adulthood. Finally, supervised machine learning finds that the macroscale perturbations are associated with symptom severity scores on the Autism Diagnostic Observation Schedule. Together, our analyses suggest that atypical subcortico-cortical interactions are associated with both microcircuit and macroscale connectome differences in autism.
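
As a rough illustration of the connectome-wide manifold learning step, the hedged sketch below embeds a symmetric connectivity matrix using a diffusion-map-style spectral decomposition. The matrix is synthetic and the normalization parameter is an assumption; this is not the authors' pipeline or the ABIDE data.

```python
import numpy as np

def diffusion_embedding(conn, n_components=2, alpha=0.5):
    """Spectral embedding of a symmetric, non-negative connectivity matrix."""
    # anisotropic normalization: alpha controls the influence of node degree
    d = conn.sum(axis=1)
    L = conn / np.outer(d**alpha, d**alpha)
    # row-normalize to obtain a Markov transition matrix
    M = L / L.sum(axis=1, keepdims=True)
    eigvals, eigvecs = np.linalg.eig(M)
    order = np.argsort(-eigvals.real)
    # drop the trivial leading eigenvector, keep the next gradients
    idx = order[1:n_components + 1]
    return eigvecs[:, idx].real * eigvals[idx].real

rng = np.random.default_rng(0)
A = rng.random((90, 90))
conn = (A + A.T) / 2                 # toy symmetric "connectome"
gradients = diffusion_embedding(conn)
print(gradients.shape)               # (90, 2): one coordinate pair per region
```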


2021 ◽  
Vol 11 (9) ◽  
pp. 4122 ◽  
Author(s):  
Young-Jun Park ◽  
Chang-Yong Yi

Construction quality is one of the primary management objectives, alongside duration and cost, for construction projects. Project managers struggle to minimize duration and cost while maximizing quality. In construction projects, duration and cost take management priority; quality, on the other hand, is treated as achieved once it reaches a certain level. Although the importance of quality control in construction management has been constantly discussed, quality is still sacrificed to the goals of shortening construction duration and reducing costs. This study presents a method for estimating the quantitative quality performance of construction operations in which the level of detail is broken down to the work task level for intuitive quality performance evaluation. For this purpose, quality weights of resources, which carry proportional quality importance, and quality performance indexes of resources are used to estimate the quantitative quality performance of construction operations. A quality performance estimation and resource allocation optimization system is presented and validated using a construction simulation model.
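
A minimal sketch of the kind of task-level aggregation described, assuming quality performance is the weight-normalized combination of each resource's quality weight and quality performance index; the exact formula and the numbers are illustrative assumptions, not taken from the paper.

```python
def task_quality(resources):
    """resources: list of (quality_weight, quality_performance_index) pairs."""
    total_weight = sum(w for w, _ in resources)
    # weight-normalized combination of resource-level quality indexes
    return sum(w * qpi for w, qpi in resources) / total_weight

# e.g. a hypothetical concrete-pouring task using a crew, a pump, a vibrator
resources = [(0.5, 0.92), (0.3, 0.85), (0.2, 0.78)]
print(f"task quality performance: {task_quality(resources):.3f}")  # 0.871
```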


Author(s):  
Robert H. Sturges ◽  
Jui-Te Yang

Abstract In support of the effort to bring downstream issues to the attention of the designer as parts take shape, an analysis system is being built to extract certain features relevant to the assembly process, such as the dimension, shape, and symmetry of an object. These features can be applied to a model during the downstream process to evaluate handling and assemblability. In this paper, we will focus on the acquisition phase of the assembly process and employ a Design for Assembly (DFA) evaluation to quantify factors in this process. The capabilities of a non-homogeneous, non-manifold boundary representation geometric modeling system are used with an Index of Difficulty (ID) that represents the dexterity and time required to assemble a product. A series of algorithms based on the high-level abstractions of loop and link are developed to extract features that are difficult to orient, which is one of the DFA criteria. Examples for testing the robustness of the algorithms are given. Problems related to nearly symmetric outlines are also discussed.
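
The loop and link abstractions themselves are not detailed in the abstract; as a hedged illustration of one DFA-relevant test, the sketch below checks whether a part outline is nearly n-fold rotationally symmetric, a property that makes orientation hard to detect during acquisition. The tolerance and point-set comparison are assumptions for the sketch, not the authors' algorithm.

```python
import numpy as np

def is_nearly_n_fold_symmetric(outline, n, tol=1e-2):
    """outline: (k, 2) array of outline points; tests n-fold symmetry."""
    pts = outline - outline.mean(axis=0)      # center on the centroid
    theta = 2 * np.pi / n
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    rotated = pts @ R.T
    # compare each rotated point to its nearest original point
    d = np.linalg.norm(rotated[:, None, :] - pts[None, :, :], axis=2)
    return d.min(axis=1).max() < tol * np.linalg.norm(pts, axis=1).max()

square = np.array([[1, 1], [-1, 1], [-1, -1], [1, -1]], dtype=float)
print(is_nearly_n_fold_symmetric(square, 4))  # True: orientation is ambiguous
print(is_nearly_n_fold_symmetric(square, 3))  # False
```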


2021 ◽  
Author(s):  
Subba Ramarao Rachapudi Venkata ◽  
Nagaraju Reddicharla ◽  
Shamma Saeed Alshehhi ◽  
Indra Utama ◽  
Saber Mubarak Al Nuimi ◽  
...  

Abstract Mature hydrocarbon fields deteriorate continuously, and the selection of well interventions becomes a critical task when the objective is to achieve higher business value. Time-consuming simulation models and a classical decision-making approach make it difficult to rapidly identify the best underperforming, potential rig and rig-less candidates. The objective of this paper is therefore to demonstrate an automated solution with data-driven machine learning (ML) and AI-assisted workflows that prioritize the intervention opportunities that can deliver a higher sustainable oil rate and profitability. The solution consists of establishing a customized database using inputs from various sources, including production and completion data, flat files, and simulation models. Data gathering and the technical and economic calculations were automated to eliminate repetitive, low-value tasks. The second layer of the solution comprises tailor-made workflows that analyze well performance, logs, and output from simulation models (static reservoir model, well models) along with historical events. These workflows combine current best practices for an integrated assessment of subsurface opportunities through analytical computations with machine-learning-driven techniques for ranking well intervention opportunities, taking implementation complexity into account. The outcome of the automated process is a comprehensive list of future well intervention candidates, such as well conversion to gas lift, water shutoff, stimulation, and nitrogen kick-off opportunities. The ranking is completed with an AI-assisted scoring system that takes technical, financial, and implementation-risk scores as input. In addition, intuitive dashboards were built, tailored with the involvement of the management and engineering departments, to track the opportunity maturation process. The advisory system has been implemented and tested in a giant mature field with over 300 wells. The solution identified more techno-economically feasible opportunities within hours instead of weeks or months, with a reduced risk of failure, resulting in an improved economic success rate. The first set of opportunities is under implementation, with an expected gain of $2.5 MM within the first year and recurring gains expected in subsequent years. The ranked opportunities are incorporated into the business plan, RMP plans, and the drilling and workover schedule in accordance with field development targets. This advisory system helps maximize profitability and minimize CAPEX and OPEX, and it further increases utilization of production optimization models by 30%. The system is currently implemented in one ADNOC Onshore field and is expected to be scaled to other fields based on consistent value creation. A hybrid approach of physics-based and machine-learning-based solutions led to the development of automated workflows that identify and rank inactive strings, candidates for conversion to gas lift, and underperforming candidates, resulting in successful cost optimization and production gains.
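
The paper's scoring model is not disclosed in the abstract; the sketch below assumes a simple weighted scheme that combines technical, financial, and implementation-risk scores to rank candidates. All weights, candidate names, and values are illustrative assumptions.

```python
# Hypothetical weights: risk is penalizing, so it subtracts from the total.
WEIGHTS = {"technical": 0.4, "financial": 0.4, "risk": 0.2}

def opportunity_score(scores):
    return (WEIGHTS["technical"] * scores["technical"]
            + WEIGHTS["financial"] * scores["financial"]
            - WEIGHTS["risk"] * scores["risk"])

candidates = {
    "well-12 gas lift conversion": {"technical": 0.8, "financial": 0.9, "risk": 0.3},
    "well-07 water shutoff":       {"technical": 0.7, "financial": 0.6, "risk": 0.5},
    "well-21 stimulation":         {"technical": 0.9, "financial": 0.5, "risk": 0.2},
}
ranked = sorted(candidates, key=lambda c: opportunity_score(candidates[c]),
                reverse=True)
for name in ranked:
    print(f"{opportunity_score(candidates[name]):.2f}  {name}")
```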


2021 ◽  
Author(s):  
Mokhles Mezghani ◽  
Mustafa AlIbrahim ◽  
Majdi Baddourah

Abstract Reservoir simulation is a key tool for predicting the dynamic behavior of a reservoir and optimizing its development. Fine-scale, CPU-demanding simulation grids are necessary to improve the accuracy of the simulation results. We propose a hybrid modeling approach that minimizes the weight of the full-physics model by dynamically building and updating an artificial intelligence (AI) based model. The AI model can be used to quickly mimic the full-physics (FP) model. The proposed methodology starts by running the FP model; an associated AI model is then systematically updated using the newly performed FP runs. Once the mismatch between the two models falls below a predefined cutoff, the FP model is switched off and only the AI model is used. The FP model is switched on at the end of the exercise, either to confirm the AI model's decision and stop the study, or to reject this decision (high mismatch between the FP and AI models) and upgrade the AI model. The proposed workflow was applied to a synthetic reservoir model, where the objective is to match the average reservoir pressure. For this study, to better account for reservoir heterogeneity, a fine-scale simulation grid (approximately 50 million cells) is necessary to improve the accuracy of the reservoir simulation results. Reservoir simulation using the FP model and 1024 CPUs requires approximately 14 hours. For this history matching exercise, six parameters were selected to be part of the optimization loop. A Latin hypercube sampling (LHS) of seven FP runs is therefore used to initiate the hybrid approach and build the first AI model. During history matching, only the AI model is used. At convergence of the optimization loop, a final FP run is performed, either to confirm convergence for the FP model or to iterate the approach again, starting from an LHS around the converged solution; the next AI model is then updated using all the FP simulations performed in the study. This approach achieves a history match of very acceptable quality with far fewer computational resources and much less CPU time. CPU-intensive, multimillion-cell simulation models are commonly utilized in reservoir development, and completing a reservoir study in an acceptable timeframe is a real challenge in such situations. New concepts and techniques are needed to complete such studies successfully, and the hybrid approach we propose shows very promising results for handling this challenge.
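
A hedged sketch of the described FP/AI switching loop, with a cheap stand-in function in place of the 14-hour full-physics run and a Gaussian-process regressor playing the AI model. The objective function, optimizer, and cutoff are illustrative assumptions; only the loop structure (LHS initiation, AI-only optimization, final FP confirmation) follows the abstract.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor

def full_physics(x):                  # stand-in for the expensive FP run
    return np.sum((x - 0.3) ** 2, axis=-1)

DIM, CUTOFF = 6, 1e-3                 # six history-matching parameters

# 1) initiate with a Latin hypercube of FP runs (seven, as in the paper)
X = qmc.LatinHypercube(d=DIM, seed=0).random(n=7)
y = full_physics(X)
ai_model = GaussianProcessRegressor().fit(X, y)

# 2) optimize on the AI model only (random search here, for brevity)
rng = np.random.default_rng(1)
cand = rng.random((5000, DIM))
best = cand[np.argmin(ai_model.predict(cand))]

# 3) a final FP run confirms or rejects the AI model's decision
mismatch = abs(full_physics(best) - ai_model.predict(best[None, :])[0])
if mismatch < CUTOFF:
    print("converged: AI model confirmed by the FP run")
else:
    # re-iterate: add the FP run to the data and refit the AI model
    X, y = np.vstack([X, best]), np.append(y, full_physics(best))
    ai_model = GaussianProcessRegressor().fit(X, y)
```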


Author(s):  
Matias Javier Oliva ◽  
Pablo Andrés García ◽  
Enrique Mario Spinelli ◽  
Alejandro Luis Veiga

Real-time acquisition and processing of electroencephalographic signals have promising applications in the implementation of brain-computer interfaces. These devices allow the user to control a device without performing motor actions, and are usually made up of a biopotential acquisition stage and a personal computer (PC). This structure is very flexible and appropriate for research, but for final users it is necessary to migrate to an embedded system, eliminating the PC from the scheme. The strict real-time processing requirements of such systems justify the choice of a system-on-chip field-programmable gate array (SoC-FPGA) for the implementation. This article proposes a platform for the acquisition and processing of electroencephalographic signals using this type of device, which combines the parallelism and speed capabilities of an FPGA with the simplicity of a general-purpose processor on a single chip. In this scheme, the FPGA is in charge of real-time operation, acquiring and processing the signals, while the processor solves the high-level tasks, with the interconnection between processing elements solved by buses integrated into the chip. The proposed scheme was used to implement a brain-computer interface based on steady-state visual evoked potentials, which was used to command a speller. The first tests of the system show that a selection time of 5 seconds per command can be achieved. The time delay between the user's selection and the system response has been estimated at 343 µs.
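
As a rough illustration of the steady-state visual evoked potential (SSVEP) detection such a system performs, the sketch below compares spectral power at candidate flicker frequencies and picks the strongest. The sampling rate, window length, and frequency set are assumptions, not the paper's parameters, and the detection runs here in Python rather than on the FPGA.

```python
import numpy as np

FS = 250.0                              # assumed EEG sampling rate, Hz
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]    # assumed speller flicker frequencies

def detect_ssvep(eeg_window):
    """eeg_window: 1-D array of samples from an occipital channel."""
    windowed = eeg_window * np.hanning(len(eeg_window))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)
    # power at the FFT bin nearest each candidate stimulus frequency
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in STIM_FREQS]
    return STIM_FREQS[int(np.argmax(powers))]

# synthetic test: a 12 Hz response buried in noise is recovered
t = np.arange(0, 5.0, 1.0 / FS)
eeg = np.sin(2 * np.pi * 12.0 * t) + np.random.default_rng(0).normal(0, 1, t.size)
print(detect_ssvep(eeg))                # expected: 12.0
```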


2004 ◽  
Vol 11 (33) ◽  
Author(s):  
Aske Simon Christensen ◽  
Christian Kirkegaard ◽  
Anders Møller

We show that it is possible to extend a general-purpose programming language with a convenient high-level data-type for manipulating XML documents while permitting (1) precise static analysis for guaranteeing validity of the constructed XML documents relative to the given DTD schemas, and (2) a runtime system where the operations can be performed efficiently. The system, named Xact, is based on a notion of immutable XML templates and uses XPath for deconstructing documents. A companion paper presents the program analysis; this paper focuses on the efficient runtime representation.
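
Xact itself extends Java; as a language-neutral illustration of its two core ideas, immutable templates with named gaps and deconstruction by path queries, here is a hedged Python sketch using the standard library. The gap-plugging mechanism shown is a simplified stand-in for Xact's templates, not its actual API.

```python
import copy
import xml.etree.ElementTree as ET

# an "immutable template" with two empty gaps to be plugged
TEMPLATE = ET.fromstring("<entry><title/><author/></entry>")

def plug(template, **gaps):
    """Return a new tree with named child gaps filled; the template is unchanged."""
    doc = copy.deepcopy(template)   # immutability: never mutate the template
    for name, value in gaps.items():
        doc.find(name).text = value
    return doc

entry = plug(TEMPLATE, title="An XML Example", author="J. Doe")
# deconstruction by a path query, analogous to Xact's XPath-based selection
print(entry.find("./title").text)   # -> An XML Example
```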

