computer implementation
Recently Published Documents


TOTAL DOCUMENTS: 372 (FIVE YEARS: 40)
H-INDEX: 32 (FIVE YEARS: 2)

2021 ◽  
Author(s):  
Hamidreza Validi ◽  
Austin Buchanan ◽  
Eugene Lykhovyd

For nearly 60 years, operations research techniques have assisted in the creation of political districting plans, beginning with an integer programming model. This model, which seeks compactness as its objective, tends to generate districts that are contiguous, or nearly so, but provides no guarantee of contiguity. In the paper “Imposing contiguity constraints in political districting models” by Hamidreza Validi, Austin Buchanan, and Eugene Lykhovyd, the authors consider and analyze four different contiguity models (two old and two new). Their computer implementation can handle redistricting instances as large as Indiana (1,511 census tracts). Their fastest approach uses a branch-and-cut algorithm, where contiguity constraints are added in a callback. Critically, many variables can be fixed to zero a priori by Lagrangian arguments. All test instances and source code are publicly available.
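The contiguity test at the heart of such a callback can be sketched in a few lines. This is a generic illustration, not the authors' published code: given the tract adjacency graph and the set of tracts assigned to one district, a breadth-first search from any assigned tract verifies that every assigned tract is reachable through in-district edges.

```python
from collections import deque

def is_contiguous(adjacency, district):
    """Return True if `district` (a set of tract ids) induces a connected
    subgraph of the tract adjacency graph `adjacency` (dict: id -> neighbors)."""
    if not district:
        return True
    start = next(iter(district))
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adjacency.get(u, ()):
            if v in district and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen == district
```

In a branch-and-cut framework, a check like this would run inside the solver callback: whenever an integer solution assigns a disconnected set of tracts to a district, a violated contiguity cut is generated and added.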


2021 ◽  
pp. 102-105
Author(s):  
M.V. Kurkina ◽  
I.V. Ponomarev

Data analysis is one of the actively developing areas of modern computational science. The data under study have varied structure, which causes certain difficulties in smoothing and analysis. This motivates the search for new universal data-processing algorithms and the creation of computer programs that can analyze data of various nature. Today, a widely used method of data processing is regression modeling; it is applied in problems of pattern recognition, classification, dimensionality reduction, and many others. The literature describes various methods for constructing regression models, all based on optimizing a certain indicator, the quality functional. A very important requirement for the quality of such models is the absence of outliers in the data. This article discusses a method for examining a sample for outliers. The resulting algorithm can be applied to regression models estimated by the most common methods (the least squares method and the least absolute deviations method). The mathematical basis of the procedure is the Legendre transformation, which provides computational accuracy in a computer implementation. The adequacy of the algorithm was investigated on a number of test samples; all tests were positive with respect to outlier detection. A set of programs was developed in MATLAB that allows building various regression models and screening the original sample for sharply deviating observations.
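For orientation, a common baseline for the screening task the article addresses (illustrative only; the article's own procedure is based on the Legendre transformation, which is not reproduced here) is to fit a least-squares line and flag observations whose standardized residual exceeds a threshold:

```python
def flag_outliers(xs, ys, threshold=3.0):
    """Fit y = intercept + slope*x by ordinary least squares and return the
    indices of points whose residual exceeds `threshold` residual standard
    deviations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    residuals = [y - (intercept + slope * x) for x, y in zip(xs, ys)]
    # residual standard deviation with n - 2 degrees of freedom
    sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
    return [i for i, r in enumerate(residuals) if abs(r) > threshold * sigma]
```

A robust procedure such as the article's would improve on this baseline, since a large outlier inflates both the fitted line and sigma itself.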


Author(s):  
Alla Nesterenko ◽  
Oleksandr Duchenko

The paper is devoted to methods of geometric modeling of plane curves given in the natural parameterization. It considers numerical modeling methods that make it possible to find the curvature equation of the desired curve for different cases of input data. The unknown curvature distribution coefficients of the required curve are determined by solving a system of nonlinear integral equations, and various numerical methods are considered for solving this nonlinear system. The results of a computer implementation of the proposed methods are presented for two curvilinear contours with different initial data. For the first curve, the input data are the coordinates of three points, the angles of inclination of the tangents at the extreme points, and a linear law of curvature distribution. The second example considers an S-shaped curve with a quadratic law of curvature distribution.
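The forward problem underlying these models can be sketched directly (the function and the simple Euler scheme below are illustrative, not the authors' algorithm): given a curvature law k(s) in the natural parameterization, the curve is recovered by integrating theta(s) = theta0 + ∫k ds and then x'(s) = cos(theta), y'(s) = sin(theta).

```python
import math

def curve_from_curvature(k, length, theta0=0.0, x0=0.0, y0=0.0, n=1000):
    """Reconstruct a plane curve from its curvature law k(s) on [0, length]
    by explicit Euler integration; returns the list of sampled (x, y) points."""
    ds = length / n
    theta, x, y = theta0, x0, y0
    points = [(x, y)]
    for i in range(n):
        s = i * ds
        theta += k(s) * ds          # tangent angle: theta' = k(s)
        x += math.cos(theta) * ds   # x' = cos(theta)
        y += math.sin(theta) * ds   # y' = sin(theta)
        points.append((x, y))
    return points
```

A constant law k(s) = 1 over length pi traces a unit half-circle; a linear law k(s) = a + b*s gives a clothoid-like arc. The paper's inverse problem determines such coefficients from the interpolation conditions.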


2021 ◽  
Author(s):  
Prosper Kiisi Lekia

Abstract One of the challenges of the petroleum industry is achieving maximum recovery from oil reservoirs. With the natural energy of the reservoir alone, primary recovery in most cases does not exceed 20%. To improve recovery, secondary recovery techniques such as waterflooding are employed, with which an incremental recovery of 15 to 25% can be achieved. Several theories and methods have been developed for predicting waterflood performance; the Dykstra-Parsons technique stands as the most widely used of them. The authors developed a discrete, analytical solution from which the vertical coverage, water-oil ratio, cumulative oil produced, cumulative water produced and injected, and the time required for injection were determined. Reznik et al. extended the work of Dykstra and Parsons to exact, analytical, continuous solutions, with explicit solutions for constant injection pressure and constant overall injection rate conditions, in both property time and real (process) time, under the assumption of piston-like displacement. This work presents a computer implementation that compares the results of the Dykstra-Parsons method and the Reznik et al. extension. A user-friendly graphical user interface executable application has been developed for both methods using Python 3. The application provides an interactive GUI output for graphs and tables with the Python matplotlib module and pandastable. The GUI was built with Tkinter and converted to an executable desktop application using PyInstaller and the Nullsoft Scriptable Install System, to serve as a hands-on tool for petroleum engineers and the industry. The results of the program for both methods gave a close match with those obtained from a simulation performed with Flow (Open Porous Media). The results provide further insight into the underlying principles and applications of the methods.
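The heterogeneity measure on which the layered Dykstra-Parsons prediction rests can be illustrated with a small helper (ours, not the paper's program): for a log-normally distributed permeability sample, the permeability variation coefficient is V = (k50 - k84.1) / k50, where k50 is the median and k84.1 the permeability one log-space standard deviation below it.

```python
import math
import statistics

def dykstra_parsons_v(perms):
    """Dykstra-Parsons permeability variation coefficient V, computed under
    the usual log-normal assumption; V = 0 means a homogeneous reservoir,
    V -> 1 means extreme heterogeneity."""
    logs = [math.log(k) for k in perms]
    mu = statistics.mean(logs)
    sigma = statistics.pstdev(logs)    # population std dev of ln(k)
    k50 = math.exp(mu)                 # median permeability
    k841 = math.exp(mu - sigma)        # permeability at 84.1% cumulative
    return (k50 - k841) / k50
```

Under this log-normal assumption V reduces to 1 - exp(-sigma), so a perfectly uniform sample gives V = 0.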


Mathematics ◽  
2021 ◽  
Vol 9 (15) ◽  
pp. 1818
Author(s):  
Esko Turunen ◽  
Klara Dolos

We investigate the applicability and usefulness of the GUHA data mining method and its computer implementation LISp-Miner for driver characterization based on digital vehicle data such as gas pedal position and vehicle speed. Three analytical questions are assessed: (1) Which measured features, also called attributes, distinguish each driver from all other drivers? (2) Comparing one driver pairwise with each of the other drivers, which attributes are the most distinguishing? (3) In the same pairwise comparisons, which attribute values show significant differences between drivers? The analyzed data consist of 94,380 measurements and contain clear and understandable patterns for LISp-Miner to find. In conclusion, we find that the GUHA method is well suited for such tasks.
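The flavor of the hypotheses GUHA generates can be conveyed with a minimal sketch of its basic "founded implication" quantifier (parameter names are ours, not LISp-Miner's API): a rule "antecedent => succedent" is accepted if, among the a + b objects satisfying the antecedent, at least a fraction p also satisfy the succedent, and the support a reaches a minimum base.

```python
def founded_implication(a, b, p=0.9, base=50):
    """a: count of objects satisfying both antecedent and succedent;
    b: count satisfying the antecedent only. Accept the rule when the
    confidence a / (a + b) reaches p and the support a reaches base."""
    return a >= base and a / (a + b) >= p
```

LISp-Miner systematically enumerates candidate antecedents and succedents over the attribute categories and reports every rule passing such a quantifier.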


2021 ◽  
Vol 20 ◽  
pp. 107-117
Author(s):  
TIMOTHY MICHAEL CHÁVEZ ◽  
DUC THAI NGUYEN

While minimum cost flow (MCF) problems have been well documented in many publications due to their broad applications, little or no effort has been devoted to explaining the algorithms for identifying loop formation and computing the θ value needed to solve MCF network problems. This paper proposes efficient algorithms, and a MATLAB computer implementation, for solving MCF problems. Several academic and real-life network problems have been solved to validate the proposed algorithms; the numerical results obtained by the developed MCF code have been compared and matched with the built-in MATLAB function linprog() (simplex algorithm) for further validation.
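The two steps the paper focuses on can be sketched as follows (a hedged illustration of the standard network-simplex view, not the paper's MATLAB code): adding the entering arc (u, v) to the spanning tree creates exactly one loop, found by walking both endpoints up to their common ancestor; θ is then the largest flow change the loop permits, i.e. the minimum flow on the arcs oriented against the entering arc.

```python
def find_loop(parent, u, v):
    """Return the tree path u -> ... -> apex -> ... -> v; together with the
    entering arc (u, v) it closes the unique loop. `parent` maps each node
    to its tree parent (the root maps to None)."""
    ancestors_u = []
    node = u
    while node is not None:
        ancestors_u.append(node)
        node = parent[node]
    ancestor_set = set(ancestors_u)
    path_v = []
    node = v
    while node not in ancestor_set:   # climb from v until hitting u's chain
        path_v.append(node)
        node = parent[node]
    apex = node                       # lowest common ancestor of u and v
    path_u = ancestors_u[:ancestors_u.index(apex) + 1]
    return path_u + list(reversed(path_v))

def theta_value(loop_arcs, flow):
    """loop_arcs: list of (arc_id, direction) with direction -1 for arcs
    traversed against their orientation; theta is the minimum flow on those
    opposing arcs (infinite if every arc agrees with the loop direction)."""
    opposing = [flow[a] for a, d in loop_arcs if d == -1]
    return min(opposing) if opposing else float("inf")
```

Pushing θ units around the loop then drives one opposing arc's flow to zero, and that arc leaves the tree.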


Author(s):  
Jekaterina Aleksejeva ◽  
Sharif Guseynov

In the present paper, on the basis of the theory of inverse and ill-posed problems, an algorithm is proposed that allows one to unambiguously determine the stoichiometric coefficients in equations of chemical reactions of any type, including redox and acid-base reactions. It applies regardless of whether the constructed system of linear algebraic equations for the desired stoichiometric coefficients is underdetermined (fewer equations than unknowns) or overdetermined (more equations than unknowns). The proposed algorithm is regularized in the sense of Tikhonov, which ensures that, in a computer implementation, possible computational errors will not render the assembled system of linear algebraic equations unsolvable.
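The linear-algebraic core of the task can be sketched for the well-posed case (a simplified illustration; the paper's regularized algorithm handles the ill-posed cases this sketch does not): balancing a reaction means finding a nonzero solution of A c = 0, where rows index chemical elements, columns index species, and product columns are negated. Here we assume the nullspace is one-dimensional, take it by exact Gaussian elimination over rationals, and scale to smallest integers.

```python
import math
from fractions import Fraction

def balance(matrix):
    """Given the element-by-species matrix (product columns negated),
    return the smallest integer stoichiometric coefficients."""
    m = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(m), len(m[0])
    pivots, r = [], 0
    for c in range(cols):                      # reduce to RREF exactly
        pivot = next((i for i in range(r, rows) if m[i][c] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        m[r] = [x / m[r][c] for x in m[r]]
        for i in range(rows):
            if i != r and m[i][c] != 0:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    f = [c for c in range(cols) if c not in pivots][0]  # single free column
    sol = [Fraction(0)] * cols
    sol[f] = Fraction(1)
    for i, c in enumerate(pivots):
        sol[c] = -m[i][f]                      # back-substitute
    lcm = 1
    for x in sol:                              # clear denominators
        lcm = lcm * x.denominator // math.gcd(lcm, x.denominator)
    ints = [int(x * lcm) for x in sol]
    g = 0
    for v in ints:
        g = math.gcd(g, v)
    return [v // g for v in ints]
```

For H2 + O2 -> H2O the matrix has rows H: [2, 0, -2] and O: [0, 2, -1], and the routine returns the familiar coefficients 2, 1, 2. The paper's contribution is precisely the case where this naive elimination breaks down numerically or the system is over- or underdetermined.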


Author(s):  
Alessandro Signa ◽  
Antonio Chella ◽  
Manuel Gentile

Abstract Purpose of Review The theory of consciousness has kept scholars and researchers challenged for centuries. Even today it is not possible to define what consciousness is. This has led to the theorization of different models of consciousness. Starting from Baars' Global Workspace Theory, this paper examines the models of cognitive architectures that are inspired by it and that can represent a reference point in the field of robot consciousness. Recent Findings Global Workspace Theory has recently been ranked as the most promising theory in its field. However, this is not reflected in the mathematical models of cognitive architectures inspired by it: they are few, and most of them are a decade old, which is a long time given the speed at which artificial intelligence techniques are improving. Indeed, recent publications propose simple mathematical models that are well suited to computer implementation. Summary In this paper, we introduce an overview of consciousness and robot consciousness, with some interesting insights from the literature. We then focus on Baars' Global Workspace Theory, presenting it briefly. Finally, we report on the most interesting and promising models of cognitive architectures that implement it, describing their peculiarities.
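The broadcast cycle at the heart of Baars' theory can be caricatured in a few lines (a toy formulation of our own, not any of the surveyed architectures): specialized processes bid for access to the workspace, the most salient bid wins, and its content is broadcast globally.

```python
class GlobalWorkspace:
    """Minimal global-workspace loop: registered processes propose
    (salience, content) pairs for a stimulus; the winner's content is
    broadcast and becomes globally available."""

    def __init__(self):
        self.processes = []
        self.broadcast = None

    def register(self, process):
        self.processes.append(process)

    def cycle(self, stimulus):
        bids = [p(stimulus) for p in self.processes]  # competition phase
        salience, content = max(bids)                 # strongest bid wins
        self.broadcast = content                      # broadcast phase
        return content
```

Real architectures in this family add coalition formation, learning, and feedback from the broadcast to the competing processes, which is where the surveyed models differ most.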

