Proceedings of the Institute for System Programming of RAS
Latest Publications


TOTAL DOCUMENTS: 840 (five years: 276)

H-INDEX: 5 (five years: 2)

Published by the Institute for System Programming of the Russian Academy of Sciences

ISSN: 2220-6426, 2079-8156

2021, Vol 33 (5), pp. 219-236
Author(s): Javier Ortiz-Hernández, Víctor Josue Ruiz-Martínez, María Yasmín Hernández-Pérez, Rito Mijarez-Castro

On September 19, 2017, an earthquake with a magnitude of 7.1 on the Richter scale occurred in Mexico, with its epicenter on the border between the states of Puebla and Morelos. A total of 31,090 homes were affected, 1,659 of them with total damage. In order to attend to the contingency, the government of Morelos formed an inter-institutional committee within the first hours to carry out an initial damage assessment and to provide emergency services. This article presents a case study and lessons learned from the software engineering support for the development of a data-driven platform across the various phases of contingency response: a census of damaged homes, identification of aid beneficiaries, determination of aid packages according to a damage assessment, logistics and follow-up of aid package delivery, data-driven decision making, and a public portal for open data and budget transparency.


Author(s): Nikolay Anatolievich Vershkov, Mikhail Grigoryevich Babenko, Viktor Andreevich Kuchukov, Natalia Nikolaevna Kuchukova

The article deals with the problem of recognizing handwritten digits using feedforward neural networks (perceptrons) with a correlation indicator. The proposed method is based on a mathematical model of the neural network as an oscillatory system similar to an information transmission system. The article uses the authors' earlier theoretical results on searching for the global extremum of the error function in artificial neural networks. The handwritten digit image is considered as a one-dimensional discrete input signal representing a combination of an "ideal digit writing" and noise, which describes the deviation of the input realization from the ideal writing. The ideal observer criterion (Kotelnikov criterion), which is widely used in information transmission systems and describes the probability of correct recognition of the input signal, is used to form the loss function. The article presents a comparative analysis of learning convergence on experimentally obtained sequences for the correlation indicator and for the CrossEntropyLoss function widely used in classification tasks, both with and without an optimizer. Based on the experiments carried out, it is concluded that the proposed correlation indicator provides a 2-3-fold advantage.
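
As an illustration of the kind of indicator involved (a minimal NumPy sketch with a hypothetical normalized cross-correlation measure; the paper's actual Kotelnikov-criterion formulation is not reproduced here), a correlation between a network output and an "ideal digit" template can be turned into a loss and compared with cross-entropy:

    import numpy as np

    def correlation_indicator(output, template):
        # Normalized cross-correlation between the network output vector and
        # the "ideal writing" template of a class (1.0 means a perfect match).
        o = output - output.mean()
        t = template - template.mean()
        return float(np.dot(o, t) / (np.linalg.norm(o) * np.linalg.norm(t) + 1e-12))

    def correlation_loss(output, template):
        # The closer the output is to the ideal writing, the smaller the loss.
        return 1.0 - correlation_indicator(output, template)

    def cross_entropy_loss(probs, true_class):
        # Standard cross-entropy over softmax probabilities, for comparison.
        return -float(np.log(probs[true_class] + 1e-12))

    rng = np.random.default_rng(0)
    output = rng.normal(size=10)              # toy 10-dimensional network output
    template = np.eye(10)[3]                  # "ideal" one-hot template for digit 3
    probs = np.exp(output) / np.exp(output).sum()
    print(correlation_loss(output, template), cross_entropy_loss(probs, 3))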


2021, Vol 33 (1), pp. 173-188
Author(s): Maribel Tello-Rodríguez, Jorge Octavio Ocharán-Hernández, Juan Carlos Pérez-Arriaga, Xavier Limón, Ángel J. Sánchez-García

Cloud computing trends such as Software as a Service (SaaS) enable providers to host complex applications over the Internet, making them available to external consumers through an Application Programming Interface (API). The success of a SaaS, and in some sense of any distributed system, is greatly influenced by its API. Highly usable APIs improve the efficiency and quality of the development process, increasing programmers' productivity while letting them appreciate the other qualities of the API. Several studies indicate that the design phase of an API is the most appropriate stage at which to address usability issues; usability should therefore be considered an explicit criterion in API design. In this paper, we propose a design guide for web APIs with an emphasis on usability, drawing on best practices for the design of usable web APIs. Our design guide is based on an adaptation of the design science research methodology (DSRM) and is complemented with a systematic literature review and a gray literature analysis concerning methods, techniques, and tools used to develop usable APIs.
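
By way of illustration (a minimal sketch of a few widely cited web-API usability practices such as noun-based resource paths, URL versioning, and meaningful status codes; the endpoint and the use of FastAPI are assumptions, not part of the guide proposed in the paper):

    from fastapi import FastAPI, HTTPException

    app = FastAPI(title="Orders API", version="1.0")

    ORDERS = {"42": {"id": "42", "status": "shipped"}}

    # Plural, noun-based resource path with explicit versioning in the URL.
    @app.get("/v1/orders/{order_id}")
    def get_order(order_id: str):
        order = ORDERS.get(order_id)
        if order is None:
            # Meaningful status codes and error messages help API consumers.
            raise HTTPException(status_code=404, detail="Order not found")
        return order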


Author(s): Viktor Sergeevich Kryshtapovich

Gradual typing is a modern approach to combining the benefits of static and dynamic typing. Although scientific research aims for soundness of type systems, many languages intentionally make their type systems unsound for the sake of performance. This paper describes an implementation of a dialect of the Lama programming language that supports gradual typing with explicit annotation of the dangerous parts of the code. The goal of the current implementation is to grant type safety to programs while preserving the expressive power of untyped code. The paper covers implementation issues and the properties of the created type system. Finally, some perspectives on improving the precision and soundness of the type system are discussed.
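
The general idea can be sketched by analogy with Python's own gradual typing (an illustration only; the Lama dialect and the annotations from the paper are not shown): statically checked code interoperates with dynamically typed code, and the dangerous boundary is marked explicitly.

    from typing import Any, Dict, cast

    def parse_config(raw: Any) -> Dict[str, int]:
        # Dynamically typed code: the checker cannot verify what comes out of here.
        return dict(raw)

    def port_of(config: Dict[str, int]) -> int:
        # Statically typed code: safe as long as its inputs honor the annotation.
        return config["port"]

    raw: Any = {"port": 8080}                           # value produced by untyped code
    config = cast(Dict[str, int], parse_config(raw))    # explicitly annotated "dangerous" spot
    print(port_of(config))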


2021, Vol 33 (3), pp. 109-122
Author(s): Anton Mikhailovich Rigin, Sergey Andreevich Shershakov

These days, most time-critical business processes are performed using computer technologies. One example is financial processes, including trading on stock exchanges powered by electronic communication protocols such as the Financial Information eXchange (FIX) protocol. One of the main challenges with such processes is maintaining the best possible performance, since any unspecified delay may cause a large financial loss or other damage. Therefore, performance analysis of time-critical systems and applications is required. In the current work, we develop a novel method for the performance analysis of time-critical applications based on the db-net formalism, which combines the ability of colored Petri nets to model a system's control flow with the ability to model relational database states. The method makes it possible to analyze the performance of time-critical applications that work as transactional systems and whose log messages can be represented as table records in a relational database. One such application is a FIX-based trading communication system, which is used in this work to demonstrate the applicability of the proposed method; many similar systems exist in other domains, and the method can be applied to their performance analysis as well. A software prototype was developed to test and demonstrate the capabilities of the method. It is based on an extension of the Renew software tool, a reference net simulator. The test input for the prototype is a log of FIX messages provided by a developer of testing solutions for one of the global stock exchanges. An application of the method to the quantitative analysis of maximum acceptable delay violations is presented. The developed method makes it possible to carry out performance analysis as a part of conformance checking of the system under consideration. The method can be used in further research in this domain as well as in testing the performance of real time-critical software systems.
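
As a rough illustration of the kind of analysis involved (a minimal sketch with sqlite3 and a made-up two-message log; the message types, timestamps, and the 5 ms threshold are assumptions, and this is not the paper's db-net/Renew prototype), log records stored as relational table rows can be queried for violations of a maximum acceptable delay:

    import sqlite3

    MAX_DELAY_MS = 5.0  # assumed maximum acceptable delay

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE fix_log (order_id TEXT, msg_type TEXT, ts_ms REAL)")
    con.executemany(
        "INSERT INTO fix_log VALUES (?, ?, ?)",
        [("A1", "NewOrderSingle", 0.0), ("A1", "ExecutionReport", 3.2),
         ("B7", "NewOrderSingle", 1.0), ("B7", "ExecutionReport", 9.8)],
    )

    # Pair each request with its response and report delays above the threshold.
    violations = con.execute(
        """
        SELECT req.order_id, resp.ts_ms - req.ts_ms AS delay_ms
        FROM fix_log AS req
        JOIN fix_log AS resp ON req.order_id = resp.order_id
        WHERE req.msg_type = 'NewOrderSingle'
          AND resp.msg_type = 'ExecutionReport'
          AND resp.ts_ms - req.ts_ms > ?
        """,
        (MAX_DELAY_MS,),
    ).fetchall()
    print(violations)  # one violation: order B7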


Author(s): Anna Vadimovna Lapkina, Andrew Alexandrovitch Petukhov

The problem of automatic request classification, as well as the problem of determining server-side routing rules for requests, is directly connected with the analysis of the user interface of dynamic web pages. This problem can be solved at the browser level, since the browser contains complete information about the possible requests arising from interaction between the user and the web application. In this paper, in order to extract classification features, we suggest using data from the request execution context in the web client. A request context, or request trace, is a collection of additional identification data that can be obtained by observing the execution of a web page's JavaScript code or the changes of user interface elements caused by their activation. Such data include, for example, the position and style of the element that caused the client request, the JavaScript function call stack, and the changes in the page's DOM tree after the request was initiated. In this study, an implementation based on the Chrome DevTools Protocol is used to solve the problem at the browser level and to automate the collection of request traces.
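
A rough sketch of how such traces can be collected at the browser level (assuming the third-party `websockets` package, a Chrome instance started with --remote-debugging-port=9222, and a placeholder target URL; this is not the authors' tool): subscribe to Chrome DevTools Protocol network events and record the initiator data, which includes the JavaScript call stack of each request.

    import asyncio
    import json
    import websockets  # pip install websockets

    # WebSocket debugger URL of a tab, obtained from http://localhost:9222/json
    DEBUGGER_WS_URL = "ws://localhost:9222/devtools/page/<target-id>"

    async def trace_requests():
        async with websockets.connect(DEBUGGER_WS_URL) as ws:
            await ws.send(json.dumps({"id": 1, "method": "Network.enable"}))
            while True:
                msg = json.loads(await ws.recv())
                if msg.get("method") == "Network.requestWillBeSent":
                    params = msg["params"]
                    # The initiator field carries the JavaScript call stack or
                    # DOM origin of the request -- one part of the request trace.
                    print(params["request"]["url"], params.get("initiator"))

    asyncio.run(trace_requests())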


2021, Vol 33 (5), pp. 271-282
Author(s): Elena Sergeevna Baimetova, Albina Firdavesovna Gizzatullina, Maria Ravilevna Koroleva, Olga Vladimirovna Mishchenkova, Fyodor Nikolaevich Pushkarev, ...

This study is devoted to the numerical modeling of conjugate heat transfer in a closed-type power installation whose working elements are ribbed bimetallic tubes, using the OpenFOAM toolbox. Modeling the heat transfer process in bimetallic tubes requires determining the value of the contact thermal resistance at the metal/metal interface. The considered design of a bimetallic tube involves crimping copper washers onto the surface of an aluminum cylindrical tube; hence, the contact surface of the tube is not isotropic in its properties. A mathematical model of conjugate heat transfer for the air/bimetal/coolant media is proposed. The features of the thermophysical processes at the metal/metal contact interface and at the metal/air and metal/coolant boundaries are shown. A qualitative comparison of the obtained results with known experimental data is carried out. Generalized temperature profiles in the longitudinal section of a rib are obtained by mathematical modeling. The resulting distributions of temperature and heat flux make it possible to estimate the contribution of each individual rib to the investigated heat removal from the air environment. The efficiency of the considered technology for manufacturing a bimetallic finned tube is shown.
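
The role of the contact resistance can be illustrated with a back-of-the-envelope one-dimensional resistance network (illustrative material properties and coefficients only; this is not the paper's OpenFOAM model):

    # Heat flux from hot air through a copper washer, a metal/metal contact
    # resistance and an aluminum tube wall into the coolant (1D approximation).
    k_cu, k_al = 385.0, 205.0          # thermal conductivities, W/(m*K)
    t_cu, t_al = 1e-3, 2e-3            # layer thicknesses, m
    R_contact = 1e-4                   # assumed contact resistance, m^2*K/W
    h_air, h_coolant = 50.0, 2000.0    # convective coefficients, W/(m^2*K)
    T_air, T_coolant = 80.0, 20.0      # bulk temperatures, deg C

    # Series of thermal resistances per unit area, m^2*K/W
    R_total = 1/h_air + t_cu/k_cu + R_contact + t_al/k_al + 1/h_coolant
    q = (T_air - T_coolant) / R_total  # heat flux, W/m^2
    print(f"heat flux: {q:.1f} W/m^2")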


2021, Vol 33 (4), pp. 163-176
Author(s): Nikolai Mikhailovich Suvorov, Lyudmila Nickolaevna Lyadova

Visual modeling is widely used nowadays, but the existing modeling platforms cannot meet all user requirements. Visual languages are usually based on graph models, but the graph types used have significant restrictions. A new graph model, called the HP-graph, whose main element is a set of poles, subsets of which are combined into vertices and edges, has previously been presented to overcome the insufficient expressiveness of the existing graph models. Transformations and many other operations on visual models face the problem of subgraph matching, which slows down their execution. A multilayer approach to subgraph matching can solve this problem if the modeling system is based on the HP-graph. In this case, the search starts on the higher level of the graph model, where vertices and hyperedges are compared without revealing their structures, and only when a candidate is found does it move to the level of poles, where the decomposed structures are compared. The idea of the multilayer approach is described, and a backtracking algorithm based on it is presented. The Ullmann algorithm and VF2 are adapted to this approach and analyzed for complexity. The proposed approach incrementally decreases the search space of the backtracking algorithm and helps to decrease its overall complexity. The paper proves that the existing subgraph matching algorithms, except those that modify the graph pattern, can be successfully adapted to the proposed approach.
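
The two-level filtering idea can be sketched on a deliberately simplified data model (plain Python dictionaries standing in for HP-graph vertices and their poles; this is not the paper's implementation): candidates are first compared by coarse vertex-level attributes, and only the survivors are checked against the decomposed pole-level structure.

    def coarse_match(pattern_vertex, host_vertex):
        # Level 1: compare vertices/hyperedges without revealing their structure.
        return (pattern_vertex["label"] == host_vertex["label"]
                and len(pattern_vertex["poles"]) <= len(host_vertex["poles"]))

    def fine_match(pattern_vertex, host_vertex):
        # Level 2: decompose the candidate and try to embed the pattern's poles.
        return pattern_vertex["poles"] <= host_vertex["poles"]

    def candidates(pattern_vertex, host_graph):
        # Coarse filtering prunes the search space before the more expensive
        # pole-level comparison, which is the essence of the multilayer approach.
        coarse = [v for v in host_graph if coarse_match(pattern_vertex, v)]
        return [v for v in coarse if fine_match(pattern_vertex, v)]

    host = [{"label": "A", "poles": {1, 2, 3}},
            {"label": "A", "poles": {4}},
            {"label": "B", "poles": {1, 2}}]
    pattern_vertex = {"label": "A", "poles": {1, 2}}
    print(candidates(pattern_vertex, host))   # only the first host vertex survives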


Author(s): Aleksandr Igorevich Getman, Maria Kirillovna Ikonnikova

The article is dedicated to the problem of classifying network traffic into three categories (transparent, compressed, and opaque), preferably in real time. It begins with a description of the areas where this problem needs to be solved and then reviews the existing solutions, their methods, advantages, and limitations. Since most current research separates traffic either into transparent and opaque or into compressed and encrypted, a subset of existing methods has to be combined to unite these two problems into one. The main mathematical ideas behind the methods used in prior research are described, and the best-performing of them are combined and used as features for a random forest classifier that divides the provided traffic into the three classes. The best-performing features are selected, the optimal tree parameters are chosen, and, moreover, the initial three-class classifier is split into two sequential classifiers to save classification time in the case of transparent packets. A new method is then proposed for classifying a whole network flow, as a single unit, into one of those three classes; its validity is confirmed on several examples of the protocols most specific to this area (SSH, SSL). The article concludes with the directions in which this research is to be continued, mostly optimizing it for real-time classification and obtaining more traffic samples suitable for experiments and demonstrations.
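
A toy sketch of the general pipeline (hypothetical features and training payloads, scikit-learn's RandomForestClassifier; the paper's actual feature set and two-stage classifier are not reproduced): byte-level statistics such as entropy feed a random forest that separates transparent, compressed, and opaque traffic.

    import math
    import os
    import zlib
    from collections import Counter
    from sklearn.ensemble import RandomForestClassifier

    def byte_features(payload: bytes):
        counts = Counter(payload)
        probs = [c / len(payload) for c in counts.values()]
        entropy = -sum(p * math.log2(p) for p in probs)           # 0..8 bits per byte
        printable = sum(32 <= b < 127 for b in payload) / len(payload)
        return [entropy, printable]

    # Hypothetical training payloads, one per class.
    samples = [
        (b"GET /index.html HTTP/1.1\r\nHost: example.org\r\n\r\n", "transparent"),
        (zlib.compress(b"some highly repetitive text " * 100), "compressed"),
        (os.urandom(512), "opaque"),   # stands in for encrypted traffic
    ]
    X = [byte_features(p) for p, _ in samples]
    y = [label for _, label in samples]

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    print(clf.predict([byte_features(b"POST /api/v1/items HTTP/1.1\r\n\r\n")]))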


2021, Vol 33 (4), pp. 195-210
Author(s): Roman Vyacheslavovich Baev, Leonid Vladlenovich Skvortsov, Evgeny Alekseevich Kudryashov, Ruben Arturovich Buchatskiy, Roman Aleksandrovich Zhuykov

Aggressive optimization in modern compilers may uncover vulnerabilities in program code that did not lead to bugs prior to optimization. The source of these vulnerabilities is code with undefined behavior. Programmers use such constructs relying on some particular behavior these constructs exhibited before in their experience, but the compiler is not obliged to stick to that behavior and may change it when needed for optimization, since the behavior is left undefined by the language standard. This article describes approaches to detecting and eliminating vulnerabilities arising from optimization in the case when source code is available but its modification is undesirable or impossible. The concept of a safe compiler (i.e., a compiler that ensures no vulnerability is added to the program during optimization) is presented, and an implementation of such a compiler on top of the GCC compiler is described. The implementation of the safe compiler's functionality is divided into three security levels, whose applicability is discussed in the article. The feasibility of using the safe compiler on real-world codebases is demonstrated, and possible performance losses are estimated.

