ReS-Algorithm for Converting Normalized Values of Cost Criteria Into Benefit Criteria in MCDM Tasks

2020 ◽  
Vol 19 (05) ◽  
pp. 1389-1423
Author(s):  
Irik Z. Mukhametzyanov

A review of modern methods of data normalization in multicriteria decision-making and multidimensional classification tasks is presented. The invariant properties of linear normalization methods are determined. Two basic principles of normalization of multidimensional data are defined: preservation of the dispositions of natural and normalized values on the measurement scale, and the absence of displacement of the areas of normalized values of different criteria relative to each other. A method is proposed for converting normalized values of cost criteria into benefit criteria based on a reverse sorting algorithm (ReS-algorithm). The ReS-algorithm preserves the dispositions of the natural and normalized values of the attributes of the alternatives and eliminates the displacement of the areas of normalized values of the cost criteria relative to the benefit criteria, thereby ensuring equal contributions of the various criteria to the performance indicator of the alternatives.
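
For illustration, here is a minimal sketch of the reverse-sorting idea as the abstract describes it: the multiset of normalized cost values is kept intact (so the region of the scale they occupy does not shift), but the values are reassigned in reverse rank order, so the cheapest alternative receives the largest normalized value. The function name and example data are ours; the published ReS-algorithm should be consulted for the exact procedure.

```python
import numpy as np

def res_convert(norm_cost: np.ndarray) -> np.ndarray:
    """Reassign the same multiset of normalized values in reverse rank order.

    A sketch of the reverse-sorting idea: the set of values (and hence the
    occupied region of the [0, 1] scale) is unchanged, but the cheapest
    alternative now carries the largest value.
    """
    order = np.argsort(norm_cost)             # indices from cheapest to most expensive
    result = np.empty_like(norm_cost)
    result[order] = np.sort(norm_cost)[::-1]  # i-th smallest cost gets i-th largest value
    return result

# max-normalized costs of four alternatives (illustrative values)
costs = np.array([0.85, 0.40, 1.00, 0.55])
print(res_convert(costs))  # cheapest alternative (0.40) now holds the largest value, 1.00
```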

2017 ◽  
Vol 4 (2) ◽  
pp. 24 ◽  
Author(s):  
Ioannis Dimitrakopoulos ◽  
Kostas Karamanis

The aim of this paper is to offer an applicable evaluation framework for making the right choice of profession through one's studies. The first part of the paper presents the basic principles of Multicriteria Decision Making, focusing on the MACBETH method, which provides a framework for procedural decisions in which various qualitative and quantitative aspects are incorporated. In the second part of the paper, this multicriteria method is applied to a real-world case concerning a student, Eva. For this study, four factors proved most important in the choice of the University Eva finally selected: the cost of undergraduate studies, the reputation and status of the University, its logistics and infrastructure, and its interconnections with other Universities and Academic Institutions.
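
The final stage of a MACBETH analysis is an additive value model, V(a) = Σᵢ wᵢ·vᵢ(a), aggregating per-criterion value scores with criterion weights. Below is a toy sketch over the four criteria named above; all weights and scores are invented for illustration and are not taken from the paper.

```python
# Illustrative additive value model of the kind MACBETH produces.
# Weights and 0-100 value scores are hypothetical, not the paper's data.
criteria = ["cost of studies", "reputation", "infrastructure", "interconnections"]
weights  = [0.35, 0.30, 0.20, 0.15]      # assumed weights, must sum to 1

universities = {
    "University A": [70, 90, 60, 80],    # value score per criterion
    "University B": [85, 60, 75, 50],
}

for name, scores in universities.items():
    overall = sum(w * s for w, s in zip(weights, scores))
    print(f"{name}: overall value {overall:.1f}")
```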


2020 ◽  
Vol 10 (24) ◽  
pp. 9154
Author(s):  
Paula Morella ◽  
María Pilar Lambán ◽  
Jesús Royo ◽  
Juan Carlos Sánchez ◽  
Jaime Latapia

The purpose of this work is to develop a new Key Performance Indicator (KPI) that quantifies the cost of the Six Big Losses defined by Nakajima, and to implement it in a Cyber Physical System (CPS), achieving real-time monitoring of the KPI. The paper follows the methodology explained below. A cost model, together with the description of the Six Big Losses, was used to develop this indicator accurately. At the same time, the machine tool was integrated into a CPS using Industry 4.0 technologies, enabling real-time data acquisition. Once the KPI had been defined, we developed software (in Python) that turns these real-time data into relevant information through the calculation of our indicator. Finally, we carried out a case study showing the results of our new KPI and comparing them to other indicators related to the Six Big Losses but covering different dimensions. As a result, our research quantifies the Six Big Losses economically, facilitates the detection of the largest ones so they can be improved, and highlights the importance of simultaneously attending to different dimensions, mainly the productive, sustainability-related, and economic ones.
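
As a rough illustration of how such an indicator can be computed, the sketch below translates each of Nakajima's Six Big Losses into a cost: the four time-based losses are priced at an assumed machine-hour rate, and the two unit-based losses at an assumed per-unit cost. The cost model and all rates are hypothetical stand-ins, not the paper's exact formulation.

```python
from dataclasses import dataclass

HOURLY_RATE = 85.0   # assumed machine + labour cost per hour (EUR)
UNIT_COST   = 3.2    # assumed material/processing cost per scrapped unit (EUR)

@dataclass
class LossRecord:
    breakdown_h: float     # equipment failure downtime
    setup_h: float         # setup and adjustment downtime
    minor_stops_h: float   # idling and minor stoppages
    speed_loss_h: float    # time equivalent of reduced speed
    defect_units: int      # defects and rework during production
    startup_units: int     # reduced yield during start-up

def six_big_losses_cost(r: LossRecord) -> dict:
    """Return the cost of each of the Six Big Losses for one period."""
    time_losses = {
        "breakdowns": r.breakdown_h,
        "setup/adjustment": r.setup_h,
        "minor stoppages": r.minor_stops_h,
        "reduced speed": r.speed_loss_h,
    }
    costs = {name: hours * HOURLY_RATE for name, hours in time_losses.items()}
    costs["defects/rework"] = r.defect_units * UNIT_COST
    costs["start-up yield"] = r.startup_units * UNIT_COST
    return costs

# one shift's (invented) loss data
shift = LossRecord(1.5, 0.75, 0.4, 0.6, 42, 15)
for loss, cost in six_big_losses_cost(shift).items():
    print(f"{loss:18s} {cost:8.2f} EUR")
```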


2021 ◽  
Author(s):  
Iryna Melnychuk ◽  
Oksana Lopatovska

The modern accountant works in extremely difficult conditions associated with the rapid development of the service economy, which gives rise to new and complex business processes and operations. Since all transactions must be reflected in the enterprise's accounting system, there is a need for sound methods that allow them to be recorded within the existing legal framework. However, domestic legislation on accounting and taxation is repeatedly amended, and the changes are not always relevant or systematic. As a result, the accountant faces problematic issues that must be addressed immediately, yet their solution is not always directly regulated by law. In such conditions, the role and importance of the accountant's professional judgment, the content of which is disclosed in this article, increase significantly. It is determined that professional judgment is based on acquired knowledge, skills, abilities, experience, and professional intuition, and constitutes a kind of superstructure over professional capabilities. The value of such judgment depends on the result obtained from its application. Professional judgment is a variable characteristic of an accountant's capabilities and requires constant development. To this end, certain principles should be followed, including consistency, argumentation, reliability, completeness, and logic. Adherence to these principles forms a level of professional judgment that makes it possible to solve non-standard production situations effectively and brings the accountant additional income. In addition, we believe that applying international accounting and reporting standards provides more opportunities for developing accounting judgment, because international standards describe the basic principles with which a particular entity must comply. Domestic accounting regulations, in turn, prescribe many alternatives by which objects can be recognized in accounting. Therefore, developing professional judgment under national provisions is primarily a matter of justifying a specific alternative or scheme for applying the methods of recognition, evaluation, and accounting of individual objects.


2010 ◽  
Vol 76 (12) ◽  
pp. 3863-3868 ◽  
Author(s):  
J. Kirk Harris ◽  
Jason W. Sahl ◽  
Todd A. Castoe ◽  
Brandie D. Wagner ◽  
David D. Pollock ◽  
...  

Constructing mixtures of tagged or bar-coded DNAs for sequencing is an important requirement for the efficient use of next-generation sequencers in applications where limited sequence data are required per sample. There are many applications in which next-generation sequencing can be used effectively to sequence large mixed samples; an example is the characterization of microbial communities, where ≤1,000 sequences per sample are adequate to address research questions. Thus, it is possible to examine hundreds to thousands of samples per run on massively parallel next-generation sequencers. However, the cost savings from efficient utilization of sequence capacity are realized only if the production and management costs associated with construction of multiplex pools are also scalable. One critical step in multiplex pool construction is the normalization process, whereby equimolar amounts of each amplicon are mixed. Here we compare three approaches (spectroscopy, size-restricted spectroscopy, and quantitative binding) for normalization of large, multiplex amplicon pools in terms of performance and efficiency. We found that the quantitative binding approach was superior and represents an efficient, scalable process for constructing very large multiplex pools with hundreds and perhaps thousands of individual amplicons. We demonstrate the increased sequence diversity identified with higher throughput. Massively parallel sequencing can dramatically accelerate microbial ecology studies by allowing appropriate replication of sequence acquisition to account for temporal and spatial variations. Further, population studies to examine genetic variation, which require even lower levels of sequencing, should be possible where thousands of individual bar-coded amplicons are examined in parallel.
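
The arithmetic behind equimolar pooling is straightforward: convert each amplicon's mass concentration to molarity using its length (about 660 g/mol per base pair of double-stranded DNA) and pipette the volume that delivers a fixed molar amount of each. A sketch with invented concentrations and lengths:

```python
# Illustrative pooling-volume calculation for the normalization step.
# Sample names, concentrations, and lengths are made-up example values;
# 660 g/mol per bp is the standard average mass of double-stranded DNA.

TARGET_FMOL = 50.0                # assumed molar amount of each amplicon to pool

amplicons = {                     # name: (concentration ng/uL, length bp)
    "sample_01": (12.4, 450),
    "sample_02": (33.0, 460),
    "sample_03": (8.1, 440),
}

for name, (conc_ng_ul, length_bp) in amplicons.items():
    fmol_per_ul = conc_ng_ul * 1e6 / (660.0 * length_bp)  # ng/uL -> fmol/uL
    volume_ul = TARGET_FMOL / fmol_per_ul                 # volume for TARGET_FMOL
    print(f"{name}: add {volume_ul:.2f} uL")
```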


2019 ◽  
Vol 15 (1) ◽  
pp. 1-18 ◽  
Author(s):  
Ferrahi Ibtisam Ibtisam ◽  
Sandro Bimonte ◽  
Kamel Boukhalfa

The emergence of spatial or geographic data in data warehouse (DW) systems calls for new models that support the storage and manipulation of such data. The need to build spatial data warehouses (SDWs) and to optimize spatial OLAP (SOLAP) queries has continued to attract researchers' interest in recent years. Several spatial data models have been investigated to extend classical multidimensional data models with spatial concepts. However, most existing models do not handle non-strict spatial hierarchies, in which a member of a lower level may roll up to several parents. Moreover, the complexity of spatial data makes the execution time of spatial queries considerable. Spatial indexing methods are often applied to optimize access to large volumes of data and help reduce the cost of SOLAP queries, but most existing indexes support only predefined spatial hierarchies. The authors show in this article that the logical models proposed in the literature and the existing indexing techniques are not suited to non-strict hierarchies. The authors propose a new logical schema supporting non-strict hierarchies and a bitmap index to optimize queries defined over spatial dimensions with several non-strict hierarchies.
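
To make the indexing idea concrete, the toy sketch below builds one bitmap per parent-level member over a set of fact rows. Because the hierarchy is non-strict, a fact can set a bit in more than one parent's bitmap, and region queries reduce to bit scans and single AND/OR operations. The schema and data are invented, and the structure is a simplification of whatever index the article actually proposes.

```python
# Toy bitmap index over a non-strict spatial hierarchy (invented data).

facts = [                      # (fact_id, district)
    (0, "district_A"), (1, "district_B"),
    (2, "district_C"), (3, "district_A"),
]

# non-strict roll-up: district_B belongs to two regions
rollup = {
    "district_A": ["region_1"],
    "district_B": ["region_1", "region_2"],
    "district_C": ["region_2"],
}

# one bitmap per region, stored as an int used as a bitset
bitmaps = {}
for fact_id, district in facts:
    for region in rollup[district]:
        bitmaps[region] = bitmaps.get(region, 0) | (1 << fact_id)

# "which facts fall in region_1?" is a bit scan; intersections and unions
# of regions are single AND/OR operations on the bitmaps
region_1 = bitmaps["region_1"]
print([fid for fid, _ in facts if region_1 >> fid & 1])   # [0, 1, 3]
```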


2021 ◽  
Vol 6 (3(31)) ◽  
pp. 35-38
Author(s):  
Yuliya Aleksandrovna Mironova ◽  
Svetlana Aleksandrovna Dedeeva

This article presents the basic principles of pricing in the economy and examines the factors influencing the formation of tariffs in the energy sector. Using the example of the Sakmarskaya thermal power station, the categories that determine the cost of finished products are distinguished. Measures are proposed that can reduce production costs and thereby lower the cost of electric and thermal energy by a factor of about 1.5.


2018 ◽  
Author(s):  
Zhenfeng Wu ◽  
Weixiang Liu ◽  
Xiufeng Jin ◽  
Deshui Yu ◽  
Hua Wang ◽  
...  

Data normalization is a crucial step in gene expression analysis, as it ensures the validity of downstream analyses. Although many metrics have been designed to evaluate current normalization methods, different metrics yield inconsistent results. In this study, we designed a new metric named Area Under normalized CV threshold Curve (AUCVC) and applied it, together with another metric, mSCC, to evaluate 14 commonly used normalization methods, achieving consistent evaluation results using both bulk RNA-seq and scRNA-seq data from the same library construction protocol. This consistency validates the underlying theory that a successful normalization method simultaneously maximizes the number of uniform genes and minimizes the correlation between the expression profiles of gene pairs. This consistency can also be used to assess the quality of gene expression data. The gene expression data, normalization methods, and evaluation metrics used in this study have been included in an R package named NormExpression. NormExpression provides a framework and a fast, simple way for researchers to evaluate methods (particularly data-driven methods or their own methods) and then select the best one for data normalization in gene expression analysis.
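
Following the metric's name and the rationale stated above, an AUCVC-style score can be sketched as follows: for each coefficient-of-variation cutoff, count the fraction of genes whose CV falls below it, then take the area under that curve. This is our reading of the definition; the authoritative implementation is in the NormExpression package.

```python
import numpy as np

def aucvc(expr, thresholds=np.linspace(0.0, 1.0, 101)):
    """AUCVC-style score for a genes-x-samples matrix (our reading of the metric)."""
    expr = expr[expr.mean(axis=1) > 0]             # drop unexpressed genes
    cv = expr.std(axis=1) / expr.mean(axis=1)      # coefficient of variation per gene
    frac_uniform = np.array([(cv <= t).mean() for t in thresholds])
    return frac_uniform.mean()                     # mean over an even grid on [0, 1] ~ area

# synthetic counts with a simple library-size normalization, for demonstration
rng = np.random.default_rng(0)
counts = rng.poisson(lam=20, size=(2000, 12)).astype(float)
normalized = counts / counts.sum(axis=0) * counts.sum(axis=0).mean()
print(f"AUCVC: {aucvc(normalized):.3f}")           # higher = more uniform genes
```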


2020 ◽  
Author(s):  
Satya Kumara

Cost overrun in construction projects is still a global issue, and seeking solutions from various scientific disciplines, including information technology, remains a challenge. This thesis aims to design a business intelligence (BI) application model for analyzing the costs of construction projects, providing follow-up information as quickly and as early as possible to help executives make valuable business decisions. The case study was prepared through literature reviews, interviews, and direct observation to obtain an overview of the business processes in the construction company. The BI application model was developed using agile methods, which proved very effective in accommodating change. The case study found that a business intelligence application providing information on key performance indicators (KPIs) can help company executives make decisions quickly and precisely, thus improving project performance, especially in cost control.


2021 ◽  
Vol 2021 ◽  
pp. 1-15
Author(s):  
Haonan Li ◽  
Xu Wu ◽  
Yinghui Liang ◽  
Chen Zhang

Airport gate assignment performance indicator selection is a complicated decision-making problem, marked by strong subjectivity and the difficulty of measuring the importance of each indicator. A better selection of performance indicators (PIs) can greatly increase the airport's overall benefit. We adopt a multicriteria decision-making approach to quantify qualitative PIs and then select among them using fuzzy clustering. First, we identified 21 commonly used PIs through a literature review and survey. Subsequently, the fuzzy analytic hierarchy process technique was employed to obtain the selection criteria weights, based on the relative importance of significance, availability, and generalisability. We then aggregated the selection criteria weights and the experts' scores to evaluate each PI for the clustering process. The fuzzy-possibilistic product partition c-means algorithm was applied to divide the PIs into groups, using the three selection criteria as partitioning features. The cluster whose centre had the highest weights was identified as the very-high-influence cluster, yielding 10 PIs. This study revealed that the passenger-oriented objective is the most important performance criterion, while also highlighting the relevance of the airport/airline-oriented and robustness-oriented performance objectives. It also offers a scientific approach to determining the objective functions for future gate assignment research. We believe that, with slight modifications, this model can be applied to other airports, to other indicator selection problems, or to other scenarios at the same airport, facilitating policy making and practice and thus supporting airport management.
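
As an illustration of the clustering step, the sketch below runs standard fuzzy c-means, a simpler relative of the fuzzy-possibilistic product partition c-means used in the paper, on invented PI scores over the three selection criteria (significance, availability, generalisability).

```python
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
    """Standard fuzzy c-means: returns cluster centers and the membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)        # memberships of each point sum to 1
    for _ in range(iters):
        W = U ** m                           # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-12)          # guard against division by zero
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U = 1.0 / ((dist[:, :, None] / dist[:, None, :]) ** (2 / (m - 1))).sum(axis=2)
    return centers, U

# each row: one PI scored on (significance, availability, generalisability); invented
X = np.array([[0.90, 0.80, 0.70], [0.85, 0.90, 0.80], [0.30, 0.40, 0.20],
              [0.20, 0.30, 0.40], [0.60, 0.50, 0.60], [0.55, 0.65, 0.50]])
centers, U = fuzzy_cmeans(X, c=3)
print(U.argmax(axis=1))                      # hard assignment of each PI to a cluster
```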

