original table
Recently Published Documents


TOTAL DOCUMENTS: 1487 (FIVE YEARS: 9)
H-INDEX: 2 (FIVE YEARS: 1)

2021 · Vol 2021 · pp. 1-14
Author(s): Lingmei Zhang, Guangxia Wang, Lingyu Chen

Charts are a ubiquitous form of information, widely used and easy for people to understand. Because charts come in so many kinds and styles, it is not easy for a computer to recognize a chart, let alone redraw or redesign it. This study proposes a three-stage method for chart recognition: classifying the chart, analyzing its structure, and analyzing its content. For classification we use ResNet-50; for recognizing structure and content we use different deep frameworks to extract key points, depending on the chart type. We also introduce two datasets, UCCD and UCID, for training deep models to classify and recognize charts. Finally, we apply traditional geometric methods to recover the original table of a chart so that it can be redrawn.
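The final geometric step — turning detected key points back into a data table — can be illustrated with a minimal sketch for bar charts. This is not the authors' method; the function name, input layout, and two-point axis calibration are our own assumptions.

```python
def bars_to_table(bars, y_axis_ticks):
    """Recover data values from detected bar tops via axis calibration.

    bars: list of (label, top_y_pixel) pairs; pixel y grows downward.
    y_axis_ticks: two calibration points [(pixel_y, value), (pixel_y, value)]
                  read off the y-axis, enough to fix a linear pixel-to-value map.
    """
    (p0, v0), (p1, v1) = y_axis_ticks
    scale = (v1 - v0) / (p1 - p0)  # data units per pixel
    return {label: v0 + (y - p0) * scale for label, y in bars}
```

With the axis baseline at pixel 400 (value 0) and the top tick at pixel 0 (value 100), a bar whose top sits at pixel 100 maps to the value 75.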


2021 · Vol 40 (1) · pp. 1001-1015
Author(s): Yen-Liang Chen, Fang-Chi Chi

In the rough set theory proposed by Pawlak, the concept of a reduct is very important. A reduct is a minimum attribute set that preserves the partition of the universe. A great deal of past research has attempted to reduce the representation of the original table. The advantage of a reduced representation table is that it summarizes the original table while retaining the original knowledge without distortion. However, a table summarized by a reduct may still be too large, leaving users overwhelmed by too much information. To solve this problem, this article considers how to further reduce the size of the table without distorting the original knowledge too much. We therefore set an upper limit on information distortion, representing the maximum degree of distortion we allow. Under this limit, we seek the summary table with the highest compression. This paper proposes two algorithms: the first finds all summary tables that satisfy the maximum distortion constraint, and the second selects from these the summary table with the greatest degree of compression.
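The core reduct property — a minimum attribute subset inducing the same partition of the universe as the full attribute set — can be checked directly on a small table. A brute-force sketch for illustration only (the paper's algorithms address the harder summary-table problem, not this enumeration):

```python
from itertools import combinations

def partition(rows, attrs):
    """Group row indices into blocks by their values on the given attributes."""
    blocks = {}
    for i, row in enumerate(rows):
        blocks.setdefault(tuple(row[a] for a in attrs), []).append(i)
    return sorted(tuple(b) for b in blocks.values())

def preserves_partition(rows, attrs, all_attrs):
    """True if `attrs` induces the same partition as the full attribute set."""
    return partition(rows, attrs) == partition(rows, all_attrs)

def minimal_reducts(rows, all_attrs):
    """Enumerate the smallest attribute subsets that preserve the partition."""
    for k in range(1, len(all_attrs) + 1):
        found = [set(c) for c in combinations(all_attrs, k)
                 if preserves_partition(rows, c, all_attrs)]
        if found:
            return found
    return [set(all_attrs)]
```

On a three-row table where no single attribute separates all rows, the minimal reducts are the two-attribute sets that do.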


2020 · pp. 138-148
Author(s): O.N. Paulin, N.O. Komleva

The aim of this work is to increase the efficiency of methods and algorithms for solving the covering problem, where efficiency is understood as the minimum delay of the procedure implementing the method. To increase the efficiency of the "Columnization" method, a characteristic vector (CV), obtained by summing the ones in the columns/rows of the coverage table (CT), is introduced into the decision-tree construction procedure; it characterizes the current state of the coverage table. The idea of the method is to gradually decompose the CT into sub-tables by reducing them according to certain rules. We consider three ways to reduce the original table or current sub-tables: 1) "border search over a concave set"; 2) "using the properties of the coverage table"; 3) "the minimum column is the maximum row". In the last method the CV was used for the first time, which made it possible to speed up the covering procedure by up to one and a half times. The complexity estimates for the considered covering methods are S1 = O(n^3), S2 = O(2^n), and S3 = O(n^2), where n is the determining parameter of the coverage problem (the number of columns); the applicability limits of these methods are determined. It is shown that using the CV in methods 1 and 2 is impractical.
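A minimal sketch of the characteristic vector and the "minimum column is the maximum row" selection rule, under our own reading of the abstract (function names and the tie-breaking behavior are assumptions, not the authors' code):

```python
def characteristic_vectors(ct):
    """Column and row sums of a 0/1 coverage table: the CV that
    summarizes the current state of the table."""
    col_cv = [sum(row[j] for row in ct) for j in range(len(ct[0]))]
    row_cv = [sum(row) for row in ct]
    return col_cv, row_cv

def min_col_max_row(ct):
    """Pick the hardest-to-cover column (fewest ones), then among the
    rows covering it, return the row that covers the most columns."""
    col_cv, row_cv = characteristic_vectors(ct)
    j = min(range(len(col_cv)), key=lambda c: col_cv[c])
    candidates = [i for i, row in enumerate(ct) if row[j]]
    return max(candidates, key=lambda i: row_cv[i])
```

Recomputing the CV incrementally as sub-tables are split off is what avoids rescanning the whole table at every step.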


2020 · pp. 31-63
Author(s): Jonathan Hardy

The ways newspapers developed as products for readers were influenced by their costs and financing, forms of ownership, and market competition. This chapter explores the economics of print and online news publishing in Britain and Ireland from 1900 to 2018 and examines the patterns of ownership, concentration and competition in the national, regional and local newspaper markets. The chapter explores the relationship between the financing of the press, market competition and ownership patterns, showing how these have mutually influenced provision. It integrates original analysis of primary and secondary sources and draws on the author’s research on media and advertising relationships to trace the significance of the advertising subsidy for publications, up to contemporary challenges including falling print ad revenues and business responses such as advertiser-sponsored content. The chapter also provides original table data setting out newspaper ownership and circulation for the period.


2020 · Vol 9 (1) · pp. 43
Author(s): Ryo Inoue, Mao Li

A quadrilateral table cartogram is a rectangle-shaped figure that visualizes table-form data: the quadrilateral cells are transformed so that their areas express the magnitudes of positive weights while maintaining the adjacency of cells in the original table. However, the previous construction method is difficult to implement because it consists of multiple operations that do not have unique solutions and require complex settings to obtain the desired outputs. In this article, we propose a new construction for quadrilateral table cartograms by recasting the construction as an optimization problem. The proposed method is formulated as a simple minimization problem to achieve mathematical clarity. It can generate quadrilateral table cartograms with smaller deformation of rows and columns, helping readers recognize the correspondence between table cartograms and the original tables. In addition, we propose a means of sorting rows and/or columns prior to the construction of table cartograms to reduce excess shape deformation. Applications of the proposed method confirm its capability to output table cartograms that clearly visualize the characteristics of datasets.
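A coarse starting point for such a construction can be sketched by keeping all cells rectangular and setting column widths and row heights proportional to marginal sums on a unit square. This is only our illustration of the idea, not the paper's optimization; cell areas are exact only when the weight matrix has rank one, which is why a refinement step is needed.

```python
def rectangular_cartogram(weights):
    """First-cut table cartogram on a unit square: row heights and
    column widths proportional to the marginal sums of the weights.
    weights: rectangular list-of-lists of positive numbers."""
    total = sum(map(sum, weights))
    row_h = [sum(r) / total for r in weights]
    col_w = [sum(r[j] for r in weights) / total
             for j in range(len(weights[0]))]
    return row_h, col_w
```

Sorting rows and columns by these marginal sums before construction is one plausible reading of the sorting step mentioned above, since it keeps large and small cells from alternating.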


2020 · Vol 20 (1) · pp. 5-19
Author(s): Victor A. Bazhanov, Lyudmila S. Veselaya, Iina I. Oreshko

The article attempts to assess the possible state of mechanical engineering in Russia in the event of a significant increase in the export of its products, as provided for by the national project "International Cooperation and Export". The evaluation is based on the premise that an increase in the export of engineering products will entail an increase in production volume (provided domestic consumption does not fall), which in turn will necessitate re-equipping and reconstructing existing industries and creating new ones. In addition, increasing the output of competitive high-tech products will require increased R&D and the creation of related organizations. All of this may require significant investment. Hypothetical investments needed to realize the intended increase in engineering exports were evaluated using the table of domestic product use at basic prices from the Rosstat system of input-output tables. The original table was transformed into a worksheet that includes an investment block; the calculations yielded both the hypothetical investments, equal to the difference between the initial and estimated values of fixed assets, and the impact on the output of economic activities related to engineering.


Algorithms · 2019 · Vol 12 (8) · pp. 165
Author(s): Kazuhiro Minami, Yutaka Abe

The objective of the cell suppression problem (CSP) is to protect sensitive cell values in tabular data in the presence of linear relations concerning marginal sums. Previous algorithms for solving CSPs ensure that every sensitive cell has enough uncertainty in its values, based on the interval width of all possible values. However, we find that every deterministic CSP algorithm is vulnerable to an adversary who knows that algorithm. We devise a matching attack scheme that narrows down the ranges of sensitive cell values by matching the suppression pattern of the original table against that of each candidate table. Our experiments show that the actual ranges of sensitive cell values are significantly narrower than those assumed by previous CSP algorithms.
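The matching attack can be illustrated on a single row with two suppressed cells. The suppression rule below (suppress any cell under 5) is a toy stand-in we invented for a deterministic CSP algorithm; the point is only that filtering candidates by the published pattern shrinks the feasible range beyond what the marginal sum alone implies.

```python
from itertools import product

def candidate_tables(row_total, n_cells, domain):
    """All assignments to the suppressed cells consistent with the
    published row total (the marginal-sum linear relation)."""
    return [t for t in product(domain, repeat=n_cells) if sum(t) == row_total]

def narrow_by_pattern(cands, suppress, observed_pattern):
    """Matching attack: keep only candidates whose table, run through the
    known deterministic suppression rule, reproduces the published pattern."""
    return [t for t in cands if suppress(t) == observed_pattern]
```

With a row total of 9 and cell domain 0..9, the marginals alone leave the first cell anywhere in 0..9; observing that only the first cell was suppressed (under the toy rule) cuts it to 0..4.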


2019 · Vol 1 · pp. 1-2
Author(s): Mao Li, Ryo Inoue

A table cartogram, a visualization of table-form data, is a rectangle-shaped table in which each cell is transformed to express the magnitude of a positive weight by its area while maintaining the adjacency relationships of cells in the original table. Winter (2011) applies the area cartogram generation method of Gastner and Newman (2004), and Evans et al. (2018) propose a new geometric procedure. The rows and columns of a table cartogram should be easy for readers to recognize, yet no method has focused on enhancing readability. This study defines table cartogram generation as an optimization problem and attempts to minimize vertical and horizontal deformation. Since the original tables consist of regular quadrangles, this study uses quadrangles to express cells in a table cartogram and fixes the outer border to retain the shape of a standard table.

This study proposes a two-step approach for table cartogram generation with cells that begin as squares and with fixed outer table borders. The first step adjusts only the vertical and horizontal borders of cells to express the weights to the greatest possible degree. All cells remain rectangular after this step, although the limited degree of freedom of this operation results in low data-representation accuracy. The second step adapts the cells of the low-accuracy table cartogram so that area accurately fits weight by relaxing the constraints on the directions of cell borders. This step uses the area cartogram generation method of Inoue and Shimizu (2006), which defines area cartogram generation as an optimization problem. The formulation over vertex coordinates consists of an objective function that minimizes the difference between the given data and the size of each cell, and a regularization term that controls the changes in bearing angles. It is formulated as a non-linear least-squares problem and solved by iterating linear least squares: the problem is linearized at the current vertex coordinates and the estimated coordinates are updated until the objective value becomes small enough.
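The iterate-linearize-update loop in the second step is Gauss-Newton iteration. A one-unknown sketch of the scheme, not the authors' implementation: fit a square cell's side s so that its area s² matches a target weight by repeatedly linearizing the residual at the current estimate.

```python
def gauss_newton_1d(f, df, x0, target, tol=1e-10, max_iter=50):
    """Minimize (f(x) - target)^2 by Gauss-Newton in one unknown:
    linearize f at the current x, solve the resulting linear least
    squares (a single division here), and repeat until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        residual = target - f(x)
        step = residual / df(x)   # linearized least-squares solution
        x += step
        if abs(step) < tol:
            break
    return x
```

In the real formulation x is the full vector of vertex coordinates and the per-iteration solve is a sparse linear least-squares problem, but the structure of the loop is the same.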


Author(s): Jintao Gao, Zhanhuai Li, Wenjie Liu

Cardinality estimation is an important component of query optimization; its accuracy and efficiency directly determine how effective the optimization is. Traditional cardinality estimation collects statistics from the original table or a sample and then infers cardinality from those statistics. This is inefficient on big data; the statistics suffer from update latency and are obtained by inference, so correctness cannot be guaranteed. Some strategies obtain the actual cardinality by executing subqueries, but they do not keep the results, so fetching statistics remains inefficient. To address these problems, this paper proposes a novel cardinality estimation strategy called cardinality estimation based on query results (CEQR). To keep cardinalities correct, CEQR takes statistics directly from query results, independently of data size; we build a cardinality table to store the statistics of base tables and intermediate results under specific predicates. The cardinality table provides cardinality services for subsequent queries, and we build a suite of rules to maintain it. To improve the efficiency of fetching statistics, we introduce a source-aware strategy that hashes each cardinality item to an appropriate cache. The paper gives an adaptability and deviation analysis of CEQR and shows experimentally that CEQR is more efficient than traditional cardinality estimation strategies.
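A minimal sketch of such a cardinality table: actual cardinalities observed from executed queries are cached under (table, predicate) keys and consulted before falling back to a statistics-based estimate. The class and method names are ours, and plain hash-sharding here merely stands in for the paper's source-aware cache placement.

```python
class CardinalityTable:
    """Cache of exact cardinalities taken from query results, sharded
    into buckets by key hash (a stand-in for source-aware placement)."""

    def __init__(self, n_buckets=4):
        self.buckets = [dict() for _ in range(n_buckets)]

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def record(self, table, predicate, cardinality):
        """Store the cardinality observed when a query actually ran."""
        self._bucket((table, predicate))[(table, predicate)] = cardinality

    def estimate(self, table, predicate, fallback):
        """Exact cached cardinality if a prior query produced it,
        otherwise the caller's statistics-based fallback estimate."""
        return self._bucket((table, predicate)).get((table, predicate), fallback)
```

Maintenance rules (invalidating entries when base tables change) would hook into `record`; they are omitted here.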

