A New Method to Estimate Skew Angle in Printed and Historical Document Images

2019 ◽  
Vol 8 (4) ◽  
pp. 3936-3943

In this paper, a new approach to estimating the skew angle of scanned/printed documents and historical document images is proposed. This is essential for an automatic document processing system (e.g. text and image segmentation) in order to avoid errors in subsequent stages. The proposed approach is based on a statistical analysis of the slopes of the connected lines in the document. The technique detects and corrects skew using the initial letter at (X1, Y1+200) from the left margin of the resized (800×800) image and at (X1+200, Y1) from the top margin; the final letter at (X2, Y2-200) and (X2-200, Y2) is chosen from the right and bottom margins of the same image. The skew angle is estimated on a standard skewed dataset, and effective correction is performed with minimal error.
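The core idea of slope-based skew estimation can be illustrated with a minimal sketch. The function names and letter coordinates below are hypothetical (the abstract does not give an implementation); the sketch assumes the first and last letters of each text line have already been located on a resized 800×800 page, and takes the median of the per-line angles, mirroring the statistical analysis of connected-line slopes:

```python
import math
from statistics import median

def estimate_skew_angle(x1, y1, x2, y2):
    """Skew angle (degrees) of the line joining two reference points,
    e.g. the first and last letters detected on the same text line."""
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

def estimate_page_skew(point_pairs):
    """Robust page-level estimate: the median of the per-line angles,
    which damps outliers from mis-detected letters."""
    return median(estimate_skew_angle(*p) for p in point_pairs)

# Hypothetical letter coordinates (x1, y1, x2, y2) from three text lines
# on a resized 800x800 page image:
lines = [(200, 410, 600, 390), (180, 300, 620, 278), (210, 520, 590, 501)]
print(round(estimate_page_skew(lines), 2))  # prints -2.86 (a ~2.9° tilt)
```

Correction would then rotate the image by the negative of this angle.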

2017 ◽  
Vol 32 (suppl_1) ◽  
pp. i134-i149 ◽  
Author(s):  
Hao Wei ◽  
Mathias Seuret ◽  
Marcus Liwicki ◽  
Rolf Ingold

Radiocarbon ◽  
2013 ◽  
Vol 55 (2) ◽  
pp. 720-730 ◽  
Author(s):  
Christopher Bronk Ramsey ◽  
Sharen Lee

OxCal is a widely used software package for the calibration of radiocarbon dates and the statistical analysis of 14C and other chronological information. The program aims to make statistical methods easily available to researchers and students working in a range of different disciplines. This paper looks at the recent and planned developments of the package. The recent additions to the statistical methods are primarily aimed at providing more robust models, in particular through model averaging for deposition models and through different multiphase models. We examine how these new models have been implemented and explore the implications for researchers who might benefit from their use. In addition, a new approach to the evaluation of marine reservoir offsets is presented. As the quantity and complexity of chronological data increase, efficient methods for visualizing such extensive data sets become important; the methods for presenting spatial and geographical data planned for future versions of OxCal are also discussed.
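The basic calibration step that OxCal performs can be sketched in a few lines. This is a minimal, illustrative version only: the curve values below are invented for the example (real work uses the IntCal calibration curves), and the function combines the measurement error with the curve error at each candidate calendar age to form a normalized posterior:

```python
import math

def calibrate(measured_bp, meas_err, cal_curve):
    """Single-date calibration sketch: posterior over calendar ages given a
    14C measurement (BP) and a calibration curve mapping each calendar age
    to (curve_bp, curve_err). Errors are combined in quadrature."""
    post = {}
    for t, (mu, sigma) in cal_curve.items():
        var = meas_err**2 + sigma**2
        post[t] = math.exp(-(measured_bp - mu)**2 / (2 * var)) / math.sqrt(2 * math.pi * var)
    z = sum(post.values())            # normalize over the age grid
    return {t: p / z for t, p in post.items()}

# Purely illustrative curve values, NOT IntCal data:
curve = {1000: (1050, 15), 1010: (1040, 15), 1020: (1020, 15), 1030: (990, 15)}
posterior = calibrate(1025, 20, curve)
best = max(posterior, key=posterior.get)  # most probable calendar age: 1020
```

The deposition and multiphase models discussed in the paper build Bayesian priors on top of this per-date likelihood.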


2002 ◽  
Vol 02 (03) ◽  
pp. 481-499
Author(s):  
JANE YOU ◽  
DAVID ZHANG

This paper presents a new approach to smart sensor system design for real-time remote sensing. A combination of techniques for image analysis and image compression is investigated. The proposed algorithms include: (1) a fractional discrimination function for image analysis, (2) a comparison of effective algorithms for image compression, (3) a pipeline architecture for parallel image classification and compression on board satellites, and (4) a task control strategy for mapping image computing models to hardware processing elements. The efficiency and accuracy of the proposed techniques are demonstrated through system simulation.
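The pipelining idea in (3) — classification and compression running concurrently on successive image blocks — can be sketched with two queue-connected stages. Everything here is illustrative: the toy classification rule and the use of threads with zlib stand in for the paper's on-board hardware pipeline, which is not specified in the abstract:

```python
import queue, threading, zlib

def classify_stage(inbox, outbox):
    # Stage 1: tag each image block with a (toy) class label, then pass it on.
    while (item := inbox.get()) is not None:
        label = "textured" if max(item) - min(item) > 10 else "flat"  # placeholder rule
        outbox.put((label, item))
    outbox.put(None)  # propagate shutdown to the next stage

def compress_stage(inbox, results):
    # Stage 2: compress blocks while stage 1 keeps classifying new ones.
    while (item := inbox.get()) is not None:
        label, block = item
        results.append((label, zlib.compress(bytes(block))))

q1, q2, results = queue.Queue(), queue.Queue(), []
t1 = threading.Thread(target=classify_stage, args=(q1, q2))
t2 = threading.Thread(target=compress_stage, args=(q2, results))
t1.start(); t2.start()
for block in ([0] * 64, [0, 50] * 32):  # two toy 64-byte "image blocks"
    q1.put(block)
q1.put(None)
t1.join(); t2.join()
```

Because the stages overlap in time, block *k* is compressed while block *k+1* is still being classified — the source of the pipeline's throughput gain.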


2021 ◽  
Vol 2021 (3-4) ◽  
pp. 25-30
Author(s):  
Kirill Tkachenko

The article proposes a new approach to adjusting the parameters of the computing nodes of a data processing system, based on analytical simulation of a queuing system followed by estimation of the probabilities of hypotheses about the computing node's state. Methods of analytical queuing-system modeling and mathematical statistics are used. The result of the study is a mathematical model for assessing the information situation at a computing node, which differs from previously published system models. Estimating the conditional probabilities of hypotheses about whether a node is processing data adequately allows a decision on whether its parameters need adjustment. Such adjustment improves the efficiency of task processing on the node and, consequently, the efficiency of its operation as a whole; applying the approach to all computing nodes increases the dependability of the entire data processing system.
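The combination of an analytical queuing model with hypothesis-probability estimation can be sketched as follows. This is not the article's model (which is not given in the abstract): it assumes an M/M/1 node, where the response time is exponential with rate μ − λ, and performs a simple Bayes update over two hypothetical node states, "nominal" and "degraded", from observed job response times:

```python
import math

def mm1_response_density(t, lam, mu):
    """M/M/1 sojourn-time density: Exp(mu - lam) for a stable queue (lam < mu)."""
    rate = mu - lam
    return rate * math.exp(-rate * t)

def hypothesis_posteriors(observed_times, hypotheses, priors):
    """Posterior probability of each (arrival, service)-rate hypothesis for a
    node, given observed job response times (illustrative Bayes update)."""
    post = []
    for (lam, mu), prior in zip(hypotheses, priors):
        like = math.prod(mm1_response_density(t, lam, mu) for t in observed_times)
        post.append(prior * like)
    z = sum(post)
    return [p / z for p in post]

# Two hypothetical node states: nominal (mu=10) vs. degraded (mu=6), lam=4 jobs/s.
obs = [0.45, 0.6, 0.5]  # observed response times, seconds
p_nominal, p_degraded = hypothesis_posteriors(obs, [(4, 10), (4, 6)], [0.5, 0.5])
```

A controller would adjust the node's parameters (e.g. reroute load) once the degraded-state posterior crosses a threshold.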


2020 ◽  
Vol 19 (01) ◽  
pp. 283-316 ◽  
Author(s):  
Luis Morales ◽  
José Aguilar ◽  
Danilo Chávez ◽  
Claudia Isaza

This paper proposes a new approach to improving the performance of the Learning Algorithm for Multivariable Data Analysis (LAMDA). This algorithm can be used for supervised and unsupervised learning, based on calculating the Global Adequacy Degree (GAD) of an individual to a class through the contributions of all its descriptors. LAMDA can create new classes after the training stage: if an individual is not sufficiently similar to the preexisting classes, it is evaluated against a threshold called the Non-Informative Class (NIC), this being the novelty of the algorithm. However, LAMDA can produce poor classifications, either because the NIC is constant across all classes or because the GAD calculation is unreliable. In this work, its efficiency is improved by two strategies: first, calculating an adaptable NIC for each class, which prevents correctly classified individuals from creating new classes; and second, computing the Higher Adequacy Degree (HAD), which grants more robustness to the algorithm. LAMDA-HAD is validated on different benchmarks and compared with LAMDA and other classifiers, using a statistical analysis to determine the cases in which our algorithm performs better.
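The GAD/NIC mechanism the paper builds on can be sketched compactly. This is a minimal illustration, not the paper's LAMDA-HAD: it uses the fuzzy-binomial marginal adequacy common in LAMDA descriptions, a min/max mixed aggregation with an assumed weight `alpha`, and the constant NIC threshold of 0.5 whose limitation the paper addresses; the class prototypes are invented:

```python
def mad(x, rho):
    """Marginal adequacy degree of a normalized descriptor x in [0, 1]
    to a class prototype value rho (fuzzy binomial form)."""
    return rho**x * (1 - rho)**(1 - x)

def gad(sample, prototype, alpha=0.7):
    """Global adequacy degree: mix of t-norm (min) and t-conorm (max)
    over the marginal adequacies of all descriptors."""
    mads = [mad(x, r) for x, r in zip(sample, prototype)]
    return alpha * min(mads) + (1 - alpha) * max(mads)

def classify(sample, classes, nic=0.5):
    """Assign the sample to its best class, or to NIC when no class beats
    the Non-Informative Class threshold (a new class would be created)."""
    scores = {name: gad(sample, proto) for name, proto in classes.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= nic else "NIC"

classes = {"A": [0.9, 0.8], "B": [0.2, 0.3]}  # illustrative prototypes
print(classify([0.85, 0.75], classes))        # prints A
print(classify([0.5, 0.5], classes))          # prints NIC
```

The constant `nic=0.5` here is exactly the weakness the paper targets: its adaptable per-class NIC and HAD aggregation replace this fixed threshold.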

