Adding linguistic information to parsed corpora

2019 ◽  
Vol 18 (1) ◽  
Author(s):  
Susan Pintzuk

No matter how comprehensively corpus builders design their annotation schemes, users frequently find that information they need for their research is missing. In this methodological paper I describe and illustrate five methods of adding linguistic information to corpora that have been morphosyntactically annotated (=parsed) in the style of the Penn treebanks. Some of these methods involve manual operations; some are executed by CorpusSearch functions; some require a combination of manual and automated procedures. Which method is used depends almost entirely on the type of information to be added and the goals of the user. Of course, the main goal, regardless of method, is to record within the corpus additional information that can be used for analysis and also retained through further searches and data processing.
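The paper's five methods are not reproduced in this abstract, but as a rough, hypothetical sketch of the general idea (recording extra information in a node label of a Penn-style bracketed parse so that it survives later searches and data processing), one might do something like the following in Python with NLTK; the toy sentence, the NP-OB1 target label and the added "-FOC" feature are invented for illustration.

```python
from nltk import Tree

# A toy Penn-style labelled bracketing (invented example, not from a real corpus).
parse = "(IP-MAT (NP-SBJ (PRO he)) (VBD saw) (NP-OB1 (D the) (N ship)))"
tree = Tree.fromstring(parse)

# Append a hypothetical "-FOC" feature to every NP-OB1 label, so the extra
# information is stored in the label itself and survives later searches.
for subtree in tree.subtrees(lambda t: t.label() == "NP-OB1"):
    subtree.set_label(subtree.label() + "-FOC")

# Write the enriched parse back out in bracketed form.
print(tree.pformat(margin=80))
```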

2018 ◽  
Vol 182 ◽  
pp. 01007
Author(s):  
Vladimir Boykov ◽  
Aleksandr Povarecho

This paper presents selected problems connected with automating the procedures used to assess the degree of machine degradation by vibration methods, with special emphasis on machine state prognosis. Current knowledge of these problems is insufficient and calls for further research on data processing, on the efficiency of diagnostic and prognostic procedures, on the collection and selection of diagnostic parameters, and on the development of automatic procedures for recognising and forecasting a machine's state. New solutions and different aspects of diagnostic prognosis, based on the proposed partial procedures, focus on the factors that determine how procedures for identifying the states of technical systems can be automated. New automated procedures for acquiring and processing symptoms of the machine state improve the control and supervision of technical systems in operation and maintenance by identifying their current states and providing reliable prognoses.
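As a minimal illustration of one automated symptom-processing step (not a procedure taken from the paper), the Python sketch below computes an RMS vibration level from an acceleration record and linearly extrapolates the symptom trend to an assumed limit value to estimate the remaining time before that limit is reached; the signal values and the limit are hypothetical.

```python
import numpy as np

def rms(signal):
    """Root-mean-square vibration level of one acceleration record."""
    return np.sqrt(np.mean(np.square(signal)))

def remaining_time(times, symptom_values, limit):
    """Fit a linear trend to the symptom history and extrapolate the time
    at which the assumed limit value is reached."""
    slope, intercept = np.polyfit(times, symptom_values, 1)
    if slope <= 0:
        return np.inf  # symptom not growing: no predicted exceedance
    t_limit = (limit - intercept) / slope
    return max(t_limit - times[-1], 0.0)

# Hypothetical symptom history: RMS values recorded at successive inspections.
inspection_times = np.array([0.0, 100.0, 200.0, 300.0])         # operating hours
rms_history = np.array([2.1, 2.4, 2.9, 3.3])                     # mm/s
print(remaining_time(inspection_times, rms_history, limit=7.1))  # hours to assumed limit
```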


Author(s):  
A. Yu. Skripnik ◽  
V. A. Fokin ◽  
R. R. Mironchuk ◽  
V. E. Uspenskiy ◽  
O. B. Irtyuga ◽  
...  

Aim. To modernize the computed tomography angiography (CTA) protocol with advanced data processing for the diagnosis of ascending aortic (AA) aneurysms, determining aortic distensibility and compliance. Material and methods. We examined 24 patients (14 men) aged 43 to 72 years with aneurysm or dilatation of the ascending aorta. CTA was performed on Siemens Somatom Definition AS and Philips Ingenuity Elite 128-slice scanners with electrocardiographic (ECG) synchronization after a bolus injection of contrast agent (100-120 ml). End-systolic and end-diastolic frames, the maximum aortic diameter and the cross-sectional area were determined; aortic distensibility and compliance were calculated. Results. According to the AA diameter in the end-diastolic frame, patients were divided into 3 groups: group 1, 6 patients with d < 45 mm (39 [39; 40] mm); group 2, 7 patients with d = 45-50 mm (48 [46; 49] mm); and group 3, 11 patients with d > 50 mm (51 [51; 54] mm). Aortic distensibility and compliance correlated with parameters such as age and systolic blood pressure. The correlation between aortic compliance and diastolic diameter can be used to predict the rate of diameter increase. Conclusion. The designed CTA protocol with advanced data processing allows the AA distensibility and compliance to be evaluated from the diameter and cross-sectional area in patients with AA dilatation. These criteria provide additional information about the elastic properties of the aorta and can be used to determine the management strategy.
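The abstract does not give the exact formulas used; a minimal sketch of the commonly used area-based definitions of compliance and distensibility, assuming cross-sectional areas measured on the end-systolic and end-diastolic frames and a cuff pulse pressure, might look like this (all numbers are hypothetical):

```python
def aortic_compliance(area_sys_mm2, area_dia_mm2, pulse_pressure_mmHg):
    """Area-based compliance: absolute area change per unit pulse pressure (mm^2/mmHg)."""
    return (area_sys_mm2 - area_dia_mm2) / pulse_pressure_mmHg

def aortic_distensibility(area_sys_mm2, area_dia_mm2, pulse_pressure_mmHg):
    """Area-based distensibility: relative area change per unit pulse pressure (1/mmHg)."""
    return (area_sys_mm2 - area_dia_mm2) / (area_dia_mm2 * pulse_pressure_mmHg)

# Hypothetical measurements from one ECG-gated CTA study (not taken from the paper).
a_sys, a_dia = 1450.0, 1320.0   # end-systolic and end-diastolic cross-sectional areas, mm^2
pp = 120.0 - 75.0               # systolic minus diastolic pressure, mmHg
print(aortic_compliance(a_sys, a_dia, pp), aortic_distensibility(a_sys, a_dia, pp))
```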


Author(s):  
Florence Sagnard

The extraction of quantitative information from Ground Penetrating Radar (GPR) data sets (radargrams) to detect and map underground utility pipelines is a challenging task. This study proposes several algorithms included in the main stages of a data processing chain associated with radargrams, comprising preprocessing, hyperbola enhancement, hyperbola detection and localization, and parameter extraction. Additional parameters related to the GPR system, such as the frequency band and the polarization, provide additional information that needs to be exploited. So far, the algorithms have been applied step by step to synthetic and experimental data. The results help to guide future developments in signal processing for quantitative parameter estimation.
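As an illustration of the parameter-extraction stage, the sketch below fits the standard point-reflector travel-time model t(x) = (2/v)*sqrt((x - x0)^2 + d^2) to picked hyperbola points with scipy; the synthetic picks, the initial guesses and the wave velocity are assumptions for the example, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def travel_time(x, x0, depth, velocity):
    """Two-way travel time over a buried point reflector (pipe) at (x0, depth)."""
    return 2.0 * np.sqrt((x - x0) ** 2 + depth ** 2) / velocity

# Synthetic picks along the scan axis (m) and picked two-way times (ns),
# generated here only to demonstrate the fit.
x = np.linspace(-1.0, 1.0, 21)
v_true = 0.1  # m/ns, a typical value for moist soil
t = travel_time(x, 0.1, 0.6, v_true) + np.random.normal(0.0, 0.1, x.size)

# Initial guesses: apex position, depth and velocity.
popt, _ = curve_fit(travel_time, x, t, p0=[0.0, 0.5, 0.08])
x0_est, depth_est, v_est = popt
print(f"pipe at x0={x0_est:.2f} m, depth={depth_est:.2f} m, v={v_est:.3f} m/ns")
```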


2020 ◽  
Author(s):  
Jack Shaw ◽  
et al.

Additional information about data processing, taxon- and geography-specific analyses, other sampling biases, determinants of fossilization potential, predictive modeling procedures, and the impact of Lagerstätten on estimating fossilization potential.


PLoS ONE ◽  
2021 ◽  
Vol 16 (2) ◽  
pp. e0247535
Author(s):  
Warren C. Jochem ◽  
Andrew J. Tatem

Spatial datasets of building footprint polygons are becoming more widely available and accessible for many areas in the world. These datasets are important inputs for a range of different analyses, such as understanding the development of cities, identifying areas at risk of disasters, and mapping the distribution of populations. The growth of high spatial resolution imagery and computing power is enabling automated procedures to extract and map building footprints for whole countries. These advances are enabling coverage of building footprint datasets for low- and middle-income countries which might lack other data on urban land uses. While spatially detailed, many building footprints lack information on structure type, local zoning, or land use, limiting their application. However, morphology metrics can be used to describe characteristics of size, shape, spacing, orientation and patterns of the structures and to extract additional information which can be correlated with different structure and settlement types or neighbourhoods. We introduce the foot package, a new set of open-source tools in a flexible R package for calculating morphology metrics for building footprints and summarising them at different spatial scales and in different spatial representations. In particular, our tools can create gridded (or raster) representations of morphology summary metrics, which have not been widely supported previously. We demonstrate the tools by creating gridded morphology metrics from all building footprints in England, Scotland and Wales, and then use those layers in an unsupervised cluster analysis to derive a pattern-based settlement typology. We compare our mapped settlement types with two existing settlement classifications. The results suggest that building patterns can help distinguish different urban and rural types. However, intra-urban differences were not well predicted by building morphology alone. More broadly, though, this case study demonstrates the potential of mapping settlement patterns in the absence of a housing census or other urban planning data.
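The foot package itself is written in R; purely to illustrate the kind of metric it summarises, the sketch below computes two simple footprint morphology measures (area and a compactness index) with shapely in Python and aggregates them onto a coarse grid. The polygons and the grid resolution are invented for the example.

```python
import numpy as np
from shapely.geometry import Polygon

def compactness(poly):
    """Polsby-Popper style compactness: 4*pi*area / perimeter^2 (1.0 for a circle)."""
    return 4.0 * np.pi * poly.area / (poly.length ** 2)

# Invented footprints (coordinates in metres), used only to demonstrate the metrics.
footprints = [
    Polygon([(0, 0), (10, 0), (10, 8), (0, 8)]),
    Polygon([(30, 5), (36, 5), (36, 11), (30, 11)]),
    Polygon([(55, 40), (70, 40), (70, 70), (55, 70)]),
]

cell_size = 50.0  # grid resolution in metres
grid = {}
for poly in footprints:
    cx, cy = poly.centroid.x, poly.centroid.y
    cell = (int(cx // cell_size), int(cy // cell_size))
    grid.setdefault(cell, []).append((poly.area, compactness(poly)))

# Per-cell summary: mean footprint area and mean compactness.
for cell, values in sorted(grid.items()):
    areas, comps = zip(*values)
    print(cell, round(np.mean(areas), 1), round(np.mean(comps), 3))
```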


1979 ◽  
Vol 46 ◽  
pp. 368
Author(s):  
Clinton B. Ford

A “new charts program” for the American Association of Variable Star Observers was initiated in 1966 via the gift to the Association of the complete variable star observing records, charts, photographs, etc. of the late Prof. Charles P. Olivier of the University of Pennsylvania (USA). Adequate material covering about 60 variables not previously charted by the AAVSO was included in this original data and was suitably charted in a reproducible standard format. Since 1966, much additional information has been assembled from other sources, and three Catalogs have been issued which list the new or revised charts produced and specify how copies may be obtained. The latest such Catalog is dated June 1978 and lists 670 different charts covering a total of 611 variables, none of which had been charted in reproducible standard form prior to 1966.


Author(s):  
G. Lehmpfuhl

Introduction. In electron microscopic investigations of crystalline specimens, direct observation of the electron diffraction pattern gives additional information about the specimen. The quality of this information depends on the quality of the crystals, or of the crystal area, contributing to the diffraction pattern. By selected area diffraction in a conventional electron microscope, specimen areas as small as 1 µm in diameter can be investigated. It is well known that crystal areas of that size, which must be thin enough (on the order of 1000 Å) for electron microscopic investigation, are normally somewhat distorted by bending, or they are not homogeneous. Furthermore, the crystal surface is not well defined over such a large area. These facts reduce the information contained in the diffraction pattern. The intensity of a diffraction spot, for example, depends on the crystal thickness. If the thickness is not uniform over the investigated area, one observes an averaged intensity, so that the intensity distribution in the diffraction pattern cannot be used for an analysis unless additional information is available.
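As a simple numerical illustration of this averaging effect (not part of the original paper), the textbook two-beam diffracted intensity at the exact Bragg condition, I_g(t) = sin^2(pi*t/xi_g), oscillates with thickness t; averaging it over a spread of thicknesses within the selected area washes the oscillation out, so the measured spot intensity no longer constrains the thickness. The extinction distance and thickness range below are illustrative values.

```python
import numpy as np

xi_g = 300.0  # assumed extinction distance in Angstrom (illustrative value)

def diffracted_intensity(thickness):
    """Two-beam diffracted intensity at the exact Bragg condition."""
    return np.sin(np.pi * thickness / xi_g) ** 2

# A single, well-defined thickness gives a definite intensity ...
print(diffracted_intensity(450.0))

# ... but averaging over a non-uniform area (thickness spread of +/- 150 A)
# gives a value near 0.5 regardless of the mean thickness, so the
# thickness dependence is effectively lost.
thicknesses = np.linspace(300.0, 600.0, 1000)
print(diffracted_intensity(thicknesses).mean())
```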


Author(s):  
Eva-Maria Mandelkow ◽  
Eckhard Mandelkow ◽  
Joan Bordas

When a solution of microtubule protein is changed from non-polymerising to polymerising conditions (e.g. by temperature jump or mixing with GTP) there is a series of structural transitions preceding microtubule growth. These have been detected by time-resolved X-ray scattering using synchrotron radiation, and they may be classified into pre-nucleation and nucleation events. X-ray patterns are good indicators for the average behavior of the particles in solution, but they are difficult to interpret unless additional information on their structure is available. We therefore studied the assembly process by electron microscopy under conditions approaching those of the X-ray experiment. There are two difficulties in the EM approach: One is that the particles important for assembly are usually small and not very regular and therefore tend to be overlooked. Secondly EM specimens require low concentrations which favor disassembly of the particles one wants to observe since there is a dynamic equilibrium between polymers and subunits.

