Whole Earth Telescope Headquarters Standard Data Formats

2003 ◽  
Vol 12 (2) ◽  
Author(s):  
R. L. Riddle ◽  
S. D. Kawaler

Abstract: As the WET moves to CCD systems, we move away from the uniformity of the standard WET photometer into an arena where each system can be radically different. There are many possible CCD photometry systems that can fulfil the requirements of a WET instrument, but each will have its own unique native data format. During XCov22, it became readily apparent that the WET requires a defined data format for all CCD data that arrives at HQ. This paper describes the proposed format for the next generation of WET data; the final version will be the default format for XQED, the new photometry package discussed elsewhere in these proceedings.

2003 ◽  
Vol 12 (2) ◽  
Author(s):  
R. L. Riddle

Abstract: For many years, the Whole Earth Telescope has used the QED software package, created by R. E. Nather, to reduce the data gathered from the standard WET PMT-based photometers. While essential for reducing these data, QED alone is not a sufficient package for reducing CCD photometry data, which are becoming a larger fraction of WET data. In addition, QED requires DOS, while many astronomers, and WET HQ, do everything in a different environment (usually Unix-based). A new version of the data reduction software will allow the WET to continue to operate with future CCD photometers and systems, and to reduce archival photometry data on new computer systems. Here a software package is described which will satisfy these new needs.


CivilEng ◽  
2021 ◽  
Vol 2 (1) ◽  
pp. 174-192
Author(s):  
Alcinia Zita Sampaio ◽  
Augusto Martins Gomes

The building information modelling (BIM) methodology supports collaborative work, based on the centralization of all information in a federated BIM model and on an efficient level of interoperability between BIM-based platforms. Concerning structural design, the interoperability capacity of the most widely used software presents limitations that must be identified, and alternative solutions must be proposed. This study analyzes the process of transferring structural models between modeling and structural analysis tools. Distinct building cases were carried out in order to identify the limitations observed in the transfer processes, concerning two-way data flow between the various software packages. The study involves the modeling software ArchiCAD 2020, Revit 2020, and AECOsim 2019 and the structural analysis tools SAP 2020, Robot 2020, and ETABS 2020. The transfer processes are realized in two ways: using the native data format, and using a universal standard for data transfer, the Industry Foundation Classes (IFC) format. The level of maturity of BIM in structural design is still relatively low, caused essentially by interoperability problems, but despite the limitations detected, this study shows, through the development of several building cases, that the methodology has clear advantages in the development of the structural project.


Sensors ◽  
2018 ◽  
Vol 18 (7) ◽  
pp. 2327 ◽  
Author(s):  
Jinsong Zhang ◽  
Wenjie Xing ◽  
Mengdao Xing ◽  
Guangcai Sun

In recent years, terahertz imaging systems and techniques have been developed and have gradually become a leading frontier field. With the advantages of low radiation and the ability to penetrate clothing, terahertz imaging technology has been widely used to detect concealed weapons or other contraband carried by people at airports and other secure locations. This paper aims to detect these concealed items with a deep learning method, chosen for its good detection performance and real-time detection speed. Based on an analysis of the characteristics of terahertz images, an effective detection system is proposed in this paper. First, a large number of terahertz images are collected and labeled in a standard data format. Secondly, this paper establishes the terahertz classification dataset and proposes a classification method based on transfer learning. Then, considering the special distribution of terahertz images, an improved faster region-based convolutional neural network (Faster R-CNN) method based on threshold segmentation is proposed for detecting the human body and other objects independently. Finally, experimental results demonstrate the effectiveness and efficiency of the proposed method for terahertz image detection.
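The threshold-segmentation step that the abstract mentions can be sketched in a few lines. This is a minimal illustration only; the 4×4 intensity grid and the 0.5 threshold are invented for demonstration and are not taken from the paper:

```python
# Minimal illustration of threshold segmentation on a toy "terahertz image":
# pixels above the threshold are treated as foreground (body/object candidates).

def threshold_segment(image, threshold):
    """Return a binary mask: 1 where intensity exceeds the threshold."""
    return [[1 if px > threshold else 0 for px in row] for row in image]

image = [
    [0.1, 0.2, 0.8, 0.9],
    [0.1, 0.7, 0.9, 0.8],
    [0.0, 0.6, 0.7, 0.2],
    [0.0, 0.1, 0.2, 0.1],
]

mask = threshold_segment(image, 0.5)
print(mask)  # foreground pixels cluster in the upper-right region
```

In the paper's pipeline the resulting mask would be used to treat the human-body region and other candidate regions separately before detection; here it only demonstrates the basic operation.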


2013 ◽  
Vol 35 (8) ◽  
pp. 611-621 ◽  
Author(s):  
Elda Rossi ◽  
Stefano Evangelisti ◽  
Antonio Laganà ◽  
Antonio Monari ◽  
Sergio Rampino ◽  
...  

2001 ◽  
Vol 34 (4) ◽  
pp. 519-522 ◽  
Author(s):  
E. Homan ◽  
M. Konijnenburg ◽  
C. Ferrero ◽  
R. E. Ghosh ◽  
I. P. Dolbnya ◽  
...  

The small/wide-angle X-ray scattering (SAXS/WAXS) system on the DUBBLE CRG beamline at the ESRF is used for both static and time-resolved measurements. The integrated system developed for control and data reduction deals effectively with the high rates of incoming data from the different detector systems, as well as the presentation of results for the user. To ensure that the data may be used directly by a wide range of packages, they may be recorded in a number of output formats, thus serving as a practical test bed where developing standards may be compared and contrasted. The software system implements proposals raised at the canSAS meetings to promote a limited set of standard data formats for small-angle scattering studies. The system presented can cope with a volume of results in excess of 10 Gbytes of data per experiment and shows the advantages achieved by minimizing the dependence on raw-data formats.


2021 ◽  
Vol 13 (21) ◽  
pp. 4399
Author(s):  
Alberto Arienzo ◽  
Bruno Aiazzi ◽  
Luciano Alparone ◽  
Andrea Garzelli

In this work, we investigate whether the performance of pansharpening methods depends on the format of their input data: in the case of spectral radiance, either its original floating-point format or an integer-packed fixed-point format. It is theoretically proven and experimentally demonstrated that methods based on multiresolution analysis are unaffected by the data format. Conversely, the format is crucial for methods based on component substitution, unless the intensity component is calculated by means of a multivariate linear regression between the upsampled bands and the lowpass-filtered Pan. Another concern related to data formats is whether quality measurements, carried out by means of normalized indexes, depend on the format of the data on which they are calculated. We focus on some of the most widely used with-reference indexes to provide novel insight into their behavior. Both theoretical analyses and computer simulations, carried out on GeoEye-1 and WorldView-2 datasets with the products of nine pansharpening methods, show that performance does not depend on the data format for purely radiometric indexes, while it depends significantly on the data format, either floating-point or fixed-point, for a purely spectral index such as the spectral angle mapper. The dependence on the data format is weak for indexes that balance spectral and radiometric similarity, such as the Q2n family of indexes based on hypercomplex algebra.
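The format sensitivity of the spectral angle mapper follows directly from its definition: a pure multiplicative gain leaves the angle between two spectral vectors unchanged, while an additive offset, of the kind an integer-packing scheme can introduce, does not. A minimal sketch, with toy spectral vectors that are not from the paper's datasets:

```python
import math

def sam(a, b):
    """Spectral angle mapper: angle (radians) between two spectral vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

v = [0.2, 0.5, 0.9]                # a toy spectral pixel
gained = [2.0 * x for x in v]      # multiplicative gain: angle unchanged
shifted = [x + 0.5 for x in v]     # additive offset: angle changes

print(round(sam(v, gained), 6))    # 0.0 -- SAM is invariant to gain
print(round(sam(v, shifted), 6))   # > 0 -- SAM is not invariant to offsets
```

This is why a purely spectral index reacts to a change between floating-point radiance and fixed-point packed values, while gain-only rescalings leave it untouched.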


2021 ◽  
Author(s):  
Jewgenij Torizin ◽  
Nick Schüßler ◽  
Michael Fuchs

Abstract. This paper introduces the Landslide Susceptibility Assessment Tools – Project Manager Suite (LSAT PM), an open-source, easy-to-use software package written in Python. Primarily developed to conduct landslide susceptibility analyses (LSA), it is not limited to this purpose and applies to any other research dealing with supervised spatial binary classification. With its standardized project framework, LSAT PM provides efficient interactive data management supported by handy tools. The application uses standard data formats, ensuring data transferability to all geographic information systems. LSAT PM has a modular structure that allows the existing toolkit to be extended with additional analyses. LSAT PM v1.0.0b implements heuristic and data-driven methods such as the analytical hierarchy process, weights of evidence, logistic regression, and artificial neural networks. The software was developed and tested over the years in different projects dealing with landslide susceptibility assessment. The emphasis on model uncertainties and statistical model evaluation makes the software a practical modeling tool. It also makes it possible to explore and evaluate LSA models, even those not created with LSAT PM. The software distribution package includes comprehensive documentation, and a dataset for testing the software is available. LSAT PM is subject to continuous further development.
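Of the data-driven methods listed, weights of evidence is compact enough to sketch from first principles: for a binary evidence layer B and landslide presence D, the positive weight is W+ = ln(P(B|D)/P(B|¬D)) and the negative weight W− = ln(P(¬B|D)/P(¬B|¬D)). The cell counts below are invented for illustration, and LSAT PM's actual implementation may differ in detail:

```python
import math

def weights_of_evidence(n_b_d, n_b_nd, n_nb_d, n_nb_nd):
    """Positive/negative weights for a binary evidence layer B
    with respect to landslide presence D, from raster cell counts:
    n_b_d  : cells with B present and a landslide
    n_b_nd : cells with B present, no landslide
    n_nb_d : cells with B absent and a landslide
    n_nb_nd: cells with B absent, no landslide
    """
    n_d = n_b_d + n_nb_d            # all landslide cells
    n_nd = n_b_nd + n_nb_nd         # all non-landslide cells
    w_plus = math.log((n_b_d / n_d) / (n_b_nd / n_nd))
    w_minus = math.log((n_nb_d / n_d) / (n_nb_nd / n_nd))
    return w_plus, w_minus

# Toy counts: the evidence class (e.g. a steep-slope class) covers
# most landslide cells but a minority of non-landslide cells.
w_plus, w_minus = weights_of_evidence(80, 200, 20, 700)
print(w_plus > 0, w_minus < 0)  # prints: True True
```

A positive W+ together with a negative W− indicates that the evidence layer is positively associated with landslide occurrence, which is how such layers are ranked and combined into a susceptibility map.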


2017 ◽  
Vol 9 (3) ◽  
pp. 267-276 ◽  
Author(s):  
Daiga Plase ◽  
Laila Niedrite ◽  
Romans Taranovs

In this paper, file formats such as Avro and Parquet are compared with text formats to evaluate the performance of data queries. Different data query patterns have been evaluated. Cloudera's open-source Apache Hadoop distribution CDH 5.4 was chosen for the experiments presented in this article. The results show that the compact data formats (Avro and Parquet) take up less storage space than plain text formats because of their binary representation and compression advantages. Furthermore, data queries against the column-based Parquet format are faster than those against text formats and Avro.
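The storage advantage of binary, compressible encodings can be illustrated with a stdlib-only sketch. This toy comparison stands in for Avro/Parquet, whose real encodings (schemas, columnar layout, block compression) are far more sophisticated:

```python
import gzip
import struct

# 10,000 integer records stored two ways: as decimal text lines,
# and as packed 4-byte little-endian binary; both then gzip-compressed.
values = list(range(10_000))

text_bytes = "\n".join(str(v) for v in values).encode("ascii")
binary_bytes = struct.pack(f"<{len(values)}i", *values)

text_gz = gzip.compress(text_bytes)
binary_gz = gzip.compress(binary_bytes)

print(len(text_bytes), len(binary_bytes))  # binary is already more compact raw
print(len(text_gz), len(binary_gz))        # compression shrinks both further
```

The fixed-width binary encoding avoids the variable-length decimal digits and separators of the text form, which is the same basic effect, before compression even enters, that favors Avro and Parquet over plain text.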


2014 ◽  
Vol 1 ◽  
pp. 1-26
Author(s):  
George Alter ◽  
Kees Mandemakers

The Intermediate Data Structure (IDS) is a standard data format that has been adopted by several large longitudinal databases on historical populations. Since the publication of the first version in Historical Social Research in 2009, two improved and extended versions have been published in the Collaboratory Historical Life Courses. In this publication we present version 4 which is the latest ‘official’ standard of the IDS. Discussions with users over the last four years resulted in important changes, like the inclusion of a new table defining the hierarchical relationships among ‘contexts’, decision schemes for recording relationships, additional fields in the metadata table, rules for handling stillbirths, a reciprocal model for relationships, guidance for linking IDS data with geospatial information, and the introduction of an extended IDS for computed variables.

