Research on CAD Model Data Conversion for RP Technology

2011 ◽  
Vol 314-316 ◽  
pp. 2253-2258
Author(s):  
Dong Gen Cai ◽  
Tian Rui Zhou

Data processing and conversion play an important role in rapid prototyping (RP) processes, where the choice of data format determines the data processing procedure and method. In this paper, the formats and features of commonly used interface standards such as STL, IGES, and STEP are introduced, and data conversion experiments on CAD models are carried out in the Pro/E system; the conversion results of the different formats are compared and analyzed, and the most suitable data conversion format is proposed.
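As a concrete illustration of the triangular-facet structure that makes STL the simplest of these interface standards, the sketch below parses an ASCII STL file using only the Python standard library. The file name and the shape of the parsed result are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of reading an ASCII STL file, the triangle-mesh interface
# format discussed in the abstract. File name and result structure are
# illustrative assumptions.
def read_ascii_stl(path):
    """Return a list of triangles, each a (normal, [v1, v2, v3]) tuple."""
    triangles = []
    normal, vertices = None, []
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if not tokens:
                continue
            if tokens[0] == "facet" and tokens[1] == "normal":
                normal = tuple(float(x) for x in tokens[2:5])
            elif tokens[0] == "vertex":
                vertices.append(tuple(float(x) for x in tokens[1:4]))
            elif tokens[0] == "endfacet":
                triangles.append((normal, vertices))
                normal, vertices = None, []
    return triangles

if __name__ == "__main__":
    tris = read_ascii_stl("model.stl")  # hypothetical Pro/E STL export
    print(f"{len(tris)} facets")
```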

2020 ◽  
Author(s):  
Christian Zeeden ◽  
Christian Laag ◽  
Pierre Camps ◽  
Yohan Guyodo ◽  
Ulrich Hambach ◽  
...  

Paleomagnetic data are stored in different formats, adapted to the output of a variety of devices and specific analysis software. These include widely used, openly available packages, e.g. PMag.py/MagIC, AGICO (.jr6 and .ged), and PuffinPlot (.ppl). Besides these, individual laboratories have established their own software and data formats.

Here we compare different data formats, identify similarities, and create a common, interchangeable data basis. We introduce the idea of a paleomagnetic object (pmob), a simple data table that can include any data relevant to the user. We propose a basic nomenclature of abbreviations for the most common paleomagnetic data so that different formats can be merged. For this purpose, we introduce a set of automated routines for paleomagnetic data conversion. Our routines bring several data formats into a common format (pmob) and also allow conversion back into selected formats. We propose creating similar routines for all existing paleomagnetic data formats; our suite of computation tools will provide the basis for facilitating the inclusion of further formats. Furthermore, automated data processing allows quality assessment of the data.
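As an illustration of the pmob idea, here is a minimal sketch in Python: one flat table that lab-specific records can be mapped into. The column nomenclature and the sample records are assumptions for illustration; the authors' actual abbreviation scheme may differ.

```python
# Minimal sketch of a "paleomagnetic object" (pmob): one flat table holding
# whatever measurements are relevant. Column names and the input records
# below are illustrative assumptions, not the authors' nomenclature.
import csv

PMOB_COLUMNS = ["specimen", "treatment", "dec", "inc", "intensity"]

def records_to_pmob(records, out_path):
    """Write a list of measurement dicts into one common pmob CSV table."""
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=PMOB_COLUMNS)
        writer.writeheader()
        for rec in records:
            writer.writerow({col: rec.get(col, "") for col in PMOB_COLUMNS})

# Hypothetical demagnetization data converted from a lab-specific format.
records = [
    {"specimen": "S1A", "treatment": "NRM",  "dec": 351.2, "inc": 62.4,
     "intensity": 2.1e-5},
    {"specimen": "S1A", "treatment": "AF10", "dec": 349.8, "inc": 61.1,
     "intensity": 1.6e-5},
]
records_to_pmob(records, "pmob.csv")
```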


2007 ◽  
Vol 24 (3) ◽  
pp. 529-536 ◽  
Author(s):  
Qiang Ji

Abstract: When using pyranometers to measure solar irradiance, it is important to know the magnitude and consequences of the thermal effect introduced by the instruments' glass domes. Historically, the thermal dome effect was not monitored on a regular basis. Case studies show that during the 1999 Aerosol Recirculation and Rainfall Experiment (ARREX) and 2000 Southern African Fire–Atmosphere Research Initiative (SAFARI) field campaigns, the thermal dome effect altered pyranometer output by less than 5 W m−2 at night and by over 20 W m−2 around noon, depending on sky conditions. A calibration and data processing procedure incorporating the thermal dome effect has been tested to resolve the issue. It is demonstrated that the intrinsic calibration constants of the pyranometers can be obtained if two pyranometers are used side by side, and that the thermal dome effect can be inferred whenever a pyranometer and a pyrgeometer are collocated.
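The correction idea can be sketched schematically: treat the pyranometer's thermopile signal as containing a dome-effect term proportional to the net infrared measured by a collocated pyrgeometer. The linear form, the function name, and all coefficient values below are illustrative assumptions, not the paper's calibration results.

```python
# Schematic sketch of the correction idea described in the abstract. The
# linear dome-effect term and the coefficients are placeholders.
def corrected_irradiance(v_thermopile_uV, w_net_ir, C, k):
    """
    v_thermopile_uV : pyranometer thermopile output (microvolts)
    w_net_ir        : net infrared from a collocated pyrgeometer (W m-2)
    C               : intrinsic calibration constant (uV per W m-2)
    k               : empirical dome-effect coefficient (dimensionless)
    """
    return (v_thermopile_uV - k * C * w_net_ir) / C

# Example: a 10 W m-2 net-IR loss with k = 0.3 shifts the result by 3 W m-2,
# the order of magnitude the abstract reports for nighttime offsets.
print(corrected_irradiance(8000.0, -10.0, 10.0, 0.3))
```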


Author(s):  
F. Tsai ◽  
T.-S. Wu ◽  
I.-C. Lee ◽  
H. Chang ◽  
A. Y. S. Su

This paper presents a data acquisition system consisting of multiple RGB-D sensors (Kinects) and digital single-lens reflex (DSLR) cameras, together with a systematic data processing procedure for integrating the two kinds of devices to generate three-dimensional point clouds of indoor environments. In the developed system, the DSLR cameras bridge the Kinects and provide a more accurate ray intersection condition, taking advantage of their higher resolution and image quality. Structure from Motion (SFM) reconstruction is used to link and merge the multiple Kinect point clouds and the dense point clouds derived from the DSLR color images into initial integrated point clouds. Bundle adjustment then resolves the exterior orientation (EO) of all images, and these exterior orientations serve as initial values for combining the per-frame point clouds into the same coordinate system using a Helmert (seven-parameter) transformation. Experimental results demonstrate that the data acquisition system and processing procedure can generate dense, fully colored point clouds of indoor environments, even in featureless areas. The accuracy of the generated point clouds was evaluated by comparing the widths and heights of identified objects, as well as the coordinates of pre-set independent check points, against in situ measurements. Based on the generated point clouds, complete and accurate three-dimensional models of indoor environments can be constructed effectively.
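For readers unfamiliar with the Helmert step, the sketch below applies a seven-parameter similarity transformation (scale, three rotation angles, translation) to a point array with NumPy. The parameter values are placeholders, not values estimated from the authors' data.

```python
# Minimal sketch of the Helmert (seven-parameter) similarity transformation
# used to bring per-frame point clouds into one coordinate system:
# scale s, rotation R built from three angles, translation t.
import numpy as np

def helmert_transform(points, s, angles, t):
    """Apply x' = s * R(angles) @ x + t to an (N, 3) array of points."""
    rx, ry, rz = angles
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx),  np.cos(rx)]])
    Ry = np.array([[ np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz),  np.cos(rz), 0],
                   [0, 0, 1]])
    R = Rz @ Ry @ Rx
    return s * points @ R.T + np.asarray(t)

cloud = np.random.rand(100, 3)  # stand-in for one Kinect frame
merged = helmert_transform(cloud, 1.001, (0.01, -0.02, 0.005), (0.5, 0.2, 0.0))
```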


Author(s):  
Yuan Sun ◽  
Hao Xu ◽  
Jianqing Wu ◽  
Jianying Zheng ◽  
Kurt M. Dietrich

High-resolution vehicle data, including location, speed, and direction, are essential for new transportation systems such as connected-vehicle applications, micro-level traffic performance evaluation, and adaptive traffic control. This research developed a data processing procedure for detecting and tracking multi-lane, multi-vehicle trajectories with a roadside light detection and ranging (LiDAR) sensor. Unlike existing methods for vehicle onboard sensing systems, this procedure was developed specifically to extract high-resolution vehicle trajectories from roadside LiDAR sensors. The procedure includes preprocessing of the raw data, statistical outlier removal, a ground estimation method based on the Least Median of Squares to accurately remove ground points, clustering of vehicle point clouds, a principal-component-based oriented bounding box method to estimate vehicle location, and a geometry-based tracking algorithm. The procedure has been applied to a two-way-stop-sign intersection and an arterial road in Reno, Nevada, and validated by comparing the tracking results with speeds logged from a test vehicle through its on-board diagnostics interface. This data processing procedure can extract high-resolution trajectories of both connected and unconnected vehicles for connected-vehicle applications, and the resulting data are valuable for traffic safety, traffic mobility, and fuel efficiency estimation.
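The oriented-bounding-box step can be illustrated with a generic PCA construction: project a clustered vehicle point cloud onto its principal axes and take the extents along each axis. The sketch below is a minimal NumPy version of that idea, not the authors' implementation; the synthetic cluster is a stand-in for a segmented vehicle.

```python
# Minimal sketch of a principal-component-based oriented bounding box:
# a generic PCA construction, not the authors' exact method.
import numpy as np

def pca_oriented_bbox(points):
    """points: (N, 2) ground-plane coordinates of one vehicle cluster.
    Returns (center, axes, extents) of the oriented bounding box."""
    center = points.mean(axis=0)
    centered = points - center
    # Principal axes are the right singular vectors (rows of vt).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    projected = centered @ vt.T           # coordinates in the PCA frame
    mins, maxs = projected.min(axis=0), projected.max(axis=0)
    box_center = center + 0.5 * (mins + maxs) @ vt
    return box_center, vt, maxs - mins

# Synthetic elongated cluster standing in for a segmented vehicle.
cluster = np.random.randn(200, 2) @ np.array([[4.5, 0.8], [0.2, 1.7]])
center, axes, extents = pca_oriented_bbox(cluster)
print("approx. footprint (length x width):", extents)
```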


Author(s):  
Ryan Mackenzie White

Adoption of non-traditional data sources to augment or replace traditional survey vehicles can reduce respondent burden, provide more timely information for policy makers, and yield insights into society that might otherwise be hidden or missed by traditional survey vehicles. However, non-traditional data sources pose several technological challenges due to the volume, velocity, and quality of the data, and the lack of an applied industry-standard data format limits the reception, processing, and analysis of these sources. Adopting a standardized, cross-language, in-memory data format, organized for efficient analytic operations on modern hardware, as the system of record for all administrative data sources has several implications:

- It enables the efficient use of computational resources related to I/O, processing, and storage.
- It improves data sharing, management, and governance capabilities.
- It increases analyst accessibility to tools, technologies, and methods.

Statistics Canada developed a framework for selecting computing architecture models for efficient data processing, based on benchmark data pipelines representative of common administrative data processes. The pipelines demonstrate the benefits of a standardized data format for data management and the efficient use of computational resources, and they define the preprocessing requirements, data ingestion, data conversion, and metadata modeling for integration into a common computing architecture. The integration of a standardized data format into a distributed data processing framework based on container technologies is discussed as a general technique for processing large volumes of administrative data.
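The abstract does not name the format, but the description (standardized, cross-language, in-memory, organized for analytics on modern hardware) matches Apache Arrow; the sketch below uses pyarrow as an assumed stand-in, with illustrative field names and file paths.

```python
# Sketch of a columnar in-memory table as a system of record, assuming
# Apache Arrow as the standardized format; fields and paths are illustrative.
import pyarrow as pa
import pyarrow.parquet as pq

# Ingest an administrative record batch into a columnar in-memory table.
table = pa.table({
    "record_id": pa.array([1, 2, 3], type=pa.int64()),
    "region": pa.array(["ON", "QC", "BC"]),
    "amount": pa.array([125.0, 310.5, 88.2]),
})

# The same table can be persisted for downstream pipelines without
# a lossy format conversion.
pq.write_table(table, "admin_batch.parquet")
roundtrip = pq.read_table("admin_batch.parquet")
assert roundtrip.equals(table)
```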


Author(s):  
Chikahiro Minowa ◽  
Nobuyoshi Yamaguchi ◽  
Toshio Chiba

Seismic observation systems have progressed greatly, and many accelerometers have been deployed across Japan. A data processing procedure was therefore developed to obtain reasonable permanent displacements and displacement waveforms from the measured acceleration data, with the baseline correction method adopted as the processing procedure. To assess the applicability of the baseline correction method, the permanent displacements and displacement waveforms of major records from the 2003 Off-Tokachi Earthquake were calculated and compared with data from the Japan Geographical Survey Institute and the Port and Harbor Research Institute; the results agreed fairly well with each other. Additionally, the sloshing response of the large liquid storage tank that caught fire in Tomakomai was calculated using these data. The baseline correction method presented here can be used successfully to correct strong-motion records and to provide displacement data for seismic design and vibration testing.
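Baseline correction schemes differ in detail; the sketch below shows a generic textbook variant (remove a low-order polynomial baseline from the acceleration record, then integrate twice) to illustrate how a permanent displacement is recovered. It is a stand-in under stated assumptions, not the authors' specific procedure.

```python
# Generic baseline-correction sketch: polynomial detrend, then double
# integration. A simplified illustration, not the paper's exact scheme.
import numpy as np

def displacement_from_acceleration(acc, dt, poly_order=2):
    """acc: acceleration record (m/s^2) sampled at interval dt (s)."""
    t = np.arange(len(acc)) * dt
    # Remove a low-order polynomial baseline fitted to the acceleration.
    coeffs = np.polyfit(t, acc, poly_order)
    acc_corr = acc - np.polyval(coeffs, t)
    vel = np.cumsum(acc_corr) * dt   # first integration -> velocity
    disp = np.cumsum(vel) * dt       # second integration -> displacement
    return disp

# Synthetic record: a pulse plus a small spurious offset that the
# correction suppresses before integration.
dt = 0.01
acc = np.zeros(6000)
acc[1000:1100] = 1.0     # ground-motion pulse
acc += 0.002             # instrument baseline drift
disp = displacement_from_acceleration(acc, dt)
print("final (permanent) displacement estimate:", disp[-1])
```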

