Tractor Manufacturing Quality Data Acquisition, Analysis and Utilization

Author(s):  
Ying Li ◽  
Zhilin Zhao ◽  
Fang Cheng
2011 ◽  
Vol 51 (1) ◽  
pp. 259
Author(s):  
Rajesh Trivedi ◽  
Shripad Biniwale ◽  
Adil Jabur

With a vision of innovation, integrity and agility, Nexus Energy began first production from the Longtom field in October 2009. The Longtom gas field is located in the Gippsland Basin, offshore Victoria, where the produced gas is transported to Santos’ Patricia Baleen gas processing plant. All production data is acquired by Santos through the supervisory control and data acquisition (SCADA) system. The challenge for Nexus Energy was to monitor the field remotely in the absence of a data historian and to support operations personnel proactively. Data acquisition from Santos, validation, and storage in a secure centralised repository were therefore key tasks. A system was needed that would not only track accurate production volumes to meet the daily contractual quantity (DCQ) production targets but would also be aligned with Nexus’s vision for asset optimisation. We describe how real-time data is acquired, validated, and stored automatically in the absence of a data historian for the Longtom field, and how the deployed system provides a framework for an integrated Production Operation System (iPOS). The solution uses an integrated methodology that allows effective monitoring of real-time data trends to anticipate and prevent potential well and equipment problems, thus assisting in meeting DCQ targets and providing effective analysis techniques for decision making. Based on full workflow automation, the system is deployed for acquisition, allocation, reporting and analysis. This has increased the accuracy, accountability and timely availability of quality data, which has helped Nexus improve productivity. The comprehensive reporting tool provides managers with operational and production reports via email, and delivers output reports in various formats to joint venture partners and nontechnical users without direct access to the core application. A powerful surveillance tool, integrated with the operational database, provides alarms and notifications on operational issues, helping engineers make proactive operational decisions. The framework allows a streamlined data flow for dynamic updates of well and simulation models, improving process integration and reducing the runtime cycle.
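
The abstract does not describe the validation logic itself; as a minimal sketch of the kind of limit and rate-of-change checks a SCADA validation-and-alarm workflow might apply before storing a reading, assuming hypothetical tag names and limits (none of these values come from the Longtom system):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical limits for one well-head tag; the real iPOS limits are not published.
@dataclass
class TagLimits:
    low: float
    high: float
    max_rate_of_change: float  # largest per-minute change considered physically plausible

def validate_reading(tag: str, value: float, previous: float,
                     limits: TagLimits, minutes_elapsed: float) -> list[str]:
    """Return alarm messages for one incoming SCADA reading (empty list if the value is clean)."""
    alarms = []
    if not (limits.low <= value <= limits.high):
        alarms.append(f"{tag}: value {value} outside [{limits.low}, {limits.high}]")
    if minutes_elapsed > 0:
        rate = abs(value - previous) / minutes_elapsed
        if rate > limits.max_rate_of_change:
            alarms.append(f"{tag}: change of {rate:.1f}/min exceeds {limits.max_rate_of_change}/min")
    return alarms

# Example: a wellhead-pressure reading arriving from the SCADA feed (tag name and values invented).
limits = TagLimits(low=50.0, high=120.0, max_rate_of_change=5.0)
for msg in validate_reading("LT3_WHP_bar", 131.2, 112.0, limits, minutes_elapsed=10):
    # In the deployed system this would be stored and emailed/notified to engineers.
    print(datetime.now(timezone.utc).isoformat(), "ALARM", msg)
```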


1993 ◽  
Vol 36 (1) ◽  
pp. 49-56
Author(s):  
David Hunt ◽  
Ralph Brillhart

A wide variety of challenges have been encountered during the past 10 years of aerospace modal testing. New excitation methods have evolved, including single- and multiple-input random excitation. Enhancements to traditional single- and multiple-input sine methods have been developed. Data analysis techniques that allow more consistent modal models to be extracted in less time than previously required have also been developed. New data acquisition hardware allows more rapid acquisition of modal data. As a result of these new excitation methods, data acquisition hardware and analysis tools, more high-quality data can be collected in considerably less time than was possible in the past. Modal surveys with 200 to 400 channels of response are becoming more commonplace. During the development and implementation of these new capabilities, many lessons have been learned about how to manage the increased amount of data collected and how to ensure that its quality remains high.
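
As an illustration of the kind of analysis behind random-excitation modal testing, the following sketch estimates a frequency response function from single-input random data with an H1 estimator; the sample rate, modal frequency and damping are invented for the demonstration and are not taken from the surveys described.

```python
import numpy as np
from scipy.signal import csd, welch

rng = np.random.default_rng(0)
fs = 1024.0                              # sample rate in Hz (invented for the demo)
t = np.arange(0, 30, 1 / fs)

# Broadband random force input driving a lightly damped single-mode structure.
x = rng.standard_normal(t.size)
wn, zeta = 2 * np.pi * 40.0, 0.02        # 40 Hz mode with 2% damping
y = np.zeros_like(x)
u = v = 0.0
for i, f_in in enumerate(x):             # crude semi-implicit Euler time integration
    a = f_in - 2 * zeta * wn * v - wn**2 * u
    v += a / fs
    u += v / fs
    y[i] = u
y += 0.01 * y.std() * rng.standard_normal(t.size)   # add measurement noise

# H1 estimator: cross-spectrum over input auto-spectrum, averaged over segments.
freq, Sxy = csd(x, y, fs=fs, nperseg=4096)
_, Sxx = welch(x, fs=fs, nperseg=4096)
H1 = Sxy / Sxx
print(f"estimated resonance near {freq[np.argmax(np.abs(H1))]:.1f} Hz")
```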


2020 ◽  
Vol 4 (4) ◽  
pp. 354-359
Author(s):  
Ari Ercole ◽  
Vibeke Brinck ◽  
Pradeep George ◽  
Ramona Hicks ◽  
Jilske Huijben ◽  
...  

Background: High-quality data are critical to the entire scientific enterprise, yet the complexity and effort involved in data curation are vastly under-appreciated. This is especially true for large observational, clinical studies because of the amount of multimodal data that is captured and the opportunity for addressing numerous research questions through analysis, either alone or in combination with other data sets. However, a lack of details concerning data curation methods can result in unresolved questions about the robustness of the data, its utility for addressing specific research questions or hypotheses and how to interpret the results. We aimed to develop a framework for the design, documentation and reporting of data curation methods in order to advance the scientific rigour, reproducibility and analysis of the data.
Methods: Forty-six experts participated in a modified Delphi process to reach consensus on indicators of data curation that could be used in the design and reporting of studies.
Results: We identified 46 indicators that are applicable to the design, training/testing, run time and post-collection phases of studies.
Conclusion: The Data Acquisition, Quality and Curation for Observational Research Designs (DAQCORD) Guidelines are the first comprehensive set of data quality indicators for large observational studies. They were developed around the needs of neuroscience projects, but we believe they are relevant and generalisable, in whole or in part, to other fields of health research, and also to smaller observational studies and preclinical research. The DAQCORD Guidelines provide a framework for achieving high-quality data, a cornerstone of health research.
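
The 46 DAQCORD indicators are not listed in the abstract; purely as a sketch of the automated run-time checks such a framework can formalise, the snippet below computes field completeness and plausible-range violations on toy records (the field names, ranges and values are invented and are not DAQCORD indicators).

```python
import pandas as pd

# Toy clinical-style records; field names, ranges and values are invented.
records = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "age":        [34, 51, None, 29],
    "gcs_total":  [15, 3, 22, 12],   # Glasgow Coma Scale totals are only valid between 3 and 15
})

def completeness(df: pd.DataFrame) -> pd.Series:
    """Fraction of non-missing values per field."""
    return df.notna().mean()

def range_violations(df: pd.DataFrame, column: str, low: float, high: float) -> pd.DataFrame:
    """Rows whose value falls outside the plausible range for the field."""
    return df[(df[column] < low) | (df[column] > high)]

print(completeness(records))                          # e.g. age is 75% complete
print(range_violations(records, "gcs_total", 3, 15))  # flags the implausible GCS of 22
```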


2016 ◽  
Vol 56 (2) ◽  
pp. 601
Author(s):  
Nabeel Yassi

The desire to conduct onshore seismic surveys without cables has been an elusive dream since the dawn of seismic exploration. Since the late 1970s, seismic surveys have been conducted with cabled multi-channel acquisition systems. As the number of channels steadily grew, a fundamental restriction appeared, with hundreds of kilometres of line cables dragged on the ground. Seismic surveys within rugged terrain (across rivers, steep cliffs, urban areas, and culturally and environmentally sensitive zones) were both challenging and expensive exercises. Modern technology has made different cable-free solutions practical. High-resolution analogue-to-digital converters are now affordable, as are GPS radios for timing and location. Microprocessors and memory are readily available for autonomous recording systems, along with batteries whose size and weight suit a field node and that now promise to power an acquisition unit for as long as required for normal seismic crew operations. Many successful 2D and 3D seismic data acquisitions using cable-free autonomous nodal systems have been completed in the past few years; however, a number of concerns with these systems remain. The first is whether the units work according to manufacturer specifications during the data acquisition window. The second is the limited or absent real-time data quality control, which leads sceptics to apply the term blind acquisition to nodal operations. The third is the traditional question of geophone arrays versus point-receiver acquisition. Although a string of geophones can be connected to autonomous nodes, the preference is to deploy a single or internal geophone with the nodes to maintain the proposed flexibility of cable-free recording systems. This case study elaborates on the benefits of cable-free seismic surveys, with specific examples of 2D and 3D exploration programs conducted in Australia in the past few years. Optimisation of field crew size and resources, cost implications, and the footprint on the environment, wildlife and domestic livestock will be discussed. In addition, the study focuses on data quality and data assurance and the processes implemented during data acquisition to maintain standards equivalent to those of cable recording. Emphasis will also be placed on data analysis and test results for the geophone array versus cable-free point-receiver recording.
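
The geophone-array-versus-point-receiver question can be made concrete with the standard response formula for a summed in-line array; the element count, spacing and wavelengths below are illustrative rather than values from the Australian programs described.

```python
import numpy as np

def array_response(n_phones: int, spacing_m: float, wavelength_m: np.ndarray) -> np.ndarray:
    """Amplitude response of a summed in-line geophone array to a horizontally travelling
    wave, normalised to a single (point) receiver."""
    x = np.pi * spacing_m / wavelength_m
    with np.errstate(divide="ignore", invalid="ignore"):
        resp = np.abs(np.sin(n_phones * x) / (n_phones * np.sin(x)))
    resp[np.isclose(np.sin(x), 0.0)] = 1.0   # x -> 0 (and grating-lobe) limit
    return resp

# Illustrative 12-element string at 5 m spacing: short-wavelength ground roll is attenuated,
# long-wavelength reflections pass almost unchanged.
wavelengths = np.array([10.0, 20.0, 50.0, 200.0])   # metres
for wl, r in zip(wavelengths, array_response(12, 5.0, wavelengths)):
    print(f"wavelength {wl:6.1f} m -> relative response {r:.3f}")
```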


Author(s):  
O.L. Krivanek ◽  
W.J. de Ruijter ◽  
C.E. Meyer ◽  
M.L. Leber ◽  
J. Wilbrink

Automated electron microscopy promises to perform many tasks better and faster than a human operator. It should also allow the operator to concentrate on the larger picture without having to worry about the countless details that are best handled by a computer. It requires three essential components: 1) a data acquisition system that provides the computer with high-quality data online, 2) a computer and software able to analyze the incoming data in real time, and 3) control links that enable the computer to adjust the important microscope parameters.

An optimized system architecture is shown schematically in Fig. 1. The microscope is equipped with various microprocessors that control its hardware and provide data processing abilities devoted to different types of signals (e.g., X-ray spectra). These microprocessors use a standardized communication protocol to communicate over a standard network (such as AppleTalk or Ethernet) with a “master computer”, which provides the user interface as well as the computing power necessary to handle the most demanding tasks.
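
The abstract names the network (AppleTalk or Ethernet) but not the message format; the sketch below shows one hypothetical way a master computer could exchange standardized set/read commands with a subsystem microprocessor, with all command and register names invented.

```python
import json
from dataclasses import dataclass
from typing import Optional

# Hypothetical message format; the actual protocol is not specified in the abstract.
@dataclass
class Command:
    target: str                   # e.g. "lens_controller", "xray_processor"
    action: str                   # "set" or "read"
    parameter: str                # e.g. "objective_focus"
    value: Optional[float] = None

    def encode(self) -> bytes:
        return json.dumps(self.__dict__).encode()

    @staticmethod
    def decode(raw: bytes) -> "Command":
        return Command(**json.loads(raw))

class LensController:
    """Stand-in for a subsystem microprocessor that owns part of the microscope hardware."""
    def __init__(self) -> None:
        self.registers = {"objective_focus": 0.0}

    def handle(self, cmd: Command) -> dict:
        if cmd.action == "set":
            self.registers[cmd.parameter] = cmd.value
            return {"status": "ok"}
        if cmd.action == "read":
            return {"status": "ok", "value": self.registers[cmd.parameter]}
        return {"status": "error", "reason": f"unknown action {cmd.action}"}

# Master-computer side: an autofocus routine would send small focus corrections like this
# over the network; here the controller is called directly to keep the sketch self-contained.
controller = LensController()
request = Command(target="lens_controller", action="set",
                  parameter="objective_focus", value=1.25e-6)
print(controller.handle(Command.decode(request.encode())))
```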


2016 ◽  
Vol 72 (9) ◽  
pp. 1036-1048 ◽  
Author(s):  
Arnau Casanas ◽  
Rangana Warshamanage ◽  
Aaron D. Finke ◽  
Ezequiel Panepucci ◽  
Vincent Olieric ◽  
...  

The development of single-photon-counting detectors, such as the PILATUS, has been a major recent breakthrough in macromolecular crystallography, enabling noise-free detection and novel data-acquisition modes. The new EIGER detector features a pixel size of 75 × 75 µm, frame rates of up to 3000 Hz and a dead time as low as 3.8 µs. An EIGER 1M and an EIGER 16M were tested on Swiss Light Source beamlines X10SA and X06SA for their application in macromolecular crystallography. The combination of fast frame rates and a very short dead time allows high-quality data acquisition in a shorter time. The ultrafine φ-slicing data-collection method is introduced and validated, and its application in finding the optimal rotation angle, a suitable rotation speed and a sufficient X-ray dose is presented. An improvement in data quality was observed for slicing as fine as one tenth of the mosaicity, which is much finer than expected based on previous findings. The influence of key data-collection parameters on data quality is discussed.
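
The interplay between slicing width, rotation speed and detector frame rate can be checked with simple arithmetic; in the sketch below only the 3000 Hz frame-rate and 3.8 µs dead-time figures come from the abstract, while the mosaicity, rotation range and speed are illustrative.

```python
# Fine phi-slicing bookkeeping; only the 3000 Hz frame rate and 3.8 us dead time come from
# the abstract, all other values are illustrative.
mosaicity_deg = 0.10           # assumed crystal mosaicity
slicing_fraction = 0.1         # slice at one tenth of the mosaicity
total_rotation_deg = 180.0
rotation_speed_deg_s = 10.0    # chosen rotation speed

slice_width_deg = mosaicity_deg * slicing_fraction       # 0.01 deg per frame
n_frames = int(total_rotation_deg / slice_width_deg)     # 18000 frames
frame_rate_hz = rotation_speed_deg_s / slice_width_deg   # 1000 Hz required
dead_time_loss = 3.8e-6 * frame_rate_hz                  # fraction of each frame lost

print(f"slice width       : {slice_width_deg:.3f} deg")
print(f"frames            : {n_frames}")
print(f"frame rate needed : {frame_rate_hz:.0f} Hz (EIGER supports up to 3000 Hz)")
print(f"total scan time   : {total_rotation_deg / rotation_speed_deg_s:.0f} s")
print(f"dead-time loss    : {dead_time_loss * 100:.2f} % of each frame")
```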

