Lessons Learned from Modal Testing of Aerospace Structures

1993 ◽  
Vol 36 (1) ◽  
pp. 49-56
Author(s):  
David Hunt ◽  
Ralph Brillhart

A wide variety of challenges have been encountered during the past 10 years of aerospace modal testing. New excitation methods have evolved, including single- and multiple-input random excitation. Enhancements to traditional single- and multiple-input sine methods have been developed. Data analysis techniques that allow more consistent modal models to be extracted in less time than previously required have also been developed. New data acquisition hardware allows more rapid acquisition of modal data. As a result of these new excitation methods, data acquisition hardware, and analysis tools, more high-quality data can be collected in considerably less time than was possible in the past. Modal surveys with 200 to 400 channels of response are becoming more commonplace. During the development and implementation of these new capabilities, many lessons have been learned about how to manage the increased amount of data collected and how to ensure that its quality remains high.

2016 ◽  
Vol 2016 ◽  
pp. 1-11 ◽  
Author(s):  
Xin Li ◽  
Qing-qing Xiao ◽  
Fu-lun Li ◽  
Rong Xu ◽  
Bin Fan ◽  
...  

Objective. To determine whether the immunological serum markers IFN-γ, IL-4, IL-17, IL-23, IL-6, TNF-α, and IL-10 are elevated or decreased in patients with psoriasis vulgaris of blood-heat syndrome compared with healthy controls. Methods. A complete search of the literature on this topic within the past 30 years was conducted across seven databases. Seventeen studies including 768 individuals were identified. Differences in serum marker levels between subjects and controls were pooled as mean differences (MDs) using the random-effects model. Results. The pooled MDs were higher in patients than in healthy controls for IFN-γ (MD 24.9, 95% CI 12.36–37.43), IL-17 (MD 28.92, 95% CI 17.44–40.40), IL-23 (MD 310.60, 95% CI 4.96–616.24), and TNF-α (MD 19.84, 95% CI 13.80–25.87). Pooled IL-4 (MD −13.5, 95% CI −17.74 to −9.26) and IL-10 (MD −10.33, 95% CI −12.03 to −8.63) levels were lower in patients. Conclusion. The pooled analyses suggest that levels of IFN-γ, IL-17, IL-23, and TNF-α are significantly elevated and that levels of IL-4 and IL-10 are significantly decreased in sera of patients with psoriasis vulgaris of blood-heat syndrome. Measuring progression of blood-heat syndrome of psoriasis vulgaris will require additional high-quality data, with a low risk of bias and adequate sample sizes, before and after antipsoriatic therapy.
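As a rough illustration of the pooling step described in this abstract, the sketch below implements DerSimonian-Laird random-effects pooling of per-study mean differences in Python; the per-study values, standard errors, and the function name are placeholders for illustration, not data or code from the cited review.

```python
# Hedged sketch: random-effects pooling of mean differences (MDs) using the
# DerSimonian-Laird estimate of between-study variance. The per-study MDs
# and standard errors below are made-up placeholders, not data from the review.
import numpy as np
from scipy import stats

def pool_random_effects(md, se):
    """Pool per-study mean differences with random-effects (DL) weights."""
    md, se = np.asarray(md, float), np.asarray(se, float)
    w = 1.0 / se**2                            # fixed-effect (inverse-variance) weights
    md_fixed = np.sum(w * md) / np.sum(w)
    q = np.sum(w * (md - md_fixed)**2)         # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(md) - 1)) / c)   # between-study variance
    w_re = 1.0 / (se**2 + tau2)                # random-effects weights
    pooled = np.sum(w_re * md) / np.sum(w_re)
    half_width = stats.norm.ppf(0.975) * np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - half_width, pooled + half_width)

pooled_md, ci95 = pool_random_effects([20.1, 31.5, 24.0], [6.0, 8.5, 7.2])
print(f"pooled MD = {pooled_md:.2f}, 95% CI = ({ci95[0]:.2f}, {ci95[1]:.2f})")
```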


Author(s):  
Gordon M. Cressman ◽  
Michael V. McKay ◽  
Abdul-wahid Al-Mafazy ◽  
Mahdi M. Ramsan ◽  
Abdullah S. Ali ◽  
...  

Decision support systems for malaria elimination must support rapid response to contain outbreaks. The integrated mobile system in Zanzibar has been recognized as one of the most advanced in the world. The system consists of a simple facility-based case notification system that uses common feature phones and a mobile application for Android tablet computers. The resulting system enables rapid response to new cases, helps to rapidly diagnose and treat secondary cases, and provides high-quality data for identifying hot spots, trends, and transmission patterns. This presentation will review the history, technology, results, lessons learned, and applicability to other contexts.


Author(s):  
Paul Farquhar-Smith

The landmark paper discussed in this chapter is ‘Prevalence of pain in patients with cancer: A systematic review of the past 40 years’, published by van den Beuken et al. in 2007. It is not surprising that this definitive study on cancer pain prevalence is one of the most cited papers in cancer pain. Given the extent of the cancer pain literature, it is surprising that the first methodologically sound, major study of cancer pain prevalence was not published until 2007. Many previous estimates lacked accuracy and were prone to bias. What was known was that, despite apparent increasing interest in, research on, and recognition of pain in cancer patients, the prevalence of such pain was still high, even after treatment. This paper attempted to quantify just how high by statistically pooling available high-quality data while avoiding the pitfalls of combining heterogeneous studies that had plagued previous reports.


2020 ◽  
Vol 4 (4) ◽  
pp. 354-359
Author(s):  
Ari Ercole ◽  
Vibeke Brinck ◽  
Pradeep George ◽  
Ramona Hicks ◽  
Jilske Huijben ◽  
...  

Abstract. Background: High-quality data are critical to the entire scientific enterprise, yet the complexity and effort involved in data curation are vastly under-appreciated. This is especially true for large observational, clinical studies because of the amount of multimodal data that is captured and the opportunity for addressing numerous research questions through analysis, either alone or in combination with other data sets. However, a lack of details concerning data curation methods can result in unresolved questions about the robustness of the data, its utility for addressing specific research questions or hypotheses, and how to interpret the results. We aimed to develop a framework for the design, documentation and reporting of data curation methods in order to advance the scientific rigour, reproducibility and analysis of the data. Methods: Forty-six experts participated in a modified Delphi process to reach consensus on indicators of data curation that could be used in the design and reporting of studies. Results: We identified 46 indicators that are applicable to the design, training/testing, run-time and post-collection phases of studies. Conclusion: The Data Acquisition, Quality and Curation for Observational Research Designs (DAQCORD) Guidelines are the first comprehensive set of data quality indicators for large observational studies. They were developed around the needs of neuroscience projects, but we believe they are relevant and generalisable, in whole or in part, to other fields of health research, and also to smaller observational studies and preclinical research. The DAQCORD Guidelines provide a framework for achieving high-quality data, a cornerstone of health research.


2016 ◽  
Vol 56 (2) ◽  
pp. 601
Author(s):  
Nabeel Yassi

The desire to conduct onshore seismic surveys without cables has been an elusive dream since the dawn of seismic exploration. Since the late 1970s, seismic surveys have been conducted with cabled multi-channel acquisition systems. As the number of channels steadily grew, a fundamental restriction appeared, with hundreds of kilometres of line cable dragged on the ground. Seismic surveys within rugged terrain (across rivers, steep cliffs, urban areas, and culturally and environmentally sensitive zones) were both challenging and expensive exercises. Modern technology has made different cable-free solutions practical. High-resolution analogue-to-digital converters are now affordable, as are GPS radios for timing and location. Microprocessors and memory are readily available for autonomous recording systems, and a battery the size and weight of a field node now promises to power an acquisition unit for as long as required for normal seismic crew operations. Many successful 2D and 3D seismic data acquisition programs using cable-free autonomous nodal systems have been completed in the past few years; however, a number of concerns with these systems remain. The first is whether the units work according to manufacturer specifications during the data acquisition window. The second is the limited or absent real-time data quality control, which leads sceptics to describe nodal operations as blind acquisition. The third is the traditional question of geophone arrays versus point-receiver acquisition. Although a string of geophones can be connected to autonomous nodes, the preference is to deploy a single or internal geophone with the nodes to maintain the flexibility of cable-free recording systems. This case study elaborates on the benefits of cable-free seismic surveys, with specific examples of 2D and 3D exploration programs conducted in Australia in the past few years. Optimisation of field crew size, field crew resources, cost implications, and the footprint on the environment, wildlife and domestic livestock will be discussed. In addition, the study focuses on the data quality and data assurance processes implemented during data acquisition to maintain industry standards equivalent to cable recording. Emphasis will also be placed on data analysis and test results comparing the geophone array with cable-free point-receiver recording.
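To make the geophone-array versus point-receiver question concrete, the sketch below computes the standard wavenumber response of an equally spaced, equally weighted linear array; the element count and spacing are illustrative assumptions rather than parameters of the surveys described in this case study.

```python
# Sketch: spatial (wavenumber) response of an N-element linear geophone array
# compared with a single point receiver. The 12 elements and 5 m spacing are
# illustrative assumptions, not parameters of the surveys discussed above.
import numpy as np

def array_response(k, n_elements, spacing):
    """Normalized response |sin(N*pi*k*d) / (N*sin(pi*k*d))| of N equally
    weighted geophones at spacing d, evaluated at wavenumber k (cycles/m)."""
    x = np.pi * np.asarray(k, float) * spacing
    num = np.sin(n_elements * x)
    den = n_elements * np.sin(x)
    # The limit where the denominator -> 0 (k = 0 or grating wavenumbers) is 1.
    return np.abs(np.divide(num, den, out=np.ones_like(x), where=np.abs(den) > 1e-12))

k = np.linspace(0.0, 0.1, 501)            # wavenumbers, cycles/m
array_amp = array_response(k, n_elements=12, spacing=5.0)
point_amp = np.ones_like(k)               # a point receiver applies no spatial filter
print("array attenuation at k = 0.05 cycles/m:", array_amp[250])
```

The flat point-receiver response is the crux of the trade-off: a single geophone passes all wavenumbers unattenuated, so suppression of coherent noise moves from the field array into spatial sampling and processing.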


Author(s):  
O.L. Krivanek ◽  
W.J. de Ruijter ◽  
C.E. Meyer ◽  
M.L. Leber ◽  
J. Wilbrink

Automated electron microscopy promises to perform many tasks better and faster than a human operator. It should also allow the operator to concentrate on the larger picture without having to worry about the countless details that are best handled by a computer. It requires three essential components: 1) a data acquisition system that provides the computer with high-quality data on-line, 2) a computer and software able to analyze the incoming data in real time, and 3) control links that enable the computer to adjust the important microscope parameters. An optimized system architecture is shown schematically in Fig. 1. The microscope is equipped with various microprocessors that control its hardware and provide data processing abilities devoted to different types of signals (e.g., X-ray spectra). These microprocessors use a standardized communication protocol to communicate over a standard network (such as AppleTalk or Ethernet) with a “master computer”, which provides the user interface, as well as the computing power necessary to handle the most demanding tasks.
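As a hedged illustration of the master-computer/microprocessor pattern described above, the sketch below sends a single parameter-adjustment command over a network socket; the JSON message format, host name, port, and the "focus" parameter are purely hypothetical and do not represent the authors' actual protocol.

```python
# Hypothetical sketch of the master-computer pattern: a master process sends a
# parameter-adjustment command to a hardware-control microprocessor over a
# network socket and reads back an acknowledgement. The JSON message format,
# host name, port, and parameter name are illustrative assumptions only.
import json
import socket

def send_command(host: str, port: int, parameter: str, value: float) -> dict:
    """Send one 'set parameter' command and return the controller's reply."""
    message = {"action": "set", "parameter": parameter, "value": value}
    with socket.create_connection((host, port), timeout=5.0) as conn:
        conn.sendall(json.dumps(message).encode("utf-8") + b"\n")
        reply_line = conn.makefile("r", encoding="utf-8").readline()
    return json.loads(reply_line)

# Example (hypothetical controller address and parameter):
# reply = send_command("lens-controller.local", 5025, "focus", 1.25)
```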


2000 ◽  
Vol 178 ◽  
pp. 411-420 ◽  
Author(s):  
Clark R. Wilson

Abstract. Conceptual models of polar motion have evolved over the past century, as improved data revealed signals over progressively wider frequency bands. In the 1890s, Chandler represented polar motion as a sum of discrete components, 14-month and annual terms, and this component model effectively summarized the observations but did not provide a physical explanation for them. Over time, both the search for a physical understanding of polar motion and the ability to observe the broad-band continuum outside the Chandler and annual bands have led to an understanding of polar motion as a continuum of variations, not adequately described by a few discrete components. The continuum concept is now the working model in most studies of polar motion. The transition from component to continuum conceptual models preceded the arrival of high-quality data by several decades, and reflected significant contributions from Harold Jeffreys. With modern space geodetic observations and good global numerical models of the atmosphere, oceans, and other climate processes, it is clear that air and water motion and redistribution are the dominant contributors to the excitation continuum.


Author(s):  
Sethu Arun Kumar ◽  
Thirumoorthy Durai Ananda Kumar ◽  
Narasimha M Beeraka ◽  
Gurubasavaraj Veeranna Pujar ◽  
Manisha Singh ◽  
...  

Predicting novel small-molecule bioactivities for target deconvolution and hit-to-lead optimization in drug discovery research requires suitable molecular representations. Previous reports have demonstrated that machine learning (ML) and deep learning (DL) have substantial implications in virtual screening, peptide synthesis, drug ADMET screening and biomarker discovery. These strategies can increase positive outcomes in the drug discovery process while keeping false-positive rates low, and can be applied cost-effectively and rapidly when high-quality data are acquired. This review discusses recent updates to AI tools used as cheminformatics applications in medicinal chemistry, their role in data-driven decision making in drug discovery, and the challenges of high-quality data acquisition in the pharmaceutical industry while improving small-molecule bioactivities and properties.
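To make the "molecular representation plus ML" idea concrete, the sketch below pairs RDKit Morgan fingerprints with a scikit-learn random forest to predict a binary bioactivity label; the SMILES strings, activity labels, and model choice are illustrative assumptions, not material from the studies covered by this review.

```python
# Hedged sketch: one common molecular representation (Morgan fingerprints from
# RDKit) feeding a scikit-learn classifier to predict a binary bioactivity
# label. The SMILES strings and labels below are made-up placeholders.
import numpy as np
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def featurize(smiles: str, n_bits: int = 1024) -> np.ndarray:
    """Convert a SMILES string into a Morgan fingerprint bit vector."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits)  # radius 2
    arr = np.zeros((0,), dtype=np.int8)
    DataStructs.ConvertToNumpyArray(fp, arr)
    return arr

# Hypothetical training set: a few molecules with made-up activity labels.
train_smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
train_labels = [0, 1, 1, 0]
X = np.vstack([featurize(s) for s in train_smiles])

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, train_labels)
print(model.predict_proba(featurize("CCOC(=O)c1ccccc1")[None, :]))
```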


2013 ◽  
Vol 29 (4) ◽  
pp. 473-488 ◽  
Author(s):  
Ton de Waal

Abstract. National statistical institutes are responsible for publishing high-quality statistical information on many different aspects of society. This task is complicated considerably by the fact that data collected by statistical offices often contain errors. The process of correcting errors is referred to as statistical data editing. For many years this was a purely manual process, with people checking the collected data record by record and correcting them if necessary. For this reason the data editing process has been both expensive and time-consuming. This article sketches some of the important methodological developments of the past few decades that aim to improve the efficiency of the data editing process. The article focuses on selective editing, which is based on an idea that is rather shocking to people working in the production of high-quality data: it is not necessary to find and correct all errors. Instead of trying to correct all errors, it generally suffices to correct only those errors for which data editing has a substantial influence on publication figures. This overview article sketches the background of selective editing, describes the most common form of selective editing to date, and discusses the contributions to this special issue of the Journal of Official Statistics on selective editing. The article concludes by describing some possible directions for future research on selective editing and statistical data editing in general.
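A minimal sketch of the selective-editing idea described above: score each record by its potential influence on a published total and flag only the highest-impact suspicious records for manual review. The scoring rule, threshold, and data are illustrative assumptions, not the method of any specific statistical institute.

```python
# Sketch of score-based selective editing: flag only records whose deviation
# from an anticipated value could materially move the published total.
from dataclasses import dataclass

@dataclass
class Record:
    unit_id: str
    reported: float   # value reported by the respondent
    predicted: float  # anticipated value (e.g. last period or a model)
    weight: float     # survey/raising weight of the unit

def influence_score(rec: Record, estimated_total: float) -> float:
    """Weighted deviation from the anticipated value, relative to the total."""
    return rec.weight * abs(rec.reported - rec.predicted) / estimated_total

def select_for_editing(records, threshold=0.01):
    total = sum(r.weight * r.reported for r in records)
    return [r for r in records if influence_score(r, total) > threshold]

# Hypothetical data: only the record that could move the total is flagged.
recs = [
    Record("A", reported=105.0, predicted=100.0, weight=10.0),
    Record("B", reported=9000.0, predicted=90.0, weight=10.0),  # likely unit error
    Record("C", reported=52.0, predicted=50.0, weight=10.0),
]
print([r.unit_id for r in select_for_editing(recs)])  # -> ['B']
```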


2018 ◽  
Vol 47 (2) ◽  
pp. 124-130 ◽  
Author(s):  
Andras Nagy ◽  
Ingo Jahn

Today renewable energies, particularly wind energy, are important for meeting 24-hour energy demands while keeping carbon emissions low. As the cost of renewable energies is high, improving their efficiency is a key factor in reducing energy prices. High-quality experimental data are essential to develop a deeper understanding of existing systems and to improve their efficiency. This paper introduces an unmanned airborne data acquisition system that can measure properties around wind turbines to provide new insight into aerodynamic performance and loss mechanisms and to provide validation data for wind-turbine design methods. The described system is a flexible and portable platform for collecting high-quality data from existing full-scale wind-turbine installations. This allows experiments to be conducted without scaling and with real-world boundary conditions. The system consists of two major parts: the unmanned flying platform (UAV) and the data acquisition system (DAQ). For the UAV, a commercially available unit is selected, which has the ability to fly a route autonomously with sufficient precision, the ability to hover, and sufficient load capacity to carry the DAQ system. The DAQ system, in contrast, is developed in-house to achieve a high-quality data collection capability and to increase flexibility.

