Corrosion Control Cost Reduction through Improved QA Information Management - An OSD Funded Joint Navy Army Initiative

2008 ◽  
Vol 38 ◽  
pp. 88-92
Author(s):  
Vincent F. Hock ◽  
Susan Drozdz ◽  
Andrew Seelinger ◽  
Delmar Doyle

When the coating application does not meet the required standards, the lifetime of the coating can be substantially reduced. In the worst case, the coating may fail catastrophically soon after being placed into service. This ongoing joint project of the U.S. Navy and the U.S. Army is intended to demonstrate and provide automated data collection for painting projects on critical structures, and to make these data a more effective resource for management decisions protecting DoD assets.

Author(s):  
Saeed Babanajad ◽  
Yun Bai ◽  
Helmut Wenzel ◽  
Moritz Wenzel ◽  
Hooman Parvardeh ◽  
...  

The effective management of bridges requires a good understanding of their life expectancies. Improved prediction of bridge service life is needed to better understand bridge deterioration and to identify more effective maintenance and repair strategies. Such models are integral components of the Long-Term Bridge Performance (LTBP) Program, a 20-year research effort initiated by the U.S. Federal Highway Administration (FHWA) to improve the understanding of bridge performance. This paper presents the development of a life expectancy model framework as part of the research effort in this program. The framework is based on a semi-probabilistic approach that retains the advantages of both deterministic and probabilistic techniques. The modeling follows a step-by-step process: collecting data from historical records, training a model on those data using the most suitable approach, and reducing the associated uncertainties. The basic model is first trained on the network bridge inventory, with uncertainties reflected by lower and upper margins. The model is then improved by introducing new knowledge gained from the external attributes influencing the structure. Finally, the condition states of the bridge components are used directly to refine the model for realistic assessment. The developed model is automated in the Bridge Portal, the main core of the bridge-performance data warehouse. A detailed example using the Mid-Atlantic cluster bridge inventory data illustrates the application of the method described above.
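The semi-probabilistic idea described above (a deterministic core model with empirical uncertainty margins) can be sketched in a few lines. This is a minimal illustration, not the LTBP model itself: the linear trend, the synthetic inventory data, and the 5%/95% margin quantiles are all assumptions chosen for the example.

```python
import numpy as np

def fit_with_margins(ages, conditions, lower_q=0.05, upper_q=0.95):
    """Fit a deterministic trend of condition rating vs. bridge age,
    then derive lower/upper uncertainty margins from residual quantiles
    (a simple semi-probabilistic treatment)."""
    coeffs = np.polyfit(ages, conditions, deg=1)          # deterministic core model
    trend = np.poly1d(coeffs)
    residuals = conditions - trend(ages)                  # scatter around the trend
    lo, hi = np.quantile(residuals, [lower_q, upper_q])   # empirical margins
    return trend, lo, hi

# Synthetic inventory: condition rating declining with bridge age (illustrative only)
rng = np.random.default_rng(0)
ages = rng.uniform(0, 60, 200)
conditions = 9.0 - 0.08 * ages + rng.normal(0, 0.5, 200)

trend, lo, hi = fit_with_margins(ages, conditions)
pred = trend(40)                   # deterministic estimate at age 40
band = (pred + lo, pred + hi)      # semi-probabilistic lower/upper bounds
```

Refinement with external attributes and component condition states, as the abstract describes, would then narrow these margins for individual bridges.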


Author(s):  
Alexander Yakovlev

Today is the age of transnational corporations and large companies. They bring their shareholders and owners the largest profits, and they are the main sponsors of scientific and technological progress. However, extensive growth is no longer possible for environmental, marketing, resource, and many other reasons, so the main field of competition between companies has become the fight for the client, an individualized approach to each client, and maximum cost reduction. At the same time, a series of scandals that erupted in the early 2000s at major corporations such as Enron, WorldCom, Tyco International, Adelphia, and Peregrine Systems showed that the system of corporate governance, on which the welfare of hundreds of thousands of people depends, requires serious improvements in transparency and openness. In this regard, the U.S. adopted the Sarbanes-Oxley Act of 2002, under which company management is legally obliged to prove that its decisions are based on reliable, relevant, credible, and accurate information (Davenport & Harris, 2010).


2020 ◽  
Vol 54 (6) ◽  
pp. 37-43
Author(s):  
Alicia M. Gorton ◽  
Will J. Shaw

Abstract As countries continue to implement sustainable and renewable energy goals, the need for affordable low-carbon technologies, including those related to offshore wind energy, is accelerating. The U.S. federal government recognizes the environmental and economic benefits of offshore wind development and is taking the necessary steps to overcome critical challenges facing the industry in order to realize these benefits. The U.S. Department of Energy (DOE) is investing in buoy-mounted lidar systems to facilitate offshore measurement campaigns that will advance our understanding of the offshore environment and provide the observational data needed for model validation, particularly at hub height, where offshore observations are largely lacking. On behalf of the DOE, the Pacific Northwest National Laboratory manages a Lidar Buoy Program that facilitates meteorological and oceanographic data collection using validated methods to support the U.S. offshore wind industry. Since being acquired in 2014, the two DOE lidar buoys have been deployed on the U.S. east and west coasts, and their data represent the first publicly available multiseasonal hub-height data collected in U.S. waters. In addition, the buoys have undergone performance testing, significant upgrades, and a lidar validation campaign to ensure the accuracy and reliability of the lidar data needed to support wind resource characterization and model validation (the lidars were validated against a reference lidar installed on the Air-Sea Interaction Tower operated by the Woods Hole Oceanographic Institution). The Lidar Buoy Program is providing valuable offshore data to the wind energy community while focusing data collection on areas of acknowledged high priority.


2019 ◽  
Vol 21 (4) ◽  
pp. 571-581
Author(s):  
Shobana Sivaraman ◽  
Punit Soni

Public health deals with the promotion of health and the prevention and treatment of communicable and non-communicable diseases by designing appropriate health interventions and services delivered through the health systems. A robust database on the magnitude of disease burden, socio-demographic characteristics, and associated risk factors is needed for evidence-based planning, for developing appropriate strategies, and for their implementation, monitoring, and evaluation. Although India has vast information available through various large-scale surveys and research studies, it still lacks a reliable health information management system. The available data are seldom analysed to draw meaningful conclusions, to develop evidence for policies and strategies, and to measure the effectiveness of health programmes. The challenges faced in survey research are multifaceted, from data collection in the field to the rapid transmission of data to central servers. There is an increasing trend toward technology, especially computer-assisted personal interviewing (CAPI), which is not only expensive but also requires extensive training and information management for the transmission and storage of data. This article examines the application of technology in survey research for efficient data management and improved data quality. A software package called Open Data Kit (ODK) was used for data collection and for real-time monitoring of interviewers in the field to improve the quality of data collection, achieve the desired response rate (RR), and better manage field operations. The data collection and field reporting forms designed using ODK demonstrate how technology can be used to articulate research expectations at various levels at lower cost and higher efficiency. The article examines the key aspects of using technology in health survey research and aims to prompt further discussion of technology for field data collection and monitoring.


1993 ◽  
Vol 76 (3) ◽  
pp. 637-643 ◽  
Author(s):  
Joe W Dorner ◽  
Paul D Blankenship ◽  
Richard J Cole

Abstract A study was conducted to measure the precision of 2 rapid aflatoxin assay systems in use at 37 peanut buying points during the 1991 harvest season. Aflatoxin laboratories were established at the 37 buying points to analyze peanut samples from all incoming farmers’ stock loads as part of a joint project sponsored by various segments of the U.S. peanut industry and the U.S. Department of Agriculture. Eighteen laboratories were equipped with Neogen’s Veratox FSP rapid assay system, whereas 19 laboratories used Vicam’s Aflatest rapid assay system. To monitor the performance of the field laboratories during the project, 3 portions of each of six 27 kg samples of ground peanuts were sent to each laboratory for analysis over a period of 6 weeks. Aflatoxin concentrations ranged from 0 to 300 ng/g when eight 200 g subsamples of each sample were analyzed by liquid chromatography (LC). For the 5 samples contaminated with aflatoxin, relative standard deviations for repeatability (RSDr) for laboratories using Veratox FSP ranged from 18.66 to 53.29%, and the relative standard deviations for reproducibility (RSDR) ranged from 22.79 to 59.29%. For laboratories using the Aflatest system, RSDr values ranged from 18.70 to 41.48%, and RSDR values ranged from 23.84 to 47.56%. Horwitz ratios < 2.0 were found for 4 of the 5 contaminated samples for both methods, indicating that the overall precision of the 2 methods used in the project was good. Mean aflatoxin concentrations, as determined with the rapid assay systems, were generally lower than those determined by LC, particularly for more highly contaminated samples. This could not be attributed to instability of aflatoxin in peanut paste, because additional information gathered in the study indicated that the stability of aflatoxin in peanut paste stored for 58 days was good.
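The precision measures quoted in this abstract follow standard definitions: RSD is the standard deviation expressed as a percentage of the mean, and the Horwitz ratio (HorRat) divides the observed between-laboratory RSDR by the Horwitz-predicted RSD, with values below about 2 indicating acceptable precision. A minimal sketch; the 40% RSDR and 100 ng/g concentration below are illustrative values chosen from within the reported ranges, not figures from the study:

```python
import numpy as np

def rsd(values):
    """Relative standard deviation (%) of replicate determinations."""
    return 100.0 * np.std(values, ddof=1) / np.mean(values)

def horwitz_predicted_rsd(conc_ng_per_g):
    """Horwitz-predicted reproducibility RSD (%) for mass fraction C:
    PRSD = 2^(1 - 0.5 * log10(C)), with C converted from ng/g."""
    c = conc_ng_per_g * 1e-9   # ng/g -> dimensionless mass fraction
    return 2.0 ** (1.0 - 0.5 * np.log10(c))

def horrat(observed_rsdr, conc_ng_per_g):
    """Horwitz ratio: observed RSDR over the Horwitz prediction;
    values below ~2 indicate acceptable between-lab precision."""
    return observed_rsdr / horwitz_predicted_rsd(conc_ng_per_g)

predicted = horwitz_predicted_rsd(100.0)   # ~22.6% at 100 ng/g
ratio = horrat(40.0, 100.0)                # ~1.77, i.e. < 2.0 ("good")
```

At 100 ng/g the Horwitz formula predicts an RSDR of roughly 22.6%, so even an observed RSDR of 40% still yields a HorRat below 2, consistent with the abstract's conclusion.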


Risk Analysis ◽  
2003 ◽  
Vol 23 (5) ◽  
pp. 865-881 ◽  
Author(s):  
Paul R. Kleindorfer ◽  
James C. Belke ◽  
Michael R. Elliott ◽  
Kiwan Lee ◽  
Robert A. Lowe ◽  
...  

2018 ◽  
Vol 244 ◽  
pp. 01006
Author(s):  
Vladimíra Schindlerová ◽  
Ivana Šajdlerová

An important characteristic for the efficient management of production systems is the ability to track a product, component, or material: that is, to assign it a unique symbol, number, or other code (identifier) that can be traced both within the production process and out to the customer (e.g., when a defective part is the subject of a complaint). Traceability reduces costs by eliminating the risks associated with difficult identification of the material or parts, their handling in pre-production, the manufacturing process, or the storage and sale of finished products to customers. In case of problems, it makes it easier to implement the necessary measures and reduces the time needed to remedy the situation, whether within the company or outside it. Individual companies within the Czech Republic usually address identification and traceability independently. The paper presents the results of an analysis of the current state of record keeping and identification of metallurgical materials in selected companies, and proposes improvements to the current situation in a specific company, especially regarding work with remaining material.


2019 ◽  
Vol 37 (4) ◽  
pp. 244-249
Author(s):  
Akshay Rajaram ◽  
Trevor Morey ◽  
Sonam Shah ◽  
Naheed Dosani ◽  
Muhammad Mamdani

Background: Considerable gains are being made in data-driven efforts to advance quality improvement in health care. However, organizations providing hospice-oriented palliative care for structurally vulnerable persons with terminal illnesses may not have the enabling data infrastructure or framework to derive such benefits. Methods: We conducted a pilot cross-sectional qualitative study involving a convenience sample of hospice organizations across North America providing palliative care services for structurally vulnerable patients. Through semistructured interviews, we surveyed organizations on the types of data collected, the information systems used, and the challenges they faced. Results: We contacted 13 organizations across North America and interviewed 9. All organizations served structurally vulnerable populations, including the homeless and vulnerably housed, socially isolated, and HIV-positive patients. Common examples of collected data included the number of referrals, the number of admissions, length of stay, and diagnosis. More than half of the organizations (n = 5) used an electronic medical record, although none of the record systems were specifically designed for palliative care. All (n = 9) the organizations used the built-in reporting capacity of their information management systems and more than half (n = 6) augmented this capacity with chart reviews. Discussion: A number of themes emerged from our discussions. Present data collection is heterogeneous, and storage of these data is highly fragmented within and across organizations. Funding appeared to be a key enabler of more robust data collection and use. Future work should address these gaps and examine opportunities for innovative ways of analysis and reporting to improve care for structurally vulnerable populations.


2019 ◽  
Vol 14 (3) ◽  
pp. 156-158
Author(s):  
Jordan Patterson

A Review of: Lund, B., & Agbaji, D. (2018). Use of Dewey Decimal Classification by academic libraries in the United States. Cataloging and Classification Quarterly, 56(7), 653-661. https://doi.org/10.1080/01639374.2018.1517851 Abstract Objective – To determine the current use of Dewey Decimal Classification in academic libraries in the United States of America (U.S.). Design – Cross-sectional survey using a systematic sampling method. Setting – Online academic library catalogues in the U.S. Subjects – 3,973 academic library catalogues. Methods – The researchers identified 3,973 academic libraries affiliated with degree-granting post-secondary institutions in the U.S. The researchers searched each library’s online catalogue for 10 terms from a predetermined list. From the results of each search, the researchers selected at least five titles, noted the classification scheme used to classify each title, and coded the library as using Dewey Decimal Classification (DDC), Library of Congress Classification (LCC), both DDC and LCC, or other classification schemes. From the collected data, the researchers calculated totals, which were compared to statistics on DDC usage from two previous reports, one published in 1975 and one in 1996. The researchers performed statistical analyses to determine whether there were any discernible trends from the earliest reported statistics through to the current study. Main Results – Collections classified using DDC were present in 717 libraries (18.9%). Adjusting for the increase in the number of academic libraries in the U.S. between 1975 and 2017, DDC usage in academic libraries has declined by 56% in that time frame. The number of libraries with only DDC in evidence is unreported. Conclusion – The previous four decades have seen a significant decrease in the use of DDC in U.S. academic libraries in favour of LCC; however, the rate at which DDC has disappeared from academic libraries has slowed dramatically since the 1960s. There is no clear indication that DDC will disappear from academic libraries completely.
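The reported decline figure is a simple share comparison: the current proportion of libraries with DDC collections against an earlier baseline proportion. A minimal sketch; the ~0.41 baseline share below is a hypothetical 1975 figure chosen only to show how a ~56% relative decline arises, not a number from the review:

```python
def share(using_ddc, total):
    """Proportion of surveyed libraries with DDC-classified collections."""
    return using_ddc / total

def decline(past_share, current_share):
    """Relative decline in DDC usage share between two survey years."""
    return 1.0 - current_share / past_share

current = share(717, 3973)       # ~0.18 of libraries in the current survey
past = 0.41                      # hypothetical 1975 baseline share
drop = decline(past, current)    # ~0.56, i.e. a ~56% relative decline
```

Normalizing by the share of libraries, rather than comparing raw counts, is what "adjusting for the increase in the number of academic libraries" amounts to.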

