Bridge Health Monitoring by Infrared Thermography

Author(s): Masato Matsumoto, Kyle Ruske

Condition ratings of bridge components in the Federal Highway Administration (FHWA) Structural Inventory and Appraisal database are determined by bridge inspectors in the field, often by visual confirmation or direct-contact sounding techniques. These ratings are, however, inherently subjective, depending on individual inspectors' knowledge and experience as well as on varying field conditions. Limited access, unsafe working conditions, and the negative impacts of lane closures must also be accounted for. This paper describes an alternative method for obtaining informative and diagnostic inspection data for concrete bridge decks: mobile nondestructive bridge deck evaluation technology. The technology uses high-definition infrared and visual imaging to monitor bridge conditions over long-term (or any desired) intervals, and this combination of instruments enables rapid, large-scale data acquisition. Through its implementation in Japan over the course of two decades, the technology is opening new possibilities in a field with much untapped potential. Findings and lessons learned from our experience in the states of Virginia and Pennsylvania are described as examples of highway-speed mobile nondestructive evaluation in action. To validate the accuracy of delamination detection by visual and infrared scanning, findings were verified by physical sounding of the target deck structures.
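The abstract does not specify the detection algorithm, but the physics it relies on (air pockets above a delamination insulate the surface, so during daytime heating a delaminated area reads warmer than sound concrete) can be illustrated with a minimal sketch. The function name, the anomaly threshold, and the use of a median baseline below are assumptions for illustration, not the system's actual processing pipeline.

```python
import numpy as np

def flag_delaminations(thermal_img: np.ndarray, delta_c: float = 0.8) -> np.ndarray:
    """Flag pixels whose temperature deviates from the deck median.

    thermal_img -- 2D array of surface temperatures in degrees Celsius
    delta_c     -- anomaly threshold in Celsius (illustrative value)
    """
    baseline = np.median(thermal_img)   # proxy for sound-deck temperature
    anomaly = thermal_img - baseline
    return anomaly > delta_c            # boolean mask of suspect pixels

# Example: a synthetic 5x5 deck patch with one warm spot
deck = np.full((5, 5), 24.0)
deck[2, 2] = 25.5                       # simulated delamination signature
print(flag_delaminations(deck))
```

In practice the flagged regions would then be cross-checked against the visual imagery, and, as the paper describes, verified by physical sounding.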

2020, Vol 16 (1)
Author(s): Yasuko Inagaki, Tomoko Kobayashi, Yoshihito Suda, Kazuya Kusama, Kazuhiko Imakawa

Abstract

Background: Infection with bovine leukemia virus (BLV), the causative agent of enzootic bovine leukosis (EBL), is increasing on dairy farms in Japan. A tendency for tumor development following BLV infection in certain cow families and bull lines has previously been described. We therefore hypothesized the existence of a genetic component that differentiates cattle susceptibility to the disease.

Results: We analyzed routinely collected large-scale data, including postmortem inspection data, combined with pedigree information and epidemiological data on BLV infection. A total of 6,022 postmortem inspection records of Holstein cattle, raised on 226 farms served by a regional abattoir over the 10 years from 2004 to 2015, were analyzed for associations between sire information and EBL development. We then statistically identified the relative susceptibility to EBL development of the progeny of specific sires and paternal grandsires (PGSs). The heritability of EBL development was calculated as 0.19. Similarly, proviral loads (PVLs) of progeny from the identified sires and PGSs were analyzed, but no significant differences were found.

Conclusions: These observations suggest that because EBL development in our Holstein population is, at least in part, influenced by genetic factors independent of PVL levels, genetic improvement toward a lower incidence of EBL development in cattle, notwithstanding BLV infection, is possible.
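The abstract reports a heritability of 0.19 but does not show the estimation method. As a rough illustration, heritability can be estimated from paternal half-sib groups with a classical one-way ANOVA sire model, in which the sire variance component captures one quarter of the additive genetic variance. The sketch below applies that textbook formula to synthetic 0/1 disease outcomes; it is a simplified observed-scale calculation on made-up data, not the authors' actual model.

```python
import numpy as np

def halfsib_heritability(groups):
    """Estimate heritability from paternal half-sib groups (one-way ANOVA).

    groups -- list of 1D arrays, one per sire, of progeny phenotypes
              (e.g. 0/1 EBL outcomes)

    Under the half-sib model, h^2 = 4 * var_sire / (var_sire + var_within).
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    # effective (weighted) progeny-group size for unbalanced data
    n0 = (n - sum(len(g) ** 2 for g in groups) / n) / (k - 1)
    grand = np.mean(np.concatenate(groups))
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((g - np.mean(g)) ** 2).sum() for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    var_sire = max((ms_between - ms_within) / n0, 0.0)
    return 4 * var_sire / (var_sire + ms_within)

# Synthetic example: three sires whose progeny differ in disease incidence
rng = np.random.default_rng(0)
sires = [rng.binomial(1, p, 30).astype(float) for p in (0.05, 0.10, 0.20)]
print(round(halfsib_heritability(sires), 2))
```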


2015, Vol 101 (4), pp. 392-397
Author(s): Trevor Duke, Edilson Yano, Adrian Hutchinson, Ilomo Hwaihwanje, Jimmy Aipit, ...

Although the WHO recommends that all countries use International Classification of Diseases (ICD)-10 coding for reporting health data, accurate health facility data are rarely available in low- and middle-income countries. Compliance with ICD-10 is extremely resource intensive, and the lack of real data seriously undermines evidence-based approaches to improving quality of care and to clinical and public health programme management. We developed a simple tool for the collection of accurate admission and outcome data and implemented it in 16 provincial hospitals in Papua New Guinea over 6 years. The programme was low cost and easy for ward clerks and nurses to use. Over 6 years, it gathered data on the causes of 96,998 admissions of children and 7,128 deaths. National reports on child morbidity and mortality were produced each year, summarising the incidence and mortality rates for 21 common conditions of children and newborns, and the lessons learned for policy and practice. These data informed the National Policy and Plan for Child Health and triggered the implementation of a clinical quality improvement process and other interventions to reduce mortality in the neediest areas, focusing on the diseases with the highest burdens. It is possible to collect large-scale data on paediatric morbidity and mortality, to be used locally by the health workers who gather it and nationally for improving policy and practice, even in very resource-limited settings where ICD-10 coding systems such as those in some high-income countries are not feasible or affordable.
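The abstract describes the tool only at a high level. A minimal sketch of the kind of aggregation involved (per-condition admission counts, deaths, and case-fatality rates) might look like the following; the class and field names are illustrative, not the programme's actual design.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Admission:
    condition: str   # one of the ~21 standard child/newborn conditions
    died: bool

def summarise(admissions):
    """Per-condition admission counts, deaths, and case-fatality rate (%)."""
    total, deaths = Counter(), Counter()
    for a in admissions:
        total[a.condition] += 1
        if a.died:
            deaths[a.condition] += 1
    return {c: (total[c], deaths[c], round(100 * deaths[c] / total[c], 1))
            for c in total}

records = [Admission("pneumonia", False), Admission("pneumonia", True),
           Admission("malaria", False)]
print(summarise(records))  # {'pneumonia': (2, 1, 50.0), 'malaria': (1, 0, 0.0)}
```

A tally this simple is exactly what makes the approach workable for ward clerks and nurses where full ICD-10 coding is not feasible.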


2021, Vol 5 (1), pp. 44-50
Author(s): Rishab Srivastava

Breakthrough technologies of the past few decades have proved exponentially disruptive to organizations across industries, significantly altering the way their business units and customers operate. Cognitive technologies related to Artificial Intelligence are among the latest disruptive solutions being adopted by organizations. Organizational leaders may feel both pressure and excitement to adopt such nascent technology quickly and at scale. However, because organizations have knowledge gaps around nascent solutions, transformative large-scale initiatives carry a higher risk of negative impact if implementation fails. An iterative approach, by contrast, allows implementation to proceed in smaller increments and leaves room for incorporating feedback and lessons learned into future iterations, thus mitigating the risks of the undertaking. This article breaks the nascent field of advanced cognitive technologies into three main categories based on their business use cases: process automation, cognitive insights, and cognitive engagement. It then explores implementing the technology in each category through the lens of a popular iterative product lifecycle management approach (the Minimum Viable Product) to reduce the risk of failure or other negative impacts on an organization adopting cognitive solutions.


2018, Vol 20 (12), pp. 4473-4491
Author(s): Nick Couldry, Jun Yu

As the World Economic Forum's definition of personal data as 'the new "oil" – a valuable resource of the 21st century' shows, large-scale data processing is increasingly considered the defining feature of the contemporary economy and society. Commercial and governmental discourse on data frequently asserts its benefits, and so legitimates its continuous, large-scale extraction and processing as the starting point for developments in specific industries and, potentially, as the basis for societies as a whole. Against the background of the General Data Protection Regulation, this article unravels how general discourse on data covers over the social practices that enable its collection, through an analysis of high-profile business reports and case studies of the health and education sectors. We show how the conceptualisation of data as having a natural basis in the everyday world shields data collection from ethical questioning while endorsing the use and free flow of data within corporate control, at the expense of its potentially negative impacts on personal autonomy and human freedom.


2021, Vol 19 (3)
Author(s): Álvaro Fernández Casaní, Juan M. Orduña, Javier Sánchez, Santiago González de la Hoz

Abstract

The Large Hadron Collider (LHC) is about to enter its third run at unprecedented energies. The experiments at the LHC face computational challenges involving enormous data volumes that need to be analysed by thousands of physics users. The ATLAS EventIndex project, currently running in production, builds a complete catalogue of particle collisions, or events, for the ATLAS experiment at the LHC. The distributed nature of the experiment's data model is exploited by running jobs at over one hundred Grid data centers worldwide. Millions of files holding petabytes of data are indexed, extracting a small quantity of metadata per event, which is conveyed in real time by a data collection system to a central Hadoop instance at CERN. After a successful first implementation based on a messaging system, several issues pointed to performance bottlenecks at the challenging higher rates of the experiment's next runs. In this work we characterize the weaknesses of the previous messaging system with regard to complexity, scalability, performance, and resource consumption. A new approach based on an object-based storage method was designed and implemented, taking into account the lessons learned and leveraging ATLAS experience with this kind of system. We present the experiment that we ran for three months in the real worldwide production scenario in order to evaluate the messaging and object-store approaches. The results show that the new object-based storage method can efficiently support large-scale data collection for big data environments such as the next runs of the ATLAS experiment at the LHC.
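The abstract contrasts sending one message per event with packing many small metadata records into larger objects. A minimal sketch of the batching idea, assuming an S3-compatible object store accessed via boto3, is shown below; the bucket name, key scheme, and record fields are illustrative and are not the EventIndex's actual interface.

```python
import json
import boto3  # generic S3-style client; the actual ATLAS setup differs

s3 = boto3.client("s3")
BUCKET = "eventindex-demo"  # hypothetical bucket name

def upload_event_batch(run_number: int, events: list[dict]) -> str:
    """Pack many small per-event metadata records into one object.

    Grouping thousands of records per object amortises the per-message
    overhead that a queue-based design pays for every single event,
    which is the core of the object-store approach described above.
    """
    key = f"run{run_number}/batch-{events[0]['event_id']}.jsonl"
    body = "\n".join(json.dumps(e) for e in events).encode()
    s3.put_object(Bucket=BUCKET, Key=key, Body=body)
    return key
```

Fewer, larger store operations trade a little read granularity for far better scalability at high event rates, which is the trade-off the paper's three-month production experiment evaluates.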


2020, Vol 29 (3S), pp. 638-647
Author(s): Janine F. J. Meijerink, Marieke Pronk, Sophia E. Kramer

Purpose: The SUpport PRogram (SUPR) study was carried out in the context of a private-academic partnership and is the first study to evaluate, on a large scale in a hearing aid dispensing setting, the long-term effects of a communication program (SUPR) for older hearing aid users and their communication partners. The purpose of this research note is to reflect on the lessons we learned during the development, implementation, and evaluation phases of the SUPR project.

Procedure: This research note describes the procedures followed during the different phases of the SUPR project and provides a critical discussion of the strengths and weaknesses of the approach taken.

Conclusion: This research note may provide researchers and intervention developers with useful insights into how aural rehabilitation interventions such as SUPR can be developed by incorporating the needs of different stakeholders, evaluated using a robust research design (including a large sample size and a longer-term follow-up assessment), and implemented widely by collaborating with a private partner (a hearing aid dispensing practice chain).


2018
Author(s): Mike Nutt, Gregory Raschke

Library spaces that blend collaboration areas, advanced technologies, and librarian expertise are creating new modes of scholarly communication. These spaces enable scholarship created within high-definition, large-scale visual collaborative environments. This emergent model of scholarly communication can be experienced within those specific contexts or through digital surrogates on the networked Web. From experiencing John Donne's 1622 sermons in three dimensions to interactive media interpretations of American wars, scholars are partnering with libraries to create immersive digital scholarship. Viewing the library as a research platform for these emergent forms of digital scholarship presents several opportunities and challenges. Opportunities include re-engaging faculty in the use of library space, integrating the full life cycle of the research enterprise, and engaging broad communities in the changing nature of digitally driven scholarship. Issues such as identifying and filtering collaborations, strategically managing staff resources, creating surrogates of immersive digital scholarship, and preserving this content for the future present an array of challenges for libraries that require coordination across organizations. From engaging and using high-technology spaces to documenting the data and digital objects created, this developing medium of scholarly communication brings to bear the multifaceted skills and organizational capabilities of libraries.

