An Automated System For Data Acquisition And Analysis Of Long Term Sleep Recordings For Circadian Studies

Author(s): Vivaldi, Wyneken, Roncagliolo, Ocampo, Zapata


Author(s): Sarkis Shahin, Celso Duran

While long-term monitoring and stewardship mean many things to many people, DOE has defined the term as: “The physical controls, institutions, information, and other mechanisms needed to ensure protection of people and the environment at sites where DOE has completed or plans to complete cleanup (e.g., landfill closures, remedial actions, and facility stabilization).” Across the United States, there are thousands of contaminated sites with multiple contaminants released from multiple sources, where contaminants have migrated and commingled. The U.S. government and U.S. industry are responsible for most of the contamination and own many of these contaminated properties. These sites must be surveyed periodically for various criteria, including structural deterioration, water intrusion, integrity of storage containers, atmospheric conditions, and hazardous substance release. The surveys, however, are intrusive, time-consuming, and expensive, and they expose survey personnel to radioactive contamination. Long-term monitoring therefore needs an automated system that gathers and reports data from sensors without costly human labor. In most cases, a SCADA (Supervisory Control and Data Acquisition) unit is used to collect and report data from a remote location. A SCADA unit consists of an embedded computer with data acquisition capabilities and can be configured with various sensors placed in different areas of the site to be monitored. A system of this type is static, i.e., the sensors, once placed, cannot be moved to other locations within the site. For applications where the number of sampling locations would require too many sensors, or where the exact location of future problems is unknown, a mobile sensing platform is an ideal solution. In many facilities that undergo regular inspections, the number of video cameras and air monitors required to eliminate the need for human inspections is very large and far too costly. HCET’s remote harsh-environment surveyor (RHES) is a robotic platform with SCADA capabilities, equipped with a sonar-imaging scanner, a high-resolution color CCD camera, and various combinations of sensors; it is controlled remotely via a PC. This paper discusses the development and application of this system.
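
To make the acquisition side concrete, the sketch below shows the shape of a periodic sense-and-report loop such as a SCADA unit might run. It is a minimal Python illustration: the sensor drivers, field names, and print-based uplink are all assumptions, since the abstract does not describe the actual RHES hardware or telemetry interfaces.

import json
import random
import time
from datetime import datetime, timezone

SAMPLE_PERIOD_S = 60  # assumed sampling interval

def read_radiation_uSv_h():
    # Stub standing in for a real instrument driver (hypothetical).
    return round(random.uniform(0.05, 0.20), 3)

def read_humidity_pct():
    # Stub standing in for a real instrument driver (hypothetical).
    return round(random.uniform(30.0, 60.0), 1)

def report(record):
    # Stub uplink: a real unit would transmit this to the monitoring station.
    print(json.dumps(record))

def main():
    while True:
        record = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "radiation_uSv_h": read_radiation_uSv_h(),
            "humidity_pct": read_humidity_pct(),
        }
        report(record)
        time.sleep(SAMPLE_PERIOD_S)

if __name__ == "__main__":
    main()

A mobile platform such as the RHES would layer motion control on top of a loop like this, so readings can be taken at varying locations rather than from fixed sensors.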


1987, Vol 78 (4), pp. 269-274
Author(s): T. Poodle

ABSTRACT The Scottish Hydrometric Network consists of a number of river gauging stations which have been located at sites considered suitable for providing long-term flow records. Economic recession has placed some stress on the gauging programme and has given rise to extensive closures of gauging stations in England and, to a minor extent so far, in Scotland. The way in which the network became established provides a mixture of strengths and weaknesses which could have unpredictable consequences in an adverse economic climate. Changing technology provides some opportunity to reduce the cost of data acquisition and improve the deployment of manpower while maintaining data standards. In these changing circumstances, particularly with extensive use of computer systems, it is important that standards are established for data returned to the Water Archive and that the network is not allowed to degenerate by default.


Author(s): Ezequiel Saretta, Antonio P. de Camargo, Tarlei A. Botrel, Marinaldo F. Pinto, Geancarlo T. Katsurayama, ...

ABSTRACT Current meters are instruments widely used for estimating flow velocity in rivers and streams. Periodic calibration of current meters is important to ensure the quality of measurements, but the required testing facilities are complex and available in only a few institutions. Advances in electronics and automation, however, may contribute to the development of simple and reliable calibration systems. This study therefore aimed to develop an automated system for testing current meters, consisting of a trapezoidal channel, a stepper motor, a tow car, and a management system composed of a supervisory application and microprocessor-based modules that control the motor and the data acquisition. Evaluations of the displacement velocity showed that it matched the reference value up to 1.85 m s⁻¹ for a vertical-axis current meter and up to 2.3 m s⁻¹ for a horizontal-axis one. The developed system proved reliable during tests, both for current-meter movement and for data acquisition. Managing the system through the developed modules and the supervisory application improved its user interface, turning the entire procedure into a simple task.
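
The end product of such a test campaign is typically a linear rating that converts rotor speed to water velocity. As a minimal sketch (not the authors' software), the Python snippet below fits the standard rating v = a·n + b to illustrative tow-car runs by least squares; the run data are invented for the example.

import numpy as np

# Each run pairs a rotor speed (rev/s) with the known tow-car velocity (m/s).
n = np.array([0.5, 1.0, 1.5, 2.0, 2.5])       # rotor speed, rev/s
v = np.array([0.38, 0.74, 1.12, 1.49, 1.86])  # tow-car velocity, m/s

a, b = np.polyfit(n, v, 1)     # least-squares fit of the rating line
residuals = v - (a * n + b)    # deviation of each run from the fitted rating

print(f"rating: v = {a:.4f}*n + {b:.4f}  "
      f"(max residual {abs(residuals).max():.3f} m/s)")

In practice one rating is derived per meter and per rotor, and the residuals give a quick check on the quality of both the runs and the instrument.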


1986, Vol 8, pp. 27-30
Author(s): André Champoux, C.S.L. Ommanney

The inventory of Glacier National Park, B.C., has provided an opportunity to test a number of new procedures which might be applied to future Canadian glacier inventories. The objectives of this particular study were to complete a tripartite inventory based on available map and photo sources and develop an automated system of data acquisition and processing. The conjunction of the two has permitted an analysis of the evolution of the glaciers in this area during the last 100 years. The techniques used and a summary of the results are reported.


Sensors, 2020, Vol 20 (10), pp. 2991
Author(s): Damianos Chatzievangelou, Jacopo Aguzzi, Martin Scherwath, Laurenz Thomsen

Deep-sea environmental datasets are ever-increasing in size and diversity, as technological advances lead monitoring studies towards long-term, high-frequency data acquisition protocols. This study presents examples of pre-analysis data treatment steps applied to the environmental time series collected by the Internet Operated Deep-sea Crawler “Wally” during a 7-year deployment (2009–2016) at the Barkley Canyon methane hydrates site, off Vancouver Island (BC, Canada). Pressure, temperature, electrical conductivity, flow, turbidity, and chlorophyll data were subjected to different standardizing, normalizing, and de-trending methods on a case-by-case basis, depending on the nature of the treated variable and on the range and scale of the values provided by each sensor. The final pressure, temperature, and electrical conductivity (transformed to practical salinity) datasets are ready for use. In the cases of flow, turbidity, and chlorophyll, however, further in-depth processing, in tandem with data describing the movement and position of the crawler, will be needed in order to filter out all possible effects of the latter. Our work highlights challenges and solutions in multiparametric data acquisition and quality control, and it takes a significant step toward ensuring that the available environmental data meet high quality standards and support the production of reliable scientific results.
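
Two of the treatments named above, z-score standardization and linear de-trending, are simple enough to sketch generically. The Python snippet below applies them to an invented sensor series; the paper's case-by-case choices per variable are not reproduced here.

import numpy as np

def standardize(x):
    # Return the z-scored series (zero mean, unit variance).
    return (x - x.mean()) / x.std()

def detrend_linear(x):
    # Remove a least-squares linear trend from the series.
    t = np.arange(x.size)
    slope, intercept = np.polyfit(t, x, 1)
    return x - (slope * t + intercept)

# Illustrative series: a slow instrumental drift plus noise.
rng = np.random.default_rng(0)
turbidity = 0.002 * np.arange(1000) + rng.normal(0, 0.1, 1000)

cleaned = standardize(detrend_linear(turbidity))

For variables coupled to the crawler's movement (flow, turbidity, chlorophyll), such generic steps would be only the first stage, before filtering against the position data as the study describes.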


Sensors, 2020, Vol 20 (4), pp. 990
Author(s): Feng Chen, Shouzhi Xu, Ying Zhao, Hui Zhang

Portable meteorological stations are widely used in environmental monitoring systems, but because no cable power is available they operate under tight power constraints, especially in long-term monitoring scenarios. Reducing power consumption by choosing a suitable sensor acquisition frequency is therefore very important for wireless sensor nodes. The regularity of historical environmental data from a monitoring system is analyzed, and an optimization model based on an adaptive genetic algorithm is proposed for data acquisition strategies that lessen the sampling frequency. According to the historical characteristics, the algorithm dynamically adjusts the recent data acquisition frequency so as to collect data at a lower rate, which reduces the energy consumption of the sensor. Experimental results in a practical environment show that the algorithm can greatly reduce the acquisition frequency and can still recover the curve of the environmental monitoring data with little error compared with fixed high-frequency acquisition.
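
The core trade-off, fewer samples versus reconstruction error, can be illustrated with a toy genetic search. The Python sketch below evolves a single sampling interval against an invented daily temperature curve; the encoding, operators, and cost weights are assumptions for illustration, not the paper's adaptive GA.

import random
import numpy as np

t = np.arange(0, 24, 0.05)                       # dense reference timeline (h)
reference = 20 + 5 * np.sin(2 * np.pi * t / 24)  # illustrative temperature curve

def fitness(interval_h):
    # Lower is better: reconstruction error plus an assumed energy cost per sample.
    sample_t = np.arange(0, 24, interval_h)
    sample_v = np.interp(sample_t, t, reference)
    recon = np.interp(t, sample_t, sample_v)     # linear reconstruction
    error = np.abs(recon - reference).mean()
    energy = 0.01 * len(sample_t)
    return error + energy

def evolve(pop_size=20, generations=40):
    pop = [random.uniform(0.1, 6.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b) + random.gauss(0, 0.2)  # crossover + mutation
            children.append(max(0.1, child))
        pop = parents + children
    return min(pop, key=fitness)

print(f"selected sampling interval: {evolve():.2f} h")

The paper's adaptive variant goes further by letting the interval track the regularity of recent history, rather than fixing one interval for the whole day.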


2018, Vol 29 (1), pp. 1-22
Author(s): Roman Lukyanenko, Jeffrey Parsons

The emergence of crowdsourcing as an important mode of information production has attracted increasing research attention. In this article, the authors review crowdsourcing research in the data management field. Most research in this domain can be termed task-based, focusing on micro-tasks that exploit scale and redundancy in crowds. The authors' review points to another important type of crowdsourcing, which they term observational, that can expand the scope of extant crowdsourcing data management research. Observational crowdsourcing consists of projects that harness human sensory ability to support long-term data acquisition. The authors consider the challenges in this domain, review approaches to data management for crowdsourcing, and suggest directions for future research that bridge the gaps between the two research streams.


2018, Vol 18 (3), pp. 819-837
Author(s): Giacomo Vincenzo Demarie, Donato Sabia

Measuring the response of a structure to ambient and service loads is a source of information that can be used to estimate some important engineering parameters or, to a certain extent, to characterize the structural behavior as a whole. By repeating the data acquisition over a period of time, it is possible to check for variations in the structure’s response, which may be correlated with the appearance or growth of damage (e.g., following an exceptional event such as an earthquake, or as a consequence of aging materials and components). The complexity of some existing structures and their environment very often requires the execution of a monitoring plan in order to support analyses and decisions with the evidence of measured data. If the monitoring is implemented through a sensor network acquiring continuously over time, then the evolution of the structural behavior can be tracked continuously as well. Such an approach has become a viable option for practical applications over the last decade, as a consequence of progress in data acquisition and storage systems. However, proper methods and algorithms are needed for managing the large amount of data and extracting valuable knowledge from it. This article presents a methodology for automating the process of structural monitoring when it is carried out continuously over time. It relies on existing methods from the machine learning and data mining fields, which are cast into a process that limits human intervention to the training phase and to the engineering judgment of the results. The methodology has been successfully applied to the real-world case of an ancient masonry bell tower, the Ghirlandina Tower (Modena, Italy), where a network of 12 accelerometers and 3 thermocouples has been acquiring continuously since August 2012. The structural characterization is performed by identifying the first modes of vibration, whose evolution over time has been tracked.
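
One building block of such a pipeline, estimating dominant modal frequencies from an acceleration record, can be sketched with a Welch power spectral density and peak picking. In the Python example below, the synthetic 1.2 Hz signal and the 100 Hz sampling rate are assumptions for illustration; they are not the tower's actual data or the authors' identification method.

import numpy as np
from scipy.signal import welch, find_peaks

fs = 100.0                         # assumed sampling rate, Hz
t = np.arange(0, 600, 1 / fs)      # 10-minute acceleration record
rng = np.random.default_rng(1)
accel = np.sin(2 * np.pi * 1.2 * t) + 0.5 * rng.normal(size=t.size)

# Averaged periodogram, then keep spectral peaks above a relative threshold.
f, pxx = welch(accel, fs=fs, nperseg=4096)
peaks, _ = find_peaks(pxx, height=0.1 * pxx.max())
modes = f[peaks]

print(f"candidate modal frequencies (Hz): {np.round(modes[:3], 2)}")

Repeating an estimate like this on successive records is what allows the evolution of the identified modes to be tracked over time, with temperature data helping to separate environmental effects from structural change.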

