A Mobile Sensing System for Urban PM2.5 Monitoring with Adaptive Resolution

2016 ◽  
Vol 2016 ◽  
pp. 1-15 ◽  
Author(s):  
Hongjie Guo ◽  
Guojun Dai ◽  
Jin Fan ◽  
Yifan Wu ◽  
Fangyao Shen ◽  
...  

This paper develops a mobile sensing system, the first system used for adaptive-resolution urban air quality monitoring. In this system, we employ several taxis as sensor carriers to collect original PM2.5 data together with a variety of other datasets, including meteorological data, traffic status data, and geographical data across the city. This paper also presents a novel method, AG-PCEM (Adaptive Grid-Probabilistic Concentration Estimation Method), to infer the PM2.5 concentration for undetected grids using dynamic adaptive grids. We collected measurements over a full year using a prototype system in Xiasha District of Hangzhou City, China. Experimental data verify that the proposed system achieves good performance in terms of computational cost and accuracy. The computational cost of AG-PCEM is reduced by about 40.2% compared with the static-grid method PCEM while achieving comparable accuracy, and the accuracy of AG-PCEM is far superior to that of the widely used artificial neural network (ANN) and Gaussian process (GP) approaches, improving on them by 38.8% and 14.6%, respectively. The system can be extended to wide-area air quality monitoring by adjusting the initial grid resolution, and our findings can inform citizens of actual air quality and help authorities locate pollution sources.
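The adaptive-grid idea can be illustrated with a minimal sketch: coarse grid cells that contain many taxi measurements with high concentration variance are recursively subdivided, while sparsely sampled or homogeneous cells stay coarse. The quadtree structure, the sample and variance thresholds, and the stopping rule below are illustrative assumptions, not the AG-PCEM algorithm from the paper.

```python
# Minimal sketch of adaptive grid refinement for mobile PM2.5 measurements.
# The quadtree subdivision rule and thresholds are illustrative assumptions,
# not the AG-PCEM method described in the paper.
from dataclasses import dataclass, field
from statistics import pvariance

@dataclass
class Cell:
    x0: float
    y0: float
    x1: float
    y1: float
    samples: list = field(default_factory=list)   # (x, y, pm25) tuples
    children: list = field(default_factory=list)  # sub-cells after splitting

def refine(cell, max_samples=50, var_threshold=25.0, min_size=0.25):
    """Split a cell when it is densely sampled and heterogeneous."""
    values = [s[2] for s in cell.samples]
    too_small = (cell.x1 - cell.x0) <= min_size
    if too_small or len(values) <= max_samples or pvariance(values) <= var_threshold:
        return  # keep the cell coarse
    mx, my = (cell.x0 + cell.x1) / 2, (cell.y0 + cell.y1) / 2
    quads = [Cell(cell.x0, cell.y0, mx, my), Cell(mx, cell.y0, cell.x1, my),
             Cell(cell.x0, my, mx, cell.y1), Cell(mx, my, cell.x1, cell.y1)]
    for x, y, v in cell.samples:
        idx = (x >= mx) + 2 * (y >= my)   # quadrant index of the sample
        quads[idx].samples.append((x, y, v))
    cell.children = quads
    for q in quads:
        refine(q, max_samples, var_threshold, min_size)
```

Concentration estimation would then be performed per leaf cell, so densely covered areas get a finer spatial resolution at no extra cost for the rest of the domain.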

Author(s):  
Gary Sutlieff ◽  
Lucy Berthoud ◽  
Mark Stinchcombe

Abstract. CBRN (Chemical, Biological, Radiological, and Nuclear) threats are becoming more prevalent, as more entities gain access to modern weapons and industrial technologies and chemicals. This has produced a need for improvements to modelling, detection, and monitoring of these events. While there are currently no dedicated satellites for CBRN purposes, there are a wide range of possibilities for satellite data to contribute to this field, from atmospheric composition and chemical detection to cloud cover, land mapping, and surface property measurements. This study looks at currently available satellite data, including meteorological data such as wind and cloud profiles, surface properties like temperature and humidity, chemical detection, and sounding. Results of this survey revealed several gaps in the available data, particularly concerning biological and radiological detection. The results also suggest that publicly available satellite data largely does not meet the requirements of spatial resolution, coverage, and latency that CBRN detection requires, outside of providing terrain use and building height data for constructing models. Lastly, the study evaluates upcoming instruments, platforms, and satellite technologies to gauge the impact these developments will have in the near future. Improvements in spatial and temporal resolution as well as latency are already becoming possible, and new instruments will fill in the gaps in detection by imaging a wider range of chemicals and other agents and by collecting new data types. This study shows that with developments coming within the next decade, satellites should begin to provide valuable augmentations to CBRN event detection and monitoring.
Article Highlights:
- There is a wide range of existing satellite data in fields that are of interest to CBRN detection and monitoring.
- The data is mostly of insufficient quality (resolution or latency) for the demanding requirements of CBRN modelling for incident control.
- Future technologies and platforms will improve resolution and latency, making satellite data more viable in the CBRN management field.


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Sebastiano Piccolroaz ◽  
Bieito Fernández-Castro ◽  
Marco Toffolon ◽  
Henk A. Dijkstra

Abstract. A multi-site, year-round dataset comprising a total of 606 high-resolution turbulence microstructure profiles of shear and temperature gradient in the upper 100 m depth is made available for Lake Garda (Italy). Concurrent meteorological data were measured from the fieldwork boat at the location of the turbulence measurements. During the fieldwork campaign (March 2017-June 2018), four different sites were sampled on a monthly basis, following a standardized protocol in terms of time-of-day and locations of the measurements. Additional monitoring activity included a 24-h campaign and sampling at other sites. Turbulence quantities were estimated, quality-checked, and merged with water quality and meteorological data to produce a unique turbulence atlas for a lake. The dataset is open to a wide range of possible applications, including research on the variability of turbulent mixing across seasons and sites (demersal vs pelagic zones) and driven by different factors (lake-valley breezes vs buoyancy-driven convection), validation of hydrodynamic lake models, as well as technical studies on the use of shear and temperature microstructure sensors.


Information ◽  
2021 ◽  
Vol 12 (8) ◽  
pp. 296
Author(s):  
Laila Esheiba ◽  
Amal Elgammal ◽  
Iman M. A. Helal ◽  
Mohamed E. El-Sharkawi

Manufacturers today compete to offer not only products, but products accompanied by services, which are referred to as product-service systems (PSSs). PSS mass customization is defined as the production of products and services to meet the needs of individual customers with near-mass-production efficiency. In a PSS mass customization environment, customers are overwhelmed by a plethora of previously customized PSS variants. As a result, finding a PSS variant that is precisely aligned with the customer’s needs is a cognitively demanding task that customers cannot manage effectively on their own. In this paper, we propose a hybrid knowledge-based recommender system that assists customers in selecting previously customized PSS variants from a wide range of available ones. The recommender system (RS) utilizes ontologies for capturing customer requirements, as well as product-service and production-related knowledge. The RS follows a hybrid recommendation approach, in which the problem of selecting previously customized PSS variants is encoded as a constraint satisfaction problem (CSP), to filter out PSS variants that do not satisfy customer needs, and then uses a weighted utility function to rank the remaining PSS variants. Finally, the RS offers a list of ranked PSS variants that can be scrutinized by the customer. In this study, the proposed recommendation approach was applied to a real-life large-scale case study in the domain of laser machines. To demonstrate the applicability of the proposed RS, a web-based prototype system has been developed, realizing all of its modules.
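The two-stage selection logic, hard-constraint filtering (the CSP step) followed by weighted-utility ranking, can be sketched as follows. The variant attributes, customer requirements, and weights are invented for illustration and are not taken from the laser-machine case study.

```python
# Sketch of the two-stage selection: hard-constraint filtering (a simple CSP
# check) followed by weighted-utility ranking. Attributes and weights are
# illustrative assumptions, not values from the case study.

variants = [
    {"id": "PSS-1", "power_kW": 4, "maintenance": "on-site", "price": 180_000},
    {"id": "PSS-2", "power_kW": 6, "maintenance": "remote",  "price": 210_000},
    {"id": "PSS-3", "power_kW": 6, "maintenance": "on-site", "price": 260_000},
]

# Hard requirements: every predicate must hold (constraint satisfaction step).
hard_constraints = [
    lambda v: v["power_kW"] >= 6,
    lambda v: v["price"] <= 250_000,
]

# Soft preferences: normalized scores combined with a weighted utility function.
weights = {"price": 0.6, "maintenance": 0.4}

def utility(v):
    price_score = 1 - v["price"] / 300_000                     # cheaper is better
    service_score = 1.0 if v["maintenance"] == "remote" else 0.5
    return weights["price"] * price_score + weights["maintenance"] * service_score

feasible = [v for v in variants if all(c(v) for c in hard_constraints)]
ranked = sorted(feasible, key=utility, reverse=True)
for v in ranked:
    print(v["id"], round(utility(v), 3))
```

In the full RS, the requirements and variant descriptions would come from the ontologies rather than hand-written dictionaries, but the filter-then-rank structure is the same.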


2021 ◽  
Vol 48 (4) ◽  
pp. 3-3
Author(s):  
Ingo Weber

Blockchain is a novel distributed ledger technology. Through its features and smart contract capabilities, a wide range of application areas has opened up for blockchain-based innovation [5]. In order to analyse how concrete blockchain systems as well as blockchain applications are used, data must be extracted from these systems. Due to various complexities inherent in blockchain, the question of how to interpret such data is non-trivial. Such an interpretation should often be shared among parties, e.g., if they collaborate via a blockchain. To this end, we devised an approach to codify the interpretation of blockchain data, to extract data from blockchains accordingly, and to output it in suitable formats [1, 2]. This work will be the main topic of the keynote. In addition, application developers and users of blockchain applications may want to estimate the cost of using or operating a blockchain application. In the keynote, I will also discuss our cost estimation method [3, 4]. This method was designed for the Ethereum blockchain platform, where cost also relates to transaction complexity, and therefore also to system throughput.
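On Ethereum, the fee of a single transaction is the gas it consumes multiplied by the gas price, so a first-order cost estimate for operating an application aggregates this over its expected transaction mix. The sketch below uses placeholder gas figures and prices and is not the estimation method of [3, 4].

```python
# First-order Ethereum cost estimate: fee = gas_used * gas_price, summed over
# an application's expected transaction mix. Gas figures and prices are
# placeholder assumptions; this is not the estimation method from [3, 4].

WEI_PER_ETH = 10**18
GWEI = 10**9

# Expected monthly transaction mix for a hypothetical application.
tx_mix = [
    {"name": "register_asset", "gas_used": 120_000, "count": 500},
    {"name": "transfer",       "gas_used": 65_000,  "count": 2_000},
    {"name": "plain_payment",  "gas_used": 21_000,  "count": 1_000},  # base tx cost
]

def monthly_cost_eth(gas_price_gwei, eth_price_usd=None):
    total_wei = sum(t["gas_used"] * t["count"] * gas_price_gwei * GWEI for t in tx_mix)
    total_eth = total_wei / WEI_PER_ETH
    return total_eth, (total_eth * eth_price_usd if eth_price_usd else None)

eth, usd = monthly_cost_eth(gas_price_gwei=30, eth_price_usd=2_000)
print(f"~{eth:.3f} ETH per month (~${usd:,.0f})")
```

Because gas consumption grows with transaction complexity, the same calculation also makes the throughput trade-off visible: heavier transactions both cost more and consume more of the limited gas per block.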


Sensors ◽  
2021 ◽  
Vol 21 (10) ◽  
pp. 3338
Author(s):  
Ivan Vajs ◽  
Dejan Drajic ◽  
Nenad Gligoric ◽  
Ilija Radovanovic ◽  
Ivan Popovic

Existing government air quality monitoring networks consist of static measurement stations, which are highly reliable and accurately measure a wide range of air pollutants, but they are very large, expensive and require significant amounts of maintenance. As a promising solution, low-cost sensors are being introduced as complementary air quality monitoring stations. These sensors are, however, less reliable due to their lower accuracy, short life cycle and corresponding calibration issues. Recent studies have shown that low-cost sensors are affected by relative humidity and temperature. In this paper, we explore machine learning methods to further improve the calibration algorithms, with the aim of increasing measurement accuracy by accounting for the impact of temperature and humidity on the readings. A detailed comparative analysis of linear regression, artificial neural network and random forest algorithms is presented, evaluating their performance on measurements of CO, NO2 and PM10 particles, with promising results: achieved R2 values of 0.93–0.97 for CO, 0.82–0.94 for NO2 and 0.73–0.89 for PM10, depending on the observed period of the year. A comprehensive analysis and recommendations on how low-cost sensors could be used as monitoring stations complementary to the reference ones, to increase spatial and temporal measurement resolution, are provided.
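A minimal version of this calibration approach trains a regressor that maps the raw low-cost reading plus temperature and relative humidity to the co-located reference measurement. The synthetic data, feature names and model settings below are assumptions for illustration, not the paper's dataset or tuned configuration.

```python
# Sketch of calibrating a low-cost sensor against a reference station with a
# random forest, using temperature and relative humidity as extra features.
# The synthetic data and model settings are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2_000
temp = rng.uniform(-5, 35, n)        # air temperature [degC]
rh = rng.uniform(20, 95, n)          # relative humidity [%]
ref_no2 = rng.uniform(5, 80, n)      # reference NO2 [ug/m^3]
# Synthetic raw reading: biased by humidity and temperature, plus noise.
raw_no2 = ref_no2 * (1 + 0.004 * (rh - 50)) + 0.3 * (temp - 20) + rng.normal(0, 3, n)

X = np.column_stack([raw_no2, temp, rh])
X_tr, X_te, y_tr, y_te = train_test_split(X, ref_no2, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(r2_score(y_te, model.predict(X_te)), 3))
```

The same pipeline can be re-fit per season, which is one way the dependence of R2 on the observed period of the year can be handled in practice.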


2018 ◽  
Author(s):  
Fabien Maussion ◽  
Anton Butenko ◽  
Julia Eis ◽  
Kévin Fourteau ◽  
Alexander H. Jarosch ◽  
...  

Abstract. Despite their importance for sea-level rise, seasonal water availability, and as a source of geohazards, mountain glaciers are one of the few remaining sub-systems of the global climate system for which no globally applicable, open source, community-driven model exists. Here we present the Open Global Glacier Model (OGGM, http://www.oggm.org), developed to provide a modular and open source numerical model framework for simulating past and future change of any glacier in the world. The modelling chain comprises data downloading tools (glacier outlines, topography, climate, validation data), a preprocessing module, a mass-balance model, a distributed ice thickness estimation model, and an ice flow model. The monthly mass balance is obtained from gridded climate data and a temperature index melt model. To our knowledge, OGGM is the first global model to explicitly simulate glacier dynamics: the model relies on the shallow ice approximation to compute the depth-integrated flux of ice along multiple connected flowlines. In this paper, we describe and illustrate each processing step by applying the model to a selection of glaciers before running global simulations under idealized climate forcings. Even without an in-depth calibration, the model shows very realistic behaviour. We are able to reproduce earlier estimates of global glacier volume by varying the ice dynamical parameters within a range of plausible values. At the same time, the increased complexity of OGGM compared to other prevalent global glacier models comes at a reasonable computational cost: several dozen glaciers can be simulated on a personal computer, while global simulations realized in a supercomputing environment take up to a few hours per century. Thanks to the modular framework, modules of various complexity can be added to the codebase, allowing new kinds of model intercomparisons to be run in a controlled environment. Future developments will add new physical processes to the model as well as tools to calibrate the model in a more comprehensive way. OGGM spans a wide range of applications, from ice-climate interaction studies at millennial time scales to estimates of the contribution of glaciers to past and future sea-level change. It has the potential to become a self-sustained, community-driven model for global and regional glacier evolution.
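The temperature-index mass-balance component mentioned above can be illustrated with a minimal monthly formulation: melt is proportional to the amount by which near-surface temperature exceeds a melt threshold, and accumulation is the solid fraction of precipitation. The parameter values below are placeholders and the formulation is a textbook-style simplification, not OGGM's calibrated implementation.

```python
# Minimal monthly temperature-index mass-balance sketch (textbook-style
# simplification, not OGGM's calibrated implementation). Parameter values
# are placeholder assumptions.

MU_STAR = 8.0        # melt factor [mm w.e. per K per month]
T_MELT = 0.0         # temperature threshold above which melt occurs [degC]
T_SOLID = 2.0        # at or below this temperature, precipitation falls as snow [degC]

def monthly_mass_balance(temp_c, precip_mm):
    """Mass balance [mm w.e.] for one month at one elevation band."""
    melt = MU_STAR * max(temp_c - T_MELT, 0.0)
    accumulation = precip_mm if temp_c <= T_SOLID else 0.0
    return accumulation - melt

# Annual balance from monthly climate at a single elevation band (dummy data).
temps = [-8, -6, -3, 0, 4, 8, 11, 10, 6, 2, -3, -7]         # [degC]
precs = [90, 80, 85, 70, 60, 50, 40, 45, 55, 75, 95, 100]   # [mm]
annual = sum(monthly_mass_balance(t, p) for t, p in zip(temps, precs))
print(f"Annual specific mass balance: {annual:.0f} mm w.e.")
```

In a full model, the melt factor is calibrated per glacier and the balance is evaluated over the glacier's elevation distribution before being fed to the ice flow model.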

