online sensor
Recently Published Documents

TOTAL DOCUMENTS: 74 (five years: 15)
H-INDEX: 13 (five years: 1)

Chemosensors ◽ 2021 ◽ Vol 9 (7) ◽ pp. 175
Author(s): Lukas Wunderlich, Peter Hausler, Susanne Märkl, Rudolf Bierl, Thomas Hirsch

The increasing use of nanoparticles in many applications means that these persistent materials now pollute our environment and threaten our health. An online sensor system for monitoring nanoparticles in fresh water would therefore be highly desirable. We propose a label-free sensor based on surface plasmon resonance (SPR) imaging. Sensitivity was enhanced by a factor of about 100 by upgrading the detector to a high-resolution camera, which in turn required replacing the laser light source with LED excitation. As receptors, different self-assembled monolayers were screened; the nanoparticle-receptor interaction proved to be complex in nature. Taking both sensitivity and reversibility into account, the best system is a dodecanethiol monolayer on the gold sensor surface. Lanthanide-doped nanoparticles, 29 nm in diameter and with a refractive index similar to that of the most common silica nanoparticles, were detected in water down to 1.5 µg mL−1. The sensor can be fully regenerated within one hour without the need for any washing buffer. This sensing concept is expected to be easily adapted to nanoparticles of different size, shape, and composition and, upon miniaturization, to be suitable for long-term monitoring of water quality.
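The reported detection limit of 1.5 µg mL−1 could in principle be derived from a calibration curve. A minimal sketch of the common 3σ limit-of-detection estimate is shown below; all concentrations, responses, and the blank standard deviation are invented for illustration and are not the paper's data.

```python
# Hypothetical 3-sigma limit-of-detection estimate from an SPR calibration
# curve. All numbers below are invented illustrative values.
import numpy as np

conc = np.array([0.0, 2.0, 4.0, 8.0, 16.0])          # nanoparticle conc., ug/mL
response = np.array([0.01, 0.42, 0.83, 1.61, 3.20])  # SPR signal, arb. units

# Linear calibration: response = slope * conc + intercept
slope, intercept = np.polyfit(conc, response, 1)

sd_blank = 0.1                 # std. dev. of repeated blank readings (assumed)
lod = 3.0 * sd_blank / slope   # 3-sigma limit of detection, ug/mL
```

With these illustrative values the fitted slope is about 0.2 signal units per µg mL−1, putting the estimated detection limit near the paper's reported figure.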


2020 ◽ Vol 10 (2) ◽ pp. 555 ◽
Author(s): Anacleto Rizzo, Riccardo Bresciani, Nicola Martinuzzi, Fabio Masi

Nature-based solutions such as constructed wetlands (CWs) for the treatment of industrial wastewater can be operated more efficiently using online-monitored parameters such as inlet/outlet flows and concentrations of specific substances. The present study compares datasets acquired over a two-and-a-half-year period by standard laboratory methods and by a dedicated COD/BOD sensor installed at a winery CW wastewater treatment plant in Tuscany, Italy. The CW wastewater treatment plant (WWTP) is composed of: an equalization tank (70 m³); a French reed bed (1200 m²); a horizontal subsurface flow (HF) CW (960 m²); a free water system (850 m²); an optional post-treatment sand filter (50 m²); and an emergency recirculation line. The average removal performances obtained for this period are 97.5% for COD, 93.1% for MBAS, 84.7% for N-NO2−, 39.9% for NO3−, and 45.5% for TP. The online sensor tracked the COD concentration patterns excellently over the observed period, and the qualitative and quantitative validity of its measurements was assessed by statistical analysis (t-test) and reported in the paper. The availability of online data, acquired every 30 min, is extremely important for optimizing CW system performance, for understanding the behavior of the WWTP under different operating scenarios, and for deciding when to switch eventual process-enhancement tools on or off.
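The two quantitative steps the abstract mentions, removal efficiency and the t-test comparison between lab and online COD measurements, can be sketched as follows. The COD values and sample size are invented for illustration; only the 97.5% removal figure comes from the abstract.

```python
# Hypothetical sketch: (1) removal efficiency between inlet and outlet,
# (2) paired t-test checking that lab and online COD readings agree.
# All measurement values below are invented, not the study's data.
from scipy import stats

def removal_efficiency(c_in, c_out):
    """Percent removal between inlet and outlet concentrations (mg/L)."""
    return 100.0 * (c_in - c_out) / c_in

# e.g. inlet COD 4000 mg/L, outlet 100 mg/L -> 97.5 % removal
cod_removal = removal_efficiency(4000.0, 100.0)

# Paired samples: each lab analysis matched with the sensor reading
# taken at the same time (all values invented).
lab_cod    = [410.0, 385.0, 402.0, 395.0, 420.0, 388.0]  # mg/L
online_cod = [405.0, 390.0, 398.0, 400.0, 415.0, 392.0]  # mg/L

t_stat, p_value = stats.ttest_rel(lab_cod, online_cod)
# A large p-value (> 0.05) means no statistically significant difference
# between the laboratory method and the online sensor.
```

A paired (rather than independent-samples) t-test is the natural choice here, since each lab value and sensor value describe the same wastewater sample.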


Sensors ◽ 2019 ◽ Vol 19 (20) ◽ pp. 4441
Author(s): Jaekwang Cha, Jinhyuk Kim, Shiho Kim

Developing a user interface (UI) suitable for headset environments is one of the challenges in the field of augmented reality (AR) technologies. This study proposes a hands-free UI for an AR headset that exploits the wearer's facial gestures to recognize user intentions. The gestures are detected by a custom-designed sensor that senses skin deformation based on the infrared diffusion characteristics of human skin. We designed a deep neural network classifier to infer the user's intended gestures from the skin-deformation data, which serve as input commands for the proposed UI system. The classifier combines a spatiotemporal autoencoder with a deep embedded clustering algorithm and is trained in an unsupervised manner. The UI device was embedded in a commercial AR headset, and several experiments on the online sensor data verified its operation. The resulting hands-free UI achieved an average user-command recognition accuracy of 95.4% in tests with participants.
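The unsupervised pipeline described above (encode each sensor window into a compact embedding, then cluster the embeddings into gesture classes) can be illustrated with a deliberately simplified stand-in: here PCA plays the role of the paper's spatiotemporal autoencoder and plain k-means replaces deep embedded clustering, on synthetic data. None of this is the authors' implementation.

```python
# Toy stand-in for the paper's unsupervised gesture pipeline.
# Real system: spatiotemporal autoencoder + deep embedded clustering.
# This sketch: PCA as the "encoder", k-means as the clustering step,
# applied to synthetic "skin-deformation" windows (3 gesture classes).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic data: 3 gesture classes, 40 samples each; every sample is a
# flattened 20-frame x 8-channel sensor window (160 values).
centers = rng.normal(size=(3, 160))
X = np.vstack([c + 0.1 * rng.normal(size=(40, 160)) for c in centers])

# "Encode" each window into a low-dimensional embedding, then cluster
# the embeddings into candidate gesture classes without any labels.
embedding = PCA(n_components=8, random_state=0).fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(embedding)
```

The point of the sketch is the two-stage structure, dimensionality reduction followed by clustering, which the paper realizes with far more capable learned components.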

