A Low-Cost Monitoring System and Operating Database for Quality Control in Small Food Processing Industry

2019 ◽  
Vol 8 (4) ◽  
pp. 52 ◽  
Author(s):  
Paladino ◽  
Fissore ◽  
Neviani

The use of fully automated systems for collecting sensor data to monitor and control the quality of small-scale food processes is not widespread. Small and micro-enterprises usually do not carry out their own precompetitive research or prototype development with regard to automation technologies. This study proposes a web-based, low-cost monitoring and supervisory control and data acquisition (SCADA) system, whose kernel is available for free, as a possible solution for these food producers. It is mainly based on open-source software and hardware, so its configuration can be adapted to the application and type of plant. It has a modular architecture, and its main functionalities encompass the acquisition, management, aggregation and visualization of process data, providing an operating database. It also supports food tracking and process quality control: the time series can be browsed via generated QR codes, and several early-warning detection strategies are implemented. A tool for solving migration problems based on Fick's equation is offered as a packaging decision-support system (a worked sketch of such a calculation follows below).
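The abstract describes the packaging tool only as "solving migration problems based on Fick's equation". As a hedged illustration of what such a calculation typically involves, the sketch below evaluates Crank's series solution of Fick's second law for migrant release from a plane polymer film into well-mixed food; the diffusion coefficient, film thickness and contact time are illustrative placeholders, not values from the paper.

```python
import numpy as np

def migrated_fraction(D, L, t, n_terms=100):
    """Fraction of migrant released from a polymer film of thickness L (m)
    into well-mixed food after contact time t (s), using Crank's series
    solution of Fick's second law for a plane sheet (one face in contact)."""
    n = np.arange(n_terms)
    k = (2 * n + 1) * np.pi / (2 * L)  # eigenvalues of the diffusion problem
    terms = 8 / ((2 * n + 1) ** 2 * np.pi ** 2) * np.exp(-D * k ** 2 * t)
    return 1.0 - terms.sum()

# Illustrative numbers: D = 1e-15 m^2/s, a 50 µm film, 10 days of contact.
print(f"migrated fraction: {migrated_fraction(1e-15, 50e-6, 10 * 86400):.2f}")
```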

2021 ◽  
Author(s):  
Adrian Wenzel ◽  
Jia Chen ◽  
Florian Dietrich ◽  
Sebastian T. Thekkekara ◽  
Daniel Zollitsch ◽  
...  

Modeling urban air pollutants is a challenging task, not only because of the complicated, small-scale topography but also because of the complex chemical processes within the chemical regime of a city. Nitrogen oxides (NOx), particulate matter (PM) and other tracer gases, e.g. formaldehyde, hold information about which chemical regime is present in a city. As we aim to test and apply chemical models for urban pollution, especially with respect to spatial and temporal variability, measurement data with high spatial and temporal resolution are critical.

Since governmental monitoring stations for air pollutants such as PM, NOx, ozone (O3) or carbon monoxide (CO) are large and costly, they are usually only sparsely distributed throughout a city. Hence, the official monitoring sites are not sufficient to investigate whether small-scale variability and its integrated effects are captured well by models. Smart networks of small low-cost air pollutant sensors can provide the required grid density and are therefore the tool of choice for setting up or validating urban modeling frameworks. Such sensor networks have been established and run by several groups, producing concentration maps with high spatial and temporal resolution [1, 2].

After conducting a measurement campaign in 2016 to create a high-resolution NO2 concentration map for Munich [3], we are currently setting up a low-cost sensor network to measure NOx, PM, O3 and CO concentrations as well as meteorological parameters [4]. The sensors are stand-alone and do not require mains power, which gives us high flexibility in their deployment. Validating air quality models requires measurements that are not only dense but also highly accurate. Therefore, we will calibrate our sensor nodes weekly against a mobile reference instrument and use the gathered data to train a machine-learning model of the sensor nodes (a minimal calibration sketch follows below). This will help minimize common drawbacks of low-cost sensors such as sensor drift, environmental influences and cross-sensitivities.

[1] Bigi, A., Mueller, M., Grange, S. K., Ghermandi, G., and Hueglin, C.: Performance of NO, NO2 low cost sensors and three calibration approaches within a real world application, Atmos. Meas. Tech., 11, 3717–3735, https://doi.org/10.5194/amt-11-3717-2018, 2018.

[2] Kim, J., Shusterman, A. A., Lieschke, K. J., Newman, C., and Cohen, R. C.: The BErkeley Atmospheric CO2 Observation Network: field calibration and evaluation of low-cost air quality sensors, Atmos. Meas. Tech., 11, 1937–1946, https://doi.org/10.5194/amt-11-1937-2018, 2018.

[3] Zhu, Y., Chen, J., Bi, X., Kuhlmann, G., Chan, K. L., Dietrich, F., Brunner, D., Ye, S., and Wenig, M.: Spatial and temporal representativeness of point measurements for nitrogen dioxide pollution levels in cities, Atmos. Chem. Phys., 20, 13241–13251, https://doi.org/10.5194/acp-20-13241-2020, 2020.

[4] Zollitsch, D., Chen, J., Dietrich, F., Voggenreiter, B., Setili, L., and Wenig, M.: Low-Cost Air Quality Sensor Network in Munich, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-19276, https://doi.org/10.5194/egusphere-egu2020-19276, 2020.
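The abstract does not specify which machine-learning model is used to calibrate the sensor nodes. As a minimal sketch of one common approach, the code below fits a random-forest regression mapping a node's raw signal plus temperature and humidity to co-located reference readings; all names and the synthetic data are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Features: raw low-cost NO2 signal, temperature, relative humidity.
# In practice X and y would come from weekly co-location with the mobile
# reference instrument; synthetic data stands in for them here.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)  # learns drift/cross-sensitivity corrections
print("hold-out R^2:", model.score(X_test, y_test))
```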


2021 ◽  
Author(s):  
Carlos Erazo Ramirez ◽  
Yusuf Sermet ◽  
Frank Molkenthin ◽  
Ibrahim Demir

This paper presents HydroLang, an open-source, integrated, community-driven computational web framework to support research and education in hydrology and water resources. HydroLang uses client-side web technologies and standards to perform routines for the acquisition, management, transformation, analysis and visualization of hydrological datasets. HydroLang comprises four high-cohesion, low-coupling modules for: (1) retrieving, manipulating and transforming raw hydrological data; (2) statistical operations, hydrological analysis and model creation; (3) generating graphical and tabular data representations; and (4) mapping and geospatial data visualization (an illustrative sketch of such a pipeline follows below). Two extensive case studies (evaluation of lumped models and development of a rainfall disaggregation model) demonstrate the framework's capabilities, portability and interoperability. HydroLang's modular architecture and open-source nature allow it to be tailored to any use case or web framework, and promote iterative enhancement with community involvement towards a comprehensive next-generation hydrological software toolkit.
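HydroLang itself is a client-side JavaScript framework; as a purely illustrative analogue (not HydroLang's actual API), the Python sketch below shows how a retrieve-analyze-visualize pipeline built from high-cohesion, low-coupling modules chains together. All function names and the stubbed data are assumptions.

```python
def retrieve(source: str, station: str, variable: str) -> list[float]:
    """Module 1 analogue: fetch raw hydrological records (stubbed here)."""
    return [1.2, 3.4, 2.1, 2.8]  # placeholder discharge values

def analyze(series: list[float]) -> dict[str, float]:
    """Module 2 analogue: basic statistics on the series."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    return {"mean": mean, "variance": var}

def visualize(series: list[float], stats: dict[str, float]) -> None:
    """Module 3 analogue: tabular representation (a plot in a real module)."""
    for v in series:
        print(f"{v:6.2f}")
    print("summary:", stats)

# Modules stay decoupled: each consumes only the previous module's output.
series = retrieve("usgs", "05454500", "discharge")
visualize(series, analyze(series))
```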


2020 ◽  
Author(s):  
Lennart Schmidt ◽  
Hannes Mollenhauer ◽  
Corinna Rebmann ◽  
David Schäfer ◽  
Antje Claussnitzer ◽  
...  

With more and more data being gathered from environmental sensor networks, the importance of automated quality-control (QC) routines to provide usable data in near-real time is becoming increasingly apparent. Machine-learning (ML) algorithms exhibit high potential in this respect, as they can exploit the spatio-temporal relations among multiple sensors to identify anomalies while allowing for non-linear functional relations in the data. In this study, we evaluate the potential of ML for automated QC on two spatio-temporal datasets at different spatial scales: the first is a dataset of atmospheric variables at 53 stations across Northern Germany; the second contains time series of soil moisture and temperature from 40 sensors at a small-scale measurement plot.

Furthermore, we investigate strategies to tackle three challenges that commonly arise when applying ML for QC: 1) As sensors may drop out, the ML models have to be robust against missing values in the input data. We address this by comparing different data-imputation methods, coupled with a binary representation of whether a value is missing or not (see the sketch below). 2) Quality flags that mark erroneous data points, which serve as ground truth for model training, may not be available. 3) There is no guarantee that the system under study is stationary, which may render the outputs of a trained model useless in the future. To address 2) and 3), we frame the problem both as a supervised and an unsupervised learning problem. Here, unsupervised ML models can be beneficial because they do not require ground-truth data and can thus be retrained more easily should the system undergo significant changes. In this presentation, we discuss the performance, advantages and drawbacks of the proposed strategies for tackling the aforementioned challenges. We thereby provide a starting point for researchers in the largely untouched field of ML application for automated quality control of environmental sensor data.
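As a minimal sketch of the missing-value handling described above, the code below pairs a simple imputation (per-sensor median, standing in for whichever of the compared methods is used) with a binary indicator of whether each value was actually observed; the frame layout and values are illustrative.

```python
import numpy as np
import pandas as pd

# Toy multi-sensor frame with dropouts; real inputs would be the station
# or plot time series described in the abstract.
df = pd.DataFrame({
    "t_air_stn1": [12.1, np.nan, 12.4, 12.9],
    "t_air_stn2": [11.8, 11.9, np.nan, 12.5],
})

# Binary representation of whether a value is missing or not ...
mask = df.notna().astype(int).add_suffix("_observed")

# ... coupled with an imputation method (median used as a placeholder).
imputed = df.fillna(df.median())

# Model inputs: imputed values plus the observedness flags.
features = pd.concat([imputed, mask], axis=1)
print(features)
```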


2021 ◽  
Author(s):  
Julius Polz ◽  
Lennart Schmidt ◽  
Luca Glawion ◽  
Maximilian Graf ◽  
Christian Werner ◽  
...  

We can observe a global decrease in well-maintained weather stations operated by meteorological services and governmental institutes. At the same time, environmental sensor data are increasing through the use of opportunistic and remote-sensing approaches. Overall, the trend for environmental sensor networks is strongly towards automated routines, especially for quality control (QC), to provide usable data in near real time. In a common QC scenario, data are flagged manually using expert knowledge and visual inspection. To reduce this tedious process and to enable near-real-time data provision, machine-learning (ML) algorithms exhibit high potential, as they can be designed to imitate the experts' actions.

Here we address three common challenges when applying ML for QC: 1) robustness to missing values in the input data; 2) availability of training data, i.e. manual quality flags that mark erroneous data points; and 3) generalization of the model with respect to non-stationary behavior of one experimental system or changes in the experimental setup when applied to a different study area. We approach the QC problem and the related issues both as a supervised and an unsupervised learning problem, using deep neural networks on the one hand and dimensionality reduction combined with clustering algorithms on the other (a sketch of the latter follows below).

We compare the different ML algorithms on two time-series datasets to test their applicability across scales and domains. One dataset consists of signal levels of 4000 commercial microwave links distributed all over Germany that can be used to monitor precipitation. The second contains time series of soil moisture and temperature from 120 sensors deployed at a small-scale measurement plot at the TERENO site "Hohes Holz".

First results show that supervised ML provides optimized QC performance for an experimental system not subject to change, at the cost of laborious preparation of the training data. The unsupervised approach is also able to separate valid from erroneous data with reasonable accuracy. It provides the additional benefit that it does not require manual flags and can thus be retrained more easily in case the system is subject to significant changes.

In this presentation, we discuss the performance, advantages and drawbacks of the proposed ML routines for tackling the aforementioned challenges. We thereby aim to provide a starting point for researchers in the promising field of ML application for automated QC of environmental sensor data.
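As a hedged sketch of the unsupervised branch (dimensionality reduction plus clustering), the code below reduces windows of sensor readings with PCA and flags DBSCAN noise points as candidate erroneous data; the window size, thresholds and synthetic data are illustrative, not the study's configuration.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Sliding windows of sensor readings; synthetic data stands in for the
# microwave-link or soil-moisture time series used in the study.
rng = np.random.default_rng(1)
windows = rng.normal(size=(300, 24))
windows[:10] += rng.normal(scale=8.0, size=(10, 24))  # inject anomalies

X = StandardScaler().fit_transform(windows)
X2 = PCA(n_components=2).fit_transform(X)            # dimensionality reduction
labels = DBSCAN(eps=0.7, min_samples=5).fit_predict(X2)  # clustering

# DBSCAN labels outliers -1; these serve as candidate quality flags.
print("flagged windows:", np.where(labels == -1)[0])
```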


Greenhouse automation using the Internet of Things (IoT) is a technical approach that benefits farmers through automation and control of the greenhouse environment, including plant health monitoring. Farmers' activities in the greenhouse are important for producing strategic food for the population. Greenhouses are usually affected by the weather and by plant diseases; as a result, their yield can be reduced and income diminished. Based on an analysis of the current situation of small-scale greenhouses, this paper proposes a low-cost solution for controlling the greenhouse environment, identifying and classifying infected plant leaves, and automating the agricultural greenhouse. The design and prototype of the proposed system were developed using a Raspberry Pi, a NodeMCU ESP8266, various sensors and MATLAB. MATLAB is used to classify infected plant leaves, and the sensors measure temperature and humidity in the greenhouse. In addition, control of the actuators is achieved through solid-state relays, which turn the water drip system on or off when a predetermined threshold value is reached (a minimal control-loop sketch follows below). Finally, greenhouse farmers interact with the proposed system via a cloud-based platform. This automation system will enable greenhouse farmers to monitor and control the greenhouse environment automatically, without direct supervision.
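As a minimal sketch of the threshold logic described above, the loop below polls a humidity reading and switches a relay accordingly. The sensor and relay functions are stubbed placeholders: on the actual hardware they would wrap a DHT-type sensor driver and a GPIO-driven solid-state relay, and the threshold value is illustrative, not the paper's.

```python
import time

def read_humidity() -> float:
    """Placeholder for the greenhouse humidity sensor driver."""
    return 41.5  # percent relative humidity

def set_drip_relay(on: bool) -> None:
    """Placeholder for driving the solid-state relay via GPIO."""
    print("drip irrigation", "ON" if on else "OFF")

HUMIDITY_THRESHOLD = 45.0  # illustrative trigger point

while True:
    # Below the threshold, switch the water drip system on; otherwise off.
    set_drip_relay(read_humidity() < HUMIDITY_THRESHOLD)
    time.sleep(60)  # poll once per minute
```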


2016 ◽  
Vol 9 (2) ◽  
pp. 128-136
Author(s):  
Hishar Hassan ◽  
Suharzelim Bakar ◽  
Khairul Halim ◽  
Jaleezah Idris ◽  
Abdul Nordin

2012 ◽  
Vol 44 (2) ◽  
pp. 75-93
Author(s):  
Peter Mortensen

This essay takes its cue from second-wave ecocriticism and from recent scholarly interest in the "appropriate technology" movement that evolved during the 1960s and 1970s in California and elsewhere. "Appropriate technology" (or AT) refers to a loosely knit group of writers, engineers and designers active in the years around 1970, and more generally to the counterculture's promotion, development and application of technologies that were small-scale, low-cost, user-friendly, human-empowering and environmentally sound. Focusing on two roughly contemporary but now largely forgotten American texts, Sidney Goldfarb's lyric poem "Solar-Heated-Rhombic-Dodecahedron" (1969) and Gurney Norman's novel Divine Right's Trip (1971), I consider how "hip" literary writers contributed to eco-technological discourse and argue for the 1960s counterculture's relevance to present-day ecological concerns. Goldfarb's and Norman's texts interest me because they conceptualize iconic 1960s technologies, especially the Buckminster Fuller-inspired geodesic dome and the Volkswagen van, not as inherently alienating machines but as tools of profound individual, social and environmental transformation. Synthesizing antimodernist back-to-nature desires with modernist enthusiasm for (certain kinds of) machinery, these texts adumbrate a humanity- and modernity-centered, post-wilderness model of environmentalism that resonates with the dilemmas we face in an increasingly resource-impoverished, rapidly warming and densely populated world.


Author(s):  
Christian Frilund ◽  
Esa Kurkela ◽  
Ilkka Hiltunen

For the realization of small-scale biomass-to-liquid (BTL) processes, low-cost syngas cleaning remains a major obstacle, and for this reason a simplified gas ultracleaning process is being developed. In this study, a low- to medium-temperature final gas cleaning process based on adsorption and organic-solvent-free scrubbing methods was coupled to a pilot-scale staged fixed-bed gasification facility, including hot filtration and catalytic reforming steps, for extended-duration gas cleaning tests aimed at generating ultraclean syngas. The final gas cleaning process purified syngas of woody and agricultural biomass origin to a degree suitable for catalytic synthesis. The gas contained up to 3000 ppm of ammonia, 1300 ppm of benzene, 200 ppm of hydrogen sulfide, 10 ppm of carbonyl sulfide, and 5 ppm of hydrogen cyanide. Post-run characterization showed that impurities did not accumulate on the Cu-based deoxygenation catalyst (105 h time on stream), demonstrating that effective removal of the main impurities was achieved in the first two steps: acidic water scrubbing (AWC) and adsorption on activated carbons (AR). In the final test campaign, comprehensive multipoint gas analysis confirmed that ammonia was fully removed by the scrubbing step, and that benzene and H2S were fully removed by the subsequent activated carbon beds. The activated carbons achieved > 90% removal of up to 100 ppm of COS and 5 ppm of HCN in the syngas. These results provide insights into the adsorption affinity of activated carbons in a complex impurity matrix, which would be arduous to replicate under laboratory conditions.


Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 2944
Author(s):  
Benjamin James Ralph ◽  
Marcel Sorger ◽  
Benjamin Schödinger ◽  
Hans-Jörg Schmölzer ◽  
Karin Hartl ◽  
...  

Smart factories are an integral element of the manufacturing infrastructure in the context of the fourth industrial revolution. Nevertheless, adequate training facilities for future engineering experts are frequently lacking in the academic environment. For this reason, this paper describes the development and implementation of two different layer architectures for the metal processing environment. The first architecture is based on low-cost but resilient devices, allowing interested parties to work with mostly open-source interfaces and standard back-end programming environments. Additionally, one proprietary and two open-source graphical user interfaces (GUIs) were developed. These interfaces can be adapted on both the front end and the back end, ensuring a holistic understanding of their capabilities and limits. As a result, a six-layer architecture, from digitization to an interactive project management tool, was designed and implemented in the practical workflow at the academic institution. To account for the complexity of thermo-mechanical processing in the metal processing field, an alternative layer, connected to the thermo-mechanical treatment simulator Gleeble 3800, was designed. This framework can transfer sensor data at high frequency, enabling data collection for the numerical simulation of complex material behavior under high-temperature processing (a sketch of such a streaming layer follows below). Finally, the possibility of connecting both systems using open-source software packages is demonstrated.
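The abstract does not name the transport used for high-frequency sensor data; as a hedged, standard-library-only sketch, the code below streams samples as JSON lines over TCP to a collector service. The host, port and channel names (e.g. the force and temperature fields) are illustrative assumptions, not the paper's stack.

```python
import json
import socket
import time

def stream_samples(host: str = "127.0.0.1", port: int = 9009) -> None:
    """Send sensor samples as newline-delimited JSON at roughly 100 Hz."""
    with socket.create_connection((host, port)) as sock:
        while True:
            sample = {
                "ts": time.time(),
                "force_kN": 12.3,        # placeholder channel values
                "temperature_C": 950.0,
            }
            sock.sendall((json.dumps(sample) + "\n").encode())
            time.sleep(0.01)  # ~100 Hz sampling cadence

if __name__ == "__main__":
    stream_samples()
```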


Atmosphere ◽  
2021 ◽  
Vol 12 (2) ◽  
pp. 179
Author(s):  
Said Munir ◽  
Martin Mayfield ◽  
Daniel Coca

Small-scale spatial variability in NO2 concentrations is analysed with the help of pollution maps. Maps of NO2 estimated by the Airviro dispersion model and a land use regression (LUR) model are fused with NO2 concentrations measured by low-cost sensors (LCS), reference sensors and diffusion tubes. In this study, geostatistical universal kriging was employed to fuse (integrate) the model estimates with the measured NO2 concentrations (a sketch of this approach follows below). The results showed that the data fusion approach was capable of producing realistic NO2 concentration maps that inherited the spatial patterns of the pollutant from the model estimates and adjusted the modelled values using the measured concentrations. The fusion of NO2-LCS with NO2-LUR produced the best results, with an r-value of 0.96 and an RMSE of 9.09. Data fusion adds value to both the measured and the estimated concentrations: the measured data are improved by filling spatiotemporal gaps, whereas the modelled data are improved by constraining them with observations. Hotspots of NO2 were found in the city centre, in the eastern parts of the city towards the motorway (M1), and on some major roads. Air quality standards were exceeded at several locations in Sheffield, where annual mean NO2 levels were higher than 40 µg/m3. Road traffic was considered the dominant emission source of NO2 in Sheffield.
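As a hedged sketch of this kind of fusion, the code below uses pykrige's UniversalKriging with the modelled surface supplied as an external drift, so the kriged map inherits the model's spatial pattern while being anchored to the measurements. All arrays are synthetic, and the variogram choice is an assumption rather than the study's configuration.

```python
import numpy as np
from pykrige.uk import UniversalKriging

rng = np.random.default_rng(2)
x_obs = rng.uniform(0, 10, 40)   # sensor locations (arbitrary km grid)
y_obs = rng.uniform(0, 10, 40)
no2_obs = 30 + 2 * x_obs + rng.normal(scale=3, size=40)  # measured NO2

gridx = np.linspace(0, 10, 50)   # output map grid
gridy = np.linspace(0, 10, 50)
lur_surface = 28 + 2 * gridx[None, :] + 0 * gridy[:, None]  # modelled NO2

uk = UniversalKriging(
    x_obs, y_obs, no2_obs,
    variogram_model="spherical",
    drift_terms=["external_Z"],          # modelled map enters as the drift
    external_drift=lur_surface,
    external_drift_x=gridx,
    external_drift_y=gridy,
)
fused_map, kriging_variance = uk.execute("grid", gridx, gridy)
```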

