Algorithm for real-time colour mixing of a five-channel LED system while optimising spectral quality parameters

2021 ◽  
pp. 147715352110580
Author(s):  
A Eissfeldt ◽  
TQ Khanh

Multichannel LED luminaires with more than three channels offer the advantage of varying the spectrum while keeping the chromaticity constant. However, the optimisation calculations for various quality metrics are a challenge for real-time implementation, especially given the limited resources of a luminaire’s microcontroller. Here, we present a method in which a five-channel system is approximated by a quickly solvable three-channel system: virtual channels are defined, each consisting of two LED channels. An analysis of the influence of the parameterisation of the virtual valences on various quality metrics is presented. It shows how these parameters must be set at the time of the mixing calculation in order to optimise the desired quality aspect. The mixing calculation can thus be carried out in real time without high hardware requirements and is suitable for further developments, for example, compensating for colour drift of the LEDs through sensor feedback.
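The virtual-channel idea can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the tristimulus vectors, the pairing of physical channels into virtual ones, and the blend parameters `alpha` and `beta` are all made-up assumptions.

```python
import numpy as np

# Illustrative XYZ tristimulus vectors (columns) for five LED channels;
# these values are invented for the sketch, not calibration data.
CHANNELS = np.array([
    [0.55, 0.45, 0.25, 0.15, 0.18],   # X
    [0.30, 0.40, 0.60, 0.35, 0.07],   # Y
    [0.01, 0.02, 0.08, 0.40, 0.90],   # Z
])

def mix_five_as_three(target_xyz, alpha, beta):
    """Solve a 5-channel mix via a virtual 3-channel system.

    alpha blends physical channels 0 and 1 into virtual channel A,
    beta blends channels 2 and 3 into virtual channel B,
    and channel 4 is used directly as virtual channel C.
    """
    v_a = alpha * CHANNELS[:, 0] + (1 - alpha) * CHANNELS[:, 1]
    v_b = beta * CHANNELS[:, 2] + (1 - beta) * CHANNELS[:, 3]
    v_c = CHANNELS[:, 4]
    M = np.column_stack([v_a, v_b, v_c])   # 3x3 system: cheap on a MCU
    w = np.linalg.solve(M, target_xyz)     # virtual channel weights
    # Expand the three virtual weights back to five physical duty cycles.
    return np.array([w[0] * alpha, w[0] * (1 - alpha),
                     w[1] * beta, w[1] * (1 - beta), w[2]])

target = np.array([0.35, 0.40, 0.30])
duties = mix_five_as_three(target, alpha=0.6, beta=0.5)
# The expanded five-channel mix reproduces the target tristimulus exactly;
# alpha and beta remain free to optimise a quality metric.
print(np.allclose(CHANNELS @ duties, target))
```

The key point the sketch shows: for any choice of `alpha` and `beta` the chromaticity constraint is met by a 3×3 solve, so the two blend parameters are free degrees of freedom left over for quality optimisation.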

TAPPI Journal ◽  
2019 ◽  
Vol 18 (11) ◽  
pp. 679-689
Author(s):  
CYDNEY RECHTIN ◽  
CHITTA RANJAN ◽  
ANTHONY LEWIS ◽  
BETH ANN ZARKO

Packaging manufacturers are challenged to achieve consistent strength targets and maximize production while reducing costs through smarter fiber utilization, chemical optimization, energy reduction, and more. With innovative instrumentation readily accessible, mills are collecting vast amounts of data that give them ever-increasing visibility into their processes. Turning this visibility into actionable insight is key to successfully exceeding customer expectations and reducing costs. Predictive analytics supported by machine learning can provide real-time quality measures that remain robust and accurate in the face of changing machine conditions. These adaptive quality “soft sensors” allow for more informed, on-the-fly process changes; fast change detection; and process control optimization without requiring periodic model tuning. The use of predictive modeling in the paper industry has increased in recent years; however, little attention has been given to packaging finished quality. The use of machine learning to maintain prediction relevancy under ever-changing machine conditions is novel. In this paper, we demonstrate the process of establishing real-time, adaptive quality predictions in an industry focused on reel-to-reel quality control, and we discuss the value created through the availability and use of real-time critical quality measures.
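The abstract does not name a particular learning algorithm. As one hedged illustration of an adaptive "soft sensor" that tracks drifting machine conditions without periodic retuning, a recursive least-squares (RLS) predictor with a forgetting factor could look like this; the model form and all parameters are assumptions for the sketch.

```python
import numpy as np

class RLSSoftSensor:
    """Recursive least-squares 'soft sensor': a linear quality predictor
    that updates with every new lab measurement, so it follows drifting
    process conditions without scheduled retraining."""

    def __init__(self, n_features, forgetting=0.99):
        self.w = np.zeros(n_features)
        self.P = np.eye(n_features) * 1000.0   # large initial covariance
        self.lam = forgetting                  # <1 discounts old data

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, y):
        # Standard RLS update with forgetting factor self.lam.
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)           # gain vector
        self.w += k * (y - self.w @ x)
        self.P = (self.P - np.outer(k, Px)) / self.lam

rng = np.random.default_rng(0)
sensor = RLSSoftSensor(n_features=3)
true_w = np.array([2.0, -1.0, 0.5])
for t in range(500):
    if t == 250:
        true_w = np.array([1.5, -0.5, 1.0])    # simulated machine-condition shift
    x = rng.normal(size=3)
    y = true_w @ x + rng.normal(scale=0.05)
    sensor.update(x, y)
print(np.round(sensor.w, 2))                    # tracks the shifted weights
```

The forgetting factor is what makes the sensor "adaptive": it trades a longer effective memory (better noise rejection) against faster tracking after a process shift.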


Water ◽  
2021 ◽  
Vol 13 (11) ◽  
pp. 1547
Author(s):  
Jian Sha ◽  
Xue Li ◽  
Man Zhang ◽  
Zhong-Liang Wang

Accurate real-time water quality prediction is of great significance for local environmental managers dealing with upcoming events and emergencies and developing best management practices. In this study, the performance of real-time water quality forecasting was compared across different deep learning (DL) models and different input data pre-processing methods. Three popular DL models were considered: the convolutional neural network (CNN), the long short-term memory neural network (LSTM), and a hybrid CNN–LSTM. Two types of input data were applied: the original one-dimensional time series, and a two-dimensional grey image based on decomposition by the complete ensemble empirical mode decomposition algorithm with adaptive noise (CEEMDAN). Each type of input data was used in each DL model to forecast the real-time monitored water quality parameters dissolved oxygen (DO) and total nitrogen (TN). The results showed that (1) the CNN–LSTM outperformed the standalone CNN and LSTM models; (2) models using CEEMDAN-based input data performed much better than models using the original input data, and the improvement for the non-periodic parameter TN was much greater than for the periodic parameter DO; and (3) model accuracy gradually decreased as the number of prediction steps increased, with accuracy decaying faster for the original input data than for the CEEMDAN-based input data, and faster for the non-periodic parameter TN than for the periodic parameter DO. Overall, preprocessing the input data with the CEEMDAN method effectively improved the forecasting performance of the deep learning models, and this improvement was especially significant for the non-periodic parameter TN.
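The data-shaping step can be sketched as follows. The real study uses CEEMDAN (available, for example, in the PyEMD package); the crude moving-average decomposition below is only a stand-in to illustrate how decomposed components of a sliding window are stacked into the 2D "grey image" that a CNN consumes instead of the raw 1D window.

```python
import numpy as np

def crude_decompose(series, widths=(3, 9, 27)):
    """Stand-in for CEEMDAN: split a series into detail components of
    decreasing frequency via successive moving-average smoothing, plus a
    final trend. Only the output shape mimics a real IMF decomposition."""
    components, residual = [], series.astype(float)
    for w in widths:
        kernel = np.ones(w) / w
        smooth = np.convolve(residual, kernel, mode="same")
        components.append(residual - smooth)   # detail at this scale
        residual = smooth
    components.append(residual)                # final trend
    return np.stack(components)                # (n_components, n_samples)

def to_grey_image(series, window=32):
    """Stack the decomposed components of one sliding window into a 2D
    array scaled to [0, 1] -- the 'grey image' input to the CNN."""
    comps = crude_decompose(series[-window:])
    lo, hi = comps.min(), comps.max()
    return (comps - lo) / (hi - lo + 1e-12)

t = np.arange(200)
rng = np.random.default_rng(1)
do_series = 8 + np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=200)  # DO-like daily cycle
img = to_grey_image(do_series)
print(img.shape)   # four components x window length
```

The point of the 2D representation is that periodic and non-periodic behaviour end up in different rows, so the CNN can learn scale-specific features; the paper's finding is that this helps most for the non-periodic parameter TN.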


2012 ◽  
Vol 20 (26) ◽  
pp. B543 ◽  
Author(s):  
R. Schmogrow ◽  
R. Bouziane ◽  
M. Meyer ◽  
P. A. Milder ◽  
P. C. Schindler ◽  
...  

Foods ◽  
2021 ◽  
Vol 10 (11) ◽  
pp. 2740
Author(s):  
Antonia Albrecht ◽  
Maureen Mittler ◽  
Martin Hebel ◽  
Claudia Waldhans ◽  
Ulrike Herbert ◽  
...  

The high perishability of fresh meat results in short sales and consumption periods, which can lead to large amounts of food waste, especially when a fixed best-before date is stated. Thus, the aim of this study was to develop a real-time dynamic shelf-life criterion (DSLC) for fresh pork filets based on a multi-model approach combining predictive microbiology and sensory modeling. To this end, 647 samples of modified-atmosphere-packed (MA-packed) pork loin were investigated in isothermal and non-isothermal storage trials. To identify the most suitable spoilage predictors, typical meat quality parameters (pH value, color, texture, and sensory characteristics) as well as microbial contamination (total viable count, Pseudomonas spp., lactic acid bacteria, Brochothrix thermosphacta, Enterobacteriaceae) were analyzed at specific investigation points. Dynamic modeling was conducted by combining the modified Gompertz model (microbial data) or a linear approach (sensory data) with the Arrhenius model. Based on these models, a four-point grading scale for the DSLC was developed to predict product status and shelf-life as a function of the temperature history in the supply chain. The applicability of the DSLC was validated in a pilot study under real chain conditions and showed accurate real-time prediction of product status.
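The modified Gompertz plus Arrhenius combination can be sketched as below. All numbers here (activation energy, reference growth rate, lag time, grading thresholds) are invented for illustration; the study fits such parameters to its pork data.

```python
import math

# Illustrative parameters -- NOT the fitted values from the study.
R = 8.314                    # gas constant, J/(mol*K)
EA = 80_000.0                # assumed activation energy, J/mol
LN_K_REF = math.log(0.02)    # assumed growth rate at the reference temperature
T_REF = 277.15               # reference temperature: 4 degC in kelvin

def arrhenius_rate(temp_c):
    """Arrhenius model: temperature-dependent maximum growth rate."""
    T = temp_c + 273.15
    return math.exp(LN_K_REF - EA / R * (1 / T - 1 / T_REF))

def gompertz_count(t_h, temp_c, n0=3.0, a=5.0, lag_h=24.0):
    """Modified Gompertz: predicted log10 count after t_h hours at temp_c,
    from initial level n0, asymptotic increase a, and lag phase lag_h."""
    mu = arrhenius_rate(temp_c)
    return n0 + a * math.exp(-math.exp(mu * math.e / a * (lag_h - t_h) + 1))

def dslc_grade(log_count):
    """Four-point dynamic shelf-life grade from a predicted log10 count;
    the thresholds are illustrative."""
    for grade, limit in ((1, 5.0), (2, 6.0), (3, 7.0)):
        if log_count < limit:
            return grade
    return 4   # spoiled

# Warmer storage over the same 96 h yields a worse grade:
print(dslc_grade(gompertz_count(96, temp_c=4)),
      dslc_grade(gompertz_count(96, temp_c=12)))
```

Feeding this chain a logged temperature profile instead of a constant temperature is what turns the fixed best-before date into a dynamic, chain-specific one.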


2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Wei Chen ◽  
Xiao Hao ◽  
JianRong Lu ◽  
Kui Yan ◽  
Jin Liu ◽  
...  

In order to solve the problems of high labor cost, long detection periods, and a low degree of informatization in current water environment monitoring, this paper proposes a lake water environment monitoring system based on LoRa and Internet of Things technology. The system provides remote collection, data storage, dynamic monitoring, and pollution alarms for distributed multisensor nodes (water temperature, pH, turbidity, conductivity, and other water quality parameters). An STM32L151C8T6 microprocessor and multiple types of water quality sensors collect the water quality parameters in real time, and the data are packaged and sent to the LoRa gateway over the LoRa link. The gateway then bridges the LoRa link to an IP link and forwards the water quality information to an Alibaba Cloud server. Finally, end users can supervise the monitored water area through the management platform. The experimental results show that the system performs well in terms of real-time data acquisition accuracy, data transmission reliability, and pollution alarm success rate. The average relative errors of water temperature, pH, turbidity, and conductivity are 0.31%, 0.28%, 3.96%, and 0.71%, respectively. In addition, the received signal strength within 2 km is better than −81 dBm, and the average packet loss rate is only 0.94%. In short, the system’s high accuracy, high reliability, and long-distance characteristics meet the needs of large-area water quality monitoring.
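Because LoRa uplinks carry only small payloads, readings are typically packed into a compact fixed-point frame before transmission. The sketch below shows one way to do that; the frame layout, field order, and scale factors are assumptions for illustration, not the paper's actual protocol.

```python
import struct

# Hypothetical 11-byte uplink frame: node id, sequence number, then four
# scaled readings. Little-endian, no padding.
PAYLOAD_FMT = "<BHhHHH"   # u8 id, u16 seq, i16 temp, u16 pH, u16 turb, u16 cond

def pack_reading(node_id, seq, temp_c, ph, turbidity_ntu, conductivity_us):
    """Pack one set of sensor readings with fixed-point scaling so the
    frame stays well under typical LoRa payload limits."""
    return struct.pack(PAYLOAD_FMT, node_id, seq,
                       int(round(temp_c * 100)),        # 0.01 degC steps, signed
                       int(round(ph * 100)),            # 0.01 pH steps
                       int(round(turbidity_ntu * 10)),  # 0.1 NTU steps
                       int(round(conductivity_us)))     # 1 uS/cm steps

def unpack_reading(frame):
    """Inverse operation, as the gateway would apply before forwarding
    the readings over IP to the cloud server."""
    node_id, seq, t, ph, tu, co = struct.unpack(PAYLOAD_FMT, frame)
    return node_id, seq, t / 100, ph / 100, tu / 10, float(co)

frame = pack_reading(7, 1024, 21.37, 7.12, 3.4, 512)
print(len(frame), unpack_reading(frame))
```

Keeping the frame this small also shortens air time, which helps both the packet loss rate and the duty-cycle limits LoRa bands impose.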


Author(s):  
S. Boubakri ◽  
H. Rhinane

The monitoring of water quality is, in most cases, managed in the laboratory and not on a real-time basis. Besides being lengthy, this process does not provide the specifications required to describe the evolution of the quality parameters of interest. This study presents the integration of Geographic Information Systems (GIS) with wireless sensor networks (WSN), aiming to create a system able to measure parameters such as temperature, salinity, and conductivity at the scale of a Moroccan catchment and transmit the information to a support station. This information is displayed and evaluated in a GIS using maps and a spatial dashboard to monitor the water quality in real time.


2022 ◽  
Vol 12 ◽  
Author(s):  
Silvia Seoni ◽  
Simeon Beeckman ◽  
Yanlu Li ◽  
Soren Aasmul ◽  
Umberto Morbiducci ◽  
...  

Background: Laser-Doppler Vibrometry (LDV) is a laser-based technique for measuring the motion of moving targets with high spatial and temporal resolution. To demonstrate its use for the measurement of carotid-femoral pulse wave velocity, a prototype system was employed in a clinical feasibility study. Data were acquired for analysis without prior quality control. Real-time application, however, will require real-time assessment of signal quality. In this study, we (1) use template matching and the matrix profile to assess the quality of these previously acquired signals; (2) analyze the nature and achievable quality of signals acquired at the carotid and femoral measuring sites; and (3) explore models for automated classification of signal quality.

Methods: Laser-Doppler Vibrometry data were acquired in 100 subjects (50M/50F) and consisted of 4–5 sequences of 20-s recordings of skin displacement, differentiated twice to yield acceleration. Each recording consisted of data from 12 laser beams, yielding 410 carotid-femoral and 407 carotid-carotid recordings. Data quality was visually assessed on a 1–5 scale, and a subset of the best-quality data was used to construct an acceleration template for each measuring site. The time-varying cross-correlation of the acceleration signals with the template was computed, and a quality metric was constructed from several features of this template matching. Next, the matrix-profile technique was applied to identify recurring features in the measured time series, and a similar quality metric was derived. The statistical distributions of the metrics, and their correlations with basic clinical data, were assessed. Finally, logistic-regression-based classifiers were developed and their ability to automatically classify LDV signal quality was assessed.

Results: The automated quality metrics correlated well with the visual scores. Signal quality was negatively correlated with BMI for femoral recordings but not for carotid recordings. Logistic regression models based on both methods yielded an accuracy of at least 80% for our carotid and femoral recording data, reaching 87% for the femoral data.

Conclusion: Both template matching and the matrix profile were found to be suitable methods for automated grading of LDV signal quality and generated a quality metric on par with the expert's assessment. The classifiers developed with both quality metrics showed their potential for future real-time implementation.
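The template-matching half of the pipeline can be sketched as a sliding normalized cross-correlation. The summary statistic below (mean of the top correlation peaks) is a simplification; the study combines several features of the correlation trace into its metric.

```python
import numpy as np

def template_quality(signal, template):
    """Quality metric from template matching: slide a normalized
    cross-correlation of the acceleration template over the recording and
    summarize how strongly the template recurs."""
    m = len(template)
    tpl = (template - template.mean()) / (template.std() + 1e-12)
    scores = []
    for i in range(len(signal) - m + 1):
        win = signal[i:i + m]
        win = (win - win.mean()) / (win.std() + 1e-12)
        scores.append(float(tpl @ win) / m)   # Pearson correlation in [-1, 1]
    # Simplified summary: mean of the ten highest correlation peaks.
    return float(np.sort(np.array(scores))[-10:].mean())

rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 2 * np.pi, 50))          # one clean "beat"
good = np.tile(template, 20) + 0.1 * rng.normal(size=1000)  # beats + noise
bad = rng.normal(size=1000)                                 # pure noise
print(template_quality(good, template) > template_quality(bad, template))
```

The matrix-profile variant replaces the fixed template with the signal's own most similar subsequences, which removes the need to construct a template per measuring site.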


Author(s):  
A. K. Singh ◽  
H. V. Kumar ◽  
G. R. Kadambi ◽  
J. K. Kishore ◽  
J. Shuttleworth ◽  
...  

In this paper, a quality metrics evaluation on hyperspectral images is presented using k-means clustering and segmentation. After classification, the similarity between the original image and the classified image is assessed by measuring image quality parameters. Experiments were carried out on four different types of hyperspectral images: aerial and spaceborne images with different spectral and geometric resolutions were considered for the quality metrics evaluation. Principal Component Analysis (PCA) was applied to reduce the dimensionality of the hyperspectral data, reducing the number of effective variables and hence the processing complexity. In the case of ordinary images, a human viewer plays an important role in quality evaluation; hyperspectral data, however, are generally processed by automatic algorithms and cannot be viewed directly by human viewers, so evaluating the quality of the classified image becomes even more significant. An elaborate comparison is made between k-means clustering and segmentation for all the images using Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), Maximum Squared Error, the ratio of squared norms (L2RAT), and entropy. The first four parameters are calculated by comparing the original hyperspectral image with the classified image; entropy, a measure of uncertainty or randomness, is calculated for the classified image alone. The proposed methodology can be used to assess the performance of any hyperspectral image classification technique.
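Three of the metrics named above are standard and easy to state in code. The sketch below computes MSE, PSNR, and histogram entropy for one band; the "classified" image is a crude grey-level quantization standing in for a real k-means label map, purely to exercise the metrics.

```python
import numpy as np

def mse(a, b):
    """Mean square error between two images of equal shape."""
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB between original and classified band."""
    e = mse(a, b)
    return float("inf") if e == 0 else 10 * np.log10(peak ** 2 / e)

def entropy(img):
    """Shannon entropy (bits) of the grey-level histogram of an 8-bit image."""
    counts = np.bincount(img.ravel(), minlength=256)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
band = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in band
# Crude stand-in for k-means classification: quantize to 8 grey levels.
classified = (band // 32) * 32 + 16
print(round(psnr(band, classified), 1), "dB;", round(entropy(classified), 3), "bits")
```

With 8 output levels the classified band's entropy is bounded by log2(8) = 3 bits, which is why entropy is reported for the classified image alone: it summarizes how much variability the classification retains.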


2021 ◽  
Author(s):  
Annet M Nankya ◽  
Luke Nyakarahuka ◽  
Stephen Balinandi ◽  
John Kayiwa ◽  
Julius Lutwama ◽  
...  

Abstract

Background: Coronavirus Disease 2019 (COVID-19) in Uganda was first reported in a male traveler from Dubai on 21st March 2020, shortly after the WHO had declared the condition a global pandemic. Timely laboratory diagnosis of COVID-19 for all samples, from both symptomatic and asymptomatic patients, was seen as key to containing the pandemic and breaking the chain of transmission. However, low- and middle-income countries faced the challenge of limited resources for SARS-CoV-2 testing. To mitigate this, a study was conducted to evaluate a sample pooling strategy for COVID-19 using real-time PCR. The cost implications and turnaround times of pooled versus individual sample testing were also compared.

Methods: In this study, 1260 randomly selected samples submitted to the Uganda Virus Research Institute for analysis were batched in pools of 5, 10, and 15. The pools were then extracted using a Qiagen kit. Both individual and pooled RNA were screened for the SARS-CoV-2 E gene using a Berlin kit.

Results: Out of the 1260 samples tested, 21 pools were positive among the pools of 5 samples, 16 among the pools of 10, and 14 among the pools of 15. The study also showed that the pooling strategy saves substantial resources and time and expands diagnostic capacity without affecting the sensitivity of the test in areas of low SARS-CoV-2 prevalence.

Conclusion: This study demonstrated that, compared to individual testing, the pooling strategy for COVID-19 reduced turnaround time and substantially increased overall testing capacity with limited resources.
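The resource saving from pooling follows from classical Dorfman two-stage testing arithmetic, sketched below. This is the textbook idealization (independent infections, perfectly sensitive test), not a calculation from the study's own data.

```python
def expected_tests_per_sample(pool_size, prevalence):
    """Dorfman pooling: expected number of PCR tests per specimen when
    pools of `pool_size` are tested first and only positive pools are
    retested sample by sample."""
    p_pool_positive = 1 - (1 - prevalence) ** pool_size
    return 1 / pool_size + p_pool_positive

# At 1% prevalence, pooling cuts the test count dramatically, and the
# benefit levels off as pools grow (larger pools are positive more often).
for n in (5, 10, 15):
    t = expected_tests_per_sample(n, prevalence=0.01)
    print(f"pool of {n:2d}: {t:.3f} tests/sample ({(1 - t) * 100:.0f}% saved)")
```

The same formula shows why the strategy is recommended specifically for low-prevalence settings: as prevalence rises, most pools test positive and the saving evaporates.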

