Sewage Detection System Design Based on Large-Scale Embedded Microcontroller

2014 ◽  
Vol 651-653 ◽  
pp. 507-510
Author(s):  
Xiao Ling Xia

A sewage detection system based on a large-scale embedded microcontroller was designed. With increasingly stringent national environmental regulations, existing detection methods can no longer meet emissions-compliance requirements, so new detection technologies are needed to deliver faster, more stable, and more reliable sewage discharge test results. This paper applies embedded microcontroller technology to the testing of chemical plant wastewater discharge, which can effectively control effluent quality, meet environmental requirements, and support stable, energy-saving production. Practice has shown that a sewage detection system based on a large-scale embedded microcontroller has strong monitoring capabilities.
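
A minimal sketch of the threshold-check loop such a system might run is given below. The sensor set, discharge limits, and the read_sensors() stub are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: one effluent sample checked against assumed discharge limits.
DISCHARGE_LIMITS = {"ph_min": 6.0, "ph_max": 9.0, "cod_mg_l": 100.0, "turbidity_ntu": 50.0}

def read_sensors():
    """Stub standing in for the microcontroller's ADC reads (assumed values)."""
    return {"ph": 7.2, "cod_mg_l": 85.0, "turbidity_ntu": 12.0}

def check_effluent(sample):
    """Return a list of limit violations for one sample (empty if compliant)."""
    violations = []
    if not DISCHARGE_LIMITS["ph_min"] <= sample["ph"] <= DISCHARGE_LIMITS["ph_max"]:
        violations.append("pH out of range")
    if sample["cod_mg_l"] > DISCHARGE_LIMITS["cod_mg_l"]:
        violations.append("COD above limit")
    if sample["turbidity_ntu"] > DISCHARGE_LIMITS["turbidity_ntu"]:
        violations.append("turbidity above limit")
    return violations

print(check_effluent(read_sensors()) or "effluent within limits")
```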

2014 ◽  
Vol 3 (3) ◽  
Author(s):  
Agelos Papaioannou ◽  
George Rigas ◽  
Panagiotis Papastergiou ◽  
Christos Hadjichristodoulou

Background. Worldwide, the aim of managing water is to safeguard human health whilst maintaining sustainable aquatic and associated terrestrial ecosystems. Because human enteric viruses are the pathogens most likely responsible for waterborne diseases from recreational water use, yet their detection methods are too complex and costly for routine monitoring, it is of great interest to determine the quality of coastal bathing water at minimum cost and maximum safety.
Design and methods. This study assesses and models the microbiological quality data of 2149 seawater bathing areas in Greece over a 10-year period (1997-2006) using chemometric methods.
Results. Cluster analysis indicated that the studied bathing beaches are classified into three groups according to seasonality. Factor analysis was applied to investigate possible determining factors in the groups resulting from the cluster analysis, and two new parameters were created in each group: VF1 includes E. coli, faecal coliforms and total coliforms, and VF2 includes faecal streptococci/enterococci. By applying cluster analysis within each seasonal group, three new groups of coasts were generated: group A (ultraclean), group B (clean) and group C (contaminated).
Conclusions. The above analysis is confirmed by the application of discriminant analysis, and shows that chemometric methods are useful tools for assessing and modelling microbiological quality data of coastal bathing water on a large scale, and thus could contribute to effective and economical monitoring of coastal bathing water quality in a country with a large number of bathing coasts, like Greece.
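
The chemometric pipeline described above (cluster analysis, then factor analysis, then discriminant analysis as confirmation) can be sketched as follows; the synthetic data and scikit-learn calls are illustrative stand-ins for the study's 10-year monitoring records.

```python
# Hedged sketch of the clustering -> factor analysis -> discriminant analysis pipeline.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Columns mimic the indicators named above: E. coli, faecal coliforms,
# total coliforms, faecal streptococci/enterococci (log10-scaled counts).
X = np.log10(rng.lognormal(mean=2.0, sigma=1.0, size=(300, 4)))

groups = AgglomerativeClustering(n_clusters=3).fit_predict(X)  # grouping stand-in

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)     # VF1/VF2 analogue
print("factor loadings:\n", fa.components_.round(2))

lda = LinearDiscriminantAnalysis().fit(X, groups)              # confirmation step
print("resubstitution accuracy:", round(lda.score(X, groups), 3))
```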


2013 ◽  
Vol 438-439 ◽  
pp. 1084-1088
Author(s):  
Ummin Okumura ◽  
Yu Jie Qi ◽  
Yun Long ◽  
Tian Hang Zhang

Based on the LabVIEW platform, an intelligent roller detection system was developed. The system readily provides fast nondestructive testing of subgrade compaction degree, roller speed, roller compaction trajectory, compaction passes, and real-time GPS positioning, as well as saving and printing report forms. Compared with traditional detection methods, this system tests and controls on-site compaction quality far more easily, speeding up construction progress, improving subgrade compaction quality, and enabling better control and management of compaction work.
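
One of the listed functions, counting compaction passes from GPS fixes, can be illustrated with a short sketch; the real system is a LabVIEW implementation, and the grid resolution and track below are assumptions.

```python
# Hedged sketch: tally compaction passes per grid cell from a GPS track.
from collections import defaultdict

CELL = 0.5  # grid resolution in metres (assumed value)

def count_passes(gps_track, cell=CELL):
    """gps_track: list of (x, y) positions in metres; returns passes per cell."""
    passes = defaultdict(int)
    last_cell = None
    for x, y in gps_track:
        key = (int(x // cell), int(y // cell))
        if key != last_cell:  # count a cell once per entry, not once per fix
            passes[key] += 1
            last_cell = key
    return passes

track = [(0.1, 0.1), (0.3, 0.1), (0.7, 0.1), (1.2, 0.1), (0.7, 0.1)]
print(dict(count_passes(track)))
```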


2015 ◽  
Vol 9 (1) ◽  
pp. 697-702
Author(s):  
Guodong Sun ◽  
Wei Xu ◽  
Lei Peng

The traditional quality detection method for transparent Nonel tubes relies on human vision, which is inefficient and susceptible to subjective factors. Especially for Nonel tubes filled with explosive, missed defects can create danger in blasting engineering. The factors affecting the quality of Nonel tubes are mainly the uniformity of explosive filling and the external diameter of the tubes. Existing detection methods, such as the Scalar method, the Analysis method and infrared detection technology, suffer from low detection accuracy, low efficiency and limited detection items. To overcome these drawbacks, a new quality detection system for Nonel tubes has been developed based on machine vision. First, the system architecture for quality detection is presented. Then a detection method for explosive dosage, with the relevant criteria, is proposed based on the mapping between explosive dosage and gray value, in order to detect excessive-explosive faults, insufficient-explosive faults and black spots. Finally, an algorithm based on image processing is designed to measure the external diameter of Nonel tubes. Experiments and practical operation at several Nonel tube manufacturers have shown that the defect recognition rate of the proposed system exceeds 95% at a detection speed of 100 m/min, and that system performance meets the quality detection requirements of Nonel tubes. This quality detection method can therefore save human resources and ensure the quality of Nonel tubes.
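
The two measurements underlying the criteria, mean gray value as a proxy for explosive dosage and external diameter from edge positions in a scan line, can be sketched as below; the linear calibration, threshold, and pixel scale are illustrative assumptions, not the paper's values.

```python
# Hedged sketch of the two measurements on one line of a grayscale scan image.
import numpy as np

def dosage_proxy(gray_row, a=-0.8, b=220.0):
    """Map mean gray value to a dosage estimate via an assumed linear calibration."""
    return a * float(np.mean(gray_row)) + b

def external_diameter(gray_row, edge_thresh=60, mm_per_px=0.05):
    """Estimate tube diameter from the outermost dark pixels (assumed scale)."""
    dark = np.where(gray_row < edge_thresh)[0]
    if dark.size < 2:
        return 0.0
    return (dark[-1] - dark[0]) * mm_per_px

row = np.full(400, 200, dtype=np.uint8)
row[120:280] = 40                # synthetic tube shadow across the scan line
print(external_diameter(row))    # width of the dark span, in mm
print(round(dosage_proxy(row[120:280]), 1))
```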


Biosensors ◽  
2021 ◽  
Vol 11 (9) ◽  
pp. 346
Author(s):  
Mohd Syafiq Awang ◽  
Yazmin Bustami ◽  
Hairul Hisham Hamzah ◽  
Nor Syafirah Zambry ◽  
Mohamad Ahmad Najib ◽  
...  

Large-scale food-borne outbreaks caused by Salmonella are rarely seen nowadays, thanks to advanced medical systems. However, small, localised outbreaks still occur in certain regions and could pose a serious threat to public health if eradication measures are not initiated. This review discusses the progress of Salmonella detection approaches, covering their basic principles, characteristics, applications, and performance. Conventional Salmonella detection is usually performed using a culture-based method, which is time-consuming, labour-intensive, and unsuitable for on-site testing and high-throughput analysis. To date, many detection methods with unique detection systems are available for Salmonella detection, utilising immunological techniques, molecular techniques, mass spectrometry, spectroscopy, optical phenotyping, and biosensor methods. Electrochemical biosensors have attracted growing interest for Salmonella detection, mainly owing to their excellent sensitivity, rapidity, and portability. The use of highly specific bioreceptors, such as aptamers, and the application of nanomaterials contribute to these excellent characteristics. Furthermore, the types of biorecognition elements, the principles of electrochemical transduction elements, and the miniaturisation potential of electrochemical biosensors are discussed.


Author(s):  
Joanna Mabe ◽  
Keefe Murphy ◽  
Gareth Williams ◽  
Andrew Welsh

This paper describes the process of incremental pipeline filling and the phased commissioning of a real-time leak detection system for the 1768 km long BTC crude oil pipeline. Due to stringent environmental requirements, it is essential for the leak detection system to work from the moment that crude oil is introduced into the pipeline. Without any prior operational data and with the pipeline partially filled, it is challenging for the leak detection system to monitor the integrity of the pipeline throughout the whole filling process. The application of the pig tracking software to track the oil front as the crude displaces nitrogen is also discussed.
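
One simple way to track the oil front during line fill is a volume balance: injected volume divided by the pipe's cross-sectional area gives the front's position. The sketch below illustrates that idea under assumed pipe dimensions (not BTC values), and is not the pig tracking software itself.

```python
# Hedged sketch: volume-balance estimate of the oil front position during fill.
import math

def front_position_km(injected_m3, internal_diameter_m=1.0):
    """Assumes a uniform internal diameter; real lines vary along the route."""
    area_m2 = math.pi * (internal_diameter_m / 2) ** 2
    return injected_m3 / area_m2 / 1000.0  # metres -> kilometres

print(round(front_position_km(injected_m3=250_000), 1))  # ~318.3 km into the line
```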


Symmetry ◽  
2021 ◽  
Vol 13 (8) ◽  
pp. 1352
Author(s):  
Semih Yavuzkilic ◽  
Abdulkadir Sengur ◽  
Zahid Akhtar ◽  
Kamran Siddique

Deepfakes are widely regarded as a harmful application of artificial neural networks: a form of image or video manipulation in which a person's face is altered or swapped with another person's. Deepfake manipulations can be produced with a variety of techniques and applications, and the quintessential countermeasure is a deepfake detection method. Most existing detection methods perform well under symmetric data distributions but are not robust to asymmetric dataset variations or to novel deepfake/manipulation types. In this paper, a new multi-stream deep learning algorithm is developed for identifying fake faces in videos, in which three streams are merged at the feature level using a fusion layer. After the fusion layer, fully connected, Softmax, and classification layers are used to classify the data. The pre-trained VGG16 model is adopted for the first transferred CNN stream; in transfer learning, the weights of the pre-trained CNN model are reused for training on the new classification problem. In the second stream (transferred CNN2), the pre-trained VGG19 model is used, and in the third stream, the pre-trained ResNet18 model is considered. This paper also introduces a new large-scale dataset, the World Politicians Deepfake Dataset (WPDD), to improve deepfake detection systems. The dataset was created by downloading videos of 20 different politicians from YouTube; over 320,000 frames were retrieved after dividing the downloaded videos into short segments and extracting the frames. Finally, various manipulations were applied to these frames, resulting in seven separate manipulation classes for men and women. In the experiments, three fake-face detection scenarios are investigated: first, discrimination between fake and real faces; second, discrimination among seven face manipulations (age, beard, face swap, glasses, hair color, hairstyle, smiling) and genuine faces; and third, the performance of the deepfake detection system under a novel type of face manipulation. The proposed strategy outperforms prior methods, with calculated performance metrics above 99%.
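
A minimal PyTorch sketch of the described three-stream, feature-level fusion (VGG16, VGG19, ResNet18 backbones) follows; the two-class head and pooling details are simplifications, and pretrained weights would be loaded in practice as in the paper.

```python
# Hedged sketch of three-stream feature-level fusion for fake-face classification.
import torch
import torch.nn as nn
from torchvision import models

class ThreeStreamFusion(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        # weights=None keeps the sketch offline; the paper uses pretrained weights.
        self.s1 = models.vgg16(weights=None).features
        self.s2 = models.vgg19(weights=None).features
        resnet = models.resnet18(weights=None)
        self.s3 = nn.Sequential(*list(resnet.children())[:-1])  # drop the fc layer
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Linear(512 * 3, num_classes)  # fusion layer -> classifier

    def forward(self, x):
        f1 = self.pool(self.s1(x)).flatten(1)  # 512-d VGG16 features
        f2 = self.pool(self.s2(x)).flatten(1)  # 512-d VGG19 features
        f3 = self.s3(x).flatten(1)             # 512-d ResNet18 features
        return self.head(torch.cat([f1, f2, f3], dim=1))  # feature-level fusion

model = ThreeStreamFusion()
print(model(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 2])
```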


Database ◽  
2017 ◽  
Author(s):  
Qingyu Chen ◽  
Justin Zobel ◽  
Karin Verspoor

Duplication of information in databases is a major data quality challenge. The presence of duplicates, implying either redundancy or inconsistency, can have a range of impacts on the quality of analyses that use the data. To provide a sound basis for research on this issue in databases of nucleotide sequences, we have developed new, large-scale validated collections of duplicates, which can be used to test the effectiveness of duplicate detection methods. Previous collections were either designed primarily to test efficiency, or contained only a limited number of duplicates of limited kinds. To date, duplicate detection methods have been evaluated on separate, inconsistent benchmarks, leading to results that cannot be compared and, due to limitations of the benchmarks, of questionable generality. In this study, we present three nucleotide sequence database benchmarks, based on information drawn from a range of resources, including information derived from mapping to two data sections within the UniProt Knowledgebase (UniProtKB), UniProtKB/Swiss-Prot and UniProtKB/TrEMBL. Each benchmark has distinct characteristics. We quantify these characteristics and argue for their complementary value in evaluation. The benchmarks collectively contain a vast number of validated biological duplicates; the largest has nearly half a billion duplicate pairs (although this is probably only a tiny fraction of the total that is present). They are also the first benchmarks targeting the primary nucleotide databases. The records include the 21 most heavily studied organisms in molecular biology research. Our quantitative analysis shows that duplicates in the different benchmarks, and in different organisms, have different characteristics. It is thus unreliable to evaluate duplicate detection methods against any single benchmark. For example, the benchmark derived from UniProtKB/Swiss-Prot mappings identifies more diverse types of duplicates, showing the importance of expert curation, but is limited to coding sequences. Overall, these benchmarks form a resource that we believe will be of great value for development and evaluation of the duplicate detection or record linkage methods that are required to help maintain these essential resources. Database URL: https://bitbucket.org/biodbqual/benchmarks
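
As an illustration of how a benchmark of validated duplicate pairs supports evaluation, a detection method's predicted pairs can be scored against the labelled pairs; the accession pairs below are made up, and the real benchmarks are at the URL above.

```python
# Hedged sketch: precision/recall of predicted duplicate pairs against a benchmark.
def evaluate(predicted_pairs, benchmark_pairs):
    """Both arguments are iterables of (accession, accession) tuples."""
    predicted, truth = set(predicted_pairs), set(benchmark_pairs)
    tp = len(predicted & truth)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall

benchmark = {("AB000001", "AB000002"), ("XM000003", "XM000009")}  # invented IDs
predicted = {("AB000001", "AB000002"), ("AB000001", "XM000003")}
print(evaluate(predicted, benchmark))  # (0.5, 0.5)
```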


2016 ◽  
Author(s):  
Qingyu Chen ◽  
Justin Zobel ◽  
Karin Verspoor

Duplication of information in databases is a major data quality challenge. The presence of duplicates, implying either redundancy or inconsistency, can have a range of impacts on the quality of analyses that use the data. To provide a sound basis for research on this issue in databases of nucleotide sequences, we have developed new, large-scale validated collections of duplicates, which can be used to test the effectiveness of duplicate detection methods. Previous collections were either designed primarily to test efficiency, or contained only a limited number of duplicates of limited kinds. To date, duplicate detection methods have been evaluated on separate, inconsistent benchmarks, leading to results that cannot be compared and, due to limitations of the benchmarks, of questionable generality.
In this study, we present three nucleotide sequence database benchmarks, based on information drawn from a range of resources, including information derived from mapping to Swiss-Prot and TrEMBL. Each benchmark has distinct characteristics. We quantify these characteristics and argue for their complementary value in evaluation. The benchmarks collectively contain a vast number of validated biological duplicates; the largest has nearly half a billion duplicate pairs (although this is probably only a tiny fraction of the total that is present). They are also the first benchmarks targeting the primary nucleotide databases. The records include the 21 most heavily studied organisms in molecular biology research. Our quantitative analysis shows that duplicates in the different benchmarks, and in different organisms, have different characteristics. It is thus unreliable to evaluate duplicate detection methods against any single benchmark. For example, the benchmark derived from Swiss-Prot mappings identifies more diverse types of duplicates, showing the importance of expert curation, but is limited to coding sequences. Overall, these benchmarks form a resource that we believe will be of great value for development and evaluation of the duplicate detection methods that are required to help maintain these essential resources.
Availability: The benchmark data sets are available at https://bitbucket.org/biodbqual/benchmarks.


2018 ◽  
Vol 32 (30) ◽  
pp. 1850363
Author(s):  
Dongjie Li ◽  
Cong Liu

Manual methods for detecting the quality of micro-nano parts suffer from low efficiency and high labour intensity because of the parts' small size and fragile structure. To address these disadvantages, computer microscopic vision is introduced into the detection system in this paper, efficiently capturing image information of the parts into the computer system. The parts to be inspected are conveyed on a rotating material platform driven by a stepping motor. A CNN based on deep learning is used to detect surface quality and classify part defects from the image information, improving detection accuracy and reducing human workload compared with both traditional manual inspection and the edge-detection methods used by earlier researchers.
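
A hedged sketch of a small CNN of the kind applied here to microscope images follows; the architecture, input size, and number of defect classes are illustrative assumptions, not the authors' network.

```python
# Hedged sketch: a compact CNN classifying grayscale part-image patches by defect type.
import torch
import torch.nn as nn

class DefectCNN(nn.Module):
    def __init__(self, num_defect_classes=4):  # class count is an assumption
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_defect_classes)

    def forward(self, x):  # x: (N, 1, 64, 64) grayscale patches
        return self.classifier(self.features(x).flatten(1))

model = DefectCNN()
print(model(torch.randn(2, 1, 64, 64)).shape)  # torch.Size([2, 4])
```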


Author(s):  
Chris Apps ◽  
Istemi Ozkan ◽  
Tania Rizwan ◽  
Marzie Derakhshesh ◽  
Scott Medynski

When it comes to evaluating traditional computational leak detection technologies, pipeline operators have a suite of simulated testing methods available. In the last several years, however, external leak detection technologies have matured and could potentially provide operators with another layer of leak detection, with greater sensitivity than traditional methods. The challenge with these technologies lies in evaluating their sensitivity, reliability, and robustness. Enbridge Inc. (Enbridge) and C-FER Technologies 1999 Inc. (C-FER) began a comprehensive study in early 2012 to assess state-of-the-art external, continuously distributed sensors for leak detection. Initially, a technology review was undertaken to identify commercial, off-the-shelf technologies with the potential to detect small leaks of oil from buried pipelines. From this literature review, four technologies were identified: Distributed Temperature Sensing (DTS), Distributed Acoustic Sensing (DAS), Vapor Sensing Tubes (VST), and Hydrocarbon Sensing Cables (HSC). All four methods require proprietary materials and technology, and have had limited independent testing to date. To evaluate these four leak detection methods and their vendors objectively, Enbridge and C-FER initiated the design and construction of a large-scale External Leak Detection Experimental Research (ELDER) apparatus that can accommodate a full-size segment of pipeline within a trench, at the same scale used in pipeline construction in North America. An instrumented pipe segment is buried in the trench with sensing cables laid alongside. The apparatus generates leaks with controlled variables, including rate, pressure, and temperature, at various locations, to accurately represent pipeline leaks. This paper summarizes the literature review on the four selected leak detection technologies that were identified as candidates for large-scale evaluation. The discussion also covers features of the ELDER apparatus and the re-engineered pipeline construction techniques required to accurately represent a full-scale pipeline trench within a laboratory environment.

