Parallel Outlier Detection for Streamed Data Using Non-Parameterized Approach

2017 ◽  
Vol 8 (2) ◽  
pp. 25-37
Author(s):  
Harshad Dattatray Markad ◽  
S. M. Sangve

Outlier detection is used in various applications such as fraud detection, network analysis, network traffic monitoring, manufacturing, and environmental monitoring. The data streams generated in these settings are continuous and change over time, which makes it very difficult to detect outliers in data that is both huge and continuous in nature. Streamed data arrives in real time and evolves over time, so it is impractical to store it all and analyze it afterwards for abnormal behavior. Limited storage space forces real-time analysis, with data processed on a first-come, first-served (FCFS) basis. Decisions about abnormal behavior must be made very quickly, within a limited time frame, over an effectively infinite set of data streams arriving over the network. Detecting outliers on a real-time basis is therefore a challenging task, and the approach in this paper addresses it by exploiting the processing power of the graphics processing unit (GPU). The algorithm used in this paper applies a kernel function to accomplish the task and produces timely results on high-speed, multi-dimensional data. The method increases the speed of outlier detection by a factor of 20, and the speedup grows with the number of data attributes and the input data rate.
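The abstract does not specify which kernel function the method uses, so as a rough illustration of kernel-based outlier scoring over a stream window, a Gaussian kernel density estimate can flag points that fall in low-density regions (a minimal NumPy sketch; the function name, bandwidth, and windowing are assumptions, and the paper's GPU parallelization is not reproduced):

```python
import numpy as np

def kde_outlier_scores(window, points, bandwidth=1.0):
    """Score each query point by a Gaussian kernel density estimated
    over the current sliding window; low density implies an outlier.
    Illustrative sketch only: the paper's GPU kernel method is not
    specified in the abstract."""
    window = np.asarray(window, dtype=float)
    points = np.asarray(points, dtype=float)
    # pairwise squared distances between query points and window samples
    d2 = ((points[:, None, :] - window[None, :, :]) ** 2).sum(axis=2)
    dens = np.exp(-d2 / (2.0 * bandwidth ** 2)).mean(axis=1)
    return -dens  # higher score = more outlying

rng = np.random.default_rng(0)
win = rng.normal(size=(200, 3))                      # recent stream window
queries = np.vstack([np.zeros(3), 8 * np.ones(3)])  # inlier vs. far point
scores = kde_outlier_scores(win, queries)
print(scores)
```

On a GPU, the pairwise-distance and reduction steps above are the natural candidates for parallelization, which is consistent with the speedup growing with the number of attributes and the input rate.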

2021 ◽  
Vol 11 (7) ◽  
pp. 3122
Author(s):  
Srujana Neelam ◽  
Audrey Lee ◽  
Michael A. Lane ◽  
Ceasar Udave ◽  
Howard G. Levine ◽  
...  

Since opportunities for spaceflight experiments are scarce, ground-based microgravity simulation devices (MSDs) offer accessible and economical alternatives for gravitational biology studies. Among the MSDs, the random positioning machine (RPM) provides simulated microgravity conditions on the ground by rotating biological samples randomly about two axes to distribute the Earth’s gravity vector in all directions over time. Real-time microscopy and image acquisition during microgravity simulation are of particular interest to enable the study of how basic cell functions, such as division, migration, and proliferation, progress under altered gravity conditions. However, these capabilities have been difficult to implement due to the constantly moving frames of the RPM as well as mechanical noise. Therefore, we developed an image acquisition module that can be mounted on an RPM to capture live images over time while the specimen is in the simulated microgravity (SMG) environment. This module integrates a digital microscope with a magnification range of 20× to 700×, a high-speed data transmission adaptor for the wireless streaming of time-lapse images, and a backlight illuminator to view the sample under brightfield and darkfield modes. With this module, we successfully demonstrated real-time imaging of human cells cultured on an RPM in brightfield for up to 80 h, and also visualized them in the green fluorescence channel. The module was successful in monitoring cell morphology and in quantifying the rates of cell division, cell migration, and wound healing in SMG. It can easily be modified to study the response of other biological specimens to SMG.


Designs ◽  
2021 ◽  
Vol 5 (1) ◽  
pp. 15
Author(s):  
Andreas Thoma ◽  
Abhijith Moni ◽  
Sridhar Ravi

Digital Image Correlation (DIC) is a powerful tool used to evaluate displacements and deformations in a non-intrusive manner. By comparing two images, one from the undeformed reference state of the sample and the other from the deformed target state, the relative displacement between the two states is determined. DIC is well known and often used for post-processing analysis of in-plane displacements and deformation of a specimen. Increasing the analysis speed to enable real-time DIC would be beneficial and would expand the scope of the method. Here, we tested the most common DIC methods in combination with different parallelization approaches in MATLAB and evaluated their performance to determine whether real-time analysis is possible with these methods. The effects of computing with different hardware settings were also analyzed and discussed. We found that implementation problems can reduce the efficiency of a theoretically superior algorithm, such that it becomes practically slower than a sub-optimal algorithm. The Newton–Raphson algorithm in combination with a modified particle swarm algorithm and parallel image computation was found to be most effective. This is contrary to theory, which suggests that the inverse-compositional Gauss–Newton algorithm is superior. As expected, the brute-force search algorithm is the least efficient method. We also found that the correct choice of parallelization tasks is critical to attaining improvements in computing speed: a poorly chosen parallelization approach with high parallel overhead leads to inferior performance. Finally, irrespective of the computing mode, the correct choice of combinations of integer-pixel and sub-pixel search algorithms is critical for efficient analysis. Real-time DIC analysis will be difficult on computers with standard computing capabilities, even if parallelization is implemented, so the suggested solution is to use graphics processing unit (GPU) acceleration.
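As a point of reference for the integer-pixel search stage discussed above, a brute-force search maximizes the zero-normalized cross-correlation (ZNCC) of a reference subset over a search window (a minimal NumPy sketch with assumed subset and search sizes; the paper's MATLAB implementations, sub-pixel refinement, and parallelization are not reproduced):

```python
import numpy as np

def zncc(a, b):
    # zero-normalized cross-correlation between two equal-size patches
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def integer_pixel_search(ref, tgt, top, left, size, search=5):
    """Brute-force integer-pixel DIC: slide a reference subset over
    the target within +/-search pixels and keep the best ZNCC match.
    Sketch of the integer-pixel stage only."""
    subset = ref[top:top + size, left:left + size]
    best, best_uv = -2.0, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            r, c = top + dv, left + du
            if r < 0 or c < 0 or r + size > tgt.shape[0] or c + size > tgt.shape[1]:
                continue
            score = zncc(subset, tgt[r:r + size, c:c + size])
            if score > best:
                best, best_uv = score, (du, dv)
    return best_uv, best

rng = np.random.default_rng(1)
ref = rng.random((64, 64))                      # speckle-like reference image
tgt = np.roll(ref, shift=(2, 3), axis=(0, 1))   # known displacement u=3, v=2
uv, score = integer_pixel_search(ref, tgt, 20, 20, 16)
print(uv, score)
```

The doubly nested loop over subsets is what makes brute force the least efficient method; it is also embarrassingly parallel per subset, which is why the choice of parallelization granularity matters so much in practice.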


Author(s):  
Manudul Pahansen de Alwis ◽  
Karl Garme

The stochastic environmental conditions, together with craft design and operational characteristics, make it difficult to predict the vibration environments aboard high-performance marine craft, particularly the risk of impact acceleration events and the shock component of the exposure, which are often associated with structural failure and human injury. The different timescales and magnitudes involved complicate the real-time analysis of vibration and shock conditions aboard these craft. This article introduces a new measure, the severity index, indicating the risk of severe impact acceleration, and proposes a method for real-time feedback on the severity of impact exposure together with the accumulated vibration exposure. The method analyzes the immediate 60 s of vibration exposure history and computes the severity of impact exposure for the present state based on the severity index. The severity index probes the characteristics of the present acceleration stochastic process, that is, the risk of an upcoming heavy impact, and serves as an alert to the crew. The accumulated vibration exposure, important for mapping and logging crew exposure, is determined by the ISO 2631:1997 vibration dose value. The severity due to impact and accumulated vibration exposure is communicated to the crew every second as a color-coded indicator: green, yellow and red, representing low, medium and high, based on defined impact and dose limits. The severity index and feedback method are developed and validated on a data set of 27 three-hour simulations of a planing craft in irregular waves, and their feasibility in real-world applications is verified with full-scale acceleration data recorded aboard high-speed planing craft in operation.
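The accumulated exposure measure referenced above, the vibration dose value, is defined as VDV = (∫ a_w(t)⁴ dt)^(1/4); a minimal sketch of that computation over a 60 s window follows (the standard's frequency-weighting filters are omitted, and the signal is a hypothetical example, not the article's data):

```python
import numpy as np

def vibration_dose_value(accel, fs):
    """Vibration dose value, VDV = (integral of a_w(t)^4 dt)^(1/4).
    `accel` is assumed to already be the frequency-weighted
    acceleration in m/s^2 (weighting filters omitted in this sketch);
    `fs` is the sampling rate in Hz."""
    accel = np.asarray(accel, dtype=float)
    return (np.sum(accel ** 4) / fs) ** 0.25

fs = 100.0
t = np.arange(0, 60.0, 1.0 / fs)        # a 60 s exposure window
a = 0.5 * np.sin(2 * np.pi * 4.0 * t)   # hypothetical 4 Hz, 0.5 m/s^2 signal
vdv = vibration_dose_value(a, fs)
print(vdv)
```

The fourth-power weighting makes the VDV far more sensitive to short, heavy impacts than an r.m.s. measure, which is exactly why it suits logging shock-laden exposure aboard planing craft.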


Author(s):  
Chen Yuan ◽  
Jun Wu

Abstract A real-time hard X-ray (HXR) tomographic system is designed for the HL-2A tokamak, dedicated to the real-time tomography of fast-electron bremsstrahlung radiation during the lower hybrid (LH) driven mode within the energy range of 20 keV to 200 keV. The system measures HXR energy along 12 chords on the equatorial plane of the reaction region. The spatial and temporal resolutions of the system are 2 cm and 10 ms, respectively. HXR detection is accomplished by a self-designed detector array consisting of 12 arc-arranged cadmium telluride (CdTe) semiconductors and their corresponding collimators. Real-time HXR acquisition and processing are achieved by the main electronic system, which comprises a high-speed analog-to-digital module and a high-performance signal-processing unit. Owing to the high HXR flux and the real-time measurement requirement, the HXR tomography is accomplished by several customized digital processing algorithms implemented in FPGA logic resources, such as digital real-time spectrum measurement, a trapezoidal shaper, a pile-up filter, and a baseline restorer. The system has proved to be a dependable platform for fast-electron bremsstrahlung radiation research during LH mode on HL-2A, providing indispensable parameters on the plasma state during fusion reactions.
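The trapezoidal shaper mentioned above converts a step-like detector pulse into a trapezoid whose flat-top height estimates the pulse amplitude; a minimal floating-point sketch follows (the paper's FPGA fixed-point implementation, pole-zero correction for exponential decay, and parameter choices are not reproduced, and the filter lengths here are assumptions):

```python
import numpy as np

def trapezoidal_shaper(v, k, m):
    """Shape a step-like pulse into a trapezoid: the normalized
    difference of two length-k moving sums separated by a gap of m
    samples (the flat top). Floating-point sketch of the classic
    digital trapezoidal filter."""
    v = np.asarray(v, dtype=float)
    c = np.concatenate(([0.0], np.cumsum(v)))  # prefix sums for O(1) windows
    n = len(v)
    y = np.zeros(n)
    for i in range(n):
        s1 = c[i + 1] - c[max(i + 1 - k, 0)]   # sum of the most recent k samples
        j = i - k - m                          # end of the delayed window
        s2 = c[max(j + 1, 0)] - c[max(j + 1 - k, 0)]
        y[i] = (s1 - s2) / k
    return y

v = np.zeros(200)
v[50:] = 1.0                       # unit step at n = 50 (an idealized pulse edge)
y = trapezoidal_shaper(v, k=10, m=20)
print(y.max())
```

For an ideal step, the flat-top height equals the step amplitude, so sampling the top yields the deposited energy; in hardware the same structure maps onto a recursive accumulator, which is why it suits FPGA logic at high HXR flux.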


2018 ◽  
Vol 7 (3) ◽  
pp. 1208
Author(s):  
Ajai Sunny Joseph ◽  
Elizabeth Isaac

Melanoma is recognized as one of the most dangerous types of skin cancer. A novel method to detect melanoma in real time with the help of a Graphics Processing Unit (GPU) is proposed. Existing systems can process medical images and perform a diagnosis based on image processing techniques and artificial intelligence. They are also able to perform video processing with the help of large hardware resources at the backend, which incurs significantly higher cost and space requirements and makes them complex in both software and hardware. A Graphics Processing Unit has much higher parallel processing capability than the Central Processing Unit of a system. Various approaches to implementing real-time detection of melanoma were examined; the results and analysis of these approaches, and the best approach identified in our study, are discussed in this work. A performance analysis of the approaches in CPU and GPU environments is also presented. The proposed system performs real-time analysis of live medical video data and delivers a diagnosis. When implemented, the system yielded an accuracy of 90.133%, which is comparable to existing systems.


Sensors ◽  
2020 ◽  
Vol 20 (20) ◽  
pp. 5829 ◽  
Author(s):  
Jen-Wei Huang ◽  
Meng-Xun Zhong ◽  
Bijay Prasad Jaysawal

Outlier detection in data streams is crucial to successful data mining. However, this task is made increasingly difficult by the enormous growth in the quantity of data generated by the expansion of the Internet of Things (IoT). Recent advances in outlier detection based on the density-based local outlier factor (LOF) algorithm do not consider variations in data that change over time; for example, a new cluster of data points may appear in the stream. Therefore, we present a novel algorithm for streaming data, referred to as time-aware density-based incremental local outlier detection (TADILOF), to overcome this issue. In addition, we have developed a means of estimating the LOF score, termed "approximate LOF," based on historical information following the removal of outdated data. The results of experiments demonstrate that TADILOF outperforms current state-of-the-art methods in terms of AUC while achieving similar performance in terms of execution time. Moreover, we present an application of the proposed scheme to the development of an air-quality monitoring system.
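For context, the density-based LOF baseline that TADILOF builds on scores each point by the ratio of its neighbours' local reachability density to its own; a minimal batch implementation in NumPy follows (a sketch of the static baseline only; TADILOF's incremental updates, time awareness, and approximate LOF are not reproduced):

```python
import numpy as np

def lof_scores(X, k=5):
    """Minimal local outlier factor (LOF) on a static batch.
    Scores well above 1 indicate points whose local density is much
    lower than that of their neighbours."""
    X = np.asarray(X, dtype=float)
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)                  # exclude self-distances
    knn = np.argsort(d, axis=1)[:, :k]           # indices of k nearest neighbours
    kdist = d[np.arange(n), knn[:, -1]]          # k-distance of each point
    # reachability distance of p from o: max(k-distance(o), d(p, o))
    reach = np.maximum(kdist[knn], d[np.arange(n)[:, None], knn])
    lrd = k / reach.sum(axis=1)                  # local reachability density
    return lrd[knn].mean(axis=1) / lrd           # LOF: neighbour/self density ratio

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(scale=0.5, size=(100, 2)),  # dense cluster
               [[6.0, 6.0]]])                          # isolated point
scores = lof_scores(X, k=10)
print(scores[-1])   # the isolated point's LOF
```

The all-pairs distance matrix above is what makes batch LOF impractical for unbounded streams, which motivates the incremental, time-aware updates and approximate LOF estimation in TADILOF.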

