Data Summarization Using Sampling Algorithms: Data Stream Case Study

Author(s):  
Rayane El Sibai ◽  
Jacques Bou Abdo ◽  
Yousra Chabchoub ◽  
Jacques Demerjian ◽  
Raja Chiky ◽  
...  
1998 ◽  
Vol 12 (3) ◽  
pp. 283-302 ◽  
Author(s):  
James Allen Fill

The elementary problem of exhaustively sampling a finite population without replacement is used as a nonreversible test case for comparing two recently proposed MCMC algorithms for perfect sampling, one based on backward coupling and the other on strong stationary duality. The backward coupling algorithm runs faster in this case, but the duality-based algorithm is unbiased with respect to user impatience (i.e., interruptible). An interesting by-product of the analysis is a new and simple stochastic interpretation of a mixing-time result for the move-to-front rule.
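The elementary process being compared here, drawing uniformly from a finite population without replacement until it is exhausted, can be sketched as follows. This is a plain baseline sketch of the sampling problem itself, not an implementation of either of the two perfect-sampling MCMC algorithms:

```python
import random

def exhaustive_sample(population, rng=random):
    """Draw items uniformly at random without replacement until the
    population is exhausted, yielding a uniformly random permutation."""
    remaining = list(population)
    order = []
    while remaining:
        # Each remaining item is equally likely to be drawn next.
        i = rng.randrange(len(remaining))
        order.append(remaining.pop(i))
    return order
```

Every run visits each element exactly once, so the output is a uniformly random ordering of the population.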


2016 ◽  
Vol 72 (10) ◽  
pp. 3927-3959 ◽  
Author(s):  
Simon Fong ◽  
Kexing Liu ◽  
Kyungeun Cho ◽  
Raymond Wong ◽  
Sabah Mohammed ◽  
...  

2020 ◽  
Vol 6 ◽  
pp. e250
Author(s):  
Varsha Kakkara ◽  
Karthi Balasubramanian ◽  
B. Yamuna ◽  
Deepak Mishra ◽  
Karthikeyan Lingasubramanian ◽  
...  

Integrated circuits may be vulnerable to hardware Trojan attacks during their design or fabrication phases. This article is a case study of the design of a Viterbi decoder and the effect of hardware Trojans on a coded communication system employing the Viterbi decoder. A design for a Viterbi decoder and possible hardware Trojan models for it are proposed. An FPGA-based implementation of the decoder and the associated Trojan circuits is discussed. The noise-added encoded input data stream is stored in the block RAM of the FPGA, and the decoded data stream is monitored on a PC through a universal asynchronous receiver-transmitter (UART) interface. The implementation results show that there is barely any change in the LUTs used (0.5%) or in the power dissipation (3%) due to the insertion of the proposed Trojan circuits, establishing the surreptitious nature of the Trojan. Although the Trojans cause negligible changes in the circuit parameters, they cause significant changes in the bit error rate (BER). In the absence of Trojans, the BER drops to zero for signal-to-noise ratios (SNRs) above 6 dB, but with Trojans present, the BER does not reach zero even at very high SNRs. This holds even when the Trojan is activated only once during the entire transmission.
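As an illustrative sketch (not the paper's FPGA design), a hard-decision Viterbi decoder for the standard rate-1/2, constraint-length-3 convolutional code (generator polynomials 7 and 5 in octal), together with the BER measurement used above to expose the Trojan's effect, might look like:

```python
# Rate-1/2, K=3 convolutional code (generators 7, 5 octal); taps apply to
# (current bit, previous bit, bit before that).
G = [(1, 1, 1), (1, 0, 1)]
STATES = [(0, 0), (0, 1), (1, 0), (1, 1)]

def conv_encode(bits):
    """Encode an info bit stream; emits two coded bits per input bit."""
    state, out = (0, 0), []
    for b in bits:
        window = (b,) + state
        out += [sum(w * g for w, g in zip(window, gen)) % 2 for gen in G]
        state = (b, state[0])
    return out

def viterbi_decode(received):
    """Hard-decision Viterbi decoding with Hamming branch metrics."""
    INF = float("inf")
    metric = {s: (0 if s == (0, 0) else INF) for s in STATES}
    survivors = {s: [] for s in STATES}
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metric = {s: INF for s in STATES}
        new_surv = {}
        for s in STATES:
            if metric[s] == INF:
                continue
            for b in (0, 1):
                window = (b,) + s
                expect = [sum(w * g for w, g in zip(window, gen)) % 2
                          for gen in G]
                m = metric[s] + sum(e != x for e, x in zip(expect, r))
                ns = (b, s[0])  # next state after shifting in b
                if m < new_metric[ns]:
                    new_metric[ns], new_surv[ns] = m, survivors[s] + [b]
        metric, survivors = new_metric, new_surv
    best = min(STATES, key=lambda s: metric[s])
    return survivors[best]

def bit_error_rate(sent, decoded):
    """Fraction of positions where the decoded stream differs from the sent one."""
    return sum(a != b for a, b in zip(sent, decoded)) / len(sent)
```

On a clean channel the decoder recovers the input exactly (BER 0), and a single flipped channel bit is also corrected, since this code's free distance is 5; a Trojan that silently corrupts state or branch metrics would break exactly this guarantee.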


Author(s):  
Ms. Shailaja B. Jadhav ◽  
Dr. D. V. Kodavade

Nowadays, big data processing systems are evolving to be more stream-oriented, where each data record is processed as it arrives by distributed, low-latency computational frameworks [18]. Data streams are used extensively in many fields of computational analytics, such as data mining and business intelligence [17]. In every such field, a data stream can be viewed as an ordered sequence of data items that arrive continuously over time. This characteristic makes streaming data analytics a challenging area of research [5, 11]. This paper presents data stream processing as a growing research field, with streaming analytics frameworks as a rich focus area, and also contributes an evaluation of the efficacy of the available stream analytics frameworks. One Industry 4.0 use case, predictive maintenance in rail transportation, is illustrated as a case study design mapped onto a streaming analytics framework.
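The record-at-a-time model described above can be sketched minimally: each record is processed as it arrives, with only bounded per-key state retained. The `(sensor_id, value)` record format and the rolling-mean aggregation are hypothetical illustrations (loosely in the spirit of a predictive-maintenance feed), not taken from any framework evaluated in the paper:

```python
from collections import defaultdict, deque

def rolling_mean_per_sensor(stream, window=5):
    """Process records one at a time as they arrive, keeping a fixed-size
    window of recent values per sensor and emitting the running mean.
    Records are assumed to be (sensor_id, value) pairs."""
    windows = defaultdict(lambda: deque(maxlen=window))
    for sensor_id, value in stream:
        w = windows[sensor_id]
        w.append(value)  # oldest value falls out automatically at maxlen
        yield sensor_id, sum(w) / len(w)
```

Because state is bounded by `window` per sensor, the operator can run indefinitely over an unbounded stream, which is the defining constraint of streaming analytics.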


2018 ◽  
Vol 1 (2) ◽  
pp. 79-85
Author(s):  
I Putu Agus Eka Pratama ◽  
Putu Adhika Dharmesta

Deep Packet Inspection (DPI) is a technique commonly used by network administrators to monitor, in detail, the flow of data packets at a given moment. This data stream yields information that can be used for network management purposes. One example of a case study that can be carried out with this technique is the intranet of the Information Technology major at Udayana University. The Deep Packet Inspection technique is applied here to identify the initial cause of the network slowdown in the Information Technology major at Udayana University.
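At its core, inspecting a packet means decoding its headers field by field before looking into the payload. As a toy illustration of that first step (not the tooling used in the study), the fixed 20-byte part of an IPv4 header can be parsed from raw bytes with the standard `struct` module:

```python
import struct

def parse_ipv4_header(packet: bytes):
    """Decode the fixed 20-byte part of an IPv4 header (RFC 791).
    DPI tools examine these fields, and the payload beyond them, per packet."""
    (version_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s",
                                                     packet[:20])
    return {
        "version": version_ihl >> 4,
        "header_len": (version_ihl & 0x0F) * 4,  # IHL is in 32-bit words
        "total_len": total_len,
        "ttl": ttl,
        "protocol": proto,  # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }
```

A slowdown investigation would aggregate fields like `protocol`, `src`, and `total_len` across the captured stream to find which hosts and applications dominate the traffic.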


Author(s):  
Thomas Plagemann ◽  
Vera Goebel ◽  
Andrea Bergamini ◽  
Giacomo Tolu ◽  
Guillaume Urvoy-Keller ◽  
...  

2015 ◽  
Vol 5 (5) ◽  
pp. 1108-1115 ◽  
Author(s):  
Simon Fong ◽  
Shirley W. I. Siu ◽  
Suzy Zhou ◽  
Jonathan H. Chan ◽  
Sabah Mohammed ◽  
...  

2021 ◽  
Vol 25 (02) ◽  
pp. 1-8
Author(s):  
Ali M. Al-Bdairy ◽  
Ahmed A. A. Al-Duroobi ◽  
Maan A. Tawfiq ◽  
...  

Despite the rapid development of reverse engineering techniques such as modern 3D laser scanners, these techniques cannot immediately produce a perfect surface model of a scanned part, due to the huge volume of data, the noise associated with the scanning process, and the limited accuracy of some scanning devices. The present paper therefore proposes point cloud pre-processing and sampling algorithms based on distance calculations and statistical considerations. These algorithms simplify the raw point cloud obtained with a Matter and Form 3D laser scanner, recovering the required geometrical features and mathematical representation of the scanned object by detecting, isolating, and deleting the noisy points. A MATLAB program was constructed to execute the proposed algorithms, which were tested on a suggested case study with a non-uniform shape. The results proved the validity of the introduced distance algorithms for the pre-processing and sampling processes: the pre-processing proficiency was 18.65% with a single attempt, and the deviation measured during sampling ranged from 0.0002 to 0.3497 mm.
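A generic distance-and-statistics-based filter in the spirit of the proposed pre-processing (not the authors' exact MATLAB algorithm) removes points whose mean distance to their nearest neighbours is statistically anomalous; the `k` and `std_ratio` parameters here are illustrative choices:

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Statistical outlier removal: for each point, compute its mean distance
    to its k nearest neighbours, then drop points whose mean distance exceeds
    the global mean by more than std_ratio standard deviations."""
    pts = np.asarray(points, dtype=float)
    # Full pairwise distance matrix (fine for small clouds; a KD-tree
    # would be used for large scans).
    diff = pts[:, None, :] - pts[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))
    # Mean distance to the k nearest neighbours, excluding the point itself.
    knn = np.sort(dists, axis=1)[:, 1:k + 1]
    mean_d = knn.mean(axis=1)
    keep = mean_d <= mean_d.mean() + std_ratio * mean_d.std()
    return pts[keep], pts[~keep]
```

Scan noise typically appears as isolated points far from any surface, so their neighbour distances stand out against the dense, well-sampled regions of the part.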

