Real Time Product Feedback Review and Analysis Using Apache Technologies and NoSQL Database

2017, Vol 6 (10), pp. 22551-22558
Author(s): BiswaRanjan Samal, Mrutyunjaya Panda

When a feedback system is considered, e-commerce organizations want to receive customer feedback in real time and to build strong dashboards on top of these feedbacks/ratings, so that they can easily track the performance of any product at any point in time and decide what to do with products that receive very poor feedback, thereby minimizing the impact on the organization's tangible and intangible assets. To achieve this goal, these organizations must adopt the right tools and set up an environment that can handle real-time big data ingestion, enrichment and indexing, and that can run both simple and complex analysis algorithms on the stored data. In this paper, we collected Amazon product ratings for analysis, used Apache NiFi to ingest the real-time data into Apache Solr, and employed the Banana dashboard to present the real-time analysis results as attractive, user-friendly dashboards.
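As a minimal illustration of the near-real-time indexing described above (not the authors' NiFi flow; the Solr core name and field names are assumptions), a single product-rating document can be pushed to Apache Solr's JSON update API so that a Banana dashboard can chart it:

```python
# Illustrative sketch (not the paper's NiFi pipeline): indexing one product-rating
# document into Apache Solr over its standard JSON update API.
# Core name and field names below are assumptions.
import requests

SOLR_UPDATE_URL = "http://localhost:8983/solr/product_ratings/update/json/docs"

def index_rating(product_id: str, rating: float, review: str, timestamp: str) -> None:
    doc = {
        "id": f"{product_id}-{timestamp}",   # unique key expected by Solr
        "product_id": product_id,
        "rating": rating,
        "review_text": review,
        "event_time": timestamp,             # ISO-8601, used for time-series panels
    }
    # commitWithin asks Solr to make the document searchable within 1 s (near real time)
    resp = requests.post(SOLR_UPDATE_URL, json=doc, params={"commitWithin": 1000})
    resp.raise_for_status()

if __name__ == "__main__":
    index_rating("B00EXAMPLE", 2.0, "Battery died after a week.", "2017-05-04T10:15:00Z")
```

In the paper this ingestion step is handled by Apache NiFi; the sketch only shows the shape of a document that Solr ends up indexing for the dashboard.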

J, 2021, Vol 4 (2), pp. 147-153
Author(s): Paula Morella, María Pilar Lambán, Jesús Antonio Royo, Juan Carlos Sánchez

Among the new trends in technology that have emerged with Industry 4.0, Cyber-Physical Systems (CPS) and the Internet of Things (IoT) are crucial for real-time data acquisition. This data acquisition, together with its transformation into valuable information, is indispensable for the development of real-time indicators. Moreover, real-time indicators give companies a competitive advantage, since they speed up calculation, decision-making and failure detection. Our research highlights the advantages of real-time data acquisition for supply chains, developing indicators that would be impossible to achieve with traditional systems, improving the accuracy of existing ones and enhancing real-time decision-making. It also brings out the importance of integrating Industry 4.0 technologies, in this case CPS and IoT, into industry, and establishes the main points of a future research agenda on this topic.
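As a minimal sketch of what a real-time indicator built on CPS/IoT data can look like (our illustration, not the authors' implementation; the event format and window length are assumptions), the snippet below maintains a rolling machine-availability figure from a stream of status samples:

```python
# Minimal sketch: a rolling availability indicator fed by IoT machine-status events.
# Window length and sample format are assumptions for illustration only.
from collections import deque
from time import time

class AvailabilityIndicator:
    """Rolling availability = share of 'running' samples over the last window_s seconds."""

    def __init__(self, window_s: float = 300.0):
        self.window_s = window_s
        self.events = deque()  # (timestamp, is_running) samples from the CPS/IoT layer

    def update(self, timestamp: float, is_running: bool) -> float:
        self.events.append((timestamp, is_running))
        # drop samples that fell out of the rolling window
        while self.events and self.events[0][0] < timestamp - self.window_s:
            self.events.popleft()
        running = sum(1 for _, r in self.events if r)
        return running / len(self.events)

indicator = AvailabilityIndicator(window_s=300)
print(indicator.update(time(), True))  # 1.0 after a single "running" sample
```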


2020, Vol 14, pp. 174830262096239
Author(s): Chuang Wang, Wenbo Du, Zhixiang Zhu, Zhifeng Yue

With the wide application of intelligent sensors and the Internet of Things (IoT) in the smart job shop, a large amount of real-time production data is collected. Accurate analysis of the collected data can help producers make effective decisions. Compared with traditional data processing methods, artificial intelligence, as the main big data analysis method, is increasingly applied in the manufacturing industry. However, different AI models differ in their ability to process real-time data from smart job shop production. On this basis, a real-time big data processing method for the job shop production process based on Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks is proposed. The method uses historical production data extracted from the IoT job shop as the original data set and, after data preprocessing, trains LSTM and GRU models to predict the real-time data of the job shop. Through the description and implementation of the models, they are compared with k-nearest neighbors (KNN), decision tree (DT) and traditional neural network models. The results show that, in real-time big data processing of the production process, the LSTM and GRU models outperform the traditional neural network, KNN and DT. While achieving performance similar to the LSTM, the GRU requires much less training time.
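A hedged sketch of the comparison described above, assuming a tf.keras setup with illustrative hyperparameters and toy sliding-window data rather than the paper's exact configuration:

```python
# Illustrative sketch only: comparable LSTM and GRU regressors for one-step-ahead
# prediction of a job-shop production signal. Layer sizes, window length and the
# random toy data are assumptions, not the paper's setup.
import numpy as np
import tensorflow as tf

def make_model(cell: str, timesteps: int = 30, features: int = 1) -> tf.keras.Model:
    layer = tf.keras.layers.LSTM if cell == "lstm" else tf.keras.layers.GRU
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(timesteps, features)),
        layer(64),                 # recurrent encoder over the sliding window
        tf.keras.layers.Dense(1),  # next-step prediction
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# toy sliding-window data standing in for preprocessed IoT shop-floor readings
x = np.random.rand(256, 30, 1).astype("float32")
y = np.random.rand(256, 1).astype("float32")

for cell in ("lstm", "gru"):
    model = make_model(cell)
    model.fit(x, y, epochs=2, batch_size=32, verbose=0)  # GRU typically trains faster
```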


Author(s): Ritesh Srivastava, M.P.S. Bhatia

Twitter behaves as a social sensor of the world. The tweets provided by the Twitter Firehose exhibit the properties of big data (i.e. volume, variety, and velocity). With millions of users, Twitter's virtual communities now replicate real-world communities, and real-world events are therefore frequently discussed on Twitter. This work performs real-time analysis of tweets related to a targeted event (e.g. an election) to identify potential sub-events that occurred in the real world, were discussed on Twitter, and caused a significant change in the aggregated sentiment score of the targeted event over time. Such analysis can enrich the real-time decision-making ability of the event bearer. The proposed approach follows a three-step process: (1) real-time sentiment analysis of tweets; (2) application of Bayesian Change Point Detection to determine the sentiment change points; (3) detection of the major sub-events that have influenced the sentiment of the targeted event. The approach was evaluated on Twitter data of the Delhi Election 2015.
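A sketch of the first two steps under stated assumptions: the VADER analyzer stands in for the paper's sentiment scorer, and a simple jump test on the aggregated series stands in for Bayesian Change Point Detection (both are named substitutions, not the authors' code):

```python
# Sketch only: aggregate per-window tweet sentiment, then flag windows where the
# aggregated score jumps. VADER and the threshold test are substitutes for the
# paper's sentiment model and Bayesian Change Point Detection.
from statistics import mean
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

def window_sentiment(tweets: list[str]) -> float:
    """Aggregated sentiment (VADER compound score, in [-1, 1]) for one time window."""
    return mean(analyzer.polarity_scores(t)["compound"] for t in tweets)

def change_points(series: list[float], threshold: float = 0.3) -> list[int]:
    """Indices where the aggregated sentiment jumps by more than `threshold`."""
    return [i for i in range(1, len(series)) if abs(series[i] - series[i - 1]) > threshold]

windows = [["great rally today", "huge turnout"], ["scandal breaking", "very disappointing"]]
series = [window_sentiment(w) for w in windows]
print(change_points(series))  # windows whose sentiment shift may flag a sub-event
```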


Designs, 2021, Vol 5 (1), pp. 15
Author(s): Andreas Thoma, Abhijith Moni, Sridhar Ravi

Digital Image Correlation (DIC) is a powerful tool used to evaluate displacements and deformations in a non-intrusive manner. By comparing two images, one from the undeformed reference state of the sample and the other from the deformed target state, the relative displacement between the two states is determined. DIC is well known and often used for post-processing analysis of in-plane displacements and deformation of a specimen. Increasing the analysis speed to enable real-time DIC would be beneficial and would expand the scope of the method. Here, we tested several combinations of the most common DIC methods together with different parallelization approaches in MATLAB and evaluated their performance to determine whether real-time analysis is possible with these methods. The effects of different hardware settings were also analyzed and discussed. We found that implementation problems can reduce the efficiency of a theoretically superior algorithm, such that it becomes practically slower than a sub-optimal algorithm. The Newton–Raphson algorithm combined with a modified particle swarm algorithm and parallel per-image computation was found to be most effective. This is contrary to theory, which suggests that the inverse-compositional Gauss–Newton algorithm is superior. As expected, the brute-force search algorithm is the least efficient method. We also found that the correct choice of parallelization tasks is critical to attaining improvements in computing speed; a poorly chosen parallelization approach with high parallel overhead leads to inferior performance. Finally, irrespective of the computing mode, the correct choice of combinations of integer-pixel and sub-pixel search algorithms is critical for efficient analysis. Real-time DIC analysis will be difficult on computers with standard computing capabilities, even if parallelization is implemented, so the suggested solution is to use graphics processing unit (GPU) acceleration.
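For readers unfamiliar with the integer-pixel stage that seeds the sub-pixel algorithms mentioned above, here is a conceptual Python sketch (the paper's implementation is in MATLAB): a brute-force ZNCC search for one subset, with subsets distributed across processes in the spirit of per-image parallelization. Subset size, search radius and the toy images are assumptions:

```python
# Conceptual sketch, not the paper's MATLAB code: brute-force integer-pixel search
# by zero-normalized cross-correlation (ZNCC), parallelized over subsets.
import numpy as np
from multiprocessing import Pool

def zncc(a: np.ndarray, b: np.ndarray) -> float:
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def integer_pixel_search(args):
    ref, tgt, cx, cy, half, radius = args
    template = ref[cy - half:cy + half, cx - half:cx + half]
    best = (0, 0, -1.0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            cand = tgt[cy + dy - half:cy + dy + half, cx + dx - half:cx + dx + half]
            score = zncc(template, cand)
            if score > best[2]:
                best = (dx, dy, score)
    return best  # integer displacement seeding the sub-pixel (e.g. Newton-Raphson) stage

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.random((200, 200))
    tgt = np.roll(ref, (2, 3), axis=(0, 1))           # known shift of (dx=3, dy=2)
    jobs = [(ref, tgt, x, 100, 10, 5) for x in (60, 100, 140)]  # one job per subset centre
    with Pool(3) as pool:
        print(pool.map(integer_pixel_search, jobs))
```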


2019, Vol 26 (1), pp. 244-252
Author(s): Shibom Basu, Jakub W. Kaminski, Ezequiel Panepucci, Chia-Ying Huang, Rangana Warshamanage, ...

At the Swiss Light Source macromolecular crystallography (MX) beamlines, the collection of serial synchrotron crystallography (SSX) diffraction data is facilitated by the recent DA+ data acquisition and analysis software developments. The SSX suite allows easy, efficient and high-throughput measurements on a large number of crystals. The fast, continuous diffraction-based two-dimensional grid scan method allows initial location of microcrystals. The CY+ GUI utility enables efficient assessment of a grid scan's analysis output and the subsequent collection of multiple wedges of data (so-called minisets) from automatically selected positions in a serial and automated way. The automated data processing (adp) routines, adapted to the SSX data collection mode, provide near-real-time analysis for data in both CBF and HDF5 formats. The automatic data merging (adm) is the latest extension of the DA+ data analysis software routines. It utilizes the sxdm (SSX data merging) package, which provides automatic online scaling and merging of minisets and allows identification of the subset of minisets that yields the best quality of the final merged data. The results of both adp and adm are sent to the MX MongoDB database and displayed in the web-based tracker, which provides the user with on-the-fly feedback about the experiment.
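As a purely illustrative sketch of the kind of selection that adm/sxdm automates (not the sxdm implementation; the scoring callback stands in for scaling/merging plus a quality metric such as CC1/2), a greedy subset search over minisets might look like this:

```python
# Illustrative only: greedily grow a subset of minisets while the merged-data
# quality score keeps improving. `merge_and_score` is a hypothetical callback.
from typing import Callable, Sequence

def select_minisets(minisets: Sequence[str],
                    merge_and_score: Callable[[list[str]], float]) -> list[str]:
    selected: list[str] = []
    best_score = float("-inf")
    remaining = list(minisets)
    while remaining:
        # try adding each remaining miniset and keep the one that helps most
        gains = [(merge_and_score(selected + [m]), m) for m in remaining]
        score, candidate = max(gains)
        if score <= best_score:
            break  # no remaining miniset improves the merged quality
        best_score = score
        selected.append(candidate)
        remaining.remove(candidate)
    return selected

# toy usage: score = number of distinct "crystals" represented (purely illustrative)
print(select_minisets(["xtal1_w1", "xtal1_w2", "xtal2_w1"],
                      lambda s: len({m.split("_")[0] for m in s})))
```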


2021, Vol 343, pp. 03005
Author(s): Florina Chiscop, Bogdan Necula, Carmen Cristiana Cazacu, Cristian Eugen Stoica

This paper presents our research on creating a virtual model (digital twin) of a fast-food company's production chain, from the moment a customer places an order, through the processing of that order, until the customer receives it. The model describes the elements involved in this process, such as equipment, human resources and the space needed to host the layout. The virtual model, created in a simulation platform, replicates a real fast-food company and thus helps us observe the real-time dynamics of this production system. Using WITNESS HORIZON 23, we construct the layout model from real-time data received from the fast-food company. This digital twin is used to manage the material flow of the production chain, evaluating the performance of the system architecture in various scenarios. To obtain a diagnosis of the system's performance, we simulate the workflow through the preliminary architecture in accordance with the real-time behaviour, identifying the bottlenecks and blockages in the flow trajectory. Finally, we propose two different optimised architectures for the fast-food company's production chain.
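As an illustration of the sort of order-flow model the digital twin encapsulates (the paper builds its model in WITNESS HORIZON 23; this Python/SimPy toy, with hypothetical stations and processing times, is not that model), a two-station queueing sketch already exposes where orders pile up:

```python
# Toy order-flow simulation: orders pass through a cooking station and a hand-over
# counter; the mean lead time reveals queueing bottlenecks. All parameters are
# hypothetical and chosen only for illustration.
import random
import simpy

def order(env, grill, counter, waits):
    placed = env.now
    with grill.request() as req:      # cooking station
        yield req
        yield env.timeout(random.uniform(2, 4))
    with counter.request() as req:    # packing / hand-over counter
        yield req
        yield env.timeout(random.uniform(0.5, 1.5))
    waits.append(env.now - placed)

def run(n_orders=50, arrival_mean=1.0):
    env = simpy.Environment()
    grill = simpy.Resource(env, capacity=2)    # hypothetical station capacities
    counter = simpy.Resource(env, capacity=1)
    waits = []

    def source():
        for _ in range(n_orders):
            env.process(order(env, grill, counter, waits))
            yield env.timeout(random.expovariate(1.0 / arrival_mean))

    env.process(source())
    env.run()
    print(f"mean order lead time: {sum(waits) / len(waits):.2f} min")

run()
```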


2014, Vol 571-572, pp. 497-501
Author(s): Qi Lv, Wei Xie

Real-time log analysis over large-scale data is important for many applications; here, real-time refers to a UI latency within 100 ms. Techniques that efficiently support real-time analysis over large log data sets are therefore desired. MongoDB provides good query performance, an aggregation framework and a distributed architecture, which makes it suitable for real-time data query and massive log analysis. In this paper, a novel implementation approach for an event-driven file log analyzer is presented, and the performance of query, scan and aggregation operations over MongoDB, HBase and MySQL is compared. Our experimental results show that HBase delivers the most balanced performance across all operations, while MongoDB answers some queries in less than 10 ms, which makes it the most suitable for real-time applications.
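A minimal PyMongo sketch of the query and aggregation patterns being benchmarked (collection layout, index and field names are assumptions, not the paper's schema):

```python
# Minimal sketch: a point query and an aggregation over an indexed log collection,
# the two operation classes compared across MongoDB, HBase and MySQL in the paper.
from datetime import datetime, timedelta
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")
logs = client["logdb"]["events"]
logs.create_index([("level", ASCENDING), ("ts", ASCENDING)])  # supports the query below

# point query: recent ERROR entries (the sub-10 ms case when a suitable index exists)
recent_errors = list(
    logs.find({"level": "ERROR", "ts": {"$gte": datetime.utcnow() - timedelta(minutes=5)}})
        .limit(100)
)

# aggregation: error counts per service over the last hour
pipeline = [
    {"$match": {"level": "ERROR", "ts": {"$gte": datetime.utcnow() - timedelta(hours=1)}}},
    {"$group": {"_id": "$service", "count": {"$sum": 1}}},
    {"$sort": {"count": -1}},
]
per_service = list(logs.aggregate(pipeline))
```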


The Analyst, 2018, Vol 143 (16), pp. 3798-3807
Author(s): Yangyang Jiang, Lin Du, Yuanming Li, Quanquan Mu, Zhongxu Cui, ...

Real-time continuous-flow PCR inside a 3D spiral microchannel is realized by a novel self-activated microdroplet generation/transport mechanism.

