Study of CDR Real-Time Query Based on Big Data Technologies

2013 ◽  
Vol 462-463 ◽  
pp. 845-848
Author(s):  
Zhi Heng Gao ◽  
Kang Chen ◽  
Ling Yan Bi

This paper describes the big data technology layers, analyses the CDR (Call Data Records) real-time query scenario in telecommunications, and puts forward a fast indexing and query solution based on the open source Hadoop platform. A CDR real-time query system was built according to this solution. A performance test was conducted with a real dataset from a city with 3 million subscribers. Compared with the existing system, the big data solution greatly improves data processing performance and supports real-time queries with lower hardware and software investment.
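
The abstract does not detail the paper's index design, so the following is only a hedged sketch of one common Hadoop-ecosystem approach to fast CDR lookup: an HBase table keyed by subscriber number plus a reversed timestamp, accessed through the happybase client. The table name, column family, and rowkey layout are assumptions for illustration, not the paper's actual schema.

```python
# Hypothetical sketch: HBase-backed CDR lookup keyed by subscriber number plus
# a reversed timestamp, so a prefix scan returns the most recent calls first.
# Table name 'cdr', column family 'd', and the rowkey layout are assumptions.
import time
import happybase

MAX_TS = 10**13  # reversed-timestamp base (milliseconds)

def rowkey(msisdn: str, ts_ms: int) -> bytes:
    # subscriber number + zero-padded reversed timestamp -> newest rows sort first
    return f"{msisdn}:{MAX_TS - ts_ms:013d}".encode()

def put_cdr(table, msisdn: str, ts_ms: int, record: dict) -> None:
    table.put(rowkey(msisdn, ts_ms),
              {f"d:{k}".encode(): str(v).encode() for k, v in record.items()})

def query_recent(table, msisdn: str, limit: int = 20):
    # Prefix scan over one subscriber's rowkey range; HBase serves this from
    # the sorted index without scanning unrelated CDRs.
    return list(table.scan(row_prefix=f"{msisdn}:".encode(), limit=limit))

if __name__ == "__main__":
    conn = happybase.Connection("hbase-host")   # placeholder host
    cdr = conn.table("cdr")
    put_cdr(cdr, "13800000000", int(time.time() * 1000),
            {"callee": "13900000000", "duration": 65})
    for key, data in query_recent(cdr, "13800000000"):
        print(key, data)
```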

2017 ◽  
Vol 113 ◽  
pp. 429-434 ◽  
Author(s):  
Y. Nait Malek ◽  
A. Kharbouch ◽  
H. El Khoukhi ◽  
M. Bakhouya ◽  
V. De Florio ◽  
...  

2020 ◽  
Vol 14 ◽  
pp. 174830262096239 ◽  
Author(s):  
Chuang Wang ◽  
Wenbo Du ◽  
Zhixiang Zhu ◽  
Zhifeng Yue

With the wide application of intelligent sensors and the internet of things (IoT) in the smart job shop, a large amount of real-time production data is collected. Accurate analysis of the collected data can help producers make effective decisions. Compared with traditional data processing methods, artificial intelligence, as the main big data analysis method, is increasingly applied in the manufacturing industry. However, different AI models differ in their ability to process real-time data from smart job shop production. On this basis, a real-time big data processing method for the job shop production process based on Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU) is proposed. This method uses historical production data extracted from the IoT job shop as the original dataset and, after data preprocessing, uses the LSTM and GRU models to train on and predict the real-time data of the job shop. Through the description and implementation of the models, they are compared with KNN, DT, and a traditional neural network model. The results show that, in real-time big data processing of the production process, the LSTM and GRU models outperform the traditional neural network, k-nearest neighbor (KNN), and decision tree (DT) models. While its performance is similar to that of LSTM, the training time of GRU is much lower than that of the LSTM model.
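
A minimal tf.keras sketch of the kind of comparison the abstract describes: train an LSTM and a GRU of matching size on windowed sequence data and time the training. The input shapes, layer sizes, and epoch count are illustrative assumptions, and random arrays stand in for the preprocessed job-shop data.

```python
# Minimal sketch: train comparable LSTM and GRU regressors on windowed
# production-sensor sequences and compare wall-clock training time.
# Shapes, layer sizes, and epochs are illustrative assumptions.
import time
import numpy as np
import tensorflow as tf

def build(cell, timesteps=30, features=8):
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(timesteps, features)),
        cell(64),                       # LSTM or GRU layer
        tf.keras.layers.Dense(1),       # next-step prediction
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# toy stand-in for preprocessed IoT job-shop data
X = np.random.rand(2000, 30, 8).astype("float32")
y = np.random.rand(2000, 1).astype("float32")

for name, cell in [("LSTM", tf.keras.layers.LSTM), ("GRU", tf.keras.layers.GRU)]:
    model = build(cell)
    start = time.time()
    model.fit(X, y, epochs=5, batch_size=64, verbose=0)
    print(f"{name}: {time.time() - start:.1f}s training time")
```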


Author(s):  
Andrew Peekema ◽  
Daniel Renjewski ◽  
Jonathan Hurst

The control system of a highly dynamic robot requires the ability to respond quickly to changes in the robot's state. This type of system is needed in fields as varied as dynamic locomotion, multicopter control, and human-robot interaction. Robots in these fields require software and hardware capable of hard real-time, high-frequency control. In addition, the application outlined in this paper requires modular components, remote guidance, and mobile control. The described system integrates a computer on the robot for running a control algorithm, a bus for communicating with microcontrollers connected to sensors and actuators, and a remote user interface for interacting with the robot. Current commercial solutions can be expensive, and open source solutions are often time consuming. The key innovation described in this paper is a control system built from existing, mostly open source, components that can provide real-time, high-frequency control of the robot. This paper covers the development of such a control system based on ROS, OROCOS, and EtherCAT, its implementation on a dynamic bipedal robot, and system performance test results.
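
The paper's hard real-time path runs in OROCOS over EtherCAT (in C++); the following is only an illustrative rospy sketch of the ROS-side layer: a fixed-rate node that maps the latest sensor message to actuator commands. Topic names, the message type, and the control law are assumptions, not the authors' implementation.

```python
#!/usr/bin/env python
# Illustrative ROS node only: a fixed-rate control loop that maps the latest
# joint-state message to actuator commands. Topic names and message types are
# assumptions; a hard real-time controller would not live in a Python node.
import rospy
from std_msgs.msg import Float64MultiArray

latest_state = None

def state_cb(msg):
    global latest_state
    latest_state = list(msg.data)   # cache the most recent sensor reading

def control_law(state):
    # placeholder proportional law; the real controller is robot-specific
    return [-0.5 * x for x in state]

def main():
    rospy.init_node("demo_controller")
    rospy.Subscriber("joint_states_raw", Float64MultiArray, state_cb)
    cmd_pub = rospy.Publisher("actuator_commands", Float64MultiArray, queue_size=1)
    rate = rospy.Rate(1000)          # 1 kHz target loop rate
    while not rospy.is_shutdown():
        if latest_state is not None:
            cmd_pub.publish(Float64MultiArray(data=control_law(latest_state)))
        rate.sleep()

if __name__ == "__main__":
    main()
```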


2014 ◽  
Vol 23 (01) ◽  
pp. 27-35 ◽  
Author(s):  
S. de Lusignan ◽  
S-T. Liaw ◽  
C. Kuziemsky ◽  
F. Mold ◽  
P. Krause ◽  
...  

Summary. Background: Generally, the benefits and risks of vaccines can be determined from studies carried out as part of regulatory compliance, followed by surveillance of routine data; however, there are some rarer and longer-term events that require new methods. Big data generated by increasingly affordable personalised computing and by pervasive computing devices is growing rapidly, and low-cost, high-volume cloud computing makes the processing of these data inexpensive. Objective: To describe how big data and related analytical methods might be applied to assess the benefits and risks of vaccines. Method: We reviewed the literature on the use of big data to improve health, applied to generic vaccine use cases that illustrate the benefits and risks of vaccination. We defined a use case as the interaction between a user and an information system to achieve a goal. We used flu vaccination and pre-school childhood immunisation as exemplars. Results: We reviewed three big data use cases relevant to assessing vaccine benefits and risks: (i) big data processing using crowd-sourcing, distributed big data processing, and predictive analytics; (ii) data integration from heterogeneous big data sources, e.g. the increasing range of devices in the “internet of things”; and (iii) real-time monitoring, for the direct monitoring of epidemics as well as vaccine effects, via social media and other data sources. Conclusions: Big data raises new ethical dilemmas, though its analysis methods can bring complementary real-time capabilities for monitoring epidemics and assessing the vaccine benefit-risk balance.
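
A purely illustrative sketch of the third use case (real-time monitoring via social media and other sources): a rolling-window counter of symptom and vaccine mentions over an incoming message stream. The keyword list, window length, and the stream itself are placeholders, not any real data source or API.

```python
# Generic sketch of the real-time monitoring use case: count symptom and
# vaccine mentions in a rolling one-hour window over an incoming message
# stream. Keywords and the stream source are placeholders.
from collections import deque, Counter
import time

KEYWORDS = {"fever", "rash", "flu", "vaccine"}
WINDOW_S = 3600  # one-hour rolling window

window = deque()          # (timestamp, keyword) pairs
counts = Counter()

def ingest(text: str, now: float) -> None:
    for word in text.lower().split():
        if word in KEYWORDS:
            window.append((now, word))
            counts[word] += 1
    # expire mentions that fell out of the window
    while window and now - window[0][0] > WINDOW_S:
        _, old = window.popleft()
        counts[old] -= 1

# example: feed a few simulated posts
for post in ["Got my flu vaccine today", "High fever and rash since yesterday"]:
    ingest(post, time.time())
print(counts)
```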


2017 ◽  
Vol 8 (2) ◽  
pp. 88-105 ◽  
Author(s):  
Gunasekaran Manogaran ◽  
Daphne Lopez

Ambient intelligence is an emerging platform that combines advances in sensors and sensor networks, pervasive computing, and artificial intelligence to capture real-time climate data. This continuously generates several exabytes of unstructured sensor data, which is therefore often called big climate data. Nowadays, researchers are trying to use big climate data to monitor and predict climate change and possible diseases. Traditional data processing techniques and tools are not capable of handling such huge amounts of climate data. Hence, there is a need to develop an advanced big data architecture for processing real-time climate data. The purpose of this paper is to propose a big data based surveillance system that analyzes spatial climate big data and performs continuous monitoring of the correlation between climate change and dengue. The proposed disease surveillance system has been implemented with the help of Apache Hadoop MapReduce and its supporting tools.
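
A hedged sketch of how such a MapReduce aggregation might look with Hadoop Streaming in Python: a mapper keys climate readings and dengue case counts by region and week, and a reducer aggregates each group. The input field layout is an assumption for illustration, not the paper's actual schema or code.

```python
#!/usr/bin/env python
# Hadoop Streaming sketch (run with -mapper "job.py map" -reducer "job.py reduce").
# Assumed input fields per line: region,week,temperature,rainfall,dengue_cases
import sys

def mapper():
    for line in sys.stdin:
        region, week, temp, rain, cases = line.rstrip("\n").split(",")
        print(f"{region}:{week}\t{temp},{rain},{cases}")

def reducer():
    current, temps, rains, cases = None, [], [], 0
    def emit():
        # average climate readings and total cases for the finished group
        if current is not None and temps:
            print(f"{current}\t{sum(temps)/len(temps):.1f}\t"
                  f"{sum(rains)/len(rains):.1f}\t{cases}")
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        t, r, c = value.split(",")
        if key != current:
            emit()
            current, temps, rains, cases = key, [], [], 0
        temps.append(float(t)); rains.append(float(r)); cases += int(c)
    emit()

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```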


Author(s):  
Ganesh Chandra Deka

NoSQL databases are designed to meet the huge data storage requirements of cloud computing and big data processing. NoSQL databases have many advanced features in addition to the conventional RDBMS features; hence, “NoSQL” databases are popularly known as “Not only SQL” databases. A variety of NoSQL databases with different features for dealing with exponentially growing data-intensive applications are available, with both open source and proprietary options. This chapter discusses some of the popular NoSQL databases and their features in the light of the CAP theorem.
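
To make the CAP trade-off concrete, here is a self-contained toy quorum store: reads and writes succeed only when enough replicas respond, and with R + W > N a read always overlaps the latest write. This is a generic illustration of the quorum idea many NoSQL systems use, not the API of any particular database.

```python
# Toy N-replica store illustrating consistency vs availability: operations are
# refused when a quorum of replicas is unreachable (consistency is preferred).
import random

class QuorumStore:
    def __init__(self, n=3, w=2, r=2):
        self.replicas = [dict() for _ in range(n)]   # one key/value map per replica
        self.up = [True] * n                         # simulated node health
        self.n, self.w, self.r = n, w, r

    def write(self, key, value, version):
        alive = [i for i in range(self.n) if self.up[i]]
        if len(alive) < self.w:
            raise RuntimeError("write rejected: quorum unavailable")
        for i in alive:
            self.replicas[i][key] = (version, value)

    def read(self, key):
        alive = [i for i in range(self.n) if self.up[i]]
        if len(alive) < self.r:
            raise RuntimeError("read rejected: quorum unavailable")
        answers = [self.replicas[i].get(key) for i in random.sample(alive, self.r)]
        return max(a for a in answers if a)          # newest version wins

store = QuorumStore()
store.write("user:42", "alice", version=1)
print(store.read("user:42"))                 # (1, 'alice')
store.up[2] = False                          # one replica partitioned away
store.write("user:42", "bob", version=2)     # still succeeds: 2 of 3 replicas ack
store.up[1] = False                          # second failure -> quorum lost
try:
    store.write("user:42", "carol", version=3)
except RuntimeError as e:
    print(e)                                 # refused rather than risk inconsistency
```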


Author(s):  
Amir A. Khwaja

The big data explosion has already happened, and the situation is only going to exacerbate with such a high number of data sources and so much high-end technology prevalent everywhere, generating data at a frantic pace. One of the most important aspects of big data is being able to capture, process, and analyze data as it is happening, in real time, to allow real-time business decisions. Alternate approaches must be investigated, especially ones consisting of highly parallel and real-time computations for big data processing. The chapter presents the RealSpec real-time specification language, which may be used for modeling big data analytics because of its inherent language features needed for real-time big data processing, such as concurrent processes, multi-threading, resource modeling, timing constraints, and exception handling. The chapter provides an overview of RealSpec and applies the language to a detailed big data event recognition case study to demonstrate its applicability to big data framework and analytics modeling.
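
RealSpec itself is not shown in this abstract, so the following is only a plain-Python asyncio sketch of the feature set the chapter lists (concurrent processes, timing constraints, exception handling) applied to a toy event-recognition task. Event names and the 50 ms deadline are illustrative assumptions.

```python
# Not RealSpec: an asyncio illustration of concurrent event processing with a
# per-event deadline and explicit handling of deadline misses.
import asyncio, random

DEADLINE_S = 0.05   # per-event processing deadline (assumed)

async def recognize(event: dict) -> str:
    await asyncio.sleep(random.uniform(0.01, 0.08))   # simulated analysis work
    return f"recognized {event['type']} #{event['id']}"

async def process(event: dict) -> str:
    try:
        # timing constraint: abandon events that miss their deadline
        return await asyncio.wait_for(recognize(event), timeout=DEADLINE_S)
    except asyncio.TimeoutError:
        return f"deadline missed for event #{event['id']}"

async def main():
    stream = [{"id": i, "type": "sensor_spike"} for i in range(8)]
    results = await asyncio.gather(*(process(e) for e in stream))  # concurrent processes
    for r in results:
        print(r)

asyncio.run(main())
```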


Author(s):  
Dharmpal Singh ◽  
Madhusmita Mishra ◽  
Sudipta Sahana

Big data analytics finds patterns, derives meaning, and supports decisions on data in order to respond to the world intelligently. It is an emerging area used in business intelligence (BI) for competitive advantage, analyzing the structured, semi-structured, and unstructured data stored in different formats. As big data technology continues to evolve, businesses are turning to predictive intelligence to deepen engagement with customers and optimize processes to reduce operational costs. Predictive intelligence uses sets of advanced technologies that enable organizations to use data captured in real time, moving from a historical and descriptive view to a forward-looking perspective on the data. The comparison and other security issues of this technology are covered in this book chapter. The combination of big data technology and predictive analytics is sometimes referred to as a never-ending process and has the potential to deliver significant competitive advantage. This chapter provides an extensive review of the literature on big data technologies and their usage in predictive intelligence.


Big Data ◽  
2016 ◽  
pp. 418-440
Author(s):  
Amir A. Khwaja



Author(s):  
Amitava Choudhury ◽  
Kalpana Rangra

The type and amount of data in human society are growing at an amazing speed, driven by emerging services such as cloud computing, the internet of things, and location-based services. The era of big data has arrived. As data has become a fundamental resource, how to manage and utilize big data better has attracted much attention. Especially with the development of the internet of things, how to process large amounts of real-time data has become a great challenge in research and applications. Recently, cloud computing technology has attracted much attention for its high performance, but how to use cloud computing technology for large-scale real-time data processing has not been studied. In this chapter, various big data processing techniques are discussed.
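
As one small, concrete instance of the techniques such a survey covers, here is a hedged sketch of data-parallel batch aggregation with Python's concurrent.futures: a batch of records is partitioned across worker processes so aggregation keeps pace with the arrival rate. The record format and the per-record work are placeholders.

```python
# Generic data-parallel processing sketch: partition a batch of records across
# worker processes and combine the per-partition aggregates.
from concurrent.futures import ProcessPoolExecutor
import random

def summarize(chunk):
    # per-partition aggregation (here: mean of a sensor value)
    values = [rec["value"] for rec in chunk]
    return sum(values) / len(values)

def partition(records, n_parts):
    return [records[i::n_parts] for i in range(n_parts)]

if __name__ == "__main__":
    batch = [{"value": random.random()} for _ in range(100_000)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        partials = list(pool.map(summarize, partition(batch, 4)))
    print(sum(partials) / len(partials))   # combined estimate across partitions
```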

