Twitter Data Analysis by Live Streaming

2019 ◽  
Vol 16 (8) ◽  
pp. 3178-3182
Author(s):  
Sugnik Roy Chowdhury

Streaming has nowadays become of great use in social media: streaming data makes it easy for companies to understand the pros and cons of their products, and it now serves as the kind of survey that a few years ago was conducted by a team of individuals with pen and paper. The streaming process generates a huge volume of real-time data, commonly referred to as "Big Data," which must be collected and processed from various streaming sites to produce analytical reports that give a clear pictorial representation of events. To aggregate, store, and analyse the streaming data generated day by day, we turn to the Hadoop and Flume technologies, together with an API that collects data from Twitter and other streaming sites using "#" tags and keywords. Tweets by news channels and retweets by the public are collected.
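The collection step described above boils down to keeping only the tweets that mention a tracked "#" tag or keyword before sinking them into storage. A minimal sketch of that filter, assuming tweets arrive as simple dictionaries (the field names here are illustrative, not the actual Flume record format):

```python
def matches_keywords(tweet_text, keywords):
    """Return True if the tweet mentions any tracked '#' tag or keyword."""
    text = tweet_text.lower()
    return any(kw.lower() in text for kw in keywords)

def collect(stream, keywords):
    """Keep only the tweets that match the tracked keywords."""
    return [t for t in stream if matches_keywords(t["text"], keywords)]

stream = [
    {"user": "news_channel", "text": "Breaking: floods in the city #weather"},
    {"user": "viewer42", "text": "RT great coverage tonight"},
    {"user": "viewer7", "text": "#weather update appreciated"},
]
tracked = collect(stream, ["#weather"])
print(len(tracked))  # 2
```

In a real deployment this predicate would sit inside the Flume source configuration rather than application code, with HDFS as the sink.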

2021 ◽  
Vol 1 (2) ◽  
Author(s):  
Dilmini Rathnayaka ◽  
Pubudu K.P.N Jayasena ◽  
Iraj Ratnayake

Sentiment analysis mainly supports sorting out polarity and provides valuable information from raw social media data. Many fields, such as health, business, and security, require real-time data analysis for instant decision-making. Since Twitter is considered a popular social media platform from which data can be collected easily, this paper considers methods of analysing Twitter data, including real-time analysis based on geo-location. Twitter data classification and analysis can be done with diverse algorithms, and deciding the most appropriate algorithm can be accomplished by implementing and testing them. This paper discusses sentiment analysis, data collection methods, data pre-processing, feature extraction, and sentiment analysis methods related to Twitter data. Real-time data analysis arises as a major method of analysing the data available online, and the real-time Twitter data analysis process is described throughout this paper. Several methods of classifying polarized Twitter data are discussed, along with a proposed Twitter data analysis algorithm. Location-based Twitter data analysis is another crucial aspect of sentiment analysis that enables sorting data by geo-location, and this paper describes how to analyse Twitter data based on geo-location. Further, a comparison of several sentiment analysis algorithms used by previous researchers is reported, and finally a conclusion is provided.
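The pre-processing stage mentioned above typically strips URLs, @mentions, and the "#" sign and normalises case before feature extraction. A minimal sketch of those steps (the exact cleaning rules are an assumption, not the paper's pipeline):

```python
import re

def preprocess(tweet):
    """Common Twitter pre-processing: strip URLs, @mentions, and the
    '#' sign, then lowercase and collapse whitespace."""
    tweet = re.sub(r"https?://\S+", "", tweet)   # remove URLs
    tweet = re.sub(r"@\w+", "", tweet)           # remove @mentions
    tweet = tweet.replace("#", "")               # keep hashtag word, drop '#'
    return " ".join(tweet.lower().split())       # normalise case and spacing

print(preprocess("Loving the new update! @devteam #Happy https://t.co/xyz"))
# loving the new update! happy
```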


2020 ◽  
Vol 8 (6) ◽  
pp. 1042-1044

Social media has developed drastically over the years. These days, individuals from all around the globe use online networking sites to share data and information. Twitter is a well-known communication site where users post updates, known as tweets. Users share their daily lives and post opinions on everything, such as brands and places. Various consumers and advertisers use these tweets to gather insights into products and the opinions held about them. The aim of this paper is to present a model that performs sentiment analysis of real-time data collected from Twitter and classifies tweets as positive, negative, or neutral based on the sentiment expressed in them.
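The positive/negative/neutral classification described above can be illustrated with a tiny lexicon-based scorer; real systems would use a trained classifier, and the word lists here are illustrative assumptions:

```python
# Toy lexicon-based polarity classifier: count positive and negative
# words and map the net score to a three-way label.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "sad"}

def classify(tweet):
    words = set(tweet.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify("love the great service"))   # positive
print(classify("terrible experience"))      # negative
print(classify("just landed in town"))      # neutral
```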


2016 ◽  
Vol 7 (3) ◽  
pp. 38-55
Author(s):  
Srinivasa K.G. ◽  
Ganesh Hegde ◽  
Kushagra Mishra ◽  
Mohammad Nabeel Siddiqui ◽  
Abhishek Kumar ◽  
...  

With the advancement of portable devices and sensors, there has been a need to build a universal framework that can serve as a nodal point to aggregate data from different kinds of devices and sensors. We propose a unified framework that provides a robust set of guidelines for sensors of varying degrees of complexity connected to a common set of System-on-Chip (SoC) platforms. These help to monitor, control, and visualize real-time data coming from the different types of sensors connected to these SoCs. We have defined a set of APIs that allow sensors to register with the server; these APIs are the standard with which the sensors comply while streaming data to client platforms.
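The register-then-stream pattern described above can be sketched as a small server-side registry; the class and method names are assumptions for illustration, not the paper's actual API:

```python
class SensorServer:
    """Minimal sketch of a registration-gated streaming endpoint."""

    def __init__(self):
        self.registry = {}

    def register(self, sensor_id, sensor_type):
        """A sensor must register before it may stream data."""
        self.registry[sensor_id] = {"type": sensor_type, "readings": []}
        return True

    def push(self, sensor_id, value):
        """Accept a reading only from a registered sensor."""
        if sensor_id not in self.registry:
            raise KeyError("sensor not registered")
        self.registry[sensor_id]["readings"].append(value)

server = SensorServer()
server.register("temp-01", "temperature")
server.push("temp-01", 21.5)
print(len(server.registry["temp-01"]["readings"]))  # 1
```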


2020 ◽  
Vol 12 (23) ◽  
pp. 10175
Author(s):  
Fatima Abdullah ◽  
Limei Peng ◽  
Byungchul Tak

The volume of streaming sensor data from various environmental sensors continues to increase rapidly due to wider deployments of IoT devices at much greater scales than ever before. This, in turn, causes a massive increase in fog and cloud network traffic, which leads to heavily delayed network operations. In streaming data analytics, the ability to obtain real-time data insight is crucial to computational sustainability for many IoT-enabled applications such as environmental monitoring, pollution and climate surveillance, traffic control, and even e-commerce applications. However, such network delays prevent us from achieving high-quality real-time data analytics of environmental information. To address this challenge, we propose the Fog Sampling Node Selector (Fossel) technique, which can significantly reduce IoT network and processing delays by algorithmically selecting an optimal subset of fog nodes to perform the sensor data sampling. In addition, our technique performs simple query execution within the fog nodes to further reduce network delays by processing the data near the data-producing devices. Our extensive evaluations show that the Fossel technique outperforms the state-of-the-art in latency reduction as well as in bandwidth consumption, network usage, and energy consumption.


2021 ◽  
Author(s):  
Xin Liu ◽  
Insa Meinke ◽  
Ralf Weisse

Abstract. Storm surges represent a major threat to many low-lying coastal areas of the world. While most places can cope with, or are more or less adapted to, present-day risks, future risks may increase from factors such as sea level rise, subsidence, or changes in storm activity. This may require further or alternative adaptation strategies. For most places, both forecasts and real-time observations are available. However, analyses of long-term changes or recent severe extremes that are important for decision-making are usually available only sporadically or with substantial delay. In this paper, we propose contextualizing real-time data with long-term statistics to make such information publicly available in near real-time. We implement and demonstrate the concept of a "storm surge monitor" for tide gauges along the German North Sea and Baltic Sea coasts. It provides automated near-real-time assessments of the course and severity of the ongoing storm surge season and its individual events. The assessment is provided in terms of storm surge height, frequency, duration, and intensity. We propose that such near-real-time assessments provide added value to the public and to decision-making, and suggest that the concept is transferable to other coastal regions threatened by storm surges.
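The four event statistics named above (height, frequency, duration, intensity) can be sketched for a tide-gauge series as runs of water levels above a threshold; the threshold value and sampling interval here are illustrative assumptions, not the monitor's actual definitions:

```python
def surge_events(levels_m, threshold_m):
    """Split a water-level series into threshold-exceedance events,
    summarising each by duration (samples), peak height (m), and
    intensity (summed exceedance above the threshold)."""
    events, current = [], None
    for level in levels_m:
        if level > threshold_m:
            if current is None:
                current = {"duration": 0, "peak": level, "intensity": 0.0}
            current["duration"] += 1
            current["peak"] = max(current["peak"], level)
            current["intensity"] += level - threshold_m
        elif current is not None:
            events.append(current)
            current = None
    if current is not None:          # series ends mid-event
        events.append(current)
    return events

series = [0.8, 1.6, 2.1, 1.9, 0.9, 1.7, 0.7]   # hourly levels, metres
events = surge_events(series, threshold_m=1.5)
print(len(events), events[0]["peak"])  # 2 2.1
```

The event count gives the season's frequency; sorting events by peak or intensity then ranks an ongoing surge against the long-term record.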


Author(s):  
Wajid Khan ◽  
Fiaz Hussain ◽  
Edmond C. Prakash

The arrival of e-commerce systems has contributed a great deal to the economy and has also played a vital role in collecting a huge amount of transactional data in the form of online orders and web enquiries. With such a huge volume of data, it is getting more difficult day by day to analyse business and consumer behaviour. There is therefore a greater need for business analytics tools that help decision makers understand their data properly, since understanding data leads to valuable outcomes: uncovering hidden trends, effective resource utilisation, improved decision-making ability, and a clearer understanding of a business and its core values.


2014 ◽  
Vol 2014 (1) ◽  
pp. 1607-1620
Author(s):  
Andrew Milanes ◽  
Mark Stevens ◽  
David Alford

ABSTRACT Geographic Information Systems (GIS) have become an integral component of the data management, analysis, and presentation needs of an emergency response. GIS allows the rapid integration of multiple data sets and is a tool utilized throughout the Incident Command System to aid timely, informed decision-making. Advances in mobile and hand-held devices, such as smartphones, tablets, and GPS units, have provided new capabilities in field GIS data collection and dissemination. In addition to GIS data, live streaming data feeds, such as vessel Automatic Identification System (AIS) data and video from remotely operated vehicles (ROVs), have become increasingly important to situational awareness. Prompt broadcasting of these data in a Common Operating Picture (COP) framework has become critical as the demand for real-time incident information increases.


Author(s):  
Stanly Wilson ◽  
Sivakumar R

People's day-to-day lives do not depend only on what they themselves think; they are affected and influenced by what others think. The advertisements and campaigns of favourite celebrities and mesmerizing personalities influence the way people think and see the world. People get news and information faster than ever before, and the growth of textual data on the internet is very rapid. People express themselves on the web every minute, using various platforms to share their views and opinions, and a huge amount of data is generated in this process. On Twitter, one of the most important and well-known social media platforms of the present time, millions of tweets are posted every day. These tweets are a source of very important information that can be used for business, small industries, creating government policies, and various studies. This paper focuses on the location from which tweets are posted and the language in which they are written. These details can be effectively analysed using Hadoop, a tool for analysing distributed big data, streaming data, timestamped data, and text data. With the help of Apache Flume, tweets are collected from Twitter and sunk into HDFS (Hadoop Distributed File System). The raw data are then analysed using Apache Pig, and the resulting information can be used for social and commercial purposes. The results are visualised using Apache Zeppelin.
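The Pig step above is essentially a GROUP BY over location and language. A minimal Python equivalent of that aggregation (the tweet field names are assumptions for illustration):

```python
from collections import Counter

def group_tweets(tweets):
    """Count tweets per (location, language) pair, mirroring a
    Pig GROUP ... BY (location, lang) followed by COUNT."""
    return Counter((t["location"], t["lang"]) for t in tweets)

tweets = [
    {"location": "Chennai", "lang": "en", "text": "..."},
    {"location": "Chennai", "lang": "ta", "text": "..."},
    {"location": "Chennai", "lang": "en", "text": "..."},
]
counts = group_tweets(tweets)
print(counts[("Chennai", "en")])  # 2
```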


We have real-time data everywhere, every day. Most of it comes from IoT sensors, GPS positions, web transactions, and social media updates. Real-time data is typically generated in a continuous fashion; such data are called data streams. Data streams are transient, and there is very little time to process each item in the stream, so it is a great challenge to perform analytics on rapidly flowing, high-velocity data. Another issue is the percentage of incoming data considered for analytics: the higher the percentage, the greater the accuracy. Considering these two issues, the proposed work is intended to find a better solution by gaining insight into real-time streaming data with minimum response time and greater accuracy. This paper combines two technology giants, TensorFlow and Apache Kafka: Apache Kafka is used to handle the real-time streaming data, while TensorFlow provides analytics with deep learning algorithms. Training and testing are done on the RideAustin connected-vehicle public data set. The experimental results on RideAustin show the predicted failure under each type of vehicle parameter. The comparative analysis showed a 16% improvement over the traditional machine learning algorithm.
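A highly simplified stand-in for the Kafka-plus-model pipeline described above: records are consumed one at a time, as from a Kafka topic, and a sliding-window mean flags a likely failure when a vehicle parameter drifts past a limit. The field, window size, and limit are assumptions, and the windowed mean stands in for the paper's deep learning model:

```python
from collections import deque

def stream_predict(readings, window=3, limit=90.0):
    """Flag each reading whose trailing windowed mean exceeds the limit."""
    buf, flags = deque(maxlen=window), []
    for r in readings:          # one record at a time, as from a consumer
        buf.append(r)
        flags.append(sum(buf) / len(buf) > limit)
    return flags

engine_temp = [85.0, 88.0, 93.0, 95.0, 97.0]
print(stream_predict(engine_temp))
# [False, False, False, True, True]
```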

