Exploring ridesourcing trip patterns by fusing multi-source data: A big data approach

2021 ◽  
Vol 64 ◽  
pp. 102499
Author(s):  
Hui Bi ◽  
Zhirui Ye
Keyword(s):  
Big Data ◽ 

2020 ◽ 
Author(s):  
Jiting Tang ◽  
Saini Yang ◽  
Weiping Wang

<p>In 2019, Typhoon Lekima struck China, bringing strong winds and heavy rainfall to nine provinces and municipalities on the northeastern coast. According to the Ministry of Emergency Management of the People’s Republic of China, Lekima caused 66 direct fatalities, affected 14 million people, and produced direct economic losses exceeding 50 billion yuan. Conventional observation technologies, such as remote sensing and meteorological monitoring, have long data-collection cycles and little interaction with disaster victims. Social media big data is a new data source for natural disaster research that can inform natural hazard analysis, risk assessment and emergency rescue information management.</p><p>We propose a social media data-based framework for typhoon-induced flood assessment, which comprises five parts: (1) <strong>Data acquisition.</strong> Obtain Sina Weibo text and tag attributes based on keywords, time and location. (2) <strong>Spatiotemporal quantitative analysis.</strong> Quantify public concerns and trends in terms of words, time and space at different scales to delineate the impact range of the typhoon-induced flood. (3) <strong>Text classification and multi-source heterogeneous data fusion analysis.</strong> Build a hazard-intensity and disaster text classification model with a CNN (Convolutional Neural Network), then integrate multi-source data, including meteorological monitoring, socioeconomic statistics and disaster reports, for secondary evaluation and correction. (4) <strong>Text clustering and sub-event mining.</strong> Extract sub-events with the BIRCH (Balanced Iterative Reducing and Clustering using Hierarchies) text clustering algorithm to automatically recognize emergencies. (5) <strong>Emotion analysis and crisis management.</strong> Use a spatiotemporal sequence model and a four-quadrant analysis method to track negative public emotions and identify potential crises for emergency management.</p><p>This framework is validated with a case study of Typhoon Lekima. The results show that social media big data fills gaps in data timeliness and spatial coverage. Our framework can assess the influence coverage, hazard intensity, disaster information and emergency needs, and it can reconstruct the disaster propagation process from the spatiotemporal sequence. After secondary correction with multi-source data, the assessment results can be used in operational systems.</p><p>The proposed framework can be applied over a wide spatial scope, up to full coverage; it is spatially efficient and can obtain feedback from affected areas and people almost as soon as a disaster occurs. Hence, it has promising potential for large-scale, real-time disaster assessment.</p>
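The abstract's four-quadrant emotion analysis can be sketched minimally as follows. The paper does not specify its axes or thresholds, so this illustration assumes hypothetical axes of sentiment polarity and discussion heat, with made-up topic scores:

```python
# Illustrative four-quadrant analysis: place each topic by sentiment
# polarity (x axis) and discussion heat (y axis). Axes, thresholds and
# quadrant labels are hypothetical, not taken from the paper.
def quadrant(polarity, heat, heat_threshold=0.5):
    """Return a quadrant label for a topic.

    polarity: mean sentiment in [-1, 1] (negative = public discontent)
    heat:     normalized discussion volume in [0, 1]
    """
    if polarity < 0 and heat >= heat_threshold:
        return "crisis"        # negative and widely discussed: intervene
    if polarity < 0:
        return "latent risk"   # negative but low volume: monitor
    if heat >= heat_threshold:
        return "mobilization"  # positive and widely discussed
    return "stable"            # positive, low volume

# Hypothetical topic scores extracted from classified Weibo posts.
topics = {
    "flooded subway": (-0.8, 0.9),
    "power outage":   (-0.4, 0.2),
    "rescue teams":   ( 0.6, 0.7),
    "weather update": ( 0.3, 0.1),
}
labels = {t: quadrant(p, h) for t, (p, h) in topics.items()}
```

Tracking how topics migrate between quadrants over time would supply the spatiotemporal sequence the framework uses to flag emerging crises.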


2016 ◽  
Vol 07 (03) ◽  
pp. 31-33
Author(s):  
ATIF AZIZ ◽  
◽  
RAJEEV ARYA ◽  
SANA SHAFIQUE ◽  
◽  
...  

Sensors ◽  
2018 ◽  
Vol 18 (11) ◽  
pp. 3980 ◽  
Author(s):  
Shalli Rani ◽  
Sajjad Chauhdary

Under the umbrella of the Internet of Things (IoT), various heterogeneous devices and objects will be integrated for transparent and seamless communication. This would facilitate open access to data for the growth of various digital services. Building a general IoT framework is a complex task because of the heterogeneity of devices, technologies, platforms and services operating in the same system. In this paper, we focus on a framework for Big Data analytics in Smart City applications, which, being a broad category, spans a distinct domain for each application. IoT is intended to support the vision of the Smart City, where advanced communication technologies are used to improve citizens' quality of life. A novel approach is proposed in this paper to enhance energy conservation and reduce delay in Big Data gathering at the tiny sensor nodes used in the IoT framework. To realize the Smart City scenario in terms of Big Data in IoT, an efficient, QoS-optimized wireless sensor network (WSN) with energy-efficient node communication is required. Thus, a new protocol, QoS-IoT (quality-of-service-enabled IoT), is proposed on the top layer of the proposed architecture (a five-layer architecture consisting of technology, data source, data management, application and utility layers) and is validated against traditional protocols.
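The abstract does not give the QoS-IoT protocol's internals, but the energy- and delay-aware data-gathering idea it describes can be illustrated with a greedy next-hop choice. The scoring weights and node values below are hypothetical, purely for illustration:

```python
# Hypothetical next-hop selection for energy- and delay-aware data
# gathering in a WSN. A greedy sketch of the general idea, not the
# QoS-IoT protocol itself; weights and node data are made up.
def pick_next_hop(neighbors, w_energy=0.7, w_delay=0.3):
    """Score neighbors by residual energy (higher is better) and link
    delay (lower is better); return the best-scoring node id.

    neighbors: dict of node_id -> (residual_energy in [0,1], delay in [0,1])
    """
    def score(item):
        _, (energy, delay) = item
        return w_energy * energy - w_delay * delay
    return max(neighbors.items(), key=score)[0]

neighbors = {
    "n1": (0.9, 0.9),   # high energy, slow link
    "n2": (0.6, 0.1),   # moderate energy, fast link
    "n3": (0.2, 0.05),  # nearly depleted
}
best = pick_next_hop(neighbors)
```

Weighting residual energy above delay, as here, spreads load away from depleted nodes and so extends network lifetime at a modest latency cost.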


Big Data is a term used to represent huge volumes of both structured and unstructured data that cannot be processed by traditional data processing techniques. Such data grows exponentially and does not fit the structure of traditional database systems. Analyzing Big Data is challenging because of the sheer volume involved, and as an industry and its business grow, its data tends to grow on an ever larger scale. Powerful data analysis tools are required to extract value from it. Hadoop is a sought-after open-source framework that uses MapReduce techniques to store and process huge datasets. However, programs written directly with MapReduce are inflexible and require maintenance; this problem is overcome by using HiveQL. HiveQL queries run on Hive, an open-source data warehousing system built on Hadoop, and are compiled into MapReduce jobs that Hadoop executes. In this paper we analyze the Indian Premier League dataset using HiveQL and compare its execution time with that of traditional SQL queries. We found that HiveQL provided better performance on larger datasets, while SQL performed better on smaller datasets.
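The map-shuffle-reduce pattern that Hive compiles HiveQL queries into can be sketched in-process. This toy example mimics a GROUP BY/COUNT over hypothetical IPL-style rows (team abbreviations are illustrative, not from the paper's dataset):

```python
# Minimal in-process sketch of the map -> shuffle -> reduce pattern
# underlying a HiveQL GROUP BY/COUNT. Real Hadoop distributes these
# phases across nodes; here they run sequentially for clarity.
from collections import defaultdict

def map_phase(records):
    # Emit (key, 1) pairs, like mappers scanning table rows.
    for team in records:
        yield team, 1

def shuffle(pairs):
    # Group intermediate pairs by key, like the Hadoop shuffle stage.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum each key's values, like reducers computing COUNT(*).
    return {key: sum(values) for key, values in groups.items()}

# Hypothetical match-winner rows from an IPL-style dataset.
rows = ["MI", "CSK", "MI", "RCB", "CSK", "MI"]
counts = reduce_phase(shuffle(map_phase(rows)))
# Equivalent HiveQL: SELECT winner, COUNT(*) FROM matches GROUP BY winner;
```

The per-job startup and shuffle overhead of this pipeline is why Hive loses to plain SQL on small datasets yet wins once the data outgrows a single machine.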


2021 ◽  
Vol 18 (4) ◽  
pp. 763-768
Author(s):  
S. V. Sdobnikova

The data analyzed in this review indicate that an important feature of the natural evolution of diabetic retinopathy (DR) is the possibility of reverse development of its main signs, including newly formed vessels. The term “spontaneous remission”, proposed by M.D. Davis, may be correct to describe this condition. Spontaneous remission can be persistent, and its frequency can significantly exceed the generally accepted 10 %. The signs of remission of proliferative diabetic retinopathy (PDR), regardless of its cause (spontaneous or resulting from treatment), are: absence of ophthalmoscopically detectable neovessels; and an increase in, or appearance of, the fibrous component of proliferation, which is accompanied by traction deformation of the retina. Therefore, a scale reflecting the stages of evolution of newly formed vessels and a severity scale reflecting the degree of threat to visual function in PDR cannot be identical. Since the development and regression of neovessels reflect opposing processes, identifying the phase of PDR evolution is fundamental to research design. Given the possibility of applying artificial intelligence to the analysis of “big data”, the effectiveness of any approach to studying DR will largely be determined by how adequately the source data are grouped. In this regard, analysis of previous experience is relevant, as it allows some principles of systematizing results to be improved. Conclusion: Establishing the phase of neovessel evolution in PDR is fundamental in epidemiological and scientific studies. Identifying signs that indicate the likelihood of spontaneous remission of DR/PDR will allow a differentiated approach to treatment, as well as study of its association with the dynamics of the patient’s somatic status.


2020 ◽  
Author(s):  
Peng Chu ◽  
Zhiqiang Dong ◽  
Yarong Chen ◽  
Changqing Yu ◽  
Yangchao Huang
Keyword(s):  
Big Data ◽  

2022 ◽  
pp. 30-57
Author(s):  
Richard S. Segall

The purpose of this chapter is to illustrate how artificial intelligence (AI) technologies have been used for COVID-19 detection and analysis. Specifically, the use of neural networks (NN) and machine learning (ML) is described, along with which countries are developing these techniques and how they are being applied to COVID-19 diagnosis and detection. Illustrations of multi-layer convolutional neural networks (CNN), recurrent neural networks (RNN), and deep neural networks (DNN) show how each is used for COVID-19 detection and prediction. A summary of big data analytics for COVID-19 is provided, together with available open-source COVID-19 data sets and repositories and their characteristics for research and analysis. An example of AI and NN applications using real-time COVID-19 data is also shown.

