A survey of data visualization tools for analyzing large volume of data in big data platform

Author(s):  
R. S. Raghav ◽  
Sujatha Pothula ◽  
T. Vengattaraman ◽  
Dhavachelvan Ponnurangam

2021 ◽  
Vol 6 (2) ◽  
pp. 24-31
Author(s):  
Stefana Janićijević ◽  
Vojkan Nikolić

Networks are all around us. Graph structures lie at the core of every network system, so networks can naturally be understood as graphs and treated as data visualization objects. These objects grow from abstract mathematical paradigms into information insights and connection channels. Essential graph metrics such as degree centrality, closeness centrality, betweenness centrality, and PageRank centrality were calculated; each of them describes communication within the graph system. The main goal of this research is to examine visualization methods for existing Big Data and to present new approaches and solutions for the current state of Big Data visualization. This paper provides a classification of existing data types, analytical methods, techniques, and visualization tools, with special emphasis on the evolution of visualization methodology in recent years. Based on the results obtained, the shortcomings of existing visualization methods become apparent.
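A minimal sketch of the centrality metrics named above; the NetworkX library and the example graph are assumptions for illustration, not tooling specified by the paper:

```python
# Compute the centrality metrics discussed above on a small example graph.
# NetworkX and the karate-club graph are assumed purely for illustration.
import networkx as nx

G = nx.karate_club_graph()  # a classic small social network

metrics = {
    "degree": nx.degree_centrality(G),            # share of direct neighbours
    "closeness": nx.closeness_centrality(G),      # inverse average distance to all nodes
    "betweenness": nx.betweenness_centrality(G),  # share of shortest paths passing through a node
    "pagerank": nx.pagerank(G),                   # random-walk (stationary) importance
}

# Show the most "communicative" node according to each metric.
for name, values in metrics.items():
    top = max(values, key=values.get)
    print(f"{name:12s} -> node {top} ({values[top]:.3f})")
```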



2022 ◽  
pp. 590-621
Author(s):  
Obinna Chimaobi Okechukwu

In this chapter, a discussion is presented on the latest tools and techniques available for Big Data visualization; these tools and techniques need to be understood appropriately in order to analyze Big Data. Big Data is a whole new paradigm in which huge sets of data are generated and analyzed based on volume, velocity, and variety. Conventional data analysis methods are incapable of processing data at this scale; hence, it is fundamentally important to be familiar with new tools and techniques capable of processing such datasets. This chapter will illustrate tools available to analysts for processing and presenting Big Data sets in ways that support appropriate decisions. Some of these tools (e.g., Tableau, RapidMiner, R Studio) have phenomenal capabilities to visualize processed data in ways traditional tools cannot. The chapter will also explain the differences between these tools and their utility in different scenarios.
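As a rough illustration of the kind of step such tools perform behind the scenes, the sketch below pre-aggregates a large event log before plotting it. pandas, matplotlib, and the events.csv file are assumptions for illustration, not tools or data named in the chapter:

```python
# Hypothetical sketch: reduce a large event log to a plottable summary first.
import pandas as pd
import matplotlib.pyplot as plt

# Assumed input: a large CSV of events with 'timestamp' and 'value' columns.
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

# Aggregate to daily means so the chart stays readable regardless of raw volume.
daily = events.set_index("timestamp")["value"].resample("D").mean()

daily.plot(title="Daily mean of raw event values")
plt.xlabel("date")
plt.ylabel("mean value")
plt.tight_layout()
plt.show()
```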





Author(s):  
José M. Conejero ◽  
Juan Carlos Preciado ◽  
Alvaro E. Prieto ◽  
Roberto Rodriguez-Echeverria ◽  
Fernando Sánchez-Figueroa

In recent years, the growing volume and number of data sources have made Big Data technologies mainstream. In that context, techniques such as Data Visualization are increasingly used to group large amounts of data in order to transform them into useful information. These techniques are now included in Business Intelligence approaches to provide companies and public organizations with helpful tools for making decisions based on evidence rather than intuition. The Sankey diagram is an example of such complex visualization tools, allowing the user to graphically trace meaningful relationships in large volumes of data. However, this type of diagram is usually static, so it must be continuously and manually rebuilt on top of massive multivariable environments whenever decision makers need to evaluate different options, and it does not allow conditions to be established over the data shown. This paper presents LiveSankey, an approach to automatically generate dynamic Sankey diagrams that allow users to filter the data shown. As a result, multiple conditions may be established over the underlying data and the corresponding diagram can be dynamically rebuilt.
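The LiveSankey implementation itself is not given in the abstract; the sketch below only illustrates the underlying idea (filter the flow data, then rebuild the Sankey diagram), using pandas and Plotly as assumed tooling and an invented flow table:

```python
# Illustrative only: rebuild a Sankey diagram after filtering the flow data,
# mimicking the kind of dynamic filtering LiveSankey is described as providing.
import pandas as pd
import plotly.graph_objects as go

# Hypothetical flow data: source node, target node, flow size.
flows = pd.DataFrame({
    "source": ["Sales", "Sales", "Marketing", "Marketing"],
    "target": ["EU", "US", "EU", "US"],
    "value":  [120, 80, 40, 60],
})

def build_sankey(df: pd.DataFrame, min_value: int = 0) -> go.Figure:
    """Rebuild the diagram from whatever subset of flows passes the filter."""
    df = df[df["value"] >= min_value]
    labels = pd.unique(df[["source", "target"]].values.ravel()).tolist()
    index = {name: i for i, name in enumerate(labels)}
    return go.Figure(go.Sankey(
        node=dict(label=labels),
        link=dict(
            source=[index[s] for s in df["source"]],
            target=[index[t] for t in df["target"]],
            value=df["value"].tolist(),
        ),
    ))

# Re-running with a different condition yields a freshly rebuilt diagram.
build_sankey(flows, min_value=50).show()
```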



Author(s):  
Rabindra K. Barik ◽  
Rakesh K. Lenka ◽  
Syed Mohd Ali ◽  
Noopur Gupta ◽  
Ananya Satpathy ◽  
...  


Author(s):  
Madhavi Arun Vaidya ◽  
Meghana Sanjeeva

Research, which is an integral part of higher education, is undergoing a metamorphosis. Researchers across disciplines increasingly use electronic tools to collect, analyze, and organize data. This “data deluge” creates a need to develop policies, infrastructures, and services within organisations, with the objective of assisting researchers in creating, collecting, manipulating, analysing, transporting, storing, and preserving datasets. Research is now conducted in the digital realm, with researchers generating and exchanging data among themselves. Research data management in the context of library data can undoubtedly be treated as big data, given its properties of large volume, high velocity, and obvious variety. To sum up, big datasets need to be made more useful, visible, and accessible. With new and powerful big data analytics, such as information visualization tools, researchers can look at data in new ways and mine it for the information they need.



Author(s):  
Mrs Poonam ◽  
Mrs. Aditi Mittal

In the 2020s, the number of devices and machines connecting to the internet to transmit information for analysis has increased exponentially, and with it the rate of data generation. Big Data and IoT are two new trends; combining them creates a technical revolution. Approximately one eighth of an iceberg’s total mass is visible above water, while the remaining seven eighths stretch into the ocean, hidden from view. Similarly, many devices are connected to the internet, yet the huge amounts of data they produce are not fully used in any field, leaving large amounts of dark data. Hence, there is a need to analyze this data and to integrate Big Data with IoT. This paper discusses Big Data and IoT, how IoT functions with Big Data, the role of Big Data in IoT, and some data visualization tools for representing that data.




