Analysis on the Application of Big Data Technology in “Internet + Wisdom Judicial Expertise”

2021 ◽  
Vol 251 ◽  
pp. 01029
Author(s):  
Juan Xu

The current development of the field of forensic expertise shows that the application of information-based mechanisms still faces many problems: the informatization construction process is imperfect and the available identification forms are too limited, which has a great impact on the final identification results. This article summarizes the main role of big data technology in the construction of “Internet + Wisdom Judicial Expertise” and offers specific application recommendations from four aspects: focusing on the collection and sorting of forensic appraisal data, understanding the application requirements of forensic appraisal big data, building a big data analysis platform for forensic appraisal, and strengthening resource and network sharing on that platform.

2021 ◽  
Vol 4 (6) ◽  
Author(s):  
Difei Zhang

Big data technology has spread worldwide and continues to develop and find new applications. To enhance the application value of a big data analysis platform, its data analysis and processing capacity must be improved continuously, so as to build a complete platform that realizes resource sharing and real-time data collection. As a key element of contemporary information development, the big data analysis platform is of great significance in promoting the exchange of social data. On this basis, this paper focuses on two aspects: first, it describes the construction process and content of a big data platform; second, it summarizes relevant applications and development of big data platforms for reference.


2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Kehua Miao ◽  
Jie Li ◽  
Wenxing Hong ◽  
Mingtao Chen

The booming development of data science and big data technology stacks has inspired continuous, iterative updates of data science research and working methods. At present, the division of labor between data science and big data is increasingly fine-grained. Traditional work methods, from building the work infrastructure environment to data modelling and analysis, greatly reduce work and research efficiency. In this paper, we build a data science and big data analysis application platform based on a microservices architecture, aimed at friendly collaboration within data science teams in education and nonprofessional research fields. In this microservices-based environment, which makes each component easy to update, the platform provides a personal code-experiment environment that integrates JupyterHub with Spark and HDFS for multiuser use, and a visual modelling tool that follows the modular design of data science engineering, based on Greenplum in-database analysis. The entire web service system is developed with Spring Boot.
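The modular design the abstract attributes to the visual modelling tool can be sketched as composable pipeline components. This is a minimal plain-Python illustration, not the platform's actual implementation (which runs on Spark/HDFS and Greenplum); the component names are hypothetical.

```python
# Minimal sketch of a modular analysis pipeline: each step is a named,
# self-contained component, and a pipeline chains them over shared records.
# (Illustrative only; the real platform executes steps on Spark/Greenplum.)
from typing import Callable, Dict, List, Optional

Record = Dict[str, Optional[float]]
Step = Callable[[List[Record]], List[Record]]

def make_pipeline(*steps: Step) -> Step:
    """Compose independent analysis components into one pipeline."""
    def run(data: List[Record]) -> List[Record]:
        for step in steps:
            data = step(data)
        return data
    return run

# Two hypothetical components: drop invalid rows, then normalise a field.
def drop_missing(data: List[Record]) -> List[Record]:
    return [r for r in data if r.get("score") is not None]

def scale_scores(data: List[Record]) -> List[Record]:
    top = max(r["score"] for r in data)
    return [{**r, "score": r["score"] / top} for r in data]

pipeline = make_pipeline(drop_missing, scale_scores)
result = pipeline([{"score": 2.0}, {"score": None}, {"score": 4.0}])
print(result)  # [{'score': 0.5}, {'score': 1.0}]
```

Because each step has the same signature, components can be swapped or reordered without touching the rest of the pipeline, which is the property a visual modelling tool relies on.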


2018 ◽  
Vol 1060 ◽  
pp. 012023
Author(s):  
Zhixiang Wang ◽  
Yao Bu ◽  
Demeng Bai ◽  
Bin Wu ◽  
Jiafeng Qin

2014 ◽  
Vol 484-485 ◽  
pp. 922-926
Author(s):  
Xiang Ju Liu

This paper introduces the operational characteristics and current challenges of the big data era, and presents the research and design of a cloud-computing-based big data analytics platform, covering the platform's architecture system, software architecture, network architecture, and unified program features. The paper also analyzes the competitive advantages of a unified big data analysis program on a cloud computing platform, which can play a certain role in the future business development of telecom operators.


2019 ◽  
Vol 3 (1) ◽  
Author(s):  
Xi Chen ◽  
Bo Fan ◽  
Jie Zheng ◽  
Hongyan Cui

At present, improving production efficiency and life experience through big data analysis has become a hot research field. In the process of big data analysis, how to vividly display the results is crucial. This paper therefore introduces a big data visualization analysis platform for the financial field. The platform adopts an MVC architecture composed of two parts, the background and the front end. The background is built on the Django framework; the front end is built with HTML5, CSS3, and JavaScript, with charts rendered by ECharts. Using bank data, the platform classifies customers' savings potential and builds portraits of customers at different savings levels. The analysis results can be displayed dynamically and interact with users.
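In an architecture like the one described, the Django background typically hands the front end a JSON "option" object that ECharts renders directly. The sketch below builds such a payload in plain Python; the field names follow the ECharts option format, while the savings-level categories, counts, and function name are hypothetical.

```python
# Sketch of the JSON payload a Django view might return for the ECharts
# front end: one bar-chart "option" describing customers per savings level.
# (Illustrative data; the real platform derives counts from bank data.)
import json

def savings_portrait_option(levels, counts):
    """Build an ECharts bar-chart option for customer savings levels."""
    return {
        "title": {"text": "Customers by savings level"},
        "xAxis": {"type": "category", "data": levels},
        "yAxis": {"type": "value"},
        "series": [{"type": "bar", "data": counts}],
    }

option = savings_portrait_option(["low", "medium", "high"], [420, 310, 95])
payload = json.dumps(option)  # what the HTTP endpoint would send
print(payload)
```

On the front end, `echarts.setOption(option)` would render this object as-is, which is why keeping the backend response in the option format avoids any client-side transformation.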


2016 ◽  
Vol 11 (2) ◽  
pp. 164-174 ◽  
Author(s):  
Shunichi Koshimura

A project titled “Establishing the advanced disaster reduction management system by fusion of real-time disaster simulation and big data assimilation,” was launched as Core Research for Evolutional Science and Technology (CREST) by the Japan Science and Technology Agency (JST). Intended to save as many lives as possible in future national crises involving earthquake and tsunami disasters, the project works on a disaster mitigation system of the big data era, based on cooperation of large-scale, high-resolution, real-time numerical simulations and assimilation of real-time observation data. The world’s most advanced specialists in disaster simulation, disaster management, mathematical science, and information science work together to create the world’s first analysis platform for real-time simulation and big data that effectively processes, analyzes, and assimilates data obtained through various observations. Based on quantitative data, the platform designs proactive measures and supports disaster operations immediately after disaster occurrence. 
The project was launched in 2014 and is at present working on the following issues.

Sophistication and fusion of simulations and damage prediction models using observational big data: development of a real-time simulation core system that predicts the time evolution of disaster effects by assimilating location information, fire information, and building-collapse information obtained from mobile terminals, satellite images, aerial images, and other new observation data, in addition to sensing data from the undersea high-density seismic observation network.

Latent structure analysis and major disaster scenario creation based on a huge amount of simulation results: development of a method for analyzing and extracting the latent structure of the huge number of disaster scenarios generated by simulation, and creation of severe scenarios with minimum “unexpectedness” by controlling disaster scenario explosion (an explosive increase in the number of predicted scenarios).

Establishment of an earthquake and tsunami disaster mitigation big data analysis platform: development of a platform that realizes analysis of a huge number of disaster scenarios and faster data assimilation, and clarifies the requirements for operating the platform as a disaster mitigation system.

The project is a 5-year project. It consists of element technology development and system fusion; a feasibility study as a next-generation disaster mitigation system (validation with and without introduction of the developed real-time simulation and big data analysis platform) in the areas affected by the Great East Japan Earthquake; and test operations in areas affected by the Tokyo metropolitan earthquake and the Nankai Trough earthquake.

