Accurate mining of location data in the communication field based on big data

2021 ◽  
pp. 1-14
Author(s):  
Guanqun Cai

In order to extract value from data, data mining and data analysis software are widely used in industry. This study focuses on the precise mining of location data in the communication field based on big data. The signaling preprocessing layer obtains signaling messages through an acquisition module, filters FISU messages out of the signaling stream, identifies abnormal message frames, and stamps each message with a timestamp, providing an effective data source for the next stage of processing. The signaling access layer completes signaling link access, using high-resistance jumper technology, time-slot convergence technology, optical access technology, and 155M DXC conversion technology to access 2 Mbit/s and 155 Mbit/s links, respectively. The signaling collection module collects the links either directly or through a DXC so that they reach the front-end data collection machine and its signaling collection module, which also completes part of the message processing. The presentation layer is the human-computer interaction window of the whole system, presenting users with a friendly interface and complete functions. The main goal of real-time big data analysis is to obtain the signaling data sent by the signaling acquisition system, screen out the effective information according to the monitoring conditions, and then produce the final real-time monitoring results. The geographic information module provides a visual map control for the regional monitoring big data analysis module, and the difficulty of system development is reduced by using an existing WebGIS map toolkit. When a call from a Unicom customs bureau in another city is placed to the mobile gateway office, the call is rejected by the mobile customs bureau: the call duration is 0 seconds, and the interception success rate within 1 s reaches 90%.
This research is of great significance for the better development and maintenance of the signaling network and monitoring system.
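The preprocessing step described above (dropping FISU fill units, flagging abnormal frames, and timestamping) can be sketched as follows. The frame representation, field names, and the abnormality rule here are illustrative assumptions, not the system's actual message format.

```python
import time

# Hypothetical signaling frame: (length_indicator, payload) tuples.
# In MTP2 signaling, a length indicator of 0 marks a FISU (fill-in
# signal unit), which carries no user data and is filtered out.
FISU_LI = 0
MAX_FRAME_LEN = 272  # MTP2 caps the signaling information field at 272 octets


def preprocess(frames):
    """Drop FISU frames, flag abnormal frames, and attach a timestamp."""
    out = []
    for li, payload in frames:
        if li == FISU_LI:
            continue  # fill-in unit: no user data, discard
        abnormal = len(payload) > MAX_FRAME_LEN or li < 0
        out.append({
            "payload": payload,
            "abnormal": abnormal,
            "timestamp": time.time(),  # stamp for downstream correlation
        })
    return out
```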

2017 ◽  
Vol 13 (7) ◽  
pp. 155014771772181 ◽  
Author(s):  
Seok-Woo Jang ◽  
Gye-Young Kim

This article proposes an intelligent monitoring system for semiconductor manufacturing equipment, which determines spec-in or spec-out for a wafer in process, using Internet of Things–based big data analysis. The proposed system consists of three phases: initialization, learning, and real-time prediction. The initialization sets the weights and the effective steps for all parameters of the equipment to be monitored. The learning performs a clustering that assigns similar patterns to the same class. The patterns consist of multiple time series produced by semiconductor manufacturing equipment and an after-clean inspection measured by the corresponding tester. We modify the Linde, Buzo, and Gray algorithm for classifying the time-series patterns; the modified algorithm outputs a reference model for every cluster. The prediction compares a time series entered in real time with the reference models using statistical dynamic time warping to find the best-matched pattern, then calculates a predicted after-clean inspection by combining the measured after-clean inspection, the dissimilarity, and the weights. Finally, it determines spec-in or spec-out for the wafer. We present experimental results showing how the proposed system applies to data acquired from semiconductor etching equipment.
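The core of the prediction phase, matching an incoming time series against each cluster's reference model, can be illustrated with a plain dynamic time warping distance. The statistical weighting the article describes is omitted, so this is only a minimal sketch of the matching step.

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    # dp[i][j] = cost of the best warping path aligning a[:i] with b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # insertion
                                  dp[i][j - 1],      # deletion
                                  dp[i - 1][j - 1])  # match
    return dp[n][m]


def best_match(series, reference_models):
    """Return the index of the reference model closest to the series."""
    return min(range(len(reference_models)),
               key=lambda k: dtw_distance(series, reference_models[k]))
```

Because DTW warps the time axis, a pattern that is stretched or delayed relative to its reference model still matches with low cost, which is why it suits equipment traces whose steps vary in duration.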


2021 ◽  
Author(s):  
Jinhui Yu ◽  
Xinyu Luan ◽  
Yu Sun

Because each website differs in structure and content, it is often difficult for international applicants to obtain each school's application information in time. They need to spend a lot of time manually collecting and sorting information, and when a school's information is constantly updated, what they have gathered can become very inaccurate. We designed a tool with three main steps to solve the problem: crawling links, processing web pages, and building my pages. For the implementation we mainly use Python and store the crawled data in JSON format [4]. In the link-crawling step, we mainly used Beautiful Soup to parse HTML and designed a crawler. First, we use the crawler to fetch all the links related to admission information on the school's official website. Then we traverse these links and use the noise_remove [5] method to process their corresponding page contents, so as to further narrow the scope of effective information, and save the processed contents in JSON files. Finally, we use the Flask framework to integrate these contents into the front-end page conveniently and efficiently, so that it has the complete function of integrating and displaying information.
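The crawl-and-clean pipeline above can be sketched as follows. The link-filtering rule and the `noise_remove` body are simplified stand-ins for the tool's actual logic (the real cleaning method is the one cited in [5]), and the standard-library HTML parser substitutes for Beautiful Soup so the sketch stays self-contained.

```python
import json
import re
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collect href values from anchor tags (stand-in for Beautiful Soup)."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def admission_links(html, keyword="admission"):
    """Keep only links whose URL mentions admissions (illustrative rule)."""
    parser = LinkCollector()
    parser.feed(html)
    return [u for u in parser.links if keyword in u.lower()]


def noise_remove(text):
    """Simplified cleaner: collapse runs of whitespace into single spaces."""
    return re.sub(r"\s+", " ", text).strip()


def save_pages(pages, path):
    """Persist {url: cleaned_text} mappings as a JSON file, as the tool does."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump({url: noise_remove(txt) for url, txt in pages.items()}, f)
```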


2020 ◽  
Author(s):  
Pingyu Fan ◽  
Kwok Pan Chun ◽  
Ana Mijic ◽  
Daphne Ngar-Yin Mah

<p>Digital water and energy maps allow fast information retrieval, big data analysis and resource demand prediction for real-time responses in 5G networks. A regulatory systems framework is needed to enable and promote integrated actions grounded on map-based feedback information, to facilitate resource movements and knowledge transfer for water and energy security. At the same time, the proposed regulatory system needs to safeguard national security and personal privacy when the general public and the private sector have access to big databases.</p><p>The Guangdong-Hong Kong-Macao Greater Bay Area (GBA) in China is an initiative on regional economic development involving nine mainland cities and two Special Administrative Regions (SARs). As central policies cannot be efficiently executed across the whole region, institutional fragmentation could be a prominent barrier to achieving a regional water and energy optimum, rather than individual city maxima, for the water and energy nexus.</p><p>In this study, we propose a systems regulatory framework that integrates natural, urban and social systems across multiple scales, in which the relevant laws, policies, decisions and actions are supported by digital maps. On a planning scale, our new regulatory system based on spatial map information promotes optimum use of natural capital and ecosystem services (ES). For linking different urban spatial processes on different scales, satellite images and Local Climate Zone (LCZ) maps are used to describe the natural environment and urban characteristics from 200 km to 10 km resolutions, supporting land-use planning laws and estimating regional development carrying capacity to mitigate water and energy insecurity.</p><p>On an operational scale, smart meters and remote sensor systems provide real-time water and energy information from a fast-developing 5G network for the proposed digital maps. 
Forecasted energy and water demands from the digital maps can be used to reinforce regional or local environmental regulation. The proposed spatial maps also improve transboundary collaboration by visualising legal targets and emission limits. Through digital maps, key agencies and sectors will have the capacity to share transboundary knowledge, information and responsibility, fostering smooth system flows in terms of culture, economy, policy and technology through active participation and decentralized actions.</p><p>On an evaluation scale, open map information increases the transparency of legal targets and pollution limits. Through rapid information retrieval and big data analysis from digital maps, regulators can assess the performance of water and energy security practices.</p><p>In summary, the proposed framework based on LCZ maps for the GBA can be applied to other rapidly developing regions with emerging 5G networks. The integrated regulatory framework also guides water and energy security practices and transfers central policies into local actions through rapid information retrieval, big data analysis and prediction of demand for real-time responses based on digital water and energy maps.</p>


2019 ◽  
Vol 3 (1) ◽  
Author(s):  
Xi Chen ◽  
Bo Fan ◽  
Jie Zheng ◽  
Hongyan Cui

At present, improving production efficiency and quality of life through big data analysis has become a hot research field, and vividly displaying the results of the analysis is crucial in that process. This paper therefore introduces a big data visualization and analysis platform for the financial field. The platform adopts the MVC architecture and is mainly composed of two parts: the back end and the front end. The back end is built on the Django framework, and the front end is built with HTML5, CSS3, and JavaScript; charts are rendered by ECharts. The platform classifies customers' savings potential from bank data and builds portraits of customers at different savings levels. The data analysis results can be dynamically displayed and interacted with.
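The customer-segmentation step (classifying savings potential and grouping customers into portraits) might look like the following sketch. The features and thresholds are invented for illustration; the abstract does not describe the platform's actual model.

```python
def savings_level(balance, monthly_income, monthly_spend):
    """Bucket a customer's savings potential (illustrative thresholds)."""
    surplus = monthly_income - monthly_spend
    if surplus <= 0:
        return "low"            # spending exceeds income: no savings capacity
    ratio = surplus / monthly_income
    if ratio > 0.4 or balance > 100_000:
        return "high"           # large surplus or large existing balance
    return "medium"


def portraits(customers):
    """Group customer ids by predicted savings level, as for customer portraits."""
    groups = {"low": [], "medium": [], "high": []}
    for c in customers:
        level = savings_level(c["balance"], c["income"], c["spend"])
        groups[level].append(c["id"])
    return groups
```

In the platform described above, output shaped like this would be serialized to JSON by a Django view and handed to ECharts for rendering.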


2016 ◽  
Vol 11 (2) ◽  
pp. 164-174 ◽  
Author(s):  
Shunichi Koshimura

A project titled “Establishing the advanced disaster reduction management system by fusion of real-time disaster simulation and big data assimilation,” was launched as Core Research for Evolutional Science and Technology (CREST) by the Japan Science and Technology Agency (JST). Intended to save as many lives as possible in future national crises involving earthquake and tsunami disasters, the project works on a disaster mitigation system of the big data era, based on cooperation of large-scale, high-resolution, real-time numerical simulations and assimilation of real-time observation data. The world’s most advanced specialists in disaster simulation, disaster management, mathematical science, and information science work together to create the world’s first analysis platform for real-time simulation and big data that effectively processes, analyzes, and assimilates data obtained through various observations. Based on quantitative data, the platform designs proactive measures and supports disaster operations immediately after disaster occurrence. 
The project was launched in 2014 and is working on the following issues at present.

Sophistication and fusion of simulations and damage prediction models using observational big data: Development of a real-time simulation core system that predicts the time evolution of disaster effects by assimilating location information, fire information, and building collapse information obtained from mobile terminals, satellite images, aerial images, and other new observation data, in addition to sensing data obtained from the undersea high-density seismic observation network.

Latent structure analysis and major disaster scenario creation based on a huge amount of simulation results: Development of an analysis and extraction method for the latent structure of the huge number of disaster scenarios generated by simulation, and creation of severe scenarios with minimum "unexpectedness" by controlling disaster scenario explosion (an explosive increase in the number of predicted scenarios).

Establishment of an earthquake and tsunami disaster mitigation big data analysis platform: Development of a platform that realizes analysis of a huge number of disaster scenarios and faster data assimilation, and clarifies the requirements for operating the platform as a disaster mitigation system.

The project was launched in 2014 as a 5-year project. It consists of element technology development and system fusion, a feasibility study as a next-generation disaster mitigation system (validation with and without introduction of the developed real-time simulation and big data analysis platform) in the areas affected by the Great East Japan Earthquake, and test operations in the areas that would be affected by a Tokyo metropolitan earthquake or a Nankai Trough earthquake.


2018 ◽  
Vol 7 (3.33) ◽  
pp. 248
Author(s):  
Young-Woon Kim ◽  
Hyeopgeon Lee

In the automobile industry, the contract information of vehicles contracted through sales activities, the order data of customers who purchased cars, and vehicle maintenance history information all accumulate in relational databases over time. Although accumulated customer and vehicle information is used for marketing purposes, processing and analyzing this massive data is difficult because its volume constantly increases. This problem of managing big data is commonly solved by utilizing the MapReduce distributed structure of Hadoop, a big data distributed processing technology, together with R, a widely used big data analysis technology. Among the methods that interconnect Hadoop and R, the R and Hadoop integrated programming environment (RHIPE) was used in this study to develop a real-time big data analysis system for marketing in the automobile industry. RHIPE allows us to maintain an interactive environment and use the powerful analytical features of R, an interpreted language, while achieving high processing speed using Map and Reduce functions. In this study, we developed a real-time big data analysis system that can analyze the orders, reservations, and maintenance history contained in big data using RHIPE.
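The Map/Reduce split that RHIPE exposes can be illustrated in plain Python. The record schema is hypothetical, and real RHIPE code would be written in R against a Hadoop cluster; the point here is only the pattern of emitting key-value pairs in the map phase and aggregating them in the reduce phase.

```python
from collections import defaultdict


def map_phase(records):
    """Map: emit (model, 1) for each maintenance record (hypothetical schema)."""
    for rec in records:
        yield rec["model"], 1


def reduce_phase(pairs):
    """Reduce: sum counts per key, as a Hadoop reducer would."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)


def maintenance_counts(records):
    """Count maintenance jobs per vehicle model via the map/reduce pattern."""
    return reduce_phase(map_phase(records))
```

On a real cluster, the map and reduce functions run in parallel across nodes, which is what gives the approach its speed on constantly growing data.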


Author(s):  
Dae Hyun Jung

This study emphasizes the necessity of introducing a blockchain-based joint logistics system to strengthen medical supply chain management (SCM) competency, and develops a healthcare supply chain management (HSCM) competency measurement instrument through an analytic hierarchy process. The variables needed for using blockchain-based joint logistics are the performance expectations, effort expectations, facilitating conditions, and social influence of the UTAUT model, and the HSCM competency outcomes are increased reliability and transparency, enhanced SCM, and enhanced scalability. Word cloud results analyzing the most important considerations for achieving work efficiency among medical industry-related agencies mention numerous terms, including sudden situations, delivery, technology trust, information sharing, effectiveness, and urgency. This might imply the need to establish a system that can respond immediately to emergency situations during holidays, and it also suggests the importance of real-time information sharing for increasing the efficiency of inventory management. Therefore, there is a need for a business model that can increase the visibility of real-time medical SCM through big data analysis. By analyzing the importance of securing reliability based on blockchain technology in the establishment of a supply chain network for HSCM competency, we reveal that joint logistics can be achieved and synergistic effects created by implementing an integrated database. Strengthening partnerships, such as joint logistics, will eventually lead to HSCM competency. In particular, HSCM should seek ways to upgrade its competitive capabilities through big data analysis based on the establishment of a joint logistics system.
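The analytic hierarchy process used to derive the HSCM measurement items rests on a pairwise comparison matrix of criteria. A common way to approximate the resulting priority weights is by averaging the normalized columns, sketched below with an invented, perfectly consistent 3-criterion matrix; the study's actual criteria and judgments are not reproduced here.

```python
def ahp_priorities(matrix):
    """Approximate AHP priority weights by averaging normalized columns.

    matrix[i][j] holds the judged importance of criterion i over criterion j
    (1 = equal, larger = more important), with matrix[j][i] = 1 / matrix[i][j].
    """
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    # Normalize each column so it sums to 1, then average across each row.
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]
```

For a consistent matrix this recovers the underlying weight vector exactly; for real, slightly inconsistent judgments it is the standard first-order approximation to the principal eigenvector.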

