BÜYÜK VERİ ANALİZİNDE YAPAY ZEKÂ VE MAKİNE ÖĞRENMESİ UYGULAMALARI - ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING APPLICATIONS IN BIG DATA ANALYSIS

Author(s): Muhammet ATALAY, Enes ÇELİK
2021, Vol 3 (6)

Author(s): Difei Zhang

Financial technology changes the logic of financial intermediation through the use of digital and digital-centric technologies, commercialization, big data analysis, machine learning, and artificial intelligence. Ranging from financial institutions that use technology to provide financial services to technology companies that provide financial services directly, fintech companies play an important role in financial intermediation and financial democratization and in improving the availability and efficiency of financial services. On this basis, this paper focuses on the difficulties of, and paths toward, cooperative governance of financial technology supervision, for the reference of relevant practitioners.


2021
Author(s): Bohdan Polishchuk, Andrii Berko, Lyubomyr Chyrun, Myroslava Bublyk, Vadim Schuchmann

2021
Author(s): Jinhui Yu, Xinyu Luan, Yu Sun

Because the structure and content of each website differ, it is often difficult for international applicants to obtain each school's application information in time; they must spend a lot of time manually collecting and sorting it, and when a school updates its pages the collected information can quickly become inaccurate. We designed a tool with three main steps to solve this problem: crawling links, processing web pages, and building the front-end pages. The system is written in Python, and the crawled data are stored in JSON format [4]. In the link-crawling step, we use Beautiful Soup to parse HTML and a custom crawler to fetch all the links related to admission information on the school's official website. We then traverse these links, process their corresponding page contents with the noise_remove [5] method to further narrow the scope of useful information, and save the processed contents in JSON files. Finally, we use the Flask framework to integrate these contents into the front-end page conveniently and efficiently, so that the tool offers complete functionality for integrating and displaying the information.
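The abstract does not include code, so the following is only a minimal sketch of the crawling and page-processing steps it describes, assuming requests and Beautiful Soup. The seed URL, the keyword filter, and strip_noise (a stand-in for the paper's noise_remove method, whose actual behavior is not given) are all assumptions.

```python
# Sketch of the link-crawling and page-processing steps; strip_noise is a
# hypothetical stand-in for the noise_remove method cited in the abstract.
import json
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

SEED_URL = "https://example.edu/admissions"          # hypothetical school site
KEYWORDS = ("admission", "apply", "international")   # assumed link filter


def crawl_links(seed_url):
    """Collect admission-related links from the seed page."""
    html = requests.get(seed_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = set()
    for a in soup.find_all("a", href=True):
        url = urljoin(seed_url, a["href"])
        if any(k in url.lower() for k in KEYWORDS):
            links.add(url)
    return sorted(links)


def strip_noise(html):
    """Stand-in for noise_remove: keep visible paragraph text only."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    return " ".join(p.get_text(" ", strip=True) for p in soup.find_all("p"))


def build_dataset(seed_url, out_path="admissions.json"):
    """Traverse the crawled links and store the cleaned text as JSON."""
    records = []
    for url in crawl_links(seed_url):
        page = requests.get(url, timeout=10).text
        records.append({"url": url, "content": strip_noise(page)})
    with open(out_path, "w", encoding="utf-8") as fh:
        json.dump(records, fh, ensure_ascii=False, indent=2)


if __name__ == "__main__":
    build_dataset(SEED_URL)
```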
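For the final step, a minimal Flask sketch that loads the JSON produced above and serves it on a single page. The file name, route, and inline template are assumptions rather than the authors' actual front end.

```python
# Sketch of the Flask integration step: read the stored JSON and display it.
import json

from flask import Flask, render_template_string

app = Flask(__name__)

# Assumed inline template; the paper's real front-end page is not shown.
PAGE = """
<h1>Admission information</h1>
{% for item in records %}
  <h2><a href="{{ item.url }}">{{ item.url }}</a></h2>
  <p>{{ item.content }}</p>
{% endfor %}
"""


@app.route("/")
def index():
    with open("admissions.json", encoding="utf-8") as fh:
        records = json.load(fh)
    return render_template_string(PAGE, records=records)


if __name__ == "__main__":
    app.run(debug=True)
```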

