Equilibrium of Labor Market: New Security Instruments in the Context of Digitalization

2021 ◽  
Vol 93 ◽  
pp. 03017
Author(s):  
Alena Vankevich ◽  
Iryna Kalinouskaya ◽  
Olga Zaitseva ◽  
Alena Korabava

Current methods of labor market analysis rely on outdated technologies for collecting information and do not consider the competencies listed in CVs and demanded by vacancies. To obtain reliable, up-to-date information on the balance between the quality of the workforce, as the carrier of certain competencies, and market requirements, a method is proposed for determining the degree of their consistency through the ratio of competencies offered by applicants to those requested by employers. The proposed methodology, based on big data technologies, uses artificial intelligence as its main toolkit, making it possible to quickly and efficiently collect, process, and visualize the data, and then to analyze it qualitatively in terms of offered and demanded competencies, professions, regions, and types of economic activity. As a practical application of the methodology, the paper analyzes the degree of consistency between offered and demanded competencies across the regions of Belarus.
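As a minimal illustration of the consistency ratio the authors propose, the sketch below computes, per competency, the ratio of supply in applicants' CVs to demand in vacancies; the skill tags, example data, and any extraction or weighting steps are hypothetical and not taken from the paper.

```python
from collections import Counter

def consistency_ratio(cv_skills: list[set[str]], vacancy_skills: list[set[str]]) -> dict[str, float]:
    """Per-competency ratio of supply (CVs) to demand (vacancies).

    A ratio near 1.0 suggests supply and demand for a competency are
    balanced; values far above or below 1.0 signal a mismatch.
    """
    supply = Counter(skill for cv in cv_skills for skill in cv)
    demand = Counter(skill for vac in vacancy_skills for skill in vac)
    return {skill: supply.get(skill, 0) / demand[skill] for skill in demand}

# Hypothetical skill sets extracted from CVs and vacancy postings.
cvs = [{"python", "sql"}, {"sql", "excel"}]
vacancies = [{"python", "sql"}, {"python", "power bi"}]
print(consistency_ratio(cvs, vacancies))
# {'python': 0.5, 'sql': 2.0, 'power bi': 0.0}
```

In practice the skill sets would come from an AI-driven extraction step over large CV and vacancy corpora, and the ratios could then be grouped by profession, region, or type of economic activity as the abstract describes.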

Big Data ◽  
2015 ◽  
Vol 3 (3) ◽  
pp. 130-137 ◽  
Author(s):  
John J. Horton ◽  
Prasanna Tambe

Amicus Curiae ◽  
2020 ◽  
Vol 1 (3) ◽  
pp. 338-360
Author(s):  
Jamie Grace ◽  
Roxanne Bamford

Policymaking is increasingly informed by ‘big data’ technologies of analytics, machine learning and artificial intelligence (AI). In his 1971 book A Theory of Justice, John Rawls set out particular principles of reasoning that might help explore known problems of data bias, unfairness, accountability and privacy in relation to applications of machine learning and AI in government. This paper investigates how the current assortment of UK governmental policy and regulatory developments around AI in the public sector could be said to meet, or not meet, these Rawlsian principles, and what we might do better by incorporating them when we respond legislatively to this ongoing challenge. The paper uses a case study of data analytics and machine-learning regulation as the central means of this exploration of Rawlsian thinking in relation to the redevelopment of algorithmic governance.


2021 ◽  
Vol 93 ◽  
pp. 01021
Author(s):  
Irina Omelchenko ◽  
Galina Antonova ◽  
Marina Danilina ◽  
Sergey Popkov ◽  
Ludmila Botasheva

In recent years the labor market has experienced a number of changes. On the basis of statistical and content analysis, the authors examine the main trends and indicators of the labor market. The modern economy and labor market are moving ever more clearly along the path of digitalization: the widespread introduction of the latest generation of advanced technologies (information, communication, robotics, artificial intelligence, etc.) into economic activity, completely changing familiar business processes. The COVID-19 pandemic has shown that digitalization can be actively used by companies and can change the situation in the labor market. Digitalization thus affects all spheres of economic and social life, including the demand for workers of various categories and qualifications.


2014 ◽  
Vol 12 (2) ◽  
pp. 255-272 ◽  
Author(s):  
Martijn Van Otterlo

Big data technologies are increasingly able to automatically gather data, experiment with action strategies, observe the results of those strategies, and learn from their effects. When privacy issues are framed as “control over information”, it becomes apparent that some areas of the digital world may be heading toward what I call Walden 3.0: communities of interest that are influenced and controlled through measurement and experimentation. Instead of invoking Orwell’s 1984 dystopia, as is typically done in the privacy domain, I sketch how current developments might be better studied in the context of Skinner’s utopian novel Walden Two. I illustrate several issues through a running example from the domain of artificial intelligence, and by pointing to several areas where automated experimentation can arise. Finally, I raise questions on how to cope with and study the phenomenon of automated experimentation.
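The closed loop described in the opening sentence (gather data, try action strategies, observe results, learn) can be illustrated with a simple epsilon-greedy bandit over content variants; the variants and reward function below are hypothetical, not drawn from the article.

```python
import random

def automated_experiment(variants, get_reward, rounds=1000, epsilon=0.1):
    """Epsilon-greedy loop: mostly exploit the best-known variant,
    occasionally explore, and update estimates from observed rewards."""
    counts = {v: 0 for v in variants}
    values = {v: 0.0 for v in variants}
    for _ in range(rounds):
        if random.random() < epsilon:
            choice = random.choice(variants)        # explore
        else:
            choice = max(variants, key=values.get)  # exploit
        reward = get_reward(choice)                 # observe user response
        counts[choice] += 1
        # Incremental mean update of the estimated value.
        values[choice] += (reward - values[choice]) / counts[choice]
    return values

# Hypothetical rewards: users respond to variant "b" 30% of the time, "a" 10%.
rates = {"a": 0.10, "b": 0.30}
estimates = automated_experiment(["a", "b"], lambda v: float(random.random() < rates[v]))
print(estimates)
```

Run at scale and without the subjects' awareness, exactly this kind of loop is what turns a community of interest into an object of continuous measurement and experimentation.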


Author(s):  
Louise Leenen ◽  
Thomas Meyer

Cybersecurity analysts rely on vast volumes of security event data to predict, identify, characterize, and deal with security threats. These analysts must understand and make sense of these huge datasets in order to discover patterns which lead to intelligent decision making and advance warnings of possible threats, and this ability requires automation. Big data analytics and artificial intelligence can improve cyber defense. Big data analytics methods are applied to large data sets that contain different data types. The purpose is to detect patterns, correlations, trends, and other useful information. Artificial intelligence provides algorithms that can reason or learn and improve their behavior, and includes semantic technologies. A large number of automated systems are currently based on syntactic rules which are generally not sophisticated enough to deal with the level of complexity in this domain. An overview of artificial intelligence and big data technologies in cyber defense is provided, and important areas for future research are identified and discussed.
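As one illustrative form of the pattern detection described here (the event format, statistic, and threshold are assumptions, not taken from the chapter), a simple frequency-based check can flag event sources whose volume is anomalously high:

```python
from collections import Counter
from statistics import mean, stdev

def flag_anomalous_sources(events, z_threshold=3.0):
    """Flag source IPs whose event counts sit far above the mean.

    events: iterable of (source_ip, event_type) tuples.
    Returns IPs whose count exceeds mean + z_threshold * stdev.
    """
    counts = Counter(ip for ip, _ in events)
    values = list(counts.values())
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    return [ip for ip, n in counts.items() if sigma > 0 and n > mu + z_threshold * sigma]

# Hypothetical log: one source generates far more failed logins than the rest.
events = [("10.0.0.1", "login_fail")] * 500 + \
         [("10.0.0.%d" % i, "login_fail") for i in range(2, 50)]
print(flag_anomalous_sources(events))  # ['10.0.0.1']
```

Such syntactic, rule-like checks are exactly what the authors argue is insufficient on its own; the chapter's point is that semantic technologies and learning algorithms are needed on top of them to cope with the domain's complexity.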


2019 ◽  
Author(s):  
Norihiko Matsuda ◽  
Tutan Ahmed ◽  
Shinsaku Nomura

Author(s):  
Yang Lv ◽  
Chenwei Ma ◽  
Xiaohan Li ◽  
Min Wu

Introduction: This study aims to explore the potential of big data technologies for controlling COVID-19 transmission and managing it effectively.
Material and methods: A systematic review guided by the PRISMA guidelines was followed to obtain the key elements.
Results: This study identified the 32 most relevant documents for qualitative analysis. It also reveals 10 possible sources and 8 key applications of big data for analyzing the virus infection trend, transmission pattern, virus association, and differences in genetic modifications.
Conclusions: The findings will provide new insight and help policymakers and administrators develop data-driven initiatives to tackle and manage the COVID-19 crisis.


