Certain aspects of personal data protection in the social network: European experience and legislative regulation in Ukraine

2020 ◽  
Vol 9 (27) ◽  
pp. 383-390 ◽  
Author(s):  
Iryna Davydova ◽  
Olena Bernaz-Lukavetska ◽  
Semen Reznichenko

The purpose of this study is to examine certain aspects of personal data protection in the social network and to carry out a comparative analysis of personal data protection in the social network under Ukrainian and European legislation, namely the General Data Protection Regulation (GDPR) of the European Union. The methods used in this work are dialectical, comparative-legal, formal-logical, analysis, and dogmatic interpretation. Each of these methods was used in the study to understand and qualitatively explain to the audience the individual aspects of personal data protection in the social network. This article clarifies the notion of personal data in the social network and the features of their collection, storage, and protection under European legislation, and develops proposals aimed at improving these processes in Ukraine. The research also addresses the following issues: the features of managing consent to the processing of personal data that have already been obtained; who can act as an "operator" under EU law and what actions they may take; and who can act as a "controller" and what functions they perform. The article concludes that there is an urgent need to bring Ukrainian domestic legislation into line with EU law, which should result in a new law on personal data protection that complies with GDPR norms. As a result, a new law on personal data protection may soon emerge in Ukraine, replacing the outdated Law of Ukraine “On Personal Data Protection” of 01.06.2010, which is a “mirror” of the repealed Directive 95/46/EC of the European Parliament and of the Council.

2021 ◽  
Vol 13 (3) ◽  
pp. 66
Author(s):  
Dimitra Georgiou ◽  
Costas Lambrinoudakis

The General Data Protection Regulation (GDPR) harmonizes personal data protection laws across the European Union, affecting all sectors, including the healthcare industry. For processing operations that pose a high risk to data subjects, a Data Protection Impact Assessment (DPIA) has been mandatory since May 2018. Taking into account the criticality of the process and the importance of its results for the protection of patients’ health data, as well as the complexity involved and the lack of past experience in applying such methodologies in healthcare environments, this paper presents the main steps of a DPIA study and provides guidelines on how to carry them out effectively. To this end, the Privacy Impact Assessment methodology of the Commission Nationale de l’Informatique et des Libertés (PIA-CNIL) has been employed, which is also compliant with the privacy impact assessment tasks described in ISO/IEC 29134:2017. The work presented in this paper focuses on the first two steps of the DPIA methodology, more specifically on the identification of the purposes of processing and of the data categories involved in each of them, as well as on the evaluation of the organization’s GDPR compliance level and of the gaps (gap analysis) that must be filled. The main contribution of this work is the identification of the main organizational and legal requirements that must be fulfilled by the healthcare organization. This research sets out the legal grounds for data processing according to the GDPR and is highly relevant to any processing of personal data, as it helps to structure the process and to raise awareness of data protection issues and the relevant legislation.
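The purpose-to-data-category mapping and the gap analysis described in these first two steps can be sketched in a few lines of code. The example below is only an illustration of the idea, not part of the PIA-CNIL tooling; the purposes, data categories, and requirement labels are hypothetical.

```python
# Minimal sketch of DPIA step 1 (purpose / data-category inventory) and a naive
# gap analysis. Purposes, categories, and requirements are hypothetical examples.
from dataclasses import dataclass, field


@dataclass
class ProcessingPurpose:
    name: str                                            # e.g. "Appointment scheduling"
    data_categories: list = field(default_factory=list)  # personal data involved
    legal_basis: str = ""                                # e.g. GDPR Art. 6(1)(b), Art. 9(2)(h)


purposes = [
    ProcessingPurpose("Appointment scheduling",
                      ["name", "contact details"], "Art. 6(1)(b)"),
    ProcessingPurpose("Diagnosis and treatment",
                      ["medical history", "lab results"], "Art. 9(2)(h)"),
]

# Gap analysis: which organisational/legal requirements are already satisfied?
requirements = {
    "records of processing activities (Art. 30)": True,
    "data processing agreements with processors (Art. 28)": False,
    "procedure for data subject access requests (Art. 15)": False,
}

gaps = [req for req, satisfied in requirements.items() if not satisfied]
compliance_level = 1 - len(gaps) / len(requirements)

print(f"Compliance level: {compliance_level:.0%}")
print("Gaps to fill:", gaps)
```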


Hypertension ◽  
2021 ◽  
Vol 77 (4) ◽  
pp. 1029-1035
Author(s):  
Antonia Vlahou ◽  
Dara Hallinan ◽  
Rolf Apweiler ◽  
Angel Argiles ◽  
Joachim Beige ◽  
...  

The General Data Protection Regulation (GDPR) became binding law in the European Union Member States in 2018, as a step toward harmonizing personal data protection legislation in the European Union. The Regulation governs almost all types of personal data processing, hence, also, those pertaining to biomedical research. The purpose of this article is to highlight the main practical issues related to data and biological sample sharing that biomedical researchers face regularly, and to specify how these are addressed in the context of GDPR, after consulting with ethics/legal experts. We identify areas in which clarifications of the GDPR are needed, particularly those related to consent requirements by study participants. Amendments should target the following: (1) restricting exceptions based on national laws and increasing harmonization, (2) confirming the concept of broad consent, and (3) defining a roadmap for secondary use of data. These changes will be achieved by acknowledged learned societies in the field taking the lead in preparing a document giving guidance for the optimal interpretation of the GDPR, which will be finalized following a period of commenting by a broad multistakeholder audience. In parallel, promoting engagement and education of the public in the relevant issues (such as different consent types or residual risk for re-identification), on both local/national and international levels, is considered critical for advancement. We hope that this article will open this broad discussion involving all major stakeholders, toward optimizing the GDPR and allowing a harmonized transnational research approach.


2020 ◽  
pp. 36-50
Author(s):  
Olga O. Bazina

Biometrics, as a field of science, analyzes the physical and behavioral characteristics of people in order to establish their identity. A great deal of biometric data collection technology is developed by IT giants such as Google, Facebook, or Alibaba. The European Union (EU) took an important step towards biometric data confidentiality by developing a unified law on the protection of personal data, the General Data Protection Regulation (GDPR). The main goal of this measure is to return control over personal data to European citizens and at the same time simplify the regulatory framework for companies. While European countries and organisations have been bringing the GDPR into force, China has been running a social credit system as a pilot project since 2016. The Social Credit Score (SCS) is based on collecting the maximum amount of data about citizens and assessing the reliability of residents based on their financial, social, and online behavior. European literature on the social credit system is almost exclusively critical, although the opinions of those living under the system – Chinese citizens – are quite positive. In this context, the considerable difference in mentality between Asian and European societies should not be overlooked. The aim of this article is to compare EU law and the legislation of the People's Republic of China regarding the use and storage of biometric data. On the basis of the statistical data and materials analysed, key conclusions are formulated that indicate differences in the positions of state institutions and in citizens' attitudes to the issue of personal data protection in China and the European Union.


2021 ◽  
Vol 273 ◽  
pp. 08099
Author(s):  
Mikhail Smolenskiy ◽  
Nikolay Levshin

The EU’s General Data Protection Regulation (GDPR) applies not only to the territory of the European Union, but also to all information systems containing the data of EU citizens around the world. Misusing or carelessly handling personal data brings fines of up to 20 million euros or 4% of the annual turnover of the offending company. This article analyzes the main trends in the global implementation of the GDPR. The authors considered and analyzed the results of personal data protection measures in nineteen jurisdictions: the USA, Canada, China, France, Germany, India, Kazakhstan, Nigeria, Russia, South Korea, and Thailand, as well as the European Union and a handful of others. This allowed them to identify a direct link between the global tightening of protection for EU citizens' personal data and the fragmentation of the global mediasphere into separate national segments. As a result of the study, the authors conclude that the GDPR has ultimately slowed down the globalization of the online mediasphere, playing a major role in its regional fragmentation.
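For orientation, the fine ceiling mentioned above follows Article 83(5) GDPR: up to 20 million euros or 4% of total worldwide annual turnover, whichever is higher. A toy calculation of that ceiling, with made-up turnover figures:

```python
# Illustrative ceiling for the upper tier of GDPR fines (Art. 83(5)):
# up to EUR 20 million or 4% of worldwide annual turnover, whichever is higher.
def max_fine_eur(annual_turnover_eur: float) -> float:
    return max(20_000_000, 0.04 * annual_turnover_eur)

print(max_fine_eur(1_000_000_000))  # hypothetical turnover -> 40,000,000.0 (4% dominates)
print(max_fine_eur(300_000_000))    # hypothetical turnover -> 20,000,000 (fixed floor dominates)
```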


2020 ◽  
pp. 155-186
Author(s):  
María Dolores Mas Badia

Despite the differences between credit risk and insurance risk, in many countries large insurance companies include credit history amongst the information to be taken into account when assigning consumers to risk pools and deciding whether or not to offer them an auto or homeowner insurance policy, or to determine the premium that they should pay. In this study, I will try to establish some conclusions concerning the requirements and limits to which the use of credit history data by insurers in the European Union should be subject. In order to do this, I shall focus my attention primarily on Regulation (EU) 2016/679. This regulation, which has applied since 25 May 2018, not only forms the backbone of personal data protection in the EU, but is also set to become a model for regulation beyond the borders of the Union. This article concentrates on two main aspects: the lawful basis for the processing of credit history data by insurers, and the rules that should apply to decisions based solely on automated processing, including profiling.

Received: 30 December 2019 ◽ Accepted: 07 February 2020 ◽ Published online: 02 April 2020


2018 ◽  
Author(s):  
Duarte Gonçalves-Ferreira ◽  
Mariana Sousa ◽  
Gustavo M Bacelar-Silva ◽  
Samuel Frade ◽  
Luís Filipe Antunes ◽  
...  

BACKGROUND: Concerns about privacy and personal data protection resulted in reforms of the existing legislation in the European Union (EU). The General Data Protection Regulation (GDPR) aims to reform the existing directive on personal data protection of EU citizens, with a strong emphasis on giving citizens more control over their data and on establishing rules for the processing of personal data. OpenEHR is a standard that embodies many principles of interoperable and secure software for electronic health records (EHRs) and has been advocated as the best approach for the development of hospital information systems.
OBJECTIVE: This study aimed to understand to what extent the openEHR standard can help EHR systems comply with the GDPR requirements.
METHODS: A list of requirements for an EHR to support GDPR compliance and a list of the openEHR design principles were compiled. The requirements were categorized and compared with the principles by experts on openEHR and the GDPR.
RESULTS: A total of 50 GDPR requirements and 8 openEHR design principles were identified. The openEHR principles conformed to 30% (15/50) of the GDPR requirements. All of the openEHR principles were aligned with GDPR requirements.
CONCLUSIONS: This study showed that the openEHR principles conform well to the GDPR, underlining the common wisdom that truly realizing security and privacy requires them to be built in from the start. By using an openEHR-based EHR, institutions are closer to becoming compliant with the GDPR while safeguarding medical data.
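The comparison behind these figures can be pictured as a mapping from design principles to the requirements they support, with coverage computed as the share of requirements matched. The sketch below is only illustrative: the principle and requirement labels are placeholders, and only the totals (50 requirements, 8 principles, 15 matches ≈ 30%) come from the study.

```python
# Sketch of the comparison described above: map openEHR design principles to the
# GDPR requirements they support and compute the coverage percentage.
# Labels are hypothetical placeholders; only the totals come from the study.

gdpr_requirements = [f"REQ-{i:02d}" for i in range(1, 51)]   # 50 requirements

# principle -> requirements it conforms to (illustrative mapping)
openehr_principles = {
    "separation of clinical knowledge and software": ["REQ-01", "REQ-07"],
    "fine-grained access control": ["REQ-03", "REQ-04", "REQ-05"],
    "full audit trail / versioning": ["REQ-10", "REQ-11", "REQ-12", "REQ-13"],
    # ... remaining principles omitted, so this toy mapping covers fewer
    # requirements than the study's 15
}

covered = set()
for reqs in openehr_principles.values():
    covered.update(reqs)

coverage = len(covered) / len(gdpr_requirements)
print(f"{len(covered)}/{len(gdpr_requirements)} requirements covered ({coverage:.0%})")
```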


2021 ◽  
Vol 24 (2) ◽  
pp. 207-222
Author(s):  
Marek Zanker ◽  
Vladimír Bureš ◽  
Anna Cierniak-Emerych ◽  
Martin Nehéz

The General Data Protection Regulation, also known as the ‘gold standard’ or the ‘Magna Carta’ of cyber laws, is a European regulation that deals with privacy rights and focuses on data collection, storage, and processing. This manuscript presents the results of an investigation of the business sphere in eight countries of the European Union. The research focused on awareness of the GDPR, costs associated with the GDPR, the number of training sessions, how data are secured, and subjective evaluation of the regulation. A questionnaire was used for data collection. The results show that the majority of employees concerned with the GDPR are able to define it correctly (64%). Personal data are correctly identified in 95% of cases. The vast majority of respondents (94%) correctly attribute the right to personal data protection to the GDPR. Most employees are trained in the GDPR once (46%) or twice (45%). Subsequently, the differences between the surveyed countries in some areas of the questionnaire were examined. For this purpose, Welch ANOVA with a Tukey HSD post-test or the Kruskal-Wallis test was used. The results show that knowledge about personal data does not vary significantly between the countries. In the area of rights, the countries are again not statistically different. The number of security measures also does not differ significantly across countries. The subjective assessment of the GDPR, however, differs across the countries: the GDPR is rated worst by companies in the Czech Republic and Slovakia, and best perceived by companies in France and the United Kingdom.
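The between-country tests named above can be run with standard statistical libraries. The sketch below uses SciPy's Kruskal-Wallis and Tukey HSD functions on invented rating samples (the survey data are not reproduced here); the Welch ANOVA step is only noted in a comment as a tooling assumption, not as the authors' actual workflow.

```python
# Sketch of the between-country tests mentioned above, on made-up subjective
# GDPR-rating samples (1-5 scale). The real survey data are not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ratings = {
    "CZ": rng.integers(1, 4, 40),   # hypothetical ratings per country
    "SK": rng.integers(1, 4, 40),
    "FR": rng.integers(3, 6, 40),
    "UK": rng.integers(3, 6, 40),
}

# Non-parametric comparison across all countries (Kruskal-Wallis).
h, p = stats.kruskal(*ratings.values())
print(f"Kruskal-Wallis: H={h:.2f}, p={p:.4f}")

# Pairwise post-hoc comparison (Tukey HSD). For the Welch ANOVA step one could
# use, e.g., statsmodels' anova_oneway(..., use_var="unequal") instead.
res = stats.tukey_hsd(*ratings.values())
print(res)
```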


2021 ◽  
Vol 11 (2) ◽  
pp. 167-188
Author(s):  
Ondřej Pavelek ◽  
Drahomíra Zajíčková

Personal data protection is one of the important areas of the EU's operation, and the general public is especially aware of the General Data Protection Regulation (GDPR). However, personal data protection has been an issue in the EU for a long time. The Court of Justice of the European Union (CJEU) plays a major role in personal data protection, as its function is to interpret EU law and thus also EU legislation related to personal data protection. Until now, research papers have tackled specific issues related to interpreting EU legislation or analyzed specific decisions made by the CJEU. However, no comprehensive empirical legal study has been published so far that evaluates the decision-making of the CJEU in the area of personal data protection using a combination of quantitative and qualitative methods. Therefore, no analysis has been carried out to determine how many decisions of the CJEU have been related to personal data protection, how their number has increased, or which participants and from which areas have taken part in the proceedings. The results of the analysis presented here can be used as a basis for studying the future development of the CJEU's decision-making in the area of personal data protection in relation to digitization and especially to the COVID-19 pandemic, which has undoubtedly contributed to a significant increase in online communication, posing new challenges for more effective personal data protection in the online world.
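A quantitative tally of the kind described above, counting data-protection decisions per year and per topic, might look like the toy example below; the case records listed are merely illustrative and do not reproduce the study's dataset.

```python
# Toy illustration of the quantitative tally described above: count CJEU
# data-protection decisions per year. The records below are illustrative only.
import pandas as pd

decisions = pd.DataFrame({
    "case":  ["C-131/12", "C-362/14", "C-311/18", "C-184/20"],  # illustrative case numbers
    "year":  [2014, 2015, 2020, 2022],
    "topic": ["right to erasure", "data transfers", "data transfers", "special categories"],
})

per_year = decisions.groupby("year").size()
print(per_year)
print("Most frequent topic:", decisions["topic"].mode()[0])
```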


2021 ◽  
Vol 12 (1) ◽  
pp. 261-268
Author(s):  
Angel Manchev

The protection of personal data is one of the core values of modern European societies. This protection is provided by the law of the European Union and by the national legislation of the Member States, to which the Republic of Bulgaria also belongs. As of May 25, 2018, the protection of personal data has been expanded and updated in response to technological progress and the increasingly accelerated exchange of data. The reason for this is the entry into application of Regulation (EU) 2016/679 (General Data Protection Regulation, GDPR) and the changes in our national law that it imposes. In light of the above, the issues of personal data protection in children's institutions are especially relevant, because these organizations actively handle the personal data of children, parents, teachers, and staff at every level. In this article, we will try to give short answers to some of the most important questions regarding personal data and the rules for their protection under European and Bulgarian legislation.


2021 ◽  
Vol 2 (14) ◽  
pp. 36-49
Author(s):  
Volodymyr Akhramovich

A mathematical model of personal data protection as a function of the network clustering coefficient and data transfer intensity in social networks has been developed and studied. Dependencies are derived for the protection of the system on the size of the system (and on the amount of personal data), and for information security threats on the network clustering coefficient. A system of linear equations is obtained, comprising an equation for the rate of change of the information flow as a function of social network security, with coefficients reflecting the impact of security measures, the amount of personal data, and the leakage rate, and an equation for the change of information protection as a function of the network clustering coefficient, network size, and the level of personal data protection. By solving the system of differential equations, mathematical and graphical dependences of the indicator of personal data protection in the social network on its various components are obtained. Considering three variants of the solution near the steady state of the system, we can conclude that, depending on the ratio of damping to natural frequency, the deviation decays to a certain value either periodically, with decreasing amplitude, or according to an exponentially decaying law. For a more illustrative analysis of the system's behavior, the differential equations are converted to discrete form and a certain interval of the system's existence is simulated. Mathematical and graphical dependences of the system's natural frequency, oscillation period, and damping coefficient are presented. Simulation is carried out for values deviating from the stationary position of the system. As a result of the simulation, it is shown that the social network protection system is nonlinear.
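The qualitative behaviour described here, periodic decay versus exponential decay depending on the ratio of damping to natural frequency, can be illustrated with a generic damped second-order model. The sketch below is not the paper's actual system of equations; the model form and coefficients are assumptions chosen only to show the two regimes.

```python
# Minimal sketch: a protection indicator p(t) modelled as a generic damped
# second-order system, discretized in time. Not the paper's equations;
# coefficients are arbitrary and chosen only to contrast the two regimes.
import numpy as np

def simulate(delta, omega0, p_star=1.0, p0=0.2, steps=5000, dt=0.01):
    """Discretized  p'' + 2*delta*p' + omega0**2 * (p - p_star) = 0."""
    p, v = p0, 0.0
    trajectory = []
    for _ in range(steps):
        a = -2 * delta * v - omega0**2 * (p - p_star)   # acceleration term
        v += a * dt
        p += v * dt
        trajectory.append(p)
    return np.array(trajectory)

# delta < omega0: underdamped -> periodic decay of the deviation from p_star
under = simulate(delta=0.2, omega0=2.0)
# delta > omega0: overdamped -> exponential decay, no oscillation
over = simulate(delta=3.0, omega0=2.0)

print("underdamped overshoots:", under.max() > 1.0)   # True: oscillates around p_star
print("overdamped overshoots: ", over.max() > 1.0)    # False: monotonic approach
```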

