Personal Data Broker: A Solution to Assure Data Privacy in EdTech

Author(s):  
Daniel Amo ◽  
David Fonseca ◽  
Marc Alier ◽  
Francisco José García-Peñalvo ◽  
María José Casañ

2021 ◽  
Vol 4 ◽  
Author(s):  
Vibhushinie Bentotahewa ◽  
Chaminda Hewage ◽  
Jason Williams

The growing dependency on digital technologies is becoming a way of life, and at the same time, the collection of data through those technologies for surveillance operations has raised concerns. Notably, some countries use digital surveillance technologies to track and monitor individuals and populations in order to prevent the transmission of the new coronavirus. The technology has the capacity to contribute towards tackling the pandemic effectively, but that success comes at the expense of privacy rights. The crucial point is that regardless of who uses these technologies and which mechanisms are employed, personal privacy will be infringed in one way or another. Therefore, when considering the use of technologies to combat the pandemic, the focus should also be on the impact of facial recognition cameras, police surveillance drones, and other digital surveillance devices on the privacy rights of those under surveillance. The GDPR was established to ensure that information can be shared without infringing on personal data or harming businesses; therefore, in generating Big Data, it is important to ensure that the information is securely collected, processed, transmitted, stored, and accessed in accordance with established rules. This paper focuses on the Big Data challenges associated with the surveillance methods used in the context of COVID-19. The aim of this research is to propose practical solutions to the Big Data challenges associated with COVID-19 pandemic surveillance approaches. To that end, the researchers identify the surveillance measures being used by countries in different regions, the sensitivity of the generated data, and the issues associated with the collection of large volumes of data, and finally propose feasible solutions to protect the privacy rights of the people during the post-COVID-19 era.


2019 ◽  
Vol 22 (1) ◽  
Author(s):  
Miguel Ehecatl Morales-Trujillo ◽  
Gabriel Alberto García-Mireles ◽  
Erick Orlando Matla-Cruz ◽  
Mario Piattini

Protecting personal data in current software systems is a complex issue that requires both legal regulations and constraints to manage personal data and methodological support to develop software systems that safeguard the data privacy of their users. The Privacy by Design (PbD) approach has been proposed to address this issue and has been applied to systems development in a variety of application domains. The aim of this work is to determine the presence of PbD, and its extent, in software development efforts. A systematic mapping study was conducted to identify relevant literature that collects PbD principles and goals in software development, as well as methods and/or practices that support privacy-aware software development. The 53 selected papers address PbD mostly from a theoretical perspective, with proposal validation based primarily on experiences or examples. The findings suggest a need to develop privacy-aware methods that can be integrated at all stages of the software development life cycle and to validate them in industrial settings.


Author(s):  
Anastasia Kozyreva ◽  
Philipp Lorenz-Spreen ◽  
Ralph Hertwig ◽  
Stephan Lewandowsky ◽  
Stefan M. Herzog

People rely on data-driven AI technologies nearly every time they go online, whether they are shopping, scrolling through news feeds, or looking for entertainment. Yet despite their ubiquity, personalization algorithms and the associated large-scale collection of personal data have largely escaped public scrutiny. Policy makers who wish to introduce regulations that respect people’s attitudes towards privacy and algorithmic personalization on the Internet would greatly benefit from knowing how people perceive personalization and personal data collection. To contribute to an empirical foundation for this knowledge, we surveyed public attitudes towards key aspects of algorithmic personalization and people’s data privacy concerns and behavior, using representative online samples in Germany (N = 1065), Great Britain (N = 1092), and the United States (N = 1059). Our findings show that people object to the collection and use of sensitive personal information and to the personalization of political campaigning and, in Germany and Great Britain, to the personalization of news sources. Encouragingly, attitudes are independent of political preferences: people across the political spectrum share the same concerns about their data privacy and show similar levels of acceptance regarding personalized digital services and the use of private data for personalization. We also found an acceptability gap: people are more accepting of personalized services than of the collection of the personal data and information required for these services. A large majority of respondents rated, on average, personalized services as more acceptable than the collection of personal information or data. The acceptability gap can be observed at both the aggregate and the individual level: across countries, between 64% and 75% of respondents showed an acceptability gap. Our findings suggest a need for transparent algorithmic personalization that minimizes the use of personal data, respects people’s preferences on personalization, is easy to adjust, and does not extend to political advertising.


2021 ◽  
Vol 00 (00) ◽  
pp. 1-19
Author(s):  
Diah Yuniarti ◽  
Sri Ariyanti

This study aims to provide recommendations to the government on regulating licensing, content, and data privacy and protection for integrated broadcast-broadband (IBB) operations in Indonesia, referencing Singapore, Japan and Malaysia as case studies, given the need for umbrella regulations for IBB implementation. Singapore and Japan were chosen as countries that have deployed IBB, using the hybrid broadcast broadband television (HbbTV) and Hybridcast standards, respectively. Malaysia was chosen because it is a neighbouring country that has conducted trials of the IBB service, bundled with its digital terrestrial television (DTT) service. The qualitative data are analysed using a comparative method. The results show that Indonesia needs to immediately revise its existing Broadcasting Law to accommodate DTT implementation, which is the basis for IBB and the expansion of broadcasters’ TV business. Learning from Singapore, Indonesia could include over-the-top (OTT) content in its ‘Broadcast Behaviour Guidelines’ and ‘Broadcast Programme Standards’. Data privacy and protection requirements for each entity involved in the IBB ecosystem are necessary due to the vulnerability of IBB service user data to leakage. In light of this, the ratification of the personal data protection law, as a legal umbrella, needs to be accelerated.


Author(s):  
M. Fevzi Esen ◽  
Eda Kocabas

With the new developments in information technologies, personal and business data have become easily accessible through different channels. The huge amounts of personal data across global networks and databases have provided crucial scientific benefits and many business opportunities, including in the meetings, incentives, conventions, and exhibitions (MICE) industry. In this chapter, the authors focus on an analysis of the MICE industry with regard to the new regulation (GDPR) on the protection of the personal data of all EU citizens and how industry professionals can adapt their way of doing business in light of this new regulation. The authors conducted online interviews with five different meetings-industry professionals to gain more insight into the data produced, its content, and the new regulations applied to the industry. The importance of personal data privacy and protection is discussed, and the most suitable anonymization techniques for personal data privacy are proposed.
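As a rough illustration of the kind of anonymization techniques discussed in such work, the sketch below combines pseudonymization (replacing a direct identifier with a salted hash) with generalization (coarsening an exact age into a band) on a hypothetical attendee record; the field names, salt, and record layout are illustrative assumptions, not the chapter's actual method.

```python
import hashlib

# Hypothetical salt; a real deployment would use a secret, randomly generated value.
SALT = "example-salt"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a truncated salted SHA-256 digest."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

def generalize_age(age: int) -> str:
    """Coarsen an exact age into a 10-year band (a simple generalization step)."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def anonymize_attendee(record: dict) -> dict:
    """Anonymize one attendee record: hash the direct identifier,
    generalize the quasi-identifier, and keep coarse fields as-is."""
    return {
        "attendee_id": pseudonymize(record["email"]),  # direct identifier -> pseudonym
        "age_band": generalize_age(record["age"]),      # quasi-identifier -> band
        "country": record["country"],                   # retained for aggregate analysis
    }

record = {"email": "jane@example.com", "age": 34, "country": "DE"}
print(anonymize_attendee(record))
```

Pseudonymization keeps records linkable across datasets without exposing the identifier, while generalization reduces the re-identification risk carried by quasi-identifiers; which combination is "most suitable" depends on the intended analysis, as the chapter's discussion suggests.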


Author(s):  
Irene Chen

The story describes how three school institutes are grappling with the loss of private information, each through a unique set of circumstances. Pasadena City Public Schools discovered that it had sold several computers containing the names and Social Security numbers of employees as surplus. Stephens Public Schools learned that personal information about students at one of its middle schools was lost when a bag containing a thumb drive was stolen. Also, Woodlands Public Schools accidentally exposed employee personal data on a public Web site for a short period of time. How should each of the institutes react?


Author(s):  
Zerin Mahzabin Khan ◽  
Rukhsana Ahmed ◽  
Devjani Sen

No previous research on cancer mobile applications (apps) has investigated issues associated with the data privacy of their consumers. The current chapter addresses this gap in the literature by assessing the content of the online privacy policies of selected cancer mobile apps, applying a checklist and performing an in-depth critical analysis to determine how the apps communicate their privacy practices to end users. The results reveal that the privacy policies were mostly ambiguous, with content often presented in a complex manner and inadequate information on the ownership, use, disclosure, retention, and collection of end users' personal data. These results highlight the importance of improving the transparency of privacy practices in health and fitness cancer mobile apps to clearly and effectively communicate how end users' personal data are collected, stored, and shared. The chapter concludes with recommendations and a discussion of practical implications for stakeholders such as cancer app users, developers, policymakers, and clinicians.
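A checklist-based assessment of this kind could be operationalized along the lines of the following sketch. The checklist items mirror the categories the chapter examines (ownership, use, disclosure, retention, collection), but the items and the simple coverage score are hypothetical simplifications, not the authors' actual instrument.

```python
# Hypothetical checklist items, one per category the assessment covers.
CHECKLIST = [
    "states who owns the collected data",
    "explains how personal data are used",
    "discloses third-party sharing",
    "specifies a data retention period",
    "lists which personal data are collected",
]

def score_policy(policy_answers: dict) -> float:
    """Return the fraction of checklist items a privacy policy satisfies.

    policy_answers maps a checklist item to True if the policy clearly
    addresses it; missing items count as unsatisfied.
    """
    satisfied = sum(1 for item in CHECKLIST if policy_answers.get(item, False))
    return satisfied / len(CHECKLIST)

# Example: a policy that clearly addresses only use and collection of data.
answers = {
    "explains how personal data are used": True,
    "lists which personal data are collected": True,
}
print(f"coverage: {score_policy(answers):.0%}")  # prints "coverage: 40%"
```

A low coverage score flags the kind of ambiguity the chapter reports: policies that leave ownership, disclosure, or retention unaddressed.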

