Semantic Disclosure Control: semantics meets data privacy

2018 ◽  
Vol 42 (3) ◽  
pp. 290-303 ◽  
Author(s):  
Montserrat Batet ◽  
David Sánchez

Purpose: To overcome the limitations of purely statistical approaches to data protection, this paper proposes Semantic Disclosure Control (SeDC): an inherently semantic privacy protection paradigm that, by relying on state-of-the-art semantic technologies, rethinks privacy and data protection in terms of the meaning of the data.
Design/methodology/approach: The need for data protection mechanisms able to manage data from a semantic perspective is discussed and the limitations of statistical approaches are highlighted. SeDC is then presented, detailing how it can be enforced to detect and protect sensitive data.
Findings: So far, data privacy has been tackled from a statistical perspective; that is, available solutions focus solely on the distribution of data values. This contrasts with the semantic way in which humans understand and manage (sensitive) data. As a result, current solutions have limitations both in preventing disclosure risks and in preserving the semantics (utility) of the protected data.
Practical implications: SeDC captures more general, realistic and intuitive notions of privacy and information disclosure than purely statistical methods. As a result, it is better suited to protecting heterogeneous and unstructured data, which are the most common in current data release scenarios. Moreover, SeDC preserves the semantics of the protected data better than statistical approaches, which is crucial when protected data are used for research.
Social implications: Individuals are increasingly aware of the privacy threats that the uncontrolled collection and exploitation of their personal data may produce. In this respect, SeDC offers an intuitive notion of privacy protection that users can easily understand. It also naturally captures the (non-quantitative) privacy notions stated in current legislation on personal data protection.
Originality/value: In contrast to statistical approaches to data protection, SeDC assesses disclosure risks and enforces data protection from a semantic perspective. As a result, it offers more general, intuitive, robust and utility-preserving protection of data, regardless of their type and structure.
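
The paper argues at the paradigm level and does not include code; the following is a minimal, hypothetical sketch of the kind of ontology-driven protection SeDC describes, in which a sensitive value is replaced by a more general concept drawn from a toy taxonomy. The taxonomy, field names and group-size threshold are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of semantic generalization: a sensitive value is replaced
# by an ancestor concept in a small hand-made taxonomy until the set of records
# sharing that concept is large enough to blur the original meaning.
TAXONOMY = {                      # child -> parent (toy is-a hierarchy)
    "HIV": "infectious disease",
    "tuberculosis": "infectious disease",
    "infectious disease": "disease",
    "melanoma": "cancer",
    "cancer": "disease",
    "disease": "medical condition",
}

def ancestors(term):
    """Yield the term and its increasingly general ancestors."""
    while term is not None:
        yield term
        term = TAXONOMY.get(term)

def generalize(records, attribute, min_group_size=3):
    """Replace each value of `attribute` with the most specific ancestor
    shared by at least `min_group_size` records."""
    counts = {}
    for rec in records:
        for concept in ancestors(rec[attribute]):
            counts[concept] = counts.get(concept, 0) + 1
    protected = []
    for rec in records:
        for concept in ancestors(rec[attribute]):
            if counts[concept] >= min_group_size:
                protected.append({**rec, attribute: concept})
                break
        else:                      # no ancestor is common enough: suppress
            protected.append({**rec, attribute: "*"})
    return protected

if __name__ == "__main__":
    data = [{"diagnosis": d} for d in
            ["HIV", "tuberculosis", "melanoma", "cancer", "HIV"]]
    for row in generalize(data, "diagnosis"):
        print(row)
```
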

2018 ◽  
Vol 25 (6) ◽  
pp. 1883-1902 ◽  
Author(s):  
Jawahitha Sarabdeen ◽  
Immanuel Azaad Moonesar

Purpose: The move toward e-health care in various countries is envisaged to reduce the cost of providing health care, improve the quality of care and reduce medical errors. The most significant problem is the protection of patients' data privacy. If patients are reluctant to participate, or refuse to participate, in the e-health care system because of a lack of privacy laws and regulations, the benefits of a full-fledged e-health care system cannot be realized. The purpose of this paper is to investigate the available e-health data privacy protection laws and the perceptions of the people using e-health care facilities.
Design/methodology/approach: The researchers used content analysis to analyze the availability and comprehensiveness of the laws and regulations. The researchers also used a survey method. Participants in the study comprised health care professionals (n=46) and health care users (n=187) based in Dubai, United Arab Emirates. The researchers applied descriptive statistics and correlational analysis to analyze the survey data.
Findings: The content analysis revealed that the available health data protection laws are limited in scope. The survey results, however, showed that the respondents felt they could trust the e-health service systems offered in the UAE, as the data collected are protected and their rights are not violated. The research also revealed no significant difference across nationalities in responses to the data privacy statements: respondents of all nationalities agreed that protection is in place for e-health data. Nor was there a significant difference between the demographic groups with regard to the various data protection principles.
Originality/value: The findings on users' perceptions could help evaluate the success of current strategies, and a benchmarking action plan could be introduced.
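
The abstract names the statistical procedures but not the tooling; the sketch below shows, on made-up Likert-scale responses, how descriptive statistics and a correlational analysis of this kind are commonly run. The variable names and values are hypothetical and do not reproduce the study's data.

```python
# Illustrative only: fabricated Likert-scale responses standing in for the
# survey data, used to show descriptive statistics and a correlation test.
import pandas as pd
from scipy import stats

survey = pd.DataFrame({
    "trust_in_ehealth":     [4, 5, 3, 4, 2, 5, 4, 3, 5, 4],  # 1-5 Likert items
    "perceived_protection": [4, 5, 3, 4, 1, 5, 4, 2, 5, 3],
})

# Descriptive statistics (mean, standard deviation, quartiles) per item
print(survey.describe())

# Correlational analysis: Spearman is a common choice for ordinal Likert data
rho, p_value = stats.spearmanr(survey["trust_in_ehealth"],
                               survey["perceived_protection"])
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```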


2020 ◽  
Vol 15 (36) ◽  
pp. 209-232
Author(s):  
Marcos Vinicius Viana da Silva ◽  
Erick Da Luz Scherf ◽  
Jose Everton Da Silva

The protection of personal data in cyberspace has been an issue of concern for quite some time. However, with the revolutions in information technology, big data and the internet of things, data privacy protection has become paramount in an era of free information flows. Considering this context, this research shines a light on the experience of Brazil regarding data privacy protection through the analysis of a new law passed by Congress: the Brazilian General Personal Data Protection Act (LGPD). Our assessment of the legislation was made from the perspective of a human rights-based approach to data, aiming to analyze the advancements, limitations and contradictions of the rights discourse in the LGPD. Our main conclusion was that the (public and national) security rhetoric, also present in the law, can create a state of exception regarding the processing of personal data of those considered "enemies of the state", which may result in violations of fundamental rights and procedural guarantees.


Significance: Such programmes not only contribute to Indonesia's efforts to boost the cyber readiness of its booming digital economy, but are also designed to maintain China's friendly relations with South-east Asia's largest economy amid the intensifying technology tensions between China and the United States.
Impacts: The Personal Data Protection Law would need to clarify key provisions and concepts to be effective. The BSSN's extensive powers will fuel civil society concerns about excessive state surveillance. Turning down Chinese technology suppliers carries cost and wider economic ramifications for Jakarta.


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Heather J. Parker ◽  
Stephen Flowerday

Purpose: Social media has created a new level of interconnected communication. However, the use of online platforms brings about various ways in which a user's personal data can be put at risk. This study aims to investigate what drives the disclosure of personal information online and whether an increase in awareness of the value of personal information motivates users to safeguard their information.
Design/methodology/approach: Fourteen university students participated in a mixed-methods experiment in which responses to Likert-type scale items were combined with responses to interview questions to provide insight into the cost–benefit analysis users conduct when disclosing information online.
Findings: Overall, the findings indicate that users are able to disregard their concerns owing to a resigned and apathetic attitude towards privacy. Furthermore, subjective norms, reinforced by the fear of missing out (FOMO), further allow users to overlook potential risks to their information in order to avoid social isolation and sanctions. Conversely, an increased awareness of the personal value of information and the experience of a previous privacy violation encourage the protection of information and limited disclosure.
Originality/value: This study provides insight into privacy and information disclosure on social media in South Africa. To the knowledge of the researchers, this is the first study to combine the theory of planned behaviour and the privacy calculus model, together with the antecedent factors of personal valuation of information, trust in the social media provider, and FOMO.


Significance: The experience of surfing the net is vastly different for women, who have been disproportionately at the receiving end of cybercrimes that undermine their safety online. As elsewhere, the forms of online offence included bullying, stalking, impersonation and non-consensual pornography.
Impacts: Lack of online safety will limit the female customer base of digital platforms. Entrenched weaknesses of the judicial systems impede reporting and conviction of cybercrime. Civil society demands for a personal data protection law will rise.


Author(s):  
M. Fevzi Esen ◽  
Eda Kocabas

With the new developments in information technologies, personal and business data have become easily accessible through different channels. The huge amounts of personal data held across global networks and databases have provided crucial scientific benefits and many business opportunities, including in the meeting, incentive, convention, and exhibition (MICE) industry. In this chapter, the authors focus on the analysis of the MICE industry with regard to the new General Data Protection Regulation (GDPR), which protects the personal data of all EU citizens, and on how industry professionals can adapt their way of doing business in light of this new regulation. The authors conducted online interviews with five meetings industry professionals to gain more insight into the data produced, its content, and the new regulations applied to the industry. The importance of personal data privacy and protection is discussed, and the most suitable anonymization techniques for personal data privacy are proposed.
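
The chapter discusses anonymization at a conceptual level; the sketch below illustrates two of the most common techniques, pseudonymization of direct identifiers and generalization of quasi-identifiers, applied to a hypothetical attendee record. The field names, salt and rules are assumptions made for illustration and are not taken from the chapter.

```python
# Hypothetical sketch of anonymizing an event-attendee record:
# direct identifiers are pseudonymized, quasi-identifiers are generalized,
# and free-text fields are suppressed.
import hashlib

def pseudonymize(value, salt="event-2024"):
    """Replace a direct identifier with a salted hash (a re-identification
    mapping, if needed, would be stored separately and is omitted here)."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def generalize_age(age):
    """Coarsen an exact age into a 10-year band."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def anonymize_attendee(record):
    return {
        "attendee_id": pseudonymize(record["email"]),  # name and e-mail dropped
        "age_band": generalize_age(record["age"]),
        "country": record["country"],                  # kept as-is (low risk)
        # free-text dietary or accessibility notes are suppressed entirely
    }

if __name__ == "__main__":
    attendee = {"name": "Jane Doe", "email": "jane@example.com",
                "age": 37, "country": "DE"}
    print(anonymize_attendee(attendee))
```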


2016 ◽  
Vol 3 (1) ◽  
Author(s):  
Andrew Nicholas Cormack

Most studies on the use of digital student data adopt an ethical framework derived from human-studies research, based on the informed consent of the experimental subject. However, consent gives universities little guidance on the use of learning analytics as a routine part of educational provision: which purposes are legitimate and which analyses involve an unacceptable risk of harm. Obtaining consent when students join a course will not give them meaningful control over their personal data three or more years later. Relying on consent may also exclude those most likely to benefit from early interventions. This paper proposes an alternative framework based on European data protection law. Separating the processes of analysis (pattern-finding) and intervention (pattern-matching) gives students and staff continuing protection from inadvertent harm during data analysis; students have a fully informed choice whether or not to accept individual interventions; and organisations obtain clear guidance on how to conduct analysis, which analyses should not proceed, and when and how interventions should be offered. The framework provides formal support for practices that are already being adopted and helps with several open questions in learning analytics, including its application to small groups and alumni, automated processing and privacy-sensitive data.
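
The paper describes this separation at the level of policy and process rather than code; purely as a hypothetical illustration, the sketch below shows how the two stages might be kept apart in software, with analysis producing only an aggregate pattern and intervention requiring the individual student's opt-in. All names, fields and thresholds are invented for the example.

```python
# Hypothetical sketch of the analysis/intervention separation:
# analyse() does pattern-finding over activity values and returns only an
# aggregate threshold; intervene() does pattern-matching for one student and
# runs only if that student has opted in.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    student_id: str
    weekly_logins: float
    consented_to_interventions: bool

def analyse(records):
    """Pattern-finding: never returns an individual result."""
    logins = [r.weekly_logins for r in records]
    return sum(logins) / len(logins) * 0.5   # e.g. "below half the mean" pattern

def intervene(record, threshold):
    """Pattern-matching: applied to one student, and only with their consent."""
    if not record.consented_to_interventions:
        return None                           # no intervention without opt-in
    if record.weekly_logins < threshold:
        return f"Offer tutoring contact to {record.student_id}"
    return None

if __name__ == "__main__":
    cohort = [StudentRecord("s1", 9.0, True),
              StudentRecord("s2", 1.5, True),
              StudentRecord("s3", 1.0, False)]
    threshold = analyse(cohort)
    for r in cohort:
        print(r.student_id, intervene(r, threshold))
```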


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Aylin Ilhan ◽  
Kaja J. Fietkiewicz

Purpose: This investigation aims to examine the differences and similarities between activity tracking technology users from two regions (the USA and Germany) in their intended privacy-related behavior. The focus lies on data handling after hypothetical discontinuance of use, data protection and privacy policy seeking, and privacy concerns.
Design/methodology/approach: The data were collected through an online survey in 2019. In order to identify significant differences between participants from Germany and the USA, the chi-squared test and the Mann–Whitney U test were applied.
Findings: The intensity of several privacy-related concerns differed significantly between the two groups. The majority of the participants did not inform themselves about the respective data privacy policies or terms and conditions before installing an activity tracking application. The majority of the German participants knew that they could request the deletion of all their collected data; in contrast, only 35% of the 68 participants from the USA knew about this option.
Research limitations/implications: This study intends to raise awareness about managing collected health and fitness data after users stop using activity tracking technologies. Furthermore, to reduce privacy and security concerns, the involvement of governments, companies and users is necessary so that data are handled and shared more carefully and sustainably.
Originality/value: This study sheds light on users of activity tracking technologies from a broad perspective (here, participants from the USA and Germany). It incorporates not only concerns and the privacy paradox but also (intended) user behavior, including seeking information on data protection and privacy policy and handling data after hypothetical discontinuance of use of the technology.
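
The abstract names the two tests but does not reproduce the survey data; the sketch below uses made-up counts and ratings simply to show how a chi-squared test of independence and a Mann–Whitney U test of this kind are typically run.

```python
# Illustrative only: fabricated data standing in for the survey responses.
import numpy as np
from scipy import stats

# Chi-squared test of independence: country vs. "read the privacy policy?"
# Rows: Germany, USA; columns: yes, no (counts are made up).
contingency = np.array([[28, 92],
                        [15, 53]])
chi2, p_chi, dof, _ = stats.chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_chi:.3f}")

# Mann-Whitney U test: compare ordinal concern ratings (1-5) between groups
concern_de = [4, 5, 3, 4, 5, 2, 4, 5]
concern_us = [3, 2, 4, 3, 2, 3, 4, 2]
u_stat, p_u = stats.mannwhitneyu(concern_de, concern_us, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_u:.3f}")
```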

