Management of Private Data: Addressing User Privacy and Economic, Social, and Ethical Concerns

Author(s):  
Dawn Jutla ◽  
Peter Bodorik ◽  
Deyun Gao
2020 ◽  
Author(s):  
Marco Scalvini

<p>The current study aims to understand the impact of TikTok’s recommendation system. The algorithm is perceived as very efficient at targeting users but raises several ethical concerns regarding its ability to manipulate users’ experience and the extent to which private data and preferences are respected. Drawing on data collected from 40 in-depth interviews, this study explores: How do users perceive TikTok’s ethical responsibilities with regard to its algorithmic recommendation system? Furthermore, the analysis discusses and evaluates the tension between a) how the platform’s algorithm feeds users similar videos that they highly appreciate; and, inversely, b) how the diversification of recommendations is limited. A thematic analysis shows interviewees describe TikTok as a safe space where users can be themselves and feel included in a community of people interested in posting content to connect and engage meaningfully beyond difference. However, the algorithm is perceived as harmful because it tries to manipulate and drive users towards specific videos that increase their ‘addiction’ to the platform. Interviewees consider some of the recommendations on the ForYou page questionable because they aim at persuading or nudging users in favor of particular hashtags and social causes. This contradiction may partly be explained by the fact that interviewees report their rationalizations in a performative manner in order to avoid feelings of dissonance while attempting to relate to their own self-identity. This observation leads to the idea that the concept of mediated diversity can explain the tension between the expectation of similarity and diversity.</p>


2021 ◽  
Vol 108 (Supplement_7) ◽  
Author(s):  
Hock Ping Cheah ◽  
Samantha Quah ◽  
Kenneth Wong

Abstract Aims Electronic communication amongst surgical team members improves the team's ability to care for patients, but the security and privacy of patient data are significant concerns. Recent controversy involving private data collection by WhatsApp has led many users to switch to other messaging apps to protect their privacy. The aim of this study is to analyse the efficiency and effectiveness of the Signal messaging app in a research setting in Australia. Methods Members of our research group, comprising three junior doctors and a supervising consultant surgeon, used the Signal app as our main method of communication to discuss matters relating to our various research projects. No patient details were discussed in the messaging app. Results A total of 234 personal and 148 group messages were sent during the study period. Most messages, including picture files, were received within one minute. We did encounter a 24-hour period during which Signal experienced technical difficulties and some messages did not go through. Conclusion The Signal messaging app is a good alternative to WhatsApp, with better user privacy protection. With greater user uptake, Signal has the potential to be used for clinical care, as it also provides end-to-end encryption to protect patient privacy.


Big data services store and maintain sensitive, private data, and users may upload encrypted rather than raw data to preserve it. Processing and analyzing encrypted data is a primary target for attackers and hackers. Homomorphic Re-Encryption supports access control, allows ciphertexts to be processed while still encrypted, and ensures data confidentiality. Its limitation, however, is that it is a single-user system: only the party that owns the homomorphic decryption key can decrypt processed ciphertexts, so the original scheme cannot flexibly grant multiple users access to them. This paper proposes a privacy-preserving big data processing system that supports Homomorphic Re-Encryption with a Laplacian phase, extending the original single-user scheme by offering ciphertext re-encryption so that processed ciphertexts can be accessed by others. Through the cooperation of a Data Provider, and to increase the flexibility and security of the system, multiple Services take charge of the data from their users, and computing operations are designed over ciphertexts belonging to multiple Services. The analysis shows that the proposed privacy-preserving method performs well in terms of security on some datasets, at some cost in efficiency, while ensuring security and user privacy.
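The abstract does not spell out the re-encryption construction, but the underlying idea of computing on ciphertexts can be illustrated with textbook Paillier encryption, a standard additively homomorphic scheme: two ciphertexts can be multiplied so that decryption yields the sum of the plaintexts, without the processing party ever seeing raw data. This is a hedged sketch, not the paper's actual scheme; the tiny primes are for demonstration only.

```python
import math
import random

def keygen(p, q):
    """Textbook Paillier key generation from two distinct primes (toy sizes here)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # works because g = n + 1, so L(g^lam mod n^2) = lam mod n
    return (n,), (lam, mu, n)     # public key, private key

def encrypt(pk, m):
    """Encrypt 0 <= m < n with fresh randomness r coprime to n."""
    (n,) = pk
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    lam, mu, n = sk
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the hidden plaintexts.
pk, sk = keygen(1009, 1013)
n = pk[0]
c1, c2 = encrypt(pk, 123), encrypt(pk, 456)
assert decrypt(sk, (c1 * c2) % (n * n)) == 123 + 456
```

A server holding only `pk` could aggregate encrypted values this way; only the key holder can decrypt the result, which is exactly the single-decryptor limitation the paper's re-encryption step is meant to relax.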



2017 ◽  
Author(s):  
Christopher Soghoian

Over the last few years, consumers, corporations and governments have rushed to move their data to “the cloud,” adopting web-based applications and storage solutions provided by companies that include Amazon, Google, Microsoft and Yahoo. Unfortunately, the shift to cloud computing needlessly exposes users to privacy invasion and fraud by hackers. Cloud-based services also leave end users vulnerable to significant invasions of privacy by the government, resulting in the evisceration of traditional Fourth Amendment protections of a person’s private files and documents. These very real risks associated with the cloud computing model are not communicated to consumers, who are thus unable to make an informed decision when evaluating cloud-based services. This paper will argue that the increased risk that users face from hackers is primarily a result of cost-motivated design decisions on the part of the cloud providers, who have repeatedly opted to forgo strong security solutions already used in other Internet-based industries. With regard to the intrusion upon user privacy by government agencies, fault for this privacy harm lies not with the service providers but with the inherently coercive powers the government can flex at will. The third-party doctrine, which permits government agents to obtain users’ private files from service providers with a mere subpoena, is frequently criticized by privacy scholars. However, this paper will argue that this doctrine becomes moot once encryption is in use and companies no longer have access to their customers’ private data. The real threat to privacy lies in the fact that corporations can be, and repeatedly have been, forced to modify their own products in ways that harm end-user privacy, such as by circumventing encryption.
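The encryption argument can be made concrete with a toy sketch: if the user encrypts locally before uploading, the provider stores only ciphertext and has nothing meaningful to hand over under subpoena. The one-time pad below uses only the standard library and is purely illustrative; a real deployment would use an authenticated cipher such as AES-GCM from a vetted cryptography library.

```python
import secrets

def encrypt_locally(plaintext: bytes):
    """One-time pad: the key is as long as the message and must never be reused."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    # The ciphertext goes to the cloud provider; the key never leaves the user's device.
    return ciphertext, key

def decrypt_locally(ciphertext: bytes, key: bytes) -> bytes:
    """XOR with the same key recovers the original plaintext."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))
```

With this split, a subpoena served on the provider yields only the ciphertext, which without the user-held key reveals nothing about the file's contents.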


2019 ◽  
Author(s):  
Mohamad A. Chehab

The wide applicability of the Internet of Things (IoT) would truly enable the pervasiveness of smart devices for sensing data. IoT coupled with machine learning would usher in an era of smart, personalized services. Achieving service personalization requires collecting sensitive data about users, which raises privacy concerns due to the possibility of the data being abused or of attackers gaining unauthorized access. Moreover, IoT devices are resource- and computationally constrained, which makes it difficult to perform heavy protection mechanisms. Despite the presence of several solutions for protecting user privacy, they were not created for the purpose of running on small devices at a large scale. On top of that, existing solutions lack customization of user privacy, in which users have little to no control over their own private data. In this regard, we address the aforementioned issue of protecting users' privacy while taking into account efficiency as well as memory usage. The proposed scheme embeds an efficient and lightweight algebra-based approach that targets user privacy and provides efficient policy evaluation. Moreover, an intelligent model to customize a user's privacy based on real-time behavior is integrated. Experiments were conducted on synthetic and real-life scenarios to demonstrate the feasibility and relevance of our proposed framework within an IoT environment.
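The abstract does not define its policy algebra, so as a hypothetical illustration of what lightweight policy evaluation on a constrained device might look like, the sketch below checks each attribute of a data request against a user-defined allow-set; all names and attributes are invented for the example.

```python
# Hypothetical sketch of lightweight attribute-based policy evaluation for
# IoT data requests (illustrative only; not the paper's actual algebra).

def evaluate(policy, request):
    """Allow a request only if every attribute constraint in the policy is satisfied.

    policy:  dict mapping attribute name -> set of permitted values
    request: dict mapping attribute name -> requested value
    """
    return all(request.get(attr) in allowed for attr, allowed in policy.items())

# Example: a user permits only heart-rate and step data, and only to one app.
policy = {"data_type": {"heart_rate", "steps"}, "requester": {"health_app"}}
print(evaluate(policy, {"data_type": "heart_rate", "requester": "health_app"}))  # True
print(evaluate(policy, {"data_type": "location", "requester": "health_app"}))    # False
```

Because evaluation is a single pass over the policy's constraints with set lookups, it stays cheap in both time and memory, which is the kind of property the abstract's "lightweight" claim implies.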

