Privacy Preferences
Recently Published Documents

TOTAL DOCUMENTS: 147 (FIVE YEARS: 41)
H-INDEX: 16 (FIVE YEARS: 2)

Author(s):  
Gabriele Civitarese ◽  
Juan Ye ◽  
Matteo Zampatti ◽  
Claudio Bettini

One of the major challenges in Human Activity Recognition (HAR) based on machine learning is the scarcity of labeled data. Indeed, collecting a sufficient amount of training data to build a reliable recognition model is often prohibitive. Among the many solutions in the literature to mitigate this issue, collaborative learning is emerging as a promising direction to distribute the annotation burden over multiple users who cooperate to build a shared recognition model. A major limitation of existing methods is that they assume a static activity model with a fixed set of target activities. In this paper, we propose a novel approach based on Growing When Required (GWR) neural networks. A GWR network continuously adapts itself to the input training data, and hence is particularly suited to settings where users have heterogeneous sets of activities. As in federated learning, for the sake of privacy preservation, each user contributes to the global activity classifier by sharing personal model parameters rather than raw data. To further mitigate privacy threats, we implement a strategy that avoids releasing model parameters which may indirectly reveal information about activities the user has specifically marked as private. Our results on two well-known publicly available datasets show the effectiveness and flexibility of our approach.
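The following is a minimal sketch of the parameter-sharing filter described in this abstract, assuming a heavily simplified GWR-like model: prototype nodes are added when no existing node matches a sample well enough, and only prototypes whose activity label is not marked private are released for aggregation. All names, thresholds, and structures here are illustrative assumptions, not the authors' implementation; a real GWR also maintains node edges, habituation counters, and age-based pruning, which are omitted.

```python
import numpy as np

class LocalActivityModel:
    """Toy stand-in for a user's local GWR-like activity model."""

    def __init__(self):
        self.prototypes = []  # list of (weight_vector, activity_label) pairs

    def train_step(self, x, label, activity_threshold=0.85, lr=0.1):
        # With no nodes yet, seed the network with the first sample.
        if not self.prototypes:
            self.prototypes.append((x.copy(), label))
            return
        dists = [np.linalg.norm(x - w) for w, _ in self.prototypes]
        best = int(np.argmin(dists))
        activity = np.exp(-dists[best])  # activation of the best-matching node
        if activity < activity_threshold:
            # No node matches well enough: grow, as in GWR.
            self.prototypes.append((x.copy(), label))
        else:
            # Otherwise move the best-matching node toward the sample.
            w, l = self.prototypes[best]
            self.prototypes[best] = (w + lr * (x - w), l)

    def shareable_parameters(self, private_activities):
        # Release only prototypes whose label is not marked private,
        # so shared parameters cannot indirectly reveal those activities.
        return [(w, l) for w, l in self.prototypes
                if l not in private_activities]
```

For example, a user who marks "smoking" as private would train locally as usual but contribute only `shareable_parameters({"smoking"})`, so no prototype associated with that activity ever leaves the device.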


Author(s):  
Sofia Tsagiopoulou ◽  
Georgios Spathoulas ◽  
Athanasios Kakarountas

2021 ◽  
Author(s):  
Ahmed Alhazmi ◽  
Ghassen Kilani ◽  
William Allen ◽  
TJ O'Connor

2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Mahdi Nasrullah Al-Ameen ◽  
Apoorva Chauhan ◽  
M.A. Manazir Ahsan ◽  
Huzeyfe Kocabas

Purpose
With the rapid deployment of Internet of Things (IoT) technologies, it has become essential to address security and privacy issues by maintaining transparency in data practices. Prior research has focused on identifying people's privacy preferences in different contexts of IoT usage and their mental models of security threats. However, the existing literature offers little understanding of the mismatch between users' perceptions and the actual data practices of IoT devices. Such mismatches could lead users to unknowingly share private information, exposing themselves to unanticipated privacy risks. This paper aims to identify these mismatched privacy perceptions.

Design/methodology/approach
The authors conducted a lab study with 42 participants, comparing participants' perceptions with the data practices stated in the privacy policies of 28 IoT devices from different categories, including health and exercise, entertainment, smart homes, toys and games, and pets.

Findings
The authors identified mismatched privacy perceptions of users in terms of data collection, sharing, protection and storage period. The findings revealed mismatches between users' perceptions and the data practices of IoT devices for various types of information, including personal, contact, financial, health, location, media, connected-device, online social media and IoT device usage information.

Originality/value
The findings from this study lead to recommendations for designing simplified privacy notices that highlight unexpected data practices, which, in turn, would contribute to the secure and privacy-preserving use of IoT devices.


2021 ◽  
Vol 2021 (4) ◽  
pp. 249-269
Author(s):  
Maximilian Hils ◽  
Daniel W. Woods ◽  
Rainer Böhme

Abstract
Privacy preference signals are digital representations of how users want their personal data to be processed. Such signals must be adopted by both the sender (users) and the intended recipients (data processors). Adoption represents a coordination problem that remains unsolved despite efforts dating back to the 1990s. Browsers implemented standards like the Platform for Privacy Preferences (P3P) and Do Not Track (DNT), but vendors profiting from personal data faced few incentives to receive and respect the expressed wishes of data subjects. In the wake of recent privacy laws, a coalition of AdTech firms published the Transparency and Consent Framework (TCF), which defines an opt-in consent signal. This paper integrates post-GDPR developments into the wider history of privacy preference signals. Our main contribution is a high-frequency longitudinal study describing how the TCF signal gained dominance as of February 2021. We explore which factors correlate with adoption at the website level. Both the number of third parties on a website and the presence of Google Ads are associated with higher adoption of TCF. Further, we show that vendors acted as early adopters of TCF 2.0 and provide two case studies describing how Consent Management Providers shifted existing customers to TCF 2.0. We sketch ways forward for a pro-privacy signal.
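As a rough illustration of how TCF adoption can be detected at the website level (not the authors' measurement pipeline), the sketch below probes a page's static HTML for the API stubs the framework mandates: TCF v2 requires Consent Management Platforms to expose window.__tcfapi, and the legacy v1 API used window.__cmp. The marker list and heuristic are assumptions.

```python
import re
import requests

# TCF v2 obliges CMPs to expose a window.__tcfapi stub; TCF v1 used
# window.__cmp. Their presence in served HTML is a coarse adoption signal.
TCF_MARKERS = [r"__tcfapi", r"__cmp\b"]

def appears_to_adopt_tcf(url: str, timeout: float = 10.0) -> bool:
    """Heuristically check a page's static HTML for TCF API markers."""
    try:
        html = requests.get(url, timeout=timeout).text
    except requests.RequestException:
        return False
    return any(re.search(marker, html) for marker in TCF_MARKERS)

print(appears_to_adopt_tcf("https://example.com"))
```

A real crawl would render pages in a headless browser, since CMP scripts are typically injected dynamically and static HTML scanning undercounts adoption.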


2021 ◽  
Vol 2021 (4) ◽  
pp. 54-75
Author(s):  
Camille Cobb ◽  
Sruti Bhagavatula ◽  
Kalil Anderson Garrett ◽  
Alison Hoffman ◽  
Varun Rao ◽  
...  

Abstract
Recent research and articles in the popular press have raised concerns about the privacy risks that smart home devices can create for incidental users: people who encounter smart home devices that are owned, controlled, and configured by someone else. In this work, we present the results of a user-centered investigation that explores incidental users' experiences and the tensions that arise between device owners and incidental users. We conducted five focus group sessions through which we identified specific contexts in which someone might encounter other people's smart home devices and the main concerns device owners and incidental users have in such situations. We used these findings to inform the design of a survey instrument, which we deployed to a demographically representative sample of 386 adults in the United States. Through this survey, we can better understand which contexts and concerns are most bothersome and how often device owners are willing to accommodate incidental users' privacy preferences. We found some surprising trends in terms of what people are most worried about and what actions they are willing to take. For example, while participants who did not own devices themselves were often uncomfortable imagining them in their own homes, they were not as concerned about being affected by such devices in homes they entered as part of their jobs. Participants showed interest in privacy solutions with a technical implementation component, but also frequently envisioned an open dialogue between incidental users and device owners to negotiate privacy accommodations.


Author(s):  
V. K. Saxena ◽  
Shashank Pushkar

In the healthcare field, preserving the privacy of patients' electronic health records is a fundamental issue. Numerous techniques have emerged to maintain the privacy of such sensitive information. Acting as a first line of defence against illegal access, traditional access control schemes fall short of defending against misbehaviour by already authenticated and authorized users: a risk that can carry severe consequences in the event of data release or leakage. This paper introduces a novel risk reduction strategy for the healthcare domain in which the risk associated with an access request is evaluated against the privacy preferences of the patient undergoing the medical procedure. The proposed strategy determines the set of data objects that can be safely disclosed to the healthcare service provider, so that unnecessarily repeated tests and measurements can be avoided while the privacy preferences of the patient are preserved.
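A hypothetical sketch of this disclosure decision follows: each requested data object carries an estimated disclosure risk, the patient states a per-category tolerance, and only objects whose risk falls within that tolerance are released. The categories, scores, and thresholds are illustrative assumptions, not the paper's actual risk model.

```python
from dataclasses import dataclass

@dataclass
class DataObject:
    name: str
    category: str  # e.g. "lab_result", "mental_health", "billing"
    risk: float    # estimated disclosure risk in [0, 1]

def safe_disclosure_set(requested, patient_tolerance, default_tolerance=0.3):
    """Return the subset of requested objects whose disclosure risk is
    within the patient's stated tolerance for that category. The default
    covers categories the patient expressed no preference about."""
    released = []
    for obj in requested:
        tolerance = patient_tolerance.get(obj.category, default_tolerance)
        if obj.risk <= tolerance:
            released.append(obj)
    return released

request = [
    DataObject("CBC panel", "lab_result", 0.2),
    DataObject("therapy notes", "mental_health", 0.9),
]
prefs = {"lab_result": 0.5, "mental_health": 0.1}
print([o.name for o in safe_disclosure_set(request, prefs)])  # -> ['CBC panel']
```

Withheld objects need not be silently dropped; a deployment could instead prompt for explicit patient consent, but that step is outside this sketch.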


2021 ◽  
Vol 2021 (3) ◽  
pp. 373-393
Author(s):  
Jooyoung Lee ◽  
Sarah Rajtmajer ◽  
Eesha Srivatsavaya ◽  
Shomir Wilson

Abstract
Recent work has brought to light disparities in privacy-related concerns based on socioeconomic status, race and ethnicity. This paper examines relationships between U.S.-based Twitter users' socio-demographic characteristics and their privacy behaviors. Income, gender, age, race/ethnicity, education level and occupation are correlated with the stated and observed privacy preferences of 110 active Twitter users. Contrary to our expectations, analyses suggest that neither socioeconomic status (SES) nor demographics is a significant predictor of the use of account security features. We do find that gender and education predict the rate of self-disclosure, or voluntary sharing of personal information. We explore variability in the types of information disclosed among socio-demographic groups. Exploratory findings indicate that: 1) participants shared less personal information than they recalled having shared in exit surveys; 2) there is no strong correlation between people's stated attitudes and their observed behaviors.
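As a toy illustration of the kind of analysis the abstract describes (regressing a binary privacy behavior on socio-demographic covariates), the snippet below fits a logistic regression on synthetic data. The variable choices are assumptions based on the abstract, and the data are random, so the coefficients carry no empirical meaning.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 110  # matches the study's sample size

# Synthetic covariates standing in for the study's socio-demographics.
X = np.column_stack([
    rng.normal(50_000, 20_000, n),  # income
    rng.integers(18, 70, n),        # age
    rng.integers(0, 2, n),          # gender (binary-coded here for brevity)
    rng.integers(1, 6, n),          # education level (ordinal)
])
y = rng.integers(0, 2, n)  # uses an account security feature? (synthetic)

model = LogisticRegression(max_iter=1000).fit(X, y)
print(dict(zip(["income", "age", "gender", "education"], model.coef_[0])))
```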

