privacy threats
Recently Published Documents

TOTAL DOCUMENTS: 156 (five years: 58)
H-INDEX: 11 (five years: 4)

Author(s):  
Lev Velykoivanenko ◽  
Kavous Salehzadeh Niksirat ◽  
Noé Zufferey ◽  
Mathias Humbert ◽  
Kévin Huguenin ◽  
...  

Fitness trackers are increasingly popular. The data they collect provides substantial benefits to their users, but it also creates privacy risks. In this work, we investigate how fitness-tracker users perceive the utility of the features their trackers provide and the associated privacy-inference risks. We conduct a longitudinal study composed of a four-month period of fitness-tracker use (N = 227), followed by an online survey (N = 227) and interviews (N = 19). We assess the users' knowledge of concrete privacy threats that fitness-tracker users are exposed to (as demonstrated by previous work), the privacy-preserving actions users can take, and their perceptions of the utility of the features the trackers provide. We also study the potential for data minimization and the users' mental models of how the fitness-tracking ecosystem works. Our findings show that participants are aware that some types of information can be inferred from the data collected by fitness trackers; for instance, participants correctly guessed that sexual activity could be inferred from heart-rate data. However, participants did not realize that non-physiological information could also be inferred from the data. Our findings demonstrate a high potential for data minimization, either by processing data locally or by decreasing the temporal granularity of the data sent to the service provider. Furthermore, we identify participants' lack of understanding and common misconceptions about how the Fitbit ecosystem works.
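The data-minimization idea the abstract mentions, decreasing temporal granularity before upload, can be sketched as follows. The function name and the per-second input format are illustrative assumptions, not the study's actual pipeline:

```python
from statistics import mean

def coarsen_heart_rate(samples, window=60):
    """Aggregate fine-grained heart-rate samples (here assumed to be a
    list of per-second beats-per-minute readings) into per-`window`-second
    averages, so only coarse data leaves the device."""
    return [round(mean(samples[i:i + window]))
            for i in range(0, len(samples), window)]

# Two minutes of per-second readings collapse into two 60-second averages.
fine = [60] * 60 + [90] * 60
coarse = coarsen_heart_rate(fine)
print(coarse)  # [60, 90]
```

Local aggregation like this keeps the utility of long-term trends while withholding the fine-grained signal from which sensitive activities could be inferred.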


JAMIA Open ◽  
2021 ◽  
Author(s):  
Ram D Gopal ◽  
Hooman Hidaji ◽  
Raymond A Patterson ◽  
Niam Yaraghi

Abstract Objectives To examine the impact of the COVID-19 pandemic on the extent of potential violations of Internet users' privacy. Materials and Methods We conducted a longitudinal study of the data-sharing practices of the top 1,000 websites in the US between April 9 and August 27, 2020. We fitted a conditional latent growth curve model to the data to examine the longitudinal trajectory of third-party data sharing over the 21-week study period and how website characteristics affect this trajectory. We denote websites that asked for permission before placing cookies on users' browsers as "privacy-respecting". Results As the weekly number of COVID-19 deaths increased by 1,000, the average number of third parties increased by 0.26 units [95% CI, 0.15 to 0.37; P < .001] in the next week. This effect was more pronounced for websites with higher traffic, which increased their third parties by an additional 0.41 units per week [95% CI, 0.18 to 0.64; P < .001]. However, privacy-respecting websites that experienced a surge in traffic reduced their third parties by 1.01 units per week [95% CI, -2.01 to 0; P = .05] in response to every 1,000 COVID-19 deaths in the preceding week. Discussion While websites in general shared their users' data with more third parties as COVID-19 progressed in the US, websites' expected traffic and respect for users' privacy significantly affected this trajectory. Conclusions Attention should also be paid to the impact of the pandemic on elevating online privacy threats, and to the variation in third-party tracking among different types of websites. Lay Summary As the COVID-19 pandemic progressed in the country, the demand for online services surged. As the level of Internet use increased, websites' opportunity to track and monetize users' data increased with it.
In this research, we examine the extent to which websites increased the number of third parties with which they share their users' data, and how such practices were moderated by a website's level of respect for users' privacy and by surges in its traffic. We find that while the number of third parties increased over time, websites with higher respect for privacy tended to decrease the number of their third parties only if they also experienced a significant increase in traffic.
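The reported point estimates can be combined into a back-of-the-envelope predictor of the weekly change in third-party count. This is an illustration of the reported coefficients only, not the fitted latent growth curve model; the function name and the additive combination of effects are assumptions:

```python
def expected_third_party_change(deaths_prev_week,
                                high_traffic=False,
                                privacy_respecting_surge=False):
    """Weekly change in a site's third-party count per the abstract's
    point estimates: +0.26 per 1,000 COVID-19 deaths in the preceding
    week, +0.41 extra for high-traffic sites, and -1.01 for
    privacy-respecting sites experiencing a traffic surge."""
    per_1k_deaths = 0.26
    if high_traffic:
        per_1k_deaths += 0.41
    if privacy_respecting_surge:
        per_1k_deaths -= 1.01
    return per_1k_deaths * (deaths_prev_week / 1000)

# 2,000 deaths in the prior week for an ordinary site: about +0.52 third parties.
delta = expected_third_party_change(2000)
```

Note the sign flip for privacy-respecting sites under a traffic surge: 0.26 - 1.01 < 0, which is the abstract's finding that such sites shed third parties as the pandemic worsened.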


Author(s):  
Gabriele Civitarese ◽  
Juan Ye ◽  
Matteo Zampatti ◽  
Claudio Bettini

One of the major challenges in Human Activity Recognition (HAR) based on machine learning is the scarcity of labeled data. Indeed, collecting a sufficient amount of training data to build a reliable recognition model is often prohibitive. Among the many solutions in the literature to mitigate this issue, collaborative learning is emerging as a promising direction for distributing the annotation burden over multiple users who cooperate to build a shared recognition model. A major limitation of existing methods is that they assume a static activity model with a fixed set of target activities. In this paper, we propose a novel approach based on Growing When Required (GWR) neural networks. A GWR network continuously adapts itself to the input training data, and hence it is particularly well suited when users have heterogeneous sets of activities. As in federated learning, for the sake of privacy preservation, each user contributes to the global activity classifier by sharing personal model parameters rather than raw data. To further mitigate privacy threats, we implement a strategy that avoids releasing model parameters that may indirectly reveal information about activities the user has specifically marked as private. Our results on two well-known publicly available datasets show the effectiveness and flexibility of our approach.
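The withholding strategy for private activities can be sketched in a few lines. The representation of a GWR network as labeled prototype nodes is a simplifying assumption for illustration; the paper's actual parameter-filtering strategy may differ:

```python
def shareable_parameters(gwr_nodes, private_activities):
    """gwr_nodes: hypothetical list of (weight_vector, activity_label)
    pairs from a user's locally trained GWR network. Before the user
    contributes to the global classifier, drop every node associated
    with an activity the user marked private, so its parameters are
    never released to other participants."""
    return [(weights, label) for weights, label in gwr_nodes
            if label not in private_activities]

local_nodes = [([0.1, 0.2], "walking"),
               ([0.5, 0.4], "smoking"),   # marked private by the user
               ([0.3, 0.3], "running")]
shared = shareable_parameters(local_nodes, private_activities={"smoking"})
print([label for _, label in shared])  # ['walking', 'running']
```

The key design point mirrors federated learning: only model parameters cross the device boundary, and here even parameters are screened against the user's private-activity list first.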


2021 ◽  
Author(s):  
Junpeng Zhang ◽  
Mengqian Li ◽  
Shuiguang Zeng ◽  
Bin Xie ◽  
Dongmei Zhao

2021 ◽  
Author(s):  
Ram Mohan Rao P ◽  
S Murali Krishna ◽  
AP Siva Kumar

Today we live in a digitally rich, technology-driven world where extremely large amounts of data, including personal data, are generated every hour in the public domain. Applications such as social media, e-commerce, and smartphone apps collect a great deal of personal data that can harm individual privacy if leaked; an ethical code of conduct is therefore required to ensure data privacy. Privacy threats include digital profiling, cyberstalking, and recommendation systems that lead to the disclosure of sensitive data or the sharing of data without the data owner's consent. Data privacy has gained significant importance in recent times, as is evident from the privacy legislation passed in more than 100 countries. Firms dealing with data-sensitive applications need to abide by the privacy legislation of their respective territorial regions. To address these privacy challenges, we have designed guidelines for application development that incorporate key features of privacy regulations, along with implementation strategies, to help developers build data-sensitive applications that offer strong and coherent privacy protection of personal data.


2021 ◽  
Vol 2021 ◽  
pp. 1-16
Author(s):  
Li Duan ◽  
Kejia Zhang ◽  
Bo Cheng ◽  
Bingfei Ren

The emerging, overclocking-signal-based acoustic covert communication technique allows smart devices to communicate (without users' consent) through ultrasonic side channels using their microphones and speakers, which enables imperceptible and convenient personalized services, e.g., cross-device authentication and media tracking. However, microphones and speakers can also be used maliciously and pose severe privacy threats to users. In this paper, we propose a novel high-frequency-filtering- (HFF-) based protection model, named UltraFilter, which protects user privacy by enabling users to selectively filter out high-frequency signals from the audio data received by the device. We also analyze the feasibility of using audible frequencies (i.e., ≤18 kHz) for acoustic covert communication and implement such a communication system by exploiting the auditory masking effect. Experiments show that UltraFilter can prevent users' private information from leaking and reduce system load, and that audible frequencies can also pose threats to user privacy.
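The core high-frequency-filtering idea, removing spectral content at or above the 18 kHz boundary the abstract mentions, can be sketched with a simple FFT-based low-pass filter. The function name and implementation are assumptions for illustration, not the authors' UltraFilter code:

```python
import numpy as np

def high_frequency_filter(signal, sample_rate, cutoff_hz=18_000):
    """Zero out all spectral components at or above cutoff_hz before the
    audio reaches applications, removing the ultrasonic band used by
    covert acoustic channels while leaving audible content intact."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[freqs >= cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# One second at 48 kHz: an audible 1 kHz tone plus a covert 19 kHz carrier.
sr = 48_000
t = np.arange(sr) / sr
mixed = np.sin(2 * np.pi * 1_000 * t) + np.sin(2 * np.pi * 19_000 * t)
clean = high_frequency_filter(mixed, sr)  # 19 kHz carrier is removed
```

As the abstract notes, such a filter only blocks the ultrasonic band; a covert channel hidden in audible frequencies via auditory masking would pass straight through it, which is exactly the residual threat the paper demonstrates.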


2021 ◽  
Vol 23 (06) ◽  
pp. 1267-1278
Author(s):  
Dr. Anand N. Raut
Reliance on technology has increased manifold, and children have not remained aloof from its use. The COVID-19 pandemic forced schools, playgrounds, and public places to remain closed, resulting in increased use of technology by children for purposes such as education, gaming, entertainment, creative activities, and communication. As some technologies are capable of collecting, maintaining, processing, and sharing gigantic amounts of personal data, such technologies, especially those that direct services specifically towards children, pose a significant threat to the online privacy of children, who are vulnerable owing to their inability to comprehend the consequences of their online activities. The paper discusses children's use of technological applications and the privacy threats these may pose, critically evaluates the adequacy of the law in India to tackle them, and offers recommendations to strengthen online privacy.


Author(s):  
Pietro Spadaccino ◽  
Domenico Garlisi ◽  
Francesca Cuomo ◽  
Giorgio Pillon ◽  
Patrizio Pisani
Keyword(s):  