Privacy Policy
Recently Published Documents


TOTAL DOCUMENTS: 424 (FIVE YEARS: 121)
H-INDEX: 22 (FIVE YEARS: 3)

2022 · Vol 6 (GROUP) · pp. 1-14
Author(s): Lindah Kotut, D. Scott McCrickard

Privacy policy and terms-of-agreement documents are considered the gateway to software adoption and use. The documents provide a means for the provider to outline expectations of software use, and also provide an often-separate document outlining how user data is collected, stored, and used, including whether it is shared with other parties. A user who agrees to the terms is assumed to fully understand the agreement and to have provided consent. Often, however, users do not read the documents because they are long and full of legalistic and inconsistent language, are regularly amended, and may not disclose all the details of what is done with user data. Enforcing compliance and ensuring user consent have been persistent challenges for policy makers and privacy researchers. This design fiction puts forward an alternate reality and presents a policy-based approach to fording the consent gap with the TL;DR Charter: an agreement governing the parties involved by harnessing the power of formal governments, industry, and other stakeholders, and taking users' expectations of privacy into account. The Charter allows us as researchers to examine the implications for trust, decision-making, consent, and accountability, and the impact of future technologies.


2022 · Vol 2022 · pp. 1-12
Author(s): Yaojia Tang, Luna Wang

Firms’ privacy protection strategies are affected by multiple factors. This study adopted a configurational perspective to examine how the regulatory policy environment, market structure, and heterogeneity among enterprises affect their privacy protection policies. Using a fuzzy set qualitative comparative analysis (fsQCA) of Chinese listed platform enterprises, we found that three configuration conditions were associated with enterprises formulating a privacy policy with a high level of protection, and two configuration conditions were associated with enterprises formulating a privacy policy with a low level of protection. The results showed that privacy protection laws were a necessary condition to ensure that enterprises actively exercised privacy protection. Coordinated regulation systems based on the Personal Information Protection Law and industry standards are recommended as the best practice to safeguard privacy protection in China. As a lack of competition can result in two polarized privacy protection strategies, regulatory policies should emphasize the balance between data protection and encouraging necessary data sharing. Furthermore, the conjunctive effect between market structure and business models affected privacy policy formulation, which suggests that the positive effects of users’ rational choices in a competitive market should be further reinforced.
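The fsQCA method mentioned above links a configuration of conditions to an outcome via a set-theoretic consistency score, conventionally Σ min(xᵢ, yᵢ) / Σ xᵢ over fuzzy membership values. As a rough numerical sketch (the membership scores below are invented for illustration, not the study's data):

```python
def consistency(x, y):
    """Set-theoretic consistency of "configuration X is sufficient for outcome Y":
    sum of min(x_i, y_i) divided by sum of x_i, for fuzzy memberships in [0, 1]."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(x)

# Invented membership scores for five firms in a hypothetical configuration
# (e.g. "strict regulation AND competitive market") and in the outcome
# "high-protection privacy policy".
config  = [0.9, 0.8, 0.6, 0.2, 0.7]
outcome = [0.8, 0.9, 0.7, 0.3, 0.6]

score = consistency(config, outcome)  # 3.0 / 3.2 = 0.9375
```

A configuration is usually only accepted as sufficient for the outcome when its consistency clears a threshold (often around 0.8), which a score of 0.9375 would satisfy.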


2021 · Vol 10 (1)
Author(s): Claire McCloskey

This paper critically evaluates the market-based system governing data collection in the United States. The discussion is centred around Big Tech, a group of information intermediaries responsible for the ongoing extraction and exploitation of consumer data. The exploitative system is enabled by the ubiquitous privacy policy, which ostensibly offers data subjects ‘notice’ of data collection and the ‘choice’ to consent to said collection. This paper critiques the ‘notice and choice’ model, concluding that the combined ambiguity and opacity of the privacy policy fail to offer subjects meaningful control over their data. To substantiate this argument, the paper evaluates the suitability of the market-based system in a broader sense, arguing that data collection practices preclude the knowledge parity necessary for an operative and fair market-based system. The paper concludes by ascertaining the suitability of state-based regulation, identifying data’s intrinsic relationship with ideals that are core to the Western tradition: equality, democracy, and autonomy.


2021
Author(s): H. N. Atapattu, W. S. N. Fernando, J. P. A. K Somasiri, P. M. K. Lokuge, A. N. Senarathne, ...


Author(s): David Lie, Lisa M. Austin, Peter Yi Ping Sun, Wenjun Qiu

We have a data transparency problem. Currently, one of the main mechanisms we have to understand data flows is through the self-reporting that organizations provide through privacy policies. These suffer from many well-known problems, problems that are becoming more acute with the increasing complexity of the data ecosystem and the role of third parties – the affiliates, partners, processors, ad agencies, analytic services, and data brokers involved in the contemporary data practices of organizations. In this article, we argue that automating privacy policy analysis can improve the usability of privacy policies as a transparency mechanism. Our argument has five parts. First, we claim that we need to shift from thinking about privacy policies as a transparency mechanism that enhances consumer choice and see them as a transparency mechanism that enhances meaningful accountability. Second, we discuss a research tool that we prototyped, called AppTrans (for Application Transparency), which can detect inconsistencies between the declarations in a privacy policy and the actions the mobile application can potentially take if it is used. We used AppTrans to test seven hundred applications and found that 59.5 per cent were collecting data in ways that were not declared in their policies. The vast majority of the discrepancies were due to third party data collection such as advertising and analytics. Third, we outline the follow-on research we did to extend AppTrans to analyse the information sharing of mobile applications with third parties, with mixed results. Fourth, we situate our findings in relation to the third party issues that came to light in the recent Cambridge Analytica scandal and the calls from regulators for enhanced technical safeguards in managing these third party relationships. Fifth, we discuss some of the limitations of privacy policy automation as a strategy for enhanced data transparency and the policy implications of these limitations.
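The core check the abstract describes, flagging data an app can collect that its policy never declares, reduces to a set comparison. The sketch below is a simplified illustration, not the authors' implementation: the permission-to-data mapping and all names are assumptions, whereas AppTrans analyses what the application can actually do.

```python
# Hypothetical mapping from Android permissions to the data types they expose.
PERMISSION_TO_DATA = {
    "ACCESS_FINE_LOCATION": "location",
    "READ_CONTACTS": "contacts",
    "CAMERA": "photos",
    "READ_PHONE_STATE": "device_id",
}

def undeclared_collection(app_permissions, declared_types):
    """Return data types the app can collect but its privacy policy never declares."""
    collectable = {PERMISSION_TO_DATA[p]
                   for p in app_permissions if p in PERMISSION_TO_DATA}
    return sorted(collectable - set(declared_types))

# An app that requests location and contacts access but only declares
# location collection in its policy:
flags = undeclared_collection(
    app_permissions={"ACCESS_FINE_LOCATION", "READ_CONTACTS", "INTERNET"},
    declared_types={"location"},
)
# flags == ["contacts"]
```

In practice the hard parts are on either side of this comparison: extracting declared data types from policy prose and determining what the app binary can potentially collect, which is where the paper's mixed results on third-party sharing arise.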


2021 · Vol 11 (24) · pp. 11629
Author(s): Zhong Zhang, Minho Shin

Within the scope of mobile privacy, there are many attack methods that can leak users’ private information. The communication between applications can be used to violate permissions and access private information without asking for the user’s authorization. Hence, many researchers have developed protection mechanisms against privilege escalation. However, attackers can further utilize inference algorithms to derive new information out of available data, or to improve the information quality, without violating privilege limits. In this work, we describe the notion of the Information Escalation Attack and propose a detection and protection mechanism using an Inference Graph and a Policy Engine that lets the user control their policy on the app’s privileges under information escalation. Our implementation results show that the proposed privacy protection service is feasible and provides good usability.
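The escalation idea, deriving protected information from individually permitted data, can be illustrated with a toy inference graph. The rules and data-item names below are hypothetical, and the paper's Inference Graph and Policy Engine are more elaborate; this sketch only shows why reachability over inference rules matters.

```python
# Each rule: a set of data items that together allow inferring a new item.
RULES = [
    ({"wifi_ssid"}, "coarse_location"),
    ({"coarse_location", "timestamp"}, "home_address"),
]

def derivable(granted, rules):
    """Fixpoint of everything inferable from the granted data items."""
    known = set(granted)
    changed = True
    while changed:
        changed = False
        for inputs, output in rules:
            if inputs <= known and output not in known:
                known.add(output)
                changed = True
    return known

def escalations(granted, denied, rules):
    """Denied items that nonetheless become derivable from granted ones."""
    return sorted(derivable(granted, rules) & set(denied))

# Location permissions are denied, yet Wi-Fi metadata plus timestamps
# let the app infer both location items:
leaks = escalations({"wifi_ssid", "timestamp"},
                    {"coarse_location", "home_address"}, RULES)
# leaks == ["coarse_location", "home_address"]
```

A policy engine in this spirit would evaluate such reachability before granting a request, rather than checking each permission in isolation.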


2021 · Vol 5 (Supplement_1) · pp. 590-590
Author(s): Yvonne Kerkhof, Teake Ettema, Karin Dijkstra, Rose-Marie Dröes, David Neal

The ability of people with dementia and their caregivers to successfully navigate online environments is increasingly important to their social health. However, uncertainty about privacy online is an important barrier. Theoretically, access to published privacy policies should allow users of websites or software applications to make informed decisions. In practice, such documents are often complicated texts, and consequently even less accessible to people with cognitive impairment than to the general population. We present results from a multi-stakeholder, user-centred design process, towards an accessible alternative: a ‘dementia-friendly privacy policy’. Three design sprints took place in 2021, led by participants of the ‘Smart Solutions Semester’ at Saxion University of Applied Sciences in the Netherlands, in collaboration with cognitively unimpaired laypeople, people with dementia, informal caregivers, and expert stakeholders. Outputs were specifications for the solution, low-fidelity prototypes and high-fidelity prototypes, respectively. The dementia-friendly privacy policy is now ready for implementation and further evaluation.


2021 · Vol 2021 · pp. 1-19
Author(s): Ming Di, Shah Nazir, Fucheng Deng

The wide-ranging deployment of Android applications across devices, from smartphones to intelligent televisions, has made privacy a challenging problem for developers. The permission-granting mechanism is one source of defects: it does not allow the user to comprehend the privacy implications of granting a permission. Mobile applications are readily accessible to typical mobile users, and despite their potential for improving the affordability, availability, and effectiveness of delivering various services, they handle sensitive data and information that carry considerable security and privacy risks. Users are usually unaware of how their data can be managed and used. Reusable resources are available in the form of third-party libraries, which are broadly used in Android apps. These provide a diversity of functions but also raise privacy and security concerns, because host applications and third-party libraries run in the same process and share the same permissions. The current study presents an overview of the existing approaches, methods, and tools for influencing user behavior concerning Android privacy policy. Various prominent libraries were searched, their search results were analyzed briefly, and the results were presented from diverse perspectives to show the details of the work done in the area. This will help researchers to offer new solutions in this area of research.


Digital · 2021 · Vol 1 (4) · pp. 198-215
Author(s): Dhiren A. Audich, Rozita Dara, Blair Nonnecke

Privacy policies play an important part in informing users about their privacy concerns by operating as memorandums of understanding (MOUs) between them and online service providers. Research suggests that these policies are infrequently read because they are often lengthy, written in jargon, and incomplete, making them difficult for most users to understand. Users are more likely to read short excerpts of privacy policies if they pertain directly to their concern. In this paper, a novel approach and a proof-of-concept tool are proposed that reduce the amount of privacy policy text a user has to read. The approach uses a domain ontology and natural language processing (NLP) to identify the key areas of a policy that users should read to address their concerns and take appropriate action. Using the ontology to locate key parts of privacy policies, average reading times were substantially reduced, from 29–32 minutes to 45 seconds.
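The ontology-guided idea, surfacing only the policy text relevant to a user's concern, can be sketched as scoring paragraphs against a concern-to-terms mapping. The mini-ontology and policy text below are invented for illustration; the actual tool uses a full domain ontology and an NLP pipeline rather than literal keyword counts.

```python
# Hypothetical mini-ontology: each user concern maps to related terms.
ONTOLOGY = {
    "third_party_sharing": {"share", "third party", "partners", "affiliates"},
    "retention": {"retained", "retention", "delete", "deletion"},
}

def best_excerpt(policy_paragraphs, concern):
    """Return the paragraph that best matches the ontology terms for a concern."""
    terms = ONTOLOGY[concern]
    def score(paragraph):
        low = paragraph.lower()
        return sum(low.count(term) for term in terms)
    return max(policy_paragraphs, key=score)

policy = [
    "We collect usage data to improve our services.",
    "We may share your data with third party partners and affiliates.",
    "Data is retained for 12 months before deletion.",
]

excerpt = best_excerpt(policy, "third_party_sharing")
# excerpt == "We may share your data with third party partners and affiliates."
```

Showing only the best-matching excerpt for each concern is what drives the reading-time reduction the abstract reports: the user reads one targeted passage instead of the whole policy.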

