privacy policies
Recently Published Documents





2021 ◽  
Siena Gioia ◽  
Irma M Vlassac ◽  
Demsina Babazadeh ◽  
Noah L Fryou ◽  
Elizabeth Do ◽  

Abstract: Over the last decade, health apps have become an increasingly popular tool utilized by clinicians and researchers to track food consumption and exercise. However, as consumer apps have primarily focused on tracking dietary intake and exercise, many lack technological features to facilitate the capture of critical food timing details. To determine a viable app that recorded both dietary intake and food timing for use in our clinical study, we evaluated the timestamp data, usability, privacy policies, accuracy of nutrient estimates, and general features of 11 mobile apps for dietary assessment. Apps were selected using a keyword search of related terms, and the following apps were reviewed: Bitesnap, Cronometer, DiaryNutrition, DietDiary, FoodDiary, FoodView, Macros, MealLogger, myCircadianClock, MyFitnessPal, and MyPlate. Our primary goal was identifying apps that record food timestamps, which 8 of the reviewed apps did (73%). Of those, only 4/11 (36%) allowed users to edit the timestamps, an important feature. Next, we evaluated the usability of the apps using the System Usability Scale (SUS) across 2 days, with 9 of the 11 apps (82%) receiving favorable usability scores. To enable use in research and clinic settings, the privacy policies of each app were systematically reviewed using common criteria; only 1 app (Cronometer) was Health Insurance Portability and Accountability Act (HIPAA) compliant, and protected health information is collected by 9/11 (82%) of the apps. Lastly, to assess the accuracy of nutrient estimates generated by these apps, we selected 4 sample food items and one researcher's 3-day dietary record to input into each app. The caloric and macronutrient estimates of the apps were compared to nutrient estimates provided by a registered dietitian using the Nutrition Data System for Research (NDSR). Against the 3-day food record, the apps consistently underestimated calories and macronutrients relative to NDSR. Overall, we find the Bitesnap app to provide flexible dietary and food timing functionality suitable for research or clinical use, while the majority of apps lacked the necessary food timing functionality or user privacy protections.
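The abstract does not give the SUS arithmetic, but the scale's standard scoring is well defined: ten 1-5 Likert items, where odd (positively worded) items contribute (response - 1), even (negatively worded) items contribute (5 - response), and the sum is scaled by 2.5 to a 0-100 score. A minimal sketch of that calculation:

```python
def sus_score(responses):
    """Standard System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are multiplied by 2.5, giving 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# Example: 4 on every positive item, 2 on every negative item
print(sus_score([4, 2] * 5))  # -> 75.0
```

Scores of roughly 68 and above are conventionally read as above-average usability, which is one common way a "favorable" cutoff like the study's could be operationalized.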

David Lie ◽  
Lisa M. Austin ◽  
Peter Yi Ping Sun ◽  
Wenjun Qiu

We have a data transparency problem. Currently, one of the main mechanisms we have to understand data flows is through the self-reporting that organizations provide through privacy policies. These suffer from many well-known problems, problems that are becoming more acute with the increasing complexity of the data ecosystem and the role of third parties – the affiliates, partners, processors, ad agencies, analytic services, and data brokers involved in the contemporary data practices of organizations. In this article, we argue that automating privacy policy analysis can improve the usability of privacy policies as a transparency mechanism. Our argument has five parts. First, we claim that we need to shift from thinking about privacy policies as a transparency mechanism that enhances consumer choice and see them as a transparency mechanism that enhances meaningful accountability. Second, we discuss a research tool that we prototyped, called AppTrans (for Application Transparency), which can detect inconsistencies between the declarations in a privacy policy and the actions the mobile application can potentially take if it is used. We used AppTrans to test seven hundred applications and found that 59.5 per cent were collecting data in ways that were not declared in their policies. The vast majority of the discrepancies were due to third party data collection such as advertising and analytics. Third, we outline the follow-on research we did to extend AppTrans to analyse the information sharing of mobile applications with third parties, with mixed results. Fourth, we situate our findings in relation to the third party issues that came to light in the recent Cambridge Analytica scandal and the calls from regulators for enhanced technical safeguards in managing these third party relationships. Fifth, we discuss some of the limitations of privacy policy automation as a strategy for enhanced data transparency and the policy implications of these limitations.
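The abstract does not describe AppTrans's internals, but the consistency check it performs can be framed, at its simplest, as a set difference between the data types a policy declares and the data types the app's code can actually collect. A minimal illustrative sketch (the data-type names are hypothetical, not from the paper):

```python
def undeclared_collection(declared, observed):
    """Data types an app can collect that its privacy policy never declares.

    `declared` - data types mentioned in the policy text.
    `observed` - data types reachable by the app's code (e.g. found
    by static analysis of API calls). Returns the sorted discrepancy.
    """
    return sorted(set(observed) - set(declared))

# Hypothetical example: policy declares two types, analysis finds four,
# with the extras coming from bundled advertising/analytics libraries.
policy_declares = {"email", "location"}
app_can_collect = {"email", "location", "advertising_id", "contacts"}
print(undeclared_collection(policy_declares, app_can_collect))
# -> ['advertising_id', 'contacts']
```

The hard parts of the real system lie on either side of this comparison: extracting declarations from policy prose and soundly enumerating what the code can do; the set difference itself is the easy final step.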

Digital ◽  
2021 ◽  
Vol 1 (4) ◽  
pp. 198-215
Dhiren A. Audich ◽  
Rozita Dara ◽  
Blair Nonnecke

Privacy policies play an important part in informing users about their privacy concerns by operating as memorandums of understanding (MOUs) between them and online service providers. Research suggests that these policies are infrequently read because they are often lengthy, written in jargon, and incomplete, making them difficult for most users to understand. Users are more likely to read short excerpts of privacy policies if they pertain directly to their concerns. In this paper, a novel approach and a proof-of-concept tool are proposed that reduce the amount of privacy policy text a user has to read. The tool uses a domain ontology and natural language processing (NLP) to identify key areas of the policies that users should read to address their concerns and take appropriate action. Using the ontology to locate key parts of privacy policies, average reading times were substantially reduced from 29–32 min to 45 s.
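The paper's ontology is not reproduced in the abstract, but the core retrieval step it describes (mapping a user concern to ontology terms, then surfacing only the policy sentences that match) can be sketched as follows. The concern categories and term lists here are invented placeholders, not the authors' ontology:

```python
import re

# Hypothetical fragment of a privacy-concern ontology:
# each concern maps to indicator terms found in policy text.
ONTOLOGY = {
    "data_sharing": {"third party", "affiliate", "partner", "disclose"},
    "retention": {"retain", "store", "delete"},
}

def relevant_sentences(policy_text, concern):
    """Return only the policy sentences matching a user's concern,
    so the user reads a short excerpt instead of the full policy."""
    terms = ONTOLOGY[concern]
    sentences = re.split(r"(?<=[.!?])\s+", policy_text)
    return [s for s in sentences
            if any(t in s.lower() for t in terms)]

policy = ("We retain your data for two years. "
          "We may disclose information to our partners.")
print(relevant_sentences(policy, "retention"))
# -> ['We retain your data for two years.']
```

A real implementation would need lemmatization and phrase matching rather than raw substring tests, but the reading-time reduction comes from exactly this kind of concern-targeted filtering.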

2021 ◽  
Vol 13 (1) ◽  
pp. 5-16
Kayla Clarke

According to a 2019 Statistics Canada report, Canada leads every other country in the world in time spent online. This has likely been amplified by the stay-at-home orders of the COVID-19 crisis, which is why the new Bill C-11 seeks to strengthen the current policies defending Canadians from corporate digital overstep. Alexa, Please: Babysit My Child will explore, analyze, and evaluate Amazon's neuro-capitalistic technologies, specifically those made for child use. Neuro-capitalism is dangerous because it amounts to controlling the mind through today's hyper-technological society. Jurisdictional complexity surrounding A.I. and cybersecurity can be mitigated by government-funded education. Therefore, my research explores the question: from a neuro-capitalistic and digital-colonial standpoint, to what extent are Amazon's child-targeted technologies (such as Kindle 4 Kids) consistent with the privacy policies of the new, proposed Bill C-11? This policy analysis consists of three sections: first, an analysis of Amazon's Kindle 4 Kids Terms and Conditions (Site 1); second, an evaluation of Bill C-11's ability to protect children from the pernicious aspects of neuro-capitalism (Site 2); and lastly, a comparison of the two entities, ending with a discussion of the findings. Particularly during the COVID-19 crisis, we must be sure that the Government of Canada is doing everything in its power to aid the youth of the country that spends the most time online, and the most time with its babysitter: Alexa.

2021 ◽  
pp. 1329878X2110416
Ben Egliston ◽  
Marcus Carter

Virtual reality – a site of renewed interest for major players in the tech industry – is increasingly fraught with questions of data capture. This article examines the case of the Facebook-owned virtual reality company Oculus and its intensifying privacy and surveillance risks with respect to the data generated and gathered through its devices. To explore the surveillance-centred structures of Oculus, this article examines Oculus' privacy policies from December 2014 (the first version following the company's acquisition by Facebook) and October 2020 (the most recent iteration of the policy). In so doing, we examine these policies as sites of discourse, asking how they frame and afford power and control to Facebook, and position Facebook and Oculus' surveillant aims and logics relative to societal concerns about, and regulations of, data.
