“Won’t Somebody Think of the Children?” Examining COPPA Compliance at Scale

2018 ◽  
Vol 2018 (3) ◽  
pp. 63-83 ◽  
Author(s):  
Irwin Reyes ◽  
Primal Wijesekera ◽  
Joel Reardon ◽  
Amit Elazari Bar On ◽  
Abbas Razaghpanah ◽  
...  

Abstract We present a scalable dynamic analysis framework that allows for the automatic evaluation of the privacy behaviors of Android apps. We use our system to analyze mobile apps’ compliance with the Children’s Online Privacy Protection Act (COPPA), one of the few stringent privacy laws in the U.S. Based on our automated analysis of 5,855 of the most popular free children’s apps, we found that a majority are potentially in violation of COPPA, mainly due to their use of third-party SDKs. While many of these SDKs offer configuration options to respect COPPA by disabling tracking and behavioral advertising, our data suggest that a majority of apps either do not make use of these options or incorrectly propagate them across mediation SDKs. Worse, we observed that 19% of children’s apps collect identifiers or other personally identifiable information (PII) via SDKs whose terms of service outright prohibit their use in child-directed apps. Finally, we show that efforts by Google to limit tracking through the use of a resettable advertising ID have had little success: of the 3,454 apps that share the resettable ID with advertisers, 66% transmit other, non-resettable, persistent identifiers as well, negating any intended privacy-preserving properties of the advertising ID.
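The identifier-bridging behavior the authors describe can be illustrated with a minimal Kotlin sketch of the two Android APIs involved (an illustration, not the paper’s instrumentation); it assumes Google Play Services is present and that the call runs off the main thread:

```kotlin
import android.content.Context
import android.provider.Settings
import com.google.android.gms.ads.identifier.AdvertisingIdClient

// Resettable: the user can clear this ID at any time, severing the link
// between past and future ad profiles -- but only if it is the SOLE
// identifier an app transmits.
fun resettableAdId(context: Context): String {
    // Blocks on a Play Services IPC; must not be called on the main thread.
    val info = AdvertisingIdClient.getAdvertisingIdInfo(context)
    return info.id ?: ""
}

// Non-resettable: ANDROID_ID survives an advertising-ID reset, so sending it
// alongside the ad ID lets a tracker re-link the "new" ad ID to the old
// profile -- the bridging behavior observed in 66% of the apps above.
fun persistentId(context: Context): String =
    Settings.Secure.getString(context.contentResolver, Settings.Secure.ANDROID_ID)
```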

2021 ◽  
Vol 26 (4) ◽  
Author(s):  
Jordan Samhi ◽  
Kevin Allix ◽  
Tegawendé F. Bissyandé ◽  
Jacques Klein

Abstract Due to the convenience of access-on-demand to information and business solutions, mobile apps have become an important asset in the digital world. In the context of the COVID-19 pandemic, app developers have joined the response effort in various ways by releasing apps that target different user bases (e.g., all citizens or journalists), offer different services (e.g., location tracking or diagnostic-aid), provide generic or specialized information, etc. While many apps have raised concerns by spreading misinformation or even malware, the literature does not yet provide a clear landscape of the different apps that were developed. In this study, we focus on the Android ecosystem and investigate Covid-related Android apps. In a best-effort scenario, we attempt to systematically identify all relevant apps and study their characteristics with the objective of providing a first taxonomy of Covid-related apps, broadening the relevance beyond the implementation of contact tracing. Overall, our study yields a number of empirical insights that enlarge the knowledge on Covid-related apps: (1) developer communities responded rapidly to COVID-19, with dedicated apps released as early as January 2020; (2) Covid-related apps deliver digital tools to users (e.g., health diaries), serve to broadcast information to users (e.g., spread statistics), and collect data from users (e.g., for tracing); (3) Covid-related apps are less complex than standard apps; (4) they generally do not seem to leak sensitive data; (5) in the majority of cases, Covid-related apps are released by entities with past experience on the market, mostly official government entities or public health organizations.


2019 ◽  
Vol 42 (2) ◽  
Author(s):  
Alan Toy ◽  
Gehan Gunasekara

The data transfer model and the accountability model, the dominant models for protecting the data privacy rights of citizens, have begun to present significant difficulties in regulating the online and increasingly transnational business environment. Global organisations take advantage of forum selection clauses and choice of law clauses, and attention is diverted toward the data transfer model and the accountability model as means of data privacy protection; yet given well-known revelations regarding surveillance and the rise of technologies such as cloud computing, it is impossible to have confidence that the data privacy rights of citizens are adequately protected. Forum selection and choice of law clauses, however, no longer have the force they once seemed to have, which opens the possibility that extraterritorial jurisdiction may provide a supplementary conceptual basis for championing data privacy in the globalised context of the Internet. This article examines the current basis for extraterritorial application of data privacy laws and suggests a test for increasing their relevance.


2021 ◽  
Vol 10 (3) ◽  
pp. 283-306
Author(s):  
Yannic Meier ◽  
Johanna Schäwel ◽  
Nicole C. Krämer

Using privacy-protecting tools and reducing self-disclosure can decrease the likelihood of experiencing privacy violations. Whereas previous studies found that people’s online self-disclosure results from privacy risk and benefit perceptions, the present study extended this so-called privacy calculus approach by additionally focusing on privacy protection by means of a tool. Furthermore, it is important to understand contextual differences in privacy behaviors as well as the characteristics of privacy-protecting tools that may affect usage intention. Results of an online experiment (N = 511) supported the basic notion of the privacy calculus and revealed that perceived privacy risks were strongly related to participants’ desired privacy protection, which, in turn, was positively related to the willingness to use a privacy-protecting tool. Self-disclosure was found to be context dependent, whereas privacy protection was not. Moreover, participants were reluctant to use a tool that records their data, even though this recording was described as enhancing privacy protection.


2011 ◽  
pp. 2784-2797
Author(s):  
Jaymeen R. Shah ◽  
Garry L. White ◽  
James R. Cook

Privacy laws for the Internet are difficult to develop and implement domestically and internationally. A clear problem is that such laws are limited to national jurisdictions: what is legal in one country may be illegal in another. Due to differences in cultures, values, and government types, it may not be possible to establish global standards and legislation to ensure privacy. In the absence of global privacy standards, multinational (international) companies usually select one of two solutions: (1) implement a most-restrictive “one size fits all” privacy policy that is used across various countries, or (2) implement different privacy policies that meet the privacy regulations of different countries and the expectations of their citizens. To investigate which solution multinational companies adopt, and how companies view domestic privacy laws, the authors conducted a survey of U.S.-based employees of domestic and multinational companies. The results suggest that the majority of multinational companies prefer the first solution, the most-restrictive “one size fits all” approach: they develop and implement a single set of privacy policies that is used across their operations in different countries. The majority of the companies surveyed consider domestic privacy laws in the United States to be practical, but ineffective.


Author(s):  
Timothy Rouse ◽  
David N. Levine ◽  
Allison Itami ◽  
Benjamin Taylor

The U.S. has no comprehensive national law governing cybersecurity and no uniform framework for measuring the effectiveness of protections, even though retirement plan record keepers maintain personally identifiable information on millions of workers, collecting names, birth dates, Social Security numbers, and beneficiary details. Plan sponsors frequently engage consultants and attorneys to help them secure sensitive data, but broader industry engagement with the issue is still needed. The SPARK Institute has outlined a flexible approach for independent third-party reporting of cybersecurity capabilities built around several key control objectives.


2019 ◽  
Vol 47 (1) ◽  
pp. 70-87 ◽  
Author(s):  
Patricia A. Deverka ◽  
Dierdre Gilmore ◽  
Jennifer Richmond ◽  
Zachary Smith ◽  
Rikki Mangrum ◽  
...  

A medical information commons (MIC) is a networked data environment utilized for research and clinical applications. At three deliberations across the U.S., we engaged 75 adults in two-day facilitated discussions on the ethical and social issues inherent to sharing data with an MIC. Deliberants made recommendations regarding opt-in consent, transparent data policies, public representation on MIC governing boards, and strict data security and privacy protection. Community engagement is critical to earning the public's trust.


2020 ◽  
Vol 2020 ◽  
pp. 1-11
Author(s):  
Sungtae Kim ◽  
Taeyong Park ◽  
Geochang Jeon ◽  
Jeong Hyun Yi

Mobile apps are booming with the expansion of mobile devices such as smartphones, tablet PCs, smartwatches, and IoT devices. As the capabilities of mobile apps and the types of personal information required to run them have diversified, the need for stronger security has grown. Android apps in particular are vulnerable to repackaging attacks, so various code protection techniques such as obfuscation and packing have been applied. However, these protections can also be defeated through static and dynamic analysis. In recent years, instead of such application-level protection techniques, a number of approaches have monitored the behavior of apps at the platform level. These approaches, however, not only introduce system software incompatibilities due to platform modification, but also cannot provide self-controlled protection at the user level, which is very inconvenient. Therefore, in this paper we propose an app protection scheme that splits part of the app code, stores it on a separate IoT device, and lets the partial app itself control the split code. In the proposed scheme, the partial app executes only when it matches the split code stored on the IoT device. Unlike existing schemes, it does not require complicated encryption techniques to protect the code. It also provides solutions to the parameter dependency and register reallocation issues that must be considered when implementing the proposed code splitting scheme. Finally, we present and analyze the results of experiments with the proposed scheme on real devices.
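The transport between the partial app and the IoT device is specific to the authors’ implementation and is not described here. As a hedged sketch of the underlying mechanism only, retrieved code can be loaded dynamically on Android with DexClassLoader; the fetchSplitDexFromDevice helper below is hypothetical, and the sketch assumes API 26+ (where the optimized-directory argument is ignored):

```kotlin
import android.content.Context
import dalvik.system.DexClassLoader
import java.io.File

// Hypothetical transport: how the partial app obtains the split code from
// the paired IoT device (e.g., over BLE or local Wi-Fi) is not shown.
fun fetchSplitDexFromDevice(): ByteArray = TODO("device-specific transport")

// Load the split code only after retrieving it from the IoT device; without
// the device, the partial app simply lacks these classes.
fun loadSplitCode(context: Context, className: String): Class<*> {
    val dexFile = File(context.codeCacheDir, "split.dex")
    dexFile.writeBytes(fetchSplitDexFromDevice())
    val loader = DexClassLoader(
        dexFile.absolutePath,               // path to the split code
        context.codeCacheDir.absolutePath,  // optimized dir (ignored on API 26+)
        null,                               // no extra native library path
        context.classLoader                 // parent loader for shared classes
    )
    return loader.loadClass(className)
}
```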


2019 ◽  
Author(s):  
José Javier Flors-Sidro ◽  
Mowafa Househ ◽  
Alaa Abd-Alrazaq ◽  
Josep Vidal-Alaball ◽  
Luis Fernandez-Luque ◽  
...  

BACKGROUND Mobile health has become a major channel for the support of people living with diabetes. Accordingly, the availability of diabetes mobile apps has been steadily increasing. Most previous reviews of diabetes apps have focused on the apps’ features and their alignment with clinical guidelines. However, there is a lack of knowledge on the actual compliance of diabetes apps with privacy and data security requirements. OBJECTIVE The aim of this study was to assess the level of privacy of diabetes mobile applications in order to raise the awareness of end users, developers, and government data-protection regulators towards privacy issues. METHODS A web scraper capable of retrieving Android apps’ privacy-related information, particularly the dangerous permissions required by the apps, was developed to analyze the privacy aspects of diabetes apps. Following the research selection criteria, the original 882 apps were narrowed down to 497 apps, which were included in the analysis. RESULTS 60% of diabetes apps may request dangerous permissions, which poses a significant risk to users’ data privacy. In addition, 30% of the apps do not provide a privacy policy website. Moreover, 40% of the apps contain advertising, and some apps that declared not to contain it actually had ads. 95.4% of the apps were free of cost, and those belonging to the Medical and Health and Fitness categories were the most popular. However, end users do not always realize that the free-app business model is largely based on advertising, and consequently on sharing or selling their private data, either directly or indirectly, to unknown third parties. CONCLUSIONS The aforementioned findings confirm the necessity to educate users and raise their awareness regarding the privacy aspects of diabetes apps. To this end, this research recommends properly and comprehensively training users, ensuring that governments and regulatory bodies enforce strict data protection laws, devising much tougher security policies and protocols in Android and in the Google Play Store, and involving and supervising all stakeholders in the app development process.
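The study’s scraper worked on Play Store metadata. As a hedged on-device analogue (not the authors’ tool), a short Kotlin snippet can intersect an installed app’s declared permissions with a subset of Android’s dangerous-permission list; the DANGEROUS set below is illustrative, not the full platform list:

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Illustrative subset of Android's "dangerous" permission group; the full
// list is maintained in the platform documentation.
val DANGEROUS = setOf(
    "android.permission.READ_CONTACTS",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.CAMERA",
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_EXTERNAL_STORAGE"
)

// Returns the dangerous permissions an installed app declares in its manifest.
fun dangerousPermissionsOf(context: Context, pkg: String): Set<String> {
    val info = context.packageManager
        .getPackageInfo(pkg, PackageManager.GET_PERMISSIONS)
    return info.requestedPermissions?.toSet()?.intersect(DANGEROUS) ?: emptySet()
}
```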


2020 ◽  
Vol 2020 (3) ◽  
pp. 222-242 ◽  
Author(s):  
Catherine Han ◽  
Irwin Reyes ◽  
Álvaro Feal ◽  
Joel Reardon ◽  
Primal Wijesekera ◽  
...  

Abstract It is commonly assumed that “free” mobile apps come at the cost of consumer privacy and that paying for apps could offer consumers protection from behavioral advertising and long-term tracking. This work empirically evaluates the validity of this assumption by comparing the privacy practices of free apps and their paid premium versions, while also gauging consumer expectations surrounding free and paid apps. We use both static and dynamic analysis to examine 5,877 pairs of free Android apps and their paid counterparts for differences in data collection practices and privacy policies. To understand user expectations for paid apps, we conducted a 998-participant online survey and found that consumers expect paid apps to have better security and privacy behaviors. However, there is no clear evidence that paying for an app actually guarantees protection from extensive data collection in practice. Among pairs whose free version had at least one third-party library, 45% of the paid versions reused all of the same third-party libraries; among pairs whose free version held at least one dangerous permission, 74% of the paid versions held all of the same dangerous permissions. Likewise, our dynamic analysis revealed that 32% of the paid apps exhibit all of the same data collection and transmission behaviors as their free counterparts. Finally, we found that 40% of apps did not have a privacy policy link in the Google Play Store, and that among the pairs that did, only 3.7% of the policies reflected differences between the free and paid versions.
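The pairwise comparison behind the 74% figure reduces to a set-containment test over extracted permission sets. A minimal Kotlin sketch of that logic (illustrative, not the authors’ pipeline):

```kotlin
// Given the dangerous permissions extracted for a free app and its paid
// counterpart, test whether the paid version retains every one of them.
fun paidRetainsAllDangerous(free: Set<String>, paid: Set<String>): Boolean =
    paid.containsAll(free)

fun main() {
    val free = setOf(
        "android.permission.ACCESS_FINE_LOCATION",
        "android.permission.CAMERA"
    )
    val paid = free  // a paid version that changed nothing
    // Pairs like this one count toward the paper's 74% figure.
    println(paidRetainsAllDangerous(free, paid))  // true
}
```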

