Biometric Monitoring Devices: Modern Solutions to Protecting Athletes’ Data Privacy

2021
Vol 21 (1)
Author(s): Tristan A. Dietrick

Smartwatches like Fitbits provide users with easy access to quantifiable health data. In the sports industry, tracking this biometric information may be particularly beneficial to athletes, whose livelihoods revolve around their health and fitness. Nonetheless, under the current regime, professional and collegiate athletes’ biometric health data are inadequately protected. Data privacy law is still in its infancy, but in the meantime, athletes must consider that motivations to sell or misuse players’ biometric information may outpace legal developments. This Paper will analyze the promise and risk of collecting professional and collegiate athletes’ health and biometric data, particularly through fitness wearables. It will provide a closer look at wearables in professional sports and consider the increased risk posed to college athletes. Finally, this Paper will consider possible solutions to maximize the benefits of newfound technology while simultaneously minimizing risks to players’ health information, privacy, and personal data ownership.

2020
pp. 004728752095164
Author(s): Athina Ioannou, Iis Tussyadiah, Graham Miller

Against the backdrop of advancements in technology and its deployment by companies and governments to collect sensitive personal information, information privacy has become an issue of great interest for academics, practitioners, and the general public. The travel and tourism industry has been pioneering the collection and use of biometric data for identity verification. Yet, privacy research focusing on the travel context is scarce. This study developed a valid measurement of Travelers’ Online Privacy Concerns (TOPC) through a series of empirical studies: pilot (n = 277) and cross-validation (n = 287). TOPC was then assessed for its predictive validity in its relationships with trust, risk, and intention to disclose four types of personal data: biometric, identifiers, biographic, and behavioral data (n = 685). Results highlight the role of trust in mitigating the relationship between travelers’ privacy concerns and data disclosure. This study provides a valuable contribution to research and practice on data privacy in travel.
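A minimal sketch of the kind of predictive-validity check described above, assuming simulated survey scores and a plain OLS regression of disclosure intention on the TOPC score, trust, and perceived risk. The column names, simulated effects, and choice of model are illustrative assumptions only, not the authors' actual measurement or analysis.

```python
# Hypothetical sketch: does a privacy-concern score (TOPC) predict intention to
# disclose biometric data once trust and perceived risk are accounted for?
# All data are simulated; variable names are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 685  # size of the validation sample reported in the abstract
df = pd.DataFrame({
    "topc": rng.normal(4.0, 1.0, n),   # simulated scale scores (roughly 1-7 Likert)
    "trust": rng.normal(4.5, 1.0, n),
    "risk": rng.normal(3.5, 1.0, n),
})
# Simulated outcome: concern and risk lower, and trust raises, willingness to disclose.
df["disclose_biometric"] = (
    4 - 0.4 * (df["topc"] - 4) + 0.5 * (df["trust"] - 4.5)
    - 0.3 * (df["risk"] - 3.5) + rng.normal(0, 0.8, n)
)

model = smf.ols("disclose_biometric ~ topc + trust + risk", data=df).fit()
print(model.summary().tables[1])  # coefficient table for each predictor
```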


2021
Vol 41 (1)
pp. 100-13
Author(s): Michele Estrin Gilman

Menstruation is being monetized and surveilled, with the voluntary participation of millions of women. Thousands of downloadable apps promise to help women monitor their periods and manage their fertility. These apps are part of the broader, multi-billion-dollar Femtech industry, which sells technology to help women understand and improve their health. Femtech is marketed with the language of female autonomy and feminist empowerment. Despite this rhetoric, Femtech is part of a broader business strategy of data extraction, in which companies are extracting people’s personal data for profit, typically without their knowledge or meaningful consent. Femtech can oppress menstruators in several ways. Menstruators lose control over their personal data and how it is used. Some of these uses can potentially disadvantage women in the workplace, insurance markets, and credit scoring. In addition, these apps can force users into a gendered binary that does not always comport with their identity. Further, period trackers are sometimes inaccurate, leading to unwanted pregnancies. Additionally, the data is nearly impossible to erase, leading some women to be tracked relentlessly across the web with assumptions about their childbearing and fertility. Despite these harms, there are few legal restraints on menstrual surveillance. American data privacy law largely hinges on the concept of notice and consent, which puts the onus on people to protect their own privacy rather than placing responsibility on the entities that gather and use data. Yet notice and consent is a myth because consumers do not read, cannot comprehend, and have no opportunities to negotiate the terms of privacy policies. Notice and consent is an individualistic approach to data privacy that envisions an atomized person pursuing their own self-interest in a competitive marketplace. Menstruators’ needs do not fit this model. Accordingly, this Essay seeks to reconceptualize Femtech within an expanded menstrual justice framework that recognizes the tenets of data feminism. In this vision, Femtech would be an empowering and accurate health tool rather than a data extraction device.


Author(s): Stephen Holland, Jamie Cawthra, Tamara Schloemer, Peter Schröder-Bäck

Abstract: Information is clearly vital to public health, but the acquisition and use of public health data elicit serious privacy concerns. One strategy for navigating this dilemma is to build 'trust' in institutions responsible for health information, thereby reducing privacy concerns and increasing willingness to contribute personal data. This strategy, as currently presented in public health literature, has serious shortcomings. But it can be augmented by appealing to the philosophical analysis of the concept of trust. Philosophers distinguish trust and trustworthiness from cognate attitudes, such as confident reliance. Central to this is value congruence: trust is grounded in the perception of shared values. So, the way to build trust in institutions responsible for health data is for those institutions to develop and display values shared by the public. We defend this approach from objections, such as that trust is an interpersonal attitude inappropriate to the way people relate to organisations. The paper then moves on to the practical application of our strategy. Trust and trustworthiness can reduce privacy concerns and increase willingness to share health data, notably, in the context of internal and external threats to data privacy. We end by appealing for the sort of empirical work our proposal requires.


2020
Vol 29 (01)
pp. 032-043
Author(s): Hannah K. Galvin, Paul R. DeMuro

Objectives: To survey international regulatory frameworks that serve to protect privacy of personal data as a human right as well as to review the literature regarding privacy protections and data ownership in mobile health (mHealth) technologies between January 1, 2016 and June 1, 2019 in order to identify common themes. Methods: We performed a review of relevant literature available in English published between January 1, 2016 and June 1, 2019 from databases including PubMed, Google Scholar, and Web of Science, as well as relevant legislative background material. Articles out of scope (as detailed below) were eliminated. We categorized the remaining pool of articles and discrete themes were identified, specifically: concerns around data transmission and storage, including data ownership and the ability to re-identify previously de-identified data; issues with user consent (including the availability of appropriate privacy policies) and access control; and the changing culture and variable global attitudes toward privacy of health data. Results: Recent literature demonstrates that the security of mHealth data storage and transmission remains of wide concern, and aggregated data that were previously considered “de-identified” have now been demonstrated to be re-identifiable. Consumer-informed consent may be lacking with regard to mHealth applications due to the absence of a privacy policy and/or to text that is too complex and lengthy for most users to comprehend. The literature surveyed emphasizes improved access control strategies. This survey also illustrates a wide variety of global user perceptions regarding health data privacy. Conclusion: The international regulatory framework that serves to protect privacy of personal data as a human right is diverse. Given the challenges legislators face to keep up with rapidly advancing technology, we introduce the concept of a “healthcare fiduciary” to serve the best interest of data subjects in the current environment.
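To make the re-identification concern concrete, here is a small illustrative sketch (not drawn from the studies surveyed): it counts how many records in a nominally de-identified table are unique on a handful of quasi-identifiers, a crude proxy for how easily an auxiliary dataset could re-identify them.

```python
# Illustrative only: names are removed, yet combinations of quasi-identifiers
# (zip, birth_year, sex) can still single out individuals. Records in a group
# of size 1 (k = 1) are uniquely matchable by anyone holding auxiliary data.
import pandas as pd

deidentified = pd.DataFrame({
    "zip":        ["02139", "02139", "02139", "10027", "10027"],
    "birth_year": [1984,     1990,    1984,    1975,    1975],
    "sex":        ["F",      "F",     "M",     "M",     "M"],
    "step_count": [9120,     10450,   7310,    5230,    6480],  # the health payload
})

group_sizes = deidentified.groupby(["zip", "birth_year", "sex"]).size()
unique_share = (group_sizes == 1).sum() / len(deidentified)
print(f"records unique on the quasi-identifiers: {unique_share:.0%}")  # 60%
```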


2020
Vol 8
pp. 205031212093483
Author(s): Mary Mallappallil, Jacob Sabu, Angelika Gruessner, Moro Salifu

Universally, the volume of data has increased since the 1980s, with the collection rate doubling every 40 months. “Big data” is a term introduced in the 1990s to describe data sets too large to be handled with common software. Medicine is a major field predicted to increase its use of big data by 2025. Big data in medicine may be used by commercial, academic, government, and public sectors. It includes biologic, biometric, and electronic health data. Examples of biologic data include biobanks; biometric data may include individual wellness data from devices; electronic health data include the medical record; and other data include demographics and images. Big data has also contributed to changes in research methodology. Changes in the clinical research paradigm have been fueled by large-scale biological data harvesting (biobanks), which is developed, analyzed, and managed with cheaper computing technology (big data), supported by greater flexibility in study design (real-world data) and by the relationships between industry, government regulators, and academics. Cultural changes, along with easy access to information via the Internet, facilitate participation by more people. Current needs demand quick answers, which may be supplied by big data, biobanks, and more flexible study designs. Big data can reveal health patterns and promises to provide solutions that have previously been out of society’s grasp; however, the murkiness of international laws, questions of data ownership, public ignorance, and privacy and security concerns are slowing the progress that could otherwise be achieved through big data. The goal of this descriptive review is to create awareness of the ramifications of big data and to reassure readers that this trend is positive and will likely lead to better clinical solutions, but caution must be exercised to reduce harm.


2018
Author(s): Michael Veale, Reuben Binns, Jef Ausloos

Cite as: Michael Veale, Reuben Binns and Jef Ausloos (2018) When Data Protection by Design and Data Subject Rights Clash. International Data Privacy Law (2018) doi:10.1093/idpl/ipy002. [Note: An earlier draft was entitled "We Can't Find Your Data, But A Hacker Could: How 'Privacy by Design' Trades-Off Data Protection Rights"]

Abstract:
➔ Data Protection by Design (DPbD), a holistic approach to embedding principles in technical and organisational measures undertaken by data controllers, building on the notion of Privacy by Design, is now a qualified duty in the GDPR.
➔ Practitioners have seen DPbD less holistically, instead framing it through the confidentiality-focussed lens of Privacy Enhancing Technologies (PETs).
➔ While focussing primarily on confidentiality risk, we show that some DPbD strategies deployed by large data controllers result in personal data which, despite remaining clearly reidentifiable by a capable adversary, make it difficult for the controller to grant data subjects rights (eg access, erasure, objection) over it for the purposes of managing this risk.
➔ Informed by case studies of Apple's Siri voice assistant and Transport for London's Wi-Fi analytics, we suggest three main ways to make deployed DPbD more accountable and data subject-centric: building parallel systems to fulfil rights, including dealing with volunteered data; making inevitable trade-offs more explicit and transparent through Data Protection Impact Assessments; and through ex ante and ex post information rights (arts 13-15), which we argue may require the provision of information concerning DPbD trade-offs.
➔ Despite steep technical hurdles, we call both for researchers in PETs to develop rigorous techniques to balance privacy-as-control with privacy-as-confidentiality, and for DPAs to consider tailoring guidance and future frameworks to better oversee the trade-offs being made by primarily well-intentioned data controllers employing DPbD.
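A schematic sketch of the clash summarised above, assuming a hypothetical controller that stores only unsalted hashes of device identifiers; it is not a reconstruction of the Siri or Transport for London systems. The point it illustrates is that the records remain re-identifiable to an adversary who can enumerate candidate identifiers, even where the controller treats them as not attributable to any individual for the purposes of access or erasure requests.

```python
# Schematic illustration of the DPbD trade-off (hypothetical, not the systems
# studied in the paper): the controller keeps only hashed identifiers and no
# reverse index, and treats the records as unattributable when a data subject
# asks for access or erasure. An adversary who can enumerate the identifier
# space rebuilds the mapping by brute force, so the data stay re-identifiable.
import hashlib

def pseudonym(mac: str) -> str:
    return hashlib.sha256(mac.encode()).hexdigest()

# Controller's store: hashed device identifiers mapped to analytics records.
stored = {pseudonym("aa:bb:cc:dd:ee:01"): {"visits": 12}}

# Adversary's view: MAC addresses form a small, enumerable space, so a
# dictionary attack over candidates recovers the link the controller disclaims.
candidates = [f"aa:bb:cc:dd:ee:{i:02x}" for i in range(256)]
recovered = {pseudonym(c): c for c in candidates}
for digest, record in stored.items():
    print(recovered.get(digest), record)  # -> aa:bb:cc:dd:ee:01 {'visits': 12}
```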


Author(s): Javad Pool, Saeed Akhlaghpour, Farhad Fatehi

Background: Considering the impacts of the COVID-19 pandemic on health service delivery, the US Office for Civil Rights (OCR) updated its policies on health data processing and the Health Insurance Portability and Accountability Act (HIPAA). Objectives: In this study, we investigated discourses on HIPAA in relation to COVID-19. Methods: Through a search of media sources in the Factiva database, relevant texts were identified. We applied a text mining approach to identify concepts and themes in these texts. Results: Our analysis revealed six central themes, namely Health, HIPAA, Privacy, Security, Patients, and Need, as well as their associated concepts. Among these, Health was the most frequently discussed theme. It comprised concepts such as public, care, emergency, providers, telehealth, entity, use, discretion, OCR, Health and Human Services (HHS), enforcement, business, and services. Conclusion: Our discourse analysis of media outlets highlights the role of health data privacy law in the response to global public health emergencies and demonstrates how discourse analysis and computational methods can inform health data protection policymaking in the digital health era.
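A minimal sketch of one generic way to surface themes from a collection of media texts, assuming a TF-IDF plus NMF topic-modelling pipeline over toy sentences; the study's own concept-mapping tool, settings, and Factiva corpus are not reproduced here.

```python
# Generic theme-extraction sketch (toy documents, not the Factiva corpus):
# vectorise texts with TF-IDF, factorise with NMF, and list top terms per theme.
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "OCR waives HIPAA penalties for telehealth providers during the emergency",
    "Patients worry about the privacy and security of telehealth visits",
    "HHS guidance lets providers use everyday video apps in good faith",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)
topics = NMF(n_components=2, random_state=0).fit(X)

terms = tfidf.get_feature_names_out()
for i, weights in enumerate(topics.components_):
    top_terms = [terms[j] for j in weights.argsort()[-4:][::-1]]
    print(f"theme {i}: {', '.join(top_terms)}")
```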

