The Conflict Between Big Data and Individual Privacy

2018 ◽  
pp. 1-43 ◽  
Author(s):  
Lei Xu ◽  
Chunxiao Jiang ◽  
Yi Qian ◽  
Yong Ren
Keyword(s):  
Big Data


Author(s):  
Miriam J. Metzger ◽  
Jennifer Jiyoung Suh ◽  
Scott Reid ◽  
Amr El Abbadi

This chapter begins with a case study of Strava, a fitness app that inadvertently exposed sensitive military information even while protecting individual users' information privacy. The case study is analyzed as an example of how recent advances in algorithmic group-inference technologies threaten privacy, both for individuals and for groups. The chapter then argues that while individual privacy from big data analytics is well understood, group privacy is not. Results of an experiment designed to better understand group privacy are presented. Findings show that group and individual privacy are psychologically distinct and uniquely affect people's evaluation, use, and tolerance of a fictitious fitness app. The chapter concludes with a discussion of the ethics of group-inference technologies and offers recommendations for fitness app designers.
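As a rough illustration of the mechanism behind the Strava case discussed above (a sketch under assumed data and parameters, not material from the chapter), the following Python snippet shows how GPS points that are individually anonymized can still reveal a group-level location once they are aggregated into a public heatmap; the coordinates, grid size and hotspot threshold are hypothetical.

```python
# Hypothetical sketch: aggregation of anonymized GPS points into a heatmap.
from collections import Counter

def build_heatmap(gps_points, cell_size=0.01):
    """Count activity per grid cell; user identities are already stripped."""
    counts = Counter()
    for lat, lon in gps_points:
        cell = (round(lat / cell_size), round(lon / cell_size))
        counts[cell] += 1
    return counts

def hotspots(heatmap, threshold=50):
    """Dense cells reveal where a group gathers, even though no single
    user's trace is identifiable in the aggregate."""
    return [cell for cell, n in heatmap.items() if n >= threshold]

# Invented traces: many anonymous users exercising inside one compound
# still light that compound up on the aggregate map.
points = [(34.5000 + 0.0001 * i, 69.2000 + 0.0001 * (i % 7)) for i in range(300)]
print(hotspots(build_heatmap(points)))
```

The group is exposed by the density of activity in one place rather than by any single user's trace, which is why individual-level protections alone do not prevent this kind of disclosure.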


2017 ◽  
Vol 35 (1) ◽  
pp. 36-49
Author(s):  
Feicheng Ma ◽  
Ye Chen ◽  
Yiming Zhao

Purpose: This paper aims to propose a conceptual model for improving the organization of user needs information in the big data environment.
Design/methodology/approach: A conceptual model of the organization of user needs information based on Linked Data techniques is constructed. This model has three layers: the Data Layer, the Semantic Layer and the Application Layer.
Findings: Requirements for organizing user needs information in the big data environment are identified as follows: improving the intelligence level, establishing standards and guidelines for the description of user needs information, enabling the interconnection of user needs information and considering individual privacy in the organization and analysis of user needs.
Practical implications: This Web of Needs model could be used to improve knowledge services by matching user needs information with growing semantic knowledge resources more effectively and efficiently in the big data environment.
Originality/value: This study proposes a conceptual model, the Web of Needs model, to organize and interconnect user needs. Compared with existing methods, the Web of Needs model satisfies the requirements for the organization of user needs information in the big data environment with regard to four aspects: providing the basis and conditions for intelligent processing of user needs information, using RDF as a description norm, enabling the interconnection of user needs information and setting various protocols to protect user privacy.
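A minimal sketch of what describing a user need as Linked Data might look like, assuming the Python rdflib library; the namespace, class and property names below are illustrative assumptions, not the paper's actual Web of Needs vocabulary.

```python
# Hypothetical sketch of expressing a user need as RDF/Linked Data with rdflib.
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import FOAF, XSD

WON = Namespace("http://example.org/web-of-needs/")  # assumed namespace

g = Graph()
g.bind("won", WON)
g.bind("foaf", FOAF)

need = URIRef("http://example.org/needs/need-001")
user = URIRef("http://example.org/users/user-42")

# Data Layer: the raw need and who expressed it
g.add((need, RDF.type, WON.UserNeed))
g.add((need, WON.expressedBy, user))
g.add((user, RDF.type, FOAF.Person))

# Semantic Layer: typed, linkable properties that other resources can reference
g.add((need, WON.topic, Literal("privacy-preserving fitness tracking")))
g.add((need, WON.createdAt, Literal("2017-03-01", datatype=XSD.date)))

# Application Layer: serialized needs can be matched against knowledge resources
print(g.serialize(format="turtle"))
```

Because each need is a dereferenceable resource described with shared vocabularies, needs expressed on different platforms can be interlinked and matched against semantic knowledge resources, which is the interconnection requirement the model addresses.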


2015 ◽  
Vol 19 (4) ◽  
pp. 579-596 ◽  
Author(s):  
Lemi Baruh ◽  
Mihaela Popescu

This article examines how the logic of big data analytics, which lends an aura of unchallenged objectivity to the algorithmic analysis of quantitative data, preempts individuals' ability to self-define and closes off any opportunity to challenge or resist the inferences drawn about them. We argue that the predominant privacy protection regimes, based on the privacy self-management framework of "notice and choice", not only fail to protect individual privacy but also underplay privacy as a collective good. To illustrate this claim, we discuss how two possible individual strategies, withdrawal from the market (avoidance) and complete reliance on market-provided privacy protections (assimilation), may result in fewer privacy options being available to society at large. We conclude by discussing how acknowledging the collective dimension of privacy could provide more meaningful alternatives for privacy protection.


2019 ◽  
Vol 63 (6) ◽  
pp. 743-758 ◽  
Author(s):  
Adalbertus Kamanzi ◽  
Megan Romania

This article takes the big data era as a starting point to examine common assumptions about confidentiality and privacy, arguing that confidentiality is a Westernized notion now challenged by the shift toward open data access across multiple platforms. We contend that the notion of privacy is more dynamic in many non-Western societies and therefore challenge confidentiality as a necessary condition for research. Revisiting the first author's experience with participatory qualitative data collection methods, we argue that there are communities where confidentiality matters less. In such communities, rather than enforcing strict confidentiality procedures that cannot in practice be achieved, qualitative researchers should build on community attitudes and move beyond the notion of individual privacy to better facilitate the formulation of community action plans for possible interventions in areas of perceived societal need.


2020 ◽  
Vol 27 (5) ◽  
Author(s):  
Leesa Lin ◽  
Zhiyuan Hou

To combat COVID-19, at least 29 countries/regions have resorted to digital technology; some have combined it with strict containment measures to great effect. Advances in cryptography and regulation are needed to enable contact-tracing systems that avoid mass surveillance, so that the benefits of location tracking can be obtained while individual privacy is protected.
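As a hedged sketch of the kind of cryptographic design the authors call for (an assumption in the spirit of decentralized protocols such as DP-3T, not the article's own proposal), rotating pseudonymous identifiers let phones check exposure locally without any central database of locations or contacts.

```python
# Illustrative, simplified rolling-identifier scheme; key sizes and interval
# counts are assumptions for the sketch.
import hmac, hashlib, os

def daily_key():
    """Each phone generates a fresh random secret per day; it never leaves the device."""
    return os.urandom(32)

def rolling_id(key: bytes, interval: int) -> bytes:
    """Short-lived broadcast identifier derived from the daily key; unlinkable without the key."""
    return hmac.new(key, f"interval-{interval}".encode(), hashlib.sha256).digest()[:16]

def exposure_check(observed_ids, published_keys, intervals=range(144)):
    """A positive user publishes only daily keys; everyone else re-derives the
    identifiers locally and compares them with what their own phone overheard."""
    derived = {rolling_id(k, i) for k in published_keys for i in intervals}
    return bool(derived & set(observed_ids))

# Example: Alice's phone overheard one of Bob's broadcast IDs; Bob later tests
# positive and uploads his daily key, so Alice's phone flags the contact locally.
bob_key = daily_key()
alice_log = [rolling_id(bob_key, 42)]
print(exposure_check(alice_log, [bob_key]))  # True
```

The design choice that matters here is that matching happens on the user's device, so no central authority ever learns who met whom or where.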


2019 ◽  
Vol 28 (2) ◽  
pp. 183-197 ◽  
Author(s):  
Paola Mavriki ◽  
Maria Karyda

Purpose: User profiling with big data raises significant privacy issues. Privacy studies typically focus on individual privacy; in the era of big data analytics, however, users are also targeted as members of specific groups, challenging their collective privacy with as-yet-unidentified implications. This paper argues that in the age of big data there is a need to consider the collective aspects of privacy as well, to develop new ways of calculating privacy risks and to identify the privacy threats that emerge.
Design/methodology/approach: Focusing on the collective level, the authors conducted an extensive literature review on information privacy and concepts of social identity. They also examined numerous automated data-driven profiling techniques, analyzing at the same time the privacy issues these raise for groups.
Findings: The paper identifies privacy threats for collective entities that stem from data-driven profiling and argues that privacy-preserving mechanisms are required to protect the privacy interests of groups as entities, independently of the interests of their individual members. It also concludes that collective privacy threats may differ from the threats individuals face when they are not members of a group.
Originality/value: Although research evidence indicates that in the age of big data privacy as a collective issue is becoming increasingly important, the pluralist character of privacy has not yet been adequately explored. This paper contributes to filling this gap and provides new insights into threats to group privacy and their impact on collective entities and society.
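A minimal, hypothetical sketch of the kind of automated profiling the paper surveys: users are clustered into behavioral groups, and inferences then attach to the group rather than to any identified individual. The feature names, data and cluster count below are invented for illustration.

```python
# Hedged illustration of group-level profiling via clustering (not from the paper).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical per-user features: [avg daily app sessions, night-time usage share]
behavior = np.vstack([
    rng.normal([20, 0.6], 0.05, size=(50, 2)),   # heavy night users
    rng.normal([5, 0.1], 0.05, size=(50, 2)),    # light daytime users
])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(behavior)

new_user = np.array([[19.0, 0.55]])
group = model.predict(new_user)[0]
# Whatever has been inferred about this cluster (interests, risk, creditworthiness)
# now applies to the new user, without any individually identifying data being used.
print(f"new user assigned to group {group}")
```

The privacy-relevant point is that the group, not the individual, is the object of inference, so protections framed purely around identifiable individuals do not reach it.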


2017 ◽  
Vol 8 (1) ◽  
pp. 6-15 ◽  
Author(s):  
Donatella Porrini

The article analyzes the regulatory framework of the insurance market in connection with the advent of Big Data, that is, large volumes of information collected from different sources and processed by new technologies. The use of Big Data offers insurance companies significant opportunities in terms of digitization of distribution channels and greater knowledge of customers, which is instrumental to more effective identification of an individual's risk profile as well as to improved competitiveness. However, regulatory measures are needed to ensure that Big Data is used in a way that respects individual privacy, avoids discrimination and does not unduly constrain competition.

