Increasing personal data contributions for the greater public good: a field experiment on an online education platform

2021, pp. 1-27
Author(s): Viola Ackfeld, Tobias Rohloff, Sylvi Rzepka

Personal data increasingly serve as inputs to public goods. Like other types of contributions to public goods, personal data are likely to be underprovided. We investigate whether classical remedies to underprovision are also applicable to personal data and whether the privacy-sensitive nature of personal data must be additionally accounted for. In a randomized field experiment on a public online education platform, we prompt users to complete their profiles with personal information. Compared to a control message, we find that making public benefits salient increases the number of personal data contributions significantly. This effect is even stronger when additionally emphasizing privacy protection, especially for sensitive information. Our results further suggest that emphasis on both public benefits and privacy protection attracts personal data from a more diverse set of contributors.
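The design described above, a control message versus treatment messages that make public benefits salient, with or without an added emphasis on privacy protection, lends itself to deterministic randomization on the platform side. The sketch below is a minimal illustration under assumed arm names and a hash-based assignment scheme; it is not the authors' implementation.

```python
import hashlib

# Hypothetical treatment arms mirroring the experiment's structure:
# a control message, a public-benefits message, and a public-benefits
# message that additionally emphasizes privacy protection.
ARMS = ["control", "public_benefit", "public_benefit_privacy"]

def assign_arm(user_id: str, experiment: str = "profile_prompt") -> str:
    """Deterministically assign a user to one treatment arm.

    Hashing the user id together with the experiment name gives a stable,
    roughly uniform split without storing a separate assignment table.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return ARMS[int(digest, 16) % len(ARMS)]

if __name__ == "__main__":
    for uid in ["user-001", "user-002", "user-003"]:
        print(uid, "->", assign_arm(uid))
```

Deterministic assignment also ensures that a returning user always sees the same prompt variant.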

Author(s): Ronggong Song, Larry Korba, George Yee

Pseudonym technology is attracting more and more attention and, together with privacy violations, has become a major issue in various e-services. Current e-service systems make personal data collection very easy and efficient through integration, interconnection, and data mining technologies because they rely on the user's real identity. Pseudonym technology with unlinkability, anonymity, and accountability can give the user the ability to control the collection, retention, and distribution of his or her personal information. This chapter explores the challenges, issues, and solutions associated with pseudonym technology for privacy protection in e-services. To better understand how pseudonym technology provides privacy protection in e-services, we describe a general pseudonym system architecture, discuss its relationships with other privacy technologies, and summarize its requirements. Based on these requirements, we review, analyze, and compare a number of existing pseudonym technologies. We then give an example of pseudonym technology in practice, an e-wallet for e-services, and discuss current issues.
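To make the unlinkability property concrete, the sketch below derives a different but stable pseudonym for each e-service from a secret held only by the user, so two services cannot link accounts by comparing identifiers. The HMAC-based derivation, the names, and the truncation length are illustrative assumptions; the accountability property would need additional machinery (for example, revocable pseudonyms issued by a trusted party) that is not shown.

```python
import hashlib
import hmac

def service_pseudonym(master_secret: bytes, service_id: str) -> str:
    """Derive a stable, per-service pseudonym from a user-held secret.

    Each service sees a different identifier, so colluding services cannot
    link the same user by comparing pseudonyms, yet the user can always
    re-derive the same pseudonym for a given service.
    """
    mac = hmac.new(master_secret, service_id.encode(), hashlib.sha256)
    return mac.hexdigest()[:32]

master = b"user-held master secret, never shared"
print(service_pseudonym(master, "shop.example"))   # pseudonym for one e-service
print(service_pseudonym(master, "forum.example"))  # different, unlinkable pseudonym
```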


Author(s): Anna Antonakopoulou, Georgios V. Lioudakis, Fotios Gogoulos, Dimitra I. Kaklamani, Iakovos S. Venieris

Modern business environments amass and exchange a great deal of sensitive information about their employees, customers, products, et cetera, and they acknowledge privacy to be not only a business requirement but also an ethical and legal one. Any privacy violation involves some access to personal information, so access control intuitively constitutes a fundamental aspect of privacy protection. In that respect, many organizations use security policies to control access to sensitive resources, and the security models they employ must provide means to handle flexible and dynamic requirements. Consequently, defining an expressive privacy-aware access control model is a crucial issue. Among the technologies proposed are various access control models that incorporate features for enforcing privacy protection policies; these models mainly take into account the purpose of access, privacy obligations, and other contextual constraints in order to meet privacy protection requirements. This chapter studies these models, along with the aforementioned features.
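A minimal sketch of such a purpose-aware access check, covering role, resource, declared purpose, a contextual condition, and attached obligations, is given below. The class, field, and rule names are illustrative assumptions and do not correspond to any particular model surveyed in the chapter.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Set, Tuple

@dataclass
class PrivacyRule:
    """One privacy-aware rule: who may access which data, for what purpose,
    under which contextual condition, and with which obligations attached."""
    role: str
    resource: str
    allowed_purposes: Set[str]
    condition: Callable[[Dict], bool] = lambda ctx: True
    obligations: List[str] = field(default_factory=list)

def check_access(rules: List[PrivacyRule], role: str, resource: str,
                 purpose: str, context: Dict) -> Tuple[bool, List[str]]:
    """Deny by default; allow only if a rule matches role, resource,
    declared purpose, and the current context, returning its obligations."""
    for rule in rules:
        if (rule.role == role and rule.resource == resource
                and purpose in rule.allowed_purposes
                and rule.condition(context)):
            return True, rule.obligations
    return False, []

rules = [
    PrivacyRule(
        role="support_agent",
        resource="customer_email",
        allowed_purposes={"ticket_resolution"},
        condition=lambda ctx: ctx.get("customer_consent", False),
        obligations=["log_access", "delete_copy_after_30_days"],
    )
]

print(check_access(rules, "support_agent", "customer_email",
                   "ticket_resolution", {"customer_consent": True}))
print(check_access(rules, "support_agent", "customer_email",
                   "marketing", {"customer_consent": True}))
```

The deny-by-default loop and the returned obligations reflect the chapter's point that purpose and post-access duties, not just identity, drive privacy-aware decisions.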


2018, Vol. 25 (4), pp. 1675-1691
Author(s): Stephen Cory Robinson

Wearable technologies have created fascinating opportunities for patients to treat chronic pain in a discreet, mobile fashion. However, many of these health wearables require patients to disclose sensitive information, including health information (e.g., heart rate, glucose levels) and personal information (e.g., location, email, name). Individuals using wearables for treatment of chronic pain may sacrifice social health elements, including their privacy, in exchange for better physical and mental health. Utilizing communication privacy management, a popular disclosure theory, this article explores the policy and ethical ramifications of patients disclosing sensitive health information in exchange for better health treatment and relief of chronic pain. The article identifies scenarios in which a user must disclose information, the factors that motivate or dissuade disclosure, and, ultimately, the use of a health wearable. Practical implications of this conceptual article include an improved understanding of how and why consumers may disclose personal data to health wearables, as well as potential impacts for public policy and ethics regarding how wearables and their manufacturers entice disclosure of private health information.


DOI: 10.28945/2261, 2015
Author(s): S. Srinivasan

A data breach is the act of accessing a central data repository without the consent of the data owner. Data breaches occur frequently and involve millions of records; major breaches have been reported since 2005. Often, data breaches occur because someone with malicious intent accesses the stored data. In this paper, we look at the types of data breaches and how they affect people's privacy, and we introduce a data protection model with the goal of protecting that privacy. Given today's mobile information needs, it is essential to have access to personal data. Social networks are making it difficult to keep personal information private. We provide several summaries to show the effect of data breaches and data losses on people. We conclude the paper with a set of recommendations to protect people's privacy.
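The paper's own protection model is not reproduced here, but the general idea of limiting what a breached central repository can expose can be sketched as follows: store a keyed hash instead of a direct identifier, with the key kept in a separate system. The environment variable, function name, and record layout are illustrative assumptions.

```python
import hashlib
import hmac
import os

# Illustrative sketch (not the paper's model): keep direct identifiers out of
# the central repository so a breach exposes less. The hashing key ("pepper")
# would live in a separate secrets store, not alongside the data.
PEPPER = os.environ.get("PII_PEPPER", "dev-only-secret").encode()

def protect_identifier(email: str) -> str:
    """Replace a direct identifier with a keyed hash before central storage."""
    return hmac.new(PEPPER, email.strip().lower().encode(), hashlib.sha256).hexdigest()

record = {
    "user_ref": protect_identifier("alice@example.com"),  # no raw email stored
    "purchase_total": 42.50,
}
print(record)
```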


Author(s): R. R. Arnesen

Protecting the privacy of citizens is a critical issue in digital government services. The right to privacy is widely recognized as a fundamental human right, as stated in Article 12 of the Universal Declaration of Human Rights (United Nations, 1948). The first definition of privacy was given by American lawyers Warren and Brandeis (1890), who defined it as "the right to be let alone." However, the right to privacy has been recognized for millennia: the Hippocratic oath (n.d.) dates back to around 400 B.C. and instructs medical doctors to respect the privacy of their patients. During the last three decades, many countries have passed privacy legislation, the Swedish Data Act of 1973 being the first national privacy act in the world. During the 1970s, many countries adopted data protection acts (Fischer-Hübner, 2001). In 1980, the OECD published its privacy guidelines with the purpose of reducing the potential privacy problems incurred by cross-border trade (OECD, 1980). The European Parliament and the Council adopted Directive 95/46/EC in 1995, and all member states are required to implement national privacy legislation in compliance with this directive (European Union (EU) Directive 95/46/EC, 1995).

Privacy is under increasing pressure in the digital age, and the introduction of digital government services may escalate this development. The way government has been organized until now, with separate departments holding their own "silos" of personal data, has inherently provided some privacy protection: in such a distributed environment, data matching is expensive and resource-intensive. This form of privacy protection is referred to as "practical obscurity" in Crompton (2004, p. 12); a minimal sketch after the list below illustrates how integration erodes it. Some examples of threats to privacy related to the development of digital government are as follows:

• Data collection capabilities increase as new technology for continuous and automatic data collection is introduced. Examples of such technologies include digital video surveillance, biometric identification, and radio frequency identification (RFID).
• Data processing capabilities are rapidly increasing. The very existence of large amounts of stored personal data, together with the availability of sophisticated tools for analysis, increases the probability of data misuse.
• There is a trend towards integration of formerly separated governmental services, including physical offices. Providing a single point of contact is more user friendly, but it may also provide an attacker with a single point of attack.
• Outsourcing of services (e.g., customer relationship management) is increasingly popular among both companies and governmental organizations. Those who deliver such services to many customers have a unique opportunity to gather personal information from many different sources. If services are outsourced across country borders, and perhaps in several layers, responsibilities soon become unclear.
• Even if the organization responsible for stored personal information has no malicious intent, one cannot expect all its employees to be equally trustworthy. Disloyal employees are a severe threat when increasing amounts of information are stored.
• Tax records and other public records made available on the Internet enable efficient searches and aggregation of information about individuals. Identity theft and fraud are common uses of information gathered in this way.
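As a minimal illustration of how integration undermines practical obscurity, the sketch below aggregates a person's records the moment two registries share a common identifier. The registry names, identifier format, and data values are entirely hypothetical.

```python
# Hypothetical registries that were once separate "silos" but now share a
# common person identifier; linking them becomes a trivial dictionary lookup.
tax_records = {
    "19750101-1234": {"income": 540_000},
}
health_records = {
    "19750101-1234": {"diagnosis": "chronic pain"},
}

def link(person_id: str) -> dict:
    """Aggregate everything known about one person across integrated registries."""
    profile: dict = {}
    profile.update(tax_records.get(person_id, {}))
    profile.update(health_records.get(person_id, {}))
    return profile

print(link("19750101-1234"))  # {'income': 540000, 'diagnosis': 'chronic pain'}
```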


1997, Vol. 16 (2), pp. 298-309
Author(s): George R. Milne

Increasingly, companies are acquiring customer names and personal information for database marketing purposes. Because lists of customer names often are sold and rented without customer knowledge, legislation has been proposed to curtail the ability of marketers to sell personal information without customer consent. In reaction, the direct marketing community has responded with aggressive self-regulation, strongly recommending negative option permission formats and carefully worded disclosure statements. The author examines the assumptions underlying the direct marketing community's self-regulation efforts. Using a field experiment, the author measures consumers' willingness to provide marketers with personal information and permission to rent this information, based on varying permission formats, type of information requested, and level of disclosure. The results of the field experiment support the direct marketing community's assumptions about how question format affects consumers' willingness to allow their personal information to be transferred to a third party. However, the results do not support the common assumptions that using negative option formats and not asking consumers for sensitive information improves consumers' willingness to join mailing lists.


Author(s): Mike Zajko

This article examines the role of internet service providers (ISPs) as guardians of personal information and protectors of privacy, with a particular focus on how telecom companies in Canada have historically negotiated these responsibilities. Communications intermediaries have long been expected to act as privacy custodians by their users, while simultaneously being subject to pressures to collect, utilize, and disclose personal information. As service providers gain custody over increasing volumes of highly sensitive information, their importance as privacy custodians has been brought into starker relief and explicitly recognized as a core responsibility. Some ISPs have adopted a more positive orientation to this responsibility, actively taking steps to advance it, rather than treating privacy protection as a set of limitations on conduct. However, commitments to privacy stewardship are often neutralized through contradictory legal obligations (such as mandated surveillance access) and are recurrently threatened by commercial pressures to monetize personal information.

