Black-Box Accumulation: Collecting Incentives in a Privacy-Preserving Way

2016, Vol 2016 (3), pp. 62-82
Author(s): Tibor Jager, Andy Rupp

Abstract: We formalize and construct black-box accumulation (BBA), a useful building block for numerous important user-centric protocols including loyalty systems, refund systems, and incentive systems (as, e.g., employed in participatory sensing and vehicle-to-grid scenarios). A core requirement all these systems share is a mechanism to let users collect and sum up values (call them incentives, bonus points, reputation points, etc.) issued by some other parties in a privacy-preserving way, such that curious operators are unable to link the different transactions of a user. At the same time, a group of malicious users must be unable to cheat the system by pretending to have collected a higher amount than what was actually issued to them. As a first contribution, we fully formalize the core functionality and properties of this important building block. Furthermore, we present a generic and non-interactive construction of a BBA system based on homomorphic commitments, digital signatures, and non-interactive zero-knowledge proofs of knowledge. For our construction, we formally prove security and privacy properties. Finally, we propose a concrete instantiation of our construction using Groth-Sahai commitments and proofs as well as the optimal structure-preserving signature scheme of Abe et al. and analyze its efficiency.
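The additive homomorphism of the underlying commitments is what makes this kind of non-interactive accumulation possible: commitments to individual point values can be combined into a commitment to their sum. The toy Pedersen-style sketch below illustrates that property only; the paper's actual instantiation uses Groth-Sahai commitments in bilinear groups, and all parameter values here are illustrative.

```python
# Toy Pedersen commitments over a small prime-order subgroup, to illustrate
# the additive homomorphism BBA-style accumulation relies on. The group
# parameters below are tiny illustrative values, NOT the paper's setting.
p = 1019  # small prime; Z_p^* has a subgroup of prime order q = 509
q = 509
g = 4     # assumed generators of the order-q subgroup (toy values)
h = 9

def commit(value, rand):
    """Pedersen commitment C = g^v * h^r mod p."""
    return (pow(g, value, p) * pow(h, rand, p)) % p

# A user holds commitments to two issued point values...
c1 = commit(5, 17)
c2 = commit(7, 23)
# ...and accumulates them by multiplying: the product commits to the sum
# of the values under the sum of the randomness.
assert (c1 * c2) % p == commit(5 + 7, 17 + 23)
```

Because accumulation is just group multiplication, a user can keep adding issued values without interacting with the issuer again, and later prove a statement about the accumulated total in zero knowledge.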

2018, Vol 2018, pp. 1-15
Author(s): Xiaoguang Niu, Jiawei Wang, Qiongzan Ye, Yihao Zhang

The proliferation of mobile devices has facilitated the prevalence of participatory sensing applications, in which participants collect and share information about their environments. The design of a participatory sensing application confronts two challenges, "privacy" and "incentive", which are conflicting objectives and deserve deeper attention. Inspired by the physical currency circulation system, this paper introduces the notion of the E-cent, an exchangeable unit bearer currency. Participants can use E-cents to take part in tasks anonymously. By employing E-cents, we propose an E-cent-based privacy-preserving incentive mechanism, called EPPI. As a dynamic balance regulatory mechanism, EPPI can not only protect the privacy of participants but also adjust the whole system toward the ideal situation, in which the rated tasks can be finished at minimal cost. To the best of our knowledge, EPPI is the first attempt to build an incentive mechanism while maintaining the desired privacy in participatory sensing systems. Extensive simulation and analysis results show that EPPI can achieve a high anonymity level and remarkable incentive effects.
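The bearer-currency idea behind the E-cent can be sketched as tokens that carry value independently of any identity: a random serial plus an issuer tag, redeemable exactly once. The toy sketch below (with an HMAC standing in for the issuer's signature, a hypothetical simplification) shows issuance and double-spend rejection; a real scheme like EPPI additionally needs the issuance step to be blind, so the operator cannot link a serial back to the user it was issued to.

```python
# Toy bearer-token flow in the spirit of an exchangeable unit currency:
# each token is a random serial plus an issuer tag, redeemable once.
# HMAC is a stand-in for a real signature scheme, for illustration only.
import hmac, hashlib, secrets

ISSUER_KEY = secrets.token_bytes(32)  # held by the platform operator
spent = set()                         # double-spend ledger

def issue_token():
    serial = secrets.token_hex(16)    # carries no user identity
    tag = hmac.new(ISSUER_KEY, serial.encode(), hashlib.sha256).hexdigest()
    return serial, tag

def redeem(serial, tag):
    expected = hmac.new(ISSUER_KEY, serial.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected) or serial in spent:
        return False
    spent.add(serial)
    return True

s, t = issue_token()
assert redeem(s, t)        # first redemption succeeds
assert not redeem(s, t)    # double-spending is rejected
```

Because the token itself is the value (rather than an entry in a per-user account), participants can spend or exchange tokens without the redemption revealing who earned them.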


2018, Vol 80 (4), pp. 629-647
Author(s): William Feigelman, Beverly Feigelman, Lillian M. Range

We explored parents’ views of the trajectories of their adult children’s eventual deaths from drugs through in-depth qualitative interviews with 11 bereaved parents. Parents reported great emotional distress and high financial burdens as their children went through death spirals of increasing drug involvement. These deaths often entailed anxiety-inducing interactions with police or medical personnel, subsequent difficulties in sharing information about the cause of death with socially significant others, and longer-term problems arising from routine interactions. Eventually, though, many of these longer-term bereaved parents reported overcoming these obstacles and developing posttraumatic growth. Openly disclosing the nature of the death seemed to be an important building block for their healing.


2021, Vol 11 (3-4), pp. 1-22
Author(s): Qiang Yang

With the rapid advances of Artificial Intelligence (AI) technologies and applications, there is increasing concern about the responsible development and application of AI. Building AI technologies or machine-learning models often requires massive amounts of data, which may include sensitive, private user information collected from different sites or countries. Privacy, security, and data governance constraints rule out a brute-force approach to acquiring and integrating these data, so protecting user privacy while achieving high-performance models is a serious challenge. This article reviews recent progress of federated learning in addressing this challenge in the context of privacy-preserving computing. Federated learning allows global AI models to be trained and used across multiple decentralized data sources with strong security and privacy guarantees, as well as sound incentive mechanisms. This article presents the background, motivations, definitions, architectures, and applications of federated learning as a new paradigm for building privacy-preserving, responsible AI ecosystems.
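The core aggregation step of federated learning can be illustrated with federated averaging (FedAvg): each data source trains locally, and only model parameters, never raw data, travel to the coordinator, which averages them weighted by local dataset size. The sketch below uses a toy one-parameter linear model; it illustrates the paradigm, not any particular system described in the article.

```python
# Minimal FedAvg sketch for a 1-D linear model y = w * x. Raw data stays
# with each client; only the trained parameter w is shared and averaged.
def local_step(weights, data, lr=0.1):
    """One local pass of gradient descent on squared error."""
    w = weights
    for x, y in data:
        grad = 2 * (w * x - y) * x
        w -= lr * grad
    return w

def fed_avg(global_w, client_datasets):
    """Aggregate client updates, weighted by local dataset size."""
    updates, sizes = [], []
    for data in client_datasets:
        updates.append(local_step(global_w, data))
        sizes.append(len(data))
    return sum(wi * n for wi, n in zip(updates, sizes)) / sum(sizes)

# Two clients whose local data are both consistent with w = 2.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):           # federated rounds
    w = fed_avg(w, clients)   # w converges toward 2.0
```

Real deployments layer secure aggregation, differential privacy, or encryption on top of this loop so that even the shared parameters leak as little as possible.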


Author(s): J. Andrew Onesimu, Karthikeyan J., D. Samuel Joshua Viswas, Robin D Sebastian

Deep learning is a buzzword in current research owing to its advantages in fields such as healthcare, medicine, and automobiles. Deep learning requires huge amounts of data to achieve good accuracy, so it is important to protect that data from security and privacy breaches. This chapter presents a comprehensive survey of security and privacy challenges in deep learning. Security attacks such as poisoning attacks, evasion attacks, and black-box attacks are explored along with their prevention and defence techniques, and a comparative analysis of the various techniques for defending against such attacks is given. Privacy is another major challenge in deep learning. The authors present an in-depth survey of privacy-preserving techniques for deep learning, including differential privacy, homomorphic encryption, secret sharing, and secure multi-party computation, together with a detailed table comparing the various privacy-preserving techniques and approaches.
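Among the surveyed techniques, differential privacy is the simplest to sketch: noise calibrated to a query's sensitivity and a privacy budget ε is added to an aggregate before release. The sketch below implements the standard Laplace mechanism for a counting query; the parameter values are illustrative only.

```python
# Laplace mechanism for a differentially private count, stdlib only.
import math
import random

def laplace_noise(scale):
    # The difference of two independent Exp(1) draws, scaled, is
    # Laplace(0, scale); 1 - random() lies in (0, 1], avoiding log(0).
    u1 = 1.0 - random.random()
    u2 = 1.0 - random.random()
    return scale * (math.log(u1) - math.log(u2))

def dp_count(records, predicate, epsilon=1.0):
    """Release an epsilon-DP count via the Laplace mechanism."""
    sensitivity = 1.0  # adding/removing one record changes a count by <= 1
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(sensitivity / epsilon)

# Smaller epsilon -> stronger privacy -> noisier released answer.
noisy = dp_count(range(100), lambda r: r < 40, epsilon=1.0)
```

The same calibration idea (noise scaled to sensitivity / ε) underlies DP training of deep models, where it is applied to clipped per-example gradients rather than to a simple count.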

