The Cost of Lost Privacy: Search, Antitrust and the Economics of the Control of User Data

Author(s):  
Nathan Newman

2016
Author(s):  
Jonathan Mellon

This chapter discusses the use of large quantities of incidentally collected data (ICD) to make inferences about politics. This type of data is sometimes referred to as “big data” but I avoid this term because of its conflicting definitions (Monroe, 2012; Ward & Barker, 2013). ICD is data that was created or collected primarily for a purpose other than analysis. Within this broad definition, this chapter focuses particularly on data generated through user interactions with websites. While ICD has been around for at least half a century, the Internet greatly expanded the availability and reduced the cost of ICD. Examples of ICD include data on Internet searches, social media data, and user data from civic platforms. This chapter briefly explains some sources and uses of ICD and then discusses some of the potential issues of analysis and interpretation that arise when using ICD, including the different approaches to inference that researchers can use.
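
As a concrete illustration of the kind of ICD workflow discussed in the chapter, the sketch below aggregates a web-interaction log into a daily time series that could feed an inference about political attention. The file and column names are assumptions made for the example, not data from the chapter.

```python
# A minimal sketch of one common ICD workflow: turning incidentally collected
# web-interaction records into a daily count series for later modeling.
import csv
from collections import Counter
from datetime import datetime

daily_counts = Counter()
with open("site_interaction_log.csv", newline="") as f:   # hypothetical log file
    for row in csv.DictReader(f):                          # assumes a 'timestamp' column
        day = datetime.fromisoformat(row["timestamp"]).date()
        daily_counts[day] += 1

for day in sorted(daily_counts):
    print(day, daily_counts[day])
```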


Electronics
2019
Vol 8 (2)
pp. 171
Author(s):
Aymen Mudheher Badr
Yi Zhang
Hafiz Gulfam Ahmad Umar

Cloud computing is increasingly used in commercial, government and healthcare institutions. Clouds store important data, which reduces management costs and ensures easy access. To protect this data, cryptographic methods are used to ensure its confidentiality, to secure access to user data and to increase trust in cloud technology. In our paper, we propose a new scheme to support an attribute-based encryption (ABE) system that involves multiple parties, namely data owners, data users, cloud servers and an authority. A verified and authenticated decryption process for the cloud environment is the key feature of our proposed architecture. The data owner encrypts their data and sends it to the cloud. The cloud server performs partial decryption, and the final decrypted data are shared with users according to their privileges. The data owner thus reduces processing complexity by delegating the decryption work to the cloud server. Analysis of the experimental results confirms that data access in the cloud environment is safer due to a controlled multi-user access-rights scheme. Our performance evaluation shows that the proposed model reduces communication overhead and makes Digital Imaging and Communications in Medicine (DICOM) data exchange more secure.
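
The sketch below is a minimal illustration of the multi-party workflow described in the abstract (authority, data owner, cloud server, user) with outsourced partial decryption. The real pairing-based ABE construction is replaced here by a toy two-layer keystream purely so the hand-offs between parties stay visible; the attribute names and keys are hypothetical.

```python
# Toy stand-in for outsourced-decryption ABE: two XOR layers, one per party.
import hashlib, secrets

def keystream(key: bytes, n: int) -> bytes:
    # Derive a pseudo-random byte stream from the key (stand-in for a cipher).
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def xor_layer(data: bytes, key: bytes) -> bytes:
    # Applying the same layer twice removes it (XOR is its own inverse).
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Authority: issues key material and records the access policy (attribute set).
policy = {"radiology", "doctor"}
cloud_key, user_key = secrets.token_bytes(32), secrets.token_bytes(32)

# Data owner: encrypts a DICOM-like record under both layers and uploads it.
plaintext = b"DICOM pixel data ..."
ciphertext = xor_layer(xor_layer(plaintext, user_key), cloud_key)

# Cloud server: checks the user's attributes against the policy, then strips
# only the outer layer (the partial decryption delegated by the data owner).
def cloud_partial_decrypt(ct: bytes, user_attrs: set) -> bytes:
    if not policy.issubset(user_attrs):
        raise PermissionError("attributes do not satisfy the access policy")
    return xor_layer(ct, cloud_key)

# User: finishes decryption locally with the key obtained from the authority.
partial = cloud_partial_decrypt(ciphertext, {"radiology", "doctor", "staff"})
assert xor_layer(partial, user_key) == plaintext
```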


2022
Vol 19 (1)
pp. 1-26
Author(s):  
Mengya Lei
Fan Li
Fang Wang
Dan Feng
Xiaomin Zou
...  

Data security is an indispensable part of non-volatile memory (NVM) systems. However, implementing data security efficiently on NVM is challenging, since we have to guarantee the consistency of user data and the related security metadata. Existing consistency schemes ignore the recoverability of the SGX-style integrity tree (SIT) and the access correlation between metadata blocks, thereby generating unnecessary NVM write traffic. In this article, we propose SecNVM, an efficient and write-friendly metadata crash-consistency scheme for secure NVM. SecNVM utilizes the observation that, for a lazily updated SIT, the tree nodes lost after a crash can be recovered from the corresponding child nodes in NVM. It reduces the SIT persistency overhead through a restrained write-back metadata cache and exploits the SIT inter-layer dependency for recovery. Next, leveraging the strong access correlation between the counter and the DMAC, SecNVM improves the efficiency of security metadata access through a novel collaborative counter-DMAC scheme. In addition, it adopts a lightweight address tracker to reduce the cost of address tracking for fast recovery. Experiments show that, compared to state-of-the-art schemes, SecNVM improves performance, substantially decreases write traffic, and achieves an acceptable recovery time.
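
The recovery observation that SecNVM builds on can be illustrated with a toy integrity tree. The sketch below is my own simplification, not the paper's data layout: a lazily written-back parent level is rebuilt from child nodes persisted in NVM instead of flushing every parent update.

```python
# Rebuild lost parent nodes of a toy integrity tree from persisted children.
import hashlib

FANOUT = 4

def parent_digest(children):
    # Parent node summarizes its children (stand-in for the SIT counters/MACs).
    return hashlib.sha256(b"".join(children)).digest()

# Leaves (e.g., per-block counters) are persisted eagerly in NVM.
nvm_leaves = [hashlib.sha256(str(i).encode()).digest() for i in range(16)]

# Parents are cached lazily; pretend a crash wiped the cached upper levels.
def rebuild_level(children):
    return [parent_digest(children[i:i + FANOUT])
            for i in range(0, len(children), FANOUT)]

level1 = rebuild_level(nvm_leaves)   # recovered instead of persisted on every write
root = rebuild_level(level1)[0]      # recomputed root, checked against a persisted copy
print("recovered root:", root.hex()[:16], "...")
```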


Author(s):  
Boris M. Basok
Alena N. Rozhanskaya
Sergey L. Frenkel

The paper discusses the task of organizing usability testing of web applications that are in pilot or industrial operation. A usability testing technique is described that combines the development, debugging and execution of test scenarios with analysis of the resulting test data. The development of test scenarios is based on the accumulated experience of users who have already worked with this application or with another of similar functionality; on web analytics, which can provide scenarios of user behavior on the site; and on statistical data on visits to specific pages. Alongside this approach, a second approach is used to construct tests: developing tests aimed at identifying defects in the program. Debugging and execution of test tasks are carried out in the same way as in functional testing of web applications using test automation tools. In addition, analysis of the data obtained during operation using web analytics makes it possible to form a group of respondent testers whose capabilities reflect those of the entire set of probable users of the web application. The approaches outlined in the work were put into practice. As an example, the article provides test data for the page of the admissions committee of MIREA – Russian Technological University, priem.mirea.ru. The experimental data showed that, despite usability testing at the development stages, some errors in the operation of web applications remain undetected, and the cost of detecting and eliminating them increases significantly. The paper therefore recommends raising the level of usability already in the early stages of development. In particular, for operational prediction of the level of usability, it is desirable to have mathematical tools for modeling the behavior of the designed system and of the user.
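
As an example of how such an analytics-derived scenario can be automated, the sketch below (assuming Selenium with a local ChromeDriver) loads the admissions page, follows a link and checks that the target renders. The selectors and assertions are generic placeholders rather than the authors' actual test cases.

```python
# A generic scenario test: open the landing page, follow a link, check the result.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://priem.mirea.ru")                 # page studied in the article
    assert driver.title, "landing page returned an empty title"

    links = driver.find_elements(By.TAG_NAME, "a")       # candidate navigation steps
    assert links, "no navigable links found on the landing page"

    links[0].click()                                     # stand-in for the most-visited link
    assert "404" not in driver.title, "navigation target failed to load"
finally:
    driver.quit()
```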


Author(s):  
Shubhranil Chakraborty
Debabrata Bej
Dootam Roy
Sekh Arif Mahammad

A reliable Electronic Voting Machine (EVM) integrated with a biometric fingerprint scanner is proposed and implemented in this study to ensure a secure election process. This biometric EVM includes features such as an interactive user interface, a hack-free design and a master lock. The system can register user data and store it in a database through proper authentication. Moreover, the proposed system reduces the need for human resources. This paper provides a detailed description of the systematic development of the hardware and software used; the software part includes algorithm development and implementation. A thorough understanding is provided of the data, the communication protocols, and the pathways used to store data in the devices. Additionally, the cost of the system is 62.82% less than that of the EVMs officially in use in India. Furthermore, this study seeks to demonstrate the benefits of such an approach from both a technological and a social standpoint.
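
The registration-and-authentication flow described above can be sketched in a few lines. The code below is my own simplification, not the authors' firmware: a fingerprint template is reduced to a hash stored in a small database, and unknown fingers or double voting are rejected before a ballot is recorded.

```python
# Toy register-then-vote flow with a hashed fingerprint template as the key.
import hashlib, sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE voters (fp_hash TEXT PRIMARY KEY, has_voted INTEGER)")
db.execute("CREATE TABLE votes (candidate TEXT)")

def register(template: bytes) -> None:
    # Store only a digest of the scanner's template, once per voter.
    db.execute("INSERT OR IGNORE INTO voters VALUES (?, 0)",
               (hashlib.sha256(template).hexdigest(),))

def cast_vote(template: bytes, candidate: str) -> bool:
    h = hashlib.sha256(template).hexdigest()
    row = db.execute("SELECT has_voted FROM voters WHERE fp_hash = ?", (h,)).fetchone()
    if row is None or row[0]:                 # unknown finger or already voted
        return False
    db.execute("INSERT INTO votes VALUES (?)", (candidate,))
    db.execute("UPDATE voters SET has_voted = 1 WHERE fp_hash = ?", (h,))
    return True

register(b"scanner-template-001")               # hypothetical scanner output
print(cast_vote(b"scanner-template-001", "A"))  # True: first authenticated vote
print(cast_vote(b"scanner-template-001", "A"))  # False: double voting rejected
```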


Author(s):  
James F. Mancuso

IBM PC compatible computers are widely used in microscopy for applications ranging from control to image acquisition and analysis. The choice of IBM-PC based systems over competing computer platforms can be based on technical merit alone or on a number of factors relating to economics, availability of peripherals, management dictum, or simple personal preference. The IBM-PC got a strong “head start” by first dominating clerical, document processing and financial applications. The use of these computers spilled into the laboratory, where the DOS-based IBM-PC replaced minicomputers. Compared to minicomputers, the PC provided a more cost-effective platform for applications in numerical analysis, engineering and design, instrument control, image acquisition and image processing. In addition, the sitewide use of a common PC platform could reduce the cost of training and support services relative to cases where many different computer platforms were used. This could be especially true for microscopists who must use computers in both the laboratory and the office.


Author(s):  
H. Rose

The imaging performance of light optical lens systems has reached such a degree of perfection that nowadays numerical apertures of about 1 can be utilized. Compared to this state of development, the objective lenses of electron microscopes are rather poor, allowing at most usable apertures somewhat smaller than 10^(-2). This severe shortcoming is due to the unavoidable axial chromatic and spherical aberration of the rotationally symmetric electron lenses employed so far in all electron microscopes. The resolution of such electron microscopes can only be improved by increasing the accelerating voltage, which shortens the electron wavelength. Unfortunately, this procedure is rather ineffective because the achievable gain in resolution is only proportional to λ^(1/4) for a fixed magnetic field strength determined by the magnetic saturation of the pole pieces. Moreover, increasing the accelerating voltage results in deleterious knock-on processes and in extreme difficulties in stabilizing the high voltage. Last but not least, the cost increases exponentially with voltage.
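
A short worked example makes the quoted scaling concrete. Using the standard relativistic electron wavelength (my own arithmetic, not figures from the abstract), tripling the accelerating voltage from 100 kV to 300 kV roughly halves λ, yet a gain proportional to λ^(1/4) amounts to only about a 17% improvement in resolution.

```python
# Relativistic electron wavelength and the implied λ^(1/4) resolution gain.
from math import sqrt

h, m0, e, c = 6.626e-34, 9.109e-31, 1.602e-19, 2.998e8   # SI constants

def wavelength(volts: float) -> float:
    # λ = h / sqrt(2 m0 e V (1 + eV / (2 m0 c^2)))
    return h / sqrt(2 * m0 * e * volts * (1 + e * volts / (2 * m0 * c**2)))

lam_100, lam_300 = wavelength(100e3), wavelength(300e3)
print(f"lambda at 100 kV: {lam_100 * 1e12:.2f} pm")   # ~3.70 pm
print(f"lambda at 300 kV: {lam_300 * 1e12:.2f} pm")   # ~1.97 pm
print(f"resolution gain ~ (lambda ratio)^(1/4): {(lam_100 / lam_300) ** 0.25:.2f}")  # ~1.17x
```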


1994
Vol 58 (11)
pp. 832-835
Author(s):  
ES Solomon
TK Hasegawa
JD Shulman
PO Walker

1998
Vol 138 (2)
pp. 205-205
Author(s):  
Snellman
Maljanen
Aromaa
Reunanen
Jyrkinen‐Pakkasvirta
...  
