Reconciling Data Access and Privacy: Building a Sustainable Model for the Future

2019 ◽  
Vol 109 ◽  
pp. 409-413
Author(s):  
Katharine G. Abraham

An important factor in the government's ability to collect data from individuals and businesses is the promise that their information will be kept private. Given the explosion of other data increasingly available in electronic form, however, there is a growing risk that the subjects of federal data collections could be re-identified and their privacy thereby compromised. This implies that current modes for disseminating information based on survey and census data will need to be rethought. While the broad outlines for a new system seem relatively clear, important practical questions about its implementation will need to be addressed.

2013 ◽  
Vol 706-708 ◽  
pp. 733-736
Author(s):  
Fang Li

It is very difficult to determine the position of the center of gravity of an irregular object. This paper introduces a new system that can be used for measuring and adjusting the center of gravity. The irregular object is placed on the support parts, and the position of its center of gravity is obtained. The actual position is then compared with the ideal position and the difference is reported. Machining continues until the actual and ideal positions coincide. The measurement system is programmed in Matlab. It is expected to be widely applied in the future.
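The abstract gives no implementation details; the following is only a minimal sketch in Python (the paper itself uses Matlab) of the measure-compare-machine loop it describes. The tolerance, the partial-correction factor and the measurement and machining functions are hypothetical placeholders.

```python
import numpy as np

TOLERANCE_MM = 0.05  # hypothetical acceptance tolerance for the CG offset

def measure_center_of_gravity(part):
    """Placeholder for the measurement step: in the described system the CG
    would be derived from the reactions at the support parts."""
    return np.array(part["measured_cg"])

def machine_part(part, offset):
    """Placeholder for the machining step: remove material so the CG moves
    towards the ideal position (modelled here as a partial correction)."""
    part["measured_cg"] = (np.array(part["measured_cg"]) - 0.8 * offset).tolist()

def balance(part, ideal_cg):
    ideal = np.array(ideal_cg, dtype=float)
    while True:
        actual = measure_center_of_gravity(part)
        offset = actual - ideal                      # difference reported to the operator
        if np.linalg.norm(offset) <= TOLERANCE_MM:   # actual and ideal positions coincide
            return actual
        machine_part(part, offset)                   # continue machining, then re-measure

part = {"measured_cg": [1.2, -0.4]}
print(balance(part, ideal_cg=[0.0, 0.0]))
```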


2021 ◽  
Vol 103 (2) ◽  
pp. 60-61
Author(s):  
Joshua P. Starr

The release of new U.S. census data led many pundits to opine about what demographic changes mean for the future of the country. But, as Joshua P. Starr explains, educators have been watching their classrooms and schools become less white for many years. What’s important now is not the change itself but how we interpret the change. The stories people tell about past and present changes can affect their response to that change. Leaders who encounter resistance as they propose new equity initiatives can benefit from listening to the stories of those who are wary, in order to understand the reasons for their resistance.


Author(s):  
Colleen Loos ◽  
Gita Mishra ◽  
Annette Dobson ◽  
Leigh Tooth

Introduction: Linked health record collections, when combined with large longitudinal surveys, are a rich research resource to inform policy development and clinical practice across multiple sectors. Objectives and Approach: The Australian Longitudinal Study on Women’s Health (ALSWH) is a national study of over 57,000 women in four cohorts. Survey data collection commenced in 1996. Over the past 20 years, ALSWH has also established an extensive data linkage program. The aim of this poster is to provide an overview of ALSWH’s program of regularly updated linked data collections for use in parallel with ongoing surveys, and to demonstrate how data are made widely available to research collaborators. Results: ALSWH surveys collect information on health conditions, ageing, reproductive characteristics, access to health services, lifestyle, and socio-demographic factors. Regularly updated linked national and state administrative data collections add information on health events, health outcomes, diagnoses, treatments, and patterns of service use. ALSWH’s national linked data collections include the Medicare Benefits Schedule, the Pharmaceutical Benefits Scheme, the National Death Index, the Australian Cancer Database, and the National Aged Care Data Collection. State and Territory hospital collections include Admitted Patients, Emergency Department and Perinatal Data. There are also substudies, such as the Mothers and their Children’s Health Study (MatCH), which involves linkage to children’s educational records. ALSWH has an internal Data Access Committee along with systems and protocols to facilitate collaborative multi-sectoral research using de-identified linked data. Conclusion/Implications: As a large-scale Australian longitudinal multi-jurisdictional data linkage and sharing program, ALSWH is a useful model for anyone planning similar research.
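The poster abstract does not describe the linkage mechanics; the snippet below is only a schematic illustration of how de-identified survey records can be joined to linked administrative records on a shared project identifier using pandas. The data frames, column names and values are invented and do not come from ALSWH.

```python
import pandas as pd

# Hypothetical de-identified extracts: a project ID replaces any direct identifiers.
surveys = pd.DataFrame({
    "project_id": [101, 102, 103],
    "cohort": ["1973-78", "1946-51", "1989-95"],
    "self_rated_health": ["good", "fair", "excellent"],
})
hospital_admissions = pd.DataFrame({
    "project_id": [101, 101, 103],
    "admission_date": ["2018-03-02", "2019-07-15", "2020-01-20"],
    "principal_diagnosis": ["I20", "E11", "O80"],
})

# Left join keeps every survey participant and adds linked admissions where they exist.
linked = surveys.merge(hospital_admissions, on="project_id", how="left")
print(linked)
```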


Author(s):  
Robert Wuthnow

This chapter examines the future that small communities may—or may not—hold for the next generation. As residents nearly always see it, young people who grow up in small towns should go to college in order to be well prepared for whatever the future may hold. However, the reasons given along with the concerns underlying these reasons are more complex than surveys and census data reveal. Although they consider higher education critical, residents—parents and educators alike—acknowledge that there are aspects of small-town culture that make it difficult for young people to plan appropriately in order to make the most of college or university training. The chapter considers the importance of college for future planning among young people, as well as the disadvantages of living in a small town, and how community ties remain among residents.


Author(s):  
Steven Brint

This chapter discusses the analytical contrast between the two systems for organizing research and education, the system of academic professionalism and the system of academic innovationism. Under the impetus of academic innovationism, universities became more porous to the outside world and reciprocal relations of knowledge exchange grew denser. On balance, the new system contributed significantly and positively to the research prowess of universities. But it has also yielded a spotty record—some extraordinary successes but also many short-lived, troubled collaborations. Some universities invested heavily in the infrastructure to foster academic innovation and had little to show for their investments. For research universities, the challenge for the future will be to expand the possibilities to contribute more to the national innovation effort.


2011 ◽  
pp. 234-248
Author(s):  
Enid Mumford

Participative systems design has, in the past, been seen as a positive group process of thinking through needs and problems and arriving at solutions for making the situation better. This improved situation then continues until new technology or new solutions provide an opportunity for making the situation better still. So far this book has concentrated on how to make the best use of the positive factors assisting change, especially change that involves the introduction and use of technology. It has described the importance of getting a clear understanding of the change problem and its complexity, of developing effective strategies to address this complexity, and of the creation of structures, often organizational, to facilitate the subsequent use of the new system. This last requires always keeping in mind the need to meet the dual objectives of achieving operating efficiency and a good quality of working life. This is often described as job satisfaction. Most of all there has been a continual stress on the importance of participation. This involves sharing the design tasks with those who will be affected by them and taking account of their opinions in design decisions.

This chapter addresses the reverse of this positive objective. It considers the negative factors in a change situation which are likely to cause problems and to threaten the success of the change programme and of the new system. There are very many of these kinds of problems and it is only possible to discuss a few here. The ones I have selected are criminal threats which affect the future viability of the company, technical problems which reduce efficiency, unpleasant and stressful work that threatens employee health, and problems of morale which affect the individual’s happiness in the workplace.

A consideration of negative factors brings us into the challenging areas of uncertainty and risk. Uncertainty is when we do not know what is going to happen and often contains an element of surprise. This is especially true today when so many decisions depend on forecasts of the future. A contributing factor here can be an overemphasis on the present as a means of forecasting the future. Uncertainty is also often a result of the behaviour of others rather than of events. This is hard to predict. Experts tell us that today we are living in a risk society (Beck, 1992). Complex design problems can have a high degree of uncertainty and easily become risks. They often have a subjective element, for what one person considers a problem or a risk, another will see as an opportunity. Complex problems also require information for their solution and this may be difficult to find. It requires the ability to search for, analyse and synthesise relevant intelligence and relate it to past, current and future events. Threats to important institutions from terrorists are of a different nature and scale to those that have been experienced before. Many will take us completely by surprise.

Bernstein (1996) suggests that the essence of risk management lies in maximising the areas which we have some control over while minimising those areas where we have no control over the outcome and the linkage between cause and effect is hidden. When we take a risk we are making a bet that a particular outcome will result from the decision we have made although we have no certainty that this will happen. Risk management usually starts with risk analysis, which attempts to establish and rank the most serious risks to be avoided so far as these are known.
Here many companies attempt to achieve a balance between the benefits of greater security and the costs involved. Too high a level of security, while providing good protection, can result in a system that is both difficult to use and expensive to operate (Mumford, 1999).

Risk analysis next moves on to risk assessment. This is an analysis of the seriousness of different risks by determining the probability and potential damage of each one. For example, major risks can come from a large concentration of data in one place that is accessed by many different people, not all of whom are known. There can be relationships between risks. Clifford Stoll’s (1990) book The Cuckoo’s Egg shows how the ability of a German hacker to enter a university laboratory computer made it possible for him to later enter into the computers of United States military bases.

Risk analysis identifies the risks; risk assessment tries to estimate how likely they are to happen and how serious the consequences will be. Risk prioritisation recognises that all companies cannot be protected from all risks and choices must be made. Risk impact is the likely magnitude of the loss if a system break-in, fraud or other serious problem occurs. Risk control involves further actions to reduce the risk and to trigger further defensive actions if a very serious problem occurs. Risk control also covers the monitoring of risk on a regular basis to check that existing protection is still effective. This can lead to risk reassessment.

Very serious risks such as those coming from terrorist attack or criminal activity require monitoring. This, together with the detailed documentation of any problems or illegal activities when they occur, is essential to avoid complacency. An effective system must both prevent problems and detect when they have occurred. All of these activities to design security into a system require human vigilance if they are to be effective. All employees should accept some responsibility for checking that the system they work with continues to maintain its integrity and security.

This chapter will place its main focus on protective problem solving and design directed at avoiding or minimising very serious risks. Today, it is unwise for managers to neglect this. Because of its growth in recent years and its prevalence today, criminal activity will be examined first in some detail. Particular attention will be paid to how the involvement of employees in problem solving can play a part in reducing or avoiding this.
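As a rough illustration of the risk analysis, assessment and prioritisation steps described above, the following Python sketch ranks a risk register by expected loss (probability times impact). The register entries, probabilities and figures are invented for illustration and are not drawn from the chapter.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float  # estimated likelihood per year (0..1)
    impact: float       # estimated loss if it occurs, in currency units

    @property
    def expected_loss(self) -> float:
        # Assessment: combine likelihood and potential damage.
        return self.probability * self.impact

# Hypothetical register produced by the risk analysis step.
register = [
    Risk("External intrusion via unpatched server", 0.30, 500_000),
    Risk("Insider fraud in payments system", 0.05, 2_000_000),
    Risk("Prolonged outage of order processing", 0.15, 250_000),
]

# Prioritisation: rank by expected loss; controls are applied to the top items first.
for risk in sorted(register, key=lambda r: r.expected_loss, reverse=True):
    print(f"{risk.name}: expected loss ~ {risk.expected_loss:,.0f}")
```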


Author(s):  
Olga Mendoza-Schrock ◽  
Mateen M. Rizki ◽  
Vincent J. Velten

This article describes how transfer subspace learning has recently gained popularity for its ability to perform cross-dataset and cross-domain object recognition. The ability to leverage existing data without the need for additional data collections is attractive for monitoring and surveillance technology, specifically for aided target recognition applications. Transfer subspace learning enables the incorporation of sparse and dynamically collected data into existing systems that utilize large databases. Manifold learning has also gained popularity for its success at dimensionality reduction. In this contribution, manifold learning and transfer subspace learning are combined to create a new system capable of achieving high target recognition rates. The manifold learning technique used in this contribution is diffusion maps, a nonlinear dimensionality reduction technique based on a heat diffusion analogy. The transfer subspace learning technique used is Transfer Fisher's Linear Discriminative Analysis. The new system, manifold transfer subspace learning, sequentially integrates manifold learning and transfer subspace learning. In this article, the ability of the new techniques to achieve high target recognition rates for cross-dataset and cross-domain applications is illustrated using a variety of diverse datasets.
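As an illustration of the sequential combination the article describes, the sketch below computes a minimal diffusion-map embedding (heat-kernel affinities, row-normalised transition matrix, leading non-trivial eigenvectors) and then trains a linear discriminant classifier on the embedded source data. The plain LDA step is only a stand-in: the article's transfer subspace step, Transfer Fisher's Linear Discriminative Analysis, additionally aligns source and target distributions, which is omitted here. The data and parameter values are hypothetical.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def diffusion_map(X, epsilon=1.0, n_components=2):
    """Minimal diffusion-map embedding: heat-kernel affinities, row-normalised
    transition matrix, then the leading non-trivial eigenvectors."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / epsilon)               # heat-diffusion analogy
    P = K / K.sum(axis=1, keepdims=True)          # row-stochastic Markov matrix
    eigvals, eigvecs = np.linalg.eig(P)
    order = np.argsort(-eigvals.real)[1:n_components + 1]  # skip the trivial eigenvector
    return eigvecs.real[:, order] * eigvals.real[order]

# Hypothetical two-class source data with a mean shift between the classes.
rng = np.random.default_rng(0)
X_source = rng.normal(size=(60, 10)) + np.repeat([[0.0], [3.0]], 30, axis=0)
y_source = np.repeat([0, 1], 30)

Z = diffusion_map(X_source, epsilon=5.0, n_components=2)    # manifold learning step
clf = LinearDiscriminantAnalysis().fit(Z, y_source)         # discriminant step (stand-in)
print("embedding shape:", Z.shape, "training accuracy:", clf.score(Z, y_source))
```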


2019 ◽  
Vol 22 (6) ◽  
pp. 647-650
Author(s):  
Thomas Nilsen ◽  
Ingunn Brandt ◽  
Jennifer R. Harris

The Norwegian Twin Registry (NTR) is maintained as a research resource that was compiled by merging several panels of twin data that were established for research into physical and mental health, wellbeing and development. NTR is a consent-based registry. Where possible, data that were collected in previous studies are curated for secondary research use. A particularly valuable potential benefit associated with the Norwegian twin data lies in the opportunities to expand and enhance the data through record linkage to nationwide registries that cover a wide array of health data and other information, including socioeconomic factors. This article provides a brief description of the current NTR sample and data collections, information about data access procedures and an overview of the national registries that can be linked to the NTR for research projects.


2019 ◽  
Vol 15 (3/4) ◽  
pp. 174-198
Author(s):  
A. Abdollahi Nami ◽  
L. Rajabion

Purpose A mobile ad hoc network (MANET) enables providers and customers to communicate without a fixed infrastructure. Databases are extended to MANETs to allow easy data access and updating. As the energy and mobility limitations of both servers and clients affect the availability of data in MANETs, these data are replicated. The purpose of this paper is to provide a literature review of data replication issues and to classify the available strategies based on the issues they address. Design/methodology/approach The selected articles are reviewed based on the defined criteria, and the differences, advantages and disadvantages of these techniques are described. The methods in the literature can be categorized into three groups: cluster-based, location-based and group-based mechanisms. Findings High flexibility and data consistency are the features of cluster-based mechanisms. The location-based mechanisms are also appropriate for replica allocation, and they mostly have low network traffic and delay. The group-based mechanisms have high data accessibility compared to the other mechanisms. Data accessibility and access time have received the most attention in data replication techniques; scalability is an important parameter that must be considered more in the future. The reduction of storage cost in MANETs is the main goal of data replication, so researchers have to consider the cost parameter when other parameters are affected. Research limitations/implications Data replication in MANETs has been covered in different available sources such as Web pages, technical reports, academic publications and editorial notes. Articles published in national journals and conferences are excluded from this study, which draws on the main international academic journals to ensure quality. Originality/value The paper reviews past and state-of-the-art mechanisms for data replication in MANETs. Specifically, data replication’s main goal, existing challenges, research terminologies and mechanisms in MANETs are summarized using the answers to the research questions. This will help researchers to develop more effective data replication methods in MANETs in the future.
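As a loose illustration of the frequency-based allocation idea shared by several of the schemes the survey groups together, the sketch below greedily replicates the most frequently accessed items within a node's limited memory. It is a generic simplification under assumed inputs, not an implementation of any specific strategy from the review.

```python
from collections import Counter

def allocate_replicas(access_log, memory_slots):
    """Greedy replica allocation: keep the most frequently accessed data items
    that fit in a mobile node's limited memory."""
    freq = Counter(access_log)
    ranked = [item for item, _ in freq.most_common()]
    return ranked[:memory_slots]

# Hypothetical access trace observed by one mobile node.
trace = ["map_tile_7", "sensor_cfg", "map_tile_7", "route_plan",
         "map_tile_7", "sensor_cfg", "traffic_feed"]
print(allocate_replicas(trace, memory_slots=2))  # -> ['map_tile_7', 'sensor_cfg']
```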


2019 ◽  
Vol 3 (1) ◽  
Author(s):  
Willy Willy

This research aims to analyze the potential bankruptcy of PT Indo Asia Sukses using the Zmijewski and Springate methods over the period 2016-2018. The data used are primary data obtained from the company. This research uses a qualitative method with two data collection techniques: documentation and theoretical study. The results show that: (1) according to the Zmijewski X-Score, the company was in good condition, with negative scores in 2016-2018; (2) according to the Springate S-Score, the company was in good condition in 2016-2018, but the first quarter of 2017 indicated a potential for bankruptcy. Based on these results, it is hoped that the company will improve its sales, because the analysis shows that low sales brought the company close to bankruptcy in the first quarter of 2017, and that it will pay closer attention to its finances if a similar condition occurs in the future.
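The abstract does not restate the models; the sketch below uses the Zmijewski X-score and Springate S-score formulations as they are commonly reported in the literature, applied to hypothetical figures rather than PT Indo Asia Sukses' actual statements.

```python
def zmijewski_x(net_income, total_assets, total_liabilities,
                current_assets, current_liabilities):
    """Zmijewski (1984) X-score as commonly reported: negative values
    indicate a healthy firm, positive values indicate financial distress."""
    roa = net_income / total_assets
    debt_ratio = total_liabilities / total_assets
    current_ratio = current_assets / current_liabilities
    return -4.3 - 4.5 * roa + 5.7 * debt_ratio - 0.004 * current_ratio

def springate_s(working_capital, ebit, ebt, sales,
                total_assets, current_liabilities):
    """Springate (1978) S-score: values below 0.862 signal potential bankruptcy."""
    return (1.03 * working_capital / total_assets
            + 3.07 * ebit / total_assets
            + 0.66 * ebt / current_liabilities
            + 0.40 * sales / total_assets)

# Hypothetical quarterly figures, not the company's actual statements.
x = zmijewski_x(net_income=120, total_assets=2_000, total_liabilities=900,
                current_assets=700, current_liabilities=500)
s = springate_s(working_capital=200, ebit=200, ebt=170, sales=1_800,
                total_assets=2_000, current_liabilities=500)
print(f"X-score {x:.3f} ({'healthy' if x < 0 else 'distressed'})")
print(f"S-score {s:.3f} ({'healthy' if s >= 0.862 else 'at risk'})")
```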

