Data Management: A New Approach to Problem Definition Using Information Objects

1995, Vol. 12 (2), pp. 21-26
Author(s): Deepak Khazanchi, Surya B. Yadav
Author(s): V. A. Ziparo, F. Cottefoglie, D. Calisi, F. Giannone, G. Grisetti, ...

2010, Vol. 28 (2), pp. 131-134
Author(s): Peter W. Brewer, Kit Sturgeon, Lucas Madar, Sturt W. Manning

Author(s): Gerald Kaufman

The National Prison Overcrowding Project is operated by the Center for Effective Public Policy. The project took shape in 1981, growing out of the desire of the National Institute of Corrections and the Edna McConnell Clark Foundation to incorporate a broad systemic view in their efforts to control overcrowding. The Center has worked with Michigan, Colorado, South Carolina, and Oregon. In each of these states a group of significant and diverse policymakers now exists whose members understand how their criminal justice system works and take responsibility for changing it. In its work with the states, the Center facilitates a process aimed at achieving long-term, systemic change rather than simply quick-fix solutions. The focus is the state policy group, composed of policymakers from all three branches of government, high-ranking officials from criminal justice agencies and local law enforcement, and private citizens. National staff from the Center work with state project staff to take the policy group through a series of steps designed to produce problem definition and analysis, the information required for the analysis, the selection of policy options, and the implementation and monitoring of those options. Because of the value-laden as well as technical nature of the subject matter, the participation of all members in this policy analysis is critical.


Author(s): Chi Minh Pham

In recent years, big data analytics has become widely applied in cybersecurity, leading to the novel approach of big data cybersecurity analytics. While some organizations have begun adopting this approach to tackle cybercrime, there are few guidelines to which companies can refer. Therefore, the well-established maturity model concept, which offers a systematic way for an organization to measure and improve its maturity level, is applied in this study. On the basis of a comprehensive literature review, this chapter proposes a maturity framework for big data cybersecurity analytics. The synthesized framework comprises seven dimensions assessed across five stage levels: the organization, human, infrastructure, data management, analytics application, governance, and security dimensions. Knowing which dimensions need improvement and which pathway to follow helps ensure the successful implementation of big data cybersecurity analytics within organizations.
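As an illustration of how such a framework might be operationalized, the sketch below models the seven dimensions named in the abstract as a simple assessment structure in Python. The five stage-level names (Initial through Optimized) are placeholders assumed for the example, since the abstract does not name the levels, and the helper method is purely illustrative.

```python
from dataclasses import dataclass
from enum import IntEnum

# Hypothetical stage-level names: the chapter defines five levels,
# but the abstract does not name them, so these labels are assumptions.
class StageLevel(IntEnum):
    INITIAL = 1
    DEVELOPING = 2
    DEFINED = 3
    MANAGED = 4
    OPTIMIZED = 5

# The seven dimensions listed in the abstract.
DIMENSIONS = (
    "organization", "human", "infrastructure", "data management",
    "analytics application", "governance", "security",
)

@dataclass
class MaturityAssessment:
    """Per-dimension stage levels for one organization."""
    scores: dict[str, StageLevel]

    def weakest_dimensions(self) -> list[str]:
        """Dimensions at the lowest current level, i.e. where to improve first."""
        lowest = min(self.scores.values())
        return [d for d, s in self.scores.items() if s == lowest]

# Usage: an organization strong on infrastructure but weak on governance.
assessment = MaturityAssessment(scores={
    "organization": StageLevel.DEFINED,
    "human": StageLevel.DEVELOPING,
    "infrastructure": StageLevel.MANAGED,
    "data management": StageLevel.DEFINED,
    "analytics application": StageLevel.DEVELOPING,
    "governance": StageLevel.INITIAL,
    "security": StageLevel.DEFINED,
})
print(assessment.weakest_dimensions())  # ['governance']
```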


2014, Vol. 12 (3), pp. 3319-3324
Author(s): Piyapong Khumrin, Ariyaphong Wongnoppavich, Khemmapop Boonploy, Volaluck Supajatura

This paper describes a new approach to computer-based testing in which lecturers submit questions as Word documents that are processed to produce an examination, with student results analyzed and reported in a spreadsheet. The process starts with lecturers sending question files in Word document format via email to the service provider. The questions pass through an approval process in the editing system and are then transferred to the examination system. The examination system reads the question files directly to create a test, which students complete by entering their answers into a spreadsheet file. Finally, the data are analyzed using spreadsheet formulas, and the reporting system sends the results to the students' email addresses. This document-based approach keeps the implementation simple and well accepted by users, while remaining consistent with the organizational requirement of moving towards electronic data management.
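To make the workflow concrete, here is a minimal sketch of the scoring step in Python. It assumes the answer key has already been extracted from the lecturers' Word documents into a simple mapping and that student answers arrive as a CSV file; the file name, column names, and key format are illustrative assumptions rather than details from the paper, which works with Word documents and spreadsheet formulas directly.

```python
import csv

# Assumed answer key, extracted beforehand from the question documents.
ANSWER_KEY = {"Q1": "B", "Q2": "D", "Q3": "A"}

def score_responses(csv_path: str) -> dict[str, int]:
    """Return a score per student email from a CSV with columns
    student_email, question_id, answer (column names are assumptions)."""
    scores: dict[str, int] = {}
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            email = row["student_email"]
            correct = ANSWER_KEY.get(row["question_id"]) == row["answer"].strip().upper()
            scores[email] = scores.get(email, 0) + int(correct)
    return scores

if __name__ == "__main__":
    # In the described system a reporting step would email these results back.
    for email, score in score_responses("responses.csv").items():
        print(f"{email}: {score}/{len(ANSWER_KEY)}")
```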


Author(s): M. Garramone, N. Moretti, M. Scaioni, C. Ellul, F. Re Cecconi, ...

Abstract. The integration of Building Information Modelling (BIM) and Geographical Information Systems (GIS) is gaining momentum in digital built Asset Management (AM). It has the potential to improve information management operations and to provide advantages in process control and the delivery of quality AM services, along with underlying data management benefits throughout the entire life cycle of an asset. Work has been carried out relating GeoBIM/AM to buildings as well as to infrastructure assets, where the potential financial savings are extensive. While information from BIM may be sufficient for building AM, infrastructure AM requires a combination of GIS and BIM. Scientific literature on this topic has grown in recent years and has now reached a point where a systematic analysis of current and potential uses of GeoBIM in AM for infrastructure is possible. Three specific areas form part of the analysis: reviews of BIM for infrastructure AM and of GIS for infrastructure AM lead to a better understanding of current practice, and, combining the two, a review of GeoBIM for infrastructure AM allows the benefits of, and issues relating to, GeoBIM to be clearly identified at both technical and operational levels. A set of 54 journal articles was selected for in-depth content analysis according to the AM function addressed and the asset class managed. The analysis enabled the identification of three categories of issues and opportunities: data management, interoperability and integration, and AM process and service management. The identified knowledge gaps, in turn, underpin problem definition for the next phases of research into GeoBIM for infrastructure AM.
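The bookkeeping behind such a content analysis can be sketched in a few lines of Python: each reviewed article is tagged with the AM function it addresses and the asset class it manages, then tallied. The records and tag values below are invented for illustration and are not the 54 articles reviewed in the paper.

```python
from collections import Counter
from typing import NamedTuple

class Article(NamedTuple):
    title: str
    am_function: str   # e.g. maintenance planning, condition assessment (assumed labels)
    asset_class: str   # e.g. road, rail, bridge, utility network (assumed labels)

# Invented example records standing in for the reviewed literature.
articles = [
    Article("GeoBIM for bridge inspection data", "condition assessment", "bridge"),
    Article("GIS-based road maintenance scheduling", "maintenance planning", "road"),
    Article("BIM-GIS asset registers for rail", "asset inventory", "rail"),
]

# Tally how often each AM function and asset class is addressed.
by_function = Counter(a.am_function for a in articles)
by_asset = Counter(a.asset_class for a in articles)

print(by_function.most_common())
print(by_asset.most_common())
```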


Database, 2019, Vol. 2019
Author(s): J Jarczak, J Lach, P Borówka, M Gałka, M Bućko, ...

Abstract. The dynamic development of the biobanking industry (both business and science) has resulted in a growing number of IT systems for sample and data management. The most difficult and complicated case for the biobanking community has been cooperation in scientific research between institutions equipped with different IT systems, mainly the interchange of data and flow of information. Tools available on the market operate mainly at the biobank or collection level; efficient and universal protocols that include detailed information about both the donor and the sample are still very limited. Here, we have developed BioSCOOP, a communication protocol in the form of a well-documented JSON API. The main aim of this study was to harmonize and standardize the rules of communication between biobanks at the level of information about the donor together with information about the sample. The protocol serves two purposes: to transfer information between different biobanks and to allow searching and presentation of sample and data sets.
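For orientation, the sketch below shows what a message coupling donor and sample information might look like when exchanged as JSON, together with a minimal round trip in Python. The abstract confirms only that BioSCOOP is a documented JSON API operating at the donor-plus-sample level; every field name in the example is a hypothetical placeholder, not the actual BioSCOOP schema.

```python
import json

# Hypothetical donor-plus-sample message; all field names are placeholders,
# not the published BioSCOOP protocol.
message = {
    "protocol": "BioSCOOP",          # assumed envelope field
    "donor": {
        "donor_id": "D-000123",      # pseudonymized identifier (assumed)
        "sex": "F",
        "year_of_birth": 1978,
    },
    "sample": {
        "sample_id": "S-987654",
        "material": "whole blood",
        "collection_date": "2018-05-14",
        "storage_temperature_c": -80,
    },
}

# Serialize for transfer between biobanks...
payload = json.dumps(message)

# ...and parse on the receiving side, checking that both levels are present.
received = json.loads(payload)
assert {"donor", "sample"} <= received.keys()
print(received["sample"]["material"])
```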

