3W Scaffolding in Curriculum of Database Management and Application – Applying the Human-Centered Computing Systems

Author(s):  
Min-Huei Lin ◽  
Ching-Fan Chen

Author(s):  
Baoning Niu ◽  
Patrick Martin ◽  
Wendy Powley

Workload management is the discipline of effectively managing, controlling, and monitoring workflow across computing systems. It is an increasingly important requirement of database management systems (DBMSs) in view of the trends toward server consolidation and more diverse workloads. Workload management is necessary so that the DBMS can be business-objective oriented, can provide efficient differentiated service at fine granularity, and can maintain high resource utilization with low management costs. We see workload management shifting from offline planning to online adaptation. In this paper we discuss the objectives of workload management in autonomic DBMSs and provide a framework for examining how current workload management mechanisms match up with these objectives. We then use the framework to study several mechanisms from both DBMS products and research efforts. We also propose directions for future work in the area of workload management for autonomic DBMSs.
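The kind of differentiated, admission-controlled service described above can be sketched in miniature. The class, thresholds, and service classes below are hypothetical illustrations, not taken from any DBMS product the paper surveys:

```python
import heapq

class WorkloadManager:
    """Toy admission controller: classifies queries by service class and
    admits them only while estimated resource usage stays under a cap."""

    def __init__(self, capacity=100):
        self.capacity = capacity  # abstract resource units
        self.in_use = 0
        self.queue = []           # (priority, seq, query, cost) min-heap
        self.seq = 0

    def submit(self, query, cost, priority):
        """Lower priority number = more important (e.g. OLTP before reports)."""
        heapq.heappush(self.queue, (priority, self.seq, query, cost))
        self.seq += 1

    def admit(self):
        """Admit queued queries, highest priority first, while capacity allows."""
        admitted = []
        while self.queue and self.in_use + self.queue[0][3] <= self.capacity:
            _, _, query, cost = heapq.heappop(self.queue)
            self.in_use += cost
            admitted.append(query)
        return admitted

    def complete(self, cost):
        """Release resources when a query finishes."""
        self.in_use -= cost

mgr = WorkloadManager(capacity=10)
mgr.submit("short OLTP txn", cost=2, priority=0)
mgr.submit("large report", cost=9, priority=1)
admitted = mgr.admit()  # the report must wait: admitting it would exceed capacity
```

An online-adaptive manager would additionally adjust `capacity` and the cost estimates from observed performance, which is the shift from offline planning the paper highlights.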






2009 ◽  
pp. 205-211
Author(s):  
W. Brett McKenzie

“Big ideas” drive the disciplines. In biology, Darwin’s insights generated evolutionary theory. In chemistry, Mendeleev’s vision of the organization of the elements predicted subsequent discoveries. In computing, the database and its associated database management system (DBMS) are one of the “big ideas”. The database was conceptually possible before the development of the computer, but it was the digital computer that made the database the common tool it is today. The core idea of the database is the distinction between the description of the data and the data itself. Among other things, this idea makes the Web possible and has made new fields of discovery manageable, such as modeling the human genome.
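The core idea, separating the data description from the data, is easy to see in any relational DBMS: the schema is stored and queryable independently of the rows it describes. A minimal sketch using Python's built-in SQLite binding (table and values are illustrative, loosely echoing the genome example):

```python
import sqlite3

# The "data description" (schema) is defined separately from the data itself:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE gene (symbol TEXT PRIMARY KEY, chromosome TEXT)")
conn.execute("INSERT INTO gene VALUES ('BRCA1', '17'), ('TP53', '17')")

# The DBMS can answer questions about the description without touching the data:
columns = [row[1] for row in conn.execute("PRAGMA table_info(gene)")]

# ...and about the data, using the description to interpret it:
count = conn.execute(
    "SELECT COUNT(*) FROM gene WHERE chromosome = '17'"
).fetchone()[0]
```

Because queries are written against the description rather than the physical storage, the data can be reorganized without rewriting every program that uses it.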



Author(s):  
Camilo Porto Nunes ◽  
Cláudio de Souza Baptista ◽  
Marcus Costa Sampaio

Computing systems have become more complex, and there is a plethora of systems on heterogeneous, autonomous platforms, from mainframes to mobile devices, that need to interoperate and lack effective management. This complexity has demanded huge investments to enable these systems to work properly: investments in software acquisition and installation, as well as management, administration, and updates. These costs make up the Total Cost of Ownership (TCO), which tends to grow exponentially with software complexity. Information Technology (IT) focuses mainly on providing information services in order to achieve simplicity, agility, broad access to information, and competitiveness. Database Management Systems (DBMSs) are part of the IT infrastructure in large, medium, and even small enterprises.



Author(s):  
Douglas L. Dorset ◽  
Barbara Moss

A number of computing systems devoted to the averaging of electron images of two-dimensional macromolecular crystalline arrays have facilitated the visualization of negatively stained biological structures. Either by simulation of optical filtering techniques or, in more refined treatments, by cross-correlation averaging, an idealized representation of the repeating asymmetric structure unit is constructed, eliminating image distortions due to radiation damage, stain irregularities, and, in the latter approach, imperfections and distortions in the unit cell repeat. In these analyses it is generally assumed that the electron scattering from the thin, negatively stained object is well approximated by a phase object model. Even when absorption effects are considered (i.e., “amplitude contrast”), the expansion of the transmission function, q(x,y) = exp(iσφ(x,y)), does not exceed the first (kinematical) term. Furthermore, in the reconstruction of electron images, kinematical phases are applied to diffraction amplitudes and obey the constraints of the plane group symmetry.
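The truncation mentioned above is the weak-phase-object approximation: the exponential transmission function is expanded and cut off after its linear term,

```latex
q(x,y) = \exp\!\bigl(i\sigma\phi(x,y)\bigr)
       \approx 1 + i\sigma\phi(x,y),
```

so the scattered amplitude is linear in the projected potential \(\phi(x,y)\) and single (kinematical) scattering dominates, which is what justifies applying kinematical phases directly to the diffraction amplitudes during reconstruction.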





1996 ◽  
Vol 35 (01) ◽  
pp. 52-58 ◽  
Author(s):  
A. Mavromatis ◽  
N. Maglaveras ◽  
A. Tsikotis ◽  
G. Pangalos ◽  
V. Ambrosiadou ◽  
...  

Abstract: An object-oriented medical database management system is presented for a typical cardiologic center, facilitating epidemiological trials. Object-oriented analysis and design were used for the system design, offering advantages for the integrity and extensibility of medical information systems. The system was developed using object-oriented design and programming methodology, the C++ language, and the Borland Paradox Relational Database Management System in an MS Windows NT environment. Particular attention was paid to system compatibility, portability, ease of use, and the suitable design of the patient record so as to support the decisions of medical personnel in cardiovascular centers. The system was designed to accept complex, heterogeneous, distributed data in various formats and from different kinds of examinations such as Holter, Doppler, and electrocardiography.
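A patient record that accepts heterogeneous examination data, as the abstract describes, can be sketched with two small classes. This is a hypothetical illustration in Python (the original system was built in C++ over Paradox), with invented field names:

```python
from dataclasses import dataclass, field

@dataclass
class Examination:
    kind: str   # e.g. "Holter", "Doppler", "ECG"
    data: dict  # payload format varies by examination type

@dataclass
class PatientRecord:
    patient_id: str
    examinations: list = field(default_factory=list)

    def add_examination(self, kind, **data):
        """Attach an examination of any kind; its fields are not fixed in advance."""
        self.examinations.append(Examination(kind, data))

    def by_kind(self, kind):
        """Retrieve all examinations of one type for review or trials."""
        return [e for e in self.examinations if e.kind == kind]

rec = PatientRecord("P-001")
rec.add_examination("Holter", duration_h=24, events=3)
rec.add_examination("Doppler", vessel="carotid", velocity_cm_s=85)
holter = rec.by_kind("Holter")
```

Keeping the per-examination payload open-ended is one way to absorb data "in various formats"; a production design would add validation per examination type.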



TAPPI Journal ◽  
2015 ◽  
Vol 14 (1) ◽  
pp. 51-60
Author(s):  
HONGHI TRAN ◽  
DANNY TANDRA

Sootblowing technology used in recovery boilers originated from that used in coal-fired boilers. It started with manual cleaning with hand lancing and hand blowing, and evolved slowly into online sootblowing using retractable sootblowers. Since 1991, intensive research and development has focused on sootblowing jet fundamentals and deposit removal in recovery boilers. The results have provided much insight into sootblower jet hydrodynamics, how a sootblower jet interacts with tubes and deposits, and factors influencing its deposit removal efficiency, and have led to two important innovations: fully-expanded sootblower nozzles that are used in virtually all recovery boilers today, and the low pressure sootblowing technology that has been implemented in several new recovery boilers. The availability of powerful computing systems, superfast microprocessors and data acquisition systems, and versatile computational fluid dynamics (CFD) modeling capability in the past two decades has also contributed greatly to the advancement of sootblowing technology. High quality infrared inspection cameras have enabled mills to inspect the deposit buildup conditions in the boiler during operation, and helped identify problems with sootblower lance swinging and superheater platens and boiler bank tube vibrations. As the recovery boiler firing capacity and steam parameters have increased markedly in recent years, sootblowers have become larger and longer, and this can present a challenge in terms of both sootblower design and operation.


