Digital Imaging in the Materials Engineering Laboratory

1997 ◽  
Vol 5 (3) ◽  
pp. 20-21
Author(s):  
Theodore M. Clarke

The recent completion of a company-wide PC network, supported by network servers intended to rapidly handle large data files from CAD systems, made implementation of digital imaging in the Corporate Materials Engineering laboratory of Case Corporation a wise investment. Laboratory reports in Microsoft Word with linked image files can be rapidly received on any of the networked PCs and are archived in the server system. The changeover from film imaging to digital imaging was very sudden. This change was accelerated by satisfied users, whose work was made easier and more productive, and by satisfied customers within Case Corporation.

Author(s):  
Bill Trevillion

Radian Corporation has developed extensive data display capabilities to analyze vibration and acoustic data from structures and rotating equipment. The Machinery Interactive Display and Analysis System (MIDAS) displays data collected through its own acquisition functions. The graphics capabilities include displaying spectra in three-dimensional waterfall and X-Y formats. Both types of plots can relate vibrations to time, equipment speed, or process parameters. Using menu-driven parameter selection, data can be displayed in the formats that are most useful for analysis. The system runs on a popular minicomputer and can be used with a great variety of graphics terminals, workstations, and printer/plotters. The software was designed and written for interactive display and plotting, and a batch plotting mode facilitates automatic plotting of large data files. The user can define display formats for the analysis of noise and vibration problems in the electric utility, chemical processing, paper, and automotive industries. This paper describes the history and development of the graphics capabilities of the MIDAS system. The system, as illustrated in the examples, has proven efficient and economical for displaying large quantities of data.


2007 ◽  
Vol 9 (2) ◽  
Author(s):  
P. L. Wessels ◽  
L. P. Steenkamp

One of the critical issues in managing information within an organization is to ensure that proper controls exist and are applied when allowing people access to information. Passwords are used extensively as the main control mechanism to identify users wanting access to systems, applications, data files, network servers, or personal information. In this article, the issues involved in selecting and using passwords are discussed, and the current practices employed by users in creating and storing passwords to gain access to sensitive information are assessed. The results of this survey indicate that information managers cannot rely on users alone to employ proper password control to protect sensitive information.
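For illustration, password-composition rules of the kind such surveys assess can be approximated with a simple rule-based strength check; the specific rules below (length, mixed case, digits, special characters) and the three-tier classification are assumptions for the example, not the criteria used in the article.

```python
import re

def password_strength(pw: str) -> str:
    """Classify a password as 'weak', 'medium', or 'strong' using
    simple composition rules: length, mixed case, digits, specials."""
    score = 0
    if len(pw) >= 8:
        score += 1
    if re.search(r"[a-z]", pw) and re.search(r"[A-Z]", pw):
        score += 1
    if re.search(r"\d", pw):
        score += 1
    if re.search(r"[^a-zA-Z0-9]", pw):
        score += 1
    if score <= 1:
        return "weak"
    if score <= 3:
        return "medium"
    return "strong"
```

A policy built on rules like these still depends on users choosing and storing passwords sensibly, which is the survey's point.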


2021 ◽  
Vol 14 (2) ◽  
pp. 268-277
Author(s):  
Etza Nofarita

Security is a factor that must be considered in the operation of information systems, with the aims of preventing threats to the system and of detecting and correcting any damage to it. A Distributed Denial of Service (DDoS) attack is carried out by an individual or group to disrupt a service, typically by directing malicious packets at a server so that they damage the network system in use. Security is mandatory in a network to avoid damage to the data system or loss of data to malicious actors. The attack packets arrive continuously, consuming the available bandwidth. Common DDoS forms are the Ping of Death, flooding, remote-controlled attacks, the UDP flood, and the Smurf attack. The goal of studying DDoS is to protect systems against these threats and to repair damaged systems. Computer network security is therefore very important for safeguarding user data, whether small or large.
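As a rough illustration of how the volumetric floods listed above can be spotted, the sketch below flags sources whose packet rate exceeds a threshold; the per-source rate heuristic and its parameters are assumptions for illustration, not a method from the paper.

```python
from collections import Counter

def flag_flood_sources(packets, window_seconds, threshold):
    """Flag sources whose average packet rate over the observation
    window exceeds `threshold` packets per second.

    packets: iterable of (source_ip, timestamp) tuples observed in
    a window of `window_seconds` seconds.
    Returns the flagged source IPs, sorted.
    """
    counts = Counter(src for src, ts in packets)
    rates = {src: n / window_seconds for src, n in counts.items()}
    return sorted(src for src, r in rates.items() if r > threshold)
```

Real DDoS mitigation also has to handle spoofed and distributed sources, which a single per-source counter like this cannot, but the rate threshold conveys the basic idea of volumetric detection.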


2016 ◽  
Vol 13 (1) ◽  
pp. 181-186 ◽  
Author(s):  
Dawna M. Drum ◽  
Andrew Pulvermacher

Modern organizations are inundated with data, and they often struggle to organize it efficiently and effectively in order to get the most value from it. The context of this case is thus situated in current business practice. Students are given large data files extracted from an enterprise system and must use Microsoft Access and Excel to summarize and organize the data into a dynamic profit and loss statement. Basic Excel skills and general accounting knowledge are assumed; Access knowledge is not. The Teaching Notes provide solutions and are organized to allow instructors to provide minimal guidance or fully annotated directions.
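The summarization the students perform in Access and Excel can be sketched in Python for illustration; the `account_type` field and the revenue/expense split below are assumed for the example, not the case's actual data schema.

```python
from collections import defaultdict

def profit_and_loss(transactions):
    """Summarize transaction records into a simple profit and loss view.

    transactions: list of dicts with an 'account_type' key
    ('revenue' or 'expense') and an 'amount' key.
    Returns totals per account type plus net income.
    """
    totals = defaultdict(float)
    for t in transactions:
        totals[t["account_type"]] += t["amount"]
    totals["net_income"] = totals["revenue"] - totals["expense"]
    return dict(totals)
```

In the case itself, the grouping step corresponds to an Access aggregate query and the dynamic presentation to an Excel pivot; the logic is the same group-then-subtract summarization.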


1997 ◽  
Vol 3 (S2) ◽  
pp. 931-932 ◽  
Author(s):  
Ian M. Anderson ◽  
Jim Bentley

Recent developments in instrumentation and computing power have greatly improved the potential for quantitative imaging and analysis. For example, products are now commercially available that allow the practical acquisition of spectrum images, where an EELS or EDS spectrum is acquired at each of a sequence of positions on the specimen. However, such data files typically contain megabytes of information and may be difficult to manipulate and analyze conveniently or systematically. A number of techniques are being explored for analyzing these large data sets. Multivariate statistical analysis (MSA) provides a method for analyzing the raw data set as a whole; the basis of the MSA method has been outlined by Trebbia and Bonnet. MSA has a number of strengths relative to other methods of analysis. First, it is broadly applicable to any series of spectra or images. Applications include characterization of grain boundary segregation (position variation), channeling-enhanced microanalysis (orientation variation), and beam damage (time variation of spectra).
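MSA of spectrum-image data is commonly realized as principal component analysis of the position-by-channel data matrix. The sketch below mean-centres that matrix and extracts the leading component by power iteration; it is a generic illustration under that assumption, not Trebbia and Bonnet's exact formulation.

```python
import math

def first_component(spectra, iters=100):
    """Find the leading principal component of a spectrum image.

    spectra: list of spectra (rows), one per probe position, each a
    list of channel intensities. Mean-centres the data and applies
    power iteration to X^T X to find the dominant spectral variation.
    Returns (loading_vector, explained_variance_fraction).
    """
    n, d = len(spectra), len(spectra[0])
    means = [sum(row[j] for row in spectra) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in spectra]

    def cov_mul(v):
        # (X^T X) v without forming the full d-by-d matrix
        Xv = [sum(x * vj for x, vj in zip(row, v)) for row in X]
        return [sum(X[i][j] * Xv[i] for i in range(n)) for j in range(d)]

    v = [1.0] * d
    for _ in range(iters):
        w = cov_mul(v)
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    lam = sum(vi * wi for vi, wi in zip(v, cov_mul(v)))
    total = sum(x * x for row in X for x in row)  # trace of X^T X
    return v, lam / total
```

Subsequent components would be obtained by deflation or a full SVD; for megabyte-scale spectrum images a library SVD is the practical route, but the decomposition is the same.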


2019 ◽  
Vol 16 (9) ◽  
pp. 3824-3829
Author(s):  
Deepak Ahlawat ◽  
Deepali Gupta

Due to advances in the technological world, there has been a great surge in data, generated mainly by social websites, internet sites, and similar sources. Large data files are combined to create a big data architecture, and managing data files at such volume is not easy, so modern techniques have been developed to manage bulk data. To organize and utilize such big data, Hadoop's Distributed File System (HDFS) architecture is used when traditional methods are insufficient to manage the data. In this paper, a novel clustering algorithm is implemented to manage a large amount of data. The concepts and frameworks of big data are studied, and a novel algorithm is developed using K-means clustering with cosine-based similarity. The developed clustering algorithm is evaluated using the precision and recall parameters, and the results obtained show that it successfully manages the big data issue.
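The paper's clustering combines K-means with cosine similarity; a minimal sketch of that combination (spherical k-means, with a deterministic first-k initialization assumed here since the abstract does not specify one) is:

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two vectors (0.0 for a zero vector)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def kmeans_cosine(points, k, iters=20):
    """K-means that assigns each point to the centroid with the
    highest cosine similarity (spherical k-means).
    Returns (centroids, clusters)."""
    centroids = [list(p) for p in points[:k]]  # first-k initialization
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            best = max(range(k), key=lambda i: cosine_sim(p, centroids[i]))
            clusters[best].append(p)
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties
                centroids[i] = [sum(c) / len(members) for c in zip(*members)]
    return centroids, clusters
```

On large HDFS-resident data the assignment step would be distributed (e.g. as a MapReduce job), but the similarity measure and update rule are as above.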


1990 ◽  
Vol 73 (7) ◽  
pp. 1945-1955 ◽  
Author(s):  
V. Ducrocq ◽  
D. Boichard ◽  
B. Bonaiti ◽  
A. Barbat ◽  
M. Briend
