The Role of the Operating System in Computer/System Communication

2013
Author(s):
Valentine Owan

2015
Vol 1115
pp. 484-487
Author(s):
Muhammad Sami
Akram M. Zeki

The aim of this study is to create and assemble a system by customizing and building a Linux kernel and environment that is compatible and efficient on a mini-ITX computer. The objective is to create a customized, lightweight GNU/Linux operating system for use on an in-vehicle (car) computer. The system also optimizes the size and the set of functionalities most likely to be implemented on a car computer system.

Keywords: mini-ATX, CarPC, Linux, Ubuntu, Qt, QML


Author(s):
Eleanor Callahan Hunt
Sara Breckenridge Sproat
Rebecca Rutherford Kitzmiller

Author(s):
Syahrizal Dwi Putra
M Bahrul Ulum
Diah Aryani

An expert system, which is part of artificial intelligence, is a computer system able to imitate the reasoning of an expert with specific expertise. In the form of software, an expert system can replace the role of a human expert in the decision-making process, based on the symptoms provided, up to a certain level of certainty. This study addresses a problem many women experience: not realizing that they have uterine myomas. Many women do not understand, and are not aware, that the symptoms they already feel indicate the presence of uterine myomas in their bodies. It is therefore important for women to be able to perform a self-diagnosis so that they can seek treatment as early as possible. In this study, the expert first provides the expert CF values. The user (respondent) then gives an assessment of her own condition as the user CF values. Finally, the values obtained from these two factors are processed using the certainty factor formula. Users must answer all questions given by the system according to their current condition. After all the questions are answered, the system displays the result, identifying whether or not the user is suffering from uterine myoma. The expert system with the certainty factor method was tested with a patient who entered the symptoms experienced and obtained a confidence level for uterine myomas/fibroids of 98.70%. These results indicate that an expert system with the certainty factor method can be used to help diagnose uterine myomas as early as possible.
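The certainty factor computation the abstract describes can be sketched as follows. This is a minimal illustration assuming the standard MYCIN-style CF combination rule; the symptom names and CF values are hypothetical and do not reproduce the study's actual rule base.

```python
# Minimal sketch of the certainty factor (CF) combination described above.
# The symptoms and CF values below are hypothetical illustrations,
# not the rule base used in the study.

def combine_cf(cf_values):
    """Combine per-symptom CFs with the classic MYCIN-style rule:
    CF_combined = CF_old + CF_new * (1 - CF_old), assuming all CFs >= 0."""
    combined = 0.0
    for cf in cf_values:
        combined = combined + cf * (1.0 - combined)
    return combined

# Each rule's CF is the expert's CF weighted by the user's self-assessment CF.
rules = [
    {"symptom": "pelvic pain",        "cf_expert": 0.8, "cf_user": 0.6},
    {"symptom": "heavy menstruation", "cf_expert": 0.9, "cf_user": 0.8},
    {"symptom": "frequent urination", "cf_expert": 0.6, "cf_user": 0.4},
]

per_rule_cf = [r["cf_expert"] * r["cf_user"] for r in rules]
confidence = combine_cf(per_rule_cf)
print(f"Confidence of uterine myoma: {confidence * 100:.2f}%")
```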


2021
Vol 17 (2)
Author(s):
Kisron Kisron
Bima Sena Bayu Dewantara
Hary Oktavianto

In a vision-based real-time detection system using computer vision, the most important factor to consider is computation time. In general, a detection system runs heavy algorithms that strain the performance of a computer system, especially if the computer has to handle two or more different detection processes. This paper presents an effort to improve the performance of the trash detection system and the target-partner detection system of a trash-bin robot with social interaction capabilities. The trash detection system uses a combination of the Haar Cascade algorithm, Histogram of Oriented Gradients (HOG), and the Gray-Level Co-occurrence Matrix (GLCM). The target-partner detection system uses a combination of depth information and the Histogram of Oriented Gradients (HOG) algorithm. The Robot Operating System (ROS) is used to run each system as a separate module, with the aim of utilizing all available computer system resources while reducing computation time. As a result, on the ROS platform the trash detection system runs at 7.003 fps and the human target detection system runs at 8.515 fps. In line with the increase in fps, for trash detection the accuracy increases to 77%, precision to 87.80%, recall to 82.75%, and F1-score to 85.20%; for the human target detection system, accuracy improves to 81%, precision to 91.46%, recall to 86.20%, and F1-score to 88.42%.
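The modular design described here can be pictured as one ROS node per detector, each running as its own process. The rospy sketch below is a minimal illustration only; the node and topic names, message type, and loop rate are assumptions, not taken from the paper, and the partner detector would run as a second, analogous node.

```python
#!/usr/bin/env python
# Minimal sketch of running a detector as its own ROS node, assuming ROS 1
# with rospy. Node/topic names and message type are illustrative only.
import rospy
from std_msgs.msg import String

def trash_detector_node():
    rospy.init_node("trash_detector")          # one process per detector
    pub = rospy.Publisher("trash_detections", String, queue_size=10)
    rate = rospy.Rate(10)                      # target loop rate in Hz
    while not rospy.is_shutdown():
        # A real node would grab a camera frame here and run
        # Haar Cascade + HOG + GLCM; we publish a placeholder result.
        pub.publish(String(data="no_trash"))
        rate.sleep()

if __name__ == "__main__":
    try:
        trash_detector_node()
    except rospy.ROSInterruptException:
        pass
```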


2019
pp. 1482-1499
Author(s):
Leanne Hirshfield
Philip Bobko
Alex J. Barelka
Mark R. Costa
Gregory J. Funke
...

Despite the importance that human error in the cyber domain has had in recent reports, cyber warfare research to date has largely focused on the effects of cyber attacks on the target computer system. In contrast, there is little empirical work on the role of human operators during cyber breaches. More specifically, there is a need to understand the human-level factors at play when attacks occur. This paper views cyber attacks through the lens of suspicion, a construct that has been used in other contexts, but inadequately defined, in prior research. After defining the construct of suspicion, the authors demonstrate the role that suspicion plays as the conduit between computer operators' normal working behaviors and their ability to alter that behavior to detect and react to cyber attacks. With a focus on the user, rather than the target computer, the authors empirically develop a latent structure for a variety of types of cyber attacks, link that structure to levels of operator suspicion, link suspicion to users' cognitive and emotional states, and develop initial implications for cyber training.


Author(s):
Stephanie A. E. Guerlain
Philip J. Smith

A testbed was developed for studying the effects of different computer system designs on human-computer team problem-solving, using the real-world task of antibody identification. The computer interface was designed so that practitioners could solve antibody identification cases using the computer as they normally would using paper and pencil. A rule base was then encoded into the computer so that it had the knowledge needed to apply a heuristic strategy that is often helpful for solving cases. With this testbed, studies were run comparing different computer system designs. A critiquing system was found to be better than a partially automated system on cases where the computer's knowledge was incompetent.
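To make the distinction concrete, the sketch below contrasts a partially automated answer with a critique of the user's answer, assuming a toy rule-out rule base; it does not reproduce the study's actual antibody-identification knowledge or heuristic strategy.

```python
# Minimal sketch contrasting partial automation with critiquing, using a
# toy rule base; the study's real rules and strategy are not reproduced.

RULE_BASE = {
    # antigen with a negative reaction -> antibodies a simple
    # "rule-out" heuristic would exclude (toy mapping)
    "E": {"anti-E"},
    "K": {"anti-K"},
}

def automated_answer(negative_reactions):
    """Partial automation: the system proposes its own exclusions."""
    excluded = set()
    for antigen in negative_reactions:
        excluded |= RULE_BASE.get(antigen, set())
    return excluded

def critique(user_exclusions, negative_reactions):
    """Critiquing: the user keeps the initiative; the system only flags
    exclusions that its rule base cannot justify."""
    justified = automated_answer(negative_reactions)
    return user_exclusions - justified

# Example: the user rules out anti-E and anti-Jka, but only anti-E is
# supported by the toy rule base, so anti-Jka is flagged for review.
print(critique({"anti-E", "anti-Jka"}, negative_reactions={"E"}))
```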


1989
Vol 11 (3)
pp. 119-123
Author(s):
Arthur A. Eggert
Kenneth A. Emmerich
Thomas J. Blankenheim
Gary J. Smulka

Improvements in the performance of a laboratory computer system do not necessarily require the replacement of major portions of the system and may not require the acquisition of any new hardware at all. Major bottlenecks may exist in the way the operating system manages its resources and in the algorithm used for timesharing decisions. Moreover, significant throughput improvements may be attainable by switching to a faster storage device if substantial disk activity is performed. In this study, the fractions of time used for each of the types of tasks a laboratory computer system performs (e.g. application programs, disk transfer, queue cycler) are defined and measured. Methods for reducing the time fractions of the various types of overhead are evaluated through before-and-after studies. The combined results of the three studies indicated that a 50% improvement could be gained through system tuning and faster storage, without replacement of the computer itself.
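As a rough illustration of measuring such time fractions, the sketch below tallies sampled task-type observations into percentages; the sample data and task labels are invented and are not the study's measurements.

```python
# Minimal sketch of turning sampled task observations into time fractions.
# The sample data below are invented for illustration only.
from collections import Counter

samples = [
    "application", "application", "disk_transfer", "queue_cycler",
    "application", "disk_transfer", "idle", "application",
]

counts = Counter(samples)
total = sum(counts.values())
for task, n in counts.most_common():
    print(f"{task:15s} {n / total:6.1%} of sampled time")
```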


First Monday
1997
Author(s):
Jussara M. Almeida
Virgilio Almeida
David J. Yates

Server performance has become a crucial issue for improving the overall performance of the World-Wide Web. This paper describes WebMonitor, a tool for evaluating and understanding server performance, and presents new results for realistic workloads. WebMonitor measures activity and resource consumption, both within the kernel and in HTTP processes running in user space. WebMonitor is implemented using an efficient combination of sampling and event-driven techniques that exhibits low overhead. Our initial implementation is for the Apache World-Wide Web server running on the Linux operating system. We demonstrate the utility of WebMonitor by measuring and understanding the performance of a Pentium-based PC acting as a dedicated WWW server. Our workloads use file size distributions with a heavy tail. This captures the fact that Web servers must concurrently handle some requests for large audio and video files and a large number of requests for small documents containing text or images. Our results show that in a Web server saturated by client requests, up to 90% of the time spent handling HTTP requests is spent in the kernel. These results emphasize the important role of operating system implementation in determining Web server performance. They also suggest the need for new operating system implementations designed to perform well when running Web servers.
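As an illustration of such a heavy-tailed workload, the sketch below draws file sizes from a Pareto distribution; the minimum size and tail index are assumptions, not the paper's actual workload parameters.

```python
# Minimal sketch of a heavy-tailed file-size workload, assuming a Pareto
# distribution; parameters are illustrative, not the paper's.
import random

random.seed(0)
MIN_SIZE = 1_000          # assumed minimum document size in bytes
ALPHA = 1.2               # assumed tail index; smaller => heavier tail

sizes = [int(MIN_SIZE * random.paretovariate(ALPHA)) for _ in range(10_000)]

sizes.sort()
print(f"median size : {sizes[len(sizes) // 2]:>12,} bytes")   # many small docs
print(f"largest size: {sizes[-1]:>12,} bytes")                # a few huge files
```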

