EMRlog Method for Computer Security for Electronic Medical Records with Logic and Data Mining

2015 ◽  
Vol 2015 ◽  
pp. 1-12
Author(s):  
Sergio Mauricio Martínez Monterrubio ◽  
Juan Frausto Solis ◽  
Raúl Monroy Borja

The proper functioning of a hospital computer system is arduous work for managers and staff. Inconsistent policies are nevertheless frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of part or all of the hospital's data. This paper presents a new method, named EMRlog, for computer security systems in hospitals. EMRlog focuses on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information, such as databases, applications, and medical records. First, a syntactic verification step is applied using predicate logic. Then data mining techniques are used to detect which security policies have actually been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition, the two subsets are contrasted and validated. This is performed by an automatic theorem prover. In this way, many kinds of vulnerabilities can be removed, yielding a safer computer system.
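The contrast between directive and implemented policies can be illustrated with a minimal sketch. EMRlog itself uses predicate logic and an automatic theorem prover; the toy version below only flags direct contradictions, and every role, action, and resource name in it is hypothetical:

```python
# Hypothetical sketch: each policy atom is a (role, action, resource) triple
# mapped to an allow/deny decision. EMRlog's real machinery is predicate logic
# plus a theorem prover; this toy only detects direct contradictions between
# the directive policies and the policies actually implemented by staff.
directive = {
    ("nurse", "read", "medical_record"): True,
    ("intern", "delete", "medical_record"): False,
}
implemented = {
    ("nurse", "read", "medical_record"): True,
    ("intern", "delete", "medical_record"): True,  # contradicts the directive
}

def inconsistencies(directive, implemented):
    """Return the policy atoms whose implemented decision contradicts the directive."""
    return {atom for atom, allowed in directive.items()
            if atom in implemented and implemented[atom] != allowed}

print(inconsistencies(directive, implemented))  # the intern/delete conflict
```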

2017 ◽  
Vol 25 (5) ◽  
pp. 1585-1601
Author(s):  
Wesam S Bhaya ◽  
Mustafa A Ali

Malicious software (malware) is any software or code that steals private information or data from a computer system, interferes with computer operations, or otherwise carries out the malicious goals of its author on the computer system without the users' permission. Malware detection has become one of the biggest issues in the computer security field, because current communication infrastructures are vulnerable to penetration by many types of malware infection strategies and attacks. Moreover, malware is diverse and growing in volume and type, which sharply limits the effectiveness of traditional defence methods such as the signature approach, which is unable to detect new malware. This vulnerability can lead to successful computer system penetration as well as to more advanced attacks such as distributed denial of service (DDoS). Data mining methods can be used to overcome the limitations of signature-based techniques and detect zero-day malware. This paper provides an overview of malware and of malware detection systems that use modern techniques, such as data mining approaches, to detect both known and unknown malware samples.
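As a concrete, drastically simplified illustration of the data-mining idea, the sketch below labels a byte sample by nearest-neighbour similarity over byte-bigram frequency features. Real detectors extract far richer static and dynamic features; the corpus, samples, and labels here are all invented:

```python
from collections import Counter

def byte_bigrams(data):
    """Byte-bigram frequencies -- a simple static feature used in malware mining."""
    return Counter(data[i:i + 2] for i in range(len(data) - 1))

def cosine(a, b):
    """Cosine similarity between two sparse feature vectors (Counters)."""
    dot = sum(v * b[k] for k, v in a.items())
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def classify(sample, labelled):
    """Label a sample with its nearest labelled neighbour in feature space."""
    feats = byte_bigrams(sample)
    return max(labelled, key=lambda pair: cosine(feats, byte_bigrams(pair[0])))[1]

# Invented toy corpus: a NOP-sled-like byte pattern vs. plain text.
corpus = [(b"\x90\x90\x90\x90evil", "malware"), (b"hello world", "benign")]
print(classify(b"\x90\x90\x90evil!", corpus))  # nearest neighbour is the malware sample
```

Unlike a signature match, the nearest-neighbour decision still fires on a sample whose bytes only resemble, rather than equal, a known specimen, which is the property that lets such methods flag previously unseen variants.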


2005 ◽  
Vol 3 (3) ◽  
pp. 8-12
Author(s):  
Edward A. Schmalz

Security has become a matter of utmost importance since the aftermath of September 11th, especially in the area of computer systems. There are many steps a health educator can take as an individual to help secure the computer systems at their worksite and in their home. This article is a brief overview of some of the precautions that should be taken on a daily basis to protect your computer systems, files, data, and other pertinent information.


2020 ◽  
Vol 5 (1) ◽  
pp. 26-34
Author(s):  
Amsar

No online server can be fully secured against the various external attacks that attempt to infiltrate the system. Server security is the administrator's most important defence against intruders attacking computer systems; among the available attack methods, brute force is commonly deployed against computer security systems connected to the internet. In this study, the port knocking method is used to prevent attacks from penetrating the server system; it serves as an authentication step that lets the administrator secure the system against brute-force attackers targeting specific services such as the SSH server and FTP server. Before users can reach the SSH and FTP services, knockd acts as a port knocking daemon, ready to receive port knocking authentication from the user and then rewrite the firewall, so that the protected services are shielded from attacks that try to infiltrate the server system. Iptables and Uncomplicated Firewall (UFW) are used to build firewall rules that deny connections to the SSH and FTP services. When an unknown user accesses SSH or FTP without first authenticating, the firewall rejects the connection; but if the user passes the port knocking authentication stage by sending SYN packets to the ports configured in the knocking daemon, the daemon rewrites the firewall so that the user can connect to the SSH and FTP services.
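The client side of a knock can be sketched as plain TCP connection attempts, each of which emits the SYN packet the daemon is listening for. The host and the secret sequence below are illustrative; the daemon and firewall configuration live on the server:

```python
import socket

def knock(host, sequence, timeout=0.5):
    """Attempt a TCP connection (thereby sending a SYN) to each port in order.

    A knock daemon such as knockd watches for this exact sequence and, on a
    match, rewrites the firewall to admit the client's IP to SSH/FTP.
    """
    for port in sequence:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            try:
                sock.connect((host, port))
            except OSError:
                pass  # refused or silently dropped is expected: only the SYN matters
    return list(sequence)

# Illustrative values: knock first, then connect to the now-open SSH port.
knock("127.0.0.1", [7000, 8000, 9000])
```

Because the firewall drops the knock packets themselves, the sequence is invisible to a casual port scan; only a client that already knows the sequence can cause the daemon to open the protected ports.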


2020 ◽  
Author(s):  
S Mukhtar Ayubi Simatupang

Abstract: Brainware is the term used for the humans involved with a computer system. Humans are the element of a computer system that designs how a machine can work to produce the desired results. Brainware levels consist of system analyst, programmer, administrator, and operator. Brainware roles include computer operator, technician, trainer, consultant, project manager, programmer, graphic designer, network specialist, database administrator, and systems analyst. Keywords: Brainware (Human Resources)


Author(s):  
Joshua A. Kroll

This chapter addresses the relationship between AI systems and the concept of accountability. To understand accountability in the context of AI systems, one must begin by examining the various ways the term is used and the variety of concepts to which it is meant to refer. Accountability is often associated with transparency, the principle that systems and processes should be accessible to those affected through an understanding of their structure or function. For a computer system, this often means disclosure about the system’s existence, nature, and scope; scrutiny of its underlying data and reasoning approaches; and connection of the operative rules implemented by the system to the governing norms of its context. Transparency is a useful tool in the governance of computer systems, but only insofar as it serves accountability. There are other mechanisms available for building computer systems that support accountability of their creators and operators. Ultimately, accountability requires establishing answerability relationships that serve the interests of those affected by AI systems.


1985 ◽  
Vol 10 (2) ◽  
pp. 79-86 ◽  
Author(s):  
Anne Costigan ◽  
Frances E. Wood ◽  
David Bawden

A comparative evaluation of three implementations of a large databank, the NIOSH Registry of Toxic Effects of Chemical Substances, has been carried out. The three implementations are: a printed index, a text-searching computer system, and a computerised chemical databank system with substructure searching facilities. Seven test queries were used, with the aim of drawing conclusions of general relevance to chemical databank searching. The computer systems were shown to have advantages over printed indexes for several of the queries, including those involving an element of browsing. Substructure search facilities were especially advantageous. Aspects of the indexing of data present, and the criteria for inclusion of types of data, were also highlighted.


BMJ Open ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. e041553
Author(s):  
Enrico de Koning ◽  
Tom E Biersteker ◽  
Saskia Beeres ◽  
Jan Bosch ◽  
Barbra E Backus ◽  
...  

Introduction: Emergency department (ED) overcrowding is a major healthcare problem associated with worse patient outcomes and increased costs. Attempts to reduce ED overcrowding of patients with cardiac complaints have so far focused on in-hospital triage and rapid risk stratification of patients with chest pain at the ED. The Hollands-Midden Acute Regional Triage—Cardiology (HART-c) study aimed to assess the number of patients left at home under usual ambulance care as compared with the new prehospital triage method. This method combines paramedic assessment with expert cardiologist consultation using live monitoring, hospital data and real-time admission capacity.
Methods and analysis: Patients visited by the emergency medical services (EMS) for cardiac complaints are included. EMS consultation consists of medical history, physical examination, vital signs and ECG measurements. All data are transferred to a newly developed platform for the triage cardiologist. Prehospital data, in-hospital medical records and real-time admission capacity are evaluated. A shared decision is then made on whether admission is necessary and, if so, which hospital is most appropriate. To evaluate safety, all patients left at home and their general practitioners (GPs) are contacted for 30-day adverse events.
Ethics and dissemination: The study is approved by the LUMC's Medical Ethics Committee. Patients are asked for consent for contacting their GPs. The main results of this trial will be disseminated in one paper.
Discussion: The HART-c study evaluates the efficacy and feasibility of a prehospital triage method that combines prehospital patient assessment and direct consultation of a cardiologist who has access to live-monitored data, hospital data and real-time hospital admission capacity. We expect this triage method to substantially reduce unnecessary ED visits.


1988 ◽  
Vol 13 (1) ◽  
pp. 25-32 ◽  
Author(s):  
T V Seshadri ◽  
N Kinra

Who, in the organization, buys the computer system? How are various departments involved in the organizational decision process? T V Seshadri and N Kinra analyse the decision processes of 30 organizations that had bought a computer system—mini, mainframe, or macro. Based on a questionnaire study and factor analysis, the authors conclude that the EDP department and Board of Directors are critical in the buying grids of the purchasing organizations. They draw implications of their findings for managers marketing computer systems.


Author(s):  
Mahwish Abid ◽  
Muhammad Usman ◽  
Muhammad Waleed Ashraf

As technology grows very fast and the use of computer systems has increased compared to earlier times, plagiarism is a phenomenon that increases day by day. Plagiarism is the wrongful appropriation of someone else's work. Manual detection of plagiarism is difficult, so the process should be automated. There are various tools that can be used for plagiarism detection; some work on intrinsic plagiarism, while others work on extrinsic plagiarism. Data mining is a field that can help detect plagiarism as well as improve the efficiency of the process. Different data mining techniques can be used to detect plagiarism: text mining, clustering, and bi-gram, tri-gram, and n-gram analysis are techniques that can help in this process.
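The n-gram idea can be sketched as a character tri-gram Jaccard similarity between two texts, where a score near 1 suggests heavy overlap. This is a minimal illustration of the technique, not any particular tool's algorithm:

```python
def char_ngrams(text, n=3):
    """Character n-grams of a lowercased, whitespace-normalized text."""
    text = " ".join(text.lower().split())
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def jaccard(a, b, n=3):
    """Jaccard similarity of the two texts' n-gram sets (1.0 = identical)."""
    ga, gb = char_ngrams(a, n), char_ngrams(b, n)
    union = ga | gb
    return len(ga & gb) / len(union) if union else 0.0

print(jaccard("the quick brown fox", "the quick brown dog"))  # 0.7
```

Because the comparison is over overlapping n-grams rather than whole sentences, light paraphrasing or word reordering still leaves many shared grams, which is why n-gram features are a common basis for extrinsic plagiarism detection.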

