Systematic mapping study of data mining–based empirical studies in cardiology

2017, Vol 25 (3), pp. 741-770
Author(s): Ilham Kadi, Ali Idri, José Luis Fernandez-Aleman

Data mining provides the methodology and technology to transform huge amounts of data into useful information for decision making. It is a powerful process for extracting knowledge and discovering new patterns embedded in large data sets. Data mining has been increasingly used in medicine, particularly in cardiology. In fact, data mining applications can greatly benefit all parties involved in cardiology, such as patients, cardiologists and nurses. This article aims to perform a systematic mapping study so as to analyze and synthesize empirical studies on the application of data mining techniques in cardiology. A total of 142 articles published between 2000 and 2015 were therefore selected, studied and analyzed according to the following four criteria: year and channel of publication, research type, medical task and empirical type. The results of this mapping study are discussed and a list of recommendations for researchers and cardiologists is provided.

2021, pp. 1826-1839
Author(s): Sandeep Adhikari, Dr. Sunita Chaudhary

The exponential growth in the use of computers over networks, as well as the proliferation of applications that operate on different platforms, has drawn attention to network security. This paradigm exploits security flaws in all operating systems that are both technically difficult and costly to fix. As a result, intrusion poses a worldwide threat to a computer resource's credibility, availability, and confidentiality. The Intrusion Detection System (IDS) is critical in detecting network anomalies and attacks. In this paper, the data mining principle is combined with IDS to efficiently and quickly identify important, secret data of interest to the user. The proposed algorithm addresses four issues: data classification, high levels of human interaction, lack of labeled data, and the effectiveness of distributed denial of service attacks. We also develop a decision tree classifier with a variety of parameters. The previous algorithm achieved classification accuracy of up to 90% and was not appropriate for large data sets. Our proposed algorithm was designed to accurately classify large data sets. In addition, we quantify several further decision tree classifier parameters.
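The abstract does not give the classifier's implementation details, but the core mechanism of a decision tree classifier is choosing feature splits that minimize impurity. As a minimal sketch (not the authors' algorithm), the following finds the best Gini-impurity split for a single feature on hypothetical connection records, where the feature name, threshold, and toy data are all illustrative assumptions:

```python
def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(values, labels):
    """Find the threshold on one numeric feature that minimizes the
    weighted Gini impurity of the two resulting partitions."""
    best_t, best_score = None, float("inf")
    n = len(labels)
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = len(left) / n * gini(left) + len(right) / n * gini(right)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# Toy data (hypothetical): packets per second for six connections,
# where label 1 marks a flood-style attack such as DDoS traffic.
pps = [10, 12, 15, 900, 1100, 1500]
attack = [0, 0, 0, 1, 1, 1]

threshold, impurity = best_split(pps, attack)
print(threshold, impurity)  # prints "15 0.0": the split separates classes perfectly
```

A full decision tree applies this split search recursively to each partition over all features; the parameters the paper alludes to (e.g., tree depth or minimum node size) control when that recursion stops.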


Author(s): Jari Vanhanen, Mika V. Mäntylä

Previous systematic literature reviews on pair programming (PP) lack coverage of industrial PP data as well as certain factors of PP such as infrastructure. Therefore, we conducted a systematic mapping study on empirical, industrial PP research. Based on 154 research papers, we built a new PP framework containing 18 factors. We analyzed the previous research on each factor through several research properties. The most thoroughly studied factors in industry are communication, knowledge of work, productivity and quality. Many other factors largely lack comparative data, let alone data from reliable data collection methods such as measurement. Based on these research gaps, further studies would be most valuable for the development process, targets of PP, developers' characteristics, and feelings of work. We propose how these could be studied better. Had the gaps been commonly known, they could have been covered rather easily in the previous empirical studies. Our results help to focus further studies on the most relevant research gaps and to design them based on the previous studies. The results also help to identify the factors for which systematic reviews synthesizing the findings of the primary studies would already be feasible.


2020
Author(s): Esther Nanzayi Ngayua, Jianjia He, Kwabena Agyei-Boahene

Abstract

The increasing demand for new therapies and other clinical interventions has led researchers to conduct many clinical trials. The high level of evidence generated by clinical trials makes them the main approach to evaluating new clinical interventions. The increasing amounts of data to be considered in the planning and conducting of clinical trials have led to higher costs and longer timelines, with low productivity. Advanced technologies including artificial intelligence, machine learning, deep learning, and the internet of things offer an opportunity to improve the efficiency and productivity of clinical trials at various stages. Although researchers have done some tangible work on the application of advanced technologies in clinical trials, the studies are yet to be mapped to give a general picture of the current state of research. This systematic mapping study was conducted to identify and analyze studies published on the role of advanced technologies in clinical trials. A search restricted to the period between 2010 and 2020 yielded a total of 443 articles. The analysis revealed a trend of increasing research interest in the area over the years. Recruitment and eligibility aspects were the main focus of the studies. The main research types were validation and evaluation studies. Most studies contributed methods and theories; hence there exists a gap for architecture, process, and metric contributions. In the future, more empirical studies are expected, given the increasing interest in implementing AI, ML, DL, and IoT in clinical trials.
