Data Set
Recently Published Documents


TOTAL DOCUMENTS: 21803 (FIVE YEARS: 12018)
H-INDEX: 196 (FIVE YEARS: 43)

2022 ◽  
Vol 39 ◽  
pp. 54-68
Author(s):  
Asefeh Faraz Covelli ◽  
Susan Weber Buchholz ◽  
Leanne H. Fowler ◽  
Sharon Beasley ◽  
Mary Beth Bigley

Author(s):  
Jesmeen Mohd Zebaral Hoque ◽  
Jakir Hossen ◽  
Shohel Sayeed ◽  
Chy. Mohammed Tawsif K. ◽  
Jaya Ganesan ◽  
...  

Recently, the healthcare industry has started generating a large volume of data. If hospitals can employ these data, they can predict outcomes and provide better treatments at early stages and at low cost. Data analytics (DA) is used here to make correct decisions through proper analysis and prediction. However, inappropriate data may lead to flawed analysis and thus yield unacceptable conclusions. Hence, transforming improper data within the data set into useful data is essential. Machine learning (ML) techniques were used to overcome the issues caused by incomplete data. A new architecture, automatic missing value imputation (AMVI), was developed to predict missing values in the data set, incorporating data sampling and feature selection. Four prediction models (logistic regression, support vector machine (SVM), AdaBoost, and random forest) were selected from among well-known classification algorithms. The performance of the complete AMVI architecture was evaluated using a structured data set obtained from the UCI repository, and an accuracy of around 90% was achieved. Cross-validation also confirmed that the trained ML model is suitable and not over-fitted. Because the model is trained on the data set itself rather than tied to a specific environment, the approach can be retrained to obtain the best-performing model for whatever data are available.
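A minimal sketch, not the authors' AMVI architecture, of the general workflow the abstract describes: impute missing values, select features, train one of the four named classifiers, and check for over-fitting with cross-validation. The CSV path and column names are hypothetical placeholders.

```python
# Minimal sketch (assumed workflow, not the paper's AMVI implementation).
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

df = pd.read_csv("uci_health_data.csv")          # placeholder for a UCI data set
X, y = df.drop(columns=["target"]), df["target"]

pipeline = Pipeline([
    ("impute", IterativeImputer(random_state=0)),  # predict missing values from other features
    ("select", SelectKBest(f_classif, k=10)),      # simple stand-in for the paper's feature selection
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])

# Cross-validation provides an over-fitting check similar to the one reported in the abstract.
scores = cross_val_score(pipeline, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```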


Author(s):  
Tuğçe Ayhan ◽  
Tamer Uçar

The demand for credit is increasing constantly. Banks are looking for credit evaluation methods that provide the most accurate results in a shorter period in order to minimize their rising risks. This study focuses on methods that enable banks to increase their asset quality in the credit allocation process without losing market share. These methods allow loan applications to be evaluated automatically in line with sector practices and credit policies/strategies to be determined based on actual needs. Within the scope of this study, the relationship between the predetermined attributes and the credit limit outputs is analyzed using a sample data set of consumer loans. Random forest (RF), sequential minimal optimization (SMO), PART, decision table (DT), J48, multilayer perceptron (MP), JRip, naïve Bayes (NB), one rule (OneR), and zero rule (ZeroR) algorithms were used in this process. As a result of this analysis, SMO, PART, and random forest emerged as the top three approaches for determining customer credit limits.
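The algorithm names above come from Weka; a rough sketch of the same comparison using scikit-learn analogues (SVC for SMO, a decision tree for J48, a dummy classifier for ZeroR) is given below. The loan CSV and its columns are hypothetical, and the analogues are approximations, not the study's exact implementations.

```python
# Rough sketch of a classifier comparison on a hypothetical consumer-loan data set.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.dummy import DummyClassifier

df = pd.read_csv("consumer_loans.csv")           # hypothetical sample data set
X, y = df.drop(columns=["credit_limit_class"]), df["credit_limit_class"]

models = {
    "RF": RandomForestClassifier(n_estimators=300, random_state=0),
    "SMO (SVC)": SVC(),                           # SMO is Weka's SVM trainer
    "J48 (tree)": DecisionTreeClassifier(random_state=0),
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "NB": GaussianNB(),
    "ZeroR": DummyClassifier(strategy="most_frequent"),
}

for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=10, scoring="accuracy").mean()
    print(f"{name:12s} 10-fold accuracy: {acc:.3f}")
```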


Author(s):  
I Made Agus Wirawan ◽  
Retantyo Wardoyo ◽  
Danang Lelono

Electroencephalogram (EEG) signals offer several advantages for emotion recognition. However, success is strongly influenced by: i) the distribution of the data used, ii) differences in participant characteristics, and iii) the characteristics of the EEG signals themselves. In response to these issues, this study examines three important points that affect the success of emotion recognition, framed as research questions: i) What factors need to be considered when generating and distributing EEG data? ii) How can EEG signals be processed while taking differences in participant characteristics into account? iii) How can the characteristics of EEG signals be represented in their features for emotion recognition? The results indicate several important challenges for further study in EEG-based emotion recognition: i) determining robust methods for imbalanced EEG data, ii) determining appropriate smoothing methods to eliminate disturbances in the baseline signals, iii) determining the best baseline reduction methods to reduce participant-related differences in the EEG signals, and iv) determining a robust capsule network architecture that overcomes the loss of knowledge information and applies to more diverse data sets.
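As a concrete illustration of the baseline reduction challenge mentioned above, a common approach subtracts features computed from a pre-stimulus baseline segment from the trial features, reducing per-participant offsets. The sketch below is a generic example, not a method from the reviewed papers; channel counts, sampling rate, and the band-power feature are assumptions.

```python
# Generic baseline-reduction illustration (assumed shapes and feature choice).
import numpy as np

def band_power(segment: np.ndarray) -> np.ndarray:
    """Mean power per channel for one EEG segment of shape (channels, samples)."""
    return (segment ** 2).mean(axis=1)

def baseline_reduced_features(trial: np.ndarray, baseline: np.ndarray) -> np.ndarray:
    """Subtract baseline-segment power from trial power, channel by channel."""
    return band_power(trial) - band_power(baseline)

# Random data standing in for a 32-channel recording sampled at 128 Hz.
rng = np.random.default_rng(0)
baseline = rng.standard_normal((32, 3 * 128))    # 3 s pre-stimulus baseline
trial = rng.standard_normal((32, 60 * 128))      # 60 s stimulus trial
features = baseline_reduced_features(trial, baseline)
print(features.shape)                            # (32,)
```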


Author(s):  
Mohanish Bawane

Abstract: The MERN stack is a well-known web stack that has gained significance over other stacks because of its UI rendering and performance, cost-effectiveness, open-source nature, and the ease of switching between client and server. Its primary objective is to improve the overall performance of the application. Because the stack uses high-performance, customizable technologies, it allows web applications and software to be developed quickly. The MERN stack is a collection of powerful technologies used to develop scalable, professional web applications, comprising front-end, back-end, and database components. It is a developer-friendly, full-stack JavaScript framework for building dynamic websites and applications, which is why it is the platform most preferred by startups. This paper describes the MERN stack through its four technologies: MongoDB, Express, React, and Node.js. Together these four technologies provide an end-to-end framework for developers to work in, and each plays a major role in the development of web applications. Index Terms: MERN STACK, Mongo DB, Express JS, React JS, Node JS platform


2022 ◽  
Vol 22 (1) ◽  
Author(s):  
Duygu Bozkaya ◽  
Heng Zou ◽  
Cindy Lu ◽  
Nicole W. Tsao ◽  
Byron L. Lam

Abstract Background Choroideremia is a rare inherited retinal disease that leads to blindness. Visual acuity (VA) is a key outcome measure in choroideremia treatment studies, but VA decline rates change with age. An accurate understanding of the natural deterioration of VA in choroideremia is important to assess the treatment effect of new therapies in which VA is the primary outcome measure. We conducted a meta-analysis of data on individuals with choroideremia to determine the rate of VA deterioration in the better- and worse-seeing eye (BSE and WSE, respectively). Methods Data were collected from the prospective Natural History of the Progression of Choroideremia (NIGHT) study (613 eyes, baseline data only), studies included in a recent meta-analysis, and studies identified in a targeted literature search performed on March 25, 2020, including individual best-corrected VA (BCVA) and age data in male individuals with choroideremia. Best-corrected VA decline rates (measured in logMAR units) by age and trends in BCVA decline rates in the BSE and WSE were evaluated. Results Data from 1037 males (1602 eyes; mean age, 41.8 years) were included. Before and after an age cutoff of 33.8 years, BCVA decline rates for the WSE were 0.0086 and 0.0219 logMAR per year, respectively. Before and after an age cutoff of 39.1 years, BCVA decline rates for the BSE were 0.00001 and 0.0203 logMAR per year, respectively. Differences in absolute BCVA and decline rates increased between the two eyes until approximately age 40; thereafter, both were similar between eyes. Conclusions Using the largest choroideremia data set to date, this analysis demonstrates accelerated BCVA decline beginning between 30 and 40 years of age. Disparate interocular progression rates were observed before the transition age, with similar interocular progression rates after it.
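The reported rates describe a piecewise-linear decline in each eye. The small helper below encodes only those slopes from the abstract; the BCVA value at the cutoff age is an assumed parameter, since the abstract reports slopes but no intercepts.

```python
# Piecewise-linear BCVA decline using the slopes reported in the abstract
# (bcva_at_cutoff is an assumed anchor value, not a figure from the study).
def expected_bcva(age: float, eye: str = "WSE", bcva_at_cutoff: float = 0.3) -> float:
    """Return modeled BCVA (logMAR) at a given age for the worse- or better-seeing eye."""
    cutoff, slope_before, slope_after = {
        "WSE": (33.8, 0.0086, 0.0219),
        "BSE": (39.1, 0.00001, 0.0203),
    }[eye]
    slope = slope_before if age <= cutoff else slope_after
    return bcva_at_cutoff + slope * (age - cutoff)

# Example: modeled decline of the worse-seeing eye between ages 40 and 50.
print(expected_bcva(50, "WSE") - expected_bcva(40, "WSE"))  # ~0.219 logMAR over 10 years
```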


SIMULATION ◽  
2022 ◽  
pp. 003754972110699
Author(s):  
José V C Vargas ◽  
Sam Yang ◽  
Juan Carlos Ordonez ◽  
Luiz F Rigatti ◽  
Pedro H R Peixoto ◽  
...  

A simplified three-dimensional mathematical model for electronic packaging cabinets was derived from physical laws. Tridimensionality resulted from dividing the domain into volume elements (VEs) with uniform properties, each with one temperature; empirical and theoretical correlations modeled their energetic interaction, producing a system of ordinary differential equations (ODEs) for the temperatures versus time. The thermal response of the cabinet (2048 mm × 1974 mm × 850 mm) with one heat source was measured. Data set 1, obtained with a 1.6-kW power source, was used for model adjustment by solving an inverse problem of parameter estimation (IPPE) with the cabinet internal average air velocities as adjustment parameters. Data set 2, obtained with a 3-kW power source, validated the model results. The converged mesh had a total of 7500 VEs. On an Intel Core i7 computer, the steady-state solution took between 16 and 19 s of CPU time to reach convergence, and less than 3 min was needed to obtain the 6500-s cabinet dynamic response under variable loading conditions. After validation, the model was used to study the impact of heat source height on system thermal response. Fundamentally, a sharp minimum junction temperature Tjct,min = 98.5 °C was obtained at the system hot spot for an optimal heat source height, 25.7 °C less than the highest value calculated within the investigated range (0.1 m < zjct < 1.66 m) at the 1.6-kW power setting. This optimum characterizes the novelty of the research and is worth pursuing, no matter how complex the actual cabinet design may be.
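A minimal lumped volume-element sketch, not the authors' model, of the kind of ODE system the abstract describes: each element has one temperature, exchanges heat with its neighbours and the ambient, and one element carries the heat source. All geometry and parameter values here are assumptions chosen only to make the sketch run.

```python
# Toy 1-D chain of volume elements with one 1.6-kW source (all parameters assumed).
import numpy as np
from scipy.integrate import solve_ivp

n = 50                      # number of volume elements in the toy discretization
C = 500.0                   # thermal capacitance per element, J/K (assumed)
G = 2.0                     # conductance between adjacent elements, W/K (assumed)
G_amb = 0.5                 # conductance from each element to ambient, W/K (assumed)
T_amb = 25.0                # ambient temperature, degC
q = np.zeros(n)
q[n // 3] = 1600.0          # 1.6-kW source placed in one element

def energy_balance(t, T):
    dTdt = np.zeros_like(T)
    for i in range(n):
        q_nb = 0.0
        if i > 0:
            q_nb += G * (T[i - 1] - T[i])
        if i < n - 1:
            q_nb += G * (T[i + 1] - T[i])
        dTdt[i] = (q[i] + q_nb + G_amb * (T_amb - T[i])) / C
    return dTdt

sol = solve_ivp(energy_balance, (0.0, 6500.0), np.full(n, T_amb), method="LSODA")
print(f"hot-spot temperature after 6500 s: {sol.y[:, -1].max():.1f} degC")
```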


2022 ◽  
Vol 22 (1) ◽  
Author(s):  
Jiali Meng ◽  
Yuanchao Wei ◽  
Qing Deng ◽  
Ling Li ◽  
Xiaolong Li

Abstract Background Hepatocellular carcinoma (HCC) is a primary liver cancer with a high mortality rate; however, the molecular mechanism of HCC formation remains to be explored. Objective To investigate the expression of TOP2A in hepatocellular carcinoma (HCC) and its prognostic value. Methods A hepatocellular carcinoma data set was downloaded from the GEO database for differential gene analysis, and hub genes were identified with Cytoscape. GEPIA was used to verify the expression of the hub genes and evaluate their prognostic value. TOP2A was then selected as the focus of this paper based on the literature and clinical sample results. First, the TIMER database was used to study TOP2A, analyzing the differential expression of the TOP2A gene between normal and cancer tissues as well as the correlation between TOP2A expression and immune infiltration of HCC cells. The expression of TOP2A-related antibodies was then analyzed using the Human Protein Atlas database, and the differential expression of TOP2A was verified by immunohistochemistry. Next, the STRING database and Cytoscape were used to establish a PPI network for TOP2A and perform protein–protein interaction analysis. The Oncomine database and cBioPortal were used for TOP2A expression and mutation-related analyses. Differential expression of the TOP2A gene was examined with LinkedOmics, and GO and KEGG pathways were analyzed in combination with related genes. Finally, Kaplan–Meier survival analysis was performed to analyze the clinical characteristics and prognosis of HCC patients. Results TOP2A may be a new biomarker and therapeutic target for hepatocellular carcinoma.
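The study's pipeline relies on web tools (GEO, GEPIA, TIMER, LinkedOmics); the sketch below illustrates only the final Kaplan-Meier step in a scripted form, comparing survival between high- and low-TOP2A groups with the lifelines package. The clinical data frame and its column names are hypothetical placeholders, not the authors' data.

```python
# Hedged sketch of a Kaplan-Meier comparison stratified by TOP2A expression.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

clinical = pd.read_csv("hcc_clinical.csv")       # columns: time, event, TOP2A_expression
high = clinical["TOP2A_expression"] >= clinical["TOP2A_expression"].median()

kmf = KaplanMeierFitter()
for label, mask in [("TOP2A high", high), ("TOP2A low", ~high)]:
    kmf.fit(clinical.loc[mask, "time"], clinical.loc[mask, "event"], label=label)
    print(label, "median survival:", kmf.median_survival_time_)

result = logrank_test(
    clinical.loc[high, "time"], clinical.loc[~high, "time"],
    event_observed_A=clinical.loc[high, "event"],
    event_observed_B=clinical.loc[~high, "event"],
)
print("log-rank p-value:", result.p_value)
```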


2022 ◽  
Vol 2022 ◽  
pp. 1-10
Author(s):  
Ruizhong Du ◽  
Jingze Wang ◽  
Shuang Li

Internet of Things (IoT) device identification is a key step in the management of IoT devices: the devices connected to a network must be controlled by its manager. For this purpose, many schemes have been proposed to identify IoT devices, especially schemes that run on the gateway. However, almost none of them pays close attention to cost. Considering the gateway’s limited storage and computational resources, a new lightweight IoT device identification scheme is therefore proposed. First, DFI (deep/dynamic flow inspection) technology is used to efficiently extract flow-related statistical features based on in-depth studies. Then, combining symmetric uncertainty and the correlation coefficient, we propose a novel filter feature selection method based on NSGA-III to select effective features for IoT device identification. We evaluate the proposed method on a real smart-home IoT data set with three different ML algorithms. The experimental results show that the proposed method is lightweight and the feature selection algorithm is effective: using only 6 features, it achieves 99.5% accuracy with a 3-minute time interval.
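A simplified sketch of the symmetric-uncertainty half of the filter criterion named in the abstract; the full method additionally uses the correlation coefficient inside an NSGA-III multi-objective search, which is omitted here. Feature and label arrays are assumed to be discretized (e.g., binned flow statistics from DFI).

```python
# Symmetric uncertainty SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)) as a simple filter ranking.
import numpy as np
from sklearn.metrics import mutual_info_score

def entropy(x: np.ndarray) -> float:
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def symmetric_uncertainty(x: np.ndarray, y: np.ndarray) -> float:
    """SU in [0, 1]; 0 means independent, 1 means fully redundant."""
    mi = mutual_info_score(x, y) / np.log(2)     # convert nats to bits
    denom = entropy(x) + entropy(y)
    return 0.0 if denom == 0 else 2.0 * mi / denom

def rank_features(X: np.ndarray, y: np.ndarray, k: int = 6) -> np.ndarray:
    """Return indices of the k features with the highest SU against the device label."""
    scores = np.array([symmetric_uncertainty(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]
```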


2022 ◽  
Vol 20 (1) ◽  
Author(s):  
Sunny C. Okoroafor ◽  
Agbonkhese I. Oaiya ◽  
David Oviaesu ◽  
Adam Ahmat ◽  
Martin Osubor ◽  
...  

Abstract Background Nigeria’s health sector aims to ensure that the right number of qualified, skilled, and equitably distributed health workers is available for quality health service provision at all levels. Achieving this requires accurate and timely health workforce information, which informed the development of the Nigeria Health Workforce Registry (NHWR) based on the global, regional, and national strategies for strengthening HRH towards achieving universal health coverage. This case study describes the process of conceptualizing and establishing the NHWR and discusses strategies for developing sustainable and scalable health workforce registries. Case presentation In designing the NHWR, existing national HRH policies and guidelines, as well as reports of previous endeavors, were reviewed to learn what had been done previously and to obtain stakeholders’ views on how to develop a scalable and sustainable registry. The findings indicated the need to align the registry architecture with other health information systems and to develop a standardized data set and guidance documents for the registry, including a standard operating procedure, so that a holistic process is adopted in data collection, management, and use nationally. Learning from these findings, a conceptual framework was developed; a registry managed centrally by the Federal Ministry of Health was developed and decentralized; a standardized tool based on a national minimum data set was developed and adopted nationally; a registry prototype was developed using iHRIS Manage; and the registry governance functions were integrated into the health information system governance structures. To sustain the functionality of the NHWR, an NHWR handbook comprising an implementation guide, the standard operating procedure, and a basic user training manual was developed, and the capacity of government staff to operate the registry was built. Conclusion In establishing a functional and sustainable registry, learning from experience is essential in shaping acceptable, sustainable, and scalable approaches. Instituting governance structures that include and involve policymakers, health managers, and users is of great importance in the design, planning, implementation, and decentralization stages. In addition, developing standardized tools based on the health system’s needs and instituting supportable mechanisms for data flow and use for policy, planning, development, and management is essential.

