The detection of lung cancer using massive artificial neural network based on soft tissue technique

2020 ◽  
Vol 20 (1) ◽  
Author(s):  
Kishore Rajagopalan ◽  
Suresh Babu

Abstract Background An existing computer aided detection (CAD) scheme faces major difficulties in recognizing subtle nodules: radiologists may miss subtle nodules in the early stage of lung cancer, while a conventional CAD scheme recognizes only non-subtle nodules in x-ray images. Method This issue has been addressed by creating a MANN (Massive Artificial Neural Network) based soft tissue technique from the lung-segmented x-ray image. The soft tissue image is used to recognize nodule candidates for feature extraction and classification. X-ray images were obtained from the Japanese Society of Radiological Technology (JSRT) image set, which includes 233 images (140 nodule x-ray images and 93 normal x-ray images). The mean nodule size is 17.8 mm, validated against computed tomography (CT) images. Thirty percent (42/140) of the abnormal cases contain subtle nodules, which radiologists rated in five categories (extremely subtle, very subtle, subtle, observable, relatively observable). Result The CAD scheme without the soft tissue technique attained 66.42% (93/140) sensitivity and 66.76% accuracy at 2.5 false positives per image. Using the soft tissue technique, many nodules superimposed by ribs and clavicles were identified (72.85% (102/140) sensitivity and 72.96% accuracy at one false positive per image). Conclusion In particular, the sensitivity and accuracy of the proposed CAD system for subtle nodules (sensitivity 14/42 = 33.33%, accuracy 33.66%) are statistically higher than those of the CAD scheme without the soft tissue technique (sensitivity 13/42 = 30.95%, accuracy 30.97%). The proposed CAD scheme attained an extremely low false positive rate and is a promising technique for cancer detection owing to its improved sensitivity and specificity.
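For readers who want to reproduce the arithmetic behind the reported figures, the short sketch below (not the authors' code) shows how sensitivity and false positives per image follow from detection counts; the nodule and image totals are the published JSRT figures, while the total false positive count is an assumption used only to make the example run.

```python
# Hedged sketch, not the authors' implementation: how the headline metrics
# quoted in the abstract (sensitivity and false positives per image) follow
# from per-image detection counts.

def cad_metrics(true_positives, total_nodules, false_positives, total_images):
    """Return (sensitivity, false positives per image)."""
    sensitivity = true_positives / total_nodules
    fps_per_image = false_positives / total_images
    return sensitivity, fps_per_image

if __name__ == "__main__":
    # Published counts: 102 of 140 nodules detected on the 233-image JSRT set.
    # The total of 233 false positives (about one per image) is an assumption
    # chosen only to match the reported "one false positive per image" rate.
    sens, fppi = cad_metrics(true_positives=102, total_nodules=140,
                             false_positives=233, total_images=233)
    print(f"sensitivity = {sens:.2%}, FPs/image = {fppi:.2f}")
```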

2020 ◽  
Author(s):  
Kishore Rajagopalan ◽  
Suresh Babu

Abstract Background An existing computer aided detection (CAD) scheme faces major difficulties in recognizing subtle nodules, and radiologists may miss subtle nodules in the early stage of lung cancer. Method In the proposed CAD system, this issue is addressed by creating an MTANN-based soft tissue technique from the lung-segmented x-ray image. X-ray images were obtained from the Japanese Society of Radiological Technology (JSRT) image set, which includes 233 images (140 nodule x-ray images and 93 normal x-ray images). The mean nodule size is 17.8 mm, validated against computed tomography (CT) images. Thirty percent (42/140) of the abnormal cases contain subtle nodules, which radiologists rated in five categories (extremely subtle, very subtle, subtle, observable, relatively observable). Result The existing CAD scheme attained 66.42% (93/140) sensitivity at 2.5 false positives (FPs) per image. Using the MTANN-based soft tissue technique, many nodules superimposed by ribs and clavicles were identified (72.85% (102/140) sensitivity at one false positive per image). Conclusion In particular, the sensitivity of the proposed CAD system using the soft tissue technique for subtle nodules (14/42 = 33.33%) is statistically higher than that of the CAD scheme without the soft tissue technique (13/42 = 30.95%). The proposed CAD scheme attained an extremely low false positive rate and is a promising technique for cancer detection.
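To make the MTANN soft tissue idea concrete, the sketch below illustrates the general patch-to-pixel regression scheme usually described for bone suppression; the patch size, the regressor interface, and any training data are assumptions, and nothing here is taken from the paper's implementation.

```python
# Illustrative sketch only, based on the general MTANN idea from the
# bone-suppression literature: a trained pixel-wise regressor maps a local
# patch of the chest radiograph to one soft-tissue pixel, so that ribs and
# clavicles are suppressed. Patch size and regressor are assumptions.
import numpy as np

def extract_patches(image, half=4):
    """Collect (2*half+1)^2-pixel patches centred on every interior pixel."""
    h, w = image.shape
    patches, coords = [], []
    for y in range(half, h - half):
        for x in range(half, w - half):
            patches.append(image[y - half:y + half + 1,
                                 x - half:x + half + 1].ravel())
            coords.append((y, x))
    return np.array(patches), coords

def mtann_soft_tissue(image, regressor, half=4):
    """Apply a trained patch-to-pixel regressor to build a soft-tissue image."""
    patches, coords = extract_patches(image, half)
    predicted = regressor.predict(patches)   # one soft-tissue value per patch
    soft = image.astype(float).copy()
    for value, (y, x) in zip(predicted, coords):
        soft[y, x] = value
    return soft
```

Any regressor exposing a `predict` method (for example a small multilayer perceptron trained on pairs of radiograph patches and soft-tissue pixels) could be passed in; the choice of model is left open here.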


Author(s):  
Mai Trong Khang ◽  
Vu Thanh Nguyen ◽  
Tuan Dinh Le

In this paper, we propose an Artificial Neural Immune Network (ANIN) for virus detection. ANIN is a combination of an Artificial Neural Network (ANN) and an Artificial Immune Network (AiNet). In ANIN, each ANN is considered a detector. A pool of initial detectors then undergoes a maturation process, called AiNet, to improve its recognition ability. Thus, multiple ANN objects can cooperate to detect malicious code. The experimental results show that ANIN can achieve a detection rate of 87.98% on average with an acceptable false positive rate.
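The following schematic sketch conveys the general flavour of maturing a pool of small ANN detectors with a clonal-selection loop of cloning and mutation; the network shape, fitness measure, mutation scale, and pool sizes are assumptions, not the authors' ANIN implementation.

```python
# Schematic sketch (not the authors' code) of maturing a pool of tiny ANN
# detectors: score each detector, clone the best, mutate the clones, and keep
# the strongest detectors at a constant pool size.
import numpy as np

rng = np.random.default_rng(0)

def new_detector(n_inputs, n_hidden=8):
    """Random weights of a tiny one-hidden-layer ANN detector."""
    return {"W1": rng.normal(size=(n_inputs, n_hidden)),
            "W2": rng.normal(size=(n_hidden, 1))}

def detect(det, x):
    """Detector score in (0, 1); scores above 0.5 are flagged as malicious."""
    h = np.tanh(x @ det["W1"])
    return 1.0 / (1.0 + np.exp(-(h @ det["W2"]).item()))

def fitness(det, samples, labels):
    """Classification accuracy of one detector on labelled feature vectors."""
    preds = np.array([detect(det, x) > 0.5 for x in samples])
    return (preds == labels).mean()

def mature(pool, samples, labels, clones=3, sigma=0.1, generations=20):
    """Clonal-selection style maturation at a constant pool size."""
    size = len(pool)
    for _ in range(generations):
        scored = sorted(pool, key=lambda d: fitness(d, samples, labels),
                        reverse=True)
        survivors = scored[: size // 2]
        offspring = [{k: v + rng.normal(scale=sigma, size=v.shape)
                      for k, v in det.items()}
                     for det in survivors for _ in range(clones)]
        pool = sorted(survivors + offspring,
                      key=lambda d: fitness(d, samples, labels),
                      reverse=True)[:size]
    return pool
```

A pool could then be built with, for example, `pool = [new_detector(n_inputs=64) for _ in range(20)]` (feature length and pool size are illustrative) before calling `mature`, after which several of the best detectors can vote on new samples.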


2021 ◽  
Vol 108 (Supplement_8) ◽  
Author(s):  
Edgard Efren Lozada Hernandez ◽  
Tania Aglae Ramírez del Real ◽  
Dagoberto Armenta Medina ◽  
Jose Francisco Molina Rodriguez ◽  
Juan Ramon Varela Reynoso

Abstract Aim Incisional hernia (IH) has an incidence of 10-23%, which can increase to 38% in specific risk groups. The objective of this study was to develop and validate an artificial neural network (ANN) model for the prediction of IH after midline laparotomy (ML) that surgeons can use to help judge a patient's risk of IH. Material and Methods A retrospective, single-arm, observational cohort trial was conducted from January 2016 to December 2020. Study participants were recruited from patients undergoing ML for elective or urgent surgical indications. Using logistic regression and ANN models, we evaluated surgically treated IH, wound dehiscence, morbidity, readmission, and mortality using the area under the receiver operating characteristic curve, true-positive rate, true-negative rate, false-positive rate, and false-negative rate. Results There was no significant difference in the power of the ANN and logistic regression for predicting IH, wound dehiscence, mortality, readmission, and all morbidities after ML. The resulting model consisted of 4 variables: surgical site infection, emergency surgery, previous laparotomy, and BMI (kg/m2) > 26. A patient with all four positive factors has a 73% risk of developing incisional hernia. The area under the curve was 0.82 (95% CI 0.76-0.87). Conclusions ANNs perform comparably to logistic regression models in the prediction of IH. ANNs may be a useful tool in risk factor analysis of IH and clinical applications.
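As a hedged illustration of the model comparison described above, the sketch below fits a logistic regression and a small ANN to four binary predictors and compares them by ROC AUC; the data are simulated and the effect sizes are assumptions, not the trial cohort.

```python
# Hedged sketch only: comparing logistic regression and a small ANN on four
# binary predictors (surgical site infection, emergency surgery, previous
# laparotomy, BMI > 26 kg/m2). The synthetic data and effect sizes below are
# purely illustrative, not the study's patient data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(1000, 4))               # four binary risk factors
logits = -3.0 + X @ np.array([1.2, 0.9, 0.8, 0.7])   # assumed effect sizes
y = rng.random(1000) < 1 / (1 + np.exp(-logits))     # simulated IH outcomes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

lr = LogisticRegression().fit(X_tr, y_tr)
ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

print("LR  AUC:", roc_auc_score(y_te, lr.predict_proba(X_te)[:, 1]))
print("ANN AUC:", roc_auc_score(y_te, ann.predict_proba(X_te)[:, 1]))
```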


Materials ◽  
2020 ◽  
Vol 13 (8) ◽  
pp. 1963 ◽  
Author(s):  
Zheng Fang ◽  
Renbin Wang ◽  
Mengyi Wang ◽  
Shuo Zhong ◽  
Liquan Ding ◽  
...  

Hyperspectral X-ray CT (HXCT) technology provides not only structural imaging but also information about the material components therein. The main purpose of this study is to investigate the effect of various reconstruction algorithms on the reconstructed X-ray absorption spectra (XAS) of components shown in the CT image by means of HXCT. In this paper, taking 3D printing polymers as an example, seven commonly used polymers, namely thermoplastic elastomer (TPE), carbon fiber reinforced polyamide (PA-CF), acrylonitrile butadiene styrene (ABS), polylactic acid (PLA), ultraviolet photosensitive resin (UV9400), polyethylene terephthalate glycol (PETG), and polyvinyl alcohol (PVA), were selected as samples for hyperspectral CT reconstruction experiments. The seven 3D printing polymers and two interfering samples were divided into a training set and test sets. First, structural images of the specimens were reconstructed by Filtered Back-Projection (FBP), the Algebraic Reconstruction Technique (ART), and Maximum-Likelihood Expectation-Maximization (ML-EM). Secondly, reconstructed XAS were extracted from the pixels of regions of interest (ROI) delineated in the images. Thirdly, principal component analysis (PCA) showed that the first four principal components contain the main features of the reconstructed XAS, so an Artificial Neural Network (ANN) was trained on the reconstructed XAS of the training set, expressed by the first four principal components, to identify whether the XAS of the corresponding polymers were present in both test sets. The ANN results show that FBP gave the best classification performance, with a ten-fold cross-validation accuracy of 99%. This suggests that hyperspectral CT reconstruction is a promising way of obtaining image features and material features at the same time, which can be used in medical imaging and nondestructive testing.
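A minimal sketch of the classification step, under stated assumptions: the reconstructed XAS are reduced to four principal components and an ANN is scored with ten-fold cross-validation, mirroring the evaluation described above; the spectral length, network size, and the placeholder data are assumptions, not the study's measurements.

```python
# Minimal sketch (not the authors' pipeline): PCA to four components followed
# by an ANN classifier, evaluated with ten-fold cross-validation. The spectra
# and labels are random placeholders standing in for ROI-averaged XAS.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_channels, n_classes = 700, 128, 7       # 7 polymer classes
X = rng.random((n_samples, n_channels))              # stand-in ROI spectra
y = rng.integers(0, n_classes, size=n_samples)       # stand-in polymer labels

model = make_pipeline(StandardScaler(),
                      PCA(n_components=4),           # first four PCs, as in the study
                      MLPClassifier(hidden_layer_sizes=(16,),
                                    max_iter=1000, random_state=0))

scores = cross_val_score(model, X, y, cv=10)         # ten-fold cross-validation
print("mean CV accuracy:", scores.mean())
```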


2011 ◽  
Vol 38 (9) ◽  
pp. 11329-11334 ◽  
Author(s):  
Yongjun Wu ◽  
Yiming Wu ◽  
Jing Wang ◽  
Zhen Yan ◽  
Lingbo Qu ◽  
...  
