Explainable Artificial Intelligence in Oriental Food Recognition using Convolutional Neural Network

Author(s):  
Chee Hong Lim ◽  
Kam Meng Goh ◽  
Li Li Lim


Author(s):  
Alejandro Lopez-Rincon ◽  
Alberto Tonda ◽  
Lucero Mendoza-Maldonado ◽  
Daphne G.J.C. Mulders ◽  
Richard Molenkamp ◽  
...  

Abstract: In this paper, deep learning is coupled with explainable artificial intelligence techniques for the discovery of representative genomic sequences in SARS-CoV-2. A convolutional neural network classifier is first trained on 553 sequences from NGDC, separating the genomes of different virus strains from the Coronavirus family with an accuracy of 98.73%. The network’s behavior is then analyzed to discover the sequences the model uses to identify SARS-CoV-2, ultimately uncovering sequences exclusive to it. The discovered sequences are validated on samples from NCBI and GISAID, and proven able to separate SARS-CoV-2 from different virus strains with near-perfect accuracy. Next, one of the sequences is selected to generate a primer set and tested against other state-of-the-art primer sets, obtaining competitive results. Finally, the primer is synthesized and tested on patient samples (n=6 previously tested positive), delivering a sensitivity similar to routine diagnostic methods and 100% specificity. The proposed methodology has substantial added value over existing methods, as it can both identify promising primer sets for a virus from a limited amount of data and deliver effective results in a minimal amount of time. Considering the possibility of future pandemics, these characteristics are invaluable for promptly creating specific detection methods for diagnostics.
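As a rough illustration of the classification step described above, the sketch below builds a small 1-D convolutional classifier over one-hot encoded genome sequences in PyTorch. The sequence length, layer sizes, and number of classes are assumptions for illustration only, not the architecture used in the study.

```python
# Minimal sketch: 1-D CNN classifier over one-hot encoded genomic sequences.
# SEQ_LEN, layer sizes, and NUM_CLASSES are illustrative assumptions, not the
# study's architecture.
import torch
import torch.nn as nn

BASES = "ACGT"
SEQ_LEN = 30000     # assumed padded/truncated genome length
NUM_CLASSES = 6     # assumed number of virus strains/classes

def one_hot(seq: str) -> torch.Tensor:
    """Encode an ACGT string as a (4, SEQ_LEN) one-hot tensor (channels first)."""
    out = torch.zeros(4, SEQ_LEN)
    for i, base in enumerate(seq[:SEQ_LEN]):
        j = BASES.find(base)
        if j >= 0:
            out[j, i] = 1.0
    return out

model = nn.Sequential(
    nn.Conv1d(4, 32, kernel_size=21), nn.ReLU(), nn.MaxPool1d(8),
    nn.Conv1d(32, 64, kernel_size=21), nn.ReLU(),
    nn.AdaptiveMaxPool1d(1), nn.Flatten(),
    nn.Linear(64, NUM_CLASSES),   # class logits; train with nn.CrossEntropyLoss
)
```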


Biomolecules ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 264
Author(s):  
Kaisa Liimatainen ◽  
Riku Huttunen ◽  
Leena Latonen ◽  
Pekka Ruusuvuori

Identifying the localization of proteins and of specific subpopulations associated with certain cellular compartments is crucial for understanding protein function and interactions with other macromolecules. Fluorescence microscopy is a powerful method to assess protein localization, with increasing demand for automated high-throughput analysis methods to complement the technical advancements in high-throughput imaging. Here, we study the applicability of deep neural network-based artificial intelligence in the classification of protein localization into 13 cellular subcompartments. We use deep learning based on a convolutional neural network and a fully convolutional network with similar architectures for the classification task, aiming not only at accurate classification but, importantly, also at a comparison of the networks. Our results show that both types of convolutional neural networks perform well in protein localization classification for the major cellular organelles. Yet, in this study, the fully convolutional network outperforms the convolutional neural network in the classification of images with multiple simultaneous protein localizations. We find that the fully convolutional network, with output that visualizes the identified localizations, is a very useful tool for systematic protein localization assessment.
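The contrast between the two network types discussed above can be sketched as follows; the backbone depth, channel counts, and input size are illustrative assumptions rather than the architectures used in the study.

```python
# Sketch contrasting an image-level CNN classifier with a fully convolutional
# network (FCN) for multi-label protein localization (13 compartments as in the
# study). Layer counts and channel widths are illustrative assumptions.
import torch.nn as nn

NUM_CLASSES = 13

# CNN: global pooling collapses spatial information into one score per class.
cnn = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, NUM_CLASSES),        # sigmoid + BCE for multi-label training
)

# FCN: a 1x1 convolution keeps a per-pixel map per class, which can both be
# reduced to image-level labels and visualized as a localization output.
fcn = nn.Sequential(
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, NUM_CLASSES, 1),     # per-pixel logits, one channel per class
)
```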


2019 ◽  
Vol 1 (2) ◽  
pp. 74-84
Author(s):  
Evan Kusuma Susanto ◽  
Yosi Kristian

Asynchronous Advantage Actor-Critic (A3C) is a deep reinforcement learning algorithm developed by Google DeepMind. The algorithm can be used to create an artificial intelligence architecture capable of mastering a variety of different games through trial and error, learning from the game screen and the score obtained from its actions without human intervention. An A3C network consists of a Convolutional Neural Network (CNN) at the front, a Long Short-Term Memory network (LSTM) in the middle, and an Actor-Critic network at the back. The CNN summarizes the screen image by extracting its important features. The LSTM serves as a memory of previous game states. The Actor-Critic network determines the best action to take when faced with a given situation. Experimental results show that this method is quite effective and can beat novice players in the five games used as test cases.
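A minimal sketch of the described A3C layout (CNN front end, LSTM in the middle, actor and critic heads at the back) is given below in PyTorch; the frame size, channel counts, and number of actions are assumptions for illustration, not the network used in the study.

```python
# Sketch of an A3C network: CNN over game frames, LSTM for temporal state,
# and separate actor (policy) and critic (value) heads. Sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class A3CNet(nn.Module):
    def __init__(self, in_channels=4, num_actions=6, hidden=256):
        super().__init__()
        # CNN: summarizes the screen image into a feature vector.
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, 16, 8, stride=4), nn.ReLU(),
            nn.Conv2d(16, 32, 4, stride=2), nn.ReLU(),
        )
        self.flat_dim = 32 * 9 * 9          # for assumed 84x84 input frames
        # LSTM: remembers previous game states across time steps.
        self.lstm = nn.LSTMCell(self.flat_dim, hidden)
        # Actor-critic heads: action probabilities and state value estimate.
        self.actor = nn.Linear(hidden, num_actions)
        self.critic = nn.Linear(hidden, 1)

    def forward(self, frame, hx, cx):
        x = self.conv(frame).flatten(1)
        hx, cx = self.lstm(x, (hx, cx))
        return F.softmax(self.actor(hx), dim=-1), self.critic(hx), (hx, cx)
```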


Author(s):  
Oguz Akbilgic ◽  
Liam Butler ◽  
Ibrahim Karabayir ◽  
Patricia P Chang ◽  
Dalane W Kitzman ◽  
...  

Abstract Aims Heart failure (HF) is a leading cause of death. Early intervention is the key to reducing HF-related morbidity and mortality. This study assesses the utility of electrocardiograms (ECGs) in HF risk prediction. Methods and results Data from the baseline visits (1987–89) of the Atherosclerosis Risk in Communities (ARIC) study were used. Incident hospitalized HF events were ascertained by ICD codes. Participants with good-quality baseline ECGs were included; participants with prevalent HF were excluded. An ECG-artificial intelligence (AI) model to predict HF was created as a deep residual convolutional neural network (CNN) utilizing the standard 12-lead ECG. The area under the receiver operating characteristic curve (AUC) was used to evaluate prediction models including the CNN, light gradient boosting machines (LGBM), and Cox proportional hazards regression. A total of 14 613 participants (45% male, 73% white, mean age ± standard deviation 54 ± 5) were eligible. A total of 803 (5.5%) participants developed HF within 10 years from baseline. The CNN utilizing solely the ECG achieved an AUC of 0.756 (0.717–0.795) on the hold-out test data. The ARIC and Framingham Heart Study (FHS) HF risk calculators yielded AUCs of 0.802 (0.750–0.850) and 0.780 (0.740–0.830), respectively. The highest AUC of 0.818 (0.778–0.859) was obtained when the ECG-AI model output, age, gender, race, body mass index, smoking status, prevalent coronary heart disease, diabetes mellitus, systolic blood pressure, and heart rate were used as predictors of HF within LGBM. The ECG-AI model output was the most important predictor of HF. Conclusions An ECG-AI model based solely on information extracted from the ECG independently predicts HF with accuracy comparable to the existing FHS and ARIC risk calculators.
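The ECG-AI component described above can be sketched as a small residual 1-D CNN over the 12 ECG leads; the depth, kernel sizes, and signal length below are illustrative assumptions, not the study's model. Its output score would then be combined with age, sex, and other clinical variables in a gradient boosting model, as the abstract describes.

```python
# Sketch of a small residual 1-D CNN over a standard 12-lead ECG. Channel
# counts, kernel sizes, and depth are illustrative assumptions only.
import torch.nn as nn

class ResBlock1d(nn.Module):
    """Two 1-D convolutions with a skip connection."""
    def __init__(self, channels, kernel_size=15):
        super().__init__()
        pad = kernel_size // 2
        self.net = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels), nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels),
        )
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.net(x) + x)

ecg_ai = nn.Sequential(
    nn.Conv1d(12, 64, 15, padding=7), nn.ReLU(),   # 12 leads in, 64 channels out
    ResBlock1d(64), nn.MaxPool1d(4),
    ResBlock1d(64), nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(64, 1),    # logit for 10-year incident HF risk
)
```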


2020 ◽  
Vol 134 (4) ◽  
pp. 328-331 ◽  
Author(s):  
P Parmar ◽  
A-R Habib ◽  
D Mendis ◽  
A Daniel ◽  
M Duvnjak ◽  
...  

Abstract Objective: Convolutional neural networks are a subclass of deep learning or artificial intelligence predominantly used for image analysis and classification. This proof-of-concept study attempts to train a convolutional neural network algorithm that can reliably determine whether the middle turbinate is pneumatised (concha bullosa) on coronal sinus computed tomography images. Method: Consecutive high-resolution computed tomography scans of the paranasal sinuses were retrospectively collected between January 2016 and December 2018 at a tertiary rhinology hospital in Australia. The classification layer of Inception-V3 was retrained in Python using a transfer learning method to interpret the computed tomography images. Segmentation analysis was also performed in an attempt to increase diagnostic accuracy. Results: The trained convolutional neural network was found to have a diagnostic accuracy of 81 per cent (95 per cent confidence interval: 73.0–89.0 per cent), with an area under the curve of 0.93. Conclusion: A trained convolutional neural network algorithm appears to successfully identify pneumatisation of the middle turbinate with high accuracy. Further studies can be pursued to test its ability on other clinically important anatomical variants in otolaryngology and rhinology.
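A minimal sketch of the transfer-learning step, retraining only the classification layer of Inception-V3 for a two-class task, is shown below. The study reports only that this was done in Python, so the torchvision calls here are one possible way to express that step, not the authors' code.

```python
# Sketch: retrain only the classification layer of a pretrained Inception-V3
# for a binary task (e.g. concha bullosa present vs absent). Assumed tooling:
# torchvision; the study's exact implementation is not specified.
import torch.nn as nn
from torchvision import models

model = models.inception_v3(weights=models.Inception_V3_Weights.IMAGENET1K_V1)

# Freeze all pretrained weights; only the new final layers will be trained.
for p in model.parameters():
    p.requires_grad = False

# Replace the 1000-class ImageNet head (and the auxiliary head) with 2 classes.
model.fc = nn.Linear(model.fc.in_features, 2)
model.AuxLogits.fc = nn.Linear(model.AuxLogits.fc.in_features, 2)
```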


2019 ◽  
Vol 134 (1) ◽  
pp. 52-55 ◽  
Author(s):  
J Huang ◽  
A-R Habib ◽  
D Mendis ◽  
J Chong ◽  
M Smith ◽  
...  

Abstract Objective: Deep learning using convolutional neural networks represents a form of artificial intelligence in which computers recognise patterns and make predictions based upon provided datasets. This study aimed to determine whether a convolutional neural network could be trained to differentiate the location of the anterior ethmoidal artery as either adhered to the skull base or within a bone ‘mesentery’ on sinus computed tomography scans. Methods: Coronal sinus computed tomography scans were reviewed by two otolaryngology residents for anterior ethmoidal artery location and used as data for the Google Inception-V3 convolutional neural network base. The classification layer of Inception-V3 was retrained in Python using a transfer learning method to interpret the computed tomography images. Results: A total of 675 images from 388 patients were used to train the convolutional neural network. A further 197 unique images were used to test the algorithm; this yielded a total accuracy of 82.7 per cent (95 per cent confidence interval = 77.7–87.8), a kappa statistic of 0.62 and an area under the curve of 0.86. Conclusion: Convolutional neural networks demonstrate promise in identifying clinically important structures in functional endoscopic sinus surgery, such as anterior ethmoidal artery location on pre-operative sinus computed tomography.
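The evaluation reported above (accuracy with a 95 per cent confidence interval, kappa statistic, and area under the curve) can be computed from held-out predictions as in the sketch below; the arrays y_true and y_prob are hypothetical placeholders for the test labels and model probabilities, not the study's data.

```python
# Sketch of the reported evaluation metrics using scikit-learn.
# y_true and y_prob are hypothetical placeholders, not the study's data.
import numpy as np
from sklearn.metrics import accuracy_score, cohen_kappa_score, roc_auc_score

y_true = np.array([0, 1, 1, 0, 1, 0, 1, 1])                    # AEA location class
y_prob = np.array([0.2, 0.9, 0.7, 0.4, 0.8, 0.1, 0.6, 0.3])    # model probabilities
y_pred = (y_prob >= 0.5).astype(int)

acc = accuracy_score(y_true, y_pred)
kappa = cohen_kappa_score(y_true, y_pred)
auc = roc_auc_score(y_true, y_prob)

# Normal-approximation 95% confidence interval for accuracy.
n = len(y_true)
half_width = 1.96 * np.sqrt(acc * (1 - acc) / n)
print(f"accuracy {acc:.3f} (95% CI {acc - half_width:.3f}-{acc + half_width:.3f}), "
      f"kappa {kappa:.2f}, AUC {auc:.2f}")
```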


2020 ◽  
Vol 32 (3) ◽  
pp. 382-390 ◽  
Author(s):  
Akiyoshi Tsuboi ◽  
Shiro Oka ◽  
Kazuharu Aoyama ◽  
Hiroaki Saito ◽  
Tomonori Aoki ◽  
...  
