Classification of Dispersed Patterns of Radiographic Images with COVID-19 by Core-Periphery Network Modeling

Author(s):  
Jianglong Yan ◽  
Weiguang Liu ◽  
Yu-tao Zhu ◽  
Gen Li ◽  
Qiusheng Zheng ◽  
...  
Diagnostics ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 233


Author(s):  
Dong-Woon Lee ◽  
Sung-Yong Kim ◽  
Seong-Nyum Jeong ◽  
Jae-Hong Lee

Fracture of a dental implant (DI) is a rare mechanical complication but a critical cause of DI failure and explantation. The purpose of this study was to evaluate the reliability and validity of three different deep convolutional neural network (DCNN) architectures (VGGNet-19, GoogLeNet Inception-v3, and an automated DCNN) for the detection and classification of fractured DIs using panoramic and periapical radiographic images. A total of 21,398 DIs were reviewed at two dental hospitals, and 251 intact and 194 fractured DI radiographic images were identified and included as the dataset of this study. All three DCNN architectures achieved a fractured DI detection and classification accuracy of over 0.80 AUC. In particular, the automated DCNN architecture using periapical images showed the highest and most reliable detection (AUC = 0.984, 95% CI = 0.900–1.000) and classification (AUC = 0.869, 95% CI = 0.778–0.929) performance compared with the fine-tuned, pre-trained VGGNet-19 and GoogLeNet Inception-v3 architectures. All three DCNN architectures showed acceptable accuracy in the detection and classification of fractured DIs, with the best performance achieved by the automated DCNN architecture using only periapical images.
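The study reports each AUC together with a 95% confidence interval (e.g. detection AUC = 0.984, 95% CI = 0.900–1.000). A common way to obtain such an interval on a fixed test set is percentile bootstrap resampling; the sketch below illustrates this on synthetic labels and scores, which are stand-ins and not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic test set: 1 = fractured implant, 0 = intact (illustrative only).
y_true = rng.integers(0, 2, size=200)
# Simulated classifier scores that correlate with the labels.
y_score = np.clip(y_true * 0.6 + rng.normal(0.2, 0.25, size=200), 0, 1)

auc = roc_auc_score(y_true, y_score)

# Percentile bootstrap for a 95% confidence interval.
boot = []
for _ in range(1000):
    idx = rng.integers(0, len(y_true), size=len(y_true))
    if len(np.unique(y_true[idx])) < 2:  # AUC needs both classes present
        continue
    boot.append(roc_auc_score(y_true[idx], y_score[idx]))
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(f"AUC = {auc:.3f}, 95% CI = {ci_low:.3f}-{ci_high:.3f}")
```

The interval width shrinks with test-set size, which is why the small periapical test set in the study yields a CI as wide as 0.900–1.000 even for a near-perfect point estimate.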


Biomolecules ◽  
2021 ◽  
Vol 11 (6) ◽  
pp. 815
Author(s):  
Shintaro Sukegawa ◽  
Kazumasa Yoshii ◽  
Takeshi Hara ◽  
Tamamo Matsuyama ◽  
Katsusuke Yamashita ◽  
...  

It is necessary to accurately identify dental implant brands and the stage of treatment to ensure efficient care. The purpose of this study was therefore to use multi-task deep learning to investigate a classifier that categorizes implant brands and treatment stages from dental panoramic radiographic images. For objective labeling, 9767 dental implant images covering 12 implant brands and the treatment stages were obtained from the digital panoramic radiographs of patients who underwent procedures at Kagawa Prefectural Central Hospital, Japan, between 2005 and 2020. Five deep convolutional neural network (CNN) models (ResNet-18, -34, -50, -101, and -152) were evaluated. The accuracy, precision, recall, specificity, F1 score, and area under the curve (AUC) were calculated for each CNN. We also compared the multi-task and single-task accuracies of brand classification and implant treatment stage classification. Our analysis revealed that the larger the number of parameters and the deeper the network, the better the performance for both classifications. Multi-tasking significantly improved brand classification on all performance indicators except recall, and significantly improved all metrics in treatment stage classification. Using CNNs conferred high validity in the classification of dental implant brands and treatment stages, and multi-task learning improved analysis accuracy.
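The multi-task setup described here amounts to one shared feature extractor feeding two classification heads (implant brand and treatment stage), trained on the sum of the two cross-entropy losses. A minimal numpy sketch of that idea follows; the ResNet trunk is replaced by a random linear projection, the data are synthetic, and all sizes except the 12 brands are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_feat = 8, 64, 16        # batch size, input dim, shared feature dim
n_brands, n_stages = 12, 3         # 12 brands as in the study; 3 stages assumed

x = rng.normal(size=(n, d_in))              # stand-in for radiograph features
W_shared = rng.normal(size=(d_in, d_feat))  # stand-in for the shared CNN trunk
W_brand = rng.normal(size=(d_feat, n_brands))
W_stage = rng.normal(size=(d_feat, n_stages))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

feat = np.maximum(x @ W_shared, 0)    # shared representation (ReLU)
p_brand = softmax(feat @ W_brand)     # head 1: brand probabilities
p_stage = softmax(feat @ W_stage)     # head 2: treatment-stage probabilities

y_brand = rng.integers(0, n_brands, size=n)
y_stage = rng.integers(0, n_stages, size=n)

# Multi-task loss: sum of the two cross-entropies, optimized jointly, so
# gradients from both tasks shape the shared trunk.
loss = (-np.log(p_brand[np.arange(n), y_brand]).mean()
        - np.log(p_stage[np.arange(n), y_stage]).mean())
print(round(loss, 3))
```

Because both heads backpropagate into the same trunk, features useful for one task can regularize the other, which is one plausible mechanism behind the accuracy gains the study reports.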


IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 120597-120603
Author(s):  
Soon Bin Kwon ◽  
Hyuk-Soo Han ◽  
Myung Chul Lee ◽  
Hee Chan Kim ◽  
Yunseo Ku ◽  
...  

2009 ◽  
Vol 42 (5) ◽  
pp. 467-476 ◽  
Author(s):  
Rafael Vilar ◽  
Juan Zapata ◽  
Ramón Ruiz

2020 ◽  
Vol 1 (14) ◽  
pp. 125-129
Author(s):  
Ol'ga Lebedeva

The article discusses existing classifications of land use as they relate to freight transport. In the Russian Federation, research on the relationship between the physical conditions of a trip (its points of departure and destination, i.e. «land use») and the actual transportation process is practically nonexistent. Conducting such research requires collecting information about departure/destination points and justifying a systematic methodology for collecting and classifying the data elements. Analysis and classification of land use are foundations of transport planning. The study concludes that no single «land use» classification system suitable for freight transport exists, and a new version needs to be developed, with a choice of attributes important for transport modeling in our country.


PLoS ONE ◽  
2021 ◽  
Vol 16 (4) ◽  
pp. e0248809
Author(s):  
Anna Lind ◽  
Ehsan Akbarian ◽  
Simon Olsson ◽  
Hans Nåsell ◽  
Olof Sköldenberg ◽  
...  

Background Fractures around the knee joint are inherently complex to treat: complication rates are high, and the fractures are difficult to diagnose on a plain radiograph. An automated way of classifying radiographic images could improve diagnostic accuracy and would enable the production of uniformly classified fracture records for use in researching treatment strategies for different fracture types. Recently, deep learning, a form of artificial intelligence (AI), has shown promising results for interpreting radiographs. In this study, we aim to evaluate how well an AI can classify knee fractures according to the detailed 2018 AO/OTA fracture classification system. Methods We selected 6003 radiograph exams taken at Danderyd University Hospital between the years 2002 and 2016 and manually categorized them according to the AO/OTA classification system and by custom classifiers. We then trained a ResNet-based neural network on these data and evaluated its performance against a test set of 600 exams. Two senior orthopedic surgeons had reviewed these exams independently, and we settled exams with disagreement through a consensus session. Results We captured a total of 49 nested fracture classes. The weighted mean AUC was 0.87 for proximal tibia fractures, 0.89 for patella fractures, and 0.89 for distal femur fractures. Almost three-quarters of the AUC estimates were above 0.8, and of these more than half reached an AUC of 0.9 or above, indicating excellent performance. Conclusion Our study shows that neural networks can be used not only for fracture identification but also for more detailed classification of fractures around the knee joint.
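The "weighted mean AUC" summary used above is a per-class one-vs-rest AUC averaged with weights proportional to class prevalence, which keeps rare fracture classes from dominating the estimate. A hedged sketch with synthetic three-class data (not the AO/OTA dataset) shows the computation:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_classes, n = 3, 300

# Synthetic ground-truth classes and partially separable probability scores.
y_true = rng.integers(0, n_classes, size=n)
y_score = np.eye(n_classes)[y_true] + rng.normal(0, 0.5, size=(n, n_classes))
y_score = np.exp(y_score) / np.exp(y_score).sum(axis=1, keepdims=True)

# scikit-learn computes the prevalence-weighted one-vs-rest average directly.
weighted_auc = roc_auc_score(y_true, y_score,
                             multi_class="ovr", average="weighted")
print(f"weighted mean AUC = {weighted_auc:.3f}")
```

With 49 nested classes, as in the study, the same call applies per anatomical region; the weighting matters because fracture-type frequencies are highly imbalanced.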

