Performance evaluation of whey flux in dead-end and cross-flow modes via convolutional neural networks

2022 ◽  
Vol 301 ◽  
pp. 113872
Author(s):  
Lukka Thuyavan Yogarathinam ◽  
Kirubakaran Velswamy ◽  
Arthanareeswaran Gangasalam ◽  
Ahmad Fauzi Ismail ◽  
Pei Sean Goh ◽  
...  
2021 ◽  
Vol 5 (3) ◽  
pp. 584-593
Author(s):  
Naufal Hilmiaji ◽  
Kemas Muslim Lhaksmana ◽  
Mahendra Dwifebri Purbolaksono

especially with the advancement of deep learning methods for text classification. Despite some efforts to identify emotion in Indonesian tweets, the reported performance has not yet reached acceptable levels. To address this problem, this paper implements a classification model using a convolutional neural network (CNN), which has demonstrated strong performance in text classification. To allow direct comparison with previous research, classification is performed on the same dataset, which consists of 4,403 Indonesian tweets labeled with five emotion classes: anger, fear, joy, love, and sadness. The model achieves a precision, recall, and F1-score of 90.1%, 90.3%, and 90.2%, respectively, and a highest accuracy of 89.8%. These results outperform previous research performing the same classification on the same dataset.
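As an illustration only (the abstract does not report the architecture details), a minimal Kim-style CNN text classifier for the five emotion classes could look like the following PyTorch sketch; the vocabulary size, embedding dimension, filter counts, and kernel sizes are assumed placeholder values, not the authors' settings.

```python
import torch
import torch.nn as nn

# Hypothetical hyperparameters; the paper's exact configuration is not given here.
VOCAB_SIZE = 20_000     # tokenizer vocabulary size (assumed)
EMBED_DIM = 100         # word-embedding dimension (assumed)
NUM_CLASSES = 5         # anger, fear, joy, love, sadness

class TextCNN(nn.Module):
    """Kim-style 1D CNN for sentence (tweet) classification."""
    def __init__(self, vocab_size=VOCAB_SIZE, embed_dim=EMBED_DIM,
                 num_classes=NUM_CLASSES, kernel_sizes=(3, 4, 5), num_filters=100):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One 1D convolution per kernel size, each followed by global max pooling.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes]
        )
        self.dropout = nn.Dropout(0.5)
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor of tokenized tweets
        x = self.embedding(token_ids).transpose(1, 2)    # (batch, embed_dim, seq_len)
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = self.dropout(torch.cat(pooled, dim=1))
        return self.fc(features)                          # raw logits per emotion class

model = TextCNN()
logits = model(torch.randint(1, VOCAB_SIZE, (8, 40)))     # batch of 8 tweets, 40 tokens each
print(logits.shape)                                       # torch.Size([8, 5])
```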


Author(s):  
Wei Jia ◽  
Jian Gao ◽  
Wei Xia ◽  
Yang Zhao ◽  
Hai Min ◽  
...  

Abstract: Palmprint recognition and palm vein recognition are two emerging biometric technologies. In the past two decades, many traditional methods have been proposed for palmprint and palm vein recognition and have achieved impressive results. However, research on deep learning-based palmprint and palm vein recognition is still preliminary. In this paper, to investigate deep learning-based 2D and 3D palmprint recognition and palm vein recognition in depth, we evaluate the performance of seventeen representative and classic convolutional neural networks (CNNs) on one 3D palmprint database, five 2D palmprint databases, and two palm vein databases. Extensive experiments were carried out under different network structures, different learning rates, and different numbers of network layers. We also conducted experiments in both separate-data and mixed-data modes. Experimental results show that these classic CNNs can achieve promising recognition results, with the more recently proposed CNNs performing better; in particular, EfficientNet achieves the best recognition accuracy among the classic CNNs. However, the recognition performance of classic CNNs is still slightly worse than that of some traditional recognition methods.
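For context, the sketch below shows one way the best-performing classic CNN named in the abstract, EfficientNet, might be fine-tuned for a palmprint identification task using a torchvision (>= 0.13) pretrained model; the class count, optimizer, and learning rate are illustrative assumptions, not the paper's actual experimental settings.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 386   # number of palm identities; actual database sizes vary (assumed)

# Load an ImageNet-pretrained EfficientNet-B0 and replace its classifier head.
net = models.efficientnet_b0(weights=models.EfficientNet_B0_Weights.DEFAULT)
in_features = net.classifier[1].in_features
net.classifier[1] = nn.Linear(in_features, NUM_CLASSES)

# Typical fine-tuning setup; the paper sweeps learning rates and layer counts,
# so these specific values are only placeholders.
optimizer = torch.optim.SGD(net.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One gradient step on a batch of palmprint images (3 x 224 x 224)."""
    optimizer.zero_grad()
    loss = criterion(net(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```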


TRANSPORTES ◽  
2020 ◽  
Vol 28 (5) ◽  
pp. 267-279
Author(s):  
Francisco Dalla Rosa ◽  
Laura Dall'Igna Favretto ◽  
Vítor Borba Rodrigues ◽  
Nasir G. Gharaibeh

This article evaluates the potential of Convolutional Neural Networks (CNNs) as an automated tool for detecting cracks in pavement surfaces. Photographs of the surface of different segments of a Cheapseal-type pavement, captured by vehicle-mounted cameras, were used. The images were evaluated with two proposed convolutional neural network architectures, implemented with the PyTorch machine learning library, which is open source and available as Python scripts. The images were processed using three different techniques in order to assess the influence of the complexity of the proposed algorithms. To analyze the performance of the neural network, accuracy, precision, recall, and F1 score were used as evaluation metrics. The results showed that the chosen neural network architecture performed satisfactorily in crack detection, and they indicate that network complexity is one of the factors to be considered during the image classification process.
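Since the article reports accuracy, precision, recall, and F1 score for a PyTorch-based crack classifier, a minimal sketch of such a binary classifier and its metric computation is given below; the layer layout and the CrackNet name are illustrative assumptions and do not reproduce either of the two architectures proposed in the article.

```python
import torch
import torch.nn as nn

class CrackNet(nn.Module):
    """Small binary classifier (crack / no crack) for pavement-surface patches.
    This layer layout is only an illustrative assumption."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 2)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def evaluate(preds, labels):
    """Accuracy, precision, recall, and F1 for the positive (crack) class."""
    tp = ((preds == 1) & (labels == 1)).sum().item()
    fp = ((preds == 1) & (labels == 0)).sum().item()
    fn = ((preds == 0) & (labels == 1)).sum().item()
    accuracy = (preds == labels).float().mean().item()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

model = CrackNet()
images = torch.randn(4, 3, 128, 128)             # batch of pavement-surface photos
preds = model(images).argmax(dim=1)
print(evaluate(preds, torch.tensor([0, 1, 1, 0])))
```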

