Learning and Nonlinear Models
Latest Publications


TOTAL DOCUMENTS

203
(FIVE YEARS 26)

H-INDEX

5
(FIVE YEARS 0)

Published By Associacao Brasileira De Inteligencia Computacional - Abricom

1676-2789

2021 ◽  
Vol 18 (2) ◽  
pp. 4-15
Author(s):  
Luan Oliveira Silva ◽  
◽  
Leandro dos Santos Araújo ◽  
Victor Ferreira Souza ◽  
Raimundo Matos Barros Neto ◽  
...  

Pneumonia is one of the most common medical problems in clinical practice and is the leading fatal infectious disease worldwide. According to the World Health Organization, pneumonia kills about 2 million children under the age of 5 and is consistently estimated to be the leading cause of infant mortality, killing more children than AIDS, malaria, and measles combined. A key element in the diagnosis is radiographic data, as chest x-rays are routinely obtained as a standard of care and can help to differentiate the types of pneumonia. However, rapid radiological interpretation of images is not always available, particularly in places with few resources, where childhood pneumonia has the highest incidence and mortality rates. As an alternative, the application of deep learning techniques to the classification of medical images has grown considerably in recent years. This study presents five implementations of convolutional neural networks (CNNs): ResNet50, VGG-16, InceptionV3, InceptionResNetV2, and ResNeXt50. To support the diagnosis of the disease, these CNNs were applied to the classification of chest radiographs from people with pneumonia. InceptionResNetV2 obtained the best recall for the Normal class and the best precision for the Pneumonia class (93.95% and 97.52%, respectively). ResNeXt50 achieved the best precision and f1-score for the Normal class (94.62% and 94.25%, respectively) and the best recall and f1-score for the Pneumonia class (97.80% and 97.65%, respectively).
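
A minimal sketch of the kind of pipeline the study describes — fine-tuning an ImageNet-pretrained CNN for Normal-vs-Pneumonia classification. The dataset path, hyperparameters, and the choice of ResNet-50 here are illustrative assumptions, not the authors' exact setup:

```python
# Sketch: fine-tune a pretrained CNN for binary chest x-ray classification.
# "chest_xray/train" is a hypothetical ImageFolder layout (Normal/, Pneumonia/).
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Standard ImageNet preprocessing so the pretrained weights remain meaningful.
tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder("chest_xray/train", tf)  # assumed path
loader = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)  # Normal vs. Pneumonia
model = model.to(device)

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # epoch count is an assumption
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
```

Per-class precision, recall, and f1-score, as reported in the abstract, would then be computed on a held-out test split.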


2021 ◽  
Vol 18 (2) ◽  
pp. 16-26
Author(s):  
Rodrigo Paula Monteiro ◽  
◽  
Carmelo Jose Albanez Bastos-Filho ◽  
Mariela Cerrada ◽  
Diego Cabrera ◽  
...  

Choosing a suitable size for signal representations, e.g., frequency spectra, in a given machine learning problem is not a trivial task, and it may strongly affect the performance of the trained models. Many solutions have been proposed for this problem; most of them rely on designing an optimized input or selecting the most suitable input through an exhaustive search. In this work, we used the Kullback-Leibler divergence and the Kolmogorov-Smirnov test to measure the dissimilarity among signal representations belonging to equal and different classes, i.e., we measured the intraclass and interclass dissimilarities. Moreover, we analyzed how this information relates to classifier performance. The results suggested that both the interclass and intraclass dissimilarities were related to model accuracy, since they indicate how easily a model can learn discriminative information from the input data. The highest ratios between the average interclass and intraclass dissimilarities were related to the most accurate classifiers. This information can be used to select a suitable input size for training the classification model. The approach was tested on two data sets related to the fault diagnosis of reciprocating compressors.
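
A sketch of the two dissimilarity measurements named above, applied to a pair of frequency spectra. The spectra here are random stand-ins, and treating the normalized spectrum values as samples for the KS test is one plausible reading of the method, not the authors' code:

```python
# Sketch: KL divergence and two-sample Kolmogorov-Smirnov statistic between
# two spectra. Inputs are fabricated; only the measurement logic is shown.
import numpy as np
from scipy.stats import entropy, ks_2samp

rng = np.random.default_rng(0)

def normalize(spec):
    """Turn a magnitude spectrum into a probability distribution."""
    spec = np.abs(spec) + 1e-12   # avoid zeros before taking logs
    return spec / spec.sum()

spec_a = normalize(rng.random(256))   # e.g., a healthy-machine spectrum
spec_b = normalize(rng.random(256))   # e.g., a faulty-machine spectrum

kl = entropy(spec_a, spec_b)            # Kullback-Leibler divergence
ks_stat, _ = ks_2samp(spec_a, spec_b)   # Kolmogorov-Smirnov statistic

# Averaging such values over same-class pairs (intraclass) and cross-class
# pairs (interclass), and taking their ratio, mirrors the paper's criterion.
print(f"KL = {kl:.4f}, KS = {ks_stat:.4f}")
```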


2021 ◽  
Vol 18 (2) ◽  
pp. 56-65
Author(s):  
Marcelo Romero ◽  
◽  
Matheus Gutoski ◽  
Leandro Takeshi Hattori ◽  
Manassés Ribeiro ◽  
...  

Transfer learning is a paradigm that consists in training and testing classifiers with datasets drawn from distinct distributions. This technique allows a particular problem to be solved using a model that was trained for another purpose. In recent years, this practice has become very popular due to the increasing number of publicly available pre-trained models that can be fine-tuned for different scenarios. However, the relationship between the datasets used for training the model and the test data is usually not addressed, especially when the fine-tuning process is done only for the fully connected layers of a Convolutional Neural Network with pre-trained weights. This work presents a study of the relationship between the datasets used in a transfer learning process, in terms of the performance achieved by the models and of the complexities and similarities of the datasets. For this purpose, we fine-tune the final layer of Convolutional Neural Networks with pre-trained weights using diverse soft-biometrics datasets. An evaluation of the performance of the models, when tested with datasets different from the one used for training, is presented. Complexity and similarity metrics are also used in the evaluation.
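
A minimal sketch of the fine-tuning regime the study describes: the convolutional backbone keeps its pretrained weights frozen and only a new final layer is trained. The choice of VGG-16 and the 10-class head are assumptions for illustration:

```python
# Sketch: freeze a pretrained backbone, retrain only the last layer.
import torch.nn as nn
from torchvision import models

model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)

# Freeze every pretrained parameter...
for p in model.parameters():
    p.requires_grad = False

# ...then replace the last fully connected layer with a trainable head for
# the target soft-biometrics task (here, a hypothetical 10-class problem).
model.classifier[6] = nn.Linear(model.classifier[6].in_features, 10)

# Only the new head's parameters are handed to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
```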


2021 ◽  
Vol 18 (2) ◽  
pp. 40-55
Author(s):  
Lídio Mauro Lima Campos ◽  
◽  
Jherson Haryson Almeida Pereira ◽  
Danilo Souza Duarte ◽  
Roberto Célio Limão Oliveira ◽  
...  

The aim of this paper is to introduce a biologically inspired approach that can automatically generate Deep Neural Networks with good prediction capacity, smaller error, and greater tolerance to noise. To this end, three biological paradigms were used: Genetic Algorithms (GA), Lindenmayer Systems, and Deep Neural Networks (DNNs). The final sections of the paper present experiments aimed at investigating the possibilities of the method in forecasting the price of energy in the Brazilian market. The proposed model considers multi-step-ahead price prediction (12, 24, and 36 weeks ahead). The results for MLP and LSTM networks show a good ability to predict peaks and satisfactory accuracy according to error measures, compared with other methods.
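
A small sketch of the multi-step-ahead framing mentioned above: a weekly price series is turned into supervised (history window, target) pairs for the 12-, 24-, and 36-week horizons. The window length and the series itself are assumptions; the GA/L-system network generation is not reproduced here:

```python
# Sketch: build supervised pairs for multi-step-ahead forecasting.
import numpy as np

def make_supervised(series, window=52, horizon=12):
    """Pair each `window`-week history with the price `horizon` weeks ahead."""
    X, y = [], []
    for t in range(len(series) - window - horizon + 1):
        X.append(series[t:t + window])
        y.append(series[t + window + horizon - 1])
    return np.array(X), np.array(y)

prices = np.random.default_rng(1).random(500)  # stand-in for weekly energy prices
for h in (12, 24, 36):
    X, y = make_supervised(prices, horizon=h)
    print(h, X.shape, y.shape)
```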


2021 ◽  
Vol 18 (2) ◽  
pp. 27-39
Author(s):  
Michel Costa ◽  
◽  
Vanessa Rezende ◽  
Cledisson Martins ◽  
Adam Santos ◽  
...  

Convolutional neural networks (CNNs) are one of the deep learning techniques that, owing to the computational advances of the last few years, have leveraged the area of computer vision, allowing substantial gains in the most varied classification problems, especially those involving digital images. In this context, this paper proposes a methodology for the classification of multiple pathologies related to different plant species. Initially, this methodology involved image processing and the generation of ten new databases, each containing between 50 and 66 of the most represented classes. After training the models (VGG16, ResNet101v1, ResNet101v2, ResNeXt50, and DenseNet169), a comparative study was conducted based on widely used classification metrics, such as test accuracy, f1-score, and area under the curve. To attest to the significance of the results, Friedman's nonparametric statistical test and two post-hoc procedures were performed, which demonstrated that ResNeXt50 and DenseNet169 obtained superior performance compared with VGG16 and the ResNets.
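
A sketch of the statistical comparison described above: Friedman's test over the per-database scores of several models. The accuracy table is fabricated for illustration; only the procedure mirrors the paper:

```python
# Sketch: Friedman's nonparametric test across models and datasets.
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(2)
# Rows: the ten generated databases; columns: five models.
acc = rng.uniform(0.80, 0.95, size=(10, 5))

stat, p = friedmanchisquare(*acc.T)  # one score vector per model
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")
# If p is small, a post-hoc procedure (e.g., the Nemenyi test, available in
# the scikit-posthocs package) identifies which model pairs actually differ.
```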


2021 ◽  
Vol 19 (1) ◽  
pp. 17-32
Author(s):  
Ricardo Giglio ◽  
Eduardo Ferreira Silva

Early applications of empirical methods from chaos theory suggested the existence of low-dimensional chaotic motion in empirical financial data. However, such results were questioned, and it is now believed that the search for low-dimensional chaos in financial data was not successful. On the other hand, while the hypothesis that raw returns are independent and identically distributed (IID) is often rejected, raw returns do present quite a small degree of autocorrelation. These facts suggest that prices in financial markets do not behave completely at random, although their hidden structures seem more complex than those observed in low-dimensional chaotic systems. Previous work tested for non-linearity and the presence of low-dimensional chaos in artificial financial data generated from the Lux-Marchesi model by means of the BDS and Kaplan tests. Addressing the same model, researchers extended those results by applying Hinich's bi-spectral and White's tests and introducing the application of Recurrence Quantification Analysis (RQA) to artificial financial data, based on Recurrence Rate, Determinism, Entropy, and Maximal Diagonal Length. Contributing to this research, the present paper has two main goals: (i) to contrast previous findings with an RQA application on data generated by a more evolved microscopic model of financial markets, the Structural Stochastic Volatility (SSV) model; and (ii) to extend the RQA investigation above with additional recurrence measures (namely, Divergence, Laminarity, and Maximal Vertical Length) applied to distinct real-world financial data. The objective is to assess whether RQA results could help to distinguish between artificial and real-world data, even if linearity is rejected in both cases. Evidence is shown, in agreement with previous findings, supporting the rejection of linearity or low-dimensional chaotic motion in an artificial financial time series generated from the SSV microscopic model. In addition, it is also shown that RQA measures can help to discriminate artificial from real-world financial data, at least when specific RQA measures are considered.
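
A from-scratch sketch of two of the RQA measures named above: Recurrence Rate (the density of the recurrence matrix) and Determinism (the fraction of recurrent points lying on diagonal lines). The series, threshold, and the absence of time-delay embedding are simplifying assumptions, not the paper's configuration:

```python
# Sketch: Recurrence Rate and Determinism for a 1-D series, no embedding.
import numpy as np

def recurrence_matrix(x, eps):
    """R[i, j] = 1 when |x_i - x_j| < eps."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def rqa(x, eps, lmin=2):
    R = recurrence_matrix(x, eps)
    n = len(x)
    rr = R.sum() / n**2                     # Recurrence Rate
    # Count recurrent points on diagonal lines of length >= lmin.
    on_lines = 0
    for k in range(-(n - 1), n):
        diag = np.diagonal(R, k)
        run = 0
        for v in np.append(diag, 0):        # sentinel flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    det = on_lines / max(R.sum(), 1)        # Determinism
    return rr, det

x = np.random.default_rng(3).standard_normal(300)  # stand-in for returns
print(rqa(x, eps=0.5))
```

Laminarity and Maximal Vertical Length follow the same counting idea applied to vertical line structures of the matrix.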


2021 ◽  
Vol 19 (1) ◽  
pp. 4-16
Author(s):  
Alexandre Szabo ◽  
Thomaz A. Ruckl

Internal validity indexes are applied to evaluate the solution of a partition as a whole, but a single index value does not equally reflect the quality of every individual cluster in terms of prototype representativeness. Thus, knowing the representativeness of prototypes in their respective clusters, it is possible to adjust them and increase the confidence in the analysis of the clusters found. In this sense, this paper proposes a simple and effective method to obtain the internal validity index value of every cluster in a partition, identify those with low prototype representativeness, and improve them. Experiments were carried out with the sum of squared errors (SSE) index, which measures the compactness of clusters. The behavior of the method was illustrated on a synthetic dataset and evaluated on ten datasets from the literature using the k-Means algorithm. The results demonstrated its effectiveness in all experiments.
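
A sketch of the per-cluster view the method proposes: instead of one global SSE value for the partition, SSE is computed cluster by cluster so poorly represented prototypes stand out. Data and the number of clusters are illustrative:

```python
# Sketch: per-cluster sum of squared errors (SSE) after k-Means.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

for c in range(km.n_clusters):
    pts = X[km.labels_ == c]
    sse = ((pts - km.cluster_centers_[c]) ** 2).sum()
    print(f"cluster {c}: SSE = {sse:.2f} over {len(pts)} points")
# Clusters with disproportionately high SSE are candidates for prototype
# adjustment, in the spirit of the method described above.
```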


2020 ◽  
Vol 18 (1) ◽  
pp. 60-75
Author(s):  
Diego Guerreiro Bernardes ◽  
Oswaldo Luiz do Valle Costa

This paper presents an autonomous portfolio management system. Autonomous investment systems consist of a series of buy and sell rules for financial markets, which can be executed by machines and are oriented toward maximizing investor gains. The system uses a neural network approach for monitoring the market and the Black-Litterman model for portfolio composition. The ten most traded assets from the Bovespa Index are analyzed with dedicated neural networks, which suggest future return estimates using technical indicators as input. Those estimates are fed into the Black-Litterman model, which proposes a daily portfolio composition using long and short positions. The results are compared with those of a second autonomous trading system without the Black-Litterman approach, referred to as the Benchmark. The numerical results show strong performance compared with the Benchmark, especially in the risk-return ratio captured by the Sharpe index. These results suggest that combining Bayesian inference models with neural networks may be a good alternative for portfolio management.
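
A sketch of the Black-Litterman combination step at the heart of such a system: equilibrium returns are blended with the networks' return estimates, treated as views. The two-asset numbers below are fabricated for illustration; the posterior-mean formula itself is the standard one:

```python
# Sketch: Black-Litterman posterior expected returns with NN-derived views.
import numpy as np

tau = 0.05                                       # prior uncertainty scale (assumed)
Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])   # asset return covariance
pi = np.array([0.05, 0.07])                      # equilibrium (prior) returns

P = np.eye(2)                    # absolute views, one per asset
Q = np.array([0.06, 0.10])       # the networks' return estimates (views)
Omega = np.diag([0.02, 0.03])    # view uncertainty (confidence per view)

inv = np.linalg.inv
A = inv(tau * Sigma)
posterior = inv(A + P.T @ inv(Omega) @ P) @ (A @ pi + P.T @ inv(Omega) @ Q)
print(posterior)   # blended expected returns fed to the portfolio optimizer
```

The Sharpe index reported in the abstract would then be computed from the realized daily returns of the resulting long/short portfolio.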


2020 ◽  
Vol 18 (1) ◽  
pp. 47-59
Author(s):  
Marcelo Romero ◽  
Matheus Gutoski ◽  
Leandro Takeshi Hattori ◽  
Manassés Ribeiro ◽  
Heitor S. Lopes

2020 ◽  
Vol 18 (1) ◽  
pp. 15-34
Author(s):  
Rafael M. Carmo ◽  
Luís Tarrataca ◽  
Jefferson Colares ◽  
Felipe R. Henriques ◽  
Diego B. Haddad ◽  
...  
