regularized model
Recently Published Documents


TOTAL DOCUMENTS: 59 (FIVE YEARS: 18)

H-INDEX: 10 (FIVE YEARS: 2)

2022
Author(s): Yongrong Qiu, David A Klindt, Klaudia P Szatko, Dominic Gonschorek, Larissa Hoefling, ...

Neural system identification aims to learn the response function of neurons to arbitrary stimuli from experimentally recorded data, but typically does not leverage coding principles such as the efficient coding of natural environments. Visual systems, however, have evolved to efficiently process input from the natural environment. Here, we present a normative network regularization for system identification models that incorporates, as a regularizer, the efficient coding hypothesis, which states that the response properties of sensory representations are strongly shaped by the need to preserve most of the stimulus information with limited resources. Using this approach, we explored whether a system identification model can be improved by sharing its convolutional filters with those of an autoencoder that aims to efficiently encode natural stimuli. To this end, we built a hybrid model to predict the responses of retinal neurons to noise stimuli. This approach not only yielded higher performance than the stand-alone system identification model, but also produced more biologically plausible filters. We found these results to be consistent for retinal responses to different stimuli and across model architectures. Moreover, our normatively regularized model performed particularly well in predicting the responses of direction-of-motion-sensitive retinal neurons. In summary, our results support the hypothesis that efficiently encoding environmental inputs can improve system identification models of early visual processing.
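For a concrete picture of the filter-sharing idea, the sketch below is a minimal, hypothetical implementation: a convolutional core whose filters feed both a system-identification readout (predicting neural responses) and an autoencoder decoder (reconstructing the stimulus), trained with a joint loss. All layer sizes, class names, and the Poisson/MSE loss combination are illustrative assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class SharedCore(nn.Module):
    """Convolutional filters shared by both branches (illustrative sizes)."""
    def __init__(self, in_channels=1, n_filters=16):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, n_filters, kernel_size=9, padding=4),
            nn.ReLU(),
        )

    def forward(self, x):
        return self.conv(x)

class HybridModel(nn.Module):
    """System-identification readout and autoencoder decoder on one core."""
    def __init__(self, core, n_neurons, feat_hw=(36, 36), n_filters=16):
        super().__init__()
        self.core = core
        # Branch 1: linear readout predicting per-neuron responses.
        self.readout = nn.Linear(n_filters * feat_hw[0] * feat_hw[1], n_neurons)
        # Branch 2: decoder reconstructing the stimulus from the shared features.
        self.decoder = nn.ConvTranspose2d(n_filters, 1, kernel_size=9, padding=4)

    def forward(self, x):
        z = self.core(x)
        responses = torch.relu(self.readout(z.flatten(1)))
        reconstruction = self.decoder(z)
        return responses, reconstruction

def hybrid_loss(responses, targets, reconstruction, stimuli, lam=0.1):
    """Response-prediction loss plus a reconstruction term acting as the
    efficient-coding regularizer; the weight `lam` is a free choice."""
    pred = nn.functional.poisson_nll_loss(responses, targets, log_input=False)
    recon = nn.functional.mse_loss(reconstruction, stimuli)
    return pred + lam * recon
```

Because both branches backpropagate into the same convolutional filters, the reconstruction term nudges the learned filters toward an efficient encoding of the stimuli while the readout term keeps them predictive of the recorded responses.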


Signals, 2021, Vol 2 (3), pp. 508-526
Author(s): Ryoto Ishizuka, Ryo Nishikimi, Kazuyoshi Yoshii

This paper describes an automatic drum transcription (ADT) method that directly estimates a tatum-level drum score from a music signal, in contrast to most conventional ADT methods, which estimate frame-level onset probabilities of drums. To estimate a tatum-level score, we propose a deep transcription model consisting of a frame-level encoder that extracts latent features from a music signal and a tatum-level decoder that estimates a drum score from the latent features pooled at the tatum level. To capture the global repetitive structure of drum scores, which is difficult to learn with a recurrent neural network (RNN), we introduce a self-attention mechanism with tatum-synchronous positional encoding into the decoder. To mitigate the difficulty of training the self-attention-based model from an insufficient amount of paired data, and to improve the musical naturalness of the estimated scores, we propose a regularized training method that uses a global structure-aware masked language (score) model with a self-attention mechanism pretrained on an extensive collection of drum scores. The experimental results showed that the proposed regularized model outperformed the conventional RNN-based model in terms of the tatum-level error rate and the frame-level F-measure, even with only a limited amount of paired data, for which the non-regularized model underperformed the RNN-based model.
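As a rough illustration of the pipeline described above, the sketch below (assuming PyTorch and placeholder dimensions) wires a frame-level encoder to a self-attention stage operating on tatum-pooled features. The tatum-synchronous positional encoding and the pretrained masked score language model are stood in for by simplified placeholders, so this is a sketch of the idea rather than the authors' implementation.

```python
import torch
import torch.nn as nn

class TatumLevelADT(nn.Module):
    """Frame-level encoder plus tatum-level self-attention head (illustrative)."""
    def __init__(self, n_bins=128, d_model=256, n_drums=3):
        super().__init__()
        # Frame-level encoder: spectrogram frames -> latent features.
        self.encoder = nn.GRU(n_bins, d_model, batch_first=True)
        # Self-attention over tatum-synchronous features (the paper's
        # tatum-synchronous positional encoding is omitted here for brevity).
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.attention = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_drums)

    def forward(self, spec, tatum_assign):
        # spec: (B, n_frames, n_bins); tatum_assign: (B, n_tatums, n_frames),
        # a row-normalized frame-to-tatum assignment matrix (an assumption here).
        feats, _ = self.encoder(spec)                # (B, n_frames, d_model)
        pooled = torch.bmm(tatum_assign, feats)      # pool features to the tatum grid
        return torch.sigmoid(self.head(self.attention(pooled)))

def regularized_loss(pred, target, score_lm, beta=0.5):
    """Transcription loss plus a naturalness penalty from a pretrained masked
    language model of drum scores (`score_lm` is an opaque stand-in assumed to
    return a scalar penalty)."""
    transcription = nn.functional.binary_cross_entropy(pred, target)
    return transcription + beta * score_lm(pred)
```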


2021, Vol E104.D (7), pp. 961-969
Author(s): Hao ZHOU, Hailing XIONG, Chuan LI, Weiwei JIANG, Kezhong LU, ...

Energies, 2021, Vol 14 (8), pp. 2062
Author(s): Paweł Boroń, Joanna Maria Dulińska, Dorota Jasińska

In this paper, a two-step tuning strategy for a finite element (FE) model of a bridge with pot bearings exposed to mining-triggered tremors of various intensities is proposed. The study considers a 160 m long reinforced concrete bridge. Once the modal identification of the bridge had been carried out experimentally from low-energy ambient vibrations, the FE model was tuned by replacing free sliding at the bearings with a Coulomb friction-regularized model. This friction model splits the tangential relative displacement rates between contacting surfaces into a reversible elastic part and irreversible sliding. The elastic microslip (spring-like behavior) prior to macro-sliding can be explained by the deformation of asperities (the roughness of the contacting surfaces at the microscopic scale). The proposed model allows sliding bearing performance to be simulated accurately under both low-energy and high-energy mining-induced tremors. In the first step of the FE model tuning strategy, the elastic microslip constant was estimated experimentally from the modal identification. In the second step, the macro-sliding friction parameter was introduced to capture the realistic behavior of the bridge under mining-induced shocks. Finally, the dynamic responses of the bridge to mining-triggered tremors of various intensities were calculated and assessed using the untuned and tuned FE models. The analysis proved that the untuned model was not suitable for dynamic bridge assessment in the case of low-intensity tremors: the stresses obtained with this model turned out to be strongly underestimated. For shocks of higher intensity, frictionless sliding at the bearings gives a relatively good global estimate of the structural performance but underestimates the local response. The analysis also shows that the tuned Coulomb friction-regularized model allows accurate simulation of the sliding bearings under both low- and high-energy mining-induced tremors.
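To make the split between elastic microslip and irreversible sliding concrete, here is a minimal numerical sketch of a regularized Coulomb friction update (elastic predictor followed by a sliding correction). The stiffness, friction coefficient, and normal force are placeholder values chosen for illustration, not the parameters identified for the bridge in the paper.

```python
import math

def coulomb_step(du_rel, u_elastic, k=1.0e8, mu=0.03, normal_force=1.0e6):
    """One elastic-predictor / sliding-corrector update of the friction force.

    du_rel    : increment of tangential relative displacement
    u_elastic : current reversible (elastic microslip) displacement
    Returns (friction_force, new_u_elastic, slip_increment).
    """
    u_trial = u_elastic + du_rel          # trial state: all motion is elastic
    f_trial = k * u_trial
    f_limit = mu * normal_force           # Coulomb limit force

    if abs(f_trial) <= f_limit:
        # Stick phase: spring-like deformation of asperities, no macro-sliding.
        return f_trial, u_trial, 0.0

    # Sliding phase: the force saturates at the Coulomb limit and the excess
    # displacement becomes irreversible slip.
    sign = 1.0 if f_trial > 0.0 else -1.0
    new_u_elastic = sign * f_limit / k
    return sign * f_limit, new_u_elastic, u_trial - new_u_elastic

# Example: drive the contact with a small oscillating relative displacement.
u_el, history = 0.0, []
for step in range(200):
    du = 1.0e-5 * math.cos(0.1 * step)
    force, u_el, slip = coulomb_step(du, u_el)
    history.append((force, slip))
```

Small increments stay on the elastic branch (zero slip, force proportional to displacement), while larger excursions saturate at the Coulomb limit, which is the behavior that lets a single contact law cover both low- and high-energy tremors.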


Author(s): Yi Yang, Lixin Han, Yuanzhen Liu, Jun Zhu, Hong Yan

Risks, 2020, Vol 9 (1), pp. 5
Author(s): Karim Barigou, Stéphane Loisel, Yahia Salhi

Predicting the evolution of mortality rates plays a central role for life insurance and pension funds. Standard single-population models typically suffer from two major drawbacks: on the one hand, they use a large number of parameters compared to the sample size, and on the other hand, model choice is still often based on in-sample criteria, such as the Bayesian information criterion (BIC), rather than on predictive ability. In this paper, we develop a model based on a decomposition of the mortality surface into a polynomial basis. We then show how regularization techniques and cross-validation can be used to obtain a parsimonious and coherent predictive model for mortality forecasting. We analyze how COVID-19-type effects can affect predictions in our approach and in the classical one. In particular, our death rate forecasts tend to be more robust than those of models with a cohort effect, and the regularized model outperforms the so-called P-spline model in terms of prediction and stability.
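As a toy illustration of the general recipe (a polynomial basis over age and calendar year, a sparsity penalty, and cross-validated penalty selection), the sketch below fits a regularized polynomial surface to synthetic log death rates with scikit-learn. The data, polynomial degree, and choice of an L1 penalty are assumptions for illustration and do not reproduce the authors' estimator.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Placeholder age-year grid and synthetic log central death rates.
ages = np.arange(50, 91)
years = np.arange(1990, 2020)
A, Y = np.meshgrid(ages, years, indexing="ij")
X = np.column_stack([A.ravel(), Y.ravel()])

rng = np.random.default_rng(0)
log_mx = (-9.0 + 0.09 * A + 0.001 * (A - 70) ** 2 - 0.02 * (Y - 1990)).ravel()
log_mx = log_mx + rng.normal(scale=0.05, size=log_mx.shape)

# Polynomial basis over (age, year); the L1 penalty is chosen by cross-validation,
# which drops most basis terms and keeps the surface parsimonious.
model = make_pipeline(
    PolynomialFeatures(degree=4, include_bias=False),
    StandardScaler(),
    LassoCV(cv=5),
)
model.fit(X, log_mx)

# Forecast: evaluate the fitted surface at future calendar years.
future_years = np.arange(2020, 2031)
Af, Yf = np.meshgrid(ages, future_years, indexing="ij")
X_future = np.column_stack([Af.ravel(), Yf.ravel()])
log_mx_forecast = model.predict(X_future)
```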

