An incremental learning temporal influence model for identifying topical influencers on Twitter dataset

2021, Vol 11 (1)
Author(s): G. R. Ramya, P. Bagavathi Sivakumar

2021, Vol 12
Author(s): Maria Heitmeier, Yu-Ying Chuang, R. Harald Baayen

This study addresses a series of methodological questions that arise when modeling inflectional morphology with Linear Discriminative Learning. Taking the semi-productive German noun system as an example, we illustrate how decisions about the representation of form and meaning influence model performance. We clarify that, for modeling frequency effects in learning, it is essential to use incremental learning rather than the end-state of learning. We also discuss how the model can be set up to approximate the learning of inflected words in context, and we illustrate how the wug task can be modeled in this approach. The model provides an excellent memory for known words but, appropriately, shows more limited performance for unseen data, in line with the semi-productivity of German noun inflection and with the generalization performance of native German speakers.
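To make the contrast between the end-state of learning and incremental learning concrete, the sketch below sets up a toy form-to-meaning mapping in Python with NumPy. The matrix names (C for form cues, S for meaning vectors), the learning rate, and the token frequencies are illustrative assumptions rather than the study's actual data or implementation; the closed-form least-squares solution stands in for the end-state of learning, and Widrow-Hoff (delta-rule) updates stand in for incremental learning.

```python
import numpy as np

# Minimal sketch, assuming toy data: a form-to-meaning mapping learned
# either in closed form (end-state) or incrementally (Widrow-Hoff).
rng = np.random.default_rng(0)

n_words, n_cues, n_dims = 6, 10, 4
C = rng.integers(0, 2, size=(n_words, n_cues)).astype(float)  # form cue vectors
S = rng.normal(size=(n_words, n_dims))                        # meaning vectors

# End-state of learning: closed-form least-squares mapping.
F_endstate = np.linalg.pinv(C) @ S

# Incremental learning: one Widrow-Hoff update per learning event.
# Token frequencies (assumed here) determine how often each word occurs,
# which is how frequency effects can enter the mapping.
freqs = np.array([50, 20, 10, 5, 2, 1])
eta = 0.01
F_incr = np.zeros((n_cues, n_dims))
events = rng.permutation(np.repeat(np.arange(n_words), freqs))
for i in events:
    c, s = C[i:i + 1], S[i:i + 1]            # one word's form and meaning
    F_incr += eta * c.T @ (s - c @ F_incr)   # delta-rule update

# Compare predicted semantics for the known words under both mappings.
print("end-state error:  ", np.linalg.norm(C @ F_endstate - S))
print("incremental error:", np.linalg.norm(C @ F_incr - S))
```

Because frequent words generate more learning events, the incremental mapping is shaped by token frequency, whereas the closed-form end-state mapping is not; this difference is what makes incremental learning the relevant choice for modeling frequency effects.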


Author(s): J. G. Hunt, R. N. Osborn, H. J. Martin

2010
Author(s): Gwen A. Frishkoff, Kevyn Collins-Thompson, Charles A. Perfetti

2018, Vol 44 (10), pp. 1586-1602
Author(s): Franziska Kurtz, Herbert Schriefers, Andreas Mädebach, Jörg D. Jescheniak
