Retrieval of Simple Addition Facts

2009 ◽  
Vol 42 (3) ◽  
pp. 215-229 ◽  
Author(s):  
Sarah Hopkins ◽  
Helen Egeberg

1995 ◽  
Vol 81 (1) ◽  
pp. 163-167 ◽  
Author(s):  
Colette Gray ◽  
Gerry Mulhern

This paper describes data from a study of 10-year-old children's memory for simple addition facts and an investigation of the relationship between estimated automaticity and general mathematical ability. Twenty-one children were each presented with 100 single-digit addition combinations and their addition times were analysed. Each child's automaticity for addition facts was estimated from the discrepancy between response times for tie and nontie combinations. Analysis indicated a significant correlation between automaticity and mathematical ability (r = .45, p < .01).
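
A minimal sketch of the kind of analysis described above, assuming per-child response times split into tie and nontie combinations plus a separate mathematical ability score; the variable names, the placeholder data, and the sign convention for the discrepancy score are assumptions, not details from the paper.

```python
# Illustrative sketch only, not the authors' analysis code.
import numpy as np
from scipy.stats import pearsonr

def automaticity_estimate(tie_rts, nontie_rts):
    """Estimate automaticity from the discrepancy between mean response times
    for nontie (e.g. 3 + 5) and tie (e.g. 4 + 4) combinations
    (sign convention assumed here)."""
    return np.mean(nontie_rts) - np.mean(tie_rts)

# Placeholder data for 21 children; real response times would come from the
# 100 single-digit combinations presented to each child.
rng = np.random.default_rng(0)
tie_rts = rng.normal(1.8, 0.3, size=(21, 10))     # seconds, illustrative only
nontie_rts = rng.normal(2.4, 0.4, size=(21, 90))  # seconds, illustrative only
ability = rng.normal(100, 15, size=21)            # illustrative ability scores

automaticity = np.array([automaticity_estimate(t, n)
                         for t, n in zip(tie_rts, nontie_rts)])
r, p = pearsonr(automaticity, ability)            # the study reports r = .45, p < .01
print(f"r = {r:.2f}, p = {p:.3f}")
```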


2011 ◽  
Vol 70 (1) ◽  
pp. 35-39 ◽  
Author(s):  
Muriel Fanget ◽  
Catherine Thevenot ◽  
Caroline Castel ◽  
Michel Fayol

In this study, we used a paradigm recently developed (Thevenot, Fanget, & Fayol, 2007) to determine whether 10-year-old children solve simple addition problems by retrieval of the answer from long-term memory or by calculation procedures. Our paradigm is unique in that it does not rely on reaction times or verbal reports, which are known to potentially bias the results, especially in children. Rather, it takes advantage of the fact that calculation procedures degrade the memory traces of the operands, so that it is more difficult to recognize them when they have been involved in the solution of an addition problem by calculation rather than by retrieval. The present study sharpens the current conclusions in the literature and shows that, when the sum of addition problems is up to 10, children mainly use retrieval, but when it is greater than 10, they mainly use calculation procedures.
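
A minimal sketch of how recognition times might be compared under the operand-recognition logic described above, assuming per-child trial data; the data layout, the threshold at a sum of 10, the placeholder values, and the paired t-test are assumptions, not the authors' procedure.

```python
# Illustrative sketch only, in the spirit of the paradigm described above.
import numpy as np
from scipy.stats import ttest_rel

def mean_recognition_by_sum(sums, recognition_rts, threshold=10):
    """Mean operand-recognition time after small-sum (<= threshold) and
    large-sum (> threshold) addition problems, for one child."""
    sums = np.asarray(sums)
    rts = np.asarray(recognition_rts)
    return rts[sums <= threshold].mean(), rts[sums > threshold].mean()

# Placeholder trials for 30 children (sums of the addition problems and the
# times taken to recognize an operand afterwards); illustrative values only.
rng = np.random.default_rng(0)
per_child_trials = [(rng.integers(2, 18, size=40),
                     rng.normal(0.9, 0.2, size=40)) for _ in range(30)]

# Slower recognition after large-sum problems is read as evidence that the
# operands were degraded by a calculation procedure rather than retrieved.
pairs = np.array([mean_recognition_by_sum(s, r) for s, r in per_child_trials])
t, p = ttest_rel(pairs[:, 1], pairs[:, 0])
print(f"t = {t:.2f}, p = {p:.3f}")
```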


2013 ◽  
Vol 44 (6) ◽  
pp. 720-734
Author(s):  
Xu-Qian CHEN ◽  
Ben-Xuan HE ◽  
Ji-Jia ZHANG

1975 ◽  
Vol 12 (2) ◽  
pp. 289-299 ◽  
Author(s):  
Richard A. F. Grieve ◽  
John Gittins

Coronas of orthopyroxene, amphibole and spinel, and occasional garnet occur between olivine and plagioclase in olivine gabbros and troctolites of the Hadlington gabbroic complex. Microprobe analyses show that changes in olivine composition are mirrored in the compositions of the corona minerals. Textural evidence indicates that corona formation was sub-solidus, and Mg–Fe partitioning between ortho- and clinopyroxene suggests temperatures below 850 °C. Calculations based on earlier models of corona formation, which assume equal-volume replacement or the simple addition of silica or alumina, fail to yield a satisfactory chemical balance. Olivine and plagioclase, with the addition of water, can supply the material needed for corona formation only if a migrating reaction boundary and a change in olivine composition are considered.
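
A purely illustrative sketch of the kind of least-squares chemical mass balance referred to above; the oxide list and all compositions are placeholder values, not the microprobe analyses reported in the paper.

```python
# Illustrative mass-balance check: can the corona assemblage be made from
# olivine + plagioclase (+ H2O)? All numbers below are placeholders.
import numpy as np

def mass_balance(reactant_comps, product_comp):
    """Solve product ≈ sum_i x_i * reactant_i in the least-squares sense and
    return the coefficients and the residual (a large residual means the
    proposed reaction does not balance chemically)."""
    A = np.column_stack(reactant_comps)
    x, residuals, *_ = np.linalg.lstsq(A, product_comp, rcond=None)
    return x, residuals

# Oxide order: SiO2, Al2O3, FeO, MgO, CaO, Na2O, H2O (wt%, placeholder values).
olivine     = np.array([38.0,  0.0, 22.0, 40.0,  0.0, 0.0,   0.0])
plagioclase = np.array([52.0, 30.0,  0.0,  0.0, 13.0, 5.0,   0.0])
water       = np.array([ 0.0,  0.0,  0.0,  0.0,  0.0, 0.0, 100.0])
corona_bulk = np.array([48.0, 12.0,  9.0, 18.0,  7.0, 2.0,   4.0])  # placeholder

coeffs, resid = mass_balance([olivine, plagioclase, water], corona_bulk)
print("coefficients:", coeffs, "residual:", resid)
```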


2021 ◽  
Vol 224 ◽  
pp. 110990
Author(s):  
Saad Sarwar ◽  
Sunghyeok Park ◽  
Thuy Thi Dao ◽  
Sungjun Hong ◽  
Chi-Hwan Han

2020 ◽  
pp. 1-11
Author(s):  
Dawei Yu ◽  
Jie Yang ◽  
Yun Zhang ◽  
Shujuan Yu

The Densely Connected Network (DenseNet) is widely recognized as a highly competitive deep neural network architecture. Its most distinctive property is called Dense Connections, which form each layer's input by concatenating the outputs of all preceding layers and thereby improve performance by pushing feature reuse to the extreme. However, these same Dense Connections make the feature dimension grow with depth, which makes DenseNet resource-intensive and inefficient. In light of this, and inspired by the Residual Network (ResNet), we propose an improved DenseNet named Additive DenseNet, which replaces the concatenation operations used in Dense Connections with the addition operations used in ResNet and, for feature reuse, upgrades addition to accumulation (namely ∑(·)), so that each layer's input is the summation of all preceding layers' outputs. Consequently, Additive DenseNet keeps the input dimension from growing while retaining the effect of Dense Connections. In this paper, Additive DenseNet is applied to a text classification task. The experimental results reveal that, compared to DenseNet, Additive DenseNet reduces model complexity by a large margin in terms of GPU memory usage and number of parameters. Despite this economy, Additive DenseNet still outperforms DenseNet on six text classification datasets in terms of accuracy and shows competitive performance in model training.
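
A minimal PyTorch-style sketch contrasting the two connection schemes described above, under the assumption of simple fully connected layers; the layer widths, nonlinearity, and module names are illustrative, not the architecture used in the paper.

```python
# Illustrative sketch only, not the paper's implementation.
import torch
import torch.nn as nn

class DenseBlockConcat(nn.Module):
    """DenseNet-style block: each layer's input is the concatenation of all
    preceding layers' outputs, so the input width grows with depth."""
    def __init__(self, width, growth, n_layers):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.Linear(width + i * growth, growth) for i in range(n_layers))

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(torch.relu(layer(torch.cat(feats, dim=-1))))
        return torch.cat(feats, dim=-1)

class DenseBlockAdditive(nn.Module):
    """Additive-DenseNet-style block as described in the abstract: each layer's
    input is the running sum of all preceding outputs, so the width is fixed."""
    def __init__(self, width, n_layers):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(width, width) for _ in range(n_layers))

    def forward(self, x):
        acc = x                                  # running sum of preceding outputs
        for layer in self.layers:
            acc = acc + torch.relu(layer(acc))   # accumulate instead of concatenate
        return acc

x = torch.randn(8, 64)                           # batch of 8, width 64
print(DenseBlockConcat(64, 32, 4)(x).shape)      # width grows: 64 + 4 * 32 = 192
print(DenseBlockAdditive(64, 4)(x).shape)        # width stays 64
```

In the concatenating block the input width grows with every layer, while in the additive block it stays fixed, which is where the reported savings in GPU memory and parameters come from.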


1986 ◽  
Vol 3 (3) ◽  
pp. 173-202 ◽  
Author(s):  
Mary Sue Hamann ◽  
Mark H. Ashcraft

1982 ◽  
Vol 17 (5) ◽  
pp. 557-561
Author(s):  
Una A. Lange ◽  
Linda L. Mullin
