Transfer of Aspects of Academic Writing to Similar and New Contexts Through Dynamic Assessment

Author(s):  
Prithvi N. Shrestha
2005
Vol 38 (3)
pp. 142-147

05–314 Alderson, J. Charles (Lancaster U, UK) & Ari Huhta, The development of a suite of computer-based diagnostic tests based on the Common European Framework. Language Testing (London, UK) 22.3 (2005), 301–320.

05–315 Al-Hamly, Mashael & Christine Coombe (Kuwait U, Kuwait), To change or not to change: investigating the value of MCQ answer changing for Gulf Arab students. Language Testing (London, UK) 22.4 (2005), 509–531.

05–316 Broadfoot, Patricia M. (U of Bristol, UK), Dark alleys and blind bends: testing the language of learning. Language Testing (London, UK) 22.2 (2005), 123–141.

05–317 Cumming, Alister (U of Toronto, Canada; [email protected]), Robert Kantor, Kyoko Baba, Usman Erdosy, Keanre Eouanzoui & Mark James, Differences in written discourse in independent and integrated prototype tasks for next generation TOEFL. Assessing Writing (Amsterdam, the Netherlands) 10.1 (2005), 5–43.

05–318 Eckes, Thomas (TestDaF Institute, Germany), Melanie Ellis, Vita Kalnberzina, Karmen Piorn, Claude Springer, Krisztina Szollás & Constance Tsagari, Progress and problems in reforming public language examinations in Europe: cameos from the Baltic States, Greece, Hungary, Poland, Slovenia, France and Germany. Language Testing (London, UK) 22.3 (2005), 355–377.

05–319 Figueras, Neus (Department of Education, Generalitat de Catalunya, Spain), Brian North, Sauli Takala, Norman Verhelst & Piet Van Avermaet, Relating examinations to the Common European Framework: a manual. Language Testing (London, UK) 22.3 (2005), 261–279.

05–320 Green, Anthony (Cambridge ESOL Examinations, Cambridge, UK), EAP study recommendations and score gains on the IELTS Academic Writing test. Assessing Writing (Amsterdam, the Netherlands) 10.1 (2005), 44–60.

05–321 Green, Rita & Dianne Wall (Lancaster U, UK), Language testing in the military: problems, politics and progress. Language Testing (London, UK) 22.3 (2005), 379–398.

05–322 Hasselgreen, Angela (The U of Bergen, Norway), Assessing the language of young learners. Language Testing (London, UK) 22.3 (2005), 337–354.

05–323 Klein, Joseph ([email protected]) & David Taub, The effect of variations in handwriting and print on evaluation of student essays. Assessing Writing (Amsterdam, the Netherlands) 10.2 (2005), 134–148.

05–324 Little, David (Trinity College, Dublin, Ireland), The Common European Framework and the European Language Portfolio: involving learners and their judgements in the assessment process. Language Testing (London, UK) 22.3 (2005), 321–336.

05–325 Lumley, Tom & Barry O'Sullivan (Australian Council for Educational Research, Australia), The effect of test-taker gender, audience and topic on task performance in tape-mediated assessment of speaking. Language Testing (London, UK) 22.4 (2005), 415–437.

05–326 Luxia, Qi (Guangdong U of Foreign Studies, China), Stakeholders' conflicting aims undermine the washback function of a high-stakes test. Language Testing (London, UK) 22.2 (2005), 142–173.

05–327 Poehner, Matthew E. & James P. Lantolf (The Pennsylvania State U, USA), Dynamic assessment in the language classroom. Language Teaching Research (London, UK) 9.3 (2005), 233–265.

05–328 Stansfield, Charles W. & William E. Hewitt (Second Language Testing Inc., USA), Examining the predictive validity of a screening test for court interpreters. Language Testing (London, UK) 22.4 (2005), 438–462.

05–329 Trites, Latricia (Murray State U, USA) & Mary McGroarty, Reading to learn and reading to integrate: new tasks for reading comprehension tests? Language Testing (London, UK) 22.2 (2005), 174–210.

05–330 Uiterwijk, Henny (Citogroep, Arnhem, the Netherlands) & Ton Vallen, Linguistic sources of item bias for second generation immigrants in Dutch tests. Language Testing (London, UK) 22.2 (2005), 211–234.

05–331 Weems, Gail H. (Arkansas Little Rock U, USA; [email protected]), Anthony J. Onwuegbuzie & Daniel Lustig, Profiles of respondents who respond inconsistently to positively- and negatively-worded items on rating scales. Evaluation & Research in Education (Clevedon, UK) 17.1 (2003), 45–60.

05–332 Weir, Cyril J. (Roehampton U, UK), Limitations of the Common European Framework for developing comparable examinations and tests. Language Testing (London, UK) 22.3 (2005), 281–300.

05–333 Xi, Xiaoming (U of California, USA), Do visual chunks and planning impact performance on the graph description task in the SPEAK exam? Language Testing (London, UK) 22.4 (2005), 463–508.


2012
Vol 17 (1)
pp. 55-70
Author(s):  
Prithvi Shrestha ◽  
Caroline Coffin

2020
Vol 29 (3)
pp. 1226-1240
Author(s):  
Janet L. Patterson ◽  
Barbara L. Rodríguez ◽  
Philip S. Dale

Purpose: Early identification is a key element for accessing appropriate services for preschool children with language impairment. However, there is a high risk of misidentifying typically developing dual language learners as having language impairment if inappropriate tools designed for monolingual children are used. In this study of children with bilingual exposure, we explored performance on brief dynamic assessment (DA) language tasks using graduated prompting, because this approach has potential applications for screening. We asked whether children's performance on DA language tasks earlier in the year was related to their performance on a year-end language achievement measure.

Method: Twenty 4-year-old children from Spanish-speaking homes attending Head Start preschools in the southwestern United States completed three DA graduated prompting language tasks 3–6 months prior to the Head Start preschools' year-end achievement testing. The DA tasks, Novel Adjective Learning, Similarities in Function, and Prediction, were administered in Spanish, but correct responses in English or Spanish were accepted. The year-end achievement measure, the Learning Accomplishment Profile–Third Edition (LAP3), was administered by the children's Head Start teachers, who also credited correct responses in either language.

Results: Children's performance on two of the three DA language tasks was significantly and positively related to year-end LAP3 language scores, and the relationship remained moderate and significant for one of the DA tasks even when controlling for age and initial LAP3 scores.

Conclusions: Although the relationship between DA performance and year-end performance varies across tasks, the findings indicate potential for using a graduated prompting approach to language screening with young dual language learners. Further research is needed to select the best tasks for administration in a graduated prompting framework and to determine the accuracy of identification of language impairment.


Author(s):  
Virginia L. Dubasik ◽  
Dubravka Svetina Valdivia

Purpose: The purpose of this study was to ascertain the extent to which school-based speech-language pathologists' (SLPs) assessment practices with individual English learners (ELs) align with federal legislation and professional practice guidelines. Specifically, we were interested in examining SLPs' use of multiple tools during individual EL assessments, as well as relationships between practices and the number of types of training experiences.

Method: School-based SLPs in a Midwestern state were recruited in person or via e-mail to complete an online survey pertaining to assessment. Of the 562 respondents who completed the survey, 222 (39.5%) indicated past or present experience with ELs, and their data were therefore included in the analyses. The questionnaire solicited information about respondents' demographics, caseload composition, perceived knowledge and skills, training experiences pertaining to working with ELs (e.g., graduate school, self-teaching, professional conferences), and assessment practices used in schools.

Results: The majority of respondents reported using multiple tools, rather than a single tool, with each EL they assess. Case history and observation were the tools used often or always by the largest number of participants. SLPs who used multiple tools reported using both direct tools (e.g., standardized tests, dynamic assessment) and indirect tools (e.g., case history, interviews). Analyses revealed low to moderate positive associations between tools, as well as between the use of speech-language samples and the number of types of training experiences.

Conclusions: School-based SLPs in the current study reported using EL assessment practices that comply with federal legislation and professional practice guidelines for EL assessment. These results enhance our understanding of school-based SLPs' assessment practices with ELs and may be indicative of a positive shift toward evidence-based practice.

