A Package Deal: Disciplinary Climate in Science Classes, Science Dispositions and Science Literacy

2021 ◽  
Vol 13 (24) ◽  
pp. 13857
Author(s):  
Larry J. Grabau ◽  
Jari Lavonen ◽  
Kalle Juuti

Finland’s educational prowess, though tempered by recent international assessments, has remained intact. This report focused on lessons that could be learned regarding secondary-level science education from the Program for International Student Assessment (PISA) 2015 science-focused assessment. That PISA iteration included not only science literacy but also students’ science dispositions (epistemology, enjoyment, interest, and self-efficacy) and schools’ science climate measures (disciplinary climate and teaching support). Due to the hierarchical nature of the PISA data, multilevel models were employed in this Finnish study, involving 5582 students from 167 schools. Science dispositions (as outcome measures) were differently associated with teaching support and disciplinary climate (epistemology with neither; enjoyment and interest with both). Science literacy (as an outcome measure) was associated with all four science dispositions, whether modeled with each disposition separately or all four simultaneously. Science literacy was also associated with the disciplinary climate in science classes in all tested models. We concluded that, in the Finnish context, science dispositions and disciplinary climate were predictive of science literacy. Furthermore, we presented evidence from the literature indicating that these conclusions may well extend to other international contexts.
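Multilevel models are the natural choice here because students are nested within schools, and the share of score variance that lies between schools is exactly what such models capture. A minimal sketch of the usual first diagnostic, the intraclass correlation coefficient (ICC), assuming equal numbers of sampled students per school; all scores are invented, not PISA data:

```python
# Rough variance decomposition for students nested in schools.
# All scores are invented; equal group sizes keep the arithmetic simple.
from statistics import mean

scores = {  # school -> science scores of its sampled students
    "school_a": [512, 498, 530, 505],
    "school_b": [455, 470, 460, 449],
    "school_c": [540, 555, 548, 560],
}

grand_mean = mean(s for group in scores.values() for s in group)
school_means = {school: mean(vals) for school, vals in scores.items()}

# Between-school variance: spread of school means around the grand mean.
var_between = mean((m - grand_mean) ** 2 for m in school_means.values())
# Within-school variance: spread of students around their own school mean.
var_within = mean(
    (s - school_means[school]) ** 2
    for school, vals in scores.items()
    for s in vals
)

icc = var_between / (var_between + var_within)
print(f"ICC = {icc:.2f}")  # share of variance attributable to schools
```

An ICC well above zero means single-level regression would understate standard errors, which is why studies of this kind fit multilevel models rather than pooling all students.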

2019 ◽  
Vol 20 (1) ◽  
pp. 45-65
Author(s):  
Sam P.E. Hopp

The Programme for International Student Assessment (PISA) scores are a leading international measure of achievement. This study reviews German 2015 PISA data and imputes scores on income and time in nation to provide comparisons between native, immigrant and refugee students. This quantitative study uses cultural capital to explain the association of independent variables to PISA scores for students, revealing an unexpected negative linear relationship between those variables. The results and significance of this study may assist those involved in policy for refugee populations and inform the strategies of test protocols and measures in a new global student paradigm.


2021 ◽  
Author(s):  
Kylie Hillman ◽  
Sue Thomson

Australia was one of nine countries and economies to participate in the 2018 TALIS-PISA link study, together with Ciudad Autónoma de Buenos Aires (Argentina), Colombia, the Czech Republic, Denmark, Georgia, Malta, Turkey and Viet Nam. This study involved coordinating the samples of schools that participated in the Program for International Student Assessment (PISA, a study of the performance of 15-year-old students) and the Teaching and Learning International Survey (TALIS, a study that surveys teachers and principals in lower secondary schools) in 2018. A sample of teachers from schools that were selected to participate in PISA were invited to respond to the TALIS survey. TALIS data provides information regarding the background, beliefs and practices of lower secondary teachers and principals, and PISA data delivers insights into the background characteristics and cognitive and non-cognitive skills of 15-year-old students. Linking these data offers an internationally comparable dataset combining information on key education stakeholders. This report presents results of analyses of the relationships between teacher and school factors and student outcomes, such as performance on the PISA assessment, expectations for further study and experiences of school life. Results for Australia are presented alongside the average (mean) across all countries and economies that participated in the TALIS-PISA link study for comparison, but the focus remains on which relationships were significant among Australian students.


Author(s):  
Betül Alatlı

This study aimed to investigate cross-cultural measurement invariance of the PISA (Programme for International Student Assessment) 2015 science literacy test and its items, and to carry out a bias study on the items that violated measurement invariance. The study used a descriptive review model. The sample consisted of 2224 students from Australia, France, Singapore, and Turkey who took the S12 test booklet. Measurement invariance analyses for the test were done using Multi-Group Confirmatory Factor Analysis (MGCFA). Differential Item Functioning (DIF), in other words measurement invariance of the test items, was analyzed using the item response theory log-likelihood ratio (IRTLR), Hierarchical Generalized Linear Model (HGLM), and Simultaneous Item Bias Test (SIBTEST) methods. According to the findings, the test exhibited structural invariance across cultures. The highest proportion of items showing DIF, 35%, was observed in the Australia-Singapore and Australia-France comparisons. The proportion of DIF items in pairwise comparisons involving Turkey, the only country taking a translated form, was 24%, which did not differ significantly from the other comparisons. The lowest proportion of DIF items, 12%, was obtained from the Singapore-France comparison, while the France-Turkey comparison yielded 18%. On the other hand, 35% of the items showed cross-cultural measurement invariance.
An item bias study was then carried out, based on expert opinions, on items identified and released as showing DIF in the comparisons of Turkey with Australia and Singapore. According to the findings, translation-related differences in the items, a culture group's familiarity with item content, polysemy in the words or expressions used in the items, and the format or stylistic characteristics of the items were determined to be the causes of bias in the skills measured by the items.
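The DIF methods named in the abstract (IRTLR, HGLM, SIBTEST) require specialized software, but the shared idea is simple: after matching examinees on overall ability, an item should be equally hard for both groups. A toy sketch using the classic Mantel-Haenszel statistic (a stand-in illustration, not one of the study's methods; all counts are invented):

```python
# Toy Mantel-Haenszel DIF check: within each band of matched total scores,
# compare the odds of answering the item correctly across two groups.
# All counts are invented for illustration.
strata = {
    # score band: (ref_right, ref_wrong, focal_right, focal_wrong)
    "low":  (30, 70, 20, 80),
    "mid":  (60, 40, 45, 55),
    "high": (85, 15, 70, 30),
}

num = den = 0.0
for a, b, c, d in strata.values():
    n = a + b + c + d
    num += a * d / n  # reference right, focal wrong
    den += b * c / n  # reference wrong, focal right

alpha_mh = num / den  # common odds ratio; values near 1.0 suggest no DIF
print(f"MH common odds ratio = {alpha_mh:.2f}")
```

Here the ratio is well above 1, meaning the item favors the reference group even among examinees of comparable ability, the pattern a DIF analysis flags for follow-up bias review.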


Author(s):  
Luisa Araujo ◽  
Andrea Saltelli ◽  
Sylke V. Schnepf

Purpose – Since the publication of its first results in 2000, the Programme for International Student Assessment (PISA) implemented by the OECD has repeatedly been the subject of heated debate. In late 2014 controversy flared up anew, with the most severe critics going so far as to call for a halt to the programme. The purpose of this paper is to discuss the methodological design of PISA and the ideological basis of scientific and policy arguments invoked for and against it.

Design/methodology/approach – The authors examine the soundness of the survey methodology and identify the conflicting interpretations and values fuelling the debate.

Findings – The authors find that while PISA has promoted a worldwide focus on the important subject of children's education, there are legitimate concerns about what PISA measures, and how. The authors conclude that the OECD should be more transparent in documenting the methodological choices that underlie the creation of the data and more explicit about the impact of these choices on the results. More broadly, the authors advise caution in attempts to derive and apply evidence-based policy in the domain of education, and propose an alternative model of social inquiry that is sensitive and robust to the concerns of the various actors and stakeholders involved in a given policy domain.

Originality/value – The issues and tensions surrounding the PISA survey can be better understood in the framework of post-normal science (PNS), whose application to the PISA controversy offers a potential solution to a stalemate.


Author(s):  
Antonella D’Agostino ◽  
Francesco Schirripa Spagnolo ◽  
Nicola Salvati

Using the Programme for International Student Assessment (PISA) 2015 data for Italy, this paper offers a complete overview of the relationship between test anxiety and school performance by studying how anxiety affects the performance of students along the overall conditional distribution of mathematics, literature and science scores. We aim to indirectly measure whether higher goals increase test anxiety, starting from the hypothesis that high-skilled students generally set themselves high goals. We use an M-quantile regression approach that allows us to take into account the hierarchical structure and sampling weights of the PISA data. There is evidence of a negative and statistically significant relationship between test anxiety and school performance. The size of the estimated association is greater at the upper tail of the distribution of each score than at the lower tail. Therefore, our results suggest that high-performing students are more affected than low-performing students by emotional reactions to tests and school-work anxiety.
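Quantile-based methods such as this rest on the asymmetric "pinball" loss, ρ_τ(u) = u·(τ − 1[u < 0]), which penalizes under- and over-prediction unequally. A simplified, self-contained illustration (ordinary quantile loss, not the paper's M-quantile estimator, which additionally robustifies it with an influence function) showing that minimizing this loss over invented scores recovers the τ-th sample quantile:

```python
# The tau-th sample quantile minimizes total asymmetric ("pinball") loss:
# rho_tau(u) = u * tau if u >= 0 else u * (tau - 1).
def pinball(u, tau):
    return u * tau if u >= 0 else u * (tau - 1)

scores = [430, 455, 480, 500, 510, 525, 540, 560, 590]  # invented test scores

def best_fit(tau):
    # Grid-search over the sample for the point minimizing total loss.
    return min(scores, key=lambda q: sum(pinball(s - q, tau) for s in scores))

print(best_fit(0.5))   # 510: the sample median
print(best_fit(0.75))  # 540: an upper-tail location of the distribution
```

Fitting separate regressions at several values of τ is what lets the paper compare the anxiety-performance association at the upper and lower tails of the score distribution rather than only at the mean.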


2021 ◽  
Vol 9 (1) ◽  
Author(s):  
Plamen V. Mirazchiyski

This paper presents the R Analyzer for Large-Scale Assessments (RALSA), a newly developed R package for analyzing data from studies using complex sampling and assessment designs. Such studies are, for example, the IEA's Trends in International Mathematics and Science Study and the OECD's Programme for International Student Assessment. The package covers all cycles from a broad range of studies. The paper presents the architecture of the package and the overall workflow, and illustrates some basic analyses using it. The package is open-source and free of charge. Other software packages for analyzing large-scale assessment data exist, some of them proprietary, others open-source. However, RALSA is the first comprehensive package designed for the user experience, and it has some distinctive features. One innovation is that the package can convert SPSS data from large-scale assessments into native R data sets. It can also do so for PISA data from cycles prior to 2015, where the data are provided in tab-delimited text files along with SPSS control syntax files. Another feature is the availability of a graphical user interface, which is also written in R and operates in any operating system where a full copy of R can be installed. The output from any analysis function is written into an MS Excel workbook with multiple sheets for the estimates, model statistics, analysis information and the calling syntax itself, for reproducing the analysis in the future. The flexible design of RALSA allows for the quick addition of new studies, analysis types and features.
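Much of what such packages automate is replicate-weight variance estimation. PISA, for example, supplies 80 balanced repeated replication (BRR) weights computed with Fay's adjustment factor k = 0.5, and the sampling variance of an estimate θ is Σ_r (θ_r − θ)² / (G(1 − k)²) across the G = 80 replicates. A minimal sketch with invented scores and weights, simulating rather than reading real replicate weights (this is an illustration of the computation, not the package's API):

```python
# Sketch of BRR variance estimation as used for PISA statistics.
# Scores and weights are invented; real PISA files ship 80 precomputed
# replicate weights, which are only crudely simulated here by random
# perturbation of the final weights.
import random

random.seed(1)

scores  = [505, 480, 532, 460, 518, 495, 470, 540, 510, 488, 525, 450]
weights = [random.uniform(80, 120) for _ in scores]  # final student weights

def weighted_mean(vals, wts):
    return sum(v * w for v, w in zip(vals, wts)) / sum(wts)

theta = weighted_mean(scores, weights)  # full-sample estimate

G, k = 80, 0.5  # 80 replicates, Fay adjustment factor 0.5
replicate_estimates = []
for _ in range(G):
    # Fay's method rescales weights by k or (2 - k) in each replicate.
    rep_w = [w * random.choice((k, 2 - k)) for w in weights]
    replicate_estimates.append(weighted_mean(scores, rep_w))

variance = sum((t - theta) ** 2 for t in replicate_estimates) / (G * (1 - k) ** 2)
se = variance ** 0.5
print(f"estimate = {theta:.1f}, SE = {se:.1f}")
```

Doing this by hand for every statistic is tedious and error-prone, which is the case for dedicated analysis packages that apply the correct weights and replication scheme automatically.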


2015 ◽  
Vol 117 (1) ◽  
pp. 1-10
Author(s):  
Nancy Perry ◽  
Kadriye Ercikan

The Programme for International Student Assessment (PISA) was designed by the Organisation for Economic Co-operation and Development (OECD) to evaluate the quality, equity, and efficiency of school systems around the world. Specifically, PISA has assessed 15-year-old students’ reading, mathematics, and science literacy on a three-year cycle since 2000. PISA also collects information about how those outcomes are related to key demographic, social, economic, and educational variables. However, the preponderance of reports involving PISA data focus on achievement variables and cross-national comparisons of them. Challenges in evaluating the achievement of students from different cultural and educational settings, and data concerning students’ approaches to learning, motivation for learning, and opportunities for learning, are rarely reported. A main goal of this themed issue of Teachers College Record (TCR) is to move the conversation about PISA data beyond achievement to include factors that affect achievement (e.g., SES, home environment, strategy use). We also asked authors to consider how international assessment data can be used to improve learning and education, and what inferences from the data are appropriate versus inappropriate. In this introduction, we synthesize the six articles in this issue and the themes that cut across them. We also examine challenges associated with using data from international assessments, like PISA, to inform education policy and practice within and across countries. We conclude with recommendations for collecting and using data from international assessments to inform research, policy, and teaching and learning.


2017 ◽  
Vol 16 (6) ◽  
pp. 869-884
Author(s):  
Christina E Mølstad ◽  
Daniel Pettersson ◽  
Eva Forsberg

This study investigates knowledge structures and scientific communication using bibliometric methods to explore scientific knowledge production and dissemination. The aim is to develop knowledge about this growing field by investigating studies using international large-scale assessment (ILSA) data, with a specific focus on those using Programme for International Student Assessment (PISA) data. As international organisations use ILSA to measure, assess and compare the success of national education systems, it is important to study this specific knowledge to understand how it is organised and legitimised within research. The findings show an interchange of legitimisation, where major actors from the USA and other English-speaking and westernised countries determine the academic discourse. Important epistemic cultures for PISA research are identified, the most important of which are situated within psychology and education. These two research environments are epicentres created by patterns of the referrals to and referencing of articles framing the formulation of PISA knowledge. Finally, it is argued that this particular PISA research is self-referential and self-authorising, which raises the question of whether research accountability leads to ‘a game of thrones’: a rivalry within the scientific field over how and on what grounds ‘facts’ and ‘truths’ are constructed, a continuing process with no obvious winner.


Author(s):  
Deborah A. Cobb-Clark ◽  
Mathias Sinning ◽  
Steven Stillman

The authors use 2009 Programme for International Student Assessment (PISA) data to link institutional arrangements in OECD countries to disparities in reading, math, and science test scores for migrant and native-born students. The authors find that achievement gaps are larger for migrant youths who arrive at older ages and for those who do not speak the language of the PISA test at home. Institutional arrangements often serve to mitigate the achievement gaps of some migrant students while leaving unaffected or exacerbating those of others. For example, earlier school starting ages help migrant youths in some cases but by no means in all. Limited tracking of students by ability appears to be beneficial for migrants’ relative achievement, while complete tracking and the presence of a large private school sector appear to be detrimental. Migrant students’ achievement, relative to their native-born peers, suffers as educational spending and teachers’ salaries increase, but it improves when teacher evaluation includes an examination component.

