An Inventory of Nursing Education Research

Author(s): Olive J Yonge, Marjorie Anderson, Joanne Profetto-McGrath, Joanne K Olson, D. Lynn Skillen, ...

Purpose: To describe the nursing education research literature in terms of quality, content areas under investigation, geographic location of the research, research designs used, sample sizes, instruments used to collect data, and funding sources.

Design and Methods: Quantitative and qualitative research literature published between January 1991 and December 2000 was identified and classified using an author-generated Relevance Tool.

Findings: 1,286 articles were accepted and entered into the inventory, and an additional 22 were retained as references because they were either literature reviews or meta-analyses. Not surprisingly, 90% of nursing education research was generated in North America and Europe, the industrialised parts of the world. Of the articles accepted into the inventory, 61% reported quantitative research. The bulk of the research was conducted within the confines of a single course or program, with more than half based in educational settings. Sample sizes were diverse, with a bare majority of studies using samples of between 50 and 99 participants. More than half of the studies used questionnaires to collect data. Surprisingly, 80% of the research represented in these articles was not funded. The number of nursing education research publications stabilised at approximately 120 per year.

Conclusion: Research programs on teaching and learning environments and practice in nursing education need to be developed. Lobbying is needed to increase funding for this type of research at national and international levels.
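
The tabulation behind findings like these (coding each article by design, setting, sample-size band, and funding, then summarising the distributions) can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' Relevance Tool; the field names, categories, and records below are assumptions.

    from collections import Counter

    # Hypothetical coded records; the actual inventory was built with the
    # authors' Relevance Tool and its own fields and category definitions.
    articles = [
        {"design": "quantitative", "setting": "educational", "sample_size": 72, "funded": False},
        {"design": "qualitative", "setting": "clinical", "sample_size": 18, "funded": True},
        {"design": "quantitative", "setting": "educational", "sample_size": 140, "funded": False},
    ]

    def sample_band(n):
        # Band the sample size the way the abstract reports it (e.g. 50-99).
        if n < 50:
            return "<50"
        if n < 100:
            return "50-99"
        return ">=100"

    design_counts = Counter(a["design"] for a in articles)
    band_counts = Counter(sample_band(a["sample_size"]) for a in articles)
    unfunded_share = sum(not a["funded"] for a in articles) / len(articles)
    print(design_counts, band_counts, f"unfunded: {unfunded_share:.0%}")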

Aquichan, 2021, Vol 21 (4), pp. 1-11
Author(s): Wilson Cañón-Montañez

Systematic reviews and meta-analyses are helpful methodological alternatives that combine, discuss, and assess the quality of the best available evidence through adequate and exhaustive literature searches. In the last decade, there has been an increase in systematic reviews and meta-analyses in nursing research. This article reflects on the contributions of systematic reviews and meta-analyses to nursing education, research, and practice. Synthesizing the evidence through high-quality systematic reviews and meta-analyses adds to the disciplinary development of nursing; therefore, students and professionals in the field should be encouraged to employ these methodological tools in education and research and to implement their results in clinical practice to make better decisions about patients' individual needs.
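
As a concrete illustration of the pooling step at the heart of a meta-analysis, the sketch below computes a fixed-effect, inverse-variance pooled estimate and its 95% confidence interval from hypothetical study effect sizes and standard errors. It is a minimal example under those assumptions, not a substitute for dedicated meta-analysis software or the methods of any particular review.

    import math

    # Hypothetical effect sizes and standard errors from three studies;
    # the values are illustrative only.
    studies = [(0.40, 0.15), (0.25, 0.10), (0.55, 0.20)]

    # Fixed-effect (inverse-variance) pooling: weight each study by 1/SE^2.
    weights = [1 / se**2 for _, se in studies]
    pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))

    # 95% confidence interval for the pooled estimate.
    low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"pooled effect = {pooled:.3f}, 95% CI = ({low:.3f}, {high:.3f})")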


Author(s): Tammy Lynn McClenny

Objective: Phenomenography is a qualitative research method used to explore the different ways individuals experience phenomena. Over the last few decades, use of the method has grown in various higher education disciplines; however, its use in nursing education research was limited until early in the 21st century. Therefore, the purpose of this article is to describe the main ideas of phenomenography methodology and provide a simplified integrative review of its contributions to nursing education research.

Methods: Whittemore and Knafl's (2005) five-step integrative review process was used to guide the literature search and evaluate the findings of works published between 2009 and 2019.

Results: Thirteen articles, including one comprehensive literature review, were identified in the literature search.

Conclusion: Findings illustrated that phenomenography has been used to evaluate and improve teaching and learning principles, curricula, professional development, and educational practices within nursing education, and to examine complex faculty and student issues. Phenomenographic research can be instrumental in providing a more realistic view of individuals' differing experiences of nursing education phenomena.
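
The literature-search-and-screening portion of an integrative review can be sketched as a simple filter over candidate records. The date range follows the abstract (works published between 2009 and 2019); the records and inclusion criterion below are hypothetical and do not reproduce the full Whittemore and Knafl process.

    # Hypothetical candidate records from a database search; only year and a
    # relevance flag are modelled, not the full set of review criteria.
    candidates = [
        {"title": "Study A", "year": 2011, "about_nursing_education": True},
        {"title": "Study B", "year": 2006, "about_nursing_education": True},
        {"title": "Study C", "year": 2018, "about_nursing_education": False},
    ]

    included = [
        c for c in candidates
        if 2009 <= c["year"] <= 2019 and c["about_nursing_education"]
    ]
    print(f"screened {len(candidates)}, included {len(included)}")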


2022, Vol 22 (1), pp. 1-46
Author(s): Sarah Heckman, Jeffrey C. Carver, Mark Sherriff, Ahmed Al-zubidy

Context. Computing Education Research (CER) is critical to help the computing education community and policy makers support the increasing population of students who need to learn computing skills for future careers. For a community to systematically advance knowledge about a topic, its members must be able to understand published work thoroughly enough to perform replications, conduct meta-analyses, and build theories. There is a need to understand whether published research allows the CER community to systematically advance knowledge and build theories.

Objectives. The goal of this study is to characterize the reporting of empiricism in the Computing Education Research literature by identifying whether publications include the content necessary for researchers to perform replications, meta-analyses, and theory building. We answer three research questions related to this goal: (RQ1) What percentage of papers in CER venues have some form of empirical evaluation? (RQ2) Of the papers that have empirical evaluation, what are the characteristics of the empirical evaluation? (RQ3) Of the papers that have empirical evaluation, do they follow norms (both for inclusion and for labeling of information needed for replication, meta-analysis, and, eventually, theory building) for reporting empirical work?

Methods. We conducted a systematic literature review of the 2014 and 2015 proceedings or issues of five CER venues: Technical Symposium on Computer Science Education (SIGCSE TS), International Symposium on Computing Education Research (ICER), Conference on Innovation and Technology in Computer Science Education (ITiCSE), ACM Transactions on Computing Education (TOCE), and Computer Science Education (CSE). We developed the CER Empiricism Assessment Rubric and applied it to the 427 papers accepted and published at these venues over 2014 and 2015. Two reviewers evaluated each paper using the Base Rubric to characterize the paper. A single reviewer applied the other rubrics to characterize the norms of reporting, as appropriate for the paper type. Any discrepancies or questions were discussed among the reviewers until resolved.

Results. We found that over 80% of the papers accepted across all five venues had some form of empirical evaluation. Quantitative evaluation methods were the most frequently reported. Papers most frequently reported results on interventions involving pedagogical techniques, curriculum, community, or tools. Papers were split between those that included some type of comparison between an intervention and another dataset or baseline and those that did not. Most papers reported related work, following the expectations for doing so in the SIGCSE and CER community. However, many papers lacked properly reported research objectives, goals, research questions, or hypotheses; descriptions of participants; study design; data collection; and threats to validity. These results align with prior surveys of the CER literature.

Conclusions. CER authors are contributing empirical results to the literature; however, not all norms for reporting are met. We encourage authors to provide clear, labeled details about their work so readers can use the study methodologies and results for replications and meta-analyses. As our community grows, our reporting of CER should mature to help establish computing education theory to support the next generation of computing learners.
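
The two-rater workflow described in the methods (independent classification with the Base Rubric, then discussion of discrepancies) can be sketched as follows. The labels and records are hypothetical, and this is not the actual CER Empiricism Assessment Rubric, which covers far more categories.

    # Hypothetical Base Rubric labels assigned independently by two raters.
    rater_a = {"paper-01": "empirical", "paper-02": "experience report", "paper-03": "empirical"}
    rater_b = {"paper-01": "empirical", "paper-02": "empirical", "paper-03": "empirical"}

    # Flag disagreements for discussion, mirroring the resolution step
    # described in the study's methods.
    discrepancies = [p for p in rater_a if rater_a[p] != rater_b[p]]
    print("needs discussion:", discrepancies)

    # Share of papers both raters classified as having an empirical evaluation.
    agreed_empirical = sum(rater_a[p] == rater_b[p] == "empirical" for p in rater_a)
    print(f"empirical (agreed): {agreed_empirical / len(rater_a):.0%}")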


2017, Vol 68 (1), pp. 63-79
Author(s): Ellen Boeren

An examination of articles published in leading adult education journals demonstrates that qualitative research dominates. To better understand this situation, the author reviewed journal articles reporting on quantitative research. Differences in methodological strengths and weaknesses between quantitative and qualitative research are discussed, followed by a data mining exercise on 1,089 journal articles published in Adult Education Quarterly, Studies in Continuing Education, and the International Journal of Lifelong Education. A categorization of quantitative adult education research is presented, as well as a critical discussion of why quantitative research does not seem to be widespread in the key adult education journals.
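
The data-mining step (labelling each of the 1,089 articles by methodological approach and journal, then tallying the distribution) can be illustrated with a minimal sketch. The journal names come from the abstract; the records and labels below are hypothetical.

    from collections import Counter

    # Hypothetical (journal, methodology) labels for a handful of articles;
    # the actual exercise classified 1,089 articles across the three journals.
    records = [
        ("Adult Education Quarterly", "qualitative"),
        ("Studies in Continuing Education", "qualitative"),
        ("International Journal of Lifelong Education", "quantitative"),
        ("Adult Education Quarterly", "mixed methods"),
    ]

    by_method = Counter(method for _, method in records)
    by_journal = Counter(journal for journal, _ in records)
    print(by_method)
    print(by_journal)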


2016, Vol 55 (7), pp. 363-364
Author(s): Linda Flynn, Pamela Ironside, Michael Yedidia, Christine A. Tanner, Theresa (Terry) Valiga
