Rules of Program Behavior

2021 ◽  
Vol 21 (4) ◽  
pp. 1-37
Author(s):  
Rodrigo Duran ◽  
Juha Sorva ◽  
Otto Seppälä

We propose a framework for identifying, organizing, and communicating learning objectives that involve program semantics. In this framework, detailed learning objectives are written down as rules of program behavior (RPBs). RPBs are teacher-facing statements that describe what needs to be learned about the behavior of a specific sort of programs. Different programming languages, student cohorts, and contexts call for different RPBs. Instructional designers may define progressions of RPB rulesets for different stages of a programming course or curriculum; we identify evaluation criteria for RPBs and discuss tradeoffs in RPB design. As a proof-of-concept example, we present a progression of rulesets designed for teaching beginners how expressions, variables, and functions work in Python. We submit that the RPB framework is valuable to practitioners and researchers as a tool for design and communication. Within computing education research, the framework can inform, among other things, the ongoing exploration of “notional machines” and the design of assessments and visualizations. The theoretical work that we report here lays a foundation for future empirical research that compares the effectiveness of RPB rulesets as well as different methods for teaching a particular ruleset.
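The abstract's proof-of-concept concerns how expressions, variables, and functions behave in Python. Purely as an illustration (the paper's actual rulesets are not reproduced here), a hypothetical RPB about assignment, together with a snippet of the behavior it describes, might look like:

```python
# Hypothetical RPB (not taken from the paper):
# "An assignment statement first evaluates the expression on the right,
#  then binds the name on the left to the resulting value. Later
#  rebindings of one name do not affect other names."

x = 2 + 3   # the right-hand side evaluates to 5 before x is bound
y = x       # y is bound to the current value of x, i.e. 5
x = 10      # rebinding x does not change y
print(y)    # 5
```

A teacher-facing rule of this kind would then be refined per language, cohort, and course stage, as the framework proposes.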

2020 ◽  
Vol 4 (Supplement_2) ◽  
pp. 604-604
Author(s):  
Sara Police ◽  
Jessie Hoffman

Abstract Objectives The purpose of this project was to design, develop and implement an online two-credit course, Drug & Nutrient Interactions, as an elective for a new online Graduate Certificate in Applied Nutrition and Culinary Medicine at the University of Kentucky. Methods Drug & Nutrient Interactions was designed to meet the needs of select student cohorts: undergraduate Pharmacology minors, graduate students enrolled in the Masters in Nutritional Sciences program, and online graduate certificate students. Faculty within the Dept. of Pharmacology and Nutritional Sciences and the Division of Clinical Nutrition were consulted to identify curricular gaps and to avoid redundancy across programs. Instructional designers were consulted to identify evidence-based best practices in online course design and teaching. Results Content of the Drug and Nutrient Interactions course is structured within four thematic modules: 1. Introduction to Pharmacology and Food & Drug Interactions, 2. Exploring Drug-Nutrient Interactions, 3. Genes, Bugs & Time, and 4. Current and Future Directions in Nutrition & Pharmacology. Each module is three to four weeks in duration, to span a 14-week semester. Each week, students’ tasks include reading, watching, writing, and reviewing content related to the student learning objectives. Methods to promote student engagement with the content recur week-to-week, to ensure consistency for students’ experience. An eBook was written by the instructors to provide a current and interdisciplinary review of the intersections of nutritional sciences and pharmacology in the course. In lieu of proctored online exams, module-level assignments assess students’ achievement of learning outcomes. Drug & Nutrient Interactions launched in fall 2019 with nine students enrolling and completing the course. Course analytics track student engagement by logging page views and participation. 
Increases in students’ page views and participation aligned with due dates for module assignments. Therefore, in spring 2020, deadlines were shifted to a weekly timeline to foster consistent engagement. Conclusions Instructors should explore various methods to foster student-content, student-student, and student-instructor engagement in an online learning environment. Funding Sources This course project was funded by a UK Online award and an Alternative Textbook grant.


Author(s):  
Marian Petre ◽  
Kate Sanders ◽  
Robert McCartney ◽  
Marzieh Ahmadzadeh ◽  
Cornelia Connolly ◽  
...  

2021 ◽  
Vol 53 (1) ◽  
pp. 6-6
Author(s):  
Amy Ko ◽  
Renée McCauley ◽  
Jan Vahrenhold ◽  
Matthias Hauswirth

You are warmly invited to the 17th Annual ACM International Computing Education Research (ICER) conference (https://icer2021.acm.org), held online the week of 16 August 2021.


Author(s):  
Thomas F. C. Woodhall ◽  
David S. Strong

Education research strongly links methods of course assessment with the student learning process. In open-ended engineering design courses, assessment based on student deliverables as “product” may focus student attention on a content checklist rather than on effectively learning the processes and techniques that are critical to professional engineering practice. By developing a rubric assessment scheme that relates directly to the course learning objectives and sharing it openly with students, it is proposed that students are more likely to achieve deeper learning about the process of engineering design.


2022 ◽  
Vol 22 (1) ◽  
pp. 1-46
Author(s):  
Sarah Heckman ◽  
Jeffrey C. Carver ◽  
Mark Sherriff ◽  
Ahmed Al-zubidy

Context. Computing Education Research (CER) is critical to help the computing education community and policy makers support the increasing population of students who need to learn computing skills for future careers. For a community to systematically advance knowledge about a topic, the members must be able to understand published work thoroughly enough to perform replications, conduct meta-analyses, and build theories. There is a need to understand whether published research allows the CER community to systematically advance knowledge and build theories. Objectives. The goal of this study is to characterize the reporting of empiricism in Computing Education Research literature by identifying whether publications include content necessary for researchers to perform replications, meta-analyses, and theory building. We answer three research questions related to this goal: (RQ1) What percentage of papers in CER venues have some form of empirical evaluation? (RQ2) Of the papers that have empirical evaluation, what are the characteristics of the empirical evaluation? (RQ3) Of the papers that have empirical evaluation, do they follow norms (both for inclusion and for labeling of information needed for replication, meta-analysis, and, eventually, theory-building) for reporting empirical work? Methods. We conducted a systematic literature review of the 2014 and 2015 proceedings or issues of five CER venues: Technical Symposium on Computer Science Education (SIGCSE TS), International Symposium on Computing Education Research (ICER), Conference on Innovation and Technology in Computer Science Education (ITiCSE), ACM Transactions on Computing Education (TOCE), and Computer Science Education (CSE). We developed and applied the CER Empiricism Assessment Rubric to the 427 papers accepted and published at these venues over 2014 and 2015. Two people evaluated each paper using the Base Rubric for characterizing the paper. 
An individual reviewer applied the other rubrics to characterize the norms of reporting, as appropriate for the paper type. Any discrepancies or questions were resolved through discussion among multiple reviewers. Results. We found that over 80% of papers accepted across all five venues had some form of empirical evaluation. Quantitative evaluation methods were the most frequently reported. Papers most frequently reported results on interventions around pedagogical techniques, curriculum, community, or tools. Papers were split on whether they included some type of comparison between an intervention and some other dataset or baseline. Most papers reported related work, following the expectations for doing so in the SIGCSE and CER community. However, many papers lacked properly reported research objectives, goals, research questions, or hypotheses; descriptions of participants; study design; data collection; and threats to validity. These results align with prior surveys of the CER literature. Conclusions. CER authors are contributing empirical results to the literature; however, not all norms for reporting are met. We encourage authors to provide clear, labeled details about their work so readers can use the study methodologies and results for replications and meta-analyses. As our community grows, our reporting of CER should mature to help establish computing education theory to support the next generation of computing learners.

