Applying Universal Design to Information Literacy

2009 ◽  
Vol 49 (1) ◽  
pp. 24-32 ◽  
Author(s):  
Lisa O’Connor ◽  
Ted Chodock ◽  
Elizabeth Dolinger

2017 ◽  
Vol 56 (2) ◽  
pp. 75 ◽  
Author(s):  
Esther Grassian ◽  
Sarah LeMire

In recent years, Reference and User Services Quarterly’s “Information Literacy and Instruction” column has covered diverse topics related to information literacy, including MOOCs, universal design, discovery layers, and, of course, assessment. This column has provided a space for librarians from all types of libraries to share how they are engaging with information literacy and instruction in their libraries, as well as to unpack the challenges they have faced. As new editors, we will continue to use this space as an opportunity to explore emerging topics in information literacy.

As co-editors of “Information Literacy and Instruction,” we bring our own perspectives and experiences to RUSQ, along with some overlapping interests. To better reflect our perspectives, we will alternate editorial responsibility for pieces published in this column, although both editors will provide feedback. Following is biographical information about each of us, as well as a lengthier description of our column interests.


2011 ◽  
Vol 14 (1) ◽  
pp. 21-28
Author(s):  
Mary J. Emm ◽  
Christine P. Cecconi

Clinical supervision is recognized as a distinctive area of practice and expertise, yet professional preparation in this area remains inadequate. This paper presents functional information describing the development and implementation of an experimental course on administration, supervision, and private practice, based on graduate student perceptions and preferences for course content and types of learning activities. Current pedagogical trends toward universal design for learning and fostering student engagement were emphasized, including problem-based and collaborative learning. Results suggest that students were highly pleased with the course content, the interactive and group activities, and the assessment procedures used.


Diagnostica ◽  
2020 ◽  
Vol 66 (3) ◽  
pp. 147-157
Author(s):  
Martin Senkbeil ◽  
Jan Marten Ihme

Abstract. ICT literacy lends itself to performance-based assessment using simulated, interactive test tasks. This article examines whether multiple-choice (MC) items measure a construct comparable to the one measured by simulation tasks. To this end, the results of two instruments from current large-scale studies, administered to N = 2,075 adolescents, were compared: the MC-based ICT literacy test for grade 9 of the National Educational Panel Study (NEPS) and the simulation-based competence test of the international assessment ICILS 2013 (International Computer and Information Literacy Study). The analyses support the validity of the construct interpretation of the MC-based test in NEPS. Consistent with convergent evidence, the MC items correlate substantially with the computer- and simulation-based tasks in ICILS 2013 (.68 ≤ r ≤ .90). Furthermore, both tests show positive and comparably high correlations with ICT-related student characteristics (e.g., self-efficacy). Additional analyses of the relationship with general cognitive abilities also show that ICT literacy and basic cognitive abilities represent distinct factors.


2007 ◽  
Author(s):  
Suzanne Lipu ◽  
Kirsty Williamson ◽  
Annemaree Lloyd