The impact of tangible user interfaces on spatial cognition during collaborative design

2008 ◽  
Vol 29 (3) ◽  
pp. 222-253 ◽  
Author(s):  
Mi Jeong Kim ◽  
Mary Lou Maher

2021 ◽  
pp. 073563312110272
Author(s):  
Neila Chettaoui ◽  
Ayman Atia ◽  
Med Salim Bouhlel

Embodied learning pedagogy highlights the interconnections between the brain, the body, and the concrete environment. As a teaching method, it provides means of engaging the physical body in multimodal learning experiences to develop students’ cognitive processes. Based on this perspective, several research studies have introduced different interaction modalities to support the implementation of an embodied learning environment. One such case is the use of tangible user interfaces and motion-based technologies. This paper evaluates the impact of motion-based interaction, tangible-based interaction, and a multimodal interaction combining tangible interfaces with motion-based technology on students’ learning performance. A controlled study was performed at a primary school with 36 participants (aged 7 to 9) to evaluate the educational potential of embodied interaction modalities compared to tablet-based learning. The results highlighted a significant difference in learning gains between the groups, as determined by a one-way ANOVA [F(3, 32) = 6.32, p = .017], in favor of the multimodal learning interface. Findings revealed that a multimodal learning interface supporting richer embodied interaction, one that takes advantage of the affordances of body movements and the manipulation of physical objects, may improve students’ understanding of abstract concepts in educational contexts.
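
As an illustration of the kind of analysis reported above, a one-way ANOVA on learning-gain scores across the four groups (motion-based, tangible-based, multimodal, and tablet-based) can be computed with SciPy's f_oneway; the scores below are hypothetical placeholders, not the study's data.

```python
# Hypothetical sketch: one-way ANOVA over learning gains of four groups
# (motion-based, tangible-based, multimodal, tablet control), 9 pupils
# per group. The scores are made-up placeholders, not data from the study.
from scipy.stats import f_oneway

motion     = [12, 15, 11, 14, 13, 16, 12, 14, 15]
tangible   = [14, 16, 13, 15, 17, 14, 16, 15, 13]
multimodal = [18, 20, 19, 17, 21, 19, 18, 20, 19]
tablet     = [10, 12, 11, 9, 13, 10, 11, 12, 10]

# With 4 groups of 9 participants, the degrees of freedom are (3, 32).
f_stat, p_value = f_oneway(motion, tangible, multimodal, tablet)
print(f"F(3, 32) = {f_stat:.2f}, p = {p_value:.3f}")
```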


Author(s):  
John Quarles ◽  
Samsun Lampotang ◽  
Ira Fischler ◽  
Paul Fishwick ◽  
Benjamin Lok

Author(s):  
Boon Yih Mah ◽  
Suzana Ab Rahim

The use of the internet for teaching and learning has become a global trend among education practitioners over recent decades. The integration of technology and media into Malaysian English as a Second Language (ESL) classrooms has altered the methods of English Language Teaching (ELT). In response to the impact of technology on ELT, the need for a supplementary instructional platform, and the limitations of the learning management system (LMS) in fostering second language (L2) writing skills, a web-based instructional tool was designed and developed based on a theoretical and pedagogical framework named Web-based Cognitive Writing Instruction (WeCWI). To determine the key concepts and identify the research gap, this study conducted a literature review using online searches on specific keywords, including “blog”, “Blogger”, “widget”, and “hyperlink”, found in scholarly articles. Based on the literature review, Blogger was chosen for its customisable on-screen layout editing features, into which web widgets and hypertext sharing identical features can be embedded. By considering the relationship between perceptual learning preferences for perceived information and the visual representations in iconic and symbolic views, the blogs can be given two different user interfaces, embedded with either web widgets or hypertext. The blog with web widgets appears in the graphical form of an iconic view, while the blog with hypertext displays only the textual form of a symbolic view, without visual references. By injecting web widgets and hypertext into the blogs, WeCWI attempts to offer a technology-enhanced ELT solution that addresses poor writing skills through better engagement while learning online via the learners’ preferred perceptual learning preferences.
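
As a minimal sketch of the two interface variants described above, and assuming nothing about WeCWI's actual implementation, the snippet below assembles a widget-embedded (iconic) fragment and a hypertext-only (symbolic) fragment; the widget URL and link targets are hypothetical.

```python
# Minimal sketch of the two blog interface variants described above.
# The widget embed URL and link targets are hypothetical placeholders,
# not resources used by WeCWI.

def iconic_view(widget_url: str) -> str:
    """Iconic view: a graphical web widget embedded in the blog layout."""
    return (f'<div class="widget">'
            f'<iframe src="{widget_url}" width="300" height="200"></iframe>'
            f'</div>')

def symbolic_view(links: dict) -> str:
    """Symbolic view: plain hypertext links without visual references."""
    items = "".join(f'<li><a href="{url}">{label}</a></li>'
                    for label, url in links.items())
    return f"<ul>{items}</ul>"

# Hypothetical usage: the same learning resource exposed both ways.
print(iconic_view("https://example.com/dictionary-widget"))
print(symbolic_view({"Online dictionary": "https://example.com/dictionary"}))
```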


2021 ◽  
Vol 5 (EICS) ◽  
pp. 1-29
Author(s):  
Arthur Sluÿters ◽  
Jean Vanderdonckt ◽  
Radu-Daniel Vatavu

Intra-platform plasticity regularly assumes that the display of a computing platform remains fixed and rigid during interactions with the platform, in contrast to reconfigurable displays, which can change form depending on the context of use. In this paper, we present a model-based approach for designing and deploying graphical user interfaces that support intra-platform plasticity for reconfigurable displays. We instantiate the model for E3Screen, a new device that expands a conventional laptop with two slidable, rotatable, and foldable lateral displays, enabling slidable user interfaces. Based on a UML class diagram as a domain model and a SCRUD list as a task model, we define an abstract user interface as interaction units with a corresponding master-detail design pattern. We then map the abstract user interface to a concrete user interface by applying rules for reconfiguration, concrete interaction, unit allocation, and widget selection, and implement it in JavaScript. In a first experiment, we determine the display configurations most preferred by users, which we organize in the form of a state-transition diagram. In a second experiment, we address reconfiguration rules and widget selection rules. A third experiment provides insights into the impact of the lateral displays on a visual search task.
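
As an illustrative sketch only, the snippet below shows how widget-selection and reconfiguration rules might map an abstract interaction unit and a display configuration to a concrete widget and display; the configuration names, rules, and widget names are hypothetical and do not reproduce the E3Screen implementation, which is written in JavaScript.

```python
# Illustrative sketch of rule-based mapping from an abstract user interface
# to a concrete one, in the spirit of the approach described above.
# All configuration names, rules, and widget names are hypothetical.
from dataclasses import dataclass

@dataclass
class InteractionUnit:
    name: str            # e.g. an entity from the UML domain model
    role: str            # "master" or "detail" in the master-detail pattern
    attribute_type: str  # e.g. "string", "enum", "date"

# Hypothetical widget-selection rules: (role, attribute type) -> widget.
WIDGET_RULES = {
    ("master", "string"): "ListBox",
    ("detail", "string"): "TextField",
    ("detail", "enum"): "DropDown",
    ("detail", "date"): "DatePicker",
}

# Hypothetical reconfiguration rules: which display hosts each role
# for a given configuration of the lateral displays.
DISPLAY_RULES = {
    "laptop-only": {"master": "center", "detail": "center"},
    "left-extended": {"master": "left", "detail": "center"},
    "both-extended": {"master": "left", "detail": "right"},
}

def concretize(unit: InteractionUnit, configuration: str) -> dict:
    """Map an abstract interaction unit to a concrete widget and display."""
    widget = WIDGET_RULES.get((unit.role, unit.attribute_type), "Label")
    display = DISPLAY_RULES[configuration][unit.role]
    return {"unit": unit.name, "widget": widget, "display": display}

# Hypothetical usage: a detail field rendered when both displays are extended.
print(concretize(InteractionUnit("Order", "detail", "date"), "both-extended"))
```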

