object selection
Recently Published Documents


TOTAL DOCUMENTS: 212 (five years: 63)

H-INDEX: 19 (five years: 3)

2021 · Vol 11 (2) · pp. 95-102
Author(s): Nur Ameerah Abdul Halim, Ajune Wanis Ismail

Augmented Reality (AR) has been widely explored worldwide for its potential as a technology that enhances information representation. As technology progresses, smartphones (handheld devices) now have sophisticated processors and cameras for capturing static photographs and video, as well as a variety of sensors for tracking the user's position, orientation, and motion. Hence, this paper discusses a real-time finger-ray pointing technique for interaction in handheld AR and compares it with the conventional handheld touch-screen interaction technique. The aim of this paper is to explore ray-pointing interaction in handheld AR for 3D object selection. Previous work on handheld AR, as well as on Mixed Reality (MR), is also reviewed.
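The abstract gives no implementation details, but the essence of ray pointing is casting a ray from the camera through the tracked fingertip and selecting the nearest intersected object. A minimal sketch in Python, using hypothetical names and a simple bounding-sphere test:

import numpy as np

def ray_sphere_hit(origin, direction, center, radius):
    # Distance along a unit-length ray to its first intersection with a
    # bounding sphere, or None if the ray misses.
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t >= 0 else None

def pick_object(origin, direction, objects):
    # Return the name of the closest object hit by the finger ray.
    best = None
    for name, center, radius in objects:
        t = ray_sphere_hit(origin, direction, np.asarray(center, float), radius)
        if t is not None and (best is None or t < best[1]):
            best = (name, t)
    return best[0] if best else None

# Illustrative scene: the ray is cast from the camera through the tracked fingertip.
origin = np.array([0.0, 0.0, 0.0])
direction = np.array([0.0, 0.0, -1.0])        # unit vector into the scene
scene = [("cube", (0.2, 0.0, -2.0), 0.5), ("sphere", (0.0, 0.0, -4.0), 0.5)]
print(pick_object(origin, direction, scene))  # -> cube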


Author(s): Yin-ting Lin, Garry Kong, Daryl Fougnie

Attentional mechanisms in perception can operate over locations, features, or objects. However, people direct attention not only towards information in the external world, but also to information maintained in working memory. To what extent do perception and memory draw on similar selection properties? Here we examined whether principles of object-based attention also hold in visual working memory. Experiment 1 examined whether object structure guides selection independently of spatial distance. In a memory updating task, participants encoded two rectangular bars with colored ends before updating two colors during maintenance. Memory updates were faster for two equidistant colors on the same object than on different objects. Experiment 2 examined whether selection of a single object feature spreads to other features within the same object. Participants memorized two sequentially presented Gabors, and a retro-cue indicated which object and feature dimension (color or orientation) would be most relevant to the memory test. We found stronger effects of object selection than of feature selection: accuracy was higher for the uncued feature in the same object than for the cued feature in the other object. Together, these findings demonstrate effects of object-based attention on visual working memory, at least when object-based representations are encouraged, and suggest shared attentional mechanisms across perception and memory.


2021 · Vol 2
Author(s): Rene Weller, Waldemar Wegele, Christoph Schröder, Gabriel Zachmann

We present a novel selection technique for VR called LenSelect. The main idea is to decrease the Index of Difficulty (ID) according to Fitts' Law by dynamically increasing the size of the potentially selectable objects. This facilitates the selection process, especially for small, distant, or partly occluded objects, but also for moving targets. In order to evaluate our method, we defined a set of test scenarios that covers a broad range of use cases, in contrast to the simpler scenes that are often used. Our test scenarios include practically relevant setups with realistic objects as well as synthetic scenes, all of which are available for download. We evaluated our method in a user study and compared the results to two state-of-the-art selection techniques and the standard ray-based selection. Our results show that LenSelect performs similarly to the fastest method, ray-based selection, while significantly reducing the error rate by 44%.
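For context, the Index of Difficulty in Fitts' Law (Shannon formulation) is ID = log2(D/W + 1), where D is the distance to the target and W its size, so enlarging a target lowers its ID. A small illustration with made-up numbers (the abstract does not specify LenSelect's scaling function):

import math

def index_of_difficulty(distance, width):
    # Fitts' Law, Shannon formulation: ID = log2(D / W + 1), measured in bits.
    return math.log2(distance / width + 1)

# Illustrative numbers only: enlarging a small, distant target lowers its ID.
D, W = 2.0, 0.05                        # metres in the virtual scene
print(index_of_difficulty(D, W))        # ~5.36 bits: hard to select
print(index_of_difficulty(D, 4 * W))    # ~3.46 bits after enlarging the target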


2021
Author(s): Ziyang Zhang

This thesis presents a system that visualizes 3D city data and supports gesture interactions in a fully immersive Cave Automatic Virtual Environment (CAVE). To facilitate more natural interactions in this immersive virtual city, novel techniques are proposed for operations such as object selection, object manipulation, navigation, and menu control. These operations form a basis of interaction for most Virtual Reality (VR) applications. The proposed techniques are predominantly controlled using gestures. We also propose the use of pattern recognition methods, specifically a Hidden Markov Model, to support real-time dynamic gesture recognition and demonstrate its use for menu control in VR applications. Qualitative and quantitative user studies were conducted to evaluate the proposed techniques. The results demonstrate that the interaction techniques for object selection and manipulation are measurably better than traditional techniques. The results also show that the proposed gesture-based navigation and menu control techniques are preferred by experienced users. These findings can guide future user interface design in immersive environments.
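The abstract does not describe the recognizer's implementation; one common way to realize HMM-based dynamic gesture recognition is to train one HMM per gesture class and classify a live feature sequence by maximum log-likelihood. A minimal sketch assuming the third-party hmmlearn library and hypothetical feature data:

import numpy as np
from hmmlearn import hmm  # third-party library; the thesis does not name its toolkit

def train_gesture_model(sequences, n_states=4):
    # Fit one Gaussian HMM on a list of (T_i, n_features) feature sequences
    # belonging to a single gesture class (e.g. hand-position trajectories).
    X = np.concatenate(sequences)
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def classify_gesture(models, sequence):
    # Pick the gesture class whose HMM assigns the observed sequence the
    # highest log-likelihood.
    return max(models, key=lambda name: models[name].score(sequence))

# Hypothetical usage with per-gesture training data and a live feature window:
# models = {"swipe_left": train_gesture_model(swipe_left_seqs),
#           "circle": train_gesture_model(circle_seqs)}
# label = classify_gesture(models, live_feature_window)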

