Research on the Impact of the Integration of Two Styles of Icon Design and Task Characteristic on UI Visual Search Performance

2021, Vol 39 (5), pp. 1-11
Author(s): 예화 곡, 용구 김, 찬석 홍


Author(s): P. Manivannan, Sara Czaja, Colin Drury, Chi Ming Ip

Visual search is an important component of many real-world tasks such as industrial inspection and driving. Several studies have shown that age has an impact on visual search performance. In general, older people demonstrate poorer performance on such tasks than younger people. However, there is controversy regarding the source of the age-performance effect. The objective of this study was to examine the relationship between component abilities and visual search performance in order to identify the locus of age-related performance differences. Six abilities, including reaction time, working memory, selective attention, and spatial localization, were identified as important components of visual search performance. Thirty-two subjects ranging in age from 18 to 84 years, categorized into three age groups (young, middle, and older), participated in the study. Their component abilities were measured, and they performed a visual search task that varied in complexity in terms of the type of targets detected. Significant relationships were found between some of the component skills and search performance. Significant age effects were also observed. A model was developed using hierarchical multiple linear regression to explain the variance in search performance. Results indicated that reaction time, selective attention, and age were important predictors of search performance, with reaction time and selective attention accounting for most of the variance.
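The hierarchical regression approach described in this abstract (entering blocks of predictors and examining the incremental variance explained) can be sketched as follows. The data here are simulated for illustration only, the predictor names mirror the abstract's, and plain NumPy least squares stands in for whatever statistics package the authors used.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32  # hypothetical sample, mirroring the study's 32 subjects

# Simulated component-ability scores (illustrative, not the study's data).
reaction_time = rng.normal(450, 60, n)     # ms
selective_attention = rng.normal(0, 1, n)  # z-scored
age = rng.uniform(18, 84, n)               # years

# Simulated search performance, driven mostly by RT and selective attention,
# loosely echoing the pattern of results the abstract reports.
search_rt = (200 + 1.2 * reaction_time - 80 * selective_attention
             + 0.5 * age + rng.normal(0, 40, n))

def r_squared(X, y):
    """R^2 of an ordinary-least-squares fit with an intercept column."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Hierarchical entry: component abilities first, then age, comparing
# the incremental R^2 contributed by each block.
r2_abilities = r_squared(
    np.column_stack([reaction_time, selective_attention]), search_rt)
r2_full = r_squared(
    np.column_stack([reaction_time, selective_attention, age]), search_rt)
print(f"R^2, abilities only: {r2_abilities:.3f}")
print(f"R^2 after adding age: {r2_full:.3f} "
      f"(increment {r2_full - r2_abilities:.3f})")
```

A small increment for the age block, relative to the abilities block, is the pattern that would support the abstract's conclusion that reaction time and selective attention account for most of the variance.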


2019, Vol 5 (1)
Author(s): Alejandro Lleras, Zhiyuan Wang, Anna Madison, Simona Buetti

Recently, Wang, Buetti and Lleras (2017) developed an equation to predict search performance in heterogeneous visual search scenes (i.e., multiple types of non-target objects simultaneously present) based on parameters observed when participants perform search in homogeneous scenes (i.e., when all non-target objects are identical to one another). The equation was based on a computational model where every item in the display is processed with unlimited capacity and independently of one another, with the goal of determining whether the item is likely to be a target or not. The model was tested in two experiments using real-world objects. Here, we extend those findings by testing the predictive power of the equation to simpler objects. Further, we compare the model’s performance under two stimulus arrangements: spatially-intermixed (items randomly placed around the scene) and spatially-segregated displays (identical items presented near each other). This comparison allowed us to isolate and quantify the facilitatory effect of processing displays that contain identical items (homogeneity facilitation), a factor that improves performance in visual search above-and-beyond target-distractor dissimilarity. The results suggest that homogeneity facilitation effects in search arise from local item-to-item interaction (rather than by rejecting items as “groups”) and that the strength of those interactions might be determined by stimulus complexity (with simpler stimuli producing stronger interactions and thus, stronger homogeneity facilitation effects).
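The abstract does not reproduce the published equation itself, but the model family it describes (each distractor type contributing independently, with logarithmic set-size efficiency) can be illustrated with a sketch of one plausible additive-logarithmic form. The slopes, intercept, and item names below are assumed values, not the authors' fitted parameters, and the exact published equation may differ.

```python
import numpy as np

# Hypothetical per-distractor-type logarithmic slopes (ms per log-unit),
# standing in for estimates obtained from homogeneous displays: a slope is
# large when the distractor type is similar to the target, small otherwise.
D = {"similar_distractor": 25.0, "dissimilar_distractor": 8.0}
a = 400.0  # assumed baseline intercept (ms)

def predicted_rt(counts, slopes, intercept):
    """Predict heterogeneous-display RT by summing each distractor type's
    logarithmic contribution -- one plausible form of this model family."""
    return intercept + sum(slopes[k] * np.log(n + 1)
                           for k, n in counts.items())

# A heterogeneous display with 4 similar and 8 dissimilar distractors:
rt = predicted_rt({"similar_distractor": 4, "dissimilar_distractor": 8}, D, a)
print(f"predicted RT: {rt:.1f} ms")
```

The key property of such an equation is that heterogeneous-display predictions need no new free parameters: the per-type slopes are carried over from homogeneous-display fits, which is what makes the predictive test described in the abstract possible.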


2004, Vol 42 (3), pp. 335-345
Author(s): Andrea Tales, Janice Muir, Roy Jones, Anthony Bayer, Robert J Snowden

2019
Author(s): Elizabeth J. Halfen, John F. Magnotti, Md. Shoaibur Rahman, Jeffrey M. Yau

Abstract
Although we experience complex patterns over our entire body, how we selectively perceive multi-site touch over our bodies remains poorly understood. Here, we characterized tactile search behavior over the body using a tactile analog of the classic visual search task. Participants judged whether a target stimulus (e.g., 10-Hz vibration) was present or absent on the upper or lower limbs. When present, the target stimulus could occur alone or with distractor stimuli (e.g., 30-Hz vibrations) on other body locations. We varied the number and spatial configurations of the distractors as well as the target and distractor frequencies and measured the impact of these factors on search response times. First, we found that response times were faster on target-present trials compared to target-absent trials. Second, response times increased with the number of stimulated sites, suggesting a serial search process. Third, search performance differed depending on stimulus frequencies. This frequency-dependent behavior may be related to perceptual grouping effects based on timing cues. We constructed models to explore how the locations of the tactile cues influenced search behavior. Our modeling results reveal that, in isolation, cues on the index fingers make relatively greater contributions to search performance compared to stimulation experienced on other body sites. Additionally, co-stimulation of sites within the same limb or simply on the same body side preferentially influences search behavior. Our collective findings identify some principles of attentional search that are common to vision and touch, but others that highlight key differences that may be unique to body-based spatial perception.

New & Noteworthy
Little is known about how we selectively experience multi-site touch over the body. Using a tactile analog of the classic visual search paradigm, we show that tactile search behavior for flutter cues is generally consistent with a serial search process. Modeling results reveal the preferential contributions of index finger stimulation and two-site interactions involving ipsilateral and within-limb patterns. Our results offer initial evidence for spatial and temporal principles underlying tactile search behavior over the body.
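The two behavioral signatures reported above (faster target-present responses, and response times growing with the number of stimulated sites) are the classic predictions of a serial self-terminating search. A minimal sketch of that model, with illustrative parameter values rather than the study's fitted ones:

```python
# A minimal sketch of a serial self-terminating search model; base and
# per_item values are illustrative assumptions, not the study's parameters.
def expected_rt(n_sites, base=500.0, per_item=80.0, target_present=True):
    """Expected response time (ms) for a display with n_sites stimulated sites.

    On target-present trials a self-terminating search inspects, on average,
    half the sites before finding the target; on target-absent trials every
    site must be checked, which doubles the set-size slope -- matching the
    faster present-trial RTs and the RT growth with site count reported above.
    """
    inspected = (n_sites + 1) / 2 if target_present else n_sites
    return base + per_item * inspected

print(expected_rt(4))                        # present trial, 4 sites -> 700.0
print(expected_rt(4, target_present=False))  # absent trial, 4 sites -> 820.0
```

The 2:1 ratio of absent-trial to present-trial set-size slopes is the usual diagnostic for serial self-terminating search in the visual literature, and is the kind of pattern the abstract's tactile analog is testing for.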


Author(s): Douglas S. Brungart, Sarah E. Kruger, Tricia Kwiatkowski, Thomas Heil, Julie Cohen

Objective: The present study was designed to examine the impact that walking has on performance in auditory localization, visual discrimination, and aurally aided visual search tasks.
Background: Auditory localization and visual search are critical skills that are frequently conducted by moving observers, but most laboratory studies of these tasks have been conducted on stationary listeners who were either seated or standing during stimulus presentation.
Method: Thirty participants completed three different tasks while either standing still or walking at a comfortable self-selected pace on a treadmill: (1) an auditory localization task, where they identified the perceived location of a target sound; (2) a visual discrimination task, where they identified a visual target presented at a known location directly in front of the listener; and (3) an aurally aided visual search task, where they identified a visual target presented among multiple visual distractors either in isolation or in conjunction with a spatially colocated auditory cue.
Results: Participants who were walking performed the auditory localization and aurally aided visual search tasks significantly faster than those who were standing, with no loss in accuracy.
Conclusion: The improved aurally aided visual search performance found in this experiment may be related to enhanced overall activation caused by walking. It is also possible that the slight head movements required may have provided auditory cues that enhanced localization accuracy.
Application: The results have potential applications in virtual and augmented reality displays where audio cues might be presented to listeners while walking.


SLEEP, 2021, Vol 44 (Supplement_2), pp. A23-A24
Author(s): Amanda Hudson, John Hinson, Paul Whitney, Elena Crooks, Nita Shattuck, ...

Abstract
Introduction: Visual search is important in many operational tasks, such as passive sonar monitoring in naval operations. Shift work can contribute to fatigue and task performance impairment; in particular, backward rotating shift schedules have been shown to impair vigilant attention performance. However, the impact on visual search performance, above and beyond impaired vigilant attention, is unknown. We investigated the effects of two distinct shift work schedules using a visual search task with properties of real-life visual search performance.
Methods: N=13 adult males (ages 18–39) completed a 6-day/5-night laboratory study with an acclimation day, four simulated shift days, and a recovery day. Shift days involved either a 5h-on/15h-off backward rotating schedule (n=8) or a 3h-on/9h-off fixed schedule (n=5). The visual search task was performed once per shift, at varying times of day depending on the shift. Participants viewed search arrays in which stimuli consisted of colored letters of different shapes. Over three blocks of 24 trials each, participants determined whether a target was present or absent among 1, 5, 15, or 30 distractors. Similarity between targets and distractors was manipulated between blocks, such that targets differed from distractors by color only, shape only, or either color or shape but not both. For each target feature block, and separately for target-present and target-absent trials, slopes of response times regressed against the number of stimuli were calculated to quantify visual search rates. Mixed-effects ANOVA was used to analyze visual search rates by shift schedule and shift day.
Results: There were no significant effects of shift schedule (all p>0.30), shift day (all p>0.13), or their interaction (all p>0.22) on visual search rates.
Conclusion: Previous work showed degraded vigilant attention in the shift schedules considered here, especially in the backward rotating schedule, which may compromise operational performance. However, we failed to identify statistically significant impairments in visual search, although our sample may have been too small to provide adequate statistical power. It remains to be determined whether greater levels of fatigue, such as could be induced by total sleep deprivation, would reveal significant visual search deficits.
Support (if any): Naval Postgraduate School award N62271-13-M-1228
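The search-rate measure described in the Methods (the slope of response times regressed against the number of stimuli) can be computed with an ordinary least-squares fit. Only the set sizes below come from the abstract; the response times are hypothetical illustration values, not the study's data.

```python
import numpy as np

# Distractor counts used in the study; the RTs are made-up illustration
# values for one hypothetical participant/block, not the study's data.
set_sizes = np.array([1, 5, 15, 30])
mean_rts = np.array([620.0, 700.0, 910.0, 1220.0])  # ms, hypothetical

# Search rate = slope of response time regressed on number of stimuli,
# in ms per item; the intercept captures set-size-independent processing.
slope, intercept = np.polyfit(set_sizes, mean_rts, 1)
print(f"search rate: {slope:.1f} ms/item, intercept: {intercept:.0f} ms")
```

In the study's design this slope would be estimated separately for each target-feature block and for target-present versus target-absent trials, and those slopes, rather than raw response times, are what the mixed-effects ANOVA compares across shift schedules and shift days.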


2015, Vol 74 (1), pp. 55-60
Author(s): Alexandre Coutté, Gérard Olivier, Sylvane Faure

Computer use generally requires manual interaction with human-computer interfaces. In this experiment, we studied the influence of manual response preparation on co-occurring shifts of attention to information on a computer screen. Participants carried out a visual search task on a computer screen while simultaneously preparing to reach for either a proximal or distal switch on a horizontal device, with either their right or left hand. The response properties were not predictive of the target’s spatial position. The results mainly showed that the preparation of a manual response influenced visual search: (1) the visual target whose location was congruent with the goal of the prepared response was found faster; (2) the visual target whose location was congruent with the laterality of the response hand was found faster; (3) these effects had a cumulative influence on visual search performance; and (4) the magnitude of the influence of the response goal on visual search was marginally negatively correlated with the speed of response execution. These results are discussed in the general framework of structural coupling between perception and motor planning.

