Semantic Active Visual Search System Based on Text Information for Large and Unknown Environments

2021 · Vol 101 (2)
Author(s): Mathias Mantelli, Diego Pittol, Renan Maffei, Jim Torresen, Edson Prestes, ...
2013 · Vol 461 · pp. 792-800
Author(s): Bo Zhao, Hong Wei Zhao, Ping Ping Liu, Gui He Qin

We describe a novel mobile visual search system based on the saliency mechanism and sparse coding principle of the human visual system (HVS). In the feature extraction step, we first divide an image into different regions using the saliency extraction algorithm. Scale-invariant feature transform (SIFT) descriptors are then extracted in all regions while regional identities are preserved according to their saliency levels. Following the sparse coding principle of the HVS, we adopt a local neighbor preserving hash function to build a binary sparse representation of the SIFT features. In the search step, the nearest neighbors matched to the hash codes are processed according to their saliency levels. Matching scores for images in the database are derived from the matching of hash codes, and the scores of all levels are then weighted by their degrees of saliency to obtain an initial result set. To further improve matching accuracy, we propose an optimized retrieval scheme based on global texture information. We conduct extensive experiments on an actual mobile platform using the large-scale Corel-1000 dataset. The results show that the proposed method outperforms state-of-the-art algorithms in accuracy, with no significant increase in the running time of feature extraction and retrieval.
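The pipeline outlined in this abstract can be sketched roughly in Python. This is a minimal illustration, not the authors' implementation: a spectral-residual saliency map stands in for their unspecified saliency extraction algorithm, sign-random-projection LSH stands in for their local neighbor preserving hash function, the final re-ranking by global texture information is omitted, and the code length, thresholds, and level weights are placeholder values chosen for illustration.

```python
# Sketch of a saliency-weighted SIFT + binary-hash retrieval pipeline.
# Assumptions: spectral-residual saliency and sign-random-projection LSH
# are stand-ins for the paper's unspecified components.
import cv2
import numpy as np

CODE_BITS = 64                      # length of the binary code per descriptor
LEVEL_WEIGHTS = [1.0, 0.6, 0.3]     # high / medium / low saliency weights (placeholders)
rng = np.random.default_rng(0)
HYPERPLANES = rng.standard_normal((128, CODE_BITS)).astype(np.float32)

def saliency_map(gray):
    """Spectral-residual saliency map, normalised to [0, 1]."""
    f = np.fft.fft2(gray.astype(np.float32))
    log_amp = np.log(np.abs(f) + 1e-8).astype(np.float32)
    residual = log_amp - cv2.blur(log_amp, (3, 3))
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * np.angle(f)))) ** 2
    sal = cv2.GaussianBlur(sal.astype(np.float32), (9, 9), 2.5)
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-8)

def extract(gray):
    """Return binary descriptor codes grouped by saliency level (0 = most salient)."""
    sal = saliency_map(gray)
    kps, desc = cv2.SIFT_create().detectAndCompute(gray, None)
    levels = [[] for _ in LEVEL_WEIGHTS]
    if desc is None:
        return [np.zeros((0, CODE_BITS), dtype=np.uint8) for _ in LEVEL_WEIGHTS]
    # Sign-random-projection LSH: one bit per hyperplane.
    codes = (desc @ HYPERPLANES > 0).astype(np.uint8)
    for kp, code in zip(kps, codes):
        x = min(int(kp.pt[0]), sal.shape[1] - 1)
        y = min(int(kp.pt[1]), sal.shape[0] - 1)
        s = sal[y, x]
        level = 0 if s > 0.66 else (1 if s > 0.33 else 2)
        levels[level].append(code)
    return [np.array(l, dtype=np.uint8).reshape(-1, CODE_BITS) for l in levels]

def match_score(query_levels, db_levels, max_hamming=8):
    """Saliency-weighted count of query codes with a near neighbour in the same level."""
    score = 0.0
    for w, q, d in zip(LEVEL_WEIGHTS, query_levels, db_levels):
        if len(q) == 0 or len(d) == 0:
            continue
        # Hamming distance between every query code and every database code.
        dist = (q[:, None, :] != d[None, :, :]).sum(axis=2)
        score += w * (dist.min(axis=1) <= max_hamming).sum()
    return score

# Usage: index grayscale database images, then rank them for a query image.
# db = {name: extract(cv2.imread(name, cv2.IMREAD_GRAYSCALE)) for name in paths}
# q = extract(cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE))
# ranking = sorted(db, key=lambda n: match_score(q, db[n]), reverse=True)
```

In the paper's scheme, the ranking produced this way would then be refined by the global-texture re-ranking step, which is not shown here.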


Author(s): Da-Un Jung, Seungjae Lee, Keundong Lee, Sungkwan Je, Weon Geun Oh

2015 · Vol 17 (7) · pp. 1019-1030
Author(s): David M. Chen, Bernd Girod

2015 · Vol 74 (1) · pp. 55-60
Author(s): Alexandre Coutté, Gérard Olivier, Sylvane Faure

Computer use generally requires manual interaction with human-computer interfaces. In this experiment, we studied the influence of manual response preparation on co-occurring shifts of attention to information on a computer screen. Participants carried out a visual search task on a computer screen while simultaneously preparing to reach for either a proximal or a distal switch on a horizontal device, with either their right or left hand. The response properties were not predictive of the target's spatial position. The results mainly showed that the preparation of a manual response influenced visual search: (1) the visual target whose location was congruent with the goal of the prepared response was found faster; (2) the visual target whose location was congruent with the laterality of the response hand was found faster; (3) these effects had a cumulative influence on visual search performance; and (4) the magnitude of the influence of the response goal on visual search was marginally negatively correlated with the speed of response execution. These results are discussed in the general framework of structural coupling between perception and motor planning.

