perceptual matching
Recently Published Documents


TOTAL DOCUMENTS: 67 (five years: 13)
H-INDEX: 16 (five years: 2)

2021 · Vol. 11 (21) · pp. 9911
Author(s): Xiaoxiao Cao, Makoto Watanabe, Kenta Ono

Mobile games are developing rapidly and have become an important part of the national economy. Gameplay is an important attribute of a game, and its icon often determines the user's first impression. Whether users can accurately perceive gameplay and affective quality from the icon is therefore critical. In this article, a two-stage perceptual matching procedure is used to evaluate the perceptual quality of six categories of games whose icons include characters as elements. First, 60 icons with high visual-matching quality were selected as second-stage stimuli through classification tasks. Second, the affective matching quality of these icons was measured using the semantic differential method and correlation analysis. Finally, a set of icon samples was determined and their design elements were analyzed. Several methods are proposed for improving the perceptual quality of game icons. Studying this perceptual matching relationship can enhance the interaction between designers, developers, and users.
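The correlation step in the second stage can be illustrated with a minimal sketch: given stage-one visual-matching scores and stage-two semantic-differential affective ratings for the 60 selected icons, a Pearson correlation quantifies how well the two align. All data and variable names below are hypothetical, not values from the study.

```python
# Sketch: correlating visual-matching scores with affective ratings for game
# icons, in the spirit of the two-stage procedure described above.
# The data and variable names are hypothetical illustrations.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical 7-point semantic-differential ratings for 60 icons.
visual_match = rng.uniform(4.0, 7.0, size=60)                         # stage-1 scores
affective_rating = visual_match * 0.5 + rng.normal(0, 0.6, size=60)   # stage-2 scores

r, p = pearsonr(visual_match, affective_rating)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```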


2021 · Vol. 12
Author(s): Moritz Stolte, Charles Spence, Ayla Barutchu

Linking arbitrary shapes (e.g., circles, squares, and triangles) to personal labels (e.g., self, friend, or stranger) or reward values (e.g., £18, £6, or £2) results in immediate processing benefits, in perceptual matching tasks, for those stimuli that happen to be associated with the self or with high rewards. Here we further explored how social and reward associations interact with multisensory stimuli by pairing labels and objects with tones (low, medium, and high). We also investigated whether self and reward biases persist for multisensory stimuli once the label is removed after an association has been made. Both high-reward stimuli and those associated with the self resulted in faster responses and improved discriminability (i.e., higher d′), and these benefits persisted for multisensory stimuli even when the labels were removed. However, the self- and reward-biases partly depended on the specific alignment between the physical tones (low, medium, and high) and the conceptual (social or reward) order. Performance for reward associations improved when the endpoints of low or high reward were paired with low or high tones; for personal associations, there was a benefit when the self was paired with either a low or a high tone, but no effect when the stranger was associated with either endpoint. These results indicate that, unlike reward, personal (social) associations are not represented along a continuum with two marked endpoints (i.e., self and stranger) but rather relative to a single reference point (the self vs. others).
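Since the study reports gains in discriminability (d′), a short sketch of the standard signal-detection computation may help. This is the textbook formula, not necessarily the authors' exact analysis pipeline, and the trial counts are invented.

```python
# Sketch: the standard signal-detection computation of d' from hit and
# false-alarm rates in a matching task, with a log-linear correction for
# rates of 0 or 1. Illustrative only; trial counts are invented.
from scipy.stats import norm

def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    """d' = z(hit rate) - z(false-alarm rate), log-linear corrected."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Example: a self-associated pairing versus a stranger-associated pairing.
print(d_prime(hits=46, misses=4, false_alarms=8, correct_rejections=42))    # self
print(d_prime(hits=38, misses=12, false_alarms=14, correct_rejections=36))  # stranger
```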


PLoS ONE · 2020 · Vol. 15 (11) · pp. e0241747
Author(s): James D. Dunn, Stephanie Summersby, Alice Towler, Josh P. Davis, David White

We present a new test, the UNSW Face Test (www.unswfacetest.com), that has been specifically designed to screen for super-recognizers in large online cohorts and is available free for scientific use. Super-recognizers are people who demonstrate sustained performance in the very top percentiles on tests of face identification ability. Because they represent a small proportion of the population, screening large online cohorts is an important step in their initial recruitment, before confirmatory testing via standardized measures and more detailed cognitive testing. We provide normative data on the UNSW Face Test from 3 cohorts tested via the internet (combined n = 23,902) and 2 cohorts tested in our lab (combined n = 182). The UNSW Face Test: (i) captures both identification memory and perceptual matching, as confirmed by correlations with existing tests of these abilities; (ii) captures face-specific perceptual and memorial abilities, as confirmed by non-significant correlations with non-face object processing tasks; (iii) enables researchers to apply stricter selection criteria than other available tests, which boosts the average accuracy of the individuals selected in subsequent testing. Together, these properties make the test uniquely suited to screening for super-recognizers in large online cohorts.
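A minimal sketch of percentile-based screening against normative data, as one might implement the stricter selection criteria described above. The normative mean, SD, and the +2 SD cutoff are illustrative assumptions, not values from the paper.

```python
# Sketch: flagging candidate super-recognizers whose test scores sit far
# above the normative mean. All numbers here are hypothetical.
import numpy as np

norm_mean, norm_sd = 0.70, 0.10   # hypothetical normative accuracy
cutoff_z = 2.0                    # hypothetical screening threshold (+2 SD)

scores = np.array([0.68, 0.91, 0.74, 0.95, 0.88])  # hypothetical test scores
z = (scores - norm_mean) / norm_sd
candidates = np.flatnonzero(z >= cutoff_z)
print("Candidate indices for confirmatory testing:", candidates)
```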


2020
Author(s): Aurora De Bortoli Vizioli, Anna M. Borghi, Luca Tummolini

Neurological evidence has shown that brain damage can selectively impair the ability to discriminate between objects belonging to others and those that we feel are our own. Despite the ubiquity and relevance of this sense of object ownership in our lives, the underlying cognitive mechanisms are still poorly understood. Here we ask whether psychological ownership of an object can be based on its incorporation into one's body image. To explore this possibility with healthy participants, we employed a modified version of the rubber-hand illusion in which both the participant and the rubber hand wore a ring. We used the self-prioritization effect in a perceptual matching task as an indirect measure of the sense of (dis)ownership over objects. Results indicate that undermining the bodily self has cascading effects on the representation of owned objects, at least for those associated with the body for a long time.
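The self-prioritization effect used here as the dependent measure is commonly quantified as the reaction-time advantage for self-associated over other-associated matches. The sketch below shows that common operationalization with invented data; it is not necessarily the authors' exact measure.

```python
# Sketch: the self-prioritization effect (SPE) as the mean RT advantage for
# self-associated over stranger-associated match trials. Data are invented.
import numpy as np

rt_self = np.array([512, 498, 530, 505, 520])      # hypothetical RTs (ms), self-match trials
rt_stranger = np.array([587, 601, 566, 594, 610])  # hypothetical RTs (ms), stranger-match trials

spe = rt_stranger.mean() - rt_self.mean()  # positive = self-prioritization
print(f"Self-prioritization effect: {spe:.0f} ms")
```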


2019 · Vol. 19 (10) · pp. 47a
Author(s): Yang Sun, Wei Huang, Haixu Wang, Changhong Liu, Jie Sui

2019
Author(s): Danielle Navarro, Ian Fuss

We propose a new method for quickly calculating the probability density function for first-passage times in simple Wiener diffusion models, extending an earlier method used by [Van Zandt, T., Colonius, H., & Proctor, R. W. (2000). A comparison of two response-time models applied to perceptual matching. Psychonomic Bulletin & Review, 7, 208–256]. The method relies on the observation that there are two distinct infinite series expansions of this probability density, one of which converges quickly for small time values, while the other converges quickly at large time values. By deriving error bounds associated with finite truncation of either expansion, we are able to determine analytically which of the two versions should be applied in any particular context. The bounds indicate that, even for extremely stringent error tolerances, no more than 8 terms are required to calculate the probability density. By making the calculation of this distribution tractable, the goal is to allow more complex extensions of Wiener diffusion models to be developed.
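Since this abstract describes the algorithm itself, a compact sketch may help: it implements the two series expansions and the term-count bounds described above, switching to whichever series needs fewer terms. Parameter names (a for boundary separation, v for drift rate, w for the relative starting point) follow the usual diffusion-model conventions; this is an illustrative reimplementation of the published derivation, not the authors' own code.

```python
# Sketch: first-passage-time density for a Wiener diffusion, using the
# small-time expansion when it needs fewer terms and the large-time (sine
# series) expansion otherwise, per the truncation bounds described above.
import numpy as np

def wfpt_density(t, v, a, w, eps=1e-10):
    """Density of first passage through the lower boundary at time t > 0.

    a: boundary separation, v: drift rate, w: relative start point in (0, 1),
    eps: absolute error tolerance used to truncate the series.
    """
    tt = t / a**2  # normalized time (unit boundary separation)

    # Terms needed by the small-time expansion for tolerance eps.
    if 2 * np.sqrt(2 * np.pi * tt) * eps < 1:
        k_small = max(2 + np.sqrt(-2 * tt * np.log(2 * eps * np.sqrt(2 * np.pi * tt))),
                      np.sqrt(tt) + 1)
    else:
        k_small = 2.0
    # Terms needed by the large-time expansion.
    if np.pi * tt * eps < 1:
        k_large = max(np.sqrt(-2 * np.log(np.pi * tt * eps) / (np.pi**2 * tt)),
                      1 / (np.pi * np.sqrt(tt)))
    else:
        k_large = 1 / (np.pi * np.sqrt(tt))

    if k_small < k_large:
        # Small-time expansion: sum over image terms (w + 2k).
        K = int(np.ceil(k_small))
        ks = np.arange(-np.floor((K - 1) / 2), np.ceil((K - 1) / 2) + 1)
        f = np.sum((w + 2 * ks) * np.exp(-(w + 2 * ks) ** 2 / (2 * tt)))
        f /= np.sqrt(2 * np.pi * tt**3)
    else:
        # Large-time expansion: sine series.
        ks = np.arange(1, int(np.ceil(k_large)) + 1)
        f = np.pi * np.sum(ks * np.exp(-ks**2 * np.pi**2 * tt / 2) * np.sin(ks * np.pi * w))

    # Undo the normalization and apply the drift-dependent factor.
    return f * np.exp(-v * a * w - v**2 * t / 2) / a**2

print(wfpt_density(t=0.6, v=1.0, a=1.5, w=0.5))
```

Evaluating this density over a grid of times then yields the likelihood surface on which more complex extensions of Wiener diffusion models can build.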


2019 · Vol. 10
Author(s): Mengyin Jiang, Shirley K. M. Wong, Harry K. S. Chung, Yang Sun, Janet H. Hsiao, et al.
