Rubber Hand Illusion does not arise from comparisons with internal body models: a new multisensory integration account of the sense of ownership
Human body sense is surprisingly flexible – precisely administered multisensory stimulation may result in the illusion that an external object is part of one’s body. There seems to be a general consensus that certain top-down constraints govern which objects may be incorporated: in particular, to-be-embodied objects should be structurally similar to a visual representation stored in an internal body model for a shift in one’s body image to occur. However, empirical evidence contradicts the body model hypothesis: the sense of ownership may extend to objects strikingly distinct in morphology and structure (e.g., robotic arms or empty space), and direct empirical support for the theory is currently lacking. As an alternative, based on the example of the rubber hand illusion (RHI), I propose a multisensory integration account of how the sense of ownership is induced. In this account, the perception of one’s own body is a regular type of multisensory perception, and multisensory integration processes are not only necessary but also sufficient for embodiment. In this paper, I propose how the RHI can be modeled with the use of Maximum Likelihood Estimation and natural correlation rules. I also discuss how Bayesian coupling priors and idiosyncrasies in sensory processing render prior distributions interindividually variable, accounting for the large interindividual differences in susceptibility to the RHI. Taken together, the proposed model accounts for the exceptional malleability of human body perception, fortifies existing bottom-up multisensory integration theories with top-down models of the relatedness of sensory cues, and generates testable and disambiguating predictions.
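The Maximum Likelihood Estimation scheme mentioned in the abstract is standard precision-weighted cue fusion: each sensory estimate is weighted by its reliability (inverse variance), and the fused estimate is more reliable than any single cue. A minimal sketch follows; the function name and the numerical hand-position values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mle_fuse(estimates, variances):
    """Precision-weighted (maximum likelihood) fusion of sensory cues.

    estimates: per-cue position estimates (e.g., visual and proprioceptive)
    variances: per-cue noise variances (lower variance = more reliable cue)
    Returns the fused estimate and its variance.
    """
    w = 1.0 / np.asarray(variances, dtype=float)      # precision of each cue
    est = np.asarray(estimates, dtype=float)
    fused = np.sum(w * est) / np.sum(w)               # reliability-weighted average
    fused_var = 1.0 / np.sum(w)                       # variance shrinks with each added cue
    return fused, fused_var

# Hypothetical RHI-like conflict: vision locates the hand at 0 cm (rubber hand),
# proprioception at 15 cm; vision is assumed far more reliable (variance 1 vs. 9).
pos, var = mle_fuse([0.0, 15.0], [1.0, 9.0])
# The fused estimate is pulled strongly toward the visual cue: pos = 1.5, var = 0.9
```

Under this scheme, the visual capture of felt hand position in the RHI falls out of the weighting alone: because vision is typically the more precise spatial cue, the fused position shifts toward the rubber hand without any appeal to a stored body model.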