Social Decision Making with Humans and Robots: Trust and Approachability Mediate Economic Decision Making
As humanoid robots become more advanced and commonplace, the average user may find themselves wondering whether their robotic companion truly possesses a mind. It is important for scientists and designers to consider how this will likely affect our interactions with social robots. The current paper explores how social decision making with humanoid robots changes with the degree of their human-likeness. For that purpose, we created a spectrum of agents via morphing that ranged from very robot-like to very human-like in physical appearance (in increments of 20%) and measured how this change in physical humanness affected decision making in two economic games: the Ultimatum Game (Experiment 1) and the Trust Game (Experiment 2). We expected increases in human-like appearance to lead to a higher rate of punishment for unfair offers in the Ultimatum Game and to a higher rate of trust in the Trust Game. While physical humanness did not directly affect economic decisions in either experiment, follow-up analyses showed that subjective ratings of both trust and agent approachability mediated the effect of agent appearance on decision making in both experiments. Possible consequences of these findings for human–robot interaction are discussed.