Human biases limit algorithmic boosts of cultural evolution.
Humans are impressive social learners. Researchers of cultural evolution have documented the many biases that enable solutions and behaviors to spread socially from one human to the next, shaping from whom we copy and what we copy. In a digital society, algorithmic and human agents both contribute to the transmission of knowledge. One hypothesis is that machines influence patterns of social transmission not only by providing a means of spreading human behavior but also by introducing novel behaviors themselves. We propose that certain algorithms may exhibit (whether by learning or by design) behaviors, biases, and problem-solving abilities that differ from those of their human counterparts. This difference may in turn foster better decisions in environments where diversity in problem-solving strategies is beneficial. In this study, we ask whether machines with biases complementary to those of humans could boost cultural evolution in a lab-based planning task in which humans show suboptimal biases. We conducted a large behavioral study and an agent-based simulation to test the performance of transmission chains composed of human and machine players. In half of the chains, an algorithmic bot replaced one human participant. We show that the bot boosts the performance of the participants immediately following it in the chain, but that this gain is lost for participants further down the transmission chain. Our findings suggest that machines can improve performance, but that human biases can prevent machine solutions from being preserved, especially under conditions of uncertainty or high cognitive load. Thus, the conditions for hybrid social learning and cultural evolution may be constrained by the task environment and by human biases.
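The transmission-chain dynamic described above can be illustrated with a toy agent-based model. The sketch below is purely hypothetical and is not the simulation used in the study: all parameter values (`bias`, `retention`, `bot_quality`) and the scalar notion of "solution quality" are illustrative assumptions. It shows the qualitative pattern reported in the abstract: a bot injects a high-quality solution, the next human benefits, and the gain decays as biased copying pulls later solutions back toward a human baseline.

```python
import random


def human_step(observed, bias=0.3, retention=0.6, rng=None):
    """A human partially adopts the observed solution but regresses
    toward a biased baseline (all parameter values are hypothetical)."""
    rng = rng or random
    noise = rng.uniform(-0.05, 0.05)
    quality = retention * observed + (1 - retention) * bias + noise
    return min(max(quality, 0.0), 1.0)


def bot_step(observed, bot_quality=0.9):
    """The bot is free of the human bias and produces a high-quality
    solution, keeping the observed one if it is already better."""
    return max(observed, bot_quality)


def run_chain(length=8, bot_position=None, seed=0):
    """Pass a solution down a chain of agents; optionally place a bot
    at one position. Returns the quality trajectory along the chain."""
    rng = random.Random(seed)
    quality = 0.3  # initial naive solution
    trajectory = []
    for i in range(length):
        if i == bot_position:
            quality = bot_step(quality)
        else:
            quality = human_step(quality, rng=rng)
        trajectory.append(quality)
    return trajectory
```

Comparing `run_chain(bot_position=2)` against `run_chain(bot_position=None)` shows a sharp quality jump right after the bot, followed by decay toward the biased human fixed point, mirroring the finding that the bot's boost is lost further down the chain.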