The Issue of Proxies, and Why EU Law Matters for Recommender Systems
Recommendations are meant to increase sales or ad revenue, as these are the first priority of those who pay for them. Because recommender systems match their recommendations with inferred preferences, we should not be surprised if the algorithm optimises for lucrative preferences and thus co-produces the very preferences it mines. This relates to the well-known problems of feedback loops, filter bubbles and echo chambers. In this article I discuss the implications of the fact that computing systems necessarily work with proxies when inferring recommendations, raise a number of questions about whether recommender systems actually do what they are claimed to do, and analyse the often perverse economic incentive structures that have a major impact on relevant design decisions. Finally, I explain how the GDPR and the proposed AI Act can help to break through various vicious circles: by constraining how people may be targeted (GDPR) and by requiring documented evidence of the robustness, resilience, reliability and responsible design and use of high-risk recommender systems (AI Act).