AUTOMATIC INTERACTIVE MUSIC IMPROVISATION BASED ON DATA MINING
An area of focus in music improvisation is real-time interactive improvisation between a human and a computer system. In this paper, we present an interactive musical system that acts as a melody continuator. For each musical pattern given by the user, the system returns a new one, built from general patterns for both pitch and duration stored in its knowledge base. This knowledge base consists of data mining rules extracted from sets of melodies in different musical styles. The proposed system uses a new music representation scheme that treats pitch and duration separately, and it adopts a similarity measure originally developed for clustering categorical data. Finally, we present experimental results, using Bach's Chorales and Jazz as test inputs, both to assess the aesthetic quality of the system's output and to compare it with human-produced continuations.