Aligning Brain Activity and Sketch in Multi-Modal CAD Interface
This paper investigates the temporal synchronization of sketch data and cognitive states in a multi-modal CAD interface. In a series of experiments, five subjects were instructed to watch and then explain six mechanical mechanisms by sketching them on a touch-based screen. Simultaneously, the subjects' brain waves were recorded as electroencephalogram (EEG) signals from nine locations on the scalp. The EEG signals were analyzed and translated into measures of mental workload and cognitive state. A dynamic time window was then constructed to align these features with sketch features such that the combination of the two modalities maximizes the accuracy of classifying gesture versus non-gesture strokes. Quadratic Discriminant Analysis (QDA) was used as the classification method. Our experimental results show that the best temporal alignment for workload and sketch analysis begins at a 30% time lag after the previous stroke and ends at a 30% time lag before the next stroke.
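To make the classification step concrete, the sketch below shows how QDA could separate gesture from non-gesture strokes once EEG workload and sketch features are combined into one feature vector per stroke. It is a minimal illustration using scikit-learn and synthetic data; the feature names, distributions, and values are hypothetical stand-ins, not the paper's actual features or results.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical per-stroke feature vectors: two sketch features
# (e.g., stroke speed and curvature) concatenated with one
# EEG-derived workload feature. Values are synthetic.
n = 200
gesture = np.column_stack([rng.normal(1.0, 0.5, n),    # sketch feature 1
                           rng.normal(0.8, 0.4, n),    # sketch feature 2
                           rng.normal(0.6, 0.2, n)])   # workload feature
non_gesture = np.column_stack([rng.normal(-1.0, 0.5, n),
                               rng.normal(-0.8, 0.4, n),
                               rng.normal(0.2, 0.2, n)])

X = np.vstack([gesture, non_gesture])
y = np.array([1] * n + [0] * n)   # 1 = gesture stroke, 0 = non-gesture

# Fit QDA: each class gets its own mean and covariance, giving a
# quadratic decision boundary between the two stroke types.
qda = QuadraticDiscriminantAnalysis()
qda.fit(X, y)
accuracy = qda.score(X, y)
print(f"training accuracy: {accuracy:.2f}")
```

In the paper's pipeline, the workload feature for each stroke would be extracted from the EEG signal within the dynamic time window described above (starting 30% of the inter-stroke interval after the previous stroke and ending 30% before the next), rather than sampled synthetically.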