Parametric Convolutional Neural Network for Radar-based Human Activity Classification Using Raw ADC Data
Radar sensors offer a promising and effective sensing modality for human activity classification. Human activity classification enables several smart-home applications, including energy saving, human-machine interfaces for gesture-controlled appliances, and elderly fall-motion recognition. Present radar-based activity recognition systems exploit the micro-Doppler signature by generating Doppler spectrograms or videos of range-Doppler images (RDIs), followed by a deep neural network or machine learning model for classification. Although deep convolutional neural networks (DCNNs) have been shown to implicitly learn features from raw sensor data in other fields, such as camera and speech, in the radar case DCNNs still require preprocessing followed by feature-image generation, such as a video of RDIs or a Doppler spectrogram, to develop a scalable and robust classification or regression application. In this paper, we propose a parametric convolutional neural network that mimics the radar preprocessing across the fast-time and slow-time radar data through 2D sinc-filter or 2D wavelet-filter kernels to extract features for the classification of various human activities. We demonstrate that our proposed solution shows improved results compared to equivalent state-of-the-art DCNN solutions that rely on Doppler spectrograms or videos of RDIs as feature images.
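To make the idea of a parametric convolution with 2D sinc-filter kernels concrete, the sketch below shows one possible PyTorch layer that learns band-pass cutoff frequencies along the slow-time and fast-time axes and forms each 2D kernel as a separable product of two 1D sinc band-pass responses. This is a minimal illustration under assumed design choices (separable kernels, real-valued input, the class name `Sinc2DConv`, and the cutoff parameterization), not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Sinc2DConv(nn.Module):
    """Illustrative parametric 2D convolution: each output channel is a
    separable sinc band-pass kernel with learnable cutoffs along the
    slow-time and fast-time axes (a sketch, not the paper's exact layer)."""

    def __init__(self, out_channels, kernel_size=31):
        super().__init__()
        self.out_channels = out_channels
        self.kernel_size = kernel_size
        # Learnable low cutoff and bandwidth (normalized frequencies),
        # one pair per output channel and per axis [slow-time, fast-time].
        self.f_low = nn.Parameter(torch.rand(out_channels, 2) * 0.2)
        self.band = nn.Parameter(torch.rand(out_channels, 2) * 0.2 + 0.05)
        # Symmetric sample grid for the kernel taps.
        n = torch.arange(kernel_size) - (kernel_size - 1) / 2
        self.register_buffer("n", n)

    def _sinc_bandpass(self, f1, f2):
        # 1D band-pass impulse response: difference of two ideal low-passes.
        def lowpass(fc):
            return 2 * fc * torch.sinc(2 * fc * self.n)
        return lowpass(f2) - lowpass(f1)

    def forward(self, x):
        # x: (batch, 1, slow_time, fast_time) slice of the raw radar data cube.
        f1 = torch.abs(self.f_low)
        f2 = f1 + torch.abs(self.band)
        kernels = []
        for c in range(self.out_channels):
            k_slow = self._sinc_bandpass(f1[c, 0], f2[c, 0])
            k_fast = self._sinc_bandpass(f1[c, 1], f2[c, 1])
            kernels.append(torch.outer(k_slow, k_fast))  # separable 2D kernel
        weight = torch.stack(kernels).unsqueeze(1)  # (out_channels, 1, K, K)
        return F.conv2d(x, weight, padding=self.kernel_size // 2)
```

In this reading, the learnable cutoffs play the role of the range/Doppler filter banks that conventional radar preprocessing applies across fast time and slow time, so the feature-extraction front end is trained end to end with the classifier instead of being fixed by hand; a 2D wavelet-kernel variant would follow the same pattern with a different parametric kernel function.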