MOTION DETECTION FROM TIME-VARIED BACKGROUND
This paper proposes a new background subtraction method for detecting moving objects (foreground) against a time-varied background. Traditional background subtraction works well for a stationary background, but for a non-stationary viewing sensor, motion compensation must first be applied; in practice it is difficult to realize to sufficient pixel accuracy, and the traditional background subtraction algorithm fails. The problem is compounded when the moving target to be detected/tracked is small, since the pixel error of the motion-compensated background can subsume the small target. A Spatial Distribution of Gaussians (SDG) model is proposed to detect moving objects when motion compensation has been carried out only approximately. The distribution of each background pixel is modeled both temporally and spatially, and based on this statistical model each pixel in the current frame is classified as belonging to the foreground or the background. For the system to perform under lighting and environmental changes over an extended period of time, the background distribution must be updated with each incoming frame, and a new background restoration and adaptation algorithm is developed for the time-varied background. Test cases involving the detection of small moving objects within a highly textured background, and a pan-tilt tracking system based on a 2D background mosaic, are demonstrated successfully.
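The two core steps described above — classifying each pixel against per-pixel Gaussians searched over a small spatial neighborhood (to absorb residual motion-compensation error), and updating the background statistics with each incoming frame — can be sketched as follows. This is a minimal illustration of the general idea, not the paper's exact SDG formulation; the threshold `k`, learning rate `alpha`, and neighborhood radius `r` are assumed parameters chosen for the sketch.

```python
import numpy as np

def sdg_classify(frame, mean, var, k=2.5, r=1):
    """Label each pixel foreground (True) or background (False).

    A pixel is treated as background if it lies within k standard
    deviations of the Gaussian of ANY pixel in its (2r+1)x(2r+1)
    spatial neighborhood of the background model -- the spatial
    search tolerates small errors from approximate motion
    compensation.  (Parameters k and r are illustrative.)
    """
    fg = np.ones(frame.shape, dtype=bool)
    std = np.sqrt(var)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            # Shift the background statistics by (dy, dx) and test
            # the match; a single match anywhere clears the flag.
            m = np.roll(np.roll(mean, dy, axis=0), dx, axis=1)
            s = np.roll(np.roll(std, dy, axis=0), dx, axis=1)
            fg &= np.abs(frame - m) > k * s
    return fg

def update_background(frame, mean, var, fg, alpha=0.05):
    """Exponential running update of the per-pixel mean/variance,
    applied only where the pixel was classified as background, so
    foreground objects are not absorbed into the model."""
    bg = ~fg
    d = frame - mean
    new_mean = np.where(bg, mean + alpha * d, mean)
    new_var = np.where(bg, (1 - alpha) * (var + alpha * d ** 2), var)
    return new_mean, new_var
```

A small moving object then shows up as an isolated cluster of foreground pixels, while background pixels shifted by a pixel or two of compensation error still match a neighboring Gaussian and are suppressed.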