Wearable environment perception systems have great potential for improving the autonomous control of mobility aids [1]. A visual perception system can provide abundant information about the surroundings to assist task-oriented control such as navigation, obstacle avoidance, and object detection, which are essential functions for wearers who are visually impaired or blind [2, 3, 4]. Moreover, vision-based terrain sensing is a critical input to decision-making in an intelligent control system, especially for users who have difficulty manually achieving a seamless control-mode transition.