An Architecture for Distributed Visual Memory
Abstract

The development of autonomous and situated robots is one of the great remaining challenges and involves a number of scientific disciplines. In spite of recent dramatic progress, it remains worthwhile to examine natural systems, because their abilities are still out of reach. Motivated by research in cognitive systems, visual perception, and the psychology of memory, we designed and implemented a memory architecture for visual tasks. Its structural and functional concepts are modeled on those found in natural systems. We present an efficient implementation based on parallel programming techniques. The memory module is integrated into a distributed system for speech and image analysis currently being developed in the Sonderforschungsbereich (SFB) 360, Situated Artificial Communicators, which uses a hybrid vision system combining neural and semantic networks.