On event-based motion detection and integration

8th International Conference on Bio-inspired Information and Communications Technologies (formerly BIONETICS)
Tobias Brosch, Stephan Tschechne, Roman Sailer, Nora von Egloffstein, Luma Issa Abdul-Kreem, Heiko Neumann
Ulm University, Institute of Neural Information Processing
Abstract

Event-based vision sensors sample individual pixels at a much higher temporal resolution than conventional frame-based cameras and represent the visual input in their receptive fields temporally independently of neighboring pixels. The information available at the pixel level for subsequent processing stages is reduced to representations of changes in the local intensity function. In this paper we present the theoretical implications of this condition with respect to the structure of light fields for stationary observers and locally moving contrasts in the luminance function. On this basis we derive several constraints on the kind of information that can be extracted from event-based sensory acquisition using the address-event representation (AER) principle. We discuss how subsequent visual mechanisms can build upon such representations in order to integrate motion and static shape information. On this foundation we present approaches to motion detection and integration in a neurally inspired model that demonstrates the interaction of early and intermediate stages of visual processing. Results replicating experimental findings demonstrate the abilities of the initial and subsequent stages of the model in the domain of motion processing.
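To make the AER principle described above concrete, the following minimal Python sketch models an event stream as (x, y, timestamp, polarity) tuples and emits an event whenever a pixel's log intensity changes by more than a contrast threshold, independently of neighboring pixels. All names and the threshold value are illustrative assumptions for exposition, not the implementation used in the paper.

import numpy as np
from collections import namedtuple

# Hypothetical AER event: pixel address, microsecond timestamp, and
# polarity (+1 for an intensity increase, -1 for a decrease).
Event = namedtuple("Event", ["x", "y", "t_us", "polarity"])

def emit_events(ref_log_I, curr_log_I, t_us, theta=0.15):
    """Emit AER-style events wherever the log-intensity change at a
    pixel exceeds the contrast threshold theta (illustrative value).

    Each pixel is compared only against its own last reference level,
    so events are generated independently of neighboring pixels."""
    diff = curr_log_I - ref_log_I
    events = []
    ys, xs = np.nonzero(np.abs(diff) >= theta)
    for y, x in zip(ys, xs):
        events.append(Event(x, y, t_us, 1 if diff[y, x] > 0 else -1))
        # Reset the reference level so further changes are measured
        # relative to the intensity at which this event fired.
        ref_log_I[y, x] = curr_log_I[y, x]
    return events

# Usage: two synthetic log-intensity frames with an appearing contrast.
frame0 = np.zeros((4, 8))
frame1 = frame0.copy()
frame1[:, 3] = 0.5  # a contrast edge appears at column 3
for ev in emit_events(frame0.copy(), frame1, t_us=1000):
    print(ev)

Only the four pixels at the new contrast edge fire, each with positive polarity, which illustrates how the AER output is sparse and carries change information rather than absolute intensity.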

    http://dx.doi.org/10.4108/icst.bict.2014.257904