Title: Asynchronous stereo vision - Event-based stereo matching and tracking for dynamic vision sensors
Language: English
Authors: Piątkowska, Ewa Alicja 
Qualification level: Doctoral
Advisor: Gelautz, Margrit  
Issue Date: 2018
Number of Pages: 154
Abstract: 
This thesis addresses the problem of stereo reconstruction from a stream of events provided by two dynamic vision sensors (DVS) in a stereo configuration. Dynamic vision sensors consist of self-spiking pixels that independently and in continuous time react to relative light intensity changes by generating 'spikes' encoded in Address Event Representation (AER). As a result, the output of the sensor is not a sequence of frames as in conventional cameras, but an asynchronous stream of events indicating captured intensity changes. The main advantages of these types of sensors are high temporal resolution (better than 10 µs) and wide dynamic range (> 120 dB). Several approaches for stereo matching have been introduced for dynamic vision sensors, including the application of conventional stereo algorithms by operating on 'pseudo frames' built from the address event stream (image-based methods). Although the image-based algorithms achieve acceptable performance, they do not exploit the sensors' specific capabilities. Only a few efforts have been invested so far in stereo processing techniques that can be applied directly to the stream of events (event-based methods). These methods preserve the asynchronous nature of events and are thus better suited for retaining the advantages of dynamic vision sensors. However, there are still various challenges to tackle in event-based stereo matching. In this thesis, we investigate the feasibility of fully asynchronous stereo vision tailored to dynamic vision sensors. We start out with a thorough analysis of event data from dynamic vision sensors in the context of stereo analysis, with a focus on the temporal coincidence of events. We find that single event-to-event matching that uses timing information as a matching score lacks reliability when dealing with complex scenes and challenging conditions. As the main contribution of this thesis, we propose an adaptive dynamic cooperative network, which is constantly updated while events are generated, making it feasible to preserve the data-driven aspect of the sensor. We develop two cooperative stereo matching algorithms, the first of which employs simple time-based event matching as input to the cooperative network. In the second algorithm, we propose using the spatio-temporal neighbourhood of the event as a matching primitive, together with a novel similarity measure that combines time-based correlation and polarity. Extensive evaluation of the proposed cooperative stereo algorithms demonstrates that the results are comparable to or better than those of competing algorithms in the field. Furthermore, we propose an asynchronous tracking method that is realised by clustering events in three-dimensional space with Gaussian mixture models, and we demonstrate its performance in conjunction with the cooperative stereo matching results.
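
To make the abstract's main ideas more concrete, the two sketches below illustrate (a) time- and polarity-based matching of events from a rectified stereo pair of dynamic vision sensors and (b) Gaussian-mixture clustering of events in three-dimensional space. Both are hedged illustrations of the general techniques named in the abstract, not the thesis implementation; the event fields, the time constant tau, the disparity range and all helper names (Event, match_score, best_match, cluster_events) are assumptions introduced here.

    # Sketch (a): score a left-camera event against right-camera candidates,
    # assuming rectified sensors so that corresponding events share a row.
    from dataclasses import dataclass

    @dataclass
    class Event:
        x: int         # pixel column
        y: int         # pixel row (epipolar line after rectification)
        t: float       # timestamp in seconds
        polarity: int  # +1 for ON (intensity increase), -1 for OFF

    def match_score(e_left, e_right, tau=1e-3):
        """Similarity combining temporal coincidence and polarity agreement."""
        if e_left.y != e_right.y:            # must lie on the same epipolar line
            return 0.0
        dt = abs(e_left.t - e_right.t)
        temporal = max(0.0, 1.0 - dt / tau)  # 1 if simultaneous, 0 beyond tau
        polarity = 1.0 if e_left.polarity == e_right.polarity else 0.0
        return temporal * polarity

    def best_match(e_left, right_events, max_disparity=30):
        """Return the highest-scoring right event within the disparity range."""
        scored = [(match_score(e_left, e), e) for e in right_events
                  if 0 <= e_left.x - e.x <= max_disparity]
        scored = [(s, e) for s, e in scored if s > 0.0]
        return max(scored, key=lambda se: se[0]) if scored else None

In the thesis, such pairwise matches serve only as input to the cooperative network; the sketch stops at the single-match stage.

    # Sketch (b): group events into object hypotheses by fitting a Gaussian
    # mixture model in 3-D; the number of objects is assumed to be known.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def cluster_events(points_3d, n_objects=2):
        """points_3d: (N, 3) array of event coordinates, e.g. (x, y, depth)."""
        gmm = GaussianMixture(n_components=n_objects, covariance_type="full")
        labels = gmm.fit_predict(np.asarray(points_3d))
        return labels, gmm.means_, gmm.covariances_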
Keywords: Stereo Matching; Asynchronous Stereo; Event-based Vision; Cooperative Stereo; Multiple Object Tracking; Gaussian Mixture Models; Dynamic Vision Sensors
URI: https://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-119976
http://hdl.handle.net/20.500.12708/1903
Library ID: AC15248757
Organisation: E193 - Institut für Visual Computing and Human-Centered Technology 
Publication Type: Thesis (Hochschulschrift)
Appears in Collections: Thesis
