Localization, Detection and Tracking of Multiple Moving Sound Sources with a Convolutional Recurrent Neural Network
Citation: S. Adavanne, A. Politis and T. Virtanen, "Localization, Detection and Tracking of Multiple Moving Sound Sources with a Convolutional Recurrent Neural Network," in Proceedings of the Detection and Classification of Acoustic Scenes and Events 2019 Workshop (DCASE2019), New York University, NY, USA, Oct. 2019, pp. 20–24.
Abstract: This paper investigates the joint localization, detection, and tracking of sound events using a convolutional recurrent neural network (CRNN). We use a CRNN previously proposed for the localization and detection of stationary sources, and show that its recurrent layers enable the spatial tracking of moving sources when the network is trained on dynamic scenes. The tracking performance of the CRNN is compared with a stand-alone tracking method that combines a multi-source direction-of-arrival (DOA) estimator with a particle filter. Their respective performance is evaluated under various acoustic conditions: anechoic and reverberant scenarios, stationary and moving sources at several angular velocities, and a varying number of overlapping sources. The results show that the CRNN tracks multiple sources more consistently than the parametric method across acoustic scenarios, but at the cost of higher localization error.
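To make the baseline concrete, the sketch below illustrates the kind of particle-filter tracker the abstract compares against: a bootstrap particle filter smoothing noisy frame-wise azimuth estimates for a single moving source. This is a minimal illustration, not the authors' implementation; the function name, the random-walk motion model, the Gaussian observation likelihood, and all parameter values are assumptions for the example.

```python
import numpy as np

def wrap_deg(a):
    """Wrap angles to the interval [-180, 180) degrees."""
    return (a + 180.0) % 360.0 - 180.0

def particle_filter_doa(measurements, n_particles=500, motion_std=5.0,
                        obs_std=10.0, seed=0):
    """Track one moving-source azimuth (degrees) from noisy per-frame DOA
    estimates with a bootstrap particle filter.

    Hypothetical sketch: random-walk motion model with Gaussian drift,
    Gaussian likelihood on the wrapped angular error, multinomial resampling.
    """
    rng = np.random.default_rng(seed)
    particles = rng.uniform(-180.0, 180.0, n_particles)  # uniform init
    weights = np.full(n_particles, 1.0 / n_particles)
    track = []
    for z in measurements:
        # Predict: propagate particles with Gaussian drift.
        particles = wrap_deg(particles + rng.normal(0.0, motion_std, n_particles))
        # Update: weight each particle by the likelihood of the wrapped error.
        err = wrap_deg(particles - z)
        weights = np.exp(-0.5 * (err / obs_std) ** 2)
        weights /= weights.sum()
        # Estimate: circular weighted mean of the particle cloud.
        rad = np.deg2rad(particles)
        est = np.rad2deg(np.arctan2(np.sum(weights * np.sin(rad)),
                                    np.sum(weights * np.cos(rad))))
        track.append(est)
        # Resample (multinomial) to avoid weight degeneracy.
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx]
        weights = np.full(n_particles, 1.0 / n_particles)
    return np.array(track)
```

The paper's actual tracker handles multiple simultaneous sources and data association; this single-source version only shows the predict/update/resample cycle that underlies it.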
Appears in Collections: Proceedings of the Detection and Classification of Acoustic Scenes and Events 2019 Workshop (DCASE2019)