Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Subramanian, Vinod | |
dc.contributor.author | Benetos, Emmanouil | |
dc.contributor.author | Sandler, Mark B. | |
dc.date.accessioned | 2019-10-24T01:50:24Z | - |
dc.date.available | 2019-10-24T01:50:24Z | - |
dc.date.issued | 2019-10 | |
dc.identifier.citation | V. Subramanian, E. Benetos & M. Sandler, "Robustness of Adversarial Attacks in Sound Event Classification", Proceedings of the Detection and Classification of Acoustic Scenes and Events 2019 Workshop (DCASE2019), pages 239–243, New York University, NY, USA, Oct. 2019 | en |
dc.identifier.uri | http://hdl.handle.net/2451/60767 | - |
dc.description.abstract | An adversarial attack is a method of generating perturbations to the input of a machine learning model in order to make the model's output incorrect. The perturbed inputs are known as adversarial examples. In this paper, we investigate the robustness of adversarial examples to simple input transformations such as mp3 compression, resampling, white noise, and reverb in the task of sound event classification. Through this analysis, we aim to provide insight into the strengths and weaknesses of current adversarial attack algorithms, as well as a baseline for defenses against adversarial attacks. Our work shows that adversarial attacks are not robust to simple input transformations. White noise is the most consistent defense against adversarial attacks, with a success rate of 73.72% averaged across all models and attack algorithms. | en |
dc.rights | Distributed under the terms of the Creative Commons Attribution 4.0 International (CC-BY) license. | en |
dc.title | Robustness of Adversarial Attacks in Sound Event Classification | en |
dc.type | Article | en |
dc.identifier.DOI | https://doi.org/10.33682/sp9n-qk06 | |
dc.description.firstPage | 239 | |
dc.description.lastPage | 243 | |
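The abstract above reports that adding white noise to the input is the most consistent of the tested transformations at defeating adversarial perturbations. A minimal sketch of such a transformation is shown below; the function name, the SNR parameterisation, and the chosen noise level are illustrative assumptions, not the paper's exact experimental settings.

```python
import numpy as np

def add_white_noise(audio, snr_db, rng=None):
    """Add white Gaussian noise to `audio` at a target signal-to-noise
    ratio (in dB). Illustrative helper, not the paper's implementation."""
    rng = rng or np.random.default_rng(0)
    signal_power = np.mean(audio ** 2)
    # Scale the noise so that signal_power / noise_power matches the target SNR.
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=audio.shape)
    return audio + noise

# Example: a 1-second 440 Hz tone at a 16 kHz sample rate (hypothetical input)
sr = 16000
t = np.arange(sr) / sr
clean = 0.5 * np.sin(2 * np.pi * 440 * t)
noisy = add_white_noise(clean, snr_db=20)
```

In a defense setting, a transformation like this would be applied to a (possibly adversarial) input before it is passed to the classifier, in the hope that the small adversarial perturbation is masked by the added noise while the sound event itself remains recognisable.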
Appears in Collections: | Proceedings of the Detection and Classification of Acoustic Scenes and Events 2019 Workshop (DCASE2019) |
Files in This Item:
File | Size | Format | |
---|---|---|---|
DCASE2019Workshop_Subramanian_66.pdf | 632.89 kB | Adobe PDF | View/Open |